ylai@lemmy.ml to AI@lemmy.ml · English · 10 months ago
Google apologizes for “missing the mark” after Gemini generated racially diverse Nazis (www.theverge.com)
cross-posted to: tech@kbin.social, technology@lemmy.world
PullUpCircuit@iusearchlinux.fyi · 10 months ago
That’s really what I’m expecting. My guess is that the training data is skewed and the prompt can’t compensate for it.
Either the machine will need to understand what is expected, or the company will need to address this and let people enable or disable diversity.
The first option may be impossible to attain at this stage. The second could lead to inappropriate images.