Abovethefold@lemmy.ml to Privacy@lemmy.ml (English) · 2 years ago
Apple pulls AI image apps from the App Store after learning they could generate nude images (ptv-news.com.pk)
Cross-posted to: technology@lemmy.ml
Coasting0942@reddthat.com · 2 years ago
Depends on their legal status. Could they get sued by a victim?

potentiallynotfelix@lemmy.ml · 2 years ago
There wouldn’t be a victim, it’s AI.

Coasting0942@reddthat.com · 2 years ago
A minor who gets her face turned into porn wouldn’t be able to sue because it’s not Photoshop, it’s AI. /s