jeffw@lemmy.world to News@lemmy.world · 6 months ago
“CSAM generated by AI is still CSAM,” DOJ says after rare arrest (arstechnica.com)
sparky@lemmy.federate.cc · 6 months ago
The problem is that the only way to train an AI model is on real images, so the model can’t exist without crimes and suffering having been committed.
Madison420@lemmy.world · 6 months ago
The real images don’t have to be of CSAM, just of children. In theory it could be trained on legal sexual content and let the AI connect the dots.