Is an image of a child inappropriate? Fully clothed, nothing going on.
Is an image of an adult engaging in sexual activity inappropriate?
Based on those two concepts, it can generate inappropriate child sexual imagery.
You may have done OCR work a while ago, but that is not the same type of machine learning that goes into typical generative AI systems in the modern world. It very much seems as though you are profoundly misunderstanding how this technology operates if you think it can’t generate a novel combination of previously trained concepts without a prior example.
I’m referring to the inappropriate photography and videos out there. Please learn to read.
You’re not the brightest spoon in the drawer, are you?
“Naked” and “child” are two concepts it can learn and combine without needing to be taught “naked child”.
It does not need to see an example of every type of thing it can generate.
It can combine pornographic concepts learned in isolation with disparate, unrelated concepts.
It does not need to have been trained on child porn to generate child porn.
I haven’t attempted, and won’t attempt, any testing of this, for obvious reasons. But at the same time, if any AI system out there can generate images of prepubescent private parts, then the training data must have included inappropriate material for the model to learn that distinction.