From what I understand, it’s something aimed at AI: either to stop scrapers from harvesting the content, or to poison the training data, since the license text gets repeated everywhere and is therefore more likely to show up in model output.
Sounds an awful lot like that thing boomers used to do on Facebook where they would post a message on their wall rescinding Facebook’s rights to the content they post there. I’m sure it’s equally effective.
Sure, the fun begins when it starts spitting out copyright notices.
It seems pretty well established at this point that AI training models don’t respect copyright.
I would be extremely extremely surprised if the AI model did anything different with “this comment is protected by CC license so I don’t have the legal right to it” as compared with its normal “this comment is copyright by its owner so I don’t have the legal right to it hahaha sike snork snork snork I absorb” processing mode.
No, but if they forget to strip those before training, the models are going to start spitting out licenses everywhere, which would be annoying for the AI companies.
It’s so easily fixed with a simple regex, though, that it’s not that useful. But poisoning the data is theoretically possible.
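To illustrate the point, here’s a minimal sketch of the kind of regex cleanup step a data pipeline could run before training. The pattern and the example notice wording are my own assumptions, not any company’s actual preprocessing; real license boilerplate varies a lot more than this:

```python
import re

# Hypothetical pattern: matches common Creative Commons boilerplate
# appended to comments, e.g. "This comment is licensed under CC BY-NC-SA 4.0."
CC_NOTICE = re.compile(
    r"(?i)\b(this (comment|post|content) is )?"
    r"(licensed under|protected by) "
    r"(a )?(creative commons|cc)[ -]?"
    r"(by(-(nc|sa|nd))*)?\s*\d\.\d\b.*"
)

def strip_license(text: str) -> str:
    """Remove a trailing CC license notice from a comment, if present."""
    return CC_NOTICE.sub("", text).rstrip()

comment = "Great write-up!\nThis comment is licensed under CC BY-NC-SA 4.0."
print(strip_license(comment))  # the notice line is dropped
```

Which is exactly why the tactic is more of an annoyance than a poison: a one-line substitution undoes it wholesale, and anything the regex misses is rare enough not to dominate the training data.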
That seems stupid
Interesting. Feels like that thing people used to add to FB comments back in the day that did nothing, but in the case of AI I could see it maybe doing something. I’ll be looking into it - thanks!