Ahem, well, there are obvious things - that 2x2 modulo 3 is 1, that some vaccines might be bad (that's why pharma industry regulations exist), that pi is also an unknown p multiplied by an unknown i, or some number encoded as the string 'pi'.
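For what it's worth, the one checkable claim in that list does hold - 2 x 2 = 4, and 4 leaves remainder 1 when divided by 3. A one-liner confirms it:

```python
# 2 * 2 = 4; 4 mod 3 leaves remainder 1
result = (2 * 2) % 3
print(result)  # → 1
```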
These all matter for language models, do they not?
And you want an AI that doesn't offend these folks, or one that is trained on their output. What use could that be?
It is already taught on their output among other things.
But I personally don’t think this leads anywhere.
Somebody someplace decided it's a genius idea to extrapolate text: humans communicate their thoughts via text, so text is something machines can be made to work with.
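To make "extrapolating text" concrete, here is a toy sketch of the idea: a bigram model that counts which word follows which in a corpus, then extends a prompt with the most frequent successor. Real language models are vastly more sophisticated (neural networks over subword tokens, not word counts), but the underlying move - predict the continuation from observed text - is the same. The corpus and function names here are illustrative, not from any real system.

```python
from collections import Counter, defaultdict

# Toy "text extrapolation": count which word follows which,
# then repeatedly extend a prompt with the most frequent successor.
corpus = "humans communicate thoughts via text and machines model text".split()

successors = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    successors[prev][nxt] += 1

def extrapolate(word, steps=3):
    """Greedily extend `word` by `steps` most-likely next words."""
    out = [word]
    for _ in range(steps):
        if word not in successors:
            break  # dead end: this word never had a successor
        word = successors[word].most_common(1)[0][0]
        out.append(word)
    return " ".join(out)

print(extrapolate("communicate"))  # → communicate thoughts via text
```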
Humans don’t just communicate.