• MeanEYE@lemmy.world
    6 months ago

    There is a plateau to be hit at some point; how close it is depends on who you ask. Some say we are close, others say we are not, but it definitely exists. LLMs, like other forms of machine learning, run into diminishing returns on data: you can't keep feeding a model ever more data and expect ever-better results. ChatGPT's models became famous in part because humans were involved in the reward signal during training, curating and ranking responses for quality.
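
    The human curation mentioned above is often framed as reward modelling from pairwise preferences (the RLHF setup). A minimal sketch of the idea, with made-up scores and a hypothetical `preference_loss` helper (not from any real library):

    ```python
    import math

    def preference_loss(score_chosen: float, score_rejected: float) -> float:
        """Bradley-Terry style loss: negative log-probability that the
        human-preferred response outranks the rejected one."""
        return -math.log(1.0 / (1.0 + math.exp(-(score_chosen - score_rejected))))

    # The loss shrinks when the reward model agrees with the human rater
    # and grows when it ranks the rejected response higher.
    agrees = preference_loss(2.0, -1.0)     # model prefers the same answer
    disagrees = preference_loss(-1.0, 2.0)  # model contradicts the rater
    print(agrees < disagrees)
    ```

    Training the reward model to minimize this loss over many human comparisons is what injects the curated quality judgments into the final model.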