It’s called Pi and it’s a conversational AI made to be more of a personal assistant. In the bit of time I’ve used it, it’s done far better than I expected at reframing and simplifying my thoughts when I’m overwhelmed.

Obviously, talking to a real person is much better if possible, but the reality is some of us don’t have the finances to pay for therapy or other ways to cope with the anxiety/depression that so often comes with ASD. What are your thoughts on this?

  • TheBluePillock@lemmy.world
    9 months ago

    I would love to be corrected, but when I looked into it, it sounded like you’d probably want 32GB of VRAM or better for actual chat ability. You have to have enough memory to load the model, and anything not handled by your GPU takes a major performance hit. Then, you probably want to aim for a 72 billion parameter model. That’s a decently conversational level and maybe close to the one you’re using (but it’s possible theirs is bigger? I’m just guessing). I think 34B models are comparatively more prone to hallucination and inaccuracy. It sounded like 32GB of VRAM was kinda the entry point for the 72B models, so I stopped looking, because I can’t afford that.
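    The memory math above can be sketched roughly. This is just a back-of-envelope estimate, not an exact figure: weight memory is parameter count times bytes per weight, and the 20% overhead for KV cache and activations is my own loose assumption.

    ```python
    # Rough VRAM estimate for loading an LLM locally.
    # bytes for weights = params * (bits per weight / 8);
    # the 20% overhead for KV cache/activations is an assumption.
    def vram_gb(params_billion: float, bits_per_weight: int, overhead: float = 0.2) -> float:
        weight_bytes = params_billion * 1e9 * bits_per_weight / 8
        return weight_bytes * (1 + overhead) / 1e9

    # A 72B model quantized to 4 bits per weight:
    print(round(vram_gb(72, 4), 1))   # roughly 43 GB -> needs more than one 24GB card
    # A 7B model at full 16-bit precision, for comparison:
    print(round(vram_gb(7, 16), 1))   # roughly 17 GB
    ```

    Which is why quantized models (4-bit, 8-bit) are so popular for home setups: they cut the weight memory by 2-4x at some cost in quality.
    
    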

    So somebody with more experience or knowledge can hopefully correct me or give a better explanation, but just in case, maybe this is a helpful starting point for someone.

    You can download models on huggingface.co and interact with them through a web-ui like this one.