• AmbiguousProps@lemmy.today · 5 days ago

    That’s fair, but I think I’d rather self-host an Ollama server and connect to it with an Android client in that case. Much better performance.
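    For anyone curious: Ollama only listens on 127.0.0.1:11434 out of the box, so you have to start it with OLLAMA_HOST=0.0.0.0 before a phone on your LAN can reach it. Here’s a minimal sketch of what an Android client does under the hood via Ollama’s HTTP API (the LAN address and model tag are placeholders for whatever you actually run):

    ```python
    # Minimal sketch: query a self-hosted Ollama server over its HTTP API,
    # the same endpoint an Android client would hit. Assumes the server was
    # started with OLLAMA_HOST=0.0.0.0 so it is reachable on the LAN; the
    # address and model tag below are placeholders.
    import requests

    OLLAMA_URL = "http://192.168.1.50:11434"  # hypothetical LAN address

    resp = requests.post(
        f"{OLLAMA_URL}/api/generate",
        json={
            "model": "llama3.1:8b",   # any model you have pulled
            "prompt": "Why is the sky blue?",
            "stream": False,          # return one JSON object, not a stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json()["response"])
    ```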

    • OhVenus_Baby@lemmy.ml · 4 days ago

      How does Ollama compare to GPT models? I used the paid tier for work and I’m curious how this stacks up.

      • AmbiguousProps@lemmy.today · 4 days ago

        It’s decent with the DeepSeek model, anyway. It’s not as fast and has a lower parameter count, though. You might just need to try it and see whether it fits your needs.
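        If you want to try it, the DeepSeek-R1 distills on Ollama come in several parameter sizes, so you can trade speed against quality. A rough sketch of pulling one and querying it over the same API (the 7b tag is just an example; pick whatever your hardware can handle):

        ```python
        # Rough sketch: pull a smaller DeepSeek-R1 distill and query it.
        # The 7b tag is an example; other sizes exist on the Ollama registry.
        import requests

        OLLAMA_URL = "http://localhost:11434"

        # /api/pull downloads the model; stream=False waits until it finishes.
        requests.post(
            f"{OLLAMA_URL}/api/pull",
            json={"name": "deepseek-r1:7b", "stream": False},
            timeout=None,  # pulls can take a while
        ).raise_for_status()

        resp = requests.post(
            f"{OLLAMA_URL}/api/generate",
            json={"model": "deepseek-r1:7b", "prompt": "Hello!", "stream": False},
            timeout=120,
        )
        print(resp.json()["response"])
        ```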

    • Greg Clarke@lemmy.ca · 5 days ago

      Yes, that’s my setup. But this will be useful for cases where the internet connection is not reliable.