Pro@programming.dev to Technology@lemmy.world · English · 5 days ago

Google quietly released an app that lets you download and run AI models locally

GitHub - google-ai-edge/gallery: A gallery that showcases on-device ML/GenAI use cases and allows people to try and use models locally. (github.com)

219 points · 47 comments
  • AmbiguousProps@lemmy.today · (+43/−3) · 5 days ago

    Why would I use this over Ollama?

    • Greg Clarke@lemmy.ca · (+32/−2) · 5 days ago

      Ollama can’t run on Android

      • AmbiguousProps@lemmy.today · (+23) · 5 days ago

        That’s fair, but I think I’d rather self host an Ollama server and connect to it with an Android client in that case. Much better performance.
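For anyone considering that setup: a self-hosted Ollama server exposes a plain REST API (default port 11434), so any Android client just needs to POST to it. A minimal sketch in Python; the host address and model name here are assumptions, not from the thread:

```python
import json
import urllib.request

# Hypothetical LAN address of a machine running `ollama serve`.
OLLAMA_URL = "http://192.168.1.50:11434/api/generate"

def build_payload(model: str, prompt: str) -> bytes:
    # Ollama's /api/generate takes a JSON body; stream=False returns
    # one complete JSON object instead of newline-delimited chunks.
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def ask(model: str, prompt: str) -> str:
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires a reachable server with the model already pulled.
    print(ask("llama3.2", "Why is the sky blue?"))
```

Clients like the various Ollama Android apps are essentially wrappers around this same endpoint.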

        • OhVenus_Baby@lemmy.ml · (+2) · 4 days ago

          How does Ollama compare to GPT models? I use the paid tier for work and I’m curious how this stacks up.

          • AmbiguousProps@lemmy.today · (+1) · 4 days ago

            It’s decent, with the DeepSeek model anyway. It’s not as fast and has a lower parameter count, though. You might just need to try it and see if it fits your needs or not.

        • Greg Clarke@lemmy.ca · (+4) · 5 days ago

          Yes, that’s my setup. But this will be useful for cases where the internet connection is not reliable.

      • Euphoma@lemmy.ml · (+8) · 5 days ago

        You can use it in Termux.

        • Greg Clarke@lemmy.ca · (+3) · 5 days ago

          Has this actually been done? If so, I assume it would only be able to use the CPU.

          • Euphoma@lemmy.ml · (+7) · 5 days ago

            Yeah, I have it in Termux. Ollama is in the package repos for Termux. The speed it generates at does feel like CPU speed, but I don’t know for sure.
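For anyone wanting to try this route, a sketch of the Termux setup (no root needed; package and model names are assumptions based on recent Termux repos and the Ollama library, not from the thread):

```shell
# Inside Termux: Ollama is available from the main package repository.
pkg update && pkg install ollama

# Start the server in the background, then chat with a small model.
# `ollama run` pulls the model on first use; pick one that fits in RAM.
ollama serve &
ollama run gemma3:1b
```

Inference runs on the CPU here, which matches the speed observed above.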

      • Diplomjodler@lemmy.world · (+3) · 4 days ago

        Is there any useful model you can run on a phone?

      • gens@programming.dev · (+2) · 4 days ago

        Llama.cpp (which Ollama runs on) can. And many chat programs for phones can use it.

      • pirat@lemmy.world · (+2) · 4 days ago

        Try PocketPal instead.
