Karna@lemmy.ml to Firefox@lemmy.ml · 1 month ago
Meet Orbit, Mozilla's AI Assistant Extension for Firefox (www.omgubuntu.co.uk)
54 comments
Karna@lemmy.ml (OP) · 1 month ago
In such a scenario you need to host your LLM of choice locally.
ReversalHatchery@beehaw.org · 1 month ago
Does the addon support usage like that?
Karna@lemmy.ml (OP) · 1 month ago
No, but the “AI” option available on the Mozilla Lab tab in settings allows you to integrate with a self-hosted LLM. I have had this setup running for a while now.
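For context, the usual way to point Firefox's Labs AI chatbot at a local endpoint is a pair of about:config preferences (pref names as of recent Firefox releases; the localhost URL is just an example for a locally hosted web UI):

```
browser.ml.chat.hideLocalhost = false
browser.ml.chat.provider = http://localhost:3000
```

With these set, the sidebar chatbot loads the local provider instead of one of the hosted services.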
cmgvd3lw@discuss.tchncs.de · 1 month ago
Which model are you running? How much RAM?
Karna@lemmy.ml (OP) · 1 month ago (edited)
My (Docker-based) configuration:
Software stack: Linux > Docker Container > Nvidia Runtime > Open WebUI > Ollama > Llama 3.1
Hardware: i5-13600K, Nvidia 3070 Ti (8 GB), 32 GB RAM
Docker: https://docs.docker.com/engine/install/
Nvidia Runtime for Docker: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html
Open WebUI: https://docs.openwebui.com/
Ollama: https://hub.docker.com/r/ollama/ollama
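A minimal sketch of spinning up that stack once Docker and the NVIDIA Container Toolkit are installed (container names, ports, and volume names are illustrative; the images and flags follow the Ollama and Open WebUI docs):

```shell
# Run Ollama with GPU access via the NVIDIA container runtime
docker run -d --gpus=all \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama ollama/ollama

# Pull the Llama 3.1 model inside the container
docker exec -it ollama ollama pull llama3.1

# Run Open WebUI, pointed at the Ollama API on the host
docker run -d -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui ghcr.io/open-webui/open-webui:main
```

Open WebUI is then reachable at http://localhost:3000 and the Ollama API at http://localhost:11434.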