The idea of querying a remote LLM makes my spine tingle — and not in a good way. When I need to do a spot of research via AI, I opt for a local LLM, run with a tool such as Ollama.
If you haven't yet installed Ollama, you can read about it in my guide on how to install an LLM on macOS (and why you should). Ollama is also available for Linux and Windows, and since the Firefox extension works on all three platforms, whichever desktop OS you use is covered.
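For reference, on Linux the install is a one-line script from the Ollama site (macOS and Windows use downloadable installers from ollama.com instead). A minimal sketch of the Linux route, assuming you're comfortable piping a remote script to your shell:

```shell
# Linux: official one-line installer from ollama.com
curl -fsSL https://ollama.com/install.sh | sh

# Confirm the CLI is on your PATH
ollama --version
```

If you'd rather not pipe a script straight into sh, you can download install.sh first, read it, and then run it.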
Using Ollama from within a terminal window is actually quite easy, but it doesn't offer obvious access to other features (such as model and prompt selection, image upload, enabling or disabling internet search, and settings).
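If you do want to try the terminal route first, the basics look like this (llama3.2 is just an example model name; substitute whichever model you've pulled):

```shell
# Pull and run a model interactively; Ollama downloads it on first use
ollama run llama3.2

# List the models already downloaded to your machine
ollama list

# Ask a one-shot question without starting an interactive session
ollama run llama3.2 "Summarize what a local LLM is in one sentence."
```

That covers day-to-day use, but as noted above, things like image upload and per-chat settings are where a graphical front end earns its keep.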
The free extension I will point out works on …