ZDNET key takeaways
- If you want to use an agentic browser, consider local AI.
- Local AI puts less of a strain on the electricity grid.
- The approach keeps your queries on your local system.
Agentic browsers are storming the castle gates, and it looks like we're heading into yet another browser war, only this time with 'smarter' tools.
From my perspective, that conflict's going to cause a major problem. Imagine if everyone around the globe were using agentic web browsers. Those agentic tasks can consume serious power, which could send electricity prices skyrocketing and have a profoundly negative impact on the climate.
There’s a solution for this challenge: local AI.
Also: Opera agentic browser Neon starts rolling out to users – how to join the waitlist
On the rare occasions that I need to use AI, I always do so at the local level, specifically using Ollama.
Unfortunately, all but one of the agentic browsers on the market use cloud-based AI. For me, that approach makes using those agentic browsers a no-go. Not only do I not like the idea of placing further strain on the electric grid, but I would much prefer to keep all of my queries local so a third party cannot use them for training or profiling.
I've found two agentic browsers that can work with local AI: BrowserOS and Opera Neon. Unfortunately, only one of them, BrowserOS, is currently available to the public.
BrowserOS is available for Linux, MacOS, and Windows. To use it with locally installed AI, you need to have Ollama installed and have downloaded a model that supports agentic browsing, such as qwen2.5:7b.
Also: I’ve been testing the top AI browsers – here’s which ones actually impressed me
I’ve been testing BrowserOS and have found it to be a solid entry in the agentic browser market. In fact, I’ve found that it can stand toe-to-toe with browsers that rely on cloud-based AI, without the negative impacts or privacy issues.
Once I had BrowserOS set up to work with Ollama (more on that in a bit), I opened the agentic assistant and ran the following query: Open amazon.com and search for a wireless charging stand that supports a Pixel 9 Pro.
It took a while to get everything set up properly so that BrowserOS's agentic tool could work, but once everything was in place, it worked perfectly.
I will warn you: using BrowserOS in this way requires a fair amount of system resources, so if your computer is underpowered, it could struggle to perform.
Also: The top 20 AI tools of 2025 – and the #1 thing to remember when you use them
According to the Ollama site, the minimum RAM requirements for running local AI are:
- Minimum (8GB): This is the absolute minimum requirement to get started and will allow you to run smaller models, typically in the 3B to 7B parameter range.
- Recommended (16-32GB): For a smoother experience and the ability to run more capable 13B models, 16GB of RAM is recommended. To comfortably handle 30B+ models, you should aim for at least 32GB of RAM.
- Large Models (64GB+): To run the largest and most powerful models, such as 70B parameter variants, you will need 64GB of RAM or more.
From experience, the minimum RAM will not work. My System76 Thelio desktop has 32GB of RAM, and that setup worked okay for my purposes. If you want to use a larger LLM (or you want more speed for your agentic use case), I’d go with 64GB+. Even at 32GB, agentic tasks could be slow, especially when running other apps and services.
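If you're not sure how much RAM your machine has, you can check from a terminal before picking a model. On Linux, the command is:
free -h
On MacOS, About This Mac will show you the same information.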
Also: Are AI browsers worth the security risk? Why experts are worried
With sufficient resources, BrowserOS will complete your agentic tasks.
But how do you get there? Let me show you.
I’m going to assume you have BrowserOS installed on your platform of choice.
Installing Ollama
Because Ollama can easily be installed on both MacOS and Windows by downloading the binary installer from the Ollama download page, I'm going to show you how to install it on Linux.
The first thing to do is to open a terminal app.
Next, run the command to install Ollama on Linux, which is:
curl -fsSL https://ollama.com/install.sh | sh
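If you want to confirm the installation succeeded, have Ollama report its version:
ollama --version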
Once that installation finishes, you can then download a model that supports agentic browsing. We'll go with qwen2.5:7b. To pull that model, issue the command:
ollama pull qwen2.5:7b
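To confirm the model landed on your system, you can list your locally installed models:
ollama list
You can also give the model a quick smoke test before involving BrowserOS: ollama run qwen2.5:7b will drop you into an interactive chat with it (type /bye to exit).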
After the model pull is finished, it’s time to configure BrowserOS to use it.
Configure BrowserOS
Let’s configure BrowserOS to use Ollama.
1. Open BrowserOS AI settings
Open the BrowserOS app and then point it to:
chrome://settings/browseros-ai
2. Select Ollama
In the resulting window, click the Use button associated with Ollama. Once you've done that, configure Ollama as follows:
- Provider Type – Ollama
- Provider Name – Ollama Qwen
- Base URL – Leave as is, unless you are running Ollama on a different server within your LAN, in which case replace 127.0.0.1 with the IP address of the hosting server (you can verify the connection with the quick check after this list)
- Model ID – qwen2.5:7b
- Context Window Size – 12800 (but only if you have a powerful system; otherwise, go with a smaller number)
- Temperature – 1
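Before moving on, it's worth confirming that Ollama is actually reachable at that Base URL. Ollama's API listens on port 11434 by default, so assuming you left the Base URL alone, this quick check from a terminal should return a JSON list of your installed models:
curl http://127.0.0.1:11434/api/tags
If the command errors out, Ollama isn't running yet; the next step covers starting it with the settings BrowserOS needs.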
Also: I tested all of Edge’s new AI browser features – and it felt like having a personal assistant
Make sure you select the correct model and bump up the Context Window Size.
Screenshot by Jack Wallen/ZDNET
Make sure you then set the new provider as the default.
3. Stop and start the Ollama service
To make BrowserOS work with Ollama, you have to first stop the Ollama service. To do that on Linux, the command would be:
sudo systemctl stop ollama
Once the service is stopped, you have to start it with CORS (Cross-Origin Resource Sharing) enabled. To do that on Linux and MacOS, run the command:
OLLAMA_ORIGINS="*" ollama serve
To do this step with Windows PowerShell, the command would be:
$env:OLLAMA_ORIGINS="*"; ollama serve
To do this from the Windows Command Prompt, run these two commands (setting the variable on its own line keeps a stray trailing space out of the value):
set OLLAMA_ORIGINS=*
ollama serve
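If you'd rather not restart Ollama by hand every time, Linux users can make the CORS setting persistent through a systemd override instead. This is a sketch of the approach Ollama's docs describe for setting environment variables on the service: run sudo systemctl edit ollama, add the following two lines in the editor that opens, then save and exit:
[Service]
Environment="OLLAMA_ORIGINS=*"
Then reload systemd and restart the service:
sudo systemctl daemon-reload
sudo systemctl restart ollama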
Using the new provider
At this point, you can open the Agent side panel in BrowserOS and run your agentic query, such as the one I suggested above: Open amazon.com and search for a wireless charging stand that supports a Pixel 9 Pro.
Also: I let ChatGPT Atlas do my Walmart shopping for me – here’s how the AI browser agent did
BrowserOS will do its thing and eventually open a tab with the Amazon search results for the query above.
BrowserOS in action with Ollama to find me a new charging stand for my Pixel 9 Pro.
Screenshot by Jack Wallen/ZDNET
Working with an agentic browser does take some getting used to, but once you are accustomed to the practice, you'll find it can be pretty helpful. And by using your locally installed service, you can feel a bit less guilty about using AI.

