
Running Ollama on a Remote Machine With InnerZero

How to connect InnerZero to Ollama running on a different PC, home server, or GPU workstation on your network.

Louie·2026-04-08·5 min read
guide · innerzero · local ai

Maybe you have a beefy GPU in a desktop tower but prefer working on a laptop. Maybe you have a home server with an NVIDIA card sitting idle. Maybe you just picked up a DGX Spark and want to use it with InnerZero.

Good news: you can run Ollama on one machine and InnerZero on another. The AI runs on the powerful hardware. The interface runs wherever you want.

The setup in 5 steps

1. Install Ollama on the remote machine

Go to ollama.com and install Ollama on the machine with the GPU. This is the machine that will actually run the AI models.

2. Set Ollama to accept network connections

By default, Ollama only listens on localhost. You need to tell it to accept connections from other machines on your network.

On Windows, set the environment variable OLLAMA_HOST to 0.0.0.0, then restart Ollama so it picks up the change. On Linux, edit the Ollama systemd service to include Environment="OLLAMA_HOST=0.0.0.0" and restart the service.

This tells Ollama to listen on all network interfaces, not just localhost.
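Concretely, this looks something like the following (exact service names can vary by install; these commands follow Ollama's standard Windows and systemd setups):

```shell
# Windows (PowerShell): persist the variable for the current user,
# then quit and relaunch Ollama so it reads the new value.
setx OLLAMA_HOST "0.0.0.0"

# Linux (systemd install): add the variable via a service override.
sudo systemctl edit ollama.service
# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

Binding to 0.0.0.0 means every interface on the machine; if you want to expose Ollama on only one network interface, you can bind to that interface's IP instead.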

3. Pull your models on the remote machine

On the remote machine, open a terminal and pull the models you want:

ollama pull qwen3:8b
ollama pull gemma3:1b

Pick models based on the remote machine's hardware capabilities. Since it has the GPU, you can potentially run larger models than your local machine would support.
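You can confirm the models downloaded correctly and that Ollama is serving them by querying its tags endpoint on the remote machine:

```shell
# On the remote machine: list the models Ollama is serving.
# The JSON response should include qwen3:8b and gemma3:1b
# in its "models" array.
curl http://localhost:11434/api/tags
```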

4. Configure InnerZero to use the remote host

Open InnerZero on your local machine. Go to Settings > General > Ollama Connection. Enter the remote machine's IP address and Ollama port:

http://192.168.1.50:11434

Replace 192.168.1.50 with your remote machine's actual local IP address. The default Ollama port is 11434.

Click Test to verify the connection works. You should see a green status and a list of available models on the remote machine. Then click Save.
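If the Test button reports a failure, it helps to check reachability from the local machine directly; something like this sketch (substitute your remote machine's actual IP):

```shell
# From the local machine: confirm the remote Ollama instance is reachable.
curl http://192.168.1.50:11434/api/tags

# Optional end-to-end check: ask a small model for a completion.
curl http://192.168.1.50:11434/api/generate \
  -d '{"model": "gemma3:1b", "prompt": "Say hello", "stream": false}'
```

If curl times out, the usual suspects are the OLLAMA_HOST setting from step 2 not being applied, or a firewall on the remote machine blocking port 11434.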

5. Use InnerZero normally

That's it. Everything else works the same. Chat, voice, tools, memory. The only difference is that the AI model runs on the remote machine instead of locally. InnerZero handles the routing transparently.

The setup wizard will still detect your local hardware for tier assignment. But you can pull larger models on the remote machine manually through Ollama if your remote hardware supports them.

What works well

LAN connections are ideal. If both machines are on the same home or office network, latency is minimal. The experience feels identical to running locally.

VPN connections work too. If you're away from home but have a VPN back to your network, you can use the remote Ollama instance from anywhere. Latency will be higher, depending on your VPN speed.

What to watch out for

Security. Ollama has no built-in authentication. Anyone who can reach the IP and port can use your Ollama instance. On a home network behind a router, this is usually fine. On a public network, you should use a firewall to restrict access, or set up a reverse proxy with authentication.
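On Linux, restricting access with a host firewall can be as simple as the following ufw sketch (the 192.168.1.0/24 subnet here is an example; use your own LAN's range):

```shell
# Allow only machines on the local subnet to reach Ollama's port,
# and reject everyone else.
sudo ufw allow from 192.168.1.0/24 to any port 11434 proto tcp
sudo ufw deny 11434/tcp
```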

InnerZero will show a warning if you configure a non-private IP address. Take it seriously.

Latency. Over a fast LAN, you won't notice any difference. Over a slow WAN connection, there will be a delay before responses start. The models themselves run just as fast, but the network adds round-trip time.

Ollama must be running. Unlike local mode where InnerZero starts and stops Ollama automatically, remote mode assumes you keep Ollama running on the remote machine yourself. InnerZero won't try to start or stop it.
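On a Linux remote machine with the systemd install, you can make sure Ollama survives reboots rather than starting it by hand:

```shell
# Start Ollama now and have it start automatically at boot.
sudo systemctl enable --now ollama

# Check that the service is active.
systemctl status ollama --no-pager
```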

Resetting to local

If you want to go back to running everything locally, open Settings and click Reset to Local in the Ollama Connection section. InnerZero will start managing its own local Ollama instance again on the next launch.

Get started

Download InnerZero on the machine you want to use as your interface. Set up Ollama on the machine with the GPU. Connect them. For details on what hardware you need on the remote side, check our hardware guide.

