
Local LLMs with Ollama

VoidBox can use local models served by Ollama instead of the Anthropic API. The guest VM reaches Ollama through SLIRP networking — no API key required.

1. Prerequisites

Ollama must be installed and serving on the host (it listens on 127.0.0.1:11434 by default), with at least one model pulled. Booting a real VM additionally requires a kernel image and an initramfs on the host (see section 5).
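
For example, to fetch the phi4-mini model used below (assuming the stock ollama CLI; skip ollama serve if the server is already running as a service):

terminal
$ ollama pull phi4-mini
$ ollama serve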

2. How SLIRP networking works

The guest VM uses SLIRP usermode networking. The gateway IP 10.0.2.2 is transparently mapped to 127.0.0.1 on the host:

 Guest VM                        Host
 ┌───────────────┐              ┌──────────────┐
 │ claude-code   │──SLIRP──────>│ Ollama:11434 │
 │ (stream-json) │  10.0.2.2    │ (localhost)  │
 └───────────────┘              └──────────────┘

Inside the guest, ANTHROPIC_BASE_URL=http://10.0.2.2:11434 reaches the host's Ollama process.
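
A quick way to verify the mapping from inside the guest is to query Ollama's model list through the gateway (this assumes curl is available in the guest image):

terminal
$ curl http://10.0.2.2:11434/api/tags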

3. Code example

use void_box::agent_box::VoidBox;
use void_box::llm::LlmProvider;
use void_box::skill::Skill;

#[tokio::main] // agent.run() is async; the tokio runtime here is an assumption
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Model name comes from the environment, defaulting to phi4-mini.
    let model = std::env::var("OLLAMA_MODEL")
        .unwrap_or_else(|_| "phi4-mini".into());

    let agent = VoidBox::new("ollama_demo")
        .llm(LlmProvider::ollama(&model))   // route completions to the host's Ollama
        .skill(Skill::agent("claude-code")) // run claude-code inside the guest
        .memory_mb(256)                     // guest VM memory budget
        .prompt("Write a Python script that prints the first 10 Fibonacci numbers.")
        .build()?;

    // run() blocks until the agent finishes and returns its output.
    let _result = agent.run(None).await?;
    Ok(())
}

4. Running the example

terminal
$ OLLAMA_MODEL=phi4-mini \
  VOID_BOX_KERNEL=/boot/vmlinuz-$(uname -r) \
  VOID_BOX_INITRAMFS=/tmp/void-box-rootfs.cpio.gz \
  cargo run --example ollama_local

5. Environment variables

- OLLAMA_MODEL: name of the Ollama model to use; the example defaults to phi4-mini.
- VOID_BOX_KERNEL: path to the host kernel image used to boot the guest VM.
- VOID_BOX_INITRAMFS: path to the guest's initramfs (root filesystem cpio archive).

Without VOID_BOX_KERNEL set, the example falls back to mock mode (no real VM).
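
To try the example as a quick smoke test in mock mode, omit VOID_BOX_KERNEL entirely (no VM is booted):

terminal
$ OLLAMA_MODEL=phi4-mini cargo run --example ollama_local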

6. Next

See Pipeline Composition to chain Ollama-backed boxes, or define specs with YAML.