# open-web-agent-rs
A Rust-based web agent with local inference capabilities.
## Components
### Local Inference Engine
The Local Inference Engine provides a way to run large language models locally. It supports both CLI mode for direct text generation and server mode with an OpenAI-compatible API.
Features:
- Run Gemma models locally (1B, 2B, 7B, 9B variants)
- CLI mode for direct text generation
- Server mode with OpenAI-compatible API
- Support for various model configurations (base, instruction-tuned)
- Metal acceleration on macOS
See the Local Inference Engine README for detailed usage instructions.
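Because server mode exposes an OpenAI-compatible API, any OpenAI-style client request should work against it. A minimal sketch in TypeScript of building such a request — the endpoint path, port, and model name here are illustrative assumptions, not values confirmed by this README:

```typescript
// Shape of a single chat message in the OpenAI-style request body.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build a request body for a chat completion call.
// "gemma-2b-it" is a placeholder model name for illustration.
function buildChatRequest(prompt: string, model = "gemma-2b-it") {
  return {
    model,
    messages: [{ role: "user", content: prompt } as ChatMessage],
    stream: false,
  };
}

// Example usage against a locally running server (port is an assumption):
// const res = await fetch("http://localhost:8080/v1/chat/completions", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(buildChatRequest("Hello!")),
// });
// const data = await res.json();

console.log(JSON.stringify(buildChatRequest("Hello!")));
```

The fetch call is left commented out so the snippet stands alone; with the server running, the response follows the usual OpenAI chat-completion schema.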
### Web Server
The server is currently being converted to MCP, so parts of it may be broken.
To install dependencies and start the dev server:

```sh
bun i
bun dev
```