
open-web-agent-rs

A Rust-based web agent with local inference capabilities.

Components

Local Inference Engine

The Local Inference Engine provides a way to run large language models locally. It supports both CLI mode for direct text generation and server mode with an OpenAI-compatible API.

Features:

  • Run Gemma models locally (1B, 2B, 7B, 9B variants)
  • CLI mode for direct text generation
  • Server mode with OpenAI-compatible API
  • Support for various model configurations (base, instruction-tuned)
  • Metal acceleration on macOS

See the Local Inference Engine README for detailed usage instructions.
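Since the server mode is OpenAI-compatible, it can be exercised with any OpenAI-style client. A minimal TypeScript sketch is below; the base URL, port, route, and model name are assumptions for illustration, not documented values — check the Local Inference Engine README for the actual defaults.

```typescript
// Sketch of calling an OpenAI-compatible chat endpoint.
// BASE_URL and the model name are assumed values; adjust to your setup.
const BASE_URL = "http://localhost:8080";

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build a request body following the OpenAI chat-completions schema.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return {
    model,
    messages,
    stream: false,
  };
}

async function chat(prompt: string): Promise<string> {
  const body = buildChatRequest("gemma-2b-it", [
    { role: "user", content: prompt },
  ]);
  const res = await fetch(`${BASE_URL}/v1/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  const json = await res.json();
  // OpenAI-compatible servers return the reply at choices[0].message.content.
  return json.choices[0].message.content;
}
```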

Web Server

The web server is being converted to an MCP (Model Context Protocol) server; expect breakage in the meantime.

Install dependencies and start the development server:

```shell
bun i
bun dev
```
Description

An MCP server with embedded inference, SearXNG search, and GenAIScript support.