# open-web-agent-rs

A Rust-based web agent with an embedded OpenAI-compatible inference server (supports Gemma models only).
## Project Structure

This project is organized as a Cargo workspace with the following crates:

- `agent-server`: The main web agent server
- `local_inference_engine`: An embedded OpenAI-compatible inference server for Gemma models
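Assuming both crates live under `crates/` (as the run commands later in this README suggest), the top-level workspace manifest might look roughly like this sketch; the exact member paths are an assumption, not taken from the repository:

```toml
# Hypothetical top-level Cargo.toml (member paths assumed from the
# `cd crates/local_inference_engine` command used elsewhere in this README)
[workspace]
members = [
    "crates/agent-server",
    "crates/local_inference_engine",
]
resolver = "2"
```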
## Setup

1. Clone the repository
2. Copy the example environment file:

   ```bash
   cp .env.example .env
   ```

3. Install JavaScript dependencies:

   ```bash
   bun i
   ```

4. Start the SearXNG search engine:

   ```bash
   docker compose up -d searxng
   ```
## Running the Project

### Local Inference Engine

To run the local inference engine:

```bash
cd crates/local_inference_engine
cargo run --release -- --server
```
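Once the engine is up, any OpenAI-compatible client can talk to it. The sketch below builds a standard chat-completions request in TypeScript; note that the base URL, port, and model name are assumptions for illustration, not values taken from this repository:

```typescript
// Minimal sketch of a client for the engine's OpenAI-compatible API.
// BASE_URL and the model name are assumptions -- adjust to your setup.
const BASE_URL = "http://localhost:8080";

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build an OpenAI-style chat-completions request body.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return { model, messages, stream: false };
}

// POST the request and extract the assistant's reply
// (OpenAI-compatible servers return it in choices[0].message.content).
async function chat(prompt: string): Promise<string> {
  const body = buildChatRequest("gemma-2b-it", [
    { role: "user", content: prompt },
  ]);
  const res = await fetch(`${BASE_URL}/v1/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```

The same request shape works with the official OpenAI SDKs by pointing their `baseURL` option at the local engine.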
### Agent Server

To run the agent server:

```bash
cargo run -p agent-server
```

### Development Mode

For development with automatic reloading:

```bash
bun dev
```
## Building

To build all crates in the workspace:

```bash
cargo build
```

To build a specific crate:

```bash
cargo build -p agent-server
# or
cargo build -p local_inference_engine
```