open-web-agent-rs

A Rust-based web agent with an embedded OpenAI-compatible inference server (supports Gemma models only).

Project Structure

This project is organized as a Cargo workspace with the following crates:

  • agent-server: The main web agent server
  • local_inference_engine: An embedded OpenAI-compatible inference server for Gemma models
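
The workspace manifest ties these crates together. A minimal sketch is shown below; the member paths are assumptions inferred from the build commands later in this README (local_inference_engine lives under crates/, and agent-server is assumed to as well) — check the repository's actual Cargo.toml.

```toml
# Sketch of the top-level Cargo.toml workspace manifest.
# Member paths are assumptions; verify against the real file.
[workspace]
resolver = "2"
members = [
    "crates/agent-server",
    "crates/local_inference_engine",
]
```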

Setup

  1. Clone the repository
  2. Copy the example environment file:
    cp .env.example .env
    
  3. Install JavaScript dependencies:
    bun i
    
  4. Start the SearXNG search engine:
    docker compose up -d searxng
    

Running the Project

Local Inference Engine

To run the local inference engine:

cd crates/local_inference_engine
cargo run --release -- --server
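
Once the server is up, it speaks the OpenAI-style HTTP API. The sketch below sends a chat completion request; the port (8080) and model id ("gemma") are assumptions for illustration — check the server's startup output or `--help` for the actual values.

```shell
# Hypothetical example: port 8080 and model id "gemma" are assumptions.
BODY='{"model": "gemma", "messages": [{"role": "user", "content": "Hello"}]}'
curl -s http://localhost:8080/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -d "$BODY" \
  || echo "request failed -- is the inference engine running?"
```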

Agent Server

To run the agent server:

cargo run -p agent-server

Development Mode

For development with automatic reloading:

bun dev

Building

To build all crates in the workspace:

cargo build

To build a specific crate:

cargo build -p agent-server
# or
cargo build -p local_inference_engine

Description

An MCP server with embedded inference, SearXNG search, and GenAIScript.

License

MIT