mirror of https://github.com/geoffsee/open-gsio.git
synced 2025-09-08 22:56:46 +00:00
Add Docker support with Dockerfile and docker-compose.yml; update build scripts and README for containerized deployment.

- Updated server `Bun.build` configuration: adjusted `outdir`, added `format: 'esm'`, and set `@open-gsio/client` as external.
- Expanded README with Docker instructions.
- Added new package `@open-gsio/analytics-worker`.
- Upgraded dependencies (`vite`, `typescript`, `bun`) and locked the `pnpm` version in `package.json`.
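For reference, the described bundler settings correspond roughly to this `bun build` invocation (a sketch only; the entrypoint and output paths here are illustrative, not the repo's actual values):

```bash
# Rough CLI equivalent of the updated Bun.build configuration.
# The entrypoint and outdir are illustrative guesses; --format esm
# emits ESM output and --external leaves the client package unbundled.
bun build ./packages/server/src/index.ts \
  --outdir ./dist \
  --format esm \
  --external @open-gsio/client
```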
README.md: 56 lines changed
@@ -14,6 +14,7 @@ This is a full-stack Conversational AI.

- [Installation](#installation)
- [Deployment](#deployment)
- [Docker](#docker)
- [Local Inference](#local-inference)
  - [mlx-omni-server (default)](#mlx-omni-server)
  - [Adding models](#adding-models-for-local-inference-apple-silicon)
@@ -40,6 +41,59 @@ This is a full-stack Conversational AI.

> Note: Subsequent deployments should omit `bun run deploy:secrets`.

## Docker

You can run the server using Docker. The image is large but will be slimmed down in future commits.

### Building the Docker Image

```bash
docker compose build
# OR
docker build -t open-gsio .
```

### Running the Docker Container

```bash
docker run -p 3003:3003 \
  -e GROQ_API_KEY=your_groq_api_key \
  -e FIREWORKS_API_KEY=your_fireworks_api_key \
  open-gsio
```
You can omit any environment variables that you don't need. The server will be available at http://localhost:3003.
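If you'd rather keep keys out of your shell history, Docker's standard `--env-file` flag can load them from a local file (a sketch; the file name and variable names are just the examples above):

```bash
# A hypothetical .env file containing lines like:
#   GROQ_API_KEY=your_groq_api_key
#   FIREWORKS_API_KEY=your_fireworks_api_key
docker run -p 3003:3003 --env-file .env open-gsio
```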
### Using Docker Compose
A `docker-compose.yml` file is provided in the repository. You can edit it to add your API keys:

```yaml
version: '3'
services:
  open-gsio:
    build: .
    ports:
      - "3003:3003"
    environment:
      - GROQ_API_KEY=your_groq_api_key
      - FIREWORKS_API_KEY=your_fireworks_api_key
      # Other environment variables are included in the file
    restart: unless-stopped
```
Then run:

```bash
docker compose up
```

Or, to run in detached mode:

```bash
docker compose up -d
```
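When running detached, the usual Compose commands apply for following logs and shutting down:

```bash
docker compose logs -f   # follow the server logs
docker compose down      # stop and remove the container
```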
## Local Inference

> Local inference is supported for Ollama and mlx-omni-server. OpenAI-compatible servers can be used by overriding `OPENAI_API_KEY` and `OPENAI_API_ENDPOINT`.
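As a sketch of that override, assuming Ollama's OpenAI-compatible endpoint is reachable from inside the container at `http://host.docker.internal:11434/v1` (verify the exact URL format the server expects):

```bash
# Point the server at a local Ollama instance instead of OpenAI.
# The endpoint and key values are assumptions: Ollama ignores the
# API key, but the variable may still need to be set to something.
docker run -p 3003:3003 \
  -e OPENAI_API_KEY=ollama \
  -e OPENAI_API_ENDPOINT=http://host.docker.internal:11434/v1 \
  open-gsio
```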
@@ -116,6 +170,8 @@ I would like to express gratitude to the following projects, libraries, and individuals:

- [Vike](https://vike.dev/) - Framework for server-side rendering and routing
- [Cloudflare Workers](https://developers.cloudflare.com/workers/) - Serverless execution environment
- [Bun](https://bun.sh/) - JavaScript runtime and toolkit
- [Marked.js](https://github.com/markedjs/marked) - Markdown Rendering
- [Shiki](https://github.com/shikijs/shiki) - Syntax Highlighting
- [itty-router](https://github.com/kwhitley/itty-router) - Lightweight router for serverless environments
- [MobX-State-Tree](https://mobx-state-tree.js.org/) - State management solution
- [OpenAI SDK](https://github.com/openai/openai-node) - Client for AI model integration