Files
open-gsio/scripts
geoffsee 9e8b427826 Add scripts and documentation for local inference configuration with Ollama and mlx-omni-server
- Introduced `configure_local_inference.sh` to automatically set `.dev.vars` based on whichever local inference service is running (see the detection sketch below).
- Updated `start_inference_server.sh` to handle both the Ollama and mlx-omni-server backends (see the dispatch sketch below).
- Enhanced `package.json` with new commands for starting and configuring local inference servers (see the usage example below).
- Refined the README with updated instructions for running local inference and adding models.
- Minor cleanup in `MessageBubble.tsx`.
2025-06-02 12:50:22 -04:00
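As a rough illustration of the detection step, the sketch below assumes both services expose their default HTTP ports (11434 for Ollama; 10240 is assumed for mlx-omni-server) and that `.dev.vars` only needs an OpenAI-compatible endpoint URL. The probe endpoints and the `OPENAI_API_ENDPOINT` variable name are assumptions, not taken from the actual `configure_local_inference.sh`.

```bash
#!/usr/bin/env bash
# Sketch only: probe each service and point .dev.vars at the first one that responds.
# Ports, probe paths, and the variable name are assumptions, not from the repo.

OLLAMA_URL="http://localhost:11434"
MLX_OMNI_URL="http://localhost:10240"

if curl -sf --max-time 2 "$OLLAMA_URL/api/tags" >/dev/null; then
  echo "OPENAI_API_ENDPOINT=$OLLAMA_URL/v1" > .dev.vars
  echo "Configured .dev.vars for Ollama at $OLLAMA_URL"
elif curl -sf --max-time 2 "$MLX_OMNI_URL/v1/models" >/dev/null; then
  echo "OPENAI_API_ENDPOINT=$MLX_OMNI_URL/v1" > .dev.vars
  echo "Configured .dev.vars for mlx-omni-server at $MLX_OMNI_URL"
else
  echo "No local inference service detected; .dev.vars left unchanged" >&2
  exit 1
fi
```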
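The server-type handling in `start_inference_server.sh` might be a simple dispatch like the sketch below, assuming the backend name is passed as the first argument and that the `ollama` and `mlx-omni-server` CLIs are installed; the real script's interface may differ.

```bash
#!/usr/bin/env bash
# Sketch only: start the requested backend (defaults to Ollama).
SERVER_TYPE="${1:-ollama}"

case "$SERVER_TYPE" in
  ollama)
    ollama serve                 # Ollama's built-in HTTP server
    ;;
  mlx-omni-server)
    mlx-omni-server              # OpenAI-compatible server for Apple Silicon (MLX)
    ;;
  *)
    echo "Usage: $0 [ollama|mlx-omni-server]" >&2
    exit 1
    ;;
esac
```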
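The new `package.json` commands presumably wrap these two scripts as `scripts` entries. The command names below are hypothetical, and the runner is assumed to be bun; substitute npm or pnpm as appropriate.

```bash
# Hypothetical command names; check package.json for the actual entries.
bun run configure:local     # wraps scripts/configure_local_inference.sh
bun run server:inference    # wraps scripts/start_inference_server.sh
```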