- Introduce `supportedModels` in `ClientChatStore` and update the model validation logic to check against that list (see the first sketch after this list)
- Adapt OpenAI inference for local server setups and improve the streaming options (see the streaming sketch after this list)
- Modify `ChatService` to fetch models from either the local or the remote endpoint (see the model-fetching sketch after this list)
- Update input menu to dynamically fetch and display supported models
- Add `start_inference_server.sh` for starting a local inference server
- Upgrade OpenAI SDK to v5.0.1 and adjust dependencies accordingly
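
A minimal sketch of the `supportedModels` change, assuming a plain class-based store; the real `ClientChatStore`, the `ModelInfo` shape, and the fallback-selection behaviour shown here are illustrative rather than taken from the diff.

```ts
// Hypothetical sketch; the actual ClientChatStore may be built on a
// state-management library with different semantics.
interface ModelInfo {
  id: string;
  name: string;
}

class ClientChatStore {
  // Models reported by the active backend (local or remote) that the client may select.
  supportedModels: ModelInfo[] = [];

  selectedModelId: string | null = null;

  setSupportedModels(models: ModelInfo[]): void {
    this.supportedModels = models;
    // If the previously selected model is no longer supported, fall back to the first entry.
    if (!this.isSupportedModel(this.selectedModelId)) {
      this.selectedModelId = models[0]?.id ?? null;
    }
  }

  // Validation checks against the dynamic supportedModels list
  // instead of a hard-coded set of model names.
  isSupportedModel(id: string | null): boolean {
    return id !== null && this.supportedModels.some((m) => m.id === id);
  }
}
```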
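A sketch of the local-setup and streaming adaptations, assuming a local OpenAI-compatible server; the `LOCAL_INFERENCE` flag, the `http://localhost:8080/v1` base URL, and the `streamCompletion` helper are assumptions, not the PR's actual code.

```ts
import OpenAI from "openai";

const isLocal = process.env.LOCAL_INFERENCE === "true"; // assumed env flag

const client = new OpenAI({
  // Local OpenAI-compatible servers typically ignore the API key, but the SDK requires one.
  apiKey: isLocal ? "not-needed" : process.env.OPENAI_API_KEY,
  baseURL: isLocal ? "http://localhost:8080/v1" : undefined,
});

export async function streamCompletion(model: string, prompt: string): Promise<string> {
  const stream = await client.chat.completions.create({
    model,
    messages: [{ role: "user", content: prompt }],
    stream: true,
    // Ask the server to append token usage to the final chunk.
    stream_options: { include_usage: true },
  });

  let text = "";
  for await (const chunk of stream) {
    // The final usage chunk has an empty choices array, hence the optional access.
    text += chunk.choices[0]?.delta?.content ?? "";
  }
  return text;
}
```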
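A rough sketch of how `ChatService` might branch between local and remote model listings; the endpoint paths, the `useLocal` flag, and the response shape are assumptions.

```ts
interface ModelInfo {
  id: string;
}

export class ChatService {
  constructor(private readonly localBaseUrl = "http://localhost:8080/v1") {}

  async fetchModels(useLocal: boolean): Promise<ModelInfo[]> {
    const url = useLocal
      ? `${this.localBaseUrl}/models` // local OpenAI-compatible server
      : "/api/models"; // app backend proxying the remote provider
    const res = await fetch(url);
    if (!res.ok) {
      throw new Error(`Model listing failed: ${res.status}`);
    }
    const body = await res.json();
    // OpenAI-style responses wrap the list in a `data` field.
    return Array.isArray(body) ? body : body.data;
  }
}
```

On open, the input menu can call `fetchModels` and pass the result to `ClientChatStore.setSupportedModels`, which is roughly the wiring the menu change presumably adds to keep the model list in sync with whichever backend is active.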
Moved several libraries, including `@anthropic-ai/sdk` and the `@babel`-scoped packages, from `dependencies` to `devDependencies` in `package.json` and `bun.lock`. This keeps build-time packages out of the production install, shrinking its footprint.
Updated the `build` script in `package.json` to invoke `server:build` instead of `worker:build`. Removed redundant dependencies from `bun.lock` to clean up the project and reduce package bloat.