Documentation Index
Fetch the complete documentation index at: https://docs.oryxai.io/llms.txt
Use this file to discover all available pages before exploring further.
Quick Setup
Navigate to AI Settings
Navigate to chrome://settings/oryx to add Ollama as a provider.
Get Model ID
Get the model ID of your Ollama model (e.g., gpt-oss:20b)
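The model ID is the name shown in your local model list; as a minimal sketch (assuming the `ollama` CLI is installed and on your PATH), you can print it with `ollama list`:

```shell
#!/bin/sh
# List locally installed Ollama models; the NAME column is the model ID
# you enter in the provider settings (e.g. gpt-oss:20b).
if command -v ollama >/dev/null 2>&1; then
  models="$(ollama list)"
else
  # Fall back gracefully when Ollama isn't installed on this machine.
  models="ollama not found on PATH"
fi
echo "$models"
```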
Start Ollama Server
Start Ollama with CORS enabled:

OLLAMA_ORIGINS="*" ollama serve
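This step can also be scripted; the sketch below assumes Ollama's defaults (`OLLAMA_ORIGINS` controls allowed origins, the server listens on port 11434, and `/api/tags` lists installed models) and checks that the server is reachable after starting it:

```shell
#!/bin/sh
# Allow cross-origin requests so the browser can call Ollama's HTTP API.
export OLLAMA_ORIGINS="*"

if command -v ollama >/dev/null 2>&1; then
  ollama serve &          # start the server in the background
  server_pid=$!
  sleep 2                 # give it a moment to bind the port
  # Hitting the model-list endpoint confirms the server is up.
  status="$(curl -s -o /dev/null -w '%{http_code}' http://localhost:11434/api/tags)"
  echo "Ollama responded with HTTP $status"
  kill "$server_pid" 2>/dev/null
else
  status="skipped (ollama not installed)"
  echo "ollama not found on PATH; install it first"
fi
```

Leaving `ollama serve` in the foreground (without `&`) is fine for everyday use; the background form is only for checking reachability in the same script.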
Select and Use
Select the model in the agent and start using it! 🥳
If you don't want to run Ollama from the CLI with CORS settings, we recommend
using LM Studio instead. See the LM Studio setup guide.
Alternative: LM Studio
LM Studio Setup
If you prefer not to run Ollama from the command line, LM Studio provides a
more user-friendly alternative with a graphical interface.