# Providers
OpenSlop supports any AI model provider that is compatible with the OpenAI API format. Below are the major providers and the Base URL endpoint to use for each in your OpenSlop configuration.
> [!WARNING]
> **Model Selection is Critical:** OpenSlop acts as an autonomous agent that reads, writes, and executes commands. Small, lower-tier models often lack the reasoning capabilities required to operate the agent safely. For the best quality code and a stable experience, we strongly recommend using state-of-the-art frontier models like Claude 3.5 Sonnet, GPT-4o, or equivalent high-tier open-source models.
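Because every provider below speaks the same OpenAI-compatible protocol, switching providers is just a matter of swapping the Base URL and API key. Here is a minimal sketch of the shared request shape, using only the Python standard library; the API key, model name, and Base URL are illustrative placeholders, not OpenSlop defaults:

```python
import json
import urllib.request

# Illustrative placeholders -- substitute any Base URL / key / model below.
BASE_URL = "https://openrouter.ai/api/v1"
API_KEY = "sk-..."           # your provider API key
MODEL = "your/model-name"    # provider-specific model identifier

# All OpenAI-compatible providers accept this same JSON payload
# at the POST /chat/completions endpoint under their Base URL.
payload = {
    "model": MODEL,
    "messages": [{"role": "user", "content": "Hello!"}],
}

request = urllib.request.Request(
    url=BASE_URL.rstrip("/") + "/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# urllib.request.urlopen(request) would send it; omitted here so the
# sketch runs without a real key.
print(request.full_url)  # https://openrouter.ai/api/v1/chat/completions
```

The same payload and headers work against any Base URL on this page; only the model identifier changes per provider.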
## OpenRouter

A unified interface offering access to hundreds of different open-source and proprietary models.

- Base URL: `https://openrouter.ai/api/v1`
## Groq

Known for extremely fast LPU inference; a great fit for running open-source models like Llama.

- Base URL: `https://api.groq.com/openai/v1`
## Google Gemini

Google’s natively multimodal models. To use Gemini models with the OpenAI SDK format, use their specialized compatibility endpoint.

- Base URL: `https://generativelanguage.googleapis.com/v1beta/openai/`
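Note the trailing slash on the Gemini endpoint: HTTP clients that resolve endpoint paths relative to the base URL will silently drop the final path segment if the slash is missing, which is a common source of 404s. A quick sketch of the difference using the standard library's `urljoin`:

```python
from urllib.parse import urljoin

# With the documented trailing slash, relative resolution keeps the full path.
with_slash = urljoin(
    "https://generativelanguage.googleapis.com/v1beta/openai/",
    "chat/completions",
)

# Without it, the final segment ("openai") is replaced, not appended.
without_slash = urljoin(
    "https://generativelanguage.googleapis.com/v1beta/openai",
    "chat/completions",
)

print(with_slash)     # .../v1beta/openai/chat/completions
print(without_slash)  # .../v1beta/chat/completions  (wrong endpoint)
```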
## Together AI

Offers fast inference for leading open-source models, including Llama 3, Mixtral, and more.

- Base URL: `https://api.together.xyz/v1`
## DeepSeek

Provider of powerful open-weight models natively compatible with OpenAI’s API format.

- Base URL: `https://api.deepseek.com/v1`
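For scripting around OpenSlop, the endpoints above can be collected into a simple lookup table. The function name and dictionary keys below are illustrative helpers, not part of OpenSlop itself; the URLs are exactly those documented on this page:

```python
# Base URLs exactly as documented above; keys are illustrative labels.
OPENAI_COMPATIBLE_PROVIDERS = {
    "openrouter": "https://openrouter.ai/api/v1",
    "groq": "https://api.groq.com/openai/v1",
    "gemini": "https://generativelanguage.googleapis.com/v1beta/openai/",
    "together": "https://api.together.xyz/v1",
    "deepseek": "https://api.deepseek.com/v1",
}

def base_url_for(provider: str) -> str:
    """Return the documented Base URL, failing loudly on unknown providers."""
    try:
        return OPENAI_COMPATIBLE_PROVIDERS[provider]
    except KeyError:
        known = ", ".join(sorted(OPENAI_COMPATIBLE_PROVIDERS))
        raise ValueError(f"Unknown provider {provider!r}; expected one of: {known}")
```

Failing loudly on an unknown key beats a silent fallback here: a typo'd provider name should never cause requests to be sent to the wrong endpoint.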