IO Intelligence Provider

The IO Intelligence provider gives you access to a wide range of AI models, including the Llama, DeepSeek, Qwen, and Mistral families, through a unified API.

Configuration

To use the IO Intelligence provider, you will need to add it to your ~/.roo/config.json file.

  1. Get your API key: You can generate one on the IO Intelligence website.
  2. Add the provider to your config: Add the following entry to your config.json file:
{
  "providers": [
    {
      "id": "io-intelligence",
      "apiKey": "YOUR_IO_INTELLIGENCE_API_KEY"
    }
  ]
}

Available Models

The IO Intelligence provider supports the following models:

  • llama-3-70b
  • deepseek-coder-v2
  • qwen-2-72b
  • mistral-large

You can specify which model to use in your API configuration profiles.
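For example, a provider entry that also selects one of the models listed above might look like the sketch below. The "model" field name is an assumption used for illustration and is not confirmed by this page; check the configuration reference for your version for the exact key and accepted values.

{
  "providers": [
    {
      "id": "io-intelligence",
      "apiKey": "YOUR_IO_INTELLIGENCE_API_KEY",
      "model": "llama-3-70b"
    }
  ]
}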