Using Requesty With Roo Code
Roo Code supports accessing models through the Requesty AI platform. Requesty provides a single, optimized API for interacting with 150+ large language models (LLMs).
Website: https://www.requesty.ai/
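Outside of Roo Code, you can exercise a Requesty key with any OpenAI-style client. The sketch below is not from the Requesty documentation on this page; it assumes an OpenAI-compatible router at `https://router.requesty.ai/v1`, a `REQUESTY_API_KEY` environment variable, and a placeholder model ID. Verify the exact base URL and model names in your Requesty dashboard.

```typescript
import OpenAI from "openai";

async function main() {
  // Assumed OpenAI-compatible router endpoint; confirm the base URL in your dashboard.
  const client = new OpenAI({
    baseURL: "https://router.requesty.ai/v1",
    apiKey: process.env.REQUESTY_API_KEY,
  });

  const response = await client.chat.completions.create({
    model: "openai/gpt-4o-mini", // placeholder model ID; pick one from the Model List page
    messages: [{ role: "user", content: "Say hello from Requesty." }],
  });

  console.log(response.choices[0].message.content);
}

main().catch(console.error);
```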
Getting an API Key
- Sign Up/Sign In: Go to the Requesty website and create an account or sign in.
- Get API Key: You can get an API key from the API Management section of your Requesty dashboard.
Supported Models
Requesty provides access to a wide range of models. Roo Code automatically fetches the latest list of available models; you can browse the full catalog on Requesty's Model List page.
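If you want to inspect the same catalog programmatically (for example, to script model selection), here is a minimal sketch. It assumes Requesty exposes the standard OpenAI-compatible `/models` endpoint at the same router URL; the Model List page remains the authoritative source.

```typescript
import OpenAI from "openai";

async function listModels() {
  const client = new OpenAI({
    baseURL: "https://router.requesty.ai/v1", // assumed router endpoint
    apiKey: process.env.REQUESTY_API_KEY,
  });

  // Standard OpenAI-compatible model listing; whether Requesty serves this
  // endpoint is an assumption, not confirmed by the docs above.
  const page = await client.models.list();
  for (const model of page.data) {
    console.log(model.id);
  }
}

listModels().catch(console.error);
```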
Configuration in Roo Code
- Open Roo Code Settings: Click the gear icon (⚙️) in the Roo Code panel.
- Select Provider: Choose "Requesty" from the "API Provider" dropdown.
- Enter API Key: Paste your Requesty API key into the "Requesty API Key" field.
- Select Model: Choose your desired model from the "Model" dropdown.
Tips and Notes
- Optimizations: Requesty offers a range of in-flight cost optimizations to lower your costs.
- Unified and simplified billing: Get unrestricted access to all providers and models, automatic balance top-ups, and more through a single API key.
- Cost tracking: Track cost per model, coding language, changed file, and more via the Cost dashboard or the Requesty VS Code extension.
- Stats and logs: Review your coding stats on the dashboard or browse your LLM interaction logs.
- Fallback policies: Keep your LLM working for you with fallback policies when providers are down.
- Prompt Caching: Some providers support prompt caching; you can search the Model List for models with caching support (see the sketch after this list).
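Whether and how cached tokens are reported depends on the underlying provider. If Requesty passes through OpenAI-style usage details, you can check for cache hits roughly as sketched below; the usage field, base URL, and model ID are assumptions for illustration, not confirmed Requesty behavior.

```typescript
import OpenAI from "openai";

async function checkCaching() {
  const client = new OpenAI({
    baseURL: "https://router.requesty.ai/v1", // assumed router endpoint
    apiKey: process.env.REQUESTY_API_KEY,
  });

  const response = await client.chat.completions.create({
    model: "anthropic/claude-3-5-sonnet", // hypothetical cache-capable model ID
    messages: [
      { role: "system", content: "A long, stable system prompt that benefits from caching..." },
      { role: "user", content: "Summarize the instructions above." },
    ],
  });

  // If the provider supports prompt caching and usage details are passed
  // through, cached prompt tokens may appear in this OpenAI-style field.
  console.log("Cached prompt tokens:", response.usage?.prompt_tokens_details?.cached_tokens);
}

checkCaching().catch(console.error);
```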