
Roo Code 3.8 Release Notes (2025-03-13)

This release cycle introduces opt-in telemetry, significant terminal architecture improvements, a multi-diff editing strategy, .rooignore support, and a redesigned settings page, alongside numerous provider updates and bug fixes.

Key Features & Enhancements

  • Opt-in Telemetry: Added opt-in telemetry to help us improve Roo Code faster. (thanks Cline!)
  • Terminal Architecture Refactor: Major refactor of the terminal architecture to address critical issues, including terminal overload and gray screen problems. (thanks @KJ7LNW!)
  • Multi-Diff Editing Strategy (Experimental): Introduced a new experimental diff editing strategy that applies multiple diff edits at once. (thanks @qdaxb!)
  • .rooignore Support: Added support for a .rooignore file to prevent Roo Code from reading/writing specified files, with an option to exclude them from search/lists (see the example after this list). (thanks Cline!)
  • Redesigned Settings Page: The settings page has been redesigned for easier navigation.
  • New Task Tool Enhancements: The new_task tool now returns results to the parent task on completion for better orchestration. (thanks @shaybc!)
  • Simultaneous Roo Instances: Support for running Roo Code in multiple editor windows simultaneously. (thanks @samhvw8!)
  • Asynchronous Checkpoints: Checkpoints are now asynchronous and exclude more files to speed them up.
  • Human Relay Provider: Added a new "Human Relay" provider that lets you manually copy information to a web-based AI and paste its response back. (thanks @NyxJae!)
  • MCP over SSE: Support for MCP (Model Context Protocol) over Server-Sent Events (see the configuration sketch after this list). (thanks @aheizi!)
  • Remote Browser Connections: Support for remote browser connections. (thanks @afshawnlotfi!)
  • Auto-Approval for Subtasks: Added an auto-approval toggle for subtask creation and completion. (thanks @shaybc!)
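
For illustration, here is a hypothetical .rooignore file. This is a minimal sketch assuming the file follows .gitignore-style pattern syntax; the specific patterns below are placeholders, not part of the release.

```
# Hypothetical .rooignore (assumes .gitignore-style patterns)
node_modules/
dist/
*.pem
secrets/
```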
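
With the SSE transport, an MCP server is reached over HTTP rather than spawned as a local process. The snippet below is a minimal sketch assuming an mcpServers-style settings entry with a url field pointing at the server's SSE endpoint; the exact settings file name and schema in your Roo Code version may differ.

```json
{
  "mcpServers": {
    "my-sse-server": {
      "url": "http://localhost:3000/sse"
    }
  }
}
```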

Provider Updates

  • Vertex AI Credential-Based Auth: Added credential-based authentication for Vertex AI. (thanks @eonghk!)
  • DeepSeek Provider Updates: Updated DeepSeek provider with correct baseUrl and caching. (thanks @olweraltuve!)
  • OpenAI Observability: Added observability for OpenAI providers. (thanks @refactorthis!)
  • LM Studio Speculative Decoding: Added support for speculative decoding for LM Studio local models. (thanks @adamwlarson!)
  • PowerShell Command Handling: Added PowerShell-specific command handling. (thanks @KJ7LNW!)
  • OpenAI-Compatible DeepSeek/QwQ Reasoning: Support for reasoning with DeepSeek/QwQ via OpenAI-compatible provider. (thanks @lightrabbit!)
  • Anthropic-Style Prompt Caching (OpenAI-Compatible): Added Anthropic-style prompt caching in the OpenAI-compatible provider. (thanks @dleen!)
  • Deepseek R1 for AWS Bedrock: Added Deepseek R1 support for AWS Bedrock. (thanks @ATempsch!)
  • Gemini 2.0 Pro Exp (Vertex): Added gemini-2.0-pro-exp-02-05 model to Vertex. (thanks @shohei-ihaya!)
  • Custom ARNs in AWS Bedrock: Support for custom ARNs in AWS Bedrock (see the example after this list). (thanks @Smartsheet-JB-Brown!)
  • o3-mini Support (OpenAI-Compatible): Added o3-mini support to the OpenAI-compatible provider. (thanks @yt3trees!)
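
Custom ARNs are useful when a Bedrock resource such as provisioned throughput or a cross-region inference profile should be used instead of a plain model ID. The lines below only illustrate the standard AWS ARN shape; the region, account ID, and resource IDs are placeholders.

```
arn:aws:bedrock:us-east-1:123456789012:provisioned-model/abcd1234efgh
arn:aws:bedrock:us-east-1:123456789012:inference-profile/us.anthropic.claude-3-7-sonnet-20250219-v1:0
```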

QOL Improvements

  • Improved UI for Selectors & Headers: Enhancements to mode/provider selectors in chat and task headers. (thanks @monotykamary!)
  • Context Mention Path Handling (Windows): Improved context mention path handling on Windows. (thanks @samhvw8!)
  • Configuration Profile Dropdown UI: Improved UI for the configuration profile dropdown. (thanks @DeXtroTip!)
  • Reserved Output Tokens Display: Reserved output tokens are now shown in the context window visualization.
  • Disable Create/Edit Custom Modes Instructions: Added an option to save tokens by disabling instructions for creating/editing custom modes. (thanks @hannesrudolph!)

Bug Fixes

  • Fixed VS Code LM API model picker truncation.
  • Fixed encoding issue causing unreadable characters at the beginning of files.
  • Fixed settings dropdown truncation.
  • Fixed bug where custom temperature could not be unchecked (thanks @System233!).
  • Fixed bug with decimal price entry for OpenAI-compatible providers (thanks @System233!).
  • Fixed bug with enhance prompt on Sonnet 3.7 (thanks @moqimoqidea!).
  • Fixed bug with context window management for thinking models (thanks @ReadyPlayerEmma!).
  • Fixed bug where checkpoints were no longer enabled by default.
  • Fixed MarkdownBlock text color for Dark High Contrast theme (thanks @cannuri!).
  • Fixed browser system prompt inclusion rules (thanks @cannuri!).
  • Fixed OpenAI-style cost calculations (thanks @dtrugman!).
  • Fixed issue allowing use of an excluded directory as working directory (thanks @Szpadel!).
  • Fixed OpenRouter custom baseUrl support.
  • Fixed usage tracking for SiliconFlow and other providers.
  • Temporarily rolled back the multi-diff progress indicator to fix a double-confirmation issue.

Misc Improvements

  • Telemetry for Versions: Added extension and VSCode versions to telemetry.
  • Kotlin Language Support: Added Kotlin support in list_code_definition_names tool. (thanks @kohii!)
  • Diff Application Error Handling: Better handling of diff application errors. (thanks @qdaxb!)
  • Updated Bedrock Prices: Updated Bedrock prices to the latest. (thanks @Smartsheet-JB-Brown!)
  • Telemetry for Checkpoints & Diff Strategies: Added telemetry for checkpoint save/restore/diff and diff strategies.
  • Updated MCP Servers Directory Path: Updated path for platform compatibility. (thanks @hannesrudolph!)