- Dec 16, 2025
🎁 Raycast Wrapped 2025
Raycast wraps up 2025 with Wrapped insights and a new highlight command, inviting users to share stat summaries. The update also brings File Search to parity with Windows and fixes a hotkey trigger issue on the lock screen.
As 2025 comes to a close, it’s the perfect time to reflect on how Raycast has helped you be more productive and stay in flow.
Take a look at your Raycast Wrapped to get insights on your Raycast launches, extensions usage, your most used apps, developer stats, time spent on meetings, AI usage, Focus stats, and so much more!
Look out for the Raycast Wrapped 2025 command, and make sure to share your summary by clicking Copy as Image in each section. We’d love to see your stats.
Disclaimer: Most of the data displayed is stored locally, and Raycast does not have access to it. If you moved between machines during the year, some of the data might have been lost. If you’ve used Raycast for less than 14 days, you will not have enough data to be displayed – but there’s always next year!
💎 Improvements
File Search: Updated File Search Beta to be at parity with Windows File Search.
🐞 Fixes
Lock Screen: Fixed an issue where the lock screen command could not be triggered reliably via hotkeys.
- Sep 15, 2025
💻 macOS Tahoe Ready
macOS Tahoe is out, bringing many user interface changes.
While there are no major new features, we've focused on implementing some of these design tweaks in Raycast, such as Liquid Glass controls in AI Chat.
Improvements
- Focus: Support for Comet browser
Fixes
- AI Chat: Window can now be made full-screen again
- AI Chat: Floating window will now follow the current space
- Auto-quit: Fixed video conference apps being terminated while audio, video, or screen sharing was active (macOS Tahoe)
- Jul 30, 2025
🎙️ Auto Transcribe with Granola, Auto Models & Bring Your Own Models
Auto Transcribe now captures and summarizes meetings in the background, with Auto Join for seamless calls. New AI Chat text settings and an experimental Auto Model plus BYOM support expand AI choices. Numerous fixes boost stability and compatibility across components.
✨ New
Never take meeting notes again (😱) with Auto Transcribe, powered by Granola. Your meetings can now be transcribed and usefully summarized so you can stay focused on the discussion. Auto Transcribe works together with Auto Join Meetings so you can seamlessly jump into calls while Granola captures all the important details in the background.
Go to Settings → Extensions → My Schedule to enable and customize. You can decide which meetings this applies to, as well as whether to show a confirmation so you’re never caught off guard before joining your next call.
- AI Chat Text Settings: You can now control the text size and line spacing in AI Chat independently of the main window. See Settings → AI → AI Chat
🧪 New AI Experiments
- Auto Model: Let AI choose the best model for the job at hand. We’ve enabled this by default so you can simply pick Auto when choosing a model and the most suitable model will be used for your prompt. We’re currently testing this (hence, experimental!) and will be fine tuning it as we receive your feedback. Please use the 👍 👎 and let us know how it’s going in #ai-experiments. Check out the AI Manual for more details.
- Bring Your Own Models: Advanced users can now add any OpenAI-compatible LLM provider to Raycast AI. BYOM (Custom Providers) is disabled by default.
Note: To enable or disable AI Experiments, go to Settings → AI → Experiments
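BYOM works because most self-hosted gateways and third-party providers speak the same wire format as OpenAI's Chat Completions endpoint. As a rough sketch of what "OpenAI-compatible" means in practice (the model name below is an illustrative placeholder, not a Raycast settings value):

```python
# Sketch of the request shape an "OpenAI-compatible" provider must accept.
# Any backend that takes this JSON at POST <base-url>/chat/completions and
# streams back OpenAI-style response chunks can, in principle, be plugged in.
import json

def build_chat_request(model: str, user_message: str) -> dict:
    """Build a Chat Completions payload in the OpenAI wire format."""
    return {
        "model": model,  # placeholder; use whatever your provider exposes
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_message},
        ],
        "stream": True,  # clients like Raycast typically stream tokens
    }

print(json.dumps(build_chat_request("my-local-model", "Hello!"), indent=2))
```

This is also why options like temperature can differ per provider: not every backend supports every OpenAI parameter, which the "do not pass temperature if not supported" fix below reflects.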
🐞 Fixes
- Root Search: Do not present AI Extensions popover for @ prefix
- AI Chat: Fixed chat image capture layout
- AI Chat: Ensure AI Chat window is not shown before screen capture
- AI Chat: Ensure Escape cancels screen area capture
- AI Chat: Starting a new chat no longer cancels current completion if chat branching is enabled
- AI Chat: Fix image upload for custom models
- Markdown: Fixed hang when processing HTML comments
- Quick AI: Fix duplicate messages when regenerating a response
- MCP: Improved compatibility with HTTP servers including Github and Vercel
- MCP: Fixed an issue running mcpm servers
- BYOM: Do not pass temperature if not supported by the model
- BYOM: Fix reasoning for Google API
- Jul 16, 2025
v1.101.0
New experimental Chat Branching lets you create branched chats and explore paths without losing context. OpenRouter integration expands model access, plus Clipboard History and Riverside/StreamYard calendar support streamline workflows. AI improvements and fixes enhance reliability.
🌿 Chat Branching (Experimental)
Create alternate conversation paths from any point in your chat history. Think of it as a "save point" where you can explore different directions without losing your original conversation. With background execution, you can kick off multiple branches that continue processing even when you switch between chats.
To use: From any AI Chat, simply press CMD + Shift + B to create a new branched chat and continue the conversation. Alternatively, you can right-click a chat in the sidebar and select Branch Chat.
Note: This is an experimental feature. We’d like to see if you find it useful and how we might improve it. We’ve enabled it by default as it won’t disrupt your existing workflow, but if you’d like to disable it, go to AI Settings → Experiments. Please share your thoughts and feedback in #ai-experiments.
✨ New
- OpenRouter: We’ve added support for OpenRouter in order to give you even more control and flexibility over your AI usage in Raycast. OpenRouter provides access to a wide range of public models and has a beautiful, easy interface to manage your usage and costs without a subscription. To use: Add your OpenRouter API key in AI Settings → Custom Providers → OpenRouter. Once the key is validated, OpenRouter models will appear in the model picker.
- Clipboard History: Added a new Paste Sequentially command that lets you paste previously copied content one after another into your frontmost application. It’s best used with a hotkey, e.g. ⌥ + ⇧ + V.
- Calendar: Added support for Riverside and StreamYard as conference providers for upcoming meetings.
💎 Improvements
- AI: Added a Show Tool Call Info switch in Settings → AI. When enabled you can view arguments and output for all tool calls in the chat
- AI: We now show a warning in the chat if there is an on-going incident with the LLM provider
- AI Chat: Composer text is now preserved when switching between chats
- AI Chat: Improved handling of presets in new chat menu
- MCP: The server version is now displayed when viewing a server in Manage Servers
- MCP: Added DENO_PATH and NODE_PATH to stdio environment
🐞 Fixes
- AI: Always use the original image when saving and sharing a chat image
- AI: Fixed cursor jumping after mentioning an AI Extension in root search
- AI: Fixed chat message layout corruption in Tahoe developer betas
- AI: Improved model fallback handling when using an API key
- AI Chat: Fixed jumping to first pinned chat after completion when >20 pinned items
- AI Chat: Generated chat titles should now use the user’s language
- Quick AI: Fixed Regenerate with Model action for commands
- MCP: Fixed the Streamable HTTP client accept header
- MCP: Fixed an issue parsing large numbers
- MCP: The HTTP client is now compatible with Atlassian’s MCP server
- Jun 11, 2025
v1.100.0
Raycast now supports Bring Your Own Key, letting you use your own AI API keys for Anthropic, Google, and OpenAI without a Pro plan. A new Manage Models command, experimental AI features with early tests, and a Safari extension fix improve flexibility and reliability.
🔑 Bring Your Own Key
Bring Your Own Key (BYOK) seamlessly integrates your existing AI provider accounts with Raycast's intuitive interface. With BYOK you can now use Raycast AI with your own API keys for Anthropic, Google, and OpenAI. This allows you to send as many AI messages as you want at your own cost without a Pro subscription. To add an API key, open Settings -> AI and scroll down to the Custom API Keys section.
✨ New
- Manage Models: A new command to view and manage all your AI models. You can easily disable individual models or all models from the same provider using the command.
- Experimental AI features: The AI landscape moves quickly — what works with one provider today might behave differently tomorrow, or might not be consistently supported across different models. You can now toggle on experimental AI features to try out early-stage features that we’re exploring, testing and trialing. Turn them on in AI Settings and join #ai-experiments in our Slack Community to share your thoughts!
Available Experiments:
- 🎏 MCP HTTP Servers: Support for HTTP MCP servers using the SSE (Server Sent Events) and Streamable protocols.
- 🦙 AI Extensions for Ollama models: If you’ve got Ollama installed, you can try out tool calling with local models. Since tool choice and streaming for tool calls aren’t supported by Ollama just yet, this can be a bit unreliable - which is why it’s experimental 😉
💎 Improvements
- Spotlight: Added new “Replace Spotlight with Raycast” command for a smoother transition to your favorite launcher
- AI Chat: Added new Send Active Window to AI Chat, Send Selected Area to AI Chat, and Send Focused Browser Tab to AI Chat commands to capture context even quicker
🐞 Fixes
- Safari Extension: Fixed an issue where the connection to the app would be closed and couldn’t reconnect. Make sure to update the Safari Extension to the latest version in the App Store.
- May 21, 2025
🦙 Local Models
Launch of Local Models with Ollama integration lets you run 100+ open source LLMs locally, with experimental AI Extensions support. Some tool calls may be unreliable, but setup is easy via AI Settings. Also brings improvements and bug fixes across MCP, AI, and Export workflows.
✨ New
Local models allow you to run nearly any open source LLM locally, on your machine. Through our new integration with Ollama, you’ll now have access to more than 100 AI models from various providers ranging from small 135M to massive 671B parameter models. We’ve also added experimental support for AI Extensions with Local Models. Since tool choice and streaming for tool calls aren’t supported by Ollama just yet, AI Extensions can be a bit unreliable when using Local Models. However, if you want to try it out, you can enable it by going to AI Settings. Keep in mind, it likely won’t be as reliable as using non-local models.
- Local Models: Get started by installing Ollama. Then download Ollama models directly from Raycast Settings, in the Local Models section of the AI tab, by copying and pasting model names. You can find the list of all available Ollama models here.
- Local Models: Support for Vision with local models that support it. You can find the list of supported models here.
- Local Models: Experimental support for AI Extensions with local models that support tool calls. You can find the list of supported models here.
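Under the hood, Ollama serves downloaded models over a local HTTP API (port 11434 by default). As an illustrative sketch of what a chat request to a local model looks like outside Raycast — the model tag below is a placeholder, and actually sending the request requires a running Ollama instance:

```python
import json

# Ollama's default local chat endpoint (assumes a default install).
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_ollama_request(model: str, prompt: str) -> dict:
    """Build a chat request body for Ollama's local HTTP API."""
    return {
        "model": model,  # e.g. a tag you pulled in the Local Models settings
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one complete response instead of a stream
    }

body = json.dumps(build_ollama_request("smollm:135m", "Say hi")).encode()
# To actually send it against a running instance, POST `body` to OLLAMA_URL
# with a Content-Type: application/json header.
print(body[:60])
```

Everything stays on your machine: the request and the model weights never leave localhost, which is the main draw of Local Models.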
💎 Improvements
- MCP: Improved error reporting when stdio servers fail to run
- MCP: Improved compatibility with server JSON schemas
- MCP: Added a Copy to Clipboard action in Manage Servers
🐞 Fixes
- Onboarding: Fixed image assets not loading in the onboarding pages
- AI: Fixed issue where community AI Extensions were not disabled when AI was disabled
- AI: Fixed remote tool calls in AI Commands
- AI: Disable default tools in AI Commands
- AI Chat/Quick AI: Web Search setting now works even if the Ask Web command is disabled
- MCP: Fix handling of quoted server arguments
- MCP: Server updates are no longer saved if the updated server fails to run
- Export: Prevent the export/import view from resizing the main window
- May 8, 2025
🧰 MCP
Raycast launches Model Context Protocol integration with MCP, letting you install local stdio servers and access tools via AI as with AI Extensions. A new Registry extension surfaces MCP servers, plus improved AI confirmations and several stability fixes.
Raycast now includes integration with the Model Context Protocol (MCP). You can install any local stdio server and then access the server’s tools from Quick AI and AI Chat by @-mentioning them, just like with AI Extensions.
Learn more about MCP in our manual and discover MCP servers with our new Registry extension.
✨ New
- MCP: New commands to install, import and manage servers
- MCP: New Registry to explore servers, available as a separate extension
💎 Improvements
- AI: Added a Confirm Always action for AI Extension / MCP Server tool confirmations. You can reset allowed tools in the AI Extension / MCP Server settings or in the general AI settings
- AI Commands: The @location tool must now be explicitly mentioned to be used from commands
🐞 Fixes
- AI: Fixed a crash happening when opening a markdown link while the message was rendered
- AI: Fixed a rare crash happening when pasting file attachments to AI Chat
- AI: Fixed an occasional crash that occurred when using the @location AI Extension
- Apr 30, 2025
📱 Raycast for iOS is here
Raycast for iOS is here, bringing desktop power to mobile with AI chat, idea capture, quicklinks, and code snippets plus seamless cross‑device sync. New AI Extensions let you chat with multiple models and @-mention apps to run actions. Stability tweaks and fixes improve the experience.
The wait is over! Raycast for iOS is finally here after being one of our most requested features. The app brings the power of your desktop experience to mobile:
- 💬 Chat with dozens of AI models for perfect answers on any task
- 📚 Capture and organize thoughts so brilliant ideas never slip away
- 🔗 Access your Quicklinks instantly for important resources on the go
- 📝 Use your code snippets and templates to maintain productivity anywhere
- ☁️ Sync seamlessly across all your Apple devices with zero friction
Wanna learn more? Watch this walkthrough with Pedro to get the most out of the app or read this blog post about how we built it.
New
- AI Extensions: You can now use our AI Extensions with a variety of different models such as GPT-4.1, Claude 3.7 Sonnet, Gemini 2.5 Flash, and more. This way, you can @-mention your apps and services in your chats to gather context or perform actions. Try it out with @finder to save your latest research as a CSV table or @calendar to timebox your priorities for next week.
- AI: Recently used models remain ranked higher across the app’s model picker
Improvements
- AI: Added icon for reasoning effort next to model name in AI Chat
Fixes
- AI: Fixed an issue where the temperature was shown in AI Chat for models that don’t support it
- AI: Fixed an issue preventing AI from being disabled caused by ad-hoc AI Commands
- AI: Fixed AI messages left count displayed in preferences window when AI is not enabled
- AI Chat: Fixed incorrect action bar button text color
- Apr 16, 2025
❇️ Try Raycast AI For Free
Raycast AI is now available to everyone with 50 free messages across AI Chat, Quick AI and AI Commands. Choose from dozens of models and attach PDFs, tabs or screenshots for context aware help. New models and tweaks arrive weekly, with a simple on/off toggle to disable AI anytime.
Raycast AI available to everyone
We’ve just made Raycast AI available to everyone. You get 50 free messages to use across AI Chat, Quick AI, and AI Commands - no subscription or account needed. Experience leading models with powerful extensions directly on your Mac!
Here’s what you can do with Raycast AI:
- Quick answers: Type a question and press Tab to get instant answers with web citations
- Text improvements: Select text in any app and use AI Commands like "Fix Spelling and Grammar" to correct typos
- Dozens of models: From OpenAI’s o3-mini to Google’s Gemini 2.0 Flash, DeepSeek’s V3, and beyond. Select and switch to the ideal model for any given task.
- Context-aware assistance: Attach PDFs, browser tabs, or screenshots to get help with what you're working on.
- Extension integration: Interact and instruct using natural language with AI Extensions - for example, you can type @focus to get in the flow on your top @linear issues
Want to learn more? Watch a walkthrough video or check out our new AI manual for all the details.
Not into AI? No problem! You can easily turn it off via Raycast settings → AI tab → big toggle on the left.
✨ New
New Models: Added new LLMs to Raycast AI throughout the last few days. Here’s a quick overview:
- OpenAI’s GPT-4.1, GPT-4.1 mini and GPT-4.1 nano
- Google Gemini 2.5 Pro
- Meta Llama 4 Scout powered by Groq
- DeepSeek v3 powered by Together
Screenshots: Added a Storage Duration preference. Setting a shorter duration will automatically move older screenshot files to the Trash once a day. The default is Unlimited.
Screenshots: Added a preference for the primary action: Copy to Clipboard or Paste to Active App.
💎 Improvements
The “Send Feedback” command now opens a separate window and accepts file uploads
🐞 Fixes
Fixed a delay showing Calculator results for operators that matched flight designators (e.g. bin 3 or hex 50).
- Apr 9, 2025
✨ Improvements and bug fixes
Raycast delivers major AI controls with a new Reasoning Level for OpenAI and Claude, plus a Favicon Provider setting and macOS 13+ support. A wave of performance boosts and bug fixes spans AI Chat, clipboard, notes, and core UX improvements.
✨ New
Reasoning Effort: You can now set the reasoning level for OpenAI’s o3-mini and Claude 3.7 Sonnet (Reasoning). Increase it when you need deep analysis or dial it down for quicker, more straightforward problems. You can change it in the settings of your AI Chat or Presets. It isn’t supported in AI Commands yet.
Favicon Provider: We added a new setting to let you pick your favicon provider or disable it altogether, allowing you to balance user experience and privacy. You can configure it via Raycast Settings → Advanced → Favicon Provider. Read more about it here.
macOS Support: Raycast now requires macOS 13 or later. If you are still running macOS 12, you should consider upgrading to ensure you continue to receive updates to Raycast.
💎 Improvements
- Clipboard History: Changed the visual information for links setting to also enable or disable favicons
- AI: Improved handling AI Extension @mentions in root search
- AI Chat: Significantly improved performance when loading large code blocks
- AI Chat: Improved scroll to bottom behaviour during completion
- AI Chat: It should now be possible to copy code blocks during completion
- AI Chat: Improved responsiveness during live resize
- AI Chat: Enabled LaTeX in user messages
- Clipboard: Passwords should now be ignored when copied using the Passwords menu bar extension. Note: Requires the Passwords app to be excluded from Clipboard History and requires Accessibility permissions for Raycast.
- Calculator: The text in the search bar won’t be selected when performing calculations and temporarily hiding and reopening the Raycast window, allowing you to easily append text to the expression.
- Raycast Notes: Added an action to duplicate notes
🐞 Fixes
- AI: LaTeX $$ delimiters are no longer recognized within code blocks
- AI: Improved handling of LaTeX \begin \end commands
- AI: Fixed an issue where the selected text would be cleared during completion
- AI Chat: Fixed layout of chat settings panel after changing model
- AI Chat: Fixed handling of images when receiving multiple responses from the LLM
- AI Chat: Fixed issues with inline code block layout
- Quit All Apps Except Frontmost: Improved handling when Finder or Raycast are frontmost apps
- Focus: Improved Firefox based browsers website blocking reliability