Runpod Release Notes
Last updated: Mar 20, 2026
- Mar 1, 2026
- Date parsed from source: Mar 1, 2026
- First seen by Releasebot: Mar 20, 2026
- Modified by Releasebot: Mar 27, 2026
March 2026
Runpod launches Flash beta, a Python SDK for running functions on cloud GPUs with simple decorators, auto-scaling workers, automatic dependency installs, and support for both batch queues and load-balanced REST APIs.
Flash beta: Run Python functions on cloud GPUs
Flash is now in public beta: a Python SDK that lets you run functions on Runpod Serverless GPUs with a single decorator.
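To make the decorator pattern concrete, here is a local, self-contained sketch. The `endpoint` decorator below is a stand-in for illustration, not the Flash API itself: in the real SDK, the decorated call would be shipped to a Runpod Serverless worker, with dependencies installed and workers scaled automatically.

```python
"""Local sketch of Flash's decorator pattern (stand-in, not the real SDK)."""
import functools

def endpoint(gpu: str = "any"):
    """Stand-in decorator. The real Flash SDK would dispatch the call to a
    remote GPU worker, install declared dependencies, and scale workers
    from 0 to N; this sketch just runs the function locally."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            # Real SDK: serialize arguments, run fn on a Serverless worker,
            # return the result. Sketch: call locally.
            return fn(*args, **kwargs)
        return inner
    return wrap

@endpoint(gpu="A100")
def embed(texts: list[str]) -> list[int]:
    # Placeholder "model": returns text lengths instead of embeddings.
    return [len(t) for t in texts]

print(embed(["hello", "runpod"]))
```

The shape is the point: ordinary Python functions stay ordinary, and one decorator marks which of them should run on remote GPU or CPU workers.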
Key features
- Remote execution: Mark functions with @Endpoint to run on GPUs/CPUs automatically.
- Auto-scaling: Workers scale from 0 to N based on demand.
- Dependency management: Packages install automatically on remote workers.
- Two patterns: Queue-based endpoints for batch work, load-balanced endpoints for REST APIs.
- Flash apps: Build production-ready APIs with flash init, flash run, and flash deploy.
February 2026
Runpod expands Public Endpoints with more models across video, image, text, and audio, adds a Vercel AI SDK integration, and publishes new guides and tutorials for coding tools, text-to-video pipelines, model caching, web apps, chatbots, Ollama, and Bazel Docker builds.
New Public Endpoints and expanded examples
New Public Endpoints
Expansion of available models across all categories.
- Video: SORA 2 and SORA 2 Pro, Kling v2.1, v2.6 Motion Control, WAN 2.6.
- Image: Seedream 4.0.
- Text: Qwen3 32B, IBM Granite 4.0.
- Audio: Chatterbox Turbo for text-to-speech.
New integrations and guides
- Vercel AI SDK integration: New @runpod/ai-sdk-provider package for TypeScript projects with streaming, text generation, and image generation support.
- AI coding tools guide: Configure OpenCode, Cursor, and Cline to use Runpod Public Endpoints as your model provider.
New tutorials
- Build a text-to-video pipeline: Chain multiple Public Endpoints to generate videos from text prompts.
- Deploy cached models: Reduce cold start times with model caching.
- Integrate Serverless with web applications: Build a complete image generation app.
- Build a chatbot with Gemma 3: Deploy vLLM with OpenAI API compatibility.
- Run Ollama on Pods: Set up Ollama for LLM inference.
- Build Docker images with Bazel: Containerize your applications.
January 2026
Runpod adds GitHub release rollback for Serverless endpoints and beta load balancing for Serverless repos, giving users more deployment flexibility and easier recovery from issues.
GitHub release rollback GA and load balancing Serverless repos in beta
GitHub release rollback: Roll back your Serverless endpoint to any previous build from the console. Restore an earlier version when you encounter issues without waiting for a new GitHub release.
Load balancing Serverless repos (beta): Load balancing endpoints are now available in the Hub. Publish or convert any listing to the load balancer type by setting "endpointType": "LB" in your hub.json file, then deploy it as a Serverless endpoint or Pod from the Hub page. Maintain a single listing for your model and let users choose their deployment method: autoscaling Serverless or dedicated Pod resources.
December 2025
Runpod adds beta Pod migration and new Serverless development guides for building, testing, and debugging endpoints.
Pod migration in beta and Serverless development guides
- Pod migration (beta): Migrate your Pod to a new machine when your stopped Pod’s GPU is occupied. Runpod provisions a new Pod with the same specifications and automatically transfers your data to an available machine.
- New Serverless development guides: We’ve added a comprehensive new set of guides for developing, testing, and debugging Serverless endpoints.
September 2025
Runpod adds Slurm Clusters GA, cached models beta, and new Public Endpoints for faster startup, production-ready HPC, and lifelike video or image creation.
Slurm Clusters GA, cached models in beta, and new Public Endpoints available
- Slurm Clusters are now generally available: Deploy production-ready HPC clusters in seconds. These clusters support multi-node performance for distributed training and large-scale simulations with pay-as-you-go billing and no idle costs.
- Cached models are now in beta: Eliminate model download times when starting workers. The system places cached models on host machines before workers start, prioritizing hosts with your model already available for instant startup.
- New Public Endpoints available: WAN 2.5 combines image and audio to create lifelike videos, while Nano Banana merges multiple images for composite creations.
August 2025
Runpod launches Hub revenue sharing and refreshes the Pods UI with a modern interface.
Hub revenue sharing launches and Pods UI gets refreshed
- Hub revenue share model: Publish to the Runpod Hub and earn credits when others deploy your repo. Earn up to 7% of compute revenue through monthly tiers with credits auto-deposited into your account.
- Pods UI updated: Refreshed modern interface for interacting with Runpod Pods.
July 2025
Runpod adds Public Endpoints and beta Slurm Clusters for easier AI model access and multi-node scheduling.
Public Endpoints arrive, Slurm Clusters in beta
Public Endpoints: Access state-of-the-art AI models through simple API calls with an integrated playground. Available endpoints include Whisper V3, Seedance 1.0 Pro, Seedream 3.0, Qwen Image Edit, Flux Kontext, Cogito 671B, and Minimax Speech.
Slurm Clusters (beta): Create on-demand multi-node clusters instantly with full Slurm scheduling support.
June 2025
Runpod adds S3-compatible storage for network volumes and revamps its referral program with clearer rewards dashboards.
S3-compatible storage and updated referral program
- S3-compatible API for network volumes: Upload and retrieve files from your network volumes without compute using AWS S3 CLI or Boto3. Integrate Runpod storage into any AI pipeline with zero-config ease and object-level control.
- Referral program revamp: Updated rewards and tiers with clearer dashboards to track performance.
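Because the storage API is S3-compatible, standard tooling such as Boto3 works against it. The sketch below shows the shape of an upload; the endpoint URL format, environment variable names, and volume ID are assumptions for illustration, so take the actual values from your Runpod console.

```python
"""Sketch: accessing a Runpod network volume via the S3-compatible API.
Endpoint URL format and credential names are illustrative assumptions."""
import os

# Hypothetical values; substitute your own from the Runpod console.
S3_ENDPOINT = os.environ.get("RUNPOD_S3_ENDPOINT", "https://s3api-eu-ro-1.runpod.io")
VOLUME_ID = os.environ.get("RUNPOD_VOLUME_ID", "my-volume-id")

def object_url(endpoint: str, volume_id: str, key: str) -> str:
    """Build the path-style URL for an object on a network volume."""
    return f"{endpoint}/{volume_id}/{key}"

def upload_file(local_path: str, key: str) -> None:
    """Upload a local file to the network volume using Boto3."""
    import boto3  # third-party; pip install boto3
    client = boto3.client(
        "s3",
        endpoint_url=S3_ENDPOINT,
        aws_access_key_id=os.environ["RUNPOD_S3_ACCESS_KEY"],
        aws_secret_access_key=os.environ["RUNPOD_S3_SECRET_KEY"],
    )
    client.upload_file(local_path, VOLUME_ID, key)

if __name__ == "__main__":
    print(object_url(S3_ENDPOINT, VOLUME_ID, "checkpoints/model.safetensors"))
```

The same client configuration works for downloads and listings, which is what makes it possible to move files on and off a network volume without attaching any compute.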
May 2025
Runpod adds port labeling, price drops, Runpod Hub, and Tetra beta for faster GPU app deployment and lower costs.
Port labeling, price drops, Runpod Hub, and Tetra beta test
- Port labeling: Name exposed ports in the UI and API to help team members identify services like Jupyter or TensorBoard.
- Price drops: Additional price reductions on popular GPU SKUs to lower training and inference costs.
- Runpod Hub: A curated catalog of one-click endpoints and templates for deploying community projects without starting from scratch.
- Tetra beta test: A Python library for running code on GPU with Runpod. Add a @remote() decorator to functions that need GPU power while the rest of your code runs locally.
April 2025
Runpod adds GitHub login, RTX 5090s, and wider global networking for faster onboarding and expanded GPU access.
GitHub login, RTX 5090s, and global networking expansion
- Login with GitHub: OAuth sign-in and linking for faster onboarding and repo-driven workflows.
- RTX 5090s on Runpod: High-performance RTX 5090 availability for cost-efficient training and inference.
- Global networking expansion: Rollout to additional data centers approaching full global coverage.
March 2025
Runpod expands its platform with GA REST API coverage, CPU Pod network storage support, Instant Clusters beta, bare metal GPU servers, SOC 2 Type I certification, and a new AP-JP-1 region for APAC users.
Enterprise features arrive, REST API goes GA, Instant Clusters in beta, and APAC expansion
- CPU Pods get network storage access: GA support for network volumes on CPU Pods for persistent, shareable storage.
- SOC 2 Type I certification: Independent attestation of security controls for enterprise readiness.
- REST API release: REST API GA with broad resource coverage for full infrastructure-as-code workflows.
- Instant Clusters: Spin up multi-node GPU clusters in minutes with private interconnect and per-second billing.
- Bare metal: Reserve dedicated GPU servers for maximum control, performance, and long-term savings.
- AP-JP-1: New Fukushima region for low-latency APAC access and in-country data residency.
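With the REST API at GA, infrastructure can be driven with plain HTTP and no SDK. The sketch below builds a pod-creation request using only the standard library; the base URL, path, and JSON field names are assumptions based on common REST conventions, not confirmed by these notes, so verify them against the official API reference.

```python
"""Sketch: building a Runpod REST API request with the standard library.
Base URL, path, and body fields are illustrative assumptions."""
import json
import urllib.request

API_BASE = "https://rest.runpod.io/v1"  # assumed base URL

def build_create_pod_request(api_key: str, name: str, gpu_type: str) -> urllib.request.Request:
    """Build (but do not send) a POST /pods request with bearer auth."""
    body = json.dumps({"name": name, "gpuTypeId": gpu_type}).encode()
    return urllib.request.Request(
        f"{API_BASE}/pods",
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

req = build_create_pod_request("YOUR_API_KEY", "train-job", "NVIDIA A100")
print(req.full_url, req.method)
# Sending it is one call: urllib.request.urlopen(req)
```

Separating request construction from sending keeps the sketch runnable offline; in practice any HTTP client (curl, requests, httpx) can drive the same endpoints.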
February 2025
Runpod enters REST API beta, ships GitHub Serverless integration, and adds a full-time community manager.
REST API enters beta with full-time community manager
- REST API beta test: RESTful endpoints for Pods, endpoints, and volumes for simpler automation than GraphQL.
- Full-time community manager hire: Dedicated programs, content, and faster community response.
- Serverless GitHub integration release: GA for GitHub-based Serverless deploys with production-ready stability.
January 2025
Runpod adds CPU Pods v2, H200 GPUs and Serverless upgrades for faster starts, larger models and simpler deployment.
New silicon and LLM-focused Serverless upgrades
- CPU Pods v2: Docker runtime parity with GPU Pods for faster starts with network volume support.
- H200s on Runpod: NVIDIA H200 GPUs available for larger models and higher memory bandwidth.
- Serverless upgrades: Higher GPU counts per worker, new quick-deploy runtimes, and simpler model selection.
November 2024
Runpod adds global networking expansion, GitHub deploys beta, scoped API keys, and passkey auth for safer, faster automation.
Global networking expands and GitHub deploys enter beta
- Global networking expansion: Added to CA-MTL-3, US-GA-1, US-GA-2, and US-KS-2 for expanded private mesh coverage.
- Serverless GitHub integration beta test: Deploy endpoints directly from GitHub repos with automatic builds.
- Scoped API keys: Least-privilege tokens with fine-grained scopes and expirations for safer automation.
- Passkey auth: Passwordless WebAuthn sign-in for phishing-resistant account access.
August 2024
Runpod adds US-GA-2 network storage and private cross-data-center networking with internal DNS for secure traffic.
Storage expansion and private cross-data-center connectivity
- US-GA-2 added to network storage: Enable network volumes in US-GA-2.
- Global networking: Private cross-data-center networking with internal DNS for secure service-to-service traffic.