- Jan 9, 2026
- Parsed from source: Jan 9, 2026
- Detected by Releasebot: Jan 13, 2026
January 2026
Databricks rolls out January 2026 platform updates with maintenance patches across supported runtimes. New features include agent-mode skills for the Databricks Assistant and GA automatic email alerts for expiring personal access tokens.
These features and Databricks platform improvements were released in January 2026.
NOTE
Releases are staged. Your Databricks account might not be updated until a week or more after the initial release date.

Databricks Runtime maintenance updates (01/09)
January 9, 2026
New maintenance updates are available for supported Databricks Runtime versions. These updates include bug fixes, security patches, and performance improvements. For details, see:
- Databricks Runtime 17.3 LTS
- Databricks Runtime 17.2
- Databricks Runtime 17.1
- Databricks Runtime 16.4 LTS
- Databricks Runtime 15.4 LTS
- Databricks Runtime 14.3 LTS
- Databricks Runtime 13.3 LTS
- Databricks Runtime 12.2 LTS
Create skills for Databricks Assistant
January 6, 2026
You can now create skills to extend Databricks Assistant in agent mode with specialized capabilities for domain-specific tasks. User skills follow the open Agent Skills standard and are automatically loaded when relevant.
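By convention, a skill under the open Agent Skills standard is a directory containing a SKILL.md file whose YAML frontmatter tells the agent when the skill is relevant. A minimal hypothetical sketch (the skill name, description, and table references below are invented for illustration; see the linked documentation for the exact fields Databricks expects):

```markdown
---
name: revenue-reporting
description: Conventions for querying the sales_gold schema and formatting monthly revenue summaries.
---

# Revenue reporting

When asked for revenue numbers, query `sales_gold.monthly_revenue`,
always filter by `fiscal_year`, and present results as a table with
amounts rounded to thousands.
```

The description in the frontmatter is what lets the agent decide to load the skill automatically when a prompt touches that domain.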
See Extend the Assistant with agent skills.

Automatic email notifications for expiring personal access tokens (GA)
January 6, 2026
Automatic email notifications for expiring personal access tokens are now generally available. For more information, see Set the maximum lifetime of new personal access tokens.
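The alert condition can be approximated client-side for your own auditing. A minimal sketch, assuming a seven-day notification window (the real schedule is defined by Databricks, and the field names here are illustrative of what the Token API reports, not its exact schema):

```python
from datetime import datetime, timedelta, timezone

# Assumed notification window -- the actual schedule is Databricks-defined.
NOTIFY_WINDOW = timedelta(days=7)

def expires_soon(creation_time, lifetime_seconds, now=None):
    """Flag a personal access token whose expiry falls inside the window.

    creation_time: aware datetime when the token was created.
    lifetime_seconds: token lifetime, or None for a non-expiring legacy token.
    """
    if lifetime_seconds is None:
        return False  # non-expiring tokens have nothing to alert on
    now = now or datetime.now(timezone.utc)
    expiry = creation_time + timedelta(seconds=lifetime_seconds)
    return now <= expiry <= now + NOTIFY_WINDOW

# Example: a 90-day token created 85 days ago expires in 5 days.
created = datetime.now(timezone.utc) - timedelta(days=85)
print(expires_soon(created, 90 * 24 * 3600))  # True
```

Pairing a check like this with the workspace token list lets admins cross-verify that the automatic emails cover every token they care about.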
- Jan 6, 2026
- Parsed from source: Jan 6, 2026
- Detected by Releasebot: Jan 13, 2026
December 2025
The December 2025 release brings the Databricks Assistant to the docs site, Agent Mode in Public Preview, single-use OAuth refresh tokens, Lakebase autoscaling, Delta Sharing enhancements, new connectors, and model upgrades, along with governance and security improvements.
Databricks Assistant on the documentation site
December 29, 2025
The Databricks Assistant is now available on the Databricks documentation site to help you get answers and discover information about Databricks. See Get documentation help from Databricks Assistant.
Databricks Assistant Agent Mode is now in Public Preview
December 23, 2025
The Databricks Assistant Agent Mode preview is now enabled by default for most customers.
- The agent can automate multiple steps. From a single prompt, it can retrieve relevant assets, generate and run code, fix errors automatically, and visualize results. It adds the ability to sample data and cell outputs to provide better results.
- The Assistant in Agent Mode chooses between Azure OpenAI and Anthropic on Databricks (using endpoints hosted by Databricks Inc. in AWS within the Databricks security perimeter), and is only available when the partner-powered AI features setting is enabled.
- Admins can disable the preview if needed until the feature reaches General Availability.
See Use the Data Science Agent, the blog post, and Partner-powered AI features.
Single-use refresh tokens for OAuth applications
December 22, 2025
You can now configure single-use refresh tokens for OAuth applications integrated with Databricks. This security feature requires token rotation after each use, enhancing protection for user-to-machine authentication flows. See Single-use refresh tokens.
Update request parameters for Delta Sharing recipient audit log events
December 19, 2025
For Delta Sharing recipients, deltaSharingProxy* audit log events now also include the catalog_name request parameter, in addition to share_name (previously named share). See Delta Sharing recipient events.
Anthropic Claude Haiku 4.5 now available as a Databricks-hosted model
December 19, 2025
Mosaic AI Model Serving now supports Anthropic Claude Haiku 4.5 as a Databricks-hosted model. You can access this model using Foundation Model APIs pay-per-token.
New Databricks accounts will not have access to legacy features
December 19, 2025
Databricks accounts created after December 18, 2025 will not have access to certain legacy features such as access to DBFS root and mounts, Hive Metastore, and No-isolation shared compute. These accounts will exclusively use Unity Catalog for unified governance and enterprise-grade security.
This behavior enforces the Disable legacy features account setting available in existing Databricks accounts. See Disable access to legacy features in new workspaces.

MySQL connector in Lakeflow Connect (Public Preview)
December 18, 2025
The fully-managed MySQL connector in Lakeflow Connect is in Public Preview. This connector enables incremental data ingestion from MySQL databases, including Amazon RDS for MySQL, Amazon Aurora MySQL, Azure Database for MySQL, Google Cloud SQL for MySQL, and MySQL on EC2. See Configure MySQL for ingestion into Databricks.
Contact your Databricks account team to request access to the preview.

Meta Ads connector (Beta)
December 18, 2025
You can now ingest data from Meta Ads. See Set up Meta Ads as a data source.
Lakebase Autoscaling metrics dashboard
December 18, 2025
Lakebase Autoscaling (Public Preview) now includes a Metrics dashboard for monitoring system and database metrics. See Metrics.
View latest scheduled notebook job results
December 18, 2025
Databricks notebooks can now show the latest scheduled notebook run directly in your notebook and notebook dashboards. You can also update the notebook with the latest run results.
For more details, see View last successful run and update notebook.

Connect to Lakebase Autoscaling from the SQL editor with read-write access
December 18, 2025
Lakebase Autoscaling (Public Preview) now supports direct connections from the SQL editor with full read-write access. See Query from SQL Editor in Lakehouse.
Context-based ingress control is now in Public Preview
December 17, 2025
Context-based ingress control is now in Public Preview. This feature enables account admins to set allow and deny rules that combine who is calling, from where they are calling, and what they can reach in Databricks. Context-based ingress control ensures that only trusted combinations of identity, request type, and network source can reach your workspace. A single policy can govern multiple workspaces, ensuring consistent enforcement across your organization.
See Context-based ingress control.

Lakebase Autoscaling ACL support
December 17, 2025
Lakebase Autoscaling (Public Preview) now supports Access Control Lists (ACLs). Grant CAN CREATE or CAN MANAGE permissions to control who can access and manage project resources. Manage permissions from project settings in the Lakebase App. See Manage project permissions.
Gemini 3 Flash now available as a Databricks-hosted model
December 17, 2025
Gemini 3 Flash is now available as a Databricks-hosted model. This model offers speed and scale without compromising quality, with advanced multimodal capabilities for complex video analysis, data extraction, and visual Q&As. For more information, see Gemini 3 Flash.
Login required to download ODBC driver
December 17, 2025
You must now log in to Databricks and accept license terms before downloading the Simba Apache Spark ODBC Driver. See Download and install the Databricks ODBC Driver (Simba).
If you use Databricks on AWS GovCloud, contact your account team to receive access to the driver.

Flexible node types are now generally available
December 17, 2025
Flexible node types allow your compute resource to fall back to alternative, compatible instance types when your specified instance type is unavailable. This behavior improves compute launch reliability by reducing capacity failures during compute launches. See Improve compute launch reliability using flexible node types.
New resource types for Databricks Apps
December 17, 2025
You can now add MLflow experiments, vector search indexes, user-defined functions (UDFs), and Unity Catalog connections as Databricks Apps resources. See Add resources to a Databricks app.
Run read-only queries on Lakebase (Provisioned) readable secondaries from SQL editor
December 15, 2025
You can now connect to Lakebase (Provisioned) readable secondaries and run read-only queries from the Databricks SQL editor. See Execute read-only queries from Databricks SQL Editor and Access a database instance from the SQL editor.
Delta Sharing to external Iceberg clients is in Public Preview
December 15, 2025
You can now share tables, materialized views, and streaming tables to external Iceberg clients such as Snowflake, Trino, Flink, and Spark. External Iceberg clients can query shared Delta tables with zero-copy access. For details, see Enable sharing to external Iceberg clients and Iceberg clients: Read shared Delta tables.
Lakebase (Autoscaling) now in Public Preview
December 12, 2025
Lakebase (Autoscaling) is now in Public Preview on AWS. This new version of Lakebase introduces autoscaling compute, scale-to-zero, database branching, instant restore, and a redesigned project-based interface. To allow users to explore the new version, usage of Lakebase Autoscaling is free for a limited time. Billing for Lakebase Autoscaling usage begins in January 2026. See Get started with Lakebase Postgres (Autoscaling Preview).
Disable legacy features settings are now GA
December 11, 2025
To help migrate accounts and workspaces to Unity Catalog, two admin settings that disable legacy features are now generally available:
- Disable legacy features: Account-level setting that disables access to DBFS, Hive Metastore, and No-isolation shared compute in new workspaces.
- Disable access to Hive metastore: Workspace-level setting that disables access to the Hive metastore used by your workspace.
OpenAI GPT-5.2 now available as a Databricks-hosted model
December 11, 2025
Mosaic AI Model Serving now supports OpenAI GPT-5.2 as a Databricks-hosted model. You can access this model using Foundation Model APIs pay-per-token.
Confluence connector (Beta)
December 16, 2025
The fully-managed Confluence connector in Lakeflow Connect enables you to ingest Confluence spaces, pages, attachments, blog posts, labels, and classification levels into Databricks. See Configure OAuth U2M for Confluence ingestion.
PostgreSQL connector in Lakeflow Connect (Public Preview)
December 16, 2025
The fully-managed PostgreSQL connector in Lakeflow Connect is in Public Preview. This connector enables incremental data ingestion from PostgreSQL databases, including Amazon RDS PostgreSQL, Amazon Aurora PostgreSQL, Azure Database for PostgreSQL, Google Cloud SQL for PostgreSQL, and on-premises PostgreSQL databases. See Configure PostgreSQL for ingestion into Databricks.
Customizable SharePoint connector (Beta)
December 10, 2025
The standard SharePoint connector offers more flexibility than the managed SharePoint connector. It allows you to ingest structured, semi-structured, and unstructured files into Delta tables with full control over schema inference, parsing options, and transformations. To get started, see Ingest files from SharePoint.
For an in-depth comparison of the SharePoint connectors, see Choose your SharePoint connector.

NetSuite connector (Public Preview)
December 10, 2025
You can now ingest data from the NetSuite2.com data source programmatically using the Databricks API, the Databricks CLI, or a Databricks notebook. See Configure NetSuite for ingestion into Databricks.
Jira connector (Beta)
December 10, 2025
The Jira connector in Lakeflow Connect enables you to ingest Jira issues, comments, and attachments metadata into Databricks. See Configure Jira for ingestion.
Microsoft Dynamics 365 connector (Public Preview)
December 10, 2025
The fully-managed Microsoft Dynamics 365 connector in Lakeflow Connect allows you to ingest data from Dynamics 365 applications like Sales, Customer Service, Finance & Operations, and more into Databricks. See Configure data source for Microsoft Dynamics 365 ingestion.
Change owner for materialized views or streaming tables defined in Databricks SQL
December 10, 2025
You can now change the owner for materialized views or streaming tables defined in Databricks SQL through Catalog Explorer. For materialized view details, see Configure materialized views in Databricks SQL. For streaming table details, see Use streaming tables in Databricks SQL.
Discover files in Auto Loader efficiently using file events
December 10, 2025
Auto Loader with file events is now GA. With this feature, Auto Loader can discover files with the efficiency of notifications while retaining the setup simplicity of directory listing. This is the recommended way to use Auto Loader (and particularly file notifications) with Unity Catalog. Learn more here.
To start using Auto Loader with file events, see the following:
- (Prerequisite) Enable file events for an external location
- File notification mode with and without file events enabled on external locations
ForEachBatch for Lakeflow Spark Declarative Pipelines is available (Public Preview)
December 9, 2025
You can now process streams in Lakeflow Spark Declarative Pipelines as a series of micro-batches in Python, using a ForEachBatch sink. The ForEachBatch sink is available in public preview.
See Use ForEachBatch to write to arbitrary data sinks in pipelines.

Databricks Runtime 18.0 and Databricks Runtime 18.0 ML are in Beta
December 9, 2025
Databricks Runtime 18.0 and Databricks Runtime 18.0 ML are now in Beta, powered by Apache Spark 4.0.0. The release includes JDK 21 as the default, new features for jobs and streaming, and library upgrades.
See Databricks Runtime 18.0 (Beta) and Databricks Runtime 18.0 for Machine Learning (Beta).

Databricks Runtime maintenance updates (12/09)
December 9, 2025
New maintenance updates are available for supported Databricks Runtime versions. These updates include bug fixes, security patches, and performance improvements. For details, see:
- Databricks Runtime 17.3 LTS
- Databricks Runtime 17.2
- Databricks Runtime 17.1
- Databricks Runtime 17.0
- Databricks Runtime 16.4 LTS
- Databricks Runtime 15.4 LTS
- Databricks Runtime 14.3 LTS
- Databricks Runtime 13.3 LTS
- Databricks Runtime 12.2 LTS
New columns in Lakeflow system tables (Public Preview)
December 9, 2025
New columns are now available in the Lakeflow system tables to provide enhanced job monitoring and troubleshooting capabilities:
- jobs table: trigger, trigger_type, run_as_user_name, creator_user_name, paused, timeout_seconds, health_rules, deployment, create_time
- job_tasks table: timeout_seconds, health_rules
- job_run_timeline table: source_task_run_id, root_task_run_id, compute, termination_type, setup_duration_seconds, queue_duration_seconds, run_duration_seconds, cleanup_duration_seconds, execution_duration_seconds
- job_task_run_timeline table: compute, termination_type, task_parameters, setup_duration_seconds, cleanup_duration_seconds, execution_duration_seconds
- pipelines table: create_time
These columns are not populated for rows emitted before early December 2025. See Jobs system table reference.

New token expiration policy for open Delta Sharing
December 8, 2025
All new Delta Sharing open sharing recipient tokens are issued with a maximum expiration of one year from the date of creation. Tokens with an expiration period longer than one year or no expiration date can no longer be created.
Existing open sharing recipient tokens issued before December 8, 2025, with expiration dates after December 8, 2026, or with no expiration date, automatically expire on December 8, 2026. If you currently use recipient tokens with long or unlimited lifetimes, review your integrations and renew tokens as needed to avoid breaking changes after this date.
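The policy reduces to a small amount of date arithmetic, which is useful for auditing which of your recipient tokens will be affected. A sketch under the rules stated above (the function and argument names are illustrative; pull the real creation and expiration times from your recipient metadata):

```python
from datetime import datetime, timedelta

POLICY_DATE = datetime(2025, 12, 8)    # new tokens capped from this date
FORCED_EXPIRY = datetime(2026, 12, 8)  # legacy long-lived tokens die here
MAX_LIFETIME = timedelta(days=365)

def effective_expiry(created, requested_expiry):
    """Effective expiration of an open-sharing recipient token.

    created: when the token was issued.
    requested_expiry: the token's configured expiry, or None for no expiry.
    """
    if created >= POLICY_DATE:
        # New tokens: capped at one year from creation.
        cap = created + MAX_LIFETIME
        return min(requested_expiry, cap) if requested_expiry else cap
    # Legacy tokens: long or unlimited lifetimes are forced to the cutoff.
    if requested_expiry is None or requested_expiry > FORCED_EXPIRY:
        return FORCED_EXPIRY
    return requested_expiry
```

Running this over your existing recipients highlights exactly which integrations need renewed tokens before December 8, 2026.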
See Create a recipient object for non-Databricks users using bearer tokens (open sharing).

Expanded regional availability for C5 and TISAX compliance
December 8, 2025
You can now use the Cloud Computing Compliance Criteria Catalogue (C5) and the Trusted Information Security Assessment Exchange (TISAX) compliance standards in all regions and with serverless compute. See Classic and serverless compute support by region.
Vector Search reranker is now generally available
December 8, 2025
The Vector Search reranker is now generally available. Reranking can help improve retrieval quality. For more information, see Use the reranker in a query.
Built-in Excel file format support (Beta)
December 2, 2025
Databricks now provides built-in support for reading Excel files. You can query Excel files directly using Spark DataFrames without external libraries. See Read Excel files.
Original source Report a problem - Jan 6, 2026
- Parsed from source:Jan 6, 2026
- Detected by Releasebot:Jan 13, 2026
December 2025
Databricks rolls out December 2025 updates with the Databricks Assistant on the docs site, Agent Mode in public preview, single use OAuth tokens, new connectors, Lakebase features, and hosted AI models like GPT-5.2. It also tweaks legacy features access and regional compliance.
December 2025 Release Notes
NOTE
Releases are staged. Your Databricks account might not be updated until a week or more after the initial release date.Databricks Assistant on the documentation site (December 29, 2025)
The Databricks Assistant is now available on the Databricks documentation site to help you get answers and discover information about Databricks. See Get documentation help from Databricks Assistant.
Databricks Assistant Agent Mode is now in Public Preview (December 23, 2025)
The Databricks Assistant Agent Mode preview is now enabled by default for most customers.
- The agent can automate multiple steps. From a single prompt, it can retrieve relevant assets, generate and run code, fix errors automatically, and visualize results. It adds the ability to sample data and cell outputs to provide better results.
- The Assistant in Agent Mode will choose between Azure OpenAI or Anthropic on Databricks (uses endpoints hosted by Databricks Inc. in AWS within the Databricks security perimeter), and is only available when the partner-powered AI features setting is enabled.
- Admins can disable the preview if needed until the feature reaches General Availability.
See Use the Data Science Agent, the blog post, and Partner-powered AI features.
Single-use refresh tokens for OAuth applications (December 22, 2025)
You can now configure single-use refresh tokens for OAuth applications integrated with Databricks. This security feature requires token rotation after each use, enhancing protection for user-to-machine authentication flows. See Single-use refresh tokens.
Update request parameters for Delta Sharing recipient audit log events (December 19, 2025)
For Delta Sharing recipients, deltaSharingProxy* audit log events now also include the catalog_name request parameter, in addition to share_name (previously named share). See Delta Sharing recipient events.
Anthropic Claude Haiku 4.5 now available as a Databricks-hosted model (December 19, 2025)
Mosaic AI Model Serving now supports Anthropic Claude Haiku 4.5 as a Databricks-hosted model. You can access this model using Foundation Model APIs pay-per-token.
New Databricks accounts will not have access to legacy features (December 19, 2025)
Databricks accounts created after December 18, 2025 will not have access to certain legacy features such as access to DBFS root and mounts, Hive Metastore, and No-isolation shared compute. These accounts will exclusively use Unity Catalog for unified governance and enterprise-grade security.
This behavior enforces the Disable legacy features account setting available in existing Databricks accounts. See Disable access to legacy features in new workspaces.MySQL connector in Lakeflow Connect (Public Preview) (December 18, 2025)
The fully-managed MySQL connector in Lakeflow Connect is in Public Preview. This connector enables incremental data ingestion from MySQL databases, including Amazon RDS for MySQL, Amazon Aurora MySQL, Azure Database for MySQL, Google Cloud SQL for MySQL, and MySQL on EC2. See Configure MySQL for ingestion into Databricks.
Contact your Databricks account team to request access to the preview.Meta Ads connector (Beta) (December 18, 2025)
You can now ingest data from Meta Ads. See Set up Meta Ads as a data source.
Lakebase Autoscaling metrics dashboard (December 18, 2025)
Lakebase Autoscaling (Public Preview) now includes a Metrics dashboard for monitoring system and database metrics. See Metrics.
View latest scheduled notebook job results (December 18, 2025)
Databricks notebooks can now show the latest scheduled notebook run directly in your notebook and notebook dashboards. You can also update the notebook with the latest run results.
For more details, see View last successful run and update notebook.Connect to Lakebase Autoscaling from the SQL editor with read-write access (December 18, 2025)
Lakebase Autoscaling (Public Preview) now supports direct connections from the SQL editor with full read-write access. See Query from SQL Editor in Lakehouse.
Context based ingress control is now in Public Preview (December 17, 2025)
Context-based ingress control is now in Public Preview. This feature enables account admins to set allow and deny rules that combine who is calling, from where they are calling, and what they can reach in Databricks. Context-based ingress control ensures that only trusted combinations of identity, request type, and network source can reach your workspace. A single policy can govern multiple workspaces, ensuring consistent enforcement across your organization.
See Context-based ingress control.Lakebase Autoscaling ACL support (December 17, 2025)
Lakebase Autoscaling (Public Preview) now supports Access Control Lists (ACLs). Grant CAN CREATE or CAN MANAGE permissions to control who can access and manage project resources. Manage permissions from project settings in the Lakebase App. See Manage project permissions.
Gemini 3 Flash now available as a Databricks-hosted model (December 17, 2025)
Gemini 3 Flash is now available as a Databricks-hosted model. This model offers speed and scale without compromising quality, with advanced multimodal capabilities for complex video analysis, data extraction, and visual Q&As. For more information, see Gemini 3 Flash.
Login required to download ODBC driver (December 17, 2025)
You must now log in to Databricks and accept license terms before downloading the Simba Apache Spark ODBC Driver. See Download and install the Databricks ODBC Driver (Simba).
If you use Databricks on AWS GovCloud, contact your account team to receive access to the driver.Flexible node types are now generally available (December 17, 2025)
Flexible node types allow your compute resource to fall back to alternative, compatible instance types when your specified instance type is unavailable. This behavior improves compute launch reliability by reducing capacity failures during compute launches. See Improve compute launch reliability using flexible node types.
New resource types for Databricks Apps (December 17, 2025)
You can now add MLflow experiments, vector search indexes, user-defined functions (UDFs), and Unity Catalog connections as Databricks Apps resources. See Add resources to a Databricks app.
Run read-only queries on Lakebase (Provisioned) readable secondaries from SQL editor (December 15, 2025)
You can now connect to Lakebase (Provisioned) readable secondaries and run read-only queries from the Databricks SQL editor. See Execute read-only queries from Databricks SQL Editor and Access a database instance from the SQL editor.
Delta Sharing to external Iceberg clients is in Public Preview (December 15, 2025)
You can now share tables, materialized views, and streaming tables to external Iceberg clients such as Snowflake, Trino, Flink, and Spark. External Iceberg clients can query shared Delta tables with zero-copy access. For details, see Enable sharing to external Iceberg clients and Iceberg clients: Read shared Delta tables.
Lakebase (Autoscaling) now in Public Preview (December 12, 2025)
Lakebase (Autoscaling) is now in Public Preview on AWS. This new version of Lakebase introduces autoscaling compute, scale-to-zero, database branching, instant restore, and a redesigned project-based interface. To allow users to explore the new version, usage of Lakebase Autoscaling is free for a limited time. Billing for Lakebase Autoscaling usage begins in January 2026. See Get started with Lakebase Postgres (Autoscaling Preview).
Disable legacy features settings are now GA (December 11, 2025)
To help migrate accounts and workspaces to Unity Catalog, two admin settings that disable legacy features are now generally available:
- Disable legacy features: Account-level setting that disables access to DBFS, Hive Metastore, and No-isolation shared compute in new workspaces.
- Disable access to Hive metastore: Workspace-level setting that disables access to the Hive metastore used by your workspace.
OpenAI GPT-5.2 now available as a Databricks-hosted model (December 11, 2025)
Mosaic AI Model Serving now supports OpenAI GPT-5.2 as a Databricks-hosted model. You can access this model using Foundation Model APIs pay-per-token.
Confluence connector (Beta) (December 16, 2025)
The fully-managed Confluence connector in Lakeflow Connect enables you to ingest Confluence spaces, pages, attachments, blogposts, labels, and classification levels into Databricks. See Configure OAuth U2M for Confluence ingestion.
PostgreSQL connector in Lakeflow Connect (Public Preview) (December 16, 2025)
The fully-managed PostgreSQL connector in Lakeflow Connect is in Public Preview. This connector enables incremental data ingestion from PostgreSQL databases, including Amazon RDS PostgreSQL, Amazon Aurora PostgreSQL, Azure Database for PostgreSQL, Google Cloud SQL for PostgreSQL, and on-premises PostgreSQL databases. See Configure PostgreSQL for ingestion into Databricks.
Customizable SharePoint connector (Beta) (December 10, 2025)
The standard SharePoint connector offers more flexibility than the managed SharePoint connector. It allows you to ingest structured, semi-structured, and unstructured files into Delta tables with full control over schema inference, parsing options, and transformations. To get started, see Ingest files from SharePoint.
For an in-depth comparison of the SharePoint connectors, see Choose your SharePoint connector.
NetSuite connector (Public Preview) (December 10, 2025)
You can now ingest data from the NetSuite2.com data source programmatically using the Databricks API, the Databricks CLI, or a Databricks notebook. See Configure NetSuite for ingestion into Databricks.
Jira connector (Beta) (December 10, 2025)
The Jira connector in Lakeflow Connect enables you to ingest Jira issues, comments, and attachments metadata into Databricks. See Configure Jira for ingestion.
Microsoft Dynamics 365 connector (Public Preview) (December 10, 2025)
The fully-managed Microsoft Dynamics 365 connector in Lakeflow Connect allows you to ingest data from Dynamics 365 applications like Sales, Customer Service, Finance & Operations, and more into Databricks. See Configure data source for Microsoft Dynamics 365 ingestion.
Change owner for materialized views or streaming tables defined in Databricks SQL (December 10, 2025)
You can now change the owner for materialized views or streaming tables defined in Databricks SQL through Catalog Explorer. For materialized view details, see Configure materialized views in Databricks SQL. For streaming table details, see Use streaming tables in Databricks SQL.
Discover files in Auto Loader efficiently using file events (December 10, 2025)
Auto Loader with file events is now GA. With this feature, Auto Loader can discover files with the efficiency of notifications while retaining the setup simplicity of directory listing. This is the recommended way to use Auto Loader (and particularly file notifications) with Unity Catalog.
To start using Auto Loader with file events, see the following:
- (Prerequisite) Enable file events for an external location
- File notification mode with and without file events enabled on external locations
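As a sketch, switching a stream to file-events discovery is a single reader option. The option name follows the Auto Loader documentation, but verify it against your Databricks Runtime version; the paths, format, and table name are placeholders:

```python
def start_file_events_stream(spark, source_path, checkpoint_path, target_table):
    """Sketch: Auto Loader stream that discovers files via file events.

    Assumes `source_path` is under a Unity Catalog external location with
    file events enabled. Option names are taken from the Auto Loader docs
    and should be confirmed for your runtime version.
    """
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.useManagedFileEvents", "true")  # file-events discovery
        .option("cloudFiles.schemaLocation", checkpoint_path)
        .load(source_path)
        .writeStream.option("checkpointLocation", checkpoint_path)
        .trigger(availableNow=True)
        .toTable(target_table)
    )
```

Everything else about the stream (schema inference, checkpointing, triggers) works as it does in directory-listing mode.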
ForEachBatch for Lakeflow Spark Declarative Pipelines is available (Public Preview) (December 9, 2025)
You can now process streams in Lakeflow Spark Declarative Pipelines as a series of micro-batches in Python, using a ForEachBatch sink. The ForEachBatch sink is available in public preview.
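A per-batch function receives each micro-batch as a DataFrame plus a batch ID. The sketch below shows only that function; how it is registered as a ForEachBatch sink in a declarative pipeline is covered in the linked docs, and the table and column names here are illustrative:

```python
def upsert_batch(batch_df, batch_id):
    """Process one micro-batch: deduplicate, then append to a Delta table.

    This is only the per-batch function a ForEachBatch sink would call;
    `event_id` and the target table name are illustrative placeholders.
    """
    (
        batch_df.dropDuplicates(["event_id"])  # dedupe within the batch
        .write.mode("append")
        .saveAsTable("main.events.clean_events")
    )
```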
See Use ForEachBatch to write to arbitrary data sinks in pipelines.
Databricks Runtime 18.0 and Databricks Runtime 18.0 ML are in Beta (December 9, 2025)
Databricks Runtime 18.0 and Databricks Runtime 18.0 ML are now in Beta, powered by Apache Spark 4.0.0. The release includes JDK 21 as the default, new features for jobs and streaming, and library upgrades.
See Databricks Runtime 18.0 (Beta) and Databricks Runtime 18.0 for Machine Learning (Beta).
Databricks Runtime maintenance updates (12/09) (December 9, 2025)
New maintenance updates are available for supported Databricks Runtime versions. These updates include bug fixes, security patches, and performance improvements. For details, see:
- Databricks Runtime 17.3 LTS
- Databricks Runtime 17.2
- Databricks Runtime 17.1
- Databricks Runtime 17.0
- Databricks Runtime 16.4 LTS
- Databricks Runtime 15.4 LTS
- Databricks Runtime 14.3 LTS
- Databricks Runtime 13.3 LTS
- Databricks Runtime 12.2 LTS
New columns in Lakeflow system tables (Public Preview) (December 9, 2025)
New columns are now available in the Lakeflow system tables to provide enhanced job monitoring and troubleshooting capabilities:
- jobs table: trigger, trigger_type, run_as_user_name, creator_user_name, paused, timeout_seconds, health_rules, deployment, create_time
- job_tasks table: timeout_seconds, health_rules
- job_run_timeline table: source_task_run_id, root_task_run_id, compute, termination_type, setup_duration_seconds, queue_duration_seconds, run_duration_seconds, cleanup_duration_seconds, execution_duration_seconds
- job_task_run_timeline table: compute, termination_type, task_parameters, setup_duration_seconds, cleanup_duration_seconds, execution_duration_seconds
- pipelines table: create_time
These columns are not populated for rows emitted before early December 2025. See Jobs system table reference.
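The new duration columns make overhead analysis straightforward. A minimal pure-Python sketch, where `rows` mimics records fetched from `system.lakeflow.job_run_timeline` (the sample values are illustrative):

```python
def queue_overhead(rows):
    """Queue time as a fraction of run time, keyed by task run ID.

    `rows` stands in for records queried from the
    system.lakeflow.job_run_timeline table; values are illustrative.
    """
    result = {}
    for row in rows:
        total = row["run_duration_seconds"]
        if total:
            result[row["source_task_run_id"]] = row["queue_duration_seconds"] / total
    return result

sample = [
    {"source_task_run_id": 101, "queue_duration_seconds": 30, "run_duration_seconds": 600},
    {"source_task_run_id": 102, "queue_duration_seconds": 0, "run_duration_seconds": 300},
]
print(queue_overhead(sample))  # {101: 0.05, 102: 0.0}
```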
New token expiration policy for open Delta Sharing (December 8, 2025)
All new Delta Sharing open sharing recipient tokens are issued with a maximum expiration of one year from the date of creation. Tokens with an expiration period longer than one year or no expiration date can no longer be created.
Existing open sharing recipient tokens issued before December 8, 2025, with expiration dates after December 8, 2026, or with no expiration date, automatically expire on December 8, 2026. If you currently use recipient tokens with long or unlimited lifetimes, review your integrations and renew tokens as needed to avoid breaking changes after this date.
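A quick way to audit which tokens the new policy will force-expire, assuming you have already retrieved each token's expiration time (for example, via the Databricks CLI or SDK):

```python
from datetime import datetime, timezone

# Tokens expiring after this date, or with no expiry, are force-expired then.
FORCED_EXPIRY = datetime(2026, 12, 8, tzinfo=timezone.utc)

def needs_rotation(expiration_time):
    """True if an open-sharing recipient token should be renewed.

    `expiration_time` is the token's expiry as a datetime, or None for
    tokens that currently never expire.
    """
    return expiration_time is None or expiration_time > FORCED_EXPIRY

print(needs_rotation(None))                                       # True
print(needs_rotation(datetime(2026, 6, 1, tzinfo=timezone.utc)))  # False
```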
See Create a recipient object for non-Databricks users using bearer tokens (open sharing).
Expanded regional availability for C5 and TISAX compliance (December 8, 2025)
You can now use the Cloud Computing Compliance Criteria Catalogue (C5) and the Trusted Information Security Assessment Exchange (TISAX) compliance standards in all regions and with serverless compute. See Classic and serverless compute support by region.
Vector Search reranker is now generally available (December 8, 2025)
The Vector Search reranker is now generally available. Reranking can help improve retrieval quality. For more information, see Use the reranker in a query.
Built-in Excel file format support (Beta) (December 2, 2025)
Databricks now provides built-in support for reading Excel files. You can query Excel files directly using Spark DataFrames without external libraries. See Read Excel files.
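As a hedged sketch, reading an Excel file would look like any other Spark data source. The format name and `sheetName` option are assumptions based on the announcement; confirm both in the Read Excel files docs:

```python
def read_excel_table(spark, path, sheet_name=None):
    """Sketch: load an Excel file into a DataFrame with the built-in reader.

    The "excel" format name and the `sheetName`/`header` options are
    assumptions; verify them against the Read Excel files documentation.
    """
    reader = spark.read.format("excel").option("header", "true")
    if sheet_name:
        reader = reader.option("sheetName", sheet_name)
    return reader.load(path)
```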
- Jan 6, 2026
- Parsed from source:Jan 6, 2026
- Detected by Releasebot:Jan 13, 2026
December 2025
Databricks rollouts December 2025 debut Databricks Assistant on the docs site, Agent Mode in Public Preview, and a wave of new and preview features from Lakebase autoscaling to new connectors and runtime updates. The release also tightens security with single-use refresh tokens and token expiration rules.
These features and Databricks platform improvements were released in December 2025.
NOTE
Releases are staged. Your Databricks account might not be updated until a week or more after the initial release date.
Databricks Assistant on the documentation site
December 29, 2025
The Databricks Assistant is now available on the Databricks documentation site to help you get answers and discover information about Databricks. See Get documentation help from Databricks Assistant.
Databricks Assistant Agent Mode is now in Public Preview
December 23, 2025
The Databricks Assistant Agent Mode preview is now enabled by default for most customers.
• The agent can automate multiple steps. From a single prompt, it can retrieve relevant assets, generate and run code, fix errors automatically, and visualize results. It adds the ability to sample data and cell outputs to provide better results.
• The Assistant in Agent Mode will choose between Azure OpenAI or Anthropic on Databricks (uses endpoints hosted by Databricks Inc. in AWS within the Databricks security perimeter), and is only available when the partner-powered AI features setting is enabled.
• Admins can disable the preview if needed until the feature reaches General Availability.
See Use the Data Science Agent, the blog post, and Partner-powered AI features.
Single-use refresh tokens for OAuth applications
December 22, 2025
You can now configure single-use refresh tokens for OAuth applications integrated with Databricks. This security feature requires token rotation after each use, enhancing protection for user-to-machine authentication flows. See Single-use refresh tokens.
Update request parameters for Delta Sharing recipient audit log events
December 19, 2025
For Delta Sharing recipients, deltaSharingProxy* audit log events now also include the catalog_name request parameter, in addition to share_name (previously named share). See Delta Sharing recipient events.
Anthropic Claude Haiku 4.5 now available as a Databricks-hosted model
December 19, 2025
Mosaic AI Model Serving now supports Anthropic Claude Haiku 4.5 as a Databricks-hosted model. You can access this model using Foundation Model APIs pay-per-token.
New Databricks accounts will not have access to legacy features
December 19, 2025
Databricks accounts created after December 18, 2025, will not have access to certain legacy features such as access to DBFS root and mounts, Hive Metastore, and No-isolation shared compute. These accounts will exclusively use Unity Catalog for unified governance and enterprise-grade security.
This behavior enforces the Disable legacy features account setting available in existing Databricks accounts. See Disable access to legacy features in new workspaces.
MySQL connector in Lakeflow Connect (Public Preview)
December 18, 2025
The fully-managed MySQL connector in Lakeflow Connect is in Public Preview. This connector enables incremental data ingestion from MySQL databases, including Amazon RDS for MySQL, Amazon Aurora MySQL, Azure Database for MySQL, Google Cloud SQL for MySQL, and MySQL on EC2. See Configure MySQL for ingestion into Databricks.
Contact your Databricks account team to request access to the preview.
Meta Ads connector (Beta)
December 18, 2025
You can now ingest data from Meta Ads. See Set up Meta Ads as a data source.
Lakebase Autoscaling metrics dashboard
December 18, 2025
Lakebase Autoscaling (Public Preview) now includes a Metrics dashboard for monitoring system and database metrics. See Metrics.
View latest scheduled notebook job results
December 18, 2025
Databricks notebooks can now show the latest scheduled notebook run directly in your notebook and notebook dashboards. You can also update the notebook with the latest run results.
For more details, see View last successful run and update notebook.
Connect to Lakebase Autoscaling from the SQL editor with read-write access
December 18, 2025
Lakebase Autoscaling (Public Preview) now supports direct connections from the SQL editor with full read-write access. See Query from SQL Editor in Lakehouse.
Context-based ingress control is now in Public Preview
December 17, 2025
Context-based ingress control is now in Public Preview. This feature enables account admins to set allow and deny rules that combine who is calling, from where they are calling, and what they can reach in Databricks. Context-based ingress control ensures that only trusted combinations of identity, request type, and network source can reach your workspace. A single policy can govern multiple workspaces, ensuring consistent enforcement across your organization.
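The who/what/where combination can be pictured as first-match rule evaluation with a default deny. This is a conceptual sketch only; the real policy model and field names are defined in the Databricks docs, and the identities and networks below are made up:

```python
def evaluate_ingress(rules, identity, request_type, source_network):
    """Conceptual sketch of context-based ingress evaluation.

    Each rule pairs an effect ("allow"/"deny") with the context values it
    matches; the first matching rule wins and anything unmatched is denied.
    This only illustrates combining identity, request type, and network
    source; it is not the actual Databricks policy format.
    """
    context = {"identity": identity, "request_type": request_type,
               "source_network": source_network}
    for effect, match in rules:
        if all(context[key] in allowed for key, allowed in match.items()):
            return effect == "allow"
    return False  # default deny

rules = [
    ("allow", {"identity": {"service-principal-etl"},
               "request_type": {"api"},
               "source_network": {"corp-vpn"}}),
    ("deny",  {"identity": {"service-principal-etl"}}),
]
print(evaluate_ingress(rules, "service-principal-etl", "api", "corp-vpn"))  # True
print(evaluate_ingress(rules, "service-principal-etl", "api", "internet"))  # False
```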
See Context-based ingress control.
Lakebase Autoscaling ACL support
December 17, 2025
Lakebase Autoscaling (Public Preview) now supports Access Control Lists (ACLs). Grant CAN CREATE or CAN MANAGE permissions to control who can access and manage project resources. Manage permissions from project settings in the Lakebase App. See Manage project permissions.
Gemini 3 Flash now available as a Databricks-hosted model
December 17, 2025
Gemini 3 Flash is now available as a Databricks-hosted model. This model offers speed and scale without compromising quality, with advanced multimodal capabilities for complex video analysis, data extraction, and visual Q&As. For more information, see Gemini 3 Flash.
Login required to download ODBC driver
December 17, 2025
You must now log in to Databricks and accept license terms before downloading the Simba Apache Spark ODBC Driver. See Download and install the Databricks ODBC Driver (Simba).
If you use Databricks on AWS GovCloud, contact your account team to receive access to the driver.
Flexible node types are now generally available
December 17, 2025
Flexible node types allow your compute resource to fall back to alternative, compatible instance types when your specified instance type is unavailable. This improves launch reliability by reducing capacity failures. See Improve compute launch reliability using flexible node types.
New resource types for Databricks Apps
December 17, 2025
You can now add MLflow experiments, vector search indexes, user-defined functions (UDFs), and Unity Catalog connections as Databricks Apps resources. See Add resources to a Databricks app.
Run read-only queries on Lakebase (Provisioned) readable secondaries from SQL editor
December 15, 2025
You can now connect to Lakebase (Provisioned) readable secondaries and run read-only queries from the Databricks SQL editor. See Execute read-only queries from Databricks SQL Editor and Access a database instance from the SQL editor.
December 15, 2025
You can now share tables, materialized views, and streaming tables to external Iceberg clients such as Snowflake, Trino, Flink, and Spark. External Iceberg clients can query shared Delta tables with zero-copy access. For details, see Enable sharing to external Iceberg clients and Iceberg clients: Read shared Delta tables.Lakebase (Autoscaling) now in Public Preview Direct link to Lakebase (Autoscaling) now in Public Preview
December 12, 2025
Lakebase (Autoscaling) is now in Public Preview on AWS. This new version of Lakebase introduces autoscaling compute, scale-to-zero, database branching, instant restore, and a redesigned project-based interface. To allow users to explore the new version, usage of Lakebase Autoscaling is free for a limited time. Billing for Lakebase Autoscaling usage begins in January 2026. See Get started with Lakebase Postgres (Autoscaling Preview).Disable legacy features settings are now GA Direct link to Disable legacy features settings are now GA
December 11, 2025
To help migrate accounts and workspaces to Unity Catalog, two admin settings that disable legacy features are now generally available:
• Disable legacy features: Account-level setting that disables access to DBFS, Hive Metastore, and No-isolation shared compute in new workspaces.
• Disable access to Hive metastore: Workspace-level setting that disables access to the Hive metastore used by your workspace.OpenAI GPT-5.2 now available as a Databricks-hosted model Direct link to OpenAI GPT-5.2 now available as a Databricks-hosted model
December 11, 2025
Mosaic AI Model Serving now supports OpenAI GPT-5.2 as a Databricks-hosted model. You can access this model using Foundation Model APIs pay-per-token.Confluence connector (Beta) Direct link to Confluence connector (Beta)
December 16, 2025
The fully-managed Confluence connector in Lakeflow Connect enables you to ingest Confluence spaces, pages, attachments, blogposts, labels, and classification levels into Databricks. See Configure OAuth U2M for Confluence ingestion.PostgreSQL connector in Lakeflow Connect (Public Preview) Direct link to postgresql-connector-in-lakeflow-connect-public-preview
December 16, 2025
The fully-managed PostgreSQL connector in Lakeflow Connect is in Public Preview. This connector enables incremental data ingestion from PostgreSQL databases, including Amazon RDS PostgreSQL, Amazon Aurora PostgreSQL, Azure Database for PostgreSQL, Google Cloud SQL for PostgreSQL, and on-premises PostgreSQL databases. See Configure PostgreSQL for ingestion into Databricks.Customizable SharePoint connector (Beta) Direct link to Customizable SharePoint connector (Beta)
December 10, 2025
The standard SharePoint connector offers more flexibility than the managed SharePoint connector. It allows you to ingest structured, semi-structured, and unstructured files into Delta tables with full control over schema inference, parsing options, and transformations. To get started, see Ingest files from SharePoint.
For an in-depth comparison of the SharePoint connectors, see Choose your SharePoint connector.NetSuite connector (Public Preview) Direct link to NetSuite connector (Public Preview)
December 10, 2025
You can now ingest data from the NetSuite2.com data source programmatically using the Databricks API, the Databricks CLI, or a Databricks notebook. See Configure NetSuite for ingestion into Databricks.Jira connector (Beta) Direct link to Jira connector (Beta)
December 10, 2025
The Jira connector in Lakeflow Connect enables you to ingest Jira issues, comments, and attachments metadata into Databricks. See Configure Jira for ingestion.Microsoft Dynamics 365 connector (Public Preview) Direct link to Microsoft Dynamics 365 connector (Public Preview)
December 10, 2025
The fully-managed Microsoft Dynamics 365 connector in Lakeflow Connect allows you to ingest data from Dynamics 365 applications like Sales, Customer Service, Finance & Operations, and more into Databricks. See Configure data source for Microsoft Dynamics 365 ingestion.Change owner for materialized views or streaming tables defined in Databricks SQL Direct link to change-owner-for-materialized-views-or-streaming-tables-defined-in-databricks-sql
December 10, 2025
You can now change the owner for materialized views or streaming tables defined in Databricks SQL through Catalog Explorer. For materialized view details, see Configure materialized views in Databricks SQL. For streaming table details, see Use streaming tables in Databricks SQL.Discover files in Auto Loader efficiently using file events Direct link to discover-files-in-auto-loader-efficiently-using-file-events
December 10, 2025
Auto Loader with file events is now GA. With this feature, Auto Loader can discover files with the efficiency of notifications while retaining the setup simplicity of directory listing. This is the recommended way to use Auto Loader (and particularly file notifications) with Unity Catalog. Learn more here.
To start using Auto Loader with file events, see the following:
• (Prerequisite) Enable file events for an external location
• File notification mode with and without file events enabled on external locationsForEachBatch for Lakeflow Spark Declarative Pipelines is available (Public Preview) Direct link to foreachbatch-for-lakeflow-spark-declarative-pipelines-is-available-public-preview
December 9, 2025
You can now process streams in Lakeflow Spark Declarative Pipelines as a series of micro-batches in Python, using a ForEachBatch sink. The ForEachBatch sink is available in public preview.
See Use ForEachBatch to write to arbitrary data sinks in pipelines.Databricks Runtime 18.0 and Databricks Runtime 18.0 ML are in Beta Direct link to databricks-runtime-180-and-databricks-runtime-180-ml-are-in-beta
December 9, 2025
Databricks Runtime 18.0 and Databricks Runtime 18.0 ML are now in Beta, powered by Apache Spark 4.0.0. The release includes JDK 21 as the default, new features for jobs and streaming, and library upgrades.
See Databricks Runtime 18.0 (Beta) and Databricks Runtime 18.0 for Machine Learning (Beta).Databricks Runtime maintenance updates (12/09) Direct link to databricks-runtime-maintenance-updates-1209
December 9, 2025
New maintenance updates are available for supported Databricks Runtime versions. These updates include bug fixes, security patches, and performance improvements. For details, see:
• Databricks Runtime 17.3 LTS
• Databricks Runtime 17.2
• Databricks Runtime 17.1
• Databricks Runtime 17.0
• Databricks Runtime 16.4 LTS
• Databricks Runtime 15.4 LTS
• Databricks Runtime 14.3 LTS
• Databricks Runtime 13.3 LTS
• Databricks Runtime 12.2 LTSNew columns in Lakeflow system tables (Public Preview) Direct link to New columns in Lakeflow system tables (Public Preview)
December 9, 2025
New columns are now available in the Lakeflow system tables to provide enhanced job monitoring and troubleshooting capabilities:
jobs table: trigger, trigger_type, run_as_user_name, creator_user_name, paused, timeout_seconds, health_rules, deployment, create_time
job_tasks table: timeout_seconds, health_rules
job_run_timeline table: source_task_run_id, root_task_run_id, compute, termination_type, setup_duration_seconds, queue_duration_seconds, run_duration_seconds, cleanup_duration_seconds, execution_duration_seconds
job_task_run_timeline table: compute, termination_type, task_parameters, setup_duration_seconds, cleanup_duration_seconds, execution_duration_seconds
pipelines table: create_time
These columns are not populated for rows emitted before early December 2025. See Jobs system table reference.New token expiration policy for open Delta Sharing Direct link to new-token-expiration-policy-for-open-delta-sharing
December 8, 2025
All new Delta Sharing open sharing recipient tokens are issued with a maximum expiration of one year from the date of creation. Tokens with an expiration period longer than one year or no expiration date can no longer be created.
Existing open sharing recipient tokens issued before December 8, 2025, with expiration dates after December 8, 2026, or with no expiration date, automatically expire on December 8, 2026. If you currently use recipient tokens with long or unlimited lifetimes, review your integrations and renew tokens as needed to avoid breaking changes after this date.
See Create a recipient object for non-Databricks users using bearer tokens (open sharing).Expanded regional availability for C5 and TISAX compliance Direct link to Expanded regional availability for C5 and TISAX compliance
December 8, 2025
You can now use the Cloud Computing Compliance Criteria Catalogue (C5) and the Trusted Information Security Assessment Exchange (TISAX) compliance standards in all regions and with serverless compute. See Classic and serverless compute support by region.Vector Search reranker is now generally available Direct link to Vector Search reranker is now generally available
December 8, 2025
The Vector Search reranker is now generally available. Reranking can help improve retrieval quality. For more information, see Use the reranker in a query.Built-in Excel file format support (Beta) Direct link to Built-in Excel file format support (Beta)
Original source Report a problem
December 2, 2025
Databricks now provides built-in support for reading Excel files. You can query Excel files directly using Spark DataFrames without external libraries. See Read Excel files. - Jan 6, 2026
- Parsed from source:Jan 6, 2026
- Detected by Releasebot:Jan 13, 2026
December 2025
Databricks unveils the December 2025 release wave with a docs FAQ assistant, Agent Mode public preview, single use OAuth tokens, expanded Delta Sharing logs, hosted Claude Haiku 4.5 and GPT-5.2, Lakebase autoscaling, new connectors, and broader platform improvements.
These features and Databricks platform improvements were released in December 2025.
NOTE
Releases are staged. Your Databricks account might not be updated until a week or more after the initial release date.December 29, 2025
Databricks Assistant on the documentation site
The Databricks Assistant is now available on the Databricks documentation site to help you get answers and discover information about Databricks. See Get documentation help from Databricks Assistant.December 23, 2025
Databricks Assistant Agent Mode is now in Public Preview
The Databricks Assistant Agent Mode preview is now enabled by default for most customers.- The agent can automate multiple steps. From a single prompt, it can retrieve relevant assets, generate and run code, fix errors automatically, and visualize results. It adds the ability to sample data and cell outputs to provide better results.
- The Assistant in Agent Mode will choose between Azure OpenAI or Anthropic on Databricks (uses endpoints hosted by Databricks Inc. in AWS within the Databricks security perimeter), and is only available when the partner-powered AI features setting is enabled.
- Admins can disable the preview if needed until the feature reaches General Availability.
See Use the Data Science Agent, the blog post, and Partner-powered AI features.
December 22, 2025
Single-use refresh tokens for OAuth applications
You can now configure single-use refresh tokens for OAuth applications integrated with Databricks. This security feature requires token rotation after each use, enhancing protection for user-to-machine authentication flows. See Single-use refresh tokens.December 19, 2025
Update request parameters for Delta Sharing recipient audit log events
For Delta Sharing recipients, deltaSharingProxy* audit log events now also include the catalog_name request parameter, in addition to share_name (previously named share). See Delta Sharing recipient events.Anthropic Claude Haiku 4.5 now available as a Databricks-hosted model
Mosaic AI Model Serving now supports Anthropic Claude Haiku 4.5 as a Databricks-hosted model. You can access this model using Foundation Model APIs pay-per-token.New Databricks accounts will not have access to legacy features
Databricks accounts created after December 18, 2025 will not have access to certain legacy features such as access to DBFS root and mounts, Hive Metastore, and No-isolation shared compute. These accounts will exclusively use Unity Catalog for unified governance and enterprise-grade security.
This behavior enforces the Disable legacy features account setting available in existing Databricks accounts. See Disable access to legacy features in new workspaces.MySQL connector in Lakeflow Connect (Public Preview)
The fully-managed MySQL connector in Lakeflow Connect is in Public Preview. This connector enables incremental data ingestion from MySQL databases, including Amazon RDS for MySQL, Amazon Aurora MySQL, Azure Database for MySQL, Google Cloud SQL for MySQL, and MySQL on EC2. See Configure MySQL for ingestion into Databricks.
Contact your Databricks account team to request access to the preview.Meta Ads connector (Beta)
You can now ingest data from Meta Ads. See Set up Meta Ads as a data source.Lakebase Autoscaling metrics dashboard
Lakebase Autoscaling (Public Preview) now includes a Metrics dashboard for monitoring system and database metrics. See Metrics.View latest scheduled notebook job results
Databricks notebooks can now show the latest scheduled notebook run directly in your notebook and notebook dashboards. You can also update the notebook with the latest run results.
For more details, see View last successful run and update notebook.Connect to Lakebase Autoscaling from the SQL editor with read-write access
Lakebase Autoscaling (Public Preview) now supports direct connections from the SQL editor with full read-write access. See Query from SQL Editor in Lakehouse.Context based ingress control is now in Public Preview
Context-based ingress control is now in Public Preview. This feature enables account admins to set allow and deny rules that combine who is calling, from where they are calling, and what they can reach in Databricks. Context-based ingress control ensures that only trusted combinations of identity, request type, and network source can reach your workspace. A single policy can govern multiple workspaces, ensuring consistent enforcement across your organization.
See Context-based ingress control.Lakebase Autoscaling ACL support
Lakebase Autoscaling (Public Preview) now supports Access Control Lists (ACLs). Grant CAN CREATE or CAN MANAGE permissions to control who can access and manage project resources. Manage permissions from project settings in the Lakebase App. See Manage project permissions.Gemini 3 Flash now available as a Databricks-hosted model
Gemini 3 Flash is now available as a Databricks-hosted model. This model offers speed and scale without compromising quality, with advanced multimodal capabilities for complex video analysis, data extraction, and visual Q&As. For more information, see Gemini 3 Flash.Login required to download ODBC driver
You must now log in to Databricks and accept license terms before downloading the Simba Apache Spark ODBC Driver. See Download and install the Databricks ODBC Driver (Simba).
If you use Databricks on AWS GovCloud, contact your account team to receive access to the driver.Flexible node types are now generally available
Flexible node types allow your compute resource to fall back to alternative, compatible instance types when your specified instance type is unavailable. This behavior improves compute launch reliability by reducing capacity failures during compute launches. See Improve compute launch reliability using flexible node types.New resource types for Databricks Apps
You can now add MLflow experiments, vector search indexes, user-defined functions (UDFs), and Unity Catalog connections as Databricks Apps resources. See Add resources to a Databricks app.Run read-only queries on Lakebase (Provisioned) readable secondaries from SQL editor
You can now connect to Lakebase (Provisioned) readable secondaries and run read-only queries from the Databricks SQL editor. See Execute read-only queries from Databricks SQL Editor and Access a database instance from the SQL editor.Delta Sharing to external Iceberg clients is in Public Preview
You can now share tables, materialized views, and streaming tables to external Iceberg clients such as Snowflake, Trino, Flink, and Spark. External Iceberg clients can query shared Delta tables with zero-copy access. For details, see Enable sharing to external Iceberg clients and Iceberg clients: Read shared Delta tables.Lakebase (Autoscaling) now in Public Preview
Lakebase (Autoscaling) is now in Public Preview on AWS. This new version of Lakebase introduces autoscaling compute, scale-to-zero, database branching, instant restore, and a redesigned project-based interface. To allow users to explore the new version, usage of Lakebase Autoscaling is free for a limited time. Billing for Lakebase Autoscaling usage begins in January 2026. See Get started with Lakebase Postgres (Autoscaling Preview).Disable legacy features settings are now GA
To help migrate accounts and workspaces to Unity Catalog, two admin settings that disable legacy features are now generally available:- Disable legacy features: Account-level setting that disables access to DBFS, Hive Metastore, and No-isolation shared compute in new workspaces.
- Disable access to Hive metastore: Workspace-level setting that disables access to the Hive metastore used by your workspace.
OpenAI GPT-5.2 now available as a Databricks-hosted model
Mosaic AI Model Serving now supports OpenAI GPT-5.2 as a Databricks-hosted model. You can access this model using Foundation Model APIs pay-per-token.Confluence connector (Beta)
The fully managed Confluence connector in Lakeflow Connect enables you to ingest Confluence spaces, pages, attachments, blog posts, labels, and classification levels into Databricks. See Configure OAuth U2M for Confluence ingestion.
PostgreSQL connector in Lakeflow Connect (Public Preview)
The fully managed PostgreSQL connector in Lakeflow Connect is in Public Preview. This connector enables incremental data ingestion from PostgreSQL databases, including Amazon RDS for PostgreSQL, Amazon Aurora PostgreSQL, Azure Database for PostgreSQL, Google Cloud SQL for PostgreSQL, and on-premises PostgreSQL databases. See Configure PostgreSQL for ingestion into Databricks.
Customizable SharePoint connector (Beta)
The standard SharePoint connector offers more flexibility than the managed SharePoint connector. It allows you to ingest structured, semi-structured, and unstructured files into Delta tables with full control over schema inference, parsing options, and transformations. To get started, see Ingest files from SharePoint.
For an in-depth comparison of the SharePoint connectors, see Choose your SharePoint connector.
NetSuite connector (Public Preview)
You can now ingest data from the NetSuite2.com data source programmatically using the Databricks API, the Databricks CLI, or a Databricks notebook. See Configure NetSuite for ingestion into Databricks.
Jira connector (Beta)
The Jira connector in Lakeflow Connect enables you to ingest Jira issues, comments, and attachment metadata into Databricks. See Configure Jira for ingestion.
Microsoft Dynamics 365 connector (Public Preview)
The fully managed Microsoft Dynamics 365 connector in Lakeflow Connect allows you to ingest data from Dynamics 365 applications like Sales, Customer Service, Finance & Operations, and more into Databricks. See Configure data source for Microsoft Dynamics 365 ingestion.
Change owner for materialized views or streaming tables defined in Databricks SQL
You can now change the owner for materialized views or streaming tables defined in Databricks SQL through Catalog Explorer. For materialized view details, see Configure materialized views in Databricks SQL. For streaming table details, see Use streaming tables in Databricks SQL.
Discover files in Auto Loader efficiently using file events
Auto Loader with file events is now GA. With this feature, Auto Loader can discover files with the efficiency of notifications while retaining the setup simplicity of directory listing. This is the recommended way to use Auto Loader (and particularly file notifications) with Unity Catalog.
To start using Auto Loader with file events, see the following:
- (Prerequisite) Enable file events for an external location
- File notification mode with and without file events enabled on external locations
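As a minimal sketch, an Auto Loader stream that discovers files through file events looks like an ordinary `cloudFiles` stream with an extra option. The exact option key for file events is an assumption here; verify it against the linked file notification mode documentation.

```python
# Sketch: Auto Loader (cloudFiles) options for file-events-based discovery.
# The file-events flag name below is an assumption to check against the docs.
autoloader_options = {
    "cloudFiles.format": "json",                 # format of the incoming files
    "cloudFiles.useManagedFileEvents": "true",   # assumed flag: discover files via file events
    "cloudFiles.schemaLocation": "/tmp/schema",  # where Auto Loader tracks inferred schema
}

# With a live Spark session on Databricks, the stream would be started as:
# df = (spark.readStream
#         .format("cloudFiles")
#         .options(**autoloader_options)
#         .load("s3://my-bucket/landing/"))  # external location with file events enabled
```

The prerequisite above still applies: the external location must have file events enabled before this option takes effect.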
ForEachBatch for Lakeflow Spark Declarative Pipelines is available (Public Preview)
December 9, 2025
You can now process streams in Lakeflow Spark Declarative Pipelines as a series of micro-batches in Python, using a ForEachBatch sink. The ForEachBatch sink is available in public preview.
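The micro-batch model above can be sketched in plain Python: a user function receives each micro-batch together with its batch ID and writes it to an arbitrary sink. The function name and the list-of-dicts stand-in for a DataFrame are illustrative, not the exact Lakeflow Spark Declarative Pipelines API.

```python
# Illustrative ForEachBatch semantics: one call per micro-batch, ordered by
# batch_id. In a real pipeline, batch_rows would be a Spark DataFrame; a list
# of dicts stands in so the sketch runs without a Spark session.
def write_to_custom_sink(batch_rows, batch_id):
    # e.g., push the micro-batch to an external system; here we just count rows
    return {"batch_id": batch_id, "rows_written": len(batch_rows)}

# Each micro-batch is processed independently:
first = write_to_custom_sink([{"id": 1}, {"id": 2}], batch_id=0)
second = write_to_custom_sink([{"id": 3}], batch_id=1)
```

In a pipeline, a function with this shape would be registered as the ForEachBatch sink; see the linked documentation for the exact registration call.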
See Use ForEachBatch to write to arbitrary data sinks in pipelines.
Databricks Runtime 18.0 and Databricks Runtime 18.0 ML are in Beta
December 9, 2025
Databricks Runtime 18.0 and Databricks Runtime 18.0 ML are now in Beta, powered by Apache Spark 4.0.0. The release includes JDK 21 as the default, new features for jobs and streaming, and library upgrades.
See Databricks Runtime 18.0 (Beta) and Databricks Runtime 18.0 for Machine Learning (Beta).
Databricks Runtime maintenance updates (12/09)
December 9, 2025
New maintenance updates are available for supported Databricks Runtime versions. These updates include bug fixes, security patches, and performance improvements. For details, see:
- Databricks Runtime 17.3 LTS
- Databricks Runtime 17.2
- Databricks Runtime 17.1
- Databricks Runtime 17.0
- Databricks Runtime 16.4 LTS
- Databricks Runtime 15.4 LTS
- Databricks Runtime 14.3 LTS
- Databricks Runtime 13.3 LTS
- Databricks Runtime 12.2 LTS
New columns in Lakeflow system tables (Public Preview)
December 9, 2025
New columns are now available in the Lakeflow system tables to provide enhanced job monitoring and troubleshooting capabilities:
jobs table: trigger, trigger_type, run_as_user_name, creator_user_name, paused, timeout_seconds, health_rules, deployment, create_time
job_tasks table: timeout_seconds, health_rules
job_run_timeline table: source_task_run_id, root_task_run_id, compute, termination_type, setup_duration_seconds, queue_duration_seconds, run_duration_seconds, cleanup_duration_seconds, execution_duration_seconds
job_task_run_timeline table: compute, termination_type, task_parameters, setup_duration_seconds, cleanup_duration_seconds, execution_duration_seconds
pipelines table: create_time
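The new columns listed above can be queried like any other system table column. As a sketch (query text only; running it requires a Databricks SQL context), using the documented `system.lakeflow.jobs` table:

```python
# Sketch: query several newly added columns from the Lakeflow jobs system
# table. Column names come from the list above.
query = """
SELECT job_id, creator_user_name, run_as_user_name, paused, create_time
FROM system.lakeflow.jobs
WHERE create_time >= DATE'2025-12-01'
"""
# On Databricks: spark.sql(query), or run the text in a SQL warehouse.
```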
These columns are not populated for rows emitted before early December 2025. See Jobs system table reference.
New token expiration policy for open Delta Sharing
December 8, 2025
All new Delta Sharing open sharing recipient tokens are issued with a maximum expiration of one year from the date of creation. Tokens with an expiration period longer than one year or no expiration date can no longer be created.
Existing open sharing recipient tokens issued before December 8, 2025, with expiration dates after December 8, 2026, or with no expiration date, automatically expire on December 8, 2026. If you currently use recipient tokens with long or unlimited lifetimes, review your integrations and renew tokens as needed to avoid breaking changes after this date.
See Create a recipient object for non-Databricks users using bearer tokens (open sharing).
Expanded regional availability for C5 and TISAX compliance
December 8, 2025
You can now use the Cloud Computing Compliance Criteria Catalogue (C5) and the Trusted Information Security Assessment Exchange (TISAX) compliance standards in all regions and with serverless compute. See Classic and serverless compute support by region.
Vector Search reranker is now generally available
December 8, 2025
The Vector Search reranker is now generally available. Reranking can help improve retrieval quality. For more information, see Use the reranker in a query.
Built-in Excel file format support (Beta)
December 2, 2025
Databricks now provides built-in support for reading Excel files. You can query Excel files directly using Spark DataFrames without external libraries. See Read Excel files.
- Jan 6, 2026
- Parsed from source:Jan 6, 2026
- Detected by Releasebot:Jan 13, 2026
December 2025
Databricks rolls out a December 2025 release spree with the Databricks Assistant on docs, Agent Mode public preview, enhanced security features, new hosted AI models, expanded Lakebase autoscaling, and a wave of connectors and previews across Lakeflow and Delta Sharing.
These features and Databricks platform improvements were released in December 2025.
NOTE
Releases are staged. Your Databricks account might not be updated until a week or more after the initial release date.
Databricks Assistant on the documentation site
December 29, 2025
The Databricks Assistant is now available on the Databricks documentation site to help you get answers and discover information about Databricks. See Get documentation help from Databricks Assistant.
Databricks Assistant Agent Mode is now in Public Preview
December 23, 2025
The Databricks Assistant Agent Mode preview is now enabled by default for most customers.
• The agent can automate multiple steps. From a single prompt, it can retrieve relevant assets, generate and run code, fix errors automatically, and visualize results. It can also sample data and cell outputs to provide better results.
• The Assistant in Agent Mode chooses between Azure OpenAI and Anthropic on Databricks (using endpoints hosted by Databricks Inc. in AWS within the Databricks security perimeter), and is only available when the partner-powered AI features setting is enabled.
• Admins can disable the preview if needed until the feature reaches General Availability.
See Use the Data Science Agent, the blog post, and Partner-powered AI features.
Single-use refresh tokens for OAuth applications
December 22, 2025
You can now configure single-use refresh tokens for OAuth applications integrated with Databricks. This security feature requires token rotation after each use, enhancing protection for user-to-machine authentication flows. See Single-use refresh tokens.
Update request parameters for Delta Sharing recipient audit log events
December 19, 2025
For Delta Sharing recipients, deltaSharingProxy* audit log events now also include the catalog_name request parameter, in addition to share_name (previously named share). See Delta Sharing recipient events.
Anthropic Claude Haiku 4.5 now available as a Databricks-hosted model
December 19, 2025
Mosaic AI Model Serving now supports Anthropic Claude Haiku 4.5 as a Databricks-hosted model. You can access this model using Foundation Model APIs pay-per-token.
New Databricks accounts will not have access to legacy features
December 19, 2025
Databricks accounts created after December 18, 2025, will not have access to certain legacy features, such as DBFS root and mounts, Hive Metastore, and No-isolation shared compute. These accounts will exclusively use Unity Catalog for unified governance and enterprise-grade security.
This behavior enforces the Disable legacy features account setting available in existing Databricks accounts. See Disable access to legacy features in new workspaces.
MySQL connector in Lakeflow Connect (Public Preview)
December 18, 2025
The fully managed MySQL connector in Lakeflow Connect is in Public Preview. This connector enables incremental data ingestion from MySQL databases, including Amazon RDS for MySQL, Amazon Aurora MySQL, Azure Database for MySQL, Google Cloud SQL for MySQL, and MySQL on EC2. See Configure MySQL for ingestion into Databricks.
Contact your Databricks account team to request access to the preview.
Meta Ads connector (Beta)
December 18, 2025
You can now ingest data from Meta Ads. See Set up Meta Ads as a data source.
Lakebase Autoscaling metrics dashboard
December 18, 2025
Lakebase Autoscaling (Public Preview) now includes a Metrics dashboard for monitoring system and database metrics. See Metrics.
View latest scheduled notebook job results
December 18, 2025
Databricks notebooks can now show the latest scheduled notebook run directly in your notebook and notebook dashboards. You can also update the notebook with the latest run results.
For more details, see View last successful run and update notebook.
Connect to Lakebase Autoscaling from the SQL editor with read-write access
December 18, 2025
Lakebase Autoscaling (Public Preview) now supports direct connections from the SQL editor with full read-write access. See Query from SQL Editor in Lakehouse.
Context-based ingress control is now in Public Preview
December 17, 2025
Context-based ingress control is now in Public Preview. This feature enables account admins to set allow and deny rules that combine who is calling, from where they are calling, and what they can reach in Databricks. Context-based ingress control ensures that only trusted combinations of identity, request type, and network source can reach your workspace. A single policy can govern multiple workspaces, ensuring consistent enforcement across your organization.
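The allow/deny model can be illustrated with a toy rule evaluator: a rule combines identity, request type, and network source, and only explicitly trusted combinations are permitted. This sketches the concept only; the rule names and values are hypothetical, not the Databricks policy syntax.

```python
# Toy model of context-based ingress evaluation. Every identifier below is
# illustrative.
ALLOW_RULES = [
    # (identity, request_type, source_network)
    ("service-principal:etl", "api", "corp-vpn"),
    ("user:analyst@example.com", "ui", "corp-vpn"),
]

def is_allowed(identity: str, request_type: str, source_network: str) -> bool:
    # Deny by default; permit only combinations explicitly listed.
    return (identity, request_type, source_network) in ALLOW_RULES
```

A single rule set like this, applied across workspaces, is what gives the consistent enforcement described above.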
See Context-based ingress control.
Lakebase Autoscaling ACL support
December 17, 2025
Lakebase Autoscaling (Public Preview) now supports Access Control Lists (ACLs). Grant CAN CREATE or CAN MANAGE permissions to control who can access and manage project resources. Manage permissions from project settings in the Lakebase App. See Manage project permissions.
Gemini 3 Flash now available as a Databricks-hosted model
December 17, 2025
Gemini 3 Flash is now available as a Databricks-hosted model. This model offers speed and scale without compromising quality, with advanced multimodal capabilities for complex video analysis, data extraction, and visual Q&A. For more information, see Gemini 3 Flash.
Login required to download ODBC driver
December 17, 2025
You must now log in to Databricks and accept license terms before downloading the Simba Apache Spark ODBC Driver. See Download and install the Databricks ODBC Driver (Simba).
If you use Databricks on AWS GovCloud, contact your account team to receive access to the driver.
Flexible node types are now generally available
December 17, 2025
Flexible node types allow your compute resource to fall back to alternative, compatible instance types when your specified instance type is unavailable. This behavior improves launch reliability by reducing capacity failures. See Improve compute launch reliability using flexible node types.
New resource types for Databricks Apps
December 17, 2025
You can now add MLflow experiments, vector search indexes, user-defined functions (UDFs), and Unity Catalog connections as Databricks Apps resources. See Add resources to a Databricks app.
Run read-only queries on Lakebase (Provisioned) readable secondaries from SQL editor
December 15, 2025
You can now connect to Lakebase (Provisioned) readable secondaries and run read-only queries from the Databricks SQL editor. See Execute read-only queries from Databricks SQL Editor and Access a database instance from the SQL editor.
Delta Sharing to external Iceberg clients is in Public Preview
December 15, 2025
You can now share tables, materialized views, and streaming tables to external Iceberg clients such as Snowflake, Trino, Flink, and Spark. External Iceberg clients can query shared Delta tables with zero-copy access. For details, see Enable sharing to external Iceberg clients and Iceberg clients: Read shared Delta tables.
Lakebase (Autoscaling) now in Public Preview
December 12, 2025
Lakebase (Autoscaling) is now in Public Preview on AWS. This new version of Lakebase introduces autoscaling compute, scale-to-zero, database branching, instant restore, and a redesigned project-based interface. To allow users to explore the new version, usage of Lakebase Autoscaling is free for a limited time. Billing for Lakebase Autoscaling usage begins in January 2026. See Get started with Lakebase Postgres (Autoscaling Preview).
Disable legacy features settings are now GA
December 11, 2025
To help migrate accounts and workspaces to Unity Catalog, two admin settings that disable legacy features are now generally available:
• Disable legacy features: Account-level setting that disables access to DBFS, Hive Metastore, and No-isolation shared compute in new workspaces.
• Disable access to Hive metastore: Workspace-level setting that disables access to the Hive metastore used by your workspace.
OpenAI GPT-5.2 now available as a Databricks-hosted model
December 11, 2025
Mosaic AI Model Serving now supports OpenAI GPT-5.2 as a Databricks-hosted model. You can access this model using Foundation Model APIs pay-per-token.
Confluence connector (Beta)
December 16, 2025
The fully managed Confluence connector in Lakeflow Connect enables you to ingest Confluence spaces, pages, attachments, blog posts, labels, and classification levels into Databricks. See Configure OAuth U2M for Confluence ingestion.
PostgreSQL connector in Lakeflow Connect (Public Preview)
December 16, 2025
The fully managed PostgreSQL connector in Lakeflow Connect is in Public Preview. This connector enables incremental data ingestion from PostgreSQL databases, including Amazon RDS for PostgreSQL, Amazon Aurora PostgreSQL, Azure Database for PostgreSQL, Google Cloud SQL for PostgreSQL, and on-premises PostgreSQL databases. See Configure PostgreSQL for ingestion into Databricks.
Customizable SharePoint connector (Beta)
December 10, 2025
The standard SharePoint connector offers more flexibility than the managed SharePoint connector. It allows you to ingest structured, semi-structured, and unstructured files into Delta tables with full control over schema inference, parsing options, and transformations. To get started, see Ingest files from SharePoint.
For an in-depth comparison of the SharePoint connectors, see Choose your SharePoint connector.
NetSuite connector (Public Preview)
December 10, 2025
You can now ingest data from the NetSuite2.com data source programmatically using the Databricks API, the Databricks CLI, or a Databricks notebook. See Configure NetSuite for ingestion into Databricks.
Jira connector (Beta)
December 10, 2025
The Jira connector in Lakeflow Connect enables you to ingest Jira issues, comments, and attachment metadata into Databricks. See Configure Jira for ingestion.
Microsoft Dynamics 365 connector (Public Preview)
December 10, 2025
The fully managed Microsoft Dynamics 365 connector in Lakeflow Connect allows you to ingest data from Dynamics 365 applications like Sales, Customer Service, Finance & Operations, and more into Databricks. See Configure data source for Microsoft Dynamics 365 ingestion.
Change owner for materialized views or streaming tables defined in Databricks SQL
December 10, 2025
You can now change the owner for materialized views or streaming tables defined in Databricks SQL through Catalog Explorer. For materialized view details, see Configure materialized views in Databricks SQL. For streaming table details, see Use streaming tables in Databricks SQL.
Discover files in Auto Loader efficiently using file events
December 10, 2025
Auto Loader with file events is now GA. With this feature, Auto Loader can discover files with the efficiency of notifications while retaining the setup simplicity of directory listing. This is the recommended way to use Auto Loader (and particularly file notifications) with Unity Catalog.
To start using Auto Loader with file events, see the following:
• (Prerequisite) Enable file events for an external location
• File notification mode with and without file events enabled on external locations
ForEachBatch for Lakeflow Spark Declarative Pipelines is available (Public Preview)
December 9, 2025
You can now process streams in Lakeflow Spark Declarative Pipelines as a series of micro-batches in Python, using a ForEachBatch sink. The ForEachBatch sink is available in public preview.
See Use ForEachBatch to write to arbitrary data sinks in pipelines.
Databricks Runtime 18.0 and Databricks Runtime 18.0 ML are in Beta
December 9, 2025
Databricks Runtime 18.0 and Databricks Runtime 18.0 ML are now in Beta, powered by Apache Spark 4.0.0. The release includes JDK 21 as the default, new features for jobs and streaming, and library upgrades.
See Databricks Runtime 18.0 (Beta) and Databricks Runtime 18.0 for Machine Learning (Beta).
Databricks Runtime maintenance updates (12/09)
December 9, 2025
New maintenance updates are available for supported Databricks Runtime versions. These updates include bug fixes, security patches, and performance improvements. For details, see:
• Databricks Runtime 17.3 LTS
• Databricks Runtime 17.2
• Databricks Runtime 17.1
• Databricks Runtime 17.0
• Databricks Runtime 16.4 LTS
• Databricks Runtime 15.4 LTS
• Databricks Runtime 14.3 LTS
• Databricks Runtime 13.3 LTS
• Databricks Runtime 12.2 LTS
New columns in Lakeflow system tables (Public Preview)
December 9, 2025
New columns are now available in the Lakeflow system tables to provide enhanced job monitoring and troubleshooting capabilities:
jobs table: trigger, trigger_type, run_as_user_name, creator_user_name, paused, timeout_seconds, health_rules, deployment, create_time
job_tasks table: timeout_seconds, health_rules
job_run_timeline table: source_task_run_id, root_task_run_id, compute, termination_type, setup_duration_seconds, queue_duration_seconds, run_duration_seconds, cleanup_duration_seconds, execution_duration_seconds
job_task_run_timeline table: compute, termination_type, task_parameters, setup_duration_seconds, cleanup_duration_seconds, execution_duration_seconds
pipelines table: create_time
These columns are not populated for rows emitted before early December 2025. See Jobs system table reference.
New token expiration policy for open Delta Sharing
December 8, 2025
All new Delta Sharing open sharing recipient tokens are issued with a maximum expiration of one year from the date of creation. Tokens with an expiration period longer than one year or no expiration date can no longer be created.
Existing open sharing recipient tokens issued before December 8, 2025, with expiration dates after December 8, 2026, or with no expiration date, automatically expire on December 8, 2026. If you currently use recipient tokens with long or unlimited lifetimes, review your integrations and renew tokens as needed to avoid breaking changes after this date.
See Create a recipient object for non-Databricks users using bearer tokens (open sharing).
Expanded regional availability for C5 and TISAX compliance
December 8, 2025
You can now use the Cloud Computing Compliance Criteria Catalogue (C5) and the Trusted Information Security Assessment Exchange (TISAX) compliance standards in all regions and with serverless compute. See Classic and serverless compute support by region.
Vector Search reranker is now generally available
December 8, 2025
The Vector Search reranker is now generally available. Reranking can help improve retrieval quality. For more information, see Use the reranker in a query.
Built-in Excel file format support (Beta)
December 2, 2025
Databricks now provides built-in support for reading Excel files. You can query Excel files directly using Spark DataFrames without external libraries. See Read Excel files.
- Parsed from source:Jan 6, 2026
- Detected by Releasebot:Jan 13, 2026
December 2025
December 2025 brings a wave of product updates led by Databricks Assistant on the docs site and Agent Mode in public preview, plus new Lakebase Autoscaling features, multiple connectors, and runtime GA/beta shifts across the platform. A clear release with new capabilities and governance improvements.
December 2025
Original source Report a problem
These features and Databricks platform improvements were released in December 2025.
NOTE
Releases are staged. Your Databricks account might not be updated until a week or more after the initial release date.
Databricks Assistant on the documentation site Direct link to Databricks Assistant on the documentation site
Databricks Assistant on the documentation site
Direct link to Databricks Assistant on the documentation site
December 29, 2025
The Databricks Assistant is now available on the Databricks documentation site to help you get answers and discover information about Databricks. See Get documentation help from Databricks Assistant.
Databricks Assistant Agent Mode is now in Public Preview Direct link to Databricks Assistant Agent Mode is now in Public Preview
Databricks Assistant Agent Mode is now in Public Preview
Direct link to Databricks Assistant Agent Mode is now in Public Preview
December 23, 2025
The Databricks Assistant Agent Mode preview is now enabled by default for most customers.
• The agent can automate multiple steps. From a single prompt, it can retrieve relevant assets, generate and run code, fix errors automatically, and visualize results. It adds the ability to sample data and cell outputs to provide better results.
• The Assistant in Agent Mode will choose between Azure OpenAI or Anthropic on Databricks (uses endpoints hosted by Databricks Inc. in AWS within the Databricks security perimeter), and is only available when the partner-powered AI features setting is enabled.
• Admins can disable the preview if needed until the feature reaches General Availability.
See Use the Data Science Agent, the blog post, and Partner-powered AI features.
Single-use refresh tokens for OAuth applications Direct link to Single-use refresh tokens for OAuth applications
Single-use refresh tokens for OAuth applications
Direct link to Single-use refresh tokens for OAuth applications
December 22, 2025
You can now configure single-use refresh tokens for OAuth applications integrated with Databricks. This security feature requires token rotation after each use, enhancing protection for user-to-machine authentication flows. See Single-use refresh tokens.
Update request parameters for Delta Sharing recipient audit log events Direct link to update-request-parameters-for-delta-sharing-recipient-audit-log-events
Update request parameters for Delta Sharing recipient audit log events
Direct link to update-request-parameters-for-delta-sharing-recipient-audit-log-events
December 19, 2025
For Delta Sharing recipients, deltaSharingProxy* audit log events now also include the catalog_name request parameter, in addition to share_name (previously named share). See Delta Sharing recipient events.
Anthropic Claude Haiku 4.5 now available as a Databricks-hosted model Direct link to Anthropic Claude Haiku 4.5 now available as a Databricks-hosted model
Anthropic Claude Haiku 4.5 now available as a Databricks-hosted model
Direct link to Anthropic Claude Haiku 4.5 now available as a Databricks-hosted model
December 19, 2025
Mosaic AI Model Serving now supports Anthropic Claude Haiku 4.5 as a Databricks-hosted model. You can access this model using Foundation Model APIs pay-per-token.
New Databricks accounts will not have access to legacy features Direct link to new-databricks-accounts-will-not-have-access-to-legacy-features
New Databricks accounts will not have access to legacy features
Direct link to new-databricks-accounts-will-not-have-access-to-legacy-features
December 19, 2025
Databricks accounts created after December 18, 2025 will not have access to certain legacy features such as access to DBFS root and mounts, Hive Metastore, and No-isolation shared compute. These accounts will exclusively use Unity Catalog for unified governance and enterprise-grade security.
This behavior enforces the Disable legacy features account setting available in existing Databricks accounts. See Disable access to legacy features in new workspaces.
MySQL connector in Lakeflow Connect (Public Preview) Direct link to mysql-connector-in-lakeflow-connect-public-preview
MySQL connector in Lakeflow Connect (Public Preview)
Direct link to mysql-connector-in-lakeflow-connect-public-preview
December 18, 2025
The fully-managed MySQL connector in Lakeflow Connect is in Public Preview. This connector enables incremental data ingestion from MySQL databases, including Amazon RDS for MySQL, Amazon Aurora MySQL, Azure Database for MySQL, Google Cloud SQL for MySQL, and MySQL on EC2. See Configure MySQL for ingestion into Databricks.
Contact your Databricks account team to request access to the preview.
Meta Ads connector (Beta) Direct link to Meta Ads connector (Beta)
Meta Ads connector (Beta)
Direct link to Meta Ads connector (Beta)
December 18, 2025
You can now ingest data from Meta Ads. See Set up Meta Ads as a data source.
Lakebase Autoscaling metrics dashboard Direct link to Lakebase Autoscaling metrics dashboard
Lakebase Autoscaling metrics dashboard
Direct link to Lakebase Autoscaling metrics dashboard
December 18, 2025
Lakebase Autoscaling (Public Preview) now includes a Metrics dashboard for monitoring system and database metrics. See Metrics.
View latest scheduled notebook job results Direct link to View latest scheduled notebook job results
View latest scheduled notebook job results
Direct link to View latest scheduled notebook job results
December 18, 2025
Databricks notebooks can now show the latest scheduled notebook run directly in your notebook and notebook dashboards. You can also update the notebook with the latest run results.
For more details, see View last successful run and update notebook.
Connect to Lakebase Autoscaling from the SQL editor with read-write access Direct link to Connect to Lakebase Autoscaling from the SQL editor with read-write access
Connect to Lakebase Autoscaling from the SQL editor with read-write access
Direct link to Connect to Lakebase Autoscaling from the SQL editor with read-write access
December 18, 2025
Lakebase Autoscaling (Public Preview) now supports direct connections from the SQL editor with full read-write access. See Query from SQL Editor in Lakehouse.
Context based ingress control is now in Public Preview Direct link to Context based ingress control is now in Public Preview
Context based ingress control is now in Public Preview
Direct link to Context based ingress control is now in Public Preview
December 17, 2025
Context-based ingress control is now in Public Preview. This feature enables account admins to set allow and deny rules that combine who is calling, from where they are calling, and what they can reach in Databricks. Context-based ingress control ensures that only trusted combinations of identity, request type, and network source can reach your workspace. A single policy can govern multiple workspaces, ensuring consistent enforcement across your organization.
See Context-based ingress control.
Lakebase Autoscaling ACL support Direct link to Lakebase Autoscaling ACL support
Lakebase Autoscaling ACL support
Direct link to Lakebase Autoscaling ACL support
December 17, 2025
Lakebase Autoscaling (Public Preview) now supports Access Control Lists (ACLs). Grant CAN CREATE or CAN MANAGE permissions to control who can access and manage project resources. Manage permissions from project settings in the Lakebase App. See Manage project permissions.
Gemini 3 Flash now available as a Databricks-hosted model Direct link to Gemini 3 Flash now available as a Databricks-hosted model
Gemini 3 Flash now available as a Databricks-hosted model
Direct link to Gemini 3 Flash now available as a Databricks-hosted model
December 17, 2025
Gemini 3 Flash is now available as a Databricks-hosted model. This model offers speed and scale without compromising quality, with advanced multimodal capabilities for complex video analysis, data extraction, and visual Q&As. For more information, see Gemini 3 Flash.
Login required to download ODBC driver Direct link to Login required to download ODBC driver
Login required to download ODBC driver
Direct link to Login required to download ODBC driver
December 17, 2025
You must now log in to Databricks and accept license terms before downloading the Simba Apache Spark ODBC Driver. See Download and install the Databricks ODBC Driver (Simba).
If you use Databricks on AWS GovCloud, contact your account team to receive access to the driver.
Flexible node types are now generally available Direct link to Flexible node types are now generally available
Flexible node types are now generally available
Direct link to Flexible node types are now generally available
December 17, 2025
Flexible node types allow your compute resource to fall back to alternative, compatible instance types when your specified instance type is unavailable. This behavior improves compute launch reliability by reducing capacity failures during compute launches. See Improve compute launch reliability using flexible node types.
New resource types for Databricks Apps Direct link to new-resource-types-for-databricks-apps
New resource types for Databricks Apps
Direct link to new-resource-types-for-databricks-apps
December 17, 2025
You can now add MLflow experiments, vector search indexes, user-defined functions (UDFs), and Unity Catalog connections as Databricks Apps resources. See Add resources to a Databricks app.
Run read-only queries on Lakebase (Provisioned) readable secondaries from SQL editor Direct link to Run read-only queries on Lakebase (Provisioned) readable secondaries from SQL editor
Run read-only queries on Lakebase (Provisioned) readable secondaries from SQL editor
Direct link to Run read-only queries on Lakebase (Provisioned) readable secondaries from SQL editor
December 15, 2025
You can now connect to Lakebase (Provisioned) readable secondaries and run read-only queries from the Databricks SQL editor. See Execute read-only queries from Databricks SQL Editor and Access a database instance from the SQL editor.
Delta Sharing to external Iceberg clients is in Public Preview
December 15, 2025
You can now share tables, materialized views, and streaming tables to external Iceberg clients such as Snowflake, Trino, Flink, and Spark. External Iceberg clients can query shared Delta tables with zero-copy access. For details, see Enable sharing to external Iceberg clients and Iceberg clients: Read shared Delta tables.
Lakebase (Autoscaling) now in Public Preview
December 12, 2025
Lakebase (Autoscaling) is now in Public Preview on AWS. This new version of Lakebase introduces autoscaling compute, scale-to-zero, database branching, instant restore, and a redesigned project-based interface. To allow users to explore the new version, usage of Lakebase Autoscaling is free for a limited time. Billing for Lakebase Autoscaling usage begins in January 2026. See Get started with Lakebase Postgres (Autoscaling Preview).
Disable legacy features settings are now GA
December 11, 2025
To help migrate accounts and workspaces to Unity Catalog, two admin settings that disable legacy features are now generally available:
• Disable legacy features: Account-level setting that disables access to DBFS, Hive Metastore, and No-isolation shared compute in new workspaces.
• Disable access to Hive metastore: Workspace-level setting that disables access to the Hive metastore used by your workspace.
OpenAI GPT-5.2 now available as a Databricks-hosted model
December 11, 2025
Mosaic AI Model Serving now supports OpenAI GPT-5.2 as a Databricks-hosted model. You can access this model using Foundation Model APIs pay-per-token.
Confluence connector (Beta)
December 16, 2025
The fully-managed Confluence connector in Lakeflow Connect enables you to ingest Confluence spaces, pages, attachments, blog posts, labels, and classification levels into Databricks. See Configure OAuth U2M for Confluence ingestion.
PostgreSQL connector in Lakeflow Connect (Public Preview)
December 16, 2025
The fully-managed PostgreSQL connector in Lakeflow Connect is in Public Preview. This connector enables incremental data ingestion from PostgreSQL databases, including Amazon RDS PostgreSQL, Amazon Aurora PostgreSQL, Azure Database for PostgreSQL, Google Cloud SQL for PostgreSQL, and on-premises PostgreSQL databases. See Configure PostgreSQL for ingestion into Databricks.
Customizable SharePoint connector (Beta)
December 10, 2025
The standard SharePoint connector offers more flexibility than the managed SharePoint connector. It allows you to ingest structured, semi-structured, and unstructured files into Delta tables with full control over schema inference, parsing options, and transformations. To get started, see Ingest files from SharePoint.
For an in-depth comparison of the SharePoint connectors, see Choose your SharePoint connector.
NetSuite connector (Public Preview)
December 10, 2025
You can now ingest data from the NetSuite2.com data source programmatically using the Databricks API, the Databricks CLI, or a Databricks notebook. See Configure NetSuite for ingestion into Databricks.
Jira connector (Beta)
December 10, 2025
The Jira connector in Lakeflow Connect enables you to ingest Jira issues, comments, and attachments metadata into Databricks. See Configure Jira for ingestion.
Microsoft Dynamics 365 connector (Public Preview)
December 10, 2025
The fully-managed Microsoft Dynamics 365 connector in Lakeflow Connect allows you to ingest data from Dynamics 365 applications like Sales, Customer Service, Finance & Operations, and more into Databricks. See Configure data source for Microsoft Dynamics 365 ingestion.
Change owner for materialized views or streaming tables defined in Databricks SQL
December 10, 2025
You can now change the owner for materialized views or streaming tables defined in Databricks SQL through Catalog Explorer. For materialized view details, see Configure materialized views in Databricks SQL. For streaming table details, see Use streaming tables in Databricks SQL.
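The announcement covers the Catalog Explorer UI, but ownership changes for Unity Catalog objects can generally also be expressed in SQL. As a sketch only, assuming the standard `SET OWNER TO` clause extends to these object types (verify the exact syntax in your workspace; the object names below are hypothetical):

```sql
-- Assumption: materialized views and streaming tables accept the same
-- ownership clause as tables and views in Unity Catalog.
ALTER MATERIALIZED VIEW main.sales.daily_revenue SET OWNER TO `data-eng-team`;
ALTER STREAMING TABLE main.sales.orders_stream SET OWNER TO `data-eng-team`;
```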
Discover files in Auto Loader efficiently using file events
December 10, 2025
Auto Loader with file events is now GA. With this feature, Auto Loader can discover files with the efficiency of notifications while retaining the setup simplicity of directory listing. This is the recommended way to use Auto Loader (and particularly file notifications) with Unity Catalog.
To start using Auto Loader with file events, see the following:
• (Prerequisite) Enable file events for an external location
• File notification mode with and without file events enabled on external locations
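Once file events are enabled on the external location, an Auto Loader ingestion can be declared as usual; a minimal sketch in Databricks SQL, where the path, table name, and the option for opting into managed file events are assumptions to check against the Auto Loader file notification docs for your runtime:

```sql
-- Sketch: stream files from an external location with file events enabled.
-- The useManagedFileEvents option name is an assumption; confirm the exact
-- spelling and where it must be set in the linked documentation.
CREATE OR REFRESH STREAMING TABLE bronze_events AS
SELECT *
FROM STREAM read_files(
  's3://my-bucket/landing/events/',
  format => 'json',
  useManagedFileEvents => true
);
```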
ForEachBatch for Lakeflow Spark Declarative Pipelines is available (Public Preview)
December 9, 2025
You can now process streams in Lakeflow Spark Declarative Pipelines as a series of micro-batches in Python, using a ForEachBatch sink. The ForEachBatch sink is available in public preview.
See Use ForEachBatch to write to arbitrary data sinks in pipelines.
Databricks Runtime 18.0 and Databricks Runtime 18.0 ML are in Beta
December 9, 2025
Databricks Runtime 18.0 and Databricks Runtime 18.0 ML are now in Beta, powered by Apache Spark 4.0.0. The release includes JDK 21 as the default, new features for jobs and streaming, and library upgrades.
See Databricks Runtime 18.0 (Beta) and Databricks Runtime 18.0 for Machine Learning (Beta).
Databricks Runtime maintenance updates (12/09)
December 9, 2025
New maintenance updates are available for supported Databricks Runtime versions. These updates include bug fixes, security patches, and performance improvements. For details, see:
• Databricks Runtime 17.3 LTS
• Databricks Runtime 17.2
• Databricks Runtime 17.1
• Databricks Runtime 17.0
• Databricks Runtime 16.4 LTS
• Databricks Runtime 15.4 LTS
• Databricks Runtime 14.3 LTS
• Databricks Runtime 13.3 LTS
• Databricks Runtime 12.2 LTS
New columns in Lakeflow system tables (Public Preview)
December 9, 2025
New columns are now available in the Lakeflow system tables to provide enhanced job monitoring and troubleshooting capabilities:
• jobs table: trigger, trigger_type, run_as_user_name, creator_user_name, paused, timeout_seconds, health_rules, deployment, create_time
• job_tasks table: timeout_seconds, health_rules
• job_run_timeline table: source_task_run_id, root_task_run_id, compute, termination_type, setup_duration_seconds, queue_duration_seconds, run_duration_seconds, cleanup_duration_seconds, execution_duration_seconds
• job_task_run_timeline table: compute, termination_type, task_parameters, setup_duration_seconds, cleanup_duration_seconds, execution_duration_seconds
• pipelines table: create_time
These columns are not populated for rows emitted before early December 2025. See Jobs system table reference.
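For example, the new duration and termination columns make it straightforward to surface the slowest recent job runs. A sketch against the `system.lakeflow.job_run_timeline` table, using only columns named in the announcement plus a time filter column assumed from the Jobs system table reference:

```sql
-- Hypothetical troubleshooting query: slowest job runs over the last week.
-- period_start_time is assumed from the Jobs system table schema; adjust
-- to your table's actual timestamp column if it differs.
SELECT
  job_id,
  run_id,
  termination_type,
  queue_duration_seconds,
  setup_duration_seconds,
  execution_duration_seconds,
  cleanup_duration_seconds
FROM system.lakeflow.job_run_timeline
WHERE period_start_time >= current_date() - INTERVAL 7 DAYS
ORDER BY execution_duration_seconds DESC
LIMIT 20;
```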
New token expiration policy for open Delta Sharing
December 8, 2025
All new Delta Sharing open sharing recipient tokens are issued with a maximum expiration of one year from the date of creation. Tokens with an expiration period longer than one year or no expiration date can no longer be created.
Existing open sharing recipient tokens issued before December 8, 2025, with expiration dates after December 8, 2026, or with no expiration date, automatically expire on December 8, 2026. If you currently use recipient tokens with long or unlimited lifetimes, review your integrations and renew tokens as needed to avoid breaking changes after this date.
See Create a recipient object for non-Databricks users using bearer tokens (open sharing).
Expanded regional availability for C5 and TISAX compliance
December 8, 2025
You can now use the Cloud Computing Compliance Criteria Catalogue (C5) and the Trusted Information Security Assessment Exchange (TISAX) compliance standards in all regions and with serverless compute. See Classic and serverless compute support by region.
Vector Search reranker is now generally available
December 8, 2025
The Vector Search reranker is now generally available. Reranking can help improve retrieval quality. For more information, see Use the reranker in a query.
Built-in Excel file format support (Beta)
December 2, 2025
Databricks now provides built-in support for reading Excel files. You can query Excel files directly using Spark DataFrames without external libraries. See Read Excel files.
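As one plausible usage sketch, assuming the new Excel support is also exposed through the `read_files` table-valued function (the format name, option surface, and file path below are assumptions; see Read Excel files for the supported interface):

```sql
-- Assumption: 'excel' is accepted as a read_files format in runtimes with
-- built-in Excel support; the Volumes path is a placeholder.
SELECT *
FROM read_files(
  '/Volumes/main/default/raw/quarterly_report.xlsx',
  format => 'excel'
);
```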