Google Ads Release Notes
Last updated: Mar 27, 2026
- Mar 26, 2026
- Date parsed from source: Mar 26, 2026
- First seen by Releasebot: Mar 27, 2026
Google Ads by Google
Upgrade your creative performance with March’s Demand Gen Drop.
Google Ads adds new Demand Gen creative tools and creator solutions, including AI-powered Veo video variations, YouTube Creator Partnerships, and optimization for follow-on views to help drive stronger results across YouTube.
Demand Gen helps you find new customers across YouTube and Google's most visual surfaces. With new creative tools and creator solutions available, you can drive better results.
Achieve creative excellence with new AI-powered features:
- Use Veo in Google Ads to generate high-quality video variations from static images. This allows you to scale your asset variety to unlock "Excellent" Ad Strength for better performance.
Scale authentic creator content to drive results:
- YouTube Creator Partnerships with Google Ads helps you discover and connect with the right creators.
- Creator partnerships boost turns authentic creator assets into standout ads for Demand Gen campaigns, driving on average a 30% increase in conversion lift on YouTube Shorts.1
Build your own organic presence on YouTube:
- A strong organic channel drives long-term equity, and pairing it with the Demand Gen YouTube Engagements goal helps you find new customers.
- You can now optimize Demand Gen campaigns for follow-on views to increase your channel watch time and capture more interest.
To learn about how Demand Gen is getting better all the time, visit Accelerate with Google.
- Mar 24, 2026
- Date parsed from source: Mar 24, 2026
- First seen by Releasebot: Mar 25, 2026
Google Ads by Google
Google’s Commerce Media Suite: Where retailer insights meet the power of YouTube
Google Ads expands commerce media capabilities with new retailer insight collaborations, including Kroger audience activation, SKU-level conversion reporting in Display & Video 360, and broader access to commerce audiences from major merchants, with upcoming support for leading Asian marketplaces.
We’re announcing new collaborations to supercharge performance with retailer insights.
Today’s consumer journey is continuous and dynamic, yet most brands struggle with disconnected retail partners, media inefficiencies and measurement gaps. The next wave of commerce media requires a unified solution that reaches shoppers at every touchpoint — and moves as fast as they do.
With Google’s Commerce Media Suite, brands and retailers can reach audiences across Search, Shopping, YouTube, Display and even CTV — in the same tools brands already know: Display & Video 360, Search Ads 360 and Google Ads.
This week, we’re bringing new collaborations to connect the commerce ecosystem and supercharge performance with retailer insights.
Connecting retailer insights with the power and scale of YouTube
We’re excited to announce that Kroger Precision Marketing is collaborating with Display & Video 360 to help advertisers reach Kroger shoppers across YouTube and third-party inventory.
Brands can now activate Kroger’s shopper audiences, built from retail purchase signals, across YouTube — where consumers already watch an average of 90 million hours of shopping videos every day.1 To close the loop, we’re also introducing SKU-level conversion reporting in Display & Video 360, so brands can see the precise impact of their YouTube and Display spend on Kroger sales. This is powered by integrations with LiveRamp and MetaRouter, with no additional setup required for brands.
At Kimberly‑Clark, measurable impact matters. Integrating Kroger’s insights into our media activation and measurement approach helps us connect media exposure to purchase behavior and clearly understand how our media drives sales.
Luke Kigel
VP, Digital Mktg & CX, Kimberly-Clark
Reaching high-intent commerce audiences, wherever they shop
Consumers shop across many merchants, and our ecosystem reflects that. Today, brands can activate commerce audiences from Best Buy Ads, Costco, Intuit, Kinective Media by United Airlines, Planet Fitness, Shipt and Western Union across partner inventory in Display & Video 360.
We’re also expanding globally. Eligible brands will soon be able to leverage commerce audiences from leading marketplaces across Asia — including Blinkit, PChome, Shopee and Swiggy — directly in Google Ads.
Driving growth for brands and retailers
With the combined power of retailer insights, Google AI and YouTube’s immersive formats, brands can unify their brand and shopper marketing while driving performance across the entire shopper journey. For retailers, this unlocks new monetization opportunities while fueling incremental sales in-store and online.
Unilever prides itself on outcome orientation so bringing Kroger data into Display & Video 360 is a massive unlock for our team. We can run our brand and retailer marketing in one place while reaching our customers across every step of their journey. SKU-level reporting is a game-changer; it moves us away from directional metrics towards precise, data-driven decisions that actually move products off the shelves.
Ryu Yokoi
Chief Media & Mktg Capability Officer, Unilever NA
Reach out to your Google account team to learn more.
- Mar 20, 2026
- Date parsed from source: Mar 20, 2026
- First seen by Releasebot: Mar 21, 2026
Google Ads by Google
We’re launching the Top Sports Podcast Lineup on YouTube to help brands align with sports fans.
Google Ads launches the Top Sports Podcast Lineup, now generally available to all U.S. advertisers. The lineup helps brands reach engaged sports podcast audiences on YouTube through popular YouTube Select podcasters like New Heights and The Rich Eisen Show.
YouTube is where all sports come to life — because no sporting event or game would be complete without the commentary, breakdowns and analysis surrounding it. In 2025 alone, there were over 8.5 billion views of sports-related podcasts on YouTube.1 This shift is driven by a new generation of fans: 56% of 14-24 year-olds watch podcasts or videos from athletes every single week.2
To help your brand reach this engaged audience, we are launching the Top Sports Podcast Lineup. Now generally available to all U.S. advertisers, this lineup offers a seamless way to align with the YouTube Select podcasters that fans love — from New Heights to The Rich Eisen Show.
When the biggest global stages and championship showdowns are wrapping up, the conversation on YouTube is just beginning. With the all-new Top Sports Podcast Lineup, your brand can stay at the heart of it all.
- Mar 19, 2026
- Date parsed from source: Mar 19, 2026
- First seen by Releasebot: Mar 19, 2026
Google Ads by Google
AI shopping gets simpler with Universal Commerce Protocol updates
Google Ads releases updates to the Universal Commerce Protocol, adding a Cart option to save or add multiple items, a Catalog capability for real-time product details, and Identity Linking to preserve loyalty benefits across platforms. It also signals simplified UCP onboarding in Merchant Center and future expansion with AI Mode and Gemini.
New capabilities to the Universal Commerce Protocol (UCP) help retailers make online shopping easier and more connected.
Ashish Gupta
VP/GM, Merchant Shopping
We built the Universal Commerce Protocol (UCP) with the industry as an open standard to help make online shopping easier for everyone. Today, we’re sharing updates on new UCP capabilities and, separately, how we are simplifying the UCP onboarding experience on Google surfaces.
New capabilities available in UCP
Since launch, we’ve worked with community contributors to add new optional capabilities to the protocol:
- UCP can help make online shopping more intuitive and convenient, thanks to a new Cart option that will let agents save or add multiple items to a shopping cart at once from a single store — just as a shopper typically would.
- UCP adopters will be able to access a new Catalog capability that lets agents retrieve select real-time product details from a retailer’s catalog where necessary — like variants, inventory and pricing.
- Building on existing standards, UCP will also support Identity Linking. That allows shoppers on UCP-integrated platforms to receive the same loyalty or member benefits they would on a retailer’s site when they’re logged in — like pricing or free shipping — making shopping more connected across the web.
As always, UCP adopters can customize the experience they provide by selecting which capabilities to support.
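The blog doesn't publish the wire format for these capabilities, but the Cart idea is easy to picture: an agent bundles several items from one store into a single request. The sketch below is purely illustrative — the field names and payload shape are hypothetical, not the published UCP schema:

```python
# Hypothetical sketch of a multi-item cart request an agent might send
# to a UCP-integrated store. Field names and structure are illustrative,
# NOT the published UCP schema.
import json


def build_cart_request(store_id, items):
    """Bundle several (product_id, quantity) pairs into one cart payload
    for a single store, the way the Cart capability described above lets
    an agent add multiple items at once."""
    return {
        "store_id": store_id,
        "action": "add_to_cart",
        "items": [
            {"product_id": pid, "quantity": qty} for pid, qty in items
        ],
    }


# An agent adds two items in one call rather than one request per item.
request = build_cart_request(
    "example-store",
    [("sku-123", 1), ("sku-456", 2)],
)
print(json.dumps(request, indent=2))
```

The point is the batching: one request per store rather than one per item, which mirrors how a shopper actually fills a cart.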
Scaling agentic commerce with Google
At Google, we’ll continue to bring relevant UCP capabilities to shopping experiences in AI Mode in Search, the Gemini app and beyond. Additionally, we are actively working to onboard more retailers of all sizes to agentic experiences on Google with a simplified UCP onboarding process in Google’s Merchant Center, rolling out over the coming months. Partners like Commerce Inc, Salesforce and Stripe will implement UCP on their platforms in the near future, with others coming soon — making online shopping and selling even better for more people and businesses.
POSTED IN:
Shopping
- Mar 5, 2026
- Date parsed from source: Mar 5, 2026
- First seen by Releasebot: Mar 6, 2026
Google Ads by Google
Ask a Techspert: How does AI understand my visual searches?
Google highlights a major leap in visual search with AI Mode and Circle to Search, enabling multi-object searches in images and simultaneous results. It explains the fan-out technique powering faster, cohesive image queries for uses from fashion to home decor.
Visual search progress
Visual search has improved leaps and bounds — look no further than recent updates to Google Search. Here, a Google expert explains this progress and what technique we’ve used to make it happen.
We’ve all been there: You see a photo of a perfectly styled living room or a well-curated street-style outfit, and you want to know where everything came from. Until recently, visual search was a one-item-at-a-time process. But a major update to Circle to Search and Lens now allows Google to break down and search for multiple objects within a single image simultaneously. This means if you use Circle to Search on Android to search for an entire outfit, you’ll see results for every component of a look, not just one piece at a time. In recent months, we’ve also launched several updates that enhance both visual search and image results in AI Mode, so you can better find inspiration as you search.
To better understand these breakthroughs, we talked to Search Senior Engineering Director Dounia Berrada.
What part of Search do you work on?
I focus on multimodal search, aka Google Lens — essentially, enabling Google to help with your most complex questions about images, PDFs and anything you see. Visual search is redefining how we interact with information; Lens should be intelligent enough to understand the "why" behind your search, making it effortless to get help with what you see on your screen, or in the world around you. That means building a tool that can just as easily explain a complex math problem as it can identify a rare succulent or help you track down a pair of shoes you love.
How does it do that?
Imagine you’re redesigning a room so you upload a photo of a mid-century modern space for inspiration. You probably aren’t just looking for the side table; you want to recreate the entire vibe. Previously, you’d have to search for the lamp, then the rug, then the chair individually. Now, AI Mode can break down that complex image, identify each individual piece and issue multiple visual searches simultaneously. You can see this in action right now using Circle to Search.
What powers these types of visual search responses?
Our advanced Gemini models make AI Mode possible, and its multimodal capabilities benefit from the visual expertise we've built into Lens over the years. When you search with an image, Gemini analyzes the image alongside your question to decide which tools to use. Let's say you're scrolling on your phone and see an outfit on social media that you love. When you search it, the model knows to use Lens to retrieve image results for the hat, shoes and jacket of the outfit simultaneously. It then weaves those individual results into one easy-to-read response.
Think of it this way: The AI model acts as the "brain" that can “see” the image, while the visual search backend acts as the "library" containing billions of web results. The AI performs multi-object reasoning to understand what you’re looking at. Then it uses a "fan-out" technique which triggers multiple searches at once, reads through the results and presents a single, cohesive response with helpful links — all in seconds.
Can you explain the fan-out technique?
AI Mode is basically doing a dozen searches for you in the time it takes to do one. If you upload a photo of a garden you admire, you might have several questions: Will these plants survive in the shade? Are they right for my climate? How much maintenance do they need?
Before, you’d ask those one by one. Now, AI Mode identifies all those necessary "fan-out" searches. This way, it gathers care requirements for every plant in the photo using helpful web results, breaks down the info and even suggests next steps you might want to take. Since AI Mode is uncovering more visual results from a single search, it's easier than ever to find just what you're looking for, and stumble upon something new that sparks your interest.
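Mechanically, a fan-out amounts to deriving several sub-queries from one request, running them concurrently, and merging the results into a single response. Here is a minimal sketch of that pattern, with a stub standing in for the real search backend:

```python
# Minimal sketch of the "fan-out" pattern described above: several
# sub-queries run in parallel and their results are merged into one
# response. The search function is a stub, not a real backend.
from concurrent.futures import ThreadPoolExecutor


def search(query):
    """Stub backend: returns canned results for one sub-query."""
    return [f"result for {query!r}"]


def fan_out(sub_queries):
    """Run every sub-query concurrently, then merge the result lists
    back into a single response keyed by sub-query."""
    with ThreadPoolExecutor() as pool:
        result_lists = pool.map(search, sub_queries)
    return dict(zip(sub_queries, result_lists))


# One garden photo might decompose into several plant-care questions.
queries = ["fern shade tolerance", "hosta climate zones", "hydrangea care"]
response = fan_out(queries)
for query, results in response.items():
    print(query, "->", results)
```

In the real system the decomposition step (deciding which sub-queries to issue) is done by the model; the sketch only shows the parallel-search-and-merge half.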
Do you have to start with an image to get this kind of help in AI Mode?
Not at all! You can start with a simple text search in AI Mode, like "visual inspo for work outfits." When you see a result you like, you can just say, "Show me more options like the second skirt." The system immediately takes that specific image and begins the fan-out process from there.
It definitely seems great for shopping — what else could you use it for?
You could take a photo of a wall at a museum and ask for explanations of each painting. Or take a photo of a bakery window and ask what all the different pastries are. It’s about moving from "What is this one thing?" to "Explain this entire scene to me."
Sounds like I’ve got some photos to take and a lot more to discover. I'm off to put these tools to the test!
- Mar 2, 2026
- Date parsed from source: Mar 2, 2026
- First seen by Releasebot: Mar 3, 2026
Google Ads by Google
VRC Non-Skip ads are now generally available, allowing brands to reach TV audiences with Google AI.
YouTube on TV gets easier with VRC Non-Skips now generally available globally in Google Ads and Display & Video 360. AI-powered optimization tailors 6s, 15s, and 30s formats for big screens, boosting reach and efficiency for CTV campaigns.
We’re making it even easier to reach the millions of viewers enjoying YouTube in the living room — including the viewers that have made YouTube the #1 streamer in the U.S. for three years running.1 VRC Non-Skips are now generally available globally in Google Ads and Display & Video 360.
Why this matters for your media mix:
- Built for the big screen: Non-skips are optimized for CTV delivery and ensure your message is delivered in its entirety.
- AI-powered optimization: Google AI dynamically optimizes between 6-second Bumpers, 15-second standard and 30-second CTV-only non-skippable ad formats, ensuring your campaign reaches the right audience at the right time.
- Drive better performance: AI-powered precision helps drive greater efficiency across multiple non-skip ad formats, delivering more unique reach and impact compared with manual mixes of single-format campaigns.
- Feb 26, 2026
- Date parsed from source: Feb 26, 2026
- First seen by Releasebot: Feb 26, 2026
Google Ads by Google
We’re expanding beta access to text guidelines for all advertisers globally in AI Max.
AI Max expands beta access to text guidelines globally with full language and vertical support. Advertisers can steer Google AI by defining terms to avoid and phrases to exclude in their own words to stay on brand. BYD saw 24% more leads at 26% lower cost, proving safer, more effective creatives.
AI-powered creatives and text guidelines expansion
AI-powered creatives are essential for staying relevant in today’s complex search landscape, but above all they must meet your brand standards. That’s why we’re expanding beta access for text guidelines to all advertisers globally across AI Max for Search and Performance Max campaigns starting today, now with full language and vertical support.
As text customization matches your creatives with intent, text guidelines ensure they remain precisely on-brand. You can now steer Google AI by defining specific terms to exclude or concepts to avoid, in your own words, with rules like “don’t imply our products are cheap” or “don’t use language like ‘only for’.” We’re exploring more ways for you to guide AI using everyday language.
Brands like BYD are already scaling creatives with these controls in AI Max. They increased leads by 24% at a 26% lower cost, and text guidelines safeguarded their brand standards.
High-quality creatives drive performance, and by pairing your unique insights with Google AI, your ads can stay meaningful across every new Search experience. Get started in AI Max today.
- Feb 25, 2026
- Date parsed from source: Feb 25, 2026
- First seen by Releasebot: Feb 26, 2026
Google Ads by Google
See the whole picture and find the look with Circle to Search
Circle to Search adds multi-object image search, letting you circle multiple items in a photo to identify each item at once and surface related products, outfits, and deeper insights. Available now on Galaxy S26 and Pixel 10, with virtual Try On.
New multi-object image search helps you find more items from one picture at the same time in Circle to Search.
Harsh Kharbanda
Director, Product Management, Search
Since we launched Circle to Search, you have circled, scribbled and highlighted your way through billions of queries per month. It’s been a game changer for questions like “What are those shoes?” or “Where is this hiking trail?” — and it’s already a powerful tool for finding more information about anything on your Android’s screen.
But we know that sometimes you aren't just looking for a single thing on your screen — you're looking for the whole thing. Like when you're redesigning a room, you don't want a single lamp, you’re trying to build an entire mid-century modern vibe in your living room. Today’s update levels up Circle to Search so you can now explore multiple objects in an image, all at once. Whether you’re curating a mood board, building an entire outfit or just satisfying your multi-layered curiosity, here’s how Circle to Search is getting a whole lot more helpful.
Get inspired by everything you see
Let’s say you're scrolling on your phone, and you see a breathtaking photo of a variety of vibrant, colorful fish. You want to explore more. Instead of wondering what's what, just circle all the fish on your screen and ask "what are all these fish, and how do they coexist?" Circle to Search will identify each unique species you've selected, from the Honeycomb Filefish to the Moon Jellyfish. Beyond just naming them and surfacing related images, it will explain the science behind their underwater community, and give you links out to the web to dive deeper.
With this update, you'll see more visual results from a single search, which creates new opportunities for merchants and businesses to be discovered.
Another popular way people use Circle to Search is for fashion; shopping-related searches are among the top uses of Circle to Search. Say you see an outfit you love on social media and you want to replicate the vibe. Now, you can search for every piece — accessories, clothing and shoes — all at once.
On your Samsung Galaxy S26 series or Pixel 10, just tap, scribble or circle an entire outfit to deconstruct the look. Circle to Search instantly identifies every component, finding similar items to jumpstart your shopping or style inspiration.
Try things on virtually, however you search
It’s also now easier to virtually try on items when inspiration strikes. In the countries where shoppers can already try on clothes from product listings across Google, now they can enter their virtual dressing room right from Circle to Search on the Samsung Galaxy S26 series or Pixel 10 devices. See an outfit on your social feed that you want to replicate? Just circle it, find the look, and select "Try On" to see it on you.
Go under the hood: How this works
This next-generation Circle to Search experience is made possible by Gemini 3's agentic planning, reasoning and tool capabilities, which also enhance our visual query fan-out technique. Instead of simply looking for a single match, the model now thinks through a multi-step plan to get you the best results for everything you search on your screen. It automatically identifies the most important parts of an image to crop, runs several searches at once, and cross-references what it finds to compile a final response — including images from across the web — for each item you’ve searched.
Check out the latest improvements to Circle to Search, starting today on the new Samsung Galaxy S26 series and the latest Pixel 10 devices, and coming to more Android devices soon.
POSTED IN:
- Search
- Shopping
- AI
- Feb 19, 2026
- Date parsed from source: Feb 19, 2026
- First seen by Releasebot: Feb 20, 2026
Google Ads by Google
New Meridian tool puts MMM insights directly in marketers' hands.
Meridian adds Scenario Planner, a no‑code interface that lets marketers and data scientists test budget scenarios and see real‑time ROI from MMM insights. It turns analytics into actionable plans, making Meridian more transparent and widely accessible.
Scenario Planner
Nearly 40% of marketers surveyed say their organizations struggle to connect Marketing Mix Model (MMM) outputs to real-world business decisions, according to a recent Harvard Business Review Analytic Services report. Since introducing Meridian, our open-source MMM, we’ve been focused on addressing this long-standing challenge by making its insights accessible.
Today we're introducing Scenario Planner to help decision makers and data scientists alike bridge the gap between analytics and planning.
Scenario Planner is a user-friendly interface that allows marketers to experiment with different budget scenarios and see real-time ROI estimates — no coding required. It transforms the conversation from a look back at what happened to a collaborative plan for what’s next, regardless of technical expertise.
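Under the hood, a scenario planner of this kind evaluates candidate budget allocations against each channel's fitted response curve and reports the projected return. The toy sketch below assumes simple diminishing-returns curves with made-up parameters; a real MMM like Meridian estimates these curves from data, and this is not Meridian's actual API:

```python
# Toy sketch of budget scenario planning on fitted response curves.
# Each channel has a diminishing-returns revenue curve; a "scenario"
# is just a budget allocation evaluated against those curves. Curve
# parameters here are invented for illustration, not fitted by an MMM.
import math


def channel_revenue(spend, scale, saturation):
    """Diminishing-returns curve: revenue = scale * ln(1 + spend/saturation)."""
    return scale * math.log1p(spend / saturation)


# Illustrative per-channel curve parameters (would come from the model).
CURVES = {
    "search": {"scale": 500.0, "saturation": 100.0},
    "video": {"scale": 800.0, "saturation": 400.0},
}


def evaluate_scenario(allocation):
    """Return (total revenue, blended ROI) for a budget allocation."""
    revenue = sum(
        channel_revenue(spend, **CURVES[channel])
        for channel, spend in allocation.items()
    )
    total_spend = sum(allocation.values())
    return revenue, revenue / total_spend


# Compare the current split against shifting budget toward video.
baseline_rev, baseline_roi = evaluate_scenario({"search": 300.0, "video": 300.0})
shifted_rev, shifted_roi = evaluate_scenario({"search": 200.0, "video": 400.0})
print("baseline ROI:", round(baseline_roi, 2))
print("shifted ROI:", round(shifted_roi, 2))
```

The no-code interface described above does the equivalent loop interactively: adjust an allocation, re-evaluate the curves, and surface the estimated ROI in real time.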
By connecting marketing teams with the Scenario Planner, we’re making it easier than ever to use measurement insights to inform business decisions. Meridian has always been transparent; now, it’s truly accessible.
- Jan 22, 2026
- Date parsed from source: Jan 22, 2026
- First seen by Releasebot: Feb 2, 2026
Google Ads by Google
See the newest product features in January’s Demand Gen Drop.
Demand Gen expands to general availability with Shoppable CTV, Attributed Branded Searches, and Travel Feeds for hotels, enabling dynamic video ads and measurable impact. The updates aim to boost conversions while lowering CPA.
Demand Gen improvements
- Demand Gen powers Shoppable CTV, enabling viewers to seamlessly browse and purchase products while watching YouTube ads on the big screen. Demand Gen campaigns that include TV screens drive an average of 7% additional conversions at the same ROI.
- Attributed Branded Searches is now available for Demand Gen, showing the volume of your campaign’s branded searches on Google/YouTube, to help quantify your impact. Reach out to your Google representative to activate.
- You can now turn browsing into booking faster with Travel Feeds in Demand Gen. Simply connect your Hotel Center feed to build dynamic video ads featuring hotel pricing, ratings and availability.
Demand Gen has been integral in helping advertisers like LG Electronics drive performance, achieving a 24% higher conversion rate than its paid social campaigns while reaching high-value customers at a 91% lower CPA.
To learn more about Demand Gen improvements, visit Accelerate with Google.