Ad Platforms Release Notes

Release notes for social media advertising and other marketing/ads APIs

Products (8)

Latest Ad Platforms Updates

  • Mar 30, 2026
    • Date parsed from source:
      Mar 30, 2026
    • First seen by Releasebot:
      Apr 3, 2026

    Instagram Platform by Meta

    March 30, 2026

    Instagram Platform adds Creator Marketplace API upgrades for faster creator discovery, richer profile data, past partnership ads media, and new filters for growth, recent activity, and device type. It also expands Partnership Ads with username filtering and ad code generation.

    Creator Marketplace

    Applies to all versions.

    The Creator Marketplace API now includes several new capabilities for discovering and evaluating creators:

    • Rate limit increase: Account-level rate limits for the Discovery API have increased from 240 to 1,000 queries per user per hour, enabling faster, higher-volume creator search.
    • Profile picture URL: A new profile_picture_url field is available when querying creator profiles.
    • Past partnership ads media: You can now query a creator's historical partnership ads media that the creator owns.
    • New filters: Three new filtering options — follower growth (top growth in last 30 days), latest activity (recently posted creators), and device type (iOS/Android audience filtering).

    See Creator Marketplace API documentation for more information.
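As a rough sketch of using the new field, a Graph API profile request might be assembled as below. Only the profile_picture_url field name comes from this note; the host, API version, and the other field names are illustrative placeholders to verify against the Creator Marketplace reference:

```python
from urllib.parse import urlencode

GRAPH_HOST = "https://graph.facebook.com/v23.0"  # placeholder version

def build_profile_request(creator_id: str, access_token: str) -> str:
    """Build a GET URL requesting the new profile_picture_url field
    alongside a couple of common profile fields."""
    params = {
        "fields": "username,followers_count,profile_picture_url",
        "access_token": access_token,
    }
    return f"{GRAPH_HOST}/{creator_id}?{urlencode(params)}"
```

The resulting URL can be fetched with any HTTP client; check the Creator Marketplace documentation for the exact endpoint path and the full list of available fields.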

    Partnership Ads

    Applies to all versions.

    • Username filtering on ad permissions: The /{business-account-id}/branded_content_ad_permissions endpoint now supports filtering by creator_username, allowing you to search and retrieve ad permissions for a specific creator. See Account-Level Permissioning for more information.
    • Ad code generation: Creators can now generate partnership ad codes through the API, enabling automated workflows for branded content authorization. See Ad Codes for more information.
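A minimal sketch of the new username filter, assuming standard Graph API query-string conventions. The endpoint path and the creator_username parameter come from the note above; the host and version are placeholders:

```python
from urllib.parse import urlencode

GRAPH_HOST = "https://graph.facebook.com/v23.0"  # placeholder version

def build_permissions_request(business_account_id: str,
                              creator_username: str,
                              access_token: str) -> str:
    """GET URL for /{business-account-id}/branded_content_ad_permissions
    filtered to a single creator via creator_username."""
    params = {
        "creator_username": creator_username,
        "access_token": access_token,
    }
    return (f"{GRAPH_HOST}/{business_account_id}"
            f"/branded_content_ad_permissions?{urlencode(params)}")
```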
  • Mar 26, 2026
    • Date parsed from source:
      Mar 26, 2026
    • First seen by Releasebot:
      Apr 10, 2026

    Google Ads by Google

    Upgrade your creative performance with March’s Demand Gen Drop.

    Google Ads shares its latest Demand Gen Drop to help maximize campaign performance.

    Learn more about our latest Demand Gen Drop and ways to maximize campaign performance in Demand Gen campaigns.


  • Mar 26, 2026
    • Date parsed from source:
      Mar 26, 2026
    • First seen by Releasebot:
      Mar 27, 2026

    Google Ads by Google

    Upgrade your creative performance with March’s Demand Gen Drop.

    Google Ads adds new Demand Gen creative tools and creator solutions, including AI-powered Veo video variations, YouTube Creator Partnerships, and optimization for follow-on views to help drive stronger results across YouTube.

    Demand Gen helps you find new customers across YouTube and Google's most visual surfaces. With new creative tools and creator solutions available, you can drive better results.

    Achieve creative excellence with new AI-powered features:

    • Use Veo in Google Ads to generate high-quality video variations from static images. This allows you to scale your asset variety to unlock "Excellent" Ad Strength for better performance.

    Scale authentic creator content to drive results:

    • YouTube Creator Partnerships with Google Ads helps you discover and connect with the right creators.
    • Creator partnerships boost turns authentic creator assets into standout ads for Demand Gen campaigns, driving on average a 30% increase in conversion lift on YouTube Shorts.1

    Build your own organic presence on YouTube:

    • A strong organic channel drives long-term equity, and when paired with the Demand Gen YouTube Engagements goal, it helps you find new customers.
    • You can now optimize Demand Gen campaigns for follow-on views to increase your channel watch time and capture more interest.

    To learn about how Demand Gen is getting better all the time, visit Accelerate with Google.

  • Mar 25, 2026
    • Date parsed from source:
      Mar 25, 2026
    • First seen by Releasebot:
      Mar 26, 2026

    Google Ads API by Google

    v23.2 (2026-03-25)

    Google Ads API adds v23.2 updates across assets, campaigns, planning, reports, and video tools, including new video enhancement data, App campaign asset insights, richer scheduling and forecast options, new metrics, and YouTube Live preview support.

    The following new features and updates were added in v23.2.

    Assets

    • Added the VideoEnhancement resource with enhancement-specific video ad information, such as whether it's Google-generated or advertiser-provided. See About video enhancements to learn more.
    • Added the AppTopCombinationView read-only resource to provide insights into top-performing asset combinations in App campaigns.
    • Added support to retrieve CustomerAsset with field_type BUSINESS_LOGO.
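A hedged GAQL sketch for reading the new resource with the google-ads Python client. The video_enhancement resource name below is inferred from the API's usual CamelCase-to-snake_case convention and should be checked against the v23.2 reference:

```python
# Resource and field names are inferred from the API's naming
# convention, not confirmed against the v23.2 reference.
VIDEO_ENHANCEMENT_QUERY = """
    SELECT video_enhancement.resource_name
    FROM video_enhancement
"""

def stream_video_enhancements(client, customer_id: str):
    """Stream query rows through GoogleAdsService.search_stream."""
    service = client.get_service("GoogleAdsService")
    for batch in service.search_stream(customer_id=customer_id,
                                       query=VIDEO_ENHANCEMENT_QUERY):
        for row in batch.results:
            yield row
```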

    Campaigns

    • Added AdGroupAd.start_date_time and AdGroupAd.end_date_time fields to provide more granular scheduling constraints over the campaign's dates. This is only supported for some ad group types.
    • Added HotelSettingInfo.disable_hotel_setting to allow disabling the hotel feed in Demand Gen campaigns.

    General

    • Added two new error codes to CustomerClientLinkError: MAX_CUSTOMER_LIMIT_REACHED and ACCOUNT_CREATION_POLICY_VIOLATION.
    • An error is now thrown when attempting to use the sunsetted LOYALTY_SIGN_UPS user list customer type category.

    Planning

    • Added support for custom AND/OR combinations of entities, topics, and audiences in GenerateTrendingInsights and GenerateCreatorInsights.
    • Added new targetable age ranges, such as AGE_RANGE_21_44 or AGE_RANGE_21_49, in ReachPlanService.GenerateReachForecast.
    • Added youtube_select_lineup_targeting to ReachPlanService.ListPlannableProducts, which will replace youtube_select_lineups. Both fields are currently populated.
    • Added IN_STREAM_NON_SKIPPABLE_THIRTY_SECONDS as a surface option in ReachPlanSurface.
    • Added clicks to Forecast for Demand Gen Max Clicks (CPC) in ReachPlanService.GenerateReachForecast.
    • Added partnership_opportunities to ContentCreatorInsightsService.GenerateCreatorInsights and ContentCreatorInsightsService.GenerateTrendingInsights.

    Reports

    • Added the biddable_indirect_install_first_in_app_conversion_micros metric to Campaign, Customer, and AdGroup resources.

    Videos

    • Extended ShareablePreviewService to support YouTube Live previews by setting preview_type to YOUTUBE_LIVE_PREVIEW. Added UNSUPPORTED_AD_TYPE and TOO_MANY_RESOURCES_IN_REQUEST to ShareablePreviewError. This is only supported for some ad types.
  • Mar 24, 2026
    • Date parsed from source:
      Mar 24, 2026
    • First seen by Releasebot:
      Mar 25, 2026

    Google Ads by Google

    Google’s Commerce Media Suite: Where retailer insights meet the power of YouTube

    Google Ads expands commerce media capabilities with new retailer insight collaborations, including Kroger audience activation, SKU-level conversion reporting in Display & Video 360, and broader access to commerce audiences from major merchants, with upcoming support for leading Asian marketplaces.

    We’re announcing new collaborations to supercharge performance with retailer insights.

    Today’s consumer journey is continuous and dynamic, yet most brands struggle with disconnected retail partners, media inefficiencies and measurement gaps. The next wave of commerce media requires a unified solution that reaches shoppers at every touchpoint — and moves as fast as they do.

    With Google’s Commerce Media Suite, brands and retailers can reach audiences across Search, Shopping, YouTube, Display and even CTV — in the same tools brands already know: Display & Video 360, Search Ads 360 and Google Ads.

    This week, we’re bringing new collaborations to connect the commerce ecosystem and supercharge performance with retailer insights.

    Connecting retailer insights with the power and scale of YouTube

    We’re excited to announce that Kroger Precision Marketing is collaborating with Display & Video 360 to help advertisers reach Kroger shoppers across YouTube and third-party inventory.

    Brands can now activate Kroger’s shopper audiences, built from retail purchase signals, across YouTube — where consumers already watch an average of 90 million hours of shopping videos every day1. To close the loop, we’re also introducing SKU-level conversion reporting in Display & Video 360, so brands can see the precise impact of their YouTube and Display spend on Kroger sales. This is powered by integrations with LiveRamp and MetaRouter, with no additional setup required for brands.

    At Kimberly‑Clark, measurable impact matters. Integrating Kroger’s insights into our media activation and measurement approach helps us connect media exposure to purchase behavior and clearly understand how our media drives sales.

    Luke Kigel
    VP, Digital Mktg & CX, Kimberly-Clark

    Reaching high-intent commerce audiences, wherever they shop

    Consumers shop across many merchants, and our ecosystem reflects that. Today, brands can activate commerce audiences from Best Buy Ads, Costco, Intuit, Kinective Media by United Airlines, Planet Fitness, Shipt and Western Union across partner inventory in Display & Video 360.

    We’re also expanding globally. Eligible brands will soon be able to leverage commerce audiences from leading marketplaces across Asia — including Blinkit, PChome, Shopee and Swiggy — directly in Google Ads.

    Driving growth for brands and retailers

    With the combined power of retailer insights, Google AI and YouTube’s immersive formats, brands can unify their brand and shopper marketing while driving performance across the entire shopper journey. For retailers, this unlocks new monetization opportunities while fueling incremental sales in-store and online.

    Unilever prides itself on outcome orientation so bringing Kroger data into Display & Video 360 is a massive unlock for our team. We can run our brand and retailer marketing in one place while reaching our customers across every step of their journey. SKU-level reporting is a game-changer; it moves us away from directional metrics towards precise, data-driven decisions that actually move products off the shelves.

    Ryu Yokoi
    Chief Media & Mktg Capability Officer, Unilever NA

    Reach out to your Google account team to learn more.

  • Mar 20, 2026
    • Date parsed from source:
      Mar 20, 2026
    • First seen by Releasebot:
      Mar 21, 2026

    Google Ads by Google

    We’re launching the Top Sports Podcasts on YouTube for brands to align with sports fans.

    Google Ads launches the Top Sports Podcast Lineup, now generally available to all U.S. advertisers. The lineup helps brands reach engaged sports podcast audiences on YouTube through popular YouTube Select podcasters like New Heights and The Rich Eisen Show.

    YouTube is where all sports come to life — because no sporting event or game would be complete without the commentary, breakdowns and analysis surrounding it. In 2025 alone, there were over 8.5 billion views of sports-related podcasts on YouTube.1 This shift is driven by a new generation of fans: 56% of 14-24 year-olds watch podcasts or videos from athletes every single week.2

    To help your brand reach this engaged audience, we are launching the Top Sports Podcast Lineup. Now generally available to all U.S. advertisers, this lineup offers a seamless way to align with the YouTube Select podcasters that fans love — from New Heights to The Rich Eisen Show.

    When the biggest global stages and championship showdowns are wrapping up, the conversation on YouTube is just beginning. With the all-new Top Sports Podcast Lineup, your brand can stay at the heart of it all.

  • Mar 19, 2026
    • Date parsed from source:
      Mar 19, 2026
    • First seen by Releasebot:
      Apr 10, 2026

    Google Ads by Google

    AI shopping gets simpler with Universal Commerce Protocol updates

    Google Ads shares a new onboarding experience to simplify UCP integration and adds new capabilities.

    Universal Commerce Protocol (UCP) releases new capabilities, and Google shares a new onboarding experience to simplify UCP integration.

  • Mar 19, 2026
    • Date parsed from source:
      Mar 19, 2026
    • First seen by Releasebot:
      Mar 19, 2026

    Google Ads by Google

    AI shopping gets simpler with Universal Commerce Protocol updates

    Google Ads releases updates to the Universal Commerce Protocol, adding a Cart option to save or add multiple items, a Catalog capability for real-time product details, and Identity Linking to preserve loyalty benefits across platforms. It also signals simplified UCP onboarding in Merchant Center and future expansion with AI Mode and Gemini.

    New capabilities to the Universal Commerce Protocol (UCP) help retailers make online shopping easier and more connected.

    Ashish Gupta
    VP/GM, Merchant Shopping

    We built the Universal Commerce Protocol (UCP) with the industry as an open standard to help make online shopping easier for everyone. Today, we’re sharing updates on new UCP capabilities and, separately, how we are simplifying the UCP onboarding experience on Google surfaces.

    New capabilities available in UCP

    Since launch, we’ve worked with community contributors to add new optional capabilities to the protocol:

    • UCP can help make online shopping more intuitive and convenient, thanks to a new Cart option that will let agents save or add multiple items to a shopping cart at once from a single store — just as a shopper typically would.

    • UCP adopters will be able to access a new Catalog capability that lets agents retrieve select real-time product details from a retailer’s catalog where necessary — like variants, inventory and pricing.

    • Building on existing standards, UCP will also support Identity Linking. That allows shoppers on UCP-integrated platforms to receive the same loyalty or member benefits they would on a retailer’s site when they’re logged in — like pricing or free shipping — making shopping more connected across the web.

    As always, UCP adopters can customize the experience they provide by selecting which capabilities to support.

    Scaling agentic commerce with Google

    At Google, we’ll continue to bring relevant UCP capabilities to shopping experiences in AI Mode in Search, the Gemini app and beyond. Additionally, we are actively working to onboard more retailers of all sizes to agentic experiences on Google with a simplified UCP onboarding process in Google’s Merchant Center, rolling out over the coming months. Partners like Commerce Inc, Salesforce and Stripe will implement UCP on their platforms in the near future, with others coming soon — making online shopping and selling even better for more people and businesses.


  • Mar 13, 2026
    • Date parsed from source:
      Mar 13, 2026
    • First seen by Releasebot:
      Mar 14, 2026

    Instagram Platform by Meta

    March 13, 2026

    Meta announces Instagram Direct Send API now supports attachment IDs for images, reducing timeouts and enabling reuse across recipients.

    The Instagram Direct Send API now supports sending images by attachment ID (in addition to image URLs), which resolves timeouts when uploading multiple large, high-quality images from slow servers. You can use the attachment API to upload each image once and reuse its attachment ID to send the same image to multiple users. See the updated dev docs.
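A sketch of the upload-once, send-many flow. Only the attachment-ID reuse behavior comes from the note above; the JSON field names below follow the Messenger-style attachment pattern and are assumptions to verify against the updated docs:

```python
def build_upload_payload(image_url: str) -> dict:
    """Payload for uploading an image once to get back a reusable
    attachment ID (field names are assumed, Messenger-style)."""
    return {
        "message": {
            "attachment": {
                "type": "image",
                "payload": {"url": image_url, "is_reusable": True},
            }
        }
    }

def build_send_payload(recipient_id: str, attachment_id: str) -> dict:
    """Payload for sending a previously uploaded image; the same
    attachment_id can be reused across many recipients."""
    return {
        "recipient": {"id": recipient_id},
        "message": {
            "attachment": {
                "type": "image",
                "payload": {"attachment_id": attachment_id},
            }
        },
    }
```

Upload each image once, store the returned attachment ID, then POST a send payload per recipient instead of re-uploading from a slow origin server.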

  • Mar 5, 2026
    • Date parsed from source:
      Mar 5, 2026
    • First seen by Releasebot:
      Mar 6, 2026

    Google Ads by Google

    Ask a Techspert: How does AI understand my visual searches?

    Google highlights a major leap in visual search with AI Mode and Circle to Search, enabling multi-object searches in images and simultaneous results. It explains the fan-out technique powering faster, cohesive image queries for uses from fashion to home decor.

    Visual search progress

    Visual search has improved by leaps and bounds — look no further than recent updates to Google Search. Here, a Google expert explains this progress and the technique we’ve used to make it happen.

    We’ve all been there: You see a photo of a perfectly styled living room or a well-curated street-style outfit, and you want to know where everything came from. Until recently, visual search was a one-item-at-a-time process. But a major update to Circle to Search and Lens now allows Google to break down and search for multiple objects within a single image simultaneously. This means if you use Circle to Search on Android to search for an entire outfit, you’ll see results for every component of a look, not just one piece at a time. In recent months, we’ve also launched several updates that enhance both visual search and image results in AI Mode, so you can better find inspiration as you search.

    To better understand these breakthroughs, we talked to Search Senior Engineering Director Dounia Berrada.

    What part of Search do you work on?

    I focus on multimodal search, aka Google Lens — essentially, enabling Google to help with your most complex questions about images, PDFs and anything you see. Visual search is redefining how we interact with information; Lens should be intelligent enough to understand the "why" behind your search, making it effortless to get help with what you see on your screen, or in the world around you. That means building a tool that can just as easily explain a complex math problem as it can identify a rare succulent or help you track down a pair of shoes you love.

    How does it do that?

    Imagine you’re redesigning a room so you upload a photo of a mid-century modern space for inspiration. You probably aren’t just looking for the side table; you want to recreate the entire vibe. Previously, you’d have to search for the lamp, then the rug, then the chair individually. Now, AI Mode can break down that complex image, identify each individual piece and issue multiple visual searches simultaneously. You can see this in action right now using Circle to Search.

    What powers these types of visual search responses?

    Our advanced Gemini models make AI Mode possible, and its multimodal capabilities benefit from the visual expertise we've built into Lens over the years. When you search with an image, Gemini analyzes the image alongside your question to decide which tools to use. Let's say you're scrolling on your phone and see an outfit on social media that you love. When you search it, the model knows to use Lens to retrieve image results for the hat, shoes and jacket of the outfit simultaneously. It then weaves those individual results into one easy-to-read response.

    Think of it this way: The AI model acts as the "brain" that can “see” the image, while the visual search backend acts as the "library" containing billions of web results. The AI performs multi-object reasoning to understand what you’re looking at. Then it uses a "fan-out" technique which triggers multiple searches at once, reads through the results and presents a single, cohesive response with helpful links — all in seconds.

    Can you explain the fan-out technique?

    AI Mode is basically doing a dozen searches for you in the time it takes to do one. If you upload a photo of a garden you admire, you might have several questions: Will these plants survive in the shade? Are they right for my climate? How much maintenance do they need?

    Before, you’d ask those one by one. Now, AI Mode identifies all those necessary "fan-out" searches. This way, it gathers care requirements for every plant in the photo using helpful web results, breaks down the info and even suggests next steps you might want to take. Since AI Mode is uncovering more visual results from a single search, it's easier than ever to find just what you're looking for, and stumble upon something new that sparks your interest.
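Stripped of the AI parts, fan-out is just issuing several searches concurrently and merging the results. A toy sketch, where search_fn stands in for a real search backend:

```python
from concurrent.futures import ThreadPoolExecutor

def fan_out(search_fn, queries):
    """Run one search per query concurrently and return the results
    keyed by query, many searches in roughly the time of one."""
    with ThreadPoolExecutor(max_workers=max(len(queries), 1)) as pool:
        return dict(zip(queries, pool.map(search_fn, queries)))
```

A production system would add the reasoning step that decides which sub-queries to issue in the first place, and a final pass that weaves the per-query results into one response.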

    Do you have to start with an image to get this kind of help in AI Mode?

    Not at all! You can start with a simple text search in AI Mode, like "visual inspo for work outfits." When you see a result you like, you can just say, "Show me more options like the second skirt." The system immediately takes that specific image and begins the fan-out process from there.

    It definitely seems great for shopping — what else could you use it for?

    You could take a photo of a wall at a museum and ask for explanations of each painting. Or take a photo of a bakery window and ask what all the different pastries are. It’s about moving from "What is this one thing?" to "Explain this entire scene to me."

    Sounds like I’ve got some photos to take and a lot more to discover. I'm off to put these tools to the test!
