Google Just Shook Up Merchant Center — And A Whole Lot of Retailers Got Caught in the Fallout

February 10, 2026

Alright, so if you run a Google Merchant Center account and somewhere around January 8–12 you looked at your dashboard and thought “well, that ain’t right” — I want you to know you’re not crazy, and you’re definitely not alone.

A whole bunch of retailers watched their approved products drop off a cliff practically overnight. Products that had been running just fine for months suddenly flipped to “Limited” or “Pending” status. No warning, no changes on our end. Just… gone.

So what the heck happened? Buckle up, because Google had itself quite a week.

Google Went and Changed Everything at Once

On January 11, Google CEO Sundar Pichai got up at the National Retail Federation conference in New York and announced what honestly amounts to the biggest shakeup of Google’s shopping infrastructure in years. And I mean years.

Here’s the rundown:

They launched something called the Universal Commerce Protocol (UCP). It’s basically a new open standard that lets AI agents handle shopping for people, from finding a product to checkout. Google built it with Shopify, Etsy, Wayfair, Target, and Walmart, and over 20 other big names signed on, including Best Buy, Mastercard, Visa, and Home Depot. That’s not a small deal. (TechCrunch, Axios)

They added dozens of new data attributes to Merchant Center. These go way beyond your usual product titles and descriptions. We’re talking answers to common product questions, compatible accessories, substitutes — all designed to feed into AI-powered shopping through Gemini, AI Mode, and their new Business Agent. (Chain Store Age, Constellation Research)

Speaking of the Business Agent, that went live on January 12 with Lowe's, Michaels, Poshmark, and Reebok. It's basically a branded AI chatbot that shows up right in Google Search results and can answer customer questions in the retailer's voice. You can activate it through Merchant Center. (Search Engine World)

They rolled out AI Mode Checkout and Direct Offers, letting shoppers buy stuff without ever leaving the AI conversation in Search or Gemini. Plus a new ads pilot where brands can serve up exclusive deals based on what someone’s chatting about. (Google Blog)

And they launched Gemini Enterprise for Customer Experience, a full Google Cloud suite that provides retailers with AI-powered shopping agents and customer service tools. (Google Blog — Sundar Pichai’s NRF Remarks)

That is a lot of stuff to drop in one weekend. And here’s the thing: when Google makes changes this big on the backend, they don’t just flip a switch and leave everything else alone. Their systems go back through, recrawl, and re-evaluate product feeds. Every. Single. One.

Oh, and Five Days Before That…

On January 6, Google announced another big change: starting in March 2026, if you sell products both online and in-store and the details differ between channels (price, availability, condition, whatever), you’re going to need separate product IDs for each version. Online attributes become the default.

They started emailing affected merchants right away to flag products that needed updating. (Search Engine Land, Search Engine Roundtable, Google Merchant Center Help)

Now, enforcement doesn’t kick in until March. But the prep work and backend processing for this change? That was happening right alongside everything else in early January.

And Then Their Own Crawl System Broke

Here’s where it really gets fun.

While all this was going on, Google’s automatic import feature — you know, the one that promises to update your product data every 24 hours — was flat-out not working right.

Emmanuel Flossie, a Google Shopping Specialist and Google Ads Diamond Product Expert (so, not just some random guy), tested it on January 13 and found product data that hadn’t been touched since January 4. Nine days. Not 24 hours — nine days. He went public with it and noted this has actually been a problem for over five years. (PPC Land, Google Merchant Center Community)

So let me get this straight: Google rolls out the biggest commerce infrastructure change in years, their crawl systems are already running behind, and then they re-evaluate everybody’s product feeds at the same time? Yeah. That’s gonna cause some problems.

Why Products Ended Up in “Limited” or “Pending”

When Google does a mass re-evaluation like this, a few things happen under the hood:

They re-crawl your product landing pages to ensure your feed data matches what’s actually on your website. They re-run automated policy enforcement against your titles, descriptions, images, and landing page content. And they validate your data against any new or updated specifications.

Most of the time, this is invisible. Products get re-approved and you never notice. But when the automated systems are running hot — as they often do during big platform transitions — stuff that was perfectly fine yesterday can get flagged today.

Google’s own help documentation says products can land in “Pending” status when they haven’t been crawled yet and Google can’t verify the data. (Google Merchant Center Help) Well, when your crawl system is nine days behind, there’s gonna be a whole lot of unverified products sitting in limbo.

And for retailers in categories that touch Google’s “sensitive” policy areas, such as health products, financial services, and yes, religious products, the automated enforcement can be especially aggressive. Products that were approved for months can suddenly get hit with policy flags that don’t make a lick of sense.

So What Can You Do About It?

Look, I’m not going to sugarcoat it; dealing with Google’s automated systems when they get it wrong is about as fun as a flat tire on I-80. But here’s what’s actually helped:

Check your Merchant Center diagnostics. Go to Products, then Needs Attention, and figure out exactly what Google is flagging. A policy violation needs a different fix than a price mismatch or a crawl error.

Don’t rely on automatic imports. Seriously, just don’t. Set up scheduled feed uploads through your e-commerce platform — whether that’s Shopify, WooCommerce, Cart.com, or whatever you’re using. You want Google to get your data straight from the source, on your schedule.
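If you want to sanity-check what a scheduled upload actually contains, here's a minimal sketch of generating a tab-separated feed with the core attributes from the Merchant Center product data specification. The SKU and URLs are hypothetical; in practice your e-commerce platform's exporter produces this file for you.

```python
import csv
import io

# Core attributes from the Merchant Center product data specification.
FIELDS = ["id", "title", "description", "link", "image_link",
          "price", "availability", "condition"]

def build_feed(products):
    """Render products as a tab-separated feed Merchant Center can ingest."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS, delimiter="\t")
    writer.writeheader()
    for p in products:
        writer.writerow({k: p.get(k, "") for k in FIELDS})
    return buf.getvalue()

feed = build_feed([{
    "id": "SKU-1001",  # hypothetical SKU
    "title": "Example Widget",
    "description": "A sample product entry.",
    "link": "https://example.com/widget",
    "image_link": "https://example.com/widget.jpg",
    "price": "19.99 USD",
    "availability": "in_stock",
    "condition": "new",
}])
print(feed.splitlines()[0])  # the header row Merchant Center expects
```

Generating the file yourself (or having your platform do it on a schedule) means Google gets your data on your timetable, not whenever the crawler gets around to you.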

Make sure your landing pages match your feed. Google’s re-crawl is checking for consistency. If your prices, availability, or product details don’t match between your feed and your actual pages, that will cause issues.

Contact Google Support. If you’re seeing policy flags on products that have been approved forever and nothing has changed, use the Help icon in Merchant Center to contact a real person. A human reviewer can usually sort out what the automated system got wrong.

Keep an eye on things. Many retailers are seeing products return to Approved status in early February. If you’re seeing a gradual recovery, Google’s systems may be catching up and self-correcting as the re-crawl works through the backlog.

The Big Picture

Listen, I get what Google is trying to do here. AI-powered shopping is coming whether we like it or not, and the Universal Commerce Protocol, Business Agent, and all these new tools are genuinely impressive. They’re building the infrastructure for a world where AI agents do the shopping for people, and that’s going to change retail in a big way.

But the rollout created real pain for real businesses. When you’re a small or mid-size retailer and 60–70% of your approved products disappear overnight — even temporarily — that hits your bottom line. That’s real revenue walking out the door.

So here’s my takeaway, and I think it’s a good one to tape to your monitor: when Google announces big infrastructure changes, expect turbulence in your Merchant Center account in the days that follow. Keep your feeds fresh, your landing pages buttoned up, and your Google Support contacts handy.

We’re all figuring this out together. And if you’re dealing with this right now, just know — it’s not you. It’s Google being Google.


Why Is Direct Traffic Suddenly Increasing on My Website?

You’re checking your Google Analytics and notice something alarming: direct traffic has spiked, and your bounce rate is climbing with it. Before you panic or start installing plugins, take a breath. The answer is almost always hiding in your data — you just need to know where to look.

If you’re using GA4, it automatically filters out many known bots. So in most cases, an inflated bounce rate tied to direct traffic is more about how engagement tracking is configured or what’s slipping through the cracks than it is about your website being broken. Let’s walk through how to diagnose the problem.

Start With Your GA4 Configuration

Before diving into the data, make sure your GA4 setup isn’t part of the problem. There are a few common configuration issues that can inflate direct traffic or skew your bounce rate.

Internal Traffic Filters

Your own visits can show up as direct traffic if you haven’t excluded them. To check this:

  1. Go to Admin → Data Streams → Configure tag settings → Define internal traffic
  2. Make sure your office, home, and VPN IP addresses are listed
  3. Then go to Admin → Data Settings → Data Filters and confirm the filter is set to Active, not just Testing

Referral Exclusions

When users pass through payment gateways, SSO providers, or other redirect-heavy services and return to your site, GA4 can re-classify them as new direct sessions. These often bounce because the user already completed their action. Check this under Admin → Data Streams → Configure tag settings → List unwanted referrals and add any services that are part of your normal user flow.

Engaged Session Settings

GA4 defines a “bounce” differently than Universal Analytics did. A bounced session in GA4 is one that wasn’t “engaged,” meaning it didn’t last at least 10 seconds, didn’t include 2 or more page views, and didn’t trigger a conversion event. If your site is content-heavy but users tend to read quickly and leave, that 10-second threshold might be too aggressive. You can adjust it under Admin → Data Streams → Configure tag settings → Adjust session timeout.
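GA4's engaged-session logic boils down to a simple predicate, and seeing it as code makes the bounce math less mysterious. A rough illustration (the session tuples are made up; the 10-second threshold is GA4's default):

```python
def is_engaged(duration_s, page_views, conversions, threshold_s=10):
    """GA4-style engaged session: 10+ seconds OR 2+ page views OR a conversion."""
    return duration_s >= threshold_s or page_views >= 2 or conversions > 0

def bounce_rate(sessions):
    """Share of sessions that were NOT engaged."""
    bounced = sum(1 for s in sessions if not is_engaged(*s))
    return bounced / len(sessions)

sessions = [
    (4, 1, 0),   # quick homepage hit -> bounce
    (25, 1, 0),  # read for 25 seconds -> engaged
    (3, 2, 0),   # two page views -> engaged
    (5, 1, 1),   # converted -> engaged
]
print(f"{bounce_rate(sessions):.0%}")  # prints 25%
```

Note how a fast-reading visitor who leaves after 9 seconds on one page with no conversion counts as a bounce, even if they got exactly what they came for.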

Missing UTM Parameters

This one is easy to overlook. If you’re running email campaigns, posting on social media, or running ads without proper UTM parameters on your links, that traffic gets dumped into the direct bucket. It might have engagement patterns that differ significantly from your direct traffic, pulling your overall numbers in unexpected directions.
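Tagging links is mechanical enough to script. A small sketch using only the standard library (the URL and campaign names are hypothetical):

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def add_utm(url, source, medium, campaign):
    """Append UTM parameters so GA4 attributes the visit instead of calling it 'direct'."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))  # keep any existing parameters
    query.update({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunparse(parts._replace(query=urlencode(query)))

tagged = add_utm("https://example.com/sale", "newsletter", "email", "jan_promo")
print(tagged)
# https://example.com/sale?utm_source=newsletter&utm_medium=email&utm_campaign=jan_promo
```

Run every campaign link through something like this before it goes out, and the "direct" bucket stops absorbing traffic that actually came from your email and social efforts.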

Dig Into the Data With Explore Reports

Once you’ve confirmed your configuration is solid, it’s time to investigate the traffic itself. GA4’s Explore reports let you slice the data in ways that standard reports can’t, and they’re essential for spotting bot traffic.

Setting Up Your Exploration

  1. Go to Explore in the left sidebar and create a new blank exploration
  2. Add Session default channel group and Landing page + query string as dimensions
  3. Add Sessions, Engaged sessions, Bounce rate, and Engagement rate as metrics
  4. In the Tab Settings, add a filter: Session default channel group exactly matches Direct

A quick but important note: make sure you’re using Session default channel group, not just “Default channel group.” The version without “Session” is event-scoped and will return far fewer results, sometimes dramatically so. In one case, using the wrong dimension showed only 10 sessions, even though the actual number was over 142,000.
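If you'd rather pull this outside the UI, the GA4 Data API's runReport method accepts the same query. This sketch only builds the request body; actually sending it requires OAuth credentials, and the property ID below is hypothetical. The API names shown (sessionDefaultChannelGroup, landingPagePlusQueryString, and so on) are the documented API equivalents of the UI labels, and note again that it's the session-scoped dimension you want.

```python
import json

PROPERTY_ID = "123456789"  # hypothetical GA4 property ID

# Request body for the GA4 Data API v1beta runReport method.
request_body = {
    "dateRanges": [{"startDate": "90daysAgo", "endDate": "today"}],
    "dimensions": [
        {"name": "sessionDefaultChannelGroup"},  # session-scoped, not event-scoped
        {"name": "landingPagePlusQueryString"},
    ],
    "metrics": [
        {"name": "sessions"},
        {"name": "engagedSessions"},
        {"name": "bounceRate"},
        {"name": "engagementRate"},
    ],
    "dimensionFilter": {
        "filter": {
            "fieldName": "sessionDefaultChannelGroup",
            "stringFilter": {"matchType": "EXACT", "value": "Direct"},
        }
    },
}

endpoint = (f"https://analyticsdata.googleapis.com/v1beta/"
            f"properties/{PROPERTY_ID}:runReport")
print(endpoint)
print(json.dumps(request_body, indent=2)[:120])
```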

What to Look For: Landing Pages

With your exploration filtered to direct traffic, set Landing page as your row dimension and sort by sessions in descending order. You’re looking for:

  • URLs you don’t recognize or that don’t exist on your site, which can indicate spam or ghost hits
  • A single page absorbing a disproportionate share of all direct sessions
  • Pages with near-100% bounce rates and almost zero engaged sessions
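Those three checks are easy to automate once you export the exploration rows. A sketch, with a hypothetical site map, thresholds, and traffic numbers:

```python
KNOWN_PATHS = {"/", "/products", "/about", "/contact"}  # hypothetical site map

def flag_landing_pages(rows, share_threshold=0.5, bounce_threshold=0.98):
    """Flag landing pages that look like spam/ghost hits or bot targets.

    rows: list of (path, sessions, engaged_sessions) for direct traffic.
    """
    total = sum(sessions for _, sessions, _ in rows)
    flags = []
    for path, sessions, engaged in rows:
        bounce = 1 - engaged / sessions if sessions else 0
        reasons = []
        if path not in KNOWN_PATHS:
            reasons.append("unknown URL")
        if sessions / total > share_threshold:
            reasons.append("absorbs most direct sessions")
        if bounce >= bounce_threshold:
            reasons.append("near-100% bounce")
        if reasons:
            flags.append((path, reasons))
    return flags

rows = [
    ("/", 120_000, 300),             # huge share, almost no engagement
    ("/products", 4_000, 1_800),     # normal-looking page
    ("/free-casino-spins", 900, 0),  # page that doesn't exist on the site
]
for path, reasons in flag_landing_pages(rows):
    print(path, "->", ", ".join(reasons))
```

The thresholds are judgment calls, not standards; the point is to turn "eyeball the table" into a repeatable check you can rerun monthly.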

What to Look For: Devices and Screen Resolution

Add Device category and Screen resolution as row dimensions alongside your landing page. Sort by sessions and look for:

  • Screen resolutions like 1024×768, 800×600, or (not set) appearing with unusually high session counts
  • A single resolution driving the vast majority of your direct traffic
  • Any resolution with a 100% or near-100% bounce rate and zero engaged sessions

The resolution 1024×768 is the default viewport size for headless browsers and automation tools like Selenium, and it’s rarely used by real humans today. If you see tens of thousands of sessions from this resolution, you’re almost certainly looking at bot traffic.

What to Look For: Engagement Patterns

Check the overall engagement rate for your direct traffic. Real human traffic, even from disinterested visitors, doesn’t produce a perfect 100% bounce rate across tens of thousands of sessions. You’d always expect at least some percentage to engage. If your engagement rate is below 1–2% and your session counts are high, that’s a strong signal of automated traffic.
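That heuristic, high volume plus near-zero engagement, is simple to codify. A sketch with illustrative thresholds and made-up rows:

```python
def suspicious_resolutions(rows, min_sessions=10_000, max_engagement=0.02):
    """Flag resolutions whose direct traffic is high-volume but almost never engages.

    rows: list of (resolution, sessions, engaged_sessions).
    """
    flags = []
    for resolution, sessions, engaged in rows:
        rate = engaged / sessions if sessions else 0
        if sessions >= min_sessions and rate <= max_engagement:
            flags.append((resolution, sessions, rate))
    return flags

rows = [
    ("1024x768", 133_000, 0),     # headless-browser default viewport
    ("1920x1080", 3_500, 1_400),  # normal desktop traffic
    ("390x844", 2_900, 1_100),    # normal mobile traffic
]
for res, sessions, rate in suspicious_resolutions(rows):
    print(f"{res}: {sessions} sessions, {rate:.1%} engagement -> likely bots")
```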

A Real-World Example

Here’s what this looks like in practice. A recent investigation into a site’s GA4 data revealed the following over a 90-day period:

  • Over 140,000 total direct sessions with fewer than 800 engaged sessions, a 99%+ bounce rate
  • The vast majority of those sessions came from a single screen resolution: 1024×768 on desktop
  • Every single one of those sessions bounced, with zero counted as engaged
  • Over 95% of all direct traffic was concentrated in this one resolution
  • All of it was hitting the homepage exclusively

This pattern is a textbook indicator of automated bot traffic:

  1. 1024×768 is the default viewport size for headless browsers and automation tools
  2. Real human traffic doesn’t produce a perfect 100% bounce rate 
  3. All traffic landing exclusively on the homepage via direct is the most common behavior for bots that simply load a URL without navigating the site
  4. A single resolution accounting for 95%+ of all direct traffic is not a natural distribution

When the 1024×768 resolution was filtered out of the data, the results shifted dramatically:

  • Engagement rate jumped from under 1% to over 10%
  • Bounce rate dropped from over 99% to under 90%
  • Total bounce rate across all channels fell by nearly 17 percentage points
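The arithmetic behind that shift is worth seeing directly. With hypothetical rows matching the example's magnitudes (about 140,000 direct sessions, roughly 800 of them engaged), removing the one bot-heavy resolution changes the picture completely:

```python
# Hypothetical direct-traffic rows: (resolution, sessions, engaged_sessions)
rows = [
    ("1024x768", 133_000, 0),   # suspected bot traffic
    ("1920x1080", 4_500, 500),
    ("390x844", 2_500, 300),
]

def engagement_rate(rows):
    sessions = sum(s for _, s, _ in rows)
    engaged = sum(e for _, _, e in rows)
    return engaged / sessions

before = engagement_rate(rows)
after = engagement_rate([r for r in rows if r[0] != "1024x768"])
print(f"before: {before:.1%}, after: {after:.1%}")  # before: 0.6%, after: 11.4%
```

Nothing about the real visitors changed; the denominator did.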

What to Do About It

If your investigation points to bot traffic, here’s the recommended path forward:

Short Term: Clean Up Your Reports

In GA4, you can create filters or custom audiences that exclude the offending screen resolution so your reports reflect real user behavior. This doesn’t stop the bots, but it gives you clean data to work with while you address the root cause.

Medium Term: Check Your Server Logs

Contact your hosting provider and ask them to check for high-volume requests from specific IP ranges or user agents that hit your homepage. The server logs will show you exactly which IPs are responsible, and you can cross-reference those against known cloud hosting providers and bot networks. From there, you can block the offending traffic while allowing legitimate bots, such as search engine crawlers, through.
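If you can get at the raw access logs yourself, a few lines of Python will surface the top offenders. This sketch assumes the common/combined log format (the IPs below are documentation-reserved examples):

```python
import re
from collections import Counter

# Common/combined log format: IP ... [timestamp] "METHOD path HTTP/x" status ...
LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+) [^"]*" (\d{3})')

def top_homepage_ips(log_lines, n=3):
    """Count requests to the homepage per client IP."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m and m.group(3) == "/":
            hits[m.group(1)] += 1
    return hits.most_common(n)

sample = [
    '203.0.113.7 - - [13/Jan/2026:10:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
    '203.0.113.7 - - [13/Jan/2026:10:00:02 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
    '198.51.100.4 - - [13/Jan/2026:10:00:03 +0000] "GET /products HTTP/1.1" 200 900 "-" "Mozilla/5.0"',
]
print(top_homepage_ips(sample))  # [('203.0.113.7', 2)]
```

On a real log you'd run this over weeks of data and cross-reference the top IPs against cloud-provider ranges before blocking anything.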

Long Term: Implement Bot Mitigation

If you’re behind a CDN like Cloudflare, you can tighten your firewall rules or enable bot management features to challenge suspicious traffic before it ever reaches your site. This prevents the traffic from being recorded in GA4 in the first place, which is the cleanest solution.
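As one hedged illustration of what that can look like on Cloudflare: a custom rule with the Managed Challenge action can screen homepage hits that don't come from verified crawlers. The expression below uses Cloudflare's documented rule fields (`cf.client.bot` is true only for verified bots like Googlebot), but treat it as a sketch to adapt, not a drop-in; Managed Challenge is invisible to most humans, but you'll still want to scope the rule to the traffic pattern you actually observed.

```
# Cloudflare custom rule (action: Managed Challenge), a sketch:
# challenge homepage requests that are not from verified crawlers.
(http.request.uri.path eq "/" and not cf.client.bot)
```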

The Bigger Picture

A sudden spike in direct traffic with a high bounce rate isn’t always a sign that something is wrong with your website or your analytics setup. Sometimes it’s just bots. The key is knowing how to investigate systematically: start with your configuration, dig into the data with the right dimensions and filters, and follow the evidence.

Once you’ve cleaned up the bot traffic, you’ll have a much clearer picture of how your real visitors are behaving. And from there, you can make informed decisions about what actually needs to be optimized.