Google Disables &num=100 Parameter: What It Means for SEO & Your Data
In mid-September 2025, Google quietly made a change that’s sent ripples through the SEO community: it disabled support for the &num=100 URL parameter. For many years, SEO tools used this parameter to request 100 search results on a single page (instead of the standard 10), enabling more efficient rank-tracking, broader SERP data collection, and (importantly) many more “impressions” for lower-ranking results.
What Exactly Was &num=100?
- It was a URL parameter appended to Google search queries (e.g. ?q=keyword&num=100) that told Google to show up to 100 organic search results on one page instead of the default 10.
- SEO tools, rank trackers, SERP research platforms, and scrapers used it to grab a large swath of data in a single request. This meant fewer requests, faster data collection, and broad visibility of where pages ranked even far down in the search results.
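For illustration, here is how such a request URL was assembled. The `search_url` helper below is hypothetical (not any tool's actual code), and Google no longer honors the `num` value:

```python
from urllib.parse import urlencode

BASE = "https://www.google.com/search"

def search_url(query, num=None):
    """Build a Google search URL; `num` requested N results per page
    (no longer honored by Google as of mid-September 2025)."""
    params = {"q": query}
    if num is not None:
        params["num"] = num
    return f"{BASE}?{urlencode(params)}"

# The old single-request style: one page of up to 100 results
print(search_url("best running shoes", num=100))
# → https://www.google.com/search?q=best+running+shoes&num=100
```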
What Changed & When
- Around September 10-14, 2025, Google began disabling the &num=100 parameter, at first intermittently: sometimes it still worked, often it didn't. Over that span, the parameter stopped reliably returning anything beyond the first few pages of results.
- Google confirmed that the results-per-page parameter is no longer officially supported.
Impacts on SEO Tools & Data
Rank-Tracking & SERP Tools
- These tools often relied on being able to fetch 100 results in one request. Without &num=100, they now need to make roughly 10 separate requests (one for each page of 10 results) to collect the same amount of data. That means 10× the network/API calls, data processing, infrastructure, and cost.
- Many tools have reported data gaps, outages, or inaccuracies in their ranking data during the transition period.
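The tenfold request overhead can be sketched using Google's long-standing `start=` offset parameter for pagination. The helper below is illustrative only, not any vendor's actual implementation:

```python
from urllib.parse import urlencode

def paginated_urls(query, depth=100, page_size=10):
    """URLs a rank tracker now needs to cover `depth` positions:
    one request per page of 10, paging via the start= offset."""
    urls = []
    for start in range(0, depth, page_size):
        params = {"q": query}
        if start:  # the first page needs no offset
            params["start"] = start
        urls.append(f"https://www.google.com/search?{urlencode(params)}")
    return urls

urls = paginated_urls("best running shoes")
print(len(urls))  # 10 requests where a single &num=100 request used to suffice
```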
Google Search Console (GSC) / Webmaster Data
- A number of sites have seen sudden drops in impressions, particularly desktop impressions, in their GSC dashboards shortly after this change.
- Alongside that, many have observed average position improving (i.e. appearing “better” in reports), because the lower-ranking impressions (positions far beyond page 1) that bots and tools using &num=100 had been generating are no longer recorded.
- Clicks, conversions, and real user behavior appear much less affected, indicating that what changed is the measurement/reporting rather than core visibility with actual human users.
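One way to sanity-check your own GSC exports against this pattern: a sharp impression drop paired with flat clicks points to a measurement change, not lost visibility. The function, thresholds, and numbers below are illustrative assumptions, not an official rule:

```python
def looks_like_reporting_change(impr_before, impr_after,
                                clicks_before, clicks_after,
                                impr_drop=0.3, click_tolerance=0.1):
    """Heuristic: impressions fell sharply while clicks held steady,
    which suggests a reporting artifact rather than a ranking loss.
    Thresholds (30% impression drop, 10% click tolerance) are arbitrary."""
    impr_delta = (impr_before - impr_after) / impr_before
    click_delta = abs(clicks_before - clicks_after) / clicks_before
    return impr_delta >= impr_drop and click_delta <= click_tolerance

# Hypothetical example: impressions halved, clicks essentially flat
print(looks_like_reporting_change(120_000, 60_000, 3_000, 2_950))  # → True
```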
Why Google Did This
- To reduce bot / scraper driven “artificial” impressions. Many of the low-ranking positions (positions far beyond what typical users see) were being “impressed” in tools or via bot traffic or scraping, inflating impression metrics.
- To better reflect real user behavior in its own data (both for Search Console and perhaps for internal signal robustness). If people rarely scroll past page 1 or 2, then impressions from page 10 or 50 are less relevant.
- To prevent misuse or overuse of system resources; when many tools are scraping at scale, it imposes load and infrastructure demands on Google. Disabling or limiting &num=100 reduces that load.
What This Means for Businesses & Marketers
- If you see a big drop in impressions in GSC around September 10-15, 2025, don’t panic. It’s likely due to reporting changes rather than a real drop in your site’s visibility.
- Be particularly careful interpreting average position or impressions for keywords where you were ranking far down the SERP (beyond page 1 / page 2). Many of those impressions may have been “phantom” impressions from bots/scraping.
- Monitor the tools you use: check with your vendor (Semrush, Ahrefs, AccuRanker, etc.) to understand how they are adapting, what new costs or limitations might apply, and whether they will still report out to position 100 (or will shift focus to the top 20/50).
How to Adapt
Here are strategies to mitigate and adapt to the changes:
| Strategy | Why It Helps |
| --- | --- |
| Focus on what matters: top 10/top 20 rankings, clicks, and conversion rate rather than chasing positions 50-100. | These are the ranks that drive most of your visible, usable traffic. Less wasted effort. |
| Validate with real user data: examine your organic traffic and analytics, not just GSC impressions. | Ensures you're responding to what actual users do. |
| Communicate with tool providers: find out if they're changing pricing, data depth, or refresh schedules. | So you aren't hit by surprise cost increases or data gaps. |
| Adjust reporting expectations: explain to stakeholders that September's impression/position data will look different. | Helps avoid misinterpretation of, or alarm over, drops that aren't performance issues. |
| Prioritize high-value keywords: invest effort where ranking improvements are most likely to impact traffic and revenue. | More ROI for your resources. |