Google’s approach to indexing has evolved significantly, driven by two key trends: the shrinking open web and the rise of big content platforms like YouTube, Reddit, and TikTok. These platforms often utilize complex JavaScript frameworks, making it harder for Google to discover new content. Additionally, the increasing use of AI is reshaping the web by rendering low-quality content obsolete.
In his article for Search Engine Journal, Kevin Indig writes that in his experience working with some of the largest sites on the web, he’s observed a concerning trend: an inverse relationship between the number of indexed pages and organic traffic. This doesn’t necessarily mean that having more pages is bad, but it does suggest that Google’s definition of “quality” has changed. For SEOs, the stakes are high—expanding your content too aggressively could harm your entire domain.
Understanding Google’s New Approach to Domains
Since October 2023, Google has changed how it treats domains, a shift that coincided with the launch of the October 2023 Core Algorithm Update. Before this change, Google would attempt to index everything on a domain and then prioritize the highest-quality content.
Now, however, a domain and its content need to prove their quality before Google even considers indexing them. If a domain has too much low-quality content, Google may choose to index only a portion of its pages—or none at all.
For example, DoorDash added many new pages over the last 12 months, but this expansion led to a significant drop in organic traffic. The new pages likely didn’t meet Google’s quality expectations, resulting in reduced visibility.
Why Google’s Bar for Quality Has Increased
Several factors have influenced Google’s stricter approach:
1. Resource Efficiency: Google aims to save resources and costs by indexing only high-quality content.
2. Combating Low-Quality Content: Partial indexing is an effective strategy against spam and low-quality content.
3. Higher Quality Standards: With an abundance of content on the web, Google has raised its quality standards to optimize its indexing and train AI models.
This emphasis on domain quality means that SEOs must rethink how they monitor their websites. Kevin’s principle is simple: “If you can’t add anything new or better to the web, it’s likely not good enough.”
Monitoring Domain Quality: The Key to Success
Domain quality refers to the ratio of indexed pages that meet Google’s quality standards versus those that don’t. While the exact threshold of “bad” pages that triggers a drop in traffic is unclear, it’s evident when a domain crosses this line.
Let’s define domain quality as a signal composed of three key areas:
1. User Experience: Are users finding what they’re looking for?
2. Content Quality: Is the content comprehensive, well-designed, and informative?
3. Technical Optimization: Are there issues like duplicate content, rendering problems, or soft 404s?
For example, a sudden spike in indexed pages can indicate a technical issue, such as duplicate content from broken pagination. Google now reacts faster to such technical bugs, reducing organic traffic almost immediately.
In other cases, a spike in indexed pages might result from a programmatic SEO strategy, where many pages are created using the same template. If these pages don’t meet Google’s quality standards, the site’s overall traffic can suffer.
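To make this concrete, here is a minimal Python sketch of how you might watch for that kind of spike. It assumes you export daily indexed-page counts (for example, from Search Console’s Pages report) into a CSV with date and indexed_pages columns and ISO-style dates; the column names, file name, and threshold are assumptions, not part of Google’s tooling.

```python
# Minimal sketch: flag sudden spikes in indexed-page counts.
# Assumes a CSV export with columns "date" (ISO format) and "indexed_pages";
# adjust the names to match whatever your export actually contains.
import csv

def detect_index_spikes(path, threshold=1.5):
    """Return dates where indexed pages jumped by more than `threshold`x
    versus the previous reading -- often a sign of duplicate URLs from
    broken pagination, faceted navigation, or a template bug."""
    rows = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            rows.append((row["date"], int(row["indexed_pages"])))
    rows.sort()  # ISO dates sort chronologically as strings
    spikes = []
    for (_prev_date, prev_count), (date, count) in zip(rows, rows[1:]):
        if prev_count > 0 and count / prev_count > threshold:
            spikes.append((date, prev_count, count))
    return spikes

if __name__ == "__main__":
    for date, before, after in detect_index_spikes("indexed_pages.csv"):
        print(f"{date}: indexed pages jumped from {before} to {after}")
```

A daily check like this won’t tell you why the count jumped, but it surfaces the jump early enough to investigate before Google reacts.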
How to Maintain High Domain Quality
The most critical step in maintaining high domain quality is implementing the right monitoring system. It’s hard to improve what you don’t measure.
A robust monitoring system should track metrics for each page and compare them against site averages. If Kevin could choose only three metrics, they would be inverse bounce rate, conversions (both soft and hard), and clicks and ranks by page type. Ideally, your system should alert you to any spikes in crawl rate, especially for new pages.
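As a rough illustration of the “compare against site averages” idea, here is a hedged Python sketch. The metric names, the input format (a list of dicts, e.g. loaded from an analytics export), and the tolerance value are all placeholders you would swap for whatever your own stack provides.

```python
# Minimal sketch: compare per-page metrics against site averages and flag
# pages that fall well below them. Metric names and sample values are
# illustrative assumptions, not figures from the article.
from statistics import mean

METRICS = ["inverse_bounce_rate", "conversions", "clicks"]

def flag_underperformers(pages, tolerance=0.5):
    """Flag pages scoring below `tolerance` x the site average on any metric."""
    averages = {m: mean(p[m] for p in pages) for m in METRICS}
    flagged = []
    for page in pages:
        weak = [m for m in METRICS
                if averages[m] > 0 and page[m] < averages[m] * tolerance]
        if weak:
            flagged.append((page["url"], weak))
    return flagged

pages = [
    {"url": "/guide-a", "inverse_bounce_rate": 0.72, "conversions": 14, "clicks": 900},
    {"url": "/guide-b", "inverse_bounce_rate": 0.25, "conversions": 1,  "clicks": 40},
]
for url, weak_metrics in flag_underperformers(pages):
    print(f"{url} underperforms on: {', '.join(weak_metrics)}")
```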
As Kevin outlined in his guide, “How the Best Companies Measure Content Quality”:
1. Production Quality: Measure SEO editor scores, readability, and grammatical errors.
2. Performance Quality: Track metrics like top 3 rankings, time on page vs. estimated reading time (illustrated in the sketch after this list), and scroll depth.
3. Preservation Quality: Monitor performance metrics over time to ensure consistency.
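One of those performance-quality signals is easy to compute yourself: how actual time on page compares with the time the piece should take to read. The 200 words-per-minute reading speed and the sample values below are assumptions, not numbers from Kevin’s guide.

```python
# Minimal sketch: one performance-quality signal -- time on page versus
# estimated reading time. Assumes ~200 words per minute as reading speed.
def reading_time_ratio(time_on_page_sec, word_count, wpm=200):
    """Ratio of actual time on page to estimated reading time.
    Values well below 1.0 suggest visitors leave before reading the piece."""
    estimated_sec = (word_count / wpm) * 60
    return time_on_page_sec / estimated_sec if estimated_sec else 0.0

print(round(reading_time_ratio(time_on_page_sec=95, word_count=1200), 2))  # ~0.26
```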
The Evolving SEO Landscape
In the current SEO landscape, Google rewards sites that maintain high quality. The goal is no longer to index as many pages as possible but to ensure that the pages you do index meet Google’s ever-evolving definition of “good.”
By focusing on domain quality and implementing effective monitoring systems, you can ensure that your site remains competitive in this new era of SEO. Remember, adding new pages isn’t inherently bad—but they must offer real value to your visitors. Contact the SEO experts at Greenhaven Interactive for help with all of your website needs.