Decoding 'Discovered - Currently Not Indexed' & Google's JavaScript Processing

Last time, we discussed Google's "Discovered - currently not indexed" status, which can leave SEOs scratching their heads, wondering why a page has been discovered but not indexed. However, the complexity of SEO doesn't end there. There's another key area to explore: Google's processing of JavaScript web apps. As more websites serve dynamic content with JavaScript, understanding how Google handles such content is paramount.


In the evolving web landscape, SEO budgets are crucial. If search engines don't discover or index content, it can't rank or earn traffic, affecting potential conversions. This also undermines the search engine's relevance. With developers using JavaScript for more interactivity and customization, SEOs face new challenges. This complexity demands revisiting our understanding of SEO budgets, covering aspects like crawl budget optimization and adapting to Google's JavaScript processing.

Google's Three-Stage JavaScript Processing: Crawling, Rendering, and Indexing

The search engine giant processes JavaScript web apps in three crucial stages:


  1. Crawling: Googlebot discovers your pages and fetches the raw HTML. Content that is injected by JavaScript is not visible yet at this stage.
  2. Rendering: Googlebot executes the JavaScript and generates the fully rendered page. Content that was hidden during crawling now becomes visible.
  3. Indexing: Finally, Googlebot evaluates the rendered content and decides whether to index it. Content that meets Google's quality guidelines earns a place in the index and can appear in search results.
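To make the gap between the crawling and rendering stages concrete, here is a minimal sketch. The `contentInRawHtml` helper and the sample markup are hypothetical, for illustration only: the function checks whether a phrase appears in the raw HTML a crawler fetches, before any JavaScript has run.

```javascript
// Sketch: does a phrase appear in the raw HTML a crawler sees before
// JavaScript executes? Script bodies are stripped first, so content that
// JS would inject at runtime is not counted as crawlable.
function contentInRawHtml(html, phrase) {
  const withoutScripts = html.replace(/<script[\s\S]*?<\/script>/gi, '');
  return withoutScripts.includes(phrase);
}

// Static content: already visible at the crawling stage.
console.log(contentInRawHtml('<h1>Linux guide</h1>', 'Linux guide')); // true

// SPA-style shell: the content only appears after the rendering stage.
const spaShell =
  '<div id="app"></div>' +
  '<script>document.getElementById("app").textContent = "Linux guide";</script>';
console.log(contentInRawHtml(spaShell, 'Linux guide')); // false
```

A check like this on your key templates quickly tells you which content depends on Google's rendering stage and which is safely visible at crawl time.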

A crucial point to consider here is that rendering JavaScript-loaded content in a browser requires search engines to expend significant time and computational resources. This is why search engine bots, like Googlebot, defer JavaScript rendering until resources are available, a phenomenon Google refers to as a “second wave of indexing”. So, our SEO strategies must account for and navigate this additional complexity.
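One common way to sidestep this second wave is to put the important content in the initial HTML response, for example via server-side rendering or pre-rendering. The sketch below is a hypothetical illustration of the idea (`renderPage` is not any particular framework's API): the server builds the full markup, so the crawl stage already sees the content.

```javascript
// Sketch of server-side rendering: the content is present in the initial
// HTML response, so it is visible during crawling instead of waiting for
// the deferred rendering stage.
function renderPage(title, body) {
  return [
    '<!doctype html>',
    '<html><head><title>' + title + '</title></head>',
    '<body><main>' + body + '</main></body></html>'
  ].join('\n');
}

const page = renderPage(
  'Crawl budget guide',
  '<h1>Crawl budget guide</h1><p>How Googlebot schedules crawls.</p>'
);
console.log(page.includes('How Googlebot schedules crawls.')); // true
```

The trade-off is server cost: you pay for rendering yourself instead of relying on Google's deferred render queue.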


Now that we've unravelled the process, there's another essential aspect to note, especially for JavaScript-heavy websites: the "render budget".

The Render Budget

Much like the crawl budget that we diligently optimize for each website, there's also a 'render budget.' As rendering JavaScript is resource-intensive, Google limits the resources it will use for rendering a page. This can sometimes create a delay between the crawling and rendering phases.
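As a rough, illustrative heuristic (the function below is hypothetical, not a Google metric), you can count render-blocking external scripts in a page's raw HTML, i.e. `<script src>` tags that carry neither `async` nor `defer`. Each one adds work the renderer must finish before the page is complete, so fewer blocking scripts generally means cheaper rendering.

```javascript
// Crude heuristic: count external scripts that block rendering,
// i.e. <script src=...> tags with neither async nor defer.
// (A simple regex scan, not a full HTML parser.)
function blockingScriptCount(html) {
  const tags = html.match(/<script\b[^>]*>/gi) || [];
  return tags.filter(
    (tag) => /\bsrc\s*=/i.test(tag) && !/\b(async|defer)\b/i.test(tag)
  ).length;
}

const sample =
  '<script src="analytics.js"></script>' +   // blocking
  '<script src="app.js" defer></script>' +   // deferred, not blocking
  '<script>var inline = true;</script>';     // inline, no src
console.log(blockingScriptCount(sample)); // 1
```

Tracking a number like this across templates gives you a quick proxy for how much rendering work you are asking of Googlebot on each page.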


Hence, it becomes crucial to optimize our pages for efficient rendering, similar to optimizing our crawl budget, which we tackled in my previous article.

This two-pronged approach of optimizing both crawl budget and render budget has proven effective for large-scale websites, like Parspack. Just as we strived to align with Google's crawling behaviour, it's vital to understand and adapt to Google's JavaScript processing, particularly its rendering capability.

This simultaneous optimization of crawl and render budget is equally beneficial for websites of all sizes. For instance, at LinuxLearning, a medium-sized site dedicated to Linux education, we faced similar challenges. Ensuring efficient rendering alongside a well-optimized crawl budget helped us significantly improve the site's visibility in Google's search results. The key lies in understanding Google's behaviour, be it crawling or JavaScript rendering, and tweaking our strategies accordingly, irrespective of the website's size. The SEO game is all about alignment and adaptation.

Final Thoughts

SEO is a continuously evolving field, keeping us on our toes. From understanding the mystery of "Discovered - currently not indexed" status to diving into the complexities of JavaScript processing and the nuances of crawl and render budgets, it's a never-ending journey of learning and adapting.

SEO isn't a one-time task but a relentless process of enhancing and evolving with the search engine algorithms. Together, let's keep up with the pace, ensuring we stay ahead of the curve!

#SEO #GoogleSearchConsole #Indexing #JavaScript #Crawling #Rendering #CrawlBudget #RenderBudget #DigitalMarketing
