
How To Raise Your Crawl Budget

By KeeverSEO Team
3 August 2023 · 2 min read

Introduction

In an ideal world, Google would crawl every page of your website the moment you publish or update it. As much as you’d like Googlebot to index and rank your pages almost instantly, in practice that ability is limited by your website’s crawl budget.

While you can’t tell Googlebot how to do its job, you can work towards improving your crawl budget.

To gain a better understanding of crawl budget, you may want to read Gary Illyes’s post “What Crawl Budget Means for Googlebot.” It also explains how Googlebot decides which pages to crawl by making use of your website’s crawl budget. At a glance, we want to point out two seemingly contradictory messages in that post:

The crawl budget can become a problem only for large websites; and
The success of a website depends directly on getting it crawled.

Even though Gary Illyes states that most publishers shouldn’t worry about their crawl budget, our websites could surely use a more efficient Googlebot crawl. Since it is in our power to improve our websites, we should do our best to understand what actions will get Googlebot to pay more attention to our pages and to their relevance for various search queries.

The first step towards improving our crawl budget is to ensure we understand it. We can’t get a more efficient crawl unless we understand what Googlebot takes into consideration when setting it.

But who is Googlebot, anyway?

In a nutshell, Googlebot is a piece of software that’s on a mission to crawl our websites without any negative impact on users visiting these websites.

Googlebot’s primary task is to crawl your website. At the same time, its intention is to keep search users happy by serving up the most useful pages it can identify. The crawl budget is what enables Googlebot to seamlessly perform both tasks.

How can we define “crawl budget,” then?

Many SEO experts have come up with different definitions of crawl budget. When it comes to picking the best one, we’d rather stick to the official definition by Google itself.

As per this definition, the crawl budget of a website is the number of different URLs Googlebot can and wants to crawl.

As you can easily notice, this definition includes two basic concepts: the number of URLs Googlebot CAN crawl and the number of URLs it actually WANTS to crawl. Let’s take a closer look into these concepts, in order to understand how we can improve the crawl budget.

Crawl Rate Limit

If Googlebot crawled all of a website’s URLs as fast as it could, it would risk overloading the server and degrading the experience of the site’s real visitors. To avoid this, Googlebot works with a maximum fetching rate, namely the number of connections it keeps open at a time to crawl the site. This limit relies on two factors:

Crawl Health – How quickly and reliably your site responds to Googlebot. If pages load fast and without server errors, the limit goes up; if the site slows down or starts returning errors, Googlebot backs off. You can also cap the crawl rate in Google’s Search Console, but more often than not you’ll only want to lower it to limit the number of requests hitting your server; raising the cap won’t make Googlebot crawl more.

Crawl Demand – The measure of how much Googlebot wants to crawl on a website. Two factors influence crawl demand: popularity and staleness.

The more popular URLs on your website are the ones that get crawled more often, and Googlebot also re-crawls pages periodically because it wants to keep stale URLs out of its index.

Of the two, crawl demand is the more important factor. If your website doesn’t have highly popular, useful, high-quality pages, Googlebot will eventually stop crawling it. A straightforward way to see how Googlebot actually spends its budget on your site is to look at your server access logs, as sketched below.
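The sketch below is a minimal example of that kind of log check, assuming an Apache/Nginx combined-format log at a hypothetical access.log path and Python 3; matching on the “Googlebot” user-agent string alone is a simplification, since a thorough audit would also verify the requests by reverse DNS.

```python
import re
from collections import Counter

LOG_PATH = "access.log"  # hypothetical path; point this at your real server log

# Simplified pattern for a combined-format log line:
# IP - - [day/month/year:time zone] "METHOD /path HTTP/1.1" status size "referer" "user-agent"
LINE_RE = re.compile(
    r'\[(?P<date>[^:]+):[^\]]+\]\s+"(?:GET|HEAD)\s+(?P<path>\S+).*"(?P<agent>[^"]*)"\s*$'
)

daily_hits = Counter()  # Googlebot requests per day
path_hits = Counter()   # which URLs Googlebot fetches most often

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.search(line)
        if not match or "Googlebot" not in match.group("agent"):
            continue
        daily_hits[match.group("date")] += 1
        path_hits[match.group("path")] += 1

print("Googlebot requests per day:")
for day, count in sorted(daily_hits.items()):
    print(f"  {day}: {count}")

print("\nMost-crawled URLs:")
for path, count in path_hits.most_common(10):
    print(f"  {count:5d}  {path}")
```

If important pages rarely show up in this output while parameter-heavy or duplicate URLs dominate, that is a strong hint your crawl budget is being spent in the wrong places.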

Let’s take a look at the main factors that influence the crawl budget.

According to Illyes, a large number of low-value URLs can have a negative impact on Googlebot’s ability and desire to crawl and index your site. Here are some of the things you should avoid if you want to make your website more crawlable (a quick way to flag some of them is sketched after this list):

– On-site duplicate content (multiple URLs serving identical content)
– Soft error pages (soft 404s)
– Hacked pages
– Low-quality and spam content
– Infinite spaces (e.g. faceted navigation and endlessly generated URLs)
– Proxies
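“Infinite spaces” usually come from faceted navigation, calendars, session IDs, and tracking parameters, and they tend to show up as URLs carrying several stacked query parameters. The sketch below is a rough heuristic for flagging them, assuming a hypothetical urls.txt export (one URL per line, e.g. from your own crawler or log files); the three-parameter threshold is illustrative, not a rule.

```python
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Hypothetical export of URLs, one per line (from a crawler, sitemap, or log file).
with open("urls.txt", encoding="utf-8") as f:
    urls = [line.strip() for line in f if line.strip()]

param_counter = Counter()
suspicious = []

for url in urls:
    params = parse_qs(urlparse(url).query)
    param_counter.update(params.keys())
    # Heuristic: several stacked parameters (facets, sorts, session IDs) often
    # multiply into an effectively infinite URL space.
    if len(params) >= 3:
        suspicious.append(url)

print("Most common query parameters:")
for name, count in param_counter.most_common(10):
    print(f"  {name}: {count} URLs")

print(f"\n{len(suspicious)} URLs with 3+ parameters (candidates for canonical/noindex/robots rules):")
for url in suspicious[:20]:
    print(" ", url)
```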

Things You Can Do To Persuade Googlebot To Crawl Your Website More Efficiently

Now that you know how Googlebot works and how your links may affect the crawling process, let’s see what you can do to raise the crawl budget of your website.

Write excellent content (high-quality, useful information). As you probably know already, Google hates spam and duplicate content, and it loves to serve up content that directly answers a user’s query. There’s no point in feeding Googlebot the same content over and over when you could be building authority by addressing different queries.
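If you want a quick, rough check for on-site duplicates, you can fetch a sample of URLs and compare a hash of each response body. This is a minimal sketch, assuming the third-party requests library and a hand-picked list of hypothetical example.com URLs; because real pages often differ only in boilerplate, a production check would extract the main content before hashing.

```python
import hashlib
from collections import defaultdict

import requests  # third-party: pip install requests

# Hypothetical sample of URLs to compare; in practice pull these from your sitemap.
urls = [
    "https://example.com/page-a",
    "https://example.com/page-a?utm_source=newsletter",
    "https://example.com/page-b",
]

by_hash = defaultdict(list)

for url in urls:
    try:
        body = requests.get(url, timeout=10).text
    except requests.RequestException as exc:
        print(f"skipped {url}: {exc}")
        continue
    digest = hashlib.sha256(body.encode("utf-8")).hexdigest()
    by_hash[digest].append(url)

for group in by_hash.values():
    if len(group) > 1:
        print("Identical content served at:")
        for url in group:
            print("  ", url)
```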

Identify and fix page errors. Soft 404 errors eat into crawl budget you could put to better use, so identify all error pages on your website and fix them. Long redirect chains also slow crawling down; try to address those as well to make the crawl more efficient.
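A short script can surface both issues at once: URLs that return 200 but render an error message (soft 404s) and URLs that pass through several redirect hops before resolving. The sketch below assumes the requests library and the same hypothetical urls.txt list; the error phrases are placeholders for whatever your CMS actually prints on a missing page.

```python
import requests  # third-party: pip install requests

ERROR_PHRASES = ("page not found", "no longer available")  # placeholders for your site's 404 wording
MAX_REDIRECTS = 2  # flag chains longer than this many hops

with open("urls.txt", encoding="utf-8") as f:  # hypothetical URL list, one per line
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        resp = requests.get(url, timeout=10, allow_redirects=True)
    except requests.RequestException as exc:
        print(f"ERROR     {url}: {exc}")
        continue

    # resp.history holds every redirect response that led to the final one.
    if len(resp.history) > MAX_REDIRECTS:
        chain = " -> ".join(r.url for r in resp.history) + " -> " + resp.url
        print(f"REDIRECTS {chain}")

    # A soft 404: the server answers 200 OK but the body reads like an error page.
    if resp.status_code == 200 and any(p in resp.text.lower() for p in ERROR_PHRASES):
        print(f"SOFT 404  {url}")

    if resp.status_code >= 400:
        print(f"HTTP {resp.status_code}  {url}")
```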

Make your website faster. Loading speed has a direct influence on crawl health, which in turn helps determine your website’s crawl rate limit: the faster your pages respond, the more simultaneous connections Googlebot will use. Remember, too, that many users will leave a website that takes more than three seconds to load, which is yet another reason to make your site faster.
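Before optimizing, it helps to measure how quickly your pages respond to a plain fetch. This is a minimal timing sketch using the requests library and hypothetical example.com URLs; the elapsed value covers the time from sending the request to parsing the response headers, so treat it as a rough proxy for crawl health rather than a full real-user speed metric.

```python
import requests  # third-party: pip install requests

# Hypothetical pages to time; swap in your own key URLs.
urls = [
    "https://example.com/",
    "https://example.com/blog/",
    "https://example.com/products/",
]

for url in urls:
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException as exc:
        print(f"{url}: failed ({exc})")
        continue
    # resp.elapsed is the time from sending the request to parsing the response headers.
    seconds = resp.elapsed.total_seconds()
    flag = "  <-- over 3 seconds" if seconds > 3 else ""
    print(f"{url}: {seconds:.2f}s{flag}")
```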

A website is only as good as its crawl budget. If you strive to offer a good user experience, why not work just as hard to optimize your crawl budget?


Scott Keever

CEO, Keever SEO
