Crawl Budget
The number of pages Googlebot will crawl on your website within a given timeframe.
Full definition
Crawl budget is the limit on how many pages Googlebot crawls on a site in a given period, determined by crawl rate limit (how often Googlebot crawls without overwhelming the server) and crawl demand (how popular and updated a site is). For small sites (under a few hundred pages), crawl budget is rarely a concern. For large sites with millions of pages, it matters significantly — low-value pages (filtered product pages, infinite scroll parameters, duplicate pages) can consume crawl budget that would be better spent on high-value content. Optimise crawl budget using robots.txt, canonical tags, noindex directives, and XML sitemaps.
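As a sketch of the robots.txt approach mentioned above, the low-value URL patterns could be blocked from crawling entirely. The path patterns below are hypothetical examples, not universal rules — the right patterns depend on the site's URL structure:

```
# robots.txt — keep Googlebot away from crawl-wasting URL patterns
# (example paths are illustrative only)
User-agent: *
# Faceted / filtered navigation parameters
Disallow: /*?filter=
Disallow: /*?sort=
# Infinite scroll and pagination parameters
Disallow: /*?page=

Sitemap: https://www.example.com/sitemap.xml
```

Note that robots.txt prevents crawling but does not deindex URLs Google already knows about; canonical tags or noindex directives handle the duplication side.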
Real-world example
A retailer with 500,000 product pages discovers Googlebot is spending 60% of crawl budget on out-of-stock URL variants. Adding noindex tags to these frees up budget for new product pages.
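A minimal sketch of the noindex directive used in this example. One caveat: Googlebot must crawl a page at least once to see the tag, so crawl savings accrue gradually as recrawl frequency for those URLs drops:

```html
<!-- In the <head> of each out-of-stock variant page -->
<meta name="robots" content="noindex, follow">
```

For non-HTML resources, the equivalent `X-Robots-Tag: noindex` HTTP response header achieves the same effect.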
Related terms
- Canonical tag — An HTML tag that tells search engines which version of a duplicate or similar page is the preferred one to index.
- Title tag — The HTML title tag of a page, displayed as the clickable headline in search results and browser tabs.
- SERP — The page displayed by a search engine in response to a query, containing organic results, ads, and features like Local Pack and Knowledge Graph.
- Backlink — A link from one website pointing to another, used by search engines as a vote of authority and relevance.