Web Page Size Checker

Check whether your page exceeds Googlebot's 2 MB crawl limit

Is your page too heavy for Google? Measure the actual size of your HTML, analyze HTTP headers and simulate what Googlebot sees after truncation. Free page size checker with results in seconds.

Real-time size analysis

Measure the exact size of your page in bytes. Visualize the ratio against the 2 MB limit with a clear progress bar.

Googlebot limit detection

Instantly identify whether your page exceeds the 2,097,152 bytes that Googlebot can index. Catch truncation before it impacts your SEO.

HTTP header analysis

Inspect response headers: Content-Type, Content-Encoding, Cache-Control, X-Robots-Tag. Detect configurations that block or slow down crawling.

Truncation simulation

See exactly which HTML elements would be lost if Googlebot truncates your page. Identify at-risk internal links, FAQ and SEO content.

Actionable recommendations

Every issue detected comes with a concrete recommendation: reduce page size, optimize inline images, move scripts, compress HTML.

Why check your web page size?

If your HTML exceeds 2 MB, Googlebot silently truncates it. No error in Search Console, no warning: the content at the bottom of the page simply disappears from Google's index. Your internal links, structured FAQ, SEO text — invisible to the search engine.

Three reasons to check your page size:

  • Avoid truncation → Pages with heavy inline HTML (SVG, CSS, bulky JSON-LD) often exceed the limit without you knowing
  • Optimize crawl budget → Lighter pages = more pages crawled by Google within its allotted time
  • Improve speed → Lighter HTML loads faster, better Core Web Vitals, better rankings

How to use the page size checker in 3 steps

Step 1: Enter the page URL

Enter the full URL of the page to analyze in the field above. The tool accepts any publicly accessible URL:

https://www.captaindns.com/en

Test your longest pages first: category pages, product pages with many variants, blog posts with numerous inline images.

Step 2: Choose the User-Agent

Select the User-Agent to simulate the crawl (a minimal fetch sketch follows the list):

  • Googlebot smartphone (recommended): simulates mobile-first crawling, which Google uses for primary indexing
  • Googlebot desktop: useful for comparing the desktop version if your site serves different HTML
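If you want to reproduce the fetch outside the tool, here is a minimal Python sketch using the third-party requests library. The User-Agent string below is a representative Googlebot smartphone value, not an official constant: Google rotates the Chrome version, so treat the exact string as an assumption.

```python
# Minimal sketch: fetch a page the way the checker does, with a Googlebot
# smartphone User-Agent. Requires the third-party "requests" package.
import requests

# Representative Googlebot smartphone UA; Google rotates the Chrome version.
GOOGLEBOT_SMARTPHONE_UA = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Mobile Safari/537.36 "
    "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

response = requests.get(
    "https://www.captaindns.com/en",                 # example URL from this page
    headers={"User-Agent": GOOGLEBOT_SMARTPHONE_UA},
    timeout=10,
)
print(f"Status: {response.status_code}")
# requests decodes gzip automatically, so this is the decompressed body size
print(f"HTML body: {len(response.content):,} bytes")
```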

Step 3: View the results

The report displays:

  • Total size: exact HTML size in bytes and KB/MB
  • Progress bar: visual ratio against the 2 MB limit
  • HTTP headers: Content-Type, Content-Encoding, Cache-Control, X-Robots-Tag
  • HTML analysis: page structure, tag count, detected elements
  • Truncation simulation: if applicable, see exactly where Googlebot would cut off
  • Recommendations: concrete actions to reduce size if needed

What is Googlebot's 2 MB limit?

Google documents a size limit for crawling: Googlebot can download and index the first 2,097,152 bytes (2 MB) of a page's HTML source code. Beyond that, content is truncated.

What this means in practice:

Situation          | SEO impact
HTML < 2 MB        | Page fully indexed, no issue
HTML close to 2 MB | Risk of truncation for elements at the end of the page
HTML > 2 MB        | Certain truncation: links, FAQ, bottom-of-page content lost

Important: this limit applies to decompressed HTML. Gzip/brotli compression changes nothing: a 3 MB HTML file compressed in transit will still be truncated at 2 MB after decompression.
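To check this yourself, the sketch below compares the transferred size with the decompressed size and reports the ratio against the 2,097,152-byte figure described above. It uses only the Python standard library (so it handles gzip but not brotli), and the URL is a placeholder.

```python
# Minimal sketch: transferred vs decompressed HTML size, compared against
# the 2 MB figure this page describes. Standard library only (gzip, not brotli).
import gzip
import urllib.request

LIMIT = 2_097_152  # 2 MB

req = urllib.request.Request(
    "https://www.example.com/",                       # placeholder URL
    headers={"Accept-Encoding": "gzip", "User-Agent": "Mozilla/5.0"},
)
with urllib.request.urlopen(req, timeout=10) as resp:
    raw = resp.read()                                 # bytes on the wire
    if resp.headers.get("Content-Encoding") == "gzip":
        html = gzip.decompress(raw)                   # what gets parsed after transfer
    else:
        html = raw

print(f"Transferred: {len(raw):,} bytes | Decompressed: {len(html):,} bytes")
print(f"Share of the 2 MB limit: {len(html) / LIMIT:.0%}")
if len(html) > LIMIT:
    print(f"Everything after byte {LIMIT:,} would be at risk of truncation")
```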

Pages at risk:

  • E-commerce pages listing hundreds of products in HTML
  • Landing pages with inline SVG or bulky embedded CSS
  • Pages with highly detailed structured JSON-LD (e.g., FAQ with 50+ questions)
  • Server-rendered pages with abundant inline JavaScript

What exactly does the tool analyze?

Size analysis

Element           | Description
Raw size          | Exact size of the HTML returned by the server, in bytes
Decompressed size | Size after gzip/brotli decoding (the one that matters for Googlebot)
2 MB ratio        | Percentage of Googlebot's limit consumed

HTTP headers

Header           | Why it matters
Content-Type     | Confirms the server is returning HTML
Content-Encoding | Indicates whether compression is active (gzip, br)
Content-Length   | Size declared by the server (may differ from actual size)
X-Robots-Tag     | Detects a potential noindex/nofollow at the HTTP level
Cache-Control    | Cache configuration that impacts crawl frequency
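A quick way to read the same headers yourself, sketched with the third-party requests library (the URL is a placeholder):

```python
# Minimal sketch: print the response headers the checker focuses on.
import requests

resp = requests.get("https://www.example.com/", timeout=10)  # placeholder URL

for name in ("Content-Type", "Content-Encoding", "Content-Length",
             "X-Robots-Tag", "Cache-Control"):
    print(f"{name}: {resp.headers.get(name, '(absent)')}")

# A missing Content-Encoding usually means compression is disabled;
# an X-Robots-Tag containing "noindex" blocks indexing at the HTTP level.
```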

HTML analysis

Element   | What the tool checks
Meta tags | Presence and content of title, description, robots
Structure | Heading hierarchy (H1-H6)
Links     | Number of internal and external links detected
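As an illustration of this kind of analysis, here is a rough sketch using Python's built-in html.parser that counts headings and splits links into internal and external. It is deliberately simplified (no handling of rel attributes or edge cases), and the host name is a placeholder.

```python
# Rough sketch: count headings and internal/external links in fetched HTML.
# Standard library only; deliberately simplified.
from html.parser import HTMLParser
from urllib.parse import urlparse

class PageStats(HTMLParser):
    def __init__(self, page_host):
        super().__init__()
        self.page_host = page_host
        self.headings = {f"h{i}": 0 for i in range(1, 7)}
        self.internal_links = 0
        self.external_links = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.headings:
            self.headings[tag] += 1
        elif tag == "a":
            host = urlparse(dict(attrs).get("href") or "").netloc
            if host and host != self.page_host:
                self.external_links += 1
            else:
                self.internal_links += 1

stats = PageStats("www.example.com")                     # placeholder host
stats.feed("<h1>Title</h1><a href='/about'>About</a>")   # feed your real HTML here
print(stats.headings, stats.internal_links, stats.external_links)
```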

Real-world use cases

Case 1: E-commerce category page listing hundreds of products

Symptom: Your category page lists 500 products in HTML. The bottom of the page (pagination, FAQ, links to subcategories) doesn't appear in Google results.

Diagnosis with the tool: The page is 3.2 MB of HTML. Googlebot truncates at 2 MB, losing the last 200 products, the FAQ and all footer navigation links.

Action: Switch to pagination with dynamic loading (lazy load), limit the initial listing to 50 products, move the FAQ higher on the page.


Case 2: Landing page with massive inline SVG

Symptom: Your landing page loads slowly despite little visible content. Core Web Vitals score is poor.

Diagnosis with the tool: The HTML is 1.8 MB, of which 1.2 MB is inline SVG (vector illustrations embedded directly in the HTML).

Action: Extract SVGs into external files, use img tags with SVGs as source, or convert to WebP. The HTML drops to 300 KB.


Case 3: Migration with lost compression

Symptom: After a server migration, your pages load more slowly and Google crawls fewer pages.

Diagnosis with the tool: The Content-Encoding header is missing. The server is no longer compressing HTML. The 800 KB page is now served uncompressed instead of 200 KB with gzip.

Action: Re-enable gzip/brotli compression on the new server. Check your nginx/Apache configuration.
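After re-enabling compression, you can confirm the fix with a small script. The sketch below (third-party requests library, placeholder URL) simply checks that the server answers a compression-capable request with a Content-Encoding header.

```python
# Minimal sketch: verify that the server serves compressed HTML after a
# migration. Placeholder URL; requires the "requests" package.
import requests

resp = requests.get(
    "https://www.example.com/",
    headers={"Accept-Encoding": "gzip, br"},
    timeout=10,
)
encoding = resp.headers.get("Content-Encoding")
if encoding in ("gzip", "br"):
    print(f"Compression active: {encoding}")
else:
    print("No Content-Encoding header: HTML is being served uncompressed")
```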


FAQ - Frequently asked questions

Q: What is the average web page size?

A: In 2025, the median web page weighs about 2.5 MB (all resource types combined). But the HTML alone is typically between 50 KB and 500 KB. It's the HTML size that matters for Googlebot's crawl limit, not the total weight including images, CSS and JavaScript.


Q: How do I check my web page size?

A: Enter the URL in the checker above. The tool measures the exact size of the HTML returned by the server, analyzes HTTP headers and compares the result against Googlebot's 2 MB limit. You can also use Chrome DevTools (Network tab), but without the Googlebot compliance analysis.


Q: What happens when a page exceeds 2 MB?

A: Googlebot truncates HTML beyond 2,097,152 bytes. All content past that point is ignored for indexing. In practice: internal links, structured FAQ, SEO text at the bottom of the page are no longer considered for ranking in search results.


Q: What is crawl budget?

A: Crawl budget is the number of pages Googlebot can crawl on your site within a given time. Heavy pages consume more server and network resources, reducing the total number of pages crawled. Optimizing page size lets Google discover and index more of your content.


Q: How do I reduce my web page size?

A: The most effective actions (a quick measurement sketch follows the list):

  • Remove unnecessary inline CSS/JS → Move them to external files
  • Enable compression → gzip or brotli at the server level
  • Minify HTML → Remove whitespace and comments
  • Externalize SVGs → Replace inline SVGs with img tags
  • Lazy loading → Load heavy content on demand
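To get a feel for what these actions save on your own page, the rough sketch below compares the original HTML, a crudely whitespace-collapsed version, and their gzipped sizes. The file name is a placeholder and the regex is not a real minifier; use a dedicated minification tool in production.

```python
# Rough sketch: estimate what minification and gzip save on a saved copy of
# your HTML. "page.html" is a placeholder; the regex is a crude whitespace
# collapse between tags, not a production minifier.
import gzip
import re

with open("page.html", "rb") as f:
    html = f.read()

minified = re.sub(rb">\s+<", b"><", html)

for label, data in (("original", html), ("minified", minified)):
    print(f"{label}: {len(data):,} bytes | gzipped: {len(gzip.compress(data)):,} bytes")
```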

Q: Why choose Googlebot smartphone over desktop?

A: Google has used mobile-first indexing since 2019. The smartphone version of your page is indexed first and used for ranking. Test with the Googlebot smartphone User-Agent to see exactly what Google indexes. Desktop mode is useful for comparing if your site serves different HTML depending on the device.


Q: Does gzip compression count toward the 2 MB limit?

A: No. The 2 MB limit applies to decompressed HTML. A 3 MB HTML file compressed to 500 KB during network transfer will still be truncated at 2 MB once decompressed by Googlebot. Compression improves transfer speed but does not bypass the size limit.


Complementary tools

Tool                       | Purpose
DNS Lookup                 | Check your domain's DNS records
DNS Propagation Checker    | Confirm your DNS changes have propagated globally
Email Deliverability Audit | Analyze MX, SPF, DKIM and DMARC for your domain

Useful resources