JSON Size Analyzer

Paste JSON to analyze size, minification savings, compression estimates, and key contributions. Entirely client-side.

Why JSON Size Matters

JSON is the most common data format for APIs and configuration files. Large JSON payloads add network transfer time and parsing work, directly delaying page load and metrics like Largest Contentful Paint (LCP). Every kilobyte matters on mobile networks.

Minification removes whitespace and formatting, typically saving 10-30% of raw size. Server compression (gzip/brotli) provides another 70-85% reduction on top of that.
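
A minimal sketch of measuring those savings, assuming a UTF-8 byte count via `TextEncoder` (in a browser, `new Blob([s]).size` gives the same number); the helper names are illustrative:

```javascript
// Count UTF-8 bytes of a string (equivalent to new Blob([s]).size).
function utf8Bytes(s) {
  return new TextEncoder().encode(s).length;
}

// Compare raw vs. minified size of a JSON document.
function minifySavings(jsonText) {
  const raw = utf8Bytes(jsonText);
  // Re-serializing without whitespace is the minification step.
  const minified = utf8Bytes(JSON.stringify(JSON.parse(jsonText)));
  return { raw, minified, savedPct: ((raw - minified) / raw) * 100 };
}
```

The more indentation and line breaks the input has, the larger the gap between `raw` and `minified`.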

Compression Comparison

Raw: 100%
Minified: ~75%
Gzip: ~22%
Brotli: ~20%

Tips to Reduce JSON Size

Shorten Key Names

In arrays of objects, key names repeat for every item. Shorter keys like "n" instead of "name" save bytes multiplied by array length.
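
A sketch of the idea, assuming an illustrative key map (the short names are an example convention, not a standard):

```javascript
// Map long key names to short ones across an array of objects.
// The keyMap here is illustrative; pick abbreviations that suit your schema.
const keyMap = { name: "n", email: "e", createdAt: "c" };

function shortenKeys(items, map) {
  return items.map(obj =>
    Object.fromEntries(
      Object.entries(obj).map(([k, v]) => [map[k] ?? k, v])
    )
  );
}
```

Because every object in the array repeats the keys, the bytes saved scale with array length.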

Remove Nulls

Omit null and empty string fields from API responses. The absence of a key is more compact than "key":null.
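
One way to do this at serialization time is `JSON.stringify`'s replacer callback; a minimal sketch:

```javascript
// Drop null and empty-string values while serializing.
// Returning undefined from the replacer omits the key from objects.
// (Caveat: inside arrays, an omitted element serializes as null instead.)
function stringifyCompact(value) {
  return JSON.stringify(value, (key, v) =>
    v === null || v === "" ? undefined : v
  );
}
```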

Enable Compression

Always serve JSON with Content-Encoding: br (Brotli) or gzip. This is the single biggest size reduction you can apply.

Paginate Large Arrays

Don't return 1000 items when the client only shows 20. Use cursor-based pagination to keep payloads small.
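
A simplified in-memory sketch of cursor-based pagination (real APIs would encode the cursor opaquely, e.g. base64 of the last id, and query the database by it):

```javascript
// Return one page of items after the given cursor, plus the next cursor.
// Simplification: the cursor is the last returned id; null means "start".
function paginate(items, cursor, limit) {
  const start = cursor == null ? 0 : items.findIndex(it => it.id === cursor) + 1;
  const page = items.slice(start, start + limit);
  const nextCursor = start + limit < items.length ? page[page.length - 1].id : null;
  return { items: page, nextCursor };
}
```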

Use Field Selection

Implement sparse fieldsets or GraphQL to let clients request only the fields they need, reducing payload size.
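
A sparse-fieldset handler can be as simple as picking whitelisted fields from each record; a sketch (the JSON:API-style `?fields=` query convention is one common way to drive it):

```javascript
// Keep only the fields the client asked for.
function pickFields(obj, fields) {
  return Object.fromEntries(
    fields.filter(f => f in obj).map(f => [f, obj[f]])
  );
}
```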

Consider Alternatives

For very large datasets, consider MessagePack, Protocol Buffers, or streaming JSON (NDJSON) instead of monolithic JSON.
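
NDJSON is the lightest-weight of these: one JSON object per line, so a client can parse records as they stream in instead of buffering one huge document. A minimal parser sketch:

```javascript
// Parse newline-delimited JSON: one object per non-empty line.
function parseNdjson(text) {
  return text
    .split("\n")
    .filter(line => line.trim() !== "")
    .map(line => JSON.parse(line));
}
```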

Frequently Asked Questions

How is JSON size calculated?

Raw size is measured in UTF-8 bytes using the Blob API. Minified size is computed by parsing the JSON and re-serializing with JSON.stringify (no whitespace). The difference between raw and minified is your savings from removing formatting.

How accurate are the gzip and brotli estimates?

The estimates use typical compression ratios for JSON data: roughly 4.5x for gzip and 5x for brotli. Real compression depends on data entropy and repetition, so expect actual results to differ by around 10-20% from the estimates.

Why should I optimize JSON size?

Large JSON payloads hurt performance in two ways: network transfer time and client-side parsing. Native JSON.parse is highly optimized, but it runs on the main thread, so parsing massive payloads can block the UI, especially on low-end mobile devices. Over-fetching in REST APIs compounds this: clients often discard a large share of the data they download. Optimizing payload size reduces both latency and CPU work.

What does "duplicate key bytes" mean?

When you have an array of objects with the same keys (like API responses), the key names are repeated for every object. Duplicate key bytes shows how much space is wasted on repeated key strings. This overhead shrinks dramatically with compression.
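
A rough heuristic for this metric, counting each key occurrence after the first as wasted bytes (key length plus its two quotes); this is an illustrative sketch, not necessarily the exact formula the tool uses:

```javascript
// Estimate bytes spent on repeated key strings in an array of objects.
function duplicateKeyBytes(items) {
  const seen = new Set();
  let wasted = 0;
  for (const obj of items) {
    for (const key of Object.keys(obj)) {
      // Every repeat of a key costs its characters plus surrounding quotes.
      if (seen.has(key)) wasted += key.length + 2;
      else seen.add(key);
    }
  }
  return wasted;
}
```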

Is my JSON data uploaded anywhere?

No. This tool runs entirely in your browser. Your JSON is never sent to any server. All parsing, analysis, and minification happen client-side using JavaScript.

Why does my JSON fail to parse?

Common issues include trailing commas (not valid JSON), single quotes instead of double quotes, unquoted keys, comments, or JavaScript expressions like undefined. Use a JSON validator to find the exact error location.
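
Since JSON.parse throws a SyntaxError on invalid input, a small wrapper makes failures easy to surface; a minimal sketch:

```javascript
// Attempt to parse JSON, returning the error message instead of throwing.
function tryParse(text) {
  try {
    return { ok: true, value: JSON.parse(text) };
  } catch (err) {
    return { ok: false, error: err.message };
  }
}
```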

How can I reduce my JSON payload size?

Minify by removing whitespace. Use shorter key names for large arrays of objects. Remove null/empty fields. Enable gzip or brotli compression on your server. Consider binary formats like MessagePack or Protocol Buffers for very large payloads.