What It Means
Jitter measures how consistently packets arrive. A connection with 20 ms average latency but 50 ms jitter is worse for real-time applications than one with 40 ms latency and 2 ms jitter. Jitter above 30 ms noticeably degrades video calls and gaming. Cable and DSL tend to have higher jitter during peak hours due to shared infrastructure. Fiber typically has the lowest jitter (1-3 ms) because light signals are not affected by electromagnetic interference. Most speed tests measure jitter alongside ping.
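To make the comparison above concrete, here is a minimal sketch of how a speed test might estimate jitter from a series of ping round-trip times. Definitions vary between tools; this version uses the mean absolute difference between consecutive samples (similar in spirit to RFC 3550's interarrival jitter), and the sample RTT lists are hypothetical illustrations, not real measurements.

```python
def jitter_ms(rtts):
    """Mean absolute difference between consecutive RTT samples, in ms."""
    if len(rtts) < 2:
        return 0.0
    deltas = [abs(b - a) for a, b in zip(rtts, rtts[1:])]
    return sum(deltas) / len(deltas)

# Hypothetical connections mirroring the example above:
steady = [40, 41, 39, 42, 40, 41]   # ~40 ms latency, very consistent
choppy = [20, 70, 15, 65, 10, 60]   # ~20 ms average, wildly inconsistent

print(jitter_ms(steady))  # 1.8  -> excellent for real-time use
print(jitter_ms(choppy))  # 52.0 -> choppy calls despite lower average latency
```

Note that the "choppy" connection has the lower average latency but far higher jitter, which is why it performs worse for video calls and gaming.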
Frequently Asked Questions
What does "Jitter" mean?
The variation in latency over time, measured in milliseconds. High jitter causes choppy video calls, distorted VoIP audio, and rubberbanding in online games.
Why does Jitter matter for internet quality?
Jitter measures how consistently packets arrive. A connection with 20 ms average latency but 50 ms jitter is worse for real-time applications than one with 40 ms latency and 2 ms jitter. Jitter above 30 ms noticeably degrades video calls and gaming. Cable and DSL tend to have higher jitter during peak hours due to shared infrastructure. Fiber typically has the lowest jitter (1-3 ms) because light signals are not affected by electromagnetic interference.
About This Data
Definitions based on FCC standards, industry specifications, and federal broadband policy. Speed benchmarks reflect 2024 FCC standards. See our methodology.