Game Frametime Analysis GeForce RTX 4090
The charts below plot frame time and pacing measurements, revealing graphics anomalies such as stutters and glitches.
| Frame time (ms) | FPS |
|---|---|
| 8.3 | 120 |
| 15 | 66 |
| 20 | 50 |
| 25 | 40 |
| 30 | 33 |
| 50 | 20 |
| 70 | 14 |
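To make the relationship behind the table explicit, here is a minimal Python sketch (illustrative only, not part of our test tooling): FPS is simply 1000 divided by the frame time in milliseconds.

```python
def fps_from_frametime(frametime_ms: float) -> float:
    """Convert a single frame time (in milliseconds) to frames per second."""
    return 1000.0 / frametime_ms

# Reproduce the table above; values round to the figures shown.
for ft in (8.3, 15, 20, 25, 30, 50, 70):
    print(f"{ft:>5} ms -> {fps_from_frametime(ft):6.1f} FPS")
```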
- FPS primarily measures performance: the number of frames rendered per second.
- Frame time (also called frame experience) recordings primarily measure and expose anomalies: here, we look at how long it takes to render each individual frame. Record that chronologically and anomalies show up as peaks and dips in a plotted chart, indicating something could be off.
We have a detailed article (read here) on the methodology behind it all. In short, the time it takes to render one frame can be monitored and tagged with a number; this is latency. One frame might take, say, 17 ms. Consistently high latency indicates a slow framerate, and erratic latency spikes indicate stutters, jitter, and twitches; in short, anomalies that are visible on your monitor. These measurements expose small glitches and stutters that you can sometimes (and please do read that well: sometimes) see on screen. Below, I'd like to run through a couple of titles with you. Bear in mind that average FPS often matters more than frame time measurements.
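To show what such a latency spike looks like in data rather than on a chart, here is a hedged sketch; the `capture` values and the 2x-median threshold are illustrative assumptions, not our actual tooling or criteria.

```python
import statistics

def find_spikes(frametimes_ms: list[float], factor: float = 2.0) -> list[int]:
    """Return indices of frames whose render time exceeds the run's median
    by `factor`; a simple way to flag potential stutters."""
    median = statistics.median(frametimes_ms)
    return [i for i, ft in enumerate(frametimes_ms) if ft > factor * median]

# Hypothetical 8-frame capture: one 45 ms outlier amid ~16 ms frames.
capture = [16.5, 16.8, 16.2, 45.1, 16.9, 16.4, 16.6, 16.3]
print(find_spikes(capture))  # [3] -> the fourth frame is a likely stutter
```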
Please understand that a lower frame time means a higher FPS, so for these charts, lower = better. Huge spikes indicate stutters, thick lines indicate bad frame pacing, and a gradual, streamlined curve reflects normal framerate variation. As you might have observed, we're experimenting a bit with our charts and methodology. Below is the game at Ultra HD, with image quality settings as used throughout this review.
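One way to turn those visual cues into numbers is to look at how much the frame time jumps from one frame to the next; a large average jump is what reads as a "thick line" on the chart. The sketch below is an illustrative assumption, not the metric used for our charts.

```python
import statistics

def pacing_stats(frametimes_ms: list[float]) -> dict[str, float]:
    """Summarise frame pacing: mean frame time, its spread, and the
    average frame-to-frame jump (large jumps read as a 'thick line')."""
    deltas = [abs(b - a) for a, b in zip(frametimes_ms, frametimes_ms[1:])]
    return {
        "mean_ms": statistics.fmean(frametimes_ms),
        "stdev_ms": statistics.stdev(frametimes_ms),
        "mean_delta_ms": statistics.fmean(deltas),
    }

# A smooth run and a badly paced run with the same average frame time.
print(pacing_stats([16.5, 16.8, 16.2, 16.9, 16.6]))
print(pacing_stats([10.0, 23.0, 10.5, 22.5, 17.0]))
```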
For our test run, we record a 30-second scene at 3840x2160 pixels in three scenarios; below, the GeForce RTX 4090.