
Gamers Are Wrong About 1440p vs 1080p CPU Benchmarking
AI Summary
The speaker addresses a persistent misunderstanding among gamers regarding CPU testing methodologies, specifically the demand for 1440p and 4K benchmarks instead of the commonly used 1080p. Despite numerous previous explanations, a recent poll revealed that 56% of the audience preferred the inclusion of 1440p data, which the speaker finds disappointing because, from a technical standpoint, 1440p data is considered useless for CPU testing.
The core argument is that higher resolution testing, such as 1440p or 4K, primarily shifts the bottleneck from the CPU to the GPU. When the GPU becomes the limiting factor, the true performance differences between CPUs are masked, leading to misleading results. The purpose of CPU testing is to isolate and measure the CPU's maximum throughput, which is best achieved in scenarios where the GPU is not holding back performance. This typically means testing at lower resolutions like 1080p, and often with lower graphical settings, to ensure the CPU is the primary bottleneck.
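The bottleneck argument can be sketched as a toy model (all figures below are made up for illustration, not taken from the video): the observed frame rate is roughly the minimum of what the CPU and the GPU can each deliver, so once the GPU's ceiling drops below every CPU's ceiling, the CPUs all report the same number.

```python
# Toy bottleneck model: observed FPS ~ min(CPU-limited FPS, GPU-limited FPS).
# All figures below are hypothetical, chosen only to illustrate the effect.

def observed_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frame rate is capped by the slower of the two components."""
    return min(cpu_fps, gpu_fps)

cpus = {"fast CPU": 200.0, "mid CPU": 150.0, "old CPU": 100.0}

gpu_at_1080p = 250.0   # GPU has headroom: CPU differences stay visible
gpu_at_1440p = 130.0   # GPU-bound: two of the three CPUs collapse to 130

for name, cpu_fps in cpus.items():
    print(name,
          observed_fps(cpu_fps, gpu_at_1080p),
          observed_fps(cpu_fps, gpu_at_1440p))
```

In this toy model the 1080p column preserves the 200/150/100 spread, while the 1440p column reads 130/130/100, which is exactly the masking effect the speaker describes.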
The speaker highlights that many gamers already use upscaling technologies like DLSS or FSR at 1440p. A poll indicated that 64% of their audience uses upscaling at 1440p. When DLSS or FSR is enabled in "quality" mode at 1440p, the base render resolution often drops below 1080p (e.g., to 960p), and even lower for "balanced" (835p) or "performance" (720p) modes. This means that gamers requesting 1440p testing for "real-world" relevance, while simultaneously using upscaling, are often playing at effective resolutions lower than native 1080p, thus undermining their own argument. The speaker demonstrates this by comparing native 1080p, native 1440p, and 1440p with DLSS quality enabled across several games.
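The internal render resolutions quoted above follow directly from the published DLSS per-axis scale factors (roughly 66.7% for quality, 58% for balanced, 50% for performance); a minimal sketch of that arithmetic, assuming those standard ratios:

```python
# Internal render height for common DLSS/FSR modes at a 1440p output.
# Scale factors are the widely published DLSS per-axis ratios; exact
# values can vary slightly between implementations.

DLSS_SCALE = {
    "quality": 1 / 1.5,      # ~66.7% per axis
    "balanced": 1 / 1.724,   # ~58% per axis
    "performance": 1 / 2.0,  # 50% per axis
}

def render_height(output_height: int, mode: str) -> int:
    """Height of the frame the GPU actually renders before upscaling."""
    return round(output_height * DLSS_SCALE[mode])

for mode in DLSS_SCALE:
    print(mode, render_height(1440, mode))
# quality -> 960, balanced -> 835, performance -> 720
```

All three results land below native 1080p, which is the speaker's point: a 1440p-with-upscaling player is rendering fewer pixels than a native 1080p player.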
In Battlefield 6, at 1080p with an overkill preset, the 9800X3D shows significant performance gains over older CPUs like the 5800X3D and 3800X. However, at native 1440p with the same preset, the GPU becomes the bottleneck, limiting all CPUs to a similar frame rate and thus masking the true CPU performance differences. For competitive gamers who prioritize high frame rates, a CPU capable of pushing 140-200 FPS is desirable, which older CPUs like the 3800X struggle to achieve regardless of resolution. The 1080p CPU-limited results, therefore, provide crucial information about a CPU's capability to deliver high frame rates, which is relevant for both casual and competitive gamers. The addition of 1440p testing with upscaling in Battlefield 6 did not provide new insights, as the performance levels often reverted to 1080p-like figures. Switching to a medium preset in Battlefield 6 showed that regardless of resolution or upscaling, the relative performance margins between the CPUs remained consistent, further proving that 1440p testing doesn't add value.
In Marvel Rivals, 1080p ultra preset data shows clear CPU performance differences. When switching to 1440p with upscaling, the performance margins can sometimes increase for higher-end CPUs like the 9800X3D due to better handling of DLSS overhead. However, at native 1440p, a strong GPU bottleneck caps all CPUs at around 133 FPS, again obscuring CPU differences. Lowering visual quality settings to medium dramatically increases frame rates and reveals greater CPU performance disparities, even at 1440p. This reiterates that focusing on raw CPU throughput, rather than GPU-limited scenarios, is key.
For games like Baldur's Gate 3, which are less frame-rate sensitive, the relative CPU performance scaling remains consistent across resolutions. While the 3800X might be sufficient for 60 FPS gameplay, higher-end CPUs deliver significantly more headroom, which suggests they will hold up better in future, more demanding titles. The margins between the 5800X3D and 9800X3D remained consistent across 1080p and 1440p, whether native or upscaled, so for this game the testing resolution had little impact on the relative CPU performance insights.
Cyberpunk 2077 Phantom Liberty with ray tracing showed that 1440p with upscaling could sometimes yield higher frame rates than 1080p due to extreme GPU limitation and the lower base render resolution of DLSS quality at 1440p. However, at native 1440p, performance was heavily GPU limited, again reducing the visible differences between CPUs. Enabling DLSS at 1440p increased the performance margins between CPUs compared to native 1080p, highlighting the impact of upscaling on CPU performance. When ray tracing was disabled and medium settings were used, all configurations became CPU-limited, showing identical results across 1080p and 1440p.
Games like Space Marine 2, ACC (Assetto Corsa Competizione), Horizon Zero Dawn Remastered, and Crimson Desert consistently proved to be primarily CPU-limited, even at higher resolutions and maxed-out settings. In these cases, 1440p testing yielded results virtually identical to 1080p testing, showing that raising the resolution only risks introducing a GPU bottleneck without adding any useful information about CPU performance.
The speaker concludes that testing at 1440p instead of 1080p would not significantly alter findings or recommendations for CPU comparisons. While some margins might slightly change depending on the specific game and settings (e.g., 9800X3D being 33% faster than 5800X3D at 1080p vs. 27% faster at native 1440p, or 36% faster with upscaling), the overall conclusion about which CPU performs better remains consistent. The argument for 1440p testing is seen as a "flawed test method" that can mask true CPU performance.
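For clarity on how margins like "33% faster" are derived, here is the standard relative-performance calculation (the FPS inputs are hypothetical, chosen only to illustrate the formula, not figures from the video):

```python
# How an "X% faster" margin is computed from two average frame rates.
# The FPS values below are hypothetical, for illustration only.

def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster fps_a is than fps_b, in percent."""
    return (fps_a / fps_b - 1) * 100

print(round(percent_faster(133, 100), 1))  # a 33% margin
print(round(percent_faster(127, 100), 1))  # a 27% margin
```

Note the direction matters: a CPU that is 33% faster than another is not the same as the other being 33% slower, which is why consistent baselines are needed when comparing margins across resolutions.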
The speaker emphasizes that the goal of CPU testing is to show the "true gaming performance" of a CPU, which means revealing its maximum capabilities when not limited by other components. This data is relevant for all gamers, whether they prioritize maximum performance or are content with lower frame rates. Testing in GPU-limited scenarios, often requested by those who want to "justify holding off on an upgrade," does not provide useful information about a CPU's potential. The resolution itself is largely irrelevant for CPU testing; what matters is that the resolution does not cause the GPU to become the primary bottleneck. The speaker acknowledges a personal bias towards maximizing FPS, stemming from competitive gaming in their youth, but asserts that understanding a CPU's true performance is crucial for making informed purchasing decisions, even for those targeting 60 FPS.
Ultimately, the fixation on resolution for CPU testing is deemed misguided. The CPU does not care about the game resolution; it only cares about the workload it receives. The resolution only becomes a factor if it overburdens the GPU to the point of becoming the performance limiter. The speaker hopes this detailed explanation helps viewers understand why low-resolution testing is the standard for CPU benchmarks, despite the persistent audience demand for higher resolutions.