Framebuffer usage, thoughts
We've established that gaming on a 4K monitor at high- or ultra-quality settings requires a beefy graphics card. Performance in Far Cry 3, for example, barely nudges past 30fps, never mind the desired 60fps, while it's lower still in Crysis 3. The 8.3MP load is substantially higher than the 6.2MP needed to drive three 1080p screens in a 3x1 surround setup. In fact, simplifying greatly, the pixel count is worth an extra screen - four vs. three - as a single 4K PC monitor pushes a resolution equivalent to a quartet of 1080p screens operating in tandem.
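For anyone who wants to sanity-check those figures, a few lines of Python reproduce the pixel maths; the resolutions are the standard 3,840x2,160 and 1,920x1,080, and nothing here is pulled from our benchmark data.

```python
# Back-of-the-envelope pixel counts for the resolutions discussed above.
uhd = 3840 * 2160        # single 4K (UHD) panel
full_hd = 1920 * 1080    # single 1080p panel

print(f"4K UHD:             {uhd / 1e6:.1f}MP")          # ~8.3MP
print(f"3x1 1080p surround: {3 * full_hd / 1e6:.1f}MP")  # ~6.2MP
print(f"4x 1080p:           {4 * full_hd / 1e6:.1f}MP")  # ~8.3MP, same as one 4K panel
```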
We can argue that antialiasing is far less of a requirement at this resolution because the sheer number of pixels alleviates the need to run such taxing calculations. Still, it's worth investigating what effect, if any, adding multisampling to the mix has on overall performance. We've chosen Far Cry 3 as the guinea pig because it's a thoroughly modern game and its non-AA results are deemed borderline playable.
The test is run on the GeForce Titan card with no AA, 2x MSAA and 4x MSAA applied.
Numbers don't mean much without pictorial references, so click on the following links to open up PNG files at the full 3,840x2,160 resolution: 0x AA, 2x AA, and 4x AA. The pictures are best viewed on a 4K panel, of course, but you'll soon get the idea that antialiasing doesn't help much once the resolution is dialled up to 11.
Remember that Titan is the fastest single-GPU card in the business; its performance is leagues ahead of a mid-range card's. Add 2x MSAA and it takes a 15 per cent performance hit; crank that up to 4x and performance falls by almost a third, pushing the game from barely playable to juddery.
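To put those percentages into frame-rate terms, here's a purely illustrative calculation; the 31fps baseline is an assumption picked to match the 'barely past 30fps' description above, not a measured result.

```python
# Illustrative only: translating the quoted percentage hits into frame rates.
# The 31fps baseline is an assumption, not a figure from our benchmarks.
baseline_fps = 31.0

fps_2x_msaa = baseline_fps * (1 - 0.15)      # ~15 per cent hit
fps_4x_msaa = baseline_fps * (1 - 1 / 3)     # roughly a third lost

print(f"No AA:   {baseline_fps:.1f}fps")
print(f"2x MSAA: {fps_2x_msaa:.1f}fps")      # ~26fps
print(f"4x MSAA: {fps_4x_msaa:.1f}fps")      # ~21fps - juddery territory
```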
Working the back end of the GPU this hard also causes the framebuffer to fill up. Keeping all the data for 4x MSAA increases the framebuffer load by almost 50 per cent. What's important to realise here is that the maximum figure is still within the capabilities of the GTX 780 and HD 7970 GHz.
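As a rough sketch of why MSAA inflates framebuffer usage, the snippet below estimates the size of the colour and depth/stencil render targets at 3,840x2,160 for each sample count. It is a deliberate simplification - real engines allocate many more buffers and drivers apply compression - so the figures are indicative rather than anything our monitoring tools reported.

```python
# Rough estimate of 4K render-target memory at different MSAA sample counts.
# Simplified model: one RGBA8 colour target and one depth/stencil target, both
# multisampled, plus a single-sample resolve target. Real games allocate far
# more (G-buffers, shadow maps, textures), so treat these figures as a floor.
WIDTH, HEIGHT = 3840, 2160
BYTES_COLOUR = 4   # RGBA8
BYTES_DEPTH = 4    # 24-bit depth + 8-bit stencil

def render_target_mb(samples: int) -> float:
    pixels = WIDTH * HEIGHT
    msaa_colour = pixels * BYTES_COLOUR * samples
    msaa_depth = pixels * BYTES_DEPTH * samples
    resolved = pixels * BYTES_COLOUR           # resolve target for display
    return (msaa_colour + msaa_depth + resolved) / (1024 ** 2)

for samples in (1, 2, 4):
    label = "No AA" if samples == 1 else f"{samples}x MSAA"
    print(f"{label}: ~{render_target_mb(samples):.0f}MB of render targets")
```

Because textures, geometry and other allocations don't scale with the sample count, the card's total memory use rises more gently than the render targets themselves, which is consistent with the almost-50-per-cent increase described above.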
Having larger framebuffers remains more of a marketing tool than a real-world benefit, even for enthusiast gamers... there simply won't be many harder cases than rendering to a 4K screen at ultra-quality settings with a semi-pointless 4x MSAA invoked. And should you really want pristine edges in games and aren't happy with the default render from an 8.3MP screen, more efficient techniques such as FXAA will doubtless take over.
4K - how does it look, thoughts
This article is more about framerate investigation at 4K than subjective analysis of the wow factor induced by having a 4K screen in the office. Sat at arm's length from the screen, we found the extra detail over a 24in, 1080p screen immediate and obvious; there's just more of everything. The panel's size, at 31.5in across, also helps it swamp your peripheral vision.
The 4K effect is less pronounced when moving from a 27in, 2,560x1,440 screen. You can see extra detail once you start looking for it, but play any fast-paced shooter and the pixel-density benefits of 4K diminish. We'd probably give up some of that lovely PPI for an even larger screen, one that fully engulfs our vision.
Our first foray into the world of 4K consumer monitors shows that current high-end cards are, for the most part, able to run high/ultra-quality settings in excess of 30fps when antialiasing is turned off. Performance in GPU-crunching titles such as Crysis 3 or Far Cry 3 drops markedly once any form of antialiasing is applied, so we'd use it only sparingly once the insane resolution is factored in.
The combination of 4K resolution and high-quality settings poses new challenges for the cards' framebuffers. Our analysis shows that, even when antialiasing is used, it is unlikely that a Radeon HD 7970 GHz's or GeForce GTX 780's 3GB framebuffer will be the main limiting factor; sheer horsepower and memory bandwidth are greater stumbling blocks to silky-smooth performance.
A best guess is that 4K screens won't become mainstream for at least a couple of years. Today's best cards can handle the resolution and image quality without too much fuss, so we're hopeful that the mid-range cards of 2015/2016 will make a decent fist of rendering games at stupid-high resolutions.
So what do you reckon, folks? Is 4K the next big thing in monitor innovation, and how much would you be willing to pay for such high-res thrills?