Tobii Spotlight tech uses eye tracking to reduce GPU stress

by Mark Tyson on 6 August 2019, 12:11

Tags: NVIDIA (NASDAQ:NVDA), HTC (TPE:2498), Epic Games

Quick Link: HEXUS.net/qaecjw

Tobii, best known for its eye tracking technologies and devices used for enhanced Windows gaming, has launched Tobii Spotlight Technology. The firm took the wraps off the technology at SIGGRAPH last week. In brief, Tobii is leveraging its eye tracking tech to make foveated rendering techniques, such as VRS, smarter and more efficient.

HEXUS regulars will be aware that Nvidia introduced VRS (Variable Rate Shading) with the launch of the Turing architecture in September last year. At the time the HEXUS editor thought it would be a highly suitable technique for VR rendering, as it can provide high-res imagery where it is needed and lower-res imagery at the periphery of a scene, helping to deliver better visuals and faster frame rates with less stress on the GPU. AMD GPUs don't yet support VRS, but support is thought to be in the pipeline.
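For the curious, this is roughly how per-draw VRS surfaces to developers. Below is a minimal sketch against the Direct3D 12 API, one of the interfaces through which VRS is exposed; the cmdList handle and Tier 1 VRS-capable hardware are assumptions, and device setup and error handling are omitted.

    // Minimal sketch: per-draw Variable Rate Shading via Direct3D 12.
    // Assumes `cmdList` records on VRS-capable (Tier 1+) hardware such as
    // Turing; device/pipeline setup and error handling are omitted.
    #include <d3d12.h>

    void DrawWithCoarsePeriphery(ID3D12GraphicsCommandList5* cmdList)
    {
        // Shade peripheral geometry once per 4x4 pixel block (1/16 the work).
        // Passing nullptr for the combiners leaves them at passthrough.
        cmdList->RSSetShadingRate(D3D12_SHADING_RATE_4X4, nullptr);
        // ... draw calls for low-detail regions go here ...

        // Restore full per-pixel shading for content that must stay sharp.
        cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
        // ... draw calls for high-detail regions go here ...
    }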

Reading through the above paragraph, you will see a case for using VRS when rendering to VR headsets, but on its own that case suffers from a startling lack of refinement: it isn't just a user's head that moves in VR, our eyes move around too. This is where Tobii Spotlight Technology will make a difference. "Tobii Spotlight Technology is advanced eye tracking specialised for foveation," said Henrik Eskilsson, CEO of Tobii. In case you don't already know, the fovea is the small central portion of your retina that sees in high resolution, while your peripheral vision is effectively a blur. Tobii has published a dedicated foveation technology page to explain why dynamic foveated rendering (DFR) is an important advance on VRS.
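To make the "dynamic" part concrete, here's an illustrative sketch of the core idea behind DFR: pick a coarser shading rate the further a screen tile sits from the tracked gaze point. The distance thresholds are invented for illustration and are not Tobii's values.

    // Illustrative sketch of dynamic foveated rendering: coarsen the shading
    // rate with distance from the tracked gaze point. Thresholds are made up.
    #include <cmath>

    enum class ShadingRate { Full1x1, Coarse2x2, Coarse4x4 };

    ShadingRate RateForTile(float tileX, float tileY, float gazeX, float gazeY)
    {
        // Normalised screen-space distance from the gaze point, standing in
        // for true angular eccentricity from the fovea.
        const float dist = std::hypot(tileX - gazeX, tileY - gazeY);

        if (dist < 0.1f) return ShadingRate::Full1x1;   // fovea: full detail
        if (dist < 0.3f) return ShadingRate::Coarse2x2; // near periphery
        return ShadingRate::Coarse4x4;                  // far periphery: a blur anyway
    }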

At SIGGRAPH Tobii showed off a PC system pairing its Spotlight technology with an Nvidia RTX 2070 graphics card (with DFR enabled via Nvidia VRS) and an HTC Vive Pro Eye headset, running the ShowdownVR demo from Epic Games. Benchmarks using this setup delivered GPU rendering load reductions of 57 per cent, bringing the average shading rate with dynamic foveated rendering down to 16 per cent from around 24 per cent. The GPU can thus work on other aspects of the game, boost frame rates, and/or save power.
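A back-of-envelope calculation shows how an average shading rate in that region can arise; the 10/90 area split below is our illustrative assumption, not a Tobii measurement.

    // Back-of-envelope: if eye tracking lets ~10 per cent of the frame shade
    // at full 1x1 rate while the other ~90 per cent drops to 4x4 blocks
    // (1/16 of the work), the average shading rate lands near the quoted
    // 16 per cent. The 10/90 split is an assumption, not Tobii's data.
    #include <cstdio>

    int main()
    {
        const double fovealArea = 0.10, fovealRate = 1.0;               // 1x1
        const double peripheryArea = 0.90, peripheryRate = 1.0 / 16.0;  // 4x4
        const double avg = fovealArea * fovealRate + peripheryArea * peripheryRate;
        std::printf("average shading rate: %.1f%%\n", avg * 100.0);     // ~15.6%
    }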

Tobii worked closely with Nvidia to implement its DFR system and, looking further ahead, sees the same gaze data being used to transfer and stream content optimised for the user's focus over 5G networks. The firm even expects its Spotlight technology to keep shading workloads manageable in VR headsets with resolutions exceeding 15K.



HEXUS Forums :: 4 Comments

I like the idea but I suspect latency will be an issue. The 5G idea is not going to work given the way your eye moves around and how the visual processing / data stream along the optic nerve works. You might be able to use some kind of predictive tech here but ultimately if you're reacting to movement of the eye, it won't work.
philehidiot
Quote (philehidiot): I like the idea but I suspect latency will be an issue.

Depends… whilst the eye is actively moving, our brain compensates in various ways and there's usually some latency on our side - e.g. it takes the eye's ocular muscles a fraction of a second to pull the lens into the right shape to bring the new target of our gaze into focus.

Tobii have a good track record, so let's hope this works, as it's one of the most promising technologies for allowing us to eke out more performance from GPUs - spending GPU time on expensive effects, textures, shaders, etc. only where the human eye can actually appreciate these things.
KultiVator
Quote (KultiVator): Depends… whilst the eye is actively moving, our brain compensates in various ways and there's usually some latency on our side - e.g. it takes the eye's ocular muscles a fraction of a second to pull the lens into the right shape to bring the new target of our gaze into focus. Tobii have a good track record, so let's hope this works, as it's one of the most promising technologies for allowing us to eke out more performance from GPUs - spending GPU time on expensive effects, textures, shaders, etc. only where the human eye can actually appreciate these things.

However, when you move your eye, the signal basically shuts off and resumes in the new position. There's compensation, yes, but the way I understand it works (bearing in mind it has been many years since I did anything on physiology of the head) means that with current technology and latency you'll not be able to keep up with the speed of just the eyeball moving. It's nothing to do with focus (if you're on a flat screen at a constant distance then focus time isn't much of an issue); it's the way the eyeball moves - the signal basically cuts off (and the brain interpolates a bit so you don't see black) and resumes. This takes less time than a packet needs to traverse a 5G connection. Whether you can exploit the interpolation of the brain somehow, I dunno - that's something for people brighter than me. On the surface, I do not see how this could work without sub-millisecond latency networks. Bear in mind we're struggling to stream games with mouse and keyboard inputs; the latency issues here are probably an order of magnitude more intrusive.
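A back-of-envelope budget makes the disagreement in the two comments above concrete; every figure below is an illustrative assumption rather than a measurement.

    // Back-of-envelope gaze-to-photon budget for streaming foveated content
    // over a network. Every number is an illustrative assumption.
    #include <cstdio>

    int main()
    {
        const int trackerMs = 5;        // assumed: eye tracker sample + report
        const int networkRttMs = 20;    // assumed: optimistic 5G round trip
        const int renderScanMs = 11;    // assumed: ~1 frame at 90 Hz
        const int saccadeWindowMs = 30; // assumed: short saccade, tens of ms

        const int total = trackerMs + networkRttMs + renderScanMs;
        std::printf("pipeline %d ms vs saccade window %d ms\n",
                    total, saccadeWindowMs);
        // If the pipeline exceeds the window in which the brain suppresses
        // vision during the saccade, the gaze can land on a still-coarse tile.
    }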
Cool, hopefully more games start using VRS in general.

VRS is not just a hardware feature; it can be done in software too. In fact, the forthcoming Call of Duty game uses VRS on the PS4, as confirmed by Digital Foundry.