AMD adds Netflix 4K acceleration in latest driver release

by Mark Tyson on 2 May 2018, 11:01

Tags: AMD (NYSE:AMD), Netflix

Quick Link: HEXUS.net/qadtaw


AMD released its Radeon Software Adrenalin Edition 18.4.1 drivers a few days ago. The release notes listed just one significant addition: “Initial support for Windows 10 April 2018 Update”, alongside a few fixed issues and a list of known issues AMD is currently working on. However, it has since been discovered that AMD slipped in an extra feature that may be of great appeal to PC users, especially those owning or building HTPC machines – support for Microsoft's PlayReady 3.0 DRM.

The importance of the above is that Microsoft's PlayReady 3.0 is one of the system requirements for Netflix 4K playback on a PC. Rival chipmakers have been on board with decoding this secured content for several months. Nvidia introduced the capability for owners of GTX 1050 or better cards (with at least 3GB of video memory) in GeForce driver version 387.96 in 2017. Users of Intel Kaby Lake, and newer, CPUs have also been able to watch Netflix 4K on their PCs, accelerated by the Intel IGP, since late 2016.

Hardware.info noticed (via HardOCP) that the AMD Netflix 4K decode ability had been added while perusing a reviewer’s guide document for Raven Ridge APUs. As with the Intel and Nvidia alternative routes, some other conditions must be met for Netflix 4K playback on the PC. Users need the PlayReady 3.0 compatible hardware plus: the Microsoft Edge browser, a connection to the monitor over the HDCP 2.2 protocol, a hardware HEVC (H.265) decoder, and a Netflix Premium subscription.
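
As a rough illustration of that checklist, here is a minimal Python sketch. It does not query any real AMD, Microsoft or Netflix API; the requirement keys and the example system are hypothetical, encoding only the conditions listed above.

```python
# Sketch of the Netflix 4K-on-PC requirement checklist described above.
# The keys and this helper are illustrative only, not part of any real API.

REQUIREMENTS = {
    "playready_3_0_gpu": "GPU/driver with PlayReady 3.0 support",
    "edge_browser": "Microsoft Edge browser",
    "hdcp_2_2_link": "HDCP 2.2 connection to the display",
    "hevc_decoder": "Hardware HEVC (H.265) decoder",
    "netflix_premium": "Netflix Premium (Ultra HD) subscription",
}

def missing_requirements(system: dict) -> list:
    """Return descriptions of any unmet Netflix 4K requirements."""
    return [desc for key, desc in REQUIREMENTS.items() if not system.get(key)]

if __name__ == "__main__":
    # Hypothetical example: an HTPC that is still on an older HDCP link.
    htpc = {
        "playready_3_0_gpu": True,
        "edge_browser": True,
        "hdcp_2_2_link": False,
        "hevc_decoder": True,
        "netflix_premium": True,
    }
    unmet = missing_requirements(htpc)
    print("Netflix 4K ready!" if not unmet else "Missing: " + "; ".join(unmet))
```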

In the Dutch source’s own testing it was noted that Netflix 4K video streaming came with a considerably higher bit rate than lower resolutions, which is understandable. Other than the demands on its internet connection, Hardware.info had no issues watching Netflix 4K via a Radeon RX 580 video card using the latest Adrenalin driver.
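
For a rough sense of what that higher bit rate means in practice, the short sketch below turns Netflix's published connection-speed recommendations (25Mbps for Ultra HD, 5Mbps for HD, 3Mbps for SD) into hourly data usage. These are recommended connection speeds rather than measured stream bit rates, used here purely for back-of-the-envelope arithmetic.

```python
# Back-of-the-envelope data usage for an hour of streaming at Netflix's
# recommended connection speeds (Mbit/s). Actual stream bit rates vary
# with content and are typically somewhat below these figures.

RECOMMENDED_MBPS = {"SD (480p)": 3, "HD (1080p)": 5, "Ultra HD (4K)": 25}

for tier, mbps in RECOMMENDED_MBPS.items():
    gb_per_hour = mbps * 3600 / 8 / 1000  # Mbit/s -> GB over one hour
    print(f"{tier}: ~{gb_per_hour:.1f} GB/hour")
# Ultra HD works out to roughly 11 GB per hour versus ~2 GB for HD.
```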

Skipping back to the Adrenalin 18.4.1 driver release notes, one listed known issue is that Netflix playback can stumble on multi-GPU systems using Radeon RX 400 series or Radeon RX 500 series products.



HEXUS Forums :: 16 Comments

Fantastic! It was very disappointing that it was just Nvidia and Intel for so long but it's a shame it still needs Edge though.
I have been looking into getting a 4k monitor for Netflix etc. however being a gamer with 1440p targeted hardware, I don't want to purchase a display to find that dropping the resolution creates a blurry mess when gaming at 1440p/1080p - sorry it is a little off topic, but does anyone have experience of gaming at lower resolutions on a 4k monitor, any suggestions?
So the difference between 720p and 1080p is gigantic, but the difference between 1080p and 1440p is practically nothing? :mad: Not everyone wants 4k, damnit. Get your subtle mindgames out of here.
EvilCycle
I have been looking into getting a 4k monitor for Netflix etc. however being a gamer with 1440p targeted hardware, I don't want to purchase a display to find that dropping the resolution creates a blurry mess when gaming at 1440p/1080p - sorry it is a little off topic, but does anyone have experience of gaming at lower resolutions on a 4k monitor, any suggestions?

I used to have a weaker GPU (GTX780) and a 1440p monitor, and there was a noticeable difference between the games that played perfectly at the native resolution and the ones that forced me down to 1080p. However, the image, while softer, was by no means bad or tiring.

I now use a 4k GSync display with a GTX1070. A remarkable number of games, even highly complex new titles, play great at 4k, but I often use a lower res to allow me to max them out. It never occurs to me to run in 1440p, because 1080p is the obvious solution. 4k is four 1080p displays in a rectangle, which means that every pixel of 1080p graphics is represented perfectly by a square of 4 pixels on the 4k screen. There's none of that softening from trying to display an ill-fitting pixel ratio across the monitor's cell grid.
Thank you for the reply Otherhand, those are some good points to think about when using 1080p on a 4k screen that I hadn't thought of. In the past, my experience with each higher resolution monitor has been more incremental than this, so it hasn't had that benefit of fitting within pixel ranges as such.
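
As a footnote to the scaling discussion above: the integer-scaling argument is easy to verify with quick arithmetic, since a render resolution only maps cleanly onto a panel when both dimensions divide evenly into the native resolution. The sketch below uses the standard resolution figures to show why 1080p fits a 4K panel exactly while 1440p does not.

```python
# Check whether a render resolution scales by a whole number of pixels on
# a 3840x2160 (4K) panel. Integer ratios mean each rendered pixel maps to
# an exact square of panel pixels; fractional ratios force the scaler to
# blend across pixel boundaries, which reads as softness.

PANEL = (3840, 2160)

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440)}.items():
    sx, sy = PANEL[0] / w, PANEL[1] / h
    clean = sx.is_integer() and sy.is_integer()
    print(f"{name}: scale {sx} x {sy} -> "
          f"{'clean integer scaling' if clean else 'fractional (softened)'}")
# 1080p: scale 2.0 x 2.0 -> clean integer scaling
# 1440p: scale 1.5 x 1.5 -> fractional (softened)
```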