4K videos encoded in H.265 HEVC are now common.
But playing 4K H.265 (HEVC) video on Kubuntu or Ubuntu Linux (20.10) might not be so easy. Especially if your CPU is limited and your video is 60 fps HDR, you need hardware decoding on the video card to work.
This mini-guide is the one I would have loved to find in time. I won't go through the details of driver installation, and will assume the driver is properly installed.
If you are reading this post, it's likely you have problems playing some high-quality videos, or you're looking to upgrade your hardware. The good news is that 4K videos can play very well even on older PCs with limited configurations, provided that the GPU supports it. All the tools are provided in the Ubuntu distribution: you do not need to download or compile anything.
You can check your graphics adapter (GPU) video decoding capabilities:
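On NVIDIA with the proprietary driver, one quick way is to query VDPAU. This is a sketch under an assumption: it relies on the vdpauinfo package, which is not installed by default.

```shell
# List HEVC decoder profiles reported by VDPAU (NVIDIA proprietary driver).
# Assumes the vdpauinfo package; prints an install hint if it is missing.
if command -v vdpauinfo >/dev/null 2>&1; then
    decode_support=$(vdpauinfo 2>/dev/null | grep -i 'hevc' || true)
    : "${decode_support:=no HEVC decoder profile reported}"
else
    decode_support="vdpauinfo not installed; try: sudo apt install vdpauinfo"
fi
printf '%s\n' "$decode_support"
```

If HEVC profiles (Main, Main 10) show up in the output, the GPU can decode H.265 in hardware, including the 10-bit streams used by HDR10.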
There are several characteristics for a video file or stream that can make it more or less CPU or GPU intensive, and might impact your ability to play it:
- The resolution. Playing very high resolutions such as 8K is far more resource-intensive than full HD (1080p). We will focus on 4K since it's now the standard for TVs. 4K resolution is also called Ultra HD or just UHD. For a complete guide to resolutions, check this Wikipedia page.
- The number of colours, or gamut, encoded in the file. Most common is 8 bits per colour channel (no HDR), but with High-dynamic-range (HDR) videos, 10 or more bits can be used to encode the brightness of each colour. The most common variant is 10-bit HDR or HDR10, often just called HDR. An HDR video has more accurate colours and better bright and dark scenes, but requires more CPU or GPU processing power to decode.
- The codec. High Efficiency Video Coding (HEVC), also known as H.265, is more compressed and requires more processing power to decode than the former H.264.
- The frame-rate. 60 fps (frames per second) video requires more processing power than 24 fps.
- The average bitrate of the video.
- The audio encoding and channels: higher bitrates typically require a bit more processing power.
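To see where a given file sits on these axes, ffprobe (from the ffmpeg package) can report the codec, resolution, pixel format, frame rate and bit rate. This is a sketch under assumptions: ffmpeg is installed and a local file named video.mkv exists.

```shell
# Print the stream characteristics discussed above for a local file.
# Assumes ffmpeg is installed and that video.mkv exists in the current directory.
if command -v ffprobe >/dev/null 2>&1 && [ -f video.mkv ]; then
    probe_out=$(ffprobe -v error -select_streams v:0 \
        -show_entries stream=codec_name,width,height,pix_fmt,avg_frame_rate,bit_rate \
        -of default=noprint_wrappers=1 video.mkv)
else
    probe_out="needs ffmpeg (sudo apt install ffmpeg) and a video.mkv to inspect"
fi
printf '%s\n' "$probe_out"
```

A codec_name of hevc with a pix_fmt such as yuv420p10le tells you the file is 10-bit H.265, i.e. the HDR case discussed above.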
In order to test the hardware and player options, I downloaded these test videos from 4K Media:
- LG: New York HDR (4K / HDR / 25fps)
- Samsung: Wonderland Two HDR (4K / HDR / 24fps)
- Samsung x RedBull: See the Unexpected HDR (4K / HDR / 60 fps)
- Samsung: Travel With My Pet HDR (4K / HDR / 60fps)
Since the videos are hosted on Google Drive, you might be rejected because of high demand for the files. If you log in to your Google account in another browser tab, it should work.
Hardware used for tests
An old PC upgraded with the cheapest graphics card I could find that supports 4K monitors and HDMI 2.0. The internal graphics adapter is not used. This PC might be old, but with the NVIDIA GT 1030 hardware decoder, it can decode 4K HDR videos at 25 fps.
root@pc1:~# inxi -GCMm -y 80 -c 0
Machine:   Type: Desktop Mobo: ASRock model: FM2A75M-ITX
Memory:    RAM: total: 7.70 GiB used: 4.05 GiB (52.6%)
           Array-1: capacity: 8 GiB slots: 2 EC: None
           Device-1: A1_DIMM0 size: 4 GiB speed: 800 MT/s
           Device-2: A1_DIMM1 size: 4 GiB speed: 800 MT/s
CPU:       Info: Quad Core model: AMD A10-5800K APU with Radeon HD Graphics bits: 64 type: MCP
           L2 cache: 2048 KiB
           Speed: 1396 MHz min/max: 1400/3800 MHz Core speeds (MHz): 1: 1397 2: 1397 3: 1397 4: 1396
Graphics:  Device-1: AMD Trinity [Radeon HD 7660D] driver: radeon v: kernel
           Device-2: NVIDIA GP108 [GeForce GT 1030] driver: nvidia v: 455.38
           Display: server: X.Org 1.20.9 driver: modesetting,nvidia resolution: 3840x2160~60Hz
           OpenGL: renderer: GeForce GT 1030/PCIe/SSE2 v: 4.6.0 NVIDIA 455.38
A mid-range gaming PC. This PC can decode most 4K HDR videos without hardware decoding from the GPU, but starts to fail at 60 fps if hardware decoding on the GPU is not activated.
root@pc2:~# inxi -GCMm -y 80 -c 0
Machine:   Type: Desktop Mobo: Gigabyte model: Z390 I AORUS PRO WIFI-CF
Memory:    RAM: total: 15.57 GiB used: 9.20 GiB (59.1%)
           Array-1: capacity: 64 GiB slots: 4 EC: None
           Device-1: ChannelA-DIMM0 size: 8 GiB speed: 2666 MT/s
           Device-2: ChannelA-DIMM1 size: No Module Installed
           Device-3: ChannelB-DIMM0 size: 8 GiB speed: 2666 MT/s
           Device-4: ChannelB-DIMM1 size: No Module Installed
CPU:       Info: 6-Core model: Intel Core i5-9600K bits: 64 type: MCP
           L2 cache: 9216 KiB
           Speed: 2884 MHz min/max: 800/4600 MHz Core speeds (MHz): 1: 1810 2: 1749 3: 1732 4: 1751 5: 1965 6: 1330
Graphics:  Device-1: NVIDIA TU104 [GeForce RTX 2080 SUPER] driver: nvidia v: 455.38
           Display: server: X.Org 1.20.9 driver: nvidia resolution: 3840x2160~60Hz
           OpenGL: renderer: GeForce RTX 2080 SUPER/PCIe/SSE2 v: 4.6.0 NVIDIA 455.38
Check the NVIDIA Video Encode and Decode GPU Support Matrix.
- PC1 cannot play any of the videos by default, but can play the first two (24 and 25 fps) with the NVIDIA GT 1030 and the mpv --hwdec option.
- PC2 tends to drop some frames on the two 60 fps videos, especially the fourth one (Travel With My Pet), with software decoding. With the mpv --hwdec option, everything plays super smoothly.
With PC1, some 4K x265 videos were playing correctly; others looked like a partly encrypted slideshow. At least one core sits at 100% CPU, and the video player complains about dropped frames and hardware too old to play the video. This just shows that the CPU cannot decode the video in real time, and probably needs help from the graphics adapter's hardware decoding.
For instance, VLC will complain with several not-very-helpful "Could not find ref with POC xx" messages.
mplayer will complain like this:
************************************************
****  Your system is too SLOW to play this! ****
************************************************
The system is not "too SLOW"; it's just that mplayer does not know how to use VA-API and falls back to software decoding. For some CPUs this might work for low-bitrate 4K videos and just fail with higher bitrates. In my case the CPU was able to decode video bitrates up to 8-10 Mb/s, but no more.
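You can measure how fast your CPU alone decodes a file with ffmpeg's built-in benchmark, which decodes at full speed while discarding the output. A sketch, assuming ffmpeg is installed and a local file named video.mkv exists:

```shell
# Software-decode the file as fast as possible and report timing.
# If the reported speed is below 1x realtime, the CPU cannot keep up
# without GPU hardware decoding.
# Assumes ffmpeg is installed and video.mkv exists in the current directory.
if command -v ffmpeg >/dev/null 2>&1 && [ -f video.mkv ]; then
    bench_out=$(ffmpeg -hide_banner -benchmark -i video.mkv -f null - 2>&1 | tail -n 3)
else
    bench_out="needs ffmpeg (sudo apt install ffmpeg) and a video.mkv to test"
fi
printf '%s\n' "$bench_out"
```

This isolates pure decode performance from display and player overhead, so it is a fair test of whether software decoding can ever work for that file.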
If you use an HDMI connection to your TV/monitor, check that you enabled EDID 2.0 or HDMI 2.0 on the HDMI input of your TV. Most 4K TVs default to HDMI EDID 1.4 for compatibility reasons.
Finally, make sure you bought a recent, quality HDMI cable that can support such data bandwidth and resolution. These cables are often referred to as HDMI 2.0 High Speed, or HDMI 4K.
Check that the proprietary drivers for the graphics adapter are properly installed; they provide the hardware decoding APIs (VDPAU/NVDEC for NVIDIA, VA-API for Intel and AMD). I have no recent experience with AMD adapters, but it should work as well. For NVIDIA, run nvidia-settings. You should see something like this screenshot on PC1:
Some guides on the Internet might refer to the vainfo check, but version 1.8 currently seems broken (https://github.com/intel/libva/issues/448). Don't be fooled by the misleading error message:
Monitor your CPU and your GPU
With NVIDIA adapters, you can check GPU utilization directly in nvidia-settings while playing the video.
How to know if hardware decoding is enabled
While playing a video, check if you have a Video Engine Utilization that is higher than 0%, as shown in the below screenshot on PC2:
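If you prefer the command line to the nvidia-settings GUI, the same information is available from nvidia-smi, which ships with the proprietary driver. While the video plays, the Decoder percentage should be above 0%:

```shell
# Show GPU, memory, encoder and decoder utilization (NVIDIA proprietary driver).
# A non-zero "Decoder" value while a video plays means hardware decoding is active.
if command -v nvidia-smi >/dev/null 2>&1; then
    util_out=$(nvidia-smi -q -d UTILIZATION 2>&1)
else
    util_out="nvidia-smi not found; is the proprietary NVIDIA driver installed?"
fi
printf '%s\n' "$util_out"
```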
Choose the right player
So far, I had no luck with my favourite vlc player, nor with mplayer. Here are some recommendations:
- On Windows, try MPC-HC
- On Linux, try mpv
Video player tips
Most video players disable hardware decoding by default for compatibility reasons. So you have to find the right setting to enable hardware decoding and free your CPU from that task.
For the mpv video player, also provided in Ubuntu (sudo apt install mpv), you just need to add the --hwdec=auto-safe option. Later you can make the option stick by adding it to the mpv config file:
pivert@cave:~$ cat ~/.config/mpv/mpv.conf
hwdec=auto-safe
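The same config file can be created from the shell. A small sketch that only appends the option if it is not already there:

```shell
# Create the mpv user config directory and persist hwdec=auto-safe, idempotently.
mkdir -p "$HOME/.config/mpv"
grep -qxF 'hwdec=auto-safe' "$HOME/.config/mpv/mpv.conf" 2>/dev/null \
    || echo 'hwdec=auto-safe' >> "$HOME/.config/mpv/mpv.conf"
# Show the resulting config.
cat "$HOME/.config/mpv/mpv.conf"
```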
And then, it's magic: stunning 4K video quality, with 50% CPU usage, including Dolby Digital Plus decoding.
Monitor your CPU and check that it never reaches 100%