How CPU/GPU "intensive" a codec is depends, and increased computational overhead in newer codecs is always to be expected. The more efficient a codec is (perceived "quality" per bit), the more complicated the decoding computations need to be.
That's why most (consumer) CPUs built in the last decade have hardware decoding support (a dedicated hardware implementation is always more efficient than running on the CPU), and often even hardware encoding support.
The difference between h26x decoders need not even be that big (depending on the implementation). The widespread use of h264 killed many notebooks because of the missing hardware decoder.
h266 will, with 99% certainty, never run more efficiently on the same hardware than h265, unless we find some magic to achieve better-than-h264 compression at MPEG-1 complexity. And if that were easy, why wasn't h265 built that way?
And then there is AV1, which makes any newer MPEG-LA standard pointless anyway. You just need to wait for widespread availability of the hardware en-/decoders (and efficient encoder implementations, I guess).
I parsed his statement the opposite way: “if hardware vendors want to drum up some business, they should develop h265+1, which will trigger the next round of laptop upgrades because it will be too cpu-intensive to work on devices which haven’t been upgraded in 15 years.”
> The more efficient a codec is (perceived "quality" per bit), the more complicated the decoding computations need to be.
That doesn’t have to be true. Only encoding is guaranteed to be more intensive.
What usually happens when more efficient encodings are developed, though, is not that the bitrate goes down; it is kept at the same rate to provide a higher image resolution/quality for the resulting video.
And that means more pixels to render, and that means more work on the decoding end.
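To put some rough numbers on that trade-off, here is a back-of-envelope sketch (the bitrates and resolutions are illustrative, not measurements): at a fixed bitrate, moving from 1080p to 4K leaves the codec fewer bits per pixel to work with, while the decoder has four times as many pixels to reconstruct per frame.

```python
# Back-of-envelope: same bitrate at a higher resolution means fewer coded
# bits per pixel (the codec has to be smarter) AND more pixels to
# reconstruct per frame (the decoder has to work harder).

def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
    """Average number of coded bits available per pixel per frame."""
    return bitrate_bps / (width * height * fps)

# 8 Mbit/s at 1080p30 vs the same 8 Mbit/s at 4K30 (hypothetical figures)
bpp_1080 = bits_per_pixel(8e6, 1920, 1080, 30)
bpp_4k = bits_per_pixel(8e6, 3840, 2160, 30)

print(f"1080p: {bpp_1080:.3f} bits/pixel")  # ~0.129
print(f"4K:    {bpp_4k:.3f} bits/pixel")    # ~0.032, with 4x the pixels to render
```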
The MPEG group has tried to squeeze too much out of their h264 and h265 codecs, so other interested parties developed the totally free (with a patent-MAD clause) AV1 codec. I guess there will be no widely supported h266.
MPEG2 used to require dedicated PCI[-X/e] cards to offload encoding from the CPU, because the CPU couldn't do it fast enough for real time. Now CPUs can encode it at many times real-time speed. H.264 also had add-on cards to enable faster-than-CPU encodes; now CPUs handle it. Currently, H.265 pushes CPUs to their limits. Tomorrow? H.265 will be fast, and the next H.2xx will be the one making us wait. Rinse. Lather. Repeat.
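That "rinse, lather, repeat" cycle is just the ratio of encoder throughput to the video's frame rate shifting over time. A tiny sketch of the arithmetic (the frame rates below are made-up examples, not benchmarks):

```python
# "x real-time" for an encoder: frames encoded per second divided by the
# video's frame rate. > 1 means faster than real time; < 1 means you wait.

def realtime_factor(encoded_fps: float, video_fps: float) -> float:
    return encoded_fps / video_fps

# A modern CPU churning through MPEG-2 at, say, 600 fps on 30 fps content:
print(f"{realtime_factor(600, 30):.0f}x real-time")  # 20x real-time
# The same CPU grinding out an H.265 encode at 12 fps:
print(f"{realtime_factor(12, 30):.1f}x real-time")   # 0.4x real-time
```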
It's more CPU/GPU-intensive, it's not as widely supported, and there are probably more downsides.
Better quality at the same bitrate? For most use-cases: yes.