
> For example, a modern CPU encoding/decoding a video stream would use way less power thanks to its newer video decoder/encoder hardware IP blocks than a very old CPU going on full blast to SW decode/encode the video stream. The differences can be staggering.

Is that the case for Teams / Zoom and friends, which I'd expect to be the most common video encode/decode uses? I'm expecting my 11th gen i7 laptop to become airborne any day now, with the kind of noise it makes on Teams calls.

Anecdotally, my personal Zen 3 U-series laptop does seem to offload video decoding to the GPU when watching movies, but it still gets fairly warm.
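
If you want to sanity-check the offload yourself, a rough approach is to time a pure software decode against a hardware-accelerated one with ffmpeg. A minimal sketch, assuming a Linux box with working VAAPI drivers; the clip path is a placeholder:

    import subprocess
    import time

    # Hypothetical input file; any H.264/HEVC clip will do.
    CLIP = "sample.mp4"

    def time_decode(extra_args):
        """Decode CLIP to the null muxer and return wall-clock seconds."""
        start = time.monotonic()
        subprocess.run(
            ["ffmpeg", "-v", "error", *extra_args, "-i", CLIP, "-f", "null", "-"],
            check=True,
        )
        return time.monotonic() - start

    sw = time_decode([])                     # software decode on the CPU
    hw = time_decode(["-hwaccel", "vaapi"])  # offload to the GPU decode block
    print(f"software: {sw:.1f}s  hardware: {hw:.1f}s")

Wall-clock time is only a crude proxy, of course; watching CPU utilization or a power readout (powertop, intel_gpu_top, etc.) while each run is going tells you much more about the energy story.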

> If you don't care about energy efficiency or it's dirt cheap where you live then that's fine, but FWIW a modern 12th gen Intel Alder-Lake N100 NUC clone with quad E-cores[1] will smoke that

On what do you base this? I've never tried an N100 NUC, but I have used a laptop with a 12th gen i5 with 2 P-cores and (I forget how many) E-cores. Compiling Rust, it was comparable to my older laptop with the 11th gen i7. Granted, it's a "U" part in an ultrabook, but it's barely faster than a 2013 MBP with an i7, which, if I'm not mistaken, is a 3rd gen Core "H" part.


