Not really, no. For plenty of problems, the most efficient known algorithms will easily outperform a naive algorithm, even when running on drastically inferior hardware. 'Drastically' in this context can mean the slower approach taking billions of times longer.
Wikipedia gives an example. [0] Many courses on algorithms and/or complexity theory emphasise such examples in their introductions.
That said, I don't know why jhoechtl thought that was the moral of this blog post, which doesn't illustrate that point at all.
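For a toy illustration of the kind of gap I mean (my own sketch, not from the post or the Wikipedia example): naive recursive Fibonacci takes exponential time, while a simple iterative version is linear, so even modest inputs make the naive version thousands of times slower, and the ratio keeps exploding as n grows.

```python
import time

def fib_naive(n):
    # Exponential time: roughly phi^n recursive calls.
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

def fib_fast(n):
    # Linear time: one pass with two accumulators.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

n = 32
t0 = time.perf_counter()
slow_result = fib_naive(n)
t_naive = time.perf_counter() - t0

t0 = time.perf_counter()
fast_result = fib_fast(n)
t_fast = time.perf_counter() - t0

assert slow_result == fast_result
print(f"n={n}: naive {t_naive:.3f}s, fast {t_fast:.6f}s")
```

No faster hardware closes that gap: bump n by 50 and the iterative version still finishes instantly, while the naive one wouldn't finish in a lifetime on any machine.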
> 'Drastically' in this context can mean the slower approach taking billions of times longer.
Standupmaths - "Someone improved my code by 40,832,277,770%": https://www.youtube.com/watch?v=c33AZBnRHks (OK, 41 billion percent is only 0.41 billion times better, but still.)