
I believe a critical difference between high-performance programming now and in yesteryear is the degree to which it's a design problem versus an implementation problem.

When writing 6502 assembly, you have "tricks" galore. You do have a design trade-off to make: memory vs. CPU cycles, and when looking at algorithms in really old programs, they often dispensed with even basic caching to save a few bytes. But a lot of the savings came from gradually making the program as a whole a tighter specimen, doing initializations and creating reports with a few fewer instructions. The "middle" of the program was of similar importance to the design and the inner loops, and it popularized ideas like "a program with shorter variable names will run faster" or "a program with the inner-loop subroutines at the top of the listing will run faster" (both true of many interpreters). An engineer of this period worked out a lot of stuff on paper, because the machine itself wasn't in a position to give much help. And so the literal "coding" was of import: you had to polish it all throughout.
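To make those interpreter folk-rules concrete: in many old BASICs, variables and GOSUB targets were found by linear scan from the start of a table or listing, so anything defined or placed earlier cost fewer comparisons per access. A minimal sketch of that lookup model (the names and tables here are hypothetical, not from any particular interpreter):

```python
# Toy model of linear-scan lookup, the mechanism behind "subroutines at
# the top of the listing run faster" in many old interpreters: cost
# scales with how far down the table the target sits.

def lookup(table, name):
    """Linear scan, counting comparisons the way an interpreter would."""
    comparisons = 0
    for key, value in table:
        comparisons += 1
        if key == name:
            return value, comparisons
    raise KeyError(name)

# Variable table in order of first use: later variables cost more per access.
variables = [("A", 1), ("B", 2), ("C", 3), ("TOTAL", 4)]
_, cost_first = lookup(variables, "A")      # found after 1 comparison
_, cost_last = lookup(variables, "TOTAL")   # found after 4 comparisons

# A subroutine near the top of the listing is found sooner than one at
# the bottom when the interpreter scans line numbers from the start.
lines = [(10, "REM init"), (20, "REM inner loop sub"), (9000, "REM report")]
_, hot = lookup(lines, 20)     # cheap: scanned early
_, cold = lookup(lines, 9000)  # expensive: scanned to the end

print(cost_first, cost_last, hot, cold)  # 1 4 2 3
```

The shorter-names effect is similar in spirit: every extra character is more bytes for the tokenized line scanner to walk past on each pass through the code.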

Today, the assumption is that the middle is always automated: a goop of glue that hopefully gets compiled down to something acceptable. Performance is really weighted towards the extremes of either finding a clever data layout or hammering the inner loop, and to get the most impactful results you usually have a little of both involved.
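The two extremes can be sketched side by side. Below is a hedged illustration, with hypothetical names (`Particle`, `step_*`): the same update done over an object-per-item layout, then over a flat structure-of-arrays layout chosen so the inner loop touches only contiguous data.

```python
# Sketch of "clever data layout" vs. "hammering the inner loop":
# the same physics step in array-of-structs and structure-of-arrays form.

from dataclasses import dataclass

@dataclass
class Particle:          # array-of-structs: convenient, pointer-chasing
    x: float
    vx: float

def step_aos(particles, dt):
    for p in particles:          # inner loop dereferences each object
        p.x += p.vx * dt

def step_soa(xs, vxs, dt):
    # structure-of-arrays: the hot loop walks two contiguous sequences,
    # which is the layout a vectorizing or cache-aware rewrite wants
    for i in range(len(xs)):
        xs[i] += vxs[i] * dt

aos = [Particle(0.0, 1.0), Particle(10.0, -1.0)]
step_aos(aos, 0.5)

xs, vxs = [0.0, 10.0], [1.0, -1.0]
step_soa(xs, vxs, 0.5)

print([p.x for p in aos], xs)  # [0.5, 9.5] [0.5, 9.5]
```

Both produce identical results; the payoff of the second layout only shows up once the inner loop is also tightened around it, which is the "little of both" point above.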

The hardware is in a similar position to the software: the masks aren't being laid out by hand, and they increasingly rely on automation of the details. But they still need a tight overall design to get the outcome of "doing more with less."

And the justifications for getting the performance generally have little to do with symbolic computation now. We aren't concerned about simply having a lot of live assets tracked in a game scene (a problem that was still interesting in the '90s, but more-or-less solved by the time we started having hundreds of megabytes of RAM available); we're concerned about having a lot of heavy assets actively pushed through the pipeline to do something specific. That leans towards approaches that see the world in less symbolic or analytical terms and more as a continuous space sampled to some approximation. Which digital computing can do, but it isn't the obvious win it once was.


