
If you can achieve a memory savings of 30%, say, that would be pretty big news. If you can only achieve a 30% memory savings by doing things to the compiler that make it harder to work on the compiler in the future, then it's probably not worthwhile. But if you can achieve an 80% memory savings by doing violence to the compiler, that might be worth the trouble.

As for what's an ideal source->AST expansion factor for a language that has a user-friendly compiler that's also compiler dev-friendly, that's hard to say. Clearly 50x is workable.

In TFA the 50x expansion factor is used as a motivator for automating a particular kind of optimization. It'd be very interesting to see a Rust vec-of-enums that automatically deconstructs enum values into tags and opaque payloads and stores them in a struct-of-arrays layout, like TFA does in Zig. The unsafe code needed for this could be buried in just a few places.
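For a sense of what that layout looks like, here is a minimal hand-rolled sketch in safe Rust (no unsafe at all, since the payloads here happen to fit in plain words). The `Node`, `Tag`, and `NodeList` names are mine, not from TFA or any crate; a real automated version would derive the tag/payload split from the enum definition rather than writing the `match` arms by hand.

```rust
// Tag + opaque-payload, struct-of-arrays storage for an enum,
// mirroring the layout the Zig compiler uses for its AST nodes.

#[derive(Clone, Copy, PartialEq, Debug)]
enum Tag { Lit, Add, Neg }

// The "user-facing" enum. All payloads here fit in two u32 words.
#[derive(Clone, Copy, PartialEq, Debug)]
enum Node {
    Lit(u32),      // literal value
    Add(u32, u32), // indices of the two operand nodes
    Neg(u32),      // index of the operand node
}

// Struct-of-arrays: one parallel array per component, so the
// per-element cost is 1 tag byte + 8 payload bytes (plus padding
// amortized across the whole array, not paid per element).
#[derive(Default)]
struct NodeList {
    tags: Vec<Tag>,
    data: Vec<[u32; 2]>, // opaque words; meaning depends on the tag
}

impl NodeList {
    fn push(&mut self, node: Node) -> u32 {
        let (tag, data) = match node {
            Node::Lit(v) => (Tag::Lit, [v, 0]),
            Node::Add(a, b) => (Tag::Add, [a, b]),
            Node::Neg(a) => (Tag::Neg, [a, 0]),
        };
        self.tags.push(tag);
        self.data.push(data);
        (self.tags.len() - 1) as u32
    }

    // Reconstruct the enum value on the way out.
    fn get(&self, i: u32) -> Node {
        let d = self.data[i as usize];
        match self.tags[i as usize] {
            Tag::Lit => Node::Lit(d[0]),
            Tag::Add => Node::Add(d[0], d[1]),
            Tag::Neg => Node::Neg(d[0]),
        }
    }
}

fn main() {
    let mut nodes = NodeList::default();
    let a = nodes.push(Node::Lit(2));
    let b = nodes.push(Node::Lit(3));
    let sum = nodes.push(Node::Add(a, b));
    assert_eq!(nodes.get(sum), Node::Add(0, 1));
}
```

Unsafe only becomes necessary once a variant's payload doesn't fit the fixed word size and has to spill into a side buffer reinterpreted by tag, which is why the unsafe surface stays small: it's confined to the push/get pair.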



Maybe I'm just dumb but isn't the endgame of vec of enums going to be an ECS?


And the endgame of an ECS is a columnar RDBMS



