Code is never just working. The environment it runs in changes, forcing things to be refactored. We're not talking about a showcase piece of artisan algorithm here, but bug-riddled legacy code reliant on outdated system packages; SQL queries that cannot use bound parameters for historic reasons and are ever-prone to injection attacks; code that uses broken multibyte encoding, leaving it vulnerable to several attack classes. And that's not even talking about performance. Are you seriously telling me software performance should not be improved if the core functionality kinda, sorta, works?
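To make the injection point concrete, here's a minimal sketch of what bound parameters buy you (assuming $db is a PDO handle and a hypothetical users table, not any actual WordPress code):

    // Vulnerable: user input interpolated straight into the SQL string.
    $rows = $db->query("SELECT * FROM users WHERE name = '$name'")->fetchAll();

    // With a bound parameter, the driver keeps the data out of the query text.
    $stmt = $db->prepare('SELECT * FROM users WHERE name = :name');
    $stmt->execute([':name' => $name]);
    $rows = $stmt->fetchAll();

Legacy code stuck on the first pattern has to escape by hand, forever, and gets it wrong eventually.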


> Are you seriously telling me software performance should not be improved if the core functionality kinda, sorta, works?

I would not hesitate to take this position. Of course it depends on how severe the bugs are, especially for the outer code (like plugins) calling it, and on how bad the performance is. But otherwise absolutely, never break user space.

> The environment it runs in changes

The web environment Wordpress runs in did not change all that much. The JS ecosystem simulates big changes, but that's all bullshit. Server code that worked 30 years ago still works - if projects like PHP don't go out of their way to break it.


> Server code that worked 30 years ago still works - if projects like PHP don't go out of their way to break it.

I'd be horrified to expose ANY software written 30 years ago to the internet, if it touches money or valuable data in any way.


> But otherwise absolutely, never break user space.

Neither the Linux kernel, nor OpenSSL, nor any other reasonably complex project manages to do that over a long enough time frame. Sometimes you need to adapt, and things break in the process. Nobody would expect a house built 30 years ago to not require some maintenance and upgrades over time.

> The web environment Wordpress runs in did not change all that much. The JS ecosystem simulates big changes, but that's all bullshit. Server code that worked 30 years ago still works - if projects like PHP don't go out of their way to break it.

That sure sounds good, but is simply not true. We went from HTTP and FTP deployments to TLS and containers, from dialup to gigabit consumer uplink; the browser isn't a remote document viewer but a platform-agnostic virtual machine for fully-fledged applications; the web is centred around a few enormous platforms; people regularly stream gigabytes' worth of video and expect services to deliver web apps on a variety of devices; they don't post on bulletin boards and in news groups, but use chat services; scammers distribute ransomware, steal your identity, remotely take compromising pictures from your webcam, or order stuff from your shopping accounts online. The modern web has almost nothing in common with the one from 30 years ago.


> That sure sounds good, but is simply not true.

Sure it's true. Many users are still doing FTP-style deployment, even if it's not the original protocol anymore. That the pipes are bigger just meant we could up the thumbnail size, and the browser is still also a remote document viewer for sites that don't demand more. Just today I answered a support question on a bulletin board, and so on.

There are other aspects of the web today, but the old way still exists.

> Nobody would expect a house built 30 years ago to not require some maintenance and upgrades over time.

You can do upgrades of software in a way that does not break compatibility, and you can definitely always aim to minimize breakage. Wordpress is not a bad example of just that. HN itself counts as a further example. If it weren't possible we wouldn't have this thread to discuss it in.
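One common way to do that is a deprecation shim: keep the old entry point as a thin wrapper over the new one. A rough sketch in PHP (the function names here are made up for illustration, not actual WordPress APIs):

    // Old signature kept alive so existing call sites don't break.
    function get_post_list($count) {
        trigger_error(
            'get_post_list() is deprecated; use fetch_posts() instead',
            E_USER_DEPRECATED
        );
        return fetch_posts(['limit' => $count]);
    }

    // New, extensible entry point.
    function fetch_posts(array $opts) {
        $limit = $opts['limit'] ?? 10;
        // Placeholder for the real query logic.
        return array_map(fn($i) => "post $i", range(1, $limit));
    }

Old callers keep working and get a heads-up in their logs, while the new API can grow options without another signature change.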



