At what point in the past were our programs stable, robust and just worked? Perhaps it was before my time, but DOS and Windows (3.11, 95) would crash constantly: blue screens of death, infinite loops that would just freeze my computer, memory leaks that would cause my computer to stop working after a day.

I now expect my computer to stay on for months without issues. I expect to be able to put it to sleep and open it in the same state it was in. I expect that if a website or a program errors, my OS simply shrugs and closes it. I expect my OS to be robust enough that if I plug in a USB drive or download a file, I'm not playing Russian roulette with a virus that would destroy my computer.

In the past I would shut down my computer at the end of every day because otherwise it would simply crash some time in the night. I would run defragmentation at least once a month. Memory errors and disk errors were common, and the OS had no idea how to overcome them. Crashes were so common, you just shrugged and learned to save often.



> At what point in the past were our programs stable, robust and just worked?

DOS was rock solid, at least around the era of DR-DOS. DESQView 386 was absolutely stable too. The BBS software I ran on them in those days was a wobbly piece of shit though.

I also recall Borland's Turbo Pascal compiler and the accompanying text-mode IDE being ultra reliable.

After DOS I used OS/2, which was also extremely stable, although it suffered from limited hardware and software availability.

Mac OS X used to be rock solid too, in the heyday of the PowerBooks and the early Intel MacBooks. Every now and then there were hardware design flaws, though, and now the quality of both the software and the hardware seems to have taken a tragic turn for the worse.

You still do play Russian roulette whenever you plug a USB device into your computer, see "USB Rubber Ducky".


> DOS was rock solid

DOS was rock solid because it did nothing. Many programs, particularly games and anything that did networking, didn't even use DOS interfaces—they bypassed them entirely and worked with either the BIOS or hardware directly. There was no memory protection, no multitasking, and, on a higher level, no permissions nor sandboxing. So while maybe it "just worked", I wouldn't call it robust.


You're complaining that DOS lacked features and protections. But a zero-cost abstraction like DOS can still be robust (in the sense of being reliable and solid). Yes, it requires more trust, but there is no guarantee that modern OSes can run arbitrary zero-trust binaries either.


If you run a buggy program on a modern OS, it won't crash the system or impact other processes. If you run a buggy program on DOS, it will write to random physical addresses, probably clobbering the state of other processes and of DOS.

Modern OSes may not be able to run arbitrary binaries safely, but they can pretty much run arbitrary non-adversarial binaries: problematic binaries have to be intentionally written to exploit the system (as opposed to DOS, where non-problematic binaries had to be intentionally written not to break the system).

It's a dramatic improvement.
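
To make the difference concrete, here's a toy C program (purely illustrative; the address is arbitrary):

    #include <stdio.h>

    int main(void) {
        /* The classic "buggy program" bug: a store through a wild pointer.
           On a modern OS this hits an unmapped virtual page, the process is
           killed with a segfault, and nothing else on the machine notices.
           On real-mode DOS there was no MMU in the way, so the same kind of
           stray store went to a physical address and could clobber another
           program, a TSR, or DOS itself. (With a 16-bit DOS compiler you'd
           spell this as a far pointer, but the idea is the same.) */
        int *p = (int *)0xB8000;  /* an address this process never mapped */
        *p = 42;
        puts("you won't see this on a modern OS");
        return 0;
    }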


It doesn't matter what OS I run. My "modern", and apparently buggy, CPU runs arbitrary systems that I know very little about and have little to no control over.

Since 2008, it's been a dramatic departure.


> Modern OSes may not be able to run arbitrary binaries safely, but they can pretty much run arbitrary non-adversarial binaries

Mostly. Modern OSes strive to run adversarial binaries, but whether they can do it safely is still in question, IMO.


DOS did nothing, and so people extended it with TSRs: "terminate and stay resident" programs. These were a nightmare. They had "conflicts": users had to experiment with the order in which they were loaded to make them like each other. Some of them were interrupt-driven and yet needed to call into DOS, which would be bad if the interrupt had gone off in the middle of DOS. So they tried to guess whether that was the case.
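
The shape of one looked roughly like this (a from-memory sketch in the old Borland/Turbo C dialect, untested; getvect/setvect/keep/MK_FP are Borland extensions, not standard C):

    #include <dos.h>

    static void interrupt (*old_int8)(void);  /* previous timer (INT 08h) handler */
    static unsigned char far *indos;          /* DOS "busy" flag, via INT 21h AH=34h */

    static void interrupt hook(void)
    {
        /* Only call DOS services when DOS isn't already in the middle of a
           system call; re-entering DOS from an interrupt was exactly the
           guesswork that made badly written TSRs so flaky. */
        if (*indos == 0) {
            /* ...work that needs DOS goes here... */
        }
        old_int8();                           /* chain to the previous handler */
    }

    int main(void)
    {
        union REGS r;
        struct SREGS s;

        segread(&s);                          /* fill in current segment registers */
        r.h.ah = 0x34;                        /* INT 21h: get InDOS flag address */
        intdosx(&r, &r, &s);
        indos = (unsigned char far *)MK_FP(s.es, r.x.bx);

        old_int8 = getvect(0x08);             /* hook the timer tick */
        setvect(0x08, hook);

        keep(0, 256);                         /* exit, stay resident (size in
                                                 16-byte paragraphs; a real TSR
                                                 computes this properly) */
        return 0;                             /* never reached */
    }

Now multiply that kind of fragile cooperation across a dozen TSRs all hooking the same interrupts, and you get the load-order lottery.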


> DOS was rock solid

Boy, that's not how I remember DOS. I remember playing with all kinds of variants of driver load order in config.sys and passing obscure arguments into himem.sys to avoid odd hardware conflicts and crashes.
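
For anyone who never had the pleasure, a typical late-DOS config.sys looked something like this (paths and switches from memory, purely illustrative):

    DEVICE=C:\DOS\HIMEM.SYS /TESTMEM:OFF
    DEVICE=C:\DOS\EMM386.EXE NOEMS
    DOS=HIGH,UMB
    DEVICEHIGH=C:\DOS\MOUSE.SYS
    DEVICEHIGH=C:\CDROM\CDROM.SYS /D:MSCD001
    FILES=30
    BUFFERS=20

Get the order or the switches wrong and some driver or game would refuse to load, or just hang the machine.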


I always wondered what the world would be like if Microsoft had just made a 32-bit DOS instead of going down the WinNT/95 route. Most of the headache in config.sys and friends came from working around the 16-bit address space. However, there was something really nice about owning your entire machine and only needing command.com for the "operating system". Compare this to full operating systems, which consume gigabytes of disk and memory.

This hypothetical 32-bit DOS could've had memory protection and multitasking too. Obviously device drivers would add complexity, but it wouldn't need to be as complex as what we have now.


DOS isn't even an operating system in the modern sense. Once you add preemptive multitasking and memory protection, you're simply going to end up with a normal modern operating system kernel again.

On the other hand, the stuff that takes "gigabytes of disk and memory" isn't even part of the operating system kernel, so there's no need to start from DOS to get rid of that stuff. It's possible to run linux from a few megabytes of ram.


> Once you add preemptive multitasking and memory protection, you're simply going to end up with a normal modern operating system kernel again.

You're missing the point. There is no single-file operating system for desktop users (maybe VxWorks or some other embedded OS falls into that category, but those aren't really for desktops). Modern operating systems sprawl all over the disk. Memory protection and multitasking are not large features; CS undergrads all over the world routinely implement them in less than a semester.

> It's possible to run linux from a few megabytes of ram.

A few megs of RAM and a directory in /etc filled with startup stuff and config files. Clearly you don't appreciate it, but there was something really nice about being in the root directory and seeing only command.com and config.sys. The entire rest of the machine was yours to set up however you liked. The things most people hated about DOS really had more to do with the 16-bit address space and segmented architecture.


Files are cheap now, and even a Linux initrd has a lot of files inside it.

DOS had other files besides those, but they were hidden: https://en.wikipedia.org/wiki/List_of_DOS_system_files

You could theoretically build a Linux kernel with all the drivers you need linked in, plus a basic FS stored as a drive image, so you would have fewer files on the image.
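
Roughly (kernel option names from memory, untested): build everything in statically and embed the root filesystem in the kernel image itself, e.g. with a .config fragment like

    # no loadable modules, all drivers built in
    CONFIG_MODULES=n
    # bake a cpio archive (or a plain directory) into the kernel as the rootfs
    CONFIG_BLK_DEV_INITRD=y
    CONFIG_INITRAMFS_SOURCE="/path/to/rootfs"

and you end up with a single bzImage that boots straight into your userland.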


OpenWrt runs on routers with 8 MB of flash and 64 MB of RAM, and it includes a whole WiFi stack, routing, and a browser-based GUI.

https://openwrt.org/supported_devices/432_warning


There was a company that made a multi-user 32-bit DOS called TSX-32.

https://en.m.wikipedia.org/wiki/TSX-32



DOS was a configuration nightmare; you could run games that required up to about 600 KB of free conventional memory, but only with a ludicrous amount of hacking that was harder to figure out in the pre-internet days.

The "rubber ducky" attack is, of course, also possible with PS/2, XT, and even ADB keyboards, because none of them were authenticated.


I remember updating my autoexec.bat as a preteen but I wouldn't call it ludicrous hacking. I don't remember where I got the instructions from but I think they were just in the readmes or error messages of games, no internet required.

It pales into insignificance compared to what I have to do to keep a desktop Linux box behaving normally today. Every few weeks I'm having to paste a wodge of lines into a range of config files to fix whatever driver oddity or system resource issue has caused something to break this time. That is pretty unimaginable without the internet.


My recollection is that Tiger and Leopard were a bit ropey, which is why Snow Leopard was considered primarily a performance and stability update.


Sometime in the early or mid 90s I had a FreeBSD box on 2.something - installed because I disliked unreliable, flaky Windows so much - that passed a year of uptime. It was my daily driver during that time, often doing stuff while I was out at work or sleeping. Cutting CDs, which had been simply bulletproof on the Amiga, became so incredibly delicate and flaky on Windows that it was one of the pushes to go BSD instead; I mostly kept using the Amiga as the most reliable option for that.

The early 90s Suns and SGIs didn't crash much either - though in a dev shop, sure, we could push them to panic from time to time. The bigger iron just ran indefinitely, often until an OS upgrade. :)

Now obviously this talk is game related, but even my earlier Amigas were more reliable for uptime than DOS and Windows if you stayed within Workbench - often passing into months. The Amiga's mostly undeserved reputation for constant crashing came from games hitting the hardware directly, and from its Guru Meditation messages being more visible than the silent freezes or pretty random colours that other platforms gave you.

All were online, though not much web yet - mainly FTP, newsgroups and dial-up BBSs.


You are aware that your cheap little box running DOS or Windows was not the epitome of computing back then? It was a wobbly, underdeveloped side branch dominated by amateurs doing amateur things on operating systems made by amateurs. Professional computing was done on UNIX and VAX, and both were rock solid in comparison.


In my experience, DOS and also all Windows versions from 1.2 to 95 never crashed. For me it started with Windows 98 (heavily pirated by people), and the horror story was Windows Me. My wife told me to do something about it, and as there were copies of Windows 2000 provided for free in magazines, I used one. Windows 2000 was such a relief after Windows Me! But there was no USB and other niceties in W2K.

The nightmare started again with Windows XP; then I switched to Ubuntu, which was reminiscent of Windows 2000.

A funny thing, and proof of how solid the interfaces in Windows 3.1/3.11 were, is that people made their own versions by removing or adding components, and sometimes even changing their contents with hex editors.

There is still a fandom for old Windows versions out there.

And you could catch viruses literally by hand, by looking at the kernel files, checking their size, and checking what was loaded in memory.

I remember that time with great pleasure.


Windows 95 itself was relatively stable, but the drivers generally were not. Your experience varied depending on the hardware you were running and the stability of the associated drivers.


Windows 95 stable? Never ran out of GDI handles and had Win95/98 crashing? Yeah right...


> learned to save often

I still have the habit of constantly hitting cmd-s everywhere; it's a reflex I'll probably never unlearn. I also cringe when I see people working on a bunch of files that have not been saved for a while or, god forbid, not at all. Completely irrational, but it's what I've been programmed to do for years ;)


I once read that "Graphing Calculator" on the Mac was so reliable they'd run it in a test loop, for days on end, as a hardware check.

Today, I can reliably crash my Mac (10.13.x) by switching Spaces twice in a row quickly.


The problems you experienced were not with DOS, but with Windows 3.11 / 95. DOS itself was one of the most stable platforms I've ever worked with. I personally worked on a NetWare server running on DOS that had an uptime of over 20 years. DOS's stability was not an outlier: many of the UNIX machines I worked on that predated DOS had uptimes measured in months and years.

The only reason why Windows was so buggy for you is that you were using the home editions. At the same time that you were experiencing blue screens in Win 9x, my NT workstation was rock solid, without any of the issues you described.


I hear this claim occasionally but it doesn't match my experience. The very first time I used Windows NT 4 (probably 1997 or 1998), I couldn't figure out how to log out, so I chose Start -> Help to look it up. Bluescreen.

In the subsequent months/years with NT 4, the situation did not improve. It was a sad day when they replaced the HP/UX section of the lab with more NT machines. They were faster but they crashed a lot. It really took until Vista before NT was reliable.


Windows NT was originally a microkernel architecture. NT4 moved a bunch of code back into the kernel space for performance reasons.

Most notably: graphics and printer drivers, which are not typically written to the highest standard.

Big iron vendors don't really have that problem, since they typically control their hardware as well. Microsoft had to rely on component vendors to provide driver software and couldn't plausibly test all permutations under all conditions (even though they test very, very many).


> Windows (3.11, 95) would crash constantly

>> At the same time that you were experiencing blue screens

>>> first time I used Windows NT 4

For the 3.11/95 period, I was talking about NT 3.51, not 4.

Yes, NT 4 had well-known issues with poor-quality graphics drivers. For this reason many of us stayed with 3.51 until the graphics driver problems were ironed out.

You can cherry-pick unstable OSes from any time period, but there is nothing special about today's OSes or programs. I've seen DOS and multiple forms of UNIX in the 80s and 90s that were just as stable as today's Windows 10 or OS X.


This is exactly how I felt watching the video. It seems just like the same old 'wasn't everything better in the old days' nonsense. Not to mention the fact that in those days, a computer was something you had in one room of your house, and it wasn't often connected to other computers. Nowadays, computers are everywhere. I personally would argue that improvements in code safety haven't kept pace with the rate at which code is being put into things, but that's a whole different kettle of fish.


He's a game developer. Games used to be released on hardware, with no possibility of being patched.


Well said! Back then, Windows plus the associated app software, drivers, and viruses used to be the biggest source of problems. Today I have none of that with Mac, iOS or Android. Going without a reboot for days or months is normal and expected.



