I hung on to my roughly 15-year-old Intel motherboard, CPU, and 16 GB of RAM, mostly out of e-waste guilt. I cannot believe this has value, but here we are.
I also wish I'd built a new gaming rig last summer.
Almost any working PC has some value. I've sold nearly-20-year-old Intel Core 2 Duo systems, old 1 TB hard drives, and lots of other old components for about $10-20 each. My primary gaming PC is from 2011.
PC gaming is a small, single-digit percentage of the total volatile memory market. Neither Samsung, Micron, nor SK Hynix currently has any incentive to increase production to address the shortage and lower prices for that segment of the market. It's just not a money maker for them.
That doesn't mean that the price will never return to "normal". All three companies have new fabs being built in America and Europe right now, but they won't be online for a couple of years.
Also, if demand for AI chips stays sky-high for years and years, the memory being produced now will eventually be phased out for new standards (DDR6/7/8???), and the DDR5 from existing products will be stripped and resold by other companies.
Also, if demand continues to stay high, then new companies will enter the market to cut off a slice of that oligopoly pie and drive down prices. Supply and demand dictate that, all else being equal, prices fall as supply increases.
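To put a toy model behind that claim (every number below is invented; it's only meant to show the direction of the effect, not real memory pricing):

    /* Toy linear supply/demand model -- all coefficients are made up.
     * Demand: Qd = a - b*p.  Supply: Qs = c + d*p + shift, where
     * "shift" stands in for new capacity (fabs) coming online.
     * The market clears where Qd == Qs, so p* = (a - c - shift) / (b + d). */
    #include <stdio.h>

    static double clearing_price(double a, double b, double c,
                                 double d, double shift) {
        return (a - c - shift) / (b + d);
    }

    int main(void) {
        double a = 100, b = 2, c = 10, d = 3; /* invented numbers */
        printf("price today:           %.2f\n",
               clearing_price(a, b, c, d, 0));  /* 18.00 */
        printf("price with new supply: %.2f\n",
               clearing_price(a, b, c, d, 30)); /* 12.00 */
        return 0;
    }

Same demand curve, more supply, lower clearing price. Real memory pricing has long-term contracts and demand shocks layered on top, but the direction holds.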
Personally, I think the demand will drop off a cliff if/when the AI bubble pops, but we don't know when that will happen. Until then, everyone can enjoy their Steam backlog and wait it out :)
Windows NT 3.x was a true microkernel design. Microsoft ruined it, but the design was quite good, and the driver question was irrelevant until they sidestepped the HAL.
You're saying they improved the design. I know they added user-privilege device-driver support for USB (etc.); did they revert the display compromise/mess as well?
Mind you, I don't have access to Microsoft code, so this is all indirect, and a lot of this knowledge dates from when I was a fledgling developer.
The Windows NT code was engineered to be portable across many different architectures, not just x86, so it has a hardware abstraction layer. The kernel only ever communicated with the device-driver implementation through this abstraction layer, so the kernel code itself was isolated.
That doesn't mean the device drivers were running with user-land privileges, but it does mean the kernel code was quite stable and easy to reason about.
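A crude sketch of the idea, since it's easier to see in code (this is not actual NT code, obviously; as I said, I don't have access to it, and every name below is invented for illustration):

    /* Hypothetical HAL-style design: the kernel never touches hardware
     * directly, only a table of function pointers that each
     * architecture/driver implementation fills in. */
    #include <stdio.h>

    /* The abstraction layer: what the kernel needs from a display
     * device, expressed as an interface. */
    struct display_ops {
        void (*init)(void);
        void (*draw_pixel)(int x, int y, unsigned color);
    };

    /* One made-up implementation, say for a VGA-era card. */
    static void vga_init(void) { printf("vga: init\n"); }
    static void vga_draw_pixel(int x, int y, unsigned color) {
        printf("vga: pixel (%d,%d) = #%06x\n", x, y, color);
    }
    static const struct display_ops vga_ops = {
        .init       = vga_init,
        .draw_pixel = vga_draw_pixel,
    };

    /* Kernel-side code: portable, sees only the ops table, never the
     * hardware. Porting to a new machine means supplying a new table;
     * this function does not change. */
    static void kernel_boot(const struct display_ops *hal) {
        hal->init();
        hal->draw_pixel(10, 20, 0xff00ff);
    }

    int main(void) {
        kernel_boot(&vga_ops); /* another machine would pass a different table */
        return 0;
    }

Swap in a different ops table and the kernel side never changes; that's what made the kernel code so stable and easy to reason about.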
When Microsoft decided to compromise on this design, I remember senior engineers being abuzz about it when I first started my career, around Windows NT 4.0 (or apparently earlier?).
I was a fan, user, then developer from the DOS days (pre-Windows 3.0) to Windows 10 without a single gap.
When they threatened Windows 10 EOL last year (?), that's when I took a day to do a clean install of Mint and port my games and LLM tinkering over, because I knew MS was doubling down on the user-hostile experience.
I thought I'd miss Windows, but Steam, Wine, and Radeon made it delightful.
Windows is now only on my company-issued laptop. I predict that will also go away, as Windows 11 has introduced backdoors to circumvent company controls and install their BS.
This is no accident.
Edit: I’m not young, but I didn’t grow up with any sort of privilege.