
I’ll add that I don’t even know how to participate, nor would I likely be able to if I did (inconvenient times and dates).

This is no accident.

Edit: I’m not young, but I didn’t grow up with any sort of privilege.


> a place where a single misfortune—perhaps a car crash or an illness or the loss of a job—can lead to ruin.

Is it wrong?

The US strongly discourages strong social/family networks (moving out at 18), and families are being squeezed financially.

The government-run social safety nets are being starved.

Something has to change.


Sleep. Coffee. Focus music. Non-stimulant medication.

I hung on to my 15 (?) year-old Intel motherboard, CPU, and 16 GB of RAM, mostly because of e-waste guilt. I cannot believe this has value, but here we are.

I also wish I had built a new gaming rig last summer.


I'm in the same boat; I almost pulled the trigger 8 months ago on spending $4K on a powerful new system (with 128 GB of RAM)... boy, am I kicking myself now.

Almost any working PC has some value. I've sold nearly 20-year-old Intel Core 2 Duo systems, old 1 TB hard drives, and lots of other old components for about $10-20 each. My primary gaming PC is from 2011.

Prices will come back down. That’s the beauty of pc gaming.

PC gaming is a small single-digit percentage of the total volatile memory market. Neither Samsung, Micron, nor SK Hynix currently has any incentive to increase production to address the shortage and lower prices for that segment of the market. It’s just not a money maker for them.

That doesn't mean that the price will never return to "normal". All three companies have new fabs being built in America and Europe right now, but they won't be online for a couple of years.

Also, if demand for AI chips continues to be sky high for years and years, the memory that is being developed now will eventually be phased out for new standards (DDR6/7/8???), and the DDR5 from existing products will be stripped and resold by other companies.

Also, if demand continues to stay high, then new companies will enter the market to carve off a slice of that oligopoly pie and drive down prices. Supply and demand dictate that prices must come down if supply increases.

Personally, I think the demand will drop off a cliff if/when the AI bubble pops, but we don't know when that will happen. Until then, everyone can enjoy their Steam backlog and wait it out :)


Windows NT 3.x was a true microkernel. Microsoft ruined it, but the design was quite good, and the driver question was irrelevant, until they sidestepped the HAL.

The Linux kernel was and is a monstrosity.


This is outdated since Windows Vista, and even more so in Windows 11.

Windows Vista isn't Windows NT 3.x. In the internal versioning, it's not even 4.0.

Indeed, it is something better, Windows NT 6.0.

And it is irrelevant anyway, given that this comment was written from 10.0.26100.


Oh, I see.

You’re saying they improved the design. I know they added user-privilege device driver support for USB (etc.); did they revert the display compromise/mess as well?


Yes, graphics drivers are now mostly in userspace, with only a tiny miniport driver remaining in kernel space.

Hence graphics drivers usually no longer crash Windows; after a brief black-screen pause, everything continues as usual.

https://learn.microsoft.com/en-us/windows-hardware/drivers/d...


What do you mean by them sidestepping the HAL?

I think the biggest one is that the whole GDI library was moved into the Kernel in 3.5x because the performance was terrible at the time.

I don't think they ever intended to keep all drivers strictly userland, though. Just the service side.


Mind you, I don't have access to Microsoft code, so this is all indirect, and a lot of this knowledge is from when I was a fledgling developer.

The Windows NT code was engineered to be portable across many different architectures--not just x86--so it has a hardware abstraction layer. The kernel only ever communicated with the device-driver implementation through this abstraction layer, so the kernel code itself was isolated.

That doesn't mean the device drivers were running with user-land privileges, but it does mean that the kernel code was quite stable and easy to reason about.
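
To illustrate the general pattern (a minimal sketch in C, not Microsoft's actual HAL interface; all the names here are made up): the kernel only ever calls through a fixed table of functions, and each architecture supplies its own implementation behind that table.

    #include <stdio.h>

    /* Hypothetical abstraction layer, loosely in the spirit of a HAL
       (illustrative only; not the real NT interface). */
    struct hal_ops {
        void (*write_port)(unsigned short port, unsigned char value);
        void (*mask_interrupt)(int irq);
    };

    /* One implementation per architecture; here a stub standing in for x86. */
    static void x86_write_port(unsigned short port, unsigned char value) {
        printf("outb(0x%x, 0x%x)\n", port, value);
    }
    static void x86_mask_interrupt(int irq) {
        printf("mask irq %d\n", irq);
    }
    static const struct hal_ops x86_hal = { x86_write_port, x86_mask_interrupt };

    /* "Kernel" code stays portable: it only ever talks to the hal_ops table,
       never to ports or interrupt controllers directly. */
    static void disable_device(const struct hal_ops *hal, int irq, unsigned short ctrl_port) {
        hal->mask_interrupt(irq);
        hal->write_port(ctrl_port, 0x00);
    }

    int main(void) {
        disable_device(&x86_hal, 5, 0x3f8);
        return 0;
    }

Swap in a different hal_ops table for another architecture and the "kernel" code above never changes, which is the portability property being described.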

When Microsoft decided to compromise on this design, I remember senior engineers--when I first started my career--being abuzz about it for Windows NT 4.0 (or apparently earlier?).


The PC as a growth segment was at an end, but that’s because Microsoft cornered the market.

It was good enough; they just needed to make security fixes and tweaks, and I still would have paid for it!

Yet, the leaders at Microsoft found a way to lose their marketshare.

tl;dr: it just needed to stay quiet, boring, and reliable to remain a cash cow.


I was a fan, user, then developer from the DOS days (pre-Windows 3.0) to Windows 10, without a single gap.

When they threatened Windows 10 EOL last year (?), that’s when I took a day to do a clean install of Mint and port my games and LLM tinkering over.

Because I knew MS was doubling-down on the user-hostile experience.

I thought I’d miss Windows but Steam, Wine, and Radeon made it delightful.

Windows is now only on my company-issued laptop. I predict that will also go away, as Windows 11 has introduced backdoors to circumvent company controls and install their BS.


100% leadership inflicted.

Meanwhile, they keep downsizing their workforce while not making any personal sacrifices (their own pay) as they chase AI.


I don’t see any problems with the quote.


