Hacker News

What I find worse is the lack of analog output support with DC. One of my monitors only has a VGA input and I have no intention of replacing it.

However, the new Display Core does not support analog output, so that monitor will stay dark.

Fortunately amdgpu.dc=0 is still an option, but I dread the day when this code path is removed (or bit rots away).
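For anyone in the same boat, here is a sketch of how that fallback is typically wired up via the kernel command line. Paths and commands assume a GRUB-based distro; adjust for your bootloader:

```shell
# /etc/default/grub -- append amdgpu.dc=0 so the amdgpu driver skips the
# new Display Core and falls back to the legacy modesetting path, which
# still drives the on-board DAC / VGA output.
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash amdgpu.dc=0"

# Regenerate the GRUB config and reboot:
#   sudo update-grub                                 # Debian/Ubuntu
#   sudo grub2-mkconfig -o /boot/grub2/grub.cfg      # Fedora/openSUSE

# After rebooting, verify which path is active:
#   cat /sys/module/amdgpu/parameters/dc             # 0 = DC disabled
```

The `dc` module parameter is documented in the upstream amdgpu driver, but distro defaults and file locations vary, so treat the above as a template rather than exact instructions.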



Is it a really good VGA monitor or something?

You can buy relatively inexpensive active DVI/HDMI to VGA converters if that helps.


> Is it a really good VGA monitor or something?

Not at all. It's an old 19" with a resolution of 1280x1024, you can get those pretty much for free these days.

But hey, n + 1 screens > n screens :D

I usually just have my browser on it in full screen. And because of the resolution I don't have to worry about overly long lines on websites that aren't restricted in width (like Hacker News).

At work I picked up an old 19" TFT from the trash that connects neatly to the unused VGA port of the docking station. It has abysmal ghosting issues, so it's no use for the browser, but still good enough for a terminal.

With modern graphics cards supporting many outputs and monitors that you can literally get for free, you can have the luxury of dedicating a display to a single task or application, so that's pretty cool.


Buying an LCD monitor with no digital input was always a bad way to cut costs, but you can buy active DVI-to-VGA or DisplayPort-to-VGA adapters to keep these devices working with new hardware.

Moving rarely-used DAC hardware to an external dongle seems like a good design decision, though it makes more sense for driving a CRT, where the conversion is D->A rather than D->A->D->A. A CRT can be better than a modern LCD in some ways, whereas VGA-input LCDs are simply obsolete.


I have an early Dell IPS from circa 2005 which is 1280x1024 19" and it supports DVI input, fortunately.



