Why would demand imply costs will be reduced? If you're making an economies-of-scale argument, there's plenty of scale right now, and costs don't seem to be trending down.
Well, sure. Maybe you're the kid in the article who opened Xcode and Blender and Final Cut, but it didn't click for you. Of course not everything is for everyone, but it doesn't prove exploring the limits like that is a bad thing.
Why do you think they're fewer and farther between? It's almost certainly at least partially because of Chromebooks and the save-the-user-from-themself design philosophy.
I would say it's primarily because their phone (android or iphone) does nearly everything they would ever want a "computer" to do, and so they never had a hook into real curiosity about computing. This will probably be exactly the same whether the school supplies a Macbook Neo or a Chromebook.
Hackers tinkered with other things (e.g. the telephone) before pervasive computing.
I imagine the young hackers of today just find other things with which to tinker.
I've seen kids build some amazing things in Minecraft. Is this really all that different from modifying the source to a game you copied-by-hand from BYTE magazine?
I'm a computer nerd who started on DOS on (IIRC) a 286, moved through all the Windowses (OK, not quite all: my first job exposed me to a couple of NT versions and 2K, on top of nearly all the home versions I upgraded through on my own), and various Linuces (Gentoo as my primary OS, even, for like... five years, LOL).
These days there are two things I use my "real" non-server computers for in my personal life:
1) Media piracy, and only because I've been too lazy to set up a headless torrent system on my media server with a VPN connection. (Why does the media server exist? Again, purely for piracy; remove that use case and it'd be gone as a waste of electricity and space the very next day.) I could fix this in a Saturday and remove this entire use case, improving my process significantly at the same time; I just haven't yet.
2) Video games. I'm really hoping that new Steam console doesn't end up being sky-high expensive, because I'm excited about "consolizing" this use case and ditching my last "real" PC tower (other than one server that I go months and months without directly messing with). That tower is already running Bazzite to make it Steam Deck-like, but it's janky as fuck, because it's Linux on frankensteined hardware, so of course it's janky as fuck. I'm still eager to swap it for something designed, and supported, for that actual use case.
Everything actually-important happens on iOS, and usually on a phone.
In my personal life, I've concluded that I just have no idea what more-useful-than-the-time-it-takes stuff I'd even do with a "real" computer, despite spending absolute fuckloads of my time screwing around with them from ages like 7-30. What I learned in that time was "how to computer" to a pretty advanced level, and luckily that per se has paid the bills and then some, but in all that time the actually-directly-useful-to-me stuff I've done with it has amounted to very little.
To me, personally, PCs have turned out to be a solution looking for a problem, and so rarely does my trying to bridge that gap result in an actual net benefit (usually, not even close) that I've mostly stopped trying. That impulse got me where I am, and I can't complain: that "wasted" time over literal decades gained me a bunch of skills that are (it turns out) almost entirely useless to me personally but that others are willing to pay for. Great. But I find myself a computer nerd who doesn't actually know WTF to do with a "real" computer. I just use my damn phone.
I suppose I'd still find them a lot more useful if iPads and iPhones didn't exist, since then I'd need to do banking and reading and such on some other computery device. But those do exist, so... yeah, not a lot of motivation to even own a "real" PC anymore, MacBooks included. I doubt I'll replace my M1 Air when it finally dies.
To sum up, as a lifelong computer dork, I don't even know why I need a "real" computer any more, let alone why anyone else does.
You can't even see this whole comment on an iPhone, let alone the immediate contextual thread that is necessary for framing it.
That sounds flippant, but it's the visceral reaction I have to trying to do everything on iOS (iPads are a little better than iPhone in this regard, but still have the "everything through a keyhole" feeling).
That said, tiny viewports are not really the main problem, since obviously a modern iPhone has far more resolution than almost any monitor available in 1990, or even 2000. It's more that most exploration and creation is not doable on an iPhone, by design.
My original point, though, was that seeing new grad software developers who have never had any non-externally-directed interaction with a computer made me realize why we have to show them how to use the shell in a terminal, and why they seemed to have no particular curiosity about any software thing not directly related to the task that they're doing.

For new grads who aren't curious, finding that something they're doing doesn't work due to architectural issues or some nonobvious combination of bugs prompts neither asking for help nor a deep dive into what the problem could be. Asking for help is basically cheating, to them, and they've never before encountered real problems that weren't explained in the text, handled by their group partner, or trivially stack-exchangeable (remember, this realization was 2019, and therefore before common LLM usage). At standup after standup, they report making progress on working through the moderately complex ticket they picked up, but in fact, nothing useful is happening. Sometimes people like this can be shown the way, and then become functioning and capable developers and troubleshooters. Sometimes not.
Apple tends to expect a lot from its developers. Changing processor architectures is a big one. Unlike other platforms, Apple cuts support and moves on. If an app is abandoned, it starts to show sooner on Apple platforms than anywhere else.
If it’s something that will become critical to a user’s workflow, that can be a big problem. Why invest in an app without a future?
I prefer the perspective that a computer program is akin to a mathematical constant. This was true in the old days. A program I wrote in C64 BASIC, way back in the 1980s, should still work precisely the same today (even on one of the freshly-manufactured Commodore 64 Ultimates).
You've homed right in on what's changed since the old days: platform vendors (such as Apple) now continuously inject instability into everything.
You might argue that developments such as "changing processor architectures" justify such breaks from stability (though I myself have qualms with one of the richest companies in the world having a general policy of "cutting support and moving on"). But I would point out that Apple (and other vendors) create instability far beyond such potentially-justifiable examples.
To me, it appears as if Apple actively works to foster this modern "software is never finished" culture. This is seen perhaps most clearly in the way they remove apps from their iOS store if they haven't been updated recently (in as little as 2 years!): https://daringfireball.net/linked/2022/04/27/apple-older-gam...
Shouldn't we be demanding stability from our platforms? Isn't the concept of "finished software" a beautiful one? Imagine if you could write a useful program once, leave it alone for 40 years, and then come back to it, and find that it's still just as useful. Isn't this one of the most valuable potential benefits of software as a premise? Are the things we're trading this for worth it?
I keep looking for more useful DOS applications. I've had a git repo for 15+ years that holds my config and my virtual C:. Once something is installed, it just keeps working, and I sync that repo to all my machines so they all have the exact same things running with the exact same versions. Binaries from 40 years ago still run fine. I just wish more modern applications supported it.
I also have QEMU installations of some Windows versions and old Debian versions (I like Debian 3.0, since the entire library of packages fits neatly on a single DVD ISO... that was the last release that small). Those are also useful for having stable platforms to run offline applications on without having to be bothered by upgrades.
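A minimal sketch of the repo-synced setup described above, assuming DOSBox as the emulator and a hypothetical repo layout (`~/dos-env` containing a `drive_c/` directory tracked in git); the paths and names here are illustrative, not the commenter's actual config:

```shell
#!/bin/sh
# Hypothetical layout: one git repo holding the emulator config plus the
# virtual C: drive, cloned identically to every machine.
REPO_DIR="$HOME/dos-env"      # illustrative path to the synced repo
DRIVE_C="$REPO_DIR/drive_c"   # the "virtual C:" directory tracked in git

# Build the DOSBox invocation: mount the repo's drive_c as C: and switch to it.
dosbox_cmd() {
    printf 'dosbox -c "mount c %s" -c "c:"\n' "$1"
}

# Typical use: pull the repo first (same binaries, same versions everywhere),
# then launch the emulator against the freshly synced drive:
#   git -C "$REPO_DIR" pull --ff-only
#   eval "$(dosbox_cmd "$DRIVE_C")"
dosbox_cmd "$DRIVE_C"
```

Because every installed DOS program lives inside `drive_c/`, committing and pulling the repo is the entire "deployment" story; the same idea works with a QEMU disk image in place of the DOSBox directory mount.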
So what's the alternative? Should we go back to reading encyclopedias from the 2010s? I ask this because the need for information hasn't decreased for human beings, just because the capability to produce slop has suddenly increased.
>> I ask this because the need for information hasn't decreased for human beings, just because the capability to produce slop has suddenly increased.
Isn't that the complaint to which you're responding? The SUPPLY side of the equation is the problem, so reading encyclopedias wouldn't impact that. Funnily enough, the criticism of Wikipedia was that a bunch of amateurs couldn't beat the quality of a small group of experts curating a controlled collection, and we saw that wasn't true. Maybe AI has pushed this to a new level where we need to tighten access and attention once again?