This isn't universally wrong, but it is dead-wrong with present technology. And covering our eyes and pretending that we are currently that advanced doesn't make it so.
Because while basic computation is a universal commodity, what is implemented on top of it certainly isn't - a piece of software always functions as someone's agent. When the systems you end up relying on are entirely defined by someone else, the only thing that represents your will is your mind, and it is effectively executing a complex and ill-defined protocol against always-diligent computers.
You've done the computational equivalent of declining a lawyer.
It doesn't seem like a big deal, since you're only compromising a little at any given time. But the software is always changing in ways that benefit its controllers, while your expectations are mostly based on the capabilities they've presented. So the progress you perceive is entirely within their desired paradigm. Features that would benefit you at the expense of Google/Apple/etc are never explored, because you aren't the user of their software - you're its working set!
I can forgive the old-timers who were conditioned by broadcast media to see the world in hierarchical, take-it-or-leave-it terms and don't understand what they're losing by sharecropping in walled gardens. And I can mostly forgive the unclued herd that just buys whatever is advertised.
But for everybody who knows the power of personal computers yet pretends webapps and locked appliances are actual progress, either out of personal laziness, cognitive dissonance, or longstanding need for social acceptance: shame on you for abandoning that self-determination you tasted the first time you truly experienced computing.
Good thoughts here. I'm optimistic because of things like the Raspberry Pi and Arduino becoming popular, but at the same time they feel like this generation's version of building radio kits and model airplanes. Ultimately it's the data rather than the computation that's important.
Yeah exactly - it's not that general-purpose CPUs will be outlawed like we worried about in the 90s; it's that the generally popular ways of using technology won't exercise their capabilities.
It's not just the data itself; it's really the protocols used to access the data. Protocols mediate between parties, and by choosing to download a binary blob simply to check your email, you've given up any true bargaining power in that exchange. You still have some autonomy by hacking the blob (userscript injection, etc), but you're only building on unstable ground.
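To make the bargaining-power point concrete, here's a toy sketch (the protocol, port handling, and "COUNT" command are all invented for illustration): when a service is reachable through a documented wire protocol, anyone can write an independent client that speaks it, rather than being stuck with whatever blob the operator ships.

```python
import socket
import threading

def serve(listener: socket.socket) -> None:
    """A stand-in server speaking a hypothetical one-line text protocol:
    the client sends "COUNT\n" and the server replies with a decimal number."""
    conn, _ = listener.accept()
    with conn:
        request = conn.makefile("r").readline().strip()
        if request == "COUNT":
            conn.sendall(b"3\n")

def count_via_protocol(port: int) -> int:
    """An independent client: because the wire format is documented,
    it needs nothing from the server's operator except the spec."""
    with socket.create_connection(("127.0.0.1", port)) as sock:
        sock.sendall(b"COUNT\n")
        return int(sock.makefile("r").readline())

# Wire the two together over a local socket for demonstration.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
port = listener.getsockname()[1]
t = threading.Thread(target=serve, args=(listener,))
t.start()
result = count_via_protocol(port)
t.join()
listener.close()
print(result)
```

The same structure is why open protocols like IMAP or RSS preserve user leverage: any conforming client is a full peer in the exchange, while a proprietary blob leaves you renegotiating from scratch every time it updates.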