Don’t optimize, don’t recommend. Chronological timeline of only things you directly subscribed to. If you want to go down a rabbit hole, you should have to find it on your own.
My inner Libertarian is screaming, but I would support a "choose your algorithm" law, under which social media platforms would be required to give users a clear option to opt out of algorithmic recommendations. Perhaps even requiring a basic chronological feed.
I know the actual text of the bill would require a lot of finagling and legalese, but that's the concept of what I want.
Your inner Libertarian has the same problem as every other person with that ideology: you are vastly overestimating the knowledge and capabilities of the average user, and the ideas that result will fail in practice. Algorithmic content is a market failure being subsidized by the fact that it helps sell ads. The capitalist greedy function means they're going to keep pushing algo content, because it's the money-making engine. If you ban the combination of algorithmic content and targeted advertising, the damage would be vastly reduced (also Meta would implode, because their business is a market failure that only exists because regulators are not enforcing the competition laws that would prevent free services subsidized by ads, and other antitrust laws).
> but the requirement to provide non algorithmic alternatives for content feeds.
"Non-algorithmic" means "some human curates the content feed by hand". That approach is also not free of bias, and it's what news portals on the web did in the past.
Yeah, I understood what you said, I just would be against it in general unless advertising is strictly decoupled from it. Even then, I'm not sure there is a huge benefit to it.
There used to be a news app called Pulse that LinkedIn bought and killed. To me, that was the best possible execution of the feed concept. It was essentially RSS with a scrollable feed component. You could browse feeds by category, but it wasn't recommending things based on current interests. If you wanted to go down a right- or left-wing news hole it was possible, but you had to consciously seek it out and go down it. The app wasn't breadcrumbing you.
except recommendation platforms are more popular all the time. ppl will choose the algo feed bc they like it more. idk why but everybody is taking america off the hook for wanting toxic shit and gravitating to it. its at least half a demand side problem.
if u want an example: tiktok has the for you page (algo discovery) and the following page (just ppl u follow). its the fyp where ppl spend time.
I can't respond to the child comment, so I'll reply here. The "type everything out" must be a generational/cultural thing, because when I read your comment I literally did not notice you used words like "ppl". I automatically expanded it to "people" when reading it.
If that's true, why is twitter making it so difficult (and more difficult over time) to see only content you follow, and Facebook making it impossible?
The answer to that question has nothing to do with my objection. I was simply pointing out the irrelevance of the "original intent" of these algorithms, given that we have known for a while what they do. Focusing on original intent downplays their effects, which can fairly be considered their intent nowadays, since the effects have long been known.
Say you hit a button, and each time you hit it you gain a million dollars. After you hit that button once or twice, you discover that it also causes great harm to other people. If you continue to hit the button, your original intent (when you didn't know its effects) is no longer relevant to whether or not you should continue hitting it, since you would now knowingly be harming people for your own profit. The side effect is no longer a side effect. It's just an effect.
To answer your question anyway, perhaps we should consider the well-being of users and bake that into our algorithms. It might not be as profitable, but there's a reason we regulate companies: having some restrictions on profitability can often be a net benefit to society.
Who gets to define what “toxic content” is? Is the Great Barrington Declaration toxic? How about studies that show masks might not work? How about public sourced data showing the IFR of Covid was nowhere near what “the experts” modeled early on?
The owners of the space can make those decisions, and the users can provide feedback, just like it happens now. There's no static status quo where everybody's happy, just like there's no static society.