Custom emoji have existed in the Matrix ecosystem for quite some time now, thanks to an unofficial im.ponies.user_emotes extension to the spec. Unfortunately, Element only supports displaying these emoji, not sending them. But various other clients do support them: Nheko, FluffyChat, NeoChat, ...
SchildiChat now supports rendering them, but not uploading them or reacting with them. It's a step forward. It can also embed videos from certain links.
I hate this, but you MUST enable 2FA the way YOU want, or else Google will opt you into "2FA" you never consented to on some app you don't remember installing.
It's maybe hugely insecure, but I enable Google Authenticator and then put that recovery code and key everywhere I can.
After reading this comment I tried to disable the 2FA via phone apps that I never asked for. Curiously, I'm not even signed in to the app: I'm signed in to Google Calendar, but I get the prompts in the Gmail app, where I'm signed in to my work account only.
Anyway, it's not possible to configure 2FA in the way we want. The Google prompts configuration says "To turn off Google prompts on a device, sign out of your Google Account on that device."
There's no way to enforce the Authenticator, or even make it the default.
Of course many distros will just tell the upstream to pound sand, and more power to them. When I'm perfectly happy running the version that shipped with my OS, I DON'T want your updates. Just leave me alone.
He wanted to be left alone by the army-of-people-who-are-not-you that kept reporting long-fixed bugs because they didn't know they were living in an ancient version of Debian.
For a distro that blithely patched the entropy out of OpenSSL (in order to quiet a valgrind warning), you'd think they could figure out how to write a script that substitutes a Debian maintainer's email addy for the author's.
It is the same old problem and what's interesting to me is that I never see it articulated very well. I suppose it's because people disagree about what the problem is.
To me it's this: Linux and its distribution mechanism are skewed towards servers and IT, since that's where most of its installed base is.
That's great. The distro/repo system is great, and works well. In addition to it, we need another system that serves workstations better. They don't have the same requirements in general. In particular there's a big divergence in availability, and security requirements. I really doubt there's enough economic incentive to serve this second use-case, but it's too bad, because I think we could have the best of both worlds.
> To me it's this: Linux and its distribution mechanism are skewed towards servers and IT, since that's where most of its installed base is.
It's not just servers—not having a very capable, standard, base set of packages to rely on for a workstation-targeting release (say, of GUI programs) is a huge problem. Instead, distros differ wildly on what they provide, users may have different versions of packages or even entirely different programs or libraries (the entire window server may differ!) serving similar purposes, some libs may simply be absent, et c. This is kinda OK if you stick to running software your distro bundles and only at the official version for your release of the distro, but quickly becomes hell (for the people packaging your programs, if not for you) as soon as you step outside that.
This is why you see a lot of companies that support Linux for their commercial software being very specific about supporting e.g. only one or two distros, at some very limited set of versions. It's very hard to support "Linux" in general, especially for desktop-targeting software, because the Linux GUI and multimedia stacks are... well, they're a shitshow, frankly.
There are a ton more ways that very basic APIs and services can present to the user and to their programs, meaning far more ways for things to break. Way more possible combinations of not just version, but even which library or program is providing some capability.
The closest thing to a solution is picking one of the two big desktop environments and targeting that to give you some amount of consistency and stability, but it's not like people will only run your program in that DE (again, possibly not even in the same window server you developed on) so you may end up with bugs when e.g. your Qt/KDE program runs under Enlightenment, plus a ton of basic stuff can still vary a lot even if you restrict support to one DE (think: audio daemon) which may affect all kinds of things in unexpected ways.
> GUI-heavy applications like Chrome, Firefox or OpenOffice run on all distros without any hiccup!
Ever seen a thread full of people exchanging advice on how to get these programs not to exhibit certain widespread glitches on Linux, many of which have been a problem for years? Thinking especially of things like tearing, or multimedia playback problems. They even crop up here on HN from time to time.
Even for servers, debian-style package management is increasingly inadequate. Linux package management was designed for the C era of a small number of large libraries with infrequent releases. Distribution maintainers are unable or unwilling to adapt their processes to support the model that most modern software ecosystems are converging on: large numbers of small libraries with frequent releases. (Frankly, they're often hostile to the idea that they should support it, despite this model being preferred by both developers and users). Yet they're also hostile to the idea of getting out of the way and letting ecosystem-specific package managers handle ecosystems that need more frequent releases.
In the era of servers that were hand-tended by a BOFH who upgraded only rarely (and screw the users who want to use something newer), the distro/repo system worked. But even servers are not generally maintained that way anymore. I honestly think we're going to see that model fade away in the next decade or so.
Just because parts of the ecosystem are constantly cranking out small changes doesn’t mean my employer wants my team to spend time alpha testing all of them at the risk of downtime.
I’m fine with an author saying “I no longer want bug reports from that version,” and I’d like someone with the distro to act on those instead.
Most effective employers I've worked for have found that staying close to upstream's rolling releases saved more time and effort in the long run compared to leaving a bigger gap between upgrades (which, sure, saves time if nothing breaks, but the worst case becomes a lot worse).
Well, run something rolling such as Arch Linux or Debian Unstable on the desktop, and an LTS distribution such as Ubuntu LTS or Debian Stable on servers. If you're not a complete beginner, there are many benefits to running a distribution that tracks upstream closely, so you can contribute and get fixes almost immediately.
If you need local development/production environment version parity, use container images or package locking files available in most programming language ecosystems.
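As a sketch of both options (the tool invocations are standard, but the digest below is a placeholder, not a real one):

```shell
# Option 1: language-level lock file (Python shown as an example).
# Record the exact dependency versions the dev machine resolved...
pip freeze > requirements.txt
# ...and reproduce exactly those versions in another environment:
pip install -r requirements.txt

# Option 2: freeze the whole userland in a container image, pinned by
# digest rather than a tag, so "latest" can't drift underneath you:
docker pull debian@sha256:0000000000000000000000000000000000000000000000000000000000000000
```

Pinning by digest rather than tag is the part people usually skip: a tag like `debian:stable` is mutable, so two pulls weeks apart can give you different userlands.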
Indeed, agree. I'm on a rolling distro (Manjaro) and I'm pretty happy. I see a lot of discontent in Linux land around this conflict though, so I thought I would try to lay it out. For example, a lot of people don't like the practice of curl'ing shell scripts onto their boxes, which many developer tools like Rust and Deno use for installing their tooling. The more server-oriented folks see it as borderline crazy, while many developers are totally fine with it. Clearly to me they're coming at the issue from different points of view, and it would be nice to fully support both communities.
> In particular there's a big divergence in availability, and security requirements.
I strongly disagree about the difference in security requirements. I don't want anybody to pwn my bank's servers, any less or more than pwning my home desktop or mobile phone.
That's totally understandable. I did not mean to imply that workstations have no security requirements, just different ones. I feel pretty much the same about the host OS on my workstation, but I also spin up lots of temporary environments that I need to abuse badly. They are thrown away again, usually within hours or days.
I guess it's a reminder that most distros still don't have a way to report a bug to their package maintainers that is easier than reporting it upstream.
Debian is almost there, but it's been stuck almost there for a long time already.
I have reported maybe 20 bugs to distros, and I don't remember a single one receiving any attention before they auto-close for being old. Reporting to upstream almost always gets some kind of response and usually a resolution. Fedora was packaging a 5-year-old version of LMMS because for some reason upstream decided to label the last 5 years of updates as RC versions, despite them being vastly more stable and feature-filled than the last "stable". I reported a few of the bugs found in this old LMMS on the Red Hat bug tracker and no one responded. Upstream had fixed these long ago.
These days I think distros should ship only the core OS components and let tools like Flatpak give you user-level apps direct from the source. I largely don't care about having the absolute latest version of GNOME or systemd. But I very much do care that all of my applications, especially internet-connected ones, are. I stopped using the distro-packaged Telegram binaries because it would take a month before they got updated, and in that time I'd have to use my phone to view all the unsupported messages people sent.
I had used Silverblue for a little over a year and I think it’s certainly the future of Linux distros. I had a fair bit of difficulty but none of it was intrinsic to the tech and more just 3rd party packages that didn’t quite work properly. I really loved “toolbox” though.
I wonder if the issue could be solved by upstream. Spin up a new email address for each release -- xscreensaver-1.0@some_domain.org, then once they've moved on to version 1.1, set up an auto response and stop checking the email address.
I mean... for the most part, these are issues that the upstream maintainers should be fixing. For example, the font rendering in Bottles broke when they updated to GTK4. Instead of fixing this in the actual GTK4 codebase, the GNOME foundation recommended that everyone start using Flatpak, where they could apply a very specific system patch that works around this issue.
This is the problem with encouraging this "use our kitchen sink" behavior. It encourages poor development practices, and it ends with developers grovelling and asking users to switch their packaging systems. Imagine if you tried installing an app on Windows or macOS, and they demanded that you install a separate package manager along with it. It's an unacceptable demand to make of anyone, and certainly shouldn't be the behavior we encourage if we want to live in a world of high-quality Free Software.
> Imagine if you tried installing an app on Windows or macOS, and they demanded that you install a separate package manager along with it
This is basically how Windows works. There's no package manager, so effectively anyone shipping larger software ships their own auto updater, their own dependencies and either embeds or downloads extra installers for the MS redistributables at install time. They don't ask you, just effectively do it anyway.
Windows did get a package manager with dependency support, it’s just that many developers are still shipping custom installer programs that ignore it and almost all the users are still willing to run those installers.
Winget? That only started existing recently. Systems that are still supported don't include it. I'd wait a couple years before devs even start seriously considering it.
I'm thinking of Windows Installer (the MSI engine and database). I haven’t been in that world for a while, but it looks like winget is a frontend to a repo of MSIs or custom installers or whatever the store uses.
MSI doesn't do dependencies though, does it? You can launch one installer from another, for example if you want to install the .NET Framework with the app, but they're completely unrelated afterwards. There's no built-in mechanism for updating them either.
If the upstream maintainers should fix it, the distro maintainer is quite capable of forwarding the bug there.
The upstream developers here are complaining because they get the bug reports, but can't fix the issue (at least the way they want to).
That's independent of those few large groups that popped up in the FOSS community and push a lot of badly maintained software. Yes, those are a problem too, just a different one.
Yes. In the past users did start by reporting bugs to their distro, and the distro package maintainer then forwarded it to upstream if necessary. But lately users have become more savvy about talking to upstream directly. Probably because upstream source control has become more standardized and well-known (GitHub, etc) and easier to work with (don't have to futz with mailing lists and etiquette, just click the "New Issue" button).
It's not even specific to Bottles or Flatpaks or whatever; it happens with regular Linux software too. In the systemd tracker you'll find people complaining about issues that have been fixed in systemd's git repo, but because the people are running distros with older versions they think the issue hasn't been fixed yet.
It's no wonder people have started to report problems upstream, because downstream bug reports rarely ever get picked up. If they do, it's with 10 comments of "okay, try it with this" that don't work, and then nothing until a bot closes the bug because a new major version came out and therefore all bugs are now no longer relevant.
Most error messages I've googled have led me to unresolved Linux bugs in various distros. Fedora users seem especially good at reporting bugs to their distro maintainers, though this does not always result in any kind of solution.
In my opinion, every distro should be allowed to ship their version of a package, but the moment packages get frozen (e.g. for LTS distros) or custom patches get added (e.g. Debian), all contact links to upstream developers should be removed immediately and replaced with the email address of the maintainer of the package. This should hopefully prevent the jwz problem while at the same time bringing the users of these distros the stable release cycle they want.
That's also fine if you specify your distro name and use your distro specific bug tracker to report your issues, unless you're pulling in the official distribution of upstream software through something like Flatpak that bypasses your system.
Even then I've found that many Flatpaks are broken on Ubuntu that work fine on Manjaro. Weird distro configurations really are a terrible burden on developers, this stuff makes me never want to publish software that gets absorbed into distros.
Mozilla was right to force Debian to rename their modified version of their browser. More packages should use such policies in my opinion, especially if they're complex to set up right like this piece of software.
Uh, wouldn't you, as the user, rather have the power to determine for yourself which version you'd like to stop upgrading at? That certainly sounds a lot better than letting a bunch of unpaid third party volunteers determine that for you, which is how the distro/repo model works.
> Uh, wouldn't you, as the user, rather have the power to determine for yourself which version you'd like to stop upgrading at?
Given that I'm not using Linux From Scratch, I'd say no: part of the reason I'm choosing a distribution is because I want to make somebody else deal with tracking updates (including security). I recognize that this comes with downsides (e.g. sometimes new versions have new bugs).
I kind of miss the pre-internet times when shipped software was, well, shipped and static, and typically bundled all of its dependencies beyond the OS (which was just listed on the box). On the other hand, I'm typing this on a smartphone that couldn't exist in that model…
You do have the power. You can compile from source to any version you want. The caveat is that it breaks the primary reason most people use distributions—to get a set of packages that are consistent with each other.
Consistency is a non-issue for me. The reason I don't do it is that neither the update nor the uninstall experience is standardized and good.
I guess what I would really want is: a) manually built packages install into /opt and use a mechanism like update-alternatives to get things into PATH or wherever they need to be; b) the possibility to track an HTTPS endpoint for information about new releases. Could be something as simple as a text file of all versions, with the URL of the tarball and a flag for whether the version has known security issues.
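A minimal sketch of what (a) could look like today with stock Debian tooling, assuming a hypothetical package "foo" built from source (the paths and priority value are invented, and the commands need root):

```shell
# Install the manually built version under /opt, out of the package
# manager's way:
sudo make install PREFIX=/opt/foo-1.2

# Register it as an alternative for the "foo" command; the final
# argument is the priority used when the alternative is in auto mode:
sudo update-alternatives --install /usr/local/bin/foo foo /opt/foo-1.2/bin/foo 20

# Later, switch between the distro build and the /opt build interactively:
sudo update-alternatives --config foo
```

Uninstalling then becomes `update-alternatives --remove foo /opt/foo-1.2/bin/foo` plus deleting the /opt directory, which is at least predictable, even if it's not the standardized experience the comment is asking for.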
No, that is not a relevant reason. There are people who have pinned their Windows versions, there are people who bypass Steam's autoupdates to run old versions of games, this is a general computing problem not a Linux problem. If you want to be particular about the version of something you're running you're not going to be able to rely on systems which were designed to remove that consideration from you, period. Nobody has ever promised that Linux or any other OS would just make all versions of everything work together all the time and you have the total freedom to pick anything with no consequences.
This is the opposite of a YotLD problem: it's because Linux users do not usually participate in automatic updates that they consider this a problem. Windows, macOS, Android and iOS all update automatically, and nudge users into updating even more frequently than that.
No, the point is that distro/repo model has a terrible user experience for installing applications. Sure, it works fine for the very narrow case of only wanting to install exactly what is in the repo, but the second you step outside of that everything gets needlessly complicated.
I, for one, am grateful that things like Flatpak and AppImage are finally gaining traction, and I hope the trend continues.
I struggle with this question as well, but a small nit here: the folks at Fedora or Debian are not third party. They are a trusted source for me.
I don't know what would be a good solution. Being available on Flathub is a good start, but I'd argue it is not enough. I'm going to say the proper solution is the same one that I advocate Google Play and the Apple App Store follow:
1. require developers to submit source code and machine readable build instructions
2. the store should build the application (fat binaries, differential small updates, whatever, the app store is in charge)
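For what "machine-readable build instructions" could look like, a flatpak-builder manifest is probably the closest existing example. This is a hedged sketch with an invented app id, URL and tag, not a real Flathub submission:

```yaml
app-id: org.example.MyApp
runtime: org.freedesktop.Platform
runtime-version: '23.08'
sdk: org.freedesktop.Sdk
command: myapp
modules:
  - name: myapp
    buildsystem: meson
    sources:
      - type: git
        url: https://example.org/myapp.git
        tag: v1.0.0
```

Flathub already builds from manifests like this on its own infrastructure rather than accepting developer-built binaries, which is essentially point 2.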
In general, I trust the folks running the distro more than the folks writing the software in the first place. If nothing else, they provide/enforce a second pair of eyes to sanity check things before they get shipped.
The distro also gets to choose a graph of known-good non-conflicting dependencies each time they cut a stable release. If the original author is catering to users who don’t use package databases, he won’t know what they may have available.
License is not the same as trademark, and most licenses do not grant trademark. You can distribute but you have to do it under a different name or brand (should upstream choose to enforce). All the people downvoting need to take a class on how IP laws work.
It should be noted that the Tivoization clause in GPLv3 is pretty narrow. It only covers software that you acquire in the same transaction in which you acquire the locked down hardware.
> For embedded systems (like a TiVo or a car) there's no other software being installed.
For now anyway. I would be surprised if cars do not end up with apps. The car will come with the base software installed and then you can later buy apps for your car from an app store run by the car maker.
It also applies only to a "User Product", which the license defines as "either (1) a 'consumer product', which means any tangible personal property which is normally used for personal, family, or household purposes, or (2) anything designed or sold for incorporation into a dwelling."
That would be an odd reason, because Apple could comply easily. The source, and the development tools to translate it into a binary, are readily available. And you can still install software that you built yourself without a hassle, last I checked.
These comments are irritating but so true. WTF! I waited for many seconds for the page to load, and now I have to wait 3 or 4 seconds every time I try and scroll.
Maybe this doesn't reflect on the company, but I feel if you can't even create a simple web page, then what the hell else are you going to struggle with along the journey?
It demos well when an agency shows it to a decision maker. Back in the day I used to use fine dining websites as a way to get clients to understand this dynamic. Lots of places would have a flash animation intro with music and images/video, etc. The owner loves this because the whole time they're thinking "this makes my place look so upscale and cool and desirable." Meanwhile, actual customers cared about location, menu, and hours being upfront and center without any bullshit. Thomas Keller never sold a single plate because he had an interstitial ad as the landing page on his website.