> It’s also now in line with the various WAD and Descent games over time that used this model, where the engine is open source but the game resources require a GOG purchase.
FreeDoom, like OpenTTD, walks a fine line between ‘artistic reimplementation’ and ‘legally vulnerable’ due precisely to reimplementing art assets, yes.
Realistically I don't see how Valve can avoid this. They want all those games on Steam Deck and the new console. Game devs want kernel-level anti-cheat (KAC). Therefore Valve can either provide them with some way to implement KAC - which effectively requires a "signed kernel / drivers only" setup, same as on Windows - or tell them to go away. Why would they do the latter?
Mind you, it doesn't mean that the Linux kernel will be "infected for everyone". It means that we'll see the desktop Linux ecosystem forking into the "secure" Linux, which you don't actually have full control of but which you need to run any app that demands a "secure" environment (it'll start with KAC but inevitably progress to other kinds of DRM, such as video streaming etc). Or you can run a Linux that you actually control, but then you're missing out on all those things. Similar to the current situation with mainline Android and its user-empowering forks.
> we'll see the desktop Linux ecosystem forking into the "secure" Linux
> Or you can run Linux that you actually control, but then you're missing on all those things
We cannot allow this stuff to be normalized. We can't just sit by and allow ourselves to be discriminated against for the crime of owning our own devices. We should be able to have control and have all of those nice things.
Everything is gonna demand "secure" Linux. Banks want it because fraud. Copyright monopolists want it because copyright infringement. Messaging services want it because bots. Government wants it because encryption. At some point they might start demanding attestation to connect to the fucking internet.
If this stuff becomes normal it's over. They win. I can't be the only person who cares about this.
PvE shouldn't need it either, and yet games routinely ship with anti-cheat applied to everything (including single player).
I rather suspect that the reason for this is the current gaming economy of unlockable cosmetics that you can either grind for, or pay for. If people can cheat in single player or PvE, they can unlock the cosmetics without paying. And so...
This matches my experience compressing structured text, btw: Bzip2 beats every other tool out there, both on compression ratio and, sadly, on decompression time.
OP says the decompression time is so high because it has properties similar to a memory-hard password hash: it's bandwidth-bound due to the random-access requirement. Even xz decompresses 2.5x faster, and I don't find xz particularly fast.
This is why I switched away, also for text compression; searching for anything that isn't near the beginning of a large file is tedious. My use case for compression is generally not like OP's, i.e. compressing 100KB so that it can fit into Minecraft (if I understood their purpose correctly); I compress files because they take too much disk space (gigabytes). But if I never wanted to access them again, I wouldn't store them at all, so decompression speed matters. So I do kinda agree with GP that Bzip2 has limited uses when Zstd costs just a few percent more bytes of storage for over an order of magnitude more speed (1 GB/s instead of 45 MB/s).
Edit: And all that ignores non-JSON/XML/code/text compression tasks, where Bzip2/LZMA don't give you the best compression ratio. I'd argue using Bzip2 is premature optimization without a very specific use case like OP's, where it gives very good code compression ratios and a simple decoder.
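To make the "ratio vs. decompression speed" tradeoff concrete, here's a minimal sketch using only Python's stdlib codecs (gzip, bz2, lzma). Zstd isn't in the stdlib, so it's omitted, and the repetitive JSON-ish sample is an assumption, not the commenter's actual data; the point is only that ratio and decompression time vary per codec and per input.

```python
import bz2
import gzip
import lzma
import time

# Synthetic "structured text": repetitive JSON-ish lines (illustrative only).
sample = b"\n".join(
    b'{"id": %d, "name": "user%d", "active": true}' % (i, i) for i in range(5000)
)

for name, compress, decompress in [
    ("gzip", gzip.compress, gzip.decompress),
    ("bzip2", bz2.compress, bz2.decompress),
    ("lzma", lzma.compress, lzma.decompress),
]:
    blob = compress(sample)
    t0 = time.perf_counter()
    out = decompress(blob)
    dt_ms = (time.perf_counter() - t0) * 1000
    assert out == sample  # round-trip sanity check
    print(f"{name:6s} ratio={len(sample) / len(blob):6.1f}x decomp={dt_ms:7.2f} ms")
```

Run on real corpora (logs, JSON dumps, source trees) before picking a codec; rankings shift with the input, which is exactly the thread's point.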
I wonder what the combined speed would be for small to mid-sized text files if they were fully loaded into memory first. That swaps many random disk accesses for a single sequential read (even if "sequential" doesn't mean as much on flash, it should still beat fully random access), plus random access in memory, which shouldn't be a bottleneck at these speeds.
Or perhaps this is already being done, and memory is still the bottleneck?
This might not work for multi-gigabyte files, but most of text content is well under 1MiB.
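A minimal sketch of the idea above, stdlib only: slurp the compressed file into RAM with one sequential read, then do all "random access" against the in-memory bytes. The file name, payload, and offsets are made-up illustrations.

```python
import bz2
import io
import os
import tempfile

# Create a small compressed file to stand in for the "well under 1MiB" case.
payload = b"hello structured text\n" * 10000
fd, path = tempfile.mkstemp(suffix=".bz2")
with os.fdopen(fd, "wb") as f:
    f.write(bz2.compress(payload))

# One sequential read: the storage device sees a single streaming access.
with open(path, "rb") as f:
    blob = f.read()

# All further seeking happens in memory, where random access is cheap.
data = bz2.decompress(blob)
stream = io.BytesIO(data)
stream.seek(len(data) // 2)   # "random access" into the decompressed text
chunk = stream.read(22)       # one 22-byte record at that offset
os.unlink(path)
print(len(data), chunk)
```

For multi-gigabyte files this obviously stops working (the decompressed data no longer fits in RAM), which matches the caveat above.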
I have also tested bzip3, and initially I encountered many cases where bzip3 was much better than zstd on every metric, i.e. compression ratio and execution times, regardless of the zstd options used.
Unfortunately, I later also found cases where bzip3's performance was not so good.
I spend a lot of time on compression/decompression, but sadly no algorithm can be said to be superior to all others in all circumstances, so for optimum results you must still be prepared to use any of them.
Even the ancient gzip is preferable in certain cases, when speed matters more than compression ratio, because it sometimes provides a better compromise than newer algorithms.
> Are you Chinese? If not, I think you should prefer the people defending you to have the best tools to do so.
They already have the best and most expensive toys in the world, and they mostly seem to be waging aggressive wars with them. Perhaps if the toys weren't so shiny and didn't make it all so one-sided, they wouldn't?