Sadly, like many Docker Swarm-based projects, this has become abandonware. I use Docker Swarm with Portainer to manage services at ecoeats[1], a decision made years ago, just before Swarm was revealed to be absolutely dead in the water. I initially used Swarmlet before switching to Portainer, since it had far too many bugs and lacked the tooling needed to effectively manage a Swarm running anything beyond stateless Node containers.
With Portainer and Swarm I've had to intervene in rollouts manually more times than I'd like, due to Swarm-specific errors and other quirky networking behaviour. At least it's simpler than Kubernetes!
I really liked the option to deploy through a git push without additional setup, but it does indeed look like abandonware. Thanks for sharing your experience! If you were starting over, would you choose Portainer and Swarm again? Do you know of any alternatives to Swarmlet?
Portainer also has opt-out analytics, enabled by default, via Matomo. According to their GitHub the analytics are hosted in Germany, but the DNS currently points to a server in France. Either way, I find it questionable to have analytics running in self-hosted open source software.
I'm building a webapp[0] for turning ClickUp docs into static sites, using CF Workers. There isn't a framework for Workers with the flexibility I need, so I home-rolled one that only sends rendered HTML over the wire. Async components are supported too: if I have a particularly data-intensive component, the renderer just inlines a tiny JS script that sends a request back to the worker, which then returns just that component's HTML.
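The async-component trick described above can be sketched in a few lines. This is plain Python rather than an actual CF Worker, and the names (`stats`, the `/component/` route) are made up for illustration, but it shows the two halves of the pattern: the initial response inlines a placeholder plus a tiny script, and a follow-up request returns just that component's HTML.

```python
# Minimal sketch of the "HTML over the wire" async-component pattern.
# Hypothetical names; not the poster's actual framework.
def render_component(name: str) -> str:
    # Stand-in for a slow, data-intensive component.
    return f'<section id="{name}">expensive content for {name}</section>'

def handle(path: str) -> str:
    if path.startswith("/component/"):
        # Follow-up request: return only that component's rendered HTML.
        return render_component(path.removeprefix("/component/"))
    # Initial request: full page with a placeholder and a tiny inline
    # script that re-requests the worker for just the async component.
    loader = (
        '<div id="stats">loading...</div>'
        "<script>"
        "fetch('/component/stats')"
        ".then(r => r.text())"
        ".then(h => document.getElementById('stats').outerHTML = h)"
        "</script>"
    )
    return f"<html><body><h1>Docs site</h1>{loader}</body></html>"
```

The nice property is that the second round trip carries rendered HTML, not JSON, so the client-side script stays tiny and there's no hydration step.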
Could be worth releasing on its own as a GitHub project!
Someone added a ColdFusion Markup Language generator to https://curlconverter.com/cfml/ last year. After a few months I decided to remove it, since I'd never heard of CFML and figured nobody could possibly be using it. The next day, the guy who added support for it and three other people complained about it, so it seems like they're out there.
I wonder how many visitors you have, i.e. how popular it is? If it's not a secret, of course.
I couldn't imagine any web developer needing such a tool [knowing curl by heart], but obviously I'm in my own bubble and wrong in my assumptions.
We also moved the site from GitHub Pages to Cloudflare Pages just for August 2022, to run server-side analytics and count the people who block ads. That reported 116k "visits", compared to 67k "unique visitors" and 112k "total pageviews" from client-side analytics for the same period (August 31st is missing because Cloudflare only lets you see 30 days). 116k and 112k are really close, which makes me suspect I misunderstood and Cloudflare is actually also doing client-side analytics, but if that's not the case then it's 120,000-150,000 monthly visits.
Back in September 2021 the site had Google Analytics, which reported 25k users and 60k sessions from Sept 17 to Oct 17, 2021.
They offered us 15K for the site; we didn't sell, so they cloned it, since it's all open source. We removed the analytics again in January 2023 for the same reason we removed them before: curl commands copied from the browser might contain cookies or auth tokens, and it's easier not to worry about the possibility of them accidentally getting logged by the analytics. Also, it's kind of expensive, and I got laid off in November. Knowing people are using your code is nice, though.
> I couldn't imagine any web developer needing such a tool [knowing curl by heart]
There might be some edge cases in Bash syntax or curl's option parsing that you'd miss and the tool won't (or vice versa), but you're right: rewriting a curl command in another language by hand is straightforward. If you're doing it a lot it gets tedious, though, and some people write web scrapers and bots full-time. You usually start with a curl command from the browser or from some API's documentation, and if you want to do something with the returned data, you rarely want to do it in Bash. You could call the curl command from Python and read its output, but you'll get better error handling and less code if you use an HTTP library. A better solution would be adding "Copy as Python", "Copy as R", etc. to Chrome directly; I opened a PR for Python, but I haven't polished it enough for them to just merge it.
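For what it's worth, here's a sketch of the mechanical rewrite being discussed: a browser-copied curl command turned into the equivalent `requests` call. The endpoint and token are made up, and `Request(...).prepare()` is used so the sketch builds the request without actually sending it:

```python
import requests

# The curl command as copied from the browser or some API's docs:
#   curl 'https://api.example.com/items' \
#        -H 'Authorization: Bearer TOKEN' \
#        --data 'name=widget'
# (api.example.com and TOKEN are made up; -d/--data makes curl POST.)
req = requests.Request(
    "POST",
    "https://api.example.com/items",
    headers={"Authorization": "Bearer TOKEN"},
    data={"name": "widget"},  # requests form-encodes this, like curl --data
).prepare()  # build without sending, just for the sketch

print(req.method, req.url)  # POST https://api.example.com/items
print(req.body)             # name=widget
```

In real code you'd call `requests.post(...)` directly and use `response.raise_for_status()` and `response.json()`, which is exactly the error handling you'd have to hand-roll around a subprocessed curl.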
CloudFront, by Amazon's own admission, specialises in high-bandwidth delivery (i.e. huge videos). Fastly has consistently better performance as a small-object cache, which makes it the better choice for web assets.
Reading through Cold War-era plans for post-apocalyptic Britain makes me shiver. The idea that broadcasts saying "stay indoors, no help is coming" and "the water you have may have to last 14 days or more" could one day come true is truly terrifying.