
We are building https://gitopia.org

- Permanent Data Storage provided by Arweave

- Works from within git with the help of a git remote helper (`npm install -g git-remote-gitopia`), so there's no need to learn new tooling

- Built-in incentives for token holders, who also take part in the governance of Gitopia

- Token holders share revenue made by the platform

- You can mirror your GitHub repositories now using the GitHub Mirror Action. Follow the step-by-step guide here: https://thetechtrap.com/posts/push-your-code-to-gitopia/

- We are now working on the governance and collaboration workflows that will enable transparency in open source development and give stakeholders a say in the direction of a project.

You can reach out to us on

https://twitter.com/gitopiaOrg

https://discord.gg/mVpQVW3vKE



I'm not sure I see the point of building a complex platform with governance and whatnot. What I want is an easy way to publish my projects and for other people to contribute, I need something more like email (independent, self-hosted servers with a well defined protocol to communicate between them) than Facebook.

I don't care about the governance of my git remote, or getting money out of it. I care about it being reliable, fast and simple. If I'm unhappy with it I want to be easily able to switch to a different host, or create my own.

Frankly I would be perfectly fine with the current situation where you have a bunch of effectively centralized code hosting solutions (github, gitlab, bitbucket etc...) if you could trivially move your project from one to another. For the code it's easy, git is built that way. For issues, PRs and the like it's trickier.

For me that's the problem that needs solving, I don't need an ultra complicated blockchain-powered solution where people can vote for the font of the UI with cryptotokens.

At a glance, and if I understand it correctly, Radicle seems more pragmatic in that way. Cryptocurrency is used for donations and for securing entries in the global namespace in a decentralized way; the rest is just a bunch of standalone servers. Then you can decide to host your code on an existing instance or spawn your own. A bit like how Mastodon works, for instance.


You might be interested in https://github.com/MichaelMure/git-bug. It stores issues directly in the git repository so the whole issue tracker is as distributed as the rest of your code. I can't speak to the UX though.


This seems pretty cool, is there a demo of the Web UI?


The best solution from my perspective, for the community, would be a standard that handles issues/PRs and the like, which could be taken from one code hosting solution to another, just like how you can take your code with ease from one solution to the next by cloning.

It’s the same idea that we already use for our code but applied to all the other bits that are necessary for maintaining projects.

There are some pretty obvious problems with actually implementing this, however. One of them comes down to getting all the existing code hosting solutions to agree on a standard, as they could simply create ancillary standards to differentiate themselves. Not to mention all the work involved in implementing this when most people are accepting of what we have now (until it bites them somehow).
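To make the idea concrete: such a standard could start as small as a host-neutral serialization for issues. This is a hypothetical sketch, not any existing forge's format; all field names are illustrative.

```python
import json

def export_issue(issue_id, title, author, state, comments):
    """Serialize an issue to a host-neutral JSON document.

    `state` is "open" or "closed"; `comments` is a list of
    {"author": ..., "body": ...} dicts. Purely illustrative schema.
    """
    return json.dumps({
        "id": issue_id,
        "title": title,
        "author": author,
        "state": state,
        "comments": comments,
    }, sort_keys=True)

def import_issue(doc):
    """Parse a serialized issue back into a plain dict."""
    return json.loads(doc)

# Round-trip: what one forge exports, another forge imports unchanged.
doc = export_issue(1, "Crash on start", "alice", "open",
                   [{"author": "bob", "body": "Can reproduce."}])
issue = import_issue(doc)
```

Moving a project would then mean exporting every issue from host A and importing it into host B, the same way cloning moves the code.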


We've been building a platform called Skynet which makes this possible. It has a user-oriented data model: all of the application code is run client-side, and all of the data is stored under the user's control.

That means that someone can create a new application at any time which has access to all of the data - because all of the data is accessed client-side and owned by clients in the first place.

This doesn't solve the standards problem, but on the other hand I think what would likely happen is that the first project to become successful would also become more or less the standard, with other people building extensions to that data standard over time.


There is a project working on that over ActivityPub: https://forgefed.peers.community/

There are some established forges participating (like sourcehut)


I feel that there is a real need for permanent storage with respect to open source.

Code breaks when old packages are unpublished or repositories are deleted. Push once, fetch forever solves this.

Also, centralized solutions provide open source collaboration tools and storage for free because of the revenue from their enterprise customers.

What happens when they decide to shut down? Or change their policies? Or simply comply with wrongful takedown notices?


Would IPFS or Dat not suffice there? How is another permanent storage system that's just one piece of a larger project better than one whose sole purpose and focus is storage?


The challenge with IPFS and DAT is that you have no guarantees around the data reliability. The DHT style of p2p sharing pretty much only works for popular content. Incentivized storage networks can onboard any type of data and guarantee high uptime.

It's also been my experience that IPFS has significant performance issues. If you use a professional gateway like ipfs.io or cloudflare it runs at good speeds but as soon as you switch to being fully peer-to-peer it's almost unusable.

I don't have much experience with DAT, it may not have the same performance issues.

disclaimer: I work on an incentivized storage network called Skynet


Those are but some of the problems that IPFS, Dat (and Storj, and probably Skynet too) face, or will face.

That doesn't make "rolling your own" any more reliable, performant, etc. So the question becomes even more apt: why will building your own decentralised storage as a requirement of a much larger project work, if even the focused, dedicated decentralised-storage projects cannot solve some of these problems?


With git it's rare for a project that's actually in use to go completely memory-holed, since every contributor effectively has a local copy of the resource.

Using git (generally github) repositories for dependency management is, IMO, a hack and so it's not surprising that it often breaks. I like the way buildroot handles it (I'm sure they're not the only ones, but that's the one project I'm most familiar with):

- The buildroot buildbot fetches third-party package dependencies and archives them.

- When you build your buildroot image locally, it attempts to fetch from the third party directly. If the file doesn't exist anymore, it falls back to the buildroot cache instead.

You could also easily add your own caching layer in there if you wanted to. I think that's distributed computing at its best: simple and robust, with a clear and easily understandable architecture. No blockchain-based proof-of-stake distributed storage, just a series of wget calls. And of course, since everything is authenticated with a strong hash, it's always perfectly safe.
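The buildroot-style fallback described above fits in a few lines. This is a self-contained sketch where the "network" is replaced by plain dicts (all names are illustrative, not buildroot's actual implementation):

```python
import hashlib

# Stand-ins for the real sources: the third party's server (files may
# disappear) and the project's own archive of fetched tarballs.
UPSTREAM = {}
CACHE = {"pkg-1.0.tar.gz": b"source tarball bytes"}

def fetch(name, expected_sha256):
    """Try upstream first, fall back to the cache, verify the hash."""
    for source in (UPSTREAM, CACHE):
        data = source.get(name)
        if data is None:
            continue
        if hashlib.sha256(data).hexdigest() == expected_sha256:
            return data
    raise LookupError(f"{name}: not found or hash mismatch")

digest = hashlib.sha256(b"source tarball bytes").hexdigest()
# Upstream no longer has the file; the cache transparently serves it,
# and the hash check guarantees it's the bytes we expected.
tarball = fetch("pkg-1.0.tar.gz", digest)
```

The hash check is what makes the fallback safe: any mirror can serve the bytes, because the recorded digest decides whether they're accepted.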


Buy hard drives, do frequent backups, store redundant backups remotely, use RAID. At the end of the day someone must pay for the hard drives and they could just stop paying for it one day. There is no such thing as permanent and free.


> I need something more like email (independent, self-hosted servers with a well defined protocol to communicate between them)

The problem with email servers is that you need a special type of sysadmin to maintain them properly; they are not for the faint of heart when used anywhere beyond the most trivial case. Any server that grows to a well-used size has a myriad of problems (getting outright blacklisted, etc.).


Well, that's e-mail. Modern federated systems don't have one server for sending and another for receiving. ActivityPub implementations are usually a single application with a database (like Pleroma), or a single app plus db/Redis/Elasticsearch (optional) like Mastodon. Mastodon has official Docker containers, and it's not difficult to build one for Pleroma.

So you can make something "like e-mail" that isn't as bad as SMTP/IMAP/SPF/DKIM/etc... I've been considering hosting my own Gogs or GitLab or one of the other locally hosted git platforms. I'd like to see something that allows pull/merge requests between them (you'd need some spam prevention of course; maybe require a message and a follow before people are allowed to push a request to your server).

This project ... doesn't seem like it does that at all. It's a desktop application ... with no real web view into your projects. I feel like it's missing a component: a service running in a Docker container that you can configure with your Device ID and push your public repos to for others to see.


The analogy broke down before that, you don't have reputation problems hosting a Gitea instance.


Would recommend OneDev.

https://onedev.io


They need to remove some clutter from that landing page.


Just read between the lines.


How about hosting your own Gitea instance? It works quite well.


I think that works - but what GP mentioned is where it fails:

> Frankly I would be perfectly fine with the current situation where you have a bunch of effectively centralized code hosting solutions (github, gitlab, bitbucket etc...) if you could trivially move your project from one to another. For the code it's easy, git is built that way. For issues, PRs and the like it's trickier.

Maybe there's a nice way to distribute this information among Gitea instances, though it doesn't seem advertised as such - this would basically make Gitea a neat federated tool, I think.

Ultimately I'd like Git[ea|whatever] to be just a frontend for a database that behaves like Git, in the same way that GitHub's source viewing is just a frontend for Git. You don't worry about moving your source data between GitHub and GitLab, so why are we worrying about universal data like bug tickets, feature tracking, etc.?

I'm a massive fan of systems that behave like Git and Scuttlebutt - which is to say, they're dumb and simple. Git can be pushed and pulled from basically anything; there's no complex suite of nodes around the world expected or assumed to operate for any Git functionality. In the same way, Scuttlebutt, which offers P2P layers, is similarly dumb (though less so, unfortunately) compared to more complex P2P offerings like IPFS.

In my ideal world we'd have a database to pair with Git that would have some very basic schemas to complement Git - possibly even baked into Git - such that when you move from GitHub to GitLab to Gitea, everything truly essential comes with you. Some things might change, like your CI if it was bound to GitHub, but still: losing some things vs losing everything.

I personally am less concerned about using GitHub/etc. That is, I'm not dying for someone to give me a new GitHub; I'm seeking a way to reduce lock-in.

NOTE: I'm working on a distributed database that fits the above design goals. Really it's just Git plus some additional data structures which allow more data types to be stored, such as binary and structured data, to build foundations for SQL layers etc. It's not intended for general use, but I'd love to see someone pick up the idea and run with it. "Git for data" has been done a couple of times, notably NomsDB and DoltDB, but they still felt like they weren't Git-like in that they wanted to centralize - probably for SaaS reasons.
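Part of why "a database that behaves like Git" is appealing is how little machinery Git's content addressing actually requires: a blob's ID is just the SHA-1 of a short header plus the content. A minimal sketch:

```python
import hashlib

def git_blob_id(content: bytes) -> str:
    """Compute the ID Git would assign to `content` stored as a blob.

    Git hashes "blob <length>\\0" followed by the raw bytes; any tool
    that reproduces this is interoperable with Git's object store.
    """
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

# Matches `echo 'hello' | git hash-object --stdin`
print(git_blob_id(b"hello\n"))  # ce013625030ba8dba906f756967f9e9ca394464a
```

Any extra schema (bug tickets, structured data) stored as content-addressed objects like this would survive a move between hosts exactly the way source blobs do.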


Thank you for sharing your work; it's great to see more attempts at the same problem. I am one of the maintainers of the Radicle project.

The main problem we encountered with similar systems that rely on blockchains / DHTs for storage is 'blockchain poisoning'.

This is when someone deliberately adds illegal content to an append-only source in the hope of making the sole act of replicating the project legally problematic, as correctly pointed out by Konstantin Ryabitsev of the Linux Foundation with regard to a previous version of Radicle that relied on IPFS. See https://radicle.community/t/the-radicle-social-model/317


I think you could add moderation to a project like this much the same way you can add moderation to a centralized project. You don't need to be as heavy-handed as "community voting"; you could just have each client nominate a moderator who is able to append instructions to ignore content, which the client would follow.

Under this model, every user (or project) can choose their own moderators, which means you don't have to worry about other parts of the community having different ideas for ideal moderation - each project/user can subscribe to the moderation feeds that they like, and ensure that they get a clean experience.

This type of moderation is actually an upgrade from centralized systems, because it is much easier to nominate new moderators if you don't like what the old moderators are doing.
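The client-side moderation model described above is essentially a set filter: an append-only log stays intact, and each client subtracts whatever its chosen moderators have flagged. A sketch under those assumptions (all names illustrative):

```python
def visible(entries, subscribed_feeds):
    """Filter an append-only log through the client's chosen moderators.

    `entries` is a list of {"id": ..., "body": ...} records; each feed
    in `subscribed_feeds` is a set of entry IDs a moderator has flagged.
    The underlying log is never modified, only hidden per-client.
    """
    ignored = set().union(*subscribed_feeds) if subscribed_feeds else set()
    return [e for e in entries if e["id"] not in ignored]

log = [{"id": "a1", "body": "bug report"},
       {"id": "b2", "body": "spam"}]
alice_feed = {"b2"}                    # moderator Alice flags entry b2
clean = visible(log, [alice_feed])     # only the bug report remains
```

Swapping moderators is just changing which feeds you pass in; the shared data never has to change.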


Since the illegal content would be added by someone who has access to the repository, the right way to address that would be via community voting, which can then decide to hide it from the platform, since the data is permanent on Arweave. This governance workflow will enable the community to set such content policies.


I just want to publish some code.

Now I need to worry about blockchains and tokens and voting?


Hey FYI your landing page gave me this error, Firefox ESR 78.5.0. Right now it's a big blank screen with no content.

{…}

code: "EACCES"

errno: 13

message: "Error: EACCES: Permission denied."

path: undefined

stack: "n@https://gitopia.org/main.js:277:55609\non/<@https://gitopia....

syscall: ""

<prototype>: Object { constructor: n(t, n, r), toString: toString(), toJSON: toJSON(), … }

main.js:339:94885
c https://gitopia.org/main.js:339
configure https://gitopia.org/main.js:277
store https://gitopia.org/main.js:277
on https://gitopia.org/main.js:277


The "push" icon looks like he's pulling


Thanks for flagging this. We'll look into it. Meanwhile you can try from a different browser.


> - Permanent Data Storage provided by Arweave

This kind of thing always makes me sweat. Are deletes at all possible? What if someone accidentally pushes keys to the repo? On regular github, at least you can nuke the whole repo and start over if need be.


Those keys absolutely need to be rotated, regardless of whether you delete the commit or repo after accidentally pushing them.

At which point you may as well leave them there...


Keys probably aren't the best example. What about PII, accidentally committed "test" data, documents, nudes, whatever..


If repos are public, then you must assume that once something is pushed then someone has copied it.

You may get lucky and remove/hide it fast enough, or think you did...

This is an issue with GitHub today: all public repos are being watched by bots reviewing all commits for accidentally-pushed credentials.

The only solution is to not use a public repo.
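A toy version of those commit-scanning bots is just a regex sweep over diff text. This sketch matches AWS-style access key IDs (20 uppercase characters starting with "AKIA"); real scanners use many such patterns plus entropy checks:

```python
import re

# Pattern for AWS-style access key IDs: "AKIA" plus 16 more
# uppercase alphanumerics, bounded so partial matches don't count.
AWS_KEY = re.compile(r"\bAKIA[0-9A-Z]{16}\b")

def scan_diff(diff_text):
    """Return any credential-looking strings found in a commit diff."""
    return AWS_KEY.findall(diff_text)

# AWS's documented example key, as it might appear in a pushed diff.
leaks = scan_diff("+ aws_access_key_id = AKIAIOSFODNN7EXAMPLE\n")
# -> ["AKIAIOSFODNN7EXAMPLE"]
```

The point stands either way: by the time your own scan fires, a bot may already have copied the key, so rotation is the only real fix.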


You’re not answering the question. Is it possible to delete or not?

On GitHub yes someone might be watching, but deletes are still possible.


I don't know. You can always clean the Git history and force-push it, but the developers would have to explain whether any backups or archives are kept anywhere...


Since data is stored permanently on Arweave, there's no way to remove it from the blockchain. However, you could force-push your repo, which would remove the commit in question from the Gitopia repository view.


I agree, but is it possible to delete at all on Arweave? Let's say someone accidentally puts their full name and address in a repo.


Link to git-remote-gitopia source hosted on Gitopia:

https://gitopia.org/#/z_TqsbmVJOKzpuQH4YrYXv_Q0DrkwDwc0UqapR...


what blockchain are you using?


nvm found it. It's something I've never heard of called `arweave`



