Hacker News

Making these things anathema to commercial interests and making training them at scale legally perilous would be a huge win.


A huge win for countries with lax copyright laws. These things aren't going away, so the worst case is exactly that scenario playing out: China (or some other peer to the US tech sector) just continues developing them to gain an economic advantage. That's on top of the obvious political implications of AI chatbots being controlled by them.

The LLM genie is out of the bottle: an unfavorable court ruling in a single country isn't going to stuff it back in.


Do LLMs really give an economic advantage though? I've mostly seen them used to write quirky poems and bad code. People are scrambling to find use cases, but it's not very convincing so far.

On the other hand, if LLMs are used to "launder" copyrighted content and, accepting the premises of copyright law, this has the effect of reducing incentives to do creative work, that has obvious negative implications for economic productivity.


> I've mostly seen them used to write quirky poems and bad code.

Assuming this is in good faith: the ability to write code, documentation, and tests is absolutely a productivity enhancer for an existing programmer. The code snippets from a dedicated tool like Copilot are of very usable quality if you're using a popular language like Python or JS.
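As a hypothetical illustration (not from any specific tool's output), this is the kind of routine function-plus-test boilerplate such assistants typically complete from a signature and docstring alone:

```python
# Hypothetical example of assistant-style completion: given the signature
# and docstring, the body and a matching test are filled in.

def count_words(text: str) -> dict[str, int]:
    """Return a case-insensitive frequency count of the words in `text`."""
    counts: dict[str, int] = {}
    for word in text.lower().split():
        counts[word] = counts.get(word, 0) + 1
    return counts


def test_count_words():
    # The sort of unit test these tools generate alongside the function.
    assert count_words("The the cat") == {"the": 2, "cat": 1}


test_count_words()
```

Nothing here is novel, which is exactly the point: it's mechanical code a tool can draft so the programmer only has to review it.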


I don't give a shit about what China does.


> making training them at scale legally perilous would be a huge win.

Why?


Because the megacorps should have to pay the people creating the works they are training their multibillion/eventual multitrillion dollar systems on, and should get a nice rake to the face when they try to do an end run around it.


I have no idea what he's thinking, but if everybody in the community here had an LLM in their pocket and large orgs did not, it would at least be kind of fun.


The open source people can continue to pretend they matter in this field and large corporations like Microsoft will stop stealing everything that moves on the internet.


>making training them at scale legally perilous

Loading data you have no rights over into your software is legally perilous, yes.

It's as simple as asking for and receiving permission from the data's rightsholders (which might require an exchange of coin) to make it not legally perilous.


Sounds expensive.


If you want to do things with other people's stuff, yes it can get expensive.




