Well, what could they say? Given the lack of transparency on the data, it could well be:

“We’ve trained a LLaMA MoE on a lot of GPT4 data, and it is not as good as GPT4. And this is our blob, so we can release it under any license. If someone is silly enough to use what this blob generates, that's not our problem.”


