
> fine-tuning it to your use cases might make it dumber overall.

LoRA doesn't overwrite weights; it trains small low-rank adapter matrices that are added on top of the frozen base weights.
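A minimal sketch of the idea (illustrative only, not the `peft` library's API): the pretrained weight W stays frozen, and the effective weight is W plus a low-rank product B·A, where only A and B are trained.

```python
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, rank = 4, 6, 2
W = rng.standard_normal((d_out, d_in))        # frozen pretrained weight, never updated
A = rng.standard_normal((rank, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, rank))                   # trainable up-projection, zero-initialized

def forward(x):
    # Base path plus the low-rank adapter path; W itself is untouched.
    return W @ x + B @ (A @ x)

x = rng.standard_normal(d_in)

# With B initialized to zero, the adapter starts as a no-op:
assert np.allclose(forward(x), W @ x)

# "Training" touches only A and B; the base weight stays byte-identical.
W_before = W.copy()
B += 0.1
assert np.array_equal(W, W_before)
```

Because only the rank-r factors are trained, the adapter can be merged into W for inference or removed entirely, which is why the base model's general behavior is preserved in a way full fine-tuning doesn't guarantee.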



Do you need to overwrite weights to produce the effect I mentioned above? The adapter still changes the model's outputs wherever it's active.


Good point



