
I'd personally like to have a FOSS, privacy-aware CSAM (or even generic gore/porn) detector I could plug into Matrix/Lemmy/Mastodon servers. Something self-hostable, so I could run those services without worrying about pedos and trolls ruining my platform.

I'm not sure if something like it exists. I'm not sure if it could exist. PhotoDNA (the old CSAM detector) ended up being somewhat reversible, so you could actually turn signatures back into obscene material. Because of this, the signature databases were shared under strict NDA, only to large players.

Probably the most realistic solution is a generic porn-classifier convnet: if it blocks adult porn, it should block CSAM too (hopefully?)

Such a classifier isn't reliant on image hashes, and reversibility concerns apply less because the dataset used to train it was presumably legal (if distasteful).
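
To make that concrete, here's a rough sketch of what wiring an off-the-shelf NSFW classifier into an upload pipeline could look like. It's only an illustration, not a finished moderation system: the model name (Falconsai/nsfw_image_detection on Hugging Face) is one publicly available checkpoint, its label names may differ from other models, and the threshold is arbitrary.

    # Sketch: pre-screen uploaded media with an off-the-shelf NSFW classifier.
    # Requires the `transformers` and `Pillow` packages plus PyTorch.
    from transformers import pipeline
    from PIL import Image

    # One publicly available checkpoint; swap in whatever model you trust.
    classifier = pipeline("image-classification",
                          model="Falconsai/nsfw_image_detection")

    def is_probably_nsfw(path: str, threshold: float = 0.8) -> bool:
        """Return True if the classifier scores the image as NSFW above `threshold`."""
        image = Image.open(path).convert("RGB")
        results = classifier(image)  # e.g. [{"label": "nsfw", "score": 0.97}, ...]
        scores = {r["label"]: r["score"] for r in results}
        # This particular model uses "nsfw"/"normal" labels; other checkpoints differ.
        return scores.get("nsfw", 0.0) >= threshold

    if __name__ == "__main__":
        print(is_probably_nsfw("upload.jpg"))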



I am working on something like this using multimodal models.
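
In case it's useful for prototyping (the parent doesn't say which models, so this is just one way to read "multimodal"): CLIP-style zero-shot classification lets you screen images against free-text category prompts without training anything. The prompts and threshold below are illustrative, not tuned.

    # Sketch: zero-shot image screening with CLIP via the transformers pipeline.
    # Categories are supplied as free-text prompts; no task-specific training.
    from transformers import pipeline

    detector = pipeline("zero-shot-image-classification",
                        model="openai/clip-vit-base-patch32")

    LABELS = [
        "explicit sexual content",
        "graphic violence or gore",
        "an ordinary safe-for-work photo",
    ]

    def screen(path: str, threshold: float = 0.6):
        """Return the offending label if an unsafe category wins clearly, else None."""
        results = detector(path, candidate_labels=LABELS)
        top = results[0]  # results come back sorted by score, highest first
        if top["label"] != "an ordinary safe-for-work photo" and top["score"] >= threshold:
            return top["label"]
        return None

    print(screen("upload.jpg"))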



