We have been informed of another potential CSAM attack on lemmy.ml, an instance we federate with.

After the events of last time, I have preemptively and temporarily defederated us from lemmy.ml until the situation can be assessed more clearly.

I have already deleted the suspicious posts (without viewing them myself; everything was done from the database command line) and banned their author. To the best of our knowledge, at no point was any CSAM content saved on our server.

EDIT: 2023-09-03 8:40 UTC

There have been no further reports of similar problems arising from lemmy.ml or other instances, so I am re-enabling federation. Thank you for your patience.

  • lilShalom@lemmy.basedcount.com · 1 year ago

    If only there were a way to have MS, AWS, or Google’s vision AI scan all the images and automatically remove them when they are determined to be inappropriate.
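
    The idea could be sketched as a moderation hook that scans each upload before it is persisted. This is a minimal sketch, not an actual integration: `classify_image` is a hypothetical stand-in for a provider call such as Google Cloud Vision SafeSearch or AWS Rekognition's image moderation, since real clients need credentials and provider-specific request code.

    ```python
    from dataclasses import dataclass

    @dataclass
    class ScanResult:
        # Likelihood scores in [0, 1], as a moderation API might return them.
        adult: float
        violence: float

    def classify_image(image_bytes: bytes) -> ScanResult:
        # Hypothetical stand-in: a real implementation would send image_bytes
        # to a cloud vision service here and map its response to ScanResult.
        return ScanResult(adult=0.0, violence=0.0)

    def should_remove(result: ScanResult, threshold: float = 0.8) -> bool:
        # Auto-remove the upload when any score crosses the threshold;
        # borderline cases could instead be queued for human review.
        return result.adult >= threshold or result.violence >= threshold

    # Usage: reject an incoming upload before it ever touches disk.
    upload = b"placeholder image bytes"
    if should_remove(classify_image(upload)):
        print("upload rejected")
    ```

    Scanning before persistence matters here: it would keep flagged content from ever being saved on the server, which is exactly the guarantee the admin describes above.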