While I think this is a bit sensationalized, any company that allows user-driven generative AI, especially one as open as permitting LoRAs and arbitrary checkpoints, needs very good protection against synthetic CSAM like this. To the best of my knowledge, only the AI Horde has taken this sufficiently seriously until now.

    • db0@lemmy.dbzer0.com (OP, mod)
      11 months ago

      To be fair, the AI doesn’t have to see CSAM to be able to generate CSAM. It just has to understand the concept of child and the various lewd concepts, and it can then mix them together.

    • CJOtheReal@ani.social
      11 months ago

      The AI is able to merge concepts from different images, so you can definitely combine depictions of children with “normal” porn. That means it can technically produce “CSAM” without any actual CSAM in the training data. The first versions of some of the image AIs were able to do this, and there was definitely no CSAM in their training data.

      It’s not good to have that around, but it’s definitely better than actual CSAM.