I generated a pic of a tiny slime girl eating some meat in a jar and I got this error/first warning. What does this mean?

  • Edwardthefma99✡@lemmy.world (OP)
    8 months ago

    I was thinking it might be a filter to prevent pedos, but my post had nothing of a sexual nature; the photo was a small slime girl gnawing on a chunk of raw meat with a smile.

    • allo@lemmy.world
      8 months ago

      I'm going to try to make the most pedo-sounding prompt I can and see if I can trigger it.

      • ✧✨🌿Allo🌿✨✧@sh.itjust.works
        8 months ago

        @edwardthefma99@lemmy.world So I typed basically all the pedo and sexual words I could think of and hit generate, and it generated normal-looking girls, like 20+ years old. I tried to add one to the gallery and it said network failure, probably because my internet is really slow right now. Trying to add a normal image to the gallery also says network failure, so that seems to be on my end.

        My GUESS is it doesn't have to do with pedo prevention, since the method to prevent that seems to be to detect keywords and manipulate the prompt on the server side before we even receive the image. My internet is really slow during the day though, so it's hard to test. And what I just typed surely would have ranked a bajillion percent higher on pedo detection than whatever your innocent prompt was. But I got no error; it just automatically weeded out what I had typed and gave a legal image back.
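
        Just to illustrate what I mean by server-side keyword filtering, here is a rough sketch in Python (entirely hypothetical names and blocklist, not the generator's actual code):

        ```python
        # Hypothetical sketch of a server-side prompt filter:
        # blocked words are silently dropped before the prompt ever
        # reaches the image model, so the user just gets a "legal"
        # image back instead of an error message.
        BLOCKED_WORDS = {"badword1", "badword2"}  # placeholder blocklist

        def sanitize_prompt(prompt: str) -> str:
            """Remove blocked words from a prompt before generation."""
            kept = [w for w in prompt.split() if w.lower() not in BLOCKED_WORDS]
            return " ".join(kept)

        if __name__ == "__main__":
            user_prompt = "a badword1 slime girl eating meat"
            print(sanitize_prompt(user_prompt))  # -> "a slime girl eating meat"
        ```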

        Someone else probably knows more about this, or has seen the error before and knows why. Testing whether anti-pedo filtering is the reason currently looks like a negative.