“Our primary conclusion across all scenarios is that without enough fresh real data in each generation of an autophagous loop, future generative models are doomed to have their quality (precision) or diversity (recall) progressively decrease,” they added. “We term this condition Model Autophagy Disorder (MAD).”

Interestingly, this problem could become even more challenging as generative AI models see wider use online.
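To make the failure mode concrete, here is a minimal sketch (not from the paper) of an autophagous loop using a toy Gaussian "model": each generation is fitted to samples drawn from the previous generation's fit, optionally mixed with fresh real data. The `run_loop` helper, sample sizes, mix ratio, and generation count are illustrative assumptions, not anything the authors propose.

```python
# Toy autophagous loop: fit a Gaussian, sample from the fit, refit, repeat.
# Without fresh real data the estimated spread tends to drift and shrink
# over generations; mixing real samples back in slows the decay.
# All parameters here are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
real_data = rng.normal(loc=0.0, scale=1.0, size=1000)  # the "real" distribution


def run_loop(generations=200, n_samples=100, real_fraction=0.0):
    """Each generation: sample from the current fit, optionally add real data, refit."""
    mu, sigma = real_data.mean(), real_data.std()
    for _ in range(generations):
        synthetic = rng.normal(mu, sigma, size=n_samples)
        n_real = int(real_fraction * n_samples)
        # Refresh part of this generation's training set with fresh real data.
        train = np.concatenate([
            synthetic[: n_samples - n_real],
            rng.choice(real_data, size=n_real),
        ])
        mu, sigma = train.mean(), train.std()
    return sigma


print("std after 200 fully synthetic generations:", run_loop(real_fraction=0.0))
print("std after 200 generations with 50% real data:", run_loop(real_fraction=0.5))
```

In runs like this, the fully synthetic loop's estimated spread collapses toward zero over generations (a toy version of the diversity loss the authors describe as falling recall), while the loop that keeps injecting real data stays close to the original distribution.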

  • feeltheglee@beehaw.org

    You know how when you’re on a voice/video call and the audio keeps bouncing between two people and gets all feedback-y and screechy?

    That, but with LLMs.