WormGPT Is a ChatGPT Alternative With ‘No Ethical Boundaries or Limitations’

  • EnPeZe@lemmy.dbzer0.com · 8 points · 1 year ago

    there isn’t any easy way to reuse old weights

    There is! As long as the model structure doesn’t change, you can reuse the old weights and fine-tune the model for your desired task. You can also train smaller models based on larger models in a process called “knowledge distillation”. But you’re right: newer, larger models need to be trained from scratch (as of right now).
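    A rough PyTorch sketch of both ideas (the tiny two-layer nets and the file name are placeholder assumptions, not any particular model): reusing old weights just means loading a saved state dict into the same architecture and continuing to train, while distillation trains a small student to match the teacher’s softened output distribution.

    ```python
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    # Reusing old weights: identical structure, so the saved state dict
    # loads directly, and training continues from there (fine-tuning).
    def make_model():
        return nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10))

    teacher = make_model()
    torch.save(teacher.state_dict(), "old_weights.pt")   # stand-in for the old model

    model = make_model()                                 # same structure
    model.load_state_dict(torch.load("old_weights.pt"))  # reuse the old weights
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)

    # Knowledge distillation: a smaller student learns to match the teacher's
    # temperature-softened output distribution instead of training from scratch.
    student = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))

    def distillation_loss(student_logits, teacher_logits, T=2.0):
        return F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)

    x = torch.randn(32, 128)          # dummy batch
    with torch.no_grad():
        teacher_logits = teacher(x)   # teacher stays frozen
    loss = distillation_loss(student(x), teacher_logits)
    loss.backward()                   # gradients flow only into the student
    ```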

    But even then it’s not really a problem to keep AI-generated data out of a dataset. As you said, you can just take an earlier version of the data. As someone else suggested, you can also add new data that is curated by humans. Whether “inbreeding” ever actually happens remains to be seen ofc. There will be a point in time where we won’t train machines to be like humans anymore, but rather to be whatever is most helpful to a human. And if that involves training on other AI data, well then that’s that. Stanford’s Alpaca already showed how resource-efficient fine-tuning on another AI’s output can be.
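    The simplest form of “take an earlier version of the data” is a hard date cutoff on crawl timestamps. A toy sketch (the record layout and example dates are made up; the cutoff is ChatGPT’s public release):

    ```python
    from datetime import datetime, timezone

    CUTOFF = datetime(2022, 11, 30, tzinfo=timezone.utc)  # ChatGPT goes public

    # Hypothetical corpus records: (text, crawl timestamp).
    corpus = [
        ("pre-AI web page", datetime(2021, 5, 1, tzinfo=timezone.utc)),
        ("page that may contain AI text", datetime(2023, 7, 4, tzinfo=timezone.utc)),
    ]

    # Keep only documents crawled before the cutoff, so no model output
    # can have leaked into the training set.
    clean = [text for text, ts in corpus if ts < CUTOFF]
    print(clean)  # -> ['pre-AI web page']
    ```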

    The future is uncertain, but I don’t think AI models will just collapse like that.

    tl;dr beep boop