AI models feeding on AI data may face death spiral

Model Collapse refers to a degenerative learning process where models start forgetting improbable events over time, as the model becomes poisoned with its own projection of reality. Credit: arXiv (2023). DOI: 10.48550/arxiv.2305.17493

Large language models are generating verbal pollution that threatens to undermine the very data such models are trained on.

That’s the conclusion reached by a team of British and Canadian researchers exploring the impact of successive generations of ChatGPT-generated text being swept up as training data for future models.
In a paper published on the arXiv preprint server and titled, “The Curse of Recursion: Training on Generated Data Makes Models …
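The degenerative loop the researchers describe can be illustrated with a toy simulation (a hedged sketch, not the paper's method): a "model" is fit to data, then the next generation is trained only on samples drawn from that model. Each refit sees only its predecessor's projection of reality, so rare tail events can drift out of the learned distribution over generations. Here the model is simply a Gaussian fit by sample mean and standard deviation; the function names and parameters are illustrative choices.

```python
import random
import statistics

def fit_gaussian(samples):
    """'Train' a model: maximum-likelihood fit of a Gaussian (mean, std)."""
    return statistics.fmean(samples), statistics.pstdev(samples)

def generate(mu, sigma, n, rng):
    """'Deploy' the model: emit n synthetic samples from the fit."""
    return [rng.gauss(mu, sigma) for _ in range(n)]

rng = random.Random(0)
data = [rng.gauss(0.0, 1.0) for _ in range(500)]  # generation 0: real data

sigmas = []
for generation in range(10):
    mu, sigma = fit_gaussian(data)
    sigmas.append(sigma)
    # Next generation trains exclusively on the previous model's output.
    data = generate(mu, sigma, 500, rng)

# Each refit compounds sampling error from the previous generation's
# synthetic data, so the fitted spread drifts rather than staying anchored
# to the original distribution -- the "poisoning" the caption refers to.
print([round(s, 3) for s in sigmas])
```

In this toy setting the drift over a few generations is small and stochastic; the paper's point is that with no fresh real data re-entering the loop, the errors accumulate rather than average out, and improbable events are the first casualties.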
