Basically a restatement of echo chambers, using the language of AI: "Like an AI trained on its own output, they’re growing increasingly divorced from reality, and are reinforcing their own worst habits of thought". A good definition of model collapse: "the AI is primarily talking to, and learning from, itself, and this creates a self-…
Starts with a basic outline of model collapse and the challenge it poses to new AI development, providing several links: "Research has shown that when generative A.I. is trained on a lot of its own output, it can get a lot worse." It then uses scrollytelling to provide a simple example of how "Just as a copy of a copy can drift away fro…
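A minimal toy sketch of that "copy of a copy" effect (my own illustration, not taken from any of the linked pieces): each generation of a crude "model" learns only from samples produced by the previous one, so diversity can only shrink.

    import numpy as np

    rng = np.random.default_rng(0)

    # Generation 0: "human" data -- 1,000 distinct values from a broad distribution.
    data = rng.normal(size=1_000)

    for gen in range(1, 31):
        # Each new "model" is trained only on the previous generation's output;
        # here it crudely memorises and resamples what it saw.
        data = rng.choice(data, size=data.size, replace=True)
        if gen % 10 == 0:
            print(f"gen {gen:2d}: {np.unique(data).size:4d} distinct values remain, std={data.std():.3f}")

Rare values disappear first and never come back, which is a rough analogue of the degradation the linked pieces describe.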
"As AI-generated content blurs the line between human and machine online, “model collapse” might help us find new value in well-managed human communities."
I hope you had a good summer. I stayed and worked from home, and got a lot done thanks to mercifully fewer meetings. I also read a lot of good stuff, and published one piece. Here's a selection.
"What happens to GPT generations GPT-{n} as n increases?... the use of LLMs at scale to publish content on the Internet will pollute the collection of data to train their successors: data about human interactions with LLMs will be increasingly valuable."