If tools like GPT generate large amounts of new content, and that content is then ingested when training the next generation of models, at what point does that start to degrade the quality of their output?

Is all future output effectively bounded by the quality of whatever human-written text was broadly available circa late 2021?
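One way to build intuition for this feedback loop is a toy simulation: fit a simple distribution to some data, then produce the next "generation" of training data purely by sampling from that fit, and repeat. This is only an illustrative sketch (the function name, sample sizes, and generation count are all arbitrary assumptions, not a real training pipeline), but it captures the shape of the concern: each generation only ever sees the previous generation's output, never fresh data.

```python
import random
import statistics

def recursive_generations(n_samples=20, n_gens=200, seed=0):
    """Toy model of training on model output.

    Generation 0 is 'real' data drawn from a standard normal.
    Each later generation fits a Gaussian (mean, stdev) to the
    previous generation's samples, then draws its own training
    data only from that fitted model. Returns the fitted stdev
    at each generation so the drift can be inspected.
    """
    rng = random.Random(seed)
    data = [rng.gauss(0.0, 1.0) for _ in range(n_samples)]
    stds = []
    for _ in range(n_gens):
        mu = statistics.fmean(data)
        sigma = statistics.stdev(data)
        stds.append(sigma)
        # Next generation sees only samples from the fitted model,
        # not the original data.
        data = [rng.gauss(mu, sigma) for _ in range(n_samples)]
    return stds

stds = recursive_generations()
```

With small per-generation samples, the fitted spread tends to drift over successive generations rather than stay anchored to the original distribution, because estimation noise compounds once no fresh data enters the loop. Real models and web-scale corpora are vastly more complicated than this sketch, but the question above is essentially whether an analogous drift applies at that scale.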