AI Didn’t Burn Dinner: It Burned the Cookbook
The Guardian’s recent correction to its reporting on AI-generated recipes was narrow, technical and justified. It clarified that a notorious “cook with glue” example stemmed from misread Reddit comments rather than satire, and that Meta trained its models on pirated databases rather than compiling them. Accuracy demanded those fixes. But focusing too heavily on the mechanics of the error risks missing the larger point. Even when the facts are tidied up, the underlying problem remains intact, and it is growing.
Recipe writing is not a trivial corner of the internet. It is one of the most mature, specialised and labour-intensive forms of digital content creation. A credible recipe requires testing, iteration, explanation and visual documentation. It also relies on trust: readers return to writers whose instructions work, whose judgment they recognise and whose voice they understand. This ecosystem evolved over two decades around an implicit bargain. Creators made their work freely accessible. Platforms sent them traffic. Advertising paid the bills.
AI breaks that bargain.
Search engines now increasingly intermediate the relationship between reader and writer by presenting synthetic answers that collapse multiple sources into a single, frictionless output. In the case of recipes, this “answer” often looks sufficient. A list of ingredients. A handful of steps. No context, no technique, no provenance. Crucially, no click. Attribution exists, but it is demoted to a footnote few users consult. The economic consequence is straightforward: impressions without visits, visibility without revenue.
Corrections about whether an AI misread satire or user comments do not alter that dynamic. Nor does clarifying whether training data was scraped or merely sourced. From the creator’s perspective, the distinction is academic. Their work is still being absorbed, abstracted and redeployed by systems that neither compensate them nor reliably direct audiences back to them. The harm lies less in any single factual misstep than in the cumulative erosion of authorship.
This is not a story about nostalgia for blogs bloated with pop-ups. Many recipe sites became unreadable, and platforms are right to prioritise user experience. But replacing clutter with synthesis is not neutral. It privileges scale over craft and speed over reliability. When AI-generated recipes fail, users may blame “the internet” rather than the system that recombined instructions without understanding why they mattered. Trust, once lost, is hard to reassign.
The risk is cultural as much as commercial. Recipes are a form of transmitted knowledge, often rooted in family, region and technique. Flattening them into interchangeable outputs accelerates a drift toward homogeneity, where difference is noise to be averaged out. That may be efficient. It is not benign.
None of this requires panic or prohibition. But it does require acknowledging that something of value is being extracted without a replacement model in place. If the future of food writing is paywalls, niche followings or a return to print, that will represent not adaptation but retreat. Platforms that built their dominance on open content should not be surprised if the supply of that content diminishes once the economics no longer hold.
The Guardian was right to correct its details. But precision should sharpen, not soften, the conclusion. AI is not merely reorganising recipes. It is undermining the conditions that made them worth writing in the first place.