The Chat Problem

Source: DEV Community
AI doesn't lose your code. It loses your decisions. The output gets saved; that's the easy part. What doesn't survive is the reasoning: the moment you chose one approach over another, the constraint you agreed on, the scope you explicitly ruled out. All of it disappears when the conversation ends.

I discovered this while building TrekCrumbs, a cross-platform travel app I developed over five months with ChatGPT as my development partner. The code was solid. The velocity was real. But underneath, problems were accumulating in ways I hadn't noticed.

Early on, I made a decision about how to structure the data: separate schemas for each type of crumb (a crumb is a category of travel activity: lodging, flight, train, and so on). It made sense at the time; it was flexible, clean, and easy to extend. That decision lived in a chat I never reopened. A few weeks later, I made another decision that depended on the first one without realizing it. Then another. Fields drifted between schemas. Validatio
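The per-type schema decision can be sketched roughly like this. The crumb types and field names below are hypothetical, reconstructed from the description rather than taken from TrekCrumbs' actual schemas, but they show where the drift creeps in: shared fields are duplicated in every variant, so nothing forces them to stay in sync.

```typescript
// Hypothetical per-type crumb schemas: each category gets its own shape.
// Shared fields (id, title, date) are repeated in each interface —
// exactly the duplication that lets fields drift apart over time.
interface LodgingCrumb {
  kind: "lodging";
  id: string;
  title: string;
  date: string;       // ISO date
  checkOut: string;   // lodging-specific
}

interface FlightCrumb {
  kind: "flight";
  id: string;
  title: string;
  date: string;
  flightNumber: string; // flight-specific
}

type Crumb = LodgingCrumb | FlightCrumb;

// Every helper that consumes a Crumb must know all the variants;
// adding a new crumb type means revisiting each function like this one.
function summarize(c: Crumb): string {
  switch (c.kind) {
    case "lodging":
      return `${c.title} (check-out ${c.checkOut})`;
    case "flight":
      return `${c.title} (flight ${c.flightNumber})`;
  }
}

const flight: FlightCrumb = {
  kind: "flight",
  id: "c1",
  title: "LIS to JFK",
  date: "2024-05-01",
  flightNumber: "TP203",
};
```

The flexibility is real, but so is the cost: the reason each variant repeats those shared fields only exists in the chat where the trade-off was made.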