Chatbots, including GPT-3.5-turbo, do get things wrong, and they do get coding wrong. In particular, if you ask them to help with your coding, they will very often press forward when the sensible move is to go back. Retracing their steps, returning to a previous working state, isn't something they're good at; that's partly a consequence of how they're built: they generate answers one token at a time, always moving forward, so backtracking isn't a natural operation for them. If they were to learn this and advise users accordingly, things would change fundamentally, and I'm sure one day soon they will.
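To make the advice concrete, here's a minimal sketch of what "going back" can look like in practice: snapshot a known-good file before applying a suggested edit, re-run the tests, and restore the snapshot if they fail. The helper name, the backup scheme, and the pytest test command are all assumptions for illustration, not a prescription.

```python
import shutil
import subprocess
from pathlib import Path

def apply_with_rollback(path: Path, new_source: str,
                        test_cmd: tuple[str, ...] = ("pytest", "-q")) -> bool:
    """Apply a suggested edit to `path`, keeping a snapshot of the last
    working state so we can go back if the tests stop passing.

    `test_cmd` is assumed to be a pytest suite; swap in your own runner.
    """
    backup = path.with_suffix(path.suffix + ".bak")
    shutil.copy2(path, backup)      # snapshot the known-good version
    path.write_text(new_source)     # press forward with the suggestion
    passed = subprocess.run(test_cmd).returncode == 0
    if not passed:
        shutil.copy2(backup, path)  # the sensible move: go back
    backup.unlink()                 # tidy up the snapshot either way
    return passed
```

In a real project you'd more likely lean on version control, committing each working state so you can revert, but the shape of the fix is the same: keep the last good state recoverable before letting the model press on.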