Milan Tresch Stories
The Forced Relearning
At a certain point it becomes clear that AI is not a machine that can guess the real intention behind our questions, the thought itself. Many people initially assume it makes us dumber, or replaces us, or “takes over control.” But it does something far more uncomfortable than that. It shows how poorly we know how to look for an answer to a problem, and how poorly we know how to use a tool, an opportunity, effectively.
It doesn’t replace you. It doesn’t teach you. It exposes you.
In the beginning, people tend to believe that the answers that miss the mark are the system’s fault. That it “didn’t understand,” “didn’t react well,” “misinterpreted.” Then, slowly and uncomfortably, it becomes obvious that the problem isn’t where they thought it was. The problem is with them. Their questions are vague and superficial. Their instructions are imprecise. Their intention hasn’t been thought through. And where the intention isn’t clear, the answer cannot stand on solid professional ground either.
AI throws out responses without asking clarifying questions. It doesn’t correct the thought. It doesn’t replace the missing inner work. It logically assembles whatever you put in front of it. It doesn’t soften things, and it doesn’t check back the way a person would. It doesn’t sense that “maybe you meant something else.” It gives back exactly what was asked of it. And that is precisely what suddenly makes it uncomfortable.
That’s when it becomes clear that many people no longer know how to ask questions. Not because they’ve become stupid, but because they’ve gotten out of the habit. Until now, the systems around them did a significant part of their thinking for them. But now there’s no escape route. There is no way around going deeper into the question, into the problem itself. If you can’t clearly state what you want, the result won’t be clear either. If you’re too lazy to type out your question thoroughly and in full detail, almost spoon-feeding it, don’t expect a solid answer. If you don’t know your goal, the answer can take you in any direction.
This isn’t a technical limitation. This is the human desire to “get out of doing the work.”
AI doesn’t take over thinking, but it demands it from you. It forces you to think through what you’re asking, why you’re asking it, and what you would do with the answer. Anyone who tries to save themselves from that work gets disappointed quickly. But anyone willing to relearn what they’ve lost suddenly realizes how much has worn away in them, and in the world around them, over time.
To formulate a question. To sharpen an intention. To separate what we genuinely want to know from what we merely want to hear. These are basic human skills. Not AI features.
That’s why this is so unsettling. Because it’s not about the machine’s progress, but about the decline of our own thinking and problem-solving ability. AI does not tell us what to think; it only shows us how little of it we do.
In this sense, relearning isn’t a choice. It’s a necessity. Either we start asking again, sharpening, thinking, or the tool will produce flawed solutions and answers, leading to false conclusions, and those conclusions will push our decisions off course.
And this is the point where it gets decided: whether AI becomes a partner, or a polished delivery system for bad solutions.
It’s up to us.