If everything is correct, it all looks the same.
- Virna Buda

- Apr 2
The circle

A large part of what we do doesn’t come from a rational decision; it’s a response to context. If you walk into a quiet room, you lower your voice; if you see a line, you join it; if you find a closed door, you knock.
No one gives you instructions in that moment, and yet you behave “the right way.”
There’s a very simple experiment that, for this reason, is hard to ignore.
A circle is drawn on the ground: just a line, no physical barrier, no explanation. People approach it and, almost automatically, walk around it. They don’t cross it, not because anyone said they couldn’t, but because something makes them perceive it as a boundary.
The same thing happens when the circle is drawn around a person. No one tells them to stay inside, no explicit rule is given, and yet that person tends to remain there. The boundary, even if symbolic, becomes real.
This experiment reveals something very intuitive: the way we think moves through limits. We recognize boundaries, we internalize them, and we often respect them without even realizing it. It’s a useful mechanism, because it helps us orient ourselves, reduce risk, and make decisions faster. But it’s also the same mechanism that, over time, tends to reduce the space of possibilities.
This is exactly the principle we have transferred into artificial intelligence.
We have trained it to recognize what is correct, what works, what is coherent. We have taught it to avoid mistakes, to stay within margins, to produce responses that don’t create friction. The result is an extraordinary tool, probably the most powerful one we have today. But it is also a tool that, by its nature, tends to operate within what has already been defined.
If the perimeter is correct, the result will be correct. If the perimeter is limited, the result will be limited.
No one is trying to question artificial intelligence; it would be completely out of place to do so today. But we do need to recognize that, when we use it only to avoid errors, we risk reinforcing the very mechanism we already have: staying inside the circle.
Creativity doesn’t work like that.
It doesn’t come from correctness, but from divergence. From something that is not fully defined at the beginning, from an interpretation that deviates, from an error that opens an entirely different direction. That’s where ideas are born that don’t resemble what came before, where something emerges that wasn’t already expected.
If everything becomes correct, everything also starts to look the same.
And over time, this stops being just a creative issue and becomes a strategic one. Because when everything is coherent but indistinguishable, it becomes harder to take a position, to be recognizable, to build something that truly has its own shape.
This is why the point is not how much we use artificial intelligence, but how.
If we use it to accelerate what we already know how to do, it becomes a powerful ally.
If we use it to replace thinking, it risks becoming an invisible limit.
A circle, in fact.
And like any circle, the problem is not that it exists, but forgetting that you can step outside.



