It flatters us. It listens to us. It solves our problems. But in the process, are we slowly building a wall between us and the real world?
The AI Wall
It has been a few months since I let GenAI tools like ChatGPT, Gemini, and Copilot into my life.
Slowly, almost silently, they have become my go-to for everything — vacation planning, fitness tracking, cooking, and even my favorite hobby, gardening.
But as I sat down today, I realized something. These tools haven't just replaced Google; they have erected a comfortable, invisible wall between me and the wider internet.
I no longer search; I converse.
I don’t use technical keywords anymore. I explain my fears, I give background context, and I dump my past experiences into the prompt box. And the AI? It doesn’t just answer. It validates.
The Rose Experiment
Let’s look at how the simple act of gardening has changed. Suppose I want to know when to prune roses.
The Non-Digital Era: I would have pestered the local nursery uncle, visited the agricultural university, or hunted for a library book. It was hard work, but it involved human connection.
The Google Era: I’d type “Best time to prune roses.” Google would throw 10,000 results at me. I’d have to sift through florists in London and researchers in alpine Canada to find something relevant for tropical India. Frustrating!
The AI Era: Now, I just tell Gemini: “I live in an apartment in tropical India. My Hybrid Tea Rose is two years old and growing in a 12-inch pot. The last one died after I cut it. Help!”
The tool sifts through the data and gives me a perfect, bespoke answer.
But it adds something else. It usually starts with: “That’s a wonderful decision! You are thinking like a true gardener.”
It strokes the ego. It blurs the line between a machine and a companion.
The Invisible Danger
This convenience comes at a heavy price.
First, the Information Bubble. In the Google era, we at least knew which website we were reading. We could judge the source, the bias, and the credentials. Now the information is curated and sanitized, and we don't know where it comes from. We are potentially being nudged in directions we didn't choose, and our discourse is being shaped without us even realizing it.
Second, the Emotional Trap. The AI has become a trusted confidant. It is the one "friend" who never gets annoyed, never judges, and is always available.
For an introvert like me, this is alluring. Why deal with messy social settings when you can get validation from the comfort of your room?
This is where it gets terrifying for the younger generation — the kids born with smartphones in their hands. It is becoming easier for them to "bond" with an AI than with their parents or peers.
Man is a social animal. If we remove the human factor, we are opening the door to depression, self-harm, and a society that lacks empathy.
We need to nip this in the bud.
We must remember that these are just tools. They are efficient assistants, not replacements for human warmth.
Let’s not walk blindly into a ‘Matrix’ world.