
How AI will lead humanity to dystopia

  • Writer: Tarasekhar Padhy
  • Aug 13
  • 6 min read

Every technological revolution has a cost. Each new tool offloads a difficult task that we once did the hard way, and doing it the hard way taught us a thing or two.


Consider the art of reading maps. 


Back in the day, folks navigated new towns, cities, and states with basic paper maps that offered no up-to-date traffic or roadwork details. The practice sharpened our ability to memorize routes quickly and ask for directions effectively.


With Google Maps, that has become a lost art.


Similarly, before drawing a conclusion about a political, social, or economic phenomenon, people once had to read multiple books, newspapers, and magazines. They had to manually sift through mountains of information to extract insights.


Now, the process of going through an immense amount of literature to find meaning is a skill only a handful have.


The same pattern will repeat with AI, but with a much higher cost.


The price of AI


Thinking.


One of the most important cognitive skills of all time is the ability to analyze raw information, apply logical reasoning, and reach your own conclusions. With the rise of the free press and network TV, that skill began to atrophy.


People started adopting the opinions and beliefs of their idols because the platform appeared credible and the message either fed their preconceived notions or elevated their emotional comfort.


The next technological revolution was the internet and social media. This dumbed humans down further. 


If I am being honest, the erosion of critical thinking was already in motion here. Boredom and leisure became relics of the past. Any free time was instantly soaked up by social media platforms powered by recommendation algorithms that know people better than they know themselves.


The era of massive shortcuts began. Instead of collecting information, people started looking for outright solutions.


Rather than writing their problem down and breaking it up into small chunks to uncover an answer, they resorted to the suggestions of their favorite content creators and influencers who were sponsored by the brands or motivated by money.


But even then, there was a bit of thinking involved. For instance, if a tech influencer recommended a smartphone, they’d at least mention their reasons. Moreover, you had the collective wisdom of the crowd in the comment section.


All of that got wiped out with AI.


Now, if you have a problem, just ask ChatGPT, and it will tell you exactly what you need to do.


Yes, it will provide extra information relevant to your use case. However, human nature will force us to simply pick up the actionable insight and move along. 


Zero thinking required.


But the internet is curated


Tools like ChatGPT, Perplexity, and whatnot fetch information from the web, then give you an answer based on your query in a conversational manner. Basically, they read a bunch of articles and blog posts for you to accelerate your decision-making process.


It means that whatever AI assistants recommend depends on the information they sift through.


And that’s where the problems start.


As a digital marketer and content writer, I can confidently tell you that the majority of the information on the internet that you may leverage to make daily decisions is only partially true.


Take even basic health-related queries, such as the impact of masturbation. Plenty of blog posts and articles suggest that it can be healthy if done correctly. Those “cans” and “ifs” give men an excuse to continue ruining their lives by jerking off.


The same is true for a query like, “Is McDonald’s healthy?” 


I see this trend among startups and tech businesses alike. A plethora of paid reviews flood various third-party websites to fabricate a false impression of a brand or its offerings.


You must have seen this on websites like Amazon, where trash products have glowing reviews.


So, when you use AI tools to shorten your decision-making process, you are effectively relying on strategically distributed lies across the internet.


For example, if you ask AI, “What’s the best protein powder?” it will recommend the one that is promoted heavily by influencers and has a ton of overly positive reviews on various platforms.


To be honest, I don’t blame AI. That’s what these tools are trained to do: read websites for you and draw a conclusion from that data to give you a reply. Quickly.


Humans’ thinking process


Let’s continue with the protein powder example.


I selected a protein powder by looking at the lab tests conducted by various independent testers and how much it matched what it said on the packet. Then, I compared pricing to determine the cost per gram of protein from each brand under consideration.


Before all of this, I learned about the protein-powder making process and defined what would be the ideal option for me. Being Indian, it was simple: cheap and best.


AI won’t do any of that.


It won’t determine a definition of success, build an evaluation process, collect relevant information, and analyze it all to give you an educated assessment. These tools simply parrot what’s written, even if you use their “reasoning” model.


Now, look, AI is a great tool for learning and collecting data itself. However, if humans, who are naturally inclined to take shortcuts and have been doing so significantly since the age of the internet, have an option to directly get a definitive answer, they will do just that.


I hope I am wrong about that.


Erosion of cognitive and emotional skills


Digital technologies took away our emotional skills first. Many individuals are extremely dependent on cheap dopamine-delivering pacifiers. Whenever a stressful situation emerges, they resort to social media, pornography, or video games instead of tackling it head-on.


I am guilty of that myself, although after recognizing that pattern, I’ve been improving myself gradually.


Unfortunately, the majority of people won’t follow this path. In fact, they will consume content that reinforces the belief that their addictions are a good coping mechanism for their self-diagnosed depression.


You can clearly see the result of this emotional instability across different social spheres. Everyone, including children, is pissed off for some reason. It seems that every member of human civilization has a short fuse.


People’s relationships have become diluted because whenever a difficult conversation rears its head, instead of going deep, they acknowledge it on the surface and proceed to indulge in meaningless frivolities.


Now, with the widespread adoption of AI through its seamless integration into everyday applications, our cognitive skills will go away, too.


In the upcoming years, folks will adopt others’ beliefs and consider this practice to be the norm. They will forget how to measure progress toward a particular goal and iterate their approach to optimize the outcome.


The highly valuable skill of asking logical questions and spotting inconsistencies in reasoning will decline as well. Many of us will lose the ability to wonder and to simply be curious.


It is already happening, and you can see its impact as well. On average, people have weaker cognitive and creative skills than they did a decade ago. They are more comfortable being consumers.


I’ve particularly seen this in the content writing field. Writing requires you to think and communicate while remaining factual and logical. The new generation of content writers is growing up with LLMs and seriously lacks these skills.


The worst part is that they prefer to use dedicated AI tools to generate content instead of grinding hard. No one likes to fail anymore, and influencers make everyone believe that they can become experts without failing multiple times.


Conclusion: A world full of biased NPCs


In the near future, humans will be far more comfortable following AI’s guidelines instead of carving their own path through life. The latter is tough and carries a far heavier price tag than whatever your favorite AI assistant’s monthly subscription will cost you.


These tools are designed to be engaging and to deliver a pleasant experience to users.


That simply means they won’t ever challenge you the way life does. They will never tell you the hard truths or teach you the tough lessons that help you grow.


They will make you just “good enough,” so you can coast through life as a consumer.


A consumer whose beliefs are shaped by the corporations that curate the internet through engineered facts written in sentences full of modals. 


The best part is that you will lack the tools to fight back. Interestingly, you may look at the few who still have those weapons with contempt, because they don’t fit your definition of normal. Most importantly, you will resist anyone who tries to pull you out of the rabbit hole, because of the comfort it promises.


Isn’t that the perfect dystopia?


Until next time,

Tara



© 2025 By Tarasekhar Padhy
