Why learn anything in the AI Era?
We need an answer for everyone
Published: 2026-02-23
A critical question in modern society
It would not be an overstatement to say humanity’s dominance over the globe comes primarily from the capability of learning and passing down knowledge. Being able to learn from our environment and our mistakes helps us adapt to a wide range of situations, and being able to pass that knowledge down to the next generation makes every generation more capable than the previous ones without needing to pay the price of failure. That’s essentially how human civilization has triumphed over every other species while the octopus has not (plus octopuses live in water, so they can’t access fire).
Because of the rise of AI, the fundamental utility of learning is now challenged. “Why should I learn how to do math? I can just ask ChatGPT to do it.” You can basically replace math in that question with anything, and the question has probably been asked at some point. And this is a very important question, in case you haven’t realized. If one gives up the privilege of learning, what’s the difference between that person and an octopus? However, this is not a good answer in my opinion. What if I want to be as cool as an octopus? Given how badly humans suck at running a society these days, I might as well be an octopus.
Still, I do enjoy being a human, and this post is a reflection on why I should learn anything in the AI era. Before I give my current answer, let’s go through two answers I find unsatisfactory.
Unsatisfactory answer 1 - AI isn’t good enough
This is merely delaying the inevitable. Yes, LLM-powered AI is not good enough today, but in a decade or two (or even less!) it is not impossible that we will have AI that is simply better than humans at any kind of thinking. Today’s AI is plagued by all sorts of reliability issues — deleting the production database, hallucinating — and it is super expensive. The winning mode of using AI today is a collaboration between a human expert/supervisor and a team of AI grunt workers.
But one day the frontier will be pushed to a point where AI is more accurate and reliable than the best expert we can find, and cheaper too — at least I hope we will get there one day. For some domains that have strong inherent guardrails, such as math, we are rapidly approaching a point where AI may be more productive than the best of us. And this is what we want too (Terence Tao has been spearheading the SAIR foundation)! These AIs had better get good fast so we can just kick back and enjoy life!
Now, assuming one day this AI Utopia arrives, this answer becomes invalid. It is not what I am looking for; my hunch tells me there is a legitimate reason out there that will stand until the end of time.
Unsatisfactory answer 2 - There is something we should learn to stay relevant or productive
Just as the frontier labs and mega-corporations try to shove AI down our throats, they also try to make everyone learn AI so they can double dip. Narratives like “You should learn AI tools today to stay ahead of the curve” and “If you don’t learn AI, you will be beaten by your competitor who does” are quite prevalent at the time of writing. The goal is clearly to fuel the scaling era of AI: if demand cannot keep up with supply, then the value of the AIs these mega-corps spend billions developing is going to be driven down.
These narratives have two problems I can think of. First, the pace of AI development is beyond what any single human can constantly keep up with. Catching up with AI is rather like Sisyphus’s ordeal, except the boulder is rolling down the hill faster and faster.
Second, there are many skills AI will not be able to perform in the near future, and they usually involve interaction with the real world. If one really wants to stay ahead of the curve, perhaps it is time to go back to trade school and learn some real hard skills like plumbing. When there are millions of “programmers”, plumbers and electricians may be the new dream jobs.
The root issue of this narrative in fact has nothing to do with AI, but with the fact that it ties the meaning of learning directly to productivity or monetary incentives. I would prefer the meaning of learning to anchor on something that is not environment-dependent, so that regardless of when and where, there is always joy in learning.
My answer for me - it makes me who I am
What’s your relationship with truth? This is possibly another way to ask “Why learn…”. I love the allegory of the cave because it is such a vivid starting point for a conversation on what truth is. Truth exists out there, somewhere, but all we can ever sense is some projection of it. Whether it is a math theorem, a language system, or a news event, fundamentally all we can ever learn about it is a projection. (Okay, maybe a language system is a bit different, since in principle you can define a new language system, at which point you are the truth.) When I first learned about the allegory of the cave in college, the focus was mainly on the concept of the realm of forms, which is itself an interesting discussion point.
But over time I have come to see another facet of this allegory: what makes the prisoners different from one another? If they have the same perception of the truth and react to the truth in the same way, are they not all just thralls of the truth? Don’t get me wrong: learning truth is itself beautiful. But just as the allegory suggests, the source of the truth could be something manufactured (the fire) and much less fundamental than the laws of physics. To me, it is okay if we reach the deepest level of understanding possible, i.e. natural laws, and are enthralled by that; but it seems a bit silly if it is some manufactured reality that may itself fade in time, e.g. a tradition.
That may be a bit too abstract, so let’s bring it back closer to reality. Imagine a (not-so) far future in which people rely on AI for every possible way of getting information. These AIs are going to get so good at manipulating information that our behavior converges to a common pattern. At that point, other than physical differences, what distinguishes you from me? Are we not just organic extensions of the AI? There is no thought behind an action; we act just because the AI said so. At that point, the AI is our master.
Perhaps I should clarify a bit: I am okay with learning from an AI, but there must be a thought process behind it rather than just swallowing whatever the AI tells you. Otherwise one can be manipulated by any form of information input for the same reason, and historically it has happened — from word of mouth, to books, to radio, to the internet, and now to AI.
Learning is really a process of interacting with information and obtaining a unique projection of the truth. Sometimes the truth is so obvious that everyone’s copy is basically identical, but more often the truth is much weaker and there are many ways to interpret the same source of information. And I think it is that unique copy of the truth that separates us from one another. In a way, I don’t know who I actually am, but I have a projection of me in me. As I write this down, I realize I have perhaps arrived at “I think, therefore I am”, though it does feel a bit different from what Descartes meant. I don’t care whether I even exist or whether I am conscious, but I do care that I am fundamentally somewhat unique, however small the difference between me and the next person is.
In a more concrete setting, learning gives me sovereignty and freedom. The organizations running the AI obviously wield tremendous power over our thoughts. I am (somewhat) more free from whoever controls the AI, and if a massive solar flare takes out the entire world’s electrical system, the knowledge is still in me. I don’t rely on a bot to translate, and I know the working principles behind the subjects I have learned. This means that when we need someone to rebuild society, you know who to call.