The AI Training Paradox
I spend a significant amount of my time working with AI: developing it, refining it, and figuring out how it can best support human creativity and intelligence. But lately, I’ve been asking myself a troubling question: Are we putting more effort into training AI than we are into training ourselves?
AI models today are trained on vast amounts of data and fine-tuned relentlessly so that they keep improving. But how often do we invest in systematically upgrading our own knowledge and capabilities? If AI is constantly evolving, shouldn’t we be as well?
To understand why this comparison matters, it helps to know how AI is trained in the first place. Training an AI model involves feeding it large amounts of data (books, articles, and code repositories), allowing it to recognize patterns (e.g. how sentences are formed), and fine-tuning it (with human intervention and feedback) to perform specific tasks.
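To make that concrete, here is a deliberately tiny sketch of the same idea: feed in text, count the patterns in it, and use those patterns to predict what comes next. The corpus and function names are invented for illustration; real models like ChatGPT are vastly more sophisticated, but the principle of learning patterns from data is similar.

```python
from collections import defaultdict, Counter

def train_bigram_model(corpus):
    """'Train' by counting which word tends to follow which: a crude pattern."""
    model = defaultdict(Counter)
    words = corpus.split()
    for current, following in zip(words, words[1:]):
        model[current][following] += 1
    return model

def predict_next(model, word):
    """Predict by returning the most frequently observed follower of `word`."""
    followers = model.get(word)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# A toy, made-up "dataset" to learn from.
corpus = "the car stops at the stop sign and the car waits"
model = train_bigram_model(corpus)
print(predict_next(model, "the"))  # 'car' follows 'the' most often here
```

The more text such a model sees, the better its pattern counts become, which is the essence of why more training data tends to produce better predictions.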
Take Tesla’s self-driving cars, for example. The AI systems are trained on millions of hours of driving data, constantly refining their ability to recognize stop signs, pedestrians, and road conditions. The more training they receive, the better the cars get at driving. The same is true for Netflix and TikTok algorithms: the more data they collect on what we watch, the better they get at predicting what we will like next.
The paradox is clear: The better AI gets, the more we rely on it—but the more we rely on it, the less we push ourselves to grow.
In the past, if I wanted to deeply understand a topic, I would read multiple books and reach out to experts. Now, it’s tempting to just ask ChatGPT for a quick summary and move on. This shift isn’t just personal; it’s happening on a societal level. Employees are being trained to use AI tools, but not necessarily to develop the deeper thinking and adaptability that will keep them relevant in the long term. Companies invest millions in making AI smarter but often neglect upskilling their workforce.
But ultimately, AI can only generate content; it doesn’t create anything original. As AI takes over repetitive and analytical tasks, what’s left are roles requiring deep critical thinking, creativity, and emotional intelligence. If we stop investing in these uniquely human skills, we risk becoming passive operators rather than active thinkers. Human learning requires struggle: wrestling with ambiguity, making connections across disciplines, and challenging our own assumptions.
If we’re not careful, we risk a world where AI continues to get smarter while human intelligence stagnates. But we don’t have to let that happen. The key isn’t to reject AI but to make sure that while we train machines to learn, we never stop learning ourselves. Lifelong learning isn’t just a necessity—it’s the ultimate competitive advantage in the age of AI.
Sonam Pelden