The nature of language is evolving—not just in how we use it, but in how we understand it. My previous exploration, Beyond Words: The Dance of Language, Intuition, and AI, argued that language is more than a tool for communication; it is a medium of thought, an architecture for cognition, and a bridge between intuition and articulation. Insights from the MIT research detailed in Re-imagining our theories of language challenge some of these assumptions. What if language is not inherently tied to thought at all? What, then, is its true role? And how does this shift in understanding influence the trajectory of artificial intelligence, intuition, and human cognition?
More Than a Thought Machine?
For decades, Noam Chomsky’s linguistic framework suggested that language and thought were deeply intertwined—perhaps inseparable. The idea that an innate, pre-structured linguistic capacity underpins our cognition has long influenced theories about how we think, communicate, and even how we develop artificial intelligence. However, the MIT researchers—Ev Fedorenko, Ted Gibson, and Roger Levy—offer a disruptive finding: language is not the cornerstone of complex thought.
Their research, based on neuroimaging experiments, suggests that mathematical reasoning, music processing, and working memory function independently of the brain’s language centers. This means that while language enhances our ability to express thoughts, it does not create them. Thought exists in its own realm, often in the form of intuition, pattern recognition, and abstract reasoning—elements I previously highlighted as fundamental to our cognitive landscape.
This finding forces us to rethink the premise of AI-driven language models. If human intelligence does not fundamentally require language, then AI’s current trajectory—obsessed with perfecting linguistic fluency—may be missing something deeper. What if true artificial general intelligence (AGI) requires a non-linguistic form of cognition, a way to perceive the world beyond words?
The Linguistic Metaverse Revisited
In Beyond Words, I likened language to a metaverse—an evolving, shared reality where meaning is constructed and negotiated. The parallels remain striking: language abstracts reality, facilitates interaction, and evolves dynamically like any virtual space. However, MIT’s research forces a reconsideration: what if language is more like a filter than a reality?
If thought and language are distinct, then language is merely a transmission medium rather than the architecture of cognition itself. This distinction has profound implications for both human intelligence and AI. Language, it turns out, is more akin to an API between minds than a self-sufficient cognitive engine. It allows us to communicate thoughts, but those thoughts may exist in a deeper, non-linguistic substrate.
This would explain why intuitive insights—like those of art connoisseurs detecting forgeries or doctors making gut-feeling diagnoses—often emerge before language can catch up. Our brains do not “think” in words first; they recognize patterns, sense contradictions, and then articulate them. This is why AI models that process language flawlessly still fail at fundamental intuition-based tasks: they lack access to the deeper, non-linguistic modes of cognition that drive human understanding.
The Role of Communication in AI and Human Evolution
One of the key takeaways from MIT’s research is a shift in focus from “language as thought” to “language as communication.” This aligns with the idea that AI, to truly evolve, must move beyond linguistic mimicry and into functional, communicative interaction.
Large language models like GPT-4 and Gemini produce fluent language without genuinely understanding or thinking about what they say. This makes them brilliant at emulating structured discourse but weak at genuine problem-solving or introspection. For AI to reach deeper levels of intelligence, it may need to develop non-linguistic cognition—the ability to process concepts without relying solely on language.
This mirrors the evolution of human intelligence. Early humans communicated through gestures, facial expressions, and primal sounds before structured language emerged. AI, like our ancestors, may need to develop pre-linguistic forms of perception, intuition, and embodied cognition before it can reach the next phase of artificial intelligence.
Beyond Words and Into Meaning
If we accept that language is not the foundation of thought but rather its vessel, then our approach to AI, education, and even human self-awareness must shift. Instead of equating intelligence with linguistic ability, we should explore ways to develop and nurture intuition, pattern recognition, and non-verbal reasoning.
For AI, this could mean integrating sensory-based understanding—AI that can feel physical reality rather than just describe it. For humans, it may involve revisiting ancient, non-verbal forms of intelligence: meditation, embodied cognition, and the wisdom found in music, movement, and silence.
If the true purpose of language is communication rather than thought construction, then our challenge is not to make AI think like us, but to find ways to communicate with forms of intelligence that may already exist—both within and beyond ourselves.
The Next Frontier in AI and Human Understanding
Language remains one of humanity’s greatest tools, but it is not the entirety of our intelligence. The work from MIT’s cognitive scientists suggests that if we want to understand intelligence—both human and artificial—we must look beyond words. AI that seeks to truly mimic human cognition must develop non-linguistic faculties: intuition, pattern recognition, and embodied understanding.
Meanwhile, for us humans, this realization is liberating. We are not bound by language; our thoughts, emotions, and insights extend beyond the words we use to express them. If we embrace this, we might discover new ways of thinking, perceiving, and understanding that have been hidden beneath the veil of language all along.
As AI evolves, and as we deepen our understanding of cognition, one thing becomes clear: the future of intelligence—human and artificial—lies not just in refining words, but in transcending them.