The Flynn Effect is dead. IQ scores are falling. And the generation you blamed for killing Applebee’s turned out to be the apex.
Nobody asked millennials to be the pinnacle of measured human intelligence. But here we are.
For twenty years, every magazine, news segment, and Thanksgiving dinner featured some variation of the same thesis: millennials are soft, entitled, financially illiterate, and allergic to hard work.
We killed the housing market, the diamond industry, napkins, golf, department stores, and apparently the entire concept of fabric softener. We’re a generation of coddled, avocado-loving, participation-trophy-clutching disappointments.
Then the data came in.
Turns out the IQ escalator that had been running up since the 1930s, carrying each successive generation a few points higher than the last, stopped moving right after we stepped off.
Not stalled. Reversed.
Gen Z, the first cohort raised entirely inside the screen, is the first generation in modern history to score lower on standardized cognitive assessments than the one before them.
Which means millennials are the top. The peak. The final firmware update before the system started degrading.
We didn’t earn it. We didn’t plan it. But we’re absolutely going to be insufferable about it.
What is the Flynn Effect?
The Flynn Effect was one of the tidiest findings in cognitive science.
Named after James Flynn, who documented the trend in the 1980s, it described a remarkably consistent pattern: raw scores rose by roughly three IQ points per decade, so each generation tested measurably higher than the last.
Decade after decade, across dozens of countries, the line went up. Your grandparents were sharper than their grandparents. Your parents topped theirs. Millennials topped boomers, which in hindsight should have been treated as the civilizational red flag it was.
Then, sometime in the 1990s, the line bent downward. And it kept bending.
This isn’t one alarming study from one small country. Bratsberg and Rogeberg published their landmark findings out of Norway in 2018, documenting a sustained, within-family decline in the IQ scores of successive cohorts of young men. Similar reversals have been documented in Denmark, Finland, France, the Netherlands, the UK, and the United States.
The pattern is now broad enough and consistent enough that the academic debate has shifted from “is this real” to “how bad is this going to get.”
The candidate explanations are a buffet of modern anxieties. Changes in educational philosophy. Nutritional decline. Environmental contaminants. The microplastics that have colonized every organ in your body.
But the explanation that makes everyone the most uncomfortable is also the most obvious: that outsourcing your cognitive development to a device specifically engineered to hijack your dopamine circuitry might not produce the same neurological outcomes as, say, reading a book because there was literally nothing else to do on a Tuesday afternoon in 1994.
The researchers are trying to be polite about this.
They use phrases like “changes in cognitive stimulation patterns” and “shifts in leisure time allocation.”
What they mean is: we gave children a capitalist algorithm disguised as a distraction tool and then acted surprised when their abstract reasoning scores declined.
Whatever the cause, the trajectory is clear. The Flynn Effect climbed for roughly eighty years, peaked with millennials, and then reversed. We are standing on the summit of a mountain that took a century to build, looking around, and slowly realizing that maybe nobody else is coming up.
Is it because we were bored?
IQ scores are one thing. They measure something, even if nobody can fully agree on what that is.
But the stronger case for millennials has nothing to do with test performance and everything to do with a developmental accident that produced the most cognitively versatile generation in history.
Here is what childhood was in the late 1980s and early 1990s: it was boring.
Not “TikTok is down today” boring. Existentially boring.
Your options for entertainment were whatever books were in the house, riding your bike in circles, drinking from the hose, or whatever game you could invent with a stick and your own imagination.
That was pretty much it. That was the whole menu.
This, as it turns out, was the equivalent of feeding a developing brain premium fuel.
Boredom forces the mind to generate its own stimulation. It activates the default mode network, which is the brain system responsible for creative thought, self-reflection, and the kind of abstract problem-solving that IQ tests are actually trying to measure.
Every cognitive scientist studying this network will tell you the same thing: unstructured mental downtime isn’t wasted time. It’s where the real cognitive construction happens.
We didn’t know that. We just thought Tuesdays were long.
Then the internet arrived.
Not the internet of 2025, with its personalized feeds and algorithmic curation. The internet of 1998. The ugly one.
The one that screamed at you through a modem for forty-five seconds before showing you a webpage that looked like a ransom note designed by a fourteen-year-old.
The internet where nothing worked right, where finding information required actual detective skills, where downloading a single song could give your family computer a disease that would take your dad until Easter to fix.
This internet was an obstacle course, not a conveyor belt. And millennials learned to navigate it at exactly the age when the brain is most capable of absorbing new systems and integrating them into existing cognitive architecture.
Roughly ages ten to fifteen, when prefrontal plasticity is at its peak and the brain is essentially a biological sponge that hasn’t yet learned to be cynical about what it absorbs.
The result was a generation with a cognitive profile that has never existed before and will likely never exist again.
We had an analog foundation: deep reading, sustained attention, comfort with ambiguity, the ability to think without technological assistance.
We also developed digital fluency: intuitive understanding of complex systems, rapid adaptation to new tools, and (critically) a bone-deep skepticism about technology born from years of watching it fail, crash, and lie to you.
Gen X got the analog childhood but came to digital technology a bit too late. Their brains were already built. They use technology the way a fifty-year-old uses a skateboard: technically possible, fundamentally unnatural.
Gen Z got the digital immersion but never had the analog foundation. They were born into a world where every question had an instant answer and every impulse had an instant outlet.
They are fluent in the language of technology in the way that a fish is fluent in water. They don’t think about it. Which is precisely the problem, because you cannot critically evaluate a system you’ve never experienced the absence of.
Millennials are the only generation that got both. Old enough to have wired our brains on books, boredom, and the card catalogue at the public library. Young enough to have rewired them on the internet during peak neurological plasticity.
We are cognitive bilinguals in a world that is about to need exactly that.
Millennials will dominate the AI world
All of this would be a fun bit of generational trivia if the world were staying the same. But the world is not staying the same.
The world is in the early stages of an AI transformation that is going to reorganize how every knowledge-based profession operates, and it is going to do so within the next decade.
And effective AI use, it turns out, requires a very specific cognitive toolkit. One that maps almost perfectly onto the millennial developmental profile.
Working well with AI is not about prompting. Any idiot can type a question into a chat box.
Working well with AI requires the ability to think clearly about a problem before you engage the tool.
It requires enough independent domain knowledge to evaluate whether the output is brilliant or a confident hallucination wearing a lab coat.
It requires comfort with iteration, the willingness to treat AI as a collaborator that needs direction rather than an oracle that dispenses truth.
And it requires the one thing that separates a useful AI operator from a dangerous one: the instinct to question what the machine gives you.
That instinct does not develop as much in people who grew up trusting technology implicitly because it always worked.
It develops in people who grew up watching technology lie, crash, and occasionally delete their homework.
It develops in people who learned to spot a sketchy website in 2003 because nobody had built a blue checkmark system yet and your only defense against misinformation was your own judgment.
It develops, in other words, in millennials.
The typical Gen X executive uses AI like a vending machine. Insert query, expect product, get frustrated when the result isn’t perfect the first time.
They have the critical thinking but treat the technology as a black box because it arrived in their lives later.
The Gen Z analyst can prompt circles around everyone in the room but struggles to focus for longer than ten minutes and can’t evaluate whether what comes back is any good, because they have little domain experience.
They’ve also never had to build an argument from scratch without autocomplete, so they lack internal benchmarks for what good thinking looks like when it doesn’t come pre-packaged.
They don’t always know what they don’t know, which is the single most dangerous gap you can have when working with a system that sounds certain about everything, including the things it just made up.
The millennial sits in the middle.
We can think without the machine, which is what qualifies us to think with it.
We understand technology well enough to use it intuitively but poorly enough (or rather, we remember it being poor enough) to never fully trust it.
We are the generation that learned to build before we had Claude Code, and that’s exactly who you want operating the system.
So after twenty years of “millennials ruined everything” think pieces.
Twenty years of being told we were too fragile for the real world, too entitled for the job market, too broke for the housing market, too weird for the institution of marriage.
We were the first generation in modern American history to be worse off financially than our parents, and we were blamed for it, as if choosing between avocado toast and a down payment were an actual financial dilemma rather than a lazy columnist’s fever dream.
And now, at the precise moment when the most transformative technology since the printing press is reshaping every industry on Earth, the analog-to-digital developmental arc that defined millennial childhood turns out to be the ideal cognitive preparation for what’s coming.
We are the bridge generation.
The last generation that learned to think before the thinking tools showed up.
The IQ curve peaked with us, the digital fluency window closed behind us, and the AI revolution arrived just as we hit the prime of our working lives.
The universe doesn’t have a plan.
It just stumbles forward, breaks things, and every once in a while, through sheer accident, produces a cohort of humans whose particular combination of skills and timing is exactly what the moment requires.
That cohort, against every expectation and prediction, is a bunch of thirty-somethings who can explain what the Dewey Decimal System is, deploy a machine learning pipeline, and remember a time when you had to print out MapQuest directions and hope for the best.
We are peak human. Not because we tried. Because we happened.
And the timing, for once in our financially cursed lives, is perfect.
Follow me on X for more bangers: x.com/thomasunise
I would like to note that while millennials may represent the apex of measured human intelligence, this has not translated into affordable housing, functional retirement planning, or the ability to explain what we do for a living to our parents. Evolution, it seems, has its priorities.