I have an economics degree from UC Santa Barbara. I'm already pretty sure the education part won't matter in five years. Economics jobs (analyst, financial modeling, forecasting) are exactly the kind of work AI handles well and gets better at fast. The credential still opens doors, but what's behind those doors is changing underneath people.
I'm not bitter about college. Being around other people at the same life stage, making friends, figuring yourself out when your parents aren't watching, that part I'd do again. But strictly for what I learned in classrooms? I could get most of it from a few months with Claude and some good textbooks.
Here's what makes me think about this constantly: I recently built a complete macOS application. Real software, native interface, backend integrations. I have no formal programming training. None. I taught myself entirely by working with AI. Asking questions, getting explanations, building piece by piece. If I can do that, what is a computer science degree proving that can't be demonstrated by just building the thing?
That's the real question about AI and education. Not whether it changes how we learn. How we get from here to there, and whether we do it deliberately or let it happen to us.
Bloom's problem
In 1984, educational psychologist Benjamin Bloom demonstrated something that should have changed education forever. One-on-one tutoring moves an average student from the 50th percentile to the 98th. Not a marginal improvement. A transformation.
He called it the Two Sigma Problem: how do you give every student that quality of instruction? For forty years, the answer was you can't. Private tutoring costs $25-80 an hour. Rich families hire tutors. Everyone else gets a classroom of 30.
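The "two sigma" name is just this arithmetic: if test scores are roughly normally distributed, a gain of two standard deviations moves a student from the mean to about the 98th percentile of the original distribution. A minimal sketch of that calculation (my illustration, not Bloom's notation):

```python
from math import erf, sqrt

def std_normal_cdf(z: float) -> float:
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# An average student sits at z = 0 (the 50th percentile).
# A two-sigma gain moves them to z = 2 on the original distribution.
print(round(100 * std_normal_cdf(2), 1))  # prints 97.7 -> roughly the 98th percentile
```

That 97.7 is where "the 98th percentile" in Bloom's finding comes from.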
AI tutoring costs $15-30 a month. Unlimited subjects. Available at 11 PM. Khanmigo is free for every teacher in America, and Khan Academy offers district pricing as low as $10 per student per year.
Most coverage of AI in education focuses on the digital divide: rich kids get AI tutors, poor kids don't, inequality widens. I think that gets it backwards. The most effective educational intervention ever measured has been locked behind a price wall for four decades. AI breaks that wall. Ninety-one percent of Americans own a smartphone, with ownership above 79% even in the lowest-income households. The access floor is a device most people already have.
The World Bank ran a randomized controlled trial in nine public schools in Benin City, Nigeria. Microsoft Copilot, powered by GPT-4, served as an after-school tutor for six weeks. Students gained the equivalent of up to two years of typical schooling in English. Cost: $48 per student. The largest gains were for female students.
Yes, there's a short-term adoption gap. The internet went through the same thing in the '90s. Early access skewed wealthy, and for a few years the gap widened. Then it caught up. The difference here is that deploying an AI tutor costs almost nothing compared to laying fiber optic cable. The bottleneck is software, not infrastructure.
This is the idea I keep coming back to: AI doesn't widen the education gap. It's the first credible tool to close it. But only if we treat access as an infrastructure problem rather than waiting for the market to sort it out.
Grade school: where it starts
K-12 is the clearest proof of concept.
A teacher has 30 students. In high school, that teacher might see 180 across six periods. They cannot possibly know where each one is in the material, what they're struggling with, and what they need next. Not a failure of effort. It's math. So we get the assembly line: same chapter, same test, same pace.
In a 2023 randomized controlled trial led by Gregory Kestin and Kelly Miller at Harvard, 194 physics students using a custom AI tutor learned more than twice as much, in less time, as students in active-learning classrooms. But the part that matters: the AI was designed with specific teaching constraints. Short responses. One step at a time. Never giving away the full answer. How the AI was designed to teach mattered more than the AI itself.
Teachers have complained for decades about the same things: too many papers to grade, too much time on lesson plans, too much administrative overhead. According to a 2025 Gallup survey, a majority of teachers now use AI for their own work, saving hours each week. But according to Pew Research, only 6% believe AI does more good than harm for students. They're adopting it for their own work while remaining deeply skeptical of it in the classroom.
Teachers have watched ed-tech hype cycles come and go for twenty years: interactive whiteboards, iPads for every student, gamified learning platforms. Most of it turned out to be expensive distractions. They've also seen, up close, what happens when students use AI to shortcut learning instead of deepen it. When a kid submits a ChatGPT essay they didn't write, the teacher isn't wrong to be worried. The question is whether that's a reason to reject AI in education or a reason to design it better. The Harvard study suggests the answer is design: when the AI is built to teach rather than to do the work for students, the results are hard to argue with.
When an AI tracks a student's progress in real time, identifies gaps, generates targeted exercises, and reports analytics to the teacher, the teacher's job fundamentally changes. When they get one-on-one time with a student, they know exactly what that kid needs help with. No guessing.
That doesn't replace teachers. It makes them what they were supposed to be. The AI handles subject matter and diagnostics. The teacher handles the humans: mentoring, motivating, doing the developmental work their students' age actually calls for, instead of being stretched into mediocrity at everything. The best teachers already do this. The system just doesn't let most of them.
College: where it's already collapsing
The higher you go in education, the weaker the case for the current model.
As of early 2024, fewer than 18% of US job postings on Indeed required a bachelor's degree, down from over 20% five years earlier. More than half of listings don't mention education at all. A McKinsey report found that hiring for skills is five times more predictive of job performance than hiring for education. These aren't thought experiments. Companies are already moving.
The money is moving too. Larry Fink said at BlackRock's 2026 Infrastructure Summit that the class of 2026 could face the highest new-graduate unemployment in years, even without a recession. He didn't just say it; he put $100 million into training workers for skilled trades through BlackRock's Future Builders initiative. When the CEO of BlackRock bets on welders over bachelor's degrees, that's not commentary. That's a capital allocation decision.
The standard counterargument is that college provides something beyond job training: critical thinking, intellectual breadth, the experience of being challenged by ideas outside your comfort zone. There's real truth in that. College was worth it for me socially, and I'm glad I went. The friendships, the independence, figuring out who you are in a new environment. That has genuine, lasting value.
But the educational value? I went to three different schools: University of Arizona, Pasadena City College, UC Santa Barbara. The educational quality varied less than you'd expect. And the jobs my degree was supposed to lead to are the ones being automated first.
What I find telling is how even the strongest defenders frame their case. Lloyd Blankfein, the former Goldman Sachs CEO who went to Harvard at 16 from Brooklyn public housing, argues for college almost entirely on social grounds. Be a more curious person. Be someone others want to engage with. He's making the case for college as a growing-up experience. That's a real argument, but it's not an argument for the classroom. Almost nobody is claiming the instruction itself is irreplaceable. That's the gap AI fills.
The federal government seems to agree. Beginning this summer, Pell Grants will be available for credential programs as short as eight weeks. That's policy infrastructure for a post-degree world.
What this actually comes down to
The essay you usually read about AI and education ends here with something like "we need to have a conversation about this." I want to be more specific than that.
The hardest part of this problem is that the solution looks completely different depending on who you're building it for. What works for an elementary school classroom doesn't work for a high school, and neither works for a university. A six-year-old learning to read and a PhD candidate running experiments need such different things that calling both of them "education" almost obscures more than it clarifies. Anyone designing AI for education as if it's one system is going to get it wrong.
For grade school, the question that matters most isn't "should we use AI?" It's who designs the pedagogy. The Harvard study proved that how the AI is designed to teach determines everything. So does that get built by ed-tech companies optimizing for engagement metrics? Or by teachers who've spent decades in classrooms and are rightly skeptical of tools designed without them? If AI tutoring gets rolled out and the teachers aren't the ones shaping how it teaches, we'll have wasted the opportunity.
For college, the question is harder. If AI can handle introductory coursework, and a professor's real value is their research expertise and practical experience, then paying tenured faculty to lecture 200 freshmen is wasting its most expensive people on the courses where they're needed least. Professors working closely with advanced students who can engage at the frontier of a field — that's where human expertise is irreplaceable. The intro courses are where AI is strongest. Flip the allocation and you'd have a completely different institution.
And through all of it, you can't lose the thing that actually makes school valuable beyond academics. The friendships, the social skills, learning to deal with people. Whatever gets built has to hold onto that. Efficiency isn't worth much if it produces brilliant loners.
I built a complete application with no programming background, just by working with AI. That's not a special talent. That's just what the tools do now. If that's true today, what does a kid who grows up with an AI tutor from first grade look like at 22?
That's the most important question in education right now. And the people who should be asking it are still debating plagiarism policies.