Rethinking the University in the Age of AI
Three years ago, I wrote a post titled "Getting schooled by AI, colleges must evolve". I argued that as we entered the age of AI, the value of "knowing" was collapsing, and the value of "doing" was skyrocketing. (See Bloom's taxonomy.)
Today, that future has arrived. Entry-level hiring has stalled because AI agents absorb the small tasks where new graduates once learned the craft.
So how do we prepare students for this reality? Not only do I stand by my original advice, I am doubling down. Surviving this shift requires more than minor curriculum tweaks; it requires a different philosophy of education. I find two old ideas worth reviving: a systems design mindset that emphasizes holistic foundations, and alternative education philosophies of the 1960s that give students real agency and real responsibility.
Holistic Foundations
Three years ago, I begged departments: "Don't raise TensorFlow disk jockeys. Teach databases! Teach compilers! Teach system design!" This is even more critical today. Jim Waldo, in his seminal essay On System Design, warned us that we were teaching "programming" (the act of writing code) rather than "system design" (the art of structuring complexity).
AI may have solved the tedious parts of programming, but it has not solved system design. To master system design, students must understand the layers both beneath and above what AI touches. Beneath the code layer lie compilers and formal methods. Compilers matter not so students can build languages, but so they understand cost models, optimization, and the real behavior of generated code. Formal methods matter not to turn every student into a proof engineer, but to teach reasoning about safety, liveness, and failure in terms of invariants. When an AI suggests a concurrent algorithm, students must be able to spot the race conditions lurking in it.
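To make that concrete, here is a minimal sketch of a check-then-act race, the kind of flaw a student should be able to flag in AI-suggested concurrent code. The names (`withdraw_racy`, `balance`) are illustrative, and the `sleep` calls only widen the race window for demonstration; the safety invariant is that the balance never goes negative.

```python
import threading
import time

balance = 100  # shared state; safety invariant: balance >= 0

def withdraw_racy(amount):
    global balance
    if balance >= amount:      # check
        time.sleep(0.01)       # widen the race window for demonstration
        balance -= amount      # act: not atomic with the check

# Two threads can both pass the check before either subtracts.
threads = [threading.Thread(target=withdraw_racy, args=(100,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(balance)  # frequently -100: the invariant is violated

# The fix: make check-then-act a single atomic step under a lock.
balance = 100
lock = threading.Lock()

def withdraw_safe(amount):
    global balance
    with lock:                 # check and act are now indivisible
        if balance >= amount:
            time.sleep(0.01)
            balance -= amount

threads = [threading.Thread(target=withdraw_safe, args=(100,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(balance)        # always 0
assert balance >= 0   # the invariant holds
```

The unsynchronized version lets both threads pass the check before either subtracts; the locked version serializes check-and-act, so the invariant survives. A student who can state the invariant can find the bug.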
Above the code layer lies system design. As Waldo observed, the best architects are synthesizers. They glue components together without breaking reliability, a skill that remains stubbornly human. System design lives in an open world of ambiguity, trade-offs, incentives, and failure modes that are never written down. This is why "mutts" with hybrid backgrounds often excel at system design. A liberal arts education trains you to reason about systems made of humans, incentives, and the big picture. Waldo's point rings true: great system designers understand the environment and its context first, then map that understanding into software.
Radical Agency
My original post called for flipped classrooms and multi-year projects. These still assume a guided path. We may need to go further and embrace the philosophy of the "Summerhill" model, where the school adapts to the learner rather than the learner to the school. In an age of abundant information, the scarce resource is judgment. Universities must stop micromanaging learning and start demanding ownership. The only metric that matters is learning to learn.
Replace tests with portfolios. Industry hires from portfolios, not from grades. Let students use AI freely; you cannot police it anyway. If a student claims they built a system, make them defend it in a rigorous, face-to-face oral exam using the Harkness method. Ask them to explain how the system behaves under load, under failure, and at the corner cases.
In 2023, I argued for collaboration and writing. In 2026, that is the whole game. Collaboration now beats individual brilliance. Writing is a core technical skill. A clear specification is fast becoming equivalent to correct code. Students who cannot write clearly cannot think clearly.
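As a small illustration (the function, types, and contract below are hypothetical, not from the original post), notice how a precise specification leaves almost nothing for the implementation to decide:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Reservation:
    flight_id: str
    seat: str

class SeatTakenError(Exception):
    pass

_reservations: set[tuple[str, str]] = set()  # in-memory store for the sketch

def reserve_seat(flight_id: str, seat: str) -> Reservation:
    """Reserve `seat` on flight `flight_id`.

    Precondition:  the seat is unoccupied when the call is made.
    Postcondition: exactly one Reservation exists for (flight_id, seat).
    Failure mode:  raises SeatTakenError if the seat is already claimed.
    """
    key = (flight_id, seat)
    if key in _reservations:
        raise SeatTakenError(f"{seat} on {flight_id} is taken")
    _reservations.add(key)
    return Reservation(flight_id, seat)
```

Once the precondition, postcondition, and failure mode are written down this clearly, a competent engineer or an AI agent can produce the body almost mechanically; the hard thinking lived in the spec.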
I also argued for ownership-driven team projects. To make them real, universities must put skin in the game. Schools should create a small seed fund. The deal is simple: $20,000 for 1% equity in student-run startups. The goal is not financial return (though I suspect returns will materialize); it is contact with reality. Students must incorporate, manage equity, run meetings, and make decisions with consequences.