We are well into the process of AI upending higher education. It’s unclear what the university will end up looking like in the AI era – or even if there’s a role for universities at all. I’m confident there is, in principle, but it’ll involve a major retooling at the level of classroom experience. The good news is, at that level, it doesn’t have to wait for administrators. The bad news is, the future of the university as an institution depends on what instructors do today.
To imagine the role for the university in the AI era, we need to be clear about what its role is today – and what its role is not. It’s no cynicism to note that the value proposition is not students paying to learn. No doubt a lot of learning does go on in universities. And many people are excited and/or scared about the prospects of one-on-one AI tutors upending the traditional classroom.
But that would be a total misreading of the purpose of higher education. In this respect, LLMs are no more poised to upend higher education than MOOCs were in the late aughts (remember those?). There have been free online courses for whatever you like since 2008, self-taught learners have a vast array of high-quality resources at their disposal, and the threat to universities has been… basically none.
So why pay for a university experience? It’s not just the social experience: after all, people still pay for online classes!
Instead, the value of the university is that a diploma represents a stamp of approval. An employer can look at a diploma and infer certain things about a candidate that would be easy to fake in an interview: things like problem-solving ability, the stamina to follow through on long-term goals, and – yes – the skills learned in class. On the basis of this stamp of approval, employers are willing to pay, on average, 61% more for an employee with a diploma than for one without – even one who’s only one course shy of a degree. Compared to self-learning, that stamp of approval is easy for employers to verify, and therefore valuable for students to acquire.
So the value of the university lies not in its ability to teach, but in its ability to distinguish students with, from those without, valuable skills and traits. And traditionally, teaching is exactly how it does this. Lectures, exercises, and tests. Students who pass the gauntlet can be pretty reliably assumed by employers to have skills and traits that make them more valuable as employees.
But a great deal has changed in the past couple years. Even more than Chegg (the cheating clearinghouse where answers would be posted online), ChatGPT makes it easy to breeze through college – easy enough even to automate financial aid fraud at scale. Just plug homework questions, essay prompts, and whatever else, into ChatGPT and be done with it.
From a student’s perspective, getting a diploma is much, much easier now. And professors have been tempted to respond in a few ways, none of which resolves the underlying problem.
AI is therefore not a competitor to higher education, but it doesn’t clearly augment it either. Instead, the real threat is that it’s no longer possible to reliably distinguish good from bad students on any assignment with internet access.
This is an existential threat to the university. For a while, things will look good. Students who otherwise wouldn’t be up to snuff will decide that college isn’t that effortful after all, and classes will fill. But – as we say in economics – solve for equilibrium. Does a diploma post-AI mean the same as a diploma pre-AI? Will an employer be willing to pay that much more for an employee with a diploma, compared to one without?
And if the wage premium falls, it’s the good students who drop out before the bad students: they have the best outside options, so a devalued diploma stops being worth their while first, while weaker students still need it. The university enters a death spiral, and there’s no constructive student-facing role for it in the AI era.
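The unraveling can be made concrete with a toy model – every number below is invented, a sketch of the logic rather than a calibrated claim. Suppose employers pay a diploma premium proportional to the average ability of current graduates, and each student stays enrolled only while that premium beats their personal outside option, which rises with ability:

```python
# Toy adverse-selection spiral; all parameters are hypothetical.
abilities = [i / 100 for i in range(1, 101)]  # 100 students, ability 0.01..1.00
enrolled = set(abilities)
k, c = 1.2, 1.0  # premium scale and outside-option scale (made up)

while enrolled:
    # Employers can no longer distinguish within the pool, so the
    # premium reflects only the pool's average quality.
    premium = k * sum(enrolled) / len(enrolled)
    # A student stays only if the premium beats their outside option;
    # the best students have the best outside options, so they exit first.
    stay = {a for a in enrolled if premium > c * a}
    if stay == enrolled:
        break
    enrolled = stay

print(len(enrolled), max(enrolled, default=0.0))  # the pool shrinks toward the weakest
```

Each round of exits lowers the average, which lowers the premium, which drives out the next tier of students – with these made-up numbers the loop ends with only the weakest student still enrolled. That’s “solve for equilibrium” in miniature.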
The good news is that as long as AI doesn’t collapse the difference in performance between a diligent employee and a less-than-diligent employee, such a certification will continue to be valuable. The question then becomes: will the university – or any other institution – be able to reliably distinguish at lower cost than the eventual performance difference?
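That condition can be put in back-of-the-envelope terms – all figures below are invented for illustration. A certification is worth running only while the cost of screening a candidate is less than the extra value of the better hire it identifies:

```python
# All figures hypothetical, for illustration only.
productivity_gap = 20_000  # annual value gap: diligent vs. less-diligent hire ($)
base_rate = 0.5            # share of applicants who are diligent anyway
accuracy = 0.9             # chance the certification labels a candidate correctly
screen_cost = 4_000        # cost of certifying one candidate ($)

# Without the screen, an employer lands a diligent worker base_rate of the
# time; with it, accuracy of the time. The screen's value is the slice of
# the productivity gap captured by that improvement.
value_of_screen = (accuracy - base_rate) * productivity_gap
print(value_of_screen, value_of_screen > screen_cost)
```

If AI narrows `productivity_gap`, or AI cheating drags `accuracy` back toward the base rate, the inequality flips and the certification stops paying for itself – which is exactly the death-spiral condition above.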
As a certification, a diploma is a summary of many individual certifications represented by grades in individual classes. It’s at this level that AI is diluting the signal value of an education, and it’s at this level that the problem has to be dealt with.
I’ve noticed over the past few years in my classes that homework grades have risen about 20 points on average compared to when I started teaching. Incredible! But test grades are flat to slightly down. Not only that, but the students who do well on the homeworks are no longer the students who do well on the tests.
That tells me homework is a victim of what’s called Goodhart’s Law. To put it more intuitively than its usual formulation, Goodhart’s Law says that when there’s an informative signal (like a diploma, or homework), and when something important is conditioned on that signal (getting a job, or a good grade), people look for ways to get the signal without putting in the work. “Cheating the system”. And if they succeed, the end result is that the signal doesn’t tell you anything: the diploma doesn’t mean you’re smart, and getting an A on the homeworks doesn’t mean you’ll do well on the test.
A Goodharted signal is worthless unless the signal can be made to stay ahead of the cheaters. Quite simply, AI has already Goodharted homework, and is well on its way to Goodharting diplomas too. All the moralizing in the world can’t hold back the Goodhart tide.
But the flipside of Goodhart’s Law is that if we remove the stakes from a signal, it takes away the incentive to cheat the system.
So my policy going forward is that homework is optional. I’ll still assign it every week, and I’ll strongly recommend treating it as if it were required. I’ll give feedback on everything anyone turns in. But if you’re just going to feed it to ChatGPT, save us both the effort. The only things I’ll actually assign a grade for are things I can verify were done in class with students’ own brains (this includes classroom participation). And anyone who comes in for the test without having done the homework is guaranteed to fail it.
This approach significantly raises the stakes of tests. It violates a longstanding maxim in education: that successful teaching involves quick feedback – frequent, small, graded assignments that help students gauge how they’re doing and give them a push to actually do the work. “I’ll do it later” very easily turns into “oops, I never got around to it.” We’ve all been there, and I have a lot of sympathy for that.
Unfortunately, this conventional wisdom is probably going to have to go. If AI makes some aspect of the classroom easier, something else has to get harder, or the university has no reason to exist.
The signal that a diploma sends can’t continue to be “I know things”. ChatGPT knows things. A diploma in the AI era will have to signal discipline and agency – things that AI, as yet, still lacks and can’t substitute for. Any student who makes it through such a class will have a credible signal that they can successfully avoid the temptation to slack, and that they have the self-control to execute on long-term plans.
So my purpose in writing this is twofold. First, for the benefit of my students: to communicate to employers that passing my class is a meaningful signal in this specific way. And second, because the signal value of a diploma (and therefore, indirectly, the wage of a professor) is averaged over the quality of many, many classes: to convince other professors to think carefully about the grounds on which they can maintain their comparative advantage in distinguishing valuable skills even in the AI era.
I’m confident that the university can find a useful role in the AI era. Whether it will depends on us.
The header image was prompted with the cloud having “the wrong number of fingers, in the manner of early AI generated images” for the meta-joke, but in an interesting regression it seems AI has lost the ability to generate screwed up hands.