AI recreates parts of a master’s course in hours
Penn professor uses Anthropic's Claude to replicate a curriculum and self-study path, leaving universities selling credentials and proctoring
A University of Pennsylvania economics professor says he used Anthropic's Claude to replicate part of a master's-level course in roughly 12 hours, a workflow he argues threatens the business model of universities. In an interview with Business Insider, Jesús Fernández-Villaverde described using the chatbot to build a syllabus-like structure and guide what to study, compressing what is usually delivered over weeks of lectures and readings.
The real headline is not that a professor discovered self-study. It is that the expensive part of many degree programmes—content packaging and pacing—now has near-zero marginal cost, because a competent model can produce a tailored reading plan, explanations, and practice prompts on demand. Universities have long bundled multiple products into one invoice: instruction, assessment, credentialing, and access to a cohort and faculty network. If the instruction component becomes cheap and abundant, the remaining scarce assets are the right to certify, the ability to police assessment integrity, and the signalling value of admission.
That shifts pressure onto the parts of the system that cannot be automated away. Lab work, supervised research, and high-touch feedback still require humans and facilities. But much of what students experience as “teaching” in lecture-heavy courses is distribution: slides, problem sets, and office-hour clarifications. If a model can generate those on demand, the question becomes why a student pays for the distribution layer rather than only for the gatekeeping layer.
Gatekeeping, however, is where institutions tend to respond by tightening control. As AI makes take-home work and unsupervised assignments harder to trust, more programmes are likely to move toward proctored exams, identity checks, locked-down browsers, and surveillance-heavy assessment regimes. The technology that unbundles learning pushes universities to rebundle around enforcement.
Fernández-Villaverde’s experiment also highlights a second-order effect: the advantage concentrates with those who already know what to ask for. A student who can specify goals, test themselves, and iterate prompts will extract more value than one who treats the model as a vending machine for answers. In practice, that means AI can widen gaps inside the same classroom—while universities, whose pricing is still built around cohort averages, struggle to charge for a product that is increasingly personalised elsewhere.
Business Insider frames the episode as a warning to universities. The more concrete observation is simpler: a professor replicated part of what he sells, quickly, using a tool made by a third party. The institution still awards the degree, but the content is no longer scarce.