I'm a current student, who also happens to be a full-time professional who is "all in on AI", and I think most are missing the true opportunities AI opens up for education.
Because my student path is non-linear (vs just following a life script), I may be a bit weird / not the average student, but it's especially true for me that I'm very intentional about actually learning the things I sign up for classes to learn.
My point is that I'm not taking classes just for the motions or to create slop. With that context, here is how AI helped me very specifically in a recent linear algebra course:
1. I was able to prompt very specific questions, usually audits of my work, in ways that produced responses more like a Socratic tutor than a cheating partner. That way I didn't need to bother my professor as much, or seek out a tutor, when I was stuck. But I also didn't shortcut my way to answers: I intentionally limited the AI's assistance to finding small errors or jogging my memory about missed steps or next steps.
2. I vibe-coded a note-taking web application (it started as a Chrome plugin for Notion) so that I could use shortcodes to pick math symbols while my other arm was full holding my newborn (yes, I'm a dad too). This has since evolved into a full-on science writing platform that I love whether or not anyone else ever uses it (though I am trying to turn it into a business). Maybe I actually ended up adding more work to my math class, but it added a layer to the learning (what math symbols are needed, what the typical patterns for this subject are, etc.) that I think helped with my overall absorption of the subject.
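(To make the shortcode idea concrete: the core of that plugin is basically a substitution table. Here's a toy sketch with made-up mappings, nothing like the actual code:)

```python
# Toy illustration of shortcode expansion: typed shortcodes become
# Unicode math symbols as you write. The table below is hypothetical.
SHORTCODES = {
    r"\alpha": "α",
    r"\beta": "β",
    r"\lambda": "λ",
    r"\in": "∈",
    r"\sum": "∑",
}

def expand_shortcodes(text: str) -> str:
    """Replace every known shortcode in the text with its symbol."""
    # Longest shortcodes first, so a code that prefixes another wins correctly.
    for code in sorted(SHORTCODES, key=len, reverse=True):
        text = text.replace(code, SHORTCODES[code])
    return text

print(expand_shortcodes(r"\lambda is an eigenvalue: Av = \lambda v"))
# λ is an eigenvalue: Av = λ v
```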
I don't know if #2 is transposable to other students or other subjects, but I imagine there is some version of a double major yet to be created that is Core + how to properly use AI to learn (including vibe-coding tools to help yourself and other students).
There are many other smaller ways AI can be used to help learning (flash cards, generated quizzes, etc) that are oft mentioned but that articles like this gloss over.
Having said that, I loved reading this (so well written it could not be AI, despite the em-dashes), and especially appreciate any mention of "The Whispering Earring", which is one of my spinning tops reminding me to remain vigilant about my cognitive health despite my almost complete embrace of AI.
Universities will still act as gatekeepers of prestige and status. There is no AI alternative to the top-20 schools... I remember all the hype from 10-15 years ago about how online learning and "MIT courseware" would upend the universities or threaten credentialism, and nothing even close to that happened. As it turned out, the online version of MIT is not a substitute for the actual thing.
Schools will adapt, as they already have, by weighting grades more toward in-class quizzes and tests. I think the humanities will continue to struggle, but I see the AI boom making STEM more relevant, even if AI can automate a lot of code or math.
On the other hand, it would be really good if universities stopped being gatekeepers of prestige and status. It seems like some of the biggest idiots in high-visibility posts right now come from the Ivy League.
>As it turned out, the online version of MIT is not a substitute for the actual thing.
More precisely, the people motivated enough to actually do the online MIT version were often already on a high-performance trajectory, and for the people who were not, few people took the online credential seriously, despite whatever skills they acquired.
I can't help but wonder if the fundamental problem is just that we spent decades pretending that a university degree was some sort of useful job training in the first place. As a professional software developer, I think my computer science degree is not actually all that important. Sure there are some relevant concepts, but they're ones you'd pick up on the job anyway.
I don't regret getting my degree (back in 2009), but I think requiring a person to have one is a dumb job requirement.
Frankly, we shouldn't have so many people going to university in the first place. There are a lot of people it's just utterly wasted on, and it drags down the entire apparatus as a result. In a sane society we'd have many more apprenticeships, vocational training, etc.
For Computer Science in particular: it's not supposed to be job training. CS is an education closer to math or science (in fact, at my university it belongs to the department of hard sciences and math). If you like that (and I sure did!) it'll be worth your time. If you're just looking for job training, you're looking in the wrong place.
My university CS program didn't even teach programming in any of the major classes, it was assumed you'd learn on your own or by doing one of the optional workshops.
There's a lot of stuff taught in CS programs that you simply won't learn on the job, or if you do, it won't be as rigorous and you'll be missing the fundamentals.
> There is an extreme idealist view of education that might see the threat of AI as good precisely because it could transform those kids — the former connoisseurs of SparkNotes and Mathway, the ones snickering in lectures and inking formulas onto their palms before exams before the rise of generative AI — into zombies lurching and stumbling their way into the “permanent underclass” (as the tech bros say), leaving the elect few free to enjoy the benefits of a humanist education without all the noise and din.
In this world, what are the benefits of a humanist education? The only reason we care so much about education is that it's how to determine merit in meritocratic societies, and therefore a key part in how people gain social status. In a world where AI does all the knowledge work and robots do all the physical work, with an 'elect few' owning everything and everyone else in a 'permanent underclass', why do the elect few even need to keep the permanent underclass alive?
The zombification of Americans was already happening before AI. Not surprisingly, it has long been one of the favorite themes in the country's cultural output. Zombies + superheroes was never poised to produce great non-drooling, non-moron Americans.
> The prevalence of AI use on college campuses, particularly at “elite” universities, is a cancer on our culture that threatens to turn a generation of promising young Americans into a class of drooling morons...
Right. I got my bachelors degree more than a decade ago, and they did a good enough job of it without AI. College was the first time in my life that I'd ever heard, "You cannot aspire to [ambition]," and I heard it there quite a few times.
Higher education needs reform more intensive than a simple defense against LLMs (as does the legal system and profession, as does the software engineering field, as does the field of psychology/psychiatry, as does-).
I think that universities just have to adapt to deal with slop, or think of new ways to challenge people to learn the essence of their studies. I wouldn’t want to be a uni teacher in these times though.
The (Western) educational system still does this for PhDs. Grades barely matter, and in most places you have two oral exams in your entire 5 years: your qualifying exam, and your final defense.
The reason it doesn't happen for the rest of the system is scaling. The US awards about 60k PhDs per year, compared to about 2M bachelors. There simply are not enough faculty and it is not realistic to hire enough (if there are even enough qualified people in existence)
And that's ignoring all of the problems with "not giving out grades" or "ending credentialism" - I guess people are supposed to just get hired on vibes?
I agree with you that no-tech parts of universities would work - obviously you can't avoid tech when teaching some things like coding, but mostly I think it would be a good idea.
There are problems: having students attend lectures is great, but they have to work with the material and prove they understand it - how do you do that without homework? I'm sure there are ways. Have them work in a building full of computers cut off from the internet, maybe, but how do you keep them from using their phones?
Another option is just severe comprehensive testing in heavily invigilated rooms, long after students have finished the classes involving the material, to prove they know it. Perhaps you could do this for the first few years of knowledge in a discipline and then assume the student actually is serious and take the leash off once they've passed the tests. I know some disciplines already did this kind of thing, even before AI. Basically everyone would have to pass a bar-exam type thing, even if they're studying art - but things like art can't really be condensed into an exam, and it would certainly restrict and narrow what can be taught and learned; that's a big problem in my mind. Also, what if there are new ideas in the study of physics that can't really be taught because the exam is too difficult to change quickly? What if there's a big split in the philosophy of business, but the exam only asks about one side of the split? What if you have an ingenious professor who wishes to talk about a new branch of philosophy he's created - not on the exam, though.
Edit: I guess if professors designed their own exams, instead of some distant exam committee, it would alleviate most of my concerns about them.
For coding you can actually teach students on Commodore 64s. It's actually better because they have a BASIC shell and assembly language. Most importantly, no internet. :)
Actually, give them internet why not. But they have to use a 56k modem. Mwhaaha
Tests. Many of my university courses only graded on tests. They strongly encouraged you to do the homework to better understand the material, but didn't consider homework completion when calculating your grade.
Consider that universities are educating adults who are -often- paying to be there. If we assume competent course design and instruction, if an adult chooses to not work on the material until they understand it, then the only person they're harming is themselves... which -as an adult- is a thing that they're usually fully entitled to do.
You give exams in person, in class, in blue books, no phones. This part isn't hard. Instructors have been doing it for generations. It's only in the post-COVID era that some have moved to take-home exams on Canvas or similar platforms. This is great for instructors -- less work! -- but I am not convinced it actually helps students.
The part that is more difficult is take-home work, and I think the solution is that instead of being something that you turn in for credit, it needs to move to being more of a chance to practice for in-person exams.
What about essays? I've taught classes where students had to write essays in class, in person. On paper, with a pen (this may no longer be allowed on many campuses because of access and perceived fairness reasons, which IMO is a shame, but it is what it is). I think the traditional assignment of "write a 15 page paper on XYZ" is probably done. Instead students will have to prepare to write an essay in class by reading the source material (books, papers, etc) and converse with AIs that are hopefully not hallucinating, to get an understanding of the material and then come to class and be prepared to write about it.
Assignments, sure. But if tests/exams are proctored in-person with pen and paper, the students may quickly pivot to traditional learning methods if they want to pass their courses.
It would actually be interesting to see what people do attempting to transcribe AI generated material to paper. At the very least it's another layer of learning in writing it out.
In engineering at my EU uni, homework and coursework were at most a tiny part of the total grade, and never enough on their own to pass. For relatively bigger projects, you'd have to pass an interview or similar review after delivering them. These were all just nudges to study and to check ourselves, and they were seen as a "gift" of the Bologna Process (the restructuring/standardizing of unis in the EU).
The only thing that mattered were the exams, be it pen and paper or coding/electronics labs, in person and proctored. No matter how much slop I could have access to back in the day I would have failed the same subjects I did.
Yes, but AI doesn't prevent you from doing that learning. What it makes harder is the old, broken ways of credentialing your learning. The way you do the learning has no need to change.
I personally feel like the software engineering profession may have to start moving more towards an apprenticeship model than a theoretical CS-graduate-then-work model.
Internship / coop programs at places like Waterloo already look a bit like this.
Slop made by students is one thing, but slop generated by faculty and fed at an extreme premium to students raises the question: why would someone pay for this instead of buying some LLM tokens, taking the curriculum, and teaching themselves?
If we want to teach students to use AI, it should just be a separate course, not shoved into every possible nook and cranny to the point that it's the teacher's AI talking with the student's AI, with light supervision from both AI handlers.
The prose is a bit too purple and tortured for that, IMO. Stock Opus 4.7 or 5.5 Pro is a more disciplined writer.
And, anyway, the point the article is trying to make is obvious. What's absolutely not obvious, and what it sheds very little light on, is what the University is going to look like in 10 years. Not what it should look like, but what it is most likely to look like.
It is easy to change the system prompt to make the AI talk with a different voice. It is remarkably hard (at least for Claude, I haven't experimented as much with GPT) to get it to not use so many em-dashes like this essay does.
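(When prompting fails, my fallback has been to post-process the output instead of fighting the model. A crude sketch of my own workaround, which obviously can't fix deeper style issues:)

```python
import re

def strip_em_dashes(text: str) -> str:
    """Crudely rewrite model output: a spaced em-dash between clauses
    becomes a semicolon; any remaining em-dash becomes a comma."""
    text = re.sub(r"\s+—\s+", "; ", text)
    text = re.sub(r"—", ", ", text)
    return text

print(strip_em_dashes("The model is verbose — relentlessly so — by default."))
# The model is verbose; relentlessly so; by default.
```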
This is such an ignorant trope. The last few places I worked ALL used em-dashes as part of house style and I will continue to use them. It's extremely common (and arguably the LLMs do it because it is extremely common).
There's no way. Just the first paragraph alone is enough to convince me; it's too well-written and melodious to be AI, with too much original thought:
Today, the demonic vice of the old is not that they are hard and demanding on the youth — instead they do not demand enough from us, and they cannot quite believe that we have not lived up to the little they have demanded. They think too well of our generation.
Without defending the quality of the rest of the essay, it's a great start. LLMs today could never match it.
Style is the wrong diagnostic. Purple prose and em-dashes can be prompted in or out. The harder question is whether the reasoning was committed or generated. A distinctive voice tells you nothing about whether the person actually worked through the argument or had it produced for them. Which is sort of the point the essay is making about students.
AI generated or not, I concur. I really want to know what Universities will look like in 10 years' time.
What will be taught there that cannot be taught by an AI (whatever form or interface it has)?
Will Universities still be centers of knowledge and exploration? Or will that be more disseminated through society, so that Universities are not so important?
What courses will exist? Are those vastly different from today's courses?
> AI generated or not, I concur. I really want to know what Universities will look like in 10 years' time. What will be taught there that cannot be taught by an AI (whatever form or interface it has)?
Computer-assisted instruction has been amazingly unsuccessful. Why is that?
If people attend university for certification instead of education, I think the battle is already lost. AI is an easy, but possibly high-risk, approach to gaining the certification without the work, but the tried and tested approach is doing the bare minimum, cramming, then forgetting everything after graduation.
If you penalize people who used AI but in the process learned the required information, you make the problem even worse.
These problems are all because of a culture that favours the measurement over what is being measured.
> These problems are all because of a culture that favours the measurement over what is being measured.
What other kind of culture is there? A culture of not measuring?
A culture rooted in φιλοσοφία (Greek; philosophia in Latin transliteration). So yes, I meant that literally. There were times, already 2,500 years ago, when people wanted to study to become wiser.
Read the last four words of the sentence you quoted. You'll find your answer.
> These problems are all because of a culture that favours the measurement over what is being measured.
Spot on. I am a teenager going to college soon and I feel the same way about the education system (and, by extension, the job market, though I suppose the job market might be more understanding about all of it). Part of my comment was as follows:
I do feel a bit like coding, and a lot of the fun in life, is also like this: quantified, measured, transactional (posting for social media?) [even as I wonder if I am writing this comment for Hacker News karma or for relevant discussion talking points..]
This feels to me like the most irreversible consequence, because it might be hard for this generation (myself included) to see value in non-measurable things when everything has to be measured and transactionalized.
(...) I would like for humanity to be more nuanced and less measured but more varied (grey rather than black or white), but I feel that there is enough noise on the internet that maybe even this ends up becoming noise, and I am not sure if anyone who might benefit from reading this actually ends up reading it.
From one of my comments, written some time ago: https://news.ycombinator.com/item?id=47559013
>These problems are all because of a culture that favours the measurement over what is being measured.
Hear, hear!
Over a decade ago, my orientation at UChicago included the traditional "Aims of Education" address. They packed the whole first-year class into the chapel to explain, at length, that this education will not be "useful."
You're not supposed to make more money, or be happier, or really become anything other than a better version of yourself.
I wonder if they still do this.
Oddly enough, my reaction to this is that it's a broader societal problem as opposed to an A.I. problem.
Why shouldn't universities switch to examinations where no technology (apart from, say, calculators) is allowed, and this is strictly enforced? This was certainly the norm when I went to university.
I agree that A.I. trivializes (or changes how you approach) a lot of take-home work, but people who wanted to cheat could always do so to some degree. I guess it makes it easier; however, my expectation would be a greater reliance on, or weighting of, in-person examinations as a response, as opposed to a normalization of cheating.
One way in which A.I. could be seen as contributing to this is that it is devaluing what were seen as 'intellectual' pursuits, as we now have automation for them that is, at the very least, often surface-level effective for undergraduate work.
I always found it funny when people complained about whiteboard coding tests. Back in school, I had to write C code in those little blue books. :)
EDIT: I meant writing in blue books before this era of copying words out of the Claude app on your phone.
I would suggest the one exception to this would be courses explicitly designed to teach how to use AI, and how not to. But in that case, it's less "use AI to cheat on this course" and "AI is the tool this course is about."
Otherwise your suggestion makes sense.
Then it becomes: teach what? "To use AI", yes, and then to do what? Use it how? To make some software? Why? You are already taking software engineering classes to learn to make software. To write something? Why? You are already taking classes that ask you to write things yourself. An AI class, to me at least, is akin to taking a class about how to pay someone to write your essay for you.
And if we are talking about the various AI strategies people have, where they have LLMs talking to LLMs to come up with whatever gobbledygook, are the poor souls who've been asked to come up with the AI class for the department going to know any of these strategies themselves? Are these strategies even going to be sustainable once VC money is no longer subsidizing tokens?
> Tying education to a capital-intensive and (likely soon to be) tightly regulated technology is one more step toward a different, frightening future. A world in which independent educational institutions are neutered and transformed by their reliance on a central authority into factories designed to train students according to the “needs of society” is not a new prospect — it has been the persistent dream of Fabians, technocrats, and engineers…
I hadn’t thought of this. Every school district and university tied into centralized AI inherently undermines its ability to decide how its kids are to be educated.
> I don’t think she was laughing two years later when I was TAing the class and we observed a fairly distinct gap of about 40 percentage points between the take-home test and the one administered in-person.
40pp is massive. Take homes are pretty much dead at that point. And not just in schools, but also for interviews. I don’t see how you can get a meaningful signal, it’s guaranteed they will be made using AI.
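(For anyone who wants to sanity-check a gap like that in their own gradebook, it's just the mean paired difference. The scores below are made up purely for illustration:)

```python
# Hypothetical paired scores (percent) for the same students.
take_home = [92, 88, 95, 90, 85, 91]
in_person = [55, 48, 60, 47, 50, 52]

# Mean paired difference between take-home and in-person results.
gap = sum(th - ip for th, ip in zip(take_home, in_person)) / len(take_home)
print(f"mean take-home vs in-person gap: {gap:.1f} percentage points")
# mean take-home vs in-person gap: 38.2 percentage points
```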
Overall it’s a very good read, really enjoyed the author voice and style
The screenshot with formulas and then, in the middle of it all, "wait let me be more careful" had me laughing to myself.
The college system is creating the zombie underclass with AI or without it. The amount of money colleges charge combined with the text book thinking shapes people into thinking there are steps to success, there are right answers, and "getting a job" is the right way to go. Colleges don't teach independent thinking and that is the exact thinking we need in the era of Youtube and AI. You don't need college to teach you how to learn text book items anymore and I think that is scary to some.
> the text book thinking shapes people into thinking there are steps to success, there are right answers, and "getting a job" is the right way to go
Apart from entry-level texts, what discipline are you thinking of? Pretty much all my after-freshman-year undergrad texts contained debates.
> You don't need college to teach you how to learn text book items anymore and I think that is scary to some.
a) This is about universities, not "college". b) The university teaches you critical thinking, not how to learn "textbook items". It's not vocational training for the upper middle class; it's for building and developing citizens who can think critically.
Ok but I need a job to live.
And if I want to do something interesting I need the skills and knowledge which are learned at a college level.
> Ok but I need a job to live.
Not really; you need cooperation with other people in this complex world to live. Not necessarily a job: you could be self-employed, or a member of a cooperative, or an elected official.
But yeah, the capitalist default is to have a job, sure.
> And if I want to do something interesting I need the skills and knowledge which are learned at a college level.
Not really, no. You need the skills and knowledge, and for some professions you do need the official certificate of education, and for a subset of those that's actually warranted, because you cannot get your hands on the training any other way. Doctors kinda need the official system; a self-taught appendectomy would not be ideal. English literature? Not so much.
You need a job to live - under the current economic system. A lot of the critiques of capitalism lead you straight into AI, and most economic critiques of AI are critiques of capitalism as it exists now.
Trade schools and apprenticeships exist.
Trades can pay very well and frequently require nothing more than on the job training.
You think you need college for the same reason you equate "job" with survival. These are not universal truths, not even in capitalist hellscape America. It might be harder but it is in no way a requirement. Anyone who tells you different is lying to you.
AI really does destroy everything it touches.
Profs can push back on this if they want. Not all of them want to (or want to justify pushback given their pay).
For me when I teach, no laptops or phones in class along with in-class handwritten paper quizzes on course readings and concepts has helped a lot.
I wrote something along very similar lines recently! Even the zombie metaphor is quite close.
https://pistolas.co.uk/work-that-need-not-be/
(Comment would be stronger if you contrasted the pieces yourself.)
I'm a current student, who also happens to be a full-time professional who is "all in on AI", and I think most are missing the true opportunities AI opens up for education.
Because my student path is non-linear (vs just following a life script), I may be a bit weird / not the average student, but it's especially true for me that I'm very intentional about actually learning the things I sign up for classes to learn.
My point is that I'm not taking classes just for the motions or to create slop. With that context, here is how AI helped me very specifically in a recent linear algebra course:
1. I was able to prompt very specific questions, usually audits of my work, in ways that provided responses more like a Socratic tutor and not a cheating partner. In this way I did not need to bother my professor as much or seek out a tutor when I was stuck. But I also didn't shortcut my way to answers. I was intentionally limiting the AI assistance to finding small errors or jogging my memory about steps missed or next steps.
2. I vibe coded a note-taking web application (started as a Chrome plugin for Notion) so that I could shortcode and pick math symbols while my other arm was full holding my newborn (yes, I'm a dad too). This has since evolved into a full-on science writing platform that I love whether or not anyone else ever uses it (though I am trying to turn it into a business). Maybe I actually ended up adding more work to my math class, but it added a layer to the learning (what math symbols are needed, what are typical patterns for this subject, etc) that I think helped with my overall absorption of the subject.
I don't know if #2 is transposable to other students or to other subjects, but I imagine there is some version of a double major yet to be created that is Core + how to properly use AI to learn (including vibe coding tools to help yourself and other students).
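For what it's worth, the core of a shortcode-to-symbol expander like the one described in #2 is small. This is only a minimal sketch with hypothetical mappings, not the commenter's actual implementation:

```python
import re

# Hypothetical shortcode table; the real app's mappings are unknown.
SHORTCODES = {
    r"\alpha": "α",
    r"\beta": "β",
    r"\sum": "∑",
    r"\in": "∈",
    r"\RR": "ℝ",
}

def expand_shortcodes(text: str) -> str:
    """Replace backslash shortcodes with their Unicode math symbols."""
    # Try longer keys first so a short code never shadows a longer one
    # (e.g. \in matching inside a longer hypothetical code).
    pattern = re.compile(
        "|".join(re.escape(k) for k in sorted(SHORTCODES, key=len, reverse=True))
    )
    return pattern.sub(lambda m: SHORTCODES[m.group(0)], text)

print(expand_shortcodes(r"Let \alpha \in \RR and take \sum over terms."))
# → Let α ∈ ℝ and take ∑ over terms.
```

The same idea works as a keyup handler in a browser extension; the interesting design work is in the symbol table itself, which, as the commenter notes, is itself a form of studying.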
There are many other smaller ways AI can be used to help learning (flash cards, generated quizzes, etc) that are oft mentioned but that articles like this gloss over.
Having said that, I loved reading this (so well written it could not be AI despite the em dashes), and especially appreciate any mention of "The Whispering Earring", which is one of my spinning tops to remind me to remain vigilant of my cognitive health despite my almost complete embrace of AI.
Universities will still act as gatekeepers of prestige and status. There is no AI alternative to the top-20 schools...I remember all the hype from 10-15 years ago about how online learning and "MIT courseware" would upend the universities or threaten credentialism, and nothing even close to that happened. As it turned out, the online version of MIT is not a substitute for the actual thing.
Schools will adapt, as they have already, by weighting grading more towards in-class quizzes and tests. I think the humanities will continue to struggle, but I see the AI boom making STEM more relevant, even if AI can automate a lot of code or math.
On the other hand, it would be really good if universities stopped being gatekeepers of prestige and status. It seems like some of the biggest idiots in high-visibility posts right now come from the Ivy League.
>As it turned out, the online version of MIT is not a substitute for the actual thing.
More precisely, the people motivated enough to actually do the online MIT version were often already on a high-performance trajectory, and for the people who were not, few people took the online credential seriously, despite whatever skills they acquired.
All the courseware, classes, and schooling in the world cannot teach one to think.
Of course they can. What the heck?!?
Logic 101 changed the clarity of my thinking markedly.
> by weighing grading more towards in-class quizzes and tests
The piece discusses blue book tests where students were still cheating, with their phones providing AI responses.
that's a proctoring problem though, no phones during a test is typical to say the least.
And yet a Top 10 school like University of Chicago has apparently not been able to fix that problem.
That's telling in and of itself.
AI camera watching the students?
Faraday cage. EMP blast if that doesn't work.
I can't help but wonder if the fundamental problem is just that we spent decades pretending that a university degree was some sort of useful job training in the first place. As a professional software developer, I think my computer science degree is not actually all that important. Sure there are some relevant concepts, but they're ones you'd pick up on the job anyway.
I don't regret getting my degree (back in 2009), but I think requiring a person to have one is a dumb job requirement.
Frankly, we shouldn't have so many people going to university in the first place. There's a lot of people it's just utterly wasted on, and it drags down the entire apparatus as a result. In a sane society we'd have much more apprenticeships, vocational training, etc.
For Computer Science in particular: it's not supposed to be job training. CS is an education closer to math or science (in fact, at my university it belongs to the department of hard sciences and math). If you like that (and I sure did!) it'll be worth your time. If you're just looking for job training, you're looking in the wrong place.
My university CS program didn't even teach programming in any of the major classes, it was assumed you'd learn on your own or by doing one of the optional workshops.
There's a lot of stuff taught at CS careers that you simply won't learn on the job, or if you do, it won't be as rigorous and you'll be missing the fundamentals.
See also: https://news.ycombinator.com/item?id=48139148
> There is an extreme idealist view of education that might see the threat of AI as good precisely because it could transform those kids — the former connoisseurs of SparkNotes and Mathway, the ones snickering in lectures and inking formulas onto their palms before exams before the rise of generative AI — into zombies lurching and stumbling their way into the “permanent underclass” (as the tech bros say), leaving the elect few free to enjoy the benefits of a humanist education without all the noise and din.
In this world, what are the benefits of a humanist education? The only reason we care so much about education is that it's how to determine merit in meritocratic societies, and therefore a key part in how people gain social status. In a world where AI does all the knowledge work and robots do all the physical work, with an 'elect few' owning everything and everyone else in a 'permanent underclass', why do the elect few even need to keep the permanent underclass alive?
Zombification of Americans was already happening before AI. Not surprisingly, it's been one of the favorite themes in the country's cultural output for many years. Zombies + superheroes was never going to produce great non-drooling, non-moron Americans.
> The prevalence of AI use on college campuses, particularly at “elite” universities, is a cancer on our culture that threatens to turn a generation of promising young Americans into a class of drooling morons...
Modern education is like that, even before AI. Check this https://www.jstor.org/stable/25006902
> Check this
Evidence of people complaining about a thing isn’t evidence of the thing per se.
Right. I got my bachelors degree more than a decade ago, and they did a good enough job of it without AI. College was the first time in my life that I'd ever heard, "You cannot aspire to [ambition]," and I heard it there quite a few times.
Higher education needs reform more intensive than a simple defense against LLMs (as does the legal system and profession, as does the software engineering field, as does the field of psychology/psychiatry, as does-).
I think that universities just have to adapt to deal with slop, or think of new ways to challenge people to learn the essence of their studies. I wouldn’t want to be a uni teacher in these times though.
This is already being done. I teach computer science at bachelor-level and all exams are in-person. We talk through the code.
It's not hard, just unpopular. End credentialism, stop giving out grades or administering exams.
The (Western) educational system still does this for PhDs. Grades barely matter, and in most places you have two oral exams in your entire 5 years: your qualifying exam, and your final defense.
The reason it doesn't happen for the rest of the system is scaling. The US awards about 60k PhDs per year, compared to about 2M bachelors. There simply are not enough faculty, and it is not realistic to hire enough (if there are even enough qualified people in existence).
And that's ignoring all of the problems with "not giving out grades" or "ending credentialism" - I guess people are supposed to just get hired on vibes?
The solution is obvious. Teaching must be no-tech: just go back to the 1950s.
The other problem of course is attention span due to social-media erosion.
The big tech has really done a number on society already and they’re just getting started.
I agree with you that no-tech parts of universities would work - obviously you can't avoid tech when teaching some things like coding, but mostly I think it would be a good idea.
There are problems: Having students attend lectures is great, but they have to work with the material and prove they understand it - how to do that without homework? I'm sure there are ways. Have them work in a building full of computers cut off from the internet, maybe, but how to keep them from using their phones?
Another option is just severe comprehensive testing in heavily invigilated rooms, long after they finished the class, involving the material to prove they know it. Perhaps you could do this for the first few years of knowledge in a discipline and then assume the student actually is serious and take the leash off after they passed the tests. I know some disciplines already do this kind of thing, even before AI. Basically everyone has to pass a bar-exam type thing, even if they're studying art.

But things like art can't really be condensed into an exam, and it would certainly restrict and narrow what can be taught and learned - that's a big problem in my mind. Also, what if there are new ideas in the study of physics and they can't really be taught because the exam is too difficult to change quickly? What if there's a big split in the philosophy of business, but the exam only asks about one side of the split? What if you have an ingenious professor who wishes to talk about a new branch of philosophy he's created - not on the exam, though.

Edit: I guess if professors designed their own exams, instead of some distant exam committee, it would alleviate most of my concerns about them.
For coding you can actually teach students on commodore 64s. It’s actually better because they have a BASIC shell and assembly language. Most importantly, no internet. :)
Actually, give them internet why not. But they have to use a 56k modem. Mwhaaha
> ...how to do that without homework?
Tests. Many of my university courses only graded on tests. They strongly encouraged you to do the homework to better understand the material, but didn't consider homework completion when calculating your grade.
Consider that universities are educating adults who are -often- paying to be there. If we assume competent course design and instruction, if an adult chooses to not work on the material until they understand it, then the only person they're harming is themselves... which -as an adult- is a thing that they're usually fully entitled to do.
but how would you do that? what about homework and coursework? students will just transcribe claude slop on paper and submit that.
You give exams in person, in class, on blue books, no phones. This part isn't hard. Instructors have been doing it for generations. It's only in the post-COVID era that some have moved to take-home exams on Canvas or similar platforms. This is great for instructors - less work! - but I am not convinced it actually helps students.
The part that is more difficult is take-home work, and I think the solution is that instead of being something that you turn in for credit, it needs to move to being more of a chance to practice for in-person exams.
What about essays? I've taught classes where students had to write essays in class, in person. On paper, with a pen (this may no longer be allowed on many campuses because of access and perceived fairness reasons, which IMO is a shame, but it is what it is). I think the traditional assignment of "write a 15 page paper on XYZ" is probably done. Instead students will have to prepare to write an essay in class by reading the source material (books, papers, etc) and converse with AIs that are hopefully not hallucinating, to get an understanding of the material and then come to class and be prepared to write about it.
It's a new world, but one we can adapt to.
Assignments, sure. But if tests/exams are proctored in-person with pen and paper, the students may quickly pivot to traditional learning methods if they want to pass their courses.
Requiring them to write it in longhand at least removes the instant gratification. I think that will work for some students.
It would be fun trying to draw all those emojis in longhand. You could probably do the bullet points as tiny manicules.
It would actually be interesting to see what people do attempting to transcribe AI generated material to paper. At the very least it's another layer of learning in writing it out.
I dunno think outside the box.
One option… they can still do homework, but test them every week in class. Homework doesn't count for the grade anymore, but test questions are based on the homework.

Another… kids do the reading at home in the textbook, then work together in class to finish. Adjust hours accordingly.
There’s a very interesting problem space here though, to “disrupt” education by going back in time and applying a modern spin on education.
Engineering at my EU uni: homework and coursework were at most a tiny part of the total grade, and never enough on their own to pass. If they were relatively bigger projects, you'd pass an interview or similar review after delivering them. These were all just nudges to study and to check ourselves, and they were seen as a "gift" of the Bologna Process (the restructuring/standardizing of unis in the EU).
The only thing that mattered were the exams, be it pen and paper or coding/electronics labs, in person and proctored. No matter how much slop I could have access to back in the day I would have failed the same subjects I did.
In-person tests and workshops, including oral exams.
Like we'd been doing for literally hundreds of years.
More in class, in discussion, and less "assignments"
Unfortunately that's way more expensive to do.
for STEM topics, I feel like some amount of "personal study time" is kind of needed to really grok stuff, at least for a percentage of students.
I studied maths, and spending time alone trying to solve problems and redoing the proofs from memory was important for my learning.
I don't think I'd have learned as much had those moments been replaced with more in class discussion.
Yes, but AI doesn't prevent you from doing that learning. What it makes harder is the old broken ways of credentialing your learning. The way you do the learning has no need to change.
I personally feel like the software engineering profession may have to start moving more towards an apprenticeship model than a theoretical CS-gradate-then-work model.
Internship / coop programs at places like Waterloo already look a bit like this.
Slop made by students is one thing, but slop generated by faculty and fed at an extreme premium to students just raises the question: "why would someone pay for this instead of buying some LLM tokens, taking the curriculum, and teaching themselves?"

If we want to teach students to use AI, it should just be a separate course, not shoving it into every possible nook and cranny to the point where it is teacher AI talking with student AI, with light supervision from both AI handlers.
This whole piece is AI generated.
The prose is a bit too purple and tortured for that, IMO. Stock Opus 4.7 or 5.5 Pro is a more disciplined writer.
And, anyway, the point the article is trying to make is obvious. What's absolutely not obvious, and what it sheds very little light on, is what the University is going to look like in 10 years. Not what it should look like, but what it is most likely to look like.
> what the University is going to look like in 10 years
Mostly like they look like now, probably. With slightly more strictly enforced rules around exam.
I fail to see why it won't be like that.
I read a lot of AI prose these days and this bears none of the hallmarks. If this is AI, I'd really love to see the prompt.
I'm confident this is human.
It is easy to change the system prompt to make the AI talk with a different voice. It is remarkably hard (at least for Claude, I haven't experimented as much with GPT) to get it to not use so many em-dashes like this essay does.
This is such an ignorant trope. The last few places I worked ALL used em-dashes as part of house style and I will continue to use them. It's extremely common (and arguably the LLMs do it because it is extremely common).
There's no way. Just the first paragraph alone is enough to convince me; it's too well-written and melodious to be AI, with too much original thought:
Today, the demonic vice of the old is not that they are hard and demanding on the youth — instead they do not demand enough from us, and they cannot quite believe that we have not lived up to the little they have demanded. They think too well of our generation.
Without defending the quality of the rest of the essay, it's a great start. LLMs today could never match it.
Style is the wrong diagnostic. Purple prose and em-dashes can be prompted in or out. The harder question is whether the reasoning was committed or generated. A distinctive voice tells you nothing about whether the person actually worked through the argument or had it produced for them. Which is sort of the point the essay is making about students.
I can tell you with 100% certainty this is just how UChicago students write
It sounds to me like how I'd imagine a Philosophy student at the University of Chicago would write.
this comment is ai generated
AI generated or not, I concur. I really want to know what universities will look like in 10 years' time. What will be taught there that cannot be taught by an AI (whatever form or interface it has)?
Will Universities still be centers of knowledge and exploration? or will that be more disseminated through society, and so Universities not so important?
What courses will exist? Are those vastly different from today's courses?
> AI generated or not, I concur. I rally want to know what Universities will look like in 10 years time. What will be taught there that cannot be taught by an AI (whatever form or interface it has).
Computer-assisted instruction has been amazingly unsuccessful. Why is that?
Kinda glad to see it, as universities have made a mockery of education and learning for decades; hoping AI just replaces them altogether.