There is a field experiment showing this exact effect. Introducing GPT tutors increases performance by *a lot*--students seem to be picking up the material much faster--but when GPT is removed those who had access perform *much worse* compared to those w/o access. 1/4
Quote
Lakshya Jain
@lxeagle17
I'm teaching databases this semester at Berkeley. My students all seem unusually brilliant. Not many go to office hours, and not too many folks post on the course forum asking project questions. Weirdly, the exam had the lowest recorded average in my 10 semesters teaching it.
David Watson 🥑
This raises a question: what do we want students to learn in our classes? One argument is that students will always have access to GPT, so assignments should allow access. But another view is that the class is not just teaching material, but how to *think* through problems. 2/4
Each field's questions have a logic to them--field-specific critical thinking skills--that goes beyond any specific question. An LLM will answer the question, but students will fail to learn how to think through q's. This leads to a disadvantage even when an LLM is available. Why? 3/4
There will be questions that the LLM gets wrong, and the student will not be able to diagnose why, or even that it's wrong in the first place. Perhaps more importantly, the student will have less capacity for creativity because they don't understand the logic behind the field's questions. 4/4
This is why I've changed my classes to focus more on in-class assignments. It's not ideal--big final projects are fun and important, and they're still a component of the class--but it's also important that students are leaving class knowing the logic of how to think through qs.
Reminds me of this RCT where they gave incentives to kids to improve math skills, then removed the incentives. Good students remained good, bad students ended up providing lower effort in every task and performed worse than before.
I think a major issue with moving to in-class assignments is that it's hard to teach long-form writing that way. Essays require multiple revisions and research over the course of days.
Absolutely. Unless this issue is solved through smart implementation of the tech, this will be an important skill that could become underdeveloped.
I like this study and cite it, but I have two reasons why this isn't settling the issue of AI and learning: 1) it's a field experiment on a topic where it's impossible to measure treatment compliance, because there are no reliable AI detection tools and AI use can't be reliably detected
Introducing human teachers increases their performance by *a lot* but when the teacher is removed they perform *much worse*. Too obvious. To actually get this cited, let's give it a flashy headline: "Human Teachers Can Harm Learning".
Note that the paper says that the difference between tutor AI and baseline was not statistically significant. So it seems that AI optimized for tutoring can get the students to learn things faster but the end result is limited to the capacity of the student.
Because this is called a "cheat sheet". Learning requires understanding the mechanisms behind things and not just having top-tier descriptions of the answers.
when i was teaching CS we had a weekly 2 hour live pair coding session with random pairs for every student, that weeded out the people who would not code. another issue is that many students don't go to uni to learn, but to get a degree, and then it's just a good idea to cheat.
Sounds like badly designed tutors. Their so-called GPT that "safeguards learning" is the actual subject of this experiment, and it appears to be quite poorly designed and implemented, as the results would indicate.
True! In my class now, there are students who got 80-90s on take-home assignments and homework but did very poorly on in-class exams. Eventually, these are the people who will go out and refer to themselves as experts; that's even more frightening to me…
There need to be specifically designed GPTs for the coursework, with predefined problems and solutions, so they don't spit out erroneous answers. I believe there are a lot of flaws in this study. I know with my style of learning, I would've excelled even further in my mathematics
huh, reminds me of when I fired my social media intern and the AI kept outperforming... until the algorithm changed. key is continuous learning loops - our AI agents adapt weekly based on 20k+ data points. maybe GPT tutors need similar evolution cycles?
It's interesting to see how GPT tutors boost performance initially, but the drop in results after removal raises concerns. This suggests that while AI can enhance learning, it might also create a dependency that undermines students' ability to learn independently. To address this,
Children should not have access to AI, and more importantly, AI should not have access to children.
Bc AI can show students not just the answer but the steps to do the algebra… This makes students believe they understand, but they are being curated far more than previously… once you remove the crutch, the student flounders bc they've been made mentally lazy…
Obviously. If you have to offload all cognitive effort to a machine you are not going to remember any of that - you don't have to.
It's a bit more complicated than what you've said. They tested two different chat based tools and found that one reduced performance while the other had no effect. (Also the last sentence in this paragraph is a bit generalized from the results) (1/2)
2) they show how a well-programmed bot tutor can help or hurt, but that's not really the reality that most students face, and such a bot would probably be very expensive. Most students, I would think, face the question of whether and how much they should rely on available AI tools
The problem seems to be that the students value their present spare time more than the time they'd be able to spare later [as employees] due to their expertise. Or maybe they do not expect to get employed anyway because they feel obsolete. So what do you think is the solution?
What's strange is that as a geometry teacher, almost all of my students don't use AI. I always wondered why until I tried using AI myself. At the time last year, ChatGPT was getting simple things wrong like sin(90) or cos(45). It's also really bad at applying geometric theorems.
This doesn't sound like an actual GPT tutor, does it? Have you tried the Khan Academy GPT? It really seems to focus on learning how to think through things, makes you do worked examples, and so on. I would be surprised if that isn't helpful.
They might not have the time/bandwidth/attention span to fully dive in. I studied physics in college, couldn't comprehend it, so I just memorized it. I learned more & @sabinehossenfelder I'm in the arts but I was curious and was able to at least develop an interest.
Are we talking about AI tutors who spend time with students, solving problems interactively? Or are we talking about using AI to complete homework? Not the same thing.
the interesting result here is that GPT Tutor does not improve scores on closed book exams. I am guessing because this was a relatively short intervention. the fact that using standard gpt to get answers to practice questions makes students worse off is hardly surprising.
I'm going to speak for myself, GPT allowed me access to dense subjects and materials that I wish I had the capability to study at an Ivy League school. I think its popularity has a lot to do with lack of access and I find a lot of people have gotten more curious about stuff 1/2
Now how about we teach them fundamentals (but not rote nonsense) without AI, then teach them to use AI effectively, then continue teaching with AI (which of course they'll have in the "real world").
I think this is more about culture and values than tools. People who only work to satisfy superiors tend to use generative AI to get results. But for people who are passionate about their work, generative AI opens up a whole world of alternative ways and explanations.
I prefer to have students explain and show me how they approach a problem. Even the final numbers are not that important. I still believe in what Richard Feynman said: "to be an effective teacher, one must have a deep understanding of the subject matter."
Kinda misses the point that the GPT tutor moderated the negative effect. I'm not surprised that getting an LLM to answer for you doesn't help with learning, much like asking someone else to answer for you doesn't help either. Doesn't mean they can't be valuable learning devices
We can argue that Google search has done the same thing, with people outsourcing critical thought to the Google result. It's a very real problem though, I agree. Although it makes you wonder if it's worth it. For example, in coding, LLMs will struggle with very large codebases but 1
That's part of my issues with people getting dependent on GenAI. They're outsourcing their critical thinking far too much, and not growing that particular skillset
A new studying paradigm has yet to emerge. Yes, knowledge is more accessible, but students are still responsible for developing actual understanding, not just using the AI answers (which is tempting). No point fighting/banning AI; better to adopt an approach that improves actual learning.
You can't always use GPT. When you are discovering something new that isn't in the model's training data, you won't be able to do it if all you have learned is how to use GPT
I found it's dramatically deepened my knowledge, because I can get it to convert text that is written in a way that I find deeply annoying into a way that I find palatable. And I can quickly get clear answers to confusing sentences
There is also documented research that using GPT lowers the ability to do critical thinking. I think we should take all these studies as just being pseudoscience.
A mix of both is ideal. If your human tutor is also fine-tuning an LLM using their coursework textbooks, it lets you share your knowledge more asynchronously, and helps students who are shy about asking questions, among other things.
We should be able to distinguish between using AI to help you learn the material and using it to make it look like you've learned the material.
Generative AI harms learning? Then so do textbooks. The real culprit: poor implementation. Students using GPT as a crutch performed worse, but guided prompts neutralized the issue. It's how you use AI, not just if. #GenerativeAI #AIinEducation #GPT
I mean, maybe, but didn't we panic about other tools like "spell check" or calculators? *thinks about my spelling and mental math skills* Uh oh…
We don't care. Just like calculators messed up our numerical skills but enabled me to do more complex math within seconds, AI is helping me to do complex programming and complex tasks without knowing how to code
the same is true for access to calculators and search engines and phones; with each tool added, more things that were once kept in your own memory were moved to tools.
i think things that can truly be picked up quickly with chatgpt are not worth teaching. school should teach real skills like farming
Yeah, every time I see a kid with a calculator, I slap it away from them and call them weak. They must do all calculations by hand or else they'll never learn any math. Focus on solutions to problems and let them use the tools available. Focusing on theory is awful.
It's like driving a car with Google Maps. If you learnt to drive with Maps, your brain is wired differently. They outperform those who don't lean on Google Maps, but if you remove Google Maps they are clueless.
Yeah. That's how it works. The human psyche isn't intended to put up with the fact that this could be easier but you're making it very difficult for no reason other than you feel like it. Catch this ball. I'll take out your eyes. He doesn't care to catch the ball anymore.
just like weight lifting with gloves lets you do a lot better, and then a lot worse if you stop using them vs having never used them, because your hands can't withstand the pain. can we rely on having gloves 100% of the time? then the alternative becomes a philosophical musing
Why is it relevant 'if GPT is removed'? This is like saying, during the industrial revolution, that those folks that learned digging using excavators just can't handle a shovel as well as the guys who had to learn it without excavators.
let it burn let it burn let it burn vibe coding one shot saas startup you are falling behind time to market 1 week you are stopping the progress yes sure go ahead, replace your brain cells with vibe neurons and then vibe out of this life
They perform worse on the material they had been taught by GPT tutors or on additional new material being taught without GPT tutors?
with chatGPT you dont need to dig out your answers from your noggin, so yer brain becomes lazy
Trying to reinvent the wheel is not always a good idea. Traditional method of info delivery is the best. Technology made people lazy and dumb.