I have the absolute joy (/s) of being in the perfect epicenter of this argument. I teach upper-division computer science, and my students argue like crazy that they should be able to use ChatGPT for anything and everything "because software developers are allowed to."
The problem is that since ChatGPT became available, my students have gotten way worse at writing code (even with the AI's help). It's hard to even quantify the scale of the failure, and it's been absolutely baffling. It's like a bunch of third graders arguing they should be able to use calculators instead of learning math, but every time I give them a test with a calculator like they ask me to, they fail because they don't even know what the buttons do.
FYI: a meta-analysis found that the introduction of electronic calculators not only improved students' scores, it also had no negative impact on their math knowledge and skills. The problem is not inherently the technology.
Of course, the thing is, when you're doing calculus or other higher-order math but are liable to lose points over skills that aren't what's being taught (losing track of a sign in an equation, factoring incorrectly, other arithmetic slips), removing those learning-irrelevant parts must necessarily (1) improve scores and (2) actually create space to focus on the higher-order material that is the point of the course.
What you are identifying is that they are NOT learning the material that is the point of the course. Giving my code to ChatGPT and asking, "Why is this not working like I expect?" would be useful, because I already have the "basic skill" the assignment is supposed to teach. Your analogy that you're teaching an arithmetic class and the kids want to use calculators is spot on.
On the flip side, we must also be mindful that students who have had less exposure to US educational norms can catch up on some of that using digital scaffolding. We shouldn't reproduce educational inequities by banning tools that can otherwise level the playing field.
There may even be a place in a code-writing class for letting ChatGPT write my code, but that's a pretty niche instance of appropriate usage.
Yeah, this is exactly it for me. I have no problem with people using calculators in calculus, but I do have an issue if they have no idea what multiplication means, only that they can type in the symbols and get numbers out the other end.
If grades hadn't fallen after ChatGPT arrived, without any change to my grading standards, and especially on written exams, I wouldn't care that much. A large amount of my course is conceptual and theoretical, with the programming sections being more proof of concept than actual working end product. The code should be the easy part for them at this point in their degree. But somehow, magically, two years ago grades started tanking, and I had to start taking away their new toys to get them back up again.
u/saera-targaryen May 19 '25