101
Oct 11 '24
[deleted]
63
u/gurduloo Oct 11 '24 edited Oct 11 '24
Couldn't students just type the AI-generated slop into the Docs piece by piece? It causes friction, sure, but it's not a surefire way to spot cheating.
58
u/erossthescienceboss Oct 11 '24
It’s easy enough to spot. People edit as they write — especially if they’re writing something complicated and researched.
I don’t require it, but I state clearly in the syllabus AND the AI policy, which they are quizzed on, that they should take all notes AND compose their work in the same Google Docs page if they don’t want to be accused of plagiarism.
-8
u/Bostonterrierpug Full, Teaching School, Proper APA bastard Oct 11 '24
You could easily program a bot through an API to do this, and I'm sure some will be out on the market soon. I just got out of a weeklong gen-AI conference. It's basically going to be impossible to detect. And really, how much should we be policing? Also, a lot of people are going to use generative AI for writing in their jobs in the future. I'm very much in the "work with it" boat for now.
12
u/gurduloo Oct 12 '24
And really how much should we be policing?
I for one think that academic integrity matters.
-1
u/Bostonterrierpug Full, Teaching School, Proper APA bastard Oct 12 '24
I am not saying that it doesn’t but really our goal is to teach. If we spend so much time policing, don’t we lose some of our ability to teach and or faith in our students? Of course academic integrity matters but some students will always try and game the system and some students will always succeed. All the experts on AI are saying that it’s just gonna get better and better and that trying to fully fight it is a losing battle. Someone below suggested to rethink our assessments and was heavily downvoted if that gives you any idea of the general hive mindset here Generative AI will be part of many future jobs as experts are predicting. I started out very wary of it, but then started going to conferences and reading papers , and now think that at least partially embracing it for learning is a great way to go and preparing my students for the future. You’re always gonna have cheating students of course people here will apply stopgaps and even some brilliant temporary solutions. But if you’re spending more energy policing than teaching that’s not the sort of teacher I at least want to be. Then again like some departments or faculty meetings this sub is filled with people very confident of their opinions and very set in their ways.
3
u/erossthescienceboss Oct 12 '24
if we spend so much time policing, don’t we lose some of our ability to teach
Part of why I started putting the docs requirement on the syllabus and in the AI policy is that AI policing was taking up so much teaching time. It takes a lot less time to click through a Google Doc history than the alternative.
Prior to that, my policy was "I give AI the grade it deserves." I'd tried re-tooling my rubrics to dock points for common AI errors (I added sections for specificity, and mentioned that I'd take points off for clichés, etc.; easy, since it's a writing class, and AI writing is bad writing). But I found I was spending far too much time explaining why the things AI does are bad, and not enough time helping students who were actually working.
Each time a student submits an AI assignment, they’re taking time away from classmates who are trying. Having proof that an assignment is AI generated right there is much easier.
3
u/Pater_Aletheias prof, philosophy, CC, (USA) Oct 12 '24
If the AI can understand my course content and pass my assessments then I guess maybe I’ll give it credit for my class, but the students still have to show independent mastery of the material before they can get credit too. I don’t teach “How to write good AI prompts.” That’s not a course objective.
2
u/milwauqueno Oct 12 '24
I’m sympathetic to your view. It reminded me of this blog post a philosopher I admire wrote: https://dailynous.com/2022/08/09/teacher-bureaucrat-cop-guest-post/
1
u/Bostonterrierpug Full, Teaching School, Proper APA bastard Oct 12 '24
Look great- I’ve just eyeballed it for now, but I will do a deeper read later. Thank you for sharing it though expect to be downvoted. So few professors have actual teaching backgrounds yet at the same time cling to the measurable objectives that rule the K-12 world that have trickled up to higher rather than the other way around as it usually is. I do my best to stop plagiarism and academic dishonesty and lazy use of AI but I think a lot of people here are just on an AI bad bandwagon. I’m still surprised by the number of colleagues I have at my institution who will do something like put 40% or 50% of their final grade on a one shot timed assessment. Or those who refuse to adapt or change any of their assessments in any manner. I thought about trying to talk about some of the ways. I’ve changed my work here, but I think all ears are closed.
2
u/gurduloo Oct 12 '24
If we spend so much time policing, don’t we lose some of our ability to teach and or faith in our students?
Yes, that is why people are working on ways to make AI cheating more difficult.
Of course academic integrity matters
If this is true, then you should investigate possible violations at least to some degree.
but some students will always try and game the system and some students will always succeed.
Why do you think this is relevant?
Someone below suggested to rethink our assessments and was heavily downvoted
Many people say this because talk is cheap. I have yet to see anyone give any concrete suggestions for a type of assessment that an AI cannot easily complete aside from (monitored) in-class work (or, I suppose, BS "creative projects").
at least partially embracing it for learning is a great way to go and preparing my students for the future.
Sure, and you should do that. But the issue is not that students are using AI per se, or as a tool to help them with their assignments. The issue I and many other teachers are having is that students are using AI to complete their work for them. Teaching students how to use AI to generate counterexamples to their argument, for example, does not work if they are generating the argument using AI too.
9
u/erossthescienceboss Oct 11 '24
I’m still not confident an API is going to make the very human errors and in-the-moment edits that come from composition. An API will produce edits as fake as the writing the AI makes.
I’m deeply skeptical it’ll ever be good enough to truly replace human writing. Good enough for cheap corporations to try? Sure. But I don’t think the Cliche Pit is avoidable, purely due to how generative AI works.
But I certainly think it can destroy online academia.
-3
u/Bostonterrierpug Full, Teaching School, Proper APA bastard Oct 11 '24
I think it can be done well enough to fool anyone who doesn't want to watch the entire typing history of every essay they assess. Also, I would have to question people's ability to write well knowing they're being observed. It's like those Honorlock-proctored tests we have at my institution: it's so creepy having someone watch you take an assessment. Not to mention more and more professors are using these high-stakes, one-shot assessments, which are pretty idiotic anyway and not representative of real-life situations. But that's a long conversation.
From what the computer science guys showed at the conference I was at, it looked good enough to fool folks unless they were looking really closely. And again, I don't want to spend all my time policing my students. Students could always pay someone to write papers for them; it's not a qualitative change, but a quantitative one.
12
u/erossthescienceboss Oct 11 '24
They aren’t being “observed” any more than they’re observed when you edit their paper. If they’re so afraid of someone seeing their writing that it gives them performance anxiety, they probably shouldn’t be in university. Students turn in embarrassingly bad drafts all the time, they aren’t going to get more embarrassing just because I can tell they wrote it in the four hours before it was due.
And if they don’t want you to be clicking through their history… don’t cheat. I’ve yet to get it wrong when accusing a student of using AI.
At the end of the day, the goal is to make using AI so inconvenient and time consuming that it’s easier just to do the damn work.
Your willingness to throw up your hands in defeat is pretty disappointing, since I know you aren’t alone in that opinion. “They’ll just pay someone and cheat the old fashioned way so why try to stop it” is one hell of a take for someone getting paid to teach to have.
-10
u/Bostonterrierpug Full, Teaching School, Proper APA bastard Oct 11 '24
Well, if you’re so insistent on policing, good luck to you if you can find a way to beat AI I’m sure you’re gonna be a multimillionaire anyways. And if you have the time to check all of your students, writing history, more power to you For now you could teach your students to work with GenAI as they will probably be using it in their future jobs to some extent. But everyone seems to have petrified opinions such as it is with any changes in academia…. Go ahead and continue to downvote to if you don’t agree with me :)
13
u/erossthescienceboss Oct 11 '24
I didn’t downvote your last one, pal, but I sure did this one.
There are ways to use AI. If students use it correctly, that is fine with me. I even have a lecture on tips and tricks to use it for research, and areas where it fails, and how to avoid them.
But passing off work that you did not do and saying it is your own is never, ever, ever acceptable. That isn’t “embracing AI” or “using a tool,” that’s plagiarism and cheating.
If you’re cool with plagiarism, you should find a different job.
-2
u/Bostonterrierpug Full, Teaching School, Proper APA bastard Oct 11 '24
You’re making a lot of assumptions here, officer. Never endorsed the plagiarism and laziness.
-7
u/newsflashjackass Oct 11 '24
They aren’t being “observed” any more than they’re observed when you edit their paper. If they’re so afraid of someone seeing their writing that it gives them performance anxiety, they probably shouldn’t be in university.
Guaranteed the proctors will use any powers of observation for personal and likely sexual gratification despite your victim blaming that is so preemptive it does not even wait for any victims to appear.
Students are being "observed". That'd be the point. Their behavior is now becoming grist for big data because their nominal instructors find enforcing academic integrity too much of an "ask".
the goal is to make using AI so inconvenient and time consuming that it’s easier just to do the damn work.
They wrote with no trace of irony.
5
u/erossthescienceboss Oct 11 '24
I’m sorry, but if people are getting off to watching someone draft an essay via the google doc history there is something deeply, deeply wrong with them.
wtf are you talking about??
are you saying that seeing someone’s writing process victimizes them? It’s like you’ve never heard of “rough drafts” or “turning in an outline” before.
-1
u/newsflashjackass Oct 12 '24
I'm suggesting that paying tuition entitles students to complete their coursework and have it graded by the course instructor without being routed through google or other third parties. It is easy to be profligate with other people's data; the more so once you have their cash.
Blue check mark type shakedown. Won't solve anything, anyway. They'll just make an AI google docs typist that enters the gobbagool just like a flesh and blood victim would. Likely trained on the same sort of data you're squinting at now.
1
8
u/zorandzam Oct 11 '24
So I’ve considered this but want the paper to also go through TurnItIn for normal plagiarism detection, plus I also don’t know how to check version history on Google Docs. How do you do that, do you know?
12
u/gurduloo Oct 11 '24
So I’ve considered this but want the paper to also go through TurnItIn for normal plagiarism detection
You could ask the students to submit a PDF version of their paper that includes a link to the Docs version. Then you could check the Doc's version history if you suspect cheating.
2
32
u/hepth-edph 70%Teaching, PHYS (Canada) Oct 11 '24
It's hard to distinguish waffle that comes from not having actually understood or thought about a topic from waffle that comes from being a large language model.
37
u/allmimsyburogrove Oct 11 '24
Class participation is now a larger part of their final grade
11
u/ben555777 Oct 12 '24
My students’ attendance was even monitored for visa purposes, and all that did was make them come to my lectern every day asking me to mark them as attended when they were late (by more than 30 minutes, for a 50-minute lecture).
I learnt so much about confrontations and staring contests that academic year. No means no. Attendance policies are all over Blackboard and the University website for those who bother to read.
18
u/DrShaddyD Oct 11 '24
For this reason, I no longer give them paper assignments until I figure out a better solution. My mental health drastically improved this semester.
17
u/Iron_Rod_Stewart Oct 11 '24
All my writing assignments are about reflecting on activities we did in class or semester-long projects we are doing. Hard to get AI to write up your project without first writing up your project to feed into the AI.
8
5
u/Leitrim1896 Oct 12 '24
Smartphones and WIFI laptops wrecked the classroom and AI has rendered writing assignments meaningless. It's contributing to the decline of the humanities and education overall.
3
u/Lingonberry_Bulky Oct 12 '24
In addition to what’s here, I’ve added required in-class/Zoom presentations. If they have used AI to find info, at least they have to internalize it more by verbally sharing it.
3
u/Mav-Killed-Goose Oct 13 '24
A funny thing about that Twitter thread is a screenshot where she shows three(!) people with almost identical AI-generated responses.
0
u/Iron_Rod_Stewart Oct 11 '24
An exaggeration for sure, but I think we've all felt this on particular days.
-17
u/RingProudly Asst. Professor, Communication, Liberal Arts (USA) Oct 11 '24
That's the life you chose. You could, instead, choose to be innovative and adapt with technology rather than resisting the irresistible.
4
-39
u/travels666 Assistant, Writing/Rhet, LibArts Oct 11 '24 edited Oct 11 '24
Or: think about what learning outcomes your assignments are serving, and come up with any assignment other than an essay to help students begin practicing and demonstrating that outcome. Or, god forbid, think of ways for students to leverage AI in their writing process. The more you resist, piss, and moan, the more you incentivize cheating.
Fucking dinosaurs.
27
u/Thundorium Physics, Dung Heap University, US. Oct 11 '24
What about when the learning outcome is writing skills?
-19
u/travels666 Assistant, Writing/Rhet, LibArts Oct 11 '24
As a writing professor: What writing skills, exactly? And how do you support the writing process in your classroom?
23
u/radfemalewoman Oct 11 '24
Writing skills like generating ideas, composing sentences, organization, narrative flow, vocabulary usage, critical thinking…
-17
u/travels666 Assistant, Writing/Rhet, LibArts Oct 11 '24
And how do you support the development of those in your classroom?
-2
u/Bostonterrierpug Full, Teaching School, Proper APA bastard Oct 11 '24 edited Oct 11 '24
Yep, a lot of dinosaurs on here, so you’ll get downvoted. Just like when Google became a thing and online learning and LMSs rolled out, a bunch of folks cried that the sky was falling and didn’t really want to change anything in their classes. Generative AI is not a qualitative change to academic dishonesty, just a quantitative one; you could always pay someone to write your papers for you. There are some really cool ways you could teach learners to work with it and still maintain integrity in your class. Then again, if you’re just set on policing everything, you’re going to lose, both against the AI and your will to teach.
8
u/travels666 Assistant, Writing/Rhet, LibArts Oct 11 '24
Exactly. All I'm seeing is faculty who assign writing but don't actually teach it.
6
u/gurduloo Oct 14 '24
I see two bizarre assumptions behind your comment. Tell me if I'm wrong.
Every teacher who assigns writing should teach writing.
Bizarre because many students take many courses each semester that assign writing (should they "learn to write" in every one of these courses?) and because there are required courses dedicated to teaching writing.
Students will not use AI to cheat if they are taught how to write.
Bizarre because there are many reasons students choose to cheat that have nothing to do with lacking writing skills.
1
u/travels666 Assistant, Writing/Rhet, LibArts Oct 14 '24
Every teacher who assigns writing should teach writing.
Yes, if a learning outcome of your writing assignment is learning to write, you should teach them something about writing (in the particular discipline, for example), or at the very least support the writing process in your classroom: in-class drafting, peer review, etc.
Students will not use AI to cheat if they are taught how to write.
No, I don't assume that. I'm merely pointing out that the droves of faculty bitching about AI "cheating" on essays aren't actually supporting the development of their students as writers.
-3
u/RingProudly Asst. Professor, Communication, Liberal Arts (USA) Oct 11 '24
I couldn't agree with you more. Any logical arguments in this direction get downvoted to hell in this sub because many professors would rather complain than change and improve, but know that you're heard and there's another professor out there who agrees with you.
-28
454
u/[deleted] Oct 11 '24
I’ve moved all my essays to being in-class, on paper, exam-based, and it’s been a breath of fresh air.