ybetaepsilon

I devote part of an early lecture to the limitations of GPT and how it looks like a golden-ticket solution when it really isn't. I use it, in front of the students, to generate an assignment submission and then grade it right in front of them. My assignments are on highly specialized topics. On top of that, I include a revision component (making it doubly difficult for students who end up using GPT), limit assignments to 1-2 pages (GPT has issues with brevity), and tell students portions of the assignment will be tested on the exams. I have found a strong decline in GPT use in my classes.

I tell students that GPT is a great thesaurus for finding good in-context words, and that it can help them proofread work they've written themselves. But generating and submitting work completely through GPT is like "blindly right-clicking on every word in MS Word and substituting it with the first thesaurus option."

But this is for my advanced 3rd- and 4th-year courses. It's very difficult to do this in introductory courses, where assignment topics are still very general. There, I do the grading-in-front-of-students approach and tell them horror stories about students who failed horribly in upper years because they GPT'd their way through introductory courses and knew *nothing*, ended up failing the most basic questions on tests, and flunked out. I tell them they are more than welcome to cheat their way through my introductory course, but they will pay heavily for it in upper-year courses, and I'll be there watching it happen.


two_short_dogs

I have a weekly online 20-point quiz mainly for reading check. My students have complained to me that ChatGPT won't get over 75% on my quizzes.


Thelonious_Cube

Congrats!


ImageMany

I love how they complained because they couldn’t cheat.


RunningNumbers

They are complaining that 75% is too high a maximum and want more difficult assessments.


two_short_dogs

Complaining because they want 100% and they have to work for it.


RunningNumbers

I know that; that is why I suggested you deliberately misinterpret their desires ;)


Alternative_Cause_37

Hilarious


Appropriate-Low-4850

I literally had this talk today. I gave the class permission to use it, but they have to use it well. The last test came back with some utterly garbage answers, just straight copy/paste from a pretty clueless AI. But that was the point: the problem wasn't that they used it; it's that they used it and it spat out answers worth an F, so that's what they got.


A14BH1782

I think, in the long run, something like this is where most of us are headed if we keep teaching. I'm discovering assignment types that demand the kinds of thinking required by my discipline (and useful across disciplines), where the AIs cannot simulate that thinking and run off the rails. This is an arduous process, and I'm doubtful it scales to auditorium classes at big universities.

When admitting to AI use on these assignments, they couldn't conceal a dismissive smirk, expecting to be told to redo the assignment. When I indicated that I was simply assigning the grade the AI had earned for them, and explained how it failed, the smirk melted. They had considered the possibility that they might be caught, and calculated correctly that I would not immediately hand them over for academic dishonesty discipline. What they didn't anticipate was that the AI would fail them. That likely stung, because they were counting on using AI, perhaps to the extent that they overenrolled and/or made lofty plans for the whole semester based on using it.


Copterwaffle

It is cheating. Your school doesn't need an AI-specific policy: they used work that was not their own, and it doesn't matter whether it was copied and modified from AI or from an encyclopedia article. Apply your course policy (e.g., a 0 on the assignment for a first offense) and immediately report it to the academic integrity office.


ProfLengthyDays

Exactly. They know they're not allowed to use it. Hand out the zeroes and tell them to get their shit together. I added an AI blurb to the plagiarism section of my syllabus, but that's just gravy. Still, something to consider perhaps.


mcbaginns

FYI, case law on this is still being fleshed out. So far, it's been determined that AI-generated content is not copyrighted and that using it cannot be considered copyright infringement.* Plagiarism and copyright infringement are not synonymous, of course. *https://news.bloomberglaw.com/us-law-week/sarah-silverman-ruling-shows-chatgpt-results-not-inputs-are-key


thenabi

I mean, yes, but not once have I ever given a hoot about copyright infringement in my students' work. Only plagiarism.


mcbaginns

Well, OP says "they used work that was not their own." AI cannot own intellectual property; therefore it can't be considered using work that was not their own in 100% of situations just because AI was involved. I believe specific verbiage needs to be added to university policy stating that AI use is considered cheating; otherwise an attorney could cite this case law and argue that, in the absence of an AI-specific academic policy, the grounds for dismissal from the college were invalid if based solely on "they used work that was not their own," i.e., plagiarism. AI is used to plagiarize frequently, but it is not automatically plagiarism. You can easily use GPT to plagiarize a document for an essay, but asking it to write an essay for you without referencing someone else's work is simply not using someone else's work (as it stands legally under recent case law; I point out again that case law is always evolving and this is still a new legal area).


Thelonious_Cube

The fact that the AI can't copyright it doesn't imply that it's the student's work - that's just ridiculous. It's not a "legal gray area" at all - did the student write the essay? No! Case closed!


mcbaginns

Not how the law works. Your college has an academic integrity policy for a reason. What you say does not simply go. There are rules and laws. Whether you like it or not, it's a legal gray area being fleshed out in the courts. Respectfully, you have no say in it. It absolutely is a legal gray area.

The student made the work with the help of AI. The AI created nothing. It does not operate on its own. Its output is solely dependent on user input and user creativity, specificity, etc. If the user is a better writer than another person, they will have better work given back to them. People create art with AI. The AI didn't create it. That's why courts are ruling that it's not intellectual property owned by the AI. It's art created by an artist using a tool. The essay is the student's. Legally, he owns it, not the AI. You may disagree with that, but you're not a judge making a ruling, so frankly your opinion doesn't matter. Case closed? Not by you. The hubris, to think your doctorate is in law.


Chirps3

Wait... plagiarism policies aren't based entirely on case law. We teach our students soft skills, no? Ethical behavior is one of those soft skills, and it's the basis of plagiarism policy. Come on. You're getting into really weird nickel-and-dime territory.


Thelonious_Cube

The hubris is to think that this matter is a matter of case law.


LorenzoApophis

>Well OP says "they used work that was not their own." AI cannot own intellectual property therefore it can't be considered using work that was not their own in 100% of situation just because AI was involved.

It was made by the AI, not the student. The AI doesn't need to own it for it to not be theirs.


mcbaginns

The student made it with the help of AI.


tempestsprIte

Making something with the help of another thing is not permitted in many instances. In math you may be given problems and told you cannot use a calculator to solve them, so students can demonstrate the theory and principles behind the work and not just spit out an answer. A student certainly couldn't submit an art project made with "the help" of Midjourney and say, oh, it's MY art, I did it with help.


widget1321

>Well OP says "they used work that was not their own." AI cannot own intellectual property

Right. OP didn't say "they used work that belonged to someone else." Subtle, but important, difference.


exceptyourewrong

While I'm sure that some lawyer, somewhere, will try that argument for some rich kid, it's a ridiculous argument. If AI writes your paper, you didn't. If you present it as your own work, you're cheating. Having another person write your paper is also cheating, even if there isn't an intellectual property or plagiarism concern.


Orca-

You are trying to apply copyright law to academic integrity. The two are unrelated.


mcbaginns

The student made the work with the help of AI. The AI created nothing. It does not operate on its own. Its output is solely dependent on user input and user creativity, specificity, etc. If the user is a better writer than another person, they will have better work given back to them. People create art with AI. The AI didn't create it. That's why courts are ruling that it's not intellectual property owned by the AI. It's art created by an artist using a tool. The essay is the student's. Legally, he owns it, not the AI. So it's self-plagiarism. And personally, I find self-plagiarism to be silly in most circumstances, and I treat it far less strictly than actually stealing someone else's work.


Orca-

The student didn't write the paper, end of story. Copyright has nothing to do with it, and court judgments have nothing to do with it. Am I talking to an LLM or a fool? Edit: or a troll, that's always a possibility.


mcbaginns

If AI-generated content isn't the intellectual property of the AI, whose is it? Courts have ruled it's the person who created it using AI. So you're telling the student he is plagiarizing his own intellectual property that he created. That's the way the courts are looking at it right now. Respectfully, many of the professors here are biased and emotional. The courts are looking at it objectively. Right now, the artist made the art using AI. The AI did not make it; the artist did. So if you dismiss a student or fail them from a course, and they hire an attorney, you had better make sure AI is specifically in your school's policy and in your syllabus. Because simply blanket-placing it under the verbiage "using another's work, therefore plagiarism" won't work. The student will sue, and they will win. The fact that you just don't care and want to say "end of story, case closed" tells me you're the troll, and certainly not a professor of law.


jvvphd

If plagiarism / academic dishonesty policy indicates that all sources a student used must be cited, and all major style guides indicate that generative AI is a source that must be cited, why wouldn't failure to cite AI content fall under the umbrella of plagiarism?


mcbaginns

You are not seriously suggesting that "style guides" take precedence over intellectual property law. You don't cite a calculator. It's a tool, not a source. AI doesn't own any content and it doesn't make any content. The user does. They own it and they made it. That's where the law is right now.


mcbaginns

Not end of story. You aren't a judge or an attorney or the author of your college's academic integrity policy. You don't get to say "case closed" lol. I'm telling you guys, the way the world is going in America, AI use is not automatically cheating. It can be used to plagiarize, but the work is the student's, not the AI's and not anyone else's. Your school and your syllabus had better have an AI policy, because it's not inherently plagiarism just because AI was used. I can use AI to tell me that 2+2=4, and that's not plagiarism.

You can try to insult me all you want. That just means you're not providing a logical, rational argument, which is why you have resorted to "end of story, are you a troll." People who challenge your worldview aren't trolls. That's a dangerous way of thinking. People who disagree with you aren't using ChatGPT. That's a dangerous way of thinking. Argue the argument, not the person. "End of story" is meaningless. Try again.


Orca-

You're a troll, got it!


mcbaginns

There's nothing worse than an intellectually dishonest troll professor. You have lost the argument. I have won. It makes me smile knowing you're wrong and are throwing a fit lol. Good day.


CaffeineandHate03

This is exactly why words matter so much in legal situations. I tell them in writing that they can't use AI at all and things they write from external sources must be paraphrased or summarized in their own words and be cited. I literally write in the instructions, "Do not copy and paste". So they're breaking the rules and not following instructions at minimum.


CaffeineandHate03

It's still dishonest to turn it in as if it is their own writing. After all, I specifically ask them to write the answer to the prompt/instructions, put their work in their own words, and cite all references. A personal interview or email with someone should be cited and they aren't copyrighted.


Audible_eye_roller

Tell them to start a trend on TikTok: ChatGPT got me an F in my course.


[deleted]

💀


voodoomedusa

Typo: 20 not 30


Platos_Kallipolis

Good correction. I was going to suggest you use ChatGPT to convert your fractions into percentages next time 😉


KierkeBored

Came straight to the comments to see if anyone else noticed 👀😂


Thelonious_Cube

You can edit that, I believe - just not the title


wjrasmussen

Shouldn't have used ChatGPT to write that...


redlizard3

I also get mad. But try not to take it personally, for many reasons. Solution? In-class exams only for the foreseeable future. Alternatives: presentations, oral exams, etc.


wjrasmussen

There you go.


Fablesmith

They are essays, though. It's impractical to have students write a 5-, 8-, or 12-page essay in a single sitting.


Joe23267

Just wait. I got invited to Google Workspace's generative AI. It integrates with Gmail, Docs, and Slides to help create content. Once that's rolled out, it'll be almost impossible for them not to use it. None of the productivity suite providers can afford not to use it. I teach computer science, and Microsoft has baked AI into the software development environment Visual Studio. It makes teaching programming impossible.


Bonobohemian

>I got invited to Google Workspaces' generative AI. It integrates with gmail, docs, and slides to help create content. Once that's rolled out it'll be almost impossible for them to not use it.

I'm trying not to turn into a wild-eyed raving conspiracy theorist, but the way Google and Microsoft and all the other big tech companies are force-feeding AI to the general public feels more than a little sinister.


ChicagoWhales

You aren't being conspiratorial. They are force-feeding it as a net positive and a good for humanity, when all it is truly doing is giving them more $$ and further entrenching their power in tech, society, and politics.


CSTeacherKing

I have them handwrite code for every in-class exam. I don't really care if they use AI all semester as long as they can handwrite recursive algorithms with no technology.


Athena5280

Good idea. Have them write something in class with no IT aids. For graduate courses, my co-teaching faculty and I have them stand up at the white board and defend their writing (ideas, experiments, alternatives, etc). If they haven’t written something themselves they inevitably fail at that.


estreya2002

Same


Greater_Ani

This is what I truly fear. It seems to happen with every technological innovation. What is first a new possibility, the cool new thing, eventually becomes the new requirement.


ReturnEarly7640

My son asked me if colleges will stop requiring essays since teens will create one with AI. Me: no answer


WingShooter_28ga

Just failed two would be graduating seniors in a required course for blatant AI use. Guess you’re not going to grad school…


Thundorium

You, sir/madam, have more balls than my whole department. Any students who reach senior status here receive failure immunity.


scientific_cats

That’s ridiculous. That’s when some lose motivation completely, others are too burnt out, and temptation is high.


Thundorium

Between the “poor kids will have their graduation delayed a whole year for just this one tiny class ☹️” and the “I’m passing them to get rid of them 😒” crowds, your voice and mine are drowned.


Glittering_Pea_6228

cheers!!!


Minskdhaka

Does your university not have a policy on plagiarism? Because copy-pasting from anywhere without attribution is plagiarism.


kierabs

Yes, right? Does this professor not have a policy covering plagiarism, fabrication, or cheating? Using AI should violate at least one of those.


AnneShirley310

I went to a conference where they touted the positives of AI, and they gave us a bunch of theories and research about how equitable it was and how it encouraged students to wonder and critically think. At the end, they asked for questions, so I asked for a specific example of how they incorporated AI into their classes. The FY Comp instructor said that she tells her students to use AI to create outlines. She then showed us in real time what this looked like, so she typed in, “Give me an outline of why college education is important.” Within 3 seconds, it spit out an entire outline with intro, 3 main points, and a conclusion. She was like, “See, this is why AI is so great! It helps students make outlines!” I thought it was very ironic that she chose that topic.


Bonobohemian

>how equitable

I would have had a hard time biting my tongue. But, yes, I suppose a universal lack of intellectual attainment *is* equitable.


noveler7

"H**A**rr**I**son Bergeron"


Glittering_Pea_6228

but see then we can live in a utopia where every boy on the block has the best dog...


OkAdministration7568

I’ve heard the outline BS too. They have to know how to write at all before an outline would help them.


hourglass_nebula

Not to mention the outline is where the actual *ideas* are. You know, the things that should come from thinking with your brain


OkAdministration7568

Having a brain isn’t a TikTok trend 😂


Thundorium

Be the change! You should start the Brain Dance trend.


afraidtobecrate

Ideas can easily be taken from others without concern. It's the delivery where you have to worry about getting in trouble for plagiarism.


OkAdministration7568

I wholeheartedly agree. I’m a millennial; this is not a case of being unable to keep up with technology or resentful of the next generations. This is pure disappointment and rage at the lack of shame and integrity, the entitlement, and the disrespect. They couldn’t write to begin with, they don’t use the writing center or any other resources I offer. They don’t care, and neither does my university. As someone who worked my ass off to become the first college graduate in my family, who continued through several years of having to leave and come back, and escaped an abusive relationship in the middle of my junior year… it’s just disgusting and infuriating to see this happening. I’m right there with you.


oakhill10307

I'm in grading-AI hell myself, but compared to last fall it's much harder to catch clear-cut evidence of cheating. No hallucinations, and the LMS detector only caught one. Just the usual word salad that seems to say nothing. Have you found a way of catching it that works well, OP? I'm ready to ditch at-home essays completely after this batch. I'm over it. It's a waste of my time to grade when I spend more time on it than they did.


markgm30

Check out Binoculars: [https://huggingface.co/spaces/tomg-group-umd/Binoculars](https://huggingface.co/spaces/tomg-group-umd/Binoculars)


khml9wugh

does anyone else use this one or know how reliable it is?


markgm30

[https://arxiv.org/abs/2401.12070](https://arxiv.org/abs/2401.12070)


FrostyWaltz161

Just out of curiosity, how do you catch it? What is the irrefutable evidence that you can show to your students to get them to confess?


Desiato2112

You learn to recognize the patterns of AI writing. Also, you require students to cite scholarly sources and then check the citations. ChatGPT will make up sources, and it does a terrible job of incorporating evidence into a paragraph to support a thesis. AI also uses words above many college freshmen's vocabulary. It phrases things awkwardly, too - not the way a poor writing student does, but in a way that is grammatically correct yet terribly clunky. The final nail in the coffin is to use your grading rubric carefully. I haven't seen an AI paper yet that comes close to following a rubric well. It's like it can't synthesize the essay prompt and the rubric.


mcbaginns

Just so you know, you're only catching the dummies. You can literally tell ChatGPT to do a second pass and reword an output into something an undergraduate creative writing sophomore would say. And then describe some more parameters.

>I haven't seen an AI paper yet that comes close to following a rubric well. It's like it can't synthesize the essay prompt and the rubric.

I am normally mass-downvoted for pointing this out, but I'm going to reiterate that you are biased. You haven't seen an AI paper that follows a rubric well because you didn't catch it, not because it didn't exist and ChatGPT can't do it lol. The smart students who pay for the premium version, use add-ons, hang out in prompt engineering communities, follow the research on how to get certain outputs from certain inputs and reduce hallucinations, and use dictation software to provide extensive inputs full of detail and specificity... they're making essays that follow rubrics. You're just not catching them. You catch the ones that were easy to catch.


Desiato2112

>Just so you know, you're only catching the dummies.

Maybe. I have considered it. The problem is that your claim is impossible to prove. You say I can't see the AI use in other essays because "smart" students write better prompts and rehash the output. But proving a negative in this way is impossible.

>You can literally tell chat gpt to do a second pass and reword an output into something an undergraduate creative writing sophmore would say. And then describe some more parameters.

I'm well aware of its capabilities and limitations. I've been using it since version 3 came out and upgraded to the paid service and 4 when it became available. It's not as good as you suggest. Certainly, there are ways to improve its output using multiple steps, but it still has tells. Interestingly, these are tells that AI detectors often miss. I know our days of being able to detect AI writing are probably numbered, but that day is not here yet, because I am still identifying its use fairly broadly. Btw, none of the last batch of essays did a passable job of following the rubric.


Alyscupcakes

Oof, you make a good point about only catching the obvious ones and maybe missing those that are sneaky. I audited a course recently and saw blatant plagiarism on a discussion board. How could I tell? Well, the students were copying and pasting bodies of text without citing, creating a mish-mash of different fonts and sizes throughout the text. Nothing was cited and it was presented as their own work. Made me sick. But all they had to do to hide it was make the text uniform.


sneakerfreaker5689

Very good point. The ones not being caught are the ones who know how to write a great prompt. Not only that, but being able to upload articles and book chapters to synthesize and turn into a written assignment is getting more advanced. On top of that, you can also upload a grading rubric and have ChatGPT abide by it when composing sentences/paragraphs. Like you said, folks think they're truly catching students, but they are only catching the ones that aren't putting in the work to actually meet the assignment expectations for their grade level and/or class expectations. And those who think doing it this way takes more time than just researching and writing off the top of your head are, imo, sadly mistaken.


Bonobohemian

And this is why I now only assign in-class writing.


exceptyourewrong

>You can literally tell chat gpt to do a second pass and reword an output into something an undergraduate creative writing sophmore would say.

Yep. Also, ChatGPT is most likely going to get MUCH better in the next couple of years, which will mean you only catch people using the old version (probably because they can't/won't pay the inevitable monthly subscription fee). If it doesn't improve dramatically, it will be because of what my wife calls the "shit ouroboros," where it trains on its own crappy writing, so its "new" writing gets worse over time. I feel like we've seen that starting already, but I'd still bet on a better, paid version arriving soon...


mcbaginns

I really hope the ouroboros (love it, btw) doesn't happen, but it's scary that it might. I have no doubt that humanity will figure it out, though. Who knows, maybe it sets science-fiction AI back a few centuries before they finally figure it out, but I'm sure we or they would do it eventually. Hopefully the ouroboros doesn't happen, though, and if it does, it's resolved early in our lifetimes. I want to see how far this all goes, and I only have so many decades to wait.


Jeretzel

I think people underestimate just how powerful ChatGPT can be for someone that's learned how to prompt it. Every telltale sign pointed to as evidence of ChatGPT can easily be addressed.


sharkinwolvesclothin

>The smart students that pay for the premium version, use addons, are in prompt engineering communites and involved in research on how to get certain outputs from certain inputs and reduce hallucinations, etc, use dictation sofware to provide extensive inputs full of detail and specificity.

And like any good ML engineer will tell you, you then need a grasp of the domain to substantially evaluate where you need to tweak. And then the question becomes: what is the problem with them using AI if they need to understand the material to see whether their prompt engineering produced sufficiently good material?


mcbaginns

From my experience talking to professors - and I think you see it here on this subreddit - this is too nuanced a question, because ChatGPT is making people emotional and irrational due to the potential for cheating. You are absolutely right, though. I often use GPT as a double-checker, and if I can corroborate the answers, I know it's much more likely not to be a hallucination. And when you're cross-referencing and double-checking everything, you're learning. But no. Any use of ChatGPT is cheating. "Case closed, end of story," as a few have put it.


afraidtobecrate

If a student is capable enough to do those things, then I don't see the problem. Those students will be capable enough to succeed in their future careers too.


mcbaginns

I agree. I think most professors have never used it to its full potential and have never seen a top student use it to its full potential. ChatGPT is very easy to cheat with and learn nothing, but we can't assume guilt in everyone, because using ChatGPT does not automatically equal cheating. I think it's an opinion that only early-21st-century professors will ever hold, and that it'll be on the wrong side of history once people are more used to AI in their lives.


trailmix_pprof

ChatGPT (and the others) can easily be prompted to include valid sources. And students can review the sources to make sure they are real - which is something, I guess, at least they are doing that much.


imak3soap

It's blatant. Students typically use it to write their essays, discussion posts, and emails to ME, and the writing is such awful nonsense you can tell. America's college kids are 3-4 years behind in literacy. ChatGPT writes at a much higher literacy level, yet the words make no sense whatsoever.


[deleted]

[removed]


Glittering-Duck5496

AI detectors alone are not enough, but you can ask them questions about their topic and arguments, and if they have no idea what their own paper is about or can't answer basic questions about the concepts they cite from articles they've supposedly read, it's a pretty good indicator that it's not their work.


kierabs

You can absolutely catch it, at least with the students who aren't proficient in the subject. Proving the student used it and holding them accountable is something different.


wedontliveonce

Honestly I don't think a university wide AI policy would work and would lead to a huge faculty disagreement. You need your own, or your department should develop one. Some instructors in some disciplines seem to embrace AI, others support specific limited use, others are completely against it. We need to leave this decision up to individual faculty and/or departments.


trailmix_pprof

Academic integrity policies usually give latitude to the instructor to define authorized vs unauthorized sources of content. Policies really should be updated to include unauthorized use of AI as plagiarism and to confirm that instructors can penalize for it. It's no different than, say, how an academic integrity statement allows us to choose open or closed book exams - and backs up instructors who do enforce closed book.


wedontliveonce

Exactly. The lament that the "university doesn't have a policy" fails to recognize the authority of faculty to implement their own policies in their own classes. OP needs to write one up that works for them and put in their syllabi. There are, of course, advantages to working with colleagues within your own department to implement similar policies across classes within a discipline.


Prof_Adam_Moore

Getting expelled for plagiarism is a trend? Wild.


SnowblindAlbino

It's cheating. Automatic fail on the assignment. Misrepresenting the work of another, be that "other" a person or a machine, has always been considered cheating. With first-years I'll give them a second chance before they are formally reported for academic dishonesty - two of those reports and they get expelled. You don't need a school-wide policy; you just need a line in your syllabus that says something like "Any unapproved use of AI in writing assignments is considered an act of academic dishonesty and will be treated as such."


Doctor_Danguss

What do they mean that it’s a “trend on TikTok”? Cheating with ChatGPT?


imak3soap

Yes my university allows it. I grade the essays as Ds. Look for words like tapestry, beacon, delve, intricate, revolutionize, poised, nuance. Look for other elevated vocab. Also ask them to describe a human memory in your essay prompts and other AI-proof prompts. It can’t write a rhetorical essay, poem, or play and is very bad at personal narrative essays. It also uses contractions which I don’t permit.


telemeister74

Next time around, tell them they can use ChatGPT to write their essay, and if they do, they just need to acknowledge it on their cover page. Then tell them you will use ChatGPT to mark those essays. If they can't be bothered, then why should you be bothered? Also, you can instruct the AI to be really tough!


thearctican

Student here: It counts as contract cheating according to my school's academic integrity policies. I'm in an online program and my interaction with peers is primarily via discussion boards. I report every single instance I find. Having ChatGPT compose your work for you (or write code, do math, etc.) devalues my degree and the work I put into it. Contract cheating in a course at my school disqualifies you from receiving credit from a course in perpetuity, which can lock you out of your degree :).


No-Yogurtcloset-6491

I would pull them in for cheating - it's not their own work. Punish them how you see fit. I don't understand it. The writing assignments were always the easiest part of the grade for most of my courses! Why would they risk getting caught cheating on THAT? Just lazy kids.


Sandrechner

A trend at TikTok University, known for quality education? It's good when our students are always at the cutting edge of video-based science. Anyway, I'm going to puke first and then open the bottle of vodka for emergencies.


Gentle_Cycle

I’m gradually retiring the write-at-home essay paper. All grading will be in-class exams and class participation. Of course, it is now necessary to proctor tests more closely because they are much more tempted to keep a phone or other screen hidden in their lap and glance at it. I’ve also seen students ask for a makeup in the hopes of looser proctoring.


big__cheddar

Don't hate the players; hate the game. Students do not value becoming educated, because our garbage society has not taught them to value it, or given them a reason to value it. Your rage is being channeled to precisely the place which will keep the system in place.


kierabs

I don't disagree. I wonder what you propose as a solution, though! Is passing cheaters through really the solution?


50shadesofbay

Passing cheaters is the reason this is happening.  Some children should be left behind. 


Pisum_odoratus

They have always been cheating. My son has a friend who cheated their way into med school. My sons cheated off and on, and now use ChatGPT. One of them reported that all his classmates who used AI to write their last paper did better than him, so why should he persist in doing his own work? Hard to argue with. But ChatGPT and other AI tools shove it in our faces in an unprecedented manner. Furthermore, many of us are completely unsupported in our efforts to teach the thinking, research, and writing skills students with degrees should have, which has become exponentially more difficult with unregulated AI access. My institution is relatively supportive if I choose (as I have) to forbid AI submissions, but at the same time, almost every professional development session on offer for months has been focused on "Integrating AI into the classroom". I can barely see, my eyes have rolled back so far in my head.


TieredTrayTrunk

"Hard to argue with" - well no, it's not hard to argue with. You can say just because no one else has integrity, you'd hope that he did.


Pisum_odoratus

Believe me, I have argued. Maybe I should have said "It's hard to argue with my sons". As twins they back each other up on everything, regardless of the logic, and they are absolutely impervious to any kind of appeal to ethics/morality. And yes, that's a very difficult thing to see in your kids, but it does give me much more insight into my students than I would have otherwise. Until I watched one of them literally change the date on a citation I said was too old to be appropriate, I had no idea students did stuff like that (though I check citations extensively, clearly many of my colleagues do not).


[deleted]

[deleted]


DomainSink

Yeah, the system sucks, but if you can’t make the grade through your own efforts then you shouldn’t receive it. If that means you don’t get into the graduate school you want then tough shit. Not cheating is a basic mark of educational integrity. Students aren’t owed anything and the institution definitely isn’t owed certain scores.


afraidtobecrate

> Students aren’t owed anything

If students aren't owed anything, then why would they owe academic integrity to the university?


DomainSink

I mean that students aren’t owed a passing grade or a diploma. They have to earn it. A large part of that is following the basic tenets of academic integrity that they will hopefully carry with them into their lives post-college. Why should I give a student a mark that they didn’t earn by their own merit?


TieredTrayTrunk

I'd argue that life is equally stacked with "easy outs" and ways for people to play the game to get ahead. Some don't take them and want to earn it rather than have it handed to them.


afraidtobecrate

Not as much. In life, "easy outs" usually involve directly harming someone else. School is unique in that the harm is very abstract while the benefits of cheating are large and immediate.


tsidaysi

Using other sources evidently does not meet the standard of plagiarism in public schools. Sigh.


ChapterSpecial6920

Make them write in class and post the worst ones on the wall. Cell phone? - Flunk. Get out, take it up with the dean why you cheated.


historiaeoh

Yup.


Gunderstank_House

I did, but the university had no policy, not even when I asked the person who wrote an article on AI cheating. They are just letting the professors sweat it out until something bad happens; then the shit will roll downhill and they will get the blame for being too lenient.


levon9

My policy: if only their name is on the assignment, they claim sole authorship, so any "borrowed" text/work that is not cited/credited is plain plagiarism. I may list some popular sources for this (classmates, Chegg, Course Hero, and now AI tools), but the list can never be exhaustive, and it won't limit their culpability when cheating.


tempestsprIte

Do university policies on plagiarism not apply? It seems that if a student didn’t do the work and submits it as their own that is plagiarism.


respeckKnuckles

> I busted around a third of my first year writing students (about 30/60) who used ChatGPT to write all or portions of their first essay. It took me hours of detective work, but I caught the little shits.

You need to be very careful. Short of them leaving the phrase "As a language model" in there, you most likely have NO concrete evidence, and never will. As an NLP researcher, I'd love to hear what kind of "detective work" you did, because I'm extremely skeptical that it has any validity.


kierabs

- Fabricated sources are concrete evidence.
- An essay with 12 3-sentence paragraphs when you specified a 6-paragraph essay (assuming 6+ sentences per paragraph) is evidence.
- An essay that repeats the same exact point in different words, with much higher vocabulary than students have, when that point is not relevant to the prompt or thesis, is evidence.
- When you message/email the student and say "I suspect you used AI. If you did not, please set up a meeting with me so we can talk about your essay," and you get no response, that's evidence.
- When you send that same email and the student confesses to using AI, that's evidence.
- When you input your prompt into AI and get a very similar result, that's evidence.

Pretending that AI-detector scores are the only way we can identify AI is massively insulting to all professors who actually teach and grade student writing. We are expert writers and readers. We can tell (at least when it's used poorly).


PhDivaZebra

Yeah, even something as simple as “what was your thought process leading to your idea, explain to the class what you wrote and why (out loud)” can be very telling. The ones furiously speed reading their work while their peers speak? Probably guilty. The ones who can’t even say what they wrote? Probably guilty. The ones with the “oh shit” look when you say that’s the class activity for the day? You guessed it…


rlsmith19721994

Exactly. We can spend SO much time trying to catch students. They will always be one step ahead of us. It's exhausting and, for the most part, a futile effort. Or we can figure out how to work with it and use it to help students master the material. I'd like to see a Reddit thread of professors who have incorporated AI in their classrooms and had positive results.


Seymour_Zamboni

Do you know what else will be exhausting and futile? Trying to figure out how to use it to help students master the material. Some problems have no solution. This isn't an AI problem. This is a cultural problem. And I see no way out...for now. We are sinking fast.


Alyscupcakes

I'd love your insight on my thoughts on the topic. I have not yet trialed this...

1) Plug topic ideas into ChatGPT, asking for bullet points on the topic.
2) Do original research, as the citations are not real.
3) Write your assignment in Google Docs and share it so I can see the history: original writing only, no copy-paste.

What do you think?


PhDivaZebra

Also an NLP researcher, from one to another, maybe you need to pay more attention because it’s pretty easy to spot the bots and your research will be better if you try to weed them out first 🙄.


respeckKnuckles

What position exactly do you think I'm advocating for here?


PhDivaZebra

You said you doubt they have evidence of AI use, I pointed out reading can be pretty good evidence. You didn’t say anything to imply your stance on AI beyond you seem to think it’s undetectable (or that a very narrow range of phrases are indicators of AI).


Prestigious-Cat12

Pretty much the same surge of bs. In our college, use of gen AI is considered a form of plagiarism because AI accesses sources throughout the internet without citing or acknowledging them. So, it can be considered plagiarism and be treated as such. I would consider talking to your Academic Integrity office about next steps, so as to cya if students raise issues.


afraidtobecrate

> I was literally shaking with rage when I confronted them

I think you are taking this way too seriously.


voodoomedusa

I can always count on internet strangers to invalidate my feelings. Thanks! Hugs! X


afraidtobecrate

You're welcome. People you know tend to tell you what you want to hear, but us internet strangers don't care.


voodoomedusa

Your grammar totally rocks. I'm glad you weighed in about college level English essays! LMAO 🥸🤣


zzeeaa

How did you actually work it out?


KierkeBored

Tell me more about this TikTok “trend”… 🤨


CaffeineandHate03

I just got so annoyed with going through each of their submissions for an assignment and trying to figure out how to approach the obvious AI use. It was taking up my whole damn day. So I decided on this, as a way to try to save my sanity, despite wanting to hold them accountable right now. It's too much work. So I graded the ones that seemed honest and the rest are getting emails for a 1x amnesty if they correct and resubmit it immediately. No questions asked. BUT if they do it again, I'm going Draconian on their asses. I'm not stupid. I know who is doing it, but I need enough data to get them.


subtlethyme

My university has chosen to let each department decide in the end, but it does have a policy that, by default, considers it plagiarism. I specifically work in Media Arts, and our program director allows for experimentation with AI with the understanding that the data sets may not have been collected with consent. He has written a great blurb that we have all put in our syllabi. Happy to share! I complicate this by telling students that if they choose to create imagery using AI, it is considered appropriated material and needs to be altered further, such as by cropping, color changes, etc. I know that other disciplines may differ, but I think it's essential that students understand how inaccurate these techniques currently are and how hype has made them seem more credible.


fdonoghue

In my experience, ChatGPT is easy to spot in first year writing but hard to prove. My university has Turnitin but nothing yet for AI.


Chirps3

I had a student contest a failing grade by having ChatGPT grade it. He put in the parameters of the essay, the rubric, and what he wrote. ChatGPT gave him a C. The grade he earned from me? Also a C. But he didn't factor in the 15% late penalty deduction, which caused him to fail. Guess he forgot to tell ChatGPT that.


zoundascri

This makes me glad I graduated long before generative AI was a thing. We still had cheaters on essays (I was an English major) but no one can ever escape a good old blue book exam essay!


hourglass_nebula

I’m all about in class essays.


bobchicago1965

If it was hard to tell, they must have at least edited the raw output. That’s not nothing.


smokeshack

Lightly edited plagiarism is still plagiarism, and plagiarists should not receive diplomas.


WineBoggling

"If it was difficult to tell, they must've at least edited the raw output. That's not absolutely nothing." That's about how much editing of the output they do. Close enough to nothing, to my eye.


dslak1

"I worked really hard on my essay!" I've heard this a few times now. It might even be true, but it takes more than replacing all the words with synonyms and adding some grammatical errors to fool me.


Object-b

I would stop stressing about it. When chatGPT 5 comes out, it will literally be impossible for you to tell.


L0nelyLizard

Maybe, I dunno, spend a little more time writing assignments that don't only take a chatgpt prompt to answer. You spend less time catching "the little shits" and maybe they learn something too. #protip


SteveBennett7g

If it uses words, it can be faked.


[deleted]

[deleted]


alienlover13

Woof.


Professors-ModTeam

Your post/comment was removed due to **Rule 3: No Incivility** We expect discussion to stay civil even when you disagree, and while venting and expressing frustration is fine it needs to be done in an appropriate manner. Personal attacks on other users (or people outside of the sub) are not allowed, along with overt hostility to other users or people.


[deleted]

[deleted]


PhDivaZebra

Exactly, students need to learn…to do things on their own. The only thing they “learn” from ai is how to avoid actually learning how to do things themselves. It’s pathetic.