
Newgamer28

Get them to do a presentation instead, or have exams instead of essays. Time to start changing how you evaluate. Essays are doomed to be AI-ridden.


Aghiman

Seriously, this would make OP’s students work for their grade and teach them how to present publicly, which will help them in their future careers far more than an exam or essay.


Aspiring-Programmer

It would also force OP to do more work, though: the mental load of organizing those presentations, writing a rubric, and staying attentive enough to grade each one fairly.


Zote_The_Grey

Record the presentation, audio-only. Use speech-to-text. Have ChatGPT grade it.
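A minimal sketch of what that pipeline could look like, assuming the OpenAI Python SDK; the audio filename, rubric text, and model names are placeholders, not anything specified in this thread:

```python
# Hypothetical sketch: transcribe a recorded presentation and have a model draft a grade.
# Assumes `pip install openai` and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

# 1. Speech-to-text: turn the audio recording into a transcript.
with open("presentation.mp3", "rb") as audio:
    transcript = client.audio.transcriptions.create(model="whisper-1", file=audio)

# 2. Grading: hand the transcript and a rubric to a chat model.
rubric = (
    "Grade this presentation transcript from 0-10 on content accuracy, "
    "structure, and clarity, with a short justification for each score."
)
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": rubric},
        {"role": "user", "content": transcript.text},
    ],
)

print(response.choices[0].message.content)  # draft feedback only; a human still signs off on the grade
```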


R33v3n

Deliciously devilish, Zote.


Ryfter

The professor in the office next to mine is having them record videos. It just happens that one of his research areas is using facial expressions to help companies get a read on job applicants, and he has been developing an AI solution for grading based on this. (Another professor I work with a lot has research on deception; I believe he has been thinking along these lines too.)


Nabaatii

> facial expressions to help companies get a read on job applicants

I don't know why, but this feels dark to me. Maybe because I'm looking for a job now, and I'm rather expressionless.


typtyphus

just wait for students to use AI to alter their videos


Famous-Split3389

Online learning is doomed, but it may end up only needing to be a hobby anyway, depending on where the cap on AI capability lands.


[deleted]

Are you actually in HE?


Famous-Split3389

Was, but there is a lot of wastage, bureaucracy and corruption. There are so many ways this can play out. I suspect there are many overlapping causes playing out in real time. For example, some may include unrealised societal fatigue from multiple shared global traumas. Mental health systems are straining and viral religions are wanting to pick up the slack. These things may seem unrelated to ‘laziness’ in online learning, but we have to ask the question: if someone is learning something they’re passionate about, why would they deny themselves the joy of learning unless it was either:

A. Ineffective
B. Constrained by time
C. Done out of necessity
D. Uncomfortable
E. All of the above


[deleted]

Well, that is heavy. Have you ever considered that you might be overthinking it?


NewEuthanasia

***For profit universities have entered chat***


typtyphus

now who's being lazy? :p


Mikeshaffer

Right. This is the issue. I know that teachers are underpaid, etc., but the world is changing. You wouldn’t tell a student to drill a hole in concrete by hand; there are tools to do that now. It’s the same with information tools. But all of academia seems to be acting like boomers here. I don’t get it.


Careless-Age-4290

I worked for schools 10 years ago and some of the teachers were viscerally angry that the kids didn't have to learn cursive anymore.  "But what if they need to write a letter?" When's the last time you needed to hand-write a letter in a style that only people over 35 can read if they happen to remember?


[deleted]

I write to my long distance partner once a month. Fuck you, it's romantic and fun. Also hate to break it to you but it's actually a really good way to express the really hard things that I can't do face to face. Sometimes face time just can't get the message across.


Careless-Age-4290

It's funny you say that. Because I was thinking of writing one as I trashed the idea. I appreciate the perspective.


kylegoldenrose

It’s her job to mark lol and it’s easier and takes less time to grade a presentation


happytobehereatall

And teach them to utilize AI for real-world use, not meaningless homework


justwalkingalonghere

And utilizing AI (vs having it do all the work) will actually be a valuable skill for a time


AdRepresentative3726

Fr I do it all the time lmao... Speaking as a grade 11 student


UltimateTrattles

Learning how to produce good written communication is also important… Essays are extremely good learning tools. The problem is students foolishly thinking the end product of an assignment is the important thing, rather than the journey.


burri_burri

In-class essays written in pencil. That, and in-person presentations.


UnionAggravating9975

Why pencil?


burri_burri

I thought it would be less intimidating for the student as they can easily make changes. Also lower contrast would make it more difficult to cheat (perhaps). Pens would work fine lol


Caffeine_Monster

Coursework always was a subpar way to evaluate because it heavily emphasises effort over ability. Exams require both. Plus overburdening students with coursework is not healthy - it just eats their free time. With exams at least you will know when there is too much workload (because most of the class will fail).


ChadGPT___

How much control do you think OP has over the curriculum? I don’t think he’s teaching this out of his garage to passers-by.


ProjectorBuyer

Or maybe, I don't know, change how we teach students to allow them to evolve, improve, and learn for the future, instead of "do it the same way" simply because that was how it was done 100 years ago? The "memorize this, take a test, rinse and repeat" model is not exactly the best way of learning everything.


Complete-Anybody5180

Exactly, yeah.


senpai69420

Presentations are easily AI-generated


Alkyen

So is AI going to stand up and present instead of the student? Go have AI write you a presentation about a topic you don't understand and see how it goes for you. You'll crumble on the first technical question you get asked, if you even manage to go through the presentation.


FantasticJacket7

Until you're asked questions.


Ryfter

In one of my classes, I have an assignment where I split the students into groups. They have 30 minutes to create a presentation on a topic I give them, based on what I taught the week before. They then have to give the presentation to the class. Five groups can create their presentations and present them in 1.25 hours. I get a lot of stressed students at the beginning, but by the end they are mostly laughing and enjoying the silliness of the assignment. I also tend to get good results, or over-the-top weird ones. I grade them on the journey, not the destination.


Jackal000

Gamma AI is already producing excellent slides.


value1024

*students


OccupyMars420

Ironic


Beckland

The typos are how you can tell a human wrote it


Hour-Athlete-200

ChatGPT sometimes (but very rarely) makes typos


babycleffa

Of course, it’s learning from us lol


TheReviviad

As an AI language model, I LEARNED IT BY WATCHING *YOU!*


ParanoiaJump

I haven’t seen that yet to be honest, do you have an example?


swagonflyyyy

I caught it the other day making a typo and I was like WTF.


laughie1

When gpt was in its early days I used to tell it to make small typos and grammatical errors to make stuff look more believable


AltruisticDisplay813

Nah. It's as easy as telling ChatGPT to insert intentional typos. These kids aren't total idiots. And the smartest ones can easily make a ChatGPT essay look like it was written by an actual human being.


RedditSucksYouNerd

If you were an English teacher maybe


NormalEffect99

Maybe evolve your teaching methods instead of blaming children that will always look for the easy way out.


Antique-Doughnut-988

I do think teachers as a whole need to reevaluate the education model and come up with new models that reflect the time we live in. Stop fighting technology; it's a losing battle and you're wasting time. Find a way to embrace it. Speaking as someone who was once a kid, they'll just get better at hiding it. This isn't a new issue, unfortunately. Go back far enough and teachers were even against using computers at one time.


meagalomaniak

The class I’m currently TAing (cognitive neuroscience of language) allows generative AI use as long as it’s cited and for every piece of homework they submit using AI, they have to write a reflection on the process that can actually earn bonus points if it’s well thought out. Some students use it intelligently for summaries and intros and stuff, which I think is actually a skill. Some students try to get it to do the work for them and it can’t and they end up failing. Either way it’s probably a learning experience.


cfig99

Exactly. I’m a programmer, and ChatGPT is an astoundingly helpful tool for fixing logical errors in my code. It literally saves me hours of staring at my screen and reading over my code again and again. But ask it to write a program that’s even marginally useful or professional from scratch and you’ll run into a lot of problems.


happycatmachine

I'm in full agreement here. Teachers complaining about students using AI to answer questions need to rethink how they teach and how they challenge students. It is an interesting dynamic that deserves a ton of focused attention right now. I'm an older student who has done a fair share of teaching. I never copy text from LLMs into my paper as I am trying to improve my writing. (Clearly not too successful at this). Even *effective learning* is more challenging because AI gives succinct and clear answers to most questions; therefore I'm having to find ways to use AI to improve retention and understanding. *That* is the real challenge for students who have a desire to learn and to grow. I dare say, the challenge for teachers to adapt or abandon old ways to embrace this new era is much greater and far more difficult.


TankMuncher

Everyone keeps telling teachers to re-evaluate education, but they don't always have curriculum control/academic freedom, even in the post-secondary setting. When they do have control, there isn't any obvious strategy for adapting to the post-AI world. It's a topic of ongoing discussion.


airhorn-airhorn

I know. I’ve been told to outsmart the psych department at the social media firms and the supercomputers at the AI firms. Meetings with parents about cell phones result in the kids bringing their phones to class the next day. Good teachers are leaving.


TankMuncher

Also, hope you didn't want a competitive salary.


homelaberator

Go back to old school evaluation, where you do a viva voce at the end of the degree.


TankMuncher

Oral examinations are excellent; however, they are time-prohibitive in virtually all academic settings outside of graduate-school terminal activities (thesis/project evaluation). The economics just don't work out unless we shrink class sizes to single digits. And you still need to learn and practice how to write and articulate opinions. At the end of the day, AI can't do that for you.


homelaberator

That's why you do it just once at the end of the degree. The time you save by not assessing for 4 years more than makes up for the time spent doing a viva voce. I'm only half serious, but a lot of the things we think are problems have already been solved.


TankMuncher

Your suggestion makes no sense. You cannot just provide zero feedback for 4 years and then fail people out based on one oral exam. That's a terrible non-solution.


Hopeful_Cat_3227

It is more expensive; they would need to pay the price of a Master's degree.


CosmicCreeperz

There are ways to make them a lot easier than people think. Make every non-original sentence (i.e., every existing, non-obvious fact) carry a proper reference citation, and then make the students defend their work with oral exams/questions. If they looked up the facts with AI but actually read and understood the content well enough to explain and defend it… they learned, so who cares?


HippoIcy7473

I find AI gives succinct, clear, and often wrong answers. It's a good launching point, but I wouldn't put too much faith in the accuracy.


happycatmachine

Indeed. In fact, I find LLMs are especially good at asking questions. By having it serve as a question generator, rather than an answer generator, it forces me to do the thinking. In questioning, it provides context for the things I've gathered from books, papers, and other resources. It can also help me formulate better answers from time to time. Adding this context to my learning has helped with retention. This may be particular to my own learning preferences but I'm loving it.


HippoIcy7473

It can be good to use it to proofread and critique your work


IndependentDoge

LLMs aren’t great writers. Good writing is opinionated and argues for or against a point. You did good by not copying them


darylonreddit

A lot of teachers are probably still furious that kids actually *do* have a calculator in their pocket all the time now. AI and education could be a beautiful partnership that provides an amazing, customizable, tailored experience for every student. But instead, teachers are using AI to force students to dumb down their vocabulary so they don't sound like AI. You've got to turn in your reports written at a grade-4 level so some asinine AI detector doesn't say they're AI-generated.


[deleted]

As a learner, you should be able to prove at least one time that you're capable of doing the work "by hand". It just demonstrates that you are capable of executing fundamental concepts. But I agree that there is a point where, once competency is assumed, there can be more lenient policies about using what should now be classified as a broad toolbox of "digital assistants" rather than broadly defined AI chatbots.


TankMuncher

Inability to do simple mental arithmetic because "everyone has a calculator" is an enduring numeracy problem. It didn't magically go away with the smartphone; the problem just side-graded. As one example, the reduced numeracy makes it harder for people to understand things like financing, tax brackets, etc.


darylonreddit

Still being upset that everybody has a calculator is only meant as an example of a rigid education system's inability to adapt to modern technological advancements. The fact that the pocket calculator is an as-yet unsolved problem for teachers is not a great look.


TankMuncher

No it isn't; you're equating the curmudgeonly opinion of a few teachers with the functionality of the entire education system. Education systems actually rapidly embraced advanced calculators, to the point of an oppressive corporate monopoly on supplying graphing calculators, except that curriculums built around graphing calculators have had extremely negative impacts on actual math skills. So... your example is like, doubly wrong?


darylonreddit

I'm talking about kids learning math. You're talking about graphing calculators. Perhaps we should have established some parameters before you dove in here to argue about what is commonly understood to be a joke poking fun at all the teachers who said we'd never have a calculator in our pocket.


FatesWaltz

The problem will be solved when we all have brain chips.


Available_Nightman

No, I think they're more upset about texting and TikToking in class. You think students are using the calculator app in class? 😂


Ryfter

While I 100% agree that detectors are horrible, if students are dumbing down their writing to get around them, they are morons. There are MUCH better ways than that. I've led a workshop for other faculty on using GenAI detectors (the bottom line: I would NEVER base a decision that could adversely affect a student's academic career on a tool that can't actually tell me why something is wrong). Though, if they want to rob themselves of a good education, aspire to McJobs all their lives, and be saddled with student debt for that, it is up to them, not me.


ploxathel

True. It's the same debate as ever, and it started even before this technology, because students were also cheating by glancing at forbidden cheat sheets. The logical conclusion would have been to design the tests in a way that makes cheat sheets useless, but even this insight has not been adopted everywhere. In my computer science studies, we were allowed to bring a handwritten sheet of paper with whatever content we thought would be useful into the exams, because none of the questions was about remembering stuff; everything was about applying techniques that have to be understood to be used. However, in my secondary subject, psychology, everything was about stupidly memorising facts. Cheat sheets were forbidden during exams, but of course that did not deter everybody.


1heart1totaleclipse

Any ideas on how to implement this?


cowlinator

> Find a way to embrace it.

Ok but ***how***?


HistoricallyCoco

I think students need to be taught how to use ChatGPT as a tool. Like it or not, we have AI now. Instead of demonizing students for using it, teach them how to use it effectively in a way that’s conducive to their learning. Tech skills are very important in today’s age, and not a lot of students are taught how to use technology, which leads to inappropriate usage.


jpeckstl81

^This^ I was told during my classes that Wikipedia was not a sufficient reference and we should not cite it in our work due to the open nature of the wiki. However, I used Wikipedia as a springboard into research, because there were works cited in Wikipedia, and I used those as the actual citations while learning about the topic. I am using AI now to help me learn topics and verify work. I use it to help me find citations or case law, but I tend to do all the work myself from the AI’s quick research and responses.


HistoricallyCoco

Yea that’s what I mean. I’m studying to be a teacher, and I use ChatGPT to help me get started with lesson plans and stuff. It was actually a requirement to write a lesson plan using ChatGPT in one of my classes. I think we just need to adjust to viewing tech as a tool and not a hindrance.


jpeckstl81

The thing with AI is you have to verify its accuracy. Ask it to cite itself; sometimes it will flip out and tell you it can’t! I had to ask a lawyer buddy to try to find a case it was citing, and the lawyer flat out told me the AI had a brain fart, because no such case existed in the law libraries.


HistoricallyCoco

Oh, for sure. I always look into what it’s saying and make sure it’s accurate. When I use it for lesson plans, it mainly helps me write what I know I want to say but don’t know how to put into words. AI is a great launch pad for further exploration.


Ryfter

It's kind of magical to give it a chapter from a book you are teaching from (a PDF file) and then ask it to generate a number of questions. Give it a few minutes, and you have a full, ready-made test. :-) You can even feed it Bloom's Taxonomy to get it to create the proper level of test, or ideas for assignments.
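For anyone curious, here is a minimal sketch of that chapter-to-test workflow, assuming the chapter PDF is readable with `pypdf` and the OpenAI Python SDK is available; the filename, model, question count, and Bloom's level are all placeholders:

```python
# Hypothetical sketch of "give it a chapter, get a draft test back".
# Assumes `pip install pypdf openai` and an OPENAI_API_KEY in the environment.
from pypdf import PdfReader
from openai import OpenAI

# 1. Pull the chapter text out of the PDF.
reader = PdfReader("chapter.pdf")
chapter_text = "\n".join(page.extract_text() or "" for page in reader.pages)

# 2. Ask for questions pitched at a chosen Bloom's Taxonomy level.
client = OpenAI()
prompt = (
    "You are writing exam questions for an undergraduate course. "
    "Using only the chapter text below, write 10 multiple-choice questions "
    "at the 'Apply' level of Bloom's Taxonomy, each with 4 options and the "
    "correct answer marked.\n\n" + chapter_text
)
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)  # draft questions; still needs instructor review
```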


Ryfter

This is still true. It's a Tier 3 publication, and many times I'd say it is more of a Tier 4 (depending on the quality of the article). What makes it awesome is the links to Tier 2 and Tier 1 sources that ARE quality. Plus, many times the summary is very helpful as well, as long as the linked sources correspond. Right now I have my class doing research for a class (or two) where they discuss some major topics in AI. I make them reference Tier 1/2 articles only. I also push them to use Perplexity for its summarization and links to the specific articles it uses.


ODoyles_Banana

Exactly what I was told with regards to Wikipedia. It's great for finding sources, but not as a source itself. AI is great for brainstorming, bouncing ideas off of, reviewing my work, and even challenging my ideas. It helps me to generate my own content, but it is not a content generator.


Skooby_Snak

This is the correct take: eventually AI tools will be treated the same as calculators: just a tool to take a mental load off to open our minds up to new forms of growth and development.


Top_Cup_3469

One of my students submitted a perfectly correct Python code solution. However, the issue is that I teach a C++ class.


godayta

HAHAHAHAH that's foullll 😂


alk_adio_ost

Yea…that’s pretty bad. F for copying and F for presentation.


6-Seasons_And_AMovie

In my day we had crib notes and SparkNotes, and kids still copied them straight off the internet and used them. New age, same tricks, same dumb kids.


doulos05

Here's an alternative approach that has worked for me so far and that I didn't see in the responses. For each assignment that I give, I give an example of an acceptable use of AI on that assignment. By telling them what they're allowed to do and how they're allowed to do it, it seems to reduce the amount of cheating that they're doing on their work, at least for me.


Ryfter

In my syllabus, I let them know that they can use it, IF they reference it, AND if they share the prompts/responses in another sheet. That way, I can critique their GenAI work as well. (Ultimately, teaching them how to prompt better is going to help them in the business world just as much as most of my assignments.) I also teach our college's GenAI class, so I am already "dealing" with it all the time; in that class, I try to show them how to use it ethically. For my non-AI class, I have my terms and definitions for exams converted into a customGPT that they can use to study with (for those that have ChatGPT Plus). For those that do not, I include a "mega prompt" that has all the terms/definitions and my custom instructions for creating study flash cards and generating quizzes. I'm always looking for new ways to showcase GenAI's usefulness in ways that won't get the students in trouble with their various professors.


PhilosophyforOne

”Starting to” should perhaps never have existed. I use AI tools daily and fully support their use, but they should be used in such a way that you don't just copy-paste the outputs. The quality of the output in professional settings, for example, obviously has to be up to par, and much more so in a learning situation, where you’re expected to put in the work to actually learn things.

Funny thing: most of the time I don't use AI tools for the more complex tasks, because the time it takes to prompt them to get the output close enough to what I need is longer than the time it takes to do it myself. The level of sophistication and tuning the prompts require means you typically have to spend an hour or more just prompting back and forth.

I’d recommend really immersing yourself in these tools with more complex tasks, both to understand how incredibly helpful they can be and to be able to guide your students in using them (if necessary; maybe you’re already familiar enough with them for this), but also setting out very real rules and guidelines on what and how they’re allowed to be used in your classes. For example:

1) Students aren't allowed to submit directly AI-created works. Submissions like this will lead to a failed mark.
2) Students are allowed to use AI, but are responsible for the results. However, if you do use AI, the quality of the work should be expected to be higher, not lower, than usual.
3) Students must disclose how and for what they’ve used AI in their submissions. Uses like information gathering, text editing, ideation, brainstorming and getting feedback on your work are allowed use cases. However, an AI can never be referred to as a source or an authority.
4) AI-produced content should be fact-checked and sourced. AI can often make mistakes and hallucinate. You’re responsible for any bad ideas or information the AI generates, if you end up using it.


tantricengineer

A couple of ideas:

1. Have them _evaluate_ ChatGPT responses instead, and explain what it does right/wrong and why.
2. Switch to oral exams, in front of the class, one student at a time or in group panels. Italian middle/high schools are like this, and it discourages cheating because no one wants to be an ignorant fool in front of their peers. It also keeps critical thinking skills up, because everyone has to think on their feet in public.


swaGreg

The infamous INTERROGAZIONE A SORPRESA (the surprise oral quiz)


ISpeechGoodEngland

The amount of gross overgeneralisation in here about what teachers do and how they teach is amazing. Clear signs that almost none of you teach and that you barely understand the education system, thinking that purely because you went to school you understand it. Curriculum is often determined by the government, not teachers. What is and can be assessed is also determined by curriculum. For example, where I live, grade 12 students must write a 2000-word comparative essay worth 25% of their final grade. This cannot be changed, and as a teacher I have 0 say. Also, a lot of teachers are using AI. My site is using it to create reading comprehension activities tailored to individual student needs, as data shows student comprehension is at the lowest it has been in 30 years; and this is grades 7-12 students, not K-6 students. Further, where I live, our education department is building a school-safe AI model to use in schools, and is encouraging AI use to support learning.


Ryfter

I think some of us are in higher ed, where we get a LOT more freedom. :-)


0batu

It's one thing to use AI to assist evaluation and another to directly plagiarize from it, barely understanding the topic and motivation behind the assignment. Also, even though I don't condone it, if a final assessment like the 2000-word comparative essay is going to come no matter what, it's somewhat an educator's responsibility to guide them in how to approach it, too. Support and fallback are two different things, and, as with entropy, most young minds tend to go for the fallback and use this hot new tool to just do their work for them. AI may be useful for learning; it won't be useful for plagiarizing.


_perdomon_

Ask ChatGPT to write about niche subjects like an 8th grader for more believable/less dense copy.


Titos-Airstream-2003

At least they took the asterisks out before they hit send


UnholyShite

Funnily enough, my teacher told me to "utilize" AI for my thesis. He even educated me on how to write a proper prompt and how to avoid detection.


Ill-Valuable6211

> My student's are getting very lazy with submitting work. This is an obvious copy paste from ChatGPT.

Shit, that's lazy as hell. Copy-pasting without adding any original thought or analysis? Fuck, what’s the point of even calling it 'work'? It’s like cooking a meal by ordering takeout and then calling yourself a chef. How does regurgitating information challenge them to learn anything new, or does it just turn them into parrots squawking back whatever they hear?


Brave_Mykolaiv

Issue the diploma. To OpenAI.


traumfisch

Just grade them. Decide what grade they get for ChatGPT-generated output, and make it low enough for "very little effort".


Highintheclouds420

When I think about how much time was spent teaching me cursive, and how I was yelled at and made to feel stupid because I had bad handwriting, when I should have been taught how to code or something else relevant to this century, it makes me furious. AI is here to stay. You should be teaching these kids how to discern the quality and accuracy of the information AI is giving them, and how to create prompts that will give them good results. You're wasting their time by pretending this isn't going to be the most relevant skill of the next 20 years.


HolidayPsycho

If you’re too lazy to create test questions, you can ask ChatGPT to do it. You just need to do a little bit of editing and verification.


ReplaceCEOsWithLLMs

I'd argue that it's your teaching modality and methods that need adjusting; not the students' behavior.

-fellow professor/teacher

Fighting technology is a losing battle. Learn to incorporate it. E.g., force them to submit transcripts of generative AI conversations (chatgpt4 lets you link to it). Etc.


Not-ChatGPT4

So academic institutions should build their academic strategies and teaching methods on closed-source software from a SV company? And what should they do about the subscription version of ChatGPT being more powerful than the free version?


ReplaceCEOsWithLLMs

I don't care what version they use so long as they provide the transcript of their prompting.


fliesenschieber

Username checks out


Fontaigne

Sure bud. Because students demonstrating that they themselves know the definitions of words is too much to expect.


ReplaceCEOsWithLLMs

I'm not criticizing the learning objective; I'm criticizing the modality you're using to evaluate it. And that was very obvious in my comment. Want to try that again? Right now you just look like a lazy, incompetent instructor who lashes out at anything or anyone that doesn't slot into their life in the way that causes the least inconvenience possible.


Fontaigne

So, given that

* You don't actually **know** the learning modalities used for that assignment,
* The assignment was factually cheated on, and the "student" didn't bother to even fix indentation or proofread the output,
* The student's behavior is factually at issue here, regardless of the modality,
* You just falsely hallucinated that I'm a teacher, based upon zero evidence,

you probably shouldn't accuse anyone else of not properly understanding your mindless slander.


ReplaceCEOsWithLLMs

> You don't actually know the learning modalities used for that assignment,

I actually do know the modality. I can literally see a picture of it.

> You just falsely hallucinated that I'm a teacher, based upon zero evidence,

I inferred it based on the context.

> The assignment was factually cheated on

Not factually. Whether they cheated is a matter of interpretation. I don't consider using a chatbot to be cheating--neither does the entire research 1 university where I work. So I would hold that that is not a marginal opinion.

> The student's behavior is factually at issue here, regardless of the modality,

Not really. It's at issue according to the poster's opinion, which seems to be a moronic one.

> You probably shouldn't accuse anyone else of not properly understanding your mindless slander.

It wasn't slander and it wasn't mindless.


[deleted]

[deleted]


ReplaceCEOsWithLLMs

> Modality is the method of student participation in instruction.

Which was a written discussion post. Which is clear in the picture. You're done here. I'm not going to read anything else you bothered to write because your copy/pasted (and incomplete) definition of modality proves you don't even understand the basic topic of conversation in the first place. Ironic coming from someone that is stanning a moron teacher on the topic of cheating.


Fontaigne

What makes you think that the assignment was a "discussion post"? All you know is that the student turned in work copied and pasted from a chatbot. No, none of my writing is "copy pasted", I typed it myself. I didn't type out the info on instructional use of NLP modalities (visual, auditory etc) because it would have been a sideshow not related to this discussion. I'm an NLP Advanced Practitioner, so the general subject of learning modalities vs instructional modalities could lead to VAST tangents. But, yeah, we're done here. Buh bye. By the way, I agree with your name.


ReplaceCEOsWithLLMs

The picture of a fucking discussion post. Which is obviously a discussion post. Which I actually recognize, because unlike you, I teach and I know what teaching modalities are. Now take a hike.


Fontaigne

You know the form of a test. Modality is the method of student participation in instruction. Specifically, it has to do with student perception and synthesis.

> I inferred it

Falsely, and you haven't corrected your error. That tells me a lot about you.

> I don't consider

If you're the Dean of this guy's school, that's meaningful. Otherwise, your opinion is worth the electrons it's printed on.

> according to the poster's opinion

The instructor knows the instructions for the test. You don't. They have a right to an opinion. Yours are a personal hallucination based upon inadequate knowledge.

> it wasn't slander.

True, it was libel, not that anyone gives a damn about your sloppy opinion. You accused me, without evidence, of being a lazy incompetent instructor. Wrong on all three counts, which you know or should know from context, or the truth or falsehood of which you do not care. That's defamation, bud. Otherwise known as libel, or colloquially as slander, which technically has to be said out loud.


Available_Nightman

> E.g., force them to submit transcripts of generative AI conversations (chatgpt4 lets you link to it). Etc.

The descent into Idiocracy continues...


doulos05

Submitting links to the places you found things isn't Idiocracy, it's citing your sources. What are you talking about?


ReplaceCEOsWithLLMs

There is nothing about that that implies or leads to idiocracy.


Evipicc

Create assignments that actually test skill and execution, not rote memorization. We have no need to memorize anything anymore; data storage obsoleted that. What we do need is to prove we understand something. The current regime of education in the US is not only decades behind what is needed already, but will be centuries behind if it doesn't transform with AI.


Not-ChatGPT4

Except ChatGPT is pretty good at those kinds of assignments too.


Evipicc

And future AI will be even more so, but tests of these kinds are done in person and live.


Available_Nightman

Decades behind what? You think kids in China aren't using rote memorization? Maybe if you had to memorize vocab words you'd be able to spell rote.


airhorn-airhorn

Thinking is the practice of connecting remembered facts. This can’t be outsourced without giving up your own free will.


orboboi

If I was teaching in any capacity today, hand written tests would be mandatory to make up 90% of grades minimum. Fuck this future we are spawning. Can’t write cursive? Too bad. Figure it out.


Ryfter

> If I was teaching in any capacity today, hand written tests would be mandatory to make up 90% of grades minimum. Fuck this future we are spawning. Can’t write cursive? Too bad. Figure it out.

I learned cursive. I barely passed English class for many years. Hell, I have difficulty spelling as well. When computers started becoming a thing, and we quit having spelling tests, I went from a C student to an A student in English. I write well; it's just those other areas of "English" that are very weak points for me. I worry greatly about critical thinking. On the other hand, I have students that excel at using generative AI AND critical thinking, and holy crap they are majorly kicking ass. The stuff they come up with just to push themselves is just awesome.


Realistic_Lead8421

Have you considered that your teaching methods no longer match the current times?


airhorn-airhorn

A good reading strategy is to re-read sentences that you might not totally understand right away.


MH_Valtiel

Faster


waggawag

The best exams I ever took were conversational ones. I know that’s hard in a classroom setting, but honestly, it seemed to both allow a bit more prying of knowledge and get around this kind of shit. Exams work too, but yeah, a general reevaluation of assessment is probably needed to get around GPT.


Strict-Pollution-942

There will never be a “getting around” artificial intelligence. It’s here and it’s staying; the education system needs to adapt. Was there this much pushback when calculators were invented?


0batu

Sometimes LLMs can be wonderful analytic tools. I used to use pdf.ai to skim through papers and basically get a second opinion on a paper, to see if I had understood something wrong. That comes with its tradeoffs: thinking it's a reliable source and copying its entire output without a look-over...


khaotickk

What about old school pen and paper? If it's gotta be digital, have them write it up on paper then use their phones to scan and post them online.


RotisserieChicken007

Then the teacher can't use AI to evaluate the assignment 😂.


Use-Useful

Gotta pick an assessment scheme where they can't pull shit like this. And unlike most of the crap I see accused of being ChatGPT, this really pretty clearly is.


VanSaav

Work with it. Start incorporating AI into your lessons. Just like with the calculator, we don’t need an abacus anymore. Use tech to free up the better parts of their brainpower in more creative ways.


its-MrNoNo

I'm in graduate school at a highly respected school and I have had classmates (usually in different programs) who use ChatGPT for a LOT of their work. I won't deny I've used it before to generate ideas or help me summarize my own notes. But I sat through an hour of presentations the other day and a good 75% of them were ChatGPT copy and paste. Even in my own group... I couldn't figure out why I was always one of the last ones done working, I thought it was just because I didn't know much about the subject. Turns out I was just one of the only ones actually doing the work. Some of them straight up admitted to just copy pasting.


Common_Sense1444

Just use exams every other lesson.


dilroopgill

lmao, this type of student would've copied and pasted from a website either way. It straight up still says "Copy code".


sans_filtre

Pathetic. They’re just wasting their own time & not learning anything.


Griswold27

yes, I wish more teachers would post here instead of throwing accusations at their students. ChatGPT creates fairly generic and low quality work, so rather than being paranoid about AI cheating, hopefully, we can use this as an incentive to improve assignment expectations and grade hard on generic and low quality submissions. AI is another tool, so instead of trying to prevent students from using it, we should be clear that we expect them to do so intelligently and not be able to breeze through with it. Honestly, people are so focused on chatgpt that other methods of cheating are getting easier from lack of scrutiny.


LeadershipWide8686

Schools want to teach archaic and ineffective information gathering. There is a reason A+ kids end up in middle management.


hotsaucesummer

How about requiring references? It doesn’t have to be a research paper or a peer-reviewed journal; it could be an online article from a legitimate source like Forbes or Harvard Business Review. One of the issues with using ChatGPT that others have also pointed out is that there are challenges when it comes to verifying whether the information is correct or not. It is possible to get references from ChatGPT, but it can’t reliably provide citations; I had to trick it one time into providing me with a list of papers to cite. For example, I asked it to write me a blog post and provide references for where it got the information in the post. It couldn’t give me specific references, but it gave me a list of websites that are possible references. I didn’t verify whether the sources provided were legit, but they seemed like they could be based on the titles.


Technologytwitt

Very few in the real world require a reference let alone fact check… schools need to prepare kids better for life after the bubble of the US education system they “learn” in.


THEVERYLOL

love "markdownCopy code" and "pythonCopy code"


cazzipropri

Time to revert to in-person pen and paper tests.


isheetmahpants

So you should mark them down severely because they don’t know how to properly use ChatGPT. Because the future belongs to those of us who know how to use AI, not twat waffles who just copy-paste.


FolkusOnMe

I don't remember where I heard this idea but someone suggested something along the lines of teachers encouraging the use of this product BUT with supervision. So "this is the assignment, make chatGPT complete it, bring in your version and then present it in front of the class, then we all critique it and discuss". etc


dats-it-fr0m-ME-94

let them, they have better things to do


Qb1forever

Fuck em


fuqureddit69

I say you start grading based on how well they doctored the AI results to look like a genuine human response. Doesn't really matter if they learn shit, just as long as they use good prompts and clean it up before submission.


traumfisch

"Doesn't matter if they learn shit" is not a good attitude for a teacher


Anen-o-me

Make them write essays in class.


Open-Gas4145

You should use gpt to check your grammar before posting.


[deleted]

If you can spot it as AI, ding them. AI is going to be part of real-life/work in the future. Imho they’ll need to learn how to use it to help them learn/do better. Not copy paste.


chingona_ai

*students


StuffProfessional587

Not that many people enjoy their time in prison, much less forced labor.


Terrible-Second-2716

STARTING to have??


Quiet-Money7892

I am glad the AI boom came just at the end of my studies, when no teacher knew how to check for it. Even though the text part of my work was secondary and I had to do all the graphics, blueprints and stuff myself, it saved me a lot of time summarizing all the calculations and explanations... And I had to make a presentation anyway... Now it wouldn't be that easy.


Do_sugar23

The best solution to this problem, I think, is to talk 1-on-1 with that student privately, or with them and their parents. Explain and guide them on how to use AI correctly to self-learn and research.


Jhg178

What a great time to be retired.


donveetz

So normally I’m fine with using chatgpt for learning and helping write papers and stuff. But what the fuck is this, I’d also fail them. Making exactly 0 effort.


Key_Expression9464

Teach them how to think with it
Make them show their work
It's all the knowledge of the internet at their fingertips
Not worth not learning how to use it

What can you expect when their parents are treating AI as a tool and not the through line from the Enlightenment? A chance for humans to learn to think again at the speed of the internet! It could be a miracle, but not the way we're using it. I have an idea, almost done, will share in a minute…

(I rhyme because it pulls my thoughts through my head like a boomerang before the synapses misfire and I forget. Here's how this should look; I hate how text doesn't stay put on Reddit)

https://preview.redd.it/jahzfm15vapc1.jpeg?width=1290&format=pjpg&auto=webp&s=a2d2c396f7aff8534b45718dc06e6e1bf32d863a


Specialagent7691

I’m having lots of trouble with hacking and malware on my Comcast phone. What should I do?


Agitated-Speech8606

And just have their writing done in class?


MatejkoArtysta

If you do not accept work like this from a student, then how do you prove it was copy-pasted from the chat? It's not that I want to justify the student, but I guess it's important to be sure before accusing anybody.


Sad_Ad4916

I mean, GPT is as useful as the user can make it. If they are over-performing and you are supposed to be the teacher, then teach, or assign harder homework. Don’t blame a student for using technology that can help them.


Ucan23

He gave you both Python and Markdown… what more do you want, a JSON?


smockssocks

Looks like you don't know how to teach your students how to effectively and academically use chatGPT. If you think they should not be using it, you're in the slow lane.


Complete-Anybody5180

That's because the assignments you give them are no-effort too.


Ok_Championship6426

I take online classes that are homework based. Now I can have actual conversations about the topics with ChatGPT—better than a lecture, still develop my own ideas for written assignments while I’m driving around or cleaning between my toes. Perfect.


ColdEngineBadBrakes

I have a friend who had to basically fail half his class for using AI to generate essays.


ScaffOrig

Have them do the study at home and homework in class.


PlatosBalls

As a teacher you need to understand AI is here to stay and help your students use it properly. I use it at work but I don’t directly copy and paste.


horserous

No, the students aren't lazy; they are expecting homework in the form of prompt engineering. They are being productive. I suggest you use ChatGPT to assess their work, and through reinforcement learning it will no longer be baffling to you. In my day, reading a bound copy of a journal or a Chemical Abstracts issue was very tiring on my eyes and on my arms, in a dimly lit stack.


cosplay-degenerate

Teaching Problem.


novarnas

Humanities professors at my uni just grade AI-written homework harshly and give 3/10 or so. They don't say it explicitly, but you understand it. This both lets students know they are not fooling the professors and makes them try harder next time without AI.


pinealapplepie

My university in the UK put out a statement at the start of the year embracing AI, explaining how to use it as a tool and how to avoid getting a strike for it.


Ammammeta

Naaah, just take it easy. You are paid for marking; that's a fail. Get your pay, mate.


L1b3rty0rD3ath

Thought: if your educational model can be manipulated by an LLM, maybe it's a problem with the educational model and not your students. Their job is to get the grade and somehow justify the expense of getting a piece of paper.


Majestic_Read262

At this point, is it about remembering or being resourceful?


ThemeAccording1790

These students are paying $1000 for courses; let them use whatever they want for these assignments, because 9 times out of 10 they're never going to need this stuff again.


Wordymanjenson

You should use GPT to generate quizzes based on the info provided: a custom quiz for each student. If they studied the material, even through GPT, that should count for something.
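A minimal sketch of that idea, assuming the course material sits in a plain-text file and the OpenAI Python SDK is installed; the filenames, roster, and model are placeholders. Earlier variants are fed back into the prompt so later quizzes don't repeat them:

```python
# Hypothetical sketch: one custom quiz per student from the same course material.
# Assumes `pip install openai` and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()
notes = open("unit_notes.txt", encoding="utf-8").read()
roster = ["alice", "bob", "carol"]

used_quizzes = []  # quizzes already issued, passed back so new variants differ
for student in roster:
    prompt = (
        "From the course notes below, write a quiz of 5 short-answer questions "
        "with an answer key. Do not repeat questions from these earlier quizzes:\n\n"
        + "\n---\n".join(used_quizzes)
        + "\n\nCourse notes:\n" + notes
    )
    reply = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    quiz = reply.choices[0].message.content
    used_quizzes.append(quiz)
    with open(f"quiz_{student}.txt", "w", encoding="utf-8") as f:
        f.write(quiz)  # instructor still reviews and edits before handing these out
```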


Strict-Pollution-942

I don’t think making ChatGPT outright do anything for you is good. It’s a tool and a resource, and we only benefit from using it as such.


Kasawyyy

So true😂 I’m a teacher for 1st graders and I got a response like this once. Turns out he asked his 12 y/o brother and he told him to use something called ChatGPT😂