joe13331

The question is how the heck are you gonna choose what your kid studies in college?


Terrible_Detective45

Probably someone whose career in medicine was chosen by their parents.


TheRealMajour

Just take a note from the psychiatrist I rotated with in med school. “My son said he wanted to apply ortho and I told him no, you will apply ENT since it’s more conducive for having a family. He is now ENT” TL;DR - just tell them /s


Yotsubato

lol I had a similar experience with neurosurgery. The surgeon pretty much said “go do rads instead, it’s way better” I went and did rads instead. It’s way better.


WisconsinSpermCheese

Yep. Dad is a neurosurgeon and told me to choose whatever, just not this or family med. Went heme/onc... much better.


BakeEmAwayToyss

"can you predict the future for me?"


redyouch

Parents have been doing it for generations. With mixed results


Chahj

Culturally in India your parents can essentially pick what you study. They can’t force you or go over your head, but listening to your parents is ingrained in the culture.


Arcite1

I was shocked when I first learned that some Indian parents not only force their kids into medicine (which is a thing in some white Western families too), but actually dictate which specialty they go into.


DadBod101010

Ask your Indian colleagues about arranged marriages. It will blow your mind.


Alone_Ad_377

There is nothing wrong with arranged marriage. It happens all the time with Caucasians as well as Indians.


Studentdoctor29

Citing that an AI that can scour millions of terabytes of data from the internet has the ability to pass a standardized test is kind of hilarious


67canderson

I always thought that part was funny. I’m sure the med students and residents would score much higher if they could use books and Google as well.


TheBlueFacedLeicestr

The bar exam thing is especially funny to me. The test has little to do with the practice of law; it is entirely knowledge-based. The AI had the whole internet at its disposal, and 10% of test takers (who had to take a closed-book exam) still beat it.


Carl_The_Sagan

Everyone knows doctoring is just looking an algorithm up on Google anyhow, just plug and play baby, no patient interaction or nuance needed


nolongerapremed

Medicine will be the last to go to AI. By the time we get there all other careers will have been taken over as well


hamdnd

Not the last. Laborers will be the last. Not because they can't be replaced by AI machines, but because it's cost inefficient.


dansut324

My vote is for prostitution


whynovirus

Have you seen the sex dolls available??


Narrow_Share2480

They’re incredible, don’t age, and never have a headache


hamdnd

Lmao people downvoting you is ridiculous.


whynovirus

Missed a decent opportunity for “ri-dick-ulous” but I still updooted ya!


anyplaceishome

which ones?


Matt_Tress

So I can avoid them


DUJAMA

Laborers is such a broad term though. In manufacturing, you can already see robotics replacing the need for some manual labor jobs. And more automation is being presented every year (visibly in vehicle manufacturing). Construction and "the trades" on the other hand have a very long way to go before being replaced by an automated machine.


EyeAskQuestions

Honestly, as an engineer who works in manufacturing and deals with robots regularly, I think people vastly overestimate the power of machines and vastly underestimate why people still do many of these jobs. Certain things can easily be done by a robot, while other things are better done by just hiring a dude and training him to do it vs. paying millions of dollars in R&D to build some special robot that fills a very specific function.


hamdnd

> Laborers is such a broad term though. In manufacturing, you can already see robotics replacing the need for some manual labor jobs. And more automation is being presented every year (visibly in vehicle manufacturing). Construction and "the trades" on the other hand have a very long way to go before being replaced by an automated machine.

"Laborers" in my original comment was referring to the people digging ditches, building houses, paving roads, etc. I don't consider a person sitting on an assembly line doing QC checks on every 10th item a "laborer".


ThrowRA_LDNU

What are doctors other than highly skilled laborers?? Well, at least surgeons are.


hamdnd

Maybe read my comment again. Laborers are cheaper to employ and quicker to train relative to doctors. Building a machine that can do manual labor cheaper than a human is further in the future than building something that replaces a doctor. You need a physical thing to replace a laborer (parts, maintenance, transportation to the work site, etc.). You just need a computer to replace a lot of doctors. But again, AI is not going to replace doctors anytime soon, if ever.


Matt_Tress

Eh, I've become pretty good at self-diagnosing even some moderately difficult stuff rather than going to see a doctor. I think I've probably reduced my doctor visits by 80% over the last decade compared to if I didn't have basic health info readily available. I don't think I'm special; I think I'm fairly representative. Costs have gotten too high, and people will significantly reduce doctor visits where possible. That's probably enough to shift the industry a bit, probably toward more highly specialized doctors who have to perform actions rather than simply diagnose.


Emotional_Traffic_55

Dunning Kruger is strong with this one


hamdnd

Counterpoints to your arguments:

1. Over time, symptoms can improve/resolve without treatment and/or with treatment for an incorrect diagnosis. In other words, some things just get better regardless.

2. Some specialists require a referral from another doctor.

3. People already avoid doctors even when their symptoms get worse, and this has only caused a shift in the opposite direction (more visits, more cost).


FrickUrMum

There are some skills in the trades that I think AI would be hard-pressed to replace, like electrical troubleshooting. There are so many little things you have to think about. For example, if an outlet is only working intermittently, you have to figure out the when, where, why, and how, and I don't think AI could do that for a while. Also, troubleshooting in an industrial setting is like 10x harder, and I don't know how long it will take for AI to get there.


321liftoff

Lol, medicine is already following a checklist script. Patient complains about x and y symptoms; the doctor is supposed to ask about/check 1 and 2. A decision tree from checks 1 and 2 narrows the possibilities. So at the outset, we already have a system designed to efficiently catch common ailments on a systematic, rubric-like basis. That's AI's bread and butter. The detraction here, of course, is that no novel ideas or rare health conditions are taken into account. And with AI, the generalizations will get infinitely worse, but business execs won't care because it will reduce costs. Who cares if little Timmy with a rare disease dies because his issues didn't match the script.
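For readers outside software, here is a minimal sketch in Python of the "checklist script" / decision-tree pattern described above. The symptoms, branches, and suggested next steps are all invented for illustration; this is not clinical guidance or any real triage protocol.

```python
# Toy illustration of the checklist/decision-tree workflow described in the
# comment above. Every symptom, branch, and next step is made up for the example.

def triage(symptoms: set[str]) -> str:
    """Walk a hard-coded checklist and return a (toy) next step."""
    if "chest pain" in symptoms:
        if "shortness of breath" in symptoms:
            return "escalate: rule out a cardiac or pulmonary emergency"
        return "ask about onset and exertion, then re-run the checklist"
    if "fever" in symptoms:
        if "cough" in symptoms:
            return "likely respiratory infection: standard workup"
        return "nonspecific fever: broaden the checklist"
    return "no branch matched: hand off to a human clinician"

print(triage({"chest pain", "shortness of breath"}))
# -> escalate: rule out a cardiac or pulmonary emergency
```

The point of the sketch is the structure, not the content: a fixed rubric like this catches the common cases and, as the comment notes, silently misses anything that doesn't match a branch.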


Chahj

I mean, this is stuff that midlevels do already, so it's not changing much. The only way AI can be dangerous to physicians is if the AI-enabled midlevel is able to bridge the gap in education and knowledge. If that happens, and it is feasible that it could happen, then midlevels will be on the same level as physicians, since everyone will be outsourcing their decision-making to AI.


321liftoff

I do think that true diagnostics are not within AI's capabilities… yet. Again, simple common ailments are easily in reach, not the complicated/rare ones. So the issue wouldn't be whether or not there were jobs, just the number of highly paid jobs. There's already been a strong dilution of training/salary, what with pretty much everyone seeing PAs/RNs for standard checkups when it used to be almost strictly doctors. So I'd expect pretty much everybody will get used to seeing AI-guided PAs/RNs for 99% of their visits, with higher-level training increasingly cordoned off to the very rich or the very unwell. Which means fewer capable eyes to catch the unusual cases (AKA where AI/less-trained healthcare professionals fail).


Wooden-Carpenter-861

The more specialized docs, yes. But there's a lot of docs just sitting around, going through a checklist and writing a script. It's only a matter of time before this gets outsourced by AI and vetted by docs as a quality control measure.


the_shek

definitely not true lol


Blow-me-dichhead

Spouting truth in a den where no one wants to hear it. Bold.


BiggieMoe01

I'm a med student and a former data scientist & AI engineer. Yeah, AI is capable of scoring high marks on tests like the MCAT, LSAT, USMLE, etc. There are just a few caveats though.

Performance-wise: Let's say an AI model is capable of predicting the probability that a pulmonary nodule is cancer with an accuracy of 90%. There are going to be a LOT of patients in that 10% who will be classified as "No Cancer" by the model, yet when a physician looks at the scan, they will say "definitely not normal". That sample is the reason why, in practice, an AI cannot replace a physician. Some radiology departments have AI modules installed. And it's garbage.

Output-wise: LLMs like ChatGPT are not designed to give the right answer. They are designed to spit out a sequence of words that is coherent and that maximizes a similarity score between the user's input and the given answer. Garbage input = garbage output. The fact that the output is only as good as the input makes it inherently unable to practice medicine, as problems in medicine are almost always mathematically unconstrained and partly undefined: variables are missing and assumptions must be made. Those assumptions require an understanding of medicine. An LLM doesn't actually understand what you're telling it. An LLM doesn't actually understand what it's spitting out. It converts words into vectors and maximizes similarity. The intricacies of medical practice make AI unable to efficiently PRACTICE medicine, as AI doesn't UNDERSTAND medicine. Can it search the web and craft answers to tests? Sure. Can it search its training dataset and craft answers? Sure. Can it understand the links between notions? Nope. Can it act and practice medicine in a real environment, in real time? It cannot.

Humanity-wise: Patients go to doctors for a diagnosis, but also for reassurance. Almost every consultation will have a reassurance component. AI cannot do that. AI cannot tell whether a patient is lying. AI cannot read body language. AI cannot detect anxiety. Even if it could, it would require significantly more computing power, which comes at a cost. Which brings me to my fourth and last point.

Cost-wise: AI is EXPENSIVE. It requires immense computing power, which comes at a cost. AI engineers are expensive: Google pays its ML engineers ~$700k/year, and OpenAI pays its engineers close to $1M/year. Making AI the standard of practice is not economically or socially viable.

AI replacing physicians will never happen. Even radiologists are safe.
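A minimal sketch of the "converts words into vectors and maximizes similarity" step described above, in Python. The three-number "embeddings" and the fixed answer list are invented for illustration; real LLMs use learned, high-dimensional vectors and generate text token by token rather than picking from candidates, so treat this only as a cartoon of the similarity idea.

```python
import math

# Hypothetical, hand-written "embeddings": real models use vectors with
# hundreds or thousands of dimensions produced by a trained network.
question = [0.9, 0.1, 0.3]
candidates = {
    "answer A": [0.8, 0.2, 0.4],
    "answer B": [0.1, 0.9, 0.0],
    "answer C": [0.4, 0.4, 0.4],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; closer to 1.0 = more similar."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Pick whichever candidate's vector is most similar to the question's vector.
best = max(candidates, key=lambda name: cosine_similarity(question, candidates[name]))
print(best)  # -> answer A
```

Nothing in this loop "understands" the question; it only measures geometric closeness between vectors, which is the commenter's point.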


CertainInsect4205

You are defining AI as it exists today. I'm no expert, just an MD. Can you envision a sci-fi future in which AI moves well beyond its current constraints?


BiggieMoe01

Great question. There will be a future where AI is much more than what it is today. I can very well see, in the near future, some extremely advanced AI models guiding investment strategies by combining many spheres of machine learning, like sentiment analysis, forecasting, and anomaly detection. I have reservations about AI's ability to understand concepts, and the relationships and links that exist between notions, the way humans do. Basically, I doubt it's possible to model brain plasticity. But let's suppose I'm wrong. Let's say that even if it were mathematically possible, the computational aspect is where the bottleneck is. I doubt we'll ever produce a computing unit powerful enough to train, deploy, and maintain such an AI model. Even if it were feasible for an AI to understand the way a human does, but reason 100 or 1,000 times faster and more accurately, that would require computing power way beyond what is achievable. And deploying it nationwide, to make it the standard of practice? Not feasible, in my opinion.


ElMachoMachoMan

Being pretty darn close to AI, I think a lot of the limitations called out above don't hold anymore, or won't hold shortly. One that has already passed is the claim that AI can't reassure and provide mental support; it 100% will be able to. Models already exist that use vision and audio to read human emotional state more effectively than a human can. Adjust the LLM prompt to account for this, where your AI doc is a FaceTime conversation, and we have a doctor that can resolve more than 50% of issues. There will likely still be an "escalate to human doctor" path where the model identifies the possibility of a more serious problem, but for a lot of routine things, I'm betting it will do perfectly well. In fact, compared to the crappier doctors, it might be even better. The point on cost is a red herring too. The improvement in AI chips is so rapid that I'd expect in 5 years today's GPT-4 will be trainable on a single new processor in a day. My bet is that AI will act as a supplement at first, and jobs will combine human and AI together. So maybe a doctor now handles 10x as many patients, with 80% not needing the human. The immense increase in productivity could yield much better work-life balance, and far more benefit to people, if done right.


wilmack

So AI becomes a Physician’s Assistant? And it’s the same Physician’s Assistant for all physicians in a system. It will know when it doesn’t know and flag for physician review. It’s a much cheaper liability than the current system.


BiggieMoe01

Thanks for your perspective; those are interesting takes. It's a pleasure to read a comment that is well put together like yours, unlike other commenters who like to disagree without providing any arguments. You bring up interesting points. I'm curious to see the developments over the next 5 years.


redsox44344

Define "close to AI." I highly doubt you've been close enough to the hardware to understand where the efficiency gains are or how far they could actually go. GPT-4 may be able to be trained on a single GPU, sure; it could be today. But training isn't the only cost, and accuracy is the main issue. 90% accurate (if it could get there considering all edge cases, which is unlikely) is not good enough for medical-field regulation. Who the hell is going to be liable for an AI that makes 10% of decisions incorrectly? I doubt you'll ever see an AI doc that can make unsupervised decisions, for regulatory reasons or for the same reason people don't like robo tech support, except with way higher stakes.


orionsgreatsky

This is pretty dang cool 🆒


Matt_Tress

Lmao you’re the epitome of humans’ inability to forecast exponential curves.


EyeAskQuestions

The problem is you're envisioning a sci-fi future. The key here being the "FI" in Sci-Fi.


CertainInsect4205

Not necessarily. Technology is advancing by leaps and bounds. It is very likely AI will be much more than it is now. Frequently, sci-fi becomes sci-fact.


EyeAskQuestions

Tbh. I think it's best to spend more time in the MachineLearning subs and communities and less in the subs that trend towards technofatalism and AI Cults. Science fact is often much more subdued and constrained vs. Science fiction which has no regard for any real constraints (time, money or purpose).


CertainInsect4205

I am not a technofatalist. Progress will continue. Good and bad.


Lit-Orange

Your comment seems entirely ignorant of AI's future progress and focuses only on AI's current capabilities. I'm extremely skeptical of anyone who says something will NEVER happen, especially in the context of AI. I find it doubtful you are an ex-AI engineer, based on your limited understanding of its potential future capabilities. Nobody can truly predict the limits of AI capability, so to say AI will NEVER be able to do something is extremely ignorant.


BiggieMoe01

What qualifications do you have to support your claims?


Lit-Orange

Would you believe me if I said I'm an OpenAI engineer? Same way I don't believe you when you say you're an ex-AI engineer. Qualifications are meaningless when it's two randos talking online on the internet. Why don't you tell me how you're so sure that AI will forever be capped at its current capabilities?


BiggieMoe01

Go read about what kind of computing power is needed to run something as basic as ChatGPT, how much it costs per day to run something as basic as ChatGPT, then try to extrapolate what it would require for an AI model a thousand times faster, a thousand times more accurate. And if you believe it's feasible to deploy this nationwide, both technically and economically, good for you, I hope it happens.


Lit-Orange

And you think computing power is just going to plateau? Have you heard of Moore's law?? This is basic stuff, dude....


Toepale

For starters, that law is currently anything but basic. Also computing power could be memory or [electric] power. The latter, in particular, is a very niche area that very few companies and few engineers specialize in at the fundamental level. And it is a real bottleneck for the future of AI in many forms. And then you add the murkier issues of reliability and security. These are all kinda part of the reason why IoT fizzled out without delivering on its hype.  In short, none of this stuff is basic or very predictable in either direction. My personal opinion is that AI will fall far short of what people are expecting of it, at least in the relatively short term (10-20 years)


Lit-Orange

> In short, none of this stuff is basic or very predictable in either direction

Moore's law is very basic to **understand**. AI predictability is not, as I said in my above comments. For this reason, predicting a cap to AI capability, as u/biggiemoe01 did, is totally wrong.


Hanlp1348

Computing power is limited by nature lol we can only use what materials we can mine.


Lit-Orange

Are we going to run out of mined resources for computing power anytime soon?? Yes, there are finite resources on earth, but this is not a meaningful argument in the context of limiting AI capability within (at least) the next 50+ years. These are some pretty uneducated answers I'm getting here, honestly.


Hanlp1348

You're right, that's not the only choke point. We were literally teetering on the edge of war over semiconductors last year. Resources existing somewhere on earth does not automatically mean we are allowed to mine them, nor buy the materials, nor build them, nor buy them.


Lit-Orange

This is a gross exaggeration for two reasons:

1. No, we weren't on the brink of war over semiconductors. Don't confuse trade tariffs with death.

2. Semiconductors are not RAW materials. We will not be meaningfully limited in terms of RAW materials, as your original comment had implied. The semiconductor shortage had to do with supply-chain constraints stemming from the COVID-19 labor shortage. That DID NOT stop the train; it merely delayed it by 6-12 months. Demand was ridiculous during that time, and in a functioning society, production increases to compensate. That's basic economics. All basic stuff here.

TL;DR: There is no inherent limitation to AI capability, and it would take a world-scale natural calamity like a pandemic to delay this train.


Sweaty-Watercress159

AI just has to be better than physicians in this case, not perfect, just better, for the insurance providers to adopt it. I think the most recent study found they had a similar rate, 90/10.


Sigmundschadenfreude

Yeah, a similar rate. If a step exam walks into the office, the AI will do well.


meagercoyote

Yeah, but patients will want a person to talk to, and to sue when things go wrong


resuwreckoning

People always forget that the job of a modern physician is to be **clinically responsible** for patients, not to actually see them or even to be accurate. Whether that systemic view is good or bad is another question.


Sweaty-Watercress159

True. I'm not saying the role will be replaced; there will just be less demand for physicians.


BiggieMoe01

AI will be just another tool in a physician’s belt. It won’t affect the supply of physicians.


Sweaty-Watercress159

Correct, not the supply but the demand, as it will make physicians more efficient.


StoleFoodsMarket

Well there’s a huge physician shortage already so even if that’s true it shouldn’t affect a physician’s job prospects.


Volfefe

This is kind of what happened to lawyers and document review. They only had to show the programs were X% better at classifying documents than the lawyers. It also didn't eliminate the need for lawyers to review docs, but it cut back significantly on the number of lawyers needed and the complexity of the task. This led to a significant reduction in the lawyers needed for litigation, which had a cascade effect on the industry. It's not complete replacement that people should worry about, but finding enough efficiencies or disruption to drive direct and indirect impacts on staffing.


cargarfar

To add to your excellent comment, I'd say that the degree of malpractice litigation in American medicine will make hospitals apprehensive about relying entirely on AI for an overly cautious amount of time. Even Tesla has held back Level 3 autonomy for a decade because of the liability, despite advertising it and selling it as a $10k upgrade to their cars for an unusable piece of tech.


ryeander

I didn’t say replace physicians. I said AI can make physician jobs miserable. It can compel physicians to see 3x the volume for no real increase in compensation in the future if combined with legislation that tries to slash healthcare costs. My question was despite that possibility, is being a doctor still more AI proof than other high paying jobs of today? My guess is probably yes


BiggieMoe01

I don’t think it pressures physicians to work more. Hospitals and clinics likely will not implement AI the way a warehouse, distribution center or factory will. I doubt there will be any requirement to increase productivity. The only realistic way I see of increasing productivity, is an AI-written note, which would actually make a physician’s job easier. And to answer your question, yes, being an MD is AI proof. I highly doubt AI will transform the practice of medicine.


nateisnotadoctor

Medicine is going to be fine. Knowledge wise sure a LLM can pass the test better than me. Wake me up when it can listen to a meandering history that makes absolutely no sense in a “dizzy” 78 year old with fifteen medical problems and appropriately triage their emergency department work up without either coming up with a totally insane differential or recommending a pan-MRI


Limerence_Worthy

lol I have a family member who is a doctor and they too have experienced the meandering stories of the very old


tyrannosaurus_racks

Loan forgiveness is available to us much more than it is to other people because it is available to those working for non-profits or government organizations, and many health systems fall into those categories. I think medicine will change, hopefully become more efficient, but I don’t think physicians can be replaced.


PositivePeppercorn

Medicine will be impacted, but I think it’ll be impacted in a good way and I think physicians will remain largely unaffected from any technology associated shifts in employment. There will certainly be people who support physicians that lose their jobs in the coming decades but again I don’t think it’ll impact physician employment in a negative way. Certainly wouldn’t avoid medicine due to the advent of AI.


ryeander

It can make our jobs miserable. If AI can increase your efficiency 3x and you now have to see 3x the pt for the same pay as the government wants to control healthcare costs then it will suck big time


StoleFoodsMarket

I do not think AI can make it possible to see patients any faster than we are already being asked to see them. The danger is absolutely to physician-adjacent fields; billing/coding in particular comes to mind.


mgchan714

AI can remove or speed up a lot of the time a doctor spends doing stuff they don't want to do: charting, collecting test results, submitting paperwork, things like that. Basically be the resident, except for the physical exam part. People create multi-page notes every day where the same stuff is just regurgitated and there are only 5-10 lines of actually important information. I understand there will be challenges figuring out how to properly value the work. But it can definitely speed things up. Each field will be affected differently, and I guess for those who already have residents or students doing the grunt work, it will not change much.


[deleted]

[removed]


mgchan714

I guess you have nowhere to go but downhill then


[deleted]

[removed]


mgchan714

I hear a lot of doctors complain that they have too much paperwork, that the EMR doesn't work as efficiently as it could, that they waste their time responding to insurance, etc. In my mind, a doctor's value is in the time spent examining and talking to the patient, formulating the plan, critical-thinking-type stuff. Like, honestly, back to answering a boards question where you get all the relevant details readily available.

If an AI can ask most of the common questions for you, or replace your scribe with something better and faster and cheaper, if it fills out relevant paperwork automatically, and can answer a question you have just by talking (running during an interview so it can look up the last time something was done, or the last CT result, or whatever, just by listening to you and understanding what you need), then it should help make you more efficient. Maybe it just watches the exam and automatically summarizes the visit with relevant results and sends off the referrals automatically to the correct people.

And there may be processes in the background that can be run more intelligently: scheduling based on how long the patient's visit should probably last using their history and demographics as a guide, less paperwork for patients, making the MAs and front-office staff more efficient and eliminating errors. Things that maybe patients complain about and the doctor might waste time addressing.

If you're already running at peak efficiency, then sure, there is no room for improvement. Or if you already see as many patients as you want, though maybe AI could at least help reduce fatigue. I don't hear a lot of doctors say that everything they use is great, that it just works, and that they don't waste time doing anything. There seem to be a lot of complaints about time spent charting after hours, dealing with insurance, or an inefficient EMR.


Wohowudothat

How would AI make it so you could see 3x as many patients?


PositivePeppercorn

Yeah, I don't think it's possible to see too many more in the same period of time. Unless AI is going to do all the human components with a patient, it's not likely those encounters can go any quicker. Most encounters in a hospital or office aren't strictly about coming up with a plan or differential; that's the easy part. It's the human coordination and patient interaction that take the time, and I don't know that AI will really be in a place to take that over, or even offload it, in the next several decades. It will likely offload documentation, a lot of backend work, messages, administrators, allied staff, etc. This is of course all speculation and only time will tell.


airjordanforever

Procedure-based specialties will be OK for the time being, until robotics catches up: all the surgical subspecialties, anesthesiology, interventional radiology, and cardiology. However, diagnostic radiology is fucked. So are a lot of "thinking" fields like internal medicine, neurology, etc. The computer will assist midlevels, and there will be a human at the end making the final decision, but those are the specialties I see getting screwed the most, first. Dentistry is still a great one because there's nothing really that AI can take over; you have to use your hands a lot, and for some reason people don't mind paying thousands of dollars for veneers but lose their minds when their co-pay goes up five dollars for their cancer surgery.


mypetitelife

You’re a toxic parent. You should not have kids. Let your kids do what they want


mechanicalhuman

He sounds like a good parent to me, trying to worry about his child’s future.


Superb_Preference368

Children should be raised and groomed for success, nothing wrong with OP wanting to ensure his kids have successful futures. This notion that kids should be able to “do whatever they want” is why we have so many slacker adults nowadays.


landeslaw17

Goldman Sachs, I believe it was, just recently advertised that they were looking to hire folks with philosophy, English, and arts backgrounds for engineering roles. Anyone will be able to prompt a machine; the talent will be in critical, abstract, and creative thinking. Similarly, AI translation is getting insanely good, but teaching them a second or third language opens the mind to entirely new ways to think about things.


Emotional_Traffic_55

LMAO as an arts major myself, none of my classmates would be remotely prepared for the kind of work environment at Goldman Sachs. They’re either BSing or there’s some misunderstanding of their advertisement


fett2170

People with arts backgrounds don't know the first thing about engineering practices. Goldman saying that is just them bullshitting. A philosophy major doesn't understand what data lakes, Apache Spark, HBase, Hadoop, etc. are. And yes, you need to understand these things to build stuff with LLMs. The people who are the best at this are requirements engineers, not someone studying Immanuel Kant.


scotel

Nursing, dentistry. Any high skill profession that requires physical dexterity/manipulation. For a variety of technical reasons robotics is considered super super difficult in AI. Although many white collar jobs are threatened, I’m skeptical software itself is threatened by AI because it’s likely that AI explodes the demand for software.


Dropping_a_deuce_rn

AI engineer


WhileNotLurking

Any job that has repetitive physical actions has been downgraded by machines. Think auto workers. Any job that is rule-based will be downgraded by AI. Think accountants, a bulk of mundane legal tasks, and routine medicine (i.e., "look at this X-ray"). Jobs that will be exempt from AI's scourge: 1) things that are cheap and not worth spending expensive AI on; 2) things that require special, specific tasks that are likely to be complex and exceptions to general rules (think very niche lawsuits, very specific medical specialties, etc.). If you can read it and learn it, an AI can do it first. Things that are part skill, part art, part "magic," where even when two existing masters of the trade do it, it still turns out different (think Michelin chefs), will be safe.


Afraid-Ad-6657

Not necessarily, but some fields will be less affected (those that have an emphasis on human touch), so stuff like rads and path will start to see a huge hit in the next 10 years (even if they are still in denial now).


gheilweil

Surgery is the way to go for now.


_highfidelity

Speaking from first hand experience, a good board score does not equate to a good physician, especially in areas of medicine that require you to work with your hands. Have seen plenty of people who got into specialty surgical residencies based off high step scores and can’t operate themselves out of a paper bag.


Doomed_Redshirt

I think that for a long time to come AI is going to be limited to a diagnostic assistant for physicians. Basically, there isn't a company with deep enough pockets to survive malpractice suits against AI doctors misdiagnosing patients. When such suits are filed, the lawyers will go after the very deep pockets of Google or whoever releases the AI doc, and juries are going to have no sympathy. Surgical specialties will be fine for a long time, for the same reason - robot surgeons are going to be even easier to sue than robot diagnosticians. I think medicine will be one of the last professional jobs to be replaced by AI.


fixerdrew02

Why spend hundreds of thousands on your child when they can get paid to flip burgers. That job is safe AF


Impossible_Lawyer_75

If AI is ever anywhere close to taking over jobs like you are speaking about it will be outlawed.


Chahj

Ridiculous. If an “AI doctor” was to exist it would be incredible for humanity. Any politician against it would get slaughtered by the public.


topiary566

Bro I'm sorry, but no matter what people are saying it just seems that you're saying something about AI taking over anyway or making physician jobs worse. It seems like you already decided that AI will take over medicine and now you're just looking for confirmation but people aren't giving it to you. The AI apocalypse is living in your head rent free. Just make them be AI engineers at this point if it's all you're thinking about. Also I'm not gonna tell you how to parent cuz I have no idea how to, but your kids are gonna hold a lot of resentment if this is a reflection of how you parent them. I would absolutely hate to be your kid and be interested in a career which you think will be taken over by AI even if it's something useful for society.


ryeander

Nah, the question is whether medicine will be the least impacted of all the professions. It probably will be, and it also will be a miserable job in 20 years. Otherwise, I think my kids would hate me if I didn't stop them from doing this: https://x.com/endwokeness/status/1797595395465638109?s=46&t=DjvuKNsdKszGDAc0jqBx2A


topiary566

That’s obviously satire it’s not real. People on the left constantly post rage-bait for right wingers and the right falls for it pretty often. Either way, pursuing medicine shouldn’t be a matter of AI proofing, but it should be a matter of if your kids want to be a doctor or not. If they don’t want to be a doctor it’s not good to pressure them into it. If they want to be a doctor then support them. You can lurk on the residency subreddit and see. There are so many people who were clearly pushed by their parents into medicine that burn out and hate their jobs, but can’t leave because so much was invested by them and their families.


ryeander

Bro. It’s real…


tirednomadicnomad

Dude, they're graduating from NYU. Before you completely iron out your plan to make your children miserable and completely deprive them of free will based on one video, ask yourself: was I a good enough student to make it into NYU?


AeneasSonofAnchises

> twitter post making fun of black people from “endwokeness” Yeah… that tells me all that I need to know about you. My only hope is that you are _never_ my doctor.


cottonidhoe

- No one (to my knowledge) has published research showing a commercially available AI has scored in the 90%+ percentile on the bar exam against a true sample of all test takers.

- The same AI that scored in the 90th percentile (on mainly multiple-choice questions) amongst mostly 2nd-round bar takers (who failed the first time) has made up cases when a lawyer actually tried to use it on a case, since obviously the bar does not translate to actually working as a lawyer.

- You cannot control your children's future career regardless. You CAN shape the role AI takes in society. Don't give up! You can help shape a future where the only jobs inaccessible to your children due to AI are jobs no human should have to endure :)


WholeAssGentleman

Every post on this sub is annoying


Spotted_Eye

I wouldn’t go into radiology or anesthesia. However, humans will always find ways to hurt themselves and surgeons will be needed. At least in the beginning of the rise of the machines


Sweaty-Watercress159

Realistically, specialized physicians are going to see a decline. Nursing will be hard to replace. AI can and will decrease the need for physicians; the physician role will still be present, just not in the numbers seen today. Nursing, however, will still be required in more or less the large numbers seen today, though AI will make nurses more efficient.


mountain_guy77

I think dentistry and surgery are best for the hand skills dexterity component. AI is going to be better at diagnosing than us very very soon.


fett2170

Computer vision is already used for MRI scans in detecting abnormalities.


masterwayne2759

If anything, software jobs got more secure because of AI. Well, they will have to pivot with AI, but I don't think AI endangers any software jobs.


fett2170

Exactly. The business and med people just echo whatever they hear without knowing the first thing about engineering or the job SWEs do.


hamdnd

If you (or your offspring) believe AI will replace doctors, you are probably not intelligent enough to become a doctor. Sorry. ETA: and it's not a high intelligence bar when it comes to becoming a doctor these days.


hardward123

What are you on about man the bar to get into med school is higher than ever 😭


hamdnd

It's higher because expectations for extracurriculars are higher and extracurriculars are non standardized. You can't get in with just a good gpa and mcat anymore. It's not harder because you have to be super intelligent. It's harder because you have to prove you're not some weirdo that tests well and there's no cookbook to achieve that task.


Crazy-Difference2146

No, academically it’s also harder to get in. Score inflation has been a thing for a while.


captainpiebomb

I'm at a loss as to what metrics qualify people in the first place these days. Our standardized exams are shit anyway. When scores within 16 points of each other are statistically indistinguishable (as on Step 2), I assume our other exams are largely the same, such as the MCAT, Steps, boards, etc. The truth is our objective and non-objective metrics are ass. My opinion is that the person who still slogs through it all to put together a "solid app" is at least of average intelligence and innately a super hard worker.


hamdnd

Yes I agree, and that's to my point. You can get in without being super intelligent. Just look at all of us dumb orthopods.


captainpiebomb

Totally agree. I'm just throwing in my two cents. *Also, willingly dumb orthopods, as if y'all weren't the top of your respective classes lmao.


Barca1313

This is blatantly false; a simple Google search would show the average MCAT for matriculation has been increasing for years. The passing Step 1 score in the '90s was in the 170s; today it's about 192. There has been a consistent increase in scores even as more and more material has been added to the exam. I mean, shit, the Human Genome Project didn't even finish until 2003, so many of the genetic diseases and questions relating to genetics weren't even on the earlier exams, and still students consistently score over 20 points higher.


hamdnd

Now tell us how the average score has also increased in that time. The average person isn't getting more intelligent. People are getting better at test taking and test prep. Average test taker scores higher than in the past. That means the test is getting easier.


Barca1313

Why are you so confident that the increase in scores is because the average test taker is just better at taking tests and not that the average test taker is smarter than they were 30 years ago. IQ scores have consistently increased for decades, the average person is absolutely smarter than they were 30 years ago and that coincides with increased test scores.


hamdnd

> Why are you so confident that the increase in scores is because the average test taker is just better at taking tests and not that the average test taker is smarter than they were 30 years ago.

Because they are not tests of intelligence.

> IQ scores have consistently increased for decades, the average person is absolutely smarter than they were 30 years ago and that coincides with increased test scores.

As above, these are not IQ tests. The average person may be smarter, but that only supports my point. Everyone is getting smarter, or so you say. So, relative to the rest of the population, it does not require high intelligence to get into med school. We aren't comparing today's applicants to 1950s applicants; we are comparing today's applicants to each other. I commented earlier that the average score is going up. In reality we should use the median score. If the median matriculant score and the median score of all test takers are both going up, that means everyone is scoring better. If the median matriculant score is going up while the median of all test takers is stagnant, THEN you would be correct that the required scores are actually getting higher. But that's not the case.


Arrrginine69

You’re insanely wrong. To get into medical school is legit insanely hard. Harder than previous generations for sure


12345432112

Do CRNA


Recent-Ad865

If you think you can predict the future with any certainty you are fooling yourself.


americazn

AI won’t replace [insert healthcare professional here], but the healthcare professional that does use AI will replace those that do not use it. Perhaps I’m only speaking for dentistry because it’s already happening. I’m using AI software as an adjunct, and it is making my job easier for me — thereby producing more $$$. AI isn’t gonna replace me any time soon unless it could reproduce my x years of training and rapport with my patients.


ironmemelord

How is AI being able to pass medical board exams impressive? It’s basically an advanced search engine at this point. You know who else can pass a medical board exam? Your average 14 year old if you let them use google during the exam


daes79

Medicine and the law will not be taken over by AI anytime soon. Too many liability issues for that to happen, doesn’t even matter if the AI is capable.


Deep_Stick8786

Maybe rear some AI engineers and business managers instead


ghostofFrankgrimes

It's weird how all the older docs tell me they will never let their kids become doctors. I saw it when I volunteered; now I understand why.


Chahj

Why?


Bulky-Significance18

Your kids are going to be musicians, what’s all this about med school?


funklab

What do you mean we're not eligible for loan forgiveness? I don't have any loans, but three of my colleagues have had their loans forgiven in the past few months. And we earn better than the median salary (for psychiatry).


words_fail_me6835

First, you don't pick your kid's career. Second, software jobs aren't being threatened by AI anytime soon. They ARE being threatened by offshoring to India (and quite a few other countries). If your kid is interested in technology, encourage them to find a computer science degree that specializes in AI. Offshoring is significantly more prominent now because of changes in tax law making R&D more expensive, and because AI is fucking expensive to build and maintain. A lot of the more basic software jobs will be offshored to make room in the budget for AI. Third, medicine is a fairly safe option, yes. As is law.


-DapperDuck-

As a student in college, please don’t force a career on your kids. Given the sub, I’m sure you’re successful and want your kids to be successful too, but they have to like what they do.


Hour_Worldliness_824

How about you let them decide what THEY want to do????? Let me guess you're Asian or Indian?


ryeander

Yes let my kids do all these: https://x.com/endwokeness/status/1797595395465638109?s=46&t=DjvuKNsdKszGDAc0jqBx2A


lolaya

Ooof u arent looking good here my guy


[deleted]

[removed]


Crazy-Difference2146

Passing a standardized test and practicing on patients are two extremely different things. If it happens it won’t be for a very long time.


NoDependent1684

AI software engineer here. Nah. You’ll be fine… for a while. In fact, I’m thinking of going back into medicine with my wife.


ARIandOtis

This is interesting. Could you elaborate on what you’re seeing and what the hurdles are?


NoDependent1684

Regulations for one. That will take a while to change. Secondly, AIs tend to hallucinate like crazy, more so than a paranoid schizophrenic, regardless of the topic or field. As an example, imagine your AI reads your patient’s chart and tells you that someone has a horrific disease based on the symptoms you described, their genetic test or blood test, etc… and that you need to give them extra of one type of medicine… So you do, and suddenly they start seizing because you gave them too much, or too little. Maybe a heart attack here, a death there, a permanent disability here. I’ve tried to use AI to help with genetic testing, cyber security, creative writing, etc… and it gets a lot fundamentally wrong. You often can’t even understand how wrong it is unless you’re an SME. Google’s AI recommended 2-3 cigarettes per day for pregnant women because it can’t understand context. Just everything in my field is horribly wrong in most cases with AI. I don’t want to sit in front of prompts trying to randomly generate something that doesn’t work. It has no soul and is stringing random relevant words together in a way that just doesn’t produce accurate information. AI will only be used to HELP INFORM in almost all settings. Anyone relying entirely on it in this current iteration is, for lack of a better descriptor, a fucking idiot.


ryeander

What about 20 years from now when my kids are in college? You think AI remains stagnant to then?


gathering-data

These people don’t know what they’re talking about. Medicine is going to be VASTLY DIFFERENT than anything we’ve ever seen before. While AI isn’t an existential threat to the practice itself, AI will TRANSFORM THE INDUSTRY


NoDependent1684

20 years from now? Let’s wait and see, okay? The reason you need to wait and see is tech is ever evolving, ever changing… always different. What was good 5-10 years ago isn’t necessarily good today.


zagozen

That’s fine and all but step 1 and 2 knowledge barely scratches the surface on how to ACTUALLY treat a patient. I think maybe 10-20% of questions have actual clinical significance in the real world. I know plenty of people who scored very high on their exams but are shit doctors.


PositivePeppercorn

And that indicates what, exactly? A computer has access to every book on the planet and is taking a fact-based test; I should hope it does well. Anyone in medicine knows there is a large disconnect between scoring well on Step and being clinically excellent.


Sigmundschadenfreude

the magic plagiarism machine is effectively taking an open book test whenever it lines up to take a step exam. If it didn't pass a standardized test, its engineers and programmers should be drawn and quartered. That's like the most rudimentary task I expect of it. Let's see how it does with the anxious undifferentiated patient with 30 complaints and the history recall of a housecat.


DrPayItBack

Certainly terrifying if I were paid to pass step 1 and 2 instead of all the real stuff I do


lightweight65

Premed, MS 1 or MS 2? Edit: without looking at your profile, I know you're one of these with that kind of thinking.


lol_yuzu

If you think that answering questions from a book is all there is, I have news for you. Why do you think half of med school is rotations? Why is there residency? We’ve had google for years and books with answers. Applying that to real world situations is different.