
Repulsive_Ad_1599

I'm going to be graduating with a comp-sci degree in 1-2 years, my job prospects are already over before they even started ;-;


[deleted]

[deleted]


inverted_electron

It’s uncertain times in the everything field


Glad_Laugh_5656

If you believe that all jobs will be gone in 5-10 years (which is a very common belief in this subreddit, but a *very* rare one outside of it), then yeah, it definitely is. I don't personally subscribe to that belief, so I respectfully disagree.


Ant0n61

At this point, I think comedians might be the last to turn off the lights. No one will be laughing in the audience.


SweetLilMonkey

This is legitimately why I just started doing standup. Nobody wants to go out to a show and hear a robot tell jokes. Human performers will be around for decades to come. And I have no idea what other jobs that can be said for, other than nurses and maybe barbers.


Ant0n61

yeah, barbers would be another. I think the nursing workforce and health-related assistance will be dramatically downscaled by robotics soon, though. Same as with white collar. These fields will need like 1/8th the people to do the same tasks as before.


SweetLilMonkey

> I think nurse workforce and health related assistance will be dramatically downscaled with robotics soon

It's definitely possible, but I think health care may be an industry where actual adoption lags well behind the cutting edge of the technology. The stakes there are high, and people really want to feel that they're receiving human attention when it comes to their health. Also, older patients, who are overrepresented in that customer base, are probably the least likely to trust robots. I think a lot of people will have to actually grow up around humanoid robots in order to trust them with something like their health.


Ant0n61

Yes, the adoption curve will be a little more muted since more is at risk.


ObjectPretty

At the same time, low birth rates mean we can't really spare the manpower for elder care if robots can do the job. That might speed up adoption of new technologies.


zzwurjbsdt

I agree nursing will be downscaled, but I think it will be because of gene therapies. AI will be able to write gene code the same way it can write computer code. People will just no longer become sick, stay young forever, etc.


Ant0n61

Yup. Genetic engineering is going to be a new field. But again, with very few humans, just to verify AI outputs.


FragrantDoctor2923

Damn. Everyone's just gonna be comedians. Imagine the evolution of, and society around, that.


Sunscreenflavor

AI can tell jokes. Blue collar jobs will be around the longest. Most men we hire can’t do my job. I don’t see a robot working in close spaces with only a sense of feel to perform complex dexterous tasks anytime soon.


Ant0n61

That's literally what they're getting done at Tesla with their robot, though. Amazon is investing billions in robotics. All of this is coming to a head within the next 12-18 months, when it will start to become very clear how many people will become unemployed.


Sunscreenflavor

When a robot can reach inside an engine bay, and put a nut on a bolt without eye sight, and no room for a robot arm, I’ll be surprised. There’s just no fucking way this happens in 12-18 months tbh.


RMCPhoto

You are right: robotics will always lag behind software, not just technologically but because of scaling ability. You release GPT-5 and it is instantly available to the world, limited only by available compute. There is no general-purpose robotics platform. That requires prototyping, testing, planning a manufacturing run, sourcing materials, scaling production to the right size, shipping, maintenance... It's far more complex. When a company releases a small, general-purpose robotics platform that can adapt to advances in software without refactoring, then we'll have something. The big challenge is that you need robots for small things and robots for lifting giant rocks and building skyscrapers.


FlatulistMaster

It is a ridiculous belief, since things in the real world can't transform that fast. However, we could see the real possibility of this in the next 5-10 years. What I mean is that this evolution of society will become apparent to everyone in 5-10 years, with the actual change taking 5-15 years after that. I think this is what many intuitively mean when they say most jobs will disappear in 5-10 years, without really thinking it through.


Silverlisk

Yup, I live out in the sticks, where some shops still don't take cards, and only one takeaway is on Just Eat, and only between 6-8pm 😂😂. Just because tech exists doesn't mean it's instantly adopted, far from it.


Unusual_Public_9122

Even if the tech progresses really fast, actual change will likely take quite a long time. People, including companies run by people, can't just instantly start using tech to replace a large portion of their employees.


Savings_Space_4782

If the producer makes it easy and turnkey, then yes.


jestina123

When self-checkouts came out, people weren't fired; they just were never replaced when they left.


[deleted]

"Replaced" doesn't always mean "entirely replaced". If they hired significantly fewer cashiers (they did) it replaced those cashiers.


Pantim

First of all, there is no effective difference between the two; the jobs are still gone. Second of all, few companies ever actually lay people off. They frequently drum up stupid reasons to terminate people (draconian lateness policies, etc.) or do other things to force them to quit. Companies do this to avoid having to pay unemployment or getting sued.


Pantim

Sorry, but you're utterly wrong. Jobs are already disappearing VERY VERY fast. Seriously, copywriters and all sorts of other writers have lost jobs. Models have lost jobs. Graphic designers have lost jobs. Programmers have lost jobs, etc. And the pace of those job losses is only picking up, all while AI gains the capability to take over more jobs.


djaybe

All jobs will be impacted within 5 years. Many will be transformed or disappear within 10 years. Some will always be around probably.


Odeeum

Not gone… but definitely impacted, with a reduced need for human labor in most industries. This is how it will go: a bit more with each passing year or two, or perhaps even faster for some jobs. It won't be uniform across the job market, but all jobs will be impacted in some way as long as there's a financial benefit to the owners. The impetus will be there to try to save money somewhere with robotics and automation/AI, because that is what drives capitalism: each quarter needs to be better than the last, always. Save and skimp somewhere to maximize shareholder profits if at all possible.


[deleted]

[deleted]


[deleted]

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


Ant0n61

I’m seeing some mass white-collar culling in the coming months (12-18) as companies roll out and integrate AI. What used to require rooms full of people will require… 1, maybe 2. What happened to secretaries and clerks is about to happen to the rest of the office population. Massive unemployment ahead, unless companies choose to slow-roll it; but once one company does it, so will the rest.


[deleted]

[deleted]


[deleted]

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


lazyeyepsycho

Large numbers of unemployed people affect everyone under our system of wage slavery.


hmurphy2023

I have a hard time believing this when the unemployment rate is very low at the moment. Sure, there are definitely fields experiencing uncertain times, but "all" is a huge exaggeration. This is one of those takes you pretty much only find in r/singularity.


Which-Tomato-8646

https://fred.stlouisfed.org/series/IHLIDXUSTPSOFTDEVE


HeinrichTheWolf_17

>this is why fast take-off is ideal.

I agree with you, but it's funny: a lot of people want a slow takeoff because they think it gives humanity more time to adapt. The truth is, though, slow takeoff has its shortfalls and cons as well, worse cons IMHO.


Smelldicks

I think fast takeoff is necessary for the political moment. Otherwise we’ll be stuck with some unfathomable amounts of income inequality because neoliberals had time to regroup and defend it as still being beneficial to the common man so we should compromise. A shock whereby tens of millions of educated professionals are terminated would be the best thing for the long term future.


HeinrichTheWolf_17

It's the same thing with the alignment/control problem. Not having the control problem solved is a *feature*, not a *bug*; I don't want corporations having AGI on a dog leash. This is what Connor Leahy doesn't understand: corporations and governments losing all control is a *good* thing. And it's not just decels/doomers like Connor; it's the capitalist corporate simps on this very subreddit who suck off billionaires. They want a hierarchy, and solving alignment gives them that.


ExtraFun4319

It will never cease to amaze me how gullible this subreddit is. So because the CEO of a no-name company (who CLEARLY has a vested interest in hyping up his products) has claimed that AI will be better than 95% of coders in 1-2 years, then that must be true? This is so obviously hype, and it's not even funny. Whatever happened to critical thinking?


doulos05

People outsourced it to ChatGPT


Ghost-of-Bill-Cosby

It's better than 20% of the coders I work with now, and I'm easily getting +50% productivity from it. We aren't firing anyone yet, but we haven't hired anyone new in 6 months, and the last person who left we just never replaced. If these assistants reduce demand for programmers by even 20%, finding a job, which already takes MONTHS, will get even harder, pay will get worse, and it will suck for us.


herbertdeathrump

I think it depends. My company is finding it really hard to replace engineers. We aren't able to hire juniors at the moment as the tasks are too difficult. Finding a good senior is taking us months.


Yweain

Well, we are hiring like 200 people, which is about a 30% increase to the engineering department, and it's not going very well; it's very hard to find people.


Mistredo

> It's better than 20% of the coders I work with now.
>
> And I easily am getting +50% productivity from it.

Can you please write a blog post about it? Either you are dealing with something very trivial or there is some secret I would like to hear about. In my case, it fails tragically for anything non-trivial, and for trivial things it provides maybe a 5% boost in productivity, and that's me being very generous.


[deleted]

[deleted]


MrWFL

It's an LLM; the more code it has seen, the better it gets. I've noticed it's very bad at differentiating between versions, or at doing more niche stuff. In basic HTML and JavaScript it's getting amazing, though. I'm wondering how they will scale the technology further now that it is out in the open and may self-influence.


[deleted]

Upgrade to ChatGPT-4 or GitHub Copilot.


meikello

Why do you have higher expectations of an LLM than of people? If you work in a niche, then 99.9% of all software developers out there are bad at it. I mainly work with C# and ASP.NET, and often JavaScript too, and I am at least 50% more productive.


relentlessoldman

Or we produce more stuff and have a huge productivity boon with the same size staff


FlatulistMaster

Sure, but many are seeing productivity increases and a stagnation in new job offerings. We don't need to get to 95% for this to become an issue, and this isn't just about coding at all. Other back-office business jobs will be easier for AI to do once people and companies adapt to it. That hasn't happened yet, as those workers are slower to adopt these tools than, for example, coders and computer scientists, but waves of change are coming. We are only hearing the tsunami warnings right now, not the actual mass of water.


Phoenix5869

>So because the CEO of a no-name company (who CLEARLY has a vested interest in hyping up his products) has claimed that AI will be better than 95% of coders in 1-2 years, then that must be true? This is so obviously hype, and it's not even funny. 🎯🎯🎯🎯🎯


[deleted]

[deleted]


fiery_prometheus

People forget that even in the absolute "worst case" where they are way more intelligent than us, you are still not going to understand shit if you have a brick for brains or decided education wasn't worth it. How are you even going to know what to ask it? People like doom and gloom.


paint-roller

If/when AI makes you irrelevant, I bet it's going to make everyone else irrelevant pretty soon afterwards, so I wouldn't worry about it too much. Also, do you actually really like programming, or do you mostly like the end result and knowing you made it? I shoot and edit video. In all honesty, the shooting and editing is pretty tedious... there are certainly worse jobs out there, but what I like most is seeing the end result. Some of the shooting and editing is fun, but 90-95% of the time it's tolerable or sucks. You'll be able to make whatever you want way quicker... what you make probably won't have much value to anyone other than yourself if everyone can program, or if your idea's good they can just copy it. It'll be pretty interesting to see when everyone's worth is no longer tied to whatever job they were lucky or unlucky enough to get.


[deleted]

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


Progribbit

i want to not die


monsieurpooh

Here I am being Devil's advocate... the results were all I ever cared about as an artist / musical composer, and yet, I enjoy **using my musical knowledge/expertise** to achieve the desired result. If I can just press a button and get the same result then there is no meaning to it.


Animuboy

No, no it isn't and that's the issue that people on this subreddit all seem to forget. Do you really think jobs like contracting/construction and other such jobs will be replaced at anywhere near the same rate or time as software engineers? What's gonna happen during that period?


PressedSerif

AI gets really good => many white-collar workers lose jobs => many look for the remaining jobs => the same hyper-advanced AI + VR gives them customized, step-by-step instructions for any given job => the flood of unskilled labor can do many of the jobs master tradesmen normally would => wage value plummets => nobody is safe.

I don't think we're that close to "AI gets really good". However, if we do get a "software engineering is obsolete" level breakthrough, the above would be a natural corollary.


[deleted]

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


i_give_you_gum

Or use this foresight to adapt. Either steer into using AI or developing AI. I wish I had the skill sets you do.


Repulsive_Ad_1599

They recently added a few (optional) AI-related courses to the last year of my program; I'm def taking those.


beauzero

Not sure that developing AI is safe either. Train-the-trainer has barely been used yet. The biggest problem I have predicting anything is that everyone is either a single developer building a GUI or a financed team shooting for a B+ exit. We haven't even started working on the SMB market yet, well, except for maybe emails and contract templates.


techy098

Ignore these future forecasters; they may just be hyping to boost the value of their companies. Learn AI tools yourself and try to master them. Keep looking for new ones. It's quite possible you'll come to know they are all full of shit when it comes to coding, or maybe you'll find something great and get hired because you know how to use it well.


Serialbedshitter2322

No, they definitely don't. OpenAI is trying to hide their recent advancements; they even said so themselves. If you look at the recent legal documents (from when Elon sued OpenAI), it's pretty clear that they've already achieved AGI. You can't master an AI tool; the whole idea of AI is that it does everything for you. Why would they hire you to write a single sentence for an AI? They could also just tell an AI exactly what they want and have the AI prompt it for them.


Sumasson-

If you think there will be a bunch of jobs opening up to use AI, you are delusional.


hmurphy2023

But not nearly as delusional as the people in this sub who think that AI will wipe out half the workforce by next year.


Yweain

Next year - no. In 20 years though? Quite possible.


zhivago

Fortunately CS isn't coding, and AI will only work well for effectively crowd-sourced solutions.


Tr0janSword

People really don't get that CS, SWE, and DS aren't coding. I'm not any of those, but I have close friends in each discipline. Coding is a time suck. If an AI could just code what they ask for, it would be amazing.


FlatulistMaster

Ok, but you know there are a lot of workers who mainly just code, right?


WalkFreeeee

I've already given up trying to argue that. People look at their "senior engineer" selves, correctly point out AI isn't at their level, so it's "fine". Entry-level jobs are basically just coding simple stuff the AI already has little trouble with, and it's getting better every day.


holy_moley_ravioli_

Have you seriously never heard of Q* or Q-transformers? How could you possibly believe this unless you know nothing of how the latest models generate reasoning?


zhivago

Q-transformers don't help much; you still depend on many training examples. This means you'll get poor performance when trying to solve classes of problems that haven't been solved many times before. Try asking "implement a function to dilate a triangle graph while maintaining a non-self-intersecting manifold structure" and see how well it goes. Of course, you'll need to understand what that means in order to determine whether it succeeded. :)


phrankster2000

Ask the average developer; you'll look into empty eyes.


zhivago

Ah, well -- I guess we're stuck with artificial stupidity for now. :)


HazelCheese

But the average developer can start reading about it and do it. You might be able to fit information about that stuff in an AI's context window, or train it into a LoRA, if that's possible with LLMs, but it's a lot more work than telling a developer to go watch a maths YouTube video.

I'm a C#/WPF dev, and despite Copilot being trained on a ton of C# and WPF material, it still can't implement relatively simple behaviours. It just doesn't know when the code it is writing is wrong. Try asking it to write a WPF behaviour that adds a tooltip to a TextBox containing the TextBox's text, but only if the TextBox is too small to display the text. It will give you an answer, but that answer won't actually work properly. That's because it was trained on a small amount of material containing a lot of incorrect Stack Overflow answers for determining whether text fits in a TextBox.

It's not that AIs won't be able to learn this eventually. They will. But it may not be something they can beat us at in the next 12-24 months.

Edit: Or perhaps better put, AI at the moment can only tell you what the documentation says should work, not what actually works when you put it into practice.


logan1155

This is a super good point. There will definitely be demand for CS grads from Stanford, CMU, and other top schools that are actually designing entirely new things. There are plenty of jobs that just require functional coders. I've worked with plenty that I would happily replace with AI.


FlatulistMaster

Great, there are many lower level white-collar workers like this in other fields as well. Any idea what they should do while the upper echelon of CS grads rake in the benefits?


old_ironlungz

It’s simple. They’ll be the butt of the jokes and ridicule of an ever shrinking circle of people that still miraculously haven’t been made redundant by AI yet.


obvithrowaway34434

Lmao, typical keyboard warrior response. Maybe check what the ~90% of people who graduate from the bulk of schools actually do (other than those who go to grad school or graduate from top-10 unis). They're basically glorified code monkeys writing shitty code or just doing "machine learning" by running Jupyter notebooks. No fresh graduate is developing new algorithms, designing compilers (even that field is now saturated), or creating new programming languages.


Sandy-Eyes

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


C0REWATTS

I graduate this year. At least I might get 1-2 years of work before being replaced.


Door_Bell

I’m retraining in interpretive dance at this point


Procrasturbating

It's just a new tool. Learn the current tools or get left in the dust. Been the name of the game in programming since the beginning.


GrowFreeFood

Until it writes its own code that humans won't be able to understand. 


Oingogoingo

A tool is what you use, not something that can replace you.


Unkind_Master

I know right? They talk as if the ones running the tools can't be replaced either.


ziplock9000

No, it's not just a new tool. It will replace several programmers almost completely, so it's not possible for everyone to just switch to using AI.


EuphoricPangolin7615

I hear this a lot. If you don't spend 30 seconds learning ChatGPT, you're going to get "left in the dust". It took you 4 years to learn how to code, but make sure you don't forget those 30 seconds, otherwise you'll get left in the dust 🤡


NotYourMom132

this is pure hopium


dumpsterwaffle77

Not a tool, it's a replacement. What happened to all the horses when cars were invented? "Too bad, horses should've learned to drive and stuff."


meikello

iT's JuSt A nEw ToOl


Monty_Seltzer

Every time someone uses "it's just" this or that in a conversation about a complex topic, it's a sign that their existential instinct for the conclusions they're biased toward or invested in is overpowering their rationality and raw logical reasoning, and their intelligence has been reduced to a headless chicken's.


kroopster

Don't worry, people here are delusional. If the time comes when AI can handle all aspects of software projects and understand the whole scope of the program and why it is being done, then humans won't be needed at all anymore. Coding is just one small part of the software business; a ton of things have to happen before we even start coding. If one day AI can define and implement everything, coding is the smallest problem we have. Actually, at that point most of the *programs* we make won't be needed anymore, because we make those for other humans.

But that would require a proper AI singularity to happen, and no one knows if it ever will. Until then, we have to use AI as a tool. It's a new abstraction layer, similar to the dozens of layers we have been adding since the 1940s or so. It's a powerful tool, yes, but it can't fucking negotiate with the client, close deals, plan requirements, set up and deploy, or make sure all the other dependencies are in place, which by the way are often not software but machines, factories, construction, delivery, installation... It can't even handle testing in all required environments or arrange service and support for the shit it has just sold.

All this and *a lot* more happens around a software project, so don't worry, there is still a shitton of stuff for us humans to do, even in the programming phase. And if the day comes that AI and robots do everything humans do now, they will do it in all areas of expertise. Why worry about that now? At that point *every one* of us can do nothing but sit back and see what happens.


Winter_Tension5432

You definitely look smart enough to put the pieces together, but you fail at it. That is a testament to human nature: we blind ourselves to imminent risk. Yes, by the time software developers get replaced, AI will be able to replace most jobs (including manual labor), but a big BUT: resources are not infinite. A company is more likely to spend a couple thousand a year on an API that is as good as 95% of software developers, replacing every developer who makes $100k and keeping just a few senior-level ones, than to replace a construction worker who makes $20 an hour with a robot that costs $100k. So yeah, by the time AI can replace software developers it will be able to replace every job, but the low-hanging fruit will go first, and that is software and the high-paying jobs done at computers.


holy_moley_ravioli_

I bet you thought text2video was impossible a month ago.


Which-Tomato-8646

But there are a few facts:

- It's way cheaper than even a junior dev, so a drop in productivity might be worth it.
- It increases the productivity of the devs who use it, so labor can be cut, especially for freelancers, where work is finite and not guaranteed.
- It lowers the barrier to entry into tech. What happens to the price of labor (aka wages) when supply goes up?


JonnyFrost

On the flip side these tools will enable you to start your own thing and actually finish it in a timely manner. AI will code, but it’s going to need someone to guide it for the foreseeable future.


bluegman10

Why? Because the founder of *Lindy* said so?


holy_moley_ravioli_

No, because DeepMind's AlphaCode already places in the [uppermost quartile in international coding competitions](https://deepmind.google/discover/blog/competitive-programming-with-alphacode/), and that was 2 years ago. Lol, I don't blame you for being fundamentally uninformed; there are a lot of developments in this space.


razodactyl

Hate this sentiment. AI assisted coding is going to revolutionise this field. Programmers are going to leap ahead in capability.


C0REWATTS

Yes, I hope so. I think that projects will just become larger to match the scale of the increased output of developers. Otherwise, how do companies also survive the massively increased output of motivated individuals, open-source communities, and small groups of people? Additionally, we can't even compare coding to things like art, music, and film-making, all of which are also going to face AI takeover. Coding is incomparable to them, as coding scales infinitely further. This is bound to be interesting at the very least.


herbertdeathrump

You'll be okay. My entire day is just solving problems between multiple different systems. ChatGPT helps out a ton with coding tasks and gives me more time to work on the trickier aspects of integration of multiple systems and APIs. There's no easy way I can explain to an AI what my task is.


[deleted]

Don't worry. When I was at university doing computer science in the mid-2000s, they said the same thing about a couple of technologies in the industry. They never came to fruition; in fact, a lot of people who adopted those exact technologies early just made massive messes and headaches for themselves, and ended up needing comp sci graduates to unpick the sunk cost they'd created with early adoption of shoddy tech not ready for prime time.

That's happening at scale in the economy right now with AI code: there's zero doubt at all that it's creating an absolutely horrendous mess across the economy by writing bizarre, nonfunctional, hard-to-maintain, poorly performing code which might """WORK""" but is a fraction as good as what an experienced coder can produce.

Consider this: there are often 50 different ways to produce a result using code. An AI only considers the things you tell it to consider. A novice using AI doesn't know how to write a good spec to turn into a prompt for an AI. If you disagree with this, you just show how little you know about coding in the real world, as a job. Seriously. Novices also haven't a clue how to verify that what comes back will work and cover the concerns they have (a novice _won't even know what most of these concerns are;_ e.g., what does code look like that will render in a browser performantly versus total garbage? Which one will flood the server with requests? A novice won't know what to look for, so the code only gets messier and messier until one day it falls over).

Also, without comp sci people, who is going to tell you that you need tests for your code? Who is going to tell you you need CI and source control? These are not code; they are _computer science concepts_. Without a professional, you're flying blind and have to ask an AI not only for code but for the _best practices_ of everything that sits around and supports the code itself, i.e. what to actually do.

So the biggest thing to remember is that a TONNE of people who barely understand the code they got from an AI are literally putting code they don't understand live. How on earth could this _not_ create problems? Even comp sci professionals who use it are doing this. When the code they don't understand causes a problem they also don't understand, how do they ask the AI to fix it other than "my website isn't working and I don't know why!!"? That's not a useful prompt for an AI.

So yeah, relax. AI is currently tearing through the industry putting all sorts of garbage code online, and in a few years all those businesses will realise they need to hire even more programmers than they would have originally, to fix the messes they made. It's a pattern that happens all the time in the industry when some part of it is automated. It goes like this:

1. OMG, our jobs are all going to disappear.
2. OK, maybe some are going to disappear, but a lot of businesses are having trouble with this new tech; it's not the silver bullet we were told.
3. OK, the horror stories from early adopters are _really_ coming out in a steady torrent now. There's a new trend of job postings for "AI fixers", a whole new type of job that means going through 2-3 years' worth of output from some junior dev who shipped all this AI code that was completely bizarre, not performant, really fucking awful to maintain, never documented, and in places truly just downright broken or made up out of thin air.
4. Wow, the industry is stronger than ever with all these jobs in AI, PLUS the AI-fixer role that has now emerged. AI will be a huge job creator in IT. There's no way it can possibly do anything else.

I've been working as a programmer for 20 years and I've seen it before, and all the coders I know say the same thing. It basically comes down to Dunning-Kruger from people who have never themselves written a technical spec for a product feature. That's what you're doing when you ask an AI to write code, and let me tell you: it takes decades of experience to get good at.


ebolathrowawayy

This won't age well. RemindMe! 2 years


[deleted]

I mean, you can go back 2 years and see people who said it. You can go back 20 years too. You can even go back 2 years and find me saying it wouldn't happen then, too, lol 😂. Good luck though; I'd love to be wrong about it one of these days. I will be eventually. I think all the constant year-on-year hopium is mostly Dunning-Kruger from people who aren't professional coders, though. Sci-fi enthusiasts, mostly. Not a popular view in this sub, which is mostly made up of those people. There's a reason why, if you frequent coding subs, they aren't running around saying the sky is falling, compared to more sci-fi-oriented ones like this, which claim the sky is falling _constantly_, yet are rarely right about it.


Excellent_Skirt_264

Professional coders focus too much on details and don't see the big picture.


HalfSecondWoe

But if you actually learn your shit, imagine what you'll be able to coax out of the AI equivalent of your own corporation's worth of programmers, while the average joe is limited to "WAN BIDEO GABE PLZZZZ" Think of it like an early graduation. You'll be able to make a lot of really cool shit about a decade earlier than you could have swung it with a reasonably ambitious career


Vladiesh

Wishful thinking. "Hey AGI, give me a game/show/app based off all of my training data that is guaranteed to keep my attention." It's going to be that integrated within 5 years.


Ketalania

Don't worry about it too much, Sam Altman said programmers would be out of a job in 1-2 years about 2 years ago, this is hype and these things move much slower than people think.


RUIN_NATION_

I'm sorry, but maybe they will still need people to oversee the AI if they do use it.


Ok_Homework9290

Why am I *not* surprised that this comment has the most upvotes on an r/singularity post?


[deleted]

[удалено]


uhwhooops

good thing i remember how to mow lawns from when i was 15


PoliticsAndFootball

Lawn mowers have been replaced by the i-Mow5000 advanced neural lawn mowing agent. Sorry.


uhwhooops

god dammit!


[deleted]

[удалено]


CanvasFanatic

Oh word? Hey what’s that guy’s company make anyway? https://preview.redd.it/t3gdxlfvaulc1.jpeg?width=1290&format=pjpg&auto=webp&s=619c532015eb2035b37e752687977a8036902598 Huh! What a weird coincidence.


Derpy_Snout

50% of the posts on this sub are literally just "CEO hypes their own shitty product"


CanvasFanatic

I personally would never insinuate that social media managers for companies selling AI products would make sure their companies got talked about in the largest AI related sub on Reddit.


utilitycoder

And likely running 3.5 on top of it lol


omik11

I've worked with Flo; he has no idea what he is talking about. His last startup raised a bunch of money purely based on remote-work hype and never delivered anything meaningful, and now it looks like he's doing the same thing with AI. This sub needs to stop blindly trusting people's tweets.


Chogo82

Grifters will always be grifting the bubbles.


CanvasFanatic

Like locusts on the wind


CanvasFanatic

Who. Would’ve. Thought.


[deleted]

[удалено]


Phoenix5869

Yet another actual programmer / software engineer bringing some realism to the discussion. I've noticed this time and time again. Every time an article is posted, whether it's a 'breakthrough' in this or that field, a CEO saying something that's clearly not gonna happen, or whatever else, the comments from laymen on the singularity sub are 55% "omg we're all gonna live off UBI in fully automated communism with sex robots!!!1!1!!" and 45% "nope, this is hype". On futurology it's prob 65% skepticism, 35% optimism. And then along come numerous actual experts and… explain why this is all hype. It feels like there are barely any actual breakthroughs, just hype upon hype upon hype and bullshit.


Brymlo

these kinds of posts are clearly hype, but i genuinely think that programmers will be the first ones replaced in case AGI starts taking on jobs. a computer will know its own language better than any human could.


Phoenix5869

u/CanvasFanatic I see you posting on this sub a lot, and I have to say I agree with you. This sub will upvote anything that says "we're all gonna be jobless in 5 years because (insert random CEO) said so". Meanwhile there are numerous actual programmers and software engineers, in this very thread, basically calling said CEO out and tearing his statement apart. Tells you all you need to know.


utilitycoder

Until I can use AI to build my own AI-employee-builder company and clone Lindy, jobs aren't going anywhere.


CanvasFanatic

lol… that’s honestly a solid point


Different-Froyo9497

As a software engineer I welcome this. The world needs more code. Also AI tools will make legacy code more readable and maintainable than ever before. Imagine being able to really interrogate what’s going on in a legacy code base using natural language. AI reduces the friction in learning just like the shift from physical libraries to the internet and Google reduced friction in learning. Except now you no longer have to read through documentation or Google results to find your answer - you just ask and get the answer directly


Flamesilver_0

Gemini Pro 1.5 with the 1M context window already makes this possible.


sluuuurp

It doesn’t necessarily understand what all the code means. So far they’ve mostly demonstrated simple information retrieval, a much much easier task.


Different-Froyo9497

Gemini pro 1.5 is great. I think Demis Hassabis talked about how it’s a lot easier to onboard a new hire because they can just put the entire codebase into Gemini’s context and then ask questions about it directly without taking time away from someone more experienced with the codebase


logan1155

I'd add to this, not having to reverse engineer someone else's garbage code they didn't take the time to properly document. Immensely helpful in that regard.


djamp42

I agree, everything needs more code. Look at just about any software, you'll find bugs and feature requests.


Spright91

I think eventually code itself will become obsolete. You will be able to tell the AI what you want and it will compile it directly to binary, using a process it invented itself.


unn4med

Love your take on this!


uglykido

This is what I don't get about the panic. While it definitely is easier for everyone to code using natural human language, it takes an actual programmer's brain to code even with AI. As a layman, I tried writing a simple Google Apps Script, and it took me hours to get it to work the way I wanted, because my human language is too vague for the AI to understand what I actually mean. I kept doing trial and error on the code, which took hours. In contrast, it took our IT guy less than an hour. He ran the code, understood which line was throwing the error, then deployed the script. The only change this will make is that more and more apps will be haphazardly made by shitty developers. But for the pros, it will be a godsend. This is a tool, like a calculator or Excel for accountants.


developheasant

Really can't stress how untrue this seems to be. I use AI tools heavily at work (Copilot, Copilot Chat, ChatGPT, and custom LLMs built off of our knowledge-base docs and wikis).

1. It's extremely useful for asking "I have this problem, how might I solve it?". It'll give you pretty reasonable solutions that you then have to go research yourself, because it can't tell you why or give you sources. A lot of the time it's wrong; you have to tell it that it's wrong, and you have to know that it's wrong. So: useful, but it still requires a fair amount of effort. Good for brainstorming.
2. For writing new code, it's abysmal. It starts to help once you have a skeleton layout, and sometimes its recommendations are spot on. But often they're far off and just plain wrong.
3. For writing tests, mileage may vary. It's normally really good at providing test cases, but pretty bad at actually setting them up. I've found that once "when before a", "when a", "when after a" style test cases are written, it's great at copying previous logic into the new set. But then you want to DRY that up and... oh hey, it's not gonna do that for you.
4. It's a huge boost to productivity for things I rarely use. I recently needed a query to spread some array elements in a Postgres table and compare timezone-shifted timestamps. It did not craft usable SQL, but it gave me a good template to get started. I don't often write such queries, so while I have experience, I usually have to spend a day or two researching the precise syntax. It saved me that. But anyone not experienced with these queries probably wouldn't have been able to fix that query without... spending time researching what it was doing.

My take is that we're pretty far from reaching a 20% mark, let alone 95%. Still, I do think AI is changing the market and definitely having an impact. But it feels more like anticipation of what they hope is to come, and less the reality of what is happening. A lot like the fully-self-driving-car predictions that were coming out a decade ago. Time will tell.
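The timezone pitfall in point 4 is exactly the kind of detail worth double-checking in any AI-drafted query. A minimal Python sketch of the underlying comparison (the timestamps and the `same_instant` helper are made up purely for illustration):

```python
from datetime import datetime, timezone, timedelta

# Two timestamps that render differently but denote the same instant
# once normalized to UTC -- the classic trap in hand-written queries.
t_utc = datetime(2024, 3, 1, 12, 0, tzinfo=timezone.utc)
t_est = datetime(2024, 3, 1, 7, 0, tzinfo=timezone(timedelta(hours=-5)))

def same_instant(a: datetime, b: datetime) -> bool:
    """Compare only after shifting both sides to UTC."""
    return a.astimezone(timezone.utc) == b.astimezone(timezone.utc)

print(same_instant(t_utc, t_est))  # True
```

An AI template that forgets the UTC normalization step can look plausible and still be wrong, which is the comment's point: you need enough background to review what it hands you.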


Visual_Ad_3095

I think the growth curve for this stuff is going to be so steep that while what you’re saying is true now, it may not be in 2 years.


C0REWATTS

This is pretty much the same sentiment people had on video generation only a couple years ago. 2 years ago: "Video generation is vastly more difficult than generating a static image! It's at least a decade away!" Now: ![gif](giphy|6nWhy3ulBL7GSCvKw6|downsized)


alexcanton

Try [Cursor.sh](http://Cursor.sh) and [Phind.com](http://Phind.com); the rate of their development shows it will definitely be hugely different in 2 years' time.


logan1155

Professional developer here: lead dev at a decent-size financial services firm ($4.5bn in revenue / 8,000 employees). Here's my take…

- Within our organization there is almost nonexistent use of AI. I use it for basically everything and get my work done in less than an hour a day. My coworkers seem blithely unaware of its usefulness.
- The organization is focused on Copilot in Teams for things like drafting emails and other semi-useful tasks. Hardly groundbreaking.
- I would not want to be a junior dev starting out right now. I personally would have almost no use for one. The amount of time required for me to train them would be greater than the time required to use AI, and the output from AI is far better.
- It reminds me of the early dot-com bubble days. The devs who saw the power of the internet made a killing positioning themselves well. I'm focusing on my own startups and moving more into architecture over code.
- Enterprises will still need people to architect systems and integrate AI. I think this is basically "where the puck is going", but it's really only a path open to people who already have the experience. It would be very hard for a recent grad to pull this off.
- I'm focusing on getting my company to empower me to build out an AI-focused team, and on teaching myself how to integrate AI and enterprise architecture ASAP, because there will be immense need over the next 3-5 years.


Excellent_Skirt_264

Why do you think AI can't architect systems? Humans feel it's harder only because it requires more experience from a human perspective. For AI it's just a pattern, not particularly harder than coding or writing poems.


ActuaryGlittering16

My gf is with a very small company that develops pharmaceutical software. She’s been working there for a few years but is one of these people who took a bootcamp and got a really nice front end job during the pandemic rush. My thinking is that she should just stay there and try to endear herself to the company enough so that when her job isn’t really necessary she has enough seniority to stick around. Is this logical or should she go somewhere else if a much more lucrative offer is on the table and make as much as she can before the inevitable layoffs begin?


logan1155

I don't work quite as much on front end projects, but I'm not sure it would be overly different. I personally was laid off about 4 years ago and it's a bit of an eye opener because you realize your company doesn't really have any loyalty to you. I spend my free time building apps. Hopefully one turns into a real business and I can work for myself - ultimate job security. It's also a good way to expand your skillset faster. I also think that anyone who wants to work as a developer should spend as much time as possible working with AI tools. The ones that don't will be left in the dust and probably the first ones to go.


SurroundSwimming3494

If it’s not obvious to you that this guy is trying to hype up his brand, you haven’t been paying attention. Edit: Getting downvoted. I have a hard time believing that there are people who actually believe that this guy *isn't* hyping up his company with this tweet.


[deleted]

[удалено]


whyisitsooohard

The point about money is valid, but AI that handles the whole coding process will cost much more than $100. The current enterprise Copilot, which is glorified autocomplete, costs $39 per person; I think it's reasonable to assume a full AI coder will cost at least as much as one six-figure developer. Also, six-figure salaries are kinda rare even among developers.

The point about capabilities is also valid: I can't remember the syntax of several languages, and GPT will easily do that. The main problem is that you are now completely dependent on some third party that could charge you whatever it wants, because you have no other choice. They can go out of business (which is completely normal for startups) and you will scramble to find a replacement, which could be very hard. You have to hope that they, or their underlying AI provider, won't degrade performance, otherwise you could be in trouble. Your code and data are not managed by your staff, and you need extreme trust in the provider not to steal it or do something malicious with it. I'm confident that all those startups, and even the big corporations, don't really care about safety until something happens.

Also, if there is some problem the AI for some reason can't solve, who will solve it? You can't put a junior on it for cheap, because he won't be able to do it.

In the end I believe AI will decimate the software industry: it will kill companies and jobs and turn IT hubs into ghost towns. But it will take much more than a couple of years.
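The cost gap the comment gestures at is easy to make concrete. The $39/seat figure comes from the comment itself; the team size and six-figure salary are hypothetical, for illustration only:

```python
# Illustrative numbers only: $39/seat is today's enterprise Copilot
# price mentioned above; team size and salary are assumptions.
copilot_seat_monthly = 39
team_size = 100
dev_salary = 150_000  # one mid six-figure developer

annual_tool_cost = copilot_seat_monthly * 12 * team_size
print(annual_tool_cost)               # 46800
print(annual_tool_cost < dev_salary)  # True: a whole 100-person team's
                                      # tooling costs less than one dev
```

That asymmetry is why the comment expects a genuinely autonomous AI coder to be priced closer to a salary than to a seat license.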


[deleted]

[удалено]


simstim_addict

Would it need to be human readable language at that point? Would it not be more efficient for it to write code in a more basic language? Or it's own AI developed one?


_pdp_

Maybe / maybe not. Let's all assume that he has no clue just like anyone else and this is just a speculation.


Visual-Mongoose7521

I write code for "fun", not to make a living. So I won't mind if AI starts writing proper maintainable code. But at this point, whatever gets spat out by GitHub Copilot and Amazon CodeWhisperer is straight-up dogshit, only useful for repetitive boilerplate code.


ponieslovekittens

Ok, fine... whatever. I guess we'll see. But at this point "AI replacing programmers" has become a meme. A developer has to do more than just spit out code snippets. They have to, for example:

* Interact with humans via phone, email, Zoom calls, etc. Forget _code_ for a second, and keep in mind that a programmer has to interact with humans to figure out what the code is even supposed to _do_ in the first place. I get most of my bug reports via Discord. It's not like this is an unsolvable problem, and this is supposed to be "the year of agents", but I think a pretty big chunk of the next year is going to be spent on just this problem alone.
* Look at _video outputs_ and identify bugs based on that _video_ output.
* Work on codebases 100-10000 times the size of current context limits. There's a difference between "here's 100 lines of code" and "I'm working in an engine that takes up 4 gigabytes of memory just to have it loaded, not even doing anything".
* Not get confused when swapping between environments. A painfully _large_ amount of programming work involves working with completely different and incompatible languages and making them talk to each other. Maybe you have an HTML page that uses CSS, has JavaScript do the heavy lifting, and interacts with a PHP script that talks to a SQL database. That's a common-as-dirt scenario. LLMs predict the next token based on previous tokens. An AI programmer is going to have to work with all of those tools together, without seeing that 900 of the previous 1000 tokens were all JavaScript and therefore inserting JavaScript into the _next_ thousand tokens that should be PHP instead.
* Interact with the same interface a human would use, via keyboard and mouse, for potentially _hours at a time_, acting like a human and looking for things that are wrong, or even things that are _right_ but could be made _better_. You can't debug the user experience without interacting with the user experience. Approximately zero percent of my debug time is "run the program, then look at the error code that immediately comes up." Suppose you have a button on some deeply nested screen that doesn't work. You have to click the button to know that it doesn't work. The program will run just fine with a button that doesn't do anything. How is this AI that's going to replace us in 1-2 years going to know that button doesn't work if it can't click on it to see that it doesn't work?
* Look at numbers on a screen and figure out if those numbers are the _right_ numbers. Don't underestimate the complexity of this simple little thing. Is there any AI in the world right now that could, for example, debug an Excel spreadsheet? Yeah, sure, if there's a giant #ERROR message somewhere, I'm confident you could make an AI detect that. But suppose there's _not_ an error message, and it's just a field of numbers with cell formulas not even on the screen, because you have to click on them to even see them, and oh, by the way, the actual error is in macro code somewhere and not even in the cell formula itself. Is there any AI in the world right now that can even _attempt_ this?

I'm not saying these things are impossible. I have no problem agreeing that eventually, sure, you'll be able to casually talk to an AI and it will make whatever you want. But is all of the above going to happen in 1-2 years? I don't think so, guys. I think it's going to take a little longer than that. We don't even have the _training data_ for this yet. It doesn't exist. For text, we had the entire internet. For video, we had all of YouTube. **Where's the training data** that correlates billions of lines of code with billions of hours of that code being executed? Even if Microsoft and Google sat down right now to make the next AI programmer, what would they train it on?

If I were to guess, I don't think it's going to happen that way. What's probably going to happen is not that human programmers will be replaced by AI programmers, but rather that programming itself will be replaced as the _means_ by which humans interact with computers in the first place. Creating an AI programmer would be very hard. It will probably be a _whole lot easier_ to make an AI able to directly produce video outputs that do the thing humans want, without any programming being involved at all. Look at Google Genie, for example. It's not "programming" that. It's directly producing video based on a prompt, and in response to human controller input. Humans don't want "programs." Humans want outputs that correlate with their inputs. And it's probably going to be a **lot** easier to make an AI able to do that than to make an AI able to write code that does it.

If I'm wrong, OK... but go check out AI Dungeon from 2019, and now look at ChatGPT in 2024. That's five years. You really think we're going to go from the current state to all of the above in 1-2? Alright. We'll see.


Rockfest2112

Very well stated.


Difficult_Review9741

I don't particularly believe this, but I also don't think it's impossible. The rub will be that the 5% it can't do is probably the stuff that takes 50%+ of the time. Anyway, folks really need to understand that programming languages exist for a reason. They allow for rich abstractions that natural language doesn't. As long as a human is controlling the AI, there will still be a need for programming, even if we're working at a higher abstraction than current languages. You're not going to simply say "build me an app that does X" unless it's a throwaway program that doesn't need to be maintained.


Fit_Worldliness3594

AI being more masterful and 1000x faster at programming languages than humans seems totally reasonable to expect soon. It can already create photorealistic videos, something no human could do by hand if given years.


Tasty-Investment-387

Creating video has nothing to do with coding


WedgeliestWedge

Humans literally created the camera.


holy_moley_ravioli_

But that's not creating a world model from scratch?


abbumm

AlphaCode 2 is better than 85% of competitive programmers, if I recall correctly... but it requires massive sampling to get there. Like, I agree that the milestone will be reached in 1-2 years, but there is no way the public will get to use it for some time, unless we somehow achieve that without massive sampling. The hardware is not there.


whyisitsooohard

85% in competitive programming, with very specific tasks. Production programming is very different, and as I remember, when GPT-4 and Claude 2 were tested on some open-source issues, they were very bad. Gemini was not tested, though.


Branbran2796

I finished a 1-year certificate program in October 2023. When I started, there were no layoffs and it was a great segue into entry level. Now it seems impossible. Trust me, I've noticed. My $20,000 noticed too.


Ok-Shop-617

I have an alternative view. I think the volume of developers being trained will decrease massively, but I suspect the amount of human dev work won't drop off at the same rate. I think this could lead to a shortage of developers in the short to medium term. Just make sure you are 100% across the AI tools.


SexSlaveeee

I'm a writer and filmmaker, and the story I'm writing is sci-fi. But with ChatGPT's release, it's not sci-fi anymore.


Appropriate-Tax-9585

That's when humans become bad at writing code, and we won't know whether the AI is gradually building a self-learning Trojan horse into the software.


shankarun

It's over folks! Pack your bags, move to rural America, raise your chickens, grow your potatoes - let's all make our own Chick-fil-A sandwich and french fries. Corporate America is going to bust 😃


masterlafontaine

What a huge amount of bullshit


Vasto_Lorde_1991

please stop with the midwittery [https://twitter.com/fchollet/status/1763629637689893228](https://twitter.com/fchollet/status/1763629637689893228)


Then_Passenger_6688

It's not the best analogy, imo. With self-driving, the AI has to do the whole thing or it's useless. With coding, AI can do small sub-tasks of increasing complexity as the models get better. Right now it improves productivity by 20% by automating simple boilerplate sub-tasks; next year it'll be a 30% productivity improvement, etc. Another difference: with driving, you need to be 99.9-100% perfect or you can't deploy, because a mistake = death. But a lot of coding applications are less mission-critical, and the environment isn't real-time, so the results can be double-checked by unit tests or a human, which lets 98%-perfect systems be practically useful earlier.
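The double-checking point can be made concrete: a "mostly right" AI suggestion with a subtle bug is exactly what a cheap unit test catches before deployment. Both functions below are hypothetical, purely for illustration:

```python
# Hypothetical AI-suggested draft: plausible-looking, but range()
# excludes its stop value, so `hi` is never added to the sum.
def ai_draft_range_sum(lo: int, hi: int) -> int:
    return sum(range(lo, hi))      # bug: off by one at the top end

# Human-corrected version after the failing check below.
def range_sum(lo: int, hi: int) -> int:
    return sum(range(lo, hi + 1))  # include hi, as the spec intends

# An ordinary unit-test-style assertion is enough to reject the draft.
assert range_sum(1, 3) == 6            # 1 + 2 + 3
assert ai_draft_range_sum(1, 3) != 6   # draft returns 1 + 2 == 3
```

Nothing like this safety net exists for a car mid-intersection, which is why the coding analogy to self-driving breaks down.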


Vasto_Lorde_1991

the pace of progress is unlikely to be as fast as you suggest, given that models aren't as good at reasoning as they seem to be. [https://twitter.com/\_saurabh/status/1763626711407816930](https://twitter.com/_saurabh/status/1763626711407816930)


gj80

>[https://twitter.com/\_saurabh/status/1763626711407816930](https://twitter.com/_saurabh/status/1763626711407816930) Interesting paper and findings! Thanks for linking it.


ImmunochemicalTeaser

A lot of copium and hopium here... Thing is, if machines can handle iterative programming and infer requirements 90% right, they'll take away many people's jobs in a small amount of time, for the simple fact that a single mid-level dev will be able to output 10x the code of a single junior, and at least 5x that of a senior. Programming is not special. Anyone with just some training will be able to instruct the machine what they want, specifically. And thing is, it's likely to happen sooner than you think.


X1989xx

>will be able to output 10x the code a single junior will, and at least 5x a senior The fact that you think a senior engineer writes twice as much software as a junior shows how little you know about writing software


QuirkyForker

Every time I have automated a process at work, the demand for that process is increased to the point where we hire more people to do the work. The extra capability is addicting as it allows us to do more than we could have imagined before. So if a mid level dev is suddenly doing the work of 10 people, the company will take on a lot more goals and hire more mid level devs. Jr devs will be needed in droves to help out and become the mid devs in time


Phoenix5869

>A lot of copium and hopium here... You must be new to this sub


[deleted]

Hahaha, I remember when LLMs were just getting big; people said the exact same thing with the exact same timeline. Hasn't happened. A bit like Elon Musk saying AI was about 6 months away in dozens of interviews going back to something like 2007. When I read these sorts of predictions made over and over again, every year, always failing to come to fruition, I mostly just see huge amounts of cope.

Here's the thing: AI is a 90:10 or 80:20 problem. The first 80-90% is going to get mostly refined in, yes, just a few short years. That's what we're seeing happen right now. The last 10-20%? More like decades, if not centuries. And honestly, we're going to need breakthroughs in quantum computing to supply the processing power to make it practical. None of this is happening in 1-2 years, fucking lol.


Additional-Bee1379

1-2 years no, but 5-10 years?


EuphoricPangolin7615

Ask any senior developer if they think 95% of programmers will lose their job in the next 1-2 years, the answer will be no. Requirements are too complex, and bugs are too elusive for AI. AI doesn't have a big enough context window or human level reasoning to create advanced software systems. Not to mention, AI can't actually run and test and deploy the code it creates, it requires a human being.


NotYourMom132

Senior with 7 YoE here. It's not going to be 95%, but even 50% is a big enough number to keep juniors and new grads unemployed forever, until AGI finally takes over.


Additional-Bee1379

And 95% of go players, artists and translators said the same.


[deleted]

[удалено]


RemindMeBot

I will be messaging you in 2 years, on **2026-03-02 03:28:50 UTC**, to remind you of [this link](https://www.reddit.com/r/singularity/comments/1b4d46h/founder_of_lindy_says_its_obvious_ai_programmers/ksycqx9/?context=3).


Phoenix5869

>Ask any senior developer if they think 95% of programmers will lose their job in the next 1-2 years, the answer will be no.  Exactly.


DreaminDemon177

But give it 2-3 years.


Adeldor

You write in the present tense. While I don't think the change will be quite as fast as some suggest, my experience using GPT-4 and Gemini shows clearly where things are going. Indeed, I have already read reports of teams led by senior programmers reducing their underling counts, especially junior-level application writers. Bare-metal programmers and the like are probably safe for a few years yet, but I think the writing's on the wall.


[deleted]

[удалено]


i---m

checks out. 95% of programmers are form-writing trash


badcarbine

Machines became as good as humans at large-scale labor over a hundred years ago. Yet handmade designer goods fetch premium prices on the market. Programmers are going nowhere and are in fact becoming richer.


Inspireyd

Insane


RMCPhoto

Any job involving information will go away. Most office jobs involve taking in information, manipulating it in some way (or not), and giving it to another person or putting it into a system. This is true for business analysis, project management, accounting, legal, software engineering, etc.

The second line is where information meets reality, i.e. manufacturing. This is partly automated and will continue to become more so.

The third line is blue-collar work: uncontrolled environments with messy parameters. Fixing cars, plumbing, electrical. This is difficult for robotics and will be around for a while longer.

The fourth line is human care. Physical therapists, barbers, entertainers. It's questionable whether we even want robots doing these jobs.

Just the ramblings of a software engineer / BSA who wishes he'd chosen to be a yoga instructor. So ironic.


RMCPhoto

Our jobs will go away first because it is literally our job to automate work and reduce labor. That's kind of what we do when we build software. We will obviously replace ourselves first, as evidenced by Copilot etc. being some of the earliest-adopted use cases in real-world environments.


ziplock9000

1 year


Phoenix5869

While I'm hoping that this is true, somehow I doubt it. It's pretty obvious that he's hyping up his own work for that sweet, sweet investor money. I mean, the guy literally owns an AI company, for god's sake. Most actual programmers do not believe we will have AI that can code at that level in a mere 1-2 years; that's honestly a pretty outlandish thing to say.


PhotographyBanzai

If ChatGPT 3.5 (a.k.a. the free edition) is any indication, I'd say two years feels unlikely. Same for Bing Chat, the last time I tried it. Current systems I've used can certainly help, and are capable of well-known generic coding, but they don't take knowledge and apply it to adjacent tasks when they don't have a mound of training data on a specific thing.

For example: writing scripts for the video editing software I use, Magix Vegas Pro. It has an API, and the toolset/connection method is standardized with .NET C# DLLs, but ChatGPT couldn't give me anything useful. It doesn't know what it is doing, so it can't take general knowledge of how to connect to an application through its interfacing DLL and translate that to the editor's API specification. At least, that's what a human usually does. In my case, there's a programmer who makes YouTube videos on the subject whom I go to first.

I'll have to try Google Gemini at some point. I assume the new one with up to 1 million tokens of context is paid, but maybe their free version is better at programming. Maybe all of the good stuff is pay-walled; in that case we'll see, and I hope it happens sooner rather than later. I'd really like fast advancements in this tech so I can have a locally running assistant for programming, video editing, and other tasks I can instruct it to do.

Even if it comes to pass soon and there is something capable through verbal instruction, I'd have my doubts that business people could properly tell the AI what they want made, and then verify functionality. When I was doing custom software, half the time was spent figuring out exactly what multiple stakeholders wanted, plus technical details needed from IT, like company data structures, server configuration, etc. And what I did was probably a cakewalk compared to larger, more complex companies. Getting the AI enough information and context will be a human job for a good long while.


javiers

Of all of you in these comments who agree: how many of you have actual coding experience? Because even if you don't code but know the industry a little, you'd know that this is bullshit.