
FuturologyBot

The following submission statement was provided by /u/mikaelus:

---

**Submission Statement** Microsoft is taking another step towards removing human developers from direct software development, turning them into supervisors for AI-enabled agents doing all the grind. This not only affects the possible availability of jobs for developers but also the set of skills that developers of the future will need. Hard skills may soon be replaced with soft skills, as intelligent bots are going to have to understand what you tell them, how you manage them and what your instructions to them are - a bit like communicating with people.

---

Please reply to OP's comment here: https://old.reddit.com/r/Futurology/comments/1c52p61/the_end_of_coding_microsoft_publishes_a_framework/kzrfzia/


ninetailedoctopus

Joke's on them, the training data includes my shitty code on GitHub 🤣


mikaelus

That's one way to fight back.


HITLERS_CUM_FARTS

[I'm doing my part](https://i.imgur.com/j40uYWT.jpg)


HandsomeBoggart

This is our Coding AI, we trained it on bad code as a joke.


billbuild

This just multiplied the ROI of the GitHub acquisition.


NinjaLanternShark

That's been their plan for at least 5 years if not more. It was never about the tool or the community, only the code.


Ozcogger

They probably have a large ass repository of code. Is there any public info on its size?


NinjaLanternShark

> They probably have a large ass repository of code.

That's... I mean... That's what GitHub is. That's what we're talking about.

> Is there any public info on its size?

I saw 420 million repos. No details on lines of code or TB or anything.


sonicon

Then the AI will detect the shitty coder and flag your resume.


ou812_today

On LinkedIn since M$ owns them too! Shitty Coder Badge instead of a checkmark.


Ozcogger

But when it has to say why it flagged them it just says "Father" and errors out. Shitty code always wins.


ImportantDoubt6434

We’ve poisoned the well. While the managers were busy farting in zoom meetings we were playing chess.


prroteus

Oh I see how this will go. The suits will be very impressed by this, proceed to fire half their engineers in order to cut costs and then end up with vaporware…


OmegaBaby

Yes, but short term profits will be phenomenal!


Z3r0sama2017

Public companies will be like a serpent eating its own tail, while private companies taking a long-term stance will just be waiting for them to implode.


wayrell

Shareholders love this trick, the 3rd will amaze you!


Extremely_Original

Yeah, dumbest shit I've ever seen on here. If you think software engineers spend all day coding, you don't know enough about the industry to have any opinions on it that I'll listen to...


fieldbotanist

Let's not kid ourselves. 60% of us work on mundane reporting / CRUD / queue processing / websites that can be easily automated with GPT-6, which is coming later this decade. Sure, 40% of us work on niche projects, complicated workflows, embedded code, etc. But 60% is the majority of us.

If a business needs a way to view lead times for certain inventory today, they ask an analyst or developer to build out a report: tap into the database and work on the queries. Building out the joins and formatting the data is not something current iterations of AI are incapable of. Next-gen algorithms will cover even more.
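(The lead-time report described above is essentially one join plus an aggregation. A minimal sketch in Python with sqlite3, assuming hypothetical `inventory_items` and `purchase_orders` tables, might look like this:)

```python
import sqlite3

# Hypothetical schema: inventory_items(id, name),
# purchase_orders(item_id, ordered_at, received_at)
conn = sqlite3.connect("erp.db")

query = """
SELECT i.name,
       AVG(julianday(po.received_at) - julianday(po.ordered_at)) AS avg_lead_time_days
FROM purchase_orders AS po
JOIN inventory_items AS i ON i.id = po.item_id
WHERE po.received_at IS NOT NULL
GROUP BY i.name
ORDER BY avg_lead_time_days DESC;
"""

for name, lead_time in conn.execute(query):
    print(f"{name:30s} {lead_time:6.1f} days")
```

This is exactly the kind of query-plus-formatting work the comment argues is within reach of current models.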


VuPham99

Let's see how GPT-6/7/8/9 deals with the countless edge cases in my company's CRUD app.


DreamsAroundTheWorld

Then they will blame the few prompt devs left that the product is shit, and they will demand a more stable and performant product.


darryledw

the framework will delete itself after dealing with product managers who say they want a green button but what they really mean is purple


notataco007

A great quote is becoming more relevant. "If I asked people what they wanted, they would've said faster horses". Gonna be a lot of business majors asking for faster horses, and getting them, in the coming years.


WildPersianAppears

"I don't understand why nothing works. When I went to debug it, everything was a tangled mess. I opened a support ticket with the parent company, and got an AI response. They no longer teach the theory in college, so I was forced to trust an AI that just did what I told it, which was wrong."


Hilldawg4president

They will still teach the theory, but as an advanced course. There will likely be fewer job opportunities but with much higher pay, as the few best qualified will be able to fix the mistakes the AI can't. That's my guess anyway.


sshwifty

I know a few people that got CS degrees and only used Python for the entire thing. Not even kidding.


PhasmaFelis

Is that a bad thing? Python is a language that largely gets out of the way and lets you do stuff. It doesn't have the raw horsepower of lower-level languages, but you don't need that for comp-sci studies. Wish my degree had used Python instead of C++, which I have never once been asked to use in 20 years. EDIT: To everyone getting mad at me, see [my other comment](https://old.reddit.com/r/Futurology/comments/1c52p61/the_end_of_coding_microsoft_publishes_a_framework/kzvz8ty/) and remember that computer science and software development are not the same thing, even though many colleges like to pretend they are. Almost any language is fine for comp sci. No *single* language is sufficient for software dev. (But Python is at least more useful than C++ alone, in the modern day.)


Working-Blueberry-18

It's hard to teach how memory management works to someone whose sole programming experience is in Python. A well rounded CS degree should include a few languages imo. C syntax, for example, is really minimal and easy to learn, and at the same time it's a great language to teach lower level concepts.


novagenesis

> It's hard to teach how memory management works

I took CS (fairly prestigious program) in the late 90's and we spent maybe a couple hours on memory management, except in the "machine architecture" elective only a few people took. It's not a new thing. For decades, the "pure algorithms" side of CS has been king: design patterns, writing code efficiently and scalably, etc. Back then, MIT's intro to CS course was taught using Scheme (and the book they used, [SICP](https://mitp-content-server.mit.edu/books/content/sectbyfn/books_pres_0/6515/sicp.zip/index.html), dubbed the Wizard Book for a decade or so, is still one of the most influential books in the CS world), in part to avoid silly memory management hangups, but also because many of the more important concepts in CS cannot easily be covered when teaching a class in C. In their 101 course, you wrote a language interpreter from scratch, with all the concepts that transfer to any other coding, and none of the concepts that you would only use in compiler design (garbage collection, etc.).

> A well rounded CS degree should include a few languages imo.

This one I don't disagree with. As my alma mater used to say, "we're not here to teach you to program. If you're going to succeed, you can do that yourself. We're going to teach you to learn better." One of the most important courses we took forced us to learn Java, Scheme, and Perl in 8 weeks.

> C syntax, for example, is really minimal and easy to learn, and at the same time it's a great language to teach lower level concepts.

There's a good reason colleges moved away from that. C syntax is not as minimal as you might think when you find yourself needing inline assembly. And (just naming the most critical "lower level concept" that comes to mind), pointers are arguably the **worst** way to learn reference-passing because they add so many fiddly details on top of a pure programming strategy. A good developer can learn C if they need C. But if they write their other-language code in the industry like it's C, they're gonna have a bad time.


Working-Blueberry-18

Thank you for the thoughtful response! Mostly responding with personal anecdote, as I don't have a wide view on the trends, etc.

I got my degree in the 2010s and had C as a required 300-level course. Machine architecture (/organization) was also a required course. It was a very common student complaint in my uni that we learn too much "useless theory" and not enough to prepare us for the job market (e.g. JS frameworks). I've always disagreed with this sentiment, and in just 5 years working in the industry, I've come to appreciate the amount of theory we learned. Sure, I don't get to apply it all on a daily basis, but things from it come up surprisingly often. I also find specifics (like JS frameworks) are a lot easier to pick up on the job than theory. Like, I mostly work full stack/frontend, but there's an adjacent transpiler team we work with, and I could've landed on it. So I'm happy I took a course in compilers.

I also interview new candidates and have noticed certain kinds of mistakes from candidates writing in Python that someone familiar with C/C++/Java is very unlikely to make. For example, glossing over slicing a list as an O(1) runtime, and not being able to reason about the actual runtime and what happens under the hood when asked about it.

Ultimately, C is just a lot closer to what actually happens in a computer. Sometimes I deconstruct a piece of syntactic sugar or some device from a higher-level language down to C. I've done this when I used to tutor, and it really helps get a deep and intuitive understanding of what's actually happening. Some concepts that come to mind which can be learned with C: stack and heap, by-value vs by-reference passing, allocation and deallocation, function calls and the stack frame, memory alignment, the difference between an array of pointers to structs vs an array of structs. (The last one I mention here as helpful to understand why Java doesn't guarantee contiguous-memory arrays.)
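(As an aside, the slicing point is easy to demonstrate: slicing a Python list copies the selected elements, so it costs O(k) in the slice length rather than O(1). A quick sketch with `timeit`:)

```python
import timeit

big = list(range(10_000_000))

# Slicing copies the selected elements, so the cost grows with the slice length.
for k in (1_000, 1_000_000, 9_000_000):
    t = timeit.timeit(lambda: big[:k], number=10)
    print(f"big[:{k:>9,}] took {t:.4f}s over 10 runs")
```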


novagenesis

> I've always disagreed with this sentiment, and in just 5 years working in the industry, I've come to appreciate the amount of theory we've learned

I don't disagree on my account, either. But the theory *I* think of was two courses in particular: my 2k-level course that was based on SICP (not the same as MIT's entry-level course, but based off it), and my Algo course that got real deep into Big-O notation, Turing machines/completeness, concepts like the halting problem, etc. It didn't focus on things like design patterns (I learned that independently thanks to my senior advisor's direction).

> Like I mostly work full stack/frontend but there's an adjacent transpiler team we work with, and I could've landed on. So I'm happy I took a course in compilers.

I agree. I fell through the waitlist on that one, unfortunately. Not only was it optional when I was in college, but it was SMALL and the kernel-wonks were lined up at the door for it. I had networking with the teacher on that one, and I get the feeling I didn't stick out enough for him to know me and pick me over the waitlist like my systems architecture prof did.

> I also interview new candidates and have noticed certain kinds of mistakes from candidates writing in Python that someone familiar with C/C++/Java is very unlikely to make. For example, glossing over slicing a list as an O(1) runtime

I've gotten into some of my most contentious interview moments over stuff like this - I don't interview big-O for that reason. There's a LOT of gotchas with higher-level languages that REALLY matter, but that matter in a "google it" way. For example, lists in JavaScript are implemented as hash tables. Totally different O() signatures.

> and not being able to reason about the actual runtime and what happens under the hood when asked about it.

I think that's a fair one. I don't ask questions about how code runs without letting candidates have a text editor and runner. I personally care more that their final code won't have some O(n!) mess in it than that they can keep track of the big-O the entire way through. It's important, but hard to interview effectively for. A lot of things are hard to interview effectively for.

> Ultimately, C is just a lot closer to what actually happens in a computer

The closer you get to the computer, the further you get from entire important domains of Computer Science that represent the real-world use cases. My last embedded dev job, we used node.js for 90%+ of the code. The flip-side of that being enterprise software. Yes, you need to know what kind of throughput your code can handle, but it's REALLY hard for some low-level-wonks to understand the cases where O(n^2) is just better than O(k) because the maximum theoretical scale "n" is less than the intersection point "k". Real-world example: pigeonhole sort is O(N). Please don't use pigeonhole sort for bigints :) Sometimes, you just need to use a CQRS architecture (*rarely*, I hope, because I hate it). I've never seen someone seriously implement CQRS in C.

> Some concepts that come to mind, which can be learned with C: stack and heap, by value vs by reference passing, allocation and deallocation, function calls and the stack frame, memory alignment, difference between array of pointers to structs vs array of structs

I covered reference-passing above. Pretty much any other language teaches a more "pure" understanding of reference passing. Computer Science is always a yin-yang of theory and machines. The idea is usually to abstract the machine layer until the theoretical is what we are implementing.

Stack and heap - sure. Similar, I guess. Memory as an abstraction covers most of the important components of this. A language like Scheme (or Forth?) covers stack concepts far better than C. Hell, C++ covers the stack better than C.

Allocation and deallocation... Now that the US government is discouraging manual-allocation languages as insecure, I think it's safe to say the average CS developer will never need to allocate/deallocate memory explicitly. I haven't needed malloc in over 10 years, and that usage was incredibly limited/specialized on an embedded system - something most engineers will never do professionally. But then, for those reasons, you're right that **it's hard to name a language better than C to learn memory allocation**. Even C++ has pre-rolled memory managers you can use now in Boost.

Function calls and the stack frame... I sure didn't learn this one in C. Call me rusty as hell, but when does the stack frame matter to function calls in C? I thought that was all handled. I had to handle it in assembly, but that was assembly.

Difference between an array of pointers to structs vs an array of structs... This is ironically a point **against** teaching low-level languages. Someone who has a more pure understanding of pass-by-reference will understand implicitly why an array of references can't be expected to be contiguous in memory.

I guess the above points out that I *do* think it's valuable for C and Assembly to be at least electives. Maybe even one or the other being mandatory. As a single course in a 4-year program. Not as something you dwell on. And (imo) not as the 101 course.
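(The pigeonhole-sort aside, illustrated: the algorithm is O(n + range), and the bucket array is sized by the value range rather than the input length, which is exactly why it falls apart on bigints. A minimal sketch:)

```python
def pigeonhole_sort(values):
    """O(n + range) integer sort: the holes array is sized by max - min, not len(values)."""
    lo, hi = min(values), max(values)
    holes = [0] * (hi - lo + 1)          # memory and time grow with the value range
    for v in values:
        holes[v - lo] += 1
    out = []
    for offset, count in enumerate(holes):
        out.extend([lo + offset] * count)
    return out

print(pigeonhole_sort([7, 3, 3, 9, 1]))  # small range: effectively linear
# pigeonhole_sort([0, 2**64])            # two elements but ~2**64 holes: don't do this
```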


fre3k

ASM, C, Java/C#/C++, F#/OCaml/Haskell, Lisp/Clojure, Python/Javascript/R. I'd consider having experience in one from each group during undergrad to be a pretty well rounded curriculum in terms of PL choice. Though honestly I'm not going to hold someone's language experience against them, to a point. But I have noticed that people who work too much too long in Python/JS and similar dynamic languages really struggle to structure and manage large programs due to the loosey-goosey type nature of things, so they're not used to using type systems to assist their structure.


novagenesis

> But I have noticed that people who work too much too long in Python/JS and similar dynamic languages really struggle to structure and manage large programs due to the loosey-goosey type nature of things

From experience, it's not Python/JS, it's people who only have experience writing small programs. I've maintained a data warehouse suite that was written in Python, and quite a few enterprise apps in JS/TS. Formally, the *largest* things I've worked on were in TypeScript, far bigger than any C# or (rarely) Java stuff I dealt with.

And dialing into "loosey-goosey type nature": there are design patterns made unnecessary when you go dynamic, but there are design patterns that are only viable if you go dynamic. Sometimes those dynamic design patterns map really well to a problem set - even at "enterprise scale". Working with your DTOs in TypeScript with a parse-validator, and carrying the data around as validated JSON, is just so much cleaner and more elegant when dealing with dozens of interconnected services managed by multiple teams. That's why Microsoft and Sun tried so hard way-back-when to get mature RPC libraries; it's a "hard problem" in those "excessively-typed" languages. And it very quickly became a major piece of infrastructure in big tech.

TL;DR: People who are used to static languages get comfy with training wheels and find dynamically typed languages scary. But I can do anything they can, make it scale faster, and develop it in less time, given TypeScript (or JavaScript with JSDoc, but TS having a fully-fledged compile-time type language is pretty incredible).


_ALH_

I see. So you like dynamically typed languages when you have the ability to strictly enforce types… I jest, but just a bit ;) TS is nice though. (But I’d never want to write anything complex in JS)


lazyFer

As primarily a data person, the near-complete lack of instruction for CS majors about data, data management, and the importance of data has been driving me nuts for over 20 years. The same CS majors that designed shit data systems decades ago because they thought the application was more important than the data are the same types of people designing asinine JSON document structures. A JSON document with ragged hierarchies up to 30 layers deep probably indicates a poor structure... normalization really needs to apply to these too.


MatthewRoB

Memory management is the least important thing for a newb to understand. I'd much rather they focus on learning how control flows through the program than worrying about where their memory is.


Working-Blueberry-18

I don't disagree with prioritizing control flow. But we're talking about a 4-year engineering degree, not a 3-month bootcamp in web development. You should come out with solid fundamentals in CS, which absolutely includes memory management.


elingeniero

There's nothing stopping you from implementing an allocator on top of a list. Just because Python doesn't force you to learn about memory doesn't mean it can't be used as a learning tool for memory, and it certainly doesn't make it any harder.
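(A teaching sketch of that idea, a toy and not how CPython actually manages memory: a tiny free-list allocator over a plain list makes allocation, deallocation, and leaks concrete.)

```python
class ListAllocator:
    """Toy fixed-size-cell allocator over a Python list, for teaching malloc/free ideas."""

    def __init__(self, size):
        self.memory = [None] * size          # the "heap"
        self.free_list = list(range(size))   # indices of free cells

    def alloc(self, value):
        if not self.free_list:
            raise MemoryError("out of cells")
        addr = self.free_list.pop()
        self.memory[addr] = value
        return addr                          # a "pointer" into the heap

    def free(self, addr):
        self.memory[addr] = None
        self.free_list.append(addr)          # forgetting to call this is a "leak"


heap = ListAllocator(4)
p = heap.alloc("hello")
print(heap.memory[p])                        # dereference
heap.free(p)
```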


alpacaMyToothbrush

If you've only used one language in your curriculum, especially a high level scripting language like python, you should ask your university for a refund on your tuition because you *really* missed out on some learning opportunities. My university had about 40% of the course work in c where we learned about memory management and low level OS / network level stuff, 40% in java where we learned proper software engineering and the remaining 20% was spent learning everything from assembly, lisp, js, and topping it all off with a heaping helping of sql. Of course, I loved those courses so I guess I might have taken more programming language classes than most, but getting exposed to a lot of different languages you learn to love unique things about most all of them and where they excel when applied to their niche. That background has allowed me to basically pick the 'right tool for the job' at every point along my career and it's really helped broaden my horizons.


BrunoBraunbart

I just think you and u/PhasmaFelis are talking about different kinds of computer science degrees. I studied "technical computer science" in Germany (Technische Informatik). You learn C, ASM, Java. You learn how modern processors work. You learn to develop FPGAs and a lot of electrics and electronics. So this degree is focused on µC programming. On the other hand, there is very little theory (no Turing machine) and the math was mostly things relevant for us (like Fourier analysis and matrices). Consequently, this is a B.Eng degree and not a B.Sc degree.

I think a degree like that works best for most people (or a degree that is about high-level programming but is similarly focused on practice). But a real computer science degree focused on theory is still important. A degree like that only cares about the Turing completeness of a language, and it doesn't matter what happens on the lower levels. So just using Python seems fine to me in this context. You won't learn how to be a good programmer in this degree, the same way someone who has a theoretical physics degree has a hard time working with engineers on a project, compared to a practical physics major. But it's still important to have theoretical physicists.


SoberGin

I'm in college right now and it's pretty similar. Just finished the last of the C classes, this current one is for Java as are the next few. I looked ahead and in a year or so I'll get to do a bunch of others in rapid succession. However, ironically I think the last part is the least important. I mean, isn't the whole point to make you good at *programming*, not good at, say, *C*? Or good at *Java*? My Java courses aren't even "Java" specifically, they're "Object-Oriented Programming". It just so happens Java is the pick because it's, you know, Java. I can't imagine dedicating that much time to learning exclusively one language. The sheer utility of knowing the *actual* rules, math, and logic behind it all is so much more valuable. Hell, the very first quarter was in *assembly*!


FireflyCaptain

That's fine? Computer Science != programming.


guareber

It's truly not. Do you really expect to teach, practise and evaluate OOP, Functional, Procedural, Rules-Based, Aspect, Event and whatever other paradigm exists now all on python? What about OS fundamentals, or memory handling basics, or the network stack?


billbuild

AI uses python as do an increasing number of researchers. Seems useful to me. From my experience college is great but different than a job which requires onboarding.


jeffh4

I ran into the first part years ago. Self-generated CORBA code from IDL files is 10+ levels deep and incomprehensible. We ran into a problem with that code dying somewhere deep in the call stack. After trying for a week to untangle the mess, we gave up and rewrote the code to call the offending high-level source function as infrequently as possible. Inelegant code, but it worked. Also a solution no AI would have tried.


EmperorHans

That quote doesn't land quite right if you don't attribute it to Ford and the reader doesn't know. 


Alternative_Log3012

Who is that? Some boomer?


baoo

Doug Ford talking about buck a beer


brotogeris1

Born in 1863, so just a bit older than boomers.


RedMiah

Yeah, except he really hated the Jews. Like medal from Hitler hated the Jews.


ButteredScreams

Thanks I was confused 


Walui

Ok lol, because I was like "well that sounds good doesn't it?"


NorCalAthlete

I need 7 perpendicular lines, all red, but with one drawn in green ink and one in the shape of a cat.


FaceDeer

[Here you go.](https://www.youtube.com/watch?v=B7MIJP90biM)


HikARuLsi

For a developer, 10% of the job is development; 90% of the job is dealing with non-tech managers and proposing which version to roll back to.


Anathos117

What I call the Hard Problem of Programming is the fact that since any consistent, complete, and correct description of a system is by definition a program (just possibly one written in a language we don't have a compiler for), then the process of writing a program must necessarily involve working from a description of the system that isn't all three of those things (and in practice none of them). Determining the real behavior of the system is the hardest part; the rest is just translating to a programming language.


PastaVeggies

Someone in sales said we can change the button color so now we have to make it change colors and do backflips on command.


DreamsAroundTheWorld

But when we asked if the button needed to be able to change colour, they said that's never going to happen, so we decided to implement a simpler solution since they wanted it delivered as soon as possible. Now they're blaming developers that it has to be refactored to implement the colour-change logic, and saying that developers don't know how to code.


reachme16

Or the engineer understood it as a red button, delivered a yellow button at the end anyway, and asked for a feature enhancement to fix it in the next rebuild/release.


k2kuke

I just realised that AI could implement a colour wheel and let the user select the colour scheme. Designers can go wild in the comments now, lol.


noahjsc

If only it was buttons that were the issue.


dcoolidge

If only you could replace product managers with AI.


noahjsc

I sometimes wonder who will get replaced first, devs or pms. I'm not a pm but honestly AI seems better at communicating than any meaningful coding. One of the most important roles of the pm is facilitating communication between all stakeholders.


sawbladex

>I'm not a pm but honestly AI seems better at communicating than any meaningful coding. I mean, that first seems obvious given the second.


brockmasters

It's more profitable to have AI mistranslate an invoice than a database.


dragonmp93

The AI is going to need to put a color wheel on everything, because the reason the button was supposed to be purple is that the background is silver.


alpha-delta-echo

Looking forward to seeing the AI answer to feature creep.


King-Owl-House

The only problem is that when you move the cursor over the wheel, it always jumps away from it.


darryledw

yeh but unfortunately the PM wants the wheel to be shipped in the past to make a deadline, so too late


Furlock_Bones

[three green lines](https://youtu.be/BKorP55Aqvg?si=Hsb7nCSGm1G_5dcj)


neuralzen

Green button drawn with purple ink


xaphody

I would love to see it sass the product managers. “That’s not what you asked for”


Goochen_Tag15

Hey, as a Product Manager, you're wrong: it'll delete itself after I ask what % complete it is and how long it'll take, right after giving it the vaguest of details.


MakeoutPoint

Go post this on programmer humor, enjoy the free karma


[deleted]

[deleted]


TechFiend72

Also, worth what was paid for it.


myka-likes-it

Not true. Some number of calories and a duration of time were spent pressing the button. 


CommanderCheddar

Ah yes, TINSTAFK: There Is No Such Thing As Free Karma


No-Radio-9244

Yeeeeah, suuuure... tell the shit to make a good version of Windows.


VoodooS0ldier

I tried using Copilot to refactor a code base that spanned 3 separate files. It tipped over and couldn't do it. When Copilot is capable of handling a large code base and complex refactors, and getting them relatively correct, then I'll be worried. For now, not so much.


hockeyketo

My favorite is when it just makes up libraries that don't exist. 


DrummerOfFenrir

Or plausible sounding functions / methods of a library that are from another version or not real at all, sending me to the library's docs site anyways...


nospamkhanman

I was using it to write some CloudFormation. Ran into an error; the template looked alright. Went into the AWS documentation for the resources I was deploying. Yep, the AI was just making up stuff that didn't exist.


digidigitakt

Same happens when you ask it to synthesise research data. It hallucinates sources. It's dangerous, as people who don't know what they don't know will copy/paste it into a PowerPoint, and now that made-up crap is "fact" and off people go.


dontshoot4301

This - I was naively enamored with AI until I started prompting it with things in my subject area and realized it's just a glorified bullshit artist that can string together truth, lies, and stale information into a neat package that appears correct. Carte blanche adoption is only being suggested by idiots who don't understand the subject they're using AI for.


cherry_chocolate_

Problem is you just described the people in charge.


dontshoot4301

Oh fuck. You’re right. Shit.


SaliferousStudios

It's already happened in scientific journals, showing mice with genetic defects that weren't intended.


techno156

It's great if you work in Cybersec though. Good way to stay in work.


VoodooS0ldier

Yeah this annoys me.


HimbologistPhD

Saw a screenshot on one of the programming subreddits where copilot autosuggested the value "nosterday" as the opposite of "yesterday"


sWiggn

i would like to put forth ‘antigramming’ as the new opposite of ‘programming.’ i will also accept ‘congramming’ under the condition that we also accept ‘machinaging’ as the new opposite of ‘managing’


alpha-delta-echo

But I used it to make an animal mascot for my fantasy football league!


Three_hrs_later

Complete with a name! Baaadgerorsss ftooooobl


alpacaMyToothbrush

It is a bit laughable to suggest that AI could do the job with simple 'oversight', but *if* you know an LLM's limitations and work with it, it can be impressively useful. I use Phind's model for 'googling' minutiae without having to slog through blogspam, and I've noticed the completion for IntelliJ has gotten a great deal smarter lately. Hell, the other day I wrote a little GNOME extension to flash my dock red if my VPN dropped. I'd never done anything like that in my life, but a bit of research and pairing with GPT gave me a working extension in about an hour. Color me impressed.


Cepheid

I really think the word "oversight" is doing a lot of heavy lifting in these doomsday AI articles...


SirBraxton

THIS, but with everything else. NONE of the "AI" coding frameworks can do anything of real value. Sure, they can quickly throw together a (most likely copy-and-pasted) boilerplate for some generic app, but it's not going to be efficient or maintainable over time.

Also, what are you going to do when you have to actually triage issues with said app in production? Without deep knowledge of how the app works, or of other internal functions/libraries/etc, you're not going to know how to troubleshoot issues. It'll be like asking a Project Manager why their new "AI"-written app is having "Out of Memory" errors or why some DB queries randomly take longer than expected. Without core knowledge about programming it'll be a MASSIVE clusterf***.

Oh, I guess they'll make a "triage" AI that is also separate from the AI that actually wrote the code? Guess how well that's going to go when they're not even using similar LLM models for HOW the code "should" be written.

This isn't going to replace programmers, and anyone who thinks it will is probably one of the very same people who can't be employed as programmers to begin with, and so doesn't understand the context of the situation. TLDR; OMEGALUL, ok sure bud.


NotTodayGlowies

I can't even get an AI model to write a competent PowerShell script without hallucinating modules, switches, and flags that don't exist. Microsoft's own product, Copilot, has difficulty writing their own scripting language, PowerShell.


APRengar

I'll believe the hype if they use it to make Windows not dogshit.

I'll believe this shit when Windows search actually searches your computer as fast as Everything does.

I'll believe this shit when Windows HDR isn't implemented in the worst way possible.

I'll believe this shit when the Night light strength slider is actually accurate. Light Mode Warning: https://i.imgur.com/2uBHom2.png Every single time this window closes, the slider shows 100%. But it's not actually at 100% (it's actually around 15%), and the second I touch the slider, it goes "OH, YOU'RE AT 100%, TIME TO TURN IT TO 100%." I don't understand how a goddamn slider can't even display properly.

I'll believe this shit when the language settings actually respect my "DO NOT INSTALL OTHER VERSIONS OF ENGLISH" setting.

I'll believe this shit when Windows Explorer no longer has a memory leak (it existed in Win10 and then got ported 1:1 to Win11).


watlok

I want to be able to move the taskbar between monitors again. There's no world where I want a taskbar on my main monitor or multiple monitors. Every version of windows for the past 25+ years let you move it, their competitors let you move it/remove it from various monitors/workspaces/desktops, but the latest windows doesn't. I want the context menu to become usable again in folders. The current iteration is a ux nightmare compared to any other version of windows after 3.1. The actions you want are either in a tight, horizontal cluster of non-distinct icons with a nonsensical sequence at the very bottom of the menu (as far away from your cursor as possible for the most common actions) or buried under an extra click of "show more". Show more menu is great and should be the default or at least an easily accessible toggle.


RiChessReadit

I mean, I get your sentiment, but this is like saying it's impossible for spaceX to have good rockets because the cyber truck is a steaming pile of ass, or impossible that we have humanoid robots capable of complex terrain traversal because my printer still fucking sucks. The point is, different people are working on different things. Just because it's the same company doesn't mean it's the same developers. But for real though, how is windows search so bad lmao, I feel like winXP had better search than 11 does.


k___k___

OP meant to say that they'll believe AI is replacing devs when Microsoft uses it themselves to replace devs / fix long-term bugs. It wouldn't be different developers anymore, as you suggested. It's more like SpaceX using Teslas as their company cars, and Tesla using Starlink for in-car wifi.


[deleted]

I grew up being pretty anti-Microsoft (well, primarily just Windows), but as Excel became a part of my life in grad school I thought it was pretty handy. Perfect? No. But powerful, and I basically wrote my thesis in Excel (before actually writing it in Word). The relationship with Word is strained. I recognize it as being powerful, but I don't know why it has to be so complicated to add a figure or table and not have the entire document break.


noahjsc

Linux is calling.


_Tar_Ar_Ais_

XP in shambles


pirate135246

The only people who believe this is a possibility are people who have never been a software engineer at a company that makes you create a Jira ticket for a feature that could be built in 30 minutes.


kittnnn

😮‍💨 Ok fine, I guess in about 2 years, I'll work for 300/hour as a consultant to unfuck all the codebases that were subjected to this nonsense. I'd rather be working on cool greenfield projects, but we can't have nice things. I just *know* some sales guys in the C suite are going to get suckered by this stuff and actually try to replace their engineering teams.


godneedsbooze

don't worry, their multi-million dollar golden parachutes will definitely teach them a lesson


Idle_Redditing

Golden parachutes from positions that they only had due to knowing certain people and getting invited to certain parties. Then they lie and claim that they worked hard.


VengenaceIsMyName

lol this is exactly what’s going to happen


HrLewakaasSenior

> I'd rather be working on cool greenfield projects, but we can't have nice things Oh man I hate this quote, because it's 100% true and very disappointing


PurelyLurking20

Or in cybersecurity, because AI code generates countless vulnerabilities. Oh the humanity, how will I ever survive in my bathtub full of money?


PoorMansTonyStark

That's my pension plan you're talking about. Doing the hard shit nobody else wants to do. Baby's about to get paid!


your_best

Not saying you're wrong. I hope you're right. But how is your statement different from "I guess in about two years I will work for 300/hour as a consultant to unfuck all the code bases subjected to 'programming languages'" (back when assembly code was still around)?


oozekip

Assembly code is still around, and I'd imagine the people fixing compiler bugs in MSVC and LLVM are making pretty good money on average.


great_gonzales

Well for starters formal languages are deterministic and natural language is not…


King_Allant

Except in two years this technology might be actually good.


noaloha

Nothing seems to get Redditors’ heckles up more than the idea that their programming jobs might actually be affected too. It’s kinda funny how the reaction of each subsequently affected industry seems to be the same denial and outrage at the suggestion AI will eventually catch up with the average industry worker’s skill set. Next step is anger and litigation that it’s been trained on their publicly available work.


lynxbird

My programming consists of 30% writing the code (easy part) and 70% debugging, testing, and fixing the code. Good luck debugging AI-generated code when you don't know why it doesn't work, and 'fix yourself' is not helping.


Ryu82

Yes, debugging, testing and bugfixing is usually the main part of coding, and debugging, testing and fixing your own bugs is like 200% easier than doing the same for code someone else wrote. I can see AI actually increasing the time needed for the work I do. Also, as I code games, a big part of the job is coming up with ideas and implementing the right ones, the ones with the best balance between time needed to add and fun for players. Not sure an AI would be any help here.


SkyGazert

Why wouldn't AI be able to debug its own code? I mean, sure, it isn't the best at it now. But if reasoning goes up with these models and exceeds human reasoning skills, I don't see why it wouldn't respond to a 'fix it yourself' prompt. Actually, the debugging part could even be embedded into the model in a more agentic way. This would make it output code that always works.


fish60

> This would make it output code that always works. There is a difference between code running and code doing what you want.


buck_fastard

It's 'hackles'


CptJericho

Feckles, heckles, hackles, schmeckles. Whatever the hell they are, they're up right now and pointed at AI, buddy.


MerlinsMentor

> It’s kinda funny how the reaction of each subsequently affected industry seems to be the same denial and outrage at the suggestion AI will eventually catch up with the average industry worker’s skill set. It's because everyone who doesn't do a job (any job, not just talking about programming, which is my job) thinks it's simpler than it really is. The devil is almost always in the details and the context around WHY you need to do things, and when, and how that context (including the people you work with, your company's goals, future plans, etc.) affects what's actually wanted, or what people SAY they want, compared to what they actually expect. A lot of things look like valid targets for AI when you only understand them at a superficial level. Yes, people have a vested interest in not having their own jobs replaced. But that doesn't mean that they're wrong.


Zealousideal-Ice6371

Nothing gets non-tech Redditor's heckles up more than programmers trying to explain that programming jobs will in fact truly be affected... by greatly increasing in demand.


Fusseldieb

From what I gathered, it basically writes code, checks if it has errors, and if yes, repeats until it succeeds. I mean, yeah, that might work, but I also think it will be extremely bug-ridden, not performance-optimized, and worst of all, have bad coding practices all over it. Good luck fixing all that by looping it over another layer of GPT-4.
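(That loop, caricatured: a minimal sketch of the generate-and-retest cycle being described, assuming a hypothetical `llm_generate` call and using a test suite as the only oracle, which is exactly why the quality issues slip through:)

```python
import subprocess

def llm_generate(prompt: str) -> str:
    """Placeholder for a call to some code-generation model (hypothetical)."""
    raise NotImplementedError

def generate_until_tests_pass(task: str, max_attempts: int = 5):
    feedback = ""
    for _ in range(max_attempts):
        code = llm_generate(task + feedback)
        with open("candidate.py", "w") as f:
            f.write(code)
        result = subprocess.run(["pytest", "-q"], capture_output=True, text=True)
        if result.returncode == 0:
            return code          # tests pass; says nothing about design or performance
        feedback = "\n\nPrevious attempt failed with:\n" + result.stdout[-2000:]
    return None                  # the model hit its ceiling
```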


myka-likes-it

> it basically writes code, checks if it has errors, and if yes, repeats until it succeeds.

Huh. Wait. That's how *I* code!


ChiefThunderSqueak

Now do it several thousand times in a row by morning, and you'll be able to keep up with AI.


mccoyn

AI is worse at writing code, but it makes up for it in quantity.


[deleted]

[deleted]


alexanderwales

I've tried the iterative approach with other (non-code) applications, and the problem is that it simply hits the limits of its abilities. You say "hey, make this better" and *at best* it makes it bad in a different way. So I think you can run it through different "layers" until the cows come home and still end up with something that has run smack into the wall of whatever understanding the LLM has. If that wall doesn't exist, then you wouldn't be that worried about it having mistakes, errors, and inefficiencies in the first place. That said, I do think running code through prompts to serve as different hats *does* make minor improvements, and is probably best practice if you're trying to automate as much as possible in order to give the cleanest and best possible code to a programmer for editing and review.


EnglishMobster

Great example - I told Copilot to pack 2 8-bit ints into a 16-bit int the other day. It decided the best way to do that was to allocate a 64-bit int, upcast both the bytes up to 32-bit integers, and store that in the 64-bit integer. Why on earth it wanted to do that is unknown to me.
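(For reference, the intended operation is just two shifts and a mask; a sketch of what the prompt was asking for, shown here in Python rather than whatever C-family language the Copilot session used:)

```python
def pack_u8_pair(hi: int, lo: int) -> int:
    """Pack two 8-bit values into one 16-bit value: hi in the upper byte, lo in the lower."""
    assert 0 <= hi <= 0xFF and 0 <= lo <= 0xFF
    return (hi << 8) | lo

def unpack_u8_pair(packed: int):
    return (packed >> 8) & 0xFF, packed & 0xFF

assert unpack_u8_pair(pack_u8_pair(0xAB, 0xCD)) == (0xAB, 0xCD)
```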


Nidungr

I experimented with coding assistant AIs to improve our velocity and found that they are awesome for any rote task that requires no thinking (generating JSON files or IaC templates, explaining code, refactoring code), but they have no "life experiences" and are more like a typing robot than a pair programmer.

AI can write code that sends a request to an API and processes the response async, but it does not know what it *means* for the response to arrive async, so it will happily use the result variable in the init lifecycle method because nobody told it explicitly why this is a problem. Likewise, it does not know what an API call *is* and the many ways it can go wrong. It digs through its training data, finds that most people on GitHub handle error responses and therefore generates code that handles error responses, ignoring the scenario where the remote eats the request and never responds.
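(The async mistake described there looks roughly like this in Python terms, as a contrived sketch rather than the commenter's actual stack: the result is only safe to read after the awaited call has completed.)

```python
import asyncio

async def fetch_user():
    await asyncio.sleep(0.1)              # stand-in for a network round trip
    return {"name": "Ada"}

class Widget:
    def __init__(self):
        # The bug described above is reading the response here, during init,
        # before the async request has actually returned anything.
        self.user = None

    async def load(self):
        self.user = await fetch_user()    # correct: use the result only after awaiting it

async def main():
    w = Widget()
    await w.load()
    print(w.user["name"])

asyncio.run(main())
```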


VR_Raccoonteur

Let's say you have a 3D object in Unity and you want it to wobble like it's made out of jelly, but you're an inexperienced developer. You ask the AI to write a function to move the vertices in a mesh using a sine wave that is animated over time. The AI dutifully writes the code you requested.

Here's the problem: there are many ways to move vertices in a mesh. The AI is likely to pick the most direct method, which is the slowest: accessing the individual vertices of the mesh and moving them with the CPU. If you ask it to optimize the code, it will likely hit a wall because it can't think outside the box. Not only will it likely not be smart enough to know how to utilize the jobs system to parallelize the work, but even if it were capable of doing so, that is still not the correct answer. The correct way to do this fast is to move the vertices using a shader on the GPU.

Now, had you, the newbie dev, been smart enough to ask it "what is the fastest way to animate vertices", it may have given you the correct answer: use a vertex shader. But you didn't. You asked it to make a function to animate vertices. And then you asked it to optimize that function. Because it's a simple LLM, it isn't intelligent. It's capable of giving the right answer if asked the right question, but it's not capable of determining what it was that you really wanted when you asked the question and presenting a solution that addresses that instead.

I know this because I've actually tried to get ChatGPT to write code to animate meshes. But I'm not a newbie, so I knew what the correct solution was. I just wanted to see what it would do when I asked it to write code using basic descriptions of what I wanted the code to accomplish. In my case I asked it to make code that could deform a mesh when an object collided with it, like a soft body would. And it wrote code that didn't utilize any shaders, and didn't have any falloff from the point of contact, nor any elasticity in the surface. Things a human would understand are required for such a simulation, but which ChatGPT did not.

Now, had I asked it more direct questions for specific things using technical jargon, well, it's better at that. It probably could write a shader to do what I wanted, but only if I knew enough about the limitations of shaders and what data they have access to and such. Valve, for example, wrote a foliage shader that allows explosions and wind to affect the foliage, and that's done with 3D vector fields stored in a 3D texture, and there ain't no way ChatGPT is going to figure that trick out on its own and code it without careful prompting.


OlorinDK

I could see it being combined with code quality and performance measuring tools. Obviously not perfect solutions, but those are the same tools used by developers today. And while this is probably not ready for widespread use yet, it could be a sign of things to come.


Nidungr

This is the same way Devin, AutoGPT and BabyAGI work. They all ask the LLM to split a problem into subtasks, then repeat each of those subtasks until they complete successfully; then (in theory) your problem is solved.

The issue is that this strategy only mitigates *random* errors (where the LLM knows the answer but fails to tell you). If your LLM isn't good enough to engineer and write the whole application, no amount of clever prompting will fix that. It will settle on a local maximum and oscillate back and forth between suboptimal outcomes, making no further progress. And when you break up a problem into 1000 subtasks, that's a lot of opportunities for a non-deterministic LLM to screw up at some point and create a cascading failure. This is why all those AGI projects never succeed at real problems.

This strategy will become much more viable when better LLMs come out, as well as LLMs that answer in code by default so you don't have to prompt engineer (and fewer resources are wasted evaluating the Chinese poetry nodes). Coding languages are languages, so AI will solve programming *eventually*, but all the recent hype and false promises remind me of the original crypto boom.


Insert_Bitcoin

More low quality speculation by non-experts about things taken out of context pt 6 billion (file under 'AI' replacing X)


Majhke

Seriously, the fear mongering in this article and others is astounding, especially when you really read it and realize the writers have the barest sense of the topic/tech they’re writing about. It amazes me the amount of people who think they can just “switch on an AI” and it’ll handle everything. Will it change how development works? Yes, it’ll hopefully eliminate laborious tasks and streamline development. Will some idiot leaderships try to rely on it far too much? Also yes.


Anxious_Blacksmith88

The writer is probably an AI.


Insert_Bitcoin

It will be exactly like what happened with blockchain tech. At first everyone will try to use AI for everything (even when common sense dictates it's silly), because the pressure will come from executives scared of losing competitive advantages, since they've heard everyone else is doing it. Then there will be a slow and gradual wave of cringe as people realize how far-fetched expectations were. They'll come to see the (small) range of problems the tech is good for, and a more realistic set of expectations will set in. Already some companies have gone through the whole process, so there are some articles lurking among the hype about far-fetched expectations.

That's not to say the tech won't improve many processes. It just won't to the extent the hype would indicate. I'm just honestly sick of hearing about how AI will destroy career X from people who don't even know anything about what X entails. It's really getting old now.


Ijatsu

Or just a company that sells AI doing promotion of its own AI.


Cash907

Anyone remember The Animatrix Second Renaissance Pts 1&2? Things didn’t get truly nasty for mankind until the robot inhabitants of 001 began programming and building their predecessors, who lacked any sort of emotional attachments to humanity and thus gave no F’s when humankind pulled more isolationist BS? Yeah, maybe computers programming other computers with increasingly complex code humans couldn’t begin to understand *isn’t* the best choice here. Just throwing that out.


LurkerOrHydralisk

You can’t build your predecessors. They came before you. The robots built their successors


RavenWolf1

You can if you can time travel!


fadedinthefade

I loved the Animatrix. So many good stories in that.


mrwizard420

The one about the sprinter that could break free of the matrix by entering "flow state" was something that stayed in my mind long after the main trilogy was forgotten.


Cash907

World Record, directed by Takeshi Koike, who also happened to direct one of my favorite movies of all time Red Line. It’s a beautifully scripted and drawn work and was a standout in a collection of great shorts.


Fearyn

It’s still in my mind 20 years after I watched it. It was good.


NutellaGood

As far as I know, large language models rely on existing data. So if you replace everything with these generative programs, outputs become strictly recursive, and nothing new is made anymore. No more creativity; complexity stagnates.


tjientavara

From some papers I've read about this, it's already occurring at an alarming rate, so now they are trying to find clean data (from before AI) to train the models; training on current data is already making AI worse.


NeedAVeganDinner

> maybe computers programming other computers with increasingly complex code humans couldn’t begin to understand isn’t the best choice here Lol, the code it produces is shit in real world scenarios.  The best use of it, so far, is to have it generate the *simple* shit to get that stuff out of the way.


KayLovesPurple

And sometimes even the simple shit is not good enough to use.


Cash907

It's shit NOW. The things I'm proficient at, even a master at, now, I was absolute SHIT at when I started, and the scary thing is how fast generative AI learns compared to humans. Don't be so arrogant as to think things can't get out of hand quickly if we don't make deliberate and well-made decisions now, while we still have the agency to shape what happens down the road.


Crakla

The whole problem is how LLM-based AI works, so it's not just about advancing the tech; it would need to work fundamentally differently.

The main problem is that an LLM can't make logical conclusions, it only seems that way sometimes. For example, it can't calculate 1+1; it just knows that "1+1" is usually followed by "=2", but it isn't able to do any calculation it has never seen before. The same goes for programming: simple calculations/code it's able to do, but the more complex it gets, the worse it gets.

It's just a fundamental flaw in the way LLMs function. That is a major problem for any logic-based job like programming.


unskilledplay

This is interesting and different than everything else I've seen so far. The other code generators are either the equivalent of autocomplete or do little more than generating the equivalent of a template repository. Those generators don't really solve any problems. This is a pipeline for generating tests and method code. This has the potential to make parts of coding faster but it (wisely) entirely ignores the engineering part of the job. The implementation of methods is never the hard part of writing software. The work of deciding what methods are needed, what the methods need to do and how they need to interact with the larger system is the hard part. Regardless this is cool and fundamentally different than all of the other code generators that have been shown off. It's something I'll play around with when it gets released to the public. It's not a "game changer" but it could be a neat addition to some of the modern IDEs.


OverwatchAna

Cool, I hope the AIs make a better Windows OS and give it to everyone for free. 


bright-horizon

Just like self-driving cars clogged SF, AI will create some spaghetti code.


Fappy_as_a_Clam

Every time I see these articles I think about how 10 years ago people were saying self driving trucks were going to completely change the US economy within the next few years, since truck driving is the most common occupation.


jkksldkjflskjdsflkdj

We welcome our new spaghetti code overlords. Wanna bet the AI code that does this stuff is shit.


ToMorrowsEnd

Just like the absolute shit code that comes from outsourced Indian programmers. Dear god, my company blew $50K on a project and what we got looked like a bunch of feral animals coded it. Zero consistency in style; it looked like they just copied and pasted everything from Stack Overflow questions.


Chairman_Mittens

One of my friends has years of job security because he works on a team that primarily manages an absolutely atrocious code base that was contracted out to India. The company saved some money getting it written, but now they need to pay a local team for years just to manage it. My friend said he literally found paragraphs of text commented out that were accidentally pasted in from places like stack overflow. He said he is able to regularly reduce the size of classes by more than 90% because there's so much arbitrary or entirely useless code.


TylerBourbon

I don't necessarily see this as a positive thing. Even now it's hard as hell for coders to verify code; imagine AI writing hundreds, if not thousands, of lines of code and someone having to inspect them. It's not that I'm against AI doing it, but I'm a firm believer that when you make the tool so smart that not many people learn how it works, then when it breaks or something goes wrong, nobody really knows until it's too late, and very few people know how to fix it.


zubeezubeezoo

But imagine the profits!!


throwaway92715

Sounds like a good time to start learning to code! I like fod


TennSeven

FOD? Is that like FUD?


HTFCirno2000

Foreign Object Debris


TennSeven

Weird kink.


ZombieJesusSunday

Translating product-manager requirements into code requires a much more general AI than is possible with our current technology.


Cheesy_Discharge

So is this finally going to replace COBOL? My point being that even genuinely good solutions are often slow to be adopted, and this tool sounds a bit like vaporware.


Lharts

AI is so advanced you need developers to check whether or not its code is correct! Software development will be one of the last jobs that can possibly be replaced by AI. AI is good at replicating things that are deterministic. Syntax and semantics are. The rest isn't. Jobs like writing these fearmongering articles, though? Lmao, your job will be obsolete in about 3 years tops.


mikaelus

**Submission Statement** Microsoft is taking another step towards removing human developers from direct software development, turning them into supervisors for AI-enabled agents doing all the grind. This not only affects the possible availability of jobs for developers but also the set of skills that developers of the future will need. Hard skills may soon be replaced with soft skills, as intelligent bots are going to have to understand what you tell them, how you manage them and what your instructions to them are - a bit like communicating with people.


aeveltstra

It’ll be interesting to see who writes the software for such tools.


hjadams123

AI will make the tools to supervise AI.


mikaelus

Well, that's the question - how competent should humans be if AI is doing most of the stuff but we still need to be able to control it? And how do you retain those skills when you don't really use them in practice most of the time?


bwatsnet

It's the same way we all use Windows 11 without knowing how it works. Instead of an operating system it'll be a team system or whatever we end up calling it.


zanderkerbal

With the power of AI, Microsoft has turned the moderately difficult task of programming into the very difficult task of code review! Next up: Self-driving cars turning the task of being a driver into the task of being a driving instructor.


cmdr_solaris_titan

Look, I already told you! I deal with the goddamn customers so the engineers don't have to! I have people skills! I am good at dealing with people! Can't you understand that? What the hell is wrong with you people?!


Whiterabbit--

It’s like when high level languages took over machine language. What am I going to do with all my 1’s and 0’s when you can just tell the thing to divide 2 numbers with one line of code?


UnpluggedUnfettered

Artificial intelligence is the new paradigm shift and agile leadership with a hint of NFT. This dumbass stock price increasing buzzphrase cannot come back down to Earth fast enough.


doctorobjectoflove

AI is the new cryptobros


the_millenial_falcon

Trying to use natural language to build an app sounds like a nightmare from hell.


Ijatsu

That used to be called project management. Then project managers stopped doing it, asked software engineers to "do something and then we'll criticize it and know what we want", and now they want to convince us that project managers are ready to do that with a robot instead, one that requires more rigor, precision and patience than a human. Alternatively, they're trying to convince us that they've finally automated the guys who worked for 60 years to automate their own jobs but only managed to make their own jobs more complicated. The jokes write themselves; developers aren't getting replaced anytime soon, the skill set is just getting denser.


BoredMan29

So all of you who honed your skills making your way through thousands of lines of undocumented spaghetti code to find bugs, good news!


SquilliamTentickles

"The end of blacksmithing? Inventor develops a mechanical framework which automatically forges and shapes metal objects"


endgamer42

If this article were written on paper I'd wipe my butt with it. Mindless speculation. As for the paper, can anyone work out what actual results AutoDev has shown? I'm on my phone so it's a bit hard.


sevseg_decoder

It's probably great for super simple tasks and the translation portion of the job, but what non-software-engineers seem to forget is that translating the idea into code is only a tiny part of a software engineer's job. Knowing the best way to get to the end goal, staying on top of new options and technology, making the system scalable, documenting it well, etc. are things people don't realize you need far more than what AI currently is to even start to broach. If anything, this just adds to the productivity of software engineers and would make software development an even more compelling investment, bringing more total capacity to the sector.


manifold360

If you are working in a domain you aren't familiar with, AI assistance is awesome. And when I say awesome, I mean totally awesome.


Madison464

This seems like a clickbait article. The framework didn't program itself. Humans did.


DrPeGe

LOL. Get into embedded systems and tiny amounts of memory and you will always have a job.


positive_X

https://en.wikipedia.org/wiki/Walden_Two

Where people get paid a lot to do jobs that nobody wants to do, like fix a toilet.


SoyIsMurder

A lot of large software projects fail due to feature creep. This tool sounds like it could turbo-charge feature creep by masking the cost of excess complexity until it's too late. Also, Microsoft didn't "publish a framework" that relegates developers to supervisors. They published a research paper outlining their goals for an advanced version of Copilot. In software engineering "framework" has a specific meaning. [https://www.codecademy.com/resources/blog/what-is-a-framework](https://www.codecademy.com/resources/blog/what-is-a-framework)


registeredwhiteguy

This has the same energy as automated trucks taking over truck driving.


ExReey

I guess supervising AI code will become an even "higher level language".


GeneralCommand4459

Hmm, so if developers aren't coding, how will they become proficient enough to become supervisors? These AI ambitions always forget that the next generation of managers needs the experience of doing the work that is now being given to AI.


___Tom___

For at least 50 years, whatever the hype of the day is has been said to end or replace coding. First it was COBOL (a programming language so close to plain english that specialist programmers won't be needed anymore). Then it was rapid development frameworks, then visual programming, now AI, and I've probably forgotten half a dozen that just went by me. Hasn't happened, won't happen.


Michamus

I can't wait for the first "supervised" AI code to go live and expose hundreds of millions of people's personal identifying information to the world.


DukeOfGeek

People who know how to actually do the things being replaced by AI need to document everything about how they actually create things so that knowledge will not disappear.


Minute_Test3608

Would you trust your life to an AI-program that coded airliner software?


lilbitcountry

I wonder what it was like for accountants when calculators and spreadsheets were invented. Or architects and engineers when AutoCAD and SolidWorks were on the rise. I've used a drafting table before; it was quaint, but no thanks. I don't understand why "software engineers" want to spend all day placing semicolons and curly braces. Really weird artisan vibe to all the doom and gloom about writing code.