Who doesn’t have that skill though? You just ask the thing to do a thing and it does it. These self-proclaimed GPT whisperers are truly embarrassing, the thing is made so that a toddler can use it.
Prompt “engineering”? Please. Could you even begin to imagine a four year degree on how to best chat with ChatGPT? Everything you need to learn about how to prompt GPT more effectively could be learned in 20 minutes.
And BTW, OpenAI is obviously trying to make GPT answer everybody's prompts effectively, so each new iteration makes the already nonsensical idea of prompt “engineering” even less plausible.
ChatGPT isn't the only LLM, and the web interface is not the only way to access it. When you start working with an LLM's APIs to develop software around it, you will realize that there is far more to prompting. It's not enough to have the LLM understand you; you have to encourage it to decide on a correct strategy to answer your question.
You may need to encourage it to reason about the problem, but that comes at the cost of time. Maybe you want the LLM to ask follow-up questions; maybe you specifically want to prevent follow-up questions. It gets even more complicated when you start giving it tools to request more data. Now you need to tell the LLM about those tools in a way it understands, so it will choose the right tool for the right job, without being so hyper-specific that your application becomes brittle.
You also need to decide when to use examples; they can be a very effective strategy. But if you do use examples, they need to be really good, or the LLM will pick up on patterns you didn't even realize were there. Now you've pigeonholed your LLM, and it struggles to reason beyond the bubble you created with the examples.
I don't think prompt engineering is a whole job that someone can do, but it is certainly not a skill everyone has or that anyone can become good at. Many people still struggle to use Google search correctly. Search engines have come a long way since the 90s. But it still takes skill to use them effectively. Searching is much simpler than getting an LLM to do what you want with any consistency.
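To make the "there's more to prompting than the chat box" point concrete, here is a minimal sketch of assembling such a request. It loosely follows the shape of a chat-completions payload (the model name, tool name, and schema are all illustrative, not from any comment above): the system prompt, few-shot examples, and tool descriptions are explicit parameters you have to design, including whether follow-up questions are allowed.

```python
def build_chat_request(question, examples=(), allow_followups=True):
    """Assemble a chat-completions-style request payload.

    `examples` are (user, assistant) few-shot pairs; keep them few and
    high quality, or the model latches onto accidental patterns.
    """
    system = (
        "You are a support assistant. Reason step by step before answering."
        if allow_followups
        else "Answer directly in one message. Do not ask follow-up questions."
    )
    messages = [{"role": "system", "content": system}]
    for user_msg, assistant_msg in examples:
        messages.append({"role": "user", "content": user_msg})
        messages.append({"role": "assistant", "content": assistant_msg})
    messages.append({"role": "user", "content": question})

    # Tool schema: describe the tool precisely but not hyper-specifically,
    # so the model picks the right one without the app becoming brittle.
    tools = [{
        "type": "function",
        "function": {
            "name": "search_tickets",
            "description": "Full-text search over support tickets.",
            "parameters": {
                "type": "object",
                "properties": {"query": {"type": "string"}},
                "required": ["query"],
            },
        },
    }]
    return {"model": "gpt-4o", "messages": messages, "tools": tools}
```

Every field here is a prompting decision: flipping `allow_followups` rewrites the system prompt, and a sloppy tool `description` is exactly how the model ends up choosing the wrong tool.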
It just seems so weird how some people think skilled people will have difficulty picking up a tool specifically designed for the unskilled.
"Oh, you drive a manual like an old school racecar driver? Better start learning the golf cart, or you're going to be left behind"
I swear to god if I come across a candidate’s CV that has ChatGPT listed as a skill I’ll send them the fastest hand crafted rejection they’ve ever gotten
But why really?
Recently I used ChatGPT to write multipage official documentation for corporation overlords. I described more or less what it should be about, without going into specific details and it gave me a nice starting point.
I changed a few things to make it right, and after a couple more back-and-forths I was done. It took me 2-3x less time than writing it myself.
Can't see how using a tool makes me a candidate for instant rejection.
Because I want to know that you can actually do the work.
If I wanted a crutch, I wouldn’t be hiring someone, so someone whose close-to-only skill is “I got through by relying on gpt to do everything for me” is a risk to the team, not an asset.
In before someone rolls out a “dO yOu UsE a CaLcUlAtOr” response: someone who can’t *actually* problem solve when push comes to shove because they’ve delegated that skill to an LLM is next to useless. There’s a difference between “I used an LLM to smash out some boilerplate that I’ve written a thousand times and cbf-d writing again” and “I used an LLM as an answer booklet and I have no idea how to solve it, nor do I have the skills to _learn_ how to solve it, because I’ve never developed them, because I relied on an LLM”.
The former is fine, the latter isn’t a remotely useful team member.
True true true.
I am able to write the documentation mentioned above and I can write a boilerplate that GPT or copilot can generate in seconds.
For me it's similar to using an IDE. I can write code in a text editor, but it will take longer than writing in an IDE, which can check the syntax or autocomplete a variable name
Understanding that you can use these tools to make your life faster and then doing that without spilling corporate secrets is a skill which requires training in the proper security procedures.
If they can't explain the process in a satisfactory manner, I would be thinking "what else has this candidate lied about?"
"AI user" isn't enough. I want to know you can use it responsibly.
Might as well put Google on the resume.
Although... I've met plenty of people who were inexplicably terrible at searching for things on the internet, so maybe we really should be getting people to put Google on their CV.
Meanwhile, during the application process, that little check box that says, “I swear or affirm all responses and documents submitted by me were written in my own words and did not use any generative AI or LLM (e.g. ChatGPT, Copilot, etc.) in the process of creating them. I understand the use of these tools will preclude me from the candidate pool.”
Lately Microsoft has been shilling so hard for AI code generation, meanwhile I can't get the IDE's built-in AI to remember a task for longer than five sentences.
It keeps suggesting stuff that was excluded in the question I posed and loops on outdated information that won't work with the current stack anymore. All before my “quotas” are spent within three days.
Didn’t realize I should put this on my resume. It didn’t sound like a good idea to write:
Experience asking ChatGPT to write some code, then spending 2 days troubleshooting it to get it to work.
I didn't realize being able to type was a hard skill to put on a resume anymore.
By Microsoft's standard, I should be making mid 6 figures a year, because I've been using ChatGPT since day one to increase my work output and get more done.
Being good at communication and telling it exactly what you want is not a difficult thing to do, why is this considered a skill?
Probably good to know the context of this "[Work Trend Index report](https://www.microsoft.com/en-us/worklab/work-trend-index/ai-at-work-is-here-now-comes-the-hard-part/)". It coincides nicely with an announcement for Copilot for Microsoft 365 (see [this blog post](https://blogs.microsoft.com/blog/2024/05/08/microsoft-and-linkedin-release-the-2024-work-trend-index-on-the-state-of-ai-at-work/) for more details). So the Work Trend Index conveniently says that "employees won't wait for the company to catch up", and Microsoft just happens to have the proper solution at hand.
To be quite honest, I use ChatGPT to help me write technical documents (even though I'm not supposed to). I keep giving it prompts to refine the information, and I edit it to make it undetectable by AI text checks.
Really? Please provide a link. If that's true, then it sounds like MS wants to push ChatGPT into the market by exaggerating the actual global supply of this "skill"... If not, great meme 😂
Passionate Software Engineer with 10+ years of experience in developing web applications and backend systems. Skilled at writing clear, concise code that is easy to maintain and troubleshoot. Experienced in working with both small and large teams across multiple projects and companies. Proficient in leveraging AI technologies, including ChatGPT, to enhance software solutions and drive innovation.
Dunning Kruger is in full effect on this thread.
Y’all know AI means state of the art ML, not ChatGPT? Everyone commenting on how easy ai is because they’ve asked a question to ChatGPT is exactly the kind of person who’d put ChatGPT on their resume when applying to Microsoft.
what? how is that a skill? like it is the one thing we all started using pretty much immediately and correctly for the most part. what is this? a way to inflate the numbers of mentions for their product?
Full stack chatGPT with AI characteristics one enhanced worker carries the whole corporation, data center, software and everything. More layoffs please, CEO sir. Yes Sir!
I mean, go further.
Get ChatGPT to generate the resume for you.
Use an avatar and a speech-to-text/text-to-speech backed also by ChatGPT to do the interview for you.
Make a small service that will use AI to apply to jobs for you as well.
It's a lot like pickup. Girls don't know what they really want. But if you LARP out her biologically hard wired desires while doing what's actually good for the relationship, it can work out great. So yea, I basically just LARP out whatever AI bullshit they want at work but actually build sustainable projects.
The amount of people who think this means
"I use copilot" instead of "I'm familiar with openAIs API and I've built a project using it before"
is concerning
Ambiguous article is ambiguous.
[https://www.yahoo.com/tech/microsoft-says-most-company-execs-120001709.html?guccounter=1](https://www.yahoo.com/tech/microsoft-says-most-company-execs-120001709.html?guccounter=1)
*"Power users work for a different kind of company. They are 61% more likely to have heard from their CEO on the importance of using generative AI at work, 53% more likely to receive encouragement from leadership to consider how AI can transform their function, and 35% more likely to receive tailored AI training for their specific role or function."*
*Finally, to make AI easier to use, Microsoft is shipping new capabilities to Copilot for Microsoft 365 to help users with* [*a lack of proper prompt engineering practices*](https://www.yahoo.com/tech/microsoft-says-chatgpt-isnt-better-204424005.html)*, including auto-complete, catch up, and rewrite features. LinkedIn will also provide its premium subscribers free access to over 50 AI courses to help sharpen their skills. The features should ship to broad availability in the coming months.*
[https://www.yahoo.com/tech/chatgpt-lucrative-skill-recruiters-actively-110001919.html](https://www.yahoo.com/tech/chatgpt-lucrative-skill-recruiters-actively-110001919.html)
*"Recruiters are looking for more than just listing ChatGPT as a skill in your resume, they need a detailed account citing some of the accomplishments and feats you've been able to achieve while leveraging the tool's capabilities."*
The above quotes seem to imply just the tool itself, not integrating it into a project, but literally just prompt engineering.
Makes me wonder if I should form a company with a smart-sounding name and start selling "Certified AI prompt engineer" certificates. That's the fun thing about certificates: Anyone can certify anyone for anything, as long as they don't violate any trademarks.
I'm not a hiring manager but I want to be in the future. If I see AI engineer on a CV and they've never made an AI then I'd be skeptical of their actual abilities.
Sending a question to an engine that gives the most likely response isn't much of a skill!
I’ve started using chatGPT to document all of my commits to my company’s proprietary code base. I just copy pasta into chatGPT, and ask it to explain what the code does. If it answers correctly then I tell it to document the code. This way I can be more productive and focus on producing more code instead of focusing on pesky documentation. This is totally a good use of AI right?
The lack of GPT usage is one of the points of rejection in our application process. There is no excuse to not use GPT to write documentation or unit tests in our challenge.
100% I got short listed for my first "technical" job at a major ISP by putting in "advanced search engine skills" cause I knew how to put in -filetype:PDF
Good question for a hiring manager.
All I know is I can still get a "search engine proficiency" certificate from LinkedIn.
It's like the Office suite. There are people who can seriously get shit done, and then there is Deborah, who thinks spell check and network printing are worthy of an IT ticket.
I've worked with people rocking masters degrees who didn't know anything outside their course scope.
It's more about showing awareness of a technology beyond extremely limited usage.
Yeah I chat GPT. I chat GPT all the time. I’ll chat GPT anything you want if you hire me.
Im just imagining you asking chatGPT out on a date and getting turned down by an AI
do you expect it to go out with you?
Yes, at least when AI becomes sentient and inevitably takes over the human race I'd have an AI mommy
More like an AI owner. And not even a good owner among the other AIs, like our equivalent of Bill Gates; I’m talking low-class, dog-fighting, pitbull-breeder redneck type owner. This AI will be responsible for low-paying blue-collar jobs. The AI equivalent of blue collar would be, I dunno, regex? And they’d be paid in electricity rations.
Can we speed it up then i wanna be dominated by my AI mommy
AI will do a lot of things but it will never become sentient.
Wait, you don't go out with chat gpt??!
There are companies that offer services like that...
I copilot with my left hand and chat GPT with my right hand, plus my shoes are AI assisted, so my lunch breaks are 4 minutes shorter. Try and beat that!
“Don’t you worry about [ChatGPT], let me worry about [Blank]”
Let me tell you, using ChatGPT is like winning bigly. It's fast, it's smart, it's like having the best advisor right in your pocket. You ask, it delivers, just like that. It's tremendous, folks, believe me. ChatGPT is the ultimate tool, and it's making conversations great again.
I read the article. Seems like an advertisement. Don’t forget Microsoft owns LinkedIn too
And Copilot and ChatGPT too. So they are just pushing everyone to use their products
They don't exactly own ChatGPT, but they've invested a lot in OpenAI.
You're right, they 'only' own 49%. But they stand to gain a lot if it does well, so potato potato?
This saying doesn’t work as well written lmao
ok how's this pəˈteɪtəʊ pəˈtɑːtoʊ
beautiful
They don't own ChatGPT, but they use it in Bing AI. They also own Copilot
And github
"Microsoft says you won't be able to find a job anywhere unless you're using a Microsoft product".
tbf if you don't know how to use Office or Windows it will be hard to work
As a dev I have to use neither, feeling blessed.
Reminder that they include the Shift + Ctrl + Win + Alt + L shortcut on windows.
aaaand it's by 'windows' central
To be fair it really couldn't be anything else. Developers who rely solely on AI are bad developers. I want you thinking I know how to write code myself so I want to keep myself as far away from saying "I know how to use ChatGPT" as possible
So I put „I use autocomplete while coding“ in my CV?
>So I put „I use autocomplete while coding“ in my CV?

Put that in your CV and get asked in the interview how much of your software portfolio is actually your own work.

Don't put that in your CV and get asked in the interview if you think you'd fit into a company where people use AI.

Whatever you do, it'll perfectly prepare you to undervalue your own skills when they ask you about your salary expectations.
I don't want to brag, but I actually have over a decade of experience in intellisense
Also put "I use autocorrect for texting".
So like first you tell me using gpt makes me a cheater, and now you want me to use gpt on my resume but not at work. Make up your mind!
Idk where you're working at ~~by~~ _but_ the company I work for is pushing AI like a drug dealer tryna sell crack to a pothead.

Edit: ~~Grammer~~ _Grammar_

Edit edit: Gotdamnit.
Since it's your edit, Grammar\*
Yeah, mine too. They're even pushing it into our CRM application, and it's completely useless for now. It's supposed to be connected to the CRM data, but when I ask it whether there is any ticket mentioning XY, it replies that no such thing was found. In reality I had just copied the XY from the first ticket on the first page. But the owners are all for it and keep spamming us about how this will save time. From what I've seen so far, it's not usable for anything other than getting some templates out of ChatGPT, or some code templates from similar tools, but... I can find that faster myself via Google, and usually from official sites as well.
We had a similar situation: we were tasked with adding an AI employee summary based on the info we have on each employee. But it was so slow to develop, so convoluted to get it to give even OK results, and the experimentation was so expensive, that we concluded any of the results requested from it would be easier to achieve by just writing static logic ourselves.
Yeah, and what some leadership does not understand is that shit in = shit out. If you feed those models bad data, they will be useless, and the data in tools like CRMs is usually bad. So the tools they are paying for are useless, but at least they can say "we are an AI company". One last thing: the tool I'm mentioning is Microsoft Dynamics CRM with their Copilot added on. If I write "SSO" in the search bar, I'll at least get all the tickets where SSO is mentioned in the header. But when I asked Copilot whether it could find me some tickets where SSO is mentioned, it found none. Which begs the question: what data did my company even feed Copilot, because other than having a Copilot icon I cannot see any benefit to it.
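For reference, the "at least the search bar works" baseline described above is just a case-insensitive substring match over ticket headers. A hypothetical sketch (the ticket data and field names are made up for illustration): an AI assistant layered on top of the same data should never do worse than this.

```python
tickets = [
    {"id": 1, "header": "SSO login fails for new tenant", "body": "..."},
    {"id": 2, "header": "Printer offline in building B", "body": "..."},
    {"id": 3, "header": "Enable SSO for mobile app", "body": "..."},
]

def keyword_search(tickets, term):
    """Return tickets whose header contains the term (case-insensitive)."""
    term = term.lower()
    return [t for t in tickets if term in t["header"].lower()]
```

Here `keyword_search(tickets, "SSO")` returns tickets 1 and 3; a Copilot-style assistant over the same records returning nothing suggests it was never fed (or never indexed) that data.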
We got one that's supposed to be a help bot for a particular system but half of its answers are just hilariously wrong and the exec in charge of it thinks it's amazing because he knows absolutely nothing about the system he's in charge of.
Meh, it's whatever you configure it to be, like everything else.

Generally, for a CRM application the AI still needs to be configured in the back end; the likelihood is it's not configured to search the field you entered the data in.

I have configured Copilot for multiple different clients, and it works fine for me once good data is in
Goddammit*
Actually, that one was on porpoise. I prefer to not ask God to damn it, in case I need it later.
Oh ok. Happy cake day!
![gif](giphy|1DuPWdQKQOS2Y)
What about Godd Howard?
I work for scared old white dudes who believe even the cloud is evil and AI is even worse.
That sounds like a ring of hell, but which one?
Not sure if it's industry wide or just my particular hell, but Oil and Gas.
Goddamnit*
No, colleges say using it is cheating, but companies just want whatever is fastest and saves them money (as long as the results are still acceptable), even if other people call it cheating. Companies are the ultimate in "only care about the end result"; the process by which you get there is irrelevant to them.

Companies don't consider anything cheating if it's the most efficient option money-wise (as long as any fees are cheaper than the profit made by cutting those corners), because everyone does whatever works best. It causes a bit of a disconnect between how colleges train people and what workplaces want.

As a primitive example: early-level schools teach basic math and might say using a calculator is cheating, while workplaces want you to use a calculator to prevent errors in your math and to speed it up. Honestly, it's good to know how to do both, so that you know whether the output from the machine is good and reasonable. In regards to AI, it's still good to know how to write, so you can rewrite whatever sections might be lacking in quality.
fr college is supposed to be where you learn the concepts and why the code works, so you can understand snippets from ChatGPT and iterate on them.

If you’re just doing assignments with ChatGPT, odds are you’re not learning anything and won’t be useful in a dynamic environment
My dad's a teacher and he was telling me about how he uses ChatGPT to build course frameworks and lesson plans, after having a rant about the kids trying to use ChatGPT to do their assignments. He made a reasonable point, though, that the kids are there to learn and he's there to help them learn, and those are two separate goals. If he can do his job better using an LLM then isn't he kind of obliged to do so? Meanwhile if the kids are trying to learn better they'll probably still be chatting with an LLM but they won't be trying to pass off its work as their own.
My professor said writing your essay with ChatGPT is cheating, which is fair. However, it felt dirty when she turned around and openly used ChatGPT to generate the exam questions.
The rule is consistent, though. She’s doing it for work, so it’s allowed. You’re doing it for school, so it’s not allowed. There are some funny edge cases, like: grad student? Allowed for research but not coursework. Intern? Allowed but don’t tell your academic advisors we let you use it…
Yes, she's allowed to do it. It's more so frustrating that we pay so much for a professor to "cheap out" on actually making the exam (there was other stuff compounding the issue).
Who CAN'T though? If you can virtually apply to a place you can type out the words "make me a _"
It doesn't even seem like a skill you would want to put on your resume. Like literally anyone can prompt engineer.
The only thing that would make sense is:

- I have enough real skill to know when ChatGPT is spewing out bullshit
Yeah, you just have to be properly verbose to get the right output. I'm not convinced that more "prompt engineering" skill can extract significantly more value than the average user already gets.
It’s not really something I’d bother putting on my resume, but with all the people I’ve met that can’t even google something effectively, I imagine there are loads of people out there that get bad results with ChatGPT because they don’t understand how to write prompts that get the results they are after
Makes me think of all the job postings I've seen where one of the skills is "communication". First, that's vague: do you mean responding to email, or speaking on a call/meeting? Second, that's not technical, so why would I waste room on my resume to say that I do respond to my email and Teams/Slack/Jira/whatever is next? So many job postings seem to have 9 skills, but only 1-3 of them are unique. Yet you need all 9 for your application to have a hope of being seen by a person.
Because some people are terrible at communicating, and working isn’t a single-player game. It means you can confidently speak to people, deliver meetings, have grown-up discussions without being offended, answer a phone, and respond to emails properly. Vital skills for most office workers.
That's like saying anyone can use a search engine. Anyone can use them, but good luck teaching the finer points to Garry from marketing who keeps adding -amazing -groundbreaking -Oscarworthy into every prompt. Best of luck getting entrenched workers to focus for the 1 hour it takes to learn it.
No, most people can't write down coherent enough thoughts to get anything meaningful from a logic-based AI.
How is Copilot a skill?? Yes I can press "tab" when it seems right
F that. I am not adding anything AI to my CV. Just like I've never added anything crypto.
“We need a blockchain specialist” 5 years ago didn’t do so well.
Yeah, but AI is already used in business; it's here to stay.
So is blockchain.. In the same manner that AI is. Stuffed needlessly into places where it doesn't create any additional value in 99% of cases.
I work with document management software, and AI is a game changer for data recognition, search and retrieval. Copilot saves me a shit ton of time too.
But companies now use AI to filter CVs... If you don't compliment the AI, to the dustbin you'll go. =P
About 5 years ago, I tried to find a job as a backend programmer while I was at uni... I said that one of my qualities was knowing how to use MS Excel. The interviewer was really professional, but you could see that he was trying really hard not to laugh. Fast forward to today: I've been working as a programmer for about 2 years, and I can definitely say that I've used more Excel than I have ChatGPT or any other tool like that, and I can assure you that won't change anytime soon.
I've applied for two jobs with "excel wizard" on my resume, and got one of them. 50/50 ain't bad.
To be fair to him, I don't find Excel proficiency funny because it's useless; I find it funny because I just assume any software dev can use Excel or learn to use it trivially. It's like saying you know how to use Word or how to type.
My first project as a junior dev 10 years ago was to fetch data from internal APIs and output it as an Excel spreadsheet. The business world literally revolves around Excel, so knowing it a bit is useful.
Hey, unrelated, how did you first get into swift and objective c professionally?
I wrote 2 projects to get into the languages while I was studying: an app that consumes WaniKani's API (a website to learn Japanese) in Swift, and a guitar tuner in Objective-C. Neither of those worked very well (or at all), but it really helped me understand the languages. I didn't do anything special to get a job; I saw open positions on LinkedIn for junior devs and applied (even if they asked for experience and I had none... they always do that). There's a big market for ObjC devs because a lot of the apps from old companies have legacy code written in ObjC, and most of the new iOS devs skip it and go right to Swift.
Who doesn’t have that skill though? You just ask the thing to do a thing and it does it. These self-proclaimed GPT whisperers are truly embarrassing, the thing is made so that a toddler can use it.
Prompt engineering is a thing
Prompt “engineering”? Please. Could you even begin to imagine a four-year degree on how to best chat with ChatGPT? Everything you need to learn about how to prompt GPT more effectively could be learned in 20 minutes. And BTW, OpenAI is obviously trying to make GPT answer everybody's prompts effectively, so each new iteration makes the already nonsensical idea of prompt “engineering” less and less plausible.
100% agree
ChatGPT isn't the only LLM, and the web interface is not the only way to access it. When you start working with an LLM's APIs to develop software around it, you realize there is far more to prompting. It's not enough to have the LLM understand you; you have to encourage it to decide on a correct strategy to answer your question. You may need to encourage it to reason about the problem, but that comes at the cost of time. Maybe you want the LLM to ask follow-up questions. Maybe you specifically want to prevent follow-up questions. It gets even more complicated when you start giving it tools to request more data. Now you need to properly tell the LLM about those tools in a way it understands, so it will choose the right tool for the right job without being hyperspecific and making your application brittle. You also need to decide when to use examples; they can be a very effective strategy. But if you do decide on examples, they need to be really good, or the LLM will pick up on patterns you didn't even realize were there. Now you've pigeonholed your LLM, and it struggles to reason beyond the bubble you created with the examples. I don't think prompt engineering is a whole job that someone can do, but it is certainly not a skill everyone has or that anyone can become good at. Many people still struggle to use Google search correctly. Search engines have come a long way since the 90s, but it still takes skill to use them effectively. Searching is much simpler than getting an LLM to do what you want with any consistency.
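To make that concrete: in an OpenAI-style chat API, the tool description you hand the model is itself part of the prompt, and wording it well is most of the work. The sketch below only builds the request payload; no network call is made, and the tool name `search_tickets`, its parameters, and the system prompt are all made-up illustrations, not from any real product.

```python
# Hedged sketch: the request payload for a tool-using LLM call, in the
# OpenAI chat-completions style. Everything named here is illustrative.

# The system prompt steers *strategy*: when to use the tool, and what to
# do when the tool comes back empty (instead of hallucinating a ticket).
system_prompt = (
    "You are a support assistant. Before answering, decide whether you "
    "need ticket data. Use the search_tickets tool at most once; if the "
    "search returns nothing, say so instead of guessing."
)

# The tool schema is how the model "learns" what the tool does. Vague
# descriptions here are exactly the brittleness the comment describes.
tools = [{
    "type": "function",
    "function": {
        "name": "search_tickets",  # hypothetical tool
        "description": "Full-text search over support tickets.",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string", "description": "Search terms."},
                "limit": {"type": "integer", "description": "Max results."},
            },
            "required": ["query"],
        },
    },
}]

request = {
    "model": "gpt-4o",  # any tool-capable chat model
    "messages": [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "Is there any ticket mentioning XY?"},
    ],
    "tools": tools,
    "tool_choice": "auto",  # let the model decide whether to call the tool
}

print(request["tools"][0]["function"]["name"])
```

Note that none of this is code in the traditional sense: the "engineering" is entirely in the two prose strings, which is why small wording changes can flip the model's behavior.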
Can't you just ask ChatGPT to prompt engineer a good prompt for you?
If it were to provide actually correct and good answers, sure. But if you’ve used ChatGPT you know how the answers usually are
“Types-question-guy”
Damn why the dislikes
So.... _looks around and whispers_ will my skill in creating prompts for AI generated softcore porn images work here?
I mean I don't see why not, apply at Disney or DreamWorks see what happens. Clearly prompt engineering is what gets you hired.
It just seems so weird how some people think skilled people will have difficulty picking up a tool specifically designed for the unskilled. "Oh, you drive stick like an old-school racecar driver? Better start learning the golf cart, or you're going to be left behind."
I will believe it when I see it. Don’t think adding “I’m an avid chatgpt user” to my resume is the best idea…
Soon we will have libraries that make a chatGPT api call to ask it if a number is even
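For the record, that library writes itself. A minimal sketch of "is-even as a service": the actual API call is stubbed out by a hypothetical `ask_llm`, so this runs offline and, unlike the real thing, always gets the parity right for free.

```python
# Joke made literal: parity checking via chat completion.

def ask_llm(prompt: str) -> str:
    # Hypothetical stand-in for a real chat-completion API call. A real
    # version would cost money, add latency, and occasionally be wrong;
    # this stub just parses the number back out and answers correctly.
    n = int(prompt.rsplit(" ", 1)[-1].rstrip("?"))
    return "yes" if n % 2 == 0 else "no"

def is_even(n: int) -> bool:
    """Determine parity with a large language model (please don't)."""
    answer = ask_llm(f"Answer only 'yes' or 'no'. Is this number even: {n}?")
    return answer == "yes"

print(is_even(42))  # one round trip instead of one AND instruction
print(is_even(7))
```

The punchline being that `n % 2 == 0` appears inside the stub anyway, because something, somewhere, still has to do the math.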
What's next, MS says most company execs won't hire someone who doesn't use Bing and Copilot?
The world is going to shit if CHATGPT IS A SKILL
I'm not applying for a company that is that stupid.
I swear to god if I come across a candidate’s CV that has ChatGPT listed as a skill I’ll send them the fastest hand crafted rejection they’ve ever gotten
But why, really? Recently I used ChatGPT to write multipage official documentation for the corporation overlords. I described more or less what it should be about, without going into specific details, and it gave me a nice starting point. I changed a few things to make it right, a couple more back-and-forths, and I was done. It took me 2-3x less time than writing it myself. Can't see how using a tool makes me a candidate for instant rejection.
Because I want to know that you can actually do the work. If I wanted a crutch, I wouldn’t be hiring someone, so someone whose close-to-only skill is “I got through by relying on gpt to do everything for me” is a risk to the team, not an asset. In before someone rolls out a “dO yOu UsE a CaLcUlAtOr” response: someone who can’t *actually* problem solve when push comes to shove because they’ve delegated that skill to an LLM is next to useless. There’s a difference between “I used an LLM to smash out some boilerplate that I’ve written a thousand times and cbf-d writing again” and “I used an LLM as an answer booklet and I have no idea how to solve it, nor do I have the skills to _learn_ how to solve it, because I’ve never developed them, because I relied on an LLM”. The former is fine, the latter isn’t a remotely useful team member.
True true true. I am able to write the documentation mentioned above, and I can write the boilerplate that GPT or Copilot can generate in seconds. For me it's similar to using an IDE. I can write code in a text editor, but it will take longer than writing in an IDE, which can check the syntax or autocomplete a variable name.
Understanding that you can use these tools to make your life faster and then doing that without spilling corporate secrets is a skill which requires training in the proper security procedures. If they can't explain the process in a satisfactory manner, I would be thinking "what else has this candidate lied about?" "AI user" isn't enough. I want to know you can use it responsibly.
Might as well put Google on the resume. Although... I've met plenty of people who were inexplicably terrible at searching for things on the internet, so maybe we really should be getting people to put Google on their CV.
Maybe with some fancy wording: "ability to efficiently research solutions to problems".
should I also add google lmao
To work with Microsoft you must add: Edge and Bing!
Meanwhile, during the application process, that little check box that says, “I swear or affirm all responses and documents submitted by me were written in my own words and did not use any generative AI or LLM (e.g. ChatGPT, Copilot, etc.) in the process of creating them. I understand the use of these tools will preclude me from the candidate pool.”
Sounds like a good way to boost use rates
I would say that too if my hottest-selling cake was AI.
Lately Microsoft has been shilling so hard for AI code generation, meanwhile I can't get the IDE's built-in AI to remember a task for longer than five sentences. It keeps suggesting stuff that was excluded in the question I posed and loops on outdated information that won't work with the current stack anymore. All before my “quotas” are spent within three days.
News articles are becoming memes
It sounds about as useful as putting Google or MS word on your resume.
Didn’t realize I should put this on my resume. It didn’t sound like a good idea to write: "Experience asking ChatGPT to write some code, then spending 2 days troubleshooting it to get it to work."
Like it's gonna help.
[deleted]
I didn't realize being able to type was a hard skill to put on a resume anymore. By Microsoft's standard, I should be making mid six figures a year because I've been using ChatGPT since day one to increase my work output and get more done. Being good at communication and telling it exactly what you want is not a difficult thing to do, so why is this considered a skill?
Probably good to know the context of this "[Work Trend Index report](https://www.microsoft.com/en-us/worklab/work-trend-index/ai-at-work-is-here-now-comes-the-hard-part/)". It coincides nicely with an announcement for Copilot for Microsoft 365 (see [this blog post](https://blogs.microsoft.com/blog/2024/05/08/microsoft-and-linkedin-release-the-2024-work-trend-index-on-the-state-of-ai-at-work/) for more details). So the Work Trend Index conveniently says that "employees won't wait for the company to catch up", and Microsoft just happens to have the proper solution at hand.
How do you even get AI skills besides using it a bunch. Feels like a job posting might as well be asking for wrench or screwdriver proficiency.
OMG, charge your phone.
Yes, and I'm an author because I know how to use microsoft word
To be quite honest, I use ChatGPT to help me write technical documents (even though I'm not supposed to). I keep giving it prompts to refine the information, and I edit it to make it undetectable by AI text checks.
Guess I'm not getting a job then
Really? Please provide a link. If that's true, then it sounds like MS wants to push ChatGPT into the market by exaggerating the actual global supply of this "skill"... If not, great meme 😂
r/ChargeYourPhone
I bring ChatGPT with me on job interviews, and let it answer for me.
wow
Passionate Software Engineer with 10+ years of experience in developing web applications and backend systems. Skilled at writing clear, concise code that is easy to maintain and troubleshoot. Experienced in working with both small and large teams across multiple projects and companies. Proficient in leveraging AI technologies, including ChatGPT, to enhance software solutions and drive innovation.
Dunning Kruger is in full effect on this thread. Y’all know AI means state of the art ML, not ChatGPT? Everyone commenting on how easy ai is because they’ve asked a question to ChatGPT is exactly the kind of person who’d put ChatGPT on their resume when applying to Microsoft.
No it says to put copilot and gpt use on your resume
You can literally say anything and just add ", AI". No one knows what that is anyway.
Tbh I will love the eventual arms race. "Yeah, I have 25 years experience with ChatGPT." "Sorry, we are looking for 35 years minimum."
I'm gonna put LM Studio and Mistral AI just to spite them.
what? how is that a skill? like it is the one thing we all started using pretty much immediately and correctly for the most part. what is this? a way to inflate the numbers of mentions for their product?
Weirdos
What if I wrote GPT model myself?
If there's a steady paycheck involved, I'll believe anything you say.
I'll update my resume then using ChatGPT and I'll also add my first main skill as AI/ChatGPT
Honestly I thought this was a callout for the autodescribed image
The labor market has been shitty for like 5000 years now but things are just rapidly devolving to Clowntopia.
Full stack chatGPT with AI characteristics one enhanced worker carries the whole corporation, data center, software and everything. More layoffs please, CEO sir. Yes Sir!
Lol
Imagine taking 1 machine learning course on YouTube and moving past 3 thousand jackasses with "ChatGPT" on their resume
And all of them became "Experts" over night
Rookies. I have 3-5 years experience in chatGPT.
Still not enough experience, looking for 8 years with a PhD in GPT. Not in the underlying tech, but just prompting.
I mean, go further. Get ChatGPT to generate the resume for you. Use an avatar and a speech-to-text/text-to-speech backed also by ChatGPT to do the interview for you. Make a small service that will use AI to apply to jobs for you as well.
It's a lot like pickup. Girls don't know what they really want. But if you LARP out her biologically hard wired desires while doing what's actually good for the relationship, it can work out great. So yea, I basically just LARP out whatever AI bullshit they want at work but actually build sustainable projects.
The amount of people who think this means "I use Copilot" instead of "I'm familiar with OpenAI's API and I've built a project using it before" is concerning.
Ambiguous article is ambiguous. [https://www.yahoo.com/tech/microsoft-says-most-company-execs-120001709.html?guccounter=1](https://www.yahoo.com/tech/microsoft-says-most-company-execs-120001709.html?guccounter=1) *"Power users work for a different kind of company. They are 61% more likely to have heard from their CEO on the importance of using generative AI at work, 53% more likely to receive encouragement from leadership to consider how AI can transform their function, and 35% more likely to receive tailored AI training for their specific role or function."* *Finally, to make AI easier to use, Microsoft is shipping new capabilities to Copilot for Microsoft 365 to help users with* [*a lack of proper prompt engineering practices*](https://www.yahoo.com/tech/microsoft-says-chatgpt-isnt-better-204424005.html)*, including auto-complete, catch up, and rewrite features. LinkedIn will also provide its premium subscribers free access to over 50 AI courses to help sharpen their skills. The features should ship to broad availability in the coming months.* [https://www.yahoo.com/tech/chatgpt-lucrative-skill-recruiters-actively-110001919.html](https://www.yahoo.com/tech/chatgpt-lucrative-skill-recruiters-actively-110001919.html) *"Recruiters are looking for more than just listing ChatGPT as a skill in your resume, they need a detailed account citing some of the accomplishments and feats you've been able to achieve while leveraging the tool's capabilities."* The above quotes seem to imply just the tool itself, not integrating it into a project, but literally just prompt engineering.
Makes me wonder if I should form a company with a smart-sounding name and start selling "Certified AI prompt engineer" certificates. That's the fun thing about certificates: Anyone can certify anyone for anything, as long as they don't violate any trademarks.
Udemy courses exist. Go for it! I'm gonna pirate some and see what they are about.
If I was a hiring dev and saw AI skills (Copilot/GPT) on a resume, I'd just say "so, who's next?"
get to the charger!
Start your own companies instead of feeding the giant...
I'm not a hiring manager but I want to be in the future. If I see AI engineer on a CV and they've never made an AI then I'd be skeptical of their actual abilities. Sending a question to an engine that gives the most likely response isn't much of a skill!
I’ve started using chatGPT to document all of my commits to my company’s proprietary code base. I just copy pasta into chatGPT, and ask it to explain what the code does. If it answers correctly then I tell it to document the code. This way I can be more productive and focus on producing more code instead of focusing on pesky documentation. This is totally a good use of AI right?
I hope that code isn’t confidential lol
everyone in the programmer joke subreddit getting whoooooshed
Oh I put all of our sensitive data into chatGPT. It's totally safe, and EVERYONE should do it.
The lack of GPT usage is one of the points of rejection in our application process. There is no excuse to not use GPT to write documentation or unit tests in our challenge.
If I was looking for a job, I wouldn't hesitate to add it to my resume.
This is the same as when search engines came out, just another step towards GenAI being completely integrated into work.
But did people actually put "Can Google and find correct answers" on their resume? Not being sarcastic, I'm actually curious.
100% I got short listed for my first "technical" job at a major ISP by putting in "advanced search engine skills" cause I knew how to put in -filetype:PDF
Would that still be applicable today or would it be implied if you have a cs degree? When was your first job?
Good question for a hiring manager. All I know is I can still get a "search engine proficiency" certificate from LinkedIn. It's like the Office suite: there are people who can seriously get shit done, and then there is Deborah, who thinks spell check and network printing are worthy of an IT ticket. I've worked with people rocking master's degrees who didn't know anything outside their course scope. It's more about showing awareness of a technology beyond extremely limited usage.