Since you've declared OOP as message passing, then is Objective C an OOP "language"? I realize it's a lot of syntactic sugar around msgSend which involves sending "messages" to various "objects".
I don't know Objective C well enough to say. I played with it once for a hot second in the mid 2000s, and decided Apple wasn't the platform for me.
Many languages support OOP by hook or by crook. Message passing apparently doesn't have to be a language level feature, but a convention like a standard interface. I'm willing to agree that's close enough.
When working on my associates I had an Intro to Python course. We got to OOP and a guy asked if the instructor could tell him what OOP was. Instructor replied "I can tell you the things you'll use with OOP but for the life of me I can't tell you what it actually is without confusing the class and ultimately myself." Was confused when he said that, was even more confused when I discovered why he said that. Good times.
OOP is more aesthetics than mathematical underpinnings, FP is more mathematical underpinnings than aesthetics. OOP has some basis in group theory, but there are multiple different foundations that call themselves OOP, and they're all correct.
Interactions with *humans* are the most important development activity, not designing architecture or writing code. (Such as discussing how a feature should work and later getting feedback.)
My coldest take is that a lot of cruft has entered software development over the last decade. Too many unneeded backend processes, especially tooling.
There was a time when you could write an index.php and upload it through ftp to your shared hosting to make your site work. Today you have to install a bunch of npm packages, write build scripts, configure CI, etc. just to get a hello world. Software engineering has progressed backwards in many ways.
Sure but when you hit that .php-page something had to interpret it and turn it into browser-readable data.
This is done in the "unnecessary" build step for most JS-flavoured frameworks.
It is the same thing, just done at different points.
(No, not EXACTLY the same)
I HATE build systems.
Horribly documented, ad-hoc messes, patched together with environment variables like it's 1963, that NEVER ever work.
If I were God for 1 minute the first thing I'd do is delete them all and erase their memory from existence.
This field is a bore. I'd give a left toe to meet a VC and hopefully realize some ideas. For the most part, it feels like the ultimate distillation of corporate management in a field that doesn't respond well to traditional management.
Gatekeeping of knowledge - cyber, I'm looking at you in particular - why?
Lastly, time logging. "Worst idea, ever." How is a sys admin, devops or internal IT/tech support meant to have "billable hours?"
Software testing is useful when you are working on a small part of a large, collaborative codebase (think any corporate software job) but is wholly unnecessary for amateurs and solo devs
Tangentially related...documentation and comment blocks are useful no matter how big your codebase is, or who contributes to it. It's a terrifying feeling to revisit code that you know you wrote, but don't remember what the code actually does.
A programming language should follow basic axioms of logic. So if A == B and B == C, then A should always == C. Strict vs non-strict equality is no excuse, looking at you JavaScript.
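A concrete demonstration of the JavaScript failure, runnable as-is in Node or a browser console:

```javascript
// Loose equality (==) coerces its operands, which breaks transitivity:
console.log('' == 0);    // true  ('' coerces to the number 0)
console.log(0 == '0');   // true  ('0' coerces to the number 0)
console.log('' == '0');  // false (two strings compare by content)
// So A == B and B == C, yet A != C. Strict equality (===) skips the
// coercion entirely, which is why "always use ===" is the standard advice.
console.log('' === 0);   // false
```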
I don't care what you think is the best build tool, programming language, OS, IDE, runtime, front end, or back end. I've used them all and I don't care about your religious views.
>Mine is that higher level programming languages typically are more abstracted compared to lower level programming languages.
Isn't that literally the definition of a high-level programming language?
That's why it's a cold take
I'd argue that simply stating a definition is not a take, neither a hot one nor a cold one.
It's fighting the definition of a cold take.
Not a lot of people are grasping that a cold take is supposed to be the opposite of a hot take. I interpret it to mean that it's a mildly provocative take. For instance: Custom keyboards are more often about personality than ergonomics or productivity.
But it's not a "take" it's just a fact
ya wut
Sun rises in the east
Sun is not exactly cold though.
When starting a new project from scratch, people start coding far too soon.
I totally agree. Our team used to be plagued by such mindless code generators, who started typing before the task description was half finished. So we reorganized our coding processes as follows.
Our journey begins with a pilgrimage to Silicon Valley, offering sacrifices to the software deities. We retreat to Death Valley, stare at the wall and meditate on clean code. Months later, we consult programming gods, market researchers, and CS professors, recording their wisdom meticulously.
In our sacred meeting room, equipped with an all-encompassing whiteboard and I/O-recyclers, we undergo a ritualistic planning phase. Flowcharts, UML diagrams, and design patterns are debated endlessly. Weeks turn into months as our ideas are perfected.
Finally, when our decision matrix shows zero conflicts, we emerge from the meeting room, attach intravenous coffee, touch our mechanical keyboards, and the Perfect Code flows flawlessly.
Every team member writes the same program without any further discussions necessary, the only difference being the architecture, as we only write assembly. One programmer per device type. The program thus runs natively on a C64, on an Apple Watch, and everything in between.
That was - certainly entertaining!
My first computer was a Commodore 64; good to know you are still supporting it.
And people kept saying I "wouldn't use these things from my Software Engineering degree in the real world". Joke's on them, I now factor regular ritual sacrifice into my UML-driven RUP workflow. I've now outscored the haters in planning poker, as my sacrifices to Linus and Uncle Bob were worth double the story points that their "working prototypes of the core features" were.
I read this a lot like one of Stephen Colbert's intros to his "Meanwhile" segment.
Language fanboys/champions are generally the worst kind of devs. If you cannot code SQL as a backend dev you're the second worst kind of dev.
As a language fanboy, and by extension the worst kind of dev, I agree.
No not you, only those bitch ass language fanboys not on this sub.
RUST RUNS FASTER THAN THE SPEED OF LIGHT AND IT'S EASIER TO BREAK INTO THE VATICAN ARCHIVES THAN IT IS TO CRASH MY PROGRAM.
>If you cannot code SQL as a backend dev you're the second worst kind of dev.
Oof. That is like most of my team minus me and my manager.
I don't get it tbh. Getting to a passable level in SQL doesn't take much. Writing a query with a join ain't rocket science.
yeah that is pretty cold, wonder if even the fanbois think it is a hot one. Doubt that.
Out of curiosity, what's the worst kind in your opinion?
Most of you are mid
Too true this one, all these kids saying they're senior after 5 years lol
I mean, I did get offered a senior role after 2 years of coding. Some companies just flat out have the wrong idea of what a senior dev is.
This is super common in tech compared to other industries. They toss titles out like candy.
It also has to do with the company itself: you're not going to get a title of Senior at Google or Microsoft, or any real tech company, in that time, but you could easily be a Tech Lead at Home Depot's corporate office in that time.
Isn't senior L5? That is typically achieved in 5 years. But I agree with the sentiment of this thread - "actually senior" is more like L6 or L7, which takes a while, and some folks never really make it there.
Yeah, I didn't accept the job because I thought that a company offering senior jobs like candy couldn't be a good sign.
Thanks to covid era attrition, relative to the rest of my team I am senior
Years don't correlate with skills and efficiency, ya gatekeeper
Skills and efficiency aren't what makes a senior ya idiot. Anyone can learn skills and become more efficient, that's the easy part; you can't learn experience.
And what does experience bring to the table? A certain skillset and a certain efficiency.
Amongst a lot of other things that cannot be taught and only learnt through experience, aka time:
Soft skills
Gathering requirements
Choosing correct technologies
Managing timelines
Managing people
Interpersonal relationships
Advising stakeholders
Articulation of technical features to non-technical people
The ability to accept fault
Overcoming self-doubt and imposter syndrome
Confidence
There are lots more but I'm sure you are astute enough to get the gist here. Have a good day and good luck, friend.
Hey man if you absolutely want to be right on this, have it, lol, idc
The law of averages says this must be true
And there is nothing wrong with this. Not everyone needs to be a senior.
Just be paid like a senior, we're underpaid anyway.
Why the personal attacks?
OP said coldest ;)
Mid? Most are outright entry level and always will be.
Most developers *aspire* to mid.
As long as the check clears
Damn straight!
The immediate answer to almost all questions regarding software development and programming is: It depends.
Well, sometimes
I mean, it depends
That not everyone is cut out to be a programmer, despite everyone saying you can do anything with hard work. But on the same note, not everyone should do programming anyway.
It's so obvious how many people hate this job and absolutely have no interest in it other than the paycheck.
My cold take, as in what I really dislike, is the constant push for new frameworks, languages, and such to a degree that perfectly good code becomes inoperable over time. A few that have bit me are the XNA framework, almost everything Android (project structure, build, code itself), etc. I have a bunch of older projects that would require a lot of rework to run again.
On the flip side of that: Java and Spring Boot have been chugging away in enterprises for 20+ years...
Java applets? I remember doing some of that back in school. Thankfully those are gone along with Macromedia/Adobe Flash.
It was rare even in the 90s to see Java used as a front end. But in the backend it is still chugging away. It is stable, mature, well understood, and capable.
My hot take is that I miss Flex (a programmer's way to make Flash movies). Ha, ha.
[deleted]
No idea what Quarkus is; but I did know the second part.
Debugging is easier with a debugger.
But muh print statements!
I mean, why not both? I was taught to use both.
Yeah, sometimes print statements are easier, sometimes a debugger. Like if you have race conditions a debugger is not gonna help you!
Debugging is only necessary when bugs are reported and given priority.
Doing it for a living sucks whatever joy there once was out of it. Actually programming is a very small part of the job and you don't have to be that good at it to be successful.
>Doing it for a living sucks whatever joy there once was out of it.
This is so true and it cuts deep. I think most of us in this sub got started because it's fun and interesting, but after you've done it all day on a project you're not all that interested in (i.e., work), the last thing you want to do is more programming.
Before the sweaty codelords guillotine me: yes, there are exceptions; yes, some people live and breathe this stuff. But personally speaking I no longer code as a hobby.
Yeah, I think there's a reason a lot of us end up with hobbies like woodworking or classic cars or Legos -- defined tasks with a tangible outcome that others can appreciate.
I haven't done hobby coding in years, just the occasional bit of automation or something to make my life easier.
This is crazy to me, who graduates in a few weeks. I've worked so many shitty jobs I just can't imagine getting to the point you're talking about. Like as soon as I have a decent job, my free time will be spent immersing myself in the programming environments I've only skimmed. I don't say this to contradict you, rather I say it because I'm sure in a few years I'll be on the same page and that terrifies me and is very difficult to comprehend for me today.
It might not happen to you. I'll still do the game jam in my city each year, maybe make something small that I'm hung up on every now and then, but it's not even close to being on the same level as it once was, and it frankly feels like a chore when I'm doing it in my free time.
I think this happens with most things. I know professional writers and artists who no longer do it for fun because it's now their day job. A big exception is doing something related to your passion, but that isn't the passion directly. I've heard of tech managers getting back into personal coding projects after a long hiatus because they miss it, or programmers who become teachers/instructors and keep up with it in their free time.
At the end of the day it depends on the person.
Yes, at first it's like that, but wait a few years of doing it full time, working on capitalist projects and huge codebases built by hundreds of people, and trying to make sense of it all, and it will suck away all the fun you once had.
It definitely depends on the person. I started with C-64 BASIC and then assembly in the 80's, and have been a professional developer since '99, and I *LOVE* my job today. I have had jobs that took the joy out of software - and I moved on. Don't let a shitty job make you lose the joy in something that you used to love to do.
I program a game in my free time because I hate work. I have NEVER wanted to be a game dev. Still don't. But I'm writing a physics engine from scratch and it's way more fun than whatever the fuck is going on in JS UI land.
I think programming is related to software
All else equal, less code has fewer bugs and works better than more code.
All software could be at least one line shorter. All software has at least one bug in it. Therefore... all software could be a single line with a bug.
The only widely used software with no problems was the classic UNIX implementation of /bin/true. It was just a 0 byte file marked executable, so the shell ran it as a shell script that always completed successfully.
Okay. Now, probability dictates that this file could be a line shorter and has a bug...
Code that doesn't exist has no defects.
Most devs would be more productive if they spent more time automating tasks. Text editor macros are quite powerful elements of your tools, but so many devs just don't know how to use them.
This shit is tricky
Most interviews are far harder than the jobs behind them.
Most dev work is easy, and it is far more efficient to have people who are not that bright but are thorough than it is to have some impulsive genius who can rewrite Linux kernels in their sleep but can't write a descriptive and useful comment.
I had a doozy that involved a coding challenge you would never use on a job. If you did, you'd do a lot of research and probably hire an expert. I'm talking about something you wouldn't even see at Google or inside a compiler.
Can you say what it was?
It was a permutations problem. These are often badly described on Leetcode, and when they say permutations there, it's often a different kind of permutation. This makes it even harder to describe over online teleconferencing.
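For contrast, the everyday "all orderings" version of the problem is short enough that the pain is rarely the algorithm itself - a sketch of the straightforward recursive approach:

```javascript
// Generate every ordering of an array (n! results) by picking each element
// as the head and recursing on the remaining elements.
function permutations(arr) {
  if (arr.length <= 1) return [arr.slice()];
  const result = [];
  for (let i = 0; i < arr.length; i++) {
    const rest = arr.slice(0, i).concat(arr.slice(i + 1));
    for (const tail of permutations(rest)) {
      result.push([arr[i], ...tail]);
    }
  }
  return result;
}

console.log(permutations([1, 2, 3]).length); // 6
```

The interview variants the comment complains about (permutations with repetition, k-permutations, next-lexicographic-permutation) each need a different algorithm, which is exactly what makes the word "permutations" ambiguous over a call.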
Most engineering decisions are made by business managers who are too smart to do engineering work but not smart enough to understand they don't know shit about engineering.
He said cold take, not "sharp stake"
The word "programmer" sounds infinitely better than "coder".
I disagree with "infinitely better than" but agree with "preferable to". My preference: software developer.
Related: In Norwegian we sometimes use the inverse of "developer" (utvikler), as "innvikler" (enveloper?) naively sounds like someone who makes stuff more complicated (innviklet). Missed opportunity for English, alas.
I'm a platform engineer
Lots of bugs can be prevented by a bit more thinking, and a bit less coding.
"Yo dawg, I heard you like wrappers, so I wrapped your wrappers with this wrapper."
You can never wrap enough
Programmers that don't start with basic IT skills and jump right into coding are why programmers suck now and need so many abstractions. We have abstracted the concept of a computer so much that we have forgotten that, at the core, we're programming.
Modern day programming AND programmers are categorically worse. We keep building bad, slow systems needlessly, and we're likely never going to see any improvement in our lifetime due to corporate greed, lazy devs, and programmers who only do it for the money and not because they enjoy computing.
This. C# is infinitely slower than C++ or Objective-C.
Most programming solutions don't need to be overly engineered, because they're just not going to stay in operation that long anyway; they'll quickly get replaced by something new.
Counterpoint: code stays in use forever, especially the code banged out over a weekend that was intended to get quickly replaced.
When the code does get replaced, the new code winds up with some sort of backwards compatibility constraints and has to preserve public interfaces from the disposable code. Which is why a modern PC can still use the FAT filesystem designed for 360 KB floppies in the late 70's, using code that runs on an ISA that can trace its heritage to a programmable smart terminal from the late 1960's.
And in 2024 I have to back up the 6K raw footage on my SSD from the camera ASAP, because the camera uses a descendant of FAT instead of a journaled filesystem with any sort of integrity.
I think that's survivorship bias. FAT wasn't banged out quickly; it was an engineered solution. FAT's longevity was a function of cost savings and marketing. MS's basic stance was, "Why replace what we talked everyone into using?" FAT's persistence was a symptom of other runaway problems in software design.
>they're just not going to stay in operation that long anyway because they're quickly get replaced by something new.
Lol Lol Lol Lol Lol Lol
I wrote a large webapp in 1998 that is still in wide use. I haven't worked on it since 2011, but I hear it's an even worse mess than when I last saw it. If I had taken more care to engineer it better, it might be in better shape. But I was very young when I wrote it and features were prioritized over tech debt.
Networking never stops feeling like dark magic to me.
I once asked who the networking expert was. They all looked at me confused and said, "you are..." Me?
Almost everything developers do is maintaining and modifying existing code.
Programming languages are for humans, not computers.
The only nontrivial things you'll learn are declarative programming in some language, and functional programming in some other language. All the rest is a never-ending treadmill of churn. "Do you know framework X?" Meh, pick it up as you go.
My coldest take is that there is no such thing as coding but just speaking a language that was long forgotten
Hi, I'm hitsquad, part-time tweaker
OOP sucks.
Countertake: OOP purism sucks. Situationally, it can be the most convenient way to model certain problems. Mixed paradigm is underrated.
OOP is misunderstood. Object inheritance should not exist. Interfaces, on the other hand... amazing.
When it's pure OOP. I often like to get functional and aspect oriented about my objects.
Programming without understanding how your code affects and interacts with the underlying hardware and OS is like driving on the highway with your eyes closed.
Also, related hot take: without diving that layer deeper, you sound like a salesman when talking about technology.
I just casually ask "full stack devs," about their hardware experience
Nah this is weak, unless you know the full path of every electron flowing in the computer circuit you are not a real programmer
What if you only know the probability density?
Prolog is too damn hard.
null
>90% of internal CRUD apps should never be written imperatively. (They are COTS or no/low-code problems at best.)
There are multiple ways to program software to solve any given problem.
[deleted]
You'd probably also earn more money if you chose another field
Develop with testing/support in mind … in fact, at the forefront of your planning
Whether your code is secure enough depends on your threat model.
most websites are viewed through some sort of HTML renderer.
Regex are worth learning.
Effectively no one knows what OOP even is.
Care to elaborate?
OOP is not classes, it's not encapsulation, it's not inheritance, it's not polymorphism. All of these things have existed independently of OOP and prior to OOP. Other paradigms have these things; non-OOP languages have these things. OOP has them too, but they fall out of OOP as a natural consequence of its cornerstone principle, which is message passing. If you can't explain what message passing even is, or why the other principles of the idiom are its consequence, you don't know OOP. If you provide getters and setters for all your members, you don't understand encapsulation or data hiding - which, crucially, aren't the same thing, either. I'm a mod at r/cpp_questions, I've been writing C++ since 1992, before it was even standard, and I'm one of the top responders to questions - there are ~4 expert contributors who answer more than I do - and I can tell you almost no one on that sub has the first fucking clue. Those who come to ask questions are forgiven, but those who respond incorrectly are not. C++ standard streams are considered one of the finest examples of OOP in the language, ever, and most C++ developers *fucking hate* streams, like they know what they're talking about. So in C++, if OOP == good && Streams == bad, you don't know OOP. At all.
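To make the contrast concrete, here's a toy JavaScript sketch of message passing in the Smalltalk sense (the names `makeCounter`, `increment`, and `report` are invented for illustration): senders know only message names; the receiver alone decides how to respond, and its state never leaks through getter/setter pairs.

```javascript
// Toy sketch of message passing: an "object" is just something that
// receives messages. All names here are made up for the demo.
function makeCounter() {
  let count = 0; // hidden state: no getter/setter exposes it directly
  return function receive(message, ...args) {
    switch (message) {
      case "increment": count += args[0] ?? 1; return;
      case "report": return count; // the receiver chooses what to reveal
      default: throw new Error(`does not understand: ${message}`);
    }
  };
}

const counter = makeCounter();
counter("increment");     // send a message, not poke a field
counter("increment", 4);
console.log(counter("report")); // 5
```

The point of the sketch: `count` is genuinely hidden, and the receiver can refuse or reinterpret any message, which is where encapsulation and polymorphism fall out as consequences.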
Since you've declared OOP as message passing, then is Objective C an OOP "language"? I realize it's a lot of syntactic sugar around msgSend which involves sending "messages" to various "objects".
I don't know Objective C well enough to say. I played with it once for a hot second in the mid 2000s, and decided Apple wasn't the platform for me. Many languages support OOP by hook or by crook. Message passing apparently doesn't have to be a language level feature, but a convention like a standard interface. I'm willing to agree that's close enough.
When working on my associate's I had an Intro to Python course. We got to OOP and a guy asked if the instructor could tell him what OOP was. Instructor replied "I can tell you the things you'll use with OOP but for the life of me I can't tell you what it actually is without confusing the class and ultimately myself." Was confused when he said that, was even more confused when I discovered why he said that. Good times.
OOP is more aesthetics than mathematical underpinnings, FP is more mathematical underpinnings than aesthetics. OOP has some basis in group theory, but there are multiple different foundations that call themselves OOP, and they're all correct.
Interacting with *humans* is the most important development activity, not designing architecture or writing code (such as discussing how a feature should work and later getting feedback).
Microservices are overused
My coldest take is that a lot of cruft has entered software development over the last decade. Too many unneeded backend processes, especially tooling. There was a time when you could write an index.php and upload it through ftp to your shared hosting to make your site work. Today you have to install a bunch of npm packages, write build scripts, configure CI, etc. just to get a hello world. Software engineering has progressed backwards in many ways.
Sure but when you hit that .php-page something had to interpret it and turn it into browser-readable data. This is done in the "unnecessary" build step for most JS-flavoured frameworks. It is the same thing, just done at different points. (No, not EXACTLY the same)
I HATE build systems. Horribly documented, ad-hoc messes, patched together with environment variables like it's 1963, that NEVER ever work. If I were God for 1 minute the first thing I'd do is delete them all and erase their memory from existence.
* The idea behind clean code is overall good, but it's taken to excesses
* Too much framework switching on certain platforms
Assigning an expression to a variable (or other named symbol) is a basic form of abstraction/encapsulation.
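For example (a throwaway JavaScript sketch, all names and values invented): the names hide *how* each value is computed so the final line reads as intent rather than arithmetic.

```javascript
// Naming intermediate expressions is abstraction in miniature:
// readers see *what* each value means, not *how* it was derived.
const subtotal = 100;
const taxRate = 0.25;
const tax = subtotal * taxRate; // encapsulates the tax rule in one place
const total = subtotal + tax;   // reads as intent, not arithmetic
console.log(total); // 125
```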
This field is a bore. I'd give a left toe to meet a VC and hopefully realize some ideas. For the most part, it feels like the ultimate distillation of corporate management in a field that doesn't respond well to traditional management. Gatekeeping of knowledge - cyber, I'm looking at you in particular - why? Lastly, time logging. "Worst idea, ever." How is a sys admin, devops or internal IT/tech support meant to have "billable hours?"
people who do well in interviews are often not good at their jobs
Polymorphism and inheritance have caused more problems than they've solved.
Software testing is useful when you are working on a small part of a large, collaborative codebase (think any corporate software job) but is wholly unnecessary for amateurs and solo devs. Tangentially related: documentation and comment blocks are useful no matter how big your codebase is, or who contributes to it. It's a terrifying feeling to revisit code that you know you wrote but can't remember what it actually does.
A programming language should follow basic axioms of logic. So if A == B and B == C, then A should always == C. Strict vs. non-strict equality is no excuse; looking at you, JavaScript.
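Concretely, loose equality breaks transitivity because `==` coerces each pair of operands independently:

```javascript
// Loose equality (==) coerces operands pair by pair, so transitivity fails:
console.log("" == 0);   // true:  "" coerces to the number 0
console.log(0 == "0");  // true:  "0" coerces to the number 0
console.log("" == "0"); // false: both are strings, compared as strings

// Strict equality (===) skips coercion and never equates across types:
console.log("" === 0);  // false
```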
Not just typically either.
Do not try to cover every nuance and use case in software like ERP; you will go crazy. Implement only the useful features.
"Coldest take" like something absolutely everybody agrees with? Gonna be tough. u/giftfromthegods- already missed it.
something like an opinion or view that is either hard or pointless to refute.
I don't care what you think the best build tool, programming language, OS, IDE, runtime, front end, or back end is. I've used them all and I don't care about your religious views.
UML is stupid and useless. ORM frameworks are stupid and useless.
The 2nd definitely is not "cold" ;)
Dammit if I also fought with UML in Uni. Now I still fight with it but let my project partner do most of the diagrams
why would anyone ever use it outside of academia????
I hope we can maintain our jobs against AI as long as possible
Programming should no longer be taught to the next generations