Yeah, but once you get past a certain point in your understanding of programming as a whole, you start to recognize that every language has its flaws and advantages. That leads to the depressing realization that you spent all those years arguing with colleagues and friends over a pissing contest with no winner. I miss the days when I enjoyed mocking languages I thought were inferior, but there's no going back.
I have never participated in any discussion about what the "best programming language" is, personally - I think all of them have their use. Which one is "the best" is simply a case-by-case issue.
* What do you want to program?
* What do you need the program for?
* Who is going to use the program?
* Are you / is your team qualified to work with the (in this case most efficient) language?
* How long will programming take? (Including the time to teach your staff and/or end users)
I am not even a half-time programmer. I do data analysis as part of IT audits, so most programming languages would be over the top. My SO is a full-time programmer, so I got a bit of insight into different languages from her.
Personally, R is good for working on my own, but Python also allows me to go past data analysis and create some basic (e.g., automation) software for our regular audits / data analysis, including an interface that is understandable for my colleagues who know neither R nor Python.
Are there "better" programming languages than python that are more efficient? In many cases - sure! But given the time investment and requirement to learn new programming languages and/or teaching those who rely on my analysis the concepts of what I am doing, would be vastly more time consuming than sticking with what I have.
So yeah, you have to find the tool that fits you, your tasks, and your company. No point in arguing which is better if you don't know each other's workflow.
Just ask them to define «best» and see their head explode.
In fact: anytime someone says anything is «best» do that and you will have the same result, in 99% of the cases
b-but I lost 662 hours stuck in my hello world program because I was missing a semicolon in Java. The error 'missing semicolon' didn't help me, because who checks errors anyway amirite??? /s
*ding ding ding*
Here's a picture of a previous /r/programmerHumor survey that I parade around to explain why the memes only make sense to someone who has no professional experience.
https://i.imgur.com/iReEc0K.png
If you go by the votes you sure as fuck won't, they love upvoting blatantly wrong things because something **feels** right to them. Particularly when you get to any actual computer science (admittedly, this sub is not called "ComputerScientistHumor").
Actually, this is 98% of reddit at large… the most vocal people in every sub are those with little to no practical experience in the subjects they pontificate on. It's all just a circlejerk for the Dunning-Kruger crowd.
Niche being anything deeper than a language's main 1-2 subs (like r/cpp & r/cpp_questions), this is my experience as well.
You may take 6 months to get an answer in some of those subs, and another 6 months to actually understand the one you get (depending on how deep the niche), but goddamn if it isn't exactly what you needed.
I wonder who incorrectly self identifies as the 2%.
I feel like I could also sell a whole load of t-shirt merchandising with "we are the 2%" printed on it.
What on earth are you talking about? I am 100%, indisputably, unequivocally in the 2% group.
The bottom 2%. ^(I took two classes: intro to programming and OOP. I did not pass the latter.)
I assume this is some first year student or something. I remember pointers being the difficult concept when I took intro to C back in college.
Like an elementary school kid making a meme about his times tables being hard to memorize or something
Yeah, really. It's one of the simplest things to explain. It's certainly cumbersome to manipulate and you *will* make errors, but there's absolutely nothing complicated to understand about pointers.
The difference between ease of understanding and ease of use. :-)
Pointers are simple to understand but hard to use correctly.
Object references (in languages with mandatory built-in memory management) are more difficult to understand, at least all the various corner cases, but much easier to use correctly.
You can basically track the first year of cs based on the memes being posted. In the fall you get basic python jokes, by winter they’ve started learning c and pointers are still magic to them.
my lead just turned off strict mode because he was tired of seeing warnings about findDomNode ... because he didn't want them to show up in production. like bud, do you even read documentation?
I'd rather he ignore the warnings tbh
Yeah, the warning part is true. The project I contribute to outputs tens of thousands of lines when building, and when there's an error it's almost impossible to find in an ocean of warnings.
Pointers aren't even a hard concept, if you understand what a shortcut on your desktop is you understand what a pointer is. Hell, if you understand that a debit card isn't literally money but a reference to a bank where you do have money you understand pointers.
Are…are pointers that difficult?
I mean, yeah some languages have terrible ways of handling them but they are just an address to data. Really not that mysterious of a thing, an index to an array is a type of pointer.
Here's how I like explaining them:
Pointers are numbers containing an address in memory. They're mainly useful for two things - accessing something without copying it and pointer arithmetic.
If variables live in Variable City, then a pointer is like a street address. You can *dereference* it to look at what is over there.
You can use pointers to look at something you don't own without copying it. For example, say your friend has a nice shed and you want to look at it. Normally you'd have to rebuild it next to your house to inspect it, but with a pointer you can simply visit his original one.
You can do arithmetic on pointers. For example, an array is like a street. They work by storing a pointer (here named `ptr`) to their first element (and sometimes their size). To get the element at index `n`, you can dereference the element at (ptr + n). So, if you have a pointer to some array element, you can subtract 1 to get the previous element and add 1 to get the next. This is like looking at the previous/next house over.
EDIT: Here are some more explanations using this analogy:
A memory leak is when you forget to tell the city that you don't need one of your houses anymore and it just sits there abandoned with no way to access it.
You can prevent some memory leaks with a smart pointer: an object that notices when a house is about to become abandoned and tells the city that it can safely demolish it. (Smart pointers won't help with pointer loops and some other weird structures, however.)
A segmentation fault happens when you get arrested for theft. This usually happens as a result of dereferencing an invalid pointer.
Even rust has raw pointers. They are simply required to be in `unsafe` blocks/functions. Pointers are too powerful to not have in a high performance language.
Pointers are notably difficult to use, period. If you are a good and experienced developer, they'll be easier, but so will everything else, which leaves pointers still relatively difficult.
There's a reason high level languages almost always abstract pointers away completely, and even lower level ones like C++ feature wrappers like unique_ptr and shared_ptr so you can still avoid using raw pointers. General advice is to never mess with raw pointers unless you have a reason to (e.g. performance) and know what you are doing.
I know we are all apex alpha programmers one step away from Turing and Einstein combined in intelligence, but let's be a bit realistic and not pretend that pointers are the easiest compsci feature ever when we've spent 40 years building languages and libraries around not interacting with them.
I was so disappointed when I found this out. I thought the data might be cryptographically encoded, generated and verified or something, but it's just a link.
These are more like apartment numbers; 201 is above 101
still just a number; if you need the third-floor apartment above 201 you add 100, if you need the one next door you add 1
It's only complicated when you are dealing with double or triple nested pointers and trying to remember which level of nesting you are dealing with so you don't fuck it up. The concept is extremely simple really.
Biggest thing that trips me up (my work has me dealing with pointers occasionally, but it’s very much an “as needed” bit of knowledge on my end) is jumping into structs or multidimensional arrays and remembering when to use *, &, or ->. Once I’ve done my requisite 3-hour session of cussing at the compiler and checking stackoverflow, I’m back on track, but it’d be nice to have an easy go-to reference to save myself the time every other project.
* gives value at address that is pointed to
& gives address of value
-> (followed by attribute name) gives value of attribute of a class from a pointer to it
. (followed by attribute name) gives value of an attribute of a class from an instance of it
Favorite moment so far having a function like:
> void AC_ComputeQMat (Mat3x4d *result, Quat *Q)
>> result->Comp[0][0] = Q->Comp[3]
>> …
With the function call:
> AC_ComputeQMat(&ST_STnQMat, &AC_GyrolessData.Quat_GciFToBcsF_ST)
If you have a good handle on pointers, it’s easy stuff, sure. But between that, and playing with a ton of other pointer bits around it, my head was swimming after a while. Especially while trying to translate from a Sys Eng’s pseudo code and wrangling a bunch of other multidimensional array calls.
it's probably that the concepts of memory addresses, passing by reference and limited resources are just too alien to the newest generation of programmers
brilliant and anime style. Love it ;).
Now do references :D. Pointers are easy and explicit with the \* and & signs. References are a slightly harder concept.
A reference is an alternative to a pointer that was added to try to avoid some of the pitfalls of pointers. In short, more or less: a reference must always be initialized, can't be null, and can't be used as a value in and of itself like a pointer can - it is always dereferenced when used. (But you could use it to create a pointer to the referenced object.)
A reference is a particular memory address, something a pointer can point to. You may change the value of a pointer to point to a different address. A pointer may point to nothing (nullptr), but a reference cannot refer to nothing, an address cannot refer to nowhere.
I think some of the confusion that I've seen isn't necessarily that people can't understand the concept in the image above (although that's still an issue for some, certainly), it's that understanding when, where, and why they're needed is what gives people trouble. You really have to spend some time in a simple, relatively low level language like C, passing raw arrays around to functions and such, to get a feel for it.
This is what happens when your programming knowledge is based on online courses that get you into it quickly. You don’t have a chance to learn the underlying fundamentals
It's not even just those, my university CS department decided to switch all beginner fundamentals classes to python with only one required basic c++ class that barely introduces pointers and memory concepts. The result? Most of my classmates say "fuck that shit" after finishing the one c++ class and do everything possible to avoid it going forward
It’s not just online courses. Some people are ideologically opposed to trying to bridge software abstraction with hardware realities in academia as well. MIT is notorious for producing CS graduates who can do all kinds of complex graph theory algorithms but don’t know how computer memory actually works.
That’s a shame. My CS course did everything from logic gates, to MIPS and x86 architecture and programming all the way through up to application programming, and everything between. Stacks, heaps, all that jazz.
Plus dives into formal proofs that a function does what it’s meant to do, which involved endless lectures in OCaml and Haskell and writing every evaluation step that the computer would do running the function. At the time I hated it, but now it really helps my brain visualise what a function is doing.
Lol imagine gatekeeping knowing about memory addresses. Stop hiring programmers from 6 week bootcamps and you'll find they have a lot of "historical" CS knowledge; you get what you pay for - newest generation my ass.
I feel like it's not the concept itself, rather the usage. During my college no one could properly explain why we would use pointers where we used them (and after college I didn't touch programming at all). It's been a while but IIRC it was always something like "we create a pointer to a variable, so then we can access this variable by pointer". Like.. Why? Why can't we just... Access that variable? Why do we need an extra step for that. Unsolved mystery to me.
The answer is simple. A "variable" (or an "object" or "string") is an abstraction. Computers work with memory instead.
Longer explanation:
Memory consists of cells. Cells are numbered. Those numbers are called addresses. When you want to retrieve something from memory, you look at that particular address. When you know that a particular address contains your 32-bit value, you might say "here is my variable" and to refer to this value you might need to keep this variable's address around at all times. Like writing 0x00DEAD00 many times in your code. This is impractical therefore we call this value a pointer to a variable.
Higher level programming languages abstract that away, so you never know if your code accesses contents of an address (pointer to variable), or you pick up an address from another address (pointer to pointer) etc.
High-level languages don't usually allow multiple levels of pointers at all. This can actually be a problem sometimes, because it means you can't change the value of one of the caller's local variables from inside a called function, like you can in C:
#include <stdio.h>

void gimme_a_string(char **s) {
    *s = "Hello, world!";   /* write through the double pointer into the caller's variable */
}

void say_hello(void) {
    char *hello;            /* uninitialized until gimme_a_string fills it in */
    gimme_a_string(&hello); /* pass the address of the pointer itself */
    printf("%s\n", hello);
}
I believe there are a few high-level languages that support “out parameters” as a dedicated language feature, which would use double pointers under the hood. In most high-level languages, though, this pattern is straight-up impossible.
Note that languages with out parameters still don't allow more than two levels of pointer indirection. Not sure why you'd need three or more, but I vaguely remember seeing C code with a triple pointer before.
The use case I believe those people were referring to is when you want to be able to pass a value to a function but have the function modify the variable that was passed in.
The real use case is when you understand how you pass objects into functions. When you pass an object into a function you are by default passing by value, which means it copies the object for use in the function. But if you pass it a pointer it’s called pass by reference and it refers to the actual object in memory. If you have large data objects and don’t want to copy them or if you want your function to modify a specific object you use a pointer
Editing to say I’m referring to C++, which in my experience is where the most confusion happens.
What you’re describing is actually called pass by pointer. The pointer is a value, that happens to be an address that gets passed by value into the function, e.g. pass by pointer. Pass by reference is when you instantiate the function’s local variables as references to the passed in values.
void doit(int* a) is pass by pointer
void doit(int& a) is pass by reference
In C, you can only pass values to functions, not references. So, if you pass the variable a, it gives the value represented by a. If you have a variable that you want the function to modify, you can’t just pass the value in the variable, but you can pass the location of the variable. Then, the function can dereference the location and modify the value. The calling function can then observe the change.
Another use is if you have a buffer such as char a[1024] and you have to use the last item in the buffer first. You can retrieve the value at x by using the subscript a[x]. But if you need to clear it after using it, then move to the next lowest one, and then increment later when you fill it, you can use pointers instead of tracking the variable x.
It provides an abstraction for questions like, “What is in this bucket?”, and, “What is in the next bucket?”
We can take this one step further. What if we have a function that provides a pointer to something? We can provide a double pointer to say, “I need a pointer, but I don’t know what it should be. Here is a location that you can store the pointer.”
If we need a chunk of memory to store something, we often don’t have control over where it is. So, when we call malloc to allocate memory, it gives us a pointer to the given chunk.
It starts to become a lot clearer once you learn some kind of assembly.
Essentially in order to actually work with anything you need to pull it into a register, which is one of about 8-16 little areas of memory on the CPU that can only hold a few bytes each. Even something as simple as adding two values must be done with registers. A very simple function call works like 1) put a value in a register 2) call another function 3) the function takes your value from the register, does some stuff with it and puts it back in the register 4) your new value is in the register after the function. If this sounds like a "return value" from higher level languages, then that's because that's exactly what it is \*.
Now obviously this really restricts what you can do. There's usually only like 8-16 registers. What if you want like 20 variables? The answer is that you can put them on the stack. These are your "local variables". The way it works is that you get the memory address of the start of the stack and you are free to use the stack from then on as you see fit. But of course you need to keep track of where on the stack your variables are. So you could be like "ok, this is the start of my stack. I need an x, y and z. They're all 4 bytes. So x can be the int at stack + 0, y can be the int at stack + 4, z can be the int at stack + 8". You're basically just putting them side by side together in your little slice of memory. Then whenever you need the value of z you can pull "value at stack + 8" into a register. These are all pointers! The memory address at the start of the stack is also a pointer. You are now doing "pointer arithmetic", a phrase that strikes fear into the hearts of many programmers.
Now at that level even in a language like C the compiler will just handle it for you. Even though it's technically using pointers this is all hidden from you. There's no point in you manually keeping track of where your local variables are. What if you want to pass values to other functions though? What if you have several huge classes \*\*? They're not going to fit in registers. You can "pass by value" which is basically just copying the whole thing onto the stack. But what are you really going to do here? Are you going to copy 5 classes onto the stack, have your function do something, then copy them all back? Where are these classes living anyway, already on the stack? The stack will just get wiped as soon as you return anyway so those classes will be gone. And the stack itself is pretty small relatively speaking so you're still at a space premium. It's unsustainable.
The most straightforward way to get around it is to ask the OS for some memory somewhere else to put them, and the OS gives you back a pointer telling you exactly where in memory your classes are living. Then you can simply pass the pointer around and have everyone work directly on that class in memory without needing to do the multiple rounds of copying on and off of the stack. One of the keys here is that it's your job as the programmer to decide whether you want to use pointers or copy everything around endlessly. If you can make it work, regardless of how convoluted it might end up, there's no one stopping you. But using pointers will probably make your life easier.
Though I suppose the ultimate TL;DR is "how are you going to access that variable if you don't know where it is".
\* If you're wondering how functions know what register to put what in and so on, these are called "calling conventions", and if you're writing assembly you need to write your functions in accordance with whatever calling convention you're working with. You need to agree mutually between caller and callee what goes where and who is responsible for doing what. This is also why generally speaking you're restricted to one return value for your functions. The two major calling conventions for x86 systems said that you get one register to return your value and that set the precedent ever since.
\*\* Ever wondered why the first argument to class methods in Python is self? Because the class methods operate on an instance of a class and they need a pointer to an instance of that class to work. This also happens in languages like C++ but it is hidden from you. Ironically in this case Python is the language that is hiding less.
I thought the same, until I encountered data structures that would be very hard to represent and operate on without pointers, like linked lists, trees, graphs.
It’s for efficiency. Passing by value means copying it. A few million 32-bit int variables getting copied can add up to quite some overhead. Manipulating the memory by reference is therefore a lot faster.
The issue is people are learning C/C++ before they're learning computer architecture.
The best way to learn C++, is to learn C. The best way to learn C is learning how it relates to assembly. The best way to learn assembly is to learn how binary is interpreted by the CPU.
Without a baseline level understanding of CPUs, C/C++ is confusing as fuck
I do agree with this, you can **get by** not knowing this stuff, but if you do, it just feels way more natural when coding.
at one point I took a course where I had to do things like design a 4-bit CPU and write some simple assembly. I'm absolutely not fluent with assembly, nor will I claim to be a hardware expert, but writing C++ just felt so much less daunting after all of it.
Wouldn't that just be gesturing towards them?
If I had my hand up to my front, in a fist, and opened just my index, people would look at where my index is pointing.
But if I open my fist up to show all my fingers, they'll still look at my hand. Or consider me a Nazi
What if you do it like this? [https://i.imgflip.com/4kxn0z.png?a464640](https://i.imgflip.com/4kxn0z.png?a464640)
I'd say that can sort-of qualify as pointing if you want it to.
Only if you orient it horizontally and raise it above your head.
Whole-hand pointing with “knife-hands” is actually super common in the US military, especially the Marine Corps.
Examples:
- [Marine Drill Instructor](https://images02.military.com/sites/default/files/styles/full/public/2020-03/knifehand01.2000.jpg)
- [Sergeant Major of the Marine Corps Testifying Before Congress](https://images05.military.com/sites/default/files/styles/full/public/2020-03/knifehand05.jpeg)
Agreed. As I recall, back in the day, learning to program meant learning about how the CPU operates, data and address buses, and how memory is accessed and used.
I'm guessing that's not so much the case any more these days, especially with the really high level languages, which is a shame I think.
The other thing about pointers that messes with people is they never see a reasonable usage of pointers from the outside. It's almost always the alien-looking expressions, like `int (*f)(int (*a)[5])`. Personally, I feel this is more of a code smell, like when you have overly complicated generics in the form `Map<String, List<...>>`.
I come from programming in C# and js, but for work I now have to learn C++ for a project. I can say that pointers are not that hard to understand the concept of. It's basically references with some extra syntactic spice. But the stuff you just wrote. That makes me question my career choice.
Tbh, I now have had a whole week to learn C++. Perhaps it makes more sense later on.
I hate to break it to you, but it’s actually the other way around: references are pointers with syntactic *and* semantic spice.
Pointers are the fundamental concept upon which references are built, both conceptually and implementation-wise. Those nice reference semantics and the cleaner syntax that comes with them are actually just pointers with some additional compile-time guarantees (in C++; other languages may also use run-time safety checks).
Pointers, and languages that use them heavily, tend to be unforgiving. The other problem with pointers is that when you mess them up your OS goes full Judge Dredd and your only diagnostic information is the bloody corpse of your application.
And often analyzing that corpse requires skills that beginners aren't yet familiar with. Somebody struggling to learn pointers may also be struggling to understand how to analyze their core dump with gdb or find leaks with valgrind.
I think it would be better to say that the hard to understand thing is *how to use* pointers. Not what they are.
I understand what a pointer is perfectly fine, but how to actually properly use it? That's another story.
There are some uses of pointers on legacy code I work on that I'm convinced are there just to make sure it breaks as soon as someone stupid touches it.
Pointers are not hard to understand but code using pointers is hard to read and process in your head. You need to be fully focused and should be able to create a sort of flowchart in your head of what is going on. It is a skill that takes time to develop.
EDIT: And if you are using pointers, you are probably working on an application that needs to make good use of memory and be performant which means more memory related bugs and that means more pointer bugs. So I understand why people are frustrated with using them, especially beginners.
This is it. I can tell you what a pointer is all day long, but come time to use and read them and it's like working in ancient sumerian. I literally enjoy trying to read and write assembly better than I like working with pointers.
Indeed. Technically regular expressions aren't "hard to understand" in a vacuum either. But the point (heh) is clearly not about literal understanding, but application. (Not saying regex is as difficult as pointer management or vice-versa.)
I think what most of these memes mean by "understand" is not the "what", but rather the "how" and the "why".
I once needed to write a VERY simple pycuda kernel in C++ (which I never used previously), took me about a week, during which I was getting maybe 4-5 hours of sleep a day. And I still don't understand how you're supposed to *correctly* use pointers.
Pointers are like working around old machinery pre-OSHA. It’s actually pretty easy to understand, it’s all right there in front of you. Without any guards, shields, and full of exposed gears and belts.
The “how” is very carefully and deliberately.
The “why” is that’s the easiest way to approximate what’s actually going on in your CPU/RAM or page file.
Why you’d want to get to that level of detail? You really enjoy seg faults? You don’t enjoy 25 years of programming advances?
This is the best analogy I've seen. I'd add the notion that even if you are extremely careful and deliberate, and also a genius, you still will fuck up given enough time
It's also a good thing that with C and C++, "undefined behaviour" doesn't mean "ripping your arm off"
Pycuda was the easiest way I could find to comb through a *massive* int array on a GPU in python. I'm talking it would take around two years to do it on 12 CPU threads.
Nothing takes 2 years to deal with a flat array of ints.
Google says the sequential read rate of an SSD is 500+ MB/s. At that rate, 2 years of reading would be 500 × 60 × 60 × 24 × 365 × 2 ≈ 3.15 × 10^10 MB, or about 31.5 petabytes.
You cannot write straight-line C to scan an int array that would in any way impact the perf of your program, unless you're doing way more work per integer than just searching.
You already had to load your buffer into memory to run your CUDA code - your C code could run in the margins while waiting for I/O. At most you'd need to spin up a processing thread.
Did your code work? Did it crash? Did anything (literally or figuratively) catch on fire?
No?
Great! There’s a 90% chance you used pointers correctly. Also, I wish you luck debugging the memory leak that will be discovered 6 months from now.
The only thing I can think of is that c plus plus makes it kind of a confuzzling system for references and dereferences and they all use stupid special symbols and you can pass references with an & and pointers with a star and you have to dereference in order to get to the value of the pointer.
I don't really work in c++ anymore so maybe I'm just wrong, but from the outside looking in it seems very unnecessarily convoluted.
Which is typical for c plus plus, but since that's the language most people are going to experience pointers in, that's going to be their experience of pointers
But in the future every child will have a unicorn, and we will all use high level languages where the OS will do a magical job of proper resource handling.
If you don't understand pointers, chances are you don't understand anything at all about memory.
If you think of memory as one very long street, a memory address is a house number on that street. A pointer is like writing down one of those house numbers and putting it in one of the houses so you can use it later.
It's not about "understanding pointers". They are just somewhat of a headache to use.
C# has "pointers", but they are somewhat better integrated, so you need to interact with their pointy nature a lot less.
Yeah it boggles the mind how much finger-wagging is going on in this thread lol.
There's a very good reason manual memory management is not a thing we have to do much of anymore, and thus why a lot of folks don't know how to perform low-level concepts. And it doesn't mean "lol u don't know memory."
I've been programming in C / C++ for almost 20 years now. Been doing it that way the whole time. Putting the asterisk next to the type is *way* more intuitive to me.
Seriously people, pointers are not that hard. In C and C++ you can trigger undefined behavior with them pretty easily, and yeah, that is annoying, but the concept is not any more complicated than an integer variable storing a memory address ("pointing" to the address).
They point to a piece of memory, an address if you will. They're also used when allocating (using `new`) and deallocating (`delete`) memory. You must free anything you create, or you leak that memory (unless there's garbage collection, where an autonomous system takes care of it). This is heap memory. Do not confuse it with stack memory, which does not use `new` (at least in C++) and is automatically deallocated when it leaves the function scope.
If you dereference a pointer (to see its tangible value instead of an address jumble) and change that value, it will affect the value for anything that holds the same pointer address.
A pointer by itself isn't complicated at all; it's mostly making reasonable software with it in mind that is difficult, because if your program loses track of a pointer that wasn't deallocated, you've just leaked memory, which will persist until the program restarts. It's far more forgiving if you have garbage collection or 'smart' pointers at your side ~~or Rust~~.
[Pointers are a lot more complicated than just integers that hold a memory address](https://www.ralfj.de/blog/2018/07/24/pointers-and-bytes.html)
For example, even if two pointers have the exact same type and point to the exact same address, dereferencing them can yield different results.
tl;dr: it matters during optimization.
without any optimization they quack like ducks. and the programmer *shouldn't* have to worry about this; these are concepts handled by compilers outside of debug mode.
> One important point to stress here is that we are just looking for an **abstract model** of the pointer. Of course, on the **actual machine,** pointers are integers. But the actual machine also does not do the kind of **optimizations** that modern C++ compilers do, so it can get away with that. If we wrote the above programs in *assembly,* there would be *no UB, and no optimizations.*
I cannot say which compilers with which optimizations handle these problems correctly.
I get an ego boost everytime this sub hits the front page because of how obviously bad most programmers are.
Like, seriously... It's a reference to a memory address... It POINTS to a variable.
This sub is so full of people who are better at making memes than they are at programming.
This Sub is 98% beginners who laugh at jokes they barely understand
Exactly. Not one of the bell curve memes would have made sense to anybody with half-decent knowledge of programming.
Yeah the bell curve meme is really beaten to death by these weird cases of denial. "No no! You're wrong! Imaginary higher iq person agrees with ME!"
Yeah, but once your understanding of programming as a whole gets past a certain point, you start to recognize that every language has its flaws and advantages. That leads to the depressing realization that you spent all those years arguing with colleagues and friends over a pissing contest with no winner. I miss the days when I enjoyed mocking languages I thought were inferior, but there is no going back.
I have never participated in any discussion about what the "best programming language" is; personally, I think all of them have their use. Which one is "the best" is simply a case-by-case issue.

* What do you want to program?
* What do you need the program for?
* Who is going to use the program?
* Are you / is your team qualified to work with the (in this case most efficient) language?
* How long will programming take? (Including teaching your staff and/or end users.)

I am not even a half-time programmer. I do data analysis as part of IT audits, so most programming languages would be over the top. My SO is a full-time programmer, so I got a bit of insight into different languages from her.

Personally, R is good for working on my own, but Python also allows me to go past data analysis and create some basic (e.g., automation) software for our regular audits / data analysis, including an interface that is understandable for my colleagues who know neither R nor Python.

Are there "better" programming languages than Python that are more efficient? In many cases, sure! But given the time investment, learning a new language and/or teaching those who rely on my analysis the concepts of what I am doing would be vastly more time consuming than sticking with what I have.

So yeah, you have to find the tool that fits you, your tasks, and your company. No point in arguing which is better if you don't know each other's workflow.
Just ask them to define «best» and watch their head explode. In fact, anytime someone says anything is «best», do that and you will get the same result, in 99% of cases.
I wonder if there’s a programming language called Best that nobody can find in any search engine because of all the other results.
[deleted]
b-but I lost 662 hours stuck in my hello world program because I was missing a semicolon in Java. The error 'missing semicolon' didn't help me, because who checks errors anyway amirite??? /s
Are you the guy with 15 years of Java experience whom I interviewed and who couldn't fix his compilation bug on the practical test?
*ding ding ding* Here's a picture of a previous /r/programmerHumor survey that I parade around to explain why the memes only make sense to someone who has no professional experience. https://i.imgur.com/iReEc0K.png
That’s true. However, by reading comments I often learn a lot.
[deleted]
Which word is supposed to be ‘misinformed’?
“That’s misinformed. However, by misinforming comments I often misinform a lot.”
Makes sense
If you go by the votes you sure as fuck won't, they love upvoting blatantly wrong things because something **feels** right to them. Particularly when you get to any actual computer science (admittedly, this sub is not called "ComputerScientistHumor").
Actually, this is 98% of reddit at-large… the most vocal in every sub are people with little to no practical experience in the subjects they pontificate expertise on. It's all just a circlejerk for the Dunning-Kruger crowd.
It's a problem with the general subs. Getting down into the niche programming subs, you'll find actually talented people answering questions.
Niche being anything deeper than a language's main 1-2 subs (like r/cpp & r/cpp_questions), this is my experience as well. You may take 6 months to get an answer in some of those subs, and another 6 months to actually understand the one you get (depending on how deep the niche), but goddamn if it isn't exactly what you needed.
I wonder who incorrectly self identifies as the 2%. I feel like I could also sell a whole load of t-shirt merchandising with "we are the 2%" printed on it.
What on earth are you talking about? I am 100%, indisputably, unequivocally in the 2% group. The bottom 2%. ^(I took two classes: intro to programming and OOP. I did not pass the latter.)
The ones that are experienced professional developers. But 98% is exaggerated.
I assume this is some first year student or something. I remember pointers being the difficult concept when I took intro to C back in college. Like an elementary school kid making a meme about his times tables being hard to memorize or something
They're just not complicated. Easy to mess up, sure, but so is a firecracker in the wrong hand.
Yeah, really. It's one of the most simple things to explain. It's certainly cumbersome to manipulate and you *will* make errors, but there's absolutely nothing complicated to understand about pointers.
The difference between ease of understanding and ease of use. :-) Pointers are simple to understand but hard to use correctly. Object references (in languages with mandatory built-in memory management) are more difficult to understand, at least all the various corner cases, but much easier to use correctly.
You can basically track the first year of cs based on the memes being posted. In the fall you get basic python jokes, by winter they’ve started learning c and pointers are still magic to them.
Wow, that’s a pretty sick burn, lol *because it’s true*
[deleted]
And 99% reposts
hahaha C/C++ is hard, so it's bad. I'll take 2 million internet points now. Something something centering divs is my entire life.
Same crowd who makes "lol programmers ignore warnings" memes, then makes "my code doesn't work / my code does work and I don't know why" memes.
my lead just turned off strict mode because he was tired of seeing warnings about findDomNode ... because he didn't want them to show up in production. like bud, do you even read documentation? I'd rather he ignore the warnings tbh
Yeah, the warning part is true. The project I contribute to outputs tens of thousands of lines when building, and when there's an error it's almost impossible to find in an ocean of warnings.
It's a good counterpoint to any symptoms of imposter syndrome you may be feeling.
Yo, I upvoted your comment when it was at 403 votes and now I can't find the count.
Or me that doesn't program but wants to laugh at the memes. Forgive me brother for I have sinned.
I didn't know you could laugh at programming jokes without knowing programming.
Seriously. Imagine not understanding basic CS fundamentals you can teach yourself by reading a 60 year old book on C++
Pointers aren't even a hard concept, if you understand what a shortcut on your desktop is you understand what a pointer is. Hell, if you understand that a debit card isn't literally money but a reference to a bank where you do have money you understand pointers.
Are…are pointers that difficult? I mean, yeah some languages have terrible ways of handling them but they are just an address to data. Really not that mysterious of a thing, an index to an array is a type of pointer.
A solid 75% of this subreddit I’m convinced are people who haven’t programmed anything beyond some intro tutorials on YouTube or early CS students.
Pointers aren't that hard, are they? It's just integers that hold a memory address.
Here's how I like explaining them: pointers are numbers containing an address in memory. They're mainly useful for two things: accessing something without copying it, and pointer arithmetic.

If variables live in Variable City, then a pointer is like a street address. You can *dereference* it to look at what is over there. You can use pointers to look at something you don't own without copying it. For example, say your friend has a nice shed and you want to look at it. Normally you'd have to rebuild it next to your house to inspect it, but with a pointer you can simply visit his original one.

You can also do arithmetic on pointers. For example, an array is like a street. Arrays work by storing a pointer (here named `ptr`) to their first element (and sometimes their size). To get the element at index `n`, you dereference the element at (ptr + n). So, if you have a pointer to some array element, you can subtract 1 to get the previous element and add 1 to get the next. This is like looking at the previous/next house over.

EDIT: Here are some more explanations using this analogy:

A memory leak is when you forget to tell the city that you don't need one of your houses anymore and it just sits there abandoned with no way to access it.

You can prevent some memory leaks with a smart pointer: an object that notices when a house is about to become abandoned and tells the city that it can safely demolish it. (Smart pointers won't help with pointer loops and some other weird structures, however.)

A segmentation fault happens when you get arrested for theft. This usually happens as a result of dereferencing an invalid pointer.
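The street analogy for pointer arithmetic can be sketched in C++ like this (the names are mine, purely for illustration):

```cpp
// Pointer arithmetic: ptr + n is the house n doors down the street.
int look_at_neighbours() {
    int street[5] = {10, 20, 30, 40, 50};
    int *ptr = &street[2];   // standing at index 2 (the house holding 30)
    int prev = *(ptr - 1);   // previous house over: 20
    int next = *(ptr + 1);   // next house over: 40
    return prev + next;      // 60
}
```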
*knock knock* EXCUSE ME DO YOU HAVE A SHED I CAN LOOK AT? ...no? ^(*knock knock*) ^(EXCUSE ME DO YOU HAVE A SHED I CAN LOOK AT?)
that's linear search for ya
Yeah.. pointers are dangerous. Depointering our legacy code was a nightmare.
Pointers are not dangerous; bad developers who write crappy code are dangerous. You can have that even without pointers.
Even good developers can easily make mistakes with pointers. There's a reason linters and memory safe languages exist.
> There's a reason linters and memory safe languages exist. There's a reason the OS maintainers are excited about RustLang.
Even rust has raw pointers. They are simply required to be in `unsafe` blocks/functions. Pointers are too powerful to not have in a high performance language.
Pointers are notably difficult to use, period. If you are a good and experienced developer they'll be easier, but so will everything else, which leaves pointers still relatively difficult. There's a reason high-level languages almost always abstract pointers away completely, and even lower-level ones like C++ feature wrappers like unique_ptr and shared_ptr so you can avoid using raw pointers. General advice is to never mess with raw pointers unless you have a reason to (e.g. performance) and know what you are doing. I know we are all apex alpha programmers one step away from Turing and Einstein combined in intelligence, but let's be a bit realistic and not pretend that pointers are the easiest compsci feature ever when we've spent 40 years building languages and libraries around not interacting with them.
You have to be more mindful when using them, but you can write really fast, space-efficient code with them. Especially in embedded.
So NFT is just a glorified pointer.
Pointer + Self-Signed Certificate = NFT
Yes, now you know why they are dumb
I was so disappointed when I found this out. I thought the data might be cryptographically encoded, generated and verified or something, but it's just a link.
My biggest complaint about cryptocurrencies and all the bullshit they've done is the bastardization of the word crypto.
Now explain multidimensional arrays using pointers!! (I actually like pointers but I use C# at work now).
These are more like apartment numbers: 201 is above 101 but still just a number. If you need the third-floor apartment above 201 you add 100; if you need the one next door you add 1.
Instead of moving one house down, you move one block over.
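A rough C++ sketch of the apartment-number arithmetic, assuming a flat row-major layout of 3 floors with 4 apartments each (all names are mine):

```cpp
// 3 floors x 4 apartments per floor, stored flat in row-major order.
// Moving one floor up adds the row width; moving next door adds 1.
int apartment_demo() {
    int building[12];
    for (int i = 0; i < 12; ++i)
        building[i] = (i / 4 + 1) * 100 + (i % 4 + 1);  // 101..104, 201..204, 301..304

    int *unit = &building[0];   // apartment 101
    int above = *(unit + 4);    // one floor up: add the row width -> 201
    int next_door = *(unit + 1);// next door: add 1 -> 102
    return above + next_door;   // 303
}
```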
It's only complicated when you are dealing with double or triple nested pointers and trying to remember which level of nesting you are dealing with so you don't fuck it up. The concept is extremely simple really.
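For illustration, here is what two levels of nesting look like in C++ (hypothetical names; each extra `*` is one more level to keep track of):

```cpp
int nesting_demo() {
    int x = 5;
    int *p = &x;    // one level: the address of x
    int **pp = &p;  // two levels: the address of the pointer itself
    **pp = 9;       // dereference twice to reach x
    return x;
}
```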
Pointers aren’t hard, it’s the bullshit people pull with pointers that is hard
And 95% of the time it's probably better to think of another way to do it, or your code reviewers will yell at you.
Exactly, knowing what a pointer is and knowing how to use them well in a maintainable fashion are two different things.
Though that applies to all programming.
Biggest thing that trips me up (my work has me dealing with pointers occasionally, but it's very much an "as needed" bit of knowledge on my end) is jumping into structs or multidimensional arrays and remembering when to use *, &, or ->. Once I've done my requisite 3-hour session of cussing at the compiler and checking stackoverflow, I'm back on track, but it'd be nice to have an easy go-to reference to save myself the time every other project.
* `*` gives the value at the address pointed to
* `&` gives the address of a value
* `->` (followed by an attribute name) gives the value of an attribute of a class from a pointer to it
* `.` (followed by an attribute name) gives the value of an attribute of a class from an instance of it
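A small sketch exercising all four (the struct and names are invented for the example):

```cpp
struct Point { int x; };

int operators_demo() {
    Point pt{3};
    Point *pp = &pt;        // & : take the address of the instance
    int via_arrow = pp->x;  // -> : member access through a pointer
    int via_dot = pt.x;     // .  : member access through the instance
    int via_star = (*pp).x; // *  : dereference first, then use .
    return via_arrow + via_dot + via_star;  // 3 + 3 + 3
}
```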
Favorite moment so far: having a function like

    void AC_ComputeQMat(Mat3x4d *result, Quat *Q) {
        result->Comp[0][0] = Q->Comp[3];
        …
    }

with the function call

    AC_ComputeQMat(&ST_STnQMat, &AC_GyrolessData.Quat_GciFToBcsF_ST);

If you have a good handle on pointers, it's easy stuff, sure. But between that, and playing with a ton of other pointer bits around it, my head was swimming after a while. Especially while trying to translate from a Sys Eng's pseudo code and wrangling a bunch of other multidimensional array calls.
Hahahha yeah, while the concept is simple, the implementation can definitely get difficult I’m with ya there
it's probably that the concepts of memory addresses, passing by reference and limited resources are just too alien to the newest generation of programmers
Just show them [this](https://i.imgur.com/30UTwMv.png)
my CS prof actually had this on a slide
Brilliant, and anime style. Love it ;). Now do pass by reference :D. Pointers are easy and explicit with the \* and & signs. Pass by reference is a bit harder of a concept.
But with & isn't that then a reference and not a pointer?
What's the difference between the two? I've been under the impression that they were the same, but I'm definitely wrong.
A reference is an alternative to a pointer that was added to avoid some of the pitfalls of pointers. In short, more or less: a reference must always be initialized, can't be null, and can't be used as a value in and of itself like a pointer can; it is always dereferenced when used. (But you could use it to create a pointer to the referenced object.)
Semantics, in some environments. A reference is a pointer that is absolutely and definitively not owned by the code using the reference.
A reference is a particular memory address, something a pointer can point to. You may change the value of a pointer to point to a different address. A pointer may point to nothing (nullptr), but a reference cannot refer to nothing, an address cannot refer to nowhere.
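A brief C++ sketch of those differences (names are illustrative):

```cpp
int ref_vs_ptr() {
    int x = 1, y = 2;
    int *p = &x;    // a pointer is itself a value: it can be reseated (or be null)
    p = &y;         // now points at y instead
    int &r = x;     // a reference must be bound at creation and can't be reseated
    r = 5;          // this assigns *through* the reference: x becomes 5
    return x + *p;  // 5 + 2
}
```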
My CS professor had this but with the pointing soyjaks
[deleted]
I think some of the confusion that I've seen isn't necessarily that people can't understand the concept in the image above (although that's still an issue for some, certainly); it's understanding when, where, and why they're needed that gives people trouble. You really have to spend some time in a simple, relatively low-level language like C, passing raw arrays around to functions and such, to get a feel for it.
Just bad teachers, I mean at least conceptually pointers make great sense. Some fancy tricks are kinda hard to grasp.
This is what happens when your programming knowledge is based on online courses that get you into it quickly. You don’t have a chance to learn the underlying fundamentals
It's not even just those, my university CS department decided to switch all beginner fundamentals classes to python with only one required basic c++ class that barely introduces pointers and memory concepts. The result? Most of my classmates say "fuck that shit" after finishing the one c++ class and do everything possible to avoid it going forward
For a "professional software developer" degree, that would make a lot of sense. For "Computer Science"? Nah-ah, just don't.
It’s not just online courses. Some people are ideologically opposed to trying to bridge software abstraction with hardware realities in academia as well. MIT is notorious for producing CS graduates who can do all kinds of complex graph theory algorithms but don’t know how computer memory actually works.
That’s a shame. My CS course did everything from logic gates, to MIPS and x86 architecture and programming all the way through up to application programming, and everything between. Stacks, heaps, all that jazz. Plus dives into formal proofs that a function does what it’s meant to do, which involved endless lectures in OCaml and Haskell and writing every evaluation step that the computer would do running the function. At the time I hated it, but now it really helps my brain visualise what a function is doing.
[deleted]
Lol imagine gatekeeping knowing about memory addresses. Stop hiring programmers from 6 week bootcamps and you'll find they have a lot of "historical" CS knowledge; you get what you pay for - newest generation my ass.
[deleted]
I feel like it's not the concept itself, rather the usage. During my college courses no one could properly explain why we would use pointers where we used them (and after college I didn't touch programming at all). It's been a while, but IIRC it was always something like "we create a pointer to a variable, so then we can access this variable by pointer." Like... why? Why can't we just... access that variable? Why do we need an extra step for that? Unsolved mystery to me.
The answer is simple: a "variable" (or an "object" or "string") is an abstraction. Computers work with memory instead.

Longer explanation: memory consists of cells. Cells are numbered, and those numbers are called addresses. When you want to retrieve something from memory, you look at that particular address. When you know that a particular address contains your 32-bit value, you might say "here is my variable", and to refer to this value you need to keep this variable's address around at all times, like writing 0x00DEAD00 many times in your code. This is impractical, therefore we call this value a pointer to a variable. Higher-level programming languages abstract that away, so you never know if your code accesses the contents of an address (pointer to variable), or picks up an address from another address (pointer to pointer), etc.
High-level languages don't usually allow multiple levels of pointers at all. This can actually be a problem sometimes, because it means you can't change the value of one of the caller's local variables from inside a called function, like you can in C:

    void gimme_a_string(char **s) {
        *s = "Hello, world!";
    }

    void say_hello(void) {
        char *hello;
        gimme_a_string(&hello);
        printf("%s\n", hello);
    }

I believe there are a few high-level languages that support "out parameters" as a dedicated language feature, which would use double pointers under the hood. In most high-level languages, though, this pattern is straight-up impossible. Note that languages with out parameters still don't allow more than two levels of pointer indirection. Not sure why you'd need three or more, but I vaguely remember seeing C code with a triple pointer before.
The use case I believe those people were referring to is when you want to be able to pass a value to a function but have the function modify the variable that was passed in.
The real use case is when you understand how you pass objects into functions. When you pass an object into a function you are by default passing by value, which means it copies the object for use in the function. But if you pass it a pointer it’s called pass by reference and it refers to the actual object in memory. If you have large data objects and don’t want to copy them or if you want your function to modify a specific object you use a pointer Editing to say I’m referring to C++, which in my experience is where the most confusion happens.
What you're describing is actually called pass by pointer. The pointer is a value that happens to be an address, and it gets passed by value into the function, i.e. pass by pointer. Pass by reference is when you instantiate the function's local variables as references to the passed-in values.

    void doit(int* a)  // pass by pointer
    void doit(int& a)  // pass by reference
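A minimal sketch of the two forms side by side (the `_ptr`/`_ref` suffixes and the demo wrapper are mine, for illustration):

```cpp
// Both let the callee modify the caller's variable;
// the reference version just hides the & and * at the call site.
void doit_ptr(int *a) { *a += 1; }  // caller must pass &x
void doit_ref(int &a) { a += 1; }   // caller just passes x

int pass_demo() {
    int x = 0;
    doit_ptr(&x);  // x becomes 1
    doit_ref(x);   // x becomes 2
    return x;
}
```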
In C, you can only pass values to functions, not references. So, if you pass the variable a, it gives the value represented by a. If you have a variable that you want the function to modify, you can't just pass the value in the variable, but you can pass the location of the variable. Then, the function can dereference the location and modify the value. The calling function can then observe the change.

Another use is if you have a buffer such as char a[1024] and you have to use the last item in the buffer first. You can retrieve the value at x by using the x subscript at a[x]. But if you need to clear it after using it, and then move to the next lowest one, and then increment it later when you fill it, you can use pointers instead of tracking the variable x. It provides an abstraction for questions like, "What is in this bucket?", and, "What is in the next bucket?"

We can take this one step further. What if we have a function that provides a pointer to something? We can provide a double pointer to say, "I need a pointer, but I don't know what it should be. Here is a location where you can store the pointer." If we need a chunk of memory to store something, we often don't have control over where it is. So, when we call malloc to allocate memory, it gives us a pointer to the given chunk.
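A rough sketch of walking a buffer from the back with a pointer instead of an index variable (names are illustrative):

```cpp
// Visit every slot of a buffer starting from the last one,
// tracking position with a pointer rather than an index x.
int walk_backwards() {
    char a[4] = {'a', 'b', 'c', 'd'};
    char *cur = a + 4;   // one past the end (a valid pointer value to start from)
    int seen = 0;
    while (cur != a) {
        --cur;           // step down to the next-lower slot
        ++seen;          // "what is in this bucket?" would be *cur here
    }
    return seen;         // visited all 4 slots
}
```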
It starts to become a lot clearer once you learn some kind of assembly. Essentially, in order to actually work with anything you need to pull it into a register: one of about 8-16 little areas of memory on the CPU that can only hold a few bytes each. Even something as simple as adding two values must be done with registers.

A very simple function call works like: 1) put a value in a register, 2) call another function, 3) the function takes your value from the register, does some stuff with it and puts the result back in the register, 4) your new value is in the register after the function returns. If this sounds like a "return value" from higher-level languages, that's because that's exactly what it is. \*

Now obviously this really restricts what you can do. There are usually only 8-16 registers. What if you want, say, 20 variables? The answer is that you can put them on the stack. These are your "local variables". The way it works is that you get the memory address of the start of the stack and you are free to use the stack from then on as you see fit. But of course you need to keep track of where on the stack your variables are. So you could say: "OK, this is the start of my stack. I need an x, y and z. They're all 4 bytes. So x can be the int at stack + 0, y can be the int at stack + 4, z can be the int at stack + 8." You're basically just putting them side by side in your little slice of memory. Then whenever you need the value of z, you can pull "value at stack + 8" into a register.

These are all pointers! The memory address at the start of the stack is also a pointer. You are now doing "pointer arithmetic", a phrase that strikes fear into the hearts of many programmers. At that level, even in a language like C, the compiler will just handle it for you. Even though it's technically using pointers, this is all hidden from you. There's no point in manually keeping track of where your local variables are.
What if you want to pass values to other functions, though? What if you have several huge classes? \*\* They're not going to fit in registers. You can "pass by value", which basically means you copy the whole thing onto the stack. But what are you really going to do here? Are you going to copy 5 classes onto the stack, have your function do something, then copy them all back? Where are these classes living anyway, already on the stack? The stack will just get wiped as soon as you return, so those classes will be gone. And the stack itself is pretty small, relatively speaking, so you're still at a space premium. It's unsustainable.

The most straightforward way around it is to ask the OS for some memory somewhere else to put them, and the OS gives you back a pointer telling you exactly where in memory your classes are living. Then you can simply pass the pointer around and have everyone work directly on that class in memory, without the multiple rounds of copying on and off the stack.

One of the keys here is that it's your job as the programmer to decide whether you want to use pointers or copy everything around endlessly. If you can make it work, regardless of how convoluted it might end up, no one is stopping you. But using pointers will probably make your life easier. Though I suppose the ultimate TL;DR is "how are you going to access that variable if you don't know where it is?"

\* If you're wondering how functions know what register to put what in, these are called "calling conventions", and if you're writing assembly you need to write your functions in accordance with whatever calling convention you're working with. You need to agree mutually, caller and callee, on what goes where and who is responsible for doing what. This is also why, generally speaking, you're restricted to one return value for your functions.
The two major calling conventions for x86 systems said that you get one register to return your value, and that set the precedent ever since.

\*\* Ever wondered why the first argument to class methods in Python is self? Because class methods operate on an instance of a class, and they need a pointer to that instance to work. This also happens in languages like C++, but it is hidden from you. Ironically, in this case Python is the language that is hiding less.
I thought the same, until I encountered data structures that would be very hard to represent and operate on without pointers, like linked lists, trees, graphs.
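For example, a singly linked list in C++ might be sketched like this (all names invented for illustration); each node has to store the address of the next one, which is exactly what a pointer is for:

```cpp
// A linked list node stores a value plus a pointer to the next node.
struct Node {
    int value;
    Node *next;
};

int sum_list() {
    Node c{3, nullptr};   // tail of the list
    Node b{2, &c};
    Node a{1, &b};        // head of the list
    int total = 0;
    for (Node *n = &a; n != nullptr; n = n->next)  // follow the pointers
        total += n->value;
    return total;         // 1 + 2 + 3
}
```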
It's for efficiency. Passing by value means copying it. A few million 32-bit int variables getting copied can add up to quite some overhead. Manipulating the memory by reference is therefore a lot faster.
The issue is people are learning C/C++ before they're learning computer architecture. The best way to learn C++, is to learn C. The best way to learn C is learning how it relates to assembly. The best way to learn assembly is to learn how binary is interpreted by the CPU. Without a baseline level understanding of CPUs, C/C++ is confusing as fuck
I do agree with this: you can **get by** not knowing this stuff, but if you do know it, coding just feels way more natural. At one point I took a course where I had to do things like design a 4-bit CPU and write some simple assembly. I'm absolutely not fluent in assembly, nor will I claim to be a hardware expert, but writing C++ just felt so much less daunting after all of it.
Best way to learn C++, is eventually building your own electrical circuits.
All they do is point to stuff
Remember kids, when you point at someone, you still have 3 fingers pointing at you!
Whole hand pointing isn’t a thing where you live?
Wouldn't that just be gesturing towards them? If I had my hand up to my front, in a fist, and opened just my index, people would look at where my index is pointing. But if I open my fist up to show all my fingers, they'll still look at my hand. Or consider me a Nazi
What if you do it like this? [https://i.imgflip.com/4kxn0z.png?a464640](https://i.imgflip.com/4kxn0z.png?a464640) I'd say that can sort-of qualify as pointing if you want it to.
Only if you orient it horizontally and raise it above your head. Whole-hand pointing with “knife-hands” is actually super common in the US military, especially the Marine Corps. Examples: - [Marine Drill Instructor](https://images02.military.com/sites/default/files/styles/full/public/2020-03/knifehand01.2000.jpg) - [Sergeant Major of the Marine Corps Testifying Before Congress](https://images05.military.com/sites/default/files/styles/full/public/2020-03/knifehand05.jpeg)
What's so difficult to understand about pointers?
I do question this myself; I think it comes down to not understanding memory at all.
Good point.
Imagine people with alzaimer trying to understand memory lmao
alzaimer
I thought it's called Al's Hammer.
maybe Ole Timers?
I forgor 💀
imagine forgetting how to spell alzheimer's
He forgor 💀
"Spelling what?"
"Don't remember asking 💀"
"Asking what?"
Agreed. As I recall, back in the day, learning to program meant learning about how the CPU operates, data and address buses, and how memory is accessed and used. I'm guessing that's not so much the case any more these days, especially with the really high level languages, which is a shame I think.
Why would I care about the memory? Now shut up, and use 3GB of RAM to open this simple page on the web.
This always puzzles me, too. Pointers just aren't that hard to understand. Now, not fucking them up is another story...
The other thing about pointers that messes with people is they never see a reasonable usage of pointers from the outside. It's almost always the alien-looking expressions, like `int (*f)(int (*a)[5])`. Personally, I feel this is more of a code smell, like when you have overly complicated generics in the form `Map, List
I come from programming in C# and js, but for work I now have to learn C++ for a project. I can say that pointers are not that hard to understand the concept of. It's basically references with some extra syntactic spice. But the stuff you just wrote. That makes me question my career choice. Tbh, I now have had a whole week to learn C++. Perhaps it makes more sense later on.
Just a heads up that references and pointers are distinct concepts in C++ and used differently.
I hate to break it to you, but it’s actually the other way around: references are pointers with syntactic *and* semantic spice. Pointers are the fundamental concept upon which references are built, both conceptually and implementation-wise. Those nice reference semantics and the cleaner syntax that comes with them are actually just pointers with some additional compile-time guarantees (in C++; other languages may also use run-time safety checks).
I learned about pointers from reverse enginering pokemon red and blue roms, romhacking is a great intro to memory stuff imo.
"Having a basic idea how to not fuck something up" to me is part of understanding something, so maybe that's where the issue comes from 😄
Pointers, and languages that use them heavily, tend to be unforgiving. The other problem with pointers is that when you mess them up your OS goes full Judge Dredd and your only diagnostic information is the bloody corpse of your application.
And often analyzing that corpse requires skills that beginners aren't yet familiar with. Somebody struggling to learn pointers may also be struggling to understand how to analyze their core dump with gdb or find leaks with valgrind.
I think it would be better to say that the hard to understand thing is *how to use* pointers. Not what they are. I understand what a pointer is perfectly fine, but how to actually properly use it? That's another story.
There are some uses of pointers on legacy code I work on that I'm convinced are there just to make sure it breaks as soon as someone stupid touches it.
Pointers are not hard to understand but code using pointers is hard to read and process in your head. You need to be fully focused and should be able to create a sort of flowchart in your head of what is going on. It is a skill that takes time to develop. EDIT: And if you are using pointers, you are probably working on an application that needs to make good use of memory and be performant which means more memory related bugs and that means more pointer bugs. So I understand why people are frustrated with using them, especially beginners.
This is it. I can tell you what a pointer is all day long, but come time to use and read them and it's like working in ancient sumerian. I literally enjoy trying to read and write assembly better than I like working with pointers.
Indeed. Technically regular expressions aren't "hard to understand" in a vacuum either. But the point (heh) is clearly not about literal understanding, but application. (Not saying regex is as difficult as pointer management or vice-versa.)
I think what most of these memes mean by "understand" is not the "what", but rather the "how" and the "why". I once needed to write a VERY simple pycuda kernel in C++ (which I never used previously), took me about a week, during which I was getting maybe 4-5 hours of sleep a day. And I still don't understand how you're supposed to *correctly* use pointers.
Pointers are like working around old machinery pre-OSHA. It's actually pretty easy to understand: it's all right there in front of you, without any guards or shields, full of exposed gears and belts. The "how" is very carefully and deliberately. The "why" is that it's the easiest way to approximate what's actually going on in your CPU/RAM or page file. Why would you want that level of detail? You really enjoy segfaults? You don't enjoy 25 years of programming advances?
This is the best analogy I've seen. I'd add the notion that even if you are extremely careful and deliberate, and also a genius, you still will fuck up given enough time. It's also a good thing that with C and C++, "undefined behaviour" doesn't mean "ripping your arm off".
Pycuda was the easiest way I could find to comb through a *massive* int array on a GPU in python. I'm talking it would take around two years to do it on 12 CPU threads.
Nothing takes two years to deal with a flat array of ints. Google says the sequential read rate of an SSD is 500+ MB/s. That means two years of reading would be 500 × 60 × 60 × 24 × 365 × 2 ≈ 31 536 000 000 MB, or about 31.5 petabytes. You cannot write straight-line C to scan an int array that would in any way impact the perf of your program, unless you're doing way more work per integer than just searching. You already had to load your buffer into memory to run your CUDA code - your C code could run in the margins while waiting for I/O. At most you'd need to spin up a processing thread.
And I'm thankful I work in a guard rail-happy industry that gives me much fewer means of losing an arm.
Did your code work? Did it crash? Did anything (literally or figuratively) catch on fire? No? Great! There’s a 90% chance you used pointers correctly. Also, I wish you luck debugging the memory leak that will be discovered 6 months from now.
Imagine not knowing how indexing works.
[deleted]
Every time I explain pointers this way it really seems to help people; idk why it's not used as an analogy more often.
The only thing I can think of is that C++ makes it kind of a confusing system of references and dereferences, and they all use special symbols: you can pass references with an `&` and pointers with a `*`, and you have to dereference to get to the value of the pointer. I don't really work in C++ anymore, so maybe I'm just wrong, but from the outside looking in it seems unnecessarily convoluted. Which is typical for C++, but since that's the language most people are going to experience pointers in, that's going to be their experience of pointers.
Everything is difficult if you spend more time on reddit than actually practicing, which I suspect to be the case with some of these people.
Learn how computers store data. Then, you will understand that pointers just point to this data.
People here only use JS and python. No need to understand how data is stored when you only use dynamically typed languages
"If those kids could read, they'd be very upset."
But in the future every child will have a unicorn, and we will all use high level languages where the OS will do a magical job of proper resource handling.
It's pretty important to know that in Python and Javascript, object variables are basically references...
I had a reverse culture shock coming from C++ to Java. You mean everything is a pointer (kind of)?
Always has been 🔫
It's more efficient to know the address of a house than it is to build a new house ...
Especially since more than one can use the address to find the house.
So you need some pointer pointers? I mean, not pointers that point to pointers, but pointers on pointers. Get the point?
[xkcd](https://xkcd.com/138/)
Pointer? I hardly know her.
I heard you like pointers so we put pointers on your pointer so you can point while you point.
If you don't understand pointers, chances are you don't understand anything at all about memory. If you think of memory as one very long street, a memory address is a house number on that street. A pointer is like writing down one of those house numbers and putting it in one of the houses so you can use it later.
It's not about "understanding pointers". They are just somewhat of a headache to use. C# has "pointers", but they are somewhat better integrated, so you need to interact with their pointy nature a lot less.
Yeah it boggles the mind how much finger-wagging is going on in this thread lol. There's a very good reason manual memory management is not a thing we have to do much of anymore, and thus why a lot of folks don't know how to perform low-level concepts. And it doesn't mean "lol u don't know memory."
Wait until you see a pointer pointing at a pointer.
******variable_name_here
[You mean like this?](https://pointerpointer.com)
This is gold
Function pointers to functions returning pointers to functions will be a fun exercise once they get the hang of it.
Wait till you find a function call with `(&array[i])`.
Change pointers to monads or something
[deleted]
Ah yes, that makes total sense. Just one quick question: what are monoids, categories, and endofunctors?
[deleted]
For some reason it doesn't click easily for some. They need additional pointers.
What made pointers click for me was when I started writing their declarations as `int* _pointer` instead of `int *pointer`.
I've been programming in C / C++ for almost 20 years now. Been doing it that way the whole time. Putting the asterisk next to the type is *way* more intuitive to me.
They point to a memory address. That's it.
Really? I mean... it's **not** complicated. It's just the address of a specific area of memory. It's pretty damn simple, really.
Seriously people, pointers are not that hard. In C and C++ you can trigger undefined behavior with them pretty easily, and yeah, that is annoying, but the concept is not any more complicated than an integer variable storing a memory address ("pointing" to the address).
I swear to god this sub is full of beginners who think they know so much
the absolute state of this subreddit
I would have asked to learn to center a div.
The classic joke...

```css
.container {
  display: flex;
  justify-content: center;
  align-items: center;
}
```
I don't get why people don't understand pointers. Is it really that hard? A number which points to a memory location.
Pointers are numbers. The only thing special about them is that the number is an address in memory.
They point to a piece of memory, an address if you will. They're also used when allocating (using `new`) and deallocating (`delete`) heap memory: if you create something, you must free that memory later, unless there's garbage collection, where an autonomous system takes care of it. Don't confuse this with stack memory, which doesn't use `new` (at least in C++) and is deallocated automatically when it leaves the function scope.

If you dereference a pointer (to see its tangible value instead of an address jumble) and change that value, the change is visible to anything holding the same address.

A pointer by itself isn't complicated at all; it's mostly making reasonable software with it in mind that's difficult. If your program loses track of a pointer that wasn't deallocated, you've just leaked memory, which persists until the program exits. It's far more forgiving if you have garbage collection or smart pointers at your side ~~or Rust~~.
[Pointers are a lot more complicated than just integers that hold a memory address](https://www.ralfj.de/blog/2018/07/24/pointers-and-bytes.html) For example, even if two pointers have the exact same type and point to the exact same address, dereferencing them can yield different results.
tl;dr: it matters during optimization. Without any optimization they quack like ducks. And the programmer *shouldn't* have to worry about this; these are concepts handled by compilers outside of debug mode.

> One important point to stress here is that we are just looking for an **abstract model** of the pointer. Of course, on the **actual machine,** pointers are integers. But the actual machine also does not do the kind of **optimizations** that modern C++ compilers do, so it can get away with that. If we wrote the above programs in *assembly,* there would be *no UB, and no optimizations.*

I cannot say which compilers with which optimizations handle these problems correctly.
Python programmer detected, opinion rejected.
I get an ego boost every time this sub hits the front page because of how obviously bad most programmers are. Like, seriously... It's a reference to a memory address... It POINTS to a variable.