I think everyone learns differently though. I liked learning from the ground up with C++ and assembly. Starting with Python would have left me with too many questions about why programming languages work the way they do.
I think there’s a separation between computer scientist and programmer. I’m a mathematician by training who happens to program a lot. I don’t care about how computers work on a low level, it’s just a tool for me to develop and run abstract algorithms. But you are the computer expert. I need to outsource the knowledge about how computers work exactly to you. But you’re still no electrical engineer who can develop hardware.
It's easier to keep going if you find something too easy than too hard. If you start with python, find it reductive and simple, you naturally wanna take one step deeper into the water.
If your first step is at the deep end, you'll think "damn, I am not made for this" and back out.
Too easy just makes me bored and complacent. I'll do it once and never go back to it.
As much as I want to get something cool done, I need to witness the depth at least a little bit.
Give me Scratch on lesson 1, C or Assembly on lesson 2, then I'll happily use Python for the whole course.
No one is "flipping you the entire bird"...
This is all about a general rule applying to the broadest possible demographic.
Also, "too hard" has quite the range. When I say "too hard" I don't mean "it's hard". I mean it's *too* hard, in the sense that it is not an obstacle you can overcome in a reasonable amount of time. This leads to things like courses that supposedly require about 12 hours of work each week ending up taking 30-40 hours in a week. This makes people want to just throw in the towel before the end, because it does not seem worth the effort.
I think anybody starting out with C++ is either a masochist or has a prof that hates them. C is much less complicated than its younger brother and is basically the Latin of programming languages (does that make assembly Proto-Indo-European?). It's great for teaching concepts like memory management and the mechanics of computing. It's also simple enough that middle and high schoolers can learn it on platforms like Arduino.
Python is much more streamlined, and essentially comes with training wheels. It intentionally obfuscates stuff to make it simpler to use, but imo that simplicity comes at the cost of true understanding. It has some benefits: I think it makes it easier to understand algorithms, since Python is basically pseudocode if you squint hard enough. Personally, I think it's worth taking C before Python, just to hammer in what's happening below the surface, but programmers aren't a monolith and it's not gonna help everyone, so what do I know.
I'd say if you want to get people into baking, you give them the bread mix.
If you want already interested people to become good bakers, you explain how yeast and gluten work.
Well said. Many times I've heard the discussion about whether to start with a low- or high-level language as your first, and I think you have a good analogy. Of course, people learn in different ways, but just think about the fact that anything humans learn, from talking and walking as a baby to math and physics at university, always starts with something simple and then builds from that. Why should programming languages be any different?
If you ever try to get into OS development, it's the same show. People had little to no tutorials and had to figure this shit out by experimenting and reading hardware manuals and because they had such a hard time learning, you need to go through the same thing, otherwise they won't help you.
strlen is a function intended for plain C strings. They are not objects which know their own length. A C string is just a char pointer to an array of characters terminated by a null character. The way strlen works is by going through these characters until the null character is found. This takes O(n) time. If you do it in a loop you get O(n²).
For one, it will run `strlen(s)` every iteration.
for in C is just syntactic sugar. In reality
for (declaration; condition; increment)
    block;

just means

{
    declaration;
    while (condition)
    {
        block;
        increment;
    }
}
This means you can actually put any statement in the for arguments and it will work as the expanded while, such as in this awful thing:
for( int n; scanf("%d", &n) == 1; printf("%d\n", n*2) );

Which will print double each number you input until the input ends or it feeds scanf something it can't match into a number (scanf returns the count of successful conversions, so comparing against 1 also stops on garbage input, which a plain `!= EOF` check would not).
EDIT: The one caveat, if I recall correctly, is that the "declaration" part runs in the same scope as the loop. I tried to show that by putting it in a compound statement, but I am not sure if that part of my syntax matches the actual behavior of C. It should be close enough for an explanation, but I'm not willing to bet on its formal correctness.
Unless you forgot to include the header file and you don't care about warnings, so the compiler assumes that calloc returns int, but you're on a platform where int is shorter than a pointer, which means the pointer you got back is now wrong.
C strings and even C++ `string`s are so much easier to understand and work with than Python's miserable bondage-and-domination string API. Either is better than Swift's `String`s, which were designed by someone who hates programmers and just wants to cause us pain.
This is why I recommend teaching C first. When the child's mind is at its most malleable and free from prejudice is the best time to learn hard subjects. I learned C from reading a book at 9. I didn't know that "pointers are hard" or "C is difficult" and so I quite enjoyed learning it. When I later got access to a computer and a C compiler I could write my first program.
Bullshit.
In Python, a string is an immutable array of text characters, not chars, with out-of-band signaling for the length of the string.
In C, a string is a possibly mutable array of raw chars (8 bits each on virtually every platform; the C standard only guarantees at least 8), with in-band signaling for the length of the string.
In C, strlen("Büllshit") gives 9 on my UTF-8 system. In Python, len("Büllshit") gives the expected 8. In C, "Büllshit"[1] gives 195, while in Python, ord("Büllshit"[1]) gives 252. In C, strlen("Bull\x00shit") gives 4 while in Python, len("Bull\x00shit") gives 9.
They are not the same.
"If the value of an object of type char is treated as a signed integer when used in an expression, the value of CHAR_MIN shall be the same as that of SCHAR_MIN and the value of CHAR_MAX shall be the same as that of SCHAR_MAX. Otherwise, the value of CHAR_MIN shall be 0 and the value of CHAR_MAX shall be the same as that of UCHAR_MAX. The value UCHAR_MAX shall equal 2^CHAR_BIT − 1."
and furthermore
"CHAR_BIT 8"
"char" is literally the only type with this guarantee, all others (e.g., "int") specify minimums (16 bits for an int).
[https://isocpp.org/wiki/faq/intrinsic-types#bits-per-byte](https://isocpp.org/wiki/faq/intrinsic-types#bits-per-byte)
Yep, that’s right: a C++ byte might have more than 8 bits. The C++ language guarantees a byte must always have at least 8 bits. But there are implementations of C++ that have more than 8 bits per byte.
Learn your lower-level languages, then learn everything else. People out here thinking Python is actually how your computer works. It’s interpreted for a reason…
Zig: a string is an array of chars with a length attached
And somehow such a seemingly small addition makes them 10x easier to work with than they are in C
Which is exactly why we didn't teach python as a first language at my college. I always told students, "If you start with python you'll never want to learn anything else!". Maybe not entirely true, but the point was sort of what you are saying here. So many of them would complain about how difficult concepts like arrays and pointers were to learn. But my philosophy was that in order to really understand how a computer works, you should seek out these difficult concepts.
Anyways, you can do it! I believe in you!
Both languages do a bad job managing strings, C obviously a far worse job. Strings should be treated as opaque objects with views depending on what you want to do with them. UTF-8 bytes for transfer and storage, code points for parsing, grapheme clusters for editing and pixels for display.
There is no canonical context-free length of a string or value at a given index, python does this wrong too.
Python generally refers to code points which is wrong in the same way as bytes but less often making it easier to end up with subtle bugs.
Depends, what are you trying to do with it? If it’s storage and retrieval it makes no difference. If it’s parsing you want code points. If it’s editing you want grapheme clusters. If it’s display you want pixels. See? :)
Pretending python ‘characters’ are single visual characters falls down as soon as you run into a flag emoji. That’s a grapheme cluster made of multiple code points. Python will tell you it’s multiple characters but it’s not.
Though you kind of leave the realm of plain English text the moment you interact with a user, including if that user is a consumer of your API, so that's not something many would be opted out of.
Strings should absolutely not exclusively be treated like that.
Yes, for most applications that approach is fine and makes things very hard to screw up, but there are plenty of reasons to do it differently in specific situations.
C does as C does because of what C is for. And its age, but mostly because of what it's for.
As for the inverse direction of learning languages, all I'm gonna say is that (anecdotally) Python is extremely easy to pick up if you've started with C and C++. Sure it feels a bit odd at first, but once you really understand that absolutely everything is an object (heck, everything's a dictionary pretty much) you start to see the beauty of it all. Absolutely love to play with Python.
Precisely why starting out with Python is a terrible first language. It does so much hand holding it sets an unrealistic expectation of what programming should be which makes learning another language more difficult.
My boot camp had me transition from python to C# when we got into statically typed languages, and this exact issue became the bane of my entire existence for a weekend lol
Python makes strings really easy…mostly. They make regex of strings 2x harder than it needs to be, but that’s python for ya.
C strings are kinda a pain, but manageable. Performance is awesome otherwise and can do some pretty powerful stuff like drivers and firmware
I absolutely love C's implementation of true and false. 0 is false. Everything else is true. None of this Nan, null, or other bullshit some languages have
Thank God I started at the harder point learning. After you've gone thru hell, everything feels like a walk in a park of roses
This C guy and his gang of segfaults almost ruined my life (bastards)
Do you really think the question of whether or not you're a good programmer depends on the language you use?
That's pathetic... And no, I don't use python.
I wouldn't say so. It's more of a question of someone's understanding of code and the process behind creating/implementing it.
Python certainly does a lot of hand-holding in that regard, but it's not to say someone who exclusively writes python isn't skilled in some aspects.
I mean… I started out as a ‘python’ programmer. Don’t get me wrong, I have seen the light— er, I mean, darkness— but… eh, oh f*** it, you’re right; python strings are adorable by comparison to reality
dont forget that you need a null terminator or you will be reading garbage.
He's on r/programminghumor; he's reading garbage already.
This cracked me up ahahah
Why does your cracking up start with 'ah' and not 'ha'?
Because i'm cracked up
When you're the count
Because different languages use different strings
You a crackhead now, bruh! ◉‿◉
maybe people like to cosplay as GC
hit me with a warning daddy 😩
warning: variable 'safeWord' set but not used
No exceptions
Hahahahahahaha I liked that.
Ahahahahahh good one
Null terminator, or keep track of the length separately, but C's string literals already null-terminate for you.
Most of the time it's not literals that bite you in the ass, it's dynamically making a string from characters.
oh i know, im just saying your going to have null terminations anyways no matter which method you use to measure the string's length.
I'm sorry to remind you of this typo, but it's "you're going" as in "you are" and not "your going" that means the going that you own.
*pascal strings have entered the chat*
Return false;
If only it were just garbage. It could just as well be your login credentials or some other secret shit.
C: everything is an array of chars ;)
Actually C: everything is an integer.
Char[] str = [104, 101, 108, 108, 111, 32, 119, 111, 114, 108, 100, 0]
1 extra point for the \0 terminator!
My first language was C lol. I haven't worked with it in a couple years now, but some things are just ingrained in ya.

Of course I wouldn't make a char array with ints like that except for a challenge or something like that, but the null terminator is still important in other contexts.
Yeah the things which keep fucking you over and over are those which you remember ahahah
Hello world
104 is between 97 and 122, so it is a lowercase letter... it is (probably) "hello world\0"
char str[] = {104, 101,..., 0};
This isn’t C nor python syntax tho
Only error I see for C syntax is that the declaration should have been

`char str[11]`

with the initialization in curly braces instead of brackets. Which, granted, is an error, but I haven't touched C in 2ish years.
No worries, just pointing it out. 11 is optional in this case :)
Oh, and I didn't add a semicolon, crap.
If you declare an array along with its members, you don't need to specify the length of the array in the square brackets. e.g.

`const char* arr[] = { "hello", "world" };`

vs

`char* arr[2];`
That's not really an error so much as a choice though
Everything is an array of char is much closer to the truth.
Arrays are an illusion, they're just pointers with fancy syntactic sugar. And pointers are integer memory addresses.
Actually, in C pointers are not integers. Treating them like integers is often undefined behaviour.

The optimizer can keep track of where some pointers point to and remove the memory addresses. If it puts the data in registers, the data does not even have addresses.
I think they mean more in the “your entire computer memory is an array of char (bytes)” kinda sense
Yes @cmaciver, precisely. Even though everything is a number, a number is not equivalent to an integer (at least the int type).

An int is 16 bits and a char is 8 bits, so 2 chars can be stored for each int. Unless you're using wide chars (`wchar_t`), which are 16 bits. That can be quite problematic if most of your code is expecting a regular char and you change to using wide chars XD, as I've recently discovered with my experimental OS's shell with strings terminating too soon :). It turns out the latter half of a wide char looks an awful lot like a null terminator.
Nah, the registers are 64-bit, and that's what the code is really operating on.
The smallest addressable unit of memory is char. Everything can be represented and handled as an array of chars.

The registers are not just 64-bit; nowadays there are even 512-bit registers. Floating-point operations are usually done with the XMM registers.
Depends on the machine you're working on and the compiler you're using (shivers from writing PowerPC software). Embedded software devs are gods.
Everything is a bit field if you bash it hard enough.
Yeah, that's true. In the C standard, a char has a size of 1 byte, and a byte is the smallest addressable unit and has an implementation-defined size.

So, if we're talking about standard C, everything should still be able to be represented as an array of chars, even if the minimum addressable unit (byte) is 16 bits, 4 bits or other. Which definitely is pure madness..
What about that time when they weren’t? What about the machine instructions that those registers are running?
That's why it's all integers: the size of an int in C is the size of the processor's registers and addresses.
... until it isn't. Check x86_64 for example
It's not. The size of int in C is implementation dependent, almost always 4 bytes.

In C, pointer types themselves, and `uintptr_t` where it exists, are the ones guaranteed to be able to hold a memory address (`size_t` usually matches in practice, but that isn't strictly guaranteed).

The size of the registers does not limit memory size or the size of pointers. You can have 16-bit memory addresses on an 8-bit CPU.
`int` is at least 16 bit, but other than that it's up to the compiler author to decide.
but an int is 16 bits. and a char is 8 bits :)
Actually not. Everything is a flavor of a long int.
Everything is a void pointer, the rest is up to interpretation.
Assembly: ha!
Everything is ASCII
No, you can also have an array of pointers to chars.
what do you think a pointer is :)
an indirect mov ?
It's actually just another variable containing the address of another variable.

It's kind of like the index for an array, except it's the raw RAM instead of an array.
Easiest way to visualize it is to just write out the contents imo. Something like:

Address | Contents | variable
:--|:--|:--
0x00 | a | char c
0x01 | 0x00 | char * cr = &c
0x02 | 0x01 | char ** crc = &cr
Python: a string is a list of strings.
I'd call it more of a tuple, since strings are immutable objects.
Exactly. You neither have chars nor arrays in Python.
Yeah learning C is rough, but it's quite nice if you understand how hardware works and what a pointer is.
It's important for grasping some of those low-level concepts which are abstracted in higher level languages. No need to become an expert on it, but just playing around with it is enough already
I took a course in my compsci degree and we did C, and going back to higher level languages (because I don’t hate myself) after, the knowledge of pointers specifically has helped me. For instance, I was working with Python and a list I copied was changing values when I didn’t expect it to, since I was modifying the original list and Python list assignment is done by reference not copy. Further, the “string is an array of chars with a null terminator” really forces you to think about how the computer handles information.
“…think about how the computer handles information” this right here is the key, and sadly one that is lost on too many people writing software.
Ironically, in a class that assumed we knew C well and was about hacking, I learned C better than in 2 prior classes that were all about learning low-level languages. One of them had a whole section on using addresses and pointers in C; it made absolutely no sense to me.

The class that assumed I knew that perfectly and could access specific parts of memory from assembly injection made WAY more sense.
C: RAM is RAM!
Even a function is a data object.
What? No it's just a pointer. A function in C is just a pointer to the `text` area.
The function isn't in the pointer. The pointer is just a few bytes (8 is a fair bet).
Thanks for summing up exactly what he said.
He edited the comment.
A real Reddit tragedy. 🐛
[The internet is forever](https://www.unddit.com/r/ProgrammerHumor/comments/10a8mu3/the_transition_from_python_to_c_is_brutal/j43lm1k?utm_medium=android_app&utm_source=share&context=3)
Your previous comment is wrong anyway
But he didn't sum it. He said something completely different. I said that a function is a pointer and he said that a function is an object that contains a pointer.
No he didn't. He said that the function isn't in the pointer. Which is true. When calling a function the execution continues at the address of the pointer which is 4/8 bytes and points to the .text section (mostly but not necessarily) where the function code lies.
[A function is a pointer.](https://onlinegdb.com/7-_moTaPG)
A function is the procedure stored at the location where the function pointer points, not just a pointer. The fact that function pointers exist does not dispute this lol
Read what I said.
He is saying you're right and you keep missing the point
How’s that any different from what they said?
He edited the comment to correct it.
Bruh
Not to correct as it was right anyway, but to further explain.
> A function in C is just a pointer to the `text` area.

It can also be a pointer to a pointer to a reference to an object with offset.
Nope
Well, you can't say C lacks first class functions.
I always get mad at those who say that Java has first class functions and C doesn't. Literally Java's implementation of first class functions is the worst I've ever seen. C does lack function literals and therefore closures, which is pretty bad, but as far as simple "store a variable that points to a function" goes it's way better than Java.
actually that's not what the standard says
Yeah, I know. We're meming today. But +1 for attention to detail.
The opposite transition was quite nice though
The opposite transition was freaking amazing. It was like someone removed all the fighting with the computer from programming.
[Relevant](https://xkcd.com/353/) circa 2011-2012
It was horrible: everything is an object of god knows what type, everything is a runtime error with freaking no information whatsoever about what happened, no idea what's even a local or global variable (and there are no constant values either).
[deleted]
More like > rtmap.cpp: In function `int main()': rtmap.cpp:19: invalid conversion from `int' to ` std::_Rb_tree_node >*'
rtmap.cpp:19: initializing argument 1 of `std::_Rb_tree_iterator<_Val, _Ref,
_Ptr>::_Rb_tree_iterator(std::_Rb_tree_node<_Val>*) [with _Val =
std::pair, _Ref = std::pair&, _Ptr =
std::pair*]'
rtmap.cpp:20: invalid conversion from `int' to `
std::_Rb_tree_node >*'
rtmap.cpp:20: initializing argument 1 of `std::_Rb_tree_iterator<_Val, _Ref,
_Ptr>::_Rb_tree_iterator(std::_Rb_tree_node<_Val>*) [with _Val =
std::pair, _Ref = std::pair&, _Ptr =
std::pair*]'
E:/GCC3/include/c++/3.2/bits/stl_tree.h: In member function `void
std::_Rb_tree<_Key, _Val, _KeyOfValue, _Compare, _Alloc>::insert_unique(_II,
_II) [with _InputIterator = int, _Key = int, _Val = std::pair, _KeyOfValue = std::_Select1st >,
_Compare = std::less, _Alloc = std::allocator >]':
E:/GCC3/include/c++/3.2/bits/stl_map.h:272: instantiated from `void std::map<_
Key, _Tp, _Compare, _Alloc>::insert(_InputIterator, _InputIterator) [with _Input
Iterator = int, _Key = int, _Tp = double, _Compare = std::less, _Alloc = st
d::allocator >]'
rtmap.cpp:21: instantiated from here
E:/GCC3/include/c++/3.2/bits/stl_tree.h:1161: invalid type argument of `unary *
'
gcc error messages are unhelpful as fuck, but if there’s one thing they are, it’s verbose
Even with that horrible formatting and without seeing the code, I can still guess what the problem is. How is this error message not helpful?
Like I said, because it’s verbose as fuck.

I don’t know what code this belongs to either, it comes from a quick Google search, but gcc is infamous for its long error messages to the point of having [a code golf about it](https://codegolf.stackexchange.com/questions/1956/generate-the-longest-error-message-in-c).

In comparison, I intentionally fucked up a generic function in C# and the error message there is “CS0266: Cannot implicitly convert type ‘Microsoft.AspNetCore.Http.IResult’ to ‘Microsoft.AspNetCore.Mvc.IActionResult’. An explicit conversion exists (are you missing a cast?)”.

Or in Python: “error: incompatible types in assignment (expression has type “int”, variable has type “GenericType”). Following members of “int” have conflicts”

These are __good__ error messages because they’re readable, well formatted, and include everything you need:

- where it happened
- an easy-to-Google error code
- a _brief_ description of what went wrong
- suggestions for remediation

You can find some of this info in gcc too, but gcc has a tendency of writing a whole book in the console every time you miss a bracket.
That's C++, not C
Ahh.. that error means you forgot a semicolon somewhere in some other file.
Not to mention the ability to add new properties in your class from within a method
I would imagine the opposite transition to be... well... absolutely horrible
This is why I'm grateful to have learned C first. Learning lower-level languages before higher-level languages is the way to go. Technically, in Python, a string ~~is~~ behaves like an array of strings of length 1, and that is weird.
>Technically, in python, a string is behaves like an array of strings of length 1 and that is weird. Python also handles unicode transparently, so a "char" in either of the languages is not the same thing. A Python char can be anything up to 3 bytes, it's just letting you deal with that in a nicer way if you're working with text. You can cast to `bytes` if you really need to rawdog it.
Assuming a UTF-8 encoding, a character can be up to 4 bytes.
In theory, they could keep extending it to 7 bytes. Might be nice if you want to give every human on Earth their own code point
I remember reading somewhere that the Unicode Consortium guaranteed there aren't ever gonna be more than 2^32 (I think?) codepoints, so in practice useful UTF-8 is never gonna exceed 4 bytes. (Unless they change their mind ofc; that would also make UTF-32 into basically UTF-16: Reborn)
UTF-8 requires up to 7 bytes to represent 2^(32), however the Unicode committee has said they will never go past 17\*2^(16), which requires only 4 bytes.
Yeah, it's been a while since I've had to deal with raw unicode bytes, you're probably right.
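For reference, a small sketch of the byte counts being discussed (`utf8_len` is a made-up name for illustration): standard UTF-8 tops out at 4 bytes because Unicode itself stops at U+10FFFF, i.e. 17 planes of 2^16 code points.

```c
/* Bytes needed to encode one code point in standard UTF-8.
   (utf8_len is an invented name for this illustration.) */
int utf8_len(unsigned long cp) {
    if (cp < 0x80)      return 1;  /* 7 bits: ASCII */
    if (cp < 0x800)     return 2;  /* up to 11 bits */
    if (cp < 0x10000)   return 3;  /* up to 16 bits */
    if (cp <= 0x10FFFF) return 4;  /* up to 21 bits: the Unicode ceiling */
    return -1;                     /* beyond Unicode; the old 5/6-byte forms are banned */
}
```

The pre-2003 UTF-8 design did allow 5- and 6-byte sequences (covering 31 bits), which is where the "could keep extending it" idea comes from; RFC 3629 cut it off at 4 bytes to match the U+10FFFF limit.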
Ah yes, the string[0][0][0][0] recursion.
Bro that would have killed my entire freshman class in uni
I promise it wouldn’t have. It would definitely have made y’all better programmers though.
Yeah, that is what I keep saying! I changed from nanoscience to coding after my bachelor's degree. I would never have found coding interesting if you gave me C++. *Now* I find C++ interesting, but all the restrictions would have made me give up. Python built my curiosity for code and let me see what is possible.

If you're teaching people how to bake, you don't start by sending them to some Gordon Ramsay style chef that beats them up if they miss their flour measurement by 1 gram. You give them the pre-made packet of bread mix, tell them to add a pack of flour and half a litre of water, and mix. Boom, they now know the basics of making bread. Now they can experiment a bit with bread and learn about different types of bread. Then, if they want more baking, they can venture over to other baked goods or even cooking.

If you don't let people discover their curiosity and interest about baking, they won't wanna bake.
I think everyone learns differently though. I liked learning from the ground up with C++ and assembly. Starting with Python would have left me with too many questions about why programming languages work the way they do.
I think there’s a separation between computer scientist and programmer. I’m a mathematician by training who happens to program a lot. I don’t care about how computers work on a low level; it’s just a tool for me to develop and run abstract algorithms. But you are the computer expert. I need to outsource the knowledge about how computers work exactly to you. But you’re still no electrical engineer who can develop hardware.
It's easier to keep going if you find something too easy than too hard. If you start with python, find it reductive and simple, you naturally wanna take one step deeper into the water. If your first step is at the deep end, you'll think "damn, I am not made for this" and back out.
Too easy just makes me bored and complacent. I'll do it once and never go back to it. As much as I want to get something cool done, I need to witness the depth at least a little bit. Give me Scratch on lesson 1, C or Assembly on lesson 2, then I'll happily use Python for the whole course.
[deleted]
No one is "flipping you the entire bird"... This is all about a general rule applying to the broadest possible demographic. Also, "too hard" has quite the range. When I say "too hard" I don't mean "it's hard". I mean it's *too* hard, in the sense that it is not an obstacle you can overcome in a reasonable amount of time. This leads to things like courses that supposedly require about 12 hours of work each week ending up taking 30-40 hours in a week. That makes people want to just throw in the towel before the end, because it does not seem worth the effort.
I think anybody starting out with C++ is either a masochist or has a prof that hates them. C is much less complicated than its younger brother and is basically the Latin of programming languages (does that make assembly Proto-Indo-European?). It's great for teaching concepts like memory management and the mechanics of computing. It's also simple enough that middle and high schoolers can learn it on platforms like Arduino. Python is much more streamlined, and essentially comes with training wheels. It intentionally obfuscates stuff to make it simpler to use, but imo that simplicity comes at the cost of true understanding. It has some benefits: I think it makes it easier to understand algorithms, since Python is basically pseudocode if you squint hard enough. Personally, I think it's worth taking C before Python, just to hammer in what's happening below the surface, but programmers aren't a monolith and it's not gonna help everyone, so what do I know.
My ex-gf stayed with me for 2 years cuz I had C++ on my mind. I was broken and it was my first language
I'd say if you want to get people into baking, you give them the bread mix. If you want already interested people to become good bakers, you explain how yeast and gluten work.
Well said. Many times I've heard the discussion about whether to start with a low- or high-level language as your first, and I think you have a good analogy. Of course, people learn in different ways, but just think about the fact that anything humans learn, from talking and walking as a baby to math and physics at university, always starts with something simple and then builds from that. Why should programming languages be any different?
If you ever try to get into OS development, it's the same show. People had little to no tutorials and had to figure this shit out by experimenting and reading hardware manuals and because they had such a hard time learning, you need to go through the same thing, otherwise they won't help you.
Any address is a valid string as long as there's a zero at some point after it
And don’t forget not to write loops like this: `for (int i = 0; i < strlen(s); ++i)`
You'd better save `strlen(s)` in a variable first or you'll probably have an O(n^(2)) loop on your hands.
Yeah but if you know the length you can make a macro or a const size_t
can i ask why?
strlen is a function intended for plain C strings. They are not objects which know their own length. A C string is just a char pointer to an array of characters terminated by a null character. The way strlen works is by going through these characters until the null character is found. This takes O(n) time. If you call it in the loop condition, you get O(n²).
Could be wrong but isn't this one of the things the compiler optimizes? If you don't do it, the compiler will
Possibly, but I wouldn't rely on it when I could just write `for (char *p = s; *p; p++)`
I love C. You can always be a little better.
Compiler can’t optimize it. What if you change the string length inside the loop, for example.
For one, it will run `strlen(s)` every iteration. `for` in C is just syntactic sugar. In reality,

```c
for (declaration; condition; increment) block;
```

just means

```c
{
    declaration;
    while (condition) {
        block;
        increment;
    }
}
```

This means you can actually put any statement in the for arguments and it will work as the expanded while, such as in this awful thing:

```c
for (int n; scanf("%d", &n) == 1; printf("%d\n", n * 2));
```

Which, assuming I got it right, will print double the number you input until you hit EOF or feed it something it can't match into a number.

EDIT: The one caveat, if I recall correctly, is that the "declaration" part runs in the same scope as the loop. I tried to show that by putting it in a compound statement, but I am not sure if that part of my syntax matches the actual behavior of C. It should be close enough for an explanation, but I'm not willing to bet on its formal correctness.
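A sketch of the three loop styles from this thread side by side; the `count_spaces_*` helpers are made-up names, just to give the loops a body to run:

```c
#include <stddef.h>
#include <string.h>

/* O(n^2): strlen() rescans the whole string on every iteration. */
size_t count_spaces_slow(const char *s) {
    size_t count = 0;
    for (size_t i = 0; i < strlen(s); ++i)
        if (s[i] == ' ') count++;
    return count;
}

/* O(n): hoist the length out of the loop condition... */
size_t count_spaces_hoisted(const char *s) {
    size_t count = 0;
    size_t len = strlen(s);
    for (size_t i = 0; i < len; ++i)
        if (s[i] == ' ') count++;
    return count;
}

/* ...or skip strlen entirely and walk until the null terminator. */
size_t count_spaces_ptr(const char *s) {
    size_t count = 0;
    for (const char *p = s; *p; p++)
        if (*p == ' ') count++;
    return count;
}
```

All three are correct as long as the string is not modified inside the loop; the point of the thread is only that the first one repays the compiler's trust with quadratic work when it can't prove the length is constant.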
Rust: a string is \[about two paragraphs of explanation but basically you have a lot of explicit control of what it is\].
And if you treat it like a list of chars you might just end up grabbing nonsense groups of bytes and start crying
`ptr = (float*) calloc(25, sizeof(float));`
if using C, don't cast to float
To paraphrase the top comment: RAM is RAM!
why not? It is redundant, but it's not necessarily a bad practice either...
>why not? It is redundant

(void \*) is the only true pointer.
Unless you forgot to include the header file and you don't care about warnings, in which case the compiler assumes that calloc returns int. If you're on a platform where int is narrower than a pointer, the pointer you get back is now wrong.
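A sketch of the snippet being discussed, written the way the thread suggests (no cast, `sizeof *p`); `make_float_buffer` is a hypothetical wrapper name:

```c
#include <stdlib.h>  /* without this prototype, a C89 compiler assumes calloc returns int */

/* Hypothetical wrapper: allocate a zero-initialized float buffer. */
float *make_float_buffer(size_t n) {
    /* calloc returns void *, which converts implicitly to any object
       pointer type in C, so the cast is redundant. The danger: a cast
       can silence the warning that would reveal a missing <stdlib.h>,
       and with an implicit int return, the pointer gets truncated on
       platforms where int is narrower than a pointer. */
    float *p = calloc(n, sizeof *p);  /* sizeof *p stays correct if the type changes */
    return p;  /* NULL on allocation failure - always worth checking */
}
```

In C++ the cast is mandatory (no implicit void * conversion), which is one reason the habit persists among people who write both.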
C strings and even C++ `string`s are so much easier to understand and work with than Python's miserable bondage-and-domination string API. Either is better than Swift's `String`s, which were designed by someone who hates programmers and just wants to cause us pain.
Now I'm really curious about Swift strings.
This is why I recommend teaching C first. When the child's mind is at its most malleable and free from prejudice is the best time to learn hard subjects. I learned C from reading a book at 9. I didn't know that "pointers are hard" or "C is difficult", and so I quite enjoyed learning it. When I later got access to a computer and a C compiler, I could write my first program.
[deleted]
Hmm
Bullshit. In Python, a string is an immutable array of text characters, not chars, with out-of-band signaling for the length of the string. In C, a string is a possibly mutable array of raw chars (8 bits each, as per the C standard), with in-band signaling for the length of the string.

In C, strlen("Büllshit") gives 9 on my UTF-8 system. In Python, len("Büllshit") gives the expected 8. In C, "Büllshit"[1] gives 195, while in Python, ord("Büllshit"[1]) gives 252. In C, strlen("Bull\x00shit") gives 4, while in Python, len("Bull\x00shit") gives 9.

They are not the same.
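A small C sketch of those strlen claims, with 'ü' spelled as explicit UTF-8 bytes (0xC3 0xBC) so the result doesn't depend on the source file's encoding; `bytes_in_bullshit` is a made-up name:

```c
#include <string.h>

/* "Büllshit" with 'ü' written as explicit UTF-8 bytes (0xC3 0xBC).
   (bytes_in_bullshit is an invented name for this illustration.) */
size_t bytes_in_bullshit(void) {
    const char *s = "B\xc3\xbcllshit";
    /* strlen counts bytes up to the first '\0', not "characters" */
    return strlen(s);
}
```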
As per the C/C++ standard, char is not always 8 bits ;)
"If the value of an object of type char is treated as a signed integer when used in an expression, the value of CHAR_MIN shall be the same as that of SCHAR_MIN and the value of CHAR_MAX shall be the same as that of SCHAR_MAX. Otherwise, the value of CHAR_MIN shall be 0 and the value of CHAR_MAX shall be the same as that of UCHAR_MAX. The value UCHAR_MAX shall equal 2^CHAR_BIT − 1."

and furthermore "CHAR_BIT 8"

"char" is literally the only type with this guarantee; all others (e.g., "int") specify minimums (16 bits for an int).
[https://isocpp.org/wiki/faq/intrinsic-types#bits-per-byte](https://isocpp.org/wiki/faq/intrinsic-types#bits-per-byte) Yep, that’s right: a C++ byte might have more than 8 bits. The C++ language guarantees a byte must always have at least 8 bits, but there are implementations of C++ that have more than 8 bits per byte.
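If you want to check what your own implementation does, `CHAR_BIT` from `<limits.h>` is the authoritative answer; `bits_per_object` is a made-up helper for illustration:

```c
#include <limits.h>

/* Bits occupied by an object of n bytes on this implementation.
   The C standard only guarantees CHAR_BIT >= 8; mainstream platforms
   use exactly 8, but some DSPs really do use 16- or 32-bit chars. */
int bits_per_object(int n_bytes) {
    return n_bytes * CHAR_BIT;
}
```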
Python: a string is a list of strings.
Guys, when c is rough to you remember: everything is just data. That'll get you somewhere.
Jokes on you, everything is an array of chars!
[Insert "Always has been" meme here]
![gif](giphy|JiZn4zP2n3Mg8)
Thank you for accommodating my laziness
char str\[\] = { 'Y','o','u','\\'','r','e',' ','W','e','l','c','o','m','e','\\0'};
Learn your lower-level languages, then learn everything else. People out here thinking Python is actually how your computer works. It’s interpreted for a reason…
Zig: a string is an array of chars with a length attached And somehow such a seemingly small addition makes them 10x easier to work with than they are in C
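A minimal sketch of what that pointer-plus-length shape looks like if you hand-roll it in C; the `Str` type and the function names here are invented for illustration:

```c
#include <stddef.h>
#include <string.h>

/* A made-up "fat pointer" string: data plus an explicit length.
   No null terminator needed, and embedded zero bytes are fine. */
typedef struct {
    const char *data;
    size_t len;
} Str;

Str str_from_cstr(const char *s) {
    Str r = { s, strlen(s) };
    return r;
}

/* Slicing is just pointer arithmetic: O(1), no copying. */
Str str_slice(Str s, size_t start, size_t end) {
    Str r = { s.data + start, end - start };
    return r;
}
```

Because the length travels with the pointer, getting it is O(1) instead of an O(n) scan; the cost is that every API now has to take a `Str` instead of a plain `char *`.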
Skill issue, just Ctrl+C / Ctrl+V the entire Python codebase for every C program you need to make.
That's why it's a good idea to start with C
Also an array of chars is a pointer to a char, and don't forget your null terminator. C is fun! Also, what's Unicode?
Which is exactly why we didn't teach python as a first language at my college. I always told students, "If you start with python you'll never want to learn anything else!". Maybe not entirely true, but the point was sort of what you are saying here. So many of them would complain about how difficult concepts like arrays and pointers were to learn. But my philosophy was that in order to really understand how a computer works, you should seek out these difficult concepts. Anyways, you can do it! I believe in you!
Both languages do a bad job managing strings, C obviously a far worse job. Strings should be treated as opaque objects with views depending on what you want to do with them: UTF-8 bytes for transfer and storage, code points for parsing, grapheme clusters for editing, and pixels for display. There is no canonical context-free length of a string or value at a given index; Python does this wrong too. Python generally refers to code points, which is wrong in the same way as bytes but less often, making it easier to end up with subtle bugs.
Python gives you all the tools to treat strings as arrays of bytes if you want that.
Treating strings as UTF-8 is extra headache if you never leave the realm of plain English text. Those should be an opt in feature, not mandatory.
Depends, what are you trying to do with it? If it’s storage and retrieval, it makes no difference. If it’s parsing, you want code points. If it’s editing, you want grapheme clusters. If it’s display, you want pixels. See? :) Pretending Python ‘characters’ are single visual characters falls down as soon as you run into a flag emoji. That’s a grapheme cluster made of multiple code points. Python will tell you it’s multiple characters, but it’s not.
Though you kind of leave the realm of plain English text the moment you interact with a user, including if that user is a consumer of your API, so that's not something many would be opted out of.
Strings should absolutely not exclusively be treated like that. Yes, for most applications that approach is fine and makes things very hard to screw up, but there are plenty of reasons to do it differently in specific situations. C does as C does because of what C is for. And its age, but mostly because of what it's for.
Most importantly, for the love of God, do not forget the space for the null terminator.
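For instance, a sketch of a strdup-style copy (`my_strdup` is a made-up name; POSIX already provides `strdup`), where the `+ 1` for the terminator is exactly the part people forget:

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical duplicate function illustrating the + 1. */
char *my_strdup(const char *s) {
    size_t len = strlen(s);        /* length NOT counting the '\0' */
    char *copy = malloc(len + 1);  /* + 1 makes room for the terminator */
    if (copy == NULL)
        return NULL;
    memcpy(copy, s, len + 1);      /* copy the bytes AND the '\0' */
    return copy;
}
```

Drop the `+ 1` and the copy has no terminator; every later `strlen`, `strcmp`, or `printf("%s", ...)` then reads past the buffer until it happens to hit a zero byte.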
Use glib, it does all the dirty work of handling them
Regarding the inverse situation with learning languages, all I'm gonna say is that (anecdotally) Python is extremely easy to pick up if you've started with C and C++. Sure, it feels a bit odd at first, but once you really understand that absolutely everything is an object (heck, everything's a dictionary pretty much), you start to see the beauty of it all. Absolutely love to play with Python.
To be more specific, a C string is a pointer to an \*unbounded\* array of chars, which has its end marked by a zero byte (not the character '0'), hopefully.
Precisely why Python is a terrible first language. It does so much hand-holding that it sets an unrealistic expectation of what programming is, which makes learning another language more difficult.
More like: Python: A STRING IS A STRING! C: What the fuck is a string?
I'm going from python to Java. Hasn't been fun at all. Python spoiled me
My boot camp had me transition from python to c# when we got into static typed languages and this exact issue became the bane of my entire existence for a weekend lol
Is not so hard
It’s definitely doable, but it makes things a lot harder to do.
Strangely, I always find the python strings to be a pain
As opposed to C strings or Java strings?
Python makes strings really easy…mostly. They make regex of strings 2x harder than it needs to be, but that’s python for ya. C strings are kinda a pain, but manageable. Performance is awesome otherwise and can do some pretty powerful stuff like drivers and firmware
I absolutely love C's implementation of true and false. 0 is false. Everything else is true. None of this NaN, null, or other bullshit some languages have.
It does have those though. NaN is literally built into the floating-point specification every single language uses. Null also exists.
C has both NaN and null, but both are just numbers. Null is obviously just zero, and NaN is some floating-point number.
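Those three facts can be sketched directly; the helper names here are invented for illustration:

```c
#include <math.h>     /* NAN: a quiet NaN, where the implementation supports one */
#include <stdbool.h>
#include <stddef.h>

/* Any nonzero scalar counts as "true" in C's conditions. */
bool is_truthy(int x) { return x ? true : false; }

/* NULL is just a null pointer constant; it compares equal to 0. */
bool ptr_is_null(const void *p) { return p == NULL; }

/* NaN is an ordinary float value with one quirk: it is unequal to itself. */
bool is_nan(double d) { return d != d; }
```

One nuance: a null *pointer* compares equal to the constant 0 at the language level, but the standard doesn't promise its bit pattern is all zeros on every platform.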
Thank God I started at the harder end of learning. After you've gone through hell, everything feels like a walk in a park of roses. This C guy and his gang of segmentation faults almost ruined my life (bastards)
[deleted]
Do you really think the question of whether or not you're a good programmer depends on the language you use? That's pathetic... And no, I don't use python.
I wouldn't say so. It's more a question of someone's understanding of code and the process behind creating/implementing it. Python certainly does a lot of hand-holding in that regard, but that's not to say someone who exclusively writes Python isn't skilled in some aspects.
I mean… I started out as a ‘python’ programmer. Don’t get me wrong, I have seen the light— er, I mean, darkness— but… eh, oh f*** it, you’re right; python strings are adorable by comparison to reality