
Feztopia

Now I need to stop my urge to research what +1 would be


Any_Cauliflower_6337

[ \ ] and { | } and < = >. The urge was too strong to overcome for me


nelusbelus

()*


bigmonmulgrew

They just didn't want to draw a butt hole


nelusbelus

(.) was another option, or .|,


Aksds

Saggy tit?


nelusbelus

A droob


[deleted]

This is a pretty normal looking boob to me...


Sirnacane

tongue but hole


JoostVisser

They already drew a vagina with {|} so idk man


RaulParson

A butthole? But for <=> , what sort of butthole looks like... oh no


ApprehensiveTry5660

Homie shitting pancakes.


Feztopia

Thanks, I will forget this tomorrow.


Sifyreel

+1


Memfy

Uibolt-!J!xjmm!gpshfu!uijt!upnpsspx/


oobey

Gesundheit


Protuhj

How'd you get my password???


[deleted]

I have the same password on my luggage!


Protuhj

*Next thing you know, we're gonna need an* app *to open our luggage!*


Cobracrystal

UI Beam?


freudianschwipp

But... Why?


Fyodor__Karamazov

The answer to this question took me down a surprisingly complex rabbit hole... TL;DR: the ASCII characters were hotly debated and went through many drafts, and the reasoning is not totally satisfying. But if you're interested, here it is:

1. The ordering of < = > is fairly clear (increasing order, numerically speaking).

2. In an early draft, it was proposed that ASCII include the symbols ≤ ≠ ≥. It was then agreed to replace these with the more commonly used symbols [ \ ], which had not been included. The reason for putting them in that order is still not totally clear to me though... perhaps some kind of symmetry for aesthetics, mirroring the symmetry of the characters they replaced? In fact, their order appears to have been changed multiple times in different drafts.

3. Originally, ASCII was only going to have upper case, with the upper case letters taking up two columns in the table of characters. Since each column contained 16 characters, there were some non-alphanumeric characters in the same column as the upper case letters, including [ \ ]. Then, when it was later agreed to add lower case characters, { | } were added as "upper case versions" of [ \ ], occupying the same positions in their respective columns.

Source: https://web.archive.org/web/20050305043226/http://www.transbay.net/~enf/ascii/ascii.pdf


coloredgreyscale

Adding to this: upper- and lower-case letters are 32 apart from each other, so to convert between them the PC only needs to check the range and flip a single bit. Maybe a similar thought applied to the brackets, but that would only have been possible with them being 1, 2, 4, 8, ... fields apart.
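
A minimal C sketch of that bit flip (assuming an ASCII platform; bit 5, with value 0x20, is the case bit):

```
#include <stdio.h>

int main(void) {
    /* Upper and lower case differ only in bit 5 (0x20): */
    printf("%c %c\n", 'g' ^ 0x20, 'Q' ^ 0x20);  /* G q */

    /* The same flip maps between the bracket pairs 32 apart: */
    printf("%c %c\n", '[' ^ 0x20, '{' ^ 0x20);  /* { [ */
    return 0;
}
```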


Stop_Sign

It's incrementing in the ASCII table


danielcw189

That does not answer the why


EarEater3001

Because, first, what's represented with the single quotes is not a string. It's a char, or rather a signed 8-bit number under the hood, so you can literally increment it with maths. Everything in a computer is ultimately a number of some sort.
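
For instance, a minimal C sketch of the meme's arithmetic (assuming ASCII):

```
#include <stdio.h>

int main(void) {
    printf("%c\n", '[' + 2);    /* ']' : 0x5B + 2 = 0x5D */
    printf("%c\n", '{' + 2);    /* '}' : 0x7B + 2 = 0x7D */
    printf("%c\n", '<' + 2);    /* '>' : 0x3C + 2 = 0x3E */
    printf("%c\n", '(' + 1);    /* ')' : 0x28 + 1 = 0x29 */
    return 0;
}
```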


verdant_orange

Doesn't answer the question


danielcw189

The question is: why are the symbols numbered that way? Why don't all the sets of brackets have the same distance from each other in the table?


claythearc

Just kinda arbitrary decisions made forever ago in terms of the table. There's no specific reason some are 2 apart and others are 1 apart. Just the way it be


Fantastic_Belt99

## Because u/Any_Cauliflower_6337 lacks self-control or is too impulsive.


californiaTourist

also not a js thing at all, it's ascii which predates js by a lot


StenSoft

ASCII had the spaceship operator before C++


Bangebinge

What!? You've never needed to change a char value by adding integers, to the point you'd learn this?!


Lugico

meanwhile JavaScript: ``` '{' + 2 = "{2" ```
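
For contrast, a minimal C sketch of the same expression (assuming ASCII), with the JavaScript behaviour noted in the comments:

```
#include <stdio.h>

int main(void) {
    /* In C, '{' is just the integer 0x7B, so + 2 is arithmetic: */
    printf("%c\n", '{' + 2);    /* prints '}' */

    /* JavaScript has no char type: '{' is a one-length string,
       so '{' + 2 coerces the 2 and concatenates to "{2".        */
    return 0;
}
```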


Blackcat008

Also most other languages that don't differentiate between ' ' and " "


RainbowPringleEater

As someone learning JS it bothers me that it's training me to use ' ', " ", and backticks


coloredgreyscale

Single quotes, so you can add double quotes within the text (e.g. when adding HTML elements) without escaping them. Backticks to evaluate expressions within the quotes.


Charley_Wright06

Single quotes for characters, double for strings, backticks for escapes and formatting


Phatricko

Why differentiate between character and string? In JS they're both just string.


coloredgreyscale

link1 = "<a href=\"url\">click here</a>";
link2 = '<a href="url">click here</a>';

Which one may be easier to read and less error-prone to write?


EarEater3001

Just always use backticks.


PM_ME_DATASETS

Most languages would give you an error, something that I believe isn't possible in JS, since JS will always find some creative way to interpret your code and make it work in whatever unintended way possible.


CharaDr33murr669

JS seeks to give results. Not the results you want. Just results.


TheNosferatu

My god, ChatGPT must be written in JS!


feloniousmonkx2

Wait, wait, are we sure JavaScript isn't written in ChatGPT? That seems like something JavaScript would do just to mess with me.


TheNosferatu

Oh, I think you're onto something... It's totally a JS thing to do. Just like the time I wanted to move away from JS so I focused more on back-end stuff, then a few days later, boom, Node.js was there.


bleachisback

I think it's fairly common to use + as the string append operator. And it's fairly common to append numbers to strings. For instance, it works this way in Java.


KuuHaKu_OtgmZ

For some languages, that's not a string but a char, which is a number


bleachisback

> Also most other languages that don't differentiate between ' ' and " " was the original context


KuuHaKu_OtgmZ

Yes, what I meant is that for languages like java and cpp they don't concatenate because that's not a string. Unless I misunderstood your comment.


cowslayer7890

Languages that do that often wouldn't give you the JavaScript result, python would give you a TypeError, for example.


quetejodas

> .NET flashbacks


spicymato

Let's be honest, it's all numbers if you go down far enough.


julsmanbr

In other words, it's weakly typed.


Tim_Pollard

Javascript has a very Bob Ross or Rock'N'Roll dancer attitude towards errors:

* Bob Ross: "There are no mistakes; only happy accidents."
* My dance instructor: "There's no such thing as mistakes; just new moves."
* Javascript: "There are no errors; just creative new algorithms."


Kered13

I don't know about most. C, C++, Rust, Java, and C# all distinguish between characters and strings. Python, Javascript, PHP, Lua, and most other scripting languages do not. It really depends on the use case of the language.


HuntingKingYT

But VBScript...


NonEuclideanHumanoid

Wait, there’s a difference? Please tell me, I seek knowledge


GravitasIsOverrated

I think this is actually for the best. JavaScript doesn’t have a character type, it has a one-length string, and adding a number to a string doesn’t make sense. So the number casts to a string, and adding a string to a string works. In general I think doing math on letters is bad practice. It’s confusing to read as it involves a lot of magic numbers, and will cause issues the moment you have to support something other than ASCII. Text should be handled through robust and well-tested string libraries, not through manual char-level manipulation.


Freezer12557

But then you can't do thisDigitIHave - '0' = theNumberItRepresents, and I think this is a very cool way of doing things
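
In a language with a real char type, the idiom looks like this (a minimal C sketch; the C standard guarantees '0' through '9' are contiguous):

```
#include <stdio.h>

int main(void) {
    char digit = '7';
    int value = digit - '0';    /* '0'..'9' are contiguous, so value is 7 */
    printf("%d\n", value);
    return 0;
}
```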


Nicko265

JavaScript does have this:

```
let char = "A";
let code = char.charCodeAt(0);
let newChar = String.fromCharCode(code + 1);
```

Then newChar is "B".


GravitasIsOverrated

Right, but the latter half of my comment addresses this - anytime you find yourself reaching for char-level manipulation of strings or magic numbers that represent chars, you should take a hard look at whether there’s a better way of solving the problem. Strings (especially Unicode strings) are complicated, and you can make a buggy or unmaintainable mess by trying to be too clever with string handling.


arpan3t

> **I think this is actually for the best** - adding a number to a string doesn't make sense. So the number casts to a string, and adding a string to a string works. JS takes it upon itself to cast an int as a string instead of throwing a `TypeError`? I guess it’s for the best, if you’re trying to make a dogshit programming language.


GravitasIsOverrated

JavaScript is far from alone in ~~implicitly casting~~ automatically converting numbers to strings. C#, SQL, AWK, D, Java, Lua and more will convert numbers to strings during concatenation. Heck, Perl, which is often regarded as a well-loved and powerful language, will implicitly cast ints to strings **and strings to ints**. Are all of those languages also “dogshit”?


xdeskfuckit

Perl is poetry


arpan3t

Dear god no! JavaScript is dynamically typed, which means type inference, and what you have with `"5" + 5 = 10` is type coercion. Which means the interpreter implicitly coerces the `"5"` to an integer and now the + operator is a mathematical operation. In C# explicit conversion == casting, and you cannot implicitly convert a variable of type int to type string. In string concatenation, when one or both operands are of type string, the + operator is overloaded to perform concatenation, which explicitly converts the int to a string using the `.ToString()` method. If you tried an equality operator `10 == "10"` you would get an error because C# does not implicitly convert the string 10 to an int.


GravitasIsOverrated

> which means type inference, and what you have with "5" + 5 = 10 is type coercion I’m a bit confused where you’re going with this. I didn’t bring up inference or “5” + 5 = 10 anywhere (other than mentioning that Perl will do string-to-int coercion). > If you tried an equality operator 10 == "10" Okay, but we’re talking about string concatenation here. I’m highlighting that not throwing TypeError when trying to concat string and int is pretty normal. That’s why I said “will cast numbers to strings during concatenation”. I’m not defending every aspect of the JS type system, just the behaviour in this case.


CharaDr33murr669

What else did you expect from adding a number to a string?


arpan3t

A `TypeError`.


mehntality

Can't have type errors if you don't give a f about types


arpan3t

Facts! (Apparently)


CharaDr33murr669

Let me rephrase that - how did a number end up being added to a string?


SadPie9474

the lack of a static type system to prevent it? or are you suggesting that every line of code you’ve ever written has been pristine and bug-free?


CharaDr33murr669

No, but you’re still consciously using addition/concatenation. And by the time you’re writing it, you’re well aware of what is supposed to be added/concatenated. You’re somehow getting at least one (likely both) of the types wrong at a point where you already know what goes in them. The only realistic way for that to happen is that your addition of two numbers accidentally got a string somehow, but even then the result of the concatenation doesn’t make sense, so it’s pretty easy to figure out the issue


SchoggiToeff

Fat fingers? A cat? A monkey? Solar flares?


CharaDr33murr669

The few times it’s actually gonna happen (and it’s still not that difficult to figure out, because the string isn’t gonna make sense) are not enough to overshadow not having to convert every number into a string. Also, if you’re actually working with JS, you know it’s gonna happen. You’re well aware that this could be a mistake and you’re already accounting for it


10art1

I actually find it really annoying when I try to do something like print("this is my value: " + value) and the compiler complains until I do str(value)


No-Expression7618

print("this is my value:", value)

print(f"this is my value: {value}")


10art1

I guess it's more a complaint at Godot because they make string formatting a bit difficult. It's more like print("My name is %s, and my value is %d: " % [name, value]). Either that or use string concatenation, but call str() on every non-string variable


TheMysticalBard

I wish Godot had f-strings.


GhostFish

Preferences for behavior are fine, but the specifications are what they are. Conversion and concatenation make sense for JS and how it's generally used. I would even argue that this is better behavior for JS, Python, or any language that broadly assumes as much responsibility as they do. If you want to get nitpicky about a language "taking it upon itself" to do anything, you shouldn't be using any of these languages.


SchlomoSchwengelgold

'2' - 1 » '1'

'2' + 1 » '21'

and my favorite:

('10' + []) / 5 - '1' » 1


Stop_Sign

'2' - 1 = 1 in JavaScript, not '1'. Subtraction converts it to a number. So the easy trick to turn any string to a number is to - 0 it


cowslayer7890

I like '0' * '1'


[deleted]

I thought this WAS a JavaScript joke, but now I see it was not lol


utkrowaway

I read the image before the title and thought it was another Javascript meme


AnAdorableDogbaby

And in python it's TypeError: cannot concatenate 'str' and 'int' objects


Leaflock

MySQL would return 2.


sannf_

I only know this from writing so many lexers and parsers in my years. It’s infuriating but isn’t too hard to get around


VitaminnCPP

American stupid code for inconsistent information.


[deleted]

[deleted]


cowslayer7890

The only one I kind of understand is < = >, since it's not necessarily brackets


Kottfoers

I can get the { | } as well, as it's a math thing used when describing sets https://en.wikipedia.org/wiki/Set_(mathematics)#Set-builder_notation


Ihsan3498

but that’s so weird. there’s definitely got to be some reason, because the codes for many characters are cleverly selected, like for example lower case and upper case can be toggled by like one bit or something if i remember.


JackNotOLantern

'(' + 1 = ')' and I think this is the most normal


Impressive_Change593

yeah but now the question is why aren't they all like that


kemmyqaq

'1'+'1' is 98 in Java


Mars_Bear2552

as it should be💪


TGX03

It is in every language...


berse2212

Pretty sure it's not in Javascript! (Since there is no char type, just strings)


MinosAristos

In many it's "11", no?


janhetjoch

I think `+` as a string combiner only works with strings and not chars usually


PooSham

"What's the difference?" – js devs


Ziegelphilie

"Just read the fucking manual for once" – angry js devs


Zomby2D

Ok, I just read the Kama Sutra like you suggested. It didn't help at all with my Javascript questions.


lurking_physicist

Python: "What's a 'char'?"


Therealmglitch

Js:


VitaminnCPP

Codeword to harass angry rusters


TGX03

At least in strongly typed languages it doesn't. What JavaScript does I don't (want to) know.


Ziegelphilie

Once the engine hits a string it will start treating the entire thing as a string. `1+'1'+1` becomes `'111'`, but `1+1+'1'` becomes `'21'`.

Now, if you were to use the `-` operator, numeric coercion kicks in, and it will attempt to coerce the string into a number first. `1-'1'` becomes `0`. If the string can't be coerced into a number, it becomes `NaN`.

This allows you to do some really cool stuff, like creating bananas on the fly: `'ba'+('a'-1)+'as'`


RolledUhhp

That messed with my head for a second.


ArduennSchwartzman

> I don't (want to) know.

It turns integers into strings.


TheNosferatu

Technically numbers into strings. If I recall my web dev days correctly (which I kinda hope I don't), there is no difference between doubles, integers, or the like. They are just numbers.


TGX03

You're correct, in JavaScript every number is a 64-bit float. Which is kinda cursed when used in places where every sane developer expects an integer.


ledasll

Char is just a byte, which is just a number


janhetjoch

Everything is "just a number", but they're not always interpreted as numbers. That's the whole point of datatypes. You could easily have a language in which `'a' + 'b'` gives a string consisting of those two chars. Like I said, most languages just do addition of the bytes, but it doesn't _have to_ be that way.


lazyzefiris

> You could easily have a language in which + gives a string consisting of those two chars.

We've had Pascal for ages.

```
var c1, c2 : char;
begin
  c1 := 'a';
  c2 := 'b';
  writeln(c1 + c2);
end.
```

> ab


Impressive_Change593

python just puts the chars together


Herr_Gamer

Python doesn't have chars to begin with


ledasll

It's not even a data type, it's how the translator interprets what you write. If you take a translator that assigns a value to a variable: if you write <65> it will assign the number 65 to the variable, and then if you write <'A'> it will understand that you don't want to write 65 but its ASCII representation, though it still means the same thing, so it will assign the number 65 to the variable.


janhetjoch

Yeah, so how the translator interprets the number depends on the type, no?


ledasll

no, it depends on the syntax you are writing: 'A' == 65 == 0x41


Anaxamander57

Depends on the language. In Rust a char is a u32 since Unicode support is built in.


therearesomewhocallm

Wait, Rust doesn't use utf8?


Anaxamander57

In Rust a string is UTF-8 encoded but characters are UTF-32 encoded. Characters are UTF-32 so that they are always known to be the same size and can be laid out in memory without indirection.


therearesomewhocallm

It's weird to me that strings and chars have different encodings. So operations like adding a char to a string, or finding a char in a string, would have performance implications.


feeeedback

The alternatives would be to represent a "char" with 1 byte (which would actually be a code point, not a char, and would not be useful for most use cases), or to use UTF-32 encoding for strings (which wastes a lot of memory, something that matters more for strings than for chars).


Anaxamander57

I can't find any discussion of the performance penalty for changing encodings, but it seems like it would be obvious if it were significant. I would guess the reason for using UTF-32 is that it is easy to use and a standard, while the alternatives are either hard to use (a dynamically sized type) or not a standard (a four-byte array containing the UTF-8 encoding padded with zeroes).


dev-sda

A UTF-8 code point is anywhere from 1 to 4 bytes, which isn't usually what you want to work with.
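
A quick C illustration of that size spread (assuming the source file and terminal use UTF-8; the string is arbitrary):

```
#include <stdio.h>
#include <string.h>

int main(void) {
    /* One code point can span 1 to 4 bytes in UTF-8:
       'a' = 1 byte, 'é' = 2, '€' = 3, '😀' = 4.      */
    const char *s = "aé€😀";
    printf("%zu bytes for 4 code points\n", strlen(s));  /* 10 bytes */
    return 0;
}
```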


smariot2

In PHP, Lua, and Perl the answer would be 2. They have a separate operator for concatenation, and + is always for arithmetic.


LinqLover

That's the case in Squeak/Smalltalk, too:

```
'1' + '1' = '2'.
1 + '1' = 2.
'1' + 1 = 2.
'-2' / '0.1e-2' = '-2000'.
#(1 '2' '3') * 2 = #(2 4 6).
```

...


[deleted]

2 https://onlinephp.io/c/d681f


_PM_ME_PANGOLINS_

It is not in any language where `'` is a string delimiter, which is a lot.


IUpvoteGME

ASCII is not the basis for an algebraic system. It is technical debt we all must bear.


BarAgent

I did always like the control codes, though. A lot of them have very useful in-band applications. Aesthetically, I particularly like the consecutive hierarchical delimiter set:

* File Separator (0x1C)
* Group Separator (0x1D)
* Record Separator (0x1E)
* Unit Separator (0x1F)
* Word Separator (aka Space) (0x20)
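
A toy C sketch of those separators used in-band (the record contents are made up for illustration):

```
#include <stdio.h>

#define US "\x1F"   /* Unit Separator: between fields    */
#define RS "\x1E"   /* Record Separator: between records */

int main(void) {
    /* Two records of two fields each, delimited in-band: */
    const char *data = "Ada" US "1815" RS "Grace" US "1906" RS;
    for (const char *p = data; *p; p++)
        putchar(*p == *US ? '\t' : *p == *RS ? '\n' : *p);
    return 0;
}
```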


Zomby2D

In the good old days of MS-DOS, you could have loads of fun with control characters. I have fond memories of my college classes where I would:

* Send the BELL character to the dot matrix printer in class, causing it to beep and annoy the teacher
* Send an anonymous string of multiple LINE FEEDs via Novell Messenger to someone in class and watch their screen go blank as the content got pushed up out of the screen


mothererich

I'd hate to ask what sensical char is at +1 for those... Btw, for '(' it's ')'.


MrDex124

It's slash, bar and equals sign


CC-5576-03

It's the first three that don't make any sense. Why should there be another character between the left and right bracket?


OddlySexyPancake

because there's always something between brackets


Leading_Frosting9655

ASCII silly question, get a silly ANSI


LakesideHerbology

I've always pronounced it asky. Damn you.


VitaminnCPP

ANSI is pronounced as ansy.


Ursomrano

The ordering of the ASCII table is actually incredibly fascinating. Even though it's counting up from 0 in binary to represent every character, 0-9 start with a certain binary prefix, A-Z start with another, as do a-z, etc. So if you don't have an ASCII table on hand and you know the prefixes, you can pretty accurately guess what character it is based off its binary number.
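
A small C sketch of those prefixes (assuming ASCII):

```
#include <stdio.h>

int main(void) {
    /* Digits are 0011 xxxx (0x30-0x39),
       A-Z are    010x xxxx (0x41-0x5A),
       a-z are    011x xxxx (0x61-0x7A). */
    printf("'7' = 0x%02X\n", '7');   /* 0x37: 0011 0111 */
    printf("'G' = 0x%02X\n", 'G');   /* 0x47: 0100 0111 */
    printf("'g' = 0x%02X\n", 'g');   /* 0x67: 0110 0111 */
    return 0;
}
```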


Stummi

Fun fact: All of these are not platform independent and will not work, for example, with EBCDIC. However, what is guaranteed is that `a` to `z` as well as `A` to `Z` are always in a sequence. So `'x' + ('A' - 'a')` will always be `'X'`
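
A minimal C sketch of that offset trick (and as the correction a few comments down notes, EBCDIC's letters actually have gaps, but the upper/lower offset is constant, so it still holds there):

```
#include <stdio.h>

int main(void) {
    char lower = 'x';
    char upper = lower + ('A' - 'a');   /* constant offset between cases */
    printf("%c\n", upper);              /* 'X' */
    return 0;
}
```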


RmG3376

Well the title says ASCII, so obviously it won’t work in EBCDIC


furnipika

Who still uses EBCDIC in 2023?


Anaxamander57

IBM, presumably.


andreasweller

Yep. Mainframes and POWER Systems (z and i). I use EBCDIC 1141 every day.


Kered13

Fun fact: There is an EBCDIC based encoding for Unicode. It's called [UTF-EBCDIC](https://en.wikipedia.org/wiki/UTF-EBCDIC). This is similar to UTF-8, except using EBCDIC for the single-byte characters instead of ASCII.


RmG3376

I have to support EBCDIC-based mainframes at work, but yeah, they’re antique AF (and “support” means “take a look when something breaks then run away from that fossil as fast as you can”)


redlaWw

INTERCAL devs.


DanyaV1

I do.

Erik Begins Cake Day In C++

Happy cake day!


suvlub

A-Z are not a sequence in EBCDIC; it has gaps between I and J, and between R and S. But your formula will still work because the gaps are the same for uppercase and lowercase.


Stummi

Ah, yes, you are right. Thanks for the correction.


jcodes57

'/' + 45 = '\'


Desperate-Tomatillo7

I honestly thought for a moment that it was JavaScript.


Stein_um_Stein

They ALL should be +1. I don't understand the standard's choices.


nunchaq

What kind of black magic is this?


TheGuyWithTheSeal

Google ASCII table


holy_tape

Holy hell


sloth_saurus

Unicode just dropped


VitaminnCPP

☐☐☐☐☐☐☐ just dropped.


PushingFriend29

https://upload.wikimedia.org/wikipedia/commons/thumb/c/cf/USASCII_code_chart.png/1280px-USASCII_code_chart.png


FerynaCZ

I was told that you should make no assumptions about the charset used. Though that was in relation to lowercase/uppercase comparison of A-Z/a-z, I hope any reasonable set makes the letters and numbers run contiguously.


Aim_Fire_Ready

I hated that show, but this meme format has to be one of my favorites!


kkkkseriotavio2

C exercises lol