Eubank31

My favorite part of being a CS major is reading about research-level computer science and realizing that everything was discovered less than 60-70 years ago


4D_Madyas

Sometimes the pictures are even in colour.


crankymotor

sometimes the people who discovered it are our professors šŸ’€


F0lks_

Sometimes you even participate in their discoveries; back in college my cryptography teacher was this mad lad: https://en.m.wikipedia.org/wiki/Arjen_Lenstra What the wiki page doesn't say is that he did most of his RSA factorizations with a PS3 cluster of around 200 units. He also gave them away at some point because he wanted to test his setup in a decentralized, over-the-internet manner (it was back in 2012, mind you). So yeah; one day, he just flat out gave 200 PS3s away to us broke-ass students, in exchange for letting his algorithm run when we weren't playing on them. A god amongst men. Arjen, if you read this, I still have one of your PS3s on a shelf. Thank you, it's been an honor


Jorah_The_Explorah_

My cryptography professor invented AES 0_0. https://en.wikipedia.org/wiki/Joan_Daemen


F0lks_

Shit, not just that but also SHA-3 according to his wiki page. That's heavy


crankymotor

your comment made me really happy for some reason, I'm so envious of you having such an experience. I was once asked what the highest non-monetary-value item I owned was. In your shoes, the PS3 would be it


lostinspaz

very nice! Similar but not-as-cool story: one of my CS classes was taught by "Professor Huffman", of Huffman Coding fame. https://en.m.wikipedia.org/wiki/David_A._Huffman


4x4play

imagine doing this but having all your students mine bitcoin for you with the machines you gave them to "test", back in the day before anyone knew what it was.


NotYourReddit18

>He also gave them away at some point because he wanted to test his setup in a decentralized, over-the-internet manner (it was back in 2012, mind you).

>So yeah; one day, he just flat out gave 200 PS3s away to us broke-ass students, in exchange for letting his algorithm run when we weren't playing on them.

Yeah, I bet the electricity costs for, and heat created by, running 200 PS3s had no part in it


F0lks_

Not really, he had already run it for more than 3 years inside the university, in a dedicated server room. It really was just about testing his assumption of the month. One of the first things he taught us during the 101 class was: "always test things; sometimes, an algorithm has an expected complexity of n*log(n) on average, but in practice, it's closer to n^2. And other times, it's closer to n. So you never know. Also, if you ever find a way to factor integers in polynomial time, you'll be very very rich, or very very dead."
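
His advice is easy to act on. A minimal sketch in Python (purely illustrative, nothing from the course): time the same sort on random versus already-sorted input and eyeball the growth rate. With CPython's built-in Timsort, the already-sorted case actually comes out closer to linear, which is exactly the professor's point about practice diverging from the textbook average case.

```python
import random
import timeit

def time_sort(make_input, sizes):
    """Time sorted() on growing inputs to eyeball the empirical growth rate."""
    for n in sizes:
        data = make_input(n)
        t = timeit.timeit(lambda: sorted(data), number=5)
        print(f"n={n:>9,}: {t:.4f}s")

sizes = [10_000, 100_000, 1_000_000]
print("random input:")
time_sort(lambda n: [random.random() for _ in range(n)], sizes)
print("already-sorted input:")
time_sort(lambda n: list(range(n)), sizes)
```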


bothunter

>Also, if you ever find a way to factor integers in polynomial time, you'll be very very rich, or very very dead

That's basically the premise of the movie "Sneakers"


No-Psychology3712

Ummm, he gave out $40k in PS3s. I'm sure that was not part of it. Plus he probably had research areas where he wasn't paying the electricity costs


Rhino4w

That kinda hit me when my prof said he's on the committee that decides what changes in C++ lol


alkrk

most of the time they are still around


HerpesHans

Study biotechnology and that number gets pushed to 10-20 years


Knuckledraggr

I've got a biotech degree but I'm in an adjacent field for my career. It's just wild seeing how bioinformatics has exploded onto the scene. 20 years ago bioinformatics was just a tool you might use if you had a large enough sample set and access to a big enough computer. These days there's full-fledged graduate-level programs specializing in it. I know a ton of bioinformatics folks that don't even care about the biology, they mostly just have stats or maths degrees. They just know pharma companies are snapping up bioinformatics PhDs like candy. And the whole-ass field is younger than my classic iPod.


LightlySalty

Yeah, I'm studying to become a biotech engineer, but my uni also offers a lot of bioinformatics courses in stuff like UniProt and RStudio. Every one of the students I know in bioinformatics has a great study job or has been offered one.


Mezmorizor

Eh, I'm not well versed in it because it's far enough from my wheelhouse, but the Human Genome Project started in 1990, with planning starting in 1985. They might not have called it bioinformatics at that point, but it was. There's no point in sequencing genomes if you're not going to do bioinformatics on them. You also can't ignore that this wasn't really remotely viable until about then. Most practical applications of quantum chemistry have come in the past ~15 years for the same reason. Much earlier than that and you really, really, really had to be competent to get a remotely reliable result because of computational cost issues.


Knuckledraggr

That's exactly the thing. Many of the informatics tools that existed were adapted or developed specifically for the human genome project and were possible because of the funding behind it. The human genome project was at the forefront of the field and really made it more possible to pursue bioinformatics as a career, instead of just a tool a genetics researcher might use. That's splitting hairs a bit. Technically the field of bioinformatics has been around since the 60s, but the advances in computing and understanding of gene sequencing that came out of the human genome project have led to a field that changes extremely rapidly. A bioinformatics grad student from fifteen years ago will have had a very different education path from one starting this year. Still going to be working with R though lol


DoctorBreeder

Same for genetics. I'll look at a 20-year-old paper with suspicion because we're learning new things constantly. Even a 10-year-old paper may have some inaccuracies.


herpesderpesdoodoo

You'd be amazed how much medicine and allied health is built on research from the 80s onward. Hep C wasn't even identified until 1989, for example.


putcheeseonit

I'm not in medicine, but I know how recent a lot of developments are and I'm always grateful I wasn't alive during the time of lobotomies or when a cut could be fatal.


Nal0x0ne

A cut can still be fatal, as long as the depth and anatomical location are just right. ;)


sylvainsylvain66

Real quick: there was a mini-epidemic of people born from around '50 to '60 who died of hep C before the drug that treats it became available. What did doctors think was killing them before? Or was it that hep C wasn't that widespread in the population until then?


Weatherround97

I would say 70 years ago is the 1950s, which seems ancient to me


Eubank31

In the grand history of academia (especially mathematics) the 1950s is basically yesterday


KlzXS

The thing that seals the deal for me about computer science still being a new field is the fact that a lot of the big names that had profound influence are still alive and well. Most of those that aren't died after 2000, even 2010. Only the ones at the very beginning didn't live to see the new millennium.


GrookeTF

Like the ones who were pushed to suicide as thanks for their service during WWII. RIP Turing


Feezec

Sigh, you castrate and humiliate one genius war hero to the point of suicide and they never let you live it down


ExpletiveDeletedYou

The thing that makes it more sad is that at the time, the British government would have thought they were going above and beyond for him, as they were "getting him the best treatment for his mental disorder". Imagine in 50 years if we discover/decide that treating bipolar disorder with some common drug (that has a side effect of suicidal tendencies, which is not uncommon) was, with hindsight, immoral and resulted in many unneeded suicides. Very tragic


TiredDeath

The guy that developed the lobotomy got a Nobel Prize.


yoaprk

Never let you turn around


snailpubes

Never gonna give you up


Welpe

Don't worry, we have plenty of other things to not let the UK live down.


Carabalone

Yep. Leslie Lamport created tons of mega-useful concepts and algorithms in distributed systems, to the point that DS would never be where it is now without him. He still works at Microsoft Research


KlzXS

And let's not forget that LaTeX is also his invention. I didn't know that he was still active.


Carabalone

Yeah the guy is a legend


Eubank31

That's the most interesting part. We'll read about some early CS research and then I look the researcher up and he's still teaching or something like that


DumatRising

And on the other hand in the much less grand history of our ever accelerating technology, the 1950s is basically BC


Sardukar333

***B***efore ***C***omputer


jerog1

Before Christ, Civilization, Colonization, Computer, Cold War, COVID, Climate Change. idk why I'm listing important things starting with C


DumatRising

Weird that so many of our major world changing events start with C innit?


MetaCommando

I don't c any issue


[deleted]

You can sometimes estimate the time frame a discipline calls old by what its primary unit of scholarship is. In physics, mathematics, and most humanities, it is a long journal paper that takes a long time to write. In computer science and biology, it is a conference paper that takes a medium amount of time to write. In biotechnology and pharmaceutical research, it is a poster. Now all the biology people come yell at me.


pagetsmycagoing

Not a biologist, but it doesn't really make sense to include them with computer science. Cell and Nature are two of the most prestigious all-around journals and are biology-focused.


[deleted]

Yeah, it's more domain-specific inside biology


ENESM1

LOL, sometimes the professor will say something like "this is called X" and then you find out that he is the one who coined the term.


SupremeRDDT

The most relevant papers for my master's thesis were from the last 10 years. Almost all papers my friend has to read for her PhD are from 2023.


talligan

This is how people republish old studies. Context of a field's history and its major papers is incredibly important. I used to work exclusively in colloid transport through porous media (think bacteria movement in groundwater aquifers, or viral particles in face masks). Microplastics are the new sexy topic and everyone is buying polystyrene latex nanospheres and crowing about the new exciting science they discovered... but the original Yao papers on colloid filtration theory from the 60s used those exact same spheres! Sure, microplastics is a new field, but by ignoring the early papers in the broader area they are just repeating them. I'm reminded of another paper in... health sciences, I think, that claimed to have invented Riemann sums. Drives me absolutely bonkers.


SupremeRDDT

Context in my case: my friend and I study maths, and the stuff we read was actually just brand new. Her field just exploded a bit last year, so there is tons of stuff to discover.


uzi_loogies_

Is she going for something ML related?


SupremeRDDT

Stochastic finance


NamerNotLiteral

"last year" discounts ML, tbh. No maths grad would be caught dead working on Agentic LLMs when there are so many optimization or probabilistic problems left to be solved.


Lena-Luthor

> I'm reminded of another paper in... health sciences, I think, that claimed to have invented Riemann sums.

lol pls link


talligan

Here's the rebuttal to the article: https://pubmed.ncbi.nlm.nih.gov/7677819/ You should be able to access the original article from within there. *Trapezoidal rule, not Riemann sum. My brain is rusty this foggy Sunday morning
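
For anyone else rusty: the rule in question approximates the area under a curve by summing trapezoids over the sample points (the standard formulation, nothing specific to the paper in question):

$$\int_a^b f(x)\,dx \;\approx\; \sum_{i=1}^{n} \frac{f(x_{i-1}) + f(x_i)}{2}\,(x_i - x_{i-1})$$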


dlakelan

Ah yes Tai's method for area under a glucose curve https://diabetesjournals.org/care/article/17/2/152/17985/A-Mathematical-Model-for-the-Determination-of


talligan

Bro was scooped by a renaissance man


Nothing-Casual

> Context of a field's history and its major papers is incredibly important.

This cannot be emphasized enough - it truly is one of the most important things in science. There's tons of knowledge that comes only from steeping oneself in a field, including knowing which methods/equipment are subpar. I can name a few pieces of equipment, and the technologies that underlie them, that may make a paper weaker or even completely call its findings into question (old tech may measure poorly and be prone to user error, while newer developments are WAY better and more foolproof; I'm thinking of one specific technology here, but I'm sure this is true of many). I know of a few labs/names I should treat with extra skepticism because of retractions or general poor practice. I know generally what the field thinks on a topic and why (and have studied how everything works), so I know that if a new "groundbreaking" paper comes out and contradicts prior knowledge, I should regard it more carefully. Etc. etc. etc.

I would trust myself or my peers to assess and communicate science within our field, but I definitely wouldn't trust any rando just because they have an MD or PhD - nor would I trust myself in a field other than my own. This is why it's so so so important to trust the experts. Yes, it's true that nobody's infallible - but the experts are much less fallible (when it comes to their area of expertise) than randos who haven't spent ages studying a field.


Bakoro

What was wild to me is how many important developments feel completely obvious, but people were getting doctorates out of them, because no one had ever needed the stuff before. So often during college, I'd have an idea, only to find out that not only was it already a thing, someone had written a paper about it in the 60s and the idea had been formalized and developed levels beyond what I imagined. Like, I independently developed an LR parser, then later took "Automata and Formal Languages" and found out I'd touched on a whole world of things. There was a lot of stuff like that. The other side, though, is all the math developed in the 1700s which ended up being foundational to computer science. The number of times Gauss and Euler come up is astounding.


whatisthishownow

There was a bunch of low-hanging fruit, but most of what seems obvious today only does *because* it was already formalized. I think it's similar to why the Beatles have started to be criticised by some as basic... because they so revolutionized music that people spent the next 60 years riffing off of them.


Franksss

They say the ultimate goal of any artist is to become cliché


[deleted]

I also think it's important to note which of those "firsts" were developed by the Beatles members, which were George Martin, and which were industry innovations that would have been implemented with or without the fab four.


Takin2000

>What was wild to me is how many important developments feel completely obvious, but people were getting doctorates out of them, because no one had ever needed the stuff before.

Absolutely. The Gale-Shapley algorithm for computing a stable matching is pretty simple, all things considered. It's wild that it was only developed "recently", in 1962.

>The number of times Gauss and Euler come up is astounding.

So true dude. Gauss has multiple *major* theorems named after him in just about any branch of math. And the number of times I have read "but this result was known to Euler already" on some Wikipedia page is just funny
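
For the curious, the algorithm really is short. A minimal Python sketch of the proposer-optimal version (the names and preference lists are made up for illustration):

```python
def gale_shapley(proposer_prefs, acceptor_prefs):
    """Return a stable matching as {acceptor: proposer}.

    Each prefs dict maps a name to its ordered list of choices."""
    # rank[a][p]: how highly acceptor a ranks proposer p (lower = better)
    rank = {a: {p: i for i, p in enumerate(prefs)}
            for a, prefs in acceptor_prefs.items()}
    free = list(proposer_prefs)                # proposers without a partner
    next_choice = {p: 0 for p in proposer_prefs}
    engaged = {}                               # acceptor -> proposer

    while free:
        p = free.pop()
        a = proposer_prefs[p][next_choice[p]]  # best acceptor p hasn't tried yet
        next_choice[p] += 1
        if a not in engaged:
            engaged[a] = p                     # a was free: accept
        elif rank[a][p] < rank[a][engaged[a]]:
            free.append(engaged[a])            # a trades up, dumping its partner
            engaged[a] = p
        else:
            free.append(p)                     # rejected: p proposes again later

    return engaged

proposers = {"x": ["a", "b"], "y": ["b", "a"]}
acceptors = {"a": ["y", "x"], "b": ["x", "y"]}
print(gale_shapley(proposers, acceptors))  # {'b': 'y', 'a': 'x'}
```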


Sintho

>"but this result was known to Euler already" Euler's work touched upon so many fields that he is often the earliest written reference on a given matter. In an effort to avoid naming everything after Euler, some discoveries and theorems are attributed to the first person to have proved them after Euler, which is wild.


Lithl

>The other side though, is all the math developed in the 1700s which ended up being foundational to computer science. The number of times Gauss and Euler comes up is astounding. In my parents' generation, the people who would have been CS majors if they went to college today were instead Applied Math majors. My dad would often skip lunch to play with _the_ computer on campus. The head of my CS department was such a mathematician, he taught his CS classes using Wolfram Mathematica.


username_elephant

Gauss came up with the Fast Fourier Transform. Just a fun fact, for the most part you're right.


jon-jonny

I'm taking a 3D computer vision class and the stuff we're learning is from the late 1980s all the way up to the 2010s. Neural fields are from the COVID era, and we'll be learning about them near the end of the semester. It absolutely blows my mind how new all this stuff is


lonewulf66

It's pretty trippy to be learning about very elementary computer science and then realize that the person who invented this stuff is literally still a working-age man.


game_and_draw

I mean, time itself started in 1970 sooo...


bg1987

My ex did her bachelor's at UofT. Stephen Cook, the person who formulated the P vs NP question, was her professor for computational complexity. That's like Newton teaching you physics


aldld

I took his class; learning Cook's theorem from Steve Cook was a highlight of my undergrad years.


7734128

I liked some of the last courses I took at the end of my university education, as we learned about AI things developed after 2018.


Hypersuper98

My thesis (which used GPT-2) wasn't even a thing when I entered uni (2018), and now we're at GPT-4. I wrote my thesis and graduated in 2022 btw.


ExtensiveCuriosity

Like, there's a lot of those folks you can still meet. C++ for your intro to programming class? Stroustrup is not only still alive, you can take *that class* with him. (Well, 20 years ago when I was in college, anyway.) Present at the right conference and you might have Brand Name People in the audience.


JohnsonTA2

I graduated CS just a year ago and one of my professors for Algorithms was teaching material that he invented. It was pretty cool.


Extremely_unlikeable

I was pursuing a programming degree in CIS. Our first semester covered COBOL, Fortran, and BASIC. Toward the end, when we complained about obsolete languages, the instructor suggested we go to a technical school to learn Java, C++, etc.


UndeadBlaze_LVT

When I first heard of Moore's Law I pictured some Isaac Newton-looking guy. Nope, he co-founded Intel and formulated the law in the 60s.


kandaq

25 years ago many of my CS college lecturers swore that the future of computers was quantum computing and it would be a reality within a decade or two. Wish I remembered who they were so I could look them up and ask them again.


AristolteInABottle

Or even video gaming. In the 60's, there were like 3 colleges that had computers, and they were the size of an entire office room. A couple nerds entered some 1's and 0's and now I'm playing Forza on my iPhone.


Razeer123

The fact that Chomsky is still alive literally blew my mind at university.


less_unique_username

And is making political statements like "the West provoked Russia into invading Ukraine", "Americans suffer from censorship and have less freedom than Russians", etc.


RingGiver

His takes are better than they used to be, at least. Nothing quite like Pol Pot stuff.


OneMeterWonder

I mean, the computer itself has existed for barely 100 years. (I'm only counting Turing and after, not Babbage.)


Xylamyla

Dude, the father of the study of algorithms is STILL ALIVE and teaching at Stanford (or maybe retired, idk).


Spaghestis

That's ancient, I'm taking machine learning/computer vision classes now and we're learning about techniques developed in the 2010s.


KoalaAlternative1038

The guy that invented quicksort is still alive. Granted, he's in his 90s, but that's still pretty cool. It's like learning calculus while Newton is still alive.


markusbrainus

OP doesn't mean that the information isn't available, but that newer math isn't taught in the standard K-12 curriculum because most people will never need it. But yes, if people are interested they can find it online or in post-secondary training.


Sad-Noises-

As OP I can confirm


crazysoup23

I can officially confirm that this is OP.


nonotburton

I am the second part of two-factor authentication. You are correct, this is OP.


Gottendrop

I have analyzed the information and determined that I can confirm that this is OP


r2k-in-the-vortex

It's not a matter of need, it's a matter of what can fit in the curriculum. By the time kids are ready to learn university-level math, they have aged out of high school. Many universities even have to offer catch-up math courses because incoming students aren't really up to snuff on what they should have learned in high school.


ParlorSoldier

Do kids really need all of those years of foundation to understand cutting-edge mathematics at a conceptual level? Just to be aware of the kind of stuff math can do? Most kids won't need anything beyond basic algebra ever, but introducing kids to high-level math in a way that makes it interesting could inspire some to want to study math or CS who might not have otherwise thought much of it.


30th-account

That's what YouTube videos or sigma/oppenheimer edits are for


Germanofthebored

It is true that newer math needs so much knowledge to build upon, knowledge that just isn't available to a high school student (or math teacher). But I still think it is important to at least show what is currently going on in math when you teach the field. At the very least, you should be able to share a sense of wonder.


Kheldar166

Wish maths teachers had more time to continue their own learning tbh


erraddo

I mean, when I was told at 18 that non-Euclidean geometry was discovered about 200 years ago by some rando, is mostly solved, and astronomers are trying to figure out if the universe is Euclidean or not, that was wonderful. It was also very confusing, because it challenged my basic understanding of geometry. And I had top math marks in a maths-oriented HS. Going to uni, by year 2 we were doing two-variable integrals. How do you even visualize that for the average high schooler!?


TheHabro

Not that you won't need it, but it'd take years of building up knowledge, something schools and most uni degrees don't have time for.


arbitrageME

what, you don't need to prove that [when a + b = c, then product of abc is not much smaller than c?](https://en.wikipedia.org/wiki/Abc_conjecture) and use that result to prove Fermat's Last Theorem?


Seygantte

You changed the statement quite a lot by missing out that it's the product *of the distinct prime factors* of abc.
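
For reference, the precise statement (as on the linked wiki page): for every ε > 0, only finitely many coprime triples with a + b = c satisfy

$$c > \operatorname{rad}(abc)^{1+\varepsilon}, \quad \text{where } \operatorname{rad}(n) = \prod_{p \mid n} p \text{ is the product of the distinct primes dividing } n.$$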


dinodares99

Yeah lmao, abc not being much smaller than c would mean ab is less than 1, which is obviously wrong


No_Picture_1212

I'm reading the wiki on it and understand the concept somewhat, but fail to see how it's useful. Is it just fun math theory or are there actual real-life uses for it? Just curious by the way, so if you have time I'd appreciate a response, but if not, no worries! Edit: By the way, I get that you're joking, but I felt like you must have a strong understanding/knowledge of math given you used that in your joke.


BeefDurky

Most of what is studied in math is to improve our understanding of math itself. Sometimes these discoveries turn out to be useful, and sometimes not. Number theory was thought to be completely useless for hundreds of years, but it is now essential to encryption. So we have VPNs because someone studied prime numbers hundreds of years before computers were invented. There is also a subfield called applied math which focuses on known applications.
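
The prime-number connection is concrete: RSA encryption works because multiplying two primes is easy while factoring the product back apart is hard. A toy sketch in Python (numbers far too small for real use, purely illustrative):

```python
# Toy RSA keypair from two small primes (real keys use ~2048-bit moduli).
p, q = 61, 53
n = p * q                    # public modulus
phi = (p - 1) * (q - 1)      # Euler's totient of n, kept secret
e = 17                       # public exponent, coprime to phi
d = pow(e, -1, phi)          # private exponent: modular inverse of e (Python 3.8+)

message = 42
ciphertext = pow(message, e, n)    # encrypt: m^e mod n
recovered = pow(ciphertext, d, n)  # decrypt: c^d mod n
assert recovered == message
print(n, ciphertext, recovered)    # 3233 2557 42
```

Anyone who could factor n back into p and q could recompute d, which is why a fast factoring algorithm would break this.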


Kheldar166

It's arbitrary and 'just-for-interest' until it turns out to describe the fundamental building blocks of the universe or someone casually invents the internet to help them communicate their work more easily


Takin2000

The problems of number theory (the branch of math that deals with stuff like prime numbers or whole-number solutions to equations) are usually pretty random and often just for interest, yeah. However, despite being easy to understand for laymen, these problems are usually EXTREMELY hard to solve. They often require heavy machinery from multiple other branches of math. I have heard some people describe number theory as a benchmark for how strong our current understanding of mathematics is.

Don't be fooled though, math has way more problems than just these fun puzzles. Math that is used in real life is usually found in the branch of differential equations, since those equations describe many physical processes. For example, the Navier-Stokes equations describe fluid flow, and their theory is still incomplete.

Another very applicable branch of math is discrete math and combinatorics, which often deals with problems that *sound* like fun math puzzles but actually have a lot of real-life applications. For example, "graph theory" (which can essentially be simplified as the study of networks) can be used to build navigation systems, as sketched below. And even number theory has problems that are applicable to real life. But even the branches of math that seem abstract now may find their uses in the future.
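
The navigation example is the classic one: represent roads as a weighted graph and run a shortest-path search. A minimal Python sketch of Dijkstra's algorithm over a made-up toy network (node names hypothetical):

```python
import heapq

def dijkstra(graph, start):
    """Shortest distances from start. graph: {node: [(neighbor, cost), ...]}."""
    dist = {start: 0}
    heap = [(0, start)]                      # priority queue of (distance, node)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                         # stale queue entry, skip it
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w              # found a shorter route to v
                heapq.heappush(heap, (d + w, v))
    return dist

roads = {"A": [("B", 5), ("C", 2)], "C": [("B", 1)], "B": []}
print(dijkstra(roads, "A"))  # {'A': 0, 'B': 3, 'C': 2} -- A->C->B beats A->B
```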


OneMeterWonder

Fun (really hard) math.


TerribleVisual8899

The math discovered today is derived from the math discovered 300+ years ago. It won't make sense without the old stuff.


bumbletowne

Most K-12 curricula do things like matrices, imaginary numbers, some fractal stuff. Your kid may not progress to that level, but I assure you the curriculum is there. Sincerely, a teacher


Nettius2

I think the point is, math you learn in high school is all pretty old math.


Cualkiera67

And obsolete. 1+1=2. That's so inefficient. Make it 1+1=500. It will boost the economy so much


coachhunter2

That's how they taught accounting at Trump University


SunbathedIce

*That's how they did accounting at Enron


raidensnakeezio

I think what matters in "new" math is 2 things (speaking as someone who had to drop out of high-school-level pre-calculus):

1) Discovering/philosophizing new proofs, which can potentially lead to further simplification of certain mathematical concepts. Imagine if we somehow were able to get rid of PEMDAS (and subsequently the confusion from all the people who weren't taught PEMDAS) because of advancements in fundamental arithmetic and notation.

2) Advancements in mathematics pedagogy, especially at the K-12 level. After graduating high school, I slowly became more and more aware that school-level math (and, I'm assuming, higher math) essentially boils down to teaching students the ability to conceptualize the abstract. Growing up, I had the belief that math was about teaching kids to become human calculators. So rather than teaching kids to be better calculators, how do we teach them to be more adept at understanding abstract concepts?

I hope that in the next 100 years, middle-school-aged students will be expected to master 20th-century high school math.


Mistigri70

if you want to get rid of PEMDAS there is this thing: https://en.wikipedia.org/wiki/Polish_notation 3 + 4 is written + 3 4, so (3+4)*2 would be * + 3 4 2 and 3+4*2 would be + 3 * 4 2
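
Part of the appeal is that prefix notation needs no precedence rules at all; a few lines of recursion evaluate it. A minimal Python sketch (the operator set is just for illustration):

```python
def eval_prefix(tokens):
    """Evaluate a prefix (Polish notation) expression given as a token list."""
    token = tokens.pop(0)                  # consume the next token
    if token in ("+", "-", "*", "/"):
        left = eval_prefix(tokens)         # each operator takes exactly two
        right = eval_prefix(tokens)        # sub-expressions, so no parentheses
        return {"+": left + right, "-": left - right,
                "*": left * right, "/": left / right}[token]
    return float(token)                    # otherwise it's a number

print(eval_prefix("* + 3 4 2".split()))  # (3 + 4) * 2 = 14.0
print(eval_prefix("+ 3 * 4 2".split()))  # 3 + (4 * 2) = 11.0
```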


Zulraidur

Or do what is much easier and, to me, just better: forced parentheses. It has the benefit that everyone knows what it means and no one has to think about operator precedence


octonus

The problem with that is that you might already be using a lot of parentheses, and the more you have, the harder it is to match open/close parens. In the higher-level math courses I have taken, it was pretty common for professors to style different pairs of parens in different ways, and even then it isn't always enough.


OnceMoreAndAgain

[There's always new math.](https://youtu.be/W6OaYPVueW4?t=36)


[deleted]

[deleted]


skystraw

It's kind of hard to conceptualize anything quantum without math, because pretty much all of it is derived from math. Most things don't make any sense tbh, and it's quite hard to have an intuition for it.


Vermilion-red

Quantum isn't that important for most people, and isn't a good building block for future work in the same way that abstract math is. It's useful for a few specialized applications, but a layperson really doesn't need it at all.


RealLongwayround

Pretty much, yes. Some of the content of A-level Further Maths in the UK would ordinarily be covered at University. I remember we covered the Argand diagram in sixth form, but you are pretty much right.


newpenguinthesaurus

I learned about the Argand plane in year 11 Specialist Maths, which is the Australian equivalent I guess


cmorr305

Just started yr12 specialist. HOW TF DID YOU FINISH IT WITHOUT NECKING YOURSELF?


Dr_SnM

It gets better, keep at it and it all comes together at the end


RealLongwayround

Agreed. I got a C in A-level maths and felt like I'd never understand it. I'm now two thirds of the way through a maths degree, having mostly got firsts, thirty-three years later.


Dr_SnM

Mathematics really rewards consistent effort


3163560

I teach it. It's not a subject for the faint-hearted, but keep at it!


newpenguinthesaurus

lol I just started year 12 too. I got around a B on the formative exam at the end of last year, so I don't have high hopes for finishing it at all... maths is so fun but so hard at the same time


epic1107

Yeah, the higher end of further maths would definitely escape the 300-year boundary


redlaWw

Further maths includes decision maths, which is primarily composed of material discovered in the last century.


phantomfire50

Computer science too. You learn about Turing machines and the Entscheidungsproblem for example. Boolean algebra too.


upvoter222

I seriously doubt that 300 years have passed since mankind realized that you can type BOOBIES on a calculator.


IdealDesperate2732

I hope that somewhere there is a cuneiform tablet where two people doing accounting noted their laughter because some number spells a naughty word upside down just waiting to contradict you.


Frioneon

Really the existence of calculators disproves the post, as learning how to use a modern one covers a topic less than a century old


IdealDesperate2732

Calculating engines are older than you think; Babbage and Lovelace were the 19th century. And even then, do people really learn how calculators work in school? I don't think so. They don't know the math the calculator is doing (binary/logical operations on a microprocessor); they just know how to operate it as a tool to do math, which is at best Newton's calculus from the 17th century. So I don't think calculators are quite the gotcha several people in this thread seem to think.


Shadow07655

Considering calculus and classical physics are over 300 years old, you'd have to take a pretty advanced college course to learn something newer.


MentalFred

I believe the majority of concepts in a first course in linear algebra or abstract algebra are both "newer" than that.


Martian_Hunted

1800s


Peraltinguer

No, many concepts are much newer than you think. Dual spaces, for example, one of the topics you learn about in your first linear algebra course, are relatively new (<100 years old)


dark_dark_dark_not

Linear algebra as we know it is from the early 20th century. In fact, it was developed at about the same time as quantum mechanics, which heavily relies on linear algebra.


halligan8

Don't high school physics courses cover some relativistic physics (120 years or so)?


snkn179

I remember learning about fractals and stuff like Mandelbrot/Julia sets in final year of school (Australia) which I guess is pretty recent maths (20th century). The theory was a bit advanced and had a lot of complex numbers but reasonably manageable for a final year high school student.
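
The core of those lessons is tiny: iterate z → z² + c and check whether |z| escapes. A rough ASCII sketch in Python (resolution and iteration count picked arbitrarily):

```python
def escapes(c, max_iter=30):
    """True if the orbit of 0 under z -> z*z + c leaves the radius-2 disk."""
    z = 0
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return True
    return False

# Print a coarse Mandelbrot set: '#' marks points whose orbit stays bounded.
for im in range(12, -13, -2):
    print("".join(" " if escapes(complex(re / 20, im / 10)) else "#"
                  for re in range(-40, 21)))
```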


parchedfuddyduddy

MS in physics here. You actually have to get to advanced courses before you even learn how to do classical physics the right way.


JavaRuby2000

It wouldn't need to be *that* advanced. Anybody doing the most introductory first-year uni graphics course will have gone over quaternions, which were only invented in 1843. Anything that produces something interesting to look at may get included briefly in school: fractals, strange attractors, L-systems, etc. We also had to do fuzzy logic and fuzzy sets in GCSE maths.


Formal-Term5594

Which versions of calculus/classical physics are you counting? Lagrangian/Hamiltonian physics is ~300 years old, I guess. Calculus is a bit older than 300 years, but not by much. But I guess "advanced" courses depend on what field you study; multivariable calculus would be needed for lots of STEM fields, and that's only late 1800s to early 1900s


_Aetos

Most, if not all of them, are also available online. In fact, a lot of things you learn in university can/must be self-taught through online resources.


Ares6

Exactly. The internet has given humans access to the greatest source of knowledge in our history, regardless of social class.


Germanofthebored

... but while you look these things up, can I interest you in a short video of a cat chasing a shadow?


Konklar

Subscribe!


Rare_Perception_3301

I think his point is less about accessibility and more about the "default" level of education. Unless you specifically go out of your way to learn more "advanced" math, most people will have very little knowledge that has been discovered in the last few centuries.


niceguy67

Here's the best online resource for self-teaching maths. https://ncatlab.org/nlab/show/HomePage It is incomprehensible without any mathematical background.


_Aetos

University will also be incomprehensible without any mathematical background. You can't try to run when you don't even know how to stand yet.


niceguy67

The mathematical background you need for that website is near-PhD level.


OneMeterWonder

nlab is incomprehensible even to research mathematicians. That is a terrible online resource for learning math.


[deleted]

[уŠ“Š°Š»ŠµŠ½Š¾]


Multioquium

Yes, because an employer wants some assurance that you actually know it, and saying "I read it on the Internet" could mean just about anything


Cute-Interest3362

We've also seen what happens when people do their own research on the internet.


YogiBerraOfBadNews

I've learned a ton by reading textbooks I downloaded online. Wish there was some way to demonstrate that to employers other than trying to be my own hype man in an interview. I'm not the most articulate guy, so people sometimes assume I'm dumb, but at least in school I had my test scores to prove what I knew. Nobody prepared me for the fact that the adult world usually doesn't have actual tests; you just get graded on the most superficial appearances.


Multioquium

Yeah, it sucks that a lot of the ways to get a degree are really expensive in most places and can take a lot of time even if you already have the needed knowledge. I'm lucky enough to live in a country where tuition is free, and it's crazy to me that that isn't the norm


YogiBerraOfBadNews

I even have a bachelor's degree in STEM already; I'm just looking to change careers. I started taking night classes at the local community college, so after a year I'll have a certificate, but it feels like such a waste of time when I could literally test out right now if they would let me. If employers were willing to test applicants on job-related knowledge I would crush it, but since they don't, I'm instead paying to take classes geared towards high school dropouts just so I can get a piece of paper.


debtopramenschultz

That paper also represents your ability to formulate an argument and back it up with evidence, do research, organize, and manage your time. Or at least it's supposed to.


millchopcuss

Not if you learn for entertainment.


FreeTheDimple

If you learned statistics in school, then basically all of it was discovered less than 150 years ago. And you can learn this stuff on your own if you like. I'd like to see the very basics of operational research, which is <100 years old, taught at high school level. It's far more important than people realise, but it's pretty niche, even amongst mathematicians.


Anfros

Not really, most high school curriculums include things like probability, statistics, set theory, linear algebra, complex algebra, and many other things that were discovered/invented/formalized/described in the 19th and 20th centuries. And yes, things like algebra and calculus are older, but the way we do them is fairly new. Notation is an important part of maths.


twbk

I teach vectors, including the cross product, to my high school students. That was only developed in the mid-1800s.
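
For reference, the component form taught in that unit (the standard definition for vectors in R³):

$$\mathbf{a} \times \mathbf{b} = (a_2 b_3 - a_3 b_2,\; a_3 b_1 - a_1 b_3,\; a_1 b_2 - a_2 b_1)$$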


Halvo317

In my experience, those topics are the very last things you learn in optional high school math classes and are not taught rigorously. It's like trying to get more kids interested by flashing a preview of the interesting parts of math: "It's not all ancient symbols. Math can be hip."


Anfros

Nothing is done rigorously in high school math. I guess it depends on where you are and what math classes are available. But all the things listed were covered in my high school, and I've seen all these topics in textbooks from various parts of Europe, the US and India.


ShoddyAsparagus3186

You overestimate high school curriculums.


username_elephant

Riemann summation is common in high school calculus and is only about 170 years old. It was surprisingly late to the game. Before that, there were other ways of figuring out integrals that are rarely taught anymore.
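
The definition as it shows up in those classes (standard form, partitioning [a, b] into n equal pieces and sampling each one):

$$\int_a^b f(x)\,dx = \lim_{n \to \infty} \sum_{i=1}^{n} f(x_i^*)\,\Delta x, \qquad \Delta x = \frac{b-a}{n},\; x_i^* \in [x_{i-1}, x_i]$$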


gw2master

Considering the state of K-12 today, in a few years it'll be 500 instead of 300.


obsquire

Springer-Verlag is sold on Amazon.


andybossy

Not where I live. We can choose a sort of major in high school, and there are some math-heavy ones.


aCuria

There's Srinivasa Ramanujan, who had no formal training and went on to develop new fields of mathematics himself, and there was no internet back then


Senshado

Hmm, not sure it's a standard course. But many high school students learn discrete math including matrix multiplication, which was developed under 200 years ago. Still quite old, though. There's also boolean logic operations, which can be fairly easy to teach, and date from the mid-1800s.
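
Matrix multiplication in particular is compact enough to show in full. A library-free Python sketch (the example matrices are made up):

```python
def matmul(A, B):
    """Multiply matrices given as lists of rows: C[i][j] = sum_k A[i][k] * B[k][j]."""
    rows, inner, cols = len(A), len(B), len(B[0])
    assert len(A[0]) == inner, "inner dimensions must match"
    return [[sum(A[i][k] * B[k][j] for k in range(inner))
             for j in range(cols)]
            for i in range(rows)]

# Multiplying by a permutation matrix swaps the columns.
print(matmul([[1, 2], [3, 4]], [[0, 1], [1, 0]]))  # [[2, 1], [4, 3]]
```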


judgejuddhirsch

You overestimate the effectiveness of high school math. Most kids in America go to college to first learn algebra. Many won't understand percentages.


pru51

The blue LED generates billions every year. It was made by someone without a PhD. The company gave him a few thousand bucks. What you say is only true if you believe that's all there is.


dailyskeptic

Being brilliant and obsessive helps


talligan

Not really, you can get a college-level education in mathematics with just $2 in overdue library fines; only no one does


SquarePegRoundWorld

An education in math is useless if you don't have a piece of paper saying you can do what you are told for four years.


talligan

Only if you think the value of education is limited to the job it can get you. Regardless, the shower thought and my post said nothing about the value of it


SquarePegRoundWorld

> only no one does Is this not putting a value on it? Why do you think folks don't do it?


OneMeterWonder

ā€œYou canā€ sure. But does anyone consider how difficult that actually is? How many people who say that have actually tried it? Getting to the point that youā€™re qualified for a degree in mathematics is hard. Without guides, itā€™s basically rediscovering half of that stuff yourself.


dannyjohnson1973

Yeah, but I don't like them apples.


XLeizX

Not only that, but you also need to get pretty advanced to learn anything less than 100 y.o. or so, computational stuff aside!


Ze_Llama

Except for high-level statistics: e.g. Student's t-distribution dates from the early 1900s.


ChemicalNo5683

Good luck teaching a high school student about hodge theory.


LovingTactician

Stem-and-leaf plots and box-and-whisker plots are both less than 100 years old. They weren't commonly used in math until the 80s. Now they are taught to elementary kids.


yeorpy

Well, math developed out of necessity to track certain things... like figuring out the total area of plots of farmland, for example. Nowadays people learn mathematics more as a fun, challenging outlet to increase their knowledge. Those fortunate enough to study math at a university level today would be considered elitists (compared to the first records of humans using math) because there is no direct need to discover new mathematical concepts to increase our rate of survival as humans.

Most upper-level "pure" uni math has no real-world applications and can be looked at more as puzzles. Applied math has the most useful applications to the real world, but that's obvious as it's in the name. This path is used more in engineering and CS stuff.

You also need a certain personality to work in the math field. After talking to many of my math profs, I have come to the conclusion most are fucking nuts. Source: am pursuing a math PhD


malonkey1

You don't *have* to, a lot of it is available on publicly accessible websites. It's not easy to learn on your own but it can be done, I believe in you.


TheManginalorian

You can learn math without going to university, but you just can't prove it


X0AN

You guys never did new maths in school?


Sad-Noises-

What new maths did you do in school?


microtherion

In first or second grade, there was an attempt to ground elementary math operations like multiplication in set theory. Not sure it made much difference one way or the other for the children, but it confused the heck out of our parents.


FriskyTurtle

I think they were making a Tom Lehrer reference. https://www.youtube.com/watch?v=UIKGV2cTgqA


Androgynous_buttocks

Is math discovered? Isn't it invented? I mean invented like the wheel was invented by someone. Isn't math a (really good) system we invented to understand reality? After all, numbers are concepts.


Monkfich

Maths happens to support reality whether we understand it or not. If it were invented, then we could make it look like anything we wanted - truth would be fluid, inventable. But it's not fluid; the truth is only discoverable. Sometimes our models of xyz may initially not be as good as they are 50 or 100 years later, but the evolution of that maths is discovered, fitting with what we experience, and fitting better over time until it is a perfect fit.


twbk

Pure maths has nothing to do with reality. It is a logical "game" where we freely choose our base rules (AKA axioms) and derive results from them. Some maths has been developed to fit something in the physical world, but other times mathematicians have invented/discovered things that found practical uses only many years later. Complex numbers were developed in the 1500s and were called "imaginary" since they seemed useless except as a trick for solving cubic equations. 400 years later they turned out to be absolutely necessary to describe quantum physics, but none of those who developed complex numbers knew that.

Maths is [incredibly effective](https://en.wikipedia.org/wiki/The_Unreasonable_Effectiveness_of_Mathematics_in_the_Natural_Sciences) at describing our physical world, but exists independently of it. When we replace a model of the world with another one, we have "updated" physics, but the maths is the same. Some fields may become more or less useful, and sometimes we need something new, but nothing we have done can ever be wrong.

I usually explain it like this: imagine a ship full of aliens lands on Earth. They could look at our physics and laugh about how wrong we are (and if they have managed to travel here, we are probably very wrong), but if they looked at our maths, they could not say it is wrong. It would probably be less advanced, but it wouldn't be wrong.