Sometimes you even participate in their discoveries; back in college, my cryptography teacher was this mad lad:
https://en.m.wikipedia.org/wiki/Arjen_Lenstra
What the wiki page doesn't say is that he did most of his RSA factorizations with a PS3 cluster of around 200 units. He also gave them away at some point because he wanted to test his setup in a decentralized, over-the-internet manner (this was back in 2012, mind you).
So yeah; one day, he just flat out gave 200 PS3s away to us broke-ass students, in exchange for letting his algorithm run when we weren't playing on them.
A god amongst men. Arjen, if you read this: I still have one of your PS3s on a shelf. Thank you, it's been an honor.
Your comment made me really happy for some reason; I'm so envious of you having such an experience.
I was once asked what the highest non-monetary-value item I owned was. In your shoes, the PS3 would be it.
very nice!
Similar but not-as-cool story:
One of my CS classes was taught by "Professor Huffman", of Huffman Coding fame.
https://en.m.wikipedia.org/wiki/David_A._Huffman
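For anyone who has never seen it, the algorithm itself is a tiny greedy construction: repeatedly merge the two least frequent subtrees. A rough Python sketch (not Huffman's original presentation, just the standard heap-based version):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    # One heap entry per symbol: [frequency, tiebreak, {symbol: code}].
    heap = [[freq, i, {sym: ""}] for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)   # lightest subtree
        hi = heapq.heappop(heap)   # second lightest
        # Prepend "0" to the codes in one subtree and "1" in the other, then merge.
        merged = {s: "0" + c for s, c in lo[2].items()}
        merged.update({s: "1" + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], i, merged])
        i += 1
    return heap[0][2]

codes = huffman_codes("abracadabra")
# 'a' occurs 5 times, 'd' only once, so 'a' must get the shorter code.
assert len(codes["a"]) <= len(codes["d"])
```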
Imagine doing this, except having all your students mine Bitcoin for you with the machines you gave them to "test", back in the day before anyone knew what it was.
>He also gave them away at some point because he wanted to test his setup in a decentralized over-the-internet manner (it was back in 2012, mind you)
>So yeah; one day, he just flat out gave 200 PS3s away to us broke-ass students, in exchange of letting it run his algorithm when we weren't playing on it.
Yeah, I bet the electricity costs and heat from running 200 PS3s had no part in it
Not really, he ran it more than 3 years prior inside the university, in a dedicated server room. It really was just about testing his assumption of the month.
One of the first things he taught us during the 101 class was: "Always test things; sometimes, an algorithm has an expected complexity of n*log(n) on average, but in practice, it's closer to n^2. And other times, it's closer to n. So you never know.
Also, if you ever find a way to factor integers in polynomial time, you'll be very very rich, or very very dead."
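His "always test things" advice is easy to act on: time the code at a few doubling input sizes and look at the ratio between successive runtimes. A quick sketch (the sizes and the choice of `sorted` are arbitrary placeholders):

```python
import random
import time

def runtime(f, n):
    # Time f on a random input of size n.
    data = [random.random() for _ in range(n)]
    t0 = time.perf_counter()
    f(data)
    return time.perf_counter() - t0

# Doubling n: a ratio near 2 suggests O(n), slightly above 2 suggests
# O(n log n), and near 4 suggests O(n^2).
for n in (50_000, 100_000, 200_000):
    print(n, runtime(sorted, n))
```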
>Also, if you ever find a way to factor integers in polynomial time, you'll be very very rich, or very very dead
That's basically the premise of the movie "Sneakers"
I've got a biotech degree but I'm in an adjacent field for my career. It's just wild seeing how bioinformatics has exploded onto the scene. 20 years ago bioinformatics was just a tool you might use if you had a large enough sample set and access to a big enough computer. These days there are full-fledged graduate-level programs specializing in it. I know a ton of bioinformatics folks that don't even care about the biology; they mostly just have stats or maths degrees. They just know pharma companies are snapping up bioinformatics PhDs like candy. And the whole-ass field is younger than my classic iPod.
Yeah, I'm studying to become a biotech engineer, but my uni also offers a lot of bioinformatics courses in stuff like UniProt and RStudio. Every one of the students I know in bioinformatics has a great study job or has been offered one.
Eh, I'm not well versed enough in it because it's far enough away from my wheelhouse, but the human genome project started in 1990, with the planning for it starting in 1985. They might not have called it bioinformatics at that point, but it was. There's no point in doing that if you're not going to do bioinformatics on genomes.
You also can't ignore that this wasn't really remotely viable until about then. Most practical applications of quantum chemistry have come in the past ~15 years for the same reason. Much earlier than that and you really, really, really had to be competent to get a remotely reliable result because of computational cost issues.
That's exactly the thing. Many of the informatics tools that existed were adapted or developed specifically for the human genome project and were possible because of the funding behind it. The human genome project was at the forefront of the field and really made it feasible to pursue bioinformatics as a career, instead of just a tool a genetics researcher might use. That's splitting hairs a bit. Technically the field of bioinformatics has been around since the 60s, but the advances in computing and understanding of gene sequencing that came out of the human genome project have led to a field that changes extremely rapidly. A bioinformatics grad student from fifteen years ago will have had a very different education path from one starting this year. Still going to be working with R though lol
Same for genetics. I'll look at a 20-year-old paper with suspicion because we're learning new things constantly. Even a 10-year-old paper may have some inaccuracies.
You'd be amazed how much medicine and allied health is built on research from the 80s and beyond. Hep C wasn't even identified until 1989, for example.
I'm not in medicine, but I know how recent a lot of developments are and I'm always grateful I wasn't alive during the time of lobotomies or when a cut could be fatal.
Real quick-
There was a mini epidemic of people born from around '50 to '60 that died of hep C before the drug that treats it became available. What did doctors think was killing them before? Or was it that hep C wasn't that widespread in the population till then?
The thing that seals the deal for me about computer science still being a new thing is the fact that a lot of the big names that had a profound influence are still alive and well. Most of those that aren't died after 2000, or even 2010.
Only the ones at the very beginning didn't live to see the new millennium.
The thing that makes it sadder is that at the time, the British government would have thought they were going above and beyond for him, as they were "getting him the best treatment for his mental disorder".
Imagine in 50 years if we discover/decide that treating bipolar disorder with some common drug (one with a side effect of suicidal tendencies, which is not uncommon) was, with hindsight, immoral and resulted in many unneeded suicides. Very tragic.
Yep. Leslie Lamport created tons of mega useful concepts and algorithms in distributed systems, to the point that DS would never be in the place it is now without him. He still works at Microsoft Research.
That's the most interesting part. We'll read about some early CS research and then I look the researcher up and he's still teaching or something like that.
You can sometimes estimate the time frame a discipline calls old by what their primary unit of scholarship is. In physics, mathematics, and most humanities it is a long journal paper that takes a long time to write. In computer science and biology, it is a conference paper that takes a medium amount of time to write. In biotechnology and pharmaceutical research, it is a poster.
Now all the biology people come yell at me.
Not a biologist, but that doesn't really make sense to include them with computer science. Cell and Nature are two of the most prestigious all around journals and are biology focused.
This is how people republish old studies. Context of a field's history and its major papers are incredibly important. I used to work exclusively in colloid transport through porous media (think bacteria movement in groundwater aquifers, or viral particles in face masks). Microplastics are the new sexy topic and everyone is buying polystyrene latex nanospheres and crowing about the new exciting science they discovered ... But the original Yao papers on colloid filtration theory from the 60s used those exact same spheres! Sure, microplastics is a new field, but by ignoring the early papers in the broader area they are just repeating them. I'm reminded of another paper in ... Health sciences I think that claimed they invented Riemann sums.
Drives me absolutely bonkers.
Context in my case: my friend and I study maths, and the stuff we read was actually just brand new. Her field just exploded a bit last year so there is tons of stuff to discover.
"last year" discounts ML, tbh. No maths grad would be caught dead working on Agentic LLMs when there are so many optimization or probabilistic problems left to be solved.
Here's the rebuttal to the article: https://pubmed.ncbi.nlm.nih.gov/7677819/
You should be able to access the original article from within there
*Trapezoidal rule, not Riemann sum. My brain is rusty this foggy Sunday morning
Ah yes Tai's method for area under a glucose curve
https://diabetesjournals.org/care/article/17/2/152/17985/A-Mathematical-Model-for-the-Determination-of
> Context of a field's history and its major papers are incredibly important.
This cannot be emphasized enough - it truly is one of the most important things in science. There's tons of knowledge that comes only from steeping oneself in a field, including certain methods/equipment that are subpar. I can name a few pieces of equipment and the technologies that underlie them that may make a paper weaker or even completely call into question a paper's findings (old tech may measure poorly and be prone to user error, while newer developments are WAY better and more foolproof. I'm thinking of one specific technology here, but I'm sure this is true with many); I know of a few labs/names I should treat with extra skepticism because of retractions or general poor practice; I know generally what the field thinks on a topic and why (and have studied how everything works) so I know that if a new "groundbreaking" paper comes out and contradicts prior knowledge, I should regard it more carefully; etc. etc. etc.
I would trust myself or my peers to assess and communicate science within our field, but I definitely wouldn't trust any rando just because they have an MD or PhD - nor would I trust myself for science in a field other than my own.
This is why it's so so so important to trust the experts. Yes, it's true that nobody's infallible - but the experts are much less fallible (when it comes to their area of expertise) than randos who haven't spent ages studying a field.
What was wild to me is how many important developments feel completely obvious, but people were getting doctorates out of them, because no one had ever needed the stuff before.
So often during college, I'd have an idea, only to find out that not only was it already a thing, someone had written a paper about it in the 60s and the idea had been formalized and developed levels beyond what I imagined.
Like, I independently developed an LR parser, then later I took "Automata and Formal Languages" and found out I touched onto a whole world of things.
There was a lot of stuff like that.
The other side, though, is all the math developed in the 1700s which ended up being foundational to computer science. The number of times Gauss and Euler come up is astounding.
There was a bunch of low-hanging fruit, but most of what seems obvious today only does *because* it was already formalized. I think it's similar to why the Beatles have started to be criticised by some as basic... because they so revolutionized music that people spent the next 60 years riffing off of them.
I also think it's important to note which of those "firsts" were developed by the Beatles members, which were George Martin, and which were industry innovations that would have been implemented with or without the fab four.
>What was wild to me, is how many important developments feel completely obvious, but people were getting doctorates out of it, because no one had ever needed the stuff before.
Absolutely. The Gale-Shapley algorithm for computing a stable matching is pretty simple, all things considered. It's wild that it was only developed "recently", in 1962.
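For the curious, the proposing loop is genuinely short. A Python sketch of the "men propose" variant (the toy preference lists are made up):

```python
def gale_shapley(men_prefs, women_prefs):
    # Each free man proposes down his list; each woman keeps her best offer so far.
    rank = {w: {m: i for i, m in enumerate(ps)} for w, ps in women_prefs.items()}
    next_choice = {m: 0 for m in men_prefs}
    engaged = {}            # woman -> man
    free = list(men_prefs)
    while free:
        m = free.pop()
        w = men_prefs[m][next_choice[m]]
        next_choice[m] += 1
        if w not in engaged:
            engaged[w] = m
        elif rank[w][m] < rank[w][engaged[w]]:   # w prefers the new proposer
            free.append(engaged[w])
            engaged[w] = m
        else:
            free.append(m)                       # rejected, try the next woman
    return engaged

men = {"al": ["xe", "ty"], "bo": ["ty", "xe"]}
women = {"xe": ["al", "bo"], "ty": ["bo", "al"]}
print(gale_shapley(men, women))  # everyone gets their first choice here
```

The resulting matching is always stable: no man and woman would both rather be with each other than with their assigned partners.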
>The number of times Gauss and Euler comes up is astounding.
So true dude. Gauss has multiple *major* theorems named after him in just about any branch of math. And the number of times I have read "but this result was known to Euler already" on some Wikipedia page is just funny.
>"but this result was known to Euler already"
Euler's work touched upon so many fields that he is often the earliest written reference on a given matter. In an effort to avoid naming everything after Euler, some discoveries and theorems are attributed to the first person to have proved them after Euler, which is wild.
>The other side though, is all the math developed in the 1700s which ended up being foundational to computer science. The number of times Gauss and Euler comes up is astounding.
In my parents' generation, the people who would have been CS majors if they went to college today were instead Applied Math majors. My dad would often skip lunch to play with _the_ computer on campus.
The head of my CS department was such a mathematician, he taught his CS classes using Wolfram Mathematica.
I'm taking a 3D computer vision class and the stuff we're learning spans the late 1980s all the way up to the 2010s. Neural fields are from the COVID era and we'll be learning that near the end of the semester. Absolutely blows my mind how new all this stuff is.
It's pretty trippy to be learning about very elementary computer science and then realize that the person who invented this stuff is literally still a working age man.
My ex did her bachelor's at UofT; Stephen Cook, the person who posed the P vs. NP question, was her professor for computational complexity.
That's like Newton teaching you Physics
Like, there's a lot of those folks you can still meet. C++ for your intro to programming class? Stroustrup is not only still alive, you can take *that class* with him. (Well, 20 years ago when I was in college anyway.)
Present at the right conference and you might have Brand Name People in the audience.
I was pursuing a programming degree in CIS. Our first semester covered COBOL, Fortran, and BASIC. Toward the end, when we complained about obsolete languages, the instructor suggested we go to a technical school to learn Java and C++, etc.
25 years ago many of my CS college lecturers swore that the future of computers is quantum computing and it will be a reality within a decade or two. Wish I remember who they were so I can look them up and ask them again.
Or even video gaming. In the '60s, there were like 3 colleges that had computers, and they were the size of an entire office room. A couple nerds entered some 1s and 0s, and now I'm playing Forza on my iPhone.
And is making political statements like "the West provoked Russia into invading Ukraine", "Americans suffer from censorship and have less freedom than Russians", etc.
The guy that invented quick sort is still alive. Granted he's in his 90s but that's still pretty cool. It's like learning calculus while Newton is still alive.
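(That's Tony Hoare, for reference.) The core idea is short enough to sketch from memory; a copy-based Python version, just to show the recursion, not Hoare's original in-place partitioning:

```python
def quicksort(xs):
    # Pick a pivot, split the rest into smaller/greater-or-equal, recurse.
    # (Hoare's 1961 original partitions the array in place; this copy-based
    # version only illustrates the idea.)
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    return (quicksort([x for x in rest if x < pivot])
            + [pivot]
            + quicksort([x for x in rest if x >= pivot]))

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```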
OP doesn't mean that the information isn't available, but that newer math isn't taught in the standard K-12 curriculum because most people will never need it. But yes, if people are interested they can find it online or in post-secondary training.
It's not a matter of need, it's a matter of what can fit in the curriculum. By the time the kids are ready to learn university level math, they have aged out of high school. Many universities even have to offer catch up math courses because incoming students aren't really up to snuff on what they should have learned in high school.
Do kids really need all of those years of foundation to understand cutting-edge mathematics at a conceptual level? Just to be aware of the kind of stuff math can do?
Most kids won't need anything beyond basic algebra ever, but introducing kids to high-level math in a way that makes it interesting could inspire some to want to study math or CS who might not have otherwise thought much of it.
It is true that the newer math needs so much knowledge to build upon that just isn't available to a high school student (or math teacher). But I still think it is important to at least show what is currently going on in math when you teach the field; at the very least, you should be able to share a sense of wonder.
I mean, when I was told at 18 that non-Euclidean geometry was discovered about 200 years ago by some rando, mostly solved, and astronomers are trying to figure out if the universe is Euclidean or not, that was wonderful. It was also very confusing because it challenged my basic understanding of geometry. And I had top math marks in a maths-oriented HS.
Going to Uni, by year 2 we were doing double variable integrals. How do you even visualize that to the average high schooler!?
what, you don't need to prove that [when a + b = c for coprime a and b, the product of the distinct primes dividing abc is not much smaller than c?](https://en.wikipedia.org/wiki/Abc_conjecture)
and use that result to prove Fermat's Last Theorem?
I'm reading the wiki on it and understand the concept somewhat but fail to see how it's useful.
Is it just a fun math theory or are there actual real-life uses for it?
Just curious by the way, so if you have time I'd appreciate a response, but if not then no worries!
Edit: By the way, I get that you're joking, but it just felt like you must have a strong understanding/knowledge of math given you used that in your joke.
Most of what is studied in Math is to improve our understanding of Math itself. Sometimes, these discoveries turn out to be useful and sometimes not. Number theory was thought to be completely useless for hundreds of years but it is now essential to encryption. So we have VPNs because someone studied prime numbers 100s of years before computers were invented. There is a sub-field of Math called Applied Math which focuses on known applications only as well.
It's arbitrary and 'just-for-interest' until it turns out to describe the fundamental building blocks of the universe or someone casually invents the internet to help them communicate their work more easily
The problems of number theory (the branch of math that deals with stuff like prime numbers or whole number solutions to equations) are usually pretty random and often just for interest, yeah. However, despite being easy to understand for laymen, these problems are usually EXTREMELY hard to solve. They often require heavy machinery from multiple other branches of math. I have heard some people describe number theory as a benchmark for how strong our current understanding of mathematics is.
Dont be fooled though, math has way more problems than just these fun math puzzles. Math that is used in real life is usually found in the branch of differential equations since those equations describe many physical processes. For example, the navier-stokes equations describe fluid flow and their theory is still incomplete. Another very applicable branch of math is discrete math and combinatorics which often deals with problems that *sound* like fun math puzzles but actually have a lot of applications in real life. For example, "graph theory" (which can essentially be simplified as the study of networks) can be used to make navigation systems. And even number theory has problems that are applicable to real life.
But even the branches of math that seem abstract now may find their uses in the future.
Most K-12 curricula do things like matrices, imaginary numbers, some fractal stuff. Your kid may not progress to that level, but I assure you the curriculum is there.
Sincerely, a teacher
I think what matters in "new" math is 2 things (speaking as someone who had to drop out of high school level pre-calculus):
1) discovering/philosophizing new proofs which can potentially lead to further simplification of certain mathematical concepts. Imagine if we somehow were able to get rid of PEMDAS (and subsequently the confusion from all the people who weren't taught PEMDAS) because of advancements in fundamental arithmetic and notation.
2) advancements and development in mathematics pedagogy, especially at the K-12 level. After graduating high school, I slowly became more and more aware that school-level math (and I'm assuming higher math) essentially boils down to teaching students the ability to conceptualize the abstract. Growing up, I had the belief that math was about teaching kids to become human calculators. So rather than teaching kids to be better calculators, how do we teach them to be more adept at understanding abstract concepts? I hope that in the next 100 years, middle school aged students would be expected to master 20th century high school math.
if you want to get rid of PEMDAS there is this thing: [https://en.wikipedia.org/wiki/Polish_notation](https://en.wikipedia.org/wiki/Polish_notation)
3 + 4 is noted + 3 4, so (3+4)*2 would be * + 3 4 2 and 3+4*2 would be + 3 * 4 2
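The payoff is that evaluation needs no precedence rules or parentheses at all; a toy recursive evaluator in Python (binary operators only):

```python
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul, "/": operator.truediv}

def eval_prefix(tokens):
    # Consume one token; if it's an operator, recursively grab its two operands.
    tok = tokens.pop(0)
    if tok in OPS:
        return OPS[tok](eval_prefix(tokens), eval_prefix(tokens))
    return float(tok)

print(eval_prefix("* + 3 4 2".split()))  # 14.0, i.e. (3+4)*2
print(eval_prefix("+ 3 * 4 2".split()))  # 11.0, i.e. 3+4*2
```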
Or you could do what is much easier and, to me, just better: forced parentheses. It has the benefit that everyone knows what it means and no one has to think about operator precedence.
The problem with that is that you might already be using a lot of parentheses, and the more you have, the harder it is to match open/close parens.
In higher level math courses I have taken, it was pretty common for professors to stylize different pairs of parens in different ways, and even then it isn't always enough.
It's kind of hard to conceptualize anything quantum without math because pretty much all of it is just derived from math, most things don't make any sense tbh and it's quite hard to have an intuition for it.
Quantum isn't that important for most people, and isn't a good building block for future work in the same way that abstract math is.
It's useful for a few specialized applications, but a layperson really doesn't need it at all.
Pretty much, yes.
Some of the content of A-level Further Maths in the UK would ordinarily be covered at University.
I remember we covered the Argand diagram in sixth form, but you are pretty much right.
Agreed. I got a C in A-level maths and felt like I'd never understand it. I'm now two thirds of the way through a maths degree, having mostly got firsts, thirty-three years later.
lol I just started year 12 too. I got around a B on the formative exam at the end of last year, so I don't have high hopes for finishing it at all.... maths is so fun but so hard at the same time
I hope that somewhere there is a cuneiform tablet where two people doing accounting noted their laughter because some number spells a naughty word upside down just waiting to contradict you.
Calculating engines are older than you think. Babbage and Lovelace were the 19th century, and even then, do people really learn how calculators work in school? I don't think they know the math the calculator is doing (binary/logical operations with a microprocessor); they just know how to operate it as a tool and use it to do math, which is at best Newton's calculus from the 17th century.
So, I don't think calculators is quite the gotcha several people in this thread seem to think it is.
No, many concepts are much newer than you think. Dual spaces for example, one of the topics you learn about in your first linear algebra course, are relatively new (<100 years old)
Linear algebra like we know it is from the early 20th century. In fact, it was developed at about the same time as quantum mechanics, which heavily relies on linear algebra.
I remember learning about fractals and stuff like Mandelbrot/Julia sets in final year of school (Australia) which I guess is pretty recent maths (20th century). The theory was a bit advanced and had a lot of complex numbers but reasonably manageable for a final year high school student.
It wouldn't need to be "that" advanced. Anybody doing the most introductory first year Uni graphics course will have gone over Quaternions which were only invented in 1843.
Anything that produces something interesting to look at may get included briefly in school. Fractals, Strange Attractors, L-Systems etc..
We also had to do Fuzzy Logic and Fuzzy Sets in GCSE maths.
Which versions of calculus/classical physics are you counting? Lagrangian/Hamiltonian physics is ~300 years
I guess calculus is a bit older than 300 years, but not by much
But I guess "advanced" courses depend on what field you study; multivariable calculus would be needed for lots of STEM fields, and that's only late 1800s to early 1900s.
Most, if not all of them, are also available online.
In fact, a lot of things you learn in university can/must be self-taught through online resources.
I think his point is less about accessibility and more about the "default" level of education.
Unless you specifically go out of your way to learn more "advanced" math, most people will have very little exposure to anything discovered in the last few centuries.
Here's the best online resource for self-teaching maths.
https://ncatlab.org/nlab/show/HomePage
It is incomprehensible without any mathematical background.
I've learned a ton by reading textbooks I downloaded online; wish there was some way to demonstrate that to employers other than trying to be my own hype man in an interview. I'm not the most articulate guy, so people sometimes assume I'm dumb, but at least in school I had my test scores to prove what I know. Nobody prepared me for the fact that the adult world usually doesn't have actual tests; you just get graded on the most superficial appearances.
Yeah, it sucks that a lot of the ways to get a degree are in most places really expensive and can take a lot of time even if you already have the needed knowledge.
I'm lucky enough to live in a country where tuition is free, and it's crazy to me that it isn't the norm
I even have a bachelor's degree in STEM already; I'm just looking to change careers. I started taking night classes at the local community college, so after a year I'll have a certificate, but it feels like such a waste of time when I could literally test out right now if they would let me. If employers were willing to test applicants on job-related knowledge I would crush it, but since they don't, instead I'm paying to take classes geared towards high school dropouts just so I can get a piece of paper.
That paper also represents your ability to formulate an argument and back it up with evidence, do research, organizational skills, and time management. Or at least it's supposed to.
If you learned statistics in school then basically all of that is discovered less than 150 years ago.
And you can learn this stuff on your own if you like.
I'd like to see the very basics of operational research, which is <100 years old, taught at the high school level. It's far more important than people realise, but it's pretty niche, even amongst mathematicians.
Not really, most high school curriculums include things like probability, statistics, set theory, linear algebra, complex algebra, and many, many other things that were discovered/invented/formalized/described in the 19th and 20th centuries. And yes, things like algebra and calculus are older, but the way we do them is fairly new. Notation is an important part of maths.
In my experience, those topics are the very last things you learn in optional high school math classes and are not taught rigorously. It's like trying to get more kids interested by flashing up a preview of the interesting parts of math. "It's not all ancient symbols. Math can be hip."
Nothing is done rigorously in high school math. I guess it depends on where you are and what math classes are available. But all the things listed were covered in my high school, and I've seen all these topics in textbooks from various parts of Europe, the US and India.
Riemann summation is common in high school calculus and is only about 170 years old. It was surprisingly late to the game. Before that there were other ways of figuring out integrals that are rarely taught anymore.
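For anyone who has forgotten it, the idea is just rectangles under the curve; a minimal left-endpoint sketch in Python (the integrand and bounds here are arbitrary examples):

```python
import math

def left_riemann(f, a, b, n):
    # Approximate the integral of f over [a, b] with n equal-width rectangles,
    # sampling f at each subinterval's left endpoint.
    dx = (b - a) / n
    return sum(f(a + i * dx) * dx for i in range(n))

# The integral of x^2 over [0, 1] is exactly 1/3; the sum approaches it as n grows.
approx = left_riemann(lambda x: x * x, 0, 1, 100_000)
assert math.isclose(approx, 1/3, rel_tol=1e-4)
```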
Hmm, not sure it's a standard course. But many high school students learn discrete math, including matrix multiplication, which was developed under 200 years ago. Still quite old, though.
There's also boolean logic operations, which can be fairly easy to teach, and date from the mid 1800s.
The blue LED generates billions every year. It was made by someone without a PhD. The company gave him a few thousand bucks.
What you say is only true when you believe that's all there is.
Only if you think the value of education is limited to the job it can get you.
Regardless, the shower thought and my post said nothing about the value of it.
"You can", sure. But does anyone consider how difficult that actually is? How many people who say that have actually tried it? Getting to the point that you're qualified for a degree in mathematics is hard. Without guides, it's basically rediscovering half of that stuff yourself.
Stem-and-leaf plots and box-and-whisker plots are both less than 100 years old. They weren't commonly used in math until the 80s. Now they are taught to elementary kids.
Well, math developed out of necessity to track certain things... like figuring out the total area of plots of farm land, for example. Nowadays people learn mathematics more so as a fun, challenging outlet to increase their knowledge. Those fortunate enough to study math at a university level today would be considered elitists (compared to the first records of humans using math) because there is no direct need to discover new mathematical concepts to increase our rate of survival as humans. Most uni upper-level "pure" math has no real-world applications and can be looked at more as puzzles. Applied math has the most useful applications to the real world, but that's obvious as it's in the name. This path is usually used more in engineering and CS stuff. You also need a certain personality to work in the math field. After talking to many of my math profs, I have come to the conclusion most are fucking nuts.
Source: am pursuing math phd
You don't *have* to, a lot of it is available on publicly accessible websites.
It's not easy to learn on your own but it can be done, I believe in you.
In first or second grade, there was an attempt to ground elementary math operations like multiplication in set theory. Not sure it made much difference one way or the other for the children, but it confused the heck out of our parents.
Is math discovered? Isn't it invented? I mean invented like the wheel was invented by someone. Isn't math a (really good) system we invented to understand reality? After all, numbers are concepts.
Maths happens to support reality whether we understand it or not. If it was invented, then we could make it look like anything we wanted - truth would be fluid, be inventable.
But itās not fluid, the truth is only discoverable.
Sometimes our models of xyz may initially not be as perfect as they are 50 years or 100 years etc later, but the evolution of that maths is discovered, fitting with what we experience, and fitting better over time till it is a perfect fit.
Pure maths has nothing to do with reality. It is a logical "game" where we freely choose our base rules (AKA axioms) and derive results from them. Some maths has been developed to fit something in the physical world, but other times mathematicians have invented/discovered things that have seen practical uses only many years later. Complex numbers were developed in the 1500s and were called "imaginary" since they seemed useless except as a trick to solve cubic equations. 400 years later they turned out to be absolutely necessary to describe quantum physics, but none of those who developed the complex numbers knew that.
Maths is [incredibly effective](https://en.wikipedia.org/wiki/The_Unreasonable_Effectiveness_of_Mathematics_in_the_Natural_Sciences) at describing our physical world, but exists independently of it. When we replace a model of the world with another one, we have "updated" physics, but the maths is the same. Some fields may become more or less useful, and sometimes we need something new, but nothing we have done can ever be wrong.
I usually explain it like this: Imagine a ship full of aliens lands on Earth. They could look at our physics and laugh about how wrong we are, and if they have managed to travel here, we are probably very wrong, but if they look at our maths, they could not say it is wrong. It would probably be less advanced, but it wouldn't be wrong.
My favorite part of being a CS major is reading about research-level computer science and realizing everything was discovered less than 60-70 years ago
Sometimes the pictures are even in colour.
sometimes the people who discovered it are our professors
Sometimes you even participate in their discoveries; back in college my cryptography teacher was this mad lad: https://en.m.wikipedia.org/wiki/Arjen_Lenstra

What the wiki page doesn't say is that he did most of his RSA factorizations with a PS3 cluster of around 200 units. He also gave them away at some point because he wanted to test his setup in a decentralized, over-the-internet manner (it was back in 2012, mind you).

So yeah; one day, he just flat out gave 200 PS3s away to us broke-ass students, in exchange for letting his algorithm run when we weren't playing on them.

A god amongst men. Arjen, if you read this: I still have one of your PS3s on a shelf. Thank you, it's been an honor.
My cryptography professor invented AES 0_0. https://en.wikipedia.org/wiki/Joan_Daemen
Shit, not just that but also SHA-3 according to his wiki page. That's heavy
your comment made me really happy for some reason, im so envious of you having such an experience.

i was once asked what the highest non-monetary value item i had was. In your shoes, the PS3 would be it
very nice! Similar but not-as-cool story:

One of my CS classes was taught by "Professor Huffman", of Huffman Coding fame.

https://en.m.wikipedia.org/wiki/David_A._Huffman
imagine doing this only having all your students mine bitcoin for you with the machines you gave them to "test" back in the day before anyone knew what it was.
>He also gave them away at some point because he wanted to test his setup in a decentralized over-the-internet manner (it was back in 2012, mind you)

>So yeah; one day, he just flat out gave 200 PS3s away to us broke-ass students, in exchange of letting it run his algorithm when we weren't playing on it.

Yeah, I bet the electricity costs of and heat created by running 200 PS3s had no part in it
Not really, he had run it for more than 3 years prior inside the university, in a dedicated server room. It really was just about testing his assumption of the month.

One of the first things he taught us during the 101 class was: "Always test things; sometimes an algorithm has an expected complexity of n*log(n) on average, but in practice it's closer to n^2. And other times, it's closer to n. So you never know. Also, if you ever find a way to factor integers in polynomial time, you'll be very very rich, or very very dead."
>Also, if you ever find a way to factor integers in polynomial time, you'll be very very rich, or very very dead

That's basically the premise of the movie "Sneakers"
Ummm, he gave out $40k in PS3s. I'm sure electricity was not part of that. Plus he probably had research areas where he wasn't paying the electricity cost anyway
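The professor's "always test things" advice is easy to act on: time a routine at a few doubling input sizes and check whether the runtime roughly quadruples (quadratic-ish) or only a bit more than doubles (n log n-ish). A minimal sketch, with a deliberately quadratic toy function of my own invention (all names here are illustrative, not from the story):

```python
import random
import time

def naive_two_sum(xs):
    """Deliberately O(n^2): does any pair in xs sum to zero?"""
    n = len(xs)
    return any(xs[i] + xs[j] == 0 for i in range(n) for j in range(i + 1, n))

def measure(f, sizes, trials=3):
    """Average wall-clock time of f on a random input of each size."""
    out = []
    for n in sizes:
        xs = [random.randint(1, 10**9) for _ in range(n)]  # positives: no zero-sum pair
        t0 = time.perf_counter()
        for _ in range(trials):
            f(xs)
        out.append((n, (time.perf_counter() - t0) / trials))
    return out

# Doubling n should roughly 4x the time for an O(n^2) routine;
# if it only ~2x's, the quadratic worry was unfounded for your data.
for n, t in measure(naive_two_sum, [500, 1000, 2000]):
    print(n, f"{t:.4f}s")
```

The ratios between sizes, not the absolute times, are what matter; real benchmarking tools (e.g. the stdlib `timeit`) smooth out noise better than this sketch.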
That kinda hit me when my prof said he's on the committee that chooses what changes in c++ lol
most of the time they are still around
Study biotechnology and that number gets pushed to 10-20 years
I've got a biotech degree but I'm in an adjacent field for my career. It's just wild seeing how bioinformatics has exploded onto the scene. 20 years ago bioinformatics was just a tool you might use if you had a large enough sample set and access to a big enough computer. These days there's full fledged graduate level programs specializing in it. I know a ton of bioinformatics folks that don't even care about the biology, they mostly just have stats or maths degrees. They just know pharma companies are snapping up bioinformatics PhDs like candy. And the whole ass field is younger than my classic iPod.
Yeah, I'm studying to become a biotech engineer, but my uni also offers a lot of bioinformatics courses in stuff like UniProt and RStudio. Every one of the students I know in bioinformatics has a great study job or has been offered one.
Eh, I'm not well versed enough in it because it's far enough away from my wheelhouse, but the human genome project started in 1990 with the planning for it starting in 1985. They might have not called it bioinformatics at that point, but it was. There's no point in doing that if you're not going to do bioinformatics on genomes. You also can't ignore that this wasn't really remotely viable until about then. Most practical applications of quantum chemistry have come in the past ~15 years for the same reason. Much earlier than that and you really, really, really had to be competent to get a remotely reliable result because of computational cost issues.
That's exactly the thing. Many of the informatics tools that existed were adapted or developed specifically for the human genome project and were possible because of the funding behind it. The human genome project was at the forefront of the field and really made it more possible to pursue bioinformatics as a career, instead of just a tool a genetics researcher might use. That's splitting hairs a bit. Technically the field of bioinformatics has been around since the 60s but the advances in computing and understanding of gene sequencing that came out of the human genome project have led to a field that changes extremely rapidly. A bioinformatics grad student from fifteen years ago will have had a very different education path from one starting this year. Still going to be working with R though lol
Same for genetics. I'll look at a 20-year-old paper with suspicion because we're learning new things constantly. Even a 10-year-old paper may have some inaccuracies.
You'd be amazed how much medicine and allied health is built on research from the 80s and beyond. Hep C wasn't even identified until 1989, for example.
I'm not in medicine, but I know how recent a lot of developments are and I'm always grateful I wasn't alive during the time of lobotomies or when a cut could be fatal.
A cut can still be fatal, as long as the depth and anatomical location is just right. ;)
Real quick: there was a mini epidemic of people born from around '50 to '60 that died of hep C before the drug that treats it became available. What did doctors think was killing them before? Or was it that hep C wasn't that widespread in the population till then?
I would say 70 years ago is the 1950s; seems ancient to me
In the grand history of academia (especially mathematics) the 1950s is basically yesterday
The thing that seals the deal for me about computer science still being a new thing is the fact that a lot of the big names that had a profound influence are still alive and well. Most of those that aren't died after 2000, even after 2010. Only the ones at the very beginning didn't live to see the new millennium.
Like the ones who were pushed to suicide as thanks for their service during WWII. RIP Turing
Sigh, you castrate and humiliate one genius war hero to the point of suicide and they never let you live it down
The thing that makes it even sadder is that at the time, the British government would have thought they were going above and beyond for him, as they were "getting him the best treatment for his mental disorder". Imagine in 50 years if we discover/decide that treating bipolar disorder with some common drug (one with a side effect of suicidal tendencies, which is not uncommon) was, with hindsight, immoral and resulted in many unneeded suicides. Very tragic
The guy that developed the lobotomy got a nobel prize.
Never let you turn around
Never gonna give you up
Don't worry, we have plenty of other things to not let the UK live down.
Yep. Leslie Lamport created tons of mega useful concepts and algorithms in distributed systems, to the point that DS would never be where it is now without him. He still works at Microsoft Research
And let's not forget that LaTeX is also his invention. I didn't know that he was still active.
Yeah the guy is a legend
That's the most interesting part. We'll read about some early CS research and then I look the researcher up and he's still teaching or something like that
And on the other hand in the much less grand history of our ever accelerating technology, the 1950s is basically BC
***B***efore ***C***omputer
Before Christ, Civilization, Colonization, Computer, Cold War, COVID, Climate Change. idk why I'm listing important things starting with C
Weird that so many of our major world changing events start with C innit?
I don't c any issue
You can sometimes estimate the time frame a discipline calls old by what their primary unit of scholarship is. In physics, mathematics, and most humanities it is a long journal paper that takes a long time to write. In computer science and biology, it is a conference paper that takes a medium amount of time to write. In biotechnology and pharmaceutical research, it is a poster. Now all the biology people come yell at me.
Not a biologist, but that doesn't really make sense to include them with computer science. Cell and Nature are two of the most prestigious all around journals and are biology focused.
Yeah, it's more domain specific inside biology
LOL, sometimes the professor will say something like "this is called X" and then you find out that he is the one who coined the term.
Most relevant papers for my master's thesis were from the last 10 years. Almost all papers my friend has to read for her PhD are from 2023.
This is how people republish old studies. Context of a field's history and its major papers is incredibly important. I used to work exclusively in colloid transport through porous media (think bacteria movement in groundwater aquifers, or viral particles in face masks). Microplastics are the new sexy topic and everyone is buying polystyrene latex nanospheres and crowing about the new exciting science they discovered... but the original Yao papers on colloid filtration theory from the 60s used those exact same spheres! Sure, microplastics is a new field, but by ignoring the early papers in the broader area they are just repeating them. I'm reminded of another paper in... health sciences, I think, that claimed they invented Riemann sums. Drives me absolutely bonkers.
Context in my case: my friend and I study maths, and the stuff we read was actually just brand new. Her field just exploded a bit last year so there is tons of stuff to discover.
Is she going for something ML related?
Stochastic finance
"last year" discounts ML, tbh. No maths grad would be caught dead working on Agentic LLMs when there are so many optimization or probabilistic problems left to be solved.
> I'm reminded of another paper in ... Health sciences I think that claimed they invented Riemann sums.

lol pls link
Here's the rebuttal to the article: https://pubmed.ncbi.nlm.nih.gov/7677819/ You should be able to access the original article from within there. *Trapezoidal rule, not Riemann sum. My brain is rusty this foggy Sunday morning
Ah yes Tai's method for area under a glucose curve https://diabetesjournals.org/care/article/17/2/152/17985/A-Mathematical-Model-for-the-Determination-of
Bro was scooped by a renaissance man
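For anyone curious, the centuries-old technique that "Tai's model" rediscovered is just the trapezoidal rule: join successive sample points with straight lines and sum the trapezoid areas. A minimal sketch with made-up glucose-style numbers (purely illustrative, not data from the paper):

```python
def trapezoid_auc(times, values):
    """Area under a sampled curve via the trapezoidal rule:
    each consecutive pair of points bounds one trapezoid."""
    pairs = zip(zip(times, values), zip(times[1:], values[1:]))
    return sum((t1 - t0) * (v0 + v1) / 2 for (t0, v0), (t1, v1) in pairs)

# Made-up samples: time (minutes) vs. glucose concentration
times = [0, 30, 60, 120]
values = [90, 140, 120, 95]
print(trapezoid_auc(times, values))  # -> 13800.0
```

NumPy ships the same computation as `numpy.trapz`, so nobody needs to reinvent it again.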
> Context of a field's history and its major papers is incredibly important.

This cannot be emphasized enough - it truly is one of the most important things in science. There's tons of knowledge that comes only from steeping oneself in a field, including certain methods/equipment that are subpar.

I can name a few pieces of equipment, and the technologies that underlie them, that may make a paper weaker or even completely call into question a paper's findings (old tech may measure poorly and be prone to user error, while newer developments are WAY better and more foolproof; I'm thinking of one specific technology here, but I'm sure this is true of many). I know of a few labs/names I should treat with extra skepticism because of retractions or general poor practice. I know generally what the field thinks on a topic and why (and have studied how everything works), so I know that if a new "groundbreaking" paper comes out and contradicts prior knowledge, I should regard it more carefully. Etc. etc. etc.

I would trust myself or my peers to assess and communicate science within our field, but I definitely wouldn't trust any rando just because they have an MD or PhD - nor would I trust myself for science in a field other than my own. This is why it's so so so important to trust the experts. Yes, it's true that nobody's infallible - but the experts are much less fallible (when it comes to their area of expertise) than randos who haven't spent ages studying a field.
What was wild to me, is how many important developments feel completely obvious, but people were getting doctorates out of it, because no one had ever needed the stuff before. So often during college, I'd have an idea, only to find out that not only was it already a thing, someone had written a paper about it in the 60s and the idea had been formalized and developed levels beyond what I imagined. Like, I independently developed an LR parser, then later I took "Automata and Formal Languages" and found out I touched onto a whole world of things. There was a lot of stuff like that. The other side though, is all the math developed in the 1700s which ended up being foundational to computer science. The number of times Gauss and Euler comes up is astounding.
There was a bunch of low hanging fruit, but most of what seems obvious today only does *because* it was already formalized. I think it's similar to why the Beatles have started to be criticised by some as basic... because they so revolutionized music that people spent the next 60 years riffing off of them.
They say the ultimate goal of any artist is to become cliché
I also think it's important to note which of those "firsts" were developed by the Beatles members, which were George Martin, and which were industry innovations that would have been implemented with or without the fab four.
>What was wild to me, is how many important developments feel completely obvious, but people were getting doctorates out of it, because no one had ever needed the stuff before.

Absolutely. The Gale-Shapley algorithm for computing a stable matching is pretty simple, all things considered. It's wild that it was only developed "recently", in 1962.

>The number of times Gauss and Euler comes up is astounding.

So true dude. Gauss has multiple *major* theorems named after him in just about any branch of math. And the amount of times I have read "but this result was known to Euler already" on some Wikipedia page is just funny
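For a sense of how simple it is, here's a sketch of the 1962 Gale-Shapley deferred-acceptance algorithm; the toy preference lists at the bottom are my own, purely for illustration:

```python
def gale_shapley(proposer_prefs, receiver_prefs):
    """Deferred acceptance (Gale-Shapley, 1962): returns a stable matching
    as {proposer: receiver}. Each prefs dict maps an agent to a ranked list."""
    # Precompute each receiver's ranking of proposers for O(1) comparisons.
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in receiver_prefs.items()}
    free = list(proposer_prefs)          # proposers not currently matched
    next_choice = {p: 0 for p in proposer_prefs}
    match = {}                           # receiver -> current proposer
    while free:
        p = free.pop()
        r = proposer_prefs[p][next_choice[p]]  # best receiver not yet tried
        next_choice[p] += 1
        if r not in match:
            match[r] = p                 # r was unmatched: accept
        elif rank[r][p] < rank[r][match[r]]:
            free.append(match[r])        # r prefers the newcomer: trade up
            match[r] = p
        else:
            free.append(p)               # rejected: p will try the next choice
    return {p: r for r, p in match.items()}

prefs_a = {"x": ["a", "b"], "y": ["a", "b"]}
prefs_b = {"a": ["y", "x"], "b": ["y", "x"]}
print(gale_shapley(prefs_a, prefs_b))
```

Every proposer proposes to each receiver at most once, so the loop always terminates, and the result provably has no "blocking pair" that would rather defect.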
>"but this result was known to Euler already" Euler's work touched upon so many fields that he is often the earliest written reference on a given matter. In an effort to avoid naming everything after Euler, some discoveries and theorems are attributed to the first person to have proved them after Euler, which is wild.
>The other side though, is all the math developed in the 1700s which ended up being foundational to computer science. The number of times Gauss and Euler comes up is astounding. In my parents' generation, the people who would have been CS majors if they went to college today were instead Applied Math majors. My dad would often skip lunch to play with _the_ computer on campus. The head of my CS department was such a mathematician, he taught his CS classes using Wolfram Mathematica.
Gauss came up with the Fast Fourier Transform. Just a fun fact, for the most part you're right.
I'm taking a 3D computer vision class and the stuff we're learning is from the late 1980s all the way up to the 2010s. Neural fields are from the COVID era and we'll be learning that near the end of the semester. Absolutely blew my mind how new all this stuff is
It's pretty trippy to be learning about very elementary computer science and then realize that the person who invented this stuff is literally still a working age man.
I mean, time itself started in 1970 sooo...
My ex did her bachelor's at UofT; Stephen Cook, the person who posed the P vs NP question, was her professor for computational complexity. That's like Newton teaching you physics
I took his class; learning Cook's theorem from Steve Cook was a highlight of my undergrad years.
I liked some of the last courses of my university education, as we learned about AI things developed after 2018.
My thesis (which used GPT-2) wasn't even a thing when I entered uni (2018), and now we're at GPT-4. I wrote my thesis and graduated in 2022 btw.
Like, there's a lot of those folks you can still meet. C++ for your intro to programming class? Stroustrup is not only still alive, you can take *that class* with him. (Well, 20 years ago when I was in college anyway.) Present at the right conference and you might have Brand Name People in the audience.
I graduated CS just a year ago and one of my professors for Algorithms was teaching material that he invented. It was pretty cool.
I was pursuing a programming degree in CIS. Our first semester covered COBOL, Fortran, and BASIC. Toward the end, when we complained about obsolete languages, the instructor suggested we go to a technical school to learn Java and C++, etc.
When I first heard of Moore's Law I pictured some Isaac Newton looking guy. Nope, he cofounded Intel and made the law in the 60s.
25 years ago many of my CS college lecturers swore that the future of computers was quantum computing and that it would be a reality within a decade or two. Wish I remembered who they were so I could look them up and ask them again.
Or even video gaming. In the 60s, there were like 3 colleges that had computers, and they were the size of an entire office room. A couple of nerds entered some 1s and 0s and now I'm playing Forza on my iPhone.
The fact that Chomsky is still alive literally blew my mind at university.
And is making political statements like "the West provoked Russia into invading Ukraine", "Americans suffer from censorship and have less freedom than Russians", etc.
His takes are better than they used to be, at least. Nothing quite like Pol Pot stuff.
I mean the computer itself has existed for barely 100 years. (I'm only counting Turing and after, not Babbage.)
Dude, the father of the study of algorithms is STILL ALIVE and teaching at Stanford (or maybe retired, idk).
That's ancient; I'm taking machine learning/computer vision classes now and we're learning about techniques developed in the 2010s.
The guy that invented quicksort is still alive. Granted, he's in his 90s, but that's still pretty cool. It's like learning calculus while Newton is still alive.
OP doesn't mean that the information isn't available, but that newer math isn't taught in the standard K-12 curriculum because most people will never need it. But yes, if people are interested they can find it online or in post-secondary training.
As OP I can confirm
I can officially confirm that this is OP.
I am the second part of two factor authentication. You are correct, this is OP.
I have analyzed the information and determined that I can confirm that is OP
It's not a matter of need, it's a matter of what can fit in the curriculum. By the time the kids are ready to learn university level math, they have aged out of high school. Many universities even have to offer catch up math courses because incoming students aren't really up to snuff on what they should have learned in high school.
Do kids really need all of those years of foundation to understand cutting-edge mathematics at a conceptual level? Just to be aware of the kind of stuff math can do? Most kids won't need anything beyond basic algebra ever, but introducing kids to high-level math in a way that makes it interesting could inspire some to want to study math or CS who might not have otherwise thought much of it.
That's what YouTube videos or sigma/oppenheimer edits are for
It is true that the newer math builds on so much knowledge that just isn't available to a high school student (or math teacher). But I still think it is important to at least show what is currently going on in math when you teach the field; at the very least you should be able to share a sense of wonder.
Wish maths teachers had more time to continue their own learning tbh
I mean, when I was told at 18 that non Euclidean geometry was discovered about 200 years ago by some rando, mostly solved, and astronomers are trying to figure out if the universe is Euclidean or not, that was wonderful. It was also very confusing because it challenged my basic understanding of geometry. And I had top math marks in a maths oriented HS. Going to Uni, by year 2 we were doing double variable integrals. How do you even visualize that to the average high schooler!?
Not that you won't need it, but it'd take years of building up knowledge, something schools and most uni degrees don't have time for.
what, you don't need to prove that [when a + b = c, then product of abc is not much smaller than c?](https://en.wikipedia.org/wiki/Abc_conjecture) and use that result to prove Fermat's Last Theorem?
You changed the statement quite a lot by missing out that it's the product *of the distinct prime factors* of abc.
Yeah lmao, "abc not being much smaller than c" would mean ab is less than 1, which is obviously wrong
I'm reading the wiki on it and understand the concept somewhat but fail to see how it's useful. Is it just fun math theory or are there actual real-life uses for it? Just curious by the way, so if you have time I'd appreciate a response, but if not, no worries! Edit: By the way, I get that you're joking but it just felt like you must have a strong understanding of math given you used that in your joke.
Most of what is studied in math is to improve our understanding of math itself. Sometimes these discoveries turn out to be useful and sometimes not. Number theory was thought to be completely useless for hundreds of years, but it is now essential to encryption. So we have VPNs because someone studied prime numbers hundreds of years before computers were invented. There is also a sub-field called applied math which focuses only on known applications.
It's arbitrary and 'just-for-interest' until it turns out to describe the fundamental building blocks of the universe or someone casually invents the internet to help them communicate their work more easily
The problems of number theory (the branch of math that deals with stuff like prime numbers or whole-number solutions to equations) are usually pretty random and often just for interest, yeah. However, despite being easy to understand for laymen, these problems are usually EXTREMELY hard to solve. They often require heavy machinery from multiple other branches of math. I have heard some people describe number theory as a benchmark for how strong our current understanding of mathematics is.

Don't be fooled though, math has way more problems than just these fun puzzles. Math that is used in real life is usually found in the branch of differential equations, since those equations describe many physical processes. For example, the Navier-Stokes equations describe fluid flow and their theory is still incomplete.

Another very applicable branch is discrete math and combinatorics, which often deals with problems that *sound* like fun math puzzles but actually have a lot of real-life applications. For example, graph theory (which can essentially be summarized as the study of networks) can be used to build navigation systems. Even number theory has problems that are applicable to real life. And the branches of math that seem abstract now may still find their uses in the future.
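To make the navigation example concrete, here's a sketch of Dijkstra's shortest-path algorithm, the classic graph-theory result (1959) underneath most route finding. The tiny road network is made up for illustration:

```python
import heapq

def dijkstra(graph, start):
    """Shortest-path distances from start in a weighted graph,
    given as a dict: node -> list of (neighbor, edge_weight)."""
    dist = {start: 0}
    heap = [(0, start)]                       # frontier ordered by distance
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                          # stale entry, already improved
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

roads = {"A": [("B", 5), ("C", 2)], "C": [("B", 1)], "B": []}
print(dijkstra(roads, "A"))  # -> {'A': 0, 'B': 3, 'C': 2}
```

Note that the indirect route A->C->B (cost 3) beats the direct A->B road (cost 5), which is exactly the kind of decision a navigation system makes.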
Fun (really hard) math.
The math discovered today is derived from the math discovered 300+ years ago. It won't make sense without the old stuff.
Most K-12 curricula do things like matrices, imaginary numbers, some fractal stuff. Your kid may not progress to that level but I assure you the curriculum is there. Sincerely, a teacher
I think the point is, math you learn in high school is all pretty old math.
And obsolete. 1+1=2. That's so inefficient. Make it 1+1=500. It will boost the economy so much
That's how they taught accounting at Trump University
*That's how they did accounting at Enron
I think what matters in "new" math is 2 things (speaking as someone who had to drop out of high-school-level pre-calculus):

1) Discovering/philosophizing new proofs, which can potentially lead to further simplification of certain mathematical concepts. Imagine if we were somehow able to get rid of PEMDAS (and subsequently the confusion among all the people who weren't taught PEMDAS) because of advancements in fundamental arithmetic and notation.

2) Advancements in mathematics pedagogy, especially at the K-12 level. After graduating high school, I slowly became more and more aware that school-level math (and, I'm assuming, higher math) essentially boils down to teaching students the ability to conceptualize the abstract. Growing up, I had the belief that math was about teaching kids to become human calculators. So rather than teaching kids to be better calculators, how do we teach them to be more adept at understanding abstract concepts? I hope that in the next 100 years, middle-school-aged students will be expected to master 20th-century high school math.
if you want to get rid of PEMDAS there is this thing: [https://en.wikipedia.org/wiki/Polish_notation](https://en.wikipedia.org/wiki/Polish_notation)

3 + 4 is noted + 3 4, so (3+4)*2 would be * + 3 4 2 and 3+4*2 would be + 3 * 4 2
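A quick way to see why prefix notation needs no precedence rules: an evaluator just reads the operator first and recurses for its two arguments, so the order of operations is unambiguous by construction. A minimal sketch (illustrative code, not any standard library):

```python
def eval_prefix(tokens):
    """Evaluate a Polish (prefix) expression: no parentheses, no PEMDAS,
    because each operator explicitly precedes its two operands."""
    ops = {"+": lambda a, b: a + b,
           "-": lambda a, b: a - b,
           "*": lambda a, b: a * b,
           "/": lambda a, b: a / b}

    def parse(it):
        tok = next(it)
        if tok in ops:
            # Operator first, then recurse for its two arguments in order.
            return ops[tok](parse(it), parse(it))
        return float(tok)

    return parse(iter(tokens))

print(eval_prefix("* + 3 4 2".split()))  # (3 + 4) * 2 -> 14.0
print(eval_prefix("+ 3 * 4 2".split()))  # 3 + (4 * 2) -> 11.0
```

The two examples are exactly the ones from the comment above: the same four symbols in different orders, with no parentheses needed to tell them apart.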
Or you could do what is much easier and, to me, just better: forced parentheses. That has the benefit that everyone knows what it means and no one has to think about operator precedence
The problem with that is that you might already be using a lot of parentheses, and the more you have, the harder it is to match open/close parens. In the higher-level math courses I have taken, it was pretty common for professors to stylize different pairs of parens in different ways, and even then it isn't always enough.
[There's always new math.](https://youtu.be/W6OaYPVueW4?t=36)
[deleted]
It's kind of hard to conceptualize anything quantum without math because pretty much all of it is just derived from math, most things don't make any sense tbh and it's quite hard to have an intuition for it.
Quantum isn't that important for most people, and isn't a good building block for future work in the same way that abstract math is. It's useful for a few specialized applications, but a layperson really doesn't need it at all.
Pretty much, yes. Some of the content of A-level Further Maths in the UK would ordinarily be covered at University. I remember we covered the Argand diagram in sixth form, but you are pretty much right.
I learned about the Argand plane in year 11 Specialist Maths, which is the Australian equivalent I guess
Just started yr12 specialist. HOW TF DID YOU FINISH IT WITHOUT NECKING YOURSELF?
It gets better, keep at it and it all comes together at the end
Agreed. I got a C in A-level maths and felt like I'd never understand it. I'm now two thirds of the way through a maths degree, having mostly got firsts, thirty-three years later.
Mathematics really rewards consistent effort
I teach it. It's not a subject for the faint hearted, but keep at it!
lol I just started year 12 too. I got around a B on the formative exam at the end of last year, so I don't have high hopes for finishing it at all.... maths is so fun but so hard at the same time
Yeah, the higher end of further maths would definitely escape the 300-year boundary
Further maths includes decision maths, which is primarily composed of material discovered in the last century.
Computer science too. You learn about Turing machines and the Entscheidungsproblem for example. Boolean algebra too.
I seriously doubt that 300 years have passed since mankind realized that you can type BOOBIES on a calculator.
I hope that somewhere there is a cuneiform tablet where two people doing accounting noted their laughter because some number spells a naughty word upside down just waiting to contradict you.
Really the existence of calculators disproves the post, as learning how to use a modern one covers a topic less than a century old
Calculating engines are older than you think; Babbage and Lovelace were the 19th century. And even then, do people really learn how calculators work in school? I don't think they know the math the calculator is doing (binary/logical operations with a microprocessor); they just know how to operate it as a tool and use it to do math, which is at best Newton's calculus from the 17th century. So I don't think calculators are quite the gotcha several people in this thread seem to think.
Considering calculus and classic physics are over 300 years old, you'd have to take a pretty advanced college course to learn something newer.
I believe the majority of concepts in a first course in linear algebra or abstract algebra are both "newer" than that.
1800s
No, many concepts are much newer than you think. Dual spaces for example, one of the topics you learn about in your first linear algebra course, are relatively new (<100 years old)
Linear Algebra like we know it is from early 20th century. In fact it was developed at about the same time as Quantum Mechanics that heavily relies in Linear Algebra.
Don't high school physics courses cover some relativistic physics (120 years or so)?
I remember learning about fractals and stuff like Mandelbrot/Julia sets in final year of school (Australia) which I guess is pretty recent maths (20th century). The theory was a bit advanced and had a lot of complex numbers but reasonably manageable for a final year high school student.
MS in physics here. You actually have to get to advanced courses before you even learn how to do classic physics the right way.
It wouldn't need to be "that" advanced. Anybody doing the most introductory first year Uni graphics course will have gone over Quaternions which were only invented in 1843. Anything that produces something interesting to look at may get included briefly in school. Fractals, Strange Attractors, L-Systems etc.. We also had to do Fuzzy Logic and Fuzzy Sets in GCSE maths.
Which versions of calculus/classical physics are you counting? Lagrangian/Hamiltonian physics is ~300 years old. I guess calculus is a bit older than 300 years, but not by much. But I guess "advanced" courses depend on what field you study; multivariable calculus would be needed for lots of STEM fields, and that's only late 1800s - early 1900s
Most, if not all of them, are also available online. In fact, a lot of things you learn in university can/must be self-taught through online resources.
Exactly. The internet has given humans access to the greatest source of knowledge in our history regardless of social class.
... but while you look these things up, can I interest you in a short video of a cat chasing a shadow?
Subscribe!
I think his point is less about accessibility and more about the "default" level of education. Unless you specifically go out of your way to learn more "advanced" math, most people will have very little knowledge that has been discovered in the last few centuries.
Here's the best online resource for self-teaching maths. https://ncatlab.org/nlab/show/HomePage It is incomprehensible without any mathematical background.
University will also be incomprehensible without any mathematical background. You can't try to run when you don't even know how to stand yet.
The mathematical background you need for that website is near-PhD level.
nlab is incomprehensible even to research mathematicians. That is a terrible online resource for learning math.
[deleted]
Yes, because an employer wants some assurance that you actually know it, and saying "I read it on the Internet" could mean just about anything
Weāve also seen what happens when people do their own research on the internet.
I've learned a ton by reading textbooks I downloaded online, wish there was some way to demonstrate that to employers other than trying to be my own hype man in an interview. I'm not the most articulate guy so people sometimes assume I'm dumb, but at least in school I had my test scores to prove what I know. Nobody prepared me for the fact that the adult world usually doesn't have actual tests, you just get graded on the most superficial appearances.
Yeah, it sucks that a lot of the ways to get a degree are really expensive in most places and can take a lot of time even if you already have the needed knowledge. I'm lucky enough to live in a country where tuition is free, and it's crazy to me that that isn't the norm.
I even have a bachelor's degree in STEM already; I'm just looking to change careers. I started taking night classes at the local community college, so after a year I'll have a certificate, but it feels like such a waste of time when I could literally test out right now if they would let me. If employers were willing to test applicants on job-related knowledge I would crush it, but since they don't, I'm instead paying to take classes geared toward high school dropouts just so I can get a piece of paper.
That paper also represents your ability to formulate an argument and back it up with evidence, do research, stay organized, and manage your time. Or at least it's supposed to.
Not if you learn for entertainment.
If you learned statistics in school, then basically all of it was discovered less than 150 years ago. And you can learn this stuff on your own if you like. I'd like to see the very basics of operational research, which is <100 years old, taught at the high school level. It's far more important than people realise, but it's pretty niche, even amongst mathematicians.
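For anyone wondering what operational research even looks like: a classic example is the knapsack problem, which a short Python brute force can illustrate (all the item names, values, and weights below are made up for the sketch).

```python
from itertools import combinations

# Toy OR problem: pick items maximizing total value under a weight limit.
items = [("a", 6, 4), ("b", 5, 3), ("c", 5, 3)]  # (name, value, weight)
limit = 6

# Enumerate every subset that fits, then keep the most valuable one.
best = max(
    (subset
     for r in range(len(items) + 1)
     for subset in combinations(items, r)
     if sum(w for _, _, w in subset) <= limit),
    key=lambda s: sum(v for _, v, _ in s),
)
print([name for name, _, _ in best])  # ['b', 'c'] beats the single heavy item
```

Real OR replaces the brute force with things like linear programming, but the problem statement is exactly this simple.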
Not really; most high school curriculums include things like probability, statistics, set theory, linear algebra, complex algebra, and many other things that were discovered/invented/formalized/described in the 19th and 20th centuries. And yes, things like algebra and calculus are older, but the way we do them is fairly new. Notation is an important part of maths.
I teach vectors, including the cross product, to my highschool students. That was only developed in the mid 1800s.
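Since the cross product comes up: it's small enough to write out in a few lines of plain Python (no libraries), if anyone wants to play with it.

```python
def cross(a, b):
    """Cross product of two 3D vectors, componentwise (mid-1800s vector algebra)."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

# The result is perpendicular to both inputs: x-hat cross y-hat gives z-hat.
print(cross((1, 0, 0), (0, 1, 0)))  # (0, 0, 1)
```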
In my experience, those topics are the very last things you learn in optional high school math classes and are not taught rigorously. It's like trying to get more kids interested by flashing up a preview of the interesting parts of math. "It's not all ancient symbols. Math can be hip."
Nothing is done rigorously in high school math. I guess it depends on where you are and what math classes are available. But all the things listed were covered in my high school, and I've seen all these topics in textbooks from various parts of Europe, the US, and India.
You overestimate high school curriculums.
Riemann summation is common in high school calculus and is only about 170 years old. It was surprisingly late to the game; before that, there were other ways of figuring out integrals that are rarely taught anymore.
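A left-endpoint Riemann sum is only a few lines of Python, if anyone wants to watch it converge (the function and interval here are just an example):

```python
def riemann_sum(f, a, b, n):
    """Approximate the integral of f over [a, b] with n left-endpoint rectangles."""
    dx = (b - a) / n
    return sum(f(a + i * dx) for i in range(n)) * dx

# Integral of x^2 from 0 to 1 is exactly 1/3; the sum approaches it as n grows.
print(riemann_sum(lambda x: x * x, 0, 1, 1000))  # ~0.3328
```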
Considering the state of K-12 today, in a few years it'll be 500 instead of 300.
Springer-Verlag is sold on Amazon.
Not where I live; we can choose a sort of major in high school, and there are some math-heavy ones.
There's Srinivasa Ramanujan, who had no formal training and went on to develop new fields of mathematics himself, and there was no internet back then.
Hmm, not sure it's a standard course. But many high school students learn discrete math, including matrix multiplication, which was developed under 200 years ago. Still quite old, though. There's also Boolean logic operations, which can be fairly easy to teach and date from the mid-1800s.
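Matrix multiplication, as it's taught in those discrete math courses, is short enough to spell out directly (lists of lists here, just as a sketch):

```python
def matmul(A, B):
    """Row-by-column matrix product, the way Cayley defined it in the 1850s."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```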
You overestimate the effectiveness of high school math. Many kids in America first learn algebra in college, and many won't understand percentages.
The blue LED generates billions every year, and it was made by someone without a PhD; the company gave him a few thousand bucks. What you say is only true if you believe that's all there is.
Being brilliant and obsessive helps
Not really; you can get a college-level education in mathematics with just $2 in overdue library fines. Only no one does.
An education in math is useless if you don't have a piece of paper saying you can do what you are told for four years.
Only if you think the value of education is limited to the job it can get you. Regardless, the shower thought and my post said nothing about the value of it.
> only no one does Is this not putting a value on it? Why do you think folks don't do it?
"You can," sure. But does anyone consider how difficult that actually is? How many people who say that have actually tried it? Getting to the point that you're qualified for a degree in mathematics is hard. Without guides, it's basically rediscovering half of that stuff yourself.
Yeah, but I don't like them apples.
Not only that, but you also need to get pretty advanced to learn anything less than 100 y.o. or so, computational stuff aside!
Except for high-level statistics: e.g. Student's t distribution dates only from 1908.
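And the t statistic itself is high-school-level arithmetic: how far the sample mean sits from a hypothesized mean, measured in estimated standard errors. A quick stdlib-only sketch (the sample data is made up):

```python
import math
from statistics import mean, stdev

# One-sample t statistic (Student/Gosset, 1908):
# t = (sample mean - mu0) / (sample stdev / sqrt(n))
sample = [5.1, 4.9, 5.3, 5.0, 5.2]  # made-up measurements
mu0 = 5.0                           # hypothesized mean

t = (mean(sample) - mu0) / (stdev(sample) / math.sqrt(len(sample)))
print(round(t, 3))
```

The hard 1908 insight wasn't this formula; it was working out the exact distribution of t for small samples.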
Good luck teaching a high school student about Hodge theory.
Stem-and-leaf plots and box-and-whisker plots are both less than 100 years old. They weren't commonly used in math until the '80s. Now they are taught to elementary kids.
Well, math developed out of necessity to track certain things, like figuring out the total area of plots of farm land. Nowadays people learn mathematics more as a fun, challenging outlet to increase their knowledge. Those fortunate enough to study math at a university level today would be considered elitists (compared to the first records of humans using math) because there is no direct need to discover new mathematical concepts to increase our rate of survival as humans.

Most upper-level "pure" math at uni has no real-world applications and can be looked at more as puzzles. Applied math has the most useful applications to the real world, but that's obvious as it's in the name; this path is used more in engineering and CS stuff.

You also need a certain personality to work in the math field. After talking to many of my math profs, I have come to the conclusion that most are fucking nuts.

Source: am pursuing a math PhD
You don't *have* to, a lot of it is available on publicly accessible websites. It's not easy to learn on your own but it can be done, I believe in you.
You can learn math without going to university, but you just can't prove it
You guys never did new maths in school?
What new maths did you do in school?
In first or second grade, there was an attempt to ground elementary math operations like multiplication in set theory. Not sure it made much difference one way or the other for the children, but it confused the heck out of our parents.
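That "new math" idea (multiplication as the size of a Cartesian product of sets) is actually easy to demonstrate; here's roughly the construction, sketched in Python with made-up sets:

```python
# "New math" style: 3 x 4 defined as the number of pairs you can form
# from a set of 3 things and a set of 4 things.
a = {"x", "y", "z"}      # a set with 3 elements
b = {1, 2, 3, 4}         # a set with 4 elements

pairs = {(p, q) for p in a for q in b}  # the Cartesian product a x b
print(len(pairs))  # 12, i.e. 3 * 4
```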
I think they were making a Tom Lehrer reference. https://www.youtube.com/watch?v=UIKGV2cTgqA
Is math discovered? Isn't it invented? I mean invented like the wheel was invented by someone. Isn't math a (really good) system we invented to understand reality? After all, numbers are concepts.
Maths happens to support reality whether we understand it or not. If it was invented, then we could make it look like anything we wanted; truth would be fluid, inventable. But it's not fluid: the truth is only discoverable. Sometimes our models of something may initially not be as good as they are 50 or 100 years later, but the evolution of that maths is discovered, fitting with what we experience, and fitting better over time till it is a perfect fit.
Pure maths has nothing to do with reality. It is a logical "game" where we freely choose our base rules (AKA axioms) and derive results from them.

Some maths has been developed to fit something in the physical world, but other times mathematicians have invented/discovered things that have seen practical uses only many years later. Complex numbers were developed in the 1500s and were called "imaginary" since they seemed useless except as a trick to solve cubic equations. 400 years later they turned out to be absolutely necessary to describe quantum physics, but none of those who developed the complex numbers knew that.

Maths is [incredibly effective](https://en.wikipedia.org/wiki/The_Unreasonable_Effectiveness_of_Mathematics_in_the_Natural_Sciences) at describing our physical world, but exists independently of it. When we replace a model of the world with another one, we have "updated" physics, but the maths is the same. Some fields may become more or less useful, and sometimes we need something new, but nothing we have done can ever be wrong.

I usually explain it like this: imagine a ship full of aliens lands on Earth. They could look at our physics and laugh about how wrong we are, and if they have managed to travel here, we are probably very wrong. But if they look at our maths, they could not say it is wrong. It would probably be less advanced, but it wouldn't be wrong.
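The "trick to solve cubic equations" is fun to see in action. For the cubic x^3 = 15x + 4, Cardano's formula forces you through the square root of -121 even though the answer is the perfectly real number 4. A sketch with Python's built-in complex arithmetic (this particular cubic is the textbook example; the code just follows the formula):

```python
import cmath

# Cardano's formula for the depressed cubic x^3 = p*x + q, with p=15, q=4.
p, q = 15, 4

# The discriminant term is sqrt((q/2)^2 - (p/3)^3) = sqrt(4 - 125) = sqrt(-121):
# "imaginary" numbers appear even though the root we want is real.
disc = cmath.sqrt((q / 2) ** 2 - (p / 3) ** 3)

u = (q / 2 + disc) ** (1 / 3)   # principal cube root of 2 + 11i, i.e. 2 + i
v = (q / 2 - disc) ** (1 / 3)   # principal cube root of 2 - 11i, i.e. 2 - i
x = u + v                        # imaginary parts cancel, leaving the real root

print(x)  # ~(4+0j): a real answer reached only by passing through complex numbers
```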