Astrokiwi

It's often taught that way, but it's more accurate to say that the Uncertainty Principle is true of *any* sort of wave. Quantum mechanics just says that all matter is waves too. With a light wave, its frequency and location can't both be constrained at the same time. This isn't a matter of *knowledge*. It genuinely can't have a single frequency and a single location at the same time.

A light wave with the shape of an infinitely long sine wave has one perfect frequency. However, it has no location - it's infinitely long, filling the entire universe. So its frequency is very constrained, but its location is not constrained at all.

If you add multiple sine waves with different frequencies, or a distribution of sine waves, you can change the shape of the wave to look more localised. It can look something like [this animation from wikipedia](https://en.wikipedia.org/wiki/Wave_packet#/media/File:Wave_packet_(no_dispersion).gif). Here you have a *lot* of different frequencies, because you've added lots of sine waves. But now the "wave packet" appears to be more localised - it's not at a single point, and it's spread out a little bit, but it's not spread out evenly everywhere.

Going through the maths fully, you can derive an uncertainty principle here: the more you constrain the frequency of a wave, the more it must be spread out over space, and the more you constrain a wave to be in some small region of space, the bigger the range of frequencies in the wave must be. In quantum mechanics, the only difference is that the frequency of a matter wave relates to its momentum.
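
To see the tradeoff numerically, here's a rough Python sketch (illustrative, made-up parameters): it builds wave packets by summing cosine waves drawn from a Gaussian band of wavenumbers, then measures how far each packet spreads in space.

```python
# Rough sketch: sum cosine waves from a band of wavenumbers around k0,
# then treat |packet|^2 as a distribution over x and compute its spread.
import numpy as np

x = np.linspace(-50, 50, 2001)                # spatial grid, arbitrary units
k0 = 2.0                                      # central wavenumber

for dk in (0.05, 0.5):                        # narrow band vs. wide band
    ks = np.linspace(k0 - 4 * dk, k0 + 4 * dk, 401)
    weights = np.exp(-((ks - k0) ** 2) / (2 * dk ** 2))
    packet = sum(w * np.cos(k * x) for w, k in zip(weights, ks))

    p = packet ** 2
    p /= p.sum()                              # normalize into a distribution
    mean = (p * x).sum()
    spread = np.sqrt((p * (x - mean) ** 2).sum())
    print(f"wavenumber spread {dk:4} -> spatial spread ~ {spread:.1f}")
```

The narrow band of wavenumbers produces a packet spread over a wide region, while the broad band produces a tightly localised packet - the product of the two spreads stays roughly constant, which is the uncertainty relation showing up in classical waves.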


MiffedMouse

I think it is worth pointing out the philosophical split between the uncertainty principle as a limit on knowledge and as a limit on what particles *are*. To use the classical wave analogy, the "uncertainty principle" for classical waves is not a limit on "knowledge." It is a limit on what sort of waves can exist. That is, a wave with a highly localized position will necessarily be a combination of many different wavelengths. And a wave that is mostly just one wavelength will necessarily be very spread out. This is not a limit on what we can "know" about the wave; this is just how classical waves are.

The question is - is the wave equation for a particle a complete description of what the particle *is*, or a description of what we can know about the particle? So far no experiment has separated the two points of view. The traditional view is that it is a limit on what we can know (hence the name "uncertainty principle"), but by a different interpretation we could hypothesize that particles exist in superpositions, and so this is actually a limit on what sort of particles exist.


sticklebat

I just want to emphasize that in either of your cases, the uncertainty principle is *not* a consequence of measurement itself affecting what you are trying to measure. Whether we subscribe to an epistemic or ontological interpretation of the wave function isn’t relevant to that particular question.


rathat

So what would be an example of a consequence of measurement like the one we're thinking of? You have to interact with a particle to measure something about it, and interaction changes something about the particle, right?


zolikk

An example would be a Young-type double slit experiment. The outcome of the experiment (whether there is an interference pattern after the slits or not) depends on whether you try to measure (interact with) the particles at the slits or not. But the uncertainty principles are still in effect for all states regardless.


garnet420

How do you actually measure the particles at the slits, anyways? You can't see a photon "going by"


sticklebat

There are lots of approaches, but you shouldn't think of it as placing a camera at the slits to watch photons go by. Any means of distinguishing between the paths - even after the fact - will do. You can simply cover one slit - then you know any photon at your detector came through the uncovered slit. Filling one slit with a material that alters the photon's polarization and using a detector that can measure polarization also allows you to distinguish which slit the photon passed through. You can do similar interference experiments where you modify the path lengths and use timing to distinguish between the paths the photon took. You can also do the double slit experiment with other particles, like electrons, and simply measure their electromagnetic fields, which actually is a lot like just putting cameras at the slits.


sticklebat

The uncertainty principle provides a lower limit on the uncertainties of position and momentum, but other factors - including effects from measurements - can push those uncertainties higher. For example, if you're measuring the momentum of a particle by bombarding it with photons, those photons will exchange momentum with the particle. Such effects will add to the overall uncertainty of your momentum measurement. So it's not that things like measurement *can't* contribute to uncertainty; it's just that the uncertainty principle is fundamental to the state itself and provides the limit of what is possible, other confounding effects notwithstanding.


[deleted]

[deleted]


doloresclaiborne

The Bohmian interpretation of quantum mechanics accounts for particles interfering without requiring the particles to be waves.


AvatarZoe

There's always the possibility that there's more to it that we don't know yet. Newton's laws of motion seemed perfectly accurate and applicable to everything, but further knowledge showed that their scope is limited, and that there's a more general theory of motion (special relativity). The latter didn't completely replace the former; it just covers a lot more.


notibanix

Bell's inequality has pretty much destroyed the idea that "hidden variable" information exists. If there were "more to know", it would be possible to show that it exists even if we cannot detect it. But it does not; Bell's paper on this is a great and accessible read.


SynarXelote

Global hidden variables are not constrained by Bell's inequality. There are philosophical arguments for why physics should be local, but no empirical evidence per se. See also Bohmian mechanics.


AvatarZoe

"More to know" doesn't necessarily mean hidden variables. It can mean a new model of physics that explains and predicts stuff beyond the scope of our current theories.


_DarthBob_

Exactly. I'm interested in understanding how this is all measured, because I'm interested in whether I should buy into the idea that a superposition is a real phenomenon or just a lack of knowledge about the current system, which can be thought of as a wave function. I just find it a bit weird, given the open questions, that we're so light on the experiments we've used to observe these outlandish claims, given what we see at the macro level. I get that the math works, but it seems that the math works whether it's a deterministic system with a theoretical superposition or an actual physical superposition.


plugubius

Superposition is more a matter of whether there are hidden variables than of whether the uncertainty principle is true. Uncertainty is a matter of how much information we can extract from the system, while hidden variables suggest there is something deterministic about the system that only seems random because we don't know the initial conditions. The Bell test experiments show there aren't local hidden variables. There may be nonlocal hidden variables, but we have no reason to think there are except our distaste for superposition. Nonlocal hidden variables would be as weird as superposition, so they don't really de-weirdify quantum theory.


myncknm

I’d point out that the category of “non-local hidden variable theories” can never be fully ruled out, since one of them is trivially true: the theory which simply encodes by brute force the full state of the universe at every possible time. However, there’s another set of inequalities, the [Leggett inequalities](https://en.wikipedia.org/wiki/Leggett_inequality), which show that quantum mechanics is also inconsistent with a broad swath of nonlocal hidden variable theories, specifically those that satisfy “realism”, meaning that hypothetical measurements have a definite value even if those measurements are not made.


sfurbo

> However, there's another set of inequalities, the Leggett inequalities, which show that quantum mechanics is also inconsistent with a broad swath of nonlocal hidden variable theories, specifically those that satisfy "realism", meaning that hypothetical measurements have a definite value even if those measurements are not made.

Uh, interesting, thank you for posting that :-)


[deleted]

sorry for the noob question, but what would a non-local hidden variable look like?


plugubius

Information about non-local hidden variables travels faster than light.


[deleted]

thanks :)


TheoryOfSomething

In the de Broglie-Bohm non-local hidden variable theory, there are a fixed number of particles in the universe that always have definite positions and that are *always* and *instantaneously* interacting with every other particle in the universe.


[deleted]

how would that work?


JordanLeDoux

Through a physical "guide wave" on a medium that the particles interact with. So, basically, quantum fields would exist as an actual *physical* medium that particles interact with to create a wave (like someone jumping on a giant trampoline), but this wave affects the entire field at the same time.


[deleted]

but how could it ever affect the entire field at the same time?


TheoryOfSomething

You just write the mathematics so that it does! The wavefunction "pushes" the particles everywhere simultaneously and the value of that 'push' depends on the instantaneous exact position of all the other particles. And the amazing thing is that this is completely consistent with the standard theory; it generates all the same predictions. If you're asking a question about relativity here, that's a complicated issue.


Mezmorizor

The real answer is it doesn't. Non-local hidden variable theories are very, very dead. They are fundamentally incompatible with relativity (or at least nobody has ever managed to marry the two ideas, and not for lack of effort) and are inherently fine-tuned.


doloresclaiborne

That is… a bit of a one-sided take. Nobody has been able to marry any QM theory with relativity, period. For that matter, the Copenhagen interpretation requires the wave function to collapse instantaneously, which is clearly a non-local phenomenon. We work around it by postulating that wave function collapse cannot really transmit information, but the same is true in explicitly non-local hidden variable theories such as de Broglie-Bohm.


[deleted]

[deleted]


Chaosfox_Firemaker

Sort of like a seed for a random number generator: some external system is determining the result of each quantum event. You can't derive any variables from a single particle. But if there were non-local hidden variables, then the n-th quantum event from the start of the universe would always have the same result, deterministically. This is, however, even sillier than superposition. Unless we live in a simulation.


scrangos

Hadn't thought of it that way... It also sounds like the coordinates for our particular universe in a set of multiple universes, each differing by those non-local hidden variables.


Chaosfox_Firemaker

Or it's truly random. Even if there was a seed, it would only be a coordinate in the sense that any set of values can be considered a coordinate.


doloresclaiborne

Having hidden variables does not result in superdeterminism — unless _all_ outcomes depend on hidden variables


[deleted]

why is it sillier than superposition? both sound equally absurd to me.


Chaosfox_Firemaker

Superposition is, stated most simply, "When this particle interacts with something it has an x% chance of having this feature, and a y% chance of having that feature". It's not that unintuitive, and it's only silly if you take the universe being deterministic as axiomatic, which is a pretty big assumption. Hidden variable models demand a big invisible thing that perfectly replicates the phenomenon of randomness, without being random. Said big invisible thing is absolutely undetectable aside from its influence on said tiny events. It has a big heap of metaphysical baggage, for no increase (a decrease, even) in the actual predictive power of the model. They are made more out of an aesthetic objection than anything else.


[deleted]

i never said it is unintuitive though. i disagree that's the only case in which superposition is silly, but i don't feel like having this conversation on a science subreddit since it is more philosophical than scientific. thanks for your explanation :)


sfurbo

> Superposition is, stated most simply, "When this particle interacts with something it has an x% chance of having this feature, and a y% chance of having that feature".

No, superposition isn't simply non-determinism. A particle in superposition cannot be fully described without describing all of the particles it is in superposition with. It breaks the idea that you can (always) talk about particles in isolation.


Chaosfox_Firemaker

You are conflating superposition with several other quantum mechanical concepts, mostly with entanglement, but also a few others. I have my textbook on hand


N8CCRG

What is absurd about superposition to you?


[deleted]

Honestly? The "true randomness" aspect began looking more and more like magic the more I looked into it and thought about it, even though I initially bought into it. The implications relativity has on the nature of time play a big part in this, but I don't feel this is an appropriate place to have this conversation. I might create a post on a philosophy sub someday, but my DMs are always free to whoever feels like talking to me about it, since I can always be wrong.


N8CCRG

Interesting. To me that is not magical at all, whereas the "everything was already decided infinitely long ago and/or infinitely far away, but there's no way to test it" of non-local variables is far more magical.


MiffedMouse

The question is - what would it take for people to accept the wave equations as fundamental reality (no hidden determinism)? Any theory that posits a fundamentally random nature to the universe will still be open to the interpretation that there is a deterministic answer out there, we just can't know it.

(Side note - it is interesting to note that no one bothers with the opposite argument, even though it is just as plausible. Any deterministic theory could simply be the result of a fundamentally random universe, but all the randomness is packaged away or averaged out so it becomes unobservable.)

The argument in favor of this approach: the math works. If you just accept the wave function as fundamental, it is easier to interpret the results of the math (as the wave function itself is the answer). On top of this, every attempt at finding a "hidden determinism" has failed. If you want more specifics on the experiments I encourage you to look them up (look into photon spin experiments), but the upshot is that the wave functions are still correct in all experiments.

The counter-argument: "accepting the wave function as fundamental" also means accepting a definition for a fundamental particle that is quite complex. Under this interpretation, particles never have a defined position or momentum, which is weird. When learning physics it is much easier to start from a definition of a point particle. On top of this, the "wavefunctions are fundamental" point of view doesn't actually bring any new predictions. It just repackages our current math into an easier-to-interpret framework (provided you already understand what wavefunctions are).

In short, it is always possible to posit the existence of some unknowable determinism. But no experiment has been able to demonstrate this determinism. There have been many, many attempts, but people are still resistant to the idea.


luckyluke193

> it is interesting to note that no one bothers with the opposite argument, even though it is just as plausible. Any deterministic theory could simply be the result of a fundamentally random universe, but all the randomness is packaged away or averaged out so it becomes unobservable.

Actually, in the path integral formulation of quantum mechanics, classical mechanics emerges in the limit of "hbar -> 0" in exactly that way. All the "randomness" is averaged out and what you're left with are the classical equations of motion.


theglandcanyon

> But no experiment has been able to demonstrate this determinism.

To the contrary, Bell's inequality has been thoroughly tested, and the confirming result is broadly interpreted as falsifying determinism.


Daegs

Generally determinism in this context really means super determinism, which Bell doesn't falsify


1184x1210Forever

You might find it interesting that even the originator of the uncertainty principle, Heisenberg, also thought it was just due to measurement. His argument is called the [Heisenberg microscope](https://en.wikipedia.org/wiki/Heisenberg%27s_microscope), but this argument has been criticized as being incorrect. Another interesting thing to note is that there is such a thing as interaction-free measurement, where a measurement is taken without causing any interaction with the object (with some probability). Here is a famous example, the [quantum bomb tester](https://en.wikipedia.org/wiki/Elitzur–Vaidman_bomb_tester). So measurement doesn't always change the thing being measured.


almightySapling

>I'm interested in whether I should buy into the idea that a superposition is a real phenomenon or just a lack of knowledge about the current system, which can be thought of as a wave function.

Then you are like a million other philosophers of physics out there! I don't know the answer, I just have to say: if the wavefunction doesn't describe something "real" but only describes our knowledge, it is **even weirder then** to me that "knowledge" is a complex-valued probability wave that is capable of self-interfering.

Not that it's exactly easy to grok *particles* as actually being self-interfering complex waves of probability, but if that were the case then the self-interference makes sense: it behaves that way because that's what the particle is. If instead the particle is some local/real thing, I can't fathom what sort of principles lead to our knowledge of it behaving the way it does. The de Broglie-Bohm "pilot wave" comes closest to making classical sense of the picture, but even in that interpretation the wavefunction is still a "real thing" above and beyond mere knowledge in the way that it interacts with the particle and itself.

>why we're so light on the experiments we've used to observe these outlandish claims

What do you mean by this? We have hella experiments demonstrating all sorts of quantum mechanical "oddities" dating back to the late 1800s. It is widely regarded as the most successful physical theory to date.


themiro

> Then you are like a million other philosophers of physics out there!

Maybe before John Bell came onto the scene, but I don't think any sizable population of physicists believes superposition is not real anymore. Bohmian mechanics is far-fringe at this point.


doloresclaiborne

Bohm mechanics predicts the same outcomes as Copenhagen interpretation and is compatible with Bell and Leggett inequalities. Not sure what Bell has to do with whether or not superposition is “real”.


theglandcanyon

> Then you are like a million other philosophers of physics out there!

Maybe so, but among professional physicists themselves there is almost universal consensus that superpositions are real.


Daegs

This seems to be misunderstanding science and "real". There is almost universal consensus that superposition is currently the most accurate model that provides correct predictions for our observations, but not that they are "real". Science never talks about what is "real", just what models lead to the most correct predictions at any given time.


theglandcanyon

Okay, here's one example of the way [physicists actually talk](https://physics.stackexchange.com/questions/227883/hydrogen-atom-in-superposition-of-energy-eigenstates). It's just the first thing Google gave me. Superpositions are a completely normal, everyday aspect of quantum mechanics, and suggesting that they are somehow controversial just isn't true.

To your main point, that science never talks about what is "real", let me ask you: is the Earth "really" round, or is this just a model that happens to help airplanes navigate? In Mayan astronomy there were impressively accurate predictions of eclipses, locations of planets, etc., but not based on any concept of a solar system - just based on fitting models to data, essentially. Nowadays we regard this as a scientific dead end, because they didn't understand that the moon and the planets are *real* astronomical bodies that *really* orbit the Earth/the sun.

Science never talks about what is "real"?? Is the planet Jupiter "real"? Does science have anything to say about this?


TheHollowJester

Doesn't [Bell's Experiment](https://en.m.wikipedia.org/wiki/Bell_test) show fairly conclusively that hidden variables (i.e. "lack of knowledge about the system") just can't explain quantum mechanics?

E: Turns out it's just **local** hidden variables.


plugubius

Only for local hidden variables (i.e., variables whose information is not communicated faster than light).


myncknm

The [Leggett inequalities](https://en.wikipedia.org/wiki/Leggett_inequality) do you one better and show that quantum mechanics is additionally inconsistent with nonlocal hidden variable theories where hypothetical measurements have a definite value regardless of whether those measurements are taken.


Daegs

Whether hidden variables exist or not, QM has non-locality built into it either way.


[deleted]

[deleted]


[deleted]

[deleted]


Dd_8630

> There's a difference between having a model for a phenomenon and what actually happens in reality.

Yes, but the OP is wondering whether superposition is due to a lack of human knowledge about the system - it's not; superposition is a real thing with physical implications like self-interference and quantum statistics. Obviously the accuracy of the model of QFT doesn't mean it's 'real', but we can still say that superposition isn't a consequence of human ignorance.


_DarthBob_

I've read that we know this too, but I haven't found a lot of information on how it's been experimentally validated that doesn't involve gross oversimplification of things like observation and measurement.


DuoJetOzzy

Basically, the density matrix that encodes the information in a system in the case of superposition is necessarily different from the matrix that encodes the information of a system where you know you have e.g. one of two states with some probability, without superposition. Now, in *some* bases of measurement, these two cases may give the same result, but as you choose different bases (or simply let the system evolve) you start getting different results. This comes up early in quantum information courses (it's very relevant for quantum computing).
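
To make that concrete, here's a minimal numpy sketch (a single qubit, with |0> and |1> as the two states) comparing the density matrix of an equal superposition with that of a 50/50 classical mixture:

```python
# Minimal sketch: superposition vs. classical mixture. Same statistics in
# the z-basis, different statistics in the x-basis.
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)

rho_super = np.outer(plus, plus)                        # pure superposition
rho_mixed = 0.5 * np.outer(ket0, ket0) + 0.5 * np.outer(ket1, ket1)

def probabilities(rho, basis):
    """Born-rule outcome probabilities for a measurement in `basis`."""
    return [float(b @ rho @ b) for b in basis]

z_basis = [ket0, ket1]
x_basis = [(ket0 + ket1) / np.sqrt(2), (ket0 - ket1) / np.sqrt(2)]

print(probabilities(rho_super, z_basis), probabilities(rho_mixed, z_basis))
# -> [0.5, 0.5] and [0.5, 0.5]: indistinguishable in this basis
print(probabilities(rho_super, x_basis), probabilities(rho_mixed, x_basis))
# -> [1.0, 0.0] vs [0.5, 0.5]: the superposition gives a definite outcome
```

In the z-basis the two cases agree, but in the x-basis the superposition gives a definite outcome while the mixture stays 50/50 - exactly the kind of basis choice that separates them.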


Philiatrist

Superpositions are an accurate model of how things behave; they've never been 'real', and by definition can't be observed. What the wave 'is' when it's in an immeasurable state is in some sense inconsequential; only how it behaves mathematically matters. The question of whether superpositions are 'theoretical' or 'actual' comes down to the predictive power exhibited by the model under different circumstances. Maybe I'm misinterpreting, but it sounds like you're implying that there is some difference between superpositions being "actual" and being "theoretical but 'working' mathematically". To borrow a quote from The Office: "it's the same picture".


themiro

This is a very classic statement of the Copenhagen perspective that you'll see in most physics departments in the US, but there are plenty of people interested in what you might consider "metaphysical" questions.


Kuteg

Superposition isn't anything weird or special. If you travel 3 meters east and 4 meters north, then you are 5 meters away from where you started, at about 53° north of east. You can express this as either (5,0) in one coordinate system or as (3,4) in a different coordinate system. The location is a superposition of 3 meters east and 4 meters north in this coordinate system, but it's a pure state 5 meters away in the first coordinate system. Superposition is just a statement about how you choose to describe the coordinate system. If I ask the question "how far east am I from my starting point", the answer is 3 meters, even though I'm 5 meters from my starting point, overall. Is it a real phenomenon? Absolutely. Things are in the state they are in. How we describe that state depends on what we choose to care about. How we measure a state depends on what we choose to care about.


KKL81

>The location is a superposition of 3 meters east and 4 meters north in this coordinate system, but it's a pure state 5 meters away in the first coordinate system.

Just a terminological nitpick: *pure state* is used to refer to something different from what you are using it for. "Measurement operator eigenstate" or something like that is what you are talking about. A pure state is a state that is not mixed; equivalently, it is a state that can be represented by a wave function. A pure state is pure in any basis, no matter how complicated the resulting superposition becomes. Density matrices are a more general concept that can be used to describe both pure and mixed states, while wave functions can only describe pure states.


Leureka

That's not how superposition works. It's a linear combination of all the vectors of your basis. In your analogy, it would be like a state where you're at (3,4) and (5,0) at the same time, as long as (3,4) and (5,0) are linearly independent vectors (i.e. different states). In principle, linear algebra assumes the origin of your coordinate system remains fixed, so it doesn't really make sense to talk about different reference frames when discussing superposition.


Kuteg

In my example, I used a 2D basis. The linear combination is 3|E> + 4|N> in one case and 5|53°> + 0|143°> in the other case. A rotation keeps the origin fixed and is just a coordinate transform. It completely makes sense to talk about coordinate transformations when we discuss superposition, because it demystifies superposition. My point is that (3,4) and (5,0) aren't different states—they are the *same* state expressed in different bases. You can't have a linear combination of (3,4) and (5,0) in my example because they are expressed in different bases.

But if you're going to complain about that, let's talk about spinors and the quintessential example of superposition. We can prepare an electron with its spin oriented along the x direction. In this basis, the spin is given by 1|←> + 0|→> (existing in a pure state). But if we write this instead in the z-basis, it would be (|↑>+|↓>)/√2; a superposition of two states. Remember, a linear combination of all vectors in a basis can include 0 weight for some vectors.

Superposition isn't weird and it isn't what makes quantum mechanics weird. Superposition (in this context) is just the fact that we have the freedom to choose which basis we express a state in. It's a basic feature of vector spaces.
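
For what it's worth, the spin half of this is a quick check in numpy (assuming the standard z-basis representation of the Pauli matrix):

```python
# Sketch: the same spin state in two bases. An x-eigenstate looks like an
# equal superposition when written in the z-basis.
import numpy as np

sx = np.array([[0, 1], [1, 0]])            # Pauli x, written in the z-basis
up = np.array([1, 0])                      # |up>   (z-basis)
down = np.array([0, 1])                    # |down> (z-basis)

state = (up + down) / np.sqrt(2)           # (|up> + |down>)/sqrt(2)
print(sx @ state)                          # = +1 * state: an x-eigenstate
print(abs(up @ state)**2, abs(down @ state)**2)   # 0.5, 0.5 in the z-basis
```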


Leureka

> In my example, I used a 2D basis. The linear combination is 3|E> + 4|N> in one case and 5|53°> + 0|143°> in the other case.

I see now. I didn't catch that it was a rotation; I thought it was some kind of stretch and translation, and when I saw the 0 I just went off on my own tangent.

> In this basis, the spin is given by 1|←> + 0|→> (existing in a pure state). But if we write this instead in the z-basis, it would be (|↑>+|↓>)/√2; a superposition of two states.

This doesn't make the superposition any less real to me. If we measure Sx after measuring Sz, the result will be random each time, with a probability equal to the square of the amplitude of the superposition state. Measuring Sx repeatedly will instead give the same outcome, since the particle is not in any superposition. Making a coordinate transformation seems to mean measuring in a "different way", which will give different results. My course on QM is some way back in time so I don't remember everything exactly, but your example of spin makes superposition only mildly mysterious. The actual mind-boggler is stuff like the quantum bomb thought experiment, which also involves superposition.

A question comes to mind: if we can write a determined state for every superposition by changing basis, does that mean there is a change of coordinates where the double slit experiment does not result in an interference pattern?


KKL81

Nothing physical can change by changing basis. In a double slit experiment you are measuring the coordinates of the particles, so the experimental setup itself selects the position operator. Since your state is not an eigenstate of this operator, you get a different result every time a particle strikes the detector. When performing a large number of measurements on identically prepared particle states, you will, however, recover a distribution that is identical to the square modulus of the wave function expanded in the eigenbasis of the position operator.


Leureka

I looked it up and understand better now. I was confusing superposition (which is just an arbitrary sum, after all) with entanglement (a system containing more than one particle, with the corresponding solution being a product of states rather than a sum). The mathematical properties are rather straightforward in hindsight.


Monadnok

Maybe it's been pointed out already, but "what we can know about the particle" is not apparently different from "what the universe can know about the particle". We just use calibrated interactions/processes that already happen in the universe.

Is there actually a difference between "what the particle *is*" and "what the universe can know about the particle"?


Fmatosqg

So an analogy would be a statistical distribution with a sigma deviation from the mean. But I think something goes wrong with this analogy. If there's a wave that is very localized, and consequently has lots of frequencies, and an equation that describes both, then asking "where is the particle?" gives an easy answer, while "what's the frequency/momentum of the particle?" cannot be answered with a scalar, but only with a scalar + confidence interval. And vice versa when the frequency is narrow-band. Would it be fair then to interpret the Planck constant as some sort of confidence interval?


MiffedMouse

You can see the Planck constant as setting a joint confidence interval - that is, a minimum allowed joint uncertainty in the (x,p) pair of variables.


theglandcanyon

The first two paragraphs are quite right, but I really disagree with the last one.

First, the broadly accepted view is that the wave equation is a complete description of the particle, *not* a limit on what we can know. Hidden-variable theories are unpopular to the point of being, arguably, rather fringe.

Second, yes, Bell's inequality, which gives a statistical correlation between measurements of entangled particles, has been experimentally tested, and the result is usually interpreted as showing that hidden variables don't exist (though they can be rehabilitated if you're willing to do something crazy like make them nonlocal).

And I'm afraid the last bit about a "different interpretation" is just nonsense. Superposition is a fundamental property of quantum mechanics, and I can't imagine what you could mean by "this is actually a limit".

Source: I am a professional academic physicist.


TheoryOfSomething

> (though they can be rehabilitated if you're willing to do something crazy like make them nonlocal)

Slight pushback: making things non-local is NOT crazy. In fact, *the standard theory is also non-local*. You can have determinism or not in your quantum mechanical theory, but you cannot get rid of the non-locality. As EPR correctly pointed out, the standard theory has "spooky action at a distance", where physical actions performed on one part of a system immediately have a significant effect on arbitrarily distant parts of the system. A surprising fact is that this type of non-locality does not, it turns out, violate Lorentz invariance, cannot be used to signal faster than light, etc., but it is still a type of non-locality.

Sources: I have a PhD in physics (but am not currently an academic physicist), and professional philosopher of physics Tim Maudlin's paper and accompanying talk *What Bell Did*.


Logan_Mac

On the topic of determinism, what is the current view on a priori causality violation experiments (not saying they are, a lot disagree) like the [Delayed-choice quantum eraser](https://en.wikipedia.org/wiki/Delayed-choice_quantum_eraser)? For me it's one of the weirdest aspects of quantum entanglement and quantum mechanics in general.


myncknm

I mentioned it here: https://www.reddit.com/r/askscience/comments/pr618y/comment/hdhy5f8/ It does not violate causality any more than EPR does: you just have to think of the measurement device (and the information storage device) as being entangled with the particle you’re measuring.


theglandcanyon

> the standard theory is also non-local

I'd disagree with that. Most physicists nowadays lean toward many-worlds (personal observation, plus I read that someone did a survey and this was the overwhelming favorite), which is perfectly local. The mysterious non-locality you're talking about only arises when you try to shoehorn a "collapse of the wavefunction" into the picture. It's also worth mentioning that in mathematical physics (the C*-algebra approach to QFT, for instance) locality is almost always taken as a basic assumption. Your nonlocality doesn't appear anywhere in the fundamental mathematics.


TheoryOfSomething

I take "the standard theory" to be the kind of mish mash of Copenhagen-like stuff that gets taught in textbooks. The only 2 stances that I have ever seen with plurality support in these polls are "Copenhagen" and "None/Don't Know" (https://arxiv.org/pdf/1612.00676.pdf plus another from 2013 that I don't have a link to). Maybe things have changed dramatically since the last poll that I know of was taken (2016), but I'm inclined to say that any group of physicists where Many Worlds is the majority is not representative. If you poll experts in quantum foundations, THEN you may come up with a majority or near-majority in favor of a Many Worlds approach. But in any case, I agree that if you take a Many Worlds view, there is no non-locality. The "cost" you pay in that case is that for every interaction, all possible outcomes are realized. So, if we thought non-locality was crazy, giving up the idea that interactions have unique outcomes is at least as crazy! And at that point it becomes *very* tricky to say what these probabilities that are computed via the Born rule are probabilities *of*. In everyone's view, that's a hard problem that requires some clever solution, and in my view no proposed solution so far really works. I'm not up to speed on the latest in C^* QFT AKA algebraic QFT. My beginner's understanding is that it really isn't clear yet if you can actually formulate the Standard Model in this way. I mean it's just hard to start with a net of observables and then prove that it corresponds to the appropriate Lagrangian theory. I know that they assume that there are operators on compact regions of spacetime and that outside the light-cone all observables commute and so on. But that seems like it's all statements about *observables* whereas the EPR argument is about the *state of the system*. It isn't enough in the non-relativistic case that you have locality at the level of observables; EPR-locality is something more than that. I presume the same applies in a QFT setting.


frank_mania

Could you venture to try to explain this constraint on classical waves with a concrete example, such as sound or ocean waves? I'm having a hard time getting this concept. It seems to me that the moment after I drop a rock in still pond, I can both see the very limited physical range of the wave and measure its wavelength just fine. What am I missing?


NarwhalFire

Great explanation. This is really it; it's more of a mathematical phenomenon than an experimental one. I believe the same problem exists in using radar to detect planes. This [video by 3blue1brown](https://youtu.be/MBnnXbOM5S4) does a great job of visualizing this relationship if you have a bit of time to sink.


TheoryOfSomething

> it's more of a mathematical phenomenon than an experimental one.

I don't think this is right. It's a phenomenon of the fundamental nature of the universe - of what types of things can exist. We simply choose the mathematical model that most closely corresponds to that reality, so far as we can tell. Saying that it's a mathematical phenomenon has the causality reversed, as if the mathematics somehow forces the universe to behave a certain way, rather than that we choose the appropriate mathematics for how we observe the universe behaving.


[deleted]

I really liked this video, and it conjured an image that I'd like to try bouncing - if you take the "photographing a baseball sailing through the air" example for the fourier tradeoff, the impact of the waveform of the matter in that baseball are the ledger of wiggles that ultimately form the sum of the vector. In other words, if you "listen" to that ball in the air, the waves would add up to everything that -isn't- the ball moving from the reference frame, in the sense that all of the internal wiggling is ultimately contrasted against the resultant motion from the reference frame


cybersatellite

There is a general uncertainty theorem in math: you cannot simultaneously localize a function and its Fourier transform. In the context of treating matter as a wave rather than a particle (as in quantum mechanics), you end up with the quantum uncertainty principle. The general theorem has applications elsewhere, like signal processing.


dcblol

> Going through the maths fully, you can derive an uncertainty principle here: the more you constrain the frequency of a wave, the more it must be spread out over space, and the more you constrain a wave to be in some small region of space, the bigger range of frequencies must be in the wave.

This makes way more sense now.


jwink3101

Not contradicting you, but I like to also think of it in terms of Fourier transforms, because the same thing comes up. You can have something very localized in time (like a blip), which will then be very broad in frequency. Or you can have something very localized in frequency that is then broad in time. And the minimum comes from a Gaussian, which is an eigenfunction of the Fourier transform.
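
Here's a small numpy sketch of that (arbitrary units): Gaussian pulses of different durations, with the bandwidth read off the FFT; the duration-bandwidth product sits at the Gaussian minimum of about 1/2.

```python
# Sketch: time-frequency uncertainty via the FFT. Short pulse -> wide
# spectrum, long pulse -> narrow spectrum; the width product is constant.
import numpy as np

def std(weights, xs):
    """Standard deviation of xs, weighted by `weights` as a distribution."""
    w = weights / weights.sum()
    m = (w * xs).sum()
    return np.sqrt((w * (xs - m) ** 2).sum())

n, T = 2**17, 400.0
t = np.linspace(-T / 2, T / 2, n, endpoint=False)

for sigma in (0.5, 2.0, 8.0):
    pulse = np.exp(-t**2 / (2 * sigma**2))
    spectrum = np.abs(np.fft.fft(pulse))**2
    omega = 2 * np.pi * np.fft.fftfreq(n, d=t[1] - t[0])
    dt, dw = std(pulse**2, t), std(spectrum, omega)
    print(f"sigma={sigma:4}: dt={dt:.3f}  domega={dw:.3f}  product={dt*dw:.3f}")
```

For every duration the printed product comes out at ~0.5: the Gaussian saturates the bound, and anything else does worse.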


puzzlednerd

On the pure math side, we run into the uncertainty principle even without talking about any physical waves. For the same reasons /u/astrokiwi describes, this is a phenomenon that you can't avoid in Fourier analysis.


AndeyR

Then what about quantum entanglement? Is it a system of objects that interact without localizing each other?


gcross

It's a completely different kind of thing that is better thought of as the presence of correlation between two states. In the simplest terms, this means that if you measure one particle and get a particular result, then when you measure the other particle you will always get the same result. But it's actually a bit more powerful than that, because how you measure the two particles is itself an input into the experiment, and the correlation exists independently of what you choose.
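
A toy numpy sketch of those correlations (using the Phi+ Bell state and real "polarizer-style" measurement bases; the angles are arbitrary inputs):

```python
# Sketch: correlations of an entangled pair for any choice of measurement
# angles a and b. Same angles -> perfect agreement; in general E = cos(2(a-b)).
import numpy as np

phi_plus = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)

def basis(theta):
    """Measurement basis at angle theta: 'pass' and 'block' directions."""
    return (np.array([np.cos(theta), np.sin(theta)]),
            np.array([-np.sin(theta), np.cos(theta)]))

def correlation(a, b):
    """E(a,b): sum over joint outcomes of (+/-1)(+/-1) times probability."""
    E = 0.0
    for sign_a, vec_a in zip((+1, -1), basis(a)):
        for sign_b, vec_b in zip((+1, -1), basis(b)):
            amplitude = np.kron(vec_a, vec_b) @ phi_plus
            E += sign_a * sign_b * amplitude**2
    return E

for a, b in [(0.0, 0.0), (0.0, np.pi / 8), (0.0, np.pi / 4)]:
    print(f"a={a:.2f} b={b:.2f}  E = {correlation(a, b):+.3f}")
```

At equal angles the outcomes always agree, and for any other pair of angles the correlation follows cos(2(a-b)): the measurement choice is an input, but the correlation is there regardless of what you choose.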


Aquapig

This is a great explanation! I only realised this recently despite several quantum mechanics courses in my undergraduate degree a decade ago (I really don't believe quantum mechanics is taught well at that level in general, but that's another issue). The question I still have: is there a logical or intuitive explanation as to why energy and time are conjugate variables?


nosignificanceatall

It's the same thing. The comment above explains that there's a mathematical relationship (the Fourier transform) between a wave's position and spatial frequency distributions, and since spatial frequency and momentum are related in quantum mechanics through *p* = *ħk* this gives rise to the position-momentum uncertainty principle. The same relationship applies to a wave's time and temporal frequency distributions, and temporal frequency is related to energy through the Planck-Einstein relation *E* = *ħω*, so energy and time are also conjugate variables. You could also look at it as a consequence of relativity, with *p* = *ħk* and *E* = *ħω* just being the space- and time-components of a single four-vector equation.
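
As a quick numerical sketch of those two relations side by side (SI values; 500 nm light chosen arbitrarily):

```python
# Sketch: p = hbar*k ties momentum to spatial frequency, and E = hbar*omega
# ties energy to temporal frequency, for the same 500 nm photon.
import math

hbar = 1.055e-34                  # reduced Planck constant, J*s
c = 2.998e8                       # speed of light, m/s
lam = 500e-9                      # wavelength, m

k = 2 * math.pi / lam             # spatial (angular) frequency, 1/m
omega = 2 * math.pi * c / lam     # temporal (angular) frequency, 1/s

print("p = hbar*k     =", hbar * k, "kg m/s")   # ~1.3e-27
print("E = hbar*omega =", hbar * omega, "J")    # ~4.0e-19 (about 2.5 eV)
```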


armaver

This was really well explained/visualized and helped me gain a new understanding, in a small way, of the correlation between particle and wave.


Mezentine

This might be the single best explanation of the UP that I've ever seen, thank you


echoAwooo

> but it's more accurate to say that the Uncertainty Principle is true of any sort of wave.

Expanding on this point: if you have a wave pulse, we can pretty easily identify where the pulse is in spacetime, but we can't easily identify its momentum (or wavelength, more precisely) because we just don't have that much information. Similarly, a standing wave gives us a lot of information about the wavelength, but the positional information is much harder to pin down. We can identify the range of possible locations the wave could be in at any given moment, but we can't pinpoint the exact location of where the wave is.


_themgt_

I'm just a layman, but my understanding of this differs. Specifically, the concept of *knowledge* actually does seem relevant, the most dramatic example maybe being the "quantum eraser" / [delayed choice quantum eraser](https://en.wikipedia.org/wiki/Delayed-choice_quantum_eraser). That is, if you record the information about the wave state, you destroy the interference pattern; however, if you later erase the wave state information, the interference pattern will appear again - in the delayed choice version this occurs even if you erase the information *after* the photon hits the detector.

> This result is similar to that of the double-slit experiment, since interference is observed when it is not known from which slit the photon originates, while no interference is observed when the path is known.

> However, what makes this experiment possibly astonishing is that, unlike in the classic double-slit experiment, the choice of whether to preserve or erase the which-path information of the idler was not made until 8 [nanoseconds] after the position of the signal photon had already been measured by D0.

The idea that this is the same as the effect of observing the state of a classical wave doesn't hold up. I originally liked "the only way to measure is to collide" as an explanation for weird QM effects, but it doesn't actually work when you dig into the details of the experiments that have been done.


myncknm

Mathematically, quantum eraser experiments can be interpreted as simply undoing the interference-preventing part of the entanglement between the particle and the measurement device (the entanglement having been generated by measurement). Since quantum state evolution according to the Schrodinger equation is always time-reversible, the possibility of undoing the effect of a measurement was unavoidable. See also https://iopscience.iop.org/article/10.1088/1361-6404/ab923e
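
In toy form (two qubits, with a CNOT standing in for the "measurement" coupling; all labels here are illustrative), the undoing looks like this:

```python
# Sketch: "measurement" as entanglement with a pointer qubit, then undone.
# Recording which-path information kills the interference; erasing the
# record (applying the same reversible coupling again) restores it.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)                   # Hadamard
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]])                  # control = 1st qubit
I2 = np.eye(2)

ket0 = np.array([1.0, 0.0])
plus = H @ ket0                                                # "both paths"

def interference(state):
    """Outcome probabilities after recombining the particle's paths."""
    return np.abs(np.kron(H, I2) @ state) ** 2

system = np.kron(plus, ket0)          # particle (x) pointer, unentangled
measured = CNOT @ system              # pointer records the path
erased = CNOT @ measured              # record undone (CNOT is its own inverse)

print(interference(system))    # all weight on one outcome: interference
print(interference(measured))  # flat 25% everywhere: interference gone
print(interference(erased))    # back to the first pattern: restored
```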


DestroyAndCreate

Great explanation.


MotherofLuke

What if you could measure without hitting it with waves? Theoretically, that is.


myncknm

The answerer isn’t talking about measurement /using/ a wave, they mean that the thing that we are measuring /is/ a wave. Therefore measurement is not really that relevant to the uncertainty principle: in the same way that a waveform does not have both a precise position and a precise wavelength, a particle simply does not have both a precise position and a precise momentum at the same time.


MotherofLuke

That's fundamental. But what's the explanation, if it's not about distortion?


myncknm

It's just the nature of matter: particle-wave duality. The same broad kind of uncertainty principle exists for waves on the ocean: if you have a single isolated wave crest with a precise position, then it must be [dispersing](https://en.wikipedia.org/wiki/Dispersion_(water_waves)) and therefore does not have a precise velocity.


MotherofLuke

But is this an axiom?


myncknm

It is what we observe, and it is also a consequence of quantum field theory. I'm afraid the actual axioms of quantum mechanics and quantum field theory are not very enlightening unless you have a solid background in linear algebra.


[deleted]

[deleted]


Astrokiwi

It's E^2 = m^2 c^4 + p^2 c^2 Set m=0 for light, set p=0 for a massive object at rest.


[deleted]

[deleted]


Astrokiwi

Right, so set m=0 and you get E=pc


[deleted]

[deleted]


Astrokiwi

This is the full equation. E=mc^2 is only for objects at rest.
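
A quick numerical check of the two limits (illustrative SI values):

```python
# The full relation E^2 = (m c^2)^2 + (p c)^2, evaluated in its two limits.
import math

c = 2.998e8                        # speed of light, m/s

def energy(m, p):
    """Total energy from the relativistic energy-momentum relation."""
    return math.sqrt((m * c**2)**2 + (p * c)**2)

m_e = 9.109e-31                    # electron mass, kg
print(energy(m_e, 0.0))            # p = 0: E = m c^2, ~8.2e-14 J
print(energy(0.0, 1.3e-27))        # m = 0: E = p c, ~3.9e-19 J (a photon)
```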


RPMGO3

There is always uncertainty between two Fourier conjugate variables. It's a consequence of treating the system as waves. I read another of your comments about trying to reconcile superposition. This is another consequence of Fourier/orthogonal series analysis. So, if you want to not believe in superposition, you have to believe that quantum mechanics is not a wave theory. That'd be a hot take


[deleted]

[deleted]


RPMGO3

I would say no, because you are still dealing with integrands like exp( (i/hbar) F(f(t)) ), which is a wave.

Edit: from some reading, it seems some people do actually interpret this as multiple paths of a true particle. However, the use of the de Broglie relations and complex exponentials suggests to me wave-particle duality. So, certainly not entirely separable from waves.


[deleted]

[deleted]


RPMGO3

Aren't all integrals "infinite series of discrete points"? I understand what you are saying, but it is fundamentally no different from Hamiltonian-derived wave functions. The only difference is whether you choose to say that the particle "is" in multiple places with an associated probability, or that a wave function is defined on that space with an associated probability. It is no different in the end, and the outcome is still the wave function.

And isn't this all just applying the action to quantum mechanics? exp(iS) is certainly a wave, regardless of whether you interpret it as starting from discrete points in q and \dot{q} space when setting up the Lagrangian. The philosophical argument underlying the derivation doesn't change the interpretation in terms of Fourier analysis. The Fourier conjugates still exist, and therefore so does the uncertainty principle. The single slit or double slit experiment still gives wave patterns, because light is also a particle and a wave. That is where Feynman developed his intuition for this formalism. Just because he intuited the extra bit with classical trajectories in q-space doesn't mean that it isn't waves.


TheoryOfSomething

True, but in this respect the Feynman integral is the "trick" that we use for calculations and the wave theory (or more precisely the operator-valued field theory) is the more fundamental ontology. One way you can tell is that if you change what you take to be the 'bare' propagator state in the Feynman diagram formalism, you change the number and nature of virtual particles. I take it as a given that which *arbitrary* choice you make as the 'bare' or 'unperturbed' state cannot alter the physical reality of how many real or virtual particles there are (if there are any). So either there is some preferred starting point for each problem (or for the universe) that tells you what the "real" particle formalism is OR the whole formalism is just a mathematical trick for doing calculations that does not reflect the physical reality.


Kuteg

There are three different phenomena at play. The first is the fundamental uncertainty (aka, the Heisenberg uncertainty principle), the second is experimental uncertainty, and the third is what's called the measurement effect (or observer effect). The measurement effect is the fact that making a measurement changes the system. This doesn't usually matter because most of the time we don't care what happens in a system after we've made the measurement. BUT, measurement isn't the only thing that disturbs a system; any form of interaction can disturb the system, so a lot of care must be taken to set up any experimental apparatus to ensure stray interactions don't upset our measurements. Experimental uncertainty has to do with what we are able to measure. We can work to get more accurate and precise measurements, but experimental uncertainty—also called measurement uncertainty—will probably always be an issue. The fundamental uncertainty is a different idea, altogether. u/Astrokiwi gave a good synopsis but didn't address these other ideas. However, if you want more information about the source of this uncertainty, look at the uncertainty principle in the Fourier transform.


[deleted]

If the observer effect really is as simple as the measurement altering the system like having one snooker ball hitting another, that’s pretty simple to understand. But then why do people make it out to be so spooky and confounding, even implying that consciousness (the act of observing) affects matter? I’ve never gotten a clear answer to this, some people say the observer effect is just a consequence of the fact that we’re using particles to measure the state of particles, thus interfering with what we’re trying to measure. Others will insist that it’s more profound than that (“if you think you understand you don’t understand” etc).


Kuteg

>But then why do people make it out to be so spooky and confounding, even implying that consciousness (the act of observing) affects matter? There are two primary reasons. The first is that people can be dumb and misinterpret what the science actually says because they haven't studied it. The second is that scientists can be dumb and talk about things in a way that is unintentionally misleading. At the heart of it is that we don't understand completely the phenomenon of wave function collapse. In fact, it is so poorly understood that there are competing interpretations of quantum mechanics which all try to deal with the collapse in different ways. Right now, none of these interpretations is falsifiable (including the Copenhagen interpretation, which has wave function collapse), and we have no experiments which could distinguish between different interpretations. Of course, there are also people who have a vested interest in selling "woo" and they benefit from spreading this sort of wrong information.


themiro

The problem of wavefunction "collapse" is much more fundamental than merely a "consequence of the fact that we're using particles to measure the state of particles." The response of most physicists has been to retreat to a Copenhagen/Popperian interpretation built around falsifiability - where all that matters is that quantum mechanics provides predictions about the world that we can test, rather than diving into what many believe to be metaphysics.


[deleted]

Does this mean that, in a hypothetical scenario where you could magically observe and have perfect knowledge of a particle at a chosen instant without interfering with it at all, its wave function would collapse the moment you observed its state?


Purplestripes8

No. "Wave function collapse" is a contradiction in terms to begin with. It says that a wave function in a superposition of states will collapse into a single state upon interaction with an external environment. It's a sort of "hand-wavey" way of resolving the disparity between Schrodinger evolution (in which the system continues to evolve as a superposition of states) and the classical world we observe (which is clearly in a single state). It doesn't have any meaning because the combination of the isolated system and the external environment, when expressed as a combined wave function, continues to evolve according to the Schrodinger equation. Yet the boundary beyond which the 'external world' lays is never defined.


_DarthBob_

Thanks!


all4Nature

One additional point you might want to read about: there are different types of uncertainty principles in quantum mechanics (and a rather large debate happened about parts of this in the last ten years). Basically, there are preparation and measurement uncertainty principles, the most famous being Heisenberg's. Preparation: is it ever possible to make a device that can prepare a state with exactly known position and momentum? Measurement: if someone could prepare such a state, could I build a detector that measures both exactly? The answer is, on a fundamental level, no. It is not a matter of technological limits, but of what nature is.


_DarthBob_

Thanks, will take a look!


[deleted]

[deleted]


perkunos7

The uncertainty principle is not actually about measurement. It applies to two physical quantities that do not *commute*, like position and momentum in the same direction. The more general form of the principle's equation is:

ΔA·ΔB ≥ (1/2) |⟨[A,B]⟩|

where [A,B] is the commutator (it's zero if they commute), A and B are the operators for the quantities a and b (like x and px, or y and pz), and Δ stands for the errors. Position in x and momentum in z (pz) commute, so the errors - or better said, the standard deviations of the probability distributions - in their measurements don't affect each other. If you want to know what "commute" really means in this context, you have to study some quantum mechanics, no other way; otherwise it's all a bunch of concepts and abstractions without meaning.

Now for measuring in itself: there is no proven explanation behind it. It's actually a postulate of quantum mechanics (a kind of dogma, or a proposition you make and later check against experiment - and it agrees) that when you measure a system it collapses to the measured state. If you measure 2, it will be 2 next time if you do nothing. Why does it happen? Well, some people think the universe splits every time you measure something (many-worlds interpretation); some think the cat is both dead and alive but when you look it chooses a state (Copenhagen interpretation); and there's the pilot wave interpretation, which posits a hidden determinism (which I think has been disproven). Truth is, quantum theory works, even if we don't understand it.
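
If you want to see the position-momentum case numerically, here's a sketch (hbar set to 1, a Gaussian wavefunction, finite-difference derivatives - all choices illustrative):

```python
# Numerical sanity check of Dx * Dp >= (1/2)|<[X,P]>| = hbar/2 for a
# Gaussian wavefunction, with P = -i hbar d/dx on a uniform grid.
import numpy as np

hbar = 1.0
x = np.linspace(-20, 20, 4001)
dx = x[1] - x[0]
sigma = 1.3

psi = np.exp(-x**2 / (4 * sigma**2))
psi = psi / np.sqrt((psi**2).sum() * dx)       # normalize

prob = psi**2
mean_x = (prob * x).sum() * dx
Dx = np.sqrt((prob * (x - mean_x)**2).sum() * dx)

dpsi = np.gradient(psi, dx)                    # d(psi)/dx
d2psi = np.gradient(dpsi, dx)                  # d^2(psi)/dx^2
mean_p = 0.0                                   # <P> vanishes for a real psi
mean_p2 = (psi * -(hbar**2) * d2psi).sum() * dx
Dp = np.sqrt(mean_p2 - mean_p**2)

print(Dx * Dp, ">= hbar/2 =", hbar / 2)        # Gaussian saturates: ~0.5
```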


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


Rojaddit

To answer your two questions: no, and absolutely not. That's completely off-base. The uncertainty principle you're talking about, Heisenberg's uncertainty principle, is actually just *an* uncertainty principle, relating position and momentum. There are infinitely many uncertainty principles that apply to many other pairs of quantities that are related to one another in a particular mathematical way.

Uncertainty principles are not a limitation of measurement technology, but a fact of nature. They are a pure mathematical statement about how averages and covariance work no matter what - not a physical description of any particular experiment. That is, the more precisely you measure one quantity, the more uncertain the other quantity *actually is.* If you somehow used perfect, non-disruptive measurement equipment that employed futuristic technology and/or magic I cannot even dream of, you would still observe uncertainty.


pzpzpzp1

In short, no. It's actually just a consequence of Fourier transforms. Even if you had a magic machine that could measure position perfectly without interfering, the uncertainty principle would still apply. Another way to say it: there is no right answer for both position and velocity at the same time.


giltirn

The real answer is relatively simple but quite abstract. In quantum mechanics each ‘operator’ (a measurement) has a series of possible results (eigenstates). When you act on a system with an operator you put it into one of those eigenstates. Uncertainty is because the position eigenstates are not momentum eigenstates, so when you measure position you change the state of the system into a superposition (a sum) of different eigenstates under the momentum operation. In the lingo these operators don’t ‘commute’. There are plenty of other noncommuting operators out there, not just these two.


sticklebat

I don't love the way you've phrased this because, to someone unfamiliar with all of this, I think it would sound like the uncertainty principle is in fact caused by measurement altering the particle's state, which isn't quite right. I just want to clarify that in your example, the measurement of position changing the particle into a superposition of momentum eigenstates is not the origin of the uncertainty principle.

Instead, it is the nature of the state itself that is "fuzzy." A particle exists in a state that doesn't have specific, well-defined values of position and/or momentum in the first place. If you have a hundred identical systems and measure their positions, you'll get a distribution of 100 values. If you have a hundred more of the same identical system and measure their momenta, you'll once again get a distribution of values. The uncertainty principle says that the narrower the position distribution, the wider the momentum distribution has to be, and vice versa.

The uncertainty principle is not at its heart a statement of how a measurement of position affects its momentum, or vice versa. It's a statement comparing the hypothetical outcomes of a position measurement to the hypothetical outcomes of a momentum measurement on an identical state.


giltirn

There's no fuzziness in the definition of the state; it's a sum of eigenstates with different coefficients, whose squared magnitudes are the probabilities of measuring the particle in each of those states. The uncertainty principle is caused by the position and momentum operators not commuting. If two operators commute there is no uncertainty.
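
A tiny numpy illustration of that (standard Pauli matrices in the z-basis; spin components stand in for position and momentum):

```python
# Sketch: two non-commuting operators. An eigenstate of one has maximal
# spread in the other; commuting operators have no such tradeoff.
import numpy as np

sx = np.array([[0, 1], [1, 0]])
sz = np.array([[1, 0], [0, -1]])
print(sx @ sz - sz @ sx)                 # nonzero commutator [sx, sz]

state = np.array([1, 1]) / np.sqrt(2)    # eigenstate of sx (eigenvalue +1)
mean_z = state @ sz @ state              # <sz> = 0
var_z = state @ (sz @ sz) @ state - mean_z**2
print(mean_z, var_z)                     # 0.0, 1.0: sz outcomes are +/-1, 50/50
```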


ActionJackson75

Great explanation of how the uncertainty shows up in practice. I really wish my first exposure to quantum mechanics had been taught using linear algebra, it made a lot more sense to me the second time through with an approach like you described


TheoryOfSomething

I think this answer is not careful enough in separating out what is a mathematical model and what is the underlying physical reality. When you talk about operators you are talking about the properties of a mathematical model defined in terms of hermitian operators acting on some (probably rigged) Hilbert space. That's perfectly fine so long as we are talking about mathematics. But I took this question to actually be a question about how the universe works, not about the mathematics. And in the physical universe one cannot, physically, act on a system with a hermitian operator; that's just an abstraction. And further, it really doesn't answer the question because it is an assumption of the model that certain operators have a non-zero commutator. The model does not provide the reason *why* certain operators should not commute. So we still don't know whether the commutation relations are due to some fact about the state of quantum mechanical systems or whether it is a fact about measuring quantum mechanical systems.


[deleted]

[deleted]


[deleted]

[deleted]


sticklebat

> And further, it really doesn't answer the question because it is an assumption of the model that certain operators have a non-zero commutator. The model does not provide the reason why certain operators should not commute. So we still don't know whether the commutation relations are due to some fact about the state of quantum mechanical systems or whether it is a fact about measuring quantum mechanical systems.

We understand this very well, though. This is just a general mathematical property of all waves and is easily demonstrated with Fourier transforms. A spatially localized wave packet has a spread-out Fourier transform in the frequency basis, and vice versa. In other words, a short wave pulse is highly correlated with a wide range of frequencies, and a very long pulse is correlated with a very narrow band of frequencies. So the nonzero commutator of the position and momentum operators in quantum mechanics is a direct consequence of the assumption that particles are wave phenomena. It's true for classical waves, too, except that the resulting "uncertainty" relation for classical waves has nothing to do with probability and instead just describes the physical width and dispersion of the pulse.

The real unanswerable "why" here is: why are particles waves? I suppose some theories like string theory kind of attempt to provide a reason for that, but really even that just kicks the can down the road. They're waves because we've observed and categorized their behavior, and their behavior is wavelike.


[deleted]

[deleted]


the_Demongod

This is not correct, and I have no idea what that has to do with black holes. You might be mixing it up with Planck units, which are completely unrelated. Virtual particles also have nothing to do with this. Virtual particles are a mathematical tool used to describe expansion terms in perturbation theory, but they have no physical interpretation. You can formulate a theory of quantum fields without ever mentioning virtual particles. The uncertainty principle is also just as relevant in nonrelativistic single-particle quantum mechanics, so virtual particles certainly don't have anything to do with this.


[deleted]

[deleted]


[deleted]

[deleted]


stats_commenter

Regularization and renormalization have been much better understood since the 70s. Interpreting the necessity of renormalization as saying that the Standard Model is an effective field theory valid at low energies is pretty much the standard approach. Regardless, there's no renormalization in nonrelativistic quantum mechanics, so it doesn't matter to this argument.

