from entangled states: two recent posts…
How to measure the quantumness of a system:
I’ve talked before about the fundamental issue of the behavior of a system near the quantum/classical regime boundary. Simple systems on a small scale are dominated by quantum mechanical characteristics. Large numbers of particles considered on a larger length scale act classically. It’s not at all clear how to think about the boundary or transition from quantum behavior to classical behavior.
Back when I was in grad school studying out of Merzbacher’s text, the suggestion was that as N (the number of particles) became large, the quantum equations would naturally tend toward classical behavior. This was motivated by the experience of the late 19th century, when it was shown that the classical behavior of heat described by thermodynamics could be understood even better in terms of statistical averaging over the random behavior of large numbers of essentially inert particles. As the number of particles in the ensemble grew, the odd random states were washed out by the strong, classically understood states.
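That washing-out is easy to see numerically. A minimal sketch (my own illustration, not anything from the post, with exponentially distributed “energies” standing in for any random single-particle quantity): the spread of an ensemble average shrinks like 1/√N, which is why macroscopic quantities look sharp and deterministic.

```python
import numpy as np

# Statistical-mechanics intuition: averaging over many particles kills
# fluctuations. The spread of the ensemble mean shrinks like 1/sqrt(N).
rng = np.random.default_rng(0)

def mean_fluctuation(n_particles, n_ensembles=500):
    # n_ensembles independent "gases", each with n_particles random energies
    energies = rng.exponential(scale=1.0, size=(n_ensembles, n_particles))
    # spread (std dev) of the per-gas average energy across the ensembles
    return energies.mean(axis=1).std()

small, large = mean_fluctuation(100), mean_fluctuation(10_000)
print(small, large)  # the large-N spread is roughly 10x smaller
```

A 100-fold increase in particle number cuts the fluctuation of the average by about a factor of ten, exactly the 1/√N scaling the statistical picture predicts.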
But it hasn’t worked quite as neatly for quantum/classical physics. That may be because the behavior imagined in the quantum realm is completely contradicted by the behavior we see in the macroscopic world. At some point particles apparently cease to be non-local and transition to localized, deterministic behavior. How? Why?
The first step will be working to better describe, or better yet to quantifiably measure, the “quantumness” of a system. Two physicists in South Korea have published a paper that sketches out a theoretical method to do just this, and it suggests some interesting paths for investigation.
“For the past 10 years or so, physicists have been proposing various ways to define or measure macroscopic quantum superpositions. Many of these proposals start by considering the number of particles or the distance between component states involved in the superposition. Although this approach sounds reasonable, the proposals have run into problems – particularly, they have not been general enough to be applied to different types of states.
The biggest advantage of Lee and Jeong’s method of measuring macroscopic quantum superpositions is its generality, which enables it to be applied to many different types of states and allows for direct comparison between them. The method is based on the quantum interference of a given state in phase space, which is the space in which all possible states of a system are represented.
As the scientists explained, a macroscopic quantum superposition has two (or more) well-separated peaks and some oscillating patterns between them in phase space. The scientists showed that the frequency of these interference fringes reflects the size of the superposition, while the magnitude of the interference fringes relates to the degree of genuine superposition. So using this method, the scientists could simultaneously quantify both the size of the system and its degree of quantum coherence. The method also works for superpositions that are fully or partially decoherent, which occurs when macroscopic superpositions lose quantum coherence due to interactions with their environments.”
Read the full details here.
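To make the quoted description concrete, here is the standard textbook example the phase-space approach builds on: the Wigner function of an even “cat” state, a superposition of two coherent states. This is a sketch of the ingredients being quantified, not Lee and Jeong’s measure itself: two well-separated peaks, interference fringes between them whose frequency grows with the separation, and a fringe amplitude (scaled here by a hypothetical `gamma` factor of my own) that shrinks as decoherence sets in.

```python
import numpy as np

def cat_wigner(x, p, x0, gamma=1.0):
    """Wigner function of an even cat state (|a> + |-a>)/N with real
    amplitude a, where x0 = sqrt(2)*a is the peak position in phase space.
    gamma in [0, 1] scales the interference term to mimic partial
    decoherence (normalization is exact only for gamma = 1)."""
    norm = 2 * np.pi * (1 + np.exp(-x0**2))
    peaks = np.exp(-(x - x0)**2 - p**2) + np.exp(-(x + x0)**2 - p**2)
    # fringe frequency 2*x0 along p: larger separation -> faster fringes
    fringe = 2 * gamma * np.exp(-x**2 - p**2) * np.cos(2 * x0 * p)
    return (peaks + fringe) / norm

x0 = 3.0  # separation parameter
xs = np.linspace(-8, 8, 801)
X, P = np.meshgrid(xs, xs)
W = cat_wigner(X, P, x0)

dx = xs[1] - xs[0]
total = (W.sum() * dx * dx)  # integrates to ~1, as a Wigner function must
```

Between the peaks the fringes drive the Wigner function negative (a hallmark of genuine superposition); with `gamma=0` the interference vanishes and only the two positive classical-looking peaks remain.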
One of the biggest basic questions in physics these days seems to be centered on this idea of how the quantum regime extends into, or interacts with, the classical. It’s relatively simple (grin) to see how the relativistic regime collapses to the classical, but it’s not at all that way in the quantum. Getting a tool to at least start to quantify the problem is going to be a major step.
The whole situation reminds me of where we were with the study of critical phenomena back in the early ’80s. Once we worked out how to measure fluctuations near the critical point in different sorts of physical systems, and started classifying them by dimensionality, it became possible for de Gennes to have the theoretical breakthrough that led to his Nobel Prize.
Even single photons must obey Einstein
One of the basic tenets of Special Relativity is that the speed of light (a self-propagating electromagnetic wave) is a universal constant for all observers. That’s what makes all the neat reference-frame trickery work. It’s why it’s a fundamental property of relativity that there is no absolute reference frame and that simultaneity is relative.
The thing about this idea, though, is that it didn’t come from a theoretical background. It came from observation. No matter how carefully scientists in the 19th century tried to measure a change in the speed of light in space, they couldn’t find one. Most people thought the experiments were flawed. Einstein, along with Lorentz and a few others, imagined instead that they needed to create a model that baked that result in from the beginning.
But then along came the idea of wave/particle duality and the conceptual nightmare that has ensued for our macroscopically prejudiced minds. The idea that it wasn’t actually a wave that was moving at the speed of light but an actual particle made the whole situation bizarre. Given the constants of electricity and magnetism, it’s straightforward to derive the speed of light – but putting a particle into the mix, getting rid of the wave nature to imagine just a point particle, and then insisting that this particle’s motion is a constant for all observers just hurts our heads.
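That derivation is worth seeing in numbers: plug the vacuum permeability μ0 and permittivity ε0 into Maxwell’s wave-speed expression 1/√(μ0 ε0) and the speed of light falls out. A quick check using the standard SI values:

```python
import math

# Vacuum permeability and permittivity (SI values)
mu0 = 4 * math.pi * 1e-7   # H/m (the pre-2019 exact definition)
eps0 = 8.8541878128e-12    # F/m

# Maxwell's prediction for the speed of an electromagnetic wave
c = 1 / math.sqrt(mu0 * eps0)
print(c)  # ~2.998e8 m/s, the measured speed of light
```

Two purely electrical and magnetic constants, measured on a lab bench, combine to give the speed of light – that was the original hint that light is an electromagnetic wave.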
Some physicists recently decided to double-check this idea. (This is how science works and awards are won.) We know that light moving in a medium moves more slowly than light in a vacuum. The electrons in the matter interact with the light to change the propagation characteristics of the wave and effectively retard its motion. But what about light in a vacuum – and what about a single photon? Could a single photon, a particle, be induced to go faster than the speed of light? Unlike the situation where we see the wave nature, there’s no obvious reason a massless particle shouldn’t be able to go just as fast as it wants.
“To address this question, [Professor Shengwang Du from The Hong Kong University of Science and Technology] and his coauthors’ demonstration required not only producing single photons, but separating the optical precursor, which is the wave-like propagation at the front of an optical pulse, from the rest of the photon. Previous experiments based on macroscopic electromagnetic wave propagation (involving lots of photons) have shown that the optical precursor is the fastest part in the propagation of an optical pulse. But this study is the first to experimentally show that optical precursors exist at the single-photon level, and that they are the fastest part of the single-photon wave packet.
[…]“In the slow light (with a group velocity slower than c) case, the central part of the main wave packet follows the group velocity,” Du explained. “When the medium density increases (with more atoms), the slow group velocity decreases. In the fast light or superluminal (with a group velocity faster than c or negative group velocity) case, the main wave packet seems to get ‘confused’ and does not follow the group velocity. …We are sure that the main wave packet cannot travel faster than the precursor, which travels at c.”
The results agree with previous studies that have analyzed single photons whose precursor and main wave form have not been separated, which have reported an oscillatory structure. The interference of the precursor and the slightly delayed main waveform can explain this structure.”
Full details here.
It’s a more subtle problem than I lay out above, but it’s manageable, and the experiment verifies that even for a particle the speed of light is an absolute. Why? It’s not really clear. The requirement is easy to bake into an equation, but the physics still escapes us. Of course the duality of particle and wave nature escapes us too.
But no matter. What the experiment shows is that Einstein is right again. His theory posits from the beginning that in 3-space it is impossible to state pretty much anything about the world we live in with certainty. Things are relative. One person’s yes can be another person’s no, and they’re both correct.
Makes trying to make sense of scholastic and reformation strands of theology a real challenge.