This paper is basically statistical mechanics with a quantum veneer. Two major issues:
1. Scale: They're simulating just 13 qubits with QuTiP and making grand claims about quantum thermodynamics. The computational complexity they're glossing over here is astronomical (a rough scaling sketch follows after this list). Anyone who's actually worked with quantum systems knows you can't just handwave away the scaling problems.
2. Measurement Problem: Their whole argument about instantaneous vs time-averaged measurements is just repackaging the quantum measurement problem without actually solving anything. They're doing the same philosophical shell game that every "breakthrough" quantum paper does by moving around where they put the observer and pretending they've discovered something profound.
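To make the scale point concrete, here is a back-of-envelope sketch; the numbers are mine, not the paper's, and assume exact storage of a dense complex128 density matrix:

    # memory needed just to store an exact density matrix
    for n in (13, 20, 30):
        dim = 2 ** n                   # Hilbert-space dimension
        gib = dim * dim * 16 / 2**30   # complex128 entries, in GiB
        print(f"{n} qubits: {gib:,.0f} GiB")
    # 13 qubits -> 1 GiB
    # 20 qubits -> 16,384 GiB
    # 30 qubits -> 17,179,869,184 GiB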
zifk 369 days ago [-]
I disagree with you on both fronts.
1. The main underpinning of this article is the analytical theory they come up with, independent of their simulation. The fact that it explains a few qubits well is exactly why this is interesting. If you were to scale up their model (a spin-1/2 Ising model), you would effectively get a classical magnet, which is obviously well described by classical thermodynamics. It's in the limit of small systems that quantum mechanics makes thermodynamics tricky (a rough sketch follows after this list).
2. Their time averaging is just to remove fluctuations in the state, not to avoid the measurement problem. They're looking at time averages of the density matrix, which still yields a quantum object that will collapse upon measurement. And as their mathematical model points out, this holds for arbitrary time-averaging windows; the limits just change accordingly, since smaller averaging windows allow for larger fluctuations. There's nothing being swept under the rug here.
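For concreteness, a minimal QuTiP sketch of the kind of setup being described: a small transverse-field spin-1/2 Ising chain evolved unitarily, with the density matrix averaged over a time window. The system size, couplings, initial state, and window here are placeholders of my own, not the paper's model or code.

    import numpy as np
    from qutip import sigmax, sigmaz, qeye, tensor, basis, mesolve

    N, J, h = 4, 1.0, 0.5                    # placeholder size and couplings

    def site_op(op, i):
        ops = [qeye(2)] * N                  # identity everywhere...
        ops[i] = op                          # ...except at site i
        return tensor(ops)

    H = -J * sum(site_op(sigmaz(), i) * site_op(sigmaz(), i + 1)
                 for i in range(N - 1)) \
        - h * sum(site_op(sigmax(), i) for i in range(N))

    psi0 = tensor([basis(2, 0)] * N)         # all spins up
    times = np.linspace(0, 20, 400)
    states = mesolve(H, psi0, times).states  # closed-system, unitary evolution

    # Time-averaged density matrix over the window: still a legitimate
    # quantum state, and it still collapses if you measure it.
    rho_avg = sum(s * s.dag() for s in states) / len(states)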
whatshisface 369 days ago [-]
Quantum mechanics is statistical mechanics in the complex numbers.
carlob 369 days ago [-]
Quantum mechanics is Markov chains in imaginary time.
teamonkey 369 days ago [-]
Can you explain that?
diegoperini 369 days ago [-]
State transitions are probabilistic and operators have complex coefficients.
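More concretely, the "imaginary time" quip usually refers to the Wick rotation t -> -i tau (my gloss, not anything from the article), which turns the Schrödinger propagator into a diffusion-like kernel that can be read as an unnormalized Markov transition matrix:

    i\hbar\,\partial_t \psi = H\psi
    \;\xrightarrow{\;t \to -i\tau\;}\;
    \hbar\,\partial_\tau \psi = -H\psi,
    \qquad
    e^{-iHt/\hbar} \;\longrightarrow\; e^{-H\tau/\hbar}.

For a kinetic-energy-like H, e^{-H\tau/\hbar} is essentially a heat kernel, i.e. the transition density of a random walk (the Feynman-Kac picture).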
tsimionescu 369 days ago [-]
State transitions are deterministic, it's only measurement that is probabilistic.
Filligree 369 days ago [-]
Even that is arguable. Subjective experience is probabilistic… kinda.
guntars 369 days ago [-]
Do atoms decay deterministically?
tsimionescu 369 days ago [-]
As long as they are isolated, their state is a superposition of all possible states, and it evolves deterministically, with the amplitude of each of these "sub-states" evolving perfectly deterministically. If you want to perform a measurement, you choose a possible decomposition of the superposition state and measure along that axis, and you'll get one of the values along that axis, with a probability given by the squared modulus of the (complex) amplitude of that value.
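A toy numerical version of that description (my own example, not from the article): amplitudes evolve deterministically under a unitary, and the measurement probabilities are the squared moduli.

    import numpy as np

    psi = np.array([np.sqrt(0.8), 1j * np.sqrt(0.2)])   # superposition of |0>, |1>
    print(np.abs(psi) ** 2)                              # Born rule -> [0.8, 0.2]

    theta = 0.3                                          # a deterministic unitary step
    U = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    psi = U @ psi
    print(np.abs(psi) ** 2, (np.abs(psi) ** 2).sum())    # new probabilities, still sum to 1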
Filligree 369 days ago [-]
Yes, aka continuously. Interactions with larger systems make it appear discontinuous.
wfewras 369 days ago [-]
I saw the best minds of my generation pithposting on hn.
> Even for isolated systems where, in principle, we could have zero surprisal and access to all possible information...
This makes no sense. How could you have access to "all possible information" in an isolated system? You obviously can't make any measurements, and if the system is prepared, then it's entangled with the system used to prepare it and again cannot be isolated. The whole notion of "an isolated system" is a theoretical fiction that doesn't actually exist in physical reality, but even in theory one cannot access all of the information in an isolated system because of the no-cloning theorem. So this really feels to me like the old joke about spherical chickens.
Furthermore, this seems like an already-solved problem. Constructing classical reality requires copying classical information, and the only way to make that happen is to discard quantum information [1]. That is the source of the Second Law and the arrow of time [2].
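A crude numerical illustration of the "discard quantum information" point (my own toy example, not the argument in [1]): dropping the off-diagonal coherences of a pure state's density matrix takes you from zero entropy to one bit of classical entropy.

    import numpy as np

    psi = np.array([1, 1]) / np.sqrt(2)
    rho = np.outer(psi, psi.conj())          # pure state
    rho_cl = np.diag(np.diag(rho))           # keep only the classical probabilities

    def entropy_bits(r):
        p = np.linalg.eigvalsh(r)
        p = p[p > 1e-12]
        return -(p * np.log2(p)).sum()

    # pure state: ~0 bits; decohered (classical) state: 1 bit
    print(entropy_bits(rho), entropy_bits(rho_cl))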
As you know, all possible information for an isolated system is obtained via solutions to the Schrodinger equation. This is standard many-body physics.
lisper 369 days ago [-]
Well, yeah, but that seems like a vacuous observation to me. In order to find solutions to the SE you have to know the initial conditions. How are you going to obtain those for an isolated system? You haven't solved the problem, you have just pushed it backwards in time.
11101010001100 369 days ago [-]
Thermalization (excluding systems which exhibit many body localization and the like) will occur regardless of initial conditions.
lisper 369 days ago [-]
That is manifestly untrue because the SE is time-reversible. That is the whole problem.
11101010001100 369 days ago [-]
Yes, and at the same time, the statistics of the system can still satisfy thermalization.
timewizard 369 days ago [-]
> This implies that for macroscopic systems, the expected time one would be required to wait to observe such a decrease in entropy occurring is unobservably large.
Yea, but we have virtual particles and the Casimir effect. Am I wrong, or aren't these perturbations evidencing themselves on a macroscopic scale?
whatshisface 369 days ago [-]
"Perturbation" can mean either analogical reasoning (something is similar to something it could come from with a small change) or an actual perturbation (the effect of Venus on the orbit of the Moon). Virtual particles are perturbations in the former sense, while quantum fluctuations are perturbations in the latter.
11101010001100 369 days ago [-]
The signal to noise at the union of QM and thermodynamics on HN is evidence of regression to the mean.
abyssin 369 days ago [-]
Why?
11101010001100 369 days ago [-]
The comments on HN are always full of ostensibly deep quips or questions but the work to connect them to scientific or philosophical questions is absent. That's the part that doesn't scale.
nhatbui 369 days ago [-]
So true, tale as old as time. Someone "raises doubts" based on partial knowledge of the subject, they go back and forth with someone, and then finally someone comes in with a conversation-killing "what is consciousness anyways" type comment.
readthenotes1 369 days ago [-]
"The second law of thermodynamics states that the entropy of an isolated system can only increase over time. "
Isn't there a difference between "can only increase" and "cannot decrease"?
goatlover 369 days ago [-]
Over a long enough time, fluctuations to lower entropy states will happen, so the law is statistical.
2rsf 369 days ago [-]
Well, it's an equal sign missing from one. For the latter it can stay the same, while it cannot stay the same for the former.
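Put symbolically (my phrasing of the distinction, not the article's):

    \Delta S \ge 0 \quad \text{(``cannot decrease'': equality allowed at equilibrium)}
    \qquad \text{vs.} \qquad
    \Delta S > 0 \quad \text{(``can only increase'': staying constant is ruled out)}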
n00b101 369 days ago [-]
The trusty laws of thermodynamics strike again
NewsArticle: "Even Quantum Physics Obeys the Law of Entropy" https://www.tuwien.at/en/tu-wien/news/news-articles/news/auc...
NewsArticle: "Sacred laws of entropy also work in the quantum world, suggests study" ... "90-year-old assumption about quantum entropy challenged in new study" https://interestingengineering.com/science/entropy-also-work...
[1] https://arxiv.org/abs/quant-ph/9512022
[2] https://blog.rongarret.info/2014/10/parallel-universes-and-a...