Hacker News | Mugwort's comments

Physics doesn't need another Einstein. Einstein explained Brownian motion and the photoelectric effect, created special relativity and general relativity, introduced the cosmological constant, helped found quantum mechanics, and served as an invaluable critic of quantum mechanics. Then he foundered. Einstein never accepted particle physics, refused to follow new developments, and became a dinosaur. Sadly, this is what most "physicists" are doing today: they have followed in Einstein's footsteps, mostly the dinosaur part.

What does physics need? The world doesn't know. Nobody knows, but someone will do it someday, in a manner no one else thought possible or could really anticipate. In fact, that's not entirely true: new developments will happen, and only a handful of people will be in the loop. Lorentz, Poincaré, and others, for example, laid some vital groundwork for relativity.

My own two cents on the matter is that we really don't understand our theories well enough and are badly in need of a firmer foundation. The situation is analogous to calculus before Weierstrass, Cauchy, Dedekind and Cantor.

Of course, mathematics wasn't completely stuck just because calculus wasn't fully developed. Probability and non-Euclidean geometry were stunning developments which predated a truer understanding of real numbers.

So it is with physics right now. Unification, strings, etc. aren't working out so well. Quantum computing is now a thing, and quantum mechanics is enjoying a second revolution not unlike the general relativity renaissance led by Penrose et al.

We can't predict the future. We don't know the sequence of things we need to take the next step in AI, or even if there is one. Will some form of deep learning be all we need? Probably not, but possibly yes.

Physics is right where it should be. Frustration is part of the process. We're feeling some pain because our approach isn't working. Instead of having answers to everything maybe we should focus on better questions.


I'm not a physicist, or even a researcher of anything, but I sometimes watch YouTube videos of people experimenting with materials science and chemistry, and a trend I have seen is that even for things we consider very basic, there is a huge lack of good-quality information available.

One YouTuber I saw spends time looking at new research papers for ideas and has found that they are just impossible to reproduce: following the steps exactly, as well as every possible variation, doesn't produce anything close to the results described in the paper. In another video [0] he attempts to make glass and finds that almost all of the information available on the internet about making glass is not enough to actually do it, since sources contain only enough information to register a patent, not enough to make glass from scratch.

I wonder if we would see huge gains just by making complete and unobfuscated information available to the public, since it looks like even the foundations of society are secret company-internal information.

[0] https://www.youtube.com/watch?v=mUcUy7SqdS0


The world should have a "crafting" wiki, which explains how to build anything from scratch.

From how to make glass from sand to how to build a quantum computer.


Would the crafting wiki contain instructions for crafting the wiki itself?

(An attempt at humour referencing the "set of all sets" paradox. That is all, as you were.)


The Whole Earth Catalog ceased publication in 1971. The Whole Mars Catalog, however...


> Physics doesn't need another Einstein. Einstein explained Brownian motion, the Photoelectric effect, created special relativity and general relativity, the cosmological constant, helped found quantum mechanics, served as an invaluable critic of quantum mechanics. Then he foundered. Einstein never accepted particle physics, refused to follow new developments and became a dinosaur.

If you measure a scientist not by what they accomplished, but by what they got wrong, you'll quickly find that there are no good scientists.


> What does physics need? The world doesn't know. Nobody knows but someone will do it, someday in a manner no one else thought possible or could really anticipate.

I think that is precisely what the professor meant when he talked about Einstein: not the person himself, with all his brilliance and mistakes, but someone capable of a revolutionary way of thinking that could jolt physics out of its current stagnant state. This is especially important since physicists have turned into almost religious believers in their own untestable theories, incapable or unwilling to think outside that narrow box.


Something important to know about Einstein: he was a patent clerk. He didn’t have much in the way of original revolutionary ideas. He read what other physicists had submitted patents for and combined the work. Without a doubt he did good work and all of mankind greatly benefited from it, but it goes a bit too far to claim all of the revolutionary ideas that came out of him were original.

The most important work, relativity, should be credited to Lorentz and Poincaré.


Einstein was only human... doing "better" and never getting stuck on some pet theories might not be humanly possible.


To dimm: willis936's post (on cancel culture) is an example of why physics is stuck. Here is another attempt to give dimm a reason:

"Density of mass is _not_ the source of gravity. A difference in density is. There is no difference in density of the Universe (as a whole) compared to 'outside the Universe' since there is no 'outside the Universe'. And therefore gravity does _not_ dominate the Universe. More precisely: there is no gravity of the Universe (as a whole) at all. - To apply the field equations to the Universe as a whole (Friedmann) is nonsense."

It is simple and has been well known in Germany for years (of course, it is still under the regime of U.S. cancel culture). It doesn't take another Einstein to know his cosmology was wrong. http://www.hashsign.eu


Hilbert and Courant is BY FAR the best mathematical physics book in existence. No contest. Boas and all the others are good, even very good. H & C beats them by a kiloparsec.


New edition of Arfken, Weber and Harris is great!


Hildebrand "Methods of Applied Mathematics" is the most approachable math text I've read. After hearing me express confusion about various topics that my dad thought were straightforward, he gave me his book from when he was in college, and yes, things were suddenly straightforward.


Do you think it might be related to this event (linked below)? I'm going to guess that there might have been a nuclear detonation. It probably wasn't a test; this seems to be an accident. Not all weapons are accounted for, e.g. there are American hydrogen bombs lost in Spain (I know... it sounds crazy, but look it up). Could there have been a bomb lying around somewhere, in a place everyone forgot existed, maybe some forgotten relic from the Soviet days?

A GLOBAL MAGNETIC ANOMALY: On June 23rd, Earth’s quiet magnetic field was unexpectedly disturbed by a wave of magnetism that rippled around much of the globe. There was no solar storm or geomagnetic storm to cause the disturbance. So what was it?

https://wattsupwiththat.com/2020/06/25/weird-out-of-nowhere-...


Almost certainly not. A nuclear detonation would also produce noticeable seismic effects as well as release iodine-131, which was not detected.


https://thebarentsobserver.com/en/ecology/2020/06/various-re... reports that Iodine 131 was measured in Norway during the week of June 2-8.


Ok, but that's two weeks off from this detection.


From what I can see in the article, its source is explained:

“Pc waves are classified into 5 types depending on their period. The 10-minute wave on June 23rd falls into category Pc5. Slow Pc5 waves have been linked to a loss of particles from the van Allen radiation belts. Energetic electrons surf these waves down into Earth’s atmosphere, where they dissipate harmlessly.”


It is basically impossible that lost bombs from old "broken arrow" incidents would detonate in a nuclear way (though the conventional explosive might still explode and contaminate a small area).

Even if you assume the machinery still works after being embedded in the ground or under water, any neutron initiators or batteries would have long since become unusable.


There’s a huge risk of abandoned nuclear weapons self detonating. Russian ICBMs (used to?) have safeties that failed “deadly”. They had to run around and turn them off during the 90’s so they didn’t self-deploy. That was a huge problem because they had lost track of some of them.

Also, the soviet’s “Dead Hand” is still active.

It is a dead man's switch that's designed to automatically destroy the world if it detects the initial phase of a nuclear war. It could easily fail in a way that causes a false positive.

https://en.m.wikipedia.org/wiki/Dead_Hand

I’m sure other countries have equally problematic aging arsenals.


What the hell? Thanks for this, haven't seen anything about this anywhere.


Maybe because only specialists even took note of this. The article itself states that it's more like "hearing a pin drop".

If you look at the scales on the included diagrams, you might note that these "ripples" are on the order of a few nT.

The natural variation of the magnetic field can be 2 orders of magnitude higher, so the real story here is that they were even able to pick this up in the first place...


Cool, thanks.


I disagree that quantum mechanics needs fixing. What it requires is understanding. Many people, even ones with PhDs, don't really have a deep understanding of QM.

Ask a grad student, or even a professor of physics, what is essentially different about QM versus classical mechanics.

I almost always get one of two answers.

QM has operators acting on a Hilbert space (a rigged Hilbert space). It's so mysterious because we have to replace everything with operators. Well...

Have they heard of the Koopman–von Neumann formulation of classical mechanics? It's CM formulated using Hilbert spaces and operators.

https://en.wikipedia.org/wiki/Koopman%E2%80%93von_Neumann_cl...

Another answer is uncertainty. QM has uncertainty. Well... so does classical mechanics!

So what is different? That's a very long answer but one thing is the Exclusion principle.

Besides the KvN formulation, is it possible to do QM using the equations of CM in the same form?

Yes, it is: the Wigner–Weyl–Moyal formulation of QM. Look it up, or look up Moyal star products.

Lastly, the collapse of the wave function.

Already done long, long ago by John von Neumann in his book on Quantum Mechanics. The correct formulation involves the subtle concept of decoherence. Von Neumann's trace operators are used in extreme cases where standard calculations (which involve a tiny bit of cheating) won't work, e.g. black hole entropy.

What's wrong with QM? So far nothing! It's possible someone might find a flaw but it hasn't happened.

The problem is mostly US. We don't understand all the implications of what we already possess. The enormous amount of myth-making and lack of understanding surrounding QM obfuscates what the true problems are, which in 99% of cases is that most physicists simply haven't studied von Neumann carefully enough.


I think part of the problem (sociologically) is also sharp reluctance on part of funding agencies, and hiring committees. Graduate students are made to feel, in no uncertain terms, that working on quantum foundations equates to career suicide.

It’s like physicists (collectively) have PTSD from a few decades of trying to understand QM, which then crystallized into the maxim “shut up and calculate”. What we’re seeing is the resulting learned helplessness.


Thankfully quantum information has been a successful way to smuggle foundational research past funding committees.


If someone has a better way to reduce the mental load of reconciling all the experimental data, physicists will no doubt adopt it. Until then, they are wise to mostly muse about foundations as a personal pastime rather than engage in ideological wars.


> The correct formulation involves the subtle concept of decoherence.

The trouble is that decoherence only explains part of the story. With decoherence you end up with the probability density converging into multiple outcomes. But decoherence does not in any way explain how one of these outcomes gets picked.

As for what is wrong with QM, the main issue for me is the measurement process, which is just posited axiomatically. How measurement works should be explained by the theory but it is just posited. They assume a classical device that does the measurement. Decoherence explains bits of it but much is left hanging loose.


Decoherence describes the spectrum of behavior between quantum and classical.

As you mention, the probability density converges to multiple outcomes, but then I believe you're in more of an Everett Many Worlds situation between the outcomes, rather than facing a "choice."

In practice, it seems like all of the Many Worlds scenarios statistically converge into one world sooner or later anyways (would love this to be formally shown, but have never seen it, so it's just conjecture from me). For example, if you flip a quantum coin, splitting the universe in two in a Many Worlds Interpretation, who cares? Does the Sun notice? Does someone in a town a mile away?

You've created a small bubble of "two universes," in a Many Worlds sense, but that bubble will pop. Quantum mechanics is ambivalent about the direction of time, so it sure seems like worlds are joining as fast as they're splitting, keeping us statistically around one effective universe.


In Scott Aaronson's paraphrased words...

Quantum mechanics is just statistics, but operations preserve the 2-norm instead of the 1-norm. Instead of case weights (probabilities) adding up to 1, the squares of case weights (amplitudes) add up to 1. Everything else (the uncertainty principle, measurement mattering, no cloning, Bell inequalities, etc, etc, etc) follows.
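A tiny numerical sketch of that paraphrase (just standard linear algebra, not taken from Aaronson's notes): a stochastic matrix preserves the 1-norm of a probability vector, while a unitary preserves the 2-norm of an amplitude vector.

```python
import numpy as np

# Classical stochastic evolution preserves the 1-norm:
p = np.array([0.5, 0.5])              # probabilities summing to 1
M = np.array([[0.9, 0.2],
              [0.1, 0.8]])            # columns sum to 1 (stochastic)
print((M @ p).sum())                  # -> 1.0 (up to float error)

# Quantum evolution preserves the 2-norm:
psi = np.array([1, 1j]) / np.sqrt(2)     # amplitudes; |psi|^2 sums to 1
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)     # Hadamard gate, unitary
print(np.sum(np.abs(H @ psi) ** 2))      # -> 1.0 (up to float error)
```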


One of my professors had a pithy way of saying this: "Quantum mechanics is the square root of classical mechanics."


Damn, now I need to understand the square root.


> Everything else (the uncertainty principle, measurement mattering, no cloning, Bell inequalities, etc, etc, etc) follows.

Did Aaronson provide a source for that?


He usually explains the details right after stating it. For example, you can follow along in his lecture notes: https://www.scottaaronson.com/democritus/lec9.html .

"Statistics but with the 2 norm" is really just a succinct way of stating the postulates of quantum mechanics, and obviously all effects of quantum mechanics are determined by the postulates. So you shouldn't really see this as a statement that's controversial at all.


In many cases, Wikipedia actually lists the analogy for classical probability theory. For several theorems, the proof for Hilbert spaces can be directly brought to the classical world. The main difference to keep in mind is that we have amplitudes, not probabilities, and they can be negative or complex. We recover typical probability theory by taking norms/squares of amplitudes.

The uncertainty principle follows via doing signal analysis on Fourier-transformed wavefunctions. A fun professional treatment from Baez et al. is [0], and 3blue1brown has an excellent two-video visual explanation [1][2]. The uncertainty principle turns out to be a special case of the sampling theorem (yes, that one! [3]), which itself turns out to be a special case of a result in sheaf theory [4].
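As a rough numerical illustration of that Fourier tradeoff (units with hbar = 1; the grid and width here are arbitrary choices of mine, not from any of the linked sources): discretising a Gaussian wavepacket and Fourier-transforming it gives position and momentum spreads whose product is about 1/2, saturating the uncertainty bound.

```python
import numpy as np

sigma = 0.7
x = np.linspace(-20, 20, 4096)
dx = x[1] - x[0]

# Gaussian wavepacket, normalised so sum |psi|^2 dx = 1.
psi = np.exp(-x**2 / (4 * sigma**2))
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

# Momentum-space wavefunction via FFT; renormalise on the p-grid.
p = 2 * np.pi * np.fft.fftfreq(x.size, d=dx)
dp = p[1] - p[0]
phi = np.fft.fft(psi) * dx
phi /= np.sqrt(np.sum(np.abs(phi)**2) * dp)

# Standard deviations of the two distributions (both have mean zero).
sx = np.sqrt(np.sum(x**2 * np.abs(psi)**2) * dx)
sp = np.sqrt(np.sum(p**2 * np.abs(phi)**2) * dp)
print(sx * sp)   # ~0.5: a Gaussian saturates sigma_x * sigma_p >= 1/2
```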

Measurements matter because "observables don't commute"; taking linear operators on the complex numbers or other Hilbert spaces can have lasting effects which can't easily be undone. Combine this with "conservation of probability", which is formally mostly abstract nonsense [5], and we get mostly to Aaronson's point of view. (I would go further using the Free Will Theorem. [6])

When Aaronson says that the no-cloning principle is provable using probability, but for complex numbers, he's referring to the standard proof [7]. There are two connections to draw to typical probability theory. The first, and bigger, connection is that random variables can't be cloned in typical probability theory either! The second, deeper, connection is that Hilbert spaces give linear logics, which imply conservation laws for the information representing the particles to be cloned.

Finally, for the Bell inequalities, again there is a standard proof on Wikipedia [9] using typical probability theory. I'd like to mention the overlooked Kochen-Specker theorem [8], which forms the backbone of the Free Will Theorem [6]. Measuring a particle is like taking a sample of a random variable: We decide how we want to ask the particle, and the particle chooses a response that is both allowed by its probability distribution and also correctly represents its context.

[0] http://math.ucr.edu/home/baez/photon/schmoton.htm

[1] https://www.youtube.com/watch?v=spUNpyF58BY

[2] https://www.youtube.com/watch?v=MBnnXbOM5S4

[3] https://en.wikipedia.org/wiki/Nyquist%E2%80%93Shannon_sampli...

[4] https://arxiv.org/abs/1405.0324

[5] https://en.wikipedia.org/wiki/Probability_current

[6] https://en.wikipedia.org/wiki/Free_will_theorem

[7] https://en.wikipedia.org/wiki/No-cloning_theorem#Proof

[8] https://en.wikipedia.org/wiki/Kochen%E2%80%93Specker_theorem

[9] https://en.wikipedia.org/wiki/Bell%27s_theorem


Entanglement is a separate thing though I think.


But how can one understand something in terms of other, more basic things, when it is itself the most basic thing there is? The only language in which one can describe it in a meaningful way is math. We understand how atoms work, how semiconductors work, etc., all in terms of quantum mechanics, but there is nothing else (but math) that can possibly "explain" how quantum mechanics itself "works" the way it does.


Granted, my PhD is in experimental physics, not theoretical. The story as I learned it was that QM is different because CM produced absurd answers for phenomena such as blackbody radiation, line spectra, and the photoelectric effect.


I wouldn't take seriously some random hack claiming physics ph.d.s don't know QM. Physics is defined by how physicists practice it.


Indeed, I think something that happens is that the weirdness and paradoxes of QM are what get reported and debated, with no mention that people are routinely using it to solve mundane problems with a high rate of success and accuracy.


A good trick question is to ask them about the trace of noncommuting operators in finite dimensions:

https://physics.stackexchange.com/questions/10230/trace-of-a...
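A sketch of why this is a trick question (the matrices below are just a random finite-dimensional example I made up): cyclicity forces the trace of any finite-dimensional commutator to vanish, so the canonical relation [x, p] = i*hbar*I, whose right-hand side has nonzero trace, cannot be realised by finite matrices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# Tr(AB) = Tr(BA), so the trace of any commutator is exactly zero
# in finite dimensions (up to floating-point error here).
print(np.trace(A @ B - B @ A))

# But [x, p] = i*hbar*I would need trace i*hbar*n != 0, so x and p
# cannot both be finite-dimensional matrices.
```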


>QM has uncertainty. Well... so does classical mechanics!

Wait, what? I mean, sure, at scale CM is not feasible to compute (e.g. the n-body problem). But our inability to precisely compute otherwise completely predictable events is not the same as QM, which can give true randomness.


QM is essentially deterministic. Neither CM nor QM allows for true randomness. Maybe at this stage it is only a philosophical issue. In the end, statistical mechanics and thermodynamics are what truly rule the world. There is no hard and fast rule on when QM should kick in. We adjust the rule based on need, which is why we largely don't need to deal with foundational issues while interpreting the world in the QM framework.


>QM is essentially deterministic

No it isn't. Even with perfect information and unlimited compute resources you cannot determine the outcome of a simple Bell Test https://en.wikipedia.org/wiki/Bell_test_experiments

The same is not true for CM. QM absolutely allows for true randomness. This has been proven time and time again. I encourage you and the original author to read the above. Look at the Feynman lectures. There are no hidden local variables. This is true randomness.
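For a concrete sense of what a Bell test rules out, here is the textbook CHSH calculation using the standard QM singlet-state correlation E(a, b) = -cos(a - b) with the usual optimal angle choices (these specifics are mine, not from the linked pages): any local hidden-variable model must satisfy |S| <= 2, but QM predicts 2*sqrt(2).

```python
import math

def E(a, b):
    # Standard QM prediction for the correlation of spin measurements
    # on a singlet pair at analyser angles a and b.
    return -math.cos(a - b)

a1, a2 = 0.0, math.pi / 2              # Alice's two settings
b1, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two settings

# CHSH combination: local hidden variables force |S| <= 2.
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))   # 2*sqrt(2) ~ 2.828 > 2: no local hidden variables
```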


The only point at which randomness enters is with "measurement" -- interaction with a classical system. But, if you believe in quantum mechanics, there is actually no such thing as a classical system, only large quantum systems.


And with that observation, you're into the Everett interpretation. Also called Many Worlds.

Namely that when an observer observes a quantum mechanical system, you get a quantum mechanical system that can be described as a superposition of parallel observers who each think that they saw something different. Those parallel observers cannot meaningfully interact thanks to thermodynamic considerations.

Of course accepting this description involves believing in quantum mechanics a little more than most feel comfortable believing in it...


Yes. It's been my defaultish interpretation since I've learned quantum mechanics.

There is a wonderful lecture by Harvard Professor Sidney Coleman, called "Quantum Mechanics in Your Face"[1]. In it, he essentially leads into precisely this. It's what happens when you take quantum mechanics seriously.

My issue is that I can't take quantum mechanics seriously, or expect that its interpretational issues can be sorted out within itself. The problem is that quantum mechanics is "merely" an extremely excellent approximation to quantum field theory. It can be thought of as an "effective theory" in a very similar way to how QFTs are low-energy versions of other QFTs. Which means, naturally, that we should expect the framework of QFT to answer the interpretational issues, and especially to provide guidance as to how the approximations change the interpretation.

This is all a fine idea, but quantum field theories have even worse interpretational problems.

[1] https://www.youtube.com/watch?v=EtyNMlXN-sw


> There are no hidden local variables

But there could be non-local ones. The Bohm theory, which is perfectly consistent with standard QM, is fully deterministic.


The wavefunction is a nonlocal variable!


> The wavefunction is a nonlocal variable!

Important point. It is quite unfair to say "nya nya local hidden variable theories cannot replicate QM", when QM is itself radically non-local.


Yes, but it's not a hidden variable (at least not in a prepared system).


Unless locality is not true, which we don't know for an absolute fact.


There is no "experiment" in a purely quantum world.


> QM is essentially deterministic.

No. The standard formulation of QM posits a measurement using a pseudo-classical device which generates a probabilistic outcome. This is at the very core. Source: any text on QM.

The wave function evolution is deterministic but that is only a part of the theory.


What in your view is the mechanism by which perfectly random outcomes are determined?


Well, either quantum needs fixing or GR does (or both). QM is extremely difficult to understand, GR is easier to grasp. I think it's a clue.


I agree that it needs better understanding, but producing that understanding would likely still require "fixing" it. As it stands, QM is a bunch of algebra that "works" for unspecified reasons when using some handwavy heuristics for converting CM to QM. It produces a miscellaneous set of implications, not all of which are clearly connected to each other.

Fundamentally, QM is just some special subset of algebra with certain properties. Just like other algebras aren't "physics", QM isn't really physics. It's just a bag of tools with the right algebraic features to solve a wide category of problems physicists often have.

This is fine, but first think of ordinary algebra. It has some interesting implications that took a while for people to wrap their heads around. Zero wasn't a widely accepted concept for thousands of years. Negative numbers are already a squirrelly concept: how can you have negative three apples? That was an abstraction hurdle we eventually got across, and now even children understand that the implication is that you owe three apples. If you're given five apples, you have to return three to whomever you borrowed apples from, and now you're left with two.

QM is firmly in the group of abstractions where we haven't worked out the mental models quite yet. Talking about complex-valued probabilities makes most people raise their eyebrows quizzically. It doesn't matter how many different ways you demonstrate that the algebra works out, it's still a very difficult concept to internalise.

One interesting mental model I've come across is that classical probabilities are the products of two things, we just haven't noticed. Quantum Mechanics undoes this multiplication, making it in some sense the "square root of classical probabilities", which is not a probability, but something... "else". It's mysterious because in ordinary life we only ever observe the (else×else) products, never the "roots", even though the latter is more fundamental.

I've never heard anyone take this kind of thought all the way to its conclusion. My current, extremely tentative notion is that the "roots" represent probabilities in a kind of continuous alternate-universe space, along the lines of MWI. Neither the "observer" nor the "observed" is in any one such universe; both are smeared across them in some distribution. Their interactions require their many-worlds distributions to be multiplied to produce a "real" result, which is still a distribution, but now the one we're used to in CM. Self-interactions, such as a resonator in a potential well, require self×self products, which simplify to self². Quantum mechanics just undoes this squaring in order to model the underlying behaviour across parallel worlds.

But note how to get even this far, this tentative explanation already required a nearly complete rethinking of what QM really is. It's never going to be sufficient to shuffle the algebra around on a page, because algebra isn't physics. Physics is. Quantum Mechanics is still in the "a bunch of algebraic tricks" stage and needs to be dragged kicking and screaming into a form that people can intuitively understand in terms of physical concepts, not just mathematical ones.


If QM were a tool, it would be a hammer with a corkscrew for a handle. Sure, it works if you can manage to swing it, but it's really hard to grasp -- has awful mental ergonomics. What distinguishes a true theory (as a mental tool) from a bag of tricks: having that underlying model that we can wrap our minds around.

Some physicists claim it's inevitable because we don't have direct sensory access to quintessential quantum behavior -- we can only have intuitive models for classical physics. I don't buy it -- the human imagination is quite powerful.

I suspect that a superior theory (1) will be mathematically equivalent to QM (2) may suggest obvious extensions that are not equivalent, leading to testable predictions (3) won't result from merely reshuffling equations -- it'll take some serious inspiration.

Another hunch: the decoherence approach is barking up the wrong tree. The lesson of Schroedinger's cat is that realism doesn't emerge from a non-realistic theory. It seems like locality has a better chance of being an emergent property.


Isn't this the tipping point when a company crosses the line, becomes a monopoly, restricts consumer choices, and suffocates competition by tilting the entire playing field? We've been through that before. Sad to see Apple, of all companies, turn evil.


Get a copy of Racket, take notes, type in the programs yourself, experiment. This video is the best, most direct introduction I have yet found.

https://www.youtube.com/watch?v=a5p8DPbaokE

Learning Lisp, Scheme or Clojure is more about learning ideas than anything else.

Later you might want to learn Alonzo Church's Lambda calculus after learning some set theory and mathematical logic.


Racket is not Common Lisp. It's not a bad idea to learn it, but the questioner specifically asked about Common Lisp.


I'm definitely going to listen to the Lex Fridman lectures simply because I enjoyed his interview with Judea Pearl so much.


We also passed 400 ppm CO2 in the atmosphere. Best decade ever for sealing peoplekind's fate.


That can't possibly be true, not while Trump is president.


Is Hacker News fake news? Many times people post things that would make a real expert roll their eyes. There are some smart people here but also lots of inaccurate information.

