There is a lot going on in physics besides things directly related to the Standard Model. Running out of important things to study is of no concern. As a random example, check out the microscopy work by recent Nobel Prize winner Eric Betzig.
I think people fail to appreciate the many frontiers that exist in just understanding how matter interacts with itself. The past was exciting because we got to discover new building blocks, but that only provides information on the periphery of what's going on. We know all sorts of stuff about the proton, but put it in a nucleus with a few dozen other nucleons and it's a whole different story. All sorts of virtual particles are exchanged, it's really complicated, and there literally aren't any models that can predict the properties of the nucleus purely from first principles. There are important conceptual and computational innovations happening continuously to make progress on hundreds of problems just like this.
I was briefly involved in single-molecule biophysics, where microscopes push the limits of what you can do with visible light in order to understand protein behaviour. There I heard the often-repeated warning: "the microscope is an experiment itself".
This was born of the incredibly fancy and poorly understood microscope: the instrument itself was experimental, and there was no guarantee that you were measuring the protein and not some weird light-physics effect!
> and there was no guarantee that you were measuring the protein and not some weird light-physics effect!
My thesis was in fluorescence biophysics at the nanoscale (STED). We literally had a PI effect. Normally, the 'PI effect' is that when the professor comes in, the experiment stops working, mostly due to Murphy's law. In our case, it was because the extra body heat caused the mirrors to expand and drift out of place, ruining the alignment. This also meant it took a while for the mirrors to come back into alignment after lunch and first thing in the morning. What a mess!
So yes, in microscopy, the apparatus very much is part of the experiment and it takes ages to get things controlled and imaging correctly.
I worked for a while on a team that designed an electron microscope from scratch. There was a phenomenal amount of work to take it from a benchtop kludge that sometimes took an image to a reliable product.
When designing sensitive instruments we had to apply an almost paranoid attention to detail.
Thanks for your reply! I actually found out one of my current coworkers helped build a single-molecule scope; he talked about the use of an air cushion beneath the giant breadboard of lasers to keep them nanometre-accurate.
Most optics tables are floated on air pistons these days; they aren't super expensive. It helps with vibration issues and with shock absorption (there is a lot of slamming your head on the table when doing optics alignment :P ).
Stupid question: normally the labs I work in have excellent temperature control for this reason. In fact, some of the labs are actually rooms-within-rooms (with the outer room having AC, and the inner room having another AC). Was this for budget reasons, or were you physically placing your bodies extremely close to the optical equipment while running experiments?
The lab was in a bio lab, so the AC was set to STP. Well, mostly. Just having a warm person breathing in the tiny room was enough to throw the mirrors off. Since we were doing active tuning and development on the optics, we had to be in there, and it took a bit to get the room to a steady state with us in it. Fortunately, during the summer or winter, the only thing that would change was humidity, not the temperature, and that didn't seem to affect things too much.
Also, not a stupid question. We thought of it too and it took us a week to collect enough data to rule out the AC system as a cause.
Another frontier is light/matter interactions. This is an area where a ton of progress has occurred across a wide range of problems, and that work often has direct implications. I worked on a microscope recently that coupled 6 different lasers into a single fiber for fluorescent illumination. The tools we got to work with... some are from 100 years ago and are still great; others, like spatial light modulators and other light-conditioning tools, are really amazing. These days you can reproduce the Michelson-Morley experiment (technically VERY challenging at the time) in an afternoon. Or do a basic quantum eraser experiment with $2K in optical components (!)
One of the other students in my lab put it this way (very paraphrased):
If you break up the Nobel Prizes and look at the optics-based ones, the ones that made up some new microscope or used light in some new way, then optics becomes a top-5 domain. That is, a lot of Nobels come out of manipulating light. And a lot of the time, the sum total 'new thing' worthy of a Nobel is just a single optical object, sometimes five, not much more. But, the thing is, if you look at how long those experiments took to perform, from concept to working prototype taking some kind of data, then optics varies wildly. Some, like DIC imaging, amount to just a single 1/4 waveplate in the middle, and it took Nomarski nearly a decade and a half to get it just right. Some, like STED, were built in Hell's living room over about a month. And that's the thing about optics research: it's just blind luck most of the time. Sure, the equations help, but they tend to come after the apparatus is up and running. You can spend a few weekends in your living room in your underwear tinkering about while watching the game, or you can spend your entire career painstakingly laboring in the nauseating dark of the lab all alone. Incremental progress just doesn't seem to be a thing in optics; you either get it right or you don't. Anyone that has been in totality during an eclipse knows that there is a huge difference between 99% and 100% coverage. Optics is exactly like that.
Lots going on in the foundations as well, especially cosmology. Particle physics is hardly the end-all, be-all. The reality is that going significantly beyond the Standard Model just needs higher and higher energies, energies that might even be impossible for humanity to ever explore. One could build a better accelerator, which Sabine opposes because it's unclear whether it will yield anything (but honestly, that's science. If there were guarantees, it wouldn't be called "research"). Or one can theorize and do thought experiments and hope to discover mathematical constraints that rule out a big class of models.
It's not like she has a better plan. She says "new and better methods". That's exactly what physicists try to pursue. I highly doubt that an ambitious young scientist wouldn't try to get a cool publication if they could think of a novel theory. But the reality is that novel ideas that fit all the existing experimental constraints are very, very hard to figure out. People work on them, but the lack of a million new models a year is not because of some groupthink; it's because physics is hard.
This was great until the bit about medicine. Medicine has made huge, tremendous progress in the past 30 years. And there are still many serendipitous discoveries happening that lead to new drugs. I mean, Jesus, for the first time ever, a company is going to put up a drug for approval to treat Alzheimer's.
Let's not lump medicine in with physics, please. Medicine has always been skeptical of theories without data and, while maybe facing issues, they aren't the same as physics.
It's ironic that a physicist, in criticizing other physicists, makes a common mistake that all physicists make: oversimplification, and acting like everything is just physics with slight modifications.
Yeah, I am not sure about that. To keep up that pace with statistical rigor, medicine would have to be disturbingly and outrageously unethical. It also deals with more complex issues.
(tl;dr: you should be extremely skeptical about the anti-amyloid hypothesis, and the inconsistency in the trial data here signals that a statistical fluke should be a high concern.)
There's a reason I went back to being a software engineer at an internet company working on ML models that actually work: clinical trials have a long way to go before they produce results that can really be trusted.
I’m going to have to disagree. Anyone in pharma or med devices can tell you that major innovation is basically outsourced to venture capitalists, where if successful through clinical trials (always a big if), it can become acquired. This is not a method for radical innovation, e.g. a device to stop a hazardous insulin metabolic response to digestion versus the current insulin-pump market.
I disagree at the 'rank and file' level, which is different from pharma. Pharma is mainly scientists and engineers paid for years to noodle a problem. Rank-and-file physicians are often Sensors (tuned to the immediate world around them) and Feelers (tuned to people's feelings), which are different from the personality traits required to push past the status quo, take risks, and innovate.
Bullshit; look at the crises in astronomy, for example. The Big Bang bullshit has been falsified for years now.
http://cosmology.info has links to dozens of papers.
Show me one paper that measured Lorentz contraction; you can't, because it does not exist.
The whole standard model is based on wrong assumptions.
In fact, if you actually talk with scientists in the respective fields who don't accept every explanation but ask critically, you will see that most of them are stuck.
> They do not think about which hypotheses are promising because their education has not taught them to do so. Such self-reflection would require knowledge of the philosophy and sociology of science, and those are subjects physicists merely make dismissive jokes about.
Accusing working scientists of being insufficiently reflective on any topic, let alone themselves or the field they work in, seems without merit on its face, and frankly so bizarre as to be silly or even dim-witted. Further, the assertion that only the humanities will save science, or indeed the accusation that scientists are ignorant of the history of science, is a weighty charge that I see no reason to even consider believing.
> They believe they are too intelligent to have to think about what they are doing.
I have a PhD in physics. Even though I have left the field, I’ve hung around with enough of them, and occasionally tried to raise some more philosophical questions with them.
This is 10 years ago, so unfortunately I don’t remember details about discussions I never got going. Some were able to argue for their favourite interpretation of QM, or rather, how to resolve the measurement problem, but I mostly got blank stares when I tried to probe a bit more about the ontology.
One of the reasons I left academia was that I never found the intellectual and curious environment I longed for. The successful people were successful mostly because they knew how to write grant proposals and which committees they should be on, not because they did really interesting science.
Yeah, that's not surprising. Cutting-edge R&D is very boring hyper-parameter tuning basically, and requires a kind of person that is able to do things for a long long time (to gather data) and then to think a bit later.
The same goes on with neuro folks. Consciousness, mind, hello? No, just let's poke rats, thanks.
But this works.
And that's why I don't understand Sabine's viewpoint. She can't just sit in an ivory tower and stare down the cosmos in a contest until it reveals its secrets. We need data, better models, better measurements, etc. Yes, sure, it needs better theorists too, but then why not let the experimentalists do what they do best? It's not like there isn't a bunch of theorists in every serious university on the planet, maybe organize them better?
Sabine's point is that the current approach doesn't work any longer.
> She can't just sit in an ivory tower and stare down the cosmos in a contest until it reveals its secrets.
That is a gross misrepresentation of what she is saying. Did you actually read anything she wrote? She is criticising the notion that you can make progress sitting in the ivory tower, as you express it, coming up with ideas about strings and supersymmetries without any experimental data or other validation from reality. But she is also critical of the other approach: just throw more money and manpower into even bigger experiments before we even know what to look for, and hope something interesting reveals itself.
Of course she can have it both ways. Don’t be stupid. There’s plenty to do in physics without building an even bigger collider which likely won’t teach us more than the current one.
In my experience the successful ones, yes they are good at grant writing, but also they are singularly laser focused on the tiny little bit that is their specialty.
Yes, that's how science progresses, but doesn't really make for interesting coffee table discussions about the nature of reality.
Yes, that is how some parts of science progress. And I certainly don't want to diminish those contributions. But that is also the exact reason why we haven't had any major breakthroughs in fundamental physics in almost half a century. I don't see how an Einstein, Bohr or even Feynman could get tenure today.
QM is fundamentally not correct. They assume energy and matter are interchangeable, which they are not.
Build me a proton from energy and I will shut up, you can't.
The appearance of positrons can be explained much, much more easily, once you get your electron model correct.
When I studied physics I never saw the periodic table as a result of the QM model; it never made sense to me. Now I understand, thanks to a more advanced, but classical, model.
QM and GR can never be unified, because GR is an effect of surface pressure and QM has the wrong surface.
Those mangled protons are not anti-protons. They are not stable, but only a proton with a damaged outer shell (the one that breaks at 5 MeV). Get it stable like the proton and we'll talk again.
Sigh. For your benefit more than mine, I’d like to make you aware that you sound like a Markov chain trained on science-sounding words, because those words don’t fit together in the order you have combined them.
If you are so good, please explain the black spot to me.
I just follow a quite different, more classical model that was not invented by me.
What most people think is a very bad truth function.
I care about the number of assumptions required for a model and the number of papers falsifying it. I don't care if a paper praises how well prediction X fits; you have an infinite number of models that predicted exactly the same value. But through falsification you can sieve out the wrong ones, and guess what, most of the models commonly proposed as "truth" are long falsified. The standard model is not even logically sound, and combined with GR/SR it is highly paradoxical.
Full Ack.
Unfortunately, training against confirmation bias (or other biases) is not part of any course of scientific study.
We in general give very little attention to papers that falsify an accepted theory.
For example, look at the papers http://cosmology.info has collected. If you have gone through some of them, you cannot, at least as a scientist, still think that the Big Bang is a true theory. But still, this bulls*t theory pops up all the time....
This is just one of them, in fact, nearly everything is falsified...
It does not matter if theory X predicts A correctly; if prediction B is false, your theory X is also not true, but at most a model to calculate A.
The model I have adopted also predicted the "Higgs boson", but it has nothing to do with gravity.
> the philosophy and sociology of science are a VERY different thing from "the humanities".
I'm not sure that they are.
But my point is simply that accusing scientists of not being self-reflective is silly, accusing them of ignorance of history is silly, and for this piece to be elevated beyond the level of polemic would require something, anything, in the way of persuasion that its assertions were true.
As usual with Sabine, this article blatantly misrepresents how science is done and how actual scientists think, and serves mainly to portray herself as the only person in the field capable of rational thought. People in the comments here are picking up on this when it takes an unfair swipe at medicine, but it really applies to the whole article.
I am not going to bother doing a line-by-line rebuttal, because I've already done this for nearly a hundred of her previous articles, in forums across the internet. It just doesn't matter, because tomorrow she will put out another massively popular article with the same oversimplifications and misrepresentations, and it will get 1000 times the views any response I write could. That is what it means to be a public intellectual. I am a working scientist, which means I have neither the time to keep up nor the audience.
> As usual with Sabine, this article blatantly misrepresents how science is done and how actual scientists think
No offence, but she is a trained physicist, so she too is an 'actual scientist' by the very definition of the term. And she's completely right about the neglect of economic reasoning. The argument against blowing 40 billion bucks on an accelerator that will likely not surface anything new is valid. Physicists gotta accept that opportunity costs are real.
And the 'canary in the coal mine' point, in particular with regard to pharmaceuticals, is valid too. The research costs for very specialised, small-scale drugs have exploded by an order of magnitude or two, depending on who you ask. Other fields are starting to suffer as well.
I would go even further than she does and say it's an almost universal condition of modern research. We see diminishing returns everywhere, and if this is not stopped and we don't manage to drastically increase the efficacy of scientific research we'll effectively flatline at some point.
For almost any hard material metric you look at (life expectancy, economic growth, or whatever else you might consider a sign of technological advancement), even tens of billions in investment seem to produce only marginal improvements. We are not living in the golden age of science.
> For almost any hard material metric you look at (life expectancy, economic growth, or whatever else you might consider a sign of technological advancement), even tens of billions in investment seem to produce only marginal improvements. We are not living in the golden age of science.
As a knit pick, these metrics are all derivatives of science output, and not science itself. For example, detecting gravitational waves does not move the needle on any of these metrics, but it is still a great achievement for experimental physics.
Fantastic mondegreen ("knit pick", like picking at stitching with knitting needles, as opposed to "nit pick", which is picking off lice eggs to prevent future annoyance)
"Mondegreen" is a word I've needed but didn't know existed. Thanks for sharing.
All intensive purposes.
Get down to brass tax. There's a lot of these that I've uttered or heard and they last a long time because they sound kind of sensible.
> For example, detecting gravitational waves does not move the needle on any of these metrics
It surely will, eventually. Those achievements are not possible without staggering advances in technology that can later be applied in different fields.
The same technology that we first developed for those expensive colliders is now used in MRI and other diagnostic tools that we take for granted today.
> The same technology that we first developed for those expensive colliders is now used in MRI and other diagnostic tools that we take for granted today.
The question is, how much better would "MRI and other diagnostic tools" be if that money had been spent actively trying to improve them, instead of mostly going to detecting gravitational waves? That's the question you need to answer to justify spending billions chasing physicists' latest "fetish".
"Big physics" proponents always justify what I see as wasting billions on colliders and observatories by the unintended benefits that accidentally derive from them, but they never mention how much more progress those other areas could make by using that money directly. Instead of spending $40B on the next big collider, what if we spent it developing CRISPR and related technologies? Or on a "moon-shot" for AGI? Or on developing new tools and techniques to study and understand large-scale social systems?
If you put a team to make a better MRI, you'll get the same MRI maybe a bit faster or smaller.
If you want the MRI 2.0, you start with something monstrously different, otherworldly, like gravitational wave detection. There you reach metrological precision, DSP, optics, waveguides, EM shielding, thermal noise compensation, and other incremental advancements that taken together give you MRI 1.5, and then you can bring in those who made it work into a nice and shiny improved MRI.
The problem with moon-shots (eg. AGI) is that you don't know where to go. With physics at least you sort of do. Higher energy. More sensitivity. Better experiments.
Arguments for "spinoff benefits" from research always seem really weak to me. If you want lighter, imperishable food, invest in food preservation, not a space program that might happen to produce freeze-dried food.
I'm not 100% convinced. I think it's hard to get really smart people to work on problems that aren't interesting. Asking a team to make food preservation 10% better won't invite the creativity and determination that asking them to preserve food for a trip to Mars would.
I think, when you take humans into account, you're better off setting interesting and challenging goals, with many useful byproducts.
That is the best argument for "spinoffs" that I've heard. Well done.
I still think it's pretty weak though; the inspiration benefit seems unlikely to overcome the lost efficiency, in general. And different people find different problems inspiring; some people can be inspired by practical problems (e.g. "how can we lighten the backpacks of trampers").
Are you suggesting these spin-offs are (always?) after-the-fact justifications using issues that were not pressing concerns?
The problem with attempting to justify space exploration with spin-off examples like food preservation is that it risks provoking a 'so what?' response. Piling on unimpressive claims does not strengthen one's argument, it dilutes it, and gives the opposition an easy target to dispute.
When Clarke came up with the idea of communications satellites, he, and others, thought he had come up with a before-the-fact justification for space travel, as he expected these relays would need regular maintenance, but another claimed-spinoff-but-not-really, semiconductor electronics, obviated the need.
Why is the general public wasting millions of dollars on an opioid epidemic? Why does Shkreli's $710 medicine still cost $700 after he left and was charged?
There is plenty of wasted money in the pharma sector that can be spent and profited from in crispr technologies. I'm not sure why you're taking physics money for that.
The physics funding programs have already moved on from big price tags in some spaces. ITER will probably be the last big white-whale project in fusion. There are a bunch of smaller fusion projects being funded at smaller scales to prove their efforts before going big. The iterative tech model is bleeding over into science funding.
I think the right way to think about science investment is the multi-armed bandit. There's a lot of uncertainty, but you also know that if you fund a portfolio with a good distribution, you'll have a chance of maximizing the total good science results that come out of it.
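For anyone unfamiliar with the analogy, here is a toy epsilon-greedy bandit sketch in Python; the "arms" stand in for research programs with unknown payoffs, and the payoff values, epsilon, and round count are made up purely for illustration:

```python
import random

def epsilon_greedy_bandit(true_payoffs, rounds=10000, epsilon=0.1, seed=0):
    """Allocate a fixed budget of 'rounds' pulls across arms: usually
    exploit the arm with the best observed average, but explore a
    random arm with probability epsilon. Returns pulls per arm."""
    rng = random.Random(seed)
    n = len(true_payoffs)
    counts = [0] * n
    totals = [0.0] * n
    for _ in range(rounds):
        if rng.random() < epsilon or 0 in counts:
            arm = rng.randrange(n)  # explore (or force untried arms)
        else:
            arm = max(range(n), key=lambda i: totals[i] / counts[i])  # exploit
        reward = rng.gauss(true_payoffs[arm], 1.0)  # noisy outcome
        counts[arm] += 1
        totals[arm] += reward
    return counts

# Three hypothetical research programs with different expected payoffs;
# the bandit ends up pouring most of its budget into the best one while
# still keeping small bets on the others.
pulls = epsilon_greedy_bandit([0.2, 0.5, 1.0])
```

The point of the analogy: you never stop funding the weaker arms entirely, because your estimates of their payoffs are noisy, but the bulk of the portfolio flows toward what is currently working.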
I've always been very skeptical of claims that heavy investment in the space sciences advanced other technologies more than if we had just invested that money directly in the other technology. I also see a lot of money wasted in the space sciences running experiments on the ISS that aren't really that useful (like crystallography: sure, you can grow nice crystals in space and then bring them back down to earth, but for the money you spent on that, you could have funded 10 PIs, and also crystal structures aren't that useful for advancing science).
Interestingly, if you look at modern science funding, it does basically treat it as a multi-armed bandit and there is a portfolio of project funding that includes both large particle physics experiments and individual investigators.
Isn't the 'problem' (for the wider population) that physicists are motivated towards answering the big questions? That's a valid outlay of billions IMO, just like, say, how we spend billions on football/soccer. So effectively what we're doing is allowing that, because we know there are positive repercussions that come along with such exploration.
I think his point is the science would be better served if it focused on solving problems for people, rather than exploring knowledge for the sake of it. But really I think that is the realm of engineering, and science is really just pursuing knowledge for the sake of it. It's impossible to know what will be useful - Richard Feynman thought it would be fun to analyze the flight characteristics of a flying plate. He later used the math he developed doing that to help him earn a Nobel prize.
Reading your 1st sentence made me think "But that's engineering, it's application. Basic science is getting knowledge just for the sake of it."
And you basically followed with it. Basic science is the foundation engineering builds on. I believe the first semiconductors were a science experiment. EM waves were. Giant magnetoresistance was. The internet was an academic curiosity.
I think it's important to spend some human effort on the free thinkers. It moves the population forward. Any other comparable scientific and engineering progress I can think of was fueled by war.
IMHO, allocating resources only for acute goals leads to local technological optima. Seemingly purposeless research helps us break out.
> I believe the first semiconductors were a science experiment.
I'm pretty sure the transistor [1] was invented specifically to solve a real engineering problem.
[1] I'm taking your use of "semiconductor" to mean transistor because my understanding is semiconductors are a kind of material that was discovered, not invented.
My favourite example is number theory, which was either hailed or ridiculed for millennia as being pure mathematics that can and will never be applied to real life problems.
Today number theory is the basis for several important fields of application, including computation theory, modern cryptography, and numerical calculation.
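As a concrete illustration of that leap from pure mathematics to application, here is a minimal square-and-multiply modular exponentiation in Python, the number-theoretic workhorse behind RSA and Diffie-Hellman (a sketch for illustration, not a real cryptographic implementation):

```python
def modexp(base, exp, mod):
    """Right-to-left square-and-multiply: computes base**exp % mod in
    O(log exp) multiplications instead of exp multiplications."""
    result = 1
    base %= mod
    while exp:
        if exp & 1:                   # current bit set: multiply it in
            result = result * base % mod
        base = base * base % mod      # square for the next bit
        exp >>= 1
    return result

# Fermat's little theorem (pure number theory, 17th century):
# a^(p-1) = 1 (mod p) for prime p not dividing a.
p = 2_147_483_647                     # the Mersenne prime 2^31 - 1
assert modexp(5, p - 1, p) == 1
assert modexp(7, 123, p) == pow(7, 123, p)  # matches Python's built-in
```

Centuries-old results like Fermat's little theorem now sit inside primality tests and key-exchange protocols run billions of times a day.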
Yes, but the question is, did developing number theory hundreds of years in advance of its applications accelerate the development of, say, cryptography? Or would cryptographers have just worked out the number theory when they needed it? If it did accelerate the development of cryptography, how much acceleration did we get for that investment?
Also in modern times, especially, there is a big problem of discoverability. If the mathematical result you need for some application is buried in a thirty-year-old journal using unfamiliar terminology, what are the chances you'll find it and understand it when you need it? I venture to suggest that much of the time it's easier to re-derive the result than to find it.
That's a good point, but I think a strong counter-example would be anything we get from satellites. We went to space for political reasons, but let's just say it was for the sake of science because it sure didn't solve any immediate problems for mankind. But it did create the technology for satellites, which allow us to better predict the weather and gives us GPS tracking, and other things as well. But imagine you wanted to develop a global tracking system, would your first thought be to invent satellites if they did not already exist? So it definitely does matter sometimes, although which times is obviously open to interpretation.
I think that is a weak counter-example. The applications of rocketry and satellites were obvious by the end of WW2. In 1945 Arthur C. Clarke had already published an article explaining how useful it would be to have communication satellites in geostationary orbit, and it didn't take much imagination to suppose that a big enough rocket could put them there.
It's easy to think of it as a possibility, but the amount of effort to put a satellite into space is still extraordinary (space travel is still far from a commodity all these decades later). My point was they would not go through the trouble of first creating satellites as it would be considered too exotic of an idea. It would be like thinking of Uber 15 years ago before smartphones were ubiquitous, and deciding you first need to invent the smartphone rather than shelve the idea.
Speaking of nitpicking, it's nitpicking - literally speaking, removing the eggs of the lice you've probably already removed from someone's hair. It's a fussy activity (of necessity), takes forever, and would probably seem quite an annoying thing to someone (likely a child) who thinks you've already solved their pressing problem and should maybe leave them alone now.
Speaking of nitpicking or the more practical nit combing, none is needed if a 0.9% suspension of Spinosad (1) is applied to a child’s dry hair and washed off after 10 minutes. You may happily and confidently send your kids back to school after this one treatment despite the horrified looks of parents and teachers ;-)
Or you can just let your kid have that fuchsia mohawk she wants. The eggs will fall off the shaved part, and the peroxide and glue will kill off the rest in the unshaved part.
> We see diminishing returns everywhere, and if this is not stopped and we don't manage to drastically increase the efficacy of scientific research we'll effectively flatline at some point.
Most likely there is no way to increase efficacy, because the problems we try to solve become harder and harder as we go. This is why predictions of the singularity are unrealistic: the self-improving AI and the constantly increasing number of human researchers will go toward merely not flatlining, rather than solving everything at once.
Indeed! It's especially relevant because being smarter doesn't obviate the cost of doing experiments. The superintelligence still has to build hardware and wait around for results.
Short summary... she is a research fellow, theoretical physicist, and of course has a PhD in physics.
I fully expect scientists to disagree with each other. I even enjoy reading about their disagreements. But to take shots at qualifications (even if in a passive-aggressive manner) is rude. The tone of the parent attempts to dismiss her by making us feel she isn't a real scientist, while in fact she's talking about her own profession.
I'm not taking shots at qualifications. I only say that people should be defined by what they spend the majority of their time doing, not their job title.
Sabine has more qualifications than me, but like Michio Kaku and NDT, she now spends most of her time promoting a certain message to the public in a very one-sided way. That's why I called her a popularizer. Like Michio Kaku, she can afford to spend the time on that, and I can't.
Not being familiar with this author, the biggest reason I dismissed it is that I didn't see any kind of honest discussion that, well, maybe fundamental particle physics is just harder now, in that the next step of what can be probed by experiments isn't actually possible for humans, now or really ever (e.g. requiring a particle accelerator with the radius of Pluto's orbit, etc.)
So instead you get all this hand-wringing about the process, about how theorists come up with elegant math that isn't testable, yada yada, without at least an honest statement that "It is at least possible that the universe's final secrets will never be amenable to human inquiry."
This is something I've thought about. I'm a physicist, but am now happily working in industry. My job is to solve problems, and for most part I do, but it always seems possible that one fine day, someone will walk into my cubicle with a problem that I just can't solve. I'm always faced with the fact that I can't honestly predict how long the next problem will take. Maybe a day, maybe a week, maybe... forever?
Despite a string of successes, we never know for sure when physics will be hit with a problem that takes a decade, or a century, or even a millennium, to solve. As I understand the history, the ancient Greeks could formulate quadratic equations that they couldn't solve, and it wasn't until something like a millennium later that Al-Khwarizmi cracked it, at least for real roots. Now we teach it to children in middle school.
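For concreteness, the solution that took so long to systematize is the one now drilled into schoolchildren: for $ax^2 + bx + c = 0$ with $a \neq 0$,

```latex
x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
```

with real roots exactly when the discriminant $b^2 - 4ac \ge 0$, which is the case Al-Khwarizmi's methods handled.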
Finding a testable theory of quantum gravity might be the problem that takes a millennium.
I think that I mostly agree with the theme of your post. An important thing to keep in mind is that physics is inherently an experimental science, so the critique of formulating untestable theories is valid. I have heard that cosmology is particularly bad about this, however I am not in that field...so who knows.
> this article blatantly misrepresents how science is done and how actual scientists think
Her factual statements, that every high-energy physics experiment since the 1970s has only confirmed an existing theory (the Standard Model) and that every prediction made by physicists about possible extensions to that existing theory has turned out to be wrong, are correct.
If you think that the underlying cause for these facts is not what she says, what do you think it is?
> Her factual statements, that every high-energy physics experiment since the 1970s has only confirmed an existing theory (the Standard Model) and that every prediction made by physicists about possible extensions to that existing theory has turned out to be wrong, are correct.
Well, the snarky answer is that the equivalent of the LHC (America's SSC) was defunded halfway through construction in the 1990s. This set back the field by nearly 20 years. It's rather unfair to say that no progress was made in the 1990s and 2000s if the very instrument that could have done so was defunded.
Anyway, the real answer is that the vast majority of scientific theories ever formulated were completely wrong. Some people say the early 20th century was the golden age of physics. But there were several failed theories of relativistic gravity proposed in the 1910s, many failed attempts to formulate a quantum theory of fields in the 1920s, and, within particle physics, a slew of failed theories of the atomic nucleus and the fine-structure constant. None of this appears in textbooks.
And this is just the tip of the iceberg. Recall that back then the journals were still getting filled with theoretical papers weekly -- the fact that we teach the material that was right to undergraduates in slim textbooks shows that the vast majority was wrong or irrelevant.
Experimental confirmation has always taken a long time, e.g. see the long disputes over the ether that stretched well into the 1940s, 40 years after the proposal of special relativity, complicated by false positives and systematic uncertainties.
Fundamental science is simply hard, and it only looks like it used to be easy because we forgot 99% of the failures. There is a perfectly valid point to be made that it may be getting harder, but it is insufficient to just point to failures to prove it.
Well, she’s out of a job by the end of the year it looks like, so I guess she’s preparing to live on the ads and the book sales?
Still, I don’t think Hossenfelder is a reason to be this salty! Just post links to some other physics blogger that contradicts her and move on. That’s how Lubos is handled and it works ok as far as I can tell (I can’t actually tell very well on account of all the ignoring of Lubos I’m doing).
At least Lubos uses and shows some math to diss on others' work. Of course, since research math is so ungodly far from rigor, we can't check it without taking on the full burden of becoming a similar theoretician.
But maybe, one can dream at least, in the future science will be done with smart models, that can at least verify themselves given some assumptions.
I'm new to the field. I'd appreciate some links to past rebuttals that you've done because I also want to learn about the discourse and politics surrounding the field. I think that understanding all of those things is quite important, and could benefit me greatly. Thanks.
One thing that bothers me: as the author points out, it has indeed often been resolving inconsistencies that leads to progress. What would you say to that? What are the still-unresolved inconsistencies in physics? To me it feels more like we have mostly run out of the obvious inconsistencies to resolve.
That's a bit like saying phlogiston is an unexplained part of physical theory. Once you understand the underlying mechanics of thought, it stops being mysterious and really stops being a problem at all.
Weird, because if I were to create a list of everything ranked by how confident I am that the listed item exists, consciousness would be at the very top by a wide margin. Everything else could just be a nice illusion (e.g., brain in a vat).
It depends on what you mean. We understand that it is a mechanical, computational process. As opposed to, say, the reigning theory of mind in philosophical circles which is dualist. (Spiritualism in academia in the 21st century... sigh.)
We have not yet mapped out the wiring diagrams of our brains which result in the human experience of consciousness. But that's a technological limitation in brain scanning and simulation. We also haven't yet created AI machines that exhibit what we would call consciousness, but we have good ideas of how to do so and are making progress. In both cases we know there is no need to invoke dualist answers, whether it be souls, ghosts in the machine, or 'qualia.'
>> We also haven't yet created AI machines that exhibit what we would call consciousness, but we have good ideas of how to do so and are making progress.
That's news to me. What do you mean? What are those good ideas that we have and that we are making progress towards, that will lead us to conscious machines?
Actually: conscious software. Presumably, if we can get to strong AI from where we are right now, then we already have the hardware and we just need to figure out how to write the software?
I never said there was broad consensus among all philosophers and physicists, etc. It takes generations for these ideas to truly die, just as it did with other widely believed falsehoods. No one really attacked consciousness from a mechanistic perspective until Turing, and that work wasn't followed up on outside of the AI community until the 70's and 80's, and it wouldn't be until very recent advances in AI that others took these philosophical ideas originating from computer science seriously.
So in the philosophy of physics and the mind, there are whole departments filled with tenured professors who came of age in their thinking at a time when consciousness was a Hard Problem, and have focused their mental tools on a class of solutions (qualia, observer-triggered wave collapse, etc.) which are irreconcilable with mechanistic physical reality, making the problem even more intractable and mysterious. It'll take another generation or two before the ideas of Dennett, Dawkins, Tegmark, etc. get more widely recognized and consciousness finally goes the way of phlogiston.
For references, I recommend Dennett's "Consciousness Explained", and really anything by Daniel Dennett and Richard Dawkins--materialistic explanations of consciousness pervade their work. For a compatible (hah!) physical perspective, I suggest Max Tegmark:
I feel like this is a bit of an oversimplification of a complicated topic. For example, qualia is not a solution, it's a problem! And while Dennett does present an interesting picture (especially in the way he resolves qualia), there is a reason his book is nicknamed "Consciousness Explained Away" -- he simply defines away many of the interesting parts of the problem.
Also, I don't really like the way you're describing these professors of philosophy. These people have spent their lives studying this problem and adjacent topics. Sure you may be absolutely convinced that "things that conflict with our understanding of mechanistic physical reality must be untrue," others believe "what almost all people perceive to be true about their internal experience cannot be dismissed when discussing the nature of that experience." I think this topic is much more debatable than you are making it out to be.
Phlogiston is an elegant argument in this context, but one that regresses the discussion. Using phlogiston as a criticism is elegant because it allows you to frame consciousness as an element that inhabits the brain and then proclaim that framing wrong, leaving at the end of the story only the physical brain and no room for imagining anything else in that space, without ever having proven the case.
Phlogiston is not the correct analogy for grounding consciousness in science. An easier fit would be the software/hardware split you find in computing, where the abstractions and meaning we perceive to be real are built-up layers of computational or logical abstraction.
I am on the side of not thinking you can scientifically ground consciousness with the understanding we have. I don't think either analogy fits and the argument will forever be pinned in debate between characters that prefer their own worldview.
I actually do think that to some extent, studying physics is studying the mind — specifically studying exactly how ‘reality’ differs from our intuition about it. A new discovery in physics is often also a discovery about psychology — for example — relativity exposed our intuition about time and space as incorrect.
At some point you can't fix a model by adding more stuff to it, if it's already inconsistent.
Removing the wrong assumptions first helps more than adding new mechanisms.
Having worked with scientists and mathematicians with PhDs, I can relate to the notion that they can have difficulty identifying what is important and what to focus on. Their lack of eagerness to find real-world applications for their substantial skills is perhaps what has allowed them to stay at university and keep studying for so many years in the first place.
It's no coincidence that, on the other end of the spectrum, a lot of successful entrepreneurs could barely stay in college long enough to get their bachelor's degree.
I'm not sure if "eagerness to efficiently apply one's skills" is something that can be taught but it could certainly be incentivized.
Yesterday you proclaimed that the status quo of physics is free from "bad science", and today you're engaging in a character assassination against a well-renowned female physicist who is questioning the status quo.
No, that's very different. Yesterday I proclaimed that our experimental colleagues do an excellent job of avoiding false positives, because they use great scientific practices. Sabine would agree with me on that.
This article is not about doing data analysis in experiment, it's more about how we should choose the theories that guide our choices in experiments in the first place. I disagree with Sabine because I see her as painting an unfair picture of theorists, as if we were all completely unreflective and she was the only one in the world to think seriously about theory selection. Most of her vague suggestions are things that people (sometimes, hundreds of people) have been already doing for decades.
That breaks the cardinal rule of science popularization: don't trash people or take credit for insights without giving a careful, fair picture of everything that is going on, because the scientists affected don't have the right of reply.
I'd like to point out that discoveries in fundamental physics do not always come from work in fundamental physics. Physical systems exhibit some "holographic" or "self-similar" tendency where high-level statistical effects will often mimic low-level fundamental dynamics. For example, wave propagation was originally described for pressure waves propagating in a physical medium, then applied to light waves after the work of Maxwell. More recently, the Lorentz transformations, originally used to calculate the effects of finite-speed electric fields on susceptible particles, turned out to be a fundamental property of the universe. And the Hamiltonian path integral formulation, originally a mathematical curiosity, turned out to be crucial for describing relativistic quantum interactions.
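For reference, the Lorentz transformations in question, for a frame moving at speed $v$ along $x$:

```latex
x' = \gamma\,(x - vt), \qquad
t' = \gamma\left(t - \frac{v x}{c^2}\right), \qquad
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}
```

Lorentz derived these as a property of electromagnetic fields; Einstein reinterpreted them as a property of space and time themselves.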
Some of the work that `yummypaint describes which is being done in quantum chromodynamics -- a field in which we have a working "lattice" approximation but almost no real analytical solutions -- may eventually prove useful for a theory of quantum gravity. However, QCD is still very immature. Those with a little experience will recognize the phrase "non-perturbative" which describes some QCD problems -- this also appears in quantum gravity, where perturbation theory fails. The lattice gauge theory model used to solve QCD problems today cannot be extended to gravity, but other methods might not have this limitation.
So the lack of clear and demonstrated progress in fundamental physics does not mean that we are not learning real facts which can eventually be useful for describing fundamental physics. The situation is not so dire.
As for critiques of Hossenfelder, it's certainly true that her writing tends to the dramatic. Just last year she was optimistic about asymptotically safe quantum gravity:
Although I am only in physics by degree and not by practice, I think this is "physics as expected."
Old measurements do not change. Over bog-standard Newtonian physics, special relativity provides additional decimal places of precision, and mostly in particular circumstances. For plotting the trajectory of a large rock at comparatively low speeds, Newton's equations work rather well on Earth, once you take air resistance into account. Even if another theory supplants special relativity in the extreme situations, old ballistic tables will still work.
Each advance will make smaller and smaller corrections to the experimental values, in narrower edge cases, even if the mathematics of the newer theory looks completely different. This in turn means that discarding an old theory in favor of a new one, by experimental confirmation of what the old system did not expect, can only come either at more extreme values (more tesla in a magnetic field, for example) or with greater precision and accuracy via more elaborate and finely calibrated equipment. Both are expensive, requiring more time and more work.
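To put rough numbers on how small these corrections are, here is a quick illustrative sketch (the speeds are just round example values):

```python
import math

C = 299_792_458.0  # speed of light in m/s (exact by definition)

def lorentz_gamma(v: float) -> float:
    """Lorentz factor: special relativity's correction to Newtonian kinematics."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# Artillery shell at ~1 km/s: the relativistic correction is parts in 10^12,
# completely invisible in any ballistic table.
shell = lorentz_gamma(1_000.0) - 1.0

# Satellite at ~4 km/s: still tiny, but atomic clocks accumulate this into
# microseconds per day, so here the correction matters (GPS corrects for it).
sat = lorentz_gamma(4_000.0) - 1.0

print(f"shell correction:     {shell:.2e}")
print(f"satellite correction: {sat:.2e}")
```

The point being: each refinement lives many decimal places down, which is exactly why detecting the next one costs more.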
I believe she is right in that we are certainly on course for some very time-consuming and expensive science. However, I do not see an easy way out of it, nor should we, I believe, expect one. Flip open a CRC Handbook of Chemistry and Physics; many of the basic values have been long-known. We have added a few more numbers to the right of the decimal in many cases, but that is it. Two hundred years ago, a single person with a modest budget could discover new physics and chemistry. A hundred years ago, a handful of collaborators with some backing could advance these most basic of sciences. By only fifty years back, we are looking at budgets in the millions and teams of scientists working in harness.
It may simply be out of reach to disassemble a galaxy to build a collider large enough to test some theories and at that juncture, one might well consider physics "close enough to done." A mote of dust is always going to fall on some surface no matter how often you polish and buff. At some point you have to stop fussing and enjoy your furniture.
Yes, it is handy to look for the cases where we can find a way to test our hypotheses for under a million dollars, in no more than a few years. We will run out of those, too.
This is "unsexy" to many. The ratcheting noose of entropy in a closed system, which we can only try to run away from at pedestrian sub-light speeds is, bluntly put, a drag.
I approve of the message that perhaps a 40 billion dollar particle accelerator is an unwise investment. Otherwise, I'm not sure there's useful information here.
"I try to explain to a cosmologist or particle physicists that we need smarter ways to share information and make decisions in large, like-minded communities."
Making decisions in large, like-minded communities? When has that ever been successful? Isn't that pretty much the opposite of what has led to physics breakthroughs up to this point?
"And please spare me the complaints that I supposedly do not have anything better to suggest, because that is a false accusation. I have said many times that looking at the history of physics teaches us that resolving inconsistencies has been a reliable path to breakthroughs, so that’s what we should focus on."
If I were a physicist and some random other physicist of little renown were to suggest that I (and everyone else) need to "focus more on the inconsistencies", I wouldn't exactly be swayed. I would be rolling my eyes. It's just not a very useful suggestion to make.
I wonder if what the author is observing is an effect of the industrialization of science. As she pointed out,
> Because the existing scientific system does not encourage learning. Physicists today can happily make career by writing papers about things no one has ever observed, and never will observe.
Maybe what's really going on is that almost every scientist out there is forced to chase grants and maintain a steady stream of publications. We have to give our scientists the ability to put long-term effort into projects with a low likelihood of payoff; otherwise research stagnates.
Also, it seems like the 'groupthink' she pointed out is a very real problem. Remember that weird EM drive associated with NASA? We all know that it's nearly impossible that it could be a valid phenomenon; but as a global scientific community, almost everyone dismissed it outright. We should all have jumped on it, to see if it could possibly be true, regardless of how outlandish. If we are unwilling to explore the boundaries of what we know, then we deserve to stagnate, and we will.
The inability to make long-term investments is a major reason I don't intend to stay in academia after my PhD. Government and corporate labs don't seem better in my view either.
At present my plan is the following: Keep the amount of time spent making money to the minimum needed to support myself. Then I can invest my own time and money into research. I could make money from research via grants/contracts like normal academics, but that's not necessary in this scheme. I'm thinking about starting several SaaS businesses.
Your suggestion about jumping on the EM drive is in direct opposition to what Sabine suggests:
"The only way to avoid being sucked into this vicious cycle is to choose carefully which hypothesis to put to the test. But physicists still operate by the “just look” idea like this was the 19th century. They do not think about which hypotheses are promising because their education has not taught them to do so. Such self-reflection would require knowledge of the philosophy and sociology of science, and those are subjects physicists merely make dismissive jokes about. They believe they are too intelligent to have to think about what they are doing."
Jumping on a crank's idea is not carefully choosing which hypothesis to test.
You call the originator of the EM drive a crank; I am proposing that the widespread adoption of your dismissive attitude is in large part the cause of the current scientific stagnation observed by the author.
I also suggested that the other large cause of the current stagnation is that we do not have enough scientists pursuing outlandish theories - we should hear about things like the EM drive far more frequently, and we should be excited about new ideas, not dismissive of anything foreign to the current 'groupthink'.
Also, returning to your first point, the author pointed out just the opposite of the point you're attempting to make:
> With fewer experiments, serendipitous discoveries become increasingly unlikely. And lacking those discoveries, the technological progress that would be needed to keep experiments economically viable never materializes. It’s a vicious cycle: Costly experiments result in lack of progress. Lack of progress increases the costs of further experiment. This cycle must eventually lead into a dead end when experiments become simply too expensive to remain affordable. A $40 billion particle collider is such a dead end.
... seeking extremes won't work, otherwise we'd rule out everything that's not supported by current theories. Also we can't let ourselves be deceived by the everyday cold-fusion perpetuum-mobile crackpot.
But we should make it easy for crackpots to test their shtick. So we should invest in better metrology.
And when something actually breaks through, then we should jump on it.
And basically that's what's going on with the "EM drive". Even though it flies in the face of basic theories, the test bench they used was not sensitive enough to rule it out.
There's a broad spectrum of topics there, and quite a few episodes talk about the foundation of physics, including the quantum measurement problem (several episodes about that, in fact), string theory and several cosmology topics.
From the outside, it feels like academic research is performed by an individual or a small group and culminates in a paper at the end. When attacking large problems in physics or biology, this seems short-sighted. A potentially better way would be to attack the problem in an organized manner, with teams researching various aspects of the problem and building off each other. Remove the whole 'paper writing/journal article' step from the equation and just keep moving forward. Does this exist? Are teams of researchers coordinated from a high level and moving through problem spaces with clear direction? Or is everyone working on their small piece, trying to get their bit of credit?
I worked many years in particle physics. It is coordinated within each experiment. You can’t just publish; a lot of people stand behind the work, so there must be consensus. There are abundant physics sub-groups, article review boards, research strategies, etc.
"In the foundations of physics, we have not seen progress since the mid 1970s when the standard model of particle physics was completed."
This seems to discount the explosion of work in the foundations of theoretical physics. Most notably the many variations of String Theory. Just because the work has not yet produced experiments that can validate it does not mean it is not progress. It's some of the most exciting foundational work that's come along in decades.
> This seems to discount the explosion of work in the foundations of theoretical physics. Most notably the many variations of String Theory.
Hossenfelder is well aware of string theory, as she says right in the article:
"All these wrong predictions should have taught physicists that just because they can write down equations for something does not mean this math is a scientifically promising hypothesis. String theory, supersymmetry, multiverses. There’s math for it, alright. Pretty math, even. But that doesn’t mean this math describes reality."
> Just because the work has not yet produced experiments that can validate it does not mean it is not progress.
But it's not just that string theory hasn't yet produced experiments that can validate it. It's that (a) direct experiments to validate it are something like twenty orders of magnitude away in terms of energy scale, so not doable now or for the foreseeable future, and (b) every prediction it's made so far that can be tested at energy scales we can probe (such as the prediction that supersymmetry would show up at the LHC) has been wrong.
Most of those theories have some implications about lower energy ranges. And pushing up energies is a great way to rule out a lot of them.
(And some of those theories have some large scale cosmological implications too, so better telescopes can rule out more.)
But ... there seems to be a lack of (serious) theories that have easily testable predictions. And this is not surprising, because we also lack easily reproducible but as-yet-unexplained data. (For example, the high-energy gamma-ray bursts are very interesting, but we have no idea where they really come from.)
And that's not necessarily a problem. After all, we can measure things with insane accuracy (LIGO, atto/femtosecond laser pulses, neutrino detectors, etc.), and our theories about the accessible energy regimes are quite good (14+ digits for QED, quantum electrodynamics).
At some point it's no wonder we need bigger microscopes to see deeper into reality.
Sure, if someone were to just come up with an even better theory that can be tested easily, that would be great too. But since we have no recipe for that, and we do have quite a few for building bigger experiments, that's what most particle physicists advocate.
Just because the work has not yet produced experiments that can validate it does not mean it is not progress.
The argument I've seen expressed about this, is that these advances should more properly be considered advances in mathematics not physics. Or to put it another way "the math works, but we have no idea if it applies to the real world or not."
It's a question for the epistemologists, I suppose, whether we should consider an advance "real new knowledge" if there's no experimental validation.
For many people who have followed fundamental physics, it has become more and more apparent that string theory, with its variations, is a dead end.
The problem isn’t that no one has managed to experimentally validate the theory. The problem is that string theory has failed to make predictions that could, even in principle, be experimentally tested.
Edit: since I’m getting downvoted, is there anything factually wrong with what I have written?
Fellow (former) physicist here. I think you got downvotes because you merely asserted the death of string theory and didn’t provide any facts. Ironic, because string theory doesn’t present any new facts either. Downvoters don’t know that what you state about string theory being dead is not just some guy's opinion, but a general observation that any unbiased person in the field would infer. String theory was a fad that came and is now out of fashion.
This is exactly what she’s doing. I don’t understand why, but she seems to love controversy and has a very high opinion of herself. She’s clearly not stupid, but it is so tiresome.
This is an extremely well written and well thought out post. Thank you.
Here is my suggestion: how about, for the next LHC-scale physics project, we get all the governments in the world to put in Manhattan Project levels of funding to research machines which can control the climate and the weather, to mitigate the worst impacts of climate change.
This would be a lot more practical and better for science and the future of the human species than confirming what everyone has known with 99%+ confidence for over 30 years. Just my 2 cents.
For most problems in the history of physics, people have been able to use pre-existing math that had been developed, sometimes hundreds of years before. String theory is an exception to this, and the math needed to do what people want doesn't exist yet. This is part of why I'm willing to give people who do string theory research a pass on not having testable predictions yet. They are still developing the tools, which is why there tends to be a lot of collaboration with mathematicians. Even if this line of research doesn't ultimately yield physical insight, it's still moving math forward, and that will ultimately benefit everyone.
Me: Hmm... interesting. Personally I think they should spend time consolidating the different mathematical techniques.
> Maybe all these string theorists have been wasting tax-money for decades, alright, but in the large scheme of things it’s not all that much money. I grant you that much. Theorists are not expensive.
> String theory, supersymmetry, multiverses. There’s math for it, alright. Pretty math, even. But that doesn’t mean this math describes reality.
I don't really understand this part. I am not convinced that her post says much [1], because consider this: she is encouraging observation at the expense of speculation and then arguing that observational techniques are more expensive and hence prohibitive in the long run?
Is the real "unhappiness" in academia not just as simple as information overflow and the fact that only a few can be famous, whether there are a few or whether there are many? I don't know if that is what people get at with these kinds of blog posts. In any case, what I have found in mathematics is that the names of people whose work are worth reading are not necessarily the ones that you'll encounter first. OP does actually hint at this as well, but again, is it not just information overflow (and shouldn't be made more complicated than that)?
[1] The one good takeaway that I see is "focus on inconsistencies".