Ron Maimon answers about physics and math on Quora (part 2)
The general advice is simply to read the original literature, the research papers of the 20th century, not just rely on the secondary literature. You should read the secondary literature too, but as an exegesis, to make the primary literature more accessible when you don't have a lot of experience. There is no substitute for reading the masters. Everything else I say follows, although I am ashamed to admit that I don't listen to my own advice enough.
One insight you will get from reading the primary literature is that string theory is not bullshit, and it isn't something that people believe out of deference to authority, but for very good reasons: originally the S-matrix philosophy, and later the holographic principle. That doesn't mean it is certainly right, you need experimental evidence to come to that conclusion, and we don't have this evidence today, but it does mean that string theory is a huge step forward in physics, and it is possibly the largest single step toward a theory of everything that has ever been made.
Every other path to quantum gravity is somewhat bullshitty. That doesn't mean you can ignore all the non-string quantum gravity literature, the loop stuff has some interesting and perhaps relevant mathematics, but it means that only the string theory path gives you a good theory at the end of the day. The loop stuff has problems with reproducing the black hole entropy law.
But it's very hard to learn string theory, because it's ultimately a strange Italian theory, with enormous contributions from Berkeley, Syracuse NY, and various smaller centers, but it owes little to the main centers of physics, Moscow and Princeton, at least not until the 1980s. In the 1980s, Polyakov and Witten took it up, moving it into the big time, so that by the 1990s, every university had research on the subject.
String theory is hard to learn because it is so much a grass-roots theory. The original literature is Berkeley's bootstrap program, and reading Gribov's "The Theory of Complex Angular Momentum", together with Tullio Regge's articles on Regge theory, and various not-so-well-known 1960s phenomenological literature on Regge theory, is the only way to gain intuition for the theory. Feynman's book "Photon Hadron Interactions" also has some insights here, but this is more for the QCD light-cone stuff that was done in the 1970s by Gribov, in the 1980s by Kenneth Wilson, and later taken up by Rajeev. This stuff is fascinating, but outside the main line of development.
The main line is through the work of Veneziano, Mandelstam, Schwarz, Scherk, and all their coworkers in the 1970s, who developed the formalism and showed it describes gravity consistently. The literature is daunting because without reading the original literature, you can't understand any of it. So I would advise the young person to read the original literature on strings in the 1970s, until he or she is thoroughly familiar with it, and then to move on to the 1980s and 1990s. To do this, it helps to read a modern string theory textbook first, and Polchinski's is without peer. Even so, Polchinski's book is not enough, you need to read the original literature to really get it.
The other piece of advice is simply to keep an open scientific mind, and to remember that experiments are important. So reading condensed matter literature will keep you grounded, and reading about cold fusion will let you see how terribly political physics can become, and how easy it is to dismiss solid experimental evidence in a bad political climate. This happened to string theory too--- it was politically out from 1974 to 1984. It was only kept alive through the effort of Gell-Mann and a handful of other supporters, and this type of political effort is important, and needs to be lauded too.
I agree with the other answers here, especially on the importance of learning to program and do computations. I wanted to give a perspective on some other things that I think are important, but more controversial.
String theory is the culmination of a particularly radical program of physics which has its roots in the period 1938-1941, when Wheeler formulated the concept of the S-matrix, or scattering matrix, and Heisenberg, very taken with this concept, proposed that it is the fundamental quantity underlying all relativistic physics.
Wheeler's S-matrix is a quantity that tells you how incoming particles are turned into outgoing particles. The incoming free particles are energy eigenstates, meaning they are enormously long plane waves with definite energy, and after scattering, they turn into a superposition of other plane waves. There are annoying intricacies in taking the limit that defines the S-matrix, because two infinite plane waves never scatter (the particles are spread out over all space, and so never find each other). The S-matrix is defined as the limit of the scattering amplitude density per unit momentum on the mass shell, per appropriately scaled unit area of the incoming plane waves.
The mathematical intricacies are not so important, the S-matrix is a definition of how particles coming in turn into particles going out. The basic idea Heisenberg had was that Wheeler's S-matrix doesn't require following the details of what's going on in between the input and output, it can describe the whole process without knowing what is going on in the middle. By employing logical positivism, Heisenberg became convinced that the S-matrix was sufficient to reconstruct the whole theory, so that only scattering was necessary to know what was going on in any situation. He then proposed that one should formulate rules for the S-matrix directly, without using quantum field theory to find a series for it. All this was in 1941, in Nazi Germany, and this means nobody paid attention, because everyone else had fled.
Heisenberg proposed that one should use the principle of unitarity to reconstruct the S-matrix from some postulates. The idea here is that unitarity is the statement that SS*=1, and this condition relates higher orders of scattering to lower orders. Unitarity is a restrictive non-linear condition, and Heisenberg hoped that there would be a unique finite unitary theory, but he had no idea how to formulate it. The reason Heisenberg was interested in this is because, unlike the electron, the proton was discovered to be a big blob in space; it wasn't described well by Dirac theory. Its magnetic moment was nearly three times bigger than what it should have been for a Dirac particle, and its charge radius was about a femtometer; it wasn't pointlike like the electron.
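To see concretely how unitarity relates higher orders to lower ones, write S = 1 + iT; the condition SS† = 1 is then algebraically equivalent to T - T† = iTT†, whose left side is first order in the interaction while the right side is second order. A minimal numerical sketch (with a randomly generated toy S-matrix, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random unitary S-matrix as S = exp(iH) for Hermitian H,
# a stand-in for the finite-dimensional S-matrix of some scattering problem.
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = (A + A.conj().T) / 2                      # Hermitian generator
w, V = np.linalg.eigh(H)
S = V @ np.diag(np.exp(1j * w)) @ V.conj().T  # unitary by construction

I = np.eye(4)
assert np.allclose(S @ S.conj().T, I)         # unitarity: S S-dagger = 1

# Write S = 1 + iT; unitarity becomes the nonlinear constraint
#   T - T-dagger = i T T-dagger
# relating the absorptive (first-order) part of T to a product of T's.
T = -1j * (S - I)
lhs = T - T.conj().T
rhs = 1j * (T @ T.conj().T)
print(np.max(np.abs(lhs - rhs)))  # essentially zero, up to rounding
```

This is the structure Heisenberg hoped to exploit: the nonlinear condition ties the different orders of scattering together.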
Non-pointlike particles are a problem in relativity, because you need to have consistent communication between the parts of the particle. The idea of space-time points in positivism requires local probes, elementary fields which represent localizable particles. If your particles are blobs, space and time might not be reliable notions. But if you use a unitary S-matrix, you are only referring to asymptotic things--- free cold particles in plane waves coming in and going out--- so you aren't making any assumptions about space-time; whatever space-time is doing at short distances, the S-matrix is stable to these phenomena, since it is describing the relation between asymptotic things.
Wheeler also emphasized the S-matrix (naturally, he discovered the thing, one of the first major natively American discoveries), and he was interested in reconstructing theories of particle interactions from S-matrix alone, without a detailed space-time picture of fields. When Feynman became his student, they made an acausal formulation of classical electrodynamics, and he had Feynman work on the S-matrix for quantum electrodynamics from this classical foundation, and Feynman never learned or used local fields. He constructed the perturbation theory for quantum electrodynamics from pure S-matrix particle considerations, and, in heroic inspirational work, he derived consistent and correct Feynman rules from free-particle propagators, primitive interaction vertices (determined from the classical limit and minimal coupling), and the restriction of unitarity on higher orders, which determines the way loops have to work. His intuition was from the particle path-integral, which he formulated in order to tackle this problem. The results gave consistent scattering formulas, but they didn't mention any local fields, so Feynman thought he had an amazing new kind of physical theory.
Not quite. Feynman got a rude shock--- other people, like Schwinger, had derived the exact same rules from local field theory! They didn't use the S-matrix, and they got the exact same propagators and vertices, with no herculean efforts. Feynman had to work ten times harder, and yet the result was equivalent. Dyson showed how to derive Feynman's diagram series from field theory, as did Feynman himself in the early 1950s, and Schwinger too, each in their own way. Candlin completed the thing by showing how to do path integrals for local fields.
This experience soured Feynman on Wheeler's S-matrix; he gave up the idea that this was something radical and new, and became a field theorist. Feynman was one of the critics of string theory when it became prominent, probably because he had already been burned once by S-matrix. He heckled proto-string-theory in the 1960s, and his opposition was possibly a reason for the marginalization of the ideas in the 1970s (along with some mistakes made by S-matrixers in the 1960s--- I'll get to those).
Aside from Wheeler, who came up with it, the S-matrix idea was ignored postwar until around 1956, when Murray Gell-Mann, Stanley Mandelstam, Tullio Regge, Vladimir Gribov, and Lev Landau started to get interested, really under the influence of Feynman's magic-looking derivation of the Feynman rules. So in this case too it is Wheeler's ghost at work; once Feynman got away from Wheeler, the S-matrix went out the window.
Anyway, the main result from this era was Tullio Regge's discovery that particles come in families which have to be scattered together, with the combined scattering of the whole family reconstructing the true scattering, which is softer (meaning less divergent at high energies) than the scattering of the particles individually.
Mandelstam and Gell-Mann were studying dispersion relations, integral laws which determine the scattering from the singularities of the amplitude. Landau discovered the correct physical interpretation of these singularities (from thinking about Feynman diagrams), they are places where you have just the right kind of energy in a subset of the incoming particles to produce a physical particle of another type. The dispersion relations allowed you to compute the amplitude from experimental data on physical scattering, and you would never have to work with a field theory! You could reconstruct the S-matrix from some simple considerations, and experiment.
Mandelstam realized that Regge's idea of families of particles with different angular momentum has a more physical interpretation in relativity, where the asymptotic scattering at high energy is related to Regge's prediction for the unphysical scattering at values of "cosine theta" much bigger than 1. These predictions were mathematical curiosities until Mandelstam's interpretation came along; now they turned into experimental predictions: knowing the Regge trajectory function (the rate of increase of mass-squared with angular momentum), you could predict the rate at which the scattering amplitude fell off at high energies at any fixed "t" (the squared momentum transfer). These relations were all S-matrix, meaning you didn't need a Lagrangian.
At the same time, Froissart proved the Froissart bound within S-matrix theory, showing that there is a strict bound on the amount of scattering you can have in a theory with a mass gap: the total cross section can't grow faster than the square of the logarithm of the energy. There were many other more minor results in this era, relating S-matrix quantities to physical observables.
This is where Geoffrey Chew comes in. He was a phenomenological guy, not like the big-shot theorists, and he at some point realizes that the strongly interacting particles, the proton, the pions, the Kaons, are all lying on these Regge trajectories. He says that this means that they are not fundamental, and further, he says that the correct way to describe them is using the dispersion relations of Gell-Mann and Mandelstam, without postulating that there is a quantum field theory underneath. He calls this "nuclear democracy", meaning none of the strongly interacting particles are fundamental, they are all composite, and further, they don't have constituents, they are made up of each other in a self-consistent way.
Chew and Frautschi showed that the basic law of the strong interactions is that the particles lie on straight-line Regge trajectories (meaning the mass-squared is a linear function of the spin, plus an offset) and the slope is the same for all the mesons. Simultaneously, Gribov formulated the Pomeron trajectory, to explain why cross sections in the strong interaction were maximal--- they saturate the Froissart bound. (Actually, in experimental data, the cross sections grow as a small power of the energy up to the highest energies measured, meaning that they more than saturate the bound, they violate it! This behavior can't go on forever, the scattering has to fall back to the bounded growth, and this is called "Pomeron unitarization" in the literature. The mechanism of Pomeron unitarization is not understood, nor is it heavily studied, for reasons that will become clear soon.)
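As a concrete illustration of the straight-line rule (with rough numbers I am assuming, not taken from the text): for the leading meson trajectory, alpha(m^2) = alpha(0) + alpha' * m^2, with intercept around 0.5 and universal slope around 0.9 GeV^-2, so the spin-J member has mass m_J = sqrt((J - alpha(0)) / alpha'):

```python
from math import sqrt

# Assumed straight-line Regge trajectory: alpha(m^2) = a0 + ap * m^2
a0 = 0.5   # assumed intercept of the leading (rho) trajectory
ap = 0.9   # assumed universal slope, in GeV^-2

def mass(J):
    """Mass in GeV of the spin-J member of the straight-line trajectory."""
    return sqrt((J - a0) / ap)

for J in (1, 2, 3):
    print(f"J = {J}: m = {mass(J):.2f} GeV")
# With these assumed numbers, J=1 lands near 0.75 GeV and J=3 near 1.67 GeV,
# close to the observed rho(770) and rho_3(1690) mesons.
```

The striking empirical fact Chew and Frautschi emphasized is that one slope works for the whole meson spectrum.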
Chew went on to develop methods of extracting S-matrix predictions from a few particle interactions and experiment, while Mandelstam continued to press on with the idea of a fundamental theory using only dispersion relations and S-matrix. Feynman thought that the theory should be a field theory, Gell-Mann wasn't sure, and hedged his bets. In the 1960s, people were heavily split, with half the community working on S-matrix and hard mathematical stuff related to dispersion relations, and the other half secretly working on field theory, and nobody knew whether the strong interactions were a field theory or an S-matrix thing.
Then 1968 brought the major triumph for the S-matrix folks. Dolen, Horn, and Schmid had shown in 1967 that scattering in the strong interaction had a strange property--- normally when you exchange particles, you have a broad background, with peaks on top of this background at places where you have particle exchange. But DHS showed that where you have a peak, the background is depressed, as if the background were a sum of broad peaks! This means that the particles you are exchanging that give you peaks (s-channel exchange in Mandelstam jargon) are really responsible for the background (t-channel exchange). In quantum field theory, the two are completely separate things.
So people pondered what this meant--- they drew "fishnet" Feynman diagrams. In 1968, without knowing what it meant, Veneziano proposed a scattering amplitude that had the Dolen-Horn-Schmid property. This property is so ridiculously restrictive that there were essentially only two solutions (modulo some assumptions, like straight-line trajectories with parallel slopes), Veneziano's and a later amplitude by Shapiro.
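The Veneziano formula itself is simple to write down: A(s,t) = Gamma(-alpha(s)) Gamma(-alpha(t)) / Gamma(-alpha(s)-alpha(t)), with a linear trajectory alpha(x) = alpha(0) + alpha' x. A small sketch (trajectory parameters are my own assumed illustration values) showing its s-t symmetry and the poles whenever alpha(s) hits a non-negative integer--- the infinite tower of s-channel resonances that simultaneously builds the t-channel background:

```python
from math import gamma

# Assumed linear trajectory, for illustration only
def alpha(x, a0=0.5, ap=0.9):
    return a0 + ap * x

def veneziano(s, t):
    """Veneziano four-point amplitude Gamma(-a(s))Gamma(-a(t))/Gamma(-a(s)-a(t))."""
    return gamma(-alpha(s)) * gamma(-alpha(t)) / gamma(-alpha(s) - alpha(t))

# Crossing symmetry: the amplitude is symmetric under s <-> t
print(veneziano(-1.0, -1.5), veneziano(-1.5, -1.0))

# Poles: as alpha(s) -> 0 (i.e. s -> -a0/ap), Gamma(-alpha(s)) blows up,
# signalling a resonance; the same tower of poles, read in the t variable,
# makes up the smooth background --- the Dolen-Horn-Schmid property.
s_pole = -0.5 / 0.9
print(abs(veneziano(s_pole + 1e-6, -1.5)))  # very large: near the first resonance
```

The restrictiveness is that a single ratio of Gamma functions has to carry all the resonances in both channels at once.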
These results were wind in the sails of S-matrix theory. People were confident that there would be a theory, that it would be unique, and that it would solve the problem of the strong interactions. This meant that most physicists were working on S-matrix from 1968-1974, and field theory was marginalized. The S-matrix people were saying stupid things, like that the perturbative infinities of field theory meant it was inconsistent, and that there would be one unique S-matrix consistent with relativity, things like that.
During this time, people like Feynman and Bjorken were still trying to describe the strong interactions with field theory, that is, with point particle constituents. Experimental data from electron-proton scattering showed that there were charged points inside the proton, and this meant quantum field theory, not S-matrix theory (which predicts soft scattering from a diffuse blob). But nobody could figure out how the points were stuck inside the proton, so that we don't see free quarks or gluons. Also, Gell-Mann was dithering, because maybe the quarks are points, and the glue an S-matrix blob.
Feynman's 1972 book "Photon Hadron Interactions" demonstrates that if quantum electrodynamics is a field theory at the proton scale (something well supported by experiment by then), then the things in the proton that are charged should also be described by a locally commuting field theory. This was a strong argument for field theory, rather than S-matrix theory.
Schwinger had given a toy field theory model with this property in the mid 1960s--- the Schwinger model, 1+1-dimensional electrodynamics. He showed that in this model, the electrons and positrons form mesons and are permanently confined, because the electric field doesn't die away with distance. Nambu had postulated that the vacuum of the strong interaction was like a superconducting pair-condensate of fermions, and this model was successful in predicting the interactions of pions, as shown by Weinberg. Weinberg was also becoming skeptical of S-matrix theory, because he was able to show that Chew's predictions for pion scattering could be derived more simply from effective field theory. The finite-number-of-particles form of S-matrix theory was turning into field theory in another form; people were getting burned the same way Feynman got burned.
But unlike Feynman's quantum electrodynamics S-matrix, or the S-matrix of pion-pion models which turned into the effective field theories of Weinberg, Veneziano's theory was clearly not turning into a field theory--- the scattering was always soft, things were completely composed of Regge trajectories, there was no notion of quantum field, in fact, there was no notion of space and time. The theory was clearly new and different from field theory, and it required infinitely many particles to be consistent. It was also very hard to make work, it demanded all sorts of things that nobody ordered.
In the early 1970s, there was tremendous progress on what this theory was, and as the theory became fleshed out, it looked less and less correct for the strong interactions. Nambu proposed that the thing described by Veneziano's theory is a string. Susskind also proposed this, and understood how the string's modes reproduce Veneziano's amplitudes, as did Nielsen, from fishnet diagrams (a good picture) and an analogy with vortex lines (not 100% accurate, but whatever).
By 1974, Lovelace had shown the Veneziano theory needs to live in 26 dimensions, Ramond had incorporated fermions and shown the theory needs supersymmetry on the world sheet (with the critical dimension shrinking to 10), Scherk had shown the theory includes electrodynamics and Yang-Mills theory in low-energy limits, and Yoneya had shown that string theory includes gravity (work which was reproduced and extended in a groundbreaking reinterpretation by Scherk and Schwarz). String theory was also predicting soft scattering at large angles, which conflicted with the experimental data from Bjorken scattering, showing partons, little points. The more it was fiddled with, the less it looked like experimental data, and because it was a self-consistent S-matrix, you couldn't add stuff to fix the contradiction with data; it was determined by self-consistency.
Then in 1974, when the charm quark was discovered, the whole field realized that the correct theory of the strong interactions was SU(3) gauge theory, with Nambu's color idea, and Gell-Mann and Zweig's quarks being the point particles. Field theory won, and S-matrix theory, including string theory, was thrown out as wrong garbage, and a lot of people lost reputations and jobs.
The result was a complete counter-revolution in physics. S-matrix theory was mathematically and physically demanding, the stuff was incredibly difficult to understand; in comparison, field theory is kind of trivial (no offense to field theorists). It was easy for field theorists to think that the S-matrix people were engaged in horseshit, publishing garbage that didn't make any sense, and making up stuff by groupthink and consensus thinking, without any mathematically consistent thing underneath. This was especially true once field theory was shown to be correct for the strong interactions and all the motivation dropped out of the S-matrix program. I personally read a lot of the 1960s literature in the late 1980s and early 1990s, and I couldn't understand how all these people could be chasing after such obvious bunk.
It is very hard to build intuition for string theory, because it is a scattering theory, so it doesn't tell a story in space-time (this is improved by Mandelstam's 1974 light-cone formulation and Kaku and Kikkawa's string field theory, but you only get a picture in light-cone coordinates, and the picture is not really local in space-time when you consider the coordinate perpendicular to the light front).
The counterrevolution was a terrible thing, although a lot of good physics was done. It was essentially a conservative thing, like the politically conservative Reagan movement, or the dismissal of progressive rock in favor of simple commercial rock, or the rejection of Marxism in favor of older ideas. These things were necessary, there was a lot of bunk in communism, progressive rock, and S-matrix theory, and this bunk needed to be purged, but the manner in which these things were purged threw out legitimate stuff along with the overreaching nonsense, and caused a lot of good people a lot of pain.
Anyway, not everyone gave up on string theory. Scherk and Schwarz understood that this was really a fully consistent S-matrix including gravity, and that it was probably uniquely determined, so it would be a theory of everything. The 1976 work of Gliozzi, Scherk, and Olive showed that string theory was supersymmetric in space-time, and the construction of supergravity explained what string theory was predicting to alter General Relativity. These supersymmetry things were very fruitful to study, even within field theory, but string theory remained out.
In the 1980s, there was a new young superstar, Edward Witten, who was a mathematics powerhouse with stunning physical intuition. He was following string theory, as were all the young people, and he was never sure if it was bunk or not. But he was very good with General Relativity, and he discovered a bunch of annoying things for traditional approaches to quantum gravity:
* Kaluza-Klein theory is unstable: this was a disaster, the space-time falls apart semiclassically, due to a weird instanton you would never guess in a million years, and you would never see this instability in perturbation theory. You need to stabilize the vacuum.
* Gravitational anomalies: you can't introduce chiral matter in gravity theories arbitrarily, there are insanely stringent consistency conditions on chiral stuff, and nearly all field theories of gravity are inconsistent.
Further, it was clear that the path-integral for gravity was no good, the sum was over topologies, and included parts that diverge in ways that can't be fixed by going to imaginary time.
Also, Hawking had made progress in quantum gravity, the first real progress, by showing that black holes are thermal. This meant that you needed to formulate the theory somewhat differently. There couldn't be any global conservation laws (you can't have baryon number conservation, because you can make a black hole out of neutrons and let it decay to gravitons and photons). The theory had to have an infrared-ultraviolet link, because high energies produce big black holes, not small localized collisions.
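The infrared-ultraviolet link is easy to see numerically: the Schwarzschild radius grows with the mass-energy you pour in, while the Hawking temperature falls as 1/M, so higher-energy collisions make bigger, colder objects rather than probing shorter distances. A quick sketch using the standard formulas r_s = 2GM/c^2 and T_H = hbar c^3 / (8 pi G M k_B):

```python
from math import pi

# Physical constants (SI)
G    = 6.674e-11    # gravitational constant
c    = 2.998e8      # speed of light
hbar = 1.055e-34    # reduced Planck constant
kB   = 1.381e-23    # Boltzmann constant

def schwarzschild_radius(M):
    """Horizon radius r_s = 2GM/c^2 of a mass-M black hole, in meters."""
    return 2 * G * M / c**2

def hawking_temperature(M):
    """Hawking temperature T_H = hbar c^3 / (8 pi G M kB), in kelvin."""
    return hbar * c**3 / (8 * pi * G * M * kB)

M_sun = 1.989e30  # kg
for M in (M_sun, 10 * M_sun):
    print(f"M = {M:.2e} kg: r_s = {schwarzschild_radius(M):.0f} m, "
          f"T_H = {hawking_temperature(M):.2e} K")
# The heavier hole is bigger but colder: pumping in more energy makes the
# object larger (infrared), not more pointlike (ultraviolet).
```

A solar-mass hole comes out around 3 km across and a few times 10^-8 K, far colder than the microwave background.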
Now string theory was shown to solve all these problems. It was soft at high energies, and it was shown to have ultraviolet-infrared duality, and also T-duality by Schwarz and collaborators like Green. String theory makes every global symmetry a gauge symmetry, something which was known since the early days, from Scherk's work. So it was consistent with post-Hawking expectations, in a way no field theory could be.
Further the supersymmetry in string theory showed that there is no process which would destroy a supersymmetric Kaluza Klein vacuum, so Witten's instability was also fixed.
Then in 1984, Michael Green and John Schwarz showed that the gravity theories which come out of string theory, in those cases where they have chiral fermions, are magically just the ones that cancel all the anomalies. This was the clincher for Witten--- there is absolutely no reason an inconsistent theory would produce anomaly-free low-energy limits, especially since the cancellation was magical, relying on a conspiracy between certain bosonic fields and the chiral fermions. This kind of thing absolutely demanded that string theory make sense mathematically.
Further, the anomaly cancellation mechanism suggested there should be an E8xE8 string theory, which was duly found in 1985 by Gross, Harvey, Martinec, and Rohm. The heterotic string was sort of "het" (different) and "erotic" (sexy) because it could immediately produce realistic physics with gravity.
The main problem with string theory was that, because it was constructed as a self-consistent theory, you couldn't be sure it was the right theory: there was no data to support it specifically, and there was no physical principle from which to derive it.
In the 1990s, Susskind, following 't Hooft's prescient analysis of Hawking's information-loss argument, formulated the string-theoretic holographic principle. The principle Susskind gave explained why string theory had to look the way it looks, and explained what the strings are: they are little extremally charged black holes. The black hole oscillations have to describe all the matter that can fall in, and further, any one black hole can oscillate to reproduce any other, because anything can fall into a black hole.
So in the 1990s, string theory was explained in a deep sense, through the holographic principle: it's the theory of black holes with just enough charge to be extremal. Then their shaking tells you how to reproduce the behavior of stuff near the black hole, and any one black hole can be made a constituent for any other, in the sense that the other black hole (if it is localized, like by closing the sheet into a compact shape) can fall into a big black hole of any other type.
This led to the golden age in the mid 1990s, when string theory was extended to the AdS/CFT correspondence. The results of this era showed that string theory was definitely unique, definitely consistent, and almost certainly the only possibility consistent with the holographic idea, because it is a priori impossible to construct a holographic theory, except that string theory does it.
This evidence is persuasive. Further, string theory now has regimes where it can be calculated to arbitrary accuracy on a computer, in principle, so we know it is well defined, at least on certain backgrounds. This means that we have actually solved the problem of quantum gravity in principle, although we have not solved the problem of quantum gravity in our universe.
The main barrier to string theory is that you can't predict anything at low energies yet, because we don't know our vacuum. This problem will be solved at some point, when an exhaustive search of the vacua is complete (this is not an insurmountable problem--- it's about the same as the classification of finite simple groups in complexity). The more fundamental problem is that the theory doesn't describe finite-area cosmological horizons, like the one surrounding us, so there is still a domain which needs to be understood theoretically.
I am optimistic that the theory will make predictions about black hole emissions in our universe, relatively independently of the high-energy details. The reason is that there are still mysteries in big black hole emissions, in the charged and rotating case, which we definitely know how to calculate in principle in string theory, but we haven't figured out what the general prediction is. String theory is the only way to be sure we understand black hole physics.
This is not a review, and I have told a mostly personal story. Apologies to anyone I neglected, these were just what I thought of at this moment. Wikipedia has a reasonable history in the page on "String Theory" (which I wrote after thinking a little, and a few things were fixed up later).
The S-matrix is an asymptotic operator which describes how particles going into a scattering event transform into particles going out. The S-matrix can be calculated from a Hamiltonian description, but the nice thing about it is that it does not require a Hamiltonian or Lagrangian description of the intermediate details of the scattering at all, it can be built up without regard to the local space-time structure. Because of this, you can use it to construct theories which are insensitive to a breakdown of naive space-time structure. You don't need any knowledge of the local structure of space and time to talk about incoming and outgoing particles, since these are defined at far away locations and far away times, so you know they can be described in the ordinary way, using plane waves.
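A concrete toy example of the idea, from standard one-dimensional quantum mechanics (my own illustrative setup, in units hbar = m = 1): scattering off a delta-function potential of strength lambda is completely summarized by a 2x2 S-matrix built from the transmission and reflection amplitudes t = 1/(1 + i lambda/k) and r = -i(lambda/k) t, and the unitarity of this matrix encodes probability conservation without any reference to what the wavefunction does at the potential itself:

```python
import numpy as np

# Toy example: 1d scattering off a delta potential lambda * delta(x),
# in units hbar = m = 1 (an illustrative setup, not from the text).
def s_matrix(k, lam):
    """2x2 S-matrix at momentum k > 0, in the (left-incoming, right-incoming) basis."""
    t = 1.0 / (1.0 + 1j * lam / k)   # transmission amplitude
    r = -1j * (lam / k) * t          # reflection amplitude
    return np.array([[t, r],
                     [r, t]])        # parity-symmetric potential

S = s_matrix(k=2.0, lam=1.5)

# Unitarity (probability conservation): S S-dagger = 1, stated entirely in
# terms of asymptotic in/out plane waves, never the region near x = 0.
print(np.allclose(S @ S.conj().T, np.eye(2)))  # True
```

The point of the toy model is exactly the point of the text: everything measurable far away is in S, and you could replace the delta function by any short-range mess without changing the framework.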
From the S-matrix idea, you can reconstruct physics, but you need some assumptions. Feynman started with the idea of an electron and a photon, and classical electrodynamics in the classical limit, and found the Feynman rules for QED. Schwinger and Dyson found the same rules from the Hamiltonian description of QED, and it required a renormalization procedure to make sense of the diagrams in both pictures. So Feynman decided S-matrix was equivalent to field theory, and stuck with field theory for the rest of his career.
But others pursued a pure S-matrix theory. Chew and Mandelstam, working with consistency conditions, decided that there was enough information in the S-matrix to reconstruct all of physics. People worked hard throughout the 60s to show how this program would work, and a lot of people accepted this, but a lot of people also stuck to field theory too. At the time, the focus was the strong interaction.
The S-matrix description of pions and nucleons developed by Chew in the early 1960s transmuted into the effective field theory of Nambu and Weinberg in the late 1960s. Weinberg became convinced that the only solution to the S-matrix consistency conditions was a form of field theory, and he was sort of right, under the assumption of finitely many fundamental particles.
But Tullio Regge showed that particles can come in families, and Chew and Mandelstam persisted in looking for a theory of exchange of Regge trajectories. Vladimir Gribov described these Regge processes with a calculus of diagrams, but this calculus ultimately had an interpretation in terms of a two-dimensional light-cone picture developed by Feynman and Gribov, later followed up by Kenneth Wilson, and nowadays developed further by Sarada Rajeev. It still wasn't a new theory.
But in 1968, Veneziano found a formula for an S-matrix approximation (a first-order scattering amplitude) that was clearly completely different from field theory. This was the foundation of string theory, it developed into string theory over the following decades.
In the mid 1990s, the S-matrix picture was understood more completely as the form of the holographic principle appropriate to asymptotically flat space-time. The statement that "everything is in the S-matrix" is then more properly reinterpreted as the statement that the local physics is reconstructed from dynamics on holographic boundaries at the edge of the universe. This became accepted when it was demonstrated to work in AdS/CFT models, and now all this old stuff is water under the bridge. But between 1960 and 1974, there were two camps in physics who hated each other, and did not hire each other or read each other: the S-matrix people and the field theory people.
Both sides made spectacular progress, but the S-matrix folks made more progress and got beat up more for it, so I prefer to laud them more.
If Everett said what you say, it would be in conflict with quantum mechanics, and also silly. Everett doesn't say that the universe splits when the electron goes through two slits, rather that the linear evolution of the wavefunction is something that holds at all scales, so that people end up superposed, just like electrons, just like quantum mechanics predicts.
But people don't ever "feel" superposed. In the Everett interpretation, this is just a property of how people feel. The 'split' in universes is not a split in the physical universe, it is a split in the perceptual memories of a recording device that measures the universe. When a classical computing device measures an outcome, and ends up in a superposition of different computational memory states, you view the two outcomes as "existing" (philosophically) and the appearance of probability is only subjective, from the point of view of the computing device itself, from the inside.
In this view, the collapse in quantum mechanics is a property of perception, not so much of the physics. The physics is simple unitary evolution, all the complicated probabilistic reduction is from selecting a particular path to make a consistent memory for a computing device.
The idea doesn't work if the computing device can recohere the different outcomes back together, to get interference between previously split alternatives, like an electron's wavefunction merging after the slits. But this can only happen if it erases every bit of information it acquired, since interference only happens when two different histories of the whole quantum system, a system that includes the device in this case, reach exactly identical states. This is impossible if there was entropy production, and even in cases of reversible computation, it requires restoring the exact initial state of the reversible computer, so there is no paradox between interference and the subjective Everett history-splitting.
The reason this is not emphasized more is that it makes the memory of computing devices, as a model for human memory, central, and people get annoyed when the consciousness of a person is doing something to make quantum mechanics work. Sorry. That's how collapse works in Everett and in other no-collapse interpretations; there's no way around it. This is why collapse is described not by physics but by a sort of metaphysics of mind, which tells you how memories embed and gain continuity in physical systems. The memories in the computational measuring device are what does the work. These memory robots, as models of the brains of observers, are emphasized in Everett, and proper accounts of this interpretation include them. When you don't include them, it sounds like the nonsense above.
Everett is just taking quantum mechanics seriously as a model for the entire universe. This is useful when considering cosmology, and the fact that he can do it (with only philosophical headaches, no physical paradoxes) means that it is philosophically possible that quantum mechanics is exact. But that doesn't mean that quantum mechanics is exact, just that in this case, the Everett interpretation shows how to reconcile measurement with unitary evolution in a realistic philosophy.
Since the result only involves philosophical readjustment, in the end, it isn't too much different from Copenhagen. The Copenhagen folks thought of collapse in much the same way, except they didn't make it explicit, because their positivism meant it was enough to describe how to predict results of any experiment. You didn't need to give an account of what is "really" going on. Everett just provided this account for this interpretation.
I am sure there are many, it makes no difference what they believe. You need to look at the evidence for dark-matter yourself, independent of any authority.
There are three classical lines of evidence for dark-matter:
Rotation curves: galaxy rotation curves (the speed of stars at different distances from the center) are mapped by Doppler-shift analysis of spectral lines at various distances from the center, a reliable method, and this shows that the rotational speed doesn't obey Kepler's law with the visible matter as the major source of gravity. The conclusion, assuming standard gravity, is that the galaxy is surrounded by a roughly uniform cloud of dark matter.
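To make the Keplerian expectation explicit (a standard Newtonian derivation, sketched here for concreteness): for a star on a circular orbit of radius r, gravity supplies the centripetal acceleration, so

```latex
\frac{v^2}{r} = \frac{G\,M(<r)}{r^2}
\quad\Longrightarrow\quad
v(r) = \sqrt{\frac{G\,M(<r)}{r}}
```

If the visible matter dominated, the enclosed mass M(<r) would be roughly constant outside the luminous disk, giving the Keplerian falloff v proportional to 1/sqrt(r). The observed flat curves, v(r) roughly constant, instead force M(<r) to grow linearly with r, which is a halo whose density falls off only as 1/r^2.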
Zwicky's estimate of galaxy-cluster binding: when you consider the velocities of galaxies in bound clusters, you can figure out how deep the potential well is. The galaxies should be bound, or else it would be a conspiracy that we observe them together right now (they would be flying past by coincidence), and from the velocities, you get a sense of the total mass in the cluster. The result is that there is about 30% of the closure density in dark matter plus ordinary matter, but the ordinary matter is only about 5%.
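Zwicky's argument can be reproduced in order of magnitude with the virial theorem, which for a bound system gives a mass of roughly sigma^2 R / G. A minimal numerical sketch, where the dispersion and radius below are illustrative Coma-cluster-scale numbers of my own choosing, not figures from the original answer:

```python
# Rough virial-theorem estimate of a galaxy cluster's mass,
# in the spirit of Zwicky's argument. Input numbers are illustrative.
G     = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30    # solar mass, kg
MPC   = 3.086e22    # megaparsec, m

sigma = 1.0e6       # velocity dispersion of the galaxies, ~1000 km/s, in m/s
R     = 1.0 * MPC   # cluster radius, ~1 Mpc

# Virial theorem (2T + U = 0) gives M ~ sigma^2 R / G, up to O(1) factors
M = sigma**2 * R / G
print(f"Virial mass ~ {M / M_SUN:.1e} solar masses")
```

The result, a few times 10^14 solar masses, is vastly more mass than the visible galaxies supply, which was Zwicky's original point.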
Cosmological bounds: The mapping of the blackbody radiation fluctuations allows you to quantify the cosmological model, and this reveals that there is a 70/30 split of the universe into cosmological constant and matter. The total amount is the closure density, and this is consistent with Zwicky's dark-matter estimate. Simulations of structure formation in the universe get the large-scale density distribution of galaxies roughly right with the dark-matter content as it is, and no modification of gravity.
These three lines all converged to the same answer, since the rotation curves also showed an amount of dark matter at least 3 times the ordinary matter in a cloud around each galaxy. The coincidence of these estimates is extremely strong evidence, and you can't reject it because the idea sounds fishy. However fishy it sounds, it is supported by the observations, and whether it sounds right to your ears is a problem of your ears, not of the hypothesis.
In response to the galaxy rotation data, there was the proposal that one should modify gravity. The MOND idea is that gravity doesn't work the same at very small accelerations, and it can reproduce the rotation curves by adjusting parameters. It is not consistent with General Relativity, but you can add fields to General Relativity until you get something complicated enough that nobody can follow you anymore. This explanation generally fails on the Zwicky estimates, and it completely fails to account for the cosmological data.
But let's pretend they could fix that up by more parameter adjustment. There is a more direct observation of dark matter nowadays using gravitational lensing. The effect of weak lensing can be used to map out the rough distribution of gravitating matter in various astronomical situations. This process gives a distribution of dark matter which is also consistent with the main discovery methods.
But not in every case. There is the Bullet Cluster, a pair of colliding galaxy clusters shown in this video, where the dark-matter distribution was completely different:
As you can see, simulation of the dark-matter plus ordinary matter reproduces the results.
[go to actual Quora question for embedded video]
This is what people call overwhelming evidence, and it is no longer possible for anyone, reputable or not, to deny dark matter. So it doesn't matter what they say, there is no more conversation necessary, the evidence is good enough to be certain, and any further denials are political, and need to be opposed politically.