Saturday, August 14, 2010

On Dark Matter. I: What & Why?

This post is a distillation of some e-mail discussions I have had on this topic.

Some (but not all) young-earth creationists (YECs) deny the existence of Dark Matter because its role is to keep galaxies and clusters of galaxies gravitationally bound over cosmological timescales of billions of years.  Since YECs require a young universe, less than 10,000 years old, such long timescales never arise and so, in their view, neither does the need for Dark Matter.  Their explanation is that the structures we see were created in their present form by a deity and have not had time to undergo any detectable change.

Electric Universe (EU) supporters deny the existence of Dark Matter on the grounds that galaxies are powered by giant Birkeland currents, a mechanism they claim explains the rotation curves of galaxies.  These currents remain undetected, despite the fact that WMAP had more than enough sensitivity to detect the synchrotron radiation Dr. Peratt claimed they should emit.
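The rotation-curve observations both camps are responding to are easy to sketch numerically.  Below is a toy Python calculation (the mass and radii are illustrative round numbers, not a fitted galaxy model, and the function name is my own): if a galaxy's visible mass were all there is, orbital speeds beyond the luminous disk should fall off as 1/sqrt(r).

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
KPC = 3.086e19       # kiloparsec in meters

def keplerian_speed(r_kpc, m_enclosed_msun):
    """Circular orbital speed (km/s) if all mass lies inside radius r."""
    r = r_kpc * KPC
    m = m_enclosed_msun * M_SUN
    return math.sqrt(G * m / r) / 1e3

# Toy numbers: ~1e11 solar masses of visible matter concentrated in the
# inner galaxy.  If that were all the mass present, speeds at larger
# radii should decline as 1/sqrt(r):
for r in (5, 10, 20, 40):
    print(r, "kpc:", round(keplerian_speed(r, 1e11), 1), "km/s")
```

Measured rotation curves of spiral galaxies instead stay roughly flat far beyond the visible disk, implying that the enclosed mass keeps growing with radius — one of the classic arguments for additional, unseen matter.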

Some popular-level treatments of Dark Matter:
365 Days of Astronomy Podcast, Dark Matter: Not Like the Luminiferous Ether, by Rob Knop
Dark Matter: A Primer

What is “Dark Matter”?
“Dark Matter” is a generic term for matter whose precise nature we do not yet know.  Once it is identified and detected directly, it will certainly be renamed.  Its most general description is matter which can be detected by its gravitational influence but not (as yet) by more direct means such as emitted light.

Over the years, its observational definition has changed as refined instruments made it possible to identify some non-luminous or low-luminosity components of dark matter with known objects and processes.

- MACHOs: non-luminous stellar-scale objects detected via gravitational microlensing as part of the MACHO project

- Ionized hydrogen: free protons (positive hydrogen ions, sometimes called HII by astronomers) have no line spectrum of their own.  However, because ionizing hydrogen contributes an equal number of free electrons to the intergalactic medium (IGM), it can alter the ionization balance of other elements which have spectra we can detect.  This relationship allows us to infer the amount of ionized hydrogen in the IGM.

- Neutrinos: For a number of years, neutrinos with mass were regarded as the prime candidate for dark matter.  As solar neutrino and ground-based experiments on neutrino oscillations placed smaller and tighter limits on the mass and other characteristics of the neutrino, it was eventually realized that neutrinos could account for only part of the non-baryonic Dark Matter problem.

Dark Matter hasn't been demonstrated in the laboratory, so why believe it exists?

Many things were 'known' before they could be clearly demonstrated in the laboratory.  In many cases it was possible to devise indirect tests that narrowed down the details, and this information was then used to refine techniques for direct detection.  Not all of these problems were in distant space, either.
  • From about 1920 to 1932, atomic physicists could not explain why most atoms were about twice as massive as the protons they contained.  They knew there was something that made up for the mass difference, and the primary speculation was some type of tightly bound proton-electron configuration, but those models did not produce good results.  The answer would await the discovery of the neutron in 1932, a particle which does not interact via the electromagnetic force.  I have yet to find any papers predicting the existence of a neutral particle with a mass approximately that of the proton.
  • From 1933 to 1956, nuclear physicists had great success calculating nuclear reaction rates using a hypothetical particle they called the neutrino.  The neutrino salvaged conservation of energy and explained why electrons emitted in beta decay did not have a fixed energy (characteristic of a 2-body decay process) but exhibited a range of energies up to the maximum allowed by energy conservation, characteristic of a many-body decay process.  The neutrino would not be detected directly until 1956.  The neutrino does not interact electromagnetically or via the strong nuclear force.
  • The 1/r^2 force law of Newtonian gravity was not demonstrated at laboratory scales until the 1990s.  The real precision in defining the Newtonian gravitational force was established primarily through observations and precise measurements of planetary motion done years before we could actually travel in space.  If U.S. science had required strict laboratory demonstration of Newtonian gravity before launching its first ballistic missiles or orbiting satellites, the Soviet Union would have kept its lead in spaceflight.
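The beta-decay argument above can be made concrete with a little relativistic kinematics.  In a genuine two-body decay A → B + e, conservation of energy and momentum fix the electron's energy at a single value.  The sketch below (units of MeV with c = 1; the function name is mine) applies this to free-neutron decay: the single allowed energy it predicts matches the observed *maximum* of the continuous beta spectrum, which is exactly what the electrons would always carry if no neutrino took away the balance.

```python
# Units: MeV, with c = 1.
M_E = 0.511  # electron rest mass

def two_body_electron_energy(m_parent, m_daughter):
    """Total electron energy for A -> B + e; fixed uniquely by kinematics."""
    return (m_parent**2 + M_E**2 - m_daughter**2) / (2.0 * m_parent)

# Free-neutron beta decay, n -> p + e (+ antineutrino, in reality):
m_n, m_p = 939.565, 938.272
e_fixed = two_body_electron_energy(m_n, m_p)
# Kinetic endpoint ~0.78 MeV: the top of the measured spectrum.
print(round(e_fixed - M_E, 3))
```

Observed beta electrons span a continuum from zero kinetic energy up to this endpoint; the "missing" energy below the endpoint is what Pauli's neutrino was invented to carry.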
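The point about planetary motion pinning down the Newtonian force law can be illustrated with Kepler's third law, a direct consequence of the 1/r^2 form: T^2/a^3 is the same constant for every planet.  A quick check against rounded values from standard orbital tables:

```python
# Semi-major axis (AU) and orbital period (years) for four planets.
planets = {
    "Mercury": (0.387, 0.241),
    "Earth":   (1.000, 1.000),
    "Jupiter": (5.203, 11.862),
    "Neptune": (30.07, 164.8),
}

# For a 1/r^2 force law, T^2 / a^3 should be identical for all of them
# (exactly 1 in these units):
for name, (a, t) in planets.items():
    print(name, round(t**2 / a**3, 3))
```

Any other exponent in the force law would spread these ratios apart; the planets agree to better than one percent, which is how the inverse-square law was nailed down centuries before tabletop tests.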
 In addition, astronomy has a rather successful history of detecting things first by their gravitational influence and confirming the objects later as detection technology improved.  Consider these examples from the history of astronomy:
  • The planet Neptune could be considered the first example of 'dark matter', detected gravitationally before it was seen optically.  We didn't know that the planet had to exist; we only observed discrepancies in the orbit of Uranus and inferred the existence of a planet based on the understanding of gravity at the time.  Alternatives, such as an extra term in Newton's gravitational force law, were examined as well.
  • Perturbations in the motions of the stars Sirius and Procyon, detected in 1844, were due to white dwarf stars too faint to be seen by telescopes of the day.  It took 50 years for telescopes to improve to the point where these small, faint stars could be detected close to a bright primary star.  For 50 years, these stars were 'dark matter'.  We would later determine that these white dwarf stars hinted at another state of matter, existing at densities too high to be produced in current laboratories.
  • Perturbations in the spectral lines of distant stars have been used since 1995 to detect extrasolar planets.  These perturbations are due to the gravitational influence of the orbiting planet on its parent star.  Only recently have some of these planets been imaged directly.
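The size of the stellar 'wobble' behind those spectral-line perturbations follows from momentum balance about the star–planet barycenter.  A minimal sketch (circular, edge-on orbit assumed; the function name is mine):

```python
import math

G = 6.674e-11        # m^3 kg^-1 s^-2
M_SUN = 1.989e30     # kg
M_JUP = 1.898e27     # kg
AU = 1.496e11        # m

def reflex_amplitude(m_planet, m_star, a):
    """Star's velocity semi-amplitude (m/s) for a circular, edge-on orbit."""
    v_planet = math.sqrt(G * m_star / a)   # planet's orbital speed
    return (m_planet / m_star) * v_planet  # momentum balance about barycenter

# Jupiter tugging on the Sun, seen from afar:
print(round(reflex_amplitude(M_JUP, M_SUN, 5.2 * AU), 1))
```

A Jupiter-mass planet at Jupiter's distance moves its star at only about 12–13 m/s, which is why this detection method had to wait for spectrographs capable of measuring meter-per-second Doppler shifts.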
Just as in these historical examples, we know there are limits in our ability to detect some processes and particles.  If a problem can be solved by known processes that operate below our current detection threshold, then these are reasonable lines of research to pursue (dark matter, the proton-proton reaction).  However, if the suggested solution requires a process that is well within the detection threshold of current technology, that is most likely a dead end for research (see “Testing Science at the Leading Edge”).

To be continued...
Minor typo fixed.  Thanks to the commenter who caught it.

1 comment:

W.T."Tom" Bridgman said...

Catching up on the material captured on the DVR today, I finally saw the Dark Matter segment of "Through the Wormhole With Morgan Freeman".

A nice touch in the episode is that it shows a number of data-based visualizations of supercomputer simulations from the scientists. One of my favorites is the simulation of galaxy formation using just visible matter, then observing what happens as you add larger amounts of an additional particle that interacts only gravitationally (dark matter). This is a good example of how scientists test new ideas through computer simulations.
