I Don’t Believe in Electrons

By Russell Anderson, Director at Transaction Analytics Ltd.

So, I was sitting in a staff meeting in the Department of Evolutionary Biology at the University of California.

The big concern for the week was that the campus was being overrun by Christians.

Stephen Jay Gould had been summoned to give a lecture to denounce Creationism (a bit overkill).

This was a very popular activity at the time — Daniel Dennett and Richard Dawkins were outdoing each other to prove whose atheism was most pure¹.

One of the professors asked the room, “Did you read The Newspaper² today? 35% of Americans don’t believe in evolution!” I couldn’t help myself — I blurted out, “That’s great!” They all stared at me as if I had farted.

I was not making friends, so I explained, “50% of Americans believe in Astrology. We’re winning!” They did not find this the least bit amusing.

So, I suffered in silence for the remainder of my studies there.

Maybe I have it all wrong, but I did not think any question in Science was a matter of “belief.” A scientist is supposed to be a professional sceptic.

 Indeed, probably the most annoying question to ask a scientist is whether they believe in a theory.

Even more disturbing is to ask the general public whether they believe what scientists cannot attest to: “The Truth.”

From childhood, we are all indoctrinated with a fairy-tale view of Science:

Science is unique in human endeavours in that it is cumulative and progressive.

What distinguishes Science from other forms of knowledge is something called “The Scientific Method,” wherein observational facts are expanded by logical deduction, or hypotheses generated by induction are tested and verified.

New scientific theories subsume and replace old theories like matryoshka dolls, increasing scope, applications, and predictive accuracy, ever marching towards the Truth.

This is a myth, and any thinking scientist knows it.

Yet all practicing scientists actively perpetuate the fiction, publishing their work in a sterile, hypothetico-deductive narrative (observations, hypothesis, methods, results, conclusions), regardless of how the ideas were generated or how the work was actually done.

Karl Popper poked huge holes in the standard view (Logical Positivism or Logical Empiricism), persuasively arguing that scientific theories can never be proven, and the best we can hope for is to falsify a scientific theory³.

Platt argued that scientists should apply the “method of multiple hypotheses,” wherein they come up with as many alternative explanations as possible, and set out to test them against each other⁴.

Thomas Kuhn challenged the idea of scientific progress as a steady accumulation of knowledge in his historical analysis of scientific revolutions⁵.

Kuhn distinguishes between phases of ‘normal’ and ‘revolutionary’ science.

Normal science operates within a general theoretical consensus (‘dominant paradigm’), much like the classical view.

Kuhn described scientific progress during normal science as “puzzle solving,” where the community works out the implications of a theory, analogous to deductive reasoning.

When anomalies or inconsistencies arise that cannot be explained by the current theory, scientists don’t necessarily abandon the theory, but often develop ad hoc hypotheses to amend it (e.g., adding epicycles to Ptolemaic orbits, or the Bohr atomic model [Fig 2]).

Figure 2: Ad hoc or Brilliant? LEFT: The Bohr Atomic Model: electrons are allowed to occupy discrete orbits, corresponding to standing waves. RIGHT: The Tychonic model of the solar system: Earth is the center, the Moon and the Sun orbit Earth, and the other planets orbit the Sun. This is mathematically identical to the Copernican system, without some of the weird phenomenology of us not feeling the Earth spinning. Both an astronomer and an alchemist, Brahe was one of the most eccentric people in the history of science, keeping a pet moose, retaining a dwarf psychic, and having an affair with the Queen. Sadly, his moose died when it got drunk and fell down the stairs⁷. Image: Wikimedia Commons

History has not always been kind to discarded theories or ad hoc hypotheses, regarding them as desperate fudges having little or no rational foundation.

Feyerabend celebrated their inventiveness.

Freed of dogma, scientific revolutions are full of creativity.

 Ad hoc theories revive and rejuvenate stagnant fields.

They are inspired and pave the way for radical new ideas and theories.

New or competing theories, on the other hand, may explain the anomaly, but jettison much of the scope of the previous theory.

A good example is the transition from Alchemy to Chemistry (Fig. 3).
Both Kuhn and Feyerabend argued that, with no agreed-upon paradigm (common premises, observations, definitions, scope, values), debate between theories is incommensurable and irrational.

New values, arguments, methods, aesthetics, appeals to popular media, all become fair game (Feyerabend termed this “anything goes”).

Feyerabend advocated methodological anarchism, arguing there is no such thing as the scientific method.⁶

Figure 3: Phlogiston Theory. Alchemists J. J. Becher (1667), Pierre Joseph Macquer, and Georg Ernst Stahl (1731) contributed to early chemistry by beginning a systematic classification of the elements, including phlogiston⁸.

Phlogiston theory offered a general theory combining combustion (e.g., fire) and corrosion (e.g., rust).

The discovery of Oxygen (Joseph Priestley and Antoine Lavoisier 1774) was the beginning of the end of Alchemy.

Arguably, Alchemy was making progress, if you are willing to accept that phlogiston has negative mass.

Image: Hay Exhibits, Brown University [8]

Figure 4: René Descartes’ pneumatic theory of muscle contraction. Theories of muscle contraction have undergone several paradigm shifts over 2000 years.

Galen (129–200 AD) noted muscles act in agonist/antagonist pairs, powered by the ‘animal spirits’ (animation being the distinguishing ability of animals over plants).

René Descartes proposed a hydraulic mechanism (balloon) theory of muscle in 1664⁹.

It was disproved in 1667 by Jan Swammerdam, who demonstrated that muscles do not change volume when contracting.

Galvani (1791) demonstrated electricity can trigger muscle contraction, leading to electromotive theories (inspiring re-animation in Mary Shelley’s Frankenstein).

 Protein folding or protein spring theories were dominant for 200 years.

The current sliding filament theory only became established in 1954.

Each of these theories was profoundly inventive and well-supported.

Each contributed to our knowledge, without being correct.

Image: Wikimedia Commons

An unappreciated aspect of scientific debates is the rhetoric employed, which is often far from rational.

Tycho Brahe lost his nose in a duel as a youth and wore a brass prosthesis.

This gave him a tremendous advantage in debates: if a speaker presented an opposing view, he would casually remove and polish his nose, distracting the audience.

Galileo’s essays were loaded with sarcasm and ridicule and are quite entertaining, even to the modern ear.

He published his papers in vernacular Italian, rather than scholarly Latin, to address a broader audience.¹⁰

Appeals to popular media and celebrities continue into the modern day, for better or worse.

After Popper, Kuhn, and Feyerabend, neopositivism was “dead, or as dead as a philosophical movement ever becomes.”¹¹ Beyond the precept of falsification, there is no consensus amongst scientists over what is Science, what is a theory-independent fact, what methods are valid to develop theories, or what values should be applied when deciding between theories, and rarely is anyone brash enough to claim their theory is “true” in any objective sense.

Open questions are whether scientific theories are at least approximating a true description of reality and whether it is rational to believe in unobservable entities — like our friend, the electron.

  The cautious position of many scientists is to take an agnostic stance towards the truth of a theory, while valuing its fruits.

With all the weirdness of quantum mechanics, most physicists have given up on metaphysics.

Quantum mechanics is accepted because it agrees with experiment; that’s it.

What it means or if it is ‘true’ is avoided in polite conversation.

(“Shut up and calculate.”)

Pragmatist claims of Science are less grandiloquent: a theory’s value is utilitarian — it has predictive value and enables applications.

Theories are ‘just theories,’ instead of facts or laws.

Often, the word ‘model’ is used instead of ‘theory,’ partly to avoid epistemological baggage.

(A model is defined weakly as “a mathematical or statistical representation or approximation of something.”) Indeed, models are viewed as useful analogies or images of reality.

A colourful metaphor often used is borrowed from Pablo Picasso — “Art is the lie that helps us see the truth.”

This worldview is more optimistic than it sounds.

Consider that the alternative, Realist view ultimately leads to Science being completed.

In 1996, Scientific American reporter John Horgan suggested as much in The End of Science, arguing that the time is coming when most of the great questions in Science will have been answered.¹²

Once Physics is unified, Chemistry can be derived, and Biology explained. All that will be left for science is logical deduction of the details, i.e., puzzle solving.

What fun is that? Under the agnostic view, the Truth can never be known, scientific exploration is a much more human activity, and scientific debate is endlessly interesting.

  Using a telescope, Galileo saw mountains on the Moon, spots on the Sun, and moons orbiting Jupiter.

This might be viewed as verifiable science, as one could, in principle, travel to the Moon to look at the mountains.

But what of the microscopic world? At first, there might not seem to be any difference between the telescope and the microscope.

Maybe we can trust the light microscope… to a point.

But if you accept current theory, the microscopic realm is radically different from the macroscopic, and the laws of physics have completely different manifestations.

There is no way to directly experience this world with our senses in the same way as mountains on the Moon.

How much of what we see depends on the theory we have of what we are looking at? What are we not seeing?

Rational Empiricism is also frustrated by complexity.

You would be hard-pressed to find a neuroscientist who knows how the brain works or an immunologist who knows how the immune system works.

We don’t have a clear understanding of how vaccination works or why the immune system doesn’t digest itself in a cascade of immune reactions.

There are hundreds of types of cells, molecular receptors, cytokines, and neurotransmitters — all interacting in a combinatorial web of connections, often having effects directly opposite of what they do in isolation in vitro.

Mysteries and adventures abound in the neurosciences.

A demoralizing saying about all cognitive systems is “perhaps the simplest representation of a thing is the thing itself.”

So, what do we make of all this? Clearly, Science is on to something.

Newtonian mechanics is still used to solve the vast majority of Engineering problems, despite being ‘wrong.’ Genes were hypothesised, unobservable entities with extraordinary predictive power 100 years before the discovery of DNA.

They are now almost observable, so should we ‘believe’ in them? I guess so — you got me.

On the other hand, our understanding of what is meant by an electron has shifted multiple times in 100 years.

Electrons are neither particles nor waves, or maybe they are disturbances in a field.

They have charge and spin, but they don’t have mass (they only ‘borrow’ it from the Higgs boson).

They are neither here nor there.

You kick one, and another one falls over 60 light-years away.

Believing in electrons is an exercise in double-think: simultaneously holding mutually contradictory thoughts.

We can agree only that electrons are a useful concept.

When you die, you can ask God what they are.

  Feyerabend was wildly popular amongst my Philosophy-major friends at Berkeley.

It is fun to be an iconoclast when you are young.

I felt I was missing out, so I audited some of his lectures in grad school.

He was enormously entertaining.

I caught him after class, earnestly telling him I had been reading one of his books.

He asked me, “Which one?” When I told him, “Against Method,” he seemed panicked: “Oh, God, I am so embarrassed. That is a horrible book.”

Feyerabend never took himself that seriously, telling John Horgan in The End of Science, “I think Western Science can defend itself.” Kuhn found himself disavowing Kuhnians, insisting he was not supporting a type of cultural relativism.

Scientific Realists argue that it would be a “miracle” for Science to have accomplished so much and made such accurate predictions without having some truth content.¹³

But Feyerabend’s warnings about the tyranny of scientific dogma are well heeded.

Dogma restricts free thought and debate.

Dogma is boring.

Parroting is not learning.

And Dogma can be wrong.

The only epistemological tenet left standing is falsifiability.

The sanctimony of academia and the media has us all rooting for them to be proven wrong about everything, from climate change to carbohydrates.

In his autobiography,¹⁴ he lamented, “I often wished I had never written that fucking book.” I, for one, am glad he did.

  Which brings me back to the faculty meeting.

It does not matter in practice what a scientist believes is true.

It matters even less what the general public thinks scientists believe.

The hypersensitive grad students in the staff meeting were in full reverence of famous scientists’ efforts to falsify God.

A question that might matter is, “Is there such a thing as free will?” Under the Clockwork Universe of Cartesian physics, the answer seemingly was “No.” Quantum mechanics left us with some awful choices about what to believe: from the non-falsifiable and irrelevant ‘many worlds’ interpretation to the bleak, super-determinist view.

Neither of these extremes allows for free will.

I certainly hope that free will survives, as I find predestination depressing.

Until then, I remain firmly on the side of the pragmatists.

As William James declared, “My first act of free will shall be to believe in free will.”

Change my mind.


Reposted with permission.

Bio: Russell Anderson has a Ph.D. in Bioengineering and a B.S. in Electrical Engineering from the University of California.

His academic research involves learning in biological systems (neural, immune, and evolution).

He conducted postgraduate research at Los Alamos and Livermore National Labs, University of California (Berkeley, San Francisco, Davis, and Irvine), the Smith-Kettlewell Eye Research Institute, and the California Department of Health.

He has served as chief scientist at IBM, Opera Solutions, HNC Software, KPMG, NICE/Actimize, HCL, Mastercard, JP Morgan Chase, and Halifax Bank of Scotland.

He has published over 30 scientific papers and holds 5 patents for commercial predictive systems.

