Biology 446    Unsolved Problems Fall 2011

Some Philosophy of Science

Karl Popper: "The Logic of Scientific Discovery" (Harper)
    "Conjectures and Refutations" (Harper)
    also: "The Open Society and Its Enemies" (Harper)
    " The Poverty of Historicism" (Harper)

Popper sees science as progressing by making guesses and then testing these guesses.
Considers that the (potential) disprovability of a hypothesis is a virtue.
(a good theory is one which 'sticks its neck out')
A theory which is insusceptible to disproof (i.e., not falsifiable) is not scientific.

Popper's views have been strongly advocated by Peter Medawar (a Nobel Prize-winning researcher).
(Popper has also written very effectively against communism and other forms of historical determinism.)
Selected quotes from "Conjectures and Refutations"

    "When should a theory be ranked as scientific? Is there a criterion for the scientific character of status of a theory?"

    ".....to distinguish between science and pseudo-science, knowing very well that science often errs and that pseudo-science may happen to stumble on the truth."

    "There is of course the most widely accepted answer to the problem: the empirical method, which is essentially inductive, proceeding from observation to experiment."

    "On the contrary...the problem is one of distinguishing between a genuinely empirical method and a non-empirical or even a pseudo-empirical method...which...although it appeals to observation and experiment, nevertheless does not come up to the scientific standards. (astrology versus astronomy)

    "The theories of Marx, Freud and Adler...able to explain practically everything, whatever happens always confirmed it ....what precisely does it confirm? no more than that a case could be interpreted in the light of the theory.

    "...precisely this fact -that they always fitted, that they were always confirmed -which in the eyes of their admirers constituted the strongest arguments in favor of these theories. It began to dawn on me that this apparent strength was in fact their weakness.

    "With Einstein's theory the situation was strikingly different...the impressive thing was the risk involved in the prediction [of the amount by which light should be bent by gravity]. The theory was incompatible with certain possible results of observations -in fact with results which everybody before Einstein would have expected."

    "Confirmations should count only if they are the result of risky predictions."

    "Every good scientific theory is a prohibition: it forbids certain things to happen. the more a scientific theory forbids, the better it is."

    "A theory which is not refutable by any conceivable event is non-scientific."

    "Every genuine test of a theory is an attempt to falsify it, or to refute it. Testability is falsifyability."

    ".... the criterion of the scientific status of a theory is its falsifyability, or refutability, or testability."

    "Bold ideas, unjustified expectations and speculations constitute our only means for comprehending nature. Those of us who refuse to expose our own ideas to the risk of refutation are not the real participants in the game of science."

"...Russell is right when he attributes to epistomology practical consequences for science, ethics and even politics. The idea that there is no such thing as objective truth...and the idea that the truth is the same as usefulness are closely linked with authoritarian and totalitarian ideas...the belief in the possibility of a rule of law, of fundamental rights and a free society can easily survive the recognition that judges are not omniscient...(but) cannot well survive the acceptance of an epistomology which teaches that there are no objective facts." [Do you think this may have some relevance to the current craze for "deconstruction"]


Thomas Kuhn: "The Structure of Scientific Revolutions"
Univ. of Chicago Press
(another interesting book by this same author is "The Essential Tension")

Kuhn's writings concern what one might call the sociology of science: how new concepts supplant older ones, and specifically how this occurs in practice (perhaps inevitably), rather than how it ought to occur.

Two terms with special meanings in Kuhn's writings are "paradigm" and "revolution". By paradigm he means a conceptual framework, such as a theory or set of theories, in terms of which the observed facts are interpreted and explained. What he means by a "revolution" is some major change in people's thinking, in which an old paradigm comes to be considered disproven or is otherwise discarded, being replaced by a new paradigm. Among Kuhn's major insights are:

(1) That people tend to cling to their old paradigm, sometimes a little irrationally;

(2) That most science is an attempt (almost always a successful attempt) to fit new facts into the currently accepted paradigm; (and this can almost always be done, even when the paradigm is very wrong!)

(3) That observations tend to be uninterpretable or otherwise meaningless unless they do fit into the accepted paradigm (however forced and "procrustean" the fit!);

(4) That an accepted paradigm is never discarded until a new one has been proposed. (a scientific revolution = paradigm-shift);

(5) That historical sections of textbooks tend to ignore or even hide the existence of those paradigms which preceded the currently accepted one, and to pretend that past discoveries had fit right into the current paradigm, or had even been motivated by it.

Some quotations from "The Structure of Scientific Revolutions" by Thomas Kuhn

    "The transition to a new paradigm is a scientific revolution."

    "Mopping-up operations are what engage most scientists throughout their careers. They constitute what I am here calling normal science." "Normal science does not aim at novelties, and when successful finds none."

    ".... (a) new theory implies a change in the rules governing the prior practice of normal science. ...a new theory....is seldom or never just an increment TO what is already known. Its assimilation requires the reconstruction of prior theory and the re-evaluation of prior fact."

    "What Lavosier announced in his papers from 1777 on was not so much the discovery of oxygen as (it was) the oxygen theory of combustion. ...a reformulation of chemistry so vast that it is usually called the chemical revolution. ...the impossible suggestion that Priestly first discovered oxygen and Lavosier then invented it has its attractions."

    ".... once it has achieved the status of a paradigm, a scientific theory is declared invalid only if an alternative candidate is available to take its place. (In contrast to) ...the methodological stereotype of falsification by direct comparison with nature."

    "Once a first paradigm through which to view nature has been found, there is no such thing as research in the absence of any paradigm. To reject one...without... substituting another is to reject science itself."

    "The transition from a paradigm in crisis to a new one ...is far from a cumulative process....Rather, it is a reconstruction of the field from new fundamentals.

    "Almost always the men who achieve these fundamental inventions of a new paradigm have been either very young or very new to the field..."

    "Political revolutions aim to change political institutions in ways that these institutions themselves prohibit. Their success therefore necessitates the partial relinquishment of one set of institutions in favor of another, and in the interim, society is not fully governed by institutions at all."

    "Science textbooks...refer only to that part of the work of past scientists that can easily be viewed as contributions to the statement and solutions of the text's paradigm problem. Partly by selection and partly by distortion, the scientists of earlier ages are implicitly represented as having worked on the same set of fixed problems and in accordance with the same set of fixed canons...(of)...the most recent revolution in science theory and method... No wonder... they have to be re-written after each scientific revolution. (so that science once again seems largely cumulative."

Looney sub-fields related to philosophy of science

Anyone who likes Kuhn's ideas should probably be warned that some sociologists and English professors have over-extended comparable ideas in crazy directions. "Post-Modernism" and "The Strong Program", etc. are wastes of time and perversions of intelligence; but we are better off knowing they exist. Some courses at UNC and Duke seriously advocate such bunk, which has some entertainment value. But beware of them!

Are scientists like a primitive tribe, whose rituals (journal publication, symposia, experiments) should be analyzed by anthropologists? Are research papers (journal articles) just one more literary genre, more boring than novels and as stereotyped as Epic Poems? Lots of people claim to think so, and are running around loose, writing books and getting faculty jobs in non-science departments. Although their symptoms should be added to the infamous "Diagnostic and Statistical Manual of Mental Disorders", many of these people are quite intelligent. Some of their minds were twisted by personal disappointments in failed attempts to become research scientists. Others are cynical nihilists making careers from debunking truth (S. Fish, also known as Morris Zapp). There are whole industries of such stuff, and whole sections of bookstores.

A post-modernist journal published at Duke (named "Social Text") fell for a practical joke by a physicist named Alan Sokal, back in the mid-90s. He wrote a parody article, making fun of the kinds of b.s. that they would like to believe, submitted it to their editor as if it were a real article, and they fell for the trick and published it in their journal. Sokal then announced that it was a joke, and proved the fraudulence of their whole field. This was part of what is now called "The Science Wars". Very bright people, spinning mental wheels, when they should be humbly trying to cure cancer and understand self-allergy.

A person's whole sense of who they are and what they are worth can be based on an over-optimistic opinion of how intelligent they are, and what great science they are going to do. If their experiments then consistently fail, such people can either moderate their hopes, have a nervous breakdown, or switch fields and adopt some extreme version of philosophy of science. In college and graduate school, I knew two such people. Better they should have stayed in science, or maybe written novels. (Note, I am not referring to my friend who does linguistics, in case he reads this. He turned out to be even more intelligent than everybody expected! Although I wish he were doing cancer research.) Conversely, I have known dozens of people, counting myself, who have blundered into discoveries of much larger importance than could reasonably have been hoped for based on their previous aptitudes. "If at first you don't succeed", try using a different buffer; don't turn to postmodernism.

I encourage students to notice and learn from whatever philosophical adjustments they themselves make (or their friends make), or are tempted to make, based on successes or disappointments in laboratory research. It is natural, maybe unavoidable, and sometimes good, to develop rationalizations for consistent patterns of non-success. I promise you, however, that there really is such a thing as The Truth. It's "out there", sure enough. People of every level of aptitude can contribute to discovering what is true. How wonderful and odd, that phenomena, mechanisms, and laws of nature can exist for thousands and millions of years, with not one person ever understanding how they work - and then you or I can go do something in a lab that adds knowledge about them to the permanent treasury of science, not to mention the textbooks of the future.

Almost as strange is the fact that 90% of research scientists never read any philosophy of science. They just don't know it exists. Imagine a football game, with many thousands of people watching from the stands, and with a few hundred of these observers dressed in vertically striped shirts, as if they were referees, each with a whistle in his mouth. Visualize these self-appointed pretend-referees calling fouls, etc. just as if they really were referees, except that none of the players down on the field pays any attention to them, or ever knows they exist. Between games, the pretend-referees keep busy writing vitriolic articles criticizing each other's accuracy in calling fouls, noticed by hardly anybody except each other. Can anything actually be learned from them, or by them? Maybe.

If you choose to read philosophy of science (NOT needed for this course, however. Nor even helpful.), I recommend the following: Stephen Toulmin, Alfred Whitehead, Thomas Kuhn, Bertrand Russell, and Jeffrey Kasser's excellent recent series of recorded lectures sold by "The Teaching Company". Ernst Mach is also worth reading; even though he turned out to be wrong about atoms not existing (!), he is an example of a first-rate scientist writing philosophy, and Einstein was very stimulated by him. Books on the history of mathematics are also fascinating: who knew?! The best are by Howard Eves, Morris Kline, E. T. Bell (most entertaining, least accurate), Carl Boyer, Ivor Grattan-Guinness, Dirk Jan Struik, and the 4-volume boxed anthology by J. R. Newman. If you find reasonably cheap copies of any of these in a used book store, then do your brain a favor. They are every bit as much fun as calculus wasn't.

Some useful terminology and recurring metaphors for Unsolved Problems course

  • * "Occam's Razor": The simplest hypothesis is to be preferred; if there are two competing hypotheses, both of which explain the data available, then we should assume that the simpler one is more likely to be true; named for the medieval philosopher William of Occam, who wrote that "one should not multiply entities beyond necessity" or something like that. Einstein once wrote "Things should be made as simple as possible; but not more so!" Occam's razor tends to be less applicable to Biology than to Physics. Sometimes spelled Ockham.

  • * "Koch's Postulates": Formal criteria for identifying something, or settling some type of question; named for "the German Pasteur", the great bacteriologist of the late 1800s Robert Koch, who put forward these four criteria for proving that a given type of germ caused a given disease:
      1) The species of bacteria could (always) be isolated from victims of the particular disease
      2) These bacteria could be cultured outside the body
      3) Deliberately infecting a previously healthy animal with the isolated bacteria would produce the disease
      4) Bacteria of the same type could be re-isolated from these deliberately infected animals.
    People often speak loosely of 'settling on some Koch's postulates to decide whether X causes Y', very much in the same sense as one might speak about a 'litmus test' for Supreme Court nominees.

  • * "Paradox": Sometimes used to mean simply a puzzling problem; it is defined in one dictionary as things which seem either "contrary to received opinion" or self-contradictory, while also seeming true. Often, apparent self-contradictions turn out to be contradictory only in terms of, or due to the erroneous assumptions of, some part of the current conventional wisdom that itself happens to be wrong. For this reason, the first clue that one of our basic assumptions is mistaken is the emergence of a paradox. The following is quoted from the introduction of Hoffman, Levy and Nepom, "Paradoxes in Immunology" CRC press, 1986
    "Paradoxes play a key role in the advancement of science. They are associated with excitement, and with the knowledge that we must be looking at something the wrong way. The clear formulation of a paradox can herald important advances, since the resolution of the paradox is generally a step forward. It is therefore of paramount importance to identify paradoxes and focus attention on them."
    "The moral... is that we cannot tell from the existence of a paradox which part of our thinking is wrong. ... The usual response to a paradox is to conclude that the theory is wrong; frequently, however,... the data are being wrongly interpreted. In any case, the formal delineation of the paradox serves to highlight the existence of a problem that deserves study, and which might reward such study with fundamental new insights.

  • * "Anthropomorphism": The common practice, often more of less unconscious, of thinking about cells, molecules, etc. as if they were human, in the sense that they could think, understand things, have intentions, goals etc. For example, one might speak of a cell "trying" to accomplish some task; and one frequently speaks of molecules as being "recognized" by antibodies or by receptor molecules. Such metaphorical usages are often very helpful, and need not be misleading as long as we do not take them too literally at the subconscious level. The word "recognize" is especially worthy of careful analysis: it refers to specific binding (usually by precise conformational fitting) between two molecules, and carries implications that one of the molecules involved will, as a result of the binding, change its properties in some way, often in such a way as to transmit some kind of signal to other molecules or parts of a cell. "Recognize" is then mental shorthand for "bind specifically and undergo some change in conformation or other property...etc." Its OK, as long as you don't forget.

  • * "The Exception that Proves the Rule": Although this commonly-heard cliche originated from an archaic and now-forgotten meaning of the word prove (which used to mean the same as "disprove"), it has come to fill an useful intellectual nitche in its misunderstood form. Sometimes evidence seems initially to disprove an idea or hypothesis, but then it turns out that the situation is really a special case, with this specialness being such as to constitute some kind of actual additional confirmation of the original idea or hypothesis. Thus, the apparent exception to the rule, by itself turning out to be an exception, thereby becomes converted into new and unexpectedly strong additional evidence in support of what it had seemed to disprove. There are so many examples of this that I am unable to think of any.

  • * "Constructing a 'Don't Worry' Hypothesis": Skipping past an apparent impossibility, impracticality or self-contradiction of your currently favored theory by assuming that the problem can somehow be fixed or circumvented, and perhaps making a mental note to go back and fix it later (but not trying to specify what that additional saving hypothesis is, or otherwise worrying about it).

  • * "Intellectual Duct Tape": Patching a hypothesis or paradigm to protect it from data that is not really consistent with its original predictions. (Like what is traditionally called an "ad hoc" hypothesis)

  • * "Unpredicted Riddles": Queer little regularities in the data (such as Chargaff's rules of base frequencies in DNA, i.e. A=T and G=C+mC); another possible example is the frequently upside-down-and-backwards orientation of neural projections. These are regularities which no theory had predicted and which don't seem to make any sense, yet are nevertheless observed consistently. People even tend to get so used to these phenomena that they no longer regard them as requiring any explanation, other than perhaps being given a name, rather than considering them as being tantalizing hints as to the deep structure of the underlying mechanism. Perhaps the greatest "Aha!" experience in all of science would be to perceive that some hypothesis, that you had invented to explain some other aspects of the data, happens also to predict the occurrence of the riddle (as a certain model of neural connectivity proposed by Steve Roth happened to predict the upside-down-ness of neural projections).

  • * "The Fossilized Paradigm": When a now-disproven way of looking at a phenomenon lives on in the vocabulary or jargon of a field: Although you know that X isn't really true, you still go on talking about the phenomena as if X were true, although generally not realizing that this vocabulary originated in the earlier paradigm, or what it would probably imply to the uninitiated reader. Examples are that evolution is sometimes discussed in Lamackean terms, while immunity is discussed as if it were instructional.

  • * "Text Book Speaks With Forked Tongue": The tendency for the historical portions of texts to change the historical development of a subject so as to make it appear more of a straight line Baconian introduction of one new fact or explanation after another, without any detours in which now-discarded theories were considered proven facts.

  • * "Aesopean Explanations": Explanations that are intuitively attractive, but fundamentally erroneous. Often based on supposed analogies to familiar day-to-day phenomena; analogous to "Aesop's Fables". Some examples: The elephant's trunk got long because the crocodile pulled on it; Thunder is caused by clouds bumping into each other; Anticoagulants and vasodialators "thin the blood"; Arteries become blocked by too much cholesterol sticking to the insides of the vessel walls; Aging is caused by the body "wearing out"; Cells and aggregates of cells round up because of "surface tension". In Marxist jargon "Aesopean language" means oversimplified explanations designed to convince the naive; for example, "budget deficits cause inflation" has long been a staple of American politics.

  • * "Hilbert's Problems": A set of the most important and/or difficult unsolved problems in any field. At the International Mathematical Conference of 1900, in Paris, one of world's greatest mathematicians (David Hilbert) proposed a series of 23 unsolved mathematical problems. Some were quite specific, others more vague; about a third have since been solved. To solve one of Hilbert's problems is comparable to winning the Nobel Prize. Whoever solves one of them instantly becomes world famous (among mathematicians, anyway). For example, Godel's famous proof ammounted to a demonstration that Hilbert's first problem couldn't be solved!

  • * "Procrustean": In Greek mythology, there was an innkeeper named Procrustes who had only one sized bed; if you were too tall, he cut off your feet until you fit; if you were too short, he stretched you out on a rack until you were long enough. Thus, when someone has to stretch or abridge the facts to fit some favorite theory, then the effort is said to be "procrustean".

  • * "Baconian": Francis Bacon was a contemporary of Shakespeare and was Lord Chancellor of England before being sent to the Tower for embezzling. He wrote several philosophical books which proposed much of what has come to be thought of as "the scientific method". His highly empirical, practical-minded approach broke with the very speculative Aristotelian approach that had dominated science for the previous 1500 years or more. If someone uses the term "Baconian" in a pejorative sense, what is probably being criticized the extreme form of simply accumulating facts and observations, and hoping that the hypotheses will arise spontaneously.

    Bode's Law: There is a noticeable regularity in the spacing of the planets of the solar system, for which no cause has ever been determined. With the exception of Pluto and the gap between Mars and Jupiter, the planets' relative distances from the sun obey the rule that the radius of each planet's orbit is about 1.73 times larger than that of the planet before it: Venus's orbit is about 1.73 times wider than Mercury's, the Earth's orbit has a radius about 1.73 times that of Venus's, and so on.
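
    As a rough numerical check on that 1.73 figure, here is a minimal sketch (in Python) using approximate, rounded values for the planets' orbital radii in astronomical units, with the asteroid belt standing in for the Mars-Jupiter gap. The individual ratios wander between roughly 1.4 and 2.0, but their geometric mean comes out close to 1.7:

        # Approximate orbital radii (semi-major axes) in astronomical units,
        # rounded standard values; the asteroid belt (Ceres) fills the
        # Mars-Jupiter gap, as Bode's Law requires.
        orbits = [
            ("Mercury", 0.39), ("Venus", 0.72), ("Earth", 1.00), ("Mars", 1.52),
            ("Asteroid belt", 2.77), ("Jupiter", 5.20), ("Saturn", 9.54),
            ("Uranus", 19.2), ("Neptune", 30.1),
        ]

        # Ratio of each orbit's radius to the one just inside it.
        for (inner, r_in), (outer, r_out) in zip(orbits, orbits[1:]):
            print(f"{outer:>13} / {inner:<13} ratio = {r_out / r_in:.2f}")

        # Geometric mean of those ratios, i.e. the constant ratio that would carry
        # Mercury's orbit out to Neptune's in eight equal multiplicative steps.
        mean_ratio = (orbits[-1][1] / orbits[0][1]) ** (1 / (len(orbits) - 1))
        print(f"geometric mean ratio = {mean_ratio:.2f}")   # about 1.7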

    Harris's Rule for Scientific Fame: The most highly praised research papers are those that provide easy-to-understand evidence in support of some conclusion that people already believe, or wish to believe, but for which the previous evidence was either inconclusive or too complicated to understand easily. The worst thing you can do is to produce complex evidence that contradicts people's treasured prejudices.

    The more surprising and unexpected a scientific result is, the greater its information content. But the more it tells us, the less likely we are to be able to make sense of the new information.

    Even when the exact same experimental result is predicted both by an old accepted theory and by its new competing rival hypothesis, people will interpret the truth of this predicted result as a confirmation only of the old theory, and thus as a contradiction to all alternatives, including ones that made the same prediction.

    A method for constructing a new hypothesis, as an alternative to an existing one: Just re-write the old, existing hypothesis with the following substitutions of words and concepts: Substitute the word "stimulate" for the word "inhibit", and conversely substitute "inhibit" wherever "stimulate" had been. Similarly, swap "increase" with "decrease", "more" with "less", "top" with "bottom", "push" with "pull", "weak" with "strong", "assembly" with "depolymerization", and "plus end" with "minus end" (in each case in both directions, of course). If you keep on making such reverse substitutions, eventually they will cancel one another out, in the sense that the newly rephrased hypothesis will now predict most of the same phenomena and experimental results that the old original hypothesis had predicted ("explained"). Where the predictions do differ, check and see whether those made by the new "mirror image" hypothesis may not be closer to the truth than the old ones.
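
    Here is a minimal sketch (in Python) of the mechanical part of this recipe - the word-swapping only; judging which of the differing predictions is closer to the truth remains up to you. It is deliberately simple-minded: it swaps only the exact words and phrases listed above, in both directions, and makes no attempt to conjugate verbs. The function name mirror_hypothesis is just for illustration:

        import re

        # Antonym pairs from the recipe above; each pair is swapped in both
        # directions, so applying the function twice restores the original wording.
        PAIRS = [
            ("stimulate", "inhibit"), ("increase", "decrease"), ("more", "less"),
            ("top", "bottom"), ("push", "pull"), ("weak", "strong"),
            ("assembly", "depolymerization"), ("plus end", "minus end"),
        ]

        def mirror_hypothesis(text):
            """Return the 'mirror image' of a hypothesis by swapping each antonym pair."""
            swap = {}
            for a, b in PAIRS:
                swap[a], swap[b] = b, a
            # Try longer phrases first, so "plus end" is matched as a whole phrase.
            words = sorted(swap, key=len, reverse=True)
            pattern = re.compile(
                r"\b(?:" + "|".join(re.escape(w) for w in words) + r")\b",
                re.IGNORECASE,
            )
            return pattern.sub(lambda m: swap[m.group(0).lower()], text)

        print(mirror_hypothesis(
            "The gradient will stimulate assembly at the plus end and increase the push."
        ))
        # -> "The gradient will inhibit depolymerization at the minus end and decrease the pull."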

    From Merzbach and Boyer, A History of Mathematics, Third Edition:

