Friday, March 27, 2009

5.2.2 Isaac Newton’s Rules of Reasoning in Natural Philosophy

Religious people are fond of pointing out how deeply pious Newton was. In fact, in his own non-conformist manner, he was. He was also an alchemist and a serious student of Biblical prophecy, calculating from it that the world would not end before the year 2060. The economist John Maynard Keynes came into possession of Newton’s research papers on alchemy. After studying them he concluded, in an oft-repeated quote, "Newton was not the first of the age of reason: he was the last of the magicians." He may well have spent more effort on his religious studies and publications than on his scientific works (though it is only his contributions to science that are considered memorable today). That surprising fact needs to be considered in the context of the time in which he lived (the late 1600s). This was before most discoveries in biology, medicine, astronomy, geology, physics, and chemistry had been made. Many of today’s modern sciences had not even been invented. The gaps in knowledge were deep and wide, and there had been a long, uninterrupted history of allowing God to fill those gaps. There was no viable competing theory to the traditional theistic one. Essentially everyone was a theist and a creationist. Only a few decades earlier, England practiced persecution and torture of those who disputed church doctrine (and, on rare occasion, execution).

Newton, although very devout, had no tolerance for what he called “occult causes”, because he saw them as unnecessary and unhelpful. They had no explanatory power, but were simply excuses for explaining away what we didn’t yet understand. In his day, the nature of magnetism, electricity, gravity, cohesion, friction, thermodynamics, fermentation, cell biology, and other natural phenomena were not well understood. He envisioned that from the confusion that then reigned, laws of nature would emerge to resolve those mysteries. He criticized the Aristotelians for ascribing occult causes to incomprehensible natural phenomena, correctly observing that “such occult qualities put a stop to the improvement of natural philosophy, and therefore of late years have been rejected. To tell us that every species of things is endowed with an occult specific quality by which it acts and produces manifest effects is to tell us nothing”. Even so, he himself subscribed to two seemingly occult entities – the invisible force called “gravity”, and the subtle ether he speculated might underlie the transmission of light.

However, it is important to keep in mind that he lived on the historical edge of the scientific revolution. During his lifetime, there was not a clear distinction between chemistry and alchemy, between the natural and the supernatural, between science and magic. He helped to refine those distinctions in many ways, not the least of which was an often reprinted work of just a few pages called “Rules of Reasoning in Philosophy”. It was a simple guide to help thoughtful observers make sense of their experiences in the natural world.

He enumerated four rules for understanding real-world “natural philosophy” (i.e., science) problems as follows:
“1. We are to admit no more causes of natural things than such as are both true and sufficient to explain their appearances. To this purpose the philosophers say, that Nature does nothing in vain, and more is in vain, when less will serve; for Nature is pleased with simplicity, and affects not the pomp of superfluous causes.”

This is his somewhat anthropomorphized, teleological version of Occam’s Razor - “do not multiply entities beyond necessity”. Just as the Golden Rule shows up again and again in different religions, Occam’s Razor has a way of reappearing in different forms in science. It also goes by the name “Law of Parsimony”. Briefly, it means that the simplest solution that explains the phenomenon is to be preferred. Experience shows that this principle has surprising explanatory power and leads to the correct answer much of the time.
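As a modern, quantitative echo of the Law of Parsimony (my own illustration, not anything Newton wrote): in statistical model selection, criteria such as the Akaike Information Criterion reward goodness of fit but charge a price for every extra parameter. The data and polynomial degrees below are hypothetical; this is a minimal sketch assuming numpy is available.

import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 40)
y = 2.0 * x + 1.0 + rng.normal(0.0, 1.0, x.size)   # the "truth" is a simple line plus noise

def aic_for_polynomial(degree):
    # Fit a polynomial of the given degree and return its AIC (lower is better).
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    n, k = x.size, degree + 1                       # k = number of fitted parameters
    rss = float(np.sum(residuals ** 2))
    return n * np.log(rss / n) + 2 * k              # Gaussian-likelihood form of AIC

for degree in (1, 3, 5):
    print(f"degree {degree}: AIC = {aic_for_polynomial(degree):.1f}")
# The straight line usually wins: higher-degree fits hug the noise a little better,
# but not enough to pay for their extra parameters.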
“2. To the same natural effects we must, as far as possible, assign the same causes. As to respiration in a man, and in a beast; the descent of stones in Europe and in America; the light of our culinary fire and of the sun; the reflection of light in the earth, and in the planets."

In other words, causality is universal – the light from our candle is the same as the light from the sun, which is the same as the light from a distant galaxy. A falling rock follows the same laws as a planet falling in orbit around the sun. Without evidence to the contrary, we can justifiably infer that the same types of causes produce the same types of physical outcomes.
“3. The qualities of bodies ... which are found to belong to all bodies within the reach of our experiments, are to be esteemed the universal qualities of all bodies whatsoever.”

The qualities and characteristics of objects we can experiment on should be considered the same as those of similar objects that we have never touched. Principles derived from clear evidence are valid, beyond doubt. We should have the discipline to follow the evidence and not turn our backs on facts in favor of a belief that may be more comfortable or familiar. We should look to nature for guidance and let it direct our research, rather than letting our theories blind us to the evidence. The third of Newton’s rules presages later, similar restatements of the same concept such as the “Cosmological Principle” and the “Principle of Uniformity”.
“4. Propositions inferred by induction from observation of phenomena should be viewed as accurate until other phenomena contradict them.”

Unless proven otherwise, the best theory that successfully explains the facts should be accepted, keeping in mind that all theories are provisional, subject to revision given new evidence. He cautioned that future discoveries might lead to improvements of existing theories. Going back to the quote, “the light of our culinary fire and of the sun”, at that time there was no theory of nuclear fusion. Now we know that a candle flame is not the same as a nuclear reaction. This last rule embraces improvements and changes to theories to accommodate new discoveries of this type. The fact that he envisioned that such discoveries could and would be made testifies to the tremendous insight he showed in developing these impressive guidelines. For this reason, I find myself returning to these four simple and inspiring rules of reasoning to review and contemplate.

These guidelines have been modified, recycled, enhanced, and restated in many different forms since Newton first proposed them. They were among the first of many attempts to provide a philosophical framework and justification for the process of drawing conclusions from what we see happening around us, and extrapolating that knowledge to the greater universe beyond.

Wednesday, March 25, 2009

5.2.1 The Infinite Regress Problem

In The Logic of Scientific Discovery, Karl Popper explored what is called the “Problem of Induction”. This phrase describes the question of whether inductive inferences can be justified, and under what circumstances they can be. The problem has two facets: (1) you can’t logically deduce that the inductive process is valid, and (2) using inductive logic to prove itself is “begging the question” (circular reasoning). Even though inference and induction seem to work brilliantly, this paradox continues to haunt us.

One of the larger questions we are trying to answer is: on what assumptions are science and rational empiricism based? If those assumptions could be enumerated, it would then be reasonable to ask why we believe those assumptions - how they can be justified. Given some more thought and analysis, it might be possible to provide good justifications for those assumptions by presenting more fundamental justifications. But then those come under the same kind of scrutiny – why would we believe them? – and so on, ad infinitum. This “infinite regress” is intellectually unsatisfying, because the series of nested justifications never ends. Or it could mean that, at a deep level, some unprovable postulates just have to be accepted - that rationality is based on just as flimsy a framework of unproven assumptions as any other faith-based belief system. Or it could end up in a circular argument (e.g., "I like bananas because they taste good"). If any of these are the case, then skeptics of rationality would argue that since its most basic beliefs cannot be justified, we don’t really know anything, absolutely. We are left with only one certainty – that there are no certainties.

Although this does represent a perplexing problem, it has not stymied attempts to overcome it – in fact, there have been many attempts to defeat the infinite regress problem. Some have responded that it is perfectly acceptable to let the justifications roll on to infinity. I have a hard time seeing how that would actually work, though. It is easy enough to just say that regress is not really a problem, but I am hard pressed to imagine a scenario involving more than just a few levels of justification before reaching the very atoms of logic and experience, beyond which it is impossible to go any deeper. The “why / because” back and forth dance can only go so far before it becomes tedious and meaningless.

Another alternative is Foundationalism, which overcomes the infinite regress argument by proposing that some core beliefs are self-evident and obvious, neither requiring justification nor even being capable of justification. A later section of this document goes deeper into this concept. A very similar response is Coherentism, which allows for explanatory circularity involving cross-referencing of justifications. If a diverse, interrelated system of different lines of evidence and belief is consistent, non-contradictory, and mutually supportive of the entire structure, the overall system and the theories that comprise that system gain credibility. In this model, beliefs are like pieces in a puzzle or clues in a murder mystery. They become more believable as they fit with more and more of the interrelated beliefs, facts, evidence, and theories that already are on firm ground. For example, the theory of Evolution gained more credibility and relevance as subsequent discoveries in organic chemistry, molecular biology, geology, paleontology, zoology, botany, anthropology, and archeology showed strong confirmation for it, and even helped advance and deepen the original theory. As a system of beliefs, the theory of Evolution is coherent with the rest of science. The same is true of all the other major branches of science.

Science relies on the principle of induction. This principle allows the inductive process to be put into a logical form capable of providing a basis for the legitimacy of scientific endeavors. Hans Reichenbach wrote that its absence would “mean nothing less than to deprive science of the power to decide the truth or falsity of its theories. Without it, clearly, science would no longer have the right to distinguish its theories from the fanciful and arbitrary creations of the poet’s mind.”

5.2 How can we have confidence in our inferences?

I won’t profess to have demonstrated in any conclusive way that an external world exists. Like all interesting philosophical problems, it defies proof – if it were provable, then it would cease to be an issue at all and there would be no reason to continue writing about it. Many of the world’s greatest thinkers have lived and died without resolving the question. I have given many examples of where they stood on this issue. But it does seem reasonable to assert that only if there is something in the world besides our individual minds does it become worthwhile to try to discover what it is and how it works. The drive to learn about the world is more important to those who advance the body of science than any philosophical issues related to the meaning of that activity.

I will develop the rest of this presentation assuming that there is an external world that exists when we are not around to observe it. Given that, this section discusses if and how we can draw reasonable inferences about that world from our experiences of it. This section defines some of the postulates and assumptions that must be accepted to make it possible for people to operate in the world, and for scientists to learn about the world. They are not based on faith, but on a mixture of evidence and reason: carefully crafting theories, gathering evidence, collecting consistent support from countless observations, and a generous application of common sense.

As Bertrand Russell wrote, "the general principles of science are believed because mankind have found innumerable instances of their truth and no instances of their falsehood. But this affords no evidence for their truth in the future, unless the inductive principle is assumed." This great man was unable to arrive at a conclusive, deductive proof of induction. But his contributions to the analysis of the problem, along with those of many other famous thinkers, make interesting reading.

John Worrall, a professor of philosophy of science at the London School of Economics, wrote:
Nothing in science is going to compel the adoption of a realist attitude towards theories. But this leaves open the possibility that some form of scientific realism, while strictly speaking unnecessary, is nonetheless the most reasonable position to adopt.

5.1.3 What are we to conclude?

Practicing scientists generally assume what non-scientists do – that whatever the answer to this question really is, we all must proceed as if the world is as it appears. So, we assume that the physical world has a metaphysical/ontological reality independent of human experience – that a falling tree does make a sound if there is no one there to hear it – that stars are born, burn, and then die without any humans bearing witness.

The corollary to this assumption is that human beings can experience that external world to the degree that our senses and our instruments allow it. By no means do all philosophers accept this view, and perhaps some scientists would not completely accept it. But for the practice of science to succeed, those who “do” science have to behave as if it were so. I will not complicate matters by delving into the challenges to this conceptualization of reality that Quantum Mechanics brings to the table, but it does introduce some difficulties (such as the inconvenient violation of the law of non-contradiction by the wave/particle duality of light). But at that small scale, as well as at the cosmological scale, concepts and words that have clear definitions at the macro level cease to have the same meanings.

If we can stipulate, then, that we can accept reality at face value, what do we do with that information? As we experience the world, can those experiences translate into and inform our explanations about how it functions, how it has functioned, and how it will in the future? In other words, can we have confidence in what we believe we are learning from the world? Is an epistemology based on interactions with nature reliable?

Tuesday, March 24, 2009

5.1.2.9 Modern Philosophy of Science

Finally we come to the 20th and 21st centuries, where we find no single Modern Philosophy of Science. Neither today nor in the past has there ever been a unanimous philosophical consensus. Adding to the diversity of outlook, scientific specialization and the rapid pace of discovery in the last century have driven each of the sciences to catalog its own set of non-scientific issues and individual philosophies with which to grapple. We live in a results-oriented age, and pragmatism plays a large part in any modern scientific endeavor. The concern is not so much “what meaning does science have?” as “what conditions and ways of thinking and acting make it possible to do science well, to do it fast, to make rapid innovations and breakthrough discoveries?” So, the bottom line is that performance matters and philosophy takes a back seat; delivering the goods comes before introspection.

However, 20th and 21st century science produced many new and crucial non-scientific questions. There are so many that they displace the comparatively uninteresting metaphysical questions related to the nature of existence (unless existence itself is fundamental to the research area, as in Cosmology or Particle Physics). Any time for armchair philosophizing is given to considering immensely important issues like the morality of research in nuclear energy, the ethics of genetic modification and stem cell use, the impact of science and technology on the environment, limited natural resources, unequal distribution of technology, the whole gamut of green issues, the spread of disease in third-world countries, famine, birth control, the nature of consciousness and “the mind”, free will vs. determinism, what differentiates life from non-life, and countless other concerns. Some of these issues could be considered philosophical, but for the affected people they can mean life and death.

Of course, we can’t forget the never-ending ideological battles between the metaphysical naturalists and those promoting mystical or religion-based “origin” explanations. These range from debates over evolution to cosmology, teleology, the “fine-tuned universe”, and consciousness. Little time is spent speculating on non-controversies such as the possibility that the reality we see is not actually there, or is there in some other form.

If one term could encapsulate the current scientific view, it would be “Scientific Realism” – the dominant theme in 21st century science. This doctrine descends, with modification, from the Logical Positivist movement inspired by Wittgenstein and developed by Carnap and others earlier in the 20th century. Beginning in the 1930s, the Logical Positivists set Philosophy of Science apart as a genuine sub-field within general Philosophy. Their agenda dominated Philosophy of Science for decades. Scientific Realism takes the real world as an adequate working hypothesis – that what we see is what we get. Scientific Realists believe that when we perceive the world, we perceive what is actually there. Additionally, it promotes the idea that the objects of science that cannot be directly observed (atoms, black holes, gravity, electricity, subatomic particles, magnetism, genes, etc.) have real existence that is just like that of objects that can be seen and directly experienced. Even though the objects of the micro-world are invisible to human senses, they are predicted by theory, detectable by our instruments, and transformable into data that can be observed. They act consistently with what one would expect from actually existent objects. The hypothesis that the unobservable objects of science are actually there, rather than that reality merely "acts as if" they are there (when they really aren't), is the best explanation for our experience of the world. The fact that scientific explanations have worked so well for so long, and that they can be utilized in technology and engineering so successfully, is a powerful and convincing argument for realism. It would be a huge and improbable coincidence if non-real unobservable entities were able to generate the measurements we take of them, and then permit us to use them to build new and surprising technological devices, which in turn allow us to discover newer, even more bizarre and different unobservable entities. If they actually were not present, it would require an intricate set of miracles for this to occur. Further, objects that were previously unobservable (DNA, molecules, atoms) now can be observed (using instruments based on our knowledge of how other unobservable objects, like X rays, work). They have gone from unobservable to observable. This transition doesn't make a non-real object suddenly become real - they were real all the time. One way of thinking about it is that humans have built artificial sense organs that can see into the distant past, the future, the very fast, the hidden, the invisible, the very slow, the very far, the very small, and the very large.

Realists assert that no other way of considering reality would allow as reliable a path to achieving the goals of facilitating the discovery process, publication, and theoretical progress. Realists maintain that a very good reason for subscribing to their view is that it has an unsurpassed record of success and achievement, and no record of being wrong. That is, no experiments have demonstrated that the external world does not exist (which would be an Idealist result). The theories produced by this worldview and practice both explain the existing state of affairs and predict future outcomes with unequaled power. The cumulative set of theories and facts from all the sciences demonstrate extremely high coherence and mutual support that could only be explained by their being correct. Scientific Realism has a remarkable track record that attests to the extremely high probability that it is the right way of viewing the world. It is widely held that the most powerful argument in favor of Realism is the "no-miracles argument", according to which the success of science and realism would be miraculous if scientific theories were not at least approximately true descriptions of the world.

So far, I have been contrasting Realism with Idealism. However, an issue that is much more topical within current Philosophy of Science is Realism vs. Anti-Realism. Realism is the belief that behind our theories and models there is actually some sort of substance that is being modeled. It reflects a belief in the actual existence of both observable and unobservable aspects of the world described by the sciences. Anti-realism refers to claims about the non-reality of "unobservable" or abstract entities such as sub-atomic particles, electrons, genes, or other objects too small to measure directly or detect with human senses. For example, a realist would assert that there really is an entity called the "electron" that exists independent of its several attributes that we are able to measure. The anti-realist either doesn't care to speculate about it, or chooses not to make assertions about that which cannot be directly experienced, or flatly denies the existence of anything beyond the charge, spin, mass, and other properties of electrons that we are collecting measurements on. "Anti-Realist" is not the same as Idealist, because while the Idealist might think the entire external world is illusory, the anti-realist just has issues with what constitutes the fundamental content of that external world.

Realists advocate the idea that the primary reason for accepting the objective existence of the table and the atoms of which it is made is that it is the only stance which fully explains the table's persistence and similarity to all observers at any time, in any place. Everyone who experiences the phenomenon which we call "table" has essentially the same experience. By extension, the same arguments that are made for the existence of a table apply to the existence of the small unobservable entities of which it is made - the atoms and their constituents. Arguments that anti-realists make for withholding belief in the unobservable atoms in the table should carry over to the table itself. The belief in everyday objects, and the subordinate objects of which they are composed, allows us to explain many observable phenomena that would otherwise be inexplicable. Why should such explanation be ruled out for the contents of the unobservable world?

The Ptolemaic theory of the solar system is an anti-realist model. It was a good theory in that it described the movement of celestial bodies and could predict their movement in the future. We know now that it doesn't correctly model the actual structure of the solar system - it was just a very effective calculating tool that astronomers used to help them locate the objects of our solar system. Many, if not most, Ptolemaic cosmologists didn't believe that the solar system actually looked like their model, and it wasn't an important factor for them anyway (although it conveniently fit with the prevalent worldview of the time, which was that the Earth was the center of the universe). The model was just too quirky to be believed as a true representation (requiring its many cycles and epicycles). The same could be said of the early models of the atom. Dalton had a very simple indivisible atomic model - the atom was just a single small object with no internal structure. This was the first physical model for the phenomenon we call "atom", and was really the first advance in the more than 2000 years since Democritus theorized its existence. The Thomson "plum pudding" model did a better job of predicting outcomes of experiments in the early 1900s, after the existence of the electron had been confirmed. But it was not a true representation of any sort of actual reality. The Rutherford nuclear or "planetary" model was a big improvement because it placed a tiny nucleus at the center with the electrons orbiting it, and Bohr then organized those orbits into "shells", each capable of containing a certain fixed number of electrons. But it still was not a description of the underlying fabric of reality. Chadwick, Heisenberg, and Bohr further improved it with the proton/neutron model of the nucleus, and so on. Our current model involves a quantum mechanical "cloud" of probabilities representing possible positions of the electrons.

But is there really a set of objects down at that level which the model is correctly describing? If this question had been asked of any of the prior theories, the correct answer would have been "no" - the underlying reality (if it even existed) would not have looked anything like the model. It is probably not the case that our current probabilistic models will go unmodified into the future. The likelihood that our current descriptions of atoms (and things smaller than the atom) really are painting a true and accurate picture of a deeper internal structure is not very high. And the question, "is there even anything down there?", has not really been answered, and it may not be capable of having an answer. Certainly some type of "atom-like" phenomenon is occurring in the location where our models say the atom is. But the "thing" which the models attempt to describe is probably not quite what those models bring to mind. Our models can only go so far as to describe the structure of the relevant relationships at that level, but the reality to which they point keeps shifting. This is the essence of "Structural Realism", which is a very important concept in modern philosophy of science.

One prominent anti-realist position is instrumentalism. Instrumentalism is the pragmatic view that a scientific theory is a useful tool for understanding the world. Instrumentalists evaluate theories or concepts by how effectively they explain and predict phenomena, as opposed to how accurately they describe objective reality. By this standard, it has been suggested, the Ptolemaic and the Copernican solar systems were equally good (up to a point). In my opinion, Instrumentalists have voluntarily donned blinders so they can focus on their work, rather than adopting a complete philosophical framework for reality. Non-realism takes a purely agnostic view towards the existence of unobservable entities: unobservable entity X serves simply as an instrument to aid in the success of theory Y, and we need not determine the existence or non-existence of X. Some scientific anti-realists go further, however, and deny that unobservables exist.

Individual theories may be disproved, but the overall body of science is fundamentally “right”. Its theories are able to explain what we currently see, to anticipate events that will occur in the future, and to predict discoveries about what occurred in the past (as in geology, astronomy, and paleontology). Its epistemological basis is nature itself, rather than mythology, tradition, or revelation. The increase in knowledge that results from its application passes through the rigorous filter of the scientific method. It is coherent, consistent, reliable, and it makes continual progress and theoretical refinement. Further, there is no compelling reason to disbelieve it. No competing acceptable explanation has been proposed. This doesn’t constitute irrefutable proof; it instead relies on “inference to the best explanation”, meaning that among the available explanations, Realism is by far the strongest.

As has been mentioned, the modern view also incorporates Scientific or Methodological Naturalism as its core epistemology – there is simply no other way of gaining knowledge about the world that can compete. However, these terms, themselves, can generate debate even among those who practice them. This debate, though, may be more semantic than substantive.

The first view is that Naturalism is a necessary component of science itself, that Science can only investigate the Natural and can say nothing about the Supernatural. Only by linking empiricism to a naturalistic research framework can we gain knowledge about nature. Because of this, the Supernatural is inaccessible to science. It may or may not exist, but it is outside the purview of science.

Alternatively, some believe that science, as it is practiced, does not need to make a distinction between the Natural and the Supernatural. In this view, insisting on Philosophical Naturalism is putting the cart before the horse. Science doesn’t limit itself to studying natural phenomena – it studies what it can study, and by definition that domain of phenomena and entities is called the “natural world”. The scientific process doesn’t assume in advance anything about what is or is not “natural”, but only what it can successfully address. For example, in centuries past insanity, plagues, comets, eclipses, and other phenomena were considered to be of supernatural origin (e.g., messages from God or possession by demons). Science demystified them and they were transported from the supernatural realm to the natural realm. They were not automatically off-limits to science simply because they were thought to be of supernatural origin.

Applied to the verification of entities, processes, or phenomena, this means that those we can detect, measure, and describe with science end up being what we call “nature”. Science precedes Naturalism, rendering the Natural/Supernatural distinction moot. So, it would be incorrect to say that science can only deal with natural phenomena. More correct is that what we call “natural” is simply the collection of everything that science studies. So, some phenomena that were previously considered supernatural could be effectively moved to the natural if they could be investigated by science. As described in http://www.naturalism.org/science.htm,


“Science needn’t define itself as the search for “natural” or material causes for phenomena. In actual empirical fact, in building explanations and theories, science proceeds quite nicely without any reference to the natural/supernatural distinction. Science is defined not by an antecedent commitment to naturalism (whether methodological or ontological), but by criteria of explanatory adequacy which underpin a roughly defined, revisable, but extremely powerful method for generating reliable knowledge.”

The term “explanatory adequacy”, as used here, involves the standard set of criteria that surrounds the scientific method and scientific proof:
  • There should be good evidence for the phenomenon.
  • The phenomenon being studied should have some minimal level of “prior probability” (i.e., be considered potentially probable even before the collection of new confirming evidence).
  • A hypothesis for the phenomenon must be proposed that is testable, and it must be capable of falsification.
  • If a theory, it should have descriptive, explanatory, and predictive power.
  • The theory should either propose or lead towards a mechanism for the effect being studied.
  • The explanation should be consistent with previous knowledge, or if not, provide a convincing explanation why it is not.
In the generally accepted 21st century model for science, there are several factors that must exist for one to confidently assert the establishment of a new scientific phenomenon. This might also be framed, as Steven Novella put it, as a standard for having confidence in drawing a scientific conclusion from evidence:
  • The investigation must have been conducted using good methodology, where any artifacts are weeded out, confounding factors are eliminated, and extraneous variables are controlled for.
  • The results must be statistically significant.
  • The results must be capable of replication at other independent labs and by other researchers.
  • The size of the effect must be well above the “noise” level.
Further, we need to see all of these factors occur at the same time. It is not enough to see just one or two, but all together. Replication with poor methodology proves nothing, as do studies with good methodologies but small effect sizes, or studies with strong statistical results but with murky methodologies.
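As a rough illustration (mine, not Novella's) of two of these checks expressed in code: a significance test and an effect size measured against the background noise. The numbers and group sizes are hypothetical, and numpy and scipy are assumed to be available; sound methodology and independent replication are, of course, not things a script can certify.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
control   = rng.normal(loc=100.0, scale=15.0, size=50)   # hypothetical baseline group
treatment = rng.normal(loc=112.0, scale=15.0, size=50)   # hypothetical treated group

# Statistical significance: Welch's t-test (does not assume equal variances).
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)

# Effect size: Cohen's d, the mean difference in units of the pooled spread ("noise").
pooled_sd = np.sqrt((control.var(ddof=1) + treatment.var(ddof=1)) / 2)
cohens_d = (treatment.mean() - control.mean()) / pooled_sd

print(f"p-value = {p_value:.4f}, Cohen's d = {cohens_d:.2f}")
# Both checks passing in a single study is still only a start: the result also needs
# clean methodology and replication by independent researchers before it counts.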

Methodological naturalism is sometimes (incorrectly) used synonymously with philosophical naturalism. In fact, they are quite different. As previously described, methodological naturalism is an epistemology and is the heart of the protocols used in scientific research - it is a tool, a methodology, for discovering new knowledge. It is the idea that all scientific phenomena are to be explained and tested by reference to natural causes and events, and that magic and supernatural causes cannot be offered as explanations for the phenomena. Philosophical naturalism, by contrast, describes a metaphysical point of view. Methodological naturalism is agnostic towards the ultimate metaphysical realities of the universe that are intrinsic to philosophical naturalism.

Modern science does not reject out of hand supernatural causality or explanations. It does not assume philosophical naturalism, which excludes in advance all supernatural explanations. It does not restrict its inquiry only to naturalistic explanations for phenomena. Methodological naturalism is not a choice or preference, as philosophical naturalism is. It is a necessity for the practice of science. We do not limit the types of answers that we are willing to consider to those that conform to an a priori naturalistic paradigm. We only limit the questions that science asks to those that can be addressed by the scientific method. If the question is posed in such a way that it cannot be falsified, then it simply can't be addressed by science - it is not a scientific question. If a testable hypothesis involving supernatural agents can be constructed, then science can address that hypothesis. This has already been attempted in tests to see if prayer will cause amputated legs to re-grow, in double-blinded studies testing whether appeals to God on behalf of sick persons hasten their recoveries, in studies of homeopathic medicine, and in ESP experiments. None of the results of these types of investigations were statistically compelling.

Karl Popper led the movement to embrace falsifiability, just mentioned, rather than verifiability, which was a fundamental tenet of the Logical Positivists. Popper was one of the most influential philosophers of science of the last century. Falsifiability certainly ranks as one of the most important elements in the modern conduct of science. Its dominance over verifiability results from the fact that no number of positive experimental outcomes can ever absolutely confirm a scientific theory, but a single counter-example is decisive: it shows that the theory being tested is false, or at least incomplete. Instead of saddling scientists with the impossible task of providing absolute proof, a theory was considered to be tentatively “true” if ample opportunity and means were provided to disprove it, but no one was able to do so. Falsifiability became Popper’s criterion of “demarcation” between what is and is not genuinely scientific: a theory could be considered scientific only if it were also falsifiable. This emphasis differed from that of the Logical Positivists, who focused instead on verifiability – testing the truth of statements by showing that they could be verified.
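A toy sketch of that asymmetry (my illustration, using the stock "all swans are white" example rather than anything Popper or the Positivists wrote): no run of confirming cases can complete the verification of a universal claim, while a single counterexample refutes it.

def test_universal_claim(observations, predicate):
    # The first counterexample falsifies the claim outright; any number of
    # passing cases leaves it merely "not yet falsified", never "proven".
    for i, obs in enumerate(observations, start=1):
        if not predicate(obs):
            return f"falsified by observation {i}: {obs!r}"
    return f"not yet falsified after {len(observations)} observations (still provisional)"

swans = ["white"] * 10000 + ["black"]                                # one black swan at the very end
print(test_universal_claim(swans[:10000], lambda s: s == "white"))   # provisional only
print(test_universal_claim(swans, lambda s: s == "white"))           # a single failure decides it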

Popper demonstrated his position with the example of the rising sun. Although there is no way to prove that the sun will rise every morning, we can hypothesize that it will do so. If it failed to rise on even a single morning, the theory would be disproved. Barring that, it is considered to be provisionally true. The longer a theory retains this provisional status, the more attempts are made to test it, and the more times those tests fail to disprove it, the greater its claim to truth. The “sun-will-rise” theory has been well tested many billions of times, and we have no reason to anticipate that circumstances will arise that will cause it to stop happening. So we have a very good reason to believe that this theory "probably" represents reality. This argument has some weaknesses (primarily that it is not deductively ironclad, just as no inductive judgment can be). But because the theory has never failed, no stronger proof suggests itself, it is pragmatically useful, and it is statistically unlikely to be disproved, it is a very good operating theory.
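One classical way to put a number on "statistically unlikely to be disproved" is Laplace's rule of succession, which Laplace himself applied to this same sunrise example. I offer it only as an illustration; Popper explicitly rejected assigning inductive probabilities of this kind to theories.

def probability_next_sunrise(observed_sunrises):
    # Laplace's rule of succession with no observed failures: (n + 1) / (n + 2).
    return (observed_sunrises + 1) / (observed_sunrises + 2)

for n in (10, 1000, 1000000):
    print(f"after {n:>9} sunrises, P(rises tomorrow) = {probability_next_sunrise(n):.6f}")
# The probability creeps toward 1 but never reaches it - certainty stays out of reach.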

In the years since Popper first introduced the idea of falsification, it has been criticized, altered, and enhanced. See the entry in this blog for "Falsifiability vs Verifiability" to learn more about the limits of "naive falsification" and ways around those limits. Also see an amusing Youtube video for a description of the Duhem-Quine Thesis, which addresses other modern concerns with hypothesis testing. One of my blog entries, "Duhem-Quine", briefly describes that thesis, and another, "Criteria of Adequacy", describes several techniques that can be used to determine which of several competing hypotheses that appear to be equally believable is more likely to be correct.

Saturday, March 7, 2009

5.1.2.8 Wittgenstein

Ludwig Wittgenstein is the “triple threat” of modern philosophers: any one of three separate accomplishments would have made him famous. Although he was born a generation after Moore and Russell, they all contributed to the rebirth of modern Analytic philosophy (which uses precise, sometimes mathematical language to analyze issues). He laid down the precursors of what would become Logical Positivism, though he never considered himself a member of that school of thought. And he led the way in defining modern Linguistic and “Common Language” philosophy, which is a dominant movement even today. Even the other Greats of his time considered him to be uniquely brilliant.

Born into one of the richest families in Europe, he volunteered in the Austrian army during WWI, and spent many years as an elementary school teacher and gardener. From time to time he lit intellectual fires that radically reshaped 20th century philosophy.

His contributions fall into two periods marked by the publication of Tractatus, followed after his death by Philosophical Investigations. He is extremely difficult to interpret, and as many times as I have waded through his work and others’ analyses of it, I have emerged feeling like I just experienced something very important, but baffled by just exactly what it was. I will not try to do a comprehensive summary of his positions, but only to relate them to the question at hand – what is the external world and how does our perception of it function?

He was not a metaphysician, and in fact rejected metaphysics as a legitimate focus of philosophical speculation. He actually had very little to say about the nature of reality. It is his silence on the subject that makes him important to this discussion. For one of the giants of modern philosophy to regard it as an uninteresting question is itself interesting.

Tractatus Logico-Philosophicus
Wittgenstein's goal with this work was to define a logically perfect language, building on Russell's earlier efforts, with which to discuss philosophical issues. All complex domains (physics, math, dance, art, architecture, sports) have specific, highly tailored language that is used only in the context of describing the elements of that domain. But philosophy has traditionally used language and concepts from everyday life in incorrect and misleading ways to complicate issues that could and should be dealt with completely differently. Difficult philosophical problems result from abuse of language when it is misappropriated from normal, everyday use into an unfamiliar and shaky metaphysical environment.

He intended to develop a language appropriate to the task of discussing philosophical issues, and to delineate exactly what could and could not be discussed meaningfully using that language. Kant attempted to distinguish what could be known from what was forever unknowable and inaccessible. Wittgenstein's parallel task was to distinguish what could be said from what was unsayable and inaccessible. The point of this work was to draw a strong connection between language and the world (i.e., reality). He believed that when people attempt to gain certainty or to convey it to others by making controversial, confusing, or debatable propositions they are engaging in confused thinking and semantic nonsense that hinders understanding instead of helping it.

He laid out an approach that had these elements:
  • The world is comprised of atomic, independent "facts" which constitute the elements out of which larger hierarchies of facts are built.
  • These larger groupings of facts have logical forms defining their relationships.
  • The human mind contains thoughts (logical pictures) which represent the external “facts” comprising the world.
  • For those mental pictures to faithfully represent the external world, they must have the same logical form as those facts.
  • That logical structure is built from language propositions which he expressed in a highly structured and tailored manner.
These propositions share a "pictorial form" with the reality they represent. They are correct or incorrect to the extent that they faithfully model the logical form of that reality. If the world were differently structured from the forms of our logic, we would not be able to express it in language at all. They express in a scientific way the experiences of sense data (impressions of the world). Other types of statements involving math and logic convey no new information, but are only tautologies. That is the limit of what can be expressed with any certainty or meaning – everything else is beyond our ability to speak sensibly. Only "true propositions" about the world are sensible and meaningful.

In his words, “most of the propositions and questions to be found in philosophical works are not false, but nonsensical. Consequently we cannot give any answer to questions of this kind, but can only point out that they are nonsensical … it is not surprising that the deepest problems are in fact not problems at all”. In other words, much of what is debated in philosophical terms is nothing more than impressive but empty verbal gymnastics. He showed the limits of what could be discussed, leaving the rest to just be regarded in awe and mystery – the realm of poetry, religion, theater, relationships, and emotions. He did not denigrate those disciplines and experiences, but strongly believed that logical, philosophical thinking could not deal with them at all. The region of the inaccessible is important. It contains much of what we value in life. In fact, to him, these were the most important aspects of human life. However, philosophy has its limits, and his job was to define and clarify the bounds of what could be discussed within its sphere. What cannot be said can perhaps be shown, but "what can be shown cannot be said".

Much to the dismay, and sometimes delight, of other philosophers, he constrained and limited the range of issues to which their discipline could add constructive value. Instead, philosophy should be used to organize collections of propositions which represent the existence and non-existence of “states of affairs” in the world. This set of statements constitutes the entirety of natural science.

The founders of Logical Positivism (chiefly Carnap) took this to mean that only empirically verifiable sentences were meaningful, and on these grounds eliminated metaphysics, aesthetics, and ethics from their curriculum. This philosophy rejects metaphysics, instead emphasizing that the goal of knowledge is only to describe the phenomena that we experience, which we can observe and measure, and to attempt to do nothing beyond that.

Not only were earlier philosophers in error by confusing logical and grammatical forms, they were also trying to say the unsayable. The activity of philosophy should be to show the limits of what could be said by saying some things very clearly, then stopping, and pointing towards the mystical that goes beyond the sayable. He criticized Hume’s extreme skepticism. “Doubt implies a question, questions imply answers, and answers imply that something can even be said about the issue.” Hume tried to talk about the unsayable (God, value, skepticism). The solution to the nagging existential problems of life was the vanishing of both the question and the answer. There can be no answer to life when there is really no question that can be meaningfully asked. As complex as his writings were, the overall gist of them was that he attempted to rigorously prove that certain things could not be conclusively described or decided using language, among them ethics, aesthetics, all things mystical, and metaphysics. So, the question of “is there an external world or not” is simply one that he would dismiss from the realm of philosophy. It is not a question, but is built into his very first assertions in the Tractatus:
  1. The world is everything that is the case.
  2. What is the case (a fact) is the existence of states of affairs.
He concludes with a thought that encapsulates the entire book and is often repeated in other contexts, “what can be said at all can be said clearly, and what we cannot talk about we must pass over in silence”. So, after much difficult propositional calculus and highly structured statements, he advised us to stop trying to talk about things that we will never be able to decide. Some things must simply be observed in awe and admiration. The metaphysics of reality falls into that category – in his view, philosophy had nothing to say about it.

Philosophical Investigations
By the time this later work was published, Wittgenstein had changed his mind about much of what he had earlier said in the Tractatus, in particular the part concerning the need for a precise philosophical language. This was the beginning of his foray into “Ordinary Language" philosophy. The Tractatus showed that an isomorphism could exist between the "real" world and some ideal language. Philosophical Investigations showed that the quest for an ideal language and isomorphism is doomed. He had come around to the conclusion that there need be no isomorphism between words and reality - all that was required was that the words help the parties to a communication achieve whatever social goal they intended. He elaborated many other novel concepts regarding a new linguistic approach to thinking, but most of them don't bear on our central question here.

What is relevant is that he expounded the view that conceptual confusions involving our use of language are the cause of most problems in philosophy. “Philosophy is a battle against the bewitchment of our intelligence by means of language”. He no longer held the view that a highly tailored language would be needed, but that common language would do. By eliminating the confused tangle of mangled language, he was able to make philosophical problems just vanish. Words can trick us into mis-categorizing things. The grammatical form of the sentences in which philosophical questions are formed hoodwinks us into believing problems exist where there are none.

He cleared the table of philosophical double-talk by dismissing the majority of philosophical questions as simple misuse of language. He saw a human tendency to become trapped in the language we use to describe our ideas, to such a degree that the ideas become more important than the reality that they may or may not actually refer to. In many cases, associations of ideas in the mind that seem to have meaning, significance, and import don’t have external referents in the real world. But the strength and vividness of the false ideas are just as strong as that which would accompany coherent and meaningful ideas. Confused use of language disguises the underlying logical form, and renders most philosophical questions into perplexing nonsense and obscure linguistic puzzles. He felt that he had shown that most philosophical problems were caused by linguistic errors and generally faulty use of language, and that resolving those errors causes the original question to vanish (for example, how much energy was wasted in medieval scholastic debates exploring the various properties and abilities of angels?). When extended to the question of the existence of reality, the question itself doesn’t make sense. Simply because a question has legitimate syntactical form does not require that it actually have meaning and be capable of receiving a response. In other words, the logical form of the thoughts inspiring the question may not be isomorphic with any actual "state of affairs" in the world.

"Meaning just is use" — that is, words are not defined by reference to the objects or things which they designate in the external world nor by the thoughts, ideas, or mental representations that one might associate with them, but instead by how they are used in effective, ordinary communication. Language arises in social contexts and is oriented to achieving different social goals, depending on the situation. Language is meaningful if it accomplishes the goals of those involved in using it. To require precision and exact definitions in language is to become involved in a whole nest of philosophical problems whose origin is in the neurotic quest for certainty. So, when one asks “does reality exist”, both “reality” and “exist” have the common meanings that the asker and the asked ascribe to them. We don’t require concrete definitions of them as long as all parties to the communication understand each other. For example, when I describe the color "blue" to you, we both understand it to be the color of the sky, though we each have our own personal interpretations of that sense experience. It doesn't matter if they are or are not identical to each other, as long as we both agree on what is meant. In fact a private "definition" of blue would be utterly useless - we could not share it or talk about it. And a non-private, shared, definition of blue would have the same problems as the color, blue, itself - we could agree on the definition (e.g., a wavelength of radiation), but would have no way of confirming if it is seen the same way by everyone.

Philosophers had obscured this simplicity by misusing language and by asking meaningless questions. For example, one might ask “what is the meaning of life?” as if that were a question with an answer. One might instead ask the much simpler question, “what is the meaning of this stone?” What type of answer would satisfy such a seemingly straightforward question? What are the boundaries that would circumscribe a meaningful answer? There are none - a stone, and very likely "life", simply "is". Both can be put to various uses, or not at all. The strings of words comprising the two questions have the structure and lexical form of real questions, but both are nonsensical. Merely asking a question does not imply that a useful answer is forthcoming. If we can't describe the "meaning" of a stone, how much further are we from describing the "meaning" of life? If one were to ask the question, "what is 17 times 237?", the boundaries around a possible answer would be, "it is a 4-digit number, odd, ending in 9, and certainly non-prime", even before doing the arithmetic. However, if one were to ask the question, "what is 17 times blue?", there are no properties of a possible answer - the question makes no sense. I propose that "what is the meaning of life" is this type of question.

Regarding reality, Wittgenstein might respond, “not how the world is, is the mystical, but that it is at all.” As with all mystery, there is nothing meaningful that philosophy or even language can say about it other than to simply point to it in awe. He fully accepted reality, but concluded that all we can experience of it is the "framework" erected by the language and the mental models we use to understand it. We "see the picture through the frame".