Tuesday, September 7, 2010

9.0 Final Thoughts

Throughout the course of modern history, science and the scientific method have contributed substantially to the ever-shrinking scope of religious claims about the world, as well as the scope of religious authority. Science has done far more to explain the world around us and to improve our condition than the millennia of religion we have endured. It's not surprising that some people oriented toward the mystical, the supernatural, or the religious resent this view, and one of their responses is to deny that science is any different from religion or belief in the supernatural - that it is just one of many competing worldviews.

An important aspect of this flawed tactic is to insist that science doesn't really provide objective knowledge about the world and that it doesn't utilize a consistently reliable or proven method for acquiring knowledge. Instead, science is supposed to be based on guesswork, "theories," and false beliefs which are all inferior to "true" religions, like Christianity, and their revelations from God, as described in the Bible.

There is a curious contradiction here, because people who argue for this myth end up engaged in two efforts which should be recognized as contradictory: first, they have to denigrate science and argue that it really isn't as good as its defenders claim; second, they have to argue that science is actually a type of religion which relies on faith, just as their own religion relies on faith (a false accusation and a "tu quoque" fallacy all in one). This argument attempts to trivialize the methodologies and requirements of science and, by implication, their own religion, by arguing that both are "merely" faith-based, which renders both rather inferior methods of obtaining knowledge. It would be far more persuasive and clever to argue first that one's own religion is as good as science, and only then that science is also a religion.

However, we have seen in previous sections that science lacks the major characteristics of religion, so even going down this path is a waste of time. Religions and other mystical belief systems always fail when trying to attain the objectivity and reliability of science. The reason is that the advances made by science, the benefits of science, and the reliability of science cannot be matched at any level by any religion. Religions have claimed throughout human history that they have received special information from gods, but at no point did any of those gods explain how to utilize electricity, how to improve sanitation, the origins of disease, the building blocks of matter, the comings and goings of the seasons, and so forth. Much of this discovery was already underway during the earliest stirrings of science more than 2,000 years ago — the fathers of science in ancient Greece didn't even require a fully developed scientific method or a scientific community to make substantial progress.

Some argue that the average, non-technical person accepts what science says on "faith". It is true that few people are in a position to confirm the results of modern scientific experiments, so they have to accept what others say based on their experience and authority. But in the same way, we rely on experts to fix our cars, wire our houses for electricity, style our hair, and fix our leaky faucets. We do this not out of "faith" that they know what they are doing; we have done whatever due diligence we can and decided which of the experts in these fields we should trust to do the work. By the same token, we decide to trust experts on climate science, cosmology, evolutionary biology, physics, and all the other scientific disciplines. In principle, we could learn plumbing or auto mechanics, just as we could learn astrophysics, but we simply don't have the time, the inclination, or perhaps the talent to do so. Unlike with religion, anyone can, in principle, confirm scientific experiments on their own. The ability of others to repeat experiments (or to show that those experiments cannot be replicated) is one of the most distinctive attributes that defines the scientific method.

Moreover, most people can observe the practical impacts of what science says and thus don't need to conduct experiments to confirm that scientists are right. Not everyone is able to understand the theories behind how electricity operates, but everyone is able to witness the obvious and dramatic effects of electricity at work in their electric appliances.

Some religious believers might claim the same on behalf of their god(s), but there are many believers from many religions claiming the same about many different gods. Not all of those gods can exist, so not all of the claimed "effects" can be attributed to real gods. For every god who blesses Israel, there is another god who is failing Canaan. Everyone, however, uses the same electricity and sees the same effects of electricity. There aren't alternative denominations of "energy" with competing claims about what the "real" source of energy is. Thus the claims about gods and their effects do have to be taken on faith, but the claims of science — for example the science of electricity — don't need to be taken on "faith" at all - you would be insane to debate the reality of the science behind electricity.

I should add that religions don't re-evaluate their basic tenets. They don't put themselves and their doctrines on trial as part of their fundamental operation, as science does. They are not evidence-based. They make claims that cannot be verified or falsified - claims that can't be evaluated at all, but must simply be accepted or rejected. Their knowledge is revealed, not discovered empirically, and criticism of it is treated as an offense. To paraphrase an exchange heard during an evolution/creation debate, the creationist quipped that "his textbook was cheaper". The evolutionist struck back with, "perhaps, but that's because we update ours occasionally".

It ultimately boils down to this: when it comes to obtaining knowledge about the universe, are you going to trust empirical methods and logic, or revealed knowledge? Which will you rely on? Which do you rely on when crossing a busy street or when looking for your lost keys? Keep in mind that when people from different cultures and eras use the former (empirical) method, their findings agree. But there are thousands of different and conflicting versions of revealed knowledge. This stark difference, consistency and agreement versus inconsistency and incompatibility, should be very compelling.

As Steven Novella wrote in his Neurologica Blog:

What do you think science is? There's nothing magical about science. It is simply a systematic way for carefully and thoroughly observing nature and using consistent logic to evaluate results. Which part of that exactly do you disagree with? Do you disagree with being thorough? Using careful observation? Being systematic? Or using consistent logic?
... and ...
We should reject the "science is religion" gambit that Creationists and other anti-science proponents will not let go of. This is philosophically naive. Science does not require any worldview. It just follows a certain set of methods. You don't have to believe anything to do science. You just have to follow methodological naturalism. Philosophers specifically distinguish between methodological naturalism, which is just following the methods of science, and philosophical naturalism. With methodological naturalism, you assume "cause and effect" exists in the universe, that you can't invoke magic in scientific arguments. It doesn't require that you actually believe that there is nothing magical in the universe. Having that actual belief is philosophical naturalism, which many scientists do, in fact, subscribe to - but it is not required to successfully "do" science. Science is not about belief - it is about a set of methods. Religion is about belief. Science is categorically not a world view, it is not a religion.

Saturday, June 26, 2010

7.0 Social postulates / values of science

In a 1942 article on the ethos of science, Robert K. Merton described the scientific ethos as "that emotionally toned complex of values and norms which are held to be binding on the man of science." He was interested in studying the interactions between social and cultural structures and science. He arrived at a list of ideals and values that are dictated by the goals and methods of science, and which are binding on scientists operating in the wider scientific and social milieu. These ideals and virtues, widely accepted as desirable by most members of the scientific community, are so widespread as to be treated alongside the other fundamental assumptions of science discussed in this chapter. Scientists' behaviors are strongly influenced through these norms by sanctions and rewards, and the norms "are in varying degrees internalized by the scientist". As with the philosophical postulates regarding reality and inference, these values cannot be proved or disproved, but are built into the functioning of the community of scientists:

Communalism
Scientific knowledge is public knowledge; freely available to all. The results of research do not belong to individual scientists, but to the world at large.

Universalism
There are no privileged sources of scientific knowledge; the laws of science are the same everywhere and are independent of the scientists involved.

Disinterestedness
Scientists are unbiased; science is conducted in order to further human knowledge. They have no personal stake in the acceptance or rejection of data or claims.

Originality
Science is the discovery of the unknown; all scientific work must be novel, continually adding to the body of scientific knowledge.

Skepticism
Scientists take nothing on trust; knowledge, whether new or old, must always be scrutinized for possible errors of fact or inconsistencies of argument.

8.0 Why Science is not a religion

The corollary to the accusation that science relies on faith to the same degree as religion is that science IS the new high-tech religion of the 21st century. This facile criticism is easily countered. The following retorts are taken from “Science is Not a Religion: Why Science and Scientific Research are not Religions” by Austin Cline, on About.com

  • Belief in Supernatural Beings: The most common and fundamental characteristic of religion is belief in supernatural beings - usually, but not always, including gods. Few religions lack this characteristic and most religions are founded upon it. Does science involve belief in supernatural beings like gods? No - many scientists are themselves theists and/or religious in various ways while many others are not. Science itself as a discipline and profession is godless and secular, promoting no religious or theistic beliefs.

  • Sacred vs Profane Objects, Places, Times: Differentiating between sacred and profane objects, places, and times helps religious believers focus on transcendental values and/or the existence of a supernatural realm. Many scientists, godless or not, probably have things, places, or times which they consider "sacred" in the sense that they are venerated in some way. Does science itself involve such a distinction? No - it neither encourages nor discourages it. Some scientists may believe that some things are sacred, and others won't.

  • Ritual Acts Focused on Sacred Objects, Places, Times: If people believe in something sacred, they probably have rituals associated with it which are also sacred. A scientist who holds something as "sacred" may engage in some sort of ritual or ceremony. As with the very existence of a category of "sacred" things, however, there is nothing about science which either mandates such a belief or excludes it. Some scientists participate in rituals and some don't; there are no scientific rituals, godless or otherwise.

  • Moral Code With Supernatural Origins: Most religions preach a moral code which is typically based upon whatever transcendental and supernatural beliefs are fundamental to that religion. Thus, for example, theistic religions typically claim that morality is derived from the commands of their gods. Scientists have personal moral codes which they may believe have supernatural origins, but those are not an inherent part of science. Scientists also have professional codes which have purely human origins.

  • Characteristically Religious Feelings: Perhaps the vaguest characteristic of religion is the experience of "religious feelings" like awe, a sense of mystery, adoration, and even guilt. Religions encourage such feelings, especially in the presence of sacred objects and places, and the feelings are typically connected to the presence of the supernatural. Most scientists experience such feelings; often, it's a reason why they got involved in science. Unlike religions, however, these feelings have nothing to do with the supernatural.

  • Prayer and Other Forms of Communication: Belief in supernatural beings like gods doesn't get you very far if you can't communicate with them, so religions which include such beliefs naturally also teach how to talk to them - usually with some form of prayer or other ritual. Many scientists believe in a god and therefore probably pray; other scientists don't. Because there is nothing about science which encourages or discourages belief in the supernatural, there is also nothing about it which deals with prayer.

  • A Worldview & Organization of One's Life Based on the Worldview: Religions constitute entire worldviews and teach people how to structure their lives in relation to their worldview: how to relate to others, what to expect from social relationships, how to behave, etc. Scientists have worldviews, and there are common beliefs among scientists in America, but science itself doesn't quite amount to a worldview. It provides a basis to a scientific worldview, but different scientists will arrive at different conclusions and incorporate different elements.

  • A Social Group Bound Together by the Above: A few religious people follow their religions in isolated ways; more often than not religions involve complex social organizations of believers who join each other for worship, rituals, prayer, etc. Scientists belong to a variety of groups, many of which will be scientific in nature, but not all the same groups. What's important, though, is the fact that even these scientific groups are not "bound together" by all of the above. There is nothing in science which is even remotely like a church.

And from The Counter-Creationism Handbook by Mark Isaak come these counterarguments. Although originally phrased in the context of evolutionary biology, the same rebuttals are equally meaningful for any science:
  • No faith is required. Scientific knowledge is based on evidence that has been observed.

  • "Faith" in religion is different that "faith" in science. The two uses of the same word are not equivalent. Religious faith is better called "gullibility" (believing with no evidence and no logical backing). Religious faith is on the same footing as faith in Bigfoot, Leprechauns, UFOs, and ESP.

  • Religion explains ultimate reality and moral purpose. Science describes how things and processes work and creates theories that predict outcomes of future events and new discoveries.

  • Religion describes the place and role of humans within ultimate reality. Science does not attempt to make these types of value judgments.

  • Religions include reverence for and/or belief in a supernatural power or powers. Science does not. In the words of the great scientist Laplace to Napoleon, when asked why God did not appear in his famous work on the variations of the orbits of Saturn and Jupiter: "I had no need of that hypothesis".

  • Religions have a social structure (priests, deacons, congregation) built around their beliefs. No such social structure is built into science. Although there is a social element to science, it is no more nor less than in any other profession.

  • Religions impose moral prescriptions on their members. Science does not. Science has been used (and misused) as a basis for morals and values by some people (Hitler’s eugenics, justification for slavery, etc). These views, though invoking science, are themselves not science. Science cannot be held responsible for its misapplication.

  • Religions include rituals and sacraments. With the possible exception of college graduation ceremonies, there is nothing comparable in scientific studies.

  • Religious ideas are highly static; they change primarily by splitting off new religions. Ideas in evolutionary biology change rapidly as new evidence is found.

  • How can a religion not have any adherents? When asked their religion, many, perhaps most, people who believe in evolution and the scientific method will call themselves members of mainstream religions, such as Christianity, Buddhism, and Hinduism. None identify their religion as science. If science is a religion, it is the only religion that is rejected by all its members.

  • Science may be considered a religion under the metaphorical definition of something pursued with zeal or conscientious devotion. This, however, could also apply to stamp collecting, watering plants, or practically any other activity.


Calling science a religion renders the term, "religion", effectively meaningless. If "religion" is to retain anything resembling its common definition, it should be clear that science falls far short of having the requisite properties of a religion. It does share several traits with religion, but so do political parties, sporting fandom, and many other social organizations. Among those common properties are:
  • Evangelism (but this is a feature of politics, economics, sports, and many other social systems)

  • Meetings, hierarchies, orthodoxy

  • Occasional fanaticism

  • Dogmatism (sometimes)

  • Membership in organizations

But again, even though some elements are shared with religion, science does not share the key attributes (belief in a supernatural deity, faith without proof, rules for morality, holy books and legends, and all the rest). To extend the definition of religion to include science, political parties, sports fans, etc. would so dilute the term as to render it practically meaningless. By that definition, avid stamp collecting would be a religion.

Tuesday, April 27, 2010

6.7 Does Science Reveal or only Model Reality?

Some critics of science argue that it can't teach us about "ultimate reality" or show us "things in themselves", the absolute nature of all things. Instead, they say, it can offer only a thin, limited, and truncated view of a wider reality. We can only be aware of our sensations (phenomena), but the objects behind those sensations (the noumena, if they actually exist) will forever be beyond our reach. Some claim that access to the "true nature" of things comes only through other means - revelation, mysticism, emotion, faith, social movements, drugs, etc. They claim that their epistemologies teach deeper and more fundamental truths than those imparted by a rational and empirical study of the world. Whether the source is the Holy Spirit, religious ecstasy, Nirvana, Enlightenment, LSD experimentation, or Cosmic Consciousness, members of these groups each say they have personal, deep, and moving direct experiences of what they believe is some kind of Truth or "Ultimate Reality". They believe that this reality underlies the everyday common reality which is available to the senses.

We can stipulate that reality itself is not mind-dependent. But, by definition, our perceptions of it are. For this reason, we are forever separated from the world as it really is, because we are limited to knowing only our perceptions of it, the phenomena we experience. Though we can never really know "ultimate reality" (whatever that actually means) directly, and are limited to what our experience and perception give us, we can at least know that, somehow, "things in themselves" really do exist "out there". "Things in themselves" exist wholly outside our experience, and all we can say is that they exist. William Blake wrote,
"If the doors of perception were cleansed every thing would appear to man as it is, Infinite."
This is an expression of that transcendental sentiment. No doubt people are having intense and moving spiritual experiences. They are certain they really are in communication with some fundamental essence that escapes us in our normal daily pursuits. For many, these experiences define their being and their humanness. Such experiences can feel as real as, or even more real than, any other experience. However, there are several problems with the claims these people make.

First, each of these types of claims can't be shown to be anything other than personal mind-altered experiences. In fact, there is strong evidence that that is exactly what they are. Recent revelations in neuroscience demonstrate that the mind can deceive itself in strange and wonderful ways. This deception can be triggered by drugs, electrical stimulation, sensory deprivation, the power of suggestion, schizophrenia, dissociative disorder, hormonal imbalances, disease, tumors, emotional stress, seizures, brain injury, social influence, and bio-chemical malfunctions. Brain researchers can induce comas, trances, out-of-body experiences, false memories, anger, pain, joy, ecstasy, and a host of other cognitive/emotional experiences that appear utterly real and meaningful to the patient. Neuroscientists have identified brain regions that are responsible for making us feel as if we are in our bodies and that we are separate from the universe around us. If these regions are disrupted, it will result in a sensation of floating outside one’s body, or perhaps a sensation of being one with nature, or the universe, or some higher power. A strong feeling of God's presence, seeing into past lives, interacting with alien races, or experiencing the seeming unity of all things is a psychological/neurological experience - not an objective one. But, these experiences are still valuable to the person having them, and are probably quite important in the evolution and maturing of individual human consciousnesses. Simply naming a human experience doesn't explain it, nor does explaining it make it vanish. But understanding it for the neurological and psychological event that it is helps put it in context and aids in making sense of it in relation to other, similar phenomena that we do understand. Realizing that a person in the throes of religious fervor at a tent revival is participating in mass hysteria instead of actually talking to God is instructive, but it does nothing to reduce or dampen the experience of the participant. It does, however, bring the event back down to Earth.

Second, criticizing science for failing to reach a goal - showing us "true reality" - that it never set for itself is mostly a straw man argument. The assertion that science even attempts to teach this is certainly not shared by all scientists and, in fact, is not even a dominant view. It is true that Scientific "Realists" do think that there is a reality (i.e., actual "things") underlying what science describes and accesses. But a large majority of scientists (if they were even to consider the question) would probably fall into the camp of scientific "Instrumentalists". The Instrumentalists don't concern themselves with philosophical questions of ontology, "being", and ultimate reality. For them, science is just the tool they employ for obtaining knowledge and explaining how the world works. As Stephen Hawking said,
I don't demand that a theory correspond to reality because I don't know what it is. Reality is not a quality you can test with litmus paper. All I'm concerned with is that the theory should predict the results of measurements.
So, Hawking belonged to the "shut up and calculate" school often associated with Richard Feynman. Instrumentalists in science do not aim at or seek "truth" in a religious or philosophical sense, but treat science as a tool for providing increasingly accurate descriptions of the world. To them, science studies "phenomena" and does not attempt to deal with "noumena". The term "phenomenon" came into its modern philosophical usage through Immanuel Kant, who contrasted it with the noumenon (for which he used the term "Ding an sich", or "thing-in-itself"). Noumena, in contrast to phenomena, are not directly accessible to observation. For the purpose of this discussion, the noumena would correspond to that inaccessible ultimate reality.

Not only for instrumentalists, but for most scientists, all that science can really provide us is a descriptive and explanatory model that successfully makes predictions. We never know whether we are describing the way the world actually "is", or whether we have just come up with a way of understanding and thinking about the world that makes sense to us. Werner Heisenberg said:

What we observe is not nature itself, but nature exposed to our method of questioning.
The only thing we can really say about any scientific theory or model is how well it describes, explains, and predicts. We can't take the next step and say that it *is* the way nature really "is". This is true of all of our theories, and probably even more relevant the more abstract the theory is (for example, theories about quantum physics).

There is a lot of discussion about the seemingly uncanny ability of mathematics to describe and predict events in the world. An article written in 1960 by Eugene Wigner called "The Unreasonable Effectiveness of Mathematics in the Natural Sciences" eloquently presented the question of the relationship between mathematics and the physical world. In it he asked how it is that math is so capable of describing things that don't appear to be mathematical entities. One possible answer: because mathematics is an enterprise whose purpose is to describe and manipulate many types of logically consistent systems, and because our world is one such system, it should be no surprise that certain branches of mathematics can describe (and predict and explain) phenomena in our world. Is our world logically consistent? Of course - if it were not, it would implode in a giant flash of improbability!

Seriously, when our theories contradict each other (as with the wave/particle duality of light), it is an invitation to further research, not a threat to reality. These apparent internal contradictions are more likely to indicate inadequate or incomplete physical models than an actual incompatibility of reality with itself. Case in point - when Hans Christian Ørsted saw an electric current deflect a compass needle at right angles to the current, he didn't question the sanity of the universe, but concluded (correctly) that there were new laws yet to be discovered.

But we also have within mathematics many concepts that do not correspond to any real things in our world. In this view, the domain of concepts which mathematics can address includes and goes beyond the world we find ourselves in. A different response, advocated by physicist Max Tegmark, is that physics is so successfully described by mathematics because the physical world is completely mathematical, isomorphic to a mathematical structure, and that we are simply uncovering this bit by bit. In this interpretation, the various approximations that constitute our current physics theories are successful because simple mathematical structures can provide good approximations of certain aspects of more complex mathematical structures. In other words, our successful theories are not mathematics approximating physics, but mathematics approximating mathematics.
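As a small illustration of Wigner's point (my own example, not one from his article or from Tegmark): a single, very simple piece of mathematics describes several physically unrelated systems. The differential equation

\[ \frac{dN}{dt} = kN \qquad\Longrightarrow\qquad N(t) = N_0\, e^{kt} \]

describes radioactive decay (k < 0), unconstrained population growth (k > 0), and continuously compounded interest, even though atoms, organisms, and bank accounts have nothing material in common. The structure the equation captures - a rate of change proportional to the current amount - recurs throughout the natural world, and that recurrence is the kind of "unreasonable effectiveness" Wigner was asking about.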

Third, the term "ultimate reality" is not even agreed upon by those who use it. As mentioned in the previous paragraph, those who subscribe to the philosophical principle of noumena think that there are "things-in-themselves" that exist beneath the level of common perception, or the "phenomena" of the everyday world. They believe that what is observed of the world is only a surface description of some deeper reality that humans, perhaps, can't ever access. This view is, and has been, debated since the time of Kant. To assert that there is some fundamental essence that transcends what we observe is a philosophical exercise that is a matter of taste. By definition it can never be measured, because anything measured thereby becomes part of the concrete world of phenomena. But simply to say that it exists is not to prove anything. In other words, to ask what exists beneath the observable might not even be a meaningful question. "Ultimate reality" may be a fantastic concept that doesn't refer to any actual part of our universe. The term is carefully crafted to be unapproachable by scientific inquiry, thus making argument about it somewhat moot, except for the philosophically inclined. However, there is no prospect for near-term resolution of any outstanding differences of opinion on it.

Fourth, it is true that science only reports on what it can study, which is the natural world. It discovers facts, looks for patterns among those facts, establishes connections between phenomena, helps us derive theories, and allows us to make useful, correct predictions. It proves itself again and again as the best way to obtain knowledge about the universe. But simply because it recognizes and acknowledges limits regarding what it can study does not mean that faith, superstition, magic, or revelation can do better. In fact, they all perform quite terribly in this capacity. At most, their primary power is in deceiving their practitioners into believing things that are not true, as history has shown countless times. Undoubtedly, those who subscribe to these intuitive ways of gaining knowledge "feel" that they have gotten in touch with a deeper reality, but there is a tremendous distance between strongly feeling something to be true and that thing actually being true. Simply experiencing the sensation of certainty does not make the thing you feel certain about true.

Getting back to the supposed failure of science to get at "ultimate reality", which is not one of its goals in the first place, modern philosophers of science who embrace Scientific Realism argue that the theories of science are not perfectly true but only "approximately true". They don't perfectly reflect reality, but do a fair job of it. Approximate truth, which is sometimes called verisimilitude, is indispensable to contemporary scientific realists. If we say the Earth is spherical, or the sun is 93 million miles away, these are approximations. Strictly speaking, they are false statements because they are not exactly right. Yet they are unarguably approximately correct. The models we use are not intended to be exact replicas of reality, but stylized representations that abstract away irrelevant details while retaining the logical form of the reality being modeled - for example, the Ideal Gas Law, which ignores the size of gas molecules and the forces between them, or the frictionless surface in many first-year physics textbooks, which removes friction from the discussion so that other forces can be examined. So, we could say that Copernicus' heliocentric model is false in the sense that the planets don't circle the sun in perfect circles, but scientific realists argue that it is approximately true, in a way that Ptolemy's geocentric model was not. It is also approximately true that the earth is flat (at least out to the horizon), and that Newtonian mechanics is an approximately true theory (at low velocities).
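A standard textbook illustration of this kind of approximate truth (added here as an example; it is not discussed in the sources above): relativistic kinetic energy,

\[ E_k = (\gamma - 1)\, m c^2, \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \]

expands for small v/c as

\[ E_k = \tfrac{1}{2} m v^2 + \tfrac{3}{8}\, \frac{m v^4}{c^2} + \cdots \]

so the familiar Newtonian expression \( \tfrac{1}{2} m v^2 \) is recovered as the leading term. Newtonian mechanics is strictly false, yet approximately true whenever v is small compared with c, which is exactly the sense of verisimilitude described above.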

There have been cases where the accepted theory was a good describer and predictor of phenomena but was not even approximately close to "truth". Examples are the theories involving phlogiston, caloric, the ether, and geocentrism. The old sunrise/sunset model that had the sun moving around the earth (i.e., the Ptolemaic model) was not "approximately true", though sunrises could be predicted to the minute using that model. Scientific theories, like all logical models, sometimes fail in the attempt to reflect the logical structure of the reality they refer to, while still succeeding in their ability to describe the phenomena and predict future occurrences of them. But when a theory describes, predicts, and explains, and the model appears to actually bear a structural similarity to the outside world, we can reliably say that the theory is "probably" right in being an approximation of reality.

Science recognizes that some (many?) of its theories will be completely overturned and replaced with something that is not a refinement of, but a revolutionary replacement of, old theories. This is why we frequently stress that theories are both provisional (subject to change with more evidence), and (at best) only an approximation of the reality they model.

William of Ockham, who gave us the valuable Ockham's Razor, believed that the world was composed of simple, unrelated objects on which we impose order through mental abstractions. Wittgenstein echoed this when he wrote that what we call "Laws of Nature" are nothing more than an artificial order imposed by man on nature - they are not given by nature to man. He said, "at the basis of the whole modern view of the world lies the illusion that the so-called laws of nature are the explanations of natural phenomena." In his view, science works out conceptual schemata, or "laws", to describe some aspect of nature. In nature we find loose and unconnected facts. Metaphorically, the schemata resemble a grid we draw on a rough surface, the surface being the "reality" we are attempting to explain. The rough surface contains the "facts" of that reality. Our imposed grid can only approximate the true complexity of the surface. Given knowledge of how the grid is set up, we can deduce other parts of the grid, but we cannot deduce any surface features. We use experimentation and empirical methods to derive our grids of understanding, and any one of several grids may fit a given surface (these would correspond to competing theories). Occasionally one grid is thrown out and replaced with another, which is what occurs during scientific revolutions or paradigm shifts. The laws of nature which were supposed to describe the way nature had to work did not really describe nature at all, but just the framework or grid we impose on nature. In fact, laws of nature are just necessary conditions which must hold if our grids are to provide fruitful scientific theories. The grids are not "true" a priori; they do not exist in nature independently of our arbitrary construction. We never see "the picture" (i.e., reality), but just "the frame" (our language, which describes the picture). To Wittgenstein, reality is composed of simple objects thrown together to form "states of affairs" - everything which "is the case". All "states of affairs" in the world are totally independent. You cannot infer one state of affairs from another. There is no logical law of cause and effect. To his mind, no causal nexus exists in nature. Cause and effect may be useful to us, but are not provable or even necessary. Induction (accepting the simplest law that can reconcile our experiences) has no logical justification, but a psychological one. In other words, we would go insane without laws of nature.

So, Wittgenstein is saying in a slightly different way that we humans cannot experience "ultimate reality", but can only create a language that "points" to reality. He is highly critical of scientific explanations, saying not only that they fail to approximate "reality", but that they have value only in supporting each other. Our laws of nature cast the form that any description of the world must take. They tell us nothing about the world. We can infer some things about the world from the fact that it is more easily described by one explanatory system than another. But we are forever separated from direct experience of it.

Not all philosophers agree with Wittgenstein. For example, Francis Bacon, who helped introduce what we now call the scientific method, wrote, "Nature, to be commanded, must be obeyed." He knew that in order to command nature, one must act according to its rules. His statement, "Reality is Absolute", recognizes what is called the "primacy of existence". This means that reality is not subject to wishes, whims, prayers, or miracles. If you want to change the world, you must act according to reality. Nothing else will affect reality. To him, reality was what we dealt with every day, not some abstract ineffable conception.

It should be pointed out that the best that philosophy, religion, and mysticism were able to do in regard to the task of revealing Ultimate Reality was to divide the world into the Classical Elements of Earth, Air, Fire, Water, and Quintessence ("the fifth element"), with a little "Spirit" to complete the mix. This was as far as they could take the project of discovering how the universe was constructed and what its constituent elements were. It was a primitive effort, and it missed the mark by a wide margin. Science has uncovered deeper and deeper insights into levels of actual reality. Regardless of whether it exposes some loosely defined "ultimate reality", it does disclose much more about common reality than any competing religion, philosophy, or mysticism ever has. Extending the limits of human senses with scientific instruments has given the human race what amounts to collective ESP. We can see across the universe and back in time, visualize individual atoms or entire galaxies, listen to the Earth vibrate, examine the surfaces of stars, talk to someone across the world, fly to other planets, and look at creatures swimming at the bottom of the deepest ocean trench. Technology and science have led to the discovery of living cells, molecules, DNA, the true elements of the periodic table, atoms, subatomic particles, and every other aspect of reality that no philosophy or religion ever dreamed of. This should be an embarrassment to those who argue that their non-scientific approaches will show us a reality that science missed. They had their opportunity, and all they offered was the Classical Elements and a lot of talk about some other ineffable substance called "ultimate reality" that was poorly defined then, and is still as murky as ever, even after several millennia.

If we had left it to these competing epistemological systems, our deepest understanding of reality would be the same as it was in Babylonian times. Science, indeed, may be nearing the discovery of whatever might be meant by ultimate reality. It is already at the point of dissolving subatomic particles into pure energy - it may well be near the limit of reductionism when the very substance it studies vanishes in a flash of light. It is not armchair philosophy or meditation that will eventually reveal the true nature of reality, but the Large Hadron Collider or one of its descendants.

Science and technology will not help us learn the life lessons that make us better people - how to love our wives and families, to approach life with equanimity, to be generous, joyful, peaceful, and kind. Those are lessons learned from other sources. But when the questions are questions of fact, of what "is", the revelatory forms of knowledge will always fall short.

Saturday, March 20, 2010

6.5 John Oakes’ Assumptions of Science

These ideas were aggregated from a series of presentations that can be found at http://www.grossmont.edu/johnoakes/, the website for John Oakes at Grossmont College in El Cajon, California. He, I am sure, is not the originator of this list. But he did a very good job of compiling them into a single presentation which I summarize here. In his view, they express the core set of basic assumptions that both science and rational empiricism embrace:
  • The rules of logic are valid tools for learning and understanding.
  • The world is real. The physical universe exists.
  • Human senses are reliable.
  • The real world is knowable and comprehensible.
  • There are laws that govern the real world. The universe is orderly, having regularity, pattern, and structure. Laws of nature describe that order.
  • Those laws are knowable and comprehensible. The principles that define the functioning of the universe can be discovered. Nature is understandable.
  • Those laws don't radically change according to place or time, since the early stages of the big bang. They are universal.
  • All phenomena have natural causes. Scientific explanation of human behavior opposes religious, spiritualistic, and magical explanations.
  • Language is adequate to describe the natural realm.
  • Mathematical rules are descriptive of the physical world.
  • Unexplained things can be used to explain other phenomena (e.g., gravity is thus far unexplained, but it is used to explain the movement of planets and the bending of light).
  • Observable phenomena can provide information and knowledge about unobservable phenomena (induction).
  • All ideas are tentative, potentially changed by new information. This harks back to the fourth of Newton's Rules of Reasoning in Natural Philosophy: "Propositions deduced from observation of phenomena should be viewed as accurate until other phenomena contradict them." Unless proven otherwise, the best theory that successfully explains the facts should be accepted, keeping in mind that all theories are provisional, subject to revision given new evidence.
  • Nothing is self evident. Truth claims must be demonstrated objectively.
  • Knowledge is derived from acquisition of experience, empirically, through senses directly or indirectly.

6.6 Norm Levan Panel on Intelligent Design

When you compare the list in the previous section to this next one, you will notice quite a lot of overlap. Even after examining the assumptions presented in just these two sections, one can see a common thread emerge: with the use of empiricism, informed by logic and rational thought, there appear to be few if any obstacles to acquiring an understanding of the universe. The Norm Levan Panel is part of a secular humanist research facility based in Bakersfield, CA. It investigates issues related to Intelligent Design, Evolution, and the conflict between religion and science. These assumptions were presented during a forum held at Bakersfield College on April 21, 2006:
  • There is a reality independent of us or our viewpoint
  • Nature follows fundamental rules and laws
  • Humans have the ability to figure out rules of nature
  • Peer review is critical to filtering out human biases
  • Objective observational experiences are necessary for advancing knowledge of reality
  • Scientific method combines rationalism's deductive logic with empiricist's inductive logic based on observational experience
  • Invoking the supernatural is a dead end for further inquiry. Science cannot test supernatural explanations, since they are unfalsifiable, unverifiable, and can be altered to fit any situation post hoc.
I cannot completely agree with the last of the bulleted items. There is nothing inherently untestable about supernatural claims. What are untestable are supernatural causes presented as immune from disproof. Claims of ESP or faith healing, which rely on supernatural powers, can certainly be tested. But supernatural causes that can transform to fit any outcome, that elude falsification, or that defy testing are, by definition, unscientific and fall outside the realm of the scientific method. They may be true or not, but science is not equipped to find that out.

So, the elements in these catalogs of assumptions underlying the scientific method and empirical inquiry revolve around assertions that reality is objective and consistent, that humans have the capacity to perceive reality accurately, and that rational explanations exist for elements of the real world. These assumptions are based in naturalism, logic, and empiricism, which provide a framework within which science can be performed.

Biologist Stephen Jay Gould offered two additions that augment these lists: 1) uniformity of law and 2) uniformity of processes across time and space. Both must be assumed before you can proceed as a scientist doing science.

Saturday, February 20, 2010

6.4 Falsifiability vs Verifiability

Karl Popper, the well-known modern critic of Logical Positivism, wrote The Logic of Scientific Discovery in the 1930s. In it he promoted the revolutionary idea that the Logical Positivists' requirement of verifiability was too strong a criterion for science, and should be replaced by a criterion of falsifiability. The Positivists held that statements about the world are meaningless and unscientific if they cannot be verified.

To the Logical Positivist, the technical term, "Verification", has a very precise meaning (though different Positivists have had slightly different definitions of it). Generally it indicates that a statement is meaningful only if it is either empirically verifiable or else tautological (i.e., such that its truth arises entirely from the meanings of its terms). "Verifiability" requires that a statement be logically entailed by some finite set of observation reports. Later Positivists, having abandoned this view, required of a verifiable statement only that it be made evident or supported or rendered probable by the relevant set of observations.

Popper, on the other hand, argued that verifiability was more a requirement for "meaning" than for science. He explained that there exist meaningful theories that are not scientific, and that a criterion of meaningfulness is not the same as a criterion of demarcation between science and non-science. Popper proposed that falsifiability was the correct rule for this use because it did not invite the philosophical problems inherent in verifying via induction. It allowed for statements from the physical sciences which seemed scientific but which did not meet the more stringent verification criterion. In other words, it is more difficult to construct strictly verifiable hypotheses than it is to devise falsifiable ones, and for this reason the criterion of verifiability excludes much of what we would consider to be real science. If verifiability were the criterion, then the targets which science could address would be far more constrained.

One of the main criticisms of logical positivism was that its own principle of verification, which held that a statement is only meaningful if it can be empirically verified, was self-defeating. The Positivist "Verification Principle" was central to the Positivist project of demarcating scientific knowledge from non-scientific or metaphysical claims. The problem with the Verification Principle is that it cannot be empirically verified itself. This means that the principle fails its own test for meaning and is thus rendered meaningless according to the Positivist's own criteria. This problem is known as the "verification principle's self-refutation" and was pointed out by several philosophers, including Popper and Quine, who argued that the principle was either trivially true or not at all true.

This critique undermined the Positivists' claim to have found a solid foundation for scientific knowledge and contributed to the decline of the movement. However, it is important to note that this was not the only reason for the demise of logical positivism. Other factors, such as the emergence of new scientific theories, critiques of the Positivists' understanding of language and meaning, the emergence of Quantum Theory (and the "Uncertainty Principle"), and the social and political changes of the time, also played a role.

To be clear, just because something is "falsifiable" does not mean it is false. Rather, it means that if it is false, this can be shown by observation or experiment. Popper used falsification as a criterion of "demarcation" to draw a sharp line between theories that are scientific and those that are unscientific or pseudo-scientific. He was motivated by frustration with theories he considered unscientific yet popular at the time - Marxism and Freudian psychoanalysis - and by his great admiration for Relativity Theory. He saw the one set of theories as qualitatively different from the other. Marx and Freud were making claims that were fundamentally incapable of being disproved, while Einstein's claims were very clearly capable of disproof. This difference is the essence of falsifiability. It is useful to know whether a statement or theory is falsifiable, if for no other reason than that it provides us with an understanding of the ways in which one might assess and test the theory. One might at the least be saved from attempting to falsify a non-falsifiable theory, or come to see an unfalsifiable theory as unsupportable.

Popper claimed that if a theory is falsifiable, then it is scientific; if it is not falsifiable, then it is not open to empirical test and is therefore not a meaningful scientific issue. This puts most (but, interestingly, not all) questions regarding God and religion outside the domain of science. Falsifiability also circumvents the debate over whether the domain of science encompasses only the "natural world" as opposed to the "supernatural". Instead it frames science within the bounds of a methodology - science deals with hypotheses that can be falsified.

Falsifiability certainly ranks as one of the most important elements in the modern conduct of science. Its superiority to verifiability results from the fact that no number of positive experimental outcomes can ever absolutely confirm a scientific theory, but a single counter-example is decisive: it shows that the theory being tested is false, or at least incomplete. Instead of saddling scientists with the impossible task of providing absolute proof, a theory is considered tentatively "true" if ample opportunity and means have been provided to disprove it, but no one has been able to do so.

Popper demonstrated his position with the example of the rising sun. Although there is no way to prove that the sun will rise every morning, we can hypothesize that it will do so. If it failed to rise on even a single morning, the theory would be disproved. Barring that, it is considered provisionally true. The longer a theory retains this provisional status, and the more attempts are made to test it, the greater its claim to firm truth. The "sun-will-rise" theory has been well tested many billions of times, and we have no reason to anticipate that circumstances will cause it to stop holding. So we have very good reason to believe that this theory represents reality. The argument has some weaknesses (primarily that it is not deductively ironclad), but because no stronger proof suggests itself, because it is consistent with other information we have, and because it is pragmatically useful, it remains a very good operating theory.
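The logical asymmetry Popper relied on can be sketched in a few lines of code (purely my illustration; Popper of course offered no such program). A universal claim like "all swans are white" can be overturned by one observation but never conclusively proven by any finite number of them:

    # Hypothetical sketch of the falsification asymmetry (illustrative only).
    def test_universal_hypothesis(observations, predicate):
        """Return 'falsified' at the first counterexample, else 'not yet falsified'.
        No matter how many observations pass, the best available verdict is
        'not yet falsified' - provisional acceptance, never proof."""
        for obs in observations:
            if not predicate(obs):
                return "falsified"        # a single counterexample is decisive
        return "not yet falsified"        # provisional status only

    swans = ["white"] * 1_000_000 + ["black"]   # one black swan at the end
    print(test_universal_hypothesis(swans, lambda colour: colour == "white"))
    # prints "falsified": a million confirmations count for nothing against one exception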

However, critics have legitimately pointed out practical problems with the straightforward use of falsifiability to test theories. The basic idea that Popper proposed is that a scientist proposes a theory, and researchers test the theory in an attempt to find confirming and/or contradictory evidence. If it can be falsified by reliable and reproducible experimental results, that theory must be abandoned.

Thomas Kuhn argued that the actual practice of science doesn't follow this pattern at all. He pointed out that in the history of science there have been several famously incorrect conclusions reached by applying this standard. For example, when it was discovered that the orbit of the planet Uranus did not follow the path predicted by Newtonian mechanics, the observation appeared to falsify Newton. However, the desire to retain Newton's laws was so strong (after all, they were coherent with so many other observations and theories) that post hoc explanations were introduced to save them - in this case, another planet was posited even further out than Uranus. This actually turned out to be the case (Neptune was discovered some years later). But it is usually considered a weakness in a theory to have to postulate ad hoc changes simply to save it from the facts.

A similar problem was encountered by Lord Kelvin in his attempt to falsify claims of great antiquity for the Earth's age. He was a devout Christian who objected to Darwin's new Theory of Evolution, and was very motivated to demonstrate that not enough time had passed for evolution to have occurred. He calculated how long the Earth would have taken to cool from its early (assumed) molten state to its present temperature. This figure (roughly 20 to 100 million years) happened to agree very well with his other calculation, for the age of the sun. Our planet could certainly not be older than the sun, and he reckoned that if the sun were made of even the highest-grade coal, it could have been burning at its current rate for only a few thousand years. He added in gravitational contraction as an alternate source of heat and arrived at the same range - 20 to 100 million years. Thus, he believed he had falsified the Theory of Evolution, which requires much more time than that. Of course, if we alter the assumption concerning the source of the sun's heat (e.g., from coal to fusion), it allows a far greater age for both the sun and the Earth. To Kelvin's credit, he did recognize this possibility when he wrote,

"inhabitants of the earth cannot continue to enjoy the light and heat essential to their life for many million years longer unless sources now unknown to us are prepared in the great storehouse of creation."

Another great example is the Theory of Evolution. The current "Modern Synthesis" is quite different from the version first proposed by Darwin, though it shares many of the fundamentals. It incorporated Mendelian genetics in place of Darwin's version of inheritance, and was improved by our increased understanding of DNA and molecular biology. The theory has changed over time in the face of new information that showed limitations, weaknesses, and actual errors in the original theory. It was not discarded, but improved.

The question is, though, how many times are we allowed to move the goalposts and alter theories, and the background assumptions of those theories, after they have failed? The addition of ad hoc "auxiliary propositions" can weaken the original theory by erecting a shaky scaffold of special cases around it. But even Popper admitted that the "naïve falsification" he proposed has to be flexible enough to bend with necessity:

"Some genuinely testable theories, when found to be false, are still upheld by their admirers—for example by introducing ad hoc some auxiliary assumption, or by reinterpreting the theory ad hoc in such a way that it escapes refutation. Such a procedure is always possible, but it rescues the theory from refutation only at the price of destroying, or at least lowering, its scientific status."

I question his assertion that the introduction of ad hoc assumptions inevitably weakens or lowers the status of a theory. In the examples above involving Uranus, the sun's age, and Evolution, the addition of new considerations enriched the theories and eventually led to a clearer understanding of the solar system and of life. The modified theories were actually stronger than the original ones.

As attractive and seemingly fail-safe as Popper's falsifiability criterion may have appeared, especially when contrasted with the most competitive alternative of the time - the Logical Positivists' verifiability criterion - it is clearly not a panacea. It has some serious limitations and weaknesses. Popper's position was basically that no amount of data could ever completely prove a theory, but that even a single piece of counter-evidence is sufficient to disprove it. It is a wonderful guideline, like Occam's Razor or "measure twice, cut once", but it is not useful in all scientific endeavors. Although the naïve initial temptation upon learning about it is to apply it indiscriminately, it turns out not to be universally applicable.

The first problem concerns the assertion that no amount of data can confirm a theory. This is simply not how science is actually practiced. Overwhelming and consistently supportive data boosts confidence in a theory to the point where it is accepted as a practical fact, such that to dispute it would be contrary and perverse. No one seriously entertains the idea that the law of gravity might fail (e.g., that apples may start falling upward tomorrow). Scientists usually don't need to confirm a theory one hundred percent in order to trust it and use it as if it were true.

As Kuhn described in his work, scientists do not discard a theory as soon as an experimental observation contradicts the theory. That contrary evidence would need to be reproduced several times, and other similar experiments would need to be done to probe the potential weakness and boundaries of the problem area. There could have been flaws in the experimental methodology or the analysis of results, or maybe the theory just needs a minor adjustment to accommodate the new data.

Another reason why the importance of falsification has declined is that much of modern science is model-based rather than hypothesis- and theory-based. Doing science with models rather than theories doesn't really lend itself to falsification, since there are no experiments being conducted to isolate behaviors that will yield evidence supporting or contradicting a hypothesis. Of course models are incomplete and simplistic compared to the complex physical processes they represent; there are bound to be errors in them. Data that conflicts with a model doesn't necessarily imply that the model should be discarded, but more likely that it needs additional refinement to address aspects of reality that were left out or dealt with incorrectly. In fact, discovering and repairing flaws in a model adds deeper understanding of the real-world phenomena it attempts to replicate. Climate and weather models, molecular models, economic models, cosmological models, and the rest are not usually thrown out when reality conflicts with them - instead they are enhanced or modified to incorporate the new information, making them stronger, more accurate, and more representative. Sometimes older models become so ragged and jury-rigged that it makes more sense to discard them and begin again with a new approach.

One last problem with falsification is that much of science does not involve testing theories at all. Materials science, chemistry, biology, computer science, and other fields include activities that involve neither falsification nor verification - they are making things: new materials, molecules, pharmaceuticals, software, and devices. There is nothing to falsify, so Popper's method is simply irrelevant to these legitimate scientific pursuits.



Duhem-Quine Thesis

The Duhem-Quine thesis adds another objection to Popper's criterion of falsifiability (sad to say). Falsifiability works well in many cases, but it has at least one serious flaw, which Duhem and Quine identified. They assert that no hypothesis entails predictions about an empirical observation on its own, because the hypothesis is always embedded in a large collection of supporting assumptions, theories, and auxiliary hypotheses. The thesis states that it is impossible to test a hypothesis in complete isolation, because an empirical test of the hypothesis requires one or more of these background assumptions, the ultimate background assumption being that we can rely on the rules of logic at all. The hypothesis being tested cannot be completely segregated from the assumptions that support it. Instead, the consequences of the hypothesis rest on background information, and that information must itself be tested and shown not to be false - it must be accounted for. And those background assumptions may themselves depend on other background assumptions. If your experiment on the primary hypothesis generates a negative result (i.e., you appear to have falsified the hypothesis), but you have not accounted for all the background assumptions (ad infinitum), you really can't draw any conclusion. Your hypothesis may indeed be wrong, or the background assumptions may have problems which invalidate the falsification. The case involving the age of the Sun (above) is an actual example of this - Kelvin's background assumption about the source of the sun's heat was wrong, as was his additional assumption about the rate of the Earth's cooling. So, although Kelvin thought he had falsified evolution, he had done nothing of the sort. Further, a discrepancy between what a hypothesis predicts and the actual observed data does not necessarily falsify the hypothesis, because there may have been flaws in how the data itself was collected or measured.

The thesis can be expressed using symbolic logic:
H → E
This says that "If H, then E", where H is the hypothesis and E is evidence, or an observation expected if the hypothesis is true. That is, if the hypothesis is true then we should see the evidence. By the logical rule of modus tollens,
¬E → ¬H
This says that if we do not observe E, then H is false. In other words, if we don't observe the evidence when running an experiment, then the hypothesis has been falsified. For example, say that H is the hypothesis that water boils at 100°C. You have a pot of water you intend to boil. If you heat the water past this temperature and it does not boil, then you have falsified the hypothesis. But this assumes quite a lot: for example, it assumes that you are at one atmosphere of air pressure, and that the water is pure and unadulterated. But what if you are at two atmospheres of pressure, or what if the water contains a contaminant, such as antifreeze, that raises the boiling point? The background assumptions have been violated. So a better expression of the experiment is:
(H & A) → E
This means that H, along with some background assumptions A, implies E. So, if you don't observe E, then the conjunction (H & A) is false:
¬E → ¬(H & A)
This says that the combination of H and its background assumptions, A, is false. A is not just a single assumption, but many (such as that we are at one atmosphere, that the water is pure, that the thermometer is well calibrated, etc.). So, A is really (A1 & A2 & A3 & ... & An), where each of the A's is a different background assumption. So now we have:
(H & (A1 & A2 & A3 & ... & An)) → E
and also:
¬E → ¬(H & (A1 & A2 & A3 & ... & An))
The above expression, ¬(H & (A1 & A2 & A3 & ... & An)), is the same as any of these:
¬H | ¬(A1 & A2 & A3 & ... & An)
¬H | (¬A1 | ¬A2 | ¬A3 | ... | ¬An)
¬H | ¬A1 | ¬A2 | ¬A3 | ... | ¬An
This means that if you don't observe E, then either the hypothesis, H, is wrong, or one or more of the background assumptions, (A1, A2, A3, ..., An), is wrong, or some combination of them is wrong.
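
Because the logic here is purely mechanical, it can be checked by brute force. The following is a minimal Python sketch added for illustration (the choice of exactly three background assumptions is arbitrary); it enumerates every truth-value assignment and confirms both the De Morgan equivalence above and the fact that a failed prediction only indicts the conjunction as a whole, never H by itself:

    from itertools import product

    def implies(p, q):
        # Material implication: p -> q is false only when p is true and q is false.
        return (not p) or q

    # One hypothesis H plus (purely for illustration) three background assumptions A1..A3.
    # First check: ~(H & A1 & A2 & A3) is equivalent to ~H | ~A1 | ~A2 | ~A3
    # for every truth-value assignment (De Morgan's law).
    for H, A1, A2, A3 in product([True, False], repeat=4):
        lhs = not (H and A1 and A2 and A3)
        rhs = (not H) or (not A1) or (not A2) or (not A3)
        assert lhs == rhs

    # Second check: given (H & A1 & A2 & A3) -> E and a failed prediction (E is false),
    # modus tollens only tells us the conjunction is false; any single conjunct,
    # including an auxiliary assumption, could be the culprit rather than H.
    for H, A1, A2, A3, E in product([True, False], repeat=5):
        if implies(H and A1 and A2 and A3, E) and not E:
            assert not (H and A1 and A2 and A3)

    print("Both checks pass for every assignment.")

The second loop is the Duhem-Quine point in miniature: nothing in the failed prediction tells us which conjunct to blame.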

So, as frequently as the idol of "falsifiability" is honored in the context of science, a "naïve falsification" approach truly is insufficient. Serious researchers must also take background assumptions into account to ensure that they, too, have strong support. But all is not lost. Obviously, science still occurs, experiments are run, hypotheses are falsified, and progress is still made. Overall, this critique of Popper's method has been healthy for science. Researchers are forced to take less for granted in their assumptions, do a thorough job of supporting their underlying assumptions, and check their experimental methods to guard against possible errors in auxiliary assumptions. The reliance on background assumptions cannot be eliminated, but their destabilizing influence can be reduced to, hopefully, manageable levels. The process of justifying beliefs and assumptions can only begin once a number of precursor assumptions are independently justified. Some of these fundamental assumptions must be accepted as self-evident if they cannot be justified, because they comprise the frame in which justification takes place.

Monday, February 8, 2010

6.3 Ockham’s razor and the Law of Parsimony

I have made several appeals to Ockham's razor so far in this paper. It is one of the most widely referenced basic principles of science, empiricism, and reason, being one of the few that people who don't actually study this field are familiar with. It is a heuristic principle that has proven immensely valuable over the long history of science, as well as in everyday living. Also called the "Law of Parsimony", it is succinctly expressed as "entities should not be multiplied beyond necessity" - or, in modern English, "the simplest explanation tends to be the correct one". This is not a mystical revelation, but a guideline that has been borne out in case after case. Simply stated, nature tends to solve problems using the least energy and complexity that will suffice, taking the shortest and most direct path available. Just as water flows downhill, and uranium decays into more stable, lower-energy nuclei, physical systems tend toward the state of lowest energy along the path of least resistance. All of these phenomena, summed up, seem to reflect the overall tendency of nature to "prefer" (pardon my anthropomorphizing) the simplest course to an outcome.

Ockham's razor is not a scientific theory, nor is it a law of nature. It is a guideline - a rule of thumb. It is a pattern that frequently fits the turn of events. However, things don't always work out according to it, and it does not compel us to always choose a particular explanation over another. There have been many cases where the more complicated explanation was correct. The mind-boggling number of subatomic particles is by no means a simple explanation for the existence of matter; it is a far more complex theory than the simple atomic theory that required only protons, neutrons, and electrons. Mendeleev's periodic table with its dozens of elements is much more complicated than Aristotle's five elements (earth, air, fire, water, aether). The theory of evolution is far more complex than "god just created everything as you see it today". In each of these cases, the more complex theory turned out to be correct.

But in most cases, the simpler explanation does tend to be the right one. Example: if a dog owner comes home to find the trash can tipped over and trash scattered on the floor, possible explanations are that the dog tipped over the trash, that someone broke into the house and rummaged through it, or that a poltergeist was responsible. Most of the time, blaming the dog would be the correct choice. This guideline has been stated in many ways by many different people in different times and places. Aristotle wrote in one of his essays:

"We may assume the superiority "ceteris paribus" (i.e., all things being equal) of the demonstration which derives from fewer postulates or hypotheses."
John Duns Scotus preceded Ockham in proposing this rule in the late 1200's:
"Plurality is not to be posited without necessity. What can be done with fewer would in vain be done with more."
Thomas Aquinas, also in the late 1200's, wrote:
"If a thing can be done adequately by means of one, it is superfluous to do it by means of several; for we observe that nature does not employ two instruments where one suffices."
Galileo, in the course of making a detailed comparison of the Ptolemaic and Copernican models of the solar system, maintained that
“Nature does not multiply things unnecessarily; that she makes use of the easiest and simplest means for producing her effects; that she does nothing in vain, and the like”
Isaac Newton proposed his four “Rules of Reasoning in Philosophy”, the first of which dealt directly with Ockham's Razor, though not by that name. This is his somewhat anthropomorphized, teleological version of Ockham’s Razor:
"We are to admit no more causes of natural things such as are both true and sufficient to explain their appearances. To this purpose the philosophers say, that Nature does nothing in vain, and more is in vain, when less will serve; for Nature is pleased with simplicity, and affects not the pomp of superfluous causes.”
The guideline is named for William of Ockham, a 14th century Franciscan friar, although most agree that the exact formulation attributed to him does not appear in his surviving writings. The words he supposedly wrote were:
"Entia non sunt multiplicanda praeter necessitatem" (entities must not be multiplied beyond necessity).
Bertrand Russell, much later in the 20th century, offered a version:
"Whenever possible, substitute constructions out of known entities for inferences to unknown entities."
Kant, in the Critique of Pure Reason, proposed his version:
“Rudiments or principles must not be unnecessarily multiplied (entia praeter necessitatem non esse multiplicanda).”
He argued that this is a regulative idea of pure reason which underlies scientists' theorizing about nature. This common-sense mindset appears and reappears throughout western thought. It probably also occurred to Eastern and Arabic scholars, though I have found no references to such independent origins. In any case, its wide distribution and persistent popularity testify to its enduring value. Although our explanations sometimes run counter to Ockham's Razor, it is a valuable rule to keep in mind. It can help bring us back to reality when we are tempted to engage in complex flights of fancy in the face of confusing situations. It probably leads us in the right direction more often than not. But, as noted, we should not be enslaved by it; we should use it as one of our tools for problem solving in the real world.

Sunday, February 7, 2010

6.2 Aristotle’s Laws of Thought

Aristotle's "Laws of Thought" date back to the earliest days of Western Philosophy. They shape the basic structure of western thought, science, and its overall worldview - the worldview that can so puzzle many non-Westerners. Many philosophers (and mathematicians) who followed Aristotle such as Locke, Leibnitz, Schopenhauer, and Boole, have modified and enhanced his principles. However, the initial intent has remained the same even if the laws, themselves, get reformulated. These laws are fundamental logical rules, with a long tradition in the history of western philosophy, which together define how a rational mind must think. To break any of the laws of thought (for example, to contradict oneself) is to be irrational by definition. These three classic laws of thought were fundamental to the development of classical logic. They are:
  • Law of Identity - an object is the same as itself:
    A ⇔ A
  • Law of Noncontradiction - contradictory statements cannot both at the same time be true, e.g. the two propositions "A is B" and "A is not B" are mutually exclusive:
    ¬(A ∧ ¬A)
  • Law of the Excluded Middle - Everything (that is, every proposition) must either be true or not true. There is no in-between:
    A ∨ ¬A
Actually, with just a little logical manipulation, I think I can show that the Law of Noncontradiction is the same as the Law of the Excluded Middle. There is a rule in logic called De Morgan's Law. It has several representations, but one of them is:
    ¬(P ∧ Q) ⇔ ¬P ∨ ¬Q
If we let P = A, and Q = ¬A, then
    ¬(A ∧ ¬A) ⇔ ¬A ∨ ¬¬A,
which, applying double negation (¬¬A ⇔ A), is the same as:
    ¬(A ∧ ¬A) ⇔ ¬A ∨ A
The left hand side is the Law of Noncontradiction, and the right hand side is the Law of the Excluded Middle.
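
As a quick sanity check (a sketch added here for illustration, not part of the original derivation), the equivalence can be verified by enumerating the two possible truth values of A in Python:

    # Enumerate the only two classical truth values of A and confirm that both laws
    # are tautologies and that De Morgan plus double negation links them.
    for A in (True, False):
        noncontradiction = not (A and not A)          # ¬(A ∧ ¬A)
        excluded_middle = A or not A                  # A ∨ ¬A
        assert noncontradiction and excluded_middle   # both always true
        # De Morgan: ¬(A ∧ ¬A) has the same value as ¬A ∨ ¬¬A
        assert (not (A and not A)) == ((not A) or (not (not A)))

    print("Noncontradiction and excluded middle agree on every classical truth value.")

Note that the derivation leans on De Morgan's Law and double negation elimination, both valid in classical two-valued logic; in intuitionistic logic, for example, the Law of Noncontradiction still holds while the Law of the Excluded Middle does not, so the two laws come apart in other logical traditions - a point relevant to the alternative logics mentioned below.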

These are self-evident logical principles - fundamental axioms that cannot be proved (or disproved), but must be accepted (or rejected) a priori. In other words, there is nothing "under" them - they cannot be decomposed into more basic principles. They are similar, conceptually, to the axioms in Euclidean Geometry (e.g. the famous "Parallel Postulate"). Other types of geometry are possible, but if you begin with certain postulates you get Euclidean geometry. Other postulates generate other geometries. In logic, other postulates could be substituted for the Laws of Thought, and in fact have been in other traditions such as Buddhism, which celebrates contradiction. Paraconsistent logic (a type of logic that deals with contradictions differently than classical logic) does not depend on the Law of Noncontradiction. Even Greek philosophy before Aristotle (and Parmenides, who proposed similar laws) did not always embrace these concepts. But practically everything we know of traditional Western Philosophy and Logic embodies these principles. Preceding Aristotle by over a century, Heraclitus believed that contradictions were necessary - that their existence was essential to a thing's identity:

"Not only could it be stated that identity is the strife of oppositions but that there could be no identity without such strife within the entity."
He argued that because all things change, they must have already had in them "that which they were not". Only the existence of such contradictions could account for the change we see in the world. For example,
"Cold things grow warm; warm grows cold; wet grows dry; parched grows moist."
The defenders of Aristotle’s three laws of thought quickly learned that they had to establish the context for the application of these laws, because they were frequently assailed with counter-examples that seemed to violate them. It became clear that the laws could not be employed loosely or in poorly defined conditions. So, they began to require a “definite logic” model. In this model, the terms and the expressions formed from these terms must be clearly definable and knowable. But this ideal is rarely achieved in the real world, and we are forced to make assertions about things in less than precise, fuzzy terms. Not until the creation of Mathematical Logic by Boole in the 19th century, and later Russell and others, was logic able to refine its expression with mathematical, perfectly clear terms and operations.

This development in logic admirably suited the predispositions of the Western mind, and certainly helped shape it. Western philosophy to a very large extent has been founded upon the Laws of Thought and similar ground rules. We believe that our thinking should strive to eliminate ideas that are vague, contradictory, or ambiguous, and the best way to accomplish this, and thereby ground our thinking in clear and distinct ideas, is to strictly follow laws of thought.

But are these laws simply axioms, or can they be proved? There does not appear to be a direct proof; to some degree they must be accepted a priori. However, Aristotle argued that attempts to logically justify these axioms are unnecessary. He held that the axioms of classical logic are self-evident because 1) all syllogisms rely on them, and 2) they can be defended through retortion.

A defense through retortion occurs whenever an argument must rely upon the very principle it seeks to challenge or overturn. Any attempt to form a syllogism to refute the Laws of Thought will have to rely on the very axioms it seeks to overturn, leading to an implicit reliance on those axioms, which is a self-refutation (i.e., the "Stolen Concept" fallacy). In other words, it is impossible to argue that the laws of logic are not correct without using them. If I were to say, "the Law of Noncontradiction is false", this presupposes the Law of Noncontradiction itself, because I am simultaneously intending to convey, "It is not true that the Law of Noncontradiction is true".

In spite of how dominant these laws of thought have been, they have not been without their critics; philosophers from Heraclitus to Hegel have leveled powerful arguments against them. But the issue does not seem to be whether the laws are applicable or not, but where and when they are applicable. Certainly, the laws of thought have a place, but what is that place? As Walt Whitman wrote in "Song of Myself":
"Do I contradict myself?
Very well, then, I contradict myself.
(I am large, I contain multitudes.)"
Also as Nagarjuna, one of the fathers of Buddhism, wrote in "Verses on the Middle Way":
"Everything is real and not real.
Both real and not real.
Neither real nor not real.
That is Lord Buddha's teaching."
The time to abandon strict laws of thought arises when we are beyond the realm to which ordinary logic applies, or, as Nāgārjuna put it, when "the sphere of thought has ceased, the nameable ceases". A similar sentiment is expressed by Wittgenstein's assertion in the Tractatus,
"what can be said at all can be said clearly, and what we cannot talk about we must pass over in silence"
Many people who value rational thought, objectivity, and clear, precise thinking have no doubt been frustrated while engaging in fruitless debates with those who abandon the Laws of Thought. Anyone who has had to counter statements like "your truth is not the same as my truth", "everything is relative", "what is proof for you is not proof for me", or "your facts are just your opinion" has dealt with this first-hand. I have been frustrated in my conversations with relativists, sophists, self-styled mystics, and post-modernists who toyed with words and meanings simply for the pleasure of being evasive and derailing rational discourse. They equivocate on important concepts like truth, meaning, free will, reality, faith, belief, trust, experience, existence, good, and bad. When they sense they are being pinned down in a logical contradiction, they do an end-run around logic and question the very premises of rationality (for example, the Laws of Thought), subverting the entire effort. They redefine important terms, frequently in mid-discussion, using them in whatever ways suit their desired outcome (note - it is always important to define terms up front to make sure you are not talking at cross purposes with someone!). There doesn't appear to be a sincere desire to arrive at a clear conclusion, but rather a desire to put the person promoting the rational world-view off balance, questioning the very premises needed for an exchange of ideas, throwing logic out the window, and wallowing in mystical babble simply for the fun of it.

The Persian philosopher Avicenna (also known as Ibn Sina) has a famous quote about how to deal with those who disregard the Law of Noncontradiction:
"Anyone who denies the Law of Noncontradiction should be beaten and burned until he admits that to be beaten is not the same as not to be beaten, and to be burned is not the same as not to be burned."
Of course I don't recommend that, but it definitely shows that even great minds can become a little peeved with intransigent illogical thinking.

Philosophical naturalists and realists attempt to understand the world using a reason and evidence-based approach. They employ logic and empiricism, filtered through external review and correction, iterative refinement, and ultimately balanced by informed judgment which also has to take unknowns and risk into account. Experience has shown this to bear the greatest fruit if the goal is truly to understand the world.

Those who approach these questions from a religious or mystical point of view will arrive at an outcome which embodies whatever results they feel are enlightening, thrilling, comforting, uplifting, or that allow them to persist in their irrational (by definition) and incoherent (i.e., disorganized and internally inconsistent) mystically-based world view. Allowing the introduction of multiple, inconsistent concepts during an exchange causes confusion because of the imprecision (and even trickery) of language. The epistemologies feeding our different world views (science/evidence/reason/naturalism vs. mystical/religious/irrational/revelatory) differ: the irrational approach is based on revelation/inspiration/emotion/myth/sacred texts, while the scientific world view is based on observation/experiment/measurement/evidence/theory/methodology/coherence/critique. It is difficult, probably impossible, to bridge the gap between these diametrically opposite positions.

However, the irrational does have its place in our world. Humans are not robots, but are primarily emotional beings with a veneer of rationality laid on top. Not everything is best dealt with through a reductionist, rational approach. We would lead a very narrow existence, indeed, as well as a barren and joyless one, if we tried to apply these or similar laws to every human experience. Of what use is it to be entirely reason-based when enjoying the beauty of nature, the joy of your pet, or the laughter of friends and relatives? However, in the focused realm of science, whose goal is merely to explain how things work and of what they are made, this type of restricted and disciplined thought is a perfect fit.

Sunday, January 10, 2010

6 Assumptions of Science

None of the philosophical questions we have explored are resolved, or else (obviously) they would not still be considered philosophical questions. The controversies and different points of view surrounding the nature of reality, the problem of induction, and the necessary assumption of a uniform universe still stir debate as to what scientific naturalism represents, what are its limits, and how well man can actually know the world. Having said all that I can on the subject, I must now leave it and bubble up one level to describe assumptions that science makes based on these convincing, but admittedly unresolved, principles.

6.1 Rejection of Magic

No one has addressed primitive beliefs in magic and superstition as well as Sir James Frazer, author of The Golden Bough. This was the first definitive description of the myriad explanatory techniques and coping mechanisms that pre-scientific people used to make sense of their world. Rather than using empirical methods of observation, hypothesis, test, and measurement, the long-standing, unsophisticated, intuitive methods they used to explain how the universe worked invoked what would today be called magic. These people found patterns in the world based on associations of ideas in the mind, either through similarity or proximity. According to Frazer, “the order on which magic reckons is merely an extension, by false analogy, of the order in which ideas present themselves to our minds.” Primitive societies, succumbing to this way of reasoning, relied on what he called “sympathetic magic” to explain events in the world. Two sub-categories of these phenomena subsumed the bulk of primitive magical thought:
  • Law of Similarity (“like” produces “like”). This is the basis of voodoo, images, effigies, idols, and holy statues. Charms based on the Law of Similarity may be called Homeopathic or Imitative Magic. The mandrake root, which resembles a man's form, is supposed to have magical properties. Rhinoceros horn, which bears a striking resemblance to a body part of a virile male, is used as an aphrodisiac. Primitive cave paintings depicting successful hunting scenes were thought to ensure a successful outcome to the real hunt. Mistletoe was used in pre-modern times as a cure for epilepsy: it does not fall to the ground because it is rooted on the branch of a tree, so it would seem to follow that an epileptic cannot fall down as long as he carries a piece of mistletoe. Such a train of reasoning would probably be regarded even now as reasonable by a large portion of humanity.

  • Law of Contact (or Contagion) is based on the idea that things which have once been in contact with each other continue to act on each other at a distance after the physical contact has been severed. Charms based on the Law of Contact are called Contagious Magic. Our abhorrence at the idea of wearing a piece of clothing previously worn by a mass murderer, or of receiving a blood transfusion from a violent criminal, is a demonstration of this law at work. Relics of saints, or fragments from the “true cross”, can supposedly transfer spiritual energy. A lucky shirt, or a lucky ritual such as crossing your fingers, invokes the Law of Contact. Charms made from fingernail clippings, hair, and other discards from the target of magic are frequently used.

Magic is a spurious system of natural law as well as a fallacious guide to conduct. It is more akin to a false science than a false religion. Magical systems attempt to express, explain, and exploit causality through an association of ideas – the first through similarity in form, the second through proximity in position. Magical thinking commits the mistake of assuming that things which resemble each other, or were once near each other, are somehow the same or share some unseen but real connection and causal relationship. The magician believes he can produce an effect merely by imitating it (law of similarity), or that whatever he does to a material object will equally affect the person with whom the object was once in contact (law of contact).

There are countless examples of sympathetic magic in primitive and not-so-primitive societies – far too many to list here. But here is a sampling from The Golden Bough:

Among the Esquimaux boys are forbidden to play cat’s cradle, because if they did so their fingers might in later life become entangled in the harpoon-line... Here the taboo is obviously an application of the law of similarity... as the child’s fingers are entangled by the string in playing cat’s cradle, so they will be entangled by the harpoon line when he is a man and hunts whales. Again, among the Huzuls of the Carpathian Mountains the wife of a hunter may not spin while her husband is eating, or the game will turn and wind like the spindle, and the hunter will be unable to hit it. Here again the taboo is clearly derived from the law of similarity… In some of the East Indian islands any one who comes to the house of a hunter must walk straight in; he may not loiter at the door, for were he to do so, the game would in like manner stop in front of the hunter’s snares and then turn back, instead of being caught in the trap. For a similar reason it is a rule with the Toradjas of Central Celebes that no one may stand or loiter on the ladder of a house where there is a pregnant woman, for such delay would retard the birth of the child ... Malays engaged in the search for camphor eat their food dry and take care not to pound their salt fine. The reason is that the camphor occurs in the form of small grains deposited in the cracks of the trunk of the camphor tree. Accordingly it seems plain to the Malay that if, while seeking for camphor, he were to eat his salt finely ground, the camphor would be found also in fine grains; whereas by eating his salt coarse he ensures that the grains of the camphor will also be large … The chief product of some parts of Laos, a province of Siam, is lac. This is a resinous gum exuded by a red insect on the young branches of trees, to which the little creatures have to be attached by hand. All who engage in the business of gathering the gum abstain from washing themselves and especially from cleansing their heads, lest by removing the parasites from their hair they should detach the other insects from the boughs. Again, a Blackfoot Indian who has set a trap for eagles, and is watching it, would not eat rosebuds on any account; for he argues that if he did so, and an eagle alighted near the trap, the rosebuds in his own stomach would make the bird itch, with the result that instead of swallowing the bait the eagle would merely sit and scratch himself

The list of examples goes on and on. In Cormac McCarthy's Blood Meridian, there is a scene where one of the soldiers objects to having his drawn likeness captured in a sketchbook. He rejects being compared to a superstitious native, but cannot otherwise account for his extreme reluctance. The implication is that, like the "savages" they are pursuing, he feels danger from sympathetic magic associated with a book containing his picture over which he has no control.

In our own lives, we subscribe to many superstitions and magical belief systems. Recently, the system called “The Law of Attraction”, popularized in the motion picture “The Secret”, encouraged visualization of desired outcomes in order to cause those outcomes to occur. This is more than just positive thinking – it is literally magic. The Law of Similarity is again at work here: a mental image of a thing is somehow similar to the thing itself. We also invoke the Law of Contact when we use homeopathic medicine and believe that water retains a “memory” of a curative agent that was once in it. And how many of us, normally rational in most of our decisions, continue to take a large variety of supplements and herbal remedies based on a recommendation or a foggy personal recollection, and refuse to stop taking them in the face of evidence that they don't work?

Science avoids magical explanations in favor of empirical observations, hypotheses, and experimentation. But we all seem to have weak areas where we let primitive magic drive our decisions.