
Tuesday, February 21, 2006

A Reasonable Degree of Scientific Certainty

Over the years I have heard people say, "I believe/know to a moral certainty that such-and-such (is true)." I have said it myself but have no idea what it means. I think it means the same thing as "I really, really, really believe such-and-such (is true)." Interestingly, when people use it they may be talking about a moral issue, as in "I know to a moral certainty that screwing goats is a sin." But it doesn't have to be a moral issue. Right now, I know to a moral certainty that George Bush has lost his fu__ing mind. Selling the operating rights to some six of our most important ports to some dudes in Dubai is what one of my acquaintances calls "dirt fu__ing stupid." Maybe some of the Dubai citizens involved in the purchase of these operating rights are reliable, but if you are familiar with the 9/11 Commission Report, you will know that
Bin Laden relied on the established hawala networks in Pakistan, Dubai, and throughout the Middle East to transfer funds efficiently.
Hawala networks are informal banks that allow persons to transfer money and hide the transactions from government investigators charged with making us as safe as possible. Unfortunately, our President doesn't seem to give a damn about the safety of the American people when there is money to be made.

However, my primary intention is not to bash Bush, though that is always good fun, but to worry about phrases like "to a moral certainty" and the widely used legal phrase "to a reasonable degree of scientific certainty." I don't know of any other similar phrases. I have never heard, for instance, "to a religious certainty" or "to an academic certainty" or any other such phrase.

Attorneys using expert witnesses who are scientists ask them things like "Do you know to a reasonable degree of scientific certainty that OJ's blood was present at the crime scene?" The scientist replies, "Yes," and he or she might give the reasons. I have argued in an earlier blog that scientific certainty is not attainable. We can say with great, if not total, scientific certainty things like "I know to a scientific certainty that John Jones' heart was beating at noon because he showed a blood pressure reading of 120/70 at that time." However, our speaker could not say, "I know to a reasonable degree of scientific certainty that John Jones' blood pressure was 120/70 at the time in question." The problem is that the hearing of the person taking John Jones' blood pressure could be bad or the mercury manometer employed might be inaccurate. Only the most trivial of facts, if any, are knowable to a scientific or any other certainty.

If you buy into this argument, then we are forced to talk about degrees of scientific certainty, as the phrase we are looking at suggests. In some cases, as in experiments that yield confidence levels, experts may say what the level of confidence in some finding was. However, the experiment may have been predicated on problematic assumptions, or the sample of subjects (mice, people, etc.) might have been defective in one way or another.
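For readers who want a concrete sense of what a "level of confidence" amounts to, here is a minimal sketch, in Python with made-up numbers, of the usual 95% confidence interval for a sample mean. It is only an illustration; the interval it produces is only as trustworthy as the assumptions behind it (independent, unbiased, roughly normal measurements), which is exactly the worry raised above.

    import math

    # Hypothetical blood pressure readings (made-up numbers for illustration only)
    readings = [118, 122, 121, 119, 120, 123, 117, 120]

    n = len(readings)
    mean = sum(readings) / n
    # Sample standard deviation (n - 1 in the denominator)
    sd = math.sqrt(sum((x - mean) ** 2 for x in readings) / (n - 1))
    # Standard error of the mean
    se = sd / math.sqrt(n)

    # Rough 95% interval using the normal approximation (1.96 standard errors);
    # this assumes independent, unbiased, roughly normal measurements --
    # precisely the sort of assumptions the text says may be problematic.
    low, high = mean - 1.96 * se, mean + 1.96 * se
    print(f"mean = {mean:.1f}, 95% confidence interval = ({low:.1f}, {high:.1f})")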

In some cases, as when dueling psychiatrists give expert testimony, they cannot honestly say that they know to a reasonable degree of scientific certainty that the defendant did (or did not) know the difference between right and wrong or that the defendant was (or was not) legally insane. The problem here is that no measuring is being done, and without that, there can be no degree of certainty, much less scientific certainty. When I gave expert testimony during depositions about alleged deceptive advertising by some eight or nine major oil companies in regard to the inclusion of alcohol in gasoline ("gasohol," as it is sometimes called), I did say that it was my expert opinion that the language of any sign put up at a gasoline filling station saying "No Alcohol Added" or "No Alcohol in Our Gasoline" implicates that adding alcohol to gasoline is somehow bad. I could say that I knew with a reasonable degree of scientific certainty that some people would draw that inference because I did, and the claim is so phenomenally weak that my drawing the inference is enough to verify it. (Another linguist and a psycholinguist said the same thing.) I could have said that I was reasonably sure that the oil company ad agencies and the oil executives involved in deciding to use signs like that also must have believed that people would draw that inference or they wouldn't have bothered creating and putting up the signs. In fact, I believe that most people would draw that inference, but I couldn't assert that with any specific degree of scientific certainty.

The link associated with the title of the present blog also notes that we cannot be certain about the claims of scientists, and goes on to say that there are those who exploit this lack of certainty for their own ends. The anti-Darwinian Creationist believes this leaves room for Intelligent Design in the classroom, as if the latter (non)theory actually could be as certain as or more certain than the theory of evolution (or theories of evolution, if you prefer). Or a conservative politician like Bush might exploit the existence of scientific uncertainty to delay action on some environmental regulation when acting on it would be expensive.

To exploit the fact that scientists cannot be certain about their claims (not counting trivial claims based on observations such as "John is alive") simply because this uncertainty exists is intellectually dishonest. It's like blaming a rose for having thorns. Unlike roses, however, where we have alternatives, there is no alternative to science as a means of understanding the physical universe (including our minds but, alas, not our souls, whatever those things are).

I am more than a little troubled by use of the word "reasonable" in "to a reasonable degree of scientific certainty." There are some published articles discussing this notion, but I would have to go to the library to find the publications since they are all at pay-to-view sites. I did run into Blog 702, which concerns legal issues and noted that the Third Circuit held that a handwriting expert's testimony need not be given to a "reasonable degree of scientific certainty."

Once one puts the word "reasonable" in "to a reasonable degree of scientific certainty," an unacceptable level of subjectivity is involved in assessing the claim because what might be reasonable to one scientist or judge or jury might not be reasonable to another. In short, the use of "reasonable" and "scientific certainty" in such a phrase makes it into a kind of oxymoron. We have at best an illusion of scientific credibility. It wouldn't be the first time a court has deemed quite sloppy science credible. In my expert opinion, experts should restrict themselves to saying "It is my expert opinion that ...," add whatever measures might have been used or otherwise defend the opinion, and then shut up.

I fear I have meandered a bit in this blog. I trust most will forgive me.


Saturday, February 18, 2006

More on PC Racial and Ethnic Prejudice

When I wrote my blog on The Last Bastion of PC Prejudice, I was unaware of the work of John Baugh, who has done a two-year study of the phenomenon of linguistic prejudice. A press release put out on February 2, 2006 by Washington University in St. Louis notes that his research demonstrates both that people can make correct racial and ethnic identifications by hearing a voice on the telephone and that there is systematic prejudice against Hispanics and African Americans. In the press release, called "Linguistic profiling: The sound of your voice may determine if you get that apartment or not," he is said to have claimed that
some companies screen calls on answering machines and don't return calls of those whose voices seem to identify them as black or Latino
and the release goes on to say
Some companies instruct their phone clerks to brush aside any chance of a face-to-face appointment to view a sales property or interview for a job based on the sound of a caller's voice. Other employees routinely write their guess about a caller's race on company phone message slips.
He proved his point by having people call about advertised rental properties and jobs, and he discovered (Duh!) that very commonly people with identifiably Spanish-accented or Black-accented English were told that the advertised properties or jobs were no longer available, while the same properties or jobs were said to still be available to those speaking standard American English.

Linguists have long known that this sort of prejudice exists. Bill Labov demonstrated some 35 or more years ago that three department stores in New York City exhibited dialect stratification as a function of how often employees dropped the r's in "fourth floor." The more r's you pronounced, the more likely you could get a job at Saks Fifth Avenue, and at Macy's the specific job you were assigned reflected the percentage of r's you dropped. In this case, the study was not about race per se, but the fact is that Blacks tend to drop r's more than Whites in New York City and some other places, all other things being equal.

Dr. Baugh's study helps to confirm what Hispanic Americans and Black Americans already know, namely that America is still a country in which racial and ethnic prejudice is alive and well and being practiced.


Thursday, February 16, 2006

On Protecting the Institution of Marriage

Back during the battle between our War President and John Kerry, we also had a battle here in Ohio and elsewhere concerning referenda on whether gay marriages should be allowed. As I drove to the polling place, I saw signs urging voters to protect their marriages by voting against establishing a right for two gay men or two gay women to marry. Right now, Assistant Attorney General Patrick DeAlmeida of the state of New Jersey is trying to get the seven justices of the New Jersey Supreme Court to "protect the institution of marriage" by not "redefining marriage" to allow gays to be married.

There are some linguistic mind games going on here. The fact is that if the voters had voted to allow gay marriage in Ohio, my marriage would not have been affected. I wouldn't have loved my wife less nor would she have stopped loving me. Our marriage would have been just as strong or good as it was the day before gay marriage became legal. Indeed, the only thing I know of that has had a negative effect on heterosexual marriage has been the establishment of no fault divorce. That has constituted a real threat to the institution of marriage for it allows for one spouse to obtain a divorce even though the other party doesn't want one. Gay marriage poses no threat of any sort. Indeed, gay marriage might strengthen the institution of marriage if the gays getting married could set a better standard for marriage than the miserable one that we heterosexuals have established.

What did the ignorant, religiously inspired voters in Ohio think would happen if gay marriage had been approved? Did they think that someone like me would have jumped at the chance to become gay, would have dumped my wife rapidly (thanks to the no fault divorce law), and gone out into the world to find some man to marry? Who knows what they were thinking. My view is that these people quit thinking years ago and replaced thinking with Christian and other religious dogma. Deep down in their murky hearts, I suspect, their motivation was hostility toward homosexuality. The problem is that we straight folks have been conditioned to think that gays are some horrible kind of human being.

When I was a kid, I recall my mother pointing to some man and saying he was something or other -- a queer or a homosexual -- and I decided instantly that being a queer was very, very bad. I had no idea why this was bad since she didn't elaborate what it meant to be queer. While in grade school I remember someone telling me that a gay was someone who burst fart bubbles in their bath water with their teeth. I must have thought that was a possibility or it wouldn't have had so deep an effect that I would remember it so many years later. After I found out what a gay was (but before I discovered what gays did in bed), I began to fear I might be gay, that is, I might be one of these very awful people. However, it turned out that I liked girls in a different way than I liked boys so I was saved. I think a lot of little boys worry about that just as they worry about how you go about getting a girl to let you go to bed with her. My friends and I didn't have a clue how we could get that to happen. Of course I am talking about what little boys thought about in the 50's.

Later, when I was in my thirties, at the height of my political liberalism, I still harbored negative feelings toward gays. These feelings were exposed to the light of day when a close friend who was married made a pass at me late one night. I was shocked, but it turned out that I was more concerned at his betrayal of his wife than that he was gay per se. We stayed friends. I learned a lot about what gay men go through from talking to him and began to feel some sympathy for them, but I was still bummed that he had betrayed his wife. Later on, another very close friend who was married (our families were very close as well) outed himself. I responded badly but, again, it turned out that I wasn't bothered so much by his being gay as by his betrayal of his wife. I was not nearly as bummed as she was. Indeed, learning of his betrayal destroyed her confidence in her judgment and ended up largely wrecking her life.

The moral of this story is that I have very strong feelings about the institution of marriage. To me the central responsibility of married partners is that they be faithful to each other -- that they be able to trust each other. I have no problem with divorce but I do have a problem with infidelity. Now, does gay marriage damage the institution of marriage as I construe the institution, where trust and fidelity are its central values? The answer is obviously, "No." The only way gays can damage the institution of heterosexual marriage is by marrying a straight person. The irony is that the nitwits who oppose gay marriage seem to be perfectly happy with a gay man marrying a straight woman (or a gay woman marrying a straight man), the one thing that is a threat to the institution of marriage. And the fact is that the prejudice against gay people that pervades our society drives some of them to engage in fraudulent marriages as a cover.


Monday, February 13, 2006

Critical Analysis of Evolution

In my last blog, How to Think, I urged that people who wish to become sophisticated in their thinking try to develop skills in critical thinking. It seems that the Ohio Board of Education wants that too, but only in biology -- in fact, only in the study of evolution. An attorney in the intelligent design trial in Pennsylvania, Mr. Eric Rothschild, is quoted in this morning's Columbus Dispatch as saying
When you see "critical analysis of evolution" you really need to look at what's behind that....Why is there no call for critical analysis of plate techonics?
If I were to have a "Critical Thinkers Hall of Fame," I would include Mr. Rothschild. He has made the two points that most need making and, of course, beat the crap out of the creationists in court. The first is that "critical analysis of evolution" is code for "introduce creationism/intelligent design in biology lectures," though he didn't put it that way exactly. The second is that a specific reference to a critical analysis of evolution in the absence of references to a critical analysis of plate tectonics, a critical analysis of the particle and wave theories of light, and so on, is the best possible evidence that someone is engaged in the fallacy of "special pleading," though again he didn't put it quite that way. As The Nizkor Project notes
From a philosophic standpoint, the fallacy of Special Pleading is violating a well accepted principle, namely the Principle of Relevant Difference. According to this principle, two people can be treated differently if and only if there is a relevant difference between them. This principle is a reasonable one. After all, it would not be particularly rational to treat two people differently when there is no relevant difference between them.
The Nizkor Project is concerned that we not treat differently people who do not exhibit any relevant difference, but the same applies to scientific claims. If we are to have a critical analysis of evolution, we must have a critical analysis of the formation of black holes, plate tectonics, the number agreement rule in English ("John and Mary are here" is a well-formed English sentence but "John and Mary is here" is not), and any other scientific claim, because there is no relevant difference between these things. In fact, this whole notion of urging biology teachers to engage in a critical analysis of evolution should receive a giant "DUH!!!!" Our teachers should engage their students in a critical analysis of every subject they study.

But, of course, that is not what fundamentalist Christians want. The most inane aspect of this is the idea that scientists working on issues in evolutionary theory are not already engaged in a critical analysis of their discipline. Of course they are. The idea that God plopped Adam down on Earth, then created Eve from one of Adam's ribs, and also gave them a full-blown language competence so He could talk to them and they could talk to each other would, of course, be dismissed immediately because there is no empirical evidence for it (books, including the Bible, don't count). That's all it takes to dispatch the creationist theory of the origin of humans to the intellectual dumping grounds for bad ideas.

The idea that the design of humans exhibits an intelligence at work is hard to swallow. The notion "intelligent design" is impossibly vague. To me, if humans were intelligently designed, we would have an immunity to all diseases. If He had done that, we wouldn't have to be worrying about the bird flu. To my claim, it could be countered that the intelligent design of humans would entail giving us a limited life span because of (perhaps) the psychological damage life itself inflicts on us, and the lack of a total immunity to disease is one of the mechanisms God uses to limit our lifespans. I would reply, why not simply let the aging of organs do the trick? No diseases; just the aging of organs, perhaps along with the accidents and murders that take our lives. There is no way to make the notion "intelligent design" substantive because my "intelligent design" might be your "idiotic fantasy," and vice versa.


Sunday, February 05, 2006

How to Think -- A blog in which I Toot My Horn too much

At a linguistics conference at Georgetown University some years ago, a guy who had given a very nice talk came up to me after I had given my talk and fielded questions, and said something like, "I wish I could think like you MIT people." I was somewhat taken aback, but the fact is that the MIT experience, at least for the first few groups that went through the program -- entering in '62-'66 or so -- and possibly later groups as well, did give us an unusual opportunity to develop some pretty high-powered intellectual skills, specifically the skill to create valid arguments (which may not lead to true conclusions, of course) and the skill to defend and critique arguments.

When I entered MIT there was almost no existing literature in linguistics deemed relevant to the development of Chomsky's radically different, mathematically based, scientific approach to the theory of language. It was argued, for instance, that the grammar of a language should consist of explicit, formalizable rules. Since traditional and structuralist grammarians did not have these goals, they did not always or even often ask the kinds of questions that we wanted answered, so we didn't spend time reading but spent it instead on trying to create new knowledge. Given that there wasn't much such knowledge at the time, it wasn't too hard to come up with new proposals if you were reasonably imaginative.

So, we were asking new kinds of questions and had to develop new kinds of descriptions of languages. Fortunately, we graduate students were clustered together in several very large rooms with lots of desks and blackboards. A healthy, friendly competition developed to try to come up with new ideas (rules of English, usually, and properties of said rules, as well as linguistic universals, i.e., rules and principles that apply across languages). I don't know what others did, but I frequently busted my butt at night to try to come up with something new to contribute. The drill was to tell people you had this great new idea, go to a blackboard and write down the proposed generalization about language, or, more commonly, just some English rule or condition on the applicability of the rule, and defend it. The others would start firing objections, and not infrequently one would be led to change one's proposal. If it concerned just English, others might point out that the generalization did or did not apply to languages they knew or knew about. One had to become very critical of one's own ideas to survive the cross-examination without getting too bloodied.

So critically examining one's own ideas was essential to survival. Moreover, applying one's critical skills to the proposals of others was necessary if one was to be a good citizen. I would advise anyone concerned about developing better critical skills to adopt an adversarial view towards one's own ideas as well as those of others. Don't believe anything anyone tells you unless you can confirm it yourself. This skeptical point of view doesn't amount to cynicism about knowledge since one ends up believing quite a lot of things to be true, at least tentatively. Skepticism requires sharply honed intellectual skills. Cynicism requires no intellectual skills at all.

Learning to defend positions one believes to be true is a very hard thing to do. There are two things that one ought to do. One is to carefully study good examples of arguments and try to emulate them. Another is to subject your arguments to criticism from others. This latter activity is particularly important, as important as submitting any creative writing you might do to experts for criticism.

Learning to be a critical thinker and learning to construct valid arguments in support of one's ideas are necessary conditions on intellectual success. They are not, however, sufficient conditions. One must also develop a capacity for imaginative thinking if one is to have ideas that are worth defending. So, how does one come up with new ideas? Here is my "recipe."

1. Every field will have one or more intellectual cliques, sets of people who share fundamental assumptions. Learn from these cliques but don't ever become a "true believer." Early on I was a Lone Ranger who worked inside one of the two main cliques and was able to make an impact with work on English adverbial clauses that was inconsistent with one of that clique's major assumptions. Interestingly, that work has survived some 36 years later, as a googling of my 1970 doctoral thesis shows. However, later on I went to the linguistic dark side for a theoretical linguist and wrote a book on TV advertising. This had a huge impact outside of theoretical linguistics (where it was totally ignored). (False) modesty prevents me from elaborating just how big an impact it had. In any event, if you go the safe route, question the basic assumptions of your clique to try to find their flaws. It's the best way to go about having an impact on the field. It is more exciting, however, to strike out on your own. I had to do it to keep my sanity. I hated the idea of doing just one kind of thing for 30 years.

2. Do not underestimate the value of ignorance. My doctoral thesis grew out of work I began my first semester at MIT when I wrote a paper in a course Noam Chomsky taught that violated one of Chomsky's basic assumptions, which, fortunately, he did not discuss that semester until after I had come up with an analysis that violated it. My paper changed people's minds about the assumption, and I ended up with my first-ever footnote a few months later in a 1964 book by Paul Postal and Jerrold Katz (in which my last name was misspelled!). Pretty heady stuff for a beginner. But it does show that ignorance can lead to intellectual bliss.

I think that the widely accepted view that mathematicians do their best work when they are quite young may be explained by their being ignorant to some degree of conventional mathematical wisdom. I had a cousin in a graduate mathematics course in which the prof gave out ten problems to solve over the weekend. My cousin solved none of them, but on Sunday he ran across a fellow student who said he had managed to solve three. It turns out that the prof had given them ten "official" unsolved problems. My cousin's classmate didn't know they were genuinely difficult problems and managed to take a novel view of three of them that happened to work out. So, I advise you to work your butt off when you are young and ignorant.

I will confess that in most cases in my life, when I addressed a new problem, I did not read the literature on it until after I had given the problem a try. Afterwards, I did look at the literature to see if someone else had come up with my ideas on the matter, which rarely happened. I suggest you try this approach, at least provisionally. But don't tell anyone you do this since you are supposed to read the literature first. Unfortunately, reading the literature first can put you in that box we are told to think outside of if you aren't careful.

3. Read the literature in related disciplines. It may lead you to ideas that are of cross-disciplinary interest. Just for fun, I decided to Google my book ["Speech Acts and Conversational Interaction" philosophy] and found some citations in course syllabi in philosophy and several references in papers in Computer Science (Cognitive Science). I also found a reference to it in a paper on "axiology," a perspective the existence of which was totally unknown to me until now -- it is probably pretty nutty. There was also a reference in a course in anthropology at Florida State. This saved me from doing a lot of separate searches. In fact, I majored in philosophy and worked closely with a philosopher at Ohio State who moved on to the University of North Carolina; worked with a very talented OSU colleague in communications and read some things in that field; read a lot in the area of Conversational Analysis in sociology; worked with a computer scientist at Ohio State and read some of the literature in artificial intelligence; and read in the area of psychology. I probably skipped a field or two. Unfortunately, while interdisciplinary research is often said to be a desideratum in academia, it is not often rewarded, as my philosophic OSU and UNC colleague, Bill Lycan, and I discovered. MIT Press was willing to publish a book we did on conditional sentences (propositions), but it was clear from the comments we were getting that we would be flamed by a lot of reviewers from both fields, so we withdrew it. As I was retiring, Bill decided to revise it into a book of primarily philosophical interest and published it as Real Conditionals. Philosophy tolerates mavericks much more than linguistics does. The book includes a joint paper we published in a journal and gives me plenty of credit, so he's happy and I'm happy. In any event, this paragraph demonstrates that if you step way out of the box and actually read relevant material in other fields, some wild-ass stuff may result, some of it successful (my Cambridge Press book on speech acts and conversation and Bill's Oxford Press book on conditionals once it had been "purified" a bit) and some not successful (the aborted original version of the joint book Bill and I originally wrote). True interdisciplinary research is a crap shoot.

Okay, enough tooting of my horn. I tooted it to show that a Lone Ranger can force Cliquists to acknowledge the quality and importance of his or her work even if they don't like it. The work does have to be well-received by some highly regarded people, even if they are in other fields. The fact is that if you don't strike out on your own, you will likely create easily forgettable work, unless you focus on questioning the foundations of your field or become a true maverick. My conclusion is, then, that you try to learn to think critically, learn to defend and critique ideas whether they are your ideas or the ideas of others, and try to stimulate your imagination by questioning the basic assumptions of your field (which can be very hard work even for advanced scholars), working on problems before you look at how others have treated them, and reading work in related fields to see if it might bear on problems that interest you. I hope this was useful for some of you younger people. With some embarrassment at my horn tooting, I hereby launch this into the internet stream.


Friday, February 03, 2006

Conspiracy Theories, Science, and Intelligent Design

I touched on conspiracy theories in a previous post, noting that they, like theories predicated on counterfactual conditionals (If Hitler hadn't been born, ...), tend to be highly problematic. I made mention of "respectable conspiracy theories," and a commenter wondered how there could be respectable ones. In fact, there have been some pretty nasty conspiracies in my lifetime. There is no question now that Nixon, Haldeman, and Ehrlichman were up to their ears in the Watergate cover-up, though none may have been involved in planning the break-in at the Watergate Hotel. Nixon's former Attorney General, John Mitchell, on the other hand, was involved. The Kennedy assassination was said by the Warren Commission to be the work of a single semi-deranged man. Almost no one seems to believe that. As a result, a plethora of conspiracy theories have arisen to replace the Warren Commission story. Oliver Stone got his reputation as a major conspiratorialist because of his movie JFK, which is, perhaps, the best-known movie on the subject.

What is the difference between the conspiracy to cover up the Watergate burglary and the conspiracies to kill JFK? In both cases, we have a lot of facts to work with. In the case of the Watergate conspiracy (there were at least two -- the Mitchell-Liddy et al. conspiracy to break into the Watergate and the subsequent cover-up, which involved a very large number of people including Nixon himself), we have sworn testimony before the Senate Watergate Committee, which anyone could tune in to watch, and we have the famous White House tapes. We have, in short, direct evidence of the conspiracies. In the case of the JFK killing, we lack the so-called "smoking gun" (maybe I shouldn't have used that expression). We have no tapes linking LBJ to some person X and no tapes linking X to the shooter, assuming that LBJ was behind it all. We have no similar hard evidence linking CIA operatives, said to hate JFK, to the shooter. We have no hard evidence linking Castro to the shooter. And we have no hard evidence linking the Mob to the shooter. I have mentioned four different theories I have heard over the years. There are many more, and some involve multiple major conspirators (actual or former CIA operatives, members of the Mob, and Oswald or others -- see the reference to James Files below).

No one, to my knowledge, proffers new theories of the Watergate break-in or of the cover-up, except perhaps Republicans trying to clean up their image as manufacturers of multiple conspiracies -- Watergate, Iran-Contra, and the current one involving the outing of Valerie Plame -- which isn't to say that Democrats don't have their problems in this area (see the Gulf of Tonkin incident). But in contrast with the two Watergate conspiracy theories, there are numerous theories of the JFK killing and hosts of web sites devoted to it around the world. One of the more interesting is one that cites where "hard evidence" can be found. Why are there so many theories? I think that the reason is that the Warren Commission theory is just too simple. The idea that Lee Harvey Oswald acted alone is not credible to most people. I suppose I don't believe it myself. Clouding the picture is the fact that there have been confessions by persons other than Oswald (check out this Google search page for information on the confession of James Files).

The contrast between the Watergate conspiracies, where we do not have gobs of competing theories, and the JFK killing, where we have a plethora of proffered conspiracy theories, is instructive. The less that is known about some state of affairs, the more theories of that state of affairs are possible. And, in the case of the JFK killing, differing conspiracy theories may dispute the "facts" cited by the Warren Commission and the "facts" proffered by other conspiracy theories (the Files confession, for instance). The only thing that can reduce the number of such theories is a "smoking gun."

The principle that the less that is known about a state of affairs, the more theories there are likely to be applies across the board of human affairs. There are commonly competing scientific theories that exist because critical facts aren't known. This is inevitable in science, and it is not a sign that scientific research is, in general, untrustworthy. Typically, competing scientific theories abound at the cutting edge of a science -- its "wild side." Most of what scientists say they know is predicated on hard facts. However, any theory of a subject matter takes a slice of reality and studies that. Chemists once did not concern themselves with matters of interest to physicists. Now we have physical chemistry. Once biology and chemistry were separate fields. Now there is biochemistry. Because scientists work with at least partially arbitrary slices of reality, there will always be things that aren't known because they aren't within the scope of interest of the science.

The Nontheory of Intelligent Design can exist because there are flaws in the Theory of Evolution. There will always be flaws in that theory, so we will probably always be bothered by religious sorts who exploit these flaws to offer Intelligent Design, which can live forever since it makes no actual testable empirical claims. To compare a theory that makes testable empirical claims (the Theory of Evolution) with "theories" that do not (Intelligent Design) is purely and simply intellectually dishonest.

I am proud to say that the current governor of Ohio has instructed those responsible for teaching children in Ohio to rid their texts and courses of references to Intelligent Design. He did it to ward off lawsuits, so his motives were not pure. However, it is nice to see this otherwise seriously flawed governor doing the right thing, even if for the wrong reasons.
