Rationally Speaking


N. 52, August 2004: Changing our mind: a Bayesian Approach


This column can be posted for free on any appropriate web site and reprinted in hard copy by permission. If you are interested in receiving the html code or the text, please send an email.

Ever wonder how you go about changing your mind? Massimo discusses one possible process.


Massimo’s other ramblings can be found at his Skeptic Web.

Massimo’s books: Denying Evolution: Creationism, Scientism, and the Nature of Science; Tales of the Rational: Skeptical Essays About Nature and Science.

I am often accused by people who don’t know me very well of never changing my mind and always wanting to be right. These charges are usually hurled at me in the midst of some heated debate, often when the other side is close to running out of defensible arguments. While I find the accusation of wanting to be right rather comical, considering that it is being advanced by somebody who is in fact trying to convince me that he is instead right on whatever we are discussing, the charge of not changing my mind is more serious. After all, I think of myself as a reasonable person who is interested as much in learning as in teaching, and surely such an attitude — if entered into honestly — must at least occasionally lead me to admit that I was wrong on something, and therefore to change my mind.

Sure enough, once I started paying attention to the issue, I discovered that I have changed my position on several issues over the years. This doesn’t happen very often, and the change is rarely dramatic. But both of these characteristics are to be expected if one puts a lot of thought into shaping one’s own opinions: changing them too easily is the sign of a mind so open — as Carl Sagan once said — that the brain is about to fall out! What is more interesting, however, is that I began to give some serious thought to how exactly we change our mind about things. This is a crucial subject for anybody seriously interested in social discourse (or in advertisement and propaganda, the dark sides of the same coin), and, sure enough, it has received a fair share of attention from both philosophers and neurobiologists (see, for example: Epstein, R.L., 1999, Critical Thinking, Belmont, CA, Wadsworth, or Gazzaniga, M.S., 2000, “Cerebral specialization and interhemispheric communication. Does the corpus callosum enable the human condition?” Brain 123: 1293-1326).

An interesting way of looking at how we change our mind, supported by recent neurobiological evidence, and in good agreement with my reflections on my own experiences, is provided by what is called the Bayesian framework (see RS N. 20, January 2002). Bayesians think of our understanding of truths about things in terms of probabilities based on evidence. So, for example, suppose you are a scientist testing different anti-AIDS drugs. You may start out with no a priori knowledge of which drug works better, and therefore you don’t have any reason to prefer one to another. The Bayesians would say that the prior probabilities of the drugs being effective are, at this point, equal. But then you begin your research, collect data, and gradually see that a couple of the drugs seem to be effective, one or two more hardly make any difference, and one even has detrimental effects. Accordingly, you adjust your estimate of the probability of success of each drug based on the data, arriving at what Bayesians call the posterior probability of each drug’s effectiveness.
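
As a concrete illustration (not part of the column itself), here is a minimal Python sketch of this kind of update, using a standard Beta-Bernoulli model and invented trial counts: each drug starts from the same flat prior, and the observed successes and failures turn that prior into a posterior estimate of effectiveness.

    # Minimal sketch of the Bayesian update described above (hypothetical data).
    # Each drug starts with the same flat Beta(1, 1) prior on its probability of
    # helping a patient; observed successes and failures turn that prior into a
    # posterior, whose mean is the updated estimate of effectiveness.

    trial_data = {
        # drug name: (successes, failures) -- numbers invented for illustration
        "drug_A": (45, 15),
        "drug_B": (41, 19),
        "drug_C": (31, 29),
        "drug_D": (29, 31),
        "drug_E": (12, 48),
    }

    prior_alpha, prior_beta = 1.0, 1.0  # flat prior: no reason to prefer any drug

    for name, (successes, failures) in trial_data.items():
        # Conjugate Beta-Bernoulli update: posterior is Beta(alpha + s, beta + f)
        alpha = prior_alpha + successes
        beta = prior_beta + failures
        posterior_mean = alpha / (alpha + beta)
        print(f"{name}: posterior mean effectiveness = {posterior_mean:.2f}")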

The key here is that you may never know for sure that one drug is working, or that another isn’t. What you do is constantly re-adjust your posterior probabilities as a function of more and more evidence. In other words, you keep your mind open to change as more data come in. Notice, however, that Bayesian theory also predicts that, if in fact one or more of the drugs are truly more effective, with time your posteriors will stabilize, attaching high probabilities to the good drugs and low probabilities to the poorly performing ones. After that, only dramatically different new information is likely to change your mind (alter your posteriors) on that subject.
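
To see that stabilization concretely, here is a sequential version of the same sketch, again with invented numbers and an assumed true effectiveness rate: as simulated trial results accumulate, the posterior mean settles near that rate, and any single new observation moves it less and less.

    import random

    # Sequential version of the same update: as evidence accumulates, the
    # posterior mean stabilizes and each new data point shifts it less and less.

    random.seed(42)
    true_rate = 0.7          # invented ground truth, hidden from the observer
    alpha, beta = 1.0, 1.0   # start again from a flat Beta(1, 1) prior

    for trial in range(1, 1001):
        success = random.random() < true_rate   # outcome for one new patient
        if success:
            alpha += 1
        else:
            beta += 1
        if trial in (10, 100, 1000):
            mean = alpha / (alpha + beta)
            # how far one additional success would now move the estimate
            shift = (alpha + 1) / (alpha + beta + 1) - mean
            print(f"after {trial:4d} trials: posterior mean = {mean:.3f}, "
                  f"shift from one more success = {shift:.4f}")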

A similar process, I think, is used by our brains on any subject to which we apply our mental powers. Often we do not start with flat priors (i.e., with equal probabilities assigned to each alternative being considered), because our opinions are influenced in a more or less subtle way by our social milieu. If you are raised in a conservative religious family, you are much more likely to simply adopt your parents’ beliefs than to question them (though occasionally too strong a parental hand catalyzes an outright rejection), so your priors are very much in favor of, say, a Biblical god, as opposed to a mainstream god, deism, or atheism. If you are impervious to new knowledge coming in (say, because your upbringing was characterized by strong conditioning), you may indeed never change your mind.

But now suppose you get on the Internet, begin frequenting the local public library, and go to a secular, or at least not strictly religious, high school and college. Floods of contrasting information and opinions begin to enter your brain, and it begins to process all that information automatically, whether you like it or not. Your innate thought processes work like a Bayesian calculator, constantly re-adjusting the posterior probabilities and, in the process, more or less gradually changing your mind. Your conscious self merely monitors this subconscious process, so the change of opinion may feel almost instantaneous, like a “conversion” or “a light bulb going on.”

I’m not saying, of course, that our brains are perfectly rational computers that — like a natural Bayesian algorithm — always converge to the best estimate of posterior probabilities possible given the available evidence. We have plenty of reasons to believe that this is, alas, not the case. Nonetheless, viewing the process in Bayesian terms helps, I think, account not only for its general nature, but also for some interesting features that can be used to improve our critical thinking. For example, the best way to effectively adjust our posterior probabilities is to take in as much reliable information as possible, and from as many different sources as possible. Hence the value of reading, discussing, and generally engaging one’s thoughts all the time, on whatever subjects one thinks are important. Your innate Bayesian calculator will not only allow you to change your mind as often (or as rarely) as necessary, but will also make sure you have the best possible view of what is (likely to be) true or not.


