Saturday, May 30, 2009

Everything you believe is a convenient and advantageous lie!

Okay, as a prelude to a long post, I give unto you, my adoring audience, a youtubular present.


I truly do love myself some rap music.
Anyway, for this post, lovingly crafted at the very end of a month in which I have posted next to nothing, I have decided to delve into the same Christianity Today article that was brought up over at The Pharyngula.  It is an article authored by philosopher Alvin Plantinga, who I personally thought was one of those apologists from the mid 1500's, due to his name and the high regard with which he is usually mentioned.  The article serves primarily as a brief summary of his "Evolutionary Argument against Naturalism", in which he attempts to show why holding evolution and naturalism to be true simultaneously is self-refuting.
What she means is that natural selection doesn't care about the truth or falsehood of your beliefs; it cares only about adaptive behavior. Your beliefs may all be false, ridiculously false; if your behavior is adaptive, you will survive and reproduce. Consider a frog sitting on a lily pad. A fly passes by; the frog flicks out its tongue to capture it. Perhaps the neurophysiology that causes it to do so, also causes beliefs. As far as survival and reproduction is concerned, it won't matter at all what these beliefs are: if that adaptive neurophysiology causes true belief (e.g., those little black things are good to eat), fine. But if it causes false belief (e.g., if I catch the right one, I'll turn into a prince), that's fine too. Indeed, the neurophysiology in question might cause beliefs that have nothing to do with the creature's current circumstances (as in the case of our dreams); that's also fine, as long as the neurophysiology causes adaptive behavior. All that really matters, as far as survival and reproduction is concerned, is that the neurophysiology cause the right kind of behavior; whether it also causes true belief (rather than false belief) is irrelevant.

He is correct in saying "natural selection does not care about the truth or falsehood of your beliefs" insofar as a belief either has no effect on behavior or produces adaptive behavior.  But let's take a look at the examples he gives: a frog who believes that flies are good food, and a frog who believes that eating flies will turn him into a prince; a true belief and a false one.  The problem here is that the frog with the first belief will likely be willing to eat flies in a wide variety of situations, whereas the one who thinks he will turn into a prince will be unwilling to do so if he fears that a sudden transformation would send him off his lily pad to drown, or would mean leaving his frog-wife, or if he simply does not think becoming a prince is a good thing.  The false belief is far less likely to be a positive motivator for eating, and far more likely to be a deterrent in certain situations, than the more general, correct belief.  As for the relevance of "true belief"...it's kind of a recurring theme, so I'll address it later.
What we learn from Crick and Churchland (and what is in any event obvious) is this: the fact that our hypothetical creatures have survived doesn't tell us anything at all about the truth of their beliefs or the reliability of their cognitive faculties. What it tells us is that the neurophysiology that produces those beliefs is adaptive, as is the behavior caused by that neurophysiology. But it simply doesn't matter whether the beliefs also caused by that neurophysiology are true. If they are true, excellent; but if they are false, that's fine too, provided the neurophysiology produces adaptive behavior.
If that entire paragraph sounds at all familiar, it's because it is a painstaking repetition of the exact same idea alluded to in the previous paragraph and explicitly stated in its last sentence.  But I am not one to judge.  If I were accused of being redundant every time I was redundant, then I would be accused of being redundant.
So consider any particular belief on the part of one of those creatures: what is the probability that it is true? Well, what we know is that the belief in question was produced by adaptive neurophysiology, neurophysiology that produces adaptive behavior. But as we've seen, that gives us no reason to think the belief true (and none to think it false). We must suppose, therefore, that the belief in question is about as likely to be false as to be true; the probability of any particular belief's being true is in the neighborhood of 1/2. But then it is massively unlikely that the cognitive faculties of these creatures produce the preponderance of true beliefs over false required by reliability. If I have 1,000 independent beliefs, for example, and the probability of any particular belief's being true is 1/2, then the probability that 3/4 or more of these beliefs are true (certainly a modest enough requirement for reliability) will be less than 10^-58. And even if I am running a modest epistemic establishment of only 100 beliefs, the probability that 3/4 of them are true, given that the probability of any one's being true is 1/2, is very low, something like .000001.[7] So the chances that these creatures' true beliefs substantially outnumber their false beliefs (even in a particular area) are small.
Two interesting assumptions are made.  One is that beliefs are acquired at random.  The other is that, by setting the probability of any belief being true at 50% and then stating that it is profoundly unlikely that 75% of a large random sample of beliefs are true, he is stating something of significance.  The argument, in case that rephrasing makes it clearer, is that, given a .5 chance that any particular belief is true, it is unlikely that .75 of all beliefs are true.  I am sure this is a remarkable observation to someone out there.
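Out of curiosity, his arithmetic can be checked directly: under his assumption that each belief is an independent fair coin flip, his figures are just binomial tail probabilities. Here is a quick sketch in Python (my own check, not anything from the article; the function name is mine):

```python
from math import comb

def prob_at_least(n: int, k: int) -> float:
    """Exact probability that at least k of n independent beliefs
    are true, when each belief has a 1/2 chance of being true."""
    return sum(comb(n, c) for c in range(k, n + 1)) / 2 ** n

# Plantinga's two cases: at least 3/4 of the beliefs turn out true.
p_1000 = prob_at_least(1000, 750)  # he claims less than 10^-58
p_100 = prob_at_least(100, 75)     # he claims "something like .000001"

print(p_1000)  # comes out below 10^-58, so that bound holds
print(p_100)   # on the order of 10^-7, roughly his ballpark
```

So his numbers do check out under his own assumptions; the objection in this post is to the assumptions themselves (random, independent, coin-flip beliefs), not to the arithmetic.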

So, I guess now is the time to lay my cards on the table:  the chances of any given belief being true are higher than he suggests.  Why, you may ask?  Well, intrigued reader who obviously cares about this a little too much, it is because, even though "producing true belief is irrelevant" as long as the neurophysiology produces adaptive behavior, adaptive behavior is far more likely to come from true beliefs than from false ones.  True beliefs, or those that at least approximate reality well, are inherently more likely to give advantages to the creature that possesses them than false ones.  This is because false beliefs rely on luck and on specific contexts in order not to be maladaptive, and are most often outright maladaptive in almost all situations if they deviate from reality enough.  By contrast, there is almost no imaginable circumstance in which a true belief is maladaptive, even if there are some cases where it would be nonadaptive and irrelevant to survival.  To get a better idea of the problem here, I summon forth another example he uses in establishing this argument, from Wikipedia:
Perhaps Paul very much likes the idea of being eaten, but when he sees a tiger, always runs off looking for a better prospect, because he thinks it unlikely the tiger he sees will eat him. This will get his body parts in the right place so far as survival is concerned, without involving much by way of true belief. ... Or perhaps he thinks the tiger is a large, friendly, cuddly pussycat and wants to pet it; but he also believes that the best way to pet it is to run away from it. ... Clearly there are any number of belief-cum-desire systems that equally fit a given bit of behaviour
In the first situation, he has offered up a pair of false beliefs that interact to produce adaptive behavior.  The first is a maladaptive desire to be eaten, and the second is a belief that the animal about to eat him is unlikely to do so, which compels him to leave.  Of course, the problem is that this set of beliefs is nowhere near as beneficial as a simple fear of being mauled to death.  Primarily because it depends on Paul being able to outrun a tiger, on the second belief always trumping the first whenever it triggers, and on Paul not putting himself into an inescapable situation before he has a chance to be prompted to believe that the approaching animal is not fit to devour him.  The second belief acts as a safety net that only works if he is in a position to meander away safely.  In any other circumstance, he will be screwed over, and will have put himself in that position due to the maladaptive first belief.

The second scenario is slightly better, because the two beliefs effectively combine into "the tiger is a large, friendly, cuddly pussycat; run away from it".  The only problems he would incur are that he would run away from anything he wanted to pet (not too significant), that he would still be willing to approach the tiger under the assumption that it is a "cuddly pussycat" prior to his attempt to "pet it", and that this once again relies on someone being able to outrun a tiger while simultaneously having a mitigated, but still inordinately large, desire to be near one.  And this is to say nothing of how he would deal with news of tigers approaching (as opposed to approaching them himself), or how many problems it might cause for other people if he insists that tigers are harmless.  Who knows what would happen if he didn't inform his children about the proper way to "pet" the tigers...
But of course this same argument will also hold for us. If evolutionary naturalism is true, then the probability that our cognitive faculties are reliable is also very low. And that means that one who accepts evolutionary naturalism has a defeater for the belief that her cognitive faculties are reliable: a reason for giving up that belief, for rejecting it, for no longer holding it. If there isn't a defeater for that defeater—a defeater-defeater, we could say—she can't rationally believe that her cognitive faculties are reliable. No doubt she can't help believing that they are; no doubt she will in fact continue to believe it; but that belief will be irrational. And if she has a defeater for the reliability of her cognitive faculties, she also has a defeater for any belief she takes to be produced by those faculties—which, of course, is all of her beliefs. If she can't trust her cognitive faculties, she has a reason, with respect to each of her beliefs, to give it up. She is therefore enmeshed in a deep and bottomless skepticism. One of her beliefs, however, is her belief in evolutionary naturalism itself; so then she also has a defeater for that belief. Evolutionary naturalism, therefore—the belief in the combination of naturalism and evolution—is self-refuting, self-destructive, shoots itself in the foot. Therefore you can't rationally accept it. For all this argument shows, it may be true; but it is irrational to hold it.
And that's how he refutes naturalism: by indicating that our brain is unreliable and pulling a Descartes on us, specifically focusing on the "belief in evolutionary naturalism" that is assumed in order to establish that the brain is unreliable.  But the problem is that the brain is not unreliable according to his projection based on evolutionary theory, at least not to the degree that Plantinga would have us believe.  As I said above, the probability of a belief being true is higher than the probability he gives.  Partly because I cheat: I feel that "true" is overrated, and "true enough" needs to be given some credit as well.  Making it a dichotomy between beliefs that are either wholly true or wholly false misses the point of having a "reliable" brain.  Example: believing that an object is blue is technically a false belief.  In reality, an object perceived as blue is just that:  perceived as blue.  It is reflecting light of a wavelength interpreted by our eyes as blue.  This is why we need a more lax distinction between "true" and "false": sometimes the answers we arrive at are necessarily simplistic in comparison to the reality.

Another part of it is that not all beliefs are equally probable to hold: any given belief is more likely to be believed the closer it is to reality.  Picture an acute-angled slice of a pie chart.  Imagine that the point that was once at the center of the chart represents the one true belief about a subject matter, and every other belief is "false".  The further out you go, the more false the belief is and, in addition, the more possible beliefs there are.  There are an increasingly large number of possible beliefs the more wrong you become, so the number of beliefs close to the truth is much smaller than the number of possible wrong ones.  This would suggest that a false belief is the more likely draw, given the sheer difference in numbers.  But I would like to declare right now that belief is not determined by chance alone.  We do not simply receive beliefs randomly plugged into our skull from the set of all possible options.  We receive our beliefs based on, well...reality.  Our beliefs are inherently biased toward the shallow end of the slice, deviating from the one true belief as little as possible.  This is because it requires an incredible amount of effort to reach conclusions about reality that are backasswards to that degree, and because there are probably so many ways of discovering its falsity once you reach the "counterfactual" zone of false beliefs that it is unlikely that any creature that isn't already non-functional would adopt one.  The "true enough" beliefs are significantly more likely because they actually have evidence to support them, and thus reality serves as the common reference point for determining our beliefs.

Anyway, further granting that beliefs are more likely to be maladaptive the further they deviate from reality, and that they require other improbable but complementary false beliefs to become even potentially adaptive or neutral (as in the tiger examples), it becomes clear that false beliefs should be far less common, or should at least be of small enough consequence, to make our brain "reliable enough".
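To illustrate the point, here is a toy model of my own devising (not anything from Plantinga or from the biology literature). Grant exactly the correlation argued for above: agents whose beliefs track reality better behave more adaptively on average. Then selection on behavior drags belief-reliability upward, even though selection never "cares" about truth directly:

```python
import random

def evolve(generations: int = 200, pop: int = 500,
           mutation_sd: float = 0.05, seed: int = 1) -> float:
    """Toy model: each agent carries a belief-accuracy score in [0, 1].
    The one assumption made (the one Plantinga denies) is that fitness
    grows with accuracy; here fitness simply equals accuracy.
    Returns the population's mean accuracy after selection."""
    rng = random.Random(seed)
    agents = [rng.random() for _ in range(pop)]  # mean starts near 1/2
    for _ in range(generations):
        # Reproduction is fitness-proportional: more accurate agents
        # tend to leave more offspring, though any agent can reproduce.
        parents = rng.choices(agents, weights=agents, k=pop)
        # Offspring inherit accuracy with a little mutational noise.
        agents = [min(1.0, max(0.0, a + rng.gauss(0, mutation_sd)))
                  for a in parents]
    return sum(agents) / pop

print(evolve())  # mean accuracy ends up far above the 1/2 baseline
```

Drop the assumed correlation (say, by using uniform weights instead of `weights=agents`) and mean accuracy just drifts around 1/2, which is effectively the scenario Plantinga describes; the disagreement is entirely over whether the correlation holds, not over what follows from it.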

I'll admit, false beliefs can creep in, but they would be scarce in comparison to true beliefs, which are infinitely more convenient in that they don't need to rely on other beliefs to keep you functional.  I assume it is rather analogous to lying:  tell the truth and everything is straightforward, but tell a lie and you had better be prepared to continue lying in order to provide a network of support for the original.  Instead of simply rendering the world accurately, you need to create a flimsy house of cards and try your damndest not to accidentally knock the whole thing over.  Humans do in fact have known false beliefs and perceptions.  We have oversimplifying heuristics, we experience known illusions, and we encounter common cognitive biases and delusions every now and then as well.  So, in short, the human mind is unreliable to a degree, just as one would expect for an evolved brain in a species that, though not infallible, has its brain as its almost exclusive remarkable feature.  Interestingly, the explanations for these shortcomings are less than satisfactory for those purporting theistic evolution/creation.  Well...I thought it was interesting, at least...

Just a side note, though: even if "naturalism" were refuted, we would still need to adopt methodological naturalism in order to function in the world, and we would still have no reasonable case for believing in anything but methodological naturalism.  Indeed, there is a possibility that there is more to reality than the observed, natural world, but, sadly, we have yet to see evidence for whether there is more, let alone what kind of things we should expect that "more" to be.
So reflect once more on what we know about these creatures. They live in a world in which evolutionary naturalism is true. Therefore, since they have survived and reproduced, their behavior has been adaptive. This means that the neurophysiology that caused or produced that behavior has also been adaptive: it has enabled them to survive and reproduce. But what about their beliefs? These beliefs have been produced or caused by that adaptive neurophysiology; fair enough. But that gives us no reason for supposing those beliefs true. So far as adaptiveness of their behavior goes, it doesn't matter whether those beliefs are true or false.
Here's the point to drive home, I suppose:  adaptiveness of belief does not guarantee truth of belief, but adaptiveness of belief does correlate with accuracy of belief (which in turn correlates with the likelihood of acquiring that particular belief, since beliefs are easier to adopt when you actually have facts to support them).  I have yet to be convinced that adaptiveness of a belief and veracity of a belief are mutually exclusive, despite the number of times it has been repeated.
Suppose the adaptive neurophysiology produces true beliefs: fine; it also produces adaptive behavior, and that's what counts for survival and reproduction. Suppose on the other hand that neurophysiology produces false beliefs: again fine: it produces false beliefs but adaptive behavior. It really doesn't matter what kind of beliefs the neurophysiology produces; what matters is that it cause adaptive behavior; and this it clearly does, no matter what sort of beliefs it also produces. Therefore there is no reason to think that if their behavior is adaptive, then it is likely that their cognitive faculties are reliable.
Again: yes, it does not matter whether a belief is true or not if it is adaptive.  And yet true beliefs are more often adaptive than false ones, and false ones are more often maladaptive than true ones.  Fancy that.  [I can repeat things too ;)]

And now, for dessert, a brief discussion of something completely similar, stolen from the Pharyngula comment thread.  Take it away, Jim:
Because I think naturalism is a false view of reality, I have no trouble trusting that my thoughts - such as the thought that my bed exists - can be rational (i.e. possessing reason and understanding) and valid. You, on the other hand, have no doubt that your bed exists, but you subscribe to a worldview (naturalism) that provides no basis for trusting that what you think about your bed is true. You argue for the truthfulness of naturalism, but you conduct your mental activity as if it isn't true. If it were true, there would be no "you" engaged in any mental activity; that mental activity would instead be nothing more than electro-chemical neural activity induced by material causes, all of which are irrational (i.e., lacking reason and understanding).
Here is the rub:  emergent properties.  The straw naturalist/materialist always seems to dismiss humans as simply a cluster of cells/molecules/chemicals, and thoughts as mere "electro-chemical" reactions.  The argument is overly reductionist and inordinately obsessed with the components rather than the whole.  Please, though, dismiss the spoken word as mere syllables.  Dismiss the written word as simple globs of graphite, streaks of ink, or arrangements of pixels (depending on your medium).  Dismiss a building as just bricks.  Completely ignore the larger functions such things have.  An interesting note:  you can observe this kind of thing in molecules themselves.  Compounds have much different properties than their component elements possess.  A slight change in atom type or number can also significantly change a molecule.  Hell, even a change in the positions of the atoms relative to one another, without changing the component atoms at all, can result in a change in its observed properties at a macro level (e.g., isomerism, stereoisomerism).  In short, it is not that consciousness is just electro-chemical signals; it is that consciousness is a very difficult to explain phenomenon that fairly clearly results from those signals.

As for why he thinks his ability to rationally determine the truth of something is superior to that of those who admit the neural basis of their thoughts, I have no idea.  Considering all that we know about the fallibility of the human mind (independent of Plantinga, of course), it probably isn't a good idea to trust your thoughts alone, especially if you think that naturalism is wrong.  You might find yourself encased in a solipsist nightmare, forced to doubt external reality and the existence of other people, doomed to an existence defined entirely by your own thoughts, since they are the only thing beyond the Deceptive Demon's reach.

So, on that note: sleep tight.
[If you are still having difficulty seeing how the video at the start of the post relates to anything at all, you are sane.  Congratulations.]

15 comments:

pboyfloyd said...

Yea, what can anyone really say about Alvin(and the Chipmonks) Plantinga and his amazing properly basic beliefs.

What a marroon? This seems to be a philosopher's back-door to the Realm-of-B'lief

Michael said...

It is nice to see that the asylum seeker has returned to his madman's paradise...

but perhaps, because my mind is full of false belief, maybe the words I have read on your blog just now are not really there. But because i believe that I read them, it is okay... after all, there is a 50/50 chance. haha

Another great post! hope to see more soon!!

Stacy said...

I wonder how hard it is for "Jon" to go all "Jr. High" like that in the video - too funny.

"...the chances of any given belief being true is higher than he suggests it should be...""Oi" - I don't know where to begin seeker ... I believe for every drop of rain that falls, a flower grows.

Pliny-the-in-Between said...

Egads! Have you been possessed by the ghost of Tolstoy!? ;)

It'll take me a couple of days for my limited brain to assimilate all this.

mac said...

I love that Jon Guy. :-)

As to the post.....I'll get back to ya ;-)

Asylum Seeker said...

Does the realm of B'lief have any relation to the "city of R'yleh", pboy?

Thank you for the kind words, Mandar. Can't guarantee that I will have "returned" with any consistency: I tend to spend summer in a state of hibernation. But, hopefully I will be able to crawl to the blogosphere a little more often than last year.

"I believe for every drop of rain that falls, a flower grows."

I'd say it's got a 50/50 chance of being true ;)

"Have you been possessed by the ghost of Tolstoy!? ;)"

Pfft, I wish. Also, come on! This post isn't even in my top 10 in regards to length! Though I guess the language is a little dense, because I actually rephrased this post from about 4 pages of notes I scrawled down during my free time at work, the day after reading Plantinga's article and other summaries of it. Guess it's probably for the best that I left out a spiel about "factual, counterfactual, and afactual beliefs" (which was both incoherent and unconvincing, anyway).

"As to the post.....I'll get back to ya ;-)"

You damn well better.

mac said...

Adaptive behavior to a false belief can, indeed, be maladaptive.

Take the frog who thinks he may become a prince:
He goes about eating flies in hopes of becoming, while his wife frogess inhabits another less foolish frogs lilly pad...or said sily frog neglects other important aspects of his life, eating flies constantly, HOPING to become a prince - which we all know happens by a kiss from a beautiful maiden.

Perhaps the Author was writing of HIS beliefs having a 50/50 chance of being true. I'd surmise it to be lower, if this article is any indication of his beliefs foundations.

Pliny-the-in-Between said...

So consider any particular belief on the part of one of those creatures: what is the probability that it is true? Well, what we know is that the belief in question was produced by adaptive neurophysiology, neurophysiology that produces adaptive behavior.
--------------------------------
We know absolutely no such thing. All we know is that the belief system can be registered or stored within the limitations and construct of the underlying anatomy and physiology. That such beliefs can be stored in such a fashion tells us next to nothing about how the supporting structures came about. As a metaphor, I use paint cans very effectively to store screws and nails - which has absolutely no bearing on the construction of the can in the first place. Repurposing - of paint cans or neuroanatomy - tells us nothing of either's origins other than that it works.

pfft, PFFT! After all this time that is the best raspberry I rate! I am appalled - Appalled I say!

Asylum Seeker said...

Clever angle mac! I neglected the possibility that a frog trying to reach an impossible goal by eating flies instead of sating a biological desire would have all sorts of problems arise due to that fact. Another reason why, even if you get the right effect from a false belief in one respect, it just isn't as good as having the more accurate belief.

Of course, all of this might be not but mental masturbation (as they called it on the Pharyngula thread) if as Pliny (and others also on aforementioned thread) suggests, his underlying premises revolving around the interaction of beliefs and biology are flawed. There are far too many ways to tackle this thing, it appears...

GearHedEd said...

I only read the first .5 of the original post, but I'm .75 sure that you nailed it, Seeker!

Asylum Seeker said...

I wouldn't expect an evolved brain to think anything less ;)

(Man, that made me sound arrogant...)

Asylum Seeker said...

Also, if anyone ever manages to stray back here and read this again, here's a fortuitous little article that slightly supports the arguments I make in this post. Yay me!


Plokiju said...

"Jim"'s comment (quoted above, originally from the comments at Pharyngula) is characteristic of a mindset that has become much more prevalent since you first published this post. The existence of belief in something is more and more often considered to be a valid criterion for determining whether or not that thing is capital-T True.

As eloquently put forth by His Artistness Prince and recently seen on bumper-stickers, "everything you think is true". *Whew*, what a relief! Everything I think is true! But wait - if everyone's thoughts are true, doesn't that mean that the thoughts of someone I disagree with are true too? Even if they are the exact opposite of my thoughts, even if what I believe is "the other person's beliefs are not true"? How does that work?

One of the greatest gifts we have as conscious beings is the ability to throw out or redefine ideas when new data shows our old ideas to be inadequate. So, for instance, if we find that a certain assumption (like "everything you think is true") leads to a paradox, we know it must be discarded in favor of other thoughts that better explain things without resulting in logical inconsistencies.

The defining characteristic of intelligent beings is our ability to differentiate between our thoughts and reality. Crazy people are the ones who believe that everything they think is true.

But Jim doesn't just say he disagrees with you. Jim says that your beliefs prove his point. Using what sounds a little like logical arguments he explains that not only do his thoughts, by their very existence, 'prove' that what he is thinking is True, any completely opposite thoughts you might have also prove that what he is thinking is True.

People who think their thoughts are necessarily true are considered to be 'crazy'. People who think that everything proves them right, that statements and their opposites and completely unrelated ideas all support their viewpoint, are considered to be 'paranoid'.

At least that's how it was in the past. Nowadays more and more people consider the adoption of these modes of thought to be a mere lifestyle choice.