Personal Bias (Assorted Meandering Reflections)
December 31, 2008, 3:10 pm
Filed under: Epistemology, Science

It is widely held that personal bias is a bad thing, at least epistemically (i.e. when it comes to knowing).  

People value loyalty when it comes to friends and family.  This seems a bit like personal bias (or maybe it involves personal bias).  But we recognize that this kind of loyalty can go wrong (even if it does not always go wrong).  

What is it to be biased?  When is it wrong?  Is it wrong when it comes to knowing?  Always?


Maybe bias in knowing works like this: You are biased towards A if you assign more weight to the evidence for A than it seems to you that it has.


But there is a deeper kind of bias on top of this.  This deeper bias works like this: The evidence for A seems to you to be weightier than it actually is.


If you are biased in the first way, you decided to be biased.  If you are biased in the second way, you won’t even be aware of it.


I’d say that it is exceedingly difficult to sort out the one from the other in daily life.  For what begins as the first can morph into the second.  You decide to think of your favourite sports team as better than they seem to you.  You decide to oversell their good qualities and undersell their bad qualities.  But, after a while, this makes things seem different.  You no longer decide to oversell their good qualities.  Instead, the team starts to seem that good to you.


Also, who would admit (even to themselves) that they were biased in the first sense?  No, you’ll convince yourself that you were biased in the second sense.  You’ll feel that you were biased in the second sense.  It will seem that way.  Except maybe in moments of special moral clarity.


Anyway, all this assumes that epistemic bias amounts to misjudging the weights of evidence for propositions.  Is that accurate?


What if you’re considering whether you can jump across a canyon, and it seems to you that you can’t make it?  If your life depends on it, isn’t it good and proper to try to convince yourself that you can make it?  Won’t bias here save your life?  Won’t it help you successfully navigate life?  Won’t it furnish you with the knowledge that you really can make it across, even though it seems that you can’t?  I’d say that this is an example of bias furnishing us with knowledge, not merely true belief.  (Others would say that you merely reached out in faith and just happened to be right.  I’d say, yes, you did do this, but by doing this you achieved knowledge and made contact with the world.  Or you probably did.  More information is needed to make a full judgment.)


Also, science is inherently conservative and biased.  (I don’t mean for this to be a criticism.)  Maybe ‘loyal’ is a better word than ‘biased’.  Research programs only get off the ground because a group of researchers is committed to certain paradigm theories in the face of anomalies.  Science wouldn’t work if people always tried to build theories from the ground up.  They work within unquestioned assumptions so that they can dial in on very specific things to question and test.  Doubts can only have meaningful content in the context of belief.  You might say that this is non-biased loyalty, and so it doesn’t count as bias.  But you can be sure that sometimes it amounts to the second kind of bias.  Also, despite what scientists have to say about it, the decision about which research projects to undertake cannot be made without reference to our needs and wants.


At the bottom of it all, you should only ever believe something because it seems to you that it is so.  You can never escape this baseline “it seems so”.  You might think that “knowing it is so” is a very different thing from “it seeming so”, and much to be preferred.  It may be more than “it seems so”, but it is not less.  You can be more or less confident in your “it seems so”.  But there is always a risk that you are biased in the second sense.


You can be non-culpably biased in the second sense.  Every impulse of epistemic action is issued under the risk of bias in the second sense.

