Friday, October 9, 2015

How can I know anything?

When I say that I want to believe only those things I have evidence for, people invariably ask, "But then how do you know anything?  How do you know you're not a brain in a jar?"

The answer, I suppose, is "I don't, not with certainty."  What frustrated me about pre-modern philosophy, when I studied it in college, was the desire to know things with certainty, as if you could solve for existence like a math problem.  To achieve that, most of the reasoning was a priori -- starting from basic axioms and trying to reason from there.  What I couldn't see was how you could know your axioms were correct.  Take "Everything has a cause."  Certainly I've never seen anything that doesn't have a cause.  But how can you prove that nothing is uncaused?  You can't; you assume it at the outset, and if you happen to be wrong, everything you conclude in your argument is now doubtful.

But, of course, if you assume nothing, if you say that everything is doubtful, well, maybe you're a brain in a jar.  Descartes, so far as I know, did not set out to prove "a priori reasoning is a joke," but that was kind of what I got out of him.

The way out, to me, is the scientific method.  I'm sure this methodology would have been more exciting to me, way back in fifth grade when I learned about it, if I had studied Descartes first.  At the time it was like, "Of course you do experiments to find out stuff, how else would you do it?  Doy!"  (Hey, it was the 90's.)

But I missed the real point of it, which was that instead of starting with an axiom -- something that is assumed and never questioned later on -- you start with a hypothesis.  A guess.  You assume it conditionally, you make predictions of what will happen if it's true, and then you see if your predictions are true.  If your predictions always come true, you start to believe more and more that the hypothesis is accurate.  But if even one is false, you have to abandon or at least revise your hypothesis.

So when I make the hypothesis, "my senses are generally reliable," I can then make some predictions.  For instance, I predict that my senses won't contradict one another; that I won't randomly jump around like in a dream, but will have to go step by step from one place to another; that other people will affirm that their sense-experience is similar to mine.  So far, so good -- my experience is entirely compatible with my hypothesis.  I haven't disproven the idea that I'm a brain in a jar, but given that it seems a lot more complicated and less likely, I don't spend much worry on it. 

This is how science develops.  The theory of evolution started as a guess, but it gained credibility as new finds confirmed its predictions -- that newer fossils would be more complex than older ones, that fossils would be found in the geological layers corresponding to the period when we think the creatures lived, and so forth.  When some minor deviation is found -- say, a T. rex in the Jurassic layer -- the theory has to be adjusted to accommodate it, and we get new dinosaur books that list T. rex in the Jurassic instead of the Cretaceous.  When predictions fail in a big way, with no way to reconcile the theory with them, the theory is abandoned.  If someone found a rabbit fossil in a Precambrian layer, that would pretty much disprove evolution -- but it has never happened.  That is a good reason to believe in evolution, though no one claims to be certain about it in the philosophical sense.  Certainty is really not something that happens outside of math.

Here are some theories I believe, with a high but not total level of certainty:

1.  My senses are generally accurate.
2.  Other people are mostly to be trusted,
a.  unless they have a motive to lie, or a reason why they would be mistaken,
b.  or they have a history of lying,
c.  or they are asking me to believe something that seems contradictory or highly unlikely.
[This theory allows me to believe in stuff like "Abraham Lincoln was shot by John Wilkes Booth" without believing every salesman who knocks on the door.]
3.  My moral intuitions should be followed,
a.  though I should use reason to double-check them,
b.  and not act if I am doubtful.

This last is perhaps more of a choice than a proper theory, because it contains a should.  But I do think the fact that humans have roughly similar moral intuitions, and that those intuitions match a rational idea of "what actions are good for humanity," makes it reasonable, if not fully scientific.

Why trust the scientific method?  Well, in the few centuries since it was adopted, it's given us the steam engine, the gasoline engine, the airplane, electricity, penicillin, the eradication of smallpox, the internet, sanitation, a man on the Moon, global positioning systems, and the air conditioner.  It's not that these things are indispensable (well, not all of them), but that they are good examples of how the scientific method is an effective way of navigating the world we live in.  Rather than either assuming things and never going back to check, or refusing to assume anything and living in total uncertainty, it's about moving forward with your best guess and seeing if it holds up.  It's about not privileging any belief so dearly that you won't allow future evidence to amend it.  It's humble in what it promises, but impressive in what it delivers.

It can't prove I'm not a brain in a jar, though.  Oh well. 

1 comment:

SeekingOmniscience said...

Yeah. As far as proof goes--well, the probabilities of the scientists seem to wear a fair bit better than the Absolute Metaphysical Certainties of the philosophers, for what it's worth.
