Some years ago now, I heard my then-dean, a physicist by training, express his unease at the suggestion that the mission statement for the College of Arts and Sciences include the phrase, "concern for truth." The word makes people nervous, he warned, and they're bound to ask, "whose 'truth'?" A sociologist colleague, seconding the dean's reservations, remarked that, while of course his research advances knowledge, he isn't concerned with "truth." A couple of us pointed out that, unless your conclusions are true, they aren't really knowledge, only purported knowledge; and I did my best to explain that it doesn't follow from the fact that people disagree about what is true that truth is relative to perspective. But in due course a Strategic Plan for the college specified curricular innovations required by the "fundamental questions" that have been raised "about the presumed universality and objectivity of 'truth.'"
Probably most of you have heard, not exactly this story, but other stories essentially similar, only set in other places and with different characters; for the ideas my dean had picked up are by now almost an orthodoxy in academia, taken in some quarters-admittedly, more often in the humanities and the social sciences than in the physics department!-as indications of intellectual sophistication and moral rectitude.
Naturally, proponents of this new almost-orthodoxy-the "Higher Dismissiveness," as Anthony Gottlieb calls it, or in my terminology, the "New Cynicism"-differ among themselves on the finer points. But they agree that the supposed ideal of honest inquiry, respect for evidence, concern for truth, is a kind of illusion, a smoke-screen disguising the operations of power, politics, and rhetoric; and that those of us who think it matters whether you care about the truth, who feel no need for precautionary scare quotes when we write of fact, knowledge, evidence, etc., are hopelessly naive. As if this weren't bad enough, the feminists and multiculturalists among them suggest that in our naïveté we are complicit in sexism and racism, and the sociologists and rhetoricians of science among them suspect us of reactionary conformism with the military-industrial complex.
Faced with such an intimidating double accusation of naïveté and moral backwardness, many take the ostrich attitude, apparently hoping that if they ignore the New Cynicism hard enough, it will go away. But an old-fashioned prig like myself begins to feel-well, rather like the proverbial cannibal among the missionaries.
A thoughtful cannibal will notice, at the heart of the New Cynicism, a profound intolerance of uncertainty and a deep unwillingness to accept that the less than perfect is a lot better than nothing at all. And so again and again true, fallibilist premises are transmuted into false, cynical conclusions: what is accepted as known fact is often enough no such thing, therefore, the concept of known fact is ideological humbug; one's judgment of the worth of evidence depends on one's background beliefs, therefore, there are no objective standards of evidential quality; science isn't sacred, therefore, it must be a kind of confidence trick; and so on.
But there's really no need to give up on the objectivity of truth, evidence, etc., provided you're fallibilist enough. What is most urgently needed is a realistic understanding of our epistemic situation, of how complicated evidence can get and how difficult serious inquiry can be. Of course, this will be only the beginning of the work, for then there will be the questions of the place of the sciences within inquiry generally, of the differences between science and literature, of the roots of relativism, and of the claim of the New Cynicism to represent the interests of the oppressed and marginalized.
* * *
Evidence is complex and ramifying, often confusing, ambiguous, or misleading. Think of the controversy over that four-billion-year-old meteorite discovered in Antarctica in 1984, thought to have come from Mars about eleven thousand years ago, and containing what might possibly be fossilized bacteria droppings. Some space scientists thought this was evidence of bacterial life on Mars; others thought the bacterial traces might have been picked up while the meteorite was in Antarctica; and others again believed that what look like fossilized bacteria droppings might be merely artifacts of the instrumentation. How did they know that giving off these gases when heated indicates that the meteorite comes from Mars? that the meteorite is about four billion years old? that this is what fossilized bacteria droppings look like?-like crossword entries, reasons ramify in all directions.
How reasonable a crossword entry is depends on how well it is supported by its clue and any already completed intersecting entries; how reasonable those other entries are, independent of the entry in question; and how much of the crossword has been completed. How justified a belief is, similarly, depends on how well it is supported by experiential evidence and by reasons, i.e., background beliefs; how justified those background beliefs are, independent of the belief in question; and how much of the relevant evidence the evidence includes.
The quality of the evidence for a claim is objective, depending on how supportive it is of the claim in question, how comprehensive, and how independently secure. A person's judgments of the quality of evidence, however, are perspectival, depending on his background beliefs. Suppose you and I are working on the same crossword puzzle, but have filled in some long, much-intersected entry differently; you think a correct intersecting entry must have an "F" in the middle, while I think it must have a "D" there. Suppose you and I are on the same appointments committee, but you believe in graphology, while I think it's bunk; you think how the candidate writes his g's is relevant to whether he can be trusted, while I scoff at your "evidence." Or, to take a real example: in 1944, when Oswald Avery published his results, even he hedged over the conclusion to which they pointed, that DNA is the genetic material; for the then-accepted wisdom was that DNA is composed of the four nucleotides in regular order, and so is too stupid, too monotonous, a molecule to carry the necessary information. But by 1952, when Hershey and Chase published their results, the tetranucleotide hypothesis had been discredited; and then it could be seen that Avery already had good evidence in 1944 that DNA, not protein, is the genetic material.
Inquiry can be difficult and demanding, and we very often go wrong. Sometimes the obstacle is a failure of will; we don't really want to know the answer badly enough to go to all the trouble of finding out, or we really don't want to know, and go to a lot of trouble not to find out. I think of the detective who doesn't really want to know who committed the crime, just to collect enough evidence to get a conviction, of the academic who cares less about discovering the causes of racial disharmony than about getting a large grant to investigate the matter-and of my own disinclination to rush to the library to check out the article that might oblige me to redo months of work.
Other things being equal, inquiry goes better when the will and the intellect, instead of pulling in different directions, work together; that's why intellectual integrity is valuable. But even with the best will in the world, even when we really want to find out, we often fail. Our senses, imaginations, and intellects are limited; we can't always see, or guess, or reason, well enough. With ingenuity, we can devise ways of overcoming our natural limitations, from cupping our ears to hear better, through tying knots in rope or cutting notches in sticks to keep count, to highly sophisticated electron microscopes and techniques of computer modelling. Of course, our ingenuity is limited too.
Everyone who looks into how some part or aspect of the world is-the physicist and the detective, the historian and the entomologist, the quantum chemist and the investigative journalist, the literary scholar and the X-ray crystallographer-works on part of a part of the same vast crossword. Since they all investigate the same world, sometimes their entries intersect: a medical researcher relies on an amateur historian's family tree in his search for the defective gene responsible for a rare hereditary form of pancreatitis; ancient historians use a technique devised for the detection of breast cancer to decipher traces on the lead "postcards" on which Roman soldiers wrote home.
So successful have the natural sciences been that the words "science" and "scientific" are often used honorifically, as all-purpose terms of epistemic praise. (To make sure we get the point, the television actors who promise that new, scientific, Wizzo will get our clothes cleaner wear white coats.) Unfortunately, this honorific usage disguises the otherwise obvious fact that not all, or only, scientists are good, honest, thorough, imaginative inquirers. Some scientists are lazy, some incompetent, some unlucky, a few crooked; and plenty of historians, journalists, detectives, etc., are good inquirers.
Science is neither sacred nor a confidence trick. Standards of stronger and weaker evidence, better and worse conducted inquiry, are not internal to the sciences; and there is no mode of inference, no "scientific method," exclusive to the sciences and guaranteed to produce true, or probably true, or more nearly true, or more empirically adequate results. Nevertheless, as human cognitive enterprises go, the natural sciences have been remarkably successful; not because they use a uniquely rational method of inquiry, unavailable to other inquirers, but in part because of the many and various "helps" they have devised to overcome natural human limitations. Instruments of observation extend sensory reach; models and metaphors stretch imaginative powers; techniques of mathematical and statistical modelling enable complex reasoning; and the cooperative and competitive engagement of many people in a great mesh of subcommunities within and across generations not only permits division of labor and pooling of evidence but also-though very fallibly and imperfectly, to be sure-has helped keep most scientists, most of the time, reasonably honest.
Science, like literature, requires imagination. Scientists, like writers of literature, stretch and amplify the language they inherit: a non-proteinous substance in the nucleus of cells is dubbed "nuclein," and later comes to be known as "nucleic acid"; then we have "deoxyribose nucleic acid"; then "ribonucleic acid," subsequently acknowledged to be "ribonucleic acids," in the plural; and then-almost a century after "nuclein" was coined-"transfer RNA," "messenger RNA," and so on. Scientists, like writers of literature, rely on metaphors: the chaperone molecule, the Spaghetti Hypothesis, the uncles-and-aunts experiments, parental investment, and so forth. But it doesn't follow, and it isn't true, that science is indistinguishable from fiction. The distinction between the imaginative and the imaginary is key.
Scientists engage in writing, and writers of literature engage in inquiry; but the word "literature" picks out a bunch of kinds of writing, while the word "science" picks out a bunch of kinds of inquiry. A scientist dreams of structures, classifications, and laws that, if he is successful, are real, and explanations that, if he is successful, are true. Imagination, and imaginative exploration of imagined explanations, comes first; but to go beyond mere speculation, appraisal of the likely truth of the imagined conjecture, itself often requiring imagination in the design of experiments, instruments, etc., must come after. And this requires that serious scientific metaphors, those that are not just picturesque speech but working intellectual tools, eventually be spelled out in literal detail: what, literally, is invested in reproduction? what constitutes maximizing return?
Progress in the sciences is ragged and uneven, and each step, like each crossword entry, is fallible and revisable. But each genuine advance potentially enables others, as a robust crossword entry does; "nothing succeeds like success" is the phrase that comes to mind. Think of Watson and Crick checking their model of DNA using only a ruler and a plumb line, and then of Max Perutz, years later, checking his structure for the far more complicated hemoglobin molecule using a complex computer program; or of how, starting with the (relatively) simple X-ray, eventually we had the PET-scan, the CAT-scan, MRI.
Just about every inquirer, in the most mundane of everyday inquiries, depends on others; otherwise, each would have to start on his part of the crossword alone and from scratch. Natural-scientific inquiry is no exception; in fact, it is more so-the work, cooperative and competitive, of a vast intergenerational community of inquirers, a deeply and unavoidably social enterprise. But it doesn't follow, and it isn't true, that scientific inquiry is nothing more than a process of social negotiation in which scientists trade their theoretical loyalties for prestige, or that the entities postulated in scientific theories are nothing more than social constructions.
It is true, however, that both the internal organization of science and its external environment can affect how well or how poorly scientific work gets done. As ever more elaborate equipment is needed to make ever more recherché observations, scientific work tends to get more expensive. When only governments and large industrial concerns can afford to support science, when some scientists are tempted to go prematurely to the press, when some find it possible to make fortunes from their work, and when the expert-witness business booms, there is no guarantee that mechanisms that have thus far proven more or less adequate to sustain intellectual integrity will continue to do so. There are no grounds for complacency.
Some of the knowledge the natural sciences have achieved has the potential to cause grave harm-knowledge brings power, and power can be abused. Of course it doesn't follow, as some proponents of the New Cynicism are tempted to conclude, that the natural sciences haven't achieved genuine knowledge after all. But difficult moral and political questions about the distribution of resources, the applications of scientific knowledge, etc., cannot responsibly be left to scientists alone to settle. Again, there are no grounds for complacency.
In scientific inquiry, and in inquiry of every kind, what we take to be legitimate questions sometimes turn out to be flawed. Questions about the properties of phlogiston, for example, turn out to rest on a false presupposition, and so have no true answer; some texts turn out to be ambiguous in ways of which the author is unaware, and so have no uniquely correct interpretation; and so on. None of this has any tendency to undermine the objectivity of truth. Sometimes, speaking carelessly, we say that something is true for you, but not for me. But this has no tendency to undermine the objectivity of truth either; what we mean is only that the something-liking chocolate chip cookie ice cream, say, or being over six feet tall-is true of you but not of me; or else that you believe whatever-it-is, but I don't.
A statement or belief is true just in case things are as it represents them to be; so everyone who believes anything, or who asks any question, implicitly acknowledges-even if he explicitly denies it-that there is such a thing as truth. Truth is not relative to perspective; and there can't be incompatible truths (this is a tautology, since "incompatible" means "can't be jointly true"). But there are many different truths-different but compatible truths-that must somehow fit together. It doesn't follow that all the truths about the world must be unified in the logical positivists' strong sense of that term, that they must be reducible to a privileged class expressed in a privileged vocabulary; nor, in particular, that all the truths about the world must be expressible in the language of physics. Rather, physics supplies a contour map on which the social sciences, history, etc., superimpose a road map-the superimposed maps each representing, in its own "vocabulary," the same one, real world.
Excerpted from PUTTING PHILOSOPHY TO WORK by SUSAN HAACK Copyright © 2008 by Susan Haack. Excerpted by permission.