September 2000

A Scientist's Notebook
by Gregory Benford

Risks and Realities

In March 2000 NASA decided to ditch one of its primary research satellites. It will be gone by the time you read this, its orbit deliberately lowered until it dives into the Pacific Ocean.

How come? Because the second of its three gyroscopes is failing, the first already gone, and with only one left the satellite will be unmanageable. Left alone, it would spiral into the atmosphere eventually anyway, slamming several car-sized chunks into the surface. But the observatory, which has pioneered our view of the universe in the high-energy gamma-ray portion of the spectrum, could carry on for at least several more months before that second gyroscope dies.

NASA made its decision based on the risk to those below. The odds of injuring anybody? Less than one in a million, depending on exactly what assumptions go into the calculation.
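Where does a number like one in a million come from? Here is a minimal sketch, in Python, of the style of calculation involved. The debris casualty area and population figures are invented placeholders, not NASA's actual reentry analysis; only the method---expected casualties equal the area swept by surviving debris times the average population density---is the point.

```python
# A minimal sketch of a reentry "casualty expectation" estimate.
# All numbers here are illustrative assumptions, not NASA's figures.
EARTH_SURFACE_KM2 = 5.1e8            # total surface area of Earth
world_population = 6.0e9             # roughly, circa 2000
avg_density = world_population / EARTH_SURFACE_KM2   # people per km^2

# Assume the debris that survives reentry sweeps out a total "casualty
# area" -- the footprint within which a person standing there is struck.
casualty_area_km2 = 1.0e-4           # assumed: about 100 square meters

# If the debris were equally likely to come down anywhere on the globe,
# the expected number of people struck would be:
expected_casualties = casualty_area_km2 * avg_density
print(f"expected casualties per uncontrolled reentry: {expected_casualties:.1e}")
# Steering the satellite into an empty stretch of the Pacific drives
# this figure far lower, which is the point of a controlled deorbit.
```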

Is the yield to science worth such a chance? A NASA spokesman said, "Any risk is unacceptable," which I take as the usual media hyperbole.

Still, such questions arise constantly in our technological world. In my last column I dealt with the Mars probe failures, but the element of risk attends every human activity.

We often forget this, demanding that something be "safe" when nothing ever truly is. As you sit reading this, probably indoors, radon gas accumulates in the room with you. In many homes it probably yields a higher level of radioactivity than if you were sitting right on top of a nuclear waste storage facility.

And don't forget that at any moment, a meteorite could hammer down through the atmosphere and squash you. (Only one person has verifiably been hit by one, a woman lying in bed in Alabama.)

So what does a scientist make of such quandaries? I could react with a dry numerical analysis of the trade-offs involved with the satellite, knowledge vs. danger, but that would miss the most important aspect: human emotion.

*     *     *

As we evolved, with only crude technology, disaster was always natural---floods, storms, plagues.

Consider earthquakes, nowadays the universal fear of Californians. To nomads who lived on the land, without the comforts of houses, they were once no more troubling than a passing squall. With technology and the advance of comforting domesticity came disaster of a different kind---self-engendered. It is our Faustian bottom line.

Alas, techno-risk brings with it the vexing problem of risk assessment. In 1900, the average lifetime in the U.S.A. for both men and women was 48 years; now it's about 75. (Simply being female is now worth about seven more years than the average male gets.)

Science has been so successful in giving us years that we now seem to brood darkly on the possibility that it will, through accident and environmental effects, subtract a few. We pass on from more protracted causes now, too, as a recent New Yorker cartoon showed: two old people talking, and one says, "In my day, people used to just die."

Lawyers argue their cases as though the world should rightly guarantee us all a life free of any chance of accident—and if something bad happens, it must be somebody else's fault.

That attitude arises because juries welcome it. Their perceptions of risk color courtroom judgments and public policy alike, but seldom very rationally (i.e., seldom with any quantitative sense).

Activity or Technology        League of Women Voters    College Students    Experts
Nuclear Power                            1                      1              20
Motor Vehicles                           2                      5               1
Handguns                                 3                      2               4
Smoking                                  4                      3               2
Motorcycles                              5                      6               6
Alcoholic Beverages                      6                      7               3
General (private) aviation               7                     15              12
Police Work                              8                      8              17
Pesticides                               9                      4               8
Surgery                                 10                     11               5

Table 1
Ordering of perceived risk for 10 activities and technologies. The ordering is based on the average (geometric mean) risk ratings within each group. Rank 1 represents the most risky activity or technology.

Our perceptions differ greatly. Table 1 compares the rankings of several groups, including experts who know the statistics, and the differences are daunting.
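The caption's "geometric mean" is simply the n-th root of the product of the individual ratings. Here is a minimal sketch, in Python with made-up ratings, of how each group's ordering would be produced:

```python
import math

def geometric_mean(ratings):
    """The n-th root of the product of the ratings."""
    return math.exp(sum(math.log(r) for r in ratings) / len(ratings))

# Made-up ratings from one hypothetical group (higher = riskier):
ratings = {
    "Nuclear Power":  [90, 85, 95],
    "Motor Vehicles": [60, 70, 65],
    "Surgery":        [30, 40, 35],
}
ranked = sorted(ratings, key=lambda k: geometric_mean(ratings[k]), reverse=True)
for rank, activity in enumerate(ranked, start=1):
    print(rank, activity)     # rank 1 = perceived as most risky
```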

Notice how nuclear power takes a beating in the eyes of the reasonably aware public. TV and movies have told us for so long that nukes are very bad that we have absorbed the message. Never mind that nobody in western Europe or North America has ever died of the effects of nuclear power generation.

Cop shows make police work look more risky than it is. If you've ever spent much time with real cops, you find that they are very careful people. I knew one who wore his bulletproof vest to ordinary civilian events.

Interestingly, all the hospital shows seem to have reassured the public about the dangers of surgery. Having just barely survived a burst appendix fifteen years ago, I find that the bottom line of Table 1 looks very much like the metaphorical bottom line, too.

The public has a fairly reasonable idea of the risks from smoking and handguns, agreeing with the experts who know the numbers. This may help explain why attack lawyers have gotten away with assaulting the manufacturers of these products, extorting hundreds of billions of dollars (and several billions for themselves, of course) for the sin of providing products that adults bought of their own free will. In the case of cigarettes, there was even a big warning label on the box. No matter; tobacco has become an evil, so those who provide it are, too.

Risk has some funny side effects. We may soon enough see those who provide other health-risk products, like red meat or alcohol or dairy fats, dragged into court as well, as if we had never heard that these things bring on heart attacks or cancer.

Notice that the experts know alcohol poses a big risk, but the public discounts it, apparently because it makes us feel good. I drink about a bottle of wine a day, plainly far over into the risky level, but do I care? Nope. But I agree with the experts. I simply take the risk, knowingly.

So, unsurprisingly, the public doesn't think like the experts. It's not as though they've ever been schooled in the elementary calculations one needs to make comparisons. The media pictures they get of disaster stress spectacle. Good footage overrules careful weighing of alternatives.

Also, paranoia is the simplest plot device. Want an instant bad guy? Oil and nuclear power companies serve nicely; they are emblems of faceless, impersonal institutions. The public responds to this shorthand, and draws the wrong conclusions.

*     *     *

But what of the experts?

I've spent a fair amount of time in the company of risk assessors, and it's striking that so much of their work concerns air safety. Yet only one death occurs per billion passenger miles, yielding a few hundred per year in the U.S.A. (This, versus about 150 deaths per day from auto accidents, and more than 1,000 per day from smoking, for the whole U.S.A. population.) The odds of dying are 1/10,000 per year for frequent flyers---comparable to the murder and suicide risk in the general population.

The average car driver must travel about 100 million miles to incur a 50-50 risk of dying in a crash. By airplane the odds are ten times better; you would have to fly about a billion miles. Yet we hear far more about air crashes than auto wrecks.
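As a rough check, here is a minimal sketch in Python of how a per-mile death rate translates into a 50-50 mileage. The rates are the round figures quoted above, treated as assumptions.

```python
import math

def miles_for_even_odds(deaths_per_mile: float) -> float:
    """Miles at which cumulative survival drops to 50 percent, assuming
    each mile carries the same small, independent risk."""
    # Survive m miles with probability (1 - p)^m = 0.5  =>  m = ln 0.5 / ln(1 - p)
    return math.log(0.5) / math.log1p(-deaths_per_mile)

car_rate = 1.0 / 100_000_000       # ~1 death per 10^8 driver-miles (rough)
air_rate = 1.0 / 1_000_000_000     # ~1 death per 10^9 passenger-miles (quoted above)

print(f"car: {miles_for_even_odds(car_rate):,.0f} miles for even odds")
print(f"air: {miles_for_even_odds(air_rate):,.0f} miles for even odds")
```

With those rates the even-odds point comes out near 70 million miles for cars and 700 million for airliners---the same order of magnitude, and the same ten-to-one ratio, as the round numbers above.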

I think concern with air safety runs so high because the rational, intellectual classes fly often. It's so quantifiable, so high-tech, so clean an issue. And "we" (the experts) do harbor some fear of flying. Never mind that we know Bernoulli's equation explains why planes stay up—I teach that stuff, and still I occasionally feel a primate fear, gazing down from a dizzying height. (Though taxi cabs terrify me more, especially in New York.)

I suspect that the crucial issue here is control. We're helpless in an airplane, suspended at 35,000 feet. Cars we drive ourselves, trusting in our skills. People want to rely on themselves. Trusting a professional pilot makes little difference; we want to be masters of our fate.

This attitude extends to technology generally. Old is good; we think we know it, and thus it is safer. Generally, we fear new technology and shrug our shoulders at old risks. Time-honored, they seem homey. Railroad travel is riskier than flying, and cars are much worse. Yet few fear climbing into their own car, or boarding Amtrak.

No, it's the newness that puts us off. As science fictional people, this should worry us. It amounts to a bias against the future.

*     *     *

What, then, of nuclear power, that long-ago symbol of the future?

Chernobyl has yielded 31 dead already from direct effects. Among the 24,000 people living between 3 and 15 kilometers from the plant, a simple projection from the dose they received gives 131 added cancers in that population. That is a 2.6% increase in the expected number. If they all smoked---and a majority did, actually---that would give a 30% increase.

Ah, but what of the future? Considering the 75 million exposed in the Ukraine and Byelorussia, we get about 3,500 extra cancers, summed up over their entire lives.

Sounds like a lot. But it is only about a 0.02% increase over the 15 million cancers expected in that population anyway---and just 0.0047% of everyone exposed.

Newspaper headline, front page:

3500 DEAD FROM CHERNOBYL.

Or, taking the other tack, there's a small item at the bottom of page 35 of that same newspaper:

CHERNOBYL CANCER RATE "INFINITESIMAL PERCENTAGE" SAYS PHYSICIST. ENVIRONMENTAL GROUPS ATTACK HIM.

Okay, I favor the guy in the second headline. Still . . .

Which one of these framings is "right"?

Neither---they just weigh different aspects of the problem. But it's clear how the media play the game.
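A quick back-of-the-envelope check in Python shows where the two framings come from, using only the figures quoted above.

```python
# Figures quoted above, not independent data:
wider_population = 75_000_000      # exposed in the Ukraine and Byelorussia
extra_cancers = 3_500              # projected over their lifetimes
expected_cancers = 15_000_000      # cancers that population expects anyway

# The physicist's framing: increase over the cancers coming regardless.
print(f"{extra_cancers / expected_cancers:.3%} of expected cancers")        # 0.023%
# The headline's implicit framing: extra cancers per person exposed.
print(f"{extra_cancers / wider_population:.4%} of the exposed population")  # 0.0047%
```

Both lines describe the same 3,500 projected deaths; the headline writer and the physicist have simply chosen different denominators.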

Nuclear power fills a need that will be met somehow, after all. In North America it has lost the battle for public opinion. In Europe there is a regional schizophrenia. The French generate most of their electricity in nuclear plants, and have never had a big, risky event. Yet most of the other western Europeans are trying to shut down the reactors they have. In eastern Europe, reactors are regarded more favorably. Even the Russians continue with their extensive program, probably because they have so much invested.

Burning oil and coal, on the other hand, kills about 10,000 people per year in the USA from increased lung cancer and emphysema. This number has been known from careful NIH studies for decades. Nobody gets excited about those deaths, ever . . . except the relatives, of course.

Thus "no nukes" may well recall the old saying: For every complex problem there is a solution that is simple, appealing---and wrong. So why do people feel so strongly?

*     *     *

Part of the problem is that we think only of showy disasters---thanks, Hollywood---while ignoring everyday dangers. In Table 2 I've listed some common sources of carcinogens and their estimated dangers.

Table 2: INVISIBLE EVERYDAY RISKS

Daily Carcinogen Exposure                    Relative Risk (tap water = 1)
Tap Water                                         1
Well Water, Contaminated, Silicon Valley          4
Swimming Pool, 1 Hour (Child)                     8
Formaldehyde in Workplace, Daily              5,800
Home Air (14 hr/day)                          6,000
Mobile Home Air (14 hr/day)                  21,000
PCBs in Diet                                      2
Bacon, 100 gr. cooked                             3
Comfrey Herb Tea                                 30
Peanut Butter Sandwich                           30
Brown Mustard, 5 gr.                             70
Basil, 2 gr.                                    100
Diet Cola                                        60
Beer, 12 oz.                                  2,800
Wine, 0.25 liter                              4,700

Here, "Risk" assumes that humans are like rats, as far as response to the environment goes. Human response to carcinogens is taken as linear in the dose received—directly proportional, with no weird factors. A rating of 1 means a substance will induce tumors in one rat lifetime. This dose is then scaled to human daily use. Wine is 4700 times more likely to give you cancer than tap water, even if you live in LA.

The huge risk from mobile home air comes from the outgassing of the plastics used in cushions and carpeting. Home air is chancy for these reasons, plus the radon gas that leaks up from naturally decaying radium in the soil. Many artificial materials decay into formaldehyde, which is tough on lungs. The best solution for all these places is to open a window.

Swimming pools have chlorine, a carcinogen. Plenty of common foods contain the cancer-causing agents plants evolved long ago to defend themselves against insects and animals. The most successful can even induce a zesty taste when nibbled by big animals like us—they're our spices. A hugely successful example is the tobacco plant—instant poison to many insects, a stimulant to big guys like us.

There are some subtle problems with this approach to cancer threat analysis, but it probably yields useful approximations.

All these data assume that you can generalize from rats exposed to, say, a heavy dose of diet cola. Weighted by the rat-to-human body-weight ratio, people drink comparatively little of the stuff, but you have to start somewhere. (I hate most sweeteners myself, so it's pleasant to know the stuff causes cancer. But then, everything seems to.) These are big assumptions, but common ones in the risk-measuring business. Without them, there would be little to say. Keep that in mind.

Peanut butter, that homey symbol of health, contains fungal poisons called aflatoxins, which cause cancer in rats—and presumably in us, though no study is ever going to pry that one factor loose from the myriad dietary patterns we have.

Assessing such risks is hard because the "insult" takes decades to show up as a cancer. Epidemiology doesn't give easy estimates of "how safe" anything is. Comparing one risk to another is simple---but it tells you nothing about how close to "zero risk"—which really doesn't exist, of course—you should try to get, or how to weigh the costs of countermeasures.

*     *     *

Plus, there's that sneaky assumption of linear response . . .

An example: Years ago, British Rail decided to improve safety standards on their commuter lines. To cover costs they had to raise ticket rates. Safety comes at a price, and somebody pays.

This measure drove some commuters to use their cars, lowering net revenue for British Rail---and, since car travel is about 1,000 times more dangerous than rail transport, the "safety upgrade" increased injuries and deaths among the commuters. This is a classic example of why nonlinear effects must be included in cost/benefit analysis.
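A toy model in Python shows how the arithmetic plays out. Every number here is invented for illustration except the thousand-to-one ratio of road to rail risk, which comes from the text.

```python
# Invented illustrative figures; only the 1,000x road/rail risk ratio
# is taken from the column.
RAIL_DEATHS_PER_MILE = 1e-9
CAR_DEATHS_PER_MILE = 1_000 * RAIL_DEATHS_PER_MILE

commuters = 100_000
miles_each = 5_000                 # commuting miles per person per year

def yearly_deaths(frac_in_cars: float, rail_rate: float) -> float:
    rail_miles = commuters * (1 - frac_in_cars) * miles_each
    car_miles = commuters * frac_in_cars * miles_each
    return rail_miles * rail_rate + car_miles * CAR_DEATHS_PER_MILE

before = yearly_deaths(0.00, RAIL_DEATHS_PER_MILE)        # everyone on the train
after = yearly_deaths(0.05, 0.5 * RAIL_DEATHS_PER_MILE)   # 5% priced into cars,
                                                          # rail risk halved
print(f"deaths per year: before {before:.2f}, after {after:.2f}")
# Even a small shift onto the roads swamps the gain from safer trains.
```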

Further, Table 2 shows that the popular notion of a benign nature, where evolution has equipped us to cope perfectly with natural toxic chemicals, is wrong.

After all, natural selection doesn't care about toxic threats to us after we've reproduced. Also, many of our defenses are general. We shed the surface cells of our digestive and lung systems every day, presumably to protect against ordinary "insults."

We produce antitoxin enzymes and myriad defenses, but they can be damaged by other environmental effects, too. Finally, we eat many things our ancestors of only a few centuries back did not---potatoes, coffee, tomatoes, kiwi fruit. Evolution can't have defended us against them yet.

Further, our own systems betray us by making hydrogen peroxide and other reactive compounds of oxygen, which probably contribute to aging and cancer. The only way to avoid that is to stop breathing.

We ingest at least 10,000 times more natural pesticides (toxins) by weight than we do man-made ones. Natural ain't necessarily safer. The nation with the longest life expectancy is Japan, an urban, extremely crowded industrial land. (But the safest state in the union is Hawaii. Maybe the slower pace helps? Yet another reason to move there.)

*     *     *

Take a common way of presenting risk information---the comparison. Pithy, concrete, convincing.

"Smoking two packs of cigarettes gives the same risk as a year spent breathing Los Angeles air." What are we to make of such facts, thrown at us by the risk-managers? Should we be rational as the risk-assessors define it?

First, there's no need to be. It's painfully obvious that the orderly, engineering mentality does not always lead to lowest possible risk.

Look at nuclear reactor control rooms---banks of switches in bleached lighting. The most trivial switch looks much the same as the vital one. There are no personal touches to the room, no odd markers allowed. This guarantees that bureaucrats like the look (so clean, neat, reassuring)---and that the people who work there hate it.

An impersonal, "professional" look causes just the boredom that is the enemy of look-sharp safety. A few years ago the crew manning one control room put pull-levers from beer dispensers on the vital controls, so they could see them right away in a crisis. ("Running hot---go for the Bud!")

Good idea. Their manager angrily removed them.

What to do?

The future must allow more human environments in high-tech enterprises. The pyramid structure familiar in industrial firms has to be discarded, so that highly integrated teams, with real team spirit, run the show. And they have to be tested regularly, against each other, to sharpen their performance.

All well and good---the perfect nerd environment is perhaps not the safest. But what of the grand conflict between the "irrational" public and the risk-statistics folk?

First, we have to recognize that the perceived risk is not merely proportional to the number of people hurt or killed.

Three Mile Island surely proved this. No other accident in our history has had such costly social impact. It imposed huge costs on the utility and nuclear power industries, increased the use of dangerous oil and even coal, and prompted a more hostile view of other complex technologies (chemical manufacturing, genetic engineering).

How to explain this? Sure, the media tart up the news---but why does such sensationalism work?

I believe the most important index in these spectacular disasters is what they portend. Train wrecks kill many, but they are ordinary and excite few. As the New Yorker said after the Bhopal catastrophe,

What truly grips us in these accounts is not so much the numbers as the spectacle of suddenly vanishing competence, of men utterly routed by technology, of fail-safe systems failing with a logic as inexorable as it was once---indeed, right up until that very moment---unforeseeable. And the spectacle haunts us because it seems to carry allegorical import, like the whispery omen of a hovering figure.

Mt. St. Helens got less press than Chernobyl because it didn't mean very much. This is what the analysts imply when they speak of "psychometric factor spaces" in assessing the impact of events. DNA technology awakens many of the deep fears that nuclear power does, invading "factor spaces" that train wrecks never touch.

To many people, bland expert testimony that the annual risk from living near a nuclear power plant is equivalent to the risk of riding an extra three miles in a car is simply dumb---it omits the dimensions of human lives affected by the failure of so gargantuan a technology.

Yet we all know life is nothing without risk. It would be dull, gray, leached of zest. As Hal Lewis, the dean of nuclear safety experts and author of Technological Risk, has remarked, reflect on what western civilization would be like if we had elected to make the minimization of all risk our principal motivation. We'd be bored---and then extinct.

Indeed, we get bored with risk itself. This may be why older technologies seem safer than they really are.

So we use stairs despite the risk of falls. (I do so out of habit, to stay in condition.)

We eat canned food, despite occasional botulism. (I prefer fresh food, but not because of risk.)

We climb mountains. (The riskiest sport of all is climbing in the Himalayas, where 1 in 10 die.)

We make love despite heart attack risk. We don't make love, despite the fact that married men live longer than singles.

Given our everyday acceptance of risk---indeed, open foolhardiness in smoking or in driving long commutes---why do we balk at nuclear plants, for example?

I have a guess, and it will be as true in the future as it is now.

*     *     *

Every storyteller knows that there are two crucial points in a narrative. One is the opener, the hook, where you draw the audience in. Even more important is the finish, which has to satisfy the tensions the story has set up. But one tension audiences expect will be released (though they probably couldn't say so consciously) is finally expressed in the question, What's it mean?

The best narratives tell us what human experience signifies, what our lives are worth, what role we play (if any) against a larger canvas (if any).

We instinctively dislike stories that lower our estimate of what human lives mean. Audiences prefer dramas about rich, beautiful, powerful people rather than barflies and beggars—these people matter. Similarly, we deplore disasters that seem to rob us of our self-worth.

In ancient times, weather and the gods made disasters. Now we make them, for we are lords of the biosphere.

I propose that the myriad small deaths from disease, tornadoes, falls, or even train wrecks all seem to us "natural." Dying of something nature makes, whether it's a microbe or a meteor, has about it a strange sense of harmony. That at least carries a freight of consoling meaning. And eventually we assign old, familiar technology to the category of the "natural."

Death from new technology that we do not understand carries a taint of being self-inflicted, almost of unintentional suicide. This is especially true if we cannot control the new technology personally, relying on unseen experts—that pilot up ahead in the cockpit, say.

Techno-accident demeans all life by making it appear trivially spent.

Another aspect: It may well be that the most important feature of modern times is not technology, but the fact that we dwell in the first era in which atheist ideas are commonly (though not universally) accepted.

Disaster means something if it comes from God or, failing that, at least from nature. Techno-disasters can't be rationalized this way, because we have only ourselves to blame.

So, deploring the public's irrational views of risk, as some number-crunching experts do, can miss a vital point. People seek to invest events with meaning---they want more from risk assessment than body counts.

And if they die in their cars, while in full control—well, that's life, isn't it?

Knowing this, do we who have a hand in evaluating disasters have an obligation to cater to these psychodynamic needs?

To some extent, yes---but we cannot simply rubber-stamp measures which divert society's attention from the serious threats, such as tobacco or saturated fats.

A larger aspect: It has been plausibly argued that we spend a million times more per life to save people from the side effects of nuclear power than we do to save sick children in the undeveloped world. This sort of comparison shouldn't be allowed to escape the notice of the disaster-dazed media audience.

More, we cannot concentrate on arguing about rare but spectacular disasters, like nuclear power, to the neglect of everyday deaths. That would merely play into the media-driven perception of safety as solely a matter of gaudy spectaculars.

No, I'm afraid our moral obligation is to treat every separate life as important---to acknowledge the public's easy distraction by huge disasters, but to remind them of the small ones---and thus, in our own way, to give each life meaning.

===THE END===


Comments on this column welcome at gbenford@uci.edu, or Physics Dept., Univ. Calif., Irvine, CA 92717
