Is it Worth the Risk?


Modern life is filled with risks, from common activities as well as potentially catastrophic ones. Engineers and scientists can help us to understand these risks and to make decisions about them.

It’s Saturday morning, and you have a busy day ahead of you. You get up early and ride your bike to the soccer field, where you play goalie for your school team. After the game (your team wins), you ride home and jump into the shower. Still wet, you use the hairdryer, which is plugged into a bathroom outlet. You race downstairs for lunch: a peanut butter sandwich. You relax at home for a while, then head to the doctor’s office for a tetanus shot. Back home again, you sneak a taste of the cookie dough on the kitchen counter and make arrangements for a ride to a party that evening.

Later that night, safely home and preparing for bed, you might reflect on the actions you’ve performed during the day. Each one has been very ordinary. And yet each has carried some risk—some chance of resulting in injury, illness, or death. You risked a serious accident bicycling, driving, running downstairs, even playing soccer. Using your hairdryer in the wet bathroom put you at risk of electrocution. Your lunch exposed you to aflatoxin, a natural carcinogen (cancer-causing substance) found in peanut butter. Your house may contain another carcinogen: radon, a natural radioactive gas that can enter homes through cracks in basement walls or floors. Your tetanus shot, like any vaccine, carried a small risk of harmful side effects. And the raw eggs in the cookie dough could have given you a dangerous case of food poisoning.

Acceptable and unacceptable risks

If you’re like most people, you willingly accept these and other risks in your daily life. You may take steps to reduce certain risks—for example, by wearing an automobile seat belt or bicycle helmet, or by sealing your basement against radon. But you recognize that you can’t completely avoid risk, and that to try to do so would drastically change the way you live.

There are certain risks, however, that people are less willing to accept. These hazards seem more frightening than others—more dangerous, or harder to avoid. Many people, for example, are concerned about the possible effects of living near a hazardous waste disposal site. They fear that chemicals burned or stored at such facilities could pollute the air or poison the water supply. Others wonder how safe their food is: Does it contain dangerous levels of pesticides or food additives? And the risk of accidents at nuclear power plants has created strong opposition to nuclear energy. Such risks are often the focus of political debate and community activism.

The scientific study of risk

Societies, like individuals, have to make choices about the risks they face. They must decide, for example, whether to give up the benefits of a new technology in order to eliminate its risks. Sometimes, they must choose between competing technologies carrying different kinds of risks. Or they must decide whether the number of people who might be affected by a hazardous activity justifies the cost of taking action to reduce the risk.

To help make these decisions, engineers and scientists have developed tools that allow them to measure and compare risks. Scientists approach questions of risk in measurable terms. How many people, they ask, will die prematurely as a result of a risky activity, technology, or substance? How many will become seriously ill? How many people will benefit? How much money will society save?

The scientific study of risk is known as risk assessment. The field is based on the branch of mathematics concerned with probability, the likelihood that a chance event will occur in a given period. Probability theory was developed in Europe in the 1600's, as a way to help gamblers make bets in dice-rolling games.

The first major problem to be analyzed in terms of risk arose in Europe in the late 1700's. At that time, smallpox—an infectious disease—was one of the leading causes of death, killing hundreds of thousands of people a year. Physicians knew they could make people immune to smallpox by inoculating (deliberately infecting) them with a mild form of the disease. But inoculation carried a risk of giving the person a more serious form of the illness, which could result in death. Leading mathematicians and scientists used probability theory to weigh the risks and benefits of inoculation, a debate that lasted until physicians developed a less dangerous vaccine in the 1800's.

Modern risk assessment

The modern field of risk assessment began to emerge in the early 1900’s, chiefly in the United States. In the 1920’s, concern about the growing use of pesticides in agriculture led scientists to begin using laboratory animals to test the safety of chemical compounds. But the new science of risk assessment received its most important boost beginning in the 1960’s, amid growing fears about the hazards of nuclear radiation and the emergence of the environmental, consumer protection, and worker safety movements.

Today, governments play an important role in protecting individuals from excessive risk. For example, the U.S. Food and Drug Administration (FDA) administers standards for the labeling of food products and the content of processed foods. The Environmental Protection Agency (EPA) regulates the public’s exposure to health risks from pollution and other environmental hazards. And the Occupational Safety and Health Administration (OSHA) oversees regulations covering workers’ exposure to potentially hazardous substances and conditions on the job.

Measuring risk directly

Agency officials and other experts who deal with risk say there are two main types of risks: those that are directly measurable and those that must be calculated indirectly. Experts use different tools to assess the different kinds of risks.

Risks are directly measurable if the consequences of a risky activity occur in large enough numbers for statisticians to accumulate data on their frequency. For example, hospitals keep records of the number of patients who die of lung cancer and other illnesses that can be caused by smoking. Scientists can use these data, along with other statistics about smokers and nonsmokers, to calculate the health risks of smoking. These calculations tell researchers that, for example, a pack-a-day smoker is up to 30 times more likely to develop lung cancer than someone who has never smoked. The data also show that smoking is a probable cause of 17 percent of all deaths in the United States.

The chances of dying in an automobile accident

To learn how scientists calculate directly measurable risks, let's look at the risk of death caused by automobile travel. Statistics show that in the United States, some 21,300 automobile drivers and passengers die annually as a result of traffic accidents. Other data, collected by electronic highway counters and similar tools, indicate that, taken together, all the automobiles in the United States travel a total of about 1,600 billion miles (2,570 billion kilometers) each year. This measure of travel is expressed in units called vehicle miles.

To estimate the probability of dying in an auto accident during a given year, we also need to know how much automobile traveling Americans do. Statisticians compile this information as passenger miles. One passenger mile is the transportation of one person for one mile. Transportation surveys show that, on average, each car carries about 1.7 passengers (including the driver). So 1.7 passengers traveling 1,600 billion vehicle miles per year gives us about 2,700 billion passenger miles.

The next step is to calculate the number of deaths per passenger mile. To do this, we divide 21,300, the annual number of deaths, by 2,700 billion. This calculation yields 0.0000000079, or 79 deaths for every 10 billion passenger miles.

Now, suppose you expect to travel 35,000 miles (56,000 kilometers) in 1995. Your chance of suffering a fatal accident during that year is 35,000 × 0.0000000079, or about 0.00028. Expressed as a fraction, this can be reduced to about 1/3,570, or 1 chance in 3,570. Like any probability, this can also be expressed as a percentage, 0.028 percent.
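If you want to check the arithmetic yourself, the short Python sketch below reproduces it using the article's approximate figures. Because the article rounds its intermediate numbers, the results differ slightly in the last digit or two.

```python
# A minimal sketch of the auto-fatality calculation above, using the
# article's approximate figures for U.S. travel in the early 1990's.

annual_deaths = 21_300            # auto occupant deaths per year
vehicle_miles = 1_600e9           # total vehicle miles traveled per year
passengers_per_car = 1.7          # average occupants per vehicle, driver included

# About 2,720 billion; the article rounds this to 2,700 billion.
passenger_miles = vehicle_miles * passengers_per_car

# Roughly 8 deaths per billion passenger miles (about 79 per 10 billion
# after the article's rounding).
deaths_per_passenger_mile = annual_deaths / passenger_miles
print(f"Deaths per passenger mile: {deaths_per_passenger_mile:.1e}")

# Risk for a person who expects to travel 35,000 miles in a year:
# about 0.00028, or roughly 1 chance in 3,600.
personal_miles = 35_000
annual_risk = personal_miles * deaths_per_passenger_mile
print(f"Annual risk: {annual_risk:.5f} (about 1 in {round(1 / annual_risk):,})")
```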

Of course, the likelihood of any specific person becoming an accident victim may be greater or smaller than the average we just calculated. If you are a driver, your precise risk depends on how safely you drive, whether you wear a seat belt, whether you drink before driving, the weight of your car, and other variables. Experts can gather data on such variables to determine, for instance, the effectiveness of seat belts in preventing fatalities and thus reducing the risk from car travel.

Calculating the risks of rare events

The risk of dying in an auto accident can be calculated directly because auto fatalities occur in large numbers, so small variations from year to year do not affect future expectations. But many other hazardous activities result in fatalities or other health consequences infrequently, making direct statistical analysis impossible.

In the United States, for example, some 150 to 200 of the nation’s approximately 500,000 bridges annually suffer the collapse or partial collapse of one or more spans. Fewer than 12 people die each year as a result of such failures. However, the consequences of a single bridge collapse could be disastrous, especially if the bridge is large and heavily traveled. For this reason, it is important that engineers have some way to determine the likelihood that any particular bridge might collapse.

Engineers know that in most cases, a bridge collapse results from a series of smaller events that are themselves not uncommon. For example, flooding can allow water to accumulate on a bridge deck. Or debris may build up in the drains, blocking them. The water can then corrode the steel and other structural elements of the bridge. In cold weather, water freezing inside the bridge deck can cause it to crack, while salt or other deicing compounds can increase corrosion. Meanwhile, heavy loads on the bridge can cause additional cracks, especially where the structure has been corroded. And boats colliding with the bridge can weaken its support structures.

One way to assess the probability of an event such as a particular bridge collapsing is to calculate the likelihood of each step in any possible combination of events that might lead to the disaster. Engineers can use historical data to estimate the probability of such events as flooding followed by freezing temperatures. Then, using mathematical formulas, they can multiply these numbers to get the probability that all of these steps will occur. Likewise, engineers can calculate the probability for all other series of events that might result in bridge failure. They can then add the probabilities of each of the different chains of events to come up with a total probability for the accident during a given period of time.
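A rough sketch of that multiply-then-add reasoning appears below. The chains and step probabilities are purely hypothetical, chosen only to illustrate the arithmetic, and the calculation assumes the steps in each chain are independent and that the chains are rare enough for their overlaps to be ignored—simplifications a real engineering analysis would have to examine.

```python
# Hypothetical chain-of-events estimate for a bridge failure.
# Each chain lists the annual probability of every step it requires.
from math import prod

chains = {
    "flood -> blocked drains -> corrosion -> overload": [0.05, 0.20, 0.10, 0.01],
    "freeze cracking -> deicing corrosion -> overload": [0.10, 0.15, 0.01],
    "boat collision -> weakened supports":              [0.002, 0.05],
}

# Probability that every step in one chain occurs (steps assumed independent).
chain_probs = {name: prod(steps) for name, steps in chains.items()}

# Total failure probability: the sum over all chains (overlaps ignored).
total = sum(chain_probs.values())

for name, p in chain_probs.items():
    print(f"{name}: {p:.2e}")
print(f"Estimated annual probability of failure: {total:.2e}")
```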

Using analogy to measure risks

Some risks must be measured indirectly not only because they occur infrequently, but also because data on these risks are hard to collect. The risk of cancer due to exposure to certain substances often falls into this category. It is impossible to determine the specific origin of most cases of cancer because 20 or more years may elapse between the time a person is exposed to a carcinogenic substance and the time the cancer is diagnosed. In addition, the large number of cancer deaths from all sources makes it difficult to isolate common factors—such as exposure to a particular substance—that might have caused cancer in many different patients.

If scientists want to determine whether a chemical compound may pose a health risk such as cancer, they can test it in the laboratory. First, they might test the compound to see whether it can cause mutations (genetic changes) in bacteria. Cancer is thought to begin with changes involving genes.

The scientists then test the effects of the compound on laboratory animals. In these studies, they give high doses of the chemical to a small number of animals, usually rats or mice. (The scientists use high doses because lower doses would cause illness in so few animals that hundreds would be needed for each test.) Rats and mice make good subjects for such tests because they have short life spans, so changes in their health show up quickly. The scientists divide the animals into groups, give each group a certain amount of the compound, and so determine the smallest daily dose that will cause some of the animals to develop tumors or other disorders. The researchers then calculate the daily dosage that would pose an equivalent risk to human beings exposed to the compound over a lifetime. From that figure, they can estimate the risk people would face from smaller exposures.
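The sketch below illustrates the simplest version of that final step, scaling risk down linearly from the animal dose to a much smaller human exposure. Every number in it is hypothetical, and real risk assessments use more elaborate dose-response models and interspecies adjustments; the point is only to show how a high-dose animal result is stretched to cover low-dose human exposure.

```python
# Highly simplified low-dose extrapolation with hypothetical numbers.

animal_dose = 50.0        # mg per kg of body weight per day; smallest dose producing tumors
animal_tumor_rate = 0.10  # fraction of dosed animals that developed tumors

# Cautious assumption: risk falls in direct proportion to dose.
risk_per_unit_dose = animal_tumor_rate / animal_dose      # risk per (mg/kg/day)

human_exposure = 0.001    # mg/kg/day, estimated lifetime-average human exposure
estimated_risk = risk_per_unit_dose * human_exposure

print(f"Estimated lifetime cancer risk: {estimated_risk:.1e}")   # about 2 in a million
```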


Scientists use sophisticated computer models to estimate how much exposure to the compound different groups of individuals might actually have. For example, when solid wastes are burned in an incinerator, workers at the facility are likely to be exposed to higher doses of any hazardous compounds produced than are people who live nearby but work elsewhere. The scientists use the models to help estimate the maximum amount of a compound different groups of people might come into contact with over a lifetime. Then, they compare data from the computer models with data from the animal studies to determine whether the compound should be regarded as hazardous.

The uses and limitations of risk assessment

Risk assessment has proved a useful tool in helping societies make decisions about risk. For example, the U.S. Federal Aviation Administration (FAA), the agency that oversees aircraft safety, used risk assessment when it decided in 1992 not to require the use of safety seats for infants in commercial airplanes. Data on crash fatalities had shown that the regulation would save, on average, the life of 1 child every 10 years. But the rule would have forced parents to purchase a ticket for an extra seat, encouraging many parents to travel by car instead. The FAA found that the resulting increase in auto travel would cause 5 additional infant deaths each year, or 50 over 10 years.

But decisions are not always so simple. One problem is that risk assessments can be uncertain, leaving citizens and policymakers unsure how dangerous an activity really is. When experts calculate risks based on statistics, the data leave little room for opinion. But when scientists evaluate risks indirectly, their findings may not be so clear-cut. Because experts must rely on assumptions, estimates, and judgments to make these assessments, many such evaluations are highly controversial.

Consider, for example, the risk of a major accident at a nuclear power plant. Estimates of the likelihood of such an accident vary widely, from 1 chance in 10,000 to 1 chance in 1 million annually per plant. This is partly because probability estimates on such complex activities as nuclear power generation depend on large numbers of smaller estimates, which may themselves be uncertain. For example, much of the technology involved in nuclear power has never been applied to any other industry, so experts have little direct experience to help them estimate the likelihood of a problem. Another problem is that engineers cannot foresee every possible combination of events that might occur in a complex system. Many critics also charge that risk assessments often ignore the possible contributions of human error to an accident.

Determining risk from animal studies

There is also uncertainty about risk assessments based on animal studies. For example, scientists assume—unless they know otherwise—that a compound with harmful effects on animals has similar effects on human beings. But research has shown that many compounds act differently on different species. One such compound is the artificial sweetener saccharin, which studies in the 1970’s linked to bladder cancer in rats. Today, some biologists believe saccharin promoted the disease because it stimulates rats’ bodies—but not those of human beings—to produce another substance, which in turn causes cancer. If this is so, saccharin may not pose a cancer risk to humans.

Animal studies may in other ways make certain compounds seem riskier than they really are. Many critics point out that laboratory tests rely on exposures far higher than would ever be found in real life. They argue that a compound may be dangerous at high levels but safe in low doses. Some researchers even believe that the act of ingesting massive doses of certain compounds, rather than anything in the compounds themselves, may trigger body cells to change in ways that can result in cancer.

Other experts say animal studies may lead researchers to underestimate risks. They argue that some chemicals can accumulate in the body, causing long-term harm undetectable by studies using animals with extremely short life spans. Critics also say the studies may fail to take into account the fact that one substance may be more hazardous in the presence of another. Cigarette smoking, for example, has been shown to increase the toxic effects of many compounds. Finally, animal studies are most often performed solely for cancer risks, so any other health effects of a substance may go unnoticed.

Making decisions about risks: The case of pesticides

Most experts believe that animal studies, though imprecise, are still the best tool for estimating chemical risks. They say that scientists can avoid underestimating most risks by leaving a margin of safety in their estimates—for example, by basing them on maximum possible doses.

Nevertheless, scientists recognize that they still have a very imperfect understanding of some risks. This failing sometimes puts consumers, producers, environmentalists, scientists, and regulators at odds. This is true, for example, in the case of chemical pesticides. These compounds allow farmers to produce abundant, inexpensive agricultural goods. But studies have found that some of these chemicals pose both health and environmental risks. On the basis of such studies, the EPA has banned the use of many such compounds in the United States.

One health risk posed by some pesticides is cancer. Some experts who argue against strict government control of pesticides point out that the cancer risk from most artificially produced pesticides may be minuscule compared with the risk from natural pesticides made by plants as protection against insects, plant-eating animals, and bacteria. Biochemist Bruce N. Ames of the University of California at Berkeley, a leading expert in carcinogenic risks, has estimated that these natural compounds make up more than 99 percent of all pesticides in the American diet.

Many environmentalists and others who call for stricter pesticide regulation argue that cancer is not the only known risk of using artificial pesticides. Some types of pesticides have been found to cause other toxic effects, such as nervous system disorders and, possibly, birth defects. In addition, they say, some pesticides accumulate in the environment and may devastate fish, birds, and livestock. Furthermore, some scientists argue, chemical pesticides may pose unique risks to children, whose growing bodies and smaller body weights may make them more vulnerable to certain substances than adults are.

Researchers have developed safer, less-polluting ways to control some pests—for example, through techniques that disrupt the reproductive cycles of insects. But such methods require a different approach for each insect, so they are far more expensive than chemical pesticides. As a result, a society that decides to eliminate chemical pesticides must pay the price at the grocery checkout counter. And this could cause many people to reduce their consumption of fruits and vegetables, resulting in poorer nutrition and, possibly, an overall decline in health.

When perceptions of risk collide

The pesticide issue, like many others, is complicated by the fact that people's perceptions of risk often differ sharply from those of the experts. Psychologists, sociologists, and others who study how people think about risks have found, for example, that individuals more willingly accept risks they take on voluntarily than risks over which they feel they have no control. They tend to fear new hazards more than familiar ones and poorly understood risks more than well-understood ones. People are also more concerned about risks that might affect them or their families personally than about risks affecting strangers. Finally, nonexperts tend to view risks that might cause many casualties at once with more dread than risks whose effects may be spread over time.

These psychological factors sometimes lead people to make what many experts regard as irrational judgments about risks. Nuclear power is a case in point. Until such energy sources as wind or solar power become more efficient, energy experts say, the United States must continue to rely on two main sources of power: nuclear energy and the combustion (burning) of fossil fuels, chiefly coal and petroleum. Both technologies carry risks—but comparing them may surprise you.

The nuclear power question

The world’s largest nuclear accident occurred in 1986, when an explosion and fire destroyed the reactor at a nuclear power plant in Chernobyl, near Kiev, Ukraine. Some 30 people died of radiation poisoning after the accident, and experts believe the radiation released may eventually contribute to as many as 20,000 cancer deaths.

A high estimate of the likelihood of a major accident in the United States is 1 chance in 10,000 annually per reactor. This means that with 100 U.S. reactors operating (the approximate number in use in 1994), such an accident might be expected about once every 100 years. Even a major U.S. accident would probably not lead to a significant release of radiation, however, because of the way nuclear power plants in the United States are constructed. Experts believe that the chance of an accident so severe that significant radiation is released is only 1/10 to 1/100 of the overall accident risk. These odds mean that a deadly accident might occur once in 1,000 to 10,000 years.
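Expressed as a short calculation, the reasoning in the preceding paragraph looks like this (a sketch of the arithmetic only, using the figures cited above):

```python
# Arithmetic behind the nuclear-accident estimates cited in the article.

accident_prob_per_reactor_year = 1 / 10_000   # high estimate of a major accident
reactors = 100                                # approximate U.S. reactors in 1994

# Expected frequency of a major accident somewhere in the country: ~1 per 100 years.
accidents_per_year = accident_prob_per_reactor_year * reactors
print(f"Roughly one major accident every {1 / accidents_per_year:.0f} years")

# Only 1/10 to 1/100 of such accidents would release significant radiation,
# giving one severe release every 1,000 to 10,000 years.
for fraction in (1 / 10, 1 / 100):
    severe_per_year = accidents_per_year * fraction
    print(f"Severe release: about once every {1 / severe_per_year:,.0f} years")
```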

Citizens may also be concerned about nuclear energy because of the risks from radioactive waste, or because of the possible effects of low-level radiation exposure on people living or working near a nuclear reactor. But engineers estimate that if nuclear waste were properly and permanently stored, it would be likely to cause no more than 1,000 cancer deaths during the entire 100,000-year period that such waste is radioactive. Experts also point out that radiation levels around a nuclear plant are hundreds of times lower than a person’s average radiation exposure from such natural sources as radon and from medical and dental X rays.

Are fossil fuels safer?

People generally regard fossil fuels as much safer than nuclear power. But unlike nuclear power, fossil fuel combustion is responsible for highly dangerous forms of environmental pollution. Burning the fuels produces chemical compounds—sulfur dioxide and nitrogen oxides—that form the chief pollutants in acid rain, a major source of damage to forests, rivers and lakes, and the wildlife that depend on them. Air pollution from nitrogen oxides and from ozone, a compound that forms when combustion by-products react in sunlight, can cause or worsen lung disease, contributing to between 5,000 and 50,000 deaths a year, according to some estimates. Fossil fuels also release carbon dioxide, which accumulates in Earth's atmosphere. Many scientists believe this accumulation may cause global warming, in which excess heat is trapped in the atmosphere, increasing Earth's average surface temperature.

Of course, if a nuclear disaster might occur once in 10,000 years, it is as likely to happen tomorrow as on any other specific day during the next 10,000 years. But the question scientists say policymakers and consumers need to ask is whether we, as a society, should make such a small risk the basis for our decisions about the future—especially in light of the amount of risk we live with every day. After all, they say, a risk is like a reverse kind of lottery. Each week, lotteries sell millions of tickets, one of which may bring its owner a windfall. But for the other 9,999,999 or so ticket buyers, nothing changes.

If you’re the person who wins the lottery, of course, it doesn’t matter what the odds were—just as the odds don’t matter if you’re the one person in a million to suffer the effects of a risky activity. That’s why experts say it’s important to be aware of hazards, so that we can act, when possible, to reduce them. But it’s important to remember, too, that society cannot eliminate all risks. It can only compare the risks and benefits of all alternatives—and then try to make the best choice possible.
