













EXTREME INFORMATION

During the 1960s and '70s, U.S. telecommunications companies and public utilities were encouraged to build redundant facilities to guard against military attacks. But with the Cold War over and deregulation all the rage, those organizations no longer see survivability as the high priority it once was, says John Hwang, general manager of information technology for Los Angeles.

"Sheer competition is driving these companies to skinny down," Hwang says. "I am very worried that we do not have an information infrastructure able to take care of the extreme cases."

If the public phone network were wiped out in an emergency, police and firefighting units could in theory use "tactical" communications systems based on radio, satellites and the like, says Hwang, the former director of IT at FEMA. "But do we really know how to operate in that kind of environment? We are talking about a lot of coordination, a lot of collaboration."

That kind of coordination will be put to the test this October, when a coalition of government and private organizations tries out the "extreme information infrastructure," or XII. The XII will be a secure, survivable intranet that takes input from fixed users, mobile wireless users and sensors and creates a database of crisis information accessible in Web format.

The XII will be tested in a scenario in which terrorists start a fire near Bedford, Mass. The fire diverts emergency personnel while elsewhere a toxic plume of nerve gas drifts toward Boston. The gas incapacitates the crews on ships in Boston Harbor, causing them to crash into piers, according to Lois Clark McCoy, president of the National Institute for Urban Search and Rescue and an organizer of the exercise.

Information about the spread of the gas will be transmitted over the XII from sensors at an actual detonation at Camp Lejeune, N.C. "We are not exploding anything in Boston," McCoy says.
— Gary H. Anthes










PROBLEMS
Impediments to emergency response

  • Inadequate voice service

  • Congested wireline and wireless services

  • Unknown radio frequencies in use by relief organizations

  • Limited access to remote information and limited information sharing among organizations

  • Lack of E-mail capability between local users and regional offices

  • Slow setup of telecommunications facilities at crisis scenes




    SOLUTIONS


    Capabilities needing research and development

  • Self-configuring wireless data networks

  • Adaptive networks that discover and react to damage and changes in use

  • "Judgment support" tools drawing on data from diverse and unanticipated sources

  • Remotely accessible "metacomputer" systems for modeling and simulation

  • Multimedia fusion of data from varied, unexpected sources

  • Distributed, virtual crisis-management "anchor desks"

  • Adaptive interfaces that analyze use and signal when crisis managers make errors

  • More flexible and powerful geographic information systems

    Source: National Research Council, Washington
    RED ALERT

    Crisis management systems are improving, but not fast enough. Lives and property are at stake.

    Sept. 17, 1995. A powerful earthquake centered near New Madrid, Mo., rocks the southeastern part of that state, killing or injuring hundreds of people, cutting off communications to the area and leaving roads and bridges impassable.

    A fractured road causes a military truck carrying chemical munitions to the Pine Bluff, Ark., arsenal to overturn, dispersing its payload over a wide area. Thousands in the region are left without shelter, food, water or medical supplies.

    But local, state and federal officials, aided by technology, launch a response within minutes.

    • Crisis managers activate the Government Emergency Telecommunications Service, which gives emergency workers priority on jammed telephone circuits.

    • They activate a satellite communications service run by NASA and tied in to a commercial mobile data network.

    • A state trooper identifies the spilled chemical using a database of chemical hazards on his portable PC. He reports his findings via wireless electronic mail to an emergency operations center in Conway, Ark.

    • At the center, a computer model predicts the spread of the chemical hazard and helps plan evacuation and cleanup.

    • Crisis managers across the country share maps, weather data and other information over an "emergency information network" — a secure subnetwork on the Internet that uses World Wide Web technology.

    The good news is that this earthquake never happened. It was part of an exercise conducted by federal, state and local agencies.

    The bad news is that, had it been a real emergency, response wouldn't have been nearly as fast and effective.

    Indeed, the history of hurricanes, earthquakes, terrorist bombings and other disasters in the U.S. shows a pattern of slow response and poorly coordinated activities.

    An integrated crisis management system is available for use by military forces, but nothing comparable exists to support civilian emergency response. The result, experts say, is unwarranted loss of life, limb and property, and recovery costs far higher than necessary.

    And experts worry that the U.S. is especially ill-equipped to handle very large disasters. In October, a group will test an Internet-based "extreme information infrastructure" intended to provide communications during catastrophes such as a nuclear explosion (see "Extreme Information," above).

    "Crisis management is extremely manual right now," says Vinton Cerf, a co-author of Computing and Communications in the Extreme, a report on crisis management from the National Research Council (NRC) in Washington. And the computer and communications systems that do exist lack interoperability among the myriad federal, state and local emergency response agencies.

    "The police guy can't talk to the fireman, and the fireman can't talk to the emergency helicopter and so on. There are a bunch of places where you find disparities," says Cerf, who is one of the originators of the Internet and a senior vice president at MCI Communications Corp. in Washington.

    An official at the Federal Bureau of Investigation who asked not to be named gives this description of the immediate aftermath of the 1993 World Trade Center bombing in New York: "All the power was gone. There were at least eight [emergency response] agencies in that building, and we couldn't communicate with any of them. Had there been any shooting or a subsequent bombing, we would have been up the creek."


    SUPPORT ON THE WAY

    Several initiatives are under way to achieve the kind of integrated and automated support for emergency response tested in the earthquake exercise described earlier. In its report last year, the NRC outlined several promising — albeit piecemeal — initiatives.

    For example, the Government Emergency Telecommunications Service was set up in 1995 to give emergency workers access to a special telephone circuit when normal channels are choked. It facilitated voice communications following the 1995 bombing of the Alfred P. Murrah Federal Building in Oklahoma City and during the recent floods in Louisiana.

    But the service can't yet secure priority access to cellular phone circuits, which are often jammed during a crisis.

    [Photo: search-and-rescue operations at the bombed Murrah Federal Building in Oklahoma City; FPC International]

    Computer-aided design software helped in the search-and-rescue operations in Oklahoma City. It mapped locations to be searched and estimated where victims might be based on the location of their offices.

    But NRC officials say that if more computational power had been available, the CAD data could have been used in a model to estimate loads on various parts of the building to show where shoring was needed.

    Hurricane Marilyn struck the Virgin Islands in September 1995 and wiped out all communications with the mainland. Just before the storm hit, the U.S. Army set up a PC that contained crisis-management software and a communications package that linked with the Inmarsat satellite communications service. In the first 24 hours after the storm, all official calls passed through that link. But there weren't enough channels available to meet demand.

    Hurricane Andrew destroyed much of the phone service south of Miami in August 1992. The American Red Cross was able to communicate via an experimental wireless cellular network that used IBM laptops fitted with radio modems.

    But response to Hurricane Andrew was slowed by the inability to access proprietary databases. The Federal Emergency Management Agency (FEMA) couldn't get needed information from Dade County, Fla., until it paid the county for the data and convinced it that privacy could be protected.

    FEMA directs the emergency response activities of 26 federal agencies and coordinates with numerous state and local authorities and volunteer groups. Often, information sharing is relegated to mail, fax and phone.

    [Photo: an upside-down car in the aftermath of Hurricane Andrew]
    "The compatibility of their systems varies greatly," says Bruce Boughman, FEMA's director of operations and planning. "For example, the state of California uses Mac-based systems, which makes it very difficult to share products with them."

    Interoperability is a problem even within response organizations. For example, the Red Cross maintains a database of 19,000 experts and volunteers ready to respond to emergencies, but it isn't linked to systems at state and local Red Cross offices, which must communicate via fax and phone.

    The Red Cross and FEMA are replacing their old emergency response applications with new systems that are scalable, flexible and network-enabled. FEMA officials say the agency is spending $67 million over five years to replace its stand-alone, 1980s-era systems.

    But those application-level efforts don't address underlying network deficiencies. "The communications infrastructure is more fragile than people like to accept," Cerf says. "And it gets even more complicated when we try to put together a new system built on the pieces that are still surviving."

    What's needed, Cerf says, are self-configuring networks that adapt to damage and automatically discover new links after a crisis begins. "For example, you'd like to be able to turn your computer on and have it sample the radio spectrum and discover what transmission resources are out there," he says.

    It isn't easy today to bring radio gear, phone switches and computers into a crisis area and get them online quickly. "What's missing is a set of procedures and protocols that would allow systems to discover each other and assemble themselves into a communications network," Cerf says.

    Some of those capabilities may be provided by the new Mobile Internet Protocol, he adds.
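    The "discover each other and assemble themselves" idea Cerf describes can be sketched as a simple announce-and-listen scheme: each node broadcasts a short "here I am" message on a well-known port and builds a peer table from the announcements it hears. This is only an illustration of the concept, not any real protocol; the port number, message format and node names below are all assumptions.

```python
# Minimal sketch of announce-and-listen peer discovery, in the spirit of
# the self-assembling crisis networks described above. The port number,
# JSON message format, and node/service names are illustrative
# assumptions, not part of any standard.

import json
import socket
import time

DISCOVERY_PORT = 47474  # assumed well-known discovery port

def make_announcement(node_id, services):
    """Encode a node's 'here I am' message as JSON bytes."""
    return json.dumps({
        "node": node_id,
        "services": services,   # e.g. ["voice-relay", "gis"]
        "ts": time.time(),
    }).encode("utf-8")

def parse_announcement(data):
    """Decode an announcement; return None for malformed packets."""
    try:
        msg = json.loads(data.decode("utf-8"))
        return (msg["node"], msg["services"])
    except (ValueError, KeyError):
        return None

def listen_for_peers(sock, duration):
    """Collect announcements heard on `sock` for `duration` seconds."""
    peers = {}
    sock.settimeout(0.2)
    deadline = time.time() + duration
    while time.time() < deadline:
        try:
            data, addr = sock.recvfrom(4096)
        except socket.timeout:
            continue
        parsed = parse_announcement(data)
        if parsed:
            node, services = parsed
            peers[node] = {"addr": addr[0], "services": services}
    return peers

if __name__ == "__main__":
    # Simulate two nodes on the loopback interface.
    listener = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    listener.bind(("127.0.0.1", DISCOVERY_PORT))

    sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sender.sendto(make_announcement("truck-7", ["chem-db"]),
                  ("127.0.0.1", DISCOVERY_PORT))

    # The listener's peer table should now include truck-7.
    print(listen_for_peers(listener, duration=1.0))
```

    A real field system would need authentication, retransmission and radio-level negotiation, which is precisely where the missing "procedures and protocols" Cerf mentions come in.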

    But crisis managers understandably view new technology with some apprehension. Bob Canfield, head of Los Angeles' Emergency Preparedness Division, says, "We want to take advantage of technology. But as an operational person, I know that it will fail at the most inopportune time."


    By Gary H. Anthes
    Anthes is Computerworld's senior editor, special reports. His Internet address is gary_anthes@cw.com




    © Copyright 1997 by Computerworld, Inc. All rights reserved. Computerworld is a service mark of International Data Group, Inc.