Feb 1, 2011 10:14 GMT  ·  By
Power outages are systemic failures that can be foreseen and prevented, a Purdue expert says

A group of scientists from Purdue University is arguing for a cross-disciplinary approach to analyzing the possibility of systemic failure in complex systems, and to finding ways of preventing such failures from occurring.

The failures the team aims to prevent from happening again include calamities such as the BP/Deepwater Horizon oil spill, massive power blackouts, and even the subprime mortgage crisis.

Other accidents that could have been avoided include the 1984 Union Carbide accident in Bhopal, India, and the 1988 failure of an oil platform in the North Sea. The former disaster injured more than 100,000 people and killed 2,000, whereas the latter killed 167.

“Such systemic failures are not limited to the chemical and petrochemical industries. The Northeast electrical power blackout in 2003 and a recent massive recall of drug products are both systemic failures,” says expert Venkat Venkatasubramanian.

“Financial disasters such as Enron, WorldCom, the subprime mortgage derivatives crisis and the Madoff Ponzi scam also belong to the same class. The striking similarities in such catastrophes necessitate a broader perspective to better understand such failures,” the expert says.

Venkatasubramanian holds an appointment as a professor of chemical engineering at Purdue. “In the history of systemic failures, a few disasters have served as wake-up calls,” he goes on to say.

“The Flixborough chemical plant accident in the United Kingdom in 1974, where a Nypro UK plant explosion killed 26 people, was one such call,” the professor adds. His proposal for a new approach in handling risks appears in the January 2011 issue of the AIChE Journal.

The expert believes that all such disasters must be studied from a common systems engineering perspective if analysts are to understand what went wrong during these events. This would allow experts to plan better for the future and to handle similar events more efficiently.

“There is an important role for universities here, as well, in creating and disseminating knowledge about abnormal-events management in complex engineered systems and their public and corporate policy implications,” the Purdue expert adds.

Venkatasubramanian also believes that the inherent fragility of complex systems is what ultimately underlies their failure. The more complex engineered systems, processes and products become, the higher the chances that something will go wrong with them.

An increase in systemic complexity has also been linked with an increase in the harm such systems produce when they collapse or fail. “In other words, the behavior of the whole is different than the sum of its parts and can be difficult to anticipate and control,” the expert adds.

“This is further compounded by human errors, equipment failures and dysfunctional interactions among components and subsystems that make systemic risks even more likely if one is not vigilant all the time,” he concludes.