Engineering failure is useful

February 19, 2016 // By Peter Clarke
Dennis Feucht discusses the works of the engineering author Henry Petroski and what Petroski had to say about failure.

One of the few popular engineering writers of the late twentieth century was Henry Petroski, professor of civil engineering at Duke University in North Carolina. His book, To Engineer is Human: The Role of Failure in Successful Design, crosses engineering disciplines in its applicability. His major assertions are worth pondering by electronics engineers.

Consequences of failure

Petroski recounts some spectacular engineering failures: the Tacoma Narrows Bridge, the Kansas City Hyatt Regency Hotel elevated walkway, and the de Havilland Comet, the first commercial jet aircraft to fly across the Atlantic Ocean. Failures happen when the state of the art is pushed too far, or through irresponsible design or faulty construction.

Failure is commonplace and well-known not only to engineers but to the general public, as recounted in cruder terminology on some bumper stickers. For engineers, however, failure takes on special meaning. It is part of the exercise of our craft. As Woody Allen said, if you're not failing now and then, you're playing it safe. 

But being "too safe" is the goal when designing life-critical devices, such as the typical works of civil, automotive, medical, and aeronautical engineers. And when less threatening devices such as garage-door openers or pencil sharpeners fail too frequently, they pose a financial threat to their suppliers, either before or after shipping the product. An over-designed product, however, is usually not cost-competitive in the marketplace, and may also suffer performance disadvantages from non-optimal design trade-offs. Knowing how close to the edge of failure to come - and knowing where that edge is - is a mark of an experienced designer.

In evaluation engineering at Tektronix in the early 1970s, one engineer had schematic diagrams of Tek oscilloscopes taped onto the walls around his desk. His job was to monitor field failure reports. When a failure came in, he would put a red dot by the failed component.

In time, these "measles charts" graphically depicted the failure patterns of each instrument: the components with clusters of red dots were the design's weak points.
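Today the same bookkeeping might be a short script rather than a wall of schematics. What follows is a minimal sketch of the idea in Python - the failure-report data, part names, and review threshold are all hypothetical:

    from collections import Counter

    # One entry per field-failure report, keyed by the failed part's
    # reference designator (hypothetical data).
    failure_reports = ["Q7", "C12", "Q7", "R3", "Q7", "C12"]

    tally = Counter(failure_reports)

    # Flag components whose failure count reaches a (hypothetical) review threshold.
    THRESHOLD = 2
    hot_spots = {part: count for part, count in tally.items() if count >= THRESHOLD}
    print(hot_spots)  # {'Q7': 3, 'C12': 2}

The counts play the role of the red dots: a glance at the output shows where failures cluster.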