by Fitz Walker
Today, flying by computer. The University of Houston presents this series about the machines that make our civilization run, and the people whose ingenuity created them.
Computer controlled devices are everywhere. Even simple things are likely to have a computer inside them. Your kitchen may have the smartest toaster known to man. So it's no surprise that computers are used to help fly a complex airliner safely. But what happens when these computers we rely on so heavily fail? All too often, the results are deadly.
The Boeing 737 is a very well-known jet airliner. It first flew in 1967. And it is the most produced American jet airliner in history. On an upgraded version of the Boeing 737 - called the MAX - computer systems are used to maintain stable flight. Its new engines are so big that it flies differently than previous models. New autopilot controls were needed to make the plane fly as expected. So when these computer controls failed, the result was tragedy.
Boeing 737-MAX engine upgrades required computer augmentation for stable flight.
Photo Credit: Wikimedia Commons.
Some form of automation is needed for very advanced systems. You are likely listening to this broadcast in a highly automated automobile. Not just cruise control, but engine controls and transmission operations. Thousands of computer decisions are made every second so that you get the best gas mileage.
If automation fails in your car, you can pull over and stop. There's no pulling over and stopping in an airplane. For this reason, there are government rules for designing safety critical computerized systems in aircraft. But these guidelines can't always protect against bad requirements or account for every failure. We can only try to reduce the chance of faults. To design a system to work right, you must also think about how it can work wrong. This may sound pessimistic, but fully evaluating what can go wrong will help to build a safer system.
MIT professor and expert on systems safety Nancy Leveson wrote about how modern computer controlled systems are also open to more complicated system failures. Failures can happen within the connections between computer components. Bad things can happen when an airspeed sensor stops talking to a flight computer. Or when the autopilot is confused by pilot commands. She also talks about how problems can arise even when components work as expected without error, but unexpected results come from their interaction. This is something that can be very difficult to predict. The root of these issues is the increasing complexity of modern control systems.
MIT professor of system and software safety Nancy Leveson wrote a book on computer safety.
Photo Credit: Massachusetts Institute of Technology.
One can argue for making these systems less complicated, but that genie is already out of the bottle. Our world is only going to get more automated. Nowhere is this more evident than in the push to develop self-driving cars. We must take lessons learned from the aviation industry and apply them to ground based vehicles. And vice versa. Otherwise we are doomed to repeat the past, with dire consequences. Much of the solution is ultimately in good design based on good requirements.
Computer controlled systems are here to stay. And while they can provide almost unlimited flexibility, these systems must be designed in such a way that failures can be easily detected and contained safely. Detecting a problem is just as important as preventing one. Good design must expect the worst in order to build the best.
I'm Fitz Walker at the University of Houston, where we're interested in the way inventive minds work.
Nancy Leveson, "Engineering a Safer World," MIT Press. ISBN: 9780262533690
Boeing 737 MAX: https://en.wikipedia.org/wiki/Boeing_737_MAX.
Self-Driving Cars: https://en.wikipedia.org/wiki/Self-driving_car.
This episode was first aired on February 25, 2020