Software and software testing have problems. We are reminded of this every time we experience accidents and losses in software-enabled systems: hacking, financial losses, autonomous vehicle crashes, airplane accidents, and other losses where software plays a role.
At MIT’s Department of Aeronautics and Astronautics, we take a systems approach to safety. We recognize that accidents and losses can be caused not only by component failures but also by unsafe interactions among system components that have not failed. These ‘components’ include humans.
How do we identify potential flaws, select the most critical test cases, perform targeted testing of highly complex software, and identify important test cases that involve human interactions?
How do we engineer the role of testers and management? Testers and test leadership obviously impact safety, but how are those captured when we analyze safety? How do we take into account the human actions and beliefs of testers, testing managers, and of operators in the systems we are building? How do we account for the safety-critical decisions they make and ensure they are receiving adequate feedback to make the correct decisions? This keynote will introduce the Systems-Theoretic Process Analysis (STPA) methodology and how these factors can be addressed in modern testing.
Session takeaways:
- Safety requires engineering more than just the technical system
- Understanding and modelling the whole system, including testing and management
- Methods to analyze safety
- Introduction to systems theory and STAMP
- STPA, a systems-theoretic approach to safety analysis