Dr. John P. Thomas
Safety and Cybersecurity Group
Contact
Room 33-408
Department of Aeronautics and Astronautics
Massachusetts Institute of Technology
77 Massachusetts Avenue, Cambridge, MA 02139
Administrative Assistant
Pam Fradkin
pfradkin@mit.edu
(617) 258-9729
Biography
Prior to joining MIT, Dr. Thomas held engineering positions in the aerospace, automotive, defense, telecom, and other sectors. After recognizing patterns in the challenges that arise in large complex systems, he returned to academia to pursue a Ph.D. in systems engineering to address these challenges. Today, Dr. Thomas is a member of the Department of Aeronautics and Astronautics at MIT, where he works to develop more effective solutions and demonstrate them on real applications with industry partners.
Dr. Thomas's work produces structured processes for analyzing cyber-physical systems, especially systems that may behave in unanticipated, unsafe, or otherwise undesirable ways through complex interactions with each other and their environment. By applying systems theory, more efficient and effective design and analysis processes can be created to prevent flaws and to identify hidden assumptions that lead to unexpected and undesirable behavior when systems are operated in the real world. More recently, he has been applying these techniques to systems that rely on complex human interactions to achieve safety and security goals. These systems are not only subject to human error; they may inadvertently induce human error through mode confusion, clumsy automation, and other mechanisms that can be difficult to anticipate.
Much of Dr. Thomas's work is focused on developing systems approaches to engineering and analysis, including Systems Theoretic Process Analysis (STPA). He developed the formal structure that defines STPA steps such as Unsafe Control Actions (UCAs), Loss Scenarios (with or without a UCA), and other process steps to help ensure potentially hazardous or undesirable interactions are systematically identified and controlled. He has created algorithms that use STPA results to automatically generate formal, executable, model-based requirements for software components, as well as methods to detect flaws in an existing software specification. The same process can be applied to address security and functional goals of the system, thereby permitting the automated detection of conflicts between these and other goals during early development.
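To illustrate the kind of formal structure this refers to, an Unsafe Control Action in STPA is commonly characterized by four elements: a source controller, a control action, the type of unsafe provision (not provided, provided, wrong timing or order, or wrong duration), and the context in which the action leads to a hazard. The sketch below is only an illustration of that structure, not Dr. Thomas's tooling; all names and the braking example are hypothetical.

```python
from dataclasses import dataclass
from enum import Enum

class UCAType(Enum):
    # The four ways a control action can be unsafe in STPA
    NOT_PROVIDED = "not provided"
    PROVIDED = "provided"
    WRONG_TIMING = "provided too early, too late, or out of order"
    WRONG_DURATION = "stopped too soon or applied too long"

@dataclass(frozen=True)
class UnsafeControlAction:
    source: str           # controller issuing the action
    control_action: str   # the command itself
    uca_type: UCAType     # which of the four UCA types applies
    context: str          # condition under which the action is hazardous
    hazards: tuple        # identifiers of linked system-level hazards

    def describe(self) -> str:
        # Render the UCA in the conventional one-sentence form
        return (f"{self.source} {self.uca_type.value}: {self.control_action} "
                f"when {self.context} [{', '.join(self.hazards)}]")

# Hypothetical example: braking not commanded during landing rollout
uca = UnsafeControlAction(
    source="Brake controller",
    control_action="Apply brakes",
    uca_type=UCAType.NOT_PROVIDED,
    context="aircraft is on the runway during landing rollout",
    hazards=("H-1",),
)
print(uca.describe())
```

Structuring UCAs as records like this is what makes it possible to systematically enumerate contexts and mechanically check a specification against them, rather than relying on ad hoc brainstorming.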
He has collaborated extensively with industry groups to teach STPA, integrate it into engineering processes, and define the certification framework that enables professionals to become certified in using STPA. He is a member of international committees responsible for authoring safety engineering standards, including ARP4761A, ISO 26262, and J3187A.
Positions Held at MIT
Director, ESL
Executive Director, PSAS & C-SASS
Research Scientist/Engineer
Instructor
Postdoctoral Associate
Research Assistant
Ph.D. Candidate
Previous industry experience
Google
NASA
FAA
Ford Motor Company
General Motors
Nuclear Regulatory Commission
Electric Power Research Institute
Sandia National Laboratories
Lincoln Laboratory
Raytheon
US Air Force
US Army
Specialization and Research Areas
Engineering Mistakes, Human Error, Gaps in Safety Assessment & Assurance, Systems Engineering, Requirements Development, Cybersecurity, Aircraft Certification, Safety-Critical Nuclear Power Control Systems, Automotive Autonomy and Advanced Driver Assistance Systems, Healthcare Safety, Medical Device Safety
Teaching Experience
System Safety (incl. STAMP, STPA, CAST)
Systems Engineering
Requirements Engineering
Software Engineering
User Interface Design
Human-Centered Design
Automation and Control Systems
System and Software Security
Safety-Critical Systems
Discrete Math
Probability Theory
Control Theory
Signal Processing
MIT Lab/Research Group Affiliations
Partnership for Systems Approaches to Safety and Security
System Engineering Research Lab
Safety and Security Research Lab
Software Engineering Research Lab
Complex Systems Research Lab