Richard I. Cook
Birth Date: May 3, 1953
Death Date: August 31, 2022
Nationality: American
Fields: cognitive systems engineering, resilience engineering
Workplaces: Control Data Corporation; The Ohio State University; University of Chicago; KTH Royal Institute of Technology; Adaptive Capacity Labs
Alma Mater: Lawrence University; University of Cincinnati
Known For: new look of safety, going solid, going sour, patient safety, how complex systems fail, line of representation, bone as archetype of resilience, being bumpable, sharp-end, first vs. second stories
Dr. Richard I. Cook (May 3, 1953 – August 31, 2022)[1] was a system safety researcher, physician, anesthesiologist, university professor, and software engineer.[2] Cook conducted research in safety, incident analysis, cognitive systems engineering, and resilience engineering across a number of fields, including critical care medicine, aviation, air traffic control, space operations, semiconductor manufacturing, and software services.
Cook graduated cum laude from Lawrence University in 1975, completing a customized program that combined physics and urban planning. After finishing his bachelor's degree, Cook took a position as a lead systems analyst at Control Data Corporation, working with finite element analysis programs such as ANSYS and NASTRAN on the CDC STAR-100 and managing teams of programmers and support analysts.[3]
In 1986, Cook received his MD degree from the University of Cincinnati, where he was a general surgery intern.[4] In 1994, he completed his anesthesiology residency at the Ohio State University.
Cook served in the Department of Anesthesia and Critical Care at the University of Chicago as an associate professor[5] and Director of the Cognitive Technologies Laboratory[6] from 1994 to 2012, where he provided clinical care, taught and trained practitioners, conducted research, and performed community service.
In 2012, Cook was named Sweden's first Professor of Patient Safety at KTH Royal Institute of Technology,[7] where he served until 2015, when he retired from the position as professor emeritus.
From 2015 to 2020, Cook worked as a research scientist at the Ohio State University in the Department of Integrated Systems Engineering. During this time, he also had a part-time appointment as a clinical professor of anesthesiology at the Ohio State University Wexner Medical Center, where he provided patient care and trained new medical practitioners.
In 2017, Cook, along with John Allspaw and David Woods, founded a consulting company, Adaptive Capacity Labs.
Cook was active in the patient safety movement from the mid-1990s to the mid-2000s.[8][9] He was a founding board member of the National Patient Safety Foundation and served on its executive committee until 2007. From 1998 to 2000, he advised the U.S. Veterans Health Administration (VA) on patient safety initiatives, and in 2000 he was appointed co-director of a VA "Gaps Center" that was funded as a result of his research.
In 1997, Cook helped organize the workshop "Assembling the Scientific Basis for Progress on Patient Safety" in Chicago, and co-authored the resulting report published the following year: A Tale of Two Stories: Contrasting Views of Patient Safety.
In 2011, Cook served on an advisory panel for the Institute of Medicine on the topic of Health IT and Patient Safety. In the final published report, he wrote a dissent arguing that health IT software should be regulated as a class III medical device.[10]
In 1998, Cook wrote a treatise titled How Complex Systems Fail, in which he identified eighteen characteristics of complex system failure; it was later republished in the book Web Operations: Keeping the Data on Time[11] and in Hindsight magazine.[12]
In 2012, Cook gave a talk on the topic at the O’Reilly Velocity conference.
Cook was a proponent of what came to be known as the "new look" of safety[13][14] (referred to by Sidney Dekker as the "new view"[15]).
According to the New Look, operators within safety-critical systems face competing demands, dilemmas, conflicts (both technical and organizational), and uncertainty. In particular, operators must constantly balance the demand to achieve production goals against the demand for failure-free operations.
When accidents occur, they tend to be attributed to human error because of hindsight bias. Interventions made in the wake of accidents lead to a cycle of error: they increase the complexity of the system and thereby create the potential for new failure modes.
Instead of treating the people in the system as the source of accidents, the "new look" perspective argues that it is the people in the system who create its safety: the work of human operators compensates for gaps in the designed system, and successful work is much more common than failure.
Cook (along with David Woods and John McDonald) introduced the term going sour to refer to incidents in which system performance degrades slowly over time.[16] Cook noted that going sour incidents are more complex and more difficult to describe than acute incidents, and that in such incidents the actions of human operators play a larger role in how the incident unfolds.
Cook (along with Jens Rasmussen) introduced the term going solid to describe a significant shift in systems operations when a form of capacity becomes exhausted.[17] The term originates from the nuclear power industry, where it is used as slang to refer to a technical situation that has become difficult to manage. More literally, the term describes a change in system behavior related to the state of a steam boiler. Typically, a steam boiler contains a mixture of steam and liquid water. When the boiler becomes completely filled with liquid water, it is said to "go solid".
Cook applied the concept of "going solid" to an intensive care unit in a hospital that undergoes a "bed crunch", when there are no longer enough beds to assign to patients. Cook notes that "going solid" situations tend to foster opportunities for accidents to occur.
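The shift in coupling that "going solid" produces can be illustrated with a toy simulation. The following sketch is my own construction, not a model from Cook's paper, and the parameters (bed count, arrival and discharge probabilities) are invented for illustration: while slack capacity remains, admissions and discharges are independent events, but once the unit fills, every admission becomes contingent on a discharge.

```python
import random

# Toy model (illustrative only): an ICU with a fixed number of beds.
# While beds are free, arrivals are absorbed without coordination.
# At full occupancy the unit has "gone solid": admissions must be
# deferred or bumped until a discharge frees a bed.

CAPACITY = 12  # hypothetical bed count


def simulate(hours=72, p_arrival=0.6, p_discharge=0.4, seed=1):
    random.seed(seed)
    occupied, deferred = 0, 0
    for _ in range(hours):
        # At most one discharge and one arrival per simulated hour.
        if occupied > 0 and random.random() < p_discharge:
            occupied -= 1
        if random.random() < p_arrival:
            if occupied < CAPACITY:
                occupied += 1   # normal operation: a free bed absorbs the arrival
            else:
                deferred += 1   # "solid": no slack left, admission deferred
    return occupied, deferred


occupied, deferred = simulate()
print(f"beds occupied: {occupied}/{CAPACITY}, admissions deferred: {deferred}")
```

Because arrivals outpace discharges in this toy setup, occupancy drifts to capacity and deferrals accumulate, mirroring the bed-crunch dynamic in which previously independent decisions become tightly coupled.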
Cook noted that operators of software systems cannot interact directly with the systems they supervise; instead, they interact through representations.[18] Operators see visual representations of a software system's internal state, and they manipulate representations in order to act on the system.
Cook used the term line of representation as a metaphor for distinguishing between two sets of entities. Above the line of representation lie people, organizations, and human processes. Below the line of representation are the software artifacts and infrastructure.
Cook notes that the people within an organization, who exist above the line, describe the entities below the line in concrete terms, despite being unable to directly observe or act upon those entities.
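A minimal sketch can make the distinction concrete. This is my own illustration, not Cook's formulation, and the class and attribute names (Service, StatusPanel, _queue_depth) are hypothetical: the operator reads and acts only on the representation, never on the system's internal state itself.

```python
class Service:
    """Below the line: actual software state, never directly visible."""

    def __init__(self):
        self._queue_depth = 9000   # internal detail the operator never sees
        self._healthy = False

    def restart(self):
        self._queue_depth = 0
        self._healthy = True


class StatusPanel:
    """Above the line: the representation the operator reads and acts on."""

    def __init__(self, service):
        self._service = service

    def render(self):
        # The rendering simplifies (and may lag) what is really below the line.
        return "OK" if self._service._healthy else "DEGRADED"

    def press_restart(self):
        # Operator actions also pass through the representation layer.
        self._service.restart()


panel = StatusPanel(Service())
print(panel.render())   # the operator reasons about "DEGRADED", not _queue_depth
panel.press_restart()
print(panel.render())   # "OK"
```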
The Future of above-the-line Tooling, SRECon Americas, 2022
A Few Observations on the Marvelous Resilience of Bone & Resilience Engineering, REdeploy Conference, San Francisco, CA, 2019
Resilience In Complex Adaptive Systems, O'Reilly Velocity Web Performance and Operations Conference, New York, NY, 2013
How Complex Systems Fail, O'Reilly Velocity Web Performance And Operations Conference, Santa Clara, CA, 2012
Lectures on the study of cognitive work, The Royal Institute of Technology, Huddinge, Sweden, 2012