The Cynefin framework is a conceptual framework used to aid decision-making. Created in 1999 by Dave Snowden when he worked for IBM Global Services, it has been described as a "sense-making device". Cynefin is a Welsh word for 'habitat'.[1]
Cynefin offers five decision-making contexts or "domains"—clear (known as simple until 2014, then obvious until being recently renamed), complicated, complex, chaotic, and confusion—that help managers to identify how they perceive situations and make sense of their own and other people's behaviour. The framework draws on research into systems theory, complexity theory, network theory and learning theories.[2]
The idea of the Cynefin framework is that it offers decision-makers a "sense of place" from which to view their perceptions. Cynefin is a Welsh word meaning 'habitat', 'haunt', 'acquainted', or 'familiar'. Snowden uses the term to refer to the idea that we all have connections, such as tribal, religious and geographical, of which we may not be aware.[3][1] It has been compared to the Māori word tūrangawaewae, meaning a place to stand, or the "ground and place which is your heritage and that you come from".
Snowden, then of IBM Global Services, began work on a Cynefin model in 1999 to help manage intellectual capital within the company.[4] He continued developing it as European director of IBM's Institute of Knowledge Management,[5] and later as founder and director of the IBM Cynefin Centre for Organizational Complexity, established in 2002.[6] Cynthia Kurtz, an IBM researcher, and Snowden described the framework in detail the following year in a paper, "The new dynamics of strategy: Sense-making in a complex and complicated world", published in IBM Systems Journal.[7]
The Cynefin Centre—a network of members and partners from industry, government and academia—began operating independently of IBM in 2004.[8] In 2007 Snowden and Mary E. Boone described the Cynefin framework in the Harvard Business Review.[9] Their paper, "A Leader's Framework for Decision Making", won them an "Outstanding Practitioner-Oriented Publication in OB" award from the Academy of Management's Organizational Behavior division.[10]
Cynefin offers five decision-making contexts or "domains": clear, complicated, complex, chaotic, and a centre of confusion. The domain names have changed over the years. Kurtz and Snowden (2003) called them known, knowable, complex, and chaotic. Snowden and Boone (2007) changed known and knowable to simple and complicated. From 2014 Snowden used obvious in place of simple, and now uses the term clear.
The domains offer a "sense of place" from which to analyse behaviour and make decisions.[11] The domains on the right, clear and complicated, are "ordered": cause and effect are known or can be discovered. The domains on the left, complex and chaotic, are "unordered": cause and effect can be deduced only with hindsight or not at all.[12]
The clear domain represents the "known knowns". This means that there are rules in place (or best practice), the situation is stable, and the relationship between cause and effect is clear: if you do X, expect Y. The advice in such a situation is to "sense–categorize–respond": establish the facts ("sense"), categorize, then respond by following the rule or applying best practice. Snowden and Boone (2007) offer the example of loan-payment processing. An employee identifies the problem (for example, a borrower has paid less than required), categorizes it (reviews the loan documents), and responds (follows the terms of the loan).[9] According to Thomas A. Stewart,
This is the domain of legal structures, standard operating procedures, practices that are proven to work. Never draw to an inside straight. Never lend to a client whose monthly payments exceed 35 percent of gross income. Never end the meeting without asking for the sale. Here, decision-making lies squarely in the realm of reason: Find the proper rule and apply it.[13]
Snowden and Boone write that managers should beware of forcing situations into this domain by oversimplifying, by "entrained thinking" (being blind to new ways of thinking), or by becoming complacent. When success breeds complacency ("best practice is, by definition, past practice"), there can be a catastrophic clockwise shift into the chaotic domain. They recommend that leaders provide a communication channel, if necessary an anonymous one, so that dissenters (for example, within a workforce) can warn about complacency.[9]
The complicated domain consists of the "known unknowns". The relationship between cause and effect requires analysis or expertise; there is a range of right answers. The framework recommends "sense–analyze–respond": assess the facts, analyze, and apply the appropriate good operating practice.[9] According to Stewart: "Here it is possible to work rationally toward a decision, but doing so requires refined judgment and expertise. ... This is the province of engineers, surgeons, intelligence analysts, lawyers, and other experts. Artificial intelligence copes well here: Deep Blue plays chess as if it were a complicated problem, looking at every possible sequence of moves."[13]
The complex domain represents the "unknown unknowns". Cause and effect can only be deduced in retrospect, and there are no right answers. "Instructive patterns ... can emerge," write Snowden and Boone, "if the leader conducts experiments that are safe to fail." Cynefin calls this process "probe–sense–respond".[9] Hard insurance cases are one example. "Hard cases ... need human underwriters," Stewart writes, "and the best all do the same thing: Dump the file and spread out the contents." Stewart identifies battlefields, markets, ecosystems and corporate cultures as complex systems that are "impervious to a reductionist, take-it-apart-and-see-how-it-works approach, because your very actions change the situation in unpredictable ways."[13]
In the chaotic domain, cause and effect are unclear. Events in this domain are "too confusing to wait for a knowledge-based response", writes Patrick Lambe. "Action—any action—is the first and only way to respond appropriately."[14] In this context, managers "act–sense–respond": act to establish order; sense where stability lies; respond to turn the chaotic into the complex.[9] Snowden and Boone write:
In the chaotic domain, a leader’s immediate job is not to discover patterns but to staunch the bleeding. A leader must first act to establish order, then sense where stability is present and from where it is absent, and then respond by working to transform the situation from chaos to complexity, where the identification of emerging patterns can both help prevent future crises and discern new opportunities. Communication of the most direct top-down or broadcast kind is imperative; there’s simply no time to ask for input.[9]
The September 11 attacks were an example of the chaotic category.[9] Stewart offers others: "the firefighter whose gut makes him turn left or the trader who instinctively sells when the news about the stock seems too good to be true." One crisis executive said of the collapse of Enron: "People were afraid. ... Decision-making was paralyzed. ... You've got to be quick and decisive—make little steps you know will succeed, so you can begin to tell a story that makes sense."[13]
Snowden and Boone give the example of the 1993 Brown's Chicken massacre in Palatine, Illinois—when robbers murdered seven employees in Brown's Chicken and Pasta restaurant—as a situation in which local police faced all the domains. Deputy Police Chief Walt Gasior had to act immediately to stem the early panic (chaotic), while keeping the department running (simple), calling in experts (complicated), and maintaining community confidence in the following weeks (complex).[9]
The dark confusion domain in the centre represents situations where there is no clarity about which of the other domains apply (this domain has also been known as disordered in earlier versions of the framework). By definition it is hard to see when this domain applies. "Here, multiple perspectives jostle for prominence, factional leaders argue with one another, and cacophony rules", write Snowden and Boone. "The way out of this realm is to break down the situation into constituent parts and assign each to one of the other four realms. Leaders can then make decisions and intervene in contextually appropriate ways."[9]
As knowledge increases, there is a "clockwise drift" from chaotic through complex and complicated to simple. Similarly, a "buildup of biases", complacency or lack of maintenance can cause a "catastrophic failure": a clockwise movement from simple to chaotic, represented by the "fold" between those domains. There can be counter-clockwise movement as people die and knowledge is forgotten, or as new generations question the rules; and a counter-clockwise push from chaotic to simple can occur when a lack of order causes rules to be imposed suddenly.[9]
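The recommended action sequences for the four outer domains can be restated as a simple lookup. This sketch is purely illustrative: the dictionary, function and names below are not part of the framework itself, only a compact restatement of the sequences described by Snowden and Boone (2007).

```python
# Illustrative mapping from each outer Cynefin domain to its
# recommended action sequence (Snowden and Boone, 2007).
CYNEFIN_RESPONSES = {
    "clear":       ("sense", "categorize", "respond"),  # apply best practice
    "complicated": ("sense", "analyze", "respond"),     # consult experts
    "complex":     ("probe", "sense", "respond"),       # safe-to-fail experiments
    "chaotic":     ("act", "sense", "respond"),         # act first to establish order
}

def recommended_sequence(domain: str) -> str:
    """Return the recommended action sequence for a given domain."""
    steps = CYNEFIN_RESPONSES.get(domain.lower())
    if steps is None:
        # The central confusion domain has no sequence of its own: the
        # advice is to break the situation into parts and assign each
        # part to one of the four outer domains.
        return "decompose the situation and assign parts to the four domains"
    return "–".join(steps)

print(recommended_sequence("complex"))  # probe–sense–respond
```

The lookup makes the structural point of the framework visible: the appropriate response depends on first deciding which domain a situation belongs to, which is exactly what the confusion domain says is sometimes unclear.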
Cynefin was used by its IBM developers in policy-making, product development, market creation, supply chain management, branding and customer relations.[15] Later uses include analysing the impact of religion on policymaking within the George W. Bush administration,[16] emergency management,[17] network science and the military,[18] the management of food-chain risks,[19] homeland security in the United States,[20] agile software development,[21] and policing the Occupy Movement in the United States.[22]
It has also been used in health-care research, including to examine the complexity of care in the British National Health Service,[23] the nature of knowledge in health care,[24] and the fight against HIV/AIDS in South Africa.[25] In 2017 the RAND Corporation used the Cynefin framework in a discussion of theories and models of decision making.[26] The European Commission has published a field guide to use Cynefin as a "guide to navigate crisis".[27]
Criticism of Cynefin includes that the framework is difficult and confusing, needs a more rigorous foundation, and covers too limited a selection of possible contexts.[28] Another criticism is that terms such as known, knowable, sense, and categorize are ambiguous.
Professor Simon French recognizes "the value of the Cynefin framework in categorising decision contexts and identifying how to address many uncertainties in an analysis", and believes it builds on seminal works such as Russell L. Ackoff's Scientific Method: Optimizing Applied Research Decisions (1962), C. West Churchman's Inquiring Systems (1967), Rittel and Webber's Dilemmas in a General Theory of Planning (1973), Douglas John White's Decision Methodology (1975), John Tukey's Exploratory Data Analysis (1977), Mike Pidd's Tools for Thinking: Modelling in Management Science (1996), and Ritchey's General Morphological Analysis (1998).[29]
Firestone and McElroy argue that Cynefin is a model of sensemaking rather than a full model of knowledge management and processing.[30]
Steve Holt compares Cynefin to the theory of constraints. The theory of constraints argues that most system outcomes are limited by certain bottlenecks (constraints), and that improvements made away from these constraints tend to be counterproductive because they simply place more strain on a constraint. Holt places the theory of constraints within the Cynefin framing by arguing that it moves situations from complex to complicated: abductive reasoning and intuition, followed by logic, are used to create an understanding, before a probe is created to test that understanding.[31]
Cynefin defines several types of constraints. Fixed constraints stipulate that actions must be done in a certain way and in a certain order, and apply in the clear domain. Governing constraints are looser, acting more like rules or policies, and apply in the complicated domain. Enabling constraints, which operate in the complex domain, allow a system to function but do not control the entire process. Holt argues that constraints in the theory of constraints correspond to Cynefin's fixed and governing constraints, and that injections in the theory of constraints correspond to enabling constraints.