Normalization process theory (NPT) is a sociological theory, generally used in the fields of science and technology studies (STS), implementation research, and healthcare system research. The theory deals with the adoption of technological and organizational innovations into systems; recent studies have also used it to evaluate new practices in social care and education settings.[1] [2] It was developed out of the normalization process model.
Normalization process theory, dealing with the adoption, implementation, embedding, integration, and sustainment of new technologies and organizational innovations, was developed by Carl R. May, Tracy Finch, and colleagues between 2003 and 2009.[3] [4] [5] It was developed through ESRC-funded research on telehealth and through an ESRC fellowship to May. Its application to randomised controlled trials was led by Professor Elizabeth Murray of University College London, who characterised normalization process theory as a "trial killer".
Through three iterations, the theory has built upon the normalization process model previously developed by May et al. to explain the social processes that lead to the routine embedding of innovative health technologies.[6] [7]
Normalization process theory focuses attention on agentic contributions – the things that individuals and groups do to operationalize new or modified modes of practice as they interact with dynamic elements of their environments. It defines implementation, embedding, and integration as a process that occurs when participants deliberately initiate and seek to sustain a sequence of events that bring a new practice into operation. The dynamics of implementation processes are complex, but normalization process theory facilitates understanding by focusing attention on the mechanisms through which participants invest in and contribute to them. It reveals "the work that actors do as they engage with some ensemble of activities (that may include new or changed ways of thinking, acting, and organizing) and by which means it becomes routinely embedded in the matrices of already existing, socially patterned, knowledge and practices".[8] In a paper published under a creative commons license, May and colleagues describe how, since 2006, NPT has undergone three iterations; these have explored objects, agents, and contexts.[9]
The first iteration of the theory focused attention on the relationship between the properties of a complex healthcare intervention and the collective action of its users. Here, agents' contributions are made in reciprocal relationship with the emergent capability that they find in the objects – the ensembles of behavioural and cognitive practices – that they enact. These socio-material capabilities are governed by the possibilities and constraints presented by objects, and the extent to which they can be made workable and integrated in practice as they are mobilized.[10] [11]
The second iteration of the theory built on the analysis of collective action, and showed how this was linked to the mechanisms through which people make their activities meaningful and build commitments to them.[12] Here, investments of social structural and social cognitive resources are expressed as emergent contributions to social action through a set of generative mechanisms: coherence (what people do to make sense of objects, agency, and contexts); cognitive participation (what people do to initiate and be enrolled into delivering an ensemble of practices); collective action (what people do to enact those practices); and reflexive monitoring (what people do to appraise the consequences of their contributions). These constructs are the core of the theory, and provide the foundation of its analytic purchase on practice.
The third iteration of the theory developed the analysis of agentic contributions by offering an account of centrally important structural and cognitive resources on which agents draw as they take action.[13] Here, dynamic elements of social contexts are experienced by agents as capacity (the social structural resources that they possess, including informational and material resources, and social norms and roles) and potential (the social cognitive resources that they possess, including knowledge and beliefs, and individual intentions and shared commitments). These resources are mobilized by agents when they invest in the ensembles of practices that are the objects of implementation.
Normalization process theory is regarded as a middle-range theory located within the 'turn to materiality' in STS. It therefore fits well with the case-study-oriented approach to empirical investigation used in STS. It also appears to be a straightforward alternative to actor–network theory in that it does not insist on the agency of non-human actors, and it seeks to be explanatory rather than descriptive. However, because normalization process theory specifies a set of generative mechanisms that empirical investigation has shown to be relevant to the implementation and integration of new technologies, it can also be used in larger-scale structured and comparative studies. Although it fits well with the interpretive approach of ethnography and other qualitative research methods,[14] it also lends itself to systematic review[15] [16] and survey research methods. As a middle-range theory, it can be federated with other theories to explain empirical phenomena. It is compatible with theories of the transmission and organization of innovations, especially diffusion of innovations theory, labor process theory, and psychological theories including the theory of planned behavior and social learning theory.