How Psychological, Sociological and Cultural Theories of Risk Differ


The purpose of this paper is to examine how psychological, sociological and cultural theories of risk differ.  A preliminary step will be to affirm why such approaches are worthy of study.  A summary of the three approaches, with an overview of how they differ, will then be undertaken, and brief suggestions for further inquiry will be included.

There seem to be three major differences in the theories.  These can be categorized as:  respect, openness, and complexity.  The reader may find that respect for the individual increases as each theory is described.  Similarly, the theories exhibit more openness as we progress.  So, the individual’s interaction with people and things becomes more important to our understanding of risk.  Also, there is an increased complexity, or layered approach, as we move from psychological, through sociological, and into cultural theories.


It should first be stated that these three approaches are vital to the understanding of risk.  Most argue that ‘risk must be understood within a wider social or cultural context’ (SCSPO: 15).  I side with the assertion that ‘Risk perception is inherently multi‑dimensional and personalistic, with a particular risk or hazard meaning different things to different people and different things in different contexts’ (Royal Society, 1992: 7 quoted in SCSPO: 18).

The individual is relevant.  I cannot conceive of a situation in which risk does not vary as a result of its interaction with people and groups.  Otherwise, the people ‘at risk’ or interacting with a risk would not affect it; their age, gender, social group, cognitive skills and life experiences would be irrelevant.  This is impossible.

Similarly, I am compelled to find people and the human condition at issue when defining a crisis. Crisis has been called ‘a serious threat to the basic structures or the fundamental values and norms of a social system…’ (Rosenthal, 1986 quoted in SCSPO: 41).  Does a risk, crisis or disaster exist without interaction with people?  This is unlikely, as is the case of the tree falling in the woods with no one there to hear the sound.  It has been argued that ‘Disasters do not cause effects.  The effects are what we call a disaster’ (Dombrowsky, 1995: 242 quoted in SCSPO: 43). Thus, if a tidal wave occurs in the middle of the ocean, where there are no ships, buildings or people, it may be difficult to describe it as a disaster, because there are not measurable effects on people.


Having injected the human element into the study of risk, I now pose the hard questions concerning what human elements are worthy of study.  Which aspects of the human prism will we look through?  Among the most interesting prisms through which to view risk are the psychological, sociological, and cultural theories.

All three theories inject the human condition into the equation, but in different ways and with varying emphasis.  Psychologists would have us focus on the cognitive skills and perceptions of the individual who experiences risk.  The areas of concern to sociologists include ‘risk communication, systems approaches, and socio‑technical or isomorphic learning approaches’ (SCSPO: 16).  Cultural theorists would have us see risk ‘as a cultural phenomenon dependent on everyday social involvement with family, friends and peers’ and tied up with identity (Ibid).


Psychologists are concerned primarily with measurement.  They look to laboratory techniques to provide quantifiable, repeatable and thus deductive results.  Psychologists study ‘decision-making, cognitive, psychometric approaches and mental modelling approaches’ under controlled conditions (SCSPO: 13).  This makes a contribution to our understanding, but can also lead to ‘context specific results’ which do not entirely tell us how people will behave under non‑controlled circumstances (Ibid: 17).

The risk perception approach codifies the differences between people’s perceptions of risk and the actual incidence of risk.  People prefer known, controllable and voluntary risks to the more frightening unknown and imposed risks (Otway and von Winterfeldt, 1982, quoted in Royal Society, 1992: 101).  Further, lay people misunderstand probability.  Psychologists conclude there is a cognition or trust problem to be addressed (SCSPO: 17).

Risk perception researchers study cognition, which is ‘the mental process or faculty by which knowledge is acquired through perception, reasoning or intuition’ (Ibid: 70).  Psychologists use social survey and laboratory techniques to compare and contrast perceived risk and actual risk in populations (Ibid: 70‑1).  Since psychologists view risk as a concrete, objectively measurable entity, they suggest that people have irrational attitudes and behaviours concerning risk (Gardner, 1987: 360, Kahneman and Tversky, 1979 quoted in SCSPO: 71).

Psychometric approaches use statistical analysis of data to gauge risk perception.  The flaw in this type of research is often that respondents cannot be expected to know about certain hazards (smallpox vaccinations, for example) unless some coincidence has placed this knowledge ‘at hand’.  Yet few respondents may be willing to identify themselves as knowing nothing about what sounds like an important topic.  Further, some respondents may grow weary of trying to guess at the magnitude of hazards they have never before considered.  If these perceptions are relevant at all, they have surely been mediated by a variety of social and cultural processes.

There is an ‘intellectual ethnocentrism’ to both psychometric and risk perception techniques.  Some researchers seem to take delight in pointing to the ‘wrongness’ of their subjects’ perceptions.  When psychologists suggest that people are verging on being mentally sick (irrational), the research itself verges on misanthropy.

An upgrade from these prisms is the mental model approach.  This is qualitative research and presents concerns similar to those raised by focus groups.  The representativeness of the sample, the quality of the leader and the context of the subject matter being tested are all variables.  The assumption is that expert opinion is accurate, when in fact it may not be (Royal Society, 1992: 107).  The technique of showing respondents how misinformed they are is arrogant.  Finally, it is assumed that lay people wish to be informed, when in fact many activists wish to disagree with authorities, including experts.  Many citizens suffer from information overload and competing personal commitments, and do not relish receiving more information.

The overriding concern with many psychological approaches is the context in which the study is undertaken.  The context is generally the laboratory or a social setting which is constructed (focus group or lay/expert dialogue).  It could be argued that the context has been removed from the psychological methodology.  The social amplification theory (Royal Society, 1992: 114-116) holds much promise to bring risk perception research out of the laboratory.  But what’s also needed is the inclusion of social and power structures, mediated communication (through media, peers and others) and other factors (SCSPO: 81).  I am compelled to consider the social matrix in which decisions are made (Pidgeon, 1992: 170‑171 quoted in SCSPO: 92‑93).


It is compelling to look beyond psychological approaches to understand risk. Despite the contribution these theories make, ‘it is restrictive to talk of failures in large‑scale hazardous systems purely in technical terms.  Individuals, their organizations and groups, and ultimately their cultures, are all implicated’ (Pidgeon, 1992: 181 quoted in SCSPO: 103).  The Kegworth crash of January 1989 killed 47 people when a British Midland flight, attempting an emergency landing at East Midlands Airport, came down on the embankment of the M1 motorway.  A fan blade had broken in the left engine, disrupting the air conditioning and filling the flight deck with smoke.  The pilots reacted by shutting down the one remaining good engine.  They may have been confused, misread their instruments, or known of earlier problems in the left engines of similar planes.  Technically, this can be viewed as a cognitive problem, poor sight lines, or problems inherent in the human/technology interface.  But there are other elements (Pidgeon, 1992: 183 quoted in SCSPO: 105).  The chain of command comes into play, including the deference with which flight crews treat captains and male/female interaction.  Thus we must look to sociological causes as well, and these include ‘risk communication, systems approaches and socio‑technical or isomorphic learning approaches’ (SCSPO: 16).

Socio-technical manifestations involve social disruption, multiple stakeholders and social processes (Shrivastava, 1988 quoted in SCSPO: 199).  The common theme within the phases described (i.e. the crisis of management, the operational crisis situation and the crisis of legitimation) is the degree to which the human element comes into play (SCSPO: 205).  Coupling causes (Ibid: 207) can also involve the relatively unpredictable interaction between people, or between people and the technical structure.  Manifestations spread into the broader community in the form of inquests, compensation and new engineering (Ibid: 209).  Thus, a major contribution of the socio-technical approach is its increased recognition of the role of people (Ibid: 213).  The types of victims and their interactions with the organization and responders create important distinctions and pave the way for different strategies for each type of player (Ibid: 214).


Risk communication offers a set of rules by which information about risk will be more acceptable to a lay audience.  Making risks known and voluntary is chief among these.  Other suggestions include using hourly workers and high school science teachers as spokespeople, showing caring while disseminating facts, advice on body language, and arriving early and staying late at public meetings (Covello, 1993: video).

Risk communication theories are flawed by the so-called ‘deficit model’ (SCSPO: 115).  This concept holds science to be the highest, if not the only proper type of unassailable knowledge.  It negates lay experience and alleges an objectivity to science.  The poor citizen needs to be enriched through quasi‑scientific techniques that sound like they are based on impressive research and certainty.  The citizen is a sponge waiting to soak up high-quality information from his or her betters (Ibid: 115‑16).

Risk communication has an abundance of faith in science, an essentially negative view of people, and persists in trying to transfer interesting, isolated data from the laboratory to the real world.  To upgrade risk communication and make it more applicable to real-life situations, the field must recognize the ‘very complex ‘open system’’ in which people operate as opposed to the ‘‘closed system’ of the laboratory’ (Ibid: 126).  We must ‘accept that risks and hazards exist in a social, economic and political context (including a personal political context)…’ (Ibid: 133).


This prism views organizations as ‘a whole consisting of components which are interconnected in an organized way’ and studies the ‘characteristics of an organization’ (Ibid: 172).  However, as in other approaches where the human element is key, ‘‘systems’ are a perceptual construct (Waring, 1989, 1996a 1996c quoted in SCSPO: 172) and not an incontestable truth’ (SCSPO: 172).  Systems might ‘include families, institutions, towns and societies’ in which ‘world‑view’ can be an important variable (Ibid: 174).

Systems approaches tell us important things about the human element.  ‘Prediction and control’ are difficult when dealing with ‘variable, fallible, wilful and unpredictable humans’ (Ibid: 180).  Further, ‘reductionism, in contrast to holism, will deliberately attempt to mask significant features or system properties’ (Ibid: 181).  In addition to the importance of the human element, we learn that thorough understanding of systems components may not lead to an acceptable understanding of the whole system.  So, ‘examination of an individual scaffold component does not enable accurate prediction of what a particular erected scaffold will look like or how safe it will be’ (Ibid: 180).

World-view adds the element of ‘perceptual biases’ which individuals may hold (i.e. ‘structural/functionalist, interpretative, radical structuralist and radical humanist’) (Ibid: 183).  Management systems codify the prescriptive approach.  Depending perhaps on one’s view of people, one can view organizations from the ‘clockwork’, ‘socio-technical’ (i.e. adding the human element) or human activity vantage points (Ibid: 185).  The efficacy of the systems approach may depend on the degree to which the human element is taken into account.  Reflexive or holistic approaches may yield better results than prescriptive ones.  Thus, ‘[t]he quality of a management system and hence its capacity to deal effectively with risks will depend on how holistic an approach to that system is taken’ (Ibid: 187).  Put another way, both the system (design, framework, management) and the human factors (context) need to be taken into account for optimum success (Ibid: 190).


In the context of this examination, cultural theory is a broader concept than the psychological or sociological approaches discussed.  We can view culture as pertaining to ‘a particular autonomous group or population’ or as a group or groups’ ‘system of ideas, values and behaviours’ (Ibid: 319).  Culture is ‘that complex whole which includes knowledge, belief, art, morals, law, custom and any other capabilities and habits acquired by man as a member of society’ (Tylor, 1871 quoted in SCSPO: 319).  In this context, risk is culturally constructed as a result of ‘everyday social involvement with family, friends and peers’ (SCSPO: 16).  Strong determinants of the concept of risk include the concept of individual identity and ‘the strength and context of an individual’s relationship to social groups, and the social structure or nature of such groups’ (Ibid).

Psychology and sociology make some attempt to explain the origin of perceptions in experiences or group membership.  Cultural theory seems firm in its contention that ‘one’s attributes as a cultural actor are derived from social membership’ (Tylor, quoted in SCSPO: 320).  Cultural theory seems to ask us to consider the multiple impacts of our memberships.

Psychology often studies the individual in relative isolation.  Sociology seems to want to study the individual as he or she relates to another person or group, operates within a group or operates in relation to human or technical systems.  Cultural theory adds complexity.  It wishes to examine the texture and nuances of our multiple memberships or ‘patterns of interpersonal relations’ (Thompson et al., 1990: 1 quoted in SCSPO: 321).

To add complexity to social memberships, there are grid/group dimensions.  Perhaps we should envision a matrix laid over a list of social memberships, with the grid being ‘the total body of rules and constraints which a culture imposes on its people in a particular context’ (Mars and Nicod, 1983: 124 quoted in SCSPO: 322) and the way a person’s ‘life is circumscribed by externally imposed prescriptions’ (Thompson et al., 1990: 5 quoted in SCSPO: 322).  The group aspect deals with ‘collectiveness among people who meet face to face’ (Mars, 1994: 24 quoted in SCSPO: 322) and the moral coercion which others impose on a person by virtue of his membership in a ‘bounded face-to-face unit’ (Mars and Nicod, 1983: 125 quoted in SCSPO: 322).

As in the American voting method of ‘pulling the lever’ for all candidates of one party because of membership in that party, grid and group help us explain behaviour.  These dimensions help social actors ‘justify the validity of their social situations’ (SCSPO: 323).  But, American voters may split their vote and support one party for Congress and another for President when the situation warrants.  Similarly, some grid rules may override group membership (or each other), and vice versa.  The social scientist must examine a complex set of overlapping and sometimes competing grid/group dimensions to explain human behaviour.  Cultural theory assists in this quest.

Cultural theorists seem to have addressed the problem of archetypes.  While extremes in psychological categorization (inferior, superior, extrovert, introvert) are helpful in pointing to human traits and behaviour patterns, no one of these actually exists in isolation in the real world.  While one may predominate, people exhibit aspects of several.  This dilemma has been nicely addressed through the requisite variety condition.  Four ways of life (individualism, hierarchical, egalitarian and fatalism, as well as the fifth and rare hermit) are said to exist simultaneously, and indeed ‘each depends upon the survival of the others for its own continued existence’ (Ibid: 325).  Cultural theory recognizes that ‘It is only the presence in the world of people who are different from them that enables adherents of each way of life to be the way they are’ (Thompson et al., 1990: 96 quoted in SCSPO: 325).  So, it may be that one of cultural theory’s perceived weaknesses is indeed a strength.  While it does deal with archetypes and is reductionist, ‘there is no need to choose between…collectivism and individualism, values and social relations, or change and stability’ (Thompson et al., 1990: 21 quoted in SCSPO: 326).  All personality is essentially multiple.

Cultural theory gives us a matrix with which to decide how individuals perceive risk and behave.  It helps us decide how people ‘choose what to fear (and how to fear it) in order to support their way of life’ (Wildavsky and Dake, 1990: 43 quoted in SCSPO: 328).  So, for ‘the individualist entrepreneur, nature is benign’, disaster cannot happen to him or her, and risk is disregarded.  The hierarch thinks ‘nature is perverse/tolerant’ and blames disasters on ‘deviance and rule breaking’.  The egalitarian is ‘precariously balanced…. Risk and potential disaster are an ever-present threat’.  The fatalist finds that disasters are ‘acts of God’ (SCSPO: 329).

Cultural theory allows and even requires an open and eclectic approach to risk.  If it is true that individuals ‘select certain risks for attention to defend their preferred lifestyles and as a forensic resource to place blame on other groups’ (Royal Society, 1992: 112 quoted in SCSPO: 330), then it may also be true that organizations do the same.  Thus, if a corporation or government wishes to examine and plan for risk fully, then it should involve critics.  ‘…[T]hey won’t hesitate to articulate the risks’ that the corporation or government overlooked or did not want to acknowledge (SCSPO: 330).

The way in which organizations either assimilate or block outside information may be through their ‘cultural web’.  The various ways in which an organization can understand its attitudes toward risk involve the cultural web of ‘rituals and myths, symbols, power structures, organizational structures, control systems and routines’ (Ibid: 333).  Examination of these complex elements allows managers to understand how culture ‘impacts on the strategy they are following, and the difficulty of changing it’ (Johnson, 1992: 208 quoted in SCSPO: 333).  Thus, a manager can examine why ‘apparently good (risk, crisis and disaster management) ideas sometimes fail to transfer effectively across cultural borders’ (SCSPO: 338).


The two major erroneous assumptions about risk are that it is a ‘concrete entity’ which ‘can be precisely defined and unambiguously measured in objective terms’ (Ibid: 247) and that risk modeling is ‘a neutral, objective activity and therefore the final quantitative assessment will be unbiased and also independent of the analyst’ (Ibid: 248).  One might add that risk modeling is also not independent of those causing the risk and those who are at risk.  A danger of ignoring the human element with a strictly engineering approach is that ‘the complexity of human behaviour in general and human errors in particular can[not] be prespecified and reduced to a simple unitary numerical representation’ (Ibid: 250).

So, if we are committed to understanding risk, we must be committed to understanding the human condition.  After all, about 80% of ‘the recommendations made by public inquiries…were concerned with management, administration and information, and not technical matters’ (Toft and Reynolds, 1994 quoted in SCSPO: 242).

All social science approaches may fail to provide all the answers, but each technique in turn asks certain questions that others don’t.  Thus, by using all the investigative techniques available, more complete answers are possible.  Axiomatically, failing to employ the range of approaches available seems to guarantee a less than complete set of answers and solutions.

In our pursuit of the management, measurement, understanding and communication of risk, it is important to remember how the three approaches studied differ.  They differ mainly in their validation of individual experience and empowerment.  Each approach in turn appears more open, more textured and more willing to accept multiple sources of data, and even uncertainty, in order to examine the truth.  Each in turn may even be less self-absorbed, self-assured and myopic.  To begin understanding an entity as complex and changeable as risk, we must surely be willing to use disciplines that are at least as complex and difficult as the subject being studied.



Bonner, W. Allan (1994) ‘Off The Record: How to Succeed with Risk Communications’, Law Times, November 21–27, 1994, Paula Kulig, Managing Editor.

Covello, V.T., Butte, G., Thorne, S. and Walsh, J. (1993) A Process for Community Dialogue and Outreach for the Canadian Chemical Industry: A workshop for Members of The Canadian Chemical Producers’ Association, The Canadian Chemical Producers’ Association. (Video and Manual).

Dombrowsky, W.R. (1995) ‘Again and Again: Is a Disaster What We Call “Disaster”? Some Conceptual Notes on Conceptualizing the Object of Disaster Sociology’, in: Module 1, Unit 2 (‘A Theory of Crisis’) of M.Sc. in Risk, Crisis and Disaster Management, Leicester: SCSPO: 43.

Gardner, H. (1987) ‘The Mind’s New Science: A History of the Cognitive Revolution’, in: Module 1, Unit 3 (‘Psychological Approaches to Risk Management’) of M.Sc in Risk, Crisis and Disaster Management, Leicester: SCSPO: 71.

Johnson, G. (1992) ‘Managing Strategic Change: Strategy, Culture and Action’, in: Module 1, Unit 8 (‘Introduction to Cultural History’) of M.Sc. in Risk, Crisis and Disaster Management, Leicester: SCSPO: 333.

Kahneman, D. and Tversky, A. (1979) ‘Prospect Theory: An Analysis of Decision Making Under Risk’, in: Module 1, Unit 3 (‘Psychological Approaches to Risk Management’) of M.Sc. in Risk, Crisis and Disaster Management, Leicester: SCSPO: 71.

LaPiere, T.R. (1934) ‘Attitudes vs. Actions’, in: Module 1, Unit 6 (‘The Management of Organizational Risks’) of M.Sc. in Risk, Crisis and Disaster Management, Leicester: SCSPO: 252.

Mars, G. (1994) ‘Cheats at Work: an Anthropology of Workplace Crime’, in: Module 1, Unit 8 (‘Introduction to Cultural History’) of M.Sc. in Risk, Crisis and Disaster Management, Leicester: SCSPO: 322.

Mars, G. and Nicod, M. (1983) ‘The World of Waiters’, in: Module 1, Unit 8 (‘Introduction to Cultural History’) of M.Sc. in Risk, Crisis and Disaster Management, Leicester: SCSPO: 322.

Otway, H.J. and von Winterfeldt, D. (1982) ‘Beyond Acceptable Risk: On the Social Acceptability of Technologies’, in: Royal Society, Chapter 5 (‘Risk Perception’) of Risk: Analysis, Perception and Management, London: Royal Society: 101.

Pidgeon, N.F. (1992) ‘The Psychology of Risk’, in: Module 1, Unit 3 (‘Psychological Approaches to Risk Management’) of M.Sc. in Risk, Crisis and Disaster Management, Leicester: SCSPO: 89-107.

Royal Society (1992) Risk: Analysis, Perception and Management, London: The Royal Society.

Royal Society (1992) ‘Risk: Analysis, Perception and Management’, in: Module 1, Unit 1 (‘An Introduction to Risk, Crisis and Disaster Management’) of M.Sc. in Risk, Crisis and Disaster Management, Leicester: SCSPO: 18, 330.

Rosenthal, U. (1986) ‘Crisis Decision Making in the Netherlands’, in: Module 1, Unit 2 (‘A Theory of Crisis’) of M.Sc. in Risk, Crisis and Disaster Management, Leicester: SCSPO: 41.

Shrivastava, P., Mitroff, I.I., Miller, D. and Miglani, A. (1988) ‘Understanding Industrial Crisis’, in: Module 1, Unit 5 (‘Systems Ideas and Risk’) of M.Sc. in Risk, Crisis and Disaster Management, Leicester: SCSPO: 199.

The Scarman Centre for the Study of Public Order (1997) ‘Module 1: Theories of Risk and Crisis’ of M.Sc. in Risk, Crisis and Disaster Management, Leicester: SCSPO.

Thompson, M., Ellis, R. and Wildavsky, A. (1990) ‘Cultural Theory’, in: Module 1, Unit 8 (‘Introduction to Cultural History’) of M.Sc. in Risk, Crisis and Disaster Management, Leicester: SCSPO: 321, 322, 325, 326.

Toft, B. and Reynolds, S. (1994) ‘Learning from Disasters – A Management Approach’, in: Module 1, Unit 6 (‘The Management of Organizational Risks’) of M.Sc. in Risk, Crisis and Disaster Management, Leicester: SCSPO: 242.

Tylor, E. (1871) ‘Primitive Culture’, in: Module 1, Unit 8 (‘Introduction to Cultural History’) of M.Sc. in Risk, Crisis and Disaster Management, Leicester: SCSPO: 319, 320.

Waring, A.E. (1989) ‘Systems Methods for Managers – A Practical Guide’, in: Module 1, Unit 5 (‘Systems Ideas and Risk’) of M.Sc. in Risk, Crisis and Disaster Management, Leicester: SCSPO: 172.

Waring, A.E. (1996a) ‘Safety Management Systems’, in: Module 1, Unit 5 (‘Systems Ideas and Risk’) of M.Sc. in Risk, Crisis and Disaster Management, Leicester: SCSPO: 172.

Waring, A.E. (1996c) ‘Practical Systems Thinking’, in: Module 1, Unit 5 (‘Systems Ideas and Risk’) of M.Sc. in Risk, Crisis and Disaster Management, Leicester: SCSPO: 172.

Wildavsky, A. and Dake, K. (1990) ‘Theories of Risk Perception: Who Fears What and Why?’, in: Module 1, Unit 8 (‘Introduction to Cultural History’) of M.Sc. in Risk, Crisis and Disaster Management, Leicester: SCSPO: 328.

Allan Bonner
Dr. Allan Bonner, BA, BEAD, MA, MSc, DBA, LLM, MScPI, MPPA(Cand.), is the founder of Allan Bonner Communications Management Inc., a communications company that has helped clients on five continents face some of the most contentious issues of our times. For 30 years he has taught regularly at the post-graduate level, as a sessional, contract, and intensive one-day instructor across Canada, and as a guest lecturer around the world. He has worked with the military, including Peacekeepers, the Provost Marshal, the Judge Advocate General (JAG), and the Military Complaints Commissioner. He has helped international diplomats; domestic, central, and offshore banks; 11 oil, gas, and chemical companies; and other blue-chip clients. His political work includes helping heads of government, cabinet ministers, G7, G8 and UN delegations, the WTO, NATO, IAEA, and many others. He has worked in Hong Kong, Seoul, Tokyo, Bangkok, Beijing, Singapore, Canberra, Budapest, Geneva, Bled, Vienna, Kuwait City, Nairobi, most American states, Mexico, and all Canadian provinces. Allan was the first North American, and still the only Canadian, to be awarded a post-graduate degree in risk, crisis, and disaster management. He has taught on most Canadian military bases and at the Canadian Police College, and has worked with security and intelligence officials. At the international level, Allan has worked with several UN and international entities with security responsibilities.