Cultural Influences on Errors: Prevention, Detection, and Management

Publication: Contributions in edited volumes › Chapter › peer-reviewed

Standard

Cultural Influences on Errors: Prevention, Detection, and Management. / Gelfand, Michele J.; Frese, Michael; Salmon, Elizabeth.

Errors in Organizations. ed. / David A. Hofmann; Michael Frese. New York: Routledge Taylor & Francis Group, 2011. pp. 273-315 (SIOP organizational frontiers series).

Harvard

Gelfand, MJ, Frese, M & Salmon, E 2011, Cultural Influences on Errors: Prevention, Detection, and Management. in DA Hofmann & M Frese (eds), Errors in Organizations. SIOP organizational frontiers series, Routledge Taylor & Francis Group, New York, pp. 273-315. <https://www.taylorfrancis.com/books/e/9780203817827/chapters/10.4324/9780203817827-18>

APA

Gelfand, M. J., Frese, M., & Salmon, E. (2011). Cultural Influences on Errors: Prevention, Detection, and Management. In D. A. Hofmann & M. Frese (Eds.), Errors in Organizations (pp. 273-315). (SIOP organizational frontiers series). Routledge Taylor & Francis Group. https://www.taylorfrancis.com/books/e/9780203817827/chapters/10.4324/9780203817827-18

Vancouver

Gelfand MJ, Frese M, Salmon E. Cultural Influences on Errors: Prevention, Detection, and Management. In: Hofmann DA, Frese M, editors. Errors in Organizations. New York: Routledge Taylor & Francis Group; 2011. p. 273-315. (SIOP organizational frontiers series).

BibTeX

@inbook{2c88836a5bbb4114ac338aae168e2739,
title = "Cultural Influences on Errors: Prevention, Detection, and Management",
abstract = "On April 26, 1986, a chain of problems caused by design flaws and exacerbated by human error culminated in the worst nuclear power plant disaster in history. When Reactor 4 at the Chernobyl plant exploded, it released 400 times the radioactive fallout as the atomic bombing of Hiroshima (Stone, 2006). While an accurate death count is impossible because of Soviet efforts to cover up the effects of the fallout, 31 people died instantly after the reactor explosion (“The Chernobyl Accident,” 2000), at least 28 workers who were diagnosed with acute radiation sickness died in the 4 months following the accident (United Nations Scientific Committee on the Effects of Atomic Radiation, 2000), and the Chernobyl Forum (2006) estimated that the accident could cause another 4,000 cancer deaths among those who experienced the highest levels of exposure. In 2002, nearly 300 people died and hundreds were injured after a Tanzanian passenger train lost power on a hill and rolled back into a freight train. A government report found that the driver of the train failed to apply the manual brakes, a mistake attributed to human error and inexperience (“Human Error Blamed,” 2002). In 2003, after the crash of the Andrew J. Barberi ferry, in which 10 people died, a review of the U.S. Coast Guard safety records showed that more than 30 of the 50 accidents that have occurred on Staten Island ferries “have been blamed on what investigators deemed to be mistakes or acts of negligence by captains, mates, deckhands or other ferry employees” (McIntire, 2003). These stories illustrate an intuitive conclusion: Errors are univer- sal. Errors, whether they cause thousands of deaths or minor inconveniences, are a global phenomenon. Yet, despite the fact that errors are a human universal, a careful look at this volume illustrates that scholarship on the topic is generally a Western enterprise, with theories and research generated largely in the United States and Western Europe (for notable exceptions, see Helmreich, 2000; Helmreich & Merritt, 1998; Helmreich, Wilhelm, Klinect, & Merritt, 2001; Jing, Lu, & Peng, 2001; Li, Harris, Li, & Wang, 2009). Examining cultural influences on errors is critical for theory and practice. Theoretically, cross-cultural research on errors will help to elucidate further what is universal (i.e., etic) and culture specific (i.e., emic) about error processes while expanding error theories, constructs, and measures to be globally relevant. Practically, a cultural perspective on errors is critical to help identify how to prepare best for and manage errors in ways that are targeted to specific cultural contexts. For example, there are large differences in aircraft accidents across different nations even though most airplanes are similar in make and age and are often serviced by the same specialized service firms (Civil Aviation Authority, 1998). Although such differences are likely multiply determined, cultural characteristics of the pilots and the crews in error prevention, detection, or management may at least partially explain such cultural variation. Cross-cultural research is also needed to help identify ways in which multicultural teams can better manage cultural differences in responses to errors. Finally, the study of errors also enhances our understanding of culture itself. As Freud (1901/1954) has noted, errors often point out critical system characteristics that may go unnoticed. 
In other words, errors may tell us something about fundamental characteristics of cultural systems themselves. For example, as we discuss in this chapter, errors in high-power-distance cultures can occur when low-power members fail to communicate openly with their superiors, a phenomenon that is a key defining feature of such cultures. More generally, a cultural perspective on errors has much to offer the science and practice of errors in organizations. In this chapter, we integrate research on culture with research on errors to identify key propositions for future research. We first discuss critical distinctions regarding the construct of errors, and we advance a process model of errors that includes error prevention, error detection, and error management. We then discuss how key cultural dimensions, including uncertainty avoidance, humane orientation, tightness-looseness, fatalism, power distance, and individualism-collectivism, differentially affect each stage of the error process. We identify numerous “cultural paradoxes” regarding the error process for each of these cultural dimensions that we expect could have important short- and long-term consequences for organizations. We also discuss error management in culturally diverse groups, identifying group compositions that are the most ideal in managing the three stages of the error management process. Finally, we conclude with implications of this perspective for theory, research, and training to manage errors within a global context.",
keywords = "Business psychology, Entrepreneurship",
author = "Gelfand, {Michele J.} and Michael Frese and Elizabeth Salmon",
year = "2011",
month = jul,
day = "21",
language = "English",
isbn = "978-0-8058-6291-1",
series = "SIOP organizational frontiers series",
publisher = "Routledge Taylor & Francis Group",
pages = "273--315",
editor = "Hofmann, {David A.} and Michael Frese",
booktitle = "Errors in Organizations",
address = "United Kingdom",

}

RIS

TY - CHAP

T1 - Cultural Influences on Errors

T2 - Prevention, Detection, and Management

AU - Gelfand, Michele J.

AU - Frese, Michael

AU - Salmon, Elizabeth

PY - 2011/7/21

Y1 - 2011/7/21

N2 - On April 26, 1986, a chain of problems caused by design flaws and exacerbated by human error culminated in the worst nuclear power plant disaster in history. When Reactor 4 at the Chernobyl plant exploded, it released 400 times the radioactive fallout of the atomic bombing of Hiroshima (Stone, 2006). While an accurate death count is impossible because of Soviet efforts to cover up the effects of the fallout, 31 people died instantly after the reactor explosion (“The Chernobyl Accident,” 2000), at least 28 workers who were diagnosed with acute radiation sickness died in the 4 months following the accident (United Nations Scientific Committee on the Effects of Atomic Radiation, 2000), and the Chernobyl Forum (2006) estimated that the accident could cause another 4,000 cancer deaths among those who experienced the highest levels of exposure. In 2002, nearly 300 people died and hundreds were injured after a Tanzanian passenger train lost power on a hill and rolled back into a freight train. A government report found that the driver of the train failed to apply the manual brakes, a mistake attributed to human error and inexperience (“Human Error Blamed,” 2002). In 2003, after the crash of the Andrew J. Barberi ferry, in which 10 people died, a review of the U.S. Coast Guard safety records showed that more than 30 of the 50 accidents that have occurred on Staten Island ferries “have been blamed on what investigators deemed to be mistakes or acts of negligence by captains, mates, deckhands or other ferry employees” (McIntire, 2003). These stories illustrate an intuitive conclusion: Errors are universal. Errors, whether they cause thousands of deaths or minor inconveniences, are a global phenomenon. Yet, despite the fact that errors are a human universal, a careful look at this volume illustrates that scholarship on the topic is generally a Western enterprise, with theories and research generated largely in the United States and Western Europe (for notable exceptions, see Helmreich, 2000; Helmreich & Merritt, 1998; Helmreich, Wilhelm, Klinect, & Merritt, 2001; Jing, Lu, & Peng, 2001; Li, Harris, Li, & Wang, 2009). Examining cultural influences on errors is critical for theory and practice. Theoretically, cross-cultural research on errors will help to elucidate further what is universal (i.e., etic) and culture specific (i.e., emic) about error processes while expanding error theories, constructs, and measures to be globally relevant. Practically, a cultural perspective on errors is critical to help identify how to prepare best for and manage errors in ways that are targeted to specific cultural contexts. For example, there are large differences in aircraft accidents across different nations even though most airplanes are similar in make and age and are often serviced by the same specialized service firms (Civil Aviation Authority, 1998). Although such differences are likely multiply determined, cultural characteristics of the pilots and the crews in error prevention, detection, or management may at least partially explain such cultural variation. Cross-cultural research is also needed to help identify ways in which multicultural teams can better manage cultural differences in responses to errors. Finally, the study of errors also enhances our understanding of culture itself. As Freud (1901/1954) has noted, errors often point out critical system characteristics that may go unnoticed.
In other words, errors may tell us something about fundamental characteristics of cultural systems themselves. For example, as we discuss in this chapter, errors in high-power-distance cultures can occur when low-power members fail to communicate openly with their superiors, a phenomenon that is a key defining feature of such cultures. More generally, a cultural perspective on errors has much to offer the science and practice of errors in organizations. In this chapter, we integrate research on culture with research on errors to identify key propositions for future research. We first discuss critical distinctions regarding the construct of errors, and we advance a process model of errors that includes error prevention, error detection, and error management. We then discuss how key cultural dimensions, including uncertainty avoidance, humane orientation, tightness-looseness, fatalism, power distance, and individualism-collectivism, differentially affect each stage of the error process. We identify numerous “cultural paradoxes” regarding the error process for each of these cultural dimensions that we expect could have important short- and long-term consequences for organizations. We also discuss error management in culturally diverse groups, identifying group compositions that are the most ideal in managing the three stages of the error management process. Finally, we conclude with implications of this perspective for theory, research, and training to manage errors within a global context.

AB - On April 26, 1986, a chain of problems caused by design flaws and exacerbated by human error culminated in the worst nuclear power plant disaster in history. When Reactor 4 at the Chernobyl plant exploded, it released 400 times the radioactive fallout of the atomic bombing of Hiroshima (Stone, 2006). While an accurate death count is impossible because of Soviet efforts to cover up the effects of the fallout, 31 people died instantly after the reactor explosion (“The Chernobyl Accident,” 2000), at least 28 workers who were diagnosed with acute radiation sickness died in the 4 months following the accident (United Nations Scientific Committee on the Effects of Atomic Radiation, 2000), and the Chernobyl Forum (2006) estimated that the accident could cause another 4,000 cancer deaths among those who experienced the highest levels of exposure. In 2002, nearly 300 people died and hundreds were injured after a Tanzanian passenger train lost power on a hill and rolled back into a freight train. A government report found that the driver of the train failed to apply the manual brakes, a mistake attributed to human error and inexperience (“Human Error Blamed,” 2002). In 2003, after the crash of the Andrew J. Barberi ferry, in which 10 people died, a review of the U.S. Coast Guard safety records showed that more than 30 of the 50 accidents that have occurred on Staten Island ferries “have been blamed on what investigators deemed to be mistakes or acts of negligence by captains, mates, deckhands or other ferry employees” (McIntire, 2003). These stories illustrate an intuitive conclusion: Errors are universal. Errors, whether they cause thousands of deaths or minor inconveniences, are a global phenomenon. Yet, despite the fact that errors are a human universal, a careful look at this volume illustrates that scholarship on the topic is generally a Western enterprise, with theories and research generated largely in the United States and Western Europe (for notable exceptions, see Helmreich, 2000; Helmreich & Merritt, 1998; Helmreich, Wilhelm, Klinect, & Merritt, 2001; Jing, Lu, & Peng, 2001; Li, Harris, Li, & Wang, 2009). Examining cultural influences on errors is critical for theory and practice. Theoretically, cross-cultural research on errors will help to elucidate further what is universal (i.e., etic) and culture specific (i.e., emic) about error processes while expanding error theories, constructs, and measures to be globally relevant. Practically, a cultural perspective on errors is critical to help identify how to prepare best for and manage errors in ways that are targeted to specific cultural contexts. For example, there are large differences in aircraft accidents across different nations even though most airplanes are similar in make and age and are often serviced by the same specialized service firms (Civil Aviation Authority, 1998). Although such differences are likely multiply determined, cultural characteristics of the pilots and the crews in error prevention, detection, or management may at least partially explain such cultural variation. Cross-cultural research is also needed to help identify ways in which multicultural teams can better manage cultural differences in responses to errors. Finally, the study of errors also enhances our understanding of culture itself. As Freud (1901/1954) has noted, errors often point out critical system characteristics that may go unnoticed.
In other words, errors may tell us something about fundamental characteristics of cultural systems themselves. For example, as we discuss in this chapter, errors in high-power-distance cultures can occur when low-power members fail to communicate openly with their superiors, a phenomenon that is a key defining feature of such cultures. More generally, a cultural perspective on errors has much to offer the science and practice of errors in organizations. In this chapter, we integrate research on culture with research on errors to identify key propositions for future research. We first discuss critical distinctions regarding the construct of errors, and we advance a process model of errors that includes error prevention, error detection, and error management. We then discuss how key cultural dimensions, including uncertainty avoidance, humane orientation, tightness-looseness, fatalism, power distance, and individualism-collectivism, differentially affect each stage of the error process. We identify numerous “cultural paradoxes” regarding the error process for each of these cultural dimensions that we expect could have important short- and long-term consequences for organizations. We also discuss error management in culturally diverse groups, identifying group compositions that are the most ideal in managing the three stages of the error management process. Finally, we conclude with implications of this perspective for theory, research, and training to manage errors within a global context.

KW - Business psychology

KW - Entrepreneurship

UR - http://www.scopus.com/inward/record.url?scp=84857236792&partnerID=8YFLogxK

M3 - Chapter

SN - 978-0-8058-6291-1

T3 - SIOP organizational frontiers series

SP - 273

EP - 315

BT - Errors in Organizations

A2 - Hofmann, David A.

A2 - Frese, Michael

PB - Routledge Taylor & Francis Group

CY - New York

ER -