
User:Smishler17/sandbox


Copied from [Bias] for editing purposes


Automation bias is the propensity for humans to favor suggestions from automated decision-making systems and to ignore contradictory information made without automation, even if it is correct.[1] Automation bias stems from the social psychology literature, which found a bias in human-human interaction: people assign more positive evaluations to decisions made by humans than to a neutral object.[2] The same type of positivity bias has been found for human-automation interaction, where automated decisions are rated more positively than neutral ones.[3] This has become a growing problem as decision-making in such critical contexts as intensive care units, nuclear power plants, and aircraft cockpits has increasingly involved computerized system monitors and decision aids. Errors of automation bias tend to occur when decision-making is dependent on computers or other automated aids and the human is largely in a monitoring role. Examples of automation bias range from urgent matters such as flying a plane on automatic pilot to mundane matters such as the use of spell-checking programs.[4]

Disuse and misuse


The tendency toward overreliance on automated aids is known as "automation misuse".[5][6] Misuse of automation can be seen when a user fails to properly monitor an automated system, or when the automated system is used when it should not be. This contrasts with disuse, in which the user does not properly utilize the automation, either by turning it off or by ignoring it. Both misuse and disuse can be problematic, but automation bias is directly related to misuse of the automation, whether through too much trust in the abilities of the system or through defaulting to heuristics. Misuse can lead to a lack of monitoring of the automated system or to blind agreement with an automated suggestion; these are categorized as two types of errors, errors of omission and errors of commission, respectively.[7][8][5]


Automation-induced complacency


The concept of automation bias is viewed as overlapping with automation-induced complacency, also known more simply as automation complacency. Like automation bias, it is a consequence of the misuse of automation and involves problems of attention. While automation bias involves a tendency to trust decision-support systems, automation complacency involves insufficient attention to and monitoring of automation output, usually because that output is viewed as reliable.[9] "Although the concepts of complacency and automation bias have been discussed separately as if they were independent," writes one expert, "they share several commonalities, suggesting they reflect different aspects of the same kind of automation misuse." It has been proposed, indeed, that the concepts of complacency and automation bias be combined into a single "integrative concept" because these two concepts "might represent different manifestations of overlapping automation-induced phenomena" and because "automation-induced complacency and automation bias represent closely linked theoretical concepts that show considerable overlap with respect to the underlying processes."[10]

Automation complacency has been defined as "poorer detection of system malfunctions under automation compared with under manual control." NASA's Aviation Safety Reporting System (ASRS) defines complacency as "self-satisfaction that may result in non-vigilance based on an unjustified assumption of satisfactory system state." Several studies have indicated that it occurs most often when operators are engaged in both manual and automated tasks at the same time. This complacency can be sharply reduced when automation reliability varies over time instead of remaining constant, but is not reduced by experience and practice. Both expert and inexpert participants can exhibit automation bias as well as automation complacency. Neither of these problems can be easily overcome by training.[10]

The term "automation complacency" was first used in connection with aviation accidents or incidents in which pilots, air-traffic controllers, or other workers failed to check systems sufficiently, assuming that everything was fine when, in reality, an accident was about to occur. Operator complacency, whether or not automation-related, has long been recognized as a leading factor in air accidents.[10]

To some degree, user complacency offsets the benefits of automation, and when an automated system's reliability falls below a certain level, automation is no longer a net asset. One 2007 study suggested that this crossover occurs when the reliability level falls to approximately 70%. Other studies have found that automation with a reliability level below 70% can still be of use to persons with access to the raw information sources, which can be combined with the automation output to improve performance.[10]

Future planned updates:

Needs proper formatting: an introduction, body, and summary. It is disorganized and does not flow in its current state.

Need to elaborate on what exactly automation complacency is, the factors that influence it, and the research behind it. - Parasuraman

References:

Muir, B. M., & Moray, N. (1996). Trust in automation. Part II. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 39(3), 429-460.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39(2), 230-253.

  1. ^ Cummings, Mary (2004). "Automation Bias in Intelligent Time Critical Decision Support Systems" (PDF). AIAA 1st Intelligent Systems Technical Conference (PDF). doi:10.2514/6.2004-6313. ISBN 978-1-62410-080-2. Archived from the original (PDF) on 2014-11-01.
  2. ^ Bruner, J. S., & Tagiuri, R. (1954). The perception of people. Cambridge, MA: Harvard University, Laboratory of Social Relations.
  3. ^ Dzindolet, M. T., Peterson, S. A., Pomranky, R. A., Pierce, L. G., & Beck, H. P. (2003). The role of trust in automation reliance. International Journal of Human-Computer Studies, 58(6), 697-718.
  4. ^ Skitka, Linda. "Automation". University of Illinois at Chicago. Retrieved 16 January 2017.
  5. ^ a b Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39(2), 230-253.
  6. ^ Mosier, Kathleen; Skitka, Linda; Heers, Susan; Burdick, Mark (February 1997). "Automation Bias: Decision Making and Performance in High-Tech Cockpits". International Journal of Aviation Psychology. 8 (1): 47–63. doi:10.1207/s15327108ijap0801_3. PMID 11540946. Retrieved 17 January 2017.
  7. ^ Cite error: The named reference :0 was invoked but never defined (see the help page).
  8. ^ Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1131.
  9. ^ Goddard, K., Roudsari, A., & Wyatt, J. C. (2012). Automation bias: a systematic review of frequency, effect mediators, and mitigators. Journal of the American Medical Informatics Association, 19(1), 121-127.
  10. ^ a b c d Parasuraman, R., & Manzey, D. H. (2010). Complacency and bias in human use of automation: An attentional integration. Human Factors: The Journal of the Human Factors and Ergonomics Society, 52(3), 381-410.