Abstract
Cognitive error, innate to all human reasoning, commonly interferes with clinical success. Psychiatric education and practice skew toward the lowest levels of clinical reasoning, leaving practitioners unprepared to solve many of the increasingly complex clinical dilemmas we face. Despite our self-confidence, around half of current treatment attempts result in suboptimal outcomes or overt treatment failures, increasingly mislabeled “treatment resistance.” While methods typically evolve during a career, abductive reasoning with Bayesian inference is the most effective for all levels of experience. Utilizing a collective perspective while obtaining, sharing, and analyzing information in a reflective, nondeterministic, and probabilistic manner leads to the best outcomes; individual clinicians must learn and practice these higher-level thinking skills. By facing our errors and failures, we mature and add to our clinical effectiveness. Results are better when we accept uncertainty and continuously revise our models.
Keywords
Medical Error, Clinical Reasoning, Treatment Failure, Treatment Resistance, Critical Thinking, Metacognition, Hypothetico-deductive, Inductive, Abductive, Bayesian, Eristic, Heuristic, Psychiatry
Abbreviations
HD: Hypothetico-Deductive; CD: Cognitive Diversity; CI: Collective Intelligence; CME: Continuing Medical Education
Introduction
The vast majority (74%) of medical education, at all levels, identifies only a few thinking skills and their applications as objectives: knowledge (remembering) and comprehension (understanding), i.e., receiving, grasping, and recalling biomedical information [1-3]. Critical thinking and evaluative judgement, encompassing the higher-level skills of analysis, evaluation, and creative synthesis, are largely neglected in education and underutilized in practice [4-6]. Though formal clinical reasoning is increasingly taught in medical schools, it is more likely to be stressed during the basic science curriculum than sufficiently incorporated into clinical rotations [7].
Our profession’s outsized emphasis on the lowest-level skills of medical problem solving has insufficiently prepared us to encounter and successfully treat the increasingly complex clinical situations our patients bring to us, as they live longer and face a growing number of comorbidities and available treatments [8]. About half of treatment attempts in psychiatry currently end in failure [9-12].
Reason has developed over centuries to more accurately describe our world and to advise our best responses to it. The best clinical reasoning is essential to our efforts to treat illness, reduce medical error, and avoid treatment harms and failures. Due to training and innate biology, though, too many psychiatrists (and other physicians) employ suboptimal clinical reasoning. This limits their ability to recognize the complexity of clinical cases and acknowledge errors [13-15].
Incomplete outcomes may have a worse impact than overt failures: the latter prompt further action, while mild to moderate success may be accepted without further examination. Most of us are overconfident and unaware of our mediocre performance [16], which helps explain the proliferation of the term “treatment resistance” – a projection of our own failure at problem solving into a pseudodiagnosis [17-19].
Problems are too often labeled untreatable when symptoms fail to improve. Rather than prompting us to serially apply similar treatments (e.g., antidepressants or anxiolytics), poor results should direct us to reexamine our original diagnosis and other contributing factors, including comorbidity and lifestyle. Not only may bipolar disorder be misdiagnosed as major depression, but attention deficit hyperactivity disorder may present as performance anxiety if not sufficiently evaluated and considered.
Cognitive Error
Psychiatrists and other medical professionals are among the most intelligent, highly trained, and experienced thinkers on the planet. We are still unable, though, to detach ourselves from the errors inherent in human cognitive processing that evolved as Homo sapiens brains struggled to survive and procreate.
Illustratively, one fifth of “cognitively normal” people still make errors on standard cognitive assessment tools such as the Clock Drawing Test and Trail Making Tests A and B [20-24]. Unchecked by the conscious reasoning efforts discussed below, these errors frequently derail our therapeutic efforts. Over three quarters of medical errors have been found to result from faulty cognition [25-27], rather than the systemic miscommunications that are more routinely blamed [28].
Most (78.9%) of these mistakes occur during patient contact [29]. More than 24 distinct cognitive tasks are required of physicians, yet on average we complete only 4.4-4.7 per treatment attempt, employing the same few repeatedly rather than drawing on a broader selection of skills. Self-reflection, a particularly useful technique for minimizing cognitive error, is rarely employed [30].
Our problem-solving efforts are based on the concepts we formulate, and these are determined, not only by our cognitive skills and limitations, but by our approach: how we reason directly determines clinical outcomes. Even the design of our assessments (our methods and procedures) leads to different patterns of thinking [31].
When we are unprepared to adequately address the frequent uncertainty and ambiguity in our practice, we often respond by oversimplifying problems or taking premature action [32]. With the proper cognitive framework, however, feeling “uncertain” could just as well prompt us to look directly at clinical ambiguity and fill holes in data, leading to better answers. It is the questions we ask (or don’t ask) that determine the treatments we provide.
Even the best methods and efforts will at times still lead to treatment failure. We cannot know everything we need to at the beginning of every treatment, despite thorough and exhaustive evaluations, which are, nonetheless, essential. How we respond to these failures is crucial for our patients.
If we do not routinely exercise metacognition, consciously and purposefully examining our own thought processing, we will not uncover the cognitive errors that allowed and may perpetuate treatment failure. When we do “think about how we think,” however, we can improve our problem modeling and iteratively discover creative solutions to clinical impasses. This will only occur through our own humility and a mindset of reflection, self-examination, and new discovery as we seek and incorporate multi-source feedback [33–36].
Clinical Reasoning Models
Knowing the steps required to practice clinical reasoning at the highest level is every bit as essential as learning core and advanced biomedical knowledge. There are several methods of thinking we may utilize when practicing psychiatry and other medical specialties [37,38]. The most frequently referenced and studied is the hypothetico-deductive (HD) model [38,39]. When utilizing this process, a practitioner formulates a hypothesis (e.g., diagnosis, treatment plan, etc.) then performs experiments (i.e., gathers additional data, treats, etc.) to test the accuracy of predictions made from the hypothesis.
A patient presents with middle insomnia, anorexia, anergy, impaired concentration, poor short-term memory, anhedonia, and dysphoric mood. A psychiatrist might deduce from these symptoms that the correct diagnosis is major depression. The provider then predicts that these symptoms should improve when treated with an oral antidepressant medication.
It is the prediction, not the theory, which is challenged by testing. If the measurement of results does not match prediction, then the hypothesis is rejected. With this model, conclusions cannot be false if the premises on which they are based are true [40]. Predictions made from hypotheses, however, also require additional assumptions that may or may not be correct. If any are not, then the results of this hypothesis testing will be misinterpreted.
The HD method does not result in certainty. It can never prove a hypothesis is correct, as multiple explanations may lead to the same prediction [41,42]. This may result in a correct conclusion, incorrect explanations being supported (Type I errors), or accurate descriptions being erroneously rejected (Type II errors). While a large research community should be able to cancel out errors for each other, psychiatrists and other medical professionals working alone to diagnose and treat patients do not benefit from such a process.
Deductive reasoning is, therefore, deterministic, top-down, and reductionistic [1,40,43,44]. Our attempts to discover symptoms that support early diagnostic hypotheses are a major source of error in this method, one form of confirmation bias. Atypical and outlying information are frequently ignored or rationalized away. Missing information is often not acknowledged. In retrospect, we recall diagnoses rather than raw data and diagnostic detail, often distorting our memory of earlier evaluations [45,46].
Practitioners at all levels also tend to use a limited number of hypotheses (i.e., diagnoses). We insufficiently consider alternatives while searching for data to support our initial impressions. It is helpful to recall that data should lead to a hypothesis, and not the other way around [40].
Inductive reasoning, by contrast, is bottom-up, proceeding from evidence to theory. This form of rational thought creates causal inferences that link observations to the future. This permits the prediction of events based on our presumptions of these relationships [40,47]. In other words, induction establishes a rule [44].
While useful for generalization, the value of this form of reasoning is also limited. A practitioner can never be sure they are aware of all the factors involved; valid cause and effect expectations can never be assured [48,49]. However, we do rely on inductive reasoning when we attempt to translate clinical outcome studies into treatment plans: inferring or extrapolating from research populations to single patients [40,47].
Deduction proceeds from the general to the particular, while induction looks from the particular to the general; both are deterministic. Abductive reasoning, though, is the method most suited to our clinical practice. Using this approach, we make inferences from observations or known facts using probabilistic reasoning.
Degrees of certainty can be measured when described by known or estimated probabilities. The process is referred to as Bayesian induction or Bayesian inference [47]. This form of non-deterministic reasoning openly admits to its uncertainty and usefully attempts to measure it [40,50,51].
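Bayes’ theorem is the engine of this measurement. As a sketch, with purely illustrative numbers (a hypothetical diagnosis D carrying a prior probability of 0.30, and a clinical finding S present in 80% of patients with D but in 20% of those without it), the revised (posterior) probability after observing S would be:

```latex
% Worked Bayesian revision with illustrative, hypothetical numbers:
% prior P(D) = 0.30; P(S|D) = 0.80; P(S|not D) = 0.20.
P(D \mid S) = \frac{P(S \mid D)\, P(D)}{P(S \mid D)\, P(D) + P(S \mid \neg D)\, P(\neg D)}
            = \frac{0.8 \times 0.3}{0.8 \times 0.3 + 0.2 \times 0.7} \approx 0.63
```

A single observation thus roughly doubles our credence in D; the same arithmetic, repeated with each new finding, is what allows uncertainty to shrink as data accumulate.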
Whether we begin with a hypothesis, as in deduction, or with observation, as with induction, when thinking abductively we do not stop reasoning after making a single prediction. The process plays out over time, as new information from repetitive testing is added to initial and subsequent assessments. We test for accuracy, then follow-up by revising our assumptions and predictions based upon the new data we discover from our treatment results - then we begin the process anew.
This requires us to generate many hypotheses (e.g., diagnoses and treatment plans), as there is no guarantee any single one will be correct. Unwittingly, our ability to rationally develop accurate diagnoses is influenced by context, among and within specialties; i.e., we are each routinely better at some diagnoses than others [52].
Competitive hypothesis revision, a hallmark of the abductive theory of method, is necessary to reduce treatment failure. This stands opposed to relying on arbitrary estimates of significance, such as p-value thresholds [40,43,53,54]. After revision, only the hypothesis that best fits the totality of our data should carry forward for additional evaluation [43,55,56].
Clinical decisions always carry some degree of uncertainty, which we can estimate with probability. Our choices must be based on all available empirical (and never theoretical) evidence. With abductive reasoning, uncertainty lessens as more data is accumulated and alternate hypotheses are sorted through. The results of inaccurate diagnoses and ineffective or suboptimal treatment plans are carefully examined; this data then contributes to better conclusions.
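The iterative sorting of competing hypotheses described above can be sketched as sequential Bayesian updating. The sketch below is a minimal illustration only; the priors and likelihoods are hypothetical placeholders, not clinical data:

```python
# Sequential Bayesian updating over two competing diagnostic hypotheses.
# All probabilities are illustrative placeholders, not clinical values.

def normalize(probs):
    """Rescale beliefs so they sum to 1."""
    total = sum(probs.values())
    return {h: p / total for h, p in probs.items()}

def update(prior, likelihoods):
    """One Bayesian revision: multiply each hypothesis's prior by the
    probability it assigns to the new observation, then renormalize."""
    posterior = {h: prior[h] * likelihoods[h] for h in prior}
    return normalize(posterior)

# Two competing hypotheses, given equal initial credence.
beliefs = {"major_depression": 0.5, "bipolar_disorder": 0.5}

# Each observation: P(finding | hypothesis) for every hypothesis.
observations = [
    {"major_depression": 0.7, "bipolar_disorder": 0.4},  # e.g., insomnia pattern
    {"major_depression": 0.2, "bipolar_disorder": 0.6},  # e.g., antidepressant non-response
]

for obs in observations:
    beliefs = update(beliefs, obs)

best = max(beliefs, key=beliefs.get)
print(best, round(beliefs[best], 2))  # -> bipolar_disorder 0.63
```

Note how the second observation (a poor antidepressant response) reverses the ranking rather than being rationalized away: the initially favored hypothesis loses, and its competitor carries forward for further testing.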
With this method, error is reduced by accepting uncertainty and revising our models. This is superior to allowing overconfidence and employing reductionistic, arbitrary, and individual estimates of significance and certainty that might be flawed from the beginning [57].
Course of Reasoning Over a Career
Novice clinicians often utilize the HD method [58]. As newly knowledgeable and less experienced practitioners, though, they may have difficulty forming a hypothesis, as well as trouble interpreting and accounting for negative data [58,59]. While more experienced physicians may alternate between HD and inductive reasoning, they often use the latter for simpler problems.
After six years of study and practice, the typical physician is no longer consciously referencing the biomedical information that continuing medical education (CME) so strongly stresses, but rather problem solving with pattern matching and experience, which emphasizes connections and blurs detail. At this point in their career, the average practitioner has evolved into “expert” mode [60], utilizing illness scripts that link the features of an illness with its consequences [61]. Pattern recognition (Type I processing) is the form of logic experts use most often when forming diagnoses [62].
Given time constraints in medical practice, this inductive reasoning appears to reduce diagnostic errors in experienced practitioners, though HD is often more effective with complex medical dilemmas [58,61]. Experts are as prone to error as their less experienced colleagues, however, when they face these more difficult cases [63-65].
Clinical reasoning and decisions are always theoretical [47], and we must never make the mistake of reification, i.e., believing that our mental conceptualizations are tangible in themselves. “Essentially, all models are wrong, but some are useful,” as George E.P. Box and Norman R. Draper stated. “The practical question is how wrong do they have to be to not be useful” [66].
Abductive reasoning, encompassing Bayesian induction, though, is particularly effective in the practice of psychiatry, in which we observe phenomena and attempt to infer a particular cause [47]. Observations that do not align with our current model of a problem represent important information, the impact depending on the degree of deviation from the currently favored hypothesis [67]. We only remain aware and make use of this information, though, by applying successive reassessments of our model. With conscious correction, our models appear to become more accurate and often less complex [68].
We adopt one of three approaches based on the amount of data we have to process: eristic, heuristic, and abductive. With little data and strong time pressures, we will be limited to eristic reasoning: making choices motivated by hedonistic urges [69]. Eristic thinking includes wishful thinking, loss aversion, preference for the status quo, overconfidence, and the endowment effect (subjectively appraising our possessions as more valuable than their objective worth) [69-72]. This lowest level of information processing may be identified by its non-logical drivers, such as beliefs, strong emotions, economic gain, and prejudice [69,73].
With more moderate amounts of data in hand, we are likely to unconsciously rely on heuristic thinking, utilizing hardwired shortcuts to estimation. This type of thinking is discoverable through cues such as the use of analogy, consideration of past performance, evidence of truth seeking, and consideration of the consequences of outcomes [69,73]. It is not calculation, but estimation, based solely on recent and individual experiences. As a result, it carries many biases that can easily lead a practitioner astray, and does not take advantage of the statistical assessment of the broader experience of others [74-76].
Heuristic reasoning might be the best we can do with incomplete information, and its limitations are a strong argument for continuing to seek additional data before diagnosing and treating. Once we have sufficient data, though, we can proceed, as above, to consciously employ abductive reasoning, considering broader experience with stochastic, analytic methods (Type II reasoning), and employing creative, iterative hypothesis competition [69,73].
| Reasoning Method | Description | Strengths | Limitations | Features |
|---|---|---|---|---|
| Hypothetico-deductive (HD) | Top-down: proceeds from general to particular | Conclusions cannot be false if premises are true | Additional assumptions are necessary that may not be correct | Deterministic; reductionistic |
| Inductive | Bottom-up: proceeds from particular to general | Creates and links causal inferences to future predictions | No certainty that all necessary factors have been considered | Deterministic; reductionistic |
| Abductive | Inferences are formed from observations or facts using probabilistic reasoning | The uncertainty of serial conclusions is measured by probability | A slow process | Non-deterministic |
All levels of medical and psychiatric education, from undergraduate through continuing, must evolve to emphasize critical thinking and higher levels of clinical reasoning. Merely imparting and reviewing biomedical information is insufficient. The skills of evaluating data and formulating new diagnostic questions once information gaps are detected must be emphasized in curricula. Teaching metacognitive skills (including reflection, debiasing, and use of feedback and counterfactuals) in order to reduce provider error has become a practical and ethical imperative.
Pooled Efforts
As the volume of available and necessary medical knowledge expands beyond the capacity of our brains to absorb and retain it, shifting to distributed cognition to access and collate information from reliable databases becomes a sound strategy. When we attempt to memorize and internalize all necessary data as a sole practitioner, we also overemphasize and ossify “known” facts. This can stifle learning and thinking, and impede innovation, instead of challenging us to be creative in our assessments and judgements [77].
We must evolve to become conduits and collators of information, rather than libraries, dynamically exploring and testing explanations and solutions, rather than regurgitating rote answers. This skill of retaining and tolerating the contrasting perspectives of our colleagues and patients can help us generate more effective assessments and treatment plans [78]. This “collective perspective” also aids our ability to distinguish high from poor quality data [77,79] as we search for the best solution to solve each clinical problem.
Individual brains perform cognitive tasks in slightly different ways, based on our genes, physical environment, culture, and individual experience, so joining with the cognition of others is often valuable [80]. Utilizing diversity in cognitive style (i.e., how we formulate and attempt to solve problems), more than differences in cognitive ability, predicts identification of successful solutions [81]. For example, individual approaches may be more or less analytical or intuitive; visual or verbal; traditional or novelty-seeking.
Cognitive diversity (CD) often interacts with collective intelligence (CI), but is not synonymous. CD seeks different representations of a problem, rather than computational complexity, and is helpful as we strive to consider a variety of problem formulations and hypotheses, as we must when practicing psychiatry [81,82]. Successful problem-solving entails multiple steps, with divergent (many possible answers) and convergent (the one best answer) processing. Personal values determine what we choose to explore: some practitioners prefer simple, and others more detailed, explanations.
For CI to be more effective than any individual effort, we must intentionally consult with those who favor reexamining old data, as well as those who prefer uncovering new information [81]. Multidisciplinary teams that include contrasting assessment approaches, knowledge bases, and experiences, along with outside consultation, may help provide this desirable diversity. To reduce treatment failure, we must seek and fully consider a broad perspective, including the data, experience, formulations, decisions, and reasoning of others.
Adopting pluralism, considering multiple theories and perspectives within every dimension of the biopsychosocial model [83], is an effective tool for reducing cognitive error with each patient. Scholarship is critical, but leads to improved outcomes only when encompassing a broad view [84].
Threats and Solutions
Clinical reasoning is the essence of psychiatric practice, but is frequently degraded by overconfidence, conceptual uncertainty, and confusion [85-87]. Unrecognized suboptimal cognitive acts lead to errors in diagnostic reasoning and therapeutic mistakes, and result in increased patient harm, disappointment, and disengagement [88].
Cognitive errors result from gaps in knowledge and self-awareness, faulty data gathering, and incorrect information processing [38]. Any form of logic is difficult for novice psychiatrists to employ, as they struggle to recall biomedical information while learning to form and test hypotheses; they must also actively suppress incorrect information learned alongside the correct [89-91].
Experienced psychiatrists develop and retain a long-term mindset that narrows as their expertise and confidence grow, resulting in unrecognized mistakes and poor outcomes. Over time, we develop a bias for positive feedback [68]. Clinical experience can add to the reasoning powers of psychiatrists, but without conscious reflection on broad, multi-source feedback it more often reinforces conceptual error and lowers clinical performance [92].
Better opportunities for clinical reasoning exist, the best being theoretical, non-binary, non-deterministic, and probabilistic. This affords a reflective approach that estimates and reduces error through the conscious application of additional techniques: abductive logic, Bayesian inference, and a collective perspective encompassing cognitive diversity.
As no formal mechanism exists to teach and reinforce these skills among active practitioners, they must be sought, practiced, and mastered by individual psychiatrists committed to continuous improvement in their skills and clinical outcomes. Best practices include:
- Identifying and filling information gaps
- Preserving and accounting for all of our data
- Avoiding rapid diagnosis
- Adopting a pluralistic approach to assessment and formulation
- Seeking, acknowledging, and utilizing feedback
- Developing and adding to our clinical and therapeutic skills
- Recognizing and responding positively to clinical impasses
- Nurturing our therapeutic alliances
- Applying humility
- Respecting and developing knowledge of other cultures, races, and ethnicities
- Enhancing our communication skills with patients and peers.
Most critically, by taking the time for reflective metacognition, we must be able to identify and monitor our current method of clinical reasoning. We can use cognitive wrapping to force self-examination before, during, and following sessions [93-95], reflective journals to record our metacognitive performance, learning needs, and questions we seek to answer [96], and multi-source feedback to improve awareness of our clinical and communication deficits [97]. Remaining flexible and combining the several methods of data analysis that best match each clinical situation helps practitioners at all levels of experience and competence learn and develop new skills [98,99].
Once we have examined our individual cognitive style, and are current and complete in our biomedical and psychiatric knowledge and techniques, we can learn and apply these skills of critical thinking and reflection. We will retain these strategies lifelong. They will help us correctly transfer past knowledge into current situations with less distortion and misapplication. When we consciously organize the cognitive steps we take, we can better identify the information gaps that prevent successful outcomes.
Conclusion
As humans, we will all continue to make mistakes [100]. The “honorable character” of most healthcare providers often results in moral distress upon recognition of our errors, affecting not only us but also our work environment [101,102]. It can also facilitate “moral maturation,” prompt self-reflection, and add to our knowledge [101,103,104].
We usually know when our errors led to obvious adverse events. We need to become equally aware when these lead to treatment failure or suboptimal outcomes, and experience these with as much moral distress as the more visible negative consequences of our actions or inactions. Concern for our patients should make this not only the most practical, but also the most ethical approach.
References
2. Bloom’s Taxonomy [Internet]. Faculty Center, University of Central Florida. Available from: https://fctl.ucf.edu/teaching-resources/course-design/blooms-taxonomy/#:~:text=The%20goal%20of%20an%20educator's,from%20lower%2Dlevel%20cognitive%20skills
3. Légaré F, Freitas A, Thompson-Leduc P, Borduas F, Luconi F, Boucher A, et al. The majority of accredited continuing professional development activities do not target clinical behavior change. Acad Med. 2015 Feb;90(2):197-202.
4. Adams NE. Bloom's taxonomy of cognitive learning objectives. J Med Libr Assoc. 2015 Jul;103(3):152-3.
5. Greeno JG. A perspective on thinking. American Psychologist. 1989;44:134-41.
6. Tuma F, Nassar AK. Applying Bloom's taxonomy in clinical surgery: practical examples. Annals of medicine and surgery. 2021 Sep 1;69.
7. Blanco MA, Capello CF, Dorsch JL, Perry GJ, Zanetti ML. A survey study of evidence-based medicine training in US and Canadian medical schools. Journal of the Medical Library Association: JMLA. 2014 Jul;102(3):160-8.
8. Lenze E. Managing Older Adults with Mental Illness. New Orleans, LA; 2024.
9. Bear HA, Dalzell K, Edbrooke-Childs J, Garland L, Wolpert M. How to manage endings in unsuccessful therapy: A qualitative comparison of youth and clinician perspectives. Psychother Res. 2022 Feb;32(2):249-62.
10. Howes OD, Thase ME, Pillinger T. Treatment resistance in psychiatry: state of the art and new directions. Mol Psychiatry. 2022 Jan;27(1):58-72.
11. Salzer MS, Brusilovskiy E, Townley G. National Estimates of Recovery-Remission From Serious Mental Illness. Psychiatr Serv. 2018 May 1;69(5):523-8.
12. Wolpert M. Failure is an option. The Lancet Psychiatry. 2016 Jun 1;3(6):510-2.
13. Einhorn HJ, Hogarth RM, Klempner E. Quality of group judgment. Psychological Bulletin. 1977 Jan;84(1):158-72.
14. Hetmański M. Expertise and Expert Knowledge in Social and Procedural Entanglement. Eidos. A Journal for Philosophy of Culture. 2020;4(2):6-22.
15. Ryback D. Confidence and accuracy as a function of experience in judgment-making in the absence of systematic feedback. Perceptual and Motor Skills. 1967 Feb;24(1):331-4.
16. Rahmani M. Medical trainees and the Dunning–Kruger effect: when they don’t know what they don’t know. Journal of Graduate Medical Education. 2020;12:532-4.
17. Putman III HP. Encountering Treatment Resistance: Solutions Through Reconceptualization. First edition. Washington, DC: American Psychiatric Association Publishing; 2024.
18. Putman III HP. The Failed Concept of Treatment Resistance. Psychiatric Research and Clinical Practice. 2024 Sep;6(3):112-4.
19. Putman III HP. Treatment failure vs. treatment resistance. Current Research in Psychiatry. 2024 Oct 8;4(1):15-9.
20. Cohen ML, Weatherford S, Nandakumar R. How Normal Are "Normal" Errors of Language and Cognition? J Speech Lang Hear Res. 2019 May 21;62(5):1468-72.
21. Hazan E, Frankenburg F, Brenkel M, Shulman K. The test of time: a history of clock drawing. International journal of geriatric psychiatry. 2018 Jan;33(1):e22-30.
22. Ruffolo LF, Guilmette TJ, Willis GW. FORUM comparison of time and error rates on the trail making test among patients with head injuries, experimental malingerers, patients with suspect effort on testing, and normal controls. The Clinical Neuropsychologist. 2000 May 1;14(2):223-30.
23. Shulman KI. Clock-drawing: is it the ideal cognitive screening test? International Journal of Geriatric Psychiatry. 2000 Jun;15(6):548-61.
24. Umegaki H, Suzuki Y, Komiya H, Watanabe K, Yamada Y, Nagae M, et al. Frequencies and Neuropsychological Characteristics of Errors in the Clock Drawing Test. Loewenstein D, editor. JAD. 2021; 82:1291-300.
25. Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Archives of internal medicine. 2005 Jul 11;165(13):1493-9.
26. Kassirer JP, Kopelman RI. Cognitive errors in diagnosis: instantiation, classification, and consequences. The American journal of medicine. 1989 Apr 1;86(4):433-41.
27. O'Sullivan ED, Schofield SJ. Cognitive bias in clinical medicine. Journal of the Royal College of Physicians of Edinburgh. 2018 Sep;48(3):225-32.
28. Kohn L, Corrigan J, Donaldson M, editors. To Err Is Human: Building a Safer Health System [Internet]. Washington, D.C.: National Academies Press; 2000 [cited 2024 Dec 10]. p. 9728.
29. Singh H, Giardina TD, Meyer AN, Forjuoh SN, Reis MD, Thomas EJ. Types and origins of diagnostic errors in primary care settings. JAMA Intern Med. 2013 Mar 25;173(6):418-25.
30. McBee E, Ratcliffe T, Goldszmidt M, Schuwirth L, Picho K, Artino AR Jr, et al. Clinical Reasoning Tasks and Resident Physicians: What Do They Reason About? Acad Med. 2016 Jul;91(7):1022-8.
31. Kim S, Choi I, Yoon BY, Kwon MJ, Choi SJ, et al. How do medical students actually think while solving problems in three different types of clinical assessments in Korea: Clinical performance examination (CPX), multimedia case-based assessment (CBA), and modified essay question (MEQ). J Educ Eval Health Prof. 2019;16:10.
32. Tyagi A, Garudkar S, Gagare AG, Thopte A. Medical Uncertainty: Are we better off in era of evidence based medicine? International Journal of Medical Research & Health Sciences. 2015;4(1):208-13.
33. Al-Moteri M. Metacognition and learning transfer under uncertainty. International Journal of Nursing Education Scholarship. 2023 Jan 27;20(1):20230038.
34. Hajrezayi B, Shahalizade M, Zeynali M, Badali M. Effectiveness of blended learning on critical thinking skills of nursing students. Journal of Nursing Education. 2015 May 10;4(1):49-59.
35. Kuiper RA, Pesut DJ. Promoting cognitive and metacognitive reflective reasoning skills in nursing practice: self‐regulated learning theory. Journal of advanced nursing. 2004 Feb;45(4):381-91.
36. Murton C, Spowart L, Anderson M. How psychiatrists’ attitudes towards multi-source feedback including patient feedback influenced the educational value: a qualitative study. MedEdPublish. 2022 Feb 3;12(5):5.
37. Cook DA, Durning SJ, Sherbino J, Gruppen LD. Management reasoning: implications for health professions educators and a research agenda. Academic Medicine. 2019 Sep 1;94(9):1310-6.
38. Duong QH, Pham TN, Reynolds L, Yeap Y, Walker S, Lyons K. A scoping review of therapeutic reasoning process research. Advances in Health Sciences Education. 2023 Oct;28(4):1289-310.
39. Raharjanti NW, Wiguna T, Purwadianto A, Soemantri D, Bardosono S, Poerwandari EK, et al. Clinical reasoning in forensic psychiatry: concepts, processes, and pitfalls. Frontiers in Psychiatry. 2021 Aug 5;12:691377.
40. de-Sousa MR, Aguiar TR. Dedução, Indução e a Arte do Raciocínio Clínico na Educação Médica: Revisão Sistemática e Proposta Bayesiana [Deduction, Induction, and the Art of Clinical Reasoning in Medical Education: Systematic Review and a Bayesian Proposal]. Arquivos Brasileiros de Cardiologia. 2022 Oct;119(5 suppl 1):27-34.
41. Kalinowski ST, Pelakh A. A hypothetico-deductive theory of science and learning. Journal of Research in Science Teaching. 2024 Aug;61(6):1362-88.
42. Popper K. The logic of scientific discovery. Special Indian Edition. London: Routledge; 2010.
43. Mason WA, Cogua-Lopez J, Fleming CB, Scheier LM. Challenges Facing Evidence-Based Prevention: Incorporating an Abductive Theory of Method. Eval Health Prof. 2018 Jun;41(2):155-82.
44. Rapezzi C, Ferrari R, Branzi A. White coats and fingerprints: diagnostic reasoning in medicine and investigative methods of fictional detectives. BMJ. 2005 Dec 24;331(7531):1491-4.
45. Arkes HR, Harkness AR. Effect of making a diagnosis on subsequent recognition of symptoms. J Exp Psychol Hum Learn. 1980 Sep;6(5):568-75.
46. Boshuizen HP, Gruber H, Strasser J. Knowledge restructuring through case processing: The key to generalise expertise development theory across domains? Educational Research Review. 2020 Feb 1;29:100310.
47. Kyriacou DN. Evidence-based medical decision making: Deductive versus inductive logical thinking. Academic Emergency Medicine. 2004 Jun;11(6):670-1.
48. Rothman KJ. Modern epidemiology. 5th printing. Boston: Little, Brown; 1986.
49. Salmon WC. Causality and explanation. New York; Oxford: Oxford University Press; 1998.
50. Osimani B. Modus Tollens probabilized: deductive and Inductive Methods in medical diagnosis. MEDIC. 2009;17:43-59.
51. Trimble M, Hamilton P. The thinking doctor: clinical decision making in contemporary medicine. Clin Med (Lond). 2016 Aug;16(4):343-6.
52. Malterud K, Reventlow S, Guassora AD. Diagnostic knowing in general practice: interpretative action and reflexivity. Scand J Prim Health Care. 2019 Dec;37(4):393-401.
53. Scheier LM. Why Research Design and Methods Is So Crucial to Understanding Drug Use/Abuse: Introduction to the Special Issue. Eval Health Prof. 2018 Jun;41(2):135-54.
54. Szucs D, Ioannidis JPA. When Null Hypothesis Significance Testing Is Unsuitable for Research: A Reassessment. Front Hum Neurosci. 2017 Aug 3;11:390.
55. Capaldi EJ, Proctor RW. Are theories to be evaluated in isolation or relative to alternatives? An abductive view. Am J Psychol. 2008 Winter;121(4):617-41.
56. Johnson DK. Inference to the best explanation and avoiding diagnostic error. In: Allhoff F, Borden S, editors. Ethics and Medical Error. New York: Routledge; 2020.
57. Cassam Q. Diagnostic error, overconfidence and self-knowledge. Palgrave Communications. 2017 Apr 11;3(1):17025.
58. Wijayaratne D, Weeratunga P, Jayasinghe S. Exploring synthesis as a vital cognitive skill in complex clinical diagnosis. Diagnosis. 2024 May 1;11(2):121-4.
59. Shin HS. Reasoning processes in clinical reasoning: from the perspective of cognitive psychology. Korean Journal of Medical Education. 2019 Nov 29;31(4):299-308.
60. Guida A, Gobet F, Tardieu H, Nicolas S. How chunks, long-term working memory and templates offer a cognitive explanation for neuroimaging data on expertise acquisition: a two-stage framework. Brain Cogn. 2012 Aug;79(3):221-44.
61. Schmidt HG, Boshuizen HP. Encapsulation of Biomedical Knowledge. In: Evans DA, Patel VL, editors. Advanced Models of Cognition for Medical Training and Practice [Internet]. Berlin, Heidelberg: Springer Berlin Heidelberg; 1992. p. 265–82.
62. Cosmides L, Tooby J. Are humans good intuitive statisticians after all? Rethinking some conclusions from the literature on judgment under uncertainty. Cognition. 1996 Jan 1;58(1):1-73.
63. Campitelli G, Speelman C. In: Gobet F, Schiller M, editors. Problem Gambling [Internet]. London: Palgrave Macmillan UK; 2014. p. 41–60.
64. Einhorn HJ. Expert judgment: Some necessary conditions and an example. Journal of Applied Psychology. 1974 Oct;59(5):562-71.
65. Frensch PA, Sternberg RJ. Expertise and intelligent thinking: when is it worse to know better? In: Sternberg R, editor. Advances in the Psychology of Human Intelligence. Mahwah, N.J: Lawrence Erlbaum Associates; 1989.
66. Box GE, Draper NR. Empirical model-building and response surfaces. John Wiley & Sons; 1987.
67. Sayood K. Information theory and cognition: a review. Entropy. 2018 Sep 14;20(9):706.
68. Trapp S, Guitart-Masip M, Schröger E. A link between age, affect, and predictions? European Journal of Ageing. 2022 Dec;19(4):945-52.
69. Kurdoglu RS, Jekel M, Ateş NY. Eristic reasoning: Adaptation to extreme uncertainty. Frontiers in Psychology. 2023 Feb 9;14:1004031.
70. Berthet V. The Impact of Cognitive Biases on Professionals' Decision-Making: A Review of Four Occupational Areas. Front Psychol. 2022 Jan 4;12:802439.
71. Gunaydin G, Selcuk E, Yilmaz C, Hazan C. I Have, Therefore I Love: Status Quo Preference in Mate Choice. Pers Soc Psychol Bull. 2018 Apr;44(4):589-600.
72. Kahneman D, Knetsch JL, Thaler RH. Anomalies: The endowment effect, loss aversion, and status quo bias. Journal of Economic Perspectives. 1991 Feb 1;5(1):193-206.
73. Kurdoglu RS, Ateş NY, Lerner DA. Decision-making under extreme uncertainty: eristic rather than heuristic. International Journal of Entrepreneurial Behavior & Research. 2023 Jan 2;29(3):763-82.
74. Marewski JN, Gigerenzer G. Heuristic decision making in medicine. Dialogues Clin Neurosci. 2012 Mar;14(1):77-89.
75. Parpart P, Jones M, Love BC. Heuristics as Bayesian inference under extreme priors. Cogn Psychol. 2018 May;102:127-44.
76. Sundh J, Collsiöö A, Millroth P, Juslin P. Precise/not precise (PNP): A Brunswikian model that uses judgment error distributions to identify cognitive processes. Psychon Bull Rev. 2021 Apr;28(2):351-73.
77. Eichbaum QG. Thinking about thinking and emotion: the metacognitive approach to the medical humanities that integrates the humanities with the basic and clinical sciences. Perm J. 2014 Fall;18(4):64-75.
78. Baldwin T, Guo Y, Syeda-Mahmood T. Automatic Generation of Conditional Diagnostic Guidelines. AMIA Annu Symp Proc. 2017 Feb 10;2016:295-304.
79. Epstein RM, Siegel DJ, Silberman J. Self-monitoring in clinical practice: a challenge for medical educators. J Contin Educ Health Prof. 2008 Winter;28(1):5-13.
80. Brown JS, Collins A, Duguid P. Situated cognition and the culture of learning. Educational Researcher. 1989 Feb;18(1):32-42.
81. Sulik J, Bahrami B, Deroy O. The diversity gap: when diversity matters for knowledge. Perspectives on Psychological Science. 2022 May;17(3):752-67.
82. Almaatouq A, Noriega-Campero A, Alotaibi A, Krafft PM, Moussaid M, Pentland A. Adaptive social networks promote the wisdom of crowds. Proc Natl Acad Sci U S A. 2020 May 26;117(21):11379-86.
83. Stein DJ, Shoptaw SJ, Vigo DV, Lund C, Cuijpers P, Bantjes J, et al. Psychiatric diagnosis and treatment in the 21st century: paradigm shifts versus incremental integration. World Psychiatry. 2022 Oct;21(3):393-414.
84. Yager J. Psychiatric Eclecticism: a cognitive view. Am J Psychiatry. 1977 Jul;134(7):736-41.
85. Cioffi J. Situating uncertainty in clinical decision making. Academia Letters [Internet]. 2021 [cited 2024 Dec 11]; Available from: https://www.academia.edu/56671687/Situating_uncertainty_in_clinical_decision_making
86. Ghosh AK. On the challenges of using evidence-based information: the role of clinical uncertainty. Journal of Laboratory and Clinical Medicine. 2004 Aug 1;144(2):60-4.
87. Hogeveen J, Mullins TS, Romero JD, Eversole E, Rogge-Obando K, Mayer AR, et al. The neurocomputational bases of explore-exploit decision-making. Neuron. 2022 Jun 1;110(11):1869-79.
88. Zwaan L, Thijs A, Wagner C, van der Wal G, Timmermans DR. Relating faults in diagnostic reasoning with diagnostic errors and patient harm. Academic Medicine. 2012 Feb 1;87(2):149-56.
89. Badenhorst E, Mamede S, Hartman N, Schmidt HG. Exploring lecturers’ views of first-year health science students’ misconceptions in biomedical domains. Advances in Health Sciences Education. 2015 May;20:403-20.
90. Foisy LM, Potvin P, Riopel M, Masson S. Is inhibition involved in overcoming a common physics misconception in mechanics? Trends in Neuroscience and Education. 2015 Mar 1;4(1-2):26-36.
91. Shtulman A, Valcarcel J. Scientific knowledge suppresses but does not supplant earlier intuitions. Cognition. 2012 Aug 1;124(2):209-15.
92. Tay SW, Ryan P, Ryan CA. Systems 1 and 2 thinking processes and cognitive reflection testing in medical students. Can Med Educ J. 2016 Oct 18;7(2):e97-e103.
93. Sethares KA, Asselin ME. Use of Exam Wrapper Metacognitive Strategy to Promote Student Self-assessment of Learning: An Integrative Review. Nurse Educ. 2022 Jan-Feb 01;47(1):37-41.
94. Pate A, Lafitte EM, Ramachandran S, Caldwell DJ. The use of exam wrappers to promote metacognition. Currents in Pharmacy Teaching and Learning. 2019 May 1;11(5):492-8.
95. Kannan KK, Muthammal R. Exam wrapper and metacognition for undergraduate surgery students in exam preparation. International Surgery Journal. 2020;7(12):4083.
96. Mack HG, Spivey B, Filipe HP. How to add metacognition to your continuing professional development: scoping review and recommendations. The Asia-Pacific Journal of Ophthalmology. 2019 May 1;8(3):256-63.
97. Yang C, Potts R, Shanks DR. Metacognitive unawareness of the errorful generation benefit and its effects on self-regulated learning. J Exp Psychol Learn Mem Cogn. 2017 Jul;43(7):1073-92.
98. Al-Azri NH. How to think like an emergency care provider: a conceptual mental model for decision making in emergency care. International Journal of Emergency Medicine. 2020 Dec;13(1):17.
99. Malterud K, Reventlow S, Guassora AD. Diagnostic knowing in general practice: interpretative action and reflexivity. Scandinavian Journal of Primary Health Care. 2019 Oct 2;37(4):393-401.
100. Schwappach DL, Boluarte TA. The emotional impact of medical error involvement on physicians: a call for leadership and organisational accountability. Swiss Medical Weekly. 2008 Oct 14;138(1-2):9-15.
101. Tigard DW. The positive value of moral distress. Bioethics. 2019 Jun;33(5):601-8.
102. Frich L, Andersen AV, Josefsen R, Sagabråten SO. When a physician makes a mistake. Tidsskrift for den Norske Laegeforening: Tidsskrift for Praktisk Medicin, ny Raekke. 1997 Dec 1;117(30):4365-70.
103. Thibodeau PS, Nash A, Greenfield JC, Bellamy JL. The association of moral injury and healthcare clinicians' wellbeing: a systematic review. International Journal of Environmental Research and Public Health. 2023 Jul 5;20(13):6300.