Tuesday, January 29, 2013

January 2013: Innovations in Innovation Outcomes?


Educator perceptions of the relationship between education innovations and improved health. Friedman SR, Loh LC, Burdick WP. Medical Teacher. 2012 October; Early Online: e1-e8. Available online.

Innovation in health care education is in high demand. Institutions have innovation grants and innovation awards. Through formal and informal channels, we promote, cultivate, fund, and sustain innovative efforts. Indeed, medical education journals are replete with descriptions of educational innovations,1 including innovations designed to cultivate innovation.2,3 As editor of Academic Medicine, Steven Kanter even wrote an editorial helping us critically analyze our innovations for publication.4 His editorial, together with the prevalence of innovative projects in the literature, sheds light on the role and value of disseminating educational innovations to improve practice.

This month, though, we use this article by Friedman and her colleagues up north to revisit the outcomes of our educational innovations and examine how they might extend beyond practice to public health.

After surveying faculty in Brazil and India, the authors generated a framework for understanding how faculty believed their educational innovations contributed to health improvement. In the main, the authors found that faculty believed the structure and process of their educational innovations improved the quality, quantity, or relevance of health care education, which they then believed led to improved outcomes in public health.

The methodology in this article is not especially rigorous – a survey distributed to an admittedly “relatively small and non-representative sample of faculty from two countries.”(p. e7) However, the authors do explore that critical link between health care education and health. Beyond that, they suggest that we, as innovators in education, explore that critical link as well. They argue that the links between educational innovations and public health improvement “impact what types of education innovations are implemented, or even conceived.”(p. e7) In other words, are we confident that we promote, cultivate, fund, and sustain educational innovations that are designed to improve public health? This article does not provide the answer, but it does encourage us to develop the question.

Bottom Line:

Innovations are critical to continued improvement in clinical education. Read this article, and those referenced below, to own your role as an innovator. The framework presented in this article suggests that your local innovation could have a broad impact. 


References
1. Anderson MB. A peer reviewed collection of reports on innovative approaches to medical education. Medical Education. 2012; 46(11): 1099-1100. Available here.
2. Armstrong EG, Barsion SJ. Creating “Innovator’s DNA” in Health Care Education. Academic Medicine. 2013; 88(3): 1-6. Available here.
3. Andolsek KM, Murphy G, Nagler A, Moore PR, Schlueter J, Weinerth JL, Cuffe MS, Dzau VJ. Fostering creativity: How the Duke graduate medical education quasi-endowment encourages innovation in GME. Academic Medicine. 2013; 88(2): 1-7. Available here.
4. Kanter SL. Toward Better Descriptions of Innovations. Academic Medicine. 2008; 83(8): 703-704. Available through the Health Sciences Library.


December 2012: Overcoming Adverse Events

Waking up the next morning: surgeons’ emotional reactions to adverse events. Luu S, Patel P, St-Martin L, Leung ASO, Regehr G, Murnaghan ML, Gallinger S, Moulton C. Med Educ. 2012; 46: 1179-1188. 

Adverse patient care events, defined in this article as injuries “caused by medical care rather than a disease process,” can profoundly affect practitioners, leading to many painful outcomes, including “burnout, depression, guilt and shame.” (p. 1180)

In this article, Luu and her colleagues present the results of a two-phase interview study: in the first phase, they spoke with surgeons about past adverse events; in the second, they spoke with surgeons about recent ones. The results are powerful.

As educators, we are responsible for teaching many skills to residents and students in a short period of time. We teach through lecture, feedback, demonstration, and simulation. But we are not just teaching technical skills, communication tools, and professionalism. We are also teaching our learners what it means to be a doctor, a nurse, a therapist.

Baystate is consistently regarded as an exceptional teaching institution, valuing with equally high regard our current patients and our future patients. So, by giving our learners insight into our vulnerabilities – by sharing with them the ways that we navigate experiences in which we are uncertain – by doing our part to ensure that students and residents are not only mastering learning objectives but also maturing as caretakers – we affirm our appreciation of our learners’ personal development and their emotional wellbeing. The present article highlights the difficulty of navigating emotional responses to adverse events and, in doing so, presents a learning opportunity for students and teachers.

Bottom Line:

Adverse events can be intensely personal and emotionally troubling for practitioners. Use this article as a starting point for conversations on the difficulties associated with being the ‘second victim’ in an adverse event. Educational opportunities, including debriefing after adverse events, can provide a way for both teacher and student to navigate these times.

November 2012: Croskerry on Diagnostic Errors

The Importance of Cognitive Errors in Diagnosis and Strategies to Minimize Them. Croskerry P. Acad Med. 2003; 78(8): 775-780. 

If you recognize Pat Croskerry’s name, then chances are you’ve thought a bit about cognitive error in your clinical decision making. If not, a quick glance at some of the published titles from this Nova Scotia-based emergency medicine physician will help you calibrate his perspective on the subject: Overconfidence in Clinical Decision Making,1 Emergency Medicine: A Practice Prone to Error?,2 and The Affective Imperative: Coming to Terms with our Emotions.3 The cognitive influence on diagnostic error has come to the forefront of medical education thanks in part to Croskerry’s work.

This post highlights one of Croskerry’s articles in which he not only introduces and summarizes specific types of cognitive errors (see his two-page list in the above link) but also invites us down a pathway toward improving matters. Croskerry has cited evidence that diagnostic errors account for 10-15% of medical error4 and, though they are most frequent in internal, family, and emergency medicine, he reminds us that cognitive errors can be made by physicians in all specialties and, I would argue, by any health care professional responsible for a clinical decision about patient care.

To change practice, Croskerry’s charge to educators and practitioners is three-fold. First, we must recognize the impact and extent of cognitive errors in clinical decision-making. Second, we must “refute the inevitability” of these errors. Third, we must get rid of the “pessimism” preventing us from fixing the problem. (p. 776)

This article, like many of Croskerry’s publications, reads as a call to action. He writes that we need to “de-bias” the cognitive patterns of diagnostic thinking that are prone to error. “It is not unrealistic,” he writes of this charge, and the barriers standing in the way “are not insurmountable.” (p. 776) Croskerry’s published works, with their tables, lists, and appendices, collectively sound an alarm of urgency, but it’s his style of writing that may ultimately motivate each of us to respond.

Bottom Line:

If you’re new to Pat Croskerry or the literature on cognitive diagnostic errors, this article is an enthusiastic first step. You’ll reference this article for its list of “cognitive dispositions to respond,” formerly known as “physician biases” (see page 777); you’ll recommend it to a friend for the tone of his writing (see page 779).




References:
1. Croskerry P, Norman G. Overconfidence in clinical decision making. Am J Med. 2008 May;121(5 Suppl):S24-9.
2. Croskerry P, Sinclair D. Emergency medicine: A practice prone to error? CJEM. 2001 Oct;3(4):271-6.
3. Croskerry P. Commentary: The affective imperative: coming to terms with our emotions. Acad Emerg Med. 2007 Feb;14(2):184-6.
4. Schiff G, Hasan O, Kim S, et al. Diagnostic error in medicine: analysis of 583 physician-reported errors. Arch Intern Med. 2009; 169: 1881-7. Cited in Croskerry P. Better clinical decision making and reducing diagnostic error. J R Coll Physicians Edinb. 2011; 41: 155-62.

October 2012: Interprofessional Collaboration(?)

“Do none of you talk to each other?”: the challenges facing the implementation of interprofessional education. Carlisle C, Cooper H, Watkins C. Medical Teacher. 2004; 26(6): 545-552.

One very popular topic in medical and health education research right now is interprofessional education (IPE). In discussing the benefits of IPE, our CAO, Kevin Hinchey, has pointed out the paradox in our clinical education system using a seasonally appropriate football metaphor: “It’s as if we train our position players in separate facilities and then on game day, we expect them all to come to the field and work well together.” In other words, fragmented training may not be the best approach toward cohesive practice.

In the article above, Carlisle, Cooper and Watkins explore IPE through the various parties involved. Echoing Dr. Hinchey, the authors note that expecting our professionals to work well together in practice “is a bit like shutting the door after the horse has bolted.”(p. 545) To explore the feasibility of IPE during student training, the authors conducted focus groups with students, patients, and practitioners, asking questions from a semi-structured protocol, and then analyzed the qualitative data.

What were their findings? As with many qualitative studies, the findings take the form of themes, or patterns, identified in the data. The authors identified themes supporting the advantages of IPE and themes describing the challenges of initiating and implementing IPE – the full description of each, though, is beyond the scope of one short email.

Indeed, this article presents a worthwhile review of qualitative data collection and presentation of findings. Even more valuable, however, is the collective sentiment of the diverse group of participants, calling for integration of IPE earlier in our students’ careers and for each group to break down “traditional” cultural attitudes that can perpetuate disconnected education and practice. Exploring opportunities to overcome this disconnect can get us to work better together on game day, which will ultimately help our patients win.

Bottom Line:

Interprofessional education and focus groups are two hot topics for our medical and health education researchers right now. This article, published in the UK-based journal Medical Teacher, satisfies our curiosity to learn more about both.

September 2012: Dose of Education Research Basics

AM Last Page: Reliability and Validity in Educational Measurement. Artino AR, Durning SJ, Creel AH. Academic Medicine. 2010; 85(9): 1545.

AM Last Page: Understanding Qualitative and Quantitative Research Paradigms in Academic Medicine. Castillo-Page L, Bodilly S, Bunton SA. Academic Medicine. 2012; 87(3): 386.

The development and dissemination of educational milestones from the ACGME has helped infuse many educational research concepts into our vernacular. We have “qualitative” and “quantitative” approaches to evaluating residents with milestones, and we hope to have “valid” and “reliable” evaluation tools. As our conversations progress and our evaluations mature, it’s helpful to pause and ensure that we’re all on the same page (or pages, in this case).

The “Last Page” of each Academic Medicine issue presents a peer-reviewed snapshot of key educational ideas, projects, or concepts relevant to medical education. The benefit of this approach is that seemingly cumbersome topics are made accessible. The two Last Pages referenced above present an approachable contrast between validity and reliability and between research paradigms, each wrapped up in a digestible, one-page bite.

While reliability is increasingly being subsumed under the heading of validity, it still represents a key consideration for measurement, and assessing it is a task worthy of attention, especially as we put increasing emphasis on resident and student performance evaluations and knowledge or skill tests. Assessing validity is not as straightforward and is, I would argue, achieved through a collection of evidence rather than expressed as a single number.
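For readers who want to see what a reliability estimate actually looks like, below is a minimal sketch, in Python with entirely hypothetical ratings, of Cronbach's alpha, one common index of internal consistency. The function and the data are illustrations of the general formula, not anything drawn from the articles above.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Estimate internal-consistency reliability (Cronbach's alpha).

    scores: 2-D array, rows = ratees (e.g., residents), columns = items
    on an evaluation form.
    """
    scores = np.asarray(scores, dtype=float)
    n_items = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)      # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical data: 5 residents rated on a 4-item evaluation form (1-5 scale)
ratings = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
])
print(f"Cronbach's alpha: {cronbach_alpha(ratings):.2f}")
```

Even here, the single number speaks only to internal consistency; inter-rater agreement and the broader collection of validity evidence remain separate questions.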

As we develop new educational innovations and ask ourselves and each other why, how, and to what extent these innovations work, some comfort with research paradigms will serve us well. Thanks to Castillo-Page and her co-authors, this one-pager presents the differences between methodological approaches to education research across critical aspects, including research design, data analysis, and, of course, reliability and validity.

Bottom Line:

These two one-page articles are a concentrated dose of educational research basics. Ten minutes of your day is well spent ensuring that your definitions of these critical terms are fully developed and ready for use.

August 2012: "Complexity" through Qualitative Research

Representing complexity well: A story about teamwork, with implications for how we teach collaboration, Lingard L, McDougall A, Levstik M, Chandok N, Spafford MM, Schryer C. Medical Education, 2012; 46: 869-877.  

This Medical Education article is like the star pupil in a class of education research examples. To begin, it’s a rigorous qualitative study with a textbook application of a theoretical framework. Data include standardized field notes from focused observations and transcripts from semi-structured interviews. Observations were conducted in “a ‘marginal participant role’…which enabled the study group to focus on observing interactions while allowing for informal discussions with team members.” (p. 871) Analysis was iterative and inductive, following a grounded theory approach using open coding. Validity was addressed through member-checking and triangulation.

This article is also well written. Flip to the results section and you’ll read a story portraying the themes identified through analysis. This style of results presentation helps contextualize both the application of results and the theoretical framework for the reader.   

Finally, the article also highlights some meaningful content. The complexity that each of you may experience on patient care teams is not unique to your situation. In fact, complexity is an attribute ingrained in interprofessional collaboration (IPC), and the authors of this article argue that research examining IPC must not only consider such complexity but also investigate it directly. They argue that, although we all agree that IPC and interprofessional education are valuable, they are “in constant tension with other relevant motives, such as appropriate resource allocation and trainee education.” Competition among relevant objectives helps define the complexity of health care teamwork, and, unfortunately, the authors find that “IPC and IPE models do not sufficiently reflect this complexity.” (p. 876)

Bottom Line:

Interested in qualitative research?
Read this article for an example of rigorous methodology and creative display of results.
Interested in improving the collaboration among members of your patient care team?
Read this article for some insight into the complexity of interprofessional collaboration. Creative solutions to address such complexity will be successful only if we truly understand the underlying problems. 

July 2012: OMP and the RCT

Teaching the One Minute Preceptor: A Randomized Controlled Trial, Furney SL, Orsini AN, Orsetti KE, Stern DT, Gruppen LD, Irby DM. J Gen Intern Med. 2001; 16: 620-624. 

In this article, Furney et al. use a randomized controlled trial (RCT) to examine the link between a monthly educational intervention – a one-hour workshop on the One-Minute Preceptor (OMP) – and residents’ teaching skills. To measure the effectiveness of this intervention, they captured pre- and post-intervention perspectives from learners on residents’ teaching, along with residents’ self-reported use of teaching skills.
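As a concrete illustration of the logic of such a design, here is a minimal sketch in Python of how pre-to-post change scores from a two-arm trial might be compared. The ratings below are entirely hypothetical and are not drawn from Furney et al., whose actual analysis and results are reported in the paper.

```python
import numpy as np
from scipy import stats

# Hypothetical 1-5 teaching-skill ratings for residents randomized to an
# OMP workshop arm vs. a control arm; none of these numbers come from
# Furney et al.
workshop_pre  = np.array([3.1, 3.4, 2.9, 3.6, 3.2, 3.0])
workshop_post = np.array([3.9, 4.1, 3.5, 4.2, 3.8, 3.7])
control_pre   = np.array([3.2, 3.0, 3.5, 3.3, 2.9, 3.4])
control_post  = np.array([3.3, 3.1, 3.6, 3.4, 3.0, 3.4])

# Compare the pre-to-post change between arms with an independent t-test.
change_workshop = workshop_post - workshop_pre
change_control = control_post - control_pre
t_stat, p_value = stats.ttest_ind(change_workshop, change_control)

print(f"Mean change, workshop arm: {change_workshop.mean():+.2f}")
print(f"Mean change, control arm:  {change_control.mean():+.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

Even in this toy version, the statistics tell us only that ratings changed more in one arm; as the authors’ limitations section reminds us, attributing that change to the OMP content itself is a separate inferential step.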

The authors conclude that their intervention improved residents’ abilities to provide feedback and motivated learners to increase their outside reading. In the limitations section, the authors note that the improvement in outcomes may not be due to the content of the one-hour educational intervention but may instead be related to the fact that any session on teaching skills was offered. In a sense, a structured session in which residents were reminded that teaching was an important part of their jobs was enough to mitigate “teacher fatigue,” or the deterioration of teaching skills that can occur in the absence of any intervention.

While the RCT is widely regarded as a gold standard for identifying causality in research, the limitations section of every educational research RCT is a reminder of the complexities of identifying and defining the specific agents in education. In evaluating the nature of evidence in both clinical and educational research, Patricio and Vaz Carneiro1 point out that educational interventions, studies, and circumstances carry complexities that can be difficult to capture and even more difficult to interpret. They write that “the evidence on the success or failure of the intervention may be less clear in medical education studies because the establishment of causal relationship between the intervention and outcomes may be difficult (sometimes impossible)” (p. 480). In other words, defining and measuring various aspects of education can be a formidable and, sometimes, impossible exercise.

In the current article, Furney et al. provide evidence that a monthly intervention demonstrating the OMP improved residents’ teaching skills, from the perspectives of both the learners and the teachers. Is the study conclusive? No. But it does provide a solid step in the body of evidence investigating the development of our resident teachers. It is this contribution, and not the objectivity of the RCT itself, that should define a successful venture in quantitative educational research.

Bottom Line:

RCTs are possible and important in medical education research. However, even an RCT cannot guarantee a causal link given the ambiguity of education. It should be the goal of educational researchers to accurately define their study’s components and limitations so that others may continue the exploration. 


Reference:
1. Patricio M, Vaz Carneiro A. Systematic reviews of evidence in medical education and clinical medicine: Is the nature of evidence similar? Med Teach. 2012; 34: 474-482. 

June 2012: Expertise and Simulation

How Can Educators Use Simulation Applications to Teach and Assess Surgical Judgment? Andersen DK. Academic Medicine. 2012; 87(7): 934-941.


Our patients rely on the expertise of our faculty. Expert judgment in clinical decision-making is also valuable for our learners to observe because, short of role modeling, this judgment cannot be taught…or can it? In this article, the author explores some characteristics of expertise and judgment, uncovering themes that can be applied to all specialties, in an effort to develop a teaching plan through simulation. He writes:
“An inherent part of expert judgment, therefore, appears to be the transition from routine, automated processes to focused, analytic behavior, characterized by slowing down at critical decision points [by being aware of the ‘subtle complexities of situations’]. Slowing down allows the expert surgeon to engage in analysis, teaching, and self-reflection, which enhances patient safety.” (p. 935). 
After unraveling the concepts of expertise and expert judgment, the author synthesizes the literature to present simulation methods that can be used to teach and assess these concepts, including the cognitive skills associated with expert judgment and decision-making. In fact, some of this work on simulation and judgment is already being conducted at Baystate, even as the literature continues to evolve. Related themes in the literature include cognitive rehearsal of an event prior to performance and strategies to avoid common cognitive errors in clinical decision-making.

Bottom Line:

“…Expert judgment is teachable and transferable to trainees…” (p. 939). Simulation methods provide the environment and tools to teach and assess cognitive and technical aspects of expertise and expert judgment in clinical decision-making.