The Human Factors Quarterly Newsletter, Winter 2016, Issue 13


Simulation and Modeling

Articles in this issue highlight the application of modeling and simulation in Human Factors and Patient Safety. Article one describes how researchers at Vanderbilt use medical simulation to study human performance and technology use within a simulated environment to understand and improve quality and patient safety. Article two provides an overview of Conceptual Modeling, a human factors technique used to analyze and understand the context of use for health information technology. Article three is our interview with Dr. Jason Saleem, in which he discusses barriers to applying research results to system development and areas with great potential for future human factors contributions. Article four discusses work by the SimLEARN team to implement a Hazard Identification and Mitigation Plan, in which they use simulation to understand and mitigate risks posed when launching new facilities and services. We conclude with our Spotlight series, providing a brief overview of Modeling and Simulation in Human Factors Engineering. As always, we welcome your questions, feedback, and ideas for new articles via e-mail to VHA10P2HFQ@va.gov.

From The Editor-in-Chief, Scott D. Wood, PhD
 

Using Medical Simulation for Human Factors Research
Shilo Anders, PhD, Center for Research and Innovation in Systems Safety, Department of Anesthesiology, Vanderbilt University Medical Center

The Center for Research and Innovation in Systems Safety (CRISS) conducts basic and applied human factors research in patient safety, clinical quality, and health care informatics at Vanderbilt University Medical Center. As research investigators for CRISS (lab seen in Figure 1), we study human factors, systems design, and improvements in health care through research in patient care and in realistic simulations...

Read More >

   Upcoming Events:
Human Factors and Ergonomics Society Healthcare Symposium, San Diego, CA, April 13-16, 2016

ACM CHI 2016, San Jose, CA, May 7-12, 2016

International Conference on Applied Human Factors and Ergonomics, Walt Disney World, FL, July 27-31, 2016

   Informational Links:

Human Factors Engineering (HFE) within the Veterans Health Administration (VHA) Office of Informatics and Analytics, Health Informatics Division, seeks to increase awareness of human factors products and services among clinical end-users and other stakeholder groups. This quarterly newsletter serves to engage these communities in shared communication and collaboration around human factors-related issues.
Access and review previous issues of this newsletter
   Editor-in-Chief:
Scott D. Wood, PhD, Informatics Patient Safety, Office of Informatics and Analytics
   Managing Editor:
Christopher Petteys, MBA, Human Factors Engineering, Office of Informatics and Analytics
   Editorial Board Members:
Ross Speir, Human Factors Engineering, Office of Informatics and Analytics

Alissa Russ, PhD, VHA/Health Services Research and Development

Linda C Williams, RN, MSI, VA National Center for Patient Safety

Shilo Anders, PhD, Center for Research & Innovation in Systems Safety, Vanderbilt University

Rachel Wiebe, RD, CPHS, Health Solutions Management, Health Informatics, Office of Informatics and Analytics
 
Conceptual Modeling in System Design
Scott D. Wood, PhD, Informatics Patient Safety, Office of Informatics and Analytics

When we first use a well-designed application or product, it seems natural and effortless, as if the designers know exactly how we like to work. We’ve all seen the opposite effect as well, when poorly designed products take a lot more effort to use despite their functionality. Highly usable products are not developed accidentally, but are...

Read More >
 
The Veterans Health Administration (VHA): An Influential Health Care Leader in Human Factors Engineering – An Interview with Jason Saleem, PhD
Christopher Petteys, MBA, Human Factors Engineering, Office of Informatics and Analytics

What is your role within your organization? Please describe your affiliation with VHA and any work you may do with the organization.

I recently completed my first year as an Assistant Professor in the Department of Industrial Engineering at the University of Louisville. I also serve as the Director of the Center for Ergonomics at the University. The University has a rich tradition of research on the physical ergonomics side of human factors. I am in the process of changing that approach...


Read More >
 

Using High-Fidelity Mannequins to Mitigate Risks and Hazards in New VHA Institutions
Haru Okuda, MD, FACEP, Lygia Arcaro, PhD, RN-BC, Terry Exum, MA, and Patricia Hubbard, PhD, SimLearn, Employee Education Service

VHA is the largest integrated health care system in the United States with 153 medical centers, numerous community based outpatient clinics and community living centers, and many more being planned, under construction, or near completion. With the opening, or activation, of a new clinical facility come many potential hazards....


Read More >

 
Spotlight: Modeling and Simulation in Human Factors Engineering
Scott D. Wood, PhD, Informatics Patient Safety, Office of Informatics and Analytics

The field of Human Factors seeks to understand the work people perform to successfully achieve their goals in order to optimize efficiency, safety, learning, and other measures. In Health Informatics, we typically focus on...

Read More >
 




Using Medical Simulation for Human Factors Research
Shilo Anders, PhD, Center for Research and Innovation in Systems Safety, Department of Anesthesiology, Vanderbilt University Medical Center

The Center for Research and Innovation in Systems Safety (CRISS) conducts basic and applied human factors research in patient safety, clinical quality, and health care informatics at Vanderbilt University Medical Center. As research investigators for CRISS (lab seen in Figure 1), we study human factors, systems design, and improvements in health care through research in patient care and in realistic simulations. Simulation is defined as a “situation or environment created to allow persons to experience a representation of a real event for the purpose of practice, learning, evaluation or testing, or to gain an understanding of systems and human factors.” Simulation can range from computer modeling to full-scale recreation of human work environments.


Figure 1.  CRISS Lab

CRISS collaborates closely with Vanderbilt’s Center for Experiential Learning and Assessment (CELA), a multipurpose, high-fidelity simulation facility with 12 fully-equipped clinical exam rooms, a 4-bed Intensive Care Unit or Emergency Department (ED), and an Operating Room (OR) suite. It is equipped with two complete control rooms, mannequin-based simulation, and the latest in virtual reality simulators and partial task trainers. CELA affords an optimal environment for research and teaching.

The CRISS and CELA collaboration uses human factors to improve patient safety by focusing on critical events and near misses to better understand how to translate safety knowledge and tools from other industries to health care. Simultaneously, investigators implement practical solutions to mitigate such events. Our research has focused on assessing human performance and technology usability during critical events using full-scale simulation. Simulation has been used to assess clinician performance under different conditions, to understand and improve human performance, and to evaluate health information technology. The following sections provide illustrations of each.

1)  Study and improve human performance using simulation


Simulation exposes individuals to a situation or critical event in which they must perform mitigation tasks and patient care. In this way we can understand human performance or, if an intervention is provided, examine its impact on performance. In one study, we examined resident anesthesiologists’ performance during pediatric operating room emergencies and, for some events, provided a pediatric emergency cognitive aid (CA). Additionally, we evaluated user preferences for paper versus electronic CAs during management of simulated critical events. Participants managed the simulated scenarios under one of three randomized conditions: 1) memory alone, 2) with a paper CA, or 3) with an electronic CA. Across 143 simulated events, the results showed that residents’ performance improved significantly when they used the CA. Surprisingly, although time to task completion was unaffected by use of a CA, nearly a third of trainees chose not to use a CA despite it being explicitly provided to them.1 Even though the simulation environment helped validate the CA’s effectiveness on task performance, it also pointed out additional areas for improvement regarding CA adoption.

2)  Assess human performance under different conditions using simulation


Simulation is an ideal way to study performance under challenging conditions, such as sleep deprivation, or environmental anomalies, such as power failure. For example, we were part of a multisite study that used standardized high-fidelity simulation scenarios to assess the performance of practicing board-certified anesthesiologists (BCAs) during medical emergencies (an assessment currently being considered as an additional criterion for recertification). While it did not manipulate a specific condition, this study illustrates how simulation can be used for assessment. Video-recorded performances of 268 participants were analyzed by experienced BCAs for both technical and non-technical performance. Thirty-two percent of anesthesia providers were rated as not performing at the level of a consultant. Higher-rated performances were associated with academic (vs. community) practice and participant age under 50 years. If these findings reflect performance during actual care, they call into question the efficacy of existing systems of continuing education and training. Greater use of simulation-based assessment and training as part of physicians’ lifelong learning may be warranted.

3)  Simulation to assess technology


CRISS investigators design and evaluate medical devices and health information technology. In collaboration with VA, other Vanderbilt centers, and outside vendors, CRISS develops and improves the user experience. In one study, we used the high-fidelity simulator (in both the floor/ED configuration and the OR configuration) to conduct a usability test of two comparable technologies designed to reduce blood product administration errors. Twenty-two care providers evaluated one of the two products during simulated use in realistic scenarios after a 15-minute vendor-provided training. Significant effectiveness differences were observed between the two products, but surprisingly, neither product was more efficient than manual checking. Usability issues included poor access to subtasks, lack of process feedback, inadequate error messaging, and confusing device interactions. This study suggests that simulation-based usability testing is a valuable and effective method of evaluating technology within the complex world of patient care.

Collaboration between CRISS and CELA provides a valuable way to advance human factors research in health care. Even so, numerous avenues and areas of research remain. Distributed simulations of teamwork and care, for example, represent an advancing area of simulation technology that could benefit from human factors research but has yet to be explored. Likewise, resilience engineering is another area in which increased use of simulation has great potential. Resilience engineering focuses on how people succeed and adapt to perturbations in a given system, specifically the actions and interactions of individuals, devices, and processes within a given environment. This paradigm has only begun to be studied using simulation, which can be used to create brittle situations and evaluate both system recovery and failure. Finally, simulation can be used to further study whether and how technology can be used effectively in real-world conditions.

 

1Watkins, SC, Anders S, Clebone, A, Hughes E, Zeigler L, Patel V, Shi Y, Shotwell M, McEvoy M, & Weinger MB. An emergency cognitive aid, regardless of mode of delivery, improved anesthesia trainees’ performance during simulated pediatric critical events. Submitted to Simulation in Healthcare.

 


Conceptual Modeling in System Design
Scott D. Wood, PhD, Informatics Patient Safety, Office of Informatics and Analytics

When we first use a well-designed application or product, it seems natural and effortless, as if the designers know exactly how we like to work. We’ve all seen the opposite effect as well, when poorly designed products take a lot more effort to use despite their functionality. Highly usable products are not developed accidentally, but are the result of deliberate design that makes them work in a way that is intuitive to users, allows them to focus on their work, and makes it easier to do things right. An important technique for developing clear, intuitive products is conceptual modeling.

When users learn to use a software application, they form an internal understanding, a mental model, of how the application behaves and how they can use it to accomplish their goals. This understanding may be partially right, completely wrong or just incomplete. For example, user understanding of how Computerized Patient Record System (CPRS) notifications work is often flawed, as seen in the wide variety of workarounds that are performed to ensure important information is not accidentally deleted (such as creating a new unsigned note in which to copy notification results).

The conceptual model for an application represents the developer’s understanding of how the application should behave. The following figure (Wood et al., 2014, adapted from Johnson & Henderson, 2011) shows the relationship between mental models and conceptual models. The Health Care Task Domain represents the body of knowledge that clinicians draw upon to care for patients, such as medical training, care guidelines, and the workflows for their particular service and facility. Designers and developers also draw upon this knowledge (through subject-matter experts and human factors analyses) to create tools (in this example, representing a mobile product). The design concepts and product functionality built into the application, along with information from the patient’s record, are exposed to the user via the user interface.


Figure 2.  Contrast Between User's Mental Model and Designer's Conceptual Model (Adapted from Johnson and Henderson, 2011)

The Figure above shows the contrast between the user’s mental model and the designer’s conceptual model. It illustrates several key points regarding this relationship and the design guidance we can derive from the modeling process:

  1. Users and designers have different perspectives for how an application should be used. A conceptual model captures designer intent explicitly and enables early analysis to detect mismatches with user mental models. Late detection of such design and requirement defects is very expensive, so early detection can save a lot of time and effort.
  2. Designers’ ability to derive a good conceptual model depends on how well they understand the domain. Incomplete or inaccurate understanding of any aspect of the usage context can result in latent (undetected) design issues.
  3. Users develop a mental model based on their background, the exposed functionality, and the representation of both tasks and data within the user interface. So, beyond understanding the domain, developers also have to ensure that the exposed functionality and information fit both the task and cognitive design principles.
  4. Latent design issues can arise from mismatches between the mental and conceptual models, which can result in both poor usability and patient safety risks.

In a sense, the conceptual model can be seen as an ideal mental model. The ultimate design goal is for users to adopt the same mental model assumed by the designers and for that model to clearly support the user’s goals.

Conceptual modeling can be seen as both a human factors method and a framework for representing design knowledge in a structured way. A good conceptual model should inform multiple analytic techniques and allow unambiguous communication within and between analysis and development teams. Conceptual modeling seeks to create a holistic view of an application by explicitly modeling or representing key aspects of a product as they relate to intended usage, such as concepts, task flow, and presentation. Some of the key aspects described in a conceptual model include:

  • the users (e.g., their roles, needs and background),
  • the task domain (e.g., health care, disease management, differential diagnosis, etc.), especially the medical or care basis for the application on which effectiveness will be measured,
  • high-level user activities or goals within or that connect to the task domain (e.g., make sense of new symptoms, determine new treatment, etc.),
  • concepts that the product introduces to the user (e.g., problem list) or that are used to help frame the problem,
  • functionality for interacting with those concepts (e.g., add/delete/edit problems),
  • tasks in which users complete goals utilizing the concepts through the available functionality (e.g., submit electronic prescription order to treat condition), and
  • decisions the user must make to accomplish their tasks and goals.

The main goals of conceptual modeling are to ensure that the correct product is being developed and that the concepts and functionality are sufficient for users to complete their goals using the specified tasks. Explicitly representing this information has many benefits, including the ability to analyze interactions between conceptual model elements that may reduce usability or increase patient safety risk. For example, listing conditions in a patient’s problem list without including the observation date may make it more difficult for a provider to understand whether a listed condition is active, being treated by another provider, or just historical. This could show up during analysis as a high-level goal with a difficult task procedure (large number of steps to complete the goal, or missing information on a key screen).
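This kind of interaction analysis becomes easier when the conceptual model is captured as structured data rather than prose. The sketch below is purely illustrative (the element names, attributes, and checking logic are our own simplification, not part of any VHA tool or the referenced framework): a "problem" concept declares the attributes every instance must carry, and a simple check flags entries, such as a problem-list item missing its observation date, that might otherwise become latent design issues.

```python
from dataclasses import dataclass

@dataclass
class Concept:
    """A concept the product exposes to the user (e.g., 'problem list')."""
    name: str
    required_attributes: list  # attributes each instance must carry

@dataclass
class ConceptInstance:
    """One concrete instance of a concept as it would appear in the UI."""
    concept: Concept
    attributes: dict

def find_missing_attributes(instance: ConceptInstance) -> list:
    """Return required attributes absent from this instance."""
    return [a for a in instance.concept.required_attributes
            if a not in instance.attributes]

# Illustrative check: a problem-list entry without an observation date
problem = Concept("problem", required_attributes=["description", "observation_date"])
entry = ConceptInstance(problem, {"description": "hypertension"})
print(find_missing_attributes(entry))  # ['observation_date']
```

In practice, the same structured representation could also feed other analyses, such as counting the steps a task procedure requires to complete a high-level goal.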

Conceptual modeling can inform multiple stages in the development lifecycle. An important function of the conceptual model is to serve as a consistent, standardized vocabulary for design, development, and evaluation. One example of this is in describing usage scenarios. Scenarios derived from, and which utilize the vocabulary of the conceptual model, can be used as a sanity check early in the requirements-gathering process. For example, if end users describe concepts, tasks, or decisions that are not in the scenarios, we may be missing something very important. Likewise, if the scenarios include tasks and decisions that end users do not care about, we may be building unnecessary functionality. Similarly, the conceptual model provides an excellent framework for uncovering latent design issues, problems with design consistency, and patient safety risks.
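The scenario sanity check described above is essentially a vocabulary comparison. As a hedged sketch (both term sets below are invented for illustration), a simple set difference surfaces concepts or tasks that end users mention but the model lacks, as well as model functionality that no scenario exercises:

```python
# Terms drawn from the conceptual model (illustrative only)
model_terms = {"problem list", "add problem", "submit order", "review allergies"}

# Terms that appeared in user-described usage scenarios (illustrative only)
scenario_terms = {"problem list", "add problem", "flag drug interaction"}

# Concepts/tasks users care about that the model is missing
missing_from_model = scenario_terms - model_terms
# Model functionality no scenario exercises (possibly unnecessary)
unused_in_scenarios = model_terms - scenario_terms

print(sorted(missing_from_model))   # ['flag drug interaction']
print(sorted(unused_in_scenarios))  # ['review allergies', 'submit order']
```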

Conceptual modeling is just one way of documenting design intent and describing the context of use. Some have described it as the most important step in user interface design. Because the focus is on the user’s perspective, conceptual modeling can help reduce the gap between the user’s mental model and how the designers intend for a product to be used. However, the technique is flexible enough to apply to a wide range of design challenges. It also helps us answer two key questions regarding a new product: “Did we build it right?” and “Did we build the right thing?”

 


The Veterans Health Administration (VHA): An Influential Health Care Leader in Human Factors Engineering – An Interview with Jason Saleem, PhD
Christopher Petteys, MBA, Human Factors Engineering, Office of Informatics and Analytics

What is your role within your organization? Please describe your affiliation with VHA and any work you may do with the organization.

I recently completed my first year as an Assistant Professor in the Department of Industrial Engineering at the University of Louisville. I also serve as the Director of the Center for Ergonomics at the University. The University has a rich tradition of research on the physical ergonomics side of human factors. I am in the process of broadening that approach by expanding the Center’s scope to include cognitive and organizational ergonomics. From my time in VA, these are the areas in which I have more experience and greater interest. Anything I’m doing now on the University side is heavily influenced by my experiences with VA Health Services Research and Development (HSR&D) and the Office of Informatics and Analytics (OIA), and their early recognition of the importance of human factors in health care. Fortunately, I’m maintaining my relationship with VA through an Intergovernmental Personnel Act (IPA) agreement with the Office of Informatics and Analytics and the Human Factors Engineering (HFE) team. I was tasked with assisting in the design of the next-generation exam room; however, a change in need led me to support VistA Evolution efforts instead. I’m hoping to continue that work this fiscal year through an extension of my agreement with OIA and HFE.

How were you first introduced to human factors and what does that mean to you?

I was first introduced to human factors as an undergraduate while enrolled in the industrial engineering curriculum at the University of Pittsburgh. I wasn’t overly interested in engineering, but an introductory class in human factors engineering got me excited about the discipline because I learned that human factors is really a fusion of engineering and psychology. For me, engineering design by itself was important, but a little too “dry.” I loved the idea of a discipline that was inherently multidisciplinary. To fully understand human cognitive capabilities and limitations, and how they apply to the design of such things as clinical information systems, one needs knowledge of psychology and other relevant disciplines. After experiencing this class, I decided to specialize in human factors at the graduate level and pursue a PhD in the discipline.

How do you employ human factors principles in your work?

Human factors is my identity since that is what I specialized in at the graduate level, so it’s virtually part of everything I do, including research and teaching. I teach an introductory human factors engineering and ergonomics course, and I’m getting ready to teach a graduate-level, advanced human factors engineering course where we’ll be investigating and discussing the latest trends in human factors and how they relate to design. Specifically, we’ll be learning and applying advanced methods in human factors engineering, as well as newer models, theories, and frameworks related to the field, such as situation awareness, macrocognition, distributed cognition theory, sociotechnical systems theory, virtual and augmented reality, team-based performance, workload, workarounds and their implications for redesign, resilience engineering, trust and technology adoption, and user-centered design.

In terms of my research work with the Center for Ergonomics, I’ve focused on expanding beyond physical ergonomics to include cognitive and organizational ergonomics, which relate more closely to my background and interests. I took half of the Center and designed a human-computer interaction setup where one can conduct usability and simulation studies in a controlled environment. I currently have a grant application under review in which I propose to simulate different exam room configurations and conduct a controlled experiment to test whether certain configurations are more likely to result in patient-centered behaviors by the provider. For example, do certain configurations increase time spent on screen-sharing activities or increase eye contact with the patient?

Have you seen changes in the interest in or awareness of human factors in VA? What are they?

There’s been a dramatic change in the interest in and awareness of human factors throughout VA, from being barely present when I first started with VA in 2003 to now having a very strong presence, including a dedicated office. To go from just a handful of research investigators with human factors experience in 2003 to an entire operations office devoted to the discipline is remarkable. Outside VA, I’ve seen a similar trend of growth in interest and awareness of human factors. For example, the National Science Foundation has put great emphasis on incorporating human-centered design and cognitive engineering into its grant solicitations. A decade ago, when submitting a grant with human factors methods to HSR&D, no one with human factors experience was even available to review it. Now those types of grants are common and even expected.

Where do you see the greatest barrier to the effective integration of human factors?

I see the greatest barrier as the divide between research and operations. That is not a new concept, as it is common across many organizations and fields. I worked as an HSR&D investigator for about eight years, and then worked on the operations side within the HFE office. Having experienced both research and operations positions has made me really passionate about the topic and helped me understand the divide. When I was with HSR&D, I felt I did a lot of productive, meaningful research on the integration of human factors with clinical information systems. However, I was frustrated with how little impact I was having on the operational side, and until I worked for them I couldn’t understand why. For example, performance metrics are very different depending on which side of the divide you are on. If you’re a research investigator, you are evaluated annually on grant funding and publications. In operations, my publication record didn’t really matter; I was evaluated on my customer service and my ability to deliver services and reports on time. Another example is timelines. Researchers are used to grants that span one to three years.

When I joined operations, the biggest shock to my system was how rapid and aggressive the timelines were. If operational offices need help from HSR&D, they need help now, not a year from now. Finding common ground on some of these differences is an interest of mine. On the research side, they are incentivized to work with operations more closely because HSR&D investigators must now demonstrate strong operational partnerships to obtain funding. I don’t think the same incentives exist on the operations side. This is just one example of the type of work that can be done to lessen this divide and help improve the effective integration of human factors research with the development of clinical information systems.

Where do you see the greatest potential for the application of human factors principles?

I see two large application areas. One is more related to safety. There’s a tremendous amount of human factors principles and research that goes into the safety aspect of health care, for example, hand-offs or hand-overs and transitions of care research. The other area is related to Electronic Health Records (EHRs) and related health information technology, and that’s what I’m most interested in. VA is developing a new EHR, which poses a huge opportunity for the application of human factors principles. However, it’s also important to look at it from the research side, and not just the operational or usability-support side. For example, I’m trying to obtain funding to investigate how to integrate multiple computing devices into provider exam room workflow, and the room configurations that support it. In Wired magazine, there was a column that suggested having multiple screens can help focus one’s attention rather than serve as a distraction. That is, having multiple screens may simulate the cognitive ergonomics of paper. When I was a VA research investigator, I was really interested in the persistence of paper with the EHR and why so much paper was being used. There’s this notion that if we can incorporate multiple screens and devices, they can help us absorb some of the forgotten power and affordances of paper. For example, maybe the provider has a main monitor with progress notes along with an iPad to show certain images to the patient. Each screen serves a different purpose or function to help focus their attention rather than distract. Research like this is a great opportunity for the application of human factors in VA.


Using High-Fidelity Mannequins to Mitigate Risks and Hazards in New VHA Institutions
Haru Okuda, MD, FACEP, Lygia Arcaro, PhD, RN-BC, Terry Exum, MA, and Patricia Hubbard, PhD, SimLearn, Employee Education Service

VHA is the largest integrated health care system in the United States, with 153 medical centers, numerous community-based outpatient clinics and community living centers, and many more being planned, under construction, or near completion. With the opening, or activation, of a new clinical facility come many potential hazards. Herzer and colleagues defined hazards in the hospital activation setting as any event that could harm a patient or staff member, associated with the introduction of new processes, equipment, provider teams, ergonomics and design, and the availability of resources. Often, these hazards go undiscovered until there is an adverse patient outcome or a near miss.

Gunal acknowledged that there are many different tools and models used in the evaluation and improvement of health care systems, such as discrete event simulation and agent-based modeling and simulation. Most of these models focus on detailing the process of care to provide stakeholders with information to effectively deploy human resources, capital, and equipment. Little has been published on the use of modeling to identify potential hazards and latent safety threats in specific hospitals or clinical environments prior to their opening.

SimLEARN (Simulation Learning, Education and Research Network) is VHA’s national simulation center, focused on developing the strategic vision and system-wide plan for simulation process modeling, training, education, and research. SimLEARN developed a national VHA hospital activation simulation testing strategy. The strategy uses high-fidelity mannequin-based simulation and standardized patients to identify patient care improvement opportunities and mitigate hazards prior to activation of new facilities. In 2011, SimLEARN deployed the national simulation-based hospital activation program to support the sequential opening of clinical services at the new VA Southern Nevada Healthcare System (VASNHS) Medical Center. The activities and initiatives within the simulation program are detailed below.

Preston, Lopez, and colleagues detailed the benefits of in-situ simulation and their use of clinical simulation training for their Perinatal Patient Safety Program. Review of this strategy prompted direct dialogue about their detailed model for improving existing clinical care. Herzer and colleagues focused on new clinical therapies in an operating suite. Their study detailed a strategy for patient care teams to prospectively identify and mitigate clinical hazards through a five-phase framework. This framework proved adaptable to VHA needs for hospital activation.

Hazard Identification and Hazard Mitigation Plan

To better meet the needs of VA, SimLEARN adapted the Herzer model, which incorporated pre-assessment, simulation testing, mitigation support and guidance of continuous improvement efforts led by VASNHS during medical center simulation testing phases. The five-phase Hazard Identification and Mitigation Plan (HIMP) developed by SimLEARN includes planning, pre-assessment, simulation testing/assessment, evaluation and mitigation. A detailed graphical depiction of the HIMP phases, and associated components within each phase, is shown in Figure 3.

Figure 3. HIMP Phases and Associated Phase Components

Planning started thirteen months prior to the delivery of patient care services, beginning with conference calls between VASNHS and SimLEARN leadership. A site visit was conducted, and the contents of a joint hazard mitigation proposal were discussed.

Pre-assessment had two primary components: staff completion of the Hospital Activation Pre-Assessment Form and a subject matter expert pre-planning teleconference targeting specific clinical and nonclinical areas designated for initial activation. This phase helped the SimLEARN team identify and categorize concerns raised by the subject matter experts, providing a foundation for determining the types of scenarios to conduct during the simulation testing phase.

Simulation testing and assessment were conducted one month prior to initial delivery of patient care services and included the use of standardized patient(s) and a high fidelity mannequin. SimLEARN team members facilitated simulation scenarios using the high fidelity mannequins in both clinical and nonclinical areas, and focused on three categories: patient flow, workflow and equipment use. After each phase, the SimLEARN team debriefed learners to gather feedback regarding potential hazards and patient care improvement (PCI) opportunities identified during simulation testing.

A PCI matrix adapted from the VA National Center for Patient Safety (NCPS), Safety Assessment Code (SAC) Matrix was used to support the identification of Patient Care Improvement probability and opportunity levels. The SAC matrix is regularly used by VA clinicians, so adapting the new PCI matrix to the similar format and measurement scale of the SAC matrix supported ease of use, requiring limited application training for clinicians participating in simulation testing.

The PCI matrix and the associated PCI legend as shown below in Figure 4 are based on an alpha-numeric designation. The alpha designation represents probability from frequent to remote. The numeric designation represents the patient care improvement opportunity levels of high, intermediate, or low. Action requirements range from “take immediate action” to “acceptable with no action required.” The PCI Matrix was used to support the alpha-numeric patient care improvement opportunity designations during pre-assessment and assessment.

Figure 4. Patient Care Improvement Matrix with PCI Legend
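The alpha-numeric designation described above can be sketched as a simple lookup. The cell-by-cell mapping in the actual PCI matrix is not reproduced in this article, so the probability bands, the specific cell-to-action assignments, and the function name below are hypothetical placeholders showing only the structure of such a matrix.

```python
# Illustrative sketch of an alpha-numeric PCI-style lookup table.
# The probability bands and cell-to-action rules here are hypothetical;
# they stand in for the actual assignments shown in Figure 4.
PROBABILITY = ["A", "B", "C", "D"]   # alpha: A = frequent ... D = remote
OPPORTUNITY = {1: "high", 2: "intermediate", 3: "low"}   # numeric levels

def pci_action(prob: str, level: int) -> str:
    """Map an alpha-numeric designation (e.g. 'A1') to an action requirement."""
    if prob not in PROBABILITY or level not in OPPORTUNITY:
        raise ValueError(f"unknown designation {prob}{level}")
    # Hypothetical rule: frequent/high cells demand immediate action,
    # remote/low cells are acceptable, and everything else gets reviewed.
    if prob in ("A", "B") and level == 1:
        return "take immediate action"
    if prob == "D" and level == 3:
        return "acceptable with no action required"
    return "review and plan mitigation"

action = pci_action("A", 1)   # designation 'A1'
```

Encoding the matrix this way mirrors why the SAC-style format needed little training: clinicians only supply two familiar inputs, and the action requirement follows mechanically.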

Mitigation

VASNHS-driven mitigation began immediately following the simulation phase and has since expanded to include simulation testing as a method of continuous improvement. VASNHS leadership has taken a comprehensive approach to creating positive change in the three simulation testing focus areas (patient flow, workflow and equipment use). SimLEARN provided VASNHS leadership with a “Hospital Activation Simulation Testing Hazard Identification Report,” which categorized areas of note by location, service area, severity and probability level. Upon receipt of the report, VASNHS took immediate action to ensure that all potential hazards were mitigated prior to the delivery of patient services. Mitigation efforts included the following:

  • additional simulation testing to support continuous improvement;
  • equipment installation and/or modification;
  • staff training;
  • signage upgrades; and
  • process improvement.

Conclusions

The use of high fidelity human patient mannequin simulations proved a powerful methodology to assess potential hazards and latent safety threats, as well as assess patient workflow in newly-built hospitals prior to activation. The integration of this systematic approach to simulation testing enables facilities to identify patient care improvement opportunities and take mitigation steps prior to the delivery of patient care services. Finally, successful application of this systematic approach resulted in the development of a standardized national model of VHA simulation testing that will be replicated for delivery at new facilities opening throughout the VHA.

 




Spotlight: Modeling and Simulation in Human Factors Engineering
Scott D. Wood, PhD, Informatics Patient Safety, Office of Informatics and Analytics

The field of Human Factors seeks to understand the work people perform to successfully achieve their goals in order to optimize efficiency, safety, learning, and other measures. In Health Informatics, we typically focus on the relationships between several core aspects of system usage to understand these human factors: the user, the domain tasks they are performing, the technology they are using, and the environment in which they are working. Collectively, these components constitute a context of use, which itself is a form of high-level usage model. Modeling and simulation are important techniques for engineering each of these components and for optimizing the collective human-system interaction that emerges.

Models are used in human factors to focus on specific system components while abstracting away unnecessary details. For example, a workflow model of ambulatory care might allow us to understand how patients move through the clinical setting, how and when various clinicians interact with those patients, and what information is gleaned and produced as a result. This lets us focus on the patient perspective, looking at aspects such as the steps a patient follows, how long those steps take, and the challenges and risks along the way, while ignoring (for the moment) details such as the specific screen designs used during the patient flow. Many other human factors models are useful for system development, such as task models, cognitive models, situation awareness models, decision models, and human error models.

Perhaps the most important characteristic of a good model is that it allows us to make empirically falsifiable predictions. For example, changing model inputs should result in predictable and observable changes in the output. This not only enables some optimization of performance requirements during design, but also supports better analysis of test results. If users are unable to accurately complete a decision task, we can look at the visual model to better understand how information visualization may be affecting performance, or we can construct an error model to determine whether the task is overloading the user's memory or cognitive abilities.

While human factors models are important when used strictly for analysis, we can also bring them to life in the form of simulation. Simulation can be seen as a dynamic realization of an underlying static model that allows us to study the behavior of models as the inputs change, often as a function of time or events. For example, Petri nets can be used to understand how various constraints and bottlenecks (such as waiting for labs) can affect a static workflow model. When workflow assumptions change, we can see how bottlenecks shift and how overall performance changes. Simulation can also provide additional elements of realism when trying to recreate the full context of usage. For instance, computational cognitive simulations can help to understand how the combination of events and tasks can cause cognitive overload or usage errors (such as misdiagnosis).
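The Petri net idea can be sketched in a few lines of code. The example below is a toy place/transition net for the "waiting for labs" bottleneck mentioned above; the place names, transition names, and token counts are all hypothetical choices for illustration, not a model from the article.

```python
# Toy place/transition Petri net: three patients share one lab slot.
# Tokens in each place represent patients or available resources.
marking = {"waiting_room": 3, "lab_slots": 1, "in_lab": 0, "seen": 0}

transitions = {
    # name: (tokens consumed, tokens produced)
    "enter_lab":  ({"waiting_room": 1, "lab_slots": 1}, {"in_lab": 1}),
    "finish_lab": ({"in_lab": 1}, {"lab_slots": 1, "seen": 1}),
}

def enabled(name):
    """A transition is enabled when every input place holds enough tokens."""
    need, _ = transitions[name]
    return all(marking[p] >= n for p, n in need.items())

def fire(name):
    """Fire a transition: consume input tokens, produce output tokens."""
    need, prod = transitions[name]
    if not enabled(name):
        raise RuntimeError(f"{name} is not enabled")
    for p, n in need.items():
        marking[p] -= n
    for p, n in prod.items():
        marking[p] += n

# Run the net to completion: with one lab slot, patients pass through
# the lab strictly one at a time.
while any(enabled(t) for t in transitions):
    for t in transitions:
        if enabled(t):
            fire(t)
```

Changing `lab_slots` to 2 in the initial marking lets two patients occupy the lab concurrently, which is exactly the kind of "what happens when assumptions change" question the article describes asking of a workflow model.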

Modeling and simulation are powerful tools in the human factors toolbox. They can be used to answer many questions regarding design, development, deployment, and training. When used judiciously, modeling and simulation can greatly reduce costs in health informatics by helping to both define the problem space and explore the solution space.