Human Factors Quarterly Newsletter Spring 2014


The Human Factors Newsletter Spring 2014 Issue 6

Newsletter top banner image with the Department of Veterans Affairs logo and the Veterans Health Administration title. The banner includes the Office of Informatics and Analytics and Health Informatics subtitles, and is labeled Issue 06, Spring 2014. This newsletter's title is The Human Factors Quarterly.
Image of a female doctor or nurse conducting paper and electronic reconciliation of patient information. This issue's articles focus on patient safety, ranging from guides to help mitigate Electronic Health Record (EHR)-related patient safety risks to a decision support tool for secondary stroke prevention. It also takes a look at enhanced patient safety through graduate medical education. In addition, Dr. Frank Drews, Director of the Center for Human Factors and Patient Safety at the Salt Lake City VA Health Care Center, describes how his team reduced infection rates through the design of a human factors engineering-based kit. We also continue our feature on Mobile App Design and welcome your feedback via email to VHA10P2HFQ@va.gov.

- Jason J. Saleem, PhD, Editor-in-Chief.
   
Medical bag representing Information Technologies sitting inside a life preserver Supporting the Assessment of Electronic Health Records to Improve Patient Safety
Hardeep Singh, MD, Internist/Informatician, Michael E. DeBakey VA Medical Center, Houston, TX, Center for Innovations in Quality, Effectiveness & Safety, and Director, Houston Patient Safety Center of Inquiry; Michael W. Smith, PhD, Human Factors Engineer, Michael E. DeBakey VA Medical Center, Houston, TX, Center for Innovations in Quality, Effectiveness & Safety; Dean F. Sittig, PhD, Professor, University of Texas School of Biomedical Informatics

The Department of Veterans Affairs (VA) has long been a leader in the development and implementation of Health Information Technology (HIT), including EHRs. EHRs are being adopted rapidly in a variety of health care settings across the Nation. However, new and often difficult-to-detect vulnerabilities...

Read More >
   Upcoming Events
Human Factors Engineering Course at University of Michigan, July 21-26; July 28-Aug 1, 2014


2nd International Symposium on Resilient Cognitive Systems Part of Resilience Week 2014 Denver, CO, August 19-21, 2014

   Informational Links
Human Factors Engineering (HFE) within the Veterans Health Administration (VHA) Office of Informatics and Analytics, Health Informatics Division seeks to increase awareness of human factors products and services among clinical end-users and other stakeholder groups. This quarterly newsletter serves to engage these communities in shared communication and collaboration around human factors-related issues.

Access and review previous issues of this newsletter


   Editor-in-Chief:
Jason J. Saleem, PhD, Human Factors Engineering, Health Informatics, Office of Informatics and Analytics


   Managing Editor:
Christopher Petteys, MBA, Human Factors Engineering, Health Informatics, Office of Informatics and Analytics


   Editorial Board Members:
Michael W. Smith, PhD, Houston VA HSR&D Center for Innovations


Rachel Wiebe, RD, CPHQ, Health Solutions Management, Health Informatics, Office of Informatics and Analytics


Linda C. Williams, RN, MSI, VA National Center for Patient Safety


Scott D. Wood, PhD, Informatics Patient Safety, Office of Informatics and Analytics


 
Image of a typical triangular error message with an exclamation mark. What Does that Error Message Mean?
Diane Murphy, RN, Management Analyst, Scott Wood, PhD, Health System Specialist, Jean Krieg BS, MT (ASCP), Management Analyst, Informatics Patient Safety, Office of Informatics and Analytics

A Bar Code Medication Administration (BCMA) coordinator recently asked us about the following error dialog in Figure 1, received by a clinical user. The Informatics Patient Safety office has analyzed many cases where error dialog design was, at least, a contributing...

Read More >
 
Doctor Frank Drews' profile picture A Socio-technical Perspective to the Application of Human Factors Engineering:
An interview with Frank Drews, PhD

Jason J. Saleem, PhD, Human Factors Engineering, Health Informatics, Office of Informatics and Analytics

Dr. Frank Drews currently serves as the Director of the Center for Human Factors and Patient Safety at the Salt Lake City VA Health Care Center. He and his team are employing cutting edge human factors engineering theories and principles...

Read More >
 
Studying medical team wheeling a patient with a sign saying safety first Human Factors Engineering and Graduate Medical Education in Patient Safety
Joe Murphy, APR, Public Affairs Officer and Linda Williams, RN, MSI, Program Specialist, VA National Center for Patient Safety

Virtually all health care organizations prior to the 1999 publication of the Institute of Medicine's landmark report, To Err is Human, engaged in investigations of events that resulted in harm to patients. Few of these investigations...

Read More >
 
Image of a brain over a heartbeat wave or computerized heart monitor signal Iterative Usability Testing of a Decision Support Tool for Stroke Secondary Prevention
Jane A. Anderson, PhD, RN, FNP-BC, Associate Director, Stroke Center, Michael E. DeBakey VA Medical Center

The Self-management To Prevent (STOP) Stroke Tool is a point-of-care Clinical Decision Support (CDS) application that prompts providers on guideline-concordant care for stroke secondary prevention, while simultaneously facilitating patient/provider communication around patient self-management...

Read More >
 
Leaning cell phone image with a paintbrush painting the image of the screen and icons Mobile App Design Brief:
Consistency and Terminology

Donna Harrigan, BA, Management and Program Analyst, Human Factors Engineering, Office of Informatics and Analytics

The Mobile Application (app) Design "brief" series highlights best practices in action in VA for mobile User Interface (UI) design certifications. The user experience concept of "consistency" is the ...

Read More >
 


Medical bag representing Information Technologies sitting inside a life preserver Supporting the Assessment of Electronic Health Records to Improve Patient Safety

Hardeep Singh, MD, Internist/Informatician, Michael E. DeBakey VA Medical Center, Houston, TX, Center for Innovations in Quality, Effectiveness & Safety, and Director, Houston Patient Safety Center of Inquiry; Michael W. Smith, PhD, Human Factors Engineer, Michael E. DeBakey VA Medical Center, Houston, TX, Center for Innovations in Quality, Effectiveness & Safety; Dean F. Sittig, PhD, Professor, University of Texas School of Biomedical Informatics

The Department of Veterans Affairs (VA) has long been a leader in the development and implementation of Health Information Technology (HIT), including EHRs. EHRs are being adopted rapidly in a variety of health care settings across the Nation. However, new and often difficult-to-detect vulnerabilities accompany advanced EHR capabilities, many of which are revealed only after they have contributed to errors and safety concerns. Although these are especially pressing concerns for organizations that are implementing new EHR systems, even well-established VA systems are not immune to such problems. As EHRs become more complex, and the patient safety vulnerabilities more intricate, health care facilities require tools to assess and improve the safety of their EHRs. The growth of EHR adoption, especially among smaller practices, makes it especially important that these tools are applicable to a wide range of users.

With support from the Office of the National Coordinator for HIT (ONC), our teams, based at the Michael E. DeBakey VA Medical Center, Houston, TX, Health Services Research & Development Center for Innovations in Quality, Effectiveness and Safety, the Oregon Health & Science University in Portland, Oregon, and the University of Texas Health Science Center in Houston, recently developed a set of self-assessment guides to help health care organizations mitigate EHR-related patient safety risks. Clinical leaders, administrators, and other stakeholders are able to use the guides to proactively identify conditions, practices, and policies that may increase patient safety risks within their local systems. The ultimate purpose of the self-assessment guides, known as the Safety Assurance Factors for EHR Resilience (SAFER), is to enhance the organization's resilience, its ability to continuously anticipate, detect, correct, and prevent or mitigate patient safety hazards that arise from the design, implementation, operation, and maintenance of its EHR. The guides assist organizations in conducting a detailed assessment of their EHR-enabled health care systems, comparing existing practices with expert-recommended practices. In addition to prompting changes to deficient practices or software, the self-assessment process may help groups develop a shared understanding of the full capabilities and limitations of their systems, an essential condition for resilience. The SAFER guides, released in January 2014, are available at no cost to VA staff, through the ONC.

One premise of the SAFER project is that EHR-related patient safety risks originate not just from features of technologies themselves, but also from the context within which technologies are implemented and used. Thus, understanding the risks of any EHR system requires an understanding of who uses it, how the organization influences its use, and the rules that govern the organization and its users. Even system-wide use of a single common EHR can carry different risks from facility to facility, depending on factors such as the local culture, training practices, workflows, and policies. The SAFER guides are designed to take into account not only technical factors, but usability and social/organizational factors as well.

Expert opinion and prior research by both our team and others determined the content of the SAFER guides. An expert panel representing the fields of informatics, patient safety, quality improvement, risk management, and human factors engineering and usability has provided input. This combined input, as well as a thorough review of the literature, resulted in organization of the SAFER guides around the following high-risk functions or aspects of system performance:
  • Computerized Provider Order Entry (CPOE) with Clinical Decision Support
  • Clinician Communication
  • System Configuration
  • Contingency Planning
  • Organizational Responsibilities
  • Patient Identification
  • System Interfaces
  • Test Result Reporting and Follow-Up

Our teams developed one guide for each of these topics, plus an additional guide titled High Priority Practices (addressing the most important practices from all areas above), for a total of nine self-assessment guides.

Human factors principles inform many of the recommended practices. For instance, safe use of an EHR depends on enabling users to perceive important information, thus minimizing the burden on providers' memory and attention. The guides explain these general principles and offer specific examples of strategies to put them into practice. Effective communication and coordination are also critical for patient safety. Therefore, the guides address processes such as tracking the status of orders and referrals, and flagging relevant information for specific providers. Importantly, other recommended practices stress the involvement of end-users in decisions on EHR procurement, implementation, and configuration.

The core content of each guide is a checklist of 10 to 25 items that reflect recommended practices for improving the safety of EHRs. For each item, users are provided with further explanation and guidance to determine whether that practice is fully, partially, or not implemented at their facility.
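The checklist structure described above lends itself to a simple data model. The sketch below is illustrative only, not part of the SAFER guides themselves; the class names, practice wording, and status labels are assumptions made for this example.

```python
from collections import Counter
from dataclasses import dataclass
from enum import Enum

class Status(Enum):
    """The three implementation levels a facility can record for a practice."""
    FULLY = "fully implemented"
    PARTIALLY = "partially implemented"
    NOT_IMPLEMENTED = "not implemented"

@dataclass
class ChecklistItem:
    """One recommended practice from a self-assessment guide."""
    practice: str
    status: Status

def summarize(items):
    """Tally how many practices fall into each implementation level."""
    return Counter(item.status for item in items)

# Hypothetical items, loosely modeled on the guide topics listed above.
guide = [
    ChecklistItem("Order status is tracked to completion", Status.FULLY),
    ChecklistItem("Downtime procedures are rehearsed", Status.PARTIALLY),
    ChecklistItem("End-users review configuration changes", Status.NOT_IMPLEMENTED),
]

counts = summarize(guide)
print(counts[Status.FULLY], counts[Status.PARTIALLY])  # 1 1
```

A tally like this gives a facility a quick picture of where it stands on a guide before drilling into the explanation and guidance attached to each item.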

In addition to incorporating human factors principles into the content of guides, we used human factors methodologies in the development of the guides. We sought the collaboration of key stakeholders such as the American Hospital Association and The Joint Commission, as well as health care organizations and potential users.

During the development process the team visited five facilities that varied in size and in degree of EHR implementation. A variety of users including Information Technology personnel, risk managers, clinical leaders, and others participated in cognitive interviews while reviewing the checklists. The findings from these interviews enabled us to check users' interpretations and modify checklist items to maximize comprehension and usefulness across a range of possible users and contexts.

In addition, the team observed processes and behaviors of potential users in order to gain more insight into how the guides could be applied to improve EHR-related patient safety practices. The information gathered from observations was then used to refine the content of the SAFER guides. For instance, recommended practices were modified to read more generically, rather than contain specific values, thresholds, or times. This allowed the guides to apply in a broader range of situations across organizations of varying sizes and stages of EHR implementation and use.

After the first revisions, a second crucial step was to test the guides at five additional sites by soliciting further feedback from potential users. One goal of this process was to ensure that the checklist items were not only easy to interpret conceptually, but also capable of being operationalized and measured in a consistent fashion. One outcome of this evaluation was the inclusion of planning worksheets in the final versions of each guide. These worksheets provide a rationale and practical guidance to organizations that are planning to implement recommended practices.

The strength of the SAFER guides is based on the extensive input from both subject matter experts and various potential end users. These collaborations helped ensure the final versions of the guides are relevant to many different real-world health care settings. Within VA health care facilities and elsewhere, our teams hope that the SAFER guides will stimulate a rapid move toward best practices for personnel training, patient care processes and workflows, local policy refinement, and use of the EHR as a tool for improving patient safety.




Image of a typical triangular error message with an exclamation mark. What Does that Error Message Mean?

Diane Murphy, RN, Management Analyst, Scott Wood, PhD, Health System Specialist, Jean Krieg BS, MT (ASCP), Management Analyst, Informatics Patient Safety, Office of Informatics and Analytics

A Bar Code Medication Administration (BCMA) coordinator recently asked us about the error dialog in Figure 1, received by a clinical user. The Informatics Patient Safety office has analyzed many cases where error dialog design was, at least, a contributing factor in adverse events. We also receive many requests from development teams to improve the effectiveness and usability of error dialogs. Poor dialog design can result in confusion, frustration, poor performance, and human error for clinical users, which could have negative implications for patient safety. This article introduces key design issues using error dialog examples and presents guidelines for good dialog design.

Figure 1: Computer-generated error message that reads "Incomplete data returned from system," with an OK button.

The end-user, in this case, was a floor nurse attempting to administer intravenous medications using BCMA. When she clicked on the Intravenous therapy (IV) tab in the Computerized Patient Record System (CPRS), the dialog box appeared. The floor nurse will likely have questions about the dialog message, such as:
  • What did I do wrong?
  • Why is this message being displayed?
  • Is there a fast way to fix it?
  • What happens if I click on ‘OK’?
  • How do I finish administering meds to my patient?
  • Who should I ask for help?

Error dialogs and user feedback that are unclear or confusing are a frequent source of usability and patient safety issues within VA's HIT. Effective error dialogs include three essential elements: what happened, why it happened, and how to fix it.
  1. What happened — Effective error dialogs clearly tell the user what happened in terms of their current task. Our example error dialog in Figure 1 indicates that something negative occurred, but it doesn't say what data is missing. Because the dialog doesn't tell what went wrong, the only recourse is to accept the message and figure out how to finish the task. A better approach is to convey error messages in terms of user actions and in language better understood by the user.
  2. Why it happened — Effective error dialogs explain why an error occurred in the context of the user's activity. The explanation should indicate which step(s) in the process went wrong and if it was a user action that caused the error. The BCMA example doesn't convey this information. Alerts and errors with explanations also help take the mystery out of system operation, providing users with a better sense of control over their situation.
  3. How to fix it — Ultimately, effective error dialogs must tell the user how to correct the problem and resume their primary clinical task. The BCMA error dialog lacks this next-step information: whether the current task can be completed and whether the user should contact someone for help.

Conveying these three aspects concisely and unambiguously enables the user to quickly understand the nature of the problem, correct it, and resume their work with minimal disruption.

Figure 2: Computer-generated error message from ePrescribing.

The error dialog in Figure 2, from ePrescribing, implements each of the three core principles:
  1. The first sentence explains that the order could not be completed.
  2. The second sentence tells the user why the order could not be completed.
  3. The last two lines explain the corrective-action options available to the user.

Designing effective error dialogs doesn't have to be complicated. As long as the text in the error dialog addresses the fundamental elements of what happened, why, and how to fix it, users will be able to quickly address the core problem and continue with their work. Effective error dialogs are an important part of developing safe, usable health care systems. Download the full report to learn more.
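The three-element structure can be sketched as a small helper that assembles dialog text. This is an illustrative example, not code from any VA system; the function name and the message wording are assumptions invented for this sketch.

```python
def compose_error_dialog(what: str, why: str, fix: str) -> str:
    """Build error dialog text covering the three essential elements:
    what happened, why it happened, and how to fix it."""
    return f"{what}\n\nWhy this happened: {why}\n\nWhat you can do: {fix}"

# A hypothetical rewrite of the Figure 1 message in the user's terms.
message = compose_error_dialog(
    what="The IV medication list could not be displayed.",
    why="The pharmacy system did not return complete order data.",
    fix="Close this message and reopen the IV tab. If the problem "
        "persists, contact your BCMA coordinator before proceeding.",
)
print(message)
```

Compared with "Incomplete data returned from system," a message structured this way answers the floor nurse's likely questions directly: what went wrong, why, and what to do next.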

Of course, eliminating the need for an error dialog is the best possible solution. Error dialogs interrupt the user's primary tasks and train of thought, increase cognitive workload, require working memory, and increase the likelihood of other user errors. For some situations, such as network or other system errors, these types of dialogs are unavoidable. In other cases, the need for an error dialog can be eliminated, or designed out of the user interaction. Reducing the likelihood of an error occurring, and thus the appearance of the error dialog, can often be achieved through improved, error-tolerant design of the clinical information system.



Doctor Frank Drews' profile picture A Socio-technical Perspective to the Application of Human Factors Engineering: An interview with Frank Drews, PhD

Jason J. Saleem, PhD, Human Factors Engineering, Health Informatics, Office of Informatics and Analytics

Dr. Frank Drews currently serves as the Director of the Center for Human Factors and Patient Safety at the Salt Lake City VA Health Care Center. He and his team are employing cutting-edge human factors engineering theories and principles that are revolutionizing the quality and safety of health care delivery throughout VA's health care system. Dr. Drews also serves as Co-Director, Salt Lake City Center of Innovation, Informatics, Decision Enhancement, and Surveillance Center (IDEAS 2.0). Additionally, Dr. Drews is an Associate Professor at the University of Utah in the Psychology Department and an adjunct professor in the departments of Bio-Medical Informatics, Internal Medicine, Anesthesiology and Education Psychology.

What is your role in the Veterans Health Administration (VHA)?
As Director of the Center for Human Factors in Patient Safety, my main focus is to spearhead new and innovative research to improve the safety of the delivery of health care to our Veterans. We try to identify challenges and limitations that are associated with current technologies. For example, many technology implementations in the past only focused on improving the technology, following the idea that better technology means better provider performance. However, there is more and more evidence that it is not as simple or easy as that. What we are doing is exploring the impact of these new technologies on the socio-technical system, for example, how people interact with each other after we implement a new EHR system. One concern is that such a new system may not facilitate clinician interaction with negative consequences for patient safety. Our focus is to look at the system as a whole rather than as isolated pieces. Previous work, including some work coming out of the IDEAS Center, indicates that if we ignore the socio-technical interactions that are part of the delivery of health care, we are likely to implement technology that will not meet our expectations, and certainly not improve patient safety.


How were you introduced to the concept of human factors?
I was introduced to the concept of human factors about 25 years ago as a student at the Technical University of Berlin, Germany. Back then, I had a strong interest in cognitive psychology and organizational psychology. When I came across the field of human factors, it was very appealing to me because my previous work in psycholinguistics and decision-making had no direct impact on people. I found the work in human factors extremely interesting because you could immediately apply theories, test them in everyday settings, and then refine them. In addition, if your theories were correct, you had an immediate impact on working conditions. Thus, you affected people's lives by making things better for them. So, I really loved and continue to love the close interaction between theory that aims at understanding more about human cognitive processes, and application of this theory with direct impact on how people perform in real life.


How did you become involved with VHA and how do you use/employ human factors in your work?
In the early 2000s, I was conducting research in the Psychology Department and the Department of Anesthesiology at the University of Utah. At that time, we were focusing on display design with the goal to improve the performance of anesthesiologists while monitoring patients during surgery. The work caught the attention of several people at the VA Medical Center here in Salt Lake City, especially Drs. Matthew Samore, Jonathan Nebeker, and Charlene Weir. In 2005, they invited me to give a presentation. During the discussion, it became clear that the group was very excited about the idea of applying cognitive psychology and human factors to advance research and practice in VA. To make a long story short, they offered an opportunity to work closely with them on several projects. Initially, we conducted research in infection control, communications, display design, and medication reconciliation. The overall impact of our collaboration demonstrated that human factors research could make a significant contribution to advance patient safety in VA. After a while it became clear that to allow me to conduct my own independent VA research, I needed to be employed by VA. To make that possible I became a United States citizen and joined the group at VA. In 2009, I applied for a grant from the National Center for Patient Safety (NCPS) to found the Center for Human Factors in Patient Safety. We were very fortunate that NCPS decided that our work had a lot of promise and decided to fund our Center.

An example of how we employ human factors is our work in the area of procedural adherence. This work tries to understand why clinicians do not follow standard best practices and what can be done to make it easier for them to follow those best practices. Our particular focus is on central line insertion and maintenance, both of them contributing factors to central line associated bloodstream infections. The result of our human factors work is that we developed a better way of guiding nurses through the procedure by providing them with human factors engineering based kits that make it easy for them to follow procedures during central line maintenance. As a result of the use of these kits, infection rates are significantly reduced, and consequently, lives of Veterans are being saved.


What human factors-based project do you think made the biggest impact in VHA? Why?
The Adherence Engineering Project has been very successful in reducing central line associated bloodstream infections and increasing procedural adherence. However, its impact has been more local in nature. Our desire is to grow the scope of the project to a regional, and even a national level. Another project that had an impact in the VHA was in the area of medication reconciliation with Dr. Nebeker. Here the idea was to develop a better display to provide patient information to physicians.


Where do you see the greatest potential for the application of human factors' principles in VHA?
The area of greatest potential where human factors can impact the quality of health care provided to Veterans is the area of EHR systems. There is a general sentiment in VA that the current system is inadequate and that we need a better way of providing patient information to providers. Patient records are currently fragmented, and it is time consuming to review those records in order to extract important information. I also believe that the focus on improving procedural adherence has great potential; getting clinicians to follow best practices rather than having them do what they think is best is critical for safe and effective delivery of care. Other areas that I would consider important include facilitating communication not only between providers, but also with patients. In addition, we have to get better at managing patient expectations more effectively; this includes describing issues and challenges associated with the delivery of care to give patients a better understanding of what to expect when receiving care. Overall, I think there is almost no area where human factors cannot have an impact on health care delivery within VA.


Have you seen changes with respect to the interest in or awareness of human factors in VA? What are they?
Yes, I am observing significant change with respect to human factors awareness in VA since I started to work in VA about ten years ago. At that time, there was very little interest in human factors. However, over the past five years, I have seen not only an increase in interest in human factors, but also an increase in human factors engineers being employed in VA. Today, more people than ever are talking about human factors in VA. In addition, there is a lot of sharing of ideas of how human factors can support operations and improve clinician performance.

One indicator of more interest is that people want to know about human factors. For example, a few years ago, we developed a short course in human factors that we successfully presented several times at different VA medical centers. Finally, more researchers from VA are attending human factors conferences. To me there is a noticeable shift that indicates that we are at a tipping point with regard to the status of human factors within VHA. I am convinced that in the near future more and more people will see human factors as an important component. Human factors engineers will help identify and develop the solutions that will reduce the cost and improve the quality of health care delivery for Veterans.





Studying medical team wheeling a patient with a sign saying safety first Human Factors Engineering and Graduate Medical Education in Patient Safety

Joe Murphy, APR, Public Affairs Officer and Linda Williams, RN, MSI, Program Specialist, VA National Center for Patient Safety

Virtually all health care organizations engaged in investigations of events that resulted in harm to patients prior to the 1999 publication of the Institute of Medicine's landmark report, To Err is Human. Few of these investigations, however, took a systems-based approach to problem solving. Traditionally, the focus had been on individuals and their mistakes, rather than on system-level vulnerabilities and events that combine in an unfortunate sequence to cause an incident. Rooted in a name-and-blame culture, the emphasis of such investigations was not on prevention, but on individual correction or discipline. Since its establishment in 1999, the goal of VA's NCPS, the reduction and prevention of inadvertent harm to patients as a result of their care, has remained unchanged and is reflected in all its programs. By shifting the emphasis from eliminating errors and blaming individuals to reducing or eliminating harm to patients through investigating system-level vulnerabilities, much has been accomplished in VA. Reducing or eliminating harm to patients is the real key to patient safety.

Two educational initiatives illustrate efforts to spread the NCPS approach more broadly outside the boundaries of VA's health care organization: faculty development workshops for residency programs and a core curriculum for the Chief Residents in Quality and Safety (CRQS). NCPS has offered workshops for faculty development in teaching patient safety in graduate medical education since 2003. More than half of the Nation's residents train in VA health care facilities; therefore, NCPS recognizes residents as a crucial resource, the front line where change is possible. Working with faculty enables the patient safety curriculum to reach more physicians-in-training. Faculty development workshops include an introduction to human factors engineering and hands-on methods for teaching. Workshop participants gain the ability to teach using hands-on, interactive methods, to perform basic human factors engineering evaluations of health care devices, medications, and architecture in an operational environment, and to recognize when a professional human factors engineering consult is indicated for change within the complex health care system. Even if only the most basic concepts are acquired, requesting evidence of usability testing prior to purchasing decisions may advance safety efforts.

A unique program for board eligible physicians offers the opportunity to spend a year as a CRQS. The program encourages learning and teaching combined with problem solving project work. The program benefits the CRQS with unique additions to his or her resume and portfolio. It also benefits the residency-training program with the ability to meet recently established milestones and in preparation for Accreditation Council for Graduate Medical Education's (ACGME) Clinical Learning Environment Review (CLER) visits. The ACGME established the CLER program to assess the graduate medical education learning environment of a sponsoring institution and its participating sites. CLER emphasizes quality and safety of the environment for learning and patient care, at the sponsoring institution.

Human factors engineering is at the core of teaching patient safety in graduate medical education. An individual CRQS may be nearly powerless to effect change in the system, especially where a name-and-blame culture persists. However, the use of published case studies and close calls allows rigorous human factors engineering evaluation without fear. The insights revealed by an engineering evaluation will pave the way for future trust in a systems approach. This ability to learn from close calls is significant because they occur at a much higher frequency than actual adverse events. Addressing problems in this way not only results in safer systems, but also focuses everyone's efforts on continually identifying potential systems-based problems and fixing them.

Image: People sitting at a conference during CRQS orientation.

One of the most encouraging impacts of this program is a change in the way physicians see their work environment: knowing that design can contribute to intuitive, correct use, fail to inform the user, or even mislead to the point of harming the patient is transformative. If NCPS existed only to perform patient safety problem solving, change would occur slowly and would be based primarily on retrospective analysis. The investment in educational initiatives changes clinicians' perspective and arms them with the evidence to demand better design and to address problems preemptively.




Image of a brain over a heartbeat wave or computerized heart monitor signal Iterative Usability Testing of a Decision Support Tool for Stroke Secondary Prevention

Jane A. Anderson, PhD, RN, FNP-BC, Associate Director, Stroke Center, Michael E. DeBakey VA Medical Center

The Self-management To Prevent (STOP) Stroke Tool is a point-of-care Clinical Decision Support (CDS) application that prompts providers on guideline-concordant care for stroke secondary prevention while simultaneously facilitating patient/provider communication around patient self-management of stroke risk factors (Anderson et al., 2013). To date, development of the STOP Stroke Tool has been completed as part of two VA Health Services Research and Development funded research projects. Our study team used an integrated model for usability engineering to guide the development process, based on iterative testing/design cycles that incorporate end-user feedback (Anderson, 2010). We engaged end users early to evaluate both the functionality and usability of the tool.

During initial development, testing focused on the functionality and overall usability of the tool. We assessed functionality by comparing documentation of guideline-concordant care among a sample of multidisciplinary providers using test case scenarios and two documentation systems (a standard Computerized Patient Record System (CPRS) generated note vs. the STOP Stroke Tool template). Overall usability was evaluated using the Evidence Adaptive Clinical Decision Support Usability Questionnaire and open-ended questions that targeted end users' perceptions of the access, usability, and usefulness of the tool. Results showed that the template-based tool prompted a significant increase in providers' documentation for six of 11 clinical practice guidelines compared with baseline documentation using the standard CPRS note. Usability, scored out of a possible 56 points, was rated acceptable.

Magnified image of a computer-generated reminder message box displaying an open-ended question that revealed multiple usability barriers. Initial testing showed that guideline prompting and documentation could be successfully engineered using a CPRS reminder dialog template with an acceptable level of usability. However, findings from the open-ended questions revealed multiple usability barriers to acceptance and use of the tool in clinical practice (Figure 4). For example, current CPRS functionality does not allow simultaneous access to reminder dialog templates and other information in CPRS. Checkbox and drop-down screens within the reminder dialog template were found to be burdensome, particularly for patients with multiple stroke risk factors. Lack of clarity on how to access embedded guidelines, patient education materials, and patient action plans within the template was also noted as a usability barrier.

The objective of the next phase of development was to identify redesign strategies to address the usability barriers. A web-based design was explored as an alternative to the reminder dialog template. Design/testing cycles were structured to obtain end-user input for creating the web-based prototype. This was accomplished using a visualization session and simulated clinic visits. A video provided a guided visualization of screen interaction and navigation, illustrating how a web-based STOP Stroke Tool was envisioned for use in clinical practice. To facilitate participation in the visualization session by twelve multidisciplinary clinicians, the video was uploaded to the Stroke Quality Enhancement Research Initiative website. The feedback evaluation form included five open-ended questions to elicit clinicians' perceptions of barriers and facilitators to using the web-based tool in clinical practice and to supporting patient/provider communication around patient self-management of stroke risk factors.

Web-based prototype screen shot. Facilitator themes of the web-based design for patient/provider communication related primarily to hyperlinks to patient-education materials and action-planning/goal-setting options. Barriers were less related to the design and more to the constraints of available technology in the current practice setting (i.e., desktop computers vs. tablet computers or pads). These findings were applied to develop a web-based prototype for further usability testing with clinicians in a simulated clinic visit.

Usability testing of the web-based STOP Stroke Tool prototype (Figure 5) was completed using a Think-Aloud protocol as providers interacted with a patient actor and the prototype during a simulated clinic visit. The audio recordings of the sessions were transcribed verbatim and analyzed using directed content analysis to identify facilitators of and barriers to the accessibility, usability, and usefulness of the web-based tool and its use for patient/provider communication. The overall facilitator themes of the web-based tool included comprehensive information, color coding, and systematic structure. The tool was found to be useful for patient accountability, reminders of best practice, goal-focused care, and enhanced communication. Barriers included lack of interoperability, the time required to use the tool, the training needed to learn it, and concerns about computer-centric care.

The involvement of end users in the development and testing of CDS tools is critical for successful implementation in practice (Sintchenko et al., 2004). Results from usability testing not only provided valuable feedback for further development of the STOP Stroke Tool, but also provided insight regarding the interface between information technology, self-care management, and the human dimension of medical encounters. An encouraging finding from usability testing thus far is that the STOP Stroke Tool is perceived as helpful for patient/provider communication about stroke risk-factor management. More research is needed to determine whether using the STOP Stroke Tool during patient encounters improves patient/provider communication, patient self-management, shared decision-making, and collaborative goal setting for managing stroke risk factors.




Spotlight On Mobile Apps Design Title Banner
Leaning cell phone image with a paintbrush painting the image of the screen and icons Mobile App Design Brief: Consistency and Terminology

Donna Harrigan, BA, Management and Program Analyst, Human Factors Engineering, Office of Informatics and Analytics

The Mobile Application (app) Design "brief" series highlights best practices in action in VA for mobile User Interface (UI) design certifications. The user experience concept of "consistency" is the focus of this issue.
Image of the Mindful Breathing application running on an iPhone

Do you play chess? If you do, you know that the names of the pieces and their directional moves are constant from game to game. For example, the piece that looks like a horse is called a "knight" and always moves in an 'L' direction; this consistency meets the player's expectation.

Consistency is also an important principle for UI design. A well-designed application has a steady use of key language, data representation and layout.

The Nielsen Norman Group, highly respected in the field of human factors, created the 10 Usability Heuristics for User Interface Design (1995), which the industry broadly accepts. Image of the Mindful Breathing application displaying a sample iPhone screen with navigation functionality. Regarding "consistency," Nielsen states, "users should not have to wonder whether different words, situations, or actions mean the same thing." For a positive user experience, terminology should be uniform within an app, as well as across new versions of the app. The terms used and their meanings should not change.

The images in Figures 6 and 7 are taken from VA's Mindfulness app and demonstrate the use of consistent terminology for a selection choice and the page title. When a user taps the first selection choice, "Mindful Breathing," the corresponding page displays the exact same term as its title. If the resulting page were instead titled "Awareness Breathing," or simply "Breathing," the inconsistency could cause the user to pause and consider what is different and whether they made an incorrect selection. Consistent use of language allows the user to keep thinking about the breathing exercise at hand rather than being distracted by the functionality of the app.
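This kind of terminology check can even be automated during development. The sketch below is a minimal, hypothetical illustration (the screen names and the mapping are invented for the example, not taken from the actual Mindfulness app) of flagging menu labels whose destination page titles do not use the same term:

```python
# Hypothetical sketch of a terminology-consistency check, in the spirit
# of Nielsen's consistency heuristic. The screen map is illustrative only.

def find_inconsistent_labels(navigation: dict) -> list:
    """Return menu labels whose destination page title differs from the label."""
    return [label for label, page_title in navigation.items()
            if label != page_title]

# Example map of {menu label: title of the page that label opens}.
screens = {
    "Mindful Breathing": "Mindful Breathing",  # consistent: same term reused
    "Body Scan": "Scan Your Body",             # inconsistent: wording changed
}

print(find_inconsistent_labels(screens))  # ['Body Scan']
```

A check like this could run as part of a UI review or automated test suite, catching wording drift between releases before it reaches users.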