Developing a successful simulation-based curriculum
Simucase Education Division
Posted August 7, 2014
Clint Johnson, MA, CCC-SLP, Leigha J. Jansen, EdD, CCC-A, Stacy L. Williams, PhD, Brenda Pantalone, MEd, Katie Ondo, MA, CCC-SLP
Introduction
The success of Simucase is determined by a well-designed teaching plan founded on strong instructional guidance. Course planning should begin with an examination of current learning goals and objectives, and a review of Simucase to decide how the simulation program can be incorporated effectively as a supplemental learning tool. The implementation of the simulation program will vary based on the course and the academic level of the students. For example, the simulation program will be used differently in Intro to Communication Disorders, Clinical Methods I, and Adult Neurogenics. The utilization of Simucase must support the learning goals and objectives of the course, as well as the individual learning activities and assessment methods.
Once an instructional plan has been established, instructors need to communicate clear and attainable expectations to students regarding how the simulation program will be integrated into the curriculum. In simulation-based education, this guided interaction between teacher and student(s) is referred to as prebriefing; the activity includes a thorough review of Simucase and stresses the need for deliberate and repetitive practice (Issenberg, McGaghie, Petrusa, Gordon, & Scalese, 2005; McGaghie, Siddall, Mazmanian, & Myers, 2009). While students are actively using the program, instructors must continue to monitor their students, communicate openly, answer questions, and provide feedback at established times.
At the end of the simulation, students should always be provided with a structured opportunity to review their experiences, ask questions, identify strengths and weaknesses, and plan for both upcoming simulations and clinical encounters. This debriefing process is the most critical element of simulation-based education (Issenberg et al., 2005; McGaghie et al., 2009). The final component of the implementation process involves evaluating outcomes. Students should be asked to provide feedback (qualitative and/or quantitative) about their simulation experiences, since this evaluation ensures that both Simucase and the instructional practices continue to improve in subsequent courses.
Course Planning
Simulation-based “experiences are critical in developing a clinician who can move from simply absorbing knowledge into one who can apply that knowledge effectively to assess and treat patients” (Tharpe & Rokuson, 2010). Simucase may be implemented in a majority of undergraduate and graduate courses in communication sciences and disorders; however, the cases will be used differently depending on the learning goals and objectives of the course and the academic level of the students. For example, there are multiple ways to weave the case of Kara Lynn (age 3;6, phonology) into a course. In an undergraduate phonetics class, Kara Lynn’s speech samples may be used in a basic transcription activity. Later on, the case may be re-introduced in a graduate-level diagnostics class so that students can perform a complete evaluation, or in an articulation and phonology disorders class to discuss intervention approaches. The key to maximizing the potential of Simucase is to design the simulation around existing learning goals and objectives. This is true of all simulation-based education and requires a working knowledge of the technology (Issenberg et al., 2005).
After reviewing the goals and objectives of a course, the next step is for instructors to become more familiar with Simucase, the scoring system, and the various ways that it can be incorporated into a course. A simple way to review the features of the program and understand the scoring system is described below:
1. Review the faculty guide.
2. Choose a virtual patient.
3. Print the answer key for that patient from the appendices of the faculty guide.
4. Follow the answer key and enter all the correct answers into the program.
5. Submit the report and review the results.
This exercise is beneficial because it helps instructors understand the decision-making process that went into the creation and review of each case. This understanding can be very helpful when choosing goals and objectives, explaining the nuances of a case to students, discussing expectations during prebriefing, and answering questions during the debriefing process.
The next consideration in course planning involves choosing a simulation mode: Learning Mode or Assessment Mode (See Figure 1). In Learning Mode, students can monitor their progress and receive feedback for correct and incorrect responses (See Figure 2). Learning Mode is recommended for novice clinicians or for more advanced clinicians who have had limited exposure to a specific population. In Assessment Mode, students do not receive any feedback until they submit their case studies. This mode is appropriate for more advanced students with experience using the Simucase technology, or it can be used after students have gone through a specific case multiple times in Learning Mode. In summary, the simulation mode should be congruent with the targeted goals and objectives and the academic level of the students.
Figure 1. Choosing a Simulation Mode: Learning or Assessment

Figure 2. Monitoring Progress in Learning Mode.

The final component of course planning is to decide if Simucase will be used with an entire class of students, with small groups of students, or individually. In each of these learning environments, learners benefit from faculty support and feedback. As a discussion tool, Simucase can be used to set expectations for learning about all the topics and issues pertaining to a specific case. While this is ideal for group discussion and collaboration, it limits the amount of individual contribution to the assessment process.
Simucase can also be used by small groups of students. In this way, students learn to collaborate and to share their individual experiences and ideas about a case. While students have additional support from their classroom peers, they are still challenged to work through a case and share their outcomes with other learners. Student groups can compete for “high scores” and share their successes and challenges. Faculty should be aware when forming groups that students with practicum experience may influence overall group performance.

Finally, Simucase can be used individually. Cases can be utilized to provide practice with new concepts, as well as to provide remediation for learners struggling with specific ideas or skills. Students should be encouraged to work through a simulated case multiple times, restarting the assessment process as many times as needed. When they are satisfied with their scores, students may export their reports to their instructors.
Prebriefing
The prebriefing process sets the stage for the simulation experience. Many communication sciences and disorders students have limited experience with simulation-based education (Hadley & Fulcomer, 2010); therefore, it is important to explain the purpose of Simucase and how simulations can help to improve their clinical competency and confidence. Instructors are responsible for reviewing the goals and learning objectives of the course, explaining how Simucase will be used to meet these objectives, stressing the need for deliberate practice, and providing overviews of assignments and assessment methods. Instructors should be mindful that simulations are an immersive learning experience and can subsequently produce emotional responses in students such as frustration, hope, anxiety, and fear (Williams & Schreiber, 2010). These emotional responses have been shown to increase retention during the learning process (Fanning & Gaba, 2007). However, it is paramount to instill an atmosphere of openness and trust so that students feel comfortable expressing themselves and asking questions before, during, and after the simulation is completed (Cantrell, 2008; Fanning & Gaba, 2007).
The final step of prebriefing, and perhaps the most critical, is to review the technology with the students so that they have a firm grasp of the simulation modes and how to navigate the program. Next, explain to students that scoring is based on the strength of the decisions they make within each section of a case. In general, students earn points for strong, reflective decisions and lose points for weak or poor decisions (rejected decisions). There are also decisions that are judged as acceptable, which result in no points awarded or deducted. The points earned in each section are cumulative and determine the overall competency level of the student. The following scale is utilized to assign a competency measure:
90% or higher overall score = Mastering
70–89% overall score = Developing
Below 70% overall score = Emerging
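The cumulative scoring and the competency scale above can be sketched in a few lines of code. The tier names and percentage thresholds come directly from the scale; the per-section point totals in the usage example are hypothetical, and Simucase's actual point values per decision are internal to the program.

```python
def overall_percent(section_points, section_maxima):
    """Cumulative percentage across case sections.

    Assumes each section contributes its earned points toward a
    section maximum; the overall score is the cumulative ratio.
    """
    return 100.0 * sum(section_points) / sum(section_maxima)


def competency_level(percent_score):
    """Map an overall percentage score to the Simucase competency tier
    described in the article's scale."""
    if percent_score >= 90:
        return "Mastering"
    elif percent_score >= 70:
        return "Developing"
    else:
        return "Emerging"


# Hypothetical example: a student earns 8/10 and 9/10 across two sections.
score = overall_percent([8, 9], [10, 10])   # 85.0
level = competency_level(score)             # "Developing"
```

This is only an illustrative model of the rubric, but it makes the boundary cases explicit: a score of exactly 90% counts as Mastering, and exactly 70% as Developing.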
Students should also understand that they will be required to ask relevant follow-up questions while collecting a case history and in the collaborators’ section. Answers to these questions are based on keywords that trigger a response. Simucase houses a database filled with potential questions for all of the cases. Sometimes a question needs to be rephrased in order for it to trigger a response. This also happens in real life during an assessment, so it is a good skill for students to learn. The keywords are continually monitored and updated so that the same question can be asked in a variety of ways.
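The keyword-trigger mechanism described above can be illustrated with a toy sketch. Simucase's actual question database and matching logic are internal to the program; the keyword sets, canned responses, and fallback message below are entirely hypothetical and show only the general idea that differently phrased questions can trigger the same response.

```python
# Hypothetical keyword-to-response table; a real case would have many
# entries, each mapping a set of trigger keywords to one recorded answer.
RESPONSES = [
    ({"first words", "babbling", "milestones"},
     "a response about speech and language milestones"),
    ({"hearing", "ear infections"},
     "a response about hearing history"),
]


def answer(question):
    """Return the first response whose keywords appear in the question,
    or a fallback prompting the student to rephrase."""
    q = question.lower()
    for keywords, response in RESPONSES:
        if any(keyword in q for keyword in keywords):
            return response
    return "No response triggered; try rephrasing the question."
```

Under this model, “When did she say her first words?” and “Tell me about her speech milestones” both trigger the same milestones response, while an unmatched question returns the fallback, mirroring the rephrasing skill the article describes.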
Provide Feedback and Encourage Practice
As students work on a case, they will need supportive feedback and strong guidance from their instructor to succeed. Kirschner, Sweller, and Clark (2006) concluded that educational approaches that provide extensive guidance to students are more effective and efficient than approaches that are minimally guided. If Simucase is being used in Learning Mode, have students complete one section of the evaluation at a time or work through the case to the best of their ability, and then bring their questions to the following class. Allow students to briefly discuss their scores with each other and then facilitate a roundtable discussion, answering questions as necessary and providing constructive feedback.
Continue to encourage students to practice a case as many times as they like. Williams and Schreiber (2010) have shown that university students who completed a case multiple times or restarted a case when mistakes were made scored significantly higher compared to students who only completed a case one or two times (r = .60, n = 16, p = .01). Furthermore, students who spent several days working on a case using concentrated “thought time” after completing an online simulation scored significantly higher when compared to students who completed a simulation in one sitting (r = .52, n = 16, p = .05). Issenberg et al. (2005) and McGaghie et al. (2009) support the notion of deliberate practice and include it in their list of factors that make simulations effective.
Debriefing
Debriefing is the most significant feature of simulation-based education (Fanning & Gaba, 2007; Issenberg et al., 2005). The debriefing process allows the participants to ask questions, vent frustrations, review strengths and weaknesses, and ultimately gain a broader understanding of the simulation experience (Cantrell, 2008). Lederman (1992) identified seven common structural elements in the debriefing process:
1. Debriefer
2. Participants to Debrief
3. An Experience (the simulation)
4. The Impact of the Experience (the simulation)
5. Recollection
6. Report
7. Time
Skilled debriefers or “facilitators tend to position themselves not as authorities or experts, but rather as co-learners” (Fanning & Gaba, 2007, p. 118). Participants in the debriefing process consider the experience of the simulation and its emotional impact. In some cases, participants may be encouraged to debrief each other. Participants should be asked to recall the event verbally and perhaps submit a written summary of the experience. The final component involves time. Most debriefings will occur immediately following the simulation, but others may happen over a longer period of time, allowing students to fully digest the experience. Some instructors have encouraged students to keep a journal over the course of the semester and submit it as a final project (Fanning & Gaba, 2007).
Evaluate Outcomes
After the simulation is concluded, students should be encouraged to share their thoughts through journals, reports, and surveys. Students may also wish to review their scores on specific cases, analyze any improvements, or discuss why their scores may or may not have increased. From past reports, most students begin to recognize that their skills are developing, that repeated practice improves their decision-making and confidence, that simulations provide a safe environment for learning from mistakes, and that collaboration is critical to a successful evaluation (Jansen, 2014).
Conclusion
Simucase is a revolutionary technology that has the potential to improve the clinical skills of students in communication sciences and disorders; however, like any educational technology, it cannot stand alone. Simucase depends upon solid curriculum planning and strong instructional guidance. The plan begins with a careful examination of the technology and a consideration of the ways that it can enhance the learning goals and objectives within the curriculum. Simucase relies on instructors to implement the technology successfully by setting clear, attainable expectations, providing feedback throughout the simulation, and encouraging students to reflect on the experience at its conclusion.
References
Cantrell, M.A. (2008). The importance of debriefing in clinical simulations. Clinical Simulation in Nursing, 4, 19-23.
Fanning, R.M., & Gaba D.M. (2007). The role of debriefing in simulation-based learning. Society for Simulation in Healthcare, 2(2), 115-125.
Hadley, A., & Fulcomer, M. (2010). Models of instruction used in speech language pathology graduate programs. Communication Disorders Quarterly, 32(1), 3-12.
Issenberg, S.B., McGaghie, W.C., Petrusa, E.R., Gordon, D.L., & Scalese, R.J. (2005). Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Medical Teacher, 27(1), 10-28.
Jansen, L. (2014). The evaluation of computer-based simulated case studies in speech-language pathology education [Doctoral dissertation, Nova Southeastern University].
Kirschner, P.A., Sweller, J., & Clark, R.E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75-86.
Lederman, L.C. (1992). Debriefing: Toward a systematic assessment of theory and practice. Simulation & Gaming, 2, 145-159.
McGaghie, W.C., Siddall, V.J., Mazmanian, P.E., & Myers, J. (2009). Lessons for continuing medical education from simulation research in undergraduate and graduate medical education. CME: ACCP Evidenced Based Educational Guidelines, 135, 62-68.
Tharpe, A. M., & Rokuson, J. M. (2010, August). Simulated patients enhance clinical education: Vanderbilt offers unique program for audiology students. The ASHA Leader. Retrieved from www.asha.org.
Williams, S., & Schreiber, L. (2010, April). Simucase: Interactive case studies for student assessment. Presentation at the annual conference of the Council of Academic Programs in Communication Sciences and Disorders, Austin, TX.
Williams, S., & Schreiber, L.R. (2010). Beyond the big screen: Avatars prepare graduate students for real-world practice. SIG 16 Perspectives on School-Based Issues, 11(2), 50-55. doi:10.1044/sbi11.2.50