Prebriefing and Debriefing Virtual Simulation: An Exploratory Study of Current Practices in Nursing Programs
Presented by: Donna Badowski DNP, RN, CNE, CHSE; Elizabeth Wells-Beede PhD, RN, C-EFM, CHSE, CNE
Uses of virtual simulation pre-pandemic reported by the audience: replacing in-home family visits (that was me) and use in other assessment courses.
This study looked at prebriefing and debriefing for virtual simulation: whether it was done, whether the INACSL standards were used, and how debriefing was conducted. Their results are interesting, and I think the details are best reviewed by looking up their publication. Many of their participants were trained in prebriefing and debriefing, but I wonder how representative that is. I know many people who used virtual simulation without any training (and who don't know about INACSL or the certifications this study measured). They found that INACSL standards were followed for the most part regarding orientation and prebriefing, and that only 10% of participants lacked knowledge or background in debriefing. Participants used multiple theoretical frameworks, learning outcomes were addressed in debrief, and most people debriefed within 5 hours of the simulation. The sample was small and included mostly full-time faculty. They did not look at the context of the virtual simulation, and they want to explore student perceptions next. I asked about trying to sample people without the certifications, and they liked that idea. 🙂
Using an Innovative, Peer-Review Process to Validate Simulation Scenario Quality
Presented by: Jessica Manning MSN, RN, CNE; Leigh Swatscheno-Dunning MSN, RN, MEDSURG-BC; Abbey Elliott DNP, MHA, RN, CEN, CHSE
There is a need for peer review in order to produce quality simulation — not everyone with content expertise also knows how to create a simulation. This group wanted to create a process improvement procedure to promote quality. They made 25 simulations in 10 weeks, which was a big undertaking. When doing revisions, they did not include the author of the original simulation, to create a safe space for discussing changes. The team reviewed everything in each simulation together, then went over the changes and their rationales with the simulation author at the end of the process.
They reviewed their templates and standardized them to make it easier for facilitators to use. This process is not yet published. They have done pre-post surveys with faculty, a student survey, and looked at consistency and standardization. They learned that the process helps with consistency and provides a continuous quality improvement process. In the future they would like to collect quantitative data from students to see how it impacts their outcomes.
I liked the idea of having a template. We worked through their template and something I think I will use in my practice is stating the scenario objectives alongside the course objectives.
They use the HESI to do a needs assessment and developed simulations based on the areas where students were performing lower.
Advice on handling faculty push-back: they had good support (buy-in) from the director. Since so many people contribute to a scenario, they removed individual author names from the scenarios and instead include a list of contributors. Someone mentioned using accreditation as a rationale for "this is what we need to do."
The audience recommended limiting the objectives to two and addressing others in a future simulation. Sometimes simple is better. Someone recommended making a rubric for simulation development and limiting the number of objectives there.
It was a good discussion and I was happy I stayed.