Post by Somebody hasto save our skins! on Aug 23, 2017 9:53:54 GMT
What feedback/stats do you collect from participants and facilitators of your simulation events? We currently collect feedback forms before and after simulation events. They are specific to the event, asking questions about the techniques covered. I'd like to include a couple of questions about people's expectations and experiences of simulation, so I can identify areas of best practice and where our team can do better. It would be even more effective if I were receiving data that was directly comparable to other sim centres'. For that, we'd need to be asking the same questions. Do you already do this and compare feedback? Am I the only simtech that's anal enough to care about feedback stats?
Our possible new questions are focussed on qualitative info:
Pre-Course: How would you rate your overall confidence/experience in [Name Of Course]? What are your expectations?
Post-Course: Do you feel that this course could have been delivered as effectively without simulation? Please give reasons for your answer. Did the course meet your expectations? Please give reasons.
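To make the cross-centre comparison idea concrete, here is a rough sketch (hypothetical field names and made-up scores, not anything we actually run) of how pooled responses could be summarised per centre in Python if we all asked the same pre/post questions on the same 1-5 confidence scale:

from statistics import mean

# Hypothetical pooled responses: same questions, same 1-5 confidence scale at every centre.
responses = [
    {"centre": "Centre A", "course": "Airway Skills", "pre_confidence": 2, "post_confidence": 4, "expectations_met": True},
    {"centre": "Centre A", "course": "Airway Skills", "pre_confidence": 3, "post_confidence": 5, "expectations_met": True},
    {"centre": "Centre B", "course": "Airway Skills", "pre_confidence": 2, "post_confidence": 3, "expectations_met": False},
]

def summarise(rows):
    # Group responses by centre, then report the mean confidence shift and % of expectations met.
    by_centre = {}
    for r in rows:
        by_centre.setdefault(r["centre"], []).append(r)
    for centre, group in by_centre.items():
        shift = mean(r["post_confidence"] - r["pre_confidence"] for r in group)
        met = 100 * sum(r["expectations_met"] for r in group) / len(group)
        print(f"{centre}: mean confidence shift {shift:+.1f}, expectations met {met:.0f}%")

summarise(responses)

The point is just that if every centre records the same fields on the same scale, a comparison like this falls out for free.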
Post by Manikin Skywalker on Dec 8, 2017 23:02:50 GMT
We retained all our feedback, going back years!! Specific feedback after every course was reviewed and the courses adapted accordingly, and we reviewed each other's performance and made changes where necessary. We didn't compare with other centres unless we were running a course that was being delivered in multiple centres. As for your questions: I'm not sure they will give you data reliable enough to say what best practice is, but they will let you see whether or not you are meeting participants' expectations.
If I'm understanding you correctly, you want to improve the delivery of simulation courses and have a local best practice, which should lead to consistency of delivery (remembering the opposite is not true: consistency does not lead to best practice).
So instead I'd start with your team broadly defining what best practice is, then set out a broad course structure: who is the course for, what are they going to learn, what are the faculty roles? These large, broad elements you can then break down into smaller and smaller elements. After each course, all faculty read the feedback and also give their own feedback to the faculty team and debrief each other: what worked, what didn't, did the scenario go as planned (and if not, why not), did the scenario feel realistic, did the simulation feel real, and what changes are agreed for next time? Quickly you'll start to define every aspect of the course, from the advertising, to the welcome, to running the scenarios, to the feedback process. Then when you're getting 95% of participants satisfied with your courses, you'll know you're doing a good job, and you can aim higher still!!
Questions I would ask in the feedback form you give to participants: ask them how your simulation course compares to previous sim courses they have been on (here and/or elsewhere?), and I always like the "one thing we did well" and "one thing we could do better" options.