October-November 2013

Elearning! Magazine: Building Smarter Companies via Learning & Workplace Technologies.

Issue link: https://elmezine.epubxp.com/i/201066

Page 19 of 52

… provide reliable "leading" indicators of the success of the program. Follow-up evaluations will be sent to learners and their managers 60 days after completing the program. Questions in these evaluations will focus on how much performance has improved and how much of the improvement was due to training. These measures will be correlated with the predicted performance improvement measures and will be compared to industry benchmarks maintained by KnowledgeAdvisors. Learners will also rate whether performance has improved for specific sales behaviors (e.g., number of client contacts, reaching mutually beneficial agreements, etc.).

Ninety days after the program, KnowledgeAdvisors and Savvis will conduct interviews with selected learners. Following the process outlined in Brinkerhoff's Success Case Method, half of the interviews will be with learners who provided the highest ratings on the evaluations and half will be with learners who provided the lowest ratings. The interviews will focus on the strengths of each program and which aspects provide the most useful knowledge and skills. The interviews will also allow learners the opportunity to provide detailed examples of how they have used their knowledge and skills to improve their performance. Figure 1 shows the evaluation approach with key performance measures across each group.

[Figure 1. Evaluation design for comparing game-based learning with traditional e-learning and classroom training. Three separate learning cohorts (game-based learning, traditional e-learning, traditional classroom learning) each receive a pre-event evaluation (standard and custom questions), a post-event evaluation (standard and custom questions), a 60-day follow-up evaluation (standard and custom questions), 90-day interviews with the most and least successful learners, and post-event sales metrics for individuals over six months.]
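The Success Case Method sampling described above can be sketched in a few lines. This is a hypothetical illustration only, not the authors' actual procedure: the learner labels, scores and the select_interviewees helper are invented for the example.

```python
# Hypothetical sketch of Success Case Method interviewee selection:
# half of the interviews go to learners with the highest evaluation
# ratings, half to those with the lowest. All data below is invented.

def select_interviewees(ratings, n_interviews):
    """ratings: dict mapping learner -> overall evaluation score.
    Returns (highest-rated half, lowest-rated half)."""
    ranked = sorted(ratings, key=ratings.get, reverse=True)
    half = n_interviews // 2
    top = ranked[:half]       # strongest apparent success cases
    bottom = ranked[-half:]   # weakest apparent success cases
    return top, bottom

scores = {"A": 4.8, "B": 2.1, "C": 3.9, "D": 1.7, "E": 4.5, "F": 3.0}
top, bottom = select_interviewees(scores, 4)
```

Interviewing both extremes, rather than a random sample, is what lets the method surface the strongest examples of application alongside the barriers reported by learners who got the least from the program.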
Savvis L&D by the Numbers: Overall satisfaction with 90-day onboarding experience: 4.33/5.00

According to KnowledgeAdvisors' Mattox: "To demonstrate the impact of game-based learning programs on the business, it is essential to gather business data such as the number of sales, sales margins and other indicators of success." The final data sources will therefore come from Savvis' sales CRM system, which will provide results about sales cycles, win/loss ratios, revenue per transaction, and customer satisfaction. Information will be gathered for each learner for the 12 months prior to the program and six months after to control for seasonal fluctuations.

CONCLUSION

The road to success often requires innovation. The Savvis L&D group has embraced innovation, the gamification of learning, with the intent of transferring knowledge and skills more efficiently and effectively to its sales force. This bold move was matched with an equally bold approach to evaluation. Only measurement can truly show how effective learning gamification is for Savvis compared to traditional forms of training delivery. Stay tuned: Savvis, Game On! Learning and KnowledgeAdvisors will report their evaluation results in 2014 with another case study.

—Kathy Heldman co-authored this article for Game On! Learning. John R. Mattox II co-authored it for KnowledgeAdvisors. To receive a copy of the completed research findings, send an email to research@gameonlearning.com.
