Elearning! August - September 2013

Elearning! Magazine: Building Smarter Companies via Learning & Workplace Technologies.

Issue link: http://elmezine.epubxp.com/i/152139


bigdata

Remember to show what skills, activities and behaviors the employees were doing before and after, so you can point to how they've improved because of the learning or enablement interventions you drove. Here is where Big Data is so exciting. By triangulating data sources, you can chart the impact of your programs.

HOW IT'S DONE

Cloud Talent Success was formed early in 2012, and we've seen fantastic results thus far. Having a structured approach to present to the department was critical, especially given the rocket-fueled growth and market expectations our corporation was experiencing.

Overall approach

Measurement is foundational to all aspects of what we do. It covers the full learning and enablement lifecycle. These aspects include:

1. Learning Inventory – To determine how to build formal, informal and social learning, we analyze performance using CRM and other data to identify skills gaps and success drivers.

2. Learning Program Design – We first define business, process and learning metrics for each sales stage, then translate these into KPIs and core competencies around which we design courses.

3. Learning Schedule – Regularly tracking sales rep performance on key metrics, we can identify strengths/weaknesses and suggest individualized learning and mentoring interventions.

4. Learning Evaluation – Every program has one or more objectives — like prospecting or closing — around which we build custom metrics. Before-and-after course surveys measure self-reported confidence (versus knowledge assessment) and actual execution against goals.

5. Business Impact – Very simply, we measure pipeline performance (opportunities created and won, average deal size and sales cycle, conversion ratio) before and after learning. This includes quarterly tracking of overall sales performance versus goals and the market.

This approach enables us to systematically plan, design, execute and evaluate the effectiveness of our sales rep training, measuring against real business numbers.

Analytics

Data analysis is truly the key, because Big Data without powerful analysis is just simple math with lots of numbers. Here's how we approached it.

STEP 1: Driver Analysis – A critical first step was analyzing success drivers. We drew data from four sources to study their influence on sales attainment:

• Customer Relationship Management (CRM) system: Identifying more than 110 variables, like average deal size, win ratio, sales cycle length
• Learning Management System (LMS): Courses taken, self-evaluations, timing of training and more
• Performance Management System: Manager ratings, goal setting, performance reviews, learning plans and more
• Employee Records: Hire date, manager, sales experience, prior domain experience and more

We then used advanced statistical techniques, including univariate analysis, regression modeling and structural equation modeling, to determine key influencers of performance and quantify their impact. Lastly, we converted these influencers into KPIs and set targets.

STEP 2: Business Impact Analysis – Here we merged three data sources, CRM, LMS and the Commissions file (which tracks attainment versus quota for each salesperson), then analyzed before-and-after performance impact. Quite honestly, it was a simple download to Excel from each source, followed by a data merge. For example, to measure training's impact on pipeline, we tracked performance for a set period of time before and after a key sales course completion, isolating the impact of seasonality (for example, Q4 usually being the busiest quarter).
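The before-and-after merge described in STEP 2 can be sketched in a few lines of code. This is a minimal illustration only: the rep IDs, dates, deal sizes and field names below are invented assumptions, and the real analysis ran on Excel downloads of the full CRM, LMS and Commissions data.

```python
from datetime import date

# Hypothetical CRM extract: (rep_id, close_date, deal_size, won).
crm_opps = [
    ("r1", date(2012, 3, 10), 40_000, True),
    ("r1", date(2012, 9, 5), 95_000, True),
    ("r2", date(2012, 2, 20), 30_000, False),
    ("r2", date(2012, 8, 15), 60_000, True),
]

# Hypothetical LMS extract: completion date of the key sales course, per rep.
course_done = {"r1": date(2012, 6, 1), "r2": date(2012, 6, 1)}

def before_after(opps, completions, window_days=120):
    """Split each rep's opportunities into equal windows before and after
    course completion, then compare win rate and average won deal size.
    Using the same window length on both sides is one simple way to
    reduce seasonality effects."""
    buckets = {"before": [], "after": []}
    for rep, closed, size, won in opps:
        done = completions.get(rep)
        if done is None:
            continue  # rep never took the course; exclude from comparison
        delta = (closed - done).days
        if -window_days <= delta < 0:
            buckets["before"].append((size, won))
        elif 0 <= delta <= window_days:
            buckets["after"].append((size, won))
    out = {}
    for period, deals in buckets.items():
        wins = [s for s, w in deals if w]
        out[period] = {
            "win_rate": len(wins) / len(deals) if deals else 0.0,
            "avg_deal_size": sum(wins) / len(wins) if wins else 0.0,
        }
    return out

stats = before_after(crm_opps, course_done)
```

With the toy data above, the "after" window shows a higher win rate and larger average deal size, which is the kind of before-and-after contrast the article tracks against real business numbers.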
Change Management

At our highly data-driven company, it was not difficult to convince stakeholders conceptually that performance measurement and training accountability were good ideas. Politically, however, we took a prudent approach.

First, we got buy-in. We created advocates and partners by building consensus with various stakeholders outside Cloud Talent Success when designing, conducting and presenting the results of the measurement model. For example, Sales Operations handled most data analysis, so their support was critical. We gained initial buy-in from the Sales Ops director, then met with him regularly so he remained familiar with, and felt ownership of, the driver analysis. We also met regularly with other sales team leads (like Strategic Sales and Enterprise Sales) for "workshops," at which we presented data and asked for input (versus saying, "The data shows X, so let's do Y"). When we later presented recommendations reflecting their ideas, it was easy for them to say "Yes."

Next, we adopted the change. We operationalized the measurement strategy and recommendation process by making the data essential. For example, sales leaders love data, especially that which provides new and critically important insights not only on training impact and needs, but on specific competencies and their impact on sales performance.
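Returning to the univariate analysis mentioned in STEP 1, the basic idea can be illustrated with a toy ordinary-least-squares fit of quota attainment against a single candidate driver. The driver (courses taken) and all numbers below are invented for illustration; the actual study covered more than 110 variables with regression and structural equation modeling.

```python
# Illustrative data: courses completed per rep (from an LMS extract) and
# fraction of quota attained (from a commissions file). Both are assumptions.
courses_taken = [1, 2, 3, 4, 5, 6]
attainment = [0.62, 0.70, 0.75, 0.86, 0.90, 0.97]

def least_squares(xs, ys):
    """Ordinary least squares for one driver: returns (slope, intercept).
    The slope quantifies the driver's impact, here extra quota attainment
    per additional course taken."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

slope, intercept = least_squares(courses_taken, attainment)
```

A positive, material slope is what would flag "courses taken" as a key influencer worth converting into a KPI with a target, as described in STEP 1.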
