Sunday, November 18, 2012

Are We Fulfilling Our Promise?

I would first like to thank everyone who joined us for the PCS at the BESIG 2012 Annual Conference. I know it was a financial and time commitment on your part, and I hope that the sessions were worthwhile.

For those who could not attend, I believe you missed a very valuable session and I hope you will be able to make the next one. But I understand that distance, financial, and training constraints prevented you from joining. So I will do my best here to recap my workshop on assessing and reporting training quality.

Here is the available video of the presentation. Note that it starts when I am speaking about the benefits of a quality assessment with clients.

Let's start with the presentation and follow with some of the explanation.

Are In-Company Trainers Afraid of Assessment?

As expected at a BESIG conference, many of the trainers came from the educational setting, in which assessment is a part of life. However, I see that in the in-company setting assessment is avoided. As long as the learners leave with smiles and the manager seems satisfied, we carry on as though everything is hunky-dory. But there are considerable benefits to a comprehensive assessment program.

Business terminology:
cost-plus pricing
value-based pricing

Kirkpatrick's Four Levels

This is nothing new. Donald Kirkpatrick described these levels long ago, but they continue to be the gold standard for assessing corporate training. I think we need to be able to accommodate these client expectations of results with quantitative and qualitative data.

Impressions from Workshop

First, I would like to commend Target Training (one of the key sponsors of the conference) for supporting their staff in achieving certification on the Kirkpatrick model. During the workshop, one participant mentioned that I was not presenting the most recent developments on this. He is correct; for more information, check some of the more recent references. However, for ELT and assessing Business English training, I feel that the traditional framework is already a significant step in the right direction.

To invert the model (as is currently being taught) or to add a fifth level of monetary ROI (as has been advocated) are simply not steps either our profession or our clients are ready to accept. And unless we are going out and setting up massive training programs, maybe it is unnecessary. Therefore, it is more practical to focus on the traditional four-levels approach. However, I find it outstanding that this company is not only taking this approach to corporate training, but also developing their people. That is far too rare in our industry.

Holton's External Factors

The problem with adopting the four levels without consideration is that it can lead to distortions. It tends to ignore external factors. I believe Holton's simple and effective organization resonates with the BE trainer because we can fully identify with these challenges. Now, Holton actually does not think the Kirkpatrick model is effective at all (and they have a personal dislike for each other). But strangely, his own 'model' looks extremely similar. So for the sake of simplicity, I simply superimposed Holton's ideas on the pyramid.


A quick note about surveys, because we talked a lot about this in the sessions. These are not the be-all and end-all of assessment. They are certainly valuable and quite easy to administer, but they do not generally tell the whole story. On one of the first slides, I showed the menu of assessment tools I see being used. All have their place and all are valid; we simply need to understand which level they are assessing and how external factors can influence them. I went to the talk by Judith Mader on performance-based testing, which revealed some of the challenges with setting criteria. This is what I use to judge learning, albeit on a smaller scale than her university.

But in response to questions about how to operationalize this, I have uploaded an example survey that I use. It is by no means perfect, and I customize certain sections depending on who, what, and when I am conducting the assessment.

English Training Feedback Form (Email)

Putting it Into Practice

It would be impossible for me to understand each audience member's training situation, and we saw from the feedback that some have never thought about this, some have taken on part of this in their work, and some are already using these methods daily. Additionally, some have no control over the assessment methods used in their organization. However, it was very nice to hear some trainers talking about how they planned to change the way they speak with the learners, either to get information on the transfer environment or to gain insights on behavior/results.

Some other ideas were to review their feedback forms, conduct some sort of before-and-after assessment, and use a simple method like the workshop notes page in the handout. I was really happy to hear that last suggestion because, of course, this is the way the workshop was designed.


This was not really discussed that much in the groups, but I think it may be the most important step, especially for training companies running many classes with many trainers. Because the information for the report will come from many sources, it needs to be organized to help drive improvement. I also think it is the best tool for initiating trainer cross-talk.

For example, Trainer A consistently gets great feedback on reaction. The learners love her; she plays games and there are lots of laughs. On the other hand, Trainer B scores great on learning and on preparing people for meetings. Sit the two down together, and Trainer A gives a few lesson ideas for more fun and relaxation in the classroom, while Trainer B shares how she builds simulations to prepare learners for meetings.

I know that reporting sounds like tons of work and a boring admin task. It is, if there is no point to it. But it is actually very motivating if everyone knows that the report will generate suggestions and action points for improvement.

So... thanks to all who came!

Handout - Are We Fulfilling Our Promise


  1. Replies
1. Thanks so much Michelle. It was great meeting you and hopefully we'll see each other again soon.

  2. Thanks for sharing your slides and handouts. I really enjoyed your talks at the BESIG conference and was particularly interested in your structured approach to assessment and reporting. A couple of questions: 1. Could you share any practical tips on how you collate the 10 training inputs, 4 levels, and externalities — do you use an Excel doc? 2. Do you share any of the information in a course report with the students you are reporting on?

    Thanks very much

    1. Thanks so much for coming and I hope you had a successful conference.

      To answer your questions...
      First, I am a freelancer, so I use reporting that fits my environment. Sometimes I am working with a corporate training university. They have their own report template and I follow that. But when I am making suggestions (one section), I structure my information as shown in the slides. For example, "The learners in this group performed well in presentations during the lesson, but unfortunately the management is still asking them to keep slides text-heavy and stick to two minutes per slide." (learning, application, transfer environment).
      For my direct clients, the reports are generally written according to the format above in OneNote and serve simply as a reflection tool, because I am the only one who will read them. I write suggestions and go back to the client about possible changes, additional services, and professional development on my part.

      For example, over the last year I have consistently scored very high on reaction, behavior, and results, but the learners aren't learning as much as I would like. They consistently underperform in areas like vocabulary and grammar because I need to improve my review activities and lesson continuity (often due to fluctuating attendance). So my focus for the next 6-12 months will be on improving this while maintaining the same content, fun factor, and transfer design (enabling them to take it to work).

      As far as internal and external reporting... I like for my client check-ins to be valuable for the client. So, I will typically arm myself with some information for the short conversation. In most cases, the client doesn't want to worry about the English courses. So I will talk briefly and often, feeding him/her information. For example, "I just wanted to talk briefly about the English program. I got some feedback, and nearly all the participants said they felt better about their relationships with international colleagues. I was just wondering if you had any concerns or any questions about the program." In most cases they are simply happy to get some feedback without having to think too much about it. Occasionally, I will also provide the client with a one-slide rollup of my feedback and include things learned, any specific testimonials about application (anonymized), and some results. I may also come and talk about some ways we might think about making it better in the future (class website, more access to authentic materials, more support from management, etc.). So the report is normally verbal and not written... they don't have time for that, and it's not as personal.

      For the learners, I usually summarize my findings during the next lesson and handle any questions/concerns, because they will then feel they are not alone and they have had time to think about it. For example, I will come back and tell them that the feedback wasn't so good on presentations and ask how important presentations are to them and whether we would like to spend more time on them. Also, I might say that everyone is applying the lesson in their job, and we will have a little knowledge sharing about how they do it (pre-task reflection, using models, keeping a vocab card on their desk, etc.).

      I hope that answers your questions, and thanks for the comment.