After reading Beyond Kirkpatrick by Tom Werner, and Jane Bozarth's post on Alternatives to Kirkpatrick, I was pleased to learn that efforts have been made to improve the evaluation process. This is the kind of information I was looking for when I posted Kirkpatrick's Four Level Evaluation Model, expecting that people would have tried, tested and evaluated the model by now.
I received feedback supporting Kirkpatrick's model from John Pasinosky, Richeek and Geeta Bose; all three said it had worked well for them. John Pasinosky said:
--------------------------------------------------------------------------
I have tried to incorporate the 4 levels in my instructional design practice. It is very valuable if you don't have any other plans in place. ( - In the land of the blind the one-eyed man is king!). The good part about it is less that you can make a better product than that it gives you a framework to know what went wrong when it does go wrong - or someone is unhappy with the results.
---------------------------------------------------------------------------
Geeta Bose agreed that the Kirkpatrick model had been useful, but added:
--------------------------------------------------------------------------
While this is a great foundation, like all other methodologies, this too should evolve to fit the changing needs of the training industry. This model (in the classical sense) does not help measure the ROI from training. This model also emphasizes on "post training evaluation" while evaluation should be an ongoing process.
At Kern, we have evolved an evaluation methodology that has worked well for us and our clients.
--------------------------------------------------------------------------
Richeek agreed too, adding:
--------------------------------------------------------------------------
It is definitely practical. It all depends on how you market it. If managers & other decision-makers see the value, they'll definitely agree. I have gone into meetings where managers have come in late and start a meeting saying they have to leave it early and then have them say at the scheduled end, "Can we extend this meeting? Would you have the time?"
--------------------------------------------------------------------------
Another presentation I found, by Bersin and Associates (from as far back as 2006), gives good insight into the problems with Kirkpatrick's evaluation model, and I found it practical. I don't mean to undersell Kirkpatrick's model; all I'm saying is that we need to move on and come up with more current ideas. Quoting from Jane's post:
--------------------------------------------------------------------------
In the interest of fairness I would like to add that Kirkpatrick himself has pointed out some of the problems with the taxonomy, and suggested that in seeking to apply it the training field has perhaps put the cart before the horse. He advises working backwards through his four levels more as a design, rather than an evaluation, strategy; that is: What business results are you after? What on-the-job behavior/performance change will this require? How can we be confident that learners, sent back to the work site, are equipped to perform as desired? And finally: how can we deliver the instruction in a way that is appealing and engaging?
-------------------------------------------------------------------------
I believe this holds true, as change is the only constant in every field.
My 2 cents
It's good to know that certain strategies and models have proven themselves in the past, but our requirements today are changing greatly, and we need to move on, redefine strategies and redesign models to address the changing needs of learners. I believe it is imperative to re-examine our existing training strategies and evaluate our courses to bring about this change.
My questions to all of you would be:
1. Do you think there is a need to change the way we design training?
2. Do you think we need more dynamic ways to address the problems faced by our learners?
3. How do you think evaluation should ideally be done to improve and make a real impact on learning?
I would rather say there is a need to widen the perspective. There are traditional learning audiences and relatively modern ones. Tons of training resources are already available that satisfy the traditional audience well; training should address the requirements of the modern audience as well. By modern audience I mean those who prefer collaborative learning, interactive e-learning and learning through non-conventional sources.