Staff Perspective: Program Evaluation - How Your Feedback Helps Assess and Shape CDP’s Training Programs
Even though I rarely have the opportunity to interact directly with most of our CDP training participants, I would wager that I am not one of their favorite people. That’s because, as the Program Evaluator at CDP, I’m the person behind all of the surveys and feedback forms that we ask each of our training participants to complete. When you factor in the additional paperwork required for continuing education credits, a CDP participant may be asked to devote an extra 30 minutes or more of their time to providing feedback and answering questions about their training experience.
CDP’s program evaluation efforts are critical to managing and improving our trainings and providing quality content to all of our participants. We greatly appreciate the extra time and effort that so many of our attendees put into completing our surveys and feedback forms, but we realize that they are very busy people. So I wanted to take this opportunity to share some information about our evaluation strategies and processes, and to provide a window into how participants’ feedback directly helps CDP continue to deliver high-quality training.
Program evaluation at CDP has been strongly influenced by the Kirkpatrick Model, a well-established method for evaluating the effectiveness of trainings and training programs. The Kirkpatrick Model recognizes four levels of evaluation. Below, I’ve provided a brief description of how CDP assesses each level.
Level 1: Reaction
The Kirkpatrick Model defines Level 1: Reaction as the degree to which participants find the training favorable, engaging, and relevant to their jobs. At CDP, we primarily assess this level through our post-training surveys, which all training participants are invited to complete. We also consider informal feedback from participants, whether via email, our website forum, or direct interactions with CDP staff at our training events, when assessing Level 1: Reaction to our programs.
Information gathered at this level informs multiple decisions and program tasks at CDP. Participant feedback about presenter expertise, style, and interaction with the audience is shared with all of our trainers to help them prepare for future presentations. Information from participants about our online training platforms has directly shaped platform choices for each of our online training programs, allowing CDP to improve participant access to, and satisfaction with, the online training experience.
Level 2: Learning
Level 2 of the Kirkpatrick Model assesses the degree to which participants acquire the intended knowledge, skills, attitude, confidence, and commitment based on their participation in the training. While we hope that every participant in a CDP training is highly satisfied with their training experience (as assessed in Level 1: Reaction), it is essential that we provide our attendees with the information, skills, and confidence they need to be effective mental health providers for military populations. CDP’s assessment of Level 2: Learning is slightly more complicated than our assessment of Level 1: Reaction and involves data collection both before and after our trainings.
The majority of our trainings assess knowledge gain using a multiple-choice knowledge assessment specific to each training topic. Before the training, participants are invited to complete a pre-test, online or in person, that establishes baseline knowledge of the training topic. Knowledge is then reassessed after the training, using the same items, via a digital or paper post-test. We similarly examine pre-to-post changes in self-reported readiness, comfort, and confidence to apply the training information and/or protocols. We study changes across these items and use the results to inform decisions about how to maximize gains in knowledge, attitudes, and behaviors in our participants.
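To make the pre/post comparison concrete, here is a minimal sketch of how knowledge gains on a topic-specific test might be summarized. The scores below are hypothetical, and the use of a paired t-test is an illustrative assumption rather than a description of CDP’s actual analysis.

```python
# Minimal sketch: summarizing pre/post knowledge change for one training topic.
# The scores and the paired t-test are illustrative assumptions only.
from statistics import mean

from scipy.stats import ttest_rel

# Hypothetical percent-correct scores for the same five participants,
# measured on identical items before and after a training.
pre_scores = [55, 60, 70, 45, 65]
post_scores = [80, 85, 90, 70, 88]

# Within-person change is the quantity of interest, since the same
# items are administered before and after the training.
gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
print(f"Mean gain: {mean(gains):.1f} percentage points")

# A paired t-test asks whether the average within-person change
# differs reliably from zero.
t_stat, p_value = ttest_rel(post_scores, pre_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

The same pre/post structure applies to the self-report items on readiness, comfort, and confidence.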
Many of our trainings, particularly those that focus on a specific evidence-based psychotherapy (EBP) protocol, also include small-group, interactive exercises that allow participants to practice important skills. During these exercises, our presenters visit with each small group to observe, answer questions, and provide feedback on participant performance. The same is true of our online trainings, where our trainers will “teleport” into breakout rooms in Second Life or join small groups of participants by audio/video on Adobe Connect and Zoom. Although this “checking in” is less formal than a written test or assessment, it allows for real-time evaluation of participants’ ability to apply information from the training to interactions with others. Feedback from participants about these exercises has been overwhelmingly positive, leading CDP to incorporate more of these active learning and feedback opportunities across our trainings.
Level 3: Behavior
The Kirkpatrick Model defines Level 3: Behavior as the degree to which participants apply what they learned during training when they are back on the job. This level is even more difficult for CDP to assess because it requires knowledge of events that happen after participants leave our trainings. The two primary methods that CDP uses to assess Level 3: Behavior are follow-up surveys and our consultation programs.
Multiple CDP programs send follow-up survey invitations to participants anywhere from 6 to 24 months after they complete a training with us. These surveys consist of items assessing how participants have been using information and skills from their CDP training in their job settings. CDP uses the responses to ensure that our trainings are relevant to our participants’ workplace experiences. We have applied information from follow-up surveys to decisions about training content and consultation program design for our EBP programs; these surveys have been particularly helpful in considering how to address the barriers to EBP and consultation implementation that participants identify after completing trainings.
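As a rough illustration of the timing involved, here is a minimal sketch of how follow-up invitations might be scheduled across several waves. The specific 6-, 12-, and 24-month waves, the participant records, and the helper function are all hypothetical; CDP programs vary in when and how they send these invitations.

```python
# Minimal sketch: determining which hypothetical follow-up survey
# invitations are due. Wave timing and records are illustrative only.
from datetime import date, timedelta

# Hypothetical follow-up waves at roughly 6, 12, and 24 months post-training.
FOLLOW_UP_WAVES = [timedelta(days=30 * months) for months in (6, 12, 24)]

participants = [
    {"email": "provider1@example.org", "completed": date(2020, 3, 15)},
    {"email": "provider2@example.org", "completed": date(2021, 1, 10)},
]

def due_invitations(today: date):
    """Yield (email, wave number) pairs whose follow-up window has opened.

    A real system would also track invitations already sent so that each
    wave goes out only once; that bookkeeping is omitted here.
    """
    for person in participants:
        for wave_number, delay in enumerate(FOLLOW_UP_WAVES, start=1):
            if person["completed"] + delay <= today:
                yield person["email"], wave_number

for email, wave in due_invitations(today=date(2021, 9, 1)):
    print(f"Send wave-{wave} follow-up invitation to {email}")
```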
CDP’s consultation services similarly allow us to see how participants are applying what they have learned when they are back on the job. Although the primary goal of the CDP consultation program is to provide ongoing support resources for workshop participants, it also gives us an opportunity to collect information about the types of cases participants are treating and how they are using information and skills from their training. CDP consultants, who are also CDP trainers, use input from consultation calls and meetings to enhance the information they present during trainings with relevant examples and applications.
Level 4: Results
The final level of the Kirkpatrick Model, Level 4: Results, is arguably the most important and often the most difficult to assess directly. This level examines the degree to which targeted outcomes occur as a result of the training. While the results of Levels 1-3 assessments are important, it is Level 4: Results that many stakeholders use to judge the true value of a training program. CDP’s Level 4 goal can be described as improving the quality and availability of behavioral health care for military and veteran populations through our training and consultation programs. This is a lofty goal, but with CDP’s focus on evidence-based therapies (quality) and a prolific training schedule that reaches thousands of providers annually (availability), it is definitely achievable.
Accurately assessing this level requires carefully controlled studies that identify and isolate training effects in the target population. While CDP does not currently conduct this type of research, the Center is moving toward developing and collaborating on research protocols that address Level 4: Results. As we build these capacities, we will continue to track statistics and annual reports that provide insight into issues relevant to CDP training topics in military and Veteran populations. According to the Kirkpatrick Model’s most recent updates, such trends serve as leading indicators of a program’s effects and provide evidence that a program is on track to accomplish its highest-level goals (Level 4: Results).
The Role of Participants in CDP Program Evaluation
As you can see from the descriptions above, CDP puts a significant amount of time and effort into designing and conducting comprehensive program evaluation to (1) verify the effectiveness of our trainings and (2) improve them by considering participant feedback in decisions about content and programs. We would not, however, be able to do this without the voluntary participation of our attendees. We understand that our participants’ time is valuable and in demand, and we greatly appreciate the portion of it spent on our surveys, questionnaires, and assessments.
If you participate in a CDP training, please consider taking the extra time to complete the associated survey(s) to help us in this effort. And if you’ve ever filled out a CDP survey or otherwise contacted us with feedback, THANK YOU! As CDP’s Program Evaluator, I can assure you that the responses and information you provide are reviewed and carefully considered by CDP personnel across our programs. By providing feedback, training participants are not just “customers” of CDP, but important stakeholders with input into the selection and design of our training content. We rely on your involvement to further our goal of offering quality trainings that improve behavioral health care for all of our Service members, Veterans, and their families.