Numerous models of evaluation have been proposed over the years; Brookfield (1986), for example, identifies the Predetermined Objectives Approach, Goal-Free Evaluation, the CIPP Model of Evaluation, and the Kirkpatrick Hierarchy of Evaluation, among others. These models have been applied in a variety of settings, including adult education. More recent models include Naturalistic Evaluation (Guba & Lincoln, 1989), also called fourth generation evaluation, in which the concerns and issues of the stakeholders serve as the organizing structure for the evaluation. This study of workplace literacy programs used the Kirkpatrick Hierarchy (1994) in conjunction with Naturalistic Evaluation. In addition, strategies that companies can use to conduct their own evaluations were employed (Askov, Hoops, & Alamprese, 1997).
The researcher served as external evaluator of three projects of the National Workplace Literacy Program (NWLP) of the US Department of Education, each funded for a three-year period. Two of the projects were statewide; the third was run by a community college that provided a variety of adult education programs. She visited each site twice per year to interview all stakeholders (company management, unions, trainers, supervisors, learners, and college personnel), observe classes, and troubleshoot for formative evaluation. Because she was not onsite between visits, she had to rely on the colleges to collect the data related to Kirkpatrick's hierarchy. The project director of each NWLP project supervised the data collection process.
The qualitative data from the interviews were entered into a FileMaker Pro database. Using the database, the researcher's assistant tallied the frequencies and categorized the responses to identify trends over time and across locations (company sites) within each project. The researcher and her assistant were also able to identify some generalizations that cut across projects. Additionally, data were gathered about each of the four levels of Kirkpatrick's evaluation hierarchy, permitting generalizations about the effectiveness of each of the three national projects. (See the Appendix for the interview protocols.)
The primary data source was interviews with stakeholders, supplemented by other data (related to Kirkpatrick's four levels) supplied by the three NWLP projects. These interviews constitute a rich data source that can give researchers and educational practitioners, as well as business/industry and labor unions, knowledge not only about how to evaluate workplace literacy programs but also about how to design effective workforce education programs. This information is also useful to state and local policymakers as they design welfare-to-work programs.
The findings came from analyses of the interview data in the FileMaker Pro database, which allowed the investigator and her assistant to search for key words and identify trends. The findings also drew on the colleges' efforts to collect data at each of the four levels of Kirkpatrick's hierarchy (1994).
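The tallying and keyword searches described above were carried out in FileMaker Pro rather than in code, but the underlying logic is straightforward frequency counting over coded interview responses. The sketch below illustrates that logic in Python; the field names, site names, and response codes are hypothetical examples, not data from the study.

```python
from collections import Counter, defaultdict

# Hypothetical coded interview records, mirroring the kind of fields the
# database held: which site, which project year, which stakeholder group,
# and the category assigned to the response.
records = [
    {"site": "Plant A", "year": 1, "stakeholder": "learner", "code": "job_transfer"},
    {"site": "Plant A", "year": 2, "stakeholder": "learner", "code": "job_transfer"},
    {"site": "Plant A", "year": 2, "stakeholder": "supervisor", "code": "morale"},
    {"site": "Plant B", "year": 1, "stakeholder": "learner", "code": "morale"},
    {"site": "Plant B", "year": 2, "stakeholder": "learner", "code": "job_transfer"},
]

def tally_by(records, field):
    """Frequency of each response code, grouped by the given field."""
    groups = defaultdict(Counter)
    for r in records:
        groups[r[field]][r["code"]] += 1
    return dict(groups)

# Grouping by site shows variation across locations within a project;
# grouping by year surfaces trends over time.
by_site = tally_by(records, "site")
by_year = tally_by(records, "year")
```

Running the same tally against different grouping fields is what makes it possible to compare trends over time with trends across company sites, as the analysis above describes.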
The two statewide projects institutionalized the workplace literacy programs in businesses and industries at the conclusion of the grant; the project run by the community college did not. A coordinating state structure is therefore recommended to provide support and training.
One of the statewide projects required a progressively greater financial match from companies each year; this requirement seemed to promote institutionalization, since by the third year the businesses and industries were providing a 75% match. Most of the companies in that statewide program did institutionalize the program after the federal funding was no longer available.
Labor unions were very heavily involved in one of the statewide projects. That project initiated the concept of peer advisors: workers who recruited co-workers to the workplace literacy program, developed promotional materials, and occasionally assisted the instructors with instruction. Peer advisors were successful even in non-unionized companies.
In all projects workers served with management as equals on workplace literacy advisory boards, leading to a cultural change within some work organizations. Management grew to respect the input of the workers, and workers trusted and appreciated management for its commitment to the program.
Strong curriculum development and staff training components proved to be useful and led to cohesiveness in program design. Programs using the functional context approach to instruction, where instruction in literacy skills was related to job tasks, appeared to be the strongest both in terms of company support and learner involvement.
As for the evaluation, Kirkpatrick's hierarchy proved useful and understandable to the practitioners who collected the data. A full day of training by the evaluator at the beginning of the projects proved essential. Level 1 data (reactions of all stakeholders) were easy to collect. Level 2 (mastery of the skills taught in class) was more difficult to assess, since the instructors did not readily know how to create skill assessments. Level 3 data (transfer to the workplace) were often collected through supervisor interviews; however, learners themselves proved to be the best source of information about transfer, through self-reports about using their skills in the workplace. Level 4 (impact) was best measured by determining the company's greatest need (e.g., retention of workers) and showing the impact of the program on that need.
References

Askov, E.N., Hoops, J., & Alamprese, J. (1997). Assessing the value of workforce training. Washington, DC: National Alliance of Business.
Brookfield, S.D. (1986). Understanding and facilitating adult learning. San Francisco: Jossey-Bass.
Guba, E.G., & Lincoln, Y.S. (1989). Fourth generation evaluation. Newbury Park, CA: Sage.
Kirkpatrick, D.L. (1994). Evaluating training programs: The four levels. San Francisco: Berrett-Koehler.
Appendix: Interview Protocols

Interview Guide for Partners
1. Place of employment:
2. How satisfied are you with the project? Why?
3. How effective was the partnership between industry and the College?
4. Did your expectations change during the course of the project? How?
5. What were your major disappointments?
6. How did the company benefit (productivity, quality, safety, absenteeism, retention, etc.)? Examples?
7. How did the workers benefit (morale, attendance, teamwork, etc.)? Examples?
8. How cost-effective was the project?
9. How do you feel about continuing the project?
10. Has the project helped the company with public relations (newspaper articles, TV, or radio coverage, etc.)? Examples?
11. Has the project improved the company’s training program? Examples?
12. What changes do you foresee in the near future that would alter your workers' training needs?
13. Would you recommend this training program to your colleagues in other companies?
14. Other comments:
Supervisor/Training Director Interview Guide
1. Place of Employment:
2. Name of Class:
3. Number of your workers who participated:
4. How satisfied were you with the class(es)? Why?
5. How did the company benefit (productivity, quality, safety, absenteeism, retention, etc.)? Examples?
6. How did the workers benefit (morale, attendance, teamwork, etc.)? Examples?
7. Has participation in the class(es) affected their chances for advancement?
8. How much did the workers talk to you about the class(es)?
9. How did the workers who participated feel about the class(es)?
10. How did the other workers feel about the class(es)?
11. How did you feel about releasing workers from the job? How did you accommodate their absence?
12. How does this training compare with training the company has done or could do itself?
13. Would you recommend the company continue this kind of training?
14. What are the advantages and disadvantages of working with the College in offering the class(es)?
15. Other comments:
Learner Interview Guide
1. Place of Employment:
2. Name of Class:
3. How satisfied were you with the class? Why?
4. What was the most important part? Least important?
5. What did you gain from the class?
6. How did the class help you with your job? Examples?
7. Did the class help you understand the company better? Examples?
8. Do you feel better about yourself as a worker as a result of the class?
9. Did the class prepare you for a company training program? Which one?
10. Did the class help you with getting a promotion or a better job? How?
11. How did your fellow workers feel about you taking the class?
12. Would you recommend that others take the class?
13. Did you get support from your supervisor to attend the class?
14. Do you look forward to any more classes? Where?
15. Do you do any more reading, writing, or math at work than you did before the class? Examples?
16. Do you do any more reading, writing, or math at home than you did before the class? Examples?
17. How did the class help you outside the job? Examples?
18. Other comments:
Staff Interview Guide
1. How satisfied are you with the project?
2. What are the greatest satisfactions?
3. To what extent do all stakeholders agree on the goals?
4. What factors helped with the success of the project?
5. What factors acted as deterrents to the project?
6. What do you see as the major outcomes?
7. What are the major disappointments?
8. What was the most difficult part of the project?
9. How do you feel about your linkage with industry? Will it continue?
10. What would you change in a future project?
11. How has the college benefited from the project?
12. How much support have you had from the college?
13. How cost-effective was the project?
14. What are your plans for the future regarding this program?
15. Other comments: