
Final evaluation report Round 3 TAACCCT grant: Mission critical operations. (NC State Industry Expansion Solutions, 2017)


    Evidence Rating

Absence of conflict of interest.

Citation

NC State Industry Expansion Solutions. (2017). Final evaluation report Round 3 TAACCCT grant: Mission critical operations. Raleigh, NC: NC State Industry Expansion Solutions.

Highlights

  • The study’s objective was to examine the implementation and operational processes of the Mission Critical Operations (MCO) project to identify process strengths, challenges, and opportunities for improvement.
  • The study authors conducted an implementation evaluation using qualitative data collected mainly through interviews and focus groups.
  • The study noted that key MCO program strengths included adaptability of the program for articulation, communication related to collaboration and project evaluation, broader instructional impact for support services, and knowledge sharing surrounding telepresence. The two main process-efficiency weaknesses in implementation were communication around course design and data collection for evaluation and performance tracking.
  • The implementation study of the five sites included in the MCO was well organized with detailed information and several quality assurance procedures at various stages of data collection and analysis. Study limitations affecting the evaluation included: low survey participation, institutional and management/staff changes during the grant period, and a lack of available employment and wage data.
  • The three embedded impact studies were reviewed separately by CLEAR in May 2020.

Intervention Examined

The Mission Critical Operations (MCO) Program

Features of the Intervention

  • Type of organization: Technical and Community Colleges
  • Location/setting: Multi-site in North Carolina and Georgia
  • Population served and scale: Adults; Low-skilled; TAA-eligible; 1,964 participants
  • Industry focus: Multiple
  • Intervention activities: Accelerated learning; Career pathways; Student support services
  • Organizational partnerships: Employers
  • Cost: Not included
  • Fidelity: Not included

The MCO program was funded by a Round 3 Trade Adjustment Assistance Community College and Career Training (TAACCCT) grant from the U.S. Department of Labor (DOL). The TAACCCT grant program was designed to help community colleges develop innovative two-year workforce education and career training programs. The MCO program was designed to build institutional capacity and help TAA-eligible and other adults secure high-skill, high-wage employment through three main objectives. First, the program sought to increase attainment of certificates, diplomas, and other industry-related credentials. Second, it aimed to develop innovative and effective curriculum methods that address specific industry needs and lead to improved learning outcomes and retention rates. Third, it aimed to demonstrate improved employment outcomes, particularly for TAA workers. The MCO intervention used an evidence-based design and included the following components: articulation, collaboration, course design, credential development, instruction/support, scorecard development, telepresence, and tracking/evaluation. The intervention Consortium was made up of five institutions across two states: two community colleges, a technical community college, and the University of North Carolina at Charlotte in North Carolina, and a technical college in Georgia. The MCO program was offered among these college partners. The grant also outlined other partnerships and collaborations with industry employers, contractors, and other stakeholders.

An MCO program logic model was developed, consisting of inputs, activities, outputs, outcomes, and impacts. The inputs included the MCO member institutions, credit crosswalks, third parties, industry partners, technology, workforce development boards (WDBs), and data. The activities consisted of the eight work plan activities listed above. The outputs included a crosswalk model, continuous improvement, packaged curriculum, credential criteria, actual enrollment, a draft scorecard, the RAMP+ model, and an integrated tracking system. The outcomes were increased student articulated credit, best-practice sharing, MCO courses, MCO credentials, enrollment targets, refinements based on results, increased training accessibility, and established participant tracking. The impacts included improved retention, graduation, and wages; real-time workforce and credential data available via an integrated data system; and courses.

Features of the Study

A case study design was used to gather qualitative information about the program. A program evaluation framework was also developed to guide the evaluation team, providing a vision, a mission, and a continuous improvement focus. The framework outlined plan, do, study, and act steps as well as steps for quality assurance. The implementation study followed the research questions outlined in the study, which addressed curriculum selection, program design and structure, and participant assessments.

Each program evaluation question was linked to a work plan activity in order to evaluate the progress and completion of each program activity; the authors provide a figure that illustrates this linkage. Data were collected mainly through interviews and focus groups conducted at each participating MCO institution. A semi-standardized structure was used so that probing questions could be incorporated to adapt to or expand upon responses during the interviews and focus groups. MCO program activities (e.g., quarterly Consortium meetings) were also observed; these observations were unstructured and were used to document staff interactions and preparation for and participation in Consortium meetings. A final source of data was document review, which included quarterly narrative progress reports (QNPRs), website content, and MCO literature.

Study Sites

  • Cleveland Community College in Shelby, North Carolina
  • Southern Regional Technical College in Thomasville, Georgia
  • Nash Community College in Rocky Mount, North Carolina
  • Wake Technical Community College in Raleigh, North Carolina
  • University of North Carolina at Charlotte in Charlotte, North Carolina

Findings

Intervention activities/services

  • The study noted communication in course design and data collection for evaluation and performance tracking as high-frequency challenge areas for the process efficiency of implementation. The recorded high-frequency strength areas included adaptability of the program for articulation, communication related to collaboration and project evaluation, broader instructional impact for support services, and knowledge sharing surrounding telepresence.
  • The study found that the course design work plan activities implemented MCO concepts beyond expectations. The support services offered on the virtual digital tutor site also exceeded expectations.
  • The study found that the MCO project more than met its recruitment target, exceeding projections on all outcome indicators.
  • The study found that overall, the MCO program was associated with increased student retention and better program progress.
  • The grant established partnerships among all five institutions in the Consortium. All partners created or enhanced MCO concepts and integrated them into courses. MCO degree paths were also approved. The collaboration between the partner colleges while developing the curriculum led to career pathways that blended courses across the institutions. The grant also outlined employer collaborations between the college partner institutions and employers. Overall, the partner institutions reported good communication and engagement by employers, but experienced varying levels of contributions to curriculum development from employers.

Implementation challenges and solutions

  • The study found several implementation challenges, which included: (1) awareness of the grant program, communication, and content development for course design; (2) data collection and partner participation for collaboration and project evaluation; (3) standard process for articulation; (4) awareness of grant program, data collection and data quality for support services; (5) content development and program sponsor delays for telepresence; and (6) data collection and quality for evaluation and performance tracking.
  • The lessons learned by MCO program managers were categorized into the following themes: budget modification approval, hiring skilled faculty, project management skills, data collection, adaptability of the program, communication, student support services, knowledge sharing, and time availability of subject matter experts. Comments and suggestions relating to program sustainability were centered around the following themes: establishment of credentialed programs, building program awareness, and university partnerships and research.

Considerations for Interpreting the Findings

The study was well organized, with detailed information and several quality assurance procedures at various stages of data collection and analysis. Study limitations noted by the authors included: the number of students served and the inability to meet demand; a lack of faculty expertise and facility infrastructure; a lack of content, quality, and availability of applicable courses; attrition; and specialized equipment needs. Additional limitations affecting the evaluation included: low student participation in a survey, institutional and management/staff changes during the grant period, and a lack of available employment and wage data, which prevented the evaluators from confirming the MCO program's impact on employment and wages. Additionally, the study did not include detailed information about the sample or the selection criteria for focus groups and interviews.

Reviewed by CLEAR

August 2021
