Third-party evaluation of MoSTEMWINs: Implementation, outcomes, and impact. (Cosgrove & Cosgrove 2018)

    Evidence Rating

Absence of conflict of interest.

Citation

Cosgrove, J., & Cosgrove, M. (2018). Third-party evaluation of MoSTEMWINs: Implementation, outcomes, and impact. St. Louis, MO: Cosgrove & Associates, LLC.

Highlights

  • The study’s objective was to examine the implementation of the Missouri STEM Workforce Innovation Networks (MoSTEMWINS), which helped develop, expand, and redesign new and existing STEM programs and enhanced student support and instructional strategies at 13 member colleges in Missouri.
  • The study authors conducted an implementation study using a self-assessment implementation tool created by the evaluator and completed by member colleges, document reviews, and interviews and surveys of various stakeholders. The study was designed to evaluate the level of stakeholder engagement in MoSTEMWINS, track implementation progress, and identify successes, challenges, and lessons learned from implementation across all member colleges.
  • The study found that MoSTEMWINS engendered cross-college and statewide collaboration, which provided an opportunity to share best practices and lessons learned. Member colleges also employed strategies to create or modify their STEM programs to better align them with local workforce needs by incorporating stackable and industry-recognized credentials, having employers provide input on programming, and leveraging technology-enabled learning.
  • The authors used a variety of data sources that collected quantitative and qualitative data from multiple stakeholders to measure fidelity and address the research questions. However, they did not thoroughly explain the methods used to draw samples from the different data sources, the internal quality assurance procedures carried out by the evaluator to ensure data validity and confidentiality, or the analytical methods employed to analyze qualitative data.
  • The embedded impact study was reviewed by CLEAR in May 2020.

Intervention Examined

MoSTEMWINs

Features of the Intervention

  • Type of organization: Community college
  • Location/setting: Multi-site in Missouri
  • Population served and scale: Adult learners; Long-term unemployed; Underemployed; TAA-eligible; Dislocated workers; Veterans; 2,935 participants
  • Industry focus: Information; Professional, scientific, and technical services
  • Intervention activities: Accelerated learning; Career pathways; Developmental education; Student support services; Technology; Work-based learning
  • Organizational partnerships: Employers; Community-based organizations; Career centers; Workforce Investment Boards
  • Cost: Not included
  • Fidelity: Included

The U.S. Department of Labor's (DOL) Trade Adjustment Assistance Community College and Career Training (TAACCCT) program provided $1.9 billion in grants to community colleges to improve skills and support employment in high-demand industries, notably manufacturing, healthcare, information technology, energy, and transportation. Through four rounds of funding, DOL awarded 256 TAACCCT grants to approximately 800 educational institutions across the United States and its territories. Missouri received a $14.9 million Round 4 TAACCCT grant in 2014 to fund programming to meet the state's demand for STEM workers. Missouri previously received Round 1 ($19.9 million) and 2 ($14.9 million) TAACCCT grants to support employment and skills development in healthcare and advanced manufacturing, respectively. The Missouri STEM Workforce Innovation Networks (MoSTEMWINS) builds upon these earlier efforts but uses Round 4 TAACCCT funds. MoSTEMWINS was a product of 13 community and technical colleges in Missouri coming together under the Missouri Community College Association (MCCA) to create a consortium. It was designed to help member colleges create, launch, and refine new and existing STEM programs that would help fill employment gaps in critical industries identified by STEM employers in Missouri. These industries include information technology, health sciences, life sciences, manufacturing, and transportation.

To remove barriers to program entry and completion, member colleges employed a model developed under MoSTEMWINS that accelerated entry into STEM programs by offering students opportunities to strengthen underdeveloped academic skills through contextualized learning, incorporating stackable and industry-recognized credentials, and ensuring programs of study could be completed on a condensed timeline through online service delivery. MoSTEMWINS allowed member colleges to improve capacity by leveraging updated instructional strategies that used technology-enabled learning, modularized coursework, and hands-on labs. It also allowed member colleges to enhance their student support services for onboarding and recruitment, program completion and retention, and employment preparation. MoSTEMWINS provided education and training to 2,935 participants. The logic model for MoSTEMWINS includes inputs and resources (e.g., lessons learned from prior rounds of TAACCCT grants), outputs (e.g., redesigned courses, employer partnerships), and outcomes (e.g., number of grant participants).

Features of the Study

This implementation study evaluated the level of stakeholder involvement in MoSTEMWINS, tracked implementation over the course of TAACCCT grant administration, and delineated the successes, challenges, and lessons learned from implementation across all member colleges. The evaluation design was grounded in process evaluation and concepts from theory-driven evaluation to document how MoSTEMWINS was delivered and the outcomes achieved during the evaluation period. The evaluation analyzed storyboards created by each member college, which depicted logic models specific to that college's efforts under MoSTEMWINS, and included document reviews of consortium quarterly reports, curricula, and grant documents. Additionally, student-level outcomes data were gathered from the Missouri Department of Economic Development and the consortium's Effort to Outcomes (ETO) data system. The authors also conducted multiple site visits to all member colleges, interviewing grant staff, college leadership, students, student support personnel, faculty, and other pertinent stakeholders. Moreover, 90 faculty members (representing all member colleges) and 43 employers (representing 10 member colleges) were surveyed, and 33 employer partners were interviewed. The evaluator designed a self-assessment implementation tool, completed by member colleges, to determine their level of implementation across several activities by the end of the evaluation period. Each of these activities rolled up into three implementation strategies: accelerating entry, creating clear pathways to STEM careers, and improving employment attainment. For each implementation activity, member colleges gave a rating representing the level of implementation achieved on a 5-point scale ranging from “not planned” to “sustaining implementation.”

Findings

Intervention activities/services

  • The study found MoSTEMWINS provided several opportunities for stakeholders of different member colleges to share best practices and lessons learned while creating an environment for statewide discussions that fostered knowledge-sharing. Such collaboration helped member colleges facilitate continuous improvement and enhance course delivery and program outcomes.
  • The study found member colleges developed and modified programs using career pathways, redesigned developmental education to incorporate contextualized learning, adopted student support strategies and instruction methods that utilized technology-enabled learning, used alternative instructional formats such as accelerated and non-term models, incorporated stackable and industry-recognized credentials, granted credit for prior learning, and expanded employer and community partnerships to ensure programs satisfied regional workforce needs.

Fidelity

  • The study found that, by the end of the grant period, member colleges’ average self-rated implementation scores for each activity ranged from “mature” to “sustaining” implementation.

Implementation challenges and solutions

  • The study found recruiting students became more difficult as Missouri’s employment and economy improved. Staff expressed how it was hard to convince prospective students of the need to obtain additional education and training to secure gainful and sustainable employment.
  • The study found that new program delivery innovations or requirements as well as DOL-mandated reporting challenged long-standing processes and information systems at member colleges. As a solution, the consortium purchased and implemented a statewide ETO data system and provided ongoing training for users to ensure an accurate collection of participant-level data.
  • The study found that given the flexible program model (e.g., self-paced, open-entry, open-exit, and non-term based) as well as students’ other home, work, and study obligations, it was difficult to maintain participant motivation in completing the program.

Considerations for Interpreting the Findings

The authors used several data sources to collect insights from different stakeholder groups at multiple intervals during the evaluation period. They also triangulated these data to substantiate their findings related to the posed research questions and to measure the degree to which fidelity was achieved. Limitations of this implementation study noted by the authors include possible human error in data entry processes, faculty and employer survey data being subject to positive-response bias, and self-assessments and ratings potentially representing inflated views of implementation progress. The authors do not provide sufficient information regarding how samples for interviews and surveys were drawn from data sources or the rationale for such methods. The implementation study also does not explain what internal quality assurance procedures were carried out to collect and verify data or what analytical methods were employed to analyze and interpret qualitative data. Additionally, the report does not provide evidence that consent by study participants or Institutional Review Board approval was obtained. Regarding the fidelity assessment, the degree to which the intervention was implemented as designed was determined by self-assessments completed by member colleges, introducing potential bias. Fidelity was assessed by the study authors and not by the CLEAR team.

Reviewed by CLEAR

May 2021
