Highlights
- The study’s objective was to examine the implementation of the Accelerating Connections to Employment (ACE) program, which was administered at nine sites through partnerships between Workforce Investment Boards and community colleges to build career pathways for participants based on local labor market demand.
- The authors conducted an implementation evaluation that included a fidelity assessment. The study authors used data from site visit observations; interviews with staff, participants, and partners; focus groups with students; program intake and tracking data collected by site staff; and participant follow-up surveys.
- The study found that the sites implemented the ACE program with fidelity but adapted the program model over time based on the identified needs of the student populations.
- The implementation study was comprehensive in its design, data collection, and analyses. The findings align with the research questions and are supported by the data.
- The embedded impact study was reviewed by CLEAR in August 2020.
Intervention Examined
The Accelerating Connections to Employment (ACE) Program
Features of the Intervention
- Type of organization: Local Workforce Investment Boards; Community colleges
- Location/setting: Multi-site in Connecticut, Georgia, Maryland, Texas
- Population served and scale: Job seekers; Low-skilled workers; 1,258 participants
- Industry focus: Health care and social assistance
- Intervention activities: Accelerated learning; Career pathways; Developmental education; Student support services; Work-based learning
- Organizational partnerships: Employers
- Cost: Grant expenditures per participant: $4,828 to $13,033
- Fidelity: Included
Accelerating Connections to Employment (ACE) was an initiative that sought to address workforce demands and the challenges of inadequate training and career paths for low-skilled job seekers. The initiative was funded by a $12 million Workforce Innovation Fund grant awarded to the Baltimore County Department of Economic and Workforce Development by the U.S. Department of Labor, Employment and Training Administration (USDOL/ETA). The ACE program created a formal partnership between nine Workforce Investment Boards (WIBs) and 10 community colleges to develop and provide training for employment with high growth potential based on the local labor market. The program was implemented at nine sites across four states (Connecticut, Georgia, Maryland, and Texas) from 2012 to 2016; the sites represented a mix of urban, rural, and suburban communities. ACE combined evidence-based education and training services within the workforce system to help low-skilled individuals build career paths in health care. The ACE model comprised 10 core components: labor market demand, community engagement, credentials, learning assessment, outcomes, integrated teaching, student success, employment transition, campus involvement, and data tracking. Participants received integrated basic skills and vocational training, co-teaching, career navigation, job development, and support services. The program targeted low-skilled job seekers, including those with limited English proficiency and those with low reading, writing, and math skills.
The study included a logic model comprising inputs (WIBs, community colleges, employers, and other partners); activities (planning, intake, training, support services, transition, and tracking); outputs (increased referrals from partners, common assessment tools, individuals enrolled in training, an internship model, a proactive referral network, and milestones for tracking and measuring progress); and outcomes (at the systems level, increased coordination among partners and supports to help low-skilled adults break the cycle of generational poverty; at the individual level, increased education credentials, securing and retaining employment, earnings gains, and further skills training).
Features of the Study
The authors conducted an implementation evaluation that included a fidelity assessment. The study authors used primary data from site visit observations; interviews with staff, participants, and partners; focus groups with students; intake and tracking data collected by site staff; and items from participant follow-up surveys. Qualitative data were analyzed in NVivo; the authors developed codes to identify common themes and compared coding across coders to check for agreement. Fidelity to the ACE model was assessed using a rubric that followed the logic model, operationalizing the program’s 10 components into measures and developing a scoring scheme. Over the four years of implementation, 1,258 participants enrolled in the program, and 77 percent completed it.
Study Sites
- Anne Arundel County, Maryland
- Atlanta, Georgia
- Austin, Texas
- Baltimore City, Maryland
- Baltimore County, Maryland
- Montgomery County, Maryland
- New Haven, Connecticut
- Prince George’s County, Maryland
- Upper Shore, Maryland
Findings
Intervention activities/services
- The study found that relationships between the WIBs and community colleges started slowly and depended on communication styles and personalities, but they improved over time to foster collaboration.
- The study found that the training program and instruction benefited from employer involvement in the planning stages.
- The study found that sites adapted their intake procedures during ACE implementation, for example by adding orientation sessions, to identify participants who would be ready for the program.
- The study found that sites added a job developer role to the ACE model to help reduce the workload of career navigators and support employer partnerships. Job placement services were noted as crucial by both the program staff and students.
- The study found that the co-teaching approach with a vocational and basic skills instructor changed over time as student needs were identified; for example, not all classes required a co-teaching approach.
Fidelity
- The study found that most ACE sites implemented the model with fidelity for components including planning, standardized intake processes, and academic components. The evaluators noted, however, that fidelity to the job-related components and support services varied; for example, some sites did not meet the requirement to place participants into internships or other on-the-job opportunities.
Implementation challenges and solutions
- The study found that implementation challenges included recruitment, aligning student interests with labor market needs, cost, and connecting students to internships.
- The study found that implementation solutions included tailoring curricula to local labor market needs, providing comprehensive support services, hiring a job developer to ease career navigators’ workloads, co-teaching with vocational and basic skills instructors, active employer engagement, tutoring, and student stipends.
Cost/ROI
- As part of the implementation study, the authors found that the ACE program cost more than originally anticipated, particularly for additional staff, along with other unanticipated expenses (e.g., pre-eligibility costs such as drug tests and background checks, funds for participants who initially failed certification exams, and remediation classes).
- The study included a separate cost study; grant expenditures per student ranged from $4,828 in Atlanta, GA, to $13,033 in Upper Shore, MD.
Considerations for Interpreting the Findings
The implementation study was comprehensive in its design, data collection, and analyses. The findings aligned with the research questions and were supported by the data. The authors noted a limitation of the fidelity rubric: it was applied retrospectively, which could have affected the findings. Fidelity was assessed by the study authors and not by the CLEAR team.