Citation
Barnett, E. A., Bork, R. H., Mayer, A. K., Pretlow, J., Wathington, H. D., & Weiss, M. J. (2012). Bridging the gap: An impact study of eight developmental summer bridge programs in Texas. New York: National Center for Postsecondary Research.
Highlights
- The study's objective was to assess the impact of developmental summer bridge programs on college-level course completion, academic persistence, and the number of credits attempted and earned.
- The study was a randomized controlled trial conducted across eight colleges in Texas. Eligible students were randomly assigned to either a treatment group or a control group. The authors estimated statistical models to examine differences in outcomes between the groups over two years.
- The study found that students in the summer bridge programs earned significantly fewer developmental course credits than students in the control group.
- The quality of causal evidence presented in this report is high because it was based on a well-implemented randomized controlled trial. This means we are confident that the estimated effects are attributable to the developmental summer bridge programs, and not to other factors.
Intervention Examined
Developmental Summer Bridge Programs
Features of the Intervention
The eight developmental summer bridge programs in Texas were offered in the summer of 2009, and several of them had received funding and technical assistance from the Texas Higher Education Coordinating Board (THECB) for two years before the evaluation began. Two of the programs were offered at open-admissions four-year universities and the other six at community colleges in the state. Four of the programs were course-based; the other four were freestanding, meaning students received developmental instruction but no formal course credit for participation. The programs offered accelerated developmental instruction in math, reading, and/or writing and generally ran four to five weeks during the summer, though the hours of daily instruction varied greatly by program. The programs also offered academic support in the form of tutoring, mentoring, or access to learning and computer labs, and they integrated a college knowledge component into instruction that covered college expectations, planning, and payment. Each program also awarded a $400 stipend to students upon completion. The programs were designed to serve recent high school graduates who were likely to enroll, or were already enrolled, in college the following fall. Students were eligible to participate if their scores on a college placement test indicated a need for remediation in a subject offered by one of the postsecondary institutions included in this study.
Features of the Study
The authors used a randomized controlled trial to compare the outcomes of students who participated in the developmental summer bridge programs with those of students who did not. Specifically, the study evaluated whether the programs reduced students' need for developmental coursework upon starting college in the fall and improved their academic outcomes. Student recruitment began in the winter of 2008-2009 across the eight sites in Texas. Across sites, 793 students were randomly assigned to one of the programs (the treatment group) and 525 students were assigned to a control group. Data were collected every semester over a two-year period, from the summer of 2009 to the spring of 2011. Students in the treatment group could enroll in a summer bridge program and take advantage of all of its associated services. Students in the control group could not enroll in a summer bridge program but were free to pursue other summer activities, such as enrolling in other courses at the colleges that hosted the programs, using other educational services offered at their institutions, or finding employment.
Data sources included baseline information forms, institutional administrative data gathered by the National Center for Postsecondary Research (NCPR), THECB data, National Student Clearinghouse records, and follow-up surveys. The authors used statistical models to compare the educational outcomes of treatment and control group members. The educational outcomes analyzed in the study included the number of credits attempted and earned (in developmental courses, in college-level courses, and in total), the number of semesters registered at any college (i.e., academic persistence), and whether students passed college-level courses in math, reading, and writing.
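The report does not specify the exact form of these statistical models. As a point of reference only, the simplest version of a pooled treatment-control comparison in an RCT is a difference in group means with a significance test; the sketch below illustrates that idea on simulated, hypothetical data (the group sizes match the report, but the outcome values are invented for illustration).

```python
# Minimal sketch of a pooled treatment-control comparison in an RCT.
# Hypothetical data; the study's actual models may be regression-adjusted.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated developmental credits earned (illustrative values, not study data).
treatment = rng.normal(loc=3.5, scale=2.0, size=793)  # 793 treatment students
control = rng.normal(loc=4.0, scale=2.0, size=525)    # 525 control students

# Estimated impact: difference in group means.
impact = treatment.mean() - control.mean()

# Two-sample t-test (unequal variances) for statistical significance.
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)

print(f"Estimated impact: {impact:.2f} credits (p = {p_value:.3f})")
```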
Study Sites
- El Paso Community College in El Paso, TX
- Lone Star College-CyFair in Cypress, TX
- Lone Star College-Kingwood in Houston, TX
- Texas A&M International University in Laredo, TX
- Palo Alto College in San Antonio, TX
- San Antonio College in San Antonio, TX
- St. Philip's College in San Antonio, TX
- South Texas College in McAllen, TX
Findings
Education and skills gain
- The study found that program participation had no statistically significant effect on the average number of credits attempted or earned overall or in college-level courses. However, the average number of developmental course credits earned was significantly higher in the control group (4.0 credits) than in the treatment group (3.5 credits).
- The study did not find a statistically significant relationship between program participation and the average number of semesters registered at any college (i.e., academic persistence).
- The study did not find a statistically significant relationship between program participation and the percentage of students who passed their first college-level math, reading, or writing course.
Considerations for Interpreting the Findings
The study was a well-conducted randomized controlled trial; however, a few points should be considered when interpreting its findings. First, to achieve the sample size required to detect minimum effects, data were pooled across the eight program sites, with analyses comparing average outcomes between the pooled treatment and control groups. Moreover, the evaluation estimated the overall impact of the developmental summer bridge programs; it did not evaluate the effects of specific program components. Lastly, the authors note that inconsistencies in program implementation across sites, as well as the fact that control group members could freely enroll in summer college courses (though not in the developmental summer bridge programs), could have influenced the findings.
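The report does not describe how the pooled models handled site-level differences. One common approach in multi-site evaluations, shown here purely as an assumption rather than the authors' method, is a pooled regression with site fixed effects; the sketch below runs on simulated data using the statsmodels library.

```python
# Hypothetical sketch of a pooled impact regression with site fixed effects;
# the report's actual model specification is not described at this level of detail.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1318  # 793 treatment + 525 control students, per the report

df = pd.DataFrame({
    "site": rng.integers(0, 8, size=n),   # eight program sites
    "treat": rng.integers(0, 2, size=n),  # 1 = summer bridge, 0 = control
})
# Illustrative outcome: credits earned, with a small site effect and no true impact.
df["credits"] = 10 + 0.3 * df["site"] + rng.normal(0, 3, size=n)

# Pooled OLS: the coefficient on `treat` is the estimated program impact,
# while C(site) absorbs average differences across the eight sites.
model = smf.ols("credits ~ treat + C(site)", data=df).fit()
print(model.params["treat"], model.pvalues["treat"])
```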
Causal Evidence Rating
The quality of causal evidence presented in this report is high because it was based on a well-implemented randomized controlled trial. This means we are confident that the estimated effects are attributable to the developmental summer bridge programs, and not to other factors.