Interim report on the Demonstration to Maintain Independence and Employment (Gimm et al. 2009)

Evidence Rating: Not Rated


Gimm, G., Denny-Brown, N., Gilman, B., Ireys, H.T., & Anderson, T. (2009). Interim report on the Demonstration to Maintain Independence and Employment. Washington, DC: Mathematica Policy Research.


  • This report summarized the results of the first round of the evaluation of the Demonstration to Maintain Independence and Employment (DMIE), under which the Centers for Medicare & Medicaid Services (CMS) offered funding to four states and the District of Columbia to design, implement, and test health care coverage and employment support programs for workers with potentially disabling conditions.
  • This CLEAR profile focuses on the process study component of the evaluation. Other components of the evaluation included an overview of beneficiaries’ characteristics and an impact analysis. This study sought to gain insight into successes, challenges, and lessons related to program outreach and recruitment—drawing relevant information on all five sites from interviews, program documents, state-level evaluation reports, and enrollment data.
  • Key findings indicated that most state DMIE programs faced challenges in recruitment—given delays in designing program processes and obtaining approvals, low interest from eligible beneficiaries, and poor access to data needed for outreach and screening—but eventually met enrollment targets. The authors also found that state DMIE programs that were designed around existing health programs were most successful in implementing streamlined recruitment, strategically supplementing available health benefits, and conducting swift program roll-out through existing provider networks.

Intervention Examined

Demonstration to Maintain Independence and Employment (DMIE)

Features of the Intervention

The DMIE, which was authorized under the 1999 Ticket to Work and Work Incentives Improvement Act and funded by CMS, aimed to delay or prevent reliance on Social Security disability benefits through medical assistance and other supports. Hawaii, Kansas, Minnesota, Texas, and the District of Columbia received federal funding to design and test innovative programs to achieve this objective. Each provided health care services beyond existing health insurance coverage, as well as discounted deductibles, premiums, and copayments. DMIE participants also received employment services such as vocational rehabilitation, service coordination, and a personal case manager. DMIE enrollment occurred from 2002 through 2008 and varied by state. All DMIE services expired on September 30, 2009.

The age range for eligible participants varied slightly across states but was roughly 18 to 62. Eligible participants also had to work at least 40 hours per month and could neither be receiving Social Security disability benefits nor have pending applications for such benefits. Minnesota and Texas recruited low-income residents with severe mental illness from certain parts of those states. In Texas, participants also had to be uninsured and have a physical disability. Kansas recruited participants statewide with both physical and mental disabilities from its high-risk insurance pool, and Hawaii recruited people with diabetes who lived in the city and county of Honolulu. The District of Columbia recruited workers with HIV/AIDS with incomes at or below 300 percent of the federal poverty level. Given the different recruitment strategies and target populations, participants varied across states in terms of age, marital status, ethnicity, college degree attainment rates, physical and mental capabilities, and employment characteristics.

Features of the Study

The first round of the process study covered by this report aimed to gain a comprehensive understanding of early implementation practices (specifically with regard to outreach and recruitment). To achieve this goal, the authors drew on one or more of the following qualitative data sources: (1) site visit interviews conducted in the first or second year of implementation with state-level program staff and evaluators and other stakeholders; (2) monthly calls among CMS, the authors, and state-level staff to review implementation progress; (3) state-level operational and evaluation protocols; (4) quarterly reports from states on design updates, recruitment progress, service provision and uptake, and evaluation efforts; and (5) publications by state-level evaluators. The process study also used quantitative data on program enrollment, participation, and withdrawal submitted by the states to the national evaluator.

Findings

The process study found that program sites were generally successful in adapting DMIE for, and attracting the interest of, their target populations. After addressing the initially poor response to outreach efforts, all sites except Hawaii exceeded or came very close to meeting their enrollment targets. The study provided useful information on challenges faced in recruitment, solutions developed by program staff, and best practices for the future.

  • Recruitment challenges. At the design stage, devising strategies to identify eligible and interested beneficiaries took longer than expected, as did the process of acquiring institutional review board approvals for state-level evaluations. In the outreach phase, program staff grappled with gaps in contact information for potential beneficiaries and found that distrust of service packages that seemed too good to be true, along with an unwillingness to share health and employment information, deterred many from volunteering for the program. For screening, administrators had difficulty obtaining employment verifications and other necessary data. Overall, state DMIE programs took more than 12 months to conduct recruitment and increase enrollment in the program.
  • Solutions to recruitment challenges. States were quick to respond to recruitment challenges and willing to adapt selection and outreach strategies on short notice. Solutions included (1) increasing the program’s geographic coverage, (2) widening the age brackets for program eligibility, (3) supplementing mail outreach with in-person recruitment at clinics, and (4) conducting supplementary rounds of outreach and intensive follow-up.
  • Enrollment results. Eventually, all sites except Hawaii surpassed or came very close to achieving their enrollment targets. In Hawaii, the majority of those eligible for DMIE had state-mandated, employer-sponsored health coverage that met most of their diabetes-related health needs, and therefore they had limited incentive to participate in a program that offered a relatively narrow set of supplementary services and required a significant time commitment. In Minnesota, CMS decreased the enrollment target to give the solutions to recruitment challenges time to succeed.
  • Lessons related to program design. Overall, programs that targeted people enrolled in existing state health coverage programs were well positioned for recruitment because they had a preselected group of candidates to whom they could market the program, as well as easy access to the contact information required for outreach and other data needed for eligibility screenings. Building on ongoing programs also enabled state DMIE programs to use existing provider networks to facilitate rapid implementation and to ensure that program offerings supplemented benefits already available to the target populations.
  • Lessons related to program management. DMIE programs often involved a variety of stakeholders, including state Medicaid agencies, local health insurance and health care providers, organizations offering specialized services, and independent evaluators. The study found that centralizing leadership in one organization ensured swift identification and resolution of problems and effective coordination of implementation processes.

Considerations for Interpreting the Findings

This was a well-structured and clearly written report, summarizing information on program implementation in the same sequence for each state (starting with the intervention design and then describing key actors, recruitment practices, and early lessons learned). The authors used a variety of qualitative data sources to answer the research questions and drew in quantitative data when relevant. Although the report carefully described quality assurance protocols applied to the quantitative data sets, it did not provide adequate detail on the qualitative data, on which core program implementation findings were based. Without detailed information on who was interviewed, sample sizes and selection strategies, and steps taken to ensure data quality, it was difficult to fully assess the reliability of the data. However, the authors presented findings impartially, highlighting both the positives and negatives of program design, as well as successes and challenges in implementation.

Reviewed by CLEAR

June 2015