Duflo, E., & Saez, E. (2003). The role of information and social interactions in retirement plan decisions: Evidence from a randomized experiment. Quarterly Journal of Economics, 118(3), 815–842.
- The study’s objective was to determine the impact of receiving information about tax-deferred retirement accounts (TDAs) on enrollment in the accounts.
- The study used a randomized controlled trial (RCT) at a large university. Departments within the university were randomly assigned to two groups. In the treatment group, nonfaculty employees were randomly selected to receive or not receive a letter offering a $20 incentive to attend an employee-benefits fair, at which information on TDAs was provided. In the other group of departments, no employees received a letter. The study examines both the effect of receiving the letter and the effect of being in a department in which some people received the letter.
- Under most model specifications, the study found that people in both treatment groups had higher enrollment in TDAs, as measured 4.5 and 11.0 months after the fair.
- The quality of the causal evidence presented in this study is high. This means we are confident that the estimated effects of the intervention are attributable to the intervention itself, and not some other factor.
Incentives for Attending a Benefits Fair
- Compared with those in the control group, people in the treatment group had higher rates of TDA enrollment.
- Differences ranged from about 0.6 to 1.3 percentage points, depending on the follow-up time period and the specific regression model specification.
- Most, but not all, of the differences between the treatment and control groups were statistically significant when evaluated using the p-values reported in the study.
- Rates of TDA enrollment did not differ statistically significantly or substantively between the department-level and letter-level treatment groups at either time horizon.
Considerations for Interpreting the Findings
This study was a well-conducted RCT with low sample attrition (evaluated under the liberal attrition boundary). In addition, there were no apparent confounding factors that could bias the results.
The study authors estimated multiple related impacts on TDA enrollment. Performing multiple statistical tests on related outcomes makes it more likely that some impacts will be found statistically significant purely by chance and not because they reflect program effectiveness. The authors did not perform statistical adjustments to account for the multiple tests, so the number of statistically significant findings is likely to be overstated.
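The inflation from multiple testing can be illustrated with a short calculation (not drawn from the study itself; the numbers below are generic). With k independent tests each run at significance level 0.05, the chance of at least one false positive grows quickly with k; a Bonferroni adjustment, one common correction, divides the per-test threshold by k:

```python
# Illustrative only: how the chance of at least one false positive
# grows with the number of statistical tests, and how a Bonferroni
# adjustment compensates. Assumes independent tests at alpha = 0.05.
alpha = 0.05
for k in [1, 5, 10]:
    fwer = 1 - (1 - alpha) ** k  # prob. of >= 1 false positive across k tests
    bonferroni = alpha / k       # Bonferroni-adjusted per-test threshold
    print(f"{k} tests: family-wise error rate = {fwer:.3f}, "
          f"adjusted alpha = {bonferroni:.4f}")
```

With 10 unadjusted tests, the family-wise error rate is roughly 0.40, which is why unadjusted counts of significant findings tend to overstate program effects.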
Causal Evidence Rating
The quality of the causal evidence presented in this study is high. This means we are confident that the estimates reported accurately reflect the impacts of being in a department in which inducement letters and incentives were distributed or of receiving such a letter and incentive.