The Risks of Raising 401(k) Default-Savings Rates

What You Need to Know

Retirement advisors commonly urge employers to hike their retirement plans’ default savings rates, and this has benefited the average saver.
However, as explored in a new NBER analysis, higher defaults aren’t always better, and overly aggressive rates can cause various problems.
The results show employers and their advisors must think carefully about the influence of defaults and the limits of behavioral nudges.

Higher default savings rates and more aggressive default allocations, made possible by the Pension Protection Act of 2006, have been a major trend in the world of 401(k) plans, and numerous analyses show the positive effect both of these changes have had on the average American saver.

Given the broader adoption of higher defaults, a new study published by the National Bureau of Economic Research asks some natural questions: How high is too high for the default? And what happens if an employer, in an attempt to encourage greater savings, matches only contributions made above a very high threshold?

Specifically, the study examines a real-world case in which a retirement savings plan adopted a default rate of 12% of income for new hires, much higher than previously studied defaults. Another distinguishing feature of the plan is that only contributions made above the 12% mark receive the employer match, the theory being that these combined features should inspire very high levels of savings.

The paper, however, suggests this theory may be flawed: by the end of the first year of the experiment, only 25% of employees remained at the default, meaning 75% had opted out. A literature review included in the analysis finds that in plans with lower defaults, in the realm of 6%, the corresponding share of opt-outs is approximately 50%.

The analysis was put together by a team of five NBER-affiliated researchers: John Beshears and David Laibson of Harvard, Ruofei Guo of Northwestern University, Brigitte Madrian of Brigham Young University and James Choi of the Yale School of Management.

As the researchers summarize, in large part because only contributions above 12% were matched by the employer, 12% was likely a suboptimal contribution rate for employees. Furthermore, employees who remained at the 12% default unexpectedly had average income roughly one-third lower than would be predicted from the relationship between salaries and contribution rates among employees who were not at the 12% rate.

The results, according to the researchers, suggest that defaults influence low-income employees more strongly, in part because these employees face higher psychological barriers to active decision-making and are more prone to procrastination and inertia.

Whatever the case, the researchers conclude, simply pushing default contribution rates higher and higher does not appear to be a realistic solution to the nation’s retirement savings shortfall, as even employees with sufficient means to save at such levels frequently opt out.

Though the study focuses on the workplace, its findings are increasingly relevant to the wealth management community as leading firms expand their defined contribution capabilities to tap a lucrative and growing market.

Key Details From the Analysis

As noted, the analysis looks at the real-world experience of an employer that modified its retirement plan to include a 12% default contribution rate for new hires. The firm did not make any matching contributions on the first 12% of pay contributed by the employee, but instead matched the next 6% of pay contributed at a 100% marginal match rate.
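To make that match structure concrete, here is a minimal sketch in Python. The helper function, pay figures and contribution rates are illustrative assumptions, not taken from the study; the sketch simply encodes the rule described above, in which no match is paid on the first 12% of pay contributed and the next 6% of pay is matched dollar for dollar.

```python
# Illustrative sketch (not from the study): employer match under the plan design
# described above -- no match on the first 12% of pay the employee contributes,
# then a 100% match on the next 6% of pay contributed.

def employer_match(pay: float, employee_rate: float) -> float:
    """Return the annual employer match for a given pay and contribution rate."""
    unmatched_floor = 0.12   # first 12% of pay earns no match
    matched_band = 0.06      # the next 6% of pay is matched dollar for dollar
    matched_rate = max(0.0, min(employee_rate - unmatched_floor, matched_band))
    return pay * matched_rate

# Example: at $60,000 of pay, an employee who stays at the 12% default gets no
# match at all, while contributing 18% captures the full $3,600 (6% of pay).
print(employer_match(60_000, 0.12))  # 0.0
print(employer_match(60_000, 0.18))  # 3600.0
```

As the comments indicate, an employee who stays at the 12% default receives no employer match at all, which is part of why the researchers view that rate as likely suboptimal.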

According to the researchers, this default was not only considerably higher than previously studied defaults, but also likely a suboptimal contribution rate for employees, and the results bear this out.

“The figures indicate that employees opted out of the default rapidly,” the researchers note. “By tenure month three, only 35% of the employees had never opted out of the default, and this fraction steadily declined to 25% by tenure month 12.”