Knowledge@Wharton: Can Firms Help Employees Make Better Retirement Choices?
Retirement savings plans have been in the news lately as Republicans eyed limits on pretax 401(k) contributions as a way to fund the cuts outlined in their tax proposal.
The 401(k) plan has over the last two decades become the dominant way that Americans save for retirement, replacing the traditional pension. But employers may not be doing as much as they can to encourage employees to use 401(k)s to adequately prepare for the end of their working years.
Many employers automatically enroll workers in 401(k) plans and set a default savings level, which puts the onus on employees to opt out. Research has shown that defaults result in greater participation in many situations, including saving for retirement. But in the case of retirement plans, the default is really a two-part decision: firms need to consider whether to have one at all, and where to set the default savings rate.
New research co-authored by Wharton operations, information and decisions professor Katherine Milkman attempts to find an optimal level for these defaults – with some surprising results. The paper, “How Do Consumers Respond When Default Options Push the Envelope?” was co-authored with Harvard’s John Beshears, Shlomo Benartzi of the University of California-Los Angeles and Richard T. Mason of City, University of London and Voya Financial.
“Companies tend to be very conservative about the savings rate they use for defaults – it’s often as low as 3%,” Milkman says. “But if people save at that rate for their entire career and retire at a normal age, that’s not nearly enough to live on comfortably.”
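To see why a 3% default falls short, consider a rough back-of-envelope calculation. The salary, career length, and return figures below are illustrative assumptions, not numbers from the study:

```python
# Rough sketch of where a 3% default leaves a saver; all inputs are illustrative assumptions.

def future_value(annual_contribution, years, annual_return):
    """Future value of a level annual contribution earning a fixed annual return."""
    return annual_contribution * (((1 + annual_return) ** years - 1) / annual_return)

salary = 60_000        # assumed flat annual salary, for simplicity
default_rate = 0.03    # the common 3% default Milkman describes
years = 40             # assumed length of a full career
annual_return = 0.05   # assumed average annual investment return

balance = future_value(salary * default_rate, years, annual_return)
income = balance * 0.04  # rough "4% rule" estimate of sustainable annual withdrawals

print(f"Balance after {years} years: ${balance:,.0f}")      # about $217,000
print(f"Implied annual retirement income: ${income:,.0f}")  # about $8,700
```

Under those assumptions, a lifetime of 3% contributions replaces only a small fraction of the saver's working income, which is the shortfall Milkman is pointing to.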
People tend to trust that employers are choosing the rate that is best for them, Milkman notes. Meanwhile, employers are skittish about setting a higher default rate because they fear it will shake employees out of passive acceptance and lead them to opt out of retirement savings entirely.
Finding the Right Rate
The researchers set out to find a default rate that struck the right balance – high enough to allow employees to save adequately, but not so high as to induce the sticker shock that employers feared. In partnership with Voya Financial, a New York-based financial, retirement, investment and insurance firm, they conducted a field experiment involving 10,000 employees from one of Voya’s clients. When employees logged on to the enrollment website for their retirement plans, they were randomly assigned to see a suggested savings rate ranging from 6% to 11% of their pay.
Many employers set default rates that take effect without the employee having to take any action. To gauge acceptance of a particular default more clearly, the researchers instead required employees to actively elect to continue at the suggested rate. Participants were also given access to a decision tool designed to help them determine an optimal retirement savings rate.
Looking at contribution rates 60 days later, the researchers found that employees shown savings rates above 6% were no more likely to drop out of the retirement plans than their peers, and their contribution rates were 0.2 to 0.5 percentage points higher than those of people assigned the lowest savings rates. That may not sound like much, but the researchers note that it can add up to tens of thousands of dollars over the course of an employee’s career. While the likelihood of not participating began to inch up for those assigned the highest savings rate (11%), Milkman says the research shows that employers should feel comfortable moving beyond low defaults.
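As a rough illustration of how a difference of a few tenths of a percentage point compounds over a career (again using assumed figures for salary, horizon, and returns, not data from the paper):

```python
# Illustrative compounding of the 0.2-0.5 percentage-point difference reported in the study.
# Salary, horizon, and return are assumptions chosen for a simple example.

salary = 60_000
years = 35
annual_return = 0.05

for extra in (0.002, 0.005):  # 0.2 and 0.5 percentage points of pay
    gain = salary * extra * (((1 + annual_return) ** years - 1) / annual_return)
    print(f"+{extra:.1%} of pay -> roughly ${gain:,.0f} more at retirement")

# Prints roughly $10,800 and $27,100 -- i.e., tens of thousands of dollars over a career.
```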
Read more at Knowledge@Wharton.