This is Part II of a 2-part post. Check out Part I, Does The Illinois Workplace Wellness Study Say What Everyone Says It Says?.
A lot of questions remain about if and how these programs work. We have observed results for only the first year of our intervention. We are continuing to collect data to evaluate effects in the second and third years.
— Illinois Workplace Wellness Study website
The University of Illinois study rolls on as the researchers demonstrate they are eager to uncover the truth and not just confirm over-simplified pre-existing notions about whether wellness works or doesn’t work. Notice that they called their paper, “What Do Workplace Wellness Programs Do?” rather than using a title that declares the issue put to rest, like, say, “Workplace Wellness Doesn’t Work.”
Ultimately, they may very well find that the Illinois program doesn’t yield the desired outcomes (potentially a real kick in the pants for Aetna, one of the researchers’ “collaborators”). Or that it does. If we knew for sure, there’d be no point in the study.
Personally, I’ve never had any reason to believe a wellness program would reduce an employer’s health care costs. But, so far, there isn’t anything in this study I’d cite to support that opinion.
Randomized Controlled Trials vs. Observational Studies
One of the most interesting things about the study is its design. It’s a randomized controlled trial (RCT) — a rare sighting in the world of wellness — and the researchers compared their findings to what they would have concluded if their data had come from an observational study (the kind that almost all our healthy lifestyle guidance is based on — from “physical activity is good for you” to “don’t inhale too much asbestos”). That comparison potentially explains, as Aaron Carroll argues in his column, why some wellness studies show that wellness does work. Or, as one of the Illinois study’s principal investigators wrote to me in private correspondence, “Methodology matters.”
Even when methods are about as good as can be, we probably should never trust a single study with high confidence…Take, for example, the randomized controlled trial (RCT). It’s reasonably considered the gold standard of social science methods. When you read the results of a well-conducted RCT, does that mean you can take them and run with it? Not so fast. They may not apply outside the population studied.
— Austin Frakt, co-editor with Aaron Carroll of the Incidental Economist, in Limitations: The Achilles Heel of Single-Study Relevance
No Reason to Expect Improvement
The Illinois Wellness study is and will continue to be important. It has the potential to deliver actionable insights into the value of incentives; the profile of employees who tend to engage in wellness programs; the types of programs that are and aren’t effective; and, ultimately, the behavioral, health, financial, and productivity outcomes we can expect from comparable programs.
For the university, the iThrive program is a good start — more thoroughly thought-out than most new programs. (Thanks to the study’s transparency, a large employer seeking to launch a wellness program could use the study’s published material to develop a program template — though I’d recommend skipping the incentives and the screenings, and adding longer-range plans for a more comprehensive strategy.) But that’s what it is — a start.
In a non-study situation, smart leaders of a “comprehensive” program, seeing that Year 1 activities had no effect on anything, would undertake a quality improvement process and make adjustments accordingly. After all, if there aren’t any behavior changes in Year 1 — and environment, culture, and work design aren’t even on the radar — there’s no reason to expect health, financial, or productivity improvements in the following years.
I admire the researchers for refraining from drawing sensational conclusions from their Year 1 data. Now, it’s up to thought leaders with a media platform, and up to us — those responsible for applying research findings to our programs — to exercise the same restraint.