Could running actually be good for your knees?
That idea is at the heart of a fascinating new study of the differing effects of running and walking on the knee joint. Using motion capture and sophisticated computer modeling, the study confirms that running pummels knees more than walking does. But in the process, the authors conclude, running likely also fortifies and bulks up the cartilage, the rubbery tissue that cushions the ends of bones. The findings raise the beguiling possibility that, instead of harming knees, running might strengthen them and help to stave off knee arthritis.
Of course, the notion that running wrecks knees is widespread and entrenched. Almost anyone who runs is familiar with warnings from well-meaning, nonrunning family members, friends and strangers that their knees are doomed.
This concern is not unwarranted. Running involves substantial joint bending and pounding, which can fray the cushioning cartilage inside the knee. Cartilage, which does not have its own blood supply, generally is thought to have little ability to repair itself when damaged or to change much at all after childhood. By that logic, repeated running should wear away fragile cartilage and, almost inevitably, lead to crippling knee arthritis.
But in real life, it does not. Some runners develop knee arthritis, but not all. As a group, in fact, runners may be statistically less likely to become arthritic than nonrunners.
The question of why running spares so many runners’ knees has long intrigued Ross Miller, an associate professor of kinesiology at the University of Maryland in College Park. In earlier research, he and his colleagues had looked into whether running mechanics matter, by asking volunteers to walk and run along a track outfitted with plates to measure the forces generated with each step.
The resulting data showed that people hit the ground harder while running, clobbering their knees far more with each stride. But they also spent more time aloft between strides, meaning they took fewer strides while covering the same distance as when walking. So, the cumulative forces moving through their knees over time should be about the same, the researchers concluded, whether someone walked or ran.
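The arithmetic behind that conclusion can be sketched in a few lines. The numbers below are purely illustrative (the article does not report specific loads or stride lengths); the point is only that a higher per-stride load can be offset by fewer strides over the same distance.

```python
# Toy illustration with hypothetical numbers, not values from the study:
# cumulative knee load over a distance = load per stride x number of strides.

def cumulative_load(per_stride_load, stride_length_m, distance_m):
    """Total load moving through the knee over the given distance."""
    strides = distance_m / stride_length_m
    return per_stride_load * strides

DISTANCE = 1000.0  # one kilometer

# Assumed values: running roughly doubles the per-stride load,
# but the longer airborne stride also roughly doubles stride length.
walk = cumulative_load(per_stride_load=1.0, stride_length_m=0.75, distance_m=DISTANCE)
run = cumulative_load(per_stride_load=2.0, stride_length_m=1.5, distance_m=DISTANCE)

print(walk, run)  # with these assumed numbers, the cumulative loads match
```

Under these made-up inputs the two totals come out equal, mirroring the researchers' conclusion that cumulative knee load per distance is about the same for walking and running.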
But, recently, Dr. Miller had begun to doubt whether this finding really explained why running wasn’t wrecking more knees. He knew that some recent studies with animals intimated that cartilage might be more resilient than researchers previously had believed. In those studies, animals that ran tended to have thicker, healthier knee cartilage than comparable tissues from sedentary animals, suggesting that the active animals’ cartilage had changed in response to their running.
Perhaps, Dr. Miller speculated, cartilage in human runners’ knees likewise might alter and adapt.
To find out, he again asked a group of healthy young men and women to walk and run along a track containing force plates, while he and his colleagues filmed them. The researchers then computed the forces the volunteers had generated while strolling and running. Finally, they modeled what the future might hold for the volunteers’ knees.
More specifically, they used the force-plate numbers, plus extensive additional data from other sources, including past studies in which biopsied cartilage was pulled and pummeled in the lab until it fell apart, to create computer simulations. They wanted to see what, theoretically, would happen to healthy knee cartilage if an adult walked for six kilometers (about 3.7 miles) every day for years, compared to if they walked for three kilometers and ran for another three kilometers each of those days.
They also tested two additional theoretical situations. For one, the researchers programmed in the possibility that people’s knee cartilage would slightly repair itself after repeated small damage from walking or running — but not otherwise change. And for the last scenario, they presumed that the cartilage would actively remodel itself and adapt to the demands of moving, growing thicker and stronger, much as muscle does when we exercise.
The models’ final results were eye-opening. According to the simulations, daily walkers faced about a 36 percent chance of developing arthritis by the age of 55, if the model did not include the possibility of the knee cartilage adapting or repairing itself. That risk dropped to about 13 percent if cartilage were assumed to be able to repair or adapt, which is about what studies predict to be the real-world arthritis risk for otherwise healthy people.
The numbers for running were more worrisome. When the model assumed cartilage cannot change, the runners’ risk of eventual arthritis was a whopping 98 percent, declining only to 95 percent if the model factored in the possibility of cartilage repair. In effect, according to this scenario, the damage to cartilage from frequent running would overwhelm any ability of the tissue to fix itself.
But if the model included the likelihood of the cartilage actively adapting — growing thicker and cushier — when people ran, the odds of runners developing arthritis fell to about 13 percent, the same as for healthy walkers.
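The qualitative logic of those three scenarios can be illustrated with a deliberately simple sketch. This is not the study's model, and every number in it is an assumption; it only shows why the outcomes diverge: damage accumulates daily, repair removes a little of it, and adaptation additionally raises the tissue's tolerance over time.

```python
# Illustrative sketch only -- NOT the study's simulation. Three assumptions:
#   1) cartilage cannot change: damage just accumulates
#   2) cartilage repairs: a little damage is removed each day
#   3) cartilage adapts: repair, plus the damage threshold itself grows
#      as the tissue thickens and strengthens, much as muscle does.

def years_to_failure(daily_damage, daily_repair=0.0, adapt_rate=0.0,
                     threshold=100.0, max_years=60):
    """Return the year accumulated damage crosses the threshold, or None."""
    damage = 0.0
    for year in range(1, max_years + 1):
        damage += 365 * max(daily_damage - daily_repair, 0.0)
        threshold += 365 * adapt_rate  # adaptation strengthens the tissue
        if damage >= threshold:
            return year
    return None  # cartilage holds up over the simulated span

# Hypothetical per-day numbers, chosen only to show the pattern:
print(years_to_failure(daily_damage=0.02))                      # no change: fails early
print(years_to_failure(daily_damage=0.02, daily_repair=0.005))  # repair: fails later
print(years_to_failure(daily_damage=0.02, daily_repair=0.005,
                       adapt_rate=0.02))                        # adaptation: holds up
```

With these invented rates, repair alone only delays the crossing point, while adaptation keeps the threshold ahead of the accumulating damage indefinitely, which is the shape of the study's result: repair barely dents the modeled runners' risk, but adaptation brings it down to the walkers' level.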
What these results suggest is that cartilage is malleable, Dr. Miller says. It must be able to sense the strains and slight damage from running and rebuild itself, becoming stronger. In this scenario, running bolsters cartilage health.
Modeled results like these are theoretical, though, and limited. They do not explain how cartilage remodels itself without a blood supply or if genetics, nutrition, body weight, knee injuries and other factors affect individual arthritis risks. Such models also do not tell us if different distances, speeds or running forms would alter the outcomes. To learn more, we will need direct measures of molecular and other changes in living human cartilage after running, Dr. Miller says, but such tests are difficult.
Still, this study may quiet some runners’ qualms — and those of their families and friends. “It looks like running is unlikely to cause knee arthritis by wearing out cartilage,” Dr. Miller says.