In a recent New York Times article, Gina Kolata discusses the origin and legitimacy of the much-touted and oft-parroted "ten percent rule" -- you know, the claim that, in distance training, increasing weekly mileage by more than ten percent over the previous week's mileage puts a runner at a significantly increased risk of injury. Where did this rule come from, she asks, and what kind of evidence is there for its veracity?
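(One thing worth seeing up front: because the rule is multiplicative, those weekly bumps compound rather than add. Here's a quick back-of-the-envelope sketch; the starting point of twenty miles per week is just a number I picked for illustration, not anything from Kolata's article.)

```python
# Illustrative only: weekly mileage if you add exactly 10% every week,
# starting from a hypothetical 20 miles/week.
mileage = 20.0
for week in range(1, 13):
    mileage *= 1.10  # the "ten percent rule" ceiling
    print(f"Week {week:2d}: {mileage:5.1f} miles")

# After 12 weeks of maximal 10% bumps, 20 miles/week has grown to
# about 63 miles/week -- the increases compound, they don't add.
```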
Kolata (who is in fact a runner) first began looking into the rule when a runner friend of hers was recovering from a stress fracture and attempting to work back up to the sixty-mile weeks he was running pre-injury. Try as she might, though, she was unable to find the original source. What's more, she couldn't find any scientific evidence that it was even true. Could it be that this sacred training principle that runners of every ilk have sworn by for ages untold is nothing more than an urban legend, like repeating "bloody Mary" three times in a darkened bathroom while staring at the mirror? ("You know what I heard? I heard that if you up your mileage by more than ten percent a week, then a month later your IT band will melt off your body. Totally happened to my friend's cousin's roommate's best friend's sister.")
I think that one of the Great Truths of Adulthood for me has been that we don't always necessarily know as much as we think we know. I can recall several times in the last ten years or so when some "fact" or other piece of conventional wisdom that I've accepted and repeated for years has been called into question; someone will ask, "Wait, how do you know that?" And I'll find myself going, "Wait a minute, how do I know that?" How do I know that I shouldn't increase my mileage by more than ten percent per week? Because someone, probably a high school coach, told me so some indeterminate number of years ago. And because I probably read it in some running book written by another coach, who didn't reference anything other than his or her many years of personal experience. And we've all been repeating it back to one another ever since.
The one scientifically valid, controlled, peer-reviewed study that Kolata did find was conducted by the University Center for Sport, Exercise and Health at the University of Groningen. (You can read the full study here.) The study followed two randomly assigned groups of novice runners who wanted to build up their training in order to run a four-mile race, and it recorded rates of injury in both groups. One group increased their training volume by ten percent per week for eleven weeks, ending at ninety minutes of running per week; the other group ramped up significantly faster, reaching ninety-five minutes per week after only eight weeks of training. At the end of the study, the injury rate in both groups was about 20%.
Now, before I make any interpretations here, I should mention that in addition to being a runner, I am also a mathematician, and as such have studied statistics and research methodology. That doesn't make me the most expert-iest expert out there on this study or any other, but it does mean I've been trained not to make certain mistakes in interpreting research data, particularly involving statistics.
For example, it's important not to make the mistake of assuming a study like this actually proves anything. Empirical questions (like, "Does increasing weekly mileage by more than 10% increase the risk of injury in distance runners?") can't ever really be proven or disproven. When a study like this finds no difference between groups, all we can really say is that we failed to find evidence for the hypothesis (in statistics-speak, we fail to reject the null hypothesis of no difference), and that we now have a little more reason to suspect the hypothesis is false. I.e., this study does not DISPROVE the 10% rule; it only fails to support it, and provides some evidence that the rule may not be true. Even if we had a million studies like this, we'd only be gathering stronger and stronger evidence that the rule wasn't true. When we have enough scientific evidence, we start to feel that it's highly LIKELY that the claim is universally true or false, but that's not the same as proof.
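If you want to see what "fails to support" looks like in actual numbers, here's a toy hypothesis test in Python. The group sizes are completely made up (the paper reports its own, which I'm not quoting from memory); only the roughly-20% injury rates echo the result described above.

```python
# A toy two-proportion z-test, just to illustrate "failing to reject"
# vs. "disproving." The group sizes (250 each) are invented for this
# example; only the ~20% injury rates echo the study's reported result.
from math import sqrt, erf

n1, n2 = 250, 250          # hypothetical group sizes
x1, x2 = 50, 52            # injured runners: 20.0% vs 20.8%

p1, p2 = x1 / n1, x2 / n2
pooled = (x1 + x2) / (n1 + n2)
se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided

print(f"z = {z:.2f}, p = {p_value:.2f}")
# Prints z = -0.22, p = 0.82: nowhere near significance. That's a
# failure to find a difference, not proof that none exists -- with
# modest samples, even a real effect could easily go undetected.
```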
Secondly, rigorous, controlled scientific studies tend to be somewhat specific by their nature. Researchers try to limit variation among participants as much as possible so that, if they do see an effect in the treatment group, they'll know that it's coming from the one variable they changed as opposed to some other one that they didn't control for. In this study, the researchers needed the runners to be about the same age (around 40), at about the same level in their running (beginners), shooting for the same distance (four miles), etc. This is great for making sure that nothing other than the mileage increase could be causing any difference in injury rates, but it doesn't necessarily tell us much about other types of runners. It seems as if the 10% rule may not help 40-year-old novices avoid injury, but we can't say much about older or younger runners, or runners with more experience.
Even given all that, though, I think the study is still worth paying attention to because it calls into question conventional wisdom using scientific methodology, and in particular, conventional wisdom that no one has ever tried to test scientifically before. Certainly more research is needed (on runners of different ages, experience levels, injury histories, etc.), but if for no other reason, this is an important result simply because it's the first real evidence we have either way.
As for practical applications, here's what I think this study does NOT warrant. It does NOT warrant forgoing any advice whatsoever to new runners about how quickly they should increase mileage. It does NOT warrant nodding and smiling when someone decides to try going from fifteen miles a week to fifty. But maybe it does warrant teaching runners to err on the conservative side when adding miles and to adjust according to physical symptoms, rather than handing them a (potentially) arbitrary number whose origins no one can seem to recall. After all, the ten percent rule can cut the other way, too; there have definitely been times in my own training when even a 6-8% increase was decidedly too much, and I would've been a fool to blindly keep upping my mileage by 10% every week.
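Just to make "conservative and symptom-driven" concrete, here's a sketch of what that kind of advice might look like as a rule of thumb. The 5% ceiling and the cut-back responses are numbers I invented for illustration; nothing in the study prescribes them.

```python
# A sketch of "conservative and symptom-driven" mileage planning.
# The 5% cap and the cut-back factors are arbitrary illustrative
# numbers, not anything prescribed by the study.
def next_week_miles(current: float, felt_fine: bool, sore: bool) -> float:
    if sore:                   # persistent aches: back off
        return current * 0.80
    if not felt_fine:          # merely tired: hold steady
        return current
    return current * 1.05      # feeling good: modest 5% bump

miles = 15.0
for felt_fine, sore in [(True, False), (True, False), (False, False), (True, True)]:
    miles = next_week_miles(miles, felt_fine, sore)
    print(f"{miles:.1f} miles next week")
```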
Finally, I am sure there are plenty of folks out there who will continue to swear by the 10% rule, even if we get fifty more studies that fail to show that it reduces injury rates, citing that it's worked for them and theirs for x number of years and why fix what ain't broke. To that, I would just point out the difference between individuals continuing to do what has worked well for them (smart) and generalizing anecdotal evidence into an overarching training principle (not smart). Unfortunately, humans are pretty much built to pay more attention to isolated, personal experiences (or perceived experiences) than to scientific evidence, precisely because trusting personal experience often does work out fine for the individual. For large groups, though, that's not the case, and that's something I think we need to be careful about. If the 10% rule works for you, by all means keep doing it; do bear in mind when you're discussing mileage with new runners, though, that we now have evidence that it may not generalize to everyone under all circumstances.