Think of this as a kind of medical detective story. Start with this: The percentage of Americans who are nearsighted has gone way up in 30 years.
That's according to a study published last month in the Archives of Ophthalmology. Susan Vitale, an epidemiologist at the National Eye Institute, which is part of the National Institutes of Health, and her co-authors looked at a national survey that gave vision tests to Americans in the early 1970s. It was then repeated with a similar group of people 30 years later. "The prevalence of myopia, or nearsightedness, in people age 12 to 54 went from 25 percent to 41.6 percent," explains Vitale. "So that's about a 66 percent increase."
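Vitale's "about a 66 percent increase" can be checked with a quick calculation. This is a minimal sketch using only the two prevalence figures quoted above:

```python
# Prevalence of myopia among Americans age 12 to 54, as quoted
# from the Archives of Ophthalmology study described above.
prevalence_1970s = 25.0   # percent, early 1970s survey
prevalence_2000s = 41.6   # percent, survey repeated ~30 years later

# Relative increase in prevalence between the two surveys.
relative_increase = (prevalence_2000s - prevalence_1970s) / prevalence_1970s * 100
print(f"about a {relative_increase:.0f} percent increase")
```

The absolute rise is 16.6 percentage points, but relative to the 1970s baseline of 25 percent, that works out to roughly 66 percent, matching the figure Vitale gives.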
The question is: Why? And in a mystery story, you've got suspects. Says Vitale, "Some of the risk factors that we know about for myopia are things like genetics, which is whether you have a family history of myopia. Things, possibly, like the amount of near-work that you do."
Near-Work Is A Prime Suspect
Genetics — or heredity — is by far the main thing that determines who becomes nearsighted. But then there's what scientists like Vitale call near-work. Those are the things you do close up with your eyes, like reading or watching television or playing video games.
Near-work has been a suspect for hundreds of years. Even Johannes Kepler, the German astronomer and mathematician who came up with modern ideas for the lenses that correct nearsightedness, blamed his own fuzzy eyesight on all the reading he did.
"Kepler wrote about it, about 400 years ago, that he thought his nearsightedness was due to his intense study of astronomical tables and so forth," says Dr. Don Mutti, of the College of Optometry at the Ohio State University.
Mutti is a kind of detective of myopia. Like other scientists researching the question, he suspected that after genetics, things like reading were probably a big cause.
"It's the popular stereotype," he says, recalling the warnings of generations of parents. "Don't watch too much TV, or don't read under the covers with a flashlight."
A Long-Term Investigation
For the past 20 years, Mutti has followed a group — from childhood to adulthood — to see who develops myopia. He found something significant: Time spent outdoors during childhood was important.
"If you have two nearsighted parents and you engage in a low level of outdoor activity, your chances of becoming myopic by the eighth grade are about 60 percent," he says. "If children engaged in over 14 hours per week of outdoor activity, their chances of becoming nearsighted were now only about 20 percent. So it was quite a dramatic reduction in the risk of becoming myopic."
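Mutti's figures imply a large relative drop in risk, not just an absolute one. A minimal sketch, using only the two probabilities he quotes for children with two nearsighted parents:

```python
# Chance of becoming myopic by eighth grade for a child with two
# nearsighted parents, as quoted by Mutti above.
risk_low_outdoor = 0.60   # low level of outdoor activity
risk_high_outdoor = 0.20  # over 14 hours per week outdoors

# Relative reduction in risk associated with high outdoor activity.
relative_reduction = (risk_low_outdoor - risk_high_outdoor) / risk_low_outdoor
print(f"risk falls by about {relative_reduction:.0%}")
```

Going from a 60 percent chance to a 20 percent chance is a 40-point absolute drop, but relative to the starting risk it is roughly a two-thirds reduction, which is why Mutti calls it dramatic.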
At first, that seems to support the theory that near-work causes nearsightedness: The more time kids spend indoors, the more likely they are to be watching TV or reading a book.
But then Mutti and his colleagues looked closely at the kids before they became nearsighted. And the reading and close-up things they did didn't predict who'd be nearsighted later. "What we found is that near-work had no influence at all," he says. "Children really aren't doing any more or less near-work — the children who are becoming nearsighted."
So that's another mystery. Why, then, does spending time outdoors make a difference? At first, scientists thought the outdoor exercise was the key. But it turned out that kids who exercise indoors don't get the same protection against myopia.
Now, researchers are studying whether outdoor light somehow changes the way the eye grows.
"Light levels might have a beneficial effect on the eye," notes Mutti. "Light levels change certain aspects in retinal physiology."
The detectives of myopia are on the case.