Curbing nearsightedness in children: Can outdoor time help?

It turns out that when your mother told you to stop sitting so close to the TV or you might need glasses, she was onto something.

Myopia, or nearsightedness, is a growing problem worldwide. While a nearsighted child can see close objects clearly, more distant objects look blurry. Part of this growing problem, according to experts, is that children are spending too much time indoors looking at things close to them rather than going outside and looking at things that are far away.

What is nearsightedness?

Nearsightedness is very common, affecting about 5% of preschoolers, 9% of school-age children, and 30% of teens. But what worries experts is that over the last few decades its global prevalence has doubled — and during the pandemic, eye doctors have noticed an increase in myopia.

Nearsightedness happens when the eyeball grows too long from front to back. Genes play a big role, but a growing body of research points to developmental factors as well. The stereotype of the nerd wearing glasses actually holds up: research shows that the more years one spends in school, the higher the risk of myopia. Studies also show, even more reliably, that spending time outdoors can decrease a child’s risk of developing myopia.

Why would outdoor time make a difference in nearsightedness?

While surprising, this actually makes some sense. As children grow and change, their lifestyles affect their bodies. A child who is undernourished, for example, may not grow as tall as they might have with better nourishment. A child who develops obesity during childhood is far more likely to have lifelong obesity. And the eyes of a child who is always looking at things up close may adapt to that near focus and lose some ability to see far away.

Nearsightedness has real consequences. Not only does it cause problems with everyday tasks that require seeing more than a few feet away, such as reading the board at school or driving, but people with myopia are also at higher risk of retinal detachment and blindness. The problems can’t always be fixed with a pair of glasses.

What can parents do?

  • Make sure your child spends time outdoors regularly — every day, if possible. That’s the best way to be sure that they look at things far away. It’s also a great way to get them to be more active, get enough vitamin D, and learn some important life skills.
  • Try to limit the amount of time your child spends close to a screen. These days, a lot of schoolwork is on screens, but children are also spending far too much of their playtime on devices rather than playing with toys, drawing, or doing other activities. Have some ground rules. The American Academy of Pediatrics recommends no more than two hours of entertainment media a day, and has a great Family Media Plan to help families make this happen.
  • Have your child’s vision checked regularly. Most pediatricians do regular vision screening, but it is important to remember that basic screening can miss vision problems. It’s a good idea for your child to have a full vision examination from an ophthalmologist or an optometrist by kindergarten.
  • Call your pediatrician or child’s eye doctor if you notice signs of a possible vision problem, such as
    • sitting close to the television or holding devices close to the face
    • squinting or complaining of any difficulty seeing
    • not being able to identify objects far away (when you go for walks, play I Spy and point to some far-away things!)
    • avoiding or disliking activities that involve looking at things up close, like doing puzzles or looking at books, which can be a sign of hyperopia (farsightedness)
    • tilting their head to look at things
    • covering or rubbing an eye
    • one eye that turns inward or outward.

If you have any questions or concerns about your child’s vision, talk to your pediatrician.

Follow me on Twitter @drClaire

About the Author

Claire McCarthy, MD, Senior Faculty Editor, Harvard Health Publishing

Claire McCarthy, MD, is a primary care pediatrician at Boston Children’s Hospital and an assistant professor of pediatrics at Harvard Medical School.

Considering collagen drinks and supplements?

A tremendous buzz surrounds collagen drinks and supplements, as celebrities and influencers tout miraculous benefits for skin, hair, and nails. Since the collagen in our bodies provides crucial support for these tissues, it seems plausible that consuming collagen might lead to lush locks and a youthful glow. But what does the science say?

What is collagen?

Collagen is a major structural protein in our tissues. It’s found in skin, hair, nails, tendons, cartilage, and bones. Collagen works with other substances, such as hyaluronic acid and elastin, to maintain skin elasticity, volume, and moisture. It also helps make up proteins such as keratin that form skin, hair, and nails.

Our bodies naturally produce collagen using the amino acids from protein-rich or collagen-rich foods like bone broth, meat, and fish. But aging, sun damage, smoking, and alcohol consumption all decrease collagen production.

Collagen drinks and supplements often contain collagen from many different sources, such as fish, cattle, pigs, or chicken. Typically, they contain peptides, short chains of amino acids that help make up essential proteins in the body, including collagen itself and keratin.

What does the science say about collagen drinks and supplements?

Research on skin includes:

  • A review and analysis of 19 studies with a total of 1,125 participants, published in the International Journal of Dermatology. Those who used collagen supplements saw an improvement in the firmness, suppleness, and moisture content of the skin, with wrinkles appearing less noticeable. That sounds promising, but it’s unclear if these skin improvements were actually due to collagen. Most of the trials used commercially available supplements that contained more than collagen: vitamins, minerals, antioxidants, coenzyme Q10, hyaluronic acid, and chondroitin sulfate were among the additional ingredients.
  • A few randomized, controlled trials show that drinking collagen supplements with high amounts of the peptides prolylhydroxyproline and hydroxyprolylglycine can improve skin moisture, elasticity, wrinkles, and roughness. But large, high-quality studies are needed to learn whether commercially available products are helpful and safe to use long-term.

Hardly any evidence supports the use of collagen to enhance hair and nails. One small 2017 study of 25 people with brittle nails found that taking 2.5 grams of collagen daily for 24 weeks improved brittleness and nail growth. However, this small study had no control group taking a placebo to compare with the group receiving collagen supplements.

There haven’t been any studies in humans examining the benefits of collagen supplementation for hair. Currently, no medical evidence supports marketing claims that collagen supplements or drinks can improve hair growth, shine, volume, and thickness.

Should you try collagen supplements or drinks?

At this time, there isn’t enough proof that taking collagen pills or consuming collagen drinks will make a difference in skin, hair, or nails. Our bodies cannot absorb collagen in its whole form. To enter the bloodstream, it must be broken down into peptides so it can be absorbed through the gut.

These peptides may be broken down further into the building blocks that make proteins like keratin that help form skin, hair, and nails. Or the peptides may form collagen that gets deposited in other parts of the body, such as cartilage, bone, muscles, or tendons. Thus far, no human studies have clearly proven that collagen you take orally will end up in your skin, hair, or nails.

If your goal is to improve skin texture and elasticity and minimize wrinkles, you’re better off focusing on sun protection and using topical retinoids. Extensive research has already demonstrated that these measures are effective.

If you choose to try collagen supplements or drinks, review the list of ingredients and the protein profile. Avoid supplements with too many additives or fillers. Products containing high quantities of prolylhydroxyproline and hydroxyprolylglycine may be better at reducing wrinkles and improving the moisture content of skin.

Consult your doctor before starting any new supplements. People who are prone to gout or have other medical conditions that require them to limit protein should not use collagen supplements or drinks.

The bottom line

Large-scale trials evaluating the benefits of oral collagen supplements for skin and hair health are not available. If you’re concerned about thinning or lackluster hair, brittle nails, or keeping skin smooth and healthy, talk to your doctor or a dermatologist for advice on the range of options.

It will also help to:

  • Follow a healthy lifestyle and eat a balanced diet that includes protein-rich foods.
  • If you smoke, quit.
  • Limit alcohol to two drinks or less in a day for men or one drink or less in a day for women.
  • Apply sunscreen daily and remember to reapply every two hours.
  • Wear wide-brimmed or UV-protective hats and clothing when you’re spending a lot of time in the sun.

Follow Payal Patel on Twitter @PayalPatelMD

Follow Maryanne Makredes Senna on Twitter @HairWithDrMare

About the Authors

Payal Patel, MD, Contributor

Dr. Payal Patel is a dermatology research fellow at Massachusetts General Hospital. Her clinical and research interests include autoimmune disease and procedural dermatology. She is part of the Cutaneous Biology Research Center.

Maryanne Makredes Senna, MD, Contributor

Dr. Maryanne Makredes Senna is a board-certified dermatologist at Beth Israel Lahey Health and an assistant professor of dermatology at Harvard Medical School. She founded and directs the Lahey Hair Loss Center of Excellence.

Does less TV time lower your risk for dementia?

Be honest: just how much television are you watching? One study has estimated that half of American adults spend two to three hours each day watching television, with some watching as much as eight hours per day.

Is time spent on TV a good thing or a bad thing? Let's look at some of the data in relation to your risks for cognitive decline and dementia.

Physical activity does more to sharpen the mind than sitting

First, the more time you sit and watch television, the less time you have available for physical activity. Getting sufficient physical activity decreases your risk of cognitive impairment and dementia. Not surprisingly, if you spend a lot of time sitting and doing other sedentary behaviors, your risk of cognitive impairment and dementia will be higher than that of someone who spends less time sitting.

Is television actually bad for your brain?

Okay, so it's better to exercise than to sit in front of the television. You knew that already, right?

But if you're getting regular exercise, is watching television still bad for you? The first study suggesting that, yes, television is still bad for your brain was published in 2005. After controlling for year of birth, gender, income, and education, the researchers found that each additional hour of television viewing in middle age was associated with a 1.3 times higher risk of developing Alzheimer's disease. Moreover, participating in intellectually stimulating activities and social activities reduced the risk of developing Alzheimer's.

Although this study had fewer than 500 participants, its findings have never been refuted. But would these results hold up when a larger sample was examined?

Television viewing and cognitive decline

In 2018, researchers analyzed data from the UK Biobank study, which follows approximately 500,000 individuals in the United Kingdom who were 37 to 73 years old when first recruited between 2006 and 2010. The demographic information reported was somewhat sparse: 88% of the sample was described as white and 11% as other; 54% were women.

The researchers examined baseline participant performance on several different cognitive tests, including those measuring

  • prospective memory (remembering to do an errand on your way home)
  • visual-spatial memory (remembering a route that you took)
  • fluid intelligence (important for problem solving)
  • short-term numeric memory (keeping track of numbers in your head).

Five years later, many participants repeated certain tests. Depending on the test, the number of participants evaluated ranged from 12,091 to 114,373. The results of this study were clear. First, at baseline, more television viewing time was linked with worse cognitive function across all cognitive tests.

More importantly, television viewing time was also linked with a decline in cognitive function five years later for all cognitive tests. Although this type of study cannot prove that television viewing caused the cognitive decline, it suggests that it does.

Further, the type of sedentary activity chosen mattered. Both driving and television were linked to worse cognitive function. But computer use was actually associated with better cognitive function at baseline, and a lower likelihood of cognitive decline over the five-year study.

Television viewing and dementia

In 2022, researchers analyzed this same UK Biobank sample with another question in mind: Would time spent watching television versus using a computer result in different risks of developing dementia over time?

Their analyses included 146,651 people from the UK Biobank, ages 60 and older. At the start of the study, none had been diagnosed with dementia.

Over 12 years, on average, 3,507 participants (2.4%) were diagnosed with dementia. Importantly, after controlling for participant physical activity:

  • time spent watching television increased the risk of dementia
  • time spent using the computer decreased the risk of dementia.

These changes in risk were not small. Those who watched the most television daily — more than four hours — were 24% more likely to develop dementia. Those who used computers interactively (not passively streaming) more than one hour daily as a leisure activity were 15% less likely to develop dementia.

Studies like these can only note links between behaviors and outcomes. It's always possible that the causation works the other way around. In other words, it's possible that people who were beginning to develop dementia started to watch television more and use the computer less. The only way to know for sure would be to randomly assign people to watch specific numbers of hours of television each day while keeping the amount of exercise everyone did the same. That study is unlikely to happen.

The bottom line

If you watch more than one hour of TV daily, my recommendation is to turn it off and do activities that we know are good for your brain. Try physical exercise, using the computer, doing crossword puzzles, dancing and listening to music, and participating in social and other cognitively stimulating activities.

About the Author

Andrew E. Budson, MD, Contributor; Editorial Advisory Board Member, Harvard Health Publishing

Dr. Andrew E. Budson is chief of cognitive & behavioral neurology at the Veterans Affairs Boston Healthcare System, lecturer in neurology at Harvard Medical School, and chair of the Science of Learning Innovation Group.