Wednesday, November 20, 2019

Ketogenic diet helps tame flu virus

A high-fat, low-carbohydrate diet like the Keto regimen has its fans, but influenza apparently isn't one of them.
Mice fed a ketogenic diet were better able to combat the flu virus than mice fed food high in carbohydrates, according to a new Yale University study published Nov. 15 in the journal Science Immunology.
The ketogenic diet -- which for people includes meat, fish, poultry, and non-starchy vegetables -- activates a subset of T cells in the lungs not previously associated with the immune system's response to influenza, enhancing mucus production from airway cells that can effectively trap the virus, the researchers report.
"This was a totally unexpected finding," said co-senior author Akiko Iwasaki, the Waldemar Von Zedtwitz Professor of Immunobiology and Molecular, Cellular and Developmental Biology, and an investigator of the Howard Hughes Medical Institute.
The research project was the brainchild of two trainees -- one working in Iwasaki's lab and the other with co-senior author Vishwa Deep Dixit, the Waldemar Von Zedtwitz Professor of Comparative Medicine and of Immunobiology. Ryan Molony worked in Iwasaki's lab, which had found that immune system activators called inflammasomes can cause harmful immune system responses in their host. Emily Goldberg worked in Dixit's lab, which had shown that the ketogenic diet blocked formation of inflammasomes.
The two wondered if diet could affect immune system response to pathogens such as the flu virus.
They showed that mice fed a ketogenic diet and infected with the influenza virus had a higher survival rate than mice on a normal high-carbohydrate diet. Specifically, the researchers found that the ketogenic diet -- but not the high-carbohydrate diet -- triggered the release of gamma delta T cells, immune system cells that produce mucus in the cell linings of the lung.
When mice were bred without the gene that codes for gamma delta T cells, the ketogenic diet provided no protection against the influenza virus.
"This study shows that the way the body burns fat to produce ketone bodies from the food we eat can fuel the immune system to fight flu infection," Dixit said.

Too much ultra-processed food linked to lower heart health

Ultra-processed foods, which account for more than half of an average American's daily calories, are linked to lower measures of cardiovascular health, according to preliminary research to be presented at the American Heart Association's Scientific Sessions 2019 -- November 16-18 in Philadelphia.


Researchers at the U.S. Centers for Disease Control and Prevention (CDC) found that for every 5% increase in calories from ultra-processed foods a person ate, there was a corresponding decrease in overall cardiovascular health. Adults who got approximately 70% of their calories from ultra-processed foods were half as likely to have "ideal" cardiovascular health, as defined by the American Heart Association's Life's Simple 7®, compared with people who got 40% or less of their calories from ultra-processed foods.
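As a back-of-the-envelope illustration of the quantity being measured -- the share of a person's daily calories that comes from ultra-processed foods -- here is a minimal sketch. The foods and calorie counts are invented for illustration; they are not NHANES data.

```python
# Hypothetical one-day dietary recall, in kcal (illustrative values only)
daily_calories = {
    "soft drink": 300,
    "packaged snack": 450,
    "chicken nuggets": 500,
    "vegetables": 150,
    "grilled fish": 400,
    "fruit": 200,
}
# Items the NOVA-style grouping would flag as ultra-processed
ultra_processed = {"soft drink", "packaged snack", "chicken nuggets"}

total = sum(daily_calories.values())
upf = sum(v for k, v in daily_calories.items() if k in ultra_processed)
share = 100 * upf / total  # percent of daily calories from ultra-processed foods
print(f"{share:.1f}% of calories from ultra-processed foods")  # 62.5%
```

This hypothetical day lands above the 40% threshold the study used for its healthier comparison group.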
Foods were categorized into groups by the extent and purpose of industrial processing they undergo. Ultra-processed foods are made entirely or mostly from substances extracted from foods, such as fats, starches, hydrogenated fats, added sugar, modified starch and other compounds and include cosmetic additives such as artificial flavors, colors or emulsifiers. Examples include soft drinks, packaged salty snacks, cookies, cakes, processed meats, chicken nuggets, powdered and packaged instant soups and many items often marketed as "convenience foods."
"Healthy diets play an important role in maintaining a healthy heart and blood vessels," said Zefeng Zhang, M.D., Ph.D., an epidemiologist at the CDC. "Eating ultra-processed foods often displaces healthier foods that are rich in nutrients, like fruit, vegetables, whole grains and lean protein, which are strongly linked to good heart health. In addition, ultra-processed foods are often high in salt, added sugars, saturated fat and other substances associated with increasing the risk of heart disease."
Using data from the National Health and Nutrition Examination Survey (NHANES) collected between 2011 and 2016, researchers at the CDC reviewed the results from 13,446 adults, 20 years of age and older, who completed a 24-hour dietary recall and answered questions about their cardiovascular health.
Cardiovascular health is defined by the American Heart Association's Life's Simple 7 as measures of healthy blood pressure, cholesterol and blood glucose levels, avoidance of tobacco products, good nutrition, healthy body weight and adequate physical activity.
"This study underscores the importance of building a healthier diet by eliminating foods such as sugar-sweetened beverages, cookies, cakes and other processed foods," said Donna Arnett, Ph.D., past-president of the American Heart Association and dean of the College of Public Health at the University of Kentucky in Lexington. "There are things you can do every day to improve your health just a little bit. For example, instead of grabbing that loaf of white bread, grab a loaf of bread that's whole grain or wheat bread. Try replacing a hamburger with fish once or twice a week. Making small changes can add up to better heart health."

Monday, October 14, 2019

Being overweight before age 40 increases cancer risk

In an international study led by the University of Bergen in Norway, researchers examined how adult overweight (BMI over 25) and obesity (BMI over 30) increase the risk of different types of cancer.
The study showed that if you were overweight before age 40, the risk of developing cancer increases by:
  • 70 percent for endometrial cancer.
  • 58 percent for male renal-cell cancer.
  • 29 percent for male colon cancer.
  • 15 percent for all obesity-related cancers (both sexes).
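BMI, the measure behind these categories, is body weight in kilograms divided by the square of height in meters. A minimal sketch of the cutoffs cited above (the function names and example values are illustrative, not from the study):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kg divided by height in meters, squared."""
    return weight_kg / height_m ** 2

def category(bmi_value: float) -> str:
    """Classify using the cutoffs cited in the article:
    overweight is BMI over 25, obesity is BMI over 30."""
    if bmi_value > 30:
        return "obese"
    if bmi_value > 25:
        return "overweight"
    return "not overweight"

# Example: an 85 kg adult who is 1.75 m tall
value = bmi(85, 1.75)
print(round(value, 1), category(value))  # 27.8 overweight
```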
"Obesity is an established risk factor for several cancers. In this study, we have focused on the degree, timing and duration of overweight and obesity in relation to cancer risk," says Professor Tone Bjørge, at Department of Global Public Health and Primary Care, University of Bergen.
Obesity increases risk over time
In the study, the researchers included adults with two or more measurements, obtained at least three years apart, and before a possible cancer diagnosis. On average, the individuals were followed for about 18 years.
Obese participants (BMI over 30) at the first and second health examination had the highest risk of developing obesity-related cancer, compared to participants with normal BMI.
"The risk increased by 64 percent for male participants and 48 percent for females," Bjørge says.
Avoid weight gain
Obesity is a global challenge and associated with increased risk of several types of cancer. The results from the study show that overweight and obese adults have an increased risk of postmenopausal breast, endometrial, renal-cell and colon cancer.
"Our key message is that preventing weight gain may be an important public health strategy to reduce the cancer risk," says Tone Bjørge.
Facts:
  • The researchers used data for 220,000 individuals from the Me-Can study, with participants from Norway, Sweden and Austria.
  • Data from health examinations, including information on height and weight, were linked to data from national cancer registries.
  • 27,881 individuals were diagnosed with cancer during follow-up, of which 9,761 (35 percent) were obesity-related.

Story Source:
Materials provided by The University of Bergen. Note: Content may be edited for style and length.

Can excessive athletic training make your brain tired? New study says yes

You'd expect excessive athletic training to make the body tired, but can it make the brain tired too? A new study reported in the journal Current Biology on September 26 suggests that the answer is "yes."
When researchers imposed an excessive training load on triathletes, they showed a form of mental fatigue. This fatigue included reduced activity in a portion of the brain important for making decisions. The athletes also acted more impulsively, opting for immediate rewards instead of bigger ones that would take longer to achieve.
"The lateral prefrontal region that was affected by sport-training overload was exactly the same that had been shown vulnerable to excessive cognitive work in our previous studies," says corresponding author Mathias Pessiglione of Hôpital de la Pitié-Salpêtrière in Paris. "This brain region therefore appeared as the weak spot of the brain network responsible for cognitive control."
Together, the studies suggest a connection between mental and physical effort: both require cognitive control. Such control is essential in demanding athletic training, the researchers suggest, because maintaining physical effort toward a distant goal means overriding the impulse to stop.
"You need to control the automatic process that makes you stop when muscles or joints hurt," Pessiglione says.
The researchers, including Pessiglione and first author Bastien Blain, explain that the initial idea for the study came from the National Institute of Sport, Expertise, and Performance (INSEP) in France, which trains athletes for the Olympic games. Some athletes had suffered from "overtraining syndrome," in which their performance plummeted as they experienced an overwhelming sense of fatigue. The question was: Did this overtraining syndrome arise in part from neural fatigue in the brain -- the same kind of fatigue that also can be caused by excessive intellectual work?
To find out, Pessiglione and colleagues recruited 37 competitive male endurance athletes with an average age of 35. Participants were assigned to either continue their normal training or to increase that training by 40% per session over a three-week period. The researchers monitored their physical performance during cycling exercises performed on rest days and assessed their subjective experience of fatigue using questionnaires every two days. They also conducted behavioral testing and functional magnetic resonance imaging (fMRI) scanning experiments.
The evidence showed that physical training overload led the athletes to feel more fatigued. They also acted more impulsively in standard tests used to evaluate how they'd make economic choices. This tendency was shown as a bias in favoring immediate over delayed rewards. The brains of athletes who'd been overloaded physically also showed diminished activation of the lateral prefrontal cortex, a key region of the executive control system, as they made those economic choices.
The findings show that, while endurance sport is generally good for your health, overdoing it can have adverse effects on your brain, the researchers say.
"Our findings draw attention to the fact that neural states matter: you don't make the same decisions when your brain is in a fatigue state," Pessiglione says.
These findings may be important not just for producing the best athletes but also for economic choice theory, which typically ignores such fluctuations in the neural machinery responsible for decision-making, the researchers say. They suggest it may also be important to monitor fatigue levels in order to prevent poor decisions in the political, judicial, or economic domains.
In future studies, the researchers plan to explore why exerting control during sports training or intellectual work makes the cognitive control system harder to activate in subsequent tasks. Down the road, the hope is to find treatments or strategies that help to prevent such neural fatigue and its consequences.

Story Source:
Materials provided by Cell Press. Note: Content may be edited for style and length.

Journal Reference:
  1. Blain et al. Neuro-computational impact of physical training overload on economic decision-making. Current Biology, 2019. DOI: 10.1016/j.cub.2019.08.054

To Run Or Not To Run

Running and strength training shouldn't be mutually exclusive. Be a stronger, healthier runner with strength training.

The age-old debate about running tends to produce extremists on both sides. There’s not much unbiased information to be found on the subject, which makes most readers feel they need to “pick a side.” Many members of the strength and conditioning community dismiss running as nothing more than a way to further your injuries. Before you pick a side yourself, sit back and consider the facts.

Recognize your Starting Point

Many smart strength coaches will say, “There’s no such thing as a bad exercise” to people who proudly explain that deadlifts are bad for your back, upright rows are bad for your shoulders, or deep squats are bad for your knees. Where running is concerned, there should be no difference. The activity isn’t bad for you; it’s actually great exercise. The problem is that, because running is so accessible, many assume they have the clean slate of fitness and clean bill of health required to take it up with few to no consequences. That’s just not true. Most people who decide to get in better shape have been sedentary for some time, have 10 or 15 pounds they’d like to lose, and are no longer spring chickens. They haven’t developed proficiency in basic movement patterns, and have never worked on muscle imbalances. Left unaddressed, this is the perfect storm for long-lasting injury. One main factor should determine whether you start to run, and it can be summed up in a single, direct question:

Are you Strength Training?

If you’re not, don’t even think about running. Absorbing the impact of thousands of strides over the course of a half-hour run (for example) can damage joints and connective tissue, while reinforcing very small ranges of motion that do your mobility no favors. For this not to cost you, it’s imperative that strength training be part of your routine to counter that damage. And 5- or 10-pound weights aren’t going to cut it either. That’s not strength training.
Weight training built around compound movements like squats, deadlifts, presses, lunges, rows, and pull-ups is good for developing a foundation of strength. But there’s more. It’s also imperative that you prioritize full range of motion. You won’t be putting your joints through that range anywhere else, and running long distances will exacerbate the problem. Supplementing all of this with good mobility work will set the stage for the body to treat the repeated impact of running as a non-issue.

Do you Know How to Run?

Just like any exercise in the gym, there’s a technique involved to running. It would do you well to learn that form, ideally from in-person coaching with a professional running coach. Gait, foot strike, and other cues can play a serious role in just how safely and efficiently you move.

Keep it Short and Sweet

Given the potential for injury that running carries, it’s better to run faster, for shorter distances. I’m partial to full-fledged sprinting, but that’s not the only option available. Fast runs that last 5 to 10 minutes are far superior to moderately paced runs that last four times as long. They still do plenty for your cardiorespiratory capacity, and they tap into greater metabolic demands, which can be helpful if fat loss is your goal.

Don’t Run to Get in Shape, Get in Shape to Run

If you can’t perform compound movements through full ranges of motion, and don’t have a respectable measure of strength in your upper body, lower body, and core, then you’re doing yourself a disservice by prioritizing running in your routine until you’ve sorted the above out. It’s a hard truth, but facing it now is better than facing it later, or never at all.

Wednesday, October 9, 2019

Finding upends theory about the cerebellum's role in reading and dyslexia

New brain imaging research debunks a controversial theory about dyslexia that has shaped how the condition is sometimes treated, Georgetown University Medical Center neuroscientists say.
The cerebellum, a brain structure traditionally considered to be involved in motor function, has been implicated in the reading disability developmental dyslexia; however, this "cerebellar deficit hypothesis" has always been controversial. The new research shows that the cerebellum is not engaged during reading in typical readers and does not differ in children who have dyslexia.
That is the finding of a new study involving children with and without dyslexia published October 9, 2019, in the journal Human Brain Mapping.
It is well established that dyslexia, a common learning disability, involves a weakness in understanding the mapping of sounds in spoken words to their written counterparts, a process that requires phonological awareness. It is also well known that this kind of processing relies on brain regions in the left cortex. However, it has been argued by some that the difficulties in phonological processing that lead to impaired reading originate in the cerebellum, a structure outside (and below the back) of the cortex.
"Prior imaging research on reading in dyslexia had not found much support for this theory called the cerebellar deficit hypothesis of dyslexia, but these studies tended to focus on the cortex," says the study's first author, Sikoya Ashburn, a Georgetown PhD candidate in neuroscience. "Therefore, we tackled the question by specifically examining the cerebellum in more detail. We found no signs of cerebellar involvement during reading in skilled readers nor differences in children with reading disability."
The researchers used functional magnetic resonance imaging to look for brain activation during reading. They also tested for functional connections between the cerebellum and the cortex during reading.
"Functional connectivity occurs when two brain regions behave similarly over time; they operate in sync," says Ashburn. "However, brain regions in the cortex known to partake in the reading process were not communicating with the cerebellum in children with or without dyslexia while the brain was processing words."
The results revealed that when reading was not considered in the analysis -- that is when just examining the communications between brain regions at rest -- the cerebellum was communicating with the cortex more strongly in the children with dyslexia.
"These differences are consistent with the widely distributed neurobiological alterations that are associated with dyslexia, but not all of them are likely to be causal to the reading difficulties," Ashburn explains.
"The evidence for the cerebellar deficit theory was never particularly strong, yet people have jumped on the idea and even developed treatment approaches targeting the cerebellum," says senior author and neuroscientist Guinevere Eden, D.Phil., professor in the Department of Pediatrics at Georgetown University Medical Center and director of its Center for the Study of Learning. "Standing on a wobble board -- one exercise promoted for improving dyslexia that isn't supported by the evidence -- is not going to improve a child's reading skills. Such treatments are a waste of money and take away from other treatment approaches that entail structured intervention for reading difficulties, involving the learning of phonological and orthographic processing."
In the long run, these researchers believe the findings can be used to refine models of dyslexia and to assist parents of struggling readers to make informed decisions about which treatment programs to pursue.
More information about dyslexia can be found at the International Dyslexia Association or at Understood.org.
This work was supported in part by grants from the Eunice Kennedy Shriver National Institute of Child Health and Human Development (P50 HD040095, R01 HD081078), and the National Center for Advancing Translational Sciences of the National Institutes of Health (TL1 TR001431).

Story Source:
Materials provided by Georgetown University Medical Center. Note: Content may be edited for style and length.

Antibiotic resistance in food animals nearly tripled since 2000

The growing appetite for animal protein in developing countries has resulted in a smorgasbord of antibiotic consumption for livestock that has nearly tripled the occurrence of antibiotic resistance in disease-causing bacteria easily transmitted from animals to humans, according to a recent report in the journal Science.
Researchers from ETH Zurich, the Princeton Environmental Institute (PEI), and the Free University of Brussels gathered nearly 1,000 publications and unpublished veterinary reports from around the world to create a map of antimicrobial resistance in low- to middle-income countries. They focused on the bacteria Escherichia coli, Campylobacter, Salmonella, and Staphylococcus aureus, all of which cause serious disease in animals and humans.
Between 2000 and 2018, the proportion of antibiotics showing rates of resistance above 50% in developing countries increased in chickens from 0.15 to 0.41 and in pigs from 0.13 to 0.34, the researchers reported. This means that antibiotics that could be used for treatment failed more than half the time in 40 percent of chickens and one-third of pigs raised for human consumption.
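In concrete terms, those proportions are where the "nearly tripled" headline comes from. A quick sketch of the arithmetic, using the figures above:

```python
# Proportion of antibiotics showing resistance rates above 50%,
# as reported for developing countries between 2000 and 2018
resistance = {
    "chickens": {2000: 0.15, 2018: 0.41},
    "pigs":     {2000: 0.13, 2018: 0.34},
}

for animal, years in resistance.items():
    start, end = years[2000], years[2018]
    fold = end / start  # how many times higher the 2018 proportion is
    print(f"{animal}: {start:.0%} -> {end:.0%} ({fold:.1f}x increase)")
```

Both ratios come out between 2.6x and 2.7x, consistent with the article's "nearly tripled."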
The researchers found that antibiotic resistance in livestock was most widespread in China and India, with Brazil and Kenya emerging as new hotspots. Since 2000, meat production has accelerated by more than 60% in Africa and Asia, and by 40% in South America, as countries on those continents shifted from low- to high-protein diets. More than half of the world's chickens and pigs are in Asia.
"This paper is the first to track antibiotic resistance in animals globally and it finds that resistance has gone up dramatically during the past 18 years," said co-author Ramanan Laxminarayan, a senior research scholar in PEI. The research was supported by the PEI Health Grand Challenge program and included co-author Julia Song, a graduate of Princeton's Class of 2018 and a past PEI research assistant.
"We certainly do want higher-protein diets for many people, but if this comes at the cost of failing antibiotics, then we need to evaluate our priorities," Laxminarayan said.
Meat production accounts for 73% of global antibiotic use. Antibiotics have made large-scale husbandry and widespread meat consumption possible by reducing infection and increasing the body mass of livestock.
The skyrocketing emergence of antibiotic resistance in livestock is especially troubling in developing countries, said first author Thomas van Boeckel, an assistant professor of health geography and policy at ETH Zurich. Those nations continue to experience explosive growth in meat production and consumption, while access to veterinary antimicrobials remains largely unregulated.
The researchers suggest that developing nations should take action to restrict the use of human antibiotics in farm animals and that affluent nations should support a transition to sustainable farming, possibly through a global fund to subsidize biosafety and biosecurity improvements. Otherwise, the unrestricted use of antibiotics in even greater numbers of animals raised for human consumption could lead to the global spread of infectious bacteria that are increasingly difficult to treat.
"Antimicrobial resistance is a global problem," said Van Boeckel, who was a Fulbright Scholar at Princeton from 2013-2015. "This alarming trend shows that the drugs used in animal farming are rapidly losing their efficacy."

Story Source:
Materials provided by Princeton University. Note: Content may be edited for style and length.

Dog ownership associated with longer life

Dog ownership may be associated with longer life and better cardiovascular outcomes, especially for heart attack and stroke survivors who live alone, according to a new study and a separate meta-analysis published in Circulation: Cardiovascular Quality and Outcomes, a journal of the American Heart Association.
"The findings in these two well-done studies and analyses build upon prior studies and the conclusions of the 2013 AHA Scientific Statement 'Pet Ownership and Cardiovascular Risk' that dog ownership is associated with reductions in factors that contribute to cardiac risk and to cardiovascular events," said Glenn N. Levine, M.D., chair of the writing group of the American Heart Association's scientific statement on pet ownership. "Further, these two studies provide good, quality data indicating dog ownership is associated with reduced cardiac and all-cause mortality. While these non-randomized studies cannot 'prove' that adopting or owning a dog directly leads to reduced mortality, these robust findings are certainly at least suggestive of this."
Given previous research demonstrating how social isolation and lack of physical activity can negatively impact patients, researchers in both the study and meta-analysis sought to determine how dog ownership affected health outcomes. Prior studies have shown that dog ownership alleviates social isolation, improves physical activity and even lowers blood pressure -- leading researchers to believe dog owners could potentially have better cardiovascular outcomes compared to non-owners.
Dog ownership and survival after a major cardiovascular event
Researchers in this study compared the health outcomes of dog owners and non-owners after a heart attack or stroke using health data provided by the Swedish National Patient Register. Patients studied were Swedish residents ages 40-85 who experienced heart attack or ischemic stroke from 2001-2012.
Compared to people who did not own a dog, researchers found that for dog owners:
  • The risk of death for heart attack patients living alone after hospitalization was 33% lower, and 15% lower for those living with a partner or child.
  • The risk of death for stroke patients living alone after hospitalization was 27% lower, and 12% lower for those living with a partner or child.
In the study, nearly 182,000 people were recorded to have had a heart attack, with almost 6% being dog owners, and nearly 155,000 people were recorded to have had an ischemic stroke, with almost 5% being dog owners. Dog ownership was confirmed by data from the Swedish Board of Agriculture (registration of dog ownership has been mandatory since 2001) and the Swedish Kennel Club (all pedigree dogs have been registered since 1889).
The lower risk of death associated with dog ownership could be explained by an increase in physical activity and the decreased depression and loneliness, both of which have been connected to dog ownership in previous studies.
"We know that social isolation is a strong risk factor for worse health outcomes and premature death. Previous studies have indicated that dog owners experience less social isolation and have more interaction with other people," said Tove Fall, V.M.D., professor at Uppsala University in Sweden. "Furthermore, keeping a dog is a good motivation for physical activity, which is an important factor in rehabilitation and mental health."
While this study draws from a large sample, potential misclassifications of dog ownership in couples living together, death of a dog and change of ownership could have affected the outcomes of the study.
"The results of this study suggest positive effects of dog ownership for patients who have experienced a heart attack or stroke. However, more research is needed to confirm a causal relationship before recommendations can be made about prescribing dogs for prevention. Moreover, from an animal welfare perspective, dogs should only be acquired by people who feel they have the capacity and knowledge to give the pet a good life."
Co-authors of the study are Mwenya Mubanga, M.D., M.P.H.; Liisa Byberg, Ph.D.; Agneta Egenvall, V.M.D., Ph.D.; Erik Ingelsson, M.D., Ph.D.; and Tove Fall, V.M.D., Ph.D. The study was funded by the Agria Research Foundation and the Swedish Research Council for Environment, Agricultural Sciences and Spatial Planning (FORMAS), grant number 2013-1673.
Dog Ownership and Survival: A Systematic Review and Meta-Analysis
Researchers reviewed patient data of over 3.8 million people taken from 10 separate studies for a composite meta-analysis study. Of the 10 studies reviewed, nine included comparison of all-cause mortality outcomes for dog owners and non-owners, and four compared cardiovascular outcomes for dog owners and non-owners.
Researchers found that compared to non-owners, dog owners experienced a:
  • 24% reduced risk of all-cause mortality;
  • 65% reduced risk of mortality after heart attack; and
  • 31% reduced risk of mortality due to cardiovascular-related issues.
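Reductions like these are conventionally reported as one minus a pooled risk (or hazard) ratio. A minimal sketch of that conversion follows; the ratios shown are back-calculated from the percentages above for illustration, not quoted from the paper:

```python
def risk_reduction(risk_ratio: float) -> float:
    """Percent risk reduction implied by a risk (or hazard) ratio.
    A ratio of 1.0 means no difference; values below 1.0 favor dog owners."""
    return 100 * (1 - risk_ratio)

# Ratios consistent with the reductions reported above (illustrative)
for outcome, rr in [("all-cause mortality", 0.76),
                    ("mortality after heart attack", 0.35),
                    ("cardiovascular mortality", 0.69)]:
    print(f"{outcome}: {risk_reduction(rr):.0f}% reduced risk")
```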
"Having a dog was associated with increased physical exercise, lower blood pressure levels and a better cholesterol profile in previous reports," said Caroline Kramer, M.D., Ph.D., assistant professor of medicine at the University of Toronto and an endocrinologist and clinician-scientist at the Leadership Sinai Centre for Diabetes at Mount Sinai Hospital, part of Sinai Health System. "As such, the findings that people who owned dogs lived longer and had a lower risk of cardiovascular death are somewhat expected."
Studies deemed eligible for analysis were those conducted among adults age 18 or older that used original data from a prospective study, evaluated dog ownership at the beginning of the study, and reported all-cause or cardiovascular mortality. Studies were excluded if they were retrospective, did not provide an absolute number of events, or reported only non-fatal cardiovascular events.
"Our findings suggest that having a dog is associated with longer life. Our analyses did not account for confounders such as better fitness or an overall healthier lifestyle that could be associated with dog ownership. The results, however, were very positive," said Dr. Kramer. "The next step on this topic would be an interventional study to evaluate cardiovascular outcomes after adopting a dog and the social and psychological benefits of dog ownership. As a dog owner myself, I can say that adopting Romeo (the author's miniature Schnauzer) has increased my steps and physical activity each day, and he has filled my daily routine with joy and unconditional love."

Story Source:
Materials provided by American Heart Association. Note: Content may be edited for style and length.

Journal Reference:
  1. Mwenya Mubanga, Liisa Byberg, Agneta Egenvall, Erik Ingelsson, Tove Fall. Dog Ownership and Survival After a Major Cardiovascular Event. Circulation: Cardiovascular Quality and Outcomes, 2019; 12 (10). DOI: 10.1161/CIRCOUTCOMES.118.005342

Thursday, August 15, 2019

How and why resistance training is imperative for older adults

For many older adults, resistance training may not be part of their daily routine, but a new position statement suggests it is vital to improving their health and longevity.
"When you poll people on if they want to live to 100 years old, few will respond with a 'yes'," says Maren Fragala, Ph.D., director of scientific affairs at Quest Diagnostics and lead author of the position statement.
"The reason mainly being that many people associate advanced age with physical and cognitive decline, loss of independence and poor quality of life," adds Mark Peterson, Ph.D., M.S., FACSM, an associate professor of physical medicine and rehabilitation at Michigan Medicine and one of the senior authors of the statement.
The position statement, published in the Journal of Strength and Conditioning Research, and supported by the National Strength and Conditioning Association, highlights the benefits of strength and resistance training in older adults for healthier aging.
Fragala explains that while aging does take a toll on the body, the statement provides evidence-based recommendations for successful resistance training programs -- exercise focused on building muscle strength -- for older adults.
"Aging, even in the absence of chronic disease, is associated with a variety of biological changes that can contribute to decreases in skeletal muscle mass, strength and function," Fragala says. "Such losses decrease physiologic resilience and increase vulnerability to catastrophic events."
She adds, "The exciting part about this position statement is that it provides evidence-based recommendations for resistance training in older adults to promote health and functional benefits, while preventing and minimizing fears."
Practical applications
The position statement provides 11 practical applications divided into four main components: program design variables, physiological adaptations, functional benefits, and considerations for frailty, sarcopenia and other chronic conditions.
The applications include suggestions on training types and amounts of repetitions and intensities, patient groups that will need adaptations in training models, and how training programs can be adapted for older adults with disabilities or those residing in assisted living and skilled nursing facilities.
"Current research has demonstrated that resistance training is a powerful care model to combat loss of muscle strength and mass in the aging population," says Peterson, a member of the University of Michigan Institute for Healthcare Policy & Innovation and Michigan Center on the Demography of Aging.
"We demonstrate in this position statement just how much resistance training can positively affect physical functioning, mobility, independence, chronic disease management, psychological wellbeing, quality of life and healthy life expectancy. We also provide recommendations for how to optimize resistance training programs to ensure safety and effectiveness."
Fragala adds that the benefits of participating in resistance training as an older adult outweigh the risks.
"The coauthors of this paper and the hundreds of other prolific researchers whose work we synthesized in this position statement have found that in most cases, the vast benefits of resistance training largely outweigh the risks when training is properly implemented," Fragala says.
Empowering healthy aging
The authors are proud to have the support of the National Strength and Conditioning Association for the statement.
"Too few older Americans participate in resistance training, largely because of fear, confusion and a lack of consensus to guide implementation," Peterson says. "By having this consensus statement supported by the National Strength and Conditioning Association, we hope it will have a positive impact on empowering healthier aging."
The full list of authors for the position statement includes: Maren Fragala, Ph.D., Quest Diagnostics; Eduardo Cadore, Ph.D., Federal University of Rio Grande do Sul; Sandor Dorgo, Ph.D., University of Texas at El Paso; Mikel Izquierdo, Ph.D., Public University of Navarre; William Kraemer, Ph.D., The Ohio State University; Mark Peterson, Ph.D., M.S., CSCS*D, FACSM, University of Michigan, Michigan Medicine; Eric Ryan, Ph.D., University of North Carolina -- Chapel Hill.
Story Source:
Materials provided by Michigan Medicine - University of Michigan. Original written by Kylie Urban. Note: Content may be edited for style and length.

Journal Reference:
  1. Maren S. Fragala, Eduardo L. Cadore, Sandor Dorgo, Mikel Izquierdo, William J. Kraemer, Mark D. Peterson, Eric D. Ryan. Resistance Training for Older Adults. Journal of Strength and Conditioning Research, 2019; 33 (8): 2019 DOI: 10.1519/JSC.0000000000003230

Scientists reverse aging process in rat brain stem cells

New research, published today in Nature, reveals how increasing brain stiffness as we age causes brain stem cell dysfunction, and demonstrates new ways to reverse older stem cells to a younger, healthier state.
The results have far-reaching implications for how we understand the ageing process, and how we might develop much-needed treatments for age-related brain diseases.
As our bodies age, muscles and joints can become stiff, making everyday movements more difficult. This study shows the same is true in our brains, and that age-related brain stiffening has a significant impact on the function of brain stem cells.
A multi-disciplinary research team, based at the Wellcome-MRC Cambridge Stem Cell Institute (University of Cambridge), studied young and old rat brains to understand the impact of age-related brain stiffening on the function of oligodendrocyte progenitor cells (OPCs).
These cells are a type of brain stem cell important for maintaining normal brain function, and for the regeneration of myelin -- the fatty sheath that surrounds our nerves, which is damaged in multiple sclerosis (MS). The effects of age on these cells contribute to MS, but their function also declines with age in healthy people.
To determine whether the loss of function in aged OPCs was reversible, the researchers transplanted older OPCs from aged rats into the soft, spongy brains of younger animals. Remarkably, the older brain cells were rejuvenated, and began to behave like the younger, more vigorous cells.
To study this further, the researchers developed new materials in the lab with varying degrees of stiffness, and used these to grow and study the rat brain stem cells in a controlled environment. The materials were engineered to have a similar softness to either young or old brains.
To fully understand how brain softness and stiffness influences cell behavior, the researchers investigated Piezo1 -- a protein found on the cell surface, which informs the cell whether the surrounding environment is soft or stiff.
Dr Kevin Chalut, who co-led the research, said: "We were fascinated to see that when we grew young, functioning rat brain stem cells on the stiff material, the cells became dysfunctional and lost their ability to regenerate, and in fact began to function like aged cells. What was especially interesting, however, was that when the old brain cells were grown on the soft material, they began to function like young cells -- in other words, they were rejuvenated."
"When we removed Piezo1 from the surface of aged brain stem cells, we were able to trick the cells into perceiving a soft surrounding environment, even when they were growing on the stiff material," explained Professor Robin Franklin, who co-led the research with Dr Chalut. "What's more, we were able to delete Piezo1 in the OPCs within the aged rat brains, which lead to the cells becoming rejuvenated and once again able to assume their normal regenerative function."
Dr Susan Kohlhaas, Director of Research at the MS Society, who part funded the research, said: "MS is relentless, painful, and disabling, and treatments that can slow and prevent the accumulation of disability over time are desperately needed. The Cambridge team's discoveries on how brain stem cells age and how this process might be reversed have important implications for future treatment, because it gives us a new target to address issues associated with aging and MS, including how to potentially regain lost function in the brain."
This research was supported by the European Research Council, MS Society, Biotechnology and Biological Sciences Research Council, The Adelson Medical Research Foundation, Medical Research Council and Wellcome.
Story Source:
Materials provided by University of Cambridge. The original story is licensed under a Creative Commons License. Note: Content may be edited for style and length.

Journal Reference:
  1. Michael Segel, Björn Neumann, Myfanwy F. E. Hill, Isabell P. Weber, Carlo Viscomi, Chao Zhao, Adam Young, Chibeza C. Agley, Amelia J. Thompson, Ginez A. Gonzalez, Amar Sharma, Steffan Holmqvist, David H. Rowitch, Kristian Franze, Robin J. M. Franklin, Kevin J. Chalut. Niche stiffness underlies the ageing of central nervous system progenitor cells. Nature, 2019; DOI: 10.1038/s41586-019-1484-9

Good heart health at age 50 linked to lower dementia risk later in life

Good cardiovascular health at age 50 is associated with a lower risk of dementia later in life, finds a study of British adults published by The BMJ today.
The researchers say their findings support public health policies to improve cardiovascular health in middle age to promote later brain health.
Dementia is a progressive disease that can start to develop 15-20 years before any symptoms appear, so identifying factors that might prevent its onset is important.
The American Heart Association's "Life's Simple 7" cardiovascular health score, initially designed for cardiovascular disease, has been put forward as a potential tool for preventing dementia.
Designed for "primordial" prevention, where the aim is to prevent the development of risk factors themselves in order to affect risk of disease, it is the sum of four behavioural (smoking, diet, physical activity, body mass index) and three biological (fasting glucose, blood cholesterol, blood pressure) metrics, categorised into poor (scores 0-6), intermediate (7-11), and optimal (12-14) cardiovascular health.
But the evidence remains inconsistent. To address this uncertainty, an international research team led by Séverine Sabia from the French National Institute of Health and Medical Research and University College London examined the association between the Life's Simple 7 cardiovascular health score at age 50 and the risk of dementia over the next 25 years.
Their findings are based on cardiovascular data collected from 7,899 British men and women at age 50 in the Whitehall II Study, which is looking at the impact of social, behavioural, and biological factors on long term health.
Participants were free of cardiovascular disease and dementia at age 50. Dementia cases were identified using hospital, mental health services, and death registers until 2017.
Of the 7,899 participants, 347 cases of dementia were recorded over an average follow-up period of 25 years. Average age at dementia diagnosis was 75 years.
After taking account of potentially influential factors, the researchers found that adherence to the Life's Simple 7 cardiovascular health recommendations in midlife was associated with a lower risk of dementia later in life.
Compared with an incidence rate of dementia of 3.2 per 1000 person years among the group with a poor cardiovascular score, those with an intermediate score had an incidence of 1.8 per 1000 person years, while those with an optimal score had an incidence of 1.3 per 1000 person years.
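For readers unfamiliar with the unit, an incidence rate per 1000 person-years is simply the number of new cases divided by the total years of follow-up contributed by all participants, scaled by 1000. A minimal sketch, using hypothetical counts rather than the study's actual figures:

```python
def incidence_per_1000_person_years(cases, person_years):
    """Incidence rate expressed per 1000 person-years of follow-up."""
    return 1000 * cases / person_years

# Hypothetical group: 160 dementia cases observed over 50,000 person-years
print(incidence_per_1000_person_years(160, 50_000))  # 3.2
```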
This is an observational study, so can't establish cause, and the researchers point to some limitations, for example relying on self-reported measures and potentially missing cases of dementia in patient records.
However, higher cardiovascular health score at age 50 was also associated with higher whole brain and grey matter volumes in MRI scans 20 years later. And reductions in dementia risk were also evident across the continuum of the cardiovascular score, suggesting that even small improvements in cardiovascular risk factors at age 50 may reduce dementia risk in old age, say the researchers.
"Our findings suggest that the Life's Simple 7, which comprises the cardiovascular health score, at age 50 may shape the risk of dementia in a synergistic manner," they write. "Cardiovascular risk factors are modifiable, making them strategically important prevention targets. This study supports public health policies to improve cardiovascular health as early as age 50 to promote cognitive health," they conclude.
Researchers in a linked editorial agree that the study provides further support for the UK Government's recent policy focus on vascular health in midlife. "However, other evidence makes clear that vascular health at 50 is determined by factors earlier in the life course, including inequality and social and economic determinants," they say.
"Reducing the risk of dementia is a leading concern in aging societies. We know that risk can change across generations, and in the UK the prevalence of dementia has decreased by nearly 25% when standardised for age," they add.
They conclude: "Although the Whitehall study cannot reflect the UK's population, estimates obtained from this cohort reinforce the need for action to shift population risk profiles for cognitive decline and dementia across the life course."
Story Source:
Materials provided by BMJ. Note: Content may be edited for style and length.

Journal Reference:
  1. Séverine Sabia, Aurore Fayosse, Julien Dumurgier, Alexis Schnitzler, Jean-Philippe Empana, Klaus P Ebmeier, Aline Dugravot, Mika Kivimäki, Archana Singh-Manoux. Association of ideal cardiovascular health at age 50 with incidence of dementia: 25 year follow-up of Whitehall II cohort study. BMJ, 2019; l4414 DOI: 10.1136/bmj.l4414