

The Top Ten Brain Science And Psychology Studies Of 2015


The pace of research seems to accelerate every year, and 2015 saw its share of major studies across several categories of brain science and psychology. This Top 10 list isn’t meant to be exhaustive (and it isn’t ranked in any particular order); rather, it’s a survey of the top research covered here at Neuropsyched, along with a few additional studies that had an impact. A number of these studies also serve as a prelude to research we’ll see in the coming year.

1. Brain Powers Change as You Age

Science bolstered the ever-changing-brain theory in 2015 by showing that mental abilities don’t all collectively peak and begin rolling downhill at any one age, or even during one or two decades. Instead, they fluctuate across a span of ages, with a couple peaking well into our elder years. The findings came from a study of more than 50,000 people ranging in age from their teens to their 70s. Mental abilities like brain processing speed peaked early on, around age 18, while vocabulary skills continued developing into the 60s and 70s. Remembering things we see (visual working memory) peaks around age 25, while short-term memory doesn’t take full form until around 35. One of the most interesting mental abilities the study tracked has to do with our ability to read other people – how well we identify which emotions are percolating or absent in the person across the table. The researchers found that this ability doesn’t take shape until we’re in our 40s, and it continues maturing for a couple of decades, well into our 60s.

Study co-author Laura Germine put it this way: “The brain seems to continue to change in dynamic ways through early adulthood and middle age,” adding that the study “paints a different picture of the way we change over the lifespan than psychology and neuroscience have traditionally painted.”

2. Alzheimer’s Clues Appear Much Earlier than We Thought

A handful of studies in 2015 hinted at early clues to the development of Alzheimer’s disease. One study published in the journal Neurology showed that late middle-age memory failures can predict Alzheimer’s as much as 18 years before diagnosis. Participants were given tests of mental ability and memory every three years for 18 years. Those who scored lowest on the memory and thinking tests during the first year of the study were 10 times more likely to develop the disease.

Another study published in the journal Science used fMRI to identify early signs of the disease appearing in the brain’s internal GPS system, within a region called the entorhinal cortex that plays a major role in memory and navigation. Look for more studies to build on these findings in 2016.

3. Missing Link Between the Brain and Immune System Identified

How the brain rids itself of toxins has been a point of debate for some time. The prevailing theory was that the brain doesn’t use the body’s lymphatic system, but rather has its own garbage removal system that appears to come online when we sleep. A study conducted by University of Virginia researchers in 2015 found that the brain does, in fact, connect to the body’s lymphatic system, via a previously unidentified network of lymphatic vessels in the meninges (the membranes surrounding the brain and spinal cord). The study was conducted in mice, but the same vessels were also identified in human samples. Abnormalities in these vessels may play a role in neurological diseases like multiple sclerosis and schizophrenia. If that possibility is confirmed, the network of vessels could become an early treatment target for these and other diseases. Look for additional studies to build on this one in the next year.

4. Loneliness is Destructive to the Mind and Body

Two studies came out in 2015 showing a convincing link between loneliness and both mental and physical debilitation. One of the studies focused on the effects of loneliness on 8,300 adults age 65 and older who participated in the U.S. Health and Retirement Study from 1998 to 2010. Participants were assessed every two years across a range of factors, including levels of depression, loneliness, memory, cognitive function and social network status. About 1,400 of the participants (17%) reported loneliness at the start of the study, and roughly half of that group also reported clinically significant depression. Over the 12-year study, participants reporting loneliness experienced 20% faster cognitive decline than other participants. This result held regardless of factors like demographics, socioeconomic status and the presence of other debilitating health conditions. Higher levels of depression also correlated significantly with more rapid cognitive decline.

In another study funded by the National Institutes of Health, loneliness (defined as “perceived social isolation”) was linked to a 14% increase in premature death among older adults. More studies along these lines are set to publish in 2016.

5. Popular Over-the-Counter Drugs Linked to Increased Risk of Dementia

Researchers published a bombshell study in JAMA Internal Medicine in 2015 showing that four common medications are linked to a significantly increased risk of developing dementia in older adults. The study followed 3,434 people over the age of 65 for seven years. None of the participants showed signs of dementia at the start of the study period. During the seven years, almost 800 of the participants developed dementia (637 developed Alzheimer’s disease; the rest were afflicted with other forms of dementia). After controlling for a range of other factors, the researchers were able to link heightened risk of dementia to a daily dose of four medications: diphenhydramine (the active ingredient in many over-the-counter antihistamines); chlorpheniramine (another popular over-the-counter antihistamine); oxybutynin (a prescription medication for bladder conditions); and doxepin (an older prescription antidepressant from the class of meds called tricyclics).

All of the drugs in question are anticholinergics – meaning they block a neurotransmitter called acetylcholine in the nervous system. Common side effects of taking anticholinergics include drowsiness, blurred vision and memory loss. People suffering from Alzheimer’s disease typically have low brain levels of acetylcholine, and previous research has shown a link between taking anticholinergic drugs and increased risk of dementia in older adults. While the study didn’t prove a cause-and-effect relationship, the correlation was strong and of particular concern for older adults.

6. Middle-Age Americans Are Dying and We Don’t Know Why

While not strictly speaking a brain science or psych study, research by two economists uncovered an alarming finding with a distinctly psychological dimension. Economists Anne Case and Angus Deaton reported “a marked increase in the all-cause mortality of middle-aged white non-Hispanic men and women in the United States between 1999 and 2013. This change reversed decades of progress in mortality and was unique to the United States; no other rich country saw a similar turnaround.” The researchers focused specifically on mortality rates for 45-to-54 year olds.

The impact of this study has nothing to do with firm conclusions, because the study itself doesn’t point to specific causes for the trend. And some statisticians are still wrestling with the data to determine exactly what it tells us beyond surface-level speculation. But study co-author Angus Deaton, who was also the 2015 Nobel laureate in economics, thinks he knows at least part of what the data is telling us: “Drugs and alcohol, and suicide . . . are clearly the proximate cause.” Why those factors are increasing among this specific group is the question. More to come on this in the coming year, no doubt.

7. The More Time You Spend On Facebook, The More Likely You’ll Be Depressed

One of the biggest ironies of our time is that social media – the technology that promised to connect us to the world – may be a significant factor in elevating rates of loneliness and depression. A 2015 study published in the Journal of Social and Clinical Psychology added to the chorus, but also helped clarify the issue by pinpointing the lynchpin between social media use and depression: social comparison. The researchers think that the social comparisons we make between ourselves and all of our online “friends” showing off the very best parts of their lives are the heart of the matter.

The study found that people who spent the most time on Facebook, men and women alike, consistently showed more depressive symptoms, and social comparison with peers surfaced as the main reason why. Another way to frame the findings: the personal public relations jobs people do on Facebook are having an impact, and not the sort I think we were hoping for in the early days of the technology. Maybe in 2016 we can start ignoring more online PR and take back some of the emotional ground we’ve yielded to social media.

8. We’re Getting Closer to Blood Testing for Mental Health Disorders

While still quite preliminary, a breakthrough study in 2015 showed that biomarkers for suicidal tendencies can be identified in blood tests. Researchers from Indiana University developed a questionnaire and a blood test that together predicted with 92 percent accuracy who among a group of 108 men would develop suicidal thoughts. Considering that only about 2 percent of people suffering from depression commit suicide (and depression is the leading cause of suicide), having a method that can detect who’s most likely to go there would be immensely useful to mental health professionals. On a broader scale, tests like these may also eventually show tendencies for developing depression and other psychiatric disorders, which would put a much finer point on identifying the best treatment options earlier on. Having said that, this area of research is controversial and in its infancy, so much more to come on all of the above – but the beginnings of something potentially quite big have emerged.

9. Diet Can Influence Your Chances of Developing Depression

Each year more research comes out linking inflammation at the cellular level to a host of badness, including heart disease, diabetes, some forms of cancer, and, more recently, depression. We’ve also found that inflammation is strongly linked to lifestyle factors, with diet high among them. While the connective details are still not entirely clear, research from 2015 indicates that changing your diet to something closer to the Mediterranean Diet (which has known anti-inflammatory effects) can lower your risk of depression. The study suggests that even moderately following the diet can cut the risk, presumably by lowering cellular inflammation. With inflammation research exploding, we’ll hear more about this and other linkages soon.

10. We Can Stop Wasting Time Talking About Birth Order

I always like to include one study in these lists that kicks a pop psych myth squarely in its tookis. This year that honor goes to research challenging the long-held belief that birth order has a major effect on the relative personality and intelligence of siblings. Researchers studied 377,000 high school students to find out how much birth order affected their personality development and intelligence. They found that firstborns do have slightly higher IQs than their later-born siblings, but only one point higher – a statistically significant but practically meaningless difference. Firstborns also tend to score higher on certain personality traits like extroversion, agreeableness and conscientiousness, but the differences between their scores and those of later-borns are, according to the researchers, “infinitesimally small.” Overall, the correlation between birth order and personality was just .02, well below the level of perception.

According to study co-author Brent Roberts, professor of psychology at the University of Illinois, “In some cases, if a drug saves 10 out of 10,000 lives, for example, small effects can be profound. But in terms of personality traits and how you rate them, a .02 correlation doesn’t get you anything of note. You are not going to be able to see it with the naked eye. You’re not going to be able to sit two people down next to each other and see the differences between them. It’s not noticeable by anybody.”

You can find David DiSalvo on Twitter, Facebook, Google Plus, and at his website daviddisalvo.org.

Also on Forbes:

The Top Ten Brain Science and Psychology Studies of 2013

The Top Ten Brain Science and Psychology Studies of 2012