We’ve probably all experienced the disconnect between our actual age and the age we appear to other people; you’ve likely noticed that someone can look younger or older than they really are. That’s because everyone ages in different ways, and at different rates, and what we see on the surface doesn’t always match what’s happening underneath.
The same goes for the brain, which ages differently for different people. What makes the brain particularly interesting, however, is that it can be an even more accurate predictor of our “true age” than our biological age. Brain age has also been identified as a potential biomarker for predicting the onset of certain neurodegenerative conditions, such as Alzheimer’s.
The challenge? Identifying brain age and the particular structural changes in the brain that allow researchers to accurately predict both cognitive decline and, eventually, Alzheimer’s. A team of researchers at the University of Southern California has developed an artificial intelligence technique that could help overcome this problem. Their work is published in the Proceedings of the National Academy of Sciences.
To create the AI tool, termed a “neural network,” the researchers collected brain MRI scans from over 4,000 people, all of whom started with normal cognition, though some went on to develop conditions like Alzheimer’s. The team used these scans to train the tool to generate images that reveal different aspects of aging in the brain. Comparing participants’ biological age with their brain age gave the team a glimpse into each person’s risk of cognitive decline: the greater the gap, the greater the risk.
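To illustrate the idea behind that comparison, here is a minimal sketch of how a “brain age gap” might be computed and read. The function name and the example numbers are hypothetical stand-ins for illustration; the study’s actual brain age estimates come from a deep neural network trained on thousands of MRI scans, not from this code.

```python
# Illustrative sketch only: predicted_brain_age would come from a trained
# brain-age model; the values below are hypothetical, not from the study.

def brain_age_gap(predicted_brain_age: float, chronological_age: float) -> float:
    """Return the gap between a model's predicted brain age and a person's actual age.

    A positive gap (a brain that looks "older" than the person) is the kind of
    signal the researchers associated with a higher risk of cognitive decline.
    """
    return predicted_brain_age - chronological_age


# Example: a 70-year-old whose scan resembles a typical 75-year-old brain
gap = brain_age_gap(predicted_brain_age=75.0, chronological_age=70.0)
print(f"Brain age gap: {gap:+.1f} years")  # +5.0 years -> elevated risk signal
```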
The team also noted that biological sex made a difference in how fast people’s brains age, with parts of the male brain aging faster than the corresponding regions in female brains.
Overall, the AI tool was able to predict a person’s actual age and was about a year more accurate than many existing approaches. Ultimately, this lets researchers detect very subtle features of a person’s brain that indicate aging and predict, with a higher degree of accuracy, a person’s risk of developing conditions like Alzheimer’s. That could allow for more personalized and timely interventions to minimize the risks associated with cognitive decline.
Sources: Eurekalert!; Molecular Psychiatry; National Academy of Sciences