Can an AI-Powered Eye Exam Spot Alzheimer’s Decades Early?
Retinal imaging might become a fast, noninvasive tool for earlier Alzheimer’s detection in everyday care settings, explains RetiSpec Chief Business Officer Catherine Bornbaum.
Could AI-driven retinal imaging flag Alzheimer’s decades before symptoms start? That question is central to RetiSpec’s mission. The health technology company is developing AI-powered retinal imaging to detect Alzheimer’s-related biomarkers — like amyloid — by looking at the back of the eye.
Catherine Bornbaum, PhD, RetiSpec’s chief business officer, says the aim is to make biomarker testing simpler, more scalable, and easier for people to access.
In this conversation with Being Patient’s founder Deborah Kan, Bornbaum explains how decades of research show that changes in the back of the eye can mirror the structural, functional, and molecular changes of Alzheimer’s in the brain, making the retina a powerful, noninvasive window for early detection. Bornbaum also describes how AI-driven, real-time analysis of eye images using cameras already in optometrists’ offices could make eye exams a practical access point for noninvasive Alzheimer’s biomarker testing.
Being Patient: We’ve known for quite a while that you can detect biomarkers like amyloid through the eyes. Is that correct?
Catherine Bornbaum: Yes, that’s correct.
Being Patient: When was it first discovered? And tell me a little bit about how it was discovered.
Bornbaum: Yeah, so there’s an extensive body of work on detecting amyloid in the eye. It’s been investigated for more than 20 years, and we have investigators from all over the world coming to similar conclusions. So this tells us that there’s really something there. A number of folks have gone about it in different ways — using different tracers, tissue studies, animal studies, and, obviously, clinical studies in humans.
We have a wide body of evidence telling us that pathologic, structural, and functional changes happen in the eyes of individuals who have Alzheimer’s disease, and that these changes are distinct from what we see in the eyes of individuals who do not have the disease.
Amyloid beta is one of the best-known markers and the most advanced from a technological perspective, but there are certainly other markers that are very promising, and we hope to see those in the future as well.
Being Patient: In the past, before I knew much about Alzheimer’s detection, I had heard about cases where you can detect a brain tumor through a retinal scan. Is this technology much the same, in terms of seeing brain tumors through the eyes?
Bornbaum: That is a really great question, and it touches on a really interesting point: sometimes your eye doctor can detect things happening in your brain that you wouldn’t necessarily anticipate. Generally, when somebody is detecting signs of a brain tumor through the eye, they’re looking for signs of swelling in the back of the eye — a condition called papilledema. Essentially, it’s pressure in the head manifesting physically in the eye.
In Alzheimer’s disease, we can see structural changes in the eye that serve as a proxy. Certain layers of the back of the eye change: you have a thinning of the retinal nerve fiber layer, and you have changes to the ganglion cell layers.
We also see changes in proteins and inflammation and the vascular structures that mirror what’s happening in the brain. This is really because the retina, which is the back of the eye, is embryologically and biologically brain tissue. So it shares those characteristics, and as you continue to grow and age and evolve, it still reflects and serves as a very good proxy for what’s happening in the brain.
Being Patient: When we’re talking about amyloid plaque in the brain, can you see that earlier, like in a pre-symptomatic stage? We’re now talking about blood tests, and we know that in some cases, before a patient ever experiences symptoms, you can detect amyloid proteins in the bloodstream. Is it the same with a scan? And if so, how early can you detect those abnormalities?
Bornbaum: The answer is yes, we can detect those signs early, in the preclinical or pre-symptomatic stage. We know this from a number of studies, most recently RetiSpec’s participation in the Bio-Hermes 1 study, which was sponsored by the Global Alzheimer’s Platform.
In this study, we scanned individuals who were pre-symptomatic or healthy controls, and we also followed and scanned folks who had mild cognitive impairment and later-stage disease. What we found is that for individuals who were deemed preclinical or pre-symptomatic but had elevated amyloid on PET, we could detect that change. RetiSpec found these folks to be positive for amyloid even though they had no symptoms, and their amyloid PET scans confirmed it.
We’ve seen this in multiple studies, which tells us that the eye is a really, really important tool for early detection. It can give you a really important signal of what’s going on in your brain long before you start to have any symptoms and is accessible, non-invasive, and super simple to do. I’m really excited about that particular use case.
“This is really because the retina, which is the back of the eye, is embryologically and biologically brain tissue… as you continue to grow and age and evolve, it still reflects and serves as a very good proxy for what’s happening in the brain.”
Being Patient: How easy are these scans to administer? Is it immediate? Is it like the brain tumor scenario, where the optician looks into the eyes, sees abnormalities, and knows right away? Or does it take a certain amount of lab processing and analytics?
Bornbaum: It is very real time — instant results. The experience itself is just like going to the eye doctor. We use the same cameras that are already in place in eye doctors’ offices, so it’s a very simple, familiar experience. You just have pictures of your eyes taken, the images are analyzed by the RetiSpec software, and a report is generated in real time. You click a button and get the report, and it will tell you your likely amyloid PET status in real time.
One important thing to note is that you really do need the deep learning model or the AI in order to detect that signal because it’s not something that a human looking into your eye will be able to see. We’re essentially doing spectroscopy non-invasively in the eye, and you really do need the camera and the software in order to have this information. But once you have that, it is real time — no waiting. It’s incredible.
Being Patient: Can you see the tau? Can you see the inflammation? We know the presumed pathology of Alzheimer’s: beta amyloid, tau, inflammation. And usually it’s not until a person is in the inflammatory state that they actually see symptoms of Alzheimer’s. So I’m curious about what exactly you can see through the eyes.
Bornbaum: Based on the body of evidence that exists from incredible researchers around the world, we believe that there are a number of biomarkers or things we’ll be able to detect in the future, including inflammatory markers, vascular signs, some of which we can already see, tau certainly, NfL (neurofilament light chain)… I think the evidence from tissue studies and early clinical work suggests that this is likely possible.
What I’m really excited about is that there are a number of studies underway — some of which we’ve completed — where we’re able to capture a wide variety of biomarkers all at once: things like the Bio-Hermes 2 study, where you have over 1,000 participants from representative samples across various stages of disease being scanned. AI models thrive on really good, clean, robust data, and studies like this build that strong evidence base.
This gives us a really good starting point for that. Our goal is to have this technology be sort of a simple eye scan where you can have tons of information about your brain health and neurodegenerative processes in a very simple, real-time experience.
Being Patient: What do we know about this technology in terms of how accurate and how early it can really detect the pathology of Alzheimer’s?
Bornbaum: In our studies, we’ve seen that the retinal signals that correspond with amyloid absolutely can be detected in folks who are cognitively normal. We have some longitudinal studies underway, but they’re not yet at the 10-plus or 20-plus year mark, so we have to rely on other research that’s been done in the field. That body of research tells us that you can detect these signals anywhere from 10 to 17, maybe 20, years in advance of symptoms.
It provides a really important opportunity and window for early detection — and potentially for taking part in clinical trials focused on prevention, or getting access to therapies should they become available in the future.
Being Patient: Now you’re comparing RetiSpec’s accuracy to other testing methods — CSF (spinal taps), PET scans, and now blood tests. You’re running comparative analyses, so can you tell us what those comparisons show and what data you have so far?
Bornbaum: Our goal up front is to align as closely as possible with the accuracy of gold-standard tools like PET or lumbar puncture (a spinal tap, which collects cerebrospinal fluid), while also being much more scalable, accessible, and completely non-invasive.
You’re right, PET scans are excellent. They are the gold standard, but they’re expensive [and] hard to get access to. Blood tests are really promising, but they’re still in the early stages, and they still have some complexity with their logistics, cold storage, all of that. But the retina really offers an incredibly accessible way to detect signs of Alzheimer’s disease and particularly amyloid in everyday care.
“Our goal is to have this technology be sort of a simple eye scan where you can have tons of information about your brain health and neurodegenerative processes in a very simple, real-time experience.”
Being Patient: So you mentioned the Bio-Hermes study. What was the comparison there? Do you have the results? And what’s the percentage of accuracy compared to the other methods?
Bornbaum: Yeah, so in the Bio-Hermes 1 study, we ran a sub-study of 271 participants within that parent study. In that sub-study, we compared against amyloid PET as the gold standard; we also had blood-based biomarkers in the study.
In our target population, we had an area under the curve — a technical measure of agreement — of about 83 percent. For us, this was a good milestone. It’s really important to mention that this technology was validated in a blinded manner, meaning we didn’t have access to results — which is not always the case for digital biomarkers — and the results were held by a third party. It was conducted in an incredibly rigorous manner.
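For readers curious what “area under the curve” actually measures, here is a minimal sketch in Python. The numbers are hypothetical, not RetiSpec data: AUC is simply the probability that a randomly chosen positive case (here, someone amyloid-PET-positive) receives a higher score from the model than a randomly chosen negative case, so 0.5 is chance and 1.0 is perfect separation.

```python
def auc(labels, scores):
    """Pairwise AUC: the fraction of (positive, negative) pairs the
    scores rank correctly, counting ties as half a win."""
    pos = [s for label, s in zip(labels, scores) if label == 1]
    neg = [s for label, s in zip(labels, scores) if label == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical amyloid PET status (1 = PET-positive) and model scores
pet_status = [1, 1, 1, 0, 0]
model_score = [0.9, 0.7, 0.3, 0.5, 0.2]

# 5 of the 6 positive/negative pairs are ranked correctly: 5/6 ≈ 0.83
print(round(auc(pet_status, model_score), 2))
```

With these made-up scores, five of the six positive/negative pairs are ordered correctly, giving an AUC of about 0.83 — the same scale on which the roughly 83 percent agreement above is reported.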
We actually did a head-to-head comparison with the blood-based biomarkers, and these results were presented at the Clinical Trials in Alzheimer’s Disease Conference just last year. So we’re excited for this year as well. And what we presented on the podium there was that the RetiSpec results were completely comparable to the blood-based biomarkers when we compared against amyloid PET.
What was interesting, though, was that of the 271 participants we imaged and analyzed, only a subset actually had complete blood-based biomarker results. In some cases there was an issue with the tubes, or there wasn’t enough sample. For us, this was an indication that a simple test like ours can provide value in cases where some of these other tests may not be applicable, appropriate, or fully workable.
Being Patient: How expensive are they? How much would scanning cost?
Bornbaum: So we have really, really great partnerships. We focus on building great AI, and we lean on partners who already do all the heavy lifting on the equipment. The cameras we use are already in tens of thousands of eye doctors’ offices around the globe, so people don’t have to buy a camera. If they don’t have one, we can work with our partners at Topcon Healthcare to provide one. But these are ubiquitous cameras, they’re not that expensive, and we don’t charge for them.
The exciting thing about this technology is that once we are through regulatory review — very soon — we will be able to use existing CPT codes. [CPT codes are] reimbursement codes, so folks who are seeking care and meet the criteria for the test won’t have to pay out of pocket. And because it’s AI-driven, all the development costs are up front — it’s running the studies with PET that costs all the money. After that, it’s really just servers and electricity. So we’re able to provide this in a really cost-effective way, which is core to our mission. We want early, accessible detection at scale, and it’s important to us that this be affordable and reach as many people who want and need it as possible.
Being Patient: What do you think this means for the diagnostic ecosystem in terms of the ability to maybe go to your optometrist and find out you have a biomarker?
Bornbaum: I think we are at a really exciting inflection point. We weren’t really sure what this would mean in the real world, just to be candid. Part of our development process and part of the evaluations and studies that we’ve done have included real world studies where we actually put this technology in optometry settings and in ophthalmology settings.
And we tested not only what the clinical workflows look like, what the accuracy was, but importantly, what patients felt about this, what clinicians felt about this, what care partners and loved ones felt about this. Because that’s really, really critical to understanding how this technology can be valuable, where it can be useful, where we need to be mindful of friction. And I will be honest, we were not sure what was going to happen.
We were really fortunate to receive a grant from the Davos Alzheimer’s Collaborative to test this in real world settings. And what we found was that optometry is a really effective channel. They already have very closely established referral pathways to both primary care and neuro specialists.
Certainly not everybody coming into an optometry office is going to opt into something like this. But what we found in our study is that a lot of folks have struggled to have their concerns taken seriously when they report questions about their memory or notice changes — it gets brushed off as normal healthy aging, or “let’s watch and wait and see,” when folks inherently know something is wrong and are looking for answers. This access point in optometry offered them something biologic they could hold in their hand and take to a specialist or primary care provider for follow-up care.
Something that’s become apparent to us over the last few years is that there is a movement in eye care to advance this idea of healthcare from the eye. In the same way that we use the eye as a window into what’s happening in the brain, other companies and clinicians and scientists are using it as a window into systemic health, cardiovascular health, renal health. It is an incredible noninvasive access point.
And so there’s a big initiative happening right now among regulators, reimbursement and equipment partners, and big tech companies, who are moving this forward. I think in a couple of years, everybody going into their eye doctor is going to be offered new technologies to quickly and simply understand how their systemic health is doing.
Being Patient: How far away are we from clinics using this technology? Right now you’re running studies to determine accuracy, but the next step would be having it deemed a diagnostic tool for Alzheimer’s. Is that something that has to go through FDA approval? How long will that take?
Bornbaum: We are actively preparing for our pivotal study that we will submit to the regulators, so FDA and Health Canada, and it is our hope that we will be able to offer this in market towards the end of next year — or worst case, a little bit after that. But we are on the cusp. It is many years of development in the making. We have an incredible team and partners, but that’s what’s next for us, and we’re very, very excited.