SEATTLE, WA – University of Washington researchers are developing an app that could allow people to screen for pancreatic cancer and other diseases with the snap of a smartphone selfie.
With a five-year survival rate of 9 percent, pancreatic cancer ranks among the deadliest forms of the disease, but its symptoms can still be caught before the cancer spreads throughout the body.
Through a smartphone camera, BiliScreen uses computer vision algorithms and machine learning tools to detect increased bilirubin levels in the sclera – the white of the eye.
One of the earliest symptoms of pancreatic cancer is jaundice, a yellow discoloration of the skin and eyes due to a buildup of bilirubin in the blood. Detecting signs of jaundice when bilirubin levels are minimally elevated before they’re visible to the naked eye could enable an entirely new screening program for individuals at risk.
In an initial clinical study of 70 people, the BiliScreen app, used with a 3-D printed box that controls light exposure, correctly identified cases of concern 89.7 percent of the time, compared with the blood test currently used.
Lead author Alex Mariakakis, a doctoral student at the Paul G. Allen School of Computer Science & Engineering, believes the test is easy enough for people to self-administer once a month in the comfort of their own homes.
BiliScreen builds on BiliCam from the UW’s Ubiquitous Computing Lab, a smartphone app that screens for newborn jaundice with nothing but an image of a baby’s skin. A recent study in the journal Pediatrics showed BiliCam provided accurate estimates of bilirubin levels in 530 infants.
BiliScreen measures bilirubin, which at elevated levels in the blood can be an early signal of pancreatic cancer, to help determine whether a consultation with a doctor for further testing is necessary. Current methods for measuring bilirubin levels require the care of a doctor – something that does not easily accommodate frequent screenings.
In adults, the whites of the eyes are more sensitive than skin to changes in bilirubin levels, which can signal pancreatic cancer, hepatitis or the generally benign Gilbert's syndrome. Unlike skin color, color changes in the sclera are more consistent across ethnicities and races.
By the time the sclera shows a visible yellow discoloration, bilirubin levels are well past cause for concern. The UW team wondered if computer vision and machine learning tools could detect those color changes in the eye before humans can see them.
Using a smartphone’s built-in camera and flash, BiliScreen collects images of a person’s eye as they snap a selfie. The app automatically isolates the white parts of the eye, then processes the color information from the sclera, measuring the wavelengths of light it reflects and absorbs, and correlates that information with estimated bilirubin levels using machine learning.
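The pipeline described above, averaging the color of the sclera pixels and mapping that color to a bilirubin estimate, can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the function names, the toy linear model, and the weights are hypothetical, not BiliScreen's actual algorithm, which the article says is learned from clinical data.

```python
import numpy as np

def mean_sclera_color(image, sclera_mask):
    """Average RGB color over the pixels flagged as sclera by the mask."""
    pixels = image[sclera_mask]          # shape (n_pixels, 3)
    return pixels.mean(axis=0)

def estimate_bilirubin(rgb, weights, bias):
    """Toy linear model mapping an average sclera color to a bilirubin estimate."""
    return float(np.dot(weights, rgb) + bias)

# Tiny synthetic example: a 4x4 image whose top half stands in for the sclera.
image = np.zeros((4, 4, 3), dtype=float)
image[:2, :] = [0.9, 0.9, 0.5]          # slightly yellow "whites"
mask = np.zeros((4, 4), dtype=bool)
mask[:2, :] = True

color = mean_sclera_color(image, mask)
# Hypothetical weights: more red/green relative to blue reads as "yellower".
level = estimate_bilirubin(color, weights=np.array([5.0, 5.0, -10.0]), bias=0.0)
print(round(level, 2))  # prints 4.0
```

In a real system, the sclera mask would come from a segmentation model and the color-to-bilirubin mapping would be fit against ground-truth blood tests rather than hand-picked weights.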
To account for different lighting conditions, the team tested BiliScreen with two accessories: paper glasses printed with colored squares that help the app calibrate color, and a 3-D printed box that blocks out ambient light. The UW team hopes to remove the need for these accessories, possibly by mining data from facial pictures.
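The calibration idea behind the printed color squares can be sketched simply: photograph a reference patch of known color, compute per-channel gains that map the captured patch back to its true color, and apply the same gains to the sclera measurement. This is a generic white-balance sketch under assumed values, not the app's actual calibration routine.

```python
import numpy as np

def calibration_gains(captured_ref, true_ref):
    """Per-channel gains mapping a captured reference square to its known color."""
    return np.asarray(true_ref, dtype=float) / np.asarray(captured_ref, dtype=float)

def correct_color(observed, gains):
    """Apply the gains to a color measured under the same lighting."""
    return np.asarray(observed, dtype=float) * gains

# A white reference square photographed under warm indoor light looks yellowish.
gains = calibration_gains(captured_ref=[1.0, 0.95, 0.7], true_ref=[1.0, 1.0, 1.0])
sclera = correct_color([0.9, 0.88, 0.6], gains)
print(np.round(sclera, 3))
```

With the lighting bias divided out this way, the corrected sclera color can be compared across photos taken in different rooms, which is what the glasses and the box are each trying to make possible.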
Next, the team will test the app on more groups of people at risk for jaundice and underlying conditions, and work on making the app as user-friendly as possible — including removing the need for accessories like the box and glasses.