
Breast imaging AI algorithm influenced by patient characteristics


Patient characteristics influenced false-positive results on digital breast tomosynthesis (DBT) exams analyzed by an AI algorithm approved by the U.S. Food and Drug Administration (FDA), according to a study published May 21 in Radiology.

Researchers led by Derek Nguyen from Duke University in Durham, NC, found that false-positive case scores assigned by the algorithm were significantly more likely in Black and older patients, and less likely in Asian and younger patients, compared with white patients and women between the ages of 51 and 60.

“Radiologists should exercise caution when using these AI algorithms, as their effectiveness may not be consistent across all patient groups,” Nguyen told AuntMinnie.com.

Radiology departments continue to show interest in implementing AI into their workflows, with the technology being used to manage workloads among radiologists. However, the researchers pointed out a lack of data on the impact of patient characteristics on AI performance.

Nguyen and colleagues explored the impact of patient characteristics such as race and ethnicity, age, and breast density on the performance of an AI algorithm interpreting negative screening DBT exams performed between 2016 and 2019.

All exams had two years of follow-up without a diagnosis of atypia or breast malignancy, indicating that they were true-negative cases. The team also included a subset of unique patients randomly selected to provide a broad distribution of race and ethnicity.

The FDA-approved algorithm (ProFound AI 3.0, iCAD) generated case scores (malignancy certainty) and risk scores (one-year subsequent malignancy risk) for each mammogram.

The study included 4,855 women with a median age of 54 years. Of the total, 1,316 were white, 1,261 were Black, 1,351 were Asian, and 927 were Hispanic.

The algorithm was more likely to assign false-positive case and risk scores to Black women, and less likely to assign case scores to Asian women, compared with white women. The algorithm was also more likely to assign false-positive case scores to older patients and less likely to do so for younger patients.

Likelihood of false-positive score assignment by AI algorithm

Patient demographic | Suspicious case score         | Suspicious risk score
                    | Odds ratio    | p-value       | Odds ratio    | p-value
White               | Reference     |               | Reference     |
Black               | 1.5           | <0.001        | 1.5           | 0.02
Asian               | 0.7           | 0.001         | 0.7           | 0.06
Age 51-60           | Reference     |               | Reference     |
Age 41-50           | 0.6           | <0.001        | 0.2           | <0.001
Age 61-70           | 1.1           | 0.51          | 3.5           | <0.001
Age 71-80           | 1.9           | <0.001        | 7.9           | <0.001

Additionally, women with extremely dense breasts were more likely to be assigned false-positive risk scores compared with women with fatty breasts (odds ratio [OR], 2.8; p = 0.008). The same trend held for women with scattered areas of fibroglandular density (OR, 2.0; p = 0.01) and women with heterogeneously dense breasts (OR, 2.0; p = 0.05).
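The odds ratios above compare the odds of a false-positive score in each group against a reference group (white women, or women ages 51-60). As a rough illustration only, not the study's actual method or data (the authors presumably fit regression models on the real exam-level results), the basic calculation from a 2x2 contingency table looks like this:

```python
def odds_ratio(fp_group, neg_group, fp_ref, neg_ref):
    """Odds ratio of a false-positive score in one group vs. a reference group.

    fp_*  = exams flagged with a suspicious score (false positives, since
            every exam in the study cohort was a true negative)
    neg_* = exams not flagged
    """
    return (fp_group / neg_group) / (fp_ref / neg_ref)

# Hypothetical counts (NOT from the study): suppose 150 of 1,261 Black women
# and 110 of 1,316 white women received a suspicious case score.
or_black_vs_white = odds_ratio(150, 1261 - 150, 110, 1316 - 110)
print(round(or_black_vs_white, 2))  # prints 1.48
```

An OR above 1.0 means the group is more likely than the reference group to receive a false-positive score; below 1.0, less likely.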

The team also reported no significant differences in either risk or case scores between Hispanic women and white women.

The study authors called for the FDA to provide clear guidance on the demographic characteristics of the patient samples used to develop algorithms, and for vendors to be transparent about the algorithm development process. They also called for diverse datasets to be used in training future AI algorithms.

Nguyen said that radiology departments should thoroughly investigate the patient population datasets on which AI algorithms were trained.

“Ensuring that the distribution of the training dataset closely matches the demographics of the patient population they serve can help optimize the accuracy and utility of the AI in clinical workflows,” he told AuntMinnie.com.

The full study can be found here.
