Computer Learning Model Accurately Predicts Osteoporosis From Hand Radiographs

Researchers determined whether a deep learning model could identify osteoporosis on hand radiographs based on the second metacarpal cortical percentage.

A deep learning model can accurately screen for osteoporosis based on x-rays of the hand and wrist, according to pilot study results published in The Journal of Hand Surgery.1

Computer algorithms designed to classify images have been applied to medical treatment and diagnosis in several ways, such as in spinal surgery and assessment of coronary angiograms.1 With regard to screening for bone loss, researchers have developed a mobile-based technology to predict the presence of osteopenia or osteoporosis based on hand x-rays, using second metacarpal cortical percentage as a proxy for global bone mineral density. Although this technology is simpler and notably more cost-effective than dual-energy x-ray absorptiometry,2 it is limited by inconsistencies in interpretations, measurements, and standardization, which may be improved with computer learning.
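The second metacarpal cortical percentage compares the combined cortical thickness with the total bone width at the midshaft. A minimal sketch of that calculation follows; the function name and the measurement values are illustrative assumptions, not data from the cited studies:

```python
def cortical_percentage(outer_width_mm: float, medullary_width_mm: float) -> float:
    """Share of the total midshaft width occupied by cortical bone, as a percentage.

    outer_width_mm: total outer width of the second metacarpal at midshaft.
    medullary_width_mm: width of the inner medullary canal at the same level.
    """
    if not 0 <= medullary_width_mm < outer_width_mm:
        raise ValueError("medullary width must be non-negative and less than outer width")
    return 100.0 * (outer_width_mm - medullary_width_mm) / outer_width_mm

# Hypothetical midshaft measurements for illustration:
print(round(cortical_percentage(8.0, 3.0), 1))  # 62.5
```

Lower percentages indicate a thinner cortex relative to total bone width, which is why the measure can serve as a proxy for bone mineral density.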

Based on this research, investigators explored whether a convolutional neural network, a deep learning technique for analyzing visual images, could be used to predict osteoporosis on plain hand and wrist x-rays.1

A total of 4000 unique right- and left-hand radiographs were mirrored as needed so that every image depicted a right hand, then edited for background subtraction. The convolutional neural network was developed using a training set of x-rays to perform the laterality classification task, validated to ensure effectiveness, and tested on a set of novel images. For vertical alignment classification, 2500 hand x-rays were used to train a model to detect the vertical alignment of the second metacarpal. Models for segmentation of the second metacarpal and for prediction of osteoporosis based on the second metacarpal cortical percentage were also designed. All 4 processes were combined into a fully automated pipeline model to predict osteoporosis, and the researchers tested each model for accuracy.
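The four stages above can be sketched as a composed pipeline. The stage functions below are placeholder stand-ins for trained convolutional networks, operating on a toy record rather than pixel data; all names, fields, and the 50% cutoff are assumptions for illustration only:

```python
def mirror_to_right_hand(img: dict) -> dict:
    # Laterality stage: flip left hands so every image reads as a right hand.
    if img.get("side") == "left":
        img = {**img, "side": "right"}
    return img

def align_vertically(img: dict) -> dict:
    # Alignment stage: rotate until the second metacarpal is vertical.
    return {**img, "rotation_deg": 0.0}

def segment_second_metacarpal(img: dict) -> dict:
    # Segmentation stage: isolate the second metacarpal for measurement.
    return img

def predict_osteoporosis(img: dict) -> bool:
    # Final stage: threshold the measured cortical percentage
    # (the 50% cutoff here is hypothetical, not from the study).
    return img["cortical_pct"] < 50.0

def screen(img: dict) -> bool:
    """Run the preprocessing stages in order, then classify."""
    for stage in (mirror_to_right_hand, align_vertically, segment_second_metacarpal):
        img = stage(img)
    return predict_osteoporosis(img)

print(screen({"side": "left", "rotation_deg": 12.0, "cortical_pct": 44.0}))  # True
```

Composing the stages this way mirrors the paper's design choice: each model solves one narrow task, and the pipeline chains them so a raw radiograph goes in and a screening prediction comes out.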

The laterality classification model had an accuracy of 99.62% in distinguishing right from left hands (100% specificity; 99.3% sensitivity). The vertical alignment model predicted hand orientation to within 10° of vertical in 93.2% of scans. Accuracy was 94.8% for segmentation and 88.4% for prediction of osteoporosis. Finally, the full pipeline model was tested on 265 unique x-rays, 34 of which showed osteoporosis. The full pipeline had an accuracy of 93.9% (82.4% sensitivity; 95.7% specificity), with only 3 second metacarpals segmented incorrectly. Sensitivity and specificity were similar in a balanced dataset in which the incidence of osteoporosis was 50%.
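The reported pipeline figures can be checked against a reconstructed confusion matrix. The counts below (28 true positives, 6 false negatives, 10 false positives, 221 true negatives) are back-calculated from the published percentages, so they are an assumption rather than data taken from the paper:

```python
# Confusion-matrix counts reconstructed (assumed) from 265 x-rays,
# 34 osteoporotic, 82.4% sensitivity, 95.7% specificity.
tp, fn = 28, 6     # of the 34 osteoporotic images
fp, tn = 10, 221   # of the 231 non-osteoporotic images

sensitivity = tp / (tp + fn)               # true-positive rate
specificity = tn / (tn + fp)               # true-negative rate
accuracy = (tp + tn) / (tp + fn + fp + tn)

print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}, accuracy {accuracy:.1%}")
# sensitivity 82.4%, specificity 95.7%, accuracy 94.0%
```

The reconstructed accuracy comes out at 94.0% rather than the reported 93.9%, a difference attributable to rounding in the published percentages.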


“This was a pilot project to prove the computer can accurately read the images and make changes as needed (laterality, rotation, etc) to accomplish the goal,” the researchers stated. “Further refinements in the process and additional data would be required for this concept to replace dual-energy x-ray absorptiometry (DEXA) scans for diagnosis of osteoporosis or osteopenia.”

Nonetheless, given the frequency and availability of hand radiographs among older patients at greater risk for osteoporosis, this technology may “help with screening and identifying those who should be referred for further evaluation.”

References

1. Tecle N, Teitel J, Morris MR, Sani N, Mitten D, Hammert WC. Convolutional neural network for second metacarpal radiographic osteoporosis screening. J Hand Surg Am. 2020;45(3):175-181.

2. Schreiber JJ, Kamal RN, Yao J. Simple assessment of global bone density and osteoporosis screening using standard radiographs of the hand. J Hand Surg Am. 2017;42(4):244-249.