TIRUCHIRAPPALLI, India: The integration of artificial intelligence (AI) into medical imaging can support diagnosis and reduce clinical workload. A new study has tested the performance of two newer AI image recognition models—called transformers—in the automatic detection of common dental conditions on panoramic radiographs. The results highlight their potential to support dentists with faster and more reliable assessments.
Undertaken by researchers in India, the study sought to determine whether software can reliably sort a panoramic radiograph into a condition category—caries, gingivitis, calculus and hypodontia—based on the overall radiographic pattern. The authors tested two transformer models that process an image differently and compared their diagnostic performance and speed. Their goal was to address limitations of traditional diagnostic methods, including subjectivity, variability between clinicians and difficulty detecting early or subtle lesions.
They trained, validated and tested the models using a dataset of over 5,000 annotated panoramic images sourced from multiple clinical repositories. Their results showed that the best-performing model achieved slightly higher diagnostic performance, reaching around 96% accuracy. The second model delivered comparable accuracy but ran more efficiently, a practical consideration for clinical deployment. Here, accuracy refers to how often the model assigned the correct category to the whole radiograph. Both models classified most radiographs correctly, but performance differed by condition.
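As a rough illustration of the accuracy metric described above (this is not the study's code, and the example labels are hypothetical), whole-image classification accuracy can be sketched as the fraction of radiographs assigned the correct category:

```python
# Illustrative sketch only: computing whole-image classification accuracy.
# Each radiograph receives exactly one predicted category, and accuracy is
# the share of radiographs whose predicted category matches the true one.

def accuracy(true_labels, predicted_labels):
    """Fraction of radiographs assigned the correct category."""
    if len(true_labels) != len(predicted_labels):
        raise ValueError("label lists must be the same length")
    correct = sum(t == p for t, p in zip(true_labels, predicted_labels))
    return correct / len(true_labels)

# Hypothetical example: five radiographs, four classified correctly.
truth = ["caries", "caries", "calculus", "hypodontia", "gingivitis"]
preds = ["caries", "calculus", "calculus", "hypodontia", "gingivitis"]
print(accuracy(truth, preds))  # 0.8
```

A reported figure of around 96% accuracy would thus mean that roughly 96 out of every 100 radiographs were placed in the correct condition category.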
How does this relate to AI products already used in clinics?
Tools such as Pearl Second Opinion, VideaHealth Detect AI and Align X-ray Insights support decision-making by highlighting regions of interest for specific findings on radiographs. In contrast, the present study evaluated whether the tested AI models could automatically assign a category to the radiograph as a whole, rather than flagging individual regions within it.
Overall, the study concluded that transformer-based systems offer a promising tool for automated diagnosis and have the potential to enhance early detection, reduce diagnostic errors and streamline workflows. Future work will focus on testing with larger and more diverse datasets and refining models to ensure reliability before routine clinical deployment.
The study, titled “A self attention based deep learning framework for accurate and efficient dental disease detection in OPG radiographs”, was published online on 21 January 2026 in Scientific Reports.