
Reducing Noise in Dentistry: The Role of AI in Improving Radiographic Interpretation



A patient emails her recently taken full-mouth series of X-rays to 20 different dentists, stating, “My previous dentist took these just before I moved to town. The move was sudden (work-related), so my dentist never had a chance to tell me what work I needed done. I’m here for two years on assignment, and I’m looking for a new dentist. Please review the attached X-rays and let me know if any treatment is required and when I can get an appointment.” Assume all 20 dentists agree to provide a provisional treatment plan based on the diagnostic images alone (on the condition, of course, that they examine the patient before starting the work). How consistent do you think the 20 dentists’ plans will be?

If the scenario above seems familiar, you may remember the CBC’s 2012 undercover investigation. Their Marketplace segment, Money Where Your Mouth Is, found that, when presented with an identical full-mouth radiographic series, dentists varied widely in their diagnoses. Indeed, 40% of the treatment plans provided by a group of 20 unsuspecting dentists were significantly out of line with those of the CBC’s hired experts, while various “innocuous discrepancies” were present in the remaining 60%. Despite the diagnostic inconsistency, there was no indication the full-mouth series was particularly complicated. “Unfortunately, in dentistry,” the show’s expert opined, a patient’s diagnosis “depends very much on the dentist’s personal experience.” The show’s producers pointed to factors that might explain the conflicting opinions, such as place of training, treatment philosophy (conservative vs. aggressive), and the types of materials used by the providers. While the CBC noted some dentists were likely putting their “own financial interests ahead of a patient’s best interests,” there was one factor the show did not mention: human error.

Noise in medicine

Daniel Kahneman, the Nobel laureate in economics best known for Thinking, Fast and Slow, published a new book last spring: Noise: A Flaw in Human Judgment.[i] In Noise, Kahneman explains two broad phenomena that describe systematic errors in fields such as law, medicine, and finance: bias and, as the title of the book suggests, noise. Bias comprises a consistent pattern of divergence, which may or may not have moral implications. A dentist with a bias towards early intervention when faced with incipient decay may be thought to have an “aggressive treatment philosophy.” A dentist who consistently provides more comprehensive treatment plans when patients have insurance coverage demonstrates another kind of bias. On the other hand, if 10 dentists with no incentive other than to respond correctly are instructed to diagnose decay in the same set of radiographs and provide 10 different responses in no clear direction, their answers would be described as “noisy.” The further their diagnoses diverge, the greater the level of noise.
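To make the distinction concrete, here is a minimal, purely illustrative simulation in Python; the numbers are invented for the example and are not drawn from any study. Bias appears as a consistent shift away from the true value, while noise appears as wide scatter with no particular direction.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
true_bone_loss = 20.0  # hypothetical "ground truth" percent bone loss

# Bias: readings that systematically overestimate bone loss by about 8 points.
biased_readings = true_bone_loss + 8 + rng.normal(0, 1, size=10)

# Noise: readings that scatter widely around the truth with no consistent direction.
noisy_readings = true_bone_loss + rng.normal(0, 8, size=10)

print("Biased readings: mean error %+.1f, spread %.1f"
      % (biased_readings.mean() - true_bone_loss, biased_readings.std()))
print("Noisy readings:  mean error %+.1f, spread %.1f"
      % (noisy_readings.mean() - true_bone_loss, noisy_readings.std()))
```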

Kahneman describes the real world as “scandalously” noisy, and Exhibit A of a noisy system is medicine:

Faced with the same patient, different doctors make different judgments about whether patients have skin cancer, breast cancer, heart disease, tuberculosis, pneumonia, depression, and a host of other conditions. Noise is especially high in psychiatry …[h]owever, considerable noise is also found in areas where it might not be expected, such as the reading of X-rays.

Indeed, radiologists call diagnostic variation their Achilles’ heel. Because noise in radiography is easy to demonstrate, research in this area has produced striking findings. For example, the error rate for radiologists who participated in a study on breast cancer diagnosis ranged from a perfect score (i.e., 0% error) to a staggering 50% rate of false negatives and 64% rate of false positives.[ii]

Physicians’ judgments over time have also been shown to be noisy. When unknowingly presented with the same challenging diagnostic images months apart, 22 physicians disagreed with their own earlier diagnoses between 63% and 92% of the time.[iii] Other research suggests that the accuracy of diagnoses may vary depending on the time of day or how busy the day is. One study of a very large dataset found that clinicians ordered cancer screening for 63.7% of eligible patients seen at 8:00 a.m., compared to only 47.8% of those seen at 5:00 p.m.[iv] It has been speculated that radiology departments tend to fall behind later in the day and, therefore, the rate of missed diagnoses climbs as clinicians feel rushed to catch up. While this is a rational explanation, it’s not exactly reassuring.

Noise in dentistry

It would be easy to imagine how the body of research summarized above relates to dentistry, but we don’t have to. Noise in the interpretation of dental radiography has been extensively studied, and multiple factors have been found to contribute to increased noise. The most closely studied factor is the complexity of interpreting oral radiographs and, in many cases, the lack of adequate training. Consider the following from a dental school in the United States:

The University of Missouri-Kansas City (UMKC) School of Dentistry includes assessment of radiographic interpretation in its comprehensive objective structured clinical examination (OSCE) … Senior dental students participate in the multistation OSCE in the summer semester between the third and fourth years. During the pilot, student performance on the radiology station of the OSCE resulted in a very small percent (2.9%) passing on first attempt, the lowest of all the OSCE stations.

That 97% of dental students failed the radiographic interpretation examination on the first attempt clearly demonstrated a lack of adequate training. A few years after the school revamped its approach to teaching X-ray interpretation, the first-attempt failure rate decreased to 73%.[v] Nevertheless, the results show how easy it is to err when diagnosing pathologies from two-dimensional images. One might reasonably argue that seasoned clinicians would perform better; however, the real world poses a number of challenges that a student taking an examination would not face. A busy dentist under time pressure from staff and patients might spend significantly less time arriving at and documenting findings from a set of periapicals and bitewings. Dentists dealing with the stresses of daily practice might not feel they have time to study the entire field of view; the hazards of tunnel vision are well known. Others might arrive at the correct diagnosis but forget it before they have time to enter the chart notes.

Noise has been observed amongst dental experts as well as novices. In one study, clinical instructors at another US dental school were asked “to rate percent bone loss for indicated teeth while viewing digitized radiographic images by selecting one of the following categories: <15 percent, 15-30 percent, and >30 percent.”[vi] Answers were calibrated against the guidelines of the American Dental Association (ADA) and the American Academy of Periodontology (AAP). When first exposed to the test images, “clinical instructors’ agreement with the correct choice was 64.5 percent.” In other words, 35.5% of the time the instructors’ assessment was incorrect. The study noted “inaccuracies and inconsistencies of radiographic interpretation among clinical instructors” and concluded that further “training and calibrating [of] instructors are needed so that the accuracy and consistency of their ratings can be enhanced.”

Beyond inadequate training, other factors have been shown to contribute to error in the interpretation of dental radiography, including ambient light (diagnoses were more accurate when the dentist studied the image in a dark room after giving their eyes time to adjust).[vii] Research has also established that the quality of radiographic images can vary from one imaging system to another, leading to an increased frequency of error, and that “projected anatomy contributes substantially to noise, especially when detecting low-contrast objects in the images.”[viii] In other disciplines, fatigue, hunger and even mood have all been observed to influence the quality of complex judgments.

Consequences of noise

Whatever the cause, we know that faulty evaluation of X-rays can lead to a variety of suboptimal outcomes such as misdiagnosis, missed and delayed diagnosis, over- or undertreatment, flawed evaluation of patients’ periodontal conditions over time and, ultimately, poor treatment outcomes, dissatisfied patients, and liability. These, in turn, can negatively impact clinicians’ financial wellbeing, stress levels and career satisfaction. Although the exact rate of error (or noise) in real-world dental practice is elusive, clinicians who are familiar with the research and who do not suffer from the Dunning-Kruger effect ought to be motivated to reduce error in diagnosing oral pathologies from radiographic images.

Reducing noise and improving the quality of radiographic interpretation

Patient safety experts and practicing clinicians have proposed several ways to address noise in healthcare. Two of the studies mentioned above argue that rigorously applied, well-structured training programs are key to improving quality. Process improvements such as the implementation of protocols and checklists have shown potential to decrease diagnostic errors. At the organizational level, patient-centered cultures with a strong focus on patient safety are known, over time, to reduce adverse events and other patient safety incidents. Healthcare providers should never abandon incremental improvements; however, dental radiology is on the cusp of tremendous and rapid technological disruption.

The application of artificial intelligence (AI) programs in the medical profession has exploded in popularity over the past decade, and its applications in dentistry are developing at breakneck speed. Until recently, the idea that a computer could search through a dental office’s entire database of X-rays, identify potential missed diagnoses and, with the click of a button, aid in the detection of a wide variety of oral pathologies seemed like fantasy. It’s already happening. In fact, multiple products have rushed to market, and early adopters – general dentists and dental specialists in private practice – have incorporated AI solutions into their daily routines. AI, deep learning algorithms, and computer-assisted diagnosis form part of the fabric of dentistry, and the sophistication of this technology is advancing hour by hour. The pace of change in this area can be summarized in five words: blink and you’ll miss it.
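To make the “click of a button” idea concrete, the Python sketch below shows what a retrospective second look over a practice’s radiograph archive could involve. It is purely illustrative: the folder layout, the load_caries_model helper and the model’s predict method are hypothetical stand-ins, not the API of any actual product.

```python
from pathlib import Path

def screen_archive(archive_dir: str, model, threshold: float = 0.5):
    """Run a caries-detection model over every radiograph in an archive
    and flag the images whose predicted probability exceeds the threshold."""
    flagged = []
    for image_path in Path(archive_dir).glob("**/*.png"):
        probability = model.predict(image_path)  # assumed to return a float in [0, 1]
        if probability >= threshold:
            flagged.append((image_path.name, probability))
    # Highest-probability findings first, so the clinician reviews those before the rest.
    return sorted(flagged, key=lambda item: item[1], reverse=True)

# Hypothetical usage:
# model = load_caries_model("caries-detector-v1")   # stand-in for a vendor-supplied model
# for name, p in screen_archive("/data/radiographs", model)[:20]:
#     print(f"{name}: {p:.2f}")
```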

AI programs aid clinicians by tracing cephalometric landmarks; detecting caries, alveolar bone loss, and periapical pathosis; marking the inferior alveolar nerve; analyzing facial growth; and performing other similar tasks. The use of AI to screen for oral cancer and lymph node metastasis is actively under development, as is the diagnosis and treatment planning of various other common orofacial diseases. In the not-too-distant future, deep-learning and meta-learning algorithms will be capable of detecting even rare diseases. On the bleeding edge of this technology, researchers are working on an algorithm that will be capable of “universal lesion detection,” not just in the mouth but throughout the entire body (and even the brain).

In the bread-and-butter areas of dentistry, AI is already outperforming dentists; in one study, “a deep [learning] algorithm was able to detect carious lesions with an accuracy of 75.5–93.3% and a sensitivity of 74.5–97.1%.[ix] This is a considerable improvement over diagnosis by clinicians using radiographs alone, with sensitivity varying from 19% to 94%.” Using another algorithm, “the accuracy of [periodontally compromised tooth] diagnosis proved to be 76.7–81.0%, while the accuracy of predicting the need for extraction was 73.4–82.8%.”[x] There is clearly room for improvement; however, recall that the clinical instructors in the study discussed above erred 35.5% of the time, and with every additional radiograph fed into these algorithms, their accuracy and utility will increase.
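For readers who want to connect these percentages to their definitions, sensitivity and accuracy can be computed from a simple confusion matrix. The counts below are invented purely for illustration and do not come from the studies cited.

```python
# Invented counts for illustration: 200 tooth surfaces read by a hypothetical algorithm.
true_positives = 90   # carious lesions correctly flagged
false_negatives = 10  # carious lesions missed
true_negatives = 85   # sound surfaces correctly left unflagged
false_positives = 15  # sound surfaces incorrectly flagged

sensitivity = true_positives / (true_positives + false_negatives)
accuracy = (true_positives + true_negatives) / (
    true_positives + false_negatives + true_negatives + false_positives
)

print(f"Sensitivity: {sensitivity:.1%}")  # 90.0% of real lesions detected
print(f"Accuracy:    {accuracy:.1%}")     # 87.5% of all calls correct
```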

While a breakthrough technology, AI is not yet ready to replace dental professionals. “Rather, the use of AI should be viewed as a complementary asset, to assist dentists and specialists” in an effort to minimize the noise that has so long plagued the field of radiographic interpretation. We know that technology can greatly disrupt the practice of dentistry – just look at the impact Invisalign’s algorithm and companies like Smile Direct Club have had. It’s up to us to embrace it.

References

[i] Kahneman D, Sibony O, Sunstein CR. Noise: A Flaw in Human Judgment. Little, Brown; 2021.

[ii] Beam CA, Layde PM, Sullivan DC. Variability in the interpretation of screening mammograms by US radiologists: findings from a national sample. 1996. https://pubmed.ncbi.nlm.nih.gov/8546556/.

[iii] Robinson PJ, Wilson D, Coral A, Murphy A, Verow P. Variation between experienced observers in the interpretation of accident and emergency radiographs. Br J Radiol. 1999. https://pubmed.ncbi.nlm.nih.gov/10474490/.

[iv] Hsiang EY, Mehta SJ, Small DS, Rareshide CAL, Snider CK, Day SC, Patel MS. Association of primary care clinic appointment time with clinician ordering and patient completion of breast and colorectal cancer screening. JAMA Netw Open. 2019;2(5):e193403. doi:10.1001/jamanetworkopen.2019.3403.

[v] Kumar V, Gadbury-Amyot CC. Predoctoral curricular revision for dental radiographic interpretation competence based on OSCE results. J Dent Educ. 2019;83(10):1233-1239.

[vi] Lanning SK, Best AM, Temple HJ, Richards PS, Carey A, McCauley LK. Accuracy and consistency of radiographic interpretation among clinical instructors using two viewing systems. J Dent Educ. 2006;70(2):149-159.

[vii] Kawai T, Sato K, Yosue T. Effects of viewing conditions on the detection of contrast details on intraoral radiographs. Oral Radiol. 2005;21(1):23-29.

[viii] Olsson L, Nilsson M, Svenson B, Hellén-Halme K. The effect of anatomical noise on perception of low contrast in intra-oral radiographs: an in vitro study. Dentomaxillofac Radiol. 2016;45(4):20150402.

[ix] Bader JD, Shugars DA, Bonito AJ. Systematic reviews of selected dental caries diagnostic and management methods. J Dent Educ. 2001;65(10):960-968.

[x] Lee JH, Kim DH, Jeong SN, Choi SH. Diagnosis and prediction of periodontally compromised teeth using a deep learning-based convolutional neural network algorithm. J Periodontal Implant Sci. 2018;48(2):114-123.


About the Authors

Julian Perez is Chief Legal Officer at dentalcorp, where he oversees legal, regulatory compliance, corporate governance, and enterprise risk functions to support practices in the delivery of optimal patient care. He earned his bachelor’s degree from Yale University and a JD from Columbia University’s School of Law.


Dr. Michelle Budd works with dentalcorp’s Compliance & Risk Management team as a Patient Safety Consultant. She graduated from Western University with a Doctor of Dental Surgery degree and subsequently earned a Master of Public Health degree. Michelle has travelled throughout Canada to help dental practices achieve and maintain professional compliance.

