In a recent study published in Technologies, researchers devised a novel system that uses machine learning to predict tongue disease.
Study: Tongue Disease Prediction Based on Machine Learning Algorithms.
Background
Traditional tongue diagnosis relies on observing tongue features such as color, shape, texture, and moisture, which reflect a patient's health status.
Traditional Chinese medicine (TCM) practitioners rely on visual assessments of these characteristics, which introduces subjectivity into diagnosis and makes findings difficult to replicate. The rise of artificial intelligence (AI) has created strong demand for advances in tongue diagnostic technologies.
Automated tongue color analysis systems have demonstrated high accuracy in distinguishing healthy from ill individuals and in diagnosing various disorders. AI has advanced considerably in capturing, analyzing, and classifying tongue images.
Integrating AI approaches into tongue diagnostic research aims to improve reliability and accuracy while laying the groundwork for large-scale AI applications in healthcare.
About the study
The present study proposes a novel, machine learning-based imaging system to analyze and extract tongue color features at different color saturations and under various light conditions for real-time tongue color analysis and disease prediction.
The researchers trained six machine learning algorithms on tongue images classified by color to predict tongue color. The algorithms were support vector machines (SVM), naive Bayes (NB), decision trees (DT), k-nearest neighbors (KNN), Extreme Gradient Boosting (XGBoost), and random forest (RF) classifiers.
The color models were as follows: red, green, and blue (RGB); hue, saturation, and value (HSV); two models separating luminance from chrominance (YCbCr and YIQ); and lightness with green-red and blue-yellow axes (LAB).
Researchers divided the data into training (80%) and testing (20%) datasets. The training dataset comprised 5,260 images classified as yellow (n=1,010), red (n=1,102), blue (n=1,024), green (n=945), pink (n=310), white (n=300), and gray (n=737), captured under different light conditions and saturations.
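To make this training step concrete, the sketch below shows how six classifiers of these types could be trained and compared on color-channel features with an 80/20 split. It is written in Python with scikit-learn and XGBoost rather than the MATLAB environment the study used, and the feature and label files, hyperparameters, and library choices are illustrative assumptions, not the study's actual code.

```python
# Minimal sketch: compare six classifiers on tongue color-channel features.
# Assumes features were already extracted into X (one row per image) and
# color labels were integer-coded into y; file names are hypothetical.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from xgboost import XGBClassifier
from sklearn.metrics import accuracy_score

X = np.load("tongue_features.npy")      # hypothetical feature matrix
y = np.load("tongue_color_labels.npy")  # hypothetical labels, coded 0..6

# 80/20 split, mirroring the study's training/testing partition
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

classifiers = {
    "SVM": SVC(),
    "Naive Bayes": GaussianNB(),
    "Decision Tree": DecisionTreeClassifier(),
    "KNN": KNeighborsClassifier(),
    "XGBoost": XGBClassifier(),
    "Random Forest": RandomForestClassifier(),
}

# Train each model and report held-out accuracy
for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    acc = accuracy_score(y_test, clf.predict(X_test))
    print(f"{name}: {acc:.3f}")
```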
The second group included 60 pathological tongue images from Mosul General Hospital and Al-Hussein Hospital in Iraq, encompassing individuals with various conditions such as diabetes, asthma, mycotic infection, kidney failure, COVID-19, anemia, and fungiform papillae.
Patients sat 20 cm in front of the camera while the machine learning algorithm recognized the color of their tongues and predicted their health status in real time.
Researchers used laptops running MATLAB App Designer and webcams with 1,920 x 1,080 pixel resolution to extract tongue color and features. Image analysis included segmenting the central region of the tongue image and removing the mustache, beard, lips, and teeth before analysis.
After image analysis, the system converted the RGB space to the HSV, YCbCr, YIQ, and LAB models. After color classification, the intensities from the different color channels were fed to the machine learning algorithms to train the imaging model.
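The conversion step can be sketched as follows, assuming the tongue has already been segmented into a rectangular RGB crop. OpenCV provides the HSV, YCbCr, and LAB conversions; YIQ is computed with its standard NTSC transform matrix because OpenCV has no built-in YIQ conversion. The per-channel mean-intensity feature vector at the end is an illustrative choice, not necessarily the study's exact descriptor.

```python
# Minimal sketch: convert a segmented tongue crop into multiple color models
# and build a simple channel-intensity feature vector.
import cv2
import numpy as np

crop = cv2.imread("tongue_crop.png")             # hypothetical segmented crop (BGR)
rgb = cv2.cvtColor(crop, cv2.COLOR_BGR2RGB)

hsv = cv2.cvtColor(crop, cv2.COLOR_BGR2HSV)
ycbcr = cv2.cvtColor(crop, cv2.COLOR_BGR2YCrCb)  # OpenCV orders channels Y, Cr, Cb
lab = cv2.cvtColor(crop, cv2.COLOR_BGR2Lab)

# Standard RGB -> YIQ transform (NTSC definition)
yiq_matrix = np.array([[0.299,  0.587,  0.114],
                       [0.596, -0.274, -0.322],
                       [0.211, -0.523,  0.312]])
yiq = (rgb.astype(np.float32) / 255.0) @ yiq_matrix.T

# One mean intensity per channel of each color model (5 models x 3 channels)
features = np.concatenate([
    rgb.reshape(-1, 3).mean(axis=0),
    hsv.reshape(-1, 3).mean(axis=0),
    ycbcr.reshape(-1, 3).mean(axis=0),
    lab.reshape(-1, 3).mean(axis=0),
    yiq.reshape(-1, 3).mean(axis=0),
])
print(features.shape)  # (15,)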
Performance evaluation metrics included precision, accuracy, recall, the Jaccard index, F1-scores, G-scores, zero-one loss, Cohen's kappa, Hamming loss, the Fowlkes-Mallows index, and the Matthews correlation coefficient (MCC).
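Most of these metrics are available in scikit-learn, as the sketch below shows on dummy labels. The G-score is computed here as the geometric mean of precision and recall, which is one common reading of that name; the study's exact definition is not spelled out in this summary, so that line is an assumption.

```python
# Minimal sketch: compute the evaluation metrics listed above on a test set.
import numpy as np
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, jaccard_score, zero_one_loss,
                             cohen_kappa_score, hamming_loss,
                             fowlkes_mallows_score, matthews_corrcoef)

def report(y_true, y_pred):
    precision = precision_score(y_true, y_pred, average="macro")
    recall = recall_score(y_true, y_pred, average="macro")
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "precision": precision,
        "recall": recall,
        "f1": f1_score(y_true, y_pred, average="macro"),
        "jaccard": jaccard_score(y_true, y_pred, average="macro"),
        "zero_one_loss": zero_one_loss(y_true, y_pred),
        "g_score": float(np.sqrt(precision * recall)),  # assumed definition
        "hamming_loss": hamming_loss(y_true, y_pred),
        "cohen_kappa": cohen_kappa_score(y_true, y_pred),
        "fowlkes_mallows": fowlkes_mallows_score(y_true, y_pred),
        "mcc": matthews_corrcoef(y_true, y_pred),
    }

# Demonstration with dummy labels for seven tongue-color classes
rng = np.random.default_rng(0)
y_true = rng.integers(0, 7, size=200)
y_pred = y_true.copy()
y_pred[rng.integers(0, 200, size=10)] = rng.integers(0, 7, size=10)
print(report(y_true, y_pred))
```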
Results
The findings indicated that XGBoost was the most accurate (98.7%), while the naïve Bayes technique had the lowest accuracy (91%). For XGBoost, an F1 score of 98% denoted an outstanding balance between recall and precision.
A 0.99 Jaccard index with 0.01 zero-one loss, a 0.92 G-score, 0.01 Hamming loss, 1.0 Cohen's kappa, 0.4 MCC, and a 0.98 Fowlkes-Mallows index indicated nearly perfect positive correlations, suggesting that XGBoost is highly reliable and effective for tongue analysis. XGBoost ranked first in precision, accuracy, F1 score, recall, and MCC.
Based on these findings, the researchers used XGBoost as the algorithm for the proposed tongue imaging tool, which is linked to a graphical user interface and predicts tongue color and associated disorders in real time.
The imaging system yielded positive results upon deployment, correctly detecting 58 of 60 tongue images for a detection accuracy of 96.6%.
A pink-colored tongue indicates good health, but other hues signify illness. Patients with yellow tongues were categorized as diabetic, whereas those with green tongues were diagnosed with mycotic diseases.
A blue tongue suggested asthma; a red tongue indicated coronavirus disease 2019 (COVID-19); a black tongue indicated the presence of fungiform papillae; and a white tongue indicated anemia.
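A small lookup table is enough to express this final diagnostic step; the sketch below simply mirrors the color-to-condition associations reported above, with anything unlisted falling back to "unrecognized". It is an illustration of the mapping, not the study's implementation.

```python
# Minimal sketch: map a predicted tongue color to the condition reported in the study.
COLOR_TO_CONDITION = {
    "pink": "healthy",
    "yellow": "diabetes",
    "green": "mycotic infection",
    "blue": "asthma",
    "red": "COVID-19",
    "black": "fungiform papillae",
    "white": "anemia",
}

def condition_for(color: str) -> str:
    return COLOR_TO_CONDITION.get(color.lower(), "unrecognized")

print(condition_for("Yellow"))  # diabetes
```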
Conclusions
Overall, the real-time imaging system using XGBoost yielded positive results upon deployment, with 96.6% diagnostic accuracy. These findings support the practicality of artificial intelligence systems for tongue diagnostics in medical applications, demonstrating that the method is secure, efficient, user-friendly, comfortable, and cost-effective.
Camera reflections might cause differences in observed colors, affecting diagnosis. Future studies should account for camera reflections and use more powerful image processing, filtering, and deep-learning approaches to increase accuracy. This method paves the way for extended tongue diagnostics in future point-of-care health systems.