PROJECT TITLE:
Human Visual System-Based Fundus Image Quality Assessment of Portable Fundus Camera Photographs
Telemedicine and the medical "big data" era in ophthalmology highlight the use of non-mydriatic ocular fundus photography, which has given rise to indispensable applications of portable fundus cameras. However, in the case of portable fundus photography, non-mydriatic image quality is more vulnerable to distortions, such as uneven illumination, color distortion, blur, and low contrast. Such distortions are called generic quality distortions. This paper proposes an algorithm capable of selecting images of fair generic quality, which would be especially useful to assist inexperienced individuals in collecting meaningful and interpretable data with consistency. The algorithm is based on three characteristics of the human visual system: multi-channel sensation, just noticeable blur, and the contrast sensitivity function, used to detect illumination and color distortion, blur, and low contrast distortion, respectively. A total of 536 retinal images, 280 from proprietary databases and 256 from public databases, were graded independently by one senior and two junior ophthalmologists, such that three partial measures of quality and generic overall quality were classified into two categories. Binary classification was implemented by a support vector machine and a decision tree, and receiver operating characteristic (ROC) curves were obtained and plotted to analyze the performance of the proposed algorithm. The experimental results show that the generic overall quality classification achieved a sensitivity of 87.45% at a specificity of 91.66%, with an area under the ROC curve of 0.9452, indicating the value of applying the algorithm, which is based on the human visual system, to assess the image quality of non-mydriatic photography, especially for low-cost ophthalmological telemedicine applications.
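As a minimal sketch of the evaluation protocol described above (binary quality classification with a support vector machine and a decision tree, measured by ROC AUC), the snippet below uses scikit-learn with synthetic placeholder features. The three feature columns only stand in for the three partial quality measures (illumination/color, blur, contrast); the actual HVS-based feature extraction from the paper is not reproduced here, and all numeric values in the sketch are assumptions for illustration.

```python
# Hedged sketch: SVM and decision-tree binary classification with ROC analysis,
# using synthetic stand-ins for the paper's HVS-based quality features.
import numpy as np
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)

# Synthetic features: 536 images (the abstract's dataset size), three
# placeholder quality measures per image. Labels are "acceptable" vs
# "unacceptable" generic overall quality, generated artificially here.
n = 536
X = rng.normal(size=(n, 3))
y = (X.sum(axis=1) + rng.normal(scale=0.5, size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# The two classifiers named in the abstract.
svm = SVC(probability=True, random_state=0).fit(X_tr, y_tr)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

svm_scores = svm.predict_proba(X_te)[:, 1]
tree_scores = tree.predict_proba(X_te)[:, 1]

svm_auc = roc_auc_score(y_te, svm_scores)
tree_auc = roc_auc_score(y_te, tree_scores)
print(f"SVM AUC: {svm_auc:.3f}, Decision tree AUC: {tree_auc:.3f}")

# ROC curve points: sensitivity = tpr, specificity = 1 - fpr, so an operating
# point such as the abstract's (sensitivity 87.45%, specificity 91.66%) is
# read off this curve at some threshold.
fpr, tpr, thresholds = roc_curve(y_te, svm_scores)
```

On real features the reported operating point would be chosen from the ROC curve by thresholding the classifier's score; with these synthetic labels the AUC values are meaningless beyond demonstrating the mechanics.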