Kullback Leibler Divergence for Image Quantitative Evaluation.
- Source :
- AIP Conference Proceedings. 2016, Vol. 1750, Issue 1, p1-9. 9p. 1 Black and White Photograph, 3 Diagrams, 1 Chart, 2 Graphs.
- Publication Year :
- 2016
Abstract
- Medical imaging has continued to expand, providing diagnostic information through different types of modalities. Currently, many modalities are available in the medical and surgical fields, such as Computed Tomography (CT) scans, Magnetic Resonance Imaging (MRI), X-rays (plain radiography), Positron Emission Tomography (PET) scans and ultrasonographic diagnostics (USG). These modalities are widely used in clinical diagnosis and in the development of research and education. In terms of image quality, qualitative analysis has traditionally been used to evaluate the quality of the output image from classification results. Through qualitative analysis, researchers can judge the precision of a detected lesion and hence calculate the accuracy of detection over the test cases. However, qualitative analysis is sometimes subjective, and verification by more than one radiologist is needed to confirm the classification results. Therefore, quantitative analysis is also needed to ensure that the results of a classification algorithm can be assessed objectively. In this study, we propose a pixel-based approach using the Kullback-Leibler (KL) divergence for assessing medical images. Unlike standard statistical analysis, evaluation using the KL divergence requires neither hypothesis testing nor construction of confidence intervals based on the mean and standard deviation. The proposed KL framework provides a descriptive measure for the purpose of summarizing data. Firstly, both the original and the computed images are normalized so that the sum of all intensities equals one. Then, the probability distribution is calculated column by column using the hist function, and each column is expressed as a data vector h_O = {h_O1, h_O2, h_O3, ..., h_Oi} for the original image and h_A = {h_A1, h_A2, h_A3, ..., h_Ai} for the computed image. In computing the probability distribution, the hist function bins the elements of each data vector h_O and h_A into 10 equally spaced containers and returns the number of elements in each container as a row vector. The results show that the proposed Kullback-Leibler divergence framework is promising for presenting better final images quantitatively. [ABSTRACT FROM AUTHOR]
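- For concreteness, below is a minimal NumPy sketch of the column-wise, pixel-based evaluation the abstract describes, using KL(P || Q) = sum_i p_i * log(p_i / q_i). The function names (columnwise_kl, kl_divergence), the shared histogram bin edges, and the epsilon smoothing of empty bins are illustrative assumptions; the paper itself uses MATLAB's hist with 10 equally spaced bins, and this sketch is not the authors' original code.

```python
# Illustrative sketch of the column-wise KL evaluation; assumptions noted above.
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    # KL(P || Q) = sum_i p_i * log(p_i / q_i); eps avoids log(0) for empty bins.
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()   # renormalize counts to probabilities
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def columnwise_kl(original, computed, bins=10):
    # Step 1: normalize each image so the sum of all intensities equals one,
    # as described in the abstract.
    o = original.astype(float) / original.sum()
    a = computed.astype(float) / computed.sum()
    scores = []
    for col_o, col_a in zip(o.T, a.T):
        # Step 2: bin each column into `bins` equally spaced containers
        # (the paper uses 10); shared edges keep the two histograms comparable.
        lo = min(col_o.min(), col_a.min())
        hi = max(col_o.max(), col_a.max())
        if hi <= lo:               # degenerate flat column
            hi = lo + 1e-12
        h_o, _ = np.histogram(col_o, bins=bins, range=(lo, hi))
        h_a, _ = np.histogram(col_a, bins=bins, range=(lo, hi))
        # Step 3: per-column KL divergence between the two distributions.
        scores.append(kl_divergence(h_o, h_a))
    return np.array(scores)

if __name__ == "__main__":
    # Example: compare a reference image with a lightly perturbed version.
    rng = np.random.default_rng(0)
    original = rng.random((64, 64))
    computed = np.clip(original + rng.normal(0.0, 0.05, original.shape), 0.0, 1.0)
    print("mean column-wise KL:", columnwise_kl(original, computed).mean())
```

- Lower scores indicate that the computed image's per-column intensity distributions stay close to those of the original, which is the sense in which the framework summarizes image quality descriptively rather than through hypothesis testing.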
Details
- Language :
- English
- ISSN :
- 0094-243X
- Volume :
- 1750
- Issue :
- 1
- Database :
- Academic Search Index
- Journal :
- AIP Conference Proceedings
- Publication Type :
- Conference
- Accession number :
- 116420506
- Full Text :
- https://doi.org/10.1063/1.4954516