PROJECT TITLE:
Blind Image Quality Assessment Based on Multichannel Feature Fusion and Label Transfer - 2016
In this paper, we propose an efficient blind image quality assessment (BIQA) algorithm, characterized by a new feature fusion scheme and a k-nearest-neighbor (KNN)-based quality prediction model. Our goal is to predict the perceptual quality of an image without any prior information about its reference image or distortion type. Since the reference image is inaccessible in many applications, BIQA is quite desirable in this context. In our method, a new feature fusion scheme is first introduced by combining an image's statistical information from multiple domains (i.e., discrete cosine transform, wavelet, and spatial domains) and multiple color channels (i.e., Y, Cb, and Cr). The predicted image quality is then generated by a nonparametric model known as label transfer (LT). Based on the assumption that similar images share similar perceptual qualities, we implement the LT with an image retrieval procedure, in which a query image's KNNs are searched for among a set of annotated images. The weighted average of the KNN labels (e.g., difference mean opinion score or mean opinion score) is used as the predicted quality score. The proposed method is simple and computationally appealing. Experimental results on three publicly available databases (i.e., LIVE II, TID2008, and CSIQ) show that the proposed method is highly consistent with human perception and outperforms several representative BIQA metrics.
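The label-transfer step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the fused multichannel features (DCT, wavelet, and spatial statistics over Y, Cb, and Cr) are assumed to be precomputed vectors, Euclidean distance is assumed as the similarity measure, and the inverse-distance weighting is a common choice, not necessarily the exact weighting used by the authors.

```python
import numpy as np

def label_transfer(query_feat, train_feats, train_scores, k=5, eps=1e-8):
    """Predict a quality score for one query image as the distance-weighted
    average of its k nearest annotated neighbors' scores (DMOS/MOS labels).

    query_feat   : (d,) fused feature vector of the query image (assumed given)
    train_feats  : (n, d) fused feature vectors of annotated images
    train_scores : (n,) subjective quality labels (e.g., DMOS)
    """
    # Euclidean distance between the query and every annotated image
    dists = np.linalg.norm(train_feats - query_feat, axis=1)
    # Indices of the k nearest neighbors
    nn = np.argsort(dists)[:k]
    # Inverse-distance weights: closer neighbors contribute more;
    # eps guards against division by zero for an exact match
    w = 1.0 / (dists[nn] + eps)
    return float(np.dot(w, train_scores[nn]) / w.sum())
```

For example, a query whose feature vector coincides with an annotated image is dominated by that image's label, so the prediction falls essentially on that neighbor's score.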