Semantic-aware blind image quality assessment
Abstract
Many studies have indicated that predicting users’ perception of visual quality depends on factors beyond artifact visibility alone, such as the viewing environment, social context, or user personality. Exploiting information on these factors, when available, can improve users’ quality of experience while saving resources. In this paper, we improve the performance of existing no-reference image quality metrics (NR-IQM) using image semantic information (scene and object categories), building on our previous findings that image scene and object categories influence user judgment of visual quality. We show that adding scene category features, object category features, or a combination of both to perceptual quality features yields significantly higher correlation with user judgment of visual quality. We also contribute a new publicly available image quality dataset that provides subjective scores on images covering a wide range of scene and object categories evenly. As most public image quality datasets to date span a limited set of semantic categories, this new dataset opens new possibilities for further exploring image semantics and quality of experience.
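The abstract does not specify the exact features or regressor used, so the following is only a minimal sketch of the general idea: concatenating semantic features (scene/object category descriptors) with perceptual quality features before regressing onto subjective scores, and comparing rank correlation with and without the semantic features. The synthetic data, feature dimensions, and choice of SVR are assumptions for illustration, not the paper's method.

import numpy as np
from scipy.stats import spearmanr
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n_images = 500

# Placeholder feature blocks: in practice these would come from an NR-IQM
# front end and from scene/object classifiers (all values here are synthetic).
quality_feats = rng.normal(size=(n_images, 36))   # perceptual quality features
scene_feats = rng.normal(size=(n_images, 10))     # scene-category features
object_feats = rng.normal(size=(n_images, 20))    # object-category features

# Placeholder subjective scores (MOS), loosely tied to the features so the
# example produces a non-trivial correlation.
mos = quality_feats[:, 0] + 0.5 * scene_feats[:, 0] + rng.normal(scale=0.1, size=n_images)

def evaluate(features, scores):
    """Train a regressor on the given features and report Spearman correlation."""
    X_tr, X_te, y_tr, y_te = train_test_split(features, scores, random_state=0)
    model = SVR(kernel="rbf", C=1.0).fit(X_tr, y_tr)
    rho, _ = spearmanr(model.predict(X_te), y_te)
    return rho

baseline = evaluate(quality_feats, mos)
combined = evaluate(np.hstack([quality_feats, scene_feats, object_feats]), mos)
print(f"quality-only SROCC: {baseline:.3f}, quality+semantics SROCC: {combined:.3f}")

On real data, the comparison of the two SROCC values is what would indicate whether semantic features add predictive value beyond perceptual quality features alone.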