Abstract
We consider the problem of ranking web image search results using human affects. To this end, we present a Probabilistic Affective Model (PAM) that predicts affects from the color compositions (CCs) of images, and we build a retrieval system on top of it. PAM first segments an image into seed regions, then extracts CCs among the seed regions and their neighbors, and finally infers numerical ratings of certain affects by comparing the extracted CCs with predefined, human-devised color triplets. The performance of the proposed system was studied at an online demonstration site where 52 users searched 16,276 landscape images using affects; the results demonstrate its effectiveness in affect-based image annotation and retrieval.
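The matching step of the pipeline, rating affects by comparing extracted color compositions against a human-devised triplet lexicon, could be sketched as below. The affect lexicon, the Euclidean distance measure, and the normalization are illustrative assumptions for this sketch, not the paper's actual probabilistic model.

```python
import math

# Hypothetical affect lexicon: each affect maps to a human-devised
# color triplet (three representative RGB colors). The colors here
# are made up for illustration only.
AFFECT_TRIPLETS = {
    "calm": [(120, 170, 200), (200, 220, 230), (90, 130, 160)],
    "warm": [(220, 140, 60), (240, 200, 120), (180, 90, 40)],
}

# Maximum possible RGB distance, sqrt(3 * 255^2), used to normalize.
MAX_DIST = math.sqrt(3 * 255 ** 2)


def color_dist(c1, c2):
    """Euclidean distance between two RGB colors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))


def triplet_dist(t1, t2):
    """Mean pairwise distance between two ordered color triplets."""
    return sum(color_dist(a, b) for a, b in zip(t1, t2)) / 3.0


def affect_ratings(image_triplets):
    """Rate each affect by the best (smallest-distance) match between
    any color composition extracted from the image and the affect's
    reference triplet; ratings are normalized to [0, 1], higher
    meaning a stronger predicted affect."""
    ratings = {}
    for affect, ref in AFFECT_TRIPLETS.items():
        best = min(triplet_dist(cc, ref) for cc in image_triplets)
        ratings[affect] = 1.0 - best / MAX_DIST
    return ratings
```

An image whose extracted CCs lie near the "calm" reference colors would then receive a higher "calm" rating than "warm", which is the kind of numerical affect score the retrieval system could rank by.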