Abstract: Web-based image search engines rely mostly on surrounding textual features. It is difficult for them to interpret users' search intentions from query keywords alone, which leads to ambiguous and noisy search results that do not satisfy users' expectations. To resolve this ambiguity in text-based image retrieval, we use the visual information of a query image. In this project, we implement a novel Internet image search approach in which the user is asked to click on a single query image, requiring minimal effort, and visually relevant images are retrieved from a large database. Our main goal is to capture the user's search intention from this one-click query image in the following steps. The user first submits a query keyword, and a pool of images is retrieved by text-based search. The user is then asked to select a query image from the pool, and the images in the pool are re-ranked by their color and texture similarities to the query image, computed using the Euclidean distance. A query-specific color similarity metric and a query-specific texture similarity metric are learned from the selected example and used to rank the images. These similarity metrics reflect the user's intention at a finer level, since every query image yields different metrics.
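The re-ranking step described above can be sketched as follows. This is a minimal illustration, not the project's implementation: the feature vectors, image identifiers, and function names are hypothetical, and the color/texture descriptors (e.g., CCV or TEFC features) are assumed to have already been extracted into fixed-length vectors.

```python
import math

def euclidean_distance(a, b):
    # Standard Euclidean distance between two feature vectors of equal length.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def rerank(query_features, pool):
    # pool: list of (image_id, feature_vector) pairs from the text-based search.
    # Images closest to the one-click query image come first.
    return sorted(pool, key=lambda item: euclidean_distance(query_features, item[1]))

# Hypothetical 3-dimensional feature vectors for illustration only.
query = [0.2, 0.5, 0.1]
pool = [("img1", [0.9, 0.1, 0.4]),
        ("img2", [0.25, 0.45, 0.15]),
        ("img3", [0.6, 0.6, 0.6])]

ranked = rerank(query, pool)
# img2 is nearest to the query vector, so it is ranked first.
```

In the full system, one such ranking would be produced per feature type (color and texture), each weighted by its learned query-specific metric before combining.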
Keywords: Content based image retrieval (CBIR), Color Coherence Vector (CCV), Texture Element Feature Characterization (TEFC), Pixels, Image, Cluster.