Our goal is an efficient algorithm for image retrieval based on relevance feedback. We assume the user is searching for a particular image in a database and responds to a sequence of machine queries by declaring which of two (or more) displayed images is ``closest'' to his target. Efficiency is measured by the average number of queries needed to locate the image. We introduce a Bayesian feedback model that accounts for considerable variation in user responses through a sequence of independent random metrics on feature space, whose distributions may depend on both the displayed images and the target. Each new query is chosen to minimize the expected conditional entropy of the distribution over targets given the previous responses. The resulting algorithm is demonstrated on shape and image retrieval, and its performance is compared with theoretical bounds and with previous models.
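To make the query-selection criterion concrete, the following is a minimal sketch, not the paper's implementation: it maintains a posterior over candidate targets, updates it from two-image comparisons, and greedily picks the pair whose answer minimizes the expected conditional entropy of that posterior. The database size, feature vectors, and the logistic response model are illustrative assumptions; the paper instead models user variability through random metrics on feature space, which this sketch approximates with a fixed metric plus response noise.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical database: N images, each summarized by a d-dim feature vector.
N, d = 200, 8
features = rng.normal(size=(N, d))

def response_prob(a, b, t, beta=4.0):
    """P(user says image a is closer than b | target t).
    Assumed logistic model on the difference of distances (an assumption,
    standing in for the paper's random-metric response model)."""
    da = np.linalg.norm(features[a] - features[t], axis=-1)
    db = np.linalg.norm(features[b] - features[t], axis=-1)
    return 1.0 / (1.0 + np.exp(beta * (da - db)))

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def expected_posterior_entropy(a, b, prior):
    """Expected conditional entropy of the target posterior after query (a, b)."""
    targets = np.arange(N)
    p_a = response_prob(a, b, targets)      # P(answer = a | target), all targets
    m_a = np.sum(prior * p_a)               # marginal probability of answer "a"
    post_a = prior * p_a / max(m_a, 1e-12)
    post_b = prior * (1 - p_a) / max(1 - m_a, 1e-12)
    return m_a * entropy(post_a) + (1 - m_a) * entropy(post_b)

def choose_query(prior, n_candidates=200):
    """Greedy selection: sample candidate pairs, keep the least expected entropy."""
    best, best_h = None, np.inf
    for _ in range(n_candidates):
        a, b = rng.choice(N, size=2, replace=False)
        h = expected_posterior_entropy(a, b, prior)
        if h < best_h:
            best, best_h = (a, b), h
    return best

# Simulated session: a noisy user answers queries about a hidden target.
target = 17
prior = np.full(N, 1.0 / N)
for step in range(10):
    a, b = choose_query(prior)
    says_a = rng.random() < response_prob(a, b, target)   # simulated response
    lik = response_prob(a, b, np.arange(N))
    lik = lik if says_a else 1 - lik
    prior = prior * lik                                    # Bayes update
    prior /= prior.sum()
    print(f"query {step}: entropy = {entropy(prior):.3f}, MAP = {prior.argmax()}")
\end{verbatim}

In this toy setting the posterior entropy typically drops by close to one bit per query, which is the information-theoretic limit for binary responses and the kind of theoretical bound against which the paper's performance is measured.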