
Technical Reports

1999

@techreport{MMS1999,
  vgclass     = {report},
  vgproject   = {viper,cbir},
  author      = {Henning M\"{u}ller and Wolfgang M\"{u}ller and
                 David McG.\ Squire and Thierry Pun},
  title       = {Performance Evaluation in Content-Based Image Retrieval:
                 Overview and Proposals},
  number      = {99.05},
  institution = {Computer Vision Group, Computing Centre, University of
                 Geneva},
  address     = {rue G\'{e}n\'{e}ral Dufour, 24, CH-1211 Gen\`{e}ve,
                 Switzerland},
  month       = dec,
  year        = {1999},
  url         = {{}/publications/postscript/1999/VGTR99.05_HMuellerWMuellerSquirePun.pdf},
  url1        = {{}/publications/postscript/1999/VGTR99.05_HMuellerWMuellerSquirePun.ps.gz},
  abstract    = {Evaluation of retrieval performance is a crucial problem
                 in content-based image retrieval (CBIR). Many different
                 methods for measuring the performance of a system have been
                 created and used by researchers. This article discusses the
                 advantages and shortcomings of the performance measures
                 currently used. Problems such as a common image database
                 for performance comparisons and a means of getting
                 relevance judgments (or ground truth) for queries are
                 explained. The relationship between CBIR and information
                 retrieval (IR) is made clear, since IR researchers have
                 decades of experience with the evaluation problem. Many of
                 their solutions can be used for CBIR, despite the
                 differences between the fields. Several methods used in
                 text retrieval are explained. Proposals for performance
                 measures and means of developing a standard test suite for
                 CBIR, similar to that used in IR at the annual Text
                 REtrieval Conference (TREC), are presented.},
}