SIGIR 2009 Workshop on The Future of IR Evaluation

23 July 2009

Call for Papers

Evaluation is at the core of information retrieval: virtually all progress owes directly or indirectly to test collections built within the so-called Cranfield paradigm. In recent years, however, IR researchers have routinely pursued tasks outside the traditional paradigm, taking a broader view of tasks, users, and context. Content is evolving rapidly, from traditional static text to diverse forms of dynamic, collaborative, and multilingual information sources. Industry, too, is embracing "operational" evaluation based on the analysis of endless streams of queries and clicks.

We invite the submission of papers that think outside the box.

The workshop brings together all stakeholders, ranging from those with novel evaluation needs, such as a PhD candidate pursuing a new IR-related problem, to senior IR evaluation experts. Desired outcomes are insight into how to make IR evaluation more "realistic," and at least one concrete idea for a retrieval track or task (at CLEF, INEX, NTCIR, or TREC) that would not have happened otherwise.

Help us shape the future of IR evaluation!


June 15, 2009: Deadline for Paper Submissions
  Prepare your 2-page PDF using the ACM format
  Submit online using EasyChair
July 2, 2009: Notification of Acceptance
  Details of accepted papers published online
July 8, 2009: Deadline for Camera-Ready Copies
July 23, 2009: SIGIR 2009 Workshop on the Future of IR Evaluation


This workshop will be held as part of the 32nd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, Boston, 2009. Information on Boston can be found on Wikipedia.