
Conference Report


D-Lib Magazine
October 2004

Volume 10 Number 10

ISSN 1082-9873

Cross-Language Evaluation Forum - CLEF 2004

15 - 17 September 2004, Bath, United Kingdom


Carol Peters



The results of the fifth campaign of the Cross-Language Evaluation Forum were presented at a two-and-a-half day workshop held in Bath, UK, 15-17 September, immediately following the eighth European Conference on Digital Libraries. The workshop was attended by nearly 100 researchers and system developers.


The main objectives of the Cross-Language Evaluation Forum (CLEF) are to stimulate the development of mono- and multilingual information retrieval systems for European languages and to contribute to the building of a research community in the multidisciplinary area of multilingual information access. These objectives are realised through the organisation of annual evaluation campaigns and workshops. Each campaign offers a series of evaluation tracks designed to test different aspects of mono- and cross-language system performance.

One of the principal objectives when CLEF began was to encourage developers to build multilingual retrieval systems capable of searching over collections in a number of languages. The multilingual track was thus the main track for several years and was made progressively more difficult. By CLEF 2003, when the track included a task that involved finding relevant documents in a collection containing documents in eight languages, we felt that we had achieved an important goal. We had shown that fully multilingual retrieval could be (almost) as effective as bilingual (L1 → L2) retrieval and that systems could be adapted and re-engineered rapidly and effectively to process new languages as the need arises. For this reason, in CLEF 2004 we decided to reduce the multilingual document retrieval activity to leave more space for other types of cross-language information retrieval experiments.

Six tracks were offered to evaluate the performance of systems for:

  • mono-, bi- and multilingual document retrieval on news collections (Ad-hoc)
  • mono- and cross-language domain-specific retrieval (GIRT)
  • interactive cross-language retrieval (iCLEF)
  • multiple language question answering (QA@CLEF)
  • cross-language retrieval on image collections (ImageCLEF)
  • cross-language spoken document retrieval (CL-SDR)

CLEF 2004 thus marked a turning point with respect to previous campaigns. The focus was no longer concentrated on multilingual document retrieval but was diversified to include different kinds of text retrieval across languages (exact answers in the question-answering track) and retrieval on different kinds of media (not just plain text but collections containing image and speech as well). In addition, increased attention was given to system usability and user satisfaction: tasks measuring the effectiveness of interactive systems or system components were included in both the QA and ImageCLEF tracks, in collaboration with the coordinators of iCLEF (the interactive track).

In order to cover all these activities, the CLEF test collection has been expanded considerably: the main comparable multilingual corpus now contains almost 2 million news documents in ten languages. A secondary collection used to test domain-specific system performance consists of the GIRT-4 collection of English and German social science documents. ImageCLEF used two distinct collections: a collection of historic photographs provided by St Andrews University, Scotland, and a collection of medical images with French and English case notes made available by the University Hospitals, Geneva. Finally, the cross-language spoken document retrieval track (CL-SDR) used speech transcriptions in English from the TREC-8 and TREC-9 SDR tracks, supplied by the National Institute of Standards and Technology (NIST), USA.

The response from the information retrieval community was very encouraging. Participation in this year's campaign was considerably higher than in the previous year, with 55 groups submitting results for one or more of the different tracks: 36 from Europe, 13 from North America, 4 from Asia, and one mixed European/Asian group. As in previous years, participants were a healthy mix of newcomers and veteran groups. The success of the question-answering and image retrieval tracks had a big impact on participation in CLEF 2004, not just with respect to the numbers but also regarding the skills and expertise involved. The popularity of question answering has meant that a growing number of participants have a natural language processing background, while the image and spoken document retrieval tasks have brought in groups with experience in diverse areas—including speech recognition, image processing and medical informatics—making CLEF an increasingly multidisciplinary forum.

The campaign culminated in the workshop held in Bath, UK, 15-17 September. In addition to presentations by participants in the campaign, talks included reports on the activities of the NTCIR evaluation initiative for Asian languages, and on industrial experience in building cross-language applications. The final session consisted of a panel in which panellists attempted to analyse the current organisation of the CLEF campaigns in depth, considering whether we are working on the right problems, choosing our investments wisely, and giving sufficient attention to the user perspective. Tracks taken into consideration for the CLEF 2005 campaign include multilingual Web retrieval and a cross-language Geographic Information Retrieval track.

The presentations given at the CLEF Workshops and detailed reports on the experiments of CLEF 2004 and previous years can be found on the CLEF website at <>.

CLEF is an activity of the DELOS Network of Excellence for Digital Libraries.

For more information, please contact:
Carol Peters, ISTI-CNR
CLEF Coordinator
Tel: +39 050 3152897
E-mail: <>.


Copyright © 2004 Carol Peters


