Latest revision as of 04:18, 6 December 2021


Event Rating

median: Pain1, worst: Pain5

List of all ratings can be found at CLEF 2021/rating

CLEF 2021
CLEF Conference and Labs of Evaluation Forum
Ordinal: 12
Event in series: CLEF
Dates: 2021-09-21 to 2021-09-24
Homepage: http://clef2021.clef-initiative.eu/
Twitter account: @clef_initiative
Submission link: https://www.easychair.org/conferences/?conf=clef2021

Location
Location: RO/B/Bucharest, RO/B, RO

Important dates
Papers: 2021-05-03
Submissions: 2021-05-03
Notification: 2021-06-04
Camera ready due: 2021-06-25
Committees
General chairs: Bogdan Ionescu, K. Selcuk Candan
PC chairs: Henning Müller, Lorraine Goeuriot, Birger Larsen

Topics

Relevant topics for the CLEF 2021 Conference include, but are not limited to:

  • Information Access in any language or modality: information retrieval, image retrieval, question answering, search interfaces and design, infrastructures, etc.
  • Analytics for Information Retrieval: theoretical and practical results in the analytics field that are specifically targeted for information access data analysis, data enrichment, etc.
  • User studies, based either on lab experiments or on crowdsourcing.
  • In-depth analysis of past results and runs, both statistical and fine-grained.
  • Evaluation initiatives: conclusions, lessons learned, impact and projection of any evaluation initiative after completing its cycle.
  • Evaluation: methodologies, metrics, statistical and analytical tools, component based, user groups and use cases, ground-truth creation, impact of multilingual/multicultural/multimodal differences, etc.
  • Technology transfer: economic impact/sustainability of information access approaches, deployment and exploitation of systems, use cases, etc.
  • Interactive Information Retrieval evaluation: the interactive evaluation of information retrieval systems using user-centered methods, evaluation of novel search interfaces, novel interactive evaluation methods, simulation of interaction, etc.
  • Specific application domains: Information access and its evaluation in application domains such as cultural heritage, digital libraries, social media, expert search, health information, legal documents, patents, news, books, plants, etc.
  • New data collections: presentation of new data collections with potentially high impact on future research, specific collections from companies or labs, multilingual collections.
  • Work on data from rare languages and on collaborative and social data.