
Search in the Catalogues and Directories

Hits 1 – 18 of 18

1. Comparative evaluation of cross-language information retrieval systems
   In: http://www.mt-archive.info/LREC-2004-Peters.pdf (2005) [BASE]
2. The Future of Evaluation for Cross-Language Information Retrieval Systems
   In: http://nlp.uned.es/pergamus/pubs/clef-lrec.pdf (2004) [BASE]
3. Eurospider at CLEF 2002
   In: http://www.clef-campaign.org/workshop2002/WN/15.pdf (2002) [BASE]
4. Experiments with the Eurospider Retrieval System for CLEF 2000
   In: http://www.ercim.org/publication/ws-proceedings/CLEF2/eurospider.pdf (2001) [BASE]
5. Experiments with the Eurospider Retrieval System for CLEF 2000
   In: http://www.iei.pi.cnr.it/DELOS/CLEF/braschler.pdf (2001) [BASE]
6. The Evaluation of Systems for Cross-Language Information Retrieval
   In: http://www.educat.hu-berlin.de/~kluck/lrec2000-paper-70.pdf (2000) [BASE]
7. The Eurospider Retrieval System and the TREC-8 Cross-Language Track
   In: http://www.cs.columbia.edu/~min/papers/eit_t8f.pdf (1999) [BASE]
8. SPIDER Retrieval System at TREC7
   In: http://trec.nist.gov/pubs/trec7/papers/ETHTREC7.pdf.gz (1999) [BASE]
9. The Eurospider Retrieval System and the TREC-8 Cross-Language Track
   In: http://trec.nist.gov/pubs/trec8/papers/eit_t8f.pdf (1999) [BASE]
10. SPIDER Retrieval System at TREC7
   In: http://trec.nist.gov/pubs/trec7/papers/ETHTREC7.ps (1999) [BASE]
   Abstract: This year the Zurich team participated in two tracks: the automatic ad hoc track and the cross-lingual track. For the ad hoc task we focused on improving retrieval for short queries, pursuing two aims. First, we investigated weighting functions for short queries, explicitly without any kind of automatic query expansion. Second, we developed rules that automatically decide for which queries automatic expansion works well and for which it does not. For the cross-language track, we approached the problem of retrieving documents from a multilingual document pool containing documents in all TREC CLIR languages. Our method uses individual runs for different language combinations, followed by merging their results into one final ranked list. We obtained good results without sophisticated machine translation or costly linguistic resources.
   URL: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.54.6540
   http://trec.nist.gov/pubs/trec7/papers/ETHTREC7.ps
11. From CLEF to TrebleCLEF: Promoting Technology Transfer for Multilingual Information Retrieval
   In: http://dis.shef.ac.uk/mark/publications/my_papers/DELOS_CLEFtoTrebleCLEF_20071116-d.pdf [BASE]
12. The CLEF Campaign
   In: http://research.nii.ac.jp/ntcir/workshop/OnlineProceedings2/Martin.pdf [BASE]
13. From Research to Application in Multilingual Information Access: the Contribution of Evaluation
   In: http://www.mt-archive.info/LREC-2008-Peters.pdf [BASE]
14. The CLEF Campaign
   In: http://www.mt-archive.info/NTCIR-2001-Braschler.pdf [BASE]
15. Conference on Multilingual and Multimodal Information Access Evaluation
   In: http://www.sigir.org/forum/2010D/conferences/2010d_sigirforum_agosti.pdf [BASE]
16. A PROMISE for Experimental Evaluation
   In: http://allan.hanbury.eu/lib/exe/fetch.php?media=ferro_et_al_promise.pdf [BASE]
17. From Research to Application in Multilingual Information Access: the Contribution of Evaluation
   In: http://www.lrec-conf.org/proceedings/lrec2008/pdf/913_paper.pdf [BASE]
18. The Future of Evaluation for Cross-Language Information Retrieval Systems
   In: http://www.lrec-conf.org/proceedings/lrec2004/pdf/285.pdf (2004) [BASE]

© 2013 – 2024 Lin|gu|is|tik