
Search in the Catalogues and Directories

Hits 1 – 4 of 4

1
C3: Continued Pretraining with Contrastive Weak Supervision for Cross Language Ad-Hoc Retrieval
BASE
2
Transfer Learning Approaches for Building Cross-Language Dense Retrieval Models
BASE
3
Goldilocks: Just-Right Tuning of BERT for Technology-Assisted Review
Abstract: Technology-assisted review (TAR) refers to iterative active learning workflows for document review in high recall retrieval (HRR) tasks. TAR research and most commercial TAR software have applied linear models such as logistic regression to lexical features. Transformer-based models with supervised tuning are known to improve effectiveness on many text classification tasks, suggesting their use in TAR. We indeed find that the pre-trained BERT model reduces review cost by 10% to 15% in TAR workflows simulated on the RCV1-v2 newswire collection. In contrast, we find that linear models outperform BERT for simulated legal discovery topics on the Jeb Bush e-mail collection. This suggests the match between transformer pre-training corpora and the task domain is of greater significance than generally appreciated. Additionally, we show that just-right language model fine-tuning on the task collection before starting active learning is critical. Too little or too much fine-tuning hinders performance, ...
Comment: 6 pages, 1 figure, accepted at ECIR 2022
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences; Information Retrieval (cs.IR)
URL: https://dx.doi.org/10.48550/arxiv.2105.01044
https://arxiv.org/abs/2105.01044
BASE
4
USNA: A Dual-Classifier Approach to Contextual Sentiment Analysis
In: DTIC (2013)
BASE

Sources: all 4 hits are open access documents (BASE); no matches in the catalogues, bibliographies, Linked Open Data catalogues, or other online resources.