
Search in the Catalogues and Directories

Hits 1 – 16 of 16

1
Multilingual Unsupervised Sentence Simplification
In: https://hal.inria.fr/hal-03109299 (2021)
2
Text Generation with and without Retrieval (French thesis title: "Génération de textes basés sur la connaissance avec et sans recherche", i.e. knowledge-based text generation with and without retrieval)
Fan, Angela. - : HAL CCSD, 2021
In: https://hal.univ-lorraine.fr/tel-03542634 ; Computer Science [cs]. Université de Lorraine, 2021. English. ⟨NNT : 2021LORR0164⟩
3
The FLORES-101 Evaluation Benchmark for Low-Resource and Multilingual Machine Translation
4
Tricks for Training Sparse Translation Models
5
Facebook AI WMT21 News Translation Task Submission
6
Findings of the AmericasNLP 2021 Shared Task on Open Machine Translation for Indigenous Languages of the Americas
Mager, Manuel; Oncevay, Arturo; Ebrahimi, Abteen. - : Association for Computational Linguistics, 2021
7
Alternative Input Signals Ease Transfer in Multilingual Machine Translation
Sun, Simeng; Fan, Angela; Cross, James. - : arXiv, 2021
8
AmericasNLI: Evaluating Zero-shot Natural Language Understanding of Pretrained Multilingual Models in Truly Low-resource Languages
Abstract: Pretrained multilingual models are able to perform cross-lingual transfer in a zero-shot setting, even for languages unseen during pretraining. However, prior work evaluating performance on unseen languages has largely been limited to low-level, syntactic tasks, and it remains unclear if zero-shot learning of high-level, semantic tasks is possible for unseen languages. To explore this question, we present AmericasNLI, an extension of XNLI (Conneau et al., 2018) to 10 indigenous languages of the Americas. We conduct experiments with XLM-R, testing multiple zero-shot and translation-based approaches. Additionally, we explore model adaptation via continued pretraining and provide an analysis of the dataset by considering hypothesis-only models. We find that XLM-R's zero-shot performance is poor for all 10 languages, with an average performance of 38.62%. Continued pretraining offers improvements, with an average accuracy of 44.05%. Surprisingly, training on poorly translated data by far outperforms all other ...
Comment: Accepted to ACL 2022
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/2104.08726
https://dx.doi.org/10.48550/arxiv.2104.08726
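The zero-shot setup described in this abstract can be sketched in a few lines: an XLM-R model fine-tuned on English NLI data is applied, unchanged, to premise/hypothesis pairs in an unseen target language, and accuracy is the fraction of pairs whose predicted label matches the gold label. A minimal sketch, assuming the Hugging Face transformers library and one publicly available XNLI fine-tune of XLM-R; the checkpoint name and example sentences are illustrative, not the paper's exact setup:

    # Minimal zero-shot NLI sketch; not the AmericasNLI authors' code.
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    # Illustrative checkpoint: an XLM-R model fine-tuned on XNLI.
    MODEL = "joeddav/xlm-roberta-large-xnli"
    tokenizer = AutoTokenizer.from_pretrained(MODEL)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL)

    def predict(premise: str, hypothesis: str) -> str:
        """Predict entailment/neutral/contradiction for one sentence pair."""
        inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
        with torch.no_grad():
            logits = model(**inputs).logits
        return model.config.id2label[int(logits.argmax(dim=-1))]

    # Zero-shot transfer: the model never saw the target language during
    # fine-tuning; dataset accuracy is the mean of per-pair correctness.
    print(predict("Premise sentence in the target language.",
                  "Hypothesis sentence in the target language."))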
9
Multilingual AMR-to-Text Generation
In: 2020 Conference on Empirical Methods in Natural Language Processing, Nov 2020, Punta Cana, Dominican Republic ; https://hal.archives-ouvertes.fr/hal-02999676 (2020)
10
Augmenting Transformers with KNN-Based Composite Memory for Dialog
In: Transactions of the Association for Computational Linguistics (EISSN: 2307-387X), The MIT Press, in press, ⟨10.1162/tacl_a_00356⟩ ; https://hal.archives-ouvertes.fr/hal-02999678 ; https://transacl.org/index.php/tacl (2020)
11
Multilingual Translation with Extensible Multilingual Pretraining and Finetuning
Tang, Yuqing; Tran, Chau; Li, Xian. - : arXiv, 2020
12
Nearest Neighbor Machine Translation
13
Multilingual AMR-to-Text Generation
Fan, Angela; Gardent, Claire. - : arXiv, 2020
14
Facebook AI's WMT20 News Translation Task Submission
15
Beyond English-Centric Multilingual Machine Translation
16
MUSS: Multilingual Unsupervised Sentence Simplification by Mining Paraphrases

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 16