
Search in the Catalogues and Directories

Hits 1 – 18 of 18

1. First Align, then Predict: Understanding the Cross-Lingual Ability of Multilingual BERT
In: https://hal.inria.fr/hal-03161685 ; 2021 (2021)
2. Can Multilingual Language Models Transfer to an Unseen Dialect? A Case Study on North African Arabizi
In: https://hal.inria.fr/hal-03161677 ; 2021 (2021)
3. First Align, then Predict: Understanding the Cross-Lingual Ability of Multilingual BERT
In: EACL 2021 - The 16th Conference of the European Chapter of the Association for Computational Linguistics, Apr 2021, Kyiv / Virtual, Ukraine ; https://hal.inria.fr/hal-03239087 ; https://2021.eacl.org/ (2021)
4. When Being Unseen from mBERT is just the Beginning: Handling New Languages With Multilingual Language Models
In: NAACL-HLT 2021 - 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Jun 2021, Mexico City, Mexico ; https://hal.inria.fr/hal-03251105 (2021)
5. Cross-Lingual GenQA: A Language-Agnostic Generative Question Answering Approach for Open-Domain Question Answering
6. First Align, then Predict: Understanding the Cross-Lingual Ability of Multilingual BERT
Abstract: Multilingual pretrained language models have demonstrated remarkable zero-shot cross-lingual transfer capabilities. Such transfer emerges by fine-tuning on a task of interest in one language and evaluating on a distinct language not seen during the fine-tuning. Despite promising results, we still lack a proper understanding of the source of this transfer. Using a novel layer ablation technique and analyses of the model's internal representations, we show that multilingual BERT, a popular multilingual language model, can be viewed as the stacking of two sub-networks: a multilingual encoder followed by a task-specific, language-agnostic predictor. While the encoder is crucial for cross-lingual transfer and remains mostly unchanged during fine-tuning, the task predictor has little importance for the transfer and can be reinitialized during fine-tuning. We present extensive experiments with three distinct tasks, seventeen typologically diverse languages, and multiple domains to support our hypothesis. (Accepted at EACL 2021.)
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/2101.11109
DOI: https://dx.doi.org/10.48550/arxiv.2101.11109
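The abstract above turns on a concrete mechanism: the upper layers of multilingual BERT behave as a task-specific predictor that can be reinitialized before fine-tuning, while the lower layers form the multilingual encoder that carries the cross-lingual transfer. As a rough illustration only (a sketch, not the authors' released code), the following Python snippet shows one way to re-initialize the top layers of mBERT with the HuggingFace transformers library. The split point (top 4 of 12 layers), the 17-label tagset, and the reliance on the model's internal _init_weights helper are assumptions made for this example.

```python
# Rough sketch (not the authors' code): re-initialise the top
# "task predictor" layers of multilingual BERT before fine-tuning,
# keeping the lower "multilingual encoder" layers pretrained.
from transformers import BertForTokenClassification

model = BertForTokenClassification.from_pretrained(
    "bert-base-multilingual-cased",
    num_labels=17,  # e.g. the 17-tag UPOS tagset (assumption for the example)
)

NUM_REINIT = 4  # how many of the 12 layers to treat as the task predictor (assumption)
for layer in model.bert.encoder.layer[-NUM_REINIT:]:
    # model._init_weights is the model's own (internal) weight initialiser;
    # applying it recursively resets the Linear and LayerNorm modules in
    # this transformer layer to fresh random values.
    layer.apply(model._init_weights)

# The remaining lower layers keep their pretrained multilingual weights;
# standard fine-tuning on the source-language task proceeds from here.
```

Fine-tuning such a model on the source-language task and evaluating it zero-shot on other languages would then indicate how much the re-initialized layers mattered for transfer, which is the comparison the paper's layer-ablation analysis is built around.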
7. When Being Unseen from mBERT is just the Beginning: Handling New Languages With Multilingual Language Models
NAACL 2021 ; Muller, Benjamin. - Underline Science Inc., 2021
8. Establishing a New State-of-the-Art for French Named Entity Recognition
In: LREC 2020 - 12th Language Resources and Evaluation Conference, May 2020, Marseille, France ; https://hal.inria.fr/hal-02617950 ; http://www.lrec-conf.org (2020)
9. Building a User-Generated Content North-African Arabizi Treebank: Tackling Hell
In: ACL 2020 - 58th Annual Meeting of the Association for Computational Linguistics, Jul 2020, Seattle / Virtual, United States ; https://hal.inria.fr/hal-02889804 ; ⟨10.18653/v1/2020.acl-main.107⟩ (2020)
10. CamemBERT: a Tasty French Language Model
In: ACL 2020 - 58th Annual Meeting of the Association for Computational Linguistics, Jul 2020, Seattle / Virtual, United States ; https://hal.inria.fr/hal-02889805 ; ⟨10.18653/v1/2020.acl-main.645⟩ (2020)
11. When Being Unseen from mBERT is just the Beginning: Handling New Languages With Multilingual Language Models
In: https://hal.inria.fr/hal-03109106 ; 2020 (2020)
12. Can Multilingual Language Models Transfer to an Unseen Dialect? A Case Study on North African Arabizi
13. When Being Unseen from mBERT is just the Beginning: Handling New Languages With Multilingual Language Models
14. Unsupervised Learning for Handling Code-Mixed Data: A Case Study on POS Tagging of North-African Arabizi Dialect
In: EurNLP - First annual EurNLP, Oct 2019, London, United Kingdom ; https://hal.archives-ouvertes.fr/hal-02270527 (2019)
15. CamemBERT: a Tasty French Language Model
In: https://hal.inria.fr/hal-02445946 ; 2019 (2019)
16. Enhancing BERT for Lexical Normalization
In: The 5th Workshop on Noisy User-generated Text (W-NUT), Nov 2019, Hong Kong, China ; https://hal.inria.fr/hal-02294316 (2019)
17. ELMoLex: Connecting ELMo and Lexicon features for Dependency Parsing
In: CoNLL 2018 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies, Oct 2018, Brussels, Belgium ; https://hal.inria.fr/hal-01959045 ; ⟨10.18653/v1/K18-2023⟩ (2018)
18. CoNLL 2018 Shared Task System Outputs
Zeman, Daniel; Potthast, Martin; Duthoo, Elie. - Charles University, Faculty of Mathematics and Physics, Institute of Formal and Applied Linguistics (UFAL), 2018

Hits by source type:
Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 18