
Search in the Catalogues and Directories

Hits 81–100 of 13,783

81. Adapting BigScience Multilingual Model to Unseen Languages ... (BASE)
82. On Efficiently Acquiring Annotations for Multilingual Models ... (BASE)
83. Team ÚFAL at CMCL 2022 Shared Task: Figuring out the correct recipe for predicting Eye-Tracking features using Pretrained Language Models ... (BASE)
84. Does Corpus Quality Really Matter for Low-Resource Languages? ... (BASE)
85. IIITDWD-ShankarB@ Dravidian-CodeMixi-HASOC2021: mBERT based model for identification of offensive content in south Indian languages ...
    Biradar, Shankar; Saumya, Sunil. arXiv, 2022 (BASE)
86. mSLAM: Massively multilingual joint pre-training for speech and text ...
    Bapna, Ankur; Cherry, Colin; Zhang, Yu. arXiv, 2022 (BASE)
87. On the Representation Collapse of Sparse Mixture of Experts ...
    Chi, Zewen; Dong, Li; Huang, Shaohan. arXiv, 2022 (BASE)
88. Politics and Virality in the Time of Twitter: A Large-Scale Cross-Party Sentiment Analysis in Greece, Spain and United Kingdom ... (BASE)
89. L3Cube-MahaHate: A Tweet-based Marathi Hate Speech Detection Dataset and BERT models ... (BASE)
90. Few-Shot Cross-lingual Transfer for Coarse-grained De-identification of Code-Mixed Clinical Texts ... (BASE)
91. A Unified Strategy for Multilingual Grammatical Error Correction with Pre-trained Cross-Lingual Language Model ...
    Sun, Xin; Ge, Tao; Ma, Shuming. arXiv, 2022 (BASE)
92. A New Generation of Perspective API: Efficient Multilingual Character-level Transformers ...
    Lees, Alyssa; Tran, Vinh Q.; Tay, Yi. arXiv, 2022 (BASE)
93. Factual Consistency of Multilingual Pretrained Language Models ...
    Abstract: Pretrained language models can be queried for factual knowledge, with potential applications in knowledge base acquisition and in tasks that require inference. However, for that we need to know how reliable this knowledge is, and recent work has shown that monolingual English language models lack consistency when predicting factual knowledge; that is, they fill in the blank differently for paraphrases describing the same fact. In this paper, we extend the analysis of consistency to a multilingual setting. We introduce a resource, mParaRel, and investigate (i) whether multilingual language models such as mBERT and XLM-R are more consistent than their monolingual counterparts, and (ii) whether such models are equally consistent across languages. We find that mBERT is as inconsistent as English BERT in English paraphrases, but that both mBERT and XLM-R exhibit a high degree of inconsistency in English, and even more so for all the other 45 languages. ...
    Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences; Machine Learning (cs.LG)
    URL: https://arxiv.org/abs/2203.11552
    DOI: https://dx.doi.org/10.48550/arxiv.2203.11552
    (BASE; see the probe sketch after this list)
94. Examining Scaling and Transfer of Language Model Architectures for Machine Translation ... (BASE)
95. MuMiN: A Large-Scale Multilingual Multimodal Fact-Checked Misinformation Social Network Dataset ... (BASE)
96. Mono vs Multilingual BERT for Hate Speech Detection and Text Classification: A Case Study in Marathi ... (BASE)
97. Agreement ...
    Tal, Shira. Open Science Framework, 2022 (BASE)
98. Agreement ...
    Tal, Shira. Open Science Framework, 2022 (BASE)
99. Natural Language Descriptions of Deep Visual Features ... (BASE)
100. From Examples to Rules: Neural Guided Rule Synthesis for Information Extraction ... (BASE)
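Entry 93's abstract describes a concrete procedure: fill a blank in several paraphrases of the same fact and check whether the model's predictions agree. Below is a minimal sketch of that idea, assuming the Hugging Face transformers fill-mask pipeline; the two English prompts are hand-written stand-ins for mParaRel's curated templates, not the paper's own evaluation code.

    # Paraphrase-consistency probe (sketch): query a masked LM with two
    # paraphrases of the same fact and check whether its top fill-in-the-blank
    # predictions agree. Prompts are illustrative, not mParaRel templates.
    from transformers import pipeline

    fill = pipeline("fill-mask", model="bert-base-multilingual-cased")

    # Two paraphrases expressing the same capital-of fact.
    paraphrases = [
        "The capital of France is [MASK].",
        "France's capital city is [MASK].",
    ]

    # Take each prompt's single best prediction for the masked slot.
    predictions = [fill(p, top_k=1)[0]["token_str"].strip() for p in paraphrases]

    for prompt, pred in zip(paraphrases, predictions):
        print(f"{prompt} -> {pred}")

    # The model counts as consistent on this fact only if all paraphrases agree.
    print("consistent:", len(set(predictions)) == 1)

mParaRel scales this comparison over many relation templates and over 46 languages (English plus 45 others); the inconsistency the abstract reports means that agreement across paraphrases frequently fails even within a single language.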

