
Search in the Catalogues and Directories

Hits 1–20 of 51 (page 1 of 3)

1. SOCIOFILLMORE: A Tool for Discovering Perspectives ... (BASE)
2. IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation ... Sarti, Gabriele; Nissim, Malvina. arXiv, 2022. (BASE)
3. Multilingual Pre-training with Language and Task Adaptation for Multilingual Text Style Transfer ... (BASE)
4. DALC: the Dutch Abusive Language Corpus ... (BASE)
5. Thank you BART! Rewarding Pre-Trained Models Improves Formality Style Transfer ... (BASE)
6. Adapting Monolingual Models: Data can be Scarce when Language Similarity is High ... (BASE)
7. Generic resources are what you need: Style transfer tasks without task-specific parallel training data ... (BASE)
8. Adapting Monolingual Models: Data can be Scarce when Language Similarity is High ... (BASE)
9. As Good as New. How to Successfully Recycle English GPT-2 to Make Models for Other Languages ... (BASE)
10. Teaching NLP with Bracelets and Restaurant Menus: An Interactive Workshop for Italian Students ... (BASE)
11. Teaching NLP with Bracelets and Restaurant Menus: An Interactive Workshop for Italian Students. Pannitto, Ludovica; Busso, Lucia; Combei, Claudia Roberta. Association for Computational Linguistics, 2021. (BASE)
12. What's so special about BERT's layers? A closer look at the NLP pipeline in monolingual and multilingual models ... (BASE)
Abstract: Peeking into the inner workings of BERT has shown that its layers resemble the classical NLP pipeline, with progressively more complex tasks concentrated in later layers. To investigate to what extent these results also hold for a language other than English, we probe a Dutch BERT-based model and the multilingual BERT model on Dutch NLP tasks. In addition, through a deeper analysis of part-of-speech tagging, we show that even within a given task, information is spread over different parts of the network, and the pipeline might not be as neat as it seems. Each layer has different specialisations, so it may be more useful to combine information from different layers rather than selecting a single one based on the best overall performance. Accepted at Findings of EMNLP 2020 (camera-ready).
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.2004.06499
https://arxiv.org/abs/2004.06499
13. Personal-ITY: A Novel YouTube-based Corpus for Personality Prediction in Italian ... (BASE)
14. Datasets and Models for Authorship Attribution on Italian Personal Writings ... (BASE)
15. Unmasking Contextual Stereotypes: Measuring and Mitigating BERT's Gender Bias ... (BASE)
16. Matching Theory and Data with Personal-ITY: What a Corpus of Italian YouTube Comments Reveals About Personality ... (BASE)
17. Unmasking Contextual Stereotypes: Measuring and Mitigating BERT's Gender Bias ... (BASE)
18. As Good as New. How to Successfully Recycle English GPT-2 to Make Models for Other Languages ... de Vries, Wietse; Nissim, Malvina. arXiv, 2020. (BASE)
19. Fair Is Better than Sensational: Man Is to Doctor as Woman Is to Doctor. In: Computational Linguistics, Vol 46, Iss 2, Pp 487–497 (2020). (BASE)
20. BERTje: A Dutch BERT Model ... (BASE)


© 2013 – 2024 Lin|gu|is|tik