
Search in the Catalogues and Directories

Hits 1 – 11 of 11

1. Knowledge Distillation for Quality Estimation ... (BASE)
2. Knowledge Distillation for Quality Estimation ... (BASE)
3. Controllable Text Simplification with Explicit Paraphrasing ... Alva-Manchego, Fernando; Maddela, Mounica. Underline Science Inc., 2021. NAACL 2021. (BASE)
4. The (Un)Suitability of Automatic Evaluation Metrics for Text Simplification ... (BASE)
5. Knowledge distillation for quality estimation. Gajbhiye, Amit; Fomicheva, Marina; Alva-Manchego, Fernando. Association for Computational Linguistics, 2021. (BASE)
6. deepQuest-py: large and distilled models for quality estimation. Alva-Manchego, Fernando; Obamuyide, Abiola; Gajbhiye, Amit. Association for Computational Linguistics, 2021. (BASE)
7. IAPUCP at SemEval-2021 task 1: Stacking fine-tuned transformers is almost all you need for lexical complexity prediction. Rivas Rojas, Kervy; Alva-Manchego, Fernando. Association for Computational Linguistics, 2021. (BASE)
8. The (un)suitability of automatic evaluation metrics for text simplification. Alva-Manchego, Fernando; Scarton, Carolina; Specia, Lucia. Association for Computational Linguistics, 2021. (BASE)
9. Controllable text simplification with explicit paraphrasing. Maddela, Mounica; Alva-Manchego, Fernando; Xu, Wei. Association for Computational Linguistics, 2021. (BASE)
10. deepQuest-py: large and distilled models for quality estimation. In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, pp. 382–389 (2021). (BASE)
11. Knowledge distillation for quality estimation. In: pp. 5091–5099 (2021). (BASE)

Hits by source: Catalogues 0 · Bibliographies 0 · Linked Open Data catalogues 0 · Online resources 0 · Open access documents 11
© 2013–2024 Lin|gu|is|tik