
Search in the Catalogues and Directories

Page: 1 2
Hits 1 – 20 of 39

1
IGLUE: A Benchmark for Transfer Learning across Modalities, Tasks, and Languages ...
2
Universal Dependencies 2.9
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. - : Universal Dependencies Consortium, 2021
3
Universal Dependencies 2.8.1
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. - : Universal Dependencies Consortium, 2021
4
Universal Dependencies 2.8
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. - : Universal Dependencies Consortium, 2021
5
Modelling Latent Translations for Cross-Lingual Transfer ...
6
Minimax and Neyman–Pearson Meta-Learning for Outlier Languages ...
7
Mind the Context: The Impact of Contextualization in Neural Module Networks for Grounding Visual Referring Expressions ...
8
Back-Training excels Self-Training at Unsupervised Domain Adaptation of Question Generation and Passage Retrieval ...
9
Minimax and Neyman–Pearson Meta-Learning for Outlier Languages ...
10
Visually Grounded Reasoning across Languages and Cultures ...
11
Visually Grounded Reasoning across Languages and Cultures ...
12
Visually Grounded Reasoning across Languages and Cultures ...
13
Universal Dependencies 2.7
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. - : Universal Dependencies Consortium, 2020
14
Universal Dependencies 2.6
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. - : Universal Dependencies Consortium, 2020
15
Words aren't enough, their order matters: On the Robustness of Grounding Visual Referring Expressions ...
16
MeDAL ...
Wen, Zhi; Lu, Xing Han; Reddy, Siva. - : Zenodo, 2020
Abstract: Medical Dataset for Abbreviation Disambiguation for Natural Language Understanding (MeDAL) is a large medical text dataset curated for abbreviation disambiguation, designed for natural language understanding pre-training in the medical domain. It was published at the ClinicalNLP workshop at EMNLP.
Citation:
@inproceedings{wen-etal-2020-medal,
    title = "{M}e{DAL}: Medical Abbreviation Disambiguation Dataset for Natural Language Understanding Pretraining",
    author = "Wen, Zhi and Lu, Xing Han and Reddy, Siva",
    booktitle = "Proceedings of the 3rd Clinical Natural Language Processing Workshop",
    month = nov,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.clinicalnlp-1.15",
    pages = "130--135",
}
License, Terms and Conditions: The ELECTRA model is licensed under Apache 2.0. The license for the ...
Keyword: deep learning; health science; natural language understanding
URL: https://dx.doi.org/10.5281/zenodo.4265633
https://zenodo.org/record/4265633
17
Universal Dependencies 2.5
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. - : Universal Dependencies Consortium, 2019
18
Universal Dependencies 2.4
Nivre, Joakim; Abrams, Mitchell; Agić, Željko. - : Universal Dependencies Consortium, 2019
19
CoQA: A Conversational Question Answering Challenge
In: Transactions of the Association for Computational Linguistics, Vol 7, pp. 249-266 (2019)
20
Universal Dependencies 2.2
In: https://hal.archives-ouvertes.fr/hal-01930733 (2018)


Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 39
© 2013 - 2024 Lin|gu|is|tik