
Search in the Catalogues and Directories

Hits 1 – 13 of 13

1
huggingface/datasets: 1.18.1 ...
BASE
2
huggingface/transformers: v4.4.0: S2T, M2M100, I-BERT, mBART-50, DeBERTa-v2, XLSR-Wav2Vec2 ...
BASE
3
huggingface/datasets: 1.16.0 ...
BASE
4
Motor constraints influence cultural evolution of rhythm
In: Proceedings of the Royal Society B: Biological Sciences 287 (1937), The Royal Society, 2020. ISSN: 0962-8452; EISSN: 1471-2954. DOI: 10.1098/rspb.2020.2001. https://jeannicod.ccsd.cnrs.fr/ijn_03085983
BASE
5
Transformers: State-of-the-Art Natural Language Processing ...
BASE
6
Transformers: State-of-the-Art Natural Language Processing ...
BASE
7
huggingface/transformers: ProphetNet, Blenderbot, SqueezeBERT, DeBERTa ...
Abstract: ProphetNet, Blenderbot, SqueezeBERT, DeBERTa. ProphetNet: Two new models are released as part of the ProphetNet implementation: ProphetNet and XLM-ProphetNet. ProphetNet is an encoder-decoder model that can predict n future tokens for "ngram" language modeling instead of just the next token. XLM-ProphetNet is an encoder-decoder model with an architecture identical to ProphetNet, but trained on the multilingual "wiki100" Wikipedia dump. The ProphetNet model was proposed in "ProphetNet: Predicting Future N-gram for Sequence-to-Sequence Pre-training" by Yu Yan, Weizhen Qi, Yeyun Gong, Dayiheng Liu, Nan Duan, Jiusheng Chen, Ruofei Zhang, and Ming Zhou on 13 Jan 2020. It was added to the library in PyTorch with the following checkpoints: microsoft/xprophetnet-large-wiki100-cased-xglue-ntg, microsoft/prophetnet-large-uncased, microsoft/prophetnet-large-uncased-cnndm, microsoft/xprophetnet-large-wiki100-cased, microsoft/xprophetnet-large-wiki100-cased-xglue-qg (a minimal loading sketch follows this entry). Contributions: ProphetNet #7157 (@qiweizhen, ...
URL: https://dx.doi.org/10.5281/zenodo.4110065
https://zenodo.org/record/4110065
BASE
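The release notes above name several ProphetNet checkpoints added to the transformers library. As a minimal sketch, assuming a recent transformers install with PyTorch (the example sentence and generation settings are illustrative, not from the release notes), one of the listed checkpoints might be loaded and used like this:

from transformers import ProphetNetForConditionalGeneration, ProphetNetTokenizer

# Load one of the checkpoints named in the release notes.
checkpoint = "microsoft/prophetnet-large-uncased"
tokenizer = ProphetNetTokenizer.from_pretrained(checkpoint)
model = ProphetNetForConditionalGeneration.from_pretrained(checkpoint)

# Encoder-decoder usage: encode a source text, then generate output tokens.
inputs = tokenizer(
    "ProphetNet predicts several future tokens at once instead of only the next one.",
    return_tensors="pt",
)
output_ids = model.generate(**inputs, num_beams=4, max_length=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))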
8
huggingface/transformers: Trainer, TFTrainer, Multilingual BART, Encoder-decoder improvements, Generation Pipeline ...
BASE
9
huggingface/pytorch-transformers: DistilBERT, GPT-2 Large, XLM multilingual models, bug fixes ...
BASE
10
Brüder, Geister und Fossilien : Eduard Mörikes Erfahrungen der Umwelt
Wolf, Thomas [author]. - Berlin/Boston: De Gruyter, 2001
DNB Subject Category Language
11
Forward pruning and other heuristic search techniques in tsume go
In: Information Sciences 122 (2000) 1, pp. 59-76. New York, NY: Elsevier Science Inc.
OLC Linguistik
12
Pustkuchen und Goethe : Die Streitschrift als produktives Verwirrspiel
Wolf, Thomas [author]. - Berlin/Boston: De Gruyter, 1999
DNB Subject Category Language
13
Reading reconsidered
In: Thought & language/language & reading (Cambridge, MA, 1980), p. 109-127
MPI für Psycholinguistik

Hits by source type: Catalogues 3 · Bibliographies 1 · Linked Open Data catalogues 0 · Online resources 0 · Open access documents 9