
Search in the Catalogues and Directories

Hits 1 – 20 of 253

1
A gentle introduction to Girard's Transcendental Syntax for the linear logician
In: https://hal.archives-ouvertes.fr/hal-02977750 ; 2022 (2022)
BASE
2
Learning and controlling the source-filter representation of speech with a variational autoencoder
In: https://hal.archives-ouvertes.fr/hal-03650569 ; 2022 (2022)
BASE
3
A Collection of Classroom Instruction ...
Thobias Sarbunan. - : Science Data Bank, 2022
BASE
4
Money Change_experiment -10000-FULL.rar ...
Tzouvelekas, Emmanuel. - : figshare, 2022
BASE
5
Money Change_experiment -10000-FULL.rar ...
Tzouvelekas, Emmanuel. - : figshare, 2022
BASE
6
On the Transferability of Pre-trained Language Models for Low-Resource Programming Languages ...
Chen, Fuxiang. - : Federated Research Data Repository / dépôt fédéré de données de recherche, 2022
Abstract: Pre-trained Language Models (PLM) such as CodeBERT and GraphCodeBERT, when trained on a large corpus of code, have recently displayed promising results in Software Engineering (SE) down-stream tasks. A PLM is most useful if it can be leveraged to improve the performance on code corpora written in low-resource programming languages, where training data is limited. In this work, our focus is on studying the impact of PLMs on a low-resource programming language corpus — specifically, we choose Ruby as the study subject. A recent study by Ahmed and Devanbu reported that using a corpus of code written in multilingual datasets to fine-tune multilingual PLMs achieves higher performance as opposed to using a corpus of code written in just one programming language. However, no analysis was made with respect to monolingual PLMs. Furthermore, some programming languages are inherently different and code written in one language usually cannot be interchanged with the others, i.e., Ruby and Java code possess very ...
Keyword: CodeBERT; Fine-tuned Models; Langages de programmation et génie logiciel, non classé ailleurs; PLM; Pre-trained Language Models; Programming language and software engineering, not elsewhere classified; Ruby
URL: https://dx.doi.org/10.20383/102.0563
https://www.frdr-dfdr.ca/repo/dataset/7c3eba54-7635-4459-9523-63508e613a06
BASE
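The abstract above discusses applying pre-trained code models such as CodeBERT to a low-resource programming language (Ruby). As a purely illustrative sketch, and not the dataset authors' pipeline, the snippet below shows one common way to load a pre-trained CodeBERT checkpoint with the Hugging Face Transformers library and embed a small Ruby function; the checkpoint name "microsoft/codebert-base", the toy Ruby function, and the use of the [CLS] vector as a representation are assumptions made here for illustration only.

    # Illustrative sketch only (not the dataset's own pipeline): embed a Ruby
    # snippet with a pre-trained CodeBERT checkpoint via Hugging Face Transformers.
    # Assumes `torch` and `transformers` are installed and the checkpoint is reachable.
    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("microsoft/codebert-base")
    model = AutoModel.from_pretrained("microsoft/codebert-base")
    model.eval()

    # A toy Ruby function standing in for code from a low-resource corpus.
    ruby_code = 'def greet(name)\n  "Hello, #{name}!"\nend'

    inputs = tokenizer(ruby_code, return_tensors="pt", truncation=True, max_length=256)
    with torch.no_grad():
        outputs = model(**inputs)

    # Take the first ([CLS]) token's hidden state as a fixed-size code embedding;
    # a downstream SE task would typically fine-tune a head on top of this.
    embedding = outputs.last_hidden_state[:, 0, :]
    print(embedding.shape)  # expected: torch.Size([1, 768])

Fine-tuning on a monolingual Ruby corpus versus a multilingual one, as studied in the record above, would reuse this same loading step and differ only in the training data and the task-specific head.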
7
Models of a Simple Sentence and Their Synonymy in the Tuvan Language (МОДЕЛИ ПРОСТОГО ПРЕДЛОЖЕНИЯ И ИХ СИНОНИМИЯ В ТУВИНСКОМ ЯЗЫКЕ)
Н.Ч. Серээдар. - : Мир науки, культуры, образования, 2022
BASE
8
LEXICAL AND SEMANTIC CHARACTERISTICS OF HYPONYMIC RELATIONS AND A DEEP ANALYSIS OF THEIR FEATURES IN ENGLISH LINGUISTICS ...
Jumaeva, Nasiba Komil Qizi. - : Academic research in educational sciences, 2022
BASE
9
Constrained control of gene-flow models
In: https://hal.archives-ouvertes.fr/hal-02373668 ; 2021 (2021)
BASE
10
Linguistic Resources for the automatic annotation of speech (version 6)
In: https://hal.archives-ouvertes.fr/hal-03468442 ; 2021 (2021)
BASE
11
Multiplicative Linear Logic from Logic Programs and Tilings
In: https://hal.archives-ouvertes.fr/hal-02895111 ; 2021 (2021)
BASE
12
A gentle introduction to Girard's Transcendental Syntax for the linear logician
In: https://hal.archives-ouvertes.fr/hal-02977750 ; 2021 (2021)
BASE
13
Stellar Resolution: Multiplicatives - for the linear logician, through examples
In: https://hal.archives-ouvertes.fr/hal-02977750 ; 2021 (2021)
BASE
14
A gentle introduction to Girard's Transcendental Syntax for the linear logician
In: https://hal.archives-ouvertes.fr/hal-02977750 ; 2021 (2021)
BASE
15
Stellar Resolution: Multiplicatives - for the linear logician, through examples
In: https://hal.archives-ouvertes.fr/hal-02977750 ; 2021 (2021)
BASE
16
A mathematical model of the vowel space
In: https://hal.archives-ouvertes.fr/hal-03384303 ; 2021 (2021)
BASE
17
A mathematical model of the vowel space
In: https://hal.archives-ouvertes.fr/hal-03384303 ; 2021 (2021)
BASE
18
Training RNN Language Models on Uncertain ASR Hypotheses in Limited Data Scenarios
In: https://hal.inria.fr/hal-03327306 ; 2021 (2021)
BASE
19
Beyond Facts - a Survey and Conceptualisation of Claims in Online Discourse Analysis
In: https://hal.mines-ales.fr/hal-03185097 ; 2021 (2021)
BASE
20
Neurodidactic Approach in the Methodology of Pedagogy (НЕЙРОДИДАКТИЧЕСКИЙ ПОДХОД В МЕТОДОЛОГИИ ПЕДАГОГИКИ)
Е.А. Местоева; М.Х. Мальсагова. - : Мир науки, культуры, образования, 2021
BASE


Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 253