1. Magic Dust for Cross-Lingual Adaptation of Monolingual wav2vec-2.0
In: ICASSP 2022, May 2022, Singapore ; https://hal.archives-ouvertes.fr/hal-03544515 (2022)
2. Cross-lingual few-shot hate speech and offensive language detection using meta learning
In: IEEE Access, IEEE, 2022, 10, pp. 14880-14896 ; ISSN: 2169-3536 ; https://hal.archives-ouvertes.fr/hal-03559484 ; ⟨10.1109/ACCESS.2022.3147588⟩ (2022)
3. Cross-Lingual Transfer Learning for Arabic Task-Oriented Dialogue Systems Using Multilingual Transformer Model mT5
In: Mathematics; Volume 10; Issue 5; Pages: 746 (2022)
4. Comparative Study of Multiclass Text Classification in Research Proposals Using Pretrained Language Models
In: Applied Sciences; Volume 12; Issue 9; Pages: 4522 (2022)
5. Leveraging Frozen Pretrained Written Language Models for Neural Sign Language Translation
In: Information; Volume 13; Issue 5; Pages: 220 (2022)
6. The Effects of Event Depictions in Second Language Phrasal Vocabulary Learning
7. StaResGRU-CNN with CMedLMs: a stacked residual GRU-CNN with pre-trained biomedical language models for predictive intelligence
Abstract:
Predictive biomedical intelligence requires strong professional experience and therefore cannot be separated from a large amount of external domain knowledge. Using transfer learning to obtain prior knowledge from massive biomedical text data is an efficient and convenient way to improve downstream predictive and decision-making models, but it has not been fully developed for Chinese Natural Language Processing (NLP) in the biomedical field. This study proposes a Stacked Residual Gated Recurrent Unit-Convolutional Neural Network (StaResGRU-CNN) combined with pre-trained language models (PLMs) for biomedical text-based predictive tasks. We explore transfer-learning paradigms in biomedical NLP that draw on external expert knowledge and compare several Chinese and English language models, identifying key issues that remain undeveloped or are difficult to apply in Chinese biomedicine. We therefore also propose a series of Chinese bioMedical Language Models (CMedLMs) with detailed evaluations on downstream tasks. Through transfer learning, these language models introduce prior knowledge that improves downstream performance and solves predictive NLP tasks specific to the Chinese biomedical field, in order to better serve predictive medical systems. In addition, we propose a free-form-text Electronic Medical Record (EMR)-based Disease Diagnosis Prediction task, which is used together with Clinical Named Entity Recognition and Biomedical Text Classification tasks to evaluate the analyzed language models. Our experiments show that introducing biomedical knowledge into the analyzed models significantly improves their performance on predictive biomedical NLP tasks of different granularity, and that our proposed model achieves competitive performance on these predictive intelligence tasks.
Keywords:
biomedical text mining; named entity recognition; natural language processing; pre-trained language model; predictive intelligence; text classification; transfer learning
URL: http://hdl.handle.net/10547/625294 ; https://doi.org/10.1016/j.asoc.2021.107975
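
Since this abstract describes the model's architecture, a brief illustration may help. Below is a minimal PyTorch sketch of the general pattern the title names, stacked GRU layers with residual connections feeding a CNN classification head; the layer sizes, layer count, and plain trainable embedding (standing in for the paper's pre-trained biomedical language models) are illustrative assumptions, not the authors' published configuration.

    # Minimal sketch of a stacked residual GRU-CNN text classifier (assumed
    # hyperparameters; the real model would draw its embeddings from a
    # pre-trained biomedical language model such as the proposed CMedLMs).
    import torch
    import torch.nn as nn

    class StackedResidualGRUCNN(nn.Module):
        def __init__(self, vocab_size, hidden=128, gru_layers=3, num_classes=10):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, hidden)  # stand-in for PLM embeddings
            # Each bidirectional GRU outputs 2 * (hidden // 2) = hidden features,
            # so a residual (skip) connection can be added at every layer.
            self.grus = nn.ModuleList(
                nn.GRU(hidden, hidden // 2, batch_first=True, bidirectional=True)
                for _ in range(gru_layers)
            )
            self.conv = nn.Conv1d(hidden, hidden, kernel_size=3, padding=1)
            self.fc = nn.Linear(hidden, num_classes)

        def forward(self, token_ids):                  # (batch, seq)
            x = self.embed(token_ids)                  # (batch, seq, hidden)
            for gru in self.grus:
                out, _ = gru(x)
                x = x + out                            # residual connection
            x = self.conv(x.transpose(1, 2))           # (batch, hidden, seq)
            x = torch.relu(x).max(dim=2).values        # global max-pool over time
            return self.fc(x)                          # (batch, num_classes)

    # Usage: classify two 16-token sequences into 10 hypothetical classes.
    model = StackedResidualGRUCNN(vocab_size=5000)
    logits = model(torch.randint(0, 5000, (2, 16)))
    print(logits.shape)  # torch.Size([2, 10])

The residual connections are what allow several recurrent layers to be stacked without degrading gradient flow, while the convolutional head extracts local n-gram features before pooling.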
8. An Empirical Study of Factors Affecting Language-Independent Models
9. Neural-based Knowledge Transfer in Natural Language Processing
10. Chinese Idioms: Stepping Into L2 Student’s Shoes
In: Acta Linguistica Asiatica, Vol 12, Iss 1 (2022)
11. Cross-lingual Representation Learning for Natural Language Processing
12. Hate speech and offensive language detection using transfer learning approaches
In: Document and Text Processing, Institut Polytechnique de Paris, 2021. English. ⟨NNT : 2021IPPAS007⟩ ; https://tel.archives-ouvertes.fr/tel-03276023 (2021)
13. Fostering teacher language awareness in a primary English-language immersion school in France: supporting teachers on the road to engaging students’ bilingual competencies
In: Language Awareness ; ISSN: 0965-8416 ; https://hal.univ-lorraine.fr/hal-03573322 ; in press (2021)
14. Investigating data sharing in speech recognition for an under-resourced language: the case of Algerian dialect
In: 7th International Conference on Natural Language Processing (NATP 2021), Mar 2021, Vienna, Austria ; https://hal.archives-ouvertes.fr/hal-03137048 (2021)
15. Automated audio captioning by fine-tuning BART with AudioSet tags
In: DCASE 2021 - 6th Workshop on Detection and Classification of Acoustic Scenes and Events, Nov 2021, Virtual, Spain ; https://hal.inria.fr/hal-03522488 (2021)
16. Improving Multilingual Models for the Swedish Language: Exploring Cross-Lingual Transferability and Stereotypical Biases
17. Automated Paraphrase Quality Assessment Using Language Models and Transfer Learning
In: Computers; Volume 10; Issue 12; Pages: 166 (2021)
18. Reusing Monolingual Pre-Trained Models by Cross-Connecting Seq2seq Models for Machine Translation
In: Applied Sciences; Volume 11; Issue 18 (2021)
19. Modeling phones, keywords, topics and intents in spoken languages