
Search in the Catalogues and Directories

Hits 1,241 – 1,255 of 1,255

1241  Dynamic and Multi-Channel Graph Convolutional Networks for Aspect-Based Sentiment Analysis ... (BASE)
1242  Structured Sentiment Analysis as Dependency Graph Parsing ... (BASE)
1243  Jointly Identifying Rhetoric and Implicit Emotions via Multi-Task Learning ... (BASE)
1244  Recursive prosody is not finite-state ... (BASE)
1245  Adapting Unsupervised Syntactic Parsing Methodology for Discourse Dependency Parsing ... (BASE)
1246  Representing Syntax and Composition with Geometric Transformations ... (BASE)
1247  Lower Perplexity is Not Always Human-Like ... (BASE)
1248  Surprisal Estimators for Human Reading Times Need Character Models ... (BASE)
1249  A Case Study of Analysis of Construals in Language on Social Media Surrounding a Crisis Event ... (BASE)
1250  Psycholinguistic Tripartite Graph Network for Personality Detection ... (BASE)
1251  Can Transformer Language Models Predict Psychometric Properties? ... (BASE)
1252  11B: Linguistic Theories, Cognitive Modeling and Psycholinguistics #1 ... (BASE)
1253  How is BERT surprised? Layerwise detection of linguistic anomalies ... (BASE)
1254  Catchphrase: Automatic Detection of Cultural References ... (BASE)
1255  Exploiting Language Relatedness for Low Web-Resource Language Model Adaptation: An Indic Languages Study ... (BASE)
Read paper: https://www.aclanthology.org/2021.acl-long.105
Abstract: Recent research in multilingual language models (LM) has demonstrated their ability to effectively handle multiple languages in a single model. This holds promise for low web-resource languages (LRL) as multilingual models can enable transfer of supervision from high resource languages to LRLs. However, incorporating a new language in an LM still remains a challenge, particularly for languages with limited corpora and in unseen scripts. In this paper we argue that relatedness among languages in a language family may be exploited to overcome some of the corpora limitations of LRLs, and propose RelateLM. We focus on Indian languages, and exploit relatedness along two dimensions: (1) script (since many Indic scripts originated from the Brahmic script), and (2) sentence structure. RelateLM uses transliteration to convert the unseen script of limited LRL text into the script of a Related Prominent Language (RPL) (Hindi in our case). While ...
Keyword: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL: https://dx.doi.org/10.48448/x7q0-ax50
https://underline.io/lecture/25996-exploiting-language-relatedness-for-low-web-resource-language-model-adaptation-an-indic-languages-study
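The transliteration idea the abstract describes can be illustrated with a small sketch. Because the Brahmic-derived scripts occupy structurally aligned Unicode blocks, a character in one Indic script can often be mapped to its Devanagari counterpart by a fixed code-point offset. The sketch below is a hypothetical illustration of that general idea only, not RelateLM's actual implementation; the Gujarati-to-Devanagari offset used here follows from the Unicode block layout.

```python
# Illustrative sketch: offset-based transliteration between aligned
# Brahmic Unicode blocks (hypothetical example, NOT RelateLM's code).
# Gujarati occupies U+0A80..U+0AFF; Devanagari occupies U+0900..U+097F.
# The two blocks share the same internal layout, so subtracting a fixed
# offset maps most Gujarati characters onto their Devanagari counterparts.

GUJARATI_START, GUJARATI_END = 0x0A80, 0x0AFF
OFFSET = 0x0A80 - 0x0900  # 0x180: distance between the two blocks

def gujarati_to_devanagari(text: str) -> str:
    """Map Gujarati characters to Devanagari by code-point offset.

    Characters outside the Gujarati block (digits, punctuation,
    whitespace) pass through unchanged.
    """
    out = []
    for ch in text:
        cp = ord(ch)
        if GUJARATI_START <= cp <= GUJARATI_END:
            out.append(chr(cp - OFFSET))
        else:
            out.append(ch)
    return "".join(out)

# Gujarati "ભારત" (Bhārata, "India") -> Devanagari "भारत"
print(gujarati_to_devanagari("ભારત"))
```

Note that a real transliteration pipeline needs more than a raw offset: a few code points exist in one block but not the other, so production tools carry exception tables on top of this mapping.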


Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 1,255
© 2013 - 2024 Lin|gu|is|tik