4. deepQuest-py: Large and Distilled Models for Quality Estimation

Abstract:
We introduce deepQuest-py, a framework for training and evaluation of large and light-weight models for Quality Estimation (QE). deepQuest-py provides access to (1) state-of-the-art models based on pre-trained Transformers for sentence-level and word-level QE; (2) light-weight and efficient sentence-level models implemented via knowledge distillation; and (3) a web interface for testing models and visualising their predictions. deepQuest-py is available at https://github.com/sheffieldnlp/deepQuest-py under a CC BY-NC-SA licence.
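The "light-weight ... models implemented via knowledge distillation" mentioned in the abstract follow the usual student–teacher pattern: a small student is trained on a large teacher's continuous quality scores. As a rough sketch only (the function names and the interpolated-loss formulation here are illustrative assumptions, not the deepQuest-py API):

```python
# Hypothetical sketch of a distillation objective for sentence-level QE,
# treated as regression: the student is fit to the teacher's soft scores,
# optionally mixed with the gold quality labels.

def mse(predictions, targets):
    """Mean squared error between two equal-length score lists."""
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(predictions)

def distillation_loss(student_preds, teacher_scores, gold_scores, alpha=0.5):
    """Interpolate matching the teacher (distillation term) with
    matching the gold scores (supervised term); alpha weights the two."""
    return (alpha * mse(student_preds, teacher_scores)
            + (1 - alpha) * mse(student_preds, gold_scores))
```

For example, `distillation_loss([0.5], [0.7], [0.9], alpha=0.5)` averages the squared gaps to the teacher (0.04) and to the gold label (0.16), giving 0.1.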
URLs:
https://doi.org/10.18653/v1/2021.emnlp-demo.42
https://orca.cardiff.ac.uk/147257/1/2021.emnlp-demo.42.pdf
https://orca.cardiff.ac.uk/147257/

Source: BASE
5. deepQuest-py: Large and Distilled Models for Quality Estimation

In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, pp. 382–389 (2021)

6. Knowledge Distillation for Quality Estimation

In: pp. 5091–5099 (2021)

7. Bilinear Fusion of Commonsense Knowledge with Attention-Based NLI Models ...

8. Enhancing the Reasoning Capabilities of Natural Language Inference Models with Attention Mechanisms and External Knowledge