1 | Towards the new construct of academic English in the digital age (BASE)
2 | Eye-tracking L2 students taking online multiple-choice reading tests: benefits and challenges
4 | Investigating the cognitive validity of EAP reading-into-writing test tasks: a pilot study
5 | Introduction of statistical analyses for language testing/learning research (Part 1)
6 | Comparing writing proficiency assessments used in professional medical registration: a methodology to inform policy and practice
8 | Research and practice in assessing academic reading: the case of IELTS
10 | Using eye-tracking research to inform language test validity and design
11 | Investigating the cognitive constructs measured by the Aptis writing test in the Japanese context: a case study
12 | Book review: Understanding second language processing: focus on processability theory
13 | Researching the comparability of paper-based and computer-based delivery in a high-stakes writing test
14 | Paper-based vs computer-based writing assessment: divergent, equivalent or complementary?

Abstract: Writing on a computer is now commonplace in most post-secondary educational contexts and workplaces, making research into computer-based writing assessment essential. This special issue of Assessing Writing includes a range of articles focusing on computer-based writing assessments. Some of these have been designed to parallel an existing paper-based assessment; others have been constructed as computer-based from the beginning. The selection of papers addresses various dimensions of the validity of computer-based writing assessment use in different contexts and across levels of L2 learner proficiency. First, three articles deal with the impact of the two delivery modes, paper-based or computer-based, on test takers’ processing and performance in large-scale high-stakes writing tests; next, two articles explore the use of online writing assessment in higher education; the final two articles evaluate the use of technologies to provide feedback to support learning.

Keywords: computer-based testing; English language assessment; writing; X162 Teaching English as a Foreign Language (TEFL)

URL: http://hdl.handle.net/10547/622736 https://doi.org/10.1016/j.asw.2018.04.001
15 | Some evidence of the development of L2 reading-into-writing skills at three levels
16 | Researching participants taking IELTS Academic Writing Task 2 (AWT2) in paper mode and in computer mode in terms of score equivalence, cognitive validity and other factors
17 | Developing rubrics to assess the reading-into-writing skills: a case study
18 | Demonstrating the cognitive validity and face validity of PTE Academic Writing items Summarize Written Text and Write Essay
19 | Researching the cognitive validity of GEPT high-intermediate and advanced reading: an eye-tracking and stimulated recall study
20 | Reviewing the suitability of English language tests for providing the GMC with evidence of doctors' English proficiency