
Search in the Catalogues and Directories

Hits 1 – 20 of 76

1
Évaluation de la perception des sons de parole chez les populations pédiatriques : réflexion sur les épreuves existantes [Assessment of speech sound perception in paediatric populations: reflections on existing tests]
In: Glossa (UNADREO - Union NAtionale pour le Développement de la Recherche en Orthophonie), 2022, 132, pp. 1-27. ISSN: 0298-6477; EISSN: 2117-7155. https://hal.archives-ouvertes.fr/hal-03646757 ; https://www.glossa.fr/index.php/glossa/article/view/1043
BASE
2
On topic validity in speaking tests
Khabbazbashi, Nahal. - : Cambridge University Press, 2022
BASE
3
The design and validation of an online speaking test for young learners in Uruguay: challenges and innovations
Khabbazbashi, Nahal; Nakatsuhara, Fumiyo; Inoue, Chihiro. - : Cranmore Publishing on behalf of the International TESOL Union, 2022
BASE
4
Digital representation for assessment of spoken EFL at university level: A case study in Vietnam
Vu, Thi Bich Hiep. - : Edith Cowan University, Research Online, Perth, Western Australia, 2021
In: Theses: Doctorates and Masters (2021)
Abstract: Assessing the speaking performance of students who are studying English as a Foreign Language (EFL) has mainly been conducted with face-to-face speaking tests. While such tests are undoubtedly interactive and authentic, they have been criticised for subjective scoring, inefficient test delivery, and the lack of recordings for later review. Over the last decade, technology has increasingly been integrated into speaking tests, an approach known as computer-assisted or computer-based assessment of speaking. Although this method is widely acknowledged to measure certain aspects of spoken language effectively, such as pronunciation and grammar, it has not yet proved a successful option for assessing interactive skills. An effective testing method should maintain the interactivity and authenticity of live speaking tests, deliver tests quickly and efficiently, and provide recordings of performances for multiple marking and review.

This study investigated digital representation of EFL speaking performance as a viable form of student assessment. The feasibility of digital representation has previously been examined in relation to authenticity and reliability in the assessment of different subjects in Western Australia, including Italian, Applied Information Technology, Engineering Studies, and Physical Education Studies. However, as far as the researcher is aware, no studies have yet assessed EFL speaking performance using digital representation. To bridge this gap, this study explored the feasibility of digital representation for assessing EFL speaking performance at a university in Vietnam, the researcher's home country.

Data collection was undertaken in two phases using a mixed methods approach. In Phase 1, data on English teachers' and students' perceptions of Computer-Assisted English Speaking Assessment (CAESA) were collected. Their perceptions were analysed in relation to the outcomes of a digital speaking assessment trial using the Oral Video Assessment Application (DMOVA). In Phase 2, student participants took an English speaking test while being videoed and audio recorded. English teachers invigilated and marked the trial test using the current method, followed by the digital method. Data were collected via Qualtrics surveys, interviews, observations and databases of student performance results. The feasibility of digital representation in assessing EFL speaking performance was analysed according to the Feasibility Analysis Framework developed by Kimbell, Wheeler, Miller, and Pollitt (2007).

The findings from Phase 1 indicated that both teachers and students had positive attitudes towards computer-assisted assessment (CAA). They were confident with computer-assisted English assessment (CAEA) and preferred this testing method to the current paper-and-pencil process. Both cohorts believed that CAEA enhanced the precision and fairness of assessments and was efficient in terms of resources. However, some participants were sceptical about the authenticity of computer-assisted EFL speaking tests because they failed to foster conversation and interaction in the same way as face-to-face assessments. Despite this scepticism, teachers and students indicated their willingness to trial DMOVA.

Phase 2 identified the feasibility dimensions of DMOVA. This method of digital assessment was perceived to enhance fairness, reliability and validity, with some correlations between the live interview and digital tests. Teachers found it easy to manage the speaking tests with DMOVA and recognised the logistical advantages it offered. DMOVA was also credited with generating positive washback effects on the learning, teaching and assessment of spoken English. In addition, the digital technology was compatible with the existing facilities at the university and required no support or advanced ICT knowledge. Overall, the benefits of the new testing method were perceived to outweigh its limitations.

The study confirmed that digital representation of EFL speaking performances for assessment would be beneficial for Vietnam for the following reasons: (a) it has the potential to enhance the reliability and accuracy of the current English speaking assessment method, (b) it retains evidence of students' performance for later assessment and review, and (c) it facilitates marking and administration. These changes could boost EFL teaching, learning and assessment, as witnessed in the trial, leading to increased motivation of teachers and students and, ultimately, enhancement of students' English communication skills. The findings also have implications for English speaking assessment policies and practices in Vietnam and other similar contexts where English is taught, spoken and assessed as a foreign language.
Keywords: Bilingual, Multilingual, and Multicultural Education; Computer-assisted assessment; Digital assessment; Digital representation; Education; Educational Assessment, Evaluation, and Research; EFL; English; Higher Education; Speaking; Speaking assessment
URL: https://ro.ecu.edu.au/theses/2412
https://ro.ecu.edu.au/cgi/viewcontent.cgi?article=3414&context=theses
BASE
5
Use of innovative technology in oral language assessment
Nakatsuhara, Fumiyo; Berry, Vivien. - : Taylor & Francis, 2021
BASE
6
Towards new avenues for the IELTS Speaking Test: insights from examiners’ voices
BASE
7
Video-conferencing speaking tests: do they measure the same construct as face-to-face tests?
BASE
8
The effects of extended planning time on candidates’ performance, processes and strategy use in the lecture listening-into-speaking tasks of the TOEFL iBT Test
Inoue, Chihiro; Lam, Daniel M. K. - : Wiley, 2021
BASE
9
Efficacy in Occupational Safety and Health Training of Dairy Workers: Predictors of Test Performance on a Dairy Safety Knowledge Test From a Demographic Cohort
BASE
10
Suitability of Blackboard as Learning Management System to assess oral competence: Students’ perceptions and results
Muñoz Alcón, Ana Isabel; Trullén Galve, Francisco. - : Editorial Universitat Politècnica de València, 2021
BASE
11
Secondary EFL Teachers’ Views Towards the Implementation of Peer Assessment: Between Opportunities and Challenges
In: ELT Worldwide: Journal of English Language Teaching, Vol 8, Iss 2, Pp 352-364 (2021)
BASE
12
Cognitive validity in the testing of speaking
Field, John. - 2020
BASE
13
Task parallelness: investigating the difficulty of two spoken narrative tasks
Inoue, Chihiro. - 2020
BASE
14
Opening the black box: exploring automated speaking evaluation. In: Issues in Language Testing Around the World: Insights for Language Test Users.
BASE
15
Re-engineering a speaking test used for university admissions purposes: considerations and constraints: the case of IELTS
Taylor, Lynda. - 2020
BASE
16
Analysing multi-person discourse in group speaking tests: how do test-taker characteristics, task types and group sizes affect co-constructed discourse in groups?
BASE
17
Investigating the use of language functions for validating speaking test specifications
Inoue, Chihiro. - 2020
BASE
18
The IELTS Speaking Test: what can we learn from examiner voices?
BASE
19
Academic speaking: does the construct exist, and if so, how do we test it?
BASE
20
Testing speaking skills: why and how?
BASE


Hits by source: Catalogues 1 · Bibliographies 1 · Linked Open Data catalogues 0 · Online resources 0 · Open access documents 75