University of Warwick
Publications service & WRAP

Designing and scaling level-specific writing tasks in alignment with the CEFR: a test-centered approach


Harsch, Claudia and Rupp, André Alexander (2011) Designing and scaling level-specific writing tasks in alignment with the CEFR: a test-centered approach. Language Assessment Quarterly, 8(1), pp. 1-33. doi:10.1080/15434303.2010.535575

The research output is not available from this repository; please contact the author.
Official URL: http://dx.doi.org/10.1080/15434303.2010.535575


Abstract

The Common European Framework of Reference (CEFR; Council of Europe, 2001) provides a competency model that is increasingly used as a point of reference for comparing language examinations. Nevertheless, aligning examinations to the CEFR proficiency levels remains a challenge. In this article, we propose a new, level-centered approach to designing and aligning writing tasks in line with the CEFR levels. Much work has been done on assessing writing via tasks spanning several levels of proficiency, but little research exists on a level-specific approach, where one task targets one specific proficiency level. In our study, situated in a large-scale assessment project where such a level-specific approach was employed, we investigate the influence of the design factors (tasks, assessment criteria, raters, and student proficiency) on the variability of ratings, using descriptive statistics, generalizability theory, and multifaceted Rasch modeling. Results show that the level-specific approach yields plausible inferences about task difficulty, rater harshness, rating criteria difficulty, and student distribution. Moreover, Rasch analyses show a high level of consistency between a priori task classifications in terms of CEFR levels and empirical task difficulty estimates. This allows for a test-centered approach to standard setting by suggesting empirically grounded cut-scores in line with the CEFR proficiency levels targeted by the tasks.
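The multifaceted (many-facet) Rasch model named in the abstract decomposes each rating into additive facet effects on a logit scale: student ability, task difficulty, rater severity, and criterion difficulty. As a minimal illustrative sketch of the dichotomous case (the function name and the example parameter values are hypothetical, not taken from the study, which uses a polytomous rating-scale formulation):

```python
import math

def mfrm_probability(ability, task_difficulty, rater_severity, criterion_difficulty):
    """Probability of a successful rating under a simple dichotomous
    many-facet Rasch model: logit = ability - task - rater - criterion."""
    logit = ability - task_difficulty - rater_severity - criterion_difficulty
    return 1.0 / (1.0 + math.exp(-logit))

# Hypothetical values: a student with ability 0.5 logits on a task of
# average difficulty (0.0), scored by a slightly severe rater (0.3)
# on an average-difficulty criterion (0.0):
p = mfrm_probability(0.5, 0.0, 0.3, 0.0)  # → ≈ 0.55
```

Because all facets enter the same linear predictor, calibrating them jointly places tasks, raters, criteria, and students on a common scale, which is what permits the empirically grounded cut-scores the abstract describes.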

Item Type: Journal Article
Subjects: L Education > LB Theory and practice of education
P Language and Literature > P Philology. Linguistics
Divisions: Faculty of Social Sciences > Centre for Applied Linguistics
Library of Congress Subject Headings (LCSH): CEFTrain Project, Language and languages -- Study and teaching, Language and languages -- Ability testing, Language and languages -- Study and teaching -- Germany, Language and languages -- Ability testing -- Germany
Journal or Publication Title: Language Assessment Quarterly
Publisher: Routledge
ISSN: 1543-4311
Official Date: 2011
Dates: 2011 (Published)
Volume: 8
Number: 1
Number of Pages: 33
Page Range: 1-33
DOI: 10.1080/15434303.2010.535575
Status: Peer Reviewed
Publication Status: Published
Access rights to Published version: Open Access

Data sourced from Thomson Reuters' Web of Knowledge


Email us: wrap@warwick.ac.uk