
Assessing vocabulary size through multiple-choice formats: Issues with guessing and sampling rates

Author

Summary, in English

In most tests of vocabulary size, knowledge is assessed through multiple-choice formats. Despite advantages such as ease of scoring, multiple-choice tests (MCTs) are accompanied by problems. One of the most central issues is guessing, together with other construct-irrelevant strategies, which can lead to overestimation of scores. A further challenge when designing vocabulary size tests is that of sampling rate: how many words constitute a representative sample of the underlying population of words that the test is intended to measure? This paper addresses these two issues through a case study based on data from a recent and increasingly used MCT of vocabulary size: the Vocabulary Size Test. Using a criterion-related validity approach, our results show that for multiple-choice items sampled from this test, there is a discrepancy between the test scores and the scores obtained from the criterion measure, and that a higher sampling rate would be needed to better represent knowledge of the underlying population of words. We offer two main interpretations of these results and discuss their implications for the construction and use of vocabulary size tests.
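
To illustrate the two issues raised in the summary, the following Python sketch (not part of the original study) simulates how blind guessing on multiple-choice items inflates an estimated vocabulary size, and how the spread of that estimate shrinks as the sampling rate increases. All figures used here are hypothetical assumptions for illustration only: a 20,000-word population, a learner who truly knows 8,000 of those words, and four answer options per item.

    import random

    def simulate_mct_estimates(true_known, sample_size, n_options=4,
                               population=20000, trials=1000):
        """Simulate extrapolated vocabulary-size estimates for a learner who
        truly knows `true_known` of `population` words, when `sample_size`
        words are sampled per test and unknown items are answered by blind
        guessing (hypothetical illustration, not the study's procedure)."""
        estimates = []
        for _ in range(trials):
            # Each sampled word is known with probability true_known / population.
            known = sum(1 for _ in range(sample_size)
                        if random.random() < true_known / population)
            unknown = sample_size - known
            # Blind guessing: each unknown item is correct with probability 1/n_options.
            guessed = sum(1 for _ in range(unknown)
                          if random.random() < 1 / n_options)
            # Extrapolate the raw score back to an estimated vocabulary size.
            estimates.append((known + guessed) / sample_size * population)
        return estimates

    if __name__ == "__main__":
        TRUE_SIZE = 8000  # hypothetical learner
        for sample_size in (100, 200, 400):
            est = simulate_mct_estimates(TRUE_SIZE, sample_size)
            mean = sum(est) / len(est)
            sd = (sum((x - mean) ** 2 for x in est) / len(est)) ** 0.5
            print(f"sample={sample_size:4d}  mean estimate={mean:7.0f}  "
                  f"SD={sd:5.0f}  overestimation={mean - TRUE_SIZE:+.0f}")

Under these assumptions the mean estimate sits well above the true size regardless of test length (guessing adds roughly one quarter of the unknown words), while increasing the number of sampled words mainly narrows the spread of the estimate rather than removing the bias.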

Department/s

Publishing year

2015

Language

English

Pages

278-306

Publication/Series

ITL: Instituut voor Toegepaste Linguïstiek

Volume

166

Issue

2

Document type

Journal article

Publisher

John Benjamins Publishing Company

Topic

  • Languages and Literature

Keywords

  • sampling rate
  • testing
  • validation
  • vocabulary size
  • guessing
  • multiple-choice test
  • criterion-related validity
  • assessment

Status

Published

ISBN/ISSN/Other

  • ISSN: 0019-0829