Abstract
Assessment is a useful process as it provides various stakeholders (e.g., teachers, parents, government, employers) with information about students' competence in a particular subject area. However, for the information generated by assessment to be useful, it needs to support valid inferences. One factor that can undermine the validity of inferences from assessment outcomes is the language of the assessment material. For example, the use of excessively complex grammar and difficult vocabulary in the formulation of test questions may prevent students from displaying their true knowledge and skills (e.g., students who are not native speakers of the target language). In an attempt to support teachers and test developers in designing linguistically accessible assessment material, this study explored practical ways of investigating the linguistic complexity of test questions at the level of both vocabulary (lexical complexity) and grammar (syntactic complexity). The study compiled three corpora of examination questions and undertook automated lexical and syntactic analyses of these questions using software packages that are typically employed in the field of corpus linguistics.
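The abstract does not specify which software packages or metrics the study used, but the kind of automated analysis it describes can be illustrated with a minimal sketch. The two measures below, type-token ratio (a common lexical-diversity proxy) and mean sentence length (a crude syntactic-complexity proxy), are illustrative assumptions, not the study's actual toolchain; the example question is invented.

```python
# Illustrative sketch only -- NOT the study's actual method or software.
# Two simple proxies for the linguistic complexity of an exam question:
#   - type-token ratio: distinct word forms / total tokens (lexical)
#   - mean sentence length: average tokens per sentence (syntactic)
import re

def type_token_ratio(text: str) -> float:
    """Ratio of distinct word forms to total word tokens."""
    tokens = re.findall(r"[a-zA-Z']+", text.lower())
    return len(set(tokens)) / len(tokens) if tokens else 0.0

def mean_sentence_length(text: str) -> float:
    """Average number of word tokens per sentence."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    if not sentences:
        return 0.0
    counts = [len(re.findall(r"[a-zA-Z']+", s)) for s in sentences]
    return sum(counts) / len(counts)

# Hypothetical exam question, for illustration only.
question = ("Explain why the enzyme denatures at high temperatures. "
            "Refer to the structure of the active site in your answer.")
print(round(type_token_ratio(question), 2))   # 0.89
print(round(mean_sentence_length(question), 1))  # 9.5
```

In practice, corpus-linguistics packages compute many richer measures (e.g., word-frequency bands for lexical difficulty, parse-tree depth for syntax), but the workflow is the same: tokenize each question, compute per-question metrics, and compare distributions across corpora.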
| Original language | English |
|---|---|
| Pages (from-to) | 10-16 |
| Number of pages | 7 |
| Journal | Research Matters |
| Issue number | 29 |
| Early online date | 1 Mar 2020 |
| DOIs | |
| Publication status | Published - 18 Oct 2020 |