Examining the Language Factor in Mathematics Assessments
Abstract
In educational testing, assessment specialists typically create multiple forms of a test for different purposes, such as increasing test security or developing an item bank. Using different types of items across forms is also a common way of creating alternative test forms. This study investigates whether word problems and mathematically expressed items can be used interchangeably despite their differing linguistic complexity. A sample of sixth-grade students was given two forms of a mathematics assessment. The first form included items presented through mathematical terms, expressions, and equations; the second form presented the same items as word problems. The underlying tasks and solutions of the items in the first form were identical to those of the corresponding items in the second form. Explanatory item response modeling was used to examine the impact of item type and gender on item difficulty and students' test scores. The results showed that word problems were easier than mathematically expressed items. Neither gender nor its interaction with the linguistic complexity of the mathematics items appeared to affect student performance on the test.
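The abstract does not report the model equations. As a rough illustration only, an explanatory item response model of the kind described, with an item-type covariate, a person-level gender covariate, and their interaction, might be written as the following Rasch-based sketch; the covariate coding here is hypothetical and not taken from the paper:

\[
\operatorname{logit} \Pr(Y_{pi} = 1)
  = \theta_p
  + \gamma_1 \,\mathrm{Gender}_p
  + \beta_1 \,\mathrm{WordProblem}_i
  + \beta_2 \,(\mathrm{Gender}_p \times \mathrm{WordProblem}_i)
  - \delta_i,
\qquad \theta_p \sim N(0, \sigma^2_\theta),
\]

where \(Y_{pi}\) is person \(p\)'s score on item \(i\), \(\theta_p\) is a person ability random effect, \(\delta_i\) is the baseline item difficulty, \(\beta_1\) captures the difficulty shift for word problems relative to mathematically expressed items, \(\gamma_1\) the gender effect, and \(\beta_2\) the gender-by-item-type interaction.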
Full Text: PDF DOI: 10.15640/jehd.v4n1a13