THE COMPARISON OF TWO TEST ITEM FORMATS IN ASSESSING STUDENTS’ READING COMPREHENSION ABILITY

Authors

  • Fidalia MR, SMAN 1 Kampar Utara, Kampar - Riau

DOI:

https://doi.org/10.22460/eltin.v2i2.p%25p

Abstract

This study compared two test item formats, short answer and multiple choice, in assessing students’ reading comprehension ability. The sample consisted of 36 eighth-grade students at a junior secondary school (SMP) in Riau. The study employed a quantitative design, and the data were collected through a reading comprehension test administered in both item formats. The scores obtained from the two formats were then analyzed and compared with a paired-samples t-test using IBM SPSS Statistics version 22. The findings showed that the two mean scores (short answer and multiple choice) differed by 4.5833 points; that the correlation between the results of the two formats was .626, which indicates a high correlation; and that, based on the comparison of the t value with the t table value and on the p value, H0 was accepted, meaning the choice between these two item formats has no significant effect on students’ scores when assessing their reading comprehension ability.

Keywords: comparison, testing, multiple choice, short answer, reading comprehension
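For readers who wish to replicate the analysis outside SPSS, the sketch below shows, under the assumption of made-up score data, how the mean difference, the Pearson correlation, and the paired-samples t-test could be computed in Python with SciPy. The variable names and scores are illustrative only and are not the study’s data.

```python
# Illustrative paired-samples t-test, analogous to the SPSS analysis described above.
# The score arrays below are hypothetical examples, not the study's actual data.
import numpy as np
from scipy import stats

# Hypothetical reading comprehension scores for the same students on both formats
short_answer = np.array([70, 65, 80, 75, 60, 85, 72, 68])
multiple_choice = np.array([75, 70, 78, 80, 66, 88, 74, 73])

# Mean difference between the two formats
mean_diff = multiple_choice.mean() - short_answer.mean()

# Pearson correlation between the paired scores
r, _ = stats.pearsonr(short_answer, multiple_choice)

# Paired-samples t-test: H0 = no difference between the two formats
t_stat, p_value = stats.ttest_rel(short_answer, multiple_choice)

print(f"mean difference = {mean_diff:.4f}")
print(f"correlation r = {r:.3f}")
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
if p_value > 0.05:
    print("H0 accepted: no significant difference between the two formats")
else:
    print("H0 rejected: the two formats differ significantly")
```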


Published

2014-10-19