Using a Constructed-Response Instrument to Explore the Effects of Item Position and Item Features on the Assessment of Students' Written Scientific Explanations
  • Authors: Meghan Rector Federer; Ross H. Nehm; John E. Opfer; Dennis Pearl
  • Keywords: Constructed response instrument; Item order effects; Item surface features; Scientific explanation; Evolution
  • Journal: Research in Science Education
  • Year: 2015
  • Publication date: August 2015
  • Volume: 45
  • Issue: 4
  • Pages: 527–553
  • Full-text size: 823 KB
  • References: American Association for the Advancement of Science [AAAS]. (1994). Benchmarks for science literacy. New York: Oxford University Press.
    American Association for the Advancement of Science [AAAS]. (2011). Vision and change in undergraduate biology education. Washington, DC. http://visionandchange.org/
    Anderson, D. L., Fisher, K. M., & Norman, G. J. (2002). Development and evaluation of the conceptual inventory of natural selection. Journal of Research in Science Teaching, 39, 952–978.
    Bennett, R. E., & Ward, W. C. (1993). Construction versus choice in cognitive measurement: issues in constructed response, performance testing, and portfolio assessment. Hillsdale, NJ: Lawrence Erlbaum Associates.
    Berland, L. K., & McNeill, K. L. (2012). For whom is argument and explanation a necessary distinction? A response to Osborne and Patterson. Science Education, 96(5), 808–813.
    Birney, D. P., Halford, G. S., & Andrews, G. (2006). Measuring the influence of complexity on relational reasoning: the development of the Latin Square Task. Educational & Psychological Measurement, 66(1), 146–171.
    Bishop, B., & Anderson, C. (1990). Student conceptions of natural selection and its role in evolution. Journal of Research in Science Teaching, 27, 415–427.
    Bridgeman, B. (1992). A comparison of quantitative questions in open-ended and multiple-choice formats. Journal of Educational Measurement, 29(3), 253–271.
    Caleon, I. S., & Subramaniam, R. (2010). Do students know what they know and what they don't know? Using a four-tier diagnostic test to assess the nature of students' alternative conceptions. Research in Science Education, 40, 313–337.
    Catley, K. M., Phillips, B. C., & Novick, L. R. (2013). Snakes, eels, and dogs! Oh my! Evaluating high-school students' tree-thinking skills: an entry point to understanding evolution. Research in Science Education, 43(6), 2327–2348.
    Chi, M. T. H., Feltovich, P. J., & Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121–152.
    Clough, E. E., & Wood-Robinson, C. (1985). How secondary students interpret instances of biological adaptation. Journal of Biological Education, 19, 125–130.
    Clough, E. E., & Driver, R. (1986). A study of consistency in the use of students' conceptual frameworks across different task contexts. Science Education, 70(4), 473–496.
    Cronbach, L. J. (1988). Five perspectives on validity argument. In H. Wainer & H. I. Braun (Eds.), Test validity. Hillsdale, NJ: Lawrence Erlbaum.
    Duschl, R. A., Schweingruber, H. A., & Shouse, A. W. (2007). Taking science to school: learning and teaching science in grades K-8. Washington, DC: National Academies Press.
    Friedman, M. (1974). Explanation and scientific understanding. Journal of Philosophy, 71(1), 5–19.
    Garvin-Doxas, K., & Klymkowsky, M. W. (2008). Understanding randomness and its impact on student learning: lessons learned from building the Biology Concept Inventory (BCI). CBE Life Sciences Education, 7(2), 227–233.
    Gotwals, A. W., & Songer, N. B. (2010). Reasoning up and down a food chain: using an assessment framework to investigate students' middle knowledge. Science Education, 94, 259–281.
    Gray, K. E. (2004). The effect of question order on student responses to multiple choice physics questions. Master's thesis, Kansas State University. Retrieved from http://web.phys.ksu.edu/dissertations/
    Griffiths, T. L., Steyvers, M., & Firl, A. (2007). Google and the mind: predicting fluency with PageRank. Psychological Science, 18, 1069–1076.
    Gulacar, O., & Fynewever, H. (2010). A research methodology for studying what makes some problems difficult to solve. International Journal of Science Education, 32(16), 2167–2184.
    Hempel, C., & Oppenheim, P. (1948). Studies in the logic of explanation. Philosophy of Science, 15, 135–175.
    Holland, P. W., & Dorans, N. J. (2006). Linking and equating. In R. L. Brennan (Ed.), Educational measurement (4th ed., pp. 187–220). Westport, CT: American Council on Education and Praeger.
    Jensen, P., Watanabe, H. K., & Richters, J. E. (1999). Who's up first? Testing for order effects in structured interviews using a counterbalanced experimental design. Journal of Abnormal Child Psychology, 27(6), 439–445.
    Kampourakis, K., & Zogza, V. (2008). Students' intuitive explanations of the causes of homologies and adaptations. Science & Education, 17, 27–47.
    Kang, S. H. K., McDermott, K. B., & Roediger, H. L. (2007). Test format and corrective feedback modify the effect of testing on long-term retention. European Journal of Cognitive Psychology, 19, 528–558.
    Kelemen, D. (2012). Teleological minds: how natural intuitions about agency and purpose influence learning about evolution. In K. S. Rosengren, S. K. Brem, E. M. Evans, & G. M. Sinatra (Eds.), Evolution challenges: integrating research and practice in teaching and learning about evolution. New York: Oxford University Press.
  • Author affiliations: Meghan Rector Federer (1)
    Ross H. Nehm (2)
    John E. Opfer (3)
    Dennis Pearl (4)

    1. Department of Teaching and Learning, Ohio State University, 333 Arps Hall, 1945 North High Street, Columbus, OH, 43210, USA
    2. Center for Science and Mathematics Education, Stony Brook University, Stony Brook, NY, 11794, USA
    3. Department of Psychology, Ohio State University, Columbus, OH, 43210, USA
    4. Department of Statistics, Ohio State University, Columbus, OH, 43210, USA
  • Journal category: Humanities, Social Sciences and Law
  • Journal subjects: Education; Science Education
  • Publisher: Springer Netherlands
  • ISSN: 1573-1898
Abstract
A large body of work has been devoted to reducing assessment biases that distort inferences about students' science understanding, particularly in multiple-choice instruments (MCIs). Constructed-response instruments (CRIs), however, have invited much less scrutiny, perhaps because of their reputation for avoiding many of the documented biases of MCIs. In this study we explored whether known biases of MCIs (specifically, item sequencing and surface feature effects) were also apparent in a CRI designed to assess students' understanding of evolutionary change using written explanation (Assessment of COntextual Reasoning about Natural Selection [ACORNS]). We used three versions of the ACORNS CRI to investigate different aspects of assessment structure and their effects on inferences about student understanding. Our results identified several sources of (and solutions to) assessment bias in this practice-focused CRI. First, sequences of items with similar surface features produced greater sequencing effects than sequences of items with dissimilar surface features. Second, a counterbalanced design (i.e., a Latin Square) mitigated this bias at the population level of analysis. Third, ACORNS response scores were highly correlated with student verbosity, despite verbosity being an intrinsically trivial aspect of explanation quality. Our results suggest that as assessments in science education shift toward the measurement of scientific practices (e.g., explanation), it is critical that biases inherent in these types of assessments be investigated empirically.
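
The abstract describes two methodological points that are easy to make concrete: counterbalancing item order with a Latin Square, and checking whether scores track response length. The Python sketch below is a minimal, hypothetical illustration (it is not the authors' code; the item labels, word counts, and rubric scores are invented): latin_square_orders builds a cyclic Latin Square so that every item appears in every serial position exactly once across test versions, and statistics.correlation computes a verbosity-score correlation.

    # Hypothetical sketch of the two checks described in the abstract;
    # not the study's actual materials. Requires Python 3.10+ for
    # statistics.correlation.
    from statistics import correlation

    def latin_square_orders(items):
        """Return len(items) orderings forming a cyclic Latin Square:
        each item occupies each serial position exactly once across versions."""
        n = len(items)
        return [[items[(row + pos) % n] for pos in range(n)] for row in range(n)]

    # Four invented ACORNS-style item labels; each test version gets one row.
    for version in latin_square_orders(["A", "B", "C", "D"]):
        print(version)
    # ['A', 'B', 'C', 'D']
    # ['B', 'C', 'D', 'A']
    # ['C', 'D', 'A', 'B']
    # ['D', 'A', 'B', 'C']

    # Invented word counts and rubric scores for six responses, showing how
    # a verbosity-score correlation of the kind reported would be computed.
    word_counts = [12, 45, 80, 33, 60, 25]
    scores = [1, 3, 5, 2, 4, 2]
    print(f"r = {correlation(word_counts, scores):.2f}")  # high r flags the confound

Because each item occupies each serial position equally often across the versions, position effects cancel in aggregate even when individual students still show them, which is the sense in which a counterbalanced design mitigates this bias at the population level.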
