International Journal of Social Science & Economic Research

Title:
THE EFFECTIVENESS OF COGNITIVE STRUCTURE ANALYSIS IN ASSESSING STUDENTS’ KNOWLEDGE OF THE SCIENTIFIC METHOD

Authors:
Mahnoor Ahmad and John Leddo

MyEdMaster, LLC, Leesburg, Virginia, USA

MLA 8
Ahmad, Mahnoor, and John Leddo. "THE EFFECTIVENESS OF COGNITIVE STRUCTURE ANALYSIS IN ASSESSING STUDENTS’ KNOWLEDGE OF THE SCIENTIFIC METHOD." International Journal of Social Science and Economic Research, vol. 8, no. 8, Aug. 2023, pp. 2397-2410, doi.org/10.46609/IJSSER.2023.v08i08.020. Accessed Aug. 2023.
APA 6
Ahmad, M., & Leddo, J. (2023, August). THE EFFECTIVENESS OF COGNITIVE STRUCTURE ANALYSIS IN ASSESSING STUDENTS’ KNOWLEDGE OF THE SCIENTIFIC METHOD. International Journal of Social Science and Economic Research, 8(8), 2397-2410. Retrieved from https://doi.org/10.46609/IJSSER.2023.v08i08.020
Chicago
Ahmad, Mahnoor, and John Leddo. "THE EFFECTIVENESS OF COGNITIVE STRUCTURE ANALYSIS IN ASSESSING STUDENTS’ KNOWLEDGE OF THE SCIENTIFIC METHOD." International Journal of Social Science and Economic Research 8, no. 8 (August 2023): 2397-2410. Accessed August 2023. https://doi.org/10.46609/IJSSER.2023.v08i08.020.

References

[1]. Anderson, J.R. (1982). Acquisition of cognitive skill. Psychological Review, 89, 369-405.
[2]. Chaoui, N. (2011). Finding relationships between multiple-choice math tests and their stem-equivalent constructed responses. CGU Theses & Dissertations, Paper 21.
[3]. de Ayala, R.J. (2009). The theory and practice of item response theory. New York: The Guilford Press.
[4]. de Kleer, J. and Brown, J.S. (1981). Mental models of physical mechanisms and their acquisition. In J.R. Anderson (Ed.), Cognitive skills and their acquisition. Hillsdale, NJ: Erlbaum.
[5]. Elbrink, L., & Waits, B. (1970). A statistical analysis of multiple-choice examinations in mathematics. The Two-Year College Mathematics Journal, 1(1), 25-29.
[6]. Frary, R. (1985). Multiple-choice versus free-response: A simulation study. Journal of Educational Measurement, 22, 21-31.
[7]. Herman, J.L., Klein, D.C., Heath, T.M., & Wakai, S.T. (1994). A first look: Are claims for alternative assessment holding up? (CSE Tech. Rep. No. 391). Los Angeles: University of California, Center for Research on Evaluation, Standards, and Student Testing.
[8]. Leddo, J., Clark, A., and Clark, E. (2021). Self-assessment of understanding: We don’t always know what we know. International Journal of Social Science and Economic Research, 6(6), 1717-1725.
[9]. Leddo, J., Cohen, M.S., O'Connor, M.F., Bresnick, T.A., and Marvin, F.F. (1990). Integrated knowledge elicitation and representation framework (Technical Report 90-3). Reston, VA: Decision Science Consortium, Inc.
[10]. Leddo, J. and Sak, S. (1994). Knowledge assessment: Diagnosing what students really know. Presented at Society for Technology and Teacher Education.
[11]. Leddo, J., Zhang, Z. and Pokorny, R. (1998). Automated performance assessment tools. Proceedings of the Interservice/Industry Training Systems and Education Conference. Arlington, VA: National Training Systems Association.
[12]. Liang, I. and Leddo, J. (2020). An intelligent tutoring system-style assessment software that diagnoses the underlying causes of students’ mathematical mistakes. International Journal of Advanced Educational Research, 5(5), 26-30.
[13]. National Council of Teachers of Mathematics (2000). Principles and standards for school mathematics. Reston, VA: NCTM.
[14]. Newell, A. and Simon, H.A. (1972). Human problem solving. Englewood Cliffs, NJ: Prentice-Hall.
[15]. O'Neil Jr., H., & Brown, R. (1997). Differential effects of question formats in math assessment on metacognition and affect. Applied Measurement in Education, 331-351.
[16]. Quillian, M.R. (1966). Semantic memory. Cambridge, MA: Bolt, Beranek and Newman.
[17]. Schank, R.C. (1982). Dynamic memory: A theory of learning in computers and people. New York: Cambridge University Press.
[18]. Schank, R.C. and Abelson, R.P. (1977). Scripts, plans, goals, and understanding. Hillsdale, NJ: Erlbaum.

ABSTRACT:
Assessing students’ knowledge has long been a key part of education, helping to determine how well students have learned particular concepts. In the past, assessments have focused on whether students give correct answers to problems, implying that the number of correctly answered test items is a valid measure of how much students know. However, this emphasis on correct answers has led to the neglect of assessments that could provide diagnostic feedback to teachers and educators about which concepts students have mastered, where the gaps in their knowledge are, and how to remediate them. Such a framework could greatly benefit classrooms and day-to-day teaching practices. The present paper describes an assessment technique called Cognitive Structure Analysis (CSA), derived from John Leddo’s integrated knowledge structure framework (INKS; Leddo et al., 1990), which combines several prominent knowledge representation frameworks from cognitive psychology. While CSA has been shown to be useful in mathematics, it has not been tested in other disciplines. The current paper tests whether the framework can be used to assess students’ knowledge in science, focusing on a specific scientific topic: the scientific method. Using a Google Form, students were assessed on four types of knowledge considered the basis of mastery of scientific method concepts: factual, procedural, strategic, and rationale. Students responded to queries, each type of knowledge was scored, and a combined knowledge score was computed. Students were then given real Advanced Placement-style problems to solve, which generated a problem-solving score. Correlations between each knowledge component score and problem-solving performance were moderate, and the correlation between overall CSA-assessed knowledge and problem-solving performance was .63. Factual, strategic, and rationale knowledge showed statistically significant correlations with problem-solving performance, but procedural knowledge did not. Future research can investigate CSA’s effectiveness across other scientific topics and other subjects, and whether incorporating CSA into day-to-day classroom instruction can lead to higher student achievement and more efficient learning practices.
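To make the analysis described in the abstract concrete, the sketch below shows one way the reported correlations could be computed: each student receives a score per knowledge component, the components are combined into an overall CSA knowledge score, and each score series is correlated with problem-solving performance. The data values are invented for illustration, and the unweighted averaging of the four components into a combined score is an assumption, not a method stated in the abstract.

```python
from math import sqrt

# Hypothetical, illustrative data: one row per student, with rubric-based
# scores (0-1) for the four CSA knowledge types and an AP-style
# problem-solving score. These numbers are invented for this sketch.
records = [
    # (factual, procedural, strategic, rationale, problem_solving)
    (0.9, 0.7, 0.8, 0.6, 0.85),
    (0.6, 0.8, 0.5, 0.4, 0.55),
    (0.8, 0.6, 0.9, 0.7, 0.80),
    (0.4, 0.5, 0.3, 0.5, 0.40),
    (0.7, 0.9, 0.6, 0.8, 0.70),
]

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

components = ["factual", "procedural", "strategic", "rationale"]
solving = [r[4] for r in records]

# Correlate each knowledge component with problem-solving performance.
for i, name in enumerate(components):
    scores = [r[i] for r in records]
    print(f"{name:>10} vs problem solving: r = {pearson_r(scores, solving):.2f}")

# Combined CSA knowledge score: unweighted mean of the four components
# (an assumed combination rule, used here only for illustration).
combined = [sum(r[:4]) / 4 for r in records]
print(f"  combined vs problem solving: r = {pearson_r(combined, solving):.2f}")
```

In practice a statistics package would also report significance levels for each correlation, which is how the abstract distinguishes the factual, strategic, and rationale components (significant) from the procedural component (not significant).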
