International Journal of Social Science & Economic Research

Title:
COGNITIVE STRUCTURE ANALYSIS: A TECHNIQUE FOR ASSESSING WHAT STUDENTS KNOW, NOT JUST HOW THEY PERFORM

Authors:
John Leddo, Shangzhi Li and Yujie Zhang


MyEdMaster, LLC
John Leddo is the director of research at MyEdMaster.

MLA 8
Leddo, John, et al. "COGNITIVE STRUCTURE ANALYSIS: A TECHNIQUE FOR ASSESSING WHAT STUDENTS KNOW, NOT JUST HOW THEY PERFORM." Int. j. of Social Science and Economic Research, vol. 7, no. 11, Nov. 2022, pp. 3716-3726, doi.org/10.46609/IJSSER.2022.v07i11.011. Accessed Nov. 2022.

References

[1]. Anderson, J.R. (1982). Acquisition of cognitive skill. Psychological Review, 89, 369-405.
[2]. Chaoui, N. (2011). Finding relationships between multiple-choice math tests and their stem-equivalent constructed responses. CGU Theses & Dissertations, Paper 21.
[3]. de Ayala, R. J. (2009). The theory and practice of item response theory. New York: The Guilford Press.
[4]. de Kleer, J. and Brown, J.S. (1981). Mental models of physical mechanisms and their acquisition. In J.R. Anderson (Ed.), Cognitive skills and their acquisition. Hillsdale, NJ: Erlbaum.
[5]. Elbrink, L., & Waits, B. (Spring, 1970). A statistical analysis of multiple-choice examinations in mathematics. The Two-Year College Mathematics Journal, 1(1), 25-29.
[6]. Frary, R. (Spring, 1985). Multiple-Choice Versus Free-Response: A Simulation Study. Journal of Educational Measurement, 22, 21-31.
[7]. Herman, J. L., Klein, D. C., Heath, T. M., & Wakai, S. T. (1994). A first look: Are claims for alternative assessment holding up? (CSE Tech. Rep. No. 391). Los Angeles: University of California, Center for Research on Evaluation, Standards, and Student Testing.
[8]. Leddo, J., Clark, A., and Clark, E. (2021). Self-assessment of understanding: We don’t always know what we know. International Journal of Social Science and Economic Research, 6(6), 1717-1725.
[9]. Leddo, J., Cohen, M.S., O'Connor, M.F., Bresnick, T.A., and Marvin, F.F. (1990). Integrated knowledge elicitation and representation framework (Technical Report 90-3). Reston, VA: Decision Science Consortium, Inc.
[10]. Leddo, J. and Sak, S. (1994). Knowledge assessment: Diagnosing what students really know. Presented at Society for Technology and Teacher Education.
[11]. Leddo, J., Zhang, Z. and Pokorny, R. (1998). Automated performance assessment tools. Proceedings of the Interservice/Industry Training Systems and Education Conference. Arlington, VA: National Training Systems Association.
[12]. Liang, I. and Leddo, J. (2021). An intelligent tutoring system-style assessment software that diagnoses the underlying causes of students’ mathematical mistakes. International Journal of Advanced Educational Research, 5(5), 26-30.
[13]. National Council of Teachers of Mathematics (2000). Principles and Standards for School Mathematics. Reston, VA: NCTM.
[14]. Newell, A. and Simon, H.A. (1972). Human problem solving. Englewood Cliffs, NJ: Prentice Hall.
[15]. O'Neil Jr., H., & Brown, R. (1997). Differential Effects Of Question Formats In Math Assessment On Metacognition And Affect. Applied Measurement in Education, 331-351.
[16]. Quillian, M.R. (1966). Semantic memory. Cambridge, MA: Bolt, Beranek and Newman.
[17]. Schank, R.C. and Abelson, R.P. (1977). Scripts, Plans, Goals, and Understanding. Hillsdale, NJ: Erlbaum.
[18]. Schank, R.C. (1982). Dynamic Memory: A theory of learning in computers and people. New York: Cambridge University Press.

ABSTRACT:
Assessment has long been a key part of education, playing the role of determining how much students have learned. Traditionally, assessments have focused on whether students give the correct answers to problems, implying that the number of correctly answered test items is a valid measure of how much students know. Unfortunately, this focus on correct answers has also neglected the potential of assessments to provide diagnostic feedback to educators about which concepts students have mastered and where the gaps in their knowledge lie, feedback that could inform day-to-day teaching. The present paper describes an assessment technique called Cognitive Structure Analysis (CSA), derived from John Leddo's integrated knowledge structure framework (Leddo et al., 1990), which combines several prominent knowledge representation frameworks in cognitive psychology. Using a Google Form, students were queried on four types of knowledge considered the basis of mastery of Algebra 1 concepts: factual, procedural, strategic, and rationale. From students' responses to these queries, measures of each type of knowledge and a combined knowledge score were created. Students were also given problems to solve. Correlations between each knowledge component score and problem-solving performance were high, and the correlation between overall CSA-assessed knowledge and problem-solving performance was a near-perfect .966. The results suggest that CSA can be both easily implemented and highly diagnostic of student learning needs. Future research can investigate CSA's robustness across other subjects and whether incorporating CSA into day-to-day classroom instruction leads to higher student achievement.
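The scoring and correlation analysis the abstract describes can be sketched in a few lines of Python. The student records below are illustrative values, not the study's data, and the unweighted averaging of the four component scores into a combined CSA score is an assumption; the abstract does not specify the weighting scheme.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-student scores in [0, 1] for each CSA knowledge type,
# plus problem-solving performance ("solved"). Illustrative only.
students = [
    {"factual": 0.90, "procedural": 0.85, "strategic": 0.80, "rationale": 0.75, "solved": 0.88},
    {"factual": 0.60, "procedural": 0.55, "strategic": 0.50, "rationale": 0.45, "solved": 0.52},
    {"factual": 0.75, "procedural": 0.70, "strategic": 0.65, "rationale": 0.60, "solved": 0.70},
    {"factual": 0.40, "procedural": 0.45, "strategic": 0.35, "rationale": 0.30, "solved": 0.38},
]

TYPES = ("factual", "procedural", "strategic", "rationale")
solved = [s["solved"] for s in students]

# Correlate each knowledge component with problem-solving performance.
for t in TYPES:
    print(f"{t}: r = {pearson([s[t] for s in students], solved):.3f}")

# Combined CSA score: unweighted mean of the four component scores.
combined = [sum(s[t] for t in TYPES) / len(TYPES) for s in students]
print(f"combined: r = {pearson(combined, solved):.3f}")
```

With real data, each component's correlation would be inspected separately to locate gaps (e.g. strong procedural but weak rationale knowledge), which is the diagnostic use the paper argues for.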
