International Journal of Social Science & Economic Research

Title:
COGNITIVE STRUCTURE ANALYSIS: ASSESSING STUDENTS’ KNOWLEDGE OF PRECALCULUS

Authors:
Lebei Nobel Zhou and John Leddo


Lebei Nobel Zhou and John Leddo
MyEdMaster, LLC, 205 Colleen Ct., Leesburg, Virginia, United States of America

MLA 8
Leddo, John, and Lebei Nobel Zhou. "COGNITIVE STRUCTURE ANALYSIS: ASSESSING STUDENTS’ KNOWLEDGE OF PRECALCULUS." Int. j. of Social Science and Economic Research, vol. 8, no. 9, Sept. 2023, pp. 2826-2836, doi.org/10.46609/IJSSER.2023.v08i09.025. Accessed Sept. 2023.
APA 6
Leddo, J., & Zhou, L. (2023, September). COGNITIVE STRUCTURE ANALYSIS: ASSESSING STUDENTS’ KNOWLEDGE OF PRECALCULUS. Int. j. of Social Science and Economic Research, 8(9), 2826-2836. Retrieved from https://doi.org/10.46609/IJSSER.2023.v08i09.025
Chicago
Leddo, John, and Lebei Nobel Zhou. "COGNITIVE STRUCTURE ANALYSIS: ASSESSING STUDENTS’ KNOWLEDGE OF PRECALCULUS." Int. j. of Social Science and Economic Research 8, no. 9 (September 2023), 2826-2836. Accessed September, 2023. https://doi.org/10.46609/IJSSER.2023.v08i09.025.

References

[1]. Bushard, B. (2022, October 13). ACT college admission test scores drop to 30-year low as effects of covid-era online learning play out. Forbes. https://www.forbes.com/sites/brianbushard/2022/10/12/act-college-admission-test-scores-drop-to-30-year-low-as-effects-of-covid-era-online-learning-play-out/?sh=6f5a24663f7f
[2]. Claro, M., Salinas, A., Cabello-Hutt, T., San Martín, E., Preiss, D. D., Valenzuela, S., & Jara, I. (2018). Teaching in a digital environment (TIDE): Defining and measuring teachers’ capacity to develop students’ digital information and Communication Skills. Computers & Education, 121, 162–174. https://doi.org/10.1016/j.compedu.2018.03.001
[3]. de Kleer, J. and Brown, J.S. (1981). Mental models of physical mechanisms and their acquisition. In J.R. Anderson (Ed.), Cognitive skills and their acquisition. Hillsdale, NJ: Erlbaum.
[4]. Leddo, J., Clark, A., and Clark, E. (2021). Self-assessment of understanding: We don’t always know what we know. International Journal of Social Science and Economic Research, 6(6), 1717-1725.
[5]. Leddo, J., Cohen, M.S., O'Connor, M.F., Bresnick, T.A., and Marvin, F.F. (1990). Integrated knowledge elicitation and representation framework (Technical Report 90-3). Reston, VA: Decision Science Consortium, Inc.
[6]. Leddo, J., Li, S. & Zhang, Y. (2022). Cognitive Structure Analysis: A technique for assessing what students know, not just how they perform. International Journal of Social Science and Economic Research, 7(11), 3716-3726.
[7]. Leddo, J., Pillai, A., Patel, J., Hu, A., Kalavanan, P., Sreedhara, A. & Anumola, M. (2021). Degrading Math Skills: It is subject matter difficulty, not the passage of time, that matters. International Journal of Social Science and Economic Research 6(10), 4092-4101.
[8]. Leddo, J. and Sak, S. (1994). Knowledge Assessment: Diagnosing what students really know. Presented at Society for Technology and Teacher Education.
[9]. Leddo, J., Zhang, Z. and Pokorny, R. (1998). Automated Performance Assessment Tools. Proceedings of the Interservice/Industry Training Systems and Education Conference. Arlington, VA: National Training Systems Association.
[10]. Liang, I. and Leddo, J. (2021). An intelligent tutoring system-style assessment software that diagnoses the underlying causes of students’ mathematical mistakes. International Journal of Advanced Educational Research, 5(5), 26-30.
[11]. Makura, A. H., & Kalobo, L. (2019). Reliability and equity of Subject Delivery Competencies: Perspectives of pre-service teachers. Journal of Psychology in Africa, 29(5), 511–515. https://doi.org/10.1080/14330237.2019.1665907
[12]. NAEP long-term trend assessment results: Reading and Mathematics. The Nation’s Report Card. (n.d.). https://www.nationsreportcard.gov/highlights/ltt/2023/
[13]. National Mathematics Advisory Panel. Foundations for Success: The Final Report of the National Mathematics Advisory Panel, U.S. Department of Education: Washington, DC, 2008.
[14]. Nam, J. (2023, February 8). All about SAT scores: National average and full statistics: Bestcolleges. BestColleges.com. https://www.bestcolleges.com/research/average-sat-score-full-statistics/
[15]. Newell, A. and Simon, H.A. (1972). Human problem solving. Englewood Cliffs, NJ: Prentice Hall.
[16]. O’Neil Jr., H. F., & Brown, R. S. (1998). Differential effects of question formats in math assessment on metacognition and affect. Applied Measurement in Education, 11(4), 331–351. https://doi.org/10.1207/s15324818ame1104_3
[17]. Patel, N., Franco, S., Miura, Y., & Boyd, B. (2012). Including curriculum focus in mathematics professional development for middle-school Mathematics Teachers. School Science and Mathematics, 112(5), 300–309. https://doi.org/10.1111/j.1949-8594.2012.00146.x
[18]. Quillian, M.R. (1966). Semantic memory. Cambridge, MA: Bolt, Beranek, and Newman.
[19]. Schank, R.C. and Abelson, R.P. (1977). Scripts, Plans, Goals, and Understanding. Hillsdale, NJ: Erlbaum.
[20]. Schank, R.C. (1982). Dynamic Memory: A theory of learning in computers and people. New York: Cambridge University Press.

ABSTRACT:
Assessment has long been a key part of education, determining how much students have learned. Traditionally, assessments have focused on whether students give the correct answers to problems, implying that the number of correctly answered test items is a valid measure of how much students know. Unfortunately, this focus on correct answers has also meant neglecting the potential of assessments to provide diagnostic feedback to educators about which concepts students have mastered and where the gaps in their knowledge lie, feedback that could inform day-to-day teaching. The present paper describes an assessment technique called Cognitive Structure Analysis (CSA), derived from John Leddo’s integrated knowledge structure framework (Leddo et al., 1990), which combines several prominent knowledge representation frameworks from cognitive psychology. In a previous study (Leddo et al., 2022), CSA-based assessments of Algebra 1 knowledge correlated .966 with student problem-solving performance. The present study replicates the Leddo et al. (2022) findings in the subject of precalculus. Using a Google Form, students were queried on four types of knowledge considered the basis of mastery of precalculus concepts: factual, procedural, strategic, and rationale. From students’ responses to these queries, measures of each type of knowledge and a combined knowledge score were created. Students were also given problems to solve. Correlations between each knowledge component score and problem-solving performance were high, and the correlation between overall CSA-assessed knowledge and problem-solving performance was .80. The results suggest that CSA can be both easily implemented and highly diagnostic of student learning needs. Future research can investigate CSA’s robustness across other subjects and whether incorporating CSA into day-to-day classroom instruction leads to higher student achievement.
