A GUIDELINE-BASED APPROACH TO SUPPORT THE ASSESSMENT OF STUDENTS’ ABILITY TO APPLY OBJECT-ORIENTED CONCEPTS IN SOURCE CODE

Authors

  • Norazlina Khamis, Faculty of Computing and Informatics, Universiti Malaysia Sabah, Jalan UMS, 88400 Kota Kinabalu, Sabah
  • Norhayati Daut, Faculty of Computing and Informatics, Universiti Malaysia Sabah, Jalan UMS, 88400 Kota Kinabalu, Sabah

DOI:

https://doi.org/10.11113/jt.v78.4919

Keywords:

Object-oriented programming, object-oriented concept, programming assessment, Goal-Question-Metric approach

Abstract

There are many approaches to assessing students’ ability in object-oriented (OO) programming, but little is known about how to assess their ability to apply fundamental OO concepts in their written source code. One major problem with programming assessment is the variation in marks awarded by different assessors. Often, the grades given also do not gauge whether students know how to apply OO approaches. Thus, a new assessment approach is needed to fill this gap. The objective of this study is to construct, and validate through expert consensus, a set of evaluation criteria for fundamental OO concepts, together with a guideline called GuideSCoRE, to help instructors assess students’ ability to apply OO concepts in their program source code. The evaluation criteria are derived from the fundamental OO concepts found in Malaysian OO programming syllabuses and validated through a three-round Delphi approach. The proposed evaluation criteria were mapped to related OO design heuristics and OO design principles. The guideline (GuideSCoRE), constructed based on the Goal-Question-Metric approach together with the evaluation criteria, is used by instructors when assessing students’ source code. An inter-rater reliability analysis among six instructors found moderate agreement on assessment scores (κ values mostly between 0.421 and 0.575), indicating that while the guideline does not completely eliminate variation between raters, it helps reduce its occurrence.

References

Kölling, M. 1999. The Problem of Teaching Object-Oriented Programming, Part 1: Languages. Journal of Object-Oriented Programming.

Fleury, A. E. 2000. Programming in Java: Student-Constructed Rules. In Proceedings of the 31st SIGCSE Technical Symposium on Computer Science Education.

Guzdial, M. 2001. Centralized Mindset: A Student Problem with Object-Oriented Programming. Journal of Computer Science Education. 14(3&4): 28-32.

Sheetz, S. D., Irwin, G., Tegarden, D. P., Nelson, H. J. and Monarchi, D. E. 1997. Exploring the Difficulties of Learning Object-oriented Techniques. Journal of Management Information System. 14(2): 103-131.

Berge, O. and Fjuk, A. 2003. Socio-cultural Perspectives on Object-oriented Learning. Workshop on Pedagogies and Tools for Learning Object-Oriented Concepts, European Conference on Object-Oriented Programming, Darmstadt, Germany.

Lahtinen, E., Ala-Mutka, K. and Järvinen, H.-M. 2005. A Study of the Difficulties of Novice Programmers. In ITiCSE ’05. June 27-29. Portugal.

Eckerdal, A. and Thuné, M. 2005. Novice Java Programmers' Conceptions of "Object" and "Class", and Variation Theory. In ITiCSE ’05, Monte de Caparica, Portugal. 89-93.

Sanders, K. et al. 2008. Student Understanding of Object-Oriented Programming as Expressed in Concept Maps. SIGCSE Bull. 40: 332-336.

Sanders, K. and Thomas, L. 2007. Checklists for Grading Object-Oriented CS1 Programs: Concepts and Misconceptions. SIGCSE Bull. 39: 166-170.

Holland, S., Griffiths, R. and Woodman, M. 1997. Avoiding Object Misconceptions. SIGCSE Bull. 29: 131-134.

Brown, G. et al. 1997. Assessing Student Learning in Higher Education. London: Routledge, Taylor and Francis.

Hagan, D. and Markham, S. 2000. Does it Help to Have Some Programming Experience Before Beginning a Computing Degree Program? In Proceedings of the 5th annual SIGCSE/SIGCUE ITiCSE conference on Innovation and Technology in Computer Science Education. New York: ACM. 25-28.

McCracken, M. et al. 2001. A Multi-National, Multi-Institutional Study of Assessment of Programming Skills of First-Year CS Students. In SIGCSE Bull. 33: 125-180.

Lister, R. and Leaney, J. 2003. First Year Programming: Let All the Flowers Bloom. In Proceedings of the Fifth Australasian Conference on Computing Education. Volume 20, Adelaide, Australia.

Tew, A. E. and Guzdial, M. 2010. Developing a Validated Assessment of Fundamental CS1 Concepts. In Proceedings of the 41st ACM Technical Symposium on Computer Science Education, Milwaukee, Wisconsin, USA.

ACM. 2013. ACM Computing Curricula 2013. Available: https://www.acm.org/education/curricula-recommendations.

Khamis, N. 2016. Establishing Evaluation Criteria for Assessing Novices' Ability in Applying Object-oriented Concept Using Delphi Approach. International Journal of Information and Education Technology. ISSN 2010-3689.

Armstrong, D. J. 2006. The Quarks of Object-Oriented Development. Communications of the ACM. 49(2): 123-128.

Hsu, C.-C. and Sandford, B. A. 2007. The Delphi Technique: Making Sense of Consensus. Practical Assessment, Research & Evaluation. 12: 1-8.

Yousuf, M. I. 2007. Using Experts' Opinion Through Delphi Technique. Practical Assessment, Research & Evaluation. 12: 9-18.

Basili, V. R., Caldiera, G. and Rombach, H. D. 1994. The Goal Question Metric Approach. Encyclopedia of Software Engineering. Wiley.

Martin, R. C. 1996. Design Principles. Available: http://www.objectmentor.com/resources/publishedArticles.html.

Meyer, B. 1997. Object-oriented Software Construction. Prentice Hall.

Liskov, B. 1987. Keynote Address: Data Abstraction and Hierarchy. In Proceedings of OOPSLA '87 (Addendum), Orlando, Florida, United States.

Ala-Mutka, K. M. 2005. A Survey of Automated Assessment Approaches for Programming Assignments. Computer Science Education. 15: 83-102.

American Association for the Advancement of Science (AAAS). 1993. Benchmarks for Science Literacy. New York: Oxford University Press.

Altman, D. G. and Bland, J. M. 1995. Statistics Notes: Absence of Evidence is Not Evidence of Absence. BMJ. 311: 485.

Creswell, J. 1998. Qualitative Inquiry and Research Design: Choosing Among Five Traditions. Thousand Oaks, CA: SAGE.

Morse, J. M. 1995. The Significance of Saturation. Qualitative Health Research. 5: 147-149.

Trochim, W. M. K. 2006. Research Methods Knowledge Base. Measurement: Reliability.

Nichols, T. R. et al. 2010. Putting the Kappa Statistic to Use. The Quality Assurance Journal. 13: 57-61.

Moskal, B. M. and Leydens, J. A. 2000. Scoring Rubric Development: Validity and Reliability. Practical Assessment, Research & Evaluation. 7(10).

Fleiss, J. L. 1971. Measuring Nominal Scale Agreement Among Many Raters. Psychological Bulletin. 76: 378-382.

Landis, J. R. and Koch, G. G. 1977. The Measurement of Observer Agreement for Categorical Data. Biometrics. 33: 159-174.

Khamis, N. and Idris, S. 2007. Investigating Current Object-oriented Programming Assessment Methods in Malaysia's Universities. In Proceedings of the International Conference on Electrical Engineering and Informatics, June 17-19, Bandung: Institut Teknologi Bandung. 666-668.

Published

2016-02-09

Section

Science and Engineering

How to Cite

A GUIDELINE-BASED APPROACH TO SUPPORT THE ASSESSMENT OF STUDENTS’ ABILITY TO APPLY OBJECT-ORIENTED CONCEPTS IN SOURCE CODE. (2016). Jurnal Teknologi (Sciences & Engineering), 78(2). https://doi.org/10.11113/jt.v78.4919