Automated Test Cases and Test Data Generation for Dynamic Structural Testing in Automatic Programming Assessment Using MC/DC

Rohaida Romli, Shahadath Sarker, Mazni Omar, Musyrifah Mahmod

Abstract


Automatic Programming Assessment (APA) is a method for assisting educators in automatically assessing and grading students’ programming exercises and assignments. To execute dynamic testing in APA, an adequate set of test data must be supplied through a systematic test data generation process. Although software testing research has proposed various significant methods for automated test data generation, recent APA studies have rarely utilized them. Only a few studies have addressed this shortcoming, and support for test sets and test data that cover more thorough dynamic-structural testing remains deficient. We therefore propose a method, called DyStruc-TDG, that utilizes MC/DC coverage criteria to support more thorough automated test data generation for dynamic-structural testing in APA. In this paper, we describe how DyStruc-TDG derives and generates test cases and test data, and we verify the method against the reliability criterion (also called positive testing) of test data adequacy in programming assessment. DyStruc-TDG significantly assists educators of introductory programming courses in deriving and generating test cases and test data via APA without requiring expertise in designing test cases for structural testing. It can thereby reduce educators’ workload, since manual assessment is typically error-prone and promotes inconsistency in marking and grading.
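
To illustrate the coverage criterion at the heart of DyStruc-TDG: MC/DC requires showing, for each condition in a decision, that the condition independently affects the decision’s outcome. The sketch below is a minimal Python illustration of the MC/DC criterion itself, not the paper’s DyStruc-TDG implementation; the decision (a and b) or c is a hypothetical example. For each condition it searches for an independence pair, i.e., two test vectors that differ only in that condition and flip the decision.

    from itertools import product

    def mcdc_independence_pairs(decision, n):
        # Enumerate all 2^n truth vectors over the n conditions.
        vectors = list(product([False, True], repeat=n))
        pairs = {}
        for i in range(n):
            for v in vectors:
                w = list(v)
                w[i] = not w[i]  # toggle condition i only
                w = tuple(w)
                # Independence pair: flipping condition i alone
                # changes the outcome of the whole decision.
                if decision(*v) != decision(*w):
                    pairs.setdefault(i, []).append((v, w))
        return pairs

    # Hypothetical decision under test: (a and b) or c
    decision = lambda a, b, c: (a and b) or c

    for i, plist in mcdc_independence_pairs(decision, 3).items():
        v, w = plist[0]
        print(f"condition {i}: {v} -> {decision(*v)}, {w} -> {decision(*w)}")

For a decision over n conditions, an MC/DC-adequate test set built from such pairs typically needs only n + 1 test vectors, in contrast to the 2^n vectors required for exhaustive multiple-condition coverage; this economy is what makes MC/DC practical as a thoroughness criterion for automated test data generation.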

Keywords


automatic programming assessment; test data generation; dynamic testing; structural coverage; MC/DC.


References


R. Romli, S. Sulaiman, K. Z. Zamli, “Automatic programming assessment and test data generation: a review on its approaches”, in Proceedings of the International Symposium in Information Technology (ITSim), vol. 3, 2010, pp. 1186-1192.

D. Jackson, “A Software System for Grading Student Computer Programs”, Computers and Education, 27 (3-4), pp. 171-180, 1996.

R. Saikkonen, L. Malmi, A. Korhonen, “Fully Automatic Assessment of Programming Exercises”, ACM SIGCSE Bulletin, 33 (3), 2001, pp. 133-136.

D. Jackson, M. Ushe, “Grading student programs using ASSYST”, Proceedings of the 28th SIGCSE Technical Symposium on Computer Science Education, San Jose, CA, 1997, pp. 335-339.

M. Luck, M. S. Joy, “A secure on-line submission system”, Software: Practice and Experience, 29 (8), pp. 721-740, 1999.

L. Malmi, V. Karavirta, A. Korhonen, J. Nikander, O. Seppälä, P. Silvasti, “Visual Algorithm Simulation Exercise System with Automatic Assessment: TRAKLA2”, Informatics in Education, 3 (2), pp. 267-288, 2004.

M. Choy, U. Nazir, C. K. Poon, Y. Y. Yu, “Experiences in Using an Automated System for Improving Students’ Learning of Computer Programming”, Advances in Web-Based Learning - ICWL 2005, Lecture Notes in Computer Science, vol. 3583, 2005, pp. 267-272.

T. Tang, R. Smith, J. Warren, S. Rixner, “Data-Driven Test Case Generation for Automated Programming Assessment”, Proceedings of the 2016 ACM Conference on Innovation and Technology in Computer Science Education (ITiCSE ’16), 2016, pp. 260-265.

H. Fangohr, N. O'Brien, A. Prabhakar, A. Kashyap, “Teaching Python Programming with Automatic Assessment and Feedback Provision”, arXiv:1509.03556 [cs.CY], 2015, pp. 1-26.

S. Monpratarnchai, S. Fujiwara, A. Katayama, T. Uehara, “Automated Testing for Java Programs using JPF-based Test Case Generation”, ACM SIGSOFT Software Engineering Notes, 39 (1), 2014, pp. 1-5.

L. A. Clarke, “A system to generate test data and symbolically execute programs”, IEEE Transactions on Software Engineering, SE-2 (3), pp. 215-222, 1976.

N. Gupta, A. P. Mathur, M. L. Soffa, “Automated Test Data Generation Using an Iterative Relaxation Method”, ACM SIGSOFT Software Engineering Notes, 23 (6), pp. 231-245, 1998.

J. Offutt, S. Liu, A. Abdurazik, P. Ammann, “Generating Test Data from State-Based Specifications”, Software Testing, Verification and Reliability, vol. 13, pp. 25-53, 2003.

K. Z. Zamli, A. M. Isa, M. F. J. Klaib, S. N. Azizan, “Tool for Automated Test Data Generation (and Execution) Based on Combinatorial Approach”, International Journal of Software Engineering and Its Applications, 1 (1), pp. 19-36, 2007.

W. Zidoune, T. Benouhiba, “Targeted adequacy criteria for search-based test data generation”, International Conference on Information Technology and E-Services, 2012, pp. 1-6.

R. P. Pargas, M. J. Harrold, R. Peck, “Test-Data Generation Using Genetic Algorithms”, Journal of Software Testing, Verification and Reliability, 9 (4), pp. 263-282, 1999.

P. Ihantola, “Test Data Generation for Programming Exercises with Symbolic Execution and Java PathFinder”, Proceedings of the 6th Baltic Sea Conference on Computing Education Research: Koli Calling 2006, 2006, pp. 87-94.

N. Tillmann, J. de Halleux, T. Xie, S. Gulwani, J. Bishop, “Teaching and Learning Programming and Software Engineering via Interactive Gaming”, Proceedings of the 2013 International Conference on Software Engineering (ICSE ’13), San Francisco, CA, USA, 2013, pp. 1117-1126.

R. Romli, “Test Data Generation Framework for Automatic Programming Assessment”, PhD Thesis, Universiti Sains Malaysia, Malaysia, 2014.

K. J. Hayhurst, D. S. Veerhusen, J. J. Chilenski, L. K. Rierson, “A practical tutorial on modified condition/decision coverage”, NASA STI Report Series, 2001.

H. Zhu, P. A. V. Hall, J. H. R. May, “Software Unit Test Coverage and Adequacy”, ACM Computing Surveys, 29 (4), pp. 365-427, 1997.

H. Zhu, “Axiomatic Assessment of Control Flow-based Software Test Adequacy Criteria”, Software Engineering Journal, 10 (5), pp. 194-204, 1995.

K. Ghani, J. A. Clark, “Automatic Test Data Generation for Multiple Condition and MCDC Coverage”, Proceedings of the 2009 Fourth International Conference on Software Engineering Advances, 2009, pp. 152-157.

C. Douce, D. Livingstone, J. Orwell, “Automatic test-based assessment of programming: A review”, Journal on Educational Resources in Computing (JERIC), 5 (3), Article No. 4, 2005.

P. Ihantola, T. Ahoniemi, V. Karavirta, O. Seppälä, “Review of recent systems for automatic assessment of programming assignments”, in Proceedings of the 10th Koli Calling International Conference on Computing Education Research, 2010, pp. 86-93.

P. Y. Liang, Q. Liu, J. Xu, D. Wang, “The recent development of automated programming assessment”, Proceedings of the International Conference on Computational Intelligence and Software Engineering (CiSE 2009), 2009, pp. 1-5.

K. A. Rahman, M. J. Nordin, “A review on the static analysis approach in the automated programming assessment systems”, Proceedings of the National Conference on Programming, vol. 7, 2007.

IPL Information Processing Ltd., Designing Unit Test Cases, 1997. Available: http://www.ipl.com/pdf/p0829.pdf. Retrieved: 10 Feb 2009.

J. Watkins, S. Mills, Testing IT: An Off-the-Shelf Software Testing Process, 2nd Edition, 2011, Cambridge University Press, NY, USA.

S. Rayadurgam, M. P. E. Heimdahl, “Generating MC/DC Adequate Test Sequences Through Model Checking”, Proceedings of the 28th Annual IEEE/NASA Software Engineering Workshop (SEW-03), Greenbelt, Maryland, 2003, pp. 1-5.

M. Pezzè, M. Young, Software Testing and Analysis: Process, Principles, and Techniques, 2008, John Wiley & Sons, Inc., USA.

J. R. Fraenkel, N. E. Wallen, How to Design and Evaluate Research in Education, 4th Edition, 2000, McGraw-Hill Companies, Inc., USA.




DOI: http://dx.doi.org/10.18517/ijaseit.10.1.10166



