An Evaluation Framework and Analysis of Auto Assessing Programming Courses during the COVID-19 Pandemic

Abstract: The COVID-19 pandemic changed the traditional working dimension of many sectors. Many sectors are still struggling and facing difficulties in changing their current working environment to adapt to the pandemic. The education sector was faster in adapting to this pandemic and changed its teaching-learning process from a face-to-face methodology to remote learning. UNICEF reported that approximately 800 million learners were affected by the spread of COVID-19. This outbreak caused a paradigm shift for many educationists, from schools to colleges. Educationists are still fighting to survive this pandemic and are introducing different approaches to enhance the teaching-learning process. This paper presents a framework for auto assessing programming courses using the MOODLE learning management system with the support of four activities (quiz, debugging, fill the missing code and programming). The proposed framework was analysed with the help of 176 students enrolled in a Python programming course. The grades acquired by all the students were analysed, and the results show that student programming skill increases markedly when this auto assessment framework is used. Student perception of the auto assessment framework was analysed based on a survey conducted with the students. The survey responses clearly state that this kind of auto evaluation framework helps students to increase their learning interest in programming courses. Suggestions on adapting this kind of auto assessment framework in other programming courses are also presented.


1. Introduction
Learning computer programming is essential for all students, from school to university. Many students still find the fundamentals of programming courses difficult to learn, and it is evident that there is a high failure rate in programming examinations conducted by various universities. Student learning outcomes in programming courses are measured by quizzes, assignments, projects and viva voce. To strengthen students' programming skills, several real-time problems should be solved in an adaptive manner, and these exercises should be monitored and evaluated. Monitoring and evaluating every student's solution is a very difficult task. The traditional way of assessment is not practice centered. Students receive a question at random and start writing the logic for the chosen question on paper. The logic is evaluated by a teacher; if the logic is correct, students are permitted to type the code in an appropriate code editor, otherwise they are requested to rewrite the logic. The typed code is manually evaluated by the teacher by giving sample input to the program. If the code is correctly typed for the verified logic, a 100% grade is given; otherwise the grade is awarded based on the correctness of the logic.
The whole process is carried out in a stipulated time. If a student is not able to submit the code within that time span, only a very low grade is given to that student. For faculty, it is very difficult to assess all the students in the limited time period. This kind of grading impacts the learning approach of the student, and the assessment spirit and style characterize the curriculum. The baseline position of the two authors is that a proper evaluation/assessment method is required. To make the evaluation process more accurate, an automated mechanism is needed. The auto evaluation process includes compilation error checking, testing the results and logic, measuring the efficiency of the code, and recording the total time taken to submit the code. Auto evaluation ensures that the same rubric is followed for all submitted code and also ensures speedy result declaration. This paper presents a study of how auto evaluation of programming has emphasized student learning skills and faculty teaching skills compared with the traditional method.
The rest of the paper is organized as follows: Section 2 narrates the related work carried out on automated assessment systems. Section 3 explains the framework used for auto assessing programming courses. Section 4 describes the student performance analyses and depicts the impact of using auto assessment in programming courses. In Section 5, the conclusion of the article is presented along with an outlook on future work to be carried out.

2. Related Work
In recent years, quite a number of reviews have been carried out on assessing programming courses in an automated manner [18,[22][23][24][25][26][27][28][29]. Many students have dropped out of college because of difficulty in understanding programming courses. According to (Tabano ES, Rodrigo MMT & Jadud MC 2011), 30-40% of student drop-out is mainly because of programming courses. The evolution of introductory programming courses over the past fifty years is thoroughly analyzed by (Becker & Quille 2019). Their analysis states how automated assessment systems impact student retention and success rates. It also confirms that much research still needs to be carried out on auto assessing programming courses based on various characteristics. Auto grading of student programs started in the late 90's. (Jackson & Usher 1997) proposed a system-based evaluation tool called ASSYST to assess ADA and C language programming assignments. Only four characteristics are considered for the assessment, namely complexity, efficiency, style and correctness. (Daly C 1999) proposed another promising tool for assessing C++ programs over the internet. This tool is limited to analyzing the structure and syntax of a C++ program. (Mandal AK, Mandal C & Reade CMP 2006) suggested an architecture for auto evaluating C programs based on inputs and outputs. (Farrow M & King PJB 2008) utilized JUnit and BLUEJ software and proposed a framework to assess Java programming. The improvements to be carried out in auto evaluating student-submitted programming assignments were elaborately listed by (Skupas B 2010). In this approach, the auto grading is done in two phases: in the first phase, the student-submitted program is auto evaluated based on its output, and in the second phase the programming logic is evaluated manually. A framework for evaluating programming language examinations was developed by (Himani Mittal & Syamala Devi Mandalika 2015). Every student-submitted program is evaluated, from a small piece of a program to single-file programs. This framework is limited to programming languages like C, C++ and Java.
The impact of auto evaluation of programming courses on student performance was studied by (Rubio-Escudero & Asencio-Cortes 2018). Their study states that auto evaluation enhances student performance in attending and passing examinations. It also helps students revise their knowledge based on the instant feedback given by the auto assessment tool. Student programming skill is influenced by various factors, namely school mathematics background, prior programming experience, comfort level, intrinsic motivation, formal training and understanding of the shared material (Bergin S & Reilly R 2005). Early prediction of a lack of understanding in programming courses was done by (Skalka J, Drlík M & Obonya J 2017); a predictive modeling technique was suggested to monitor student performance from the evaluation carried out at mid semester. Auto evaluation alone does not improve student programming skill; the ideal learning outcome is achieved by coupling an effective way of teaching with high-quality assignments (Akahane Y, Kitaya H & Inoue U 2015). (Pieterse V 2013) conducted a study on assessing student performance with and without using an auto assessment tool in programming courses; the results show that the difference in performance was not statistically significant. A similar study was carried out by (Wilcox X 2015), and it shows that exam scores were significantly higher for those who underwent the automated grading mechanism. Student dropout is also significantly reduced when the auto assessment process is used in programming courses (Rubio-Sanchez M, Kinnunen P, Pareja-Flores & Velázquez-Iturbide 2014).

3. Framework for Auto Assessing the Programming Course
The ongoing COVID-19 pandemic came on unexpectedly, causing a change in teaching and learning. Many institutions in India reacted quickly and created guidelines to maintain the teaching-learning process without any hurdles. Much research still needs to be carried out to efficiently adapt courses to the new guidelines.
This paper describes a framework for transforming the traditional assessment method of programming courses in engineering education into a completely online format by means of a student-centred automated assessment tool. In order to evaluate the new assessment method, a detailed study was carried out by incorporating the new framework for conducting programming lab courses; students' perception of the auto evaluation tool was later analyzed as well. The traditional way of handling programming laboratory courses in many universities and colleges begins by providing the lab manual to the students at the commencement of every semester. The lab manual contains a detailed description of which experiments need to be completed each week and supporting material for those experiments. During the lab hour, students go through the lab manual to complete their lab exercises. Once a student completes the lab exercises, the faculty member evaluates them manually and allocates marks appropriately. The main drawback of the traditional approach is that students must complete their lab exercises during the lab hour only. Students' programming skill is manually assessed by the faculty through interaction with the students, and it is very difficult for the faculty to assess all the students in the stipulated time.

Figure 1. Framework for Auto Assessing the Programming Course
To overcome the aforementioned issues, a framework incorporating MOODLE was suggested for handling programming laboratory courses. Modular Object Oriented Dynamic Learning Environment (MOODLE) is a Learning Management System that is effectively used for delivering content and conducting MCQ-based assessments. The proposed framework is depicted in Figure 1. Each laboratory experiment was auto assessed through four activities, namely Quiz, Debugging, Filling the Missing Code and Programming. The quiz comprises 10 questions and needs to be completed in 10 minutes; the Quiz plugin was used exclusively for this activity. CodeRunner is an open source plugin for MOODLE, and it is used to auto assess the Debugging and Fill the Missing Code activities: one question with a 30-minute duration is used for Debugging, and 5 questions with a 10-minute duration are used for Fill the Missing Code. In the last activity, the Virtual Programming Lab (VPL) plugin is used to auto assess the student programming assignment based on known and hidden test cases; one programming question with a 120-minute duration is used for this activity. Every activity is graded out of 100 marks. A Python programming laboratory course was considered for the study. This course was taught in the fifth semester to Computer Science and Engineering students; 176 students enrolled in the course, and auto evaluation was carried out for 10 weekly exercises with four activities each and one final exam with four activities. The results obtained enable us to analyze the credibility of the newly introduced auto evaluation tool in the curriculum and how to improve the same course in future endeavours.
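To illustrate the test-case-based grading described above, the following is a minimal sketch of how a submission might be scored against known and hidden test cases. This is not the actual VPL plugin logic: the function name `grade_submission` and the even split of 100 marks across the 10 test cases (8 known + 2 hidden) are our assumptions for illustration only.

```python
# Sketch of test-case based auto-grading, assuming each of the 10 test
# cases carries an equal share of the 100 marks. Illustrative only; the
# real VPL plugin internals may differ.

def grade_submission(student_fn, test_cases):
    """Run a student's function against (args, expected) pairs and
    return (passed_count, grade out of 100)."""
    passed = 0
    for args, expected in test_cases:
        try:
            if student_fn(*args) == expected:
                passed += 1
        except Exception:
            pass  # a crashing test case simply counts as failed
    grade = 100 * passed / len(test_cases)
    return passed, grade

# Hypothetical factorial exercise with 8 "known" and 2 "hidden" cases.
def student_factorial(n):
    return 1 if n <= 1 else n * student_factorial(n - 1)

known = [((i,), [1, 1, 2, 6, 24, 120, 720, 5040][i]) for i in range(8)]
hidden = [((10,), 3628800), ((12,), 479001600)]

passed, grade = grade_submission(student_factorial, known + hidden)
print(passed, grade)  # 10 100.0
```

A failing or crashing submission simply earns a proportionally lower grade, which matches the paper's description of grading each activity out of 100 marks.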

4. Analyzing the Auto Assessment Framework
The programming lab session could be conducted at any time, and students had the privilege of accessing the course material from anywhere. During the COVID-19 pandemic, all laboratory classes were conducted online, and in every exercise students needed to complete the four activities mentioned in Figure 1. Table 1 presents the auto assessment analysis of the Quiz, Debugging and Fill the Missing Code activities using two factors, namely the average completion time and the average grade secured across all students, for the ten exercises and one final exam respectively. The average completion times of the Quiz, Debugging and Fill the Missing Code activities across all students are 7.27 minutes, 24.63 minutes and 9.18 minutes respectively. From this it was observed that the students actively participated in all the activities.

Analyzing the Student Responses on the Use of the Proposed Framework in the Laboratory Course
A survey was conducted with the students after they completed their final examination in the Python programming laboratory. The survey comprised 9 questions, and the responses were collected on a Likert scale (Strongly Disagree, Disagree, Neither Agree nor Disagree, Agree and Strongly Agree). The student responses to the survey are shown in Table 3. Students felt more comfortable while using the auto assessment tool, and only a very small percentage of students reported discomfort in using it. All the programming questions were auto evaluated against 10 different test cases, and 22% of students faced some difficulties in passing all the test cases. 69% of students responded that the auto assessment tool increased their motivation to do the programming exercises, and 74% of students answered that the auto assessment tool increased their programming skill. The grading system provided by the auto assessment tool is more credible. More than 80% of students agreed that this kind of auto assessment tool can be used for other programming courses and that it is a better alternative to traditional programming exams.
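The percentage figures reported above can be derived from the raw Likert responses in a straightforward way. The sketch below shows one possible aggregation; the response data is made up, and only the five scale labels come from the survey description.

```python
from collections import Counter

# Sketch of turning raw Likert responses into per-level percentages of
# the kind reported in Table 3. The sample data is illustrative only.

LEVELS = ["Strongly Disagree", "Disagree", "Neither Agree nor Disagree",
          "Agree", "Strongly Agree"]

def agreement_percentages(responses):
    """Return the percentage of responses at each Likert level."""
    counts = Counter(responses)
    total = len(responses)
    return {level: round(100 * counts[level] / total, 1) for level in LEVELS}

# Hypothetical responses to one survey question from 10 students.
sample = ["Agree"] * 5 + ["Strongly Agree"] * 3 + ["Disagree"] * 2
print(agreement_percentages(sample)["Agree"])  # 50.0
```

Summing the "Agree" and "Strongly Agree" percentages gives the combined agreement figures (such as the 69% and 74% reported above) for each question.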

Figure 2. Average Grade of all the Students - Activity wise
Figure 2 shows the average grade of all 176 students in each of the four activities. The total grade of the students gradually increased from exercise 1 to the final examination. The average grade for the entire programming laboratory is 96%, which is significantly higher than with the traditional way of assessing programming experiments.

5. Conclusion and Future Work
This paper elaborated an evaluation framework for analyzing the auto assessment process of programming courses during the COVID-19 pandemic. The proposed framework was implemented using the MOODLE learning management system, leveraging four plugins. Every student had to complete the four activities created for each lab exercise in a stipulated time. The grades obtained by all the students show that the proposed evaluation framework increased the students' programming skills. The framework can be easily configured by anyone, and it is completely available as open source. The questions prepared for all four activities in every lab exercise are the key factor for the entire evaluation framework, so faculty need to be especially careful when preparing the questions. An elaborate survey was conducted with all the students who enrolled in the programming course. The survey responses show that students felt more comfortable using this framework, and many students recommended using this kind of assessment process in other programming courses too. One major limitation of this framework is that all the activities are carried out at home, and there is no standard mechanism to ensure that only the appropriate student performs the activity. This will be considered as an interesting direction for future work.

Table 1. Auto Assessment Details on Quiz, Debugging and Fill the Missing Code
The programming activity was separately analyzed based on the number of test cases passed, the average execution time and the number of times students compiled the code in every exercise. The details are illustrated in Table 2. Every programming exercise question was auto evaluated against 8 known test cases and 2 hidden test cases. The result shows that all the students successfully passed 9 test cases. The average completion time of the entire program is 108.27 minutes, which reflects that the students actively participated in the programming assignments. From the table it is also observed that, for the final exam, students underwent more compilations compared with the regular programming exercises.