
‘Hacking’ sparks row over exam evaluation

by Prasad Krishna last modified Jul 02, 2013 08:58 AM
Over the past two days, Cornell University student Debarghya Das’ blog post on ‘Hacking the Indian Education System’ has kicked off a debate across the country over the security of data published online and the practice of moderation of marks obtained by school students in board examinations.

The article by Vasudha Venugopal and Karthik Subramanian was published in The Hindu on June 7, 2013. Pranesh Prakash is quoted.


Using an automated program, the 20-year-old Cornell student extracted large amounts of class X and class XII student results from a website that hosted the ICSE results. Over 1,760 schools are affiliated to the ICSE and more than 1.2 lakh students took the board exams. Based on his interpretation of the data sets, he raised allegations of large-scale “tampering” of marks by the authorities, ostensibly to maintain a healthy graph on the results.

Information security experts said what the student did could not be viewed as a major security breach as much as it was exploiting a loophole. “Anyone with basic programming skills will be able to pull it off,” said Pranesh Prakash, policy director at the Bangalore-based Centre for Internet and Society. “There are add-ons available on popular internet browsers that allow users to read the embedded codes on a website and run programs to mine data.”
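The experts quoted above describe the general technique of enumerating roll numbers against a public results page. The sketch below illustrates that idea only; the URL, parameter name and roll-number format are hypothetical, not the actual ICSE portal.

```python
# Illustrative sketch only: the endpoint and parameter names are hypothetical.
# It shows the general technique described above -- iterating over candidate
# roll numbers and saving whatever the public results page returns.
import time
import requests

RESULTS_URL = "https://example-results-portal.example/result"  # hypothetical endpoint


def fetch_result(roll_number: str) -> str | None:
    """Request the public results page for one roll number; return the HTML, or None."""
    try:
        resp = requests.get(RESULTS_URL, params={"roll": roll_number}, timeout=10)
        if resp.status_code == 200:
            return resp.text
    except requests.RequestException:
        pass
    return None


def mine_results(start: int, end: int) -> dict[str, str]:
    """Enumerate roll numbers in a range, as a simple automated scraper would."""
    pages = {}
    for n in range(start, end + 1):
        roll = f"{n:07d}"  # assumed zero-padded roll-number format
        html = fetch_result(roll)
        if html:
            pages[roll] = html
        time.sleep(0.1)  # polite delay between requests
    return pages
```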

Government websites are most susceptible to loopholes because too many people use them, says Nitesh Betala, Chennai coordinator of Null, a community of programmers that meets regularly to explore these loopholes in public domain websites. “We inform the system administrators directly hoping that they would plug loopholes before others exploit them.”

Debarghya too explained on his blog (deedy.quora.com) on Thursday that what he did was not illegal. “I did not illegally access any database system. All I did was access information that was available to any person who entered a number into the website. I simply mined the data.”

The ICSE council, on its part, said it does not publish the examination results online on its website. Instead, hard copies of results are despatched to schools. But the results are disseminated to third parties such as media organisations.

Krupakar Manukonda, who runs a blog on education for the not-for-profit organisation Takshashila, said: “The online results of all the boards have serious privacy problems. I think the respective boards should issue a passcode along with a hall ticket or entering Date of Birth, First name and Last name should be made mandatory to access marks.”

Das deduced after much data crunching and statistical analysis that the “marks had been tampered with”. His claim is supported by graphs purporting to show that nearly 33 scores, such as 91, 92, 86 and so on, were never awarded to any student.
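The kind of analysis described amounts to tallying how often each aggregate mark appears and noting which marks never occur. The snippet below is a minimal sketch of that idea; the sample values are fabricated for illustration and are not the scraped ICSE data.

```python
# Minimal sketch: count how often each mark (0-100) appears and list the marks
# that were never awarded in a given set of results. Sample data is made up.
from collections import Counter


def missing_marks(marks: list[int]) -> list[int]:
    """Return every mark in 0..100 that does not appear in the given list."""
    counts = Counter(marks)
    return [m for m in range(101) if counts[m] == 0]


if __name__ == "__main__":
    sample = [35, 40, 62, 70, 70, 88, 90, 94, 35, 56]  # fabricated example values
    print(missing_marks(sample))  # marks absent from the sample distribution
```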

However, teachers deny the allegation. “The word tampering is wrong. There is moderation that happens across education boards,” explained a teacher, who has worked with ICSE schools in Hyderabad and Chennai. “After the first round of corrections, raw data is given to officials and head examiners who analyse how students have performed. They try to ensure the bell curve of the results does not look awkward. If it does, the implication is that the checking has been either too liberal or very strict.”

After the first moderation, there is a final moderation which is often done by a different set of teachers. “There are some instructions given to us earlier, and some changes made later, depending on analysis by the board,” said a teacher. Teachers are not told about moderation methods in both CBSE and ICSE boards.

The ICSE council says that it does follow the practice of moderation. “In keeping with the practice followed by examination conducting bodies, a process of standardisation is applied to the results, so as to take into account the variations in difficulty level of questions over the years (which may occur despite applying various norms and yardsticks), as well as the marginal variations in evaluation of answer scripts by hundreds of examiners (inter-examiner variability), for each subject.”

Some teachers are however puzzled by the findings. “It is understandable that there are many 35s because a student on the verge of passing, is often pushed to the mark. But I don’t understand why there are no 85, 87, 89, 91 and 93. And, with cut throat competition for every single mark in colleges, teachers are very careful, especially with top scoring papers,” said another senior teacher.
