Trevecca Nazarene University has taken on the difficult task of hosting a state-wide math competition for high school students. With 350 exams that had to be scored in just a 90-minute period, they turned to Remark Office OMR.
Trevecca Nazarene University hosts a statewide high school math competition that draws 350 of the top high school students from the Nashville area. The students compete in six categories, ranging from Algebra I to Calculus and Advanced Math Topics, for over $20,000 in scholarship awards and bragging rights as the smartest math student of the year. The competition starts at 9:30am. Students are given 80 minutes to complete the exam and then take a 90-minute lunch break before the awards ceremony. During that 90-minute break, the University must collect, sort, and score every exam using a complex scoring formula. Once scoring is complete, the scores in each exam category must be ranked from highest to lowest; the top 10 finishers are announced during the awards ceremony, and the top three in each category receive significant scholarship awards to the University. The stakes are high, and the time frame for accurately scoring all 350 exams is short.
We had considered using the University’s traditional OMR system, but the system did not allow us to electronically transfer results into a database to manage the sorting and complex scoring of the exams. We needed a system that would provide an accurate and quick way to complete the final ranking of winners for the awards ceremony.
After a quick Internet search for OMR software, we discovered Remark Office OMR by Gravic. We downloaded the free demo version of the software and began evaluating the product. The application appeared to have everything we needed. First, it allowed us to create custom answer sheets (using a standard word processor) with the exact number of response locations to match each exam type. Second, it supported both OMR and barcode reading to accurately read student identification numbers and exam responses. Third, it supported a complex scoring formula, so no additional calculations were needed after the application finished scoring the exams. Finally, student information and scoring results could be quickly exported to a database, where results could be sorted and student information merged onto award certificates using a mail merge feature, so the awards could be printed quickly before the ceremony.
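The post-export step described above, sorting the scored results and pulling the top 10 in each of the six categories, is straightforward to automate once the data is in a database or spreadsheet. The sketch below is a hypothetical illustration in Python; the field names, category labels, and scores are made-up assumptions, not the competition's actual data or Remark Office OMR's export format.

```python
# Hypothetical sketch of the post-export ranking step. Remark Office OMR
# exports scored results; here we assume each exported record carries a
# student ID, an exam category, and a final score (illustrative fields only).
from collections import defaultdict

def rank_results(records, top_n=10):
    """Group scored exams by category and return the top N records in
    each category, sorted from highest score to lowest."""
    by_category = defaultdict(list)
    for rec in records:
        by_category[rec["category"]].append(rec)
    return {
        cat: sorted(recs, key=lambda r: r["score"], reverse=True)[:top_n]
        for cat, recs in by_category.items()
    }

# Example with made-up exam records:
records = [
    {"student_id": "1001", "category": "Algebra I", "score": 92.5},
    {"student_id": "1002", "category": "Algebra I", "score": 88.0},
    {"student_id": "2001", "category": "Calculus",  "score": 95.0},
]
winners = rank_results(records)
```

Each value in `winners` is already ordered highest-to-lowest, so the first three entries per category identify the scholarship recipients, and the full list feeds the mail merge for award certificates.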
We began by developing our custom answer sheet and running a few tests. The application was simple to learn, and the support team at Gravic responded quickly to our questions about the Office OMR application. The support team also helped us create an answer sheet format that would give us the most accurate scoring process. Using a multifunction printer, we were able to scan our test set of answer sheets into a multi-page TIFF image file. We then created an exam key and loaded it into the application before reading in the test exams. During our testing, the application worked perfectly: we processed over 100 test exams quickly and accurately. After a few more tests, we felt confident the application would perform well.
On the day of the competition, we handed out the custom answer sheets, and the high school students completed their student information and took the exam. As we collected the answer sheets, we noticed a wide variety of bubble marks. Most students had taken care to fill in the bubbles carefully, but many marks varied in shading and completeness, and we were a little nervous that the software would not accurately score exams with less-than-perfect markings. However, the process went smoothly: within 45 minutes we scanned and scored all 350 exam answer sheets. During processing, the software let us view the scanned image of each exam to evaluate any double-answer issues and correct them.
After getting the results from the Office OMR application, we manually graded the top 10 scores in each of the six exam categories. During the manual grading we found six exams that appeared not to have been scored correctly by the software. However, after double-checking our manual work, we found that the person doing the manual grading had made the mistake in all but one of those cases. The single answer sheet that was not scored correctly by the Remark Office OMR software had been filled in so lightly that the scanner did not pick up several of the bubble marks. We later discovered that this can be corrected by adjusting the contrast settings on the scanner to better detect very light markings.
Overall, we are very pleased with the Remark Office OMR software and believe it would have been very difficult to accurately score and rank all of the student exams within our time limit without it.
Remark Office OMR features used: image storage, review mode, complex scoring support, and data export.