Introduction

Tom Hays, a professional data coordinator, discusses in this article how he found a suitable means of collecting data on student performance for students involved in a variety of programs, using Remark Office OMR.

Background

Much has been said about the new accountability requirements since the No Child Left Behind Act was signed into law in January of 2002. However, school systems had accountability as part of their requirements long before the law was enacted. In 1996, when I worked in a Maryland school system as the data specialist in the Offices of Accountability and Compensatory Education, we were well aware of the need for accountability for the programs the school system implemented.

Our superintendent charged our office with determining the efficacy of intervention programs to ensure that the money being spent on the programs was getting an adequate return in student performance. One of the first tasks I embarked upon was finding a suitable means of collecting data on student performance for students involved in a variety of programs. I knew the best method of collecting and analyzing the data would be an electronic data collection system. I attended several conferences held by the Association for Supervision and Curriculum Development (ASCD). I browsed the vendor fairs at these conferences looking for an electronic means of collecting data. I found several, but the price of the hardware and software was well above the budget for our offices. At one conference, I talked with a Gravic representative about Remark Office OMR®. He showed me the Remark program and how it worked with a small document scanner connected to his laptop computer. The representative ran a few different forms through the scanner, and the results of the scan appeared on the screen. In my discussions with Gravic, I asked about the portability of the data – could I export the scanned data results into the different programs I used for data analysis? The answer was that I could export the data into just about any program for data analysis, as well as into a standard ASCII format that would allow me to move the data anywhere. That made my day!
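That ASCII export is what makes the data portable: any analysis tool that reads delimited text can consume it. As a minimal sketch, here is how such an export might be loaded in Python. The file name and column names are hypothetical, not taken from the article or from Remark's actual export format.

```python
import csv

def load_scores(path):
    """Load a hypothetical comma-delimited ASCII export of scan results.

    Assumes columns named 'StudentID' and 'Score'; real exports would
    use whatever field names the form design produced.
    """
    with open(path, newline="") as f:
        return [
            {"student_id": row["StudentID"], "score": int(row["Score"])}
            for row in csv.DictReader(f)
        ]
```

Because the export is plain text, the same file could just as easily be opened in a spreadsheet or a statistics package.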

But, what about creating the scanning documents? Did I need a special computer program to create the documents that could be scanned? No, any word processing program would do just fine. Ok, how do I make those bubbles for the users to fill in for my multiple choice items? No problem, I can use the font that installs with the software or just a capital letter “O” in the standard Arial font. I needed to collect information on each student. How can I make a sheet for every student that has a unique identifier for each student? The identifier cannot take up much space on the paper, since that would limit the space available for the evaluation data. Again, Remark Office OMR had the answer. Use a barcode like the ones on cans of food at the grocery store. It takes only about a half inch by one inch of space. That barcode can be the student’s identification number. What if I have more data to collect than will fit on one side of the page? Can I use the back of the paper as well? Yes, that too is possible.

I saw the program run on a standard Windows computer, so I knew our office already had some of the equipment needed to run the program. Now, what about the cost of the software? Could we afford it? The cost of the program was less than the cost of my data analysis program, so it was affordable. But, our office did not have a scanner. What kind of scanner did I need? The requirements for the scanner were relatively simple. I would need an image scanner that had a document feeder and was TWAIN-compliant. The speed and other features of the scanner could be whatever was needed to meet my efficiency requirements.

The time had come for a trial run of the program for suitability for collecting data for our office. We purchased a scanner and the Remark software. We were ready to go. Well, almost ready.

I had to create the documents for data collection. How to design the data collection documents was my next challenge. I read through the manual that came with the program, paying attention to the document design notes. That didn’t seem too difficult. I knew I could make the documents I needed.

The first project was to evaluate an intervention for early reading. This intervention targeted students in 1st and 2nd grades who were behind in their reading skills. The process for data analysis is the same as for most intervention programs. Pre-test the students to determine the beginning level of skill, and then, at the end of the intervention, post-test the students to determine the exit level of skill. Compare the results of the post-test with the initial results of the pre-test to compute the gain in skill that the program produced.
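The pre/post comparison above is simple arithmetic once the data is collected: match each student's two scores and take the difference. A minimal sketch, with hypothetical student IDs and scores not drawn from the article:

```python
def compute_gains(pre_scores, post_scores):
    """Return the post-test minus pre-test gain for each student
    who appears in both testing rounds."""
    return {
        student: post_scores[student] - pre
        for student, pre in pre_scores.items()
        if student in post_scores
    }

# Illustrative data only; real scores come from the scanned forms.
pre = {"S001": 12, "S002": 9}
post = {"S001": 20, "S002": 18}
gains = compute_gains(pre, post)  # one gain value per student
```

Students missing a post-test (for example, those who left the program early) simply drop out of the result rather than distorting it.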

Because students in the program do not yet read, or at least do not read well, the testing of the students, both pre- and post-test, requires teacher evaluation and recording of the individual student results on a scannable sheet. The most recent version of the sheet is available for download at the beginning of this article.

The scan form is two pages, a front and back of one sheet of paper. The barcode is the link to the student database containing all demographic data on each student. So that the teacher knows which sheet to use with each student, identifying text is added in the barcode corner. This assessment is used to evaluate student reading ability through the use of seven specific tests: running record, writing vocabulary, dictation assessment, comprehensive word test, phonemic awareness, letter ID, and concepts of print. Some of the specific tests have sub-test components. It is through the combination of the scores of the tests and the sub-tests that a derived score is calculated, thus determining the pre- and post-test achievement level of the student. On the post-test form, other important components of the evaluation are completed. The pre- and post-test book level and the number of lessons and partial lessons delivered are important to assess student growth and program delivery. Also completed only on the post-test form are explanations for why some lessons may not have been delivered. The early dismissal and late entrance marks identify these unusual circumstances.
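The derived-score step described above can be sketched as a weighted combination of the individual test and sub-test results. The weights below are purely hypothetical; the article does not specify the actual formula used by the assessment.

```python
# Hypothetical weights for each test; the real assessment's formula
# is not given in the article.
WEIGHTS = {
    "running_record": 2.0,
    "writing_vocabulary": 1.0,
    "dictation": 1.0,
    "word_test": 1.0,
    "phonemic_awareness": 1.0,
    "letter_id": 0.5,
    "concepts_of_print": 0.5,
}

def derived_score(scores):
    """Combine whichever tests and sub-tests were administered
    into a single weighted score."""
    return sum(WEIGHTS[name] * value for name, value in scores.items())
```

Keeping the weights in one table makes the scoring rule easy to audit and to adjust if the assessment changes.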

This project of data collection to determine the value of an intervention program proved to be very useful. Although the anecdotal data from the school staff, principals and teachers indicated the program was beneficial to the engaged students, that was not enough to meet the requirements. The central office needed proof the intervention worked. The data collected using Remark Office OMR provided that proof. It was shown over the ensuing years that the students involved with the intervention improved their ability to read, to the point that they needed no further intervention. Other tests given in later years (grades 3, 4, and 5) showed that the students in the intervention maintained or improved their level of ability through these grades. This was the proof needed to show the superintendent and the central office that this intervention was worth the time and money spent. Student academic performance improved, and the data documented this improvement. It was through the use of Remark Office OMR to collect the initial and ensuing data that the Office of Accountability was able to provide this data on the individual students.

In my next article, I will discuss a “Remarkable Addition to Our Data Collection” efforts.

Thomas Hays is a retired educator who served a 50,000-student school district in Maryland as assistant supervisor for accountability and data coordinator for compensatory education from 1996 through June 2007.