In this article, Tom Hays, a professional data coordinator, discusses how he transformed the way his district's schools collected data on kindergarten students with the help of Remark Office OMR.

My first article chronicled the beginning of my relationship with Remark Office OMR as I developed a data collection system to determine the efficacy of an intervention program in reading for grades 1 and 2. In this article I will discuss a remarkable addition to the data collection efforts: Evaluation of the kindergarten program through two assessments.

The Early Childhood Office had been evaluating the kindergarten program through the use of two assessment instruments. One assessment was given at the beginning of the school year, within the first 3 weeks of school. The second was given about 3 weeks before the end of the school year. The first was intended to determine the school readiness, in reading and mathematics, of children entering the kindergarten program; the second, to determine the knowledge and skills learned as a result of the program. Together, the two assessments were meant to gauge the effectiveness of the kindergarten program for all students. Of particular interest was how well the kindergarten curriculum was being accessed and acquired by students who were assessed as below the expected readiness level. All of the data had to be collected and then hand-analyzed. Needless to say, this process was tedious, error prone, and extremely slow to deliver results.

After the first experience working with the reading program, it was obvious to me that the kindergarten data collection could be accomplished with the same testing process, but a different collection and analysis method. Because this was a new idea to the supervisor of the Early Childhood office, a pilot project would be used to show the feasibility of collecting the data. The pilot for the project began with just 5 of the 30 elementary schools.

The first thing to be accomplished was a redesign of the assessment document so the data could be easily scanned and placed into a database. Because the teachers would still be giving the same assessment that they had used in previous years, there would be little change in the assessment process. However, due to the change in the method of recording the results of the assessment, teachers would require a little training on properly filling in the scannable document. Marking bubbles on individual assessment sheets instead of recording numbers on a grid for a class of children required about 15 minutes of training. This training was handled by the Early Childhood office.

The pilot project convinced the supervisor of Early Childhood of the feasibility of the project. The real clincher for both the Early Childhood office and the school principals was the rapid turnaround of the data. Although the pilot had no beginning-of-year assessment to compare with its end-of-year data, the fact that there was no hand tabulation and that reports could be back in the hands of the teachers, principals, and supervisor within just a few days was a pleasant surprise to them. The pilot project and the initial reports were given to the Director of Elementary Education for approval for use by all 30 schools. The question was never whether we should go forward, but whether we could get it in place for the beginning of the next school year.

Of course, I could get the documents and database in place, but training all kindergarten teachers and convincing them of the value of this process would need some thought. The pre-school meetings for the school year would be the time and place for this training. In addition to training the teachers to fill in the bubbles, the benefits to the kindergarten teachers would also need to be addressed. The demonstration reports proved to be the most useful item for getting the teachers on board with the process. These reports showed the data in several different ways: one was a class list arranged by reading score, another was the class list arranged by mathematics score, and another was an individual student report showing both the reading and mathematics scores along with the sub-test scores. The teachers saw the individual report as especially useful for the back-to-school meetings with individual parents. It gave them talking points to help parents understand the strengths and needs of their children. (This proved to be a very good public relations tool for the teachers.) After the first full year of implementation of the data collecting and reporting system, both the teachers and administration agreed that the time and effort produced results that warranted continuing the program.

Attached to this article are two scan forms (fall and spring), the directions provided to teachers for completing the forms, and samples of Fall and Spring reports. The Fall scan form is an 8 ½ by 14 inch page that records the language arts test and subtest scores on the left side and the mathematics scores on the right side. Again, the small amount of space required for the bar code that links to the student demographic database was critical in allowing for this design. The Spring form is an 8 ½ by 11 inch page that uses both sides of the paper, one side for language arts and the other for mathematics. On both forms, the adult assessor (each child was assessed individually) was to initial the assessment score recorded; different adults were responsible for assessing children in mathematics and in language arts. Also, only the fall assessment collected data on children's pre-kindergarten educational experiences. Remark Office OMR's ability to export the scanned data to various database products made the data useful for informing teachers and administrators about the kindergarten program and about individual students.

The real power of a data collection and reporting system is in the information it provides to teachers and administrators for instructional decision making. In addition to the samples of the Remark Office OMR scan sheets, I have provided samples of some reports. Additional reports that disaggregate the data in a variety of formats were created when special requests were made. It should be noted that the database program I used for analysis and reporting is Microsoft Access®, simply because I am familiar with it. However, any relational database would work just as well for someone competent with that program. That is a great feature of Remark Office OMR—its ability to export data to just about any database format.
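To illustrate this export-and-analyze workflow, here is a minimal sketch in Python. It assumes Remark Office OMR has exported the scanned results to a CSV file; the column names are hypothetical and depend on how the scan template was defined. SQLite stands in here for Access as the relational database receiving the data.

```python
import csv
import io
import sqlite3

# Stand-in for a CSV file exported from Remark Office OMR.
# Column names here are hypothetical examples.
exported = io.StringIO(
    "student_id,reading_score,math_score\n"
    "1001,18,22\n"
    "1002,25,15\n"
)

# Load the exported rows into a relational database (SQLite here,
# but Access or any other relational database works the same way).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE scores (student_id TEXT, reading INTEGER, math INTEGER)"
)
rows = [
    (r["student_id"], int(r["reading_score"]), int(r["math_score"]))
    for r in csv.DictReader(exported)
]
conn.executemany("INSERT INTO scores VALUES (?, ?, ?)", rows)

# A simple query of the kind a class or school report could build on.
avg_reading = conn.execute("SELECT AVG(reading) FROM scores").fetchone()[0]
print(avg_reading)  # 21.5
```

Once the data is in tables like this, the class listings, individual profiles, and comparison reports described below are all just queries.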

Included attachments are the Fall and Spring scan forms, the teacher directions for completing them, and sample Fall and Spring reports.

The teacher directions were sent to each teacher along with his or her class set of fall and spring scan sheets. Within one week of receiving the completed scan sheets from the schools, teachers received from my office a language arts class listing profile, a mathematics class listing profile, and an individual student profile for each student. The Early Childhood office received a report showing the number and percent of students at the readiness level for each teacher and each school; this report was also forwarded to the school principal. In the spring, comparison reports were prepared and sent to the school principals, the Early Childhood office, and the Director of Elementary Education. These reports have been helpful in informing instruction and in modifying the curriculum. Take a look at the Fall/Spring Comparison of School/Systemwide report. You can see that in the fall, 29.6% of the students were not ready for Language Arts in kindergarten, but by spring only 12.8% were not ready for the 1st grade Language Arts curriculum. Also, 10.6% of the students were advanced in the fall, but by spring 39.9% were advanced in Language Arts. By far, the biggest change was in Mathematics: in the fall, 51.2% of the students were behind in mathematics, but by spring only 19.9% were still behind. This is evidence that the kindergarten curriculum takes students who are behind and advances them to grade level. Not all students, but more than 30 percentage points more of them are where they need to be.
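The gains quoted above are differences in percentage points, and a quick calculation from the figures in the comparison report makes that explicit:

```python
# Fall/Spring Comparison figures quoted above (percent of students).
fall_not_ready_la, spring_not_ready_la = 29.6, 12.8
fall_advanced_la, spring_advanced_la = 10.6, 39.9
fall_behind_math, spring_behind_math = 51.2, 19.9

# Changes expressed in percentage points.
la_ready_gain = round(fall_not_ready_la - spring_not_ready_la, 1)   # 16.8
la_advanced_gain = round(spring_advanced_la - fall_advanced_la, 1)  # 29.3
math_gain = round(fall_behind_math - spring_behind_math, 1)         # 31.3
print(la_ready_gain, la_advanced_gain, math_gain)
```

The mathematics change of 31.3 percentage points is what the "more than 30" figure refers to.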

The individual teacher reports shown in the examples are listed in alphabetical order by student last name. However, other reports were created using the assessment total score; these were ordered by ascending total score to assist teachers with grouping students. The color coding on the class listing profiles helped draw attention to specific areas of student need. For example, on the Fall Language Arts report, in the "Language in Use" section, the "Describe Obj" column draws the reader's attention with its pink and yellow background shading. This is an indication that the entire group of students needs instruction on this objective. On the same report, notice that the student named "Brandon" has quite a few objectives unmet. A teacher seeing this report can begin planning the kind of instruction that will meet the students' needs and avoid planning and teaching in areas the students have already mastered.

The power of the data is in the information provided by the reports, but it was the ease of collection made possible by Remark Office OMR that made the reports available in a timely and usable format. Together, Remark Office OMR and the Microsoft Access reporting system transformed data into useful educational information that moved instruction forward for children.

In my next article, “Remarkable Changes to Instruction,” I will discuss how Remark Office OMR was moved into the formative assessment arena in the reading/writing area of language arts in the Title I schools.

Thomas Hays is a retired educator who served a 50,000 student school district in Maryland as assistant supervisor for accountability and data coordinator for compensatory education from 1996 through June 2007.