Using the Tools Around Us

I enjoy tinkering with systems. I enjoy building a computer from the ground up. I enjoy exploring and asking the question of “What if…”. I enjoy making systems do something they weren’t originally intended to do. That is where this blog post starts. It starts with taking a system (Illuminate Data and Assessment) and using it to fit a need we had at work. It starts with asking that age-old question of “What if…”.
Illuminate Data and Assessment is a powerful application that allows schools to track data and assessment results about their students. You are able to tie standards to questions within a quiz and subsequently design instruction around the areas which need improvement, and more. The preceding two sentences do not do the system the justice it deserves. I really enjoy this product, and when I enjoy a product, I get overly excited and wordy trying to explain just how awesome it is! We have supported the application and its implementation in 19 charter schools in Ohio over the last three years, with close to 5,500 students. The work we are doing was submitted as part of a Straight A Innovation Grant and was awarded almost $2 million as we work to bring 17 charter schools together to collaborate and use the system.
Illuminate has a feature within the Data and Assessment application called OnTrack. The idea behind OnTrack is to combine multiple areas that you measure into a single report and set boundaries to determine if a student is on track to meet the desired goal. The idea for using OnTrack outside of the box came via a conversation we had with a school leader who wanted students to own their data and work. As I was reading books on rubrics for a presentation I am giving in June, the idea came together to use OnTrack to assess how the 19 schools we work with are progressing in their implementation of Illuminate and other items. We want the schools to become owners of the implementation process and owners of the system. We had tracked this data previously using a simple Google Spreadsheet, but that approach had two major flaws. The first flaw was that our scoring of these areas was subjective. The second flaw was that we allowed ourselves to start awarding half points.
In response to these identified flaws, I built two OnTrack configurations after discussion with my team. We identified that in the first semester of the year we evaluate different items than we do in the second semester, which was our driving factor in creating two configurations. We also developed a rubric so our scores could become objective instead of subjective. Our rubric goes from 0 to 4, and those scores are then translated within OnTrack to point values. We set the OnTrack scale for each of the configurations to a total of 1,000 points possible. One of the metrics within our grant was that 80% or more of the teachers at a school would use Illuminate on a monthly basis. We assigned this metric 300 of the 1,000 points (30%). The idea behind this thought process was that if the teachers are using the system, the other 700 points would come together. We also assigned lesser point values to other areas that we deemed important but not necessarily 100 points important. The areas we evaluate each semester are listed below, followed by a sketch of the scoring math.
1st Semester:
  1. Illuminate Site Setup/SFTP
  2. ITC/SIS Configuration
  3. Data Cleanup
  4. Nightly Data Loads
  5. Implementation Meeting
  6. Illuminate Training
  7. Data Team Schedule
  8. Data Teams Training
  9. Communication
  10. Admin Check-In Schedule
  11. Logins
2nd Semester:
  1. Data Cleanup
  2. Training
  3. Data Teams Schedule
  4. Admin Check-In
  5. Next Year
  6. Communication
  7. Data Binders
  8. Logins
  9. Online Testing
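To make the scoring math concrete, here is a minimal sketch in Python. The area weights shown and the linear 0–4-to-points translation are my assumptions for illustration, not the actual OnTrack configuration.

```python
# Minimal sketch: whole-number rubric scores (0-4) translate into points
# for each area on a 1,000-point scale. The weights and the linear
# translation below are illustrative assumptions, not the actual
# OnTrack configuration.

AREAS = {
    "Teacher Usage (80%+ monthly)": 300,  # the grant metric: 30% of the total
    "Data Cleanup": 100,
    "Illuminate Training": 100,
    # ...the remaining areas would share the rest of the 1,000 points
}

def area_points(rubric_score: int, max_points: int) -> float:
    """Convert a 0-4 rubric score into points for one area."""
    if rubric_score not in range(5):
        # The rubric fixed our old habit of awarding half points.
        raise ValueError("Rubric scores are whole numbers from 0 to 4.")
    return max_points * rubric_score / 4

# A hypothetical school's rubric scores for the areas above.
scores = {"Teacher Usage (80%+ monthly)": 4, "Data Cleanup": 3, "Illuminate Training": 2}
total = sum(area_points(s, AREAS[a]) for a, s in scores.items())
print(f"{total:g} of {sum(AREAS.values())} possible points")  # 425 of 500
```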
As we developed the summary assessments and reports that would become our OnTrack configuration, we started asking: how powerful a resource would this be if we had our schools evaluate themselves on the same rubric we evaluate them on? Illuminate has a feature that allows you to compare two or more OnTrack configurations next to each other in a single report, and thus the school response configuration was born.
We selected three schools as part of the pilot for this idea. On May 13, 2015, one of the schools provided their scores to me; we entered the values into the reports and ran the configurations. Looking quickly at the data, we found that the school awarded themselves 775 total points out of 1,000, while we had awarded the school 725 total points out of 1,000. Interestingly, the 50-point difference came from different areas. The scores, while different, place the school in the same performance band. I personally can’t wait until next week when we have our check-in meeting with the school leader so we can really get down to the details and talk about the differences in scoring. At this point the school hasn’t yet seen how we ranked them, so I am sure there will be discussion.
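As a rough illustration of what that side-by-side comparison surfaces, here is a short Python sketch that checks whether two totals land in the same performance band. The band cutoffs are hypothetical assumptions on my part; the 725 and 775 totals are the real figures above, and Illuminate’s report does this comparison for you.

```python
# Sketch: do two OnTrack totals land in the same performance band?
# The band cutoffs here are hypothetical; the totals are the real
# numbers from this pilot school.

BANDS = [(900, "Exceeding"), (700, "On Track"), (500, "Approaching"), (0, "Off Track")]

def band(total: int) -> str:
    """Return the label of the first band whose cutoff the total meets."""
    return next(label for cutoff, label in BANDS if total >= cutoff)

our_total, school_total = 725, 775
print(f"Ours: {band(our_total)}, School: {band(school_total)}")  # same band
print(f"Difference: {school_total - our_total} points")          # 50 points
```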
We hope to generate discussion between us and the school leader after they assess themselves on the same rubric we use. When you start a project, goals and ideas you had not thought of previously come about through the work. That is what happened here with adding the school self-assessment piece, but we also found another benefit we hadn’t anticipated when we started: we can now show the OnTrack system off to a school leader, which is a starting point for a discussion on how they could use OnTrack within their own school.
We plan to submit this work as a presentation topic at the Illuminate Users Conference in 2015/2016.
