Wrapping Up The Rubrics

I started this series with a general overview of rubrics in the post Evaluating Online Courses: a prelude. I learned a lot reviewing the seven rubrics below for online courses:

  1. CA Community College’s Online Education Initiative Rubric
  2. Quality Matters
  3. Illinois Online Network’s Quality Online Course Initiative
  4. Online Course Evaluation Project
  5. CMU’s Quality Assurance Checklist
  6. OCAT and Peer Assessment Process
  7. CSU’s Quality Online Learning & Teaching Rubric

Looking at these rubrics was a nice walk through the recent history of online learning, and each rubric or checklist tells us how its institution defines online learning. I still have a lot of questions to answer, such as:

  • How do we define an online course?
  • How does that definition inform the review process?
  • Who should review or assess an online course? In house or out?

These are all questions institutions should ask as they formulate an implementation process. It is a good time to look at the strengths and challenges of existing courses and programs, and an opportunity to see what resources such a process would require and what it would bring.

There is a temptation, when creating online course rubrics, to attempt to assess everything related to the online class, including things like Section 508 accessibility compliance that should be referred to experts. But this also points to an opportunity: a rubric can help faculty inventory the services and resources on a campus that are there to support teaching and learning.


The process should be updated every year or two because of the rapid changes in technology. Concerns that were prevalent ten years ago may not be concerns today. Also, just like choosing (or not choosing) a learning management system or a content management system, the process should reflect the needs of the whole community. If you have a population of students with special needs (e.g., developmental students or first-generation college goers), then access to the resources that ensure student success should be assessed in online courses.

Why Open Source?
Each institution or community is unique enough to warrant taking the time to work out a process that addresses its own needs: each has its own population of faculty, students, and staff; its own access to resources (as rich or limited as that may be); and its own challenges. This is why it is so important that this work be open source: artifacts created by this process should be openly licensed. Copyrighting a process is like saying that all institutions are alike and share the same academic and student culture. For a while, all of the Quality Matters training material, research, and copies of the rubric were either unavailable or harder to find after Quality Matters chose the copyrighted path when fiscal year 2007 marked the end of their grant. This closes the door on future development for the sake of "sustainability." I appreciate the open license on the CSU's QOLT rubric: if I make changes to solve local problems, the CSU can get a copy, and when they encounter similar situations, the hard work will already be done for them.

A Meta-Rubric Example
Much like the rubrics used to evaluate software, I have been thinking about a meta-rubric for reviewing online course assessment rubrics. My example includes criteria that are common in the educational technology world: ease of use, cost, and validity of scoring. I have also included criteria that would be useful for a few of the community colleges I have worked for, as well as some of my own values: openness and creating community. My example rubric is meant to start a conversation about what is important for your institution.

This is my “meta-rubric” example. Please feel free to copy, share, or comment.
