Sept. 19, 2015 Update: I originally published this post incorrectly as a “page” rather than a blog post, so I’ve now moved it into a post. Also, I wanted to let you know that I tried pair-programming. It did indeed make me slightly stabby, as predicted. Unfortunately, with pair-programming of instructional content, you only get the most basic level of feedback, e.g., “The title of the slide has a typo.” Neither designer nor reviewer has enough time or distance from the content to evaluate it critically, and the level of critique ends up focused on tactical micro-edits. I’m not sure how to leverage pair-programming except perhaps for copy-editing, where micro-edits are important and expected.
There are systems of evaluation for instructional content, such as Don Kirkpatrick’s four-level training evaluation (R.I.P., Mr. Kirkpatrick), that add a little structure to the content-review process. I’ve participated in a few Kirkpatrick-style reviews, and I find them interesting and thorough. However, I don’t find them particularly practical. They are a lot of work; they take a lot of time; and they result either in a thumbs-up or in recommendations for redos or additions that no one has the time to complete. In the end, the post-Kirkpatrick recommendations gather dust bunnies in the corner while the business moves on to the next shiny bauble.
I’ve searched the interwebs high and low for how people are adapting Kirkpatrick to AGILE / SAM. I have, however, found a LOT of internet content on how AGILE code reviews are conducted in software development. In code reviews, software programmers get together to evaluate how a particular piece of software has been programmed. Code reviews have been around for as long as software programming itself, but AGILE code reviews are a relatively new concept.
Josh Cohen discusses four types of AGILE code review in this article, and they almost all work (or would work) in the context of instructional design:
- “Over the shoulder” – This is basically a peer review system where the designer and reviewer informally and synchronously review content. Then, the designer revises content on their own.
- “Pass around” – this is how I currently get SME reviews. I email my near-completed content to SMEs, who then review the content asynchronously and provide feedback via email.
- “Pair programming” – this is where two people are assigned the same programming task. One person writes the code while the other looks on and comments as the code is being written. Designing instructional content this way would make anyone homicidal.
- “Tool assisted” – Cohen describes a software tool that allows programmers to check in their code and then request and collect feedback in the same tool. *Brakes squeal* Do any creative content tools have collaboration features? Not really, but it would be immensely helpful.
I WISH we had this in instructional design tools. To my knowledge, neither Storyline nor Captivate has built-in tools for collaboration, review, or versioning.
PowerPoint has comment and edit tracking; this works well with a single reviewer, but more than one reviewer causes chaos. An interesting note: PowerPoint slides can be versioned using a free plugin from Perforce, a version-control vendor often cited in code-review discussions.
This article describes content-building tools that have built-in collaboration features – basically saying that Google docs leads the way in this regard. It also brings up SlideRocket, now Clearslide.
I signed up for a free trial of Clearslide. I like it for what it is – a tool that allows you to build, record, and present slideshows in the cloud. It is billed as a sales-pitch tool, but SMBs could run a free customer training program on it. What I don’t like about it is that the user setup is downright buggy and the collaboration tools are quite limited. Still, Clearslide has the potential to be the holy grail of instructional design technology. Add real versioning, real collaboration, reviewing, and some interaction and quizzing, and Clearslide would have a real winner in the instructional design space.
So, for now, AGILE evaluations will be low-tech for most of us. “Over the shoulder” and “pass arounds” are already in play for most instructional designers. We just need the industry to catch up to what we need.