I’m going to write a post or three about some of the friction that exists around using OER. There are some things about working with OER that are just harder or more painful than they need to be, and getting more people actively involved in using OER will require us to reduce or eliminate those points of friction.
I’ve been writing about continuous improvement in the context of OER for a few years now. To date, I’ve written about and worked on reducing the friction involved in a relatively centralized model for continuous improvement of OER – a “top down” approach, if you will:
- The original RISE framework article in IRRODL
- A Framework for the Continuous Improvement of OER
- RISE and Instructional Design
- The RISE Package for R: Reducing Time Through the OER Continuous Improvement Cycle
Today I want to write about the other side of continuous improvement – a complementary, “bottom up” approach to facilitating broad participation in the continuous improvement process based on individuals’ experiences as opposed to data analyses.
It’s a well-established principle in open source software, open content, and other volunteer settings that when it’s hard to contribute, not many people contribute. When it’s easy to contribute, more people contribute. In fact, one of the most important keys to unlocking participation by a community is removing any friction they might experience in the process of participating.
Stop and think for a minute about how the process of suggesting an improvement to an educational resource typically works. It generally happens in one of two high-friction ways. In the first model, you send an email to an address like firstname.lastname@example.org or email@example.com. In that email you have to try to describe exactly where the improvement should be made, probably by including a URL or page number, followed by a description of where on the page the change is supposed to go (“in the second sentence of the seventh paragraph…”). Then you can finally describe the specific change you think needs to be made. Or perhaps the provider wants you to use their ticketing system, and you end up in a piece of software like Zendesk. Once you figure out how to create a ticket, you can write one up that includes all the information described above (you might even be asked to do some tagging of your ticket). Either way, if they choose to make the improvement you suggest, it will likely be months or years before students in the class benefit from your suggestion, since it will be rolled into the next edition.
In other words, there’s a ton of friction in this process. And because it’s so painful, countless suggestions are never made that would have been made if the process were easier.
This semester at Lumen we’ve launched a continuous improvement pilot in which I believe we’ve removed just about all the friction that’s possible to remove from this process. Here’s how it works:
- There’s a new button at the bottom of every page of content. It says “Improve this page.”
- When a student or teacher or other user from the public web clicks the button, they’re linked directly to a Google Doc which includes all the content from the page. The Google Doc is shared publicly and has Track Changes turned on. So you can just begin typing or commenting immediately, and your suggestions are highlighted and tracked.
- You’re done making your suggestion!
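The mechanics behind the button are simple enough to sketch: each page of content just needs a persistent link to its own publicly shared, suggestion-enabled Google Doc. Here’s a minimal illustration in Python – the page IDs, Doc URLs, and function name are all hypothetical, since the post doesn’t describe Lumen’s actual implementation:

```python
# Hypothetical mapping from courseware page IDs to their publicly shared,
# Track Changes-enabled Google Docs. In practice this mapping would live
# wherever the courseware stores page metadata.
PAGE_TO_DOC = {
    "course-101/page-culture": "https://docs.google.com/document/d/EXAMPLE_ID_1/edit",
    "course-101/page-norms": "https://docs.google.com/document/d/EXAMPLE_ID_2/edit",
}

def improve_this_page_link(page_id: str) -> str:
    """Return HTML for the 'Improve this page' button at the bottom of a
    content page, linking the reader straight to that page's Google Doc."""
    doc_url = PAGE_TO_DOC[page_id]
    return f'<a class="improve-page" href="{doc_url}">Improve this page</a>'
```

Because the Doc itself handles sharing, commenting, and change tracking, the button is the entire integration surface – there’s no form, no ticket, and no account creation standing between the reader and their suggestion.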
How easy is it to suggest an improvement to OER now? Faster than 30 seconds easy!
The pilot is active in three courses and we’re already receiving great feedback. Much to our excitement, many of the suggestions appear to be coming from students. And they’re sending everything from spelling errors they catch to suggestions about how to make course content more inclusive.
The awesome folks on Lumen’s continuous improvement team have developed tools and workflows that allow us to track the amount of time it takes us to vet these suggestions and get them implemented in the canonical copy of the courseware. We’re averaging well under 24 hours from the time a suggestion is made until it’s vetted and implemented, and we think we can continue to be that responsive even as the number of suggestions grows after the pilot.
And here is where Lumen’s model particularly shines – since our courseware is embedded in the LMS via LTI (rather than copied and pasted into the LMS), all these improvements are immediately available to everyone using the courseware the instant we make them. The OER is literally getting a little better every single day – benefiting both the teachers and students who have formally adopted the courseware in their LMS as well as the informal learners who access our OER on our website. That’s the beauty of transclusion – the OER embedded in the courseware and the OER published on our public-facing website are the same copy of the OER. Update once, improve everywhere.
So far the suggestions people are making aren’t something you could openly license and attribute – they’re either high-level ideas or lower-level fixes to spelling or grammar, neither of which rises to the level of a copyrightable work. So to make sure people are recognized for their efforts to make things better, we’ve created a new Acknowledgments section on the About This Course page in the pilot courses, and we’re adding the names of everyone (faculty, student, or otherwise – it doesn’t matter who you are) who uses their name when making suggestions that we integrate into the courseware.