The good folks at Florida Virtual Campus have released the latest version of their Student Textbook Survey. There’s already been some great coverage (e.g., Phil Hill). However, I’ve also read people saying that the results are essentially unchanged from the 2012 survey to the 2016 survey. A quick look at Table 1 on page 11 seems to justify that claim:

[table_1: Table 1 from the survey report (p. 11), comparing 2012 and 2016 responses]

However, inter-ocular speculation is failing us here. An analysis of these data demonstrates that there are in fact some statistically significant differences in student responses from 2012 to 2016. Here is how the proportion of students reporting each impact of the high cost of textbooks compares across the two surveys:

  • “Take fewer courses” was significantly lower in 2016 than in 2012.
  • “Not register for a course” was unchanged from 2012 to 2016.
  • “Drop a course” was unchanged from 2012 to 2016.
  • “Withdraw from a course” was unchanged from 2012 to 2016.
  • “Earn a poor grade” was significantly higher in 2016 than in 2012.
  • “Fail a course” was significantly higher in 2016 than in 2012.
  • “Not purchase the required textbook” was significantly higher in 2016 than in 2012.

The negative impact of the high cost of textbooks, as presented in the 2012 survey results, was already unacceptably high. The situation is largely the same or worse four years later.

Here’s the R code so you can re-run the analysis yourself, along with a screen grab of the full results.


# Responses from 2016 and 2012, Table 1, p. 11
# https://dlss.flvc.org/documents/210036/361552/2016+Student+Textbook+Survey.pdf
take_fewer_courses <- c(.476, .491)
not_register_for_a_course <- c(.455, .451)
drop_a_course <- c(.261, .267)
withdraw_from_course <- c(.207, .206)
earn_a_poor_grade <- c(.376, .34)
fail_a_course <- c(.198, .17)
not_purchase_textbook <- c(.665, .636)

# Survey n for 2016 and 2012, Table 1, p.11
# https://dlss.flvc.org/documents/210036/361552/2016+Student+Textbook+Survey.pdf
n_2016_2012 <- c(20557, 18587)

# Chi-squared test for equality of proportions, 2016 vs. 2012
# (the counts passed as x are reconstructed from the rounded proportions above, so they are approximate)
prop.test(x = take_fewer_courses * n_2016_2012, n = n_2016_2012, correct=FALSE)
prop.test(x = not_register_for_a_course * n_2016_2012, n = n_2016_2012, correct=FALSE)
prop.test(x = drop_a_course * n_2016_2012, n = n_2016_2012, correct=FALSE)
prop.test(x = withdraw_from_course * n_2016_2012, n = n_2016_2012, correct=FALSE)
prop.test(x = earn_a_poor_grade * n_2016_2012, n = n_2016_2012, correct=FALSE)
prop.test(x = fail_a_course * n_2016_2012, n = n_2016_2012, correct=FALSE)
prop.test(x = not_purchase_textbook * n_2016_2012, n = n_2016_2012, correct=FALSE)
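
If you’d rather see all seven comparisons at once, here’s a minimal sketch that loops over the same vectors and collects the p-values into one table. (The impacts list, the summary_table data frame, and its column names are my own conveniences, not part of the survey or of the analysis above.)

# Convenience sketch: run the same seven tests in a loop and collect the results
impacts <- list(take_fewer_courses = take_fewer_courses,
                not_register_for_a_course = not_register_for_a_course,
                drop_a_course = drop_a_course,
                withdraw_from_course = withdraw_from_course,
                earn_a_poor_grade = earn_a_poor_grade,
                fail_a_course = fail_a_course,
                not_purchase_textbook = not_purchase_textbook)

summary_table <- do.call(rbind, lapply(names(impacts), function(impact) {
  props <- impacts[[impact]]
  test <- prop.test(x = props * n_2016_2012, n = n_2016_2012, correct = FALSE)
  data.frame(impact = impact,
             prop_2016 = props[1],
             prop_2012 = props[2],
             p_value = test$p.value)
}))

summary_table

Reading down the p_value column should reproduce the pattern in the bullets above: one impact lower in 2016, three unchanged, and three higher.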

[Screen grab of the full prop.test results: flvc-survey-comparison]

Having students grade each other’s work is a time-honored tradition among faculty looking to save themselves some time and headache. In addition to appreciating the time savings, many faculty argue that participation in the peer assessment process can actually promote deeper student learning. This is absolutely true when faculty take the time necessary to design the peer assessment experience and supporting artifacts (like rubrics) well. (Though you may not experience a net savings in time after you do all this preparatory work!)

I’ve been thinking recently about peer assessment in the context of renewable assessments. Recall that renewable assessments ask students to produce artifacts of value in the real world (as opposed to disposable assignments, which both faculty and students understand will be thrown away after grading). When one student is providing feedback, encouragement, criticism, and suggestions for improvement to an artifact another student is in the process of making, at what point does that stop being peer assessment and become co-creation? Honestly, I think at this point it’s already crossed over.

We frequently hear people characterize students as (ideally) being co-creators of knowledge rather than mostly empty receptacles into which we pour the dates of wars and the names and weights of chemical elements. I’m a firm believer in that vision, and, having been teaching in higher ed for twenty years now, I recognize how difficult it can be to pull off in practice. But I think there’s something to this idea of combining renewable assessments with peer assessments… it feels like we should lean into it.

When the renewable assessment process begins with existing open artifacts created by others that can be revised, remixed, and built upon, and we then layer peer assessment on top of that, the result is a special kind of co-creation for continuous improvement – a stigmergy that feeds back on itself in ways that seem like they ought to result in amazing student learning (though whether or not they do is an empirical question we should answer).

Perhaps we should call this activity continuous co-creation?

Let’s dig a little further into the idea of stigmergy – and particularly how it relates to open. Wikipedia summarizes:

Stigmergy is a mechanism of indirect coordination, through the environment, between agents or actions. The principle is that the trace left in the environment by an action stimulates the performance of a next action, by the same or a different agent. In that way, subsequent actions tend to reinforce and build on each other, leading to the spontaneous emergence of coherent, apparently systematic activity.

Stigmergy is a form of self-organization. It produces complex, seemingly intelligent structures, without need for any planning, control, or even direct communication between the agents. As such it supports efficient collaboration between extremely simple agents, who lack any memory, intelligence or even individual awareness of each other.

Or, as Sun, Marsh, and Onof summarize, “In its most generic formulation, stigmergy is the phenomenon of indirect communication mediated by modifications of the environment.”

A few points. First, these agents communicate indirectly by modifying their environment. Stop and think for a moment – a large part of students’ learning environment is the media (readings, videos, etc.) they read and watch. (And content makes up an even larger part of the learning environment for online learners, who lack the in-class aspects of the learning environment.) Inasmuch as content is the environment, how can we expect learners to modify their environment when it is copyrighted and locked down under digital rights management technology? They can’t. Only open content is modifiable in this way.

Second, like many self-organizing systems, stigmergy benefits from a larger number of agents participating in the system. For example, go spend a few happy moments playing with this simulation of ants discovering food and bringing it back to the nest. Click setup, then click go and watch them work. They communicate indirectly by modifying their environment – in this case, ants who find food leave an evaporating trail of pheromones behind as they return to the nest. Other ants who cross this trail in their random wandering turn upstream and follow the pheromone trail.

Click go again to stop the simulation and turn down the number of ants to 50. Click setup and go again. What do you notice? It takes a lot longer for the ants to find and harvest the food. Now increase the number of ants to 200 and run the simulation again. What do you notice? You can try the same manipulations (increasing and decreasing the number of agents in the system) with this simulation of birds flocking or with others of the awesome NetLogo sims. The lesson is this – more agents result in more interesting and useful behavior emerging more quickly.
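
If you want to tinker outside of NetLogo, here is a toy version of the same idea in R. To be clear, this is not the NetLogo model: the grid size, the evaporation rate, the straight-line return to the nest, and the stored “which way to the food” hints are all simplifications I have assumed to keep the sketch short. But it captures the stigmergic loop described above: ants wander at random, an ant that finds the food carries it straight back to the nest while marking the cells it crosses, and wandering ants that cross a marked cell turn upstream and follow the trail toward the food. Unreinforced trails evaporate.

# A toy stigmergy model (all parameters are illustrative, not taken from NetLogo)
run_colony <- function(n_ants, grid = 41, food_units = 50,
                       evaporation = 0.95, max_ticks = 20000) {
  nest <- c(ceiling(grid / 2), ceiling(grid / 2))  # nest in the center of the grid
  food <- c(grid - 3, grid - 3)                    # food source near one corner
  pher  <- matrix(0, grid, grid)                   # pheromone strength per cell
  dir_x <- matrix(0, grid, grid)                   # stored "which way to the food" hints
  dir_y <- matrix(0, grid, grid)
  ax <- rep(nest[1], n_ants)
  ay <- rep(nest[2], n_ants)
  carrying <- rep(FALSE, n_ants)
  delivered <- 0

  for (tick in 1:max_ticks) {
    for (i in 1:n_ants) {
      if (carrying[i]) {
        # Step straight back toward the nest, marking the cell we arrive at
        sx <- -sign(ax[i] - nest[1])
        sy <- -sign(ay[i] - nest[2])
        ax[i] <- ax[i] + sx
        ay[i] <- ay[i] + sy
        pher[ax[i], ay[i]]  <- pher[ax[i], ay[i]] + 1
        dir_x[ax[i], ay[i]] <- -sx   # the cell we just came from lies toward the food
        dir_y[ax[i], ay[i]] <- -sy
        if (ax[i] == nest[1] && ay[i] == nest[2]) {
          carrying[i] <- FALSE
          delivered <- delivered + 1
        }
      } else if (pher[ax[i], ay[i]] > 0.05) {
        # Crossed a trail: turn "upstream" and follow it toward the food
        dx <- dir_x[ax[i], ay[i]]
        dy <- dir_y[ax[i], ay[i]]
        ax[i] <- ax[i] + dx
        ay[i] <- ay[i] + dy
      } else {
        # No trail here: wander one random step, staying on the grid
        ax[i] <- min(max(ax[i] + sample(-1:1, 1), 1), grid)
        ay[i] <- min(max(ay[i] + sample(-1:1, 1), 1), grid)
      }
      if (!carrying[i] && ax[i] == food[1] && ay[i] == food[2]) carrying[i] <- TRUE
    }
    pher <- pher * evaporation   # unreinforced trails evaporate
    if (delivered >= food_units) return(tick)
  }
  NA  # colony did not harvest everything within max_ticks
}

# Ticks needed to harvest all the food for colonies of different sizes
set.seed(42)
sapply(c(25, 50, 100, 200), run_colony)

Running the last line should show the 200-ant colony harvesting all the food in far fewer ticks than the 25-ant colony, which is exactly the lesson of the NetLogo exercise: more agents, faster emergence.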

Because OER are freely available, they maximize the number of agents who can be influenced by them and the traces left in them by other students.

In summary, commercially published resources both restrict the number of agents who can be influenced by them (by being too expensive) and disallow students from making modifications to them (by being tightly copyrighted and severely DRMed). In other words, it is impossible for stigmergy to emerge in the context of commercially published resources.

Open educational resources, on the other hand, maximize participation by being free and permit the modifications that allow interesting things to emerge from learners’ interactions with them. In other words, OER make the emergence of stigmergy possible – given the right supporting technology.

And so we’ve connected openness and stigmergy, but what does any of this have to do with renewable assessments? When students are revising and remixing OER (modifying their environment) and those traces are left behind (openly published) for other students to find and build on – that, my friends, is stigmergy. Add peer assessment, aka continuous co-creation, to the mix, and we’ve shortened the wait time between interactions, making the whole system evolve more quickly.

This has a range of implications both for student learning and the sustainability of the broader OER ecosystem.

Of Sunlight, OER, and Lumen

We recently installed solar panels on our home. The benefits of adding them were immediate and obvious – the very first month they were on the roof our electric bill dropped to $9 (the fee required to stay connected to the grid) and we generated more power than we used, pushing the excess back out to the grid. Because I can’t stop thinking about open, I’ve been pondering the relationship between solar power and OER.

At the same time, I’ve been thinking about how to answer several questions I’m often asked. People who don’t work directly with Lumen sometimes have a hard time understanding what we do, and this leads to a range of confused questions like, “What does Lumen do, anyway?”, “How can you sell OER if they’re free?”, and “If OER are free, why would anyone pay you?”

As I’ve continued to think about these two topics, I realize they’re actually closely related. In fact, I believe the simplest way to answer many questions about Lumen is by analogy. Let me explain…

Sunlight is perhaps the ultimate example of a public good. Both nonexcludable and nonrivalrous, sunlight is available to anyone and everyone for free. Sunlight is highly versatile and can do everything from making your garden grow to melting the snow off your driveway. For years now I’ve been hoping to use solar to power my home – to harness the sunlight so that it consistently and reliably does what I want (i.e., provides electricity to cool my house, run my lights, keep the wifi on, etc.). But in order to get the sunlight to do what I wanted it to, I needed to partner with someone else who had the right combination of expertise and technology.

There were dozens of questions to answer… Things like, what type of panels should I use? What’s the right number of panels to install given the amount of power I need versus what I’m likely to generate? How and where will I mount them? Should I use a system with microinverters on each panel, or one large inverter? How do I integrate the power the panels produce into my home without burning the house down? How do I tie into the grid so I can push excess power there? Does a battery system make sense in my circumstances? If so, how many batteries should I use and what kind? Do I want to be able to monitor my energy production live, or is checking the power bill at the end of the month sufficient? Etc.

You get the idea. Of course, I could have stopped everything else in my life for a few months and learned most of what I needed to know to answer these questions myself. I could probably even have done a passable job with the installation and with tying into the house and the grid. But the end product of working with a company that already had the expertise and experience, and knew the relevant technology inside and out, was far beyond what I could have done myself.

And yes, I paid them – even though sunlight is free.

OER are a lot like sunlight. They’re another prime example of a public good. They are freely available to anyone and everyone. They’re extremely adaptable. And like sunlight, in order to use OER to reliably and predictably meet our goals (improving student outcomes and saving students money), we need to apply a combination of expertise and technology.

Again, there are dozens of questions to answer. Are you trying to replace commercial materials with OER across an entire degree program or in just a single course? Is the primary goal to improve student learning, increase graduation rates, or save students money? How, specifically, do you optimize for each of these outcomes? Are you willing to take a fresh look at your pedagogy? How would renewable assessments work in your discipline? Which OER should you use, and where do you find them? What tools will you use to revise and remix the OER you select? How are students going to access these OER – online or in print? If online, how will you integrate them into your LMS in a sustainable way? If print, how are you going to manage that process? Is this a math or other quantitative course that requires algorithmically generated and graded practice problems? What role would you like automated systems to play in personalizing the learning experience for your students? How can learning analytics help you strengthen your relationship with your students? Based on last semester’s learning results, where and how should you engage in continuous improvement of your OER?

You get the idea. Of course, many faculty could stop everything else they’re doing for a few months and learn much of what they need to know about copyright law and open licensing, instructional design and learning science, open source platforms for revising and remixing, relevant technical standards, statistics and data science, etc., to answer these questions themselves. They could probably even do a reasonable job of pulling it all together. But when faculty collaborate with Lumen, the end products are far better than what they can typically do by themselves.

And yes, institutions pay us – even though OER are free.

I think this analogy between sunlight and OER, and by extension between Lumen and a renewable energy company, works pretty well. Returning to the questions above:

Q. What does Lumen do, anyway?
A. In the same way a solar power company helps people harness sunlight to power their homes, Lumen helps faculty harness OER to power student learning.

Q. How can you sell OER if they’re free?
A. Solar power companies don’t sell sunlight and Lumen doesn’t sell OER. Sunlight and OER are free.

Q. If OER are free, why would anyone pay you?
A. We provide expertise and technology that help people make effective use of OER, just like solar power companies provide expertise and technology that help people create electricity from sunlight.

PS. It occurs to me that there may be more to do with this analogy. If we can fruitfully compare OER to sunlight and Lumen to a renewable energy company, should we compare commercial textbooks to oil and traditional publishers to extractive energy companies? Should we compare the broad move away from commercial materials and toward OER-based degrees with the move away from fossil fuels and toward renewable energy? These are questions for another time.
