Categories
learning, open content, open education

We Should Pause and Ask the Question

There’s a really terrific conversation happening on the cc-openedu listserv. It started out as a question about OER, but has moved on to a conversation about the purposes of open more generally. Dr. Chuck contributed over the weekend, and his contribution provides a great opportunity for me to respond with the first substantive post since I changed the name of the blog.

All the pull quotes in this post are from Dr. Chuck. He writes:

[Y]ou can wave your hands and dream of open content and a complete open source chain of production where the raw material is openly licensed and *everything* in the value chain is also free. I have done this for *one* course – Python for Everybody. If you start with my github repo, you can build an LMS, a web site, an online teaching system, and even a camera ready textbook ready for printing using 100% free software. I use LaTeX and pandoc for the print book – it is perhaps less convenient that [sic] Word but it is free. P.S. I also have a README that tells how to do it.

It is possible with great effort, but there are a number of problems.

First, it is harder to keep a 100% open chain of production from beginning to end – you need solid technical skills and a lot of patience. I have shown 100’s of people my 100% open process – and literally no one has replicated it because it is easier to just fall into the easier path of proprietary approaches.

There is an incredibly important point that I think people frequently miss. We are often told that the most important purpose of being open is to increase access. In theory, when it comes to educational resources nothing could possibly provide more access to more people than openly licensed source code to an LMS, website, online teaching system, and camera ready textbook. It’s got everything you need. You can adapt it any way you like. You can run it anywhere you like. You can use it any way you like. And, in the case of Python, it actually exists – right there in Dr. Chuck’s GitHub repo. 

That’s the theory. But what actually happens in practice? Dr. Chuck tells us that, despite the thousands of hours of time and effort he has invested to both (1) make the full chain of tools and content open and (2) show hundreds of people how to use these tools and content themselves, literally no one has done so. 

We should pause and ask which approach really, truly, provides access to more people – (1) the Python for Everybody GitHub repo with an excellent README and all the 5R permissions or (2) the ready-to-use version of Python for Everybody on Coursera? Empirically speaking, it’s approximately one million times easier to take Python for Everybody on Coursera than it is to install, configure, troubleshoot, and run all these tools yourself. We know this is true because, as of March 1, 2021, 985,081 people had enrolled in Python for Everybody on Coursera. And remember: no one has stood up the tool chain themselves.

As a 100% open end-to-end person (one of the few) – I get it – and accept that 100% open is always going to be a hard path – it is like being a monk and sleeping on cold stone floors 🙂

I have nothing but respect for Dr. Chuck. There are often sacrifices associated with living according to one’s principles, and he is obviously making them. And the incredible difficulty of this 100% open path should cause us to pause and ask ourselves about the problem we are trying to solve. If the primary goal of being open is to increase access, the Python for Everybody example is extraordinarily interesting. Think about first-generation, minoritized, or otherwise at-risk learners. Think about the (primarily adjunct) faculty who serve them.

Now, sincerely, pause and ask yourself – which provides these students and their faculty with more access – an openly licensed GitHub repo or an easy-to-use offering on Coursera? Even when the Coursera option costs $49 – which provides more access to more people? Really think about it. In terms of raw numbers, how many people who might be interested in learning Python can come up with $49? How many people who might be interested in learning Python can get the 100% open tool chain up and running?

By the way – there is a way to make money on things like printing, hosting, support, etc etc without violating the principle of 100% open. The Occam’s razor is whether someone else can replicate what you are doing if they put their mind to it.

I understand what Dr. Chuck is trying to say here, but I want to point out that someone can always replicate what you’re doing if they put their mind to it. In fact, the stories of the most successful open source software are stories of someone putting their mind to replicating the functionality of proprietary software – Linux from Unix, LibreOffice from MS Office, GIMP from Photoshop, &c., &c. You might even argue that many popular OER work very hard to closely replicate proprietary textbooks. But back to the point I believe he was actually trying to make…

This is clearly a principle he feels strongly about, given the monk-sleeping-on-cold-stone-floor level of commitment he puts into making sure he provides both options for Python for Everybody – the DIY-able GitHub version and the much more usable Coursera version. But I suppose at some point it’s worth pausing to ask the question: if no one takes advantage of the DIY version, how much effort does it really merit? Especially when that effort could go into other work that would have a much larger impact?

When I was a Shuttleworth Fellow I received one of the most difficult pieces of advice I’ve ever been given. The advice was this – “don’t let your principles keep you from accomplishing your mission.” (This was in the context of a discussion of Lumen’s principle of never charging students directly.) I didn’t fully understand it at the time. In the early days of Lumen, I thought the important work we were called to do was to make more things more open. Open all the things!!! Make them as open as possible!!!

But open is a means, not an end

Lumen’s mission is to “enable unprecedented learning for all students.” There are myriad ways that leveraging the 5R power of open helps us do that. But there are also cases when investing the time and effort necessary to be even more open would actually work against our real mission, which is enabling unprecedented learning for all students – not being as open as possible.

The thing that pisses me off in conversations like these is how those who are “pretty open” or “open core” or whatever participate in these discussions with the intent of defining their variation of hybrid proprietary + open in the name of money as “good enough” or “the best we can do” or “an ideal compromise”. They want validation / kudos / accolades for their particular choice of non-open bits.

I get that there are a lot of “not 100% open” business models and those should be “allowed” – but they should not be “celebrated” as the “pinnacle” of open just because someone makes a speech about how *their* hybrid model is the best we can do.

I don’t know that Dr. Chuck is talking to me directly here, but even if he’s not I think it’s a productive exercise to spend time with critiques from smart people you respect and see what you can learn from them. 

Speaking personally, I’m not interested in creating a business model that is celebrated as the pinnacle of open by anyone. I’m interested in developing a model that maximizes positive impacts on student success. As I’ve written before, when you’re talking about the impact educational materials can have on student outcomes, I think the key metrics are success, scale, and savings: how much can you improve outcomes, for how many students, and how much money can you save them while doing so.

Open – as in openly licensed content – has an important role to play in this effort for me personally. It enables continuous improvement, which drives gains in outcomes for students. It sidesteps royalties, which means greater savings for students.  But open is not the star of the show. It plays second fiddle to a number of things, like evidence-based learning design. (Increased access to ineffective learning materials doesn’t help anyone.)

If your goal is to be as open as possible, it will lead you to make one set of choices. You will begin by assuming Everything Should Be Open™ and work hard not to lose ground on that commitment. If your goal is anything else, you’ll think about open instrumentally – as one of many tools to accomplish your goal. Each place you might choose to be open, you will thoughtfully consider whether doing so would increase the likelihood of accomplishing your goal. You will feel like open is a means and not the end.

We should pause and ask the question – is more open always better?

Categories
instructional-technology, open content, research

Fall in Love with the Problem, Not the Solution

Curt Bonk and I recently published a Preface for a special issue of ETR&D on Systematic Reviews of Research on Learning Environments and Technologies. It is largely a collection of personal stories and reflections about the arc of learning technologies over the last 30 years. However, we close with some advice which I believe to be profoundly important for everyone working in and around the learning technologies field, including open advocates.

Perhaps the most frustrating thing about the field of learning technologies is the way it obsesses over technologies while devaluing or even ignoring problems faced by learners around the world. For decades, learning technologies like those discussed in this special volume have been elevated to objects of study in and of themselves. All too frequently, those working in our field respond to questions about their research agenda with answers like “I study iPads,” “I study augmented reality,” or “I study open educational resources.” We question whether this fetishization of learning technologies will help us make sustained, meaningful improvements to the world in the future. As long as we are focused on the tools themselves, the ongoing march of learning technologies will resemble an endless series of waves eternally breaking on the shore only to draw out and come crashing in again without making a visible difference in the surrounding landscape.

We encourage learning technologists to follow the old advice, ‘fall in love with the problem, not the solution.’ The world is full of so very many problems that desperately need solving—racism, poverty, crime, climate change, war, Internet access, educating refugees… the list goes on and on and ranges from the local to the global. At the very least, we encourage the reader to consider adding a problem to their answer to the question above. For example, “I study how to help young women maintain their interest in science and math into their high school years. iPads show real promise for mitigating this problem.” Or “I study how to make higher education more effective and affordable for students who are most at-risk. Open educational resources have an important role to play in making that happen.”

Fall in love with a problem—let it be your “standing wave.” Then as the inevitable extended, connected, and repeated waves of learning technologies roll past over the years, you will have a steady foundation from which to evaluate and use them instrumentally to make the world a better place.

You can read the full article online.

Categories
learning analytics, open content

From Actionable Dashboards to Action Dashboards

Dashboards in educational contexts usually consist of visualizations of educational data. For example, dashboards inside the LMS often consist of line charts, bar graphs, and pie charts of data like logins and quiz scores.

The primary goal of educational dashboard design is, ostensibly, for them to be “actionable.” In other words, a teacher, student, or administrator should be able to take an action after spending time interpreting a dashboard. For that to be possible, three questions must be answered in the affirmative.

  1. Does the visualization communicate something useful? (design relevance)
  2. Can faculty figure out what that useful bit of information is after studying the visualization? (data literacy)
  3. Can faculty figure out what to do based on that useful bit of information? (instructional design literacy)

Design Relevance

Let me use course design as a metaphor for educational dashboard design. All too often, course design starts with selecting a textbook. The course syllabus is then designed around the contents of the book (week 1 = chapter 1). Hopefully, the textbook comes with some homework problems, a quiz bank, or some other assignments that can be used to assess student learning. These also get scheduled on the syllabus. If it’s an accreditation year, faculty may need to reverse engineer the contents of their book and the assignments they’ve chosen to come up with some “learning outcomes” they can list on the syllabus.

This sequence of course design decisions is so common that the more appropriate way of designing a course is popularly referred to as “backward design.” Backward design begins by deciding what students should learn (choosing learning outcomes), then deciding what evidence they would have to present for you to believe they had learned it (designing assessments aligned with your learning outcomes), and then choosing activities for students to engage in that will support their learning (like reading a chapter, watching a video, participating in a debate, etc. aligned with your learning outcomes). “Backward design” begins from the goal rather than ending with it.

Similarly, dashboard design frequently begins with the question “what data are available to be visualized?” Once that question is answered, the discussion moves on to questions like “what are reasonable ways to visualize those data? Which would be easiest to understand? Which would be most beautiful?” Only rarely – if ever – is the question asked, “what do we hope a teacher will actually do after looking at this dashboard?” We seldom begin with the end in mind.

Faculty Literacies

Even when educational dashboard design begins with the end in mind, the effective use of dashboards still depends entirely on faculty’s levels of data literacy and instructional design literacy. Believe it or not, not all faculty are comfortable interpreting graphs. There’s no guarantee that even a majority of faculty can read “through” a visualization to grasp the useful information it’s trying to convey. And while all faculty will have had at least one math or quantitative literacy course, the overwhelming majority of them will have never received any instructional design training. This means they’re unlikely to be familiar with research suggesting which responses might be most effective in the context of whatever useful information they’ve gleaned from the dashboard.

What is an “Action Dashboard”?

Rather than making dashboards that are “actionable,” why not make dashboards of actions? In other words, what if we didn’t convert raw data into visualizations that we hope faculty have sufficient data literacy to interpret correctly, and then further hope that they also have the ID literacy necessary to do something useful with that information? What if dashboard designers applied the necessary data literacy and ID literacy upstream, in the design of the dashboard itself, and simply presented faculty with a list of specific actions they might consider taking? Here’s one example, presented two ways:

Current model: (1) Show a line graph of course activity.

A line graph of student course activity
From https://community.canvaslms.com/docs/DOC-17996-how-do-i-compare-the-weekly-online-activity-chart-graph-with-a-section-or-student-filter-in-new-analytics

(2) Hope that faculty can interpret it accurately. (3) Hope that faculty understand that a student not logging in regularly could indicate that they’re having some kind of trouble. (Yes, I realize that’s not what the data in my sample visualization show. But work with me here.) (4) Hope faculty decide that a reasonable thing to do would be to reach out to students who haven’t logged in for a while and check on how they’re doing.

New model: Create a list of students who faculty should check in on because they haven’t logged in for a while.

Contact These Students
David Wiley email / dismiss suggestion
David hasn’t logged in for 14 days. Consider checking in today to see how they’re doing.
Elaine Wiley email / dismiss suggestion
Elaine hasn’t logged in for 8 days. Consider checking in soon to see how they’re doing.
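The logic behind this kind of suggestion list is simple enough to sketch. Here is a minimal, hypothetical example – the function name, thresholds, and data shapes are my own assumptions, not taken from any actual LMS – that turns raw last-login dates into suggestions like the ones above:

```python
from datetime import date, timedelta

# Hypothetical threshold (my assumption, not from any real LMS):
# suggest a check-in after a week of inactivity, urgently after two.
CHECK_IN_AFTER = timedelta(days=7)

def contact_suggestions(last_logins, today):
    """Turn raw last-login data into a list of suggested check-ins.

    last_logins maps student name -> date of last login. Returns
    (student, days_inactive, message) tuples, most inactive first.
    """
    suggestions = []
    for student, last_seen in last_logins.items():
        inactive = (today - last_seen).days
        if inactive >= CHECK_IN_AFTER.days:
            urgency = "today" if inactive >= 2 * CHECK_IN_AFTER.days else "soon"
            message = (f"{student} hasn't logged in for {inactive} days. "
                       f"Consider checking in {urgency} to see how they're doing.")
            suggestions.append((student, inactive, message))
    return sorted(suggestions, key=lambda s: s[1], reverse=True)

today = date(2021, 3, 1)
last_logins = {
    "David": today - timedelta(days=14),
    "Elaine": today - timedelta(days=8),
    "Craig": today - timedelta(days=1),  # recently active; no suggestion
}
for _, _, message in contact_suggestions(last_logins, today):
    print(message)
```

The point isn’t the code itself. It’s that the data literacy and instructional design literacy live in the threshold and the message, decided once by the dashboard designer rather than rediscovered by every faculty member.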

What information should be included in an action dashboard?

Everyone knows I’m a fan of five letter frameworks! The 5Ws you learned in school provide a great starting place for thinking about action dashboards.

  • What? Describe the specific action faculty should consider taking.
  • Why? Explain why they should consider taking it.
  • When? How soon should they decide whether or not to take action?
  • Who? Name each student for whom faculty should consider taking the action.
  • Where? In what format, with what tools, or in what place should faculty take the action?
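To make the framework concrete, each entry in an action dashboard could be represented as a small record with one field per W. This is purely an illustrative sketch; the class and field names are hypothetical:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SuggestedAction:
    """One entry in an action dashboard, structured around the 5Ws."""
    what: str       # the specific action faculty should consider taking
    why: str        # why they should consider taking it
    when: str       # how soon they should decide whether to act
    who: List[str]  # the students the action concerns
    where: str      # the format, tool, or place for taking the action

checkin = SuggestedAction(
    what="Send a check-in message",
    why="No logins for 14 days may indicate the student is having trouble",
    when="today",
    who=["David Wiley"],
    where="Email, sent from within the LMS",
)
print(f"{checkin.what} ({checkin.when}): {', '.join(checkin.who)}")
```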

From my personal perspective (and based on the research on the impact of teacher-student relationships), I think action dashboards should be filled with suggested actions that help faculty proactively express care, support, and encouragement for their students. An action dashboard might suggest faculty Contact These Students because they haven’t logged in for over a week (“Everything ok?”), because they scored above 95% on a recent quiz (“You’re crushing it!”), because they’re struggling in class (“Could you come to office hours so I can help you?”) and for a range of other reasons. If these messages were templated, faculty could send more messages of care, support, and encouragement to more students more quickly.

An action dashboard might also suggest that faculty Focus on These Topics During Class based on questions a majority of students missed on the homework. Or that faculty Review These Topics During Class to help students implement spaced review of critical concepts. Or other things.
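A “Focus on These Topics” suggestion could be computed the same way. A hypothetical sketch, assuming the homework system can report how many students missed each topic:

```python
def topics_to_focus_on(misses_by_topic, n_students):
    """Suggest topics that a majority of students missed on the homework.

    misses_by_topic maps topic -> number of students who answered
    incorrectly; returns the topics more than half the class missed.
    """
    return [topic for topic, missed in misses_by_topic.items()
            if missed > n_students / 2]

# Illustrative data: a class of 30 students, misses tallied per topic.
misses = {"recursion": 18, "list slicing": 9, "dictionaries": 22}
print(topics_to_focus_on(misses, n_students=30))
# → ['recursion', 'dictionaries']
```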

Students, Administrators, and Power Users

There is nothing in this discussion of action dashboards from the faculty perspective that doesn’t also apply to students or administrators. It is completely unrealistic to expect them to have all the expertise necessary to translate a dashboard full of visualizations into effective actions they can take in support of student learning. Students and administrators need action dashboards, too.

It’s also true that, for faculty, students, and administrators who do have higher levels of data literacy and instructional design literacy, access to the visualizations (or even raw data) could be more powerful than an action dashboard. And I would advocate for providing those in an “advanced” view of the dashboard. My main argument here has been that, if we want dashboards to be widely used so they can improve outcomes for more students, the default view shouldn’t require high levels of data literacy or instructional design literacy in order to be usable. The default view should be immediately usable by everyone.

(It’s a bit like a Mac – if you have a high degree of computer literacy, you can open the Terminal app and type Unix commands to your heart’s content. But that shouldn’t be the default user experience. The default UX should be – and is – point and click.)

Conclusion

An “action dashboard” is a dashboard filled with specific actions a user might consider taking, presented in the context of the 5Ws. Action dashboards can be used effectively by all faculty, students, and administrators, regardless of their levels of data literacy or instructional design literacy. While action dashboards can feel restrictive to power users, power users can be provided with different views more suited to their literacy levels.