The first lecture I ever heard about “MOOCs” was forthright, apocalyptic, and distressingly plausible. The year was 2012 or 2013, and I was a young assistant professor on the upslope of an academic career. Speaking from a podium for 45 harrowing minutes, a senior official at my then-employer, a small college in the South, informed faculty that Massive Open Online Courses would spell the end for private, tuition-dependent institutions such as ours. One year, soon, given the choice of studying with Harvard and Stanford professors via computer screen, our autumn crop of freshmen would “simply not show up.”
The foretold doom hasn’t come to pass, but the MOOC hysteria of those heady days is not hard to understand. Spring 2012 had seen the founding of Udacity and Coursera, futuristic-sounding online-education providers that had already started partnering with major schools to offer free courses. Though Web 3.0, with its blockchains and nonfungible tokens, hadn’t yet emerged, the Internet-as-disruptor narrative was on every tongue; no legacy industry seemed safe. Silicon Valley still had its magic.
Major newspapers were trumpeting the new technology. The New York Times proclaimed 2012 “The Year of the MOOC” and called them “the educational happening of the moment.” A year later, the Washington Post heralded them as “an important new experiment in higher education” and suggested that “this . . . may be the school year that the MOOC truly goes mainstream.”
It wasn’t until early 2015 that cracks in the foundation of this narrative began to show. Writing in the Chronicle of Higher Education that February, Steve Kolowich crowed that “few people would now be willing to argue that Massive Open Online Courses are the future of higher education.” Pointing to suspicion among administrators that MOOCs would neither “generate money [nor] reduce costs,” Kolowich declared the new technology “unsustainable”—a word to conjure with, in the era of tech startups—and noted that the percentage of universities offering MOOCs “seem[ed] to be leveling off.” Elated, I fired off a triumphalist email to a few colleagues, opined that brick-and-mortar colleges were unkillable, and forgot about MOOCs for the rest of the decade.
But a funny thing happened while my peers and I were wallowing in self-satisfaction. Cast out of the media spotlight, MOOCs continued their rise as an alternative to traditional instruction. In 2016, an estimated 58 million students worldwide signed up for at least one Massive Open Online Course at more than 700 participating universities. By last year, those numbers had reached 220 million students and 950 institutions, with enrollments spread across more than 19,000 courses. MOOCs may or may not be the future, but for a great many learners, they are a significant part of the present. As attitudes about in-person education have evolved—a process hastened by the Covid pandemic—MOOCs have gone from hyped novelty to mainstream option. That we have mostly ceased to talk about them means little.
To comprehend both the challenges and advantages that MOOCs face in the education marketplace, it’s helpful to understand employer attitudes, the sine qua non of contemporary higher-ed design. As Mariela J. Rivas and coauthors dryly stated in a 2020 paper, MOOCs are “marketed as opportunities for participants to improve their labor market outcomes.” If employers respect such courses and programs, then MOOCs can be expected to gobble up more market share. If not, they are deader than dead, and we should all be scrambling to short Coursera.
Happily for MOOC proponents, a group that includes both education reformers and futurists, the new technology seems to be winning hearts and minds. In an experiment designed to test the marketability of freelance coders, Rivas and her coauthors found that, while “respondents preferred all traditional degrees over a MOOC,” hiring managers nevertheless favored MOOC credentials over no credentials by a 61-point margin. (Other experiments have returned similar results.) At first glance, these outcomes may seem ominous for MOOCs, whose developers surely hope to take a bite out of traditional college’s dominance. Yet the real story here is that employers value MOOCs. Having cleared that crucial hurdle, the new technology can now play the long game, hoping that the performance of MOOC participants in the workplace will change attitudes further. Does anyone in traditional higher ed believe that brick-and-mortar students will prove more valuable forever?
In short, the progress of Massive Open Online Courses is unfolding as we should have anticipated a decade ago: slowly, as consumers reconsider habits, prejudices, and commitments. While the early media cacophony led some to expect a MOOC adoption rate resembling the tablet computer’s—owned by 45 percent of Americans within five years of its introduction—a better analogue may be the subscription music service. People will continue to own records, CDs, and MP3s, but Spotify is here to stay.
The most obvious conclusion to draw from the MOOC story so far is that prognostication is difficult—a reality that we ought to keep in mind when reading (for example) that one-third of surveyed officials believe that graduate schools will be “primarily online” by 2025. Another takeaway is that “wrong now” doesn’t mean “wrong forever”; nor should we dismiss emerging technologies because they fail to upend markets instantaneously. My colleagues and I were right to doubt that MOOCs would sweep us away in a single cataclysmic storm, but we missed the steady rain.
Having given up tenure for the world of education reform, I no longer fear such downpours. Indeed, MOOCs may represent one of the best chances we have to wring concessions from a higher-ed apparatus that remains sclerotic, self-serving, and ideologically blinkered. Should MOOCs succeed in pulling dollars away from the university machine, conservatives should cheer them on.