Category: Education

Engagement Is the Enemy of Serendipity

Whenever I’m grumpy about an update to a technology I use, I try to perform a self-audit examining why I’m unhappy about this change. It’s a helpful exercise since we are all by nature resistant to even minor alterations to the technologies we use every day (which is why website redesign is now a synonym for bare-knuckle boxing), and this feeling only increases with age. Sometimes the grumpiness is justified, since one of your tools has become duller or less useful in a way you can clearly articulate; other times, well, welcome to middle age.

The New York Times recently changed their iPad app to emphasize three main tabs: Top Stories, For You, and Sections. The first is the app version of their chockablock website home page, which contains not only the main headlines and breaking news stories, but also an editor-picked mixture of stories and features from across the paper. For You is a new personalized zone that is algorithmically generated by looking at the stories and sections you have most frequently visited, or that you select to include by clicking on blue buttons that appear near specific columns and topics. The last tab is Sections, that holdover word from the print newspaper, with distinct parts that are folded and nested within each other, such as Metro, Business, Arts, and Sports.

Currently my For You tab looks as if it was designed for a hypochondriacal runner who wishes to live in outer space, but not too far away, since he still needs to acquire new books and follow the Red Sox. I shall not comment about the success of the New York Times algorithm here, other than to say that I almost never visit the For You tab, for reasons I will explain shortly. For now, suffice it to say that For You is not for me.

But the Sections tab I do visit, every day, and this is the real source of my grumpiness. At the same time that the New York Times launched those three premier tabs, they also removed the ability to swipe, simply and quickly, between sections of the newspaper. You used to be able to start your morning news consumption with the headlines and then browse through articles in different sections from left to right. Now you have to tap on Sections, which reveals a menu, from which you select another section, from which you select an article, over and over. It’s like going back to the table of contents every time you finish a chapter of a book, rather than just turning the page to the next chapter.

Sure, it seems relatively minor, and I suspect the change was made because confused people would accidentally swipe between sections, but paired with For You it subtly but firmly discourages the encounter with many of the newspaper’s sections. The assumption in this design is that if you’re a space runner, why would you want to slog through the International news section or the Arts section on the way to orbital bliss in the Science and Health sections?

* * *

When I was growing up in Boston, my first newspaper love was the sports section of the Boston Globe. I would get the paper in the morning and pull out that section and read it from cover to cover, all of the columns and game summaries and box scores. Somewhere along the way, I started briefly checking out adjacent sections, Metro and Business and Arts, and then the front section itself, with the latest news of the day and reports from around the country and world. The technology and design of the paper encouraged this sampling, as the unpacked paper was literally scattered in front of me on the table. Were many of these stories and columns boring to my young self? Undoubtedly. But for some reason—the same reason many of those reading this post will recognize—I slowly ended up paging through the whole thing from cover to cover, still focusing on the Sox, but diving into stories from various sections and broadly getting a sense of numerous fields and pursuits.

This kind of interface and user experience is now threatened, because who needs to scan through seemingly irrelevant items when you can have constant go-go engagement, that holy grail of digital media? The Times, likely recognizing their analog past (which is still the present for a dwindling number of print subscribers), tries to replicate some of the old newspaper serendipity with Top Stories, which is more like A Bunch of Interesting Things after the top headlines. But I fear they have contradicted themselves in this new promotion of For You and the commensurate demotion of Sections.

The engagement of For You—which joins the countless For Yous that now dominate our online media landscape—is the enemy of serendipity, which is the chance encounter that leads to a longer, richer interaction with a topic or idea. It’s the way that a metalhead bumps into opera in a record store, or how a young kid becomes interested in history because of the book reviews that follow the box scores. It’s the way that a course taken on a whim in college leads, unexpectedly, to a new lifelong pursuit. Engagement isn’t a form of serendipity through algorithmically personalized feeds; it’s the repeated satisfaction of Present You with your myopically current loves and interests, at the expense of Future You, who will want new curiosities, hobbies, and experiences.

What We Learned from Studying the News Consumption Habits of College Students

Over the last year, I was fortunate to help guide a study of the news consumption habits of college students, and coordinate Northeastern University Library’s services for the study, including great work by our data visualization specialist Steven Braun and necessary infrastructure from our digital team, including Sarah Sweeney and Hillary Corbett. “How Students Engage with News,” out today as both a long article and accompanying datasets and media, provides a full snapshot of how college students navigate our complex and high-velocity media environment.


This is a topic that should be of urgent interest to everyone since the themes of the report, although heightened due to the more active digital practices of young people, capture how we all find and digest news today, and also point to where such consumption is heading. On a personal level, I was thrilled to be a part of this study as a librarian who wants students to develop good habits of truth-seeking, and as an intellectual historian, who has studied changing approaches to truth-seeking over time.

You should first read the entire report, or at least the executive summary, now available on a special site at Project Information Literacy, with data hosted at Northeastern University Library’s Digital Repository System (where the study will also have its long-term, preserved form). It’s been great to work with, and think along with, the lead study members, including Alison Head, John Wihbey, Takis Metaxas, and Margy MacMillan.

“How Students Engage with News” details how college students are overwhelmed by the flood of information they see every day on multiple websites and in numerous apps, an outcome of their extraordinarily frequent attention to smartphones and social media. Students are interested in news, and want to know what’s going on, but given the sheer scale and sources of news, they find themselves somewhat paralyzed. As humans naturally do in such situations, students often satisfice in terms of news sources—accepting “good enough,” proximate (from friends or media) descriptions rather than seeking out multiple perspectives or going to “canonical” sources of news, like newspapers. Furthermore, much of what they consume is visual rather than textual—internet genres like memes, gifs, and short videos play an outsized role in their digestion of the day’s events. (Side note: After recently seeing Yale Art Gallery’s show “Seriously Funny: Caricature Through the Centuries,” I think there’s a good article to be written about the historical parallels between today’s visual memes and political cartoons from the past.) Of course, the entire population faces the same issues around our media ecology, but students are an extreme case.

And perhaps also a cautionary tale. I think this study’s analysis and large survey size (nearly 6,000 students from a wide variety of institutions) should be a wake-up call for those of us who care about the future of the news and the truth. What will happen to the careful ways we pursue an accurate understanding of what is happening in the world by weighing information sources and developing methods for verifying what one hears, sees, and reads? Librarians, for instance, used to be much more of a go-to source for students to find reliable sources of the truth, but the study shows that only 7% of students today have consulted their friendly local librarian.

It is incumbent upon us to change this. A purely technological approach—for instance, “improving” social media feeds through “better” algorithms—will not truly solve the major issues identified in the news consumption study, since students will still be overwhelmed by the volume, context, and heterogeneity of news sources. A more active stance by librarians, journalists, educators, and others who convey truth-seeking habits is essential. Along these lines, for example, we’ve greatly increased the number of workshops on digital research, information literacy, and related topics at Northeastern University Library, and students are eager attendees at these workshops. We will continue to find other ways to get out from behind our desks and connect more with students where they are.

Finally, I have used the word “habit” very consciously throughout this post, since inculcating and developing more healthy habits around news consumption will also be critical. Alan Jacobs’ notion of cultivating “temporal bandwidth” is similar to what I imagine will have to happen in this generation—habits and social norms that push against the constant now of social media, and stretch and temper our understanding of events beyond our unhealthily caffeinated present.

The Ivory Tower and the Open Web: Burritos, Browsers, and Books

In the summer of 2007, Nate Silver decided to conduct a rigorous assessment of the inexpensive Mexican restaurants in his neighborhood, Chicago’s Wicker Park. Figuring that others might be interested in the results of his study, and that he might be able to use some feedback from an audience, he took his project online.

Silver had no prior experience in such an endeavor. By day he worked as a statistician and writer at Baseball Prospectus—an innovator, to be sure, having created a clever new standard for empirically measuring the value of players, an advanced form of the “sabermetrics” vividly described by Michael Lewis in Moneyball. ((Nate Silver, “Introducing PECOTA,” in Gary Huckabay, Chris Kahrl, Dave Pease et al., eds., Baseball Prospectus 2003 (Dulles, VA: Brassey’s Publishers, 2003): 507-514. Michael Lewis, Moneyball: The Art of Winning an Unfair Game (New York: W. W. Norton & Company, 2004).)) But Silver had no experience as a food critic, nor as a web developer.

In time, his appetite took care of the former and the open web took care of the latter. Silver knit together a variety of free services as the tapestry for his culinary project. He set up a blog, The Burrito Bracket, using Google’s free Blogger web application. Weekly posts consisted of his visits to local restaurants, and the scores (in jalapeños) he awarded in twelve categories.

Home page of Nate Silver’s Burrito Bracket
Ranking system (upper left quadrant)

Being a sports geek, he organized the posts as a series of contests between two restaurants. Satisfying his urge to replicate March Madness, he modified another free application from Google, generally intended to create financial or data spreadsheets, to produce the “bracket” of the blog’s title.

Google Spreadsheets used to create the competition bracket

Like many of the savviest users of the web, Silver started small and improved the site as he went along. For instance, he had started to keep a photographic record of his restaurant visits and decided to share this documentary evidence. So he enlisted the photo-sharing site Flickr, creating an off-the-rack archive to accompany his textual descriptions and numerical scores. On August 15, 2007, he added a map to the site, geolocating each restaurant as he went along and color-coding the winners and losers.

Flickr photo archive for The Burrito Bracket (flickr.com)
Silver’s Google Map of Chicago’s Wicker Park (shaded in purple) with the location of each Mexican restaurant pinpointed

Even with its do-it-yourself enthusiasm and the allure of carne asada, Silver had trouble attracting an audience. He took to Yelp, a popular site for reviewing restaurants, to plug The Burrito Bracket, and even thought about creating a Super Burrito Bracket, to cover all of Chicago. ((Frequently Asked Questions, The Burrito Bracket, http://burritobracket.blogspot.com/2007/07/faq.html)) But eventually he abandoned the site following the climactic “Burrito Bowl I.”

With his web skills improved and a presidential election year approaching, Silver decided to try his mathematical approach on that subject instead—”an opportunity for a sort of Moneyball approach to politics,” as he would later put it. ((http://www.journalism.columbia.edu/system/documents/477/original/nate_silver.pdf)) Initially, and with a nod to his obsession with Mexican food, he posted his empirical analyses of politics under the chili-pepper pseudonym “Poblano,” on the liberal website Daily Kos, which hosts blogs for its engaged readers.

Then, in March 2008, Silver registered his own web domain, with a title that was simultaneously and appropriately mathematical and political: fivethirtyeight.com, a reference to the total number of electors in the United States electoral college. He launched the site with a slight one-paragraph post on a recent poll from South Dakota and a summary of other recent polling from around the nation. As with The Burrito Bracket it was a modest start, but one that was modular and extensible. Silver soon added maps and charts to bolster his text.

FiveThirtyEight two months after launch, in May 2008

Nate Silver’s real name and FiveThirtyEight didn’t remain obscure for long. His mathematical modeling of the competition between Barack Obama and Hillary Clinton for the Democratic presidential nomination proved strikingly, almost creepily, accurate. Clear-eyed, well-written, statistically rigorous posts began to be passed from browsers to BlackBerries, from bloggers to political junkies to Beltway insiders. From those wired early subscribers to his site, Silver found an increasingly large audience of those looking for data-driven, deeply researched analysis rather than the conventional reporting that presented political forecasting as more art than science.

FiveThirtyEight went from just 800 visitors a day in its first month to a daily audience of 600,000 by October 2008. ((Adam Sternbergh, “The Spreadsheet Psychic,” New York, Oct 12, 2008, http://nymag.com/news/features/51170/)) On election day, FiveThirtyEight received a remarkable 3 million visitors, more than most daily newspapers. ((http://www.journalism.columbia.edu/system/documents/477/original/nate_silver.pdf))

All of this attention for a site that most media coverage still called, with a hint of deprecation, a “blog,” or “aggregator” of polls, despite Silver’s rather obvious, if latent, journalistic skills. (Indeed, one of his roads not taken had been an offer, straight out of college, to become an assistant at The Washington Post. ((http://www.journalism.columbia.edu/system/documents/477/original/nate_silver.pdf)) ) An article in the Colorado Daily on the emergent genre represented by FiveThirtyEight led with Ken Bickers, professor and chair of the political science department at the University of Colorado, saying that such sites were a new form of “quality blogs” (rather than, evidently, the uniformly second-rate blogs that had previously existed). The article then swerved into much more ominous territory, asking whether reading FiveThirtyEight and similar blogs was potentially dangerous, especially compared to the safe environs of the traditional newspaper. Surely these sites were superficial, and they very well might have a negative effect on their audience:

Mary Coussons-Read, a professor of psychology at CU Denver, says today’s quick turnaround of information helps to make it more compelling.

“Information travels so much more quickly,” she says. “(We expect) instant gratification. If people have a question, they want an answer.”

That real-time quality can bring with it the illusion that it’s possible to perceive a whole reality by accessing various bits of information.

“There’s this immediacy of the transfer of information that leads people to believe they’re seeing everything … and that they have an understanding of the meaning of it all,” she says.

And, Coussons-Read adds, there is pleasure in processing information.

“I sometimes feel like it’s almost a recreational activity and less of an information-gathering activity,” she says.

Is it addiction?

[Michele] Wolf says there is something addicting about all that data.

“I do feel some kind of high getting new information and being able to process it,” she says. “I’m also a rock climber. I think there are some characteristics that are shared. My addiction just happens to be information.”

While there’s no such mental-health diagnosis as political addiction, Jeanne White, chemical dependency counselor at Centennial Peaks Hospital in Louisville, says political information seeking could be considered an addictive process if it reaches an extreme. ((Cindy Sutter, “Hooked on information: Can political news really be addicting?” The Colorado Daily, November 3, 2008, http://www.coloradodaily.com/ci_13105998))

This stereotype of blogs as the locus of “information” rather than knowledge, of “recreation” rather than education, was—and is—a common one, despite the wide variety of blogs, including many with long-form, erudite writing. Perhaps in 2008 such a characterization of FiveThirtyEight was unsurprising given that Silver’s only other credits to date were the Player Empirical Comparison and Optimization Test Algorithm (PECOTA) and The Burrito Bracket. Clearly, however, here was an intelligent researcher who had set his mind on a new topic to write about, with a fresh, insightful approach to the material. All he needed was a way to disseminate his findings. His audience appreciated his extraordinarily clever methods—at heart, academic techniques—for cutting through the mythologies and inadequacies of standard political commentary. All they needed was a web browser to find him.

A few journalists saw past the prevailing bias against non-traditional outlets like FiveThirtyEight. In the spring of 2010, Nate Silver bumped into Gerald Marzorati, the editor of the New York Times Magazine, on a train platform in Boston. They struck up a conversation that eventually turned to how FiveThirtyEight might fit into the universe of the Times, which recognized the excellence of his work and wanted the site to enhance its political reporting and commentary. That summer, a little more than two years after he had started FiveThirtyEight, Silver’s “blog” merged into the Times under a licensing deal. ((Nate Silver, “FiveThirtyEight to Partner with New York Times,” http://www.fivethirtyeight.com/2010/06/fivethirtyeight-to-partner-with-new.html)) In less time than it takes for most students to earn a journalism degree, Silver had willed himself into writing for one of the world’s premier news outlets, taking a seat in the top tier of political analysis. A radically democratic medium had enabled him to do all of this, without the permission of any gatekeeper.

FiveThirtyEight on the New York Times website, 2010

* * *

The story of Nate Silver and FiveThirtyEight has many important lessons for academia, all stemming from the affordances of the open web. His efforts show the do-it-yourself nature of much of the most innovative work on the web, and how one can iterate toward perfection rather than publishing works in fully polished states. His tale underlines the principle that good is good, and that the web is extraordinarily proficient at finding and disseminating the best work, often through continual, post-publication, recursive review. FiveThirtyEight also shows the power of openness to foster that dissemination and the dialogue between author and audience. Finally, the open web enables and rewards unexpected uses and genres.

Undoubtedly it is true that the path from The Burrito Bracket to The New York Times may only be navigated by an exceptionally capable and smart individual. But the tools for replicating Silver’s work are just as open to anyone, and just as powerful. It was with that belief, and the desire to encourage other academics to take advantage of the open web, that Roy Rosenzweig and I wrote Digital History: A Guide to Gathering, Preserving, and Presenting the Past on the Web. ((Daniel J. Cohen and Roy Rosenzweig, Digital History: A Guide to Gathering, Preserving, and Presenting the Past on the Web (University of Pennsylvania Press, 2006).)) We knew that the web, although fifteen years old at the time, was still somewhat alien to many professors, graduate students, and even undergraduates (who might be proficient at texting but know nothing about HTML), and we wanted to make the medium more familiar and approachable.

What we did not anticipate was another kind of resistance to the web, based not on an unfamiliarity with the digital realm or on Luddism but on the remarkable inertia of traditional academic methods and genres—the more subtle and widespread biases that hinder the academy’s adoption of new media. These prejudices are less comical, and more deep-seated, than newspapers’ penchant for tales of internet addiction. This resistance has less to do with the tools of the web and more to do with the web’s culture. It was not enough for us to conclude Digital History by saying how wonderful the openness of the web was; for many academics, this openness was part of the problem, a sign that it might be like “playing tennis with the net down,” as my graduate school mentor worriedly wrote to me. ((http://www.dancohen.org/2010/11/11/frank-turner-on-the-future-of-peer-review/))

In some respects, this opposition to the maximal use of the web is understandable. Almost by definition, academics have gotten to where they are by playing a highly scripted game extremely well. That means understanding and following self-reinforcing rules for success. For instance, in history and the humanities at most universities in the United States, there is a vertically integrated industry of monographs, beginning with the dissertation in graduate school—a proto-monograph—followed by the revisions to that work and the publication of it as a book to get tenure, followed by a second book to reach full professor status. Although we are beginning to see a slight liberalization of rules surrounding dissertations—in some places dissertations could be a series of essays or have digital components—graduate students infer that they would best be served on the job market by a traditional, analog monograph.

We thus find ourselves in a situation, now more than two decades into the era of the web, where the use of the medium in academia is modest, at best. Most academic journals have moved online but simply mimic their print editions, providing PDF facsimiles for download and having none of the functionality common to websites, such as venues for discussion. They are also largely gated, resistant not only to access by the general public but also to the coin of the web realm: the link. Similarly, when the Association of American University Presses recently asked its members about their digital publishing strategies, the presses tellingly remained steadfast in their fixation on the monograph. All of the top responses were about print-on-demand and the electronic distribution and discovery of their list, with a mere footnote for a smattering of efforts to host “databases, wikis, or blogs.” ((Association of American University Presses, “Digital Publishing in the AAUP Community; Survey Report: Winter 2009-2010,” http://aaupnet.org/resources/reports/0910digitalsurvey.pdf, p. 2)) In other words, the AAUP members see themselves almost exclusively as book publishers, not as publishers of academic work in whatever form that may take. Surveys of faculty show comfort with decades-old software like word processors but an aversion to recent digital tools and methods. ((See, for example, Robert B. Townsend, “How Is New Media Reshaping the Work of Historians?”, Perspectives on History, November 2010, http://www.historians.org/Perspectives/issues/2010/1011/1011pro2.cfm)) The professoriate may be more liberal politically than the most latte-filled ZIP code in San Francisco, but we are an extraordinarily conservative bunch when it comes to the progression and presentation of our own work. We have done far less than we should have by this point in imagining and enacting what academic work and communication might look like if it were digital first.

To be sure, as William Gibson has famously proclaimed, “The future is already here—it’s just not very evenly distributed.” ((National Public Radio, “Talk of the Nation” radio program, 30 November 1999, timecode 11:55, http://discover.npr.org/features/feature.jhtml?wfId=1067220)) Almost immediately following the advent of the web, which came out of the realm of physics, physicists began using the Los Alamos National Laboratory preprint server (later renamed ArXiv and moved to arXiv.org) to distribute scholarship directly to each other. Blogging has taken hold in some precincts of the academy, such as law and economics, and many in those disciplines rely on web-only outlets such as the Social Science Research Network. The future has had more trouble reaching the humanities, and perhaps this book is aimed slightly more at that side of campus than the science quad. But even among the early adopters, a conservatism reigns. For instance, one of the most prominent academic bloggers, the economist Tyler Cowen, still recommends to students a very traditional path for their own work. ((“Tyler Cowen: Academic Publishing,” remarks at the Institute for Humane Studies Summer Research Fellowship weekend seminar, May 2011, http://vimeo.com/24124436)) And far from being preferred by a large majority of faculty, quests to open scholarship to the general public often meet with skepticism. ((Open access mandates have been tough sells on many campuses, passing only by slight majorities or failing entirely. For instance, such a mandate was voted down at the University of Maryland, with evidence of confusion and ambivalence. http://scholarlykitchen.sspnet.org/2009/04/28/umaryland-faculty-vote-no-oa/))

If Digital History was about the mechanisms for moving academic work online, this book is about how the digital-first culture of the web might become more widespread and acceptable to the professoriate and their students. It is, by necessity, slightly more polemical than Digital History, since it takes direct aim at the conservatism of the academy that twenty years of the web have laid bare. But the web and the academy are not doomed to an inevitable clash of cultures. Viewed properly, the open web is perfectly in line with the fundamental academic goals of research, sharing of knowledge, and meritocracy. This book—and it is a book rather than a blog or stream of tweets because pragmatically that is the best way to reach its intended audience of the hesitant rather than preaching to the online choir—looks at several core academic values and asks how we can best pursue them in a digital age.

First, it points to the critical academic ability to look at any genre without bias and asks whether we might be violating that principle with respect to the web. Upon reflection many of the best things we discover in scholarship are found by disregarding popularity and packaging, by approaching creative works without prejudice. We wouldn’t think much of the meandering novel Moby-Dick if Carl Van Doren hadn’t looked past decades of mixed reviews to find the genius in Melville’s writing. Art historians have similarly unearthed talented artists who did their work outside of the royal academies and the prominent schools of practice. As the unpretentious wine writer Alexis Lichine shrewdly said in the face of fancy labels and appeals to mythical “terroir”: “There is no substitute for pulling corks.” ((Quoted in Frank J. Prial, “Wine Talk,” New York Times, 17 August 1994, http://www.nytimes.com/1994/08/17/garden/wine-talk-983519.html.))

Good is good, no matter the venue of publication or what the crowd thinks. Scholars surely understand that on a deep level, yet many persist in valuing venue and medium over the content itself. This is especially true at crucial moments, such as promotion and tenure. Surely we can reorient ourselves to our true core value—to honor creativity and quality—which will still guide us to many traditionally published works but will also allow us to consider works in nontraditional venues: new open access journals, articles written and posted on a personal website or institutional repository, or digital projects.

The genre of the blog has been especially cursed by this lack of open-mindedness from the academy. Chapter 1, “What is a Blog?”, looks at the history of the blog and blogging, the anatomy and culture of a genre that is in many ways most representative of the open web. Saddled with an early characterization as being the locus of inane, narcissistic writing, the blog has had trouble making real inroads in academia, even though it is an extraordinarily flexible form and the perfect venue for a great deal of academic work. The chapter highlights some of the best examples of academic blogging and how they shape and advance arguments in a field. We can be more creative in thinking about the role of the blog within the academy, as a venue for communicating our work to colleagues as well as to a lay audience beyond the ivory tower.

This academic prejudice against the blog extends to other genres that have proliferated on the open web. Chapter 2, “Genres and the Open Web,” examines the incredible variety of those new forms, and how, with a careful eye, we might be able to import some of them profitably into the academy. Some of these genres, like the wiki, are well-known (thanks to Wikipedia, which academics have come to accept begrudgingly in the last five years). Other genres are rarer but take maximal advantage of the latitude of the open web: its malleability and interactivity. Rather than imposing the genres we know on the web—as we do when we post PDFs of print-first journal articles—we would do well to understand and adopt the web’s native genres, where helpful to scholarly pursuits.

But what of our academic interest in validity and excellence, enshrined in our peer review system? Chapter 3, “Good is Good,” examines the fundamental requirements of any such system: the necessity of highlighting only a minority of the total scholarly output, based on community standards, and of disseminating that minority of work to communities of thought and practice. The chapter compares print-age forms of vetting with native web forms of assessment and review, and proposes ways that digital methods can supplement—or even replace—our traditional modes of peer review.

“The Value, and Values, of Openness,” Chapter 4, broadly examines the nature of the web’s openness. Oddly, this openness is both the easiest trait of the web to understand and its most complex, once one begins to dig deeper. The web’s radical openness not only has led to calls for open access to academic work, which has complicated the traditional models of scholarly publishers and societies; it has also challenged our academic predisposition toward perfectionism—the desire to only publish in a “final” format, purged (as much as possible) of error. Critically, openness has also engendered unexpected uses of online materials—for instance, when Nate Silver refactored poll numbers from the raw data polling agencies posted.

Ultimately, openness is at the core of any academic model that can operate effectively on the web: it provides a way to disseminate our work easily, to assess what has been published, and to point to what’s good and valuable. Openness can naturally lead—indeed, is leading—to a fully functional shadow academic system for scholarly research and communication that exists beyond the more restrictive and inflexible structures of the past.

The Last Digit of Pi

[This is a rough transcript of my TEDxNYED talk, delivered on March 6, 2010, in New York City at the Collegiate School. TEDxNYED was an all-day conference “examining the role of new media and technology in shaping the future of education.” For a meta-post about the experience of giving a TED(x) talk, please read “Academic Theater (Reflections on TED & TEDxNYED).” What I actually said and did at TEDxNYED deviated from this transcript; I engaged the audience directly a couple of times, once for fun and once to get their ideas about the subject. I’ll post the video when it’s available.]

I want to tell you a story about a forgotten realm of education and knowledge. It is a cautionary tale, a parable of what happens when the world changes, when tradition is challenged.

Until relatively recently in human history, pi was the much sought-after solution to what was long called the “rectification” or “quadrature” of the circle, fancy words more easily symbolized by the diagram in this slide. How can you transform that circle into the overlaid square? If the diameter of the circle is 1, the square’s area would be one-quarter of pi, and each of its sides the square root of that quantity.
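In modern notation the arithmetic runs like this (a minimal sketch; the unit-diameter circle is just the convention from the slide):

```python
import math

# A circle of diameter 1 has radius 1/2, so its area is pi * (1/2)**2 = pi/4.
diameter = 1.0
circle_area = math.pi * (diameter / 2) ** 2

# A square of equal area must have a side of sqrt(pi/4) = sqrt(pi)/2.
side = math.sqrt(circle_area)

print(round(side, 4))  # 0.8862
```

Squaring the circle exactly, with compass and straightedge, would require constructing that side length, which is precisely what turns out to be impossible.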

Pi was a coveted number for thousands of years, imbued with magical properties. Generations of scholars pursued it doggedly, often considering it the be-all and end-all of geometry.

This is a different pi—pi as we moderns know it:

Well, not all of it, as I’m sure you know. It’s just the first 200 or so digits. The number stretches on forever. I hope you weren’t expecting me to reveal the actual last digit of pi. Because there isn’t one. Strange, no?

Pi wasn’t always this strange. The ancient Egyptians knew better, pegging the ratio of the circumference to the diameter of a circle at 4 over 3 to the 4th power. That’s considerably more definite, and thus much more sensible.

Archimedes knew better, homing in on the value of pi between a couple of very close fractions.

If you are a biblical literalist, pi would seem to be 3, since the Bible clearly describes 30 cubits as encompassing a circle with a 10-cubit diameter.

And the solutions kept coming. From ancient mathematicians and philosophers, to medieval scholars, to the Renaissance and the Enlightenment. Everyone seemed capable of finding—with enough effort—the exact value for pi. Squaring the circle was an effort of genius in an ancient science perfectly described centuries ago by Euclid.

But something changed radically in the eighteenth century, just after that book on the right by Joubert de la Rue. A few mathematicians started to take more seriously the nagging feeling that pi didn’t have a perfect solution as a magical fraction. It might not have a last digit after all. This critical number at the center of mathematics might, in fact, be irrational. One mathematician began to reconceptualize pi.

And there he is: the dapper Swiss German mathematician Johann Heinrich Lambert:

He was the son of a tailor, obviously, and was mostly self-taught in mathematics. His brilliant work in the 1760s showed that π/4 could not be a rational number—you could never exactly figure out the value of one side of that square—and thus that pi too was irrational. After Lambert, math textbooks declared the matter solved.

That’s right, problem solved…

Except… circle-squaring kept on going. The world of mathematics had changed with the discoveries of the eighteenth century, but somehow the message didn’t get through to many people. John Parker, on the left, came up with my personal favorite solution: pi is precisely 20612/6561. Some circle-squarers, like James Smith on the right, mocked Lambert’s proof as the work of a dilettante.

Things then got testy between the new mathematicians and those who clung to the prior vision of pi. The record of this warfare is as informative as it is humorous. In the 1860s and 70s, James Smith took on Augustus De Morgan, a math professor in London, in a series of short pamphlets, which were the Victorian equivalent of Twitter.

But unsurprisingly, the castigations of professors of mathematics didn’t stop the circle-squarers. Their solutions kept on coming, even in the face of criticism, even after pi had been shown to be transcendental, meaning it cannot be the root of any polynomial equation with rational coefficients. My favorite book from the turn of the twentieth century had this subtitle on the cover: “The great problem which has baffled the greatest philosophers and the brightest minds of ancient and modern times has now been solved by a humble American citizen of the city of Brooklyn.”

Now, it’s easy to laugh at these misguided circle squarers, especially when they’re from Brooklyn. But if you read circle-squarers seriously, and stop to think about it, they are not so different from you or me. Even in our knowing times, we all persist in doing things that others have long since abandoned as absurd or passé.

History tells us that people are, alas, not very good at seeing the new, and instead are very good at maintaining the past at all costs. This is particularly true in education: Euclid’s Elements, written over 2,000 years ago, was still a standard math textbook well into the 19th century, despite major mathematical advances.

So it’s worth pausing to think about the last digit of pi. Why did so many continue to pursue pi as it was traditionally conceived, and why did they resist the new math?

Think for a moment about the distinction between the old and the new pi. The old was perfect, simple, ordered, divine; the new, seemingly imprecise, prosaic, chaotic, human. So the story of pi is the story, and the psychology, of what happens when the complex and new tries to overtake the simple and traditional.

It’s happening all around us in the digital age. We’re replacing what has been perceived as perfect and ordered with the seemingly imprecise and chaotic.

Look at what has happened, for instance, in the last decade with Wikipedia and the angst about the fate of the traditional Encyclopedia.

Or newspapers in the face of new forms of journalism, such as blogging. A former baseball statistician, Nate Silver of FiveThirtyEight.com, can brazenly decide to analyze elections and the economy better than most newspapers? Yes indeed.

Now this audience, hip to the right side of these screens, may want to be as mean as Augustus De Morgan to those still on the left. We may want to leave modern circle-squarers behind, and undoubtedly some of them will be left behind. But for the majority who are unsettled and are caught between the old and the new, we need other methods to convince them and to change the status quo. History tells us it’s not enough to say that people are blind to the future. We have to show precisely what the weaknesses of the old are…

…and we have to show how the new works better than the old.

Knowing pi correctly to the 10th digit is enormously helpful when accurately predicting the movements of heavenly bodies; try using James Smith’s 3 1/8 when tracing the arc of a planet or moon. For some physics, knowing pi accurately to the 40th digit is critical.
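To make the stakes concrete, here is a small sketch comparing circumferences computed with the true pi against Smith’s and Parker’s values; the Moon’s mean orbital radius is used purely as an illustrative figure:

```python
import math

# The absolute error introduced by a bad value of pi scales with the radius.
moon_orbit_km = 384_400  # approximate mean Earth-Moon distance

true_c = 2 * math.pi * moon_orbit_km
smith_c = 2 * (3 + 1 / 8) * moon_orbit_km      # James Smith's pi = 3 1/8
parker_c = 2 * (20612 / 6561) * moon_orbit_km  # John Parker's fraction

print(round(true_c - smith_c))   # Smith's circle falls thousands of km short
print(round(true_c - parker_c))  # Parker's is off by only about a kilometer
```

Smith’s 3 1/8 misplaces the Moon’s orbital circumference by more than 12,000 kilometers; even Parker’s far closer fraction still drifts, and the drift only compounds at astronomical scales.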

Moreover, this modern pi may be strange, but its very strangeness opened up new avenues of research and thought that were just as intellectually challenging and rewarding as squaring the circle. The transcendental nature of pi led mathematicians to ponder infinite sequences of fractions and had an impact on chaos theory. In computer science, coming up with algorithms to reach a billion or trillion digits of pi as quickly as possible advanced the field. And, if you still want an unsolved problem to crack, see if you can figure out if pi is what is called a “normal number,” where the distribution of the digits 0-9 is uniform…

…or is there instead a preponderance of eights. Now that’s a tough problem, related to real issues in modern math. So there are still problems to be solved, more advanced problems. Math didn’t end with the end of the old pi—it just moved in new, more interesting directions.
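For the curious, here is a sketch of how one might begin poking at that question: generating digits of pi with Machin’s classic arctangent formula in integer arithmetic (far simpler, and far slower, than the record-setting algorithms mentioned above) and tallying their frequencies:

```python
from collections import Counter

def pi_digits(n):
    """Return the first n decimal digits of pi (after the 3) as a string,
    via Machin's formula: pi = 16*atan(1/5) - 4*atan(1/239)."""
    prec = n + 10  # guard digits to absorb truncation error in the tail

    def atan_inv(x):
        # atan(1/x) = 1/x - 1/(3x^3) + 1/(5x^5) - ..., scaled by 10**prec
        total, term, k = 0, 10**prec // x, 0
        while term:
            total += term // (2 * k + 1) if k % 2 == 0 else -(term // (2 * k + 1))
            term //= x * x
            k += 1
        return total

    pi_fixed = 16 * atan_inv(5) - 4 * atan_inv(239)
    return str(pi_fixed)[1:n + 1]  # drop the leading "3"

digits = pi_digits(1000)
print(Counter(digits).most_common())  # roughly even counts, no proof in sight
```

Tallies of the first thousand, million, or trillion digits all look uniform, but no finite count settles normality; that is what makes the problem genuinely hard.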

But to get to that point, mathematicians had to show in a comprehensible way how the new pi created a new order.

Academic Theater (Reflections on TED & TEDxNYED)

This past weekend’s TEDxNYED event in New York took place in the theater of a school just off Broadway. I couldn’t help thinking about the symbolism of that location during the day’s proceedings. TEDx, a spinoff regional program of the billionaires-and-brains edutainment summit in California, TED, pushes speakers like me towards theatrics.

TEDxNYED was enjoyable and I greatly appreciated the opportunity to rub elbows with some digital luminaries and some very smart educators who are doing all the hard work in the trenches while I sit here in the ivory tower blogging. Whatever criticisms may be leveled, TEDxNYED was incredibly well-run and engaging. Before you read my thoughts below, you should first read the wrap-up from Dave Bill, the TEDxNYED “curator,” who gets it exactly right. I’m enormously appreciative of Dave’s hard work and the hard work of his TEDxNYED colleagues.

Back to Broadway: among other things, TEDxNYED gave me a chance to think more about the academic lecture as theater. (It also gave me a welcome chance to summon the vaudevillian genes of my New York Jewish heritage, the effectiveness of which you will be able to assess when the video is posted to the TEDx channel on YouTube in a couple of weeks.)

Take Larry Lessig, the de facto headliner of TEDxNYED. He’s clearly a first-rate legal scholar and influential activist. But after viewing him live, I realized more than ever that he’s also a rather talented performance artist, with crack comedic timing. (Here’s his talk; judge for yourself.)

We professors don’t like to admit it, but comedy and performance are important ingredients in most successful academic lectures, and can spur the pursuit of knowledge and action far better than a serious monograph or article. When I was in college nearly everyone interested in history—from any era or place—took Stephen Cohen’s class on Soviet history, mostly because he was entertaining. He even had one lecture consisting entirely of jokes. Sure, it was gimmicky. But I also know several classmates who went into careers in diplomacy and history because of that inspiration.

Of course, academic theater can also lead to problems. TED talks are limited to 18 minutes, inevitably leading to reductionism. As I quipped in my talk on the 6,000-year history of π, “Portions have been condensed.” The humanities particularly suffer from this condensation. For instance, as hugely entertaining as Lessig’s talk was, if you watch it I’m sure you’ll pick up that it conflates, quite problematically, two kinds of conservatism: religious conservatism and libertarianism. Just because the Cato Institute can imagine a role for remixes doesn’t mean that those who attend free church potlucks can. Modern conservatism is an extraordinarily complex mix; one need only look at the tension between libertarian and evangelical views of homosexuality.

Gina Bianchini, the CEO of Ning, a network of social networks, presented her work as “the joy of connecting optimists from around the world,” leaving out the fact that the history of Ning is far more interesting: it started out as an engine for making web apps, only later turning toward social networking. That’s actually a fascinating, complex business history that I would have liked to hear more about.

TED’s tagline is the catchy “Ideas Worth Spreading.” I’m an intellectual historian and appreciate the emphasis on ideas; as an educator I’m in favor of spreading knowledge. But in my later years I’ve also come to realize that while ideas are important, execution is probably more important. Lessig and Bianchini also know this—Lessig is now working on methods of more effective lobbying and Bianchini is obviously a talented CEO—and it would have helped TEDxNYED if they had explained to the audience the nitty-gritty details of making real change and progress. It doesn’t come from clever sound bites.

The TED spotlight-on-the-stage format also encourages the audience to perceive the speakers as isolated geniuses, coming out to impart wisdom. The host who introduced me credited me as being the solitary creator of several projects and works, all of which were actually broad collaborations. Again, collaboration is more complex than the format allows. Jeff Jarvis decided to blow up the format by getting up on stage with the lights on and ranting about the insanities and inanities of modern education. This was effective in a Lenny Bruce sort of way, but like Bruce, it was the exception that proved the rule that we speakers were bound to a certain form of academic theater. Inspired by Jarvis, I broke the fourth wall and interacted with the audience a couple of times during my talk, but it was perhaps a little superficial.

Regardless of these criticisms—which I give, again, entirely in recognition of the success of the event and with an eye toward improvement for next year—I enjoyed the challenge of doing a TED talk. I’m working on a much more formal Big Lecture at Cambridge University, and TEDxNYED helpfully made me think about the problems with that format as well. Indeed, I’m not blaming TED for the problems of academic theater. I actually believe the fault lies with academics themselves, who have ceded the ground of public intellectualism in the past generation or two, leaving a vacuum that TED and TEDx are happy to fill.

Hopefully—and judging by the tweets and blog posts this is true—the attendees took away more of the advantages than the disadvantages of the format, and will go on from thought to action.

[photo credit: Kevin Jarrett]

WordCamp Ed: Conference on WordPress for Education

From CHNM’s Dave Lester, one of the founders of THATCamp: The Humanities and Technology Camp, comes WordCamp Ed:

WordCamp conferences are taking the blogging community by storm as one-day events to meet fellow WordPress users in regional communities. WordCamp Ed has been organized to focus specifically on WordPress and education. The day-long event will take place November 22, 2008, and will bring together a wide range of institutions of higher ed, professors, high school teachers, and students.

WordCamp Ed will be hosted at the Center for History and New Media at George Mason University, and is co-sponsored by the Center for New Designs in Learning and Scholarship at Georgetown University.

Dave and many others, including CHNM’s Jeremy Boggs, have been hacking and creating plugins for the open-source WordPress blogging platform for some time now. This seems like a great opportunity to see what others are doing and to exchange knowledge and ideas.

Digital Campus #20 – Open to Change

Are open educational resources such as iTunes U and thought-provoking dot-coms such as BigThink.com a distraction from the mission of professors and universities, or the wave of the future? We debate the merits of “open access” intellectual content in the feature story on our twentieth Digital Campus podcast. Also, I report on the mostly good (if a little odd) experience of buying a book from PublicDomainReprints.org, and we discuss the MacBook Air, Flickr Commons, and a variety of tools for manipulating RSS feeds.

10 Most Popular Philosophy Syllabi

It’s time once again to find the most influential syllabi in a discipline—this time, philosophy—as determined by data gleaned from the Syllabus Finder. As with my earlier analysis of the most popular history syllabi, the following list was compiled by running a series of calculations to determine the number of times Syllabus Finder users glanced at a syllabus (had it turn up in a search), the number of times Syllabus Finder users inspected a syllabus (actually went from the Syllabus Finder website to the website of the syllabus to do further reading), and the overall “attractiveness” of a syllabus (defined as the ratio of full reads to mere glances). It goes without saying (but I’ll say it) that this methodology is unscientific and gives an advantage to older syllabi, but it still probably provides a good sense of the most visible and viewed syllabi on the web. Anyway, here are the ten most popular philosophy syllabi.

#1 – Philosophy of Art and Beauty, Julie Van Camp, California State University, Long Beach, Spring 1998 (total of 3992 points)

#2 – Introduction to Philosophy, Andreas Teuber, Brandeis University, Fall 2004 (3699 points)

#3 – Law, Philosophy, and the Humanities, Julie Van Camp, California State University, Long Beach, Fall 2003 (3174 points)

#4 – Introduction to Philosophy, Jonathan Cohen, University of California, San Diego, Fall 1999 (2448 points)

#5 – Comparative Methodology, Bryan W. Van Norden, Vassar College, multiple semesters (1944 points)

#6 – Aesthetics, Steven Crowell, Rice University, Fall 2003 (1913 points)

#7 – Philosophical Aspects of Feminism, Lisa Schwartzman, Michigan State University, Spring 2001 (1782 points)

#8 – Morality and Society, Christian Perring, University of Kentucky, Spring 1996 (1912 points)

#9 – Gay and Lesbian Philosophy, David Barber, University of Maryland, Spring 2002 (1442 points)

#10 – Social and Political Philosophy, Eric Barnes, Mount Holyoke College, Fall 1999 (1395 points)

I will leave it to readers of this blog to assess and compare these syllabi, but will offer two brief comments. First of all, the diversity of topics within this list is notable compared to the overwhelming emphasis on American history among the most popular history syllabi. Aesthetics, politics, law, morality, gender, sexuality, and methodology are all represented. Second, congratulations to Julie Van Camp of California State University, Long Beach, who becomes the first professor with two top syllabi in a discipline. Professor Van Camp was a very early adopter of the web, having established a personal home page almost ten years ago with links to all of her syllabi. Van Camp should watch her back, however; Andreas Teuber of Brandeis is coming up quickly with what seems to be the Platonic ideal of an introductory course on philosophy. In less than two years since its inception his syllabus has been very widely consulted.

[The fine print of how the rankings were determined: 1 point was awarded for each time a syllabus showed up in a Syllabus Finder search result; 10 points were awarded for each time a Syllabus Finder user clicked through to view the entire syllabus; 100 points were awarded for each percent of “attractiveness,” where 100% attractive means that every time a syllabus made an appearance in a search result it was clicked on for further information. For instance, the top syllabus appeared in 2164 searches and was clicked on 125 times (5.78% of the searches), for a point total of 2164 + (125 X 10) + (5.78 X 100) = 3992.]
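For concreteness, the fine-print formula can be sketched in a few lines (the function name is mine, not part of the Syllabus Finder itself):

```python
def syllabus_score(appearances, clicks):
    """Recompute the ranking formula from the fine print: 1 point per
    search appearance, 10 per click-through, and 100 per percent of
    'attractiveness' (click-throughs as a percentage of appearances)."""
    attractiveness_pct = 100 * clicks / appearances
    return appearances + 10 * clicks + 100 * attractiveness_pct

# The top syllabus: 2164 appearances, 125 click-throughs
print(round(syllabus_score(2164, 125)))  # 3992
```

Because attractiveness is a ratio, the formula partially offsets the head start that older, long-indexed syllabi get from raw appearance counts.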

“Legal Cheating” in the Wall Street Journal

In a forthcoming article in the Chronicle of Higher Education, Roy Rosenzweig and I argue that the ubiquity of the Internet in students’ lives and advances in digital information retrieval threaten to erode multiple-choice testing, and much of standardized testing in general. A revealing article in this weekend’s Wall Street Journal shows that some schools are already ahead of the curve: “In a wireless age where kids can access the Internet’s vast store of information from their cellphones and PDAs, schools have been wrestling with how to stem the tide of high-tech cheating. Now some educators say they have the answer: Change the rules and make it legal. In doing so, they’re permitting all kinds of behavior that had been considered off-limits just a few years ago.” So which anything-goes schools are permitting this behavior, and what exactly are they doing?

The surprise is that it is actually occurring in the more rigorous and elite public and private schools, which are allowing students to bring Internet-enabled devices into the exam room. Moreover, they are backed not by liberal education professors but by institutions such as the Bill and Melinda Gates Foundation and pragmatic observers of the information economy. As the WSJ (as well as Roy and I) points out, their argument parallels the one made for introducing calculators into mathematics education in the 1980s, which eventually led to the inclusion of those formerly taboo devices on the SATs in 1994, a move that few have since criticized. Today, if one of the main tools workers use in a digital age is the Internet, why not include it in test-taking? After all, asserts M.I.T. economist Frank Levy, it’s more important to locate and piece together information about the World Bank than to know when it was founded. “This is the way the world works,” Harvard Director of Admissions Marlyn McGrath commonsensically notes.

Of course, the bigger question, only partially addressed by the WSJ article, is how the use of these devices will change instruction in fields such as history. From elementary through high school, such instruction has often been filled with the rote memorization of dates and facts, which are easily testable (and rapidly graded) on multiple-choice forms. But we should remember that the multiple-choice test is only a century old; there have been, and there will surely be again, more instructive ways to teach and test such rich disciplines as history, literature, and philosophy.