Author: Dan Cohen

Launching the Boston Research Center

[Image: Boston bridges. Photo: Adam Glanzman/Northeastern University]

I’m delighted that the news is now out about the Andrew W. Mellon Foundation’s grant to Northeastern University Library to launch the Boston Research Center. The BRC will seek to unify major archival collections related to Boston, hundreds of data sets about the city, digital modes of scholarship, and a wide array of researchers and visualization specialists to offer a seamless environment for studying and displaying Boston’s history and culture. It will be great to work with my colleagues at Northeastern and regional partners to develop this center over the coming years. Having grown up in Boston, and now having returned as an adult, I find the project has a personal significance for me as well.

I’m also excited that the BRC will build upon, and combine, some of the signature strengths of Northeastern that drew me to the university last year. For decades, the library has been assembling collections and working with local communities to preserve materials and stories related to the city. We now have the archives of a number of local and regional newspapers, and the library has been active in gathering oral and documentary histories of nearby communities through projects such as the Lower Roxbury Black History Project. We also have strong connections with other important regional collections and institutions, such as the Boston Public Library and the Boston Library Consortium, and with data sets produced by Boston’s municipal government and other sources, through our campus’s leadership in BARI, the Boston Area Research Initiative.

My friends in digital humanities will know that Northeastern has a world-class array of faculty and researchers doing cutting-edge, interdisciplinary computational analysis. We have the NULab for Texts, Maps, and Networks, the Network Science Institute, numerous faculty in our College of Arts, Media, and Design who work on digital storytelling and information design, and the library has its own terrific Digital Scholarship Group and dedicated specialists in GIS and data visualization. We will all be working together, and with many others from beyond the university, to imagine and develop large-scale projects that examine major trends and elements of Boston, such as immigration, neighborhood transformations, economic growth, and environmental changes. There will also be an opportunity for smaller-scale stories to be documented, and of course the BRC itself will be open to anyone who would like to research the city or specific communities. Because Boston has a long and richly documented history, a coastal location, and educational, scientific, and commercial institutions with longstanding global relationships, the study of the city also means the study of themes that are broadly important and applicable.

My thanks to the Mellon Foundation for their generous support. It should be fascinating to watch all of this come together—stay tuned.

Help Snell Library Help Others

I am extremely fortunate to work in a library, an institution that is designed to help others and to share knowledge, resources, and expertise. Snell Library is a very busy place. Every year, we have two million visits. On some weekdays we receive well over 10,000 visitors, with thousands of them in the building at one time. It’s great to see a library so fully used and appreciated.

Just as important, Snell Library fosters projects that help others in our Boston community and well beyond. Our staff has worked alongside members of the Lower Roxbury community to record, preserve, and curate oral histories of their neighborhood; with other libraries and archives to aggregate and make accessible thousands of documents related to school desegregation in Boston; and with other institutions and people to save the personal stories and images of the Boston Marathon bombing and its aftermath.

Our library is the home of the archives of a number of Boston newspapers, including The Boston Phoenix, the Gay Community News, and the East Boston Community News, with more to come. The Digital Scholarship Group housed in the library supports many innovative projects, including the Women Writers Project and the Early Caribbean Digital Archive. We have a podcast that explores new ideas and discoveries, and tries to help our audience understand the past, present, and future of our world better.

It’s National Library Week, and today is Northeastern’s Giving Day. So I have a small request of those who read my blog and might appreciate the activities of such a library as Snell: please consider a modest donation to my library to help us help others. And if at least 50 students, parents, or friends donate today—and I’d really love that to be 100, even at $10—I’ll match that with $1,000 of my own. Thank you. 

>> NU Giving Day – Give to the Library <<

What’s New, Episode 14: Privacy in the Facebook Age

On the latest What’s New Podcast from Northeastern University Library, I interview Woody Hartzog, who has a new book just out this week from Harvard University Press entitled Privacy’s Blueprint: The Battle to Control the Design of New Technologies. We had a wide-ranging discussion over a half-hour, including whether (and if so, how) Facebook should be regulated by the government, how new listening devices like the Amazon Echo should be designed (and regulated), and how new European laws that go into effect in May 2018 may (or may not) affect the online landscape and privacy in the U.S.

Woody provides a plainspoken introduction to all of these complicated issues, with some truly helpful parallels to ethical and legal frameworks in other fields (such as accounting, medicine, and legal practice), and so I strongly recommend a listen to the episode if you would like to get up to speed on this important aspect of our contemporary digital lives. Given Mark Zuckerberg’s testimony today in front of Congress, it’s especially timely.

[Subscribe to What’s New on iTunes or Google Play]

Authority and Usage and Emoji

Maybe it’s a subconscious effect of my return to the blog, but I’ve found myself reading more essays recently, which led me back to the nonfiction work of David Foster Wallace.1 Despite the seeming topical randomness of his essays—John McCain’s 2000 presidential campaign, the tennis player Tracy Austin, a Maine lobster fest—there is a thematic consistency in DFW’s work, which revolves around the tension between authority and democracy, between high-culture intellectualism and overthinking on one side and low-culture entertainment and lack of self-reflection on the other. That is, his essays are about America and Americans.2

Nowhere is this truer than in “Authority and American Usage,” his monumental review of Bryan A. Garner’s A Dictionary of Modern American Usage.3 DFW uses this review of a single book to recount and assess the much longer debate between prescriptive language mavens who sternly offer correct English usage, and the more permissive, descriptive scholars who eschew hard usage rules for the lived experience of language. That is, authority and democracy.

The genius of Garner, in DFW’s view, is that he is an authority on American English who recognizes and even applauds regional and communal variations, without wagging his finger, but also without becoming all loosey-goosey and anything-goes. Garner manages to have his cake and eat it too: he recognizes, with the democrats, that English (and language in general) is fluid and evolves and simply can’t be fixed in some calcified Edwardian form, but that it is also helpful to have rules and some knowledge of those rules so that you can express yourself with precision and persuade others. Even democratic descriptivists should want some regularity and authoritative usage because we all speak and write in a social context, and those we speak with and write to, whether we like it or not, pick up on subtle cues in usage to interpret and judge our intent and status within the community. Garner’s fusion of democracy and authority is immensely appealing to DFW; it’s like he’s figured out how to square the circle.

But Garner’s synthesis only works if the actual communication of your well-chosen words is true to what you had mentally decided to use, and here is where the seemingly odd inclusion of emoji in the title of this post comes into play.4 Emoji upset Garner’s delicate balance and upend DFW’s intense desire for precise communication because they are rendered very differently across digital platforms. Emoji entail losing control of the very important human capability to choose the exact form and meaning of our words. (The variation in emoji glyphs also contributes to the difficulty of archiving current human expression, but that is the subject of another post.) See, for example, the astonishing variety of the “astonished face” emoji across multiple platforms:

[Image: the “astonished face” emoji as rendered on multiple platforms]

This is, unfortunately but unsurprisingly, an artifact of the legal status of emoji, which, unlike regular old English words, apparently (or potentially) can be copyrighted in specific renderings. So lawsuit-averse giant tech companies have resorted to their own artistic execution of each emoji concept, and these renderings can have substantially different meanings, often rather distant from authorial intent. As legal and emoji scholar Eric Goldman summarizes, “Senders and recipients on different platforms are likely to see different implementations and decode the symbols differently in ways that lead to misunderstandings.” Think about someone selecting the fairly faithful second emoji from the left, above (from Apple), and texting it to someone who sees it rendered as the X-eyed middle glyph (from Facebook; Goldman, deadpan: “a depiction typically associated with death”), or the third from the left (from Google, who knows).
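
To make the mechanics concrete: what a sender actually transmits is a Unicode codepoint, and the glyph a recipient sees is whatever artwork their platform’s font supplies for that codepoint. Here is a minimal Python sketch (U+1F632 is the “astonished face” codepoint; how it looks on screen depends entirely on the device and font doing the displaying):

    import unicodedata

    # The sender picks a single Unicode codepoint...
    astonished = "\U0001F632"

    print(astonished)                    # the glyph shown depends on the viewer's platform font
    print(hex(ord(astonished)))          # 0x1f632 -- the only thing actually transmitted
    print(unicodedata.name(astonished))  # 'ASTONISHED FACE'

The codepoint is identical everywhere; only the rendering differs, which is exactly the gap Goldman describes.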

In short, emoji are a portent of a day when the old debate about authority vs. democracy in English usage will seem a quaint artifact of the twentieth century, because our digital communications have another layer of abstraction that makes it even more difficult to express ourselves clearly. There is no doubt that David Foster Wallace would have dropped many foul-mouthed emoji at that possibility.

  1. Since this post is, in part, about the subtleties and importance of word choice, we might quibble here with the term “essays” for DFW’s nonfiction work. Although it is indeed the term stenciled on the cover of his nonfiction books, what is contained therein is more like a menagerie of what might be best, albeit simplistically, called writing, including steroidal book reviews, random journalistic junkets, and non-random literary slam-downs.
  2. Were DFW still with us and reading blogs, which is, let’s admit it, a laugh-out-loud impossibility, he would likely object to this simplification of his essays, which in many cases present themselves more like thick description married with extended—Stretch-Armstrong-level extended—philosophical tangents. He would be doubly annoyed with my needling of this point in a footnote, which is a crass and transparent and frankly lame mimicry of DFW himself, although I hope he would have awarded consolation points for the Möbius-strip referentiality here. And objectively, the style of DFW’s writing, both his fiction and nonfiction, combined snoot-grade polysyllabic dictionary-grabbers with unexpected but also well-timed f-bombs, and this fusion has always been something of a tell.
  3. The original title of DFW’s Garner review was “Tense Present: Democracy, English and Wars over Usage,” which is, let’s face it, more clever.
  4. N.B. I use emoji as both the singular and plural form, à la sushi, although this is debated and is a perfect case study in authoritarian vs. democratic English usage. Robinson Meyer talks to the prescriptive language experts and Googles the democratic use of emoji vs. emojis in a remarkably DFW-esque piece in The Atlantic.

The Post-Coding Generation?

When I was in sixth grade our class got an Apple ][ and I fell in love for the first time. The green phosphorescence of the screen and the way text commands would lead to other text instantly appearing were magical. The true occult realm could be evoked by moving beyond the command line and into assembly language, with mysterious hexadecimal pairs producing swirling lines and shapes on the screen. It was enthralling, and led to my interest in programming at an early age. I now have an almost identical Apple ][ in the corner of my office as a totem from that time.

[Image: the Apple ][ in the corner of my office]

Of course, very few people learn assembly language anymore, and for good reason. The history of computing is the history of successive generations of coders moving up the technical stack, from low-level languages like assembly to higher languages that put all of the rudimentary calculations behind a curtain.
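
One small way to see that curtain, sketched in Python: the standard library’s dis module will show the lower-level, stack-machine instructions hiding behind a single high-level expression. (This is bytecode rather than assembly, but the layering it reveals is the same idea.)

    import dis

    # One high-level line that "just works"...
    add = lambda a, b: a + b

    # ...and the sequence of lower-level instructions it compiles to,
    # which most programmers never need to look at.
    dis.dis(add)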

I’ve been thinking about this coding escalator recently because of my kids and the still-vibrant “learn to code” movement. My kids are in their early teens and I can say as a proud parent that they are very good at all of the skills needed to be great programmers. They also go to a public school that was the archrival of the public school I went to—in the Boston-area math league. The school is filled with similar kids, sons and daughters of highly educated people, many of whom work in technical and scientific fields, or at one of Boston’s many universities.

Yet I would characterize the general interest of my kids’ generation in coding as lukewarm. They get it, they see the power of programming, and yet they are much more interested in the creativity that can occur on top of the technical stack. I suppose we should not be surprised. They are the first generation whose interactions with computers were with devices that do not have a command line—that is, with smartphones and tablets. So naturally they are drawn to the higher-level aspects of computing, which don’t seem like computing at all to my generation. While some may roll their eyes at Apple adding an “Everyone Can Create” initiative this week as a counterpart to “Everyone Can Code,” my kids thought this was a truly interesting development.

To be sure, those who know how to code, and code well, will always be able to shape computer platforms and apps in powerful ways, just as those who understand what’s under the hood of their car can maximize its performance. The skills one learns in programming are broadly applicable, and under the right circumstances coding can stir the imagination about what is possible in the digital realm. But most of us just want to drive, even in a suboptimal automobile, and get somewhere for some other reason, and many “learn to code” programs are frankly not especially imaginative.

In Digital History, Roy Rosenzweig and I wrote that although they are both noble professions, “historians planning a digital project should think like architects, not like plumbers.” I suspect my kids’ generation may see coding as plumbing, and would prefer to work on the design of the overall house. I’m not sure that we have fully accounted for this next generation’s shift yet, or have even come to realize that at some point the coding escalator would reach the top, and those on it would step off.

Activism, Community Input, and the Evolution of Cities: My Interview with Ted Landsmark

I’ve had a dozen great guests on the What’s New podcast, but this week’s episode features a true legend: Ted Landsmark. He is probably best known as the subject of a shocking Pulitzer Prize-winning photograph showing a gang of white teens at a rally against school desegregation attacking him with an American flag. The image became a symbol of tense race relations in the 1970s, not only in Boston but nationwide.

[Photos: Ted Landsmark. Credits: Stanley Forman/Brian Fluharty]

He should be better known, however, for his decades of work shaping the city of Boston and the greater Boston area, and for his leadership in education, transportation planning, architecture, and other critical aspects of the fabric of the city. The assault on him on City Hall Plaza in Boston only intensified his activism, and set him on a path to the center of how the city has developed over the last 40 years. It’s a remarkable story.

On the podcast Ted Landsmark recounts not only this personal history, but also the history of Boston in general, and he provides a 360-degree view of how cities are designed and managed, and how responsive (or unresponsive) they are to community needs and desires. His sense of how urban feedback systems work, from local politics to technology like the 311 phone number many cities have implemented to hear from their citizens, is especially smart and helpful.

I hope you’ll tune in.

Revisiting Mills Kelly’s “Lying About the Past” 10 Years Later

If timing is everything, history professor Mills Kelly didn’t have such great timing for his infamous course “Lying About the Past.” Taught at George Mason University for the first time in 2008, and then again in 2012—both, notably, election years, although now seemingly from a distant era of democracy—the course stirred enormous controversy and then was never taught again in the face of institutional and external objections. Some of those objections understandably remain, but “Lying About the Past” now seems incredibly prescient and relevant.

Unlike other history courses, “Lying About the Past” did not focus on truths about the past, but on historical hoaxes. As a historian of Eastern Europe, Kelly knew a thing or two about how governments and other organizations can shape public opinion through the careful crafting of false, but quite believable, information. Also a digital historian, Kelly understood how modern tools like Photoshop could give even a college student the ability to create historical fakes, and then to disseminate those fakes widely online.

In 2008, students in the course collaborated on a fabricated pirate, Edward Owens, who supposedly roamed the high (or low) seas of the Chesapeake Bay in the 1870s. (In a bit of genius marketing, they called him “The Last American Pirate.”) In 2012, the class made a previously unknown New York City serial killer materialize out of “recently found” newspaper articles and other documents.

It was less the intellectual focus of the course, which was really about the nature of historical truth and the importance of careful research, than the dissemination of the hoaxes themselves that got Kelly and his classes in trouble. In perhaps an impolitic move, the students ended up adding and modifying articles on Wikipedia, and as YouTube recently discovered, you don’t mess with Wikipedia. Although much of the course was dedicated to the ethics of historical fakes, for many who looked at “Lying About the Past,” the public activities of the students crossed an ethical line.

But as we have learned over the last two years, the mechanisms of dissemination are just as important as the fake information being disseminated. A decade ago, Kelly’s students were exploring what became the dark arts of Russian trolls, putting their hoaxes on Twitter and Reddit and seeing the reactive behaviors of gullible forums. They learned a great deal about the circulation of information, especially when bits of fake history and forged documents align with political and cultural communities.

Yoni Appelbaum, a fellow historian, assessed the outcome of “Lying About the Past” more generously than the pundits who piled on once the course circulated on cable TV:

If there’s a simple lesson in all of this, it’s that hoaxes tend to thrive in communities which exhibit high levels of trust. But on the Internet, where identities are malleable and uncertain, we all might be well advised to err on the side of skepticism.

History unfortunately shows that erring on the side of skepticism has not exactly been a widespread human trait. Indeed, “Lying About the Past” showed the opposite: that those who know just enough history to make plausible, but false, variations in its record, and then know how to push those fakes to the right circles, have the chance to alter history itself.

Maybe it’s a good time to teach some version of “Lying About the Past” again.

Back to the Blog

One of the most-read pieces I’ve written here remains my entreaty “Professors Start Your Blogs,” which is now 12 years old but might as well have been written in the Victorian age. It’s quaint. In 2006, many academics viewed blogs through the lens of LiveJournal and other teen-oriented, oversharing diary sites, and it seemed silly to put more serious words into that space. Of course, as I wrote that blog post encouraging blogging for more grown-up reasons, Facebook and Twitter were ramping up, and all of that teen expression would quickly move to social media.

Then the grown-ups went there, too. It was fun for a while. I met many people through Twitter who became and remain important collaborators and friends. But the salad days of “blog to reflect, tweet to connect” are gone. Long gone. Over the last year, especially, it has seemed much more like “blog to write, tweet to fight.” Moreover, the way that our writing and personal data has been used by social media companies has become more obviously problematic—not that it wasn’t problematic to begin with.

Which is why it’s once again a good time to blog, especially on one’s own domain. I’ve had this little domain of mine for 20 years, and have been writing on it for nearly 15 years. But like so many others, the pace of my blogging has slowed down considerably, from one post a week or more in 2005 to one post a month or less in 2017.

The reasons for this slowdown are many. If I am to cut myself some slack, I’ve taken on increasingly busy professional roles that have given me less time to write at length. I’ve always tried to write substantively on my blog, with posts often going over a thousand words. When I started blogging, I committed to that model of writing here—creating pieces that were more like short essays than informal quick takes.

Unfortunately this high bar made it more attractive to put quick thoughts on Twitter, and amassing a large following there over the last decade (this month marks my ten-year anniversary on Twitter) only made social media more attractive. My story is not uncommon; indeed, it is common, as my RSS reader’s weekly article count will attest.

* * *

There has been a recent movement to “re-decentralize” the web, returning our activities to sites like this one. I am unsurprisingly sympathetic to this as an idealist, and this post is my commitment to renew that ideal. I plan to write more here from now on. However, I’m also a pragmatist, and I feel the re-decentralizers have underestimated what they are up against, which is partially about technology but mostly about human nature.

I’ve already mentioned the relative ease and short amount of time it takes to express oneself on centralized services. People are chronically stretched, and building and maintaining a site, and writing at greater length than one or two sentences, seem like real work. When I started this site, I didn’t have two kids and two dogs and a rather busy administrative job. Overestimating the time regular people have to futz with technology was the downfall of desktop Linux, and a key reason many people use Facebook as their main outlet for expression rather than a personal site.

The technology for self-hosting has undoubtedly gotten much better. When I added a blog to dancohen.org, I wrote my own blogging software, which sounds impressive, but was just some hacked-together PHP and a MySQL database. This site now runs smoothly on WordPress, and there are many great services for hosting a WordPress site, like Reclaim Hosting. It’s much easier to set up and maintain these sites, and there are even decent mobile apps from which to post, roughly equivalent to what Twitter and Facebook provide. Platforms like WordPress also come with RSS built in, which is one of the critical, open standards that are at the heart of any successful version of the open web in an age of social media. Alas, at this point most people have invested a great deal in their online presence on closed services, and inertia holds them in place.
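
As a small illustration of why that matters: because RSS is an open, standardized format, any client can read a self-hosted blog’s feed with nothing more than an HTTP request and an XML parser. A minimal sketch in Python using only the standard library (the /feed/ path is WordPress’s default feed location, and the URL here is just a placeholder):

    from urllib.request import urlopen
    import xml.etree.ElementTree as ET

    # WordPress exposes an RSS 2.0 feed at /feed/ by default.
    FEED_URL = "https://example.com/feed/"  # substitute any WordPress blog

    with urlopen(FEED_URL) as response:
        tree = ET.parse(response)

    # RSS 2.0 nests <item> elements inside <channel>; the structure is the same
    # everywhere, which is what lets any reader follow any blog without a platform in between.
    for item in tree.getroot().findall("./channel/item"):
        print(item.findtext("title"), "-", item.findtext("link"))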

It is psychological gravity, not technical inertia, however, that is the greater force against the open web. Human beings are social animals and centralized social media like Twitter and Facebook provide a powerful sense of ambient humanity—the feeling that “others are here”—that is often missing when one writes on one’s own site. Facebook has a whole team of Ph.D.s in social psychology finding ways to increase that feeling of ambient humanity and thus increase your usage of their service.

When I left Facebook eight years ago, it showed me five photos of my friends, some with their newborn babies, and asked if I was really sure. It is unclear to me if the re-decentralizers are willing to be, or even should be, as ruthless as this. It’s easier to work on interoperable technology than social psychology, and yet it is on the latter battlefield that the war for the open web will likely be won or lost.

* * *

Meanwhile, thinking globally but acting locally is the little bit that we can personally do. Teaching young people how to set up sites and maintain their own identities is one good way to increase and reinforce the open web. And for those of us who are no longer young, writing more under our own banner may model a better way for those who are to come.

The Significance of the Twitter Archive at the Library of Congress

It started with some techies casually joking around, and ended with the President of the United States being its most avid user. In between, it became the site of comedy and protest, several hundred million human users and countless bots, the occasional exchange of ideas and a constant stream of outrage.

All along, the Library of Congress was preserving it all. Billions of tweets, saved over 12 years, now rub shoulders with books, manuscripts, recordings, and film among the Library’s extensive holdings.

On December 31, however, this archiving will end. The day after Christmas, the Library announced that it would no longer save all tweets after that date, but would instead choose tweets to preserve “on a very selective basis,” for major events, elections, and matters of political import. The rest of Twitter’s giant stream will flow by, untapped and ephemeral.

The Twitter archive may not be the record of our humanity that we wanted, but it’s the record we have. Due to Twitter’s original terms of service and the public availability of most tweets, which stand in contrast to many other social media platforms, such as Facebook and Snapchat, we are unlikely to preserve anything else like it from our digital age.

Undoubtedly many would consider that a good thing, and that the Twitter archive deserves the kind of mockery that flourishes on the platform itself. What can we possibly learn from the unchecked ramblings and ravings of so many, condensed to so few characters?

Yet it’s precisely this offhandedness and enforced brevity that makes the Twitter archive intriguing. Researchers have precious few sources for the plain-spoken language and everyday activities and thought of a large swath of society.

Most of what is archived is indeed chosen on a very selective basis, assessed for historical significance at the time of preservation. Until the rise of digital documents and communications, the idea of “saving it all” seemed ridiculous, and even now it seems like a poor strategy given limited resources. Archives have always had to make tough choices about what to preserve and what to discard.

However, it is also true that we cannot always anticipate what future historians will want to see and read from our era. Much of what is now studied from the past are materials that somehow, fortunately, escaped the trash bin. Cookbooks give us a sense of what our ancestors ate and celebrated. Pamphlets and more recently zines document ideas and cultures outside the mainstream.

Historians have also used records in unanticipated ways. Researchers have come to realize that the Proceedings of the Old Bailey, transcriptions from London’s central criminal court, are the only record we have of the spoken words of many people who lived centuries ago but were not in the educated or elite classes. That we have them talking about the theft of a pig rather than the thought of Aristotle only gives us greater insight into the lived experience of their time.

The Twitter archive will have similar uses for researchers of the future, especially given its tremendous scale and the unique properties of the platform behind the short messages we see on it. Preserved with each tweet, but hidden from view, is additional information about tweeters and their followers. Using sophisticated computational methods, it is possible to visualize large-scale connections within the mass of users that will provide a good sense of our social interactions, communities, and divisions.
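
To give a rough sense of what that kind of analysis could look like, here is a hedged sketch using networkx, with invented, simplified tweet records standing in for the archive’s real metadata (the actual schema is Twitter’s, not this):

    import networkx as nx

    # Hypothetical, simplified tweet records: who mentioned whom.
    tweets = [
        {"author": "alice", "mentions": ["bob", "carol"]},
        {"author": "bob",   "mentions": ["alice"]},
        {"author": "carol", "mentions": ["alice", "bob"]},
    ]

    # Build a directed graph of interactions from the metadata.
    G = nx.DiGraph()
    for t in tweets:
        for m in t["mentions"]:
            G.add_edge(t["author"], m)

    # Simple structural measures: who sits at the center of the conversation,
    # and whether the graph splits into separate clusters.
    print(nx.in_degree_centrality(G))
    print(list(nx.weakly_connected_components(G)))

Even this toy graph yields centrality scores and connected components, the raw material for the maps of communities and divisions described above.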

Since Twitter launched a year before the release of the iPhone, and flourished along with the smartphone, the archive is also a record of what happened when computers evolved from desktop to laptop to the much more personal embrace of our hands.

Since so many of us now worry about the impact of these devices and social media on our lives and mental health, this story and its lessons may ultimately be depressing. As we are all aware, of course, history and human expression are not always sweetness and light.

We should feel satisfied rather than dismissive that we will have a dozen years of our collective human expression to look back on, the amusing and the ugly, the trivial and, perhaps buried deep within the archive, the profound.

Institutionalizing Digital Scholarship (or Anything Else New in a Large Organization)

I recently gave a talk at Brown University on “Institutionalizing Digital Scholarship,” and upon reflection it struck me that the lessons I tried to convey were more generally applicable. Everyone prefers to talk about innovation, rather than institutionalization, but the former can only have a long-term impact if the latter occurs. What at first seems like a dreary administrative matter is actually at the heart of real and lasting change.

New ideas and methods are notoriously difficult to integrate into large organizations. Institutions and the practitioners within them, outside of and within academia (perhaps especially within academia?), too frequently claim to be open-minded but often exhibit a closed-mindedness when the new impinges upon their area of work or expertise. One need only look at the reaction to digital humanities and digital scholarship over the last two decades, and the antagonism and disciplinary policing they are still subject to, often from adjacent scholars.

In my talk I drew on the experience of directing the Roy Rosenzweig Center for History and New Media at George Mason University, the Digital Public Library of America, and now the Northeastern University Library. The long history of RRCHNM is especially helpful as a case study, since it faced multiple headwinds, and yet thrived, in large part due to the compelling vision of its founder and the careful pursuit of opportunities related to that vision by scores of people over many years.

If you wish to digest the entire subject, please watch my full presentation. But for those short on time, here are the three critical elements of institutionalization I concluded with. If all three of these challenging processes occur, you will know that you have successfully and fully integrated something new into an organization.

Routinizing

At first, new fields and methods are pursued haphazardly, as practitioners try to understand what they are doing and how to do it. In digital scholarship, this meant a lot of experimentation. In the 1990s and early 2000s, digital projects that advanced scholarly theories tried out new technologies eclectically. Websites were often hand-coded and distinctive. But in the long run, such one-off, innovative projects were unsustainable. The new scholarly activity had to be routinized into a common, recognizable grammar and standardized formats and infrastructure, both for audiences to grasp genres and for projects to be technically sustainable over time.

At RRCHNM, this meant that after we realized we were making the same kind of digital historical project over and over, by hand, we created generalized software, Omeka, through which we could host an infinite number of similar projects. Although it reduced flexibility somewhat, Omeka made new digital projects much easier to launch and sustain. Now there are hundreds of institutions that use the software and countless history (and non-history) projects that rely on it.

Normalizing

To become institutionalized, new activities cannot remain on the fringes. They have to become normalized, part of the ordinary set of approaches within a domain. Practitioners shouldn’t even think twice before engaging in them. Even those outside of the discipline have to recognize the validity of the new idea or method; indeed, it should become unremarkable. (Fellow historians of science will catch a reference here to Thomas Kuhn’s “normal science.”) In academia, the path to normalization often—alas, too often—expresses itself primarily around concerns over tenure. But the anxiety is broader than that and relates to how new ideas and methods receive equal recognition (broadly construed) and especially the right support structures in places like the library and information technology unit.

Depersonalizing

The story of anything new often begins with one or a small number of people, like Roy Rosenzweig, who advanced a craft without caring about the routine and the normal. In the long run, however, for new ideas and methods to last, they have to find a way to exist beyond the founders, and beyond those who follow the founders. RRCHNM has now had three directors and hundreds of staffers, but similar centers have struggled or ceased to exist after the departure of their founders. This is perhaps the toughest, and final, aspect of institutionalization. It’s hard to lose someone like Roy. On the other hand, it’s another sign of his strong vision that the center he created was able to carry on and strengthen, now over a decade after he passed away.