It’s no secret that digital technologies and networks are becoming tremendously disruptive to academia by introducing new ways of doing research, publishing, teaching and collaborating with peers. But few universities have shown much gusto for tackling this very difficult topic, let alone trying to devise some working solutions. So USC deserves some credit for a serious and sophisticated one-day symposium on the topic in January 2011.
Hosted by the USC Office of Research and the Norman Lear Center at the USC Annenberg School for Communication & Journalism, the event convened a highly interdisciplinary set of participants – from engineering, the social sciences, medical fields and the humanities. The idea was to explore some of the innovative ways that academic research is now occurring and what university administrations should do in response. Among the questions posed at the symposium:
- How do you get credit toward tenure or promotion if your work as an academic is part of a vast online collaboration?
- How should peer review be done now that online platforms make it easy to invite talented outsiders from other disciplines, and even non-academics, to review work?
- With everyone staring into their computer screens, how should research institutions design real-world spaces so that people can actually have serendipitous in-person encounters and collaborations?
I served as rapporteur for that event, and now the final report has been published. You can download a pdf copy of Creativity & Collaboration: Technology and the Future of Research in the Academy here.
In other words, if academia is a commons, how should it be re-imagined and re-structured? What new social norms may need to emerge within disciplines? I urge you to read the entire report, but for the moment, let me review a few specific academic research and publication projects that posed special challenges.
The Large Hadron Collider, or LHC, a massive scientific instrument built in “a very expensive hole in the ground” outside Geneva, Switzerland, is one example of a high-tech system that has required entirely new sorts of organizational structures and norms to enable scientists to share information and work together. Carl Kesselman, an engineering professor at USC, explained, “There are three or four experiments running off the Collider, and the problem they faced is that they were generating tremendous amounts of data. This required tremendous amounts of data analysis among a global community of scientists.”
To deal with this challenge, scientists ended up inventing an entirely new sort of global informatics infrastructure, “grid computing,” to manage the various scientific experiments, data-sharing and results-sharing among multiple scientific communities. The project required the creation of a “virtual organization” that was not just about building websites and sharing digital files, but about sharing some basic processes – computing, data management, services and the use of scientific instruments – as well as the business rules, policies and provenance of the information.
Another sort of high-level scientific collaboration is the Alzheimer’s Disease Neuroimaging Initiative, or ADNI, a revolutionary study in the clinical neurosciences in its approach to collaboration and data-sharing. The idea behind the database, administered by the National Institute on Aging, is to aggregate data from some 57 different research sites so that “any scientist sitting at his or her computer anywhere in the world can go into the ADNI database, day or night, and download any of the data, analyze the data however they want, with whatever novel methods or hypotheses they have, and then can publish the data freely,” in the words of Neil Buckholtz, the director of dementia research at the National Institute on Aging. This mass collaboration of scientists around a database has been a fantastic idea – but implementing it has required considerable re-engineering of social norms and disciplinary practices.
Another fascinating experiment is the revamping of peer review at Shakespeare Quarterly. The editors came up with a new, hybrid review system for article submissions. The editors continued to screen articles upfront and to make the final publication decision, as they always had, but the journal also opened up the discussion to newcomers. This was seen as particularly important for a journal that publishes a lot of interdisciplinary studies. Katherine Rowe, a Bryn Mawr professor who sits on the editorial board of the journal, noted:
“What was novel about our outcomes was that the publication decisions were based on the push and pull among experts, rather than on the traditional two data points of anonymous, expert reviewers. There was a peer-to-peer effect in which the editors could begin to see where the field itself was grappling with problems that hadn’t yet been solved – as opposed to focusing on how a particular author might not be successfully grappling with a problem. That peer-to-peer effect among the reviewers and with the authors – as they responded to each other and grappled with particularly difficult issues – allowed us as editors to make decisions that were in a way accountable to the field as a whole, rather than to the two reviewers that we might traditionally be looking to.”
Tara McPherson, who teaches courses in digital media, television and popular culture at the USC School of Cinematic Arts, explained that the “vernacular archives” of popular culture – or what scientists might call “datasets” – are expanding at a phenomenal rate. YouTube now hosts more than 150 million videos, with thirteen hours of content uploaded every minute. Flickr surpassed five billion photographs in September 2010, and Facebook claims that 2.5 billion photos are uploaded to its site every month.
Then there are the scholarly datasets of text, video and images used in the humanities. The USC Shoah Foundation Institute for Visual History and Education has a collection of nearly 52,000 video testimonies in 32 languages and from 56 countries, making it the largest visual history archive in the world. McPherson also pointed to the National Science Foundation-funded cyber-infrastructure DataOne (Data Observation Network for Earth), a versatile, distributed framework for observational data about the earth – another very large dataset.
What does all this mean? McPherson suggests that the rise of these and other vast datasets is changing the very forms of scholarly production. People are starting to ask, “Can we have the communal nature of a blog or the multimedia capacity of YouTube incorporated into our scholarly practices? Can these vast archives change not only how we do our research, but how we share and publish it, producing new scholarly outcomes? Can our analyses 'live with' our data? What would it mean if the interpretations we are producing actually lived side-by-side with our evidence? I think it would make us more accountable because we couldn’t make wild claims. But it would also mean our interpretations could be layered.”
So some scholars see the creation of useful meta-data tools (which make it easier to study and analyze large datasets) as a serious scholarly activity that deserves full recognition from university administrators. But then questions arise: What constitutes a significant, meritorious contribution to meta-data, and who is the best judge?
The Norman Lear Center report covers many more of these sorts of issues. There are not always answers, but the questions are on point. I might add that the publication itself (available in hard copy) is handsomely designed and enhanced by “mind mapping” illustrations by cartoonist Lloyd Dangle. Kudos as well to my colleagues Marty Kaplan and Johanna Blakley for their considerable research and planning in organizing the event.
Comments
Academia only?
If I replace the word "academia" in your post with "corporate universities", or even more generally with "collaboration within corporations", it is remarkable how relevant it remains.
Only a part of Academia
Only the part of academia that deals with research needs a restructuring of its computing infrastructure, as it takes on bigger and bigger projects. The LHC example is somewhat extreme and an isolated case. Does that mean the entire infrastructure of academia must change right away? No, I think that as researchers take on bigger projects, each project that outgrows the existing infrastructure will bring a new update to it, step by step.