Information, Communication & Society



THE NEXT DECADE IN INTERNET TIME

Leah A. Lievrouw

Department of Information Studies, University of California, Los Angeles, 216 GSE&IS Building, Box 951520, Los Angeles, CA 90095-1520, USA

Version of record first published: 18 Apr 2012.

To cite this article: Leah A. Lievrouw (2012) 'The next decade in Internet time', Information, Communication & Society, vol. 15, no. 5, pp. 616-638.

To link to this article: http://dx.doi.org/10.1080/1369118X.2012.675691


Leah A. Lievrouw
THE NEXT DECADE IN INTERNET TIME
Ways ahead for new media studies


In this paper, three features of the Internet/new media that have developed over the last decade are discussed: the 'relational' Internet, the 'enclosed' Internet, and the 'mean world' Internet. These features correspond to the three interrelated elements of new media infrastructure: the practices in which people engage to interact and share information and meaning; the tools, devices, or artifacts that people create and use in order to do so; and the social arrangements or institutional forms that develop out of and around those practices and tools. Together, the three features have had an important influence on the ways that new media are understood and used and have helped shift popular discourses and the study of new media from an emphasis on possibility, novelty, adaptability, and openness toward greater preoccupations with risk, conflict, vulnerability, routinization, stability, and control. Given these conditions, the author proposes that three problem areas, again corresponding to practices, tools, and social arrangements, may be important directions for new media studies over the next decade in Internet time. Network literacies and pedagogies that prepare individuals to be full and effective participants in society, politics, and culture must be developed and implemented. Dead media may pose increasing challenges to sustainable cultural heritage as well as to ever more intrusive regimes of total surveillance and capture of personal information, enabling a 'right to be forgotten'. Commons knowledge projects may challenge and even reconfigure the foundations of institutional authority, expertise, legitimacy, and power.

Keywords: communication studies; cyberculture; ICTs; mobile technology; surveillance/privacy

(Received 17 November 2011; final version received 12 March 2012)

Introduction
The Internet is now over 40 years old. While many early visions and expectations for networked computing and telecommunications have been realized, numerous others (for both good and ill) that the original developers of the
system could scarcely have imagined have emerged. Some of the most significant changes have only appeared in the decade since the Oxford Internet Institute was launched. In this paper, I begin with a discussion of three features of the current Internet/new media landscape that have emerged over the last decade: the 'relational' Internet, the increasingly interpersonal and personally customized character of online and mobile communication; the 'enclosed' Internet, resulting from growing technological and legal restrictions on new media devices and systems; and the 'mean world' Internet, the sense of risk and exposure online that has been used to justify the expansion of increasingly invasive private and state surveillance/security regimes. These conditions have helped shift popular perceptions of online communication and discourses about new media from a longstanding emphasis on possibility, novelty, adaptability, and openness toward current preoccupations with risk, conflict, vulnerability, routinization, stability, and control. They have also fundamentally shaped the study and understanding of networked media and information technologies and their social and cultural significance and consequences, as well as the actual use of communication technologies in everyday social, economic, and cultural life. The relational, enclosed, and mean world characteristics can be understood as manifestations of the co-determining triad of practices, tools, and social arrangements, which comprise the core elements of new media infrastructure (Lievrouw & Livingstone 2006). Together, the three aspects of infrastructure are mutually implicated in the forms and quality of mediated communication in society. They also set the terms for new media use: who gets to use communication technologies, for what purposes, under what conditions, and, crucially, who decides. Taking the three features as a point of departure, the discussion moves on to an examination of several developments that have emerged in this relational, tightly bounded, and risk-averse new media context: the need for new repertoires of communication competencies and logics, or 'network literacies'; the proliferation of obsolete, incompatible, inaccessible, and unreadable technology formats and systems, or 'dead media'; and the growth of amateur- and peer-produced 'commons knowledge'. Developments such as these, and their articulations and dynamics, represent the kind of research problems and issues that may lie ahead for new media studies over the next decade.


The last decade in Internet time: three features of the new media landscape
Over the last 10 years, the Internet, mobile telephony, and related communication technologies have undergone many important changes, both in terms of the platforms themselves and in terms of how people use and understand
them. In many respects, it has been a period of normalization, domestication (Haddon 2006; Silverstone 2006), and even banalization (Lievrouw 2004) as new media technologies have become more routine, taken for granted, and integrated into everyday life. Nonetheless, three main features, corresponding to the practices, tools, and social arrangements or formations of new media infrastructure, have emerged and interacted over time to produce a distinctive climate or set of social conditions for contemporary mediated communication.

Practices: the relational Internet


Perhaps the most widely discussed change shaping perceptions of and discourse about new media over the last decade has been the more personalized, relational nature of mediated communication online, associated with a growing sense of sociality and embeddedness within social and technical networks (what Rheingold (2010) has called 'net awareness'). The term 'Web 2.0' has become shorthand for the clear dividing lines between today's social Internet and the presumably more static, documentary, and less interactive 'Web 1.0' that preceded it. However, the distinction is not that simple or merely a consequence of the introduction of social network platforms such as MySpace, Friendster, Facebook, and Google+. Interpersonal interaction and small group/organizational communication processes have been a major part of networked computing from the earliest days of the ARPANET. Email was the Internet's first, unqualified (and unexpected) 'killer app' (Newell & Sproull 1982). By the late 1970s, complex computer-supported conferencing and group decision-support systems were already in place in many large organizations, and a growing research literature was already examining interpersonal and small group communication processes online (Short et al. 1976; Johansen et al. 1979; Hiltz & Turoff 1993 [1978]). Usenet groups and multi-user dungeons for gamers flourished in the 1980s, before graphical user interfaces were generally available. These systems and studies provided the foundation for today's research field of computer-mediated communication. Nonetheless, the pervasive use of social networks and related authoring systems such as blogs and wikis has undoubtedly affected the ways that people think about and use communication technologies in their daily lives. In particular, the newer systems have accelerated or reinforced the sense that relationships and interactions are central to the experience of using and engaging with media. Since the early 2000s, countless studies have examined changing notions of friendship, intimacy, identity and self-representation, trust, personal disclosure, bullying and abuse, and a host of other aspects of interpersonal communication in the online context. Indeed, while ICTs and the Internet are now widely credited in popular culture as sites for greater sociality and participation across traditional social, geographic, and cultural boundaries, they are also blamed for an unprecedented range of personal risks and harms arising from online interaction.

This sense of the greater opportunities and benefits of online interaction, paired with a growing sense of risk associated with new forms and venues for interpersonal communication, underlies a different, and insightful, interpretation of the Internet versions 1.0 and 2.0 suggested by Graham Meikle in 2002. Although he proposed this characterization well before the age of Friendster and Myspace, Meikle argued that, perhaps paradoxically, the openness, diversity of viewpoints, and ease of participation that were afforded even then by new media technologies, including new horizons for interpersonal interaction, were likely to prompt a backlash, pressures to impose restrictions on user activities, and the restoration of a more stable, safe, culturally familiar, and market-oriented communication and media landscape. As he observed:

Version 2.0 is, for me, the lesser option . . . . Version 1.0 offers change; Version 2.0 offers more of the same. Version 1.0 demands openness, possibility, debate; Version 2.0 offers one-way information flows and a single option presented as choice . . . . Version 1.0 would open things up. Version 2.0 would nail them down. (Meikle 2002, pp. 12-13)

The relational quality of online communication has had other important effects as well. Search and selectivity (of content, and among personal contacts and interactions) are now thoroughly integrated into mediated communication and have transformed the production and circulation of traditional mass media-style content. Search engines such as Google and Bing are the first stop for users looking for any type of information online, from political debates to recipes to medical advice, and searchers routinely share results with others they know who have similar interests or needs. At the same time, searchers run the risk of encountering unexpected, unfamiliar, offensive, or harmful information that may be at odds with their existing perceptions, tastes, and beliefs. Ironically, perhaps, the new opportunities for communication and information-sharing have contributed to a parallel sense of risk, instability, and danger among many users.

Tools: the enclosed Internet


There is little question that new media technologies have become more accessible and adaptable across geographic space, diverse populations, and cultural settings, especially since the introduction of browser and search technologies in the early 1990s and the build-out of mobile networks in the 2000s. In part, this has been due to the distribution of relatively inexpensive and powerful mobile devices, applications, extensive digital networks, and investment in computational capacity and data storage on an unprecedented scale. At the same time, favorable economic policies and regulatory schemes have promoted investment in wireless communication networks and services and a shift away from the older, more
expensive, and more regulated installed base of wire-line telephones and broadcasting. At the same time, a strong do-it-yourself user culture of remixing and hacking has flourished, where off-the-shelf products, programs, and cultural works are all commonly tinkered with, sampled, and reconfigured to suit local or specialized needs, interests, and tastes. The consequences are now familiar. Established industries and markets have been disrupted and new generations of incumbent firms have emerged, especially in the areas of media content and entertainment. Sophisticated devices and services have been appropriated and domesticated (some might say have intruded) into everyday routines of home, family, leisure, work, and culture. Personalized communication technologies and information resources are ever more integrated into political processes and economic activity, leading to new forms of mobilization and collective action, but also to the perpetuation and reinforcement of enduring social, economic, and political divides, inequities, and divisions. Each adaptation seems to spark new rounds of admiration and optimism, on the one hand (e.g., the mobilization against authoritarian regimes in the Middle East, or the so-called 'Arab spring', and the uncanny sense of immediacy and cohesion among family and friends on Facebook), or moral panic, on the other hand (the so-called 'Blackberry riots' in the UK and the 'creepy' sense of personal overexposure and lack of privacy on Facebook). It might seem that the users of new media have never had so much choice and flexibility in the range of available channels and resources. However, critics charge that the choices are more illusory than real. Media, hardware and software firms, and their allies in government and law enforcement have responded to DIY (do it yourself) and remix culture with a variety of tactics. Proprietary apps, digital rights management, and anti-circumvention technologies not only prevent unsanctioned uses and access but actually conceal such choices from users (Cohen 2003). Incomprehensible licensing agreements [1] and pay walls restrict information access and circulation in seemingly arbitrary and illogical ways. Particularly in the United States, 'walled gardens' of incompatible technical standards and platforms, and interminable contracts with prohibitive penalties for service termination, punish 'churn' (users switching from one service to another) and lock in reliable revenue streams (Zittrain 2008). Deliberate attempts to skirt intellectual property claims as well as the most innocuous personal, fair-use, or non-commercial uses of copyrighted material are met with swift, severe (and, some charge, out-of-proportion) lawsuits, blackballing, and threats of criminal prosecution. Besieged authorities and law enforcement seek 'kill switches' to shut down Internet access for political activists or unauthorized file sharers (Morozov 2011). To instill respect for intellectual property values, primary school children are drilled on the dangers of piracy and reckless information trafficking using curricula and lessons helpfully provided to school districts at no charge by entertainment industry groups (Gillespie 2009).

To the extent that they have affected popular perceptions of appropriate or safe uses of new media technologies, or have encouraged self-censorship, such efforts have advanced private-sector and law enforcement aims of defining what counts as legitimate (paid, observable) versus illegitimate (unpaid, unobservable) communication. Yet hackability, adaptations, and workarounds persist in the face of every effort to lock down, curtail, or prosecute. Critic Peter Lunenfeld (2011) describes the current cultural arena as a 'secret war' between downloading and uploading, in which media industries (the forces of download) struggle to maintain control over content and distribution in a technological and cultural terrain populated by users (the forces of upload) intent on stitching together new works out of anything they can find and repurpose, and sharing them with the larger online world, despite (or perhaps to spite) the technological and legal barriers.

Social arrangements: the mean world Internet


In the 1980s, communication scholars George Gerbner, Nancy Signorielli, and their colleagues developed cultivation theory, including a concept they called the 'mean world syndrome'. They contended that people most heavily exposed to mass media depictions of violence tend to believe that crime and violence are much more prevalent in society than they actually are and tend to be more fearful or mistrustful of others than real social conditions warrant (Gerbner et al. 1986; Signorielli 1990). The third, and possibly most significant, influence on perceptions and uses of the Internet and new media over the last decade resembles the mean world syndrome, scaled up. Communication and information networks, and particularly the Internet, have been reframed as sites of struggle and danger in geopolitical, economic, and military conflicts. The attacks of September 11, 2001; the subsequent United States-led 'war on terror' and insurgencies and assaults against Western states and interests; the rise of grass-roots democracy/independence movements and political oppositions in many developing regions of the world; and fears about criminal networks coordinating and conducting their activities online have dominated media coverage of new media and transfixed the popular imagination. These events are now routinely invoked by authorities to justify the expansion of private and state surveillance/security apparatuses with global scale and personal, pinpoint reach. In commerce and politics, the capture, collection, and assessment of information about individuals and their activities have become the raison d'être for social network sites, search engines, online publishing and media, and workplace/employee monitoring systems. Data gathering and data classification have become an integral (some say indispensable) part of warfare, policing, education, health care, finance, travel, and virtually every other aspect of contemporary life.

New institutional forms, legal regimes, and political discourses to support and justify these apparatuses have arisen, based on the assumption that the Internet is a pivotal site of conflict, vulnerability, deception, and risk to individuals and the established order. Extremist or oppositional websites of all stripes are monitored, sabotaged, or shut down outright, depending on local legal codes. For example, provisions of the first and second US Patriot Acts allowed government authorities unprecedented access to individuals' electronic communications without search warrants, including telephone calls, email traffic, financial records, and even library patrons' borrowing records (with the added proviso that the organizations surrendering the information were prohibited from notifying their clients either that they had been approached by law enforcement or that the information was surrendered). On an even larger scale, after the September 11 events, a congeries of US federal law enforcement, security, and relief agencies, missions, and jurisdictions was quickly reorganized into the single, outsize, omnibus US Department of Homeland Security (DHS). Originally justified by the George W. Bush administration by the need to streamline the aggregation and sharing of different agencies' vast stores of intelligence on citizens and foreign visitors, the agency's current scope and the positioning of the Internet as a central point of vulnerability to national security are suggested by a recent White House statement on the 2012 budget request for the DHS. The agency is described as 'the principal Federal agency charged with the vital missions of preventing terrorism and enhancing security, securing and managing America's borders, enforcing and administering immigration laws, safeguarding and securing cyberspace, and ensuring resilience to disasters' (http://www.whitehouse.gov/omb/factsheet_department_homeland/). The institutional and legal changes are not restricted to the United States, however. A recent European Union directive on cyber crime strengthens and expands the authority of the European Network and Information Security Agency, metes out stiffer criminal sanctions for perpetrators of cyber attacks and the producers of related and malicious software, and requires that member states establish their own dedicated cyber-crime agencies and respond within hours to requests from other members for investigation of alleged violations originating within their borders (http://europa.eu/rapid/pressReleasesAction.do?reference=IP/10/1239). Many of the directive's provisions have been characterized as unclear or debatable: for example, by prohibiting the interception of any data deemed confidential, it would outlaw the activities of whistleblowers as well as organizations such as Wikileaks. The directive also makes participation in denial-of-service actions of any sort illegal, but in so doing could make unwitting owners of computers captured by botnets liable to prosecution. Despite these questions, and some objections by conservatives in the coalition government, the UK's Home Office signed on to the directive in February 2011 (http://www.eweekeurope.co.uk/news/home-office-adopts-
flawed-eu-cyber-crime-directive-20045; http://www.bbc.co.uk/news/ukpolitics-12354931). Authoritarian regimes are often criticized by more democratic states for restricting information-seeking, personal expression or interaction online that might criticize their ruling parties, policies, or leadership. Activists and opponents of repressive governments are widely praised by Americans and Europeans for their clever uses of social networks and messaging and microblogging services to work around state surveillance and mobilize protests. However, British politicians recently found themselves on the receiving end of similar, and none-too-subtle, criticism by Iran, China, Zimbabwe, Libya, and other authoritarian states that deplored the sickness and permissiveness of British society and the hypocrisy of calls by British politicians and the general public for severe crackdowns on uses of mobile technologies and the Internet by their own citizens involved in the riots there (http://www.21cb.net/londonriots-china-response/; http://www.theatlanticwire.com/global/2011/08/iran-libya-and-china-uk-riots-are-time-taunt/41062/). In response to the urban unrest, British leaders suggested severe measures, including banning rioters from access to communications services, shutting down social network sites in times of crisis, and even evicting families of participants in the disturbances from public housing and eliminating their benefits. However, in subsequent talks with industry representatives from Research in Motion (makers of the Blackberry Messenger service), Facebook, and Twitter, the firms assured the government that they would cooperate with law enforcement efforts to use their systems to track and identify offenders. The tech news website ZDNet quoted Facebook's statement that this was 'a dialogue about working together to keep people safe' rather than about imposing new restrictions on Internet services, and RIM's declaration that 'It was a positive and productive meeting and we were pleased to consult on the use of social media to engage and communicate during times of emergency' (http://www.zdnet.co.uk/blogs/from-both-sides-10005031/government-climbs-down-on-social-networkblocking-10024206/?s_cid=452). Subsequently, the coalition government backed away from its proposals to implement 'kill switches' that would allow it to shut down social network services, although as a writer for the news site TechEye.net ironically observed, shutting down social networks is a bit like prosecuting the postman (http://news.techeye.net/Internet/home-officeconcludes-banning-facebook-was-barmy).

Practices, tools, and arrangements: articulations


Considered together, the three developments outlined above (the personal-yet-exposed quality of online communication, walled or enclosed technologies and standards, and the proliferation of institutions and policies designed to monitor,
control, and stabilize an inherently risky and dangerous online world) point to a future for mediated communication that is more reliable, stable, predictable, and safe, but potentially less innovative, creative, and open. Each development corresponds to a different aspect of new media infrastructure: practices, tools, and social arrangements, respectively. It is useful to examine the articulations among the three aspects and to consider how they have shaped and even reinforced one another. In terms of tools, the enclosed Internet has evolved, in part, in response to both users' demands and institutional shifts toward greater oversight and control of online activities. Users want applications and devices that are easy to use, reliable (less prone to breakdown), secure (resistant to unwanted intrusion or hijacking), and safe (able to block or flag people, resources, and activities that users find undesirable, offensive, or threatening). Institutional authorities demand that systems comply with or even automatically enforce an expanding range of legal and commercial demands, including privacy laws, intellectual property claims, national security and law enforcement directives, competitive rivalries among firms and trade blocs, and cultural and ethical norms. By the same token, the devices and systems that are available in a given time and place also shape users' expectations about what the tools can do and what they are for, as well as what people actually do with them. The relational Internet, as pointed out previously, has become a venue for interpersonal interaction and personal expression, not only straightforward information-seeking or the consumption and appropriation of media products. Facebook members may have qualms about how the site gathers, aggregates, and shares data about them, but many readily offer real-time information about their whereabouts, contacts, interests, and social preferences to Facebook and location-based services such as foursquare or Twitter. They may use loyalty programs and mobile phone apps that collect information about their shopping habits and product consumption in exchange for discounted merchandise, or surrender personal information ranging from financial and travel records to religious beliefs and biometric scans to security firms or law enforcement agencies that assure concerned users that the collection of such detailed individual information is a necessary and appropriate means to prevent and prosecute terrorism, criminality, fraud, and so on. In terms of social arrangements, organizations and institutions also respond and adapt to the available tools and devices and to people's communication practices and norms. Platforms and products designed to be incompatible with those of competitors and digital rights management technologies that restrict users' access to and uses of media content have helped firms such as Microsoft, Apple, and Amazon dominate their respective industries and markets. The purview, size, responsibilities, and political power of state security and law enforcement agencies have vastly expanded in parallel with the availability of sophisticated systems for surveillance, data capture, storage, and analysis, as
well as sharply increased citizen demands for public safety and protection from risk. In summary, the articulations among tools, practices, and social arrangements are dynamic: each builds on and reinforces the others, and a shift in one aspect can provoke corresponding shifts across the other two. This may or may not be a welcome prospect. From the progressive-left or libertarian perspective, for example, preoccupations with risk, safety, control, reliability, and security, and consequent moves towards ever more monitored, standardized, filtered, regulated, and exclusionary systems, actions, and patterns of organization, may look like a vicious circle or downward feedback loop. Technologies, people, and institutions are being driven toward a dystopian scenario where all forms of expression and relationships are open to inspection by commercial interests, the state, or even other individuals, on demand, anytime, anywhere. The situation might also be likened to a kind of digital, global-scale 'spiral of silence', Noelle-Neumann's (1984) theory of public opinion formation and mass communication, which hypothesizes that interpersonal communication and mass media echo and reinforce popular opinions, while marginalizing, sanctioning, and silencing unpopular or disruptive ideas. The spiral of silence thus reinforces political stability and the status quo. However, it is not necessarily the case that the articulation between tools, practices, and social arrangements is always a matter of feed-forward or path dependence with largely determined outcomes. The articulations among the three elements of infrastructure also create opportunities for pushback, gaps, or spaces of action towards alternative outcomes. The three developments discussed in the next section (again corresponding to the practices, tools, and social arrangements of new media infrastructure) suggest not only that contemporary communication technologies remain open to new or unexpected uses and forms, but also that new media scholars should develop equally innovative approaches and perspectives to understand events as they unfold in the next decade in Internet time.

New developments, new ways ahead


If new media and mediated communication have become more relational, enclosed, and risk-averse, what problem areas or issues may lie ahead for new media research and scholarship? Three are suggested here. Network literacies and pedagogies are prerequisites for effective social, economic, and political participation. Dead media pose an ever greater challenge to the sustainability of cultural heritage as well as to regimes of total surveillance and capture of personal information. Commons knowledge becomes a larger and more influential part of culture, challenging expertise, knowledge authorities, and traditional institutional power.

Practices: network literacies


It is one thing to say that communication online has become more relational, socialized, and expressive. In practical terms, however, this requires that individuals master an emergent, articulated repertoire of communicative competencies that mixes interpersonal and group process fluency; an aptitude for organizing, pattern recognition, and making linkages and correspondences; and an orientation toward tinkering, design, and the crafting of messages that is more typically associated with traditional media production, engineering/programming, and the arts. As an ensemble, we might think of this repertoire of competencies as 'network literacy', where communication networks are conceived as inextricably social and technological. Those who are network literate are as comfortable with divergent cultural ideas and expressions as they are with the channels and methods for generating and sharing them. As with other forms of literacies, proficiency in the network context does not necessarily, or entirely, come naturally; it must be taught and learned. Thus, some of the most compelling questions for the next decade in Internet time may ask what pedagogies must be implemented for teaching and learning network literacy and how to do so. The idea of network literacy (or literacies, more accurately) and related concepts has attracted intense interest recently among researchers and educators. For example, Jenkins et al. (2009) at the University of Southern California's New Media Literacies Project, supported by the MacArthur Foundation's Digital Media and Learning initiative, argued that new media literacies comprise digital literacies plus media literacies. They suggested that new media literacies are stymied less by simple access to technologies than by a 'participation gap' (lack of access to learning opportunities, experiences, skills, and knowledge), a 'transparency problem' (the ability to recognize how media shape people's perceptions of the world), and an 'ethics challenge' (the breakdown of traditional professional training and norms). They included 11 core skills as new media literacies: play, performance, simulation, appropriation, multitasking, distributed cognition, collective intelligence, judgment, transmedia navigation, networking, and negotiation. Similarly, the Learning Through Digital Media project at the New School in New York City has collected essays, demonstrations, teaching tools, and content materials and made them available in both print and online forms. In his introduction to the project, editor and project leader Trebor Scholz noted that the most burning problem for digital learning is technological obsolescence and the attendant need to learn and readapt to new technological milieus and cycles of transformation, but suggested that technological facility is dependent on an even more important set of learned attitudes: 'Openness, flexibility, playfulness, persistence, and the ability to work well with others on-the-fly are at the heart of an attitude that allows learners to cope with the unrelenting velocity of technological change in
the twenty-first century' (http://www.learningthroughdigitalmedia.net/introduction-learning-through-digital-media#more-362/). Although these are two of the more widely known efforts currently, similar ideas underlie a number of related concepts, and not just for young learners. For example, 'social intelligence' has been proposed as the ability to seek and evaluate information in complex social and technological webs (Cronin & Davenport 1993). Long-time Internet observer and pundit Rheingold (2010) has advocated a scheme for network literacy that encompasses five foundational competencies: attention, participation, cooperation, critical consumption (what he calls 'crap detection'), and net awareness (see also http://howardrheingold.posterous.com/). It might be argued (and Rheingold, Jenkins, and others would surely agree) that virtually all the skills or attitudes they espouse are essential components of critical thinking that should be cultivated regardless of technological or cultural setting. However, a couple of factors set network literacy apart. The first is the crucial and growing need for people to be able to evaluate information sources and content, implied by Jenkins's emphasis on judgment and Rheingold's 'crap detection'. Such skills are especially important as traditional modes of knowledge generation, organization, and gatekeeping are being challenged or eroded by more participatory and inclusive peer-production practices online, which can also be prone to revisionism, spin, incivility and bias, deliberate misinformation, and so on (see the following section on commons knowledge). Going one step further, judgment and evaluation in the network context may not simply be a matter of comparing new or untested information against established standards or truth claims. The ways that information is generated and organized are fundamentally ontological, and network-literate individuals should be ready to question that ontology as well as their own epistemic values about how useful or valid information and knowledge are gained and understood. The second distinctive aspect of network literacy is suggested by Jenkins et al.'s 'transparency problem' and Rheingold's notion of 'net awareness'. Both concepts imply that the network-literate person must understand how he or she is situated vis-à-vis others and the larger social and technological world beyond the relations and circumstances at hand. For Rheingold (2010), net awareness includes not just a sense of the architecture of communication technology and how it enables some kinds of action and information flows and constrains others; it also requires a sense of interpersonal and group relatedness, linkages, and power, online and off. And unlike many other observers calling for a new network perspective in teaching and learning, Rheingold insists that the core principles and techniques of social network analysis (power laws and the long-tail phenomenon, network externalities, centrality/prominence, cliques and subgroup structures, weak ties, structural equivalence, and so on) must be the core of teaching and learning about networks; a brief illustration of these measures appears at the end of this section. Similarly, Lievrouw and Nguyen (2007) have proposed a related conceptual framework, the 'network imaginary', defined as the ability to imagine and visualize networks of social and
technical relations and links, including the extension, possible breakdown, and consequences of these relations. People can anticipate the effects of their actions on these relations and consequences, within their immediate surroundings and beyond, in other systems, events, people, and places in the larger world. The network imaginary shapes people's perceptions of the range of action open to them, in material/physical and virtual/mediated places alike. Cultivating this sense of situatedness and options for action, as well as the skills for mapping and analyzing networks, might be expected to be a fundamental part of any network or new media pedagogy. Certainly, a few other competencies might be needed to round out any comprehensive account of network literacy. For example, navigation and search (interpersonal, informational, and political) have not usually been counted as major aspects of communicative competence; however, few would argue that these are minor or secondary skills given today's technology and culture. Visualization, the ability to conceptualize and render/depict complex or abstract concepts in creative ways in a variety of formats and media, is especially valuable as communication technologies become less and less tied to text (Vesna's (2007) concept of 'database aesthetics', for example, is a particularly powerful way to think about making the invisible visible). Other core competencies might include hacking, remixing, and repurposing, reverse engineering, and invention, for example. But over the next decade, it will be necessary to move beyond enduring assumptions and didactic habits in order to see communication itself as a manifestation of continuously reorganizing networks of action, relations, dependencies, and roles and to teach for 'remediation and reconfiguration' (Lievrouw 2009).
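As a concrete, if simplified, illustration of the analytic vocabulary invoked above, the short sketch below computes a few of the social network analysis measures Rheingold treats as central to network literacy: centrality, cliques and subgroups, and bridging ties. It is not drawn from the article; the toy contact graph and the use of the open-source Python library networkx are assumptions made for illustration only.

```python
# A minimal sketch, assuming the networkx library and an invented toy contact
# graph; it illustrates centrality, cliques/subgroups, and bridging ties.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("ana", "ben"), ("ana", "cai"), ("ben", "cai"),  # a tightly knit cluster
    ("cai", "dee"),                                  # a single bridging tie
    ("dee", "eli"), ("dee", "fay"), ("eli", "fay"),  # a second cluster
])

# Centrality/prominence: who occupies a structurally important position?
degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)

# Cliques and subgroup structures.
cliques = list(nx.find_cliques(G))

# Bridges approximate the 'weak ties' that hold separate clusters together.
bridges = list(nx.bridges(G))

print("Degree centrality:", degree)
print("Betweenness centrality:", betweenness)
print("Maximal cliques:", cliques)
print("Bridging ties:", bridges)
```

Even on a toy graph, measures like these make the point that who links otherwise separate clusters is a question of structure rather than of content, which is close to the sense of relatedness and power that 'net awareness' is meant to capture.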

Tools: dead media


As the preceding discussion suggests, much of the interest in new media technologies over the last decade has stemmed from their seemingly limitless capacity for information capture, storage, and analysis. It has become commonplace to assume that everything (all aspects of life and culture, online and offline) can and will be recorded and kept using cheap, high-precision digital recording technologies and storage media, in what has been called 'perfect remembering' (Mayer-Schonberger 2009). In both popular culture and research accounts, stories abound about thoughtless email messages, compromising photos, or intemperate videos that, once posted online, can never be completely tracked down and deleted [2]. Unwanted files seem to become deathless, lurking indefinitely in remote corners of cyberspace, ready to be disinterred and circulated inappropriately when the author least expects or needs it, with dire consequences for his or her reputation, professional status, or personal life. Critics contend that the personal and social risks of such total capture and recall require new technologies and policies to facilitate the deliberate, selective,
and complete expunging of sensitive, false, misleading, or personally risky information from the world's databases. There is little question that individuals should be able to exercise much more control over their personal information online than is generally permitted today. There are obvious risks associated with the pervasive capture of personal information, and technological and legal safeguards against such ubiquitous data-gathering and third-party profiling are desirable and should be pursued. In the United States, for example, organizations that collect information about customers, clients, patrons, or patients insist that such data become their property and that the individuals whose data have been collected have no right to retrieve, modify, or delete them. Privacy policies on Facebook have evolved over time as a result of similar tensions: the efforts of the site's owners to appropriate users' postings and network links as proprietary information to be sold to advertisers have been met with repeated waves of resistance from regulators and users who insist that they should have greater control over who has access to their information and how it can be used. Viviane Reding, the European justice commissioner, has recently advocated revisions to the EU Data Protection Directive that would increase individuals' control over personal information online, including a 'right to be forgotten' (O'Brien 2011). However, the idea of total, loss-free digital capture of all knowledge and information, or perfect remembering, should be viewed skeptically. In the first place, the total capture and recall of a society's (or even an individual's) works and activities has never been, and is unlikely ever to be, possible. All cultures forget; digital culture is no exception. Historically, the overwhelming majority of human knowledge has been lost, destroyed, sabotaged, pulled out of context, excluded from the record, suppressed, or never recorded at all. Conclusions are inevitably drawn on the basis of incomplete, contradictory, and divergent information. There is little about culture today to suggest that these processes have changed in any fundamental way as a consequence of digital communication technologies. The fact of cultural forgetting, combined with rapidly accelerating cycles of technological obsolescence and turnover, is the basis of what might be called 'dead media', possibly the greatest barrier to the dream (or nightmare) of perfect remembering. The basic tools of the Internet (digital recording and transmission technologies, formats, and storage systems) are notably short-lived and incompatible across platforms and standards, especially in comparison to physical and analog formats. Digital files and databases are notoriously fugitive and difficult to preserve in usable form for any extended period of time; they are among the most profoundly fragmented, disorganized, incompatible, and ephemeral forms of record-keeping ever devised. Formats, devices, and architectures become obsolete and are abandoned in favor of the next new design with little or no consideration for retaining the records or functions of the old systems. (As Rand researcher and preservationist Rothenberg [1999 (1995), p. 2] has
observed, 'old bit streams never die; they just become unreadable'.) Robust, universal methods for the permanent preservation of digital records do not yet exist. While several interim strategies for maintaining digital records have been proposed, digital preservation is a largely unsolved technical problem, with no good prospects on the horizon. We might consider the implications of dead media for cultural legacy, authenticity, and memory. What, if anything, constitutes a perfect, original, or reliable record, and is such a record even possible? What does it signify or provide evidence of? Why are records kept in the first place? Whose stories are recorded or destroyed (or not), and who decides? Even if total capture of the cultural record were technically possible, would it be desirable? These questions, and the enduring cultural fact of incomplete and forgotten information, are usually framed in terms of loss and error. Paradoxically, we might agree with Mayer-Schonberger's basic insight: the social and ethical benefits of forgetting tend to be neglected in cultures that place the greatest faith in the ability of digital technologies to capture and keep an absolutely faithful and complete record of the past. However, the real dangers may lie less in ostensibly lossless webs of personal dossiers than in the histories written and judgments made on the basis of disconnected scraps of data that rarely survive more than a few years.
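As a purely illustrative aside, not part of the article's argument, the sketch below shows the flavor of one modest interim preservation tactic of the kind alluded to above: recording fixity checksums and rough format metadata for a collection, so that later losses, corruption, or unrecognized formats can at least be detected. The directory name and manifest layout are invented assumptions, and real preservation systems do considerably more.

```python
# A minimal fixity-manifest sketch; the 'archive' folder and the manifest
# format are hypothetical examples.
import hashlib
import json
import mimetypes
from pathlib import Path

def build_manifest(directory: str) -> list:
    """Record name, size, guessed format, and SHA-256 digest for each file."""
    records = []
    for path in sorted(Path(directory).rglob("*")):
        if not path.is_file():
            continue
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        mime, _ = mimetypes.guess_type(path.name)
        records.append({
            "file": str(path),
            "bytes": path.stat().st_size,
            "format": mime or "unknown",  # unrecognized formats flag migration risk
            "sha256": digest,
        })
    return records

if __name__ == "__main__":
    manifest = build_manifest("archive")
    Path("manifest.json").write_text(json.dumps(manifest, indent=2))
```

Checksums of this kind can detect decay or tampering; they do nothing, of course, about the deeper problem the section describes, namely that the formats themselves stop being readable.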

Social arrangements: commons knowledge


The collective creation and maintenance of vast collections of information by communities of people with shared interests, what Benkler (2007) has termed 'commons-based peer production', have become important features of online communication, from participatory journalism (Deuze et al. 2007), to popular culture (Shirky 2008), to political activism and whistleblowing (e.g., Wikileaks; Sifry 2011), to academic scholarship and science (Lievrouw 2010). These have opened unprecedented opportunities for powerful new modes of knowledge production and collaboration, but have also drawn charges of amateurism, incompetence, deliberate falsification, misattribution and misappropriation, and more. The third development with important implications for the direction of new media studies is a consequence of this increasingly collaborative, collective, inclusive, and relatively gatekeeper-free arena for communication online. Commons knowledge projects do not mobilize only the efforts of hundreds or even thousands of people, who make small, granular contributions to very large enterprises that might otherwise be too complex or expensive to undertake; in addition, the resources that are created often challenge or compete with more established, authoritative, expert-driven, and institutionally supported modes of generating, compiling, and circulating knowledge.

Commons knowledge projects have several important characteristics that distinguish them as forms of collaboration and knowledge-sharing. The first is their 'Alexandrian' ambitions, in the sense of the lost library of Alexandria, which was built to hold all the knowledge of the ancient world. As the word suggests, this ideal did not originate with digital technologies; similar visions of universal, total knowledge collection and organization have motivated thinkers and scholars since Diderot and d'Alembert's Encyclopédie in the eighteenth century. Before World War I, for example, Paul Otlet developed the Universal Decimal Classification and founded the Palais Mondial (later, the Mundaneum) in Belgium, where he and other documentalists envisioned that all the world's documents (including models, images, artworks, plans and diagrams, biological specimens, and so on, as well as texts) might be collected and organized according to a universal bibliographic catalog that would guide users through networks of links among related resources (see Rayward 2003, 2008a; Wright 2008). In the 1930s, H.G. Wells (1938) called for the creation of a single, enormous shared encyclopedia, or 'world brain', to overcome the provincialism and disciplinary blinkers of traditional academic scholarship and learning. Vannevar Bush, an administrator at the Massachusetts Institute of Technology who became Director of the US Office of Scientific Research and Development during World War II, believed that scientific and technical progress was being stymied by the growing tide of unrelated and unsynthesized research publications. In an essay in The Atlantic in 1945 entitled 'As We May Think', now a touchstone for historians and cultural studies of the Web, Bush proposed a device called the 'memex' that would allow users to find connections across diverse bodies of scientific knowledge and retrieve documents directly to a microfilm-based workstation (Bush 1945). From this longer perspective, Berners-Lee's (1989) proposal for the hypertext transfer protocol (http), which is widely credited as the cornerstone of the World Wide Web, certainly brings forward many of the same concerns about undigested, unconnected pieces of information articulated by Vannevar Bush, as well as the necessity of creating systems that effectively make 'all the knowledge in the world' (in Otlet's phrase) accessible and navigable. Hugely successful projects such as Wikipedia continue the tradition of modern encyclopedists such as Wells (Rayward 2008b). What distinguishes a Wikipedia from a Mundaneum, however (and this is the second characteristic of commons knowledge), is how information is collected and organized. Where Otlet, Wells, Bush, and others assumed that universal standards for cataloging and classifying huge, comprehensive collections of authoritative knowledge would be necessary to make that knowledge more accessible to anyone who might use it, commons knowledge projects are just as likely to let contributor-editors decide among themselves what topics and resources are significant and how they should be organized. Tagging, bookmarking, commenting/annotation systems, recommendation engines, and similar
tools have enabled the generation of dynamic, bottom-up, folksonomic modes of knowledge organization that shift and evolve along with users' interests, interactions, searches, and retrievals. The neologism 'folksonomy' is a play on 'taxonomy', defined as a formal classification system for organizing items into specific, mutually exclusive categories or taxa. Taxonomies are generated by experts using thesauri of specialized terms to label and organize items in a collection; these standardized labels and categories are called metadata (the two approaches are contrasted in a short sketch at the end of this section). Otlet's Universal Decimal Classification system and the Dewey Decimal Classification system on which it was based are classic examples of expert-driven taxonomies. Projects such as Wikipedia, however, as well as other user-generated resources such as Flickr and Twitter, are organized according to the users' own shifting perceptions of the information they contribute, collect, and comment on. There is no standardized controlled vocabulary that users must consult to decide how to tag or categorize the materials they post or how to link materials to other works. The main advantage of the folksonomic approach is an acute sensitivity to changing ideas and the cultural contexts of users, so that information resources can be highly personalized, adaptable, and open to creative, counter-intuitive, or novel approaches to problems. Folksonomic systems allow users to frame questions and interests in their own language, in ways that may be more faithful to the social and cultural worlds they inhabit. The disadvantages, however, include a tendency to idiosyncrasy: by using their own natural language (as they might do in a Google search, for example) rather than the authoritative specialist terms for certain topics, users may not find important materials that might be relevant to their interests; indeed, they may not even be aware of them. The growing reliance on systems that tailor or customize what users are able to seek and find online, principally as a means to gather and exploit highly targeted marketing information about individuals' tastes and interests, has recently become a focal point for debate among critics who fear that such 'filter bubbles' are segregating people into ever narrower knowledge enclaves with little opportunity to interact across boundaries of culture, demographics, or interest (Pariser 2011). The third defining characteristic of commons knowledge can be seen as what happens when the Alexandrian impulse intersects with folksonomic modes of knowledge generation and organization: a distrust of knowledge authorities and institutions, in favor of more grass-roots or egalitarian participation by experts and amateurs alike. This anti-authoritarian tendency has strong parallels in the early, libertarian hacker culture whose members designed some of the key features of networked computing and telecommunications, in particular, their deliberately open architectures that allowed users to tinker with and modify them according to their particular needs or desires (Nissenbaum 2004; Turner 2006). Similar practices and attitudes have transferred readily into projects involving the generation and circulation of online content (Lunenfeld 2011); a striking recent example is provided by Wikileaks and similar sites, which provide secure
drop boxes where anonymous contributors can submit materials that expose institutional hypocrisy or malfeasance. Some critics have accused Wikipedians and other crowdsourcing advocates of anti-intellectualism and alienating the very experts who might enrich their projects (e.g., Duguid 2006). Others allege that amateur and volunteer participation is fundamentally exploitative, capturing the efforts of highly talented contributors for free or a fraction of what their labor is worth (Terranova 2000; Lovink & Rossiter 2007). Other observers, however, see the move towards more participatory forms of knowledge production and circulation as a positive development (Surowiecki 2004; Benkler 2007; Shirky 2008). The perspectives and passionate commitment of amateur enthusiasts, advocates say, bring new vitality to traditional fields that have erected high professional barriers to entry and thus have become entrenched, stale, and more concerned with reinforcing status distinctions and reward systems than with new ideas and debates. In fact, many established disciplines have a long history of amateur scholarship and scientific discovery (Dyson 2002; Lievrouw 2010). There is little evidence, advocates say, that experts are being deliberately excluded or discouraged from participation (Sanger 2009); what participants object to is not expertise per se, but credentialism and deference to institutions, professional titles and privileges, or qualifications in themselves (Fallis 2008; Tapscott & Williams 2008).
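To make the contrast between expert taxonomies and folksonomies discussed earlier in this section concrete, the short sketch below indexes a handful of invented items by user-applied tags and retrieves them in the users' own vocabulary. The items, tags, and category codes are hypothetical examples, not drawn from the article.

```python
# A minimal sketch contrasting a fixed, expert-defined classification with a
# bottom-up folksonomy built from user-applied tags (all data invented).
from collections import defaultdict

# Expert taxonomy: each item sits in one predefined category with a controlled code.
taxonomy = {
    "599.75": ["lion_photo.jpg"],            # a Dewey-style class for wild cats
    "025.4": ["classification_essay.txt"],   # subject analysis and control
}

# Folksonomy: users attach whatever tags make sense to them; items carry many
# tags, and the vocabulary grows and shifts with use.
tagged_items = [
    ("lion_photo.jpg", {"lion", "bigcat", "safari2011"}),
    ("classification_essay.txt", {"tagging", "metadata", "toread"}),
    ("savanna_clip.mp4", {"lion", "safari2011", "video"}),
]

index = defaultdict(set)                     # tag -> items carrying that tag
for item, tags in tagged_items:
    for tag in tags:
        index[tag].add(item)

# Retrieval in the users' own language: everything tagged 'safari2011'.
print(sorted(index["safari2011"]))           # ['lion_photo.jpg', 'savanna_clip.mp4']
```

The same structure also shows the weakness noted above: a search using a controlled specialist term finds nothing unless some contributor happened to tag with it.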

The next decade in Internet time


In this paper, I have suggested that both the recent history of new media and the future research agenda for new media studies can be framed in terms of three articulated, and mutually shaping, aspects of new media infrastructure: the practices in which people engage to interact and share information and meaning; the tools, devices, and artifacts that people create and use in order to do so; and the social arrangements or institutional forms that develop out of and around these practices and tools. Over the last decade, new media practices have become more relational; tools and systems have become increasingly enclosed and walled; and institutions and authorities have redefined online communication as a 'mean world' that requires new regimes of stabilization and control. Together, these features have created a climate that is widely regarded as the new normal for new media in the early twenty-first century. Given this climate, I have also proposed three possible problem areas or directions for new media research in the near future, again aligning with the practices, tools, and social arrangements of infrastructure. Network literacies and pedagogies that will allow individuals to be full and effective participants in society, economy, culture, and politics must be developed and implemented. The proliferation of dead media may provoke even greater efforts to develop systems for total surveillance and information capture, on the one hand, or

634

INFORMATION, COMMUNICATION & SOCIETY

Downloaded by [188.25.224.109] at 13:23 12 December 2012

offer a welcome haven from pervasive observation and recording, enabling a right to be forgotten, on the other hand. Commons knowledge projects may challenge and even reconfigure the foundations of institutional authority, expertise, and legitimacy. However, whether these particular problems and questions, or others, emerge as important streams of new media studies in the future or not, it seems certain that the linkages among practices, tools, and social arrangements will continue to be in flux and subject to persistent tensions and interplay. How people communicate, with whom, what devices they use, and how they organize their communicative relationships and systems will continue to be elements in a dynamic, interdependent, and emergent process. The task for new media studies in the next decade in Internet time will be to bring these elements together in coherent, innovative accounts of the ways that communication technology and society constitute one another.

Notes
1 Fine parodies of licensing agreements include those by the collective Illegal Art (http://www.illegal-art.org/contract.html), 'If Book Publishers Used License Agreements' by Washington, DC publisher Bill Adler (http://www.adlerbooks.com/booklicense.html), and 'Japanese soldier found hiding in software license agreement' on the satirical website NewsBiscuit (http://www.newsbiscuit.com/2007/06/19/japanese-soldier-discovered-in-software-license-agreement/).
2 See, for example, Mayer-Schonberger discussing his book Delete: The Virtue of Forgetting in the Digital Age (2009) at Harvard University's Berkman Center for Internet & Society, http://www.youtube.com/watch?v=XwxVA0UMwLY.

References
Benkler, Y. (2007) The Wealth of Networks: How Social Production Transforms Markets and Freedom, Yale University Press, New Haven, CT and London.
Berners-Lee, T. (1989) 'Information management: a proposal', [Online] Available at: http://www.w3.org/History/1989/proposal.html (15 November 2011).
Bush, V. (1945) 'As we may think', The Atlantic, July, pp. 101–108, [Online] Available at: http://www.theatlantic.com/doc/194507/bush (30 March 2012).
Cohen, J. E. (2003) 'DRM and privacy', Berkeley Technology Law Journal, vol. 18, pp. 575–617.
Cronin, B. & Davenport, E. (1993) 'Social intelligence', Annual Review of Information Science & Technology, vol. 28, pp. 3–44.
Deuze, M., Bruns, A. & Neuberger, C. (2007) 'Preparing for an age of participatory news', Journalism Practice, vol. 1, no. 3, pp. 322–338.
Duguid, P. (2006) 'Limits of self-organization: peer production and laws of quality', First Monday, vol. 11, no. 10, [Online] Available at: http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php (30 March 2012).
Dyson, F. (2002) 'In praise of amateurs', New York Review of Books, vol. 49, no. 19, pp. 4–8.
Fallis, D. (2008) 'Toward an epistemology of Wikipedia', Journal of the American Society for Information Science and Technology, vol. 59, no. 10, pp. 1662–1674.
Gerbner, G., Gross, L., Morgan, M. & Signorielli, N. (1986) 'Living with television', in Perspectives on Media Effects, eds J. Bryant & D. Zillmann, Lawrence Erlbaum, Hillsdale, NJ, pp. 17–40.
Gillespie, T. (2009) 'Characterizing copyright in the classroom: the cultural work of antipiracy campaigns', Communication, Culture & Critique, vol. 2, pp. 274–318.
Haddon, L. (2006) 'The contribution of domestication research to in-home computing and media consumption', The Information Society, vol. 22, no. 4, pp. 195–204.
Hiltz, S. R. & Turoff, M. (1993 [1978]) The Network Nation: Human Communication Via Computer, rev. edn, MIT Press, Cambridge, MA.
Jenkins, H., with Clinton, K., Purushotma, R., Robison, A. J. & Weigel, M. (2009) Confronting the Challenges of Participatory Culture: Media Education for the 21st Century, The John D. and Catherine T. MacArthur Foundation Reports on Digital Media and Learning, MIT Press, Cambridge, MA, [Online] Available at: http://dmlcentral.net/resources/3756 (30 March 2012).
Johansen, R., Vallee, J. & Spangler, K. (1979) Electronic Meetings: Technical Alternatives and Social Choices, Addison-Wesley, Reading, MA.
Lievrouw, L. A. (2004) 'What's changed about new media? Introduction to the fifth anniversary issue of New Media & Society', New Media & Society, vol. 6, no. 1, pp. 9–15.
Lievrouw, L. A. (2009) 'The uses of disenchantment in new media pedagogy: teaching for remediation and reconfiguration', in Media/Cultural Studies: Critical Approaches, eds R. Hammer & D. Kellner, Peter Lang, New York, pp. 560–575.
Lievrouw, L. A. (2010) 'Social media and the production of knowledge: a return to little science?', Social Epistemology, vol. 24, no. 3, July–September, pp. 219–237.
Lievrouw, L. A. & Livingstone, S. (eds) (2006) The Handbook of New Media, updated student edn, Sage, London.
Lievrouw, L. A. & Nguyen, L. U. (2007) 'Linking and the network imaginary', paper presented at the New Network Theory conference, organized by the Amsterdam School for Cultural Analysis, the Institute of Network Cultures (Amsterdam Polytechnic, Hogeschool van Amsterdam), and the Media Studies program at the University of Amsterdam, 28–30 June (draft available from the author).
Lovink, G. & Rossiter, N. (eds) (2007) MyCreativity Reader: A Critique of Creative Industries, Institute of Network Cultures, Amsterdam, [Online] Available at: http://networkcultures.org/wpmu/portal/archive/ (15 November 2011).
Lunenfeld, P. (2011) The Secret War Between Downloading and Uploading: Tales of the Computer as Culture Machine, MIT Press, Cambridge, MA.
Mayer-Schonberger, V. (2009) Delete: The Virtue of Forgetting in the Digital Age, Princeton University Press, Princeton, NJ.
Meikle, G. (2002) Future Active: Media Activism and the Internet, Routledge, New York.
Morozov, E. (2011) The Net Delusion: The Dark Side of Internet Freedom, PublicAffairs Books, New York.
Newell, A. & Sproull, R. F. (1982) 'Computer networks: prospects for scientists', Science, vol. 215, pp. 843–852.
Nissenbaum, H. (2004) 'Hackers and the contested ontology of cyberspace', New Media & Society, vol. 6, no. 2, pp. 195–217.
Noelle-Neumann, E. (1984) The Spiral of Silence: Public Opinion – Our Social Skin, University of Chicago Press, Chicago, IL.
O'Brien, K. J. (2011) 'E.U. to tighten web privacy law, risking trans-Atlantic dispute', New York Times, 9 November, p. B4, [Online] Available at: http://www.nytimes.com/2011/11/10/technology/eu-to-tighten-web-privacylaw-risking-trans-atlantic-dispute.html (15 November 2011).
Pariser, E. (2011) The Filter Bubble: What the Internet is Hiding from You, Penguin, New York and London.
Rayward, B. (2003) 'Knowledge organization and a new world polity: the rise and fall and rise of the ideas of Paul Otlet', Transnational Associations, vol. 1–2, pp. 4–15.
Rayward, B. (ed.) (2008a) European Modernism and the Information Society: Informing the Present, Understanding the Past, Ashgate, Aldershot.
Rayward, B. (2008b) 'The march of the modern and the reconstitution of the world's knowledge apparatus: H. G. Wells, encyclopedism and the World Brain', in European Modernism and the Information Society: Informing the Present, Understanding the Past, ed. B. Rayward, Ashgate, Aldershot, pp. 223–239.
Rheingold, H. (2010) 'Attention, and other 21st-century social media literacies', EDUCAUSE Review, vol. 45, no. 5, September/October, pp. 14–24, [Online] Available at: http://www.educause.edu/EDUCAUSE+Review/EDUCAUSEReviewMagazineVolume45/AttentionandOther21stCenturySo/213922 (30 March 2012).
Rothenberg, J. (1999 [1995]) 'Ensuring the longevity of digital information' (expanded version of J. Rothenberg, 'Ensuring the longevity of digital documents', Scientific American, vol. 272, no. 1, pp. 42–47), [Online] Available at: http://www.clir.org/pubs/archives/ensuring.pdf (15 November 2011).
Sanger, L. M. (2009) 'The fate of expertise after Wikipedia', Episteme, vol. 6, no. 1, pp. 52–73.
Shirky, C. (2008) Here Comes Everybody: The Power of Organizing Without Organizations, Penguin, New York.
Short, J., Williams, E. & Christie, B. (1976) The Social Psychology of Telecommunications, Wiley, London and New York.
Sifry, M. L. (2011) Wikileaks and the Age of Transparency, Counterpoint, Berkeley, CA.
Signorielli, N. (1990) 'Television's mean and dangerous world: a continuation of the cultural indicators perspective', in Cultivation Analysis: New Directions in Media Effects Research, eds N. Signorielli & M. Morgan, Sage, Newbury Park, CA, pp. 85–106.
Silverstone, R. (2006) 'Domesticating domestication: reflections on the life of a concept', in The Domestication of Media and Technology, eds T. Berker, M. Hartmann, Y. Punie & K. J. Ward, Open University Press, Maidenhead, pp. 229–248.
Surowiecki, J. (2004) The Wisdom of Crowds, Doubleday, New York.
Tapscott, D. & Williams, A. D. (2008) Wikinomics: How Mass Collaboration Changes Everything, expanded edn, Portfolio, New York.
Terranova, T. (2000) 'Free labor: producing culture for the digital economy', Social Text, 63, vol. 18, no. 2, pp. 33–58.
Turner, F. (2006) From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism, University of Chicago Press, Chicago, IL.
Vesna, V. (ed.) (2007) Database Aesthetics: Art in the Age of Information Overflow, University of Minnesota Press, Minneapolis.
Wells, H. G. (1938) 'World brain: the idea of a permanent world encyclopedia', in World Brain, Doubleday, New York, pp. 83–88.
Wright, A. (2008) 'The web time forgot', New York Times, 17 June, [Online] Available at: http://www.nytimes.com/2008/06/17/science/17mund.html (15 November 2011).
Zittrain, J. (2008) The Future of the Internet and How to Stop It, Yale University Press, New Haven, CT and London.
Leah A. Lievrouw is a Professor in the Department of Information Studies at the University of California, Los Angeles. Her most recent book, Alternative and Activist New Media (Polity, 2011), explores the ways that artists and activists use new media technologies to challenge mainstream culture, politics, and society. Along with Sonia Livingstone of the London School of Economics, she is also the co-editor of the four-volume Sage Benchmarks in Communication: New Media (Sage, 2009) and of The Handbook of New Media (updated student edition; Sage, 2006). Works in progress include Media and Meaning: Communication Technology and Society (Oxford University Press) and Foundations of Media and Communication Theory (Blackwell). Prof. Lievrouw received a PhD in communication theory and research in 1986 from the Annenberg School for Communication at the University of Southern California. She also holds an MA in biomedical communications/instructional development from the University of Texas Southwestern Medical Center in Dallas and a Bachelor of Journalism from the University of Texas at Austin. Previously, she held faculty appointments in the Department of Communication in the School of Communication, Information, and Library Studies (SCILS) at Rutgers University in New Brunswick, NJ, and in the Department of Telecommunication and Film at the University of Alabama. She has also been a visiting scholar at the University of Amsterdam's School of Communication Research (ASCoR) in the Netherlands and a visiting professor at the ICT & Society Center at the University of Salzburg, Austria. Address: Department of Information Studies, University of California, Los Angeles, 216 GSE&IS Building, Box 951520, Los Angeles, CA 90095-1520, USA. [email: llievrou@ucla.edu]
