

Lev Manovich | A Decade in Audiovisual Culture

The article supplements the multimedia essay created by 20 household names in Polish culture and presented live on February 24th, 2017, during the NInA Finale, an event at which we looked back at the last decade in audiovisual culture and at the nearly 12-year history of the Institute, just days before its merger with the National Film Archive.

The Algorithms of Our Lives

In 2002, I was in Cologne, Germany, and I went into the best bookstore in the city devoted to humanities and arts titles. Its new-media section contained hundreds of books. However, not a single title was about the key driver of the "computer age": software. I started going through indexes of book after book. No "software."

Yet in the 1990s, software-based tools were adopted in all areas of professional media production and design. In the 2000s, those developments made their way to the hundreds of millions of people writing blogs and tweeting, uploading photos and videos, reading texts on Scribd, and using free tools that 10 years earlier would have cost tens of thousands of dollars.

Thanks to practices pioneered by Google, the world now operates on web applications that remain forever in beta stage. They can be updated anytime on remote servers without consumers having to do anything—and in fact, Google is revising its search-algorithm code as often as 600 times a year. Welcome to the world of permanent change—a world defined not by heavy industrial machines that are modified infrequently, but by software that is always in flux.

Software has become a universal language, the interface to our imagination and the world. What electricity and the combustion engine were to the early 20th century, software is to the early 21st century. I think of it as a layer that permeates contemporary societies. If we want to understand today's techniques of communication, representation, simulation, analysis, decision making, memory, vision, writing, and interaction, we must understand software.

But while scholars of media and new media have covered all aspects of the IT revolution, creating fields like cyberculture studies, Internet studies, game studies, new-media theory, and the digital humanities, they have paid comparatively little attention to software, the engine that drives almost everything they study.

It's time they did.

Consider the modern "atom" of cultural creation: a "document," i.e., content stored in a physical form and delivered to consumers via physical copies (books, films, audio records) or electronic transmission (television). In software culture, we no longer have "documents." Instead, we have "software performances." I use the word "performance" because what we are experiencing is constructed by software in real time. Whether we are exploring a website, playing a video game, or using an app on a mobile phone to locate nearby friends or a place to eat, we are engaging with the dynamic outputs of computation.
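To make the distinction concrete, here is a minimal sketch (my own illustration, not part of the original essay): a static document delivers the same fixed content to every reader, while a software performance computes what the user sees at the moment of the request. The function name, the sample strings, and the choice of Python's standard library are all hypothetical assumptions made for this example.

```python
import datetime
import random

# A "document": its content is fixed once and is identical for every reader.
STATIC_DOCUMENT = "Chapter 1. The history of the moving image begins..."

def software_performance(user_location: str) -> str:
    """A 'software performance': the output is computed in real time,
    so no two readings are guaranteed to be identical."""
    now = datetime.datetime.now().strftime("%H:%M")
    suggestion = random.choice(["a cafe", "a cinema", "a bookstore"])
    return f"It is {now}. Near {user_location}, consider visiting {suggestion}."

print(STATIC_DOCUMENT)                 # always the same text
print(software_performance("Warsaw"))  # potentially different on every call
```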

Even when a user is working only with a single local media file stored on his or her computer, the experience is still only partly defined by the file's content and organization. The user is free to navigate the document, choosing both what information to see and the sequence in which to see it. (In Google Earth, I can zoom in and out, switching between a bird's-eye view of the area and its details; I can also switch between different kinds of maps.)

All that requires a new way to analyze media and culture. Since the early 2000s, some of us (mostly from new-media studies and digital arts) have been working to meet that challenge. As far as I know, I was the first to use the terms "software studies" and "software theory" in 2001. The field of software studies gradually took shape in the mid-2000s. In 2006, Matthew Fuller wrote: "In a sense, all intellectual work is now 'software study,' in that software provides its media and its context, but there are very few places where the specific nature, the materiality, of software is studied except as a matter of engineering."

The study of software culture calls for a fundamentally different humanities methodology. We need to be able to record and analyze interactive experiences, following individual users as they navigate a website or play a video game; to study different players, as opposed to using only our own game-play as the basis for analysis; to watch visitors to an interactive installation as they explore the possibilities defined by the designer—possibilities that become actual events only when the visitors act on them. In other words, we need to develop our own methods of transcribing, analyzing, and visualizing interactive experiences.
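A hedged sketch of what such a transcription might look like in practice (my own illustration, not a method proposed in the essay): each user action is recorded as a timestamped event, and the resulting session log can then be summarized, compared across users, or visualized like any other cultural dataset. All class names, field names, and example actions below are hypothetical.

```python
from collections import Counter
from dataclasses import dataclass, field
import time

@dataclass
class InteractionEvent:
    """One recorded user action during an interactive session."""
    timestamp: float
    action: str   # e.g. "zoom_in", "open_menu", "click_link"
    target: str   # what the action was applied to

@dataclass
class SessionLog:
    """A transcription of a single user's interactive experience."""
    user_id: str
    events: list = field(default_factory=list)

    def record(self, action: str, target: str) -> None:
        """Append one timestamped event to this user's transcript."""
        self.events.append(InteractionEvent(time.time(), action, target))

    def summary(self) -> Counter:
        """Count how often each kind of action occurred, a first step
        toward comparing different users' paths through the same work."""
        return Counter(event.action for event in self.events)

# Example: transcribing one visitor's path through an interactive map.
session = SessionLog(user_id="visitor_01")
session.record("zoom_in", "map_of_warsaw")
session.record("open_menu", "layer_selector")
session.record("zoom_out", "map_of_warsaw")
print(session.summary())
```

The same structure extends naturally to many users: collecting one such log per visitor would make it possible to compare how different people traverse the same work, rather than relying only on the analyst's own play-through.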

As more and more of our cultural experiences, social interactions, and decision making are governed by large-scale software systems, the ability of nonexperts to discuss how these systems work becomes crucial. How does the software we use influence what we express and imagine? Shall we continue to accept the decisions made for us by algorithms if we don't know how they operate? What does it mean to be a citizen of a software society? These and many other important questions are still waiting to be analyzed.
 

– Lev Manovich

Lev Manovich

Author of seven books, including "Software Takes Command" (2013), "Soft Cinema: Navigating the Database" (2005) and "The Language of New Media" (2001), which was described as "the most suggestive and broad-ranging media history since Marshall McLuhan". Manovich is a Professor at The Graduate Center, City University of New York (CUNY), and the Director of the Software Studies Initiative, which works on the analysis and visualization of big visual cultural data.
