In the talk, he noted how quickly things become polarized in this era, the bad-trip bizarre extremes suggested by the Tea Party and the Palinites. Given all this, he had come to the conclusion that we’re “running obsolete code” socially… something about the information environment we’ve created enables polarization, perhaps. How much of this is the bias of a binary medium, and how much of it is attributable to the biases of the people who program our technologies… and our “social code”? He had been thinking about how our technology works vs how our technology works on us. His conclusion: if you’re not the programmer, you are one of the programmed.
So the book he was going to write was to have been called something like Program or Be Programmed. He discussed how game players progress from players to “cheaters” (i.e. they find the tricks, backdoors, and cheats within a programmed game) to author or programmer. It’s a natural progression – taking control of our environment, the reality we’re banging around in. This goes back to the creation of writing as a formalized symbolic representation of reality, and the invention of the printing press, which meant the written word could be replicated. Initially, once we gain the ability to read and write, “anyone can program reality” via written text. However, the invention of the printing press assigns more control to those who control the means of production/replication – we get the division between those who publish and those who “merely” read. Those who control publishing control which representations of reality are broadly replicated – I’ve spoken elsewhere of the invention of the printing press as the genesis of broadcast media, where control of “reality” is centralized. In the era of mass media, there’s a sense of mainstream knowledge that’s vetted carefully by editors and publishers who share similar biases and assumptions.
In the era of computers and the Internet, we’ve seen the evolution of a more decentralized, diverse “social” media. How free are we from the centralized set of biases associated with mass publishing? While we appear to have many and diverse publishers, what we have is more bloggers but not necessarily more programmers, and Rushkoff argues that there are biases in the way things are programmed – programmers have biases, or they’re directed according to the biases of others. An example is a Facebook profile, which has a structure defined by Facebook so that it reduces the personality of the Facebook user to a consumer profile. Similarly Google is programming Internet-based structures – presumably on the “open Internet” – where the bias is for Google to extract value from content creators who produce their content for free within an infrastructure that Google increasingly controls.
Doug was going to write Program or Be Programmed as a description of ten biases of digital media, and ten commands that go with them. He decided that the “era of the book” has ended, along with the biases of a linear literary culture, which gives way to the nonlinear biases of a digital culture, so he’s tossing out his list of biases and thoughts as a set of memes dropped into the digital stream. The format: bias followed by commandment, along with additional comments from Doug, and some of my own.
1) Time: “Thou shalt not be always on.” We can lose ourselves in our persistent connection to content streams. We assign our time to the presence of other voices at the expense of our own. We need to take more time to be who we are, and to shape our own thinking, our own voices.
2) Distance: “Thou shalt not do from a distance what can be done better in person.” This relates to Doug’s thinking about economies, which were delocalized and centralized by industrialization. Digital fetishism has us using long distance technology for short distance communication. Example: Doug visited a classroom where a group of students in the same room were all logged into Second Life for a meeting. Network technology has long distance biases, or equates all distance. Global becomes weak local; local becomes weak global. There are some kinds of local coordination that it makes sense to do online, but you have to be clear whether you’re using the technology where it’s most effective, or simply conceding to its inherent bias.
3) Scale – the net is biased to scale up. “Exalt the particular.” Not everything should scale. This makes me think of E.F. Schumacher’s “Small is Beautiful.”
4) Discrete: “You may always choose none of the above.” The real world is not digital, is not a symbolic representation in metrics. Online activity is a digital landscape of forced choice. I see this as more about database coherence – if you’re trying to build a manageable database, you have constraints that are built into the interface as limited choices. Simple example: you might enforce one specific way of expressing a date, e.g. MM/DD/YYYY, so that all dates will have the same format, therefore all date data will be coherent. Lacking this constraint, data is less usable. To the extent we force choices in something like a user profile, we’re forcing real, complex persons to limit their self-descriptions so they fit the biases of our data structures. You should always be able to withhold choice, or choose “none of the above.”
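The date example can be sketched in code. Here is a minimal Python sketch (the function name and the set of accepted input formats are my own illustration, not anything from the talk) of how an interface can enforce a single storage format for coherence while still letting the user withhold a choice entirely:

```python
from datetime import datetime
from typing import Optional

def normalize_date(raw: Optional[str]) -> Optional[str]:
    """Coerce user input into the one MM/DD/YYYY format the database
    expects -- or accept a withheld answer ("none of the above")."""
    if raw is None or raw.strip() == "":
        return None  # the user declined to answer; don't force a choice
    # Accept a few common input formats, but store only one.
    for fmt in ("%m/%d/%Y", "%Y-%m-%d", "%d %B %Y"):
        try:
            return datetime.strptime(raw.strip(), fmt).strftime("%m/%d/%Y")
        except ValueError:
            continue
    raise ValueError(f"Unrecognized date: {raw!r}")

print(normalize_date("2010-11-15"))  # → 11/15/2010
print(normalize_date(None))          # → None
```

The design choice matters: the constraint buys coherent data, but modeling “no answer” as a first-class value (rather than forcing a default) is what keeps the forced-choice bias in check.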
5) Complexity. “Thou shalt never be completely right.” Doug starts by noting that Wikipedia is taken seriously as a reference, whereas “using an encyclopedia used to be a joke” – i.e. the encyclopedia has many references but none of them captures the full complexity of the subject described. Real scholarship acknowledges, embraces, and digs into that complexity. There are few answers that are completely “right.” Complex inquiry is a good thing.
6) Anonymity. “Thou shalt not be anonymous.” We should work against the tendency of the net to promote anonymity and decentralization. Doug notes that, online, we have an “out-of-bodiness” which negates nonverbal communication. By default, we are incomplete in an environment that is mostly textual and binary communication. In this context, it is liberating to adopt a strong sense of identity.
7) Contact. “Remember the humans.” Content is not king in a communications environment – CONTACT is king.
8) Abstraction. “As above, so below.” Text abstracted words from speech. Invention of text led to an abstract god. Also led to treating economy as if it is nature – but it’s not, it’s a game. Don’t make equivalencies between the abstracted model and the real world.
9) Openness. “Thou shalt not steal.” This is about the assumption, seemingly prevalent on the Internet, that everything should be free. Doug makes a long term bet against Google: “if everything is free, there is no one left to advertise.” Free is not the same as “open source.” I think what he’s saying here is “free as in beer” is not the same as “free as in freedom,” which is Richard Stallman’s persistent distinction. In fact I don’t agree that Google or anyone else is trying to make everything free. We’re seeing a transitional economy where value and compensation are being redefined, and where especially the value and exchange of social capital is increasingly relevant.
10) End users. Here the bias is toward making all or most of us end users rather than programmers. “Program or be programmed.” Doug notes that in the early days of computing, computer classes taught programming with BASIC, etc. Now the classes teach how to use applications that others have programmed. The user and the coder are farther apart. He argues that we should all understand programming, be able to build our own tools or configure tools others have built, so that we have more control over the digital environment. “But what about the greater learning curve?” He argues this is a good thing.
There is a tendency toward centralization with systems like Facebook. Is the web becoming more centralized? How does it remain decentralized?
One last note from Doug’s talk. I wrote this note: “Future of client side technology is digital currency. Will do to central bank what Craig’s List did to Hearst. Will have to be decentralized to exist and not be taken down.”