Digital Kindling
Musings on digital culture.

Tuesday, July 19, 2011

“Surfing” the net back when dial-up was the standard connection felt more akin to wading through molasses. We understood that the term was ironic, and in reality, we viewed the Internet as a collection of nouns delivered to our computers -- web pages, sites, emails, and downloads. But our perception of the online experience has shifted, and we now see it more in terms of verbs: streaming, Googling, Facebooking, blogging, cloud computing, news feeds, and “the conversation.” This shift in perception is especially natural for digital natives, those born into digital culture as a way of life, as opposed to digital immigrants, who were born before the mid-eighties and grew up primarily as consumers of mass media. There’s a lot of gloomy talk about how digital natives are different in the way they interact with technology, process information and socialize, and now a new study suggests that the Googlization of knowledge is resulting in poorer memory. Whether or not this development is truly negative (I don’t believe it is), it’s clear that digital natives take for granted their relationship to the Internet as a verb.
Digital natives take for granted that they can curate their own cultural experiences, the idea of “curatorial me.” They take for granted that they can access more music, more film, more knowledge, and more peers than any other generation in human history. They never knew a time when cultural experiences were limited by one’s budget and the whims of the mass media. And they naturally grasp the Internet as a flow of data as opposed to a collection of nouns. In a sense, even the term “curation” is misleading since it suggests collecting, possessing and owning. Curation is really about filtering the overwhelming surplus of cultural choices online. This perception of the online experience as a flow rather than a collection of nouns has staggering implications for the content industry and the economics of the Internet.
What is the value of a Tweet? Would you pay a penny for a good Tweet? For some Tweets, I might pay more than that, but of course, the vast majority aren’t worth the brain cells we burn glancing at them. For digital natives, and for more and more digital immigrants as well, music, film, news and almost everything else that arrives online is perceived like a Tweet or a status update, as part of the deluge of data. These sparks of culture stream to us down the pipes, flicker briefly on our screens and then fade away into the ether of the Internet. They blend with the crushing flow of information instead of standing out as individual songs or movies that we purchase and possess. They have little permanence and therefore represent little real value. This shift in our perception of online content will have a greater and greater impact on its producers and distributors who, after all, profited in the past by selling us the containers of content -- the physical products such as albums, magazines, books and DVDs, the “stuff” we put on our shelves and collected. Selling culture as a collection of nouns -- newspapers, books, DVDs, photographs or albums -- will feel more and more obsolete as this perception continues to shift.
Wednesday, July 6, 2011
First Computers
"It's an incredible machine for writing," my father said. "And you can use it for other things, like organizing your recipes..."
He was talking about his new Compaq Luggable, the first PC I’d ever seen. It was 1985. Back to the Future was the most popular movie of the year and Foreigner’s “I Want to Know What Love Is” was playing whenever I turned on the radio. I looked down at the hulking box on his roll-top desk, a featureless hunk of plastic as boring as a furnace, and tried to figure out what my father, a history professor, would want with a computer.
"It's portable too," he continued, undoing a couple of hefty clasps on the front which opened to reveal a keyboard and a little porthole of a screen that was integrated into the top of the body. It weighed 28 pounds.
“Watch this,” he said. He typed “WP” at the C:\> prompt to open WordPerfect. “Now type something.” I wrote my name and address. “Now check this out.” With the cursor keys, he highlighted my first name, deleted it, and then pasted it after my address. I was speechless. “You can do that with whole paragraphs. It’s called ‘cut and paste.’ It’ll check your spelling too.”
And suddenly, it seemed to me that anyone who was serious about writing needed a computer, so in 1986, I blew $2,000 of my student loan on a 640K, no-name “IBM Compatible.” I justified the cost by telling myself (and my girlfriend) that it was essential for my Master's thesis. I'd also use it to kindle two years of travel notes into the roaring inferno of my novel, and of course, she could use it to organize our recipes and our music collection. It was an ugly brute, but all computers were ugly then. The body was the color of dirty dishwater. The tedious noise of the fan and irregular grinding bleat of the disk drive made for an experience that felt more mechanical than digital, but it did have luminous orange letters on a black screen as opposed to the old-school green letters of my father’s machine. It was more than I could afford, but I figured it would last at least 10 years if I took care of it.
The machine would take about five minutes to fully boot, and for two years, I typed onto its black screen, backing up everything on 5¼-inch floppies and printing out the fruit of my labor on a clattering dot-matrix printer. The continuous pages would fold out of the top of the printer at a blazing two pages per minute. Then came the tedious process of tearing off the perforated punch-holed sides, separating the pages, and collating the whole pile. I had joined the digital revolution.
By 1990, 640K was feeling pretty pedestrian. I wanted something sexier, namely a Macintosh. Surprisingly, my old machine had no resale value whatsoever, and it became the first in a heap of personal e-waste that's been steadily and shamefully growing ever since. My Mac cost about the same as my first computer, just over $2,000, but it came with a hard drive, a graphic interface “desktop,” and a cheery chime when I turned it on. Its squat beige body and tiny gray-scale screen felt truly compact. And friendly. Only a year before, I'd confidently counseled friends not to buy a Mac since those gimmicky mouse things, surely a passing fad, would only be one more thing to clutter up a desk. Foresight... I loved that Mac and felt an almost thrilling connection to it as I word processed on AppleWorks. I played some Tetris on it and learned some elementary spreadsheeting, but for the most part, it was still just a glorified typewriter.
In 1994, I was seduced by color screens (“What will they think of next?”) and I splurged on a new Macintosh which cost me, once again, just over $2,000. The body was shaped like a pizza box. I used it to keep a budget and my mark book. I also briefly found myself addicted to Spectre and would fall into bed at 1:00 AM, the game’s fluorescent symmetries still unfolding and firing the synapses in my dreams. People were beginning to talk about the Information Superhighway - email, shopping, porn - and AOL CDs were showing up in the mailbox, given away in magazines or packaged with computers, but it all seemed geeky and complicated, something for retired engineers.
In 1996, my friend Luke convinced me to try the World Wide Web, so on a cold January evening, he brought me a copy of Netscape Navigator and an old dial-up modem. Over a six-pack of Sleeman’s, he hooked me up to the Internet through a stray university account. We watched as the AltaVista search engine slowly descended line by line onto the screen. At the top was a Toyota ad. “What do you want to search for?” he asked, but I couldn't think of anything offhand.
So I had Internet. It was glacially slow and it was boring. I had email, but no one else I knew had it, so my inbox was depressingly empty whenever I bothered to check -- not even junk mail. It was so slow that it wasn't even a decent substitute for the yellow pages, since it took more time to fire up the box (2-3 minutes), get online (1 minute through dial-up), and search and wait for the results (another minute) than it did to simply pull out the phone book. Free porn was interesting, but again, it was a lot of work for a pretty unreliable pay-off. And as we used to joke, “Who seriously wants to have sex at a computer anyway?!” Someday studies will be done about how a whole generation of men learned to masturbate at their desks.
In 1998, I moved to the iMac. Like my previous three computers, it cost just over $2,000, but for the first time, design was a factor. You could choose the color. It was an avant-garde statement on my desk. The screen was crystal clear with thousands of colors, it had some games built in, and it had scorching Internet speeds compared to the old machine. I bought educational CD-ROMs for the kids, Encarta, and Word for Mac. The Web was getting more interesting too if you knew where to look. There were message boards and chat rooms. There were Nigerian 419 scammers on email. There were some free games. But it was still lacking in the entertainment department. It still wasn’t really fun.
And then a student told me about this thing called Napster.
If one thing fundamentally changed my relationship to the computer and to the Internet, it was Napster. I’d collected music in the late 70s and 80s, and I made the expensive switch to CDs, but maintaining a collection across three formats – CDs, vinyl, and cassettes – was laborious, expensive, and ultimately frustrating. Should I buy new music or concentrate on replacing my old favorites with CDs? At $15 to $20 a pop, an impressive music collection represented a significant portion of my disposable income. And each purchase was a risk. Sure, I liked that new hit by the Spin Doctors, but what if the rest of the album was crap? Besides, I had young children and bills and little time to sit back and listen to music. But suddenly, I was on Napster every night. I downloaded everything I could find until my hard drive got so full I could no longer download new songs without deleting something first. It was free, it was social, it was addictive, and the hint that it might be illicit -- we weren't really sure -- made it deliciously fun. Napster made the Internet something I needed. It got me exploring, demanding more speed, more connections. When Napster was shut down, it felt personal, like the public library and the local dancehall had both been closed by The Man. “Piracy”? “Stealing”? Come on. We were “sharing.” How dare they put a price tag on sharing culture.
So I briefly mourned the loss of Napster, but then along came Kazaa, LimeWire, and BitTorrent. Soon there was MySpace and YouTube and Facebook and Wikipedia and Google, and somewhere in there it all became a blur.
The price of computers was dropping fast and our household acquired multiple machines. The kids used the iMac, and I got a PowerBook for myself – gray-scale screen, two-hour battery life, a trackball instead of a trackpad. Two years later, I had another Mac laptop with a color screen, followed by a short stint with a Dell PC. Add to that, in quick succession, a new-generation white iMac for the family, a MacBook issued to me by the college, a ThinkPad issued by my wife’s office, a MacBook for one son in university, a Dell for the other, Wi-Fi connections for all of them, plus Palm Pilots, smartphones, iPods, an iPad, a terabyte of disk space to back up all our media, a printer/scanner… And somewhere in there the Internet became a whole lot more than a place to check email and download songs. Somewhere in there it became the thing you checked in the morning and ended up connected to for the rest of the day. Somewhere in there Wikipedia stopped feeling like some lame amateur experiment, YouTube became something we watch collectively at social gatherings, Google became a verb and email became a horrible daily chore.
For 25 years now, I've had some form of computer hogging real estate in my various abodes. The first four each cost between $2,000 and $2,500, and for the most part, I merely used them for elementary word processing. For that money today, given inflation, I could get myself the best iMac Apple sells and still have money left over for an iPad. The smallest iPod in the house has more memory than my first computer. As I write this, it’s 2:30 in the afternoon, and there are three computers on in the house. I’m not sure anyone is actually sitting at the others because, of course, we don’t sit down to the computer like we used to; we slip online to check an address, change the music, respond to a status update, or read the news. I’d hoped to one day use the computer to organize my recipes; instead, I use it to cruise the world’s library of recipes. All this to say that in the end, the digital revolution isn’t about the computer; it’s about the Web, particularly the social web and the connections it enables. And as we forge new links and the Web grows, the real machine is the collective. The digital revolution isn't the computer; it's us.
Thursday, June 16, 2011
The Post-Millennial Cultural Landscape
Culture: everything we create, share, leave behind and express.
Throughout all of human history up to the 20th Century, culture was local, live and shared. Want music? Play it yourself or find someone to play it for you. Drama? Let’s check out the vaudeville. Stories were mostly told, poems were memorized and recited, and art had to be viewed on the wall. Culture was what you shared with your community.
Then came the 20th Century technologies of broadcasting and recording. Now, culture could be packaged, distributed, bought and sold. Culture became a commodity, something we possessed and collected, rather than something that average people created. Instead of being mostly locally produced and experienced by amateurs, it was professionally produced, institutionalized, and consumed. We shopped for our cultural experiences and we built shelves to hold our albums, cassettes and books. We subscribed to our magazines and newspapers, we taped our posters to the wall, and we wore the icons of our cultural identity on t-shirts. Collectively, we kept regular appointments with our radios and television sets to consume hours and hours of culture produced by "the mass media". Culture was centralized in movie studios, television networks and publishing houses, and it was promoted by professional critics who suggested the best way for us to spend our cultural dollars. Culture was big business and profitable empires arose based on the limited supply of professionally produced content: the record industry, the film industry, broadcasting networks and publishing houses.
And the direction of that culture was naturally one way. Producers produced and consumers consumed. I enjoy the quiet anticipation when the lights dim in a cinema, but perhaps it's a fitting symbol of our relationship to culture in the twentieth century: an audience of anonymous strangers in a dark auditorium passively experiencing the same cultural product simultaneously in silence. Or maybe an isolated viewer watching The Price is Right in her basement, a shopper perusing the rack of “New York Times Best-sellers” in a drugstore, or an excited crowd waiting in line for a Rolling Stones concert – all of these represent the same one-way direction of culture in the 20th Century.
With digital culture, recording and broadcasting have become virtually free to anyone with a computer and an Internet connection. "Professional amateurs" -- average citizens following their personal interest in anything from music, film, and literature to astronomy, knitting, and (of course) kittens -- can research their passion in depth, produce professional quality material, distribute their work globally, receive immediate feedback, and form relevant subcultures. The idea of culture that informed my generation, culture as a scarce commodity produced and sold by professionals, has been completely revolutionized in the short span of a decade.
We all understand that those who made money from brokering information – real estate agents, stock brokers, newspaper classifieds, travel agents or postal workers – have seen their business models irrevocably shattered. Culture is information too, and anyone who earned a living from producing, distributing and selling it is facing a similar crisis: instead of a scarcity of culture that can be profitably packaged and sold, there is a surplus of culture exchanged for free. Be you a writer, a musician, a newspaper editor, a cinema owner, a bookstore employee, a music company executive, a photographer or a stand-up comic, making money through culture has become a whole lot more challenging.
In some ways, we’re returning to the pre-20th Century model of shared culture, of audiences producing culture for audiences. Whether it’s through Wikipedia, file-sharing, Facebook, Flickr, YouTube or personal blogs, culture now moves laterally between the participants rather than flowing from a small minority of centralized, paid producers to the masses. Instead of mass media, we have “new media.”
Whether we call the phenomenon Web 2.0, the Internet Revolution, or the democratization of culture, it is forcing us to rethink the way we look at everything from literacy and economics to education and art. It's causing us to reexamine our identities and our relationships, our assumptions about knowledge and our notions of government. In fact, it's difficult to think of an aspect of our lives that remains unaffected by the culture shock of the new communication tools that have materialized so far this century.
This is an over-simplification, naturally, and others have elaborated on it in far more detail. If you're interested, Lawrence Lessig’s writings and his highly watchable TED talk on copyright law and creativity are a good place to start. As well, this essay by Bill Ivey and Steven J. Tepper is an accessible work that I point my students to as an introduction.
Thursday, June 9, 2011
5 Things
First blog post, such an awkward little postcard to the world. There's a disconnect in writing to myself and simultaneously to the firmament of connected lights that is the Internet. On one hand, it's an intimate, private space - just me, my keyboard, and I - on the other, it's a potentially public space, a postmodern cross between singing in the shower and delivering a speech from the Pope's gallery. So in an effort to bring some order to my thoughts and launch this thing, I’m falling back on a simple “5 Things...” post. Five Things I’m thinking about in digital culture are...
1. The future of professional content. The “information age,” as the 90s dubbed it, promoted the fallacy that to make money, one had to broker and sell information. The opposite has come true, and a generation has grown up with the expectation that media is free. Content creators, professional ones at least, have found themselves unable to profit from the traditional means of selling media. Whether you’re a music company, a newspaper, a film producer, a novelist or a textbook publisher, making a buck out of selling information has become a whole lot harder, if not impossible, and entire industries have found themselves kicking through the rubble of their collapsed business models in a stupefied daze.
2. Net Neutrality. When we log on to the Internet, we can access almost any information we want at the fastest available speed. With few exceptions, we can use any service we want any time we want. The Internet service provider may not speed up the connection for one class of user over another; there is no discrimination based on the sender or the receiver. Framing the debate as one in which illegal file-sharers are hogging too much bandwidth and slowing things down for everyone else, ISPs are asking for the right to privilege certain web activities over others with more bandwidth and faster speeds. The fear is that this will open the door to legislating the net, imposing limits and allowing providers to sell access in packages as they do with their other businesses, such as cable or telephone services. As a consequence, we would have a fast lane and a slow lane on the Internet, which would interfere with the explosion of innovation and artistic freedom and expression that the web affords us.
3. The death of the Web. A common misperception is that the Internet and the Web are the same thing. The World Wide Web is the part of the Internet we are most familiar with, the pages upon which Wikipedia, blogs like this, university research sites, and the New York Times magically appear in our browsers. Email, streaming, downloads, Skype, chat and a host of other services that most users never think about are also part of the Internet (there’s a tiny sketch at the end of this post for anyone who wants to see the distinction made concrete). The Web is a wild and woolly place with its uninvited porn, spyware, phishing scams, obscene comment strings and viruses. More and more of us are migrating to mobile devices like the iPad and taking advantage of the services and platforms we enjoy through convenient apps. These apps effectively keep us off our browsers, and the fear is that we’ll regret this move. The ease and safety of apps will push us further from the freeform innovation and inventiveness of the more open and unpredictable Web.
4. The Internet is bad for our brains. Another concern of the moment is whether the constant distractions and multitasking of life online are in fact changing the way we think. Are we losing our ability to understand longer texts and complex arguments as we ignorantly drown in a happy sea of tweets, status updates and indignant blog posts? The argument claims we are instant superficial specialists, skimming knowledge and making quick, uninformed decisions without the ability to ruminate and develop larger ideas. This idea is the basis of Nicholas Carr’s The Shallows. The other side of this argument, most famously put forward in Clay Shirky’s Cognitive Surplus, is that time spent online creating even the most mundane things, such as tweeting a status update or commenting on a YouTube video, is superior to passively consuming television and therefore ultimately better for us in the long run.
5. Copyright vs. Copyfight. Copyright laws are vastly outdated for the current media landscape. Originally created to encourage creativity while preserving the importance of the public domain, they speak to another era when reproducing media was impossibly expensive for the average citizen. Now that copying is free, perfect, easy and social -- it could be argued that copying files is in fact the main function of computers -- average citizens find themselves the targets of lawsuits, and media producers find themselves in the position of suing their fans and customers.
So. Five things that are on my radar.
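P.S. For the technically curious, here is the sketch promised in #3 -- a minimal illustration, mine and not anyone's production code, of the idea that the Web is just one protocol (HTTP) riding on the Internet's plumbing. It assumes nothing beyond Python's standard library and a reachable web server:

import socket

# Open an ordinary Internet connection to a web server (port 80 is HTTP).
sock = socket.create_connection(("example.com", 80))

# Speak the Web's protocol by hand: one HTTP request, typed out in full --
# no browser, no "Web" software at all.
sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")

# Print the first few hundred bytes of the reply (the response headers).
print(sock.recv(400).decode("utf-8", errors="replace"))
sock.close()

Email (SMTP), Skype, and BitTorrent travel over the very same sockets but speak entirely different protocols; even if walled-garden apps killed off the Web tomorrow, the Internet underneath would hum along regardless.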
Labels: content producers, copyright, net neutrality