Consumption Computing and the Death of Creativity

Sometimes I'm just a bit contrarian.

Other times, I'm incredibly contrarian.

Case in point: I'm in favor, in a general sense, of technological progress, especially as it relates to computers. I'm not always in favor of everything that's done with it, but you can't fault tools for the misuse thereof, right? And in terms of easy access to overwhelming computing power, we have it better now than we ever have before. My "daily-driver" system isn't even remotely impressive by modern standards, and it still allows me to arrange complex pieces with close to a dozen virtual instruments in real time, or perform heavy-duty operations on multi-megapixel images in single-digit seconds. I have power at my disposal that I could only dream of when I was thirteen.

And yet, in spite of all that, I find myself increasingly trying to go back. As I've been collecting MIDI gear, I've increasingly found myself drawn to the idea of creating music on my Amiga, or perhaps an old 68k Mac, and then to the idea of drawing on them, writing, programming... These are machines that run at between 1/33rd and 1/100th the clock speed of my laptop (and not even close to that in actual performance, thanks to the laptop's dual cores, multi-threading, efficiently pipelined execution, fast RAM, and large multi-level cache,) and have at the very most 1/16th the RAM (and typically closer to 1/64th.) They look like pocket calculators by comparison; why would I ever even consider trying to make them my tools of choice!?

Well, partly because I'm just crazy like that. I mean, look at my site. But I've been talking about the idea with other people (people sufficiently informed on matters vintage-electronic to know what I'm talking about, but different enough from myself to offer meaningfully different perspectives,) and I'm starting to think that maybe it's because vintage computers, more so than modern ones, are designed for this kind of thing. That seems absurd on the face of it; after all, a general-purpose computer is a general-purpose computer. It runs any software you give it to run. And it's not like Windows (for example) is short on software for productivity or creative purposes. There's absolutely no reason I couldn't accomplish everything I want or need to on my XP laptop - but when I sit down to do so, even if it's on time I've specifically set aside for the purpose, I find myself battling my own desire to just pull up Solitaire or watch MST3K on YouTube or do anything but what I'm actually trying to get done.

I blame the media.

The Rise of Media Computing

"Multimedia" was the buzzword of the early '90s. In the future, we were told, everything would be digital, and it would all be "Multimedia." In dictionary-definition terms, "multimedia" simply means information being delivered in more than one medium, but to judge by magazine and catalog illustrations "multimedia" meant pictures of classical composers, the Space Shuttle, Gutenberg, and other facets of the totality of human knowledge all blasting into your brain from the computer monitor on rainbow-colored waves of pure enlightenment.

Oddly enough, nobody in the general public was as enamored of this idea as its proponents were. People were vaguely impressed by the novelty of being able to pop a CD-ROM in the computer and pull up a grainy 256-color picture of the Mona Lisa, but it wasn't something there was a lot of call for in daily life. So the hype gradually died down, but it never really went away. It wasn't until some years later, in the late '90s, when broadband Internet started to become the rule rather than the exception and slickly-produced pages full of images, Flash clips, and all manner of other bells and whistles could load up just about anywhere, that "multimedia" found its calling. In some ways it was the fulfillment of what the futurists had claimed back in the day; Wikipedia is loaded to the gills with text, images, and audio and video clips on any subject imaginable. On the other hand, a lot of it was just ads, ads which needed people to look at them. And for people to look at them, they had to accompany content. It didn't really matter what kind of content; by this point there were enough people on the Internet that basically anything you could conceive of (and most of the things you couldn't) had a built-in audience, so who cared? Content - the thing that the Internet was designed as a vehicle for - was now pretty much a fungible commodity.

Things developed rapidly from there. Even before DSL was ubiquitous, some clever companies had figured out that by offering free (if severely limited) hosting to people who just wanted a place to call their own on the Internet, they could get content for free, tack their ads to it, and make money without having to go to the effort of creating something themselves. In the mid-2000s, sites like YouTube sprang up, taking advantage of the increasing availability of high-speed Internet and the decreasing cost of hosting to offer users ever more space for self-expression (i.e. content) with which to draw page views, and in turn better exposure for ads. This, we were told, was the future of the Internet - "Web 2.0," as various insufferable tech publications liked to call it. This wasn't as much the case as they wanted to think (because let's be honest, out of all the content on YouTube, Blogspot, or Flickr, how much is really worth bothering with?) but it did have one very significant effect: it introduced the concept of the computer as a million-channel glorified television, which would become important later.

Meanwhile, a different trend was taking shape elsewhere in the computing world. I'm not sure, but it probably had its roots in the file-sharing mania of the early 2000s. After all, with a million billion MP3s from everyone else on the Internet, you had to keep them organized somehow, and why would you go to the trouble of doing it yourself when the computer could do it for you? Thus the media library was born: the computer as automated librarian, DJ, and projectionist, ready to serve up, at a moment's notice, any song or video out of however massive a collection you could accumulate, or simply pick something at random if you couldn't be bothered. This hit the big time when Apple used the concept as the basic principle for iTunes, simultaneously legitimizing both large-scale media hoarding and online purchase of intangible goods (i.e. MP3s.) This was a bit of a daring move at the time, when a whole lot of the media industry still viewed the Internet about as favorably as Klansmen view bar mitzvahs, but most companies came around pretty quickly when they saw how much money it raked in.

Record labels weren't the only ones taking notice. Microsoft certainly thought it was noteworthy, so much so that they started integrating "media" features right into Windows, in addition to Windows Media Player's own media-library feature, and even introduced a special version of Windows XP specifically dedicated to the idea of the computer as the one-stop Media Center of the future. Some folks in the TV world were watching with interest, as well, but with an eye towards more than simple one-time purchases. It would be a while before subscription online-TV services would really catch on, but when they did, the final piece fell into place, and the modern notion of Media Computing was complete.

Lotus-Eating in the Information Age

This has been the dominant idea in the development of mainstream computing for probably around five years now. The concept keeps getting pushed further, integrating these features more and more right into the operating system itself. Windows 7 has media libraries built right into Windows Explorer, complete with search capability, so you barely even have to bother knowing where anything is on your system - the computer will take care of that for you! Just call it up with a quick search! Media-consumption as a fundamental feature of the computer has been practically the defining idea behind tablets, which drop features that would be required for useful work, but are better at being portable TVs than portable TVs ever were. And now Windows 8 intends to bring that paradigm to the desktop. Computer-as-media-player is officially The Way Forward, if you ask anybody in commercial OS development. Welcome to the Future, ladies and gentlemen: never-ending entertainment at the touch of a button. Set up video streaming from your laptop to your TV, and you may never have to leave the couch again. Buy, consume, rinse, lather, repeat - it's easy!

But one thing is conspicuously absent from this model: the idea of creating. Oh, don't get me wrong, the capability is still there, at least on desktops and laptops (a.k.a. "real computers.") Even tablets may have a handful of "apps" for doodling or playing a little touchscreen piano or whatever. But the idea of creating, of the computer as an extension of the fundamental human creative urge, has all but vanished from the marketing material these days. Consumption of media, be it video, audio, or (for the really elite) text, is the real and primary purpose of the computer now. And why not? After all, time you spend making your own stuff is time you're not spending buying and consuming media from the online store. It could even be seen as direct competition with that fine establishment!

No, much better to lie back, open the floodgates, and let the endless rush of pre-made content - infinitely variable but utterly interchangeable - wash over you. If you're feeling really wild, you can try going onto Facebook and "like"-ing the video you were watching. It's probably even integrated into the player now! It's easier, after all...so much easier. Consumption demands nothing of you but time, and occasionally your wallet. Even your attention is optional! This is life in the age of the Pure Consumer: sitting passively, chewing cud, awaiting the next milking of your bank account. But at least you're not required to exert yourself doing stuff!

Life in the Before Time

It wasn't always like this. People under the age of eighteen may not believe this, but there was a time when people bought computers to do stuff with them. There was, in fact, a time when you couldn't even watch movies on a computer. You could play games, but they weren't the only thing you did - if you only wanted to play games, you bought a Nintendo or a Sega. If you had a computer, you did stuff with it - either because you needed to (e.g. your dad doing his taxes in Quicken) or simply because you could.

When my family got our first computer (a Macintosh IIcx,) for example, we didn't even have any games for it, at first. We actually did things with it. My brothers and I drew in MacPaint, or made doofy stories in Storybook Writer (and later, doofy plays in Opening Night.) We got our first electronic piano, and my mom started sequencing and arranging sheet music over MIDI. When we upgraded to a really advanced machine (a Pentium II), it opened up exciting new possibilities - my brother Andrew and I started developing games in MegaZeux and creating music for them in a MOD tracker, we got POV-Ray and learned how to raytrace and use Photoshop, and much more besides.

The thing that needs to be stressed here is that this was normal at the time. This was what people did. Not just nerds, everybody who got access to a computer. When we got the Mac, none of us were "computer people." Only Andrew and I ever really became computer nerds - but all of us experimented with many of the possibilities the computer had to offer. With everything we could do, it would've been stupid to limit ourselves. Yet now so many of us do just that.

One of the most amazing articles I've ever read is an old Rolling Stone piece, "SPACEWAR: Fanatic Life and Symbolic Death Among the Computer Bums," by Stewart Brand. It's really eye-opening to see speculation from the early '70s on where the development of personal computing (then still years in the future) would take us. The visions expressed therein, of computers as a power-to-the-people tool that could take the common man into new realms of creativity, are as inspiring as the reality of what personal computing has become is heartbreaking. From that, to this...how did it go so wrong? How did the fantastic turn out to be so banal?

I don't know. I don't think I can point to any one development from any one company and go "so you're responsible!" It's been a slow, almost unnoticeable shift over the last maybe fifteen years, and it only becomes more obvious when you look back over that time and think about it. I don't even think that there's been some giant conspiracy to turn every computer user into a passive, bovine consumer; most of the developments in that history up there are just companies doing what companies do: acting in their own short-term self-interest, based on projections of future trends drawn from analysis of present ones. But does that make them good? Does the fact that this is where personal computing has ended up mean that this is where personal computing should have ended up?

"So What!?"

That's the refrain I've heard from a lot of people. What does it matter? Like it or not, this is where we are now. You can't change the world by wishing hard enough, no matter how theoretically right you are. And in the long run, are we that much worse off for it? Who are we to tell people how to spend their free time, anyway? Is it really so terrible that someone would rather just spend all their free time kicking back on the couch and watching the Nostalgia Critic than spend any time on creating something of their own, exercising their own personal creative voice?

Well...yeah. Yeah, it is.

Don't get me wrong: I completely understand and sympathize with that line of thinking. I'm generally about the last person in the world to approve of buttinskis and busybodies, and even feeling the way I do on this issue I'm not really trying to say that nobody should spend their evening relaxing with a beer and an Animaniacs DVD. (Just so long as it's a nice beer.) But the simple fact of the matter is that we as human beings are created in the image of the Creator. It therefore follows that we are created, at least in part, to create. This urge is a fundamental part of the human makeup, and that's how it should be. And if someone is stifling that urge the way a lot of us today are, drowning it under a flood of passivity...that's not how it should be. They're shortchanging themselves and avoiding fulfilling part of what they were meant to be.

I don't necessarily mean that everybody is a nascent painter, poet, or what-have-you (though I think a lot of people might be but just haven't tried to discover and develop their own latent creative interests.) Maybe their outlet for the urge to do, to make, to shape is in something more seemingly prosaic, like gardening, or car repair. That's excellent as well! Maybe it's even something that involves taking in media, like a good film critic. But I think I can safely say that none of those are in even remotely the same category as passive, glassy-eyed media-consumerism.

(As an addendum, let me say that consumption - or, less crassly, "taking in" or "appreciation" - is important and isn't, inherently, a bad thing. After all, while a good creation is good even if nobody sees it, if nobody is enjoying it then it's not really serving its purpose. But consumption alone is not what we were made for.)

What Does All This Have to Do with Computers, Again?

We've strayed pretty far afield from those first couple paragraphs, indeed - starting off from the question of why older computers feel, to me, more suitable for making stuff with, and rambling on through an indictment of consumer culture, which is the larger problem I wanted to shoot off my mouth about in the first place. And as for that, if the way things are is so bad, what are we supposed to do about it?

Well, to take those one at a time, first: I'm not really sure why they do. Maybe it's just psychological; maybe putting myself in an old environment makes my brain revert to the old mindset, where the computer is a paintbrush or a keyboard or a typewriter rather than a TV. Maybe it's that they were just designed in that mindset to begin with, because they hail from before the shift took place. All's I know is that there just seems to be some je ne sais quoi, a kind of "electronic muse," that makes it much less difficult for me to get in and stay in a creative mood on an older system.

As for the latter: I don't know. (Lord, this was a lot of words expended to end on a couple ambiguities!) It's not as if I can just make people change their way of thinking by talking at them. (I'd be pretty well set if that were the case, though.) I guess the best thing I can say is this: if you're reading this and aren't spending at least some of your time making things or doing things, start. Experiment. Try stuff until you find a creative outlet that resonates with you, and stick with it. Keep trying other stuff, too - I discovered (and rediscovered) interests later in life that I really wish I'd spent the intervening years nurturing. But above all, do stuff - your stuff, the stuff you were meant to do. Don't just passively experience other people's stuff. A rock can do that.