Tuesday, October 28, 2014

Life is a game/Game is a life

I was all ready for more unnecessary italicizing of ideas that seemed important to Manovich whenever we moved on to Ian Bogost's "Unit Operations"....excuse me, I apologize.

Anyway, Unit Operations: An Approach to Videogame Criticism has proven to be much less hateful than Manovich, precisely because it's about something that I think we can all relate to and something that was made possible by software...sorry, won't happen again. Bogost explores the idea of videogames, and how they relate to other media, in a pretty interesting way. And the attention is well-deserved.

We as a society are slow to embrace the idea that something we grew up with (and something that is so seemingly "current" that we're at a loss to consider that it has a history beyond our chronological introduction to it) could be worthy of scholarly discussion. Well, maybe it's just me; I never found myself thinking (in the midst of failing yet again to get beyond the basic beginner level in Super Mario) "hey, I wonder what this says about society, and about our interaction with the game versus our interaction (or lack thereof) with other mediums." Cut me some slack, I was a pre-teen.

But I did grow up with videogames; we had the old-school Atari, and I recall fondly the badly pixelated thrills of games that required a joystick and featured just one button besides the one on top of the joystick, and if there was such a thing as "cheat codes" back then, I didn't know it (I've always felt like cheat codes were, ahem, cheating: you cheated the game, and the codes cheated you, by reducing your enjoyment of the game to figuring out ways to beat it that went off the beaten path. I was a bit more law-and-order then, I guess). There was still a filter, of sorts, between the videogame onscreen and your real life, the one going on around you (and the one in which cheat codes were probably used against you, to be honest. It was the Reagan/Bush era, and the nostalgia/homoerotic love-fest Republicans have for that time bewilders me). The concept of an "immersive gaming experience" consisted of Tron, which is confusing as hell when you're a little kid and what you're watching is basically Jeff Bridges in a suit made of night-lights. But nowadays, of course, the game interacts with your real life, in ways that would've seemed impossible to artists back then. I have never played a Wii (there's a fantastic Key & Peele sketch about that; it veers into NSFW territory towards the end so I didn't post it to the group's Facebook page), but I have played Rock Band: it reduces the musicianship of people I admire (and Gene Simmons) to controls on a panel, albeit a guitar-shaped one. The experience of playing live music is turned into a game in which you collect points based on how "well" you "played," and the quotation marks are appropriate. However, the italicizing could be considered excessive on my part.

I thought the discussion of non-game games (i.e., simulations like The Sims or Star Wars: Galaxies) was interesting because those games seem to re-define the purpose of videogames (i.e., the escape from reality that is such a driving force for much of the stereotypical gaming set, the ones that aren't good with basic social interactions). Games have gone from fanciful journeys (hero-quests, to borrow some Joseph Campbell because I too have seen Star Wars and will get around to The Hero With a Thousand Faces at some point) to almost blah recreations of the real world (or in the case of Galaxies, a mundane rendering of what was originally a more cosmic idea). At what point does the idea of "life as a game" cross over from "wow, this is exciting, I get to collect points and do things in real life that I could only do in games" to IRS Audit: The Game, in which you have to navigate the legal and fiduciary responsibilities that come with real-life situations?

Scott Pilgrim vs. the World, to my mind, is the best of the "videogame brought to cinema" movies because it's not actually based on a game; the source material is a graphic novel (which, like videogame movies, is a hybrid of two things: the comic book and the novel-like narrative structure, because a lot of comic books are one-and-done affairs while a graphic novel has the potential to grow over many issues. This is a gross simplification of both comic books and novels, of course, but it works for the example). In the film, Scott has to "battle" the ex-lovers of his current flame, Ramona, in videogame-style contests that recall for me the battles one would encounter in Mortal Kombat (all they needed to complete the illusion was the final "Finish him!" that confirmed MK's bloodlust in the eyes of concerned parents who, as usual, overreacted to something they didn't understand, much as they did with violent rap lyrics or over-the-top slasher films). The rules of real life (you can't go around fighting people, and when they die they don't increase your own chances of living, nor do they turn into coins) are broken throughout the film, because otherwise the film is just a typical romantic comedy with a pretty good soundtrack. In a game world, Scott can defeat the evil exes and inch closer to becoming the kind of guy Ramona can live with. Complications arise, of course, as in games. But the overall feel of the movie, hyper as it is, suggests a videogame with only one outcome: Scott gets the girl. In videogames, there are multiple ways the narrative can end, and even points where it can end before you reach the supposed conclusion (as I've learned while trying to tackle Tetris, you can't really win, you can only hope to keep going).

I've never really looked at videogames as being "worthy" of such critical approaches, but that's not because I overwhelmingly think they don't deserve it. I'd just never considered it, and while I don't buy into the premise that they are always worthy of such discussion (c'mon, Donkey Kong could probably be read as a Marxist text on the fetishization of empty barrels used to crush Italian plumbers, but that's a really awkward stretch), I do think it opens up a new world for serious discussion. I think, in true High Fidelity fashion, that we can be defined by our tastes in pop culture (though in HF it's more about individuals, not groups), and as a group we can be defined by the games we embrace as much as we can the cinema, music, or (increasingly less likely) literature. Videogame studies also embrace a notion I think we've been ignoring throughout the course: that the humanities isn't just literature. There's a philosophy behind even the simplest games, and I think we can *try to discuss it* (ahem, sorry) try to discuss it as seriously as we take the philosophy behind Moby-Dick or Star Wars. I just hope Manovich isn't there to italicize everything...

Tuesday, October 21, 2014

Manovich, Manovich, Manovich!

I started Software Takes Command thinking, "I can handle a book that has a fifty-page introduction, puts important or perceived-to-be-important ideas in italics, and asks the questions that not too many people ask anymore (like 'how does Word work?')." I'm not so sure now, but I have some definite ideas about what this book tries to say.

First off, let me say this: Apple (helmed by the evil Steve Jobs, even after death) and Microsoft have made their living off keeping us away from the actual viewing of hardware and software, i.e., "how the sausage is made." Apart from that one Apple computer in the early part of the 2000s with the inside portion visible through different-colored bodies, both companies have made it a priority to keep the user at a safe distance. This is understandable from a business standpoint (if you can't take apart the computer or product to figure out how it works, you're not likely to succeed in stealing said design), but it also seems to be the real-life Revenge of the Nerds that the films only hinted at. The IT guy is the most important figure in any company, because he (or she, to be politically correct) knows what to do when everyone's computers start misbehavin'. Sleek designs and "wowee zowee!" graphics on our phones (well, not mine, I'm the one guy in America who still has a flip phone) keep us from asking the pertinent question "how does this work?" And that's usually how progress rolls.

Think back to all the various "new things" technology has given us just in the last half of the twentieth century, and how "new" and "exciting" they once were compared to how they are viewed now, seeing as they're more commonplace today. I think Kubrick's 2001: A Space Odyssey plays differently today than it did in 1968, just because we're immune through media saturation to the wonders of outer space presented in the film. If a filmmaker today tried to get away with a space shuttle docking with the International Space Station in a sequence that takes up a good chunk of screen time (not to mention being set to the "Blue Danube Waltz"), he'd be laughed out of Hollywood. Trains were the space shuttles and rockets of the nineteenth century, as Manovich alludes to; gradually, through constant use, the novelty wore off and we stopped asking "how does steam cause the train to run?" Luckily for us (well, some of us), Manovich is here to ask the questions in regards to software.

Some of his insights are worth considering, but I feel like we have to slog through an awful lot of "yes, I know how that works, but thank you for going into exhausting detail for me." I don't want to bash Manovich (I just love that name, it sounds like some crazy-eyed inventor of the nineteenth century), so I'll restrain myself and move on to the next thing that the book got me thinking about.

In the last chapter (which, full disclosure, I haven't finished as of this posting), Manovich describes the incorporation of software advances into film-making, and here's where I get to show off my Film Studies minor (money well spent, state of South Carolina!). What was interesting to me was how Manovich highlighted the use of computer-generated images (CGI) in the 1990s, when the idea was both to leave the audience stunned at how clever and amazing said effects were and also not to overwhelm them with questions of "how did they do that," i.e., break the audience's willing suspension of disbelief.

Fiction, whether in film or any other medium, relies on suspension of disbelief: yes, we know inherently that we're simply seeing still images sped up to suggest movement on the part of human (or cartoon) characters, just as we know the words on a book's page don't really mean that this person we're reading about (be it Captain Ahab, Beowulf, or Kim Kardashian) has ever actually existed. There have been movements to call attention to such artifice, of course, and each time it's done, the practitioners of whatever "shocking revelation about the nature of fiction/cinema/art/whatever" pat themselves on the back and think "gee, weren't we clever?" But the truth is, art needs that disbelief to both be present and also to be suspended, at least until the audience is lured in and can't turn away. And movie effects have been a large part of that.

In the olden days, for instance, a monster in a horror film was just some poor schmuck (probably the second cousin or brother-in-law of the director) stuffed into a suit and told to walk around with a chainsaw or axe in hand to threaten the idiotic teenagers who thought it'd be a good idea to spend a night in the haunted house/abandoned campground, etc. But effects that seem tame today could sometimes be revolutionary at the time, pointing to new avenues for artistic expression (2001 helped George Lucas realize his vision for Star Wars). The xenomorph in the original Alien (1979) was just a guy in a suit, but because of the audience's willing suspension of disbelief, we could believe that this creature was real (and that we wanted nothing to do with it). With the advent of CGI, it was believed that more realistic monsters and creatures could be imagined, without the artificial nature of the creature in question distracting the audience. Of course it meant that actors were usually reacting to something on a green screen, but the poor brother-in-law of the director got to sit down and relax until someone needed to make a coffee-and-cocaine run for the crew.

But as Manovich points out, there's been a shift in the thinking: movies like Sin City thrive not on the realistic integration of CGI effects but on the very highlighting of that artifice for dramatic effect (see, the book had an effect on my writing!). By calling the audience's attention to the very artificiality of what's onscreen, these films play with the notion that disbelief needs to be suspended through realistic action, in a sense.

Not to exhaust the point, but consider a film that's been made and remade a couple of times: The Thing (1951, 1982, 2011). In the original version, directed by Howard Hawks (yeah, I know the credits say Christian Nyby, but it's a Hawks movie through and through), the alien creature that threatens the camp of intrepid Arctic scientists is basically a giant carrot, played by James Arness as a walking, grunting "Other" that can be defeated using good old American know-how (and electricity). In John Carpenter's version, and the "prequel" that came out almost thirty years later, the alien is able to assume the identities of the guys in the camp, one at a time, and form a perfect copy that is convincing up until it's revealed as such. In this case, special effects of the real-world kind play the bad guy or guys: characters who are revealed as "Things" stretch and twist into grotesque manifestations of your worst nightmare when it comes to having your body torn apart. The most recent version (which I haven't seen much of, beyond the trailer) does this as well, but through the "magic" of CGI. We have the classic attempt to integrate CGI effects so that we notice them without being so distracted by them that we forget what's going on onscreen (at least that's the filmmaker's hope). In that sense, the 2011 version is not only a return to the premise of Carpenter's version, it's also a return to the "antiquated" idea of CGI that's integrated into the film and yet still noticeable. Once again, state of South Carolina, your money was well spent on my Film Studies minor.

As someone who's interested in art, it matters to me whether CGI effects dominate a film (like Sin City), calling attention to themselves, or try to blend in (the most recent batch of Star Wars films) without necessarily doing so. No one is saying that having an actual person (again, the poor brother-in-law of the director) in a monster costume is infinitely better than having that same monster rendered by CGI (well, some people are; I think it's a case-by-case basis, myself), but software continues to redefine the logic and physics of film-making, and it will be interesting to watch what sticks and what falls by the wayside in terms of computer effects.

Monday, October 13, 2014

How We Think

I have never done illegal narcotics, nor too many legal narcotics, in my lifetime. There was that one time I passed a car whose occupants were obviously smoking a joint (the work buddy I was walking with helpfully pointed out that that's what pot smells like) and got a little contact high, but beyond that and the occasional alcoholic experience, I haven't much time for the drugs, as the kids might say. It's not that I necessarily have a moral stand against an individual person's right to enjoy a hit of reefer every now and then, and I honestly think the drug "war" would be a lot less wasteful if we legalized some stuff that isn't legal now (I'm sure the drug cartels will find some other way to fund their operations, perhaps by branching off into highly addictive coffee beans). I just know that my mind is weird anyways, without any outside help.

How We Think by Katherine Hayles is a bit like my explanation of why I don't do drugs, in that it chronicles in the latter stages (heh-heh, "chronic"-les...sorry) the rise of more computer-friendly fiction. There's a huge argument going on in the book about narrative versus databases (i.e., the stuff we study as English and humanities majors versus the stuff we use to store the info we've gathered), and I have to say that I was intrigued by the discussion of the two works that Hayles cites (The Raw Shark Texts and Only Revolutions) because I tend to gravitate towards the odder end of the literary spectrum, dipping my toe in with some authors while fully embracing some of the more fantastical writers (Pynchon, some DeLillo, William S. Burroughs, Vonnegut). I don't always understand what I'm reading (at least I'm honest), but I find the journey enjoyable in and of itself.

I couldn't help but think of Burroughs' work with the "Cut-Up Trilogy," three books that he fashioned together out of cut-up words and phrases from other publications, when I started reading about Raw Shark. I haven't read any of the trilogy, nor the Raw Shark novel, but I think that sort of experimentation, playing with narrative expectations, can be exciting (well, occasionally frustrating, but exciting too). I read Naked Lunch over the summer, and straight through; when I read on Wikipedia that Burroughs had meant for people to be able to start wherever they wanted and skip around as they chose (a sort of junkie Choose Your Own Adventure), I wondered if I'd read the book wrong, or if there was *any* right way to read it (this was Wikipedia that I was looking at, of course, and someone could have added that detail as a goof or an inaccuracy). There's a certain sense of playfulness in the descriptions of both Raw Shark and Only Revolutions, as if, while both works have their seriousness, they have an anarchic side too, something that deviates from the path. Something that makes the reader less passive than he would normally be.

All that said, I might be hesitant to actually try to *read* either of the books mentioned. I remember loving the movie Trainspotting when I saw it (still the best depiction of Scottish heroin junkies I've ever seen, by the way), and I was excited when I found the novel that the film was based on at a local bookstore. I got it home, turned to the first page, and was gobsmacked by the heavy Scottish dialect of the first few pages. I literally got a headache (I'm not exaggerating for comic effect). I stuck with the book, however, because (thankfully) it was a multiple-narrator novel (really a collection of short stories that fit together, from differing points of view), so the heavy Scottish dialect sections weren't the whole of the story. Several years later, I turned to the opening page of Finnegans Wake and decided after reading a couple of lines that James Joyce was batshit crazy, so I stopped.

I think, in the clash of narrative v. database, we'll see a happy (or unhappy) marriage of the two as time and technology progress (at least until Skynet wipes out humanity). As Hayles argues (and I agree with her), we are a narrative-based species, always searching for the story of how we came to be, or how we came to live in the places that we live, or why it is that we die, what happens when we die, etc. The ghost in the machine may be our need for a narrative, after all; databases store information, and they do a damn good job of it, but so far they can't tell a story. But narratives let us down too, in need of constant revision as more facts become known (just look at the narrative that the NFL was trying to sell us on the whole Ray Rice incident, before the second video came out, to cite a real-world example). You constantly hear "narrative" used with cynical connotations (such as "what is our narrative for the way events unfolded in Iraq"), but it's one of our defining characteristics. That being said, a database can provide information that wasn't known when we crafted our original narrative. It's a brave new world of narrative-database hybrids, as represented by the two works Hayles cites. It may be a bit over my head, but I'm on board with at least trying.

Tuesday, October 7, 2014

The Ecstasy of Influence

In Matthew Jockers's book Macroanalysis: Digital Methods and Literary History, we get an expansion on the premise of the previous book (Moretti's Graphs, Maps, Trees), specifically in the idea of "mapping out" or charting the ways in which books relate to one another (and differ) due to issues of gender, nationality, and chronology. I found this book frustratingly interesting, if that makes any sense. I am not mathematically inclined, so all the citations of various equations used to arrive at algorithms and whatnot may have gone over my head or led me to jump ahead to the next paragraph. But I liked the idea of trying to chart books through more than just close reading.

Now seems like a bad time to admit this, but I've never been that good of a close reader (or that close of a close reader, if you prefer). In my years at Tri-County Tech and further onwards, speed was the name of the game; I had to have sections 1-7 of The Odyssey or whatever read by Tuesday, and be ready to talk about them in some depth. Oftentimes, speed trumped close attention, though as a lifelong (it seems like) English major I could bullshit with the best of them. Sometimes I got more from the class discussions than I would've gotten from just reading the text alone. So while I go to the Church of Close Reading, I often spend more time trying to get to an arbitrarily-chosen page number to round out my day of reading.

That being said, I can see why close reading is essential to literature studies; if you didn't pay attention to Moby-Dick, you might think it was a happy adventure tale (or if you watched the Demi Moore film of The Scarlet Letter, that Indians attacked the Puritan town, and Hester and Dimmesdale were able to flee to safety together). But as my Literary Theory class is good at pointing out, there are many ways to read a text (to even close-read a text); you could be all about the text minus any contextual grounding or even authorial biography (i.e., Formalism/Structuralism/Post-Structuralism) or you could be all about all of that (Historicism). Word definitions can change over time (gay means something totally different to an older generation, no matter how many times my friends and I used to snicker about the former Myrtle Beach landmark "The Gay Dolphin"). Works can be neglected in their time, only becoming relevant when a new generation discovers them (think of all the cult films from the Seventies, box-office flops or cheap cash-in genre pieces that are now elevated to the ranks of Citizen Kane in some critics' eyes) or misinterpreted then and even now ("Civil Disobedience" is far more inflammatory when you see it through Thoreau's eyes than through those of Gandhi or MLK).

I found it interesting that Jockers tries to advocate for both close and "distant reading," as he points out that any truly scholarly attempt to document everything from a given time period (like English novels in the 19th century) is ultimately doomed because of the sheer bulk of materials at hand. Also, we can get a distorted view based on our limited resources or perceptions (the discussion of Irish-American literature was fascinating, because in all honesty I never thought that Irish-American literature existed in the West, not in the 19th century). I laughed a little at the word cloud for topics or phrases that came up in English novels from the 1800s (because "hounds and shooting sport" was the largest such phrase to crop up), but there are valuable insights to be gained from such "unconventional" searches. Any attempt at a close reading of primary sources from just about any era would be a daunting task and (as Jockers suggests) impossible to achieve during the course of a normal human lifespan.
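Just to make the "distant reading" idea concrete for myself, here's a toy sketch. It is my own illustration, not anything resembling Jockers's actual methods, and the three-line "corpus" is made up to stand in for thousands of digitized novels; the basic move is simply counting which phrases keep turning up across the whole shelf rather than reading any single book closely, which is roughly the raw material for a word cloud like the "hounds and shooting sport" one.

```python
# A toy illustration of distant reading: count which two-word phrases recur
# across a whole "shelf" of novels instead of closely reading any single one.
# The corpus below is invented; Jockers works with thousands of digitized texts.
from collections import Counter
import re

corpus = {
    "novel_a": "The hounds and shooting sport filled the long afternoon.",
    "novel_b": "Talk of shooting sport and of hounds carried through dinner.",
    "novel_c": "She thought the shooting sport a tedious excuse for absence.",
}

def bigrams(text):
    """Lowercase the text, strip punctuation, and return adjacent word pairs."""
    words = re.findall(r"[a-z']+", text.lower())
    return list(zip(words, words[1:]))

counts = Counter()
for title, text in corpus.items():
    counts.update(bigrams(text))

# The most frequent phrases across the corpus -- the raw material of a word cloud.
for (w1, w2), n in counts.most_common(5):
    print(f"{w1} {w2}: {n}")
```

Scale that up to a century's worth of novels plus metadata about gender, nationality, and chronology, and you start to see why Jockers needs actual math rather than a glorified tally sheet.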

I'm drawn more to late-20th century literature as a reader, because on some level I've always wanted a class that could include Salinger and Pynchon as well as the more established leading lights of literature (and I did, while I was an undergrad; I had a World Lit class that included Murakami, and a 20th-century lit class that covered everything from Camus to Toni Morrison), so it's disheartening to realize that a lot of the tools used on older, public-domain literature might not be applicable to more recent work because of copyright restrictions. But given enough time, more and more work should be available for downloading and disassembling (perhaps more likely in the lifetime of any hypothetical children I have in the next twenty years or so, but still). For now, we have to make do with Jane Austen (whom I enjoy) and other dead authors who can't sue us for using their work in ways that they couldn't have imagined.

I worked at a public library for about a year (late 2009 to mid-2010), and this was at what seemed like the height of Twilight-mania, as you had the movies starting to appear (in all their moody, under-acted glory) and copycat books appearing by authors who may or may not have been "influenced" by Twilight (but who were certainly influenced by the dollar signs that appeared for any book that rode that wave). I'd constantly see patrons (mostly teenage girls) come in and request variations on the supernatural themes of Twilight ("I'm in love with a boy who, at midnight, turns into a 1973 Ford Pinto" and so forth), and you literally couldn't tell them that these books were garbage (mostly because I didn't read them, and also you weren't supposed to make fun of patrons' reading habits. Tom Clancy still had a huge audience amongst middle-aged white guys who disliked then-brand-new President Obama). When Jockers started to chart some of the works that had those kinds of similarities over a number of years, it made me think of all the vampire-esque books we had at that point in time, the seemingly endless variations on the particular theme that cropped up almost overnight. Obviously, books take a long time to come together, but the sheer confluence of Twilight-Lite works made me wonder if this wasn't planned years in advance (granted, the Twilight books must have pre-dated the movies by a certain number of years, though I'm too lazy to look that up on Wikipedia). It's humbling to realize that "literature with a capital L" could often be as trendy as hashtags on Twitter about various celebrities or whatnot, but also humanizing. For every Stephenie Meyer or Bram Stoker, there are a million what-the-hell-is-their-name(s) out there jumping on the bandwagon. But bandwagons get awful crowded, and people have a tendency to fall off.