
Do Gamers Understand Art? Part III: Direction Without D-Pads

by Sean Brady, 20 March 2013

Main image courtesy of Drew Northcott

A couple of weeks ago, Sony held an "event" of some kind. The term is put in quotes for a particular reason: It is hard to describe exactly what happened, even though a lot of things did happen. There was talk of games, of new ideas, of commitments, and of other things. Supposedly, a console was announced, the successor to the PlayStation 3, with a somewhat upgraded controller, though one would be forgiven for somehow missing that. For all the talk, something was utterly lacking from the PS4 event: Something to actually relate to, a physical item to talk about. Instead, we had a pep rally, where nothing actually happened, only people gallivanting about trying to raise the morale of their medium, video games.

Such actions are unbecoming of an industry and medium that seeks legitimacy with the rest of society. As every developer moved across the stage, any outsider who happened to stumble upon this "event" would have wondered whether they had entered a self-help rally or the inchoate rituals of a cult. BuzzFeed's Joseph Bernstein thought he was watching a church service, aimed only at gamers. His analogy, though a stretch of the imagination, is not entirely far-fetched: Everything seemed intent on saying "This is what we can do for you, gamer," in a language that could not be understood by the average person. Nothing suggested that this was something that could draw people from all corners, the way an Apple product announcement or a trailer for an upcoming film tends to do. It was gamers-only, screw the rest. Even the appearance of "maverick" game designer Jonathan Blow, who formally debuted his new artistic vision, Myst clone The Witness, did not change that sense of isolation.

As the gaming press and gamers praised the new, intangible PS4 as a whole and nitpicked the details of the event, one could see all the problems inherent in the culture: A continued path of isolation over inclusion, a complete detachment from the changes in the world around them, and products and media that were merely prettier, not more creative in approach. The praise for Sony's commitment to "core gamers," a strange term that entered the vernacular in the last few years to refer to gamers who own consoles and/or gaming-focused PCs, implies an utter willingness to segregate those gamers from the new developments in the social and casual markets, as though they were some (un)tainted legion of loyalists. Most unfortunate of all is that Sony, which brilliantly implemented the idea of using other media as a gateway to introduce people to video games with the PS2 and (to a lesser extent) the PS3, has all but turned its back on this strategy in the name of catering to the "core" gamer.

Such appeals to a base demographic bring about many, many problems. One can see this in certain political parties around the world. One can also see it in music criticism's recent love affair with creating microgenres, from witch house to vapourwave. When a culture is isolated in such ways, its output often degrades in quality, which is then followed by echo-chamber criticisms that the creators were simply not acting in the purest interest, which in turn leads to a death spiral of continued self-immolation. Many cultures and art forms end up disintegrating as a consequence. With Apple having steamrolled the handheld market once dominated by Nintendo and Sony, and now setting its sights on the living room, video games as most gamers know them will likely face a similar fate of utter redundancy and irrelevance.

Despite this ominous situation, gamers have the audacity to continue making the case that gaming is not only strong, but also a legitimate art form. These people barely have an understanding of the medium they are supposedly defending, and cannot even define it without having to show non-gamers what they mean. How can a gamer understand art if they cannot explain gaming itself using simple words? I demonstrated this problem in my prior essays, and it represents the real obstacle to establishing gaming's legitimacy: A way to relate to the general populace, so that the medium can integrate with it.

There is a way out of this mess. In fact, there are quite a few ways out of this situation, some of which will be discussed below. But changes have to be made to the culture and the industry, and they have to be made not through the lens of a gamer, but with the mindset of those who have never touched a controller of any kind in their life. At the most basic level, some hard truths about the current state of gaming must be acknowledged. On a higher level, developers and gamers must take a look at who can be a gamer or a developer. Finally, we must develop a better way to interpret and criticize the medium itself. While this is the hardest path, it will certainly lead to people being, at the very least, more respectful of the medium.

Establishing Some Truths

The first and most basic fact about video games that we must come to terms with is one of definition. What is a video game? Can we even answer that question in a way that makes sense to people, that is, without example or internal reference to the culture? Not yet. The medium is and remains undefined to the masses. Both the masses and we gamers have an understanding of what the medium can be, based on what we have created and played. But we tend to have a narrow scope of what that can be, shaped by when we started playing and the extent of our playing, and thus do not entirely know what it is.

Most importantly, though, we have yet to reach a limit on what we can do in video games. The use of motion-sensing technology in the soon-to-be-previous generation of consoles, and of touchscreen interfaces and interactions in the current generation of handhelds, demonstrates this continuous expansion of technological innovation. This situation makes the medium hard to define with any degree of finality. It creates a setting in which every gamer and non-gamer has their own interpretation of what the medium is, without a definition that carries a consensus. Left with these differing interpretations, the best gamers seem to do is use not whatever commonalities might make the medium understandable, but the commonalities they alone accept, to determine what is or is not a video game. That is not definition, but dogma.

If we are to move out of stagnation as a culture, we must acknowledge that we have no acceptable definition for video games, and we may not have one for quite some time. Arriving at one requires using language that is understandable to people outside of the medium, something we may not have even developed yet. This is something that must be worked on. If someone like Ken Levine or Justin Wong can sit down in a subway car next to an old lady, explain what it is that they do and why they do it, and manage to get that lady to understand the medium at a basic level (perhaps even inspire her to pick up a few games to play), then we will have made considerable progress. We are not there yet, but it can happen.

The second important truth about gaming relates to the actual size of the industry. As I wrote in the previous essay, the industry was set to be valued somewhere in the range of $78 billion in 2012, based on current trends. Of that figure, $70 billion came from the computer, console, and handheld (or "traditional") market, and $8 billion from the mobile market. In discussing this truth, let us focus on that $70 billion. That is a very large figure, one that puts the industry above music and television and just below film (which, according to IBIS, is valued at around $87 billion). Numbers like that make it sound like a very legitimate industry.

However, that number is misleading. The size estimates are based on revenues, or how much money is made from selling stuff. Video games as a medium and industry carry a distinct handicap in this regard, by mere virtue of the cost of the media itself: Software, when bought new, tends to cost on average $40-60 per copy, depending on the platform it is played on. This makes video games the most expensive medium, far higher in cost than the average film ticket, DVD/Blu-ray, single, or album. Even in the profit-making/price-fixing heyday of the music industry, albums on CD rarely exceeded $20, while video games maintained a $50 average. Add to that the high cost of consoles and handhelds that (with some exceptions in the last two generations) served no other multimedia purpose, and the video game industry looks a little more niche than mainstream.

A better way to determine the actual size of an industry is by the number of units sold, especially in the case of media. Given the extreme difficulty of acquiring that information for any industry, let alone for a publication such as this one, there is a simpler way of gauging the size of the video game industry: Reduce the sales revenue of the traditional market by a factor of five. Dividing by five effectively reduces the cost of software to $8-12, placing it in the median price range of all media sold. When that is done, we see an industry whose value is more in the range of $14 billion, which, combined with the mobile market (where software tends to cost less in general), adds up to $22 billion. This puts the size of the industry at something even lower than television.

Now, the methodology is clearly unscientific, and fails to take into account other forms of sales revenue, including microtransactions, paid downloadable content, and paid subscriptions for games (primarily in the MMO meta-genre), not to mention hardware sales. So these adjusted numbers should be taken with a grain of salt. However, the exercise does illustrate an important truth: That the gaming industry and culture are far smaller than the revenue numbers indicate. The sales revenues exaggerate the actual size of the culture, and create a false sense of legitimacy and supremacy when handed to a gamer. This must be acknowledged.
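
For the curious, here is a minimal sketch of that back-of-envelope adjustment, using only the rough revenue estimates discussed above (none of these are official figures, and the factor of five is the same admittedly unscientific shortcut):

# Back-of-envelope adjustment of the 2012 games-industry revenue estimates.
# All figures are the rough estimates discussed above, not official data.

traditional_revenue = 70e9   # computer, console, and handheld sales (USD)
mobile_revenue = 8e9         # mobile game sales (USD)

# Dividing traditional revenue by five roughly rescales a $40-60 game
# to the $8-12 range of a ticket, single, or disc.
price_factor = 5
adjusted_traditional = traditional_revenue / price_factor   # ~$14 billion

adjusted_total = adjusted_traditional + mobile_revenue      # ~$22 billion

print(f"Adjusted traditional market: ${adjusted_traditional / 1e9:.0f} billion")
print(f"Adjusted total market: ${adjusted_total / 1e9:.0f} billion")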

It is far easier for, say, Modern Warfare 3 to reach the $400 million sales mark than it is for a Hollywood blockbuster to perform the same feat. A movie would have to sell at least 50.25 million tickets to make that mark, based on the average cost of a movie ticket; Modern Warfare 3 needed to sell only 6.5 million copies, which it did in 24 hours nearly two years ago. At the same time, however, such an economic victory becomes irrelevant in the cultural zeitgeist by matter of sheer numbers. To compare with a movie that achieved a similar feat: Harry Potter and the Half-Blood Prince took three days, its opening weekend, to reach nearly $400 million in sales. But more people saw that movie in its opening weekend than have bought and played Modern Warfare 3 in its entire existence. People will remember the movie more than they will the game because of this, and because those numbers mean that enough people have analyzed, criticized, loved, and/or hated the movie for it to have some cultural significance. More importantly, though, Hollywood is capable of producing such movies at a regular enough rate that at least a few per year gain that sort of significance. The last game to have such significance is Minecraft, officially released in 2011 but playable since at least 2010, and that is due to the continued play of millions of users, an unusual feat for a video game.
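
To make the sheer-numbers point concrete, here is a small sketch of the arithmetic; the ~$60 game price is an assumption, and the ticket price is simply the one implied by the 50.25 million figure above, not a sourced statistic:

# How many units must each medium move to gross $400 million?
# Prices are approximations: ~$60 per new console game, and the
# ticket price implied by the 50.25 million tickets cited above.

target_gross = 400e6

game_price = 60.0
ticket_price = target_gross / 50.25e6         # ~$7.96 per ticket

copies_needed = target_gross / game_price     # ~6.7 million copies
tickets_needed = target_gross / ticket_price  # 50.25 million tickets

print(f"Copies of a $60 game: {copies_needed / 1e6:.1f} million")
print(f"Cinema tickets: {tickets_needed / 1e6:.2f} million")
print(f"Audience ratio: {tickets_needed / copies_needed:.1f}x")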

A Better Gamer

From this point, in acknowledging our limitations and our size, we can become a better culture overall. Such changes must start at the most basic level, that of the gamers themselves. In this respect, several things can be done, but many fall under the jurisdiction of behavioral change, an area that is volatile in its own right. Changing gamer chauvinism is a tricky matter, simply by the nature of the Greater Internet Fuckwad Theory and its recent modified variant. Behavior is a matter better left to the sociologists and the politically inclined.

Instead, to make for a better gamer in a way that works, we must focus on the gamer's personal understanding of the medium, and reshape it to be more open to different things. The fanboi-esque nature of gamers, while it has declined significantly in recent years thanks to third-party developers eschewing exclusivity for cross-platform releases, still remains a dominant force in the culture. With the development of the Ouya, such zealotry may return, thanks to a devotion among some in the open source/free software movements that is comparable to the militancy of atheists and secular humanists.

The vision of the Ouya is quite different from the vision of the PS4 and the Wii U, and different again from whatever machinations Apple and Microsoft are developing for the console field in the next year or so. There is no single best or worst path, and all of them are valid ways of progressing video games as a medium. The concept of the console wars that dominated the discourse in gaming for about 15 years has lost its meaning, and it is long past time to move on from the notion that one console really is better than another on the level of hardware or software selection. More importantly, though, we must be more welcoming of new developments coming in the form of smartphones and tablets. Given everything else those devices can do, a far greater pool of people will be drawn to them, and thus possibly be enticed to play a few games.

Therein lies the greatest key to being open: Letting the definition of a gamer become something more than the "core" gamer who invests significantly in a console, handheld, or specially-designed computer, and accepting that. A gamer can be someone who plays simple games on their iPhone, or someone who plays around on a tablet with variable frequency. Angry Birds and other games in the casual market have done far more to bring in gamers from various walks of life than the likes of League of Legends/Defense of the Ancients, the Call of Duty series, the Grand Theft Auto series, and any RPG Square Enix has conjured ever could combined. It is time to acknowledge that simple concept, and welcome those players into the fold.

Furthermore, one can inspire some of these new players to try more complex and interesting games. If file sharing and the Internet can inspire people to listen to more diverse genres of music, then any gamer can make an honest effort to bring more gamers into the fold, provided they are willing to accept that others' tastes may differ from their own. In turn, by bringing normal people into gaming and interacting with them in a way that is not disdainful, gamers gain a better insight into their medium, and build an understanding of art that not only transcends what they understood art to be from video games, but helps them define video games in terms of art itself, not the other way around. Through doing this, gamers can also find further enjoyment in the games they love through the new experiences of others.

A Better Developer

As gamers open up their enclave to mainstream society and attain a better understanding of art and of their own playing through others, so too should developers open up their ranks to the creative and not-so-technically-inclined, to build a stronger and more distinct product, one that actually merits artistic criticism. To be fair, this issue is a symptom common to any industry with a near-symbiotic relationship with technology: Businesses in these industries tend to be made up of engineers and technicians who carry the mindset that the only creative thinking allowed is in matters of logic and problem-solving. Consequently, they exert a tribal mentality that inverts standard business staffing: Rather than the company shaping the employee after hire to fit within its internal culture, the employee must already fit the culture before he or she is even hired.

The technical aspect has to be addressed in a certain way. Continuing to simplify the means to program and create games is one way, and that was addressed in the previous essay. But, as noted, there is only so much that can be done to simplify programming over a given period of time, and the technology advances at a rate faster than the means to make use of it. And even a basic understanding of certain frameworks may not be of particular use to developers, who essentially build their own in-house tools when creating games. So such a suggestion is best left for the DIY, mod, and indie communities. A wider availability of in-game developer tools to the community at large, so that more aggressive modding can be done, might provide some relief on that front. But such developments are limited to computer-based games, and even this suggestion does not address the issue of bringing in creatives with no programming experience and/or no inclination to learn it.

Instead, developers should seek ways of integrating creatives into the staff in a way that allows them to contribute without feeling out of place. There are examples of this already at some of the larger developers, like BioWare, which has its own writing staff, and Lionhead Studios, which used a film director in developing games such as Fable II. But there is room for improvement. The key will come across as completely off-base, but it actually works: Hiring people with creative backgrounds but absolutely no programming background to shape the design elements. This includes composers, audio engineers, writers, editors, directors, and concept artists. These people can have some technical background, at least enough to use either the tools given to them or the tools they are already acquainted with, be it Pro Tools or Adobe Illustrator. But they should not need to know the basics of any framework or programming language from the beginning. Programmers exist for a reason, and they should not also have to be sound engineers or direct motion capture actors.

It is not hard to hire quality people like this, and skimping by relying on freelancers shows that a developer has no interest in moving the medium forward. In fact, it is pretty easy and affordable to hire creatives, by virtue of the "starving artist" attribute: For example, if a game requires a narrative, hiring an entire editorial staff (say, 5-7 writers and editors) would be more affordable than bringing a few designers with writing experience onto the staff. A medium-to-large developer could very easily hire a non-celebrity writer full-time for a yearly salary of £16,000 at entry level, depending on the location; odds are that is more than the scraps they currently earn from service industry work and a greatly underpaying freelance market, and they will deliver a higher-quality narrative, one more likely to have some artistic merit, than a freelancer or a designer with writing experience would. The same applies to other creative types.
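
As a rough sketch of that affordability claim: the £16,000 entry-level writer salary comes from the figure above, while the £35,000 for a designer with writing experience is a purely assumed, illustrative number, not a sourced one.

# Rough cost comparison for staffing a game's narrative (GBP per year).
# The writer salary comes from the figure above; the designer salary is
# an ASSUMED, purely illustrative number, not a sourced one.

writer_salary = 16_000      # entry-level writer/editor
writers_hired = 6           # middle of the suggested 5-7 editorial staff

designer_salary = 35_000    # assumed: designer with writing experience
designers_hired = 3         # "a few designers"

editorial_cost = writer_salary * writers_hired       # £96,000
designer_cost = designer_salary * designers_hired    # £105,000

print(f"Full editorial staff: £{editorial_cost:,} per year")
print(f"A few writer-designers: £{designer_cost:,} per year")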

Plus, once a strong relationship is established with the creative part of the staff, it can serve as a useful balance to a game's design. Consider Metal Gear Solid 4: Guns of the Patriots. The game is highly representative of Hideo Kojima's writing style, which is to say repeating plot points, lore, and concepts over and over, and explaining what is happening as though the gamer were still in primary/elementary school. If an editor had gone over Kojima-san's head and cut out even just the redundancies in the whole script, there is a good chance the length of the game would have been shortened dramatically, and the story would actually be remembered as something more than a punchline in terms of quality. The creative team, by checking the design team, can point out the limits of what can be done with a game's vision in development. This also works the other way around: Designers and producers can check the creative staff's imagination, setting the limits and restraints within which they must work. Conveniently, this generally has the benefit of forcing creatives to get clever and imaginative with their work, which increases the artistic quality of their output. In assembling a creative team as part of the staff of designers and programmers, developers can legitimately say that they have the means to understand art, and to make video games with at least an understanding of art embedded in the staff.


A Better Critic

Now, we reach the final contingent in the gaming culture that needs changing, which resides in the gaming press. It is not the journalism that needs to change, but rather the reviews that consider themselves criticism. As noted in the previous essay, the writing that gaming reviewers produce leaves much to be desired at best and, at worst, displays ad-lib-level skills less competent than those of a high school essay writer (or a reviewer at the Consumerist/Consumer Reports). It reflects an assumption that gaming should not actually face real criticism of the experience, and should instead be benchmarked against values the readership is not entirely aware of. This shows especially in those publications that use scales or summaries for certain aspects of the game (sound, graphics, etc.), or variations thereof (like GameSpot's quaint achievement-like "game emblems"). The values are constantly changing, and there is no standard or media theory upon which these reviewers make their assessments. In addition, the maddening rush to get your review out first has become loaded with problems, primarily creating an utter detachment between gamer and reviewer when the review fails to match up with the actual experience.

There are ways to improve this dramatically. First, reviews should be written of final, complete products. What this means is that, if there are still game-breaking bugs at launch, the publication should either wait to post any review until most major bugs are fixed, or post a review that does not talk about the game at all except for the bugs. The fiasco surrounding the recent launch of EA's SimCity reboot brought many columns about EA's failure to properly launch an online title, but very few on the fact that many reviewers and publications with pre-release copies and access had to scramble to produce an assessment that matched gamers' ire. (Surprisingly, PAR's Ben Kuchera put up a thoughtful essay on that scramble.) This is not the first time this has happened: In 2001, Peter Molyneux's attempt at redefining the god game genre he helped create, Black & White, got several glowing reviews and awards, and yet gamers were fuming over several major bugs, including one that made the final level of the game impossible to finish (I remember these bugs with clarity), and they would not be fixed until two months after launch.

Many games have been released since then with similar problems, with nobody looking straight at the source of the problem: An inherent desire by critics to get a review out as soon as possible so as to generate high traffic, and thus ad revenue. This problem is, of course, not limited to gaming, as demonstrated last month by music criticism's egregious stampede to post reviews of My Bloody Valentine's incredibly-long-awaited third album, and it is a worn remnant of print media's need to have reviews completed several weeks in advance of an issue's newsstand release, due to press runs and distribution. But with that format of media disintegrating, writers should just give up the practice, for the simple reasons that such a "FIRST" and exclusivity mentality destroys the quality of the writing, and does not necessarily guarantee high traffic and subsequent revenue. As it stands, such reviews are already beaten by internet users posting their own opinions and scores on forums, sites like GameFAQs and Metacritic, and other places. Rather than rushing a review, reviewers should focus on making it read well.

In this focus on quality, the overall style should change to actually involve critical thinking. This includes purging discussion of the technical aspects unless it is an absolute necessity. Consider the average film or music review. A film critic rarely discusses the camerawork, the post-production, or the foley, unless the discussion takes an aesthetic angle (say, whether such things appropriately fit the movie). A music critic barely discusses the mixing and mastering, if they do at all. Only gaming reviewers seem intent on bringing into their reviews statements akin to "The graphics of this game are good/hideous/bad," and similar statements about controls or sound or features, and that has to go. Similarly, one should not go about describing game mechanics in a review, though discussing them can be of some use. Let the experience determine the review, not the technical specifications.

Finally, for all the talk of "experience" from several publications, we cannot seem to understand that experience means talking about what we see, what we feel emotionally and mentally, and what we hear, as well as the additional gaming-related dimension of what our interactions do for the game. Many reviews, even those that wish to emphasize that "experience," tend to sideline it in favor of summarizing game events and issues for players, some of which they already know through pre-release press. The problem is, like Kojima-san's written redundancies, they do not always need to know that. Instead, a reviewer, in becoming an actual critic, should detail the experience, what it felt like playing that game, and let that be the framework and content of the review. Gamers can determine for themselves, either through play or through marketing material, what the game is about.

In making reviewers into better critics, several things can be done to help speed the process along. Having at least some academic understanding of media studies and theory, or even studying specific schools of media theory or theorists such as the Chicago School, Paul Lazarsfeld, and Marshall McLuhan (or even philosophers such as Fredric Jameson), would be of great use in establishing critical thinking that can apply to video games. Over time, one could even use these theories to create an actual school of thought surrounding game development and play, though that is not entirely necessary. The application of art theory is always helpful as well, or even just some experience in actually making art and understanding what one did in creating it. Finally, establishing an actual canon of games that define the medium is of extreme importance. By canon, we do not mean a "greatest games" list, but rather games that codify aspects of gaming. They do not always have to be good, and they can even lose luster over time. But the point is that they represent important parts of the medium's development and existence.

Making these changes and developing them is essential to establishing the medium as an art form. While gamers must determine what video games are, critics must determine what they mean. Through this, they understand art, and thus can determine and prove whether video games are art. Publications like Polygon and Giant Bomb do not represent this, though they make some effort. You see it more in the work of Kill Screen and Action Button, and of writers like Tim Rogers. Either way, by creating a basis of understanding through theory and some study, critics can answer that damned question, "Are video games art," assuredly, without needing to resort to proof.

Integration, not Ascension

Are video games art? I am not certain if they are now, but I know they can be. Using McLuhan's definitions of media, video games represent a hybrid medium built for the purpose of play. As a virtual extension of man that possibly combines multiple extensions into a single form (depending on the game), what video games build their existence on is not the interaction of the players, but the response of the games themselves as a simulation. Whether bending towards the hyperreal or towards a simulacrum based on one's imagination, video games as a medium allow a person to experience them either actively (by personally controlling the game) or passively (by watching others play, or by running an AI-controlled simulation or demonstration). That each of these simulations results in a different experience for the person allows for diversity in interpretation, and presents the potential for an experience to yield something artistically significant and worthy of the general definition of art. Whether that has been achieved remains uncertain.

Now, what I just said above most certainly has some holes in its thinking, and I have little doubt that if Marshall McLuhan were still alive today, Woody Allen would grab him and he would utter "You know nothing of my work" to me. But I will be damned if it is not better than most of the affirmations passed along by other gamers and related critics. It is an explanation that is certainly built on theory, but more importantly it is based on an integrated understanding of the world, one in which video games hold an equal place in my being with other media and other aspects of living.

That integration, ultimately, is what will bring gamers to understand art, and to define video games as art. The assumption that the medium carries this weighty, elevated significance that allows it to ascend above or transcend the assessments of critical thinking and media theory is foolhardy, and is the type of thinking that slowly implodes cultures. Video games are not special. They are a medium like any other. Cultural integration, which has already taken some shape over the last generation of video game consoles, is the best way for the medium to gain acceptance, as the exchange of ideas allows for gamers, developers, and critics to improve the culture, and in turn the medium itself.

When the famed critic Roger Ebert penned his thoughts on gaming, he claimed "No video gamer now living will survive long enough to experience the medium as an art form." He may not be entirely wrong: As long as gamers mostly fail to understand what art is, and the culture they reside in maintains an insular attitude that refuses to define the medium that is its basis, gaming will continue to be seen as an inferior medium, with acts of sycophancy like the aforementioned PS4 "event" defining the medium in the eyes of the masses. But we can do better, and will do better with a little bit of effort. We're in World 5, but what we seek is in World 8. Just a few more levels before we can get there.

Sean Brady is a writer and editor for music website Tiny Mix Tapes, and is based in Oakland, California. He has played Spelunky 800 times, and he still hasn't beaten the damn game yet (stupid ghost...). Follow him on Twitter.
