One finds it hard to believe that the video game industry would have trouble being seen as legitimate in the eyes of the general populace. Reuters reported that the industry took in revenues exceeding $78 billion last year. That figure likely puts the industry well past the music industry as the second-largest form of media by revenue (behind film), if current trends continue. On the distribution side, the development of the Steam ecosystem has made games easier to purchase and receive, while minor changes to copyright rules have lent some legitimacy to abandonware and the emulation of defunct consoles. If the industry continues its robust rate of growth, it may even surpass the film industry in the next year or so and become the largest medium by revenue, with figures rivaling the GDP of some advanced nations.
Yet the issue of legitimacy makes video games the subject of heated debate between their defenders and the rest of society. For the many who do not play video games, the very concept ranks just above pornography in legitimacy, if not below it. I know many friends, male and female, who hold video games in utter contempt. In discussing these essays, a close friend stopped me three seconds after I mentioned the medium, saying "Let us not talk about that, please," followed by a laugh, dismissing the medium as a "silly thing not worth mentioning." This, after a robust conversation about a film we had just seen.
In many ways, this contempt brings misunderstanding and the wrong kind of attention to the medium. Some countries, particularly Germany, have laws limiting the type of content in video games sold there. In Australia, where a government-run body rates films and video games (and thus affects their sale in the country), it was only last year that games were allowed to receive the highest rating used in film, R18+, for violent and sexual content; socially conservative politicians were the primary reason for the delay, and many games could not be sold in the country as a result. In America, mass shooting tragedies often lead to calls to limit the sale or the content of violent video games, even when such games had nothing to do with them (as was the case in the Newtown/Sandy Hook massacre of December 2012). Thus, like pornography in its attempts to bypass obscenity laws, the industry has to make its case for legitimacy. Gamers carry the brunt of this mission, and use the arts as the point of relation.
However, to understand and answer the question "Are video games art?," we must also ask, "Do gamers understand art?" The people who tend to answer the former question, as noted, are gamers themselves, so they must be tested with the latter. Rarely do you see a media or art critic, or a non-gamer artist, musician, or thespian, defend video games. Now, their personal understandings of art may differ, but on a general level their definition of art is mostly in line with society's view of the concept. In other words, art may be something different to each person, but people know what art means. This distinction in terminology is something gamers often fail to consider in their contributions to the debate. They tend to define art differently from society's view, often stretching it to encompass video games in a way that validates the medium's existence wholesale. This leads to problematic interpretations and leaps in logic, as I demonstrated in my criticisms of Ben Kuchera's editorial on games as art in the previous essay.
As such, we must ask why both these questions need to be posed now. We must look at them through the lens of a non-gamer, or of a gamer who approached the medium through the casual or social sectors (which many "gamers" consider heretical to what video games mean to them, but more on that later). There is the matter of gaming's history, and how its extreme newness remains its greatest strength and weakness. We must also discuss the core problem of accessibility, most importantly in creating video games. Finally, there is the issue of gaming culture, and how insularity has detached gamer culture from the rest of the general populace.
An Unhistorical Medium
In the introduction to his book Extra Lives, writer Tom Bissell recounts warnings from the game developers he interviewed that defining video games as a medium is not a wise course of action. "I was sternly informed, again and again," he writes, "of the newness of their form, the things they were still learning how to do, and of the necessity of discarding any notion of what defines video games." He goes on to note that one developer in particular, mindful of the continuous march of technological progress, fears he is "writing his legacy in water."
There is some clarity to be had in these comments, and we will get into the definition of video games later on. Let us first discuss the newness, or historical, factor. The first "video game," by the straightest objective definition, was OXO, a noughts-and-crosses game made for the EDSAC computer in 1952. The first video games made for mass use (in this case, mass consumption) were the Pong-like dedicated consoles made by Magnavox, Atari, and other companies in the mid-1970s. Depending on your view, video games are either four or six decades old.
Now, let us put that age in the context of human history. Visual art and music have existed nearly as long as humans have possessed behavioral modernity, with archaeological finds of cave paintings and flutes dating back approximately 32,000 years, though their purposes remain unknown. If one were to use the general definition of art used today, these mediums have existed since the development of civilization 6,000 years ago, with the fine art of theatre appearing closer to three thousand years ago. If one were to scale human history into a single 24-hour day, visual art and music would appear in the opening hours of the day, with theatre appearing in the late evening. Video games are so new that, in this context, they have existed for one minute at best, ten to twenty seconds at worst.
Hold it, you say. What about film, radio, and television? The observation is fair, given that these forms of media art have existed only slightly longer than video games. What sets them apart, however, is that despite their own newness, they are descendants of previously established artistic traditions: for film and television, theatre and visual art; for radio, music, theatre, and opera. From this we reach the main issue with video games that does not trouble other media: video games are a wholly and completely new concept, arising from no artistic tradition, no media to which people could relate and through which they could understand it.
This newness has, in many ways, opened a great avenue for creative potential. The medium has come nowhere near the limits of what it can and cannot do, and as technology progresses in different directions, we do not face the creative limits that continuously mount on other art forms. New technology presents new ideas and the chance to enact them, while older styles built on older technology still present opportunities for creating different games, given how much simpler they have become to make. In addition, the lack of a concrete definition has allowed the medium to slowly become more accessible to all people.
However, this newness also underscores problems that the other arts, thanks to their age, do not suffer. In Walter Benjamin's seminal essay on cinema, "The Work of Art in the Age of Mechanical Reproduction," he observes that "the conventional is uncritically enjoyed, while the truly new is criticized with aversion." While Benjamin was referring specifically to audiences' preference for conventional structure in cinema over artistic endeavor, the point applies with full force to video games: because we still lack a clear understanding of what this medium means and what it can do, it is unconventional and truly new, and from that newness springs a contempt not unlike that of a parent whose child has spoken out of turn. Without something to identify and relate to, people cannot understand the medium on their own terms. And as is typical of society, people tend to treat what they do not understand with repugnance.
This issue carries weight in some ways. What perpetuates it is the perception that video games still carry among much of the general populace: that of a children's toy, one for boys (and when they get older, man-children) to play with. This is the result of four generations (over two decades) of video game consoles, from the Atari Home Pong to the Super Nintendo and Sega Mega Drive/Genesis, being marketed specifically to boys, and of the following two generations being marketed both to boys and to the young men who grew up on previous generations of consoles and PC gaming. Only in the current generation of consoles and handhelds has a shift in marketing (and a very slow shift in perception) begun, due in part to the successes of the casual and social markets. Ironically, many gamers consider these markets heretical, something explained in the discussion of their culture later.
It is from this that we can extend further into artists' and media critics' suspicion of video games as a medium, and thus into why the question of whether it is art arises at all. This suspicion has three roots: the continuous commercial aspect attached to the medium, gaming's detachment from history, and (most important) its creative inaccessibility. As to the first, there is little doubt that gaming has historically been the most commercial of mediums. While other forms of media, especially film and television, have always had a marked presence of commerce and marketing, there is still at least a limited creative intent behind the vast majority of what they produce. The video game industry, by contrast, takes particular pride in being a business first. This was especially true throughout the first and second generations of consoles, referred to by some as the Atari era: console makers served also as publishers, pushed out as much software as they could, and spent their time imitating the arcade games of the period to ensure sales success and a gluttonous flow of revenue. Such an intense focus on sales led to a glut of poor-quality games and consoles, which ultimately caused the industry to collapse in 1983 and to face what no other medium has confronted: extinction. After that period, creativity became a stronger part of game development, so as to improve quality, but only in the last decade have developers begun to consider creativity on par with making money.
As to the second, video games as a mass medium were born at what some would call the worst possible time for a medium or art form to emerge: the aftermath of the 1960s, when perceptions of nearly everything had shifted thanks to the combination of the Civil Rights Movement and the Sexual Revolution. Media had begun to be scrutinized for their social effects, especially among critics. Art, music, and film had the benefit of history to protect them: because they had grown with history, the zeitgeist in which a film or song was created could be used to justify creative choices that would be considered racist, bigoted, stereotypical, or misogynistic today (though some critics, including Judy Berman of It's Complicated, have begun to question this line of thinking). Born after that watershed moment in Western history, video games are not afforded that protection. Furthermore, once the medium began to feature more robust and complex scenarios during the third generation (known to many as the NES era), it simply ignored these developments, often using diluted derivatives of Hollywood action and sci-fi, without nuance or connection to the real world, as its basis of purpose. Justifying games marketed to what some see as an uncultured demographic may have worked in the 1930s or 1950s, but it could not work even in the times these games were created. This turned a large swath of the population away from the medium for a long time.
An Inaccessible Medium
The greatest issue, however, and the one that truly makes artists wonder whether the medium is any form of art, is accessibility. But it is not audience accessibility that is the problem. Gaming has done much on that front, especially in the last five years with the development of the Nintendo Wii and touchscreen smartphones, along with the growth of the social and casual gaming markets. Playing games has gotten much easier for everyone, as proven when one sees a little girl on public transit playing the incredibly popular Angry Birds on her iPod Touch, as I witnessed just recently.
The real issue is the creation of games, which remains a difficult process for many and a very large wall to climb for those interested in gaming on a creative level. To illustrate, say you are picking up a guitar, or a paintbrush, for the very first time. It is very easy to create something by just plucking a string with your finger or pressing the brush across the canvas. You may not entirely understand what you have done, or what would make it better or worse, but you have made something out of nothing through your creative action.
There remains no analogue for this kind of action in creating a video game. To do the closest thing to the most basic act of game creation, making a controller (be it keyboard, gamepad, or otherwise) move a 2D image across the screen, one has to know and input quite a few lines of code, and a few more to make that image animate. This assumes, of course, that one knows the correct code and is using a proper framework, say OpenGL, XNA, or DirectX. In other words, unlike the previously mentioned art forms, you need to already know what you are doing in order to create anything.
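To make the point concrete, here is a minimal sketch in plain Python of the logic that even the simplest "move an image with a controller" program must encode. No real graphics library is used, and the names (`Sprite`, `update`, `run_frames`) are purely illustrative; the point is that before anything appears on screen, a newcomer must already grasp state, input polling, and a frame loop.

```python
# A hypothetical, stripped-down version of the logic behind "press a key,
# move an image." A real framework adds windowing, rendering, and timing
# on top of exactly this structure.

class Sprite:
    """The on-screen image, reduced to its position."""
    def __init__(self, x, y):
        self.x, self.y = x, y

def update(sprite, pressed_keys, speed=5):
    """Translate raw input into a change in position for one frame."""
    if "right" in pressed_keys:
        sprite.x += speed
    if "left" in pressed_keys:
        sprite.x -= speed
    return sprite

def run_frames(scripted_input):
    """Simulate the game loop with scripted input instead of a real device."""
    player = Sprite(0, 0)
    for pressed in scripted_input:  # in a framework: poll the event queue
        update(player, pressed)     # update game state
        # a draw() call would go here to render the frame
    return player.x

# Holding "right" for three frames, then "left" for one,
# nets a 10-pixel displacement at the default speed.
```

Even this toy version demands concepts (mutable state, a loop, input as data) that have no counterpart in the first pluck of a guitar string, which is precisely the asymmetry described above.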
Now, granted, this is not the first medium to require learning the technology before creating anything. Film and photography, for most of the century and a half they have existed, suffered from similar problems: properly handling and loading film, ensuring the lighting conditions were acceptable for the shot, confirming the camera was in focus, and (after the advent of sound in film) keeping the sound recording in sync with the picture. Then there was the matter of properly developing and editing the film. Thanks to technological advancements over that period, from instant photography and VHS tapes to all-digital formats, the process of creation eventually became much simpler. Now all a person needs to do is whip out a smartphone and, with the push of a button, take a picture or record a video, then press another button to post it online for the world to see.
Furthermore, the development of video games has simplified over time. In the beginning, one had to write hundreds, perhaps thousands, of hard-coded lines of programming just to get a level to function properly, using languages as complex and tedious as assembly. The development of application frameworks such as those mentioned above has greatly reduced the tedium involved in making something, especially in gaming. In addition, programs such as Game Maker and Unity minimize the amount of coding necessary to create a game, so it is certainly possible for someone, with a little effort and time, to make a basic game.
But the problem remains: you still need to learn what you are doing. You cannot, as in other arts, create something right off the bat, even something utterly basic and of poor quality. That difference is fundamental, and the reduction in lines of code does not resolve it. Artists understandably hold this aspect in contempt, but more importantly they hold in contempt the gaming culture that sees this not only as the right way of creating games, but the only way. Thus the barrier to creation on that level will remain a tall one for years to come.
An Insular Medium
The mostly defunct publisher Interplay once went by the slogan "By Gamers. For Gamers." This describes much of the culture surrounding gamers and even developers: a mostly closed circle of people who speak their own language, distrust the presence of non-gamers, and have trouble relating what they do and enjoy to the outside world. Some have called their nature autistic, but beyond being inflammatory and insulting to autistic people, that description lacks any understanding of the culture's depth. A better term for gamers and game developers is insular: they have created an internal culture resistant to change and growth. Even the development of the social and casual markets has not disrupted this culture greatly; it has only calcified it.
One need look no further than gaming job sites such as Gamasutra to see the insularity. Many postings for actual game development work, even those concerning art, animation, and basic scenario writing, require some form of programming experience, with exceptions made only for bottom-of-the-barrel temporary positions. Instead of broadening their talent pool and using it to shape interesting ideas, developers and publishers focus inward, trying to hire gamers with some applicable art or writing experience on top of coding experience. There are exceptions to the rule, notably Harmonix Music Systems, as mentioned in the previous essay, but they are few.
Many game developers, unfortunately, have kept these requirements in place less as a measure to confirm that everyone is on the same page development-wise than as a way to keep genuinely creative people out of the development process, especially writers (and editors). SCE executive John Hight, chatting with Tom Bissell in Extra Lives, noted that the divide between designers and writers is long-standing: some of the earliest games' voice-acting scripts were written by the designers themselves on the day they were to be recorded. Such hastiness and distrust of artists and writers has made actual game quality depend heavily not on technical aspects, but on whether the designer has a basic understanding of intent and direction.
However, as noted before, this should not be surprising, given where these developers came from: many of the prominent ones are gamers of the second, third, and (in recent years) fourth generations of consoles. In other words, developers inspire gamers who later become developers, who inspire more gamers. Such a cycle exists elsewhere, notably in the comic book industry: many of today's prominent comic writers and artists were raised on comics, with little new blood except at the fringes and in zine culture. When such a cycle of talent development takes hold, it greatly reduces the overall quality of the product and risks turning the industry into a niche market. Some developers break the tedium of this cycle by taking an interest in cinema, or even in the art world, but their primary focus is gaming first. This often means that the artistically or cinematically inspired aspects of a game's structure are derivative at best, unnecessary window dressing at worst.
More importantly, though, the insularity and general indignation toward creativity spread downward into the rest of gaming culture, starting with the press. Of all the accusations gaming reviewers face, the worst is the one not often asked: when was the last time anyone at a major publication wrote a good review, one worth reading that did not sound like standard Consumer Reports/Consumerist boilerplate (and no, Ben Croshaw does not count)? If a review of that stature exists, it is a rarity nobody has actually seen. Instead, we get overviews of the technical aspects ("The graphics are…, the sound is…, the controls feel like…," etc.), all in the name of encompassing some "experience." Critics in any other medium are quick to notice this focus on functionality, because it is not criticism; it is merely testing to see whether something works.
When all the focus is on functionality, there is no inspiration to actually write, only to stenograph what happens on screen. As such, reviews often read as boring at best, trite and mindless at worst. One could dare argue that even Ian Cohen of Pitchfork, whose passive-aggressive, pretension-laden drivel exceeds the incredibly high standard the music publication already sets, writes better reviews of music than Jeff Gerstmann of Giant Bomb does of video games, simply because Cohen is actually writing rather than filling in ad-lib sheets. Such comparisons make accusations of journalistic graft stick, for it is hard to claim independence from game publishers (who use Metacritic aggregate scores to mete out royalties to developers) when one's writing lacks any sense of originality, any criticism grounded in prior theory, or even a unique voice.
Never is the damage of insularity greater, however, than among gamers themselves. While generational thinking exists in certain aspects of the culture, gamers as a whole employ an ideology and terminology that isolate them from the rest of the populace, and they move almost intentionally to keep it that way. This is best exemplified by the most basic question: "What is a video game?" Recalling the newness of the form discussed earlier, an observer can see this as a trick question: there is no definitive answer. But gamers will try their damnedest to define the medium.
In this attempt at definition, we see one of the main problems plaguing gamers and the gaming press: they define by example rather than by explanation or theory. When explaining a medium or art form, it is essential to describe what is going on in what is shown, or to describe it to people without example at all. For the artistic and the intelligent, utilizing or even creating a theory around a medium is of great use in describing it. The technical and aesthetic vocabulary gamers have developed over time, however, is exclusive to the medium, and gamers fail to analogize that terminology in a way non-gamers can relate to. The only way they seem capable of explaining themselves is by showing people examples of what they consider video games. This satisfies only the most ignorant inquiries, and the bulk of such audiences will dismiss the medium outright because they "don't get it." In turn, this creates a kind of religious or ideological thinking, in which only the few who somehow grasp the form without explanation truly understand it. Such thinking does not help the cause of video games.
Further, when left only with examples, one cannot do what is necessary to develop a proper artistic assessment of value. No gaming canon exists that commands consensus across the culture. Rather, the culture deploys the "no true Scotsman" fallacy to define what is or is not acceptable, dividing the industry and isolating itself. Nowhere is this clearer than in the casual and social sectors of gaming. To many gamers in the culture (to whom the adjective "core" is applied), wildly popular games such as Angry Birds, Fruit Ninja, and Farmville are heretical, not "real" video games, and the new casual and social players they attract go unappreciated. Meanwhile, games such as Rock Band, which have taken the concepts of gaming in so different a direction that they are arguably not games at all, are considered acceptable, if only because they were released on a console or PC.
From there, this ideology stretches even to the equipment. Many treat Apple's iOS product line and Android-based devices as separate and unequal to other handhelds, to the point that the iPod Touch and Google Nexus 7 (with designs ideal for handheld game development) are not considered real handheld consoles next to the Nintendo 3DS or Sony PlayStation Vita; relatedly, iOS and Android games are sporadically covered and rarely reviewed in the major gaming press. The Kinect motion-sensing system on the Xbox 360, which has shown great potential in the range of applications made with it, is seen as little more than a gimmick by its original target audience, while the Wii's motion-based controller is generally accepted by gamers, with some exceptions.
Most damning, though, is that once gamers leave the comfort zone of gaming, it is hard to say whether they have any depth at all in other fields of media, art, or industry. Many gamers never stray far from their passion, and their understanding of the wider world suffers for it. Some are heavily shaped by whatever else is marketed to their demographic, which in recent trends tends toward hyper-masculine products and media, and are simply beholden to the marketing. Others look outward at other media and art, and either bring something back into gaming that is useful to the culture or carry their experiences into the rest of human culture. One can see this approach in some of the early gaming-based webcomics, in the works of writers like Bissell, and in the chiptune music scene; in these cases real depth of understanding has developed. But these number among the few, for many who venture outward are simply absorbed into the rest of society, and the rest tend to show antipathy for those who stray, if observations from various forums are any indication.
Given such insularity, the culture's tone-deafness toward society's disdain is not exactly surprising. Society remains at odds with gaming culture, especially considering the continued connotation of games as toys, which implies that gamers and their sympathizers are immature. The antagonistic term "man-child" often counts continued enjoyment of video games past a certain age as a prerequisite, since there exists a notion that one should grow out of playing them. The critics are not entirely wrong. Many developers remain the same gamers they were as children, with perhaps some refinements in taste, and are simply developing for themselves. For example, if one stripped away all the refinements and unique touches of the Gears of War series and left only its basic premise, could the games have been made for the Super Nintendo? That the answer is likely yes shows how structurally monolithic and unchanging the industry has been at the top level. Some indie developers, particularly Jonathan Blow, have criticized this sharply, but they miss the forest for the trees: rather than recognizing that gamers are trying to define what is currently an abstraction, and that the structure built around that "definition" is malleable and easily dismantled, these supposedly iconoclastic developers merely enforce the "definition" and focus instead on how gamers interpret it.
What Cannot Be Understood
And so the circle remains closed, with some exceptions here and there from Harmonix and truly iconoclastic designers such as Jane McGonigal. This, coupled with video games' lack of a general definition that people could accept, is why we must ask, "Do gamers understand art?" Many gamers lack the means to explain their own medium to non-gamers, and they equally lack the means to explain art, because they rarely venture outside their own culture to experience it on a regular basis. Therefore, many do not understand art, and thus cannot answer the important question of whether video games are art. Artists and media critics are not wrong to treat gaming with suspicion, if not detestation: the medium is still very new, the obstacles to creating games remain tall orders, and the disgust developers show for creatives continues to stall progress.
That does not mean change is impossible, or that the medium cannot be brought to a light in which non-gamers can at least respect it. It is well within reason for gamers to integrate fully with mainstream society, rather than maintaining the partial connection that exists today, so that being a gamer is no more separate from being a normal person than being a fan of any other medium. But the changes require thinking outside the culture, as non-gamers, and they will require accepting things gamers may be reluctant to admit. It is a hard way forward, but it offers a better chance of society understanding the medium, and of gaming understanding art.