Gameological At Large

PlayStation 4 controller

More, More, More—How Do You Like It?

At a New York event, Sony argues that the game industry’s problems can only be solved with more of everything.

By John Teti • February 21, 2013

At Wednesday night’s “unveiling” of the PlayStation 4 in New York, Sony did not show us the PlayStation 4, which makes this the most postmodern unveiling I’ve ever attended. However, the various Sony honchos who took the stage at the Manhattan Center auditorium did describe the heart of the machine. It’s “the gamer,” or maybe it’s the “consumer”—same thing, apparently. The word “social” was used as a noun at many points, as it, too, lies at the core of the PlayStation 4. And then there’s the “supercharged PC architecture.” You want gigabytes? Brother, you can have all the gigabytes you need.

And apparently the world’s game developers need them. One after another, self-respecting game creators took the stage to shake their heads and lament the severe “limitations” they have been forced to endure prior to the advent of the PlayStation 4. If it’s unseemly for representatives of a multi-billion-dollar mass media industry to whine about the constraints on their creativity, that didn’t seem to bother any of the men (there were no women) who regurgitated Sony’s pipe-dream talking points last night.

To hear these guys talk, the greatness of the heretofore unseen (and still unseen) PlayStation 4 is matched only by the awfulness of the PlayStation 3. Lead System Architect Mark Cerny explained—in a patronizing story-time tone reminiscent of Bobby Jindal’s 2009 star turn—that we have the internet now, and people like to be on the internet all the time, but “there are limitations to the experiences [the PlayStation 3] can provide in this new world.”

A producer from Sony’s Evolution Studios said that he and his colleagues had been sitting on their idea for a global peer-to-peer team-based persistent-connection trophy speed online network synergy car game for a decade. They even trademarked the name, DriveClub. You can see why they’d want to lock that down. The trouble is, they were never able to make it until now—apparently because the PlayStation 3 didn’t have enough social. The Evolution producer then showed us how DriveClub players can challenge online friends to beat their best race times, a feature already present in practically every racing game made in the past two years.

Killzone: Shadow Fall

It was an evening of cognitive dissonance. For the first half hour, Sony’s people exhausted their buzzword thesaurus telling us how the PlayStation 4’s technology will make “new experiences” possible. “In the past, creators’ visions have been constrained by the limitations of technology,” said executive David Perry, but not anymore.

This proclamation of a new era was followed by a trailer for Killzone: Shadow Fall that was indistinguishable from countless other trailers we’ve seen for first-person shooters in the past. It hit all the standard plot points. The fresh-faced recruit hops off the troop transport. A building blows up. Much loud shooting ensues. (Here’s a rule of thumb. If the title of your game contains a colon, odds are it’s not a “new experience.”) Later, representatives of the Japanese studios Capcom and Square Enix stopped by to show the audience that they planned to make PlayStation 4 games with dragons in them. Sony’s tired thesis, the notion that more technology necessarily produces innovative artistry, was convincingly refuted by the content of its own event. It was like watching the Flat Earth Society unveil the year’s hottest new globes.

Even when the ideas were somewhat fresh, there was no guarantee that they would have any appeal. During a discussion of a new app “ecosystem” that barfs PlayStation all over your phone/tablet or any other device, a breathless Sony flack imagined one fantasy that would soon come true: On your telephone, you will be able to watch video clips of other people playing a fighting game, decide which opponents you would like to fight, and then challenge those people to fight (later, when you’re using the actual PlayStation 4). You can tell we live in a privileged society when we have to work this hard inventing things to desire.

On the other hand, Sony promises to solve plenty of first-world problems that really do exist. The new system will not take so long to start up—you’ll be able to pause, put the system in a sort of sleep mode like a laptop, and then start back up where you left off. The tedious process of downloading games to the PS3 will also be streamlined on the PS4. Not only will you be able to begin playing games before they’re fully downloaded, but the PS4 will also assess your tastes and pre-download portions of games it thinks you will like. It’s a clever idea that would essentially reduce the download time for a game purchase to zero (although it’s probably impractical for players whose internet providers enforce a monthly data cap). Then again, there’s a certain creepiness to it—Sony is essentially saying that it will mine your personal information to determine which stuff you’re most likely to buy, advertise only that stuff on your console, and act like it’s doing you a favor.

inFamous: Second Son

The PS4’s emphasis on a console tailor-made for YOU is the culmination of a recent trend in which Sony has made the PlayStation (or at least the marketing thereof) all about ego reinforcement. The company’s executives flagellate themselves at the altar of The Gamer. “The living room is no longer the center of the PlayStation ecosystem—the gamer is,” said Andrew House, who runs Sony’s game division. He insisted that the PlayStation will “give gamers the experience they want, and frankly that they deserve.” House had a nice line reading here, with a touch of shame on the last bit. It’s as if he were confessing that the last 20 years of Sony game consoles had been a fraud perpetrated on the poor, innocent players who were dumb enough to purchase such inferior machines.

What gamers deserve, according to Sony, is more of the same, made marginally shinier. The man from the Evolution studio breathlessly told the crowd that some of the DriveClub cars featured virtual carbon-fiber exteriors in which the physical response of every thread in the fiber is calculated separately. More than one producer marveled at the increased number of “polygons” he was now able to cram into his latest predictable genre sequel—“polygons” being industry lingo for “size of penis.”

The evening’s one aberration was Braid creator Jonathan Blow, who talked about The Witness, his upcoming open-world puzzle game. Taking the stage after a litany of let’s-blow-stuff-up trailers, he cracked, “I don’t know how I can follow all those explosions.” His segment of the event alternated between a cogent exploration of the lazy design choices that plague many mainstream games and Blow’s own determination to avoid those mistakes in The Witness. A lot of open-world games try to wow you with bigness and include a lot of filler, Blow said—true that—but in The Witness he tried to make the island world as compact and dense as possible, so that every inch of the surroundings was a potential point of interest.

Savvy self-promotion? Sure. But it was refreshing to hear someone argue that smart games require a conceptual shift rather than an injection of supercharged-PC-architecture steroids.

The Witness

If Blow was a high point, his follow-up act—Heavy Rain creator David Cage—provided a counterbalance. “When people ask me what feature I want in future consoles, my answer is always the same: emotion,” Cage said. (Never mind that “emotion” was explicitly advertised as a feature of the PlayStation 2—the PlayStation 2 is old, and therefore it is a shitbox of lies.) He then launched into a cretinous analysis of media history. Cage asked us to consider black-and-white silent films. Their images were indistinct, Cage noted, and the actors had to exaggerate their actions. These films struggled to convey emotion because the technology was just too darn limited. Cage argued that until the PS4, video games have been akin to those old worthless silent movies. But because the new box has a super-fast processor, games will finally be able to convey emotion.

On the big screen behind Cage, he illustrated his points with scenes from Edwin Porter’s 1903 silent film The Great Train Robbery, which is only one of the most important and influential motion pictures ever made. Touring the nation to sellout crowds for years, The Great Train Robbery introduced the concept of cross-cutting—in which the action on screen cuts back and forth between two scenes taking place at the same time, creating remarkable dramatic tension. So, to recap, Porter expanded the cinematic vocabulary in a way that forever affected the way we perceive moving images, and David Cage saw fit to look down his nose at him.

As Cage was wrapping up his lament over the inherent terribleness of early film, the most iconic shot of The Great Train Robbery appeared behind him: a bandit shooting his gun directly at the audience. That image was more arresting than any of the glitzy, high-polygon explosions that Sony shoveled into our eyeballs last night. And it crystallized the deepest fallacy of the whole affair—the idea that creativity needs to be free of limitations.

Creativity thrives under limitations. People who love games understand this implicitly, since the best players find the most creative ways to succeed within the confines of the rules. The Great Train Robbery is a masterpiece not in spite of its limitations but because of them. So if David Cage doesn’t think he can produce an emotional work of art with a PlayStation 3 and an eight-figure budget, maybe he shouldn’t be in the art-making business.

Expanding the technological capabilities of our game machines is not inherently bad, but treating new tech as a magic bullet is a self-destructive delusion (if a familiar one). The reason that so many games suck is not because the technology is too modest. The reason that so many games suck is because so many games suck. Making art is hard. No microchip changes that.

And yet Sony’s developers insist on the myth of “more.” More polygons and more gigabytes because surely this time, they will lead to the promised land of creative expression. In practice, this dogma hasn’t done much to improve games. Quite the opposite. As production budgets balloon and the cost of entry shuts out independent voices, the worship of “more” is likely to be the ruination of console gaming as we know it. The industry’s arms race with itself simply is not sustainable. Yet here’s Sony, blithely promising to build a bigger gun. They’d better watch out—the recoil’s a bitch.

572 Responses to “More, More, More—How Do You Like It?”

  1. VS says:

    Awesome insights! I really enjoyed reading this and I’ve never even owned a console.

    • Staggering Stew Bum says:

      I’m glad that John was the guy covering this. I was reading Gameological contributor Scott Jones’ twitter feed today at work during the non-unveiling and the guy had me cracking up, but Teti’s article easily tops it. 

      Sounds like the whole event was an exercise in empty revisionist marketing gibberish.

      • Asinus says:

        The marketing BS and “gamer” fellating makes me ashamed to even own a console. 

        • HobbesMkii says:

          No one’s ever fellatiated(? fellatioed?) me for owning a console.

        • SensitiveSethPutnam says:

          @HobbesMkii:disqus Have you ever explicitly walked up to someone attractive at a bar and said “I own a PS3, fellatio please”?  Works every time.  

        • HobbesMkii says:

          @TheSensitiveGhostOfSethPu:disqus No. I don’t own a PS3. I bought a 360. I got half a western-grip handjibber for that, once. I think. I might have dreamt it.

        • It’s “fellated”, but I like “fellatiated”, because I remember a popular insult in junior high that went like this:
          “Hey! Fellatiate my one-eyed snake!”

        • Halloween_Jack says:

          I feel a skosh less foolish for inventing “cunnilinguinated.” Hmm, better check on that… aww, damnit.

        • HobbesMkii says:

          @twitter-106641927:disqus Thanks. I was going to look it up, but I felt that would be too close to being 11 again and looking up dirty words in my father’s edition of the OED.

        • Chum Joely says:

          @Halloween_Jack:disqus I prefer “cunnilingulated”.

        • GaryX says:

          @HobbesMkii:disqus Well, that’s your problem. Everyone knows that the blast-banging chips in the Wii U make it the ultimate system for getting some. You XBOTS and PLAYTARDS just DON’T understand. WIIU FTB (for the boners).

        • Mikehole says:

          Hobbes, I know a few people who could fix that for you…

          And I think it’s “fellated,” or as it was stated in Superbad, “fellached.”
          Edit: Damn. I should read before commenting when I’m this late to the party.

      • Carlton_Hungus says:

        At a marketing event?!  Saints alive and my heavens!

        Although I agree with the analysis.  I imagine Microsoft’s event will be similar.  The WiiU is basically a plea for people to buy it and “wait until we develop new Mario/Zelda/Metroid games, you guys remember them right? we won’t even change anything in them!”

        • ApesMa says:

          In Nintendo’s defense, they innovate more with their AAA titles than pretty much anyone else. The new Zelda is supposed to change things up considerably by going back to the non-linear design of the very first one and introducing multiplayer, and the 3D Marios have never felt like tired rehashes to this day. The last Metroid wasn’t great, but at least it was very different from the Prime games.

          The hugely popular but less prestigious and ambitious franchises like Mario Kart, New SMB etc. are another story though.

        • Carlton_Hungus says:

          That may be the case, but I’ll believe they’ve revolutionized Zelda when I see it. It feels like a loooooong time since Nintendo has done much innovation with its flagship franchises (really since the halcyon N64 days, in my opinion), nor have they really gotten any amazing 3rd party games.

          However, since then we’ve seen many other great new IPs spring up everywhere but Nintendo, Ico, Shadow of the Colossus, Halo, Little Big Planet, God of War, Demons/Dark Souls, Uncharted, Assassin’s Creed.

          Now it’s true that since those have come out some of them have fallen into the same dreaded sequel-itis that seemingly affects all game companies (Assassin’s Creed I’m looking in your direction).  But I can’t think of any big new Nintendo IP, or even multiplatform new IP that’s come to Nintendo to help them along, they rely too much on their first party games where the innovation has been largely lacking for a decade. 

        • rubi-kun says:

          Nintendo did end up making some well-regarded original IPs for Wii such as Xenoblade and The Last Story, and the creators of Xenoblade are working on a new game for WiiU FYI. Plus they’re working with Platinum Games on The Wonderful 101 which looks awesome.

        • ApesMa says:

          Super Mario Galaxy wasn’t innovative? 3D Land? Metroid Prime? Granted, they followed that up with two direct sequels, but then changed it up again. Of the Zelda games only TP felt like an attempt to just recreate the success of an earlier game with fancier graphics. Even the two mediocre DS games were significantly different from the others.

          Nintendo release plenty of cashgrabs and endless, uninspired sequels (Mario Party), but that’s hardly ever the case with those three flagship titles. Even when they disappoint (Super Mario Sunshine) you can tell they tried hard to deliver something that wasn’t just more of the same.

          SMG 2 was a straight sequel for once, but being one of the greatest games I’ve ever played that one gets a pass.

        • rvb1023 says:

           @ApesMa:disqus Super Mario Galaxy wasn’t innovative, no matter how good it was (Very). Neither was 3D Land. Metroid Prime was incredibly innovative, but that game is over a decade old. Nintendo tends to not innovate so much anymore, rather hiding behind whatever new control scheme they have.

          Not being a big Zelda fan, I guess they innovated with that but since motion controls are all but forgotten by Nintendo I really don’t see them continuing that route, something I personally am happy for.

          @openid-111502:disqus The Last Story was not Nintendo and I don’t think Nintendo has any part in the development of W101 beyond funding it. And I do want it.

        • ApesMa says:

          @rvb1023:disqus SMG wasn’t innovative now? Have you played Sunshine or 64 lately? It was a huge leap forwards. 3D Land was at least distinctly different from any previous Mario game.

          All the 3D Zelda games since Ocarina, with the exception of TP as mentioned, have been distinctly different from each other.

          As I said they did change Metroid with the last one. You can say a lot about Other M, but it certainly isn’t another Prime game.

        • Wearedevo says:

          SMG was awesome, but no, it certainly wasn’t innovative. Did it include any major features not present in previous 3D marios? Aside from using the wiimote to collect stars which.. uhh.. yeah. I hope you’re not going to base your claim on that. :/

        • mikonawa says:

          I think the innovation of SMG came from setting the levels in spherical surfaces, which helped alleviate camera issues inherent in 3D games and produced fresh level designs and new gravity mechanics. The levels where platforms were warped, sliced, masked (e.g. ghost levels) in real time were, from what I understand, also innovative from a technical viewpoint. And then there were the motion controlled levels.

        • ApesMa says:

          If you think pointing at starbits was the closest thing to an innovation in that game then I have to wonder if you even made it to the tutorial stage. It had to do with all the playing around with gravity and small spheres obviously, and the very different and much more clever level design. Seriously, go back to SM64 and compare. This is just weird.

    • Histamiini says:

      I agree. Excellent work, Teti.

    • Baramos x says:

       This explains why you find it exciting.

  2. soredomia says:

    “They do not possess the true fire. They speak of creation and they boast of their potential but they do not create anything beyond the mundane. Their imagination is poor, obsessed with the small details. A true Dreamer, I say, creates a grand scheme and then concentrates on the details. Starting with the details is for the ants of the imagination – the small insects who aspire only to be fed.”
    Exemplary article, sir.  Yves would be proud.

  3. Erdschwein says:

    Glad to see someone defending silent films. I feel like gaming, with its rapid technological advancements and emphasis on planned obsolescence, is as much a victim of its own progress as anything else. Some of the earliest games are still amongst the best ever made and you can’t doubt that it was partially due to the fact that the creators, while always trying to make something that was visually alluring (at least with the beginning of visual video games), were more concerned with making something that was good or at least original.

    •  I think this is why there’s such an expanding gulf between the mainstream games industry and indie developers these days. One side is constantly hungry for more polygons, more RAM, more everything and the other is looking for ways to be more interesting while using less. What’s interesting is that with Facebook games and iOS titles, it’s clear that the general public is out of step with the core industry. Sony’s presentation did nothing to address this, which doesn’t bode well. The console that succeeds this generation won’t be the one with the biggest veiny throbbing teraflops, but the one that comes up with an accessible and open marketplace for independent developers that can rival the App Store.

      In other words, it’ll probably be Apple.

      • Llodes says:

        Obviously indie designers and mobile games are squeezing into the market, but I think the current console generation is hurt worse by the way it’s tailored for playing an extremely limited subset of basic game types.  Consoles today are essentially designed to accommodate console games from 1999 that have advanced along a single dimension: graphics.  

        But in the PC market, you see games breaking new technological ground in lots of different ways.  Think about Minecraft: it’s kind of a fascinating technical marvel (and can actually be quite taxing on system resources), but it’s got graphics imported directly from 1995.    Unfortunately for Sony and Friends, most consoles were built with the apparent assumption that the world’s Minecrafts would be inevitably displaced by its Killzones.

        That didn’t happen, especially since we’ve probably hit the point of diminishing returns on graphics technology. This leaves Sony, et al., with console hardware that’s incapable of taking full advantage of a huge proportion of today’s gaming innovation.  If they’re serious about making consoles work more like PCs–i.e., less dedicated graphics processing, more flexibility for designers–they might still have a chance.  If they just mean they’re going to install high-end PC graphics hardware in the next generation, I agree, they’re in trouble.

        • Matthew Lutz says:

          Consoles hit the point of diminishing returns on graphics processing when the PS2 came out. Everything since then has just been a descent into the uncanny valley.

        • Chum Joely says:

          I think that the PC reference really does mean the overall low-level architecture, not just graphics, so yes.

      • JohnnyLongtorso says:

        Casual gaming and console gaming are two separate markets. Farmville and Words With Friends aren’t going to replace Killzone and Assassin’s Creed anytime soon. The general public may prefer casual games, but that’s because there are millions of people out there with Facebook accounts and iPhones who don’t have to pay for the games they play.

      • The_Misanthrope says:

         I’m sure the big console makers would love to be able to just throw money at some game developers and get “art” in return, but doing so is an inherently risky proposition.  I will give Sony some credit for putting some faith in riskier ventures, though most of that has been happening at the tail-end of the PS3’s life, as the shareholders shift their focus to the next-generation.  So, of course, they are going to put money in beefing the tech stats and cramming more options into the PS4, because those are quantifiable, proven commodities.

      • GaryX says:

        I mean, yeah their presentation didn’t figure heavily into it, but I think the combo of Blow showing up + the way they’ve handled indies on PSN the past few years show that Sony is aware this is a segment of the industry worth supporting.

        • The_Misanthrope says:

           Good on Blow for showing up and offering a counterpoint to the “more is better” theme, but, having just watched the trailer for The Witness, it looks an awful lot like a next-gen Riven.  I’m sure some people will go for that type of game–variety in games is always better–but it’s not for me.  Puzzles are puzzles, no matter the pretty backdrop.

        • GaryX says:

          @The_Misanthrope:disqus True, and I think he’s even said it’s Myst-inspired, but being Blow, I’m willing to wait to play the game to see how it plays out. I wouldn’t be surprised if he’s holding some cards close to his chest.

      • KidvanDanzig says:

        I feel like “casual gaming will destroy the console market” is the new “console gaming will destroy the PC market”. As though there are significant numbers of people who would be playing consoles were it not for Zynga games.

    • beema says:

      A great example of this is the memorable game music from the 8 bit and 16 bit era. Composers had way less to work with and harsh memory constraints. Now we have lush orchestral scores in everything, except, when’s the last time you heard something memorable that actually stuck with you?

      • blue_lander says:

        Totally agree. CD soundtracks were such a big deal when they first came out, but I think it ruined game music. Using a chip to make music almost turns your console into a musical instrument; orchestral soundtracks and whatnot just turn it into a CD player.

        • George_Liquor says:

          I lament the end of the video game console as a musical instrument too, but I wouldn’t want to hear chiptunes return. They’re a product of their time, and largely, the medium has moved on. How inappropriate would it be to wander the wastelands of Fallout listening to an endless two-minute chiptune loop instead of Johnny Guitar?

        • Asinus says:

          @George_Liquor:disqus  , there is still something that feels more appropriate about even good midi music in video games. I have some midi hardware that can do some convincing instruments (especially double reeds), but it just has that little bit of artificiality that matches game worlds better. 

          Also, different consoles would always have slightly different music hardware, so instead of something sounding like a synth violin or whatever, it would sound like the SNES’s violin or the SID’s violin, etc. A game’s various ports would be different experiences that would match each system’s personality (for lack of wanting to find a better word). 

          But the point is that you don’t have to go all the way back to old chip tunes– there is good wave sample midi that could be pretty phenomenal with memory prices being next to nothing. Also, it would be interesting to do something in the spirit of old synth chips and just use updated, quality tone generators. They wouldn’t be the same sorts of unprocessed square waves of the days of yore… I donno… it would give consoles and games a new personality (and gamers something else to fight over, instead of arguing over which console’s port of an identical game was MORE identical). 

        • George_Liquor says:

          Due to space limitations, a lot of Gamecube games do exactly what you described. Twilight Princess and Wind Waker both use wavetable-synthesized MIDI tunes for the BGM instead of streamed audio tracks. It makes for a listening experience that’s disconnected from actual recordings enough to still sound slightly videogame-y. It’s an aesthetic that works well for a series like Zelda, which has its roots in 8 bit chiptunes, but it’s not one I would like to hear applied to every other game.

      • Merve says:

        There are plenty of memorable recent game scores – just off the top of my head: Deus Ex: Human Revolution, Botanicula, Prince of Persia, Gravity Rush, Mass Effect. Game scores seem more generic now because they’re comparable to movie scores. In the past, chiptunes could only be compared against chiptunes; the best or most iconic ones are the ones we still remember today. Now, without technological limitations, video game composers have the ability to ape John Williams or Hans Zimmer instead of just each other.

        • beema says:

          So if I walked up to you and asked you to hum the tune from Botanicula, you could do it? I’ve played all of those except PoP and Gravity Rush, and I can’t for the life of me think of the tunes, except for maybe some bloops from Deus Ex. I’m not saying the Deus Ex score was bad, it was great when playing the game. It’s just not very iconic or memorable to me. 

          The only somewhat “modern” game score I can remember is Harry Gregson-Williams’ Metal Gear Solid theme. I also remember the one from Uncharted merely because it looped on the menu screen which I had up while I went to do other things in the house.

          I guess it also has to do with exposure and what point in your life you played the games. My brain was a lot more impressionable back when I played Final Fantasy 6 than it is now. Even so, I think my point remains fairly valid. You just have a lot of samey ambient or orchestral stuff in games now. Which, like I said, works fine for the games, but I wouldn’t call it creative. 

          • Merve says:

            @beema:disqus: Well, the tunes from Botanicula are hard to hum, which is part of what makes them memorable. But I can play them back in my head, if that’s what you mean.

        • Captain Internet says:

          @twitter-259492037:disqus Bastion and Hotline Miami both had great soundtracks, and they were recent. And let’s be honest – most 8-bit and 16-bit games had completely forgettable soundtracks too.

        • GaryX says:

          @twitter-259492037:disqus Yeah, I think you’re being a bit revisionist here. We remember the 8-bit and 16-bit tunes that stuck with us, sure, but if you think in ten years youtube comments aren’t gonna be all “OMG REMEMBER WHEN U HERD THE HALO SCORE 4 THE 1S TIME???” then you’re crazy. We still have plenty of good video game music. About the only good articles on Kotaku these days are the Kotaku melodic ones.

        • Logoboros says:

          @twitter-259492037:disqus You said “…it was great when playing the game. It’s just not very iconic or memorable to me.” Well, which is more important? That it be hummable, or that it serve the purpose of enhancing the game?

          I would suggest that what you’re calling “iconic,” “memorable,” and “creative” scores really are just “melodic” ones. And there’s been a general trend in scoring in general (in movies, television, and games) away from strongly melodic compositions and dominant leitmotifs (as in the John Williams mode).

          It’s not a lack of imagination; it’s a combination of changing fashions and a general belief that heavily melodic themes have a tendency to distract or overwhelm or are a bit too emotionally on-the-nose (and therefore not really serving the purpose of enhancing the thing they are scores for — which is their fundamental purpose, not to be great in isolation).

          Also, I think you’ve got a bit of a Golden Age fallacy going on: if you do go back to the full market of old 8-bit and 16-bit games, a huge number of those had unmemorable or outright terrible music (there were certainly some games I ended up putting on mute because the same 30-second long melody line would just repeat over and over and over — and some of those were definitely earworms, but I don’t rate that as inherent artistic success). It’s easy to handpick a selection of great chiptunes — and they tend to be melodic because when you have highly limited instrumentation and harmonic depth, it makes sense to go with a relatively simple melody line + percussion. But is the overall ratio of effective — and even memorable/melodic — game music really worse today? Mentally visualizing the racks and racks of thoroughly forgettable NES and SNES games that were back at my local rental place in the 90s, I’m going to say no.

        • George_Liquor says:

          Let’s not conflate “memorable” with “superior” here. I also can hum quite a few miserable tunes from 8- & 16-bit games off the top of my head. 

        • @GaryX:disqus: nice attempt, but you lose points for being more coherent than the average youtube comment.

          My wife recently bought me Bit.Trip Complete for Wii.  Not only do the games have very primitive graphics, but the simple techno soundtrack is great, and it’s integrated into the games in such a way that it seems vitally important.

          To me, orchestral scores are most often used to very generic effect, whether in movies (by far the worst offenders) or games. An orchestral score often seems to be a signifier for “depth” or “importance,” while something like Nick Cave’s minimalist score for The Proposition (great movie) can be much more effective.

        • SonicAlligator says:

          Journey. ’nuff said. 

        • Erdschwein says:

          This argument is sort of the central debate of music.

        • beema says:

          Alright, well this certainly got away from my original point really quickly. Let’s not all be so f’ing pedantic, eh?

          My point was that old games had great memorable soundtracks despite technological limitations, and in some cases due to those limitations, and that new games don’t really have better, more memorable, or more creative soundtracks despite fewer limitations. I wasn’t calling the new music crappy. Why does everyone on the internet have to turn everything into some sort of competition? Yeah, some new games have awesome soundtracks. Anyone can find an example of anything anywhere any time that refutes anything else if they try hard enough. Fuck. I post on this site to get away from pointless fucking arguments like this.

          It was to refute the whole David Cage “technology constraints limit creativity” thing. I’d say those constraints force people to be MORE creative. You can certainly do more with fewer constraints, but the question is whether most people actually do.

          • Merve says:

            @beema:disqus: That’s a fair point, and I’m sorry if I misinterpreted what you were trying to say. I think we just have a fundamental difference in philosophy here. In my view, if someone needs constraints to be creative, he or she can simply artificially impose those constraints on him or herself. Thus, the loosening of constraints on a macro scale need not correspond to a similar loosening on a micro scale.

            Do I think technological limitations can impede vision? Absolutely not. But can they impede the ability to realize that vision? Definitely. And maybe that forces people to think laterally and come up with creative workarounds. But I don’t feel like I can ever be sure that those workarounds are more creative than the original vision, because the realization of the original vision is a nonexistent counterfactual. In the end, we can only guess, and I think it’s alright to come out on either side of the debate.

        • Fyodor Douchetoevsky says:

          @twitter-259492037:disqus I don’t think anyone is disagreeing that restraints force people to get creative. You can still be creative without restraints though. Case in point, all the awesome modern game soundtracks that people mentioned upthread.

        • lookatthisguy says:

          To somewhat piggyback on what @Logoboros:disqus said, I think it’s a fair point to make that, as console gaming by and large tries to get closer to giving the player a cinematic feel and scope for everything, the score (and it is a score now, not just background music) has gone from recognizable melodies and leitmotifs to short, scalable cues (or even just stems) ready to be triggered and transitioned to and from at a moment’s notice to match the action: say, how close you are to tackling the enemy, or narrowly beating your opponent in that winner-take-all race. Many soundtracks released today are edits of these cues/stems tied together to give a sense of how things happened as the player played through that scene of a game, but they may never actually line up with how the scene went down for any particular player.

          In that regard, while I generally disagree with @twitter-259492037:disqus’s point, there is a theoretical argument to be made that soundtracks may soon reach the point he suggests if console gaming continues along its current trends.  To me, though, this comes down to focus on the narrative—whether you opt for stronger narrative or make a game that has less need or less apparent narrative.  If the composer has enough space to write something that can be positively memorable, you’re that much more likely to get something positively memorable than the generic, Zimmerian grooving-percussion-and-string-ostinato beds you get now.  (Not that there’s anything wrong with them, of course, just that there’s so much of it now.)

      • ricin_beans says:

        The Skyrim score is pretty damn memorable.

        • asdfmnbv says:

          The Morrowind, Skyrim and Oblivion variants on the Elder Scrolls theme are each so great. They set the tone of each game perfectly and are a pretty great through-line for the games. Also, the Morrowind theme is the greatest thing ever.

        • The_Juggernaut_Bitch says:

           Even better?  Hearing the old Morrowind score on Solstheim in the Dragonborn DLC.

        • Asinus says:

          What @asdfmnbv:disqus  said is irrefutable scientific fact. 

      • Scurvyhead says:

        Games with memorable orchestral soundtracks, based on a glance at my DVD rack alone:

        Skyward Sword
        Metal Gear Solid 3
        Final Fantasy XIII
        Red Dead Redemption (practically a love letter to Ennio Morricone)
        Assassin’s Creed
        Uncharted (series)
        Monster Hunter
        TES4: Oblivion

        (Admittedly, some of the “instrumental” tracks in these games are synthesized)

        It’s true that because they were limited to short melody/harmony loops, 8-bit (and to a lesser extent, 16-bit) soundtracks were naturally catchy, but dismissing modern game music as unmemorable is a bit shallow-minded. I chalk it up to nostalgia and the fact that chiptunes and the games they accompanied have had decades to soak into gamer brains.

      • KidvanDanzig says:

        XCOM and Skyrim.

        Please let’s not turn chiptune music into fucking classic rock.

    • KidvanDanzig says:

      It baffles me that people think video game developers don’t want to make good games

  4. ThoseEyebrows says:

    I haven’t had the chance to watch or read up on the event yet, so I’m glad I read this first. Now I know not to waste my time.

    This was a great read, though, especially the part about Cage. I mean, I hated him before, but this takes it to a whole new level. You’d think that a guy who so clearly, desperately wishes he were making movies instead of games would actually know a little bit about the medium.

    • Zack Handlen says:

      Y’know, I didn’t -hate- Heavy Rain (it was often entertainingly terrible, and there was enough broadly applied pathos that I got suckered in for a few scenes; plus, I guessed the killer really early, and that made me feel good about myself), but that is not a game made by someone who should be anywhere near a movie set. I don’t know much about Cage, but the script for HR is a straight-up shitty DTV thriller. It gets a reluctant pass for novelty’s sake, but what made it interesting were those limitations Cage seems so obsessed with eliminating: that weird tension between the clunky acting and plotting, the cheap horrors (child endangerment? Oh do go on!), and the puzzle mechanics. A game with better animation or whatever is just going to make the silliness of the plot all the more evident.

      Fantastic article, John. As stupid as these events always are, I’m glad they exist, if only for terrific take-downs like this. 

      • ToddG says:

        I find most of Heavy Rain to be pretty indefensible, but that second level, where it’s just the dad trying to take care of the other kid after the tragedy of the prologue, really worked for me, as far as emotion and storytelling go. But, yeah, the rest of the game is mostly a mess, and then we got Walking Dead two years later and saw what a game in that style could truly accomplish. Oh, but wait, this console generation can’t convey any emotion. I must have imagined being absolutely wrecked by the end of that game.

        • Carlton_Hungus says:

          I feel like Heavy Rain ran into two problems common to modern games.

          First, it ran into a bit of the “uncanny valley” problem: not that these characters were really that much more realistic than those in other games, but since they were supposed to be doing more normal, human things (unlike, say, Nathan Drake), I think it was more of a problem.

          Also, as one of the people interviewed on WRUP put it: when a game claims to be “open-world” or the story “player-driven,” running into those edges of the map or funnels that take away choice becomes all that much more annoying, as opposed to knowing that you have no choice other than to just run left.

          Heavy Rain was interesting and worth playing once, but, I agree, pretty indefensible; the voice acting and story were atrocious.

        • WinterFritz says:

          Ditto on The Walking Dead game. I also found Red Dead Redemption to be more emotional and weighted with pathos than Heavy Rain. I also thought it was more fun to play. I didn’t much care for Heavy Rain is what I’m saying.

      • Phillip Collector says:

        Seriously, to hear Cage talk about gaming, you’d think Heavy Rain was going to be the video game equivalent of “The 400 Blows” or “Breathless,” when in the end it was like a bad Law & Order knock-off.

      • beema says:

        God I loathe David Cage so much. He shouldn’t be allowed to open his mouth. His games are relatively fun and interesting, but every time he says anything it just infuriates me. 

        • Bipolar_Bearman says:

          Shhh! What are you saying? No one can ENJOY his games. They are of the devil, have you not heard?

      • GaryX says:

        Anytime David Cage opens his mouth to critique games, he should be strapped to a chair and forced to watch the last half of Indigo Prophecy.

        Fuck that guy.

        • beema says:

          *strapped to a chair and unable to masturbate. Because he would masturbate the shit out of watching his own bs. Especially his own awkward “romance” scenes bs.

          Man, Indigo Prophecy might be the best example of a game that started out amazingly brilliant and then went more wrong than anything has any right to go wrong.

        • GaryX says:

          @twitter-259492037:disqus Agree so hard. I found the game pretty interesting up to a point, and then it just went super downhill.

    • evanwaters says:

       Anyone who cannot appreciate silent film should not be in the art or entertainment business.

      (Exception: blind people.)

    • Nacho_Matrimony says:

      Teti made him out to be way more antagonistic than he actually was. If anything, Cage is distinctly reverent of cinema, like the rest of the French. He’s applying that same appreciation to what makes game development in 2013 so special.

      As a matter of fact, I think Teti made everything about this presentation seem more antagonistic and backwards than it actually was.

  5. Certainly a fascinating article with a number of good points, though it’s too cynical and displeased with what seems to be an exaggeration of every facet. Straw-man fallacies must be something of a brother or sister to the author, given how much he uses them. However, you do make some good points in the last section regarding the praise of “more” and what that is likely to do for the industry. Frankly, the event was a press event, a marketing ploy meant to attract as many customers as possible, and the sad truth is that most people are more interested in more of the same with some added polish and shine. As a consumer culture, we have been trained to place more stock in increases in technology, hardware, and structure than in modifications to our approaches: conceptual shifts that don’t require increased capital investment.

    • Enkidum says:

      Can you give an example of the straw manning? Genuinely curious.

      As for the cynicism, how can one not be cynical about it? It’s a marketing pitch for a product that doesn’t exist yet, presenting what is clearly a very minor set of tweaks on a design that was fully-developed a decade ago, and presenting it as something revolutionary. 

      • GaryX says:

        Welcome to video games/technology. The only time this hasn’t really happened was the jump to 3d, and maybe the Wii but that’s something else entirely.

      • The way I saw it (and please, if I am mistaken, I would genuinely love for someone to correct me), Teti presents his argument by constructing the most extreme iteration of Sony’s desire to sell consoles based on hardware revolutions that are not entirely necessary, and then arguing against that extreme view, as opposed to arguing only against what they explicitly or implicitly said. Perhaps the extrapolation he employs is meant to highlight the problems with Sony’s thinking should it prove excessively dominant in the future, but it seems to me that there is some misrepresentation of Sony and their argument.

        As for the cynicism, I suppose I agree. Maybe I just desperately want to stay optimistic, or I’m in denial over what could be the death of quality console gaming (though I still highly doubt such a death is possible). Admittedly, the hardware jump to 8GB of RAM sounds pretty astounding to me because, if I am not mistaken, the current console generation is functioning with 512MB on the PS3. That’s a pretty colossal leap.

        • Enkidum says:

          Do people really believe in the death of consoles? I just don’t get the fuss. Build some better machines, it’ll work out fine.

      • Nacho_Matrimony says:

        To be fair, it seems to be a better-equipped console for an Internet-enabled world. I was bummed by the fact that the triple-A games are too similar to today’s triple-A games, but Sony also has a good track record with their software risks (Unfinished Swan, Journey, Team ICO’s stuff, and others). That stuff will be there, and if and when Sony screws the pooch on that stuff, I know the PC’s got it in spades.

      • Bubbleset says:

        The biggest straw man was the idea that Sony was claiming the PS4 is the ultimate gaming device to tell amazing stories and everything before it was shit. I don’t recall Sony bashing what came before at all – the whole thing started with a celebration of the evolution of their consoles.

        The second biggest straw man is against developers.  No-one was saying that what they made before was terrible, or that amazing games or interesting stories were impossible in previous generations.  Just that new consoles will result in better products and more opportunities.  And this is a pretty self-evident point – LA Noire and other games have shown the promise of increased facial fidelity in telling stories, and the demo made this argument again.  Previous generations have heralded leaps in what’s possible – 3D, online gaming and connectivity, huge crowds or environments with many actors, realistic character models, etc.  More horsepower and better design will result in more of that, especially since the consoles have been showing their age.  But no-one thinks that horsepower alone makes for amazing games.  Even the least cynical would agree with his takedown of another Killzone for that reason.

        There was also the trashing of “social” and “gamer-focused” as buzzwords without acknowledging what Sony’s trying to do (and what hopefully Microsoft will do as well). The latest generation had social gaming pretty much bolted on, ruined developer relationships (especially for smaller developers), and has interfaces for online interaction and downloading that are downright terrible at times. If Sony is claiming they’re building the console from the ground up to be a positive experience for developers and users, instead of the nightmare the PS3 was at launch or the Xbox is now, then good on them.

        It’s all promises and marketing at this point, with a dash of hyperbole, but the marketing shows they’re in a much better place than they were at the launch of the PS3.  Now it’s holding them to it and seeing if they and the developers can make good on that promise.

        • Enkidum says:

          Thanks for the detailed reply. No way I’m going to watch the presentation, I just don’t care enough, but it’s good to hear a counter to John’s viewpoint that isn’t just “why don’t you suck Sony’s dick a little more NERD”.

    • GaryX says:

      I’m sort of half and half. It’s a really well-written article, and definitely spot on (the Cage takedown is sublime), but I’m not really as cynical about the whole thing as Teti. With new freedom comes new limitations. People like Cage are always going to make shitty ‘smart’ games, and there’s always going to be blockbusters. Further, these are always going to be the things talked about at a press event.

    • Well, I’m sure there are plenty of straight write-ups around on the net to satisfy those who want that take on the event. A healthy dose of cynicism with tongue in cheek isn’t the worst way to approach a soulless, calculated corporate event, IMO.

    • ApesMa says:

      I think you and Sony alike underestimate the problems the industry is headed towards. New systems keep getting less and less interesting with every generation, and most people are going to lose interest. This thing doesn’t seem to offer a single big new selling point to the average consumer. The so-called “hardcore” market isn’t big enough to keep the industry healthy. Meanwhile, production costs keep rising to meet graphical demands that most people don’t care that much about. Check out console sales from the 3D era, including handhelds. The most powerful console never wins. The industry is out of touch. I mean, just look at the Vita. Did they really think there was much demand for an overpowered PSP with a touchscreen? That’s not what people are looking for in a handheld!

      Nintendo are the only ones who seem to understand this, but they are busy making terrible decisions of their own, and still can’t get proper 3rd party support. I wonder where this industry is headed.

    • The_Juggernaut_Bitch says:

       Bullshit.  Game companies continually push bigger, faster, beefier consoles, but continue to deliver product that is, frankly, pretty much the same shit we’ve been playing for 25 years now.  Yeah, Halo 4 was shinier than Halo.  It’s still basically a clone of fucking Doom.  While there’s room at the table for FPS games, can no one do something *new* with this genre?

      • Annabelle says:

        Are you kidding? Have you never played a Valve game?

        • The_Juggernaut_Bitch says:

           Yeah.  Wanna tell me how Half-Life, in any way, significantly departs from being Doom, outside of graphics?  Silent Hero saves world from invading Space/Hell monsters.

          Counter-Strike? The damn game’s 13 years old now.

          Team Fortress?  Basically a variant of Counter-Strike, being a team-based FPS with configurable team roles.

          Day of Defeat? It’s basically Company of Heroes or Medal of Honor, done by Valve in their Half-Life engine.  This is a step down for Valve IMO.

          Portal?  Not really a FPS.  FP with some kinda-sorta shooting, but this is more of a physics puzzle game.  This one I will give a nod of “something new and different” to.  Good job, Valve, loved it!

          Left 4 Dead?  This is yet another team-based shooter with Zombies.  Replace Zombies with Demons and you have Doom multi-player.

          Alien Swarm?  This was already a mod for UT2K4.

          DotA?  Not exactly an FPS, so doesn’t really enter into my point here.  It’s also a WC3 mod, not an original game.

        • lokimotive says:

          @The_Juggernaut_Bitch:disqus I get the impression that you haven’t actually played Doom in awhile.

        • KidvanDanzig says:

          @The_Juggernaut_Bitch:disqus Look at all these uninspired games! Look at how easily distinguishable they are from one another! I am seemingly incapable of differentiating between titles if they contain standard conventions like a first-person perspective! I cannot escape Doom! DOOM!! Why must effective design persist in the marketplace whyyyy
          Srsly tho, Half-Life was distinctive in that, through its use of an extended interactive intro sequence and its heavy emphasis on in-game scripted encounters (watching people get headcrabbed, scientists running out to meet the military and getting shot), it popularized the sort of narrative-focused FPS experience that System Shock had created a cult for (the same year as Doom, that scourge of gaming). Those things were new and shocking at the time. Even System Shock told most of its stories through text and audio but in Half-Life things actually happened in real-time and that was insane. Of course scripted sequences are everywhere now, that’s what makes HL a seminal game, perhaps THE seminal game.

          In terms of systems it was quite sophisticated. It was the first game to have fluid levels, meaning instead of exploring a few floors of a building and moving to an exit point to get an end-of-level screen while the next set of floors loaded, environments were continuous – actors and objects persisted between map cells (if you tolerated the odd minute of waiting for buffering). If you run out of one level with a bullsquid chasing you, it’s still going to be there in the next level, and as a result the game took place on a larger scale; the environments had a solid permanence they previously lacked.

          That was a big, big deal in those days because it allowed developers to think outside the strictures of simple shooting galleries like Doom’s. You could have continuous, coherent architecture. You could have a sense of scale. It was more or less the beginnings of the mainstream gaming industry’s shift toward cinematic experience.

          So yeah, I know you’re being intentionally reductive, but you asked and I’m bored, so there it is.

    • Mike Ferraro says:

      People don’t want “more of the same with some added polish and shine”; that’s just the first thing they think of when someone says “we need a hit new product!” And investors will bankroll it because it’s a known quantity, and they certainly can’t think of a better (safer) idea.

      The things that people want tend to be unconventional changes that make perfect sense only in retrospect. The touchscreen internet telephone, a computer you can take to bed, the free-to-play model… you know that adage about Henry Ford saying, “If I had asked people what they wanted, they would have said faster horses”? A focus group might tell you they want a shooter just like Halo, but they’re not going to buy your Halo knock-off because they already have Halo. You show them the thing they didn’t know they wanted until they see it; you build them a car to replace their horse.

      That’s why Apple is eating the consoles’ lunch, and Sony’s response “Oh, we put a WiiU in your Vita, and an iTouch on the controller, and by the way guys we’ve got Facebooks and Steams too” is about as smart as them releasing a Smash Brothers copycat with their own characters.

      • KidvanDanzig says:

        Josh Sawyer (Fallout: New Vegas, etc) once said that one of the worst things you can do as a game designer is pay attention to what gamers say they want. It’s marginally more effective to pay attention to what they do.

  6. I never played Heavy Rain or any of Cage’s games, but the tech demo “Kara” video was just phenomenal on every front (except the voice actor for the inspector dude, who was very by-the-numbers). It’s a shame that the studio’s other work is, evidently, poop.

    • valondar says:

       Or that his analysis of film history is a bit glib. I’m sorry, what’s that, David Cage? There were limitations in conveying emotion in old silent films? I’m just going to leave The Passion of Joan of Arc over here if you don’t mind.

      It’s a shame, because David Cage’s games tend to sound like the kind of things I’d really, really want to play, and in general the interactive movie is a genre I’d like to see more entries in. But on the one hand they’re console exclusives (except for the ever-aging Fahrenheit), and on the other, reviewers are sharply divided as to whether or not his writing is worth a damn.

      • Yeah, I don’t know much about their writing/development process, but it makes me wonder about how viable the auteur approach to gamemaking is in a big-studio context (“more, more!”). It really sounds like this is just a dude who needs to get out of his own way.

        That said, hubris is kind of a necessary component of change/progress. After all, the only belief shared by all the great philosophers is that all the previous great philosophers were wrong. I guess it just needs to be tempered with a healthy dose of tastefulness and self-awareness, which seems to be where this Cage guy is going off the rails.

        • GaryX says:

          To be fair, I feel like the auteur approach to gamemaking is probably similar to filmmaking (which has its own massive numbers of human beings along with the ‘auteur’ behind it). We just have shittier auteurs in gamemaking.

        • KidvanDanzig says:

          There are some pretty good “auteurs” in game design, if you conceive of an auteur as a creator whose work is distinctly “of them” and tends to revolve around specific obsessions and themes. David Cronenberg is very clearly a film auteur; Roland Emmerich, not so much.

          The most obvious examples you’ll hear are probably Ken Levine and Chris Avellone. Levine is a little tougher to suss out, but you could say that through the Shocks there’s a focus on tension between creators and the created, parent and child, with one fighting to destroy the other (sorta a Frankenstein thing). In Avellone’s work, from Planescape: Torment to KOTOR2 to NWN2: Mask of the Betrayer to New Vegas: Lonesome Road, there’s a thematic focus on oaths, what they mean, and the misfortune they can bring when made hastily or carried on beyond their original contexts. There’s also a shared focus on lost and broken people who find themselves bound together (usually by supernatural forces surrounding the player character).

          There was little of that in Alpha Protocol, of course, but I’m expecting to see some of that in Project: Eternity. Avellone’s also unfortunately retired his other touchstone: Manifestations of Ravel the Hag from P:ST showing up in his games as an old blind woman. Kreia may be the last.

      • PugsMalone says:

        Things I learned from Heavy Rain:

        Psychologists can’t break confidentiality, even when they think that their patient is a serial killer.

        Schizophrenia is a synonym for dissociative identity disorder.

        It’s okay for newscasters to refer to suspects as “dangerous lunatics”.

        • Dwigt says:

          It’s because of the limitations of the PS3…
          These elements would definitely be fixed, should the game be ported to PS4.

        • Asinus says:

          “Schizophrenia is a synonym for dissociative identity disorder.” 

          Really? They did this? I can sort of excuse this in older media or casual conversation, but why is it so easy to NOT research technical terms being used? You’d think someone at the studio would say, “You know what? I’ll double check this on the internet.” 

          To me, it’s like explaining the Moon’s phases as shadows cast by the Earth. It’s something people hear and then repeat for their entire lives without ever looking into it. In a script for a major release, though, I expect some kind of professionalism.

        • PugsMalone says:

          Asinus: The first time they mentioned schizophrenia in the game, it was a layman who mentioned it, so I figured that they were setting up a scene where they described the difference, but no.

          And the psychologist who treats schizophrenia and DID as the same thing is played by Cage himself.

      • Girard says:

        I’ma let you finish, Passion of Joan of Arc, but the real champion in the “emotional art” competition is the game where you get to engage in extremely awkward, embarrassing basic-cable corpse sex (as the corpse)!

        • Asinus says:

          I had that game and gave it away  before getting very far. That is nightmare fuel. I just skipped around and landed on that kiss that looks like action figures making out. 

        • Girard says:

          I just realized my use of the word ‘corpse’ has a double meaning. Yeah, your dead-eyed, awkwardly-animated action-figure mechanically grinding on his love interest’s uncanny valley is rather corpse-like. But ALSO, one of the reasons he looks more grotesque in that scene than usual is that he actually died. And was resurrected. By (heretofore unseen) robots all of a sudden.

        • GaryX says:

          Gooooooood fuck that game. Fuck it so hard. So much fucking hate for it.

        • Asinus says:

          Oh… gross. 

        • Fyodor Douchetoevsky says:

          Oh man, such writing, oh wow. Isn’t this that game where it’s only QTEs and the writing/story is supposed to be the main attraction?

    • “Heavy Rain” was, aside from some abominable acting by French Canadians trying to play Americans, a remarkable gaming experience. If Cage brings out “Beyond: Two Souls” (oh look, a colon, but not a sequel!) as a PS4 exclusive, I will be extremely disappointed.

      My biggest concern with the non-announcement of the PS4 wasn’t so much about the graphics or power but about the seeming assumption that we all have unlimited high-speed internet and a desire to basically rent games from Sony rather than owning a copy that we can freely sell, trade, or keep indefinitely.

      • Fyodor Douchetoevsky says:

        This is like the one big thing I think consoles have over PCs. I love steam and stuff, but I really love having a physical collection of games that I actually own. Though even with the PS3 and 360 they’ve been adding stuff like multiplayer codes and things so that if you want to lend it to a friend, you’re lending out a gimped copy. 

        Of course it’s not surprising that they’re moving to a more controlled platform thing, but it’s super disappointing. I guess everyone wants to be just like Apple? Fuck that.

  7. Technological limitations certainly helped creativity in the Japanese game industry thrive in the 8- and 16-bit days. Those limitations also produced games that easily appealed across cultural lines. Lifting those limitations from console generation to generation effectively just gave Japanese developers more rope to hang themselves with. The more polygons, text boxes, and voice recording sessions that were crammed into a game from Square Enix or Konami, the greater the chance was that the resulting game would end up alienating large portions of the western audience.

    • Dikachu says:

      Also, with more polygons, Square was free to turn their RPGs into glorified fashion shows/teen soap operas.

  8. I don’t necessarily disagree, but I think it would be important to watch the event yourself and draw your own conclusions. It seemed to me that this article, while making plenty of valid and interesting points, exaggerates the problems.

    As for Cage, I happen to enjoy him and his games, but I agree that he certainly reflects film in most aspects. However, I do not think this is negative. Every medium can learn from that which came before in one way or another.

    • valondar says:

      I think there’s room enough for everything. One can argue that games can create a unique kind of storytelling that doesn’t really resemble movies, one that’s built into the process of playing the game. Something like Portal, say, builds your relationship with GLaDOS via tutorials and then begins to undermine it as the game becomes more challenging; that’s a kind of relationship with the player that a movie can’t have with a viewer.

      But fully cinematic experiences can make for some pretty riveting gameplay. One of the best games I played last year (and apparently a lot of other people) was the Walking Dead. Hell, NOVELISTIC experiences can make for some pretty riveting gameplay – there’s a lot of games that make the most of their text (again recent example: I really enjoyed Analogue: A Hate Story, which while it uses some anime-like art is predominantly a text game).

      It’s just a really flexible thing. I don’t buy Cage’s argument for a moment that somehow emotion is a feature lacking from current gaming – besides there being perfectly excellent facial work the last few years, there’s also been games with pretty affecting stories even without highfalutin graphics.

      I mean, you might call To The Moon – with its minimalist RPG Maker graphics – manipulative and saccharine and I wouldn’t disagree, but the entire thing is built to evoke an emotional response and it’s arguably worked for a great number of people. Shiny new polygons may offer more opportunities, but they’re not an essential feature of emotional gameplay.

      • ThoseEyebrows says:

        I definitely agree that games can learn things from cinema (and they certainly have), and that cinematic experiences in games can be very effective. 

        My problem with Cage is that he seems to view film not as a source of potential tools games can learn from, but as an aspirational model that games should be working towards. Here, by touting fancier polygons or whatever as the key to finally conveying emotion in games, he’s essentially saying that games are currently incapable of telling good stories because they aren’t enough like movies. Which is not only incredibly condescending and ignorant of all the many ways games have used interactivity to tell their stories, but also flat-out idiotic in light of the fact that the kinds of shallow, unsatisfying stories he’s complaining about tend to be the most cinematic games around.

        If he just said that he personally likes making particularly cinematic games, then that would be fine. But by presenting it as somehow the only way games can hope to achieve artistic legitimacy (because he assumes they haven’t already), he completely shortchanges the medium he chose to work in, and comes off as kind of an asshole.

        • chaos...reigns says:

          I think the bigger issue is that games resembling film is just an empty buzzword.  With any other cross-media inspiration, the artist usually explains to us what element from the other art form he was trying to borrow.  But with the video game industry, what sorts of films are developers trying to emulate?  What sort of cinema inspires them?  Until people like Cage can actually articulate that, film emulation is no more meaningful a term than “synergy” at any of these sorts of events.

  9. Kai Kennedy says:

    I was ambivalent, before… but now I’m really mad all of a sudden! Sony can go to hell!

  10. turdsalad says:

    Isn’t giving videogames emotions how Judgement Day happened?

  11. “The living room is no longer the center of the PlayStation ecosystem—the gamer is,” said Andrew House.


  12. Llodes says:

    I’m sorry, but this is embarrassing.  Gaming on consoles has actually been severely limited by technological restrictions.  In particular, it has been limited by console architecture that heavily favors graphics processing over other game elements.  

    This is a HUGE REASON why console games lean so heavily towards flashy, heavily-scripted shoot ’em ups, with a minimum of interaction or exploration.  The current generation isn’t really designed to handle persistent, expansive, detailed worlds, especially those with lots of moving pieces, and the limitations show.  Even in games like Skyrim or Far Cry, you can see the console creaking to keep up.  A more PC-like architecture allows game designers to create games with vastly larger environments, better AI, more dynamic worlds, and lots of other things that have the potential to move mainstream gaming out of its Call-of-Duty-shaped rut.  

    We don’t have to write navel-gazing articles about this: you can just directly compare PC gaming to console gaming.  And what you find, pretty consistently, over decades, is that the technological flexibility of PCs has resulted in a much wider range of game designs and a much higher degree of gameplay experimentation.  

    • Fist Beefchest says:

       Well said. While it’s obviously true that there are great games continually being made that will run on even the crappiest of machines, it’s also true that technological advancement has almost always ushered in entirely new genres of gameplay. It wasn’t just graphical limitations that prevented the Atari 2600 from being able to deal with something like Super Mario Bros or Zelda. The gameplay itself was also far more complex and absolutely required the extra power of the NES.

      Similarly, things like Mario Kart, Goldeneye and Grand Theft Auto 3 were all whole new kinds of games (at least from console owners’ perspectives) and would not have been possible on the previous generation’s tech. The current generation might be the only exception – I can’t think of any games on Xbox 360 that couldn’t have been done on Xbox 1 with the graphics settings turned down. This might be where the “Pfft, faster hardware will just mean the same games with better graphics” perception comes from.

      So while it’s unlikely, I hope the next generation does somehow open up new forms of gameplay. What I want is for them to build upon the Minecraft proof of concept and give me a super-dynamic GTA-type game where everything I do has a permanent effect. I want to have a huge gunfight with the cops and not have the carnage just disappear when I drive around the corner. I want to be able to plant C4 around the base of a skyscraper and knock it over, and then use the rubble to build my own house with a statue of myself made of burnt cop cars and a throne made from my enemies’ bodies. I suspect I’ll be waiting until Playstation 12 before this is possible, sadly.

      • nowimnothing says:

        ” What I want is for them to build upon the Minecraft proof of concept and give me a super-dynamic GTA-type game where everything I do has a permanent effect. I want to have a huge gunfight with the cops and not have the carnage just disappear when I drive around the corner. I want to be able to plant C4 around the base of a skyscraper and knock it over, and then use the rubble to build my own house with a statue of myself made of burnt cop cars and a throne made from my enemies’ bodies.”

        Fuck yes I would play the shit out of that. Give us the plot and mystery of Myst, the open world of GTA, the destructible environments of Red Faction, the creativity of Minecraft, maybe even the social aspect of WoW. All the elements seem to be there, why can’t we get this?
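        The “permanent effect” wish here is, at bottom, the trick Minecraft already uses: generate the world deterministically from a seed, then store only a sparse log of player edits. A minimal, entirely hypothetical Python sketch of that pattern (none of these names come from any real engine):

```python
import random

def generate_chunk(seed, cx, cy):
    """Deterministically generate a 4x4 terrain chunk from the world seed.
    Because generation is repeatable, the base world never needs saving."""
    rng = random.Random(f"{seed}:{cx}:{cy}")
    return [[rng.choice(["grass", "rock", "dirt"]) for _ in range(4)]
            for _ in range(4)]

class PersistentWorld:
    """World = deterministic base terrain + a sparse map of player edits."""

    def __init__(self, seed):
        self.seed = seed
        self.deltas = {}  # (cx, cy, x, y) -> tile the player changed

    def get_tile(self, cx, cy, x, y):
        # Player edits take priority over the generated terrain.
        if (cx, cy, x, y) in self.deltas:
            return self.deltas[(cx, cy, x, y)]
        return generate_chunk(self.seed, cx, cy)[y][x]

    def set_tile(self, cx, cy, x, y, value):
        self.deltas[(cx, cy, x, y)] = value  # the "permanent effect"

world = PersistentWorld(seed=42)
world.set_tile(0, 0, 1, 1, "rubble")  # knock the skyscraper over
assert world.get_tile(0, 0, 1, 1) == "rubble"  # still rubble later
```

        Storing only deltas keeps the save data proportional to what the player actually changed, rather than to the size of the world.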

        • Bad Horse says:

          Because it would be INCREDIBLY expensive. Getting that shit to work together isn’t additive, it’s exponential. I know – I work for a software company, and all our stuff does is crunch numbers and display charts and make database queries, and trying to get all the disparate elements working together is just brutal. The only way you’re going to see something like that is if old Cliffy B decides that Unreal Engine 5 needs to be able to handle that, and Epic spends millions and millions to get it there, and then devs will still have to pay an arm and a leg to modify it to fit their own needs.

        • Llodes says:

          Bad Horse: I don’t work for a software company or anything like that, so take this with a grain of salt, but I think you’re overstating the challenges involved. Look: Minecraft, which keeps getting brought up, was essentially developed by a couple of guys in a basement.  It has dramatically more moving parts than any Call of Duty ever.  It works fine.

          Now think about how much A-list shooters cost.  It’s dozens of millions of dollars.  They have teams with hundreds of people.  All to create a game with the interactivity of a roller coaster.

          At some point, dynamic or procedural systems actually become much easier to create than purely cinematic experiences.  That’s because asset creation never gets less expensive in the latter, even as you hit the point of diminishing marginal benefit.  That random half-exploded building in the background that you saw for three seconds?  It cost the studio as much to create as all the other buildings on the map.  Entire, amazing games are built for the cost of a handful of Call of Duty assets.   

        • Bad Horse says:

          It’s true that you could probably get something very dynamic out of present-day tech, if you’re willing to make some compromises. Minecraft is focused on producing a dynamic system, and it sacrifices visuals to do it. But Minecraft isn’t for everyone, and if you’re trying to do something like it to a AAA visual standard, you are going to run into serious problems. 

          What Sony was saying in the conference is that the tech, the visuals, are the thing that’s preventing those kinds of worlds from being created, which is why they’re getting panned here and elsewhere. Trying to continually improve visuals WHILE offering a more dynamic world AND better writing AND more emotional content is all well and good, but it’s going to result in ever-blossoming budgets, and in the end that’s not going to be good for anyone. Only the richest franchises, with the broadest appeal, will be able to use those budgets, so in the end there’s going to be less creativity, not more. Unless they really open up the platform and allow self-publication, which would be the best possible thing a next-gen console could do.

        • TheKingandIRobot says:

           Because there are vast groups of players in the world that exist solely to make sure you can’t have nice things.  Seriously, you could just call that game “Flat Expanse of Rubble: The Ganking” and be 100% right in about two hours after it launched.

        • Enkidum says:

          @Llodes:disqus The challenges definitely aren’t being overstated. If anything, they’re understated. Epic spending millions and millions to get there would only scratch the surface of the problem. 

          You can have procedural games, you can have games with stories. Just combining those two things together would be a brutal achievement, and adding in things like persistent damage and an open world? And not having massive glitches every 5 minutes? Not going to happen, at least not without a whole lot of R&D.

        • Llodes says:

          Enkidum: I have no doubt it would be tremendously expensive to implement in full.  But there are at least three reasons that’s not as big a deal as people say:

          1. A-list games are ALREADY tremendously expensive.  They’re orders of magnitude more expensive than indie games which try to do this sort of stuff.  (Remember when Notch was going to try to fund development of Psychonauts 2?  And then realized his Minecraft millions weren’t remotely sufficient for the task?  And that’s Psychonauts 2, not Gears of War or something.) To some extent, testing these games would be a matter of transferring resources that already exist, rather than dredging up new resources.

          2. Implementing more dynamic worlds and whatnot doesn’t mean you have to model everything in-game in some sort of Molyneux-esque flight of fancy.  Games like the original Crysis, released over half a decade ago, are dramatically more free-ranging and dynamic than the current crop of console-oriented dreck.  (On consoles, even freer-ranging games seem to boil down to moving around a static environment until you trigger scripted events.  I’m looking at you, Mass Effect.)  There’s a lot of room for partial improvement on this front.

          3.  The economics of development in this direction are positive.  It might require considerable fixed startup investment to create functioning dynamic systems (although, again, the technology for this already exists on PCs to some extent).  But once you have them, the process gets a lot easier.  Look again at Minecraft: once Notch had the basics down, he could add all sorts of content willy-nilly, with interesting and emergent effects.  The game deepened dramatically in proportion to the actual amount of labor involved.  Now compare that to the ultra-laborious process of designing a new Call of Duty level.  What, you already made 12 for a million dollars apiece?  Doesn’t matter, the 13th isn’t going to cost appreciably less.

          And it’s worth pointing out that more dynamic games seem to be simply more viable in the market, on average.  When they function as intended, games like Skyrim, Far Cry, Fallout, Dishonored, and GTA tend to make waves and be quite successful, glitches and all.  Meanwhile, for every smash-hit roller-coaster ride like Call of Duty, studios churn out half a dozen technically proficient, ultra-expensive clones, which hit the market and sink like a stone.  Do you remember anything about Homefront?  How much money did EA put into Battlefield’s single-player campaign?  How many people do you think played it?  It was an awful investment.
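          Point 3, the near-zero marginal cost of procedural content, is easy to see in code. A toy Python sketch (the generator and all its parameters are made up purely for illustration): once the generator exists, the 13th level, or the 1,000th, costs no more labor than the 1st.

```python
import random

def make_level(seed):
    """One fixed-cost generator; every additional level is nearly free."""
    rng = random.Random(seed)
    return {
        "rooms": rng.randint(5, 20),
        "enemies": [rng.choice(["grunt", "sniper", "brute"])
                    for _ in range(rng.randint(3, 10))],
        "loot_tier": rng.randint(1, 5),
    }

# Hand-authored levels cost roughly the same to build every single time;
# here, generating 1,000 distinct levels is one list comprehension.
levels = [make_level(seed) for seed in range(1000)]
assert len(levels) == 1000
assert make_level(7) == make_level(7)  # deterministic, so replayable
```

          (Contrast with a scripted shooter, where that 1,000th level would still need its own artists, designers, and testers.)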

    • feisto says:

      Just to push back a bit (and, full disclosure, I didn’t see the event, and I’m going on strictly what Teti wrote), I didn’t take the article as Teti saying, “Pssh, better specs, big deal,” but rather to point out that the event in general seemed to focus on the improved prowess of the PS4 in the abstract, without doing a solid job of explaining HOW that prowess would create better and newer gaming experiences (something that you seem to have done a much better job of doing than the event). That is, it wasn’t Teti who was saying technical improvements=more polygons; it was the event itself that was doing that.

      I think Teti would actually agree with you that technical improvements can give developers more freedom to create better and newer gaming experiences; but instead of showing that, the event focused on flashy, heavily-scripted demos that seemed to suggest, if anything, more of the same. And if anyone seems to be rolling his eyes, it’s Cage, sniffing at the current lame technology that doesn’t allow him to fulfill his “artistic visions,” as if technology were to blame for whatever flaws his previous games may have had.

      In fact, Teti sums up his point quite nicely here: “Expanding the technological capabilities of our game machines is not inherently bad, but treating new tech as a magic bullet is a self-destructive delusion (if a familiar one).” The PS4 is more powerful; great. Now show us what that actually means.

      • LetoII says:

        Exactly. What baffles me is how these presentations are apparently being done for gamers, and yet gamers are the ones who have been complaining about the overemphasis on graphical power for what seems like decades. 

        When I think of people wowed by super-high-def graphical capabilities, I do not think of gamers. I think of just random slack-jawed yokels who drool over any kind of slop they see on TV at Wal-Mart. I guess in the minds of Sony execs, those are “gamers.” But yeah, Minecraft has been brought up numerous times. Let’s talk more about that, because I think it’s telling.

        • GaryX says:

          “When I think of people wowed by super-high-def graphical capabilities, I do not think of gamers.”

          This must be the only gaming-related website you visit.

          You are oh so lucky.

        • LetoII says:

          Actually, you are totally right. This is the only gaming site I visit, which gives me a kind of ivory tower sort of view on the whole thing.

        • I assumed that events like these were meant to spike investor interest in the face of soft quarterly results.

        • SaoirseRonanTheAccuser says:

          Check out some other gaming sites. There are seemingly immense swaths of gamers that care for almost nothing OTHER THAN graphics.

    • ComradeQuestions says:

      I don’t think he’s arguing that technical limitations are irrelevant, just that they shouldn’t be blamed for shitty games.  There’s a reason Heavy Rain gets criticized with the phrase “press x to Jason” — it’s got nothing to do with the technical limitations and everything to do with the gameplay itself.

      And I don’t really see how the new system narrows the gap between consoles and the PC, especially in terms of flexibility.  PCs can still have their components upgraded; the PS4 can’t.  In 5 years, the latest Far Cry and Skyrim will run great on PCs and poorly on consoles.  I don’t really see that changing.

      • Tyler Mills says:

        This. If you ask the question “What is the greatest limiting factor in creating good games?” I think you will find the answer is the creativity of the humans making said games. Consider some of the other possible answers: Budget? Time? Hardware? Players’ expectations?

        We’ve seen games with huge, multimillion-dollar budgets. We’ve seen games that have been in development for 10 years. We’ve seen bleeding-edge games, in whatever era that happened to be. And the people who buy games run the spectrum from those who buy low-budget, artsy indie games designed to subvert expectations all the way to the kind of people who buy every new rehash of COD. No, I think the greatest single limiting factor is human creativity.

        Looking again at hardware as a limiting factor: I don’t think that has generally been the case in the history of game development. Rather, I see a trend of human creativity rushing to close the gap created by technology advancing way out of control. The early PS1 and N64 era is, I think, a perfect example of this. Developers were scratching their heads, saying “Polygons? 3D environments? What the heck do we do with all this stuff? How do we turn this into a game?” And a lot of games from that era have a sort of awkwardness to them that attests to that sentiment. Some just happened to sort of find their way. I’ve heard several people refer to it as a sort of age of 3D adolescence.

        Yes, of course better hardware can create more freedom for developers to use, but that in no way guarantees they will know how to best use that freedom, nor does it guarantee that there even is a better way to use that freedom.

        • Phillip Collector says:

          “What is the greatest limiting factor in creating good games? I think you will find the answer is the creativity of the humans making said games.”

          I agree with this although I also think it’s the market that limits creativity. It’s the market that pressures developers to Call of Dutyify their games in the hopes of massive sales numbers.

        • Girard says:

          Sometimes I think the most exciting thing about these kind of hardware announcements is thinking about the awesome games independent and small studios will be able to make for it in 10 years’ time when it’s on the cusp of obsolescence.

        • Joshua Pfeiffer says:

          I would argue that Publisher interference is the single greatest hurdle to developing creative games. The huge AAA devs all have plenty of great ideas for games… but more often than not, said creative, original ideas get shot down by the publishing overlords, in favor of dumbing things down for the biggest user draw.

        • Tyler Mills says:

          @ Joshua Pfeiffer

          Good point, I hadn’t thought of it that way. I would still argue that creativity is the biggest bottleneck, though, regardless of the why, whether it be iron-fisted publishers or lazy developers.

      • Llodes says:

        Fair enough on the first point.  As far as the second goes: PC “flexibility” doesn’t just come from the ability to evolve over time.  PC hardware actually emphasizes different things than consoles.  Console hardware is comparatively deficient in things like hard drive space and RAM, and is heavily geared towards graphics processing.

        Predictably enough, this means intra-generation console technological advancement tends to emphasize bigger and better graphics.  Other sorts of game innovation become undernourished.

        A more PC-like console architecture doesn’t prevent PCs from eventually outstripping consoles, it just gives developers a wider range of choices when making games.

        • Bad Horse says:

          I think what Sony is trying to address is bad ports. The PS3 was notoriously difficult to program, and even late in the generation you have multi-platform releases on PS3 that substantially underperform their Xbox counterparts despite running on supposedly more powerful hardware. I’m looking at you, Skyrim. You too, XCOM, even though I love you to death.

        • The Guilty Party says:

          … more RAM does not really give you a whole bunch of innovative flexibility you formerly lacked. Neither does more hard drive space. They just let you lay prettier textures on your polygons and let you see more of them at once.

          The limitation you’re looking for is User Interface. You can do a lot more things with a keyboard and mouse than you can with a controller, simply because the keyboard and mouse give you higher levels of precision and more inputs.

          There are things that feel way better on a controller too, so it’s not like PCs are superior, they’re just built for different types of interaction.

          The point being, nothing about the PS4 says ‘this will let you do new things’. It just says ‘this will let you push more polygons’ (and make easier ports, as Bad Horse pointed out).

        • Enkidum says:

          @The_Guilty_Party:disqus “More RAM […and…] more hard drive space […] just let you lay prettier textures on your polygons and let you see more of them at once.”
          Sorry, but that’s just false, and almost deliberately misses @Llodes:disqus’s point, which s/he’s been pretty clear about in several posts.

          Graphics cards give you better graphics, and only that (well, they also let you find ET, but that’s another story). RAM and hard drive space are necessary in some respects for flashier graphics. But they’re also necessary for everything else involved in the game.  

          You can certainly develop creative games within the limitations of modern consoles. But you cannot develop extremely complex creative games. There’s a reason why, for example, Crusader Kings II could never run on an Xbox or PS3, and it’s not because of the straight-from-2002 cutting-edge graphics. It’s because it’s constantly processing massive quantities of information, for which you need RAM.
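          To put rough, hypothetical numbers on that: simulation state is plain data, and pairwise relationships alone grow quadratically. None of the figures below come from an actual game; they’re a back-of-envelope sketch in Python:

```python
# Hypothetical grand-strategy bookkeeping: a directed "opinion" score
# between every pair of tracked characters scales as N squared.
N_CHARACTERS = 10_000       # nobles, courtiers, councillors, etc.
BYTES_PER_OPINION = 8       # one double per directed relationship

opinion_matrix_bytes = N_CHARACTERS ** 2 * BYTES_PER_OPINION

# 10,000^2 * 8 bytes = 800,000,000 bytes, roughly 0.75 GiB. The PS3
# shipped with 256 MB of main system RAM, so this one table already
# wouldn't fit, before the game stores anything else at all.
PS3_MAIN_RAM_BYTES = 256 * 2**20
assert opinion_matrix_bytes > PS3_MAIN_RAM_BYTES
```

          (Real games use sparse storage and other tricks, of course, but the point stands: this load lands on RAM and the CPU, not the graphics card.)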

        • Llodes says:

          Enkidum: Yep, I was actually specifically thinking of CKII when I wrote these comments.  My ancient laptop could never run Battlefield 3, but it handles CKII just fine. (Well, as long as I turn off trees.)  It’s a whole new frontier, gameplay-wise, which consoles have no experience with, because they’re built for guns and flash.

          In retrospect, I think my original post might have been a little harsh, but honestly, I was annoyed: the article seems to have been written by someone who is A. unaware that games like CKII exist, and B. unaware that console technical limitations are a large part of why they’re not available to mainstream gamers.  Those are two pretty key facts when you’re evaluating the next console generation.  

      • GaryX says:

        I think it’ll only change in that now the goalposts have moved. Far Cry and Skyrim run great on PCs and less so on consoles, but the PC version is still beholden to where the console tethered it: you can only design the game world and mechanics up to a point, and from there the major adjustment is in draw distance and textures. Things can scale vertically across platforms, but not really horizontally. This didn’t use to be the case, but now that all games are pretty much the same across all platforms, it seems to be the way it goes. Now that we’re coming into a new generation of consoles, the paradigms of game design can open up because, essentially, the minimum requirements have been beefed up. Eventually, yes, the latest versions of Far Cry and Skyrim will run oh so much better on PCs, but I’m willing to bet it’ll be purely a case of them technically running better, not having better systems (obvs I’m excluding mods from this entire conversation).

    • Captain Internet says:

      I think you’re both right.

      The two key messages in the Sony press conference were ‘more social’ and ‘prettier’. Check out our Twitter button! Look at the upholstery! And yet the games on display featured the same tedious mix of warring “factions”, dystopian futures, fictional assault rifles and fast cars that we’ve been presented with for the last fifteen years.

      The key innovation was that after a man shot some dudes in Killzone: SpaceMall he could put the video on YouTube without getting out of his chair. Progress!

      But you’re also completely correct that there are more ways to use the extra RAM, processing power, storage capacity and connectivity than just adding polygons and updating people’s Facebook status upon each successful headshot, although this is a failure of imagination on Sony’s part as much as anyone else’s.

      For what it’s worth, I don’t think games need to get any prettier. This was highlighted for me by David Cage’s highly detailed real-time rendering of a tramp. I mean, what’s the point of putting resources into that kind of thing if it’s still going to look like a computer model? I’m hoping developers will start thinking along those lines and focus on increasing player agency and AI instead of just making big stompy robots even bigger and more stompy.

      Still, I think the real innovation is going to stay on PC for some time, simply because the Sony console is not going to be as open a platform. The most important limitation I think is not the computing power of the device, but how easy it is for people to even access that power to begin with.

      • Fluka says:

        Yep yep, agreed on all counts.  PC innovation is about openness, not about graphical power.  I don’t care if you can detail every drop of blood exploding out of a stabbed guy’s neck at E3.  It’s still the same old neck-stabbing E3.

        Also, I just had a wonderful mental image of PS4 presentation as Lord Flasheart:

        “Hey girls!  Look at my machinery!”

        • Colliewest says:

          Hey, just because this PS4’s packing the kind of polygons you’d normally find on a grand national winner doesn’t mean it’s not full of emotion.

        • uselessyss says:

          To be fair, a lot of the PS4 conference was about how the console will be more like a PC, developers will be able to self-publish, and the whole system will be more open to indie devs.

          Hardware is never the reason we see interesting games. We look back on the PS1 and PS2 and think, “Man, games were more creative then,” but they really weren’t. The only reason we got games like Katamari Damacy or Ape Escape on those consoles was because they were so damn popular. For every innovative, creative game idea, there were 100 crappy, unfinished knock-offs. Because the number of PS1 and PS2 games was so huge, though, we filter a lot of that out.

          I think the PS4 is encouraging just because it seems like it could be easier for those creative games to come through than on the PS3.

        • Fluka says:

          @uselessyss:disqus That’s a fair point, and it’s very good to hear that one byproduct of the new PS4 will be greater openness to independent development.  Bringing Jonathan Blow to speak at such a huge event is a nice sign, too.

          That said, I very much liked Rock Paper Shotgun’s take on the PS4 reveal.  To paraphrase: “NEW PS4 A PC: PC Gaming Declares Victory!”

        • Llodes says:

          I think part of what people are missing is that PC “openness” is largely a consequence of PC hardware architecture.  “Better specs” does not automatically translate into “pursuit of graphical power.”  Console games seem closed because it’s technically infeasible to even make open games without, for starters, a real hard drive and more RAM.

    • The_Misanthrope says:

       I’m all for a more PCesque build for the PS4, especially if it allows for more persistent experiences and the ability to mod games.  But it’s all just talk right now and the games on display don’t seem to back it up.  Meanwhile, I could just build my own supercharged PC and not have to rely on idle speculation.

      • wzzzzd says:

        Note that it’s a more PC-like hardware architecture, not user experience. You still won’t be able to mod your games or use whatever the hell peripherals you like.

        It just means that porting games from one basically-a-pc-internally to another basically-a-pc-internally will be more straightforward.

    • Bad Horse says:

      The problem has less to do with tech limitations and more to do with economic ones. In order to unlock all that potential you need more dev time, which means more budget, which means you have to move more units (or sell it for a higher price, which nobody does anymore). Only proven blockbusters will be able to justify those budgets, so you’ll just get prettier CoD and GTA – I can’t think of any other franchises that can actually be profitable on a $100M+ budget, and as much as I’d love to see an incredibly advanced version of my beloved Mass Effect, it’s probably not going to happen. If anything, technical advancements are going to result in less diversity and imagination in AAA gaming, and last night’s event put that on display. What do we get for all this technical advancement? More Killzone.

    • WinterFritz says:

       It opens up possibilities to create better experiences, but it doesn’t mean the experiences WILL be better. Shitty gamecrafting is still going to be shitty, and shitty stories are still going to be awful. Higher technology doesn’t magically give weight to whatever art is produced for it, and I believe that’s what Teti is saying here: he’s refuting Sony’s notion that adding in more transistors automatically makes the games better. It’s not Teti’s job to emphasize the new world-building capabilities of the PS4; it’s Sony’s, and they did a piss-poor job of it aside from Jonathan Blow.

    • midfield says:

      although i generally agree with john teti’s critique, and not with Llodes’s lament, i would like to put on my programmer hat and point out that there is a real difference for developers in targeting a PS3 versus a PS4.  the fact that the PS4 is very close to a standard PC in hardware makes a huge difference.  the Cell processor that the PS3 used is a marvelous piece of engineering, but hell to program for.

      so it is not just a leap in pure processing power that’s the story here — there’s a change in architecture (for the better, from the programmer’s point of view.)

      as a byproduct of both the PS4 and the next Xbox probably being PC-like, i hope to see fewer console-only games.  but this is probably not going to happen.

      • Enkidum says:

        Hey, if you know something specific about this topic, I’d be very curious about what made the Cell processor so hard to program for. I can understand that half a gig of RAM places crazy limitations on you, but didn’t realize the processor itself was a problem.

        • midfield says:

          the cell processor has a single simplified powerPC core, coupled with 8 “synergistic processing elements” (SPEs.)  the main core acts mostly like a normal processor, but it is not powerful, so to get any performance out of the system, you need to use the SPEs.  they are not like normal CPUs at all — they don’t do out-of-order execution or branch prediction, there’s no cache, etc.  they are sort of like DSPs or mini GPUs.  programming DSPs or GPUs is difficult, at best, and on the cell you have eight (!!) of them, connected by a cool but weird memory bus.  the SPEs are the processor equivalent of idiot savants, and you’ll have to do a bit of cat-herding to get them to go.

          there was a lot of noise during the cell launch that sophisticated compiler technology would solve the programmability problem.  i’m not saying they didn’t do great stuff, but it is an insanely hard problem.  you can find more information here:
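          to make the pain concrete, here’s a toy sketch (plain python, not real Cell code; the actual SDK was C with explicit DMA intrinsics, and real SPEs had a 256 KB local store) of the programming model: an SPE can only touch its small local store, so the programmer has to stage every chunk of data in and out by hand, where a normal cached core just reads memory.

```python
# toy illustration only: simulates the SPE constraint that all work must
# happen in a small local store, with explicit copies standing in for DMA.

LOCAL_STORE = 256  # artificially tiny "local store", in elements

def scale_on_ppe(data, factor):
    # conventional core: just read main memory through the cache
    return [x * factor for x in data]

def scale_on_spe(data, factor):
    # SPE-style core: explicitly pull each chunk into the local store,
    # compute there, then push the results back out
    out = []
    for start in range(0, len(data), LOCAL_STORE):
        local = data[start:start + LOCAL_STORE]   # "DMA in"
        local = [x * factor for x in local]       # compute locally
        out.extend(local)                         # "DMA out"
    return out
```

          both functions compute the same thing; the second just forces the programmer to manage data movement by hand. now imagine doing that across eight SPEs sharing one bus, with double-buffering so the transfers overlap the compute, and you can see why compilers couldn’t just make the problem go away.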

  13. Destroy Him My Robots says:

    For one crazy moment there, I thought we were going to see the official unveiling of Titan. Would’ve been glorious, but I guess “I’m a loyal D3 player and now I think it’s awful and for console peasants!” fanboy tears are an okay consolation prize.

    • valondar says:

      There are still loyal D3 players? I thought the internet had unanimously agreed that Torchlight II is better.

      • wzzzzd says:

        Honestly the only thing Torchlight 2 has over D3 is that it has single player.

        • The Guilty Party says:

          Wuh? I play D3 by myself all the time. Or I did, then I got bored because they really screwed the pooch with the loot-dropping portion of their game, which is kind of the whole point.

        • valondar says:

          @The_Guilty_Party:disqus You have to be online to play Diablo 3’s single player.

          This ridiculous innovation is absent from Torchlight 2… and the game’s better handling of loot is considered to be one of its strong points (but then basically the entire team behind the first two Diablo games was fired by Blizzard and then went off to make the legally distinct Torchlight series so… yeah).

  14. Dikachu says:

    This article is a little melodramatic… of COURSE they hyped their system as god’s gift to the gaming world, and the previous systems are all total shit and were basically responsible for the holocaust.  That’s what they do EVERY time.  All of these big companies, regardless of their industry, employ the same exact “embarrassing dad”-like morons in their marketing departments.

    For me, the key here is… what does the PS4 give me that the PS3 doesn’t, BESIDES what is essentially faster hardware and a few new gimmicks (touchpad? who gives a flying fuck?)?  I bought a PS2 mostly because I wanted a DVD player, and wound up buying PS2 games too.  I bought the PS3 because it was the best (and cheapest) blu-ray player on the market for a long time, and of course I’ve bought a bunch of PS3 games since.

    If it’s essentially just a fancier PS3, then I don’t see how Sony’s gonna regain any ground against the competition.  I certainly won’t buy one right away.

    • We won’t know for certain until we see Microsoft’s new system, but I expect that they’ll start off on pretty equal footing. If either the PS4 or the NeXtBox has any dramatically distinctive features, it’s being kept very close to the chest.

      If the hardware is pretty much interchangeable, then the key is exclusive software. Neither console-maker is going to be able to rely on its inherent novelty to move units at launch; they need a “Halo” or a “Last Guardian”. 

    • God — IS that a touchpad on the controller? What is wrong with you, Sony?

      • ApesMa says:

        Why not? It’s more practical for menus and browsers, it’s not in the way and maybe someone has other uses for it.

      • Bad Horse says:

        Yeah, I’m not 100% against it. I doubt it will see that much use in actual games past the first year or two (hi there, Sixaxis), but it should make some menus more user-friendly, as @ApesMa:disqus says.

      • Merve says:

        Well, at least you can’t say that Sony is…out of touch!


    • SamPlays says:

      I bought a PS2 to update my Genesis, but the reliable DVD player was a major incentive. Same goes for the PS3. In both cases, I waited at least 3-4 years into the cycle before plopping $300 on the system. I’ve never understood how people can pay $60 for a game. I’m a patient man. I’ll wait for a catalogue to emerge and buy discount or used.

      • Dikachu says:

        I occasionally buy games at launch for $60, but it’s pretty rare.  It has to be a game I’m desperate to play.  Most of the time I wait for them to fall to $30, which I think is a reasonable price.

        • SamPlays says:

          $30 seems reasonable to me, but I tend to find most things of interest end up down at the $20 mark within 1-2 years of release. There are many games I anticipate but, like I said, I’m patient. It’s not like I’m a 12-year-old who has to keep up with my friends. Between my local grocery store, Costco and Wal-Mart/Futureshop/Best Buy, it’s pretty easy to find sales. I understand that games are very expensive productions, but $60 seems particularly overpriced. I’d like to think that prices will go down with more content becoming available digitally, but that hasn’t been the case with e-books, as an example.

        • @SamPlays:disqus : Back in the cartridge days, the high retail cost was due to the manufacturing process itself. Every game was full of chips and circuits.

          Now, of course, the actual manufacturing costs of a game are negligible. Retail price is mostly development, overhead, and profit. Digital delivery has a minimal effect on those elements.

        • uselessyss says:

          @SamPlays:disqus I think the market has grown so much that games don’t have to be $60, but it’s interesting to look at how expensive games used to be back in the day.

          Games in the 16-bit era often cost around $75 or even $80 sometimes, which seems crazy considering how much games have increased in size and scope since then.

        • beema says:

          The problem is that a lot of games that should cost less, and would probably sell better at less, are forced into the $60 price tag because that is the accepted standard. Even with something like Call of Duty, I bet they could release a multiplayer-only version for $35 and it would sell even better than it’s selling now. There are just so many possibilities for growth and betterment when you break away from the rigid price point.

    • Moonside_Malcontent says:

      Having watched this conference, I’m sad (but I wouldn’t say disappointed) that video-game media seems to be taking its buzzword salad from the Thomas Friedman cafe.  Social media is the future of the world that is flat in web 2.0!  Real-time crowdsourced person-centric Twittercast!  Blog with targeted advertising in the you-oriented Facebook game! India and China! India and China!

    • Effigy_Power says:

      True, but I don’t think the fact that they do the same theatrics every time they reveal something means that critics should stop mentioning it.
      Maturing of the gaming industry involves more than just plots and tits; it should include outgrowing the ridiculous, history-resetting marketing campaigns.
      Daikatana’s “Suck it” campaign isn’t that long ago.

    • GaryX says:

      “If it’s essentially just a fancier PS3, then I don’t see how Sony’s gonna regain any ground against the competition.  I certainly won’t buy one right away.”

      Well, unless Microsoft dicks it up with the required online connection, the required Kinect (which also will supposedly be used to verify that the person watching a movie/playing a game is the one who downloaded it), the no used games, etc etc etc.

      Then the PS4 will basically look like the only not-PC alternative to me.

      • The Guilty Party says:

        “no used games, etc, etc” was what people were saying about the PS4. I suspect they won’t do anything so obviously stupid either.

        My main issue with the playstation series is that fucking controller. 15 years ago it was a wonderful change for the better, compared to what we had before. These days it’s an uncomfortable piece of garbage. Both thumbsticks still way down where it’s totally inconvenient to reach, eh? Pass.

        • GaryX says:

          I hope they won’t, but they were reporting those things about the Xbox even up to today (Kotaku, who are the catty gossipers of this whole thing, are claiming their sources still say Durango will be always-online required).

          Agreed on the controller, though I’ve never outright hated it. Wouldn’t have minded seeing the stick/d-pad swapped, though.

        • Dikachu says:

          I dunno, I actually like the PS controller… it feels good in the hands.  My only real problem with them is when you have to maintain a “slow walk” it’s really damn difficult to get the stick to hold at a slight tilt.

  15. Dikachu says:

    Also, I’m a little disappointed with the name “PlayStation 4”… it’s getting kinda old.  I realize they have tons of brand recognition, and it’s better than pulling an Apple and calling it something fucktarded like “the new Playstation”, but still…

    •  I was hoping they’d go with “PlayStation Gaiden”.

    • Naked Man Holding A Fudgesicle says:

      Lots of superior naming options as an alternative to PS4 are available:


      Playstation: Resurrection

      Playstation: The Revenge

      Playstation 4: Citizens on Patrol

    • valondar says:

       Honestly, given alternatives have had names like ‘Xbox 360’ (I only just found out why, thanks google) or ‘Wii’, I’m pretty okay with Sony just adding a number every six years.

      • Girard says:

        I wonder if Nintendo is going to keep adding seemingly random letters to its console names until, several generations down the pipe, we’ll all be playing the “Nintendo Waluigi”

        • Tyler Mills says:

          I kinda figured they would just keep appending personal pronouns ’till it becomes something along the lines of:


        • ApesMa says:

          Since I always end up buying the things I hope the next one has a non-embarrassing name. Fat chance, I know.

          I wonder if they might just call it Nintendo 7. Would be nice and simple and also give them a numeral edge over the PS5, for what that’s worth (evidently enough for MS to go with 360 over Xbox 2).

        • Girard says:

          Judging by both Final Fantasy and Microsoft Windows, ‘7’ is the point in a series where you just cut the bullshit and start numbering things normally – so it could happen!

        • ApesMa says:

          Yeah, that’s what I was counting on. It’s the magic number.

        • Tyler Mills says:

          @ ApesMa

          Oooh ooh, I can see that too. Let’s not forget Mario Kart 7, baby.

      • Dikachu says:

        Xbox 360 seemed alright to me, as it implies an “all around” experience.  Xbox 720 is kind of retarded though, since it’s, what, all-around twice?  People are gonna start getting dizzy.

        Wii was probably the worst console name I’d ever heard… I thought it was a joke at first.  But then came “Wii U” and I just kinda sighed.

        • HobbesMkii says:

          How dare you mock the Nintendo Piss University!

        • TheKingandIRobot says:

           It was designed to evoke rad skateboarding moves in people’s minds.  As they continue to explore cross-branding and licensing options, they become increasingly likely to rename the new one Xbox: Double Pits to Chesty

        • Electric Dragon says:

          Elementary mathematics would suggest XBox 1:XBox 360::XBox 360:XBox 129600

    • ShrikeTheAvatar says:

      I like it.  Everyone’s been calling it that for the last couple years while talking about its hypothetical existence, and they’re taking advantage of the fact that the brand is already out there.

      Better than what Xbox is going to have to deal with – everyone’s been calling it ‘Xbox 720,’ which is obviously a terrible name for a console and will absolutely not be used for real.

    • TheKingandIRobot says:

       They should follow in various video game traditions on this one, not movie traditions:

      Playstation Ex+ Alpha
      Playstation III-2
      Playstation SD
      Super Playstation
      Playstation: Honor of Duty of Warriors of Code
      Playstation Damacy

    • GaryX says:

      I saw Kotaku suggest “the Playstation U” (seriously, I think) and laughed and laughed. Those fucking guys.

  16. Naked Man Holding A Fudgesicle says:

    We at Sony want a playstation with attitude. It’s edgy, it’s “in your face.” You’ve heard the expression, “let’s get busy”? Well, this is a playstation that gets “biz-zay!” Consistently and thoroughly.

    • I hope the console design will feature a backwards baseball cap and cool shades.

    • TheKingandIRobot says:

       Post Poochy episode quote.  Collect likes.

    • Fyodor Douchetoevsky says:

      “Oh, God, yes. We’re talking about a totally outrageous paradigm.”

      This is all that I think of when I hear these kinds of press conferences. 

      • George_Liquor says:

        “The rest of you writers start thinking up a name for this funky console; I dunno, something along the line of say… Playstation, only more proactive.”

        “So, Playstation okay with everybody?”

        I’m thoroughly convinced this conversation actually happened at Sony HQ.

        • ApesMa says:

          Wii and Wii U both sound like the kind of crap people come up with at the end of a very long meeting, and the others are too tired to immediately realize how terrible it is. Then they repeat it so many times that it loses all meaning. And then it has no more enemies. And then it’s won.

        • George_Liquor says:

          Wii U sounds to me like someone at Nintendo was running around making siren noises.

  17. Brian Stewart says:

    Cage’s point was important, and I’ll tell you why, because you were too busy being the clever defender of silent films to notice. Before Toy Story, CG animation sucked, and those little animated puppets were incapable of engaging the audience in their stories. If our goal is to tell emotional narratives with 3D characters, then yes, we need better 3D models, made by stronger platforms, that can deliver the subtleties of performance. If the PS4 has finally gotten us to the holy land of audience connection, then it is something to be celebrated. Can the same be done with 8-bit characters? It can, but that’s not what we’re talking about.

    • doyourealize says:

      Is this satire?

      • SamPlays says:

        I’m pretty sure Toy Story is great because of its direction and writing, not because of its polygons. It could have been done as a series of hand-drawn still shots and the story would still resonate.

        • The Guilty Party says:

          Yeah, exactly. My PC is more powerful than the PS4, and my polygons aren’t audience connecting me any better.

          Well told stories and well developed characters still do it just fine though.

        • Enkidum says:

          Well… I think you’re over-stating your case a tad there. Pixar are clearly obsessive about pushing the technical limits of computer graphics, and one clear thing this allows them is better emotional expressivity, both in body and facial movements/features.

        • SamPlays says:

          @Enkidum:disqus I can’t disagree with you – Pixar’s trade is clearly rooted in computer animation. My point was that you can push polygons and graphics technology as far as you like but it cannot replace the substance of story and ideas. Bad writing, bad acting and poor creative choices aren’t overcome by pretty pictures. The lasting impact of Toy Story, and many of Pixar’s early works, is that it’s engaging on an emotional level but I’m not convinced that comes from painstakingly-detailed character models. 

    • Girard says:

      Toy Story’s narrative strengths weren’t solely in the power of its renderfarm. And I wouldn’t be surprised if the real-time graphics of the PS3 are the product of more digital horsepower than the pre-rendered graphics of the first Toy Story film, so we’ve already crossed whatever imaginary hardware threshold you and Cage are positing is necessary for expressiveness in games, if Toy Story is your measuring stick.

      The PS4’s additional power will certainly open up new opportunities to truly inventive developers, both narratively and mechanically. I suspect Cage, based on the limited scope of his ambition (Finally! The PS4 will herald the era of FMV games rendered in real-time!), and exhibited lack of talent even within that limited scope (Despite his cinematic leanings, he’s a piss-poor screen-writer), will fail to capitalize on most of those opportunities.

      • valondar says:

         @paraclete_pizza:disqus @google-edb389ca8ffa86e18edc8cb7cd5a2e6b:disqus Hell, Quantic Dream made a point of saying their tech demo Kara was rendered in real time on the Playstation 3. If that’s the limitations to emotional performances that David Cage is bemoaning having to struggle against, I have no idea what it is he wants besides photorealism.

        • Girard says:

          I suspect David Cage is the only person on the planet who thinks “The Spirits Within” is the best entry in the Final Fantasy franchise.

      • Tyler Mills says:

        Girard’s reply to Brian was direct, well spoken, well thought out, and well mannered.

        OOOOOOOOOHHHHHHHHHHHHH SNAP!!!!!!!!!!!!!!!!!

      • Vaguely off-topic, but isn’t it fortunate that “Toy Story” was both technologically groundbreaking and a solid story? Disney could have easily let the film be a glorified 90 minute tech demo.

    • Cage probably would have done better to frame his thesis the way you have. The Uncanny Valley creepiness of The Polar Express for instance.

      • ApesMa says:

        Pixar movies feature cartoonish characters. When you try to create realistic looking humans, like most games nowadays do, you get the Uncanny Valley creepiness. This is why Cage, like Zemeckis, is doomed to fail.

        • But isn’t it implicit in the idea of the Uncanny Valley that at some point graphics could become real enough to pull back out of the valley? And so it’s worth the effort of continuing to push the boundaries in that direction?

        • ApesMa says:

          It’s a long, long way off. If Hollywood manages to do it 20 years from now or something then sure, go ahead, implement it in your game. Right now it’s the wrong area to spend resources on. Gameplay is supposed to be the focal point, not (cheesy) storytelling and CG “acting”. If they manage to tell a good story as well then that’s a nice bonus.

        • Sarapen says:

          It’s taken as an article of faith that eventually we’ll reach the Promised Land of complete fidelity but I would be amused if developers just give up and say, “Fuck it, these fleshbots are getting creepy” and we end up with an Uncanny Ravine.

        • rubi-kun says:

           @edwardsung:disqus Do we even want video games to get that far? Once you get past the Uncanny Valley, wouldn’t killing all those photorealistic enemies get rather creepy?

          • ApesMa says:

            Good point. Photorealistic depictions of people being shot in the face, blown to bits, etc. sound really disgusting.

            I prefer more cartoon-like and/or stylized graphics myself. Games can be just as dark and mature that way, but more distinct and visually appealing. Realism tends to make games look too dull and similar.

    • Electric Dragon says:

      If you believe that CGI animation sucked before Toy Story, you have obviously never seen Pixar’s earlier shorts. Pixar could get more emotion out of an anglepoise lamp than I’ve seen in many modern video game protagonists.

      • wzzzzd says:

        Or watched Toy Story since it first came out.

        That film looks really dated now.

        • GaryX says:


        • Bad Horse says:

          @GaryX:disqus It does “look” dated – the movements are relatively stiff and the textures are simplistic and Sid’s dog looks like he’s made of Lego. That’s not to say it’s a bad movie, though – Pixar is clearly able to get emotion out of whatever tech they had available to them at the time. Hence, again, David Cage is an ass.

        • GaryX says:

          @Bad_Horse:disqus I guess I had read that wrong. Willing to edit comment if we’re only talking about visuals.

    • Effigy_Power says:

      So… what you’re saying… is that Harold Lloyd… would have been funnier… if they had more polygons in the 20’s?

    • TheKingandIRobot says:

       I’m more or less on board with this, but that’s predominantly because Teti’s reaction to the Cage speech might as well have been an mp4 of a monocle falling into a teacup over and over again.  Even if he’s right, he’s coming across like the adult in a cereal commercial who totally doesn’t understand why Cinnamon Toast Crunch tastes so good.  Stop making me sympathize with the guy that made Heavy Rain!

      Yes yes we’re all fully aware you know how important The Great Train Robbery is.  You get 300 critic points. 

      • ApesMa says:

        That clown deserved everything he got and more. Guys like him are ruining video games. I recently clicked on a trailer for something called The Last of Us (I think) because I wanted to see what it was, and all I got was what looked like a trailer for a really shitty CGI movie and no gameplay footage whatsoever! I still have no idea what that game is or how it’s played.

      • ClementC says:

        “Teti’s reaction to the Cage speech might as well have been an mp4 of a monocle falling into a teacup over and over again”

        Why can’t I like this more than once?????

  18. Mat Newman says:

    Fantastic article, and thank you for calling out David Cage’s hubris. That was reported very matter-of-factly in the Guardian and made me want to scream.
    I’m excited about PS4, but if they think emotion is in the tech, they’re insane!

  19. LllusionX Mailed the Fission

    “yes kids, over 200 years ago in the year 2007, i beleive, a war broke out between 2 ancient “gods” as these textbooks seem to tell us. it goes on to say that each god had millions of pale faced, pustuled,fat ,”Ge-eks?” i think thats how it’s pronounced. anyway. these “Ge-ecks” fought over “2 systems” im guessing systems of government. each system had their own set amount of games that apparently proved their worth. one day, in the year 2008-2009 this war turned into an all out conflict as the “Ge-ecks” were seeing who’s system was better. as ther did not forsee, one of the greaters of the large boss companies released a relatively old technology today code named the “P-S-P-3000” an archaic war machine in the museum nex door. this new system helped out the main contender of sony’s army, the PS3 gain controll of the situation. until, microsoft unleashed a secret hand-held technology called the “X-Cube” by taping an Xbox1 to a “Gam-e-cube”. however this failed…and the next page is missing, i guess we’ll never know who won…. Alright class, time to sing the pledge of fanboyism, i give my money to sonintendosoft but all profits go to sony, for in the republic in which sony stands, 7 nations all made by our little big planet. “PS-F-T-W” we’ll still never know what any of that means.

    and thats the story of the fanboys wars.(i need to start writing books…) 

  20. PrincessKyotoSparkleChew says:

    Any word on whether PS4 will support MKV files? I use my PS3 for viewing media as much as gaming.

  21. Well written article Mr. Teti. Seems like a classic case of sound and fury.

  22. DrFlimFlam says:

    I am Mark Cerny, and I’m here to ask you a question. Is a man not
    entitled to the sweat of his brow? ‘No!’ says the man in Kyoto, ‘It
    belongs to the families.’ ‘No!’ says the man in the Cupertino, ‘It belongs to Jobs .’ ‘No!’ says the man in Redmond, ‘It belongs to everyone with a PC.’ I
    rejected those answers; instead, I chose something different. I chose
    the impossible. I chose… PlayStation 4, a console where the artist would not
    fear the limitations, where the programmer would not be bound by petty restrictions , Where the great would not be constrained by the small! And
    with the sweat of your brow, PlayStation 4 can become your console as well. 

  23. Captain Internet says:

    “Sony is essentially saying that it will mine your personal information to determine which stuff you’re most likely to buy, advertise only that stuff on your console, and act like it’s doing you a favor.”

    Targeted marketing is the reason that Facebook thinks I live in Norway.

    • Spacemonkey Mafia says:

      “Hey, Captain Internet!  Here are some great deals on Reindeer Meat you might be interested in!”

      • Captain Internet says:

        “Winter offer: order today and save big on Mirtazapine and Fluoxetine at!”

        edit: disqus

    • wzzzzd says:

      It probably suspects you don’t, though.

      • The Guilty Party says:

        Yeah, they keep asking me to verify that I’m really Baron von Munchausen and that I live in Oslo and was born in Timbuktu.

        They’re kinda paranoid.

    • His_Space_Holiness says:

      For quite a while after I first joined Facebook, its ad algorithm was under the impression that I was interested in gay dating websites. Apparently it could read my “Relationship Status: Single” entry but not my “Interested In: Women” entry. Then I got AdBlock and stopped caring what Facebook thought altogether.

    • Professor_Cuntburglar says:

       I got an ad for a super realistic Robocop costume once, and now I’m convinced that Facebook can read minds.

  24. Raging Bear says:

    I’ve been weirdly lukewarm on the idea of the Playstation 4. I knew it would involve LIKE WAY MORE GRAPHICS THAN EVER BEFORE, but that’s a given, and I’m at the point where I’d like developers to go for more creativity than fidelity (That “size of penis” line? genius), so it doesn’t do a lot for me anyway. Plus, the freakish über-connectivity actively turns me off somewhat.

    On the other hand, apparently there’s an inFamous sequel coming for it, so I’ll be getting one.

    • DrKumAndGo says:

      Actually, the inFamous sequel is what really irritates me about this. It sounds exactly like the sort of thing that could run perfectly fine on current-generation hardware, and I would be happy to play it if it didn’t require sinking $500 into a new toy. This feels more like a move to break backwards compatibility than anything else.

      • Professor_Cuntburglar says:

         It bothers me too, but mostly because Infamous 2 had sort of a sequel-proof ending, so I know they’re just cashing in on the Infamous brand instead of making anything new and interesting.

        And as much as I liked Infamous 2, I sort of had had enough Infamous for a while after I finished it.

    • GaryX says:

      I think what you’re saying is what we’re all thinking in this age of few first-party games: “Looks cool. Now show me some real shit playing on it.”

    • George_Liquor says:

      PC games are almost all console ports these days, so I’m looking forward to it finally advancing the damned PC state of the art.

  25. Michael says:

    I love silent films as much as the next film buff, but David Cage is right.  Yes, there were some talented artists who did the best they could possibly do with the limitations of that time and they made some great art in the process (much as the NES was able to make some great games in its time), but there was a limit to what could be accomplished.  If anyone seriously thinks cinema would have been better served if no one had ever bothered to invent sound, color, or consistent frame-rates, they’re nuts.  Similarly, I’m baffled by the number of people who apparently love games but also seem to want to see graphical power capped for some reason.

    It’s not Sony’s job to make game designers suddenly become more artistic or creative; that’s the marketplace’s job.  It’s Sony’s job to give them as many tools as possible in order to make whatever kind of games they want, and that’s exactly what they promised to do last night.  If people want to make another over-hyped puzzle game and call it “innovative,” as Jonathan Blow apparently wants to, they can go right ahead.  If they want to see triple-A games become quadruple-A games, and I sure as hell do, then Sony’s got us covered.

    • valondar says:

       It’s less the idea that innovation is bad and more the idea that somehow innovation is going to replace something lacking.

      Do I think that better graphics are a great idea? Totally. But do I think that games right now are UNABLE TO CONVEY EMOTION? That apparently only when the PS4 comes around will we be able to render subtle performances?

      There’s really two objections here which are implied by my two questions. The first is obviously games have been able to convey emotion for much longer and in many different kinds of graphical quality… but the second is that the technical horizon that Cage posits has already passed.

      Take any of the really big-budget, high-tech triple-A games of last year. Like, say, Battlefield 3. Sure, Battlefield 3 may be an empty, soulless extended tech demo – but is there anything MECHANICALLY holding it back from delivering a vivid and personal story with nuanced performances? We’ve seen excellent real-time cinematic work from dozens of titles in 2012 at a technical level; what they lacked in maturity was an issue of the kinds of stories they wanted to tell and the kinds of performances they got from their VAs and their models.

      So when David Cage suggests that all we need to jump into an age of emotive storytelling is even shinier technology, he sounds off-base. The issue isn’t the tech, it’s the use of it.

      • Michael says:

         There’s no doubt that the current generation is capable of telling cinematic stories to some degree, much as a silent film is perfectly capable of telling certain stories in certain ways.  I’m sure Cage (if not everyone else) is happy with what he was able to accomplish with Heavy Rain, but let’s not act like tech can’t up the ante.

        For instance, the stories for Final Fantasy 6 and Final Fantasy 7 are probably roughly equivalent on paper.  Hell, most people would argue that FF6 has a better story in many ways, but no one was crying over anything in that game the way they were over the death of Aeris in FF7.

        I would also be careful about suggesting that the graphical leap this generation isn’t going to be that great.  I remember back when people were complaining at the 360 launch that it “didn’t look that different from the Xbox.”  Eight years later, I doubt any of them would really want to go back a generation.  As was once said in The Jazz Singer, “you ain’t heard nothin’ yet.”

        • TheKingandIRobot says:

           Up yours. When Celes tried to kill herself, I was tearing up.  Same when Shadow finally quit running.  And when Terra came back to save the city of children from Phunbaba and found that one of them was pregnant.

          Aeris was like watching a lego brick die.

        • Geo X says:

          RIGHT THE FUCK ON.  People are still claiming that fuckin’ Aeris dying was a powerful moment?  The mind boggles.

    • I think you’re making a couple of pretty over-the-top assumptions here. Sure, the tech of silent film at the time was limited, but the tech itself didn’t specifically limit creativity. No one claimed that sound/color/frame-rates rescued cinema from its silent trappings. The point would be akin to saying sound/color/frame-rates didn’t automatically translate to creative expansion. Hell, some of the early sound films after The Jazz Singer are horrendous.

      Also, no one claimed Sony’s job was to improve game designers’ creative output. The claim was more that Sony’s presentation seemed to state that the technological advancement WAS the answer, akin to that being the creative innovation. Which is BS and hilarious, considering that 90% of that technological advancement is already on most PCs. This was made worse when the games shown were graphically more of the same.

      The odd part is that the things that WOULD be a creative improvement – more choice, more complex stories, richer characters, COMIC games (seriously, can we have more games that are funnier? Can we have a modern Earthworm Jim?) – were certainly elements that could have been made and presented, even at the tech-demo level. But we got bigger guns and faster cars instead.

      • Michael says:

        The problem is that it sounds to me like people are crapping on Sony for doing exactly what they’re supposed to do when announcing a console launch: showing off the tools that developers are going to have to work with in the next generation.  You’re right, there is no technological solution for some perceived lack of creativity in games, aside from encouraging indie developers (which they expressly said they would do in that section of the livestream with Jonathan Blow).

        In short, what the hell did you people expect?

        This isn’t a TED talk, it’s a console announcement.  Do you guys go to car shows and ask the designer of the new Ferrari how his new high-powered engine will discourage drunk driving?

        • Citric says:

          Showing off the horsepower is one thing, but the problem is that they talk about how their creativity is limitless, and then show off games that are just polished versions of what we’ve already got. That racing game, the one with the stupid name, doesn’t look that much better than GT5 or the current Forza entries, and for all their talk of individual strands of carbon fiber, they didn’t actually show anything that made driving in the game better.

          If you’re going to go on about how more power is going to lead to unbridled creativity, SHOW SOME OF THE CREATIVITY. The PS4 might be a better console, but they neatly avoided actually showing anything new that the extra power will allow you to do. 

          Basically, to use your Ferrari analogy, it’s like crowing about the hyper-advanced engine under the hood making this the fastest Ferrari ever, but only showing a video of grandma puttering to the shops.

        • Michael says:

          Again, that’s not the console manufacturer’s job.  If they were showing off a great new HDTV and happened to use a scene from a bad Michael Bay movie to demonstrate it, and you came out saying “that movie was bad” instead of “that TV looks good,” then you’ve kind of missed the point.  They would have picked the scene because it shows off the maximum capabilities of the system, not because they want to posit it as the height of artistic achievement.

        • Citric says:

          That is the console manufacturer’s job, especially if they’re selling the idea that there are no limits to creative freedom. In most hardware launches, they include a game or even just a tech demo that demonstrates what you could do now that wasn’t possible previously. It sells the feature, gives some weight to the increased specs, and demonstrates why people would want to invest in the product. 
          “Killzone, but shinier” is, I guess, not something we’ve never seen before.

        • Michael says:

          It used to be that a new system just meant “better graphics” and people thought that was just fine.  The SNES was just better graphics than the NES, the PS2 was just better graphics than the PS1, the Xbox 360 was just better graphics than the Xbox.  That’s what new consoles are.  The SNES-to-N64 transition is pretty much the only generation change that comes anywhere close to the lofty expectations that some people seem to have of what a new system is supposed to be.

          It wasn’t until the Wii came out that people suddenly decided that every console launch needed some wacky “game changing” gimmick.  That the Wii ended up being a gimmicky piece of shit that did nothing but lower the bar for game design doesn’t seem to occur to them.  I’ll take “the old games, but shinier” any day of the fucking week over that kind of “innovation.”  And let’s also remember that once that “innovation” wore off, people stopped using the Wii altogether because it was vastly inferior technologically to the PS3 and 360.

          And for fuck’s sake, this is the very first announcement of the damn thing.  We’ve seen all of eight games for it so far; that’s probably not a representative sample size.

        • Citric says:

          Saying that the SNES was just better graphics than the NES shows you definitely have no idea what you’re talking about, because Nintendo heavily hyped new experiences on the SNES. Their launch games were Super Mario World and F-Zero, quickly followed by Pilotwings, games designed specifically to show off how the added power and capabilities would change games on the new console.

          SMW did it by showing how the extra space allowed bigger levels with more secrets and a more elaborately designed world than previous games in the series. It was just a sequel, sure, but it was designed to show off how a more potent console could enhance the game and allow for bigger levels and more stuff.

          F-Zero and Pilotwings were designed to show off the Mode-7 effects, which were not possible on the NES. F-Zero went with an existing genre again, but used the effects to go much faster, and design better tracks and environments. It was a game that showed how the new console would make racing games significantly more enjoyable.

          Pilotwings was a game that was simply not possible on older consoles, again using Mode-7 effects to make a quick-paced flight game that needed the new console to actually work. It was basically a tech demo for graphical effects, but it was also a fun game that showed what those graphical effects could do, and how they could give you a new experience. Same with F-Zero. They were more than just better graphics; they were new experiences.

          The switch from PS to PS2 also emphasized what experiences were possible with the new console. The PS2 didn’t launch that strong, but one of the big hype games was Dynasty Warriors 2, which gave you a big open world and lots of enemies to stab in the face. That was their big investment moment: look at the giant worlds we can create and populate. At the time, it made a big promise about what would be possible in the future, a promise that was eventually fulfilled with GTA3. With the PS2, it wasn’t just better graphics; it was emphasizing that you could get bigger worlds and more stuff on screen, which could (and did) lead to new gameplay experiences. Nintendo did this as well with the Mario 128 tech demo, which populated the screen with a ton of Marios and said “we can do all sorts of stuff on screen; we don’t know what we’re going to do with it yet, but it’ll bring something different along for the ride.”

          This generation, there was almost an admission that graphics weren’t enough. There was so much emphasis on the do-it-all box that it was clear they didn’t quite know how just pushing graphics was going to move consoles. Still, the 360 pushed hard on a seamless online experience and its benefits, while Sony emphasized the entertainment possibilities through streaming and a Blu-ray player. It was fuzzy, especially at launch, and personally I didn’t really care until well into the generation, because for a very long time they didn’t actually do much that I hadn’t seen before. When the games caught up, I bought, but it took a while.

          The PS4 launch had some decent graphics and some people talking about social networking. There really was no indication of how this is going to change how I play or what new experiences this will bring – outside of some social networking nonsense – something that previous successful consoles ALL DID. Maybe at some point I’ll see what new experiences the console makes possible, but they sure as hell weren’t at this launch event. I’m not inspired to invest in the new console, and that means the launch event failed.

          It’s rare that a console launch is just graphics; more often it’s about new experiences. Graphics can often facilitate those new experiences, but graphics on their own don’t make a game good. The PS4 launch failed to bring anything new and exciting to the table.

        • Michael says:

          I feel like you’re greatly exaggerating the importance of Mode-7 on the SNES.  Yeah, we got a couple of proto-CGI effects here and there on the console, but the bulk of what we got from that system was really good-looking 2D sprites and more advanced sound.  Yeah, there were a couple of games like F-Zero, but those aren’t necessarily the best-remembered games now.  Mostly we just think of Super Metroid, Zelda: ALTTP, Chrono Trigger, and Street Fighter 2, which were all mostly bigger and better versions of existing genres.

          As for PS to PS2, yeah, they had some tech demos when they were first unveiled.  And guess what: the PS4 launch had comparable tech demos as well, from Epic and Havok (the Havok engine in particular was impressive).  And of course the biggest innovation of that generation, online console gameplay, wasn’t available until years after launch.

          I don’t know you, and I don’t doubt that your misgivings are genuine.  That said, I feel like gaming’s chattering class has been taken over by a lot of aging, jaded complainers who don’t seem to know what they actually want.  And I suspect that if these people had been around in 1990, 2000, and 2005, we would have heard a lot of the same complaints.

    • jroberts548 says:

      No amount of technological capability can cover up fundamental incompetence. All the graphics in the world couldn’t make Revenge of the Sith better than A New Hope. Making the mistake of thinking technology can replace talent contributes to disasters like the Star Wars prequels, the Matrix sequels, or the last two Terminator sequels.

      • Michael says:

        You’re right that no amount of technology would prevent the creation of something like The Phantom Menace or The Matrix Revolutions, but I feel like you’re also forgetting how much technology influenced A New Hope and The Matrix.

        What would Star Wars have been like if made with 1930s technology? What would The Matrix have been like if made with 1950s technology?

        The answer to the former is that it would basically have been not dissimilar to the old Buck Rogers serials, which were largely scoffed at because they were trying to make something like Star Wars using technology that wasn’t up to Star Wars levels.  As for the latter, well, I don’t know that anyone in the ’50s could have made The Matrix as a live-action film.

        Without people like Sony making the tech of the future we simply don’t know what we’ll be missing out on.

        • Merve says:

          “Without people like Sony making the tech of the future we simply don’t know what we’ll be missing out on.”

          Well, we’d all be gaming on PC then, so we wouldn’t be missing out on much. :P

          Okay, that was semi-kidding. But you make a good point. Sometimes, technology can drive innovation. Take, for example, the laser. When Theodore Maiman made the first functioning laser back in 1960, it served basically no purpose. Nowadays, lasers are used in everything from surgery to manufacturing to reading optical media. The world as we know it couldn’t function without them. It’s funny to think that just over 50 years ago, a crucial component of so many devices and processes was little more than a scientific curiosity.

  26. duwease says:

    “You can tell we live in a privileged society when we have to work this hard inventing things to desire.”

    Decades.. DECADES I’ve spent complaining about manufactured needs and the marketing that creates them.  But this sums up all of that better than I’ve ever been able to.

    Today I weep, because I’ve seen the mountain top and know there is nothing left to achieve.

    • Girard says:

      For a small monthly fee, you can be granted access to MountainTop+.

    • GaryX says:

      Think of how Marx feels! He thought that revolution was coming and everything.

    • The Guilty Party says:

      I like that they managed to achieve some sort of Ideal of Things I Would Never Do. I don’t watch videos on my phone. If I did, I wouldn’t watch videos of someone else playing a game. If I did, I wouldn’t watch fighting games, of all things. If I did, I wouldn’t say ‘boy, those folks look good, sign me up to get annihilated by them later!’. The only thing they missed was being able to tweet the results to all my friends!

  27. Cloks says:

    Exciting new games with Colons in the titles:

    Barbershop Defender: A Hairy Situation
    Where Did Thirty Dollars Go?: Budget-tastrophes for Addlepated Youth
    Am I A Squeegee Yet: The Transformative Nature of Gas Stations
    Horses: Yes, Horses
    Press Some Buttons: Today You Are Video-games

    • Professor_Cuntburglar says:

       Brownfighter: Industrial Hallways Again

      No Girls Allowed: No Homo

      Space Battles With Normal Guns: More Neon Lights And Bright Colors

    • rubi-kun says:

       That last one sounds like something BMO would make.

      Speaking of which, who wants to play Conversation Parade?

  28. JohnnyLongtorso says:

    It seems to me that the trajectory of the console wars is that system exclusives are becoming fewer and fewer, so the companies have to come up with ways of hyping themselves with ancillary things like connectivity or whatever.

    That said, for me it doesn’t matter what they say, I won’t buy a console at launch, because I’m poor and also not an idiot. The PS4 will probably eventually be the only console of this generation I will end up getting, though, because of the lack-of-exclusivity thing. My 360 has been collecting dust for a long time, because I tend to default to the PS3 when I get stuff from Gamefly, and PS+ is light-years ahead of Xbox Live Gold when it comes to value.

    • The Guilty Party says:

      I expect nothing much will change. Consoles create inertia, in that all your DLC and backwards compatibility stick you to one or the other, and lack of exclusivity gives you no particular reason to switch.

  29. Chip Dipson says:

    On a more positive note: I really want to play The Witness right now. Also, if all these bells and whistles will let me play a Beyond Good and Evil sequel at some point, I’ll lay down the crinkle for the privilege.

    • GaryX says:

      Oh god, now I have to go back to wanting BGE2 to come out. Thanks.

    • Enkidum says:

      Hey, did anyone else notice that BG&E was 5 bucks on PSN last week? I got it and am playing through it now. Just got the spaceship.

      • Fyodor Douchetoevsky says:

        How is it? I played it when it first came out and was pretty disappointed. It felt like it had a bunch of stuff taken from other games that it didn’t do very well. I still don’t really know why people go nuts over it.

        • Enkidum says:

          Yeah, I’m a little puzzled by the accolades. “One of the best games ever made” seems to be thrown around a lot. I’d say it’s pretty good, nothing we haven’t seen before, but generally quite well done, with awful camera controls. And a pretty cool story, with a female character who isn’t a walking pair of tits, which is nice.

        • Sarapen says:

          I played the demo. I thought the “meh” feeling was just because I hadn’t gotten far enough into the game, but maybe it’s one of those things where you had to be there at the time.

  30. hastapura says:

    “Cretinous analysis” fits David Cage to a T. Dude needs to go spread his hack-guru bullshit all over another medium. I’ve gotten more emotional response out of text adventures than his sophomoric, sub-Criminal Minds nonsense.

    Anyway, I kinda wish it was just called PlayStation? Or something besides PlayStation 4. That just conjures images of PlayStations marching resolutely into the future, endlessly iterating with tweaks to boot time, gigabytes, and social integration. And psychic powers, if this predictive stuff plays out.

    I was really hoping, prior to this, that everyone in and around the industry understood that games look pretty good these days, and that what will set the PS4 and Xbox Whatever apart (the two boxes rumored to be closely matched in specs) is the quality and creativity of their games, not the eye-assaulting shinies of their new Killzone entries. As soon as I saw that, I sort of deflated a bit. It’s not even a numbered sequel, is it? That makes it seem like a handheld spin-off. I did like the look of the Dragon’s Dogma-like Capcom game, even though there is no way in hell it was in-game footage.

    And where’s my Last Guardian announcement? The PS2 had such a diverse and superb library, and getting a new Team Ico game out the door ASAP could help this machine achieve those heights. Seriously, I’m playing through old-ass PS2 joints like Ghost in the Shell and Siren these days. Bring it back, Sony. I want to believe. Microsoft doesn’t have the pedigree that’s in play here, and I so want Sony to play this right and really dominate like the old days.


    • DrKumAndGo says:

      I like the numbering scheme, as it accurately describes the diminishing returns of each new hardware iteration.
      PS2 = MUCH better than PS1
      PS3 = some noticeable improvements over PS2
      PS4 = a bit better than PS3, probably.

      PS(X)= factor of X/(X-1) better than PS(X-1)

  31. beema says:

    I just realized that back when the current gen of consoles launched, I wasn’t really paying attention to gaming media. Now I’m immersed in gaming media, and it’s going to suck. Twitter last night was bad enough.

    So what do we really have here? A bunch of rendered teaser trailers, and some gameplay that was running on a PC. Yeah, that’s right: Watch Dogs was running on a mid-level PC. For a showcase so obsessed with new graphics and hardware, the hardware is pathetic. “Supercharged PC” is a joke. How is an x86 processor a great new advancement? Unless the nomenclature is different with consoles, x86 has been outdated for the better part of a decade. 8 gigs of memory? OH BOY! At present, that’s more like an under-charged PC.

    Oh hey, let’s keep them motion controls going with a thing that looks like Kinect and some bullshit on the fugly controller. How about a SOCIAL button? Because god knows what I want in my deeply immersive games is Facebook updates.

    This whole thing has just made me say “fuck the game industry” more than ever.

    • Destroy Him My Robots says:

      Dude, that’s 8GB of GDDR5 memory. That’s phenomenal. That’s the stuff they use 4GB of to power $400+ graphics cards. About ten times faster than the fastest DDR3. And remember that ridiculous bandwidth is what made GTA3 run on the PS2’s lousy 4MB of video RAM.
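      For the curious, the rough arithmetic behind that bandwidth comparison can be sketched out. The figures below are ballpark assumptions (the 176 GB/s peak announced for the PS4’s GDDR5, versus a typical dual-channel DDR3-1600 desktop of the era), not exact spec-sheet numbers:

```python
# Ballpark memory-bandwidth comparison (assumed figures, not official specs).
gddr5_gbps = 176.0                # PS4's announced peak bandwidth, GB/s
ddr3_gbps = 2 * 1600e6 * 8 / 1e9  # dual-channel DDR3-1600: 2 channels
                                  # x 1600 MT/s x 8 bytes = 25.6 GB/s

print(f"{gddr5_gbps / ddr3_gbps:.1f}x")  # roughly a 7x advantage
```

      Against slower or single-channel DDR3 configurations the ratio grows further, which is presumably where the “about ten times” figure comes from.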

      • beema says:

        Alright, I stand corrected on that. The x86 thing still boggles my mind, though. I’ve had 64-bit processors in my PCs forever now.

        Even so, my feeling is that static console hardware is ridiculous in an age when technology develops and changes so rapidly. 

        Even if the PS4 matches the graphical prowess of mid-to-high-end PCs at launch, that will only last for a year at most. Then we’re back to multi-platform releases being held back by outdated console tech.

        This has 8GB of unified memory, which I take to mean it’s the memory for everything. Whereas with a PC you have RAM and then each video card also has memory. So a high-end PC right now would already be “beating” the PS4 configuration, no? Nvidia just put out their Titan card, which already blows everything out of the water, and by this time next year, they will actually be reasonably priced. 

        It’s the whole closed-system problem, where MS and Sony and the big publishers don’t want to allow any user freedom. The PS4 will come out and be great (although you won’t see any good, full use of its tech until toward the end of its lifecycle, if the history of console launches is any indicator), and then a year later the PC will have far outpaced it again. And unless game sales drastically change, the industry will remain at the behest of consoles. So PC gamers like myself will once again be stuck butting heads with the limitations of a console.

        • aklab says:

          “x86” doesn’t mean it’s not 64-bit. x86 is a chip architecture: it can be 32-bit x86, 64-bit x86, etc. Sometimes 64-bit x86 is referred to as “x64,” hence the confusion…
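          If you ever want to check which variant a machine is actually running, here is a quick Python sketch (standard library only; the exact strings reported vary by OS and hardware):

```python
import platform
import struct

# platform.machine() reports the CPU architecture string, e.g.
# 'i386'/'i686' for 32-bit x86, or 'x86_64'/'AMD64' for 64-bit x86.
arch = platform.machine()

# The size of a pointer reveals the bitness of the running process:
# 4 bytes -> 32-bit, 8 bytes -> 64-bit.
bits = struct.calcsize("P") * 8

print(f"architecture: {arch}, {bits}-bit process")
```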

        • beema says:

          @aklab:disqus ah okay. I thought x64 = 64-bit and x86 = 32-bit. 

          So what have the consoles been using all this time?

        • Bad Horse says:

          @twitter-259492037:disqus Depends on the console maker. In Sony’s case, they tend to use less well-known architectures, like the weirdo 9-core Cell that’s in the PS3. Xboxes, by contrast, have always been derived from PC architecture and are easier in general to program. 

          There seems to be a correlation between programmability and sales in the console market – the closer to the accepted PC architecture you are, the more games come out, the better they look, and the more you sell.

        • George_Liquor says:

          @Bad_Horse:disqus The original Xbox sported a slightly-modified Pentium III chip, but the X360 has a PowerPC-derived processor that’s actually  closer in design to the PS3’s cell processor. 

      • George_Liquor says:

        I’m guessing Sony’s going for unified GDDR5 RAM more for simplicity’s sake than anything. Don’t get me wrong, it’ll be a big boon to the GPU, but the CPU likely won’t see much benefit over regular DDR3 RAM. 

      • Professor_Cuntburglar says:

         If you had written that in Chinese I would have understood just as much.

  32. Tyler Mills says:

    …“polygons” being industry lingo for “size of penis.”

    Is that a polygon in your pocket or are you just happy to see me?

    I was gonna do more but that’s the only cliché I could think of. :(

  33. SamPlays says:

    Great article and a nice counterpoint to the exaggerated awesomeness that’s apparent on other sites. (I like Killzone as much as the next guy but it’s nothing to get excited about. I watched the trailer footage on IGN yesterday and found myself skipping at 20-second intervals.) 

    It’s not surprising to hear that “Bigger AND Better” will somehow release “Creativity” from its polygon shackles, because it’s the only legitimate selling point when discussing hardware. Though apparently the Internet is now a major bullet point. Skilled game developers can maximize the gigabyte prowess of any machine thrown their way, but there’s a keen distinction between technical creativity and artistic creativity. I think any sensible person can acknowledge that art can be accomplished even with the barest of resources.

    Personally, I’m not looking forward to integrating either “social” or “online” into my gaming experience. I prefer playing games alone or in the immediate company of others. I’m also mildly discouraged at the prospect of aligning my game experience with my ego. I’d rather experience a perspective that is decidedly not my own when playing a game. If left to my own devices, I’d probably take Nathan Drake to cognitive-behavioural therapy to figure out exactly why he cruelly and casually kills so many people.

    BTW, Teti, I clearly inspired your use of the phrase “cognitive dissonance” with my post the other day. If it means anything, I’m now inspired to use the words “cretinous” and “flagellate” in at least two discussions this week.

  34. ComradePig says:

    As my friend summarized when the whole affair was drawing to a close, this whole unveiling was basically “Buzzword: The Conference”. Arguably even more so than usual.

    It’s particularly painful watching the various speakers try to generate enthusiasm for the array of “connectivity” features and “multimedia center” functions of the new consoles.

    These elements may appeal to the casual console owner and can sometimes be welcome bonuses for gamers as well, but console makers seem to fundamentally fail to understand that “gamers” do not much care about, or even desire, the ability to connect their games to Facebook or to use their console primarily as a device for watching TV.

    This conference wasn’t as dismal as the last E3 on the ‘no games’ front, but it was still a comical exercise in watching the speakers try to sell features to an audience that by and large doesn’t really want them.

    Particularly with the probable price hike we’ll see for next-gen consoles compared to the current slate, I may very well not purchase any of the incoming generation of systems – the first time in ages that will have happened – and simply go PC-exclusive.

    • DrFlimFlam says:

       My laptop is about three years old now and is starting to show some serious age despite my best efforts, but yeah, I think I’d rather fork over for a good gaming PC at this point than get another iteration of home console hardware. There is so much I need to catch up on yet.

      • Citric says:

        The problem with Steam is the relative difficulty of playing from my couch. Yeah, Big Picture exists, but it’s somewhat unwieldy and needs connections and cords.

        • beema says:

          I see you edited the thing about cables, and that for some reason my response to that showed up elsewhere! DISQUS!

          But anyway, I definitely understand this. For me the main physical barrier to PC gaming in front of my TV is the whole keyboard-and-mouse thing, which pretty much requires a desk, and wireless mice are complete shit when it comes to gaming, most of the time. Also, I have much better sound on my nice gaming headphones, the cord for which only extends so far. I prefer controllers for some more action-y games that require button spamming and such, but for most games KBM is my input of choice.

          Conversely, I never understood the “I prefer couch gaming” mentality. For the majority of games, I prefer playing on PC because I feel much more immersed in their worlds that way. On a PC, closer to the screen, with headphones, I feel more walled off from the rest of the world, and there are fewer distractions and sensory stimuli to take me out of the game. So especially for games with rich atmospheres or any horror elements, I much prefer PC.

        • Merve says:

          If there were some way to stream the images from your PC to your TV in real-time, even if your PC were at the other end of your home, with your controller wirelessly linked to your PC, then you’d be in business. Bonus points if the PC can still multitask while doing all of this.

          The company that figures out how to do this will make millions.

        • SonjaMinotaur says:

          I am a couch/console gamer and here is my reasoning: When I am sitting at a computer, I am at WORK. When I am playing a game I want to be sitting on my couch wrapped up in blankets. So I am willing to sacrifice polygons/immersive experience for comfort.

        • Citric says:

          @Merve2:disqus If someone were to make a steam stream I would be all over that.

          @twitter-259492037:disqus The cables thing was a first thing in the morning statement, and then I was in the shower and thought “wait, that doesn’t make any sense.” I was thinking of the difficulty of actually connecting PC to TV in most homes, and stated it groggily and half-asleepily.

          Anyway, I prefer the couch because my couch is super comfortable and has a blanket on it and I can pet the cat during cutscenes. I just like lying back more than sitting upright; everyone’s different, though.

        • beema says:

          @Merve2:disqus What you are describing is essentially what OnLive is, except that you are streaming games from a remote PC. I have no idea how they are doing, but suffice it to say I haven’t heard anything about them from anyone in a while. 

          Then again, it’s not entirely the same, since you are paying for a service and you also have absolutely no direct control over anything, which is kind of the opposite of a PC gaming experience. 

  35. Horatio_Scornblower says:

    Boy, does David Cage sound like a dingus. Hey Cage, if you’re so concerned with “emotion”, then why was Heavy Rain about as emotionally nuanced as a substandard Hollywood thriller?

    • indy2003 says:

      Well, let’s be fair – Heavy Rain is a PS3 game, and we all know the PS3 is completely incapable of expressing emotion.

  36. LetoII says:

    Amen to all of this, really. I don’t care that the author was clearly biased in everything he wrote. He was annoyed at the contempt these idiots were showing their audience; what’s not to hate about that? 

    This power versus creativity thing has been going on in the video game world since I was a kid (I’m 31), and frankly I’ve yet to see how more power has added any “emotional depth” to anything. Books don’t require sixteen trillion gigs of RAM to achieve true emotional depth. They’re just words printed on a page. So how do they achieve the kinds of connections with a reader’s soul that can last for an entire lifetime?

    Using silent films as an example of how we’ve somehow “evolved” in the world of film is also blatantly stupid and could only be done by someone with the most juvenile grasp of cinematic history. You’re telling me something like The Great Train Robbery or Metropolis or The Cabinet of Dr. Caligari or the great Buster Keaton and Charlie Chaplin comedies–those are all basically emotionally inferior to such modern fare as Here Comes the Boom or Madea Gets a Job? Yeah, because those are on Blu-ray, and you can see every pore on Kevin James’ sweaty head! That’s what makes it good!

    • Horatio_Scornblower says:

      Seriously. If anything, on the cinematic front, the turgidity of super technologically advanced shit-garbage like The Hobbit and Avatar prove that advances in technology =/= advances in creativity.

      • ApesMa says:

        Avatar also showed that super expensive, state of the art effects can still look like dogshit. Blue dogshit.

    • Steve McCoy says:

      I sympathize with you, but it’s disingenuous to compare the best of early film with the worst of contemporary film.

      • LetoII says:

        I also thought of that, but I feel it’s justified to compare the best to the worst simply to prove that Sony’s point about progress is reductive and senile. Because obviously we haven’t “progressed” just because we have digital cinema and HD TVs now.

    • chaos...reigns says:

       This technology argument also ignores how damaging the early advent of sound was to artistic cinema.  Barring notable exceptions, it took international film a good decade to recover, largely by way of sound technology becoming cheaper.  Sony’s real burden here should have been to prove how better tech will benefit smaller and more artistically minded developers, and they failed to do that (because it probably isn’t true).

    • ApesMa says:

      It’s just a false analogy. There aren’t any actors onscreen in video games; we are looking at CG characters. If he’s going to compare, it would have to be older, more primitive CG movies versus modern ones.

      Sorry to repeat my point further up, but here goes:

      Pixar gets emotion out of their characters, but those are cartoonish-looking characters. Try to recreate realistic-looking humans and you get the Uncanny Valley creepiness of The Polar Express, and fancier graphics only enhance it. Improved technology sometimes ends up making things worse, not better.

  37. Steve McCoy says:

    I came in during Blow’s segment, so I left with a mildly better overall impression than Teti, but yeah, this event was far, far more “corporate” than I was expecting. So, despite my being contrarian in the next paragraphs, please understand that I thought it was a lame event overall.

    One part that argues against Teti’s thesis (although it’s only one of two or three, really), but that he completely omits, is the Media Molecule segment. I have no clue if the sculpting part could turn into a great game, or what was really going on with the band thing, but they made me smile, emphasized ideas over polygons, and, more than any other segment, presented a believable — if not substantiated, but what is in these things? — argument that the PS4’s power and features can combine to create experiences unique to the platform.

    Cage’s segment was the nadir (runner-up is “Hey, we’re working on another Final Fantasy, see you at E3, suckas”), but there is one fair argument buried inside his bad graphs of polygon counts and his lack of perspective. Now, I hate to use specific examples, because they’re either easily countered or evoke kneejerk responses, but I think I’m safe on this one: Raging Bull is a better, more emotive movie than The Great Train Robbery and would be impossible without the technological advancements in cinema that came in the years between. Sound, a higher framerate, special effects, mobile cameras, and even color are all vital to the Raging Bull experience. The fantastic, subtle, aching performances (and the acting style in general) would be impossible with Great Train Robbery-era technology, because of the limited framerate and lack of sound. The 16mm scenes wouldn’t have nearly the same impact if the rest of the movie weren’t in a higher-quality format. The fight scenes would be vastly reduced without the incredible sound effects or the ability to film at a higher framerate for the slow motion (also used in non-fight scenes).

    Now, to tie this back to videogames. There are many things that more processing power and more memory can bring that we do not have in the current generation. Chiefest among them is smart AI. Wouldn’t it be great if the enemies in Demon’s Souls were smarter than a sack of doorknobs? Or if the NPCs in a Skyrim-type game could react to any of your actions in a variety of context-dependent ways, rather than a subset of your actions in a handful of canned ways? We’re still in the Precambrian for AIs, and part of the problem is lack of power. Basically, the best algorithms and even the ones that try to be worse than the best, but still way better than the worst, require lots of CPU and lots and lots of memory for even simple scenarios. And that’s just for one AI agent. Most games use something closer to the worst when it comes to decision-making other than pathfinding. I can go into more detail if somebody actually reads all of this and wants to know more.
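    To make the decision-making point concrete, here is a toy utility-scoring agent. Every action name and weight below is invented for illustration; real systems (behavior trees, GOAP-style planners) evaluate vastly larger state spaces, which is where the CPU and memory go:

```python
# Toy utility-based AI: each candidate action is scored against the current
# world state, and the agent picks the highest-scoring one. All names and
# constants here are made up for illustration.

def score_attack(state):
    # Aggression rises when the agent is healthy and the player is weak.
    return state["self_hp"] * (1.0 - state["player_hp"])

def score_flee(state):
    # Fleeing looks better as the agent's own health drops.
    return 1.0 - state["self_hp"]

def score_call_allies(state):
    # Calling for help is attractive when allies are nearby and the fight is bad.
    return state["allies_nearby"] * (1.0 - state["self_hp"]) * 1.2

ACTIONS = {"attack": score_attack, "flee": score_flee, "call_allies": score_call_allies}

def choose_action(state):
    """Return the action with the highest utility for this state."""
    return max(ACTIONS, key=lambda name: ACTIONS[name](state))

# A badly wounded agent with allies nearby calls for help instead of
# launching a suicidal attack.
wounded = {"self_hp": 0.2, "player_hp": 0.9, "allies_nearby": 1.0}
print(choose_action(wounded))  # -> call_allies
```

    Now scale that to dozens of context-dependent actions per NPC, evaluated every frame, for every NPC in the world, and the appetite for processing power becomes obvious.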

    Another benefit of more power could be active, “living” worlds. Wouldn’t it be great if your knight goes off to slay a dragon at the king’s request, but when you return, the king’s been dethroned, not because you flipped an invisible switch by killing the dragon, but because you stripped naked back in town and put pots on everybody’s heads, and when some of those people met at the tavern one night and got drunk enough, they decided to stage a coup? Or if the wildlife in the whole world followed some ecological model, and over-hunting could lead to extinction, or introducing an animal from area A to area B could change area B in unexpected ways?* Dwarf Fortress kind of does things like this, but it’s all tied to you taking actions (and the game has other downsides). More power means more things can happen while you’re doing something else, leading to unique experiences, another form of emergent gameplay.
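    The over-hunting idea can be sketched just as minimally. The constants below are made up, and a real game would track individual animals rather than aggregate numbers, but the shape of it is:

```python
# Toy "living world" ecology: prey and predators update every in-game day,
# whether or not the player is nearby. Player over-hunting can push prey to
# extinction, which then starves the predators. All constants are invented;
# this is a sketch, not anything Sony demoed.

def step(prey, wolves, player_kills=0.0):
    """Advance the ecosystem one day (a crude discrete predator-prey update)."""
    new_prey = prey + 0.10 * prey - 0.02 * prey * wolves - player_kills
    new_wolves = wolves + 0.005 * prey * wolves - 0.2 * wolves
    return max(new_prey, 0.0), max(new_wolves, 0.0)

def simulate(days, player_kills_per_day=0.0):
    """Run the world forward; returns (prey, wolves) after `days` days."""
    prey, wolves = 100.0, 5.0
    for _ in range(days):
        prey, wolves = step(prey, wolves, player_kills_per_day)
    return prey, wolves

# Hunting 15 prey a day drives the prey extinct within about a week, after
# which the wolf population starves off too.
print(simulate(200, 15.0))
```

    Because the world keeps updating while you’re off doing something else, identical player actions can still snowball into very different outcomes.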

    Anyhow, enough ranty brainstorming from me. I’d have been much happier if the developers presented more _ideas_, even if that meant more demos and pre-rendered concept videos than actual game footage. I wanted more Blows and Media Molecules, but I can’t totally blame Sony for this; it’s the current nature of the AAA** industry. And the more I think about it, the more this feels aimed primarily at investors, so a traditional emphasis on polygons is understandable.

    * Disclosure: I’m currently, slowly working with a friend on a game with this.

    ** The A’s all stand for “ads”.

    • beema says:

      Couldn’t agree more with you. The thing I would love to see evolve the most in gaming is AI. At the same time, it’s the thing I sadly expect to see evolve the least. More power might mean better AI is possible, but it doesn’t mean it’s likely. Due to the “easy money” of multiplayer, the involved solo experience that would call for better AI seems to be on the decline. There are a couple of shining examples counteracting this, but not nearly enough when compared to all the games featuring shoehorned multiplayer and “social” experiences. 

      With new tech allowing for better graphics (an easy selling point), this problem is compounded. Game development budgets are skyrocketing, due in large part to graphical demands. This leads publishers such as EA and Capcom to hedge their bets and make middle-of-the-road, market-tested-to-mediocrity AAA games, in order to appeal to the broadest possible audiences and try to recoup their development costs. When a game that sells 5 million copies is considered a failure, you have a huge problem. In that climate, how likely is it that these publishers and developers will invest money into something as complex and time-consuming (and not as obviously marketable as graphics) as AI R&D? I say not very.

    • heavenkey says:

      Totally on your wagon with the “living” worlds idea. Last night, while trying to approach a fort and stage a stealthy killing in Assassin’s Creed III, I was so frustrated with the limited gameplay mechanics and forced pathways to achieve objectives that I started imagining how a real interactive experience would allow for more diverse approaches, take into account my actions both immediate and distant, and let them affect the world of the game. I just felt like that kid still playing with figurines, creating a coherent universe in his imagination to make up for the lack of interaction. I assumed that, three or four consoles later and with heaps of technological advancement, I would be asked less to close the gaps with my imagination, but maybe I’m mistaken!

    • valondar says:

      Sure, Raging Bull is more emotional than The Great Train Robbery. But is it more emotional than Sunrise? I’m sure some would say that it is and I respect that, but I wouldn’t.

      Raging Bull was a film that was made with technology unavailable when Sunrise premiered (although in the short term the addition of sound actually significantly curtailed camera movement, whole other conversation).

      Advancing technology is a good thing because it gives more options to the ways films can be made. I’m in full support of the creation of new game engines and experiences, and if the PS4 will let more breathing game worlds and more interesting titles come out, bully for PS4. I actually have a pretty good experience with most of the Sony electronics I’ve bought over the years so I certainly have no animus against the company.

      That said, I don’t think this means that the best older films are basically deficient.

      And that’s the core assumption of David Cage’s argument: silent films were deficient, and the newer choices that exist today are therefore better. From silent to sound, from sound to colour, and on and on. New is good. More choices are great. But you’ll forgive me if I’ll watch a Guy Maddin movie before I see anything Martin Scorsese has done in the last ten years.

      • Steve McCoy says:

        Well-stated, and I won’t disparage the other works you mention. But, if you skipped out on Hugo, you’re missing out!

    • SaoirseRonanTheAccuser says:

      “Raging Bull is a better, more emotive movie than The Great Train Robbery and would be impossible without the technological advancements in cinema that came in the years between.”

      In my opinion, The Passion of Joan of Arc is a better, more emotive movie than Raging Bull because every single resource in the film is geared towards forcing you to empathize with Joan and get inside her head.  The limitations of the technology forced them to make a movie that, though silent, still takes you further into a character’s head than most movies could even dream of going.

      What’s more, the argument made – according to this report – by Sony and the various developers is NOT ‘we have enough processing power to tell stories and build characters in ways we never could before’, but ‘we have enough processing power to make things look better than we ever did before’.  And that is the line of thought that gives us trash like AVATAR, not masterpieces like RAGING BULL or THE PASSION OF JOAN OF ARC.

      In fact, I would argue that Raging Bull is not better than The Great Train Robbery because of the technological innovations, but because of the creative ones.  Actors learned new techniques, but a lot of those techniques could be applied with or without sound.  Editors, directors, etc… all learned new ways to cut films, tell stories, and show character, and very nearly none of it was dependent on the cameras having more ‘technical’ power.

      Video game developers aren’t making that leap.  They’re just updating technology, without updating their own storytelling skills. 

      • Steve McCoy says:

        I agree that the creative innovations are vital, but still think that the technology is very important as well.

        But yeah, games are lagging in a lot of ways. The more I think about the comparison to cinema history, the less it applies (a general problem with argument via analogy). The order and rate of how videogames evolved, the ways games have become popular, economic factors, and the fact that cinema exists and influences games — game history is a total mutant of cinema history.

  38. Damn. At least *someone* gets it.

  40. rvb1023 says:

    I came away from this a bit disappointed. If there was any console I was going to get this gen, it would be the PS4, as I genuinely think Sony supports the widest and most interesting first-party games at this point, but they really didn’t show any new IPs that got me interested outside of Knack. I mean, I will play a new InFamous; I just wish we had a bunch of new IPs like with the PS3. Maybe they are saving some of the big guns for E3?

    One glaring omission from this article is the fact they flew a guy in from Japan only to say a new Final Fantasy will be announced at E3. That got me madder than any of the corporate buzzwords I knew I was going to hear last night.

    • Spacemonkey Mafia says:

      I think it’s an irony of a new console release that all the emphasis on new and groundbreaking tech has to be buttressed by familiar and comforting IPs.
         Whether it’s a broader cultural question of gamers’ actual interest in game innovation, or existing IPs providing a known reference to gauge new tech, or just that the major companies are the ones with the resources to begin development for a nascent console, I’m not certain.
         But while I completely agree with you that the PS3 has become a great platform for unique games, it took some time to nurture.
         Assuming the PS4 does the same, it will take time as well.

      • GaryX says:

         But while I completely agree with you that the PS3 has become a great platform for unique games, it took some time to nurture.

        As I said further up, I feel like by nabbing Blow, they’re at least showing that they still think this way. If PSN remains as openish (compared to Live) as it is, I think we’ll be just fine moving forward.

  41. Phillip Collector says:

    One of the things that rang false to me was David Cage’s assertion that, going forward, technology will not limit the creativity of game developers.

    Really? So if I wanted to develop a game where players have in-depth conversations with NPCs, the PS4 is going to let me do that? Yeah, right. We still can’t even get the lip syncing right. Ha!

  42. indy2003 says:

    There’s every chance in the world that the PS4 could turn out to be a fine system, but I have to admit that the “social” elements have no appeal for me whatsoever. I don’t want to play a level of a game, take some screenshots of some of my most badass moments, post a video clip on Facebook, tweet my high score and then watch videos other players have posted on my phone as I search for a worthy multiplayer challenger (watch your back, usuckluzer6969!). That sounds tedious and a little exhausting. I just want to play a well-crafted game on a well-crafted system. By myself. In peace.

    Also, get off my lawn!

    • DrFlimFlam says:

      It’s merely tapping into what forums have done for years. Talking about games for ten minutes of every minute you actually play them.

      • heavenkey says:

        Yeah, I sit here reading all this mumbo-jumbo about the distant PS4 rather than going to finish my pile of worthy current-gen games. What’s that, Dishonored? Be patient, boy, I will be there in a minute!

      • Merve says:

        Isn’t that kind of what we do here? ;)

    • Citric says:

      I’m not so much bothered by the fact that I’ll never use the feature, I’m bothered by the fact that other people I know will use the feature, and it will get incredibly annoying, and then I’ll have to stab them.

    • Thats_A_Paddlin says:

       The only positive I can see is that maybe it’ll help people share walkthroughs (I only use them when I’m really stuck!).

    @indy2003: that summarizes my thoughts exactly. I understand that part of why Facebook and Twitter are so big is that there are countless people who want/need to share every moment of their lives with the public. I’m not one of them, and I also find the concept tedious and fairly repellent.

      Of everything I could be doing while I’m not gaming, sizing up future opponents would never be one of them.

  43. PaganPoet says:

    More like Playstation SNORE unveiling, amirite, brahs?

  45. Spacemonkey Mafia says:

    I can see both sides of the tech debate.  On the one hand, Méliès made some of his most affecting films with MarioPaint.
       On the other, The Great Train Robbery was super rad after the bandit found the secret power up cache and pointed a Kinetic Plasma Relay rifle at the audience.

    • Effigy_Power says:

      That Méliès comment made me think for a second, because if someone told me he had MarioPaint, somehow… I might believe it.
      Dude was fuckin’ magic.

  46. beema says:

    Yeah, not like those cordless consoles! 

  47. evanwaters says:

    I’d actually say that where video games are now is more like the early talkies. There has been a big leap forward in terms of what you can do, but taking advantage of it puts a bunch of obstacles in your way.

    Of course in this industry’s case, it’s not so much the tech as it is the money itself- big titles cost so much to actually make that the studios want to make sure they have broad appeal, so everything gets watered down. Steam and the like are good platforms for lower budget games which actually have some leeway to experiment, but it still feels like there’s a big gap between the latest humble indie offering and Call of Duty: Modern Warfare: Black Ops 3 that in the old days would have been filled by a variety of mid-budget offerings.

  48. I guess I wasn’t aware that 2006-2013 = the “10 years minimum” that Sony promised out of the PS3 when Wii U was first announced.

    • Citric says:

      Going by the PS2, there will probably be a few quality new releases after the new console is launched as everyone finishes up their old stuff, and we might get a Persona 4-style old console bundle of awesome a couple years post-PS4 if we’re lucky.

    • Effigy_Power says:

      You should thank them. They realized that all your games have a slight film of dust on them and figured, you’d be buying a whole new set soon anyways. Also, that LED on the front of your PS3 is a bit dim, so… new console, hmm?

      •  …and that’s the reason I’ve never purchased any Sony console since the first PlayStation: pretty much the only games I cared about during the PS2 era were PS1-era games. (I was late to the party: a friend gave me a PS1 he had lying around and said I had to own it for one reason only: Final Fantasy 7. I picked up a few other great games along the way.) (Anything I wanted to play on PS2 was available elsewhere.) And the PS3 offered me no way to dust off those PS1 games short of rebuying them all (if they were even available), so… no.

    • rvb1023 says:

       I imagine the PS3 will be supported for several years after the release of the PS4, same with every new Sony console when compared to its predecessor. Christ, in the early years of the PS3 the PS2 and PSP were the only things pulling a profit.

    • GaryX says:

      Since they just discontinued the PS2, I’m sure they’ll leave the PS3 around long enough to hit that mark.

  49. “On your telephone, you will be able to watch video clips of other people playing a fighting game, decide which opponents you would like to fight, and then challenge those people to fight”
    Useless! Totally and utterly useless! 

    • GaryX says:

      Almost positive that there is definitely a sizeable niche out there that’s actually going to take advantage of that feature.

  50. Andreas Karas says:

    I’m sorry but this article is all kinds of nonsense:
    – for the first time in years, Sony did not come across as arrogant; remember “the next generation will begin when we say so”? There was none of this bravado and I have no idea where Mr Teti saw any talk of throbbing polygons. Sony simply stated the basic stats for their machine and then moved on to what the machine can do for the gamer. What is wrong with that?
    – I despise Facebook as much as the next sane human being :-) But I accept that lots of people use it; Sony wants to engage players so that gaming can become part of their day-to-day experience in a seamless way. It’s the only thing they can do to battle iOS/Android. What is wrong with that?
    – Sony were (justly) criticised because they went all hardware-crazy with the PS3 and felt (arrogantly) that developers and consumers would just fall in line once they realised the purported awesomeness of their system. This time, they wanted to show that they listen to the developers a bit more and the engineers a heck of a lot less. So several developers take to the stage and sing the praises of Sony’s new direction. What is wrong with that?
    – Maybe I missed it but there was no discussion about the PS3 being a failure: on the contrary, they restated what was good about the platform but that it’s time to move on. What is wrong with that?
    – I don’t know how old the author is, but he comes across as a jaded 60-year-old. Maybe this could be a great opportunity for him to take a break from gaming. If he thinks that open-world adventures, first-person shooters, third-person shooters, driving games and action/RPGs look boring, maybe it’s time for a self-imposed Lent (with a bit of backgammon and point-and-click adventuring to take away the rough edges).
    – B&W movies: so, is the author saying that the craft of an actor has not changed since the early days of cinema? I can understand that a crap film is a crap film in 480p or 4K; but are we seriously arguing that acting and filming techniques and audiences’ expectations have remained unchanged by 100 years of technological progress?! Of course, no technology can guarantee artistic success; but (all being equal in terms of talent) what’s wrong with being given better/more technological options to realise your vision?

    Also, the author would do well to remember that The Great Train Robbery was in fact a tech demo for a newfangled medium. It was seen as a ride in an amusement park rather than a work of art; and if you want to see the provenance of editing techniques, you’d do well to visit the National Gallery (or any other local museum).

    • Citric says:

      Facebook integration is wrong for two reasons.

      1 – Nobody actually cares about anything that someone can share using that button.
      2 – Everybody will either be forced to see it anyway, or face the awkward situation where they are no longer friends with family members.

      It’ll be only slightly less annoying than those stories about wise grandmothers your aunt is constantly sharing.

      • Andreas Karas says:

        Citric, you are preaching to the choir :-)
        Having people posting “trophies” on Twitter was pretty much the nadir of this sort of behaviour. And I do wish these tech guys could pay someone to explain to them the difference between “social” and “sociable” (and “socially adjusted”). 
        My gripe with the author is that he got hung up on these features, which will be appreciated by my young nephews (and which I will switch off in 5 minutes), and came across as a grouchy old man.

      • beema says:

        For a brief period I had my PSN synced to Facebook so it posted whenever I earned trophies or something. I quickly realized (possibly at the insistence of my girlfriend at the time) that it was incredibly annoying. If nothing else, you should take the hint when not a single one of them gets a “like” or comment. Auto-posting is probably the worst function of social media. Like when Spotify or SoundCloud auto-posts every time you listen to some stupid track. Or Foursquare. Just fuck! Nobody cares about that shit! I’ve unfollowed IRL friends on Twitter because their entire account was just auto-posts. They become especially stupid with games, though, since a million other people are “accomplishing” the same exact thing all the time.

        Even Steam’s newish social stream is utterly useless and ignored by at least everyone on my friends list. Sometimes there will be a cool screenshot, but that’s the extent of it.

    • Professor_Cuntburglar says:

       This is one of my least favorite things about the gaming community. You don’t work for Sony. You don’t have to defend them. They are a multibillion-dollar company with a huge marketing team. As consumers, we need to be criticizing these companies, not defending them at every turn. We need more articles like this, not fewer. That is the only way gaming will progress.

      • Andreas Karas says:

        Sorry Prof, but from where I’m sitting, what the world needs is less knee-jerk negativity. 
        As consumers we need to be critical and rational; this article is not helping us form an opinion using evidence and reason. Rather it feeds our worst fears: everyone is out to rip you off, everyone is out to get you. Have a look at the Daily Mail headlines to see where this leads (sorry I’m UK based, but I’m sure you have similar “the world sucks” media wherever you are). 

        The author is not looking to stick it to the man; he is either a bit bored with his job, or he’ll be using the page hits from his “controversial” views to furnish his CV when he applies for a job in PR.

        • Fyodor Douchetoevsky says:

          Yeah, I’m pretty sure Teti isn’t fishing for pageviews with this. I don’t know what kind of numbers Gameological gets or whatever, but this article is totally in line with what I’ve come to expect from this site. It’s absolutely refreshing for me to see somewhere that isn’t guzzling down all the hype train gravy.

        • valondar says:

          If you want to read enthusiastic reports about the PS4, go over to IGN. The idea that games journalism should be beholden to painting the company in the best possible light, even when it holds a stupid, vacuous presentation, is just absolutely abhorrent.

          What games journalism needs is not less negativity; arguably, it needs more. But it needs to be material written with a clear voice. This is Teti’s view on Sony’s presentation. He hasn’t diluted it just so Sony would be happier with his content. He is not part of their press junket.

          And if you can’t tell the difference between some light snark like Gameological’s and the Daily Mail’s trashy tabloid journalism, you probably haven’t read much here.

  51. dantebk says:

    Nothing you said is wrong exactly, but I’m surprised at your cynicism. I thought it was a good presentation, and it made me want a new game system. By the time PS4 comes out, Xbox 360 will be 8 years old. There have been dozens of 360 games I have absolutely loved. I want those types of games in my life, and I’m ready to pay money for them to be shinier and maybe more complex. Dead Rising, for example, with its level of complexity, would probably not have been possible on PS2, and that’s one of my favorite games. What new thing like that can they do with PS4 or Xbox Whatever? I’m looking forward to finding out.

    PS4 seeks to replace the PS3, and I’m fine with that. It’s not replacing my PC or my iPhone. More PS4 polygons doesn’t mean less Antichambers and Kentucky Route Zeros. They talked about making it easier for small devs to self-publish on PSN, so maybe we’ll actually get MORE of those types of games.

    Saying that “so many games suck” is not a sentiment I can get behind. More great games are released every year than I have time to play. If more games “sucked” I’d probably save a lot of money.

    • GaryX says:

      Dead Rising, for example, with its level of complexity, would probably not have been possible on PS2, and that’s one of my favorite games.

      There actually was a Wii port, which was pretty terrible and didn’t really work. It definitely suffered from being down-ported.

    • FakeKisser says:

      I personally disagree with a lot of the sentiment in this article, especially the last part regarding Cage’s use of the silent film. I feel completely confident saying that Cage has full respect for that film and the history of film. His point still makes sense and stands. No matter the significance of the old films, it is a fact that technology changed the way films are made, the way actors act, and the way emotion is conveyed. That was his point – not that The Great Train Robbery was rubbish and contributed nothing to film.

      For the most part, I personally think Sony did a good job for a marketing presentation of their next super-expensive gamble.
      I’m honestly a very cynical person, but I feel like this article nitpicks too much. I love thoughtful critiques, and I understand some of the perspective here, but I just think this article misses some of the point.

      Videogames have always been tied to tech, just as computing has been. The push for faster hardware and new experiences (carried by improved graphics, AI, physics, et cetera) is as much a part of videogames as the creativity behind the games.

      I’m primarily a PC gamer, and I am more into story and unique gameplay than explosions, guns, and the best graphics. However, if I read this article on a non-gaming site, I would assume it was written by someone who wasn’t familiar with videogames and the cycle of technology that has always existed in the medium.

  52. wafflesnsegways says:

    I’m disappointed that this is so focused on graphics and “social.” But I do think that there are technological improvements that can open up new doors for games. It’s probably hard to even imagine what they are at this point. I’m curious what tech improvements people think would add new levels of enjoyment to games.

    For me, my number one wish is probably improved physics engines. I love nothing more than a game where I can go around kicking the legs out from under chairs, interacting with everything, and watching the effects domino out. This is something that 2D games seem to still do better. I could play with Phun for hours.

  53. Craig Duda says:

    Another amazingly written article by John.

  54. Philip Sturm says:

    Yeah, yer right man. They should’ve just repackaged some old Ataris and sold you on the art of the craft.

  55. You’ll have to forgive me for not being cynical about a thing that lets me play more video games for fun.

    • Erdschwein says:

      That’s what I say to people about my gun. Oh, the stores we’ve robbed…

    • Professor_Cuntburglar says:

      I’m cynical because this sounds like they’re making more boring copies of current games, with added Facebook, but nothing new and fun.

    Sony’s hubris is getting more and more hilarious as they gasp for breath in every category they’ve ever competed in. Gaming is their last dominant (or nearly dominant) position. I’ve been bitter ever since the stupid Memory Stick, and because 7 of the 9 Sony products I’ve purchased in my lifetime have broken. Only a small transistor radio and my wife’s alarm clock still work.

  57. A counterpoint to all of this is Jon Blow’s assertion (on twitter) that PS4 is much easier to develop for than the notoriously difficult PS3. In addition the expansion of the machine’s processing power and memory means less developer time spent optimizing games for constrained platforms. The Witness’ graphics would definitely run on PS3 or X360, but it would take 2-3 times the money/time/effort (I’m not a developer, so I pulled those numbers out of my ass, but the point still stands).
    AAA developers increase their budgets to accommodate the new power, but indie developers will be able to give themselves more breathing room, financially and technologically, for the same (hopefully excellent) product. Basically I think this year’s console refresh is a good thing for good games, but obviously not for any of the reasons Sony enumerated.

  58. Molemaniac says:

    As a sidenote to the proceedings, did it strike anyone else as weird that several of the games are not only going to be multi-platform (Watch Dogs is probably on the new Xbox whatever, and The Witness is still coming out for PC) but prior-platform releases?

    Ending with two huge developers saying “Yeah, it’s nice, but this’ll be on the PS3 as well, no worries” is maybe something the Sony dudes should have run by someone before the broadcast.

    • Merve says:

      Watch Dogs is actually slated for PC, PS3, XBox 360, Wii U, and PS4. I’d be surprised if it didn’t come out for the next XBox as well.

    • rvb1023 says:

       Well, it helps that the two games in question were Activision games and we knew long before this that they would be available for every electronic device ever made.  I’m playing Black Ops 2 on my washing machine right now.

    • GaryX says:

      I imagine that’s because any new games they have are being held for E3. Plus, the install base of this generation is fairly high, so developers are probably hedging their bets by releasing current-gen versions and then basically the PC+high texture pack port to the new ones.

    • FakeKisser says:

      I personally am part of the camp that believes we’re going to see fewer and fewer “exclusive” games as time goes on. So I think that’s what we’re seeing here, and what we’re likely to see at Microsoft’s conference as well.

  59. JosephHilgard says:

    I’m always delighted to see David Cage get a much-needed razzing.

  60. Will Alexander says:

    Very well put. Thanks for putting some commentary and thought into an article rather than just spewing sony’s talking points.

  61. Sini_Star says:

    Hey, I like this game site. Not as much slavish drooling to the video game powers-that-be (see: IGN). 

    In short: fuck Sony!

  62. Ryan Peterson says:

    First off, I absolutely love your article.  I seriously burst out laughing at your polygons=penis size joke.  My only issue is with your commentary of Cage’s silent film discussion.  I actually thought he was speaking reverently about what those films could do.  Maybe I missed something, but from what I saw/heard he was giving that era kudos to supplying an emotional connection even though they lacked technological advances.  Just my two cents.  

  63. normanbaits1980 says:

    This is an interesting article which is a testament to the writer because I haven’t owned a game console since PS2.

    I’m obviously a bit on the outside, but is it possible that consoles “peaked” in the early to mid 2000s, and any improvements are now so insubstantial that these conferences are just pathetic? (Coupled with the fact that mobile games continue to become more impressive and the culture in general is getting used to getting what it wants, where it wants, which ain’t always necessarily in front of your TV.)

    Additionally, are we about to experience (or are we already experiencing) a grunge-era-style revival of “independent style” games that will act as the antidote to the shamelessly corporate fare most console game developers are offering these days?

    • Bad Horse says:

      Let’s not blow this out of proportion. The vast majority of the game-playing population is perfectly happy to buy a $400 CoD box because that’s what their friends are playing, and they’ll continue to play it for a couple years until the next edition comes out, because it’s fun to play with your friends and this is another way to do it. Personally I prefer finely-crafted single-player experiences, but that’s just because I have no friends.

  64. Kalu Ude says:

    Ignoring the Media Molecule segment is not okay. I agree with everything else though. Nailed it with the “supercharged architecture” bit. I rolled my eyes when that part came up. Are they trying to sell the devs used autos these days?

  65. exant says:

    I think there’s a deeper story here. This narrative that the last generation’s objects are a “shitbox full of lies” is pervasive across the consumer electronics industry. The gaming industry is just an extension of this. 

    Consumer electronics companies have built a fragile, probably unsustainable business model in order to exploit the profit of innovation, which too quickly becomes forced in order to support the tailspin momentum. 

    At some point in any market real innovation becomes harder to achieve, but the breakneck hyperventilating pace must continue for the sake of budgets built on last year’s breakout success.

    This is why we have washing machines with 12 functions you’ll never use, mops that need batteries, and video cards with built-in water physics.

  66. Boonehams says:


    Black and white, silent, emotional.
    Eat it, David Cage.

    • hastapura says:

      David “Tears” Cage brings you his latest, paradigm-shifting masterpiece, “Emotions: The Cinematic Interactive Experience About Emotions”

  67. George_Liquor says:

    Being completely devoid of emotion or self-awareness myself, I find the PS4’s announcement fascinating from a technical standpoint. Both the PS2 and PS3 were trumpeted as incredibly complex technological masterpieces. The PS2’s Data’s Emotion Engine chip was thought to be so powerful people feared it would get exported to Iran or North Korea and turn up in guided missiles. The PS3’s Cell Processor, with its no fewer than eight independent Supercafubulating Processorizing Elements was expected to revolutionize not only gaming, but PC processor designs going forward. 

    Unfortunately in both cases, Sony’s unnecessarily complex designs made for game consoles that are a bitch to program. The PS3 often gets cross-platform games last and loaded with bugs because of the additional problems inherent in porting existing code to it. The PS3 may be the superior console on paper, but nobody’s been able to effectively demonstrate it yet.

    Now it looks like Sony’s taken a few notes. According to the announcements, the PS4’s going to have a multicore x86 CPU and a more “PC-like” GPU, whatever that means. To me, it sounds like Sony’s sourcing nearly off-the-shelf components to assemble their next console in a manner similar to the original Xbox. If nothing else, this should make the PS4 easier to write games for, since there’s 30+ years of development experience backing up the x86 architecture. 
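    The Cell-versus-x86 pain described in this comment can be sketched with a toy analogy. This is illustrative Python only, not real console code: on a unified x86-style memory model a program simply reads shared data, while the Cell’s SPEs could only see a small local store, so the programmer had to stage data in chunk by chunk.

```python
# Toy analogy of the comment above; illustrative only, not real PS3/PS4 code.
LOCAL_STORE = 256  # pretend an "SPE" can only see this many items at a time

def sum_shared(data):
    """x86-style unified memory: the whole array is directly visible."""
    return sum(data)

def sum_local_store(data):
    """Cell-SPE-style: main memory is not directly addressable, so each
    fixed-size chunk must first be staged into local store (the "DMA")."""
    total = 0
    for offset in range(0, len(data), LOCAL_STORE):
        staged = data[offset:offset + LOCAL_STORE]  # explicit transfer step
        total += sum(staged)
    return total

data = list(range(10_000))
print(sum_shared(data), sum_local_store(data))  # same answer, unequal effort
```

    Both functions produce the same sum; the point is that the second forces every algorithm to be restructured around the staging step, which is the “bitch to program” part the comment refers to.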

  68. Great article! 

    I’ll try to keep this as coherent as possible.  I’ve been primarily a console gamer all my life, going back to the Atari 2600.  My favorite consoles of all time are the SNES, Dreamcast, PS2 & Wii.  Yes, I love the Wii.

    I’ve skipped the PS3 & Xbox 360 iterations for a few reasons.  The Wii was cheaper, and now that I have kids it makes much more sense as a family console.  Plus – kids love the motion control thing.  But economics play a big part of it – Wii games are/were cheaper, plus I really detest the model where buggy games are rushed out so that they can be patched later.  To me that’s sloppy & lazy, but also representative of many other similar problems in the industry.

    So, my daughter had a birthday recently and wanted some sort of gaming device.  We went with an Android tablet because – again – kids LOVE touchscreens and we can fill the thing with games for free (or very cheaply).   Do the Big 3 have a strategy regarding the “casualization” of gaming, and the fact that many people now have powerful computers on them most of the time, which just incidentally have countless free games?

    Another reason that I took a break from the PS3 (after having bought a PSX and PS2 at launch) was that I’ve been extremely turned off by Sony’s empty buzzword arrogance during the whole PS3 era.   And it seems that it’s now their primary corporate philosophy.   You know, maybe it was there during the PSX & PS2 days and I just didn’t notice, I don’t know.

    This article is very interesting in light of the recent Wii U launch, during which I’ve read countless laments, jeers and taunts about how the Wii U is maybe only a little more powerful than the current generation of consoles.  I don’t know what the truth of that is, I haven’t read anyone definitively say what is what in that regard, but my main thought has been “graphics do not equal a great game”.  But that is exactly….EXACTLY…where Sony & Microsoft’s bread is buttered.  There are so many people who care *primarily* about graphics that it’s hard to argue otherwise.  Of course graphics are an easy litmus test for “power” and technology and how things have advanced.

    Anyway, John, this is a great article and I especially appreciate your comments on technology vs. creativity.   Sony executives, et al, always come off as the sort of vapid, faux hip buzzword douchebags that are often satirized in movies, and it seems that this presentation was no exception.  Interesting that they’ve gone back to the “emotion” idea again. 

    Videogames are an area, an artform, with limitless creative possibilities, but it’s like everyone, all the big guys with the money, anyway, just want to use their airguns to spray glittery paint all over everything. 

    One last thing (sorry this is so long):  apparently Wii U sales have been disappointing.  Well, we’ll see how the PS4 and Xbox whatever do.  My bet is that they’ll both be very expensive, and will their sales be any better now that we’re more fully into the era of “app” gaming?

  69. MSUSteve says:

    Well I WAS excited for the PS4.  Thanks a lot, Mr. Teti!

    • Effigy_Power says:

      Fret not. There will always be other consoles to attach your brainstem to facebook and wire your house with cameras.

  70. welldoneson says:

    Watch me scintillate.

    That’ll be $1.29, please.

    OK, OK, fine, I’ll try to be less flippant.
    Of course game manufacturers try to promote the latest hardware.
    There are how many million PS3’s out there, with which gamers and developers seem, somehow, to have been able to make do.

    The hardware has evolved in terms of CPU speed, memory cost, monitor cost, and all that.  So, a hardware upgrade is needed.
    The PS3 has something like 256 meg of RAM.
    Good heavens, man, my desktop has 4 times that just on the video board.  Time marches on.

    The trick is to not only use that capability, but get people to buy it.
    So lighten up, John, it’s a free market.

    Sort of.

  71. ferrarimanf355 says:

    DriveClub looks awesome, but I’m convinced that the next Xbox will come with a new Project Gotham Racing game, so I’ll wait and see. 

    No backwards compatibility means I’ll hang on to my PS3 for a while longer. 

  72. Halloween_Jack says:

    I just have to offer my appreciation of this line: “(Never mind that “emotion” was explicitly advertised as a feature of the PlayStation 2—the PlayStation 2 is old, and therefore it is a shitbox of lies.)” 

    Also, too, is the delay of gratification of the average consumer so atrophied now that you can’t wait for a game to finish downloading before you start playing it? (I’ve seen that feature advertised on PC games, and my usual experience is that trying to do so quickly weaned me of any desire for them.) 

  73. goawayinternet says:

    I see where you’re coming from, but I guarantee you Microsoft will be worse.  Their box will have the same social crap, the same fancy graphics, but it will also have advertisements, Kinect everything, and Microsoft’s awful bloated interfaces and software.  

    The 8GB of GDDR5 RAM is a pretty big deal from what I understand, as far as giving the console a lot more capability, and not just concerning graphics. I doubt they have all that many indie-style games in development yet, but the fact that they brought out J Blow at all is a good sign – I think Sony does understand the value of them.

    • George_Liquor says:

      I certainly agree that MS’ next console announcement will be just as bad, but that doesn’t make Sony’s latest dog & pony show any less vapid.

      As for RAM, well… it’s just RAM. It has no intrinsic game-enhancing properties; it’s just used to temporarily store data loaded from permanent storage. I think the bigger deal with the PS4 will be its purported ease of development. Hopefully we won’t see the same parade of late, sub-par, bug-ridden cross-platform games we saw with the PS3.

      • GaryX says:

        I think a jump from 512MB of RAM to 8GB, though, should let us see some sort of change in design beyond just graphics.

        • George_Liquor says:

          It’ll be *more* of what we’ve already seen. More texture memory means bigger and/or more textures mapped to bigger and/or more polygons. More system memory means faster load times, bigger levels, more characters on screen, etc. *Maybe* it means smarter AI too, though AI decision trees are rarely memory hogs. 

          It’s not going to be a watershed moment in videogaming is what I’m trying to say.
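          To put rough numbers on “more texture memory means bigger and/or more textures,” here is a back-of-the-envelope sketch. It assumes uncompressed 32-bit RGBA textures with full mipmap chains, the PS3’s 256MB of dedicated video RAM, and the PS4’s announced 8GB pool; real engines compress textures and share that pool with everything else, so treat these as illustrative ratios, not actual budgets.

```python
def texture_mib(width, height, bytes_per_pixel=4, mipmaps=True):
    """Approximate VRAM footprint of one uncompressed texture, in MiB.
    A full mipmap chain adds roughly one third on top of the base level."""
    size = width * height * bytes_per_pixel
    if mipmaps:
        size = size * 4 // 3
    return size / (1024 * 1024)

per_texture = texture_mib(2048, 2048)   # about 21.3 MiB each
ps3_budget = 256                        # MiB of dedicated video RAM
ps4_budget = 8 * 1024                   # MiB, shared unified pool

print(int(ps3_budget // per_texture))   # how many fit in the PS3 budget
print(int(ps4_budget // per_texture))   # how many fit in the PS4 pool
```

          Roughly a dozen such textures fill the PS3’s video RAM, versus several hundred in the PS4’s pool: more of the same, exactly as the comment says, rather than a qualitative change.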


      oh yes, I have 0% interest in the next Xbox 

  75. the_calf_fiend says:

    Now there will be a PlayStation 4. This means that in about 2 years I will finally think it worth the dough to buy a used PS3, hole up in my room, and feel like this.

  76. Dan Brooks says:

    What an exercise in snarkiness. You take potshots throughout, including a jab at Mark Cerny for his delivery — it seemed to me like he was a little nervous and maybe not the best public speaker. Is that something to criticize?

    It was a marketing event. No PS4 shown? Wait until E3. Other companies have done this in the past — it shouldn’t be a surprise. They showed games, gave a release window, showed the controller, talked about the system under the hood. That’s a lot. I’m excited for it. Is that ok?

    • Effigy_Power says:

      Why do you people react as though someone stepped on your tie? It’s an opinion piece.
      John states that the entire presentation (and the live-stream corroborates that) was about more horse-power and more social. That is, no matter how much you’re into graphics and effects, not exactly innovation.
      And it is, in all fairness, exactly the same that every company ever has said at the reveal of any hardware upgrade or game-sequel. The fact that this is common makes it no less groan-worthy. A year ago Sony was still pretty much stating that they thought the PS3 is pretty damn great, or they wouldn’t be releasing more and more games for it. Now suddenly it’s a bucket of room-temperature fish-guts.
      This repetitive statement does wear a little thin, always discarding the past as obsolete. No wonder it took someone in the industry so long to catalog classic games and hardware if even the makers show so little regard for their own ancestry.

      So yes, of course you’re allowed to be excited. I fail to find a line anywhere in the article that tells you to be as cynical (justifiably or not) as John is. And I am sure the PS4 will be capable of intense graphical wonders. But to equate the technology at this point with the limit of emotionality in games, considering that many pixelated flash games manage to convey more feeling than most AAA titles bother to, is bogus and righteously called out as such.
      50 Marines talking jingoism and shooting at stuff will not be emotionally gripping regardless of whether I can read the imprint on the ejected shells. And if technology is the limiting factor of AI, then I have to wonder why the AI in the original Unreal Tournament was so much better than that of Aliens: Colonial Marines…
      The limiting factor for emotion and logic lies between the monitor and the back of the seat far more often than inside the machine.

  77. the_calf_fiend says:

    “the Sony PlayStation 5, a 2,048-bit console featuring a 45-Ghz trinary
    processor, CineReal graphics booster with 2-gig biotexturing, and an RSP
    connector for 360-degree online-immersion play”

  78. I agree on your points about limitations being a boon to creative game design, but just from a technology perspective, don’t you think we’re due for an upgrade in terms of services over the current generation of consoles? The increased graphical capabilities won’t make nearly as much of a difference as making a box that is easy to program and distribute to will. 

    I mean, imagine the types of games we could see once it’s viable for developers to make smaller, more focused games instead of ONLY sprawling AAA titles? That’s worth something, right? 

  79. Chum Joely says:

    A lot of John’s criticisms seem to relate to how he perceived this marketing event as a consumer of games (player, gamer, etc.). But there were some important points in the presentation that were more directly of interest to game developers, and which John didn’t really mention. Here’s an interesting article at a developer-oriented site about industry attendees’ impressions of the PS4 presentation. Sorry, you have to register to read it, and I don’t remember whether you have to somehow prove you’re in the industry in order to register.

    To me, the most important point here is the part of Cerny’s talk that John waved away after a quick quotation about “supercharged PC architecture”. The “PC” bit is an enormous change for PlayStation, since the PS3 is based on a completely different architecture called Cell which is (so I hear) notoriously difficult to work with on the programming side. In this case, the technological change absolutely does greatly increase the number of programmers who will have the necessary skill to program for the PS4 as opposed to the PS3– more programmers know the PC-type chip architecture than the Cell architecture. So, to me, it does seem reasonable to think that this tech change (not to mention the added RAM, which John also mainly smirked at) will make it a lot easier for developers to implement their crazy new ideas. As others have pointed out, “more polygons” is not the only thing you can do with a significantly improved underlying chip architecture.

    • Phillip Collector says:

      Couldn’t agree more. In the wake of the Cell processor debacle, the “supercharged PC” comment was an important signal to developers to understand how much easier the PS4 will be to program for.

      It was the closest Sony is going to get to coming out and saying, “look developers, we fucked up with the Cell and we’re sorry we made everything more difficult for everybody involved. We’ve changed our ways and we hope you’ll come back and give us a second shot”.

      That’s not something that should be dismissed.

      • George_Liquor says:

        I’m sure there was a collective sigh of relief from the developers when Sony made that “supercharged PC” announcement.

  80. urrrborrr says:

    They waited 8 years. It’s tantamount to admitting the non-revolution. Considering the new consoles’ life cycles, their presentation is the equivalent of “getting stoked for the next decade of video games,” which is worthy enough.

    “Skyrim” is already unplayable on consoles vs. a $400 laptop because of the loading screens. It’s time. If you want to mod the “Oregon Trail” into a Sartrean existential narrative within “constraints,” you can do it in a browser. 

    You’re better off just focusing on how disgusting you find corporate presentations and mainstream games in general, if you want to be interesting in a dignified way. The article is infused with a smarmy, quasi-delusional sense of moral rectitude.

  81. tedthefed says:

    The Playstation 5 is just going to be a big empty room with a webcam in it. “Now, the gamer is completely unhindered by technology and has pure freedom to do whatever he wants, and it’s all uploaded instantly to the internet! See, this one has already spray-painted ‘Bitches suck!’ on the wall!”

  82. Brainspore says:

    If only “superior hardware performance” were strongly correlated with “fun and innovative gameplay.”

    • ApesMa says:

      Obvious observation, but it’s the opposite nowadays. Smaller, downloadable independent games is where the innovation’s at. Superior hardware performance means even higher production costs. You can’t take risks with AAA budgets and employers who demand you meet specific sales and Metascore projections.

  83. Andy Tuttle says:

    Damn John Teti, tell us how you really feel about the PlayStation 4. You bring up a lot of really great points in your article and I found it very insightful.

  84. D3ADP0OL says:

    When I read that Mark Cerny is both lead architect on the PS4 and designing Knack, my jaw dropped! He’s the same guy who made Marble Madness!

  85. Professor_Cuntburglar says:

    If the PS4 can’t support a 4-player splitscreen GTA-style game, then what’s the point, really?


    man, we live in a cynical age, remember when the announcement of a new Playstation would make you shit yourself with excitement? now everyone, including me, has a strong feeling of “meh” 

    I can only assume that what’s going on here is just the fact that console gaming is on its way out, I’m sorry to beat a dead horse, I’m sure I’ve said this somewhere before on here, but PC gaming is kicking consoles’ asses, that’s the future of gaming

    in this day and age, as technology has evolved and computers have become so ubiquitous, it just makes the most sense to cut out the middle man of Sony or what have you and just hook up a gaming PC to your TV, download Steam and there you go, what more do you need?

    the idea of a “gamebox”, a device whose main purpose is just playing games, is already a dinosaur of a concept, no matter how many gimmicks and frills they add in an attempt to survive 

    here’s the future of gaming, for the average joes they’ll have their cellphone games and for the rest of us, we’ll have our PCs, I can pretty much guarantee you the PS4, the Wii U and the NextXbox will be the last consoles ever released, the magic is just simply gone from console gaming, the world has changed….

    • Thomas Desmond says:

      Console gaming is dead? Really? I haven’t bought a system since the PS2, I just genuinely lost interest in gaming, but aren’t console games a multi-billion-dollar-a-year industry? I’m not disagreeing with your view on consoles, just your math.


        they’re a multi-billion dollar a year industry now, but that could change 

      • DoctorMemory says:

        It’s not dead.  And if it’s dying, it’s not going to be killed by standalone tower PCs with mice and keyboards.

        Basically a lot of people with a little too much nostalgia for the mid-90s are looking at sales data from three consoles at the end of their sales cycle, and declaring with entirely unwarranted confidence the resurrection of the PC gaming platform — while meanwhile trying very hard to ignore the simultaneous contraction of the “PC” market.

        A year after the PS4 and 720 have launched, we’ll see what we shall see. They have some challenges ahead of them, but the unspoken elephant in the room isn’t the “PC”, it’s Apple and Google.

        • Halloween_Jack says:

          unspoken elephant in the room isn’t the “PC”, it’s Apple and Google.

          There have been rumors that the next Apple event will be centered around AppleTV. A lot of people have speculated that Apple is making a literal TV set instead of just a Roku-type set-top box, but there has also been talk of making more apps for AppleTV a la the XBox apps; what if… AppleTV were going to be turned into a gaming platform?

    • valondar says:

      As much as I’d like to agree that PC gaming is on the rise, realistically I think consoles are going to remain the dominant part of the gaming landscape for the foreseeable future. We can rag on the PS4 as much as we want, but will it really not make Sony stupid amounts of money? Because I kind of suspect it will.

    • Leaving aside that this argument has been going on for 20 years with very little to show for it, why would you say PCs are winning now, when the whole world is falling all over themselves to point out that PC sales are dropping like a rock?

      • Merve says:

        I’m curious: are PC sales data broken down by gaming PCs, family desktops, and work machines? It’s possible that gaming PC sales could be on the rise while overall PC sales are down due to family desktops being replaced with tablets and laptops.



  88. japanesebrucewillis says:

    Were we watching the same conference, John? The whole premise of Sony’s ethos is to create the tools with which developer creation can best be facilitated. The same ethos that’s allowed for Journey, Flower, ICO/SOTC, Unfinished Swan, etc. Moreover, they’re making moves to further facilitate self-publishing.

    The Healthy Skepticism angle on this site can sometimes spill over into unhealthy cynicism. I agree the bang-bang shoot-shoot explosions were way over the top… but welcome to every console unveiling ever.

  89. Kevin King says:

    The real question: which console will end up being Poochy?

  90. Ni_Go_Zero_Ichi says:

    I can’t say how satisfying it is to see a major gaming website give David Cage the ravaging he so achingly deserves.

  91. Adam Lacoste says:

    I love this article so much that I want to amend the constitution so I can marry it.

  92. Obviously a new console isn’t really going to “change everything,” but I’d say being stuck with 7-year-old hardware is a pretty significant limitation that is being removed.

  93. cjob3 says:

    I don’t like all the focus on making video games more “social.” If I wanted to be social, I wouldn’t be home playing video games. 

  94. Baramos x says:

    Ugh, this looks terrible.

    And yes I own a PS3. I had to wait five years to buy the thing.

  95. Baramos x says:

    This article was perfect, by the way.

  96. I don’t think I agree with this all that much. This whole “More, more, more!” notion wasn’t my takeaway from the presentation. Sure, David Cage talked out of his ass, but that’s the type of pretentious he is. Chris Nolan has his IMAX cameras, and David Cage has polygons and mo-cap technologies; I still find both to be extremely interesting mainstream storytellers, despite how heavily they feel technology plays a role in their storytelling. I have no interest in Killzone as a franchise, but I understand that that is the type of title that will get average game enthusiasts excited, and so I have no issue with it being what it is. The fact that Blow was at the event at all was impressive to me, and made me happy. I mean, Media Molecule put on a puppet show as a tech demo, which “gamers” undoubtedly thought was weird but which I found extremely charming. 

    More processing and more RAM were touted, sure, but I got the impression that Sony was more interested in providing developers with better and smarter tools, rather than sheer power. Games will look prettier, yeah, but I think all of this tech stuff is going to play a much larger role in the aspects of games that we don’t necessarily see. Maybe I’m just not jaded; maybe these big conferences still make me excited, but I’m pretty enthused about the future of games, especially on PlayStation 4. 

  97. The Colonel says:

    Just excellent writing. I don’t give a shit about video games anymore, but that was captivating and enlightening. Great work!

  98. E M says:

    Very silly article. I could easily imagine this same article being written at the advent of the PlayStation 2, saying that Sony touting more polygons will not solve the problem of bad games. Bad art is inevitable in any medium. Better technology can make good art greater. As we have seen since the PS2, the technology has created entirely new genres, such as the open-world game, and made existing genres take on whole new dimensions, such as first-person shooters. Looking back in 5 years, this article will look just as silly as such an article written about the PS2 or PS3. 

    The concluding paragraph is especially egregious, as almost everything said in it is wrong-headed. Indie games have never been more successful in the history of the console industry than they are today. Furthermore, games are getting easier to make, not harder. Look at how many games are being made on Unreal Engine 3, which is being constantly improved. Watch Geoff Keighley’s interview with the guy from Epic right after the Sony press conference. He said that Unreal Engine 4 allows games to be made more easily than Unreal Engine 3. What Teti has failed to account for is that raw technology is not the only thing that improves; the ability and creativity in making that technology easier to implement also improve. This article is essentially a reflection of Teti’s deep, unwarranted cynicism towards the possible benefits of progress. It is easy to say that what has come before is best; that limitations are what make old art great. I believe that it is the artist that makes art great. The further you remove the barriers between an artist and his/her artistic vision, the more engaging the resulting art.

    • ApesMa says:

      “This article is essentially a reflection of Teti’s deep, unwarranted cynicism towards the possible benefits of progress” – but it’s the *lack* of progress he is complaining about. They talk about innovation and new possibilities, but everything they present just points towards more of the same with better graphics.

      Nobody's saying there will be no more innovation in the next generation, but there have been fewer new gameplay possibilities offered with each new generation since 3D gaming was introduced, and that trend seems to continue. It's still very early and I hope I'm wrong, but what they presented here demonstrated the opposite of what they were trying to communicate, and makes me suspect they have no serious plans to set the stage for innovation.

      Your comments about games becoming easier to develop are intriguing, and I share your optimism about indie games. I guess games will have a similar evolution to music and movies; we increasingly have to stay away from the big hits to find the best stuff.


  99. Excellent article. I especially like the critique of David Cage’s statements and his patronizing analogy comparing inherently limited and deplorable art to silent film.

    That’s one of the reasons I love Gameological so much, because it’s a part of The AV Club. Way too many game journalists only know video games, whereas experience across media divisions is the key to writing truly insightful works. John Teti having the education to point out the significance of The Great Train Robbery is something other game writers miss, and it’s something game designers like David Cage miss, too. So for all his talk about emotion in video games, he’s coming at it exclusively from a video-game-oriented perspective, as opposed to a storytelling and character-developing perspective that could evoke emotion in any medium, regardless of technological limitations. Silent films told stories DIFFERENTLY than do modern sound films, but they didn’t necessarily tell them worse.

    Also, not that Jonathan Blow isn’t already a polarizing figure, but I find it ironic that the guy who was just complaining on Twitter about duplicate keys from a key copying machine having “ads” on them for the machine manufacturer would show up to a corporate shill event like this. Makes his statements about wanting to transform the industry seem disingenuous.

    • ApesMa says:

      Other sites are afraid of alienating their readers. An article like this on IGN would cause an uproar and many PS gamers would emigrate to Gamespot or some other “unbiased” site.

  100. Ben Hourigan says:

    Interesting article, but I think it’s misguided. Yes, the marketing speak is bullshit. Yes, it was already possible for games to evoke emotion.

    But this line is flat-out wrong: “Making art is hard. No microchip changes that.” The PS3 had a really strange processor architecture, and all reports point to it being a complete pain to program for. By using relatively standard components (an x86 processor and an ATI GPU), the PS4 will be easier to make art with than the PS3.

    Much of what appears on the platform will be shallow garbage, shinier reiterations of old, bankrupt tropes and mechanics for the kind of people whose girlfriends listen to Nicki Minaj and Ke$ha. But there will also be things like the PS3’s Journey, which is, in game form, a beautiful, surprising, and readily intelligible story of the spiritual quest which has as its end the transcendence of individual selfhood.

    There may be something to rejoice in, in this privilege of ours: the luxury of giving new and exuberant expression to our mythologies. It is not necessarily a mere matter of inventing new things to desire: it is also one of inventing new ways to create meaning.

  102. Nicholas M says:

    “The Great Train Robbery is a masterpiece not in spite of its limitations but because of them. So if David Cage doesn’t think he can produce an emotional work of art with a PlayStation 3 and an eight-figure budget, maybe he shouldn’t be in the art-making business.”

    That zinger done went ‘n earned you a Twitter follower, friend.

  103. Merve says:

    I think the article makes the important point that advancement and innovation don’t always go hand in hand. But I don’t necessarily see that as a bad thing. Believe me: I love new ideas – weird, wacky, wonderful ideas. But I can also appreciate and enjoy an old idea executed well. Sure, I love experimental interactive fiction like Analogue: A Hate Story. But I also enjoy shooting dudes in the face, and I can’t deny that shooting dudes in the face is way more fun in high-definition than at 640×480.
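    For what it’s worth, Merve’s resolution point is easy to quantify. A quick sketch of raw pixel counts per frame, ignoring color depth, frame rate, and everything else that also grew:

```python
def pixels(width, height):
    """Pixels per frame at a given resolution."""
    return width * height

sd = pixels(640, 480)        # 307,200 pixels
hd720 = pixels(1280, 720)    # 921,600 pixels
hd1080 = pixels(1920, 1080)  # 2,073,600 pixels

print(hd1080 / sd)     # 6.75x the pixels of 640x480
print(hd1080 / hd720)  # 2.25x the pixels of 720p
```

    So a 1080p face-shooting session pushes 6.75 times the pixels of the 640×480 days, which is a real, if purely quantitative, upgrade.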

  104. Sir Phobos says:

    Great review. This sounds like something I’m going to rent and get too angry to finish. I hate games that do everything for you. Might as well just make it a 10 hour cutscene, which is what MGS4 basically did.

  105. Chum Joely says:

    Look out, Gameological. Large numbers of people in the industry are posting this link back and forth to each other. Prepare to be famous.

    • ApesMa says:

      Oh I hope this is true, they need to read this. I’m sure AVC having this as their headlining feature all day helped a lot.

  106. Ladyfingers says:

    I dunno, I think this article is coming down on Sony for what, if you ask me, should make for massively easier development and therefore more flexibility to create emotional content. It’s a nonsensical Luddite screed.

  107. Simon Drommel says:

    Best article ever

  108. Shane McKinley says:

    Looking back…this is the first time I’ve felt really, really unimpressed with a new system coming. I’m sure the new Xbox will have plenty of empty buzzwords and other dumb crap.

    • ApesMa says:

      It was inevitable really. HD graphics are enough for most of us until 4K becomes commonplace. Until then new consoles are probably going to be a harder sell than before. There won’t be any new physical formats to use as a selling point either.

      Maybe Nintendo’s strategy will pay off in the end this generation, people might be more interested in the gamepad and HD Mario, Zelda etc. than… whatever this thing is offering.

      • tvugly says:

        But the current generation of consoles isn’t actually HD; if you’re lucky you get 720p. It’s all muddy up-resing.

        • ApesMa says:

          Isn’t at least the Wii U 1080p? Looked like it when I played it once, but I may be mistaken. Anyway, most people, like me, don’t really care that much. 720p looks fine.

          Remember how long many people were perfectly happy with their Wiis and DVD players even after buying HDTVs?

  109. tvugly says:

    Truly an article written by gamers for gamers. By that I mean more fucking complaining.

  110. KB says:

    Did Sony say anything about backward compatibility or the patent they recently applied for to make technology that would prevent the use of used games?

    • ApesMa says:

      No backward compatibility because the technology is very different this time around. No anti-used-game tech either.

  111. RobDX says:

    Got linked to this from an IGN writer; what a waste of my damn time. And to the commenter who said that Nintendo “updates their AAA titles and changes them up”: New Super Mario Bros has been 98% the same since the first one came out… and I think you can put Zelda in that category too.

    • ApesMa says:

      NSMB is not one of their AAA titles. It’s right there in the comment you are referring to. (There’s a reply button, by the way.) Those are made quickly by relatively small teams, while they spend many years and deploy huge teams of their most talented developers on Zelda and the 3D Mario platformers.

      Nintendo milk their less prestigious (though extremely popular, in the case of NSMB) franchises as badly as anyone; let there be no doubt about that.

  112. spookymulder8 says:

    What? Sony supports plenty of unique games. Are we just going to pretend stuff like Flower, Unfinished Swan, Journey, Heavy Rain, Puppeteer, Rain, Ico, LittleBigPlanet, Shadow of the Colossus, and even the continued support of stuff like Last Guardian etc. don’t exist?

    Sony’s meeting is geared towards investors and developers as well. This means marketing points about checking off the things the world cares about that have been successful and profitable: social networking, which is what gets Sony’s stock up. And it was also about ease of development and what developers can expect to work with, which is important.

    This article is just plain whiny and dumb. It’s like complaining that a commercial was trying to sell you something. The lol-worthy comment about creativity and limitations, while bearing some truth, is equally idiotic as a comparison. Creativity also thrives on expanded technological possibilities and monetary investments, which is why cinema has grown so much in what it can do.

    Many revered games, like Silent Hill 2, wouldn’t be the same without 3D engines, sound, and music, had we remained with side-scrollers and MIDI. Films are grand because their stories involve these things called people and reality, which come pre-built. In games, those things need to catch up to reality to match what we take for granted in film. And even film had to advance to get better sound, better picture quality, better FX. So: more, more, more of the same! Similarly, games need to improve more of the same: graphics, physics, AI, etc. Those are basic things that work and need refinement and advancement. You sound like you want a whole new paradigm shift. We might as well all just go back to stage theater and puppet shows. Who needs those moving-pictures gimmicks?

    • Merve says:

      You’re correct to point out that publishers and console manufacturers put on these shows and press conferences in an attempt to impress investors. Unfortunately, that didn’t work out for Sony, as their share price actually fell after the PS4 unveiling (perhaps because the actual machine wasn’t unveiled!).

  113. ClementC says:

    I think that almost every part of this article is wrong. Yeah, Killzone is just another cover shooter, and dragons are dragons, but did you notice that John Teti makes no mention whatsoever of Media Molecule’s stunning 3D sculpture/performance art demonstration? Could that be because that one demo completely demolishes the thesis of his article, that there’s nothing new under the sun and Sony isn’t promoting any kind of innovation?
    I love bashing Sony as much as the next guy, but to claim that Sony is only about more gigabytes and doesn’t foster innovative, emotional experiences is just plain false.

    I saw a dramatic change in Sony’s attitude and approach to developers, to gamers, and to technology in that conference. I saw a rejection of the arrogance that gave us the Cell-based PS3 and the crippled PlayStation Network. I saw a commitment to technology for making games better, not just for technology’s sake. We won’t know if it’s a sincere change until later this year, but to condemn it as shallow and more of the same seems to me to be a prejudiced view of new leadership and a new product.

  114. Benjohn Barnes says:

    One of my favourite games last year was Super Hexagon. I’m not a big gamer, but for me it was amazingly fresh, fantastic fun, hugely addictive, a learning experience … just great.

    Something especially wonderful about Super Hexagon is that, pretty much, it could have been created 30 years ago with a vector display, old old processors, and mad sound chips. 

    Super Hexagon didn’t wait 30 years for technology to get powerful enough. It waited 30 years for someone to try that idea out.

  115. Nimran Ali says:

    I’m going to play devil’s advocate; my experience is in repairing and customizing PCs for customers and myself. Yes, games will always be dependent on the creativity and loftiness of developers, or even very sound gaming concepts. But the threshold has to be pushed on what a system is capable of doing, especially if you want to integrate the web into it. One of the more recent games I played was an MMO, The Secret World. Good story, but what my friends and I loved were the investigation missions, where we were cracking codes and using the fully working in-game browser to do “real life” research. If you don’t have the RAM to have a browser window open and play the game (like the current PS3), then this is a good concept that is slowed by tech. Also, there has to be a measure of future-proofing in a system, because the timeline for systems is longer now. It should last the better part of a decade.

  116. FATKEV says:

    Probably the most bitter article about the PS4 yet!

    It was a marketing event, so there was some marketing! There is new tech, so there were some new tech demos!
    What were you expecting? What have the previous next-gen launches shown you that led you to believe this would be any different (apart from the actual shiny black box)?

  117. Brad Grenz says:

    Creativity thrives under limitations? Good point! That’s why we only need the studio system for our movies! That’s why no one has needed to record more than 4 tracks on an album! Oh, and who needs more than one color to paint with?

    In truth, certain kinds of creativity thrive with limitations. Others are strangled in their crib. We are well short of an era where the hardware in game systems is adequate to realize the creative visions of so many game creators, and to champion a rejection of advancement is pure madness.

    The current consoles are stifling the industry. More is exactly what we need. It’s not a coincidence that the games have all started looking the same each fall. We’ve been at the practical limit of existing systems for some time, and you literally cannot push the boundaries any further without new hardware.

  118. Shinsei Chan says:

    “The Evolution producer then showed us how DriveClub players can challenge online friends to beat their best race times, a feature already present in practically every racing game made in the past two years.”
    I imagine some game developers as little creatures who live in an attic and spawn a great idea but have lost track of time and space. They might think they have an amazing idea (which no one had before, of course), and in the end they’re quite disappointed that their sales are not as high as expected.
    Thank you for such a great article!

  119. Brian Dell says:

    Saying that the guy who made Heavy Rain maybe shouldn’t be in the art-making business makes you sound like you don’t really understand why technology is important to art at all. Maybe you shouldn’t be in the writing business.

  120. Long_Dong_Donkey_Kong says:

    Great article. The PS4 presentation provided ammo for the “video games are not art” folks. If Sony’s idea of better games is better graphics, more enemies on screen, and more explosions, then the budgets of these games will price out creativity, and the highest level of art we can get is the equivalent of a Michael Bay movie – unless you consider a high-polygon old man’s eyes to be art. The way to make games better is to change the way we play them. Only Nintendo seems to be on board with this, and they only seem intent on changing the way games are controlled, meaning you might use a touch screen to help save Princess Peach, but you are still saving Princess Peach.

    Video game publishers would rather “give Malibu Stacy a new hat” every year and roll out another Madden or COD because they can sell millions instead of making games that could be critically acclaimed, but sell a few hundred thousand. Until publishers use their money from Maddens, CODs, etc. to fund creative projects that change the way we feel, react, or interact with games, Ebert wins.

    • ApesMa says:

      I hope Nintendo will demonstrate the true potential of the gamepad once their more ambitious games are done. I doubt many third party developers ever will though. Luckily Nintendo, their second party developers and the few third party developers who are willing to make an effort will provide plenty of great games in the years to come.

      Many gamers don’t seem to realize how critical the evolution of new interfaces is to the future of a gaming industry that is in serious danger of growing stale and losing mass appeal. I understand the objections to motion controls, but the gamepad seems like something the hardcore and casual gamers alike should be able to embrace. Hardcore gamers seem staunchly conservative in this regard though, PS gamers clinging to a long outdated controller (switch the d-pad and left analogue already).

      Most people will not be able to see that PS4 brings anything significantly new to the table, and neither do I really. You know they’re in trouble when utterly useless and almost certainly really annoying social media integration is touted as a major new feature.

      • Long_Dong_Donkey_Kong says:

        You show me somebody whose every video game accomplishment is posted on his Twitter or Facebook feeds, and I’ll show you somebody who is blocked by all of his followers/friends.

        I really wanted Sony to think outside of the box and offer new experiences, and instead they showed me experiences that were possible in the PS2/Cube/XBOX generation – just prettier. I have no faith that Microsoft will offer anything other than a “whip-it-out-and-measure-it” comparison with the PS4 when they unveil their new system.

        • ApesMa says:

          My thoughts exactly. I’ve heard some bad things about Nintendo’s Miiverse thing and can’t see the point of it, but they at least score major points for apparently making it entirely unconnected from Facebook/Twitter/etc. and instead strictly insular and game-focused (correct me if I’m wrong).

          As far as true innovation is concerned, I guess it’s mostly up to small budget downloadable games, indie and otherwise, from now on.

  121. b9328 says:

    I have no expectation for the press conference of a multi-national corporation to be “authentic”. Maybe I should have that expectation.

    Despite this flaw they had some good moments for a reveal that is 8 months ahead of the release. The hardware looks well thought out. The controller, which received many snarky remarks on the web, adds something console controllers have not had: a locator device that is more precise than analog sticks. Plus, they amazingly integrated PlayStation Move-like technology. This controller probably could not have been made at any price when the first DualShock was released, but now it will be made and will cost $49.

    Sony also dropped the PS3 marketing speak that made it sound like they didn’t really care about games but about being the center of a home’s entertainment.  Taking time to talk about their interest in their most valuable customer is pretty positive in my opinion. 

    And I think Mr Teti used a bit of hyperbole in describing Sony’s repudiation of past marketing.  1) Who cares, it is dumb marketing and 2) the repudiations were much milder than he implied.  I understand he was being sarcastic, but still, he made it sound like Sony was terrible for doing this.  I am happy they didn’t pretend they had not been wrong in the past.

    And contrary to some commenters, today’s hardware _is_ a problem for developers of all kinds, maybe for indie developers most of all. Delivering an HD game for the Xbox 360 or PS3 was very hard, considering that even Bungie’s first 720p game was their _third_ game for the Xbox 360. These consoles are underpowered for HD, which means you need lots of labor to get games to look and play well.

    Since the resolution is staying the same, the PS4 will make game development much easier for all. And with the ability to use Unreal 4’s scripting engine, I think you might find game designers who will build games with far less technical help (read: less money).

    Sony started working on the PS4 in 2008. I think they should get the benefit of the doubt after just an introductory press conference.  We can wait to trash them the day after the PS4 is released. :-)

  122. brando120 says:

    “Creativity thrives under limitations.”
    Case in point: Star Wars 4-6 v. Star Wars 1-3.

  123. erikfinnegan says:

    I’m a PC. And I’m the ultimate gaming machine.

  124. Voltech44 says:

    You know, it’s funny.  My brother loaded up the Sony stream and invited me to watch it with him, but I decided against it because (if E3 was any indication) it would have been a bore, and I could get all the info I needed from my usual sites.  But he invited a friend over for a viewing party of sorts, so I got roped in.

    Lo and behold, the reveal was a bore.  I think our friend came close to falling asleep at a few points.

    I know that it won’t be long before the PS4 becomes an “it-item”, and inevitably gamers are going to start lining up to grab it no matter what.  But there are a lot of issues that need sorting out besides buzzword-filled conferences; if one of the earliest highlights of a show designed to show off new potential and innovation is another Killzone, then something has clearly gone amiss.  And the less said about David Cage, the better off I’ll be.

    But that aside, this was an absolutely fantastic post.  And thanks to it, I saw The Great Train Robbery for the first time.  I’d call that a victory in itself.

  125. Alan Kleiman says:

    “Creativity thrives under limitations.”

    You shouldn’t rely on unsupported claims for articles; it’s poor form and intellectually dishonest. This particular one is a false aphorism; at the broadest possible interpretation, it’s just trivially untrue. A more powerful machine will be capable of producing a much larger variety of games than a less powerful machine. Claiming otherwise is denialism. 

    The narrower interpretation is one in which you suggest you have particular insight into the nature of creativity; this is false. We don’t; if we knew how to bring out creativity, we’d be able to do so reliably. It’s just not a process we understand; let’s stop claiming we have any insight into it, then, and just raise the level of discourse all around.

    To be clear, this isn’t a personal attack against you — I have no real insight into how creativity works for myself, much less in a general sense. But that’s the point — no one does.

    • Fyodor Douchetoevsky says:

      Pretty sure that the whole “creativity thrives under limitations” thing has some truth to it. Of course someone can be creative without limits, but forcing yourself to work around whatever limits requires more creative thinking. Loads of writing exercises and stuff are based around this. I don’t think it’s a ridiculous idea, and it’s certainly not a claim that you can only be creative when you’re limited in some way.

  126. Harold Moen says:

    Excellent read, and I feel it has some points to make about cinema as well. For some reason the architecture of cinema has become a selling point: IMAX, 3D, 48 frames per second. This doesn’t make the movie better. Just the same with video games. Video games have, for obvious reasons, always been a hardware-first ecosystem; selling the hardware means selling the ideas. Cinema had to progress because of internal pressure. Casablanca could not have been a silent film.

    New technology means new avenues of creation. Not better art, but new art. One of my favorite games of the last generation was Journey. This game could have been made (with some changes) for an Atari 2600. However, that game would not have made the same impression on me. Yes, blockbuster movies will be made using state-of-the-art technology, but so will games akin to Forrest Gump (which made use of modern technology to a great extent). I think the next several years will provide a wealth of wonderful games, games that perhaps couldn’t have been made in the last generation. That doesn’t make the last generation’s art pitiable or laughable. Hollywood doesn’t unveil their new cameras; they make movies.

    Someday perhaps the same reality will be true of video game developers.

  127. Nick Kitchingman says:

    This seems needlessly harsh. I thought Sony handled this event pretty well overall, and I’m not really a console gamer. What’s wrong with people getting excited about new hardware? Would you be happier if Sony had just sent out an email saying, “We have this PS4 thing coming out in a few months; it’s kind of shitty; buy it anyway”?

    I am cautiously optimistic about the ps4.

  128. A different conceptualisation approach for videogames is badly needed. However, and I say this with respect, you should probably research more into game creation, for someone who writes about it so seriously.

    For instance, native C# on PS4 is a day-and-night difference. It will allow development effort/time to go into game conceptualisation, prototyping, and development. I think Sony’s approach is actually pretty good on the engineering side; you can have a zillion good ideas… how, and who, is going to program it is still the issue.

  129. There are more sides to the story of “moar” (memory, cores, MIPS, FLOPS, etc.). And I often feel like they’re told the wrong way.

    So one popular argument (among indies and low-fi/casual gamers) goes like this: “Games suck already; they don’t get better with more fancy graphics; we need to think differently.” This viewpoint is also what the above blog post represents; it’s the basic formula more != better. I won’t refute the point, as it has some validity, but it isn’t the entire truth either.

    The other argument, often made by AAA studios and the like, goes: “We just want more so we can do more awesome.” This story is the basic formula more == better. I won’t refute that point either; it’s a bit of a silly point of view, and it depends, but it’s also not the whole truth.

    Then there’s a kind of pseudo-middle ground of people aware that high-def games seem to be spiraling to death at ever-rising budgets. Their argument is not related to better or worse; it simply states that more == harder. There’s some truth in that as well, but it’s also not the whole truth.

    So if all of these above kind of represent certain truths, but miss out on others, what part is everybody *not* talking about?

    That’s really easy to see: more == easier. This completes the set of aspects on which you can argue, and I often like to point it out because it’s obvious (if not to the layman).

    So how can more be easier? In several subtle ways:

    – more hardware power means you can get away with optimizing less. This is simply a cost saving: less time to develop the same thing that previously was more difficult to develop. For instance, hardware tessellation makes otherwise-difficult LOD management easy enough that you can do it in an afternoon.

    – more hardware power making difficult things easier also, obviously, makes impossible things possible. For instance, realtime global illumination is for now mostly confined to multimillion-dollar companies with deep pockets and years of research (such as Frozenbyte, Crytek, Unreal, etc.). Most of the money they spent making realtime GI possible went into finding clever optimizations. If you can ease off the optimizations, realtime GI becomes more reachable for average developers.

    – more hardware power can give anything you did a little more bang. For instance, Minecraft could show you 10x more terrain than it currently does. That wouldn’t take Notch much effort, but it’d simply provide you with a little more niceness. If Notch wanted to do that without more hardware power, it would again be impossible, or hard to develop an optimization for. So again: more bang, less effort.

    – more hardware power makes it possible to rely to a greater extent on procedural content. Procedural content, you see, relies on precomputation a lot. This is because you can’t compute everything each frame, even where that would be desirable. But precomputation has a memory cost (it eats up RAM you could’ve used otherwise) and it drives up load times. So if you can do more procedural computation in each frame rather than precompute it, you have more load-time and memory budget free to supply your non-procedural assets, or the procedural assets that can’t be done each frame. Simply put, this means you can put in more content more easily.

    So what does this all boil down to? Cost saving, where cost really means *time*. What is the time you did not spend coming up with difficult and optimized ways to do what you want going to be spent on? Well, hopefully, making the game better. Not necessarily, of course, but at least there’s a chance.
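    The precomputation trade-off above can be sketched in a few lines. This is a hypothetical toy, not code from any real engine: the `height` function stands in for an arbitrary expensive procedural generator, and the two "strategies" show where the cost lands in each case — precomputing buys cheap per-frame lookups at the price of load time and RAM, while on-demand generation spends per-frame compute instead.

```python
import math

def height(x: int) -> float:
    """Stand-in for any expensive procedural generator (hypothetical;
    a real engine would use noise functions, erosion passes, etc.)."""
    return sum(math.sin(x * f) / f for f in (1.0, 2.0, 4.0, 8.0))

WORLD_SIZE = 10_000

# Strategy 1: precompute at load time.  Costs RAM and load time,
# but each frame is a cheap table lookup.
precomputed = [height(x) for x in range(WORLD_SIZE)]

def frame_precomputed(x: int) -> float:
    return precomputed[x]          # O(1) lookup per frame

# Strategy 2: generate on demand.  No table and no load-time cost,
# but the per-frame budget (i.e. hardware power) must absorb it.
def frame_on_demand(x: int) -> float:
    return height(x)               # recomputed every frame

# Both strategies produce identical content; the only difference
# is where the computation cost lands.
assert frame_precomputed(42) == frame_on_demand(42)
```

    With more hardware headroom a developer can simply pick strategy 2 and skip the precomputation machinery entirely, which is the "more == easier" point in miniature: the same content, less engineering.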

  130. Cristóbal Riego says:

    Best article I’ve read in a while. Came from Giantbomb and am a fan now.

    You explained that uneasy feeling I had during the conference but couldn’t put into words.


  131. I’ve seen the end coming for a number of years. It all started when I realized that digging out an old cart release from a mound of vintage gaming auctions promised more fun on average than anything on the recent sales charts.

    This coming generation is going to be for home consoles what 1983 was to Atari.
    Unfortunately, this time there is not going to be a Nintendo to swoop in, clean up the mess, fill the void, and merrily lead everyone along a much pleasanter path.

    This time the problem is not quality control; it is that the industry has grossly outgrown itself, and growing bigger is no more a solution than a cyanide capsule is a panacea for a disease eating away at the patient.

  132. Deidara says:

    “At a New York event, Sony argues that the game industry’s problems can only be solved with more of everything.”

    I don’t feel that was their message at all. Yes, they touted the specs, but that’s obviously going to happen in a console reveal. The focus on specs was mainly emphasized to show that they are listening to developer and consumer concerns, such as much more RAM and an x86 architecture. The amount of system RAM on the PS3, as well as its architecture, was why Skyrim on PS3 was such a fiasco. The architecture appeals to indie developers more because this is the type of architecture they already develop for.

    Also, why no mention of Knack or Media Molecule in this article if creativity is its concern? Knack looks unlike any game I’ve played before combined with a Pixar movie, and although I’m not sure how Media Molecule’s presentation would translate into a game, it showed that they are creatively experimenting. It also doesn’t help when you punctuate your point with an anecdote about David Cage, one of the most creative and artistic storytellers in gaming.

  133. ToddG says:

    Also, what about LA Noire’s tech?  You were basically just watching the actors act.

  134. As much as I love the theatre and early film, I have to admit that theatre-style acting looks pretty silly on the screen. A lot of golden-age Hollywood movies don’t hold up very well for that very reason.

    The same can be said of video games that try too hard to be “cinematic”. It’s frustrating when a video game changes the rules by taking away player control at a critical moment for the sake of drama.  

  135. Logoboros says:

    And I would posit that XCOM has that effect in part because the soldiers are essentially blank slates. The player projects whatever character the player wants onto them, as a kid would do with army men, who are similarly personality-less (well, except for the ones who are posed in the act of getting shot; I had a couple of those — and I used them — but it always seems kind of cruel to have a guy whose entire character is wrapped up in being perpetually and forever in agonizing pain, always being carted around by the stretcher bearers).

    Many game designers seem to want to create characters and personalities for all the figures in their game — as though having an undefined character is a flaw, an artistic failing — but to me it’s part of what seems like this larger project of outsourcing imagination. We, the designers, are going to take over more and more of the imagination process so that you, the pitiful gamer whose mind has been numbed by years of film and television, won’t have to work so hard to invest in the world of our game.

    This is also one reason why I’m anxious about the trend (I have no numbers, but anecdotally it seems real enough to me) for play with action figures to be replaced by playing video games where more and more of the imaginative work of world-building has been done for you by experts. There is, of course, a lot of custom-content building tools with modern games that acts as a kind of counterbalance, but I’m still nervous about a general decline in the amount of freeform pretending that kids do to entertain themselves these days. So, that proves I’m a codger.

  136. Nacho_Matrimony says:

    Devs and game writers need to take a few lessons from Rockstar, Valve, and thatgamecompany.

  137. Geo X says:

    First: I love Mario games.  Secondly: I’m not even slightly emotionally attached to Mario as a character.  Really, now.

  138. valondar says:

    Really I think the biggest thing holding consoles back is they’re not an open system. Any game published on any console has to work through Sony or Microsoft or whoever to get released, and I’ve heard XBLA is not a cheap service to get featured on.

    And while it’s harder to get onto Steam than it is to get published, Steam’s relative openness and greenlight system and so on might work well for the Steambox.

  139. For years, the standard line against PCs was that flexibility was their greatest strength and weakness. Those of us who grew up in the ’90s can all recall the ridiculous hoops we’ve gone through to make a particular game work. That even applied to current games on “top of the line” PCs.

    This problem seems much less of a reality today. Part of that is due to the internet’s problem-solving utilities (be it forums or official patches). Another reason is, as you’ve highlighted, today’s PC games rarely over-tax the PC.

  140. Effigy_Power says:

    Well put, sister.
    Claiming that better processors allow for better games in any department other than graphics is the same as claiming that HD makes better movies in any department but visuals.

    On lighter notes: Anyone who writes “It was like watching the Flat Earth Society unveil the year’s hottest new globes.” can only be applauded. :P

  141. Professor_Cuntburglar says:

     I would just like to applaud everyone for discussing PC gaming and console gaming without devolving into a bunch of “CONSOLE GAMERS ARE TEH RETARED AND I HATE THEM FOR PREFERING A DIFFERENT PRODUCT THAN ME” like everywhere else on the internet.

  142. valondar says:

    Rage: Doomquake.
    Bloodkill: Dudebro Ops II-2: The Disemboweling.
    Murder’s Code 4.5: Guys In Palette Swapped Tights.
    Serious Face: Scary Monster Edition.
    Unreal Crying: The Uncanny Valley.

  143. aklab says:

    Damn, I LOLed at MurderDeath: Killplace Vixens and now people know I’m not working. :(

  144. ToddG says:

    Lazy Guy: [ed: please add subtitle]

  145. WaxTom says:

    Shepard’s Adventure: Crisis in the Pasture
    Bus Driver Madness: Fare Weather Friends
    Cola Wars: Diet Another Day
    Man Quest: Punching A Bear
    Modern Shooty Guy: Moral Ambiguity
    Tom Clancy’s Military Fetish: Reagan Edition

  146. Citric says:

    Gun Quest: The Quest for More Guns
    Bees: The Quest for Pollen
    Browns: The Gritty Realism Game
    Cleveland Browns: The Gridiron Realism Game
    A Time To Kill: Matthew McConaughey’s Bare Chest Edition
    In Spite of Rage: Still Just a Rat in a Cage
    Gratuitous: Sex and Violence Edition
    Gratuitous II: Sax and Violins Edition
    Zombies: They’re still a thing, right?
    Non-Threatening Vampires: Rainbow Sparkle

  147. duwease says:

    Trauma Center: Colons of Destiny

  148. stakkalee says:

    The GunShootist 2: Have Gun, Will Shoot

  149. Merve says:

    “Those of us who grew up in the 90s can all recall the ridiculous hoops we’ve gone through to make a particular game work. That even applied to current games on ‘top of the line’ PCs.”
    Tell me about it. My rig just won’t run Max Payne 1 or 2. I might have to roll back my drivers to get them working. Fuck you, AMD.

  150. valondar says:

    Ah, the nineties. I just got flashbacks of those goddamn Pentium processors and how each new release was more revolutionary and more an essential purchase than the last.

  151. GaryX says:

    Just to play Devil’s advocate, I kind of think the processors : better games :: HD : better movies analogy is a false equivalency, even though I agree with the sentiment.

    Though, I’d also argue that better visuals in a film have a much, much greater impact than graphics do in a game. I can generally sit through a lesser film with incredible visuals, but there’s no way I’m going to play an incredible-looking but shitty game.

  152. ApesMa says:

    @GaryX:disqus It’s not entirely accurate, but we have been getting closer and closer to that with every generation since the switch to 3D. Better processors continue to have less and less impact on actual gameplay.

    Agreed about your second point though. Having to actively engage with something that doesn’t work like it should is much more frustrating than watching a bad movie.

  153. ApesMa says:

    It should be noted that the NES saved the industry by doing the opposite of that. A lot has changed since then, but an open system could still be highly problematic.

  154. Bad Horse says:

    @ApesMa:disqus What caused the console crash in 1983 was a lack of innovation. After getting Pac-Man and Space Invaders sold back to them for the umpteenth time, the day’s consumers decided that was about all video gaming had to offer, and put that shit on the shelf. Now, due to the economics of closed console systems, we might be getting back to that point. Sony showed more FPSes yesterday. Whoopty fucking doo. I love FPSes and I’m still sick of ’em. I’m not going to buy a whole new box to let me play prettier ones. The alternative is to open up the platform, and see what indie developers can do with more access to all that shiny RAM. No more expensive SDKs, no more $100K license fees just to post it on PSN.

  155. ApesMa says:

    If that’s true, then pretty much everybody who has written about the crash of the video game industry and its resurrection by the NES is wrong.

    Nintendo’s decision to prevent the console’s library from being flooded with crap via the official license system is usually described as the key factor in their success, along with Miyamoto’s games.

    That said, I also find the prospect of a more open platform exciting, and I hope it does happen and will work. As the major developers keep playing it safer, it’s very important that others can innovate on smaller budgets. I also agree 100% with everything else you said.

  156. Bad Horse says:

    @ApesMa:disqus That was true of Nintendo in 1985 but it’s the opposite problem from today, if you ask me. 

  157. ApesMa says:

    You do have a point; things are very different today. Maybe being flooded with smaller downloadable games would be a good thing: thanks to the internet, it’s easier to find the good stuff and ignore the rest now.