I will offer a solution during this Rant. And obviously, solutions require a problem. So before I discuss the solution, let’s discuss the problem.
We’re in a worse state today than we were in 2012 – the twilight of last gen consoles.
Back then, the consoles were ancient tech, and devs were struggling to get ambitious experiences running acceptably due to very real hardware limitations. Obviously, some developers were better at this than others (see The Last Of Us and GTA V), but the point stands. Meanwhile, PCs were many, many times more powerful than consoles, but no cross-platform experience could be pushed to its limits because developers had to cater to the lowest common denominator – the consoles.
This had a very real negative effect on our industry. PC gamers suffered because the hardware they spent their hard-earned money on was not being taken advantage of, which, as a PC gamer, I find genuinely insulting. This objectively superior PC hardware was sitting there unused because developers (a) had to cater to the consoles’ lack of power, and (b) simply didn’t care about PC and insisted on treating PC gamers like second-class citizens.
This situation has not changed today; in fact, I would argue it is even worse than it was three years ago. We currently have underpowered consoles that simply cannot meet today’s standards, and developers are having to compromise simply to get their games and engines running on these weak plastic boxes. These consoles cannot consistently hit 1080p (an ancient standard) in many cross-platform “AAA” titles. That’s pathetic.
Additionally, PC gamers are once again paying the price, thanks to graphical downgrades implemented simply to appease consoles and console gamers. Watch Dogs’ 2012 E3 demo was running on PC, and the 2014 PC release could have looked that pretty (or even better). But because Ubisoft is Ubisoft, they deliberately sabotaged the PC version simply because console hardware couldn’t keep up.
These “AAA” publishers don’t want to hurt the feelings of console fanboys and instead, pretend that these consoles are more powerful than they really are. This spins a dangerous narrative which, while factually incorrect, will be perceived as truth by the masses. After all, if a thing is repeated enough times, people will see it as truth.
Of course, the argument can be made that developers don’t have to cut out high-end graphics features. At the end of the day, they aren’t being forced to cut these features out. They can simply disable them for the console release and then enable them for the PC version. But again, this argument misses the core issue.
The core issue is that these developers shouldn’t have even been put in this situation in the first place. And just what is this situation? These consoles are highly underpowered pieces of plastic that are stagnating the industry even worse than their predecessors. What’s even more incredible is that the gap between consoles and PCs today is absolutely enormous, further exacerbating the problem. Once again, PC gamers are being neglected by the “AAA” industry.
If I were a developer, all the horsepower on the PC would be a dream for me. I would be frustrated as hell that I couldn’t push the tech to its limits simply because Microsoft and Sony built what amount to underpowered, low-end laptops.
What I am saying is that consoles are ruining the industry. However, as I mentioned at the top of this piece, today’s Rant is more than simply justified anger at consoles. Today, I want to provide a solution.
I have been asked a few times over the past few months (really, people have been asking me this since the current gen consoles launched): if I were to build a “next gen” console, what would it be? In other words, if I were Microsoft or Sony, what would my PS4 or Xbox One look like?
I’ve actually given this considerable thought, as this question is deceptively complex. It’s one thing to just say, “Well, just add a Titan X and an i7 with 16 GB of RAM and boom! There we have Shank’s PS4 and Xbox One.”
While that does sound intoxicatingly tempting, it’s not realistic. Nor does it even make sense, considering the Titan X didn’t exist in 2013 – it wasn’t released until 2015.
What we need to do is go back to 2012. Hang on. 2012? I thought the consoles released in 2013? Well, you’re right. But the hardware likely wasn’t finalized until late 2012 or early 2013, meaning the tech Sony and Microsoft were looking at was from 2012.
Now, for obvious reasons, we can’t do a simple 1-to-1 comparison of PS4/Xbox One hardware to PC counterparts. It just wouldn’t make sense. What we can do, however, is approximate the hardware.
So let’s have at it. Realistically, the PS4 GPU is roughly equivalent to a GTX 660. The Xbox One GPU is roughly a GTX 650 Ti. The CPUs in each are 8-core, low-power AMD Jaguar parts – mobile-class chips. Honestly, they’re pathetic.
On the memory side, Sony wisely adopted 8 GB GDDR5 RAM – the same type of memory in discrete PC graphics cards. Microsoft, meanwhile, idiotically chose a more exotic architecture featuring 32 MB eSRAM coupled with 8 GB DDR3.
As a brief aside, eSRAM is “static” RAM, meaning the memory pool does not need to be periodically refreshed. DDR3 RAM is the same RAM commonly used as PC system RAM. Rather than delve into incredibly technical discussions, just know that Microsoft’s approach to memory was asinine. They should have just copied Sony here.
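To put rough numbers on why Sony’s memory choice was the smarter one, here is a quick peak-bandwidth sketch. The bus widths and transfer rates below are the commonly reported launch specs, not figures from the article, so treat them as assumptions:

```python
# Peak memory bandwidth = (bus width in bytes) * (transfers per second).
# Assumed, commonly reported specs:
#   PS4:      8 GB GDDR5, 256-bit bus, 5.5 GT/s effective
#   Xbox One: 8 GB DDR3,  256-bit bus, 2.133 GT/s effective
#   (Xbox One supplements this with 32 MB eSRAM at a reported ~109 GB/s peak.)

def peak_bandwidth_gbs(bus_width_bits: int, transfer_rate_gts: float) -> float:
    """Peak bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_width_bits / 8) * transfer_rate_gts

ps4_gddr5 = peak_bandwidth_gbs(256, 5.5)    # 176.0 GB/s
xb1_ddr3 = peak_bandwidth_gbs(256, 2.133)   # ~68.3 GB/s

print(f"PS4 GDDR5:     {ps4_gddr5:.1f} GB/s")
print(f"Xbox One DDR3: {xb1_ddr3:.1f} GB/s")
```

The takeaway: the bulk of Xbox One’s memory runs at well under half the PS4’s bandwidth, and the fast eSRAM pool is only 32 MB – too small to hold a full 1080p set of render targets for many engines, which is part of why sub-1080p resolutions became so common on that console.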
Ok, so we have the hardware specs laid out. What immediately jumps out is just how underpowered these consoles were in 2013, never mind today. They were comparable to low-to-mid-range PCs in 2013. Today, they’re pitifully worse.
The GTX 660 and 650 Ti were mid-range GPUs in 2012. They weren’t powerhouses. They weren’t paltry either, but by no means would they provide you with top tier graphics. Just earlier that year, Epic showed off their Elemental demo of Unreal Engine 4 running on a single GTX 680, the top single-GPU card of the day. It was spectacular.
It was also something these consoles would simply be incapable of handling because of their gross lack of horsepower.
Let’s also look at the landscape. 1080p had long since become the standard for HD. PCs had been able to do this for eons. Additionally, 4K TVs, while new, certainly existed. In essence, HD had long been established as the baseline. Gaming beyond HD was a reality, and the PC gaming industry was quickly moving past 1080p.
Let’s now look at the 360 and PS3. When they released, HDTVs were present, but certainly not the norm. Gaming in 1080p on PC was definitely possible. But in the “mainstream” (that is, the living room), HD gaming wasn’t really a thing. But the 360 and PS3 ushered in the era of HD console gaming, forcing adoption of newer living room technology, and pushing the industry as a whole to adopt HD as a standard.
What a goddamn shame that the PS4 and Xbox One fail to meet the standards that their predecessors helped create.
With that in mind, what kind of hardware would I want in my PS4 and Xbox One? How about something that not only easily allowed for 1080p60 gaming, but also allowed for greater headroom for higher resolution TVs?
Now, keep in mind, the GTX 780 Ti released the same month as these consoles. That graphics card alone is more than twice as powerful as either console. And that’s a card on which I could play Call Of Duty: Advanced Warfare at 4K at 90fps. Yet these consoles struggled – struggled – to maintain a locked 60fps! Pathetic.
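The “more than twice as powerful” claim holds up under simple back-of-the-envelope math. Single-precision throughput is shader cores × 2 FLOPs per clock (one fused multiply-add) × clock speed. The core counts and clocks below are the widely reported launch specs, so treat them as assumptions:

```python
# Rough single-precision throughput comparison (GFLOPS).
# Formula: cores * 2 FLOPs per core per clock (FMA) * clock in GHz.
# Assumed, widely reported launch specs:
#   GTX 780 Ti: 2880 cores @ ~875 MHz base
#   PS4 GPU:    1152 cores @ 800 MHz
#   Xbox One:    768 cores @ 853 MHz

def gflops(cores: int, clock_ghz: float) -> float:
    """Theoretical peak single-precision GFLOPS."""
    return cores * 2 * clock_ghz

gtx_780_ti = gflops(2880, 0.875)  # ~5040 GFLOPS
ps4_gpu = gflops(1152, 0.800)     # ~1843 GFLOPS
xb1_gpu = gflops(768, 0.853)      # ~1310 GFLOPS

print(f"780 Ti vs PS4:      {gtx_780_ti / ps4_gpu:.1f}x")
print(f"780 Ti vs Xbox One: {gtx_780_ti / xb1_gpu:.1f}x")
```

By this measure the 780 Ti lands at roughly 2.7x the PS4 and nearly 4x the Xbox One – and that’s before accounting for the consoles’ weak Jaguar CPUs. Peak FLOPS is a crude yardstick, but the gap is too large to argue away.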
So for my GPU of choice, I would have included something akin to the power of a GTX 690 – basically two GTX 680s. Would this be expensive? Yes. But these are consoles. The console business is in it for the long haul. Therefore, I would be perfectly fine with accepting a loss on hardware for the first couple of years while I raked in revenue from software and services. Then, in Years 3 and 4, I would expect to see profits on hardware as manufacturing costs fell and the supply chain was optimized.
On the CPU side, I would have included a quad-core clocked at 3.0 GHz. This is a pretty standard clock speed for gaming CPUs and again, this allows for a lot of headroom as technology evolves in the living room.
What about memory? Well, why fix something when it ain’t broke? Of all the decisions Sony made, this was perhaps one of the best. 8 GB of GDDR5 RAM will do just fine for my console, thanks.
Oh, and price. Forgot about that. I believe Sony’s $400 price looked that much more enticing not just because it was cheaper than Microsoft’s $500 console, but because the more powerful of the two consoles was a whole $100 less than its competition. More powerful hardware plus a cheaper price equals better value.
However, I also believe that had Microsoft’s console been the more powerful of the two, Microsoft would have seen larger sales of its console. I really do believe that early adopters (read: the consumers who are among the most “hardcore” of your user base who absolutely value graphics and horsepower) would have shelled out $100 more for the more powerful device.
Therefore, I would price my console at $500.
So in the end, what I have built is a console that is more capable than the average PC (enticing PC gamers with mid-range rigs to consider purchasing a console), meets and exceeds the standard set by the 360 and PS3, and has enough horsepower to support future display technologies and higher-resolution gaming, all while giving developers plenty of power to push their games to the next level.
This would have been a true “next gen” console. How much further ahead would we be today? How much prettier our game worlds, how much more immersive our stories, how much more innovative our game mechanics, had Microsoft and Sony pioneered a true path to the future, a true next generation?
Instead, we are left with these pathetic underpowered pieces of plastic, barely capable of meeting the standards of yesterday. What a damn shame, then, that these consoles are allowed to exist.