Somehow every time I'm tipsy late at night on Discord I end up talking about the same stuff anyway, so might as well let you all hear it too.
Seventh gen is, in my humble and very biased opinion, the best of them all, with 6th in close second; in my far less biased opinion it's the definitive one, if for no reason other than being the transitional stage between local and online gaming, not to mention the sheer length of time it spanned. Not like it matters much for what I'll be touching on, but I need some excuse for knowing as much as I do about it. Also, I'm gonna focus on CPUs, because that's the interesting part.
Anyway, our story starts in 6th gen when the Xbox came out. Being based on PC components (and based in general) it caused some minor controversy, but the GameCube was also built on computer technology: PowerPC, used at the time primarily by Macs, including the ones that ended up saving the company. Sony was about the only one making their own processors, and looking at the competition made it clear that wasn't the way to go, so for their next console they decided to contact IBM for some of the same Power tech Nintendo was using. It's now 2005 and things are going wild; PowerPC is in a VERY weird spot. Motorola (who were working with IBM on the tech and providing chips to Apple, for reasons that are clear but irrelevant here) couldn't keep up, and suffice to say that during the G4 Mac era, CPU cooling in PowerMacs (the big, powerful ones) went from something the size of two matchboxes to something taking up about a liter of space, not to mention a fan that was actually pretty loud in early units. Going into the G5, Apple ended up having to put water cooling in at least some of their towers, and they never managed to squeeze a G5 into a laptop, so they had to switch to Intel.
And then at the end of that year the Xbox 360 came out using that exact technology. You see, what happened was Microsoft caught wind of Sony's plans and worked out a deal with IBM. Their design was way different though, but I'll get to that, don't you worry. Oh, also, Nintendo pulled a GameCube with their tech and just slapped motion controls onto an existing CPU, which ended up being a blessing in disguise, because unlike the competition it avoided thermal issues: those being the infamous Red Ring of Death plaguing all but the last model of the big 360s, and the Yellow Light of Death for PS3s, which is far less notorious because the breadbin PS3 took longer to develop problems.
The Wii was also the only one of the three that didn't go for a fully custom CPU, and what they got is only worth getting over with: a single core running at 729MHz. In addition, the console had very little memory to work with, 64MB of GDDR3 plus 24MB of faster 1T-SRAM. I'm only listing it here for the sake of perspective; it's truthfully less than unremarkable, and the only advantage it had over the competition that's worth talking about is that it supported out-of-order (instruction) execution, which means it wouldn't hang on a lengthy task if it could be doing other stuff in the meantime.
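To give a feel for what out-of-order execution actually buys you, here's a toy dataflow model in Python. To be clear, this is a made-up illustration, not how the Wii's CPU actually schedules anything; it ignores real constraints like issue width and just shows that independent work can hide behind a slow instruction.

```python
# Toy model: each "instruction" is (name, cycles, depends_on).
# An in-order core must finish a slow instruction before touching
# anything after it; an out-of-order core can run independent work
# in the meantime. (Ignores issue width and other real limits.)

instrs = [
    ("load_A", 10, None),      # slow memory load
    ("use_A",  1,  "load_A"),  # must wait for the load
    ("add_B",  1,  None),      # independent work
    ("add_C",  1,  None),      # independent work
]

def in_order_cycles(instrs):
    t = 0
    for _, cost, _ in instrs:
        t += cost              # every instruction waits its turn
    return t

def out_of_order_cycles(instrs):
    done = {}                  # name -> finish time
    for name, cost, dep in instrs:
        start = done.get(dep, 0)  # only wait on real dependencies
        done[name] = start + cost
    return max(done.values())

print(in_order_cycles(instrs))      # 13
print(out_of_order_cycles(instrs))  # 11
```

The in-order core eats the full 10-cycle load before doing the two independent adds; the out-of-order one does them while the load is in flight.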
The PS3 on the other hand was wild, in all facets of its being, at least for the first few years of its life. Google "PS3 baby ad" if you don't know what I mean. And the CPU they requested from IBM was equally wild.
The Cell processor is a case study in efficient but insanely confusing and poorly implemented design, but I won't get into that too much; I'll just say that Gabe Newell hated the console. The CPU consists of a single dual-threaded core running at 3.2GHz called the PPE, supported by 7 smaller co-processors running at the same speed called SPEs, 6 of which were accessible to devs and one dedicated to the system. The reason this is less than ideal is that using the SPEs was confusing at best and wholly impractical at worst, which resulted in most developers, certainly those who didn't focus on just the PS3, ditching them and just using the PPE. And that's on top of the PPE being an in-order design, without the out-of-order tricks I mentioned for the Wii. The memory configuration of the PS3 was a 50/50 split of 256MB each to the CPU and GPU, with the difference being that the Cell got XDR DRAM and the Reality Synthesizer (its GPU) got GDDR3.
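To show why the SPEs were such a pain, here's a toy Python analogy of the programming model; this is not real Cell code, the names and the "kernel" are made up. The gist: the PPE sees main memory directly, but each SPE only sees its own tiny 256KB local store, so the programmer has to explicitly stage data in, crunch it, and stage results back out (on real hardware this was done with DMA transfers).

```python
# Toy analogy of the Cell programming model (not real SPE code).
LOCAL_STORE_SIZE = 256 * 1024  # each real SPE had 256KB of local store

class SPE:
    def __init__(self):
        self.local_store = bytearray(LOCAL_STORE_SIZE)

    def dma_in(self, main_memory, offset, length):
        # pull a chunk of main memory into the local store
        assert length <= LOCAL_STORE_SIZE, "chunk must fit in local store"
        self.local_store[:length] = main_memory[offset:offset + length]

    def run_kernel(self, length):
        # the "job": double every byte, wrapping at 256
        for i in range(length):
            self.local_store[i] = (self.local_store[i] * 2) % 256

    def dma_out(self, main_memory, offset, length):
        # push the results back out to main memory
        main_memory[offset:offset + length] = self.local_store[:length]

def process_on_spes(main_memory, spes):
    # PPE-side loop: split the buffer across the 6 usable SPEs
    chunk = len(main_memory) // len(spes)
    for n, spe in enumerate(spes):
        offset = n * chunk
        spe.dma_in(main_memory, offset, chunk)
        spe.run_kernel(chunk)
        spe.dma_out(main_memory, offset, chunk)

main_memory = bytearray(range(60))  # pretend this is the XDR main RAM
spes = [SPE() for _ in range(6)]    # the 6 SPEs available to games
process_on_spes(main_memory, spes)
print(main_memory[:5])              # bytearray(b'\x00\x02\x04\x06\x08')
```

Every single workload had to be carved into chunks that fit in 256KB and shuffled in and out by hand, which is exactly the kind of bookkeeping a multi-platform dev on a deadline would rather skip.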
So when Microsoft caught wind of this they (probably) laughed and told IBM they'd need a do-over for the Xbox: what would eventually become Xenon, a triple core running 6 threads at 3.2GHz, basically 3 PPEs modified and stitched together. No OoOE though; they figured game devs would have their shit together enough to optimize themselves. It shared its 512MB of GDDR3 memory with Xenos, the GPU of the system, though Xenos also had an extra 10MB of eDRAM for itself, dedicated to anti-aliasing, namely 4xMSAA.
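That 10MB was tighter than it sounds. Some napkin math (assuming a 32-bit color buffer plus 32-bit depth/stencil; real render targets could differ) shows a plain 720p framebuffer fits, but 4xMSAA at 720p blows way past it, which is why 360 devs had to render in tiles or drop the MSAA:

```python
# Why 10MB of eDRAM was a squeeze for 4xMSAA at 720p.
width, height = 1280, 720
bytes_per_pixel = 4 + 4   # 32-bit color + 32-bit depth/stencil (assumed)
samples = 4               # 4xMSAA keeps 4 samples per pixel

plain = width * height * bytes_per_pixel  # framebuffer, no MSAA
msaa = plain * samples                    # framebuffer with 4xMSAA
edram = 10 * 2**20                        # the 10MB of eDRAM

print(plain <= edram)  # True  -> a plain 720p buffer fits
print(msaa <= edram)   # False -> 4xMSAA at 720p does not, hence tiling
```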
It doesn't take a genius to realize it makes more sense for multi-platform games to focus on the PPE-style cores, and since the Xbox had more of those, you can probably guess what ended up happening, at least until RRoD took over, caused a change in management, and ruined everything for the 360 while Sony was digging itself out, but that's its own story.
The choice of those particular technologies means two things, and I'll start with the PS3 and computing. For a good while it was perfectly viable for researchers to set up clusters of PS3s due to their sheer power and relatively minimal cost, but also, much like the PS2 before it, the PS3 came with Linux capability, which makes it, to this day and to my knowledge, the last consumer-grade PowerPC... PC. That's because the PowerMac G5s were discontinued in August 2006, while the PS3 released in November the same year. Which is why, when Sony shut it down in firmware update 3.21 in 2010, people were outraged, and not a lot of people seem to realize why.
The other thing is that both of those machines were POWERFUL. You don't last some 10 years on the market without the juice to back it up, and just to once again put things into perspective: in 2013, when they were being superseded, the typical gaming PC had a quad-core, quad-threaded CPU running at 3.6GHz. If that sounds like about the same or less power than what they had, then that's technically what it is; obviously it's not so simple in practice, but then again, neither is comparing specialized hardware to general hardware. And it's worth noting that mainstream gaming CPUs mostly stuck to four cores until the first Ryzen chips shook things up in 2017. The issue then? In 2013 an entry-level gaming PC had 4GB of RAM for its CPU alone; that's 8 times what the Xbox could do at best and 16 times more than the PS3. Here's the thing about consoles though: they push nigh all they can to the game, the system takes a minimal amount, and there is literally nothing else running in the background.

It may be pure speculation on my part, but honestly? If both of those systems had like 2GB between their processors, I don't think we'd have gotten the 8th generation we got. The Xbox 360 could already handle 1080i, so it can render the appropriate screen sizes, and I legitimately believe the extra RAM would be enough to push 1080p, and I don't see why the PS3 would fall behind either. Also, it's not like we got 4K with 8th gen; hell, 1440p wasn't really there at launch either. Obviously that would drive up cost and all that, but it almost feels like the mileage they'd get out of it could make up for early losses, especially since the Wii already took the early game and sold more overall than the other two combined, which is why we got the Kinect and the Move. It's insane how beefy these two actually were.
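And since I'm throwing ratios around, here's the napkin math behind them (the 4GB figure is the entry-level 2013 PC from above):

```python
# Sanity-checking the RAM comparison (all figures in MB).
pc_2013 = 4 * 1024   # entry-level 2013 gaming PC, CPU RAM alone
xbox360 = 512        # unified pool, and the GPU eats into it too
ps3_cpu = 256        # the Cell's XDR half of the 50/50 split

print(pc_2013 // xbox360)  # 8  -> "8 times what the Xbox could do"
print(pc_2013 // ps3_cpu)  # 16 -> "16 times more than the PS3"
```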