A few days ago, I was digging through all my old retro consoles.
Pulling out those sun-stained plastic boxes reminded me of the great gaming that graced my 80s childhood.
While turning a SNES end over end in my hands, the corner of my eye caught my PS5 standing proud, enveloped in its white cloak. Then a nostalgia-fueled thought climbed into my head: The SNES was a 16-bit console. The NES was an 8-bit console. How many bits is the PS5? 256 bits? 1,024 bits?
My mind was completely overtaken by these questions. So I did the only reasonable thing a gamer looking for answers can do: I pulled out my phone and started Googling.
This is what I found:
How many bits is the PS5? The PS5’s CPU, like most modern PC and smartphone CPUs, is 64-bit. The reason Sony doesn’t use the bit rating in its marketing, as it did with the PS1, is that both the PS3 and the PS4 also had 64-bit CPUs. The PS5’s 64-bit CPU simply isn’t a good differentiator when marketing the console.
Ok, now you know the PS5’s bit rating. But there’s so much more to learn about this very interesting topic. In this article I’ll cover:
- How many bits the PS5 has, in more detail
- What a bit rating even means
- Whether a console’s bit rating matters
- Better ways to measure a console’s power
- Why bit ratings aren’t used in marketing any more
Ok, let’s get started…
How many bits is the PS5’s CPU?
The PS5’s processor is 64-bit.
That’s the same as virtually every modern CPU in PCs, consoles, and smartphones.
Strangely enough, one of the main reasons Sony and other console manufacturers stopped using bits as a marketing tool is that the PS3, the PS4, and every other console since the Xbox 360 era has had a 64-bit CPU.
As I’ll explore later in this article, the bit rating of a CPU simply describes the largest chunk of data, measured in bits, that the CPU can process in a single operation.
And it tells you next to nothing about a modern console’s potential power.
What do we mean when we say “bits”?
The bit rating of a CPU determines the maximum length, in bits (1s and 0s), of a single chunk of data that the CPU can process at once.
The NES’s CPU, which is an 8-bit CPU, can handle data that is 8 binary bits in length, for example: 10110101
This limited the complexity of the data that 8-bit hardware could handle. For example, in the original Pac-Man, you couldn’t go past level 256 because the level counter was stored in a single 8-bit value, which can only count from 0 to 255.
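To make the level-counter idea concrete, here’s a minimal Python sketch of how an 8-bit value wraps around. The `add_8bit` helper is my own, purely for illustration:

```python
# An 8-bit value can only hold 0-255; anything bigger wraps around.
def add_8bit(value, amount):
    """Add two numbers, keeping only the low 8 bits (like an 8-bit register)."""
    return (value + amount) & 0xFF  # mask the result down to 8 bits

print(add_8bit(255, 1))    # 256 doesn't fit, wraps around to 0
print(add_8bit(200, 100))  # 300 doesn't fit either, wraps to 44
```

That wraparound on a one-byte counter is exactly the kind of limit 8-bit hardware runs into.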
Conversely, the 32-bit PS1 massively increased the CPU’s ability to handle larger chunks of data.
A string of data on the PS1 could be 32 0s and 1s long, for example: 10110100110100100010110110011100
Notice how much more complex that string of 1s and 0s is compared to the NES’s 8-bit strings. Each chunk of data can carry far more information.
The PS5’s CPU is capable of reading 64 bits’ worth of data in one go. Here’s an example: 1011010011010010001011011001110010110100110100100010110110011100
Again, this is massively more complex.
At the moment, there is very little need for a higher bit rating, because the number of possible values a 64-bit string can represent is astronomically higher than for a 32-bit one.
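You can see just how fast those counts grow with a quick Python loop: each extra bit doubles the number of values a chunk of that width can hold.

```python
# Number of distinct values a chunk of each width can represent: 2**bits
for bits in (8, 16, 32, 64):
    print(f"{bits}-bit: {2**bits:,} possible values")
# 8-bit:  256
# 32-bit: 4,294,967,296
# 64-bit: 18,446,744,073,709,551,616
```

Going from 32 to 64 bits doesn’t double the range, it squares it, which is why 64 bits is plenty for the foreseeable future.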
Does a console’s bits matter anymore?
No. A console CPU’s bit rating doesn’t matter anymore. CPUs have become so much more advanced than those of the 8-, 16-, and 32-bit eras that thinking in bits is completely redundant.
If that’s the case, how else can you tell how powerful a console like the PS5 is?
Well, there are a number of ways to determine how powerful a console is that are far more effective than looking at the CPU’s bit count.
Let’s look at a few:
The GPU

The GPU, or graphics processor, inside a modern console is the biggest limiting factor when determining a console’s “performance”. And by performance, I mean how pretty the graphics it produces are.
So, how can you tell how powerful the GPU inside a console like the PS5 is?
There are a number of ways:
Teraflops

Generally, a GPU with a higher teraflop count will be more powerful than a GPU with a lower teraflop count.
For example, the PS5 GPU, which produces roughly 10 teraflops of computing power, is less powerful than the Xbox Series X’s GPU, which produces roughly 12 teraflops of computing power.
However, this measurement is less reliable when comparing different generations of console.
For example, the roughly 4-teraflop PS4 Pro is not as powerful as the roughly 4-teraflop Xbox Series S, because the Series S uses a much newer GPU architecture.
GPU core count

Generally, the more cores a GPU has, the more powerful it is. So a GPU with 3,000 cores would be, all else being equal, more powerful than a GPU with 2,000 cores.
GPU clock speed

The GPU’s clock speed, or frequency, plays a big role in how fast it is. Generally, faster is better: all else being equal, a GPU clocked at 2GHz will be more powerful than one clocked at 1.5GHz.
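If you’re wondering where the teraflop numbers come from, they can be roughly sketched as cores × clock × operations-per-clock. Here’s a back-of-the-envelope Python check using the figures widely reported for the PS5’s GPU (2,304 shader cores at up to 2.23GHz, with 2 floating-point operations per core per clock); the `teraflops` helper is my own, purely for illustration:

```python
def teraflops(shader_cores, clock_ghz, flops_per_core_per_clock=2):
    """Rough peak FP32 throughput: cores x clock x FLOPs-per-clock, in TFLOPs."""
    # clock in GHz means billions of cycles per second, so dividing by
    # 1,000 converts billions of FLOPs into trillions (teraflops).
    return shader_cores * clock_ghz * flops_per_core_per_clock / 1000

print(f"{teraflops(2304, 2.23):.2f} TFLOPs")  # ~10.28, close to the ~10 quoted above
```

This is only a peak theoretical figure. As the PS4 Pro vs Series S comparison shows, real-world performance also depends heavily on the GPU’s architecture.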
The CPU

The CPU is incredibly important for modern games: more powerful CPUs mean more complex game worlds and more advanced animations.
Thankfully, there are now far better ways to judge a CPU’s power than the archaic “bit” method.
CPU core count and generation

Generally, the more physical cores a CPU has, the more powerful it is.
For example, a CPU with 8 cores will usually outperform a CPU with 4 cores. However, this is not always the case, and it’s important to take the generation of the CPU into account.
For example, the PS4’s AMD Jaguar CPU has exactly the same number of cores (eight) as the PS5’s Zen 2 CPU. Yet the PS5’s CPU is roughly 5-6 times more powerful.
CPU clock speed

A really simple way to compare CPUs is to look at their clock speeds.
If core counts and CPU generation are equal, the CPU with the faster clock speed will be the more powerful one.
RAM

RAM is a vital part of any console, as it stores the data that is constantly needed by the CPU and GPU. The more of it you have, the more complex the assets and data you can hold ready for the CPU and GPU to use. And the faster the RAM is, the quicker that data can be fed to them.
Let’s look at both.
RAM amount

Generally, the more RAM you have the better, so 32GB of RAM beats 8GB almost regardless of speed.
RAM speed

RAM speed, or bandwidth, measured in GB/s, is one of the most important factors for high-performance RAM.
Faster speeds are generally better. So a console with 400GB/s RAM is better than a console with 200GB/s RAM.
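To put those bandwidth numbers in perspective, here’s a back-of-the-envelope Python sketch of how long it takes to stream an asset at each speed (the 1GB texture is a made-up example of my own):

```python
def load_time_ms(asset_gb, bandwidth_gbps):
    """Time to stream an asset from RAM at a given bandwidth, in milliseconds."""
    return asset_gb / bandwidth_gbps * 1000

# A hypothetical 1GB texture at the two bandwidths from the example above
print(load_time_ms(1, 400))  # 2.5 ms
print(load_time_ms(1, 200))  # 5.0 ms
```

Halving the wait for every asset the GPU needs adds up fast when a game is streaming data every frame.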
The bit rating of a console used to be a marketing tool
In the past, before computers and an understanding of how they work went mainstream, console manufacturers needed a single number to help differentiate one generation from the next.
After all, it was mainly parents of game-playing children who were buying these consoles. How would they know the difference between an NES and a SNES?
Marketing teams ingeniously chose the bit rating as a convenient way of showing the difference between the old and new generations. Even to the most console-illiterate parent, 16 bits has got to be better than 8 bits. And of course, it was.
However, with the onset of the early 2000s, and consoles adopting more PC-like technology, the bit rating fell out of use. And it made sense to drop it: the original Xbox’s CPU was 32-bit, just like the original PS1’s, yet the Xbox’s CPU was massively more powerful. The bit rating of the CPU was no longer the limiting factor for performance.
It didn’t make sense to market a new console by plastering a largely irrelevant bit rating all over the marketing materials when it was exactly the same as the last generation’s.
So, the use of the bit rating as a measurement of performance died.
And once again we’re at the end of another CareerGamers article. But before you go, let’s take a look at a quick summary of the entire article:
- The PS5 has a 64-bit CPU, just like most modern smartphone and PC CPUs
- The bit rating of a CPU is the maximum length, in bits, of the chunk of data the CPU can process at once
- Bit ratings were dropped from marketing because the PS3 and PS4 both had 64-bit CPUs, so selling a console on the basis of having a 64-bit CPU was no longer a big enough differentiator
- There are better ways to tell how powerful a console is, including:
  - The GPU’s teraflop count, core count, and frequency
  - The CPU’s core count, frequency, and generation
  - The RAM’s amount and speed