RTX 4080 Spec Update and Power Usage

I Hope You Invested in a Large, Quality PSU. And A/C. You'll Need It.

Well, we are inching toward the release of the RTX 4000 series of cards, and the outlook so far is positive.

A great person I follow for details on upcoming green team cards (and anyone who reports on this stuff follows him, since his leaks are always right) is kopite7kimi. He has revised his previous 4080 leak from 21 Gbps GDDR6X up to 23 Gbps, and from 320W total card power up to 340W.
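
As a quick aside on the power side of that bump, here is a rough back-of-the-envelope on PSU sizing around a 340W card. The CPU and rest-of-system wattages below are my own ballpark assumptions, not part of the leak, so treat this as a sketch rather than a recommendation.

```python
# Rough PSU headroom estimate for a 340 W card.
# The CPU and "rest of system" figures are ballpark assumptions, not leaked specs.
card_w = 340          # leaked total card power
cpu_w = 150           # assumed high-end CPU under gaming load
rest_w = 100          # assumed motherboard, drives, fans, peripherals

system_w = card_w + cpu_w + rest_w
psu_w = system_w * 1.4   # ~40% headroom for transient spikes and efficiency

print(f"Estimated load: {system_w} W, suggested PSU: ~{round(psu_w, -1):.0f} W")
# Estimated load: 590 W, suggested PSU: ~830 W
```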

Let's make some updates.
RTX 4080
PG136/139-SKU360
AD103-300-A1
9728FP32
16GB 23Gbps GDDR6X
total card power 340W

— kopite7kimi (@kopite7kimi) August 23, 2022

This is quite interesting, as everyone was disappointed to see the earlier leak putting these initial boards on the slower 21 Gbps GDDR6X spec. Now, thanks to the bump in total power, we are up to 23 Gbps, which will be nice if you can somehow fill all 16GB of the onboard memory with textures and more. If you do intend to fill up 16GB (4K gamers, I am looking at you), then wait for the 4090 editions or even the 4080 Ti. We do not have hard specs on those yet; they tend to surface closer to launch, since specs keep changing right up to the week of the launch announcement.
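
For a rough sense of what 21 Gbps versus 23 Gbps means in practice, here is a quick bandwidth calculation. Note that the 256-bit bus width is my own assumption for this card; it is not part of the leak above.

```python
# Back-of-the-envelope memory bandwidth from the leaked per-pin data rates.
# The 256-bit bus width is an assumption, not a leaked spec.
bus_width_bits = 256     # assumed memory bus width

new_bw_gbs = 23 * bus_width_bits / 8   # 23 Gbps per pin
old_bw_gbs = 21 * bus_width_bits / 8   # 21 Gbps per pin

print(f"23 Gbps: {new_bw_gbs:.0f} GB/s vs 21 Gbps: {old_bw_gbs:.0f} GB/s")
# 23 Gbps: 736 GB/s vs 21 Gbps: 672 GB/s
```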

This should be seen as an upgrade for those not yet on an RTX board, like my aging 1080 Ti (which is still good, but I need ray tracing in my life, and DLSS). 3000 series owners are not going to get the kind of jump they first expected, even with the latest spec leaks. It was rumored earlier this year that the 4000 series would be skipped or possibly branded differently, since the performance would be more on par with a 5000 series jump, but as time went on, it became clear team green has not pulled that rabbit out of its hat.

Stay tuned for more GPU news and reviews. I will be upgrading my system to one of the 4000 series GPUs, so I will put it through its paces in real-world situations and test out certain CUDA-based programs like StyleGAN (an AI/ML project that can generate human faces with photorealistic results; I ran it on a 12-GPU EC2 machine and it was super scary).
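
As a taste of the kind of first-boot check I have in mind, here is a minimal sketch using PyTorch to confirm the card is visible to CUDA and to time a large matrix multiply. PyTorch is not mentioned anywhere above; it is just one convenient way to exercise CUDA, and the matrix size is arbitrary.

```python
# Minimal sketch: confirm the GPU is visible and run a crude throughput smoke test.
# Assumes PyTorch with CUDA support is installed.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"{props.name}: {props.total_memory / 1024**3:.0f} GB VRAM")

    # Time a large matrix multiply on the GPU using CUDA events.
    x = torch.randn(8192, 8192, device="cuda")
    start = torch.cuda.Event(enable_timing=True)
    end = torch.cuda.Event(enable_timing=True)
    start.record()
    y = x @ x
    end.record()
    torch.cuda.synchronize()
    print(f"8192x8192 matmul: {start.elapsed_time(end):.1f} ms")
else:
    print("No CUDA device found")
```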

Also, back to my normal blog post on Friday!
