7900 XTX or 4080 Super

Kukuza

Member
Joined
Jan 28, 2007
Messages
220
Reaction score
0
Just want to ask about these two GPU models: which one is more so-called future proof? The 7900 XTX looks better with its 24 GB of memory, while the 4080 Super just released and looks user friendly.
I'm mainly looking at Gigabyte now; I think they offer a 4-year warranty on their cards. Other brands' warranties seem quite short, or maybe I haven't looked properly.

Thanks in advance
 

NightRaven49

Master Member
Joined
May 17, 2019
Messages
3,699
Reaction score
1,544
To add on to kappak: if you're looking to play ray-tracing games, the 4080 Super blows the 7900 XTX out of the water. But of course you have to weigh the price difference between the specific models you're looking at to make a proper decision.
Other brands' warranties seem quite short, or maybe I haven't looked properly.
Most other brands offer 3 years; notable exceptions are Zotac at 5 years (with registration) and Sapphire at 2 years. But I don't think warranty is that big a deal, since the good majority of parts live long.
 

stanlawj

Supremacy Member
Joined
Jul 11, 2021
Messages
8,016
Reaction score
4,315
Just want to ask about these two GPU models: which one is more so-called future proof? The 7900 XTX looks better with its 24 GB of memory, while the 4080 Super just released and looks user friendly.
I'm mainly looking at Gigabyte now; I think they offer a 4-year warranty on their cards. Other brands' warranties seem quite short, or maybe I haven't looked properly.

Thanks in advance
Personally, I'm also shopping for a GPU for 4K gaming, and I think the prices are insane.
Not buying. I would rather wait one more year for the 3nm-process + DP 2.1 Nvidia Blackwell GPUs to arrive to justify the >US$1k pricing.

By that time, I hope the RTX 4080 equivalent (the RTX 5070) will also have a lower TDP and a smaller size.
 

ragnarok95

Senior Moderator
Senior Moderator
Joined
Mar 9, 2004
Messages
124,675
Reaction score
6,030
If you need RT for gaming or for work, get the 4080S. For everything else, the 7900 XTX. Future proofing depends on your mindset and your pocket.
 

Koenig168

Supremacy Member
Joined
Nov 4, 2007
Messages
9,648
Reaction score
1,504
Personally, I'm also shopping for a GPU for 4K gaming, and I think the prices are insane.
Not buying. I would rather wait one more year for the 3nm-process + DP 2.1 Nvidia Blackwell GPUs to arrive to justify the >US$1k pricing.

By that time, I hope the RTX 4080 equivalent (the RTX 5070) will also have a lower TDP and a smaller size.

By that time, the RTX 5070 will be priced at RTX 4080 level :ROFLMAO:
 

stanlawj

Supremacy Member
Joined
Jul 11, 2021
Messages
8,016
Reaction score
4,315
By that time, the RTX 5070 will be priced at RTX 4080 level :ROFLMAO:
I doubt it. The current RTX 4080 is just passable for 4K gaming with RT. The price needs to drop to the S$1k level (i.e. around US$700) to be reasonable.

Please recall this launch pricing; crypto-mining demand artificially inflated prices in 2021:

Model                       Launch          Launch MSRP (USD)
GeForce RTX 3080            Sep 17, 2020    $699
GeForce RTX 3080 (12 GB)    Jan 11, 2022    $799
GeForce RTX 3080 Ti         Jun 3, 2021     $1,199

Current RTX 4080 pricing is just pure Nvidia greed.
 
Last edited:

stanlawj

Supremacy Member
Joined
Jul 11, 2021
Messages
8,016
Reaction score
4,315
Only the GPU die is fabbed on a leading-edge process. The rest of the chips, including the memory chips, are not.

AD103 (the RTX 4080 / 4080 Super die) area ≈ 379 mm²
300 mm wafer area ≈ 70,700 mm²
Gross dies per wafer ≈ 186; round down to roughly 150 after discarding partial dies at the wafer edge.
At a fabbing cost of about $20k per wafer, that works out to roughly US$135 per die, or about US$190 per good die assuming 70% wafer yield.
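A minimal sketch of the same back-of-envelope arithmetic, using the standard dies-per-wafer approximation that discounts partial dies at the wafer edge; the ~$20k wafer cost and ~70% yield are the assumptions above, not official figures:

[CODE]
# Back-of-envelope die cost, reproducing the estimate above.
# Assumptions (not official figures): ~$20k per 300 mm wafer, ~70% yield,
# AD103 (RTX 4080 / 4080 Super) die area ~379 mm^2.
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    # Standard approximation: whole dies on the wafer minus partial dies lost at the edge.
    d, a = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / a - math.pi * d / math.sqrt(2 * a))

def cost_per_good_die(wafer_cost_usd: float, wafer_diameter_mm: float,
                      die_area_mm2: float, yield_rate: float) -> float:
    gross = dies_per_wafer(wafer_diameter_mm, die_area_mm2)
    return wafer_cost_usd / (gross * yield_rate)

print(dies_per_wafer(300, 379))                   # ~152 gross dies per wafer
print(cost_per_good_die(20_000, 300, 379, 0.70))  # ~US$190 per good die
[/CODE]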

So the GPU die fabbing cost probably went up about 2x moving from 7nm-class to leading-edge wafers (roughly $10k to $20k per wafer), but the die is only one part of the cost. The more important cost is R&D.

So does an RTX 4080 (AD103) card deserve to be priced at US$1k plus (an effective ~S$1.8k)?
 
Last edited:

Yongkit

Supremacy Member
Joined
Oct 9, 2015
Messages
5,796
Reaction score
2,430
The price is justified differently for each person; it's subjective.

For the rich = don't care, just want the GPU.
For the average = treat it as an end-game GPU.
For the poor = can see, cannot touch.

No matter how it's priced, those with no confidence in AMD will not choose a Radeon GPU anyway.
 

KleoZy

Supremacy Member
Joined
Jan 1, 2000
Messages
5,664
Reaction score
978
The price is justified differently for each person; it's subjective.

For the rich = don't care, just want the GPU.
For the average = treat it as an end-game GPU.
For the poor = can see, cannot touch.

No matter how it's priced, those with no confidence in AMD will not choose a Radeon GPU anyway.

This I do agree with.
For the rich = just get both cards and swap between them based on the game, as you please, until you're happy.
For the average = go all-in once, then play and tweak with it; don't keep comparing here and there.
For the poor = ignore games first and focus on your career; climb as far as you can to earn big bucks in a short period of time, and once you're earning 8k or so, then come back and play games 🤣
 

TanKianW

Supremacy Member
Joined
Apr 21, 2005
Messages
6,675
Reaction score
3,323
Just want to ask about these two GPU models: which one is more so-called future proof? The 7900 XTX looks better with its 24 GB of memory, while the 4080 Super just released and looks user friendly.
I'm mainly looking at Gigabyte now; I think they offer a 4-year warranty on their cards. Other brands' warranties seem quite short, or maybe I haven't looked properly.

Thanks in advance

For those with real needs, or know-how, not just gaming = Nvidia. I would prefer Zotac with the 5-year warranty too. I am not a fan of the exorbitant prices they are asking, but the fact is that supply and demand are driven by developers, enterprise applications and big players, NOT gamers.

There is just much more flexibility with Nvidia beyond gaming. Nvidia is better for AI-related work, transcoding (Plex, Emby, Jellyfin) and overall OSS application support. Few even understand how extensively Nvidia's software engineers have been working with (investing in) industry, researchers and developers (hardware like the Nvidia Jetson) since before AMD even started to jump on the AI wagon. AMD has been sleeping for quite some time and will take quite a while to catch up with Nvidia when most developers are already comfortable with CUDA.
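To make the CUDA point concrete, here is a minimal sketch (assuming a CUDA-enabled PyTorch install, which is my assumption, not something from this thread) that checks whether the card is usable for GPU compute and runs a small matrix multiply on it; on a Radeon card the equivalent route would be PyTorch's ROCm build, which far fewer projects target out of the box.

[CODE]
# Minimal sketch: check for a usable CUDA device and run a small workload on it.
# Assumes a CUDA-enabled PyTorch build; on AMD you would need the ROCm build instead.
import torch

if torch.cuda.is_available():
    device = torch.device("cuda")
    print("GPU:", torch.cuda.get_device_name(0))
    a = torch.randn(4096, 4096, device=device)
    b = torch.randn(4096, 4096, device=device)
    c = a @ b                     # matrix multiply runs on the GPU
    torch.cuda.synchronize()      # wait for the GPU kernel to finish
    print("matmul done, mean:", c.mean().item())
else:
    print("No CUDA device found -- running on CPU instead.")
[/CODE]

The transcoding story is similar: Plex, Emby and Jellyfin all lean on Nvidia's NVENC/NVDEC for hardware transcoding.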

My definition of future proof: my PC hardware generally has a very long life cycle across different roles before being put up for sale or fully retired. I prefer to juice a GPU starting in a high-end gaming rig, then downgrading it to a kids' study/gaming rig, an emulation box, a living-room entertainment PC, a storage NAS with mixed applications, or a hypervisor rig plus servers. Personally, a GPU that can only be used for gaming is a waste of processing power.

A good watch for those with some time:
 
Last edited:

KleoZy

Supremacy Member
Joined
Jan 1, 2000
Messages
5,664
Reaction score
978

For those with real needs, or know-how, not just gaming = Nvidia. I would prefer Zotac with the 5-year warranty too. I am not a fan of the exorbitant prices they are asking, but the fact is that supply and demand are driven by developers, enterprise applications and big players, NOT gamers.

There is just much more flexibility with Nvidia beyond gaming. Nvidia is better for AI-related work, transcoding (Plex, Emby, Jellyfin) and overall OSS application support. Few even understand how extensively Nvidia's software engineers have been working with (investing in) industry, researchers and developers (hardware like the Nvidia Jetson) since before AMD even started to jump on the AI wagon. AMD has been sleeping for quite some time and will take quite a while to catch up with Nvidia when most developers are already comfortable with CUDA.

My definition of future proof: my PC hardware generally has a very long life cycle across different roles before being put up for sale or fully retired. I prefer to juice a GPU starting in a high-end gaming rig, then downgrading it to a kids' study/gaming rig, an emulation box, a living-room entertainment PC, a storage NAS with mixed applications, or a hypervisor rig plus servers. Personally, a GPU that can only be used for gaming is a waste of processing power.

A good watch for those with some time:

At a professional level, I totally agree with you. Especially if it's generating money and you want to prolong GPU usage at a professional level, it's a no-brainer: whack the top-dog card (Nvidia especially). That saves time on GPU compute.

But then you have to take into consideration that the majority are using the GPU for gaming. Some do put it to good use too, as in fine-tuning, and sure. Some are just monkey see, monkey do: "oh, you have this series? I'll have something better", then show off with pics... those are damn boring lol. I'd rather pay attention to the threads that contribute to optimising a GPU after purchase.
 

TanKianW

Supremacy Member
Joined
Apr 21, 2005
Messages
6,675
Reaction score
3,323
At a professional level, I totally agree with you. Especially if it's generating money and you want to prolong GPU usage at a professional level, it's a no-brainer: whack the top-dog card (Nvidia especially). That saves time on GPU compute.

But then you have to take into consideration that the majority are using the GPU for gaming. Some do put it to good use too, as in fine-tuning, and sure. Some are just monkey see, monkey do: "oh, you have this series? I'll have something better", then show off with pics... those are damn boring lol. I'd rather pay attention to the threads that contribute to optimising a GPU after purchase.

I guess I may be assuming there are more people, or at least curious individuals, who will look at building a homelab/home server given the current state of high-speed connectivity, user-friendly applications and capable hardware, rather than just a plain “gaming rig”.
 
Last edited:

Phen8210

High Supremacy Member
Joined
Jul 29, 2011
Messages
28,885
Reaction score
8,214
I guess I may be assuming there are more people, or at least curious individuals, who will look at building a homelab/home server given the current state of high-speed connectivity, user-friendly applications and capable hardware, rather than just a plain “gaming rig”.

Not necessarily. Nowadays, the skills needed to set up a home server only serve as a test of basic networking and infrastructure knowledge. In a professional setting, what truly matters is proficiency in cloud computing platforms like AWS, and curious individuals should focus their efforts there, challenge themselves to build projects on the cloud and also get certified.

Once an individual has established a foundational understanding of networking and infrastructure, it becomes crucial to shift focus toward learning how to leverage various cloud services. Cloud computing services are growing rapidly, with AWS alone offering over 200 services. Over the years, it has become much more complex.

While on-premises skills are still valuable, cloud computing platforms like AWS, Azure, and Google Cloud have become dominant in the IT industry. Many organizations are shifting from traditional on-premises infrastructure to hybrid/cloud for scalability, flexibility, and cost-effectiveness.

With the current trends and priorities in the IT industry, it's essential to learn application development and deployment, scripting and automation of the OS environment, as well as networking and infrastructure, particularly in cloud environments.
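As a hedged illustration of what a first cloud-scripting exercise might look like (this assumes the AWS SDK for Python, boto3, and credentials already configured, none of which come from this thread), even something as small as listing your S3 buckets already exercises credentials, IAM permissions and the SDK:

[CODE]
# Tiny starting point for cloud scripting: list your S3 buckets with boto3.
# Assumes AWS credentials are already configured (e.g. via `aws configure`).
import boto3

def list_buckets() -> list[str]:
    s3 = boto3.client("s3")          # uses the default credential chain
    response = s3.list_buckets()     # needs the s3:ListAllMyBuckets permission
    return [bucket["Name"] for bucket in response.get("Buckets", [])]

if __name__ == "__main__":
    for name in list_buckets():
        print(name)
[/CODE]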
 

CrispyPeddler

Senior Member
Joined
Jul 26, 2016
Messages
1,829
Reaction score
7
Curious about this too. I thought Nvidia would be leading in terms of gaming with all the RT, DLSS and FG (frame generation)? Is AMD catching up on this?
 

Phen8210

High Supremacy Member
Joined
Jul 29, 2011
Messages
28,885
Reaction score
8,214
Curious about this too. I thought Nvidia would be leading in terms of gaming with all the RT, DLSS and FG (frame generation)? Is AMD catching up on this?

AMD will take a very long time to catch up; Nvidia didn't build these overnight either. A lot of these technologies aren't without side effects anyway, which is why some people have begun shifting towards AMD.
 