RTX 4000 series thread... [No scalping discussion]

stanlawj

Supremacy Member
Joined
Jul 11, 2021
Messages
8,531
Reaction score
4,524
ur entire post is a mistake, but I'll add in details after lunch.
Try me. AMD should have known what to do five years ago.


I absolutely dislike AMD. Especially the part where they purposely withheld Threadripper Zen 4 from the market for more than a year.
 
Last edited:

NightRaven49

Master Member
Joined
May 17, 2019
Messages
3,707
Reaction score
1,551
The RT uses tensor cores which are capable of AI inferencing (the part where you can deploy the fully-baked AI model to answer new questions).
rt = raytracing. raytracing on nvidia gpus dont run on tensor cores, they run on the rt cores. tensor cores are used for ai tasks (such as dlss) yes, but that is separate from rt.
Which means the Nvidia GPU is future proofed for AI PC that Microsoft is preparing to launch for Win12.
not fully. nvidia is generally quite stingy on vram, which limits the size of ai models that can be run on their gpus. u can already see this limitation with onprem image generation models (eg stable diffusion), where the size of the generated image is related to the size of the vram buffer. and if u want more vram, guess what, u have to pony up big time for otherwise diminishing returns. im not sure if the ai features of win12 are onprem, but if the current iteration of copilot is anything to go by, u dont even need dedicated hardware for it.
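the vram point can be sketched with a toy back-of-envelope calculation (pure stdlib python; the model size and precisions are hypothetical examples, and real usage adds activations, caches and framework overhead on top of the weights):

```python
# Back-of-envelope VRAM needed just to hold a model's weights. Real usage
# is higher: activations, KV caches and framework overhead all add on top.
def weights_vram_gb(params_billions: float, bytes_per_param: float) -> float:
    return params_billions * 1e9 * bytes_per_param / (1024 ** 3)

# A hypothetical 7B-parameter model at different precisions:
fp16 = weights_vram_gb(7, 2.0)   # roughly 13 GB: too big for an 8 GB card
int4 = weights_vram_gb(7, 0.5)   # 4-bit quantised: roughly 3.3 GB
print(f"fp16: {fp16:.1f} GB, int4: {int4:.1f} GB")
```

which is why quantisation is so popular on consumer cards: the same weights at 4-bit fit where fp16 would not.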
AMD is the one who's dropping the ball. They totally missed the AI applications. In addition AMD = buggy. This reputation stuck for a long time and I don't see any meaningful improvement to date.
totally not helped by the monopolistic actions of nvidia, which they are still pulling to this day :s22: amd does have rocm as a promising alternative, especially when ai companies are actively trying to leave the walled garden of nvidia. asrock even actively promotes ai support on their 7000 series cards, with not unusable results. amd also has npus baked into their latest processors. and if amd = buggy (an outdated stereotype), then nvidia = buggy as well as they both have similar amount of bugs in their consumer drivers.
Try me. AMD should have known what to do five years ago.

ah yes, a nvidia blog post on why nvidia is better, very credible :s13: and this is comparing cpus to gpus in ai workloads, which is not a fair comparison as cpus and gpus work in fundamentally different ways.
I absolutely dislike AMD. Especially the part where they purposely withheld Threadripper Zen 4 from the market for more than a year.
and they should absolutely be called out on it, but u have to hold nvidia to the same standard as well.
 
Last edited:

stanlawj

Supremacy Member
Joined
Jul 11, 2021
Messages
8,531
Reaction score
4,524
rt = raytracing. raytracing on nvidia gpus dont run on tensor cores, they run on the rt cores. tensor cores are used for ai tasks (such as dlss) yes, but that is separate from rt.

not fully. nvidia is generally quite stingy on vram, which limits the size of ai models that can be run on their gpus. u can already see this limitation with onprem image generation models (eg stable diffusion), where the size of the generated image is related to the size of the vram buffer. and if u want more vram, guess what, u have to pony up big time for otherwise diminishing returns. im not sure if the ai features of win12 are onprem, but if the current iteration of copilot is anything to go by, u dont even need dedicated hardware for it.

totally not helped by the monopolistic actions of nvidia, which they are still pulling to this day :s22: amd does have rocm as a promising alternative, especially when ai companies are actively trying to leave the walled garden of nvidia. asrock even actively promotes ai support on their 7000 series cards, with not unusable results. amd also has npus baked into their latest processors. and if amd = buggy (an outdated stereotype), then nvidia = buggy as well as they both have similar amount of bugs in their consumer drivers.

ah yes, a nvidia blog post on why nvidia is better, very credible :s13: and this is comparing cpus to gpus in ai workloads, which is not a fair comparison as cpus and gpus work in fundamentally different ways.

and they should absolutely be called out on it, but u have to hold nvidia to the same standard as well.
Oh ... you're right... I just searched and realised RT denoising is not AI-based but mathematically calculated from existing rays. Somehow I got confused between the AI-based DLSS and the RT denoising algorithm, because both are, in a way, similar to cheating by interpolation.

https://developer.nvidia.com/rtx/ray-tracing/rt-denoisers

RT cores are fully dedicated to RT tasks, and that's what makes Nvidia stand out for their compute strength in RT. AMD hasn't caught up to Nvidia in this accelerator area.
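a toy illustration (stdlib python only, numbers purely illustrative) of why denoising is needed at all: a per-pixel Monte Carlo estimate's noise shrinks only as 1/sqrt(N), so 16x the rays buys only 4x less noise, and real-time RT with a few samples per pixel leans on a denoiser instead of more rays:

```python
import random
import statistics

def estimate(n_samples, rng):
    # Toy Monte Carlo integral of f(x) = x over [0, 1]; true value is 0.5.
    return sum(rng.random() for _ in range(n_samples)) / n_samples

rng = random.Random(0)
errs_few = [abs(estimate(4, rng) - 0.5) for _ in range(2000)]
errs_many = [abs(estimate(64, rng) - 0.5) for _ in range(2000)]

# 16x the samples gives roughly 4x less noise (1/sqrt(N) scaling).
ratio = statistics.mean(errs_few) / statistics.mean(errs_many)
print(f"noise ratio 4 vs 64 samples: {ratio:.2f}")
```

that diminishing return is exactly the gap a denoiser (mathematical or AI) fills.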
 
Last edited:

stanlawj

Supremacy Member
Joined
Jul 11, 2021
Messages
8,531
Reaction score
4,524
not fully. nvidia is generally quite stingy on vram, which limits the size of ai models that can be run on their gpus. u can already see this limitation with onprem image generation models (eg stable diffusion), where the size of the generated image is related to the size of the vram buffer. and if u want more vram, guess what, u have to pony up big time for otherwise diminishing returns. im not sure if the ai features of win12 are onprem, but if the current iteration of copilot is anything to go by, u dont even need dedicated hardware for it.
You're right about the VRAM, it's all about saving cost and making profits for Nvidia.
 

watzup_ken

High Supremacy Member
Joined
Nov 21, 2003
Messages
25,672
Reaction score
2,123
The RT uses tensor cores which are capable of AI inferencing (the part where you can deploy the fully-baked AI model to answer new questions). Same as the Intel and Mac NPU.
Which means the Nvidia GPU is future proofed for AI PC that Microsoft is preparing to launch for Win12.

If I made any mistake in these statements, let me know.
AMD is the one who's dropping the ball. They totally missed the AI applications. In addition AMD = buggy. This reputation stuck for a long time and I don't see any meaningful improvement to date.

Also, if AMD can't do it, it's time for Chinese GPU manufacturers to step in to the game to challenge Nvidia's price gouging.
I gladly dispute this claim. I've been switching between AMD and Nvidia graphics solutions and I certainly don't think AMD drivers are buggy. It sounded like the entire experience with AMD graphics solutions is bad, but that is clearly not the case. The point is, there are always software bugs, and I certainly do not feel these bugs destroy the user experience.
 

Koenig168

Supremacy Member
Joined
Nov 4, 2007
Messages
9,744
Reaction score
1,557
The last GPU I bought was an AMD card. Though I only used it for a few days, the experience was bug-free.
 

Ferolare

High Supremacy Member
Joined
Feb 26, 2007
Messages
48,377
Reaction score
150
Historically speaking, from my experience, I would agree that AMD drivers had more bugs than Nvidia's.

But at the same time, my current Nvidia card is the first since 8500GS soooooooooo...
 

ragnarok95

Senior Moderator
Senior Moderator
Joined
Mar 9, 2004
Messages
124,886
Reaction score
6,204
My experience with the 6800XT and AMD drivers was good. Stable. Then I went to an RTX 3080... stable too. Then I changed to an RTX 4070 Ti, and hit driver issues. :s13:
 

watzup_ken

High Supremacy Member
Joined
Nov 21, 2003
Messages
25,672
Reaction score
2,123
Historically speaking, from my experience, I would agree that AMD drivers had more bugs than Nvidia's.

But at the same time, my current Nvidia card is the first since 8500GS soooooooooo...
I don't disagree that there may seem to be more software bugs with AMD graphics solutions. But generally, the way people have put it across sounds like they are "very buggy", which is factually inaccurate. The other thing people like to compare is DLSS vs FSR image quality, which I find very perplexing. When your GPU doesn't have the ability to run a game smoothly, the last thing I will kick up a fuss about is the quality of the image. So one either drops native resolution, drops graphics settings, or just enables whatever upscaling technology is available. Don't get me wrong, DLSS tends to look better, but the truth is that it does not run on every graphics solution out there. So what good is it if it cannot run on some graphics cards?
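for reference, what these upscaling modes actually do to the internal render resolution can be sketched like this (the scale factors follow the commonly cited DLSS/FSR presets, Quality = 1.5x and Performance = 2.0x per axis; exact values can vary by game and version):

```python
# Internal render resolution for common upscaler quality modes: the game
# renders at a lower resolution and the upscaler reconstructs the output.
def render_res(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    return round(out_w / scale), round(out_h / scale)

print(render_res(3840, 2160, 1.5))  # Quality at 4K -> (2560, 1440)
print(render_res(3840, 2160, 2.0))  # Performance at 4K -> (1920, 1080)
```

so "4K with Performance upscaling" is really a 1080p render, which is why it runs so much faster than native.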
 

xonix

Arch-Supremacy Member
Joined
Mar 24, 2001
Messages
17,592
Reaction score
1,694
I don't disagree that there may seem to be more software bugs with AMD graphics solutions. But generally, the way people have put it across sounds like they are "very buggy", which is factually inaccurate. The other thing people like to compare is DLSS vs FSR image quality, which I find very perplexing. When your GPU doesn't have the ability to run a game smoothly, the last thing I will kick up a fuss about is the quality of the image. So one either drops native resolution, drops graphics settings, or just enables whatever upscaling technology is available. Don't get me wrong, DLSS tends to look better, but the truth is that it does not run on every graphics solution out there. So what good is it if it cannot run on some graphics cards?
Upscaling is always the last resort for me.
 

watzup_ken

High Supremacy Member
Joined
Nov 21, 2003
Messages
25,672
Reaction score
2,123
Upscaling is always the last resort for me.
Some games I will run at native resolution. For some games, I will upscale and frame cap them so that my GPU doesn't need to work so hard and runs cooler. :LOL: I am generally not that fussed about how the game looks, unless the upscaled image looks really rubbish or introduces too many severe image quality issues.
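the cooler-running point follows from the frame-time budget a cap imposes; a minimal sketch (illustrative caps only):

```python
# A frame-rate cap gives each frame a fixed time budget. Whenever the GPU
# can finish a frame faster than the budget, it sits idle for the rest of
# the interval, drawing less power and running cooler.
def frame_budget_ms(fps_cap: float) -> float:
    return 1000.0 / fps_cap

print(f"{frame_budget_ms(72):.1f} ms per frame at a 72 fps cap")
print(f"{frame_budget_ms(142):.1f} ms per frame at a 142 fps cap")
```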
 

xonix

Arch-Supremacy Member
Joined
Mar 24, 2001
Messages
17,592
Reaction score
1,694
Some games I will run at native resolution. For some games, I will upscale and frame cap them so that my GPU doesn't need to work so hard and runs cooler. :LOL: I am generally not that fussed about how the game looks, unless the upscaled image looks really rubbish or introduces too many severe image quality issues.
I tweaked CP2077 until I gave up. In the end I disabled RT and just ran at 1440p native high without upscaling, turned on FMF, and capped at 72fps.
 

Phen8210

High Supremacy Member
Joined
Jul 29, 2011
Messages
28,901
Reaction score
8,219
I tweaked CP2077 until I gave up. In the end I disabled RT and just ran at 1440p native high without upscaling, turned on FMF, and capped at 72fps.

CP2077 is relatively easy to tune and very much worth tuning.

If you are having trouble, check out the BenchmarKing optimization guides; there is no need for all that fancy upscaling tech. If you are on Nvidia, you can turn on DLAA to clean things up if you have extra performance to spare after tuning.

Last year I did a comparison and uploaded it here; I added a little twist of my own and the results are excellent.

You do not need a high-end GPU to get good graphics in CP2077.
 

xonix

Arch-Supremacy Member
Joined
Mar 24, 2001
Messages
17,592
Reaction score
1,694
CP2077 is relatively easy to tune and very much worth tuning.

If you are having trouble, check out the BenchmarKing optimization guides; there is no need for all that fancy upscaling tech. If you are on Nvidia, you can turn on DLAA to clean things up if you have extra performance to spare after tuning.

Last year I did a comparison and uploaded it here; I added a little twist of my own and the results are excellent.

You do not need a high-end GPU to get good graphics in CP2077.

No lah... my system specs cannot make it :D and yet I want to get 60fps for the 1% lows :ROFLMAO:
My current settings allow me to get a minimum of 60fps / 142 max with FMF
 
Last edited:

Yongkit

Supremacy Member
Joined
Oct 9, 2015
Messages
5,832
Reaction score
2,452
I don't disagree that there may seem to be more software bugs with AMD graphics solutions. But generally, the way people have put it across sounds like they are "very buggy", which is factually inaccurate. The other thing people like to compare is DLSS vs FSR image quality, which I find very perplexing. When your GPU doesn't have the ability to run a game smoothly, the last thing I will kick up a fuss about is the quality of the image. So one either drops native resolution, drops graphics settings, or just enables whatever upscaling technology is available. Don't get me wrong, DLSS tends to look better, but the truth is that it does not run on every graphics solution out there. So what good is it if it cannot run on some graphics cards?
After switching to a Radeon GPU I forgot everything about RTX and DLSS. I happily play at native with HDR, and the colours are surprisingly better too.
 

matique

Senior Member
Joined
Sep 25, 2017
Messages
797
Reaction score
786
not fully. nvidia is generally quite stingy on vram, which limits the size of ai models that can be run on their gpus. u can already see this limitation with onprem image generation models (eg stable diffusion), where the size of the generated image is related to the size of the vram buffer. and if u want more vram, guess what, u have to pony up big time for otherwise diminishing returns. im not sure if the ai features of win12 are onprem, but if the current iteration of copilot is anything to go by, u dont even need dedicated hardware for it.

SD does not like running on AMD GPUs, and it takes quite a few workarounds to get it working on Radeon cards. It is what it is; if you do want to explore consumer-level AI programs, Nvidia GPUs are the way to go. Agree with you about Nvidia being stingy with VRAM and too pricey, but when the competition is just meh, they can get away with it.
 

KYZT2021

Arch-Supremacy Member
Joined
Jun 18, 2021
Messages
21,323
Reaction score
2,800
Nvidia is doing the Apple pricing model: whatever the price, consumers will still buy. The 4090 is still in stock in CN despite the D.
 