Need recommendation for new AM4 motherboard.

michael_thm

Arch-Supremacy Member
Joined
Jan 1, 2000
Messages
21,708
Reaction score
2,483
Hi, I am currently running a Ryzen 5800X on a B550M motherboard. 48GB DDR4 RAM (2x8GB + 2x16GB) @ 2933. RTX 5060 Ti 16GB. 2x 1TB NVMe SSD. 8x 8TB NAS HDD. 650W Corsair PSU.

I am looking to add another RTX 5060 Ti 16GB so that I can have 32GB of VRAM.

However, PCPartPicker's filtering seems to be messed up for AM4 motherboards with 2x PCIe x8/x16 slots.
So I would like to ask for recommendations for affordable X570 motherboards that will support 2x RTX 5060 Ti. Thanks.
 

Koenig168

Supremacy Member
Joined
Nov 4, 2007
Messages
9,654
Reaction score
1,505
Hi, I am currently running a Ryzen 5800X on a B550M motherboard. 48GB DDR4 RAM (2x8GB + 2x16GB) @ 2933. RTX 5060 Ti 16GB. 2x 1TB NVMe SSD. 8x 8TB NAS HDD. 650W Corsair PSU.

I am looking to add another RTX 5060 Ti 16GB so that I can have 32GB of VRAM.

However, PCPartPicker's filtering seems to be messed up for AM4 motherboards with 2x PCIe x8/x16 slots.
So I would like to ask for recommendations for affordable X570 motherboards that will support 2x RTX 5060 Ti. Thanks.

You won't find affordable motherboards that run dual GPUs at x16 each. You would need boards with PLX switches, and those aren't cheap.

For x8/x8, I think most X570 boards should be able to handle that. Just take a look at the manual for the board you are interested in.
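If you do go dual-card, one quick way to confirm the board is actually negotiating x8 to each card is to read the link width from sysfs on Linux. A minimal sketch, assuming the usual sysfs layout (paths can vary by kernel/distro):

```python
# List each GPU's negotiated PCIe link width via sysfs (Linux).
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    cls_file = dev / "class"
    width_file = dev / "current_link_width"
    if not cls_file.exists() or not width_file.exists():
        continue
    # 0x0300xx = VGA controller, 0x0302xx = 3D/compute controller
    if not cls_file.read_text().startswith("0x030"):
        continue
    cur = width_file.read_text().strip()
    mx = (dev / "max_link_width").read_text().strip()
    print(f"{dev.name}: link is x{cur}, card supports up to x{mx}")
```

A slot that silently trains down to x4 because of lane sharing will show up here first.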
 

abstrax

Senior Member
Joined
Nov 29, 2001
Messages
1,423
Reaction score
127
Haven't heard of gamers wanting to use SLI on a PC in a long time... lol

I'd rather he sell off the 5060 Ti and get a 5070 Ti.
 

michael_thm

Arch-Supremacy Member
Joined
Jan 1, 2000
Messages
21,708
Reaction score
2,483
Probably ML.
Yup, a lot of the smaller AI models are 20-ish GB even at Q8. Even with 10-ish GB models, the headroom for context is limited, so I want 32GB of VRAM for that headroom. Any recommendations? The boards I see are mainly around $400; that's like three B550M motherboards just for the ability to do x8+x8 PCIe. Hopefully there are $200+ ones?
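For anyone who wants to sanity-check that 32GB target, here is a rough back-of-envelope sketch; the model dimensions below are hypothetical placeholders for a ~20B-class model, and real runtimes add overhead on top:

```python
# Rough VRAM budget: weights at Q8 (~1 byte/param) plus an fp16 KV cache.
# All dimensions are hypothetical placeholders; real numbers vary per model.

def kv_cache_gb(n_layers, n_kv_heads, head_dim, n_ctx, bytes_per_elem=2):
    """K and V tensors: 2 * layers * ctx * kv_heads * head_dim * bytes."""
    return 2 * n_layers * n_ctx * n_kv_heads * head_dim * bytes_per_elem / 1e9

weights_gb = 20.0  # ~20B params at Q8 is roughly 20 GB
kv_gb = kv_cache_gb(n_layers=48, n_kv_heads=8, head_dim=128, n_ctx=32768)

print(f"weights ~{weights_gb:.0f} GB, KV cache ~{kv_gb:.1f} GB, "
      f"total ~{weights_gb + kv_gb:.1f} GB")  # ~26 GB: over 16, under 32
```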

Anyone else reading this thread later: I'm only planning the upgrade in a few months, so please fire away with any recommendations, there's time.
 

Koenig168

Supremacy Member
Joined
Nov 4, 2007
Messages
9,654
Reaction score
1,505
Yup, a lot of the smaller AI models are 20-ish GB even at Q8. Even with 10-ish GB models, the headroom for context is limited, so I want 32GB of VRAM for that headroom. Any recommendations? The boards I see are mainly around $400; that's like three B550M motherboards just for the ability to do x8+x8 PCIe. Hopefully there are $200+ ones?

Anyone else reading this thread later: I'm only planning the upgrade in a few months, so please fire away with any recommendations, there's time.

$200 for a new X570 seems a bit difficult; I suggest you look at the secondhand market.

Alternatively, you can check out Taobao.
 
Last edited:

Goodshot

Great Supremacy Member
Joined
Sep 20, 2013
Messages
61,638
Reaction score
3,980
Taobao has X570 boards for $100+, but I don't think they come brand new anymore. Also, I'd rather sell it and get a 5070 than run dual GPUs; I don't think VRAM helps gaming all that much on a 5060 Ti.

The chip itself has its limits.
 

iceblendedchoc

Arch-Supremacy Member
Joined
Nov 22, 2016
Messages
21,949
Reaction score
9,098
Hi, I am currently running a Ryzen 5800X on a B550M motherboard. 48GB DDR4 RAM (2x8GB + 2x16GB) @ 2933. RTX 5060 Ti 16GB. 2x 1TB NVMe SSD. 8x 8TB NAS HDD. 650W Corsair PSU.

I am looking to add another RTX 5060 Ti 16GB so that I can have 32GB of VRAM.

However, PCPartPicker's filtering seems to be messed up for AM4 motherboards with 2x PCIe x8/x16 slots.
So I would like to ask for recommendations for affordable X570 motherboards that will support 2x RTX 5060 Ti. Thanks.
Just use a PCIe splitter, the cheapest solution, but I'm not sure how you're going to house it? Maybe get an old mining rig frame to house them and run the cards.
 

86technie

High Supremacy Member
Joined
Jun 8, 2006
Messages
39,023
Reaction score
5,080
Hi, I am currently running a Ryzen 5800X on a B550M motherboard. 48GB DDR4 RAM (2x8GB + 2x16GB) @ 2933. RTX 5060 Ti 16GB. 2x 1TB NVMe SSD. 8x 8TB NAS HDD. 650W Corsair PSU.

I am looking to add another RTX 5060 Ti 16GB so that I can have 32GB of VRAM.

However, PCPartPicker's filtering seems to be messed up for AM4 motherboards with 2x PCIe x8/x16 slots.
So I would like to ask for recommendations for affordable X570 motherboards that will support 2x RTX 5060 Ti. Thanks.

For gaming?
SLI + affordable doesn't fit.
If you want 32GB for AI or any other purpose, I'd rather you buy this Gigabyte Radeon AI PRO R9700 32GB.

SLI died for a few reasons:

1) Optimization: not all applications/games are optimized for SLI, and some games will have very bad screen tearing.
2) Power consumption.

If you want a cheaper option than the RTX 5090 32GB, this will be the best one at half the price of an RTX 5090.

https://thetechyard.com/products/gi...OLoQd5v2A6g9J65c1jjDzLKDbl5I6GKkaAhajEALw_wcB
 

deepblue_82

Arch-Supremacy Member
Joined
Oct 23, 2007
Messages
18,634
Reaction score
2,097
For gaming?
SLI + affordable doesn't fit.
If you want 32GB for AI or any other purpose, I'd rather you buy this Gigabyte Radeon AI PRO R9700 32GB.

SLI died for a few reasons:

1) Optimization: not all applications/games are optimized for SLI, and some games will have very bad screen tearing.
2) Power consumption.

If you want a cheaper option than the RTX 5090 32GB, this will be the best one at half the price of an RTX 5090.

https://thetechyard.com/products/gi...OLoQd5v2A6g9J65c1jjDzLKDbl5I6GKkaAhajEALw_wcB
Wow, that Gigabyte Radeon AI PRO R9700 32GB is good stuff... I mean for AI.
For the long run it's good.
For the short run, you can find an AM4 SLI motherboard...
 

michael_thm

Arch-Supremacy Member
Joined
Jan 1, 2000
Messages
21,708
Reaction score
2,483
For gaming?
SLI + affordable doesn't fit.
If you want 32GB for AI or any other purpose, I'd rather you buy this Gigabyte Radeon AI PRO R9700 32GB.

SLI died for a few reasons:

1) Optimization: not all applications/games are optimized for SLI, and some games will have very bad screen tearing.
2) Power consumption.

If you want a cheaper option than the RTX 5090 32GB, this will be the best one at half the price of an RTX 5090.

https://thetechyard.com/products/gi...OLoQd5v2A6g9J65c1jjDzLKDbl5I6GKkaAhajEALw_wcB
I am starting with a 5060 Ti 16GB because this is just a hobby for fun, and I am just starting out on self-hosted LLMs after trying paid subscriptions to cloud LLMs. I am currently still paying USD 30 for Grok AI. Self-hosted LLMs have more features, and they won't tell you that you have run out of tokens. Another thing is that you get to use specialized AI models which do specific tasks better, which is why companies run their own AI models, besides the need for privacy and data protection. Cloud subscriptions for access to a list of specialized AI models are pretty steep and not for home use.

One good example is writing a novel. With cloud AI subscriptions, the training is such that the model keeps recycling the same words, and when you write in Mandarin it produces over-the-top indulgent filler... fancy wine, supercars, face-slapping scenes; it will spend half a chapter describing this nonsense because the mainland web-novel crowd loves it. It will also forget user prompts that were fixed as rules just a chapter down the road... I wrote rules saying I don't want to see those indulgent descriptions, but it forgets! The only way is to have control over the system prompt, which you won't get with consumer cloud AI subscriptions.
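To make that concrete, this is roughly what system-prompt control looks like when self-hosting. A minimal sketch assuming a local OpenAI-compatible endpoint (llama.cpp's llama-server, Ollama, and similar servers expose one); the URL and model name are placeholders:

```python
from openai import OpenAI

# Point the standard client at the local server instead of a cloud provider.
client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

resp = client.chat.completions.create(
    model="local-model",  # whatever your server has loaded (placeholder)
    messages=[
        # You own the system prompt and it is resent with every request,
        # so the rules can't silently fall out of context.
        {"role": "system", "content": "No indulgent filler descriptions. "
                                      "Keep prose spare and concrete."},
        {"role": "user", "content": "Draft chapter 12 from this outline: ..."},
    ],
)
print(resp.choices[0].message.content)
```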

I did consider getting a 32GB VRAM GPU but decided against it, as I am not sure the self-hosted LLM experience will be worth it. An RTX 5060 Ti 16GB is very cheap at $550 compared to the $1900 price of an R9700 32GB or the $4000 RTX 5090 32GB. Of course, I may still go for either the R9700 or the RTX 5090 later on, but the current plan is 2x 5060 Ti, which will cost me only $550 more.
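For what it's worth, pooling two 16GB cards is typically done by the inference runtime splitting the model's layers across GPUs. A minimal sketch assuming the llama-cpp-python bindings; the model file and the even split are placeholders:

```python
from llama_cpp import Llama

llm = Llama(
    model_path="/models/some-20b-q8_0.gguf",  # hypothetical file
    n_gpu_layers=-1,          # offload all layers to GPU
    tensor_split=[0.5, 0.5],  # divide the weights across GPU0 and GPU1
    n_ctx=32768,
)
print(llm("Hello,", max_tokens=16)["choices"][0]["text"])
```

For single-stream inference only small per-token activations cross the PCIe link, which is part of why x8/x8 is usually acceptable for this.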
 

86technie

High Supremacy Member
Joined
Jun 8, 2006
Messages
39,023
Reaction score
5,080
I am starting with a 5060 Ti 16GB because this is just a hobby for fun, and I am just starting out on self-hosted LLMs after trying paid subscriptions to cloud LLMs. I am currently still paying USD 30 for Grok AI. Self-hosted LLMs have more features, and they won't tell you that you have run out of tokens. Another thing is that you get to use specialized AI models which do specific tasks better, which is why companies run their own AI models, besides the need for privacy and data protection. Cloud subscriptions for access to a list of specialized AI models are pretty steep and not for home use.

One good example is writing a novel. With cloud AI subscriptions, the training is such that the model keeps recycling the same words, and when you write in Mandarin it produces over-the-top indulgent filler... fancy wine, supercars, face-slapping scenes; it will spend half a chapter describing this nonsense because the mainland web-novel crowd loves it. It will also forget user prompts that were fixed as rules just a chapter down the road... I wrote rules saying I don't want to see those indulgent descriptions, but it forgets! The only way is to have control over the system prompt, which you won't get with consumer cloud AI subscriptions.

I did consider getting a 32GB VRAM GPU but decided against it, as I am not sure the self-hosted LLM experience will be worth it. An RTX 5060 Ti 16GB is very cheap at $550 compared to the $1900 price of an R9700 32GB or the $4000 RTX 5090 32GB. Of course, I may still go for either the R9700 or the RTX 5090 later on, but the current plan is 2x 5060 Ti, which will cost me only $550 more.
Take note that the driver for that is also different.
The GeForce you are using is designed for gaming, not for AI work.
Nvidia also has AI cards, but that is the Tesla range.

PS: the link below is just for info sharing; it is not my card.
You can buy one and try it, then you will know the difference.

Tesla cards are designed for compute. Take note that this one has no display output, so you have to assign your AI workload to this GPU explicitly.

Nvidia Tesla V100 PCIe https://carousell.app.link/sdY5qgm99Yb
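A minimal sketch of pinning work to a headless compute card, assuming PyTorch and that the Tesla shows up as device 1 (check nvidia-smi for the real ordering on your machine):

```python
import os
os.environ["CUDA_VISIBLE_DEVICES"] = "1"  # set before CUDA initializes

import torch
print(torch.cuda.get_device_name(0))  # index 0 now maps to the headless card
x = torch.ones(1024, device="cuda")   # allocations land on that GPU
```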
 

ragnarok95

Senior Moderator
Senior Moderator
Joined
Mar 9, 2004
Messages
124,678
Reaction score
6,035
Take note that the driver for that is also different.
The GeForce you are using is designed for gaming, not for AI work.
Nvidia also has AI cards, but that is the Tesla range.

PS: the link below is just for info sharing; it is not my card.
You can buy one and try it, then you will know the difference.

Tesla cards are designed for compute. Take note that this one has no display output, so you have to assign your AI workload to this GPU explicitly.

Nvidia Tesla V100 PCIe https://carousell.app.link/sdY5qgm99Yb
If the card works for him, why not? He just needs the VRAM.
 

TanKianW

Supremacy Member
Joined
Apr 21, 2005
Messages
6,679
Reaction score
3,325
If you need self-hosted LLMs, sticking to Nvidia cards provides better software support on Linux, so it will be a better choice imo, even for a 2x GPU setup. Windows generally has better AMD support for AI, but choosing Linux over Windows for this use case is generally preferred, though there will be some learning curve if you are not sufficiently "Linux savvy". Another option could be the newer AMD AI Pro mini-PCs if you want to run really large models that require lots of VRAM, but those are not going to come cheap either.

Personally, I am running my own self-hosted LLMs on 3x RTX 4000 SFF Ada Generation cards with 20GB VRAM each, across 3 different hosts (each node a 5950X with 128GB of ECC memory), loaded with 3 different LLM models on Linux Mint 22.2 Zara (based on Ubuntu 24.04) virtual machines. I am also running a hyper-converged system with storage on 3x TrueNAS storage servers with a mixture of enterprise HDDs and SSDs, interconnected on a 2x 10GbE network backbone (with MLAG redundancy), which I use to cluster the hosts for bigger models when needed; the latency will still be high (a 200G fiber network with RDMA would be preferred), but it is fine for once-in-a-while testing.

Setting up your very own local LLMs or AI servers is really fun and opens up lots of possibilities. You can integrate with your smart home over the Home Assistant platform, run self-monitoring 3D-printing farms, run AI video/picture generators, or create useful AI automations using n8n, just to name a few. Have fun!
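As one building block for those automations, a plain HTTP call to a local model server is often all you need. A minimal sketch assuming Ollama's REST API on its default port; the model name and prompt are placeholders:

```python
import requests

reply = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",  # whichever model you've pulled (placeholder)
        "prompt": "Summarize today's 3D-printer error log: ...",
        "stream": False,    # one JSON object instead of a token stream
    },
    timeout=120,
)
print(reply.json()["response"])
```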
 
Last edited: