Stable diffusion thread

x1243x

Arch-Supremacy Member
Joined
Oct 29, 2004
Messages
11,689
Reaction score
3,597
ask the seasoned users here. what's a good lora for animals? my animals look very odd
 

zheng

Banned
Joined
Dec 20, 2001
Messages
52,689
Reaction score
48,715
ask the seasoned users here. what's a good lora for animals? my animals look very odd
sd with animals is just horrible in general... except for cats :s13:

midjourney v5 did a great job for some animals
 

zheng

Banned
Joined
Dec 20, 2001
Messages
52,689
Reaction score
48,715
Now I need to generate all the background images before the Colab free period ends.
Using common Taiwanese food LoRA.
bg easy... can just copy from the web :crazy:
 

doogyhatts

Arch-Supremacy Member
Joined
Feb 13, 2018
Messages
12,320
Reaction score
3,362
The gradio link does not appear anymore. So I think Colab has ended its support for SD web-ui.
 

x1243x

Arch-Supremacy Member
Joined
Oct 29, 2004
Messages
11,689
Reaction score
3,597
nvidia gpu with 8gb vram is enough... 12gb vram is recommended... 16gb vram even better... 24gb vram better still if u wanna do your own training
haha.. I want to try my hand at training but 24gb card is way too expensive for a hobby.. what's a good 12gb card?
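The VRAM rule of thumb quoted above can be sketched as a toy helper. The tiers are just the poster's suggestions, not official Stable Diffusion requirements:

```python
# Toy summary of the VRAM rule of thumb quoted above.
# Tiers are the post's suggestions, not official requirements.
def sd_fit(vram_gb: float) -> str:
    """Return what a given amount of VRAM is good for, per the post."""
    if vram_gb >= 24:
        return "comfortable for training your own models"
    if vram_gb >= 16:
        return "even better for generation, light training possible"
    if vram_gb >= 12:
        return "recommended for generation"
    if vram_gb >= 8:
        return "minimum for generation"
    return "below the suggested minimum"

print(sd_fit(12))  # recommended for generation
```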
 

project_00

Master Member
Joined
Jan 3, 2007
Messages
3,072
Reaction score
1,942
haha.. I want to try my hand at training but 24gb card is way too expensive for a hobby.. what's a good 12gb card?
You can get a 3060 12GB for about $500; I just got one at the start of the year. If you have the budget, you can try the 3080 Ti.
 

project_00

Master Member
Joined
Jan 3, 2007
Messages
3,072
Reaction score
1,942
I see.. but performance-wise the 3060 Ti is better than the 3060, right?

The 3060 Ti only has 8GB. From what I see on reddit, VRAM is king for SD. There is a tradeoff between slightly faster generation time and being able to run larger batches. I would go for higher VRAM, because it is very annoying to restart generation when VRAM is exhausted.
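The "restart generation when VRAM is exhausted" annoyance can be softened in scripts by halving the batch size on an out-of-memory error instead of starting over. A minimal sketch, where `generate` is a hypothetical stand-in for a real SD pipeline call:

```python
# Sketch: retry with a smaller batch on OOM instead of restarting from scratch.
# `generate` is a hypothetical stand-in for a real SD pipeline call.

def generate(prompt: str, batch_size: int) -> list:
    # Hypothetical: pretend anything above 4 images per batch exhausts VRAM.
    if batch_size > 4:
        raise MemoryError("CUDA out of memory (simulated)")
    return [f"{prompt} #{i}" for i in range(batch_size)]

def generate_with_fallback(prompt: str, batch_size: int) -> list:
    while batch_size >= 1:
        try:
            return generate(prompt, batch_size)
        except MemoryError:
            batch_size //= 2  # halve the batch and retry
    raise RuntimeError("even batch size 1 ran out of VRAM")

print(len(generate_with_fallback("a cat", 8)))  # 4
```

With a real pipeline (e.g. diffusers), the same loop would catch the library's out-of-memory exception instead of `MemoryError`.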
 

x1243x

Arch-Supremacy Member
Joined
Oct 29, 2004
Messages
11,689
Reaction score
3,597
The 3060 Ti only has 8GB. From what I see on reddit, VRAM is king for SD. There is a tradeoff between slightly faster generation time and being able to run larger batches. I would go for higher VRAM, because it is very annoying to restart generation when VRAM is exhausted.
I see.. thank you for the explanation.. just thinking that long term I may be gaming more than doing ai pictures that's all.. haha
 

doogyhatts

Arch-Supremacy Member
Joined
Feb 13, 2018
Messages
12,320
Reaction score
3,362
The gradio link does not appear anymore. So I think Colab has ended its support for SD web-ui.
Looks like gradio was temporarily down just now.
I managed to get it to work again by changing remote-moe to cloudflared.
 