> ask the seasoned users here. what's a good lora for animals? my animals look very odd
SD with animals is just horrible in general... except for cats
> Now I need to generate all the background images before the Colab free period ends.
bg easy... can just copy from the web
Using a common Taiwanese food LoRA.

> SD with animals is just horrible in general... except for cats
haha ok so it's nothing to do with the prompts..
Midjourney v5 did a great job with some animals.

Eye blinking test with only depth animation. LeiaPix + CapCut.
Clone stamping the closed eyes took a rather long time for 10 images.

> The gradio link does not appear anymore. So I think Colab has ended its support for the SD web UI.
time to buy a gpu
> time to buy a gpu
I will spend all my accumulated points on Pixai.art first before buying a new GPU.
> I will spend all my accumulated points on Pixai.art first before buying a new GPU.
would a 3060 Ti be ok?
> would a 3060 Ti be ok?
an nvidia gpu with 8gb vram is enough... 12gb vram is recommended... 16gb vram even better... 24gb vram better still if you wanna do your own training
> an nvidia gpu with 8gb vram is enough... 12gb vram is recommended... 16gb vram even better... 24gb vram better still if you wanna do your own training
haha.. I want to try my hand at training but a 24gb card is way too expensive for a hobby.. what's a good 12gb card?
> haha.. I want to try my hand at training but a 24gb card is way too expensive for a hobby.. what's a good 12gb card?
You can get a 3060 12GB for about $500; I just got one at the start of the year. If you have the budget, you can try a 3080 Ti.
> You can get a 3060 12GB for about $500; I just got one at the start of the year. If you have the budget, you can try a 3080 Ti.
I see.. but performance-wise the 3060 Ti is better than the 3060, right?
> zheng bro can teach how to use stable diffusion?
That works too, huh?
> The 3060 Ti only has 8GB. From what I see on Reddit, VRAM is king for SD. There is a trade-off between slightly faster generation time and being able to run more batches. I would go for higher VRAM, because it is very annoying to restart generation when VRAM is exhausted.
I see.. thank you for the explanation.. just thinking that long term I may be gaming more than doing AI pictures, that's all.. haha
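The VRAM-vs-batch trade-off above can be sketched as a rough back-of-envelope calculation. The numbers here (model footprint, per-image cost, headroom) are illustrative assumptions, not measured figures; actual usage varies with resolution, model, and optimizations like xformers:

```python
def max_batch_size(vram_gb, model_gb=4.0, per_image_gb=1.5, headroom_gb=0.5):
    """Rough estimate of how many images fit in one SD batch.

    All constants are hypothetical ballpark values for illustration:
    model_gb     - VRAM held by the loaded model/weights
    per_image_gb - extra VRAM consumed per image in the batch
    headroom_gb  - margin left free to avoid out-of-memory restarts
    """
    usable = vram_gb - model_gb - headroom_gb
    return max(0, int(usable // per_image_gb))

print(max_batch_size(8))   # 8GB card (3060 Ti class)
print(max_batch_size(12))  # 3060 12GB
print(max_batch_size(24))  # 24GB card for training headroom
```

Under these assumed numbers the 12GB card more than doubles the batch size of the 8GB one, which is the "VRAM is king" point: the extra capacity buys throughput and fewer OOM restarts, even if the 3060 Ti is faster per step.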
> The gradio link does not appear anymore. So I think Colab has ended its support for the SD web UI.
Looks like gradio was temporarily down just now.
> would a 3060 Ti be ok?
I checked the price list from Fuwell just now. About $500 for an ASUS RTX 3060 with 12GB VRAM.