>>90234364
Okay, so I think downsizing the lora does too much harm to warrant it. Here's a comparison of the lhc_3 lora, a resize to 64 dims, base Noob_vpred and the full model.
https://litter.catbox.moe/5dmqtx.jpeg
Honestly surprising how good the lora extract is for many chuubas. Could also do 128 dims, but at that point we're talking about less than a third of its size in reduction, and I really don't think it makes any difference whether it's 1gb or 1.5gb.
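For anyone curious why dropping dims hurts: a resize is basically an SVD truncation of each LoRA pair's weight delta, so everything past the new rank gets thrown away. A minimal numpy sketch of the idea (not the actual resize script, and `resize_lora_pair` is a made-up name for illustration):

```python
import numpy as np

def resize_lora_pair(down, up, new_rank):
    """Truncate a LoRA delta (up @ down) to a smaller rank via SVD.

    down: (rank, in_features), up: (out_features, rank).
    Returns new (down, up) factors with rank `new_rank`.
    """
    delta = up @ down  # reconstruct the full weight delta
    u, s, vh = np.linalg.svd(delta, full_matrices=False)
    u, s, vh = u[:, :new_rank], s[:new_rank], vh[:new_rank]
    # split the singular values evenly between the two factors
    new_up = u * np.sqrt(s)
    new_down = np.sqrt(s)[:, None] * vh
    return new_down, new_up

# toy example: shrink a rank-128 pair down to 64 dims
rng = np.random.default_rng(0)
down = rng.standard_normal((128, 320)).astype(np.float32)
up = rng.standard_normal((640, 128)).astype(np.float32)
nd, nu = resize_lora_pair(down, up, 64)
print(nd.shape, nu.shape)  # (64, 320) (640, 64)
```

The dropped singular values are exactly the detail that gets lost, which is why small chuuba-specific quirks (outfits, halos) suffer first.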
>>90236545
Agreed, dooby thighs were necessary (got a bit carried away, so more than just thighs below)
https://files.catbox.moe/ddaejr.png
https://files.catbox.moe/i3qhq7.png
https://files.catbox.moe/l0u9lx.png
https://files.catbox.moe/3ierqf.png
https://files.catbox.moe/0q3p8g.png
https://files.catbox.moe/3lk33b.png
https://files.catbox.moe/ub7vkw.png
https://files.catbox.moe/cwcmq8.png
>>90235975
Thanks anon. Already did one myself on Illustrious, but more can't hurt. What model is this one trained on?
>>90245830
Awesome. Thanks anon!
>128 is the sweetspot for rouwei
Yeah, seems they need pretty big extracts to be good.
>>90253517
>>90255602
Dang, sad to hear it's messing with other loras, but it's looking really good nonetheless. Nice work
Guess it was inevitable for the extract to be bad at the triangle halo when even the model has issues with it
>>90268534
Nta, but honestly, for most basic and even intermediate merging I think mecha-nodes are overkill and simple/block merging is usually enough, and those are just one node in comfy. Not sure what Supermerger can do, but I think those will handle most normal merging.
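For context, the "simple" merge those single comfy nodes do is just a per-tensor weighted sum, and block merging only changes which weight applies to which part of the model. A rough sketch of both, assuming state dicts with matching keys (the key prefixes and function names here are made up for illustration):

```python
import numpy as np

def simple_merge(sd_a, sd_b, alpha=0.5):
    """Plain weighted sum of two state dicts: (1 - alpha)*A + alpha*B."""
    return {k: (1 - alpha) * sd_a[k] + alpha * sd_b[k] for k in sd_a}

def block_merge(sd_a, sd_b, block_alphas, default=0.5):
    """Same idea, but pick alpha by the first matching key prefix."""
    out = {}
    for k, v in sd_a.items():
        alpha = default
        for prefix, a in block_alphas.items():
            if k.startswith(prefix):
                alpha = a
                break
        out[k] = (1 - alpha) * v + alpha * sd_b[k]
    return out

# toy example with made-up keys
a = {"input_blocks.0.w": np.ones(2), "out.w": np.ones(2)}
b = {"input_blocks.0.w": np.zeros(2), "out.w": np.zeros(2)}
merged = block_merge(a, b, {"input_blocks.0": 0.25})
print(merged["input_blocks.0.w"])  # [0.75 0.75]
```

That's the whole trick; the fancier mecha-style nodes mostly add things like per-block curves and more exotic interpolation methods on top of this.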
>>90281590
What a handsome and cool Lui. I want her to save me from distress. Also, that environment is insanely pretty. Could I ask for a catbox?