>>2865070
Looks like fun, but that's a lot to gen
>>2865072
>requirements for SDXL LoRA training are 24GB VRAM
I have a 3090 and I train using ezscripts locally.
Training with batch size 2, gradient checkpointing, cached latents, and xformers, using the AdamW 8-bit optimizer, takes around 11.24 GB of VRAM for me.
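For reference, a minimal sketch of what that setup looks like as a raw kohya sd-scripts invocation (which the easy-script frontends wrap). The paths, network dim, learning rate, and step count here are placeholders, not my exact settings:

accelerate launch sdxl_train_network.py \
  --pretrained_model_name_or_path=/path/to/sdxl_base.safetensors \
  --train_data_dir=/path/to/dataset \
  --output_dir=/path/to/output \
  --network_module=networks.lora \
  --network_dim=32 --network_alpha=16 \
  --resolution=1024,1024 \
  --train_batch_size=2 \
  --gradient_checkpointing \
  --cache_latents \
  --xformers \
  --optimizer_type=AdamW8bit \
  --mixed_precision=bf16 \
  --learning_rate=1e-4 \
  --max_train_steps=2000 \
  --save_model_as=safetensors

The 8-bit optimizer (via bitsandbytes) plus gradient checkpointing and cached latents is what keeps it comfortably under 12 GB instead of needing the full 24.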