>>48237337
>unet_lr: 5e-4
>text_encoder_lr: 1e-4
I did try adjusting these, but since I didn't know what I was doing it probably didn't help on whichever version of the lora it was, so I've left them at those defaults ever since.
>>48237379
>What are the settings you are using?
Uh... I don't really know how to explain it all, but I'm training at 768 resolution. I've played around with different repeats and epochs; the last run was 8 repeats and 16 epochs, but only because I saw some talk about fewer repeats + more epochs being better for small datasets. The lr scheduler is cosine_with_restarts with the restart number set to 3.
>how big is the dataset?
Around 23 images, after culling the dataset a few times.
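For anyone who wants to reproduce this: assuming it's a kohya-ss sd-scripts setup (the post never says which trainer, and the paths/model/network flags below are placeholders, not from the post), the settings above would land in a train_network.py call roughly like this:

```shell
# sketch of a kohya-ss sd-scripts LoRA run using the settings from this post;
# base model, paths, and network module are placeholder assumptions
accelerate launch train_network.py \
  --pretrained_model_name_or_path="/path/to/base_model.safetensors" \
  --train_data_dir="/path/to/dataset" \
  --output_dir="/path/to/output" \
  --network_module=networks.lora \
  --resolution=768,768 \
  --unet_lr=5e-4 \
  --text_encoder_lr=1e-4 \
  --lr_scheduler=cosine_with_restarts \
  --lr_scheduler_num_cycles=3 \
  --max_train_epochs=16
```

The 8 repeats aren't a flag in this layout: sd-scripts reads them from the image subfolder name inside train_data_dir, e.g. a folder named "8_concept" repeats that folder's ~23 images 8 times per epoch.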