Alright, I tried retraining a LoRA with the same settings, and it looks like the update did fuck with the results: the loss was lower, but the outputs didn't come out that different, so the gap probably isn't as big as I first thought. Let's hope the fix is as easy as lowering the learning rate. Here's a grid:
https://files.catbox.moe/nad8tw.png
The new version isn't all that much worse, but I still think the original looks better.
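
For anyone who wants to poke at the same thing, here's a rough sketch of the kind of learning-rate drop I mean, written against a bare-bones torch LoRA layer. The rank, alpha, and both learning rates are made-up placeholder numbers, not my actual trainer or config:

import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    # frozen base weight plus a low-rank trainable update: W x + scale * B A x
    def __init__(self, features, rank=8, alpha=16):
        super().__init__()
        self.base = nn.Linear(features, features, bias=False)
        self.base.weight.requires_grad_(False)                # base stays frozen
        self.lora_a = nn.Parameter(torch.randn(rank, features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(features, rank))
        self.scale = alpha / rank

    def forward(self, x):
        return self.base(x) + (x @ self.lora_a.T @ self.lora_b.T) * self.scale

layer = LoRALinear(64)

old_lr = 1e-4          # placeholder for whatever the earlier runs used
new_lr = old_lr / 2    # the "just lower the LR" attempt

optimizer = torch.optim.AdamW(
    [p for p in layer.parameters() if p.requires_grad],  # only the LoRA matrices train
    lr=new_lr,
)

Everything else stays identical between runs, so whatever difference is left in the grid should be down to the update itself rather than the settings.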