>>71234064
>>71234171
That json is pretty fucked.
It trains on clip skip 2 as if it were SD 1.5, assumes everyone has an Nvidia card with xformers and enough VRAM for batch size 2, and uses 75 token length and 64 network dim for some reason. That'd be like a 400-800 MB lora just for a character, which is ridiculous.
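For reference, roughly the fields I'd change, assuming it's a kohya-style config (field names are from sd-scripts' options, so double-check them against whatever frontend that json actually came from, and the exact dim/alpha is taste):

```json
{
  "clip_skip": 1,
  "max_token_length": 225,
  "train_batch_size": 1,
  "network_dim": 16,
  "network_alpha": 8,
  "xformers": false
}
```

clip skip 1 unless you're actually on a 1.5 anime mix, batch 1 so it fits on normal VRAM, and dim 16 lands a character lora in the tens of MB instead of hundreds. Leave xformers off (or swap in sdpa) if you're not on Nvidia.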