>>87772754
Yeah, I guess it depends on whether that data finds its way into illust/noob or a popular finetune eventually. But getting a good feel for what works best when training for new models is going to be worthwhile even if the loras you bake now become redundant; there's always the next project.