If anyone can help me figure out what to do with this out of memory error:
>OutOfMemoryError: CUDA out of memory. Tried to allocate 4.00 GiB (GPU 0; 8.00 GiB total capacity; 6.13 GiB already allocated; 111.02 MiB free; 6.31 GiB reserved in total by PyTorch)
>If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation.
I'd be very happy. I'm assuming "max_split_size_mb" is a setting somewhere, but I can't find it.
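From what I can tell, max_split_size_mb isn't a setting in the UI at all; it's an option inside PyTorch's PYTORCH_CUDA_ALLOC_CONF environment variable, and it has to be set before torch initializes CUDA. A minimal sketch of what I mean, assuming you launch Python yourself (the 512 is just an example value to experiment with):

# Sketch only: set the allocator config before torch touches CUDA.
# max_split_size_mb:512 is an example value, not a recommendation.
import os
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:512"

import torch
print(torch.cuda.memory_allocated() / 2**20, "MiB allocated")
print(torch.cuda.memory_reserved() / 2**20, "MiB reserved")

If you're on the A1111 webui (just my assumption), the equivalent would presumably be a line like set PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:512 near the top of webui-user.bat, before the launch call.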
It also perplexes me that PyTorch is using 6 GB to run a webUI that's doing nothing; the 6 GB is in use even while it just sits idle in the UI.
Have this Noel butt-stuff mini series I proompted a while back as some appreciation for any help.
https://files.catbox.moe/3amipm.png
https://files.catbox.moe/h9ydxa.png
https://files.catbox.moe/pkhfvp.png
https://files.catbox.moe/k1vr81.png
https://files.catbox.moe/87gify.png
https://files.catbox.moe/fx45ub.png