>>44541329
>gonna let the rest of these finish
OutOfMemoryError: CUDA out of memory. Tried to allocate 1.69 GiB (GPU 0; 14.75 GiB total capacity; 10.16 GiB already allocated; 892.81 MiB free; 12.58 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
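Since reserved (12.58 GiB) is well above allocated (10.16 GiB) here, the allocator-fragmentation workaround the message points to may actually help. A minimal sketch of setting `PYTORCH_CUDA_ALLOC_CONF` before torch initializes its CUDA allocator — the `128` MiB split size is an illustrative value, not a recommendation:

```python
import os

# Must be set BEFORE importing torch (or at least before any CUDA
# allocation), otherwise the allocator ignores it.
# max_split_size_mb caps the largest block the caching allocator will
# split, which can reduce fragmentation; 128 is an illustrative value.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

# import torch  # import only after the env var is set
```

Equivalently, export the variable in the shell before launching the script; lowering batch size or calling `torch.cuda.empty_cache()` between generations are the usual fallbacks if fragmentation isn't the real problem.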