OutOfMemoryError: CUDA out of memory [while fine-tuning a GPT-2 model]

I’m trying to fine-tune a GPT-2 model for a content-removal use case.

During training I hit the error below:

```
OutOfMemoryError: CUDA out of memory. Tried to allocate 982.00 MiB (GPU 0; 14.75 GiB total capacity; 13.36 GiB already allocated; 272.81 MiB free; 13.43 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
```
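The error message itself suggests `max_split_size_mb`, which can be set through the `PYTORCH_CUDA_ALLOC_CONF` environment variable before the process starts. The value 128 below is just an illustrative starting point, not a recommendation from the error:

```shell
# Must be set before Python/PyTorch launches; 128 MiB is an assumed example value.
export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128
python train.py  # train.py is a placeholder for the fine-tuning script
```

This only helps when memory is fragmented (reserved >> allocated); it won’t create memory that isn’t there.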

I have already tried restarting the runtime and calling functions such as `torch.cuda.empty_cache()`, but the error persists.

Can anyone help me with this?
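For reference, one pattern I’m considering is gradient accumulation: run smaller micro-batches that fit in GPU memory and accumulate gradients so the effective batch size stays the same. A minimal sketch of the idea (a toy `nn.Linear` stands in for GPT-2 so it runs anywhere; all names and sizes are illustrative):

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)  # toy stand-in for GPT-2
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

micro_batch = 2   # what fits in GPU memory at once
accum_steps = 4   # effective batch size = micro_batch * accum_steps

# Dummy data: accum_steps micro-batches of (inputs, targets).
data = [(torch.randn(micro_batch, 10), torch.randn(micro_batch, 1))
        for _ in range(accum_steps)]

optimizer.zero_grad()
for step, (x, y) in enumerate(data):
    loss = loss_fn(model(x), y) / accum_steps  # scale so gradients average out
    loss.backward()                            # gradients accumulate in .grad
    if (step + 1) % accum_steps == 0:
        optimizer.step()                       # one update per effective batch
        optimizer.zero_grad()

print(loss.item())
```

With Hugging Face `Trainer` the same effect comes from lowering `per_device_train_batch_size` and raising `gradient_accumulation_steps` in `TrainingArguments`; `fp16=True` and `gradient_checkpointing=True` can reduce memory further.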

