
I can't seem to clear the GPU memory after sending a single variable to the GPU.

import torch
tm = torch.Tensor([1,2]).to("cuda")
!nvidia-smi

|===============================+======================+======================|
|   0  GeForce RTX 208...  On   | 00000000:3D:00.0 Off |                  N/A |
|  0%   37C    P2    52W / 250W |    730MiB / 10989MiB |      0%      Default |

So that single tensor uses 730MiB... Now, no matter what I try, I cannot make the 730MiB go back to zero:

del tm
torch.cuda.empty_cache()
import sys; sys.modules[__name__].__dict__.clear()
%reset
Once deleted, variables cannot be recovered. Proceed (y/[n])? y
!nvidia-smi
|   0  GeForce RTX 208...  On   | 00000000:3D:00.0 Off |                  N/A |
|  0%   35C    P8     1W / 250W |    728MiB / 10989MiB |      0%      Default |
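
For what it's worth, PyTorch's own counters report that nothing is allocated or cached after the calls above, so the remaining ~728MiB is not held by any tensor. A minimal sketch using the standard torch.cuda.memory_allocated / torch.cuda.memory_reserved calls (not part of the original snippet):

import torch
tm = torch.Tensor([1, 2]).to("cuda")
del tm
torch.cuda.empty_cache()
# Both counters only cover memory managed by PyTorch's caching allocator,
# not the CUDA context that nvidia-smi includes in its total.
print(torch.cuda.memory_allocated())  # expected: 0
print(torch.cuda.memory_reserved())   # expected: 0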

I would be happy to hear any suggestions. Thanks!

Oren

1 Answer


OK, it is not possible: this memory is taken by the CUDA driver context that PyTorch creates on first use of the GPU, and it cannot be released. I've opened an issue on the PyTorch GitHub: https://github.com/pytorch/pytorch/issues/37664
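
The CUDA context lives for the lifetime of the Python process, so the only way I know of to get that memory back without restarting the kernel is to do the GPU work in a separate process and let its context be torn down on exit. A sketch (this workaround is not from the issue above; it assumes torch.multiprocessing with the 'spawn' start method, which is the standard way to combine CUDA with subprocesses):

import torch
import torch.multiprocessing as mp

def gpu_work():
    # The first CUDA call in the child process creates its own context.
    tm = torch.Tensor([1, 2]).to("cuda")
    print(torch.cuda.memory_allocated())  # bytes held by tensors, not by the context

if __name__ == "__main__":
    ctx = mp.get_context("spawn")  # CUDA requires the 'spawn' start method
    p = ctx.Process(target=gpu_work)
    p.start()
    p.join()  # the child's CUDA context (and its ~700MiB) is released when it exits

After the child exits, nvidia-smi shows the GPU memory back at its baseline, while the parent process never touches CUDA at all.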

Oren