This is a more complicated question, but is there a feature for doing this? I just can't find any documentation on the topic. For example, my intended usage would be:
from google.colab import utils  # I made this up
colab_pro = utils.colab_is_pro()
if colab_pro:
    ...  # train model with higher settings
else:
    ...  # train model with lower settings
Currently I do have a way to do this, but it is rather cumbersome:
gpu_name = !nvidia-smi --query-gpu=gpu_name --format=csv
# You get a Tesla T4 with free Colab and faster GPUs with Colab Pro.
# Note: gpu_name is a list of output lines, so test the joined string,
# not list membership ('T4' in gpu_name would compare whole lines).
colab_pro = 'T4' not in str(gpu_name)
Posted on 2022-01-25 14:03:33
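The `!nvidia-smi` shell magic above only works inside IPython/Jupyter. A minimal sketch of the same idea in plain Python, using `subprocess` instead of the magic (the `free_tier_gpus` tuple is an assumption based on the answer's observation that the free tier hands out T4s):

```python
import subprocess

def gpu_is_pro_tier(free_tier_gpus=("T4", "K80")):
    """Guess Pro status from the GPU name reported by nvidia-smi.

    free_tier_gpus is an assumption: free Colab commonly assigns a
    Tesla T4 (historically also K80), while Pro tiers get faster cards.
    """
    try:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=gpu_name", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout
    except (FileNotFoundError, subprocess.CalledProcessError):
        return False  # no GPU runtime at all
    # Pro-tier if the reported GPU is none of the free-tier cards
    return not any(name in out for name in free_tier_gpus)
```

This is still a heuristic: it breaks whenever Google changes which GPUs each tier assigns.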
# A Colab Pro environment should have >20 GB of total memory.
from psutil import virtual_memory
ram_gb = virtual_memory().total / 1e9
print('Your runtime has {:.1f} gigabytes of available RAM\n'.format(ram_gb))
if ram_gb < 20:
    print('Not using a high-RAM runtime')
    # train model with lower settings
else:
    print('You are using a high-RAM runtime!')
    # train model with higher settings
Additionally, you can check the available memory in Colab like this:
!cat /proc/meminfo
A Colab Pro environment should have >20 GB of total memory.
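If you want the `/proc/meminfo` check in code rather than eyeballing the `!cat` output, a small sketch that parses the `MemTotal` field (which the kernel reports in kB) and converts it to GB:

```python
def mem_total_gb(path="/proc/meminfo"):
    """Return total system memory in GB, parsed from /proc/meminfo.

    MemTotal is reported in kB, so multiply by 1024 to get bytes,
    then divide by 1e9 for (decimal) gigabytes, matching the psutil
    snippet above.
    """
    with open(path) as f:
        for line in f:
            if line.startswith("MemTotal:"):
                return int(line.split()[1]) * 1024 / 1e9
    raise RuntimeError("MemTotal not found in " + path)
```

This reads the same number that `psutil.virtual_memory().total` reports, just without the extra dependency; it only works on Linux, which is what Colab runs.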
https://colab.research.google.com/notebooks/pro.ipynb#scrollTo=V1G82GuO-tez
https://stackoverflow.com/questions/63712602