GPU Series | The nvidia-smi Command

2018-11-22  reallocing

Query GPU memory usage and total memory:

$ nvidia-smi --query-gpu=index,memory.used,memory.total --format=csv
index, memory.used [MiB], memory.total [MiB]
0, 10956 MiB, 11441 MiB
1, 0 MiB, 11441 MiB

$ nvidia-smi --query-gpu=index,memory.used,memory.total --format=csv,noheader,nounits
0, 10956, 11441
1, 0, 11441
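The `csv,noheader,nounits` form is convenient for scripting. As a minimal Python sketch (the helper name `pick_freest_gpu` is my own; the sample string reuses the output shown above, and in practice you would capture it with `subprocess.check_output`):

```python
# Pick the least-loaded GPU from the output of:
#   nvidia-smi --query-gpu=index,memory.used,memory.total --format=csv,noheader,nounits
# Sample output copied from the listing above.
sample = """\
0, 10956, 11441
1, 0, 11441
"""

def pick_freest_gpu(csv_text):
    """Return the index of the GPU with the most free memory (MiB)."""
    best_index, best_free = None, -1
    for line in csv_text.strip().splitlines():
        index, used, total = (int(field) for field in line.split(", "))
        free = total - used
        if free > best_free:
            best_index, best_free = index, free
    return best_index

print(pick_freest_gpu(sample))  # GPU 1 is idle in the sample, so this prints 1
```

The returned index can then be passed to CUDA_VISIBLE_DEVICES to pin a job to the idle GPU.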

To list certain details about each GPU, try:

$ nvidia-smi --query-gpu=index,name,uuid,serial --format=csv
index, name, uuid, serial
0, Tesla K40m, GPU-d0e093a0-c3b3-f458-5a55-6eb69fxxxxxx, 0323913xxxxxx
1, Tesla K40m, GPU-d105b085-7239-3871-43ef-975ecaxxxxxx, 0324214xxxxxx
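Because `--format=csv` emits a header row, the output parses cleanly with Python's `csv` module. A small sketch (the sample reuses the masked IDs above; `gpus` is my own variable name):

```python
import csv
import io

# Parse `nvidia-smi --query-gpu=index,name,uuid,serial --format=csv` output
# into one dict per GPU, keyed by GPU index. Sample based on the listing
# above (UUIDs and serials partially masked in the original).
sample = """\
index, name, uuid, serial
0, Tesla K40m, GPU-d0e093a0-c3b3-f458-5a55-6eb69fxxxxxx, 0323913xxxxxx
1, Tesla K40m, GPU-d105b085-7239-3871-43ef-975ecaxxxxxx, 0324214xxxxxx
"""

# skipinitialspace handles the ", "-separated fields nvidia-smi prints.
reader = csv.DictReader(io.StringIO(sample), skipinitialspace=True)
gpus = {int(row["index"]): row for row in reader}

print(gpus[0]["name"])  # Tesla K40m
```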

Query the VBIOS version of each device:

$ nvidia-smi --query-gpu=gpu_name,gpu_bus_id,vbios_version --format=csv
name, pci.bus_id, vbios_version
GRID K2, 0000:87:00.0, 80.04.D4.00.07
GRID K2, 0000:88:00.0, 80.04.D4.00.08

To continuously monitor a fuller set of fields, refreshing every 5 seconds with -l 5 (the original article showed the Query/Description table as an image):

$ nvidia-smi --query-gpu=timestamp,name,pci.bus_id,driver_version,pstate,pcie.link.gen.max,pcie.link.gen.current,temperature.gpu,utilization.gpu,utilization.memory,memory.total,memory.free,memory.used --format=csv -l 5

When adding additional parameters to a query, ensure that no spaces are added between the query options: the field list must be a single comma-separated token.
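One way to guarantee the no-spaces rule is to build the argument programmatically; a sketch (the field list is just the one from the example above, and `query_arg`/`cmd` are my own names):

```python
# Build a --query-gpu argument from a list of fields. Joining with "," (no
# space) guarantees the single-token form that nvidia-smi requires.
fields = [
    "timestamp", "name", "pci.bus_id", "driver_version", "pstate",
    "pcie.link.gen.max", "pcie.link.gen.current", "temperature.gpu",
    "utilization.gpu", "utilization.memory",
    "memory.total", "memory.free", "memory.used",
]

query_arg = "--query-gpu=" + ",".join(fields)
cmd = ["nvidia-smi", query_arg, "--format=csv", "-l", "5"]

print(" ".join(cmd))
```

The `cmd` list could then be handed to `subprocess.run(cmd)` on a machine with an NVIDIA driver installed.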


You can get a complete list of the query arguments by issuing:

$ nvidia-smi --help-query-gpu

