Geekbench AI 1.0.0
wget https://cdn.geekbench.com/GeekbenchAI-1.0.0-Linux.tar.gz
tar xzvf GeekbenchAI-1.0.0-Linux.tar.gz
cd ~/GeekbenchAI-1.0.0-Linux/
./banff
Without cd-ing into the directory first, the binary fails to load its bundled ONNX Runtime library:

root@server4:~# GeekbenchAI-1.0.0-Linux/banff
GeekbenchAI-1.0.0-Linux/banff: error while loading shared libraries: libonnxruntime.so.1.18.1: cannot open shared object file: No such file or directory
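To see which shared objects a binary expects and which ones the loader cannot resolve, ldd is handy; shown here on a system binary as a stand-in for banff (a missing library like libonnxruntime.so.1.18.1 would be listed as "not found"):

```shell
# Print the shared libraries a binary links against; unresolved ones
# appear as "not found". /bin/ls stands in for the banff binary here.
ldd /bin/ls
```

With the real binary, either cd into GeekbenchAI-1.0.0-Linux/ before running ./banff (as above), or set LD_LIBRARY_PATH to the extracted directory so libonnxruntime.so.1.18.1 resolves, e.g. `LD_LIBRARY_PATH=GeekbenchAI-1.0.0-Linux GeekbenchAI-1.0.0-Linux/banff`.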
The test below was performed with Geekbench ML 0.6.0:
wget https://cdn.geekbench.com/GeekbenchML-0.6.0-Linux.tar.gz
tar xvf GeekbenchML-0.6.0-Linux.tar.gz
GeekbenchML-0.6.0-Linux/banff
Power Limits: 35/40, 55/55, 15/55

| Workload | Accuracy | Score | Score | Score | Score |
|---|---|---|---|---|---|
| Image Classification (F32) | 100% | 1048 (196.1 IPS) | 1061 (198.5 IPS) | 1060 (198.3 IPS) | 833 (155.8 IPS) |
| Image Classification (F16) | 100% | 1047 (195.9 IPS) | 1063 (198.8 IPS) | 1053 (197.1 IPS) | 895 (167.5 IPS) |
| Image Classification (I8) | 99% | 662 (123.9 IPS) | 656 (122.8 IPS) | 658 (123.2 IPS) | 524 (98.1 IPS) |
| Image Segmentation (F32) | 100% | 1354 (22.6 IPS) | 1345 (22.5 IPS) | 1323 (22.1 IPS) | 1171 (19.6 IPS) |
| Image Segmentation (F16) | 100% | 1352 (22.6 IPS) | 1341 (22.4 IPS) | 1319 (22.0 IPS) | 1171 (19.5 IPS) |
| Image Segmentation (I8) | 98% | 697 (11.6 IPS) | 695 (11.6 IPS) | 692 (11.6 IPS) | 697 (11.6 IPS) |
| Pose Estimation (F32) | 100% | 1914 (2.32 IPS) | 1939 (2.35 IPS) | 1904 (2.31 IPS) | 1490 (1.80 IPS) |
| Pose Estimation (F16) | 100% | 1916 (2.32 IPS) | 1932 (2.34 IPS) | 1901 (2.30 IPS) | 1413 (1.71 IPS) |
| Pose Estimation (I8) | 100% | 1634 (1.98 IPS) | 1632 (1.98 IPS) | 1632 (1.98 IPS) | 1433 (1.73 IPS) |
| Object Detection (F32) | 100% | 1127 (84.1 IPS) | 1019 (76.1 IPS) | 1130 (84.4 IPS) | 884 (66.0 IPS) |
| Object Detection (F16) | 100% | 1123 (83.9 IPS) | 1126 (84.1 IPS) | 1015 (75.7 IPS) | 1056 (78.9 IPS) |
| Object Detection (I8) | 65% | 734 (54.8 IPS) | 729 (54.5 IPS) | 730 (54.5 IPS) | 718 (53.6 IPS) |
| Face Detection (F32) | 100% | 2087 (24.8 IPS) | 2082 (24.8 IPS) | 2084 (24.8 IPS) | 1765 (21.0 IPS) |
| Face Detection (F16) | 100% | 2087 (24.8 IPS) | 2083 (24.8 IPS) | 747 (8.89 IPS) | 1779 (21.2 IPS) |
| Face Detection (I8) | 87% | 1598 (19.0 IPS) | 1598 (19.0 IPS) | 913 (10.9 IPS) | 1280 (15.2 IPS) |
| Depth Estimation (F32) | 100% | 1990 (15.4 IPS) | 2023 (15.7 IPS) | 1130 (8.76 IPS) | 1562 (12.1 IPS) |
| Depth Estimation (F16) | 100% | 1996 (15.5 IPS) | 2017 (15.6 IPS) | 978 (7.59 IPS) | 1617 (12.5 IPS) |
| Depth Estimation (I8) | 95% | 1545 (12.0 IPS) | 1548 (12.0 IPS) | 1089 (8.45 IPS) | 1461 (11.3 IPS) |
| Style Transfer (F32) | 100% | 2981 (3.92 IPS) | 3029 (3.98 IPS) | 1056 (1.39 IPS) | 2141 (2.82 IPS) |
| Style Transfer (F16) | 100% | 2987 (3.93 IPS) | 3027 (3.98 IPS) | 1002 (1.32 IPS) | 2191 (2.88 IPS) |
| Style Transfer (I8) | 98% | 3586 (4.72 IPS) | 3583 (4.71 IPS) | 1451 (1.91 IPS) | 2908 (3.83 IPS) |
| Image Super-Resolution (F32) | 100% | 1091 (39.0 IPS) | 1299 (46.4 IPS) | 1242 (44.3 IPS) | 1004 (35.9 IPS) |
| Image Super-Resolution (F16) | 100% | 1090 (38.9 IPS) | 1299 (46.4 IPS) | 1237 (44.2 IPS) | 1097 (39.2 IPS) |
| Image Super-Resolution (I8) | 98% | 822 (29.4 IPS) | 947 (33.8 IPS) | 906 (32.4 IPS) | 947 (33.8 IPS) |
| Text Classification (F32) | 100% | 861 (1.24 KIPS) | 984 (1.41 KIPS) | 979 (1.41 KIPS) | 859 (1.24 KIPS) |
| Text Classification (F16) | 100% | 855 (1.23 KIPS) | 858 (1.23 KIPS) | 971 (1.40 KIPS) | 858 (1.23 KIPS) |
| Text Classification (I8) | 92% | 315 (452.5 IPS) | 315 (452.1 IPS) | 446 (640.7 IPS) | 320 (460.0 IPS) |
| Machine Translation (F32) | 100% | 1823 (33.6 IPS) | 1831 (33.7 IPS) | 1139 (21.0 IPS) | 1338 (24.6 IPS) |
| Machine Translation (F16) | 100% | 1845 (34.0 IPS) | 1842 (33.9 IPS) | 523 (9.63 IPS) | 1545 (28.4 IPS) |
| Machine Translation (I8) | 62% | 607 (11.2 IPS) | 610 (11.2 IPS) | 338 (6.22 IPS) | 548 (10.1 IPS) |
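As a quick sanity check on the quantization penalty visible in the table, a one-liner comparing Image Classification F32 vs I8 throughput (numbers copied from the first result column above):

```shell
# F32 runs at 196.1 IPS, I8 at 123.9 IPS (first result column);
# compute the relative throughput drop from F32 to I8 with awk.
awk 'BEGIN { printf "I8 is %.0f%% slower than F32\n", (1 - 123.9/196.1) * 100 }'
```

The same pattern works for any pair of cells, e.g. comparing a column against the lowest-power-limit column to gauge how much throttling costs each workload.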