Upload Date | May 08 2024 08:46 AM |
System Information | |
---|---|
Operating System | iOS 16.7.2 |
Model | iPhone 11 Pro Max |
Model ID | iPhone12,5 |
Motherboard | D431AP |
CPU Information | |
---|---|
Name | Apple A13 Bionic |
Topology | 1 Processor, 6 Cores |
Identifier | ARM |
Base Frequency | 2.66 GHz |
Cluster 1 | 2 Cores |
Cluster 2 | 4 Cores |
L1 Instruction Cache | 96.0 KB x 1 |
L1 Data Cache | 48.0 KB x 1 |
L2 Cache | 4.00 MB x 1 |
Memory Information | |
---|---|
Size | 3.66 GB |
Inference Information | |
---|---|
Framework | Core ML |
Backend | CPU |
Device | Default |
Workload | Accuracy | Score | Throughput |
---|---|---|---|
Image Classification (F32) | 100% | 531 | 99.4 IPS |
Image Classification (F16) | 100% | 642 | 120.2 IPS |
Image Classification (I8) | 99% | 637 | 119.2 IPS |
Image Segmentation (F32) | 100% | 425 | 7.09 IPS |
Image Segmentation (F16) | 100% | 423 | 7.07 IPS |
Image Segmentation (I8) | 100% | 421 | 7.03 IPS |
Pose Estimation (F32) | 100% | 3246 | 3.93 IPS |
Pose Estimation (F16) | 100% | 5506 | 6.67 IPS |
Pose Estimation (I8) | 100% | 5518 | 6.68 IPS |
Object Detection (F32) | 100% | 561 | 41.9 IPS |
Object Detection (F16) | 100% | 719 | 53.7 IPS |
Object Detection (I8) | 97% | 561 | 41.9 IPS |
Face Detection (F32) | 100% | 1102 | 13.1 IPS |
Face Detection (F16) | 100% | 1143 | 13.6 IPS |
Face Detection (I8) | 99% | 1141 | 13.6 IPS |
Depth Estimation (F32) | 100% | 1973 | 15.3 IPS |
Depth Estimation (F16) | 99% | 2537 | 19.7 IPS |
Depth Estimation (I8) | 99% | 2530 | 19.6 IPS |
Style Transfer (F32) | 100% | 5602 | 7.37 IPS |
Style Transfer (F16) | 100% | 7654 | 10.1 IPS |
Style Transfer (I8) | 100% | 7653 | 10.1 IPS |
Image Super-Resolution (F32) | 100% | 1663 | 59.4 IPS |
Image Super-Resolution (F16) | 100% | 1731 | 61.8 IPS |
Image Super-Resolution (I8) | 100% | 1731 | 61.8 IPS |
Text Classification (F32) | 100% | 643 | 923.7 IPS |
Text Classification (F16) | 100% | 401 | 576.4 IPS |
Text Classification (I8) | 96% | 413 | 594.0 IPS |
Machine Translation (F32) | 100% | 752 | 13.8 IPS |
Machine Translation (F16) | 99% | 1208 | 22.2 IPS |
Machine Translation (I8) | 99% | 735 | 13.5 IPS |
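The per-precision scores lend themselves to a quick comparison. As a hedged sketch (the score pairs below are copied from the table; the `scores` dict and its layout are this example's own convention, not part of the benchmark output), a few lines of Python can show how much each workload gains, or loses, when dropping from F32 to F16 on this CPU backend:

```python
# (F32 score, F16 score) pairs for each workload, taken from the
# results table above (Core ML, CPU backend, iPhone 11 Pro Max).
scores = {
    "Image Classification":   (531, 642),
    "Image Segmentation":     (425, 423),
    "Pose Estimation":        (3246, 5506),
    "Object Detection":       (561, 719),
    "Face Detection":         (1102, 1143),
    "Depth Estimation":       (1973, 2537),
    "Style Transfer":         (5602, 7654),
    "Image Super-Resolution": (1663, 1731),
    "Text Classification":    (643, 401),
    "Machine Translation":    (752, 1208),
}

# Print the F16-over-F32 speedup ratio per workload; values above
# 1.00x mean half precision was faster on this run.
for name, (f32, f16) in scores.items():
    print(f"{name:24s} F16/F32: {f16 / f32:.2f}x")
```

Note that the gains are uneven: Pose Estimation and Machine Translation improve markedly at F16, while Text Classification is actually faster at F32 on this device.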