Basic requirements for running all TensorFlow models on Windows include:

- Install the Python 3.9+ 64-bit release for Windows, and add it to your system `%PATH%` environment variable.
- Download and install the Microsoft Visual C++ 2022 Redistributable.
- Install MSYS2. If MSYS2 is installed to `C:\msys64`, add `C:\msys64\usr\bin` to your `%PATH%` environment variable. Then, using `cmd.exe`, run:

      pacman -S git patch unzip

- Install `intel-tensorflow`.
- Set the `MSYS64_BASH=C:\msys64\usr\bin\bash.exe` environment variable on your system. The path may differ depending on where you installed MSYS2.
- Install the common model dependencies:
  - python-tk
  - libsm6
  - libxext6
  - requests
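The checklist above can be sketched as a small verification script. This is an illustrative helper of this document's own devising, not part of any official tooling; it assumes, as described above, that MSYS2's tools are on `PATH` and that `MSYS64_BASH` is set:

```python
# Illustrative prerequisite check for the setup steps above; the function
# name and messages are this sketch's own, not an official utility.
import os
import shutil
import sys

def check_windows_prereqs():
    """Return a list of setup problems; an empty list means all checks passed."""
    issues = []
    # Python 3.9+ is required
    if sys.version_info < (3, 9):
        issues.append("Python 3.9+ is required")
    # git, patch, and unzip come from: pacman -S git patch unzip
    for tool in ("git", "patch", "unzip"):
        if shutil.which(tool) is None:
            issues.append(f"'{tool}' not found on PATH (pacman -S git patch unzip)")
    # MSYS64_BASH must point at the MSYS2 bash executable
    bash = os.environ.get("MSYS64_BASH")
    if not bash:
        issues.append("MSYS64_BASH environment variable is not set")
    elif not os.path.isfile(bash):
        issues.append(f"MSYS64_BASH points to a missing file: {bash}")
    return issues

print(check_windows_prereqs())
```

Running the script on a correctly configured machine prints an empty list; otherwise each entry names a missing prerequisite.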
Individual models may have additional dependencies that need to be installed before running them. Please follow the instructions in each model's documentation.
The following models have been tested on Windows. Check each model's instructions, linked in the Model Documentation column, for the available precisions.
Note that on Windows systems, all system cores will be used by default. Users of Windows desktops and laptops are strongly encouraged to instead use the batch file provided here, which opens a Windows command prompt pre-configured with optimized settings to achieve high AI workload performance on Intel hardware (e.g. Tiger Lake and Alder Lake) for image recognition models.
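The kind of threading settings such a batch file applies can be sketched in Python with standard TensorFlow and Intel OpenMP environment variables. The specific values below (4 physical cores) are illustrative only, and this helper is not the batch file itself:

```python
# A minimal sketch of CPU threading settings in the spirit of the optimized
# batch file; adjust physical_cores to your machine. These are standard
# TensorFlow / Intel OpenMP environment variables.
import os

def set_cpu_thread_env(physical_cores):
    """Set threading variables; must run BEFORE TensorFlow is imported,
    since TF reads these at startup."""
    settings = {
        "OMP_NUM_THREADS": str(physical_cores),        # OpenMP worker threads
        "TF_NUM_INTRAOP_THREADS": str(physical_cores), # threads within one TF op
        "TF_NUM_INTEROP_THREADS": "1",                 # concurrently running TF ops
        "KMP_BLOCKTIME": "1",                          # ms a thread spins after work
        "KMP_AFFINITY": "granularity=fine,compact,1,0",
    }
    os.environ.update(settings)
    return settings

env = set_cpu_thread_env(4)
```

Setting `TF_NUM_INTEROP_THREADS` to 1 while giving `TF_NUM_INTRAOP_THREADS` all physical cores is a common starting point for CPU inference; tune from there for your workload.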
Use Case | Model | Mode | Model Documentation |
---|---|---|---|
Image Recognition | DenseNet169 | Inference | FP32 |
Image Recognition | Inception V3 | Inference | Int8 FP32 |
Image Recognition | Inception V4 | Inference | Int8 FP32 |
Image Recognition | MobileNet V1* | Inference | Int8 FP32 |
Image Recognition | ResNet 101 | Inference | Int8 FP32 |
Image Recognition | ResNet 50 | Inference | Int8 FP32 |
Image Recognition | ResNet 50v1.5 | Inference | Int8 FP32 |
Image Segmentation | 3D U-Net MLPerf | Inference | FP32 BFloat16 |
Language Modeling | BERT | Inference | FP32 |
Language Translation | BERT | Inference | FP32 |
Language Translation | Transformer_LT_Official | Inference | FP32 |
Object Detection | R-FCN | Inference | Int8 FP32 |
Object Detection | SSD-MobileNet* | Inference | Int8 FP32 |
Object Detection | SSD-ResNet34* | Inference | Int8 FP32 |
Recommendation | DIEN | Inference | FP32 |
Recommendation | Wide & Deep | Inference | FP32 |
Intel® Extension for PyTorch is currently not supported on Windows.
Install PyTorch:

    pip install torch torchvision
The following models have been tested on Windows. Check each model's instructions, linked in the Model Documentation column, for the available precisions.
Use Case | Model | Mode | Model Documentation |
---|---|---|---|
Image Recognition | GoogLeNet | Inference | FP32 |
Image Recognition | Inception v3 | Inference | FP32 |
Image Recognition | MNASNet 0.5 | Inference | FP32 |
Image Recognition | MNASNet 1.0 | Inference | FP32 |
Image Recognition | ResNet 50 | Inference | FP32 BFloat16 |
Image Recognition | ResNet 101 | Inference | FP32 |
Image Recognition | ResNet 152 | Inference | FP32 |
Image Recognition | ResNext 32x4d | Inference | FP32 |
Image Recognition | ResNext 32x16d | Inference | FP32 BFloat16 |
Image Recognition | VGG-11 | Inference | FP32 |
Image Recognition | VGG-11 with batch normalization | Inference | FP32 |
Image Recognition | Wide ResNet-50-2 | Inference | FP32 |
Image Recognition | Wide ResNet-101-2 | Inference | FP32 |
Language Modeling | T5 | Inference | FP32 Int8** |
Object Detection | Faster R-CNN ResNet50 FPN | Inference | FP32 |
Object Detection | Mask R-CNN | Inference | FP32 |
Object Detection | Mask R-CNN ResNet50 FPN | Inference | FP32 |
Object Detection | RetinaNet ResNet-50 FPN | Inference | FP32 |
Shot Boundary Detection | TransNetV2 | Inference | FP32 |