[DOCS] Update tutorials for master (openvinotoolkit#19307)
* update-160823

* fixes

* fix-toc-headings

* fix-headings

* fix

* fix-headings

* fix

* fix-headings

* fixes

* Update 220-cross-lingual-books-alignment-with-output.rst

* fixes

* fix

* fix-toc-headings

* fix-headings

* fix toc

* fix toc

* fix toc

* add-missing-301-nncf

* Update 301-tensorflow-training-openvino-nncf-with-output.rst

* fix toc

* fixes
sgolebiewski-intel authored Aug 21, 2023
1 parent 7c273dc commit 20bf7ae
Showing 360 changed files with 19,737 additions and 5,819 deletions.
2 changes: 1 addition & 1 deletion docs/nbdoc/consts.py
@@ -8,7 +8,7 @@

repo_name = "openvino_notebooks"

-artifacts_link = "http://repository.toolbox.iotg.sclab.intel.com/projects/ov-notebook/0.1.0-latest/20230711220806/dist/rst_files/"
+artifacts_link = "http://repository.toolbox.iotg.sclab.intel.com/projects/ov-notebook/0.1.0-latest/20230815220807/dist/rst_files/"

blacklisted_extensions = ['.xml', '.bin']

39 changes: 25 additions & 14 deletions docs/notebooks/001-hello-world-with-output.rst
@@ -1,6 +1,8 @@
Hello Image Classification
==========================

.. _top:

This basic introduction to OpenVINO™ shows how to do inference with an
image classification model.

@@ -11,10 +13,19 @@ Zoo <https://github.com/openvinotoolkit/open_model_zoo/>`__ is used in
this tutorial. For more information about how OpenVINO IR models are
created, refer to the `TensorFlow to
OpenVINO <101-tensorflow-classification-to-openvino-with-output.html>`__
tutorial.
tutorial.

**Table of contents**:

- `Imports <#imports>`__
- `Download the Model and data samples <#download-the-model-and-data-samples>`__
- `Select inference device <#select-inference-device>`__
- `Load the Model <#load-the-model>`__
- `Load an Image <#load-an-image>`__
- `Do Inference <#do-inference>`__

-Imports
--------
+Imports `<#top>`__
+############################################

.. code:: ipython3
@@ -29,8 +40,8 @@ Imports
sys.path.append("../utils")
from notebook_utils import download_file
-Download the Model and data samples
------------------------------------
+Download the Model and data samples `<#top>`__
+########################################################################

.. code:: ipython3
@@ -63,10 +74,10 @@ Download the Model and data samples
artifacts/v3-small_224_1.0_float.bin: 0%| | 0.00/4.84M [00:00<?, ?B/s]
-Select inference device
------------------------
+Select inference device `<#top>`__
+############################################################

-select device from dropdown list for running inference using OpenVINO
+Select device from dropdown list for running inference using OpenVINO:

.. code:: ipython3
@@ -91,8 +102,8 @@ select device from dropdown list for running inference using OpenVINO
-Load the Model
---------------
+Load the Model `<#top>`__
+###################################################

.. code:: ipython3
@@ -102,8 +113,8 @@ Load the Model
output_layer = compiled_model.output(0)
-Load an Image
--------------
+Load an Image `<#top>`__
+##################################################

.. code:: ipython3
@@ -122,8 +133,8 @@ Load an Image
.. image:: 001-hello-world-with-output_files/001-hello-world-with-output_10_0.png


-Do Inference
-------------
+Do Inference `<#top>`__
+#################################################

.. code:: ipython3
6 changes: 3 additions & 3 deletions docs/notebooks/001-hello-world-with-output_files/index.html
@@ -1,7 +1,7 @@
<html>
-<head><title>Index of /projects/ov-notebook/0.1.0-latest/20230711220806/dist/rst_files/001-hello-world-with-output_files/</title></head>
+<head><title>Index of /projects/ov-notebook/0.1.0-latest/20230815220807/dist/rst_files/001-hello-world-with-output_files/</title></head>
<body bgcolor="white">
-<h1>Index of /projects/ov-notebook/0.1.0-latest/20230711220806/dist/rst_files/001-hello-world-with-output_files/</h1><hr><pre><a href="../">../</a>
-<a href="001-hello-world-with-output_10_0.png">001-hello-world-with-output_10_0.png</a> 12-Jul-2023 00:11 387941
+<h1>Index of /projects/ov-notebook/0.1.0-latest/20230815220807/dist/rst_files/001-hello-world-with-output_files/</h1><hr><pre><a href="../">../</a>
+<a href="001-hello-world-with-output_10_0.png">001-hello-world-with-output_10_0.png</a> 16-Aug-2023 01:31 387941
</pre><hr></body>
</html>
112 changes: 56 additions & 56 deletions docs/notebooks/002-openvino-api-with-output.rst
@@ -4,29 +4,27 @@ OpenVINO™ Runtime API Tutorial
This notebook explains the basics of the OpenVINO Runtime API. It
covers:

-- `Loading OpenVINO Runtime and Showing
-  Info <#Loading-OpenVINO-Runtime-and-Showing-Info>`__
-- `Loading a Model <#Loading-a-Model>`__
+- `Loading OpenVINO Runtime and Showing Info <#loading-openvino-runtime-and-showing-info>`__
+- `Loading a Model <#loading-a-model>`__

-  - `OpenVINO IR Model <#OpenVINO-IR-Model>`__
-  - `ONNX Model <#ONNX-Model>`__
-  - `PaddlePaddle Model <#PaddlePaddle-Model>`__
-  - `TensorFlow Model <#TensorFlow-Model>`__
-  - `TensorFlow Lite Model <#TensorFlow-Lite-Model>`__
+  - `OpenVINO IR Model <#openvino-ir-model>`__
+  - `ONNX Model <#onnx-model>`__
+  - `PaddlePaddle Model <#paddlepaddle-model>`__
+  - `TensorFlow Model <#tensorflow-model>`__
+  - `TensorFlow Lite Model <#tensorflow-lite-model>`__

-- `Getting Information about a
-  Model <#Getting-Information-about-a-Model>`__
+- `Getting Information about a Model <#getting-information-about-a-model>`__

-  - `Model Inputs <#Model-Inputs>`__
-  - `Model Outputs <#Model-Outputs>`__
+  - `Model Inputs <#model-inputs>`__
+  - `Model Outputs <#model-outputs>`__

-- `Doing Inference on a Model <#Doing-Inference-on-a-Model>`__
-- `Reshaping and Resizing <#Reshaping-and-Resizing>`__
+- `Doing Inference on a Model <#doing-inference-on-a-model>`__
+- `Reshaping and Resizing <#reshaping-and-resizing>`__

-  - `Change Image Size <#Change-Image-Size>`__
-  - `Change Batch Size <#Change-Batch-Size>`__
+  - `Change Image Size <#change-image-size>`__
+  - `Change Batch Size <#change-batch-size>`__

-- `Caching a Model <#Caching-a-Model>`__
+- `Caching a Model <#caching-a-model>`__

The notebook is divided into sections with headers. The next cell
contains global requirements installation and imports. Each section is
@@ -54,12 +52,12 @@ same.
.. parsed-literal::
-Requirement already satisfied: requests in /opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-448/.workspace/scm/ov-notebook/.venv/lib/python3.8/site-packages (2.31.0)
-Requirement already satisfied: tqdm in /opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-448/.workspace/scm/ov-notebook/.venv/lib/python3.8/site-packages (4.65.0)
-Requirement already satisfied: charset-normalizer<4,>=2 in /opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-448/.workspace/scm/ov-notebook/.venv/lib/python3.8/site-packages (from requests) (3.2.0)
-Requirement already satisfied: idna<4,>=2.5 in /opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-448/.workspace/scm/ov-notebook/.venv/lib/python3.8/site-packages (from requests) (3.4)
-Requirement already satisfied: urllib3<3,>=1.21.1 in /opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-448/.workspace/scm/ov-notebook/.venv/lib/python3.8/site-packages (from requests) (1.26.16)
-Requirement already satisfied: certifi>=2017.4.17 in /opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-448/.workspace/scm/ov-notebook/.venv/lib/python3.8/site-packages (from requests) (2023.5.7)
+Requirement already satisfied: requests in /opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-475/.workspace/scm/ov-notebook/.venv/lib/python3.8/site-packages (2.31.0)
+Requirement already satisfied: tqdm in /opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-475/.workspace/scm/ov-notebook/.venv/lib/python3.8/site-packages (4.66.1)
+Requirement already satisfied: charset-normalizer<4,>=2 in /opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-475/.workspace/scm/ov-notebook/.venv/lib/python3.8/site-packages (from requests) (3.2.0)
+Requirement already satisfied: idna<4,>=2.5 in /opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-475/.workspace/scm/ov-notebook/.venv/lib/python3.8/site-packages (from requests) (3.4)
+Requirement already satisfied: urllib3<3,>=1.21.1 in /opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-475/.workspace/scm/ov-notebook/.venv/lib/python3.8/site-packages (from requests) (1.26.16)
+Requirement already satisfied: certifi>=2017.4.17 in /opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-475/.workspace/scm/ov-notebook/.venv/lib/python3.8/site-packages (from requests) (2023.7.22)
Loading OpenVINO Runtime and Showing Info
@@ -116,24 +114,24 @@ OpenVINO IR Model
An OpenVINO IR (Intermediate Representation) model consists of an
``.xml`` file, containing information about network topology, and a
``.bin`` file, containing the weights and biases binary data. Models in
-OpenVINO IR format are obtained by using Model Optimizer tool. The
+OpenVINO IR format are obtained by using model conversion API. The
``read_model()`` function expects the ``.bin`` weights file to have the
same filename and be located in the same directory as the ``.xml`` file:
``model_weights_file == Path(model_xml).with_suffix(".bin")``. If this
is the case, specifying the weights file is optional. If the weights
file has a different filename, it can be specified using the ``weights``
parameter in ``read_model()``.
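The weights-file convention described above can be sketched with ``pathlib`` (the paths are illustrative, and the ``read_model()`` calls are left as comments since they require an OpenVINO installation):

```python
from pathlib import Path

model_xml = Path("model/classification.xml")  # hypothetical model path

# Default convention: read_model() looks for a .bin file with the same
# stem, in the same directory as the .xml file.
default_weights = model_xml.with_suffix(".bin")
assert default_weights == Path("model/classification.bin")

# core.read_model(model=model_xml)  # weights file found implicitly
# If the weights file has a different name, pass it explicitly:
# core.read_model(model=model_xml, weights=Path("model/other_name.bin"))
```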

-The OpenVINO `Model
-Optimizer <https://docs.openvino.ai/2023.0/openvino_docs_MO_DG_Deep_Learning_Model_Optimizer_DevGuide.html#doxid-openvino-docs-m-o-d-g-deep-learning-model-optimizer-dev-guide>`__
-tool is used to convert models to OpenVINO IR format. Model Optimizer
-reads the original model and creates an OpenVINO IR model (.xml and .bin
-files) so inference can be performed without delays due to format
-conversion. Optionally, Model Optimizer can adjust the model to be more
-suitable for inference, for example, by alternating input shapes,
-embedding preprocessing and cutting training parts off. For information
-on how to convert your existing TensorFlow, PyTorch or ONNX model to
-OpenVINO IR format with Model Optimizer, refer to the
+The OpenVINO `model conversion
+API <https://docs.openvino.ai/2023.0/openvino_docs_MO_DG_Deep_Learning_Model_Optimizer_DevGuide.html#doxid-openvino-docs-m-o-d-g-deep-learning-model-optimizer-dev-guide>`__
+tool is used to convert models to OpenVINO IR format. Model conversion
+API reads the original model and creates an OpenVINO IR model (``.xml``
+and ``.bin`` files) so inference can be performed without delays due to
+format conversion. Optionally, model conversion API can adjust the model
+to be more suitable for inference, for example, by alternating input
+shapes, embedding preprocessing and cutting training parts off. For
+information on how to convert your existing TensorFlow, PyTorch or ONNX
+model to OpenVINO IR format with model conversion API, refer to the
`tensorflow-to-openvino <101-tensorflow-classification-to-openvino-with-output.html>`__
and
`pytorch-onnx-to-openvino <102-pytorch-onnx-to-openvino-with-output.html>`__
@@ -165,7 +163,7 @@ notebooks.
.. parsed-literal::
-PosixPath('/opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-448/.workspace/scm/ov-notebook/notebooks/002-openvino-api/model/classification.bin')
+PosixPath('/opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-475/.workspace/scm/ov-notebook/notebooks/002-openvino-api/model/classification.bin')
@@ -212,7 +210,7 @@ points to the filename of an ONNX model.
.. parsed-literal::
-PosixPath('/opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-448/.workspace/scm/ov-notebook/notebooks/002-openvino-api/model/segmentation.onnx')
+PosixPath('/opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-475/.workspace/scm/ov-notebook/notebooks/002-openvino-api/model/segmentation.onnx')
@@ -268,7 +266,7 @@ without any conversion step. Pass the filename with extension to
.. parsed-literal::
-PosixPath('/opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-448/.workspace/scm/ov-notebook/notebooks/002-openvino-api/model/inference.pdiparams')
+PosixPath('/opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-475/.workspace/scm/ov-notebook/notebooks/002-openvino-api/model/inference.pdiparams')
@@ -292,13 +290,15 @@ TensorFlow Model
~~~~~~~~~~~~~~~~

TensorFlow models saved in frozen graph format can also be passed to
-``read_model`` starting in OpenVINO 2022.3.
-
-.. note::
-
-   * Directly loading TensorFlow models is available as a preview feature in the OpenVINO 2022.3 release. Fully functional support will be provided in the upcoming 2023 releases.
-   * Currently support is limited to only frozen graph inference format. Other TensorFlow model formats must be converted to OpenVINO IR using `Model Optimizer <https://docs.openvino.ai/2023.0/openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_TensorFlow.html>`__.
+``read_model`` starting in OpenVINO 2022.3.
+
+   **NOTE**: Directly loading TensorFlow models is available as a
+   preview feature in the OpenVINO 2022.3 release. Fully functional
+   support will be provided in the upcoming 2023 releases. Currently
+   support is limited to only frozen graph inference format. Other
+   TensorFlow model formats must be converted to OpenVINO IR using
+   `model conversion
+   API <https://docs.openvino.ai/2023.0/openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_TensorFlow.html>`__.

.. code:: ipython3
@@ -318,7 +318,7 @@ TensorFlow models saved in frozen graph format can also be passed to
.. parsed-literal::
-PosixPath('/opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-448/.workspace/scm/ov-notebook/notebooks/002-openvino-api/model/classification.pb')
+PosixPath('/opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-475/.workspace/scm/ov-notebook/notebooks/002-openvino-api/model/classification.pb')
@@ -370,7 +370,7 @@ It is pre-trained model optimized to work with TensorFlow Lite.
.. parsed-literal::
-PosixPath('/opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-448/.workspace/scm/ov-notebook/notebooks/002-openvino-api/model/classification.tflite')
+PosixPath('/opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-475/.workspace/scm/ov-notebook/notebooks/002-openvino-api/model/classification.tflite')
@@ -395,7 +395,7 @@ Getting Information about a Model
The OpenVINO Model instance stores information about the model.
Information about the inputs and outputs of the model are in
``model.inputs`` and ``model.outputs``. These are also properties of the
-CompiledModel instance. While using ``model.inputs`` and
+``CompiledModel`` instance. While using ``model.inputs`` and
``model.outputs`` in the cells below, you can also use
``compiled_model.inputs`` and ``compiled_model.outputs``.

@@ -419,7 +419,7 @@ CompiledModel instance. While using ``model.inputs`` and
.. parsed-literal::
-PosixPath('/opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-448/.workspace/scm/ov-notebook/notebooks/002-openvino-api/model/classification.bin')
+PosixPath('/opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-475/.workspace/scm/ov-notebook/notebooks/002-openvino-api/model/classification.bin')
@@ -581,8 +581,8 @@ on a model, first create an inference request by calling the
``compiled_model`` that was loaded with ``compile_model()``. Then, call
the ``infer()`` method of ``InferRequest``. It expects one argument:
``inputs``. This is a dictionary that maps input layer names to input
-data or list of input data in np.ndarray format, where the position of
-the input tensor corresponds to input index. If a model has a single
+data or list of input data in ``np.ndarray`` format, where the position
+of the input tensor corresponds to input index. If a model has a single
input, wrapping to a dictionary or list can be omitted.
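The shapes the ``inputs`` argument can take might be sketched as follows (the input name ``"input_1"`` and the NCHW shape are illustrative assumptions; the actual ``infer()`` call is commented out because it needs an OpenVINO runtime):

```python
import numpy as np

# Hypothetical single input: 1 image, 3 channels, 224x224 (NCHW layout).
input_data = np.zeros((1, 3, 224, 224), dtype=np.float32)

# Option 1: dictionary mapping input layer name to data.
inputs_by_name = {"input_1": input_data}

# Option 2: list, where position corresponds to input index.
inputs_by_index = [input_data]

# Option 3: for single-input models, the array can be passed directly.

# request = compiled_model.create_infer_request()  # requires openvino
# result = request.infer(inputs_by_name)
```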

.. code:: ipython3
@@ -612,7 +612,7 @@ input, wrapping to a dictionary or list can be omitted.
.. parsed-literal::
-PosixPath('/opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-448/.workspace/scm/ov-notebook/notebooks/002-openvino-api/model/classification.bin')
+PosixPath('/opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-475/.workspace/scm/ov-notebook/notebooks/002-openvino-api/model/classification.bin')
@@ -707,10 +707,10 @@ add the ``N`` dimension (where ``N``\ = 1) by calling the
**Do inference**

Now that the input data is in the right shape, run inference. The
-CompiledModel inference result is a dictionary where keys are the Output
-class instances (the same keys in ``compiled_model.outputs`` that can
-also be obtained with ``compiled_model.output(index)``) and values -
-predicted result in np.array format.
+``CompiledModel`` inference result is a dictionary where keys are the
+Output class instances (the same keys in ``compiled_model.outputs`` that
+can also be obtained with ``compiled_model.output(index)``) and values -
+predicted result in ``np.array`` format.

.. code:: ipython3
@@ -797,7 +797,7 @@ input shape.
.. parsed-literal::
-PosixPath('/opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-448/.workspace/scm/ov-notebook/notebooks/002-openvino-api/model/segmentation.bin')
+PosixPath('/opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-475/.workspace/scm/ov-notebook/notebooks/002-openvino-api/model/segmentation.bin')
@@ -948,7 +948,7 @@ the cache.
.. parsed-literal::
-PosixPath('/opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-448/.workspace/scm/ov-notebook/notebooks/002-openvino-api/model/classification.bin')
+PosixPath('/opt/home/k8sworker/ci-ai/cibuilds/ov-notebook/OVNotebookOps-475/.workspace/scm/ov-notebook/notebooks/002-openvino-api/model/classification.bin')