From 10274c09087983b2bfb2d423848d5921dc42b965 Mon Sep 17 00:00:00 2001
From: Evgenya Stepyreva
Date: Mon, 30 Nov 2020 16:39:58 +0300
Subject: [PATCH] [DOC] ShapeInference.md update. slyalin comments (#3355)

* [DOC] ShapeInference.md update. slyalin comments

* Apply suggestions from code review

Co-authored-by: Alina Alborova

* Apply suggestions from code review

Co-authored-by: Alina Alborova

* Update docs/IE_DG/ShapeInference.md

Co-authored-by: Alina Alborova
---
 docs/IE_DG/ShapeInference.md | 49 +++++++++++++++++++++++++++-----------------
 1 file changed, 30 insertions(+), 19 deletions(-)

diff --git a/docs/IE_DG/ShapeInference.md b/docs/IE_DG/ShapeInference.md
index e45378e..ea86911 100644
--- a/docs/IE_DG/ShapeInference.md
+++ b/docs/IE_DG/ShapeInference.md
@@ -1,6 +1,36 @@
 Using Shape Inference {#openvino_docs_IE_DG_ShapeInference}
 ==========================================
 
+OpenVINO™ provides the following methods for runtime model reshaping:
+
+* **Set a new input shape** with the `InferenceEngine::CNNNetwork::reshape` method.
+  The `InferenceEngine::CNNNetwork::reshape` method updates input shapes and propagates them down to the outputs of the model through all intermediate layers.
+
+> **NOTES**:
+> - Starting with the 2021.1 release, the Model Optimizer converts topologies keeping shape-calculating sub-graphs by default, which enables correct shape propagation during reshaping in most cases.
+> - Older versions of IRs are not guaranteed to reshape successfully. Please regenerate them with the Model Optimizer of the latest version of OpenVINO™.
+> - If an ONNX model does not have a fully defined input shape and the model was imported with the ONNX importer, reshape the model before loading it to the plugin.
+
+* **Set a new batch dimension value** with the `InferenceEngine::CNNNetwork::setBatchSize` method.
+  The meaning of a model batch may vary depending on the model design.
+  This method does not deduce batch placement for inputs from the model architecture.
+  It assumes that the batch is placed at the zero index in the shape for all inputs and uses the `InferenceEngine::CNNNetwork::reshape` method to propagate updated shapes through the model.
+
+  The method transforms the model before a new shape propagation to relax a hard-coded batch dimension in the model, if any.
+
+  Use `InferenceEngine::CNNNetwork::reshape` instead of `InferenceEngine::CNNNetwork::setBatchSize` to set new input shapes for the model in case the model has:
+  * Multiple inputs with different zero-index dimension meanings
+  * Input without a batch dimension
+  * 0D, 1D, or 3D shape
+
+  The `InferenceEngine::CNNNetwork::setBatchSize` method is a high-level API method that wraps the `InferenceEngine::CNNNetwork::reshape` method call and works for trivial models from the batch placement standpoint.
+  Use `InferenceEngine::CNNNetwork::reshape` for other models.
+
+  Using the `InferenceEngine::CNNNetwork::setBatchSize` method for models with a non-zero index batch placement or for models with inputs that do not have a batch dimension may lead to undefined behaviour.
+
+You can change input shapes multiple times using the `InferenceEngine::CNNNetwork::reshape` and `InferenceEngine::CNNNetwork::setBatchSize` methods in any order.
+If a model has a hard-coded batch dimension, use `InferenceEngine::CNNNetwork::setBatchSize` first to change the batch, then call `InferenceEngine::CNNNetwork::reshape` to update other dimensions, if needed.
+
 Inference Engine takes three kinds of a model description as an input, which are converted into an `InferenceEngine::CNNNetwork` object:
 1. [Intermediate Representation (IR)](../MO_DG/IR_and_opsets.md) through `InferenceEngine::Core::ReadNetwork`
 2. [ONNX model](../IE_DG/OnnxImporterTutorial.md) through `InferenceEngine::Core::ReadNetwork`
@@ -23,25 +53,6 @@ for (const auto & parameter : parameters) {
 
 To feed input data of a shape that is different from the model input shape, reshape the model first.
 
-OpenVINO™ provides the following methods for runtime model reshaping:
-
-* **Set a new input shape** with the `InferenceEngine::CNNNetwork::reshape` method.
-  The `InferenceEngine::CNNNetwork::reshape` method updates input shapes and propagates them down to the outputs of the model through all intermediate layers.
-
-> **NOTES**:
-> - Starting with the 2021.1 release, the Model Optimizer converts topologies keeping shape-calculating sub-graphs by default, which enables correct shape propagation during reshaping.
-> - Older versions of IRs are not guaranteed to reshape successfully. Please regenerate them with the Model Optimizer of the latest version of OpenVINO™.
-> - If an ONNX model does not have a fully defined input shape and the model was imported with the ONNX importer, reshape the model before loading it to the plugin.
-
-* **Set a new batch dimension value** with the `InferenceEngine::CNNNetwork::setBatchSize` method.
-  The meaning of a model batch may vary depending on the model design.
-  Batch dimension is usually placed at the 0 index of all inputs of the model.
-  This method does not work for models with a non-zero index batch placement or models with inputs without a batch dimension.
-  The method transforms the model before a new shape propagation to relax a hard-coded batch dimension in the model, if any.
-
-You can change input shapes multiple times using the `InferenceEngine::CNNNetwork::reshape` and `InferenceEngine::CNNNetwork::setBatchSize` methods in any order.
-If a model has a hard-coded batch dimension, use `InferenceEngine::CNNNetwork::setBatchSize` first to change the batch, then call `InferenceEngine::CNNNetwork::reshape` to update other dimensions, if needed.
-
 Once the input shape of `InferenceEngine::CNNNetwork` is set, call the `InferenceEngine::Core::LoadNetwork` method to get an `InferenceEngine::ExecutableNetwork` object for inference with updated shapes.
 
 There are other approaches to reshape the model during the stage of IR generation or [nGraph::Function creation](../nGraph_DG/build_function.md).
-- 
2.7.4
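
For illustration only, outside the patch itself: a minimal C++ sketch of the reshape workflow the updated documentation describes, assuming an IR model with a single batched (zero-index batch) input. The model file name and device name are placeholders, not part of the patch; the API calls (`Core::ReadNetwork`, `CNNNetwork::getInputShapes`, `CNNNetwork::reshape`, `CNNNetwork::setBatchSize`, `Core::LoadNetwork`) are the Inference Engine methods referenced above.

```cpp
#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;
    // Placeholder model path: an IR produced by the Model Optimizer.
    InferenceEngine::CNNNetwork network = core.ReadNetwork("model.xml");

    // Current input shapes: a map of input name -> SizeVector.
    InferenceEngine::ICNNNetwork::InputShapes shapes = network.getInputShapes();

    // Assuming the batch is at index 0 of the (single) input, set batch = 8
    // and propagate the new shape through all intermediate layers.
    shapes.begin()->second[0] = 8;
    network.reshape(shapes);

    // For trivially batched models, the high-level wrapper is equivalent:
    // network.setBatchSize(8);

    // Load the reshaped network to a device for inference with updated shapes.
    InferenceEngine::ExecutableNetwork exec = core.LoadNetwork(network, "CPU");
    return 0;
}
```

As the patch notes, prefer `reshape` over `setBatchSize` whenever the batch is not at index 0 of every input, or an input has no batch dimension at all.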