Fix for Kaldi models with a batch of more than 1 (#1012)
author Ivan Tikhonov <ivan.tikhonov@intel.com>
Tue, 23 Jun 2020 05:22:12 +0000 (08:22 +0300)
committer GitHub <noreply@github.com>
Tue, 23 Jun 2020 05:22:12 +0000 (08:22 +0300)
commit 3490b985ddc56cf1474a93c3101d4baa3c925e9c
tree fe62c710964e9958ddd0c76f781353f3886915c4
parent b5be90a8862c19296c002ca765b0d6611c6121a3
Fix for Kaldi models with a batch of more than 1 (#1012)

* Fix Kaldi models (batch > 1)

* Fix ngraph code style

* Fix ngraph to IE conversion

* Added a comment

* Apply review comments

* Added a test for calling the SetBatchSize function when a ReadValue op is present in the network

* Check the status code instead of the message

* Use the new ngraph API

inference-engine/src/legacy_api/src/convert_function_to_cnn_network.cpp
inference-engine/tests/functional/inference_engine/cnn_network/cnn_ngraph_impl_tests.cpp