IVGCVSW-2560 Verify Inference test for TensorFlow Lite MobileNet SSD
author    Narumol Prangnawarat <narumol.prangnawarat@arm.com>
Mon, 25 Feb 2019 17:26:05 +0000 (17:26 +0000)
committer Narumol Prangnawarat <narumol.prangnawarat@arm.com>
Tue, 26 Feb 2019 17:41:15 +0000 (17:41 +0000)
commit 4628d05455dfc179f0437913185e76888115a98a
tree   a8eac68ee5aee88a7071ac6f13af7932b98caa87
parent 452869973b9a45c9c44820d16f92f7dfc96e9aef
IVGCVSW-2560 Verify Inference test for TensorFlow Lite MobileNet SSD

 * Assign the MobileNet SSD output shape to the ArmNN network
 * Add m_OverridenOutputShapes to TfLiteParser to set the shape in GetNetworkOutputBindingInfo (see the sketch after this list)
 * Use the input quantization parameters instead of the output quantization parameters
 * Correct the data and data type in the Inference test
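
A minimal sketch of the shape-override idea described above, assuming hypothetical
helper types (TensorShapeSketch, TensorInfoSketch, ParserSketch, OverrideOutputShape).
Only the member name m_OverridenOutputShapes and the method name
GetNetworkOutputBindingInfo come from this commit; the real ArmNN TfLiteParser
types and signatures differ.

    // Illustrative sketch only: a parser keeps per-output shape overrides and
    // applies them when the output binding info is requested, e.g. for the
    // post-processed detection outputs of MobileNet SSD.
    #include <cstddef>
    #include <map>
    #include <utility>
    #include <vector>

    struct TensorShapeSketch
    {
        std::vector<unsigned int> m_Dimensions;
    };

    struct TensorInfoSketch
    {
        TensorShapeSketch m_Shape;
        float m_QuantizationScale  = 0.0f;
        int   m_QuantizationOffset = 0;
    };

    class ParserSketch
    {
    public:
        // Record a shape that should replace the one stored in the model file.
        void OverrideOutputShape(std::size_t outputIndex, TensorShapeSketch shape)
        {
            m_OverridenOutputShapes[outputIndex] = std::move(shape);
        }

        // Prefer the overridden shape, if one was registered for this output,
        // when building the binding info returned to the caller.
        TensorInfoSketch GetNetworkOutputBindingInfo(std::size_t outputIndex,
                                                     TensorInfoSketch infoFromModel) const
        {
            auto it = m_OverridenOutputShapes.find(outputIndex);
            if (it != m_OverridenOutputShapes.end())
            {
                infoFromModel.m_Shape = it->second;
            }
            return infoFromModel;
        }

    private:
        std::map<std::size_t, TensorShapeSketch> m_OverridenOutputShapes;
    };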

Change-Id: I01ac2e07ed08e8928ba0df33a4847399e1dd8394
Signed-off-by: Narumol Prangnawarat <narumol.prangnawarat@arm.com>
Signed-off-by: Aron Virginas-Tar <Aron.Virginas-Tar@arm.com>
include/armnn/Descriptors.hpp
src/armnnTfLiteParser/TfLiteParser.cpp
src/armnnTfLiteParser/TfLiteParser.hpp
src/armnnTfLiteParser/test/DetectionPostProcess.cpp
src/backends/reference/workloads/DetectionPostProcess.cpp
tests/InferenceModel.hpp
tests/MobileNetSsdDatabase.hpp
tests/MobileNetSsdInferenceTest.hpp
tests/ObjectDetectionCommon.hpp