From: Alexey Suhov
Date: Mon, 10 Feb 2020 18:08:40 +0000 (+0300)
Subject: Merge pull request #296 from dkurt/patch-1
X-Git-Url: http://review.tizen.org/git/?a=commitdiff_plain;h=949b74059fb1e266d2d324ba6d0528d77ec32311;hp=-c;p=platform%2Fupstream%2Fdldt.git

Merge pull request #296 from dkurt/patch-1

Do not build CMake from source
---

949b74059fb1e266d2d324ba6d0528d77ec32311
diff --combined inference-engine/README.md
index 9de589c,326cfd9..3d3eabe
--- a/inference-engine/README.md
+++ b/inference-engine/README.md
@@@ -22,7 -22,6 +22,7 @@@
  - [Build Steps](#build-steps-2)
  - [Additional Build Options](#additional-build-options-3)
  - [Use Custom OpenCV Builds for Inference Engine](#use-custom-opencv-builds-for-inference-engine)
 +- [Adding Inference Engine to your project](#adding-inference-engine-to-your-project)
  - [(Optional) Additional Installation Steps for the Intel® Movidius™ Neural Compute Stick and Neural Compute Stick 2](#optional-additional-installation-steps-for-the-intel-movidius-neural-compute-stick-and-neural-compute-stick-2)
  - [For Linux, Raspbian Stretch* OS](#for-linux-raspbian-stretch-os)
  - [For Windows](#for-windows-1)
@@@ -63,13 -62,7 +63,13 @@@ The software was validated on
     git submodule init
     git submodule update --recursive
     ```
 -2. Install build dependencies using the `install_dependencies.sh` script in the project root folder.
 +2. Install build dependencies using the `install_dependencies.sh` script in the project root folder:
 +   ```sh
 +   chmod +x install_dependencies.sh
 +   ```
 +   ```sh
 +   ./install_dependencies.sh
 +   ```
  3. By default, the build enables the Inference Engine GPU plugin to infer models on your Intel® Processor Graphics. This requires you to [Install Intel® Graphics Compute Runtime for OpenCL™ Driver package 19.04.12237](https://github.com/intel/compute-runtime/releases/tag/19.04.12237) before running the build.
     If you don't want to use the GPU plugin, use the `-DENABLE_CLDNN=OFF` CMake build option and skip the installation of the Intel® Graphics Compute Runtime for OpenCL™ Driver.
  4. Create a build folder:
     ```sh
@@@ -97,20 -90,33 +97,20 @@@ You can use the following additional bu
  - If the CMake-based build script cannot find and download the OpenCV package that is supported on your platform, or if you want to use a custom build of the OpenCV library, refer to the [Use Custom OpenCV Builds](#use-custom-opencv-builds-for-inference-engine) section for details.
 -- To build the Python API wrapper, use the `-DENABLE_PYTHON=ON` option. To specify an exact Python version, use the following options:
 -  ```sh
 -  -DPYTHON_EXECUTABLE=`which python3.7` \
 -  -DPYTHON_LIBRARY=/usr/lib/x86_64-linux-gnu/libpython3.7m.so \
 -  -DPYTHON_INCLUDE_DIR=/usr/include/python3.7
 -  ```
 +- To build the Python API wrapper:
 +  1. Install all additional packages listed in the `/inference-engine/ie_bridges/python/requirements.txt` file:
 +     ```sh
 +     pip install -r requirements.txt
 +     ```
 +  2. Use the `-DENABLE_PYTHON=ON` option. To specify an exact Python version, use the following options:
 +     ```sh
 +     -DPYTHON_EXECUTABLE=`which python3.7` \
 +     -DPYTHON_LIBRARY=/usr/lib/x86_64-linux-gnu/libpython3.7m.so \
 +     -DPYTHON_INCLUDE_DIR=/usr/include/python3.7
 +     ```
  - To switch off/on the CPU and GPU plugins, use the `cmake` options `-DENABLE_MKL_DNN=ON/OFF` and `-DENABLE_CLDNN=ON/OFF` respectively.
 -5. Adding to your project
 -
 -   For CMake projects, set an environment variable `InferenceEngine_DIR`:
 -
 -   ```sh
 -   export InferenceEngine_DIR=/path/to/dldt/inference-engine/build/
 -   ```
 -
 -   Then you can find Inference Engine by `find_package`:
 -
 -   ```cmake
 -   find_package(InferenceEngine)
 -
 -   include_directories(${InferenceEngine_INCLUDE_DIRS})
 -
 -   target_link_libraries(${PROJECT_NAME} ${InferenceEngine_LIBRARIES} dl)
 -   ```
 -
  ## Build for Raspbian Stretch* OS

  > **NOTE**: Only the MYRIAD plugin is supported.
@@@ -198,6 -204,7 +198,7 @@@ with the following content
      crossbuild-essential-armhf \
      git \
      wget \
 +    cmake \
      libusb-1.0-0-dev:armhf \
      libgtk-3-dev:armhf \
      libavcodec-dev:armhf \
@@@ -207,12 -214,6 +208,6 @@@
      libgstreamer-plugins-base1.0-dev:armhf \
      libpython3-dev:armhf \
      python3-pip
 -
 -  RUN wget https://www.cmake.org/files/v3.14/cmake-3.14.3.tar.gz && \
 -      tar xf cmake-3.14.3.tar.gz && \
 -      (cd cmake-3.14.3 && ./bootstrap --parallel=$(nproc --all) && make --jobs=$(nproc --all) && make install) && \
 -      rm -rf cmake-3.14.3 cmake-3.14.3.tar.gz
 -
  ```

  It uses the Debian\* Stretch (Debian 9) OS for compilation because it is the base of Raspbian\* Stretch.
@@@ -365,13 -366,7 +360,13 @@@ The software was validated on
     git submodule init
     git submodule update --recursive
     ```
 -2. Install build dependencies using the `install_dependencies.sh` script in the project root folder.
 +2. Install build dependencies using the `install_dependencies.sh` script in the project root folder:
 +   ```sh
 +   chmod +x install_dependencies.sh
 +   ```
 +   ```sh
 +   ./install_dependencies.sh
 +   ```
  3. Create a build folder:
     ```sh
     mkdir build
@@@ -419,22 -414,6 +414,22 @@@ After you got the built OpenCV library
  1. Set the `OpenCV_DIR` environment variable to the directory where the `OpenCVConfig.cmake` file of your custom OpenCV build is located.
  2. Disable automatic package downloading by using the `-DENABLE_OPENCV=OFF` option for the CMake-based build script for Inference Engine.
 +## Adding Inference Engine to your project
 +
 +For CMake projects, set the `InferenceEngine_DIR` environment variable:
 +
 +```sh
 +export InferenceEngine_DIR=/path/to/dldt/inference-engine/build/
 +```
 +
 +Then you can find Inference Engine by `find_package`:
 +
 +```cmake
 +find_package(InferenceEngine)
 +include_directories(${InferenceEngine_INCLUDE_DIRS})
 +target_link_libraries(${PROJECT_NAME} ${InferenceEngine_LIBRARIES} dl)
 +```
 +
  ## (Optional) Additional Installation Steps for the Intel® Movidius™ Neural Compute Stick and Neural Compute Stick 2

  > **NOTE**: These steps are only required if you want to perform inference on the Intel® Movidius™ Neural Compute Stick or the Intel® Neural Compute Stick 2 using the Inference Engine MYRIAD Plugin. See also [Intel® Neural Compute Stick 2 Get Started](https://software.intel.com/en-us/neural-compute-stick/get-started).
@@@ -477,7 -456,7 +472,7 @@@ For Intel® Movidius™ Neural Compute
  1. Go to the `<DLDT_ROOT_DIR>/inference-engine/thirdparty/movidius/MovidiusDriver` directory, where `DLDT_ROOT_DIR` is the directory to which the DLDT repository was cloned.
  2. Right-click the `Movidius_VSC_Device.inf` file and choose **Install** from the pop-up menu.

 -You have installed the driver for your Intel® Movidius™ Neural Compute Stick or Intel® Neural Compute Stick 2.
 +You have installed the driver for your Intel® Movidius™ Neural Compute Stick or Intel® Neural Compute Stick 2.

  ## Next Steps

@@@ -494,4 -473,4 +489,4 @@@ Congratulations, you have built the Inf
  * [Model Optimizer Developer Guide](https://docs.openvinotoolkit.org/latest/_docs_MO_DG_Deep_Learning_Model_Optimizer_DevGuide.html)

  ---
 -\* Other names and brands may be claimed as the property of others.
 +\* Other names and brands may be claimed as the property of others.
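The "Adding Inference Engine to your project" section introduced by this diff can be sketched as a complete consumer `CMakeLists.txt`. This is only an illustration under the assumptions stated in the comments: the project name `ie_sample` and source file `main.cpp` are hypothetical, and `InferenceEngine_DIR` is assumed to point at the Inference Engine build folder as described in that section.

```cmake
# Hypothetical minimal CMakeLists.txt for a project consuming the
# Inference Engine. Project name (ie_sample) and source file (main.cpp)
# are placeholders; adjust them for your project.
cmake_minimum_required(VERSION 3.10)
project(ie_sample)

# Resolved via the InferenceEngine_DIR variable set in the environment,
# which locates InferenceEngineConfig.cmake in the build folder.
find_package(InferenceEngine REQUIRED)

add_executable(ie_sample main.cpp)
target_include_directories(ie_sample PRIVATE ${InferenceEngine_INCLUDE_DIRS})
# dl is linked explicitly, matching the README snippet above.
target_link_libraries(ie_sample PRIVATE ${InferenceEngine_LIBRARIES} dl)
```

Using per-target `target_include_directories`/`target_link_libraries` instead of the global `include_directories` shown in the README keeps the Inference Engine usage scoped to one target; both forms work with the variables `find_package(InferenceEngine)` defines.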