![ONNXRuntime inference works well on Raspberry Pi 4 with Intel NCS2: step by step setup with OpenVINO Execution Provider - PUT Vision Lab](https://putvision.github.io/assets/images/posts/2022/01/rasp.webp)
ONNXRuntime inference works well on Raspberry Pi 4 with Intel NCS2: step by step setup with OpenVINO Execution Provider - PUT Vision Lab
GitHub - nknytk/built-onnxruntime-for-raspberrypi-linux: Built python wheel files of https://github.com/microsoft/onnxruntime for raspberry pi 32bit linux.
![Performance analysis for different embedded platforms; FPGA, JX GPU, JX... | Download Scientific Diagram](https://www.researchgate.net/publication/341456762/figure/fig7/AS:892522251440139@1589805295353/Performance-analysis-for-different-embedded-platforms-FPGA-JX-GPU-JX-CPU-RPI3B-and_Q640.jpg)
Performance analysis for different embedded platforms; FPGA, JX GPU, JX... | Download Scientific Diagram
![Announcing ONNX Runtime Availability in the NVIDIA Jetson Zoo for High Performance Inferencing | NVIDIA Technical Blog](https://developer.nvidia.com/blog/wp-content/uploads/2020/08/onnx-runtime.png)