How to Build OpenVisionCapsules-Compatible Hardware

HW VisionCapsules System Architecture

[Diagram: non-smart and smart vision devices feeding a BrainFrame gateway through OpenVisionCapsules]

A non-smart vision device sends raw video frames to an edge computing gateway that runs AI vision software. The gateway (BrainFrame, with its SDK) loads the HW DNN model through open-source OpenVisionCapsules, runs the VisionCapsules AI on the frames, returns the video frames with the AI metadata filled in, and publishes AI information to applications over Open Stream.

A smart vision device runs the HW-accelerated DNN on the device itself, so its video frames arrive with AI metadata already embedded; BrainFrame only needs to interpret that metadata through OpenVisionCapsules.

BrainFrame can be downloaded from https://dilililabs.com/docs/downloads/. It is offered by Dilili Labs as the AI Software Vendor implementation; the device side is the HW partner implementation.

HW VisionCapsules Adaptor Architecture

[Diagram: hardware adaptor capsule feeding downstream capsules inside BrainFrame]

The smart vision device sends video frames that carry AI metadata (for example, HW_detector_person_metadata). A hardware adaptor capsule (HW_detector_adaptor.cap) converts that metadata into AI results, which downstream capsules such as classifier_person.cap, recognizer_person.cap, and tracker_person.cap consume. BrainFrame then delivers the AI results and AI information to enterprise, business, and industrial applications.

VisionCapsules Hardware Adaptor Implementation

[Diagram: adaptor internals -- pre-processing, bypassed inference, and post-processing that packs AI metadata into AI results, built on the vcap spec and vcap_utils with list_local_devices, device_mapper, backend_loader, and UI options]

The VisionCapsules source code can be downloaded from github.com/opencv/open_vision_capsules. The HW partner needs to implement a VisionCapsules HW adaptor. Follow the steps below, based on a VisionCapsules sample implementation (a code sketch appears after the next section):

1. The HW VisionCapsules adaptor receives the hardware metadata and packs it into proper AI results.
2. Inference should be bypassed, because it has already been done on the hardware.
3. If the device stack is not supported by TensorFlow, OpenCV DNN, or OpenVINO, a custom device needs to be created. See "Add a Custom Device to BrainFrame" below.

HW VisionCapsules - Add a Custom Device to BrainFrame

[Diagram: the BrainFrame Canned VisionCapsules Runtime Environment, where the TensorFlow, OpenCV DNN, and OpenVINO devices each expose list_local_devices, device_mapper, and backend_loader, and a custom device plugs in alongside them with custom_device_mapper and custom_backend_loader]

BrainFrame supports TensorFlow, OpenCV DNN, and OpenVINO out of the box. If a HW partner implementation is not covered by any of these, a custom device needs to be added so that BrainFrame can load the HW-accelerated DNN model onto the hardware. Follow the steps below to add a custom device to BrainFrame:

1. Add the device name to the system. The BrainFrame Canned VisionCapsules Runtime Environment uses the TensorFlow stack's list_local_devices to find all available devices, so the HW partner may contact the Software Vendor to have BrainFrame support the custom device.
2. Alternatively, the HW partner can modify ("hack") the BrainFrame implementation, replacing the TensorFlow device with the custom device so that BrainFrame routes requests to the custom device's list_local_devices, custom_device_mapper, and custom_backend_loader.
3. The user can always skip steps 1 and 2 and load the DNN model manually with an external tool; in that case BrainFrame will not be able to control the hardware.
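To make the adaptor steps and the custom-device mapping concrete, here is a minimal sketch of a hardware adaptor capsule. It assumes the vcap API published in github.com/opencv/open_vision_capsules (BaseCapsule, BaseBackend, DetectionNode, DeviceMapper, NodeDescription); the read_hw_metadata helper, the "HW_NPU" device name, and the result field names are placeholders for whatever the hardware SDK and the custom device actually provide, so treat this as an outline rather than the sample implementation referenced above.

```python
# Hypothetical sketch of a hardware adaptor capsule (HW_detector_adaptor.cap).
# The vcap imports follow the open_vision_capsules repository; everything
# marked "placeholder" is an assumption, not part of the spec.
from vcap import (BaseBackend, BaseCapsule, DetectionNode, DeviceMapper,
                  NodeDescription)


def read_hw_metadata(frame):
    """Placeholder: extract the detections the smart device embedded with the
    frame. A real adaptor would call the vendor SDK or parse the metadata
    side channel here, e.g. returning
    [{"bbox": [[0, 0], [100, 0], [100, 100], [0, 100]], "score": 0.9}]."""
    return []


class Backend(BaseBackend):
    """Backend that bypasses inference and repacks on-device AI metadata."""

    def process_frame(self, frame, detection_node, options, state):
        # Step 1: read the metadata the smart vision device attached to the frame.
        hw_results = read_hw_metadata(frame)
        # Step 2: inference is bypassed -- the DNN already ran on the device,
        # so the adaptor only translates its output into AI results.
        return [DetectionNode(name="person",
                              coords=result["bbox"],
                              extra_data={"confidence": result["score"]})
                for result in hw_results]

    def batch_predict(self, input_data):
        # Never used: there is no in-capsule inference to batch.
        return []

    def close(self):
        pass  # release any hardware SDK handles here


class Capsule(BaseCapsule):
    name = "hw_detector_person_adaptor"
    description = "Converts on-device person detections into AI results"
    version = 1
    input_type = NodeDescription(size=NodeDescription.Size.NONE)
    output_type = NodeDescription(size=NodeDescription.Size.ALL,
                                  detections=["person"])
    # Step 3: map the capsule onto the custom device rather than CPU/GPU.
    # "HW_NPU" stands in for whatever name the custom device registers
    # through list_local_devices.
    device_mapper = DeviceMapper(
        filter_func=lambda devices: [d for d in devices if "HW_NPU" in d])
    backend_loader = lambda capsule_files, device: Backend()
```

The DeviceMapper filter is what ties the capsule to the custom device from the previous section: it receives the device names reported by list_local_devices and keeps only the hardware entries.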
OpenVisionCapsules Format

OpenVisionCapsules defines a portable deployment format (.cap) that wraps all of the others, e.g. TensorFlow, Torch, ONNX, etc. A .cap file is self-contained and executable.

Existing model formats:
- TensorFlow: .pb / .meta / .pbtxt, .tflite
- Keras: .h5, .keras
- Caffe: .caffemodel, .prototxt
- Torch: .pt, .pth, .t7
- ONNX: .onnx
- OpenVisionCapsules: .cap (portable format for all of the above; self-contained, executable)

Design goals (existing formats target DNN training; the OpenVisionCapsules format targets DNN deployment):

| Item | Design goal | Existing model formats | OpenVisionCapsules format |
| --- | --- | --- | --- |
| DNN model computing | AI computing target support: CPU, GPU, or other AI chip | ✔ | ✔ |
| DNN model computing | Batch-deploy DNN models onto multiple computing targets | X | ✔ |
| DNN model computing | Separate batchable inference from parallelizable operations | X | ✔ |
| Self-contained DNN model & processing | DNN weights & model in a single file | X / ✔ (separate files in most formats) | ✔ |
| Self-contained DNN model & processing | Executable logic for pre/post-processing | X | ✔ |
| Self-contained DNN model & processing | Expose standardized APIs for runtime algorithm configuration | X | ✔ |
| DNN model interconnect & compatibility | Describes the interconnection to other DNN models | X | ✔ |
| DNN model interconnect & compatibility | Compatible with other formats | X | ✔ |
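As a small illustration of the "expose standardized APIs for runtime algorithm configuration" row, a capsule can publish tunable options alongside its model and logic in the same .cap file. The sketch below assumes the options module of the vcap package from github.com/opencv/open_vision_capsules; the option names are invented for illustration, and the constructor arguments should be checked against the vcap version in use.

```python
# Hypothetical sketch: runtime-configurable options published by a capsule.
# Assumes vcap's `options` module; option names here are illustrative.
from vcap import options

# Assigning this dict to a capsule's `options` attribute lets the runtime
# (e.g. BrainFrame) adjust the values at runtime without repackaging the .cap.
capsule_options = {
    # Drop hardware detections below this confidence score.
    "detection_threshold": options.FloatOption(default=0.5,
                                               min_val=0.0, max_val=1.0),
    # Forward extra per-detection attributes reported by the device.
    "report_attributes": options.BoolOption(default=True),
}
```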
