Contributor
Jianyu Liu

FFmpeg DNN inference on IGPU full pipeline support


Mentors
Ting Fu
Organization
Intel Video and Audio for Linux
Technologies
C, GPU memory
Topics
deep learning, video, hardware acceleration
Problem: The FFmpeg DNN (Deep Neural Network) module already supports OpenVINO backend inference on Intel GPUs. However, FFmpeg decoding and encoding currently run on the CPU, so frames must be copied between CPU and GPU memory, which introduces notable latency and degrades performance with certain models. I therefore want to support a full-GPU pipeline for FFmpeg DNN inference, so that decoding, inference, and encoding all run on the Intel GPU, without any memory copies.

Method: Integrate the FFmpeg DNN module with OpenVINO GPU inference using the VASurface GPU memory type.

Deliverables: A full-GPU decode, inference, and encode pipeline in the FFmpeg DNN module, plus a well-written document with a user guide and examples.
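To illustrate the target, the following is a sketch of what a full-GPU invocation could look like once the pipeline is supported. The device path, input/output file names, model file, and tensor names are illustrative assumptions, not part of the project text; the `dnn_processing` filter and its `dnn_backend`/`model`/`input`/`output` options do exist in FFmpeg today.

```shell
# Hypothetical full-GPU pipeline: VAAPI decode, OpenVINO DNN inference on the
# same GPU surfaces, and VAAPI encode, with no CPU<->GPU frame copies.
# /dev/dri/renderD128, input.mp4, model.xml, and the tensor names x/y are
# placeholder assumptions for this sketch.
ffmpeg -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 \
       -hwaccel_output_format vaapi -i input.mp4 \
       -vf "dnn_processing=dnn_backend=openvino:model=model.xml:input=x:output=y" \
       -c:v h264_vaapi output.mp4
```

Today such a graph would need explicit `hwdownload`/`hwupload` steps around `dnn_processing`, which is exactly the memory-copy overhead this project aims to remove.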