Graph.queue_inference()

Info        Value
Package     mvnc
Module      mvncapi
Version     2.0
See also    Graph, Fifo, Graph.queue_inference_with_fifo_elem()

Overview

This method queues an inference to run on the device, taking its input from the specified input Fifo and writing its result to the specified output Fifo.

Syntax

graph.queue_inference(input_fifo, output_fifo)

Parameters

Parameter     Type   Description
input_fifo    Fifo   A FIFO queue for graph inputs. The FifoState must be ALLOCATED.
output_fifo   Fifo   A FIFO queue for graph outputs. The FifoState must be ALLOCATED.

Return

None

Raises

Exception with a status code from Status if underlying function calls return a status other than Status.OK.
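
A caller can catch this exception and inspect the attached Status value. A minimal sketch, assuming graph, input_fifo, and output_fifo have been set up as in the example below:

try:
    graph.queue_inference(input_fifo, output_fifo)
except Exception as e:
    # The exception arguments carry a Status code describing the failure
    print('queue_inference failed:', e)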

Notes
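
An inference queued with this method takes its input from input_fifo, so the input tensor must first be written to that Fifo (typically with Fifo.write_elem()); the result is retrieved from output_fifo (typically with Fifo.read_elem()).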

Example

from mvnc import mvncapi

#
# Create and open a Device...
#
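# One way to do this (a sketch; assumes at least one neural compute device is attached):
device_list = mvncapi.enumerate_devices()
device = mvncapi.Device(device_list[0])
device.open()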

# Create a Graph
graph = mvncapi.Graph('graph1')

# Read a compiled network graph from file (set the graph_filepath correctly for your graph file)
graph_filepath = './graph'
with open(graph_filepath, 'rb') as f:
    graph_buffer = f.read()

# Allocate the graph on the device and create input and output Fifos
input_fifo, output_fifo = graph.allocate_with_fifos(device, graph_buffer)

#
# Pre-process your input tensor and write to the input Fifo...
#
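# Assuming input_tensor is a numpy.float32 array already pre-processed for your
# network, write it to the input Fifo (the second argument is an optional user object):
input_fifo.write_elem(input_tensor, None)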

# Queue an inference
graph.queue_inference(input_fifo, output_fifo)

#
# Read the output from the output_fifo and use it as needed...
#
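# For example, read the next result (a sketch; read_elem() returns the output
# tensor and the user object passed to write_elem()):
output_tensor, user_obj = output_fifo.read_elem()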

# Deallocate and destroy the fifo and graph handles, close the device, and destroy the device handle
input_fifo.destroy()
output_fifo.destroy()
graph.destroy()
device.close()
device.destroy()
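
As a convenience, Graph.queue_inference_with_fifo_elem() writes a single input tensor and queues the inference in one call. A sketch of the equivalent call, assuming input_tensor is prepared as above:

graph.queue_inference_with_fifo_elem(input_fifo, output_fifo, input_tensor, None)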