:mod:`blueoil.cmd.measure_latency`
==================================

.. py:module:: blueoil.cmd.measure_latency


Module Contents
---------------

Functions
~~~~~~~~~

.. autoapisummary::

   blueoil.cmd.measure_latency._pre_process
   blueoil.cmd.measure_latency._measure_time
   blueoil.cmd.measure_latency._run
   blueoil.cmd.measure_latency.run
   blueoil.cmd.measure_latency.main


.. function:: _pre_process(raw_image, pre_processor, data_format)


.. function:: _measure_time(config, restore_path, step_size)


.. function:: _run(config_file, experiment_id, restore_path, image_size, step_size, cpu)


.. function:: run(config_file, experiment_id, restore_path, image_size, step_size, cpu)


.. function:: main(config_file, experiment_id, restore_path, image_size, step_size, cpu)

   Measure the average latency of a model's prediction at runtime.

   The latency is averaged over a number of repeated executions -- by default it is run 100 times.
   Each execution is measured after TensorFlow has already been initialized and both the model and
   the images are loaded. Batch size is always 1.

   Two types of latency are measured: first `overall` (including pre- and post-processing, which is
   executed on the CPU), and second `network-only` (model inference only, excluding pre- and
   post-processing).
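   The distinction between `overall` and `network-only` latency can be illustrated with a minimal,
   hypothetical sketch (not this module's actual implementation; `pre_process`, `post_process`,
   `session`, `output_op`, and `images_placeholder` are stand-ins for the configured
   pre/post-processors and the loaded TensorFlow graph):

   .. code-block:: python

      import time

      import numpy as np


      def measure_latency(session, output_op, images_placeholder, raw_image,
                          pre_process, post_process, runs=100):
          """Hypothetical sketch: time `overall` vs. `network-only` latency."""
          overall_times, network_times = [], []

          for _ in range(runs):
              start_overall = time.perf_counter()

              # Pre-processing runs on the CPU and counts toward `overall` only.
              image = pre_process(raw_image)
              batch = np.expand_dims(image, axis=0)  # batch size is always 1

              start_network = time.perf_counter()
              output = session.run(output_op,
                                   feed_dict={images_placeholder: batch})
              network_times.append(time.perf_counter() - start_network)

              # Post-processing likewise counts toward `overall` only.
              post_process(output)
              overall_times.append(time.perf_counter() - start_overall)

          return np.mean(overall_times), np.mean(network_times)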