:mod:`blueoil.quantizations.linear`
===================================

.. py:module:: blueoil.quantizations.linear


Module Contents
---------------

Functions
~~~~~~~~~

.. autoapisummary::

   blueoil.quantizations.linear.linear_mid_tread_half_quantizer


.. function:: linear_mid_tread_half_quantizer(bit=None, max_value=None, backward=None, dtype=tf.float32)

   Linear mid tread half quantizer.

   This function creates a linear mid tread half quantizer.
   If ``backward`` is provided, it is used in backpropagation instead of the
   default backward function.

   This quantization method is a variant of the DoReFa-Net [1]_ activation
   quantization; the difference from DoReFa-Net [1]_ is that ``max_value`` is
   configurable.

   ``op_type`` is ``LinearMidTreadHalfQuantizer``.

   Forward is:

   .. math::
       \mathbf{X} & = \text{clip}\big(\mathbf{X}, 0, max\_value\big)\\
       \mathbf{Y} & =
           \begin{cases}
           \mathbf{X},  & \text{if $bit$ is 32} \\
           \frac{\text{round}\big(\frac{\mathbf{X}}{max\_value}
               \cdot (2^{bit}-1)\big)}{2^{bit}-1} \cdot max\_value, & otherwise
           \end{cases}

   Default backward is:

   .. math::
       \frac{\partial Loss}{\partial \mathbf{X}} =
           \begin{cases}
           \frac{\partial Loss}{\partial \mathbf{Y}},  & \text{if $0 < \mathbf{X} < max\_value$}\\
           0, & otherwise
           \end{cases}

   :param bit: Specify the bit width of the quantization.
   :type bit: int
   :param max_value: Used as the upper bound for clipping and scaling.
   :type max_value: int
   :param backward: Used in backpropagation in place of the default backward function.
   :type backward: callable
   :param dtype: Define the data type of the arguments of forward and backward.
   :type dtype: tf.DType

   :returns: forward function (with ``grad_func`` defined).
   :rtype: callable

   Reference:

   - Deep Learning with Low Precision by Half-wave Gaussian Quantization

   .. [1] DoReFa-Net: Training Low Bitwidth Convolutional Neural Networks with Low Bitwidth Gradients
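As an illustration, the forward equation above can be sketched for a single scalar in plain Python. This is a minimal, hypothetical sketch, not the library's implementation: the actual quantizer operates on TensorFlow tensors and also attaches the custom backward function.

```python
def linear_mid_tread_half_forward(x, bit=2, max_value=2.0):
    """Scalar sketch of the forward pass: clip x to [0, max_value],
    then round onto 2**bit - 1 uniform quantization levels."""
    # clip(x, 0, max_value)
    x = min(max(x, 0.0), max_value)
    if bit == 32:
        # 32-bit case passes the clipped value through unchanged
        return x
    n = (1 << bit) - 1  # number of quantization steps, 2^bit - 1
    # round(x / max_value * n) / n * max_value
    return round(x / max_value * n) / n * max_value
```

For example, with ``bit=2`` and ``max_value=2.0`` there are three steps of size 2/3, so an input of 1.3 snaps to the nearest level, 4/3; inputs outside [0, 2] are clipped first.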