8.1.1.8.1.2. blueoil.quantizations.linear

8.1.1.8.1.2.1. Module Contents

8.1.1.8.1.2.1.1. Functions

linear_mid_tread_half_quantizer(bit=None, max_value=None, backward=None, dtype=tf.float32)

Linear mid tread half quantizer.

blueoil.quantizations.linear.linear_mid_tread_half_quantizer(bit=None, max_value=None, backward=None, dtype=tf.float32)

Linear mid tread half quantizer.

This function creates a linear mid tread half quantizer. If backward is provided, it is used in backpropagation instead of the default gradient.

This quantization method is a variant of the DoReFa-Net [1] activation quantization; the difference from DoReFa-Net [1] is that max_value can be changed.

op_type is LinearMidTreadHalfQuantizer.

Forward is:

\[\begin{split}\mathbf{X} & = \text{clip}\big(\mathbf{X}, 0, max\_value\big)\\ \mathbf{Y} & = \begin{cases} \mathbf{X}, & \text{if $bit$ is 32} \\ \frac{\text{round}\big(\frac{\mathbf{X}}{max\_value} \cdot (2^{bit}-1)\big)}{2^{bit}-1} \cdot max\_value, & \text{otherwise} \end{cases}\end{split}\]

Default backward is:

\[\begin{split}\frac{\partial Loss}{\partial \mathbf{X}} = \begin{cases} \frac{\partial Loss}{\partial y}, & \text{if $0 < x < max\_value$}\\ 0, & \text{otherwise} \end{cases}\end{split}\]
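
As an illustration only, the sketch below reproduces the forward and default backward formulas above with tf.custom_gradient. The names used here (linear_mid_tread_half_sketch, n, mask) are invented for this example and do not describe the library's internal implementation.

import tensorflow as tf

def linear_mid_tread_half_sketch(bit=2, max_value=2, dtype=tf.float32):
    # Number of quantization levels minus one: 2^bit - 1.
    n = tf.cast(2 ** bit - 1, dtype)
    max_value = tf.cast(max_value, dtype)

    @tf.custom_gradient
    def forward(x):
        # Forward: clip(X, 0, max_value), then round onto 2^bit - 1 levels.
        clipped = tf.clip_by_value(x, 0.0, max_value)
        if bit == 32:
            y = clipped
        else:
            y = tf.round(clipped / max_value * n) / n * max_value

        def grad(dy):
            # Default backward: pass the gradient only where 0 < x < max_value.
            mask = tf.cast((x > 0) & (x < max_value), dtype)
            return dy * mask

        return y, grad

    return forward
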
Parameters
  • bit (int) – Number of bits used for quantization.

  • max_value (int) – Upper bound used to clip and scale the input.

  • backward (callable) – Custom gradient function used in backpropagation.

  • dtype (tf.DType) – Data type of the arguments of forward and backward.

Returns

The forward function, with its gradient function (grad_func) defined.

Return type

callable
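
A brief usage sketch follows; it assumes the returned callable is applied directly to an activation tensor, as implied by the return value above, and the bit and max_value settings are purely illustrative.

import tensorflow as tf
from blueoil.quantizations.linear import linear_mid_tread_half_quantizer

# Build a 2-bit quantizer; the returned callable is the forward function.
quantize = linear_mid_tread_half_quantizer(bit=2, max_value=2)

x = tf.constant([-0.5, 0.3, 1.7, 2.4], dtype=tf.float32)
y = quantize(x)  # clipped to [0, 2], then rounded onto 2^2 - 1 = 3 levels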

Reference:

[1] DoReFa-Net: Training Low Bitwidth Convolutional Neural Networks with Low Bitwidth Gradients. https://arxiv.org/abs/1606.06160