Hard sigmoid

In artificial intelligence, especially computer vision and artificial neural networks, a hard sigmoid is a non-smooth function used in place of a sigmoid function. Such functions retain the basic shape of a sigmoid, rising from 0 to 1, but use simpler forms, especially piecewise linear or piecewise constant functions. They are preferred where speed of computation matters more than precision.

Examples

The most extreme examples are the sign function and the Heaviside step function, which jump from −1 to 1 or from 0 to 1, respectively (the choice depends on the normalization), at 0.[1]
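These piecewise constant cases can be written directly in NumPy; the following is an illustrative sketch, not code from the cited source:

    import numpy as np

    def heaviside_step(x):
        # Piecewise constant "hard sigmoid": 0 for x < 0, 1 for x >= 0.
        # The second argument of np.heaviside sets the value at x == 0 (1.0 here).
        return np.heaviside(x, 1.0)

    def sign_step(x):
        # Sign-function variant, using the -1 to 1 normalization.
        return np.sign(x)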

Other examples include the two approximations provided by the Theano library: ultra_fast_sigmoid, a multi-part piecewise approximation, and hard_sigmoid, a 3-part piecewise linear approximation (output 0, a line with slope 0.2, output 1), as sketched below.[2][3]
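A minimal NumPy sketch of such a 3-part piecewise linear approximation follows. The slope of 0.2 comes from the description above; the intercept of 0.5, which centers the ramp at 0 and makes it span roughly −2.5 to 2.5, is an assumption rather than a detail taken from the cited Theano source.

    import numpy as np

    def hard_sigmoid(x):
        # 3-part piecewise linear approximation of the logistic sigmoid:
        # outputs 0 for x <= -2.5, follows the line 0.2*x + 0.5 in between,
        # and outputs 1 for x >= 2.5.
        return np.clip(0.2 * x + 0.5, 0.0, 1.0)

    print(hard_sigmoid(np.array([-5.0, 0.0, 5.0])))  # [0.  0.5 1. ]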

Notes and References

  1. Curves and Surfaces in Computer Vision and Graphics, Volume 1610, SPIE, 1992, p. 301
  2. "nnet – Ops for neural networks", Theano documentation. Archived from the original on 2018-08-14: https://web.archive.org/web/20180814165920/http://deeplearning.net/software/theano/library/tensor/nnet/nnet.html. Retrieved 2018-09-03.
  3. Theano source code, theano/tensor/nnet/sigm.py at commit 38a6331, GitHub: https://github.com/Theano/Theano/blob/38a6331ae23250338290e886a72daadb33441bc4/theano/tensor/nnet/sigm.py#L279