Layer (deep learning)

A layer in a deep learning model is a structural building block of the model's architecture: it receives information from the preceding layer (or from the input data), transforms it, and passes the result on to the next layer.

Layer types

The Dense layer, also called the fully connected layer,[1][2][3] is used to build abstract representations of the input data. Each of its neurons connects to every neuron in the preceding layer. In multilayer perceptron networks, several dense layers are stacked together.
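
The dense layer's computation can be sketched in plain Python (a minimal illustration, not from the cited sources; real layers operate on batched tensors and learn the weights):

```python
def dense(x, weights, bias, activation=lambda v: max(0.0, v)):
    """Fully-connected layer: every output neuron sees every input value.
    Computes activation(W @ x + b), here with a ReLU activation."""
    return [activation(sum(w * xi for w, xi in zip(row, x)) + b)
            for row, b in zip(weights, bias)]

# 3 inputs -> 2 outputs: the weight matrix has one row per output neuron.
x = [1.0, 2.0, 3.0]
W = [[0.5, 1.0, 0.25],   # weights of output neuron 0
     [1.0, 0.0, -0.5]]   # weights of output neuron 1
b = [0.1, -0.2]
y = dense(x, W, b)       # neuron 1's pre-activation is negative, so ReLU zeroes it
```

This matches the description in reference [2]: multiply the input by a weight matrix, add a bias vector, then apply a nonlinearity.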

The Convolutional layer[4] is typically used for image analysis tasks. It applies learned filters across the input to detect local features such as edges, textures, and patterns. The outputs from this layer are then fed into a fully-connected layer for further processing. See also: CNN model.
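
A hypothetical sketch of the layer's core operation, a 2-D cross-correlation (hand-written in plain Python; frameworks implement this far more efficiently and learn the kernel values):

```python
def conv2d(image, kernel):
    """'Valid' 2-D cross-correlation: slide the kernel over the image and
    take a weighted sum at each position."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

# A vertical-edge kernel responds where intensity changes left-to-right.
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[-1, 1],
          [-1, 1]]
edges = conv2d(image, kernel)  # peaks at the column where 0 jumps to 1
```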

The Pooling layer[5][6] is used to reduce the spatial size of its input, commonly by keeping only the maximum (max pooling) or the average (average pooling) of each local region.
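
Max pooling can be sketched as follows (an illustrative plain-Python version with non-overlapping windows, not taken from the cited sources):

```python
def max_pool(fmap, size=2):
    """Non-overlapping max pooling: keep the strongest activation in each
    size x size window, shrinking each spatial dimension by `size`."""
    return [[max(fmap[i + di][j + dj]
                 for di in range(size) for dj in range(size))
             for j in range(0, len(fmap[0]), size)]
            for i in range(0, len(fmap), size)]

fmap = [[1, 3, 2, 0],
        [4, 2, 1, 1],
        [0, 1, 5, 6],
        [2, 2, 7, 3]]
pooled = max_pool(fmap)  # 4x4 input -> 2x2 output, a 4x size reduction
```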

The Recurrent layer is used for sequential data such as text, and maintains an internal memory of previous inputs. As with the Convolutional layer, the outputs of recurrent layers are usually fed into a fully-connected layer for further processing. See also: RNN model.[7][8][9]
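
The memory mechanism can be sketched with a single scalar state (a toy illustration; real recurrent layers use vector states and weight matrices, and the weights here are arbitrary assumed values):

```python
import math

def rnn_step(x, h, w_x, w_h, b):
    """One recurrent step: the new state mixes the current input with the
    previous hidden state, which acts as the layer's memory."""
    return math.tanh(w_x * x + w_h * h + b)

# Process a sequence one element at a time, threading the state through.
h = 0.0
for x in [1.0, 0.5, -1.0]:
    h = rnn_step(x, h, w_x=0.8, w_h=0.5, b=0.0)
# h now summarizes the whole sequence; it would typically be passed
# to a fully-connected layer for classification.
```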

The Normalization layer rescales the outputs of previous layers toward a more regular distribution, such as zero mean and unit variance. This improves the stability and speed of model training.
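
The core rescaling can be sketched like so (a simplified illustration of the batch-normalization idea, without the learned scale and shift parameters that real layers add):

```python
import math

def normalize(batch, eps=1e-5):
    """Shift and scale a batch of activations to roughly zero mean and
    unit variance; eps guards against division by zero."""
    mean = sum(batch) / len(batch)
    var = sum((v - mean) ** 2 for v in batch) / len(batch)
    return [(v - mean) / math.sqrt(var + eps) for v in batch]

activations = [10.0, 12.0, 8.0, 14.0]
normed = normalize(activations)  # centered near 0, spread near 1
```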

A Hidden layer is any layer of a neural network that is neither the input layer nor the output layer.
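
In a tiny network, the hidden layer is simply the middle stage between input and output (an illustrative sketch with hand-picked weights, not a trained model):

```python
def relu(v):
    return max(0.0, v)

def layer(x, weights, bias, act):
    """Generic fully-connected layer: act(W @ x + b)."""
    return [act(sum(w * xi for w, xi in zip(row, x)) + b)
            for row, b in zip(weights, bias)]

x = [1.0, 2.0]                                    # input layer (2 values)
hidden = layer(x, [[1.0, 0.5], [-1.0, 1.0]],      # hidden layer: neither
               [0.0, 0.0], relu)                  # input nor output
output = layer(hidden, [[1.0, 1.0]], [0.0], relu) # output layer (1 value)
```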

Differences with layers of the neocortex

There is an intrinsic difference between deep learning layering and neocortical layering: deep learning layering is defined by network topology, while neocortical layering is defined by intra-layer homogeneity, that is, the similarity of the cells within each layer.

Notes and References

  1. "CS231n Convolutional Neural Networks for Visual Recognition". 10 May 2016. Retrieved 27 Apr 2021. "Fully-connected layer: Neurons in a fully connected layer have full connections to all activations in the previous layer, as seen in regular Neural Networks."
  2. "Fully connected layer". MATLAB. 1 Mar 2021. Retrieved 27 Apr 2021. "A fully connected layer multiplies the input by a weight matrix and then adds a bias vector."
  3. Géron, Aurélien (2019). Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems. Sebastopol, CA: O'Reilly Media, Inc. pp. 322–323. ISBN 978-1-4920-3264-9. OCLC 1124925244.
  4. Habibi Aghdam, Hamed; Jahani Heravi, Elnaz (30 May 2017). Guide to Convolutional Neural Networks: A Practical Application to Traffic-Sign Detection and Classification. Cham, Switzerland. ISBN 9783319575490. OCLC 987790957.
  5. Yamaguchi, Kouichi; Sakamoto, Kenji; Akabane, Toshio; Fujimoto, Yoshiji (November 1990). "A Neural Network for Speaker-Independent Isolated Word Recognition". First International Conference on Spoken Language Processing (ICSLP 90). Kobe, Japan.
  6. Ciresan, Dan; Meier, Ueli; Schmidhuber, Jürgen (June 2012). "Multi-column deep neural networks for image classification". 2012 IEEE Conference on Computer Vision and Pattern Recognition. New York, NY: Institute of Electrical and Electronics Engineers (IEEE). pp. 3642–3649. arXiv:1202.2745. doi:10.1109/CVPR.2012.6248110. ISBN 978-1-4673-1226-4. OCLC 812295155. CiteSeerX 10.1.1.300.3283. S2CID 2161592.
  7. Dupond, Samuel (2019). "A thorough review on the current advance of neural network structures". Annual Reviews in Control. 14: 200–230.
  8. Abiodun, Oludare Isaac; Jantan, Aman; Omolara, Abiodun Esther; Dada, Kemi Victoria; Mohamed, Nachaat Abdelatif; Arshad, Humaira (1 November 2018). "State-of-the-art in artificial neural network applications: A survey". Heliyon. 4 (11): e00938. doi:10.1016/j.heliyon.2018.e00938. ISSN 2405-8440. PMID 30519653. PMC 6260436. Bibcode:2018Heliy...400938A.
  9. Tealab, Ahmed (1 December 2018). "Time series forecasting using artificial neural networks methodologies: A systematic review". Future Computing and Informatics Journal. 3 (2): 334–340. doi:10.1016/j.fcij.2018.10.003. ISSN 2314-7288.