PyTorch

PyTorch
Developer: Meta AI
Released: September 2016[1]
Platform: IA-32, x86-64, ARM64
Language: English
Genre: Library for machine learning and deep learning
License: BSD-3[2]

PyTorch is a machine learning library based on the Torch library,[3] [4] [5] used for applications such as computer vision and natural language processing.[6] It was originally developed by Meta AI and is now part of the Linux Foundation umbrella.[7] [8] [9] [10] It is recognized as one of the two most popular machine learning libraries alongside TensorFlow, and is free and open-source software released under the modified BSD license. Although the Python interface is more polished and the primary focus of development, PyTorch also has a C++ interface.[11]

A number of pieces of deep learning software are built on top of PyTorch, including Tesla Autopilot,[12] Uber's Pyro,[13] Hugging Face's Transformers, PyTorch Lightning,[14] and Catalyst.[15]

PyTorch provides two high-level features:[16]

  1. Tensor computing (like NumPy) with strong acceleration via graphics processing units (GPUs)
  2. Deep neural networks built on a tape-based automatic differentiation system
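
The short sketch below illustrates both features together; it is a minimal example, and the tensor values and variable names are illustrative assumptions rather than anything taken from the article.

import torch

# Feature 1: tensor computing -- create a small tensor with a NumPy-like API
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)  # requires_grad enables gradient tracking

# Feature 2: tape-based automatic differentiation -- differentiate a scalar with respect to x
y = (x ** 2).sum()   # y = 1^2 + 2^2 + 3^2 = 14
y.backward()         # Backpropagate through the recorded operations
print(x.grad)        # dy/dx = 2*x -> tensor([2., 4., 6.])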

History

Meta (formerly known as Facebook) operated both PyTorch and Convolutional Architecture for Fast Feature Embedding (Caffe2), but models defined by the two frameworks were mutually incompatible. The Open Neural Network Exchange (ONNX) project was created by Meta and Microsoft in September 2017 for converting models between frameworks. Caffe2 was merged into PyTorch at the end of March 2018.[17] In September 2022, Meta announced that PyTorch would be governed by the PyTorch Foundation, a newly created independent organization and a subsidiary of the Linux Foundation.[18]

PyTorch 2.0 was released on 15 March 2023.[19]

PyTorch tensors

See main article: Tensor (machine learning). PyTorch defines a class called Tensor (torch.Tensor) to store and operate on homogeneous multidimensional rectangular arrays of numbers. PyTorch Tensors are similar to NumPy arrays, but can also be operated on by a CUDA-capable NVIDIA GPU. PyTorch has also been developing support for other GPU platforms, for example, AMD's ROCm[20] and Apple's Metal Framework.[21]

PyTorch supports various sub-types of Tensors.[22]

Note that the term "tensor" here does not carry the same meaning as tensor in mathematics or physics. The meaning of the word in machine learning is only superficially related to its original meaning as a certain kind of object in linear algebra. Tensors in PyTorch are simply multi-dimensional arrays.
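
As a brief illustration of the tensor API described above (a minimal sketch; the array values and the CUDA-availability check are assumptions made for this example, not taken from the article):

import torch
import numpy as np

np_array = np.array([[1.0, 2.0], [3.0, 4.0]])
t = torch.from_numpy(np_array)   # Wrap the NumPy array as a torch.Tensor (shares memory)
device = "cuda" if torch.cuda.is_available() else "cpu"  # Fall back to the CPU if no CUDA GPU is present
t = t.to(device)                 # Move the tensor to the selected device
print(t.dtype, t.device)         # e.g. torch.float64 cpu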

PyTorch neural networks

See main article: Neural network (machine learning).

PyTorch defines a module called nn (torch.nn) to describe neural networks and to support training. This module offers a comprehensive collection of building blocks for neural networks, including various layers and activation functions, enabling the construction of complex models. Networks are built by subclassing torch.nn.Module and defining the sequence of operations in the forward method.

Example

The following program shows the low-level functionality of the library with a simple example.

import torch

dtype = torch.float
device = torch.device("cpu")  # Execute all calculations on the CPU
# device = torch.device("cuda:0")  # Executes all calculations on the GPU

# Create a tensor and fill it with random numbers
a = torch.randn(2, 3, device=device, dtype=dtype)
print(a)
# Output: tensor([[-1.1884,  0.8498, -1.7129],
#                 [-0.8816,  0.1944,  0.5847]])

b = torch.randn(2, 3, device=device, dtype=dtype)
print(b)
# Output: tensor([[ 0.7178, -0.8453, -1.3403],
#                 [ 1.3262,  1.1512, -1.7070]])

print(a * b)  # Element-wise multiplication of the two tensors
# Output: tensor([[-0.8530, -0.7183,  2.2958],
#                 [-1.1692,  0.2238, -0.9981]])

print(a.sum())  # Sum of all elements
# Output: tensor(-2.1540)

print(a[1, 2])  # Output of the element in the third column of the second row (zero based)
# Output: tensor(0.5847)

print(a.max())  # Maximum value in the tensor
# Output: tensor(0.8498)

The following code-block shows an example of the higher-level functionality provided by the nn module. A neural network with linear layers is defined in the example.

import torch
from torch import nn  # Import the nn sub-module from PyTorch

class NeuralNetwork(nn.Module):  # Neural networks are defined as classes
    def __init__(self):  # Layers and variables are defined in the __init__ method
        super().__init__()  # Must be in every network.
        self.flatten = nn.Flatten()  # Construct a flattening layer.
        self.linear_relu_stack = nn.Sequential(  # Construct a stack of layers.
            nn.Linear(28 * 28, 512),  # Linear layers have an input and output shape
            nn.ReLU(),  # ReLU is one of many activation functions provided by nn
            nn.Linear(512, 512),
            nn.ReLU(),
            nn.Linear(512, 10),
        )

    def forward(self, x):  # This function defines the forward pass.
        x = self.flatten(x)
        logits = self.linear_relu_stack(x)
        return logits
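
A minimal usage sketch for the class above; the batch size, the 28×28 dummy input, and the variable names are assumptions chosen to match the layer sizes in the example rather than part of the original.

model = NeuralNetwork()    # Instantiate the network defined above
x = torch.rand(1, 28, 28)  # One dummy 28x28 input (e.g. an MNIST-sized image)
logits = model(x)          # Calling the module runs forward() on the input
print(logits.shape)        # torch.Size([1, 10])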

Notes and References

  1. Web site: PyTorch Alpha-1 release. Chintala, Soumith. 1 September 2016.
  2. Web site: PyTorch gets lit under The Linux Foundation. Claburn, Thomas. 12 September 2022. The Register.
  3. News: Facebook brings GPU-powered machine learning to Python. Yegulalp, Serdar. 19 January 2017. InfoWorld. Retrieved 11 December 2017.
  4. Web site: Why AI and machine learning researchers are beginning to embrace PyTorch. Lorica, Ben. 3 August 2017. O'Reilly Media. Retrieved 11 December 2017.
  5. Book: Ketkar, Nikhil. "Introduction to PyTorch". Deep Learning with Python. Apress, Berkeley, CA. 2017. pp. 195–208. ISBN 9781484227657. doi:10.1007/978-1-4842-2766-4_12.
  6. Web site: NLP with PyTorch: A Comprehensive Guide. Moez Ali. June 2023. datacamp.com. Retrieved 2024-04-01.
  7. News: When two trends fuse: PyTorch and recommender systems. Patel, Mo. 2017-12-07. O'Reilly Media. Retrieved 2017-12-18.
  8. News: Facebook and Microsoft collaborate to simplify conversions from PyTorch to Caffe2. Mannes, John. TechCrunch. Retrieved 2017-12-18. "FAIR is accustomed to working with PyTorch – a deep learning framework optimized for achieving state of the art results in research, regardless of resource constraints. Unfortunately in the real world, most of us are limited by the computational capabilities of our smartphones and computers."
  9. Web site: Tech giants are using open source frameworks to dominate the AI community. Arakelyan, Sophia. 2017-11-29. VentureBeat. Retrieved 2017-12-18.
  10. Web site: PyTorch strengthens its governance by joining the Linux Foundation. 2022-09-13. pytorch.org.
  11. Web site: The C++ Frontend. PyTorch Master Documentation. 2019-07-29.
  12. Web site: PyTorch at Tesla - Andrej Karpathy, Tesla. Karpathy, Andrej.
  13. News: Uber AI Labs Open Sources Pyro, a Deep Probabilistic Programming Language. 2017-11-03. Uber Engineering Blog. Retrieved 2017-12-18.
  14. Web site: Ecosystem Tools. pytorch.org. 2020-06-18.
  15. Web site: Ecosystem Tools. pytorch.org. 2020-04-04.
  16. Web site: PyTorch – About. pytorch.org. Retrieved 2018-06-11. Archived from the original on 2018-06-15 at https://web.archive.org/web/20180615190804/https://pytorch.org/about/ (dead link).
  17. Web site: Caffe2 Merges With PyTorch. 2018-04-02.
  18. Web site: Meta spins off PyTorch Foundation to make AI framework vendor neutral. Edwards, Benj. 2022-09-12.
  19. News: PyTorch 2.0 brings new fire to open-source machine learning. VentureBeat. 15 March 2023. Retrieved 16 March 2023.
  20. Web site: Installing PyTorch for ROCm. rocm.docs.amd.com. 2024-02-09.
  21. Web site: Introducing Accelerated PyTorch Training on Mac. pytorch.org. 2022-06-04.
  22. Web site: An Introduction to PyTorch – A Simple yet Powerful Deep Learning Library. analyticsvidhya.com. 2018-02-22. Retrieved 2018-06-11.