Tensors and Variables in PyTorch and TensorFlow

Here's a brief explanation of tensors and variables in the context of deep learning frameworks like PyTorch and TensorFlow:

Tensors

  • Definition: A tensor is a multi-dimensional array used to represent data (such as scalars, vectors, matrices, or higher-dimensional data).
  • Common Operations: Tensors support mathematical operations (addition, element-wise multiplication, matrix products) and can be reshaped, sliced, and indexed.

In PyTorch, tensors are the core data structure:

import torch
# Create a tensor
a = torch.tensor([[1, 2], [3, 4]])
# Basic operations
b = a + 2         # Adds 2 to each element
c = a * a         # Element-wise multiplication
d = a @ a         # Matrix multiplication

Output:

Tensor `b`: [[3, 4], [5, 6]]
Tensor `c`: [[1, 4], [9, 16]]
Tensor `d`: [[7, 10], [15, 22]]
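The reshaping and slicing mentioned above can be sketched like this (a minimal PyTorch illustration; the tensor `t` and the variable names are just for this example):

```python
import torch

t = torch.tensor([[1, 2], [3, 4]])

flat = t.reshape(4)   # Flatten the 2x2 tensor to a 1-D tensor: [1, 2, 3, 4]
col = t[:, 0]         # Slice the first column: [1, 3]
row = t[1]            # Index the second row: [3, 4]
```

`reshape` returns a tensor with the same data in a new shape, while slicing and indexing give views into the original tensor.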

In TensorFlow:

import tensorflow as tf
# Create a tensor
a = tf.constant([[1, 2], [3, 4]])
# Basic operations
b = a + 2
c = a * a
d = tf.matmul(a, a)

Variables

  • Definition: Variables are tensors whose values can be modified. They are often used to store model parameters during training.
  • In TensorFlow, tf.Variable is used for variables:

# TensorFlow Variable
x = tf.Variable([[1, 2], [3, 4]])
# Update the variable's value in place
x.assign_add([[1, 1], [1, 1]])
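As a sketch of how a tf.Variable holds trainable state, a gradient can be computed with tf.GradientTape (assuming TF 2.x eager mode; the computation mirrors the mean example used for PyTorch):

```python
import tensorflow as tf

x = tf.Variable([[1.0, 2.0], [3.0, 4.0]])

# Record operations on the variable so gradients can be computed
with tf.GradientTape() as tape:
    z = tf.reduce_mean(x + 2)

# z is the mean of 4 elements, so each element of x contributes 1/4
grad = tape.gradient(z, x)
print(grad)  # 0.25 in every position
```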
  • In PyTorch, the old Variable wrapper is deprecated; a plain torch.Tensor is used instead, and gradient tracking is enabled by setting requires_grad=True:

# PyTorch tensor with gradient tracking
x = torch.tensor([[1, 2], [3, 4]], dtype=torch.float32, requires_grad=True)
# Perform operations and backpropagate
y = x + 2
z = y.mean()
z.backward()  # Computes dz/dx and stores it in x.grad

In PyTorch, after z.backward(), the gradient of z with respect to x can be accessed with x.grad.
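To make this concrete, here is a minimal runnable version of that example; since z is the mean of four elements, each entry of x.grad is 1/4:

```python
import torch

x = torch.tensor([[1, 2], [3, 4]], dtype=torch.float32, requires_grad=True)
y = x + 2
z = y.mean()
z.backward()  # Populates x.grad with dz/dx

print(x.grad)  # 0.25 in every position, since each element contributes 1/4 to the mean
```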

This is a basic cheat sheet for tensors and variables, showing how they are handled in PyTorch and TensorFlow.
