The Difference Between Tensor.view() and torch.reshape() in PyTorch – PyTorch Tutorial

April 28, 2022

Both PyTorch's tensor.view() and torch.reshape() can change the shape of a tensor. What is the difference between them? In this tutorial, we will introduce it to you.

Difference between tensor.view() and torch.reshape() in PyTorch

tensor.view() can only be used on a contiguous tensor; however, torch.reshape() can be used on any kind of tensor.
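You can check whether a tensor is contiguous with tensor.is_contiguous(). For example, a freshly created tensor is contiguous, but transposing it produces a non-contiguous view:

```python
import torch

x = torch.tensor([[1, 2, 2], [2, 1, 3]])
print(x.is_contiguous())  # True: a newly created tensor is contiguous

y = x.transpose(0, 1)
print(y.is_contiguous())  # False: transpose returns a non-contiguous view
```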

For example:

import torch
x = torch.tensor([[1, 2, 2], [2, 1, 3]])
x = x.transpose(0, 1)  # transposing makes x non-contiguous
print(x)
y = x.view(-1)  # raises a RuntimeError
print(y)

Running this code, we will get:

RuntimeError: view size is not compatible with input tensor’s size and stride

This is because tensor.transpose() makes x non-contiguous. We can read the tutorial below to learn how to fix this error.

Understand tensor.contiguous() with Examples: How to Use? – PyTorch Tutorial
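In short, calling tensor.contiguous() first copies the data into contiguous memory, after which view() works:

```python
import torch

x = torch.tensor([[1, 2, 2], [2, 1, 3]])
x = x.transpose(0, 1)  # x is now non-contiguous

# .contiguous() returns a contiguous copy, so view() no longer fails
y = x.contiguous().view(-1)
print(y)  # tensor([1, 2, 2, 1, 2, 3])
```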

However, torch.reshape() works correctly here. For example:

import torch
x = torch.tensor([[1, 2, 2], [2, 1, 3]])
x = x.transpose(0, 1)
print(x)
y = torch.reshape(x, [-1])
print(y)

Running this code, we will get:

tensor([[1, 2],
        [2, 1],
        [2, 3]])
tensor([1, 2, 2, 1, 2, 3])

Since it is often hard to know whether a tensor is contiguous, it is a good idea to use torch.reshape() instead of tensor.view() when you are coding.
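One caveat worth knowing: when the input is not contiguous, torch.reshape() has to return a copy rather than a view, so writing to the result does not change the original tensor. A small sketch of the difference:

```python
import torch

a = torch.tensor([[1, 2, 2], [2, 1, 3]])

b = a.transpose(0, 1)       # non-contiguous
c = torch.reshape(b, [-1])  # reshape must copy the data here
c[0] = 99
print(b[0, 0].item())       # still 1: c is a copy, not a view of b

d = torch.reshape(a, [-1])  # contiguous input: d shares memory with a
d[0] = 99
print(a[0, 0].item())       # 99: modifying d also changed a
```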
