Emily T. Burak
PyTorch Introduction: PyTorch Tensors (A Gentle Intro to PyTorch Pt. 2)
Tensors: The backbone of PyTorch
Welcome back to my series of blogs on PyTorch, gently dipping toes into the vast Deep Learning ocean of the torch ecosystem. In the first post of this series, I gave a very high-level overview of torch. Now it's time to talk tensors: what a tensor is, how tensors fit into torch, and what you actually do with data in tensor form.
Let's get tense.
Tensors and PyTorch
As indicated above, I feel that tensors provide the backbone of PyTorch, both in code and conceptually. The framework is built around tensors and the myriad ways you can structure them, work with multiple tensors at once, and operate over them. That backbone supports the body built on top of it: neural networks, which can be structured effectively around N-dimensional tensors. Now, what does that mean? What is a tensor, after all? Let's continue.
What is a tensor anyway?
Tensors are an n-dimensional data structure. A data structure is what it says on the tin: a codified way of storing information, defined by its structure and the operations performed over it. Data structures are key to good software engineering practice and, shockingly, to data science as well; knowing them keenly gives you a good leg up in either practice, or both.
Tensors are n-dimensional, meaning you can get galaxy-brained and keep scaling up. A tensor can be 0-dimensional (a scalar), 1-dimensional (a vector), 2-dimensional (a matrix), and so on up through higher dimensions, the math behind which is outside the scope of this blog post.
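To make that concrete before we dive into the notebook, here's a minimal sketch of my own (the names scalar, vector, and matrix are just illustrative) showing tensors at each of those dimensionalities, with .dim() reporting the number of dimensions:
import torch
#a quick illustrative sketch: tensors of increasing dimensionality
scalar = torch.tensor(7)                 #0-dimensional: a single value
vector = torch.tensor([1, 2, 3])         #1-dimensional: a line of values
matrix = torch.tensor([[1, 2], [3, 4]])  #2-dimensional: rows and columns
print(scalar.dim(), vector.dim(), matrix.dim())  #0 1 2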

What do you do with an n-dimensional tensor, what do you do with an n-dimensional tensor in the morning
Here's the part everyone loves -- like the last post, a koan-style sample of code from which you can enlighten yourself. This is adapted from a lesson in Rayan Slim's excellent Udemy course on PyTorch for Deep Learning and Computer Vision.
#google colab doesn't come with torch installed :(
!pip3 install torch
Requirement already satisfied: torch in /usr/local/lib/python3.7/dist-packages (1.8.1+cu101)
Requirement already satisfied: typing-extensions in /usr/local/lib/python3.7/dist-packages (from torch) (3.7.4.3)
Requirement already satisfied: numpy in /usr/local/lib/python3.7/dist-packages (from torch) (1.19.5)
In [2]:
#gotta import it of course, and numpy, a library with shared philosophy and interoperability
import torch
import numpy as np
In [4]:
#creates a tensor; this is the idiomatic, basic way, without using a range
v = torch.tensor([1,2,3,4,5,6])
#slicing a tensor; the stop index is exclusive (Pythonic in this sense)
print(v[1:4])
tensor([2, 3, 4])
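Since that slicing is Pythonic, other list-style indexing tricks carry over too. A small sketch of my own (not from the course), reusing v from above:
#step slicing and negative indexing also work, as with Python lists
print(v[::2])  #every other element: tensor([1, 3, 5])
print(v[-1])   #last element: tensor(6)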
In [5]:
#float tensor -- a tensor of...floats, whoa
f = torch.FloatTensor([1,2,3,4,5,6])
print(f)
tensor([1., 2., 3., 4., 5., 6.])
In [6]:
print(f.dtype)
torch.float32
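If you'd rather not reach for FloatTensor, torch.tensor also accepts a dtype argument at creation time. A minimal sketch (my own addition; g is just an illustrative name):
#you can also request a dtype up front instead of using FloatTensor
g = torch.tensor([1, 2, 3], dtype=torch.float32)
print(g.dtype)  #torch.float32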
In [8]:
#.size() gives the size of a tensor, dead simple
print(f.size())
torch.Size([6])
In [9]:
#view the tensor as 6 rows in 1 column, à la numpy's reshape method
v.view(6,1)
Out[9]:
tensor([[1],
        [2],
        [3],
        [4],
        [5],
        [6]])
In [10]:
#viewing tensor as 3 rows w/2 columns...
v.view(3,2)
Out[10]:
tensor([[1, 2],
        [3, 4],
        [5, 6]])
In [11]:
#-1 tells view to infer that dimension from the other one
v.view(3,-1)
Out[11]:
tensor([[1, 2],
        [3, 4],
        [5, 6]])
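One thing worth knowing about view: it doesn't copy anything. The viewed tensor shares the same underlying storage as the original, so mutating one mutates the other. A small sketch of that sharing (w is my own illustrative name, not part of the lesson):
#view shares storage with the original tensor -- no copy is made
w = v.view(3, 2)
w[0, 0] = 100
print(v)  #tensor([100, 2, 3, 4, 5, 6]) -- the change shows up in v too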
In [14]:
#creating a numpy n-dimensional array (type: ndarray)
a = np.array([1,2,3,4,5])
In [15]:
#creating a tensor from a numpy array!
tensor_cnv = torch.from_numpy(a)
print(tensor_cnv)
tensor([1, 2, 3, 4, 5])
In [16]:
#another way of getting the type
print(tensor_cnv.type())
torch.LongTensor
In [17]:
#whoa, turning it back into an ndarray! wild stuff...
numpy_cnv = tensor_cnv.numpy()
print(numpy_cnv)
[1 2 3 4 5]
In [18]:
#interchangeable between numpy arrays and tensors
In [21]:
print(type(numpy_cnv))
<class 'numpy.ndarray'>
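A closing caveat on that interchangeability: from_numpy and .numpy() share memory rather than copying, so changing the ndarray changes the tensor (and vice versa). A minimal sketch of my own, reusing the objects from the cells above:
#the ndarray and the tensor share memory -- mutate one, see it in the other
a[0] = 99
print(tensor_cnv)  #tensor([99, 2, 3, 4, 5])
print(numpy_cnv)   #[99  2  3  4  5]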
Smash that like button, subscribe, and share -- wait, this isn't YouTube! I kid. Jokes. I hope you've learned a bit about some bytes of tensors in reading this. Tell all your tensor-curious friends about this post and keep an eye out for the next post in this series!