Torch sum a tensor along an axis


Solution 1

The simplest and best solution is to use torch.sum().

To sum all elements of a tensor:

torch.sum(x)  # gives back a scalar (0-dim tensor)

To sum over all rows (i.e. for each column):

torch.sum(x, dim=0)  # size = [ncol]

To sum over all columns (i.e. for each row):

torch.sum(x, dim=1) # size = [nrow, 1]
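For instance, a minimal sketch (x here is an illustrative 2 x 3 tensor; note the reduced dimension is removed unless you pass keepdim=True):

import torch

x = torch.ones(2, 3)                # 2 rows, 3 columns

torch.sum(x)                        # tensor(6.) -- 0-dim scalar tensor
torch.sum(x, dim=0)                 # tensor([2., 2., 2.]), size [3] = [ncol]
torch.sum(x, dim=1)                 # tensor([3., 3.]), size [2] = [nrow]
torch.sum(x, dim=0, keepdim=True)   # size [1, 3] -- keeps the reduced dim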

Solution 2

Alternatively, you can use tensor.sum(axis), where for a 2D tensor axis=0 sums over the rows and axis=1 sums over the columns.

In [210]: X
Out[210]: 
tensor([[  1,  -3,   0,  10],
        [  9,   3,   2,  10],
        [  0,   3, -12,  32]])

In [211]: X.sum(1)
Out[211]: tensor([ 8, 24, 23])

In [212]: X.sum(0)
Out[212]: tensor([ 10,   3, -10,  52])

As we can see from the above outputs, in both cases the output is a 1D tensor. If, on the other hand, you wish to retain the dimension of the original tensor in the output as well, set the boolean kwarg keepdim to True, as in:

In [217]: X.sum(0, keepdim=True)
Out[217]: tensor([[ 10,   3, -10,  52]])

In [218]: X.sum(1, keepdim=True)
Out[218]: 
tensor([[ 8],
        [24],
        [23]])

Solution 3

If you have a tensor my_tensor and you wish to sum across the second array dimension (that is, the one with index 1, which is the column dimension if the tensor is 2-dimensional, as yours is), use torch.sum(my_tensor, 1) or, equivalently, my_tensor.sum(1); see the torch.sum documentation.

One thing that is not mentioned explicitly in the documentation: you can sum across the last array dimension by using -1 (or the second-to-last dimension with -2, etc.).

So, in your example, you could use outputs.sum(1) or torch.sum(outputs, 1), or, equivalently, outputs.sum(-1) or torch.sum(outputs, -1). All of these give the same result: an output tensor of size torch.Size([10]), with each entry being the sum over all rows in a given column of the tensor outputs.
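As a quick shape check (a sketch with random data; outputs here is just a stand-in for the [10, 100] tensor in the question):

import torch

outputs = torch.randn(10, 100)   # stand-in for the question's tensor

outputs.sum(1).shape             # torch.Size([10])
outputs.sum(-1).shape            # torch.Size([10]) -- same, -1 is the last dim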

To illustrate with a 3-dimensional tensor:

In [1]: my_tensor = torch.arange(24).view(2, 3, 4)

In [2]: my_tensor
Out[2]: 
tensor([[[ 0,  1,  2,  3],
         [ 4,  5,  6,  7],
         [ 8,  9, 10, 11]],

        [[12, 13, 14, 15],
         [16, 17, 18, 19],
         [20, 21, 22, 23]]])

In [3]: my_tensor.sum(2)
Out[3]:
tensor([[ 6, 22, 38],
        [54, 70, 86]])

In [4]: my_tensor.sum(-1)
Out[4]:
tensor([[ 6, 22, 38],
        [54, 70, 86]])

Solution 4

Based on the documentation at https://pytorch.org/docs/stable/generated/torch.sum.html:

dim (int or tuple of ints) – the dimension or dimensions to reduce.

So, for a 2D tensor:

dim=0 reduces the row dimension: all rows are condensed into one, i.e. sum by column
dim=1 reduces the column dimension: all columns are condensed into one, i.e. sum by row
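A minimal sketch of both cases:

import torch

x = torch.tensor([[1, 2, 3],
                  [4, 5, 6]])   # shape [2, 3]

x.sum(dim=0)   # tensor([5, 7, 9]) -- rows reduced, one sum per column
x.sum(dim=1)   # tensor([ 6, 15]) -- columns reduced, one sum per row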

Solution 5

Torch sum along multiple axis or dimensions

Just for the sake of completeness (I could not find it easily elsewhere), I include how to sum along multiple dimensions with torch.sum, which is heavily used in computer vision tasks where you have to reduce along the H and W dimensions.

If you have an image x with shape C x H x W and want to compute the average pixel intensity value per channel, you could do:

avg = torch.sum(x, dim=(1, 2)) / (H * W)   # sum over (H, W), then normalize
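Equivalently, torch.mean also accepts a tuple of dims, so the same per-channel average can be computed in one call (a sketch assuming a float tensor x of shape C x H x W; the sizes below are illustrative):

import torch

C, H, W = 3, 4, 5
x = torch.rand(C, H, W)

avg_sum = torch.sum(x, dim=(1, 2)) / (H * W)   # sum over H and W, then normalize
avg_mean = x.mean(dim=(1, 2))                  # same result in a single call

assert torch.allclose(avg_sum, avg_mean)       # both have shape [C]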

Comments

  • Abhishek Bhatia
    Abhishek Bhatia almost 2 years

    How do I sum over the columns of a tensor?

    torch.Size([10, 100])    --->    torch.Size([10])
    
  • obadul024
    obadul024 almost 4 years
    Thanks, this is a great answer. This dim var seems very counter-intuitive at first glance.
  • kkgarg
    kkgarg over 3 years
    A nice observation about the dimension of the resultant tensor: with keepdim=True, whichever dim we supply, the final tensor has size 1 along that particular axis, keeping the sizes of the remaining axes unchanged. This helps me especially to visualize how we would sum in the case of higher-dimensional tensors (see the sketch below).
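
To illustrate that last observation, a minimal sketch with a 3D tensor (the shape is illustrative):

import torch

t = torch.arange(24).view(2, 3, 4)

t.sum(0, keepdim=True).shape   # torch.Size([1, 3, 4])
t.sum(1, keepdim=True).shape   # torch.Size([2, 1, 4])
t.sum(2, keepdim=True).shape   # torch.Size([2, 3, 1])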