How to understand the term `tensor` in TensorFlow?


Solution 1

TensorFlow doesn't have first-class Tensor objects, meaning that there is no notion of a Tensor in the underlying graph that's executed by the runtime. Instead the graph consists of op nodes connected to each other, representing operations. An operation allocates memory for its outputs, which are available on endpoints :0, :1, etc., and you can think of each of these endpoints as a Tensor. If you have a tensor corresponding to nodename:0 you can fetch its value as sess.run(tensor) or sess.run('nodename:0'). Execution granularity is at the operation level, so the run method will execute the op, which will compute all of its endpoints, not just the :0 endpoint. It's possible to have an op node with no outputs (like tf.group), in which case there are no tensors associated with it. It is not possible to have tensors without an underlying op node.

You can examine what happens in the underlying graph by doing something like this:

import tensorflow as tf

tf.reset_default_graph()
value = tf.constant(1)
print(tf.get_default_graph().as_graph_def())

So with tf.constant you get a single operation node, and you can fetch it using sess.run("Const:0") or sess.run(value).
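
For instance, a minimal round trip through a session (TF 1.x; the node name Const assumes a fresh graph, as above):

import tensorflow as tf

tf.reset_default_graph()
value = tf.constant(1)
with tf.Session() as sess:
    print(sess.run(value))       # fetch by Tensor object -> 1
    print(sess.run("Const:0"))   # fetch by endpoint name -> 1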

Similarly, value = tf.placeholder(tf.int32) creates a regular node with name Placeholder, and you could feed it as feed_dict={"Placeholder:0": 2} or feed_dict={value: 2}. You cannot feed and fetch the same placeholder in the same session.run call, but you can see the result by attaching a tf.identity node on top and fetching that.
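
A minimal sketch of that tf.identity workaround (TF 1.x; the endpoint name Placeholder:0 assumes a fresh graph):

import tensorflow as tf

tf.reset_default_graph()
value = tf.placeholder(tf.int32)
value_out = tf.identity(value)  # extra op, so we fetch the identity rather than the placeholder
with tf.Session() as sess:
    print(sess.run(value_out, feed_dict={value: 2}))            # feed by Tensor object -> 2
    print(sess.run(value_out, feed_dict={"Placeholder:0": 3}))  # feed by endpoint name -> 3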

For a variable:

import tensorflow as tf

tf.reset_default_graph()
value = tf.Variable(tf.ones_initializer()(()))  # scalar variable initialized to 1.0
value2 = value + 3
print(tf.get_default_graph().as_graph_def())

You'll see that it creates two nodes, Variable and Variable/read; the :0 endpoint is a valid value to fetch on both of these nodes. However, Variable:0 has a special ref type, meaning it can be used as an input to mutating operations. The result of the Python call tf.Variable is a Python Variable object, and there's some Python magic to substitute Variable/read:0 or Variable:0 depending on whether mutation is necessary. Since most ops have only one endpoint, the :0 suffix is often dropped. Another example is Queue: its close() method will create a new Close op node which connects to the Queue op. To summarize: operations on Python objects like Variable and Queue map to different underlying TensorFlow op nodes depending on usage.
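
A small sketch of the ref/value distinction (TF 1.x; the node names Variable and Variable/read assume a fresh graph, as above):

import tensorflow as tf

tf.reset_default_graph()
value = tf.Variable(tf.ones_initializer()(()))
add_op = value.assign_add(3)  # mutating op, consumes the ref endpoint Variable:0
with tf.Session() as sess:
    sess.run(value.initializer)
    print(sess.run("Variable/read:0"))  # dereferenced value -> 1.0
    sess.run(add_op)
    print(sess.run(value))              # -> 4.0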

For ops like tf.split or tf.nn.top_k, which create nodes with multiple endpoints, the Python function automatically wraps the outputs in a tuple or collections.namedtuple of Tensor objects, which can be fetched individually.
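
For example, with tf.nn.top_k (TF 1.x; the default op name TopKV2 is an assumption about recent 1.x versions in a fresh graph):

import tensorflow as tf

tf.reset_default_graph()
values, indices = tf.nn.top_k(tf.constant([1., 3., 2.]), k=2)
with tf.Session() as sess:
    print(sess.run(values))       # endpoint :0, the values -> [3. 2.]
    print(sess.run("TopKV2:1"))   # endpoint :1, the indices -> [1 2]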

Solution 2

From the glossary:

A Tensor is a typed multi-dimensional array. For example, a 4-D array of floating point numbers representing a mini-batch of images with dimensions [batch, height, width, channel].
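
For instance, a tensor with that layout could be created like this (the sizes are just illustrative):

import tensorflow as tf

images = tf.zeros([32, 28, 28, 3])  # [batch, height, width, channel]
print(images.shape)  # (32, 28, 28, 3)
print(images.dtype)  # <dtype: 'float32'>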

Basically, all data in TensorFlow is a Tensor (hence the name):

  • placeholders are Tensors to which you can feed a value (with the feed_dict argument in sess.run())
  • Variables are Tensors which you can update (with var.assign()). Technically speaking, tf.Variable is not a subclass of tf.Tensor, though
  • tf.constant is just the most basic Tensor, which contains a fixed value given when you create it

However, in the graph, every node is an operation, which can have Tensors as inputs or outputs.
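
A quick way to see this operation/tensor split in code (TF 1.x; a minimal sketch):

import tensorflow as tf

tf.reset_default_graph()
x = tf.placeholder(tf.float32, name="x")  # x is the output Tensor of a Placeholder op
c = tf.constant(2.0, name="c")            # c is the output Tensor of a Const op
y = x * c                                 # y is the output Tensor of a Mul op
print(type(y))    # <class 'tensorflow.python.framework.ops.Tensor'>
print(y.op.type)  # Mul -- the graph node itself is the operation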

Solution 3

As already mentioned by others, yes, they are all tensors.

The way I understood them was to first visualize and understand 1D, 2D, 3D, 4D, 5D, and 6D tensors, as in the picture below (source: knoldus).

[image: tensor-definition]

Now, in the context of TensorFlow, you can imagine a computation graph like the one below.

[image: computation-graph]

Here, the Ops take two tensors, a and b, as input, multiply each tensor with itself, and then add the results of these multiplications to produce the result tensor t3. These multiplication and addition Ops happen at the nodes of the computation graph.

These tensors a and b can be constant tensors, Variable tensors, or placeholders. It doesn't matter, as long as they have the same data type and compatible (or broadcastable) shapes for the operations.
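
A hedged sketch of that graph in code (TF 1.x; the concrete values are just illustrative):

import tensorflow as tf

tf.reset_default_graph()
a = tf.constant(2.0)            # could equally be a Variable or a placeholder
b = tf.placeholder(tf.float32)
t1 = a * a                      # Mul node
t2 = b * b                      # Mul node
t3 = t1 + t2                    # Add node producing the result tensor
with tf.Session() as sess:
    print(sess.run(t3, feed_dict={b: 3.0}))  # -> 13.0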

Author: ZijunLost (Android, Computer Vision, a little bit of math)

Updated on June 07, 2022

Comments

  • ZijunLost, almost 2 years

    I am new to TensorFlow. While reading the existing documentation, I found the term tensor really confusing. I need to clarify the following questions:

    1. What is the relationship between tensor and Variable, tensor vs. tf.constant, and tensor vs. tf.placeholder?
    2. Are they all types of tensors?
  • ZijunLost, almost 8 years
    I don't agree. If you look at section 2 of the TensorFlow white paper: "In a TensorFlow graph, each node has zero or more inputs and zero or more outputs, and represents the instantiation of an operation". Variables, constants, and placeholders are nodes, i.e., instantiations of operations, just like tf.mul or tf.add. I think they produce tensors as output, but they themselves are not tensors.
  • Olivier Moindrot, almost 8 years
    Well yes, the graph is composed of operations, which pass Tensors between themselves. I will update my answer to explain the operations linked to tf.constant and tf.placeholder.
  • ZijunLost, almost 8 years
    Thanks, but I still believe it is better to call placeholders or constants 'operations' that produce tensors, instead of saying "placeholders are tensors". If you read the doc: "TensorFlow provides a placeholder operation that must be fed with data on execution. For more info, see the section on Feeding data."
  • Abhishek Bhatia, over 6 years
    What do you mean by first-class Tensor objects? There is a class called tf.Tensor. Could you explain with an example, if possible?
  • Yaroslav Bulatov, over 6 years
    There are no tensors in the underlying graph, just ops connected to other ops.
  • Abhishek Bhatia, over 6 years
    How do you access an operation? Can you only access the inputs and outputs of an operation directly?
  • Yaroslav Bulatov, over 6 years
    I may be being too vague; for a more precise explanation I recommend reading this paper: dl.acm.org/citation.cfm?doid=3088525.3088527
  • n1k31t4, over 6 years
    Thanks for this explanation! (+1) The second example doesn't work for me using tf.__version__ = 1.1.0: the Variable requires the positional argument shape.