# Getting Started With TensorFlow: A Brief Introduction

### If you're interested in deep learning, machine learning, and numerical computation, you need to know how to use TensorFlow. Here's a brief intro!


TensorFlow is an open source software library, provided by Google, mainly for deep learning, machine learning, and numerical computation using data flow graphs.

The first definition TensorFlow's website gives goes something like this:

> TensorFlow™ is an open source software library for numerical computation using data flow graphs. Nodes in the graph represent mathematical operations, while the graph edges represent the multidimensional data arrays (tensors) communicated between them. The flexible architecture allows you to deploy computation to one or more CPUs or GPUs in a desktop, server, or mobile device with a single API.

At first glance, this definition may seem confusing. But don't worry, just keep reading, and I promise that by the end of this post you will understand every word of it.

So, first question: what's a data flow graph?

## Data Flow Graphs

It's really simple: a data flow graph is just a graph that represents the data dependencies between a number of operations. Generally speaking, an operation can be any arithmetic operation; each operation forms a node of the graph, and its inputs and outputs form the edges. Let's look at a very basic example and sum the squares of two numbers, a and b.

```
t1 = a * a
t2 = b * b
t3 = t1 + t2
```

The DFG for the above operations looks like this: a and b each feed into a multiplication node, producing t1 and t2, which in turn feed into an addition node, producing t3.
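As a minimal sketch, the same computation can be expressed with TensorFlow operations (this assumes TensorFlow 2.x is installed and runs eagerly; the values 3.0 and 4.0 are just example inputs):

```python
import tensorflow as tf

a = tf.constant(3.0)
b = tf.constant(4.0)

t1 = tf.multiply(a, a)  # multiplication node; edge a flows in, edge t1 flows out
t2 = tf.multiply(b, b)  # multiplication node; edge b flows in, edge t2 flows out
t3 = tf.add(t1, t2)     # addition node; edges t1 and t2 flow in, t3 flows out

print(t3.numpy())  # 25.0
```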

Data flow graphs give TensorFlow several computational advantages, some of which are listed below:

- **Parallelism.** By using explicit edges to represent dependencies between operations, it is easy for the system to identify operations that can execute in parallel. For example, in our situation above, t1 and t2 can be computed in parallel.
- **Distributed execution.** By using explicit edges to represent the values that flow between operations, it is possible for TensorFlow to partition your program across multiple devices (CPUs, GPUs, and TPUs) attached to different machines. TensorFlow inserts the necessary communication and coordination between devices. For example, in our situation, t1 and t2 can be computed on different devices.
- **Compilation.** TensorFlow's XLA compiler can use the information in your dataflow graph to generate faster code, for example, by fusing together adjacent operations.
- **Portability.** The dataflow graph is a language-independent representation of the code in your model. You can build a dataflow graph in Python, store it in a SavedModel, and restore it in a C++ program for low-latency inference.

OK, so now we know what data flow graphs are and why TensorFlow uses them. Next, let's understand tensors, the basic unit of data in TensorFlow.

## Tensor

The central unit of data in TensorFlow is the **tensor**. A tensor consists of a set of primitive values shaped into an array of any number of dimensions. In simple terms, a tensor is just a multi-dimensional array. The rank of a tensor is the number of dimensions of the array: a 2-dimensional array is a rank 2 tensor, a 3-dimensional array is a rank 3 tensor, and so on. So, `[1, 2, 3]` is a 1-D array and thus a rank 1 tensor, while `[[1, 2, 3], [4, 5, 6]]` is a 2-D array and thus a rank 2 tensor.
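To make rank concrete, here is a small sketch using plain nested Python lists (no TensorFlow required); the helper name `rank_of` is ours, not a TensorFlow API. The rank is simply how deep the nesting goes:

```python
def rank_of(value):
    """Count the nesting depth of a nested list: 0 for a scalar."""
    rank = 0
    while isinstance(value, list):
        rank += 1
        value = value[0]  # descend one dimension
    return rank

print(rank_of(5))                       # 0 -> a scalar is a rank 0 tensor
print(rank_of([1, 2, 3]))               # 1 -> a rank 1 tensor
print(rank_of([[1, 2, 3], [4, 5, 6]]))  # 2 -> a rank 2 tensor
```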

The shape of a tensor describes how many elements each dimension of the tensor contains. For example, take a tensor such as `[[[1, 2, 3]], [[4, 5, 6]]]`. The outermost array consists of 2 elements, each of those 2 elements consists of 1 element, and each of those elements consists of 3 elements. So, the shape of this tensor is `[2, 1, 3]`.
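The same reasoning can be sketched in plain Python (no TensorFlow required); the helper name `shape_of` is ours, not a TensorFlow API, and it assumes a rectangular (non-ragged) nesting:

```python
def shape_of(value):
    """Return the shape of a nested list as a list of dimension sizes."""
    shape = []
    while isinstance(value, list):
        shape.append(len(value))  # record this dimension's size
        value = value[0]          # descend into the first element
    return shape

print(shape_of([[[1, 2, 3]], [[4, 5, 6]]]))  # [2, 1, 3]
print(shape_of([[1, 2, 3], [4, 5, 6]]))      # [2, 3]
```

Note that the rank is simply the length of the shape: a shape of `[2, 1, 3]` means a rank 3 tensor.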

OK, now that we have a basic idea about tensors, why is the software called TensorFlow? When we discussed data flow graphs, the edges were the numbers a, b, t1, t2, and t3, while the nodes were the arithmetic operations, multiplication and addition. In TensorFlow, the edges are tensors and the nodes are operations, which they call TensorFlow operations. So, tensors flow through the graph as inputs to various operations, which in turn produce new tensors as output. Thus, the name **TensorFlow**.

So, looking back at the definition, it now seems pretty simple to understand what they have written.

I will be posting another blog as a follow-up to this one, where I will give a practical example and code samples for TensorFlow, so stay tuned!


I hope this blog turned out to be helpful for you.

*This article was first published on the Knoldus blog.*


Published at DZone with permission of Akshansh Jain, DZone MVB. See the original article here.

