Hello all! This post is for anyone who wants to get started with TensorFlow. It’s a quick guide to understanding what TensorFlow is and how it works. This tutorial mainly consists of two parts:
- Basic intuition and theory to understand TensorFlow.
- A practical implementation to understand how to use TensorFlow.
1. Introduction to Tensorflow
TensorFlow is an open-source software library developed by the Google Brain team. It was created for large-scale tasks with heavy numerical computations, and its main applications are in the fields of machine learning and deep learning. TensorFlow is based on data flow graphs. Its flexible architecture lets us deploy computation on one or more CPUs or GPUs, on a desktop, a server, or even a mobile device (sounds great, right?). All of this can be done with a single API.
Since there are so many libraries out there, you may wonder why you should learn a new one now. The answer is simple:
- It provides a great API for both Python and C++; the Python API is more complete and easier to use.
- TensorFlow has a C++ backend, so computations run much faster than in pure Python.
- The main reason TensorFlow has been adopted so rapidly around the world is that it supports one or more CPUs and GPUs, as well as distributed processing across a cluster.
Why deep learning with Tensorflow?
- TensorFlow is widely used by deep learning experts because of its built-in support for deep learning and neural networks: it’s easy to assemble a network, assign parameters, and run the training process.
- It has a collection of simple, trainable mathematical functions useful for neural networks, and any gradient-based machine learning algorithm benefits from TensorFlow’s auto-differentiation and first-rate optimizers.
- It’s versatile thanks to its large collection of flexible tools.
- It provides a lot of flexibility, since it gives us control over the network structure and the functions used for processing.
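To make the auto-differentiation point concrete, here is a minimal sketch that asks TensorFlow for the gradient of a simple function. It uses the TF 1.x-style API through `tensorflow.compat.v1` so it also runs under TensorFlow 2; the function `f(x) = x² + 2x` is just an illustrative example, not anything from this tutorial:

```python
import tensorflow.compat.v1 as tf  # TF 1.x-style API; plain `import tensorflow as tf` on TF 1.x

tf.disable_eager_execution()  # needed on TF 2 to get graph mode

x = tf.Variable(3.0, name="x")
f = x * x + 2.0 * x            # f(x) = x^2 + 2x
grads = tf.gradients(f, [x])   # TensorFlow builds df/dx = 2x + 2 automatically

with tf.Session() as sess:
    sess.run(x.initializer)
    df_dx = sess.run(grads[0])

print(df_dx)  # 2*3 + 2 = 8.0
```

No manual calculus is required: `tf.gradients` walks the graph and emits the derivative nodes for you, which is exactly what the built-in optimizers rely on during training.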
What is Data Flow Graph?
It basically consists of two computational units:
1. Nodes: these represent mathematical operations.
2. Edges: these represent the multi-dimensional arrays (tensors) that flow between the nodes.
The standard usage is to first build a graph and then execute it to perform the computations. Once the graph is built, a session runs it, often inside a loop that drives the computation.
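The two units above can be seen in a tiny sketch. This is an illustrative example of my own (written against `tensorflow.compat.v1` so it also runs under TensorFlow 2), not code from this tutorial:

```python
import tensorflow.compat.v1 as tf  # TF 1.x-style graph API

tf.disable_eager_execution()  # graph mode on TF 2

# Build phase: each call below adds a node to the graph;
# the tensors flowing between nodes are the edges.
a = tf.constant(2)
b = tf.constant(3)
c = tf.add(a, b)  # an "add" node; edges carry a and b into it

# Execution phase: nothing has been computed yet —
# a session runs the graph and produces actual values.
with tf.Session() as sess:
    result = sess.run(c)

print(result)  # 5
```

Notice the split: defining `c` only wires up the graph; the addition happens when `sess.run(c)` is called.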
2. Creating your First Graph with Tensorflow
If you haven’t installed TensorFlow yet, check out this guide.
Now that everything is ready to use, let’s create our first graph!
Building a Graph
The following code creates a simple graph:
import tensorflow as tf

x = tf.Variable(3, name="x")
y = tf.Variable(4, name="y")
f = x*x*y + y + 2
That’s all there is to it! The most important thing to understand is that this code does not actually perform any computation, even though it looks like it does (especially the last line). It just creates a computation graph. In fact, even the variables are not initialized yet. To evaluate this graph, you need to open a TensorFlow session and use it to initialize the variables and evaluate f.
Running a session
A TensorFlow session takes care of placing the operations onto devices such as CPUs and GPUs and running them, and it holds all the variable values. The following code creates a session, initializes the variables, evaluates f, and then closes the session (which frees up resources):
>>> sess = tf.Session()
>>> sess.run(x.initializer)
>>> sess.run(y.initializer)
>>> result = sess.run(f)
>>> sess.close()
Using a “with” block
Having to repeat sess.run() all the time is a bit cumbersome, but fortunately there is a better way:
with tf.Session() as sess:
    x.initializer.run()
    y.initializer.run()
    result = f.eval()
Inside the with block, the session is set as the default session. Calling x.initializer.run() is equivalent to calling tf.get_default_session().run(x.initializer), and similarly f.eval() is equivalent to calling tf.get_default_session().run(f). This makes the code easier to read. Moreover, the session is automatically closed at the end of the block.
Global Variable Initializer
Instead of manually running the initializer for every single variable, you can use the global_variables_initializer() function. Note that it does not actually perform the initialization immediately, but rather creates a node in the graph that will initialize all variables when it is run:
init = tf.global_variables_initializer() # prepare an init node
with tf.Session() as sess:
    init.run()  # actually initialize all the variables
    result = f.eval()
In interactive environments like IPython and Jupyter notebooks, wrapping everything in a with block can be inconvenient, so there is an option to create an interactive session that sets itself as the default session on creation (you just have to close it manually when you are done):
>>> sess = tf.InteractiveSession()
>>> init.run()
>>> result = f.eval()
>>> sess.close()
To summarize, a TensorFlow program is typically split into two parts: the first part builds a computation graph (this is called the construction phase), and the second part runs it (the execution phase). The construction phase typically builds a computation graph representing the ML model and the computations required to train it. The execution phase generally runs a loop that evaluates a training step repeatedly (for example, one step per mini-batch), gradually improving the model parameters.
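The two phases can be sketched end to end with a tiny linear regression. This is an illustrative example of my own, not part of this tutorial: the data, learning rate, and step count are made up, and it is written against `tensorflow.compat.v1` so it also runs under TensorFlow 2:

```python
import numpy as np
import tensorflow.compat.v1 as tf  # TF 1.x-style API

tf.disable_eager_execution()

# --- Construction phase: build the graph ---
X = tf.placeholder(tf.float32, shape=(None,), name="X")
y = tf.placeholder(tf.float32, shape=(None,), name="y")
w = tf.Variable(0.0, name="w")
b = tf.Variable(0.0, name="b")
y_pred = w * X + b
loss = tf.reduce_mean(tf.square(y_pred - y))  # mean squared error
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)
init = tf.global_variables_initializer()

# --- Execution phase: run the training loop ---
x_data = np.array([0.0, 1.0, 2.0, 3.0], dtype=np.float32)
y_data = 2.0 * x_data + 1.0  # synthetic targets: true w = 2, b = 1

with tf.Session() as sess:
    init.run()
    for step in range(500):  # one training step per iteration
        sess.run(train_op, feed_dict={X: x_data, y: y_data})
    w_val, b_val = sess.run([w, b])

print(w_val, b_val)  # should approach 2.0 and 1.0
```

Everything above the data arrays only defines graph nodes; all the actual number-crunching happens inside the session, one `train_op` evaluation per loop iteration.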
You have now developed a basic intuition of what TensorFlow is and how to use it. If you want to get your hands dirty, your exercise is to implement a basic linear regression model using TensorFlow, upload it to GitHub, and share it with us.
Thanks for reading the article! Let me know what you think in the comments, and if you have any doubts regarding this post or any other, ask in our forums. Please subscribe and share the blog.