# Learn TensorFlow: Creating the Linear Regression Model in TensorFlow 2

### In this article, take a look at how to create the linear regression model in TensorFlow 2.


**Introduction to TensorFlow 2 (TF 2)**

According to the TensorFlow Guide, these are the major changes in TF 2:

- **API Cleanup**: removing redundant APIs such as `tf.app`, `tf.flags`, and `tf.logging`.
- **Eager Execution**: executing eagerly, like ordinary Python.
- **No more globals**: you keep track of your own variables.
- **Functions, not sessions**: you can decorate a Python function with `tf.function()` to mark it for JIT compilation so that TensorFlow runs it as a single graph.
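To illustrate that last point, here is a minimal sketch (my own example, not from the original article) of decorating a plain Python function with `tf.function` so TensorFlow traces it into a single graph:

```python
import tensorflow as tf

# The decorator marks the function for graph compilation;
# TensorFlow traces it on first call and reuses the graph afterwards.
@tf.function
def add_and_square(a, b):
    return tf.square(a + b)

result = add_and_square(tf.constant(2.0), tf.constant(3.0))
print(result.numpy())  # 25.0
```

Calling the decorated function feels exactly like calling a normal Python function, but no session is needed to run the graph.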

**Building Linear Regression in TF 2**

In one of my older articles, I introduced the linear regression algorithm and showed how to create a simple linear regression model using TensorFlow 1.x. In this post, I will rebuild that model using TensorFlow 2.x as follows:

```python
import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt

learning_rate = 0.01
# steps of looping through all your data to update the parameters
training_epochs = 100

# the training set (cast to float32 to match the tf.Variable dtype)
x_train = np.linspace(0, 10, 100).astype(np.float32)
y_train = x_train + np.random.normal(0, 1, 100).astype(np.float32)

w0 = tf.Variable(0.)
w1 = tf.Variable(0.)

# the hypothesis: a straight line
def h(x):
    y = w1 * x + w0
    return y

# mean squared error cost function
def squared_error(y_pred, y_true):
    return tf.reduce_mean(tf.square(y_pred - y_true))

# train model
for epoch in range(training_epochs):
    with tf.GradientTape() as tape:
        y_predicted = h(x_train)
        costF = squared_error(y_predicted, y_train)
    # get gradients
    gradients = tape.gradient(costF, [w1, w0])
    # compute and adjust weights
    w1.assign_sub(gradients[0] * learning_rate)
    w0.assign_sub(gradients[1] * learning_rate)

plt.scatter(x_train, y_train)
# plot the best fit line
plt.plot(x_train, h(x_train), 'r')
plt.show()
```
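As a quick sanity check (my own addition, not part of the original article), the weights learned by gradient descent should be close to the ordinary least-squares solution, which NumPy can compute in closed form with `np.polyfit`:

```python
import numpy as np

# data generated the same way as in the article (seeded for reproducibility)
np.random.seed(0)
x_train = np.linspace(0, 10, 100)
y_train = x_train + np.random.normal(0, 1, 100)

# closed-form least squares fit of y = w1*x + w0
w1_ols, w0_ols = np.polyfit(x_train, y_train, 1)
print(w1_ols, w0_ols)  # slope near 1, intercept near 0
```

Since the data were generated as y = x plus unit-variance noise, both the trained model and the closed-form fit should recover a slope near 1 and an intercept near 0.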

The result is a scatter plot of the training data with the best-fit line drawn in red.

We note some changes:

- We declare variables with `tf.Variable` but don't need `tf.global_variables_initializer`; that is, TensorFlow 2.0 doesn't make it mandatory to initialize variables.
- We train our model using `tf.GradientTape` and use `assign_sub` to update the weight variables.
- No session execution is required.
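To make the `tf.GradientTape` idea concrete in isolation (a standalone sketch, not code from the original post), the tape records operations on watched variables so that derivatives can be computed afterwards:

```python
import tensorflow as tf

x = tf.Variable(3.0)

# operations on x inside the tape's context are recorded
with tf.GradientTape() as tape:
    y = x * x  # y = x^2

# dy/dx = 2x, so the gradient at x = 3 is 6
dy_dx = tape.gradient(y, x)
print(dy_dx.numpy())  # 6.0
```

This is exactly the mechanism the training loop above relies on: compute the cost inside the tape, then ask the tape for the gradients with respect to the weights.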

**Conclusion**

TensorFlow is a great platform for deep learning and machine learning, and TF 2.0 focuses on simplicity and ease of use. In this post, I introduced some of the new changes in TF 2.0 by building a simple linear regression model from scratch (without using high-level APIs such as Keras), and I hope you feel excited about it.

Opinions expressed by DZone contributors are their own.
