Introduction to TensorFlow Fundamentals
This is the first lab in the Machine Learning with TensorFlow series. The primary focus of this lab is to help you understand the fundamental components of TensorFlow.
- Creating scalars, vectors, matrices, and tensors
- Inspecting different attributes of a tensor
- Performing basic operations on tensors
- Indexing and slicing tensors
- Reshaping and manipulating tensors
To get the most out of this series:
- You should know how to write code, ideally with some experience programming in Python.
- You should have a decent understanding of variables, linear equations, matrices, calculus, and statistics.
- Don't just stick to the examples and code I've provided; play around and break things. That's the best way to learn.
To try out the code snippets or complete the to-do tasks in each section, you can use a Google Colaboratory notebook. Colab is Google's hosted implementation of Jupyter notebooks, where you can run your code in the browser.
You can learn more about working with Colabs by following this notebook.
Reshaping & Manipulating Tensors
Another important fundamental concept you should learn is reshaping tensors, since you'll use this operation to prepare your data for input layers.
For example, if you have an image of shape [height, width, channels] and you want to add an outer batch axis to a single element, you can do that using reshape or expand_dims to make it of the shape [batch, height, width, channels].
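Here is a minimal sketch of that idea, using a dummy 28×28 RGB image (the sizes and names are illustrative, not from the lab's data):

```python
import tensorflow as tf

# a hypothetical single image: [height, width, channels]
image = tf.zeros([28, 28, 3])

# add an outer batch axis with expand_dims ...
batched = tf.expand_dims(image, axis=0)
print(batched.shape)  # (1, 28, 28, 3)

# ... or equivalently with reshape
batched2 = tf.reshape(image, [1, 28, 28, 3])
print(batched2.shape)  # (1, 28, 28, 3)
```

Either way, the result has the [batch, height, width, channels] layout that input layers typically expect.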
Important methods for reshaping and manipulating tensors include reshape(), expand_dims(), squeeze(), and stack(). Let's learn to code them:
reshape() takes the tensor you want to reshape and the shape you want to turn it into.
Use the same tensor t1, of shape (7,), as in the last section.
```python
# reshaping t1 from (7,) to (7,1)
tf.reshape(t1, [7, 1])
```
For a 2D tensor:
```python
# checking out for t2
t2 = tf.Variable([[1, 2.4], [3, 4], [4, 5], [6, 7]])

# reshaping t2 from (4,2) to (2,4)
tf.reshape(t2, [2, 4])
```
Note: you can only reshape a tensor if the new shape contains the same total number of elements as the original shape. For example, you can't reshape a tensor of shape (2, 2) into (5, 1). Try it!
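A minimal sketch of what happens when you try (in eager mode, TensorFlow raises an error for the incompatible shape; the except clause below catches it either way):

```python
import tensorflow as tf

t = tf.constant([[1, 2], [3, 4]])  # 2 * 2 = 4 elements

try:
    tf.reshape(t, [5, 1])  # asks for 5 elements: incompatible
    error_message = None
except (tf.errors.InvalidArgumentError, ValueError) as e:
    error_message = str(e)

print(error_message)
```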
Let's do some more of this:
```python
t3 = tf.constant([[[ 0,  1,  2,  3,  4],
                   [ 5,  6,  7,  8,  9]],
                  [[10, 11, 12, 13, 14],
                   [15, 16, 17, 18, 19]],
                  [[20, 21, 22, 23, 24],
                   [25, 26, 27, 28, 29]]])

# flatten a 3D tensor: -1 tells reshape to infer the dimension
print(tf.reshape(t3, [-1]))

# three rows, and -1 infers whatever fits
print(tf.reshape(t3, [3, -1]))
```
expand_dims() returns a tensor with a length-1 axis inserted at the index given by the axis argument.
```python
# expanding dimensions at index 0
tf.expand_dims(t1, axis=0)
```
This changes the shape of t1 from (7,) to (1, 7) because we added the axis at index 0. Try adding it at -1 or 1.
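For example (assuming t1 is the shape-(7,) tensor from earlier; here it's recreated with tf.range so the snippet is self-contained):

```python
import tensorflow as tf

t1 = tf.range(7)  # shape (7,), stands in for the earlier t1

print(tf.expand_dims(t1, axis=0).shape)   # (1, 7)
print(tf.expand_dims(t1, axis=1).shape)   # (7, 1)
print(tf.expand_dims(t1, axis=-1).shape)  # (7, 1), -1 means the last position
```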
squeeze() removes dimensions of size 1 from the shape of a tensor.
```python
# the counterpart of expand_dims is squeeze
expanded_t1 = tf.expand_dims(t1, 1)
print(expanded_t1)

# squeezing removes the size-1 axis added above, restoring shape (7,)
print(tf.squeeze(expanded_t1, axis=1))
```
Learn more about squeeze here.
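One detail worth knowing (a minimal sketch with a made-up shape): when called without an axis argument, squeeze() removes every size-1 dimension at once.

```python
import tensorflow as tf

t = tf.zeros([1, 3, 1, 2])

print(tf.squeeze(t).shape)          # (3, 2): all size-1 axes removed
print(tf.squeeze(t, axis=0).shape)  # (3, 1, 2): only axis 0 removed
```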
stack() stacks a list of rank-R tensors into one rank-(R+1) tensor.
Say you have three tensors of shape (3, 3); stack() will pack them all together along the axis you specify and return a single rank-3 tensor.
```python
a = tf.constant([1, 7])  # shape (2,)
b = tf.constant([2, 8])
c = tf.constant([3, 9])

print(tf.stack([a, b, c]))          # axis=0 by default, shape (3, 2)
print(tf.stack([a, b, c], axis=1))  # shape (2, 3)
```
Check the difference for yourself after running it!
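To make it easier to spot where the new axis lands, here is a sketch that stacks two rank-2 tensors, so the stacked axis (of size 2) stands out in each resulting shape:

```python
import tensorflow as tf

x = tf.ones([3, 3])
y = tf.zeros([3, 3])

print(tf.stack([x, y], axis=0).shape)  # (2, 3, 3)
print(tf.stack([x, y], axis=1).shape)  # (3, 2, 3)
print(tf.stack([x, y], axis=2).shape)  # (3, 3, 2)
```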
Don't let the learning stop!
At this point, you should be well versed in what tensors are, how to create and inspect them, and how to reshape and manipulate them. But this is just the beginning; your journey has gotten off to a good start. Build on this momentum now!
Take a look at what else you can achieve with tensors and the next steps in your ML journey!