Tensorflow 2.0 in 2 minutes
Tensorflow 2.0-alpha was released a couple of days ago with a bunch of exciting features. It can be installed with the following command:
pip install -U --pre tensorflow
In this post I explore the 17 key features among them. The aim is to keep it short and crisp while still touching on all the major points.
- Improvement in tf.keras high-level API: Tensorflow 2.0 takes the compatibility between imperative Keras and graph-driven Tensorflow to the next level by adopting tf.keras at its core. This makes prototyping and productionizing deep learning models faster. Keras being more intuitive, this should also engage and bring more developers towards deep learning.
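To see how central tf.keras is, here is a minimal model sketch (layer sizes are illustrative, not from the post):

```python
import tensorflow as tf

# A minimal tf.keras model -- the same API now serves both quick
# prototyping and production in TF 2.0.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.summary()  # prints the layer-by-layer parameter counts
```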
- Eager execution by default: No need to create a session to execute the graph. TF 2.0 enables eager execution by default, moving all the session-related boilerplate under the hood.
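A quick sketch of what this means in practice: ops run immediately and return concrete values, with no Session in sight.

```python
import tensorflow as tf

# With eager execution on by default, ops execute as they are called.
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.constant([[1.0, 0.0], [0.0, 1.0]])  # identity matrix
c = tf.matmul(a, b)
print(c.numpy())  # the result is available right away
```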
- Compact and improved documentation: TF 2.0 offers better organized documentation. Most of it is available here: https://www.tensorflow.org/versions/r2.0/api_docs/python/tf
- Clarity: 2.0 takes clarity to the next level by removing duplicate functionality, such as the multiple versions of GRU and LSTM cells that were previously available. 2.0 takes care of choosing the optimal kernel for the hardware, giving the developer a single unified implementation to choose from: one LSTM and one GRU.
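A hedged sketch of the unified layer: one tf.keras.layers.LSTM, with TF selecting the fast kernel (e.g. the cuDNN one on GPU) behind the scenes.

```python
import tensorflow as tf

# Single LSTM implementation in 2.0; the backend kernel is chosen
# automatically based on the available hardware.
lstm = tf.keras.layers.LSTM(32)
out = lstm(tf.zeros([2, 5, 8]))  # input: (batch, timesteps, features)
print(out.shape)  # (batch, units)
```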
- Low-level API: The full low-level API is available in tf.raw_ops, with inheritable interfaces for variables, checkpoints and layers so you can define your own components.
- Easy upgrade: A conversion script, tf_upgrade_v2, converts TF 1.0 code into TF 2.0 code automatically. Just write: !tf_upgrade_v2 --infile <input_file> --outfile <output_file>
- Backward compatibility: Comes with a separate backward-compatibility module, tf.compat.v1, for accessing the older components.
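A hedged sketch of what this looks like: 1.x-style session code can still run under 2.0 via tf.compat.v1.

```python
# Legacy 1.x-style code running through the compatibility module.
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # restore 1.x graph-mode behaviour
x = tf.placeholder(tf.float32, shape=(None,))
y = x * 2.0
with tf.Session() as sess:
    result = sess.run(y, feed_dict={x: [1.0, 2.0]})
print(result)
```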
- One optimizers module, one losses module, one layers module: A unified optimizers module in tf.keras.optimizers.*, and similarly one losses module and one layers module under tf.keras.losses.* and tf.keras.layers.*.
- Better graphical visualization: 2.0 gives better graph visualizations in Tensorboard even for keras models.
- Easy to distribute: Provides more options for scaling and multi-GPU training via the tf.distribute.Strategy module:
  strategy = tf.distribute.MirroredStrategy()
  <define the model here>
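The snippet above can be sketched out as follows (model definition is illustrative; on a CPU-only machine the strategy falls back to a single replica):

```python
import tensorflow as tf

# Variables created inside strategy.scope() are mirrored across
# the available GPUs.
strategy = tf.distribute.MirroredStrategy()
print("Replicas in sync:", strategy.num_replicas_in_sync)

with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(1, input_shape=(4,)),
    ])
    model.compile(optimizer="sgd", loss="mse")
```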
- Save and import Keras models: Easy to save and load Keras models using tf.keras.experimental.*.
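A sketch of the round trip using the stable save/load path (the tf.keras.experimental.* export APIs mentioned above follow the same pattern):

```python
import tensorflow as tf

# Save a model to a single HDF5 file and restore it.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(3,))])
model.save("my_model.h5")

restored = tf.keras.models.load_model("my_model.h5")
print(restored.count_params())  # 3 kernel weights + 1 bias
```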
- Run Keras on TPUs: 2.0 comes with tf.distribute.experimental.TPUStrategy(), which allows Keras code to run on TPUs.
- New datasets available: 2.0 comes with new datasets in the vision, audio and text domains to test models on.
- More pre-trained models at TF Hub: More pre-trained models from the worlds of NLP and vision are available at Tensorflow Hub.
- Improved error reporting: Improved error reporting with exact line number and full call stack.
- Tensorflow Federated: TF Federated supports federated learning on edge devices.
- Swift support and fast.ai: 2.0 is to come with a Swift library; Jeremy Howard will be delivering a course on it.
Source: Tensorflow Dev Summit 2019