# -*- coding: utf-8 -*-
"""graph.ipynb

Automatically generated by Colaboratory.

Original file is located at
    https://colab.research.google.com/drive/1ibfKtpxC_hIhZlPbefCoqpAS7jTdyiFw

## Introduction to TensorFlow graphs.

For a long time, the big complaint about TensorFlow was *it's not flexible for debugging!* With the advent of TensorFlow 2.0, that changed drastically.

Now TensorFlow allows you to run operations **eagerly**: you can call TensorFlow operations from Python and get the outputs back in Python immediately. That creates a lot of flexibility, especially for debugging.

But there are merits in NOT running eagerly. You can run operations as TensorFlow graphs, which in some scenarios leads to significant speedups. According to TensorFlow:

> Graphs are data structures that contain a set of [tf.Operation](https://www.tensorflow.org/api_docs/python/tf/Operation) objects, which represent units of computation; and [tf.Tensor](https://www.tensorflow.org/api_docs/python/tf/Tensor) objects, which represent the units of data that flow between operations. They are defined in a [tf.Graph](https://www.tensorflow.org/api_docs/python/tf/Graph) context. Since these graphs are data structures, they can be saved, run, and restored all without the original Python code.

Let's look at some examples of transforming functions into graphs!
"""

# Loading necessary libraries
import tensorflow as tf
import numpy as np
import timeit

"""### Operation

We can put a Python function on a graph with the [@tf.function](https://www.tensorflow.org/api_docs/python/tf/function) decorator.
"""

@tf.function
def multiply_fn(a, b):
  return tf.matmul(a, b)

# Create some tensors
a = tf.constant([[0.5, 0.5]])
b = tf.constant([[10.0], [1.0]])

# Check function
print('Multiply a of shape {} with b of shape {}'.format(a.shape, b.shape))
print(multiply_fn(a, b).numpy())
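
"""Since `multiply_fn` is now backed by a graph, we can inspect that graph as a
data structure. A minimal sketch using the standard `tf.function` APIs
`get_concrete_function` and `graph.get_operations`:
"""

# Trace the function for these particular input shapes and dtypes
concrete_fn = multiply_fn.get_concrete_function(a, b)

# The underlying tf.Graph and the operations it contains
print(concrete_fn.graph)
print([op.name for op in concrete_fn.graph.get_operations()])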

# Function without being taken to a graph, i.e., with eager execution.
def add_fn(a, b):
  return tf.add(a, b)

# Create some tensors
a = tf.constant([[0.5, 0.5]])
b = tf.constant([[10.0], [1.0]])

# Check function
print('Add a of shape {} with b of shape {}'.format(a.shape, b.shape))
print(add_fn(a, b).numpy())
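
"""For debugging, we can force functions decorated with `@tf.function` to run
eagerly again. A minimal sketch, assuming TensorFlow 2.3+ where
`tf.config.run_functions_eagerly` is available:
"""

# Globally force eager execution, even for @tf.function-decorated functions
tf.config.run_functions_eagerly(True)
print(multiply_fn(a, b).numpy())

# Restore the default graph-based behavior
tf.config.run_functions_eagerly(False)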

"""### Speedup

Now let's define a custom model and run it:

1. eagerly
2. on graph

To check how to define models refer to: https://www.tensorflow.org/api_docs/python/tf/keras/Model
"""

class ModelShallow(tf.keras.Model):

  def __init__(self):
    super(ModelShallow, self).__init__()
    self.dense1 = tf.keras.layers.Dense(10, activation=tf.nn.relu)
    self.dense2 = tf.keras.layers.Dense(20, activation=tf.nn.relu)
    self.dense3 = tf.keras.layers.Dense(30, activation=tf.nn.softmax)
    self.dropout = tf.keras.layers.Dropout(0.5)

  def call(self, inputs, training=False):
    x = self.dense1(inputs)
    if training:
      x = self.dropout(x, training=training)
    x = self.dense2(x)
    out = self.dense3(x)
    return out

class ModelDeep(tf.keras.Model):

  def __init__(self):
    super(ModelDeep, self).__init__()
    self.dense1 = tf.keras.layers.Dense(1000, activation=tf.nn.relu)
    self.dense2 = tf.keras.layers.Dense(2000, activation=tf.nn.relu)
    self.dense3 = tf.keras.layers.Dense(3000, activation=tf.nn.softmax)
    self.dropout = tf.keras.layers.Dropout(0.5)

  def call(self, inputs, training=False):
    x = self.dense1(inputs)
    if training:
      x = self.dropout(x, training=training)
    x = self.dense2(x)
    out = self.dense3(x)
    return out

# Create the model (eager execution by default)
model_shallow_with_eager = ModelShallow()

# Take the model to a graph.
# NOTE: Instead of using the decorator, we can directly apply tf.function to the model.
model_shallow_on_graph = tf.function(ModelShallow())

# Do the same for the deep model
model_deep_with_eager = ModelDeep()
model_deep_on_graph = tf.function(ModelDeep())
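
"""Another option, shown here as a minimal sketch: decorate the model's `call`
method with `@tf.function` instead of wrapping the whole model.
`ModelShallowDecorated` is a hypothetical name used for illustration.
"""

class ModelShallowDecorated(ModelShallow):

  # Decorating `call` traces the forward pass into a graph on first use
  @tf.function
  def call(self, inputs, training=False):
    return super().call(inputs, training=training)

model_shallow_decorated = ModelShallowDecorated()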

# sample input
sample_input = tf.random.uniform([60, 28, 28])

# Check time for shallow model
print("Shallow Model - Eager execution time:", timeit.timeit(lambda: model_shallow_with_eager(sample_input), number=1000))
print("Shallow Model - Graph-based execution time:", timeit.timeit(lambda: model_shallow_on_graph(sample_input), number=1000))

# Check time for deep model
print("Deep Model - Eager execution time:", timeit.timeit(lambda: model_deep_with_eager(sample_input), number=100))
print("Deep Model - Graph-based execution time:", timeit.timeit(lambda: model_deep_on_graph(sample_input), number=100))