TensorFlow Sample

Requirements

  • Authenticated to gcloud (gcloud auth application-default login)

This notebook demonstrates how to deploy an iris classifier built with TensorFlow Estimators using Merlin.

[ ]:
!pip install --upgrade -r requirements.txt > /dev/null
[ ]:
import merlin
import warnings
import os
import tensorflow as tf
import pandas as pd
from merlin.model import ModelType
warnings.filterwarnings('ignore')
[ ]:
tf.__version__

1. Initialize Merlin Resources

1.1 Set Merlin Server

[ ]:
merlin.set_url("localhost:3000/api/merlin")

1.2 Set Active Project

project represents a project in real life. You may have multiple models within a project.

merlin.set_project(<project_name>) will set the active project to the one matching the given name. You can only set it to an existing project. If you would like to create a new project, please do so from the MLP console at http://localhost:3000/projects/create.

[ ]:
merlin.set_project("sample")

1.3 Set Active Model

model represents an abstract ML model. Conceptually, a model in Merlin is similar to a class in a programming language. To instantiate a model you’ll have to create a model_version.

Each model has a type. The model types currently supported by Merlin are: sklearn, xgboost, tensorflow, pytorch, and user-defined models (i.e. pyfunc models).

model_version represents a snapshot of a particular model iteration. You’ll be able to attach information such as metrics and tags to a given model_version, as well as deploy it as a model service.

merlin.set_model(<model_name>, <model_type>) will set the active model to the given name and type; if a model with that name is not found, a new model will be created.

[ ]:
merlin.set_model("tensorflow-sample", ModelType.TENSORFLOW)

2. Train Model

2.1 Prepare Train and Test Set

[ ]:
CSV_COLUMN_NAMES = ['sepal_length', 'sepal_width', 'petal_length', 'petal_width', 'species']
SPECIES = ['Setosa', 'Versicolor', 'Virginica']

train_path = tf.keras.utils.get_file(
    "iris_training.csv", "https://storage.googleapis.com/download.tensorflow.org/data/iris_training.csv")
test_path = tf.keras.utils.get_file(
    "iris_test.csv", "https://storage.googleapis.com/download.tensorflow.org/data/iris_test.csv")

train = pd.read_csv(train_path, names=CSV_COLUMN_NAMES, header=0)
test = pd.read_csv(test_path, names=CSV_COLUMN_NAMES, header=0)
train_y = train.pop('species')
test_y = test.pop('species')

# The label column has now been removed from the features.
train.head()

2.2 Create Input Function

[ ]:
def input_fn(features, labels, training=True, batch_size=256):
    """An input function for training or evaluating"""
    # Convert the inputs to a Dataset.
    dataset = tf.data.Dataset.from_tensor_slices((dict(features), labels))

    # Shuffle and repeat if you are in training mode.
    if training:
        dataset = dataset.shuffle(1000).repeat()

    return dataset.batch(batch_size)
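
If you are running TensorFlow 2.x with eager execution, you can optionally sanity-check the input function by pulling a single batch from the evaluation pipeline. This cell is an optional sketch and is not required for training:

[ ]:
# Optional sanity check (assumes TF 2.x eager execution): inspect one batch.
for feature_batch, label_batch in input_fn(test, test_y, training=False).take(1):
    print({name: tensor.shape for name, tensor in feature_batch.items()})
    print(label_batch.shape)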

2.3 Define Feature Columns

[ ]:
my_feature_columns = []
for key in train.keys():
    my_feature_columns.append(tf.feature_column.numeric_column(key=key))

print(my_feature_columns)

2.4 Build Estimators

[ ]:
# Build a DNN with 2 hidden layers of 30 and 10 nodes respectively.
classifier = tf.estimator.DNNClassifier(
    feature_columns=my_feature_columns,
    # Two hidden layers of 30 and 10 nodes respectively.
    hidden_units=[30, 10],
    # The model must choose between 3 classes.
    n_classes=3)

2.5 Train Estimator

[ ]:
classifier.train(
    input_fn=lambda: input_fn(train, train_y, training=True),
    steps=5000)
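
With the estimator trained, the held-out test set prepared in section 2.1 can be used to measure accuracy. This is the standard Estimator evaluate call and is optional:

[ ]:
# Evaluate the trained classifier on the held-out test set.
eval_result = classifier.evaluate(
    input_fn=lambda: input_fn(test, test_y, training=False))
print('Test set accuracy: {accuracy:0.3f}'.format(**eval_result))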

2.6 Serialize Model

[ ]:
# Define the input receiver for the raw tensors.
# Note: tf.placeholder is TF 1.x graph-mode API; under TF 2.x use
# tf.compat.v1.placeholder instead.
def serving_input_fn():
    feature_spec = {
      'petal_length': tf.placeholder(dtype=tf.float32, shape=[None, 1], name='petal_length'),
      'petal_width' : tf.placeholder(dtype=tf.float32, shape=[None, 1], name='petal_width'),
      'sepal_length': tf.placeholder(dtype=tf.float32, shape=[None, 1], name='sepal_length'),
      'sepal_width' : tf.placeholder(dtype=tf.float32, shape=[None, 1], name='sepal_width'),
    }
    return tf.estimator.export.build_raw_serving_input_receiver_fn(feature_spec)()
[ ]:
classifier.export_saved_model('tensorflow-model', serving_input_fn)
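
export_saved_model writes a timestamped SavedModel directory under tensorflow-model/. As an optional check, you can list the export directory to confirm the artifact exists before uploading it:

[ ]:
# Optional: confirm that a timestamped SavedModel directory was created.
import os
print(os.listdir('tensorflow-model'))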

3. Upload and Deploy Model

[ ]:
with merlin.new_model_version() as v:
    v.log_model(model_dir='tensorflow-model')
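
As mentioned in section 1.3, a model version can also carry metrics and parameters alongside the model artifact. The sketch below assumes the Merlin SDK exposes merlin.log_metric and merlin.log_param for the active model version; verify these helpers against your SDK version before using them:

[ ]:
# Sketch only: an alternative version of the cell above that also logs a
# metric and a parameter. log_metric/log_param are assumed Merlin SDK helpers;
# adjust the calls if your SDK version differs.
with merlin.new_model_version() as v:
    v.log_model(model_dir='tensorflow-model')
    merlin.log_metric("test_accuracy", float(eval_result["accuracy"]))
    merlin.log_param("hidden_units", "30,10")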

3.1 Deploy Model

[ ]:
endpoint = merlin.deploy(v)
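
merlin.deploy returns an endpoint object whose url attribute points to the deployed model; printing it is a quick way to confirm the deployment before sending a test request:

[ ]:
# Show the HTTP endpoint of the deployed model.
print(endpoint.url)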

3.2 Send Test Request

[ ]:
%%bash -s "$endpoint.url"
curl -v -X POST $1 -d '{
  "signature_name" : "predict",
  "instances": [
    {"sepal_length":2.8, "sepal_width":1.0, "petal_length":6.8, "petal_width":0.4},
    {"sepal_length":0.1, "sepal_width":0.5, "petal_length":1.8, "petal_width":2.4}
  ]
}'
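
As an alternative to curl, the same request can be sent from Python. The sketch below uses the requests library, which is an assumption here (it is not installed by this notebook's requirements):

[ ]:
# Alternative to the curl cell above: send the same test request from Python.
# Note: `requests` is assumed to be available in the environment.
import requests

payload = {
    "signature_name": "predict",
    "instances": [
        {"sepal_length": 2.8, "sepal_width": 1.0, "petal_length": 6.8, "petal_width": 0.4},
        {"sepal_length": 0.1, "sepal_width": 0.5, "petal_length": 1.8, "petal_width": 2.4}
    ]
}
# endpoint.url may not include the scheme; prepend "http://" if needed.
url = endpoint.url if endpoint.url.startswith("http") else "http://" + endpoint.url
response = requests.post(url, json=payload)
print(response.json())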

3.3 Delete Deployment

[ ]:
merlin.undeploy(v)