MNIST in TensorFlow

This directory builds a convolutional neural net to classify the MNIST dataset using the tf.data, tf.estimator.Estimator, and tf.layers APIs.
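As a rough illustration of that pattern, here is a simplified sketch of a tf.layers convolutional model wired into an Estimator model_fn. It is illustrative only; the actual network, hyperparameters, and feature names in mnist.py differ.

import tensorflow as tf

def model_fn(features, labels, mode):
    # Reshape flat MNIST pixels into NHWC images.
    x = tf.reshape(features, [-1, 28, 28, 1])
    x = tf.layers.conv2d(x, filters=32, kernel_size=5, padding='same',
                         activation=tf.nn.relu)
    x = tf.layers.max_pooling2d(x, pool_size=2, strides=2)
    logits = tf.layers.dense(tf.layers.flatten(x), 10)

    if mode == tf.estimator.ModeKeys.PREDICT:
        predictions = {'classes': tf.argmax(logits, axis=1),
                       'probabilities': tf.nn.softmax(logits)}
        return tf.estimator.EstimatorSpec(mode, predictions=predictions)

    loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)
    if mode == tf.estimator.ModeKeys.TRAIN:
        train_op = tf.train.AdamOptimizer(1e-4).minimize(
            loss, global_step=tf.train.get_or_create_global_step())
        return tf.estimator.EstimatorSpec(mode, loss=loss, train_op=train_op)
    return tf.estimator.EstimatorSpec(mode, loss=loss)  # EVAL

# mnist.py builds the tf.data input pipeline and passes a model_fn like this
# to tf.estimator.Estimator; the sketch above only shows the model_fn shape.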

Setup

To begin, you'll need TensorFlow installed. Note that the sample targets the TensorFlow 1.x APIs (the containers below use 1.13.1). Make sure you've also added the models folder to your Python path; otherwise you may encounter an error like ImportError: No module named mnist.
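For example, you can make the module importable from Python like this (the path below is a placeholder for wherever you cloned the repository; exporting PYTHONPATH in your shell works just as well):

import sys

# Hypothetical location; point this at the folder that contains the mnist module.
sys.path.append('/path/to/models')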

Then to train the model, run the following:

python mnist.py

Distributed Training on Gradient

Just run the example with the following parameters:

  "name": "Mnist Sample",
  "projectHandle": "<your project handle",
  "parameterServerContainer": "tensorflow/tensorflow:1.13.1-gpu-py3",
  "parameterServerMachineType": "K80",
  "parameterServerCount": 1,
  "workerCommand": "python mnist.py",
  "workerContainer": "tensorflow/tensorflow:1.13.1-gpu-py3",
  "workspaceUrl": "git+https://github.com/paperspace/mnist-sample.git",
  "workerMachineType": "K80",
  "workerCount": 2,
  "parameterServerCommand": "python mnist.py"

Gradient generates a base64-encoded TF_CONFIG environment variable for each node, so all you need to do in your own projects is decode and parse it:

import base64, json, os
paperspace_tf_config = json.loads(base64.urlsafe_b64decode(os.environ.get('TF_CONFIG')).decode('utf-8'))
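If your training code relies on tf.estimator reading TF_CONFIG directly (an assumption about your setup, not something the sample requires), one option is to write the decoded config back to the environment as plain JSON before constructing the Estimator:

import base64
import json
import os

raw = os.environ.get('TF_CONFIG')
if raw:
    # Gradient supplies TF_CONFIG base64-encoded, so decode it first.
    tf_config = json.loads(base64.urlsafe_b64decode(raw).decode('utf-8'))
    # tf.estimator.RunConfig expects TF_CONFIG to be plain JSON in the
    # environment, so re-export the decoded version.
    os.environ['TF_CONFIG'] = json.dumps(tf_config)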

Exporting the model

You can export the model into the TensorFlow SavedModel format by passing the --export_dir argument:

python mnist.py --export_dir /tmp/mnist_saved_model
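Inside the script, an Estimator export like this typically goes through a serving input receiver. A minimal sketch of that pattern follows; the input name and shape are assumptions for illustration, not necessarily what mnist.py uses.

import tensorflow as tf

def serving_input_receiver_fn():
    # 'image' is a hypothetical input name; check mnist.py for the real one.
    image = tf.placeholder(tf.float32, [None, 28, 28], name='image')
    return tf.estimator.export.ServingInputReceiver({'image': image},
                                                    {'image': image})

# Assuming `classifier` is the trained tf.estimator.Estimator:
# classifier.export_savedmodel('/tmp/mnist_saved_model', serving_input_receiver_fn)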

Training the model for use with TensorFlow Serving on a CPU

If you are training TensorFlow on a GPU but would like to export the model for use in TensorFlow Serving on a CPU-only server, you can train and/or export the model with --data_format=channels_last (the channels_first layout that is fastest on GPUs is not supported by most CPU kernels):

python mnist.py --data_format=channels_last

The SavedModel will be saved in a timestamped directory under /tmp/mnist_saved_model/ (e.g. /tmp/mnist_saved_model/1513630966/).

Getting predictions with SavedModel

Use saved_model_cli to inspect and execute the SavedModel.
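If you prefer to test the exported model from Python instead, here is a minimal sketch; the export path and input key below are placeholders, and saved_model_cli show --dir <export_dir> --all will tell you the real signature names.

import numpy as np
from tensorflow.contrib import predictor

# Hypothetical timestamped export directory; substitute your own.
export_dir = '/tmp/mnist_saved_model/1513630966'
predict_fn = predictor.from_saved_model(export_dir)

# 'image' is an assumed input key; confirm it with saved_model_cli.
dummy = np.zeros((1, 28, 28), dtype=np.float32)
print(predict_fn({'image': dummy}))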
