Save and Load Your Keras Deep Learning Models

Keras is a simple and powerful Python library for deep learning.

Given that deep learning models can take hours, days and even weeks to train, it is important to know how to save and load them from disk.

In this post you will discover how you can save your Keras models to file and load them up again to make predictions.

Let’s get started.

Photo by art_inthecity, some rights reserved.

Tutorial Overview

Keras separates the concerns of saving your model architecture and saving your model weights.

Model weights are saved to HDF5 format. This is a grid format that is ideal for storing multi-dimensional arrays of numbers.
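To see why HDF5 suits weight storage, here is a minimal, self-contained sketch of storing and reloading multi-dimensional arrays with the h5py library (the same library Keras uses under the hood for weight files). The dataset names and array shapes below are made up for illustration:

```python
import h5py
import numpy

# Write a couple of multi-dimensional arrays to an HDF5 file,
# similar in spirit to how Keras stores layer weight matrices.
weights = numpy.random.rand(8, 12)   # e.g. an 8x12 weight matrix
biases = numpy.zeros(12)             # e.g. a bias vector

with h5py.File("example.h5", "w") as f:
    f.create_dataset("dense_1/W", data=weights)
    f.create_dataset("dense_1/b", data=biases)

# Read one of the arrays back and confirm it round-trips exactly
with h5py.File("example.h5", "r") as f:
    restored = f["dense_1/W"][:]

print(restored.shape)  # (8, 12)
```

Arrays of any rank round-trip losslessly, which is exactly what a stack of layer weight matrices needs.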

The model structure can be described and saved using two different formats: JSON and YAML.

In this post we are going to look at two examples of saving and loading your model to file:

  • Save Model to JSON.
  • Save Model to YAML.

Each example will also demonstrate saving and loading your model weights to HDF5 formatted files.

The examples will use the same simple network trained on the Pima Indians onset of diabetes binary classification dataset. This is a small dataset that contains all numerical data and is easy to work with. You can download this dataset and place it in your working directory.

Save Your Neural Network Model to JSON

JSON is a simple file format for describing data hierarchically.

Keras provides the ability to describe any model using JSON format with a to_json() function. This can be saved to file and later loaded via the model_from_json() function that will create a new model from the JSON specification.

The weights are saved directly from the model using the save_weights() function and later loaded using the symmetrical load_weights() function.

The example below trains and evaluates a simple model on the Pima Indians dataset. The model is then converted to JSON format and written to model.json in the local directory. The network weights are written to model.h5 in the local directory.

The model and weight data are loaded from the saved files, and a new model is created. It is important to compile the loaded model before it is used, so that predictions made with the model can use the appropriate efficient computation from the Keras backend.

The model is evaluated in the same way, printing the same evaluation score.

# MLP for Pima Indians Dataset serialized to JSON and HDF5
from keras.models import Sequential
from keras.layers import Dense
from keras.models import model_from_json
import numpy
# fix random seed for reproducibility
seed = 7
numpy.random.seed(seed)
# load pima indians dataset
dataset = numpy.loadtxt("pima-indians-diabetes.csv", delimiter=",")
# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]
# create model
model = Sequential()
model.add(Dense(12, input_dim=8, init='uniform', activation='relu'))
model.add(Dense(8, init='uniform', activation='relu'))
model.add(Dense(1, init='uniform', activation='sigmoid'))
# Compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
# Fit the model
model.fit(X, Y, nb_epoch=150, batch_size=10, verbose=0)
# evaluate the model
scores = model.evaluate(X, Y, verbose=0)
print("%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))

# serialize model to JSON
model_json = model.to_json()
with open("model.json", "w") as json_file:
    json_file.write(model_json)
# serialize weights to HDF5
model.save_weights("model.h5")
print("Saved model to disk")

# later...

# load json and create model
json_file = open('model.json', 'r')
loaded_model_json = json_file.read()
json_file.close()
loaded_model = model_from_json(loaded_model_json)
# load weights into new model
loaded_model.load_weights("model.h5")
print("Loaded model from disk")

# evaluate loaded model on test data
loaded_model.compile(loss='binary_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
score = loaded_model.evaluate(X, Y, verbose=0)
print("%s: %.2f%%" % (loaded_model.metrics_names[1], score[1]*100))

Running this example provides the output below.

acc: 79.56%
Saved model to disk
Loaded model from disk
acc: 79.56%
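The loaded model can also be used to make new predictions, not only to re-run the evaluation. With a sigmoid output layer, predict() returns one probability between 0 and 1 per input row, which you round to get a crisp 0/1 class label. A small sketch of that last step in plain numpy (probs is a hypothetical stand-in for whatever loaded_model.predict(X) would return):

```python
import numpy

# Stand-in for the output of loaded_model.predict(X):
# one sigmoid probability per row of input
probs = numpy.array([[0.12], [0.91], [0.47], [0.63]])

# Round each probability to the nearest integer to get a 0/1 class label
labels = [int(round(p[0])) for p in probs]
print(labels)  # [0, 1, 0, 1]
```

A threshold of 0.5 (which rounding implements) is the conventional default for binary classification; you can shift it if one kind of error is more costly than the other.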

The JSON format of the model looks like the following:

{
   "class_name": "Sequential",
   "config": [
      {
         "class_name": "Dense",
         "config": {
            "W_constraint": null,
            "b_constraint": null,
            "name": "dense_1",
            "output_dim": 12,
            "activity_regularizer": null,
            "trainable": true,
            "init": "uniform",
            "input_dtype": "float32",
            "input_dim": 8,
            "b_regularizer": null,
            "W_regularizer": null,
            "activation": "relu",
            "batch_input_shape": [null, 8]
         }
      },
      {
         "class_name": "Dense",
         "config": {
            "W_constraint": null,
            "b_constraint": null,
            "name": "dense_2",
            "activity_regularizer": null,
            "trainable": true,
            "init": "uniform",
            "input_dim": null,
            "b_regularizer": null,
            "W_regularizer": null,
            "activation": "relu",
            "output_dim": 8
         }
      },
      {
         "class_name": "Dense",
         "config": {
            "W_constraint": null,
            "b_constraint": null,
            "name": "dense_3",
            "activity_regularizer": null,
            "trainable": true,
            "init": "uniform",
            "input_dim": null,
            "b_regularizer": null,
            "W_regularizer": null,
            "activation": "sigmoid",
            "output_dim": 1
         }
      }
   ]
}
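Because the saved file is plain JSON, you can inspect it with the Python standard library before reloading it into Keras. A minimal sketch, using a trimmed-down version of the structure above (the json module needs no Keras at all):

```python
import json

# A trimmed-down version of the model.json structure shown above
model_json = '''
{
  "class_name": "Sequential",
  "config": [
    {"class_name": "Dense", "config": {"name": "dense_1", "output_dim": 12, "activation": "relu"}},
    {"class_name": "Dense", "config": {"name": "dense_2", "output_dim": 8, "activation": "relu"}},
    {"class_name": "Dense", "config": {"name": "dense_3", "output_dim": 1, "activation": "sigmoid"}}
  ]
}
'''

spec = json.loads(model_json)
# Summarize each layer: name, number of units and activation
summary = ["%s(%d, %s)" % (layer["config"]["name"],
                           layer["config"]["output_dim"],
                           layer["config"]["activation"])
           for layer in spec["config"]]
print(", ".join(summary))
# dense_1(12, relu), dense_2(8, relu), dense_3(1, sigmoid)
```

This kind of quick inspection is handy for confirming that the architecture on disk matches the one you think you saved.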

Save Your Neural Network Model to YAML

This example is much the same as the above JSON example, except the YAML format is used for the model specification.

The model is described using YAML, saved to file model.yaml and later loaded into a new model via the model_from_yaml() function. Weights are handled in the same way as above in HDF5 format as model.h5.

# MLP for Pima Indians Dataset serialized to YAML and HDF5
from keras.models import Sequential
from keras.layers import Dense
from keras.models import model_from_yaml
import numpy
# fix random seed for reproducibility
seed = 7
numpy.random.seed(seed)
# load pima indians dataset
dataset = numpy.loadtxt("pima-indians-diabetes.csv", delimiter=",")
# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]
# create model
model = Sequential()
model.add(Dense(12, input_dim=8, init='uniform', activation='relu'))
model.add(Dense(8, init='uniform', activation='relu'))
model.add(Dense(1, init='uniform', activation='sigmoid'))
# Compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
# Fit the model
model.fit(X, Y, nb_epoch=150, batch_size=10, verbose=0)
# evaluate the model
scores = model.evaluate(X, Y, verbose=0)
print("%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))

# serialize model to YAML
model_yaml = model.to_yaml()
with open("model.yaml", "w") as yaml_file:
    yaml_file.write(model_yaml)
# serialize weights to HDF5
model.save_weights("model.h5")
print("Saved model to disk")

# later...

# load YAML and create model
yaml_file = open('model.yaml', 'r')
loaded_model_yaml = yaml_file.read()
yaml_file.close()
loaded_model = model_from_yaml(loaded_model_yaml)
# load weights into new model
loaded_model.load_weights("model.h5")
print("Loaded model from disk")

# evaluate loaded model on test data
loaded_model.compile(loss='binary_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
score = loaded_model.evaluate(X, Y, verbose=0)
print("%s: %.2f%%" % (loaded_model.metrics_names[1], score[1]*100))

Running the example displays the following output:

acc: 79.56%
Saved model to disk
Loaded model from disk
acc: 79.56%

The model described in YAML format looks like the following:

class_name: Sequential
config:
- class_name: Dense
  config:
    W_constraint: null
    W_regularizer: null
    activation: relu
    activity_regularizer: null
    b_constraint: null
    b_regularizer: null
    batch_input_shape: !!python/tuple [null, 8]
    init: uniform
    input_dim: 8
    input_dtype: float32
    name: dense_1
    output_dim: 12
    trainable: true
- class_name: Dense
  config: {W_constraint: null, W_regularizer: null, activation: relu, activity_regularizer: null,
    b_constraint: null, b_regularizer: null, init: uniform, input_dim: null, name: dense_2,
    output_dim: 8, trainable: true}
- class_name: Dense
  config: {W_constraint: null, W_regularizer: null, activation: sigmoid, activity_regularizer: null,
    b_constraint: null, b_regularizer: null, init: uniform, input_dim: null, name: dense_3,
    output_dim: 1, trainable: true}
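The YAML file can likewise be inspected without Keras, using the third-party PyYAML library. One caveat: the real model.yaml contains Python-specific tags such as !!python/tuple, which yaml.safe_load rejects, so this sketch uses a trimmed-down, plain-YAML version of the structure above:

```python
import yaml  # PyYAML; install with: pip install pyyaml

# A trimmed-down, plain-YAML version of the model.yaml structure
# shown above (no !!python/tuple tags, so safe_load can parse it)
model_yaml = """
class_name: Sequential
config:
- class_name: Dense
  config: {name: dense_1, output_dim: 12, activation: relu}
- class_name: Dense
  config: {name: dense_2, output_dim: 8, activation: relu}
- class_name: Dense
  config: {name: dense_3, output_dim: 1, activation: sigmoid}
"""

spec = yaml.safe_load(model_yaml)
layer_names = [layer["config"]["name"] for layer in spec["config"]]
print(layer_names)  # ['dense_1', 'dense_2', 'dense_3']
```

Once parsed, the YAML yields the same nested dict-and-list structure as the JSON version, which is why Keras can treat the two formats interchangeably.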

Summary

In this post you discovered how to serialize your Keras deep learning models.

You learned how you can save your trained models to files and later load them up and use them to make predictions.

You also learned that model weights are easily stored in HDF5 format and that the network structure can be saved in either JSON or YAML format.

Do you have any questions about saving your deep learning models or about this post? Ask your questions in the comments and I will do my best to answer them.
