
Implementing Deep Learning with Keras

The following excerpt is from Chapter 5 of the title Deep Learning with Theano, written by Christopher Bourez. The book offers a complete overview of deep learning with Theano, a Python-based library that makes optimizing numerical expressions and deep learning models easy on CPU or GPU.

In this article, we introduce you to the highly popular deep learning library Keras, which sits on top of both Theano and TensorFlow. It is a flexible platform for training deep learning models with ease.

Keras is a high-level neural network API, written in Python and capable of running on top of either TensorFlow or Theano. It was developed to make implementing deep learning models as fast and easy as possible for research and development. You can install Keras easily using conda, as follows:
conda install keras

When writing your Python code, importing Keras will tell you which backend is used:

>>> import keras
Using Theano backend.
Using cuDNN version 5110 on context None
Preallocating 10867/11439 Mb (0.950000) on cuda0
Mapped name None to device cuda0: Tesla K80 (0000:83:00.0)
Mapped name dev0 to device cuda0: Tesla K80 (0000:83:00.0)
Using cuDNN version 5110 on context dev1
Preallocating 10867/11439 Mb (0.950000) on cuda1
Mapped name dev1 to device cuda1: Tesla K80 (0000:84:00.0)

If you have installed TensorFlow, Keras might default to it instead of Theano. To specify which backend to use, write a Keras configuration file, ~/.keras/keras.json:

{
    "epsilon": 1e-07,
    "floatx": "float32",
    "image_data_format": "channels_last",
    "backend": "theano"
}
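If you prefer to manage this file from code, the same configuration can be written with the standard library alone. This is a minimal sketch, assuming only the conventional path ~/.keras/keras.json and the keys shown above; write_keras_config is a hypothetical helper name, not part of Keras:

```python
import json
import os

# The backend configuration shown above, as a Python dict.
config = {
    "epsilon": 1e-07,
    "floatx": "float32",
    "image_data_format": "channels_last",
    "backend": "theano",
}

def write_keras_config(path, cfg):
    """Serialize the configuration dict to the given JSON file,
    creating the parent directory if needed."""
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "w") as f:
        json.dump(cfg, f, indent=4)

# Uncomment to actually write the file Keras reads at import time:
# write_keras_config(os.path.expanduser("~/.keras/keras.json"), config)
```

Keras reads this file once, when it is imported, so the backend must be set before the first import.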

It is also possible to specify the Theano backend directly with an environment variable:

KERAS_BACKEND=theano python

Note that the device used is the device we specified for Theano in the ~/.theanorc file. It is also possible to override these settings with Theano environment variables:

KERAS_BACKEND=theano THEANO_FLAGS=device=cuda,floatX=float32,mode=FAST_RUN python

Programming with Keras

Keras provides a set of methods for data pre-processing and for building models. Layers and models are callable functions on tensors and return tensors. In Keras, there is no difference between a layer/module and a model: a model can be part of a bigger model and composed of multiple layers. Such a sub-model behaves as a module, with inputs/outputs.

Let’s create a network with two linear layers, a ReLU non-linearity in between, and a softmax output:

from keras.layers import Input, Dense
from keras.models import Model

inputs = Input(shape=(784,))
x = Dense(64, activation='relu')(inputs)
predictions = Dense(10, activation='softmax')(x)
model = Model(inputs=inputs, outputs=predictions)
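To make the data flow through these layers concrete, here is a minimal NumPy sketch of the same forward pass (dense, ReLU, dense, softmax). This is an illustration of the computation, not Keras code; the weights are random placeholders rather than trained parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(x, w, b):
    """Affine layer: x @ w + b."""
    return x @ w + b

def relu(x):
    return np.maximum(x, 0)

def softmax(x):
    # Subtract the row max for numerical stability.
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Placeholder weights matching the shapes above: 784 -> 64 -> 10.
w1, b1 = rng.normal(size=(784, 64)) * 0.01, np.zeros(64)
w2, b2 = rng.normal(size=(64, 10)) * 0.01, np.zeros(10)

x = rng.normal(size=(32, 784))           # a batch of 32 inputs
h = relu(dense(x, w1, b1))               # first Dense + ReLU
predictions = softmax(dense(h, w2, b2))  # second Dense + softmax
print(predictions.shape)                 # (32, 10)
```

Each layer is just a function from a tensor to a tensor, which is exactly how Keras lets you chain them.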

The model object provides methods to get the input and output shapes, for either one or multiple inputs/outputs, and to list the submodules of our module:

>>> model.input_shape
(None, 784)
>>> model.get_input_shape_at(0)
(None, 784)
>>> model.output_shape
(None, 10)
>>> model.get_output_shape_at(0)
(None, 10)
>>> model.name
'Sequential_1'
>>> model.input
/dense_3_input
>>> model.output
Softmax.0
>>> model.get_output_at(0)
Softmax.0
>>> model.layers
[<keras.layers.core.Dense object at 0x7f0abf7d6a90>, <keras.layers.core.Dense object at 0x7f0abf74af90>]

To avoid specifying the input of every layer, Keras offers the Sequential module, a way of stacking layers into a new module or model.

The following definition builds exactly the same model as shown previously; input_dim specifies the input dimension of the block, which would otherwise be unknown and generate an error:

from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential()
model.add(Dense(units=64, input_dim=784, activation='relu'))
model.add(Dense(units=10, activation='softmax'))
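As a sanity check on this architecture, the parameter counts (which Keras reports via model.summary()) can be worked out by hand: a Dense layer mapping n inputs to m units holds an n×m weight matrix plus m biases. A quick sketch of that arithmetic for the two layers above:

```python
def dense_params(n_in, n_out):
    """Parameters of a Dense layer: weight matrix plus bias vector."""
    return n_in * n_out + n_out

layer1 = dense_params(784, 64)  # 784*64 + 64 = 50240
layer2 = dense_params(64, 10)   # 64*10 + 10  = 650
total = layer1 + layer2
print(layer1, layer2, total)    # 50240 650 50890
```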

The model is considered a module or layer that can be part of a bigger model:

model2 = Sequential()
model2.add(model)
model2.add(Dense(units=10, activation='softmax'))

Each module/model/layer can then be compiled and trained with data:

model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

model.fit(data, labels)
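Note that categorical_crossentropy expects the labels in one-hot form, matching the 10-way softmax output. A minimal NumPy sketch (not Keras code) of that encoding and of the loss itself, with hypothetical helper names:

```python
import numpy as np

def to_one_hot(labels, num_classes):
    """Encode integer class labels as one-hot rows."""
    out = np.zeros((len(labels), num_classes))
    out[np.arange(len(labels)), labels] = 1.0
    return out

def categorical_crossentropy(y_true, y_pred, eps=1e-07):
    """Mean over the batch of -sum(y_true * log(y_pred))."""
    y_pred = np.clip(y_pred, eps, 1.0)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))

labels = to_one_hot([3, 1, 4], num_classes=10)
uniform = np.full((3, 10), 0.1)  # a model predicting uniformly
loss = categorical_crossentropy(labels, uniform)
print(round(loss, 4))            # -log(0.1) = 2.3026
```

Keras provides its own utility for this encoding (keras.utils.to_categorical); the sketch above only shows what the loss computes.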

Thus, we see it is fairly easy to train a model in Keras. This simplicity and ease of use make Keras a very popular choice for deep learning.

If you think the article is useful, check out the book Deep Learning with Theano for interesting deep learning concepts and their implementation using Theano.

For more information on the Keras library and how to train efficient deep learning models, make sure to check our highly popular title Deep Learning with Keras.

 

Amey Varangaonkar

Data Science Enthusiast. A massive science fiction and Manchester United fan. Loves to read, write and listen to music.
