My personal notes for the Sequential model API in Keras (TensorFlow 2.0)

NOTE: this is still WIP

Main ideas:

  • pass a list of keras.layers instances to the keras.models.Sequential constructor
  • used for quickly stacking layers into a neural network, where each layer feeds directly into the next
  • cannot be used for non-sequential flows such as the skip connections in ResNet; those need the functional API (see the sketch after this list)
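For contrast, a minimal functional-API sketch of a skip connection (my own illustrative example; the layer sizes are arbitrary). Adding a layer's input back onto its output is exactly the pattern Sequential cannot express:

from tensorflow import keras

inputs = keras.layers.Input(shape=(32,))
x = keras.layers.Dense(32, activation='relu')(inputs)
x = keras.layers.add([x, inputs])  # skip connection: two tensors merge into one layer
outputs = keras.layers.Dense(4, activation='softmax')(x)
model = keras.models.Model(inputs=inputs, outputs=outputs)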

Common layer classes (rough constructor usage after the list):

  • keras.layers.Input
  • keras.layers.Dense
  • keras.layers.Flatten
  • keras.layers.Conv2D
  • keras.layers.MaxPooling2D
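Rough constructor usage as I tend to call them; the specific values here are just placeholders, not required defaults:

from tensorflow import keras

keras.layers.Input(shape=(28, 28, 1))       # declares the input shape (no weights)
keras.layers.Dense(128, activation='relu')  # fully connected, 128 units
keras.layers.Flatten()                      # e.g. (28, 28, 1) -> (784,)
keras.layers.Conv2D(64, (3, 3))             # 64 filters with a 3x3 kernel
keras.layers.MaxPooling2D((2, 2))           # max over 2x2 windows, halves height/width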

Common activations (can be passed by string name or as the function itself; see below):

  • keras.activations.relu
  • keras.activations.tanh
  • keras.activations.softmax (for multi-class classification)
  • keras.activations.sigmoid (for binary classification)
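The string form is shorthand for the function, so these two layers are equivalent:

from tensorflow import keras

keras.layers.Dense(10, activation='relu')
keras.layers.Dense(10, activation=keras.activations.relu)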

Once the model is created, there are some useful methods/properties to know (usage sketch after the list):

  • model.summary()
  • model.weights
  • model.layers
  • model.add()
  • model.pop()
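A quick usage sketch (the layer sizes are arbitrary):

from tensorflow import keras

model = keras.models.Sequential([
    keras.layers.Dense(8, activation='relu', input_shape=(4,))
])
model.add(keras.layers.Dense(3, activation='softmax'))  # append a layer
model.summary()            # table of layers, output shapes, parameter counts
print(len(model.layers))   # 2
print(len(model.weights))  # 4: kernel + bias for each Dense layer
model.pop()                # remove the last layer
print(len(model.layers))   # 1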

If the input shape is not specified (i.e. no keras.layers.Input as the first layer and no input_shape argument on the first layer), Keras infers it from the data the first time the model is called. However, that means the weights are not created until then, so things like model.summary() and model.weights raise errors until the model has been built.
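For example (the shapes here are arbitrary):

from tensorflow import keras

model = keras.models.Sequential([
    keras.layers.Dense(8, activation='relu'),  # no input shape given anywhere
    keras.layers.Dense(1)
])
# model.summary() here would raise an error: the model is not built yet
model.build(input_shape=(None, 4))  # or just call/fit the model on data
model.summary()                     # works now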

Example 1:

Simplest one-node perceptron for linear regression.

model = keras.models.Sequential([
    keras.layers.Dense(1, input_shape=(1,))  # note the comma: (1) is just the int 1
])

Since no activation function is specified, the layer applies a linear (identity) pass-through: a(x) = x
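To actually train it, compile with an optimizer/loss and fit on data. A minimal sketch; the points are made up from y = 2x - 1, and the optimizer, loss, and epoch count are my own illustrative choices:

import numpy as np
from tensorflow import keras

model = keras.models.Sequential([
    keras.layers.Dense(1, input_shape=(1,))
])
model.compile(optimizer='sgd', loss='mse')

xs = np.array([-1.0, 0.0, 1.0, 2.0, 3.0, 4.0])  # made-up inputs
ys = 2 * xs - 1                                 # targets from y = 2x - 1
model.fit(xs, ys, epochs=100, verbose=0)
print(model.predict(np.array([10.0])))  # should be close to 19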

Example 2:

Basic multi-layer perceptron:

  • 10 inputs
  • 1 hidden layer with 128 nodes
  • 4 outputs

model = keras.models.Sequential([
    keras.layers.Dense(128, activation='relu', input_shape=(10,)),
    keras.layers.Dense(4, activation='softmax')
])
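Compiling this one for multi-class classification might look like the following (optimizer and loss are my usual picks, not the only options). sparse_categorical_crossentropy expects integer labels 0..3; use categorical_crossentropy for one-hot labels:

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])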

Example 3:

Basic convolutional neural network for MNIST

model = keras.models.Sequential([
    # one conv + pooling layer
    keras.layers.Conv2D(64, (3,3), activation='relu', input_shape=(28,28,1)),
    keras.layers.MaxPooling2D((2, 2)),
    # fully connected layer at the end
    keras.layers.Flatten(),
    keras.layers.Dense(512, activation='relu'),
    keras.layers.Dense(10, activation='softmax')
])
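Continuing with this model, an end-to-end sketch: keras.datasets.mnist provides the data, the reshape adds the channel dimension, and scaling pixels to [0, 1] is a common preprocessing choice (the epoch count is arbitrary):

from tensorflow import keras

(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 28, 28, 1).astype('float32') / 255.0
x_test = x_test.reshape(-1, 28, 28, 1).astype('float32') / 255.0

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))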