{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"<center><img src=\"images/title.png\" width=\"95%\"/></center>\n",
"<center><a href=\"http://bit.ly/pybay-keras\">bit.ly/pybay-keras</a></center>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"Who Am I?\n",
"-----\n",
"\n",
"<center>Brian Spiering</center>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"What Do I Do?\n",
"------\n",
"\n",
"<b><center>Professor @</center></b>\n",
"<center><img src=\"images/msds_logo.png\" width=\"28%\"/></center>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"<center><img src=\"images/real_deep_learning.png\" width=\"80%\"/></center>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"Keras - Neural Networks for humans\n",
"------\n",
"\n",
"<center><img src=\"images/keras-logo-small.jpg\" width=\"20%\"/></center>\n",
"\n",
"A high-level, intuitive API for Deep Learning.\n",
"\n",
"It makes neural networks easy to define, then handles execution automatically.\n",
"\n",
"A simple, modular interface that keeps the focus on learning and enables fast experimentation."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"Goals\n",
"-----\n",
"\n",
"- General introduction to Deep Learning\n",
"- Overview of keras library\n",
"- An end-to-end example in keras "
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"Anti-Goals\n",
"-----\n",
"\n",
"- Understanding of Deep Learning (there will be no equations)\n",
"- Building neural networks from scratch\n",
"- Complete survey of keras library"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"Deep Learning 101\n",
"-----\n",
"<center><img src=\"images/neural_nets.jpg\" width=\"75%\"/></center>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"Deep Learning (DL) is Neural Networks (NNs) with >1 hidden layer\n",
"-------\n",
"\n",
"<center><img src=\"images/neural-networks-layers.jpg\" width=\"80%\"/></center>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"Neural Networks are Nodes & Edges\n",
"------\n",
"<center><img src=\"images/sum.png\" width=\"75%\"/></center>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"Nonlinear function allows learning of nonlinear relationships\n",
"------\n",
"\n",
"<center><img src=\"images/function_3.png\" width=\"80%\"/></center>"
]
},
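{
"cell_type": "code",
"execution_count": null,
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"# Illustrative sketch (not part of the original talk): a single node\n",
"# computes a weighted sum of its inputs, then applies a nonlinearity\n",
"# (here, a sigmoid) so the network can learn nonlinear relationships.\n",
"import numpy as np\n",
"\n",
"def sigmoid(z):\n",
"    return 1 / (1 + np.exp(-z))\n",
"\n",
"inputs = np.array([0.5, -1.0, 2.0])   # values arriving on the edges\n",
"weights = np.array([0.8, 0.2, -0.5])  # one learned weight per edge\n",
"bias = 0.1\n",
"\n",
"activation = sigmoid(inputs @ weights + bias)  # the node's output"
]
},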
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"Groups of nodes all the way down\n",
"------\n",
"\n",
"<center><img src=\"images/layers.png\" width=\"75%\"/></center>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"Deep Learning isn't magic; it is just very good at finding patterns\n",
"------\n",
"\n",
"<center><img src=\"images/features.png\" width=\"80%\"/></center>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"Deep Learning has fewer steps than traditional Machine Learning\n",
"------\n",
"\n",
"<center><img src=\"images/traditional-ml-deep-learning-2.png\" width=\"100%\"/></center>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"If you want to follow along…\n",
"-----\n",
"\n",
"GitHub repo: [bit.ly/pybay-keras](http://bit.ly/pybay-keras)\n",
"\n",
"If you want to type along…\n",
"------\n",
"\n",
"1. Run a local Jupyter Notebook\n",
"1. [Binder](https://mybinder.org/v2/gh/brianspiering/keras-intro/master): In-Browser Jupyter Notebook\n",
"1. [Colaboratory](https://colab.research.google.com/): \"Google Docs for Jupyter Notebooks\""
]
},
{
"cell_type": "code",
"execution_count": 84,
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"reset -fs"
]
},
{
"cell_type": "code",
"execution_count": 85,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"import keras"
]
},
{
"cell_type": "code",
"execution_count": 86,
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [],
"source": [
"# What is the backend / execution engine?"
]
},
{
"cell_type": "code",
"execution_count": 87,
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [
{
"data": {
"text/plain": [
"'tensorflow'"
]
},
"execution_count": 87,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"keras.backend.backend()"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"<center><img src=\"images/tf_logo.jpg\" width=\"70%\"/></center>\n",
"\n",
"\"An open-source software library for Machine Intelligence\"\n",
"\n",
"Numerical computation using data flow graphs. "
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"TensorFlow: A great backend\n",
"------\n",
"A __very__ flexible architecture which allows you to do almost any numerical operation.\n",
"\n",
"Then deploy the computation to CPUs or GPUs (one or more) across desktop, cloud, or mobile devices. \n",
"<center><img src=\"images/tf_features.png\" width=\"38%\"/></center>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"MNIST handwritten digit database: <br> The “Hello World!” of Computer Vision\n",
"------\n",
"\n",
"<center><img src=\"images/mnist-digits.png\" width=\"80%\"/></center>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"<center><img src=\"images/MNIST-Matrix.png\" width=\"100%\"/></center>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"<center><img src=\"images/MNIST_neuralnet_image.png\" width=\"100%\"/></center>"
]
},
{
"cell_type": "code",
"execution_count": 88,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# Import data\n"
]
},
{
"cell_type": "code",
"execution_count": 89,
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"from keras.datasets import mnist"
]
},
{
"cell_type": "code",
"execution_count": 90,
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [],
"source": [
"# Setup train and test splits\n"
]
},
{
"cell_type": "code",
"execution_count": 91,
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"(x_train, y_train), (x_test, y_test) = mnist.load_data()"
]
},
{
"cell_type": "code",
"execution_count": 92,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"from random import randint\n",
"from matplotlib import pyplot\n",
"\n",
"%matplotlib inline"
]
},
{
"cell_type": "code",
"execution_count": 93,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAADb9JREFUeJzt3X+IXPW5x/HPY2xFbP9IyJqG/OjGEkRZuKkOoaJcDMXGxECsv2j+KBFKNmjFJhS8If5RFQpyuTWJP1LZtqFJTG0KjTXgjxsJglalOAlJtY1tfri3jVmTCRa7/SOUJM/9Y8+Wbdz5zuzMmXNm87xfIDNznnPmPIz57JmZ75nzNXcXgHguKbsBAOUg/EBQhB8IivADQRF+ICjCDwRF+IGgCD8QFOEHgrq0yJ1Nnz7de3t7i9wlEMrg4KBOnz5tzazbVvjN7FZJmyRNkfRTd388tX5vb6+q1Wo7uwSQUKlUml635bf9ZjZF0jOSlki6VtIKM7u21ecDUKx2PvMvlHTE3Y+5+z8l/VLS8nzaAtBp7YR/lqS/jnl8PFv2b8ys38yqZlat1Wpt7A5AntoJ/3hfKnzm98HuPuDuFXev9PT0tLE7AHlqJ/zHJc0Z83i2pBPttQOgKO2E/11J881snpl9XtK3JO3Opy0AndbyUJ+7nzWzByT9r0aG+ra4+x9y6wxAR7U1zu/uL0t6OadeABSI03uBoAg/EBThB4Ii/EBQhB8IivADQRF+ICjCDwRF+IGgCD8QFOEHgiL8QFCEHwiK8ANBEX4gKMIPBEX4gaAIPxAU4QeCIvxAUIQfCIrwA0ERfiAowg8ERfiBoAg/EBThB4Ii/EBQhB8Iqq1Zes1sUNKwpHOSzrp7JY+mAHReW+HPLHL30zk8D4AC8bYfCKrd8LukPWa2z8z682gIQDHafdt/o7ufMLMrJb1mZh+4+xtjV8j+KPRL0ty5c9vcHYC8tHXkd/cT2e0pSS9IWjjOOgPuXnH3Sk9PTzu7A5CjlsNvZleY2RdH70v6hqT382oMQGe187Z/hqQXzGz0eX7h7q/m0hWAjms5/O5+TNJ/5NgLSnDu3LlkfceOHcn6pk2bkvX9+/dPuKdRmzdvTtbvu+++lp8bDPUBYRF+ICjCDwRF+IGgCD8QFOEHgsrjV33oYkePHk3WV6xYkawfPHgwWZ86dWqyfvXVV9etHTt2LLntzp07k/X+/vTPSaZMmZKsR8eRHwiK8ANBEX4gKMIPBEX4gaAIPxAU4QeCYpz/IvDOO+/Urd15553JbWfNmpWs79u3L1nv6+tL1lPWrVuXrD/77LPJ+tDQULI+e/bsCfcUCUd+ICjCDwRF+IGgCD8QFOEHgiL8QFCEHwiKcf5JYPv27cn6Qw89VLfWaJz/ySefTNYvuaRzx4fp06cn642uFTBt2rQ82wmHIz8QFOEHgiL8QFCEHwiK8ANBEX4gKMIPBNVwnN/MtkhaJumUu/dly6ZJ2impV9KgpHvc/W+da/Pitm3btmS90fXpV61aVbe2cePG5LadHMeXpLNnz9atHThwILntpZem/3lyXf72NPN//ueSbr1g2TpJe919vqS92WMAk0jD8Lv7G5I+uWDxcklbs/tbJd2ec18AOqzV93wz3H1IkrLbK/NrCUAROv6Fn5n1m1nVzKq1Wq3TuwPQpFbDf9LMZkpSdnuq3oruPuDuFXev9PT0tLg7AHlrNfy7Ja3M7q+U9GI+7QAoSsPwm9nzkt6RdLWZHTez70h6XNItZnZY0i3ZYwCTSMNxfnevN4H713Pu5aLV6Nr3q1evTtaXLl2arG/YsKFureyx8M2bN9et7dixI7ntjBkzkvXh4eFk/bLLLkvWo+MMPyAowg8ERfiBoAg/EBThB4Ii/EBQXLq7AK+88kqyfubMmWT94YcfTtYb/fS1HW+++Way3ujS3y+99FLL+160aFGy3ujS30jjyA8ERfiBoA
g/EBThB4Ii/EBQhB8IivADQTHOX4B2r2B01113Jeu33XZb3drQ0FBy27fffjtZ//jjj5P1dlx++eXJ+r333tuxfYMjPxAW4QeCIvxAUIQfCIrwA0ERfiAowg8ExTh/AZYsWZKsz58/P1k/fPhwsv7MM89MuKdRjXpbu3Ztsv7BBx8k6w8++GDd2rJly5LbLl68OFlHezjyA0ERfiAowg8ERfiBoAg/EBThB4Ii/EBQDcf5zWyLpGWSTrl7X7bsEUmrJNWy1da7+8udanKymzt3brLe6Df1e/bsSdYXLFhQt3bVVVclt200jbWZJeu7du1K1lNuuOGGlrdF+5o58v9c0q3jLN/g7guy/wg+MMk0DL+7vyHpkwJ6AVCgdj7zP2BmvzezLWY2NbeOABSi1fD/WNJXJC2QNCTpR/VWNLN+M6uaWbVWq9VbDUDBWgq/u59093Pufl7STyQtTKw74O4Vd6+0eyFLAPlpKfxmNnPMw29Kej+fdgAUpZmhvucl3Sxpupkdl/QDSTeb2QJJLmlQ0uoO9gigA8zdC9tZpVLxarVa2P7Qvg8//DBZv+OOO5L1efPm1a21c44AxlepVFStVtMnZ2Q4ww8IivADQRF+ICjCDwRF+IGgCD8QFJfuRtJbb72VrB88eDBZX7lyZZ7tIEcc+YGgCD8QFOEHgiL8QFCEHwiK8ANBEX4gKMb5gztz5kyyvnfv3mT9+uuvT9bXrFkz4Z5QDI78QFCEHwiK8ANBEX4gKMIPBEX4gaAIPxAU4/zBbdy4MVl/7rnnkvXXX389z3ZQII78QFCEHwiK8ANBEX4gKMIPBEX4gaAIPxBUw3F+M5sjaZukL0k6L2nA3TeZ2TRJOyX1ShqUdI+7/61zraIVR44cSdafeOKJZH3x4sXJ+k033TThntAdmjnyn5X0fXe/RtLXJH3XzK6VtE7SXnefL2lv9hjAJNEw/O4+5O77s/vDkg5JmiVpuaSt2WpbJd3eqSYB5G9Cn/nNrFfSVyX9TtIMdx+SRv5ASLoy7+YAdE7T4TezL0j6taQ17v73CWzXb2ZVM6vWarVWegTQAU2F38w+p5Hg73D3Xdnik2Y2M6vPlHRqvG3dfcDdK+5e6enpyaNnADloGH4zM0k/k3TI3cd+Nbxb0ugUrCslvZh/ewA6pZmf9N4o6duS3jOzA9my9ZIel/QrM/uOpL9IurszLaKR4eHhurXHHnssuW2jj2JLly5tqSd0v4bhd/ffSrI65a/n2w6AonCGHxAU4QeCIvxAUIQfCIrwA0ERfiAoLt19EXj11Vfr1rZv357cdtGiRcn6/fff31JP6H4c+YGgCD8QFOEHgiL8QFCEHwiK8ANBEX4gKMb5J4FPP/00WV+7dm3Lz/3000+3vC0mN478QFCEHwiK8ANBEX4gKMIPBEX4gaAIPxAU4/xdoNE4/qOPPpqsf/TRR3VrjabYvuaaa5J1XLw48gNBEX4gKMIPBEX4gaAIPxAU4QeCIvxAUA3H+c1sjqRtkr4k6bykAXffZGaPSFolaXSC9/Xu/nKnGr2YHT16NFnfsGFDst7X11e39tRTTyW3Nas3+zouds2c5HNW0vfdfb+ZfVHSPjN7LattcPf/6Vx7ADqlYfjdfUjSUHZ/2MwOSZrV6cYAdNaEPvObWa+kr0r6XbboATP7vZltMbOpdbbpN7OqmVVrtdp4qwAoQdPhN7MvSPq1pDXu/ndJP5b0FUkLNPLO4EfjbefuA+5ecfdKT09PDi0DyENT4Tezz2kk+DvcfZckuftJdz/n7ucl/UTSws61CSBvDcNvI18H/0zSIXd/YszymWNW+6ak9/NvD0CnNPNt/42Svi3pPTM7kC1bL2mFmS2Q5JIGJa3uSIcBXHfddcm6uxfUCSJp5tv+30oabzCYMX1gEuMMPyAowg8ERfiBoAg/EBThB4Ii/EBQhB8IivADQRF+ICjCDwRF+IGgCD8QFOEHgiL8QFBW5G/Fzawm6f/GLJou6XRhDU
xMt/bWrX1J9NaqPHv7srs3db28QsP/mZ2bVd29UloDCd3aW7f2JdFbq8rqjbf9QFCEHwiq7PAPlLz/lG7trVv7kuitVaX0VupnfgDlKfvID6AkpYTfzG41sz+Z2REzW1dGD/WY2aCZvWdmB8ysWnIvW8zslJm9P2bZNDN7zcwOZ7fjTpNWUm+PmNlH2Wt3wMyWltTbHDN73cwOmdkfzOx72fJSX7tEX6W8boW/7TezKZL+LOkWScclvStphbv/sdBG6jCzQUkVdy99TNjM/lPSPyRtc/e+bNl/S/rE3R/P/nBOdff/6pLeHpH0j7Jnbs4mlJk5dmZpSbdLulclvnaJvu5RCa9bGUf+hZKOuPsxd/+npF9KWl5CH13P3d+Q9MkFi5dL2prd36qRfzyFq9NbV3D3IXffn90fljQ6s3Spr12ir1KUEf5Zkv465vFxddeU3y5pj5ntM7P+spsZx4xs2vTR6dOvLLmfCzWcublIF8ws3TWvXSszXuetjPCPN/tPNw053Oju10laIum72dtbNKepmZuLMs7M0l2h1Rmv81ZG+I9LmjPm8WxJJ0roY1zufiK7PSXpBXXf7MMnRydJzW5PldzPv3TTzM3jzSytLnjtumnG6zLC/66k+WY2z8w+L+lbknaX0MdnmNkV2RcxMrMrJH1D3Tf78G5JK7P7KyW9WGIv/6ZbZm6uN7O0Sn7tum3G61JO8smGMjZKmiJpi7v/sPAmxmFmV2nkaC+NTGL6izJ7M7PnJd2skV99nZT0A0m/kfQrSXMl/UXS3e5e+BdvdXq7WSNvXf81c/PoZ+yCe7tJ0puS3pN0Plu8XiOfr0t77RJ9rVAJrxtn+AFBcYYfEBThB4Ii/EBQhB8IivADQRF+ICjCDwRF+IGg/h/APtduoHT5SAAAAABJRU5ErkJggg==\n",
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"pyplot.imshow(x_train[randint(0, x_train.shape[0] - 1)], cmap='gray_r');"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"source": [
"Munge data\n",
"-----\n",
"\n",
"<center><img src=\"images/mnist_keras.png\" width=\"75%\"/></center>\n",
"\n",
"Convert each image matrix into a vector to feed into the first layer."
]
},
{
"cell_type": "code",
"execution_count": 94,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# Munge Data\n",
"# Transform from matrix to vector, cast, and normalize"
]
},
{
"cell_type": "code",
"execution_count": 95,
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"\n",
"image_size = 784 # 28 x 28\n",
"\n",
"x_train = x_train.reshape(x_train.shape[0], image_size) # Transform from matrix to vector\n",
"x_train = x_train.astype('float32') # Cast as 32-bit floats\n",
"x_train /= 255 # Normalize inputs from 0-255 to 0.0-1.0\n",
"\n",
"x_test = x_test.reshape(x_test.shape[0], image_size) # Transform from matrix to vector\n",
"x_test = x_test.astype('float32') # Cast as 32-bit floats\n",
"x_test /= 255 # Normalize inputs from 0-255 to 0.0-1.0"
]
},
{
"cell_type": "code",
"execution_count": 96,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# Convert class vectors to binary class matrices\n"
]
},
{
"cell_type": "code",
"execution_count": 97,
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"y_train = keras.utils.to_categorical(y_train, 10)\n",
"y_test = keras.utils.to_categorical(y_test, 10)"
]
},
{
"cell_type": "code",
"execution_count": 98,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# Import the most common type of neural network\n"
]
},
{
"cell_type": "code",
"execution_count": 99,
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"from keras.models import Sequential"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"source": [
"RTFM - https://keras.io/layers/"
]
},
{
"cell_type": "code",
"execution_count": 100,
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [],
"source": [
"# Define model instance\n"
]
},
{
"cell_type": "code",
"execution_count": 101,
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"model = Sequential()"
]
},
{
"cell_type": "code",
"execution_count": 102,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# Import the most common type of network layer, fully interconnected\n"
]
},
{
"cell_type": "code",
"execution_count": 103,
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"from keras.layers import Dense"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"<center><img src=\"images/dense.png\" width=\"55%\"/></center>"
]
},
{
"cell_type": "code",
"execution_count": 104,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# Define input layer\n"
]
},
{
"cell_type": "code",
"execution_count": 105,
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"layer_input = Dense(units=512, # Number of nodes\n",
" activation='sigmoid', # The nonlinearity\n",
" input_shape=(image_size,)) \n",
"model.add(layer_input)"
]
},
{
"cell_type": "code",
"execution_count": 106,
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [],
"source": [
"# Define another layer\n"
]
},
{
"cell_type": "code",
"execution_count": 107,
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"model.add(Dense(units=512, activation='sigmoid'))"
]
},
{
"cell_type": "code",
"execution_count": 108,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# Define output layers\n"
]
},
{
"cell_type": "code",
"execution_count": 109,
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"layer_output = Dense(units=10, # Number of digits (0-9)\n",
" activation='softmax') # Convert neural activation to probability of category\n",
"\n",
"model.add(layer_output)"
]
},
{
"cell_type": "code",
"execution_count": 110,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# Print summary\n"
]
},
{
"cell_type": "code",
"execution_count": 111,
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"_________________________________________________________________\n",
"Layer (type) Output Shape Param # \n",
"=================================================================\n",
"dense_9 (Dense) (None, 512) 401920 \n",
"_________________________________________________________________\n",
"dense_10 (Dense) (None, 512) 262656 \n",
"_________________________________________________________________\n",
"dense_11 (Dense) (None, 10) 5130 \n",
"=================================================================\n",
"Total params: 669,706\n",
"Trainable params: 669,706\n",
"Non-trainable params: 0\n",
"_________________________________________________________________\n"
]
}
],
"source": [
"model.summary()"
]
},
{
"cell_type": "code",
"execution_count": 112,
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [],
"source": [
"# Yes - we compile the model to run it\n"
]
},
{
"cell_type": "code",
"execution_count": 113,
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"model.compile(loss='categorical_crossentropy', \n",
" optimizer='sgd',\n",
" metrics=['accuracy'])"
]
},
{
"cell_type": "code",
"execution_count": 114,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# Train the model\n"
]
},
{
"cell_type": "code",
"execution_count": 115,
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Train on 54000 samples, validate on 6000 samples\n",
"Epoch 1/5\n",
"54000/54000 [==============================] - 15s 285us/step - loss: 2.1522 - acc: 0.3213 - val_loss: 1.8987 - val_acc: 0.5315\n",
"Epoch 2/5\n",
"54000/54000 [==============================] - 14s 262us/step - loss: 1.5000 - acc: 0.6548 - val_loss: 1.0769 - val_acc: 0.7430\n",
"Epoch 3/5\n",
"54000/54000 [==============================] - 15s 285us/step - loss: 0.9003 - acc: 0.7860 - val_loss: 0.6709 - val_acc: 0.8560\n",
"Epoch 4/5\n",
"54000/54000 [==============================] - 14s 266us/step - loss: 0.6515 - acc: 0.8317 - val_loss: 0.5121 - val_acc: 0.8778\n",
"Epoch 5/5\n",
"54000/54000 [==============================] - 18s 340us/step - loss: 0.5385 - acc: 0.8549 - val_loss: 0.4268 - val_acc: 0.8940\n"
]
}
],
"source": [
"training = model.fit(x_train, \n",
" y_train,\n",
" epochs=5, # Number of passes over complete dataset\n",
" verbose=True, \n",
" validation_split=0.1)"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"<center><img src=\"images/waiting.jpg\" width=\"55%\"/></center>"
]
},
{
"cell_type": "code",
"execution_count": 116,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# Let's see how well our model performs\n"
]
},
{
"cell_type": "code",
"execution_count": 117,
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"10000/10000 [==============================] - 1s 106us/step\n",
"Test loss: 0.476\n",
"Test accuracy: 87.140%\n"
]
}
],
"source": [
"loss, accuracy = model.evaluate(x_test, \n",
" y_test, \n",
" verbose=True)\n",
"print(f\"Test loss: {loss:.3}\")\n",
"print(f\"Test accuracy: {accuracy:.3%}\")"
]
},
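{
"cell_type": "code",
"execution_count": null,
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"# Optional next step (not shown in the talk): predict a single test image.\n",
"# model.predict returns one probability per digit (from the softmax output\n",
"# layer); argmax picks the most likely digit.\n",
"import numpy as np\n",
"\n",
"probabilities = model.predict(x_test[:1])\n",
"predicted_digit = np.argmax(probabilities)"
]
},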
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"Keras' Other Features\n",
"-----\n",
"\n",
"- Common built-in functions (e.g., activation functions and optimizers)\n",
"- Convolutional neural network (CNN or ConvNet)\n",
"- Recurrent neural network (RNN) & Long-short term memory (LSTM)\n",
"- Pre-trained models"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"Summary\n",
"-----\n",
"\n",
"- Keras is designed for human beings, not computers.\n",
"- Easier to try out Deep Learning (focus on the __what__, not the __how__).\n",
"- Simple to define neural networks."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"<center><img src=\"images/twitter.png\" width=\"100%\"/></center>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"Further Study - Keras\n",
"--------\n",
"\n",
"- [Keras docs](https://keras.io/)\n",
"- [Keras blog](https://blog.keras.io/)\n",
"- Keras courses\n",
" - [edX](https://www.edx.org/course/deep-learning-fundamentals-with-keras)\n",
" - [Coursera](https://www.coursera.org/lecture/ai/keras-overview-7GfN9)"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"Further Study - Deep Learning\n",
"--------\n",
"\n",
"- Prerequisites: Linear Algebra, Probability, Machine Learning\n",
"- [fast.ai Course](http://www.fast.ai/)\n",
"- [Deep Learning Book](http://www.deeplearningbook.org/)"
]
},
{
"cell_type": "markdown",
"metadata": {
"collapsed": true,
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"<br>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"Bonus Material\n",
"--------"
]
},
{
"cell_type": "code",
"execution_count": 118,
"metadata": {},
"outputs": [],
"source": [
"# reset -fs"
]
},
{
"cell_type": "code",
"execution_count": 119,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# from keras import *"
]
},
{
"cell_type": "code",
"execution_count": 120,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# whos"
]
},
{
"cell_type": "code",
"execution_count": 121,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# from keras.datasets import fashion_mnist"
]
},
{
"cell_type": "code",
"execution_count": 122,
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [],
"source": [
"# # Setup train and test splits\n",
"# (x_train, y_train), (x_test, y_test) = fashion_mnist.load_data()"
]
},
{
"cell_type": "code",
"execution_count": 123,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# from random import randint\n",
"# from matplotlib import pyplot\n",
"\n",
"# %matplotlib inline"
]
},
{
"cell_type": "code",
"execution_count": 124,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# pyplot.imshow(x_train[randint(0, x_train.shape[0] - 1)], cmap='gray_r');"
]
},
{
"cell_type": "code",
"execution_count": 125,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# # Define CNN model\n",
"\n",
"# # Redefine input dimensions to make sure conv works\n",
"# img_rows, img_cols = 28, 28\n",
"# x_train = x_train.reshape(x_train.shape[0], img_rows, img_cols, 1)\n",
"# x_test = x_test.reshape(x_test.shape[0], img_rows, img_cols, 1)\n",
"# input_shape = (img_rows, img_cols, 1)"
]
},
{
"cell_type": "code",
"execution_count": 126,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# import keras"
]
},
{
"cell_type": "code",
"execution_count": 127,
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [],
"source": [
"# # Convert class vectors to binary class matrices\n",
"# y_train = keras.utils.to_categorical(y_train, 10)\n",
"# y_test = keras.utils.to_categorical(y_test, 10)"
]
},
{
"cell_type": "code",
"execution_count": 128,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# from keras.layers import Conv2D, Dense, Flatten, MaxPooling2D"
]
},
{
"cell_type": "code",
"execution_count": 129,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# # Define model\n",
"# model = Sequential()\n",
"# model.add(Conv2D(32, \n",
"# kernel_size=(3, 3),\n",
"# activation='sigmoid',\n",
"# input_shape=input_shape))\n",
"# model.add(Conv2D(64, (3, 3), activation='sigmoid'))\n",
"# model.add(MaxPooling2D(pool_size=(2, 2)))\n",
"# model.add(Flatten())\n",
"# model.add(Dense(128, activation='sigmoid'))\n",
"# model.add(Dense(10, activation='softmax'))"
]
},
{
"cell_type": "code",
"execution_count": 130,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# model.compile(loss='categorical_crossentropy', \n",
"# optimizer='adam',\n",
"# metrics=['accuracy'])"
]
},
{
"cell_type": "code",
"execution_count": 131,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# # Define training\n",
"# training = model.fit(x_train, \n",
"# y_train,\n",
"# epochs=5,\n",
"# verbose=True, \n",
"# validation_split=0.1)"
]
},
{
"cell_type": "code",
"execution_count": 132,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [],
"source": [
"# loss, accuracy = model.evaluate(x_test, \n",
"# y_test, \n",
"# verbose=True)\n",
"# print(f\"Test loss: {loss:.3}\")\n",
"# print(f\"Test accuracy: {accuracy:.3%}\")"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"What is `keras`? \n",
"-----\n",
"\n",
"<center><img src=\"https://www.thevintagenews.com/wp-content/uploads/2017/08/a-drinking-horn-from-the-16th-century-known-as-the-roordahuizum-drinking-horn-on-display-in-the-frisian-museum-at-leeuwarden-640x360.jpg\" width=\"75%\"/></center>\n",
"\n",
"Keras (κέρας) means horn in Greek. "
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"It is a reference to a literary image from ancient Greek and Latin literature.\n",
"\n",
"First found in the Odyssey, where dream spirits (Oneiroi, singular Oneiros) are divided between those who deceive men with false visions, who arrive to Earth through a gate of ivory, and those who announce a future that will come to pass, who arrive through a gate of horn. \n",
"\n",
"It's a play on the words κέρας (horn) / κραίνω (fulfill), and ἐλέφας (ivory) / ἐλεφαίρομαι (deceive)."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"source": [
"[Source](https://keras.io/#why-this-name-keras)"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"<br>"
]
}
],
"metadata": {
"celltoolbar": "Slideshow",
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.0"
}
},
"nbformat": 4,
"nbformat_minor": 2
}