
Keras Tutorial | The Beginner's Guide to Deep Learning


Keras

Around a year back, Keras was integrated into TensorFlow 2.0, which succeeded TensorFlow 1.x. Keras is now a part of TensorFlow.

Keras was created by François Chollet.

Data loading and preprocessing

Neural networks don’t process raw data such as encoded JPEG image files or CSV files directly. They work on vectorized and standardized representations.

Text files need to be read into string tensors and split into individual words; the words are then indexed and turned into integer tensors.

Image files are read and decoded into integer tensors, converted to floating point, and finally normalized to small values (usually in the [0, 1] range).

CSV files need to be parsed: numerical features are converted into floating-point tensors, and categorical features are indexed and converted into integer tensors. Finally, each feature is typically normalized, for example to the [0, 1] range.
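To make this concrete, here is a minimal sketch (not from the original article) of turning a small CSV file into tensors; the file name people.csv and its age/city columns are made-up examples:

import pandas as pd
from tensorflow.keras.layers.experimental.preprocessing import Normalization

# Hypothetical CSV with a numerical "age" column and a categorical "city" column.
df = pd.read_csv("people.csv")

# Numerical feature -> normalized float32 tensor
age = df["age"].to_numpy().astype("float32").reshape(-1, 1)
normalizer = Normalization()
normalizer.adapt(age)
age_normalized = normalizer(age)

# Categorical feature -> integer indices (0, 1, 2, ...)
city_indices, city_vocab = pd.factorize(df["city"])

print(age_normalized.shape, city_indices[:5])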

Data loading

Keras models accept three types of inputs:

NumPy arrays: a good option if the data fits in memory.

TensorFlow Dataset objects: a high-performance option, better suited to datasets that do not fit in memory and that are streamed from disk or from a distributed filesystem.

Python generators that yield batches of data (such as custom subclasses of keras.utils.Sequence).

Before we start training a model, we need to make our data available in one of these formats. If we have a large dataset and we are training on GPU(s), we should consider using Dataset objects, since they take care of performance-critical details such as (a minimal example follows this list):

Asynchronously preprocessing the data on the CPU while the GPU is busy, and buffering it into a queue.

Prefetching the data into GPU memory so that it is immediately available when the GPU has finished processing the previous batch, letting us reach full GPU utilization.
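For instance, in-memory NumPy data can be wrapped into a batched, shuffled and prefetched Dataset like this (a minimal sketch with made-up data, not code from the article):

import numpy as np
import tensorflow as tf

# Dummy in-memory data: 1000 samples with 32 features each, plus integer labels.
samples = np.random.rand(1000, 32).astype("float32")
labels = np.random.randint(0, 10, size=(1000,))

# Batch the data and prefetch the next batch while the current one is being consumed.
dataset = (
    tf.data.Dataset.from_tensor_slices((samples, labels))
    .shuffle(buffer_size=1000)
    .batch(32)
    .prefetch(tf.data.experimental.AUTOTUNE)
)

for batch_samples, batch_labels in dataset.take(1):
    print(batch_samples.shape, batch_labels.shape)  # (32, 32) (32,)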

Keras supports a wide range of utilities to help us turn raw data on disk into a Dataset object:

tf.keras.preprocessing.image_dataset_from_directory: turns image files sorted into class-specific folders into a labelled dataset of image tensors of a fixed shape.

tf.keras.preprocessing.text_dataset_from_directory does the same for text files.

Assuming the images are organized into one sub-folder per class, we will load the directory as a Keras dataset.

import os
from PIL import Image

# Resize every image in the source folder to 200x200 and save it to a new folder.
path = r"C:\Users\VAGISH\Documents\Lightshot"
new_path = r"C:\Users\VAGISH\LightShot"

for file in os.listdir(path):
    img = Image.open(os.path.join(path, file)).resize((200, 200))
    final_path = os.path.join(new_path, file)
    img.save(final_path)

import tensorflow as tf

# Load the class-specific sub-folders as a labelled, batched image dataset.
path = r'C:\Users\VAGISH\Lightshot'
dataset = tf.keras.preprocessing.image_dataset_from_directory(
    path, batch_size=8, image_size=(200, 200))

for data, labels in dataset:
    print(data.dtype)
    print(data.shape)
    print(labels.dtype)
    print(labels.shape)
    print(labels)
Output:
<dtype: 'float32'>
(8, 200, 200, 3)
<dtype: 'int32'>
(8,)
tf.Tensor([0 1 0 1 0 1 0 1], shape=(8,), dtype=int32)

Similarly, we can load text files organized in class-specific folders into a dataset object using tf.keras.preprocessing.text_dataset_from_directory.
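As a minimal sketch (the folder path here is hypothetical), text files stored in one sub-directory per class can be loaded much like the images above:

import tensorflow as tf

# Hypothetical folder containing one sub-directory per class, each with .txt files.
text_path = r"C:\Users\VAGISH\Documents\Reviews"
text_dataset = tf.keras.preprocessing.text_dataset_from_directory(
    text_path, batch_size=8)

for texts, labels in text_dataset:
    print(texts.dtype)   # <dtype: 'string'>
    print(labels.shape)  # (8,)
    break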

Data Preprocessing with Keras

Once we have data in the form of string/int/float NumPy arrays, or a Dataset object that yields batches of string/int/float tensors, the next step is to preprocess the data.

This usually means:

1. Tokenization of string data, followed by token indexing

2. Feature normalization

3. Rescaling data to small values (zero mean and unit variance, or values in the [0, 1] range)

Text Vectorization

Keras provides a TextVectorization layer, which can be used directly inside models. It holds an index that maps words or tokens from string data to integer indices.

Normalization

It stores the mean and variance of all features.

The state of a preprocessing layer is computed by calling layer.adapt(data) on a sample of the training data.

Example:

Encoding strings as TF-IDF weighted vocabulary vectors:

from tensorflow.keras.layers.experimental.preprocessing import TextVectorization
import numpy as np

training_data = np.array([["I am vagish."], ["I am a student"], ["I study at school"]])
print(training_data)

# Build the vocabulary (and document frequencies) from the training data.
vectorizer = TextVectorization(output_mode="tf-idf")
vectorizer.adapt(training_data)

data = vectorizer(training_data)
print(data)
Output:
[['I am vagish.']
 ['I am a student']
 ['I study at school']]
tf.Tensor(
[[0.         0.5596158  0.6931472  0.91629076 0.         0.
  0.         0.         0.        ]
 [0.         0.5596158  0.6931472  0.         0.         0.91629076
  0.         0.         0.91629076]
 [0.         0.5596158  0.         0.         0.91629076 0.
  0.91629076 0.91629076 0.        ]], shape=(3, 9), dtype=float32)
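If we instead want actual sequences of integer word indices (rather than TF-IDF weights), the same layer can be used with its default output_mode="int". A minimal sketch, not part of the original article:

from tensorflow.keras.layers.experimental.preprocessing import TextVectorization
import numpy as np

training_data = np.array([["I am vagish."], ["I am a student"], ["I study at school"]])

# "int" mode maps each word to its index in the learned vocabulary.
vectorizer = TextVectorization(output_mode="int")
vectorizer.adapt(training_data)

print(vectorizer(training_data))  # padded sequences of integer token indices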


Encoding strings as multi-hot (binary) vocabulary vectors:

from tensorflow.keras.layers.experimental.preprocessing import TextVectorization
import numpy as np

training_data = np.array([["I am vagish."], ["I am a student"], ["I study at school"]])
print(training_data)

vectorizer = TextVectorization(output_mode="binary")
vectorizer.adapt(training_data)

data = vectorizer(training_data)
print(data)
Output:
[['I am vagish.']
 ['I am a student']
 ['I study at school']]
tf.Tensor(
[[0. 1. 1. 1. 0. 0. 0. 0. 0.]
 [0. 1. 1. 0. 0. 1. 0. 0. 1.]
 [0. 1. 0. 0. 1. 0. 1. 1. 0.]], shape=(3, 9), dtype=float32)

Generating bi-gram data

from tensorflow.keras.layers.experimental.preprocessing import TextVectorization
import numpy as np

# Example training data, of dtype `string`.
training_data = np.array([["This is me"], ["And there they are"]])

# With ngrams=2 the vocabulary contains both single words and bi-grams.
vectorizer = TextVectorization(output_mode="binary", ngrams=2)

vectorizer.adapt(training_data)
int_data = vectorizer(training_data)
print(int_data)

Output

tf.Tensor(
[[0. 1. 1. 0. 1. 0. 1. 1. 1. 0. 0. 0. 1. 0. 0. 1. 1.]
 [0. 1. 1. 0. 0. 1. 0. 0. 0. 1. 1. 1. 1. 1. 1. 0. 0.]], shape=(2, 17), dtype=float32)

Feature Normalization


from tensorflow.keras.layers.experimental.preprocessing import Normalization
import numpy as np

# Dummy image-like data: 256 samples of shape (200, 200, 3).
training_data = np.random.randint(0, 256, size=(256, 200, 200, 3)).astype("float32")

# Compute the per-channel mean and variance, then standardize the data.
normalizer = Normalization(axis=-1)
normalizer.adapt(training_data)

normalized_data = normalizer(training_data)
print(training_data)
print("var: %.4f" % np.var(normalized_data))
print("mean: %.4f" % np.mean(normalized_data))

Output:

[[[[177.  55.  50.]
   [ 85.  57.  44.]
   [219. 217.  10.]
   ...

(The rest of the printed 256 x 200 x 200 x 3 array of raw pixel values is truncated for brevity.)

var: 1.0000

mean: 0.0000

Rescaling and center cropping images

from tensorflow.keras.layers.experimental.preprocessing import CenterCrop
from tensorflow.keras.layers.experimental.preprocessing import Rescaling
import numpy as np

# Dummy image batch: 64 samples of shape (200, 200, 3).
training_data = np.random.randint(0, 256, size=(64, 200, 200, 3)).astype("float32")

# Crop the central 100x100 region and rescale pixel values to [0, 1].
cropper = CenterCrop(height=100, width=100)
scaler = Rescaling(scale=1.0 / 255)

output_data = scaler(cropper(training_data))
print(output_data)
print("shape:", output_data.shape)
print("min:", np.min(output_data))
print("max:", np.max(output_data))

Output :

tf.Tensor(
[[[[0.50980395 0.56078434 0.3647059 ]
   [0.10196079 0.9215687  0.15686275]
   [0.95294124 0.9490197  0.427451  ]
   ...
   [0.87843144 0.9960785  0.9607844 ]]]], shape=(64, 100, 100, 3), dtype=float32)

(Most of the printed values are truncated for brevity.)

shape: (64, 100, 100, 3)

min: 0.0

max: 1.0

Building models with the Keras Functional API

A layer is a simple input-to-output transformation. Multiple layers can be stacked to get the desired result.

A linear layer that maps its inputs to a 16-dimensional feature space can be written as:

dense = keras.layers.Dense(units=16)
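As a quick illustration (a minimal sketch with dummy data, not taken from the article), calling this layer on a batch of inputs projects each sample into the 16-dimensional feature space:

import tensorflow as tf
from tensorflow import keras

dense = keras.layers.Dense(units=16)
x = tf.ones((2, 8))   # a dummy batch of 2 samples with 8 features each
y = dense(x)
print(y.shape)        # (2, 16)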

A model is essentially a directed acyclic graph (DAG) of layers. The most powerful way to build Keras models is with the Functional API.

We start by specifying the shape of the input, which is mandatory. If any dimension of the input is likely to vary, we can specify it as None.

For example:

Let's say we have RGB images with an input shape of (200, 200, 3); if the spatial dimensions can vary from image to image, we specify them as None.

For taking RGB images of any size as input, we can do the following:

inputs = keras.Input(shape=(None, None, 3))

After the input has been defined, we can chain layers:

from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.layers.experimental.preprocessing import CenterCrop
from tensorflow.keras.layers.experimental.preprocessing import Rescaling

# Let's say we expect our inputs to be RGB images of arbitrary size
inputs = keras.Input(shape=(None, None, 3))

# Center-crop images to 150x150 and rescale pixel values to [0, 1]
x = CenterCrop(height=150, width=150)(inputs)
x = Rescaling(scale=1.0 / 255)(x)

# Apply some convolution and pooling layers
x = layers.Conv2D(filters=32, kernel_size=(2, 2), activation="relu")(x)
x = layers.MaxPooling2D(pool_size=(1, 1))(x)
x = layers.Conv2D(filters=32, kernel_size=(2, 2), activation="relu")(x)
x = layers.MaxPooling2D(pool_size=(1, 1))(x)
x = layers.Conv2D(filters=32, kernel_size=(2, 2), activation="relu")(x)
x = layers.Conv2D(filters=32, kernel_size=(2, 2), activation="relu")(x)
x = layers.MaxPooling2D(pool_size=(1, 1))(x)
x = layers.Conv2D(filters=32, kernel_size=(2, 2), activation="relu")(x)
x = layers.MaxPooling2D(pool_size=(1, 1))(x)
x = layers.Conv2D(filters=32, kernel_size=(2, 2), activation="relu")(x)

# Pool the spatial dimensions down to a flat feature vector per image
x = layers.GlobalAveragePooling2D()(x)

# Finally, add a classification layer on top
num_classes = 10
outputs = layers.Dense(num_classes, activation="softmax")(x)

Once we have defined the directed acyclic graph of layers that turns our input(s) into our outputs, we instantiate a Model object:

model = keras.Model(inputs=inputs, outputs=outputs)

This model behaves basically like a bigger layer. You can call it on batches of data, like this:

import numpy as np

data = np.random.randint(0, 256, size=(64, 200, 200, 3)).astype("float32")
processed_data = model(data)
print(processed_data.shape)

A summary of the model can be printed with model.summary():

model.summary()

Model: "functional_3"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_4 (InputLayer)         [(None, None, None, 3)]   0
_________________________________________________________________
center_crop_5 (CenterCrop)   (None, 150, 150, 3)       0
_________________________________________________________________
rescaling_5 (Rescaling)      (None, 150, 150, 3)       0
_________________________________________________________________
conv2d_11 (Conv2D)           (None, 149, 149, 32)      416
_________________________________________________________________
max_pooling2d_8 (MaxPooling2 (None, 149, 149, 32)      0
_________________________________________________________________
conv2d_12 (Conv2D)           (None, 148, 148, 32)      4128
_________________________________________________________________
max_pooling2d_9 (MaxPooling2 (None, 148, 148, 32)      0
_________________________________________________________________
conv2d_13 (Conv2D)           (None, 147, 147, 32)      4128
_________________________________________________________________
conv2d_14 (Conv2D)           (None, 146, 146, 32)      4128
_________________________________________________________________
max_pooling2d_10 (MaxPooling (None, 146, 146, 32)      0
_________________________________________________________________
conv2d_15 (Conv2D)           (None, 145, 145, 32)      4128
_________________________________________________________________
max_pooling2d_11 (MaxPooling (None, 145, 145, 32)      0
_________________________________________________________________
conv2d_16 (Conv2D)           (None, 144, 144, 32)      4128
_________________________________________________________________
global_average_pooling2d_2 ( (None, 32)                0
_________________________________________________________________
dense_2 (Dense)              (None, 10)                330
=================================================================
Total params: 21,386

Trainable params: 21,386

Non-trainable params: 0

The next step is to train the model on our data. The Model class features a built-in training loop: the fit() method. It accepts Dataset objects, Python generators that yield batches of data, or NumPy arrays.

Before we can call fit(), we need to specify an optimizer and a loss function. This is the compile() step:

model.compile(optimizer=keras.optimizers.RMSprop(learning_rate=1e-4),
              loss=keras.losses.CategoricalCrossentropy())

Loss and optimizer can be specified via their string identifiers (in this case their default constructor argument values are used):

model.compile(optimizer='rmsprop', loss='categorical_crossentropy')

Once our model is compiled, we can start "fitting" the model to the data. Here's what fitting a model looks like with NumPy data:

model.fit(numpy_array_of_samples, numpy_array_of_labels,
          batch_size=32, epochs=10)

Besides the data, we have to specify two key parameters: the batch_size and the number of epochs (iterations over the data). Here our data will be sliced into batches of 32 samples, and the model will iterate 10 times over the data during training.
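To make the arithmetic concrete: the MNIST training split used in the example below has 60,000 samples, so with a batch_size of 64 each epoch consists of ceil(60,000 / 64) = 938 gradient updates (the 938/938 counter you will see in the progress bar), while a batch_size of 32 would give 60,000 / 32 = 1875 steps per epoch.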

Here’s what fitting a model looks like with a dataset:

model.fit(dataset_of_samples_and_labels, epochs=10)

Since the data yielded by a dataset is expected to be already batched, you don't need to specify the batch size here.

Let’s look at it in practice with a toy example model that learns to classify MNIST digits:

# Get the data as Numpy arrays
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
 
# Build a simple model
inputs = keras.Input(shape=(28, 28))
x = layers.experimental.preprocessing.Rescaling(1.0 / 255)(inputs)
x = layers.Flatten()(x)
x = layers.Dense(128, activation="relu")(x)
x = layers.Dense(128, activation="relu")(x)
outputs = layers.Dense(10, activation="softmax")(x)
model = keras.Model(inputs, outputs)
model.summary()
 
# Compile the model
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
 
# Train the model for 1 epoch from Numpy data
batch_size = 64
print("Fit on NumPy data")
history = model.fit(x_train, y_train, batch_size=batch_size, epochs=1)
 
# Train the model for 1 epoch using a dataset
dataset = tf.data.Dataset.from_tensor_slices((x_train, y_train)).batch(batch_size)
print("Fit on Dataset")
history = model.fit(dataset, epochs=1)

# The fit() call returns a "history" object recording the per-epoch metric values
print(history.history)

{'loss': [0.11615095287561417]}

For a detailed overview of how to use fit(), see the guide to training & evaluation with the built-in Keras methods.

Keeping track of performance metrics

As you're training a model, you want to keep track of metrics such as classification accuracy, precision, recall, AUC, etc. In addition, you want to monitor these metrics not only on the training data, but also on a validation set.

Monitoring metrics

You can pass a list of metric objects to compile(), like this:

model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=[keras.metrics.SparseCategoricalAccuracy(name="acc")],
)
history = model.fit(dataset, epochs=1)

Passing validation data to fit()

You can pass validation data to fit() to monitor your validation loss & validation metrics. Validation metrics get reported at the end of each epoch.

val_dataset = tf.data.Dataset.from_tensor_slices((x_test, y_test)).batch(batch_size)
history = model.fit(dataset, epochs=1, validation_data=val_dataset)

938/938 [==============================] - 1s 1ms/step - loss: 0.0556 - acc: 0.9829 - val_loss: 0.1163 - val_acc: 0.9670

Using callbacks for checkpointing 

If training goes on for more than a few minutes, it’s important to save your model at regular intervals during training. You can then use your saved models to restart training in case your training process crashes (this is important for multi-worker distributed training, since with many workers at least one of them is bound to fail at some point).

An important feature of Keras is callbacks, configured in fit(). Callbacks are objects that get called by the model at different points during training, in particular:

At the beginning and end of each batch

At the beginning and end of each epoch

Callbacks are a way to make model training fully scriptable.

You can use callbacks to periodically save your model. Here’s a simple example: a ModelCheckpoint callback configured to save the model at the end of every epoch. The filename will include the current epoch.


callbacks = [
    keras.callbacks.ModelCheckpoint(
        filepath='path/to/my/model_{epoch}',
        save_freq='epoch')
]
model.fit(dataset, epochs=2, callbacks=callbacks)

You can also use callbacks to do things like periodically changing the learning rate of your optimizer, streaming metrics to a Slack bot, sending yourself an email notification when training is complete, etc.

For a detailed overview of which callbacks are available and how to write your own, see the callbacks API documentation and the guide to writing custom callbacks.
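As a rough sketch of what a custom callback can look like (illustrative only, this callback is not part of the article), here is one that prints the training loss at the end of every epoch:

from tensorflow import keras

class LossLogger(keras.callbacks.Callback):
    """Toy callback: print the training loss after every epoch."""
    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        print("epoch %d: loss = %s" % (epoch, logs.get("loss")))

# Used like any built-in callback:
# model.fit(dataset, epochs=2, callbacks=[LossLogger()])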

Monitoring training progress with TensorBoard

Staring at the Keras progress bar isn’t the most ergonomic way to monitor how your loss and metrics are evolving over time. There’s a better solution: TensorBoard, a web application that can display real-time graphs of your metrics (and more).

To use TensorBoard with fit(), simply pass a keras.callbacks.TensorBoard callback specifying the directory where to store TensorBoard logs:

callbacks = [
    keras.callbacks.TensorBoard(log_dir='./logs')
]
model.fit(dataset, epochs=2, callbacks=callbacks)

You can then launch a TensorBoard instance that you can open in your browser to monitor the logs getting written to this location:

tensorboard --logdir=./logs

What’s more, you can launch an in-line TensorBoard tab when training models in Jupyter / Colab notebooks. Here’s more information.
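In a Jupyter or Colab notebook this typically means loading the TensorBoard notebook extension and pointing it at the same log directory; a sketch of the usual magic commands:

%load_ext tensorboard
%tensorboard --logdir ./logs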

After fit(): evaluating test performance & generating predictions on new data

Once you have a trained model, you can evaluate its loss and metrics on new data via evaluate():

loss, acc = model.evaluate(val_dataset)  # returns loss and metrics
print("loss: %.2f" % loss)
print("acc: %.2f" % acc)

157/157 [==============================] - 0s 907us/step - loss: 0.1163 - acc: 0.9670

loss: 0.12

acc: 0.97

We can also generate NumPy arrays of predictions (the activations of the output layer(s) in the model) via predict():

predictions = model.predict(val_dataset)
print(predictions.shape)

(10000, 10)

Using fit() with a custom training step

By default, fit() is configured for supervised learning. If you need a different kind of training loop (for instance, a GAN training loop), you can provide your own implementation of the Model.train_step() method. This is the method that is repeatedly called during fit().

Metrics, callbacks, etc. will work as usual.

Here’s a simple example that reimplements what fit() normally does:

class CustomModel(keras.Model):
    def train_step(self, data):
        # Unpack the data. Its structure depends on your model and
        # on what you pass to `fit()`.
        x, y = data
        with tf.GradientTape() as tape:
            y_pred = self(x, training=True)  # Forward pass
            # Compute the loss value
            # (the loss function is configured in `compile()`)
            loss = self.compiled_loss(y, y_pred,
                                      regularization_losses=self.losses)
        # Compute gradients
        trainable_vars = self.trainable_variables
        gradients = tape.gradient(loss, trainable_vars)
        # Update weights
        self.optimizer.apply_gradients(zip(gradients, trainable_vars))
        # Update metrics (includes the metric that tracks the loss)
        self.compiled_metrics.update_state(y, y_pred)
        # Return a dict mapping metric names to current value
        return {m.name: m.result() for m in self.metrics}


# Construct and compile an instance of CustomModel
inputs = keras.Input(shape=(32,))
outputs = keras.layers.Dense(1)(inputs)
model = CustomModel(inputs, outputs)
model.compile(optimizer='adam', loss='mse', metrics=[...])

# Just use `fit` as usual
model.fit(dataset, epochs=3, callbacks=...)

Code can be written in Keras using two methodologies:

1.  Sequential API

2.  Functional API

Having already used the Functional API above, we will now look at the Sequential API.

Sequential API

A Sequential model is appropriate when a plain stack of layers is required, where each layer has exactly one input tensor and one output tensor.

A sequential model is not appropriate when:

Designing a network with multiple inputs or multiple outputs

If any one of the layers has multiple inputs or multiple outputs

When sharing of layers is required

When the network has a non-linear topology (for example, residual connections or a multi-branch model)

Sequential model creation

We can create a Sequential model by importing the Sequential class from keras.models and passing a list of layers to its constructor; the layers are stacked in the order in which they are listed.
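For instance, a minimal sketch (the layer sizes here are arbitrary and not taken from the article):

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Layers are stacked in the order in which they appear in the list.
model = Sequential([
    Dense(64, activation="relu", input_shape=(784,)),
    Dense(10, activation="softmax"),
])
model.summary()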

Let's train an image classification model on the Fashion-MNIST dataset:

Sample code in TensorFlow

# TensorFlow and tf.keras
import tensorflow as tf
from tensorflow import keras
 
# Helper libraries
import numpy as np
import matplotlib.pyplot as plt
 
fashion_mnist = keras.datasets.fashion_mnist
 
(train_images, train_labels), (test_images, test_labels) = fashion_mnist.load_data()
class_names = ['T-shirt/top', 'Trouser', 'Pullover', 'Dress', 'Coat',
               'Sandal', 'Shirt', 'Sneaker', 'Bag', 'Ankle boot']
plt.figure()
plt.imshow(train_images[0])
plt.colorbar()
plt.grid(False)
plt.show()
 
train_images = train_images / 255.0
 
test_images = test_images / 255.0
 
plt.figure(figsize=(10,10))
for i in range(25):
    plt.subplot(5,5,i+1)
    plt.xticks([])
    plt.yticks([])
    plt.grid(False)
    plt.imshow(train_images[i], cmap=plt.cm.binary)
    plt.xlabel(class_names[train_labels[i]])
plt.show()
 
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation='relu'),
    keras.layers.Dense(10)
])
 
 
model.compile(optimizer='adam',
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=['accuracy'])

# Print the layer-by-layer summary shown in the output below
print(model.summary())

# Train for 20 epochs (matching the training log shown below)
model.fit(train_images, train_labels, epochs=20)
 
test_loss, test_acc = model.evaluate(test_images,  test_labels, verbose=2)
 
print('\nTest accuracy:', test_acc)
 
probability_model = tf.keras.Sequential([model, 
                                         tf.keras.layers.Softmax()])
 
predictions = probability_model.predict(test_images)
 
np.argmax(predictions[0])
 
def plot_image(i, predictions_array, true_label, img):
  predictions_array, true_label, img = predictions_array, true_label[i], img[i]
  plt.grid(False)
  plt.xticks([])
  plt.yticks([])
 
  plt.imshow(img, cmap=plt.cm.binary)
 
  predicted_label = np.argmax(predictions_array)
  if predicted_label == true_label:
    color = 'blue'
  else:
    color = 'red'
 
  plt.xlabel("{} {:2.0f}% ({})".format(class_names[predicted_label],
                                100*np.max(predictions_array),
                                class_names[true_label]),
                                color=color)
 
def plot_value_array(i, predictions_array, true_label):
  predictions_array, true_label = predictions_array, true_label[i]
  plt.grid(False)
  plt.xticks(range(10))
  plt.yticks([])
  thisplot = plt.bar(range(10), predictions_array, color="#777777")
  plt.ylim([0, 1])
  predicted_label = np.argmax(predictions_array)
 
  thisplot[predicted_label].set_color('red')
  thisplot[true_label].set_color('blue')
 
# Plot the first X test images, their predicted labels, and the true labels.
# Color correct predictions in blue and incorrect predictions in red.
num_rows = 5
num_cols = 3
num_images = num_rows*num_cols
plt.figure(figsize=(2*2*num_cols, 2*num_rows))
for i in range(num_images):
  plt.subplot(num_rows, 2*num_cols, 2*i+1)
  plot_image(i, predictions[i], test_labels, test_images)
  plt.subplot(num_rows, 2*num_cols, 2*i+2)
  plot_value_array(i, predictions[i], test_labels)
plt.tight_layout()
plt.show()


Sample train image values (truncated; the rows visible in the printed output are all zeros):

[[[0. 0. 0. ... 0. 0. 0.]
  [0. 0. 0. ... 0. 0. 0.]
  ...
  [0. 0. 0. ... 0. 0. 0.]]]

Sample test image values (truncated; the rows visible in the printed output are all zeros):

[[[0. 0. 0. ... 0. 0. 0.]
  [0. 0. 0. ... 0. 0. 0.]
  ...
  [0. 0. 0. ... 0. 0. 0.]]]

Model: "sequential_12"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
flatten_6 (Flatten)          (None, 784)               0
_________________________________________________________________
dense_15 (Dense)             (None, 128)               100480
_________________________________________________________________
dense_16 (Dense)             (None, 10)                1290
=================================================================
Total params: 101,770

Trainable params: 101,770

Non-trainable params: 0
_________________________________________________________________
None

Epoch 1/20
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4999 - accuracy: 0.8244
Epoch 2/20
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3753 - accuracy: 0.8644
Epoch 3/20
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3372 - accuracy: 0.8773
Epoch 4/20
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3125 - accuracy: 0.8851
Epoch 5/20
1875/1875 [==============================] - 3s 1ms/step - loss: 0.2941 - accuracy: 0.8925
Epoch 6/20
1875/1875 [==============================] - 2s 1ms/step - loss: 0.2799 - accuracy: 0.8958
Epoch 7/20
1875/1875 [==============================] - 2s 1ms/step - loss: 0.2683 - accuracy: 0.9004
Epoch 8/20
1875/1875 [==============================] - 2s 1ms/step - loss: 0.2570 - accuracy: 0.9038
Epoch 9/20
1875/1875 [==============================] - 2s 1ms/step - loss: 0.2466 - accuracy: 0.9088
Epoch 10/20
1875/1875 [==============================] - 3s 1ms/step - loss: 0.2389 - accuracy: 0.9104
Epoch 11/20
1875/1875 [==============================] - 3s 1ms/step - loss: 0.2319 - accuracy: 0.9137
Epoch 12/20
1875/1875 [==============================] - 3s 1ms/step - loss: 0.2230 - accuracy: 0.9172
Epoch 13/20
1875/1875 [==============================] - 3s 1ms/step - loss: 0.2168 - accuracy: 0.9195
Epoch 14/20
1875/1875 [==============================] - 3s 1ms/step - loss: 0.2109 - accuracy: 0.9209
Epoch 15/20
1875/1875 [==============================] - 2s 1ms/step - loss: 0.2055 - accuracy: 0.9234
Epoch 16/20
1875/1875 [==============================] - 2s 1ms/step - loss: 0.1993 - accuracy: 0.9249
Epoch 17/20
1875/1875 [==============================] - 2s 1ms/step - loss: 0.1939 - accuracy: 0.9272
Epoch 18/20
1875/1875 [==============================] - 2s 1ms/step - loss: 0.1908 - accuracy: 0.9287
Epoch 19/20
1875/1875 [==============================] - 2s 1ms/step - loss: 0.1867 - accuracy: 0.9304
Epoch 20/20
1875/1875 [==============================] - 2s 1ms/step - loss: 0.1811 - accuracy: 0.9325
313/313 - 0s - loss: 0.3704 - accuracy: 0.8847

Test accuracy: 0.8847000002861023

(Two printed 28 x 28 arrays of scaled pixel values, with entries in the [0, 1] range, are truncated here for brevity.)

Vagish Abhishek
