{"id":18572,"date":"2020-09-16T16:12:00","date_gmt":"2020-09-16T10:42:00","guid":{"rendered":"https:\/\/www.mygreatlearning.com\/blog\/keras-tutorial\/"},"modified":"2024-09-03T18:00:08","modified_gmt":"2024-09-03T12:30:08","slug":"keras-tutorial","status":"publish","type":"post","link":"https:\/\/www.mygreatlearning.com\/blog\/keras-tutorial\/","title":{"rendered":"Keras Tutorial - Deep Learning Framework"},"content":{"rendered":"\n<p><strong>Keras<\/strong><\/p>\n\n\n\n<p>Keras was integrated into TensorFlow 2.0, which succeeded TensorFlow 1.0. Keras is now a part of TensorFlow.<\/p>\n\n\n\n<p>Keras was created by Francois Chollet.<\/p>\n\n\n\n<p><strong>Data loading and preprocessing<\/strong><\/p>\n\n\n\n<p>Neural networks don't process raw data such as encoded JPEG image files or CSV files. They work on vectorized and standardized representations.<\/p>\n\n\n\n<p>Text files need to be read into string tensors and then split into individual words. Finally, the words need to be indexed and turned into integer tensors.<\/p>\n\n\n\n<p>Image files need to be read and decoded into integer tensors, then converted to floating-point and normalized to small values (usually between 0 and 1).<\/p>\n\n\n\n<p>CSV files need to be parsed, with numerical features converted into floating-point tensors and categorical features indexed and converted into integer tensors. Each feature then typically needs to be normalized to zero mean and unit variance, or to the range [0,1].<\/p>\n\n\n\n<p><strong>Data loading<\/strong><\/p>\n\n\n\n<p>Keras models accept three types of inputs:<\/p>\n\n\n\n<p>NumPy arrays: a good option if the data fits in memory.<\/p>\n\n\n\n<p>TensorFlow Dataset objects: a high-performance option, more suitable for datasets that do not fit in memory and that are streamed from disk or from a distributed filesystem.<\/p>\n\n\n\n<p>Python generators that yield batches of data, such as custom subclasses of keras.utils.Sequence.<\/p>\n\n\n\n<p>Before we start training a model, we will need to make our data available in one of these formats. 
If we have a large dataset and we are training on GPU(s), we should consider using Dataset objects, since they take care of performance-critical details such as:<\/p>\n\n\n\n<p>Asynchronously preprocessing the data on the CPU while the GPU is busy, and buffering it into a queue.<\/p>\n\n\n\n<p>Prefetching data to GPU memory so that it is immediately available when the GPU has finished processing the previous batch, letting us reach full GPU utilization.<\/p>\n\n\n\n<p>Keras provides a wide range of utilities to help us turn raw data on disk into a Dataset object:<\/p>\n\n\n\n<p>tf.keras.preprocessing.image_dataset_from_directory : turns image files sorted into class-specific folders into a labelled dataset of image tensors of a definite shape.<\/p>\n\n\n\n<p>tf.keras.preprocessing.text_dataset_from_directory does the same for text files.<\/p>\n\n\n\n<p>Assuming a directory of images organized into class-specific subfolders, we will resize the images and load the directory as a Keras dataset.<br><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>import os\nfrom PIL import Image\n\npath=r\"C:\\Users\\VAGISH\\Documents\\Lightshot\"\nnew_path=r\"C:\\Users\\VAGISH\\LightShot\"\n\n# Resize every image to 200x200 and save it to the new folder\nfor file in os.listdir(path):\n    img=Image.open(os.path.join(path,file)).resize((200,200))\n    final_path=os.path.join(new_path,file)\n    img.save(final_path)\n\nimport tensorflow as tf\n\n# Load the resized images (organized into class subfolders) as a dataset\ndataset = tf.keras.preprocessing.image_dataset_from_directory(new_path,batch_size=8,image_size=(200,200))\nfor data,labels in dataset:\n    print(data.dtype)\n    print(data.shape)\n    print(labels.dtype)\n    print(labels.shape)\n    print(labels)\n<\/code><\/pre>\n\n\n\n<pre class=\"wp-block-preformatted\">Output:\n &lt;dtype: 'float32'&gt; \n (8, 200, 200, 3)\n &lt;dtype: 'int32'&gt; \n (8,)  \n tf.Tensor([0 1 0 1 0 1 0 1], shape=(8,), dtype=int32) <\/pre>\n\n\n\n<p>&nbsp;Similarly, we can 
load text files organized in different folders to a dataset object using:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">tf.keras.preprocessing.text_dataset_from_directory<\/pre>\n\n\n\n<p><strong>Data Preprocessing with Keras<\/strong><\/p>\n\n\n\n<p>Once we have data in the form of string\/int\/float NumPy arrays, or a dataset object that yields batches of string\/int\/float tensors, the next step is to preprocess the data. Taking up <a href=\"https:\/\/www.mygreatlearning.com\/keras\/free-courses\" target=\"_blank\" rel=\"noreferrer noopener\">keras courses<\/a> will help you learn more about the concept. <\/p>\n\n\n\n<p>This usually means:<\/p>\n\n\n\n<p>1. Tokenization of string data, followed by indexing<\/p>\n\n\n\n<p>2. Feature normalization<\/p>\n\n\n\n<p>3. Rescaling data to small values (zero mean and unit variance, or in the range [0,1])<\/p>\n\n\n\n<p>4. Text vectorization<\/p>\n\n\n\n<p>Keras provides a TextVectorization layer, which can be used directly in models. It holds an index mapping words or tokens from string data to integer indices.<\/p>\n\n\n\n<p><strong>Normalization<\/strong><\/p>\n\n\n\n<p>The Normalization layer stores the mean and variance of every feature.<\/p>\n\n\n\n<p>The state of a preprocessing layer is set by calling layer.adapt(data) on a sample of the training data.<\/p>\n\n\n\n<p>Example:<\/p>\n\n\n\n<p>Vectorizing strings with TF-IDF weighting<br><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>from tensorflow.keras.layers.experimental.preprocessing import TextVectorization\nimport numpy as np\n\ntraining_data=np.array(&#91;&#91;\"I am vagish.\"],&#91;\"I am a student\"],&#91;\"I study at school\"]])\nprint(training_data)\n\n# Build the vocabulary and document frequencies from the training data\nvectorizer=TextVectorization(output_mode=\"tf-idf\")\nvectorizer.adapt(training_data)\n\ndata = vectorizer(training_data)\nprint(data)\n<\/code><\/pre>\n\n\n\n<pre class=\"wp-block-code\"><code>OUTPUT:\n&#91;&#91;'I am vagish.']\n  &#91;'I am a student'] \n  &#91;'I study at school']] 
\ntf.Tensor(\n&#91;&#91;0.         0.5596158  0.6931472  0.91629076 0.      0.\n 0.      0.      0.     ]\n &#91;0.      0.5596158  0.6931472  0.      0.      0.91629076\n0.      0.      0.91629076]\n &#91;0.      0.5596158  0.         0.      0.91629076 0.\n  0.91629076 0.91629076 0.     ]], shape=(3, 9), dtype=float32)<\/code><\/pre>\n\n\n\n<p>Multi-hot encoding strings with output_mode=\"binary\"<br><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>from tensorflow.keras.layers.experimental.preprocessing import TextVectorization\nimport numpy as np\n\ntraining_data=np.array(&#91;&#91;\"I am vagish.\"],&#91;\"I am a student\"],&#91;\"I study at school\"]])\nprint(training_data)\n\n# Each output column is 1 if the corresponding vocabulary word is present\nvectorizer=TextVectorization(output_mode=\"binary\")\nvectorizer.adapt(training_data)\n\ndata = vectorizer(training_data)\nprint(data)<\/code><\/pre>\n\n\n\n<pre class=\"wp-block-preformatted\"><strong>Output<\/strong>\n[['I am vagish.']\n   ['I am a student'] \n   ['I study at school']] \n\n tf.Tensor( \n [[0. 1. 1. 1. 0. 0. 0. 0. 0.] \n [0. 1. 1. 0. 0. 1. 0. 0. 1.] \n [0. 1. 0. 0. 1. 0. 1. 1. 0.]], shape=(3, 9), dtype=float32) <\/pre>\n\n\n\n<p><strong>Generating bigram data<\/strong><br><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>from tensorflow.keras.layers.experimental.preprocessing import TextVectorization\nimport numpy as np\n\n# Example training data, of dtype `string`.\ntraining_data = np.array(&#91;&#91;\"This is me\"], &#91;\"And there they are\"]])\n\n# ngrams=2 indexes unigrams and bigrams together\nvectorizer = TextVectorization(output_mode=\"binary\", ngrams=2)\n\nvectorizer.adapt(training_data)\nint_data = vectorizer(training_data)\nprint(int_data)\n<\/code><\/pre>\n\n\n\n<p><strong>Output<\/strong><\/p>\n\n\n\n<p>tf.Tensor(<\/p>\n\n\n\n<p>[[0. 1. 1. 0. 1. 0. 1. 1. 1. 0. 0. 0. 1. 0. 0. 1. 1.]<\/p>\n\n\n\n<p>&nbsp;[0. 1. 1. 0. 0. 1. 0. 0. 0. 1. 1. 1. 1. 1. 1. 0. 
0.]], shape=(2, 17), dtype=float32)<\/p>\n\n\n\n<p><strong>Feature Normalization<\/strong><br><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>from tensorflow.keras.layers.experimental.preprocessing import Normalization\nimport numpy as np\n\n# A batch of 256 random 200x200 RGB images\ntraining_data = np.random.randint(0, 256, size=(256, 200, 200, 3)).astype(\"float32\")\n\n# adapt() computes the per-channel mean and variance of the training data\nnormalizer = Normalization(axis=-1)\nnormalizer.adapt(training_data)\n\nnormalized_data = normalizer(training_data)\nprint(training_data)\nprint(\"var: %.4f\" % np.var(normalized_data))\nprint(\"mean: %.4f\" % np.mean(normalized_data))\n<\/code><\/pre>\n\n\n\n<p>Output (the raw training data, truncated here, followed by the variance and mean of the normalized data):<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">[[[[177.  55.  50.]\n   [ 85.  57.  44.]\n   [219. 217.  10.]\n   ...\n   [182. 179.   1.]]]]\n\nvar: 1.0000\nmean: 0.0000<\/pre>\n\n\n\n<p><strong>Rescaling and center-cropping images<\/strong><br><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>from tensorflow.keras.layers.experimental.preprocessing import CenterCrop\nfrom tensorflow.keras.layers.experimental.preprocessing import Rescaling\nimport numpy as np\n\ntraining_data = np.random.randint(0, 256, size=(16, 200, 200, 3)).astype(\"float32\")\n\n# Crop the central 100x100 window, then rescale pixel values to [0, 1]\ncropper = CenterCrop(height=100, width=100)\nscaler = Rescaling(scale=1.0 \/ 255)\n\noutput_data = scaler(cropper(training_data))\nprint(output_data)\nprint(\"shape:\", output_data.shape)\nprint(\"min:\", np.min(output_data))\nprint(\"max:\", np.max(output_data))\n<\/code><\/pre>\n\n\n\n<p>Output (truncated):<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">tf.Tensor(\n[[[[0.50980395 0.56078434 0.3647059 ]\n   [0.10196079 0.9215687  0.15686275]\n   [0.95294124 0.9490197  0.427451  ]\n   ...<\/pre>
&nbsp; \t]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.3372549&nbsp; 0.49411768 0.9490197 ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.4039216&nbsp; 0.27058825 0.227451&nbsp; ]]<\/p>\n\n\n\n<p>&nbsp;&nbsp;...<\/p>\n\n\n\n<p>&nbsp;&nbsp;[[0.28235295 0.882353 &nbsp; 0.6784314 ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.75294125 0.46274513 0.0627451 ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.9490197&nbsp; 0.6627451&nbsp; 0.25882354]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;...<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.8196079&nbsp; 0.17254902 0.3372549 ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.97647065 0.8000001&nbsp; 0.5137255 ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.9215687&nbsp; 0.227451 &nbsp; 0.50980395]]<\/p>\n\n\n\n<p>&nbsp;&nbsp;[[0.37647063 0.74509805 1.&nbsp; &nbsp; \t]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.31764707 0.10980393 0.59607846]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.89019614 0.18823531 0.61960787]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;...<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.9490197&nbsp; 0.17254902 0.02745098]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.97647065 0.86274517 0.25882354]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.30588236 0.20784315 0.4039216 ]]<\/p>\n\n\n\n<p>&nbsp;&nbsp;[[0.6392157&nbsp; 0.2627451&nbsp; 0.654902&nbsp; ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.96470594 0.27450982 0.7294118 ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.16862746 0.40784317 0.6156863 ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;...<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.9843138&nbsp; 0.03529412 0.40784317]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.86666673 0.9058824&nbsp; 0.9058824 ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0. 
&nbsp; &nbsp; \t0.52156866 0.42352945]]]<\/p>\n\n\n\n<p>&nbsp;[[[0.6&nbsp; &nbsp; \t0.34509805 0.7137255 ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.7686275&nbsp; 0.4156863&nbsp; 0.9803922 ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.8862746&nbsp; 0.79215693 0.04705883]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;...<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.7254902&nbsp; 0.20000002 0.5411765 ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.09019608 0.9843138&nbsp; 0.5176471 ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.6&nbsp; &nbsp; \t0.49411768 0.6117647 ]]<\/p>\n\n\n\n<p>&nbsp;&nbsp;[[0.5294118&nbsp; 0.09411766 0.86666673]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.7960785&nbsp; 0.34117648 0.7607844 ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.6039216&nbsp; 0.6117647&nbsp; 0.73333335]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;...<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.3803922&nbsp; 0.83921576 0.5254902 ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.15686275 0.4784314&nbsp; 0.34901962]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.82745105 0.95294124 0.8941177 ]]<\/p>\n\n\n\n<p>&nbsp;&nbsp;[[0.7568628&nbsp; 0.6666667&nbsp; 0.5764706 ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.14117648 0.05490196 0.05490196]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.44705886 0.07450981 0.13333334]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;...<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.47450984 0.81568635 0.37254903]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.5686275&nbsp; 0.427451 &nbsp; 0.06666667]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.9960785&nbsp; 0.9294118&nbsp; 0.29411766]]<\/p>\n\n\n\n<p>&nbsp;&nbsp;...<\/p>\n\n\n\n<p>&nbsp;&nbsp;[[0.7294118&nbsp; 0.7137255&nbsp; 0.21960786]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.41960788 0.81568635 0.7843138 ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.61960787 0.6784314&nbsp; 0.9607844 ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;...<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.6117647&nbsp; 0.427451 &nbsp; 0.86274517]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.89019614 0.2509804&nbsp; 0.47450984]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.40000004 0.3803922&nbsp; 0.9607844 
]]<\/p>\n\n\n\n<p>&nbsp;&nbsp;[[0.21568629 0.7254902&nbsp; 0.10588236]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.45098042 0.54509807 0.18823531]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.41176474 0.30980393 0.09411766]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;...<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.2627451&nbsp; 0.882353 &nbsp; 0.8862746 ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.4666667&nbsp; 0.89019614 0.69411767]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.38823533 0.64705884 0.42352945]]<\/p>\n\n\n\n<p>&nbsp;&nbsp;[[0.00784314 0.33333334 0.7803922 ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.5647059&nbsp; 0.7686275&nbsp; 0.03529412]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.44705886 0.19215688 0.7490196 ]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;...<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.6117647&nbsp; 0.7803922&nbsp; 0.60784316]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.7137255&nbsp; 0.6509804&nbsp; 0.18823531]<\/p>\n\n\n\n<p>&nbsp;&nbsp;&nbsp;[0.87843144 0.9960785&nbsp; 0.9607844 ]]]], shape=(64, 100, 100, 3), dtype=float32)<\/p>\n\n\n\n<p>shape: (64, 100, 100, 3)<\/p>\n\n\n\n<p>min: 0.0<\/p>\n\n\n\n<p>max: 1.0<\/p>\n\n\n\n<p><\/p>\n\n\n\n<p><strong>Building models with keras Functional API<\/strong><\/p>\n\n\n\n<p>A simple input output transformation is known as a layer. Multiple layers can be stacked to get desired results.<\/p>\n\n\n\n<p>A linear&nbsp; layer that maps its inputs to a 16 dimensional feature space can be written as :<\/p>\n\n\n\n<p>D=keras.layers.Dense(units=)<\/p>\n\n\n\n<p>A model is practically a directed acyclic graph of individual layers. 
The most powerful way to build Keras models is the Functional API.<\/p>\n\n\n\n<p>We start by specifying the input shape, which is mandatory. If any dimension of the input is likely to vary, we can specify it as None.<\/p>\n\n\n\n<p>For example:<\/p>\n\n\n\n<p>Suppose we have an input shape of (200, 200, 3) for an RGB image; to accept RGB images of any height and width, we can specify the shape as (None, None, 3):<\/p>\n\n\n\n<p>inputs = keras.Input(shape=(None, None, 3))<\/p>\n\n\n\n<p>After the input has been defined, we can chain layers:<br><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>from tensorflow import keras\nfrom tensorflow.keras import layers\nfrom tensorflow.keras.layers.experimental.preprocessing import CenterCrop, Rescaling\n \n# Let's say we expect our inputs to be RGB images of arbitrary size\ninputs = keras.Input(shape=(None, None, 3))\n \n# Center-crop images to 100x100 and rescale pixel values to [0, 1]\nx = CenterCrop(height=100, width=100)(inputs)\nx = Rescaling(scale=1.0 \/ 255)(x)\n \n# Apply some convolution and pooling layers\nx = layers.Conv2D(filters=32, kernel_size=(2, 2), activation=\"relu\")(x)\nx = layers.MaxPooling2D(pool_size=(1, 1))(x)\nx = layers.Conv2D(filters=32, kernel_size=(2, 2), activation=\"relu\")(x)\nx = layers.MaxPooling2D(pool_size=(1, 1))(x)\nx = layers.Conv2D(filters=32, kernel_size=(2, 2), activation=\"relu\")(x)\nx = layers.Conv2D(filters=32, kernel_size=(2, 2), activation=\"relu\")(x)\nx = layers.MaxPooling2D(pool_size=(1, 1))(x)\nx = layers.Conv2D(filters=32, kernel_size=(2, 2), activation=\"relu\")(x)\nx = layers.MaxPooling2D(pool_size=(1, 1))(x)\nx = layers.Conv2D(filters=32, kernel_size=(2, 2), activation=\"relu\")(x)\n \n# Global average pooling to get flat feature vectors\nx = layers.GlobalAveragePooling2D()(x)\n \n# Add a dense classifier on top\nnum_classes = 10\noutputs = layers.Dense(num_classes, activation=\"softmax\")(x)\n<\/code><\/pre>\n\n\n\n<p>Once we have defined the directed acyclic graph of layers that turns our input(s) into our outputs, we instantiate a Model object:<br><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>model = 
keras.Model(inputs=inputs, outputs=outputs)\n \n# This model behaves like a bigger layer: you can call it on batches of data.\nimport numpy as np\ndata = np.random.randint(0, 256, size=(64, 200, 200, 3)).astype(\"float32\")\nprocessed_data = model(data)\nprint(processed_data.shape)\n<\/code><\/pre>\n\n\n\n<p>A summary of the model can be printed with model.summary():<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">Model: \"functional_3\"\n_________________________________________________________________\nLayer (type)                 Output Shape              Param #\n=================================================================\ninput_4 (InputLayer)         [(None, None, None, 3)]  0\n_________________________________________________________________\ncenter_crop_5 (CenterCrop)   (None, 150, 150, 3)      0\n_________________________________________________________________\nrescaling_5 (Rescaling)      (None, 150, 150, 3)      0\n_________________________________________________________________\nconv2d_11 (Conv2D)           (None, 149, 149, 32)     416\n_________________________________________________________________\nmax_pooling2d_8 (MaxPooling2 (None, 149, 149, 32)     0\n_________________________________________________________________\nconv2d_12 (Conv2D)           (None, 148, 148, 32)     4128\n_________________________________________________________________\nmax_pooling2d_9 (MaxPooling2 (None, 148, 148, 32)     0\n_________________________________________________________________\nconv2d_13 (Conv2D)           (None, 147, 147, 32)     4128\n_________________________________________________________________\nconv2d_14 (Conv2D)           (None, 146, 146, 32)     4128\n_________________________________________________________________\nmax_pooling2d_10 (MaxPooling (None, 146, 146, 32)     0\n_________________________________________________________________\nconv2d_15 (Conv2D)           (None, 145, 145, 32)     4128\n_________________________________________________________________\nmax_pooling2d_11 (MaxPooling (None, 145, 145, 32)     0\n_________________________________________________________________\nconv2d_16 (Conv2D)           (None, 144, 144, 32)     4128\n_________________________________________________________________\nglobal_average_pooling2d_2 ( (None, 32)               0\n_________________________________________________________________\ndense_2 (Dense)              (None, 10)               330\n=================================================================\nTotal params: 21,386\nTrainable params: 21,386\nNon-trainable params: 0<\/pre>\n\n\n\n<p>The next step is to train the model on our data. The Model class has a built-in training loop, the fit() method, which accepts Dataset objects, Python generators that yield batches of data, or NumPy arrays.<\/p>\n\n\n\n<p>Before we can call fit(), we need to specify an optimizer and a loss function. This is the compile() step:<br><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>model.compile(optimizer=keras.optimizers.RMSprop(learning_rate=1e-4),\n              loss=keras.losses.CategoricalCrossentropy())\n<\/code><\/pre>\n\n\n\n<p>Loss and optimizer can also be specified via their string identifiers (in which case their default constructor argument values are used):<\/p>\n\n\n\n<p>model.compile(optimizer='rmsprop', loss='categorical_crossentropy')<\/p>\n\n\n\n<p>Once our model is compiled, you can start \"fitting\" the model to the data. Here's what fitting a model looks like with NumPy data:<br><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>model.fit(numpy_array_of_samples, numpy_array_of_labels,\n          batch_size=32, epochs=10)\n<\/code><\/pre>\n\n\n\n<p>Besides the data, you have to specify two key parameters: the batch_size and the number of epochs (iterations over the data). 
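<\/p>\n\n\n\n<p>As a sketch of the arithmetic behind these two parameters (the sample count below is hypothetical, chosen to match the MNIST-sized runs shown later):<\/p>

```python
import math

# Hypothetical: 60,000 training samples, batch_size=32, epochs=10.
num_samples = 60_000
batch_size = 32
epochs = 10

# One gradient update per batch; one full pass over all batches per epoch.
steps_per_epoch = math.ceil(num_samples / batch_size)
total_steps = steps_per_epoch * epochs
print(steps_per_epoch, total_steps)  # 1875 18750
```

<p>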
Here the data will be sliced into batches of 32 samples, and the model will iterate over the data 10 times during training.<\/p>\n\n\n\n<p>Here's what fitting a model looks like with a dataset:<\/p>\n\n\n\n<p>model.fit(dataset_of_samples_and_labels, epochs=10)<\/p>\n\n\n\n<p>Since the data yielded by a dataset is expected to be batched already, you don't need to specify the batch size here.<\/p>\n\n\n\n<p>Let's look at it in practice with a toy example model that learns to classify MNIST digits:<br><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>import tensorflow as tf\nfrom tensorflow import keras\nfrom tensorflow.keras import layers\n \n# Get the data as Numpy arrays\n(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()\n \n# Build a simple model\ninputs = keras.Input(shape=(28, 28))\nx = layers.experimental.preprocessing.Rescaling(1.0 \/ 255)(inputs)\nx = layers.Flatten()(x)\nx = layers.Dense(128, activation=\"relu\")(x)\nx = layers.Dense(128, activation=\"relu\")(x)\noutputs = layers.Dense(10, activation=\"softmax\")(x)\nmodel = keras.Model(inputs, outputs)\nmodel.summary()\n \n# Compile the model\nmodel.compile(optimizer=\"adam\", loss=\"sparse_categorical_crossentropy\")\n \n# Train the model for 1 epoch from Numpy data\nbatch_size = 64\nprint(\"Fit on NumPy data\")\nhistory = model.fit(x_train, y_train, batch_size=batch_size, epochs=1)\n \n# Train the model for 1 epoch using a dataset\ndataset = tf.data.Dataset.from_tensor_slices((x_train, y_train)).batch(batch_size)\nprint(\"Fit on Dataset\")\nhistory = model.fit(dataset, epochs=1)\n<\/code><\/pre>\n\n\n\n<p>The returned history object records the loss per epoch: {'loss': [0.11615095287561417]}<\/p>\n\n\n\n<p>For a detailed overview of how to use fit(), see the guide to training &amp; evaluation with the built-in Keras methods.<\/p>\n\n\n\n<p><strong>Keeping track of performance metrics<\/strong><\/p>\n\n\n\n<p>As you're training a model, you want to keep track of metrics such as classification accuracy, precision, recall, AUC, etc. 
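<\/p>\n\n\n\n<p>As a reminder of what a metric such as accuracy actually computes, here is a plain-NumPy sketch with toy values (not the Keras metric object itself):<\/p>

```python
import numpy as np

# Accuracy: the fraction of samples whose highest-probability class
# matches the true label.
y_true = np.array([0, 1, 2, 1])
probs = np.array([[0.9, 0.05, 0.05],
                  [0.1, 0.8,  0.1],
                  [0.2, 0.2,  0.6],
                  [0.7, 0.2,  0.1]])  # last prediction is wrong
accuracy = float(np.mean(np.argmax(probs, axis=1) == y_true))
print(accuracy)  # 0.75
```

<p>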
In addition, you want to monitor these metrics not only on the training data, but also on a validation set.<\/p>\n\n\n\n<p><strong>Monitoring metrics<\/strong><\/p>\n\n\n\n<p>You can pass a list of metric objects to compile(), like this:<br><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>model.compile(\n    optimizer=\"adam\",\n    loss=\"sparse_categorical_crossentropy\",\n    metrics=&#91;keras.metrics.SparseCategoricalAccuracy(name=\"acc\")],\n)\nhistory = model.fit(dataset, epochs=1)\n<\/code><\/pre>\n\n\n\n<p><strong>Passing validation data to fit()<\/strong><\/p>\n\n\n\n<p>You can pass validation data to fit() to monitor your validation loss &amp; validation metrics. Validation metrics get reported at the end of each epoch.<br><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>val_dataset = tf.data.Dataset.from_tensor_slices((x_test, y_test)).batch(batch_size)\nhistory = model.fit(dataset, epochs=1, validation_data=val_dataset)<\/code><\/pre>\n\n\n\n<p>938\/938 [==============================] - 1s 1ms\/step - loss: 0.0556 - acc: 0.9829 - val_loss: 0.1163 - val_acc: 0.9670<\/p>\n\n\n\n<p><strong>Using callbacks for checkpointing<\/strong><\/p>\n\n\n\n<p>If training goes on for more than a few minutes, it's important to save your model at regular intervals during training. You can then use your saved models to restart training in case your training process crashes (this is important for multi-worker distributed training, since with many workers at least one of them is bound to fail at some point).<\/p>\n\n\n\n<p>An important feature of Keras is callbacks, configured in fit(). Callbacks are objects that get called by the model at different points during training, in particular:<\/p>\n\n\n\n<p>At the beginning and end of each batch<\/p>\n\n\n\n<p>At the beginning and end of each epoch<\/p>\n\n\n\n<p>Callbacks are a way to make model training entirely scriptable.<\/p>\n\n\n\n<p>You can use callbacks to periodically save your model. 
Here's a simple example: a ModelCheckpoint callback configured to save the model at the end of every epoch. The filename will include the current epoch.<br><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>callbacks = &#91;\n    keras.callbacks.ModelCheckpoint(\n        filepath='path\/to\/my\/model_{epoch}',\n        save_freq='epoch')\n]\nmodel.fit(dataset, epochs=2, callbacks=callbacks)\n<\/code><\/pre>\n\n\n\n<p>You can also use callbacks to do things like periodically changing the learning rate of your optimizer, streaming metrics to a Slack bot, sending yourself an email notification when training is complete, etc.<\/p>\n\n\n\n<p>For a detailed overview of what callbacks are available and how to write your own, see the callbacks API documentation and the guide to writing custom callbacks.<\/p>\n\n\n\n<p><strong>Monitoring training progress with TensorBoard<\/strong><\/p>\n\n\n\n<p>Staring at the Keras progress bar isn't the most ergonomic way to monitor how your loss and metrics are evolving over time. There's a better solution: TensorBoard, a web application that can display real-time graphs of your metrics (and more).<\/p>\n\n\n\n<p>To use TensorBoard with fit(), simply pass a keras.callbacks.TensorBoard callback specifying the directory where to store TensorBoard logs:<br><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>callbacks = &#91;\n    keras.callbacks.TensorBoard(log_dir='.\/logs')\n]\nmodel.fit(dataset, epochs=2, callbacks=callbacks)\n<\/code><\/pre>\n\n\n\n<p>You can then launch a TensorBoard instance that you can open in your browser to monitor the logs getting written to this location:<\/p>\n\n\n\n<p>tensorboard --logdir=.\/logs<\/p>\n\n\n\n<p>What's more, you can launch an in-line TensorBoard tab when training models in Jupyter \/ Colab notebooks. 
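<\/p>\n\n\n\n<p>One common pattern (an assumption here, not shown in the original) is to give each run its own timestamped log directory, so TensorBoard can display runs side by side without overwriting logs:<\/p>

```python
import os
from datetime import datetime

def make_run_logdir(root="logs"):
    # e.g. logs/run_20200916-161200 -- one subdirectory per training run
    run_id = datetime.now().strftime("run_%Y%m%d-%H%M%S")
    return os.path.join(root, run_id)

logdir = make_run_logdir()
# Hypothetical usage with the callback described above:
# callbacks = [keras.callbacks.TensorBoard(log_dir=logdir)]
print(logdir)
```

<p>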
See the TensorBoard documentation for more information.<\/p>\n\n\n\n<p><strong>After fit(): evaluating test performance &amp; generating predictions on new data<\/strong><\/p>\n\n\n\n<p>Once you have a trained model, you can evaluate its loss and metrics on new data via evaluate():<br><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>loss, acc = model.evaluate(val_dataset)  # returns loss and metrics\nprint(\"loss: %.2f\" % loss)\nprint(\"acc: %.2f\" % acc)\n<\/code><\/pre>\n\n\n\n<p>157\/157 [==============================] - 0s 907us\/step - loss: 0.1163 - acc: 0.9670<\/p>\n\n\n\n<p>loss: 0.12<\/p>\n\n\n\n<p>acc: 0.97<\/p>\n\n\n\n<p>We can also generate NumPy arrays of predictions (the activations of the output layer(s) of the model) via predict():<br><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>predictions = model.predict(val_dataset)\nprint(predictions.shape)\n<\/code><\/pre>\n\n\n\n<p>(10000, 10)<\/p>\n\n\n\n<p><strong>Using fit() with a custom training step<\/strong><\/p>\n\n\n\n<p>By default, fit() is configured for supervised learning. If you need a different kind of training loop (for instance, a GAN training loop), you can provide your own implementation of the Model.train_step() method. This is the method that is repeatedly called during fit().<\/p>\n\n\n\n<p>Metrics, callbacks, etc. will work as usual.<\/p>\n\n\n\n<p>Here's a simple example that reimplements what fit() normally does:<br><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>class CustomModel(keras.Model):\n    def train_step(self, data):\n        # Unpack the data. 
Its structure depends on your model and\n        # on what you pass to `fit()`.\n        x, y = data\n        with tf.GradientTape() as tape:\n            y_pred = self(x, training=True)  # Forward pass\n            # Compute the loss value\n            # (the loss function is configured in `compile()`)\n            loss = self.compiled_loss(y, y_pred,\n                                      regularization_losses=self.losses)\n        # Compute gradients\n        trainable_vars = self.trainable_variables\n        gradients = tape.gradient(loss, trainable_vars)\n        # Update weights\n        self.optimizer.apply_gradients(zip(gradients, trainable_vars))\n        # Update metrics (includes the metric that tracks the loss)\n        self.compiled_metrics.update_state(y, y_pred)\n        # Return a dict mapping metric names to current value\n        return {m.name: m.result() for m in self.metrics}\n \n# Construct and compile an instance of CustomModel\ninputs = keras.Input(shape=(32,))\noutputs = keras.layers.Dense(1)(inputs)\nmodel = CustomModel(inputs, outputs)\nmodel.compile(optimizer='adam', loss='mse', metrics=&#91;...])\n \n# Just use `fit` as usual\nmodel.fit(dataset, epochs=3, callbacks=...)\n<\/code><\/pre>\n\n\n\n<p>Code can be written in Keras using two methodologies:<\/p>\n\n\n\n<p>1.&nbsp; \tSequential API<\/p>\n\n\n\n<p>2.&nbsp; \tFunctional API<\/p>\n\n\n\n<p>We will cover the Sequential API in this tutorial.<\/p>\n\n\n\n<p><strong>Sequential API<\/strong><\/p>\n\n\n\n<p>A sequential model is appropriate for a plain stack of layers in which each layer has exactly one input tensor and one output tensor.<\/p>\n\n\n\n<p>A sequential model is not appropriate when:<\/p>\n\n\n\n<p>You are designing a network with multiple inputs or multiple outputs<\/p>\n\n\n\n<p>Any one of the layers has multiple inputs or multiple outputs<\/p>\n\n\n\n<p>Sharing of layers is required<\/p>\n\n\n\n<p>You have a nonlinear network topology in place<\/p>\n\n\n\n<p><strong>Sequential model creation<\/strong><\/p>\n\n\n\n<p>We can create a sequential model by first 
importing the Sequential class from keras.models and then passing a list of layers to the Sequential constructor.<\/p>\n\n\n\n<p>Let's train an image classification model on the Fashion-MNIST dataset:<\/p>\n\n\n\n<p><strong>Sample code in TensorFlow<\/strong><br><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code># TensorFlow and tf.keras\nimport tensorflow as tf\nfrom tensorflow import keras\n \n# Helper libraries\nimport numpy as np\nimport matplotlib.pyplot as plt\n \nfashion_mnist = keras.datasets.fashion_mnist\n \n(train_images, train_labels), (test_images, test_labels) = fashion_mnist.load_data()\nclass_names = &#91;'T-shirt\/top', 'Trouser', 'Pullover', 'Dress', 'Coat',\n               'Sandal', 'Shirt', 'Sneaker', 'Bag', 'Ankle boot']\nplt.figure()\nplt.imshow(train_images&#91;0])\nplt.colorbar()\nplt.grid(False)\nplt.show()\n \ntrain_images = train_images \/ 255.0\n \ntest_images = test_images \/ 255.0\n \nplt.figure(figsize=(10,10))\nfor i in range(25):\n    plt.subplot(5,5,i+1)\n    plt.xticks(&#91;])\n    plt.yticks(&#91;])\n    plt.grid(False)\n    plt.imshow(train_images&#91;i], cmap=plt.cm.binary)\n    plt.xlabel(class_names&#91;train_labels&#91;i]])\nplt.show()\n \nmodel = keras.Sequential(&#91;\n    keras.layers.Flatten(input_shape=(28, 28)),\n    keras.layers.Dense(128, activation='relu'),\n    keras.layers.Dense(10)\n])\n \n \nmodel.compile(optimizer='adam',\n              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),\n              metrics=&#91;'accuracy'])\n \nmodel.fit(train_images, train_labels, epochs=10)\n \ntest_loss, test_acc = model.evaluate(test_images,  test_labels, verbose=2)\n \nprint('\\nTest accuracy:', test_acc)\n \nprobability_model = tf.keras.Sequential(&#91;model, \n                                         tf.keras.layers.Softmax()])\n \npredictions = probability_model.predict(test_images)\n 
\nnp.argmax(predictions&#91;0])\n \ndef plot_image(i, predictions_array, true_label, img):\n  predictions_array, true_label, img = predictions_array, true_label&#91;i], img&#91;i]\n  plt.grid(False)\n  plt.xticks(&#91;])\n  plt.yticks(&#91;])\n \n  plt.imshow(img, cmap=plt.cm.binary)\n \n  predicted_label = np.argmax(predictions_array)\n  if predicted_label == true_label:\n    color = 'blue'\n  else:\n    color = 'red'\n \n  plt.xlabel(\"{} {:2.0f}% ({})\".format(class_names&#91;predicted_label],\n                                100*np.max(predictions_array),\n                                class_names&#91;true_label]),\n                                color=color)\n \ndef plot_value_array(i, predictions_array, true_label):\n  predictions_array, true_label = predictions_array, true_label&#91;i]\n  plt.grid(False)\n  plt.xticks(range(10))\n  plt.yticks(&#91;])\n  thisplot = plt.bar(range(10), predictions_array, color=\"#777777\")\n  plt.ylim(&#91;0, 1])\n  predicted_label = np.argmax(predictions_array)\n \n  thisplot&#91;predicted_label].set_color('red')\n  thisplot&#91;true_label].set_color('blue')\n \n# Plot the first X test images, their predicted labels, and the true labels.\n# Color correct predictions in blue and incorrect predictions in red.\nnum_rows = 5\nnum_cols = 3\nnum_images = num_rows*num_cols\nplt.figure(figsize=(2*2*num_cols, 2*num_rows))\nfor i in range(num_images):\n  plt.subplot(num_rows, 2*num_cols, 2*i+1)\n  plot_image(i, predictions&#91;i], test_labels, test_images)\n  plt.subplot(num_rows, 2*num_cols, 2*i+2)\n  plot_value_array(i, predictions&#91;i], test_labels)\nplt.tight_layout()\nplt.show()\n<\/code><\/pre>\n\n\n\n<p>Sample train image values (output truncated): each image is a 28x28 array of floats; after dividing by 255 the values lie in [0, 1], and the border pixels shown here are all 0.<\/p>\n\n\n\n<p>Sample test image values (output truncated): same format as the training images. The arrays print as rows such as<\/p>\n\n\n\n<p>&nbsp;&nbsp;[0. 0. 0. ... 0. 0. 
0.]]]<\/p>\n\n\n\n<p>Model: \"sequential_12\"<\/p>\n\n\n\n<p>_________________________________________________________________<\/p>\n\n\n\n<p>Layer (type) &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; \tOutput Shape&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; \tParam #&nbsp;&nbsp;<\/p>\n\n\n\n<p>=================================================================<\/p>\n\n\n\n<p>flatten_6 (Flatten)&nbsp; &nbsp; &nbsp; \t(None, 784) &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; \t0 &nbsp; &nbsp; \t<\/p>\n\n\n\n<p>_________________________________________________________________<\/p>\n\n\n\n<p>dense_15 (Dense) &nbsp; &nbsp; &nbsp; &nbsp; \t(None, 128) &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; \t100480\t<\/p>\n\n\n\n<p>_________________________________________________________________<\/p>\n\n\n\n<p>dense_16 (Dense) &nbsp; &nbsp; &nbsp; &nbsp; \t(None, 10)&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; \t1290&nbsp; \t<\/p>\n\n\n\n<p>=================================================================<\/p>\n\n\n\n<p>Total params: 101,770<\/p>\n\n\n\n<p>Trainable params: 101,770<\/p>\n\n\n\n<p>Non-trainable params: 0<\/p>\n\n\n\n<p>_________________________________________________________________<\/p>\n\n\n\n<p>None<\/p>\n\n\n\n<p>Epoch 1\/20<\/p>\n\n\n\n<p>1875\/1875 [==============================] - 2s 1ms\/step - loss: 0.4999 - accuracy: 0.8244<\/p>\n\n\n\n<p>Epoch 2\/20<\/p>\n\n\n\n<p>1875\/1875 [==============================] - 2s 1ms\/step - loss: 0.3753 - accuracy: 0.8644<\/p>\n\n\n\n<p>Epoch 3\/20<\/p>\n\n\n\n<p>1875\/1875 [==============================] - 2s 1ms\/step - loss: 0.3372 - accuracy: 0.8773<\/p>\n\n\n\n<p>Epoch 4\/20<\/p>\n\n\n\n<p>1875\/1875 [==============================] - 2s 1ms\/step - loss: 0.3125 - accuracy: 0.8851<\/p>\n\n\n\n<p>Epoch 5\/20<\/p>\n\n\n\n<p>1875\/1875 [==============================] - 3s 1ms\/step - loss: 0.2941 - accuracy: 0.8925<\/p>\n\n\n\n<p>Epoch 6\/20<\/p>\n\n\n\n<p>1875\/1875 [==============================] - 2s 1ms\/step - loss: 0.2799 - accuracy: 
0.8958<\/p>\n\n\n\n<p>Epoch 7\/20<\/p>\n\n\n\n<p>1875\/1875 [==============================] - 2s 1ms\/step - loss: 0.2683 - accuracy: 0.9004<\/p>\n\n\n\n<p>Epoch 8\/20<\/p>\n\n\n\n<p>1875\/1875 [==============================] - 2s 1ms\/step - loss: 0.2570 - accuracy: 0.9038<\/p>\n\n\n\n<p>Epoch 9\/20<\/p>\n\n\n\n<p>1875\/1875 [==============================] - 2s 1ms\/step - loss: 0.2466 - accuracy: 0.9088<\/p>\n\n\n\n<p>Epoch 10\/20<\/p>\n\n\n\n<p>1875\/1875 [==============================] - 3s 1ms\/step - loss: 0.2389 - accuracy: 0.9104<\/p>\n\n\n\n<p>Epoch 11\/20<\/p>\n\n\n\n<p>1875\/1875 [==============================] - 3s 1ms\/step - loss: 0.2319 - accuracy: 0.9137<\/p>\n\n\n\n<p>Epoch 12\/20<\/p>\n\n\n\n<p>1875\/1875 [==============================] - 3s 1ms\/step - loss: 0.2230 - accuracy: 0.9172<\/p>\n\n\n\n<p>Epoch 13\/20<\/p>\n\n\n\n<p>1875\/1875 [==============================] - 3s 1ms\/step - loss: 0.2168 - accuracy: 0.9195<\/p>\n\n\n\n<p>Epoch 14\/20<\/p>\n\n\n\n<p>1875\/1875 [==============================] - 3s 1ms\/step - loss: 0.2109 - accuracy: 0.9209<\/p>\n\n\n\n<p>Epoch 15\/20<\/p>\n\n\n\n<p>1875\/1875 [==============================] - 2s 1ms\/step - loss: 0.2055 - accuracy: 0.9234<\/p>\n\n\n\n<p>Epoch 16\/20<\/p>\n\n\n\n<p>1875\/1875 [==============================] - 2s 1ms\/step - loss: 0.1993 - accuracy: 0.9249<\/p>\n\n\n\n<p>Epoch 17\/20<\/p>\n\n\n\n<p>1875\/1875 [==============================] - 2s 1ms\/step - loss: 0.1939 - accuracy: 0.9272<\/p>\n\n\n\n<p>Epoch 18\/20<\/p>\n\n\n\n<p>1875\/1875 [==============================] - 2s 1ms\/step - loss: 0.1908 - accuracy: 0.9287<\/p>\n\n\n\n<p>Epoch 19\/20<\/p>\n\n\n\n<p>1875\/1875 [==============================] - 2s 1ms\/step - loss: 0.1867 - accuracy: 0.9304<\/p>\n\n\n\n<p>Epoch 20\/20<\/p>\n\n\n\n<p>1875\/1875 [==============================] - 2s 1ms\/step - loss: 0.1811 - accuracy: 0.9325<\/p>\n\n\n\n<p>313\/313 - 0s - loss: 0.3704 - accuracy: 0.8847<\/p>\n\n\n\n<p>Test 
accuracy: 0.8847000002861023<\/p>\n\n\n\n<p>(Two 28x28 arrays of pixel values for sample images follow in the original output; they are truncated here for brevity. After normalization every value lies in [0, 1] - for example, 0.00392157 is 1\/255.)<\/p>\n\n\n\n<p>&nbsp;&nbsp;0. &nbsp; &nbsp; \t0. 
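0.&nbsp; &nbsp; \t]<\/p>\n\n\n\n<p>The model summary, training log and test accuracy shown above could have been produced by a short script along the following lines. This is only a sketch: the excerpt does not include the original code, so the dataset (Fashion-MNIST is assumed from the 28x28 inputs, the 60,000\/10,000 train\/test split and the final accuracy) and the default batch size of 32 (60,000 \/ 32 = 1875 steps per epoch) are assumptions.<\/p>

```python
import tensorflow as tf

# Assumed dataset: Fashion-MNIST (28x28 grayscale images, 10 classes).
(train_x, train_y), (test_x, test_y) = tf.keras.datasets.fashion_mnist.load_data()

# Normalize pixel values to [0, 1], as in the arrays printed above.
train_x, test_x = train_x / 255.0, test_x / 255.0

# Flatten(28x28 -> 784) -> Dense(128) -> Dense(10), matching the summary:
# 784*128 + 128 = 100480 and 128*10 + 10 = 1290 parameters.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
print(model.summary())  # model.summary() prints the table and returns None

model.fit(train_x, train_y, epochs=20)  # default batch_size=32 -> 1875 steps/epoch
test_loss, test_acc = model.evaluate(test_x, test_y, verbose=2)
print('Test accuracy:', test_acc)
```

<p>With these assumptions the parameter count (101,770) matches the summary exactly; the exact loss and accuracy values will vary from run to run.<\/p>\n\n\n\n<p>&nbsp;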
&nbsp; &nbsp; \t0. &nbsp; &nbsp; \t0. &nbsp; &nbsp; \t0.<\/p>\n\n\n\n<p>&nbsp;&nbsp;0. &nbsp; &nbsp; &nbsp; &nbsp; 0. &nbsp; &nbsp; \t0. &nbsp; &nbsp; \t0. &nbsp; &nbsp; \t0. &nbsp; &nbsp; \t0.<\/p>\n\n\n\n<p>&nbsp;&nbsp;0. &nbsp; &nbsp; &nbsp; &nbsp; 0. &nbsp; \t&nbsp; 0. &nbsp; &nbsp; &nbsp; &nbsp; 0. &nbsp; &nbsp; \t0. &nbsp; &nbsp; \t0.<\/p>\n\n\n\n<p>&nbsp;&nbsp;0. &nbsp; &nbsp; &nbsp; &nbsp; 0. &nbsp; &nbsp; \t0. &nbsp; &nbsp; \t0.&nbsp; &nbsp; \t]]<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Keras Around a year back,Keras was integrated to TensorFlow 2.0, which succeeded TensorFlow 1.0. Now Keras is a part of TensorFlow. The writer of Keras is Francois Chollette. Data loading and preprocessing Neural networks don't process raw data, encoded JPEG image files, or CSV files. They handle vectorized and standardized representations. Text files require to [&hellip;]<\/p>\n","protected":false},"author":41,"featured_media":17718,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"_uag_custom_page_level_css":"","site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","ast-disable-related-posts":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"set","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"categories":[2],"tags":[],"content_type":[],"class_list":["post-18572","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-artificial-intelligence"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v27.3 (Yoast SEO v27.3) - https:\/\/yoast.com\/product\/yoast-seo-premium-wordpress\/ -->\n<title>Keras Tutorial | An Introduction for Beginners<\/title>\n<meta name=\"description\" content=\"Keras Tutorial for Beginners: Around a year back,Keras was integrated to TensorFlow 2.0, which succeeded TensorFlow 1.0. Now Keras is a part of TensorFlow.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.mygreatlearning.com\/blog\/keras-tutorial\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Keras Tutorial - Deep Learning Framework\" \/>\n<meta property=\"og:description\" content=\"Keras Tutorial for Beginners: Around a year back,Keras was integrated to TensorFlow 2.0, which succeeded TensorFlow 1.0. 
Now Keras is a part of TensorFlow.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.mygreatlearning.com\/blog\/keras-tutorial\/\" \/>\n<meta property=\"og:site_name\" content=\"Great Learning Blog: Free Resources what Matters to shape your Career!\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/GreatLearningOfficial\/\" \/>\n<meta property=\"article:published_time\" content=\"2020-09-16T10:42:00+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2024-09-03T12:30:08+00:00\" \/>\n<meta property=\"og:image\" content=\"http:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2020\/07\/featured-image-Keras-Tutorial.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1000\" \/>\n\t<meta property=\"og:image:height\" content=\"700\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Great Learning Editorial Team\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@https:\/\/twitter.com\/Great_Learning\" \/>\n<meta name=\"twitter:site\" content=\"@Great_Learning\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Great Learning Editorial Team\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"24 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/keras-tutorial\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/keras-tutorial\\\/\"},\"author\":{\"name\":\"Great Learning Editorial Team\",\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/#\\\/schema\\\/person\\\/6f993d1be4c584a335951e836f2656ad\"},\"headline\":\"Keras Tutorial - Deep Learning Framework\",\"datePublished\":\"2020-09-16T10:42:00+00:00\",\"dateModified\":\"2024-09-03T12:30:08+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/keras-tutorial\\\/\"},\"wordCount\":6710,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/keras-tutorial\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/wp-content\\\/uploads\\\/2020\\\/07\\\/featured-image-Keras-Tutorial.jpg\",\"articleSection\":[\"AI and Machine Learning\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/keras-tutorial\\\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/keras-tutorial\\\/\",\"url\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/keras-tutorial\\\/\",\"name\":\"Keras Tutorial | An Introduction for 
Beginners\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/keras-tutorial\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/keras-tutorial\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/wp-content\\\/uploads\\\/2020\\\/07\\\/featured-image-Keras-Tutorial.jpg\",\"datePublished\":\"2020-09-16T10:42:00+00:00\",\"dateModified\":\"2024-09-03T12:30:08+00:00\",\"description\":\"Keras Tutorial for Beginners: Around a year back,Keras was integrated to TensorFlow 2.0, which succeeded TensorFlow 1.0. Now Keras is a part of TensorFlow.\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/keras-tutorial\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/keras-tutorial\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/keras-tutorial\\\/#primaryimage\",\"url\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/wp-content\\\/uploads\\\/2020\\\/07\\\/featured-image-Keras-Tutorial.jpg\",\"contentUrl\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/wp-content\\\/uploads\\\/2020\\\/07\\\/featured-image-Keras-Tutorial.jpg\",\"width\":1000,\"height\":700},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/keras-tutorial\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Blog\",\"item\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"AI and Machine Learning\",\"item\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/artificial-intelligence\\\/\"},{\"@type\":\"ListItem\",\"position\":3,\"name\":\"Keras Tutorial &#8211; Deep Learning 
Framework\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/#website\",\"url\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/\",\"name\":\"Great Learning Blog\",\"description\":\"Learn, Upskill &amp; Career Development Guide and Resources\",\"publisher\":{\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/#organization\"},\"alternateName\":\"Great Learning\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/#organization\",\"name\":\"Great Learning\",\"url\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/wp-content\\\/uploads\\\/2022\\\/06\\\/GL-Logo.jpg\",\"contentUrl\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/wp-content\\\/uploads\\\/2022\\\/06\\\/GL-Logo.jpg\",\"width\":900,\"height\":900,\"caption\":\"Great Learning\"},\"image\":{\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/www.facebook.com\\\/GreatLearningOfficial\\\/\",\"https:\\\/\\\/x.com\\\/Great_Learning\",\"https:\\\/\\\/www.instagram.com\\\/greatlearningofficial\\\/\",\"https:\\\/\\\/www.linkedin.com\\\/school\\\/great-learning\\\/\",\"https:\\\/\\\/in.pinterest.com\\\/greatlearning12\\\/\",\"https:\\\/\\\/www.youtube.com\\\/user\\\/beaconelearning\\\/\"],\"description\":\"Great Learning is a leading global ed-tech company for professional training and higher education. 
It offers comprehensive, industry-relevant, hands-on learning programs across various business, technology, and interdisciplinary domains driving the digital economy. These programs are developed and offered in collaboration with the world's foremost academic institutions.\",\"email\":\"info@mygreatlearning.com\",\"legalName\":\"Great Learning Education Services Pvt. Ltd\",\"foundingDate\":\"2013-11-29\",\"numberOfEmployees\":{\"@type\":\"QuantitativeValue\",\"minValue\":\"1001\",\"maxValue\":\"5000\"}},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/#\\\/schema\\\/person\\\/6f993d1be4c584a335951e836f2656ad\",\"name\":\"Great Learning Editorial Team\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/wp-content\\\/uploads\\\/2022\\\/02\\\/unnamed.webp\",\"url\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/wp-content\\\/uploads\\\/2022\\\/02\\\/unnamed.webp\",\"contentUrl\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/wp-content\\\/uploads\\\/2022\\\/02\\\/unnamed.webp\",\"caption\":\"Great Learning Editorial Team\"},\"description\":\"The Great Learning Editorial Staff includes a dynamic team of subject matter experts, instructors, and education professionals who combine their deep industry knowledge with innovative teaching methods. 
Their mission is to provide learners with the skills and insights needed to excel in their careers, whether through upskilling, reskilling, or transitioning into new fields.\",\"sameAs\":[\"https:\\\/\\\/www.mygreatlearning.com\\\/\",\"https:\\\/\\\/in.linkedin.com\\\/school\\\/great-learning\\\/\",\"https:\\\/\\\/x.com\\\/https:\\\/\\\/twitter.com\\\/Great_Learning\",\"https:\\\/\\\/www.youtube.com\\\/channel\\\/UCObs0kLIrDjX2LLSybqNaEA\"],\"award\":[\"Best EdTech Company of the Year 2024\",\"Education Economictimes Outstanding Education\\\/Edtech Solution Provider of the Year 2024\",\"Leading E-learning Platform 2024\"],\"url\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/author\\\/greatlearning\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO Premium plugin. -->","yoast_head_json":{"title":"Keras Tutorial | An Introduction for Beginners","description":"Keras Tutorial for Beginners: Around a year back,Keras was integrated to TensorFlow 2.0, which succeeded TensorFlow 1.0. Now Keras is a part of TensorFlow.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.mygreatlearning.com\/blog\/keras-tutorial\/","og_locale":"en_US","og_type":"article","og_title":"Keras Tutorial - Deep Learning Framework","og_description":"Keras Tutorial for Beginners: Around a year back,Keras was integrated to TensorFlow 2.0, which succeeded TensorFlow 1.0. 
Now Keras is a part of TensorFlow.","og_url":"https:\/\/www.mygreatlearning.com\/blog\/keras-tutorial\/","og_site_name":"Great Learning Blog: Free Resources what Matters to shape your Career!","article_publisher":"https:\/\/www.facebook.com\/GreatLearningOfficial\/","article_published_time":"2020-09-16T10:42:00+00:00","article_modified_time":"2024-09-03T12:30:08+00:00","og_image":[{"width":1000,"height":700,"url":"http:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2020\/07\/featured-image-Keras-Tutorial.jpg","type":"image\/jpeg"}],"author":"Great Learning Editorial Team","twitter_card":"summary_large_image","twitter_creator":"@https:\/\/twitter.com\/Great_Learning","twitter_site":"@Great_Learning","twitter_misc":{"Written by":"Great Learning Editorial Team","Est. reading time":"24 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.mygreatlearning.com\/blog\/keras-tutorial\/#article","isPartOf":{"@id":"https:\/\/www.mygreatlearning.com\/blog\/keras-tutorial\/"},"author":{"name":"Great Learning Editorial Team","@id":"https:\/\/www.mygreatlearning.com\/blog\/#\/schema\/person\/6f993d1be4c584a335951e836f2656ad"},"headline":"Keras Tutorial - Deep Learning Framework","datePublished":"2020-09-16T10:42:00+00:00","dateModified":"2024-09-03T12:30:08+00:00","mainEntityOfPage":{"@id":"https:\/\/www.mygreatlearning.com\/blog\/keras-tutorial\/"},"wordCount":6710,"commentCount":0,"publisher":{"@id":"https:\/\/www.mygreatlearning.com\/blog\/#organization"},"image":{"@id":"https:\/\/www.mygreatlearning.com\/blog\/keras-tutorial\/#primaryimage"},"thumbnailUrl":"https:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2020\/07\/featured-image-Keras-Tutorial.jpg","articleSection":["AI and Machine 
Learning"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/www.mygreatlearning.com\/blog\/keras-tutorial\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/www.mygreatlearning.com\/blog\/keras-tutorial\/","url":"https:\/\/www.mygreatlearning.com\/blog\/keras-tutorial\/","name":"Keras Tutorial | An Introduction for Beginners","isPartOf":{"@id":"https:\/\/www.mygreatlearning.com\/blog\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.mygreatlearning.com\/blog\/keras-tutorial\/#primaryimage"},"image":{"@id":"https:\/\/www.mygreatlearning.com\/blog\/keras-tutorial\/#primaryimage"},"thumbnailUrl":"https:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2020\/07\/featured-image-Keras-Tutorial.jpg","datePublished":"2020-09-16T10:42:00+00:00","dateModified":"2024-09-03T12:30:08+00:00","description":"Keras Tutorial for Beginners: Around a year back,Keras was integrated to TensorFlow 2.0, which succeeded TensorFlow 1.0. Now Keras is a part of TensorFlow.","breadcrumb":{"@id":"https:\/\/www.mygreatlearning.com\/blog\/keras-tutorial\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.mygreatlearning.com\/blog\/keras-tutorial\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.mygreatlearning.com\/blog\/keras-tutorial\/#primaryimage","url":"https:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2020\/07\/featured-image-Keras-Tutorial.jpg","contentUrl":"https:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2020\/07\/featured-image-Keras-Tutorial.jpg","width":1000,"height":700},{"@type":"BreadcrumbList","@id":"https:\/\/www.mygreatlearning.com\/blog\/keras-tutorial\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Blog","item":"https:\/\/www.mygreatlearning.com\/blog\/"},{"@type":"ListItem","position":2,"name":"AI and Machine 
Learning","item":"https:\/\/www.mygreatlearning.com\/blog\/artificial-intelligence\/"},{"@type":"ListItem","position":3,"name":"Keras Tutorial &#8211; Deep Learning Framework"}]},{"@type":"WebSite","@id":"https:\/\/www.mygreatlearning.com\/blog\/#website","url":"https:\/\/www.mygreatlearning.com\/blog\/","name":"Great Learning Blog","description":"Learn, Upskill &amp; Career Development Guide and Resources","publisher":{"@id":"https:\/\/www.mygreatlearning.com\/blog\/#organization"},"alternateName":"Great Learning","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.mygreatlearning.com\/blog\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.mygreatlearning.com\/blog\/#organization","name":"Great Learning","url":"https:\/\/www.mygreatlearning.com\/blog\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.mygreatlearning.com\/blog\/#\/schema\/logo\/image\/","url":"https:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2022\/06\/GL-Logo.jpg","contentUrl":"https:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2022\/06\/GL-Logo.jpg","width":900,"height":900,"caption":"Great Learning"},"image":{"@id":"https:\/\/www.mygreatlearning.com\/blog\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/GreatLearningOfficial\/","https:\/\/x.com\/Great_Learning","https:\/\/www.instagram.com\/greatlearningofficial\/","https:\/\/www.linkedin.com\/school\/great-learning\/","https:\/\/in.pinterest.com\/greatlearning12\/","https:\/\/www.youtube.com\/user\/beaconelearning\/"],"description":"Great Learning is a leading global ed-tech company for professional training and higher education. 
It offers comprehensive, industry-relevant, hands-on learning programs across various business, technology, and interdisciplinary domains driving the digital economy. These programs are developed and offered in collaboration with the world's foremost academic institutions.","email":"info@mygreatlearning.com","legalName":"Great Learning Education Services Pvt. Ltd","foundingDate":"2013-11-29","numberOfEmployees":{"@type":"QuantitativeValue","minValue":"1001","maxValue":"5000"}},{"@type":"Person","@id":"https:\/\/www.mygreatlearning.com\/blog\/#\/schema\/person\/6f993d1be4c584a335951e836f2656ad","name":"Great Learning Editorial Team","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2022\/02\/unnamed.webp","url":"https:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2022\/02\/unnamed.webp","contentUrl":"https:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2022\/02\/unnamed.webp","caption":"Great Learning Editorial Team"},"description":"The Great Learning Editorial Staff includes a dynamic team of subject matter experts, instructors, and education professionals who combine their deep industry knowledge with innovative teaching methods. 
Their mission is to provide learners with the skills and insights needed to excel in their careers, whether through upskilling, reskilling, or transitioning into new fields.","sameAs":["https:\/\/www.mygreatlearning.com\/","https:\/\/in.linkedin.com\/school\/great-learning\/","https:\/\/x.com\/https:\/\/twitter.com\/Great_Learning","https:\/\/www.youtube.com\/channel\/UCObs0kLIrDjX2LLSybqNaEA"],"award":["Best EdTech Company of the Year 2024","Education Economictimes Outstanding Education\/Edtech Solution Provider of the Year 2024","Leading E-learning Platform 2024"],"url":"https:\/\/www.mygreatlearning.com\/blog\/author\/greatlearning\/"}]}},"uagb_featured_image_src":{"full":["https:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2020\/07\/featured-image-Keras-Tutorial.jpg",1000,700,false],"thumbnail":["https:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2020\/07\/featured-image-Keras-Tutorial-150x150.jpg",150,150,true],"medium":["https:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2020\/07\/featured-image-Keras-Tutorial-300x210.jpg",300,210,true],"medium_large":["https:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2020\/07\/featured-image-Keras-Tutorial-768x538.jpg",768,538,true],"large":["https:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2020\/07\/featured-image-Keras-Tutorial.jpg",1000,700,false],"1536x1536":["https:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2020\/07\/featured-image-Keras-Tutorial.jpg",1000,700,false],"2048x2048":["https:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2020\/07\/featured-image-Keras-Tutorial.jpg",1000,700,false],"web-stories-poster-portrait":["https:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2020\/07\/featured-image-Keras-Tutorial.jpg",640,448,false],"web-stories-publisher-logo":["https:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2020\/07\/featured-image-Keras-Tutorial.jpg",96,67,false],"web-stories-thumbnail":["https:\/\/www.mygreatlea
rning.com\/blog\/wp-content\/uploads\/2020\/07\/featured-image-Keras-Tutorial.jpg",150,105,false]},"uagb_author_info":{"display_name":"Great Learning Editorial Team","author_link":"https:\/\/www.mygreatlearning.com\/blog\/author\/greatlearning\/"},"uagb_comment_info":0,"uagb_excerpt":"Keras Around a year back,Keras was integrated to TensorFlow 2.0, which succeeded TensorFlow 1.0. Now Keras is a part of TensorFlow. The writer of Keras is Francois Chollette. Data loading and preprocessing Neural networks don't process raw data, encoded JPEG image files, or CSV files. They handle vectorized and standardized representations. Text files require to&hellip;","_links":{"self":[{"href":"https:\/\/www.mygreatlearning.com\/blog\/wp-json\/wp\/v2\/posts\/18572","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.mygreatlearning.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.mygreatlearning.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.mygreatlearning.com\/blog\/wp-json\/wp\/v2\/users\/41"}],"replies":[{"embeddable":true,"href":"https:\/\/www.mygreatlearning.com\/blog\/wp-json\/wp\/v2\/comments?post=18572"}],"version-history":[{"count":15,"href":"https:\/\/www.mygreatlearning.com\/blog\/wp-json\/wp\/v2\/posts\/18572\/revisions"}],"predecessor-version":[{"id":104597,"href":"https:\/\/www.mygreatlearning.com\/blog\/wp-json\/wp\/v2\/posts\/18572\/revisions\/104597"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.mygreatlearning.com\/blog\/wp-json\/wp\/v2\/media\/17718"}],"wp:attachment":[{"href":"https:\/\/www.mygreatlearning.com\/blog\/wp-json\/wp\/v2\/media?parent=18572"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.mygreatlearning.com\/blog\/wp-json\/wp\/v2\/categories?post=18572"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.mygreatlearning.com\/blog\/wp-json\/wp\/v2\/tags?post=18572"},{"taxonomy":"content_type","embeddable":true
,"href":"https:\/\/www.mygreatlearning.com\/blog\/wp-json\/wp\/v2\/content_type?post=18572"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}