Browse Source

Added References

bharatk-parallel 4 years ago
parent
commit
9611fee56c

+ 11 - 1
hpc_ai/ai_science_cfd/English/python/jupyter_notebook/Intro_to_DL/CNN's.ipynb

@@ -53,6 +53,8 @@
     "\n",
     "![alt_text](images/cnn.jpeg)\n",
     "\n",
+    "*Source: https://fr.mathworks.com/solutions/deep-learning/convolutional-neural-network.html*\n",
+    "\n",
     "Each input image passes through a series of convolution layers with filters (Kernels), pooling layers and fully connected layers (FC), and a Softmax function is applied to classify the object with probabilistic values between 0 and 1. \n",
     "\n",
     "Let us discuss the following in detail : \n",
@@ -70,10 +72,14 @@
     "\n",
     "![alt_text](images/conv.gif)\n",
     "\n",
+    "*Source: https://towardsdatascience.com/a-comprehensive-guide-to-convolutional-neural-networks-the-eli5-way-3bd2b1164a53*\n",
+    "\n",
     "We have seen how the convolution operation works, and now let us see how the convolution operation is carried out with multiple layers.\n",
     "\n",
     "![alt_text](images/conv_depth.png)\n",
     "\n",
+    "*Source: https://towardsdatascience.com/a-comprehensive-introduction-to-different-types-of-convolutions-in-deep-learning-669281e58215*\n",
+    "\n",
     "Let us define the terms :\n",
     "\n",
     "- Hin : Height dimension of the layer\n",
@@ -113,6 +119,8 @@
     "\n",
     "![alt_text](images/max_pool.png)\n",
     "\n",
+    "*Source: https://www.programmersought.com/article/47163598855/*\n",
+    "\n",
     "#### Fully Connected Layer :\n",
     "\n",
     "We will then flatten the output from the convolutions layers and feed into it a _Fully Connected layer_ to generate a prediction. The fully connected layer is an ANN Model whose inputs are the features of the Inputs obtained from the Convolutions Layers. \n",
@@ -133,6 +141,8 @@
     "<td> <img src=\"images/convtranspose_conv.gif\" alt=\"Drawing\" style=\"width: 500px;\"/> </td>\n",
     "</tr></table>\n",
     "\n",
+    "*Source: https://towardsdatascience.com/a-comprehensive-introduction-to-different-types-of-convolutions-in-deep-learning-669281e58215*\n",
+    "\n",
     "Transposed Convolution can also be visualised as convolution of a layer with 2x2 padding, as displayed in the right gif.\n",
     "\n",
     "\n",
@@ -505,7 +515,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "## References : \n",
+    "## Acknowledgement : \n",
     "\n",
     "\n",
     "[Transposed Convolutions explained](https://medium.com/apache-mxnet/transposed-convolutions-explained-with-ms-excel-52d13030c7e8)\n",

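The notebook text quoted above walks through the Conv → Pool → Flatten → Fully Connected → Softmax pipeline and the layer-dimension terms (Hin, kernel size, stride). As an illustration only, here is a minimal tf.keras sketch of that pipeline; it is not part of this commit or the notebook, and the 28x28x1 input shape and layer sizes are illustrative assumptions.

```python
# Sketch only (not from the commit): the Conv -> Pool -> Flatten -> FC -> Softmax
# pipeline described in the notebook text above, written with tf.keras.
# The 28x28x1 input shape and the layer sizes are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    # Convolution with 3x3 kernels; with stride 1 and 'valid' padding the output
    # size follows H_out = floor((H_in + 2*P - K) / S) + 1, so 28x28 -> 26x26.
    layers.Conv2D(32, kernel_size=3, activation="relu", input_shape=(28, 28, 1)),
    # 2x2 max pooling keeps the maximum of each window and halves the spatial size.
    layers.MaxPooling2D(pool_size=2),
    # Flatten the feature maps and classify with a fully connected softmax layer.
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])
model.summary()  # prints the layer-by-layer output shapes
```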
+ 3 - 1
hpc_ai/ai_science_cfd/English/python/jupyter_notebook/Intro_to_DL/Part_2.ipynb

@@ -72,7 +72,9 @@
     "\n",
     "We will be using the **F-MNIST (Fashion MNIST)** dataset, a very popular dataset containing 70,000 grayscale images in 10 categories. The images show individual articles of clothing at low resolution (28 by 28 pixels).\n",
     "\n",
-    "<img src=\"images/fashion-mnist.png\" alt=\"Fashion MNIST sprite\"  width=\"600\">"
+    "<img src=\"images/fashion-mnist.png\" alt=\"Fashion MNIST sprite\"  width=\"600\">\n",
+    "\n",
+    "*Source: https://www.tensorflow.org/tutorials/keras/classification*"
    ]
   },
   {

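For reference, the Fashion MNIST dataset described above ships with tf.keras; a minimal loading sketch (not part of this commit) might look like the following.

```python
# Sketch only (not from the commit): loading the Fashion MNIST dataset
# mentioned above through tf.keras' built-in dataset loader.
import tensorflow as tf

(train_images, train_labels), (test_images, test_labels) = \
    tf.keras.datasets.fashion_mnist.load_data()

print(train_images.shape, test_images.shape)  # (60000, 28, 28) (10000, 28, 28)

# Scale the pixel values from [0, 255] to [0, 1] before feeding them to a network.
train_images = train_images / 255.0
test_images = test_images / 255.0
```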
+ 13 - 1
hpc_ai/ai_science_climate/English/python/jupyter_notebook/Intro_to_DL/CNN's.ipynb

@@ -48,6 +48,8 @@
     "\n",
     "![alt_text](images/cnn.jpeg)\n",
     "\n",
+    "*Source: https://fr.mathworks.com/solutions/deep-learning/convolutional-neural-network.html*\n",
+    "\n",
     "Each input image passes through a series of convolution layers with filters (Kernels), pooling layers and fully connected layers (FC), and a Softmax function is applied to classify the object with probabilistic values between 0 and 1. \n",
     "\n",
     "Let us discuss the following in detail : \n",
@@ -65,10 +67,15 @@
     "\n",
     "![alt_text](images/conv.gif)\n",
     "\n",
+    "*Source: https://towardsdatascience.com/a-comprehensive-guide-to-convolutional-neural-networks-the-eli5-way-3bd2b1164a53*\n",
+    "\n",
     "We have seen how the convolution operation works, and now let us see how the convolution operation is carried out with multiple layers.\n",
     "\n",
     "![alt_text](images/conv_depth.png)\n",
     "\n",
+    "*Source: https://towardsdatascience.com/a-comprehensive-introduction-to-different-types-of-convolutions-in-deep-learning-669281e58215*\n",
+    "\n",
+    "\n",
     "Let us define the terms :\n",
     "\n",
     "- Hin : Height dimension of the layer\n",
@@ -108,6 +115,8 @@
     "\n",
     "![alt_text](images/max_pool.png)\n",
     "\n",
+    "*Source: https://www.programmersought.com/article/47163598855/*\n",
+    "\n",
     "#### Fully Connected Layer :\n",
     "\n",
     "We will then flatten the output from the convolutions layers and feed into it a _Fully Connected layer_ to generate a prediction. The fully connected layer is an ANN Model whose inputs are the features of the Inputs obtained from the Convolutions Layers. \n",
@@ -128,6 +137,9 @@
     "<td> <img src=\"images/convtranspose_conv.gif\" alt=\"Drawing\" style=\"width: 500px;\"/> </td>\n",
     "</tr></table>\n",
     "\n",
+    "\n",
+    "*Source: https://towardsdatascience.com/a-comprehensive-introduction-to-different-types-of-convolutions-in-deep-learning-669281e58215*\n",
+    "\n",
     "Transposed Convolution can also be visualised as convolution of a layer with 2x2 padding, as displayed in the right gif.\n",
     "\n",
     "\n",
@@ -502,7 +514,7 @@
    "source": [
     "\n",
     "\n",
-    "## References : \n",
+    "## Acknowledgements : \n",
     "\n",
     "\n",
     "[Transposed Convolutions explained](https://medium.com/apache-mxnet/transposed-convolutions-explained-with-ms-excel-52d13030c7e8)\n",

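The transposed convolution discussed in the notebook text above can be tried out with Keras' Conv2DTranspose layer. The sketch below is illustrative only; the 4x4 input, kernel size and stride are assumptions, not the notebook's own example.

```python
# Sketch only (not from the commit): a transposed convolution with tf.keras,
# illustrating the upsampling behaviour described above.
import tensorflow as tf

x = tf.random.normal((1, 4, 4, 1))  # batch of one 4x4 single-channel feature map
upsample = tf.keras.layers.Conv2DTranspose(
    filters=1, kernel_size=3, strides=2, padding="same")
y = upsample(x)

# With stride 2 the 4x4 map becomes 8x8 -- the reverse of a strided convolution.
print(y.shape)  # (1, 8, 8, 1)
```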
+ 3 - 1
hpc_ai/ai_science_climate/English/python/jupyter_notebook/Intro_to_DL/Part_2.ipynb

@@ -71,7 +71,9 @@
     "\n",
     "We will be using the **F-MNIST (Fashion MNIST)** dataset, a very popular dataset containing 70,000 grayscale images in 10 categories. The images show individual articles of clothing at low resolution (28 by 28 pixels).\n",
     "\n",
-    "<img src=\"images/fashion-mnist.png\" alt=\"Fashion MNIST sprite\"  width=\"600\">"
+    "<img src=\"images/fashion-mnist.png\" alt=\"Fashion MNIST sprite\"  width=\"600\">\n",
+    "\n",
+    "*Source: https://www.tensorflow.org/tutorials/keras/classification*"
    ]
   },
   {

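The TensorFlow tutorial cited as the source above builds a small fully connected classifier on this dataset. A comparable sketch is shown below as an assumption; the exact model used in Part_2.ipynb may differ.

```python
# Sketch only (not from the commit): a simple Fashion MNIST classifier in the
# style of the cited TensorFlow tutorial; the model in Part_2.ipynb may differ.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),    # 28x28 image -> 784 vector
    tf.keras.layers.Dense(128, activation="relu"),    # hidden fully connected layer
    tf.keras.layers.Dense(10, activation="softmax"),  # one probability per class
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, epochs=5)  # with arrays from load_data()
```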
+ 2 - 0
hpc_ai/ai_science_climate/English/python/jupyter_notebook/Tropical_Cyclone_Intensity_Estimation/Approach_to_the_Problem_&_Inspecting_and_Cleaning_the_Required_Data.ipynb

@@ -175,6 +175,8 @@
     "\n",
     "<td><img src=\"images/grad.jpg\" alt=\"Drawing\" style=\"width: 420px;\"/></td>\n",
     "\n",
+    "*Source: https://towardsdatascience.com/linear-regression-using-gradient-descent-97a6c8700931*\n",
+    "\n",
     "GD runs through all the samples in the training set to do a single update for a parameter in a particular iteration. In SGD, on the other hand, you use only one sample or a subset of samples from your training set to do the update for a parameter in a particular iteration. \n",
     "\n",
     "Using SGD will be faster because only one training sample is used per update, and the model starts improving right away from the first sample.\n",