
Fixed review comments. Dockerfile and Singularity scripts

bharatk-parallel 4 years ago
parent
commit
84f382b6d9
17 changed files with 84 additions and 44 deletions
  1. 7 0
      hpc_ai/ai_science_cfd/English/python/jupyter_notebook/CFD/Part2.ipynb
  2. 7 1
      hpc_ai/ai_science_cfd/English/python/jupyter_notebook/CFD/Part3.ipynb
  3. 3 0
      hpc_ai/ai_science_cfd/English/python/jupyter_notebook/CFD/Part4.ipynb
  4. 7 1
      hpc_ai/ai_science_cfd/English/python/jupyter_notebook/Intro_to_DL/CNN's.ipynb
  5. 8 0
      hpc_ai/ai_science_cfd/English/python/jupyter_notebook/Intro_to_DL/Part_2.ipynb
  6. 5 2
      hpc_ai/ai_science_cfd/English/python/jupyter_notebook/Intro_to_DL/Resnets.ipynb
  7. 8 8
      hpc_ai/ai_science_cfd/README.MD
  8. 2 6
      hpc_ai/ai_science_cfd/Singularity
  9. 6 1
      hpc_ai/ai_science_climate/English/python/jupyter_notebook/Intro_to_DL/CNN's.ipynb
  10. 11 3
      hpc_ai/ai_science_climate/English/python/jupyter_notebook/Intro_to_DL/Part_2.ipynb
  11. 1 1
      hpc_ai/ai_science_climate/English/python/jupyter_notebook/Tropical_Cyclone_Intensity_Estimation/Competition.ipynb
  12. 4 4
      hpc_ai/ai_science_climate/English/python/jupyter_notebook/Tropical_Cyclone_Intensity_Estimation/Countering_Data_Imbalance.ipynb
  13. 2 2
      hpc_ai/ai_science_climate/English/python/jupyter_notebook/Tropical_Cyclone_Intensity_Estimation/Downloading_Images.ipynb
  14. 4 5
      hpc_ai/ai_science_climate/English/python/jupyter_notebook/Tropical_Cyclone_Intensity_Estimation/Manipulation_of_Image_Data_and_Category_Determination_using_Text_Data.ipynb
  15. 1 1
      hpc_ai/ai_science_climate/English/python/jupyter_notebook/Tropical_Cyclone_Intensity_Estimation/The_Problem_Statement.ipynb
  16. 7 8
      hpc_ai/ai_science_climate/README.MD
  17. 1 1
      hpc_ai/ai_science_climate/Singularity

+ 7 - 0
hpc_ai/ai_science_cfd/English/python/jupyter_notebook/CFD/Part2.ipynb

@@ -94,6 +94,9 @@
     "# Importing Necessary Libaries \n",
     "from __future__ import print_function\n",
     "\n",
+    "import sys\n",
+    "sys.path.append('/workspace/python/source_code')\n",
+    "\n",
     "import numpy as np\n",
     "import utils.data_utils as data_utils\n",
     "import tensorflow as tf\n",
@@ -552,6 +555,10 @@
    "source": [
     "In the upcoming notebook let us define a 5 Layer fully connected network and train it.\n",
     "\n",
+    "## Important:\n",
+    "<mark>Shutdown the kernel before clicking on “Next Notebook” to free up the GPU memory.</mark>\n",
+    "\n",
+    "\n",
     "## Licensing\n",
     "This material is released by NVIDIA Corporation under the Creative Commons Attribution 4.0 International (CC BY 4.0)"
    ]
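
The `sys.path.append` lines added in this commit make the lab's helper package importable regardless of where the Jupyter kernel starts. A minimal sketch of the idea, assuming the container layout used in this repo (`/workspace/python/source_code` holding the `utils` package):

    import sys

    # The notebooks import helpers such as `utils.data_utils`, which live under
    # /workspace/python/source_code inside the container. Appending that
    # directory to sys.path lets `import utils.data_utils` resolve no matter
    # which directory the kernel was launched from.
    SOURCE_DIR = "/workspace/python/source_code"
    if SOURCE_DIR not in sys.path:  # guard against duplicate entries on re-runs
        sys.path.append(SOURCE_DIR)

    import utils.data_utils as data_utils  # now resolvable inside the container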

+ 7 - 1
hpc_ai/ai_science_cfd/English/python/jupyter_notebook/CFD/Part3.ipynb

@@ -70,6 +70,9 @@
     "# Importing Necessary Libraries \n",
     "from __future__ import print_function\n",
     "\n",
+    "import sys\n",
+    "sys.path.append('/workspace/python/source_code')\n",
+    "\n",
     "import numpy as np\n",
     "import utils.data_utils as data_utils\n",
     "import tensorflow as tf\n",
@@ -781,7 +784,10 @@
     "\n",
     "To put the above in simple words, when our Convolution neural networks learn, we have seen that it also convolutes over an area of the set kernel size where it takes the considerations of the neighbouring pixels and not just a single pixel, this makes the signed distance function a rightful choice as it assigns values to all the pixels in the input image.\n",
     "\n",
-    "In the upcoming notebook, let us introduce some advance networks and train them."
+    "In the upcoming notebook, let us introduce some advance networks and train them.\n",
+    "\n",
+    "## Important:\n",
+    "<mark>Shutdown the kernel before clicking on “Next Notebook” to free up the GPU memory.</mark>\n"
    ]
   },
   {
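
The container recipes install `scikit-fmm`, so the signed distance function discussed above can be sketched directly. This is an illustration only, assuming a binary mask with 1 marking the obstacle; the lab's actual preprocessing in `utils.data_utils` may differ:

    import numpy as np
    import skfmm  # scikit-fmm, installed by the Dockerfile/Singularity recipes

    # Binary mask: 1 inside the obstacle, 0 in the flow region.
    mask = np.zeros((64, 64))
    mask[24:40, 24:40] = 1.0

    # skfmm.distance solves |grad(phi)| = 1 from the zero level set, so shift
    # the mask to put the obstacle boundary at zero: negative inside,
    # positive outside.
    phi = np.where(mask > 0, -0.5, 0.5)
    sdf = skfmm.distance(phi)

    # Every pixel now carries a distance to the boundary, which is what makes
    # the SDF a convolution-friendly input representation.
    print(sdf.min(), sdf.max())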

+ 3 - 0
hpc_ai/ai_science_cfd/English/python/jupyter_notebook/CFD/Part4.ipynb

@@ -65,6 +65,9 @@
     "# Import Necessary Libraries\n",
     "from __future__ import print_function\n",
     "\n",
+    "import sys\n",
+    "sys.path.append('/workspace/python/source_code')\n",
+    "\n",
     "import numpy as np \n",
     "import time\n",
     "import importlib\n",

+ 7 - 1
hpc_ai/ai_science_cfd/English/python/jupyter_notebook/Intro_to_DL/CNN's.ipynb

@@ -492,7 +492,13 @@
     "\n",
     "\n",
     "\n",
-    "Congrats on coming this far, wow that you are introduced to Machine Learning and Deep Learning, You can get started on the Domain Specific Problem accessible through the Home Page."
+    "Congrats on coming this far, wow that you are introduced to Machine Learning and Deep Learning, You can get started on the Domain Specific Problem accessible through the Home Page.\n",
+    "\n",
+    "## Exercise \n",
+    "Play with different hyper-parameters ( Epoch, depth of layers , kernel size to bring down loss further\n",
+    "\n",
+    "## Important:\n",
+    "<mark>Shutdown the kernel before clicking on “Next Notebook” to free up the GPU memory.</mark>\n"
    ]
   },
   {
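
For the exercise added above, a minimal Keras sketch of the knobs involved (epochs, depth, kernel size); the layer sizes and input shape here are illustrative, not the notebook's actual architecture:

    import tensorflow as tf
    from tensorflow.keras import layers, models

    def build_cnn(depth=2, kernel_size=3):
        """Small CNN whose depth and kernel size are exposed as hyper-parameters."""
        model = models.Sequential()
        model.add(layers.Conv2D(32, kernel_size, activation="relu",
                                padding="same", input_shape=(28, 28, 1)))
        model.add(layers.MaxPooling2D())
        for _ in range(depth - 1):
            model.add(layers.Conv2D(32, kernel_size, activation="relu",
                                    padding="same"))
            model.add(layers.MaxPooling2D())
        model.add(layers.Flatten())
        model.add(layers.Dense(10, activation="softmax"))
        return model

    model = build_cnn(depth=3, kernel_size=5)      # vary depth / kernel size
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    # model.fit(x_train, y_train, epochs=10)       # vary epochs as well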

+ 8 - 0
hpc_ai/ai_science_cfd/English/python/jupyter_notebook/Intro_to_DL/Part_2.ipynb

@@ -375,6 +375,14 @@
    "source": [
     "We get an Accuracy of 87% in the Test dataset which is less than the 89% we got during the Training phase, This problem in ML is called as Overfitting, and we have discussed the same in the previous notebook. \n",
     "\n",
+    "## Exercise\n",
+    "\n",
+    "Try adding more dense layers to the network above and observe change in accuracy.\n",
+    "\n",
+    "## Important:\n",
+    "<mark>Shutdown the kernel before clicking on “Next Notebook” to free up the GPU memory.</mark>\n",
+    "\n",
+    "\n",
     "## Licensing\n",
     "This material is released by NVIDIA Corporation under the Creative Commons Attribution 4.0 International (CC BY 4.0)"
    ]
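
The new exercise (adding more dense layers and watching accuracy) can be sketched as below; the widths and input shape are illustrative, assuming the flattened-image classifier this notebook builds:

    import tensorflow as tf
    from tensorflow.keras import layers, models

    # Baseline is Flatten -> Dense -> Dense(10); the exercise deepens the
    # Dense stack and compares train vs. test accuracy to observe overfitting.
    model = models.Sequential([
        layers.Flatten(input_shape=(28, 28)),
        layers.Dense(128, activation="relu"),
        layers.Dense(64, activation="relu"),   # extra layer for the exercise
        layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])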

+ 5 - 2
hpc_ai/ai_science_cfd/English/python/jupyter_notebook/Intro_to_DL/Resnets.ipynb

@@ -570,7 +570,10 @@
     "\n",
     "So, from the table above we can conclude that for this example how CNNs are efficient compared to other Machine Learning algorithms when it comes to image processing tasks.\n",
     "\n",
-    "Congrats on coming this far, now that you are introduced to Machine Learning and Deep Learning, you can get started on the domain specific problem accessible through the Home Page."
+    "Congrats on coming this far, now that you are introduced to Machine Learning and Deep Learning, you can get started on the domain specific problem accessible through the Home Page.\n",
+    "\n",
+    "## Important:\n",
+    "<mark>Shutdown the kernel before clicking on “Next Notebook” to free up the GPU memory.</mark>\n"
    ]
   },
   {
@@ -620,7 +623,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.6.9"
+   "version": "3.6.2"
   }
  },
  "nbformat": 4,

+ 8 - 8
hpc_ai/ai_science_cfd/README.MD

@@ -1,10 +1,10 @@
 # openacc-training-materials
-Training materials provided by OpenACC.org. The objective of this lab is to give an introduction to application of Artificial Intelligence (AI) algorithms in Science ( High Performance Computing(HPC) Simulations ). This Bootcamp will introduce you to fundamentals of AI and how they can be applied to CFD
+Training materials provided by OpenACC.org. The objective of this lab is to give an introduction to the application of Artificial Intelligence (AI) algorithms in science (High Performance Computing (HPC) simulations). This Bootcamp will introduce you to the fundamentals of AI and how they can be applied to CFD (Computational Fluid Dynamics).
 
 ## Prerequisites:
 To run this tutorial you will need a machine with an NVIDIA GPU.
 
-- Install the [Docker](https://docs.docker.com/get-docker/) or [Singularity](https://sylabs.io/docs/]).
+- Install the [Docker](https://docs.docker.com/get-docker/) or [Singularity](https://sylabs.io/docs/).
 
 ## Creating containers
 To start with, you will have to build a Docker or Singularity container.
@@ -14,7 +14,7 @@ To build a docker container, run:
 `sudo docker build --network=host -t <imagename>:<tagnumber> .`
 
 For instance:
-`sudo docker build -t myimage:1.0 .`
+`sudo docker build --network=host -t myimage:1.0 .`
 
 and to run the container, run:
 `sudo docker run --rm -it --gpus=all --network=host -p 8888:8888 myimage:1.0`
@@ -22,12 +22,13 @@ and to run the container, run:
 The container launches jupyter notebook and runs on port 8888
 `jupyter notebook --ip 0.0.0.0 --port 8888 --no-browser --allow-root`
 
-Once inside the container, start the lab by clicking on the `Start_Here.ipynb` notebook.
+Then, open the Jupyter notebook in a browser: http://localhost:8888
+Start working on the lab by clicking on the `Start_Here.ipynb` notebook.
 
 ### Singularity Container
 
 To build the singularity container, run: 
-`singularity build <image_name>.simg Singularity`
+`sudo singularity build <image_name>.simg Singularity`
 
 and copy the files to your local machine to make sure changes are stored locally:
 `singularity run <image_name>.simg cp -rT /workspace ~/workspace`
@@ -35,8 +36,7 @@ and copy the files to your local machine to make sure changes are stored locally
 Then, run the container:
 `singularity run --nv <image_name>.simg jupyter notebook --notebook-dir=~/workspace`
 
-Once inside the container, start the lab by clicking on the `Start_Here.ipynb` notebook.
+Then, open the Jupyter notebook in a browser: http://localhost:8888
+Start working on the lab by clicking on the `Start_Here.ipynb` notebook.
 
-## Questions?
-Please join [OpenACC Slack Channel](https://openacclang.slack.com/messages/openaccusergroup) for questions.
 

+ 2 - 6
hpc_ai/ai_science_cfd/Singularity

@@ -8,15 +8,11 @@ FROM: nvcr.io/nvidia/tensorflow:20.01-tf2-py3
     apt-get update -y
     apt-get install -y libsm6 libxext6 libxrender-dev git
     pip3 install opencv-python==4.1.2.30 pandas seaborn sklearn matplotlib scikit-fmm tqdm h5py gdown
-    mkdir /workspace/CFD/data
+    mkdir /workspace/python/jupyter_notebook/CFD/data
     python3 /workspace/python/source_code/dataset.py
 
 %files
-     English/ /workspace/
-#    English/python/jupyter_notebook/CFD /workspace/CFD
-#    English/python/jupyter_notebook/Intro_to_DL /workspace/Intro_to_DL
-#    English/Start_Here.ipynb /workspace/
-#    English/python/source_code/dataset.py /workspace/
+     English/* /workspace/
 
 %runscript
     "$@"

The diff for this file is too large to display.
+ 6 - 1
hpc_ai/ai_science_climate/English/python/jupyter_notebook/Intro_to_DL/CNN's.ipynb


+ 11 - 3
hpc_ai/ai_science_climate/English/python/jupyter_notebook/Intro_to_DL/Part_2.ipynb

@@ -30,8 +30,7 @@
     "**Contents of the this notebook:**\n",
     "\n",
     "- [How a Deep Learning project is planned ?](#Machine-Learning-Pipeline)\n",
-    "- [Wrapping things up with an example ( Classification )](#Wrapping-Things-up-with-an-Example)\n",
-    "     - [Fully Connected Networks](#Image-Classification-on-types-of-Clothes)\n",
+    "- [Wrapping things up with an example ( Classification )](#Image-Classification-on-types-of-clothes)\n",
     "\n",
     "\n",
     "**By the end of this notebook participant will:**\n",
@@ -373,7 +372,16 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "We get an Accuracy of 87% in the Test dataset which is less than the 89% we got during the Training phase, This problem in ML is called as Overfitting, and we have discussed the same in the previous notebook. \n",
+    "## Exercise\n",
+    "\n",
+    "Try adding more dense layers to the network above and observe change in accuracy."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "We get an Accuracy of 87% in the Test dataset which is less than the 89% we got during the Training phase, This problem in ML is called as Overfitting\n",
     "\n",
     "## Important:\n",
     "<mark>Shutdown the kernel before clicking on “Next Notebook” to free up the GPU memory.</mark>\n",

+ 1 - 1
hpc_ai/ai_science_climate/English/python/jupyter_notebook/Tropical_Cyclone_Intensity_Estimation/Competition.ipynb

@@ -341,7 +341,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "Let us Now save our Model and the trained Weights for Future usage :"
+    "Let us now save our Model and the trained Weights for Future usage :"
    ]
   },
   {

+ 4 - 4
hpc_ai/ai_science_climate/English/python/jupyter_notebook/Tropical_Cyclone_Intensity_Estimation/Countering_Data_Imbalance.ipynb

@@ -38,10 +38,10 @@
     "\n",
     "**Contents of the this notebook:**\n",
     "\n",
-    "- [Understand the drawbacks of existing solution](#Understand-the-Drawbacks)\n",
-    "- [Working out the solution](#Working-out-the-Solution)\n",
+    "- [Understand the drawbacks of existing solution](#Understanding-the-drawbacks)\n",
+    "- [Working out the solution](#Working-out-the-solution)\n",
     "    - [Data Augmentation](#Data-Augmentation)\n",
-    "- [Training the model](#Training-the-Model)\n",
+    "- [Training the model](#Training-the-Model-with-Data-Augmentation)\n",
     "\n",
     "**By the end of this notebook participant will:**\n",
     "\n",
@@ -119,7 +119,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "# Training the Model with Data Augmentation : \n",
+    "# Training the Model with Data Augmentation \n",
     "\n",
     "\n",
     "We create a new function called `augmentation(name,category,filenames,labels,i)` and here we add more samples to Category which have imbalanced data.  "

+ 2 - 2
hpc_ai/ai_science_climate/English/python/jupyter_notebook/Tropical_Cyclone_Intensity_Estimation/Downloading_Images.ipynb

@@ -55,7 +55,7 @@
     "\n",
     "#### The Cell Type has been changed to Markdown so that you don't accidentally run the code.\n",
     "\n",
-    "#### It is not recommended to run the Code Until you fully Understand the code and it's Consequences , It Can Download Huge Amount of Data (~ 10's of GB's ) thereby Filling Your Computer's Memory"
+    "#### It is not recommended to run the code until you fully understand the code and it's consequences , It can download huge amount of data (~ 10's of GB's ) thereby filling your computer's memory"
    ]
   }
  ],
@@ -75,7 +75,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.6.8"
+   "version": "3.6.2"
   }
  },
  "nbformat": 4,

+ 4 - 5
hpc_ai/ai_science_climate/English/python/jupyter_notebook/Tropical_Cyclone_Intensity_Estimation/Manipulation_of_Image_Data_and_Category_Determination_using_Text_Data.ipynb

@@ -38,10 +38,10 @@
     "\n",
     "**Contents of this notebook:**\n",
     "\n",
-    "- [Understand the Model Requirements](#Understand-the-Model-Requirements)\n",
-    "    - [Exploring Resizing Options](#Exploring-Different-Types-of-Resizing-options)\n",
+    "- [Understand the Model Requirements](#Understand-the-Model-requirements)\n",
+    "    - [Exploring Resizing Options](#Exploring-different-types-of-resizing-options)\n",
     "    - [Choosing a Random Patch](#Step-2-:-Choosing-a-Random-Patch-from-the-Image)\n",
-    "- [Annotating Our Dataset ](#Annotating-Our-Dataset) \n",
+    "- [Annotating Our Dataset ](#Annotating-our-dataset) \n",
     "- [Wrapping Things Up](#Wrapping-Things-Up-:)\n",
     "    - [Preparing the Dataset](#Preparing-the-Dataset)\n",
     "    - [Defining our Model](#Defining-our-Model)\n",
@@ -129,7 +129,7 @@
    "source": [
     "### Step 2 : Choosing a Random Patch from the Image\n",
     "\n",
-    "We will use the `np.random.randint()` function from the Numpy toolbox to generate random numbers. The parameters of this function are the upper limits and size of the Output array as mentioned in the [Numpy Documentation](https://docs.scipy.org/doc/numpy-1.15.1/reference/generated/numpy.random.randint.html)"
+    "We will use the `np.random.randint()` function from the Numpy toolbox to generate random numbers. The parameters of this function are the upper limits and size of the Output array as mentioned in the [Numpy Documentation](https://numpy.org/doc/stable/reference/random/generated/numpy.random.randint.html)"
    ]
   },
   {
@@ -639,7 +639,6 @@
    ]
   },
   {
-   "attachments": {},
    "cell_type": "markdown",
    "metadata": {},
    "source": [

+ 1 - 1
hpc_ai/ai_science_climate/English/python/jupyter_notebook/Tropical_Cyclone_Intensity_Estimation/The_Problem_Statement.ipynb

@@ -136,7 +136,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.6.9"
+   "version": "3.6.2"
   }
  },
  "nbformat": 4,

+ 7 - 8
hpc_ai/ai_science_climate/README.MD

@@ -4,7 +4,7 @@ Training materials provided by OpenACC.org. The objective of this lab is to give
 ## Prerequisites:
 To run this tutorial you will need a machine with an NVIDIA GPU.
 
-- Install the [Docker](https://docs.docker.com/get-docker/) or [Singularity](https://sylabs.io/docs/]).
+- Install the [Docker](https://docs.docker.com/get-docker/) or [Singularity](https://sylabs.io/docs/).
 
 ## Creating containers
 To start with, you will have to build a Docker or Singularity container.
@@ -22,21 +22,20 @@ and to run the container, run:
 The container launches jupyter notebook and runs on port 8888
 `jupyter notebook --ip 0.0.0.0 --port 8888 --no-browser --allow-root`
 
-Once inside the container, start the lab by clicking on the `Start_Here.ipynb` notebook.
+Then, open the Jupyter notebook in a browser: http://localhost:8888
+Start working on the lab by clicking on the `Start_Here.ipynb` notebook.
 
 ### Singularity Container
 
 To build the singularity container, run: 
-`singularity build <image_name>.simg Singularity`
+`sudo singularity build <image_name>.simg Singularity`
 
 and copy the files to your local machine to make sure changes are stored locally:
 `singularity run <image_name>.simg cp -rT /workspace ~/workspace`
 
 Then, run the container:
-`singularity run --nv <image_name>.simg jupyter notebook --notebook-dir=/workspace/python/jupyter_notebook/`
+`singularity run --nv <image_name>.simg jupyter notebook --notebook-dir=~/workspace`
 
-Once inside the container, start the lab by clicking on the `Start_Here.ipynb` notebook.
-
-## Questions?
-Please join [OpenACC Slack Channel](https://openacclang.slack.com/messages/openaccusergroup) for questions.
+Then, open the Jupyter notebook in a browser: http://localhost:8888
+Start working on the lab by clicking on the `Start_Here.ipynb` notebook.
 

+ 1 - 1
hpc_ai/ai_science_climate/Singularity

@@ -11,7 +11,7 @@ FROM: nvcr.io/nvidia/tensorflow:20.01-tf2-py3
     python3 /workspace/python/source_code/dataset.py
     
 %files
-    English/ /workspace/
+    English/* /workspace/
 
 %runscript
     "$@"