
Updated README. Updated link and references

bharatk-parallel 4 years ago
parent
commit
78e9fee343

+ 2 - 1
hpc/miniprofiler/README.md

@@ -6,6 +6,7 @@ To run this tutorial you will need a machine with NVIDIA GPU.
 
 - Install [Docker](https://docs.docker.com/get-docker/) or [Singularity](https://sylabs.io/docs/).
 - Install the NVIDIA toolkit, [Nsight Systems (latest version)](https://developer.nvidia.com/nsight-systems) and [Nsight Compute (latest version)](https://developer.nvidia.com/nsight-compute).
+- The base containers required for the lab may require users to create an NGC account and generate an API key (https://docs.nvidia.com/ngc/ngc-catalog-user-guide/index.html#registering-activating-ngc-account).
 
 ## Creating containers
 To start with, you will have to build a Docker or Singularity container.
@@ -34,7 +35,7 @@ Once inside the container, open the jupyter notebook in browser: http://localhos
 ### Singularity Container
 
 To build the singularity container, run: 
-`singularity build miniapp_profiler.simg Singularity`
+`sudo singularity build miniapp_profiler.simg Singularity`
 
 and copy the files to your local machine to make sure changes are stored locally:
 `singularity run miniapp_profiler.simg cp -rT /labs ~/labs`
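
For the NGC prerequisite added above, pulling the base containers usually means authenticating Docker against the NGC registry first. A minimal sketch of the usual login flow, assuming an NGC API key has already been generated (the key value below is a placeholder):

```bash
# Log Docker in to the NVIDIA NGC registry (nvcr.io) so the base image
# referenced by the Dockerfile can be pulled.
# The username is the literal string "$oauthtoken"; the password is the
# API key generated from your NGC account page.
docker login nvcr.io
# Username: $oauthtoken
# Password: <paste your NGC API key here>
```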

+ 2 - 1
hpc/openacc/README.md

@@ -6,6 +6,7 @@ To run this tutorial you will need a machine with NVIDIA GPU.
 
 - Install [Docker](https://docs.docker.com/get-docker/) or [Singularity](https://sylabs.io/docs/).
 - Install the NVIDIA toolkit, [Nsight Systems (latest version)](https://developer.nvidia.com/nsight-systems) and [Nsight Compute (latest version)](https://developer.nvidia.com/nsight-compute).
+- The base containers required for the lab may require users to create an NGC account and generate an API key (https://docs.nvidia.com/ngc/ngc-catalog-user-guide/index.html#registering-activating-ngc-account).
 
 ## Creating containers
 To start with, you will have to build a Docker or Singularity container.
@@ -34,7 +35,7 @@ Once inside the container, open the jupyter notebook in browser: http://localhos
 ### Singularity Container
 
 To build the singularity container, run: 
-`singularity build openacc.simg Singularity`
+`sudo singularity build openacc.simg Singularity`
 
 and copy the files to your local machine to make sure changes are stored locally:
 `singularity run openacc.simg cp -rT /labs ~/labs`
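
If the Singularity definition file pulls its base image from nvcr.io, the NGC credentials can be passed through environment variables instead of a Docker login. A sketch, assuming Singularity 3.x and an already-generated NGC API key (the key value is a placeholder):

```bash
# Supply NGC registry credentials to Singularity; the username is the literal
# string "$oauthtoken" and the password is the NGC API key.
export SINGULARITY_DOCKER_USERNAME='$oauthtoken'
export SINGULARITY_DOCKER_PASSWORD='<your NGC API key>'

# -E preserves the exported variables across sudo during the build.
sudo -E singularity build openacc.simg Singularity
```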

+ 5 - 0
hpc_ai/ai_science_cfd/README.MD

@@ -8,6 +8,8 @@ To run this tutorial you will need a machine with NVIDIA GPU.
 
 Make sure both Docker and Singularity have been installed with NVIDIA GPU support.
 
+- The base containers required for the lab may require users to create an NGC account and generate an API key (https://docs.nvidia.com/ngc/ngc-catalog-user-guide/index.html#registering-activating-ngc-account).
+
 ## Creating containers
 To start with, you will have to build a Docker or Singularity container.
 
@@ -51,3 +53,6 @@ Q. Cannot write to /tmp directory
 
 A. Some notebooks depend on writing logs to the /tmp directory. While creating the container, make sure /tmp is accessible with write permission inside the container. Alternatively, the user can change the tmp directory location.
 
+Q. "ResourceExhaustedError" error is observed while running the labs
+A. Currently the batch size and network model is set to consume 16GB GPU memory. In order to use the labs without any modifications it is recommended to have GPU with minimum 16GB GPU memory. Else the users can play with batch size to reduce the memory footprint
+
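
Since the labs as shipped assume roughly 16 GB of GPU memory, it can help to check what is actually available before launching the notebooks. A quick check with nvidia-smi's standard query flags:

```bash
# Report the GPU name plus total and currently free memory, in CSV form.
nvidia-smi --query-gpu=name,memory.total,memory.free --format=csv
```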

+ 5 - 3
hpc_ai/ai_science_climate/English/python/jupyter_notebook/Tropical_Cyclone_Intensity_Estimation/Approach_to_the_Problem_&_Inspecting_and_Cleaning_the_Required_Data.ipynb

@@ -21,7 +21,7 @@
     "[1](The_Problem_Statement.ipynb)\n",
     "[2]\n",
     "[3](Manipulation_of_Image_Data_and_Category_Determination_using_Text_Data.ipynb)\n",
-    "[4](Removing_Edges_Black_Patches_and_Countering_Data_Imbalance.ipynb)\n",
+    "[4](Countering_Data_Imbalance.ipynb)\n",
     "[5](Competition.ipynb)\n",
     "     \n",
     "     \n",
@@ -88,10 +88,12 @@
     "\n",
     "<table><tr>\n",
     "<td><img src=\"images/example.jpg\" alt=\"Drawing\" style=\"width: 320px;\"/></td>\n",
-    "<td><img src=\"https://www.nrlmry.navy.mil/tcdat/tc05/ATL/12L.KATRINA/ir/geo/1km/20050827.1715.goes12.x.ir1km.12LKATRINA.100kts-940mb-244N-846W.jpg\" alt=\"Drawing\" style=\"width: 320px;\"/></td>\n",
+    "<td><img src=\"images/example1.jpg\" alt=\"Drawing\" style=\"width: 320px;\"/></td>\n",
     "</tr>\n",
     "</table>\n",
     "\n",
+    "*Source: https://www.nrlmry.navy.mil/*\n",
+    "\n",
     "#### Each Image will be annotated to a category of Cyclone Intensity using the text data with the help of the following table :\n",
     "\n",
     "![alt text](images/cat.png)"
@@ -450,7 +452,7 @@
     "[1](The_Problem_Statement.ipynb)\n",
     "[2]\n",
     "[3](Manipulation_of_Image_Data_and_Category_Determination_using_Text_Data.ipynb)\n",
-    "[4](Removing_Edges_Black_Patches_and_Countering_Data_Imbalance.ipynb)\n",
+    "[4](Countering_Data_Imbalance.ipynb)\n",
     "[5](Competition.ipynb)\n",
     "&emsp;&emsp;&emsp;&emsp;&emsp;\n",
     "&emsp;&emsp;&emsp;&emsp;&emsp;\n",

+ 2 - 2
hpc_ai/ai_science_climate/English/python/jupyter_notebook/Tropical_Cyclone_Intensity_Estimation/The_Problem_Statement.ipynb

@@ -22,7 +22,7 @@
     "[1]\n",
     "[2](Approach_to_the_Problem_&_Inspecting_and_Cleaning_the_Required_Data.ipynb)\n",
     "[3](Manipulation_of_Image_Data_and_Category_Determination_using_Text_Data.ipynb)\n",
-    "[4](Removing_Edges_Black_Patches_and_Countering_Data_Imbalance.ipynb)\n",
+    "[4](Countering_Data_Imbalance.ipynb)\n",
     "[5](Competition.ipynb)\n",
     "&emsp;&emsp;&emsp;&emsp;&emsp;\n",
     "&emsp;&emsp;&emsp;&emsp;&emsp;\n",
@@ -101,7 +101,7 @@
     "[1]\n",
     "[2](Approach_to_the_Problem_&_Inspecting_and_Cleaning_the_Required_Data.ipynb)\n",
     "[3](Manipulation_of_Image_Data_and_Category_Determination_using_Text_Data.ipynb)\n",
-    "[4](Removing_Edges_Black_Patches_and_Countering_Data_Imbalance.ipynb)\n",
+    "[4](Countering_Data_Imbalance.ipynb)\n",
     "[5](Competition.ipynb)\n",
     "&emsp;&emsp;&emsp;&emsp;&emsp;\n",
     "&emsp;&emsp;&emsp;&emsp;&emsp;\n",

BIN
hpc_ai/ai_science_climate/English/python/jupyter_notebook/Tropical_Cyclone_Intensity_Estimation/images/example1.jpg


+ 7 - 0
hpc_ai/ai_science_climate/README.MD

@@ -6,6 +6,10 @@ To run this tutorial you will need a machine with NVIDIA GPU.
 
 - Install the latest [Docker](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/install-guide.html#docker) or [Singularity](https://sylabs.io/docs/).
 
+- The batch size and network model are currently set to consume 16 GB of GPU memory. To run the labs without modification, a GPU with at least 16 GB of memory is recommended; otherwise a "ResourceExhaustedError" will be observed. Users can also reduce the batch size to lower the memory footprint.
+
+- The base containers required for the lab may require users to create an NGC account and generate an API key (https://docs.nvidia.com/ngc/ngc-catalog-user-guide/index.html#registering-activating-ngc-account).
+
 ## Creating containers
 To start with, you will have to build a Docker or Singularity container.
 
@@ -50,3 +54,6 @@ Q. Cannot write to /tmp directory
 
 A. Some notebooks depend on writing logs to the /tmp directory. While creating the container, make sure /tmp is accessible with write permission inside the container. Alternatively, the user can change the tmp directory location.
 
+Q. "ResourceExhaustedError" error is observed while running the labs 
+A. Currently the batch size and network model is set to consume 16GB GPU memory. In order to use the labs without any modifications it is recommended to have GPU with minimum 16GB GPU memory. Else the users can play with batch size to reduce the memory footprint
+
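
For the /tmp question above, one option with Singularity is to bind-mount a writable host directory over /tmp when launching the container. A sketch only; the image name and host path are illustrative, not the exact names used by this lab:

```bash
# Create a writable scratch directory on the host and bind it over /tmp
# inside the container so the notebooks can write their logs there.
mkdir -p ~/labs_tmp
singularity run --nv -B ~/labs_tmp:/tmp ai_science_climate.simg
```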