Update readme.md

Mike Mikailov, 3 years ago · commit 9f1de31a8c

1 changed file: readme.md (12 insertions, 12 deletions)
@@ -9,16 +9,16 @@
 - Adjust the configuration parameters in the files *config_testing.txt*, *config_normal.txt* and *config_tumor.txt* located at the <a href="https://github.com/DIDSR/HPC_DPAI"> root </a> directory of the repository.
 - Run the commands listed in the following subsections to launch Son of Grid Engine (SGE) jobs that extract patches, group them into HDF5 files, and create a lookup table for every HDF5 file.
 ## 1.1 Extract and group
-- qsub image_patch_extract/split_main.sh ./config_testing.txt  
+- qsub ./image_patch_extract/split_main.sh ./config_testing.txt  
 -- *split_main.sh* in turn submits *split_grp.sh*, which in turn runs *split_grp.py* as an array job. Every task in the array job processes one slide.
-- qsub image_patch_extract/split_main.sh ./config_normal.txt  
-- qsub image_patch_extract/split_main.sh ./config_tumor.txt  
+- qsub ./image_patch_extract/split_main.sh ./config_normal.txt  
+- qsub ./image_patch_extract/split_main.sh ./config_tumor.txt  
 
 The \*.sh files mentioned in this section are located under the <a href="https://github.com/DIDSR/HPC_DPAI/tree/master/image_patch_extract">image_patch_extract</a> directory, while the config_*.txt files are at the <a href="https://github.com/DIDSR/HPC_DPAI"> root </a> directory of the repository.
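The one-slide-per-task pattern described above can be sketched as follows. This is a hypothetical illustration, not the repository's actual *split_grp.sh*; the file names and the slide list are invented for the demo. SGE sets `$SGE_TASK_ID` for each task of an array job submitted with `qsub -t 1-N`, so task *i* can pick the slide on line *i* of a slide list.

```shell
#!/bin/bash
# Hypothetical sketch of per-task logic in an SGE array job
# (script and file names are illustrative, not from the repository).
: "${SGE_TASK_ID:=1}"   # SGE sets this in a real run; default lets the sketch run locally

# A stand-in slide list; in practice this would enumerate the whole-slide images.
printf 'slide_01.tif\nslide_02.tif\nslide_03.tif\n' > slide_list.txt

# Task i processes the slide on line i of the list.
SLIDE=$(sed -n "${SGE_TASK_ID}p" slide_list.txt)
echo "task ${SGE_TASK_ID}: extracting patches from ${SLIDE}"
```

Because each task reads only its own line, the tasks are independent and SGE can schedule them in any order.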
 ## 1.2 Create lookup tables
-- bash image_patch_extract/create_lookup_grp.sh ./config_testing.txt  
-- bash image_patch_extract/create_lookup_grp.sh ./config_normal.txt  
-- bash image_patch_extract/create_lookup_grp.sh ./config_tumor.txt  
+- bash ./image_patch_extract/create_lookup_grp.sh ./config_testing.txt  
+- bash ./image_patch_extract/create_lookup_grp.sh ./config_normal.txt  
+- bash ./image_patch_extract/create_lookup_grp.sh ./config_tumor.txt  
 
 The lookup tables are created only once and are used at the [Prediction](#2-prediction) stage for launching array job tasks. These tasks run in a parallel and scalable manner - if there are not enough resources to run all tasks at once, they are queued up automatically and started as resources become available. Each task processes only one group.
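One way a lookup table can drive the array job launch is sketched below. This is an assumption about the mechanism, not the repository's actual code: the file name, line format, and script name are invented. If each line of the lookup table describes one HDF5 group, then the line count gives the number of tasks to request with `qsub -t`.

```shell
#!/bin/bash
# Hypothetical sketch: sizing an SGE array job from a lookup table
# (file contents, line format, and script names are illustrative).
# One line per HDF5 group: "<group name> <patch count>".
printf 'grp_000 512\ngrp_001 512\ngrp_002 137\n' > lookup_demo.txt

# One array task per group, so the task range is 1..<number of lines>.
NTASKS=$(wc -l < lookup_demo.txt)
echo "would submit: qsub -t 1-${NTASKS} process_array.sh"
```

Counting lines once at submission time is what makes the task count automatic: the lookup table, not the submitter, determines how many tasks are queued.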
 
@@ -31,15 +31,15 @@ The \*.sh file mentioned in this section is located under <a href="https://githu
 The \*.sh files mentioned in sections 2.1 and 2.2 below are located under the <a href="https://github.com/DIDSR/HPC_DPAI/tree/master/prediction">prediction</a> directory, while the config_*.txt files are at the <a href="https://github.com/DIDSR/HPC_DPAI"> root </a> directory of the repository.
 
 ## 2.1 With color normalization
-- qsub prediction/process_main.sh ./config_testing_cn_true.txt  
+- qsub ./prediction/process_main.sh ./config_testing_cn_true.txt  
 -- *process_main.sh* in turn submits a number of SGE jobs using *process_array.sh*, which in turn runs *process_images_grp_normalization_wli.py* in the array jobs generated for every slide. The number of tasks in an array job is determined automatically based on the number of groups in the corresponding HDF5 file.
-- qsub prediction/process_main.sh ./config_normal_cn_true.txt  
-- qsub prediction/process_main.sh ./config_tumor_cn_true.txt  
+- qsub ./prediction/process_main.sh ./config_normal_cn_true.txt  
+- qsub ./prediction/process_main.sh ./config_tumor_cn_true.txt  
 
 ## 2.2 Without color normalization 
-- qsub prediction/process_main.sh ./config_testing.txt  
-- qsub prediction/process_main.sh ./config_normal.txt  
-- qsub prediction/process_main.sh ./config_tumor.txt  
+- qsub ./prediction/process_main.sh ./config_testing.txt  
+- qsub ./prediction/process_main.sh ./config_normal.txt  
+- qsub ./prediction/process_main.sh ./config_tumor.txt  
 
 # 3 Heatmap stitching
 After the prediction matrices have been generated, an SGE job using the *heatmap_main.sh* script can be launched to generate heatmaps. This launch takes two arguments: a) the type of the slides (test, normal or tumor); b) the root directory of the results, as in the example run below: