diff --git a/conf/vsc_ugent.config b/conf/vsc_ugent.config
index 7df41f9..560367a 100644
--- a/conf/vsc_ugent.config
+++ b/conf/vsc_ugent.config
@@ -32,6 +32,7 @@ profiles {
             executor = 'slurm'
             queue = 'skitty'
             maxRetries = 2
+            beforeScript = "export SINGULARITY_CACHEDIR=$VSC_SCRATCH_VO_USER/.singularity"
             scratch = "$VSC_SCRATCH_VO_USER"
         }
     }
@@ -50,6 +51,7 @@ profiles {
             executor = 'slurm'
             queue = 'swalot'
             maxRetries = 2
+            beforeScript = "export SINGULARITY_CACHEDIR=$VSC_SCRATCH_VO_USER/.singularity"
             scratch = "$VSC_SCRATCH_VO_USER"
         }
     }
@@ -68,6 +70,7 @@ profiles {
             executor = 'slurm'
             queue = 'victini'
             maxRetries = 2
+            beforeScript = "export SINGULARITY_CACHEDIR=$VSC_SCRATCH_VO_USER/.singularity"
             scratch = "$VSC_SCRATCH_VO_USER"
         }
     }
@@ -86,6 +89,7 @@ profiles {
             executor = 'slurm'
             queue = 'kirlia'
             maxRetries = 2
+            beforeScript = "export SINGULARITY_CACHEDIR=$VSC_SCRATCH_VO_USER/.singularity"
             scratch = "$VSC_SCRATCH_VO_USER"
         }
     }
@@ -104,6 +108,7 @@ profiles {
             executor = 'slurm'
             queue = 'doduo'
             maxRetries = 2
+            beforeScript = "export SINGULARITY_CACHEDIR=$VSC_SCRATCH_VO_USER/.singularity"
             scratch = "$VSC_SCRATCH_VO_USER"
         }
     }
diff --git a/docs/vsc_ugent.md b/docs/vsc_ugent.md
index 95c7f96..027e123 100644
--- a/docs/vsc_ugent.md
+++ b/docs/vsc_ugent.md
@@ -19,16 +19,12 @@ module load Nextflow
 nextflow run <pipeline> -profile vsc_ugent,
 ```
 
-I also highly recommend specifying a location of a Singularity cache directory, by specifying the location with the `$SINGULARITY_CACHEDIR` bash environment variable in your `.bash_profile` or `.bashrc` or by adding it to your SLURM/PBS script. If this cache directory is not specified, the cache directory defaults to your `$HOME/.singularity` directory, which does not have a lot of disk space.
-
-```shell
-export SINGULARITY_CACHEDIR=$VSC_SCRATCH_VO_USER/.singularity
-```
+I also highly recommend setting the location of the Singularity cache directory with the `$SINGULARITY_CACHEDIR` bash environment variable in your `.bash_profile` or `.bashrc`, or by adding it to your PBS script. If this cache directory is not specified, it defaults to `$HOME/.singularity`, which does not have a lot of disk space. All of the intermediate files required to run the pipeline are stored in the `work/` directory. It is recommended to delete this directory after the pipeline has finished successfully, because it can get quite large and all of the main output files are saved in the `results/` directory anyway. The config contains a `cleanup` command that removes the `work/` directory automatically once the pipeline has completed successfully. If the run does not complete successfully, the `work/` directory should be removed manually to save storage space. The default work directory is set to `$VSC_SCRATCH_VO_USER/work` by this configuration.
 
-You can also add several TORQUE options to the PBS script. More about this on this [link](http://hpcugent.github.io/vsc_user_docs/pdf/intro-HPC-linux-gent.pdf#appendix.B).
+You can also add several TORQUE options to the SLURM/PBS script. More information can be found at this [link](http://hpcugent.github.io/vsc_user_docs/pdf/intro-HPC-linux-gent.pdf#appendix.B).
 
 To submit your job to the cluster, use the following command:
 
@@ -38,4 +34,4 @@ qsub
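For context on what the new `beforeScript` lines do: the same cache setup can be reproduced manually in an interactive shell or a `.bashrc`. A minimal sketch, assuming `$VSC_SCRATCH_VO_USER` is defined as on the UGent VSC clusters; the `$HOME` fallback is an assumption for machines where it is not set:

```shell
# Mirror the beforeScript added in conf/vsc_ugent.config: point the
# Singularity cache at VO scratch instead of the small $HOME quota.
# The ${...:-$HOME} fallback is an assumption for non-VSC machines.
export SINGULARITY_CACHEDIR="${VSC_SCRATCH_VO_USER:-$HOME}/.singularity"

# Create the cache directory up front so the first pull cannot fail on it.
mkdir -p "$SINGULARITY_CACHEDIR"
```

Because the config now exports this per job, setting it manually is only needed for runs that bypass these cluster profiles.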