mirror of https://github.com/MillironX/nf-configs.git
synced 2024-11-22 00:26:03 +00:00

automated the internal cache dir location for singularity

This commit is contained in:
parent b2f7538b33
commit c2b26a734d

2 changed files with 8 additions and 7 deletions
@@ -32,6 +32,7 @@ profiles {
 executor = 'slurm'
 queue = 'skitty'
 maxRetries = 2
+beforeScript = "export SINGULARITY_CACHEDIR=$VSC_SCRATCH_VO_USER/.singularity"
 scratch = "$VSC_SCRATCH_VO_USER"
 }
 }
@@ -50,6 +51,7 @@ profiles {
 executor = 'slurm'
 queue = 'swalot'
 maxRetries = 2
+beforeScript = "export SINGULARITY_CACHEDIR=$VSC_SCRATCH_VO_USER/.singularity"
 scratch = "$VSC_SCRATCH_VO_USER"
 }
 }
@@ -68,6 +70,7 @@ profiles {
 executor = 'slurm'
 queue = 'victini'
 maxRetries = 2
+beforeScript = "export SINGULARITY_CACHEDIR=$VSC_SCRATCH_VO_USER/.singularity"
 scratch = "$VSC_SCRATCH_VO_USER"
 }
 }
@@ -86,6 +89,7 @@ profiles {
 executor = 'slurm'
 queue = 'kirlia'
 maxRetries = 2
+beforeScript = "export SINGULARITY_CACHEDIR=$VSC_SCRATCH_VO_USER/.singularity"
 scratch = "$VSC_SCRATCH_VO_USER"
 }
 }
@@ -104,6 +108,7 @@ profiles {
 executor = 'slurm'
 queue = 'doduo'
 maxRetries = 2
+beforeScript = "export SINGULARITY_CACHEDIR=$VSC_SCRATCH_VO_USER/.singularity"
 scratch = "$VSC_SCRATCH_VO_USER"
 }
 }
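The added `beforeScript` runs in each task's environment before the task starts. A minimal shell sketch of what it does, using a hypothetical `$VSC_SCRATCH_VO_USER` value (on the VSC clusters this variable is preset per VO member; the path below is illustrative only):

```shell
# Hypothetical scratch path; on the cluster $VSC_SCRATCH_VO_USER is already set.
VSC_SCRATCH_VO_USER="/scratch/gent/vo/000/vsc40000"

# The export the new beforeScript performs for every task:
export SINGULARITY_CACHEDIR="$VSC_SCRATCH_VO_USER/.singularity"
echo "$SINGULARITY_CACHEDIR"  # prints /scratch/gent/vo/000/vsc40000/.singularity
```

With this in place, Singularity image downloads land on VO scratch storage rather than in the small `$HOME` quota, so users no longer need to export the variable themselves.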
@@ -19,16 +19,12 @@ module load Nextflow
 nextflow run <pipeline> -profile vsc_ugent,<CLUSTER> <Add your other parameters>
 ```
 
-I also highly recommend specifying a location of a Singularity cache directory, by specifying the location with the `$SINGULARITY_CACHEDIR` bash environment variable in your `.bash_profile` or `.bashrc` or by adding it to your SLURM/PBS script. If this cache directory is not specified, the cache directory defaults to your `$HOME/.singularity` directory, which does not have a lot of disk space.
-
-```shell
-export SINGULARITY_CACHEDIR=$VSC_SCRATCH_VO_USER/.singularity
-```
+I also highly recommend specifying a location of a Singularity cache directory, by specifying the location with the `$SINGULARITY_CACHEDIR` bash environment variable in your `.bash_profile` or `.bashrc` or by adding it to your PBS script. If this cache directory is not specified,
 
 All of the intermediate files required to run the pipeline will be stored in the `work/` directory. It is recommended to delete this directory after the pipeline has finished successfully because it can get quite large, and all of the main output files will be saved in the `results/` directory anyway.
 The config contains a `cleanup` command that removes the `work/` directory automatically once the pipeline has completed successfully. If the run does not complete successfully then the `work/` dir should be removed manually to save storage space. The default work directory is set to `$VSC_SCRATCH_VO_USER/work` per this configuration
 
-You can also add several TORQUE options to the PBS script. More about this on this [link](http://hpcugent.github.io/vsc_user_docs/pdf/intro-HPC-linux-gent.pdf#appendix.B).
+You can also add several TORQUE options to the SLURM/PBS script. More about this on this [link](http://hpcugent.github.io/vsc_user_docs/pdf/intro-HPC-linux-gent.pdf#appendix.B).
 
 To submit your job to the cluster by using the following command:
 
@@ -38,4 +34,4 @@ qsub <script name>.pbs
 
 > **NB:** The profile only works for the clusters `skitty`, `swalot`, `victini`, `kirlia` and `doduo`.
 
-> **NB:** The default directory where the `work/` and `singularity` (cache directory for images) is located in `$VSC_SCRATCH_VO_USER`.
+> **NB:** The default directory where the `work/` and `singularity/` (cache directory for images) is located in `$VSC_SCRATCH_VO_USER`.
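The docs above boil down to a short job script. A sketch of one, assuming a TORQUE/PBS-style submission as in the docs; the `#PBS` resource lines are illustrative, and `<pipeline>` stays a placeholder as in the original:

```shell
#!/bin/bash
#PBS -l nodes=1:ppn=1          # illustrative resource request, adjust as needed
#PBS -l walltime=24:00:00

module load Nextflow

# <pipeline> is a placeholder, as in the docs; the cluster name selects the queue
nextflow run <pipeline> -profile vsc_ugent,skitty <Add your other parameters>
```

Saved as e.g. `run.pbs`, this would be submitted with `qsub run.pbs`, per the `qsub <script name>.pbs` line shown above.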