mirror of https://github.com/MillironX/nf-configs.git, synced 2024-12-24 19:18:17 +00:00

# Update mpcdf.md

This commit is contained in:
parent 3995057d7f
commit 7088ed196f

1 changed file with 10 additions and 10 deletions
```diff
@@ -6,7 +6,7 @@ All nf-core pipelines have been successfully configured for use on the HPCs at [
 To run Nextflow, the `jdk` module must be loaded. To use the nf-core profile(s), run the pipeline with `-profile <cluster>,mpcdf`.
 
-Currently the following clusters are supported: cobra, draco\* (\* coming soon)
+Currently the following clusters are supported: cobra, raven
 
 >NB: Nextflow will need to submit the jobs via SLURM to the clusters and as such the commands above will have to be executed on one of the head nodes. Check the [MPCDF documentation](https://www.mpcdf.mpg.de/services/computing).
 
```
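The profile usage described in the hunk above can be sketched as a minimal shell session on a head node (the pipeline name and `--input` option are illustrative assumptions, not part of this config):

```shell
# Load the Java runtime that Nextflow requires (module name from the doc)
module load jdk

# Run an nf-core pipeline with the cluster-specific MPCDF profile.
# This must be executed on one of the head nodes so that Nextflow
# can submit jobs via SLURM.
nextflow run nf-core/rnaseq -profile cobra,mpcdf --input samplesheet.csv
```

These commands only work inside the cluster environment (module system and SLURM present), so treat them as a sketch rather than a copy-paste recipe.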
```diff
@@ -20,10 +20,10 @@ To use: `-profile cobra,mpcdf`
 Sets the following parameters:
 
-Maximum parallel running jobs: 8
-Max. memory: 750.GB
-Max. CPUs: 80
-Max. walltime: 24.h
+- Maximum parallel running jobs: 8
+- Max. memory: 750.GB
+- Max. CPUs: 80
+- Max. walltime: 24.h
 
 ## draco
 
```
```diff
@@ -31,7 +31,7 @@ Max. walltime: 24.h
 ## raven
 
-Raven does not currently support singularity, therefore the anaconda/module is loaded for each process.
+Raven does not currently support singularity, therefore `module load anaconda/3/2020.02` is loaded for each process.
 
 Due to this, we also recommend setting the `$NXF_CONDA_CACHEDIR` to a location of your choice to store all environments (so to prevent nextflow building the environment on every run).
 
```
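The `$NXF_CONDA_CACHEDIR` recommendation above can be sketched as follows (the cache path is an arbitrary example, not mandated by the config):

```shell
# Store all conda environments in one persistent location so Nextflow
# does not rebuild them on every run (the path is an example choice).
export NXF_CONDA_CACHEDIR="$HOME/nxf-conda-cache"
mkdir -p "$NXF_CONDA_CACHEDIR"
```

Putting the `export` line in your shell profile (e.g. `~/.bashrc`) makes the setting persist across sessions.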
```diff
@@ -39,7 +39,7 @@ To use: `-profile raven,mpcdf`
 Sets the following parameters:
 
-Maximum parallel running jobs: 8
-Max. memory: 368.GB
-Max. CPUs: 192
-Max. walltime: 24.h
+- Maximum parallel running jobs: 8
+- Max. memory: 368.GB
+- Max. CPUs: 192
+- Max. walltime: 24.h
 
```