
docs(utd_ganymede): add initial documentation

This commit is contained in:
Edmund Miller 2020-03-09 20:07:46 -05:00
parent 123c44c56d
commit 1c62395332
3 changed files with 22 additions and 1 deletion

README.md

@@ -117,6 +117,7 @@ Currently documentation is available for the following systems:
* [SHH](docs/shh.md)
* [UCT_HEX](docs/uct_hex.md)
* [UPPMAX](docs/uppmax.md)
* [UTD_GANYMEDE](docs/utd_ganymede.md)
* [UZH](docs/uzh.md)
### Uploading to `nf-core/configs`

docs/utd_ganymede.md (new file)

@@ -0,0 +1,18 @@
# nf-core/configs: UTD Ganymede Configuration

All nf-core pipelines have been successfully configured for use on the Ganymede HPC cluster at [The University of Texas at Dallas](https://www.utdallas.edu/).

To use, run the pipeline with `-profile utd_ganymede`. This will download and launch the [`utd_ganymede.config`](../conf/utd_ganymede.config), which has been pre-configured with a setup suitable for the Ganymede HPC cluster. Using this profile, a Docker image containing all of the required software will be downloaded and converted to a Singularity image before execution of the pipeline.

Before running the pipeline you will need to load Singularity using the environment module system on Ganymede. You can do this by issuing the commands below:
```bash
## Singularity environment modules
module purge
module load singularity
```
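
With Singularity loaded, you can then launch a pipeline with the profile. A minimal sketch, assuming `nf-core/rnaseq` as the pipeline and a placeholder input path (substitute the pipeline and parameters for your own run):

```bash
## Example launch command; the pipeline name and --reads path are placeholders
nextflow run nf-core/rnaseq -profile utd_ganymede --reads '/path/to/*_R{1,2}.fastq.gz'
```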
All of the intermediate files required to run the pipeline will be stored in the `work/` directory. It is recommended to delete this directory after the pipeline has finished successfully because it can get quite large, and all of the main output files will be saved in the `results/` directory anyway.
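
For example, once the run has finished and you have confirmed the outputs in `results/`:

```bash
## Remove intermediate files after a successful run
rm -rf work/
```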
> NB: You will need an account to use the HPC cluster on Ganymede in order to run the pipeline. If in doubt contact GanymedeAdmins.

> NB: Nextflow will need to submit the jobs via SLURM to the HPC cluster and as such the commands above will have to be executed on one of the login nodes. If in doubt contact GanymedeAdmins.

nfcore_custom.config

@@ -37,6 +37,7 @@ profiles {
shh { includeConfig "${params.custom_config_base}/conf/shh.config" }
uct_hex { includeConfig "${params.custom_config_base}/conf/uct_hex.config" }
uppmax { includeConfig "${params.custom_config_base}/conf/uppmax.config" }
utd_ganymede { includeConfig "${params.custom_config_base}/conf/utd_ganymede.config" }
uzh { includeConfig "${params.custom_config_base}/conf/uzh.config" }
}
@@ -50,6 +51,7 @@ params {
crick: ['.thecrick.org'],
genotoul: ['.genologin1.toulouse.inra.fr', '.genologin2.toulouse.inra.fr'],
genouest: ['.genouest.org'],
uppmax: ['.uppmax.uu.se']
uppmax: ['.uppmax.uu.se'],
utd_ganymede: ['ganymede.utdallas.edu']
]
}
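
This commit wires the new profile into `nfcore_custom.config` via `includeConfig`, but the referenced `conf/utd_ganymede.config` itself is not shown in this diff. A minimal sketch of what such a cluster profile typically contains, with assumed values rather than the actual file contents:

```nextflow
// Hypothetical sketch of conf/utd_ganymede.config (assumed values, not the real file)
params {
  config_profile_description = 'University of Texas at Dallas Ganymede cluster profile.'
  config_profile_contact     = 'GanymedeAdmins'
  config_profile_url         = 'https://www.utdallas.edu/'
}

// Run containers through Singularity, matching the docs above
singularity {
  enabled    = true
  autoMounts = true
}

// Submit jobs through the cluster's SLURM scheduler
process {
  executor = 'slurm'
}
```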