
Merge pull request #9 from apeltzer/master

Add SHH Configuration / add custom config_profile_* descriptions
Phil Ewels 2019-01-07 12:59:22 +01:00 committed by GitHub
commit 723428c186
19 changed files with 203 additions and 63 deletions

README.md

@@ -34,9 +34,11 @@ If you want to use an existing config available in `nf-core/configs`, and you're
 If you decide to upload your custom config file to `nf-core/configs` then this will ensure that your custom config file will be automatically downloaded, and available at run-time to all nf-core pipelines, and to everyone within your organisation. You will simply have to specify `-profile <config_name>` in the command used to run the pipeline. See [`nf-core/configs`](https://github.com/nf-core/configs/tree/master/conf) for examples.
+Please also make sure to add an extra `params` section with `params.config_profile_name`, `params.config_profile_description`, `params.config_profile_contact` and `params.config_profile_url` set to reasonable values. Users will then see who wrote the configuration profile when executing an nf-core pipeline, and can report back if, for example, anything is missing.
 ### Testing
-If you want to add a new custom config file to `nf-core/configs` please can you test that your pipeline of choice runs as expected by using the [`-c`](https://www.nextflow.io/docs/latest/config.html) parameter.
+If you want to add a new custom config file to `nf-core/configs`, please test that your pipeline of choice runs as expected by using the [`-c`](https://www.nextflow.io/docs/latest/config.html) parameter.
 ```bash
 ## Example command for nf-core/rnaseq
@@ -45,10 +47,26 @@ nextflow run nf-core/rnaseq --reads '*_R{1,2}.fastq.gz' --genome GRCh37 -c '[pat
 ### Documentation
-You will have to create a [Markdown document](https://www.markdownguide.org/getting-started/) outlining the details required to use the custom config file within your organisation.
+You will have to create a [Markdown document](https://www.markdownguide.org/getting-started/) outlining the details required to use the custom config file within your organisation. You can orientate yourself using the [template](docs/template.md) that we provide, filling in the information for your cluster there.
 See [`nf-core/configs/docs`](https://github.com/nf-core/configs/tree/master/docs) for examples.
+Currently documentation is available for the following clusters:
+* [BINAC](docs/binac.md)
+* [CCGA](docs/ccga.md)
+* [CFC](docs/cfc.md)
+* [CRICK](docs/crick.md)
+* [GIS](docs/gis.md)
+* [HEBBE](docs/hebbe.md)
+* [MENDEL](docs/mendel.md)
+* [PHOENIX](docs/phoenix.md)
+* [SHH](docs/shh.md)
+* [UCT_HEX](docs/uct_hex.md)
+* [UPPMAX-DEVEL](docs/uppmax-devel.md)
+* [UPPMAX](docs/uppmax.md)
+* [UZH](docs/uzh.md)
 ### Uploading to `nf-core/configs`
 [Fork](https://help.github.com/articles/fork-a-repo/) the `nf-core/configs` repository to your own GitHub account. Within the local clone of your fork add the custom config file to the [`conf/`](https://github.com/nf-core/configs/tree/master/conf) directory, and the documentation file to the [`docs/`](https://github.com/nf-core/configs/tree/master/docs) directory. You will also need to edit and add your custom profile to the [`nfcore_custom.config`](https://github.com/nf-core/configs/blob/master/nfcore_custom.config) file in the top-level directory of the clone.
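The extra `params` section described above can be sketched as follows; the profile name, description, contact and URL below are placeholders for your own organisation, not a real profile:

```nextflow
//Profile config names for nf-core/configs
params {
  // All four config_profile_* values are placeholders for illustration
  config_profile_name = 'MYCLUSTER'
  config_profile_description = 'MYCLUSTER profile provided by nf-core/configs.'
  config_profile_contact = 'Jane Doe (@janedoe)'
  config_profile_url = 'https://hpc.example.org'
}
```

All four fields are plain strings, so users of any nf-core pipeline can see at startup who maintains the profile and where to ask for help.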

conf/binac.config

@@ -1,9 +1,10 @@
-/*
- * ----------------------------------------------------------------------------
- * Nextflow config file for use with Singularity on BINAC cluster in Tuebingen
- * ----------------------------------------------------------------------------
- * Defines basic usage limits and singularity image id.
- */
+//Profile config names for nf-core/configs
+params {
+  config_profile_name = 'BINAC'
+  config_profile_description = 'BINAC Cluster Profile provided by nf-core/configs.'
+  config_profile_contact = 'Alexander Peltzer (@apeltzer)'
+  config_profile_url = 'https://www.bwhpc-c5.de/wiki/index.php/Category:BwForCluster_BinAC'
+}
 singularity {
   enabled = true

conf/ccga.config

@@ -1,3 +1,11 @@
+//Profile config names for nf-core/configs
+params {
+  config_profile_name = 'CCGA'
+  config_profile_description = 'CCGA Cluster Profile provided by nf-core/configs.'
+  config_profile_contact = 'Marc Hoeppner (@marchoeppner)'
+  config_profile_url = 'https://www.ikmb.uni-kiel.de/'
+}
 /*
  * -------------------------------------------------
  * Nextflow config file with environment modules for RZCluster in Kiel

conf/cfc.config

@@ -1,9 +1,10 @@
-/*
- * -------------------------------------------------------------
- * Nextflow config file for use with Singularity on CFC at QBIC
- * -------------------------------------------------------------
- * Defines basic usage limits and singularity image id.
- */
+//Profile config names for nf-core/configs
+params {
+  config_profile_name = 'CFC'
+  config_profile_description = 'Core Facility Cluster Profile provided by nf-core/configs.'
+  config_profile_contact = 'Alexander Peltzer (@apeltzer)'
+  config_profile_url = 'http://qbic.uni-tuebingen.de/'
+}
 singularity {
   enabled = true

conf/crick.config

@@ -1,8 +1,10 @@
-/*
- * -------------------------------------------------
- * Nextflow config file for CAMP HPC @ The Crick
- * -------------------------------------------------
- */
+//Profile config names for nf-core/configs
+params {
+  config_profile_name = 'CRICK'
+  config_profile_description = 'The Francis Crick Institute CAMP HPC Cluster Profile provided by nf-core/configs.'
+  config_profile_contact = 'Harshil Patel (@drpatelh)'
+  config_profile_url = 'https://www.crick.ac.uk/research/platforms-and-facilities/scientific-computing/technologies'
+}
 singularity {
   enabled = true

conf/hebbe.config

@@ -1,10 +1,10 @@
-/*
- * -------------------------------------------------
- * Gothenburg Hebbe Cluster config file
- * -------------------------------------------------
- * http://www.c3se.chalmers.se/index.php/Hebbe
- */
+//Profile config names for nf-core/configs
+params {
+  config_profile_name = 'HEBBE'
+  config_profile_description = 'Gothenburg Hebbe Cluster Profile provided by nf-core/configs.'
+  config_profile_contact = 'Phil Ewels (@ewels)'
+  config_profile_url = 'http://www.c3se.chalmers.se/index.php/Hebbe'
+}
 singularity {
   enabled = true

conf/mendel.config

@@ -1,8 +1,10 @@
-/*
- * --------------------------------------------------------------------------------------
- * Nextflow config file for the MENDEL cluster at the Gregor Mendel Institute in Vienna
- * -------------------------------------------------------------------------------------
- */
+//Profile config names for nf-core/configs
+params {
+  config_profile_name = 'MENDEL'
+  config_profile_description = 'MENDEL cluster profile provided by nf-core/configs'
+  config_profile_contact = 'Philipp H (@phue)'
+  config_profile_url = 'http://www.gmi.oeaw.ac.at/'
+}
 singularity {
   enabled = true

conf/phoenix.config

@@ -1,9 +1,10 @@
-/*
- * ----------------------------------------------------------------------------
- * Nextflow config file for use with Singularity on Phoenix Cluster Adelaide
- * ----------------------------------------------------------------------------
- * Defines basic usage limits and singularity image id.
- */
+//Profile config names for nf-core/configs
+params {
+  config_profile_name = 'PHOENIX'
+  config_profile_description = 'Phoenix Research cluster profile provided by nf-core/configs'
+  config_profile_contact = 'Yassine Souilmi / Alexander Peltzer (@yassineS, @apeltzer)'
+  config_profile_url = 'https://www.adelaide.edu.au/phoenix/'
+}
 singularity {
   enabled = true

conf/shh.config Normal file

@@ -0,0 +1,23 @@
//Profile config names for nf-core/configs
params {
  config_profile_name = 'SHH'
  config_profile_description = 'MPI SHH Cluster Profile provided by nf-core/configs.'
  config_profile_contact = 'James Fellows Yates (@jfy133)'
  config_profile_url = 'https://shh.mpg.de'
}

singularity {
  enabled = true
  cacheDir = "/projects1/users/$USER/nextflow/nf_cache/singularity/"
}

process {
  executor = 'slurm'
  queue = 'medium'
}

params {
  max_memory = 734.GB
  max_cpus = 64
  max_time = 48.h
}

conf/uct_hex.config

@@ -1,9 +1,10 @@
-/*
- * -------------------------------------------------
- * University of Cape Town HEX cluster config file
- * -------------------------------------------------
- * http://hpc.uct.ac.za/index.php/hex-3/
- */
+//Profile config names for nf-core/configs
+params {
+  config_profile_name = 'uct_hex'
+  config_profile_description = 'University of Cape Town HEX cluster config file provided by nf-core/configs.'
+  config_profile_contact = 'Katie Lennard (@kviljoen)'
+  config_profile_url = 'http://hpc.uct.ac.za/index.php/hex-3/'
+}
 singularity {
   enabled = true
@@ -22,6 +23,3 @@ executor{
   jobName = { "$task.tag" }
 }
-params {
-  igenomes_base = '/scratch/DB/bio/rna-seq/references'
-}

conf/uppmax-devel.config

@@ -1,13 +1,20 @@
-/*
- * -------------------------------------------------
- * Nextflow config file for UPPMAX (rackham / irma)
- * -------------------------------------------------
+//Profile config names for nf-core/configs
+params {
+  config_profile_name = 'UPPMAX-DEVEL'
+  config_profile_description = 'UPPMAX Development Cluster Profile provided by nf-core/configs.'
+  config_profile_contact = 'Phil Ewels (@ewels)'
+  config_profile_url = 'https://www.uppmax.uu.se/'
+}
+/* Additional description:
  * To be applied after main UPPMAX config, overwrites config and
  * submits jobs to the `devcore` queue, which has much faster
  * queue times. All jobs are limited to 1 hour to be eligible
  * for this queue and only one job allowed at a time.
  */
 executor {
   name = 'slurm'
   queueSize = 1

conf/uppmax.config

@@ -1,7 +1,12 @@
-/*
- * -------------------------------------------------
- * Nextflow config file for UPPMAX (rackham / irma)
- * -------------------------------------------------
+//Profile config names for nf-core/configs
+params {
+  config_profile_name = 'UPPMAX'
+  config_profile_description = 'UPPMAX Cluster Profile provided by nf-core/configs.'
+  config_profile_contact = 'Phil Ewels (@ewels)'
+  config_profile_url = 'https://www.uppmax.uu.se/'
+}
+/* Additional description:
  * Defines reference genomes, using iGenome paths
  * Imported under the default 'standard' Nextflow
  * profile in nextflow.config

conf/uzh.config

@@ -1,8 +1,10 @@
-/*
- * --------------------------------------------------------------------------------
- * Nextflow config file for use with Singularity on University of Zurich Cluster
- * --------------------------------------------------------------------------------
- */
+//Profile config names for nf-core/configs
+params {
+  config_profile_name = 'UZH'
+  config_profile_description = 'UZH science cloud profile provided by nf-core/configs'
+  config_profile_contact = 'Judith Neukamm/Alexander Peltzer (@JudithNeukamm, @apeltzer)'
+  config_profile_url = 'https://www.id.uzh.ch/en/scienceit/infrastructure/sciencecloud.html'
+}
 singularity {
   enabled = true

docs/binac.md Normal file

@@ -0,0 +1,19 @@
# nf-core/configs: BINAC Configuration
All nf-core pipelines have been successfully configured for use on the BINAC cluster in Tuebingen.
To use, run the pipeline with `-profile binac`. This will download and launch the [`binac.config`](../conf/binac.config) which has been pre-configured with a setup suitable for the BINAC cluster. Using this profile, Nextflow will download a singularity image with all of the required software before execution of the pipeline.
Before running the pipeline you will need to load Nextflow and Singularity using the environment module system on BINAC cluster. You can do this by issuing the commands below:
```bash
## Load Nextflow and Singularity environment modules
module purge
module load devel/java_jdk/1.8.0u112
module load devel/singularity/3.0.1
```
>NB: You will need an account to use the HPC cluster BINAC in order to run the pipeline. If in doubt contact IT.
>NB: Nextflow will need to submit the jobs via the job scheduler to the HPC cluster and as such the commands above will have to be executed on one of the login nodes. If in doubt contact IT.

docs/cfc.md Normal file

@@ -0,0 +1,19 @@
# nf-core/configs: CFC Configuration
All nf-core pipelines have been successfully configured for use on the CFC cluster at QBIC.
To use, run the pipeline with `-profile cfc`. This will download and launch the [`cfc.config`](../conf/cfc.config) which has been pre-configured with a setup suitable for the CFC cluster. Using this profile, Nextflow will download a singularity image with all of the required software before execution of the pipeline.
Before running the pipeline you will need to load Nextflow and Singularity using the environment module system on CFC cluster. You can do this by issuing the commands below:
```bash
## Load Nextflow and Singularity environment modules
module purge
module load devel/java_jdk/1.8.0u121
module load qbic/singularity_slurm/3.0.1
```
>NB: You will need an account to use the HPC cluster CFC in order to run the pipeline. If in doubt contact IT.
>NB: Nextflow will need to submit the jobs via the job scheduler to the HPC cluster and as such the commands above will have to be executed on one of the login nodes. If in doubt contact IT.

docs/crick.md

@@ -11,9 +11,6 @@ Before running the pipeline you will need to load Nextflow and Singularity using
 module purge
 module load Nextflow/0.32.0
 module load Singularity/2.6.0-foss-2016b
-## Example command for nf-core/atacseq
-nextflow run nf-core/atacseq -profile crick --genome GRCh37 --design /path/to/design.csv --email test.user@crick.ac.uk
 ```
 A local copy of the iGenomes resource has been made available on CAMP so you should be able to run the pipeline against any reference available in the `igenomes.config` specific to the nf-core pipeline. You can do this by simply using the `--genome <GENOME_ID>` parameter. Some of the more exotic genomes may not have been downloaded onto CAMP so have a look in the `igenomes_base` path specified in [`crick.config`](../conf/crick.config), and if your genome of interest isn't present please contact [BABS](mailto:bioinformatics@crick.ac.uk).

docs/template.md Normal file

@@ -0,0 +1,26 @@
# nf-core/configs: PROFILE Configuration
All nf-core pipelines have been successfully configured for use on the PROFILE CLUSTER at the insert institution here.
To use, run the pipeline with `-profile PROFILENAME`. This will download and launch the [`profile.config`](../conf/profile.config) which has been pre-configured with a setup suitable for the PROFILE cluster. Using this profile, Nextflow will download a singularity image with all of the required software before execution of the pipeline.
## Below is non-mandatory information, e.g. on modules to load
Before running the pipeline you will need to load Nextflow and Singularity using the environment module system on PROFILE CLUSTER. You can do this by issuing the commands below:
```bash
## Load Nextflow and Singularity environment modules
module purge
module load Nextflow/0.32.0
module load Singularity/2.6.0
```
## Below is non-mandatory information on iGenomes-specific configuration
A local copy of the iGenomes resource has been made available on PROFILE CLUSTER so you should be able to run the pipeline against any reference available in the `igenomes.config` specific to the nf-core pipeline.
You can do this by simply using the `--genome <GENOME_ID>` parameter.
>NB: You will need an account to use the HPC cluster on PROFILE CLUSTER in order to run the pipeline. If in doubt contact IT.
>NB: Nextflow will need to submit the jobs via the job scheduler to the HPC cluster and as such the commands above will have to be executed on one of the login nodes. If in doubt contact IT.

docs/uzh.md Normal file

@@ -0,0 +1,10 @@
# nf-core/configs: UZH Configuration
All nf-core pipelines have been successfully configured for use on the UZH cluster at the University of Zurich.
To use, run the pipeline with `-profile uzh`. This will download and launch the [`uzh.config`](../conf/uzh.config) which has been pre-configured with a setup suitable for the UZH cluster. Using this profile, Nextflow will download a singularity image with all of the required software before execution of the pipeline.
>NB: You will need an account to use the HPC cluster UZH in order to run the pipeline. If in doubt contact IT.
>NB: Nextflow will need to submit the jobs via the job scheduler to the HPC cluster and as such the commands above will have to be executed on one of the login nodes. If in doubt contact IT.
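The UZH document gives no example invocation; a minimal sketch based on the nf-core/rnaseq command shown earlier in this changeset (the pipeline choice, read paths and genome are placeholders):

```bash
## Example command for nf-core/rnaseq using the UZH profile
nextflow run nf-core/rnaseq --reads '*_R{1,2}.fastq.gz' --genome GRCh37 -profile uzh
```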

nfcore_custom.config

@@ -19,6 +19,7 @@ profiles {
   hebbe { includeConfig "${config_base}/hebbe.config" }
   mendel { includeConfig "${config_base}/mendel.config" }
   phoenix { includeConfig "${config_base}/pheonix.config" }
+  shh { includeConfig "${config_base}/shh.config" }
   uct_hex { includeConfig "${config_base}/uct_hex.config" }
   uppmax_devel { includeConfig "${config_base}/uppmax.config"
                  includeConfig "${config_base}/uppmax-devel.config"
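Following the entries above, an institution registering a new profile adds a single line to the `profiles` block pointing at its config file; `mycluster` here is a hypothetical name that must match the file added under `conf/`:

```nextflow
profiles {
  // Hypothetical profile entry; conf/mycluster.config must exist in the repository
  mycluster { includeConfig "${config_base}/mycluster.config" }
}
```

Because `config_base` points at the raw GitHub URL of `nf-core/configs`, the referenced file is downloaded at run-time whenever a user passes `-profile mycluster`.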