Mirror of https://github.com/MillironX/nf-configs.git, synced 2024-12-25 19:48:16 +00:00

Merge pull request #258 from ameynert/master

Update eddie config docs & added viralrecon config

Commit 6ede44b221: 3 changed files with 25 additions and 8 deletions
conf/pipeline/viralrecon/eddie.config (new file, 13 lines)
@@ -0,0 +1,13 @@
+env {
+    BLASTDB_LMDB_MAP_SIZE=100000000
+}
+
+process {
+    withName : '.*PICARD.*' {
+        clusterOptions = {"-l h_vmem=${(task.memory + 4.GB).bytes/task.cpus}"}
+    }
+
+    withName : '.*SNPEFF.*' {
+        clusterOptions = {"-l h_vmem=${(task.memory + 4.GB).bytes/task.cpus}"}
+    }
+}
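As a worked illustration of the `clusterOptions` closure above (an editorial sketch, not part of the diff): for a hypothetical PICARD task requesting 4 CPUs and 8.GB of memory, the closure adds a 4 GiB overhead and divides by the CPU count, since SGE's `h_vmem` is requested per slot.

```bash
# Hypothetical task: 4 CPUs, 8.GB memory. Nextflow memory units are binary,
# so 8.GB + 4.GB = 12 GiB = 12884901888 bytes, divided across 4 slots.
echo $(( (8 + 4) * 1024 * 1024 * 1024 / 4 ))    # prints 3221225472

# The closure would therefore expand to the submission option:
#   -l h_vmem=3221225472
```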
@@ -1,10 +1,10 @@
 # nf-core/configs: Eddie Configuration

-nf-core pipelines sarek, rnaseq, and atacseq have all been tested on the University of Edinburgh Eddie HPC.
+nf-core pipelines sarek, rnaseq, atacseq, and viralrecon have all been tested on the University of Edinburgh Eddie HPC. All except atacseq have pipeline-specific config files; atacseq does not yet support this.

 ## Getting help

-There is a Slack channel dedicated to eddie users on the MRC IGMM Slack: [https://igmm.slack.com/channels/eddie3](https://igmm.slack.com/channels/eddie3)
+There is a Slack channel dedicated to eddie users on the MRC IGC Slack: [https://igmm.slack.com/channels/eddie3](https://igmm.slack.com/channels/eddie3)

 ## Using the Eddie config profile

@@ -35,13 +35,13 @@ This config enables Nextflow to manage the pipeline jobs via the SGE job schedul

 ## Singularity set-up

-Load Singularity from the module system and, if you have access to `/exports/igmm/eddie/NextGenResources`, set the Singularity cache directory to the NextGenResources path below. If some containers for your pipeline run are not present, please contact the [IGMM Data Manager](data.manager@igmm.ed.ac.uk) to have them added. You can add these lines to the file `$HOME/.bashrc`, or you can run these commands before you run an nf-core pipeline.
+Load Singularity from the module system and, if you have access to `/exports/igmm/eddie/BioinformaticsResources`, set the Singularity cache directory to the BioinformaticsResources path below. If some containers for your pipeline run are not present, please contact the [IGC Data Manager](data.manager@igc.ed.ac.uk) to have them added. You can add these lines to the file `$HOME/.bashrc`, or you can run these commands before you run an nf-core pipeline.

-If you do not have access to `/exports/igmm/eddie/NextGenResources`, set the Singularity cache directory to somewhere sensible that is not in your `$HOME` area (which has limited space). It will take time to download all the Singularity containers, but you can use this again.
+If you do not have access to `/exports/igmm/eddie/BioinformaticsResources`, set the Singularity cache directory to somewhere sensible that is not in your `$HOME` area (which has limited space). It will take time to download all the Singularity containers, but you can use this again.

 ```bash
 module load singularity
-export NXF_SINGULARITY_CACHEDIR="/exports/igmm/eddie/NextGenResources/nextflow/singularity"
+export NXF_SINGULARITY_CACHEDIR="/exports/igmm/eddie/BioinformaticsResources/nf-core/singularity-images"
 ```

 Singularity will create a directory `.singularity` in your `$HOME` directory on eddie. Space on `$HOME` is very limited, so it is a good idea to create a directory somewhere else with more room and link the locations.
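For users without `/exports/igmm/eddie/BioinformaticsResources` access, a minimal sketch of the alternative set-up described in the hunk above; the cache path is only an example and should point at any location outside `$HOME` with enough space.

```bash
# Example only: pick a cache directory outside $HOME (group or scratch space).
module load singularity
export NXF_SINGULARITY_CACHEDIR="/exports/eddie/scratch/$USER/singularity-images"
mkdir -p "$NXF_SINGULARITY_CACHEDIR"
```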
|
@@ -68,7 +68,7 @@ If your eddie terminal disconnects your Nextflow job will stop. You can run Next
 nohup ./nextflow_run.sh &
 ```

-### On a wild west node - IGMM only
+### On a wild west node - IGC only

 Wild west nodes on eddie can be accessed via ssh (node2c15, node2c16, node3g22). To run Nextflow on one of these nodes, do it within a [screen session](https://linuxize.com/post/how-to-use-linux-screen/).
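The `nohup ./nextflow_run.sh &` line and the screen-session advice in this hunk assume a small wrapper script; the script contents and session name below are illustrative, not part of this PR.

```bash
# Illustrative nextflow_run.sh contents (assumed, not from the diff):
#   #!/usr/bin/env bash
#   module load singularity
#   export NXF_SINGULARITY_CACHEDIR="/exports/igmm/eddie/BioinformaticsResources/nf-core/singularity-images"
#   nextflow run nf-core/viralrecon -profile eddie -resume "$@"

# On a wild west node, run it inside a named screen session so the run
# survives a dropped connection:
screen -S nf_run        # start a session named nf_run
./nextflow_run.sh       # launch the pipeline inside the session
# detach with Ctrl-A then D; reattach later with:
screen -r nf_run
```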
|
@@ -92,12 +92,12 @@ screen -r <session_name>

 ## Using iGenomes references

-A local copy of the iGenomes resource has been made available on the Eddie HPC for those with access to `/exports/igmm/eddie/NextGenResources` so you should be able to run the pipeline against any reference available in the `igenomes.config`.
+A local copy of the iGenomes resource has been made available on the Eddie HPC for those with access to `/exports/igmm/eddie/BioinformaticsResources` so you should be able to run the pipeline against any reference available in the `igenomes.config`.
 You can do this by simply using the `--genome <GENOME_ID>` parameter.

 ## Adjusting maximum resources

-This config is set for IGMM standard nodes which have 32 cores and 384GB memory. If you are a non-IGMM user, please see the [ECDF specification](https://www.wiki.ed.ac.uk/display/ResearchServices/Memory+Specification) and adjust the `--clusterOptions` flag appropriately, e.g.
+This config is set for IGC standard nodes which have 32 cores and 384GB memory. If you are a non-IGC user, please see the [ECDF specification](https://www.wiki.ed.ac.uk/display/ResearchServices/Memory+Specification) and adjust the `--clusterOptions` flag appropriately, e.g.

 ```bash
 --clusterOptions "-C mem256GB" --max_memory "256GB"
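As an illustration of the resource-adjustment advice in this hunk, the flags would typically be appended to the run command; the pipeline name and input below are placeholders.

```bash
# Placeholder pipeline/input: cap Nextflow at 256 GB and request the matching
# non-IGC node class via the cluster option shown in the docs.
nextflow run nf-core/viralrecon \
    -profile eddie \
    --input samplesheet.csv \
    --clusterOptions "-C mem256GB" \
    --max_memory "256GB"
```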
@@ -5,3 +5,7 @@
 */

 includeConfig "${params.custom_config_base}/conf/pipeline/viralrecon/genomes.config"
+
+profiles {
+    eddie { includeConfig "${params.custom_config_base}/conf/pipeline/viralrecon/eddie.config" }
+}
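This final hunk (presumably the shared viralrecon pipeline config in the custom-configs repo) is what wires the new `eddie.config` in: selecting the `eddie` profile makes the pipeline-level config include the eddie-specific overrides. A quick way to sanity-check the wiring, assuming Nextflow can reach the custom config repo:

```bash
# Print the configuration a viralrecon run would resolve with the eddie
# profile; the PICARD/SNPEFF clusterOptions from eddie.config should appear
# in the output if the include works as intended.
nextflow config nf-core/viralrecon -profile eddie
```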
|