# nf-core/configs: UPPMAX Configuration
All nf-core pipelines have been successfully configured for use on the Swedish UPPMAX clusters.
## Getting help
We have a Slack channel dedicated to UPPMAX users on the nf-core Slack: [https://nfcore.slack.com/channels/uppmax](https://nfcore.slack.com/channels/uppmax)
## Using the UPPMAX config profile
Before running the pipeline you will need to either install `Nextflow` or load it using the environment module system (this can be done with e.g. `module load bioinfo-tools Nextflow/<VERSION>` where `VERSION` is e.g. `20.10`).
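
For example, using the environment module system on a login node (`module spider` is the same lookup used later in this guide):

```bash
# List the available Nextflow versions, then load one of them
$ module spider Nextflow
$ module load bioinfo-tools Nextflow/<VERSION>
```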
To use, run the pipeline with `-profile uppmax` (one hyphen).
This will download and launch the [`uppmax.config`](../conf/uppmax.config) which has been pre-configured with a setup suitable for the UPPMAX servers.
It will enable `Nextflow` to manage the pipeline jobs via the `Slurm` job scheduler.
Using this profile, the `Docker` images containing the required software will be downloaded and converted to `Singularity` images if needed before the pipeline is executed.
Recent versions of `Nextflow` also support the environment variable `NXF_SINGULARITY_CACHEDIR`, which can be used to supply pre-downloaded images.
Images for some `nf-core` pipelines are available under `/sw/data/ToolBox/nf-core/` and can be used by pointing `NXF_SINGULARITY_CACHEDIR` at that directory.
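
For example, before launching the pipeline:

```bash
# Point Nextflow at the centrally provided nf-core Singularity images
$ export NXF_SINGULARITY_CACHEDIR=/sw/data/ToolBox/nf-core/
```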
In addition to this config profile, you will also need to specify an UPPMAX project id.
You can do this with the `--project` flag (two hyphens) when launching `Nextflow`.
For example:
```bash
# Launch a nf-core pipeline with the uppmax profile for the project id snic2018-1-234
$ nextflow run nf-core/<PIPELINE> -profile uppmax --project snic2018-1-234 [...]
```
> NB: If you're not sure what your UPPMAX project ID is, try running `groups` or checking SUPR.

Just run `Nextflow` on a login node and it will handle everything else.
2019-01-07 13:25:06 +00:00
Remember to use `-bg` to launch `Nextflow` in the background, so that the pipeline doesn't exit if you leave your terminal session.
Alternatively, you can also launch `Nextflow` in a `screen` or a `tmux` session.
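
For example, reusing the launch command from above:

```bash
# Launch in the background so the run survives if your terminal session ends
$ nextflow run nf-core/<PIPELINE> -profile uppmax --project snic2018-1-234 -bg [...]
```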
## Using AWS iGenomes references
A local copy of the `AWS iGenomes` resource has been made available on all UPPMAX clusters so you should be able to run the pipeline against any reference available in the `conf/igenomes.config`.
You can do this by simply using the `--genome <GENOME_ID>` parameter.
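
For example, to use the `GRCh38` build (assuming the pipeline supports iGenomes references):

```bash
# Use the local AWS iGenomes copy via the --genome parameter
$ nextflow run nf-core/<PIPELINE> -profile uppmax --project snic2018-1-234 --genome GRCh38 [...]
```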
## Getting more memory
If your `nf-core` pipeline run is running out of memory, you can run on a fat node with more memory using the following `Nextflow` flags:
```bash
--clusterOptions "-C mem256GB -p node" --max_memory "256GB"
```
This raises the ceiling of available memory from the default of `128.GB` to `256.GB`.
`rackham` has nodes with 128GB, 256GB and 1TB memory available.
Note that each job will still start with the same request as normal, but restarted attempts with larger requests will be able to request greater amounts of memory.
All jobs will be submitted to fat nodes using this method, so it's only for use in extreme circumstances.
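
Put together, a full launch command could look like the following sketch (reusing the example project ID from above):

```bash
# Run on a 256 GB fat node and raise the pipeline memory ceiling to match
$ nextflow run nf-core/<PIPELINE> -profile uppmax --project snic2018-1-234 \
    --clusterOptions "-C mem256GB -p node" --max_memory "256GB" [...]
```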
## Different UPPMAX clusters
The UPPMAX nf-core configuration profile uses the `hostname` of the active environment to automatically apply the following resource limits:
- `rackham`
- cpus available: 20 cpus
- memory available: 125 GB
- `bianca`
- cpus available: 16 cpus
- memory available: 109 GB
- `miarka`
- cpus available: 48 cpus
- memory available: 357 GB
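
If you are unsure which limits will apply, you can check the hostname of your current environment:

```bash
# The profile selects its resource limits based on the cluster hostname
$ hostname
```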
## Development config
If doing pipeline development work on UPPMAX, the `devel` profile allows for faster testing.
Applied after the main UPPMAX config, it overwrites certain parts of the config and submits jobs to the `devcore` queue, which has much shorter queue times.
To be eligible for this queue, jobs are limited to 1 hour and only one job is allowed at a time.
It is not suitable for use with real data.
To use it, submit with `-profile uppmax,devel`.
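
For example, with `<PROJECT_ID>` standing in for your UPPMAX project:

```bash
# Submit a quick development run to the devcore queue
$ nextflow run nf-core/<PIPELINE> -profile uppmax,devel --project <PROJECT_ID> [...]
```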
## Running on bianca
> :warning: For more information, please refer to the following guides:
>
> - [UPPMAX `bianca` user guide](http://uppmax.uu.se/support/user-guides/bianca-user-guide/)
> - [nf-core guide for running offline](https://nf-co.re/usage/offline)
> - [nf-core `tools` guide for downloading pipelines for offline use](https://nf-co.re/tools#downloading-pipelines-for-offline-use)
> - [UPPMAX `Singularity` guide](https://www.uppmax.uu.se/support-sv/user-guides/singularity-user-guide/)

For security reasons, there is no internet access on `bianca`, so you can't download files from or upload files to the cluster directly.
Before running an nf-core pipeline on `bianca`, you will first have to download the pipeline and the needed singularity images elsewhere, and transfer them via the `wharf` area to your own `bianca` project.
In this guide, we use `rackham` to download and transfer files to the `wharf` area, but it can also be done on your own computer.
If you use `rackham` to download the pipeline and the singularity containers, we recommend using an interactive session (see the [interactive guide](https://www.uppmax.uu.se/support/faq/running-jobs-faq/how-can-i-run-interactively-on-a-compute-node/)), which is what we do in the following guide.
### Download and install Nextflow
You can use the `Nextflow` `module` provided by UPPMAX:
```bash
# Connect to bianca
$ ssh -A <USER>-<BIANCA_PROJECT>@bianca.uppmax.uu.se
# See the available versions for the module
$ module spider Nextflow
# Load a specific version of the Nextflow module
$ module load bioinfo-tools Nextflow/<VERSION>
```

If you need a more recent version, you can instead download a `Nextflow` release on `rackham` (or your own computer) and transfer it to `bianca` via the `wharf` area:

```bash
# Connect to rackham
$ ssh -X <USER>@rackham.uppmax.uu.se
# Or stay in your terminal

# Download the nextflow-all bundle
$ wget https://github.com/nextflow-io/nextflow/releases/download/v<NEXTFLOW_VERSION>/nextflow-<NEXTFLOW_VERSION>-all

# Connect to the wharf area using sftp
$ sftp <USER>-<BIANCA_PROJECT>@bianca-sftp.uppmax.uu.se:<USER>-<BIANCA_PROJECT>

# Transfer nextflow to the wharf area
sftp> put nextflow-<NEXTFLOW_VERSION>-all .

# Exit sftp
sftp> exit

# Connect to bianca
$ ssh -A <USER>-<BIANCA_PROJECT>@bianca.uppmax.uu.se

# Go to your project
$ cd /castor/project/proj_nobackup

# Make a folder for Nextflow
$ mkdir -p tools/nextflow

# Move Nextflow from the wharf area to its directory
$ mv /castor/project/proj_nobackup/wharf/<USER>/<USER>-<BIANCA_PROJECT>/nextflow-<NEXTFLOW_VERSION>-all /castor/project/proj_nobackup/tools/nextflow

# Make it executable
$ chmod a+x /castor/project/proj_nobackup/tools/nextflow/nextflow-<NEXTFLOW_VERSION>-all

# If you want other people to use it,
# be sure that your group has rights to the directory as well
$ chown -R .<BIANCA_PROJECT> /castor/project/proj_nobackup/tools/nextflow/nextflow-<NEXTFLOW_VERSION>-all

# Make a link to it
$ ln -s /castor/project/proj_nobackup/tools/nextflow/nextflow-<NEXTFLOW_VERSION>-all /castor/project/proj_nobackup/tools/nextflow/nextflow

# And every time you're launching Nextflow, don't forget to export the following ENV variables
# Or add them to your .bashrc file
$ export NXF_HOME=/castor/project/proj_nobackup/tools/nextflow/
$ export PATH=${NXF_HOME}:${PATH}
$ export NXF_TEMP=$SNIC_TMP
$ export NXF_LAUNCHER=$SNIC_TMP
$ export NXF_SINGULARITY_CACHEDIR=/castor/project/proj_nobackup/singularity-images
```
### Install nf-core tools
You can use the `nf-core` `module` provided by UPPMAX:
```bash
# Connect to rackham
$ ssh -X <USER>@rackham.uppmax.uu.se
# See the available versions for the module
$ module spider nf-core
# Load a specific version of the nf-core module
$ module load bioinfo-tools nf-core/<VERSION>
```

If you need a more recent version, you can instead install the development version of `nf-core/tools` with `pip`:

```bash
# Connect to rackham
$ ssh -X <USER>@rackham.uppmax.uu.se
# Or stay in your terminal

# Install the latest development version of nf-core/tools with pip
$ pip3 install --upgrade --force-reinstall git+https://github.com/nf-core/tools.git@dev --user
```
### Download and transfer an nf-core pipeline
```bash
# Connect to rackham
$ ssh -X <USER>@rackham.uppmax.uu.se
# Or stay in your terminal

# Open an interactive session (if you are on rackham)
$ interactive <rackham_project>

# Download a pipeline with the singularity images
$ nf-core download <PIPELINE> -r <PIPELINE_VERSION> -s --compress none

# If necessary, extra singularity images can be downloaded separately
# For example, if you downloaded nf-core/sarek, you will need extra images for annotation
# Here we download the nf-core/sarek GRCh38 specific images
$ singularity pull --name nfcore-sareksnpeff-2.7.GRCh38.img docker://nfcore/sareksnpeff:2.7.GRCh38
$ singularity pull --name nfcore-sarekvep-2.7.GRCh38.img docker://nfcore/sarekvep:2.7.GRCh38

# These can then be moved into the nf-core/sarek download folder
$ mv *.img nf-core-sarek-2.7/singularity-images/.

# Connect to the wharf area using sftp
$ sftp <USER>-<BIANCA_PROJECT>@bianca-sftp.uppmax.uu.se:<USER>-<BIANCA_PROJECT>

# Transfer the <PIPELINE> folder from rackham to the wharf area
sftp> put -r nf-core-<PIPELINE>-<PIPELINE_VERSION> .

# The transferred files will be in the wharf folder of your user home on your bianca project
# Connect to bianca
$ ssh -A <USER>-<BIANCA_PROJECT>@bianca.uppmax.uu.se

# Go to your project
$ cd /castor/project/proj_nobackup

# Make and go into a nf-core directory (where you will store all nf-core pipelines)
$ mkdir nf-core
$ cd nf-core

# Copy the folder from the wharf area to the project
$ cp -r /castor/project/proj_nobackup/wharf/<USER>/<USER>-<BIANCA_PROJECT>/nf-core-<PIPELINE>-<PIPELINE_VERSION> .

# If you want other people to use it,
# be sure that your group has rights to the directory as well
$ chown -R .<BIANCA_PROJECT> nf-core-<PIPELINE>-<PIPELINE_VERSION>

# Make a symbolic link to the extracted repository
$ ln -s nf-core-<PIPELINE>-<PIPELINE_VERSION> nf-core-<PIPELINE>-default
```

The idea is that every member of your project can use the same `nf-core/<PIPELINE>` version at the same time.
So every member of the project who wants to use `nf-core/<PIPELINE>` will need to do the following:
```bash
# Connect to bianca
$ ssh -A <USER>-<BIANCA_PROJECT>@bianca.uppmax.uu.se
# Go to your user directory
$ cd /home/<USER>
# Make a symbolic link to the default nf-core/<PIPELINE>
$ ln -s /castor/project/proj_nobackup/nf-core/nf-core-<PIPELINE>-default nf-core-<PIPELINE>
```
And then `nf-core/<PIPELINE>` can be used with:
```bash
# Run <PIPELINE> on bianca
$ nextflow run ~/nf-core-<PIPELINE> -profile uppmax --project <BIANCA_PROJECT> --genome <GENOME_ASSEMBLY> ...
```
## Update a pipeline
To update, repeat the same steps as for installing, then update the link.
```bash
# Connect to bianca (Connect to rackham first if needed)
$ ssh -A <USER>-<BIANCA_PROJECT>@bianca.uppmax.uu.se
# Go to the nf-core directory in your project
$ cd /castor/project/proj_nobackup/nf-core
# Remove link
$ unlink nf-core-<PIPELINE>-default
# Link to new nf-core/<PIPELINE> version
$ ln -s nf-core-<PIPELINE>-<PIPELINE_VERSION> nf-core-<PIPELINE>-default
```
You can, for example, keep a `nf-core-<PIPELINE>-default` version that you are sure is working, and make a link for a `nf-core-<PIPELINE>-testing` or `nf-core-<PIPELINE>-development` version.