Mirror of https://github.com/MillironX/nf-core_modules.git (synced 2024-11-14 05:43:08 +00:00)

Commit 706b07246b: Merge branch 'nf-core:master' into master
207 changed files with 4178 additions and 569 deletions
.nf-core.yml (new file, 10 lines)
@@ -0,0 +1,10 @@
bump-versions:
  rseqc/junctionannotation: False
  rseqc/bamstat: False
  rseqc/readduplication: False
  rseqc/readdistribution: False
  rseqc/junctionsaturation: False
  rseqc/inferexperiment: False
  rseqc/innerdistance: False
  sortmerna: False
  malt/build: False
README.md (22 changed lines)

@@ -381,6 +381,8 @@ Please follow the steps below to run the tests locally:

- See [docs on running pytest-workflow](https://pytest-workflow.readthedocs.io/en/stable/#running-pytest-workflow) for more info.

+> :warning: if you have a module named `build` this can conflict with some pytest internal behaviour. This results in no tests being run (i.e. receiving a message of `collected 0 items`). In this case rename the `tests/<module>/build` directory to `tests/<module>/build_test`, and update the corresponding `test.yml` accordingly. An example can be seen with the [`bowtie2/build` module tests](https://github.com/nf-core/modules/tree/master/tests/modules/bowtie2/build_test).

### Uploading to `nf-core/modules`

[Fork](https://help.github.com/articles/fork-a-repo/) the `nf-core/modules` repository to your own GitHub account. Within the local clone of your fork add the module file to the [`modules/`](modules) directory. Please try and keep PRs as atomic as possible to aid the reviewing process - ideally, one module addition/update per PR.

@@ -429,6 +431,16 @@ using a combination of `bwa` and `samtools` to output a BAM file instead of a SA

- All function names MUST follow the `camelCase` convention.

+#### Input/output options
+
+- Input channel declarations MUST be defined for all _possible_ input files (i.e. both required and optional files).
+- Directly associated auxiliary files to an input file MAY be defined within the same input channel alongside the main input channel (e.g. [BAM and BAI](https://github.com/nf-core/modules/blob/e937c7950af70930d1f34bb961403d9d2aa81c7d/modules/samtools/flagstat/main.nf#L22)).
+- Other generic auxiliary files used across different input files (e.g. common reference sequences) MAY be defined using a dedicated input channel (e.g. [reference files](https://github.com/nf-core/modules/blob/3cabc95d0ed8a5a4e07b8f9b1d1f7ff9a70f61e1/modules/bwa/mem/main.nf#L21-L23)).
+- Named file extensions MUST be emitted for ALL output channels e.g. `path "*.txt", emit: txt`.
+- Optional inputs are not currently supported by Nextflow. However, passing an empty list (`[]`) instead of a file as a module parameter can be used to work around this issue (see the sketch below this list).
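As an illustration of the empty-list workaround mentioned in the last bullet above, a minimal sketch of calling a module that declares an optional reference input (the FOO module, its inputs and all file paths are hypothetical, not part of this commit):

// Hypothetical DSL2 snippet: FOO declares `tuple val(meta), path(bam)` plus an
// optional `path fasta` input; an empty list is passed in place of the unused file.
include { FOO } from './modules/foo/main' addParams( options: [:] )

workflow {
    input_ch = Channel.of( [ [ id:'test', single_end:false ], file('test.bam') ] )
    FOO ( input_ch, [] )   // [] instead of the optional fasta file
}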
#### Module parameters

- A module file SHOULD only define input and output files as command-line parameters to be executed within the process.

@@ -439,18 +451,16 @@ using a combination of `bwa` and `samtools` to output a BAM file instead of a SA

- Any parameters that need to be evaluated in the context of a particular sample e.g. single-end/paired-end data MUST also be defined within the process.

-#### Input/output options
-
-- Named file extensions MUST be emitted for ALL output channels e.g. `path "*.txt", emit: txt`.
-- Optional inputs are not currently supported by Nextflow. However, "fake files" MAY be used to work around this issue.

#### Resource requirements

- An appropriate resource `label` MUST be provided for the module as listed in the [nf-core pipeline template](https://github.com/nf-core/tools/blob/master/nf_core/pipeline-template/conf/base.config#L29-L46) e.g. `process_low`, `process_medium` or `process_high`.
- If the tool supports multi-threading then you MUST provide the appropriate parameter using the Nextflow `task` variable e.g. `--threads $task.cpus`.
+- If a module contains _multiple_ tools that support multi-threading (e.g. [piping output into a samtools command](https://github.com/nf-core/modules/blob/28b023e6f4d0d2745406d9dc6e38006882804e67/modules/bowtie2/align/main.nf#L32-L46)), you MUST assign cpus per tool such that the total number of used CPUs does not exceed `task.cpus` (see the sketch below this list).
+  - For example, combining two (or more) tools that both (all) have multi-threading, this can be assigned to the variable [`split_cpus`](https://github.com/nf-core/modules/blob/28b023e6f4d0d2745406d9dc6e38006882804e67/modules/bowtie2/align/main.nf#L32)
+  - If one tool is multi-threaded and another uses a single thread, you can specify directly in the command itself e.g. with [`${task.cpus - 1}`](https://github.com/nf-core/modules/blob/6e68c1af9a514bb056c0513ebba6764efd6750fc/modules/bwa/sampe/main.nf#L42-L43)
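A minimal sketch of the CPU-splitting idea described above (the process and tool names are illustrative, not the actual bowtie2/align module):

// Hypothetical process piping two multi-threaded tools; each gets roughly half of task.cpus.
process TOOL_A_PIPE_TOOL_B {
    input:
    tuple val(meta), path(reads)

    output:
    tuple val(meta), path("*.bam"), emit: bam

    script:
    def split_cpus = Math.max(1, task.cpus.intdiv(2))   // divide the allocation between the two tools
    """
    tool_a --threads ${split_cpus} $reads | tool_b --threads ${split_cpus} -o ${meta.id}.bam -
    """
}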
#### Software requirements

[BioContainers](https://biocontainers.pro/#/) is a registry of Docker and Singularity containers automatically created from all of the software packages on [Bioconda](https://bioconda.github.io/). Where possible we will use BioContainers to fetch pre-built software containers and Bioconda to install software using Conda.
@@ -11,11 +11,11 @@ process ALLELECOUNTER {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? "bioconda::cancerit-allelecount=4.2.1" : null)
+    conda (params.enable_conda ? 'bioconda::cancerit-allelecount=4.3.0' : null)
    if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/cancerit-allelecount:4.2.1--h3ecb661_0"
+        container "https://depot.galaxyproject.org/singularity/cancerit-allelecount:4.3.0--h41abebc_0"
    } else {
-        container "quay.io/biocontainers/cancerit-allelecount:4.2.1--h3ecb661_0"
+        container "quay.io/biocontainers/cancerit-allelecount:4.3.0--h41abebc_0"
    }

    input:
modules/arriba/functions.nf (new file, 68 lines)
@@ -0,0 +1,68 @@
//
// Utility functions used in nf-core DSL2 module files
//

//
// Extract name of software tool from process name using $task.process
//
def getSoftwareName(task_process) {
    return task_process.tokenize(':')[-1].tokenize('_')[0].toLowerCase()
}

//
// Function to initialise default values and to generate a Groovy Map of available options for nf-core modules
//
def initOptions(Map args) {
    def Map options = [:]
    options.args            = args.args ?: ''
    options.args2           = args.args2 ?: ''
    options.args3           = args.args3 ?: ''
    options.publish_by_meta = args.publish_by_meta ?: []
    options.publish_dir     = args.publish_dir ?: ''
    options.publish_files   = args.publish_files
    options.suffix          = args.suffix ?: ''
    return options
}

//
// Tidy up and join elements of a list to return a path string
//
def getPathFromList(path_list) {
    def paths = path_list.findAll { item -> !item?.trim().isEmpty() }     // Remove empty entries
    paths     = paths.collect { it.trim().replaceAll("^[/]+|[/]+\$", "") } // Trim whitespace and trailing slashes
    return paths.join('/')
}

//
// Function to save/publish module results
//
def saveFiles(Map args) {
    if (!args.filename.endsWith('.version.txt')) {
        def ioptions  = initOptions(args.options)
        def path_list = [ ioptions.publish_dir ?: args.publish_dir ]
        if (ioptions.publish_by_meta) {
            def key_list = ioptions.publish_by_meta instanceof List ? ioptions.publish_by_meta : args.publish_by_meta
            for (key in key_list) {
                if (args.meta && key instanceof String) {
                    def path = key
                    if (args.meta.containsKey(key)) {
                        path = args.meta[key] instanceof Boolean ? "${key}_${args.meta[key]}".toString() : args.meta[key]
                    }
                    path = path instanceof String ? path : ''
                    path_list.add(path)
                }
            }
        }
        if (ioptions.publish_files instanceof Map) {
            for (ext in ioptions.publish_files) {
                if (args.filename.endsWith(ext.key)) {
                    def ext_list = path_list.collect()
                    ext_list.add(ext.value)
                    return "${getPathFromList(ext_list)}/$args.filename"
                }
            }
        } else if (ioptions.publish_files == null) {
            return "${getPathFromList(path_list)}/$args.filename"
        }
    }
}
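To make the publishing logic above concrete, a small illustrative trace (the option values, directory and file names are made up, not part of the committed file). If pasted after the functions in a Groovy script, a call like this returns the relative publish path, while any `*.version.txt` filename falls through the first `if` and is therefore not published:

// Hypothetical values for illustration only
def published = saveFiles(
    filename        : 'sample1.bam',
    options         : [ publish_by_meta: ['id'], publish_files: [ 'bam': '' ] ],
    publish_dir     : 'bwa',
    meta            : [ id: 'sample1' ],
    publish_by_meta : ['id']
)
assert published == 'bwa/sample1/sample1.bam'   // software dir / meta.id / filename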
modules/arriba/main.nf (new file, 47 lines)
@@ -0,0 +1,47 @@
// Import generic module functions
include { initOptions; saveFiles; getSoftwareName } from './functions'

params.options = [:]
options        = initOptions(params.options)

process ARRIBA {
    tag "$meta.id"
    label 'process_medium'
    publishDir "${params.outdir}",
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

    conda (params.enable_conda ? "bioconda::arriba=2.1.0" : null)
    if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
        container "https://depot.galaxyproject.org/singularity/arriba:2.1.0--h3198e80_1"
    } else {
        container "quay.io/biocontainers/arriba:2.1.0--h3198e80_1"
    }

    input:
    tuple val(meta), path(bam)
    path fasta
    path gtf

    output:
    tuple val(meta), path("*.fusions.tsv")          , emit: fusions
    tuple val(meta), path("*.fusions.discarded.tsv"), emit: fusions_fail
    path "*.version.txt"                            , emit: version

    script:
    def software  = getSoftwareName(task.process)
    def prefix    = options.suffix ? "${meta.id}${options.suffix}" : "${meta.id}"
    def blacklist = (options.args.contains('-b')) ? '' : '-f blacklist'
    """
    arriba \\
        -x $bam \\
        -a $fasta \\
        -g $gtf \\
        -o ${prefix}.fusions.tsv \\
        -O ${prefix}.fusions.discarded.tsv \\
        $blacklist \\
        $options.args

    echo \$(arriba -h | grep 'Version:' 2>&1) | sed 's/Version:\s//' > ${software}.version.txt
    """
}
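For orientation, a rough sketch of how a pipeline might include and call this module (channel contents, file paths and the empty `options` map are illustrative, not part of this commit):

include { ARRIBA } from './modules/arriba/main' addParams( options: [:] )

workflow {
    bam   = Channel.of( [ [ id:'test', single_end:false ], file('test.bam') ] )   // tuple val(meta), path(bam)
    fasta = file('genome.fasta')
    gtf   = file('genome.gtf')

    ARRIBA ( bam, fasta, gtf )
    ARRIBA.out.fusions.view()   // tuple val(meta), path("*.fusions.tsv")
}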
modules/arriba/meta.yml (new file, 54 lines)
@@ -0,0 +1,54 @@
name: arriba
description: Arriba is a command-line tool for the detection of gene fusions from RNA-Seq data.
keywords:
  - fusion
  - arriba
tools:
  - arriba:
      description: Fast and accurate gene fusion detection from RNA-Seq data
      homepage: https://github.com/suhrig/arriba
      documentation: https://arriba.readthedocs.io/en/latest/
      tool_dev_url: https://github.com/suhrig/arriba
      doi: "10.1101/gr.257246.119"
      licence: ['MIT']

input:
  - meta:
      type: map
      description: |
        Groovy Map containing sample information
        e.g. [ id:'test', single_end:false ]
  - bam:
      type: file
      description: BAM/CRAM/SAM file
      pattern: "*.{bam,cram,sam}"
  - fasta:
      type: file
      description: Assembly FASTA file
      pattern: "*.{fasta}"
  - gtf:
      type: file
      description: Annotation GTF file
      pattern: "*.{gtf}"

output:
  - meta:
      type: map
      description: |
        Groovy Map containing sample information
        e.g. [ id:'test', single_end:false ]
  - version:
      type: file
      description: File containing software version
      pattern: "*.{version.txt}"
  - fusions:
      type: file
      description: File contains fusions which pass all of Arriba's filters.
      pattern: "*.{fusions.tsv}"
  - fusions_fail:
      type: file
      description: File contains fusions that Arriba classified as an artifact or that are also observed in healthy tissue.
      pattern: "*.{fusions.discarded.tsv}"

authors:
  - "@praveenraj2018"
modules/bcftools/concat/functions.nf (new file, 68 lines)
Identical to the shared nf-core functions.nf shown above under modules/arriba/functions.nf.
modules/bcftools/concat/main.nf (new file, 40 lines)
@@ -0,0 +1,40 @@
// Import generic module functions
include { initOptions; saveFiles; getSoftwareName } from './functions'

params.options = [:]
options        = initOptions(params.options)

process BCFTOOLS_CONCAT {
    tag "$meta.id"
    label 'process_medium'
    publishDir "${params.outdir}",
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

    conda (params.enable_conda ? "bioconda::bcftools=1.11" : null)
    if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
        container "https://depot.galaxyproject.org/singularity/bcftools:1.11--h7c999a4_0"
    } else {
        container "quay.io/biocontainers/bcftools:1.11--h7c999a4_0"
    }

    input:
    tuple val(meta), path(vcfs)

    output:
    tuple val(meta), path("*.gz"), emit: vcf
    path "*.version.txt"         , emit: version

    script:
    def software = getSoftwareName(task.process)
    prefix       = options.suffix ? "${meta.id}${options.suffix}" : "${meta.id}"
    """
    bcftools concat \\
        --output ${prefix}.vcf.gz \\
        $options.args \\
        --threads $task.cpus \\
        ${vcfs}

    echo \$(bcftools --version 2>&1) | sed 's/^.*bcftools //; s/ .*\$//' > ${software}.version.txt
    """
}
modules/bcftools/concat/meta.yml (new file, 42 lines)
@@ -0,0 +1,42 @@
name: bcftools_concat
description: Concatenate VCF files
keywords:
  - variant calling
  - concat
  - bcftools
  - VCF

tools:
  - concat:
      description: |
        Concatenate VCF files.
      homepage: http://samtools.github.io/bcftools/bcftools.html
      documentation: http://www.htslib.org/doc/bcftools.html
      doi: 10.1093/bioinformatics/btp352
input:
  - meta:
      type: map
      description: |
        Groovy Map containing sample information
        e.g. [ id:'test', single_end:false ]
  - vcfs:
      type: files
      description: |
        List containing 2 or more vcf files
        e.g. [ 'file1.vcf', 'file2.vcf' ]
output:
  - meta:
      type: map
      description: |
        Groovy Map containing sample information
        e.g. [ id:'test', single_end:false ]
  - vcf:
      type: file
      description: VCF concatenated output file
      pattern: "*.{vcf.gz}"
  - version:
      type: file
      description: File containing software version
      pattern: "*.{version.txt}"
authors:
  - "@abhi18av"
@@ -11,11 +11,11 @@ process BCFTOOLS_CONSENSUS {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? 'bioconda::bcftools=1.11' : null)
+    conda (params.enable_conda ? 'bioconda::bcftools=1.13' : null)
    if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container 'https://depot.galaxyproject.org/singularity/bcftools:1.11--h7c999a4_0'
+        container 'https://depot.galaxyproject.org/singularity/bcftools:1.13--h3a49de5_0'
    } else {
-        container 'quay.io/biocontainers/bcftools:1.11--h7c999a4_0'
+        container 'quay.io/biocontainers/bcftools:1.13--h3a49de5_0'
    }

    input:
@@ -11,11 +11,11 @@ process BCFTOOLS_FILTER {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? "bioconda::bcftools=1.11" : null)
+    conda (params.enable_conda ? 'bioconda::bcftools=1.13' : null)
    if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/bcftools:1.11--h7c999a4_0"
+        container "https://depot.galaxyproject.org/singularity/bcftools:1.13--h3a49de5_0"
    } else {
-        container "quay.io/biocontainers/bcftools:1.11--h7c999a4_0"
+        container "quay.io/biocontainers/bcftools:1.13--h3a49de5_0"
    }

    input:
@@ -11,11 +11,11 @@ process BCFTOOLS_ISEC {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? "bioconda::bcftools=1.11" : null)
+    conda (params.enable_conda ? 'bioconda::bcftools=1.13' : null)
    if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/bcftools:1.11--h7c999a4_0"
+        container "https://depot.galaxyproject.org/singularity/bcftools:1.13--h3a49de5_0"
    } else {
-        container "quay.io/biocontainers/bcftools:1.11--h7c999a4_0"
+        container "quay.io/biocontainers/bcftools:1.13--h3a49de5_0"
    }

    input:
@@ -11,11 +11,11 @@ process BCFTOOLS_MERGE {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? "bioconda::bcftools=1.11" : null)
+    conda (params.enable_conda ? 'bioconda::bcftools=1.13' : null)
    if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/bcftools:1.11--h7c999a4_0"
+        container "https://depot.galaxyproject.org/singularity/bcftools:1.13--h3a49de5_0"
    } else {
-        container "quay.io/biocontainers/bcftools:1.11--h7c999a4_0"
+        container "quay.io/biocontainers/bcftools:1.13--h3a49de5_0"
    }

    input:
@@ -11,11 +11,11 @@ process BCFTOOLS_MPILEUP {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? "bioconda::bcftools=1.11" : null)
+    conda (params.enable_conda ? 'bioconda::bcftools=1.13' : null)
    if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/bcftools:1.11--h7c999a4_0"
+        container "https://depot.galaxyproject.org/singularity/bcftools:1.13--h3a49de5_0"
    } else {
-        container "quay.io/biocontainers/bcftools:1.11--h7c999a4_0"
+        container "quay.io/biocontainers/bcftools:1.13--h3a49de5_0"
    }

    input:
modules/bcftools/norm/functions.nf (new file, 68 lines)
Identical to the shared nf-core functions.nf shown above under modules/arriba/functions.nf.
modules/bcftools/norm/main.nf (new file, 42 lines)
@@ -0,0 +1,42 @@
// Import generic module functions
include { initOptions; saveFiles; getSoftwareName } from './functions'

params.options = [:]
options        = initOptions(params.options)

process BCFTOOLS_NORM {
    tag "$meta.id"
    label 'process_medium'
    publishDir "${params.outdir}",
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

    conda (params.enable_conda ? "bioconda::bcftools=1.13" : null)
    if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
        container "https://depot.galaxyproject.org/singularity/bcftools:1.13--h3a49de5_0"
    } else {
        container "quay.io/biocontainers/bcftools:1.13--h3a49de5_0"
    }

    input:
    tuple val(meta), path(vcf)
    path(fasta)

    output:
    tuple val(meta), path("*.gz") , emit: vcf
    path "*.version.txt"          , emit: version

    script:
    def software = getSoftwareName(task.process)
    def prefix   = options.suffix ? "${meta.id}${options.suffix}" : "${meta.id}"
    """
    bcftools norm \\
        --fasta-ref ${fasta} \\
        --output ${prefix}.vcf.gz \\
        $options.args \\
        --threads $task.cpus \\
        ${vcf}

    echo \$(bcftools --version 2>&1) | sed 's/^.*bcftools //; s/ .*\$//' > ${software}.version.txt
    """
}
modules/bcftools/norm/meta.yml (new file, 45 lines)
@@ -0,0 +1,45 @@
name: bcftools_norm
description: Normalize VCF file
keywords:
  - normalize
  - norm
  - variant calling
  - VCF
tools:
  - norm:
      description: |
        Normalize VCF files.
      homepage: http://samtools.github.io/bcftools/bcftools.html
      documentation: http://www.htslib.org/doc/bcftools.html
      doi: 10.1093/bioinformatics/btp352
input:
  - meta:
      type: map
      description: |
        Groovy Map containing sample information
        e.g. [ id:'test', single_end:false ]
  - vcf:
      type: file
      description: |
        The vcf file to be normalized
        e.g. 'file1.vcf'
  - fasta:
      type: file
      description: FASTA reference file
      pattern: "*.{fasta,fa}"
output:
  - meta:
      type: map
      description: |
        Groovy Map containing sample information
        e.g. [ id:'test', single_end:false ]
  - vcf:
      type: file
      description: VCF normalized output file
      pattern: "*.{vcf.gz}"
  - version:
      type: file
      description: File containing software version
      pattern: "*.{version.txt}"
authors:
  - "@abhi18av"
modules/bcftools/reheader/functions.nf (new file, 68 lines)
Identical to the shared nf-core functions.nf shown above under modules/arriba/functions.nf.
modules/bcftools/reheader/main.nf (new file, 47 lines)
@@ -0,0 +1,47 @@
// Import generic module functions
include { initOptions; saveFiles; getSoftwareName } from './functions'

params.options = [:]
options        = initOptions(params.options)

process BCFTOOLS_REHEADER {
    tag "$meta.id"
    label 'process_low'
    publishDir "${params.outdir}",
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

    conda (params.enable_conda ? "bioconda::bcftools=1.13" : null)
    if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
        container "https://depot.galaxyproject.org/singularity/bcftools:1.13--h3a49de5_0"
    } else {
        container "quay.io/biocontainers/bcftools:1.13--h3a49de5_0"
    }

    input:
    tuple val(meta), path(vcf)
    path fai
    path header

    output:
    tuple val(meta), path("*.vcf.gz"), emit: vcf
    path "*.version.txt"             , emit: version

    script:
    def software         = getSoftwareName(task.process)
    def prefix           = options.suffix ? "${meta.id}${options.suffix}" : "${meta.id}"
    def update_sequences = fai ? "-f $fai" : ""
    def new_header       = header ? "-h $header" : ""
    """
    bcftools \\
        reheader \\
        $update_sequences \\
        $new_header \\
        $options.args \\
        --threads $task.cpus \\
        -o ${prefix}.vcf.gz \\
        $vcf

    echo \$(bcftools --version 2>&1) | sed 's/^.*bcftools //; s/ .*\$//' > ${software}.version.txt
    """
}
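The `fai` and `header` inputs here are effectively optional (each flag is only added to the command when the corresponding input is truthy), so this module pairs naturally with the empty-list workaround described in the README section above. A rough sketch of calling it with only a new header (paths and the empty `options` map are illustrative, not part of this commit):

include { BCFTOOLS_REHEADER } from './modules/bcftools/reheader/main' addParams( options: [:] )

workflow {
    vcf    = Channel.of( [ [ id:'test', single_end:false ], file('test.vcf.gz') ] )
    header = file('new_header.txt')

    BCFTOOLS_REHEADER ( vcf, [], header )   // [] in place of the unused fasta index (fai) input
}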
modules/bcftools/reheader/meta.yml (new file, 51 lines)
@@ -0,0 +1,51 @@
name: bcftools_reheader
description: Reheader a VCF file
keywords:
  - reheader
  - vcf
  - update header
tools:
  - reheader:
      description: |
        Modify header of VCF/BCF files, change sample names.
      homepage: http://samtools.github.io/bcftools/bcftools.html
      documentation: http://samtools.github.io/bcftools/bcftools.html#reheader
      doi: 10.1093/gigascience/giab008
      licence: ['GPL']

input:
  - meta:
      type: map
      description: |
        Groovy Map containing sample information
        e.g. [ id:'test', single_end:false ]
  - vcf:
      type: file
      description: VCF/BCF file
      pattern: "*.{vcf.gz,vcf,bcf}"
  - fai:
      type: file
      description: Fasta index to update header sequences with
      pattern: "*.{fai}"
  - header:
      type: file
      description: New header to add to the VCF
      pattern: "*.{header.txt}"

output:
  - meta:
      type: map
      description: |
        Groovy Map containing sample information
        e.g. [ id:'test', single_end:false ]
  - version:
      type: file
      description: File containing software version
      pattern: "*.{version.txt}"
  - vcf:
      type: file
      description: VCF with updated header
      pattern: "*.{vcf.gz}"

authors:
  - "@bjohnnyd"
@@ -11,11 +11,11 @@ process BCFTOOLS_STATS {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? "bioconda::bcftools=1.11" : null)
+    conda (params.enable_conda ? 'bioconda::bcftools=1.13' : null)
    if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/bcftools:1.11--h7c999a4_0"
+        container "https://depot.galaxyproject.org/singularity/bcftools:1.13--h3a49de5_0"
    } else {
-        container "quay.io/biocontainers/bcftools:1.11--h7c999a4_0"
+        container "quay.io/biocontainers/bcftools:1.13--h3a49de5_0"
    }

    input:
@@ -11,11 +11,11 @@ process BLAST_BLASTN {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? 'bioconda::blast=2.10.1' : null)
+    conda (params.enable_conda ? 'bioconda::blast=2.12.0' : null)
    if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container 'https://depot.galaxyproject.org/singularity/blast:2.10.1--pl526he19e7b1_3'
+        container 'https://depot.galaxyproject.org/singularity/blast:2.12.0--pl5262h3289130_0'
    } else {
-        container 'quay.io/biocontainers/blast:2.10.1--pl526he19e7b1_3'
+        container 'quay.io/biocontainers/blast:2.12.0--pl5262h3289130_0'
    }

    input:
@@ -11,11 +11,11 @@ process BLAST_MAKEBLASTDB {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:[:], publish_by_meta:[]) }

-    conda (params.enable_conda ? 'bioconda::blast=2.10.1' : null)
+    conda (params.enable_conda ? 'bioconda::blast=2.12.0' : null)
    if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container 'https://depot.galaxyproject.org/singularity/blast:2.10.1--pl526he19e7b1_3'
+        container 'https://depot.galaxyproject.org/singularity/blast:2.12.0--pl5262h3289130_0'
    } else {
-        container 'quay.io/biocontainers/blast:2.10.1--pl526he19e7b1_3'
+        container 'quay.io/biocontainers/blast:2.12.0--pl5262h3289130_0'
    }

    input:
@@ -11,11 +11,11 @@ process BOWTIE2_BUILD {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:'index', meta:[:], publish_by_meta:[]) }

-    conda (params.enable_conda ? 'bioconda::bowtie2=2.4.2' : null)
+    conda (params.enable_conda ? 'bioconda::bowtie2=2.4.4' : null)
    if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container 'https://depot.galaxyproject.org/singularity/bowtie2:2.4.2--py38h1c8e9b9_1'
+        container 'https://depot.galaxyproject.org/singularity/bowtie2:2.4.4--py39hbb4e92a_0'
    } else {
-        container 'quay.io/biocontainers/bowtie2:2.4.2--py38h1c8e9b9_1'
+        container 'quay.io/biocontainers/bowtie2:2.4.4--py36hd4290be_0'
    }

    input:
modules/bwa/aln/functions.nf (new file, 68 lines)
Identical to the shared nf-core functions.nf shown above under modules/arriba/functions.nf.
modules/bwa/aln/main.nf (new file, 67 lines)
@@ -0,0 +1,67 @@
// Import generic module functions
include { initOptions; saveFiles; getSoftwareName } from './functions'

params.options = [:]
options        = initOptions(params.options)

process BWA_ALN {
    tag "$meta.id"
    label 'process_medium'
    publishDir "${params.outdir}",
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

    conda (params.enable_conda ? "bioconda::bwa=0.7.17" : null)
    if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
        container "https://depot.galaxyproject.org/singularity/bwa:0.7.17--h5bf99c6_8"
    } else {
        container "quay.io/biocontainers/bwa:0.7.17--h5bf99c6_8"
    }

    input:
    tuple val(meta), path(reads)
    path index

    output:
    tuple val(meta), path("*.sai"), emit: sai
    path "*.version.txt"          , emit: version

    script:
    def software = getSoftwareName(task.process)
    def prefix   = options.suffix ? "${meta.id}${options.suffix}" : "${meta.id}"

    if (meta.single_end) {
        """
        INDEX=`find -L ./ -name "*.amb" | sed 's/.amb//'`

        bwa aln \\
            $options.args \\
            -t $task.cpus \\
            -f ${prefix}.sai \\
            \$INDEX \\
            ${reads}

        echo \$(bwa 2>&1) | sed 's/^.*Version: //; s/Contact:.*\$//' > ${software}.version.txt
        """
    } else {
        """
        INDEX=`find -L ./ -name "*.amb" | sed 's/.amb//'`

        bwa aln \\
            $options.args \\
            -t $task.cpus \\
            -f ${prefix}.1.sai \\
            \$INDEX \\
            ${reads[0]}

        bwa aln \\
            $options.args \\
            -t $task.cpus \\
            -f ${prefix}.2.sai \\
            \$INDEX \\
            ${reads[1]}

        echo \$(bwa 2>&1) | sed 's/^.*Version: //; s/Contact:.*\$//' > ${software}.version.txt
        """
    }
}
modules/bwa/aln/meta.yml (new file, 54 lines)
@@ -0,0 +1,54 @@
name: bwa_aln
description: Find SA coordinates of the input reads for bwa short-read mapping
keywords:
  - bwa
  - aln
  - short-read
  - align
  - reference
  - fasta
  - map
  - fastq
tools:
  - bwa:
      description: |
        BWA is a software package for mapping DNA sequences against
        a large reference genome, such as the human genome.
      homepage: http://bio-bwa.sourceforge.net/
      documentation: http://bio-bwa.sourceforge.net/
      doi: "10.1093/bioinformatics/btp324"
      licence: ['GPL v3']

input:
  - meta:
      type: map
      description: |
        Groovy Map containing sample information
        e.g. [ id:'test', single_end:false ]
  - reads:
      type: file
      description: |
        List of input FastQ files of size 1 and 2 for single-end and paired-end data,
        respectively.
  - index:
      type: file
      description: BWA genome index files
      pattern: "Directory containing BWA index *.{amb,ann,bwt,pac,sa}"

output:
  - meta:
      type: map
      description: |
        Groovy Map containing sample information
        e.g. [ id:'test', single_end:false ]
  - version:
      type: file
      description: File containing software version
      pattern: "*.{version.txt}"
  - sai:
      type: file
      description: Single or paired SA coordinate files
      pattern: "*.sai"

authors:
  - "@jfy133"
modules/bwa/sampe/functions.nf (new file, 68 lines)
Identical to the shared nf-core functions.nf shown above under modules/arriba/functions.nf.
46
modules/bwa/sampe/main.nf
Normal file

@@ -0,0 +1,46 @@
// Import generic module functions
include { initOptions; saveFiles; getSoftwareName } from './functions'

params.options = [:]
options        = initOptions(params.options)

process BWA_SAMPE {
    tag "$meta.id"
    label 'process_medium'
    publishDir "${params.outdir}",
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

    conda (params.enable_conda ? "bioconda::bwa=0.7.17 bioconda::samtools=1.12" : null)
    if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
        container "https://depot.galaxyproject.org/singularity/mulled-v2-fe8faa35dbf6dc65a0f7f5d4ea12e31a79f73e40:66ed1b38d280722529bb8a0167b0cf02f8a0b488-0"
    } else {
        container "quay.io/biocontainers/mulled-v2-fe8faa35dbf6dc65a0f7f5d4ea12e31a79f73e40:66ed1b38d280722529bb8a0167b0cf02f8a0b488-0"
    }

    input:
    tuple val(meta), path(reads), path(sai)
    path index

    output:
    tuple val(meta), path("*.bam"), emit: bam
    path "*.version.txt"          , emit: version

    script:
    def software   = getSoftwareName(task.process)
    def prefix     = options.suffix ? "${meta.id}${options.suffix}" : "${meta.id}"
    def read_group = meta.read_group ? "-r ${meta.read_group}" : ""

    """
    INDEX=`find -L ./ -name "*.amb" | sed 's/.amb//'`

    bwa sampe \\
        $options.args \\
        $read_group \\
        \$INDEX \\
        $sai \\
        $reads | samtools sort -@ ${task.cpus - 1} -O bam - > ${prefix}.bam

    echo \$(bwa 2>&1) | sed 's/^.*Version: //; s/Contact:.*\$//' > ${software}.version.txt
    """
}
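As an illustration only (not part of the commit), the new module might be wired into a DSL2 workflow along these lines; the file names and the `bwa` index directory are assumptions:

    include { BWA_SAMPE } from './modules/bwa/sampe/main' addParams( options: [:] )

    workflow {
        // [ meta, [ R1, R2 ], [ R1.sai, R2.sai ] ]; the .sai files would normally come from a bwa aln step
        ch_input = Channel.of([ [ id:'test', single_end:false ],
                                [ file('test_1.fastq.gz'), file('test_2.fastq.gz') ],
                                [ file('test_1.sai'), file('test_2.sai') ] ])
        BWA_SAMPE ( ch_input, file('bwa') )   // second argument: directory containing the BWA index
    }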
58
modules/bwa/sampe/meta.yml
Normal file

@@ -0,0 +1,58 @@
name: bwa_sampe
description: Convert paired-end bwa SA coordinate files to SAM format
keywords:
  - bwa
  - aln
  - short-read
  - align
  - reference
  - fasta
  - map
  - sam
  - bam
tools:
  - bwa:
      description: |
        BWA is a software package for mapping DNA sequences against
        a large reference genome, such as the human genome.
      homepage: http://bio-bwa.sourceforge.net/
      documentation: http://bio-bwa.sourceforge.net/
      doi: "10.1093/bioinformatics/btp324"
      licence: ['GPL v3']

input:
  - meta:
      type: map
      description: |
        Groovy Map containing sample information.
        e.g. [ id:'test', single_end:false ]
  - reads:
      type: file
      description: FASTQ files specified alongside meta in input channel.
      pattern: "*.{fastq,fq}.gz"
  - sai:
      type: file
      description: SAI file specified alongside meta and reads in input channel.
      pattern: "*.sai"
  - index:
      type: directory
      description: Directory containing BWA index files (amb,ann,bwt,pac,sa) from BWA_INDEX
      pattern: "bwa/"

output:
  - meta:
      type: map
      description: |
        Groovy Map containing sample information
        e.g. [ id:'test', single_end:false ]
  - version:
      type: file
      description: File containing software version
      pattern: "*.{version.txt}"
  - bam:
      type: file
      description: BAM file
      pattern: "*.bam"

authors:
  - "@jfy133"
68
modules/bwa/samse/functions.nf
Normal file

@@ -0,0 +1,68 @@
//
// Utility functions used in nf-core DSL2 module files
//

//
// Extract name of software tool from process name using $task.process
//
def getSoftwareName(task_process) {
    return task_process.tokenize(':')[-1].tokenize('_')[0].toLowerCase()
}

//
// Function to initialise default values and to generate a Groovy Map of available options for nf-core modules
//
def initOptions(Map args) {
    def Map options = [:]
    options.args            = args.args ?: ''
    options.args2           = args.args2 ?: ''
    options.args3           = args.args3 ?: ''
    options.publish_by_meta = args.publish_by_meta ?: []
    options.publish_dir     = args.publish_dir ?: ''
    options.publish_files   = args.publish_files
    options.suffix          = args.suffix ?: ''
    return options
}

//
// Tidy up and join elements of a list to return a path string
//
def getPathFromList(path_list) {
    def paths = path_list.findAll { item -> !item?.trim().isEmpty() }  // Remove empty entries
    paths = paths.collect { it.trim().replaceAll("^[/]+|[/]+\$", "") } // Trim whitespace and trailing slashes
    return paths.join('/')
}

//
// Function to save/publish module results
//
def saveFiles(Map args) {
    if (!args.filename.endsWith('.version.txt')) {
        def ioptions  = initOptions(args.options)
        def path_list = [ ioptions.publish_dir ?: args.publish_dir ]
        if (ioptions.publish_by_meta) {
            def key_list = ioptions.publish_by_meta instanceof List ? ioptions.publish_by_meta : args.publish_by_meta
            for (key in key_list) {
                if (args.meta && key instanceof String) {
                    def path = key
                    if (args.meta.containsKey(key)) {
                        path = args.meta[key] instanceof Boolean ? "${key}_${args.meta[key]}".toString() : args.meta[key]
                    }
                    path = path instanceof String ? path : ''
                    path_list.add(path)
                }
            }
        }
        if (ioptions.publish_files instanceof Map) {
            for (ext in ioptions.publish_files) {
                if (args.filename.endsWith(ext.key)) {
                    def ext_list = path_list.collect()
                    ext_list.add(ext.value)
                    return "${getPathFromList(ext_list)}/$args.filename"
                }
            }
        } else if (ioptions.publish_files == null) {
            return "${getPathFromList(path_list)}/$args.filename"
        }
    }
}
46
modules/bwa/samse/main.nf
Normal file

@@ -0,0 +1,46 @@
// Import generic module functions
include { initOptions; saveFiles; getSoftwareName } from './functions'

params.options = [:]
options        = initOptions(params.options)

process BWA_SAMSE {
    tag "$meta.id"
    label 'process_medium'
    publishDir "${params.outdir}",
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

    conda (params.enable_conda ? "bioconda::bwa=0.7.17 bioconda::samtools=1.12" : null)
    if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
        container "https://depot.galaxyproject.org/singularity/mulled-v2-fe8faa35dbf6dc65a0f7f5d4ea12e31a79f73e40:66ed1b38d280722529bb8a0167b0cf02f8a0b488-0"
    } else {
        container "quay.io/biocontainers/mulled-v2-fe8faa35dbf6dc65a0f7f5d4ea12e31a79f73e40:66ed1b38d280722529bb8a0167b0cf02f8a0b488-0"
    }

    input:
    tuple val(meta), path(reads), path(sai)
    path index

    output:
    tuple val(meta), path("*.bam"), emit: bam
    path "*.version.txt"          , emit: version

    script:
    def software   = getSoftwareName(task.process)
    def prefix     = options.suffix ? "${meta.id}${options.suffix}" : "${meta.id}"
    def read_group = meta.read_group ? "-r ${meta.read_group}" : ""

    """
    INDEX=`find -L ./ -name "*.amb" | sed 's/.amb//'`

    bwa samse \\
        $options.args \\
        $read_group \\
        \$INDEX \\
        $sai \\
        $reads | samtools sort -@ ${task.cpus - 1} -O bam - > ${prefix}.bam

    echo \$(bwa 2>&1) | sed 's/^.*Version: //; s/Contact:.*\$//' > ${software}.version.txt
    """
}
59
modules/bwa/samse/meta.yml
Normal file

@@ -0,0 +1,59 @@
name: bwa_samse
description: Convert bwa SA coordinate file to SAM format
keywords:
  - bwa
  - aln
  - short-read
  - align
  - reference
  - fasta
  - map
  - sam
  - bam

tools:
  - bwa:
      description: |
        BWA is a software package for mapping DNA sequences against
        a large reference genome, such as the human genome.
      homepage: http://bio-bwa.sourceforge.net/
      documentation: http://bio-bwa.sourceforge.net/
      doi: "10.1093/bioinformatics/btp324"
      licence: ['GPL v3']

input:
  - meta:
      type: map
      description: |
        Groovy Map containing sample information.
        e.g. [ id:'test', single_end:false ]
  - reads:
      type: file
      description: FASTQ files specified alongside meta in input channel.
      pattern: "*.{fastq,fq}.gz"
  - sai:
      type: file
      description: SAI file specified alongside meta and reads in input channel.
      pattern: "*.sai"
  - index:
      type: directory
      description: Directory containing BWA index files (amb,ann,bwt,pac,sa) from BWA_INDEX
      pattern: "bwa/"

output:
  - meta:
      type: map
      description: |
        Groovy Map containing sample information
        e.g. [ id:'test', single_end:false ]
  - version:
      type: file
      description: File containing software version
      pattern: "*.{version.txt}"
  - bam:
      type: file
      description: BAM file
      pattern: "*.bam"

authors:
  - "@jfy133"
@@ -11,11 +11,11 @@ process CNVKIT {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? "bioconda::cnvkit=0.9.8" : null)
+    conda (params.enable_conda ? 'bioconda::cnvkit=0.9.9' : null)
     if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/cnvkit:0.9.8--py_0"
+        container "https://depot.galaxyproject.org/singularity/cnvkit:0.9.9--pyhdfd78af_0"
     } else {
-        container "quay.io/biocontainers/cnvkit:0.9.8--py_0"
+        container "quay.io/biocontainers/cnvkit:0.9.9--pyhdfd78af_0"
     }

     input:
@@ -5,7 +5,7 @@ params.options = [:]
 options        = initOptions(params.options)

 process COOLER_DIGEST {
-    tag '$fasta'
+    tag "$fasta"
     label 'process_medium'
     publishDir "${params.outdir}",
        mode: params.publish_dir_mode,

@@ -32,7 +32,7 @@ process COOLER_DIGEST {
    """
    cooler digest \\
        $options.args \\
-        -o "${fasta.baseName}_${enzyme.replaceAll(/[^0-9a-zA-Z]+/, "_")}.bed" \\
+        -o "${fasta.baseName}_${enzyme.replaceAll(/[^0-9a-zA-Z]+/, '_')}.bed" \\
        $chromsizes \\
        $fasta \\
        $enzyme
@@ -11,11 +11,11 @@ process CUTADAPT {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? 'bioconda::cutadapt=3.2' : null)
+    conda (params.enable_conda ? 'bioconda::cutadapt=3.4' : null)
     if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container 'https://depot.galaxyproject.org/singularity/cutadapt:3.2--py38h0213d0e_0'
+        container 'https://depot.galaxyproject.org/singularity/cutadapt:3.4--py39h38f01e4_1'
     } else {
-        container 'quay.io/biocontainers/cutadapt:3.2--py38h0213d0e_0'
+        container 'quay.io/biocontainers/cutadapt:3.4--py37h73a75cf_1'
     }

     input:
@@ -11,11 +11,11 @@ process DEEPTOOLS_COMPUTEMATRIX {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? "bioconda::deeptools=3.5.0" : null)
+    conda (params.enable_conda ? 'bioconda::deeptools=3.5.1' : null)
     if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/deeptools:3.5.0--py_0"
+        container "https://depot.galaxyproject.org/singularity/deeptools:3.5.1--py_0"
     } else {
-        container "quay.io/biocontainers/deeptools:3.5.0--py_0"
+        container "quay.io/biocontainers/deeptools:3.5.1--py_0"
     }

     input:

@@ -11,11 +11,11 @@ process DEEPTOOLS_PLOTFINGERPRINT {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? "bioconda::deeptools=3.5.0" : null)
+    conda (params.enable_conda ? 'bioconda::deeptools=3.5.1' : null)
     if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/deeptools:3.5.0--py_0"
+        container "https://depot.galaxyproject.org/singularity/deeptools:3.5.1--py_0"
     } else {
-        container "quay.io/biocontainers/deeptools:3.5.0--py_0"
+        container "quay.io/biocontainers/deeptools:3.5.1--py_0"
     }

     input:

@@ -11,11 +11,11 @@ process DEEPTOOLS_PLOTHEATMAP {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? "bioconda::deeptools=3.5.0" : null)
+    conda (params.enable_conda ? 'bioconda::deeptools=3.5.1' : null)
     if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/deeptools:3.5.0--py_0"
+        container "https://depot.galaxyproject.org/singularity/deeptools:3.5.1--py_0"
     } else {
-        container "quay.io/biocontainers/deeptools:3.5.0--py_0"
+        container "quay.io/biocontainers/deeptools:3.5.1--py_0"
     }

     input:

@@ -11,11 +11,11 @@ process DEEPTOOLS_PLOTPROFILE {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? "bioconda::deeptools=3.5.0" : null)
+    conda (params.enable_conda ? 'bioconda::deeptools=3.5.1' : null)
     if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/deeptools:3.5.0--py_0"
+        container "https://depot.galaxyproject.org/singularity/deeptools:3.5.1--py_0"
     } else {
-        container "quay.io/biocontainers/deeptools:3.5.0--py_0"
+        container "quay.io/biocontainers/deeptools:3.5.1--py_0"
     }

     input:
68
modules/dragonflye/functions.nf
Normal file

@@ -0,0 +1,68 @@
[verbatim copy of the shared nf-core DSL2 module helpers (getSoftwareName, initOptions, getPathFromList, saveFiles), identical to modules/bwa/samse/functions.nf above]
45
modules/dragonflye/main.nf
Normal file

@@ -0,0 +1,45 @@
// Import generic module functions
include { initOptions; saveFiles; getSoftwareName } from './functions'

params.options = [:]
options        = initOptions(params.options)

process DRAGONFLYE {
    tag "$meta.id"
    label 'process_medium'
    publishDir "${params.outdir}",
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

    conda (params.enable_conda ? "bioconda::dragonflye=1.0.4" : null)
    if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
        container "https://depot.galaxyproject.org/singularity/dragonflye:1.0.4--hdfd78af_0"
    } else {
        container "quay.io/biocontainers/dragonflye:1.0.4--hdfd78af_0"
    }

    input:
    tuple val(meta), path(reads)

    output:
    tuple val(meta), path("contigs.fa")                                    , emit: contigs
    tuple val(meta), path("dragonflye.log")                                , emit: log
    tuple val(meta), path("{flye,miniasm,raven}.fasta")                    , emit: raw_contigs
    tuple val(meta), path("{miniasm,raven}-unpolished.gfa"), optional:true , emit: gfa
    tuple val(meta), path("flye-info.txt"), optional:true                  , emit: txt
    path "*.version.txt"                                                   , emit: version

    script:
    def software = getSoftwareName(task.process)
    def memory   = task.memory.toGiga()
    """
    dragonflye \\
        --reads ${reads} \\
        $options.args \\
        --cpus $task.cpus \\
        --ram $memory \\
        --outdir ./ \\
        --force
    echo \$(dragonflye --version 2>&1) | sed 's/^.*dragonflye //' > ${software}.version.txt
    """
}
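Purely as an illustration (not part of the commit), the module could be called like this; the assembler flag passed through `options.args` and the input file name are assumptions:

    include { DRAGONFLYE } from './modules/dragonflye/main' addParams( options: [ args: '--assembler miniasm' ] )

    workflow {
        // Nanopore reads paired with a meta map
        ch_reads = Channel.of([ [ id:'test', single_end:true ], file('test.fastq.gz') ])
        DRAGONFLYE ( ch_reads )
        DRAGONFLYE.out.contigs.view()
    }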
57
modules/dragonflye/meta.yml
Normal file

@@ -0,0 +1,57 @@
name: dragonflye
description: Assemble bacterial isolate genomes from Nanopore reads
keywords:
  - bacterial
  - assembly
  - nanopore

tools:
  - dragonflye:
      description: Microbial assembly pipeline for Nanopore reads
      homepage: https://github.com/rpetit3/dragonflye
      documentation: https://github.com/rpetit3/dragonflye/blob/main/README.md
      licence: ['GPL v2']

input:
  - meta:
      type: map
      description: |
        Groovy Map containing sample information
        e.g. [ id:'test', single_end:false ]
  - reads:
      type: file
      description: Input Nanopore FASTQ file
      pattern: "*.fastq.gz"
output:
  - meta:
      type: map
      description: |
        Groovy Map containing sample information
        e.g. [ id:'test', single_end:false ]
  - version:
      type: file
      description: File containing software version
      pattern: "*.{version.txt}"
  - contigs:
      type: file
      description: The final assembly produced by Dragonflye
      pattern: "contigs.fa"
  - log:
      type: file
      description: Full log file for bug reporting
      pattern: "dragonflye.log"
  - raw_contigs:
      type: file
      description: Raw assembly produced by the assembler (Flye, Miniasm, or Raven)
      pattern: "{flye,miniasm,raven}.fasta"
  - txt:
      type: file
      description: Assembly information output by Flye
      pattern: "flye-info.txt"
  - gfa:
      type: file
      description: Assembly graph produced by Miniasm, or Raven
      pattern: "{miniasm,raven}-unpolished.gfa"

authors:
  - "@rpetit3"
68
modules/dshbio/exportsegments/functions.nf
Normal file

@@ -0,0 +1,68 @@
[verbatim copy of the shared nf-core DSL2 module helpers (getSoftwareName, initOptions, getPathFromList, saveFiles), identical to modules/bwa/samse/functions.nf above]

40
modules/dshbio/exportsegments/main.nf
Normal file

@@ -0,0 +1,40 @@
// Import generic module functions
include { initOptions; saveFiles; getSoftwareName } from './functions'

params.options = [:]
options        = initOptions(params.options)

process DSHBIO_EXPORTSEGMENTS {
    tag "${meta.id}"
    label 'process_medium'
    publishDir "${params.outdir}",
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

    conda (params.enable_conda ? "bioconda::dsh-bio=2.0.5" : null)
    if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
        container "https://depot.galaxyproject.org/singularity/dsh-bio:2.0.5--hdfd78af_0"
    } else {
        container "quay.io/biocontainers/dsh-bio:2.0.5--hdfd78af_0"
    }

    input:
    tuple val(meta), path(gfa)

    output:
    tuple val(meta), path("*.fa"), emit: fasta
    path "*.version.txt"         , emit: version

    script:
    def software = getSoftwareName(task.process)
    def prefix   = options.suffix ? "${meta.id}${options.suffix}" : "${meta.id}"
    """
    dsh-bio \\
        export-segments \\
        $options.args \\
        -i $gfa \\
        -o ${prefix}.fa

    echo \$(dsh-bio --version 2>&1) | grep -o 'dsh-bio-tools .*' | cut -f2 -d ' ' > ${software}.version.txt
    """
}

40
modules/dshbio/exportsegments/meta.yml
Normal file

@@ -0,0 +1,40 @@
name: dshbio_exportsegments
description: Export assembly segment sequences in GFA 1.0 format to FASTA format
keywords:
  - gfa
  - assembly
  - segment
tools:
  - dshbio:
      description: |
        Reads, features, variants, assemblies, alignments, genomic range trees, pangenome
        graphs, and a bunch of random command line tools for bioinformatics. LGPL version 3
        or later.
      homepage: https://github.com/heuermh/dishevelled-bio
      documentation: https://github.com/heuermh/dishevelled-bio
input:
  - meta:
      type: map
      description: |
        Groovy Map containing sample information
        e.g. [ id:'test', single_end:false ]
  - gfa:
      type: file
      description: Assembly segments in GFA 1.0 format
      pattern: "*.{gfa}"
output:
  - meta:
      type: map
      description: |
        Groovy Map containing sample information
        e.g. [ id:'test', single_end:false ]
  - fasta:
      type: file
      description: Assembly segment sequences in FASTA format
      pattern: "*.{fa}"
  - version:
      type: file
      description: File containing software version
      pattern: "*.{version.txt}"
authors:
  - "@heuermh"
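An illustrative invocation (file names assumed, not part of the commit):

    include { DSHBIO_EXPORTSEGMENTS } from './modules/dshbio/exportsegments/main' addParams( options: [:] )

    workflow {
        ch_gfa = Channel.of([ [ id:'test' ], file('assembly.gfa') ])
        DSHBIO_EXPORTSEGMENTS ( ch_gfa )            // emits test.fa plus a dsh-bio version file
        DSHBIO_EXPORTSEGMENTS.out.fasta.view()
    }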
@@ -11,11 +11,11 @@ process DSHBIO_FILTERBED {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? "bioconda::dsh-bio=2.0.4" : null)
+    conda (params.enable_conda ? "bioconda::dsh-bio=2.0.5" : null)
     if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/dsh-bio:2.0.4--hdfd78af_0"
+        container "https://depot.galaxyproject.org/singularity/dsh-bio:2.0.5--hdfd78af_0"
     } else {
-        container "quay.io/biocontainers/dsh-bio:2.0.4--hdfd78af_0"
+        container "quay.io/biocontainers/dsh-bio:2.0.5--hdfd78af_0"
     }

     input:

@@ -11,11 +11,11 @@ process DSHBIO_FILTERGFF3 {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? "bioconda::dsh-bio=2.0.4" : null)
+    conda (params.enable_conda ? "bioconda::dsh-bio=2.0.5" : null)
     if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/dsh-bio:2.0.4--hdfd78af_0"
+        container "https://depot.galaxyproject.org/singularity/dsh-bio:2.0.5--hdfd78af_0"
     } else {
-        container "quay.io/biocontainers/dsh-bio:2.0.4--hdfd78af_0"
+        container "quay.io/biocontainers/dsh-bio:2.0.5--hdfd78af_0"
     }

     input:

@@ -11,11 +11,11 @@ process DSHBIO_SPLITBED {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? "bioconda::dsh-bio=2.0.4" : null)
+    conda (params.enable_conda ? "bioconda::dsh-bio=2.0.5" : null)
     if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/dsh-bio:2.0.4--hdfd78af_0"
+        container "https://depot.galaxyproject.org/singularity/dsh-bio:2.0.5--hdfd78af_0"
     } else {
-        container "quay.io/biocontainers/dsh-bio:2.0.4--hdfd78af_0"
+        container "quay.io/biocontainers/dsh-bio:2.0.5--hdfd78af_0"
     }

     input:

@@ -11,11 +11,11 @@ process DSHBIO_SPLITGFF3 {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? "bioconda::dsh-bio=2.0.4" : null)
+    conda (params.enable_conda ? "bioconda::dsh-bio=2.0.5" : null)
     if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/dsh-bio:2.0.4--hdfd78af_0"
+        container "https://depot.galaxyproject.org/singularity/dsh-bio:2.0.5--hdfd78af_0"
     } else {
-        container "quay.io/biocontainers/dsh-bio:2.0.4--hdfd78af_0"
+        container "quay.io/biocontainers/dsh-bio:2.0.5--hdfd78af_0"
     }

     input:
68
modules/expansionhunter/functions.nf
Normal file

@@ -0,0 +1,68 @@
[verbatim copy of the shared nf-core DSL2 module helpers (getSoftwareName, initOptions, getPathFromList, saveFiles), identical to modules/bwa/samse/functions.nf above]

45
modules/expansionhunter/main.nf
Normal file

@@ -0,0 +1,45 @@
// Import generic module functions
include { initOptions; saveFiles; getSoftwareName } from './functions'

params.options = [:]
options        = initOptions(params.options)

process EXPANSIONHUNTER {
    tag "$meta.id"
    label 'process_low'
    publishDir "${params.outdir}",
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

    conda (params.enable_conda ? "bioconda::expansionhunter=4.0.2" : null)
    if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
        container "https://depot.galaxyproject.org/singularity/expansionhunter:4.0.2--he785bd8_0"
    } else {
        container "quay.io/biocontainers/expansionhunter:4.0.2--he785bd8_0"
    }

    input:
    tuple val(meta), path(bam), path(bai)
    path fasta
    path variant_catalog

    output:
    tuple val(meta), path("*.vcf"), emit: vcf
    path "*.version.txt"          , emit: version

    script:
    def software = getSoftwareName(task.process)
    def prefix   = options.suffix ? "${meta.id}${options.suffix}" : "${meta.id}"
    def gender   = (meta.gender == 'male' || meta.gender == 1 || meta.gender == 'XY') ? "male" : "female"
    """
    ExpansionHunter \\
        $options.args \\
        --reads $bam \\
        --output-prefix $prefix \\
        --reference $fasta \\
        --variant-catalog $variant_catalog \\
        --sex $gender

    echo \$(ExpansionHunter --version 2>&1) | sed 's/^.*ExpansionHunter //' > ${software}.version.txt
    """
}

50
modules/expansionhunter/meta.yml
Normal file

@@ -0,0 +1,50 @@
name: expansionhunter
description: write your description here
keywords:
  - STR
  - repeat_expansions
tools:
  - expansionhunter:
      description: A tool for estimating repeat sizes
      homepage: https://github.com/Illumina/ExpansionHunter
      documentation: https://github.com/Illumina/ExpansionHunter/blob/master/docs/01_Introduction.md
      tool_dev_url: None
      doi: "10.1093/bioinformatics/btz431"
      licence: ['Apache v2.0']

input:
  - meta:
      type: map
      description: |
        Groovy Map containing sample information
        e.g. [ id:'test', single_end:false ]
  - bam:
      type: file
      description: BAM/CRAM file
      pattern: "*.{bam,cram}"
  - fasta:
      type: file
      description: Reference genome
      pattern: "*.{fa,fasta}"
  - variant_catalog:
      type: file
      description: json file with repeat expansion sites to genotype
      pattern: "*.{json}"

output:
  - meta:
      type: map
      description: |
        Groovy Map containing sample information
        e.g. [ id:'test', gender:'female' ]
  - version:
      type: file
      description: File containing software version
      pattern: "*.{version.txt}"
  - vcf:
      type: file
      description: VCF with repeat expansions
      pattern: "*.{vcf}"

authors:
  - "@jemten"
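For illustration only (inputs are assumptions): the module derives the ExpansionHunter `--sex` flag from `meta.gender`, falling back to female when the value is missing or unrecognised.

    include { EXPANSIONHUNTER } from './modules/expansionhunter/main' addParams( options: [:] )

    workflow {
        ch_bam = Channel.of([ [ id:'test', gender:'male' ], file('test.bam'), file('test.bam.bai') ])
        EXPANSIONHUNTER ( ch_bam, file('genome.fasta'), file('variant_catalog.json') )
        EXPANSIONHUNTER.out.vcf.view()
    }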
@@ -20,6 +20,8 @@ process FASTP {

     input:
     tuple val(meta), path(reads)
+    val save_trimmed_fail
+    val save_merged

     output:
     tuple val(meta), path('*.trim.fastq.gz') , emit: reads

@@ -28,13 +30,14 @@ process FASTP {
     tuple val(meta), path('*.log')           , emit: log
     path '*.version.txt'                     , emit: version
     tuple val(meta), path('*.fail.fastq.gz') , optional:true, emit: reads_fail
+    tuple val(meta), path('*.merged.fastq.gz'), optional:true, emit: reads_merged

     script:
     // Added soft-links to original fastqs for consistent naming in MultiQC
     def software = getSoftwareName(task.process)
     def prefix   = options.suffix ? "${meta.id}${options.suffix}" : "${meta.id}"
     if (meta.single_end) {
-        def fail_fastq = params.save_trimmed_fail ? "--failed_out ${prefix}.fail.fastq.gz" : ''
+        def fail_fastq = save_trimmed_fail ? "--failed_out ${prefix}.fail.fastq.gz" : ''
         """
         [ ! -f ${prefix}.fastq.gz ] && ln -s $reads ${prefix}.fastq.gz
         fastp \\

@@ -49,7 +52,8 @@ process FASTP {
         echo \$(fastp --version 2>&1) | sed -e "s/fastp //g" > ${software}.version.txt
         """
     } else {
-        def fail_fastq = params.save_trimmed_fail ? "--unpaired1 ${prefix}_1.fail.fastq.gz --unpaired2 ${prefix}_2.fail.fastq.gz" : ''
+        def fail_fastq  = save_trimmed_fail ? "--unpaired1 ${prefix}_1.fail.fastq.gz --unpaired2 ${prefix}_2.fail.fastq.gz" : ''
+        def merge_fastq = save_merged ? "-m --merged_out ${prefix}.merged.fastq.gz" : ''
         """
         [ ! -f ${prefix}_1.fastq.gz ] && ln -s ${reads[0]} ${prefix}_1.fastq.gz
         [ ! -f ${prefix}_2.fastq.gz ] && ln -s ${reads[1]} ${prefix}_2.fastq.gz

@@ -61,6 +65,7 @@ process FASTP {
             --json ${prefix}.fastp.json \\
             --html ${prefix}.fastp.html \\
             $fail_fastq \\
+            $merge_fastq \\
             --thread $task.cpus \\
             --detect_adapter_for_pe \\
             $options.args \\

@@ -30,7 +30,7 @@ output:
        e.g. [ id:'test', single_end:false ]
   - reads:
       type: file
-      description: The trimmed/modified fastq reads
+      description: The trimmed/modified/unmerged fastq reads
       pattern: "*trim.fastq.gz"
   - json:
       type: file

@@ -52,6 +52,10 @@ output:
       type: file
       description: Reads the failed the preprocessing
       pattern: "*fail.fastq.gz"
+  - reads_merged:
+      type: file
+      description: Reads that were successfully merged
+      pattern: "*.{merged.fastq.gz}"
 authors:
   - "@drpatelh"
   - "@kevinmenden"
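With this change the behaviour is controlled per call rather than via pipeline params; a minimal sketch of the new signature (channel contents assumed, not part of the commit):

    include { FASTP } from './modules/fastp/main' addParams( options: [:] )

    workflow {
        ch_reads = Channel.of([ [ id:'test', single_end:false ],
                                [ file('test_1.fastq.gz'), file('test_2.fastq.gz') ] ])
        // second argument: save_trimmed_fail, third argument: save_merged
        FASTP ( ch_reads, false, true )
        FASTP.out.reads_merged.view()
    }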
@@ -10,11 +10,11 @@ process GUBBINS {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:[:], publish_by_meta:[]) }

-    conda (params.enable_conda ? "bioconda::gubbins=2.4.1" : null)
+    conda (params.enable_conda ? 'bioconda::gubbins=3.0.0' : null)
     if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/gubbins:2.4.1--py38h197edbe_1"
+        container "https://depot.galaxyproject.org/singularity/gubbins:3.0.0--py39h5bf99c6_0"
     } else {
-        container "quay.io/biocontainers/gubbins:2.4.1--py38h197edbe_1"
+        container "quay.io/biocontainers/gubbins:3.0.0--py39h5bf99c6_0"
     }

     input:
@@ -14,11 +14,11 @@ process HISAT2_BUILD {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:'index', meta:[:], publish_by_meta:[]) }

-    conda (params.enable_conda ? "bioconda::hisat2=2.2.0" : null)
+    conda (params.enable_conda ? 'bioconda::hisat2=2.2.1' : null)
     if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/hisat2:2.2.0--py37hfa133b6_4"
+        container "https://depot.galaxyproject.org/singularity/hisat2:2.2.1--h1b792b2_3"
     } else {
-        container "quay.io/biocontainers/hisat2:2.2.0--py37hfa133b6_4"
+        container "quay.io/biocontainers/hisat2:2.2.1--h1b792b2_3"
     }

     input:
@@ -13,11 +13,11 @@ process HISAT2_EXTRACTSPLICESITES {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:[:], publish_by_meta:[]) }

-    conda (params.enable_conda ? "bioconda::hisat2=2.2.0" : null)
+    conda (params.enable_conda ? 'bioconda::hisat2=2.2.1' : null)
     if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/hisat2:2.2.0--py37hfa133b6_4"
+        container "https://depot.galaxyproject.org/singularity/hisat2:2.2.1--h1b792b2_3"
     } else {
-        container "quay.io/biocontainers/hisat2:2.2.0--py37hfa133b6_4"
+        container "quay.io/biocontainers/hisat2:2.2.1--h1b792b2_3"
     }

     input:
@@ -11,11 +11,11 @@ process IQTREE {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:[:], publish_by_meta:[]) }

-    conda (params.enable_conda ? "bioconda::iqtree=2.1.2" : null)
+    conda (params.enable_conda ? 'bioconda::iqtree=2.1.4_beta' : null)
     if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/iqtree:2.1.2--h56fc30b_0"
+        container "https://depot.galaxyproject.org/singularity/iqtree:2.1.4_beta--hdcc8f71_0"
     } else {
-        container "quay.io/biocontainers/iqtree:2.1.2--h56fc30b_0"
+        container "quay.io/biocontainers/iqtree:2.1.4_beta--hdcc8f71_0"
     }

     input:
@@ -11,11 +11,11 @@ process KALLISTOBUSTOOLS_COUNT {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? "bioconda::kb-python=0.26.0" : null)
+    conda (params.enable_conda ? 'bioconda::kb-python=0.26.3' : null)
     if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/kb-python:0.26.0--pyhdfd78af_0"
+        container "https://depot.galaxyproject.org/singularity/kb-python:0.26.3--pyhdfd78af_0"
     } else {
-        container "quay.io/biocontainers/kb-python:0.26.0--pyhdfd78af_0"
+        container "quay.io/biocontainers/kb-python:0.26.3--pyhdfd78af_0"
     }

     input:

@@ -24,20 +24,18 @@ process KALLISTOBUSTOOLS_COUNT {
     path t2g
     path t1c
     path t2c
-    val use_t1c
-    val use_t2c
     val workflow
     val technology

     output:
-    tuple val(meta), path ("*_kallistobustools_count") , emit: kallistobustools_count
+    tuple val(meta), path ("*.count"), emit: count
     path "*.version.txt"             , emit: version

     script:
     def software = getSoftwareName(task.process)
     def prefix   = options.suffix ? "${meta.id}${options.suffix}" : "${meta.id}"
-    def cdna     = use_t1c ? "-c1 $t1c" : ''
-    def introns  = use_t2c ? "-c2 $t2c" : ''
+    def cdna     = t1c ? "-c1 $t1c" : ''
+    def introns  = t2c ? "-c2 $t2c" : ''
     """
     kb \\
         count \\

@@ -49,7 +47,7 @@ process KALLISTOBUSTOOLS_COUNT {
         --workflow $workflow \\
         -x $technology \\
         $options.args \\
-        -o ${prefix}_kallistobustools_count \\
+        -o ${prefix}.count \\
         ${reads[0]} \\
         ${reads[1]}

@@ -18,14 +18,11 @@ input:
       description: |
         Groovy Map containing sample information
         e.g. [ id:'test', single_end:false ]
-  - fastq1:
-      type: file
-      description: Read 1 fastq file
-      pattern: "*.{fastq,fastq.gz}"
-  - fastq2:
+  - reads:
       type: file
-      description: Read 2 fastq file
-      pattern: "*.{fastq,fastq.gz}"
+      description: |
+        List of input FastQ files of size 1 and 2 for single-end and paired-end data,
+        respectively.
   - index:
       type: file
       description: kb-ref index file (.idx)

@@ -38,17 +35,11 @@ input:
       type: file
       description: kb ref's c1 spliced_t2c file
       pattern: "*.{cdna_t2c.txt}"
-  - use_t1c:
-      type: boolean
-      description: Whether to use the c1 txt file for RNA velocity and nucleus workflows
   - t2c:
       type: file
       description: kb ref's c2 unspliced_t2c file
       pattern: "*.{introns_t2c.txt}"
-  - use_t2c:
-      type: boolean
-      description: Whether to use the c2 txt file for RNA velocity and nucleus workflows
-  - kb_workflow:
+  - workflow:
       type: value
       description: String value defining worfklow to use, can be one of "standard", "lamanno", "nucleus"
       pattern: "{standard,lamanno,nucleus,kite}"

@@ -57,17 +48,16 @@ input:
       description: String value defining the sequencing technology used.
       pattern: "{10XV1,10XV2,10XV3,CELSEQ,CELSEQ2,DROPSEQ,INDROPSV1,INDROPSV2,INDROPSV3,SCRUBSEQ,SURECELL,SMARTSEQ}"

-
 output:
   - meta:
       type: map
       description: |
         Groovy Map containing sample information
         e.g. [ id:'test']
-  - kallistobustools_count:
+  - count:
       type: file
       description: kb count output folder
-      pattern: "*_{kallistobustools_count}"
+      pattern: "*.{count}"
   - version:
       type: file
       description: File containing software version
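With the `use_t1c`/`use_t2c` flags removed, the optional transcripts-to-capture files are now toggled simply by passing either the file or an empty list; an illustrative call (file names assumed, not part of the commit):

    include { KALLISTOBUSTOOLS_COUNT } from './modules/kallistobustools/count/main' addParams( options: [:] )

    workflow {
        ch_reads = Channel.of([ [ id:'test' ], [ file('R1.fastq.gz'), file('R2.fastq.gz') ] ])
        // standard workflow: no c1/c2 files needed, so empty lists are passed in their place
        KALLISTOBUSTOOLS_COUNT ( ch_reads, file('index.idx'), file('t2g.txt'), [], [], 'standard', '10XV3' )
    }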
@@ -11,11 +11,11 @@ process KALLISTOBUSTOOLS_REF {
        mode: params.publish_dir_mode,
        saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:[:], publish_by_meta:[]) }

-    conda (params.enable_conda ? "bioconda::kb-python=0.26.0" : null)
+    conda (params.enable_conda ? 'bioconda::kb-python=0.26.3' : null)
     if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/kb-python:0.26.0--pyhdfd78af_0"
+        container "https://depot.galaxyproject.org/singularity/kb-python:0.26.3--pyhdfd78af_0"
     } else {
-        container "quay.io/biocontainers/kb-python:0.26.0--pyhdfd78af_0"
+        container "quay.io/biocontainers/kb-python:0.26.3--pyhdfd78af_0"
     }

     input:
@ -11,11 +11,11 @@ process LAST_DOTPLOT {
|
||||||
mode: params.publish_dir_mode,
|
mode: params.publish_dir_mode,
|
||||||
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
||||||
|
|
||||||
conda (params.enable_conda ? "bioconda::last=1238" : null)
|
conda (params.enable_conda ? 'bioconda::last=1250' : null)
|
||||||
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
container "https://depot.galaxyproject.org/singularity/last:1238--h2e03b76_0"
|
container "https://depot.galaxyproject.org/singularity/last:1250--h2e03b76_0"
|
||||||
} else {
|
} else {
|
||||||
container "quay.io/biocontainers/last:1238--h2e03b76_0"
|
container "quay.io/biocontainers/last:1250--h2e03b76_0"
|
||||||
}
|
}
|
||||||
|
|
||||||
input:
|
input:
|
||||||
|
|
|
@ -11,11 +11,11 @@ process LAST_LASTAL {
|
||||||
mode: params.publish_dir_mode,
|
mode: params.publish_dir_mode,
|
||||||
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
||||||
|
|
||||||
conda (params.enable_conda ? "bioconda::last=1238" : null)
|
conda (params.enable_conda ? 'bioconda::last=1250' : null)
|
||||||
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
container "https://depot.galaxyproject.org/singularity/last:1238--h2e03b76_0"
|
container "https://depot.galaxyproject.org/singularity/last:1250--h2e03b76_0"
|
||||||
} else {
|
} else {
|
||||||
container "quay.io/biocontainers/last:1238--h2e03b76_0"
|
container "quay.io/biocontainers/last:1250--h2e03b76_0"
|
||||||
}
|
}
|
||||||
|
|
||||||
input:
|
input:
|
||||||
|
|
|
@ -11,11 +11,11 @@ process LAST_LASTDB {
|
||||||
mode: params.publish_dir_mode,
|
mode: params.publish_dir_mode,
|
||||||
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
||||||
|
|
||||||
conda (params.enable_conda ? "bioconda::last=1238" : null)
|
conda (params.enable_conda ? 'bioconda::last=1250' : null)
|
||||||
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
container "https://depot.galaxyproject.org/singularity/last:1238--h2e03b76_0"
|
container "https://depot.galaxyproject.org/singularity/last:1250--h2e03b76_0"
|
||||||
} else {
|
} else {
|
||||||
container "quay.io/biocontainers/last:1238--h2e03b76_0"
|
container "quay.io/biocontainers/last:1250--h2e03b76_0"
|
||||||
}
|
}
|
||||||
|
|
||||||
input:
|
input:
|
||||||
|
|
|
@ -11,11 +11,11 @@ process LAST_MAFCONVERT {
|
||||||
mode: params.publish_dir_mode,
|
mode: params.publish_dir_mode,
|
||||||
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
||||||
|
|
||||||
conda (params.enable_conda ? "bioconda::last=1238" : null)
|
conda (params.enable_conda ? 'bioconda::last=1250' : null)
|
||||||
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
container "https://depot.galaxyproject.org/singularity/last:1238--h2e03b76_0"
|
container "https://depot.galaxyproject.org/singularity/last:1250--h2e03b76_0"
|
||||||
} else {
|
} else {
|
||||||
container "quay.io/biocontainers/last:1238--h2e03b76_0"
|
container "quay.io/biocontainers/last:1250--h2e03b76_0"
|
||||||
}
|
}
|
||||||
|
|
||||||
input:
|
input:
|
||||||
|
|
|
@ -11,11 +11,11 @@ process LAST_MAFSWAP {
|
||||||
mode: params.publish_dir_mode,
|
mode: params.publish_dir_mode,
|
||||||
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
||||||
|
|
||||||
conda (params.enable_conda ? "bioconda::last=1238" : null)
|
conda (params.enable_conda ? 'bioconda::last=1250' : null)
|
||||||
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
container "https://depot.galaxyproject.org/singularity/last:1238--h2e03b76_0"
|
container "https://depot.galaxyproject.org/singularity/last:1250--h2e03b76_0"
|
||||||
} else {
|
} else {
|
||||||
container "quay.io/biocontainers/last:1238--h2e03b76_0"
|
container "quay.io/biocontainers/last:1250--h2e03b76_0"
|
||||||
}
|
}
|
||||||
|
|
||||||
input:
|
input:
|
||||||
|
|
|
@ -11,11 +11,11 @@ process LAST_POSTMASK {
|
||||||
mode: params.publish_dir_mode,
|
mode: params.publish_dir_mode,
|
||||||
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
||||||
|
|
||||||
conda (params.enable_conda ? "bioconda::last=1238" : null)
|
conda (params.enable_conda ? 'bioconda::last=1250' : null)
|
||||||
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
container "https://depot.galaxyproject.org/singularity/last:1238--h2e03b76_0"
|
container "https://depot.galaxyproject.org/singularity/last:1250--h2e03b76_0"
|
||||||
} else {
|
} else {
|
||||||
container "quay.io/biocontainers/last:1238--h2e03b76_0"
|
container "quay.io/biocontainers/last:1250--h2e03b76_0"
|
||||||
}
|
}
|
||||||
|
|
||||||
input:
|
input:
|
||||||
|
|
|
@ -11,11 +11,11 @@ process LAST_SPLIT {
|
||||||
mode: params.publish_dir_mode,
|
mode: params.publish_dir_mode,
|
||||||
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
||||||
|
|
||||||
conda (params.enable_conda ? "bioconda::last=1238" : null)
|
conda (params.enable_conda ? 'bioconda::last=1250' : null)
|
||||||
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
container "https://depot.galaxyproject.org/singularity/last:1238--h2e03b76_0"
|
container "https://depot.galaxyproject.org/singularity/last:1250--h2e03b76_0"
|
||||||
} else {
|
} else {
|
||||||
container "quay.io/biocontainers/last:1238--h2e03b76_0"
|
container "quay.io/biocontainers/last:1250--h2e03b76_0"
|
||||||
}
|
}
|
||||||
|
|
||||||
input:
|
input:
|
||||||
|
|
|
@ -11,11 +11,11 @@ process LAST_TRAIN {
|
||||||
mode: params.publish_dir_mode,
|
mode: params.publish_dir_mode,
|
||||||
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
||||||
|
|
||||||
conda (params.enable_conda ? "bioconda::last=1238" : null)
|
conda (params.enable_conda ? 'bioconda::last=1250' : null)
|
||||||
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
container "https://depot.galaxyproject.org/singularity/last:1238--h2e03b76_0"
|
container "https://depot.galaxyproject.org/singularity/last:1250--h2e03b76_0"
|
||||||
} else {
|
} else {
|
||||||
container "quay.io/biocontainers/last:1238--h2e03b76_0"
|
container "quay.io/biocontainers/last:1250--h2e03b76_0"
|
||||||
}
|
}
|
||||||
|
|
||||||
input:
|
input:
|
||||||
|
|
68
modules/malt/build/functions.nf
Normal file
68
modules/malt/build/functions.nf
Normal file
|
@ -0,0 +1,68 @@
|
||||||
|
//
|
||||||
|
// Utility functions used in nf-core DSL2 module files
|
||||||
|
//
|
||||||
|
|
||||||
|
//
|
||||||
|
// Extract name of software tool from process name using $task.process
|
||||||
|
//
|
||||||
|
def getSoftwareName(task_process) {
|
||||||
|
return task_process.tokenize(':')[-1].tokenize('_')[0].toLowerCase()
|
||||||
|
}
|
||||||
|
|
||||||
|
//
|
||||||
|
// Function to initialise default values and to generate a Groovy Map of available options for nf-core modules
|
||||||
|
//
|
||||||
|
def initOptions(Map args) {
|
||||||
|
def Map options = [:]
|
||||||
|
options.args = args.args ?: ''
|
||||||
|
options.args2 = args.args2 ?: ''
|
||||||
|
options.args3 = args.args3 ?: ''
|
||||||
|
options.publish_by_meta = args.publish_by_meta ?: []
|
||||||
|
options.publish_dir = args.publish_dir ?: ''
|
||||||
|
options.publish_files = args.publish_files
|
||||||
|
options.suffix = args.suffix ?: ''
|
||||||
|
return options
|
||||||
|
}
|
||||||
|
|
||||||
|
//
|
||||||
|
// Tidy up and join elements of a list to return a path string
|
||||||
|
//
|
||||||
|
def getPathFromList(path_list) {
|
||||||
|
def paths = path_list.findAll { item -> !item?.trim().isEmpty() } // Remove empty entries
|
||||||
|
paths = paths.collect { it.trim().replaceAll("^[/]+|[/]+\$", "") } // Trim whitespace and trailing slashes
|
||||||
|
return paths.join('/')
|
||||||
|
}
|
||||||
|
|
||||||
|
//
|
||||||
|
// Function to save/publish module results
|
||||||
|
//
|
||||||
|
def saveFiles(Map args) {
|
||||||
|
if (!args.filename.endsWith('.version.txt')) {
|
||||||
|
def ioptions = initOptions(args.options)
|
||||||
|
def path_list = [ ioptions.publish_dir ?: args.publish_dir ]
|
||||||
|
if (ioptions.publish_by_meta) {
|
||||||
|
def key_list = ioptions.publish_by_meta instanceof List ? ioptions.publish_by_meta : args.publish_by_meta
|
||||||
|
for (key in key_list) {
|
||||||
|
if (args.meta && key instanceof String) {
|
||||||
|
def path = key
|
||||||
|
if (args.meta.containsKey(key)) {
|
||||||
|
path = args.meta[key] instanceof Boolean ? "${key}_${args.meta[key]}".toString() : args.meta[key]
|
||||||
|
}
|
||||||
|
path = path instanceof String ? path : ''
|
||||||
|
path_list.add(path)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if (ioptions.publish_files instanceof Map) {
|
||||||
|
for (ext in ioptions.publish_files) {
|
||||||
|
if (args.filename.endsWith(ext.key)) {
|
||||||
|
def ext_list = path_list.collect()
|
||||||
|
ext_list.add(ext.value)
|
||||||
|
return "${getPathFromList(ext_list)}/$args.filename"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
} else if (ioptions.publish_files == null) {
|
||||||
|
return "${getPathFromList(path_list)}/$args.filename"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
58
modules/malt/build/main.nf
Normal file
58
modules/malt/build/main.nf
Normal file
|
@ -0,0 +1,58 @@
|
||||||
|
// Import generic module functions
|
||||||
|
include { initOptions; saveFiles; getSoftwareName } from './functions'
|
||||||
|
|
||||||
|
params.options = [:]
|
||||||
|
options = initOptions(params.options)
|
||||||
|
|
||||||
|
process MALT_BUILD {
|
||||||
|
|
||||||
|
label 'process_high'
|
||||||
|
publishDir "${params.outdir}",
|
||||||
|
mode: params.publish_dir_mode,
|
||||||
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:[:], publish_by_meta:[]) }
|
||||||
|
|
||||||
|
// Do not **auto-bump** due to problem with change of version numbering between 0.4.1 and 0.5.2
|
||||||
|
// (originally 0.4.1 was listed as 0.41, so is always selected as 'latest' even though it is not!)
|
||||||
|
conda (params.enable_conda ? "bioconda::malt=0.5.2" : null)
|
||||||
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
|
container "https://depot.galaxyproject.org/singularity/malt:0.5.2--0"
|
||||||
|
} else {
|
||||||
|
container "quay.io/biocontainers/malt:0.5.2--0"
|
||||||
|
}
|
||||||
|
|
||||||
|
input:
|
||||||
|
path fastas
|
||||||
|
val seq_type
|
||||||
|
path gff
|
||||||
|
path map_db
|
||||||
|
|
||||||
|
output:
|
||||||
|
path "malt_index/" , emit: index
|
||||||
|
path "*.version.txt" , emit: version
|
||||||
|
path "malt-build.log", emit: log
|
||||||
|
|
||||||
|
script:
|
||||||
|
def software = getSoftwareName(task.process)
|
||||||
|
def avail_mem = 6
|
||||||
|
if (!task.memory) {
|
||||||
|
log.info '[MALT_BUILD] Available memory not known - defaulting to 6GB. Specify process memory requirements to change this.'
|
||||||
|
} else {
|
||||||
|
avail_mem = task.memory.giga
|
||||||
|
}
|
||||||
|
def igff = gff ? "-igff ${gff}" : ""
|
||||||
|
|
||||||
|
"""
|
||||||
|
malt-build \\
|
||||||
|
-J-Xmx${avail_mem}g \\
|
||||||
|
-v \\
|
||||||
|
--input ${fastas.join(' ')} \\
|
||||||
|
-s $seq_type \\
|
||||||
|
$igff \\
|
||||||
|
-d 'malt_index/' \\
|
||||||
|
-t ${task.cpus} \\
|
||||||
|
$options.args \\
|
||||||
|
-mdb ${map_db}/*.db |&tee malt-build.log
|
||||||
|
|
||||||
|
malt-build --help |& tail -n 3 | head -n 1 | cut -f 2 -d'(' | cut -f 1 -d ',' | cut -d ' ' -f 2 > ${software}.version.txt
|
||||||
|
"""
|
||||||
|
}
|
55
modules/malt/build/meta.yml
Normal file
55
modules/malt/build/meta.yml
Normal file
|
@ -0,0 +1,55 @@
|
||||||
|
name: malt_build
|
||||||
|
description: MALT, an acronym for MEGAN alignment tool, is a sequence alignment and analysis tool designed for processing high-throughput sequencing data, especially in the context of metagenomics.
|
||||||
|
keywords:
|
||||||
|
- malt
|
||||||
|
- alignment
|
||||||
|
- metagenomics
|
||||||
|
- ancient DNA
|
||||||
|
- aDNA
|
||||||
|
- palaeogenomics
|
||||||
|
- archaeogenomics
|
||||||
|
- microbiome
|
||||||
|
- database
|
||||||
|
tools:
|
||||||
|
- malt:
|
||||||
|
description: A tool for mapping metagenomic data
|
||||||
|
homepage: https://www.wsi.uni-tuebingen.de/lehrstuehle/algorithms-in-bioinformatics/software/malt/
|
||||||
|
documentation: https://software-ab.informatik.uni-tuebingen.de/download/malt/manual.pdf
|
||||||
|
tool_dev_url: None
|
||||||
|
doi: "10.1038/s41559-017-0446-6"
|
||||||
|
licence: ['GPL v3']
|
||||||
|
|
||||||
|
input:
|
||||||
|
- fastas:
|
||||||
|
type: file
|
||||||
|
description: Directory of FASTA files, or individual FASTA reference files, to index
|
||||||
|
pattern: "*/|*.fasta"
|
||||||
|
- seq_type:
|
||||||
|
type: string
|
||||||
|
description: Type of input data
|
||||||
|
pattern: "DNA|Protein"
|
||||||
|
- gff:
|
||||||
|
type: file
|
||||||
|
description: Directory of GFF3 files, or individual GFF3 files, corresponding to the input FASTA files
|
||||||
|
pattern: "*/|*.gff|*.gff3"
|
||||||
|
- map_db:
|
||||||
|
type: file
|
||||||
|
description: MEGAN .db file from https://software-ab.informatik.uni-tuebingen.de/download/megan6/welcome.html
|
||||||
|
pattern:
|
||||||
|
|
||||||
|
output:
|
||||||
|
- version:
|
||||||
|
type: file
|
||||||
|
description: File containing software version
|
||||||
|
pattern: "*.{version.txt}"
|
||||||
|
- index:
|
||||||
|
type: directory
|
||||||
|
description: Directory containing MALT database index directory
|
||||||
|
pattern: "malt_index/"
|
||||||
|
- log:
|
||||||
|
type: file
|
||||||
|
description: Log file from STD out of malt-build
|
||||||
|
pattern: "malt-build.log"
|
||||||
|
|
||||||
|
authors:
|
||||||
|
- "@jfy133"
|
68
modules/malt/run/functions.nf
Normal file
68
modules/malt/run/functions.nf
Normal file
|
@ -0,0 +1,68 @@
|
||||||
|
//
|
||||||
|
// Utility functions used in nf-core DSL2 module files
|
||||||
|
//
|
||||||
|
|
||||||
|
//
|
||||||
|
// Extract name of software tool from process name using $task.process
|
||||||
|
//
|
||||||
|
def getSoftwareName(task_process) {
|
||||||
|
return task_process.tokenize(':')[-1].tokenize('_')[0].toLowerCase()
|
||||||
|
}
|
||||||
|
|
||||||
|
//
|
||||||
|
// Function to initialise default values and to generate a Groovy Map of available options for nf-core modules
|
||||||
|
//
|
||||||
|
def initOptions(Map args) {
|
||||||
|
def Map options = [:]
|
||||||
|
options.args = args.args ?: ''
|
||||||
|
options.args2 = args.args2 ?: ''
|
||||||
|
options.args3 = args.args3 ?: ''
|
||||||
|
options.publish_by_meta = args.publish_by_meta ?: []
|
||||||
|
options.publish_dir = args.publish_dir ?: ''
|
||||||
|
options.publish_files = args.publish_files
|
||||||
|
options.suffix = args.suffix ?: ''
|
||||||
|
return options
|
||||||
|
}
|
||||||
|
|
||||||
|
//
|
||||||
|
// Tidy up and join elements of a list to return a path string
|
||||||
|
//
|
||||||
|
def getPathFromList(path_list) {
|
||||||
|
def paths = path_list.findAll { item -> !item?.trim().isEmpty() } // Remove empty entries
|
||||||
|
paths = paths.collect { it.trim().replaceAll("^[/]+|[/]+\$", "") } // Trim whitespace and trailing slashes
|
||||||
|
return paths.join('/')
|
||||||
|
}
|
||||||
|
|
||||||
|
//
|
||||||
|
// Function to save/publish module results
|
||||||
|
//
|
||||||
|
def saveFiles(Map args) {
|
||||||
|
if (!args.filename.endsWith('.version.txt')) {
|
||||||
|
def ioptions = initOptions(args.options)
|
||||||
|
def path_list = [ ioptions.publish_dir ?: args.publish_dir ]
|
||||||
|
if (ioptions.publish_by_meta) {
|
||||||
|
def key_list = ioptions.publish_by_meta instanceof List ? ioptions.publish_by_meta : args.publish_by_meta
|
||||||
|
for (key in key_list) {
|
||||||
|
if (args.meta && key instanceof String) {
|
||||||
|
def path = key
|
||||||
|
if (args.meta.containsKey(key)) {
|
||||||
|
path = args.meta[key] instanceof Boolean ? "${key}_${args.meta[key]}".toString() : args.meta[key]
|
||||||
|
}
|
||||||
|
path = path instanceof String ? path : ''
|
||||||
|
path_list.add(path)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if (ioptions.publish_files instanceof Map) {
|
||||||
|
for (ext in ioptions.publish_files) {
|
||||||
|
if (args.filename.endsWith(ext.key)) {
|
||||||
|
def ext_list = path_list.collect()
|
||||||
|
ext_list.add(ext.value)
|
||||||
|
return "${getPathFromList(ext_list)}/$args.filename"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
} else if (ioptions.publish_files == null) {
|
||||||
|
return "${getPathFromList(path_list)}/$args.filename"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
54
modules/malt/run/main.nf
Normal file
54
modules/malt/run/main.nf
Normal file
|
@ -0,0 +1,54 @@
|
||||||
|
// Import generic module functions
|
||||||
|
include { initOptions; saveFiles; getSoftwareName } from './functions'
|
||||||
|
|
||||||
|
params.options = [:]
|
||||||
|
options = initOptions(params.options)
|
||||||
|
|
||||||
|
process MALT_RUN {
|
||||||
|
|
||||||
|
label 'process_high_memory'
|
||||||
|
publishDir "${params.outdir}",
|
||||||
|
mode: params.publish_dir_mode,
|
||||||
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:[:], publish_by_meta:[]) }
|
||||||
|
|
||||||
|
conda (params.enable_conda ? "bioconda::malt=0.5.2" : null)
|
||||||
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
|
container "https://depot.galaxyproject.org/singularity/malt:0.5.2--0"
|
||||||
|
} else {
|
||||||
|
container "quay.io/biocontainers/malt:0.5.2--0"
|
||||||
|
}
|
||||||
|
|
||||||
|
input:
|
||||||
|
path fastqs
|
||||||
|
val mode
|
||||||
|
path index
|
||||||
|
|
||||||
|
output:
|
||||||
|
path "*.rma6" , emit: rma6
|
||||||
|
path "*.{tab,text,sam}", optional:true, emit: alignments
|
||||||
|
path "*.log" , emit: log
|
||||||
|
path "*.version.txt" , emit: version
|
||||||
|
|
||||||
|
script:
|
||||||
|
def software = getSoftwareName(task.process)
|
||||||
|
def avail_mem = 6
|
||||||
|
if (!task.memory) {
|
||||||
|
log.info '[MALT_RUN] Available memory not known - defaulting to 6GB. Specify process memory requirements to change this.'
|
||||||
|
} else {
|
||||||
|
avail_mem = task.memory.giga
|
||||||
|
}
|
||||||
|
|
||||||
|
"""
|
||||||
|
malt-run \\
|
||||||
|
-J-Xmx${avail_mem}g \\
|
||||||
|
-t ${task.cpus} \\
|
||||||
|
-v \\
|
||||||
|
-o . \\
|
||||||
|
$options.args \\
|
||||||
|
--inFile ${fastqs.join(' ')} \\
|
||||||
|
-m $mode \\
|
||||||
|
--index $index/ |&tee malt-run.log
|
||||||
|
|
||||||
|
echo \$(malt-run --help 2>&1) | grep -o 'version.* ' | cut -f 1 -d ',' | cut -f2 -d ' ' > ${software}.version.txt
|
||||||
|
"""
|
||||||
|
}
|
53
modules/malt/run/meta.yml
Normal file
53
modules/malt/run/meta.yml
Normal file
|
@ -0,0 +1,53 @@
|
||||||
|
name: malt_run
|
||||||
|
description: MALT, an acronym for MEGAN alignment tool, is a sequence alignment and analysis tool designed for processing high-throughput sequencing data, especially in the context of metagenomics.
|
||||||
|
keywords:
|
||||||
|
- malt
|
||||||
|
- alignment
|
||||||
|
- metagenomics
|
||||||
|
- ancient DNA
|
||||||
|
- aDNA
|
||||||
|
- palaeogenomics
|
||||||
|
- archaeogenomics
|
||||||
|
- microbiome
|
||||||
|
tools:
|
||||||
|
- malt:
|
||||||
|
description: A tool for mapping metagenomic data
|
||||||
|
homepage: https://www.wsi.uni-tuebingen.de/lehrstuehle/algorithms-in-bioinformatics/software/malt/
|
||||||
|
documentation: https://software-ab.informatik.uni-tuebingen.de/download/malt/manual.pdf
|
||||||
|
tool_dev_url: None
|
||||||
|
doi: "10.1038/s41559-017-0446-6"
|
||||||
|
licence: ['GPL v3']
|
||||||
|
|
||||||
|
input:
|
||||||
|
- fastqs:
|
||||||
|
type: file
|
||||||
|
description: Input FASTQ files
|
||||||
|
pattern: "*.{fastq.gz,fq.gz}"
|
||||||
|
- mode:
|
||||||
|
type: string
|
||||||
|
description: Program mode
|
||||||
|
pattern: 'Unknown|BlastN|BlastP|BlastX|Classifier'
|
||||||
|
- index:
|
||||||
|
type: directory
|
||||||
|
description: Index/database directory from malt-build
|
||||||
|
pattern: '*/'
|
||||||
|
output:
|
||||||
|
- version:
|
||||||
|
type: file
|
||||||
|
description: File containing software version
|
||||||
|
pattern: "*.{version.txt}"
|
||||||
|
- rma6:
|
||||||
|
type: file
|
||||||
|
description: MEGAN6 RMA6 file
|
||||||
|
pattern: "*.rma6"
|
||||||
|
- sam:
|
||||||
|
type: file
|
||||||
|
description: Alignment files in Tab, Text or MEGAN-compatible SAM format
|
||||||
|
pattern: "*.{tab,txt,sam}"
|
||||||
|
- log:
|
||||||
|
type: file
|
||||||
|
description: Log of verbose MALT stdout
|
||||||
|
pattern: "malt-run.log"
|
||||||
|
|
||||||
|
authors:
|
||||||
|
- "@jfy133"
|
|
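For orientation, a minimal DSL2 usage sketch for the two new MALT modules added above is shown here. The `params.*` values and channel names are hypothetical placeholders and not part of this commit; the argument order follows the `input:` blocks declared in `modules/malt/build/main.nf` and `modules/malt/run/main.nf`.

```nextflow
// Hypothetical usage sketch only: params.* values are placeholders, not part of this commit.
include { MALT_BUILD } from './modules/malt/build/main' addParams( options: [:] )
include { MALT_RUN   } from './modules/malt/run/main'   addParams( options: [:] )

workflow {
    fastas = Channel.fromPath( params.fastas ).collect()   // reference FASTA file(s) to index
    gff    = file( params.gff )                            // GFF3 annotation(s); pass [] if not used
    map_db = file( params.map_db )                         // directory holding the MEGAN mapping .db file
    fastqs = Channel.fromPath( params.reads ).collect()    // FASTQ files to align/classify

    MALT_BUILD ( fastas, 'DNA', gff, map_db )              // seq_type: 'DNA' or 'Protein'
    MALT_RUN   ( fastqs, 'BlastN', MALT_BUILD.out.index )  // mode: Unknown|BlastN|BlastP|BlastX|Classifier
}
```

The run step consumes the `malt_index/` directory emitted by the build step, matching the `emit: index` declaration above.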
@ -11,11 +11,11 @@ process METAPHLAN3 {
|
||||||
mode: params.publish_dir_mode,
|
mode: params.publish_dir_mode,
|
||||||
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
||||||
|
|
||||||
conda (params.enable_conda ? "bioconda::metaphlan=3.0.10" : null)
|
conda (params.enable_conda ? 'bioconda::metaphlan=3.0.12' : null)
|
||||||
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
container "https://depot.galaxyproject.org/singularity/metaphlan:3.0.10--pyhb7b1952_0"
|
container "https://depot.galaxyproject.org/singularity/metaphlan:3.0.12--pyhb7b1952_0"
|
||||||
} else {
|
} else {
|
||||||
container "quay.io/biocontainers/metaphlan:3.0.10--pyhb7b1952_0"
|
container "quay.io/biocontainers/metaphlan:3.0.12--pyhb7b1952_0"
|
||||||
}
|
}
|
||||||
|
|
||||||
input:
|
input:
|
||||||
|
|
|
@ -11,11 +11,11 @@ process METHYLDACKEL_EXTRACT {
|
||||||
mode: params.publish_dir_mode,
|
mode: params.publish_dir_mode,
|
||||||
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
||||||
|
|
||||||
conda (params.enable_conda ? "bioconda::methyldackel=0.5.2" : null)
|
conda (params.enable_conda ? 'bioconda::methyldackel=0.6.0' : null)
|
||||||
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
container "https://depot.galaxyproject.org/singularity/methyldackel:0.5.2--h7435645_0"
|
container "https://depot.galaxyproject.org/singularity/methyldackel:0.6.0--h22771d5_0"
|
||||||
} else {
|
} else {
|
||||||
container "quay.io/biocontainers/methyldackel:0.5.2--h7435645_0"
|
container "quay.io/biocontainers/methyldackel:0.6.0--h22771d5_0"
|
||||||
}
|
}
|
||||||
|
|
||||||
input:
|
input:
|
||||||
|
|
|
@ -11,11 +11,11 @@ process METHYLDACKEL_MBIAS {
|
||||||
mode: params.publish_dir_mode,
|
mode: params.publish_dir_mode,
|
||||||
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
||||||
|
|
||||||
conda (params.enable_conda ? "bioconda::methyldackel=0.5.2" : null)
|
conda (params.enable_conda ? 'bioconda::methyldackel=0.6.0' : null)
|
||||||
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
container "https://depot.galaxyproject.org/singularity/methyldackel:0.5.2--h7435645_0"
|
container "https://depot.galaxyproject.org/singularity/methyldackel:0.6.0--h22771d5_0"
|
||||||
} else {
|
} else {
|
||||||
container "quay.io/biocontainers/methyldackel:0.5.2--h7435645_0"
|
container "quay.io/biocontainers/methyldackel:0.6.0--h22771d5_0"
|
||||||
}
|
}
|
||||||
|
|
||||||
input:
|
input:
|
||||||
|
|
|
@ -11,11 +11,11 @@ process MINIMAP2_ALIGN {
|
||||||
mode: params.publish_dir_mode,
|
mode: params.publish_dir_mode,
|
||||||
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
||||||
|
|
||||||
conda (params.enable_conda ? "bioconda::minimap2=2.17" : null)
|
conda (params.enable_conda ? 'bioconda::minimap2=2.21' : null)
|
||||||
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
container "https://depot.galaxyproject.org/singularity/minimap2:2.17--hed695b0_3"
|
container "https://depot.galaxyproject.org/singularity/minimap2:2.21--h5bf99c6_0"
|
||||||
} else {
|
} else {
|
||||||
container "quay.io/biocontainers/minimap2:2.17--hed695b0_3"
|
container "quay.io/biocontainers/minimap2:2.21--h5bf99c6_0"
|
||||||
}
|
}
|
||||||
|
|
||||||
input:
|
input:
|
||||||
|
|
|
@ -10,11 +10,11 @@ process MINIMAP2_INDEX {
|
||||||
mode: params.publish_dir_mode,
|
mode: params.publish_dir_mode,
|
||||||
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:[:], publish_by_meta:['']) }
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:[:], publish_by_meta:['']) }
|
||||||
|
|
||||||
conda (params.enable_conda ? "bioconda::minimap2=2.17" : null)
|
conda (params.enable_conda ? 'bioconda::minimap2=2.21' : null)
|
||||||
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
container "https://depot.galaxyproject.org/singularity/minimap2:2.17--hed695b0_3"
|
container "https://depot.galaxyproject.org/singularity/minimap2:2.21--h5bf99c6_0"
|
||||||
} else {
|
} else {
|
||||||
container "quay.io/biocontainers/minimap2:2.17--hed695b0_3"
|
container "quay.io/biocontainers/minimap2:2.21--h5bf99c6_0"
|
||||||
}
|
}
|
||||||
|
|
||||||
input:
|
input:
|
||||||
|
|
|
@ -11,11 +11,11 @@ process MOSDEPTH {
|
||||||
mode: params.publish_dir_mode,
|
mode: params.publish_dir_mode,
|
||||||
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
||||||
|
|
||||||
conda (params.enable_conda ? 'bioconda::mosdepth=0.3.1' : null)
|
conda (params.enable_conda ? 'bioconda::mosdepth=0.3.2' : null)
|
||||||
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
container "https://depot.galaxyproject.org/singularity/mosdepth:0.3.1--ha7ba039_0"
|
container "https://depot.galaxyproject.org/singularity/mosdepth:0.3.2--h01d7912_0"
|
||||||
} else {
|
} else {
|
||||||
container "quay.io/biocontainers/mosdepth:0.3.1--ha7ba039_0"
|
container "quay.io/biocontainers/mosdepth:0.3.2--h01d7912_0"
|
||||||
}
|
}
|
||||||
|
|
||||||
input:
|
input:
|
||||||
|
|
|
@ -10,11 +10,11 @@ process MULTIQC {
|
||||||
mode: params.publish_dir_mode,
|
mode: params.publish_dir_mode,
|
||||||
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:[:], publish_by_meta:[]) }
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:[:], publish_by_meta:[]) }
|
||||||
|
|
||||||
conda (params.enable_conda ? "bioconda::multiqc=1.10.1" : null)
|
conda (params.enable_conda ? 'bioconda::multiqc=1.11' : null)
|
||||||
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
container "https://depot.galaxyproject.org/singularity/multiqc:1.10.1--py_0"
|
container "https://depot.galaxyproject.org/singularity/multiqc:1.11--pyhdfd78af_0"
|
||||||
} else {
|
} else {
|
||||||
container "quay.io/biocontainers/multiqc:1.10.1--py_0"
|
container "quay.io/biocontainers/multiqc:1.11--pyhdfd78af_0"
|
||||||
}
|
}
|
||||||
|
|
||||||
input:
|
input:
|
||||||
|
|
|
@ -11,11 +11,11 @@ process NANOPLOT {
|
||||||
mode: params.publish_dir_mode,
|
mode: params.publish_dir_mode,
|
||||||
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
||||||
|
|
||||||
conda (params.enable_conda ? "bioconda::nanoplot=1.36.1" : null)
|
conda (params.enable_conda ? 'bioconda::nanoplot=1.38.0' : null)
|
||||||
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
container "https://depot.galaxyproject.org/singularity/nanoplot:1.36.1--pyhdfd78af_0"
|
container "https://depot.galaxyproject.org/singularity/nanoplot:1.38.0--pyhdfd78af_0"
|
||||||
} else {
|
} else {
|
||||||
container "quay.io/biocontainers/nanoplot:1.36.1--pyhdfd78af_0"
|
container "quay.io/biocontainers/nanoplot:1.38.0--pyhdfd78af_0"
|
||||||
}
|
}
|
||||||
|
|
||||||
input:
|
input:
|
||||||
|
|
|
@ -20,35 +20,28 @@ process NEXTCLADE {
|
||||||
|
|
||||||
input:
|
input:
|
||||||
tuple val(meta), path(fasta)
|
tuple val(meta), path(fasta)
|
||||||
val output_format
|
|
||||||
|
|
||||||
output:
|
output:
|
||||||
tuple val(meta), path("${prefix}.csv") , optional:true, emit: csv
|
tuple val(meta), path("${prefix}.csv") , emit: csv
|
||||||
tuple val(meta), path("${prefix}.json") , optional:true, emit: json
|
tuple val(meta), path("${prefix}.json") , emit: json
|
||||||
tuple val(meta), path("${prefix}.tree.json") , optional:true, emit: json_tree
|
tuple val(meta), path("${prefix}.tree.json") , emit: json_tree
|
||||||
tuple val(meta), path("${prefix}.tsv") , optional:true, emit: tsv
|
tuple val(meta), path("${prefix}.tsv") , emit: tsv
|
||||||
tuple val(meta), path("${prefix}.clades.tsv"), optional:true, emit: tsv_clades
|
tuple val(meta), path("${prefix}.clades.tsv"), optional:true, emit: tsv_clades
|
||||||
path "*.version.txt" , emit: version
|
path "*.version.txt" , emit: version
|
||||||
|
|
||||||
script:
|
script:
|
||||||
def software = getSoftwareName(task.process)
|
def software = getSoftwareName(task.process)
|
||||||
prefix = options.suffix ? "${meta.id}${options.suffix}" : "${meta.id}"
|
prefix = options.suffix ? "${meta.id}${options.suffix}" : "${meta.id}"
|
||||||
def format = output_format
|
|
||||||
if (!(format in ['json', 'csv', 'tsv', 'tree', 'tsv-clades-only'])) {
|
|
||||||
format = 'json'
|
|
||||||
}
|
|
||||||
def extension = format
|
|
||||||
if (format in ['tsv-clades-only']) {
|
|
||||||
extension = '.clades.tsv'
|
|
||||||
} else if (format in ['tree']) {
|
|
||||||
extension = 'tree.json'
|
|
||||||
}
|
|
||||||
"""
|
"""
|
||||||
nextclade \\
|
nextclade \\
|
||||||
$options.args \\
|
$options.args \\
|
||||||
--jobs $task.cpus \\
|
--jobs $task.cpus \\
|
||||||
--input-fasta $fasta \\
|
--input-fasta $fasta \\
|
||||||
--output-${format} ${prefix}.${extension}
|
--output-json ${prefix}.json \\
|
||||||
|
--output-csv ${prefix}.csv \\
|
||||||
|
--output-tsv ${prefix}.tsv \\
|
||||||
|
--output-tsv-clades-only ${prefix}.clades.tsv \\
|
||||||
|
--output-tree ${prefix}.tree.json
|
||||||
|
|
||||||
echo \$(nextclade --version 2>&1) > ${software}.version.txt
|
echo \$(nextclade --version 2>&1) > ${software}.version.txt
|
||||||
"""
|
"""
|
||||||
|
|
|
@ -11,7 +11,7 @@ tools:
|
||||||
documentation: None
|
documentation: None
|
||||||
tool_dev_url: https://github.com/nextstrain/nextclade
|
tool_dev_url: https://github.com/nextstrain/nextclade
|
||||||
doi: ""
|
doi: ""
|
||||||
licence: ['MIT']
|
licence: ["MIT"]
|
||||||
|
|
||||||
input:
|
input:
|
||||||
- meta:
|
- meta:
|
||||||
|
@ -23,11 +23,6 @@ input:
|
||||||
type: file
|
type: file
|
||||||
description: FASTA file containing one or more consensus sequences
|
description: FASTA file containing one or more consensus sequences
|
||||||
pattern: "*.{fasta,fa}"
|
pattern: "*.{fasta,fa}"
|
||||||
- output_format:
|
|
||||||
type: string
|
|
||||||
description: |
|
|
||||||
String for output format supported by nextclade
|
|
||||||
i.e one of 'json', 'csv', 'tsv', 'tree', 'tsv-clades-only'
|
|
||||||
|
|
||||||
output:
|
output:
|
||||||
- meta:
|
- meta:
|
||||||
|
|
|
@ -11,11 +11,11 @@ process PICARD_COLLECTMULTIPLEMETRICS {
|
||||||
mode: params.publish_dir_mode,
|
mode: params.publish_dir_mode,
|
||||||
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
||||||
|
|
||||||
conda (params.enable_conda ? "bioconda::picard=2.23.9" : null)
|
conda (params.enable_conda ? 'bioconda::picard=2.25.7' : null)
|
||||||
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
container "https://depot.galaxyproject.org/singularity/picard:2.23.9--0"
|
container "https://depot.galaxyproject.org/singularity/picard:2.25.7--hdfd78af_0"
|
||||||
} else {
|
} else {
|
||||||
container "quay.io/biocontainers/picard:2.23.9--0"
|
container "quay.io/biocontainers/picard:2.25.7--hdfd78af_0"
|
||||||
}
|
}
|
||||||
|
|
||||||
input:
|
input:
|
||||||
|
|
|
@ -11,11 +11,11 @@ process PICARD_COLLECTWGSMETRICS {
|
||||||
mode: params.publish_dir_mode,
|
mode: params.publish_dir_mode,
|
||||||
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
||||||
|
|
||||||
conda (params.enable_conda ? "bioconda::picard=2.25.0" : null)
|
conda (params.enable_conda ? 'bioconda::picard=2.25.7' : null)
|
||||||
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
container "https://depot.galaxyproject.org/singularity/picard:2.25.0--0"
|
container "https://depot.galaxyproject.org/singularity/picard:2.25.7--hdfd78af_0"
|
||||||
} else {
|
} else {
|
||||||
container "quay.io/biocontainers/picard:2.25.0--0"
|
container "quay.io/biocontainers/picard:2.25.7--hdfd78af_0"
|
||||||
}
|
}
|
||||||
|
|
||||||
input:
|
input:
|
||||||
|
|
68
modules/picard/filtersamreads/functions.nf
Normal file
68
modules/picard/filtersamreads/functions.nf
Normal file
|
@ -0,0 +1,68 @@
|
||||||
|
//
|
||||||
|
// Utility functions used in nf-core DSL2 module files
|
||||||
|
//
|
||||||
|
|
||||||
|
//
|
||||||
|
// Extract name of software tool from process name using $task.process
|
||||||
|
//
|
||||||
|
def getSoftwareName(task_process) {
|
||||||
|
return task_process.tokenize(':')[-1].tokenize('_')[0].toLowerCase()
|
||||||
|
}
|
||||||
|
|
||||||
|
//
|
||||||
|
// Function to initialise default values and to generate a Groovy Map of available options for nf-core modules
|
||||||
|
//
|
||||||
|
def initOptions(Map args) {
|
||||||
|
def Map options = [:]
|
||||||
|
options.args = args.args ?: ''
|
||||||
|
options.args2 = args.args2 ?: ''
|
||||||
|
options.args3 = args.args3 ?: ''
|
||||||
|
options.publish_by_meta = args.publish_by_meta ?: []
|
||||||
|
options.publish_dir = args.publish_dir ?: ''
|
||||||
|
options.publish_files = args.publish_files
|
||||||
|
options.suffix = args.suffix ?: ''
|
||||||
|
return options
|
||||||
|
}
|
||||||
|
|
||||||
|
//
|
||||||
|
// Tidy up and join elements of a list to return a path string
|
||||||
|
//
|
||||||
|
def getPathFromList(path_list) {
|
||||||
|
def paths = path_list.findAll { item -> !item?.trim().isEmpty() } // Remove empty entries
|
||||||
|
paths = paths.collect { it.trim().replaceAll("^[/]+|[/]+\$", "") } // Trim whitespace and trailing slashes
|
||||||
|
return paths.join('/')
|
||||||
|
}
|
||||||
|
|
||||||
|
//
|
||||||
|
// Function to save/publish module results
|
||||||
|
//
|
||||||
|
def saveFiles(Map args) {
|
||||||
|
if (!args.filename.endsWith('.version.txt')) {
|
||||||
|
def ioptions = initOptions(args.options)
|
||||||
|
def path_list = [ ioptions.publish_dir ?: args.publish_dir ]
|
||||||
|
if (ioptions.publish_by_meta) {
|
||||||
|
def key_list = ioptions.publish_by_meta instanceof List ? ioptions.publish_by_meta : args.publish_by_meta
|
||||||
|
for (key in key_list) {
|
||||||
|
if (args.meta && key instanceof String) {
|
||||||
|
def path = key
|
||||||
|
if (args.meta.containsKey(key)) {
|
||||||
|
path = args.meta[key] instanceof Boolean ? "${key}_${args.meta[key]}".toString() : args.meta[key]
|
||||||
|
}
|
||||||
|
path = path instanceof String ? path : ''
|
||||||
|
path_list.add(path)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if (ioptions.publish_files instanceof Map) {
|
||||||
|
for (ext in ioptions.publish_files) {
|
||||||
|
if (args.filename.endsWith(ext.key)) {
|
||||||
|
def ext_list = path_list.collect()
|
||||||
|
ext_list.add(ext.value)
|
||||||
|
return "${getPathFromList(ext_list)}/$args.filename"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
} else if (ioptions.publish_files == null) {
|
||||||
|
return "${getPathFromList(path_list)}/$args.filename"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
64
modules/picard/filtersamreads/main.nf
Normal file
64
modules/picard/filtersamreads/main.nf
Normal file
|
@ -0,0 +1,64 @@
|
||||||
|
// Import generic module functions
|
||||||
|
include { initOptions; saveFiles; getSoftwareName } from './functions'
|
||||||
|
|
||||||
|
params.options = [:]
|
||||||
|
options = initOptions(params.options)
|
||||||
|
|
||||||
|
process PICARD_FILTERSAMREADS {
|
||||||
|
tag "$meta.id"
|
||||||
|
label 'process_low'
|
||||||
|
publishDir "${params.outdir}",
|
||||||
|
mode: params.publish_dir_mode,
|
||||||
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
||||||
|
|
||||||
|
conda (params.enable_conda ? 'bioconda::picard=2.25.7' : null)
|
||||||
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
|
container "https://depot.galaxyproject.org/singularity/picard:2.25.7--hdfd78af_0"
|
||||||
|
} else {
|
||||||
|
container "quay.io/biocontainers/picard:2.25.7--hdfd78af_0"
|
||||||
|
}
|
||||||
|
|
||||||
|
input:
|
||||||
|
tuple val(meta), path(bam), path(readlist)
|
||||||
|
val filter
|
||||||
|
|
||||||
|
output:
|
||||||
|
tuple val(meta), path("*.bam"), emit: bam
|
||||||
|
path "*.version.txt" , emit: version
|
||||||
|
|
||||||
|
script:
|
||||||
|
def software = getSoftwareName(task.process)
|
||||||
|
def prefix = options.suffix ? "${meta.id}${options.suffix}" : "${meta.id}"
|
||||||
|
def avail_mem = 3
|
||||||
|
if (!task.memory) {
|
||||||
|
log.info '[Picard FilterSamReads] Available memory not known - defaulting to 3GB. Specify process memory requirements to change this.'
|
||||||
|
} else {
|
||||||
|
avail_mem = task.memory.giga
|
||||||
|
}
|
||||||
|
if ( filter == 'includeAligned' || filter == 'excludeAligned' ) {
|
||||||
|
"""
|
||||||
|
picard \\
|
||||||
|
FilterSamReads \\
|
||||||
|
-Xmx${avail_mem}g \\
|
||||||
|
--INPUT $bam \\
|
||||||
|
--OUTPUT ${prefix}.bam \\
|
||||||
|
--FILTER $filter \\
|
||||||
|
$options.args
|
||||||
|
|
||||||
|
echo \$(picard FilterSamReads --version 2>&1) | grep -o 'Version:.*' | cut -f2- -d: > ${software}.version.txt
|
||||||
|
"""
|
||||||
|
} else if ( filter == 'includeReadList' || filter == 'excludeReadList' ) {
|
||||||
|
"""
|
||||||
|
picard \\
|
||||||
|
FilterSamReads \\
|
||||||
|
-Xmx${avail_mem}g \\
|
||||||
|
--INPUT $bam \\
|
||||||
|
--OUTPUT ${prefix}.bam \\
|
||||||
|
--FILTER $filter \\
|
||||||
|
--READ_LIST_FILE $readlist \\
|
||||||
|
$options.args
|
||||||
|
|
||||||
|
echo \$(picard FilterSamReads --version 2>&1) | grep -o 'Version:.*' | cut -f2- -d: > ${software}.version.txt
|
||||||
|
"""
|
||||||
|
}
|
||||||
|
}
|
51
modules/picard/filtersamreads/meta.yml
Normal file
51
modules/picard/filtersamreads/meta.yml
Normal file
|
@ -0,0 +1,51 @@
|
||||||
|
name: picard_filtersamreads
|
||||||
|
description: Filters SAM/BAM files to include or exclude aligned/unaligned reads, or reads named in a read list file
|
||||||
|
keywords:
|
||||||
|
- bam
|
||||||
|
- filter
|
||||||
|
tools:
|
||||||
|
- picard:
|
||||||
|
description: |
|
||||||
|
A set of command line tools (in Java) for manipulating high-throughput sequencing (HTS)
|
||||||
|
data and formats such as SAM/BAM/CRAM and VCF.
|
||||||
|
homepage: https://broadinstitute.github.io/picard/
|
||||||
|
documentation: https://broadinstitute.github.io/picard/
|
||||||
|
tool_dev_url: https://github.com/broadinstitute/picard
|
||||||
|
doi: ""
|
||||||
|
licence: ['MIT']
|
||||||
|
|
||||||
|
input:
|
||||||
|
- meta:
|
||||||
|
type: map
|
||||||
|
description: |
|
||||||
|
Groovy Map containing sample information
|
||||||
|
e.g. [ id:'test', single_end:false ]
|
||||||
|
- bam:
|
||||||
|
type: file
|
||||||
|
description: List of BAM files. If filtering without a read list, the input must be queryname-sorted first (e.g. with picard SortSam)
|
||||||
|
pattern: "*.{bam}"
|
||||||
|
- filter:
|
||||||
|
type: value
|
||||||
|
description: Picard filter type
|
||||||
|
pattern: "includeAligned|excludeAligned|includeReadList|excludeReadList"
|
||||||
|
- readlist:
|
||||||
|
type: file
|
||||||
|
description: Optional text file containing read IDs to include or exclude
|
||||||
|
|
||||||
|
output:
|
||||||
|
- meta:
|
||||||
|
type: map
|
||||||
|
description: |
|
||||||
|
Groovy Map containing sample information
|
||||||
|
e.g. [ id:'test', single_end:false ]
|
||||||
|
- bam:
|
||||||
|
type: file
|
||||||
|
description: Filtered BAM file
|
||||||
|
pattern: "*.{bam}"
|
||||||
|
- version:
|
||||||
|
type: file
|
||||||
|
description: File containing software version
|
||||||
|
pattern: "*.{version.txt}"
|
||||||
|
|
||||||
|
authors:
|
||||||
|
- "@jfy133"
|
|
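As a rough illustration (not part of this commit), the new `PICARD_FILTERSAMREADS` module could be wired up as below; the meta map and file paths are hypothetical placeholders, and the `filter` value must be one of the four options listed in the meta.yml above.

```nextflow
// Hypothetical call sketch only: the meta map and file paths are placeholders.
include { PICARD_FILTERSAMREADS } from './modules/picard/filtersamreads/main' addParams( options: [:] )

workflow {
    // Keep reads named in a list file; for 'includeAligned'/'excludeAligned' the read list slot can be [].
    ch_input = Channel.of( [ [ id:'test', single_end:false ], file(params.bam), file(params.read_list) ] )
    PICARD_FILTERSAMREADS ( ch_input, 'includeReadList' )
}
```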
@ -11,11 +11,11 @@ process PICARD_MARKDUPLICATES {
|
||||||
mode: params.publish_dir_mode,
|
mode: params.publish_dir_mode,
|
||||||
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
||||||
|
|
||||||
conda (params.enable_conda ? "bioconda::picard=2.23.9" : null)
|
conda (params.enable_conda ? 'bioconda::picard=2.25.7' : null)
|
||||||
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
container "https://depot.galaxyproject.org/singularity/picard:2.23.9--0"
|
container "https://depot.galaxyproject.org/singularity/picard:2.25.7--hdfd78af_0"
|
||||||
} else {
|
} else {
|
||||||
container "quay.io/biocontainers/picard:2.23.9--0"
|
container "quay.io/biocontainers/picard:2.25.7--hdfd78af_0"
|
||||||
}
|
}
|
||||||
|
|
||||||
input:
|
input:
|
||||||
|
|
|
@ -11,11 +11,11 @@ process PICARD_MERGESAMFILES {
|
||||||
mode: params.publish_dir_mode,
|
mode: params.publish_dir_mode,
|
||||||
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
||||||
|
|
||||||
conda (params.enable_conda ? "bioconda::picard=2.23.9" : null)
|
conda (params.enable_conda ? 'bioconda::picard=2.25.7' : null)
|
||||||
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
container "https://depot.galaxyproject.org/singularity/picard:2.23.9--0"
|
container "https://depot.galaxyproject.org/singularity/picard:2.25.7--hdfd78af_0"
|
||||||
} else {
|
} else {
|
||||||
container "quay.io/biocontainers/picard:2.23.9--0"
|
container "quay.io/biocontainers/picard:2.25.7--hdfd78af_0"
|
||||||
}
|
}
|
||||||
|
|
||||||
input:
|
input:
|
||||||
|
|
68
modules/picard/sortsam/functions.nf
Normal file
68
modules/picard/sortsam/functions.nf
Normal file
|
@ -0,0 +1,68 @@
|
||||||
|
//
|
||||||
|
// Utility functions used in nf-core DSL2 module files
|
||||||
|
//
|
||||||
|
|
||||||
|
//
|
||||||
|
// Extract name of software tool from process name using $task.process
|
||||||
|
//
|
||||||
|
def getSoftwareName(task_process) {
|
||||||
|
return task_process.tokenize(':')[-1].tokenize('_')[0].toLowerCase()
|
||||||
|
}
|
||||||
|
|
||||||
|
//
|
||||||
|
// Function to initialise default values and to generate a Groovy Map of available options for nf-core modules
|
||||||
|
//
|
||||||
|
def initOptions(Map args) {
|
||||||
|
def Map options = [:]
|
||||||
|
options.args = args.args ?: ''
|
||||||
|
options.args2 = args.args2 ?: ''
|
||||||
|
options.args3 = args.args3 ?: ''
|
||||||
|
options.publish_by_meta = args.publish_by_meta ?: []
|
||||||
|
options.publish_dir = args.publish_dir ?: ''
|
||||||
|
options.publish_files = args.publish_files
|
||||||
|
options.suffix = args.suffix ?: ''
|
||||||
|
return options
|
||||||
|
}
|
||||||
|
|
||||||
|
//
|
||||||
|
// Tidy up and join elements of a list to return a path string
|
||||||
|
//
|
||||||
|
def getPathFromList(path_list) {
|
||||||
|
def paths = path_list.findAll { item -> !item?.trim().isEmpty() } // Remove empty entries
|
||||||
|
paths = paths.collect { it.trim().replaceAll("^[/]+|[/]+\$", "") } // Trim whitespace and trailing slashes
|
||||||
|
return paths.join('/')
|
||||||
|
}
|
||||||
|
|
||||||
|
//
|
||||||
|
// Function to save/publish module results
|
||||||
|
//
|
||||||
|
def saveFiles(Map args) {
|
||||||
|
if (!args.filename.endsWith('.version.txt')) {
|
||||||
|
def ioptions = initOptions(args.options)
|
||||||
|
def path_list = [ ioptions.publish_dir ?: args.publish_dir ]
|
||||||
|
if (ioptions.publish_by_meta) {
|
||||||
|
def key_list = ioptions.publish_by_meta instanceof List ? ioptions.publish_by_meta : args.publish_by_meta
|
||||||
|
for (key in key_list) {
|
||||||
|
if (args.meta && key instanceof String) {
|
||||||
|
def path = key
|
||||||
|
if (args.meta.containsKey(key)) {
|
||||||
|
path = args.meta[key] instanceof Boolean ? "${key}_${args.meta[key]}".toString() : args.meta[key]
|
||||||
|
}
|
||||||
|
path = path instanceof String ? path : ''
|
||||||
|
path_list.add(path)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if (ioptions.publish_files instanceof Map) {
|
||||||
|
for (ext in ioptions.publish_files) {
|
||||||
|
if (args.filename.endsWith(ext.key)) {
|
||||||
|
def ext_list = path_list.collect()
|
||||||
|
ext_list.add(ext.value)
|
||||||
|
return "${getPathFromList(ext_list)}/$args.filename"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
} else if (ioptions.publish_files == null) {
|
||||||
|
return "${getPathFromList(path_list)}/$args.filename"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
49
modules/picard/sortsam/main.nf
Normal file
49
modules/picard/sortsam/main.nf
Normal file
|
@ -0,0 +1,49 @@
|
||||||
|
|
||||||
|
// Import generic module functions
|
||||||
|
include { initOptions; saveFiles; getSoftwareName } from './functions'
|
||||||
|
|
||||||
|
params.options = [:]
|
||||||
|
options = initOptions(params.options)
|
||||||
|
|
||||||
|
process PICARD_SORTSAM {
|
||||||
|
tag "$meta.id"
|
||||||
|
label 'process_low'
|
||||||
|
publishDir "${params.outdir}",
|
||||||
|
mode: params.publish_dir_mode,
|
||||||
|
saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }
|
||||||
|
|
||||||
|
conda (params.enable_conda ? 'bioconda::picard=2.25.7' : null)
|
||||||
|
if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
|
||||||
|
container "https://depot.galaxyproject.org/singularity/picard:2.25.7--hdfd78af_0"
|
||||||
|
} else {
|
||||||
|
container "quay.io/biocontainers/picard:2.25.7--hdfd78af_0"
|
||||||
|
}
|
||||||
|
|
||||||
|
input:
|
||||||
|
tuple val(meta), path(bam)
|
||||||
|
val sort_order
|
||||||
|
|
||||||
|
output:
|
||||||
|
tuple val(meta), path("*.bam"), emit: bam
|
||||||
|
path "*.version.txt" , emit: version
|
||||||
|
|
||||||
|
script:
|
||||||
|
def software = getSoftwareName(task.process)
|
||||||
|
def prefix = options.suffix ? "${meta.id}${options.suffix}" : "${meta.id}"
|
||||||
|
def avail_mem = 3
|
||||||
|
if (!task.memory) {
|
||||||
|
log.info '[Picard SortSam] Available memory not known - defaulting to 3GB. Specify process memory requirements to change this.'
|
||||||
|
} else {
|
||||||
|
avail_mem = task.memory.giga
|
||||||
|
}
|
||||||
|
"""
|
||||||
|
picard \\
|
||||||
|
SortSam \\
|
||||||
|
-Xmx${avail_mem}g \\
|
||||||
|
--INPUT $bam \\
|
||||||
|
--OUTPUT ${prefix}.bam \\
|
||||||
|
--SORT_ORDER $sort_order
|
||||||
|
|
||||||
|
echo \$(picard SortSam --version 2>&1) | grep -o 'Version:.*' | cut -f2- -d: > ${software}.version.txt
|
||||||
|
"""
|
||||||
|
}
|
47
modules/picard/sortsam/meta.yml
Normal file
47
modules/picard/sortsam/meta.yml
Normal file
|
@ -0,0 +1,47 @@
|
||||||
|
name: picard_sortsam
|
||||||
|
description: Sorts BAM/SAM files based on a variety of Picard-specific criteria
|
||||||
|
keywords:
|
||||||
|
- sort
|
||||||
|
- bam
|
||||||
|
- sam
|
||||||
|
tools:
|
||||||
|
- picard:
|
||||||
|
description: |
|
||||||
|
A set of command line tools (in Java) for manipulating high-throughput sequencing (HTS)
|
||||||
|
data and formats such as SAM/BAM/CRAM and VCF.
|
||||||
|
homepage: https://broadinstitute.github.io/picard/
|
||||||
|
documentation: https://broadinstitute.github.io/picard/
|
||||||
|
|
||||||
|
input:
|
||||||
|
- meta:
|
||||||
|
type: map
|
||||||
|
description: |
|
||||||
|
Groovy Map containing sample information
|
||||||
|
e.g. [ id:'test', single_end:false ]
|
||||||
|
- bam:
|
||||||
|
type: file
|
||||||
|
description: BAM/CRAM/SAM file
|
||||||
|
pattern: "*.{bam,sam}"
|
||||||
|
- sort_order:
|
||||||
|
type: value
|
||||||
|
description: Picard sort order type
|
||||||
|
pattern: "unsorted|queryname|coordinate|duplicate|unknown"
|
||||||
|
|
||||||
|
output:
|
||||||
|
- meta:
|
||||||
|
type: map
|
||||||
|
description: |
|
||||||
|
Groovy Map containing sample information
|
||||||
|
e.g. [ id:'test', single_end:false ]
|
||||||
|
- version:
|
||||||
|
type: file
|
||||||
|
description: File containing software version
|
||||||
|
pattern: "*.{version.txt}"
|
||||||
|
- bam:
|
||||||
|
type: file
|
||||||
|
description: Sorted BAM/CRAM/SAM file
|
||||||
|
pattern: "*.{bam}"
|
||||||
|
|
||||||
|
|
||||||
|
authors:
|
||||||
|
- "@jfy133"
|
|
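Similarly, a minimal call sketch for the new `PICARD_SORTSAM` module; the BAM path and meta map are hypothetical placeholders, and `options.suffix` is used so the sorted output does not clash with the input name.

```nextflow
// Hypothetical call sketch only: the BAM path and meta map are placeholders.
include { PICARD_SORTSAM } from './modules/picard/sortsam/main' addParams( options: [ suffix: '.sorted' ] )

workflow {
    ch_bam = Channel.of( [ [ id:'test', single_end:false ], file(params.bam) ] )
    PICARD_SORTSAM ( ch_bam, 'queryname' )   // sort_order: unsorted|queryname|coordinate|duplicate|unknown
}
```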
@ -1,32 +1,27 @@
|
||||||
name: prodigal
|
name: prodigal
|
||||||
## TODO nf-core: Add a description of the module and list keywords
|
description: Prodigal (Prokaryotic Dynamic Programming Genefinding Algorithm) is a microbial (bacterial and archaeal) gene finding program
|
||||||
description: write your description here
|
|
||||||
keywords:
|
keywords:
|
||||||
- sort
|
- sort
|
||||||
tools:
|
tools:
|
||||||
- prodigal:
|
- prodigal:
|
||||||
## TODO nf-core: Add a description and other details for the software below
|
|
||||||
description: Prodigal (Prokaryotic Dynamic Programming Genefinding Algorithm) is a microbial (bacterial and archaeal) gene finding program
|
description: Prodigal (Prokaryotic Dynamic Programming Genefinding Algorithm) is a microbial (bacterial and archaeal) gene finding program
|
||||||
homepage: {}
|
homepage: {}
|
||||||
documentation: {}
|
documentation: {}
|
||||||
tool_dev_url: {}
|
tool_dev_url: {}
|
||||||
doi: ""
|
doi: ""
|
||||||
licence: ['GPL v3']
|
licence: ["GPL v3"]
|
||||||
|
|
||||||
## TODO nf-core: Add a description of all of the variables used as input
|
|
||||||
input:
|
input:
|
||||||
- meta:
|
- meta:
|
||||||
type: map
|
type: map
|
||||||
description: |
|
description: |
|
||||||
Groovy Map containing sample information
|
Groovy Map containing sample information
|
||||||
e.g. [ id:'test', single_end:false ]
|
e.g. [ id:'test', single_end:false ]
|
||||||
## TODO nf-core: Delete / customise this example input
|
|
||||||
- bam:
|
- bam:
|
||||||
type: file
|
type: file
|
||||||
description: BAM/CRAM/SAM file
|
description: BAM/CRAM/SAM file
|
||||||
pattern: "*.{bam,cram,sam}"
|
pattern: "*.{bam,cram,sam}"
|
||||||
|
|
||||||
## TODO nf-core: Add a description of all of the variables used as output
|
|
||||||
output:
|
output:
|
||||||
- meta:
|
- meta:
|
||||||
type: map
|
type: map
|
||||||
|
@ -37,7 +32,6 @@ output:
|
||||||
type: file
|
type: file
|
||||||
description: File containing software version
|
description: File containing software version
|
||||||
pattern: "*.{version.txt}"
|
pattern: "*.{version.txt}"
|
||||||
## TODO nf-core: Delete / customise this example output
|
|
||||||
- bam:
|
- bam:
|
||||||
type: file
|
type: file
|
||||||
description: Sorted BAM/CRAM/SAM file
|
description: Sorted BAM/CRAM/SAM file
|
||||||
|
|
|
@@ -10,11 +10,11 @@ process RAXMLNG {
         mode: params.publish_dir_mode,
         saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:[:], publish_by_meta:[]) }

-    conda (params.enable_conda ? "bioconda::raxml-ng=1.0.2" : null)
+    conda (params.enable_conda ? 'bioconda::raxml-ng=1.0.3' : null)
     if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
-        container "https://depot.galaxyproject.org/singularity/raxml-ng:1.0.2--h7447c1b_0"
+        container "https://depot.galaxyproject.org/singularity/raxml-ng:1.0.3--h32fcf60_0"
     } else {
-        container "quay.io/biocontainers/raxml-ng:1.0.2--h7447c1b_0"
+        container "quay.io/biocontainers/raxml-ng:1.0.3--h32fcf60_0"
     }

     input:
@@ -11,7 +11,7 @@ process RSEQC_BAMSTAT {
         mode: params.publish_dir_mode,
         saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? "bioconda::rseqc=3.0.1" : null)
+    conda (params.enable_conda ? "bioconda::rseqc=3.0.1 'conda-forge::r-base>=3.5'" : null)
     if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
         container "https://depot.galaxyproject.org/singularity/rseqc:3.0.1--py37h516909a_1"
     } else {
@@ -11,7 +11,7 @@ process RSEQC_INFEREXPERIMENT {
         mode: params.publish_dir_mode,
         saveAs: { filename -> saveFiles(filename:filename, options:params.options, publish_dir:getSoftwareName(task.process), meta:meta, publish_by_meta:['id']) }

-    conda (params.enable_conda ? "bioconda::rseqc=3.0.1" : null)
+    conda (params.enable_conda ? "bioconda::rseqc=3.0.1 'conda-forge::r-base>=3.5'" : null)
     if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
         container "https://depot.galaxyproject.org/singularity/rseqc:3.0.1--py37h516909a_1"
     } else {
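Both RSeQC changes follow the same pattern: Nextflow's conda directive takes a single string in which multiple package specs are space-separated, so an R runtime can be pinned alongside rseqc itself, with the r-base spec quoted so its >= constraint survives when the environment is built. A minimal standalone sketch of that pattern (the process name and script body are illustrative, loosely modelled on the rseqc/bamstat module, and not a verbatim part of this diff):

    process RSEQC_EXAMPLE {
        // Two Conda package specs in one space-separated string; the quoted
        // r-base spec keeps the ">=3.5" version constraint intact.
        conda (params.enable_conda ? "bioconda::rseqc=3.0.1 'conda-forge::r-base>=3.5'" : null)

        input:
        tuple val(meta), path(bam)

        output:
        tuple val(meta), path("*.bam_stat.txt"), emit: txt

        script:
        """
        bam_stat.py -i $bam > ${meta.id}.bam_stat.txt
        """
    }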
Some files were not shown because too many files have changed in this diff.