Mirror of https://github.com/MillironX/nf-core_modules.git, synced 2024-12-22 02:58:17 +00:00.

Merge remote-tracking branch 'upstream/master' into sexdeterrmine

This commit is contained in: commit 37c6fb8fed
382 changed files with 7609 additions and 1126 deletions
64 .github/ISSUE_TEMPLATE/bug_report.md (vendored)
@@ -1,64 +0,0 @@
----
-name: Bug report
-about: Report something that is broken or incorrect
-title: "[BUG]"
----
-
-<!--
-# nf-core/module bug report
-
-Hi there!
-
-Thanks for telling us about a problem with the modules.
-Please delete this text and anything that's not relevant from the template below:
--->
-
-## Check Documentation
-
-I have checked the following places for your error:
-
-- [ ] [nf-core website: troubleshooting](https://nf-co.re/usage/troubleshooting)
-- [ ] [nf-core/module documentation](https://github.com/nf-core/modules/blob/master/README.md)
-
-## Description of the bug
-
-<!-- A clear and concise description of what the bug is. -->
-
-## Steps to reproduce
-
-Steps to reproduce the behaviour:
-
-1. Command line: <!-- [e.g. `nextflow run ...`] -->
-2. See error: <!-- [Please provide your error message] -->
-
-## Expected behaviour
-
-<!-- A clear and concise description of what you expected to happen. -->
-
-## Log files
-
-Have you provided the following extra information/files:
-
-- [ ] The command used to run the module
-- [ ] The `.nextflow.log` file <!-- this is a hidden file in the directory where you launched the module -->
-
-## System
-
-- Hardware: <!-- [e.g. HPC, Desktop, Cloud...] -->
-- Executor: <!-- [e.g. slurm, local, awsbatch...] -->
-- OS: <!-- [e.g. CentOS Linux, macOS, Linux Mint...] -->
-- Version <!-- [e.g. 7, 10.13.6, 18.3...] -->
-
-## Nextflow Installation
-
-- Version: <!-- [e.g. 19.10.0] -->
-
-## Container engine
-
-- Engine: <!-- [e.g. Conda, Docker, Singularity or Podman] -->
-- version: <!-- [e.g. 1.0.0] -->
-- Image tag: <!-- [e.g. nfcore/module:2.6] -->
-
-## Additional context
-
-<!-- Add any other context about the problem here. -->
52 .github/ISSUE_TEMPLATE/bug_report.yml (vendored, new file)
@@ -0,0 +1,52 @@
+name: Bug report
+description: Report something that is broken or incorrect
+labels: bug
+body:
+  - type: checkboxes
+    attributes:
+      label: Have you checked the docs?
+      description: I have checked the following places for my error
+      options:
+        - label: "[nf-core website: troubleshooting](https://nf-co.re/usage/troubleshooting)"
+          required: true
+        - label: "[nf-core modules documentation](https://nf-co.re/docs/contributing/modules)"
+          required: true
+
+  - type: textarea
+    id: description
+    attributes:
+      label: Description of the bug
+      description: A clear and concise description of what the bug is.
+    validations:
+      required: true
+
+  - type: textarea
+    id: command_used
+    attributes:
+      label: Command used and terminal output
+      description: Steps to reproduce the behaviour. Please paste the command you used to launch the pipeline and the output from your terminal.
+      render: console
+      placeholder: |
+        $ nextflow run ...
+
+        Some output where something broke
+
+  - type: textarea
+    id: files
+    attributes:
+      label: Relevant files
+      description: |
+        Please drag and drop the relevant files here. Create a `.zip` archive if the extension is not allowed.
+        Your verbose log file `.nextflow.log` is often useful _(this is a hidden file in the directory where you launched the pipeline)_ as well as custom Nextflow configuration files.
+
+  - type: textarea
+    id: system
+    attributes:
+      label: System information
+      description: |
+        * Nextflow version _(eg. 21.10.3)_
+        * Hardware _(eg. HPC, Desktop, Cloud)_
+        * Executor _(eg. slurm, local, awsbatch)_
+        * Container engine and version: _(e.g. Docker 1.0.0, Singularity, Conda, Podman, Shifter or Charliecloud)_
+        * OS and version: _(eg. CentOS Linux, macOS, Ubuntu 22.04)_
+        * Image tag: <!-- [e.g. nfcore/cellranger:2.6] -->
32 .github/ISSUE_TEMPLATE/feature_request.md (vendored)
@@ -1,32 +0,0 @@
----
-name: Feature request
-about: Suggest an idea for nf-core/modules
-title: "[FEATURE]"
----
-
-<!--
-# nf-core/modules feature request
-
-Hi there!
-
-Thanks for suggesting a new feature for the modules!
-Please delete this text and anything that's not relevant from the template below:
--->
-
-## Is your feature request related to a problem? Please describe
-
-<!-- A clear and concise description of what the problem is. -->
-
-<!-- e.g. [I'm always frustrated when ...] -->
-
-## Describe the solution you'd like
-
-<!-- A clear and concise description of what you want to happen. -->
-
-## Describe alternatives you've considered
-
-<!-- A clear and concise description of any alternative solutions or features you've considered. -->
-
-## Additional context
-
-<!-- Add any other context about the feature request here. -->
32 .github/ISSUE_TEMPLATE/feature_request.yml (vendored, new file)
@@ -0,0 +1,32 @@
+name: Feature request
+description: Suggest an idea for nf-core/modules
+labels: feature
+title: "[FEATURE]"
+body:
+  - type: textarea
+    id: description
+    attributes:
+      label: Is your feature request related to a problem? Please describe
+      description: A clear and concise description of what the problem is.
+      placeholder: |
+        <!-- e.g. [I'm always frustrated when ...] -->
+    validations:
+      required: true
+
+  - type: textarea
+    id: solution
+    attributes:
+      label: Describe the solution you'd like
+      description: A clear and concise description of the solution you want to happen.
+
+  - type: textarea
+    id: alternatives
+    attributes:
+      label: Describe alternatives you've considered
+      description: A clear and concise description of any alternative solutions or features you've considered.
+
+  - type: textarea
+    id: additional_context
+    attributes:
+      label: Additional context
+      description: Add any other context about the feature request here.
26 .github/ISSUE_TEMPLATE/new_module.md (vendored)
@@ -1,26 +0,0 @@
----
-name: New module
-about: Suggest a new module for nf-core/modules
-title: "new module: TOOL/SUBTOOL"
-label: new module
----
-
-<!--
-# nf-core/modules new module suggestion
-
-Hi there!
-
-Thanks for suggesting a new module for the modules!
-Please delete this text and anything that's not relevant from the template below:
-
-Replace TOOL with the bioconda name for the tool in the following text, so that the link is functional.
-
-Replace TOOL/SUBTOOL in the issue title so that it's understandable.
--->
-
-I think it would be good to have a module for [TOOL](https://bioconda.github.io/recipes/TOOL/README.html)
-
-- [ ] This module does not exist yet with the [`nf-core modules list`](https://github.com/nf-core/tools#list-modules) command
-- [ ] There is no [open pull request](https://github.com/nf-core/modules/pulls) for this module
-- [ ] There is no [open issue](https://github.com/nf-core/modules/issues) for this module
-- [ ] If I'm planning to work on this module, I added myself to the `Assignees` to facilitate tracking who is working on the module
36 .github/ISSUE_TEMPLATE/new_module.yml (vendored, new file)
@@ -0,0 +1,36 @@
+name: New module
+description: Suggest a new module for nf-core/modules
+title: "new module: TOOL/SUBTOOL"
+labels: new module
+body:
+  - type: checkboxes
+    attributes:
+      label: Is there an existing module for this?
+      description: This module does not exist yet with the [`nf-core modules list`](https://github.com/nf-core/tools#list-modules) command
+      options:
+        - label: I have searched for the existing module
+          required: true
+
+  - type: checkboxes
+    attributes:
+      label: Is there an open PR for this?
+      description: There is no [open pull request](https://github.com/nf-core/modules/pulls) for this module
+      options:
+        - label: I have searched for existing PRs
+          required: true
+
+  - type: checkboxes
+    attributes:
+      label: Is there an open issue for this?
+      description: There is no [open issue](https://github.com/nf-core/modules/issues) for this module
+      options:
+        - label: I have searched for existing issues
+          required: true
+
+  - type: checkboxes
+    attributes:
+      label: Are you going to work on this?
+      description: If I'm planning to work on this module, I added myself to the `Assignees` to facilitate tracking who is working on the module
+      options:
+        - label: If I'm planning to work on this module, I added myself to the `Assignees` to facilitate tracking who is working on the module
+          required: false
68 modules/antismash/antismashlite/main.nf (new file)
@@ -0,0 +1,68 @@
+process ANTISMASH_ANTISMASHLITE {
+    tag "$meta.id"
+    label 'process_medium'
+
+    conda (params.enable_conda ? "bioconda::antismash-lite=6.0.1" : null)
+    container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
+        'https://depot.galaxyproject.org/singularity/antismash-lite:6.0.1--pyhdfd78af_1' :
+        'quay.io/biocontainers/antismash-lite:6.0.1--pyhdfd78af_1' }"
+
+    containerOptions {
+        workflow.containerEngine == 'singularity' ?
+        "-B $antismash_dir:/usr/local/lib/python3.8/site-packages/antismash" :
+        workflow.containerEngine == 'docker' ?
+        "-v \$PWD/$antismash_dir:/usr/local/lib/python3.8/site-packages/antismash" :
+        ''
+    }
+
+    input:
+    tuple val(meta), path(sequence_input)
+    path(databases)
+    path(antismash_dir) // Optional input: AntiSMASH installation folder. It is not needed for using this module with conda, but required for docker/singularity (see meta.yml).
+    path(gff)
+
+    output:
+    tuple val(meta), path("${prefix}/clusterblast/*_c*.txt")               , optional: true, emit: clusterblast_file
+    tuple val(meta), path("${prefix}/{css,images,js}")                     , emit: html_accessory_files
+    tuple val(meta), path("${prefix}/knownclusterblast/region*/ctg*.html") , optional: true, emit: knownclusterblast_html
+    tuple val(meta), path("${prefix}/knownclusterblast/*_c*.txt")          , optional: true, emit: knownclusterblast_txt
+    tuple val(meta), path("${prefix}/svg/clusterblast*.svg")               , optional: true, emit: svg_files_clusterblast
+    tuple val(meta), path("${prefix}/svg/knownclusterblast*.svg")          , optional: true, emit: svg_files_knownclusterblast
+    tuple val(meta), path("${prefix}/*.gbk")                               , emit: gbk_input
+    tuple val(meta), path("${prefix}/*.json")                              , emit: json_results
+    tuple val(meta), path("${prefix}/*.log")                               , emit: log
+    tuple val(meta), path("${prefix}/*.zip")                               , emit: zip
+    tuple val(meta), path("${prefix}/*region*.gbk")                        , emit: gbk_results
+    tuple val(meta), path("${prefix}/clusterblastoutput.txt")              , optional: true, emit: clusterblastoutput
+    tuple val(meta), path("${prefix}/index.html")                          , emit: html
+    tuple val(meta), path("${prefix}/knownclusterblastoutput.txt")         , optional: true, emit: knownclusterblastoutput
+    tuple val(meta), path("${prefix}/regions.js")                          , emit: json_sideloading
+    path "versions.yml"                                                    , emit: versions
+
+    when:
+    task.ext.when == null || task.ext.when
+
+    script:
+    def args = task.ext.args ?: ''
+    prefix = task.ext.suffix ? "${meta.id}${task.ext.suffix}" : "${meta.id}"
+    gff_flag = "--genefinding-gff3 ${gff}"
+
+    """
+    ## We specifically do not include annotations (--genefinding-tool none) as
+    ## this should be run as a separate module for versioning purposes
+    antismash \\
+        $args \\
+        $gff_flag \\
+        -c $task.cpus \\
+        --output-dir $prefix \\
+        --genefinding-tool none \\
+        --logfile $prefix/${prefix}.log \\
+        --databases $databases \\
+        $sequence_input

+    cat <<-END_VERSIONS > versions.yml
+    "${task.process}":
+        antismash-lite: \$(antismash --version | sed 's/antiSMASH //')
+    END_VERSIONS
+    """
+}
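A minimal wiring sketch for the two antiSMASH modules in this commit (not part of the commit itself; include paths, channel names and file names are illustrative assumptions), showing how the download module's antismash_dir output satisfies the optional third input so the installation folder is mounted under docker/singularity:

    // Sketch only: include paths, channels and file names are assumed for illustration.
    include { ANTISMASH_ANTISMASHLITEDOWNLOADDATABASES } from './modules/antismash/antismashlitedownloaddatabases/main'
    include { ANTISMASH_ANTISMASHLITE                  } from './modules/antismash/antismashlite/main'

    workflow {
        ch_gbk = Channel.of([ [ id:'test' ], file('genome.gbk') ]) // hypothetical annotated input
        ch_gff = Channel.of(file('genome.gff'))                    // hypothetical annotations for --genefinding-gff3

        // database_css / database_detection / database_modules channels per its meta.yml (hypothetical here)
        ANTISMASH_ANTISMASHLITEDOWNLOADDATABASES ( ch_css, ch_detection, ch_modules )

        ANTISMASH_ANTISMASHLITE (
            ch_gbk,
            ANTISMASH_ANTISMASHLITEDOWNLOADDATABASES.out.database,      // downloaded databases
            ANTISMASH_ANTISMASHLITEDOWNLOADDATABASES.out.antismash_dir, // mounted via containerOptions above
            ch_gff
        )
    }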
128 modules/antismash/antismashlite/meta.yml (new file)
@@ -0,0 +1,128 @@
+name: antismash_antismashlite
+description: |
+  antiSMASH allows the rapid genome-wide identification, annotation
+  and analysis of secondary metabolite biosynthesis gene clusters.
+keywords:
+  - secondary metabolites
+  - BGC
+  - biosynthetic gene cluster
+  - genome mining
+  - NRPS
+  - RiPP
+  - antibiotics
+  - prokaryotes
+  - bacteria
+  - eukaryotes
+  - fungi
+  - antismash
+
+tools:
+  - antismashlite:
+      description: "antiSMASH - the antibiotics and Secondary Metabolite Analysis SHell"
+      homepage: "https://docs.antismash.secondarymetabolites.org"
+      documentation: "https://docs.antismash.secondarymetabolites.org"
+      tool_dev_url: "https://github.com/antismash/antismash"
+      doi: "10.1093/nar/gkab335"
+      licence: "['AGPL v3']"
+
+input:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - sequence_input:
+      type: file
+      description: nucleotide sequence file (annotated)
+      pattern: "*.{gbk, gb, gbff, genbank, embl, fasta, fna}"
+  - databases:
+      type: directory
+      description: downloaded AntiSMASH databases e.g. data/databases
+      pattern: "*/"
+  - antismash_dir:
+      type: directory
+      description: |
+        A local copy of an AntiSMASH installation folder. This is required when running with
+        docker and singularity (not required for conda), due to attempted 'modifications' of
+        files during database checks in the installation directory, something that cannot
+        be done in immutable docker/singularity containers. Therefore, a local installation
+        directory needs to be mounted (including all modified files from the downloading step)
+        to the container as a workaround.
+      pattern: "*/"
+  - gff:
+      type: file
+      pattern: "*.gff"
+
+output:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - versions:
+      type: file
+      description: File containing software versions
+      pattern: "versions.yml"
+  - clusterblast_file:
+      type: file
+      description: Output of ClusterBlast algorithm
+      pattern: "clusterblast/*_c*.txt"
+  - html_accessory_files:
+      type: directory
+      description: Accessory files for the HTML output
+      pattern: "{css/,images/,js/}"
+  - knownclusterblast_html:
+      type: file
+      description: Tables with MIBiG hits in HTML format
+      pattern: "knownclusterblast/region*/ctg*.html"
+  - knownclusterblast_txt:
+      type: file
+      description: Tables with MIBiG hits
+      pattern: "knownclusterblast/*_c*.txt"
+  - svg_files_clusterblast:
+      type: file
+      description: SVG images showing the % identity of the aligned hits against their queries
+      pattern: "svg/clusterblast*.svg"
+  - svg_files_knownclusterblast:
+      type: file
+      description: SVG images showing the % identity of the aligned hits against their queries
+      pattern: "svg/knownclusterblast*.svg"
+  - gbk_input:
+      type: file
+      description: Nucleotide sequence and annotations in GenBank format; converted from input file
+      pattern: "*.gbk"
+  - json_results:
+      type: file
+      description: Nucleotide sequence and annotations in JSON format; converted from GenBank file (gbk_input)
+      pattern: "*.json"
+  - log:
+      type: file
+      description: Contains all the logging output that antiSMASH produced during its run
+      pattern: "*.log"
+  - zip:
+      type: file
+      description: Contains a compressed version of the output folder in zip format
+      pattern: "*.zip"
+  - gbk_results:
+      type: file
+      description: Nucleotide sequence and annotations in GenBank format; one file per antiSMASH hit
+      pattern: "*region*.gbk"
+  - clusterblastoutput:
+      type: file
+      description: Raw BLAST output of known clusters previously predicted by antiSMASH using the built-in ClusterBlast algorithm
+      pattern: "clusterblastoutput.txt"
+  - html:
+      type: file
+      description: Graphical web view of results in HTML format
+      pattern: "index.html"
+  - knownclusterblastoutput:
+      type: file
+      description: Raw BLAST output of known clusters of the MIBiG database
+      pattern: "knownclusterblastoutput.txt"
+  - json_sideloading:
+      type: file
+      description: Sideloaded annotations of protoclusters and/or subregions (see antiSMASH documentation "Annotation sideloading")
+      pattern: "regions.js"
+
+authors:
+  - "@jasmezz"
modules/antismash/antismashlitedownloaddatabases/main.nf
@@ -7,8 +7,9 @@ process ANTISMASH_ANTISMASHLITEDOWNLOADDATABASES {
         'quay.io/biocontainers/antismash-lite:6.0.1--pyhdfd78af_1' }"

     /*
-    These files are normally downloaded by download-antismash-databases itself, and must be retrieved for input by manually running the command with conda or a standalone installation of antiSMASH. Therefore we do not recommend using this module for production pipelines, but rather require users to specify their own local copy of the antiSMASH database in pipelines. This is solely for use for CI tests of the nf-core/module version of antiSMASH.
+    These files are normally downloaded/created by download-antismash-databases itself, and must be retrieved for input by manually running the command with conda or a standalone installation of antiSMASH. Therefore we do not recommend using this module for production pipelines, but rather require users to specify their own local copy of the antiSMASH database in pipelines. This is solely for use for CI tests of the nf-core/module version of antiSMASH.
     Reason: Upon execution, the tool checks if certain database files are present within the container and if not, it tries to create them in /usr/local/bin, for which only root user has write permissions. Mounting those database files with this module prevents the tool from trying to create them.
+    These files are also emitted as output channels in this module to enable the antismash-lite module to use them as mount volumes to the docker/singularity containers.
     */

     containerOptions {
@@ -26,6 +27,7 @@ process ANTISMASH_ANTISMASHLITEDOWNLOADDATABASES {

     output:
     path("antismash_db") , emit: database
+    path("antismash_dir"), emit: antismash_dir
     path "versions.yml", emit: versions

     when:
@@ -33,14 +35,22 @@ process ANTISMASH_ANTISMASHLITEDOWNLOADDATABASES {

     script:
     def args = task.ext.args ?: ''
+    conda = params.enable_conda
     """
     download-antismash-databases \\
        --database-dir antismash_db \\
        $args

+    if [[ $conda = false ]]; \
+    then \
+        cp -r /usr/local/lib/python3.8/site-packages/antismash antismash_dir; \
+    else \
+        cp -r \$(python -c 'import antismash;print(antismash.__file__.split("/__")[0])') antismash_dir; \
+    fi
+
     cat <<-END_VERSIONS > versions.yml
     "${task.process}":
-        antismash: \$(antismash --version | sed 's/antiSMASH //')
+        antismash-lite: \$(antismash --version | sed 's/antiSMASH //')
     END_VERSIONS
     """
 }
modules/antismash/antismashlitedownloaddatabases/meta.yml
@@ -27,17 +27,17 @@ input:
   - database_css:
       type: directory
       description: |
-        antismash/outputs/html/css folder which is being created during the antiSMASH database downloading step. These files are normally downloaded by download-antismash-databases itself, and must be retrieved by the use by manually running the command with conda or a standalone installation of antiSMASH. Therefore we do not recommend using this module for production pipelines, but rather require users to specify their own local copy of the antiSMASH database in pipelines.
+        antismash/outputs/html/css folder which is being created during the antiSMASH database downloading step. These files are normally downloaded by download-antismash-databases itself, and must be retrieved by the user by manually running the command with conda or a standalone installation of antiSMASH. Therefore we do not recommend using this module for production pipelines, but rather require users to specify their own local copy of the antiSMASH database in pipelines.
       pattern: "css"
   - database_detection:
       type: directory
       description: |
-        antismash/detection folder which is being created during the antiSMASH database downloading step. These files are normally downloaded by download-antismash-databases itself, and must be retrieved by the use by manually running the command with conda or a standalone installation of antiSMASH. Therefore we do not recommend using this module for production pipelines, but rather require users to specify their own local copy of the antiSMASH database in pipelines.
+        antismash/detection folder which is being created during the antiSMASH database downloading step. These files are normally downloaded by download-antismash-databases itself, and must be retrieved by the user by manually running the command with conda or a standalone installation of antiSMASH. Therefore we do not recommend using this module for production pipelines, but rather require users to specify their own local copy of the antiSMASH database in pipelines.
       pattern: "detection"
   - database_modules:
       type: directory
       description: |
-        antismash/modules folder which is being created during the antiSMASH database downloading step. These files are normally downloaded by download-antismash-databases itself, and must be retrieved by the use by manually running the command with conda or a standalone installation of antiSMASH. Therefore we do not recommend using this module for production pipelines, but rather require users to specify their own local copy of the antiSMASH database in pipelines.
+        antismash/modules folder which is being created during the antiSMASH database downloading step. These files are normally downloaded by download-antismash-databases itself, and must be retrieved by the user by manually running the command with conda or a standalone installation of antiSMASH. Therefore we do not recommend using this module for production pipelines, but rather require users to specify their own local copy of the antiSMASH database in pipelines.
       pattern: "modules"

 output:
@@ -50,6 +50,11 @@ output:
       type: directory
       description: Download directory for antiSMASH databases
       pattern: "antismash_db"
+  - antismash_dir:
+      type: directory
+      description: |
+        antismash installation folder which is being modified during the antiSMASH database downloading step. The modified files are normally downloaded by download-antismash-databases itself, and must be retrieved by the user by manually running the command with conda or a standalone installation of antiSMASH. Therefore we do not recommend using this module for production pipelines, but rather require users to specify their own local copy of the antiSMASH database and installation folder in pipelines.
+      pattern: "antismash_dir"

 authors:
   - "@jasmezz"
modules/arriba/main.nf
@@ -2,15 +2,20 @@ process ARRIBA {
     tag "$meta.id"
     label 'process_medium'

-    conda (params.enable_conda ? "bioconda::arriba=2.1.0" : null)
+    conda (params.enable_conda ? "bioconda::arriba=2.2.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/arriba:2.1.0--h3198e80_1' :
-        'quay.io/biocontainers/arriba:2.1.0--h3198e80_1' }"
+        'https://depot.galaxyproject.org/singularity/arriba:2.2.1--hecb563c_2' :
+        'quay.io/biocontainers/arriba:2.2.1--hecb563c_2' }"

     input:
     tuple val(meta), path(bam)
     path fasta
     path gtf
+    path blacklist
+    path known_fusions
+    path structural_variants
+    path tags
+    path protein_domains

     output:
     tuple val(meta), path("*.fusions.tsv") , emit: fusions
@@ -23,7 +28,12 @@ process ARRIBA {
     script:
     def args = task.ext.args ?: ''
     def prefix = task.ext.prefix ?: "${meta.id}"
-    def blacklist = (args.contains('-b')) ? '' : '-f blacklist'
+    def blacklist = blacklist ? "-b $blacklist" : "-f blacklist"
+    def known_fusions = known_fusions ? "-k $known_fusions" : ""
+    def structural_variants = structural_variants ? "-d $structural_variants" : ""
+    def tags = tags ? "-t $tags" : ""
+    def protein_domains = protein_domains ? "-p $protein_domains" : ""

     """
     arriba \\
         -x $bam \\
@@ -32,6 +42,10 @@ process ARRIBA {
         -o ${prefix}.fusions.tsv \\
         -O ${prefix}.fusions.discarded.tsv \\
         $blacklist \\
+        $known_fusions \\
+        $structural_variants \\
+        $tags \\
+        $protein_domains \\
         $args

     cat <<-END_VERSIONS > versions.yml
@@ -39,4 +53,14 @@ process ARRIBA {
         arriba: \$(arriba -h | grep 'Version:' 2>&1 | sed 's/Version:\s//')
     END_VERSIONS
     """
+
+    stub:
+    def prefix = task.ext.prefix ?: "${meta.id}"
+    """
+    echo stub > ${prefix}.fusions.tsv
+    echo stub > ${prefix}.fusions.discarded.tsv
+
+    echo "${task.process}:" > versions.yml
+    echo '    arriba: 2.2.1' >> versions.yml
+    """
 }
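The five new inputs follow the common nf-core optional-file idiom: an empty list `[]` stands in for an absent file, and the corresponding flag is only emitted when the path is truthy. A hedged invocation sketch (channel and file names are placeholders, not part of this commit):

    // Sketch only: names are illustrative.
    ARRIBA (
        ch_bam,                       // tuple val(meta), path(bam)
        ch_fasta,                     // reference FASTA
        ch_gtf,                       // annotation GTF
        file('blacklist.tsv'),        // with a file: adds -b; with []: falls back to -f blacklist
        file('known_fusions.tsv'),    // adds -k only when given
        [],                           // structural_variants not used in this sketch
        [],                           // tags not used in this sketch
        file('protein_domains.gff3')  // adds -p only when given
    )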
modules/arriba/meta.yml
@@ -30,6 +30,26 @@ input:
       type: file
       description: Annotation GTF file
       pattern: "*.{gtf}"
+  - blacklist:
+      type: file
+      description: Blacklist file
+      pattern: "*.{tsv}"
+  - known_fusions:
+      type: file
+      description: Known fusions file
+      pattern: "*.{tsv}"
+  - structural_variants:
+      type: file
+      description: Structural variants file
+      pattern: "*.{tsv}"
+  - tags:
+      type: file
+      description: Tags file
+      pattern: "*.{tsv}"
+  - protein_domains:
+      type: file
+      description: Protein domains file
+      pattern: "*.{gff3}"

 output:
   - meta:
@@ -51,4 +71,4 @@ output:
     pattern: "*.{fusions.discarded.tsv}"

 authors:
-  - "@praveenraj2018"
+  - "@praveenraj2018,@rannick"
modules/bamtools/split/main.nf
@@ -2,10 +2,10 @@ process BAMTOOLS_SPLIT {
     tag "$meta.id"
     label 'process_low'

-    conda (params.enable_conda ? "bioconda::bamtools=2.5.1" : null)
+    conda (params.enable_conda ? "bioconda::bamtools=2.5.2" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/bamtools:2.5.1--h9a82719_9' :
-        'quay.io/biocontainers/bamtools:2.5.1--h9a82719_9' }"
+        'https://depot.galaxyproject.org/singularity/bamtools:2.5.2--hd03093a_0' :
+        'quay.io/biocontainers/bamtools:2.5.2--hd03093a_0' }"

     input:
     tuple val(meta), path(bam)
@@ -20,11 +20,15 @@ process BAMTOOLS_SPLIT {
     script:
     def args = task.ext.args ?: ''
+    def prefix = task.ext.prefix ?: "${meta.id}"
+    def input_list = bam.collect{"-in $it"}.join(' ')
     """
     bamtools \\
-        split \\
-        -in $bam \\
-        $args
+        merge \\
+        $input_list \\
+        | bamtools \\
+        split \\
+        -stub $prefix \\
+        $args

     cat <<-END_VERSIONS > versions.yml
     "${task.process}":

modules/bamtools/split/meta.yml
@@ -23,7 +23,7 @@ input:
         e.g. [ id:'test', single_end:false ]
   - bam:
       type: file
-      description: A BAM file to split
+      description: A list of one or more BAM files to merge and then split
       pattern: "*.bam"

 output:
@@ -43,3 +43,4 @@ output:

 authors:
   - "@sguizard"
+  - "@matthdsm"
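With the merge-then-split rewrite above, several BAMs can be fed in at once; `collect{...}.join(' ')` turns the staged file list into repeated `-in` flags. A quick Groovy illustration of that expression (file names hypothetical):

    // Groovy sketch of how input_list is rendered in the script block.
    def bam = ['x.bam', 'y.bam']                         // hypothetical staged files
    def input_list = bam.collect{ "-in $it" }.join(' ')
    assert input_list == '-in x.bam -in y.bam'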
38 modules/bedtools/split/main.nf (new file)
@@ -0,0 +1,38 @@
+process BEDTOOLS_SPLIT {
+    tag "$meta.id"
+    label 'process_low'
+
+    conda (params.enable_conda ? "bioconda::bedtools=2.30.0" : null)
+    container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
+        'https://depot.galaxyproject.org/singularity/bedtools:2.30.0--h468198e_3':
+        'quay.io/biocontainers/bedtools:2.30.0--h7d7f7ad_2' }"
+
+    input:
+    tuple val(meta), path(bed)
+    val(number_of_files)
+
+    output:
+    tuple val(meta), path("*.bed"), emit: beds
+    path "versions.yml"           , emit: versions
+
+    when:
+    task.ext.when == null || task.ext.when
+
+    script:
+    def args = task.ext.args ?: ''
+    def prefix = task.ext.prefix ?: "${meta.id}"
+
+    """
+    bedtools \\
+        split \\
+        $args \\
+        -i $bed \\
+        -p $prefix \\
+        -n $number_of_files
+
+    cat <<-END_VERSIONS > versions.yml
+    "${task.process}":
+        bedtools: \$(bedtools --version | sed -e "s/bedtools v//g")
+    END_VERSIONS
+    """
+}
41 modules/bedtools/split/meta.yml (new file)
@@ -0,0 +1,41 @@
+name: "bedtools_split"
+description: Split BED files into several smaller BED files
+keywords:
+  - sort
+tools:
+  - "bedtools":
+      description: "A powerful toolset for genome arithmetic"
+      documentation: "https://bedtools.readthedocs.io/en/latest/content/tools/sort.html"
+      licence: "['MIT', 'GPL v2']"
+
+input:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - bed:
+      type: file
+      description: BED file
+      pattern: "*.bed"
+  - number_of_files:
+      type: value
+      description: The number of files to split the BED into
+
+output:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - versions:
+      type: file
+      description: File containing software versions
+      pattern: "versions.yml"
+  - beds:
+      type: file
+      description: list of split BED files
+      pattern: "*.bed"
+
+authors:
+  - "@nvnieuwk"
38 modules/biobambam/bammerge/main.nf (new file)
@@ -0,0 +1,38 @@
+process BIOBAMBAM_BAMMERGE {
+    tag "$meta.id"
+    label 'process_low'
+
+    conda (params.enable_conda ? "bioconda::biobambam=2.0.183" : null)
+    container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
+        'https://depot.galaxyproject.org/singularity/biobambam:2.0.183--h9f5acd7_1':
+        'quay.io/biocontainers/biobambam:2.0.183--h9f5acd7_1' }"
+
+    input:
+    tuple val(meta), path(bam)
+
+    output:
+    tuple val(meta), path("${prefix}.bam") ,emit: bam
+    tuple val(meta), path("*.bai")         ,optional:true, emit: bam_index
+    tuple val(meta), path("*.md5")         ,optional:true, emit: checksum
+    path "versions.yml"                    ,emit: versions
+
+    when:
+    task.ext.when == null || task.ext.when
+
+    script:
+    def args = task.ext.args ?: ''
+    prefix = task.ext.prefix ?: "${meta.id}"
+    def input_string = bam.join(" I=")
+
+    """
+    bammerge \\
+        I=${input_string} \\
+        $args \\
+        > ${prefix}.bam
+
+    cat <<-END_VERSIONS > versions.yml
+    "${task.process}":
+        bammerge: \$( bammerge --version |& sed '1!d; s/.*version //; s/.\$//' )
+    END_VERSIONS
+    """
+}
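In the bammerge script above, the literal `I=` before `${input_string}` supplies the flag for the first file while `join(" I=")` inserts it between the remaining ones. A quick Groovy illustration (file names hypothetical):

    // Groovy sketch of the I= interpolation.
    def bam = ['a.bam', 'b.bam', 'c.bam']   // hypothetical staged files
    def input_string = bam.join(" I=")      // 'a.bam I=b.bam I=c.bam'
    assert "I=${input_string}".toString() == 'I=a.bam I=b.bam I=c.bam'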
46 modules/biobambam/bammerge/meta.yml (new file)
@@ -0,0 +1,46 @@
+name: biobambam_bammerge
+description: Merge a list of sorted bam files
+keywords:
+  - merge
+  - bam
+tools:
+  - biobambam:
+      description: |
+        biobambam is a set of tools for early stage alignment file processing.
+      homepage: https://gitlab.com/german.tischler/biobambam2
+      documentation: https://gitlab.com/german.tischler/biobambam2/-/blob/master/README.md
+      doi: 10.1186/1751-0473-9-13
+      licence: ["GPL v3"]
+input:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - bam:
+      type: file
+      description: List containing 1 or more bam files
+output:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - bam:
+      type: file
+      description: Merged BAM file
+      pattern: "*.bam"
+  - bam_index:
+      type: file
+      description: BAM index file
+      pattern: "*"
+  - checksum:
+      type: file
+      description: Checksum file
+      pattern: "*"
+  - versions:
+      type: file
+      description: File containing software versions
+      pattern: "versions.yml"
+authors:
+  - "@matthdsm"
modules/bowtie2/align/main.nf
@@ -1,77 +1,71 @@
 process BOWTIE2_ALIGN {
     tag "$meta.id"
-    label 'process_high'
+    label "process_high"

-    conda (params.enable_conda ? 'bioconda::bowtie2=2.4.4 bioconda::samtools=1.15.1 conda-forge::pigz=2.6' : null)
-    container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/mulled-v2-ac74a7f02cebcfcc07d8e8d1d750af9c83b4d45a:1744f68fe955578c63054b55309e05b41c37a80d-0' :
-        'quay.io/biocontainers/mulled-v2-ac74a7f02cebcfcc07d8e8d1d750af9c83b4d45a:1744f68fe955578c63054b55309e05b41c37a80d-0' }"
+    conda (params.enable_conda ? "bioconda::bowtie2=2.4.4 bioconda::samtools=1.15.1 conda-forge::pigz=2.6" : null)
+    container "${ workflow.containerEngine == "singularity" && !task.ext.singularity_pull_docker_container ?
+        "https://depot.galaxyproject.org/singularity/mulled-v2-ac74a7f02cebcfcc07d8e8d1d750af9c83b4d45a:1744f68fe955578c63054b55309e05b41c37a80d-0" :
+        "quay.io/biocontainers/mulled-v2-ac74a7f02cebcfcc07d8e8d1d750af9c83b4d45a:1744f68fe955578c63054b55309e05b41c37a80d-0" }"

     input:
     tuple val(meta), path(reads)
     path index
     val save_unaligned
+    val sort_bam

     output:
-    tuple val(meta), path('*.bam')    , emit: bam
-    tuple val(meta), path('*.log')    , emit: log
-    tuple val(meta), path('*fastq.gz'), emit: fastq, optional:true
+    tuple val(meta), path("*.bam")    , emit: bam
+    tuple val(meta), path("*.log")    , emit: log
+    tuple val(meta), path("*fastq.gz"), emit: fastq, optional:true
     path "versions.yml"               , emit: versions

     when:
     task.ext.when == null || task.ext.when

     script:
-    def args = task.ext.args ?: ''
-    def args2 = task.ext.args2 ?: ''
+    def args = task.ext.args ?: ""
+    def args2 = task.ext.args2 ?: ""
     def prefix = task.ext.prefix ?: "${meta.id}"

+    def unaligned = ""
+    def reads_args = ""
     if (meta.single_end) {
-        def unaligned = save_unaligned ? "--un-gz ${prefix}.unmapped.fastq.gz" : ''
-        """
-        INDEX=`find -L ./ -name "*.rev.1.bt2" | sed 's/.rev.1.bt2//'`
-        bowtie2 \\
-            -x \$INDEX \\
-            -U $reads \\
-            --threads $task.cpus \\
-            $unaligned \\
-            $args \\
-            2> ${prefix}.bowtie2.log \\
-            | samtools view -@ $task.cpus $args2 -bhS -o ${prefix}.bam -
-
-        cat <<-END_VERSIONS > versions.yml
-        "${task.process}":
-            bowtie2: \$(echo \$(bowtie2 --version 2>&1) | sed 's/^.*bowtie2-align-s version //; s/ .*\$//')
-            samtools: \$(echo \$(samtools --version 2>&1) | sed 's/^.*samtools //; s/Using.*\$//')
-            pigz: \$( pigz --version 2>&1 | sed 's/pigz //g' )
-        END_VERSIONS
-        """
+        unaligned = save_unaligned ? "--un-gz ${prefix}.unmapped.fastq.gz" : ""
+        reads_args = "-U ${reads}"
     } else {
-        def unaligned = save_unaligned ? "--un-conc-gz ${prefix}.unmapped.fastq.gz" : ''
-        """
-        INDEX=`find -L ./ -name "*.rev.1.bt2" | sed 's/.rev.1.bt2//'`
-        bowtie2 \\
-            -x \$INDEX \\
-            -1 ${reads[0]} \\
-            -2 ${reads[1]} \\
-            --threads $task.cpus \\
-            $unaligned \\
-            $args \\
-            2> ${prefix}.bowtie2.log \\
-            | samtools view -@ $task.cpus $args2 -bhS -o ${prefix}.bam -
-
-        if [ -f ${prefix}.unmapped.fastq.1.gz ]; then
-            mv ${prefix}.unmapped.fastq.1.gz ${prefix}.unmapped_1.fastq.gz
-        fi
-        if [ -f ${prefix}.unmapped.fastq.2.gz ]; then
-            mv ${prefix}.unmapped.fastq.2.gz ${prefix}.unmapped_2.fastq.gz
-        fi
-
-        cat <<-END_VERSIONS > versions.yml
-        "${task.process}":
-            bowtie2: \$(echo \$(bowtie2 --version 2>&1) | sed 's/^.*bowtie2-align-s version //; s/ .*\$//')
-            samtools: \$(echo \$(samtools --version 2>&1) | sed 's/^.*samtools //; s/Using.*\$//')
-            pigz: \$( pigz --version 2>&1 | sed 's/pigz //g' )
-        END_VERSIONS
-        """
+        unaligned = save_unaligned ? "--un-conc-gz ${prefix}.unmapped.fastq.gz" : ""
+        reads_args = "-1 ${reads[0]} -2 ${reads[1]}"
     }

+    def samtools_command = sort_bam ? 'sort' : 'view'
+
+    """
+    INDEX=`find -L ./ -name "*.rev.1.bt2" | sed "s/.rev.1.bt2//"`
+    [ -z "\$INDEX" ] && INDEX=`find -L ./ -name "*.rev.1.bt2l" | sed "s/.rev.1.bt2l//"`
+    [ -z "\$INDEX" ] && echo "Bowtie2 index files not found" 1>&2 && exit 1
+
+    bowtie2 \\
+        -x \$INDEX \\
+        $reads_args \\
+        --threads $task.cpus \\
+        $unaligned \\
+        $args \\
+        2> ${prefix}.bowtie2.log \\
+        | samtools $samtools_command $args2 --threads $task.cpus -o ${prefix}.bam -
+
+    if [ -f ${prefix}.unmapped.fastq.1.gz ]; then
+        mv ${prefix}.unmapped.fastq.1.gz ${prefix}.unmapped_1.fastq.gz
+    fi
+
+    if [ -f ${prefix}.unmapped.fastq.2.gz ]; then
+        mv ${prefix}.unmapped.fastq.2.gz ${prefix}.unmapped_2.fastq.gz
+    fi
+
+    cat <<-END_VERSIONS > versions.yml
+    "${task.process}":
+        bowtie2: \$(echo \$(bowtie2 --version 2>&1) | sed 's/^.*bowtie2-align-s version //; s/ .*\$//')
+        samtools: \$(echo \$(samtools --version 2>&1) | sed 's/^.*samtools //; s/Using.*\$//')
+        pigz: \$( pigz --version 2>&1 | sed 's/pigz //g' )
+    END_VERSIONS
+    """
 }
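After this refactor a single script block serves both single-end and paired-end reads, index discovery falls back to large `.bt2l` indices, and the new `sort_bam` flag chooses between `samtools sort` and `samtools view`. A hedged call sketch (channel names are illustrative, not from this commit):

    // Sketch only: channels are placeholders.
    BOWTIE2_ALIGN (
        ch_reads,          // tuple val(meta), path(reads); meta.single_end selects -U vs -1/-2
        ch_bowtie2_index,  // directory containing *.bt2 or large *.bt2l index files
        true,              // save_unaligned: also emit ${prefix}.unmapped_*.fastq.gz
        true               // sort_bam: pipe through `samtools sort` rather than `samtools view`
    )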
modules/bowtie2/align/meta.yml
@@ -29,6 +29,15 @@ input:
       type: file
       description: Bowtie2 genome index files
       pattern: "*.ebwt"
+  - save_unaligned:
+      type: boolean
+      description: |
+        Save reads that do not map to the reference (true) or discard them (false)
+        (default: false)
+  - sort_bam:
+      type: boolean
+      description: use samtools sort (true) or samtools view (false)
+      pattern: "true or false"
 output:
   - bam:
       type: file
84 modules/busco/main.nf (new file)
@@ -0,0 +1,84 @@
+process BUSCO {
+    tag "$meta.id"
+    label 'process_medium'
+
+    conda (params.enable_conda ? "bioconda::busco=5.3.2" : null)
+    container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
+        'https://depot.galaxyproject.org/singularity/busco:5.3.2--pyhdfd78af_0':
+        'quay.io/biocontainers/busco:5.3.2--pyhdfd78af_0' }"
+
+    input:
+    tuple val(meta), path('tmp_input/*')
+    each lineage              // Required: lineage to check against, "auto" enables --auto-lineage instead
+    path busco_lineages_path  // Recommended: path to busco lineages - downloads if not set
+    path config_file          // Optional: busco configuration file
+
+    output:
+    tuple val(meta), path("*-busco.batch_summary.txt"), emit: batch_summary
+    tuple val(meta), path("short_summary.*.txt")      , emit: short_summaries_txt, optional: true
+    tuple val(meta), path("short_summary.*.json")     , emit: short_summaries_json, optional: true
+    tuple val(meta), path("*-busco")                  , emit: busco_dir
+    path "versions.yml"                               , emit: versions
+
+    when:
+    task.ext.when == null || task.ext.when
+
+    script:
+    def args = task.ext.args ?: ''
+    def prefix = task.ext.prefix ?: "${meta.id}-${lineage}"
+    def busco_config = config_file ? "--config $config_file" : ''
+    def busco_lineage = lineage.equals('auto') ? '--auto-lineage' : "--lineage_dataset ${lineage}"
+    def busco_lineage_dir = busco_lineages_path ? "--offline --download_path ${busco_lineages_path}" : ''
+    """
+    # Nextflow changes the container --entrypoint to /bin/bash (container default entrypoint: /usr/local/env-execute)
+    # Check for container variable initialisation script and source it.
+    if [ -f "/usr/local/env-activate.sh" ]; then
+        set +u  # Otherwise, errors out because of various unbound variables
+        . "/usr/local/env-activate.sh"
+        set -u
+    fi
+
+    # If the augustus config directory is not writable, then copy to writeable area
+    if [ ! -w "\${AUGUSTUS_CONFIG_PATH}" ]; then
+        # Create writable tmp directory for augustus
+        AUG_CONF_DIR=\$( mktemp -d -p \$PWD )
+        cp -r \$AUGUSTUS_CONFIG_PATH/* \$AUG_CONF_DIR
+        export AUGUSTUS_CONFIG_PATH=\$AUG_CONF_DIR
+        echo "New AUGUSTUS_CONFIG_PATH=\${AUGUSTUS_CONFIG_PATH}"
+    fi
+
+    # Ensure the input is uncompressed
+    INPUT_SEQS=input_seqs
+    mkdir "\$INPUT_SEQS"
+    cd "\$INPUT_SEQS"
+    for FASTA in ../tmp_input/*; do
+        if [ "\${FASTA##*.}" == 'gz' ]; then
+            gzip -cdf "\$FASTA" > \$( basename "\$FASTA" .gz )
+        else
+            ln -s "\$FASTA" .
+        fi
+    done
+    cd ..
+
+    busco \\
+        --cpu $task.cpus \\
+        --in "\$INPUT_SEQS" \\
+        --out ${prefix}-busco \\
+        $busco_lineage \\
+        $busco_lineage_dir \\
+        $busco_config \\
+        $args
+
+    # clean up
+    rm -rf "\$INPUT_SEQS"
+
+    # Move files to avoid staging/publishing issues
+    mv ${prefix}-busco/batch_summary.txt ${prefix}-busco.batch_summary.txt
+    mv ${prefix}-busco/*/short_summary.*.{json,txt} . || echo "Short summaries were not available: No genes were found."
+
+    cat <<-END_VERSIONS > versions.yml
+    "${task.process}":
+        busco: \$( busco --version 2>&1 | sed 's/^BUSCO //' )
+    END_VERSIONS
+    """
+}
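Because `lineage` is declared with `each`, one task is spawned per lineage value, and `prefix` defaults to `${meta.id}-${lineage}` so the per-lineage outputs do not collide. A hedged invocation sketch (paths and values are illustrative, not from this commit):

    // Sketch only: paths and values are placeholders.
    BUSCO (
        Channel.of([ [ id:'test' ], file('assembly.fna.gz') ]), // gzipped input is uncompressed by the script
        ['bacteria_odb10', 'auto'],                             // one task per entry; 'auto' turns on --auto-lineage
        file('busco_downloads'),                                // hypothetical local lineage dir (--offline --download_path)
        []                                                      // no custom config file
    )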
69 modules/busco/meta.yml (new file)
@@ -0,0 +1,69 @@
+name: busco
+description: Benchmarking Universal Single Copy Orthologs
+keywords:
+  - quality control
+  - genome
+  - transcriptome
+  - proteome
+tools:
+  - busco:
+      description: BUSCO provides measures for quantitative assessment of genome assembly, gene set, and transcriptome completeness based on evolutionarily informed expectations of gene content from near-universal single-copy orthologs selected from OrthoDB.
+      homepage: https://busco.ezlab.org/
+      documentation: https://busco.ezlab.org/busco_userguide.html
+      tool_dev_url: https://gitlab.com/ezlab/busco
+      doi: "10.1007/978-1-4939-9173-0_14"
+      licence: ["MIT"]
+
+input:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - fasta:
+      type: file
+      description: Nucleic or amino acid sequence file in FASTA format.
+      pattern: "*.{fasta,fna,fa,fasta.gz,fna.gz,fa.gz}"
+  - lineage:
+      type: value
+      description: The BUSCO lineage to use, or "auto" to automatically select lineage
+  - busco_lineages_path:
+      type: directory
+      description: Path to local BUSCO lineages directory.
+  - config_file:
+      type: file
+      description: Path to BUSCO config file.
+
+output:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - batch_summary:
+      type: file
+      description: Summary of all sequence files analyzed
+      pattern: "*-busco.batch_summary.txt"
+  - short_summaries_txt:
+      type: file
+      description: Short Busco summary in plain text format
+      pattern: "short_summary.*.txt"
+  - short_summaries_json:
+      type: file
+      description: Short Busco summary in JSON format
+      pattern: "short_summary.*.json"
+  - busco_dir:
+      type: directory
+      description: BUSCO lineage specific output
+      pattern: "*-busco"
+  - versions:
+      type: file
+      description: File containing software versions
+      pattern: "versions.yml"
+
+authors:
+  - "@priyanka-surana"
+  - "@charles-plessy"
+  - "@mahesh-panchal"
+  - "@muffato"
+  - "@jvhagey"
modules/cat/fastq/main.nf
@@ -4,8 +4,8 @@ process CAT_FASTQ {

     conda (params.enable_conda ? "conda-forge::sed=4.7" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://containers.biocontainers.pro/s3/SingImgsRepo/biocontainers/v1.2.0_cv1/biocontainers_v1.2.0_cv1.img' :
-        'biocontainers/biocontainers:v1.2.0_cv1' }"
+        'https://depot.galaxyproject.org/singularity/ubuntu:20.04' :
+        'ubuntu:20.04' }"

     input:
     tuple val(meta), path(reads, stageAs: "input*/*")
modules/cnvpytor/callcnvs/main.nf
@@ -2,43 +2,42 @@ process CNVPYTOR_CALLCNVS {
     tag "$meta.id"
     label 'process_medium'

-    conda (params.enable_conda ? "bioconda::cnvpytor=1.0" : null)
+    conda (params.enable_conda ? "bioconda::cnvpytor=1.2.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/cnvpytor:1.0--py39h6a678da_2':
-        'quay.io/biocontainers/cnvpytor:1.0--py39h6a678da_2' }"
+        'https://depot.galaxyproject.org/singularity/cnvpytor:1.2.1--pyhdfd78af_0':
+        'quay.io/biocontainers/cnvpytor:1.2.1--pyhdfd78af_0' }"

     input:
     tuple val(meta), path(pytor)
+    val bin_sizes

     output:
-    tuple val(meta), path("*.tsv"), emit: cnvs
-    path "versions.yml"           , emit: versions
+    tuple val(meta), path("${pytor.baseName}.pytor") , emit: pytor
+    path "versions.yml"                              , emit: versions

     when:
     task.ext.when == null || task.ext.when

     script:
-    def args = task.ext.args ?: '1000'
-    def prefix = task.ext.prefix ?: "${meta.id}"
+    def bins = bin_sizes ?: '1000'
     """
     cnvpytor \\
         -root $pytor \\
-        -call $args > ${prefix}.tsv
+        -call $bin_sizes

     cat <<-END_VERSIONS > versions.yml
     "${task.process}":
-        cnvpytor: \$(echo \$(cnvpytor --version 2>&1) | sed 's/^.*pyCNVnator //; s/Using.*\$//' ))
+        cnvpytor: \$(echo \$(cnvpytor --version 2>&1) | sed 's/CNVpytor //' ))
     END_VERSIONS
     """

     stub:
     def prefix = task.ext.prefix ?: "${meta.id}"
     """
-    touch ${prefix}.tsv
+    touch ${pytor.baseName}.pytor

     cat <<-END_VERSIONS > versions.yml
     "${task.process}":
-        cnvpytor: \$(echo \$(cnvpytor --version 2>&1) | sed 's/^.*pyCNVnator //; s/Using.*\$//' ))
+        cnvpytor: \$(echo \$(cnvpytor --version 2>&1) | sed 's/CNVpytor //' ))
     END_VERSIONS
     """
 }
modules/cnvpytor/callcnvs/meta.yml
@@ -17,8 +17,11 @@ input:
         e.g. [ id:'test']
   - pytor:
       type: file
-      description: cnvpytor root file
+      description: pytor file containing partitions of read depth histograms using mean-shift method
+      pattern: "*.{pytor}"
+  - bin_sizes:
+      type: string
+      description: list of binsizes separated by space e.g. "1000 10000" and "1000"

 output:
   - meta:
@@ -26,10 +29,10 @@ output:
       description: |
         Groovy Map containing sample information
         e.g. [ id:'test' ]
-  - cnvs:
+  - pytor:
       type: file
-      description: file containing identified copy numer variations
-      pattern: "*.{tsv}"
+      description: pytor files containing cnv calls
+      pattern: "*.{pytor}"
   - versions:
       type: file
       description: File containing software versions
modules/cnvpytor/histogram/main.nf
@@ -2,13 +2,15 @@ process CNVPYTOR_HISTOGRAM {
     tag "$meta.id"
     label 'process_medium'

-    conda (params.enable_conda ? "bioconda::cnvpytor=1.0" : null)
+    conda (params.enable_conda ? "bioconda::cnvpytor=1.2.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/cnvpytor:1.0--py39h6a678da_2':
-        'quay.io/biocontainers/cnvpytor:1.0--py39h6a678da_2' }"
+        'https://depot.galaxyproject.org/singularity/cnvpytor:1.2.1--pyhdfd78af_0':
+        'quay.io/biocontainers/cnvpytor:1.2.1--pyhdfd78af_0' }"

     input:
     tuple val(meta), path(pytor)
+    val bin_sizes

     output:
     tuple val(meta), path("${pytor.baseName}.pytor") , emit: pytor
@@ -18,15 +20,15 @@ process CNVPYTOR_HISTOGRAM {
     task.ext.when == null || task.ext.when

     script:
-    def args = task.ext.args ?: '1000'
+    def bins = bin_sizes ?: '1000'
     """
     cnvpytor \\
         -root $pytor \\
-        -his $args
+        -his $bins

     cat <<-END_VERSIONS > versions.yml
     "${task.process}":
-        cnvpytor: \$(echo \$(cnvpytor --version 2>&1) | sed 's/^.*pyCNVnator //; s/Using.*\$//' ))
+        cnvpytor: \$(echo \$(cnvpytor --version 2>&1) | sed 's/CNVpytor //' ))
     END_VERSIONS
     """

@@ -36,7 +38,7 @@ process CNVPYTOR_HISTOGRAM {

     cat <<-END_VERSIONS > versions.yml
     "${task.process}":
-        cnvpytor: \$(echo \$(cnvpytor --version 2>&1) | sed 's/^.*pyCNVnator //; s/Using.*\$//' ))
+        cnvpytor: \$(echo \$(cnvpytor --version 2>&1) | sed 's/CNVpytor //' ))
     END_VERSIONS
     """
 }
modules/cnvpytor/histogram/meta.yml
@@ -22,6 +22,9 @@ input:
       type: file
       description: pytor file containing read depth data
       pattern: "*.{pytor}"
+  - bin_sizes:
+      type: string
+      description: list of binsizes separated by space e.g. "1000 10000" and "1000"

 output:
   - meta:
@@ -40,3 +43,4 @@ output:

 authors:
   - "@sima-r"
+  - "@ramprasadn"
modules/cnvpytor/importreaddepth/main.nf
@@ -2,10 +2,10 @@ process CNVPYTOR_IMPORTREADDEPTH {
     tag "$meta.id"
     label 'process_medium'

-    conda (params.enable_conda ? "bioconda::cnvpytor=1.0" : null)
+    conda (params.enable_conda ? "bioconda::cnvpytor=1.2.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/cnvpytor:1.0--py39h6a678da_2':
-        'quay.io/biocontainers/cnvpytor:1.0--py39h6a678da_2' }"
+        'https://depot.galaxyproject.org/singularity/cnvpytor:1.2.1--pyhdfd78af_0':
+        'quay.io/biocontainers/cnvpytor:1.2.1--pyhdfd78af_0' }"

     input:
     tuple val(meta), path(input_file), path(index)
@@ -32,7 +32,7 @@ process CNVPYTOR_IMPORTREADDEPTH {

     cat <<-END_VERSIONS > versions.yml
     "${task.process}":
-        cnvpytor: \$(echo \$(cnvpytor --version 2>&1) | sed 's/^.*pyCNVnator //; s/Using.*\$//' ))
+        cnvpytor: \$(echo \$(cnvpytor --version 2>&1) | sed 's/CNVpytor //' ))
     END_VERSIONS
     """

@@ -43,7 +43,7 @@ process CNVPYTOR_IMPORTREADDEPTH {

     cat <<-END_VERSIONS > versions.yml
     "${task.process}":
-        cnvpytor: \$(echo \$(cnvpytor --version 2>&1) | sed 's/^.*pyCNVnator //; s/Using.*\$//' ))
+        cnvpytor: \$(echo \$(cnvpytor --version 2>&1) | sed 's/CNVpytor //' ))
     END_VERSIONS
     """
 }
modules/cnvpytor/importreaddepth/meta.yml
@@ -52,3 +52,4 @@ output:

 authors:
   - "@sima-r"
+  - "@ramprasadn"
modules/cnvpytor/partition/main.nf
@@ -2,13 +2,14 @@ process CNVPYTOR_PARTITION {
     tag "$meta.id"
     label 'process_medium'

-    conda (params.enable_conda ? "bioconda::cnvpytor=1.0" : null)
+    conda (params.enable_conda ? "bioconda::cnvpytor=1.2.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/cnvpytor:1.0--py39h6a678da_2':
-        'quay.io/biocontainers/cnvpytor:1.0--py39h6a678da_2' }"
+        'https://depot.galaxyproject.org/singularity/cnvpytor:1.2.1--pyhdfd78af_0':
+        'quay.io/biocontainers/cnvpytor:1.2.1--pyhdfd78af_0' }"

     input:
     tuple val(meta), path(pytor)
+    val bin_sizes

     output:
     tuple val(meta), path("${pytor.baseName}.pytor"), emit: pytor
@@ -18,15 +19,15 @@ process CNVPYTOR_PARTITION {
     task.ext.when == null || task.ext.when

     script:
-    def args = task.ext.args ?: ''
+    def bins = bin_sizes ?: '1000'
     """
     cnvpytor \\
         -root $pytor \\
-        -partition $args
+        -partition $bins

     cat <<-END_VERSIONS > versions.yml
     "${task.process}":
-        cnvpytor: \$(echo \$(cnvpytor --version 2>&1) | sed 's/^.*pyCNVnator //; s/Using.*\$//' ))
+        cnvpytor: \$(echo \$(cnvpytor --version 2>&1) | sed 's/CNVpytor //' ))
     END_VERSIONS
     """

@@ -36,7 +37,7 @@ process CNVPYTOR_PARTITION {

     cat <<-END_VERSIONS > versions.yml
     "${task.process}":
-        cnvpytor: \$(echo \$(cnvpytor --version 2>&1) | sed 's/^.*pyCNVnator //; s/Using.*\$//' ))
+        cnvpytor: \$(echo \$(cnvpytor --version 2>&1) | sed 's/CNVpytor //' ))
     END_VERSIONS
     """
 }
|
|||
type: file
|
||||
description: pytor file containing read depth data
|
||||
pattern: "*.{pytor}"
|
||||
- bin_sizes:
|
||||
type: string
|
||||
description: list of binsizes separated by space e.g. "1000 10000" and "1000"
|
||||
|
||||
output:
|
||||
- meta:
|
||||
|
@ -40,3 +43,4 @@ output:
|
|||
|
||||
authors:
|
||||
- "@sima-r"
|
||||
- "@ramprasadn"
|
||||
|
|
60
modules/cnvpytor/view/main.nf
Normal file
60
modules/cnvpytor/view/main.nf
Normal file
|
@@ -0,0 +1,60 @@
+process CNVPYTOR_VIEW {
+    tag "$meta.id"
+    label 'process_medium'
+
+    conda (params.enable_conda ? "bioconda::cnvpytor=1.2.1" : null)
+    container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
+        'https://depot.galaxyproject.org/singularity/cnvpytor:1.2.1--pyhdfd78af_0':
+        'quay.io/biocontainers/cnvpytor:1.2.1--pyhdfd78af_0' }"
+
+    input:
+    tuple val(meta), path(pytor_files)
+    val bin_sizes
+    val output_format
+
+    output:
+    tuple val(meta), path("*.vcf"), emit: vcf , optional: true
+    tuple val(meta), path("*.tsv"), emit: tsv , optional: true
+    tuple val(meta), path("*.xls"), emit: xls , optional: true
+    path "versions.yml"           , emit: versions
+
+    when:
+    task.ext.when == null || task.ext.when
+
+    script:
+    def output_suffix = output_format ?: 'vcf'
+    def bins          = bin_sizes ?: '1000'
+    def input         = pytor_files.join(" ")
+    def prefix        = task.ext.prefix ?: "${meta.id}"
+    """
+
+    python3 <<CODE
+    import cnvpytor,os
+    binsizes = "${bins}".split(" ")
+    for binsize in binsizes:
+        file_list = "${input}".split(" ")
+        app = cnvpytor.Viewer(file_list, params={} )
+        outputfile = "{}_{}.{}".format("${prefix}",binsize.strip(),"${output_suffix}")
+        app.print_filename = outputfile
+        app.bin_size = int(binsize)
+        app.print_calls_file()
+    CODE
+
+    cat <<-END_VERSIONS > versions.yml
+    "${task.process}":
+        cnvpytor: \$(echo \$(cnvpytor --version 2>&1) | sed 's/CNVpytor //' ))
+    END_VERSIONS
+    """
+
+    stub:
+    def output_suffix = output_format ?: 'vcf'
+    def prefix        = task.ext.prefix ?: "${meta.id}"
+    """
+    touch ${prefix}.${output_suffix}
+
+    cat <<-END_VERSIONS > versions.yml
+    "${task.process}":
+        cnvpytor: \$(echo \$(cnvpytor --version 2>&1) | sed 's/CNVpytor //' ))
+    END_VERSIONS
+    """
+}
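For orientation, a minimal sketch of driving this new module (the module path and file names are assumptions for illustration):

```nextflow
// Hypothetical usage of CNVPYTOR_VIEW: merge two partitioned .pytor files
// for one sample and write calls at two bin sizes as VCF
include { CNVPYTOR_VIEW } from './modules/cnvpytor/view/main'

workflow {
    pytor_ch = Channel.of([ [ id:'test' ], [ file('sample1.pytor'), file('sample2.pytor') ] ])
    // bin sizes are one space-separated string, matching the meta.yml description
    CNVPYTOR_VIEW(pytor_ch, '1000 10000', 'vcf')
    CNVPYTOR_VIEW.out.vcf.view()
}
```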
56
modules/cnvpytor/view/meta.yml
Normal file
@@ -0,0 +1,56 @@
+name: cnvpytor_view
+description: view function to generate vcfs
+keywords:
+  - cnv calling
+tools:
+  - cnvpytor:
+      description: calling CNVs using read depth
+      homepage: https://github.com/abyzovlab/CNVpytor
+      documentation: https://github.com/abyzovlab/CNVpytor
+      tool_dev_url: https://github.com/abyzovlab/CNVpytor
+      doi: "10.1101/2021.01.27.428472v1"
+      licence: ["MIT"]
+
+input:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test' ]
+  - pytor_files:
+      type: file
+      description: pytor file containing cnv calls. To merge calls from multiple samples use a list of files.
+      pattern: "*.{pytor}"
+  - bin_sizes:
+      type: string
+      description: list of binsizes separated by space e.g. "1000 10000" and "1000"
+  - output_format:
+      type: string
+      description: output format of the cnv calls. Valid entries are "tsv", "vcf", and "xls"
+
+output:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test' ]
+  - tsv:
+      type: file
+      description: tsv file containing cnv calls
+      pattern: "*.{tsv}"
+  - vcf:
+      type: file
+      description: vcf file containing cnv calls
+      pattern: "*.{vcf}"
+  - xls:
+      type: file
+      description: xls file containing cnv calls
+      pattern: "*.{xls}"
+  - versions:
+      type: file
+      description: File containing software versions
+      pattern: "versions.yml"
+
+authors:
+  - "@sima-r"
+  - "@ramprasadn"
@@ -2,10 +2,10 @@ process CUSTOM_GETCHROMSIZES {
     tag "$fasta"
     label 'process_low'

-    conda (params.enable_conda ? "bioconda::samtools=1.15" : null)
+    conda (params.enable_conda ? "bioconda::samtools=1.15.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/samtools:1.15--h1170115_1' :
-        'quay.io/biocontainers/samtools:1.15--h1170115_1' }"
+        'https://depot.galaxyproject.org/singularity/samtools:1.15.1--h1170115_0' :
+        'quay.io/biocontainers/samtools:1.15.1--h1170115_0' }"

     input:
     path fasta
20
modules/custom/sratoolsncbisettings/main.nf
Normal file
@@ -0,0 +1,20 @@
+process CUSTOM_SRATOOLSNCBISETTINGS {
+    tag 'ncbi-settings'
+    label 'process_low'
+
+    conda (params.enable_conda ? 'bioconda::sra-tools=2.11.0' : null)
+    container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
+        'https://depot.galaxyproject.org/singularity/sra-tools:2.11.0--pl5321ha49a11a_3' :
+        'quay.io/biocontainers/sra-tools:2.11.0--pl5321ha49a11a_3' }"
+
+    output:
+    path('*.mkfg')     , emit: ncbi_settings
+    path 'versions.yml', emit: versions
+
+    when:
+    task.ext.when == null || task.ext.when
+
+    shell:
+    config = "/LIBS/GUID = \"${UUID.randomUUID().toString()}\"\\n/libs/cloud/report_instance_identity = \"true\"\\n"
+    template 'detect_ncbi_settings.sh'
+}
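A brief orientation sketch (the module path is an assumption): the `shell:` block defines the Groovy variable `config`, which the bash template below references as `!{config}` via Nextflow's shell-template interpolation, so the process needs no declared inputs.

```nextflow
// Hypothetical wiring: detect or create NCBI settings once, then reuse the file
include { CUSTOM_SRATOOLSNCBISETTINGS } from './modules/custom/sratoolsncbisettings/main'

workflow {
    CUSTOM_SRATOOLSNCBISETTINGS()
    // downstream sra-tools processes can stage this .mkfg settings file
    CUSTOM_SRATOOLSNCBISETTINGS.out.ncbi_settings.view { "NCBI settings: $it" }
}
```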
28
modules/custom/sratoolsncbisettings/meta.yml
Normal file
@@ -0,0 +1,28 @@
+name: "sratoolsncbisettings"
+description: Test for the presence of suitable NCBI settings or create them on the fly.
+keywords:
+  - NCBI
+  - settings
+  - sra-tools
+  - prefetch
+  - fasterq-dump
+tools:
+  - "sratools":
+      description: "SRA Toolkit and SDK from NCBI"
+      homepage: https://github.com/ncbi/sra-tools
+      documentation: https://github.com/ncbi/sra-tools/wiki
+      tool_dev_url: https://github.com/ncbi/sra-tools
+      licence: "['Public Domain']"
+
+output:
+  - versions:
+      type: file
+      description: File containing software versions
+      pattern: "versions.yml"
+  - ncbi_settings:
+      type: file
+      description: An NCBI user settings file.
+      pattern: "*.mkfg"
+
+authors:
+  - "@Midnighter"
@@ -0,0 +1,45 @@
+#!/usr/bin/env bash
+
+set -u
+
+# Get the expected NCBI settings path and define the environment variable
+# `NCBI_SETTINGS`.
+eval "$(vdb-config -o n NCBI_SETTINGS | sed 's/[" ]//g')"
+
+# If the user settings do not exist yet, create a file suitable for `prefetch`
+# and `fasterq-dump`. If an existing settings file does not contain the required
+# values, error out with a helpful message.
+if [[ ! -f "${NCBI_SETTINGS}" ]]; then
+    printf '!{config}' > 'user-settings.mkfg'
+else
+    prefetch --help &> /dev/null
+    if [[ $? = 78 ]]; then
+        echo "You have an existing vdb-config at '${NCBI_SETTINGS}' but it is"\
+            "missing the required entries for /LIBS/GUID and"\
+            "/libs/cloud/report_instance_identity."\
+            "Feel free to add the following to your settings file:" >&2
+        echo "$(printf '!{config}')" >&2
+        exit 1
+    fi
+    fasterq-dump --help &> /dev/null
+    if [[ $? = 78 ]]; then
+        echo "You have an existing vdb-config at '${NCBI_SETTINGS}' but it is"\
+            "missing the required entries for /LIBS/GUID and"\
+            "/libs/cloud/report_instance_identity."\
+            "Feel free to add the following to your settings file:" >&2
+        echo "$(printf '!{config}')" >&2
+        exit 1
+    fi
+    if [[ "${NCBI_SETTINGS}" != *.mkfg ]]; then
+        echo "The detected settings '${NCBI_SETTINGS}' do not have the required"\
+            "file extension '.mkfg'." >&2
+        exit 1
+    fi
+    cp "${NCBI_SETTINGS}" ./
+fi
+
+cat <<-END_VERSIONS > versions.yml
+"!{task.process}":
+    sratools: $(vdb-config --version 2>&1 | grep -Eo '[0-9.]+')
+END_VERSIONS
@@ -2,20 +2,26 @@ process DIAMOND_BLASTP {
     tag "$meta.id"
     label 'process_medium'

-    // Dimaond is limited to v2.0.9 because there is not a
-    // singularity version higher than this at the current time.
-    conda (params.enable_conda ? "bioconda::diamond=2.0.9" : null)
+    conda (params.enable_conda ? "bioconda::diamond=2.0.15" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/diamond:2.0.9--hdcc8f71_0' :
-        'quay.io/biocontainers/diamond:2.0.9--hdcc8f71_0' }"
+        'https://depot.galaxyproject.org/singularity/diamond:2.0.15--hb97b32f_0' :
+        'quay.io/biocontainers/diamond:2.0.15--hb97b32f_0' }"

     input:
     tuple val(meta), path(fasta)
-    path  db
+    path db
+    val out_ext
+    val blast_columns

     output:
-    tuple val(meta), path('*.txt'), emit: txt
-    path "versions.yml" , emit: versions
+    tuple val(meta), path('*.blast'), optional: true, emit: blast
+    tuple val(meta), path('*.xml') , optional: true, emit: xml
+    tuple val(meta), path('*.txt') , optional: true, emit: txt
+    tuple val(meta), path('*.daa') , optional: true, emit: daa
+    tuple val(meta), path('*.sam') , optional: true, emit: sam
+    tuple val(meta), path('*.tsv') , optional: true, emit: tsv
+    tuple val(meta), path('*.paf') , optional: true, emit: paf
+    path "versions.yml" , emit: versions

     when:
     task.ext.when == null || task.ext.when

@@ -23,6 +29,21 @@ process DIAMOND_BLASTP {
     script:
     def args = task.ext.args ?: ''
     def prefix = task.ext.prefix ?: "${meta.id}"
+    def columns = blast_columns ? "${blast_columns}" : ''
+    switch ( out_ext ) {
+        case "blast": outfmt = 0; break
+        case "xml": outfmt = 5; break
+        case "txt": outfmt = 6; break
+        case "daa": outfmt = 100; break
+        case "sam": outfmt = 101; break
+        case "tsv": outfmt = 102; break
+        case "paf": outfmt = 103; break
+        default:
+            outfmt = '6';
+            out_ext = 'txt';
+            log.warn("Unknown output file format provided (${out_ext}): selecting DIAMOND default of tabular BLAST output (txt)");
+            break
+    }
     """
     DB=`find -L ./ -name "*.dmnd" | sed 's/.dmnd//'`

@@ -31,8 +52,9 @@ process DIAMOND_BLASTP {
         --threads $task.cpus \\
         --db \$DB \\
         --query $fasta \\
+        --outfmt ${outfmt} ${columns} \\
         $args \\
-        --out ${prefix}.txt
+        --out ${prefix}.${out_ext}

     cat <<-END_VERSIONS > versions.yml
     "${task.process}":
@@ -28,12 +28,50 @@ input:
       type: directory
       description: Directory containing the protein blast database
      pattern: "*"
+  - out_ext:
+      type: string
+      description: |
+        Specify the type of output file to be generated. `blast` corresponds to
+        BLAST pairwise format. `xml` corresponds to BLAST xml format.
+        `txt` corresponds to to BLAST tabular format. `tsv` corresponds to
+        taxonomic classification format.
+      pattern: "blast|xml|txt|daa|sam|tsv|paf"
+  - blast_columns:
+      type: string
+      description: |
+        Optional space separated list of DIAMOND tabular BLAST output keywords
+        used for in conjunction with the 'txt' out_ext option (--outfmt 6). See
+        DIAMOND documnetation for more information.

 output:
-  - txt:
+  - blast:
       type: file
       description: File containing blastp hits
-      pattern: "*.{blastp.txt}"
+      pattern: "*.{blast}"
+  - xml:
+      type: file
+      description: File containing blastp hits
+      pattern: "*.{xml}"
+  - txt:
+      type: file
+      description: File containing hits in tabular BLAST format.
+      pattern: "*.{txt}"
+  - daa:
+      type: file
+      description: File containing hits DAA format
+      pattern: "*.{daa}"
+  - sam:
+      type: file
+      description: File containing aligned reads in SAM format
+      pattern: "*.{sam}"
+  - tsv:
+      type: file
+      description: Tab separated file containing taxonomic classification of hits
+      pattern: "*.{tsv}"
+  - paf:
+      type: file
+      description: File containing aligned reads in pairwise mapping format format
+      pattern: "*.{paf}"
   - versions:
       type: file
       description: File containing software versions

@@ -41,3 +79,4 @@ output:

 authors:
   - "@spficklin"
+  - "@jfy133"
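As a usage sketch of the new output controls (module path, database location, and file names are assumptions; the column keywords are standard DIAMOND tabular fields):

```nextflow
// Hypothetical call of DIAMOND_BLASTP with an explicit output format
include { DIAMOND_BLASTP } from './modules/diamond/blastp/main'

workflow {
    fasta_ch = Channel.of([ [ id:'test' ], file('proteins.fasta') ])
    db_ch    = file('diamond_db')   // directory holding the .dmnd database
    // 'txt' maps to --outfmt 6; the column list is only honoured for tabular output
    DIAMOND_BLASTP(fasta_ch, db_ch, 'txt', 'qseqid sseqid pident evalue')
    DIAMOND_BLASTP.out.txt.view()
}
```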
@@ -2,20 +2,26 @@ process DIAMOND_BLASTX {
     tag "$meta.id"
     label 'process_medium'

-    // Dimaond is limited to v2.0.9 because there is not a
-    // singularity version higher than this at the current time.
-    conda (params.enable_conda ? "bioconda::diamond=2.0.9" : null)
+    conda (params.enable_conda ? "bioconda::diamond=2.0.15" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/diamond:2.0.9--hdcc8f71_0' :
-        'quay.io/biocontainers/diamond:2.0.9--hdcc8f71_0' }"
+        'https://depot.galaxyproject.org/singularity/diamond:2.0.15--hb97b32f_0' :
+        'quay.io/biocontainers/diamond:2.0.15--hb97b32f_0' }"

     input:
     tuple val(meta), path(fasta)
-    path  db
+    path db
+    val out_ext
+    val blast_columns

     output:
-    tuple val(meta), path('*.txt'), emit: txt
-    path "versions.yml" , emit: versions
+    tuple val(meta), path('*.blast'), optional: true, emit: blast
+    tuple val(meta), path('*.xml') , optional: true, emit: xml
+    tuple val(meta), path('*.txt') , optional: true, emit: txt
+    tuple val(meta), path('*.daa') , optional: true, emit: daa
+    tuple val(meta), path('*.sam') , optional: true, emit: sam
+    tuple val(meta), path('*.tsv') , optional: true, emit: tsv
+    tuple val(meta), path('*.paf') , optional: true, emit: paf
+    path "versions.yml" , emit: versions

     when:
     task.ext.when == null || task.ext.when

@@ -23,6 +29,21 @@ process DIAMOND_BLASTX {
     script:
     def args = task.ext.args ?: ''
     def prefix = task.ext.prefix ?: "${meta.id}"
+    def columns = blast_columns ? "${blast_columns}" : ''
+    switch ( out_ext ) {
+        case "blast": outfmt = 0; break
+        case "xml": outfmt = 5; break
+        case "txt": outfmt = 6; break
+        case "daa": outfmt = 100; break
+        case "sam": outfmt = 101; break
+        case "tsv": outfmt = 102; break
+        case "paf": outfmt = 103; break
+        default:
+            outfmt = '6';
+            out_ext = 'txt';
+            log.warn("Unknown output file format provided (${out_ext}): selecting DIAMOND default of tabular BLAST output (txt)");
+            break
+    }
     """
     DB=`find -L ./ -name "*.dmnd" | sed 's/.dmnd//'`

@@ -31,8 +52,9 @@ process DIAMOND_BLASTX {
         --threads $task.cpus \\
         --db \$DB \\
         --query $fasta \\
+        --outfmt ${outfmt} ${columns} \\
         $args \\
-        --out ${prefix}.txt
+        --out ${prefix}.${out_ext}

     cat <<-END_VERSIONS > versions.yml
     "${task.process}":
@@ -28,12 +28,44 @@ input:
       type: directory
       description: Directory containing the nucelotide blast database
       pattern: "*"
+  - out_ext:
+      type: string
+      description: |
+        Specify the type of output file to be generated. `blast` corresponds to
+        BLAST pairwise format. `xml` corresponds to BLAST xml format.
+        `txt` corresponds to to BLAST tabular format. `tsv` corresponds to
+        taxonomic classification format.
+      pattern: "blast|xml|txt|daa|sam|tsv|paf"

 output:
+  - blast:
+      type: file
+      description: File containing blastp hits
+      pattern: "*.{blast}"
+  - xml:
+      type: file
+      description: File containing blastp hits
+      pattern: "*.{xml}"
   - txt:
       type: file
-      description: File containing blastx hits
-      pattern: "*.{blastx.txt}"
+      description: File containing hits in tabular BLAST format.
+      pattern: "*.{txt}"
+  - daa:
+      type: file
+      description: File containing hits DAA format
+      pattern: "*.{daa}"
+  - sam:
+      type: file
+      description: File containing aligned reads in SAM format
+      pattern: "*.{sam}"
+  - tsv:
+      type: file
+      description: Tab separated file containing taxonomic classification of hits
+      pattern: "*.{tsv}"
+  - paf:
+      type: file
+      description: File containing aligned reads in pairwise mapping format format
+      pattern: "*.{paf}"
   - versions:
       type: file
       description: File containing software versions

@@ -41,3 +73,4 @@ output:

 authors:
   - "@spficklin"
+  - "@jfy133"
@@ -2,12 +2,10 @@ process DIAMOND_MAKEDB {
     tag "$fasta"
     label 'process_medium'

-    // Dimaond is limited to v2.0.9 because there is not a
-    // singularity version higher than this at the current time.
-    conda (params.enable_conda ? 'bioconda::diamond=2.0.9' : null)
+    conda (params.enable_conda ? "bioconda::diamond=2.0.15" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/diamond:2.0.9--hdcc8f71_0' :
-        'quay.io/biocontainers/diamond:2.0.9--hdcc8f71_0' }"
+        'https://depot.galaxyproject.org/singularity/diamond:2.0.15--hb97b32f_0' :
+        'quay.io/biocontainers/diamond:2.0.15--hb97b32f_0' }"

     input:
     path fasta
43
modules/elprep/merge/main.nf
Normal file
@@ -0,0 +1,43 @@
+process ELPREP_MERGE {
+    tag "$meta.id"
+    label 'process_low'
+
+    conda (params.enable_conda ? "bioconda::elprep=5.1.2" : null)
+    container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
+        'https://depot.galaxyproject.org/singularity/elprep:5.1.2--he881be0_0':
+        'quay.io/biocontainers/elprep:5.1.2--he881be0_0' }"
+
+    input:
+    tuple val(meta), path(bam)
+
+    output:
+    tuple val(meta), path("output/**.{bam,sam}") , emit: bam
+    path "versions.yml" , emit: versions
+
+    when:
+    task.ext.when == null || task.ext.when
+
+    script:
+    def args = task.ext.args ?: ''
+    def prefix = task.ext.prefix ?: "${meta.id}"
+    def suffix = args.contains("--output-type sam") ? "sam" : "bam"
+    def single_end = meta.single_end ? " --single-end" : ""
+
+    """
+    # create directory and move all input so elprep can find and merge them before splitting
+    mkdir input
+    mv ${bam} input/
+
+    elprep merge \\
+        input/ \\
+        output/${prefix}.${suffix} \\
+        $args \\
+        ${single_end} \\
+        --nr-of-threads $task.cpus
+
+    cat <<-END_VERSIONS > versions.yml
+    "${task.process}":
+        elprep: \$(elprep 2>&1 | head -n2 | tail -n1 |sed 's/^.*version //;s/ compiled.*\$//')
+    END_VERSIONS
+    """
+}
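A usage sketch for this new module (module path and chunk file names are assumptions): the per-sample chunks are gathered into one list element before the merge.

```nextflow
// Hypothetical sketch: collect split elPrep chunks per sample, then merge
include { ELPREP_MERGE } from './modules/elprep/merge/main'

workflow {
    chunks_ch = Channel.of(
        [ [ id:'test', single_end:false ], file('chunk1.bam') ],
        [ [ id:'test', single_end:false ], file('chunk2.bam') ]
    ).groupTuple()   // groups the chunks by the meta map
    ELPREP_MERGE(chunks_ch)
}
```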
44
modules/elprep/merge/meta.yml
Normal file
@@ -0,0 +1,44 @@
+name: "elprep_merge"
+description: Merge split bam/sam chunks in one file
+keywords:
+  - bam
+  - sam
+  - merge
+tools:
+  - "elprep":
+      description: "elPrep is a high-performance tool for preparing .sam/.bam files for variant calling in sequencing pipelines. It can be used as a drop-in replacement for SAMtools/Picard/GATK4."
+      homepage: "https://github.com/ExaScience/elprep"
+      documentation: "https://github.com/ExaScience/elprep"
+      tool_dev_url: "https://github.com/ExaScience/elprep"
+      doi: "10.1371/journal.pone.0244471"
+      licence: "['AGPL v3']"
+
+input:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - bam:
+      type: file
+      description: List of BAM/SAM chunks to merge
+      pattern: "*.{bam,sam}"
+
+output:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  #
+  - versions:
+      type: file
+      description: File containing software versions
+      pattern: "versions.yml"
+  - bam:
+      type: file
+      description: Merged BAM/SAM file
+      pattern: "*.{bam,sam}"
+
+authors:
+  - "@matthdsm"
@@ -8,13 +8,14 @@ LABEL \
 COPY environment.yml /
 RUN conda env create -f /environment.yml && conda clean -a

-# Add conda installation dir to PATH (instead of doing 'conda activate')
-ENV PATH /opt/conda/envs/nf-core-vep-104.3/bin:$PATH
-
 # Setup default ARG variables
 ARG GENOME=GRCh38
 ARG SPECIES=homo_sapiens
-ARG VEP_VERSION=99
+ARG VEP_VERSION=104
+ARG VEP_TAG=104.3
+
+# Add conda installation dir to PATH (instead of doing 'conda activate')
+ENV PATH /opt/conda/envs/nf-core-vep-${VEP_TAG}/bin:$PATH

 # Download Genome
 RUN vep_install \

@@ -27,4 +28,4 @@ RUN vep_install \
     --NO_BIOPERL --NO_HTSLIB --NO_TEST --NO_UPDATE

 # Dump the details of the installed packages to a file for posterity
-RUN conda env export --name nf-core-vep-104.3 > nf-core-vep-104.3.yml
+RUN conda env export --name nf-core-vep-${VEP_TAG} > nf-core-vep-${VEP_TAG}.yml
@@ -10,11 +10,12 @@ build_push() {
     VEP_TAG=$4

     docker build \
-        . \
         -t nfcore/vep:${VEP_TAG}.${GENOME} \
+        software/vep/. \
         --build-arg GENOME=${GENOME} \
         --build-arg SPECIES=${SPECIES} \
-        --build-arg VEP_VERSION=${VEP_VERSION}
+        --build-arg VEP_VERSION=${VEP_VERSION} \
+        --build-arg VEP_TAG=${VEP_TAG}

     docker push nfcore/vep:${VEP_TAG}.${GENOME}
 }
@@ -13,6 +13,7 @@ process ENSEMBLVEP {
     val species
     val cache_version
     path cache
+    path extra_files

     output:
     tuple val(meta), path("*.ann.vcf"), emit: vcf
@@ -10,17 +10,6 @@ tools:
       homepage: https://www.ensembl.org/info/docs/tools/vep/index.html
       documentation: https://www.ensembl.org/info/docs/tools/vep/script/index.html
       licence: ["Apache-2.0"]
-params:
-  - use_cache:
-      type: boolean
-      description: |
-        Enable the usage of containers with cache
-        Does not work with conda
-  - vep_tag:
-      type: value
-      description: |
-        Specify the tag for the container
-        https://hub.docker.com/r/nfcore/vep/tags
 input:
   - meta:
       type: map

@@ -47,6 +36,10 @@ input:
       type: file
       description: |
        path to VEP cache (optional)
+  - extra_files:
+      type: tuple
+      description: |
+        path to file(s) needed for plugins (optional)
 output:
   - vcf:
       type: file
41
modules/gamma/main.nf
Normal file
@@ -0,0 +1,41 @@
+def VERSION = '2.1' // Version information not provided by tool on CLI
+
+process GAMMA {
+    tag "$meta.id"
+    label 'process_low'
+
+    conda (params.enable_conda ? "bioconda::gamma=2.1" : null)
+    container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
+        'https://depot.galaxyproject.org/singularity/gamma%3A2.1--hdfd78af_0':
+        'quay.io/biocontainers/gamma:2.1--hdfd78af_0' }"
+
+    input:
+    tuple val(meta), path(fasta)
+    path(db)
+
+    output:
+    tuple val(meta), path("*.gamma") , emit: gamma
+    tuple val(meta), path("*.psl") , emit: psl
+    tuple val(meta), path("*.gff") , optional:true , emit: gff
+    tuple val(meta), path("*.fasta"), optional:true , emit: fasta
+    path "versions.yml" , emit: versions
+
+    when:
+    task.ext.when == null || task.ext.when
+
+    script:
+    def args = task.ext.args ?: ''
+    def prefix = task.ext.prefix ?: "${meta.id}"
+    """
+    GAMMA.py \\
+        $args \\
+        $fasta \\
+        $db \\
+        $prefix
+
+    cat <<-END_VERSIONS > versions.yml
+    "${task.process}":
+        gamma: $VERSION
+    END_VERSIONS
+    """
+}
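A usage sketch for the new module (module path and FASTA file names are assumptions for illustration):

```nextflow
// Hypothetical GAMMA invocation against a gene database
include { GAMMA } from './modules/gamma/main'

workflow {
    assembly_ch = Channel.of([ [ id:'test', single_end:false ], file('assembly.fasta') ])
    // the database is itself a FASTA file of target genes
    GAMMA(assembly_ch, file('resistance_genes.fasta'))
    GAMMA.out.gamma.view()   // annotated gene matches
}
```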
63
modules/gamma/meta.yml
Normal file
@@ -0,0 +1,63 @@
+name: "gamma"
+description: Gene Allele Mutation Microbial Assessment
+keywords:
+  - gamma
+  - gene-calling
+tools:
+  - "gamma":
+      description: "Tool for Gene Allele Mutation Microbial Assessment"
+      homepage: "https://github.com/rastanton/GAMMA"
+      documentation: "https://github.com/rastanton/GAMMA"
+      tool_dev_url: "https://github.com/rastanton/GAMMA"
+      doi: "10.1093/bioinformatics/btab607"
+      licence: "['Apache License 2.0']"
+
+input:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - fasta:
+      type: file
+      description: FASTA file
+      pattern: "*.{fa,fasta}"
+  - db:
+      type: file
+      description: Database in FASTA format
+      pattern: "*.{fa,fasta}"
+
+output:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - versions:
+      type: file
+      description: File containing software versions
+      pattern: "versions.yml"
+
+  - gamma:
+      type: file
+      description: GAMMA file with annotated gene matches
+      pattern: "*.{gamma}"
+
+  - psl:
+      type: file
+      description: PSL file with all gene matches found
+      pattern: "*.{psl}"
+
+  - gff:
+      type: file
+      description: GFF file
+      pattern: "*.{gff}"
+
+  - fasta:
+      type: file
+      description: multifasta file of the gene matches
+      pattern: "*.{fasta}"
+
+authors:
+  - "@sateeshperi"
+  - "@rastanton"
@@ -2,10 +2,10 @@ process GATK4_APPLYBQSR {
     tag "$meta.id"
     label 'process_low'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(input), path(input_index), path(bqsr_table), path(intervals)

@@ -2,10 +2,10 @@ process GATK4_APPLYBQSR_SPARK {
     tag "$meta.id"
     label 'process_low'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.3.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.3.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.3.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(input), path(input_index), path(bqsr_table), path(intervals)

@@ -2,10 +2,10 @@ process GATK4_APPLYVQSR {
     tag "$meta.id"
     label 'process_low'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(vcf), path(vcf_tbi), path(recal), path(recal_index), path(tranches)

@@ -2,10 +2,10 @@ process GATK4_BASERECALIBRATOR {
     tag "$meta.id"
     label 'process_low'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(input), path(input_index), path(intervals)

@@ -2,10 +2,10 @@ process GATK4_BASERECALIBRATOR_SPARK {
     tag "$meta.id"
     label 'process_low'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.3.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.3.0--hdfd78af_0' :
-        'broadinstitute/gatk:4.2.3.0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(input), path(input_index), path(intervals)

@@ -2,10 +2,10 @@ process GATK4_BEDTOINTERVALLIST {
     tag "$meta.id"
     label 'process_medium'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(bed)

@@ -2,10 +2,10 @@ process GATK4_CALCULATECONTAMINATION {
     tag "$meta.id"
     label 'process_low'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(pileup), path(matched)
58
modules/gatk4/cnnscorevariants/main.nf
Normal file
@@ -0,0 +1,58 @@
+process GATK4_CNNSCOREVARIANTS {
+    tag "$meta.id"
+    label 'process_low'
+
+    //Conda is not supported at the moment: https://github.com/broadinstitute/gatk/issues/7811
+    if (params.enable_conda) {
+        exit 1, "Conda environments cannot be used for GATK4/CNNScoreVariants at the moment. Please use docker or singularity containers."
+    }
+    container 'broadinstitute/gatk:4.2.6.1' //Biocontainers is missing a package
+
+    input:
+    tuple val(meta), path(vcf), path(tbi), path(aligned_input), path(intervals)
+    path fasta
+    path fai
+    path dict
+    path architecture
+    path weights
+
+    output:
+    tuple val(meta), path("*cnn.vcf.gz")    , emit: vcf
+    tuple val(meta), path("*cnn.vcf.gz.tbi"), emit: tbi
+    path "versions.yml"                     , emit: versions
+
+    when:
+    task.ext.when == null || task.ext.when
+
+    script:
+    def args = task.ext.args ?: ''
+    def prefix = task.ext.prefix ?: "${meta.id}"
+    def aligned_input = aligned_input ? "--input $aligned_input" : ""
+    def interval_command = intervals ? "--intervals $intervals" : ""
+    def architecture = architecture ? "--architecture $architecture" : ""
+    def weights = weights ? "--weights $weights" : ""
+
+    def avail_mem = 3
+    if (!task.memory) {
+        log.info '[GATK CnnScoreVariants] Available memory not known - defaulting to 3GB. Specify process memory requirements to change this.'
+    } else {
+        avail_mem = task.memory.giga
+    }
+    """
+    gatk --java-options "-Xmx${avail_mem}g" CNNScoreVariants \\
+        --variant $vcf \\
+        --output ${prefix}.cnn.vcf.gz \\
+        --reference $fasta \\
+        $interval_command \\
+        $aligned_input \\
+        $architecture \\
+        $weights \\
+        --tmp-dir . \\
+        $args
+
+    cat <<-END_VERSIONS > versions.yml
+    "${task.process}":
+        gatk4: \$(echo \$(gatk --version 2>&1) | sed 's/^.*(GATK) v//; s/ .*\$//')
+    END_VERSIONS
+    """
+}
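A usage sketch for this new module (module path and reference file names are assumptions): the optional tuple and path inputs can be supplied as empty lists so the ternary guards in the script block resolve to empty strings.

```nextflow
// Hypothetical call of GATK4_CNNSCOREVARIANTS with no BAM, intervals,
// architecture, or weights (empty lists stand in for the optional files)
include { GATK4_CNNSCOREVARIANTS } from './modules/gatk4/cnnscorevariants/main'

workflow {
    vcf_ch = Channel.of([ [ id:'test' ], file('test.vcf.gz'), file('test.vcf.gz.tbi'), [], [] ])
    GATK4_CNNSCOREVARIANTS(vcf_ch,
        file('genome.fasta'), file('genome.fasta.fai'), file('genome.dict'),
        [], [])
}
```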
80
modules/gatk4/cnnscorevariants/meta.yml
Normal file
@@ -0,0 +1,80 @@
+name: "gatk4_cnnscorevariants"
+description: Apply a Convolutional Neural Net to filter annotated variants
+keywords:
+  - gatk4_cnnscorevariants
+  - gatk4
+  - variants
+tools:
+  - gatk4:
+      description: |
+        Developed in the Data Sciences Platform at the Broad Institute, the toolkit offers a wide variety of tools
+        with a primary focus on variant discovery and genotyping. Its powerful processing engine
+        and high-performance computing features make it capable of taking on projects of any size.
+      homepage: https://gatk.broadinstitute.org/hc/en-us
+      documentation: https://gatk.broadinstitute.org/hc/en-us/categories/360002369672s
+      doi: 10.1158/1538-7445.AM2017-3590
+      licence: ["Apache-2.0"]
+
+input:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - vcf:
+      type: file
+      description: VCF file
+      pattern: "*.vcf.gz"
+  - tbi:
+      type: file
+      description: VCF index file
+      pattern: "*.vcf.gz.tbi"
+  - aligned_input:
+      type: file
+      description: BAM/CRAM file from alignment (optional)
+      pattern: "*.{bam,cram}"
+  - intervals:
+      type: file
+      description: Bed file with the genomic regions included in the library (optional)
+  - fasta:
+      type: file
+      description: The reference fasta file
+      pattern: "*.fasta"
+  - fai:
+      type: file
+      description: Index of reference fasta file
+      pattern: "*.fasta.fai"
+  - dict:
+      type: file
+      description: GATK sequence dictionary
+      pattern: "*.dict"
+  - architecture:
+      type: file
+      description: Neural Net architecture configuration json file (optional)
+      pattern: "*.json"
+  - weights:
+      type: file
+      description: Keras model HD5 file with neural net weights. (optional)
+      pattern: "*.hd5"
+
+output:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - versions:
+      type: file
+      description: File containing software versions
+      pattern: "versions.yml"
+  - vcf:
+      type: file
+      description: Annotated VCF file
+      pattern: "*.vcf"
+  - tbi:
+      type: file
+      description: VCF index file
+      pattern: "*.vcf.gz.tbi"
+
+authors:
+  - "@FriederikeHanssen"
@@ -2,10 +2,10 @@ process GATK4_COMBINEGVCFS {
     tag "$meta.id"
     label 'process_low'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(vcf), path(vcf_idx)

@@ -2,10 +2,10 @@ process GATK4_CREATESEQUENCEDICTIONARY {
     tag "$fasta"
     label 'process_medium'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     path fasta

@@ -2,10 +2,10 @@ process GATK4_CREATESOMATICPANELOFNORMALS {
     tag "$meta.id"
     label 'process_low'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(genomicsdb)

@@ -2,10 +2,10 @@ process GATK4_ESTIMATELIBRARYCOMPLEXITY {
     tag "$meta.id"
     label 'process_medium'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(input)

@@ -2,10 +2,10 @@ process GATK4_FASTQTOSAM {
     tag "$meta.id"
     label 'process_medium'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(reads)

@@ -2,10 +2,10 @@ process GATK4_FILTERMUTECTCALLS {
     tag "$meta.id"
     label 'process_low'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(vcf), path(vcf_tbi), path(stats), path(orientationbias), path(segmentation), path(table), val(estimate)
51
modules/gatk4/filtervarianttranches/main.nf
Normal file
@@ -0,0 +1,51 @@
+process GATK4_FILTERVARIANTTRANCHES {
+    tag "$meta.id"
+    label 'process_low'
+
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
+    container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"
+
+    input:
+    tuple val(meta), path(vcf), path(tbi), path(intervals)
+    path resources
+    path resources_index
+    path fasta
+    path fai
+    path dict
+
+
+    output:
+    tuple val(meta), path("*.vcf.gz")    , emit: vcf
+    tuple val(meta), path("*.vcf.gz.tbi"), emit: tbi
+    path "versions.yml"                  , emit: versions
+
+    when:
+    task.ext.when == null || task.ext.when
+
+    script:
+    def args = task.ext.args ?: ''
+    def prefix = task.ext.prefix ?: "${meta.id}"
+
+    def resources = resources.collect{"--resource $it"}.join(' ')
+    def avail_mem = 3
+    if (!task.memory) {
+        log.info '[GATK FilterVariantTranches] Available memory not known - defaulting to 3GB. Specify process memory requirements to change this.'
+    } else {
+        avail_mem = task.memory.giga
+    }
+    """
+    gatk --java-options "-Xmx${avail_mem}g" FilterVariantTranches \\
+        --variant $vcf \\
+        $resources \\
+        --output ${prefix}.filtered.vcf.gz \\
+        --tmp-dir . \\
+        $args
+
+    cat <<-END_VERSIONS > versions.yml
+    "${task.process}":
+        gatk4: \$(echo \$(gatk --version 2>&1) | sed 's/^.*(GATK) v//; s/ .*\$//')
+    END_VERSIONS
+    """
+}
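A usage sketch for this new module (module path and resource file names are assumptions): several known-sites VCFs are passed as one list, and the `collect{...}.join(' ')` in the script expands them into repeated `--resource` flags.

```nextflow
// Hypothetical call of GATK4_FILTERVARIANTTRANCHES with two resource VCFs
include { GATK4_FILTERVARIANTTRANCHES } from './modules/gatk4/filtervarianttranches/main'

workflow {
    vcf_ch = Channel.of([ [ id:'test' ], file('scored.cnn.vcf.gz'), file('scored.cnn.vcf.gz.tbi'), [] ])
    resources     = [ file('hapmap.vcf.gz'), file('mills.vcf.gz') ]
    resources_tbi = [ file('hapmap.vcf.gz.tbi'), file('mills.vcf.gz.tbi') ]
    GATK4_FILTERVARIANTTRANCHES(vcf_ch, resources, resources_tbi,
        file('genome.fasta'), file('genome.fasta.fai'), file('genome.dict'))
}
```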
68
modules/gatk4/filtervarianttranches/meta.yml
Normal file
@@ -0,0 +1,68 @@
+name: "gatk4_filtervarianttranches"
+description: Apply tranche filtering
+keywords:
+  - gatk4
+  - filtervarianttranches
+
+tools:
+  - "gatk4":
+      description: Genome Analysis Toolkit (GATK4)
+      homepage: https://gatk.broadinstitute.org/hc/en-us
+      documentation: https://gatk.broadinstitute.org/hc/en-us
+      tool_dev_url: https://github.com/broadinstitute/gatk
+      doi: "10.1158/1538-7445.AM2017-3590"
+      licence: ["BSD-3-clause"]
+
+input:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - vcf:
+      type: file
+      description: a VCF file containing variants, must have info key:CNN_2D
+      pattern: "*.vcf.gz"
+  - tbi:
+      type: file
+      description: tbi file matching with -vcf
+      pattern: "*.vcf.gz.tbi"
+  - resources:
+      type: list
+      description: resource A VCF containing known SNP and or INDEL sites. Can be supplied as many times as necessary
+      pattern: "*.vcf.gz"
+  - resources_index:
+      type: list
+      description: Index of resource VCF containing known SNP and or INDEL sites. Can be supplied as many times as necessary
+      pattern: "*.vcf.gz"
+  - fasta:
+      type: file
+      description: The reference fasta file
+      pattern: "*.fasta"
+  - fai:
+      type: file
+      description: Index of reference fasta file
+      pattern: "fasta.fai"
+  - dict:
+
+output:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - versions:
+      type: file
+      description: File containing software versions
+      pattern: "versions.yml"
+  - vcf:
+      type: file
+      description: VCF file
+      pattern: "*.vcf.gz"
+  - tbi:
+      type: file
+      description: VCF index file
+      pattern: "*.vcf.gz.tbi"
+
+authors:
+  - "@FriederikeHanssen"
@@ -2,10 +2,10 @@ process GATK4_GATHERBQSRREPORTS {
     tag "$meta.id"
     label 'process_medium'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(table)

@@ -2,10 +2,10 @@ process GATK4_GATHERPILEUPSUMMARIES {
     tag "$meta.id"
     label 'process_low'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"


     input:

@@ -2,10 +2,10 @@ process GATK4_GENOMICSDBIMPORT {
     tag "$meta.id"
     label 'process_low'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(vcf), path(tbi), path(interval_file), val(interval_value), path(wspace)

@@ -2,10 +2,10 @@ process GATK4_GENOTYPEGVCFS {
     tag "$meta.id"
     label 'process_high'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(gvcf), path(gvcf_index), path(intervals), path(intervals_index)

@@ -2,10 +2,10 @@ process GATK4_GETPILEUPSUMMARIES {
     tag "$meta.id"
     label 'process_low'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(input), path(index), path(intervals)
@@ -40,7 +40,7 @@ process GATK4_GETPILEUPSUMMARIES {
         --variant $variants \\
         --output ${prefix}.pileups.table \\
         $reference_command \\
-        $sites_command \\
+        $interval_command \\
         --tmp-dir . \\
         $args
@@ -2,10 +2,10 @@ process GATK4_HAPLOTYPECALLER {
     tag "$meta.id"
     label 'process_medium'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(input), path(input_index), path(intervals)

@@ -17,7 +17,7 @@ process GATK4_HAPLOTYPECALLER {

     output:
     tuple val(meta), path("*.vcf.gz"), emit: vcf
-    tuple val(meta), path("*.tbi") , emit: tbi
+    tuple val(meta), path("*.tbi") , optional:true, emit: tbi
     path "versions.yml" , emit: versions

     when:
@@ -2,10 +2,10 @@ process GATK4_INDEXFEATUREFILE {
     tag "$meta.id"
     label 'process_low'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(feature_file)

@@ -2,10 +2,10 @@ process GATK4_INTERVALLISTTOBED {
     tag "$meta.id"
     label 'process_low'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(intervals)

@@ -2,10 +2,10 @@ process GATK4_INTERVALLISTTOOLS {
     tag "$meta.id"
     label 'process_medium'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(intervals)

@@ -2,10 +2,10 @@ process GATK4_LEARNREADORIENTATIONMODEL {
     tag "$meta.id"
     label 'process_low'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(f1r2)
@@ -1,18 +1,18 @@
 process GATK4_MARKDUPLICATES {
     tag "$meta.id"
-    label 'process_low'
+    label 'process_medium'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(bam)

     output:
     tuple val(meta), path("*.bam") , emit: bam
-    tuple val(meta), path("*.bai") , emit: bai
+    tuple val(meta), path("*.bai") , optional:true, emit: bai
     tuple val(meta), path("*.metrics"), emit: metrics
     path "versions.yml" , emit: versions
@@ -2,10 +2,10 @@ process GATK4_MERGEBAMALIGNMENT {
     tag "$meta.id"
     label 'process_low'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(aligned), path(unmapped)

@@ -43,4 +43,15 @@ process GATK4_MERGEBAMALIGNMENT {
         gatk4: \$(echo \$(gatk --version 2>&1) | sed 's/^.*(GATK) v//; s/ .*\$//')
     END_VERSIONS
     """
+
+    stub:
+    def prefix = task.ext.prefix ?: "${meta.id}"
+    """
+    touch ${prefix}.bam
+
+    cat <<-END_VERSIONS > versions.yml
+    "${task.process}":
+        gatk4: \$(echo \$(gatk --version 2>&1) | sed 's/^.*(GATK) v//; s/ .*\$//')
+    END_VERSIONS
+    """
 }
@@ -2,10 +2,10 @@ process GATK4_MERGEMUTECTSTATS {
     tag "$meta.id"
     label 'process_low'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(stats)

@@ -2,10 +2,10 @@ process GATK4_MERGEVCFS {
     tag "$meta.id"
     label 'process_medium'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(vcf)
@@ -2,10 +2,10 @@ process GATK4_MUTECT2 {
     tag "$meta.id"
     label 'process_medium'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(input), path(input_index), path(intervals)

@@ -57,4 +57,18 @@ process GATK4_MUTECT2 {
         gatk4: \$(echo \$(gatk --version 2>&1) | sed 's/^.*(GATK) v//; s/ .*\$//')
     END_VERSIONS
     """
+
+    stub:
+    def prefix = task.ext.prefix ?: "${meta.id}"
+    """
+    touch ${prefix}.vcf.gz
+    touch ${prefix}.vcf.gz.tbi
+    touch ${prefix}.vcf.gz.stats
+    touch ${prefix}.f1r2.tar.gz
+
+    cat <<-END_VERSIONS > versions.yml
+    "${task.process}":
+        gatk4: \$(echo \$(gatk --version 2>&1) | sed 's/^.*(GATK) v//; s/ .*\$//')
+    END_VERSIONS
+    """
 }
@@ -2,10 +2,10 @@ process GATK4_REVERTSAM {
     tag "$meta.id"
     label 'process_medium'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(bam)
@@ -39,4 +39,15 @@ process GATK4_REVERTSAM {
         gatk4: \$(echo \$(gatk --version 2>&1) | sed 's/^.*(GATK) v//; s/ .*\$//')
     END_VERSIONS
     """
+
+    stub:
+    def prefix = task.ext.prefix ?: "${meta.id}"
+    """
+    touch ${prefix}.reverted.bam
+
+    cat <<-END_VERSIONS > versions.yml
+    "${task.process}":
+        gatk4: \$(echo \$(gatk --version 2>&1) | sed 's/^.*(GATK) v//; s/ .*\$//')
+    END_VERSIONS
+    """
 }

@@ -2,10 +2,10 @@ process GATK4_SAMTOFASTQ {
     tag "$meta.id"
     label 'process_medium'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(bam)
@@ -40,4 +40,17 @@ process GATK4_SAMTOFASTQ {
         gatk4: \$(echo \$(gatk --version 2>&1) | sed 's/^.*(GATK) v//; s/ .*\$//')
     END_VERSIONS
     """
+
+    stub:
+    def prefix = task.ext.prefix ?: "${meta.id}"
+    """
+    touch ${prefix}.fastq.gz
+    touch ${prefix}_1.fastq.gz
+    touch ${prefix}_2.fastq.gz
+
+    cat <<-END_VERSIONS > versions.yml
+    "${task.process}":
+        gatk4: \$(echo \$(gatk --version 2>&1) | sed 's/^.*(GATK) v//; s/ .*\$//')
+    END_VERSIONS
+    """
 }

@@ -2,10 +2,10 @@ process GATK4_SELECTVARIANTS {
     tag "$meta.id"
     label 'process_medium'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0':
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(vcf), path(vcf_idx)

48 modules/gatk4/splitintervals/main.nf Normal file
@@ -0,0 +1,48 @@
+process GATK4_SPLITINTERVALS {
+    tag "$meta.id"
+    label 'process_low'
+
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
+    container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"
+
+    input:
+    tuple val(meta), path(intervals)
+    path(fasta)
+    path(fasta_fai)
+    path(dict)
+
+    output:
+    tuple val(meta), path("**.interval_list"), emit: split_intervals
+    path "versions.yml"                      , emit: versions
+
+    when:
+    task.ext.when == null || task.ext.when
+
+    script:
+    def args = task.ext.args ?: ''
+    def prefix = task.ext.prefix ?: "${meta.id}"
+    def reference = fasta ? "--reference $fasta" : ""
+
+    def avail_mem = 3
+    if (!task.memory) {
+        log.info '[GATK SplitIntervals] Available memory not known - defaulting to 3GB. Specify process memory requirements to change this.'
+    } else {
+        avail_mem = task.memory.giga
+    }
+
+    """
+    gatk --java-options "-Xmx${avail_mem}g" SplitIntervals \\
+        --output ${prefix} \\
+        --intervals $intervals \\
+        $reference \\
+        --tmp-dir . \\
+        $args
+
+    cat <<-END_VERSIONS > versions.yml
+    "${task.process}":
+        gatk4: \$(echo \$(gatk --version 2>&1) | sed 's/^.*(GATK) v//; s/ .*\$//')
+    END_VERSIONS
+    """
+}
53 modules/gatk4/splitintervals/meta.yml Normal file
@@ -0,0 +1,53 @@
+name: gatk4_splitintervals
+keywords:
+  - interval
+  - bed
+tools:
+  - gatk4:
+      description: Genome Analysis Toolkit (GATK4)
+      homepage: https://gatk.broadinstitute.org/hc/en-us
+      documentation: https://gatk.broadinstitute.org/hc/en-us/categories/360002369672
+      tool_dev_url: https://github.com/broadinstitute/gatk
+      doi: "10.1158/1538-7445.AM2017-3590"
+      licence: ["BSD-3-clause"]
+
+input:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test' ]
+  - intervals:
+      type: file
+      description: Interval list or BED
+      pattern: "*.{interval,interval_list,bed}"
+  - fasta:
+      type: file
+      description: Reference FASTA
+      pattern: "*.{fa,fasta}"
+  - fasta_fai:
+      type: file
+      description: Reference FASTA index
+      pattern: "*.fai"
+  - dict:
+      type: file
+      description: Reference sequence dictionary
+      pattern: "*.dict"
+
+output:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test' ]
+  - split_intervals:
+      type: file
+      description: A list of scattered interval lists
+      pattern: "*.interval_list"
+  - versions:
+      type: file
+      description: File containing software versions
+      pattern: "versions.yml"
+
+authors:
+  - "@nvnieuwk"
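A hedged usage sketch for the new GATK4_SPLITINTERVALS module (illustrative only, not part of the diff; file names are assumptions, and the scatter count would normally be supplied through `task.ext.args`, e.g. `--scatter-count 4`, in a `modules.config`):

```nextflow
include { GATK4_SPLITINTERVALS } from './modules/gatk4/splitintervals/main'

workflow {
    // [ meta, intervals ] pairs, matching the module's input tuple
    intervals_ch = Channel.of([ [ id:'test' ], file('targets.interval_list') ])

    GATK4_SPLITINTERVALS(
        intervals_ch,
        file('genome.fasta'),      // path(fasta)
        file('genome.fasta.fai'),  // path(fasta_fai)
        file('genome.dict')        // path(dict)
    )

    // one [ meta, interval_list(s) ] emission per input sample
    GATK4_SPLITINTERVALS.out.split_intervals.view()
}
```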
@@ -2,13 +2,13 @@ process GATK4_SPLITNCIGARREADS {
     tag "$meta.id"
     label 'process_medium'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
-    tuple val(meta), path(bam)
+    tuple val(meta), path(bam), path(bai), path(intervals)
     path fasta
     path fai
     path dict
@@ -23,6 +23,7 @@ process GATK4_SPLITNCIGARREADS {
     script:
     def args = task.ext.args ?: ''
     def prefix = task.ext.prefix ?: "${meta.id}"
+    def interval_command = intervals ? "--intervals $intervals" : ""

     def avail_mem = 3
     if (!task.memory) {
@@ -35,6 +36,7 @@ process GATK4_SPLITNCIGARREADS {
         --input $bam \\
         --output ${prefix}.bam \\
         --reference $fasta \\
+        $interval_command \\
         --tmp-dir . \\
         $args

@@ -23,6 +23,13 @@ input:
       type: list
       description: BAM/SAM/CRAM file containing reads
       pattern: "*.{bam,sam,cram}"
+  - bai:
+      type: list
+      description: BAI/SAI/CRAI index file (optional)
+      pattern: "*.{bai,sai,crai}"
+  - intervals:
+      type: file
+      description: Bed file with the genomic regions included in the library (optional)
   - fasta:
       type: file
       description: The reference fasta file

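The input cardinality change above ([ meta, bam ] to [ meta, bam, bai, intervals ]) means callers must reshape their channels; in nf-core modules an empty list is the conventional stand-in for an optional path input. A hedged sketch, with channel contents as assumptions:

```nextflow
// New shape: [ meta, bam, bai, intervals ]. Passing [] for intervals is falsy in
// Groovy, so `interval_command` in the script above expands to an empty string.
reads_ch = Channel.of(
    [ [ id:'test' ], file('sample.bam'), file('sample.bam.bai'), [] ]
)
GATK4_SPLITNCIGARREADS(reads_ch, fasta, fai, dict)
```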
@@ -2,10 +2,10 @@ process GATK4_VARIANTFILTRATION {
     tag "$meta.id"
     label 'process_medium'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(vcf), path(tbi)

@@ -2,10 +2,10 @@ process GATK4_VARIANTRECALIBRATOR {
     tag "$meta.id"
     label 'process_low'

-    conda (params.enable_conda ? "bioconda::gatk4=4.2.5.0" : null)
+    conda (params.enable_conda ? "bioconda::gatk4=4.2.6.1" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://depot.galaxyproject.org/singularity/gatk4:4.2.5.0--hdfd78af_0' :
-        'quay.io/biocontainers/gatk4:4.2.5.0--hdfd78af_0' }"
+        'https://depot.galaxyproject.org/singularity/gatk4:4.2.6.1--hdfd78af_0':
+        'quay.io/biocontainers/gatk4:4.2.6.1--hdfd78af_0' }"

     input:
     tuple val(meta), path(vcf), path(tbi)

40 modules/genomescope2/main.nf Normal file
@@ -0,0 +1,40 @@
+process GENOMESCOPE2 {
+    tag "$meta.id"
+    label 'process_low'
+
+    conda (params.enable_conda ? "bioconda::genomescope2=2.0" : null)
+    container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
+        'https://depot.galaxyproject.org/singularity/genomescope2:2.0--py310r41hdfd78af_5':
+        'quay.io/biocontainers/genomescope2:2.0--py310r41hdfd78af_5' }"
+
+    input:
+    tuple val(meta), path(histogram)
+
+    output:
+    tuple val(meta), path("*_linear_plot.png")            , emit: linear_plot_png
+    tuple val(meta), path("*_transformed_linear_plot.png"), emit: transformed_linear_plot_png
+    tuple val(meta), path("*_log_plot.png")               , emit: log_plot_png
+    tuple val(meta), path("*_transformed_log_plot.png")   , emit: transformed_log_plot_png
+    tuple val(meta), path("*_model.txt")                  , emit: model
+    tuple val(meta), path("*_summary.txt")                , emit: summary
+    path "versions.yml"                                   , emit: versions
+
+    when:
+    task.ext.when == null || task.ext.when
+
+    script:
+    def args = task.ext.args ?: ''
+    prefix = task.ext.prefix ?: "${meta.id}"
+    """
+    genomescope2 \\
+        --input $histogram \\
+        $args \\
+        --output . \\
+        --name_prefix $prefix
+
+    cat <<-END_VERSIONS > versions.yml
+    '${task.process}':
+        genomescope2: \$( genomescope2 -v | sed 's/GenomeScope //' )
+    END_VERSIONS
+    """
+}
63 modules/genomescope2/meta.yml Normal file
@@ -0,0 +1,63 @@
+name: "genomescope2"
+description: Estimate genome heterozygosity, repeat content, and size from sequencing reads using a kmer-based statistical approach
+keywords:
+  - "genome size"
+  - "genome heterozygosity"
+  - "repeat content"
+tools:
+  - "genomescope2":
+      description: "Reference-free profiling of polyploid genomes"
+      homepage: "http://qb.cshl.edu/genomescope/genomescope2.0/"
+      documentation: "https://github.com/tbenavi1/genomescope2.0/blob/master/README.md"
+      tool_dev_url: "https://github.com/tbenavi1/genomescope2.0"
+      doi: "10.1038/s41467-020-14998-3"
+      licence: ["Apache License, Version 2.0 (Apache-2.0)"]
+
+input:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - histogram:
+      type: file
+      description: A K-mer histogram file
+      pattern: "*.hist"
+
+output:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - versions:
+      type: file
+      description: File containing software versions
+      pattern: "versions.yml"
+  - linear_plot_png:
+      type: file
+      description: A genomescope2 linear plot in PNG format
+      pattern: "*_linear_plot.png"
+  - transformed_linear_plot_png:
+      type: file
+      description: A genomescope2 transformed linear plot in PNG format
+      pattern: "*_transformed_linear_plot.png"
+  - log_plot_png:
+      type: file
+      description: A genomescope2 log plot in PNG format
+      pattern: "*_log_plot.png"
+  - transformed_log_plot_png:
+      type: file
+      description: A genomescope2 transformed log plot in PNG format
+      pattern: "*_transformed_log_plot.png"
+  - model:
+      type: file
+      description: Genomescope2 model fit summary
+      pattern: "*_model.txt"
+  - summary:
+      type: file
+      description: Genomescope2 histogram summary
+      pattern: "*_summary.txt"
+
+authors:
+  - "@mahesh-panchal"
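A hedged usage sketch for GENOMESCOPE2 (illustrative; the k-mer histogram would come from a counter such as meryl or jellyfish, and the k-mer length is expected through `task.ext.args`, e.g. `-k 21` — both assumptions about the calling pipeline, not part of the diff):

```nextflow
include { GENOMESCOPE2 } from './modules/genomescope2/main'

workflow {
    // [ meta, histogram ] pairs; 'reads.hist' is a placeholder path
    hist_ch = Channel.of([ [ id:'test', single_end:false ], file('reads.hist') ])
    GENOMESCOPE2(hist_ch)

    // per-sample model fit and genome-size summary
    GENOMESCOPE2.out.summary.view()
    GENOMESCOPE2.out.model.view()
}
```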
@@ -4,8 +4,8 @@ process GUNZIP {

     conda (params.enable_conda ? "conda-forge::sed=4.7" : null)
     container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
-        'https://containers.biocontainers.pro/s3/SingImgsRepo/biocontainers/v1.2.0_cv1/biocontainers_v1.2.0_cv1.img' :
-        'biocontainers/biocontainers:v1.2.0_cv1' }"
+        'https://depot.galaxyproject.org/singularity/ubuntu:20.04' :
+        'ubuntu:20.04' }"

     input:
     tuple val(meta), path(archive)

42 modules/happy/happy/main.nf Normal file
@@ -0,0 +1,42 @@
+def VERSION = '0.3.14'
+
+process HAPPY_HAPPY {
+    tag "$meta.id"
+    label 'process_medium'
+
+    conda (params.enable_conda ? "bioconda::hap.py=0.3.14" : null)
+    container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
+        'https://depot.galaxyproject.org/singularity/hap.py:0.3.14--py27h5c5a3ab_0':
+        'quay.io/biocontainers/hap.py:0.3.14--py27h5c5a3ab_0' }"
+
+    input:
+    tuple val(meta), path(truth_vcf), path(query_vcf), path(bed)
+    tuple path(fasta), path(fasta_fai)
+
+    output:
+    tuple val(meta), path('*.csv'), path('*.json') , emit: metrics
+    path "versions.yml"                            , emit: versions
+
+    when:
+    task.ext.when == null || task.ext.when
+
+    script:
+    def args = task.ext.args ?: ''
+    def prefix = task.ext.prefix ?: "${meta.id}"
+
+    """
+    hap.py \\
+        $truth_vcf \\
+        $query_vcf \\
+        $args \\
+        --reference $fasta \\
+        --threads $task.cpus \\
+        -R $bed \\
+        -o $prefix
+
+    cat <<-END_VERSIONS > versions.yml
+    "${task.process}":
+        hap.py: $VERSION
+    END_VERSIONS
+    """
+}
67 modules/happy/happy/meta.yml Normal file
@@ -0,0 +1,67 @@
+name: "happy_happy"
+description: Hap.py is a tool to compare diploid genotypes at haplotype level. Rather than comparing VCF records row by row, hap.py will generate and match alternate sequences in a superlocus. A superlocus is a small region of the genome (sized between 1 and around 1000 bp) that contains one or more variants.
+keywords:
+  - happy
+  - benchmark
+  - haplotype
+tools:
+  - "happy":
+      description: "Haplotype VCF comparison tools"
+      homepage: "https://www.illumina.com/products/by-type/informatics-products/basespace-sequence-hub/apps/hap-py-benchmarking.html"
+      documentation: "https://github.com/Illumina/hap.py"
+      tool_dev_url: "https://github.com/Illumina/hap.py"
+      doi: ""
+      licence: "['BSD-2-clause']"
+
+input:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - truth_vcf:
+      type: file
+      description: gold standard VCF file
+      pattern: "*.{vcf,vcf.gz}"
+  - query_vcf:
+      type: file
+      description: VCF/GVCF file to query
+      pattern: "*.{vcf,vcf.gz}"
+  - bed:
+      type: file
+      description: BED file
+      pattern: "*.bed"
+  - fasta:
+      type: file
+      description: FASTA file of the reference genome
+      pattern: "*.{fa,fasta}"
+  - fasta_fai:
+      type: file
+      description: The index of the reference FASTA
+      pattern: "*.fai"
+
+output:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - summary:
+      type: file
+      description: A CSV file containing the summary of the benchmarking
+      pattern: "*.summary.csv"
+  - extended:
+      type: file
+      description: A CSV file containing extended info of the benchmarking
+      pattern: "*.extended.csv"
+  - runinfo:
+      type: file
+      description: A JSON file containing the run info
+      pattern: "*.runinfo.json"
+  - versions:
+      type: file
+      description: File containing software versions
+      pattern: "versions.yml"
+
+authors:
+  - "@nvnieuwk"
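A hedged sketch of calling the new HAPPY_HAPPY module (paths are placeholders; the second input is a [ fasta, fai ] tuple, as declared above):

```nextflow
include { HAPPY_HAPPY } from './modules/happy/happy/main'

workflow {
    // [ meta, truth_vcf, query_vcf, confident_regions_bed ]
    comparisons_ch = Channel.of([
        [ id:'test' ],
        file('truth.vcf.gz'),
        file('query.vcf.gz'),
        file('confident.bed')
    ])
    reference = [ file('genome.fasta'), file('genome.fasta.fai') ]

    HAPPY_HAPPY(comparisons_ch, reference)
    HAPPY_HAPPY.out.metrics.view()   // [ meta, *.csv, *.json ]
}
```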
41 modules/happy/prepy/main.nf Normal file
@@ -0,0 +1,41 @@
+def VERSION = '0.3.14'
+
+process HAPPY_PREPY {
+    tag "$meta.id"
+    label 'process_medium'
+
+    conda (params.enable_conda ? "bioconda::hap.py=0.3.14" : null)
+    container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
+        'https://depot.galaxyproject.org/singularity/hap.py:0.3.14--py27h5c5a3ab_0':
+        'quay.io/biocontainers/hap.py:0.3.14--py27h5c5a3ab_0' }"
+
+    input:
+    tuple val(meta), path(vcf), path(bed)
+    tuple path(fasta), path(fasta_fai)
+
+    output:
+    tuple val(meta), path('*.vcf.gz'), emit: preprocessed_vcf
+    path "versions.yml"              , emit: versions
+
+    when:
+    task.ext.when == null || task.ext.when
+
+    script:
+    def args = task.ext.args ?: ''
+    def prefix = task.ext.prefix ?: "${meta.id}"
+
+    """
+    pre.py \\
+        $args \\
+        -R $bed \\
+        --reference $fasta \\
+        --threads $task.cpus \\
+        $vcf \\
+        ${prefix}.vcf.gz
+
+    cat <<-END_VERSIONS > versions.yml
+    "${task.process}":
+        pre.py: $VERSION
+    END_VERSIONS
+    """
+}
55 modules/happy/prepy/meta.yml Normal file
@@ -0,0 +1,55 @@
+name: "happy_prepy"
+description: Pre.py is a preprocessing tool that prepares VCF files for use with Hap.py
+keywords:
+  - happy
+  - benchmark
+  - haplotype
+tools:
+  - "happy":
+      description: "Haplotype VCF comparison tools"
+      homepage: "https://www.illumina.com/products/by-type/informatics-products/basespace-sequence-hub/apps/hap-py-benchmarking.html"
+      documentation: "https://github.com/Illumina/hap.py"
+      tool_dev_url: "https://github.com/Illumina/hap.py"
+      doi: ""
+      licence: "['BSD-2-clause']"
+
+input:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - vcf:
+      type: file
+      description: VCF file to preprocess
+      pattern: "*.{vcf,vcf.gz}"
+  - bed:
+      type: file
+      description: BED file
+      pattern: "*.bed"
+  - fasta:
+      type: file
+      description: FASTA file of the reference genome
+      pattern: "*.{fa,fasta}"
+  - fasta_fai:
+      type: file
+      description: The index of the reference FASTA
+      pattern: "*.fai"
+
+output:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+        e.g. [ id:'test', single_end:false ]
+  - vcf:
+      type: file
+      description: A preprocessed VCF file
+      pattern: "*.vcf.gz"
+  - versions:
+      type: file
+      description: File containing software versions
+      pattern: "versions.yml"
+
+authors:
+  - "@nvnieuwk"
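Since HAPPY_PREPY emits [ meta, preprocessed_vcf ], it can feed HAPPY_HAPPY directly once the truth VCF and BED are joined back in. A hedged sketch; the map-based reshaping and all file paths are assumptions about the calling workflow, not part of the diff:

```nextflow
include { HAPPY_PREPY } from './modules/happy/prepy/main'
include { HAPPY_HAPPY } from './modules/happy/happy/main'

workflow {
    reference = [ file('genome.fasta'), file('genome.fasta.fai') ]
    bed       = file('confident.bed')
    truth     = file('truth.vcf.gz')

    // normalise the query VCF first
    query_ch = Channel.of([ [ id:'test' ], file('query.vcf.gz'), bed ])
    HAPPY_PREPY(query_ch, reference)

    // rebuild the [ meta, truth, query, bed ] shape expected by HAPPY_HAPPY
    happy_in = HAPPY_PREPY.out.preprocessed_vcf.map { meta, vcf ->
        [ meta, truth, vcf, bed ]
    }
    HAPPY_HAPPY(happy_in, reference)
}
```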
45 modules/hmtnote/main.nf Normal file
@@ -0,0 +1,45 @@
+process HMTNOTE {
+    tag "$meta.id"
+    label 'process_low'
+
+    conda (params.enable_conda ? "bioconda::hmtnote=0.7.2" : null)
+    container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
+        'https://depot.galaxyproject.org/singularity/hmtnote:0.7.2--pyhdfd78af_0':
+        'quay.io/biocontainers/hmtnote:0.7.2--pyhdfd78af_0' }"
+
+    input:
+    tuple val(meta), path(vcf)
+
+    output:
+    tuple val(meta), path("*_annotated.vcf"), emit: vcf
+    path "versions.yml"                     , emit: versions
+
+    when:
+    task.ext.when == null || task.ext.when
+
+    script:
+    def args = task.ext.args ?: ''
+    def prefix = task.ext.prefix ?: "${meta.id}"
+
+    """
+    hmtnote \\
+        annotate \\
+        $vcf \\
+        ${prefix}_annotated.vcf \\
+        $args
+
+    cat <<-END_VERSIONS > versions.yml
+    "${task.process}":
+        hmtnote: \$(echo \$(hmtnote --version 2>&1) | sed 's/^.*hmtnote, version //; s/Using.*\$//')
+    END_VERSIONS
+    """
+    stub:
+    def prefix = task.ext.prefix ?: "${meta.id}"
+    """
+    touch ${prefix}_annotated.vcf
+    cat <<-END_VERSIONS > versions.yml
+    "${task.process}":
+        hmtnote: \$(echo \$(hmtnote --version 2>&1) | sed 's/^.*hmtnote, version //; s/Using.*\$//')
+    END_VERSIONS
+    """
+}
39 modules/hmtnote/meta.yml Normal file
@@ -0,0 +1,39 @@
+name: hmtnote
+description: Human mitochondrial variants annotation using HmtVar.
+keywords:
+  - hmtnote mitochondria annotation
+tools:
+  - hmtnote:
+      description: Human mitochondrial variants annotation using HmtVar.
+      homepage: https://github.com/robertopreste/HmtNote
+      documentation: https://hmtnote.readthedocs.io/en/latest/usage.html
+      tool_dev_url: None
+      doi: "https://doi.org/10.1101/600619"
+      licence: ["MIT"]
+
+input:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+  - vcf:
+      type: file
+      description: vcf file
+      pattern: "*.vcf"
+
+output:
+  - meta:
+      type: map
+      description: |
+        Groovy Map containing sample information
+  - versions:
+      type: file
+      description: File containing software versions
+      pattern: "versions.yml"
+  - vcf:
+      type: file
+      description: annotated vcf
+      pattern: "*_annotated.vcf"
+
+authors:
+  - "@sysbiocoder"
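A hedged sketch of the new hmtnote module in a workflow (the input path is a placeholder; note that `hmtnote annotate` fetches annotations from HmtVar over the network unless an offline mode is configured through `task.ext.args` — an assumption about deployment, not something this diff sets up):

```nextflow
include { HMTNOTE } from './modules/hmtnote/main'

workflow {
    // [ meta, vcf ] pairs of mitochondrial variant calls
    vcf_ch = Channel.of([ [ id:'test' ], file('mito_variants.vcf') ])
    HMTNOTE(vcf_ch)

    // [ meta, *_annotated.vcf ]
    HMTNOTE.out.vcf.view()
}
```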
Some files were not shown because too many files have changed in this diff.