Overview
The nf-test framework in the nf-core ecosystem enables comprehensive testing for processes, workflows, and pipelines.
This chapter covers the fundamentals of nf-core module testing, from basic syntax to advanced scenarios involving chained modules.
Basic test syntax
The basic syntax for a process test follows this structure:
nextflow_process {

    name "<NAME>"
    script "<PATH/TO/NEXTFLOW_SCRIPT.nf>"
    process "<PROCESS_NAME>"

    test("<TEST_NAME>") {
        // Test implementation
    }
}
Key points:
- Script paths starting with ./ or ../ are relative to the test script’s location (see the example below).
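For instance, with a hypothetical module layout, a test file at modules/nf-core/<tool>/tests/main.nf.test that declares

script "../main.nf"

resolves the path relative to the tests/ directory, so it points to modules/nf-core/<tool>/main.nf.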
Essential assertions
Tests use assertions in the then block to verify the expected output of the process.
To report multiple failures in a single test run, group your assertions within an assertAll block.
Channels that lack explicit names can be addressed using square brackets and the corresponding index, for example process.out[0] for the first channel. This indexing provides a straightforward way to interact with channels that have no predefined names.
// Process status
assert process.success
assert process.exitStatus == 0
// Output channels
assert process.out.my_channel != null
assert process.out.my_channel.size() == 3
assert process.out.my_channel.get(0) == "expected_value"
// For unnamed channels, use index notation
assert process.out[0] != null
assert process.out[0].size() == 3
// Group assertions to see all failures at once
assertAll(
    { assert process.success },
    { assert snapshot(process.out).match() }
)
Module testing principles
- Each module should contain a tests/ folder alongside its main.nf file.
- Test files come with snapshots of component output channels (a stored representation of the correct channel contents); see the sketch after this list.
- Tests verify both functionality and expected outputs.
- Tests support both regular and stub (-stub) testing modes.
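When a test that calls snapshot(...).match() first passes, nf-test writes a main.nf.test.snap file next to the test. As a rough sketch of the format (the test name, channel contents, and checksums below are purely illustrative placeholders; the meta block appears in recent nf-test versions):

{
    "my test name": {
        "content": [
            {
                "0": [
                    [
                        { "id": "test", "single_end": false },
                        "output.bed:md5,<checksum>"
                    ]
                ],
                "versions": [ "versions.yml:md5,<checksum>" ]
            }
        ],
        "meta": {
            "nf-test": "0.9.0",
            "nextflow": "24.04.4"
        },
        "timestamp": "2024-01-01T00:00:00+0000"
    }
}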
Testing an existing module
Let’s examine testing the bedtools/bamtobed module, which is a simple standalone module:
cd path/to/modules
# Run all tests for the module
nf-core modules test bedtools/bamtobed --profile docker
This will run all tests for the module and display the results, including any failures or snapshot mismatches.
,--./,-.
___ __ __ __ ___ /,-._.--~\
|\ | |__ __ / ` / \ |__) |__ } {
| \| | \__, \__/ | \ |___ \`-._,-`-,
`._,._,'
nf-core/tools version 3.3.2 - https://nf-co.re
INFO Generating nf-test snapshot
╭──────────────────────────────────────────── nf-test output ────────────────────────────────────────────╮
│ │
│ 🚀 nf-test 0.9.0 │
│ https://www.nf-test.com │
│ (c) 2021 - 2024 Lukas Forer and Sebastian Schoenherr │
│ │
│ Load .nf-test/plugins/nft-bam/0.5.0/nft-bam-0.5.0.jar │
│ Load .nf-test/plugins/nft-compress/0.1.0/nft-compress-0.1.0.jar │
│ Load .nf-test/plugins/nft-vcf/1.0.7/nft-vcf-1.0.7.jar │
│ Load .nf-test/plugins/nft-csv/0.1.0/nft-csv-0.1.0.jar │
│ Load .nf-test/plugins/nft-utils/0.0.3/nft-utils-0.0.3.jar │
│ Load .nf-test/plugins/nft-fastq/0.0.1/nft-fastq-0.0.1.jar │
│ Load .nf-test/plugins/nft-anndata/0.1.0/nft-anndata-0.1.0.jar │
│ │
│ Test Process BEDTOOLS_BAMTOBED │
│ │
│ Test [824188d1] 'sarscov2 - bam' PASSED (3.335s) │
│ Test [f4f6429b] 'stub' PASSED (3.154s) │
│ │
│ │
│ SUCCESS: Executed 2 tests in 6.498s │
│ │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────╯
INFO Generating nf-test snapshot again to check stability
╭──────────────────────────────────────────── nf-test output ────────────────────────────────────────────╮
│ │
│ 🚀 nf-test 0.9.0 │
│ https://www.nf-test.com │
│ (c) 2021 - 2024 Lukas Forer and Sebastian Schoenherr │
│ │
│ Load .nf-test/plugins/nft-bam/0.5.0/nft-bam-0.5.0.jar │
│ Load .nf-test/plugins/nft-compress/0.1.0/nft-compress-0.1.0.jar │
│ Load .nf-test/plugins/nft-vcf/1.0.7/nft-vcf-1.0.7.jar │
│ Load .nf-test/plugins/nft-csv/0.1.0/nft-csv-0.1.0.jar │
│ Load .nf-test/plugins/nft-utils/0.0.3/nft-utils-0.0.3.jar │
│ Load .nf-test/plugins/nft-fastq/0.0.1/nft-fastq-0.0.1.jar │
│ Load .nf-test/plugins/nft-anndata/0.1.0/nft-anndata-0.1.0.jar │
│ │
│ Test Process BEDTOOLS_BAMTOBED │
│ │
│ Test [824188d1] 'sarscov2 - bam' PASSED (3.277s) │
│ Test [f4f6429b] 'stub' PASSED (3.161s) │
│ │
│ │
│ SUCCESS: Executed 2 tests in 6.446s │
│ │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────╯
INFO All tests passed!
Creating a new module with tests
When creating a new module using nf-core/tools, a test file is automatically generated based on the template.
# Create a new module using nf-core/tools
cd path/to/modules
nf-core modules create seqtk/sample
This creates the following module structure:
modules/nf-core/seqtk/sample/
├── main.nf
├── meta.yml
└── tests/
├── main.nf.test
└── tags.yml
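The tags.yml file maps the module’s test tag to the paths whose changes should trigger its tests. Based on the standard nf-core template, it typically contains something like:

seqtk/sample:
  - "modules/nf-core/seqtk/sample/**"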
The generated test file (tests/main.nf.test), with positional input channels input[0] and input[1] provided, will look like this:
nextflow_process {

    name "Test Process SEQTK_SAMPLE"
    script "../main.nf"
    process "SEQTK_SAMPLE"

    tag "modules"
    tag "modules_nfcore"
    tag "seqtk"
    tag "seqtk/sample"

    test("sarscov2 - fastq") {

        when {
            process {
                """
                input[0] = [
                    [ id:'test', single_end:false ], // meta map
                    file(params.modules_testdata_base_path + 'genomics/sarscov2/illumina/fastq/test_1.fastq.gz', checkIfExists: true)
                ]
                input[1] = 10 // Number of reads to sample
                """
            }
        }

        then {
            assertAll(
                { assert process.success },
                { assert snapshot(process.out).match() }
            )
        }
    }

    test("sarscov2 - fastq - stub") {

        options "-stub"

        when {
            process {
                """
                input[0] = [
                    [ id:'test', single_end:false ], // meta map
                    file(params.modules_testdata_base_path + 'genomics/sarscov2/illumina/fastq/test_1.fastq.gz', checkIfExists: true)
                ]
                input[1] = 10
                """
            }
        }

        then {
            assertAll(
                { assert process.success },
                { assert snapshot(process.out).match() }
            )
        }
    }
}
After providing the appropriate test data, run the tests to create a snapshot of the output:
nf-core modules test seqtk/sample --profile docker
,--./,-.
___ __ __ __ ___ /,-._.--~\
|\ | |__ __ / ` / \ |__) |__ } {
| \| | \__, \__/ | \ |___ \`-._,-`-,
`._,._,'
nf-core/tools version 3.3.2 - https://nf-co.re
INFO Generating nf-test snapshot
╭──────────────────────────────────────────── nf-test output ────────────────────────────────────────────╮
│ │
│ 🚀 nf-test 0.9.0 │
│ https://www.nf-test.com │
│ (c) 2021 - 2024 Lukas Forer and Sebastian Schoenherr │
│ │
│ Load .nf-test/plugins/nft-bam/0.5.0/nft-bam-0.5.0.jar │
│ Load .nf-test/plugins/nft-compress/0.1.0/nft-compress-0.1.0.jar │
│ Load .nf-test/plugins/nft-vcf/1.0.7/nft-vcf-1.0.7.jar │
│ Load .nf-test/plugins/nft-csv/0.1.0/nft-csv-0.1.0.jar │
│ Load .nf-test/plugins/nft-utils/0.0.3/nft-utils-0.0.3.jar │
│ Load .nf-test/plugins/nft-fastq/0.0.1/nft-fastq-0.0.1.jar │
│ Load .nf-test/plugins/nft-anndata/0.1.0/nft-anndata-0.1.0.jar │
│ │
│ Test Process SEQTK_SAMPLE │
│ │
│ Test [a4edc395] 'sarscov2_sample_singleend_fqgz' PASSED (14.223s) │
│ Test [42c1ef08] 'sarscov2_sample_pairedend_fqgz' PASSED (3.353s) │
│ Test [7b327705] 'sarscov2_sample_singlend_fqgz_stub' PASSED (3.042s) │
│ Test [f1c40b60] 'sarscov2_sample_singleend_frac' PASSED (3.326s) │
│ │
│ Test Workflow FASTQ_CONTAM_SEQTK_KRAKEN │
│ │
│ Test [609e9434] 'sarscov2 - fastq - 25000 - krakendb' PASSED (31.378s) │
│ Test [b5fb1cc0] 'sarscov2 - fastq - [12500, 25000] - krakendb' PASSED (7.803s) │
│ Test [aac34908] 'sarscov2 - fastq - [12500, 25000] - krakendb -- stub' PASSED (7.262s) │
│ │
│ │
│ SUCCESS: Executed 7 tests in 70.391s │
│ │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────╯
INFO Generating nf-test snapshot again to check stability
╭──────────────────────────────────────────── nf-test output ────────────────────────────────────────────╮
│ │
│ 🚀 nf-test 0.9.0 │
│ https://www.nf-test.com │
│ (c) 2021 - 2024 Lukas Forer and Sebastian Schoenherr │
│ │
│ Load .nf-test/plugins/nft-bam/0.5.0/nft-bam-0.5.0.jar │
│ Load .nf-test/plugins/nft-compress/0.1.0/nft-compress-0.1.0.jar │
│ Load .nf-test/plugins/nft-vcf/1.0.7/nft-vcf-1.0.7.jar │
│ Load .nf-test/plugins/nft-csv/0.1.0/nft-csv-0.1.0.jar │
│ Load .nf-test/plugins/nft-utils/0.0.3/nft-utils-0.0.3.jar │
│ Load .nf-test/plugins/nft-fastq/0.0.1/nft-fastq-0.0.1.jar │
│ Load .nf-test/plugins/nft-anndata/0.1.0/nft-anndata-0.1.0.jar │
│ │
│ Test Process SEQTK_SAMPLE │
│ │
│ Test [a4edc395] 'sarscov2_sample_singleend_fqgz' PASSED (3.611s) │
│ Test [42c1ef08] 'sarscov2_sample_pairedend_fqgz' PASSED (3.321s) │
│ Test [7b327705] 'sarscov2_sample_singlend_fqgz_stub' PASSED (3.098s) │
│ Test [f1c40b60] 'sarscov2_sample_singleend_frac' PASSED (3.263s) │
│ │
│ Test Workflow FASTQ_CONTAM_SEQTK_KRAKEN │
│ │
│ Test [609e9434] 'sarscov2 - fastq - 25000 - krakendb' PASSED (5.934s) │
│ Test [b5fb1cc0] 'sarscov2 - fastq - [12500, 25000] - krakendb' PASSED (6.564s) │
│ Test [aac34908] 'sarscov2 - fastq - [12500, 25000] - krakendb -- stub' PASSED (7.03s) │
│ │
│ │
│ SUCCESS: Executed 7 tests in 32.853s │
│ │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────╯
INFO All tests passed!
Testing parameter variations
There are two primary ways to alter parameters for a module test: creating a test-specific configuration file or using a params block within the test itself.
Creating a parameter-specific configuration
For modules requiring additional parameters, you can create a <filename>.config file in the tests/ directory:
# Create config file for parameter testing
touch modules/nf-core/seqtk/sample/tests/nextflow.config
Add the configuration:
process {
    withName: 'SEQTK_SAMPLE' {
        ext.args = params.module_args
    }
}
Then apply the config in your test as shown below:
nextflow_process {

    name "Test Process SEQTK_SAMPLE"
    script "../main.nf"
    process "SEQTK_SAMPLE"
    config "./nextflow.config"

    test("pass module_args to the module") {

        when {
            params {
                module_args = "--help"
            }
            process {
                """
                input[0] = [
                    [ id:'test', single_end:false ], // meta map
                    file(params.modules_testdata_base_path + 'genomics/sarscov2/illumina/fastq/test_1.fastq.gz', checkIfExists: true)
                ]
                input[1] = 10
                """
            }
        }

        then {
            // ...
        }
    }
}
Overriding parameters with the params block
In addition to a nextflow.config file, nf-test provides a params block within the when block to override Nextflow’s input params for a specific test.
when {
    params {
        outdir = "output"
    }
    process {
        """
        input[0] = [
            [ id:'test', single_end:false ], // meta map
            file(params.modules_testdata_base_path + 'genomics/sarscov2/illumina/fastq/test_1.fastq.gz', checkIfExists: true)
        ]
        input[1] = 10
        """
    }
}
Testing chained modules
For modules that depend on the output of other modules, use the setup method to define the dependency.
The setup method allows you to specify processes or workflows that must be executed before the primary when block. It prepares the required input data or performs any essential steps ahead of the main processing block.
Within the setup block, you can use the run method to define and execute dependent processes or workflows.
Syntax
The run method syntax for a process is as follows:
run("ProcessName") {
script "path/to/process/script.nf"
process {
// Define the process inputs here
}
}
The run method syntax for a workflow is as follows:
run("WorkflowName") {
script "path/to/workflow/script.nf"
workflow {
// Define the workflow inputs here
}
}
Keep in mind that changes to the processes or workflows executed in the setup method can cause the test run to fail.
Local setup method
A setup method can be defined within a test case to execute a dependent process. This process generates the input data required by the primary process. The setup block specifies the execution of the dependency, and the when block defines the processing logic for the module under test.
Here’s an example for abricate/summary, which requires output from abricate/run:
nextflow_process {

    name "Test Process ABRICATE_SUMMARY"
    script "../main.nf"
    process "ABRICATE_SUMMARY"

    tag "modules"
    tag "modules_nfcore"
    tag "abricate"
    tag "abricate/summary"

    test("bacteroides_fragilis - genome_fna_gz") {

        setup {
            run("ABRICATE_RUN") {
                script "../../run/main.nf"
                process {
                    """
                    input[0] = Channel.fromList([
                        tuple([ id:'test1', single_end:false ], // meta map
                            file(params.modules_testdata_base_path + 'genomics/prokaryotes/bacteroides_fragilis/genome/genome.fna.gz', checkIfExists: true)),
                        tuple([ id:'test2', single_end:false ],
                            file(params.modules_testdata_base_path + 'genomics/prokaryotes/haemophilus_influenzae/genome/genome.fna.gz', checkIfExists: true))
                    ])
                    """
                }
            }
        }

        when {
            process {
                """
                // Collect reports from ABRICATE_RUN, create a new meta map, and provide it as input
                input[0] = ABRICATE_RUN.out.report.collect{ meta, report -> report }.map{ report -> [[ id: 'test_summary'], report]}
                """
            }
        }

        then {
            assertAll(
                { assert process.success },
                { assert snapshot(process.out).match() }
            )
        }
    }
}
Run the tests:
nf-core modules test abricate/summary --profile docker
This will execute the test with the chained module setup, running ABRICATE_RUN first to generate the required input and then testing ABRICATE_SUMMARY with that output.
,--./,-.
___ __ __ __ ___ /,-._.--~\
|\ | |__ __ / ` / \ |__) |__ } {
| \| | \__, \__/ | \ |___ \`-._,-`-,
`._,._,'
nf-core/tools version 3.3.2 - https://nf-co.re
INFO Generating nf-test snapshot
╭──────────────────────────────────────────── nf-test output ────────────────────────────────────────────╮
│ │
│ 🚀 nf-test 0.9.0 │
│ https://www.nf-test.com │
│ (c) 2021 - 2024 Lukas Forer and Sebastian Schoenherr │
│ │
│ Load .nf-test/plugins/nft-bam/0.5.0/nft-bam-0.5.0.jar │
│ Load .nf-test/plugins/nft-compress/0.1.0/nft-compress-0.1.0.jar │
│ Load .nf-test/plugins/nft-vcf/1.0.7/nft-vcf-1.0.7.jar │
│ Load .nf-test/plugins/nft-csv/0.1.0/nft-csv-0.1.0.jar │
│ Load .nf-test/plugins/nft-utils/0.0.3/nft-utils-0.0.3.jar │
│ Load .nf-test/plugins/nft-fastq/0.0.1/nft-fastq-0.0.1.jar │
│ Load .nf-test/plugins/nft-anndata/0.1.0/nft-anndata-0.1.0.jar │
│ │
│ Test Process ABRICATE_SUMMARY │
│ │
│ Test [fc133477] 'Should run without failures' PASSED (102.456s) │
│ │
│ │
│ SUCCESS: Executed 1 tests in 102.459s │
│ │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────╯
INFO Generating nf-test snapshot again to check stability
╭──────────────────────────────────────────── nf-test output ────────────────────────────────────────────╮
│ │
│ 🚀 nf-test 0.9.0 │
│ https://www.nf-test.com │
│ (c) 2021 - 2024 Lukas Forer and Sebastian Schoenherr │
│ │
│ Load .nf-test/plugins/nft-bam/0.5.0/nft-bam-0.5.0.jar │
│ Load .nf-test/plugins/nft-compress/0.1.0/nft-compress-0.1.0.jar │
│ Load .nf-test/plugins/nft-vcf/1.0.7/nft-vcf-1.0.7.jar │
│ Load .nf-test/plugins/nft-csv/0.1.0/nft-csv-0.1.0.jar │
│ Load .nf-test/plugins/nft-utils/0.0.3/nft-utils-0.0.3.jar │
│ Load .nf-test/plugins/nft-fastq/0.0.1/nft-fastq-0.0.1.jar │
│ Load .nf-test/plugins/nft-anndata/0.1.0/nft-anndata-0.1.0.jar │
│ │
│ Test Process ABRICATE_SUMMARY │
│ │
│ Test [fc133477] 'Should run without failures' PASSED (11.658s) │
│ │
│ │
│ SUCCESS: Executed 1 tests in 11.667s │
│ │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────╯
INFO All tests passed!
Global setup method
A global setup method can be defined for all tests within a nextflow_process definition. The setup is applied to multiple test cases, ensuring a consistent setup for each test. This approach is useful when multiple tests share the same setup requirements.
nextflow_process {

    name "Test Process ABRICATE_SUMMARY"
    script "../main.nf"
    process "ABRICATE_SUMMARY"
    config "./nextflow.config"

    setup {
        run("ABRICATE_RUN") {
            script "../../run/main.nf"
            process {
                """
                input[0] = Channel.fromList([
                    tuple([ id:'test1', single_end:false ], // meta map
                        file(params.modules_testdata_base_path + 'genomics/prokaryotes/bacteroides_fragilis/genome/genome.fna.gz', checkIfExists: true)),
                    tuple([ id:'test2', single_end:false ],
                        file(params.modules_testdata_base_path + 'genomics/prokaryotes/haemophilus_influenzae/genome/genome.fna.gz', checkIfExists: true))
                ])
                """
            }
        }
    }

    test("first test") {
        when {
            process {
                """
                input[0] = ABRICATE_RUN.out.report.collect{ meta, report -> report }.map{ report -> [[ id: 'test_summary'], report]}
                """
            }
        }
        then {
            assert process.success
            assert snapshot(process.out).match()
        }
    }

    test("second test") {
        when {
            process {
                """
                input[0] = ABRICATE_RUN.out.report.collect{ meta, report -> report }.map{ report -> [[ id: 'test_summary'], report]}
                """
            }
        }
        then {
            assert process.success
            assert snapshot(process.out).match()
        }
    }
}
Aliasing dependencies
If you need to run the same process multiple times, you can set an alias for the process:
nextflow_process {

    // ...

    setup {
        run("UNTAR", alias: "UNTAR1") {
            script "modules/nf-core/untar/main.nf"
            process {
                """
                input[0] = Channel.fromList(...)
                """
            }
        }
        run("UNTAR", alias: "UNTAR2") {
            script "modules/nf-core/untar/main.nf"
            process {
                """
                input[0] = Channel.fromList(...)
                """
            }
        }
        run("UNTAR", alias: "UNTAR3") {
            script "modules/nf-core/untar/main.nf"
            process {
                """
                input[0] = Channel.fromList(...)
                """
            }
        }
    }

    test("Test with three different inputs") {
        when {
            process {
                """
                input[0] = UNTAR1.out.untar.map{ it[1] }
                input[1] = UNTAR2.out.untar.map{ it[1] }
                input[2] = UNTAR3.out.untar.map{ it[1] }
                """
            }
        }
        then {
            // ...
        }
    }
}
Updating module snapshots
When module outputs change (e.g., due to version bumps), you need to update snapshots:
nf-core modules test abricate/summary --profile docker --update
You will see the following warning at the start of the test run:
│ 🚀 nf-test 0.9.0 │
│ https://www.nf-test.com │
│ (c) 2021 - 2024 Lukas Forer and Sebastian Schoenherr │
│ │
│ Load .nf-test/plugins/nft-bam/0.5.0/nft-bam-0.5.0.jar │
│ Load .nf-test/plugins/nft-compress/0.1.0/nft-compress-0.1.0.jar │
│ Load .nf-test/plugins/nft-vcf/1.0.7/nft-vcf-1.0.7.jar │
│ Load .nf-test/plugins/nft-csv/0.1.0/nft-csv-0.1.0.jar │
│ Load .nf-test/plugins/nft-utils/0.0.3/nft-utils-0.0.3.jar │
│ Load .nf-test/plugins/nft-fastq/0.0.1/nft-fastq-0.0.1.jar │
│ Load .nf-test/plugins/nft-anndata/0.1.0/nft-anndata-0.1.0.jar │
│ Warning: every snapshot that fails during this test run is re-record.
Once the test passes, the snapshot will be updated and the test re-run to verify the snapshot is stable.
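If you work with nf-test directly rather than through the nf-core/tools wrapper, the same behaviour comes from nf-test's own snapshot-update flag; a roughly equivalent invocation (module path shown as an example) would be:

nf-test test modules/nf-core/abricate/summary/tests/main.nf.test --profile docker --update-snapshot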
Testing with nf-test
You can also run the tests with nf-test directly using the --tag option:
nf-test test --profile docker --tag abricate/summary
# or specify test path
nf-test test --profile docker modules/nf-core/abricate/summary/tests/main.nf.test
This will run the tests for the module and display the results, including any failures or snapshot mismatches.
🚀 nf-test 0.9.0
https://www.nf-test.com
(c) 2021 - 2024 Lukas Forer and Sebastian Schoenherr
Load .nf-test/plugins/nft-bam/0.5.0/nft-bam-0.5.0.jar
Load .nf-test/plugins/nft-compress/0.1.0/nft-compress-0.1.0.jar
Load .nf-test/plugins/nft-vcf/1.0.7/nft-vcf-1.0.7.jar
Load .nf-test/plugins/nft-csv/0.1.0/nft-csv-0.1.0.jar
Load .nf-test/plugins/nft-utils/0.0.3/nft-utils-0.0.3.jar
Load .nf-test/plugins/nft-fastq/0.0.1/nft-fastq-0.0.1.jar
Load .nf-test/plugins/nft-anndata/0.1.0/nft-anndata-0.1.0.jar
Test Process ABRICATE_SUMMARY
Test [fc133477] 'Should run without failures' PASSED (15.286s)
SUCCESS: Executed 1 tests in 15.294s
The nf-test test command runs the tests only once, whereas the nf-core modules test command runs them twice to confirm snapshot stability.
For more nf-test assertion patterns, see the nf-test assertions examples documentation.
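As a starting point, here are a few patterns that often appear in module tests. This is only a sketch: snapshot(...).match("name"), path(...).md5, and path(...).readLines() are standard nf-test features, but the channel names (versions, bed) and the expected checksum are illustrative placeholders, not taken from a specific module.

then {
    assertAll(
        { assert process.success },
        // Snapshot a single channel and give the snapshot its own name
        { assert snapshot(process.out.versions).match("versions") },
        // Compare a checksum instead of snapshotting the whole file
        { assert path(process.out.bed[0][1]).md5 == "<expected-md5>" },
        // Inspect the file contents directly
        { assert path(process.out.bed[0][1]).readLines().size() > 0 }
    )
}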
Next steps
Continue to Testing Subworkflows to learn about testing more complex multi-module components.