Nextflow workflow report

[angry_hopper] (resumed run)

Workflow execution completed unsuccessfully!

The exit status of the task that caused the workflow execution to fail was: 1.

The full error message was:

Error executing process > 'NFCORE_DIFFERENTIALABUNDANCE:DIFFERENTIALABUNDANCE:VALIDATOR (study1_samplesheet.csv)'

Caused by:
  Process `NFCORE_DIFFERENTIALABUNDANCE:DIFFERENTIALABUNDANCE:VALIDATOR (study1_samplesheet.csv)` terminated with an error exit status (1)


Command executed:

  validate_fom_components.R \
      --sample_metadata "study1_samplesheet.csv" \
      --feature_metadata 'GCF_002022765.anno.tsv' \
      --assay_files "salmon.merged.gene_counts_length_scaled_dataset1.tsv" \
      --contrasts_file "study1_contrasts.csv" \
      --output_directory "dataset_1" \
      --sample_id_col 'sample' --feature_id_col 'gene_id'
  
  cat <<-END_VERSIONS > versions.yml
  "NFCORE_DIFFERENTIALABUNDANCE:DIFFERENTIALABUNDANCE:VALIDATOR":
      r-base: $(echo $(R --version 2>&1) | sed 's/^.*R version //; s/ .*$//')
      r-shinyngs: $(Rscript -e "library(shinyngs); cat(as.character(packageVersion('shinyngs')))")
  END_VERSIONS

Command exit status:
  1

Command output:
  [1] "Reading sample sheet at study1_samplesheet.csv with ID col sample"
  [1] "Reading feature metadata at GCF_002022765.anno.tsv with ID col gene_id"
  [1] "Reading assay matrix salmon.merged.gene_counts_length_scaled_dataset1.tsv and validating against samples and features (if supplied)"
  [1] "...  salmon.merged.gene_counts_length_scaled_dataset1.tsv matrix good"
  [1] "Reading contrast definitions and validating against sample sheet"

Command error:
      Position, rank, rbind, Reduce, rownames, sapply, setdiff, sort,
      table, tapply, union, unique, unsplit, which.max, which.min
  
  Loading required package: S4Vectors
  
  Attaching package: ‘S4Vectors’
  
  The following object is masked from ‘package:utils’:
  
      findMatches
  
  The following objects are masked from ‘package:base’:
  
      expand.grid, I, unname
  
  Loading required package: IRanges
  Loading required package: GenomeInfoDb
  Loading required package: Biobase
  Welcome to Bioconductor
  
      Vignettes contain introductory material; view with
      'browseVignettes()'. To cite Bioconductor, see
      'citation("Biobase")', and for packages 'citation("pkgname")'.
  
  
  Attaching package: ‘Biobase’
  
  The following object is masked from ‘package:MatrixGenerics’:
  
      rowMedians
  
  The following objects are masked from ‘package:matrixStats’:
  
      anyMissing, rowMedians
  
  
  Attaching package: ‘shinyngs’
  
  The following object is masked from ‘package:MatrixGenerics’:
  
      colMedians
  
  The following object is masked from ‘package:matrixStats’:
  
      colMedians
  
  Error in checkListIsSubset(val, samples[[var]], "contrast levels", "sample metadata variable") : 
    Not all contrast levels (ABC_VIMS_Family_2017190) are available in the sample metadata variable (ABC_VIMS_Family_2017084,ABC_VIMS_Family_2017089,ABC_VIMS_Family_2017120,ABC_VIMS_Family_2017084,ABC_VIMS_Family_2017084,ABC_VIMS_Family_2017089,ABC_VIMS_Family_2017089,ABC_VIMS_Family_2017090,ABC_VIMS_Family_2017090,ABC_VIMS_Family_2017120,ABC_VIMS_Family_2017089,ABC_VIMS_Family_2017084,ABC_VIMS_Family_2017089,ABC_VIMS_Family_2017090,ABC_VIMS_Family_2017090,ABC_VIMS_Family_2017120,ABC_VIMS_Family_2017084,ABC_VIMS_Family_2017084,ABC_VIMS_Family_2017089,ABC_VIMS_Family_2017090,ABC_VIMS_Family_2017120,ABC_VIMS_Family_2017120,ABC_VIMS_Family_2017084,ABC_VIMS_Family_2017089,ABC_VIMS_Family_2017089,ABC_VIMS_Family_2017090,ABC_VIMS_Family_2017084,ABC_VIMS_Family_2017090,ABC_VIMS_Family_2017120,ABC_VIMS_Family_2017120,ABC_VIMS_Family_2017084,ABC_VIMS_Family_2017084,ABC_VIMS_Family_2017089,ABC_VIMS_Family_2017090,ABC_VIMS_Family_2017090,ABC_VIMS_Family_2017120,ABC_VIMS_Family_2017089,ABC_VI
  Calls: validate_inputs -> read_contrasts -> checkListIsSubset
  Execution halted

Work dir:
  /mmfs1/gscratch/scrubbed/strigg/analyses/20250701_diffabund/work/4f/79875b1c9c90c9d6a02085756a56f8

Container:
  /gscratch/scrubbed/srlab/.apptainer/depot.galaxyproject.org-singularity-r-shinyngs-1.8.8--r43hdfd78af_0.img

Tip: view the complete command output by changing to the process work dir and entering the command `cat .command.out`
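
The failure above is a metadata mismatch rather than a pipeline bug: the contrasts file names a level (ABC_VIMS_Family_2017190) that never occurs in the corresponding samplesheet column. The mismatch can be reproduced outside the pipeline with a minimal R sketch, assuming study1_contrasts.csv follows the nf-core contrasts layout with id, variable, reference and target columns (an assumption; adjust to the actual file):

  # Sketch: report contrast levels missing from the samplesheet.
  # Assumes contrasts columns id, variable, reference, target (nf-core layout).
  samples   <- read.csv("study1_samplesheet.csv", stringsAsFactors = FALSE)
  contrasts <- read.csv("study1_contrasts.csv", stringsAsFactors = FALSE)

  for (i in seq_len(nrow(contrasts))) {
    var     <- contrasts$variable[i]
    wanted  <- c(contrasts$reference[i], contrasts$target[i])
    missing <- setdiff(wanted, samples[[var]])
    if (length(missing) > 0) {
      message(sprintf("contrast '%s': level(s) %s absent from samplesheet column '%s'",
                      contrasts$id[i], paste(missing, collapse = ", "), var))
    }
  }

Any level reported as absent must be corrected in study1_contrasts.csv (or added to the samplesheet) before resuming the run.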
Run times:
  01-Jul-2025 10:23:09 - 01-Jul-2025 10:41:39 (duration: 18m 30s)
  1 succeeded, 0 cached, 0 ignored, 3 failed

Nextflow command:
  nextflow run nf-core/differentialabundance -c /gscratch/srlab/strigg/bin/uw_hyak_srlab.config -resume --study_name dataset_1 --input study1_samplesheet.csv --contrasts study1_contrasts.csv --outdir /gscratch/scrubbed/strigg/analyses/20250701_diffabund --matrix /gscratch/scrubbed/strigg/analyses/20250701_diffabund/salmon.merged.gene_counts_length_scaled_dataset1.tsv --feature_type gene --features_gtf_feature_type gene --gtf /gscratch/srlab/strigg/GENOMES/GCF_002022765.2_C_virginica-3.0_genomic_noEmptyGeneIDs.gtf
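
For reference, the --contrasts file passed above is a CSV defining one comparison per row, with id, variable, reference and target columns (the layout described in the nf-core/differentialabundance docs): variable names a samplesheet column, and reference/target must be values that occur in that column. A minimal R sketch writing such a file, with hypothetical placeholder values:

  # Hypothetical example of the contrasts layout; the column name 'family'
  # and the level IDs below are placeholders, not values from this run.
  contrasts <- data.frame(
    id        = "family_2017084_vs_2017089",
    variable  = "family",
    reference = "ABC_VIMS_Family_2017084",
    target    = "ABC_VIMS_Family_2017089"
  )
  write.csv(contrasts, "study1_contrasts.csv", row.names = FALSE, quote = FALSE)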
CPU-Hours:
  (a few seconds)

Launch directory:
  /mmfs1/gscratch/scrubbed/strigg/analyses/20250701_diffabund

Work directory:
  /mmfs1/gscratch/scrubbed/strigg/analyses/20250701_diffabund/work

Project directory:
  /mmfs1/home/strigg/.nextflow/assets/nf-core/differentialabundance

Script name:
  main.nf

Script ID:
  245bb78a7f289adc8d5fe330165781c1

Workflow session:
  b3f06869-0d12-4ae2-8a82-f397ea4aea3b

Workflow repository:
  https://github.com/nf-core/differentialabundance, revision master (commit hash 3dd360fed0dca1780db1bdf5dce85e5258fa2253)

Workflow profile:
  standard

Workflow container:
  [RMARKDOWNNOTEBOOK:biocontainers/r-shinyngs:1.8.8--r43hdfd78af_0]

Container engine:
  singularity

Nextflow version:
  version 24.10.6, build 5937 (23-04-2025 16:53 UTC)

Resource Usage

(Resource usage plots omitted: CPU, Memory, Job Duration and I/O. They give an overview of the distribution of resource usage for each process.)

Tasks

This table shows information about each task in the workflow.

(tasks table omitted because the dataset is too big)