RecursionError: maximum recursion depth exceeded #221

@sarah872

Description

I am getting a weird error after the assembly step:

Starting Unicycler (2019-12-19 13:20:37)
    Welcome to Unicycler, an assembly pipeline for bacterial genomes. Since you
provided both short and long reads, Unicycler will perform a hybrid assembly.
It will first use SPAdes to make a short-read assembly graph, and then it will
use various methods to scaffold that graph with the long reads.
    For more information, please see https://github.com/rrwick/Unicycler

Command: /proj/04_programs/Unicycler/Unicycler/unicycler-runner.py -l ../02_trimmed+corrected-reads/fastq_runid_1d2f069b4156b0bfdc46e8cd5825a3419de1a39d_ALL.fq.gz.qcat-len100.fq.gz.canu-corr.fasta.gz -1 /tmp/slurm-6561352/pe12.fq.gz -2 /tmp/slurm-6561352/pe22.fq.gz --out unicycler-hybrid_1385_A_all_R1.fq.gz.trim+corr.fq.gz-paired_fastq_runid_1d2f069b4156b0bfdc46e8cd5825a3419de1a39d_ALL.fq.gz.qcat-len100.fq.gz.canu-corr.fasta.gz_flag-meta_mode-normal_kmers-21,33,55,77-_6561352 -t 8 --mode normal --spades_tmp_dir /tmp/slurm-6561352 --kmers 21,33,55,77 --no_correct

Unicycler version: v0.4.8
Using 8 threads

Making output directory:
  /scratch/mg_ton_2019-12-17/03_assembly_ill-corr_ont-corr/unicycler-hybrid_1385_A_all_R1.fq.gz.trim+corr.fq.gz-paired_fastq_runid_1d2f069b4156b0bfdc46e8cd5825a3419de1a39d_ALL.fq.gz.qcat-len100.fq.gz.canu-corr.fasta.gz_flag-meta_mode-normal_kmers-21,33,55,77-_6561352

Dependencies:
  Program         Version     Status  
  spades.py       3.13.1      good    
  racon           1.4.3       good    
  makeblastdb     2.8.1+      good    
  tblastn         2.8.1+      good    
  bowtie2-build   2.3.5.1     good    
  bowtie2         2.3.5.1     good    
  samtools        1.9         good    
  java            1.8.0_152   good    
  pilon           1.22        good    
  bcftools                    not used


SPAdes assemblies (2019-12-19 13:24:19)
    Unicycler now uses SPAdes to assemble the short reads. It scores the
assembly graph for each k-mer using the number of contigs (fewer is better) and
the number of dead ends (fewer is better). The score function is 1/(c*(d+2)),
where c is the contig count and d is the dead end count.

K-mer   Contigs   Dead ends   Score   
   21                           failed
   33                           failed
   55                           failed
   77    79,986      18,176   6.88e-10 ← best

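As a quick sanity check (not part of the original log): plugging the k=77 row of the table into the score formula Unicycler states above, 1/(c*(d+2)), reproduces the reported 6.88e-10:

```python
# Unicycler's k-mer score as described in the log text:
# score = 1 / (c * (d + 2)), where c is the contig count
# and d is the dead-end count. Fewer of each means a higher score.
def kmer_score(contigs: int, dead_ends: int) -> float:
    return 1.0 / (contigs * (dead_ends + 2))

# Values taken from the k=77 row of the table above.
score = kmer_score(79_986, 18_176)
print(f"{score:.2e}")  # → 6.88e-10, matching the log
```

A score this small reflects a very fragmented short-read assembly graph (~80k contigs, ~18k dead ends), which is relevant to the error below.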
Read depth filter: removed 102385 contigs totalling 27191794 bp
Deleting /scratch/mg_ton_2019-12-17/03_assembly_ill-corr_ont-corr/unicycler-hybrid_1385_A_all_R1.fq.gz.trim+corr.fq.gz-paired_fastq_runid_1d2f069b4156b0bfdc46e8cd5825a3419de1a39d_ALL.fq.gz.qcat-len100.fq.gz.canu-corr.fasta.gz_flag-meta_mode-normal_kmers-21,33,55,77-_6561352/spades_assembly/
Deleting /tmp/slurm-6561352/


Determining graph multiplicity (2019-12-19 20:25:29)
    Multiplicity is the number of times a sequence occurs in the underlying
sequence. Single-copy contigs (those with a multiplicity of one, occurring only
once in the underlying sequence) are particularly useful.
Traceback (most recent call last):
  File "/proj/04_programs/Unicycler/Unicycler/unicycler-runner.py", line 21, in <module>
    main()
  File "/proj/04_programs/Unicycler/Unicycler/unicycler/unicycler.py", line 89, in main
    determine_copy_depth(graph)
  File "/proj/04_programs/Unicycler/Unicycler/unicycler/assembly_graph_copy_depth.py", line 74, in determine_copy_depth
    determine_copy_depth_part_2(graph, settings.COPY_PROPAGATION_TOLERANCE, copy_depth_table)
  File "/proj/04_programs/Unicycler/Unicycler/unicycler/assembly_graph_copy_depth.py", line 104, in determine_copy_depth_part_2
    determine_copy_depth_part_2(graph, tolerance, copy_depth_table)
  File "/proj/04_programs/Unicycler/Unicycler/unicycler/assembly_graph_copy_depth.py", line 104, in determine_copy_depth_part_2
    determine_copy_depth_part_2(graph, tolerance, copy_depth_table)
  File "/proj/04_programs/Unicycler/Unicycler/unicycler/assembly_graph_copy_depth.py", line 104, in determine_copy_depth_part_2
    determine_copy_depth_part_2(graph, tolerance, copy_depth_table)
  [Previous line repeated 982 more times]
  File "/proj/04_programs/Unicycler/Unicycler/unicycler/assembly_graph_copy_depth.py", line 103, in determine_copy_depth_part_2
    if redistribute_copy_depths(graph, tolerance, copy_depth_table):
  File "/proj/04_programs/Unicycler/Unicycler/unicycler/assembly_graph_copy_depth.py", line 240, in redistribute_copy_depths
    arrangements = shuffle_into_bins(copy_depths, bins, targets)
  File "/proj/04_programs/Unicycler/Unicycler/unicycler/assembly_graph_copy_depth.py", line 459, in shuffle_into_bins
    arrangements += shuffle_into_bins(items[1:], bins_copy, targets)
  File "/proj/04_programs/Unicycler/Unicycler/unicycler/assembly_graph_copy_depth.py", line 459, in shuffle_into_bins
    arrangements += shuffle_into_bins(items[1:], bins_copy, targets)
  File "/proj/04_programs/Unicycler/Unicycler/unicycler/assembly_graph_copy_depth.py", line 459, in shuffle_into_bins
    arrangements += shuffle_into_bins(items[1:], bins_copy, targets)
  [Previous line repeated 3 more times]
  File "/proj/04_programs/Unicycler/Unicycler/unicycler/assembly_graph_copy_depth.py", line 463, in shuffle_into_bins
    elif all(x for x in bins) and \
RecursionError: maximum recursion depth exceeded

Do you have any idea where that could come from?
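From what I can tell, the traceback shows `determine_copy_depth_part_2` re-invoking itself roughly 985 times before CPython's default recursion limit (1000 frames) is hit. A minimal sketch of that mechanism, with a hypothetical self-recursive function standing in for Unicycler's:

```python
import sys

# CPython caps the interpreter call stack; exceeding the cap raises
# RecursionError. The default is 1000 frames.
print(sys.getrecursionlimit())  # 1000 by default in CPython

def recurse_forever(depth=0):
    # Hypothetical stand-in for a function that re-invokes itself once
    # per pass over a large, fragmented assembly graph.
    return recurse_forever(depth + 1)

try:
    recurse_forever()
except RecursionError as exc:
    print("hit the cap:", exc)

# A blunt workaround is to raise the cap before running, though on a
# graph as fragmented as the one above this may only defer the failure:
sys.setrecursionlimit(10_000)
```

This only illustrates the Python-level mechanism, not necessarily the right fix here; the underlying trigger seems to be the sheer complexity of the graph.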
