I have the same problem: ~1300 genomes, more than 800 GB of data generated on the hard drive. Is there a solution?

SOLUTION: I used a dedicated high-capacity SSD.

I then ran into a second problem. Jellyfish now runs smoothly, but there is a step where it has to write into dedicated directories. It runs fine for the first ~600 genomes, then for every genome after that it reports "awk: write failure (File too large) awk: close failed on file /dev/stdout (File too large)" until the last one. It generates a fasta...
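For what it's worth, "File too large" from awk usually points to a per-process file-size limit or a filesystem cap on single-file size, not a full disk. A minimal diagnostic sketch, assuming the SSD is mounted at /mnt/ssd (substitute your actual mount point and output path):

    ulimit -f              # per-process file size limit; "unlimited" is what you want
    df -T /mnt/ssd         # filesystem type; exFAT and FAT32 cap single files at 4 GiB
    ls -lh /mnt/ssd/output # hypothetical path: check whether the failing file stalled near a power-of-two size

If df -T reports vfat or exfat, reformatting the SSD as ext4 (or NTFS) removes the 4 GiB single-file limit, which would explain why the run succeeds until the accumulated output crosses that boundary.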