Re: [svtoolkit-help] SVtoolkit Error: Exception processing cluster: Permitted to write any record upstream of position...
From: Ashish K. <as...@we...> - 2012-02-13 13:48:39
Hi Bob,

That's great. And if memory becomes an issue, I could always increase the max heap size.

Thanks,
Ashish

-----Original Message-----
From: Bob Handsaker [mailto:han...@br...]
Sent: 12 February 2012 21:54
To: svt...@li...
Subject: Re: [svtoolkit-help] SVtoolkit Error: Exception processing cluster: Permitted to write any record upstream of position...

Hi, Ashish,

As a workaround, you can use the -vcfCachingDistance argument to SVDiscovery. The default (since build 683) is 10000, but you can try increasing this to 20000. The minimum amount you need to increase by is the difference between the two positions in the error message. Larger values will increase the memory footprint by holding more data in memory before writing to the VCF file.

-Bob

On 2/12/12 6:56 AM, Ashish Kumar wrote:
> Hi Bob,
>
> I've got a similar error on the latest release version:
> SVToolkit version 1.04 (build 840)
> Build date: 2012/01/20 13:17:44
>
> Following is the error stack:
>
> ##### ERROR ------------------------------------------------------------------------------------------
> ##### ERROR stack trace
> java.lang.IllegalArgumentException: Permitted to write any record upstream of position 10804760, but a record at 21:10803683 was just added.
>     at org.broadinstitute.sting.utils.codecs.vcf.SortingVCFWriterBase.noteCurrentRecord(SortingVCFWriterBase.java:101)
>     at org.broadinstitute.sting.utils.codecs.vcf.SortingVCFWriter.noteCurrentRecord(SortingVCFWriter.java:55)
>     at org.broadinstitute.sting.utils.codecs.vcf.SortingVCFWriterBase.add(SortingVCFWriterBase.java:123)
>     at org.broadinstitute.sv.util.vcf.VCFWriterFactory$VCFWriterAdapter.add(VCFWriterFactory.java:76)
>     at org.broadinstitute.sv.discovery.DeletionDiscoveryAlgorithm.writeVCFRecord(DeletionDiscoveryAlgorithm.java:646)
>     at org.broadinstitute.sv.discovery.DeletionDiscoveryAlgorithm.processCluster(DeletionDiscoveryAlgorithm.java:444)
>     at org.broadinstitute.sv.discovery.DeletionDiscoveryAlgorithm.processClusters(DeletionDiscoveryAlgorithm.java:342)
>     at org.broadinstitute.sv.discovery.DeletionDiscoveryAlgorithm.runDiscovery(DeletionDiscoveryAlgorithm.java:191)
>     at org.broadinstitute.sv.discovery.SVDiscoveryWalker.onTraversalDone(SVDiscoveryWalker.java:168)
>     at org.broadinstitute.sv.discovery.SVDiscoveryWalker.onTraversalDone(SVDiscoveryWalker.java:45)
>     at org.broadinstitute.sting.gatk.executive.Accumulator$StandardAccumulator.finishTraversal(Accumulator.java:129)
>     at org.broadinstitute.sting.gatk.executive.LinearMicroScheduler.execute(LinearMicroScheduler.java:76)
>     at org.broadinstitute.sting.gatk.GenomeAnalysisEngine.execute(GenomeAnalysisEngine.java:234)
>     at org.broadinstitute.sting.gatk.CommandLineExecutable.execute(CommandLineExecutable.java:113)
>     at org.broadinstitute.sv.main.SVCommandLine.execute(SVCommandLine.java:105)
>     at org.broadinstitute.sting.commandline.CommandLineProgram.start(CommandLineProgram.java:221)
>     at org.broadinstitute.sv.main.SVCommandLine.main(SVCommandLine.java:67)
>     at org.broadinstitute.sv.main.SVDiscovery.main(SVDiscovery.java:21)
> ##### ERROR ------------------------------------------------------------------------------------------
> ##### ERROR A GATK RUNTIME ERROR has occurred (version
> 1.0-6121-g40e3165):
> ##### ERROR
> ##### ERROR Please visit the wiki to see if this is a known problem
> ##### ERROR If not, please post the error, with stack trace, to the GATK forum
> ##### ERROR Visit our wiki for extensive documentation http://www.broadinstitute.org/gsa/wiki
> ##### ERROR Visit our forum to view answers to commonly asked questions http://getsatisfaction.com/gsa
> ##### ERROR
> ##### ERROR MESSAGE: Permitted to write any record upstream of position 10804760, but a record at 21:10803683 was just added.
> ##### ERROR ------------------------------------------------------------------------------------------
>
> Let me know if you would need any additional information.
>
> Thanks,
> Ashish
>
>
> -----Original Message-----
> From: Bob Handsaker [mailto:han...@br...]
> Sent: 21 November 2011 17:25
> To: svt...@li...
> Subject: Re: [svtoolkit-help] SVtoolkit Error: Exception processing cluster: Permitted to write any record upstream of position...
>
> The problem doesn't have anything to do with input data size (at least not directly).
> It has to do with the requirement that the output records need to be coordinate sorted.
> The code attempts to buffer to keep the output records in order, but the buffering in this case is insufficient.
>
> What version are you using?
> I have fixed some problems in this area, so you could try the latest interim release from
> ftp://ftp.broadinstitute.org/pub/svtoolkit/releases/interim/
> and see if this helps.
>
> The interim releases don't have updated documentation.
> They are also not necessarily 100% compatible, but I think people have adapted to them without too much trouble.
>
> If it still happens with the latest interim release, let me know.
>
> -Bob
>
> On 11/21/11 10:48 AM, John Broxholme wrote:
>> Hi there,
>>
>> We're trying to use svtoolkit for SV detection on some deep-sequencing data. This particular project comprises data from many (up to 800) BAMs.
>> I've reduced the number to 80 and I still see this kind of error. Where might it come from and how might I fix it?
>>
>> I'm not sure which part of the log is most useful, but I've appended what looks to be most informative.
>>
>> Thanks,
>> John Broxholme
>> WTCHG,
>> Oxford, UK
>>
>> ...
>> INFO 00:19:23,590 SVDiscovery - Clustering: LR split size 14 / 95 maximal clique size 2 clique count 3
>> INFO 00:19:23,590 SVDiscovery - Clustering: LR split size 12 / 95 maximal clique size 1 clique count 18
>> INFO 00:19:23,590 SVDiscovery - Processing cluster 14:24437577-24437715 14:24519740-24520124 LR 10
>> INFO 00:19:24,712 SVDiscovery - Processing cluster 14:24425879-24426190 14:24461366-24461735 LR 6
>> Error: Exception processing cluster: Permitted to write any record upstream of position 24427689, but a record at 14:24426202 was just added.
>> Cluster: 14:24425879-24426190 14:24461366-24461735 LR 6
>> [GC 2440851K->2036038K(2645952K), 0.0058190 secs]
>> [Full GC 2036038K->34379K(2645952K), 0.7118760 secs]
>> ##### ERROR ------------------------------------------------------------------------------------------
>> ##### ERROR stack trace
>> java.lang.IllegalArgumentException: Permitted to write any record upstream of position 24427689, but a record at 14:24426202 was just added.
>>     at org.broadinstitute.sting.utils.codecs.vcf.SortingVCFWriterBase.noteCurrentRecord(SortingVCFWriterBase.java:101)
>>     at org.broadinstitute.sting.utils.codecs.vcf.SortingVCFWriter.noteCurrentRecord(SortingVCFWriter.java:55)
>>     at org.broadinstitute.sting.utils.codecs.vcf.SortingVCFWriterBase.add(SortingVCFWriterBase.java:123)
>>     at org.broadinstitute.sv.util.vcf.VCFWriterFactory$VCFWriterAdapter.add(VCFWriterFactory.java:76)
>>     at org.broadinstitute.sv.discovery.DeletionDiscoveryAlgorithm.writeVCFRecord(DeletionDiscoveryAlgorithm.java:641)
>>     at org.broadinstitute.sv.discovery.DeletionDiscoveryAlgorithm.processCluster(DeletionDiscoveryAlgorithm.java:439)
>>     at org.broadinstitute.sv.discovery.DeletionDiscoveryAlgorithm.processClusters(DeletionDiscoveryAlgorithm.java:337)
>>     at org.broadinstitute.sv.discovery.DeletionDiscoveryAlgorithm.runDiscovery(DeletionDiscoveryAlgorithm.java:186)
>>     at org.broadinstitute.sv.discovery.SVDiscoveryWalker.onTraversalDone(SVDiscoveryWalker.java:166)
>>     at org.broadinstitute.sv.discovery.SVDiscoveryWalker.onTraversalDone(SVDiscoveryWalker.java:45)
>>     at org.broadinstitute.sting.gatk.executive.Accumulator$StandardAccumulator.finishTraversal(Accumulator.java:129)
>>     at org.broadinstitute.sting.gatk.executive.LinearMicroScheduler.execute(LinearMicroScheduler.java:76)
>>     at org.broadinstitute.sting.gatk.GenomeAnalysisEngine.execute(GenomeAnalysisEngine.java:234)
>>     at org.broadinstitute.sting.gatk.CommandLineExecutable.execute(CommandLineExecutable.java:113)
>>     at org.broadinstitute.sv.main.SVCommandLine.execute(SVCommandLine.java:105)
>>     at org.broadinstitute.sting.commandline.CommandLineProgram.start(CommandLineProgram.java:221)
>>     at org.broadinstitute.sv.main.SVCommandLine.main(SVCommandLine.java:67)
>>     at org.broadinstitute.sv.main.SVDiscovery.main(SVDiscovery.java:21)
>> ##### ERROR ------------------------------------------------------------------------------------------
>> ##### ERROR A GATK RUNTIME ERROR has occurred (version 1.0-6121-g40e3165):
>> ##### ERROR
>> ##### ERROR Please visit the wiki to see if this is a known problem
>> ##### ERROR If not, please post the error, with stack trace, to the GATK forum
>> ##### ERROR Visit our wiki for extensive documentation http://www.broadinstitute.org/gsa/wiki
>> ##### ERROR Visit our forum to view answers to commonly asked questions http://getsatisfaction.com/gsa
>> ##### ERROR
>> ##### ERROR MESSAGE: Permitted to write any record upstream of position 24427689, but a record at 14:24426202 was just added.
>> ##### ERROR ------------------------------------------------------------------------------------------
>>     at org.broadinstitute.sting.queue.util.ShellJob.run(ShellJob.scala:24)
>>     at org.broadinstitute.sting.queue.engine.shell.ShellJobRunner.start(ShellJobRunner.scala:54)
>>     at org.broadinstitute.sting.queue.engine.FunctionEdge.start(FunctionEdge.scala:56)
>>     at org.broadinstitute.sting.queue.engine.QGraph.runJobs(QGraph.scala:383)
>>     at org.broadinstitute.sting.queue.engine.QGraph.run(QGraph.scala:123)
>>     at org.broadinstitute.sting.queue.QCommandLine.execute(QCommandLine.scala:111)
>>     at org.broadinstitute.sting.commandline.CommandLineProgram.start(CommandLineProgram.java:221)
>>     at org.broadinstitute.sting.queue.QCommandLine$.main(QCommandLine.scala:57)
>>     at org.broadinstitute.sting.queue.QCommandLine.main(QCommandLine.scala)
>> INFO 00:19:40,719 QGraph - 74 Pend, 2 Run, 0 Fail, 12 Done
>> ...
>> _______________________________________________
>> svtoolkit-help mailing list
>> svt...@li...
>> https://lists.sourceforge.net/lists/listinfo/svtoolkit-help
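For anyone who lands on this thread: the coordinate-sorted buffering Bob describes can be illustrated with a small sketch. This is illustrative Python only, not the actual Java SortingVCFWriterBase code; the class SortingWriter, its internals, and the trigger position 10814760 are all hypothetical, but the arithmetic mirrors the error message above, assuming the writable boundary is the newest position seen minus the caching distance.

```python
# Illustrative sketch only -- NOT the actual SortingVCFWriterBase code.
import heapq

class SortingWriter:
    """Buffers records within a window so output stays coordinate-sorted."""

    def __init__(self, caching_distance):
        self.caching_distance = caching_distance
        self.most_upstream_writable = 0  # records before this are rejected
        self.pending = []                # min-heap of buffered positions
        self.written = []                # positions emitted, in sorted order

    def add(self, position):
        if position < self.most_upstream_writable:
            raise ValueError(
                "Permitted to write any record upstream of position %d, "
                "but a record at %d was just added."
                % (self.most_upstream_writable, position))
        heapq.heappush(self.pending, position)
        # A record at `position` promises nothing earlier than
        # (position - caching_distance) will arrive later, so that
        # prefix of the buffer can be flushed to the output file.
        self.most_upstream_writable = max(
            self.most_upstream_writable, position - self.caching_distance)
        while self.pending and self.pending[0] < self.most_upstream_writable:
            self.written.append(heapq.heappop(self.pending))

# With the default window of 10000, a far-downstream record moves the
# writable boundary past an out-of-order record that arrives later:
w = SortingWriter(caching_distance=10000)
w.add(10814760)          # boundary becomes 10814760 - 10000 = 10804760
try:
    w.add(10803683)      # upstream of the boundary -> rejected
except ValueError as e:
    print(e)

# Bob's fix: widen the window by at least the difference between the two
# positions in the error message (here 10804760 - 10803683 = 1077).
w2 = SortingWriter(caching_distance=20000)
w2.add(10814760)         # boundary is only 10794760 now
w2.add(10803683)         # accepted
```

Raising -vcfCachingDistance (and, as Ashish notes, the Java max heap size if needed) simply widens this window, at the cost of holding more records in memory before they are written.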