Hi, I have created a multithreaded Java program that does XPath evaluation, but it throws ArrayIndexOutOfBoundsException: -1. Can you tell me why it does not work in a multithreaded environment? Please find the code below.
package com.test;

import org.springframework.util.StringUtils;

import com.ximpleware.AutoPilot;
import com.ximpleware.VTDGen;
import com.ximpleware.VTDNav;

class Test extends Thread {
    private VTDNav vn;
    private AutoPilot ap;

    public Test() {
    }

    public Test(VTDNav vn) {
        this.vn = vn;
    }

    public void run() {
        try {
            for (int i = 1; i < 1000; i++) {
                String s = "//data-set/record[COL1='" + i
                        + "' and COL2='1' and COL3='Administrator' and COL4='HR Administrator Payroll, Day Rate' and COL5='Ea' and COL6='Robert Walters AUD' and COL7='1' and COL8='Administrator' and COL9='HR Administrator Payroll, Day Rate' and COL10='Ea']";
                ap = new AutoPilot(vn);
                this.ap.selectXPath(s);
                String s1 = this.ap.evalXPathToString();
                if (!StringUtils.isEmpty(s1)) {
                    System.out.println(s1);
                }
                this.ap.resetXPath();
            }
        } catch (Exception e) {
            System.out.println(e);
        }
    }
}

public class VTD_EXCELExample extends Test {
    public static void main(String[] args) {
        try {
            VTDGen vg = new VTDGen();
            vg.parseFile("E:/Test_batch/validation.xml", false);
            VTDNav vn = vg.getNav();
            for (int i = 1; i < 5; i++) {
                Test t = new Test(vn);
                t.start();
            }
        } catch (Exception e) {
            System.out.println(e);
        }
        System.out.println("done");
    }
}
ok, will look into it.. thanks for reporting
Your code has one issue that causes the problem, and the fix is super simple: you should clone a copy of VTDNav for each thread instead of sharing one VTDNav object across all threads. So it is not a bug in the library. Also, try to get the 2.13.1 release; it has many bug fixes.
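The per-thread clone the maintainer describes can be sketched like this (a minimal sketch, assuming VTD-XML's VTDNav.cloneNav(), which gives each thread its own navigation cursor over the shared, read-only parsed buffers; the file path and the simplified XPath are placeholders from the thread):

```java
import com.ximpleware.AutoPilot;
import com.ximpleware.VTDGen;
import com.ximpleware.VTDNav;

public class CloneNavExample {
    public static void main(String[] args) throws Exception {
        VTDGen vg = new VTDGen();
        if (!vg.parseFile("E:/Test_batch/validation.xml", false)) {
            throw new IllegalStateException("parse failed");
        }
        VTDNav shared = vg.getNav();
        for (int i = 1; i < 5; i++) {
            // Each thread gets its own VTDNav cursor; only the immutable
            // parsed document buffers are shared between threads.
            final VTDNav vn = shared.cloneNav();
            new Thread(() -> {
                try {
                    AutoPilot ap = new AutoPilot(vn);
                    ap.selectXPath("/data-set/record[COL1='1']");
                    System.out.println(ap.evalXPathToString());
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }).start();
        }
    }
}
```

The mutable state that caused the ArrayIndexOutOfBoundsException lives in the VTDNav's internal context stacks, so cloning the navigator per thread removes the data race without re-parsing the document.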
Last edit: jimmy zhang 2016-12-10
Hi, thanks for the reply. I tried it that way; the first time it worked fine, but after that it stopped working. It works inconsistently. I applied it in a project using Spring Batch partitioning. Please check into that. I want to develop the app further, but because of this issue the development process has completely stopped. I am waiting for your reply.
Your input this time is less helpful; it doesn't give enough detail. I want to know more: what error are you encountering? The same as before, or a new one?
From my experience, the cloneNav change should fix the null pointer issue if you do things correctly in the rest of your code. I have run the test many times; that bug is gone for good.
The new code piece you gave me is not working for me; I don't have Spring Batch installed. Can you put in a bit of effort to isolate the problem and provide a complete test case?
Hi, in a normal multithreaded program it works fine, but when we integrate it into a web app with Spring Batch partitioning, while checking large data with 10 threads, this error comes up: java.lang.ArrayIndexOutOfBoundsException: -1
at com.ximpleware.arrayList.get(arrayList.java:39)
at com.ximpleware.ContextBuffer.store(ContextBuffer.java:293)
at com.ximpleware.VTDNav.push2(VTDNav.java:3512)
at com.ximpleware.LocationPathExpr.evalString(LocationPathExpr.java:269)
at com.ximpleware.AutoPilot.evalXPathToString(AutoPilot.java:892)
at com.test.CustomItemProcessor.process(CustomItemProcessor.java:33)
at com.test.CustomItemProcessor.process(CustomItemProcessor.java:1)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.doProcess(SimpleChunkProcessor.java:126)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.transform(SimpleChunkProcessor.java:293)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.process(SimpleChunkProcessor.java:192)
at org.springframework.batch.core.step.item.ChunkOrientedTasklet.execute(ChunkOrientedTasklet.java:75)
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:406)
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:330)
at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:133)
at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:271)
at org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:81)
at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:374)
at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215)
at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:144)
at org.springframework.batch.core.step.tasklet.TaskletStep.doExecute(TaskletStep.java:257)
at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:200)
at org.springframework.batch.core.partition.support.TaskExecutorPartitionHandler$1.call(TaskExecutorPartitionHandler.java:139)
at org.springframework.batch.core.partition.support.TaskExecutorPartitionHandler$1.call(TaskExecutorPartitionHandler.java:136)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.lang.Thread.run(Thread.java:745)
Last edit: Ramesh 2016-12-13
Hi, I have created a small project for you to check. App.java is the main class; just execute it. The first time it will work, but if you execute it multiple times you will get the error. (Note: the error is printed to the console only when it happens. It is a huge file, so please be patient while it runs.)
I need you to isolate the error: get rid of the Spring Framework dependency and simplify the test case to the point that I can replicate the exception with relative ease.
Hi, sorry, some configuration was missing in Spring; that is why I got that error. Now it is working fine. But for a huge XML file it takes 300ms to check a value using evalXPathToBoolean(). Can we still reduce the time to 50ms?
Yes, if you change your XPath to

/data-set/record[COL1='" + i

that is, remove the leading slash (use /data-set instead of //data-set, so the engine walks a fixed path instead of scanning all descendants). How huge is your XML, by the way?
Also, try to use evalXPath() before anything else. Do this:

if (ap.evalXPath() != -1) {
    System.out.println("found the node!");
} else {
    System.out.println("no node found");
}
Yeah, I tried removing the lead slash and it reduced the time to 210ms, but if I use ap.evalXPath() it takes much more time than that. My XML file will have more than 80,000 records. Can you suggest how to reduce the time to 50ms? I will upload more than 1 lakh records, and each record will be checked against this XML to see whether it is available or not. For this I will split the input into 10 files of 10,000 records each, and all the files will run simultaneously using threads. For example, 10000 * 220ms = 2,200,000ms, which is about 37 minutes; if we reduce it to 50ms it will complete in about 8 minutes. Please suggest how to do this.
1 lakh records? What is a lakh?
ap.evalXPath() taking much more time? No way; you are not doing it right.
Also, 210ms is not what I think you should be getting; for some lookups you should get much less, for some you should get more.
Your test xml has 40,000+ records in each file.
On average, on my machine with your xml file, it takes about 1 ms in most cases.
Are you using a really slow machine? Or are you not turning on the server JVM, or using an outdated server JVM version?
One more trick up your sleeve: rewrite the xpath predicate list so that the most varying predicate comes first, so the predicate evaluation can return false without going through the rest of the 'and' condition checks.
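To illustrate the predicate-ordering trick, here is a small sketch (the buildXPath helper is hypothetical, and only a few of the thread's COL predicates are shown): COL1, the value that changes on every lookup, is placed first, so a non-matching record fails on the first comparison and the remaining 'and' tests are skipped.

```java
public class XPathBuilder {
    // Hypothetical helper: put the most selective predicate (COL1,
    // which differs per record) first so non-matching records fail
    // fast; the constant predicates that match many records go last.
    public static String buildXPath(int col1) {
        return "/data-set/record[COL1='" + col1 + "'"
                + " and COL2='1'"
                + " and COL3='Administrator'"
                + " and COL5='Ea']";
    }

    public static void main(String[] args) {
        System.out.println(buildXPath(42));
    }
}
```

If the document had instead been queried with a near-constant predicate like COL2='1' first, almost every record would pass that test and force evaluation of the remaining conditions, wasting string comparisons.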
Hi there, I have attached only 40,000+ records for testing purposes. If the element is near the start it takes only 1ms, but if the element being searched for is at the end it takes 220ms. The code is here.
package com.test;
import com.ximpleware.AutoPilot;
import com.ximpleware.VTDGen;
import com.ximpleware.VTDNav;
import com.ximpleware.XPathParseException;
public class TestValidation {
}
I got 65ms on my machine running your code
Ok, I am using 8GB RAM; I think it will be faster in staging. Thanks for your support.
One more thing: make sure the number of threads is <= the number of physical cores; otherwise the speedup won't be as much as you would like.
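A quick way to size the thread pool to the hardware is shown below. Note that availableProcessors() reports logical processors, which on hyper-threaded CPUs can be double the physical core count, so treat it as an upper bound:

```java
public class ThreadCount {
    public static void main(String[] args) {
        // Logical processors visible to the JVM; on hyper-threaded
        // CPUs this may exceed the physical core count.
        int cores = Runtime.getRuntime().availableProcessors();
        // Cap the worker threads at the reported count (never below 1).
        int threads = Math.max(1, cores);
        System.out.println("using " + threads + " worker threads");
    }
}
```

Oversubscribing CPU-bound XPath evaluation with more threads than cores just adds context-switch overhead without improving throughput.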
ok thanks for your support.