#3 When submitting consecutive queries, an exception is thrown


version 1.2.1

The problem arises *only* when a second query is fired before the first one has finished.

When you submit a SQL query and, while it is still running, submit another one, the following exception is thrown:

09/03/03 15:20:59 INFO mapred.JobClient: Task Id : attempt_200903031500_0006_r_000000_1, Status : FAILED
java.io.IOException: File: hdfs://hadoop2.xunjienet.com:9000/user/hadoop/cloudbase/lib/commons-logging-1.0.4.jar has changed on HDFS since job started

at org.apache.hadoop.filecache.DistributedCache.ifExistsAndFresh(DistributedCache.java:450)
at org.apache.hadoop.filecache.DistributedCache.localizeCache(DistributedCache.java:341)
at org.apache.hadoop.filecache.DistributedCache.getLocalCache(DistributedCache.java:203)
at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:178)

The timestamp of hdfs://hadoop2.xunjienet.com:9000/user/hadoop/cloudbase/lib/commons-logging-1.0.4.jar changes on every query, but with version 1.1 the jar file's timestamp did not keep changing.


    • labels: --> CloudBase Server
    • priority: 5 --> 7
    • assigned_to: nobody --> tsingh
  • Working on this... will submit a patch soon. Meanwhile one can use this workaround:
    if the goal is to submit multiple queries so that they run *one after another*, append a semi-colon to the end of each query. This makes Squirrel send each subsequent query to CloudBase only after the previous one has finished, for example:

    SELECT * FROM test_table1;
    SELECT * FROM test_table2;

    • status: open --> closed
  • Fixed in version 1.3
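The workaround above relies on splitting the script at the terminating semi-colons and submitting the statements strictly one after another. A minimal, hypothetical sketch of that behaviour is below; the class name and the print-instead-of-execute step are illustrative only, not actual Squirrel or CloudBase code.

```java
import java.util.ArrayList;
import java.util.List;

public class SequentialQueryRunner {
    // Split a SQL script into individual statements at the ';' terminators,
    // dropping empty fragments such as the one after the final semi-colon.
    static List<String> splitStatements(String script) {
        List<String> statements = new ArrayList<>();
        for (String part : script.split(";")) {
            String trimmed = part.trim();
            if (!trimmed.isEmpty()) {
                statements.add(trimmed);
            }
        }
        return statements;
    }

    public static void main(String[] args) {
        String script = "SELECT * FROM test_table1;\nSELECT * FROM test_table2;";
        // Each statement is handled (here merely printed) only after the
        // previous call returns, i.e. strictly sequentially -- the same
        // one-at-a-time submission the semi-colon workaround achieves.
        for (String stmt : splitStatements(script)) {
            System.out.println("running: " + stmt);
        }
    }
}
```

The key point is that the loop body blocks until each statement completes, so no two queries are ever in flight at once, which avoids the DistributedCache timestamp check failing mid-job.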