
#21 Decimal point shifting incorrectly for getFieldValue().set()

Milestone: v1.0_(example)
Status: open
Priority: 1
Updated: 2018-05-17
Created: 2018-02-02
Private: No

Hi,

I am currently using JRecord.jar version 0.81.4 for my project.
We have a COBOL layout in which one of the fields has the format

FUEL-CHARGE PIC9 PIC 9(13).9(2).

We are initialising a record (of type net.sf.JRecord.Details.Line) and setting the value using record.getFieldValue("FUEL-CHARGE PIC9 ").set("0000000337339.79")

When we invoke record.getFieldValue("FUEL-CHARGE PIC9 ").asString():
- in the normal scenario, the output is 337339.79
- but under heavy load (multiple transactions running in parallel), we get 337339 and 337339.790000

So under heavy load, either the decimal point and the digits after it disappear, or the decimal point shifts, with zero padding applied to the right instead of the left.

Please help to understand the root cause and possible solutions for this issue.

Thanks,
Ramesh Rajan

Discussion

  • RAMESH RAJAN

    RAMESH RAJAN - 2018-02-02

    Two corrections:

    1. The field is of the format
      FUEL-CHARGE PIC 9(13).9(2).

    2. We are retrieving the value using record.getFieldValue("FUEL-CHARGE").asString(), not record.getFieldValue("FUEL-CHARGE PIC9 ").asString().

     

    Last edit: RAMESH RAJAN 2018-02-02
  • Bruce Martin

    Bruce Martin - 2018-02-02

    I will have a look at it.

    Are you updating the same record in different transactions, or are you updating different records?

    Is a single record being updated in multiple transactions?

     

    Last edit: Bruce Martin 2018-02-02
  • RAMESH RAJAN

    RAMESH RAJAN - 2018-02-02

    Hi Martin,

    It is one record created and updated per transaction, but multiple transactions run at the same time.

    Also, this issue does not occur consistently. Whenever we have seen it, we were running around 150 to 500 transactions in parallel, and out of those, one record might be affected.

     
  • Bruce Martin

    Bruce Martin - 2018-02-03

    Please show me your code / copybook - is there any OCCURS DEPENDING ON?

    I will send you my e-mail in case you do not want to attach it to the ticket.

     
  • Michael Zoghby

    Michael Zoghby - 2018-05-15

    I am actually seeing a similar problem with these versions:

    <!-- JRECORD -->
    <dependency>
        <groupId>net.sf.JRecord</groupId>
        <artifactId>JRecord</artifactId>
        <version>0.81.1</version>
    </dependency>
    <dependency>
        <groupId>net.sf.bruce_a_martin.cb2xml</groupId>
        <artifactId>cb2xml</artifactId>
        <version>0.95.3</version>
    </dependency>
    

    This is my code:

    ExternalRecord writerCopybook = copyBookLoader.loadCopyBook(copyBookInputStream, COPYBOOK_NAME,
                    CopybookLoader.SPLIT_01_LEVEL, 0, CHARSET_EBCDIC, Cb2xmlConstants.USE_STANDARD_COLUMNS,
                    Convert.FMT_MAINFRAME, 0, null);
    // in another method 
    this.workingPrivateRecord = new Line(this.writerCopybook.getRecord(recordName).asLayoutDetail());
    // and finally 
    this.workingPrivateRecord.getFieldValue(fieldName).set(fieldValue);
    

    This is my copybook definition of a few problem fields:

    15 B1F-CTRY-PERCENT-BY-TRANS-AMT PIC S9(3)V99 COMP-3.
    15 B1F-MCC-TOTAL-TRANS-AMT       PIC S9(11)V99 COMP-3.
    

    An excerpt from my logs:

    Error parsing B1F-CTRY-PERCENT-BY-TRANS-AMT: with value: 100.00 and CobolType: COBOL_PACKED_DOUBLE as a Number.
    java.util.concurrent.ExecutionException: java.lang.NumberFormatException: For input string: "000.00C"
        at java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.util.concurrent.FutureTask.get(FutureTask.java:192)
        at com.capitalone.cardcore.librarytools.copybookreader.model.partitioning.PartitionExecutor.afterExecute(PartitionExecutor.java:33)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1157)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
    Caused by: java.lang.NumberFormatException: For input string: "000.00C"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
        at java.lang.Integer.parseInt(Integer.java:580)
    

    and a different field from another location:

    2018-05-15 10:37:19 ERROR HydraInitWriter:265 - Error parsing B1F-MCC-TOTAL-TRANS-AMT: with value: 72288.18 and CobolType: COBOL_PACKED_DOUBLE as a Number. Field is likely no longer NUMERIC in COBOL definition.
    java.util.concurrent.ExecutionException: java.lang.NumberFormatException: For input string: "818.00C"
        at java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.util.concurrent.FutureTask.get(FutureTask.java:192)
        at com.capitalone.cardcore.librarytools.copybookreader.model.partitioning.PartitionExecutor.afterExecute(PartitionExecutor.java:33)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1157)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
    Caused by: java.lang.NumberFormatException: For input string: "818.00C"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
        at java.lang.Integer.parseInt(Integer.java:580)
        at java.math.BigInteger.<init>(BigInteger.java:479)
        at net.sf.JRecord.Types.TypePackedDecimal.setField(TypePackedDecimal.java:116)
        at net.sf.JRecord.Details.Line.setField(Line.java:441)
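
    The "000.00C" in these exceptions is a strong hint that packed-decimal structure is leaking into a string parse: in COMP-3, digits are stored two per byte and the final half-byte is a sign nibble, 0xC for positive and 0xD for negative. The sketch below is illustrative only (not JRecord code; Comp3Sketch and pack are invented names) and shows how a value like 100.00 packs for PIC S9(3)V99, ending in that same C:

```java
import java.math.BigDecimal;

// Illustrative sketch of COMP-3 (packed decimal) encoding. The decimal
// point is implied by the picture clause, so 100.00 under S9(3)V99 is
// stored as the digit string "10000"; two digit nibbles per byte, with
// the last nibble being the sign: 0xC = positive, 0xD = negative.
public class Comp3Sketch {
    static byte[] pack(BigDecimal value, int totalDigits, int scale) {
        // shift the implied decimal point away, keep absolute digits
        String digits = value.movePointRight(scale).abs()
                .toBigInteger().toString();
        // left-pad with zeros to the declared digit count
        digits = "0".repeat(totalDigits - digits.length()) + digits;
        // append the sign nibble, then pad to a whole number of bytes
        String nibbles = digits + (value.signum() < 0 ? "D" : "C");
        if (nibbles.length() % 2 != 0) nibbles = "0" + nibbles;
        byte[] out = new byte[nibbles.length() / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = (byte) Integer.parseInt(
                    nibbles.substring(2 * i, 2 * i + 2), 16);
        }
        return out;
    }

    public static void main(String[] args) {
        // PIC S9(3)V99 -> 5 digits, scale 2; 100.00 packs into 3 bytes
        byte[] b = pack(new BigDecimal("100.00"), 5, 2);
        System.out.printf("%02X %02X %02X%n", b[0], b[1], b[2]);
        // prints "10 00 0C" -- the trailing sign nibble is the same "C"
        // that shows up in the corrupted string "000.00C" above
    }
}
```

    If a formatter renders "100.00" as "000.00" (wrong digits) and the sign nibble then gets appended during packing, a later numeric parse of that intermediate string would fail exactly as logged.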
    

    What can I do to straighten this out? It looks like it's shifting the decimal and removing data from the front of the line. I've also seen parsing issues on the ".". I'm hoping I can just modify the values I supply to JRecord rather than upgrade just yet - we're on the heels of a rollout, and I'm not sure how backwards-compatible the newest versions are with the one we're using.

     

    Last edit: Michael Zoghby 2018-05-15
  • Michael Zoghby

    Michael Zoghby - 2018-05-15

    These COMP-3 decimal fields are the ONLY field types that have problems. This copybook has every field type under the sun, and no other fields have trouble being "reset" with new data.

    10 B1F-RTD-RULES.
        15  B1F-RTD-RULE           PIC X(3) OCCURS 10 TIMES.
    10 B1F-NULL-CW-START-DAYS        PIC X(3).
    10 B1F-CW-VALUE                  PIC X(4).
    10 B1F-CKPNT-VAL-SCORE-RAW       PIC -9(6).99.
    10 B1F-CKPNT-VAL-SCORE-CNVTD     PIC 9(3).
    10 B1F-CVV2-VALIDATION-RESULT    PIC X(1).
    10 B1F-LAST-INTERAC-ATC          PIC S9(09) COMP.
    10 B1F-PREV-INTERAC-ATC          PIC S9(09) COMP.
    10 B1F-MID-AREA.
        15 B1F-MID-FIRST-TRANS-DATE      PIC S9(07) COMP-3.
        15 B1F-MID-LAST-TRANS-DATE       PIC S9(07) COMP-3.
        15 B1F-MID-MONTHS-WITH-TRANS     PIC S9(03) COMP-3.
        15 B1F-MID-RANK-BY-TRANS         PIC S9(05) COMP-3.
        15 B1F-MID-TOTAL-TRANS           PIC S9(05) COMP-3.
        15 B1F-MID-TOTAL-TRANS-AMT       PIC S9(11)V99 COMP-3.
    

    For the fields that aren't flagged as implied decimal, there is no problem loading new values either. The only fields struggling are the implied-decimal COMP-3s.

    The other thing that is interesting/difficult is that not every field fails to be set, and I can't identify a pattern of behavior. Most fields set completely fine. For example:

    System.out.println("Setting " + fieldName + ": with value: " + fieldValue);
    
    this.workingPrivateRecord.getFieldValue(fieldName).set(fieldValue);
    
    System.out.println(fieldName + ": with final value: " +
    this.workingPrivateRecord.getFieldValue(fieldName).asString());
    

    and the log:

    Setting B1F-MCC-MONTHS-WITH-TRANS: with value: 1 and CobolType: COBOL_PACKED_DOUBLE as a Number.
    B1F-MCC-MONTHS-WITH-TRANS: with final value: 1 and CobolType: COBOL_PACKED_DOUBLE as a Number.
    Setting B1F-MCC-RANK-BY-TRANS: with value: 5 and CobolType: COBOL_PACKED_DOUBLE as a Number.
    B1F-MCC-RANK-BY-TRANS: with final value: 5 and CobolType: COBOL_PACKED_DOUBLE as a Number.
    Setting B1F-MCC-TOTAL-TRANS: with value: 1 and CobolType: COBOL_PACKED_DOUBLE as a Number.
    B1F-MCC-TOTAL-TRANS: with final value: 1 and CobolType: COBOL_PACKED_DOUBLE as a Number.
    Setting B1F-MCC-TOTAL-TRANS-AMT: with value: 803.00 and CobolType: COBOL_PACKED_DOUBLE as a Number.
    B1F-MCC-TOTAL-TRANS-AMT: with final value: 803.00 and CobolType: COBOL_PACKED_DOUBLE as a Number.
    Setting B1F-MCC-PERCENT-BY-TRANS-AMT: with value: 2.77 and CobolType: COBOL_PACKED_DOUBLE as a Number.
    B1F-MCC-PERCENT-BY-TRANS-AMT: with final value: 2.77 and CobolType: COBOL_PACKED_DOUBLE as a Number.
    Setting B1F-ST-FIRST-TRANS-DATE: with value: 2018132 and CobolType: COBOL_PACKED_DOUBLE as a Number.
    B1F-ST-FIRST-TRANS-DATE: with final value: 2018132 and CobolType: COBOL_PACKED_DOUBLE as a Number.
    

    It's also worth mentioning that when I attempt to manually remove the decimal:

    fieldValue = StringUtils.remove(fieldValue, ".");
    this.workingPrivateRecord.getFieldValue(fieldName).set(fieldValue);
    

    I get the following error:

    Error parsing B1F-CTRY-PERCENT-BY-TRANS-AMT: with value: 10000 and CobolType: COBOL_PACKED_DOUBLE as a Number.
    java.util.concurrent.ExecutionException: net.sf.JRecord.Common.RecordException: Value is to big for field true > 1,157 4 ~ 3 16
        at java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.util.concurrent.FutureTask.get(FutureTask.java:192)
        at com.capitalone.cardcore.librarytools.copybookreader.model.partitioning.PartitionExecutor.afterExecute(PartitionExecutor.java:33)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1157)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
    Caused by: net.sf.JRecord.Common.RecordException: Value is to big for field true > 1,157 4 ~ 3 16
        at net.sf.JRecord.Common.Conversion.setBigInt(Conversion.java:595)
        at net.sf.JRecord.Types.TypePackedDecimal.setField(TypePackedDecimal.java:116)
        at net.sf.JRecord.Details.Line.setField(Line.java:441)
        at net.sf.JRecord.Details.BasicLine.setField(BasicLine.java:221)
        at net.sf.JRecord.Details.FieldValue.set(FieldValue.java:163)
    
     

    Last edit: Michael Zoghby 2018-05-15
  • Bruce Martin

    Bruce Martin - 2018-05-15

    Yes, the problem existed in several versions. I have uploaded two fixed versions:

    • 0.81.4
    • JRecord 0.90 Release Candidate 5

    I will try to do another 0.90 Release Candidate 6. This will have:

    • An option to turn off the new Type system if necessary
    • A RecordDecider builder option (JRecordInterface1)

    The problem is in the net.sf.JRecord.Common.Conversion method
    getNumberformat; it should be:

        public static NumberFormat getNumberformat() {
            return NumberFormat.getNumberInstance(Locale.US);
        }
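
    The fix works because NumberFormat instances are mutable and not thread-safe, so handing every caller a fresh instance removes the shared state. The sketch below (illustrative, not JRecord code) replays the bad interleaving sequentially: "thread B" reconfigures the shared formatter between "thread A" obtaining it and using it, reproducing exactly the kind of dropped decimal reported above:

```java
import java.text.NumberFormat;
import java.util.Locale;

public class SharedFormatRace {
    // Unsafe pattern: one shared, mutable NumberFormat for the whole JVM
    static final NumberFormat SHARED = NumberFormat.getNumberInstance(Locale.US);

    public static void main(String[] args) {
        // "Thread A" grabs the shared format for a two-decimal field
        NumberFormat a = SHARED;
        a.setMinimumFractionDigits(2);
        a.setMaximumFractionDigits(2);

        // "Thread B" reconfigures the SAME object for an integer
        // field (e.g. S9(7)) before A has used it
        NumberFormat b = SHARED;
        b.setMinimumFractionDigits(0);
        b.setMaximumFractionDigits(0);

        // Thread A now formats its decimal value - the fraction is gone
        System.out.println(a.format(337339.79));     // prints "337,340"

        // Safe pattern (the fix): every caller gets a fresh instance
        NumberFormat fresh = NumberFormat.getNumberInstance(Locale.US);
        fresh.setMinimumFractionDigits(2);
        fresh.setMaximumFractionDigits(2);
        System.out.println(fresh.format(337339.79)); // prints "337,339.79"
    }
}
```

    Creating a NumberFormat per call is cheap relative to the I/O these records represent, which is why the one-line fix is preferable to locking around the shared instance.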
    
     
  • Michael Zoghby

    Michael Zoghby - 2018-05-16

    Awesome news! Two follow-up questions, though...

    The first post in this ticket indicates they were using 0.81.4 when they saw the problem. I'm using an older version than that, so I can't say from experience - just curious whether you rolled out a new release on top of that previous version that patched it?

    Second, purely out of curiosity: I only saw this issue when we ran JRecord in a multithreaded environment, and only for these few COMP-3 fields in this one file. I'm publishing over 50 EBCDIC files using JRecord - all multithreaded - and I never saw this problem elsewhere. I'm curious whether the root cause might be more complex. I've never had cause to doubt JRecord - it has absolutely saved my life - I just want to make sure I understand what's going on under the hood.

     
  • Bruce Martin

    Bruce Martin - 2018-05-17

    The code is:

          get NumberFormat
          Update NumberFormat
          Use NumberFormat
    

    The program uses the NumberFormat straight after it is updated, so there is not much time for another thread to get in and update it. A lot of COBOL programmers will use the same picture definition, e.g. S9(7) or S9(7)V99, so there are going to be cases where the corruption happens but you do not see it.

    I can only go on what Ramesh reported:

    • Problems only occurred with more than 150 concurrent transactions; below that, none were seen.

    If this is the only problem, you will see it with:

    • very high volumes
    • different picture formats being used in the same JVM.

    I do not plan on backporting to all versions - too much work, and too much to do already. I will probably update JRecord 0.91.5 and leave it at that.

     
