Hitachi Vantara Pentaho Community Forums

Thread: java.lang.ArrayIndexOutOfBoundsException

  1. #1
    Join Date
    Mar 2007
    Posts
    216

    java.lang.ArrayIndexOutOfBoundsException

    Hi,

    1. Unzip the attached file and open decomp.ktr.
    2. Hit F10.
    3. Preview the step "if you can preview me it works" with the first 100 lines and the first box ticked. > You cannot: you get a "java.lang.ArrayIndexOutOfBoundsException" error in the "Split fields" step.
    4. Delete the step "delete short_filename" and link "java script that uses short_filename" to "if you can preview me it works".
    5. Preview the step "if you can preview me it works" with the first 100 lines and the first box ticked again. > It works.
    It seems that deleting a field used in a JavaScript step, even in a step placed after that JavaScript step, doesn't work. Is that right? Do you know what happens?
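    For what it's worth, here is a minimal, hypothetical Java sketch (not Kettle code; all names are made up) of the general mechanism behind this kind of ArrayIndexOutOfBoundsException: a step resolves a field index against one row layout and then applies it to rows that were actually produced with a shorter layout.

    Code:
    import java.util.Arrays;
    import java.util.List;

    // Hypothetical sketch, not Kettle code: a stale field index applied to a
    // row that no longer carries that field throws ArrayIndexOutOfBoundsException.
    public class StaleFieldIndexSketch {
        public static void main(String[] args) {
            // Layout the downstream step expects to receive (3 fields).
            List<String> expectedLayout = Arrays.asList("filename", "short_filename", "field_to_split");
            int splitFieldIdx = expectedLayout.indexOf("field_to_split"); // resolves to 2

            // Row actually produced after "short_filename" was removed (2 values).
            Object[] actualRow = {"/tmp/example.txt", "a;b;c"};

            // Indexing with the stale position throws ArrayIndexOutOfBoundsException.
            Object valueToSplit = actualRow[splitFieldIdx];
            System.out.println(valueToSplit);
        }
    }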

    a+, =)
    -=Clément=-
    Attached Files

  2. #2
    Join Date
    May 2006
    Posts
    4,882

    Default

    You know the drill ;-)

    Regards,
    Sven

  3. #3
    Join Date
    Mar 2007
    Posts
    216

    Smile

    Hi,

    Quote Originally Posted by sboden View Post
    You know the drill ;-)

    Regards,
    Sven

    http://jira.pentaho.org/browse/PDI-500


    a+, =)
    -=Clément=-

  4. #4
    Join Date
    May 2006
    Posts
    4,882

    Default

    Clément,

    Can you check with the latest version of the jar files? I don't see the problem anymore.

    Regards,
    Sven

  5. #5
    Join Date
    Mar 2007
    Posts
    216

    Smile

    Hi,

    Quote Originally Posted by sboden View Post
    Clément,

    Can you check with the latest version of the jar files? I don't see the problem anymore.

    Regards,
    Sven
    I don't know how to do that. I tried to find some jar files on the forum, and did find some, but once downloaded I couldn't tell their version number or release date. For example, the kettle-engine-3.0.jar file in my Kettle-3.0.0.GA directory was created on Nov. 15th and modified on Nov. 14th. Where are the latest jar files I should try with, please?
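    In case it helps with identifying a jar's build: a generic Java sketch (plain JDK, not a Kettle utility) that reads the jar's manifest. Whether Implementation-Version is actually filled in depends on how the jar was built, so treat the attribute names as an assumption.

    Code:
    import java.util.jar.Attributes;
    import java.util.jar.JarFile;
    import java.util.jar.Manifest;

    // Prints whatever version information the jar's manifest carries.
    public class JarVersionCheck {
        public static void main(String[] args) throws Exception {
            // Example path; point it at the jar you want to inspect.
            try (JarFile jar = new JarFile("lib/kettle-engine-3.0.jar")) {
                Manifest manifest = jar.getManifest();
                if (manifest == null) {
                    System.out.println("No manifest found");
                    return;
                }
                Attributes attrs = manifest.getMainAttributes();
                // These attributes may be null if the build did not set them.
                System.out.println("Implementation-Version: " + attrs.getValue("Implementation-Version"));
                System.out.println("Implementation-Title:   " + attrs.getValue("Implementation-Title"));
            }
        }
    }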

    a+, =)
    -=Clément=-

  6. #6
    Join Date
    Nov 1999
    Posts
    9,729

    Default

    Put these in lib/

    http://kettle3.s3.amazonaws.com/kettle-ui-swt-3.0.jar
    http://kettle3.s3.amazonaws.com/kettle-engine-3.0.jar

    And this one in libext/

    http://kettle.pentaho.org/svn/Kettle...ionchecker.jar

    And also create a directory called build-res with this in it:

    http://kettle.pentaho.org/svn/Kettle...ion.properties

    That should give you a quick fix for this problem.

    Let us know what happens ;-)

    Matt

  7. #7
    Join Date
    Mar 2007
    Posts
    216

    Smile

    Hi,

    Quote Originally Posted by MattCasters View Post
    (...)
    That should give you a quick fix for this problem.

    Let us know what happens ;-)

    Matt
    Thanks, this issue is fixed by the latest jars.

    a+, =)
    -=Clément=-

  8. #8

    Default

    Hi Matt

    I seem to have the same problem in my transformation, in the Filter Rows step. Unfortunately I cannot download the jars you have posted, so I can't check whether the solution you provided solves my problem too. When I click on the link I get an XML error document, and a wget call to the link gives the following response:

    --2011-11-30 11:23:33-- http://kettle3.s3.amazonaws.com/kettle-ui-swt-3.0.jar
    Resolving kettle3.s3.amazonaws.com... 207.171.163.226
    Connecting to kettle3.s3.amazonaws.com|207.171.163.226|:80... connected.
    HTTP request sent, awaiting response... 403 Forbidden
    2011-11-30 11:23:34 ERROR 403: Forbidden.

    How can I get a hold on those .jar and .properties files?

    Thanks

  9. #9
    Join Date
    Nov 1999
    Posts
    9,729

    Default

    You can always build from source, of course.
    If you desperately want to stick to 3.0, I would recommend 3.0.5: http://source.pentaho.org/viewvc/svn...ranches/3.0.5/

    Matt

  10. #10

    Default

    Sorry,
    It's only after I posted my question that I saw the original question was not from last week, but from last week plus a few years.
    Of course I don't want to use 3.0; I am using 4.2.1 GA.
    My error is still present, but I have managed a workaround for now. I've split my pretty big transformation into smaller ones and I call them from a job. Sure, I lose the parallel speed that way, but for now it's not an issue.

    My ArrayIndexOutOfBoundsException occurs in the following scenario inside a transformation.

    The flow goes straight through all of the steps below:

    - Part A:
      - look up values in a table A
      - if not found, call a web service (REST client step) and parse the response
      - insert the found value into table A
    - Part B:
      - then look up table A again for a different value
      - if not found, call a web service (REST client step) and parse the response
      - insert the found value into table A

    If I run this, I get the ArrayIndexOutOfBoundsException at the second look-up.
    If I comment out either of the parts, it works. It seems the two parts together don't work; that's why splitting Part A and Part B into two transformations works.
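    To make that structure concrete, here is a hypothetical Java sketch of the pattern both parts share (made-up names; a Map stands in for table A, while the real case is a database table plus a REST call): look up, fall back to the web service on a miss, then insert the fetched value. In the original transformation both parts run this concurrently against the same table.

    Code:
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    // Hypothetical sketch of the look-up / REST fallback / insert pattern
    // that Part A and Part B both follow (not the actual Kettle steps).
    public class LookupOrFetchSketch {
        // Stand-in for "table A"; the real case is a database table.
        private final Map<String, String> tableA = new ConcurrentHashMap<>();

        String resolve(String key) {
            String value = tableA.get(key);      // look-up step
            if (value == null) {
                value = callRestService(key);    // REST client step + parse
                tableA.put(key, value);          // insert into table A
            }
            return value;
        }

        private String callRestService(String key) {
            // Placeholder for the real web-service call.
            return "value-for-" + key;
        }
    }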

    Thanks

  11. #11

    Default

    Hi Matt

    Unfortunately it's not a solution for me to serialize the two parts I have described; I need them to run in parallel. I have mapped them in the same transformation, both parts pointing to an external transformation with the appropriate parameters, and the error still occurs when I run the transformation with both parts. If I "comment out" one of them, the other part runs without a problem.

  12. #12
    Join Date
    Apr 2008
    Posts
    1,771

    Default

    Hi.
    I would put a "Block this step until steps finish" step before the 2nd lookup and wait for the first lookup to finish.

    Mick

  13. #13

    Default

    Hi All

    I have solved the problem by making sure that my data stays the same type throughout the mapping step. It's a BigNumber data type, and I used a Select values / Rename step to ensure the data type stays the same; plus, very importantly, I set the Decimal. This was paramount in solving the problem.
    Also, I discovered a bug, already confirmed by 2 other colleagues in my team: the Formula step doesn't do the calculation correctly with BigNumber data types. Firstly, it forces us to use Number as the output type, and then the result is not correct. I may create a JIRA ticket for this.
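    As a generic illustration of the precision point (plain Java, nothing to do with the Formula step's internals): the same calculation kept as BigDecimal stays exact, while forcing it through double (the Number type) can already drift.

    Code:
    import java.math.BigDecimal;

    // BigDecimal keeps decimal values exact; double cannot represent many
    // decimal fractions exactly, which is one way a "Number" result drifts.
    public class BigNumberVsNumber {
        public static void main(String[] args) {
            BigDecimal exact = new BigDecimal("0.10").add(new BigDecimal("0.20")); // 0.30 exactly
            double approx = 0.10 + 0.20;                                           // 0.30000000000000004

            System.out.println("BigDecimal: " + exact);
            System.out.println("double:     " + approx);
        }
    }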

    Thanks
    Rolland
