Hitachi Vantara Pentaho Community Forums

Thread: OutOfMemoryError in FixedInputData.resizeByteBuffer(FixedInputData.java:81)

  1. #1
    Join Date
    May 2008
    Posts
    7

    Default OutOfMemoryError in FixedInputData.resizeByteBuffer(FixedInputData.java:81)

    Thanks for a brilliant product...we use it very heavily at our company

    I have seen this error sporadically and am kind of at a standstill.

    I am running a job programmatically, and the Tomcat JVM on Linux has 1.5 GB of memory allocated. We are using Kettle version 3.0.4-GA. The fixed-length input file is only 800,000 records, each 2,871 characters long, and we do not run the step in parallel.

    The NIO buffer size is 50,000. Should this be reduced to help with these issues?

    Just wondering if this is a known issue? Thanks in advance.
    2008/08/25 11:12:52 - dataquick-fixed-input.0 - ERROR (version 3.0.4, build 53 from 2007/11/14 00:11:43) : Unexpected error :
    2008/08/25 11:14:19 - dataquick-fixed-input.0 - ERROR (version 3.0.4, build 53 from 2007/11/14 00:11:43) : java.lang.OutOfMemoryError: Java heap space
    2008/08/25 11:14:19 - dataquick-fixed-input.0 - ERROR (version 3.0.4, build 53 from 2007/11/14 00:11:43) : at org.pentaho.di.trans.steps.fixedinput.FixedInputData.resizeByteBuffer(FixedInputData.java:81)
    2008/08/25 11:14:19 - dataquick-fixed-input.0 - ERROR (version 3.0.4, build 53 from 2007/11/14 00:11:43) : at org.pentaho.di.trans.steps.fixedinput.FixedInput.readOneRow(FixedInput.java:136)
    2008/08/25 11:14:19 - dataquick-fixed-input.0 - ERROR (version 3.0.4, build 53 from 2007/11/14 00:11:43) : at org.pentaho.di.trans.steps.fixedinput.FixedInput.processRow(FixedInput.java:76)
    2008/08/25 11:14:19 - dataquick-fixed-input.0 - ERROR (version 3.0.4, build 53 from 2007/11/14 00:11:43) : at org.pentaho.di.trans.steps.fixedinput.FixedInput.run(FixedInput.java:305)
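
    For scale, here is the rough math on the sizes involved (assuming the "NIO buffer size" setting is a byte count, which is what its default of 50,000 suggests):

        // Back-of-envelope sizes for the setup above; the only assumption is that
        // the NIO buffer size of 50,000 is a byte count rather than a row count.
        public class SizeEstimate {
            public static void main(String[] args) {
                long records       = 800000L;
                long charsPerRow   = 2871L;
                long nioBufferSize = 50000L;              // read buffer, in bytes (assumed)
                long heapBytes     = 1536L * 1024 * 1024; // the 1.5 GB -Xmx mentioned above

                long fileBytes = records * charsPerRow;   // ~2.1 GB of raw input on disk
                System.out.printf("input file : ~%.2f GB%n", fileBytes / (1024.0 * 1024 * 1024));
                System.out.printf("read buffer: ~%.0f KB (%.4f%% of the heap)%n",
                        nioBufferSize / 1024.0, 100.0 * nioBufferSize / heapBytes);
            }
        }

    So the file is about 2.1 GB in total while the read buffer itself is only ~49 KB, which is why I'm unsure whether the buffer size is really the knob to turn here.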

  2. #2
    Join Date
    Nov 1999
    Posts
    9,729

    Default

    Typically this is not a problem in the CSV / Fixed file input steps themselves, but rather a problem elsewhere in the transformation.
    You solve it by loading less data into memory (caches, stream lookups, etc.) or by giving the JVM more memory in the startup script (Spoon.bat/.sh, Pan.bat/.sh, Kitchen.bat/.sh, Carte.bat/.sh) or in the Pentaho platform.
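
    Since this one runs inside Tomcat rather than one of those scripts, the heap is whatever -Xmx the container was started with (usually via JAVA_OPTS / CATALINA_OPTS). A quick way to see what the JVM actually got, using nothing but standard java.lang.Runtime (no Kettle classes), is something like:

        // Minimal standalone check: log the heap ceiling and current usage.
        public class HeapCheck {
            public static void main(String[] args) {
                Runtime rt = Runtime.getRuntime();
                long maxMb  = rt.maxMemory() / (1024 * 1024);                       // the effective -Xmx
                long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024); // heap in use right now
                System.out.println("max heap: " + maxMb + " MB, in use: " + usedMb + " MB");
            }
        }

    Logging those two numbers just before the transformation starts shows whether the 1.5 GB actually took effect and how much of it the rest of the webapp has already claimed.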

    Matt

  3. #3
    Join Date
    May 2008
    Posts
    7

    Default

    Thanks Matt.

    I am running this job in Tomcat and have allocated 1.5 GB of memory. I don't use any caching and don't have any stream lookups. Are there other places where the data is loaded into memory? I can try bumping the JVM up to 2 GB, which is the maximum since I am on 32-bit Linux.

    Franz Garsombke.

  4. #4
    DEinspanjer Guest

    Default

    Have you tried lowering the rowset size in the Transformation misc properties?
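
    For a rough sense of why that setting matters: every hop between two steps buffers up to "rowset size" rows (10,000 by default, if I remember right), and a 2,871-character row stored as Java Strings costs at least two bytes per character plus object overhead. A back-of-envelope sketch, where the hop count and per-row overhead are illustrative assumptions rather than measured values:

        // Rough estimate of the memory held in rowset buffers alone; the hop count
        // and per-row overhead are assumptions for illustration only.
        public class RowsetEstimate {
            public static void main(String[] args) {
                long rowsetSize  = 10000L;          // "Nr of rows in rowset" (default)
                long bytesPerRow = 2871L * 2 + 200; // ~2 bytes/char of String data, plus rough overhead
                int  hops        = 4;               // hypothetical number of hops in the transformation

                long total = rowsetSize * bytesPerRow * hops;
                System.out.println("rowset buffers alone: ~" + total / (1024 * 1024) + " MB");
            }
        }

    With those numbers the buffers alone come to over 200 MB; dropping the rowset size to 1,000 cuts that by a factor of ten.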

  5. #5
    Join Date
    May 2008
    Posts
    7

    Default

    Great idea. I had no idea that property existed. That should definitely help, since each row has 2,871 characters.

    Thanks Matt

    Franz
