Hitachi Vantara Pentaho Community Forums

Thread: java.lang.OutOfMemoryError: GC overhead limit exceeded

  1. #1
    Join Date
    Nov 2012
    Posts
    11

    Default java.lang.OutOfMemoryError: GC overhead limit exceeded

    Hi,

    Recently I upgraded Pentaho Data Integration from 4.4 to 5.4 (exact version: 5.4.0.1-130). I created a new DB repository in Oracle, exported the old repository contents to XML and imported them into the new DB repository.
    The Java version is the same as it used to be: 1.7.0_19, running on 64-bit CentOS release 6.4 (Final). The system has 4 GB of RAM.
    The job that used to run without problems on 4.4 now fails on 5.4 with the error: java.lang.OutOfMemoryError: GC overhead limit exceeded
    Just to give you more background: pretty much everything is done within Oracle, and 95% of the job just invokes SQL scripts, with some parallelism between jobs.
    I've increased the JVM memory settings from
    PENTAHO_DI_JAVA_OPTIONS="-Xmx512m -XX:MaxPermSize=256m"
    to
    PENTAHO_DI_JAVA_OPTIONS="-Xmx1024m -XX:MaxPermSize=512m"

    but it didn't help.
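
    For reference, the variable is simply exported in the shell that launches the job, roughly like this (the install path and job name below are placeholders, not my exact setup):

    export PENTAHO_DI_JAVA_OPTIONS="-Xmx1024m -XX:MaxPermSize=512m"
    /opt/pentaho/data-integration/kitchen.sh -rep=MyRepo -job=MyJob -level=Basic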

    Does anyone have a clue what the issue could be?

    thanks a lot,
    Pawel

  2. #2
    Join Date
    Nov 2012
    Posts
    11

    Default

    BTW - with Pentaho 6 I get the same issue...

  3. #3
    Join Date
    Apr 2015
    Posts
    15

    Default

    What is the transformation that receives that error doing? I got this recently with a large file that I then had to sort before a Group By; I decreased the sort size (rows in memory) and that helped.

  4. #4
    Join Date
    Aug 2015
    Posts
    313

    Default

    Have you tried increasing the JVM size from
    PENTAHO_DI_JAVA_OPTIONS="-Xmx1024m -XX:MaxPermSize=512m"

    to

    PENTAHO_DI_JAVA_OPTIONS="-Xmx2048m -XX:MaxPermSize=1024m"

    You can increase it to an even larger size as well if you are using 64-bit Java. Try it and let me know; I have already succeeded with this approach.
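
    If you want to double-check that the larger heap is actually picked up by the running PDI process, something like this should list the JVM arguments of the Kitchen/Spoon process (assuming the JDK's jps tool is on your PATH; the grep pattern is just an example):

    jps -lvm | grep -i pentaho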

  5. #5
    Join Date
    Nov 2012
    Posts
    11

    Default

    Quote Originally Posted by santhi
    Have you tried increasing the JVM size from
    PENTAHO_DI_JAVA_OPTIONS="-Xmx1024m -XX:MaxPermSize=512m"

    to

    PENTAHO_DI_JAVA_OPTIONS="-Xmx2048m -XX:MaxPermSize=1024m"

    You can increase it to an even larger size as well if you are using 64-bit Java. Try it and let me know; I have already succeeded with this approach.
    Hi,

    the above doesn't solve the issue. There's no transformation causing the failure and no large data reads; all the job does is execute a bunch of SQL scripts in parallel, so all the work is done on the Oracle server and Pentaho merely serves as a workflow, wrapper, and error handler...

    Any ideas why I get this error?

    thanks a lot,
    Pawel

  6. #6
    Join Date
    Aug 2014
    Posts
    21

    Default

    Hi, I am getting this OutOfMemoryError exception after I connect to a REST service with the REST Client step and save the result to a file.
    I increased the JVM memory settings, but this is still not enough because the data I am consuming is very big.


    So is there a way to write the response coming from the REST service to disk immediately, without loading the entire response into memory, so that I don't have to worry about having enough memory to handle large data?
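
    In other words, what I'm after is essentially what curl does when it streams a response straight to a file instead of buffering it all in memory (the URL and output path below are just examples):

    curl -sS "https://example.com/api/big-export" -o /tmp/big-export.json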

  7. #7
    Join Date
    Nov 2012
    Posts
    11

    Default

    Anybody, please? Any feedback would be highly appreciated!
    Again, I don't seem to consume much memory, since I merely run SQL scripts in sequence... plus the same job runs smoothly on 4.4...
