Hitachi Vantara Pentaho Community Forums

Thread: Memory leaks?

  1. #1

    Memory leaks?

    I'm running a Kettle process regularly (every 10 minutes) within the Pentaho framework. After about a day it runs out of memory (heap space), despite my allocating 1 GB of heap to it. Are there any known causes of memory leaks in this sort of environment? Has anyone else tried doing the same sort of thing?
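    One common cause in any JVM process scheduled this way is state that outlives each run, for example a static collection that only ever grows. This is a general illustration of that pattern with made-up names, not Pentaho/Kettle code:

```java
import java.util.ArrayList;
import java.util.List;

// Illustration only (hypothetical class, not Pentaho code): the classic
// pattern behind "runs fine for hours, then heap space error" is per-run
// state that is never released, such as a static list that only grows.
public class LeakyScheduler {
    private static final List<byte[]> history = new ArrayList<>();

    // Simulates one scheduled run that keeps a 1 MB result forever.
    static void runJob() {
        history.add(new byte[1024 * 1024]);
    }

    static int retainedRuns() {
        return history.size();
    }

    public static void main(String[] args) {
        for (int i = 0; i < 5; i++) {
            runJob();
        }
        // Each run adds ~1 MB that the garbage collector can never reclaim.
        System.out.println("runs retained: " + retainedRuns());
    }
}
```

    With a 10-minute schedule, even a few hundred kilobytes retained per run adds up to hundreds of megabytes within a day, which would match the symptom above.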

  2. #2

    Memory leak


    I just had this error too: a memory leak, coming from the HSQL DB.
    I tried adding memory options to the command:

    " java -Xms64m -Xmx256m "

    After that I did not have the error anymore. I looked in the pentaho-data\hibernate folder and saw that my hibernate.script file had grown to 32 MB; I suppose this file was the problem.

    That solved the problem for a while, but what happens if the file grows beyond 256 MB? I saw that all parameters are saved in it, so I suppose all trigger parameters are in it too, which means it cannot simply be deleted. Am I wrong?
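    To confirm that the -Xms/-Xmx options actually took effect, the limits can be read back from inside the JVM with the standard Runtime API. This is a small diagnostic sketch (the class name is made up):

```java
// Reads back the heap limits the JVM was actually started with, so you
// can verify that -Xmx256m (or any other value) took effect.
public class HeapLimits {
    static long maxHeapMb() {
        return Runtime.getRuntime().maxMemory() / (1024 * 1024);
    }

    static long usedHeapMb() {
        Runtime rt = Runtime.getRuntime();
        return (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
    }

    public static void main(String[] args) {
        System.out.println("max heap  (MB): " + maxHeapMb());
        System.out.println("used heap (MB): " + usedHeapMb());
    }
}
```

    If the reported maximum does not match what you passed on the command line, the option was not picked up by the process you thought it was.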

    Can anybody tell us more about it?

    Best regards

  3. #3


    I'm starting to experience this as well, but in my scenario the Kettle job/transformations run on the Pentaho BI Platform. Are the two posts above running stand-alone or within the BI Platform?

    I do not have any Hibernate databases, so I know that is not the cause here. I do have a fair amount of ModJS scripts using the isRegExp() method, in case those are causing leaks. I personally haven't been able to dig deep enough into it yet; I'm just fire-fighting.

    * I am running a job with 4 transformations

    * One transformation makes heavy use of ModJS isRegExp() method


Copyright © 2005 - 2019 Hitachi Vantara Corporation. All Rights Reserved.