Hitachi Vantara Pentaho Community Forums

Thread: Migrating from 4.4.0-stable to 5.0.1.A-stable - Memory issues

  1. #1
    Join Date
    Feb 2013
    Posts
    20

    Default Migrating from 4.4.0-stable to 5.0.1.A-stable - Memory issues

    I've been working through an issue in CE 4.4.0-stable where Kitchen would error due to the number of parallel transformations being executed. After some digging around I found a reference in JIRA saying the issue had been fixed in 5.0, so I decided to upgrade to test.

    The bit where it was failing is now fine, but I now get memory leaks further down, when I execute a SQL stored procedure via the Call DB Procedure step. I've increased the memory settings and have tried disabling the garbage collector, with no luck.
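
    For reference, this is roughly how I've been raising the heap when launching the job from Kitchen - the heap size and job path are just placeholders from my setup, and I'm assuming kitchen.sh in 5.0.1 still picks up the PENTAHO_DI_JAVA_OPTIONS environment variable:

        # Assumption: in PDI 5.x kitchen.sh delegates to spoon.sh, which honours
        # PENTAHO_DI_JAVA_OPTIONS when it is set in the environment.
        export PENTAHO_DI_JAVA_OPTIONS="-Xmx2048m"
        ./kitchen.sh -file=/path/to/parent_job.kjb -level=Basic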

    If I run the job in isolation it completes without any problems.

    Is anyone else getting problems running jobs in v5.0.1.A? I'm assuming that I don't need to upgrade my jobs and transformations in any way.

    Thanks

  2. #2
    Join Date
    Apr 2012
    Posts
    6

    Default

    We are running 5.0.2.

    Lots of problems: thread and out-of-memory exceptions for jobs that were running fine with version 4, repository failures, different name handling, you name it.

    So you are not alone, unfortunately.

    While you do not need to change your jobs - they are just XML - you can expect some steps to behave differently, simply due to the number of changes that have been made to the code base.

  3. #3
    Join Date
    Feb 2013
    Posts
    20

    Default

    Glad to know that it's not just me

    I've reverted back to v4.4 and worked around my concurrency issue by not running so many tasks in parallel.

    I'm going to have to leave the migration to v5.0 until a later date, when I can negotiate time for an internal project. A real shame, as v5.0 seems to be much quicker.

  4. #4
    Join Date
    Nov 1999
    Posts
    9,729

    Default

    There's really no reason for either issue. Check whether it isn't a JDBC driver 'upgrade' or something similar that is causing the problems.
    In any case, you're both wrong in the sense that if you see major regressions, you should file a JIRA case so that we can at least look at it.
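
    For example, assuming the default install layouts (4.x ships its JDBC drivers in libext/JDBC, while 5.0 consolidates its jars under lib), something like this will show whether the two versions are actually loading different driver jars - the install paths below are only examples:

        # Example paths - adjust to the actual install directories.
        ls /opt/pdi-4.4.0/data-integration/libext/JDBC
        ls /opt/pdi-5.0.1/data-integration/lib | grep -i -E 'jdbc|jtds|sqlserver'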

    Thanks in advance if you change your mind!

    Matt

  5. #5
    Join Date
    Jan 2013
    Posts
    12

    Default Experiencing memory issues in PDI 5.0.1

    Hi All

    We recently migrated our environment from PDI 4.4 to the PDI 5.0.1 stable version. We are using SQL Server and launching 20 different connections in parallel (we are not selecting any rows, just running some update statements against tables in the SQL Server database). The issue is that the job works great in PDI 4.4, but we are unable to run the same job in PDI 5.0.1 stable. We have tried setting the memory size to 2 GB and also adding "-XX:+UseConcMarkSweepGC",

    but it is not working; it keeps failing with:
    java.lang.OutOfMemoryError: Java heap space
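
    In case it helps, this is roughly how we are passing those options when launching the job - the job path is just an example from our environment, and we are assuming kitchen.sh picks the options up from PENTAHO_DI_JAVA_OPTIONS:

        # 2 GB heap plus the CMS collector, as described above.
        export PENTAHO_DI_JAVA_OPTIONS="-Xmx2048m -XX:+UseConcMarkSweepGC"
        ./kitchen.sh -file=/path/to/parallel_updates_job.kjb -level=Basic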

    Can somebody shed some light on this issue?

    Thanks a lot for all your patience.
