Hitachi Vantara Pentaho Community Forums

Thread: Performance

  1. #1

    Arrow Performance

Hi,

    I have some transformations set up running as a job; there are a total of 4 transformations in this job. The amount of data being updated is about 210,000 rows. I started the process on an Intel P4 Dual Core PC with more than 2.5 GB of memory. It's been more than 29 hours and the process is still running. Of course there are a number of steps in the transformations, like selecting an id and then updating a field, mostly Insert / Update steps.

    But I really suspect there is something wrong on my end, because KETTLE should be much faster than this (I trust it). Is there any way to speed up the process, or to point KETTLE to use more memory or something like that?

    The main point is that I tried the same jobs on Windows and the result was far better than what I see on Linux now: on Windows the jobs completed in 12 hours or so, while on Linux the run is still going after 29 hours. I won't say that KETTLE is slow; it's something wrong on my end, maybe the configuration. Can someone help me clear this up? My understanding is that performance should be better on Linux, but here I got the opposite. Both the Windows and Linux runs were on the same hardware configuration.

    How can I improve KETTLE performance on Linux? I am running a GUI version of Linux.

    Thanks in advance
    BiL

  2. #2

    Default

    Well, if I'm not mistaken your Kettle install is probably using only 256 MB of memory (or possibly 512 MB; my memory is failing in old age). You need to change the -Xmx flag in your start script.
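    For illustration, this is roughly what the heap setting looks like in the Kettle launch scripts. Variable names differ between versions, so check your own spoon.sh or kitchen.sh; the values below are only examples:

    ```shell
    # Older Kettle start scripts build the JVM options directly, e.g.:
    OPT="-Xmx1024m -cp $CLASSPATH"   # raise the max heap from the 256m default to 1 GB

    # Later PDI releases read an environment variable instead:
    export PENTAHO_DI_JAVA_OPTIONS="-Xmx1024m"
    ```

    Either way, the value passed to -Xmx is the upper bound on the JVM heap, not a reservation; the process only grows to it if the transformations need that much.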

    Tom
    This is a signature.... everyone gets it.

    Join the Unofficial Pentaho IRC channel on freenode.
    Server: chat.freenode.net Channel: ##pentaho

    Please try and make an effort and search the wiki and forums before posting!
    Checkout the Saiku, the future of Open Source Interactive OLAP(http://analytical-labs.com)

    http://mattlittle.files.wordpress.co...-bananaman.jpg

  3. #3

    Arrow

    You are right, Tom; KETTLE is using only 256 MB of memory. The system has around 2 GB of free memory. Shall I change the 256 to 2000 in the start script?

    Please let me know what you think.

    Thanks
    BiL

  4. #4

    Default

    Well, Sven and Matt know far more about that than I do, but I'd certainly crank it up a bit and see if it soaks up the extra memory. Whether that will solve your problem depends on the complexity of the transformations, though (and 29 hours seems a bit extreme to me).

    Tom

  5. #5

    Arrow

    Thanks, Tom. I will try it out and let you know how it goes.

    Cheers
    BiL

  6. #6
    Join Date
    Nov 1999
    Posts
    9,729

    Default

    Try 3.1.0-M3 or later. That version has a performance graph feature that lets you track down performance bottlenecks.
    The step(s) with the highest average input buffer size are the slowest.
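    If upgrading isn't an option right away, the per-step row counts in the log can also point at the bottleneck. A sketch of a headless run with Kitchen's standard -file and -level options (the job path here is a placeholder):

    ```shell
    # Run the job from the command line with detailed logging; compare each
    # step's "read"/"written" counts over time to see which step lags behind.
    ./kitchen.sh -file=/path/to/your_job.kjb -level=Detailed
    ```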

    All the best,
    Matt

  7. #7

    Arrow

    Thanks Matt, I will try it out and post an update ...

    Regards
    BiL


Copyright © 2005 - 2019 Hitachi Vantara Corporation. All Rights Reserved.