Hitachi Vantara Pentaho Community Forums

Thread: Same id_batch on parallel running kitchen-processes

  1. #1

    Same id_batch on parallel running kitchen-processes

    Hi,

    First of all: we currently still use PDI 2.5.1 because of some custom plugins that could not be migrated yet.

    We use the Kettle logging (transformation log and job log) to determine how many rows have been updated/deleted/inserted by a transformation, and we have our own logging that records the name of the transformation, the corresponding id_job and the id_batch, so that we can link the two via database joins.
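
    For illustration, a minimal sketch of such a join, assuming a custom log table called etl_log and a Kettle transformation log table called translog (the table and column names are assumptions, not taken from this thread; adjust them to your own logging configuration):

        -- etl_log:  hypothetical custom log table (trans_name, id_job, id_batch)
        -- translog: the Kettle transformation log table as typically configured
        SELECT e.trans_name,
               e.id_job,
               t.ID_BATCH,
               t.LINES_WRITTEN,
               t.LINES_UPDATED,
               t.LINES_OUTPUT
        FROM   etl_log e
               JOIN translog t
                 ON  t.ID_BATCH  = e.id_batch
                 AND t.TRANSNAME = e.trans_name;

    With duplicate id_batch values from parallel kitchen runs, a join like this no longer resolves to a single transformation run, which is exactly the problem described below.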

    Now we encounter a problem when we start more than one kitchen process at the same time: it seems that all transformations get the same batch_id, so only the transformation that ends last inserts its written/updated/deleted row counts into the table. The other values are lost.

    Is this just a PDI 2.5.1 problem, or does it still exist in later releases?

  2. #2
    Join Date: Nov 1999
    Posts: 9,729


    I know about the problem but I recall fixing this somewhere in the 3.x line.
