Hitachi Vantara Pentaho Community Forums

Thread: Different results from different platforms??

  1. #1
    Join Date
    Oct 2007
    Posts
    2

    Default Different results from different platforms??


    Hi,

    In order to migrate from PDI 2.4.0 and 2.4.1 to PDI 2.5.1, we have installed a 2.5.1 repository and migrated the jobs into it (the repository database is Oracle 9i). We have jobs built on 2.4.0 (the oldest) and jobs built on 2.4.1M1 (the newer ones).

    We tested the jobs on Windows XP Pro with JRE 1.5.0-b64, running them with Kitchen => they work well.

    But when we run the same job from the same repository (again via a Kitchen script) on Solaris 9 with JRE build 1.5.0_06-b05, the job does not execute correctly. We get this error:
    14:30:11,792 ERROR [0] Insertion in table DAMAGE.0 - Because of an error, this step can't continue:
    Error batch inserting rows into table [EW_DAMAGE]
    Error updating batch
    ORA-01843: not a valid month

    We don't understand how the same job can give different results. Note that on Solaris, the 2.4.0 version of the job works fine.

    Do you have any idea, please?

  2. #2
    Join Date
    May 2006
    Posts
    4,882

    Default

    - Try the exact same input with your old job... I would expect it to fail too.

    - In 2.5.1, switch off batch update, switch on row level debugging, and then find the row that causes the error... I suspect the month in one of your rows will be wrong (a quick SQL check like the sketch below can also help spot it).
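
    If the rows you are loading also exist in a table you can query, a rough check like this one can help find the offending value. The table and column names here are only examples (adapt them to your own source), and it assumes the dates are DD/MM/YYYY strings:

    -- flag DD/MM/YYYY strings whose month token cannot be a month
    SELECT insertion_date
    FROM   source_stage
    WHERE  SUBSTR(insertion_date, 4, 2) NOT BETWEEN '01' AND '12';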

    Regards,
    Sven

  3. #3
    Join Date
    Oct 2007
    Posts
    2

    Default

    Hi Sven,

    Thanks for your quick response, and my apologies for this late reply.

    I did not develop these jobs; I am only in charge of operations and small maintenance tasks (with Kettle, the migration to 2.5.1). It should have been very simple, and it is, except that it does not work on Unix (the target platform). We develop on Windows and use Kitchen to run the jobs on Unix.

    In fact, the job ends with a Table output step. We add an insertion date, initialised by a SELECT TO_CHAR(TRUNC(SYSDATE), 'DD/MM/YYYY') INSERTION_DATE. So it is not an Oracle DATE format, it is a string.
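
    Just to make the distinction concrete (the query below is only an illustration, not part of the job): the TO_CHAR call produces a 10-character string, while TRUNC(SYSDATE) on its own would stay a real Oracle DATE, and whether the string turns back into a valid DATE later depends on the date mask in force at that moment:

    -- string form (what the job builds today) next to the real DATE it comes from
    SELECT TO_CHAR(TRUNC(SYSDATE), 'DD/MM/YYYY') AS insertion_date_string,
           TRUNC(SYSDATE)                        AS insertion_date_real
    FROM   dual;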

    Without any changes, this step works fine for us on Windows.
    For this field, here is what the SQL button in Spoon shows:
    ALTER TABLE EW_DAMAGE ADD ( INSERTION_DATE_KTL VARCHAR2(10) );
    UPDATE EW_DAMAGE SET INSERTION_DATE_KTL=INSERTION_DATE;
    ALTER TABLE EW_DAMAGE DROP ( INSERTION_DATE );
    ALTER TABLE EW_DAMAGE ADD ( INSERTION_DATE VARCHAR2(10) );
    UPDATE EW_DAMAGE SET INSERTION_DATE=INSERTION_DATE_KTL;
    ALTER TABLE EW_DAMAGE DROP ( INSERTION_DATE_KTL );

    I suppose that Kettle first inserts the value as a VARCHAR2(10) and afterwards transforms the column into a date format (this part is not visible). In the end the column has the date format we want.
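
    What might explain the difference, if I understand correctly (this is only a guess on my side, and the table below is a throwaway example, not our real one), is that an implicit string-to-DATE conversion depends on the session's default date mask, so the very same value can pass on one machine and fail on another:

    -- throwaway table, only to illustrate the implicit string-to-DATE conversion
    CREATE TABLE t_demo ( d DATE );

    -- mask matches the 'DD/MM/YYYY' string: the insert succeeds
    ALTER SESSION SET NLS_DATE_FORMAT = 'DD/MM/YYYY';
    INSERT INTO t_demo ( d ) VALUES ( '25/10/2007' );

    -- mask does not match: 25 is read as a month => ORA-01843: not a valid month
    ALTER SESSION SET NLS_DATE_FORMAT = 'MM/DD/YYYY';
    INSERT INTO t_demo ( d ) VALUES ( '25/10/2007' );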

    So why does it block only on Unix?
    I tried to open Spoon on Unix to look at this SQL statement (I believe that on Unix, Kettle tries to insert INSERTION_DATE directly as a date), but I am having trouble running Spoon through my X-Window client. Comparing the session date mask on both machines, as in the sketch below, might also tell me more.
    I will run a test with 2.5.2, or I will replace these steps and test with different ones. It is not a critical task (the jobs work fine on 2.4.0 and 2.4.1, our older versions).
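
    A quick query run from each environment, through the same connection the job uses (this is only a diagnostic idea, not something the job itself needs), would show whether the two sessions end up with different default date masks:

    -- date mask the session will use for implicit string-to-DATE conversions
    SELECT value
    FROM   nls_session_parameters
    WHERE  parameter = 'NLS_DATE_FORMAT';

    If the masks differ, passing TRUNC(SYSDATE) as a real DATE, or converting with an explicit TO_DATE(..., 'DD/MM/YYYY'), should make the step independent of the platform's default mask.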

    Thanks again for your support. I will let you know how this migration gets resolved.
