Hitachi Vantara Pentaho Community Forums

Thread: Problem parsing a UTC time

  1. #1 (Join Date: Apr 2011, Posts: 6)

    Using Kettle (version 4.1.0), I'm getting this error while trying to export some Salesforce records to a file:

    Created Date String : couldn't convert string [2008-03-09T02:34:54.000Z] to a date using format [yyyy-MM-dd'T'HH:mm:ss'.000Z']

    The date is in the defined format. What *seems* to be happening is that the value is being parsed as a local-time-zone date, even though the Z says it is GMT+0. The parser then hits a local time that doesn't exist (it falls in the daylight-saving jump) and fails.
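
    To make the failure concrete, here is a minimal standalone Java sketch of what appears to be happening. It is an illustration only: the zone is an arbitrary example (any zone whose clocks jumped from 02:00 to 03:00 on 2008-03-09, such as America/Los_Angeles, shows the same behavior), and it assumes strict, non-lenient parsing, which is what Kettle appears to do.

        import java.text.ParseException;
        import java.text.SimpleDateFormat;
        import java.util.TimeZone;

        public class DstGapDemo {
            public static void main(String[] args) {
                // Kettle's pattern: '.000Z' is a quoted literal, so the Z is
                // matched as plain text and carries no time zone information.
                SimpleDateFormat sdf =
                        new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss'.000Z'");
                sdf.setLenient(false); // strict: no silent field normalization
                // A zone where 02:00-03:00 was skipped on 2008-03-09:
                sdf.setTimeZone(TimeZone.getTimeZone("America/Los_Angeles"));
                try {
                    // 02:34 local time never existed on that date, so this fails
                    System.out.println(sdf.parse("2008-03-09T02:34:54.000Z"));
                } catch (ParseException | IllegalArgumentException e) {
                    System.out.println("Parse failed: " + e);
                }
            }
        }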

    This would seem to be a bug. Anyone have any ideas?

    Thanks!

  2. #2 (Join Date: Sep 2009, Posts: 810)

    hey there,

    in the format string the entire substring '.000Z' is quoted as a literal, which means the Z is matched as plain text and discarded rather than parsed as a time zone. There is no time zone information in the pattern at all, so, as you correctly assume, the local time zone is used.

    There are several ways to deal with this. Personally, I'd import the field as a string, append 'UTC', and use the Select Values step to convert to a date with the pattern yyyy-MM-dd'T'HH:mm:ss'.000Z'z. You could also use the JavaScript step for the conversion, passing an explicit time zone.
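
    For what it's worth, here is a minimal sketch of that trick outside Kettle, in plain Java. The appended 'UTC' gives the trailing z in the pattern a real zone name to consume, so the value is interpreted as GMT+0 and the nonexistent-local-time problem goes away:

        import java.text.ParseException;
        import java.text.SimpleDateFormat;

        public class UtcSuffixDemo {
            public static void main(String[] args) throws ParseException {
                // Append 'UTC' so the pattern's trailing z has a zone to parse;
                // '.000Z' remains a quoted literal, exactly as before.
                String raw = "2008-03-09T02:34:54.000Z" + "UTC";
                SimpleDateFormat sdf =
                        new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss'.000Z'z");
                sdf.setLenient(false);
                // 02:34 exists in UTC, so strict parsing now succeeds
                System.out.println(sdf.parse(raw));
            }
        }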

    hth

    Cheers

    Slawo

  3. #3 (Join Date: Apr 2011, Posts: 6)

    I understand the '.000Z' is hard-coded, and I know I can filter these fields through another step. The problem is that there are many date/time fields, and manually creating a conversion for each one would be very time-consuming.

    This is why I think this is a bug: Kettle (or Spoon) automatically detects the type (and format) of the fields based on its query to Salesforce. In effect, it "knows" there is a Z and therefore that the time is GMT+0. But it then discards that GMT+0 indicator and tries to parse the value as a local time, which is what leads to the failure. Discarding time zone indicators it has already detected is, I would say, a bug (or at least a design flaw).

  4. #4 (Join Date: Sep 2009, Posts: 810)

    Hi jkugler,

    if the data type conversion happens in the Salesforce step itself, you may have a point there. In that case, I'd suggest creating a small transformation showing the issue and filing a JIRA case here: http://jira.pentaho.com/browse/PDI

    Cheers

    Slawo

  5. #5 (Join Date: Apr 2011, Posts: 6)

    Thanks. I posted a bug (http://jira.pentaho.com/browse/PDI-5595). I didn't have a test case to upload, but if they ask for one, I'll try to assemble one.

  6. #6 (Join Date: Apr 2011, Posts: 1)

    I had this same issue. Currently, I'm working around it by passing the Salesforce date fields into MySQL as strings (set in PDI) stored in varchar columns (set in the MySQL table), and then parsing them inside MySQL to turn them into datetime values, as sketched below. Just to throw out another way to work around the issue while the bug is still open.
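
    Something like the following is what the MySQL-side conversion could look like, run from Java since that is the thread's common ground. It is a sketch only: the connection details, table, and column names are hypothetical, and the STR_TO_DATE format assumes the raw Salesforce values look like 2008-03-09T02:34:54.000Z.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.Statement;

        public class MySqlConvertDemo {
            public static void main(String[] args) throws Exception {
                // Hypothetical connection details, table, and column names.
                try (Connection con = DriverManager.getConnection(
                        "jdbc:mysql://localhost/sfdc", "user", "pass");
                     Statement st = con.createStatement()) {
                    // The varchar column holds the raw Salesforce strings.
                    // STR_TO_DATE treats T and .000Z as literals, so no time
                    // zone math happens: the UTC wall-clock time lands
                    // unchanged in the DATETIME column.
                    st.executeUpdate(
                        "UPDATE sf_records"
                        + " SET created_at = STR_TO_DATE(created_date_raw,"
                        + " '%Y-%m-%dT%H:%i:%s.000Z')");
                }
            }
        }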

  7. #7 (Join Date: Apr 2011, Posts: 6)

    Thanks. The silly thing was, we didn't *need* to parse the time: we were just exporting data and could have parsed the time fields later. It was just that the time fields were auto-detected, so Kettle tried to parse them, resulting in the failure.
