Hitachi Vantara Pentaho Community Forums

Thread: Automate file naming with incremental numbering

  1. #1
    Join Date
    Jan 2014
    Posts
    7

    Question Automate file naming with incremental numbering

    Hi all,

    This is probably a fairly basic question, I'm new to Pentaho, and have found myself stumped.

    I have to do a simple extract from a SQL database each day and output a flat text file. This part isn't an issue, but the file that is output has to follow a particular structure similar to: 'NamePart1_XXXXXX_NamePart2'. The 'XXXXXX' part needs to be '0000001' initially, going up by 1 with every daily run.

    Can anyone suggest how this can be done at all?

    Thanks in advance for any help!

    Kind regards,

    Andy

  2. #2
    Join Date
    Jun 2012
    Posts
    5,534

    Default

    For starters: there's a step called "Get System Info" you can use to obtain the current date. You may have decided on a start date kept in a variable, so use the "Get Variables" step to retrieve that value as a Date as well. The "Calculator" step can compute the days between the two dates (delta = Date A - Date B). Specify a suitable number format like 00000 for the Integer result. Your final filename can then be a concatenation of literal constants and your delta value. Text File Output is able to read the filename from a field.
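    The logic of this date-difference approach can be sketched outside of Kettle in a few lines of Python (a minimal illustration only; the "day zero" date, the 7-digit width, and the name parts are assumptions taken from the question, not from an actual Kettle job):

    ```python
    from datetime import date

    # Assumed "day zero" -- in the Kettle job this would come from a variable.
    ZERODAY = date(2014, 1, 1)

    def build_filename(today):
        # Days elapsed since day zero, like Calculator's "Date A - Date B".
        delta = (today - ZERODAY).days
        # Zero-pad the counter, like the 00000 number format on the result field.
        counter = f"{delta:07d}"
        # Concatenate literal constants with the delta value.
        return f"NamePart1_{counter}_NamePart2"

    print(build_filename(date(2014, 1, 2)))  # NamePart1_0000001_NamePart2
    ```

    Each daily run then produces the next number automatically, with no state to store between runs.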
    So long, and thanks for all the fish.

  3. #3
    Join Date
    Jan 2014
    Posts
    7

    Default

    Hi Marabu,

    Thanks very much for the update.
    I don't actually want to refer to the current date, though. I just need the file name to include the digits '000001' to start with, moving to '000002', '000003' and so on, going up by 1 with each run.

    Is this possible at all?

    Kind regards,

  4. #4
    Join Date
    Jun 2012
    Posts
    5,534

    Default

    I thought it easier to derive the counter from the system clock than to maintain a synchronized counter using some kind of persistence (file, db, ...).
    So yes, it is possible, of course.
    So long, and thanks for all the fish.

  5. #5
    Join Date
    Jan 2014
    Posts
    7

    Default

    Hi Marabu,

    Thanks for your further reply.
    I don't suppose you'd be in a position to talk me through the steps of doing such a thing, would you? I am completely new to the product, as I mentioned, so I don't really have a clue where to start, unfortunately.

    Kind regards,

    Andy

  6. #6
    Join Date
    Jun 2012
    Posts
    5,534

    Default

    Here is a Kettle job consisting of two transformations, "Get-Day" and "Write-CSV".

    Transformation "Get-Day" retrieves the system date and a "day zero" from which you want to start numbering your days.
    To set ZERODAY open the job settings and select page "Parameters".
    You'll see another parameter DIR which is used to compose the TFO filename.
    Calculator has a function "Date A - Date B" to calculate the days between two dates.
    I attached the number format 00000 as you can see.
    Finally, I create a JVM string variable DAYNUM with the chosen formatting.

    Transformation "Write-CSV" reads a row from a Data Grid step, which you will want to replace with your Table Input step.
    Text File Output makes good use of the variables - look at the filename.
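    The way the Text File Output filename is composed from variables can be mirrored in plain Python (DIR and DAYNUM are the parameter and variable names from the post above; the concrete values and the `${NAME}` template here are illustrative assumptions):

    ```python
    import re

    # Assumed values -- in the job, DIR is a parameter and DAYNUM is set by "Get-Day".
    variables = {"DIR": "/tmp/out", "DAYNUM": "0000003"}

    def expand(template, variables):
        # Substitute ${NAME} placeholders, the way Kettle expands variables
        # in a filename field.
        return re.sub(r"\$\{(\w+)\}", lambda m: variables[m.group(1)], template)

    print(expand("${DIR}/NamePart1_${DAYNUM}_NamePart2", variables))
    # /tmp/out/NamePart1_0000003_NamePart2
    ```

    In the actual job no such code is needed; typing `${DIR}/NamePart1_${DAYNUM}_NamePart2` into the filename field has the same effect.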

    BTW: if looking at the Kettle samples folder doesn't get you started, you should consider reading an introductory book - scan the sticky threads for some titles.

    Good luck.
    Attached Files
    So long, and thanks for all the fish.

  7. #7
    Join Date
    Jan 2014
    Posts
    7

    Default

    Hi Marabu,

    Apologies for not replying until now; I just checked back here and saw your amazing reply. Thanks so much for this - everything is all set up and running perfectly now!

    Thanks again!


Copyright © 2005 - 2019 Hitachi Vantara Corporation. All Rights Reserved.