Hitachi Vantara Pentaho Community Forums

Thread: Metadata Injection ETL execution from CDE

  1. #1
    Join Date
    Mar 2016
    Posts
    7

    Default Metadata Injection ETL execution from CDE

    Hi everyone...

    So basically I have a situation where I have a table with dynamic columns I need to display via CDE.

    The table must show data for the current year (9 data columns + a Total column) and, additionally, only totalized columns for any following years that actually have data.

    So I put together a couple of PDI ETLs: the first (inject.ktr) injects denormalization metadata into the second (denorm.ktr), so that whatever years are available get denormalized. (Based on the example in this thread.)


    So in CDE I can upload both ETLs (inject and denorm) and select inject.ktr as the "Kettle Transformation File" to use as a datasource, but then I cannot specify a "Kettle Step name" from the denorm.ktr ETL. (I can enter one, but it does not actually return any data; data only comes back if I specify a "Kettle Step name" from inject.ktr.)

    So basically, as I see it (without having looked at any of the code), in order to do this I would need to be able to specify three items to CDE:

    1. the "Kettle Transformation File", which is the file it will run
    2. the "Kettle Step name" from which to get the data
    3. the Kettle Transformation File containing the "Kettle Step name" that will provide the data.

    Normally 1 and 3 are the same .ktr, but with this scheme, where two files are needed, it seems like I would have to enter different file names (see the sketch below).
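    As a rough illustration (the file, datasource and step names are placeholders), the CDA that CDE generates for a "kettle over kettleTransFromFile" datasource only has one slot for a .ktr file and one for a step name, roughly like this:

        <CDADescriptor>
          <DataSources>
            <!-- item 1: the transformation file CDE will run -->
            <Connection id="1" type="kettle.TransFromFile">
              <KtrFile>inject.ktr</KtrFile>
            </Connection>
          </DataSources>
          <!-- item 2: the step whose output rows become the result set -->
          <DataAccess id="dynamicTable" connection="1" type="kettle" access="public">
            <Name>dynamicTable</Name>
            <Query>OutputStep</Query>
          </DataAccess>
        </CDADescriptor>

    There is no element that could point at denorm.ktr, which is why item 3 above cannot be expressed.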

    I can see in tomcat/logs/pentaho.log that if I specify a "Kettle Step name" from the first ETL (inject.ktr), the steps in denorm.ktr also run; I just don't seem to be able to obtain the data from a step in the template ETL (denorm.ktr) from CDE.

    Do you guys think this might be possible with CDE today? Am I missing something?

    Thank you in advance,
    Andi
    Last edited by kahennig; 02-20-2017 at 11:24 AM.

  2. #2
    Join Date
    Mar 2016
    Posts
    7

    Default

    OK, so I have finally been able to do this.
    My first ETL prepares and injects the metadata; the second one actually executes the denormalization into which the metadata is injected.
    Now I had two problems.

    1. Once uploaded to the BI Server, the first ETL could not find the second ETL. I solved this for now by placing the second ETL in tomcat/bin. Later on I hope to try this out:
    http://ubiquis.co.uk/pdi/readingwrit...n-pentaho-5-x/

    2. I was not able to get data back from the second ETL. It seems the "kettle over kettleTransFromFile" datasource in CDE can only obtain data from a "Kettle Step name" within the ETL that is called first, i.e. the one defined in "Kettle Transformation File". After many hours of browsing forums and various articles, a line in the Pentaho documentation grabbed my attention. I do not recall the exact wording nor the site's URL, but the gist was that the Metadata Injection step has a setting that tells it to grab data from a specific step in the second ETL and return it into its own flow. That setting is on the "Metadata Injection" step, "Options" tab, under "Template step to read from (optional)": I just entered the name of the step I wanted the data from in the box next to this label and voilà, the data was now available in my first ETL.
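    For concreteness, here is a rough sketch of how that setting ends up inside the Metadata Injection step of inject.ktr. The tag names (in particular source_step) and the step name DenormalizedOutput are assumptions on my part and may differ between PDI versions; in Spoon the setting is simply the "Template step to read from (optional)" box on the Options tab:

        <step>
          <name>ETL Metadata Injection</name>
          <type>MetaInject</type>
          <!-- the template transformation that receives the injected metadata -->
          <filename>denorm.ktr</filename>
          <!-- assumed tag for "Template step to read from (optional)": the step in
               denorm.ktr whose rows are routed back into this transformation's flow -->
          <source_step>DenormalizedOutput</source_step>
          ...
        </step>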

    So I just added a Dummy step after the Metadata Injection step in my first ETL and, in CDE, pointed the datasource at this Dummy step. Done.
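    Putting it together, a minimal sketch of the final datasource (expressed as the underlying CDA, with placeholder ids): it just names inject.ktr as the "Kettle Transformation File" and the Dummy step as the "Kettle Step name":

        <Connection id="1" type="kettle.TransFromFile">
          <KtrFile>inject.ktr</KtrFile>
        </Connection>
        <DataAccess id="dynamicTable" connection="1" type="kettle" access="public">
          <Name>dynamicTable</Name>
          <!-- the Dummy step now carries the rows read back from denorm.ktr -->
          <Query>Dummy</Query>
        </DataAccess>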

    Here is an example of this, based on the one given by Simon in this post:

    So I meant to attach an example here... but it seems I don't have permission to do so. Sorry.
