Hitachi Vantara Pentaho Community Forums

Thread: Table output gets jammed

  1. #1
    Join Date
    Dec 2017
    Posts
    2

    Default Table output gets jammed

    Hello,

    I need to do a task that consists of creating an ETL that will populate the following star schema:

    Attachment: schema.jpg

    The information will be taken from


    I have successfully created the transformations to populate D_BNF, D_PRACTICE, A_POSTCODE, A_LCG and D_COST.

    The other dimensions (D_TIME, D_AMP, D_VMP, D_VTM) get populated when reading the practice files.

    I successfully manage to read and fill in the dimensions, but when it comes to inserting the values into the fact table F_PRESCRIBING using the Table Output step, the step gets jammed. What do I mean by jammed? It does the first batch of 200 rows (or whatever commit size I set) but then gets stuck there; it doesn't do anything more, and all I see on the console is that the linenr counters keep filling up for the different steps, with no error message.
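    While the step is jammed, one thing I could check on the Oracle side is whether the session doing the inserts is waiting on a lock held by another session. A rough sketch of the idea (the JDBC URL and credentials are just placeholders for my XE instance, and the Oracle JDBC driver has to be on the classpath):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class CheckBlocking {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details for a local Oracle XE instance
            String url = "jdbc:oracle:thin:@localhost:1521:XE";
            try (Connection con = DriverManager.getConnection(url, "system", "password");
                 Statement st = con.createStatement();
                 // v$session lists sessions that are currently blocked by another session
                 ResultSet rs = st.executeQuery(
                         "SELECT sid, blocking_session, event, seconds_in_wait "
                         + "FROM v$session WHERE blocking_session IS NOT NULL")) {
                while (rs.next()) {
                    System.out.printf("SID %d blocked by %d on %s for %d s%n",
                            rs.getLong(1), rs.getLong(2), rs.getString(3), rs.getLong(4));
                }
            }
        }
    }

    If that query returns a row while the Table Output step is hanging, the problem would be a lock wait in the database rather than in the transformation itself.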

    Jammed process. Attachment: cuelgue.png

    Log. Attachment: log.jpg

    Input and output. Attachment: inputs y outputs.jpg

    The funny thing about the situation is that if I delete only the entries from the fact table F_PRESCRIBING and rerun the transformation, the Table Output step gets jammed at a different point.

    Even funnier: if I write the data to a file and then, in a separate transformation, load that file into the fact table with a Table Output step, it works perfectly.

    From my point of view it looks like a Dimension Lookup/Update step is not working correctly.

    I have attached the transformation.

    Thanks for any help in advance.

    I'm using PDI 7.1
    on Windows Server 2012 R2
    with an Oracle database.
    Attached Files

  2. #2
    Join Date
    Apr 2008
    Posts
    4,696

    Default

    First thing that I notice is that you have set "Type" to None in check id_practician and check id_bnf.
    Second thing that I notice is that you only have the "True" path set in the Filter Rows step; not necessarily a problem, but it seems odd.

    You will probably get better performance by using a Combination Lookup/Update instead of an Insert/Update step for ID_DATE.

    Next, I would recommend turning up the log level and seeing what that reveals.

    My overall guess is that you are running out of connections allowed to the DB, especially since you say that writing everything to a file and then loading that file with a Table Output step in another transformation works consistently.
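    If you want to check that, something along these lines should show how many sessions are open against the database and what limits XE is configured with (just a sketch; the JDBC URL and credentials are placeholders, and you need the Oracle JDBC driver on the classpath):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class CheckSessions {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details for the Oracle XE instance
            String url = "jdbc:oracle:thin:@localhost:1521:XE";
            try (Connection con = DriverManager.getConnection(url, "system", "password");
                 Statement st = con.createStatement()) {
                // How many sessions are currently open
                try (ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM v$session")) {
                    rs.next();
                    System.out.println("Open sessions: " + rs.getLong(1));
                }
                // The limits the instance is configured with (XE defaults are quite low)
                try (ResultSet rs = st.executeQuery(
                        "SELECT name, value FROM v$parameter WHERE name IN ('sessions', 'processes')")) {
                    while (rs.next()) {
                        System.out.println(rs.getString(1) + " limit: " + rs.getString(2));
                    }
                }
            }
        }
    }

    Run it (or the equivalent queries in SQL*Plus) while the transformation is hanging and compare the open-session count with the limits.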

    The issue isn't with Dim Lookup/Update, but rather with your selection of Oracle XE.

  3. #3
    Join Date
    Dec 2017
    Posts
    2

    Default

    I also think it might be a problem with Oracle XE. But I still find it rather strange that on the first try it loads one batch, then the second time it loads more but not completely, the third time it loads more, and so on...

    Thank you for your time and help.
