Hitachi Vantara Pentaho Community Forums
Results 1 to 8 of 8

Thread: Table input Double-Value of select is not complete

  1. #1
    Join Date
    Nov 2016
    Posts
    4

    Default Table input Double-Value of select is not complete

    Hello,
I have a problem that I could not fix so far. When I run a SELECT in a Table input step, the double values are not returned completely and are always truncated. Here are some examples:
    15.0420 => 15 // 12.6000 => 12,6 // 4.1176 => 4,1 // 0.1123 => 0,1 // 0.0012 => 0

It seems that at least two digits are shown and at most one digit after the decimal separator. However, it is very important to get the correct figures.
I use Pentaho 6.1 and MariaDB 5.5.49.

    Thanks for every help.

    Best regards,
    Dave

  2. #2
    Join Date
    Jun 2012
    Posts
    5,534

    Default

    Relax, it's only a display thing. Adjust the format string if you want to see more decimals.
    So long, and thanks for all the fish.

  3. #3
    Join Date
    Nov 2016
    Posts
    4

    Default

Thanks for your answer. After I run the query, it just delivers the data like this, and all further processing is done with the "reduced" doubles. How can I adjust the format string? Maybe this can solve my problem. I need to see 4 digits after the ".".

  4. #4
    Join Date
    Apr 2016
    Posts
    156

    Default

Many steps offer a way to set the conversion/format string for the values they create; the Table Input step, however, doesn't have this option.

Try adding a 'Select Values' step immediately after your 'Table Input' step. In the 'Select Values' step:
* go to the 'Meta' tab (don't use the Select & Alter or Remove tabs)
* select the field names for which you want 4 digits shown in preview
* set Type = Number and Format = #.####

Additionally, if you're concerned about the # formatting, you can give such long numeric values the PDI type 'BigNumber'. Again, this works for steps that create new fields, but not directly for the Table Input step (you would need to send the rows through a Select Values step's Meta tab to convert them to BigNumber).
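Side note, in case it helps anyone reading along: PDI's Number format masks follow Java's java.text.DecimalFormat patterns (PDI itself is Java). So you can sanity-check what a given mask will do before wiring it into Select Values. A minimal standalone sketch of the two masks discussed here (using a US-locale decimal point, since locale affects the separator):

```java
import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.util.Locale;

public class FormatDemo {
    public static void main(String[] args) {
        // "#.####" keeps up to four fractional digits and drops trailing zeros.
        DecimalFormat upTo4 = new DecimalFormat("#.####",
                DecimalFormatSymbols.getInstance(Locale.US));
        System.out.println(upTo4.format(15.0420)); // 15.042
        System.out.println(upTo4.format(4.1176));  // 4.1176
        System.out.println(upTo4.format(0.0012));  // 0.0012

        // "0.0000" pads to exactly four fractional digits instead.
        DecimalFormat exactly4 = new DecimalFormat("0.0000",
                DecimalFormatSymbols.getInstance(Locale.US));
        System.out.println(exactly4.format(15.0420)); // 15.0420
    }
}
```

If you always want four digits shown (including trailing zeros), use 0.0000 rather than #.#### in the Select Values Meta tab.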
    My runtime environment: MacOS, JDK 1.8u121, PDI 7.0

  5. #5
    Join Date
    Nov 2016
    Posts
    4

    Default

Thank you for your reply, and sorry for answering so late.
Fortunately, your advice works; unfortunately, it does not fit my processing. My Table input does not load one specific table from which I could then select a specific field: it loads fields and tables from a configuration file. In detail, the query in my Table input looks like this: "SELECT ${QUERY_1} FROM `${slave_tab}`". The variable QUERY_1 is a list of fields, and slave_tab is the respective table. The Table input therefore runs statements such as "SELECT id, net_price, invoice_date FROM invoice" or "SELECT id, name, last_name, gender FROM customer". The structure of the tables and the content loaded by the Table input are completely different. However, I need this dynamic structure of the selects because the tables (approx. 100) and their structure change frequently, and I cannot update the Table input every time a table structure changes. I need a "global" solution so that the Table input always delivers the full number format instead of the shortened one. Or is there an alternative to the Table input step that does the same job but delivers a more precise number? Or... is it possible to tell the Select Values step that all numbers should be converted to #.####?
    Last edited by dschu; 11-14-2016 at 10:07 AM.

  6. #6
    Join Date
    Apr 2016
    Posts
    156

    Default

    Quote Originally Posted by dschu View Post
    Unfortunately it is not working for my processing.
    It is not clear where in your processing this is "not working." Can you explain where you feel there is an issue?

    Even with your dynamic table/column selection method, the PDI formatting is just that -- formatting.

    I suspect that you may be outputting the data somewhere... e.g. to a CSV, or to another database/table. Can you give us specifics?

    Depending on how you use the data downstream in your PDI transformation, there are configuration options to set in other Steps. Rather than try to force the formatting in your dynamic 'Input Table' step, try instead to configure the formatting in the downstream steps that use the value.

    For your use case of such 'dynamic' tables... search the forums and the blogs for PDI 'Metadata Injection'. Look through the samples in your PDI installation. Try to create some basic 'toy' samples of Metadata Injection (don't jump right in to your full solution).
    My runtime environment: MacOS, JDK 1.8u121, PDI 7.0

  7. #7
    Join Date
    Nov 2016
    Posts
    4

    Default

The problem was that I could not select the fields in order to format them: if I enter a column name to format, but that column does not exist, I get an error. Fortunately, I have now solved the problem. Under "Edit -> Edit the kettle.properties file", I set "KETTLE_DEFAULT_NUMBER_FORMAT" to your format. In my environment that is luckily sufficient. Thanks for your help and the hint about formatting :-)
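For anyone who finds this thread later, the entry I added (via the Edit menu dialog, which writes to kettle.properties) looks roughly like this; the exact mask is whatever format you need, #.#### being the one suggested above:

```properties
# kettle.properties (in the .kettle directory of the user running Spoon/PDI)
# Default number format applied when no per-field format is set:
KETTLE_DEFAULT_NUMBER_FORMAT=#.####
```

Note this changes the default for the whole PDI environment, not just one transformation, so check it doesn't affect other jobs that rely on the previous default.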
    Last edited by dschu; 11-14-2016 at 11:35 AM.

  8. #8
    Join Date
    Apr 2016
    Posts
    156

    Default

    Quote Originally Posted by dschu View Post
    Fortunately I now solved the problem. Under "Edit->Edit the kettle.properties file", I edited the "KETTLE_DEFAULT_NUMBER_FORMAT" with [the] format.
    Awesome! Thanks also for adding back to the community by sharing that very helpful suggestion of editing kettle.properties with a default number format.

    It is crazy-useful for everyone reading this thread afterwards to see such a great resolution step. :-)
    My runtime environment: MacOS, JDK 1.8u121, PDI 7.0
