Hitachi Vantara Pentaho Community Forums

Thread: Failed to call a DB function n times from the step Input Table

  1. #1

    Default Failed to call a DB function n times from the step Input Table

    Hi everyone, my question is:
    The transformation starts by reading data from a CSV file, analyzes it, and sends the result to a Table Input step, where I call a DB function once for each row coming from the previous step. Before the process finishes, I get this error:

    2017/10/02 09:26:53 - Entrada Tabla.0 - ERROR (version 4.2.0-stable, build 15748 from 2011-09-08 13.11.42 by buildguy) : ERROR: memoria compartida agotada [out of shared memory]
    2017/10/02 09:26:53 - Entrada Tabla.0 - ERROR (version 4.2.0-stable, build 15748 from 2011-09-08 13.11.42 by buildguy) : Hint: Puede ser necesario incrementar max_locks_per_transaction. [You might need to increase max_locks_per_transaction.]


    Any clue?

    Kettle Version: 4.2.0
    Tomcat 7
    Java 7

    Thanks in advance.

  2. #2
    Join Date
    Sep 2011
    Posts
    152

    Default

    Please post the error in English.

    It does not seem to be a Kettle issue; it looks like you will have to change a setting in the DB to allow a higher number of locks per transaction.
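
    If you do end up raising that limit, it helps to know where you are starting from. A minimal JDBC sketch to read the current value, assuming a local PostgreSQL server (the connection details are placeholders):

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.Statement;

        public class ShowLockSetting {
            public static void main(String[] args) throws Exception {
                // Placeholder connection details; adjust to your server.
                try (Connection con = DriverManager.getConnection(
                             "jdbc:postgresql://localhost:5432/mydb", "user", "pass");
                     Statement st = con.createStatement();
                     ResultSet rs = st.executeQuery("SHOW max_locks_per_transaction")) {
                    rs.next();
                    // PostgreSQL sizes the shared lock table at roughly
                    // max_locks_per_transaction * (max_connections + max_prepared_transactions).
                    System.out.println("max_locks_per_transaction = " + rs.getString(1));
                }
            }
        }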

  3. #3
    Join Date
    May 2016
    Posts
    282

    Default

    You are getting a database error. What database are you using? Have you searched for that error message in that database's forum/community/support channels?
    OS: Ubuntu 16.04 64 bits
    Java: Openjdk 1.8.0_131
    Pentaho 6.1 CE

  4. #4

    Default

    Quote Originally Posted by Ana GH View Post
    You are getting a database error. What database are you using? Have you searched for that error message in that database's forum/community/support channels?
    Hi, yes, it is a database error, but it occurs when invoking a stored procedure in PostgreSQL n times. That's why I wanted to know if there is any step or mechanism to release resources between calls. On the other hand, if you just increase the PostgreSQL parameters, the same error may occur again when trying to read a larger CSV.
    The variable was increased to:
    max_locks_per_transaction = 128

    ERROR: out of shared memory
    HINT: You might need to increase max_locks_per_transaction.
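
    For context on why this happens: PostgreSQL keeps every lock a transaction acquires until that transaction commits, so when all n function calls run inside a single transaction the locks pile up until the shared lock table overflows. A minimal JDBC sketch of the pattern and one way to relieve it, assuming a hypothetical stored function my_function (the connection details are placeholders):

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;

        public class PerRowFunctionCall {
            public static void main(String[] args) throws Exception {
                // Placeholder connection details; adjust to your server.
                Connection con = DriverManager.getConnection(
                        "jdbc:postgresql://localhost:5432/mydb", "user", "pass");
                con.setAutoCommit(false); // one long transaction: locks accumulate

                // "my_function" stands in for the real stored function.
                try (PreparedStatement ps = con.prepareStatement("SELECT my_function(?)")) {
                    for (int row = 1; row <= 100_000; row++) {
                        ps.setInt(1, row);
                        ps.execute();
                        // Committing every 1000 calls releases the locks held so far,
                        // keeping the shared lock table from overflowing.
                        if (row % 1000 == 0) {
                            con.commit();
                        }
                    }
                }
                con.commit();
                con.close();
            }
        }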


    Best regards
    Mariano


    Kettle Version: 4.2.0
    Tomcat 7
    Java 7
    PostgreSQL 9.5.3 on x86_64-pc-linux-gnu, compiled by gcc (Debian 4.9.2-10) 4.9.2, 64-bit
    Last edited by Camarzana Mariano; 10-03-2017 at 09:03 AM.

  5. #5
    Join Date
    May 2016
    Posts
    282

    Default

    You can try executing the procedure once for each input row, although if you have a lot of rows this will be inefficient. In the Advanced tab of the Job Entry configuration you can check the "Execute for every input row" property.
    Regards
    OS: Ubuntu 16.04 64 bits
    Java: Openjdk 1.8.0_131
    Pentaho 6.1 CE

  6. #6
    Join Date
    Sep 2011
    Posts
    152

    Default

    Are you calling the same procedure n times with different parameters?

    What is the exact scenario? We might think of some other solution if we know the exact situation.

  7. #7

    Default

    Quote Originally Posted by rajeshbcrec View Post
    Are you calling the same procedure n times with different parameters?

    What is the exact scenario? We might think of some other solution if we know the exact situation.

    First I read the data from the CSV file, modifying some columns, and send the output to the Table Input step. Because of this, the function runs as many times as there are rows in the CSV file.
    I was able to solve the error by using the "Query database" step instead.
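
    For reference, the behavior that made the per-query step work can be sketched in JDBC, under the assumption that each call runs as its own auto-committed transaction (my_function and the connection details are again hypothetical): the locks are released as soon as each call commits, so they never accumulate across the n calls.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;

        public class AutoCommitPerCall {
            public static void main(String[] args) throws Exception {
                // Placeholder connection details; "my_function" is hypothetical.
                try (Connection con = DriverManager.getConnection(
                        "jdbc:postgresql://localhost:5432/mydb", "user", "pass")) {
                    con.setAutoCommit(true); // each call commits on its own
                    try (PreparedStatement ps = con.prepareStatement("SELECT my_function(?)")) {
                        for (int row = 1; row <= 100_000; row++) {
                            ps.setInt(1, row);
                            ps.execute(); // this call's locks are released right here
                        }
                    }
                }
            }
        }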
