Hitachi Vantara Pentaho Community Forums

Thread: Problem with more than one Informix table output copy

  1. #1

    Default Problem with more than one Informix table output copy

    Hi,

    every time I try to increase the number of copies of a Table Output step that writes to an Informix table in Kettle 3.1.3, I get the following error:

    Code:
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) : Because of an error, this step can't continue: 
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) : org.pentaho.di.core.exception.KettleException: 
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) : Error batch inserting rows into table [AR_STICHWORT].
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) : Errors encountered (first 10):
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) : Could not insert new row into the table.
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) : 
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) : 
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) : Error updating batch
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) : Could not insert new row into the table.
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) : 
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) : 
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) :     at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:337)
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) :     at org.pentaho.di.trans.steps.tableoutput.TableOutput.processRow(TableOutput.java:78)
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) :     at org.pentaho.di.trans.step.BaseStep.runStepThread(BaseStep.java:2677)
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) :     at org.pentaho.di.trans.steps.tableoutput.TableOutput.run(TableOutput.java:634)
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) : Caused by: org.pentaho.di.core.exception.KettleDatabaseBatchException: 
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) : Error updating batch
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) : Could not insert new row into the table.
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) : 
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) :     at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:251)
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) :     ... 3 more
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) : Caused by: java.sql.BatchUpdateException: Could not insert new row into the table.
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) :     at com.informix.jdbc.IfxPreparedStatement.executeBatch(IfxPreparedStatement.java:3588)
    AR_STICHWORT 2.1 - ERROR (version 3.1.3, build 1 from 2009/03/16 09:37:00) :     at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:246)
    I hope somebody can help me.

    Holger

  2. #2
    Join Date
    Jul 2009
    Posts
    24

    Default Use batch update for inserts

    A couple of basic things you can check, coming from a fellow PDI rookie:

    1. Make sure the hops are distributing rows to the copies of the Table Output step rather than copying rows. If the hops copy, every Table Output copy will attempt to insert the same data.

    2. If you are already distributing unique rows, try running the transformation without batch inserts
    (turn off the "Use batch update for inserts" option in the Table Output step).

    That should give you a couple of things to check until some of the veterans get to your post.

    Thank you,
    Jeff

  3. #3

    Default

    Okay, I have progressed a little further with this problem.
    The hops were already distributing rows, but locks got in the way on the first try. So I set IFX_ISOLATION_LEVEL = 1, which let the transformation run for a while, until the crash happened again somewhere around 1.5 million rows later. I cannot turn the batch off: without it the transformation simply hangs, and the error output from the batch does not give me the underlying SQL error. I do not know why it hangs; from one second to the next the elapsed time keeps increasing, but the number of rows flowing through the transformation stays the same.
    I might have to change the source code to get at the error, if nobody else has an idea.
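
    For reference, the isolation level can be passed as a driver property on the Informix JDBC connection URL (in Kettle, via the connection's generic URL or the options panel). Host, port, database, and server names below are placeholders; IFX_ISOLATION_LEVEL=1 corresponds to Dirty Read in the Informix JDBC driver:

    Code:
    jdbc:informix-sqli://dbhost:1526/mydb:INFORMIXSERVER=myserver;IFX_ISOLATION_LEVEL=1;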

  4. #4
    Join Date
    Nov 1999
    Posts
    9,729

    Default

    Multiple output step copies can and will cause deadlock errors on the database, even on Oracle and the like.

    The problem is that truly parallel writing is not possible; the only way to work around it is by caching things, etc.

    Doing the inserts in multiple transactions (more than one thread) causes deadlocks on data blocks, data rows, table data, or their index variations.

    Parallel inserts do work on Oracle, for example, but not if there is an index on the table in question.

    Matt

  5. #5

    Default

    Thanks Matt and Jeff for your answers.

    We finally found our problem. The copy of the table that my colleague created was in page-locking mode. We set it to row locking and, of course, skipped all the indices. Now that the output is faster than the input, I can start thinking about splitting the input data across a number of Table Input steps.
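
    For anyone hitting the same issue: assuming a table like the AR_STICHWORT one from the log above, the lock mode can be switched with standard Informix DDL (a sketch; adjust the table name to your own schema):

    Code:
    ALTER TABLE ar_stichwort LOCK MODE (ROW);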
