Hitachi Vantara Pentaho Community Forums

Thread: Error: Lock wait timeout exceeded; try restarting transaction - How to solve?

  1. #1
    Join Date
    Aug 2008
    Posts
    563

    Default Error: Lock wait timeout exceeded; try restarting transaction - How to solve?

    In the last few days I have been trying to create an ETL process that prepares the data for a cube. I have fixed a couple of errors, but I cannot fix this one: Lock wait timeout exceeded; try restarting transaction

    2009/03/27 14:42:29 - Thread[Process one service table (Process one service table (Thread-689)),5,main] - exec(7, 0, mlist_transformation.0)
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) : Because of an error, this step can't continue:
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) : org.pentaho.di.core.exception.KettleException:
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) : Error batch inserting rows into table [rd_product_sales].
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) : Errors encountered (first 10):
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) : Lock wait timeout exceeded; try restarting transaction
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) :
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) :
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) : Error updating batch
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) : Lock wait timeout exceeded; try restarting transaction
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) :
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) :
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) : at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:383)
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) : at org.pentaho.di.trans.steps.tableoutput.TableOutput.processRow(TableOutput.java:114)
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) : at org.pentaho.di.trans.step.BaseStep.runStepThread(BaseStep.java:2853)
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) : at org.pentaho.di.trans.steps.tableoutput.TableOutput.run(TableOutput.java:680)
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) : Caused by: org.pentaho.di.core.exception.KettleDatabaseBatchException:
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) : Error updating batch
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) : Lock wait timeout exceeded; try restarting transaction
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) :
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) : at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:299)
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) : ... 3 more
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) : Caused by: java.sql.BatchUpdateException: Lock wait timeout exceeded; try restarting transaction
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) : at com.mysql.jdbc.ServerPreparedStatement.executeBatch(ServerPreparedStatement.java:652)
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) : at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:294)
    2009/03/27 14:46:00 - Table output.0 - ERROR (version 3.2.0-M2, build 10376 from 2009-03-16 04.27.19 by tomcat) : ... 3 more
    2009/03/27 14:46:01 - Process one service table - Finished job entry [mlist_transformation] (result=[false])
    2009/03/27 14:46:01 - Process one service table - Finished job entry [Define Service Variables] (result=[false])

    When I run the transformation on a small amount of data everything works fine, but once I use more data I usually get this problem. In most cases the job looks like it is running, but nothing is being logged (it seems like nothing is happening). When I stop the job, I get the above error message.

    This seems to have something to do with the combination lookup/update steps ... but I am not sure.

    How can I solve this? I would appreciate it if somebody could point me in the right direction.
    Thanks a lot,
    Diddy

  2. #2
    Join Date
    Nov 1999
    Posts
    9,729

    Default

    It means another process has a transaction / lock on the table you're writing to.
    It goes like this:

    Process A opens a transaction, updates a record, doesn't end the transaction.
    Process B opens a transaction, wants to update the same record: this will block until A ends the transaction.

    Note that every step in Kettle opens a separate transaction by default.
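
    A minimal sketch of that blocking scenario in plain MySQL, using a hypothetical table t and row id 42 purely for illustration (not the poster's actual schema):
    Code:
    -- Session A: opens a transaction, updates a row, does not commit
    START TRANSACTION;
    UPDATE t SET col = 1 WHERE id = 42;
    -- (no COMMIT yet, so the row lock on id = 42 is still held)

    -- Session B: touches the same row, blocks, and after
    -- innodb_lock_wait_timeout seconds fails with
    -- "Lock wait timeout exceeded; try restarting transaction"
    START TRANSACTION;
    UPDATE t SET col = 2 WHERE id = 42;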

  3. #3
    Join Date
    Aug 2008
    Posts
    563

    Default

    Ok, thanks a lot for your answer!
    Is it necessary that every step has its own separate transaction, or would it be possible to use the same transaction for several steps? Could the problem be solved this way? If yes, how can I do this?

    Also, a very basic question: does the data go through all the steps row by row, or does every step get all the data in one go? From looking at the log, it looks more like the first scenario.

    Best regards,
    Diddy
    Last edited by diddy; 03-27-2009 at 11:53 AM.

  4. #4
    Join Date
    May 2006
    Posts
    4,882

    Default

    Under the options tab of a transformation (Ctrl-T in Spoon) you have "unique connection" (recently renamed to something else), which makes the transformation use 1 connection per transformation instead of 1 connection per step.

    For the row processing... all steps run in parallel, each in a separate thread... so yes, row by row.

    Regards,
    Sven

  5. #5
    Join Date
    Aug 2008
    Posts
    563

    Default

    Hi Sven,
    Many thanks for your answer! I have now changed this to a unique connection, and now I get the following error:
    Streaming result set com.mysql.jdbc.RowDataDynamic@19e96f7 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.

    Is there a way to solve this? (Sorry for so many questions!)

    Best regards,
    Diddy

    P.S.: 'Unique connection' is now called 'Make the transformation database transactional'.

  6. #6
    Join Date
    Aug 2008
    Posts
    563

    Default Some more info

    ... as stated above, I still have a problem.
    There is one point that might cause the problem:
    I get data from database A and export the results to database B (different connection details). In addition, the 'combination lookup/update' step accesses database B.

    So would 'Use unique connections' create two unique connections, or would it only be able to create one connection? (Maybe this additional information helps you identify the problem.)
    Thanks,
    Diddy

  7. #7

    Default insert/update step throwing lock wait timeout error

    I am also facing the same issue.
    My transformation is trying to update a few million records using the Insert/Update step. The transformation runs every day except weekends. I have faced this error 4 or 5 times out of 35 executions.
    But when we re-run the transformation, it completes successfully.

  8. #8

    Default Error message I got

    Error inserting row
    Lock wait timeout exceeded; try restarting transaction

    at org.pentaho.di.trans.steps.insertupdate.InsertUpdate.processRow(InsertUpdate.java:289)
    at org.pentaho.di.trans.steps.insertupdate.InsertUpdate.run(InsertUpdate.java:492)
    Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
    Error inserting row
    Lock wait timeout exceeded; try restarting transaction
    at org.pentaho.di.core.database.Database.insertRow(Database.java:1324)
    at org.pentaho.di.core.database.Database.insertRow(Database.java:1211)
    at org.pentaho.di.core.database.Database.insertRow(Database.java:1194)
    at org.pentaho.di.trans.steps.insertupdate.InsertUpdate.lookupValues(InsertUpdate.java:103)
    at org.pentaho.di.trans.steps.insertupdate.InsertUpdate.processRow(InsertUpdate.java:274)
    ... 1 more
    Caused by: java.sql.SQLException: Lock wait timeout exceeded; try restarting transaction
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2975)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1600)
    at com.mysql.jdbc.ServerPreparedStatement.serverExecute(ServerPreparedStatement.java:1129)
    at com.mysql.jdbc.ServerPreparedStatement.executeInternal(ServerPreparedStatement.java:681)
    at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:1368)
    at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:1283)
    at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:1268)
    at org.pentaho.di.core.database.Database.insertRow(Database.java:1274)
    ... 5 more

  9. #9
    Join Date
    Nov 1999
    Posts
    9,729

    Default

    It probably just means that there is another process that grabs a lock on one of the tables used in the transformation. MySQL doesn't really handle this very gracefully and to be honest, most databases don't.
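
    If you want to see who is holding the lock while the transformation hangs, a couple of standard MySQL commands run from a separate client usually show it (nothing Kettle-specific here):
    Code:
    -- list all connections and the statements they are running
    SHOW FULL PROCESSLIST;
    -- detailed InnoDB transaction and lock information
    SHOW ENGINE INNODB STATUS;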

  10. #10
    Join Date
    Mar 2010
    Posts
    23

    Unhappy Still happens

    Sorry to post in an old thread, but the same issue is still happening.

    I read (I believe from Matt, not sure) that changing the MySQL variable
    Code:
    net_write_timeout
    to a higher value resolves the issue, but it did not for me.

    I'm also trying to raise the values of the variables
    Code:
    innodb_lock_wait_timeout
    and
    Code:
    table_lock_wait_timeout
    but without success.

    It seems that the variable
    Code:
    net_read_timeout
    does the trick, but I cannot be sure because the error does not always occur and it will probably happen again.
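
    For reference, a rough sketch of how these timeouts can be inspected and raised from a MySQL client; note that on older MySQL versions innodb_lock_wait_timeout is not dynamic and has to be set in my.cnf followed by a server restart:
    Code:
    -- check the current values
    SHOW GLOBAL VARIABLES LIKE '%lock_wait_timeout%';
    SHOW GLOBAL VARIABLES LIKE 'net_%_timeout';

    -- raise the InnoDB lock wait timeout (value in seconds);
    -- on older servers this has to go into my.cnf and needs a restart
    SET GLOBAL innodb_lock_wait_timeout = 120;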

    Perhaps this information can be useful: the Insert/Update step receives data from 2 flows. Would combining them in a Dummy step before doing the insert/update make a difference?

    I need some feedback about whether I'm on the right road here.
    Last edited by ElmoVil; 05-31-2010 at 08:07 PM. Reason: add more info
