Hitachi Vantara Pentaho Community Forums

Thread: Kettle gives error for more than 10000 rows in Table Input for PostgreSQL Redshift

  1. #1

    Kettle gives error for more than 10000 rows in Table Input for PostgreSQL Redshift

    Hi,

    I am new to Kettle. These are the steps I followed:

    1. Created a new transformation.
    2. Dragged in a Table Input step.
    3. Created a connection to my Redshift DB. Tested OK.
    4. Selected rows with a LIMIT below 10k and got the output.
    5. Selected rows with a LIMIT above 10k and got the exception below.

    The stack trace says the query was cancelled by the "user". I am guessing Kettle cancels the query because of some timeout setting or row limit
    (more inclined toward a row limit, since 9999 rows work fine but 11000 doesn't).
    It looks like I am missing some setting. Your help is appreciated. Thanks.



    2014/01/10 19:38:33 - Table input.0 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Unexpected error
    2014/01/10 19:38:33 - Table input.0 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : org.pentaho.di.core.exception.KettleDatabaseException:
    2014/01/10 19:38:33 - Table input.0 - An error occurred executing SQL:
    2014/01/10 19:38:33 - Table input.0 - SELECT * FROM users LIMIT 11000
    2014/01/10 19:38:33 - Table input.0 -
    2014/01/10 19:38:33 - Table input.0 - Error determining value metadata from SQL resultset metadata
    2014/01/10 19:38:33 - Table input.0 - ERROR: Invalid protocol sequence 'P' while in PortalSuspended state.
    2014/01/10 19:38:33 - Table input.0 -
    2014/01/10 19:38:33 - Table input.0 -
    2014/01/10 19:38:33 - Table input.0 - at org.pentaho.di.core.database.Database.openQuery(Database.java:1643)
    2014/01/10 19:38:33 - Table input.0 - at org.pentaho.di.trans.steps.tableinput.TableInput.doQuery(TableInput.java:233)
    2014/01/10 19:38:33 - Table input.0 - at org.pentaho.di.trans.steps.tableinput.TableInput.processRow(TableInput.java:143)
    2014/01/10 19:38:33 - Table input.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:60)
    2014/01/10 19:38:33 - Table input.0 - at java.lang.Thread.run(Thread.java:680)
    2014/01/10 19:38:33 - Table input.0 - Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
    2014/01/10 19:38:33 - Table input.0 - Error determining value metadata from SQL resultset metadata
    2014/01/10 19:38:33 - Table input.0 - ERROR: Invalid protocol sequence 'P' while in PortalSuspended state.
    2014/01/10 19:38:33 - Table input.0 -
    2014/01/10 19:38:33 - Table input.0 - at org.pentaho.di.core.row.value.ValueMetaBase.getValueFromSQLType(ValueMetaBase.java:4480)
    2014/01/10 19:38:33 - Table input.0 - at org.pentaho.di.core.database.Database.getValueFromSQLType(Database.java:2191)
    2014/01/10 19:38:33 - Table input.0 - at org.pentaho.di.core.database.Database.getRowInfo(Database.java:2137)
    2014/01/10 19:38:33 - Table input.0 - at org.pentaho.di.core.database.Database.openQuery(Database.java:1639)
    2014/01/10 19:38:33 - Table input.0 - ... 4 more
    2014/01/10 19:38:33 - Table input.0 - Caused by: org.postgresql.util.PSQLException: ERROR: Invalid protocol sequence 'P' while in PortalSuspended state.
    2014/01/10 19:38:33 - Table input.0 - at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2101)
    2014/01/10 19:38:33 - Table input.0 - at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1834)
    2014/01/10 19:38:33 - Table input.0 - at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:255)
    2014/01/10 19:38:33 - Table input.0 - at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:510)
    2014/01/10 19:38:33 - Table input.0 - at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:372)
    2014/01/10 19:38:33 - Table input.0 - at org.postgresql.jdbc2.AbstractJdbc2Statement.executeQuery(AbstractJdbc2Statement.java:252)
    2014/01/10 19:38:33 - Table input.0 - at org.postgresql.jdbc2.AbstractJdbc2ResultSetMetaData.fetchFieldMetaData(AbstractJdbc2ResultSetMetaData.java:238)
    2014/01/10 19:38:33 - Table input.0 - at org.postgresql.jdbc2.AbstractJdbc2ResultSetMetaData.isAutoIncrement(AbstractJdbc2ResultSetMetaData.java:59)
    2014/01/10 19:38:33 - Table input.0 - at org.postgresql.jdbc2.AbstractJdbc2ResultSetMetaData.getColumnTypeName(AbstractJdbc2ResultSetMetaData.java:348)
    2014/01/10 19:38:33 - Table input.0 - at org.pentaho.di.core.row.value.ValueMetaBase.getOriginalColumnMetadata(ValueMetaBase.java:4493)
    2014/01/10 19:38:33 - Table input.0 - at org.pentaho.di.core.row.value.ValueMetaBase.getValueFromSQLType(ValueMetaBase.java:4459)
    2014/01/10 19:38:33 - Table input.0 - ... 7 more
    2014/01/10 19:38:33 - Table input.0 - Finished reading query, closing connection.
    2014/01/10 19:38:33 - PG - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Error disconnecting from database:
    2014/01/10 19:38:33 - PG -
    2014/01/10 19:38:33 - PG - Error comitting connection
    2014/01/10 19:38:33 - PG - ERROR: Query cancelled on user's request
    2014/01/10 19:38:33 - PG - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : org.pentaho.di.core.exception.KettleDatabaseException:
    2014/01/10 19:38:33 - PG - Error comitting connection
    2014/01/10 19:38:33 - PG - ERROR: Query cancelled on user's request
    2014/01/10 19:38:33 - PG -
    2014/01/10 19:38:33 - PG - at org.pentaho.di.core.database.Database.commit(Database.java:734)
    2014/01/10 19:38:33 - PG - at org.pentaho.di.core.database.Database.commit(Database.java:708)
    2014/01/10 19:38:33 - PG - at org.pentaho.di.core.database.Database.disconnect(Database.java:588)
    2014/01/10 19:38:33 - PG - at org.pentaho.di.trans.steps.tableinput.TableInput.dispose(TableInput.java:285)
    2014/01/10 19:38:33 - PG - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:100)
    2014/01/10 19:38:33 - PG - at java.lang.Thread.run(Thread.java:680)
    2014/01/10 19:38:33 - PG - Caused by: org.postgresql.util.PSQLException: ERROR: Query cancelled on user's request
    2014/01/10 19:38:33 - PG - at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2101)
    2014/01/10 19:38:33 - PG - at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1834)
    2014/01/10 19:38:33 - PG - at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:255)
    2014/01/10 19:38:33 - PG - at org.postgresql.jdbc2.AbstractJdbc2Connection.executeTransactionCommand(AbstractJdbc2Connection.java:683)
    2014/01/10 19:38:33 - PG - at org.postgresql.jdbc2.AbstractJdbc2Connection.commit(AbstractJdbc2Connection.java:707)
    2014/01/10 19:38:33 - PG - at org.pentaho.di.core.database.Database.commit(Database.java:726)
    2014/01/10 19:38:33 - PG - ... 5 more
    2014/01/10 19:38:33 - Table input.0 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=1)
    2014/01/10 19:38:33 - testtranf - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Errors detected!
    2014/01/10 19:38:33 - Spoon - The transformation has finished!!
    2014/01/10 19:38:33 - testtranf - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Errors detected!
    2014/01/10 19:38:33 - testtranf - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Errors detected!

  2. #2


    It was a PostgreSQL driver issue. Kettle ships with the 9.1 driver by default, but as per this page http://docs.aws.amazon.com/redshift/...workbench.html, 8.4 is the driver to use.

  3. #3
    Join Date
    Mar 2014
    Posts
    2

    Persistent Kettle/Redshift problem

    Quote Originally Posted by vijayagrawal View Post
    It was a PostgreSQL driver issue. Kettle ships with the 9.1 driver by default, but as per this page http://docs.aws.amazon.com/redshift/...workbench.html, 8.4 is the driver to use.
    I seem to have this problem as well, and the tricky part is that I had the 8.4 driver to begin with, since I knew it is needed with Amazon. My query runs smoothly in SQL Workbench using the same connection and driver, but gives the described error in Kettle. If I limit the query further, it works fine.

  4. #4
    Join Date
    Mar 2014
    Posts
    2


    Quote Originally Posted by sampsa View Post
    I seem to have this problem as well, and the tricky part is that I had the 8.4 driver to begin with, since I knew it is needed with Amazon. My query runs smoothly in SQL Workbench using the same connection and driver, but gives the described error in Kettle. If I limit the query further, it works fine.
    A bit mysterious, but just changing the driver and flushing the cache wasn't enough. I don't know which part was crucial, but installing from scratch, including deleting the .kettle directory, did the trick.

  5. #5
    Join Date
    Dec 2014
    Posts
    1


    You don't need to uninstall Pentaho.
    Just download the PostgreSQL 8.4 JDBC driver (jar) and copy it into data-integration\lib.
    Of course, delete the other PostgreSQL driver that already exists there.
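    The swap described above amounts to a couple of shell commands. The paths and jar file names below are illustrative, not taken from the thread (a real install would use the lib folder inside your actual data-integration directory and the 8.4 jar downloaded from jdbc.postgresql.org); the snippet builds a mock layout so the steps can be shown end to end.

    ```shell
    # Mock PDI install so the swap can be demonstrated; in a real install,
    # PDI_LIB would be the lib folder inside your data-integration directory.
    PDI_LIB="$(mktemp -d)/data-integration/lib"
    mkdir -p "$PDI_LIB"
    touch "$PDI_LIB/postgresql-9.1-902.jdbc4.jar"   # stand-in for the bundled 9.1 driver

    # Stand-in for the 8.4 JDBC jar you downloaded (file name is illustrative)
    NEW_JAR="$(mktemp -d)/postgresql-8.4-703.jdbc4.jar"
    touch "$NEW_JAR"

    # The actual fix: delete the bundled 9.x driver, then copy in the 8.4 jar,
    # so only one PostgreSQL driver is left on Kettle's classpath.
    rm -f "$PDI_LIB"/postgresql-9*.jar
    cp "$NEW_JAR" "$PDI_LIB"/
    ```

    After the swap, restart Spoon so the new jar is picked up.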
