Hitachi Vantara Pentaho Community Forums

Thread: "java.sql.SQLException: Method not supported" when retrieving data from hive table.

  1. #1

    Default "java.sql.SQLException: Method not supported" when retrieving data from hive table.

    Hi all,

    I am new to Kettle, so I tried to run the example "extracting data from Hive to load an RDBMS".
    Testing the Hive database connection shows that the connection is good,
    but when reading data from Hive, the error below is shown.
    I also searched the forum; this seems to have been a common issue before, but I didn't find an answer there. Can someone share how you solved it?
    Note:
    Kettle-Spoon version: 4.4.0-stable
    Hadoop version: CDH4, Hadoop 2.0
    I copied the following JARs (hive-*.jar) from the CDH4 cluster to data-integration\libext\JDBC.


    2013/07/25 17:07:09 - Hive(135.252.31.26) - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : ERROR executing query: org.pentaho.di.core.exception.KettleDatabaseException:
    2013/07/25 17:07:09 - Hive(135.252.31.26) - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Error getting row information from database:
    2013/07/25 17:07:09 - Hive(135.252.31.26) - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Method not supported
    2013/07/25 17:07:09 - Hive(135.252.31.26) - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : ERROR in part: openQuery : get rowinfo
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Unexpected error
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : org.pentaho.di.core.exception.KettleDatabaseException:
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : An error occurred executing SQL in part [openQuery : get rowinfo]:
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : select * from myuser
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Error getting row information from database:
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Method not supported
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.core.database.Database.openQuery(Database.java:1863)
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.trans.steps.tableinput.TableInput.doQuery(TableInput.java:233)
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.trans.steps.tableinput.TableInput.processRow(TableInput.java:143)
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.trans.step.RunThread.run(RunThread.java:50)
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at java.lang.Thread.run(Thread.java:619)
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Error getting row information from database:
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Method not supported
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.core.database.Database.getRowInfo(Database.java:2396)
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.core.database.Database.openQuery(Database.java:1850)
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : ... 4 more
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Caused by: java.sql.SQLException: Method not supported
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.apache.hadoop.hive.jdbc.HiveResultSetMetaData.isSigned(HiveResultSetMetaData.java:180)
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.core.database.Database.getValueFromSQLType(Database.java:2408)
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.core.database.Database.getRowInfo(Database.java:2389)
    2013/07/25 17:07:09 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : ... 5 more
    2013/07/25 17:07:09 - Hive(135.252.31.26) - Statement canceled!
    2013/07/25 17:07:09 - hive_to_rdbms - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Something went wrong while trying to stop the transformation: org.pentaho.di.core.exception.KettleDatabaseException:
    2013/07/25 17:07:09 - hive_to_rdbms - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Error cancelling statement
    2013/07/25 17:07:09 - hive_to_rdbms - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Method not supported
    2013/07/25 17:07:09 - hive_to_rdbms - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : org.pentaho.di.core.exception.KettleDatabaseException:
    2013/07/25 17:07:09 - hive_to_rdbms - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Error cancelling statement
    2013/07/25 17:07:09 - hive_to_rdbms - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Method not supported
    2013/07/25 17:07:09 - hive_to_rdbms - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :
    2013/07/25 17:07:09 - hive_to_rdbms - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.core.database.Database.cancelStatement(Database.java:635)
    2013/07/25 17:07:09 - hive_to_rdbms - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.core.database.Database.cancelQuery(Database.java:615)
    2013/07/25 17:07:09 - hive_to_rdbms - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.trans.steps.tableinput.TableInput.stopRunning(TableInput.java:303)
    2013/07/25 17:07:09 - hive_to_rdbms - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.trans.Trans.stopAll(Trans.java:1563)
    2013/07/25 17:07:09 - hive_to_rdbms - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.trans.step.BaseStep.stopAll(BaseStep.java:2525)
    2013/07/25 17:07:09 - hive_to_rdbms - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.trans.step.RunThread.run(RunThread.java:74)
    2013/07/25 17:07:09 - hive_to_rdbms - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at java.lang.Thread.run(Thread.java:619)
    2013/07/25 17:07:09 - hive_to_rdbms - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Caused by: java.sql.SQLException: Method not supported
    2013/07/25 17:07:09 - hive_to_rdbms - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.apache.hadoop.hive.jdbc.HiveStatement.cancel(HiveStatement.java:85)
    2013/07/25 17:07:09 - hive_to_rdbms - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.core.database.Database.cancelStatement(Database.java:629)
    2013/07/25 17:07:09 - hive_to_rdbms - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : ... 6 more

    Thanks.
    Maria
    Last edited by mariax; 07-25-2013 at 05:34 AM.

  2. #2
    Join Date
    Sep 2012
    Posts
    71

    Default

    Pentaho Data Integration (aka Kettle) ships a Hive driver for CDH 4 that supports more methods than the one that comes with the CDH distribution. For that reason, you shouldn't copy the hive-* JARs from the cluster to the JDBC folder. Note that the JDBC folder contains a JAR called pentaho-hadoop-hive-jdbc-<version>; it is a kind of proxy to our "fixed" Hive driver.

    I recommend removing the hive-* JARs you put into JDBC, pointing your Hadoop configuration at CDH 4 by setting the "active.hadoop.configuration" property in plugins/pentaho-big-data-plugin/plugin.properties to cdh4 (see the snippet below), then starting Spoon. If that doesn't work, please let us know and we will take another look.
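
    For reference, the relevant line in plugin.properties should look something like this (the property name and value are exactly as above; the comments are mine):

        # plugins/pentaho-big-data-plugin/plugin.properties
        # Select the CDH 4 shim so PDI loads its bundled, patched Hive driver
        # instead of JARs copied in from the cluster.
        active.hadoop.configuration=cdh4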

  3. #3

    Default

    Hi Matt,

    Thanks for your quick response.
    Before I copied the files from the Hadoop cluster, I had already tried the steps you mentioned (1. copying hive-jdbc-0.7.0-pentaho-1.0.2.jar to \libext\JDBC; 2. changing active.hadoop.configuration=cdh4). When I then tested the Hive DB connection, the error below was shown.
    That's why I tried to copy the JARs from the cluster. Can you help me take a look?
    Note: I had run "hive --service hiveserver" on the Hive cluster.

    org.pentaho.di.core.exception.KettleDatabaseException:
    Error occured while trying to connect to the database
    Error connecting to database: (using class org.apache.hadoop.hive.jdbc.HiveDriver)
    org/apache/thrift/transport/TTransportException

    at org.pentaho.di.core.database.Database.normalConnect(Database.java:366)
    at org.pentaho.di.core.database.Database.connect(Database.java:315)
    at org.pentaho.di.core.database.Database.connect(Database.java:277)
    at org.pentaho.di.core.database.Database.connect(Database.java:267)
    at org.pentaho.di.core.database.DatabaseFactory.getConnectionTestReport(DatabaseFactory.java:86)
    at org.pentaho.di.core.database.DatabaseMeta.testConnection(DatabaseMeta.java:2464)
    at org.pentaho.ui.database.event.DataHandler.testDatabaseConnection(DataHandler.java:533)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:329)
    at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:139)
    at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:123)
    at org.pentaho.ui.xul.swt.tags.SwtButton.access$500(SwtButton.java:26)
    at org.pentaho.ui.xul.swt.tags.SwtButton$4.widgetSelected(SwtButton.java:119)
    at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
    at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
    at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
    at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
    at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
    at org.eclipse.jface.window.Window.runEventLoop(Window.java:820)
    at org.eclipse.jface.window.Window.open(Window.java:796)
    at org.pentaho.ui.xul.swt.tags.SwtDialog.show(SwtDialog.java:378)
    at org.pentaho.ui.xul.swt.tags.SwtDialog.show(SwtDialog.java:304)
    at org.pentaho.di.ui.core.database.dialog.XulDatabaseDialog.open(XulDatabaseDialog.java:115)
    at org.pentaho.di.ui.core.database.dialog.DatabaseDialog.open(DatabaseDialog.java:62)
    at org.pentaho.di.ui.spoon.delegates.SpoonDBDelegate.editConnection(SpoonDBDelegate.java:88)
    at org.pentaho.di.ui.spoon.Spoon.doubleClickedInTree(Spoon.java:2828)
    at org.pentaho.di.ui.spoon.Spoon.access$1900(Spoon.java:315)
    at org.pentaho.di.ui.spoon.Spoon$23.widgetDefaultSelected(Spoon.java:5329)
    at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
    at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
    at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
    at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
    at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
    at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1221)
    at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7044)
    at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:8304)
    at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:580)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.pentaho.commons.launcher.Launcher.main(Launcher.java:134)
    Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
    Error connecting to database: (using class org.apache.hadoop.hive.jdbc.HiveDriver)
    org/apache/thrift/transport/TTransportException
    at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:506)
    at org.pentaho.di.core.database.Database.normalConnect(Database.java:350)
    ... 44 more
    Caused by: java.lang.NoClassDefFoundError: org/apache/thrift/transport/TTransportException
    at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:104)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:207)
    at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:488)
    ... 45 more
    Caused by: java.lang.ClassNotFoundException: org.apache.thrift.transport.TTransportException
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
    ... 49 more
    Hostname: 135.252.31.26
    Port: 10000
    Database name: warehouse


    Thanks.
    Maria
    Last edited by mariax; 07-26-2013 at 06:51 AM.

  4. #4

    Default

    Hi all,

    Has anyone else run into this issue? If so, can you share how you solved it?
    Thanks.

    Maria

  5. #5
    Join Date
    Sep 2012
    Posts
    71

    Default

    That first step, where you copied in the hive-*.jar files, should not have been done; the only related JAR that should be in libext/JDBC is pentaho-hadoop-hive-jdbc-shim-<version>.jar. If any of the hive-* JARs are in that folder, then they all need to be, along with libfb303.jar and the Thrift JARs. That's why, instead of doing all that, I recommend keeping only the pentaho-hadoop-hive-jdbc-shim JAR in libext/JDBC and leaving the hive-* JARs in your cdh4 folder. That's how PDI is laid out when you download it (see the sketch below).
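
    Roughly, the expected layout is this (illustrative only; exact paths and version numbers vary by release):

        data-integration/
          libext/JDBC/
            pentaho-hadoop-hive-jdbc-shim-<version>.jar    (only Hive-related JAR here)
          plugins/pentaho-big-data-plugin/
            hadoop-configurations/cdh4/lib/
              hive-*.jar, libfb303.jar, Thrift JARs        (these stay in the shim folder)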

    The error you're seeing above is a result of PDI finding a Hive JAR which needs (among other things) the Thrift JAR, and the Thrift JAR appears not to be in the same place; hence the NoClassDefFoundError for org.apache.thrift.transport.TTransportException. A quick way to confirm this is sketched below.
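
    If you want to check what's actually visible on a given classpath, a minimal standalone check could look like this. This is my own hypothetical diagnostic, not part of PDI; the class names are taken from the stack traces above:

        // HiveClasspathCheck.java - run with the same JARs on the classpath
        // that PDI would load, and see which classes resolve.
        public class HiveClasspathCheck {
            public static void main(String[] args) {
                String[] classes = {
                    "org.apache.hadoop.hive.jdbc.HiveDriver",
                    "org.apache.thrift.transport.TTransportException"
                };
                for (String name : classes) {
                    try {
                        Class.forName(name);
                        System.out.println("OK:      " + name);
                    } catch (ClassNotFoundException e) {
                        System.out.println("MISSING: " + name);
                    }
                }
            }
        }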

    Your original error comes from using Cloudera's Hive driver, which does not implement many of the JDBC API methods that PDI needs to function properly. That's why we ship our own version of the Hive driver in the cdh4 folder (called hive-0.7.0-pentaho-1.0.2 or something like that). Simply put, no JARs should be copied from your cluster to your PDI client; the cdh4 folder already contains the correct versions of all necessary JARs.
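
    To illustrate what "Method not supported" means at the JDBC level: in your first log the stock driver throws from HiveResultSetMetaData.isSigned(), a metadata method PDI calls while building row information. A patched or wrapping driver can supply a sensible default instead. This is only a sketch of that idea, not Pentaho's actual code:

        import java.sql.ResultSetMetaData;
        import java.sql.SQLException;

        // Illustrative helper: tolerate drivers that leave optional
        // metadata methods such as isSigned() unimplemented.
        public class MetaDataDefaults {
            public static boolean isSignedOrDefault(ResultSetMetaData md,
                                                    int column,
                                                    boolean fallback) {
                try {
                    return md.isSigned(column);   // ask the driver first
                } catch (SQLException e) {
                    return fallback;              // e.g. true for numeric columns
                }
            }
        }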

  6. #6

    Default

    Matt,

    You are right. My issue is solved now. Thanks for your help.

    Maria

  7. #7
    Join Date
    Jun 2013
    Posts
    44

    Default

    Thanks for the support. It is very common to get stuck on problems similar to the one discussed here, and this helpful post will go a long way toward sorting them out.
