Hitachi Vantara Pentaho Community Forums

Thread: How can I use Kettle to connect to Hive2 with Kerberos authentication?

  1. #1
    Join Date
    Mar 2016
    Posts
    5

    Default How can I use Kettle to connect to Hive2 with Kerberos authentication?

    I want to use Kettle to work with data in Hive2, but the Hive server requires Kerberos authentication. What do I need to do to connect to Hive2? Thanks!
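
    For reference, the usual approach for a Kerberos-secured HiveServer2 is to obtain a Kerberos ticket (from a keytab or kinit) and pass the HiveServer2 service principal in the JDBC URL. Below is a minimal sketch in plain Java using the standard Hadoop and Hive JDBC classes; the principal, realm (EXAMPLE.COM) and keytab path are placeholders and would need to match your cluster:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.security.UserGroupInformation;

    public class KerberosHiveJdbcSketch {
        public static void main(String[] args) throws Exception {
            // Log in to Kerberos from a keytab before opening the JDBC connection.
            // The principal and keytab path are placeholders for this sketch.
            Configuration conf = new Configuration();
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);
            UserGroupInformation.loginUserFromKeytab(
                    "etluser@EXAMPLE.COM", "/etc/security/keytabs/etluser.keytab");

            // The ;principal= part names the HiveServer2 service principal;
            // _HOST is replaced with the server hostname from the URL.
            String url = "jdbc:hive2://170.190.2.97:10000/d_test;principal=hive/_HOST@EXAMPLE.COM";

            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection con = DriverManager.getConnection(url);
                 Statement stmt = con.createStatement();
                 ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }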

  2. #2
    Join Date
    Mar 2016
    Posts
    5

    Default

    Here is the error message:
    org.osgi.service.blueprint.container.ComponentDefinitionException: Error when instantiating bean .component-1 of class org.pentaho.big.data.impl.vfs.hdfs.HDFSFileProvider
    at org.apache.aries.blueprint.container.BeanRecipe.getInstance(BeanRecipe.java:333)
    at org.apache.aries.blueprint.container.BeanRecipe.internalCreate2(BeanRecipe.java:806)
    at org.apache.aries.blueprint.container.BeanRecipe.internalCreate(BeanRecipe.java:787)
    at org.apache.aries.blueprint.di.AbstractRecipe$1.call(AbstractRecipe.java:79)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at org.apache.aries.blueprint.di.AbstractRecipe.create(AbstractRecipe.java:88)
    at org.apache.aries.blueprint.container.BlueprintRepository.createInstances(BlueprintRepository.java:245)
    at org.apache.aries.blueprint.container.BlueprintRepository.createAll(BlueprintRepository.java:183)
    at org.apache.aries.blueprint.container.BlueprintContainerImpl.instantiateEagerComponents(BlueprintContainerImpl.java:682)
    at org.apache.aries.blueprint.container.BlueprintContainerImpl.doRun(BlueprintContainerImpl.java:377)
    at org.apache.aries.blueprint.container.BlueprintContainerImpl.run(BlueprintContainerImpl.java:269)
    at org.apache.aries.blueprint.container.BlueprintExtender.createContainer(BlueprintExtender.java:294)
    at org.apache.aries.blueprint.container.BlueprintExtender.createContainer(BlueprintExtender.java:263)
    at org.apache.aries.blueprint.container.BlueprintExtender.modifiedBundle(BlueprintExtender.java:253)
    at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$Tracked.customizerModified(BundleHookBundleTracker.java:500)
    at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$Tracked.customizerModified(BundleHookBundleTracker.java:433)
    at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$AbstractTracked.track(BundleHookBundleTracker.java:725)
    at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$Tracked.bundleChanged(BundleHookBundleTracker.java:463)
    at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$BundleEventHook.event(BundleHookBundleTracker.java:422)
    at org.apache.felix.framework.util.SecureAction.invokeBundleEventHook(SecureAction.java:1127)
    at org.apache.felix.framework.util.EventDispatcher.createWhitelistFromHooks(EventDispatcher.java:696)
    at org.apache.felix.framework.util.EventDispatcher.fireBundleEvent(EventDispatcher.java:484)
    at org.apache.felix.framework.Felix.fireBundleEvent(Felix.java:4429)
    at org.apache.felix.framework.Felix.startBundle(Felix.java:2100)
    at org.apache.felix.framework.Felix.setActiveStartLevel(Felix.java:1299)
    at org.apache.felix.framework.FrameworkStartLevelImpl.run(FrameworkStartLevelImpl.java:304)
    at java.lang.Thread.run(Thread.java:745)
    Caused by: org.apache.commons.vfs2.FileSystemException: Multiple providers registered for URL scheme "hdfs".
    at org.apache.commons.vfs2.impl.DefaultFileSystemManager.addProvider(DefaultFileSystemManager.java:208)
    at org.pentaho.big.data.impl.vfs.hdfs.HDFSFileProvider.<init>(HDFSFileProvider.java:98)
    at org.pentaho.big.data.impl.vfs.hdfs.HDFSFileProvider.<init>(HDFSFileProvider.java:87)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.aries.blueprint.utils.ReflectionUtils.newInstance(ReflectionUtils.java:329)
    at org.apache.aries.blueprint.container.BeanRecipe.newInstance(BeanRecipe.java:962)
    at org.apache.aries.blueprint.container.BeanRecipe.getInstance(BeanRecipe.java:331)
    ... 26 more
    Apr 07, 2016 8:39:54 PM org.apache.cxf.endpoint.ServerImpl initDestination
    INFO: Setting the server's publish address to be /lineage
    2016/04/07 20:39:54 - Pan - Start of run.
    2016/04/07 20:39:54 - RepositoriesMeta - Reading repositories XML file: /home/xxxxx/.kettle/repositories.xml
    2016/04/07 20:39:54 - testTrans - Dispatching started for transformation [testTrans]
    2016/04/07 20:39:55 - 表输入.0 - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : An error occurred, processing will be stopped:
    2016/04/07 20:39:55 - 表输入.0 - Error occurred while trying to connect to the database
    2016/04/07 20:39:55 - 表输入.0 -
    2016/04/07 20:39:55 - 表输入.0 - Error connecting to database: (using class org.apache.hive.jdbc.HiveDriver)
    2016/04/07 20:39:55 - 表输入.0 - Could not open client transport with JDBC Uri: jdbc:hive2://170.190.2.97:10000/d_test: java.net.ConnectException: Connection refused
    2016/04/07 20:39:55 - 表输入.0 - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : Error initializing step [表输入]
    2016/04/07 20:39:55 - testTrans - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : Step [表输入.0] failed to initialize!
    2016/04/07 20:39:55 - 表输入.0 - Finished reading query, closing connection.
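
    For what it's worth, the JDBC URI in this log (jdbc:hive2://170.190.2.97:10000/d_test) carries no Kerberos principal parameter, and "Connection refused" indicates nothing accepted the connection on that host and port from where Pan was run. For a Kerberized HiveServer2 the URL normally also names the service principal, for example (placeholder realm):

    jdbc:hive2://170.190.2.97:10000/d_test;principal=hive/_HOST@YOUR.REALM

    In Kettle this parameter can usually be appended to the Hive2 connection's URL or database name, provided the JVM running the transformation has a valid Kerberos ticket (for example from kinit or a keytab login).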
