Hitachi Vantara Pentaho Community Forums

Thread: Run Kettle Jobs/ktr in AIX Platform system

  1. #1
    Join Date
    Aug 2013
    Posts
    100

    Default Run Kettle Jobs/ktr in AIX Platform system

    Hello,

    How can I run my PDI jobs/ktr in an AIX environment? Right now I am using pdi-ce-6.1.0.1-196.



    Is there any document or guideline I can follow to run the ktr files? Please let me know.


    Regards
    Sumit
  2. #2
    Join Date
    Apr 2008
    Posts
    4,696

    Default

    Your other thread made the error clearer.
    Always remember to post a complete picture of what's going on.

    Quote Originally Posted by Sumit_Bansal View Post
    Code:
    I'm sorry, this AIX platform  is not yet supported
    ?
    Ok, so your uname -m is returning something that isn't expected.
    Can you run uname -m from the command line and let us know what it says?
    Code:
    case `uname -s` in
        AIX)
            ARCH=`uname -m`
            case $ARCH in
                ppc)
                    LIBPATH=$CURRENTDIR/../libswt/aix/
                    ;;
                ppc64)
                    LIBPATH=$CURRENTDIR/../libswt/aix64/
                    ;;
                *)
                    echo "I'm sorry, this AIX platform [$ARCH] is not yet supported!"
                    exit
                    ;;
            esac
            ;;
    Could you also run uname -p and advise how it responds?

    Are you running a 64bit Java, or a 32bit Java?
    Are you running a 64bit kernel, or a 32bit kernel?
    Which version of AIX is this?
    Is this within an LPAR, or a WPAR, or full machine instance?

    You somewhat need to give us the complete picture before we can tell you how to fix it.
    Last edited by gutlez; 02-08-2017 at 05:16 PM.

  3. #3
    Join Date
    Aug 2013
    Posts
    100

    Default

    Hi gutlez,
    Thanks a lot for your quick response. Please find the required information below:

    • uname -m -> 00F982AD4C00
    • uname -p -> powerpc
    • AIX Version -> 7.1.0.0
    • We are running 64-bit Java on AIX -> echo $JAVA_HOME

    /home/usr/ibm-java-ppc64-71

    • We are running a 64-bit kernel.
    • We are using a full machine instance.

    Let me know if you require any further information from my end. Please suggest how we can fix this issue.
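
    For reference, a minimal sketch of the shell commands that report these details on AIX (uname -m and uname -p are the ones asked about above; oslevel, getconf and the java -version check are assumptions about standard AIX/IBM Java tooling, not taken from this thread):

    Code:
    uname -m                        # machine ID, e.g. 00F982AD4C00
    uname -p                        # processor architecture, e.g. powerpc
    oslevel -s                      # AIX version / technology level (assumed standard AIX command)
    getconf KERNEL_BITMODE          # prints 32 or 64 for the running kernel (assumed standard AIX command)
    $JAVA_HOME/bin/java -version    # output indicates whether the JVM is 32-bit or 64-bit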


    Quote Originally Posted by gutlez View Post
    Your other thread made the error clearer.
    Always remember to post a complete picture of what's going on.


    Ok, so your uname -m is returning something that isn't expected.
    Can you run uname -m from the command line and let us know what it says?
    Code:
    case `uname -s` in
        AIX)
            ARCH=`uname -m`
            case $ARCH in
                ppc)
                    LIBPATH=$CURRENTDIR/../libswt/aix/
                    ;;
                ppc64)
                    LIBPATH=$CURRENTDIR/../libswt/aix64/
                    ;;
                *)
                    echo "I'm sorry, this AIX platform [$ARCH] is not yet supported!"
                    exit
                    ;;
            esac
            ;;
    Could you also run uname -p and advise how it responds?

    Are you running a 64bit Java, or a 32bit Java?
    Are you running a 64bit kernel, or a 32bit kernel?
    Which version of AIX is this?
    Is this within an LPAR, or a WPAR, or full machine instance?

    You somewhat need to give us the complete picture before we can tell you how to fix it.
    Last edited by Sumit_Bansal; 02-09-2017 at 02:14 AM.

  4. #4
    Join Date
    Apr 2008
    Posts
    4,696

    Default

    I would suggest you try editing the spoon.sh file.

    In the section I quoted above, change `uname -m` to `uname -p`.
    I would also change ppc64 to powerpc.

    That should allow pan to start.
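
    For clarity, a minimal sketch of what that section of spoon.sh could look like after those two edits (based only on the snippet quoted above; the surrounding script may differ between PDI versions):

    Code:
    case `uname -s` in
        AIX)
            ARCH=`uname -p`          # changed from `uname -m`; on this box -p returns "powerpc"
            case $ARCH in
                ppc)
                    LIBPATH=$CURRENTDIR/../libswt/aix/
                    ;;
                powerpc)             # changed from ppc64 so it matches the `uname -p` output
                    LIBPATH=$CURRENTDIR/../libswt/aix64/
                    ;;
                *)
                    echo "I'm sorry, this AIX platform [$ARCH] is not yet supported!"
                    exit
                    ;;
            esac
            ;;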

  5. #5
    Join Date
    Aug 2013
    Posts
    100

    Default

    Thanks a lot for your quick response. I have changed `uname -m` to `uname -p` as you mentioned. Please tell me what to do next.

    Quote Originally Posted by gutlez View Post
    I would suggest you try editing the spoon.sh file.

    In the section I quoted above, change `uname -m` to `uname -p`.
    I would also change ppc64 to powerpc.

    That should allow pan to start.

  6. #6
    Join Date
    Apr 2008
    Posts
    4,696

    Default

    Did you also change the line that read:

    ppc64)
    to
    powerpc)

    as I suggested?

    After that, what does pan say when you try to run it?
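
    For reference, a typical headless Pan run looks something like the sketch below (the install directory and transformation path are placeholders, not taken from this thread; -file and -level are standard pan.sh options):

    Code:
    cd /path/to/data-integration                       # directory containing pan.sh (placeholder)
    ./pan.sh -file=/path/to/your_transformation.ktr -level=Basic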

  7. #7
    Join Date
    Aug 2013
    Posts
    100

    Default

    I have replaced ppc64 with powerpc and ran the command below (I am using pdi-ce-6.1):

    ./pan.sh -file=/usr/local/stack/etl/user_act.ktr

    I am also getting the error below:

    find: 0652-017 -maxdepth is not a valid option.
    10:27:21,880 INFO [KarafInstance]
    *******************************************************************************
    *** Karaf Instance Number: 1 at /usr/local/stack/britam/data-integration/./ ***
    *** system/karaf/caches/pan/data-1 ***
    *** Karaf Port:8802 ***
    *** OSGI Service Port:9051 ***
    *******************************************************************************
    10:27:21,881 INFO [KarafBoot] Checking to see if org.pentaho.clean.karaf.cache is enabled
    Feb 13, 2017 10:27:22 AM org.apache.karaf.main.Main launch
    INFO: Installing and starting initial bundles
    Feb 13, 2017 10:27:22 AM org.apache.karaf.main.Main launch
    INFO: All initial bundles installed and set to start
    Feb 13, 2017 10:27:22 AM org.apache.karaf.main.Main$KarafLockCallback lockAquired
    INFO: Lock acquired. Setting startlevel to 100
    Creating configuration from org.apache.karaf.command.acl.jaas.cfg
    Creating configuration from org.apache.karaf.jaas.cfg
    Creating configuration from org.apache.karaf.command.acl.feature.cfg
    Creating configuration from org.apache.karaf.log.cfg
    Creating configuration from jmx.acl.osgi.compendium.cm.cfg
    Creating configuration from org.apache.karaf.features.repos.cfg
    Creating configuration from org.ops4j.pax.web.cfg
    Creating configuration from org.apache.karaf.features.obr.cfg
    Creating configuration from jmx.acl.org.apache.karaf.bundle.cfg
    Creating configuration from org.apache.activemq.webconsole.cfg
    Creating configuration from org.apache.karaf.features.cfg
    Creating configuration from org.apache.karaf.management.cfg
    Creating configuration from jmx.acl.org.apache.karaf.config.cfg
    Creating configuration from org.apache.karaf.command.acl.kar.cfg
    Creating configuration from org.apache.karaf.command.acl.config.cfg
    Creating configuration from org.apache.karaf.webconsole.cfg
    Creating configuration from jmx.acl.org.apache.karaf.security.jmx.cfg
    Creating configuration from org.apache.karaf.command.acl.shell.cfg
    Creating configuration from org.pentaho.caching-default.cfg
    Creating configuration from org.apache.activemq.server-default.cfg
    Creating configuration from org.apache.felix.fileinstall-deploy.cfg
    Creating configuration from org.apache.karaf.shell.cfg
    Creating configuration from mondrian.cfg
    Creating configuration from org.ops4j.pax.logging.cfg
    Creating configuration from org.ops4j.pax.url.mvn.cfg
    Creating configuration from org.apache.karaf.kar.cfg
    Creating configuration from org.apache.karaf.command.acl.bundle.cfg
    Creating configuration from org.apache.karaf.command.acl.scope_bundle.cfg
    Creating configuration from jmx.acl.cfg
    Creating configuration from jmx.acl.java.lang.Memory.cfg
    Creating configuration from org.apache.karaf.command.acl.system.cfg
    Creating configuration from org.pentaho.features.cfg
    2017/02/13 10:27:24 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
    Feb 13, 2017 10:27:28 AM org.pentaho.caching.impl.PentahoCacheManagerFactory$RegistrationHandler$1 onSuccess
    INFO: New Caching Service registered
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/usr/local/stack/britam/data-integration/launcher/../lib/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/usr/local/stack/britam/data-integration/plugins/pentaho-big-data-plugin/lib/slf4j-log4j12-1.7.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    10:27:29,563 ERROR [BlueprintContainerImpl] Unable to start blueprint container for bundle pentaho-hadoop-shims-mapr-osgi-jaas
    org.osgi.service.blueprint.container.ComponentDefinitionException: Error when instantiating bean .component-1 of class com.pentaho.big.data.bundles.impl.shim.common.ShimBridgingClassloader
    at org.apache.aries.blueprint.container.BeanRecipe.getInstance(BeanRecipe.java:315)
    at org.apache.aries.blueprint.container.BeanRecipe.internalCreate2(BeanRecipe.java:806)
    at org.apache.aries.blueprint.container.BeanRecipe.internalCreate(BeanRecipe.java:787)
    at org.apache.aries.blueprint.di.AbstractRecipe$1.call(AbstractRecipe.java:79)
    at java.util.concurrent.FutureTask.run(FutureTask.java:274)
    at org.apache.aries.blueprint.di.AbstractRecipe.create(AbstractRecipe.java:88)
    at org.apache.aries.blueprint.container.BlueprintRepository.createInstances(BlueprintRepository.java:245)
    at org.apache.aries.blueprint.container.BlueprintRepository.createAll(BlueprintRepository.java:183)
    at org.apache.aries.blueprint.container.BlueprintContainerImpl.instantiateEagerComponents(BlueprintContainerImpl.java:682)
    at org.apache.aries.blueprint.container.BlueprintContainerImpl.doRun(BlueprintContainerImpl.java:377)
    at org.apache.aries.blueprint.container.BlueprintContainerImpl.run(BlueprintContainerImpl.java:269)
    at org.apache.aries.blueprint.container.BlueprintExtender.createContainer(BlueprintExtender.java:294)
    at org.apache.aries.blueprint.container.BlueprintExtender.createContainer(BlueprintExtender.java:263)
    at org.apache.aries.blueprint.container.BlueprintExtender.modifiedBundle(BlueprintExtender.java:253)
    at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$Tracked.customizerModified(BundleHookBundleTracker.java:500)
    at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$Tracked.customizerModified(BundleHookBundleTracker.java:433)
    at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$AbstractTracked.track(BundleHookBundleTracker.java:725)
    at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$Tracked.bundleChanged(BundleHookBundleTracker.java:463)
    at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$BundleEventHook.event(BundleHookBundleTracker.java:422)
    at org.apache.felix.framework.util.SecureAction.invokeBundleEventHook(SecureAction.java:1127)
    at org.apache.felix.framework.util.EventDispatcher.createWhitelistFromHooks(EventDispatcher.java:696)
    at org.apache.felix.framework.util.EventDispatcher.fireBundleEvent(EventDispatcher.java:484)
    at org.apache.felix.framework.Felix.fireBundleEvent(Felix.java:4429)
    at org.apache.felix.framework.Felix.startBundle(Felix.java:2100)
    at org.apache.felix.framework.BundleImpl.start(BundleImpl.java:976)
    at org.apache.felix.framework.BundleImpl.start(BundleImpl.java:963)
    at org.apache.karaf.features.internal.FeaturesServiceImpl.cleanUpOnFailure(FeaturesServiceImpl.java:531)
    at org.apache.karaf.features.internal.FeaturesServiceImpl.installFeatures(FeaturesServiceImpl.java:478)
    at org.apache.karaf.features.internal.BootFeaturesInstaller.installBootFeatures(BootFeaturesInstaller.java:92)
    at org.apache.karaf.features.internal.BootFeaturesInstaller$1.run(BootFeaturesInstaller.java:71)
    Caused by: java.lang.NoClassDefFoundError: com.sun.security.auth.login.ConfigFile
    at java.lang.Class.forNameImpl(Native Method)
    at java.lang.Class.forName(Class.java:309)
    at com.pentaho.big.data.bundles.impl.shim.common.ShimBridgingClassloader.create(ShimBridgingClassloader.java:61)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)
    at java.lang.reflect.Method.invoke(Method.java:620)
    at org.apache.aries.blueprint.utils.ReflectionUtils.invoke(ReflectionUtils.java:297)
    at org.apache.aries.blueprint.container.BeanRecipe.invoke(BeanRecipe.java:958)
    at org.apache.aries.blueprint.container.BeanRecipe.getInstance(BeanRecipe.java:313)
    ... 29 more
    Caused by: java.lang.ClassNotFoundException
    at com.pentaho.big.data.bundles.impl.shim.common.ShimBridgingClassloader.findClass(ShimBridgingClassloader.java:127)
    at java.lang.ClassLoader.loadClassHelper(ClassLoader.java:786)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:764)
    at com.pentaho.big.data.bundles.impl.shim.common.ShimBridgingClassloader.loadClass(ShimBridgingClassloader.java:180)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:741)
    ... 39 more
    10:27:29,694 ERROR [BootFeaturesInstaller] Error installing boot features
    java.lang.Exception: Could not start bundle mvn:pentaho/pdi-dataservice-server-plugin/6.1.0.1-196 in feature(s) pdi-dataservice-6.1.0.1-196: Unresolved constraint in bundle pdi-dataservice-server-plugin [59]: Unable to resolve 59.0: missing requirement [59.0] osgi.wiring.package; (osgi.wiring.package=org.eclipse.swt.custom)
    at org.apache.karaf.features.internal.FeaturesServiceImpl.startBundle(FeaturesServiceImpl.java:504)
    at org.apache.karaf.features.internal.FeaturesServiceImpl.installFeatures(FeaturesServiceImpl.java:459)
    at org.apache.karaf.features.internal.BootFeaturesInstaller.installBootFeatures(BootFeaturesInstaller.java:92)
    at org.apache.karaf.features.internal.BootFeaturesInstaller$1.run(BootFeaturesInstaller.java:71)
    Caused by: org.osgi.framework.BundleException: Unresolved constraint in bundle pdi-dataservice-server-plugin [59]: Unable to resolve 59.0: missing requirement [59.0] osgi.wiring.package; (osgi.wiring.package=org.eclipse.swt.custom)
    at org.apache.felix.framework.Felix.resolveBundleRevision(Felix.java:4002)
    at org.apache.felix.framework.Felix.startBundle(Felix.java:2045)
    at org.apache.felix.framework.BundleImpl.start(BundleImpl.java:976)
    at org.apache.felix.framework.BundleImpl.start(BundleImpl.java:963)
    at org.apache.karaf.features.internal.FeaturesServiceImpl.startBundle(FeaturesServiceImpl.java:501)
    ... 3 more
    Creating configuration from pentaho.metaverse.cfg
    Creating configuration from pentaho.geo.roles.cfg

    Please advise how we can resolve this.

    Quote Originally Posted by gutlez View Post
    Did you also change the line that read:

    ppc64)
    to
    powerpc)

    as I suggested?

    After that, what does pan say when you try to run it?

  8. #8
    Join Date
    Apr 2008
    Posts
    4,696

    Default

    Unfortunately, you're now beyond the depth of what I can help with.

    You'll probably have to involve your system administrator.
