Hitachi Vantara Pentaho Community Forums

Thread: Kitchen.sh issue: Unknown error in KarafBlueprintWatcher

  1. #1
    Join Date
    Sep 2015
    Posts
    12

    Default PDI 6 Error

    I installed PDI 6.0 Community Edition; however, it gives the error below every time I execute kitchen.sh:
    Oct 27, 2015 7:24:30 AM org.apache.cxf.endpoint.ServerImpl initDestination
    INFO: Setting the server's publish address to be /lineage
    07:25:27,835 ERROR [KarafLifecycleListener] Error in Blueprint Watcher
    org.pentaho.osgi.api.IKarafBlueprintWatcher$BlueprintWatcherException: Unknown error in KarafBlueprintWatcher
    at org.pentaho.osgi.impl.KarafBlueprintWatcherImpl.waitForBlueprint(KarafBlueprintWatcherImpl.java:89)
    at org.pentaho.di.osgi.KarafLifecycleListener$2.run(KarafLifecycleListener.java:112)
    at java.lang.Thread.run(Thread.java:745)
    Caused by: org.pentaho.osgi.api.IKarafBlueprintWatcher$BlueprintWatcherException: Timed out waiting for blueprints to load: pentaho-big-data-api-pig,pentaho-big-data-kettle-plugins-pig
    at org.pentaho.osgi.impl.KarafBlueprintWatcherImpl.waitForBlueprint(KarafBlueprintWatcherImpl.java:77)
    ... 2 more

  2. #2
    Join Date
    Sep 2015
    Posts
    12

    Default Kitchen.sh issue: Unknown error in KarafBlueprintWatcher

    I installed PDI 6.0 Community Edition. It has many small issues when I execute kitchen.sh, most of which I have resolved. However, I am not able to resolve the issue below, and it does not occur every time, so I am not sure how to fix it. The good thing is that it does not stop the job execution, but it does increase the job execution time while it waits for some blueprint:
    06:32:55,241 ERROR [KarafLifecycleListener] Error in Blueprint Watcher
    org.pentaho.osgi.api.IKarafBlueprintWatcher$BlueprintWatcherException: Unknown error in KarafBlueprintWatcher
    at org.pentaho.osgi.impl.KarafBlueprintWatcherImpl.waitForBlueprint(KarafBlueprintWatcherImpl.java:89)
    at org.pentaho.di.osgi.KarafLifecycleListener$2.run(KarafLifecycleListener.java:112)
    at java.lang.Thread.run(Thread.java:745)
    Caused by: org.pentaho.osgi.api.IKarafBlueprintWatcher$BlueprintWatcherException: Timed out waiting for blueprints to load: pentaho-big-data-api-hdfs,pentaho-big-data-impl-vfs-hdfs,pentaho-big-data-impl-clusterTests
    at org.pentaho.osgi.impl.KarafBlueprintWatcherImpl.waitForBlueprint(KarafBlueprintWatcherImpl.java:77)
    ... 2 more

  3. #3
    Join Date
    Sep 2015
    Posts
    12

    Default

    Has anyone faced this issue? It takes about 1 minute before the error times out, which delays the job execution.

  4. #4

    Default

    I'm having a similar issue. The job executes successfully, but the script hangs.

  5. #5
    Join Date
    Dec 2009
    Posts
    332

    Default

    We had the same issue. Not sure if this is the right solution, but it worked:

    I removed the interior nodes of these two files in the classes directory, kettle-lifecycle-listeners.xml and kettle-registry-extensions.xml, so that they look like this:
    kettle-lifecycle-listeners.xml
    <listeners>
    </listeners>

    kettle-registry-extensions.xml
    <registry-extensions>
    </registry-extensions>

    Not entirely sure what it does, but it does seem to stop Karaf from doing anything at all, which is what we wanted.
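
    If you want to script the change, something like this should produce the same result (a minimal sketch, assuming the files live under <pdi_home>/classes; keep backups so you can revert):
    Code:
    # back up the two files, then leave only the empty root elements
    cd <pdi_home>/classes
    cp kettle-lifecycle-listeners.xml kettle-lifecycle-listeners.xml.bak
    cp kettle-registry-extensions.xml kettle-registry-extensions.xml.bak
    printf '<listeners>\n</listeners>\n' > kettle-lifecycle-listeners.xml
    printf '<registry-extensions>\n</registry-extensions>\n' > kettle-registry-extensions.xml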

  6. #6

    Default

    Same issue here. Is it possible to contact the devs?

  7. #7
    Join Date
    Aug 2011
    Posts
    236

    Default

    Hi khelms,

    Thanks for the tip, it worked for me. I do still get:

    log4j:ERROR Could not create an Appender. Reported error follows.
    java.lang.ClassCastException: org.apache.log4j.ConsoleAppender cannot be cast to org.apache.log4j.Appender

    but the log4j configuration (log4j.xml in the classes dir) looks a bit more involved.

    Any ideas?
    PDI 8.0.0
    MySQL - 5.6.27
    Redshift - 1.0.1485
    PostgreSQL 8.0.2
    OS - Ubuntu 10.04.2

  8. #8
    Join Date
    Mar 2016
    Posts
    6

    Default

    Hi khelms,

    I have tried your suggestions, but I'm still unable to resolve the issue.

    Is there any other solution? How can we suppress these logs, and when does this occur?


  9. #9
    Join Date
    Jan 2006
    Posts
    245

    Default

    Hi all,

    I had the same problem in my PDI 6.1 CE installation: a long startup time and then an exception with this error message:


    Timed out waiting for blueprints to load: data-refinery-pdi-plugin


    The solution was really simple. The problem is caused by the fact that in CE the data refinery plugin is not present (I imagine...), but the entry in the Karaf configuration to load the plugin is mistakenly present. To solve it, open the file


    <pdi_home>/system/karaf/etc/org.apache.karaf.features.cfg


    and change line 31 from


    featuresBoot=config,pentaho-client,pentaho-metaverse,pdi-dataservice,pdi-data-refinery


    to


    featuresBoot=config,pentaho-client,pentaho-metaverse,pdi-dataservice


    Save and close the file. Then delete the contents of the directory <pdi_home>/system/karaf/caches and restart Spoon. Your PDI CE installation will go back to starting as fast as expected.
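
    If you prefer to do the same thing from a shell, a sketch along these lines should be equivalent (my shorthand for the manual steps above; it assumes GNU sed and a standard <pdi_home> layout, so back the file up first):
    Code:
    # back up the features file, then drop the pdi-data-refinery entry from featuresBoot
    cd <pdi_home>/system/karaf
    cp etc/org.apache.karaf.features.cfg etc/org.apache.karaf.features.cfg.bak
    sed -i 's/,pdi-data-refinery//' etc/org.apache.karaf.features.cfg
    # clear the Karaf cache so the change is picked up at the next start
    rm -rf caches/*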

    Hope this helps!
    Follow Me on Twitter: sramazzina
    My Skype account: sramazzina
    My Blog
    View my profile on LinkedIn: http://www.linkedin.com/in/sramazzina
    Author of Pentaho Data Integration Kitchen How-To and Pentaho Business Analytics Cookbook

    Join us on IRC server Freenode.net, channel ##pentaho ##saiku

  10. #10
    Join Date
    May 2010
    Posts
    8

    Default

    Thanks sramazzina! This worked on our PDI 6.1 EE version and eliminated the 2-minute delay (it would eventually time out) on all of our jobs. That has been a hassle for quite a while.
    Enterprise 6.1

  11. #11
    Join Date
    Jan 2006
    Posts
    245

    Default

    Quote Originally Posted by nmiles View Post
    Thanks sramazzina! This worked on our PDI 6.1 EE version and eliminated the 2-minute delay (it would eventually time out) on all of our jobs. That has been a hassle for quite a while.
    Glad to know that helped you in solving the issue!

    Best

    Sergio
    Follow Me on Twitter: sramazzina
    My Skype account: sramazzina
    My Blog
    View my profile on LinkedIn: http://www.linkedin.com/in/sramazzina
    Author of Pentaho Data Integration Kitchen How-To and Pentaho Business Analytics Cookbook

    Join us on IRC server Freenode.net, channel ##pentaho ##saiku

  12. #12

    Default

    Good morning to everybody,

    I hope that someone can help me. I've followed the instructions to remove the data-refinery-pdi-plugin, but I keep getting the following error:

    Code:
    DEBUG: _PENTAHO_JAVA_HOME=/opt/java/jdk1.8.0_102/
    DEBUG: _PENTAHO_JAVA=/opt/java/jdk1.8.0_102//bin/java
    Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
    11:54:04,592 INFO  [KarafInstance] 
    *******************************************************************************
    *** Karaf Instance Number: 1 at /usr/local/data-integration_6_1/./system/ka ***
    ***   raf/caches/default/data-1                                             ***
    *** Karaf Port:8802                                                         ***
    *** OSGI Service Port:9051                                                  ***
    *******************************************************************************
    11:54:04,593 INFO  [KarafBoot] Checking to see if org.pentaho.clean.karaf.cache is enabled
    ott 19, 2016 11:54:05 AM org.apache.karaf.main.Main$KarafLockCallback lockAquired
    INFORMAZIONI: Lock acquired. Setting startlevel to 100
    2016/10/19 11:54:05 - Kitchen - Avvio dell'esecuzione.
    2016/10/19 11:54:05 - RepositoriesMeta - Lettura del file XML del repository: /home/kettle/.kettle/repositories.xml
    2016/10/19 11:54:07 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
    ott 19, 2016 11:54:10 AM org.pentaho.caching.impl.PentahoCacheManagerFactory$RegistrationHandler$1 onSuccess
    INFORMAZIONI: New Caching Service registered
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/usr/local/data-integration_6_1/launcher/../lib/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/usr/local/data-integration_6_1/plugins/pentaho-big-data-plugin/lib/slf4j-log4j12-1.7.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    11:56:10,434 ERROR [KarafLifecycleListener] Error in Blueprint Watcher
    org.pentaho.osgi.api.IKarafBlueprintWatcher$BlueprintWatcherException: Unknown error in KarafBlueprintWatcher
             at org.pentaho.osgi.impl.KarafBlueprintWatcherImpl.waitForBlueprint(KarafBlueprintWatcherImpl.java:103)
             at org.pentaho.di.osgi.KarafLifecycleListener$2.run(KarafLifecycleListener.java:161)
             at java.lang.Thread.run(Thread.java:745)
    Caused by: org.pentaho.osgi.api.IKarafBlueprintWatcher$BlueprintWatcherException: Timed out waiting for blueprints to load: pentaho-big-data-api-hdfs,pentaho-big-data-impl-vfs-hdfs,pentaho-big-data-impl-clusterTests
             at org.pentaho.osgi.impl.KarafBlueprintWatcherImpl.waitForBlueprint(KarafBlueprintWatcherImpl.java:88)
             ... 2 more
    What am I missing?

    Thank you in advance, any help would be appreciated.

  13. #13
    Join Date
    Jan 2006
    Posts
    245

    Default

    Quote Originally Posted by alessio.missio View Post
    Good morning to everybody,

    I hope that someone can help me. I've followed the instructions to remove the data-refinery-pdi-plugin, but I keep getting the following error:

    Code:
    DEBUG: _PENTAHO_JAVA_HOME=/opt/java/jdk1.8.0_102/
    DEBUG: _PENTAHO_JAVA=/opt/java/jdk1.8.0_102//bin/java
    Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
    11:54:04,592 INFO  [KarafInstance] 
    *******************************************************************************
    *** Karaf Instance Number: 1 at /usr/local/data-integration_6_1/./system/ka ***
    ***   raf/caches/default/data-1                                             ***
    *** Karaf Port:8802                                                         ***
    *** OSGI Service Port:9051                                                  ***
    *******************************************************************************
    11:54:04,593 INFO  [KarafBoot] Checking to see if org.pentaho.clean.karaf.cache is enabled
    ott 19, 2016 11:54:05 AM org.apache.karaf.main.Main$KarafLockCallback lockAquired
    INFORMAZIONI: Lock acquired. Setting startlevel to 100
    2016/10/19 11:54:05 - Kitchen - Avvio dell'esecuzione.
    2016/10/19 11:54:05 - RepositoriesMeta - Lettura del file XML del repository: /home/kettle/.kettle/repositories.xml
    2016/10/19 11:54:07 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
    ott 19, 2016 11:54:10 AM org.pentaho.caching.impl.PentahoCacheManagerFactory$RegistrationHandler$1 onSuccess
    INFORMAZIONI: New Caching Service registered
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/usr/local/data-integration_6_1/launcher/../lib/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/usr/local/data-integration_6_1/plugins/pentaho-big-data-plugin/lib/slf4j-log4j12-1.7.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    11:56:10,434 ERROR [KarafLifecycleListener] Error in Blueprint Watcher
    org.pentaho.osgi.api.IKarafBlueprintWatcher$BlueprintWatcherException: Unknown error in KarafBlueprintWatcher
             at org.pentaho.osgi.impl.KarafBlueprintWatcherImpl.waitForBlueprint(KarafBlueprintWatcherImpl.java:103)
             at org.pentaho.di.osgi.KarafLifecycleListener$2.run(KarafLifecycleListener.java:161)
             at java.lang.Thread.run(Thread.java:745)
    Caused by: org.pentaho.osgi.api.IKarafBlueprintWatcher$BlueprintWatcherException: Timed out waiting for blueprints to load: pentaho-big-data-api-hdfs,pentaho-big-data-impl-vfs-hdfs,pentaho-big-data-impl-clusterTests
             at org.pentaho.osgi.impl.KarafBlueprintWatcherImpl.waitForBlueprint(KarafBlueprintWatcherImpl.java:88)
             ... 2 more
    What am I missing?

    Thank you in advance, any help would be appreciated.
    Which version of Pentaho are you using? It seems you're trying to load big data features that are not available in your PDI installation. It is the same problem as with data refinery, but targeting other plugins.
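
    As a first check (only a sketch; whether any entry can safely be removed depends on which plugins are actually present in your installation), you can look at the feature list Karaf is told to boot and clear its cache:
    Code:
    # show the boot feature list the Blueprint watcher is waiting for
    grep '^featuresBoot' <pdi_home>/system/karaf/etc/org.apache.karaf.features.cfg
    # a stale Karaf cache can also cause blueprint timeouts; clear it and retry
    rm -rf <pdi_home>/system/karaf/caches/*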

    S
    Follow Me on Twitter: sramazzina
    My Skype account: sramazzina
    My Blog
    View my profile on LinkedIn: http://www.linkedin.com/in/sramazzina
    Author of Pentaho Data Integration Kitchen How-To and Pentaho Business Analytics Cookbook

    Join us on IRC server Freenode.net, channel ##pentaho ##saiku

  14. #14

    Default

    Thanks for the reply Sergio,

    I'm using Pentaho 6.1 free edition.

    Is there any shell command to view the version under Linux?

    Thanks again!

  15. #15
    Join Date
    Jan 2006
    Posts
    245

    Default

    Where did you get this version? Did you compile it yourself? It seems strange...
    Follow Me on Twitter: sramazzina
    My Skype account: sramazzina
    My Blog
    View my profile on LinkedIn: http://www.linkedin.com/in/sramazzina
    Author of Pentaho Data Integration Kitchen How-To and Pentaho Business Analytics Cookbook

    Join us on IRC server Freenode.net, channel ##pentaho ##saiku

  16. #16

    Default

    Hi Sergio,

    no, it's just the open-source version of PDI. Is there a command I can run under Linux to view the PDI version?

    Thanks again
    A

  17. #17

    Default

    To find the version, use the following command:

    ./kitchen.sh -version

  18. #18

    Default

    Hi Sergio,

    here is the version:
    Code:
    kettle@srv2:~$ /usr/local/data-integration_6_1/kitchen.sh -version
    DEBUG: _PENTAHO_JAVA_HOME=/opt/java/jdk1.8.0_102/
    DEBUG: _PENTAHO_JAVA=/opt/java/jdk1.8.0_102//bin/java
    Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
    08:04:19,134 INFO  [KarafInstance]
    *******************************************************************************
    *** Karaf Instance Number: 2 at /usr/local/data-integration_6_1/./system/ka ***
    ***   raf/caches/default/data-4                                             ***
    *** Karaf Port:8803                                                         ***
    *** OSGI Service Port:9052                                                  ***
    *******************************************************************************
    08:04:19,136 INFO  [KarafBoot] Checking to see if org.pentaho.clean.karaf.cache is enabled
    ott 20, 2016 8:04:20 AM org.apache.karaf.main.Main$KarafLockCallback lockAquired
    INFORMAZIONI: Lock acquired. Setting startlevel to 100
    2016/10/20 08:04:20 - Kitchen - Versione di Kettle 6.1.0.1-196, build 1, data di build: 2016-04-07 12.08.49

    Thanks for your efforts

    A.

  19. #19

    Default

    Good morning everybody,

    does anyone have any idea how I could resolve this problem?

    Thank you very much
    A.

  20. #20
    Join Date
    Oct 2016
    Posts
    3

    Default

    Hi,

    I am also using 6.1 and I have tried the fix above, but the Blueprint Watcher message still keeps appearing every now and then.

    Any other suggestions?

    Kind Regards,

    geales

  21. #21
    Join Date
    Jan 2017
    Posts
    1

    Default

    We are on Windows 2012 R2 with Kettle 6.1 CE and have started getting this issue. It was fine for a few months and then all of a sudden it started happening.
    I tried all the solutions mentioned here and restarted the machine a few times, but it still happens sporadically.
    The job was scheduled with the Windows Task Scheduler every 5 minutes; changing it to every 10 minutes still did not help.
    Has anyone been able to solve it for good?
    Thanks so much for your help,
    dubeysom

  22. #22
    Join Date
    Nov 2014
    Posts
    4

    Default

    This worked for me on Pentaho 7 as well. Thanks a lot.

    Quote Originally Posted by sramazzina View Post
    Hi all,

    I had the same problem in my PDI 6.1 CE installation: a long startup time and then an exception with this error message:


    Timed out waiting for blueprints to load: data-refinery-pdi-plugin


    The solution was really simple. The problem is caused by the fact that in CE the data refinery plugin is not present (I imagine...), but the entry in the Karaf configuration to load the plugin is mistakenly present. To solve it, open the file


    <pdi_home>/system/karaf/etc/org.apache.karaf.features.cfg


    and change line 31 from


    featuresBoot=config,pentaho-client,pentaho-metaverse,pdi-dataservice,pdi-data-refinery


    to


    featuresBoot=config,pentaho-client,pentaho-metaverse,pdi-dataservice


    Save and close the file. Then delete the contents of the directory <pdi_home>/system/karaf/caches and restart Spoon. Your PDI CE installation will go back to starting as fast as expected.

    Hope this helps!
