Hitachi Vantara Pentaho Community Forums

Thread: How to avoid useless log lines?

  1. #1
    Join Date
    Nov 2013
    Posts
    382

    Default How to avoid useless log lines?

    Using PDI V7, regardless of the log level value we get about 100 lines of useless info on every execution, making it hard to read and track job runs in the log file. Is there any way to avoid them?

    Code:
    #######################################################################
    WARNING:  no libwebkitgtk-1.0 detected, some features will be unavailable
        Consider installing the package with apt-get or yum.
        e.g. 'sudo apt-get install libwebkitgtk-1.0-0'
    #######################################################################
    OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
    11:00:03,752 INFO  [KarafBoot] Checking to see if org.pentaho.clean.karaf.cache is enabled
    11:00:03,878 INFO  [KarafInstance] 
    *******************************************************************************
    *** Karaf Instance Number: 1 at /srv/repositori/programa/data-integration/. ***
    ***   /system/karaf/caches/kitchen/data-1                                   ***
    *** Karaf Port:8802                                                         ***
    *** OSGI Service Port:9051                                                  ***
    *******************************************************************************
    mar 10, 2017 11:00:04 AM org.apache.karaf.main.Main$KarafLockCallback lockAquired
    INFORMACIÓN: Lock acquired. Setting startlevel to 100
    2017/03/10 11:00:05 - Kitchen - Logging is at level : Minimal
    2017/03/10 11:00:05 - Kitchen - Start of run.
    2017-03-10 11:00:09.586:INFO:oejs.Server:jetty-8.1.15.v20140411
    2017-03-10 11:00:09.659:INFO:oejs.AbstractConnector:Started NIOSocketConnectorWrapper@0.0.0.0:9051
    mar 10, 2017 11:00:10 AM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register
    INFORMACIÓN: Registered blueprint namespace handler for http://cxf.apache.org/blueprint/core
    mar 10, 2017 11:00:10 AM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register
    INFORMACIÓN: Registered blueprint namespace handler for http://cxf.apache.org/configuration/beans
    ... and so on up to 100 lines
    Thanks

  2. #2
    Join Date
    Aug 2016
    Posts
    290

    Default

    What I have done is not log anything except errors. In addition to that, I have custom log steps (write to file) to write anything other than errors to log file, which means you can make it as clean as you wish.

  3. #3
    Join Date
    Nov 2013
    Posts
    382

    Default

    Sparkles,
    with -level=Error I get exactly 108 useless lines and not even the name of the job being executed!

    Look at the V4 log file: I can see job start/end times as well as the transformations executed. In V7 I get the same info plus 108 useless lines. There must be a way to avoid them.

    Code:
    INFO  10-03 13:16:15,183 - Kitchen - Start=2017/03/10 13:16:12.314, Stop=2017/03/10 13:16:15.183
    INFO  10-03 13:16:26,310 - Kitchen - Logging is at level : Minimal logging
    INFO  10-03 13:16:26,311 - Kitchen - Start of run.
    INFO  10-03 13:16:26,335 - RepositoriesMeta - Reading repositories XML file: /home/pentajo/.kettle/repositories.xml
    INFO  10-03 13:16:26,833 - MyRepo/MyJob- Start of job execution
    Loading transformation with ID : /MyRepo/MyTrans1.ktr
    Loading transformation with ID : /MyRepo/MyTrans2.ktr
    INFO  10-03 13:16:29,917 - MyRepo/MyJob - Job execution finished
    INFO  10-03 13:16:29,918 - Kitchen - Finished!
    INFO  10-03 13:16:29,919 - Kitchen - Start=2017/03/10 13:16:26.311, Stop=2017/03/10 13:16:29.918
    INFO  10-03 13:16:29,919 - Kitchen - Processing ended after 3 seconds.
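
    A stopgap until there is a real fix: filter Kitchen's output in the shell so only Kettle's own log lines reach the file. This is just a sketch; the timestamp pattern is inferred from the V7 log lines in the first post, and the kitchen.sh invocation is hypothetical.

```shell
# Sketch: keep only lines in Kettle's own "YYYY/MM/DD hh:mm:ss - ..." format,
# dropping the Karaf/CXF/Jetty chatter. Pattern inferred from the logs above.
filter_kettle_log() {
  grep -E '^[0-9]{4}/[0-9]{2}/[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2} - '
}

# Hypothetical usage -- adjust repository/job names and paths to your setup:
# ./kitchen.sh -rep=MyRepo -job=MyJob -level=Minimal 2>&1 | filter_kettle_log >> job.log
```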

  4. #4
    Join Date
    May 2014
    Posts
    358

    Default

    The Kettle logging level controls only what Kettle itself logs. The useless lines come from some Apache libraries. There has to be some configuration file somewhere, if not anywhere under data-integration, then maybe hidden in one of the jars?

  5. #5
    Join Date
    Nov 2013
    Posts
    382

    Default

    Quote Originally Posted by Lukfi View Post
    The Kettle logging level controls only what Kettle itself logs. The useless lines come from some Apache libraries. There has to be some configuration file somewhere, if not anywhere under data-integration, then maybe hidden in one of the jars?

    I found this related post but I'm unsure what to do.

    kettle-lifecycle-listeners:
    Code:
    <listeners>
      <listener id="PdiOsgiBridge">
        <description>PDI-OSGI-Bridge Listener</description>
        <tooltip/>
        <classname>org.pentaho.di.osgi.KettleLifeCycleAdapter</classname>
        <meta-classname/>
        <version-browser-classname/>
      </listener>
    </listeners>
    kettle-registry-extensions:
    Code:
    <registry-extensions>
      <registry-extension id="PdiOsgiBridge">
        <description>PDI-OSGI-Bridge Extension</description>
        <tooltip/>
        <classname>org.pentaho.di.osgi.registryExtension.OSGIPluginRegistryExtension</classname>
        <meta-classname/>
        <version-browser-classname/>
      </registry-extension>
    </registry-extensions>
    Would it be safe to simply delete the <classname>... line? Or should I delete the whole file (renaming it to old_xxx)?

    Thanks
    Last edited by DepButi; 03-13-2017 at 05:43 AM.

  6. #6
    Join Date
    Nov 2013
    Posts
    382

    Default

    Almost there. I edited the files so that only the first/last lines remain: <listeners></listeners> and <registry-extensions></registry-extensions>

    Now I get an error message 44 lines long. Any idea how to improve on that?

    Code:
    log4j:ERROR Could not create an Appender. Reported error follows.
    java.lang.ClassCastException: org.apache.log4j.ConsoleAppender cannot be cast to org.apache.log4j.Appender
    	at org.apache.log4j.xml.DOMConfigurator.parseAppender(DOMConfigurator.java:248)
    	at org.apache.log4j.xml.DOMConfigurator.findAppenderByName(DOMConfigurator.java:176)
    	at org.apache.log4j.xml.DOMConfigurator.findAppenderByReference(DOMConfigurator.java:191)
    	at org.apache.log4j.xml.DOMConfigurator.parseChildrenOfLoggerElement(DOMConfigurator.java:523)
    	at org.apache.log4j.xml.DOMConfigurator.parseCategory(DOMConfigurator.java:436)
    	at org.apache.log4j.xml.DOMConfigurator.parse(DOMConfigurator.java:1004)
    	at org.apache.log4j.xml.DOMConfigurator.doConfigure(DOMConfigurator.java:872)
    	at org.apache.log4j.xml.DOMConfigurator.doConfigure(DOMConfigurator.java:755)
    	at org.apache.log4j.xml.DOMConfigurator.configure(DOMConfigurator.java:896)
    	at org.pentaho.di.core.logging.log4j.Log4jLogging.applyLog4jConfiguration(Log4jLogging.java:81)
    	at org.pentaho.di.core.logging.log4j.Log4jLogging.createLogger(Log4jLogging.java:89)
    	at org.pentaho.di.core.logging.log4j.Log4jLogging.init(Log4jLogging.java:68)
    	at org.pentaho.di.core.KettleClientEnvironment.initLogginPlugins(KettleClientEnvironment.java:141)
    	at org.pentaho.di.core.KettleClientEnvironment.init(KettleClientEnvironment.java:104)
    	at org.pentaho.di.kitchen.Kitchen$1.call(Kitchen.java:91)
    	at org.pentaho.di.kitchen.Kitchen$1.call(Kitchen.java:84)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
    log4j:ERROR Could not create an Appender. Reported error follows.
    java.lang.ClassCastException: org.apache.log4j.ConsoleAppender cannot be cast to org.apache.log4j.Appender
    	at org.apache.log4j.xml.DOMConfigurator.parseAppender(DOMConfigurator.java:248)
    	at org.apache.log4j.xml.DOMConfigurator.findAppenderByName(DOMConfigurator.java:176)
    	at org.apache.log4j.xml.DOMConfigurator.findAppenderByReference(DOMConfigurator.java:191)
    	at org.apache.log4j.xml.DOMConfigurator.parseChildrenOfLoggerElement(DOMConfigurator.java:523)
    	at org.apache.log4j.xml.DOMConfigurator.parseRoot(DOMConfigurator.java:492)
    	at org.apache.log4j.xml.DOMConfigurator.parse(DOMConfigurator.java:1006)
    	at org.apache.log4j.xml.DOMConfigurator.doConfigure(DOMConfigurator.java:872)
    	at org.apache.log4j.xml.DOMConfigurator.doConfigure(DOMConfigurator.java:755)
    	at org.apache.log4j.xml.DOMConfigurator.configure(DOMConfigurator.java:896)
    	at org.pentaho.di.core.logging.log4j.Log4jLogging.applyLog4jConfiguration(Log4jLogging.java:81)
    	at org.pentaho.di.core.logging.log4j.Log4jLogging.createLogger(Log4jLogging.java:89)
    	at org.pentaho.di.core.logging.log4j.Log4jLogging.init(Log4jLogging.java:68)
    	at org.pentaho.di.core.KettleClientEnvironment.initLogginPlugins(KettleClientEnvironment.java:141)
    	at org.pentaho.di.core.KettleClientEnvironment.init(KettleClientEnvironment.java:104)
    	at org.pentaho.di.kitchen.Kitchen$1.call(Kitchen.java:91)
    	at org.pentaho.di.kitchen.Kitchen$1.call(Kitchen.java:84)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)

  7. #7
    Join Date
    Nov 2013
    Posts
    382

    Default

    After several tests we decided not to touch anything.

    Without the extensions we get the 44-line error.
    Without the listeners, cron jobs work but interactive use doesn't recognize repositories, so in fact we cannot work.

    If anyone ever finds out how to solve it, we will appreciate any feedback. As it is now, the logs are extremely hard to follow when an error is suspected, and our admins used to check them from time to time. With dozens of cron jobs running it's almost impossible to tell which message comes from which job, as they are mixed in with the useless lines.

  8. #8
    Join Date
    Sep 2016
    Posts
    8

    Default

    Quote Originally Posted by DepButi View Post
    After several tests we decided not to touch anything.

    Without the extensions we get the 44-line error.
    Without the listeners, cron jobs work but interactive use doesn't recognize repositories, so in fact we cannot work.

    If anyone ever finds out how to solve it, we will appreciate any feedback. As it is now, the logs are extremely hard to follow when an error is suspected, and our admins used to check them from time to time. With dozens of cron jobs running it's almost impossible to tell which message comes from which job, as they are mixed in with the useless lines.
    Hi, I'm working on solving this too. I'm running the jobs and transformations from SQL Server Agent jobs, but they show too much information. Did you find any solution? I have tried everything to fix this "problem".

  9. #9
    Join Date
    Feb 2017
    Posts
    14

    Default

    I believe what you're pointing to are the logs written by Karaf, not PDI. There might be a setting in the system/karaf folder to turn off this info.

  10. #10
    Join Date
    Nov 2013
    Posts
    382

    Default

    Quote Originally Posted by tkaszuba View Post
    I believe what you're pointing to are the logs written by Karaf, not PDI. There might be a setting in the system/karaf folder to turn off this info.
    Most probably. But we were not able to find it. And honestly, it's creating some angst among our admins.

  11. #11
    Join Date
    Sep 2016
    Posts
    8

    Default

    I have successfully eliminated a couple of lines by modifying

    featuresBoot=config,pentaho-client,pentaho-metaverse,pdi-dataservice,pdi-data-refinery,pdi-marketplace,community-edition

    to

    featuresBoot=config,pentaho-client,pentaho-metaverse,pdi-dataservice,pdi-data-refinery,community-edition

    on the file ...\data-integration\system\karaf\etc\org.apache.karaf.features.cfg
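
    The same edit as a one-liner, in case it helps. Only a sketch: the file path and the pdi-marketplace feature name are taken from above, everything else (function name, backup suffix) is assumed.

```shell
# Sketch: drop the pdi-marketplace feature from Karaf's featuresBoot list in
# .../data-integration/system/karaf/etc/org.apache.karaf.features.cfg
remove_marketplace_feature() {
  # $1 = path to org.apache.karaf.features.cfg; a .bak backup is kept.
  sed -i.bak 's/^\(featuresBoot=.*\),pdi-marketplace\(.*\)$/\1\2/' "$1"
}

# Hypothetical usage -- adjust the path to your install:
# remove_marketplace_feature data-integration/system/karaf/etc/org.apache.karaf.features.cfg
```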

  12. #12
    Join Date
    Nov 2013
    Posts
    382

    Default

    I cannot see any difference; there are still 100+ useless lines for every job.

  13. #13
    Join Date
    Sep 2016
    Posts
    8

    Default

    Quote Originally Posted by DepButi View Post
    I cannot see any difference; there are still 100+ useless lines for every job.
    You have to clean

    ...\pdi-ce-7.0.0.0-25\data-integration\system\karaf\caches

    every time you make a change.

  14. #14
    Join Date
    Nov 2013
    Posts
    382

    Default

    Quote Originally Posted by jdrageme01 View Post
    You have to clean

    ...\pdi-ce-7.0.0.0-25\data-integration\system\karaf\caches

    every time you make a change.
    What exactly does "clean" mean? There are quite a few (sub)folders and files in there.

    Code:
    \karaf\caches
        \spoon
            \data-1
                \cache
                    \bundle0
                        bundle.id
                    ... several hundred \bundlen folders
                    cache.lock
                \generated-bundles
                \kar
                \log
                \txlog
                PortsAssigned.txt
        \kitchen
            (same structure as \spoon)

  15. #15
    Join Date
    Sep 2016
    Posts
    8

    Default

    Quote Originally Posted by DepButi View Post
    What exactly does "clean" mean? There are quite a few (sub)folders and files in there.

    Code:
    \karaf\caches
        \spoon
            \data-1
                \cache
                    \bundle0
                        bundle.id
                    ... several hundred \bundlen folders
                    cache.lock
                \generated-bundles
                \kar
                \log
                \txlog
                PortsAssigned.txt
        \kitchen
            (same structure as \spoon)
    Just delete everything inside the folder (...\pdi-ce-7.0.0.0-25\data-integration\system\karaf\caches\); it will be regenerated.
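
    For the record, a sketch of that cleanup step. The caches path is the one discussed above; the function name and argument handling are assumed. Per the note above, everything inside is regenerated on the next Spoon/Kitchen start.

```shell
# Sketch: clear the Karaf bundle caches so config changes take effect.
clear_karaf_caches() {
  # $1 = path to data-integration/system/karaf/caches; the contents are
  # regenerated on the next start, so deleting them is safe per the thread.
  rm -rf "${1:?caches dir required}"/*
}

# Hypothetical usage -- adjust the path to your install:
# clear_karaf_caches data-integration/system/karaf/caches
```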

  16. #16
    Join Date
    Dec 2016
    Posts
    8

    Default

    I was able to eliminate

    Code:
    11:00:03,752 INFO  [KarafBoot] Checking to see if org.pentaho.clean.karaf.cache is enabled
    11:00:03,878 INFO  [KarafInstance]
    *******************************************************************************
    *** Karaf Instance Number: 1 at /srv/repositori/programa/data-integration/. ***
    ***   /system/karaf/caches/kitchen/data-1                                   ***
    *** Karaf Port:8802                                                         ***
    *** OSGI Service Port:9051                                                  ***
    *******************************************************************************

    by modifying classes/log4j.xml, changing the Threshold param of the console appender:

    Code:
    <!-- ============================== -->
    <!-- Append messages to the console -->
    <!-- ============================== -->
    <appender name="PENTAHOCONSOLE" class="org.apache.log4j.ConsoleAppender">
      <param name="Target" value="System.out"/>
      <param name="Threshold" value="ERROR"/>
      <layout class="org.apache.log4j.PatternLayout">
        <!-- The default pattern: Date Priority [Category] Message\n -->
        <param name="ConversionPattern" value="%d{ABSOLUTE} %-5p [%c{1}] %m%n"/>
      </layout>
    </appender>

    The Threshold param is INFO by default; changing it to ERROR makes the log messages above go away.
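
    That Threshold change can also be scripted. A sketch only: note it would hit every Threshold param set to INFO in the file, so hand-editing is safer if there are several appenders. The function name and backup suffix are assumptions.

```shell
# Sketch: raise the appender threshold from INFO to ERROR in
# data-integration/classes/log4j.xml (file named in the post).
# Caution: replaces EVERY Threshold param currently set to INFO.
raise_console_threshold() {
  # $1 = path to log4j.xml; a .bak backup is kept.
  sed -i.bak 's|<param name="Threshold" value="INFO"/>|<param name="Threshold" value="ERROR"/>|' "$1"
}

# Hypothetical usage:
# raise_console_threshold data-integration/classes/log4j.xml
```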


    EDIT: jdrageme01's method does work. The log file is much cleaner after removing pdi-marketplace.
    Last edited by bdodson; 04-13-2017 at 05:15 PM.

  17. #17
    Join Date
    Nov 2013
    Posts
    382

    Default

    Everything seems to work fine and the log files are now again readable.

    A great thanks from our admins !!

    EDIT: add ...

    Now if the duplicate SLF4J binding were corrected, it would be perfect:

    Code:
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/srv/repositori/programa/data-integration/launcher/../lib/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/srv/repositori/programa/data-integration/plugins/pentaho-big-data-plugin/lib/slf4j-log4j12-1.7.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    Last edited by DepButi; 04-18-2017 at 04:40 AM.
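
    A possible workaround for the duplicate binding, not confirmed in this thread: SLF4J uses only one of the two jars it finds (the warning itself says which), so renaming the older slf4j-log4j12 jar shipped with the big-data plugin should silence it. A sketch only; verify the path from the warning against your install first, and note the jar is renamed, not deleted, so the change is easy to revert.

```shell
# Sketch (unconfirmed workaround): disable the duplicate SLF4J binding jar
# by renaming it, so only one StaticLoggerBinder remains on the classpath.
disable_duplicate_slf4j_binding() {
  # $1 = path to the duplicate slf4j-log4j12 jar; renamed, not deleted.
  if [ -f "$1" ]; then
    mv "$1" "$1.disabled"
  fi
}

# Hypothetical usage (path taken from the SLF4J warning above):
# disable_duplicate_slf4j_binding \
#   data-integration/plugins/pentaho-big-data-plugin/lib/slf4j-log4j12-1.7.3.jar
```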

  18. #18
    Join Date
    May 2016
    Posts
    1

    Default

    http://jira.pentaho.com/browse/PDI-15426 provides some tips for trimming down the amount of logging. Vote, comment, watch, etc. to get Pentaho to look at it.


Copyright © 2005 - 2019 Hitachi Vantara Corporation. All Rights Reserved.