Hitachi Vantara Pentaho Community Forums

Thread: Enterprise Repository connection API

  1. #1
    Join Date
    Sep 2012
    Posts
    15

    Default Enterprise Repository connection API

    Hello, could anyone kindly provide an example of a Java class that connects to the enterprise repository and launches a job?
    For example, the following code achieves that purpose (from an Italian blog: http://musarra.wordpress.com/2011/06...b-by-java-api/ ), but in my experience it works only for Kettle repositories of the database or file type.
    I apologize in advance if you don't deal with enterprise features in this forum.
    Thanks.


    Code:
    package it.dontesta.lab.jobs.kettle;
    
    import org.pentaho.di.core.KettleEnvironment;
    import org.pentaho.di.core.logging.LogChannel;
    import org.pentaho.di.core.logging.LogChannelInterface;
    import org.pentaho.di.core.plugins.PluginRegistry;
    import org.pentaho.di.core.plugins.RepositoryPluginType;
    import org.pentaho.di.job.Job;
    import org.pentaho.di.job.JobMeta;
    import org.pentaho.di.repository.RepositoriesMeta;
    import org.pentaho.di.repository.Repository;
    import org.pentaho.di.repository.RepositoryDirectoryInterface;
    import org.pentaho.di.repository.RepositoryMeta;
    
    /**
     * @author amusarra
     */
    public class ExecuteGenericJob {
        public static final String STRING_LOG = "ExecuteGenericJob";
    
        public static void main(String[] args) throws Exception {
            LogChannelInterface log = new LogChannel(STRING_LOG);
            String filename = args[0];
    
            // Initialize the Kettle environment (registers core and repository plugins).
            KettleEnvironment.init();
    
            // Read the repository definitions from repositories.xml.
            RepositoriesMeta repositoriesMeta = new RepositoriesMeta();
            repositoriesMeta.readData();
    
            // Find the repository named "1" and load the matching repository plugin.
            RepositoryMeta repositoryMeta = repositoriesMeta.findRepository("1");
            PluginRegistry registry = PluginRegistry.getInstance();
            Repository repository = registry.loadClass(RepositoryPluginType.class,
                    repositoryMeta, Repository.class);
    
            log.logBasic("Repository Description: "
                    + repositoryMeta.getDescription());
    
            repository.init(repositoryMeta);
            repository.connect("admin", "admin");
    
            // Locate the directory that contains the job.
            RepositoryDirectoryInterface directory = repository
                    .loadRepositoryDirectoryTree();
            directory = directory.findDirectory("ShirusKettleDirectory");
    
            // Load the job definition directly from the repository
            // (no need to instantiate a JobMeta first).
            JobMeta jobMeta = repository.loadJob(filename, directory, null, null);
    
            log.logBasic("JobMeta Description: " + jobMeta.getDescription());
            log.logBasic("JobMeta Version: " + jobMeta.getJobversion());
            log.logBasic("JobMeta Modify Date: " + jobMeta.getModifiedDate());
            log.logBasic("JobMeta Id: " + jobMeta.getObjectId().getId());
    
            // Run the job and wait for it to finish.
            Job job = new Job(repository, jobMeta);
            log.logBasic("Job Name: " + job.getJobname());
    
            job.start();
            job.waitUntilFinished();
    
            if (job.getErrors() != 0) {
                log.logError("Job Error: " + job.getErrors());
                log.logError("Error encountered!");
            }
        }
    }

  2. #2
    Join Date
    Nov 1999
    Posts
    9,729

    Default

    The code is the same and it looks good (apart from the username/password, that is). You just have to make sure you have the EE repository plugin in the plugins/ folder.
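
    If your application does not run from the data-integration folder, you can also point Kettle at the plugin folder explicitly. A minimal sketch, assuming the KETTLE_PLUGIN_BASE_FOLDERS property supported by the PDI plugin registry; the path shown is an example, adjust it to your installation:
    Code:
        // Hedged sketch: register the PDI plugin folder (which must contain the
        // EE repository plugin) before initializing the Kettle environment.
        // The path below is an assumption, not a given.
        System.setProperty("KETTLE_PLUGIN_BASE_FOLDERS",
                "C:/pentaho/design-tools/data-integration/plugins");
        KettleEnvironment.init();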

  3. #3
    Join Date
    Mar 2011
    Posts
    3

    Cool

    Hi Penty,
    I answered the question about the Enterprise Repository on my blog; see the full answer at http://musarra.wordpress.com/2011/06.../#comment-1125
    Unfortunately it is written in Italian. However, I am attaching the code needed to connect to an Enterprise-type repository.

    Bye,
    Antonio.

    Code:
    // Define the repository location
    PurRepositoryLocation purRepLocation = new PurRepositoryLocation(
            "http://localhost:9080/pentaho-di");
    
    // Define the repository metadata
    PurRepositoryMeta repositoryMeta = new PurRepositoryMeta(
            "PentahoEnterpriseRepository", "2",
            "My Pentaho Enterprise Repository", purRepLocation, false);
    
    // Instantiate an Enterprise repository
    PurRepository repository = new PurRepository();
    
    log.logBasic("Repository Description: "
            + repositoryMeta.getDescription());
    
    repository.init(repositoryMeta);
    repository.connect("joe", "password");

  4. #4
    Join Date
    Sep 2012
    Posts
    15

    Default

    Thanks for the answer.
    Yes, the default username/password for the enterprise repository are "joe"/"password".
    OK, I confirm that the code works (apart from the problem that follows), provided you copy the "plugins" folder (from [pentaho installation folder]\design-tools\data-integration) either into the Eclipse project or into the ".kettle" directory from which all configuration files are loaded at run time (for me, C:\Users\m.pavone\.kettle).
    There is another way to create the connection to the enterprise repository: substitute the following code for the first part of what was already posted:
    Code:
    PurRepositoryLocation purreplocation = new PurRepositoryLocation("http://localhost:9080/pentaho-di");
    PurRepositoryMeta repositoriesMeta = new PurRepositoryMeta("PentahoEnterpriseRepository", "2", "2", purreplocation, false);
    PluginRegistry registry = PluginRegistry.getInstance();
    PurRepository repository = new PurRepository();
    log.logBasic("Repository Description: " + repositoriesMeta.getDescription());
    repository.init(repositoriesMeta);
    repository.connect("joe", "password");
    The problem is that in both cases I now get the following exception:

    Exception in thread "main" org.pentaho.di.core.exception.KettleException:
    com.pentaho.commons.dsc.c: license missing, invalid, or expired
    license missing, invalid, or expired

    at org.pentaho.di.repository.pur.PurRepository.connect(SourceFile:307)
    at com.ExecuteGenericJob.main(ExecuteGenericJob.java:74)
    Caused by: com.pentaho.commons.dsc.c: license missing, invalid, or expired
    at com.pentaho.commons.dsc.g.a(SourceFile:48)
    at org.pentaho.di.repository.pur.PurRepositoryMeta.getRepositoryCapabilities(SourceFile:70)
    at org.pentaho.di.repository.BaseRepositorySecurityProvider.<init>(BaseRepositorySecurityProvider.java:28)
    at org.pentaho.di.repository.pur.d.<init>(SourceFile:20)
    at org.pentaho.di.repository.pur.u.<init>(SourceFile:15)
    at org.pentaho.di.repository.pur.PurRepository.connect(SourceFile:282)
    ... 1 more


    Now, I do have the license and it is not expired. Can you help me understand which license file is searched for, and where, so that I can put it in the right path?
    Thank you.

  5. #5
    Join Date
    Sep 2012
    Posts
    15

    Default

    Hi amusarra,
    we have already discussed this on your blog; I'm Marco.
    Thanks again, but I'm still dealing with this license issue.
    I sent an email to enterprise support, without any reply so far.
    I have all the licenses, but I get the aforesaid exception and I don't understand why. I copied the .installedLicenses.xml file almost everywhere on my PC :-) without any result.

  6. #6
    Join Date
    Sep 2012
    Posts
    15

    Default

    Looking at the log of the data-integration-server (inside the Tomcat folder), I noticed this exception occurring when I call the job from my Java class:

    Code:
    INFO  18-09 13:52:15,791 - RepositoriesMeta - Reading repositories XML file: C:\.kettle\repositories.xml
    java.lang.Throwable: Missing, invalid, or expired product license: Pentaho Hadoop Enterprise Edition
    ERROR 18-09 13:52:15,869 - Error loading plugin: HIVE/Hadoop Hive{class org.pentaho.di.core.plugins.DatabasePluginType}
    org.pentaho.di.core.exception.KettlePluginException: 
    Unexpected error loading class:
    Missing, invalid, or expired product license: Pentaho Hadoop Enterprise Edition
    
        at org.pentaho.di.core.plugins.PluginRegistry.loadClass(PluginRegistry.java:381)
        at org.pentaho.di.core.plugins.PluginRegistry.loadClass(PluginRegistry.java:240)
        at org.pentaho.di.core.database.DatabaseMeta.getDatabaseInterfacesMap(DatabaseMeta.java:1302)
        at org.pentaho.di.core.database.DatabaseMeta.findDatabaseInterface(DatabaseMeta.java:488)
        at org.pentaho.di.core.database.DatabaseMeta.getDatabaseInterface(DatabaseMeta.java:461)
        at org.pentaho.di.core.database.DatabaseMeta.setValues(DatabaseMeta.java:536)
        at org.pentaho.di.core.database.DatabaseMeta.setDefault(DatabaseMeta.java:412)
        at org.pentaho.di.core.database.DatabaseMeta.<init>(DatabaseMeta.java:402)
        at org.pentaho.di.core.database.DatabaseMeta.<init>(DatabaseMeta.java:864)
        at org.pentaho.di.repository.RepositoriesMeta.readData(RepositoriesMeta.java:217)
        at com.pentaho.pdi.ws.RepositorySyncWebService.sync(SourceFile:29)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at com.sun.xml.ws.api.server.InstanceResolver$1.invoke(InstanceResolver.java:246)
        at com.sun.xml.ws.server.InvokerTube$2.invoke(InvokerTube.java:146)
        at com.sun.xml.ws.server.sei.EndpointMethodHandler.invoke(EndpointMethodHandler.java:257)
        at com.sun.xml.ws.server.sei.SEIInvokerTube.processRequest(SEIInvokerTube.java:95)
        at com.sun.xml.ws.api.pipe.Fiber.__doRun(Fiber.java:629)
        at com.sun.xml.ws.api.pipe.Fiber._doRun(Fiber.java:588)
        at com.sun.xml.ws.api.pipe.Fiber.doRun(Fiber.java:573)
        at com.sun.xml.ws.api.pipe.Fiber.runSync(Fiber.java:470)
        at com.sun.xml.ws.server.WSEndpointImpl$2.process(WSEndpointImpl.java:295)
        at com.sun.xml.ws.transport.http.HttpAdapter$HttpToolkit.handle(HttpAdapter.java:515)
        at com.sun.xml.ws.transport.http.HttpAdapter.handle(HttpAdapter.java:285)
        at com.sun.xml.ws.transport.http.servlet.ServletAdapter.handle(ServletAdapter.java:143)
        at com.sun.xml.ws.transport.http.servlet.WSServletDelegate.doGet(WSServletDelegate.java:155)
        at com.sun.xml.ws.transport.http.servlet.WSServletDelegate.doPost(WSServletDelegate.java:189)
        at org.pentaho.platform.web.servlet.PentahoWSSpringServlet.doPost(PentahoWSSpringServlet.java:99)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:637)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:77)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:84)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:378)
        at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
        at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
        at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
        at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
        at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
        at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
        at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
        at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
        at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
        at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174)
        at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
        at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
        at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
        at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
        at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
        at org.pentaho.platform.web.http.filters.HttpSessionPentahoSessionIntegrationFilter.doFilter(HttpSessionPentahoSessionIntegrationFilter.java:264)
        at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
        at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
        at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
        at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
        at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:175)
        at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at com.pentaho.ui.servlet.SystemStatusFilter.doFilter(SourceFile:43)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:113)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
        at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:470)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
        at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:857)
        at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
        at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
        at java.lang.Thread.run(Unknown Source)
    My job is a very simple example and does not use Hadoop. Do you know why I get this exception?
    I have the following licenses installed:
    - Pentaho Analysis Enterprise Edition
    - Pentaho BI Platform Enterprise Edition
    - Pentaho Dashboard Designer
    - Pentaho PDI Enterprise Edition

    Do I need another license to invoke a job using the Pentaho Java API? If I connect to the repository via Spoon, I get no exceptions!
    Is there a way to disable the loading of the Hadoop plugins for remote invocations?
    Thanks to all.

  7. #7
    Join Date
    Sep 2012
    Posts
    15

    Default

    PROBLEM SOLVED

    To invoke a job saved in the Enterprise Repository from a standalone Java application using Pentaho's Java API, you must set the system property "pentaho.installed.licenses.file" to the path of the .installedLicenses.xml file.
    It is sufficient to add an instruction like the following near the beginning of the Java class discussed above:
    System.setProperty("pentaho.installed.licenses.file", "[right_path]\\.installedLicenses.xml");

    Some notes:
    1) the .installedLicenses.xml file is located in the root of the Pentaho installation directory;
    2) the value of this property must be the same as that of the essential variable PENTAHO_INSTALLED_LICENSE_PATH. For example, having already set it, I wrote something like this:
    System.setProperty("pentaho.installed.licenses.file", System.getProperty("PENTAHO_INSTALLED_LICENSE_PATH"));
    3) no Hadoop licenses are required;
    4) I successfully tested this remote invocation as a standalone Java application. Deploying it on an application server produces other problems.

    The body of a class that does the work can be something like this:
    Code:
            System.out.println("KETTLE_HOME: " + System.getProperty("KETTLE_HOME"));
            System.out.println("PENTAHO_INSTALLED_LICENSE_PATH: " + System.getProperty("PENTAHO_INSTALLED_LICENSE_PATH"));
            
            System.setProperty("pentaho.installed.licenses.file", System.getProperty("PENTAHO_INSTALLED_LICENSE_PATH"));
            
            LogChannelInterface log = new LogChannel("launchJob");
            RepositoriesMeta repositoriesMeta = new RepositoriesMeta();
            
            try {
                System.out.println("initializing kettleEnvironment");
                KettleEnvironment.init();
    
                System.out.println("reading repositories data...");
                repositoriesMeta.readData();
                System.out.println("repositoriesMeta.getXML(): " + repositoriesMeta.getXML());
                System.out.println("repositories found: " + repositoriesMeta.nrRepositories());
                
                System.out.println("loading repository meta...");
                RepositoryMeta repositoryMeta = repositoriesMeta.findRepository("repository_name"); //inside <name>...</name> in repositories.xml
                if(repositoryMeta==null) {
                    System.out.println("repositoryMeta is null.");
                }
                System.out.println("Repository Description: " + repositoryMeta.getDescription());
                
                PluginRegistry registry = PluginRegistry.getInstance();
                if(registry==null) {
                    System.out.println("registry is null");
                }
                
                System.out.println("loading repository plugin");
                Repository repository = registry.loadClass(RepositoryPluginType.class,repositoryMeta, Repository.class);
        
                System.out.println("initializing repository");
                repository.init(repositoryMeta);
                
                System.out.println("connecting to repository");
                repository.connect("joe", "password");
    
                System.out.println("loading repository tree");
                RepositoryDirectoryInterface directory = repository.loadRepositoryDirectoryTree();
                
                System.out.println("finding directory");
                directory = directory.findDirectory("home/joe");
                logJobsInDir(repository,directory);
                
                System.out.println("loading job meta...");
                JobMeta jobMeta = repository.loadJob("job_name", directory, null, null);
                if(jobMeta != null) {
                    System.out.println("JobMeta Description: " + jobMeta.getDescription());
                    System.out.println("JobMeta Version: " + jobMeta.getJobversion());
                    System.out.println("JobMeta Modify Date: " + jobMeta.getModifiedDate());
                    System.out.println("JobMeta Id: " + jobMeta.getObjectId().getId());
                } else {
                    System.out.println("jobMeta is null");
                }
                
                Job job = new Job(repository, jobMeta);
                System.out.println("Job Name: " + job.getJobname());
        
                System.out.println("starting job...");
                job.start();
                job.waitUntilFinished();
        
                if (job.getErrors() != 0) {
                    System.out.println("Job Error: " + job.getErrors());
                    System.out.println("Job execution ends with errors");
                } else {
                    System.out.println("Job execution successfully terminated!");
                }
            } catch (KettleSecurityException ke) {
                System.out.println("catching KettleSecurityException");
                ke.printStackTrace();
                throw new RuntimeException(ke); // substitute your own exception type here
            } catch (KettlePluginException e) {
                System.out.println("catching KettlePluginException");
                e.printStackTrace();
                throw new RuntimeException(e); // substitute your own exception type here
            } catch (KettleException ke) {
                System.out.println("catching KettleException");
                ke.printStackTrace();
                throw new RuntimeException(ke); // substitute your own exception type here
            } catch (Throwable t) {
                System.out.println("catching Throwable");
                t.printStackTrace();
                throw new RuntimeException(t); // substitute your own exception type here
            }
    
        }
        
        private void logJobsInDir(Repository repository, RepositoryDirectoryInterface directory) throws KettleException {
            String[] jobnames = repository.getJobNames(directory.getObjectId(), false);
            System.out.println("found jobs:");
            for (String name : jobnames) {
                System.out.println(name);
            }
        }
    
    }

    Remember to do the following (a quick pre-flight check for this setup is sketched after the list):
    1) define the environment variables:
    - KETTLE_HOME (it must point to a folder containing a ".kettle" folder, which in turn contains the repositories.xml file and the "plugins" folder that you have to copy there from pentaho\design-tools\data-integration);
    - PENTAHO_INSTALLED_LICENSE_PATH (it must point to the .installedLicenses.xml file);
    2) include in the classpath:
    kettle-core.jar (from pentaho\design-tools\data-integration\lib)
    kettle-db.jar (from pentaho\design-tools\data-integration\lib)
    kettle-engine.jar (from pentaho\design-tools\data-integration\lib)
    javassist.jar (from pentaho\design-tools\data-integration\libext)
    scannotation-1.0.2.jar (from pentaho\design-tools\data-integration\libext)
    commons-vfs-1.0.jar (e.g. from pentaho\server\enterprise-console\lib)
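    A hedged pre-flight sketch for the checklist above (assumptions: the two values are passed either as -D system properties or as environment variables, and this runs before KettleEnvironment.init()):
    Code:
            // Hedged sketch: fail fast if the setup described above is incomplete.
            // Accepts KETTLE_HOME and PENTAHO_INSTALLED_LICENSE_PATH either as
            // -D system properties or as environment variables.
            String kettleHome = System.getProperty("KETTLE_HOME", System.getenv("KETTLE_HOME"));
            String licensePath = System.getProperty("PENTAHO_INSTALLED_LICENSE_PATH",
                    System.getenv("PENTAHO_INSTALLED_LICENSE_PATH"));
            if (kettleHome == null || !new java.io.File(kettleHome, ".kettle/repositories.xml").isFile()) {
                throw new IllegalStateException("KETTLE_HOME must contain .kettle/repositories.xml");
            }
            if (licensePath == null || !new java.io.File(licensePath).isFile()) {
                throw new IllegalStateException("PENTAHO_INSTALLED_LICENSE_PATH must point to .installedLicenses.xml");
            }
            System.setProperty("pentaho.installed.licenses.file", licensePath);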
    Last edited by Penty; 10-16-2012 at 10:25 AM.

  8. #8
    Join Date
    Jun 2013
    Posts
    6

    Default

    Matt,
    The following code is what I am trying to make work by connecting to the local repository. The repository plugin is in the plugins folder too.

    Code:
    package job;
    
    import org.pentaho.di.core.KettleEnvironment;
    import org.pentaho.di.core.plugins.PluginRegistry;
    import org.pentaho.di.core.plugins.RepositoryPluginType;
    import org.pentaho.di.job.Job;
    import org.pentaho.di.job.JobMeta;
    import org.pentaho.di.repository.RepositoriesMeta;
    import org.pentaho.di.repository.Repository;
    import org.pentaho.di.repository.RepositoryDirectoryInterface;
    import org.pentaho.di.repository.RepositoryMeta;
    
    public class JobReaderRepository {
        public static void main(String[] args) throws Exception {
            System.out.println("KETTLE_HOME: " + System.getProperty("KETTLE_HOME"));
            System.out.println("PENTAHO_INSTALLED_LICENSE_PATH: " + System.getProperty("PENTAHO_INSTALLED_LICENSE_PATH"));
    
            System.setProperty("pentaho.installed.licenses.file", System.getProperty("PENTAHO_INSTALLED_LICENSE_PATH"));
    
            RepositoriesMeta repositoriesMeta = new RepositoriesMeta();
            KettleEnvironment.init();
            System.out.println("reading repositories data...");
    
            repositoriesMeta.readData();
            System.out.println("repositories found: " + repositoriesMeta.nrRepositories());
            System.out.println("loading repository meta...");
            RepositoryMeta repositoryMeta = repositoriesMeta.findRepository("BIRT_DEV_REP"); // inside <name>...</name> in repositories.xml
    
            if (repositoryMeta == null) {
                System.out.println("repositoryMeta is null.");
            } else {
                System.out.println("Repository Description: " + repositoryMeta.getDescription());
            }
            PluginRegistry registry = PluginRegistry.getInstance();
    
            System.out.println("loading repository plugin");
            Repository repository = registry.loadClass(RepositoryPluginType.class, repositoryMeta, Repository.class);
    
            System.out.println("initializing repository");
            repository.init(repositoryMeta);
    
            System.out.println("connecting to repository");
            repository.connect("joe", "password");
    
            System.out.println("loading repository tree");
            RepositoryDirectoryInterface directory = repository.loadRepositoryDirectoryTree();
        }
    }




    Exception in thread "main"
    org.pentaho.di.core.exception.KettleException: com.pentaho.commons.dsc.d: license missing, invalid, or expiredlicense missing, invalid, or expired


    at org.pentaho.di.repository.pur.PurRepository.connect(SourceFile:338)at job.JobReaderRepository.main(
    JobReaderRepository.java:44)Caused by: com.pentaho.commons.dsc.d: license missing, invalid, or expiredat com.pentaho.commons.dsc.l.a(SourceFile:63)at org.pentaho.di.repository.pur.PurRepositoryMeta.getRepositoryCapabilities(SourceFile:79)at org.pentaho.di.repository.BaseRepositorySecurityProvider.<init>(BaseRepositorySecurityProvider.java:38)at org.pentaho.di.repository.pur.e.<init>(SourceFile:27)at org.pentaho.di.repository.pur.v.<init>(SourceFile:22)at org.pentaho.di.repository.pur.PurRepository.connect(SourceFile:309)
    ... 1 more


    My license file is still valid for 15 more days. How can I solve the above issue and connect to a file in the repository?




  9. #9
    Join Date
    Jan 2015
    Posts
    2

    Default

    This was indicated as solved in 2012, with the operative code line being

    System.setProperty("pentaho.installed.licenses.file", "[right_path]\\.installedLicenses.xml");

    However, I just tried this on the 5.2 Enterprise DI trial version I received last week, and it does not work; it gives the same missing or expired license error.

    Has the truth changed?

  10. #10
    Join Date
    Jan 2015
    Posts
    2

    Default

    Problem solved, BUT:

    It turns out that the license file I had did not include the proper software item. My problem was that the error message did not indicate the exact nature of the failure. If there were actually TWO notifications -- one for a MISSING license file and one for an INVALID license file -- then I'd have saved about 8 hours of work.
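
    For anyone hitting the same wall, a hedged check can at least separate the two cases up front. A sketch only; it verifies that the file exists, not that it contains the right software item:
    Code:
        // Hedged sketch: distinguish a MISSING license file from an INVALID or
        // EXPIRED one before PurRepository.connect() collapses both into a
        // single "license missing, invalid, or expired" message.
        java.io.File lic = new java.io.File(
                System.getProperty("pentaho.installed.licenses.file", ""));
        if (!lic.isFile()) {
            System.err.println("License file MISSING: " + lic.getAbsolutePath());
        } else {
            System.out.println("License file found at " + lic.getAbsolutePath()
                    + "; if connect() still fails, the file is present but lacks a"
                    + " valid entry for the required product.");
        }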

  11. #11
    Join Date
    Feb 2015
    Posts
    3

    Default java.lang.NoClassDefFoundError: com/pentaho/pdi/ws/RepositorySyncException

    Hello Penty,

    I did all the things mentioned in your post and I am able to successfully access the Pentaho repository from a standalone Java application. But I get the above exception when I try to access the repository from a Java RESTful web service method; it occurs when I call repository.connect(username, password).

    org.pentaho.di.core.exception.KettleException:
    java.lang.NoClassDefFoundError: com/pentaho/pdi/ws/RepositorySyncException
    com/pentaho/pdi/ws/RepositorySyncException


    Quote Originally Posted by Penty View Post
    PROBLEM SOLVED

    To invoke a job saved in the Enterprise Repository from a standalone Java application using Pentaho's Java API, you must set the system property "pentaho.installed.licenses.file" to the path of the .installedLicenses.xml file.
    It is sufficient to add an instruction like the following near the beginning of the Java class discussed above:
    System.setProperty("pentaho.installed.licenses.file", "[right_path]\\.installedLicenses.xml");
    [...]
    Last edited by Praneeth423; 02-17-2015 at 06:40 PM.
