Introducing Sparkl

Let me start by putting this in perspective: in my opinion, Sparkl is the biggest, most exciting, game-changing Ctools release since CDE came out in 2009. As a curiosity, CDE was presented at PCM09 in Barcelona, and Sparkl was presented at PCM13 in Sintra. Just one more reason to attend the Pentaho Community Meetups!

We've always built a lot of Ctools. At some point we got tired of doing the same kind of work every time we wanted to create a new one, so a wild idea came up:

Hey - why don't we build a Ctool to help us build Ctools?

And... we did just that! Sparkl is a plugin builder with which one can very easily create new plugins / applications / whatever.



Sparkl structure


Sparkl, or Pentaho App Builder, is a plugin creation tool that sits on two major cornerstones of Pentaho: Ctools and PDI, aiming to leverage as much as possible of our existing stack. Anyone familiar with those tools will be able to build plugins, which makes adoption much easier.
Who's it for?

Building plugins has so far been within reach only of those with development knowledge. One would need to start a Java project, read the plugin documentation, write proper code, then compile, deploy and test the project.

This leaves out of the picture a lot of people who have interesting ideas for plugins that would help a particular customer scenario. These people are usually business consultants who lack Java knowledge but are extremely familiar with Pentaho implementations. Being familiar with PDI and Ctools, they have all the knowledge needed to create applications that impress their customers, with no compilation required!
Main concepts

Any plugin, like any web application, has a front end and a back end. This is a pretty obvious statement, and there are tons of ways to implement them, each with its own advantages.

But this is Pentaho. Is there an approach particularly well suited to it? We quickly came to the obvious realization that we already have amazing tools for both ends: Kettle is practically a visual programming language, and we have already proved that there are few limits to what CDE can do as a UI generator.



There are two main sections when we edit a Sparkl application: the metadata about the plugin and the definition of its elements.

There are two types of elements: Dashboards, for the front-end part of the application, and Endpoints, for the back-end logic.
Dashboards



When we filter on the Dashboards element type, we'll see how many screens our plugin has. The UIs are generated through CDE, and we can jump from Sparkl to CDE to edit our dashboard, exactly as if we were creating a regular dashboard.

If we create a dashboard called "myDashboard", it can be reached through the URL appName/myDashboard. Simply by calling appName/myDashboard?mode=edit we'll be able to edit the dashboard in CDE. The same applies to the other flags, &debug=true and &bypassCache=true.
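These URL conventions can be sketched with a small Python helper. The function name and the plugin name sparklDemo are just for illustration, not part of Sparkl itself:

```python
from urllib.parse import urlencode

def dashboard_url(app_name, dashboard, **flags):
    """Build a Sparkl dashboard URL, e.g. appName/myDashboard?mode=edit."""
    base = f"{app_name}/{dashboard}"
    return f"{base}?{urlencode(flags)}" if flags else base

# View, edit, and edit-with-debug variants of the same dashboard:
print(dashboard_url("sparklDemo", "myDashboard"))
# sparklDemo/myDashboard
print(dashboard_url("sparklDemo", "myDashboard", mode="edit"))
# sparklDemo/myDashboard?mode=edit
print(dashboard_url("sparklDemo", "myDashboard", mode="edit", debug="true", bypassCache="true"))
```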

Since the dashboards won't live in the solution repository, we don't (yet) have the same freedom to apply security roles as we do for dashboards in the solution repository. What we do instead is assume that a dashboard saved in a special subdirectory called _admin is accessible to administrators only. The same rule applies to endpoints.

When we create a new Dashboard, we can specify which template to use. The Sparkl default template will automatically generate a navigation bar to allow switching from dashboard to dashboard.

CDE is also able to detect the Sparkl application endpoints and treats them like any other regular data source.
Endpoints



At the core of Sparkl we have pluggable support for several element types. The original idea was to support back-end logic in JavaScript, Java, etc., but Kettle worked so well out of the box that we haven't yet felt the need for anything else.

The Kettle endpoints listed are literally .ktr and .kjb files in the plugin directory.



Sparkl relies on a lot of simple conventions. If a job / transformation is saved in a special directory (system/plugin/appName/endpoints/kettle/file.ktr), an HTTP endpoint will automatically be exposed at appName/file. Any parameters defined in our job / transformation will also be detected automatically, allowing calls like appName/file?paramfoo=bar for a defined parameter foo.
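The path-to-endpoint mapping can be illustrated with a short sketch. This mirrors the convention described above; the function is hypothetical, and myApp is just an example plugin name:

```python
from pathlib import PurePosixPath
from urllib.parse import urlencode

def endpoint_url(kettle_path, **params):
    """Map a Kettle file under endpoints/kettle/ to its exposed endpoint URL.

    e.g. system/plugin/myApp/endpoints/kettle/file.ktr -> myApp/file?paramfoo=bar
    (the 'param' prefix follows the convention described in the text).
    """
    parts = PurePosixPath(kettle_path).parts
    app_name = parts[parts.index("endpoints") - 1]  # directory right above endpoints/
    endpoint = PurePosixPath(kettle_path).stem      # filename without .ktr / .kjb
    url = f"{app_name}/{endpoint}"
    if params:
        url += "?" + urlencode({f"param{k}": v for k, v in params.items()})
    return url

print(endpoint_url("system/plugin/myApp/endpoints/kettle/file.ktr", foo="bar"))
# myApp/file?paramfoo=bar
```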

There are a few more goodies. If we want to use subtransformations without having them appear as registered endpoints, all we have to do is give them a name that starts with an underscore, e.g. _mySubtransformation.ktr.
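The underscore rule amounts to a simple filter over the plugin's Kettle files. A minimal sketch (the function name is ours, not Sparkl's):

```python
def visible_endpoints(filenames):
    """Keep only Kettle files that should be exposed as endpoints:
    names starting with an underscore (e.g. subtransformations) are skipped."""
    return [f for f in filenames
            if f.endswith((".ktr", ".kjb")) and not f.startswith("_")]

files = ["getData.ktr", "_mySubtransformation.ktr", "runJob.kjb", "notes.txt"]
print(visible_endpoints(files))  # ['getData.ktr', 'runJob.kjb']
```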

There are some very useful conventions around the output of transformations. Unlike a pure dashboard, which mostly deals with normal result sets, in a plugin we may want other kinds of output, such as:

  • Returning JSON
  • Returning XML
  • Downloading files
  • Displaying images
  • Generating bundles
  • Regular result sets

Sparkl tries to infer the correct mode, as described in the following picture:



We can also force a specific mode in the call by specifying the kettleOutput parameter:

  • kettleOutput=Infered: guess the appropriate output from the result type. This is the default.
  • kettleOutput=Json: standard CDA-like result set.
  • kettleOutput=ResultOnly: for jobs, just return the execution result.
  • kettleOutput=ResultFiles: returns the file stored in the result filename. If there is more than one, a zip file will be built. We can force a download by adding the parameter download=true.
  • kettleOutput=SingleCell: the raw content of the first column of the first row is returned. Useful for returning XML and custom JSON.
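Forcing an output mode is just another query parameter on the endpoint call. A small sketch of how such calls could be assembled; the helper and the myApp/getReport names are illustrative only:

```python
from urllib.parse import urlencode

def endpoint_call(app_name, endpoint, kettle_output=None, download=False, **params):
    """Build an endpoint URL, optionally forcing the kettleOutput mode."""
    query = {f"param{k}": v for k, v in params.items()}
    if kettle_output:
        query["kettleOutput"] = kettle_output
    if download:
        query["download"] = "true"
    url = f"{app_name}/{endpoint}"
    return f"{url}?{urlencode(query)}" if query else url

# Force a raw single-cell result (e.g. custom JSON built inside the transformation):
print(endpoint_call("myApp", "getReport", kettle_output="SingleCell"))
# myApp/getReport?kettleOutput=SingleCell

# Force the result files to be zipped and downloaded:
print(endpoint_call("myApp", "getReport", kettle_output="ResultFiles", download=True))
```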

Creating a plugin, step by step, in 5 minutes

Create a new plugin


Give it a name


Fill in the rest of the plugin info



Add an element


Add two dashboards

Create two dashboards with Sparkl default template


Edit the dashboards


Add a simple row with the string "Hello"

and for the other, do the same with "World"


Preview it



Send it to a friend


Submit it to the marketplace

Happy with the plugin? Submit it to the Pentaho Marketplace and let everyone take advantage of it!



We're sorry - we couldn't make it any simpler than this ;)

The principles around adding and using Kettle endpoints are the same, except that Sparkl generates a sample transformation or job on the file system, which then needs to be opened in Spoon.

In the future we hope to have the ability to communicate directly from Spoon to the BI Server.
A well defined development methodology

Sparkl is not only about technology, but also about defining a development methodology. See the Sparkl specification document and feel free to use it as a template for identifying the requirements for your own plugin.
How to get it

The usual place: the Marketplace or the Ctools Installer.
It's currently compatible with Pentaho 4.8 and will soon be compatible with 5.0.



More...