Greetings Kettle Devs

A few months ago my company began building an EII interface to
Kettle. In particular, we were looking to allow Kettle
transformations to be used in an EII way from more than just the
Pentaho runtime "Get Data From" component. This is useful in Pentaho
Report Designer along with other tools (Jasper/BIRT/etc).

The interface with the farthest-reaching compatibility turned out to
be, no surprise, JDBC. Tom Qin and I have worked on this driver and
have reached an initial version that is "baked enough" to use. You
can see an example here: http://

The basic idea is to point the JDBC driver at a directory of KTRs,
execute SQL like "select * from my_transformation.my_step", and get
the results back. The driver translates Kettle metadata
(transformations, steps, fields, data types) into JDBC metadata
(schemas, tables, columns, column types), parses the SQL, runs the
transformation, and returns the in-memory rows emitted by the named
step.
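From the client's point of view this is plain JDBC. A minimal sketch
of what usage might look like, assuming a hypothetical driver URL
scheme ("jdbc:kettle:<ktr directory>") and the transformation/step
names from above — none of these identifiers come from the actual
project:

```java
import java.sql.*;

public class KettleJdbcExample {

    // Build the SQL for "all rows emitted by a step": the driver maps
    // each transformation to a schema and each step to a table.
    static String selectAll(String transformation, String step) {
        return "SELECT * FROM " + transformation + "." + step;
    }

    public static void main(String[] args) throws Exception {
        // Point the driver at a directory of .ktr files. The URL format
        // here is an assumption for illustration only.
        String url = "jdbc:kettle:/path/to/ktr/directory";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                     selectAll("my_transformation", "my_step"))) {

            // Column metadata comes from the step's output fields.
            ResultSetMetaData md = rs.getMetaData();
            while (rs.next()) {
                for (int i = 1; i <= md.getColumnCount(); i++) {
                    System.out.print(rs.getString(i) + "\t");
                }
                System.out.println();
            }
        }
    }
}
```

Because the surface is standard java.sql, any tool that speaks JDBC
(Pentaho Report Designer, Jasper, BIRT, a SQL console) can consume
transformation output without knowing anything about Kettle.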

The project website is:

My question for the list is: is this useful enough to include in the
Kettle project? If others would find it useful and are willing to
help maintain it, we'd be happy to commit it to Kettle. If I hear
crickets in response to this email, I'll assume we're the only ones
who find this "interesting" and will continue to maintain it as a
standalone project.



You received this message because you are subscribed to the Google Groups "kettle-developers" group.