I've got Kettle installed on a CentOS client box that can access an HDP cluster. The box has no GUI, so I can't use Spoon.
I can run basic ETL jobs with Kitchen/Pan, but none of the Hadoop/HDP-related jobs work. My HBase Output and Hadoop Copy Files steps throw no errors, yet they do nothing.
I suspect the shims aren't loading at all: if I rename my pentaho-big-data-plugin folder to pentaho-asdfbig-data-plugin, the job behaves exactly the same, i.e. still no errors.
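
In case it's relevant, here's roughly how I'm checking which shim is configured and whether the plugin is picked up, assuming the stock data-integration layout (the job path is a placeholder):

    # Which shim should be active? This should name a folder under hadoop-configurations/
    grep active.hadoop.configuration data-integration/plugins/pentaho-big-data-plugin/plugin.properties

    # Run the job with debug logging and look for big-data plugin / shim load messages
    ./data-integration/kitchen.sh -file=/path/to/my_job.kjb -level=Debug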

I can run all of the jobs without issue on a single pseudo-distributed box (only the hostnames in the shim and cluster configs for the HBase steps are changed).
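
For context, the hostname edits I mean are in the shim's site files, something like the following, assuming the active shim folder is named hdp22 (yours may differ):

    # Under data-integration/plugins/pentaho-big-data-plugin/hadoop-configurations/hdp22/
    core-site.xml    <- fs.defaultFS points at the NameNode host
    hbase-site.xml   <- hbase.zookeeper.quorum lists the ZooKeeper host(s)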