hi all

I'm using Pentaho User Console 4.8, which talks to a MySQL database through the Pentaho metadata layer.

Sometimes users need to do cohort analysis. As an example of the cohorts they're experimenting with: the number of units each partner has sold, where 0-9, 10-49, 50-249, and 250+ map to low, med, high, and vhigh respectively. These mappings are prototype analyses, so it doesn't make sense to keep chopping and changing database columns for them.
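For reference, that mapping can be expressed as a single CASE expression in MySQL without touching the schema at all. This is just a sketch — the table and column names (`partner_sales`, `partner_id`, `units_sold`) are placeholders for whatever the real schema uses:

```sql
-- Bucket partners into cohorts by units sold (names are illustrative)
SELECT
  partner_id,
  CASE
    WHEN units_sold BETWEEN 0  AND 9   THEN 'low'
    WHEN units_sold BETWEEN 10 AND 49  THEN 'med'
    WHEN units_sold BETWEEN 50 AND 249 THEN 'high'
    ELSE 'vhigh'                            -- 250+
  END AS cohort
FROM partner_sales;
```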

Solution 1: the current workaround is to export all the necessary data to a CSV and process it in Excel. It takes ~30 minutes to stream the uncompressed data to the browser and then download it as a CSV. The same query run directly in MySQL with INTO OUTFILE takes 30s, plus 30s to zip the file and 10s to download it. Is it possible to replicate this "INTO OUTFILE, then download compressed" process from within PUC?
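For context, this is roughly what the fast manual version looks like when run straight against MySQL (file path, table, and column names are placeholders, not my actual query):

```sql
-- Dump the result set server-side as CSV (~30s on our data),
-- then zip and download it outside of PUC
SELECT partner_id, units_sold
INTO OUTFILE '/tmp/partner_units.csv'
  FIELDS TERMINATED BY ',' ENCLOSED BY '"'
  LINES TERMINATED BY '\n'
FROM partner_sales;
```

Note that INTO OUTFILE writes on the database server's filesystem, not the client's, which is part of why it's so much faster than streaming rows to the browser.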

Solution 2: is it possible to upload a new data source (a CSV containing the cohort mapping data) that is joinable to the current table?
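If that's possible, I'd picture the mapping CSV as a small range table joined with a BETWEEN condition, something like this (all names here are hypothetical — `cohort_map` would be the uploaded CSV with `lo`, `hi`, and `cohort` columns):

```sql
-- cohort_map rows (from the uploaded CSV):
--   lo,  hi,        cohort
--   0,   9,         low
--   10,  49,        med
--   50,  249,       high
--   250, 999999999, vhigh
SELECT s.partner_id, m.cohort
FROM partner_sales AS s
JOIN cohort_map   AS m
  ON s.units_sold BETWEEN m.lo AND m.hi;
```

The appeal is that analysts could re-upload a new CSV to change the bucket boundaries without anyone touching the database schema.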

Solution 3: could this be done as a calculated measure or calculated column in PUC?

Thanks for your help! I'd be happy with any other viable solution too.

Ben