I have successfully completed a proof of concept: I took the PCI, modified the database connections to point at my own MySQL database, and created various .xaction files, reports, and a dashboard. All is going well; it was a bit slow getting to this point, but after some trial and error things seem pretty good. Now I am ready to take this to the next step: I want to de-couple my Pentaho solution from the PCI. Is there any documentation that would help me do this? Currently I am opening the PCI, logging in as Joe (Admin), going to "Solutions", and then to a new folder under 'samples' that has my specific reports, JPivot (Mondrian) analysis jumping-off points, and my dashboard.
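For context, here is roughly what one of my modified .xaction connection blocks looks like. This is a simplified sketch; the tag names are from memory and may differ between Pentaho versions, and the host, schema, credentials, and query are placeholders rather than my real ones:

```xml
<!-- Sketch of a relational data source inside an .xaction action sequence.
     Tag names from memory; driver class, URL, and credentials are placeholders. -->
<action-definition>
  <component-name>SQLLookupRule</component-name>
  <action-type>Run my MySQL query</action-type>
  <component-definition>
    <driver>com.mysql.jdbc.Driver</driver>
    <connection>jdbc:mysql://localhost:3306/my_schema</connection>
    <user-id>pentaho_user</user-id>
    <password>********</password>
    <query>SELECT region, SUM(sales) FROM fact_sales GROUP BY region</query>
  </component-definition>
  <action-outputs>
    <query-result type="result-set" mapping="queryData"/>
  </action-outputs>
</action-definition>
```

Pointing the samples at my own data was mostly a matter of editing blocks like this one, plus the matching Mondrian schema for the MDX-based xactions.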

I need to start working on creating my "own PCI" that has no traces of the Pentaho samples, etc. I had no previous experience with JSP or XML prior to trying Pentaho. I have learned a lot, but I was still primarily just reverse engineering the code behind the PCI samples: modifying the SQL and MDX .xaction file logic, database connections, and so on. How do I make a solution that doesn't contain remnants of the PCI I started from? I can start stripping things out, but I was hoping there might be some helpful documentation out there related to this. I see the "crawl" stage of using Pentaho as being what I have done so far (modifying existing PCI samples to go against my own data); now I am ready to "walk".
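To make it concrete, the end state I am picturing is my own top-level solution folder alongside (instead of inside) the samples tree, organized the same way the samples are. This is just my guess at the layout based on poking around the PCI; the index.xml convention and file names below are assumptions, not something I have confirmed in the docs:

```
pentaho-solutions/
  my-solution/               # my own solution, no Pentaho samples in it
    index.xml                # display name/description shown in the navigator
    reports/
      sales_report.xaction
    analysis/
      sales_pivot.xaction    # JPivot jumping-off point
      my_schema.mondrian.xml
    dashboards/
      main_dashboard.jsp
```

If that guess is right, "de-coupling" would mostly mean moving my xactions, schema, and JSPs into a tree like this and then deleting the samples folders, but I would love documentation confirming that.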

My other topics of interest are:

1) Once I have my solutions de-coupled from the PCI, how do I best upgrade versions of Mondrian etc. as new releases come out with bug fixes? What files need to be installed to upgrade the various components of Pentaho as new releases come out?
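My working assumption for something like a Mondrian point upgrade is basically "stop the server, back up the old jar, drop in the new one". Sketched below against a made-up Tomcat-style path; the paths, jar names, and version numbers are all assumptions I would want the docs to confirm, and the mkdir/touch lines just simulate an install so the sketch is self-contained:

```shell
#!/bin/sh
# Sketch of swapping the Mondrian jar in the Pentaho webapp.
# PENTAHO_HOME and the lib path are assumed, not taken from any docs.
PENTAHO_HOME=./pentaho-demo
LIB="$PENTAHO_HOME/webapps/pentaho/WEB-INF/lib"

# Simulate an existing install plus a freshly downloaded jar (demo only).
mkdir -p "$LIB"
touch "$LIB/mondrian-2.0.1.jar"     # pretend this is the old version
touch ./mondrian-2.1.1.jar          # pretend this is the new download

# 1) Stop the app server first (command depends on your setup).
# ./stop-pentaho.sh

# 2) Back up the old jar, then copy in the new one.
mkdir -p ./backup
mv "$LIB"/mondrian-*.jar ./backup/
cp ./mondrian-2.1.1.jar "$LIB/"

# 3) Restart and re-test the JPivot views.
# ./start-pentaho.sh
```

Is it really that simple, or do components like Mondrian also drag in config or schema changes that have to be migrated by hand?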

2) Is there recommended hardware to run Mondrian and dashboards on? Obviously it depends on data volume, number of concurrent users, etc., but I am looking for real-world data volume samples with the corresponding hardware platform being used; maybe even some performance benchmarks calibrated against fact table and dimension sizes, so I can get a better idea of what real-world users of Pentaho and Mondrian expect and see delivered. I see posts in the forum saying that if the fact table isn't too large and the dimensions are not "that" big, aggregate tables are not even required. What is not clear is what people feel is acceptable performance. Under the "continuance of thought" concept, when users drill into a pivot table member, the reasonable expectation is that the expansion of the pivot table cells will take place within 5 seconds or less. Is this unrealistic? A very open-ended question, I know, but I am just looking for feedback on what current users of Pentaho get for performance and how they achieve it.