System requirements



radek
05-03-2006, 08:00 AM
Hi

We are setting up a dedicated server for Pentaho. For now it will work only as a reporting server; later we plan to add ETL and JOSSO to it. The database server is on a different machine, and I think most of the work of generating reports will be handled by the database.
I didn't find any system requirements for Red Hat. I guess 2 GB of RAM and a decent CPU should be enough.
Could anybody share some thoughts on system requirements? What is a safe choice? Is there any analysis of system requirements as a function of traffic, etc.?

Thanks in advance for any help
Radek


hagge
05-04-2006, 12:01 AM
I would also be interested in some performance metrics and anecdotes.

I'm probably going to use parts of the Pentaho suite in my Master's thesis, so to begin with I'd like to know what would be required for a comfortable user experience running everything on a single computer, with a single user.

MySQL, Reporting (both designing and generating reports), and some OLAP drilling into pretty small datasets, at least to begin with.

I'm guessing the memory footprint and Eclipse/BIRT are the limiting factors in this case?

I'm currently using a Pentium M 1.7 GHz machine with 1 GB of RAM, running Windows XP.
Also, would it make sense to run Pentaho in a virtual Linux image instead of on Windows?

I hope I'm not hijacking radek's topic, but it seemed like a good idea to collect system requirements in the same thread.

guzaldon
05-04-2006, 07:33 AM
Well, I'm running RHEL 4 and MySQL 5.0 on a dual Xeon 2.8 GHz with 8 GB of RAM and RAIDed SATA drives, and I currently only have a data mart set up for Pentaho to use. A couple of things I have noticed: 8 GB is more than enough; the highest I have seen my RAM usage get is about 6 GB, and that's running the Pentaho SDK plus Kettle for some ETL. The reason I have so much more memory than we currently need is that we will probably set up an in-memory DB to handle real-time warehousing once everything is up to speed.

I would say that if you are only running 32-bit processors, 2 to 4 GB would be enough; just be sure to allocate a little extra swap to be on the safe side. Another thing to point out: on a 32-bit processor with Linux, each process can address at most 4 GB of memory, but with a 64-bit processor it can go much higher. And in my opinion, faster hard drives would serve you better than the fastest CPU, since the workload is pretty disk-intensive because of the DB.
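
Since Pentaho runs in a JVM, the ceiling that actually matters is the heap the JVM will give you, not just the RAM in the box. Here is a minimal Java sketch (nothing Pentaho-specific; the class name is just for illustration) to check what a given JVM on a given OS will actually let you have:

  // HeapCheck.java -- prints the memory limits of the running JVM.
  // Try e.g. "java -Xmx2048m HeapCheck" to see whether your JVM/OS
  // combination will actually honour the heap size you ask for.
  public class HeapCheck {
      public static void main(String[] args) {
          Runtime rt = Runtime.getRuntime();
          long mb = 1024 * 1024;
          // maxMemory() is the ceiling the heap can grow to (-Xmx);
          // on a 32-bit JVM it tops out well under 4 GB no matter
          // how much physical RAM is installed.
          System.out.println("Max heap:   " + rt.maxMemory() / mb + " MB");
          System.out.println("Total heap: " + rt.totalMemory() / mb + " MB");
          System.out.println("Free heap:  " + rt.freeMemory() / mb + " MB");
      }
  }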

Currently, with the one schema I have in play, drilling down in a pivot table takes anywhere from 5 seconds to a little under a minute per query, depending on what you drill on. My DB currently has about a gig of data in it.
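
If you want to put numbers on that for your own setup, a rough way is to time a single query over JDBC. A sketch below, assuming the MySQL Connector/J driver is on the classpath; the URL, credentials, and SQL are placeholders, not my actual setup:

  // QueryTimer.java -- rough timing of one warehouse query over JDBC.
  import java.sql.Connection;
  import java.sql.DriverManager;
  import java.sql.ResultSet;
  import java.sql.Statement;

  public class QueryTimer {
      public static void main(String[] args) throws Exception {
          String url = "jdbc:mysql://dbhost:3306/datamart";  // placeholder
          try (Connection con = DriverManager.getConnection(url, "user", "pass");
               Statement st = con.createStatement()) {
              long start = System.nanoTime();
              ResultSet rs = st.executeQuery(
                  "SELECT region, SUM(sales) FROM fact_sales GROUP BY region");
              int rows = 0;
              while (rs.next()) rows++;  // walk the rows so fetch time is counted
              long ms = (System.nanoTime() - start) / 1_000_000;
              System.out.println(rows + " rows in " + ms + " ms");
          }
      }
  }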

I hope that's a start on the information you guys are looking for; perhaps others can jump in and share their experiences.

Nic