This is more of a design/architecture question. How does insert/update into tables work in Pentaho? I tried the Insert/Update step to load data, but it was extremely slow, less than 20 rows per second. When I replaced the Insert/Update step with Table Output, the speed increased significantly, but it is still not fast enough. We get files with millions of rows from our clients. Is Pentaho not very good at loading these volumes of data? Millions of rows is not very high at the DB level, and my assumption is that an ETL tool should be able to handle it efficiently. We are loading into PostgreSQL.
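
For context, here is a minimal JDBC sketch of what I assume the two steps are doing under the hood (the `customers(id, name)` table and column names are made up for illustration; I may be wrong about the internals): the Insert/Update pattern needs a lookup plus a write for every single row, while the Table Output pattern can batch inserts.

```java
import java.sql.*;

public class LoadPatterns {

    // Roughly what I assume the Insert/Update step does: one SELECT by key,
    // then either an UPDATE or an INSERT, for every incoming row.
    static void upsertRowByRow(Connection conn, int id, String name) throws SQLException {
        try (PreparedStatement lookup = conn.prepareStatement(
                "SELECT name FROM customers WHERE id = ?")) {
            lookup.setInt(1, id);
            try (ResultSet rs = lookup.executeQuery()) {
                if (rs.next()) {
                    try (PreparedStatement upd = conn.prepareStatement(
                            "UPDATE customers SET name = ? WHERE id = ?")) {
                        upd.setString(1, name);
                        upd.setInt(2, id);
                        upd.executeUpdate();
                    }
                } else {
                    try (PreparedStatement ins = conn.prepareStatement(
                            "INSERT INTO customers (id, name) VALUES (?, ?)")) {
                        ins.setInt(1, id);
                        ins.setString(2, name);
                        ins.executeUpdate();
                    }
                }
            }
        }
    }

    // Roughly what I assume Table Output does with a commit size set:
    // batched inserts, so far fewer round trips to the database.
    static void insertBatched(Connection conn, int[] ids, String[] names) throws SQLException {
        conn.setAutoCommit(false);
        try (PreparedStatement ins = conn.prepareStatement(
                "INSERT INTO customers (id, name) VALUES (?, ?)")) {
            for (int i = 0; i < ids.length; i++) {
                ins.setInt(1, ids[i]);
                ins.setString(2, names[i]);
                ins.addBatch();
                if ((i + 1) % 1000 == 0) {
                    ins.executeBatch();   // one round trip per 1000 rows
                    conn.commit();
                }
            }
            ins.executeBatch();
            conn.commit();
        }
    }
}
```

If that picture is roughly right, every Insert/Update row costs at least two database round trips, which would explain why it crawls compared to Table Output. Please correct me if the steps work differently.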

One thing to note is that I'm running the jobs locally from my PC. Would running them on the server, or in some other way, help speed up this process? For example, it took more than 6 hours to load 5 million rows into just one table.
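
As a back-of-envelope check (the latency figure below is just an assumption about my PC-to-server link, not a measurement), the rate I am seeing is in the same ballpark as what per-row network round trips alone would allow, which is why I suspect where the job runs matters:

```java
public class ThroughputEstimate {
    public static void main(String[] args) {
        // Observed: 5 million rows took 6+ hours to load into one table.
        double observedRowsPerSec = 5_000_000.0 / (6 * 3600);   // ~231 rows/s

        // Assumed round-trip latency from my PC to the database server
        // (hypothetical figure for illustration).
        double assumedRoundTripMs = 4.0;

        // If every row needs its own round trip, latency alone caps throughput.
        double rowByRowCeiling = 1000.0 / assumedRoundTripMs;   // ~250 rows/s

        System.out.printf("observed: %.0f rows/s, latency-bound ceiling: %.0f rows/s%n",
                observedRowsPerSec, rowByRowCeiling);
    }
}
```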