I transfer records from a source table to a destination table.
For each record I check whether the PK already exists in the destination table with valid_to = 9999-12-31; if it does, I update that record to valid_to = today and then insert the new record with valid_to = 9999-12-31.
In the database I have a unique constraint on PK + valid_to.
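
To make the per-row logic explicit, here is a minimal sketch in plain JDBC of what the transformation is supposed to do for each incoming row (table and column names TBL_PERSON, PK_ID, PAYLOAD, VALID_TO are only placeholders, not my real schema; in the job itself this is built from Kettle steps, not hand-written Java):

import java.sql.Connection;
import java.sql.Date;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class Scd2Upsert {

    // Per-row logic: close the currently valid record (if any), then insert
    // the new version. A unique constraint on (PK_ID, VALID_TO) ensures there
    // is at most one record per PK with VALID_TO = 9999-12-31.
    static void upsertRow(Connection con, long pkId, String payload) throws SQLException {
        Date today = new Date(System.currentTimeMillis());

        // 1) Close the currently valid record, if one exists.
        try (PreparedStatement upd = con.prepareStatement(
                "UPDATE TBL_PERSON SET VALID_TO = ? "
              + "WHERE PK_ID = ? AND VALID_TO = DATE '9999-12-31'")) {
            upd.setDate(1, today);
            upd.setLong(2, pkId);
            upd.executeUpdate(); // 0 rows updated means the PK is new
        }

        // 2) Insert the new record with the open-ended valid_to.
        try (PreparedStatement ins = con.prepareStatement(
                "INSERT INTO TBL_PERSON (PK_ID, PAYLOAD, VALID_TO) "
              + "VALUES (?, ?, DATE '9999-12-31')")) {
            ins.setLong(1, pkId);
            ins.setString(2, payload);
            ins.executeUpdate();
        }
    }
}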

I think that for this update/insert scenario I have to serialize the transformation with "Execute for every input row".

For 1197 records the job runs correctly, not very fast (85 seconds), but without failure.
For 1315 records I get the exception below.

Is this a conceptual problem with my job, or with Kettle?
In the real world I expect to have over 2 million records to transfer.
What can I do about this problem?

PDI 4.0.1
Oracle 10g
Windows XP
Spoon started with -Xmx1024m

the job (attached screenshot: job_with_oome.jpg)
the transformation (in my example I use timestamps; attached screenshot: trans_with_oome.jpg)

2010/10/08 11:37:19 - dd_load_table_full_tbl_person_step_2_upsert - ERROR (version 4.0.1-stable, build 13826 from 2010-08-26 14.18.03 by buildguy) : java.lang.OutOfMemoryError: Java heap space
2010/10/08 11:37:19 - dd_load_table_full_tbl_person_step_2_upsert - ERROR (version 4.0.1-stable, build 13826 from 2010-08-26 14.18.03 by buildguy) : java.util.Arrays.copyOfRange(Unknown Source)
2010/10/08 11:37:19 - dd_load_table_full_tbl_person_step_2_upsert - ERROR (version 4.0.1-stable, build 13826 from 2010-08-26 14.18.03 by buildguy) : java.lang.String.<init>(Unknown Source)
2010/10/08 11:37:19 - dd_load_table_full_tbl_person_step_2_upsert - ERROR (version 4.0.1-stable, build 13826 from 2010-08-26 14.18.03 by buildguy) : java.lang.StringBuffer.toString(Unknown Source)
2010/10/08 11:37:19 - dd_load_table_full_tbl_person_step_2_upsert - ERROR (version 4.0.1-stable, build 13826 from 2010-08-26 14.18.03 by buildguy) : org.pentaho.di.job.Job.execute(Job.java:500)
2010/10/08 11:37:19 - dd_load_table_full_tbl_person_step_2_upsert - ERROR (version 4.0.1-stable, build 13826 from 2010-08-26 14.18.03 by buildguy) : org.pentaho.di.job.Job.execute(Job.java:600)
2010/10/08 11:37:19 - dd_load_table_full_tbl_person_step_2_upsert - ERROR (version 4.0.1-stable, build 13826 from 2010-08-26 14.18.03 by buildguy) : org.pentaho.di.job.Job.execute(Job.java:388)
2010/10/08 11:37:19 - dd_load_table_full_tbl_person_step_2_upsert - ERROR (version 4.0.1-stable, build 13826 from 2010-08-26 14.18.03 by buildguy) : org.pentaho.di.job.entries.job.JobEntryJobRunner.run(JobEntryJobRunner.java:60)
2010/10/08 11:37:19 - dd_load_table_full_tbl_person_step_2_upsert - ERROR (version 4.0.1-stable, build 13826 from 2010-08-26 14.18.03 by buildguy) : java.lang.Thread.run(Unknown Source)
2010/10/08 11:37:19 - update existing record - ERROR (version 4.0.1-stable, build 13826 from 2010-08-26 14.18.03 by buildguy) : An error occurred executing this job entry :
2010/10/08 11:37:19 - update existing record - ERROR (version 4.0.1-stable, build 13826 from 2010-08-26 14.18.03 by buildguy) : Unexpected error occurred while launching entry [dd_update_existing_record_as_historic.0]
2010/10/08 11:37:19 - update existing record - ERROR (version 4.0.1-stable, build 13826 from 2010-08-26 14.18.03 by buildguy) : Java heap space