Hi everybody,
I've searched through several posts but couldn't find a solution to this problem.

I've got a Job (Job1) that reads from disk the name of all the XML files in a specific input folder.
These file names are then passed one by one to another Job (Job2) that:
  1. Reads the file (StAX)
  2. Saves the content to a database, provided the content of the XML satisfies specific requirements
  3. Deletes the imported file from disk
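For clarity, the per-file logic in Job2 roughly corresponds to the sketch below. This is not my actual transformation (which I can't post); the element name "record" and the matching rule are made up, and the database insert is left as a comment. The point is that the StAX reader is explicitly closed after each file, which is the behaviour I'd expect Job2 to have between runs:

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class FileImporter {

    // Hypothetical sketch of one Job2 run: stream-parse the XML,
    // import it if it matches the requirements, then delete it.
    static boolean processFile(Path xmlFile) throws Exception {
        XMLInputFactory factory = XMLInputFactory.newInstance();
        boolean matches = false;
        try (InputStream in = Files.newInputStream(xmlFile)) {
            XMLStreamReader reader = factory.createXMLStreamReader(in);
            try {
                while (reader.hasNext()) {
                    if (reader.next() == XMLStreamConstants.START_ELEMENT
                            && "record".equals(reader.getLocalName())) { // made-up requirement
                        matches = true; // the real job would insert into the database here
                    }
                }
            } finally {
                reader.close(); // release the parser's buffers after every file
            }
        }
        if (matches) {
            Files.delete(xmlFile); // step 3: remove the imported file from disk
        }
        return matches;
    }

    public static void main(String[] args) throws Exception {
        Path tmp = Files.createTempFile("job2-demo", ".xml");
        Files.write(tmp, "<root><record>1</record></root>".getBytes());
        System.out.println(processFile(tmp)); // file matched, imported, deleted
        System.out.println(Files.exists(tmp));
    }
}
```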

Job2 executes for every input row, so it runs once per input file.

The problem is: after several processed files, I receive an out-of-memory error: "GC overhead limit exceeded".

Given that I would like to be able to process thousands of files, how should I structure my jobs to avoid this memory error?
I thought the job option "Execute for every input row" would release memory after every single run, but that doesn't seem to be the case.

I'm sorry, but I can't post the Job and Transformation files for privacy reasons.

Increasing the Spoon memory limit is not a solution for me.

Any idea?

Thanks in advance!!!