Hi Pentaho gurus,

I want to build a fact table from a staging table and some pre-built dimension tables. All the tables are stored in the same SQLite database.

I have created a transformation to do this in five steps (a rough SQL equivalent of what I'm after is sketched after the list).

1. Read staging table (Table input)
2. Lookup dim1 key (Database value lookup)
3. Lookup dim2 key (Database value lookup)
4. Lookup dim3 key (Database value lookup)
5. Write fact table (Table output)
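
In plain SQL terms, what I want to end up with is roughly the statement below, wrapped in Python's standard sqlite3 module just so it runs on its own. The file, table, and column names are made up for illustration; my real staging and dimension tables differ.

import sqlite3

conn = sqlite3.connect("warehouse.db")  # placeholder file name

# Placeholder schema, only so the statement further down has something to run against.
conn.executescript("""
CREATE TABLE IF NOT EXISTS staging (dim1_code TEXT, dim2_code TEXT, dim3_code TEXT, amount REAL);
CREATE TABLE IF NOT EXISTS dim1 (dim1_key INTEGER PRIMARY KEY, dim1_code TEXT);
CREATE TABLE IF NOT EXISTS dim2 (dim2_key INTEGER PRIMARY KEY, dim2_code TEXT);
CREATE TABLE IF NOT EXISTS dim3 (dim3_key INTEGER PRIMARY KEY, dim3_code TEXT);
CREATE TABLE IF NOT EXISTS fact (dim1_key INTEGER, dim2_key INTEGER, dim3_key INTEGER, amount REAL);
""")

# Step 1 reads staging, steps 2-4 swap the natural codes for surrogate keys,
# step 5 writes the fact rows. Done as one statement on a single connection
# there is of course no reader/writer contention.
conn.execute("""
INSERT INTO fact (dim1_key, dim2_key, dim3_key, amount)
SELECT d1.dim1_key, d2.dim2_key, d3.dim3_key, s.amount
FROM staging s
JOIN dim1 d1 ON d1.dim1_code = s.dim1_code
JOIN dim2 d2 ON d2.dim2_code = s.dim2_code
JOIN dim3 d3 ON d3.dim3_code = s.dim3_code
""")
conn.commit()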

With the default settings, the Table output step always fails with an error like this:

Unexpected error inserting row
Error comitting connection
database is locked

I suspect the cause is that PDI runs all the steps in parallel, so the Table input still holds a read lock on the database while the Table output tries to write, and SQLite needs an exclusive lock on the whole database file to commit a write (see section 3.0, Locking, in the SQLite documentation).
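
Outside of PDI, a minimal sketch of what I think is happening looks like this (plain Python with the standard sqlite3 module; two connections to a placeholder database stand in for the Table input and Table output connections):

import sqlite3

# timeout=0 makes SQLite report SQLITE_BUSY immediately instead of retrying
# for a few seconds, so the conflict shows up straight away.
writer = sqlite3.connect("locktest.db", timeout=0)
reader = sqlite3.connect("locktest.db", timeout=0, isolation_level=None)

writer.execute("CREATE TABLE IF NOT EXISTS fact (id INTEGER)")
writer.commit()

# "Table input" side: open a transaction and take a SHARED lock by reading.
reader.execute("BEGIN")
reader.execute("SELECT count(*) FROM fact").fetchone()

# "Table output" side: the insert itself is allowed (RESERVED lock), but the
# commit needs an EXCLUSIVE lock on the whole database, which the reader's
# SHARED lock blocks.
writer.execute("INSERT INTO fact (id) VALUES (1)")
try:
    writer.commit()
except sqlite3.OperationalError as err:
    print(err)  # "database is locked" -- the same message the Table output step reports

reader.execute("COMMIT")  # the writer could only get through once this happens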

One workaround is to make the commit size of the Table output step greater than the number of rows in the staging table, presumably because everything then goes into a single commit after the Table input has finished reading and released its lock. This feels dirty.
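
The same sketch with the commit moved to after the reader has finished does go through, which I assume is why the oversized commit size works: rows can still be inserted while the Table input is reading, only the final commit has to wait until it is done (names again placeholders):

import sqlite3

reader = sqlite3.connect("locktest.db", timeout=0, isolation_level=None)
writer = sqlite3.connect("locktest.db", timeout=0)

writer.execute("CREATE TABLE IF NOT EXISTS staging (id INTEGER)")
writer.execute("CREATE TABLE IF NOT EXISTS fact (id INTEGER)")
writer.commit()

# Reader holds its SHARED lock while the rows stream through.
reader.execute("BEGIN")
rows = reader.execute("SELECT id FROM staging").fetchall()

# Inserting while the reader is still open is fine (only needs a RESERVED lock).
writer.executemany("INSERT INTO fact (id) VALUES (?)", [(r[0],) for r in rows])

reader.execute("COMMIT")  # Table input finishes, SHARED lock released
writer.commit()           # the single commit at the very end now succeeds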

Is there an option that makes PDI run the reads and writes serially, or wait longer for the lock?

Thanks,
Iain