# Thread: kettle to count from the previous record

1. Junior Member
Join Date
Aug 2012
Posts
6

## kettle to count from the previous record

Dear all,
week = 0, start = 100, off = 3, lost = 1
week = 1, start? = (100-3-1) = x, off = 2, lost = 4
week = 2, start? = (x-2-4)
etc.
How can I design a Kettle transformation to compute each week's 'start' value from the previous week's record?
Thanks in advance.
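For clarity, the arithmetic being asked for is a simple recurrence: each week's start is the previous week's start minus that previous week's 'off' and 'lost'. A plain Python sketch (hypothetical data from the post, not Kettle itself):

```python
# start(N) = start(N-1) - off(N-1) - lost(N-1), filled in row by row.
rows = [
    {"week": 0, "start": 100,  "off": 3, "lost": 1},
    {"week": 1, "start": None, "off": 2, "lost": 4},
    {"week": 2, "start": None, "off": 0, "lost": 0},
]
for prev, cur in zip(rows, rows[1:]):
    cur["start"] = prev["start"] - prev["off"] - prev["lost"]
# week 1 start = 100 - 3 - 1 = 96; week 2 start = 96 - 2 - 4 = 90
```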

2. Senior Member
Join Date
Jun 2012
Posts
5,534
You can use the Analytical Query step to access the field values in the previous row.
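What Analytic Query's "LAG n rows back" does can be sketched in plain Python: each output row gains `prev_<field>` columns copied from the row n positions back (a hypothetical helper for illustration only, not a Kettle API):

```python
# Sketch of a LAG window over a stream of dict rows: the first n rows get
# None for the lagged fields, later rows see the values from n rows earlier.
def lag_fields(stream, fields, n=1):
    history = []
    for row in stream:
        out = dict(row)
        prev = history[-n] if len(history) >= n else None
        for f in fields:
            out["prev_" + f] = prev[f] if prev is not None else None
        history.append(row)
        yield out

weeks = [
    {"week": 0, "start": 100, "off": 3, "lost": 1},
    {"week": 1, "start": 96,  "off": 2, "lost": 4},
]
lagged = list(lag_fields(weeks, ["start", "off", "lost"]))
# lagged[1] now carries prev_start=100, prev_off=3, prev_lost=1
```

Inside PDI the Analytic Query step does this windowing for you; the sketch only shows the shape of the output.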

3. Junior Member
Join Date
Aug 2012
Posts
6
OK, thank you Marabu. I will try it.

4. Junior Member
Join Date
Aug 2012
Posts
6
After trying it: the Analytic Query step is quite helpful for reading the previous record field by field. In my case, though, I first have to add 'off' and 'lost' at week 0, and that result becomes the 'start' of the next record (week 1). Then I add 'off' and 'lost' in week 1, and that result becomes the 'start' of week 2, and so on. Maybe someone can help? Thanks in advance.

5. Member
Join Date
Aug 2011
Posts
38
I am trying to build a similar transform. I have a stream of rows with weights, and I need a running total of the weights.

Input:
rank=1, name=Red, weight=5.0
rank=2, name=Blue, weight=4.7
rank=3, name=Pink, weight=4.1
...

Desired Output:
rank=1, name=Red, weight=5.0, sum_weight=5.0
rank=2, name=Blue, weight=4.7, sum_weight=9.7
rank=3, name=Pink, weight=4.1, sum_weight=13.8
...

The problem is that Analytic Query needs to grab the sum_weight of the previous row, but that value is not in the original stream; it is only calculated after the previous row has been pulled. I tried inserting a constant sum_weight first, but that does not work. Still working on it.
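For reference, the running total itself is a one-pass accumulation. A minimal Python sketch over the example rows (dicts standing in for stream rows):

```python
# Each row gains sum_weight: the cumulative sum of weight up to that row.
rows = [
    {"rank": 1, "name": "Red",  "weight": 5.0},
    {"rank": 2, "name": "Blue", "weight": 4.7},
    {"rank": 3, "name": "Pink", "weight": 4.1},
]
total = 0.0
for row in rows:
    total += row["weight"]
    row["sum_weight"] = round(total, 10)  # round guards against float drift
```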

Update:

I tried a few more solutions and can't seem to get it working. The problem seems to be that when the Analytic Query step looks back, it sees the data that passed through it, and at that point the total has not yet been calculated.

I'm going to attempt to use an Execute SQL script step that runs for each row and computes SUM(weight) WHERE ordering <= rownum.
Last edited by Bob McC; 10-09-2012 at 07:44 PM.

6. Senior Member
Join Date
Nov 2008
Posts
777
I would use the Group By step (not the Memory Group By step!).

7. Member
Join Date
Aug 2011
Posts
38
Thanks, I will try the Group By with Cumulative option.

I did get it working the hard way, by using a 2nd transform, after the rows without the sum_weight were written into the DB:

[Table Input] ---> [Execute SQL script]

where Execute SQL script had "Execute for each row" = Y, and the script was

UPDATE benchmark_assets_fs
SET sum_weight =
  (SELECT SUM(weight) FROM benchmark_assets_fs
   WHERE benchmark='?' AND ordering <= ?)
WHERE benchmark='?' AND ordering = ?

and the ? are supplied as parameters from the stream fields.
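As a sanity check outside Kettle, the same per-row correlated UPDATE can be reproduced with Python's sqlite3 (a hypothetical in-memory stand-in for the benchmark_assets_fs table; note that sqlite3 uses real bind parameters, so its `?` placeholders stay unquoted, unlike the textual `'?'` substitution in the script above):

```python
import sqlite3

# In-memory table mimicking the first pass: rows written without sum_weight.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE benchmark_assets_fs
                (benchmark TEXT, ordering INTEGER, weight REAL, sum_weight REAL)""")
rows = [("b1", 1, 5.0), ("b1", 2, 4.7), ("b1", 3, 4.1)]
conn.executemany("INSERT INTO benchmark_assets_fs VALUES (?, ?, ?, NULL)", rows)

# Second pass: one correlated UPDATE per stream row fills in the running total.
for bench, ordering, _ in rows:
    conn.execute("""UPDATE benchmark_assets_fs
                    SET sum_weight = (SELECT SUM(weight) FROM benchmark_assets_fs
                                      WHERE benchmark = ? AND ordering <= ?)
                    WHERE benchmark = ? AND ordering = ?""",
                 (bench, ordering, bench, ordering))

sums = [r[0] for r in conn.execute(
    "SELECT sum_weight FROM benchmark_assets_fs ORDER BY ordering")]
```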

8. Member
Join Date
Aug 2011
Posts
38
The Group By step, with "Include all rows" and "Cumulative sum", worked for my application.

For the original poster: you may need one pass adding all the "plus" numbers, then another pass adding all the "minus" numbers, which you can convert to negative numbers with the Calculator step.
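Applied to the first post's numbers, this cumulative approach can be sketched as follows (hypothetical Python, with 'off' and 'lost' folded into a single signed adjustment per week):

```python
# Each week's start is the initial start plus the cumulative (negative)
# adjustments from all earlier weeks, mirroring a Group By cumulative sum.
initial_start = 100
adjustments = [-(3 + 1), -(2 + 4)]  # week 0 and week 1: off+lost, made negative
starts = [initial_start]
running = initial_start
for adj in adjustments:
    running += adj
    starts.append(running)
# starts == [100, 96, 90]: the week 0, 1, 2 'start' values from the first post
```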
