Thursday, March 29, 2012

cursor question

hi

From a general point of view, I would like to update table 'warehouse' every day from another table 'dump' and to check if some of the columns have changed in table 'dump' against the old values in table 'warehouse'. If there is some change, I need to back up
the old values in some kind of history_logs table.

So I began by writing a stored procedure for this...

DECLARE ticket_dump_cursor CURSOR FOR
SELECT * FROM ticket_dump order by Ticket_ID

OPEN ticket_dump_cursor

FETCH NEXT FROM ticket_dump_cursor

WHILE @@FETCH_STATUS = 0
BEGIN
-- insert code later

FETCH NEXT FROM ticket_dump_cursor
END
CLOSE ticket_dump_cursor
DEALLOCATE ticket_dump_cursor

GO

Now the question: is there a way to store one row as some kind of
row datatype (something like the record datatype in PostgreSQL) and then
iterate through its columns and check their values against the old values in the 'warehouse' table?

I don't want to use FETCH NEXT FROM dump INTO @var1, @var2
because I need to check about 50 columns..

How can I work around this? Desperately waiting for your suggestions..

thanks a lot

miso|||I don't know how to accomplish this in an SP, but if a client app is possible, I would use DAO or, better, ADO technology, which does enable you to loop through all the fields of a record. I can provide you with some example code.|||Hi

thanks for the suggestion, but I would like to use dynamic SQL if possible...
I am not very familiar with ADO and, to tell the truth, I would like to avoid it if I can... If it is the only way, then please send me the examples you mentioned

e-mail michal.holecka@accenture.com

I was thinking about an alternative solution and maybe I could do the job this way:

I would simply update the rows in the 'warehouse' table and write a trigger which would fire after the update. This trigger would check whether some column has changed and, if it has, log the old value into a history table..

The only problem is that inside this trigger I will have to check all 50 columns manually ... that's not a very systematic solution.|||You can archive the entire row if you want through a trigger, together with the type of action that was performed, that is Insert, Update or Delete (a rough sketch of such a trigger follows below).|||yeah, I can do that, but there will be a lot of redundancy in storing the whole row..
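(For illustration of the whole-row archive suggestion above, a minimal sketch of such an audit trigger. Only warehouse and Ticket_ID come from the thread; warehouse_audit, action_type, changed_at, Col1 and Col2 are placeholder names.)

CREATE TRIGGER trg_warehouse_audit
ON warehouse
FOR UPDATE, DELETE
AS
BEGIN
    -- The 'deleted' pseudo-table holds the old version of every affected row.
    -- warehouse_audit and its columns are placeholders, not the real schema.
    INSERT INTO warehouse_audit (action_type, changed_at, Ticket_ID, Col1, Col2 /* , ... all 56 columns */)
    SELECT CASE WHEN EXISTS (SELECT * FROM inserted) THEN 'UPDATE' ELSE 'DELETE' END,
           GETDATE(),
           d.Ticket_ID, d.Col1, d.Col2 /* , ... */
    FROM deleted d
END
GO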

imagine that this row has exactly 56 columns and only one column changes.. I will store the whole row. Then another row changes, and so on..

I was thinking about some kind of history logging that works like this..
if one column changes I will find out which one and log only the column name and old value into the history table, not the whole row.
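(For illustration only: a minimal layout for such a per-column history table. Apart from history_logs and the ticket key, every name and type here is an assumption, including storing old values as plain text.)

CREATE TABLE history_logs (
    log_id      int IDENTITY(1,1) PRIMARY KEY,
    ticket_id   int          NOT NULL,           -- key of the changed warehouse row
    column_name varchar(128) NOT NULL,           -- which of the ~50 columns changed
    old_value   varchar(255) NULL,               -- previous value, stored as text
    changed_at  datetime     NOT NULL DEFAULT GETDATE()
)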

Does it make sense to you?

Can I do it somehow this way?

thank you for your time ..

michal holecka|||It's a choice between space and performance ...

Trying to figure out which column has changed would take about 56 comparisons ...

Hmm ... ask your managers to get a bigger disk :)|||Your original idea was to compare your table with a copy of the day before?! It can be done, but you would have to compare keys to detect inserted and deleted rows. To detect field changes, you would have to join your current and copied tables on your key and compare field values.
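(For illustration of that comparison, assuming Ticket_ID is the shared key; Col1 and Col2 are placeholder column names.)

-- Rows in the dump that do not yet exist in the warehouse (inserted rows).
SELECT d.*
FROM ticket_dump d
LEFT JOIN warehouse w ON w.Ticket_ID = d.Ticket_ID
WHERE w.Ticket_ID IS NULL

-- Rows present in both tables where at least one column value differs.
SELECT d.Ticket_ID
FROM ticket_dump d
JOIN warehouse w ON w.Ticket_ID = d.Ticket_ID
WHERE d.Col1 <> w.Col1
   OR d.Col2 <> w.Col2
-- ... repeated for the remaining columns (NULL values need ISNULL handling)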

A completely different approach is event-driven, catching every database change. This approach is mostly used in DWH environments to trace db changes.

The question is thus: what do you want to accomplish? :rolleyes:|||Originally posted by mikosan
I was thinking about some kind of history logging that works like this..
if one column changes I will find out which one and log only the column name and old value into the history table, not the whole row.

Does it make sense to you?

Can I do it somehow this way?


You can detect which columns are updated. See the CREATE TRIGGER syntax:

CREATE TRIGGER trigger_name
ON table
[WITH ENCRYPTION]
{
{FOR { [DELETE] [,] [INSERT] [,] [UPDATE] }
[WITH APPEND]
[NOT FOR REPLICATION]
AS
sql_statement [...n]
}
|
{FOR { [INSERT] [,] [UPDATE] }
[WITH APPEND]
[NOT FOR REPLICATION]
AS
{ IF UPDATE (column)
[{AND | OR} UPDATE (column)]
[...n]
| IF (COLUMNS_UPDATED() {bitwise_operator} updated_bitmask)
{ comparison_operator} column_bitmask [...n]
}
sql_statement [ ...n]
}
}

Example:

CREATE TRIGGER updEmployeeData
ON employeeData
FOR update AS

-- Check whether columns 2, 3 or 4 have been updated. If any or all of
-- columns 2, 3 or 4 have been changed, create an audit record.
-- The bitmask is: power(2,(2-1))+power(2,(3-1))+power(2,(4-1)) = 14
-- To check if all columns 2, 3, and 4 are updated, use = 14 in place of
-- >0 (below).

IF (COLUMNS_UPDATED() & 14) > 0
-- Use IF (COLUMNS_UPDATED() & 14) = 14 to see if all of
-- columns 2, 3, and 4 are updated.
BEGIN
-- Audit OLD record.
INSERT INTO auditEmployeeData
(audit_log_type,
....|||sorry if I made myself unclear.

I have some identifier and according to this value I can tell whether the row will be updated or appended to the dwh table.

For those rows that will be updated I will try to back up the old values.. the problem is that we are talking about 56 columns.

My idea was not to back up the whole row but to have some mechanism to somehow iterate through the columns of one row and log to the history table the following:

changed_column_name
old_value
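(A rough sketch of how those per-column checks could look inside an AFTER UPDATE trigger. Col1 and Col2 are placeholder column names and history_logs is the table sketched earlier. Note that UPDATE(column) is true whenever the column appears in the SET list, so the WHERE clause still compares old and new values.)

CREATE TRIGGER trg_warehouse_log_columns
ON warehouse
FOR UPDATE
AS
BEGIN
    IF UPDATE (Col1)
        INSERT INTO history_logs (ticket_id, column_name, old_value)
        SELECT d.Ticket_ID, 'Col1', CONVERT(varchar(255), d.Col1)
        FROM deleted d
        JOIN inserted i ON i.Ticket_ID = d.Ticket_ID
        WHERE ISNULL(CONVERT(varchar(255), d.Col1), '') <> ISNULL(CONVERT(varchar(255), i.Col1), '')

    IF UPDATE (Col2)
        INSERT INTO history_logs (ticket_id, column_name, old_value)
        SELECT d.Ticket_ID, 'Col2', CONVERT(varchar(255), d.Col2)
        FROM deleted d
        JOIN inserted i ON i.Ticket_ID = d.Ticket_ID
        WHERE ISNULL(CONVERT(varchar(255), d.Col2), '') <> ISNULL(CONVERT(varchar(255), i.Col2), '')

    -- ... one block like this for each of the remaining columns
END
GO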

If I understand it well, I am able to do that, but only with 56 IF UPDATEs...|||Originally posted by DoktorBlue
You can detect which columns are updated. See the CREATE TRIGGER syntax:

CREATE TRIGGER trigger_name
ON table
[WITH ENCRYPTION]
{
{FOR { [DELETE] [,] [INSERT] [,] [UPDATE] }
[WITH APPEND]
[NOT FOR REPLICATION]
AS
sql_statement [...n]
}
|
{FOR { [INSERT] [,] [UPDATE] }
[WITH APPEND]
[NOT FOR REPLICATION]
AS
{ IF UPDATE (column)
[{AND | OR} UPDATE (column)]
[...n]
| IF (COLUMNS_UPDATED() {bitwise_operator} updated_bitmask)
{ comparison_operator} column_bitmask [...n]
}
sql_statement [ ...n]
}
}

Example:

CREATE TRIGGER updEmployeeData
ON employeeData
FOR update AS

-- Check whether columns 2, 3 or 4 have been updated. If any or all of
-- columns 2, 3 or 4 have been changed, create an audit record.
-- The bitmask is: power(2,(2-1))+power(2,(3-1))+power(2,(4-1)) = 14
-- To check if all columns 2, 3, and 4 are updated, use = 14 in place of
-- >0 (below).

IF (COLUMNS_UPDATED() & 14) > 0
-- Use IF (COLUMNS_UPDATED() & 14) = 14 to see if all of
-- columns 2, 3, and 4 are updated.
BEGIN
-- Audit OLD record.
INSERT INTO auditEmployeeData
(audit_log_type,
....

This approach will work only as long as you don't make any changes to your table schema. If, for example, I remove the first column, what will happen? :)|||If I use your way
IF (COLUMNS_UPDATED() & 14) > 0
-- Use IF (COLUMNS_UPDATED() & 14) = 14 to see if all of
-- columns 2, 3, and 4 are updated.
BEGIN
-- Audit OLD record.
INSERT INTO auditEmployeeData
(audit_log_type,
....

Can I do something like
while

If (COLUMNS_UPDATED() &|||Originally posted by Enigma
This approach will work only as long as you don't make any changes to your table schema. If, for example, I remove the first column, what will happen? :)

I assumed a fixed structure, indeed. Mikosan, correct me if this isn't the case.|||yes, it will be a static structure, it won't change...|||But why still go for the column checks ... is disk space really that expensive? I would sacrifice disk space for performance any time|||To be really honest, I am just a junior programmer and the seniors advised/ordered me to do it this way...

But considering your replies I see that I will have to talk to him..

thank you guys, you are opening my eyes all the time :)|||What is the size of the table ... the average number of inserts, updates and deletes per day ... that would be a good deciding factor ...

Don't go on my words alone ... if the table is updated, deleted and inserted into too much, then performance would be affected by the column-level approach ... otherwise you can go for DoktorBlue's method|||Originally posted by mikosan
If I use your way
IF (COLUMNS_UPDATED() & 14) > 0
-- Use IF (COLUMNS_UPDATED() & 14) = 14 to see if all of
-- columns 2, 3, and 4 are updated.
BEGIN
-- Audit OLD record.
INSERT INTO auditEmployeeData
(audit_log_type,
....

Can I do something like
while

If (COLUMNS_UPDATED() &

I'm not sure about this. Certainly, you can say:

IF (COLUMNS_UPDATED() & 1) = 1 -- 1st column updated
INSERT INTO LOG(Name, Value) SELECT '<1st col name>', <1st col value> FROM deleted
ELSE IF (COLUMNS_UPDATED() & 2) = 2 -- 2nd column updated
INSERT INTO LOG(Name, Value) SELECT '<2nd col name>', <2nd col value> FROM deleted
ELSE IF (COLUMNS_UPDATED() & 4) = 4 -- 3rd column updated
INSERT INTO LOG(Name, Value) SELECT '<3rd col name>', <3rd col value> FROM deleted
ELSE IF ...|||the warehouse table will be updated once a day
the dump table will have 4000 rows every day, but only 500 will be changed
against yesterday's data and about 50 will be brand new..

one guy advised me to use a dynamic select and an EXEC query..

to be more specific

use a while loop to create a dynamic select which compares the columns in the whs table and the dump table...
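(A rough sketch of that dynamic-SQL idea, assuming Ticket_ID is the shared key and that old values are logged as text into the history_logs table sketched earlier; the column list is read from INFORMATION_SCHEMA.COLUMNS.)

DECLARE @col sysname, @sql varchar(2000)

DECLARE col_cursor CURSOR FOR
SELECT COLUMN_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'warehouse' AND COLUMN_NAME <> 'Ticket_ID'

OPEN col_cursor
FETCH NEXT FROM col_cursor INTO @col

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build one comparison per column: log the old warehouse value where it differs from the dump.
    SET @sql = 'INSERT INTO history_logs (ticket_id, column_name, old_value) '
             + 'SELECT w.Ticket_ID, ''' + @col + ''', CONVERT(varchar(255), w.' + @col + ') '
             + 'FROM warehouse w JOIN ticket_dump d ON d.Ticket_ID = w.Ticket_ID '
             + 'WHERE ISNULL(CONVERT(varchar(255), w.' + @col + '), '''') '
             + '   <> ISNULL(CONVERT(varchar(255), d.' + @col + '), '''')'
    EXEC (@sql)

    FETCH NEXT FROM col_cursor INTO @col
END

CLOSE col_cursor
DEALLOCATE col_cursor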

my senior insists on doing it that way.. so I will try to study this..|||every record will have approx. 5 changes through its lifetime..
every record will have approx. 5 changes through its lifetime..

Seems like some homework ... :p|||well, I will study dynamic queries and we will see if the seniors like it that way

anyway I would like to thank for cooperation ...
