Hi all
I have a flat file. I am trying to set the value for the property "HeaderRowsToSkip" at runtime, and I have set an expression for this in my flat file connection manager. But it is not working: the connection manager does not pick up the value at runtime.
My expression is as follows:
DataRowsToSkip : @.[user:: Var]
where "Var" is my variable, which gets its value from the rowcount component; I am trying to feed that value back into the "HeaderRowsToSkip" property.
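For reference, SSIS property expressions reference a variable with the namespace capitalized and no stray dot or spaces; the canonical form for this property expression (shown with the variable name from this thread) is:

```
DataRowsToSkip : @[User::Var]
```

The expression evaluator rejects forms like @.[user:: Var], so it is worth re-checking the exact syntax in the expression builder.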
I've even tried setting the value for the "HeaderRowsToSkip" property in the expression builder.
It's not working.
Can anyone help me out in solving this?
Thanks in advance
Regards
Suganya
Is the rowcount component part of the same data flow where you're trying to use the flat file? If so, that might be the problem (not 100% sure), but I am guessing that by the time the flat file connection manager gets 'configured', the rowcount variable has not been populated.
Try moving the row count logic to a separate data flow at the beginning of the control flow and see if that helps...
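To illustrate the suggested layout (a sketch in plain Python with made-up file names, not SSIS code): the row count has to be produced by a step that finishes before the connection manager's expression is evaluated, which is what splitting the work into two sequenced data flows achieves.

```python
# Sketch, assuming two input files: the first supplies the row count,
# the second is the file whose leading rows should be skipped.

def count_rows(path):
    # "Data flow 1": count the rows in the first file (the Row Count step).
    with open(path) as f:
        return sum(1 for _ in f)

def load_skipping(path, rows_to_skip):
    # "Data flow 2": read the second file, skipping the first N rows,
    # the way HeaderRowsToSkip/DataRowsToSkip would.
    with open(path) as f:
        return [line.rstrip("\n") for i, line in enumerate(f) if i >= rows_to_skip]

# count_rows must run to completion before load_skipping starts -- that is
# the ordering the control flow has to guarantee for the expression to see
# the populated value.
```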
|||I tried the option you mentioned and it's not working... Is there any other way to achieve this?
And what is meant by the "flat file connection manager getting configured"? Does it mean that when I start running my package, it sets all the values, and during runtime I can't change my connection manager details? I ask because the connection manager is able to skip the right number of rows with the same expression when I set the variable's value at design time.
Thanks in advance.
Regards,
Suganya
|||
I've just had a go at this and it works fine.
Where is the rowcount component? Is it in the same data-flow?
-Jamie
|||
No... it's not in the same data flow task. As you said in the previous post, I have moved it to another data flow task.
My package has the following tasks and components
1. Data flow task (1), containing a flat file source, a rowcount, and an OLE DB destination
2. Data flow task (2), containing a flat file source and an OLE DB destination
I have set the expression only on the flat file in data flow task (2), and I am not reusing the connection manager from data flow task (1).
Am I doing anything wrong?
Regards,
Suganya
|||
All fine except one thing. Why have you got an OLE DB destination in the first data flow? Where are you inserting the data? You can terminate the data flow with the rowcount component - that seems to be all you need.
-Jamie
|||
Suganya Mahadevan wrote:
I tried the option you mentioned and it's not working... Is there any other way to achieve this?
And what is meant by the "flat file connection manager getting configured"? Does it mean that when I start running my package, it sets all the values, and during runtime I can't change my connection manager details?
Yes, and the behavior you described proves that. The connection manager gets set up using the values available when execution starts; at that point, the only value available for the rowcount variable is the one given at design time, as no rows have started to flow. That is why I suggested moving the rowcount logic to an earlier stage.
But I have a question for you: is the rowcount variable being populated from the same file you are trying to skip rows in? Wouldn't that cause all the rows to be skipped every time? (Unless you are using different files.)
|||It's not working. I have removed the OLE DB destination from data flow task (1), and now I am trying to get the rowcount value from data flow task (1) and set it on the flat file connection manager's "HeaderRowsToSkip" in data flow task (2).
My flat file in data flow task (1) has 100 records and my flat file in data flow task (2) has 5000 records. My rowcount variable in data flow task (1) now has 100 as its value (as I have 100 records in the first flat file). So my flat file in data flow task (2) should skip the first 100 rows and insert 4900 rows into the destination. But it's not working that way: it's inserting all 5000 records into the destination.
But the same scenario works when I set the value at design time, as I mentioned in my previous post.
|||
No, they are 2 different files.
My package has the following tasks and components
1. Data flow task (1), containing a flat file source, a rowcount, and an OLE DB destination
2. Data flow task (2), containing a flat file source and an OLE DB destination
I have set the expression only on the flat file in data flow task (2), and I am not reusing the connection manager from data flow task (1).
My flat file in data flow task (1) has 100 records and my flat file in data flow task (2) has 5000 records. My rowcount variable in data flow task (1) now has 100 as its value (as I have 100 records in the first flat file). So my flat file in data flow task (2) should skip the first 100 rows and insert 4900 rows into the destination. But it's not working that way: it's inserting all 5000 records into the destination.
What is the problem with my package?
Suganya Mahadevan wrote:
It's not working. I have removed the OLE DB destination from data flow task (1), and now I am trying to get the rowcount value from data flow task (1) and set it on the flat file connection manager's "HeaderRowsToSkip" in data flow task (2).
My flat file in data flow task (1) has 100 records and my flat file in data flow task (2) has 5000 records. My rowcount variable in data flow task (1) now has 100 as its value (as I have 100 records in the first flat file). So my flat file in data flow task (2) should skip the first 100 rows and insert 4900 rows into the destination. But it's not working that way: it's inserting all 5000 records into the destination.
But the same scenario works when I set the value at design time, as I mentioned in my previous post.
Could you set a breakpoint in the second data flow and then execute the package? When it breaks, could you check the value of the rowcount variable to see if it's properly populated?
Let's make sure first the variable is correctly populated...
|||Yes, I am able to see the value of the variable getting changed. I used a script task to display the value of the variable, and it's getting set properly. (I am not doing any manipulation inside the script task; I am only showing the value in a message box.)
By the way, I tried enabling the breakpoint on the "Variable Value Changed" event, but it was not actually hit. So I used the script task to see my variable's value.
Did I set the breakpoint correctly... or should I enable the breakpoint on some other event to see the value?
|||
Suganya Mahadevan wrote:
Yes, I am able to see the value of the variable getting changed. I used a script task to display the value of the variable, and it's getting set properly. (I am not doing any manipulation inside the script task; I am only showing the value in a message box.)
By the way, I tried enabling the breakpoint on the "Variable Value Changed" event, but it was not actually hit. So I used the script task to see my variable's value.
Did I set the breakpoint correctly... or should I enable the breakpoint on some other event to see the value?
Put the breakpoint on the OnPreExecute event of the second data-flow.
When it breaks, drag the variable into a Watch window and you will be able to see the value.
-Jamie
|||
Thanks Jamie. The value is getting set.
But the flat file is still not skipping the rows.
Any solution for this?
Regards
Suganya
|||
Can you share the package and the source files? Paste them up here if so.
-Jamie
|||
It is not possible for me to share the package here. Can you give your mail ID so that I could send you my package and source files?
-Suganya