I am running dynamic queries that often return HUGE (300 MB - 1 GB) result sets on the initial load. Later loads should not be this big (though I am not sure about that) because I will be switching to delta loading. These result sets are loaded into a C# DataTable. A script loops over the rows and generates a query (stored in an SSIS variable) to load them into the appropriate destination columns (determined by other scripts).
For small result sets, my package runs properly. But for big ones, it simply fails with an out-of-memory error. How do I resolve this problem? Can you suggest some strategies? I guess I could fetch smaller parts of the data at a time and then load them into the target, but I am not sure how to go about it. Is there a recipe for this?
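To make the "fetch smaller parts" idea concrete, this is roughly what I am imagining (an untested sketch, not my actual package code: the connection string, table, key column, batch size, and the ProcessChunk helper are all placeholders, and it assumes the source table has a sortable key so OFFSET/FETCH paging works):

    using System.Data;
    using System.Data.SqlClient;

    class ChunkedFetch
    {
        // All names here are placeholders, not my real ones.
        const string SourceConn = "Data Source=SRC;Initial Catalog=SourceDb;Integrated Security=SSPI;";
        const int BatchSize = 10000;

        static void Main()
        {
            int offset = 0;
            while (true)
            {
                var chunk = new DataTable();
                using (var conn = new SqlConnection(SourceConn))
                using (var cmd = new SqlCommand(
                    @"SELECT * FROM dbo.SourceTable
                      ORDER BY Id
                      OFFSET @Offset ROWS FETCH NEXT @Batch ROWS ONLY;", conn))
                {
                    cmd.Parameters.AddWithValue("@Offset", offset);
                    cmd.Parameters.AddWithValue("@Batch", BatchSize);
                    conn.Open();
                    using (var reader = cmd.ExecuteReader())
                    {
                        chunk.Load(reader);   // only one chunk sits in memory at a time
                    }
                }

                if (chunk.Rows.Count == 0)
                    break;   // no more rows

                // Hand this chunk to the same row-reader / INSERT-generation logic
                // that currently receives the whole result set.
                ProcessChunk(chunk);

                offset += chunk.Rows.Count;
            }
        }

        static void ProcessChunk(DataTable chunk)
        {
            // placeholder: generate and execute the INSERTs for just this chunk
        }
    }

Would a loop like this play nicely with the rest of the package, or is there a more standard SSIS way to do it?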
A brief intro to how the process works:

Execute SQL: get the big result set > Script (RowReader): read each row and generate a SQL string like "INSERT INTO TableABC VALUES (" + {all columns of one row} + ")", then concatenate each statement onto a string variable destinationInsert > Execute SQL: execute the SQL held in destinationInsert.

ETL process complete. Does that help?
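In case it helps further, here is a simplified sketch of what the RowReader script does (not the exact code: the table name and quoting are simplified, and the real destination columns come from those other scripts):

    using System.Data;
    using System.Text;

    public static class RowReader
    {
        // Builds one big batch of INSERT statements from the full result set.
        // This is where memory blows up: the DataTable and this string both
        // hold the entire result set at the same time.
        public static string BuildDestinationInsert(DataTable resultSet)
        {
            var sql = new StringBuilder();
            foreach (DataRow row in resultSet.Rows)
            {
                sql.Append("INSERT INTO TableABC VALUES (");
                for (int i = 0; i < resultSet.Columns.Count; i++)
                {
                    if (i > 0) sql.Append(", ");
                    sql.Append("'").Append(row[i].ToString().Replace("'", "''")).Append("'");
                }
                sql.AppendLine(");");
            }
            return sql.ToString();   // stored in the SSIS string variable destinationInsert
        }
    }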