Hi,
I know the SSIS memory problem has probably been covered quite a bit, but as a newbie to SSIS, I'm not quite sure how to improve the performance of my package.
Basically, I have a package that loops through all the subdirectories within a specified directory, then loops through each file in the subdirectory and, using a Data Flow, processes each file (according to its filename) with a Script Component to insert the data into a SQL database.
Each subdirectory has up to 15 different CSV files, each less than 5 KB, and I probably have about 100 subdirectories.
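To make the structure clearer, here is a rough sketch of the equivalent logic in plain Python (not the actual SSIS package; the root directory, connection string, and the file-name-to-table rule below are just placeholder examples):

```python
import csv
from pathlib import Path

import pyodbc  # assumes an ODBC driver for SQL Server is available

# Placeholder locations/credentials -- not the real package's settings.
ROOT_DIR = Path(r"C:\Data\Incoming")
CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=localhost;DATABASE=Staging;Trusted_Connection=yes;")

def load_file(cursor, csv_path):
    # One small CSV -> one staging table picked from the file name
    # (this stands in for the per-file Script Component logic).
    table = csv_path.stem  # e.g. 'orders.csv' -> table 'orders' (assumption)
    with csv_path.open(newline="") as f:
        for row in csv.reader(f):
            placeholders = ", ".join("?" for _ in row)
            cursor.execute(f"INSERT INTO [{table}] VALUES ({placeholders})", row)

with pyodbc.connect(CONN_STR) as conn:
    cursor = conn.cursor()
    # Outer loop: every subdirectory; inner loop: every small CSV in it.
    for subdir in sorted(p for p in ROOT_DIR.iterdir() if p.is_dir()):
        for csv_file in sorted(subdir.glob("*.csv")):
            load_file(cursor, csv_file)
        conn.commit()  # finish one subdirectory before moving to the next
```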
When I ran the package, it functioned properly at first, but after a while it stalled (no error, just stuck in one Data Flow), and when I checked, the CPU/memory usage was running at 100%.
I'm not sure how I could fix this or improve the memory allocation. I wasn't expecting any memory problems, as the files are small and the number of rows going into and out of the Script Component is no more than 20.
Any advice? Thanks.
Just an update: I observed that running my package from Visual Studio (the development environment) takes up a large chunk of memory, so I closed VS and ran the package through DTEXEC instead.
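For reference, the invocation is something along these lines (sketched here as a small Python wrapper around dtexec, though normally it is just run at a command prompt; the package path is a made-up example):

```python
import subprocess

# Made-up package path -- substitute the real .dtsx location.
PACKAGE = r"C:\Packages\ProcessFiles.dtsx"

# /f points dtexec at the package file; /Rep E limits console output to errors.
result = subprocess.run(
    ["dtexec", "/f", PACKAGE, "/Rep", "E"],
    capture_output=True, text=True,
)
print("Exit code:", result.returncode)
print(result.stdout)
```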
In Windows Task Manager, I can see that the memory usage of DTEXEC.exe keeps increasing, so I'm guessing that data from files that have already been processed is still being held in memory. Is there a way to flush the memory each time I process a new file, since I don't need to keep data that has already been processed?
I'm not keen on adding more RAM as the solution, as I don't think the SSIS memory usage should keep growing like this. Any suggestions?