Hi all,
We have been experiencing a problem with a couple of our reports that a
number of people seem to have hit previously. Basically, if a report is
over a certain size, the ASP.NET worker process hits its memory
threshold (60%) and the process gets shut down. We have followed the
advice in some earlier posts and increased this threshold, which gets
the report through, but we still have a few concerns.
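For reference, the setting we changed is (as far as I understand it) the
memoryLimit attribute on the processModel element in machine.config,
which defaults to 60, i.e. 60% of physical memory, after which ASP.NET
recycles the worker process. Roughly like this (the 80 is just an
example value, not a recommendation):

    <!-- machine.config (ASP.NET 1.x); memoryLimit is a percentage of physical RAM -->
    <system.web>
      <processModel enable="true" memoryLimit="80" />
    </system.web>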
1. Why won't the process use virtual memory? It seems to be limited to
the physical memory available, and when we increased the threshold, if
that physical memory runs out we get an out-of-memory exception.
2. The memory doesn't appear to be released. I would have thought
garbage collection would kick in pretty soon after the report finished,
but watching the process, the memory stays in use for a long time after
the report has finished rendering, with no new activity on the server
(I have put a small diagnostic sketch after this list).
3. How does this scale at all? I have seen the argument that reports of
this size are unfeasible, and I agree to an extent... unfortunately our
clients don't, and they need a system capable of delivering them all the
data, regardless of the size of the report. I also read a suggestion to
use DTS to deliver a CSV file to the client, but that sounds like a
one-off workaround rather than an ongoing process that, say, an end user
could initiate once a month (for a thousand or so different companies).
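Coming back to point 2: this is the kind of diagnostic I mean, run after
the report has finished rendering, to see whether the memory is actually
reclaimable or still rooted somewhere. It is only a sketch (the logging
and placement are illustrative), though the GC calls themselves are
standard .NET Framework APIs:

    // Log the managed heap size, force a full collection, then log again.
    // If the "after" number drops sharply, the memory was reclaimable and the
    // process was simply waiting for a collection; if it stays high, something
    // is still holding references to the report data.
    long before = GC.GetTotalMemory(false);
    GC.Collect();
    GC.WaitForPendingFinalizers();
    long after = GC.GetTotalMemory(true);
    System.Diagnostics.Trace.WriteLine(
        String.Format("Heap before: {0} bytes, after: {1} bytes", before, after));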
This leads me to my final concern... We have observed that the memory
piles up, i.e. if one user kicks off a report it uses x amount of
memory; if another user kicks off a second report, that uses an
additional amount. So even if an individual report isn't too big, it
would only take ten users running medium-sized reports to run the
server out of memory. Does anyone have any suggestions as to how we
should cater for this?
Thanks in advance
Greg