I have an application which is inserting/updating upwards of hundreds of
thousands of records into a SqlServer table. I am using ADO: I make my ADO
connection, then use ADO commands to insert/update each record as
appropriate. At the end of the insert/update run I release the connection. I
then repeat the processing using a different table of the same
database.
The memory usage of Sql Server 2000 running under Win2k SP4 shows
approximately 675 MB after inserting a few hundred thousand records,
while the memory of the application itself stays very reasonable. This
seems excessive even on today's systems and with my 1 GB of physical
memory. Why is the Sql Server 2000 memory consumption building up so
greatly? I can expect a certain hit as a connection to a SqlServer
database is established, and another as each command is issued, but I
would think that memory is returned after each command is executed and
after the connection is closed. Yet the memory keeps building to a
great amount. Even after my application has finished, exited, and closed
all its connections, Task Manager shows the 675 MB of memory still being
used by Sql Server 2000. Any ideas on keeping this memory to a reasonable
amount, or why such a huge amount of memory is being used by Sql Server
2000, would be appreciated.
SQL Server consumes memory as required, but doesn't give it back unless
the OS or other applications need it. The assumption is, I suppose, that if you
need the memory now, you'll probably need it again in five minutes, so why
go through the hassle of releasing it just to grab it again, if no other
apps are demanding it.
Are you seeing performance issues? Do you have other apps on the box that
need the memory? Or are you just watching Task Manager? Typically, in this
kind of scenario, you wouldn't have other apps or services on the box that
require memory at levels that compete with SQL Server.
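If you want to see what SQL Server itself thinks it is using and targeting,
rather than what Task Manager shows for the process, you can query the
performance counters it exposes. A rough sketch against SQL Server 2000's
sysperfinfo view (the LIKE filter is deliberately loose so the exact counter
names don't matter):

    -- Values are reported in KB; these are the same counters Performance
    -- Monitor shows under the SQLServer:Memory Manager object.
    SELECT object_name, counter_name, cntr_value
    FROM master.dbo.sysperfinfo
    WHERE counter_name LIKE '%Server Memory%'

Target Server Memory is what SQL Server would like to have; Total Server
Memory is what it has actually taken so far.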
As for "keeping this memory to a reasonable amount" yes it is possible in
the server properties to limit the upper bound or to even fix the amount of
memory used by SQL Server. Of course, this will affect the performance of
the server when it actually would have used more than what *you* consider to
be a "reasonable amount" so I don't recommend this unless you absolutely
need to reserve the memory for other processes.
A
"Edward Diener" <eddielee_no_spam_here@.tropicsoft.com> wrote in message
news:ueU2WCqEGHA.1028@.TK2MSFTNGP11.phx.gbl...
>I have an application which is inserting/updating upwards of hundreds of
>thousands of records into a SqlServer table. I am using ADO, make my ADO
>connection, then use ADO commands to insert/update each record as
>appropriate. At the end of the insert/update I release the connection. I
>then proceed to repeat the processing using a different table of the same
>database.
> The memory usage of Sql Server 2000 running under Win2k SP4 shows
> approximately 675 MB after inserting a few hundred thousand records, while
> the memory of the application stays very reasonable. This seems excessive
> even under today's systems and my 1 GB of real memory. Why is the Sql
> Server 2000 memory consumption building up so greatly ? I can expect a
> certain hit as a connection to a SqlServer database is established, and as
> each command is issued I can expect a certain hit, but I would think that
> memory is being returned after each command is executed, and after the
> connection is closed. Yet the memory keeps building to a great amount.
> Even after my application is finished, and exits, and all connections are
> closed, Task Manager shows the 675 MB of memory still being used by Sql
> Server 2000. Any ideas on keeping this memory to a reasonable amount, or
> why such a huge amount of memory is being used by Sql Server 2000, would
> be appreciated.|||By default, SQL memory allocation is set to dynamic. This means SQL Server
will take as much memory as necessary. It's more efficient for SQL to read
data from memory than from disk.
If other applications request more memory, SQL will relinquish what it is
holding.
You can change this dynamic memory setting if you wish and "cap" the amount
of memory SQL consumes.
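Before capping anything, it's worth checking what the current settings are.
Something like this (min/max server memory are advanced options, so they
need to be made visible first):

    -- Defaults are min 0 and max 2147483647 MB, i.e. fully dynamic.
    EXEC sp_configure 'show advanced options', 1
    RECONFIGURE
    EXEC sp_configure 'min server memory (MB)'
    EXEC sp_configure 'max server memory (MB)'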
HTH. Ryan
"Edward Diener" <eddielee_no_spam_here@.tropicsoft.com> wrote in message
news:ueU2WCqEGHA.1028@.TK2MSFTNGP11.phx.gbl...
>I have an application which is inserting/updating upwards of hundreds of
>thousands of records into a SqlServer table. I am using ADO, make my ADO
>connection, then use ADO commands to insert/update each record as
>appropriate. At the end of the insert/update I release the connection. I
>then proceed to repeat the processing using a different table of the same
>database.
> The memory usage of Sql Server 2000 running under Win2k SP4 shows
> approximately 675 MB after inserting a few hundred thousand records, while
> the memory of the application stays very reasonable. This seems excessive
> even under today's systems and my 1 GB of real memory. Why is the Sql
> Server 2000 memory consumption building up so greatly ? I can expect a
> certain hit as a connection to a SqlServer database is established, and as
> each command is issued I can expect a certain hit, but I would think that
> memory is being returned after each command is executed, and after the
> connection is closed. Yet the memory keeps building to a great amount.
> Even after my application is finished, and exits, and all connections are
> closed, Task Manager shows the 675 MB of memory still being used by Sql
> Server 2000. Any ideas on keeping this memory to a reasonable amount, or
> why such a huge amount of memory is being used by Sql Server 2000, would
> be appreciated.|||Aaron Bertrand [SQL Server MVP] wrote:
> SQL Server consumes memory as required, but doesn't give it back unless the
> OS or other applications need it.
That is an odd use of memory. Most modules return dynamic memory after
it is no longer being used. This makes it much easier for the Windows
memory manager to find memory for other modules.
> The assumption is, I suppose, that if you
> need the memory now, you'll probably need it again in five minutes, so why
> go through the hassle of releasing it just to grab it again, if no other
> apps are demanding it.
When it builds up to 675 MB it is definitely slowing down anything else that
is running, including the module(s) which are using Sql Server for
database access.
> Are you seeing performance issues? Do you have other apps on the box that
> need the memory? Or are you just watching task manager?
See comment above.
> Typically, in this
> kind of scenario, you wouldn't have other apps or services on the box that
> require memory at levels that compete with SQL Server.
No, there are no other modules which consume 675 MB of memory.
> As for "keeping this memory to a reasonable amount" yes it is possible in
> the server properties to limit the upper bound or to even fix the amount o
f
> memory used by SQL Server. Of course, this will affect the performance of
> the server when it actually would have used more than what *you* consider
to
> be a "reasonable amount" so I don't recommend this unless you absolutely
> need to reserve the memory for other processes.
I need Sql Server not to slow down the system when it gets up to
using 675 MB. How does one limit the maximum amount of dynamic memory
Sql Server uses? Is there really no way to just tell it,
programmatically, to release the dynamic memory it has accumulated for a
given connection when that connection closes? Frankly, I am surprised at
this "hoggish" behavior of Sql Server, just holding on to all the memory
it has accumulated for a given connection even after that connection has
closed. This appears to mean that as long as Sql Server is running
in a particular Windows session on a machine, it will never use less than
the maximum amount of memory it has allocated during that session.
> A
>
>
>
> "Edward Diener" <eddielee_no_spam_here@.tropicsoft.com> wrote in message
> news:ueU2WCqEGHA.1028@.TK2MSFTNGP11.phx.gbl...
>
>
>|||Ryan wrote:
> By default, SQL memory allocation is set to dynamic. This means SQL Server
> will take as much memory as necessary. It's more efficient for SQL to read
> data from memory than from disk.
> If other applications request more memory, SQL will relinquish what it is
> holding.
> You can change this dynamic memory setting if you wish and "cap" the amount
> of memory SQL consumes.
How does one cap the amount of memory which it consumes, programmatically
or otherwise? Is there really no way to tell Sql Server, programmatically,
to release whatever memory has been allocated for a given connection when
that connection has closed? I find it very odd that Sql Server simply
holds on to dynamic memory allocated for a particular connection after
that connection is closed.|||> That is an odd use of memory.
Maybe to you... it sounds like you have much more experience with runtime
applications than with services.
> Most modules return dynamic memory after it is no longer being used. This
> makes it much easier for the Windows memory management to find needed
> memory for other modules.
SQL Server expects to be the only major application/service on a server.
That is why we have terms like "database server" and "application server."
I understand that budgets sometimes dictate they are one and the same, but
when you have fewer servers, expect tradeoffs.
> When it builds up to 675 MB it definitely is slowing anything else that is
> running, including the module(s) which are using Sql Server for database
> access.
What else is running *on the same box*? I can't imagine a scenario where
some program needs to be running on the same box, and the only time it needs
675 MB to operate just happens to coincide with the times when SQL Server
*does not* need 675 MB. Does this magical app have anything to do with the
database at all? Have you tried running the app on a different server?
> I need for Sql Server to not slow down the system when it gets up to using
> 675 MB.
How have you proven that it is SQL Server that is slowing down the system?
Do you have any actual metrics? Unless your app is using the other 500 MB and
then some and is paging to disk... in which case I would look at the app
before looking at the database. SQL Server is in use in thousands upon
thousands of environments; do you think you are the only one witnessing this
"bizarre phenomenon" of SQL Server "stealing" memory and not giving it back?
Besides, what kind of server that you expect good performance from has only
1 GB of memory these days? Maybe you should consider adding more memory
instead of intentionally crippling the database.
> How does one limit the maximum amount of dynamic memory Sql Server uses ?
Do you have Enterprise Manager? Have you tried right-clicking on the server
name, selecting Properties, and moving to the Memory tab? Now, you can
change these settings and restart SQL Server, but let me be the first to
predict that you will be back here complaining that it made matters worse.
SQL Server is using 675 MB because it NEEDS it. Why should it give it back?
If you run a query that returns a resultset that is sized 500 MB, SQL Server
may be able to keep that in memory so that the *next* time you call the same
query, it will be faster. If you go and do something foolish like limit the
amount of memory SQL Server can use to something less than this query
requires, (a) it is going to be slower because it may have to page some of
the results to disk, and (b) successive invocations of the same query will
have to do the same thing, so you lose economies of scale, too. Unless you
have SQL Server for no reason and can eliminate it from the equation, I
don't see how this can possibly make the system faster.
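If you want to see that caching effect for yourself, turn on I/O statistics
and run the same query twice (the table name below is just a placeholder for
one of your own, and the first run only shows physical reads if the table
isn't already cached):

    SET STATISTICS IO ON
    -- First run: expect physical/read-ahead reads as pages come off disk.
    SELECT COUNT(*) FROM dbo.YourBigTable
    -- Second run: the same pages are served from memory as logical reads.
    SELECT COUNT(*) FROM dbo.YourBigTable
    SET STATISTICS IO OFF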
Maybe you could focus on tuning your queries and limiting the amount of
sorting and data retrieval that is going on so that SQL Server uses less
memory. How much memory do you think is appropriate? What kind of testing
and benchmarking have you done? What kind of testing and benchmarking are
you going to do after you make the change? Or are you just going to walk
away, happy knowing that your app will always have free rein over the
memory on the system?
> Frankly I am surprised at this "hoggish" behavior of Sql Server to just
> hold on to all the memory it has accumulated for a given connection even
> when that connection has closed.
Yes, this surprises a lot of people. Unfortunately for you, this is how it
works. And the only solution you have come up with so far is *not* going to
make things better.
> This appears to mean to me that as long as Sql Server is running in a
> particular Windows session on a machine it will never use less than the
> maximum amount of memory it has allocated for that session.
That is correct. Next you'll be asking how to limit the disk space and
number of CPUs SQL Server uses, because all of your other apps need that
stuff sometimes too. :-(
A|||In Enterprise Manager, right-click the server name
Properties
Memory tab
You will see two radio buttons, dynamic and fixed memory, each with a slider.
Or in T-SQL the same can be achieved via sp_configure. The setting is the
value in MB, so 2048 for 2 GB, etc.
You will need to restart the SQL service for this to take effect if you are
on SQL 2000 or earlier; I don't think that's the case for 2005, but I'm not
sure.
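For example, something like this is the sp_configure equivalent of the
Memory tab (the 512 is only an illustration, not a recommendation):

    EXEC sp_configure 'show advanced options', 1
    RECONFIGURE
    -- Cap SQL Server at roughly 512 MB; the value is in MB, as noted above.
    EXEC sp_configure 'max server memory (MB)', 512
    RECONFIGURE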
HTH. Ryan
"Edward Diener" <eddielee_no_spam_here@.tropicsoft.com> wrote in message
news:OGl$XZsEGHA.2380@.TK2MSFTNGP12.phx.gbl...
> Ryan wrote:
> How does one cap the amount of memory which it consumes, programmatically
> or otherwise? Is there really no way to tell Sql Server, programmatically,
> to release whatever memory has been allocated for a given connection when
> that connection has closed? I find it very odd that Sql Server simply
> holds on to dynamic memory allocated for a particular connection after
> that connection is closed.
>|||Ryan wrote:
> In Enterprise Manager, right-click the server name
> Properties
> Memory tab
> You will see two radio buttons, dynamic and fixed memory, each with a slider.
> Or in T-SQL the same can be achieved via sp_configure. The setting is the
> value in MB, so 2048 for 2 GB, etc.
> You will need to restart the SQL service for this to take effect if you are
> on SQL 2000 or earlier; I don't think that's the case for 2005, but I'm not
> sure.
Thanks for the information.|||> How does one cap the amount of memory which it consumes, programmatically
> or otherwise?
Egads, have you even OPENED Enterprise Manager? Maybe a picture would help?
http://www.aspfaq.com/img/memory_tab.gif
Do you know what Books Online is?
http://www.aspfaq.com/2229
http://msdn.microsoft.com/library/e...tun_1a_6rn7.asp
http://msdn.microsoft.com/library/e...config_9zfy.asp
How about Google?
http://tinyurl.com/dzl96 (find the third instance of "release" on the
page)
http://www.sql-server-performance.com/q&a115.asp
And Google Groups?
http://tinyurl.com/bjttb
And the Knowledge Base?
http://support.microsoft.com/kb/321363/
http://support.microsoft.com/kb/319942/
> I find it very odd that Sql Server simply holds on to dynamic memory
> allocated for a particular connection, after that connection is closed.
Why on earth do you think that the memory is used only for that connection?
Isn't it even remotely possible that five minutes (or five seconds) from
now, some other connection might query the same data pages and require the
same amount of memory? Why make SQL Server start from scratch, instead of
allowing it to keep the most frequently accessed data pages in memory,
speeding up the majority of your queries? Limiting SQL Server's memory is
like cutting off your left foot while lying on the couch. You don't need it
right now, so what good is it, right?|||> Limiting SQL Server's memory is
> like cutting off your left foot while lying on the couch. You don't need it
> right now, so what good is it, right?
This belongs on a T-shirt. :)
ML
http://milambda.blogspot.com/