
Wednesday, March 28, 2012

merge conflicts resolution en-masse

We recently added a table to a merge publication. Another process moves the records between servers and races the merge agent, causing conflicts. The application is so big that the developers will need many hours to find the cause and resolve it; until then I have to deal with the conflicts myself.
How do I resolve conflicts when I have thousands of them? I want to keep the winning change in every case. Is it safe to simply delete all records from the conflict table?
delete from conflict_MergePubName_MyTableName
Thanks,
CB
Chris,
if you're happy with the default conflict resolution that has already
occurred, you should be able to cursor through your records and call
sp_deletemergeconflictrow
e.g.: exec sp_deletemergeconflictrow 'conflict_NorthwindRegionsMerge_Region',
'[dbo].[Region]', '{9E93B574-55A3-4D72-A108-EED0C8E6B899}',
'UK-3XSW02J.pubs'
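A sketch of that loop, using the asker's placeholder names: the conflict table and article are the placeholders from the question, and I'm assuming the conflict table exposes rowguid and origin_datasource columns, so verify those column names against your own conflict table before running it.

```sql
-- Sketch: delete every logged conflict row via sp_deletemergeconflictrow.
-- Table/article names are placeholders; the rowguid and origin_datasource
-- column names are assumed -- check your conflict table's schema.
DECLARE @rowguid uniqueidentifier, @origin sysname

DECLARE conflict_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT rowguid, origin_datasource
    FROM conflict_MergePubName_MyTableName

OPEN conflict_cur
FETCH NEXT FROM conflict_cur INTO @rowguid, @origin
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Same positional arguments as the example call above.
    EXEC sp_deletemergeconflictrow
        'conflict_MergePubName_MyTableName',
        '[dbo].[MyTableName]',
        @rowguid,
        @origin
    FETCH NEXT FROM conflict_cur INTO @rowguid, @origin
END
CLOSE conflict_cur
DEALLOCATE conflict_cur
```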
Cheers,
Paul Ibison SQL Server MVP, www.replicationanswers.com
(recommended sql server 2000 replication book:
http://www.nwsu.com/0974973602p.html)

Monday, March 26, 2012

Merge Agent Profile settings

I am trying to limit the downloaded records per merge session over my
less reliable links. Under SQL2000 I could set MaxDownloadChanges in the
agent profile to 1500, but in SQL2005 this value is limited to 300 max.
Has anybody run into this issue before? Why was it changed from SQL2000
to SQL2005?
The parameters are defined in msdb..MSagentparameterlist. I am
considering changing the record to allow a different maximum. Would this
be an issue?
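For reference, the row I am considering changing can be inspected like this. It is a sketch: I'm assuming a parameter_name column in msdb..MSagentparameterlist, so verify the actual column names on your instance.

```sql
-- Sketch: inspect the stored profile parameter limits.
-- The parameter_name column is assumed; check your msdb schema.
SELECT *
FROM msdb..MSagentparameterlist
WHERE parameter_name LIKE N'%MaxDownloadChanges%'
```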
I have no idea why it was changed but you should be able to override the
profile settings in the agent properties.
Hilary Cotter
Looking for a SQL Server replication book?
http://www.nwsu.com/0974973602.html
Looking for a FAQ on Indexing Services/SQL FTS
http://www.indexserverfaq.com
When I try to override it I am not allowed to enter a value higher than 300. In SQL 2000 I was able to enter any value, but SQL 2005 limits me to 300.
Open Replication Monitor, locate your publication and click on it, right-click and select View Details, then click Action, Merge Agent Job Properties. Select Steps, choose the Run agent step, and click Edit. Click in the Command box, press End, and type -MaxDownloadChanges 100.
Close the dialog and restart your agent.
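If you'd rather script the same change, a sketch against msdb might look like the following. The job name and step id are placeholders you'd look up for your own merge agent; sp_update_jobstep is the documented way to edit a job step.

```sql
-- Sketch: append -MaxDownloadChanges to the merge agent's "Run agent" step.
-- The job name and step_id here are placeholders for your own agent job.
DECLARE @cmd nvarchar(3200)

SELECT @cmd = s.command
FROM msdb.dbo.sysjobsteps s
JOIN msdb.dbo.sysjobs j ON j.job_id = s.job_id
WHERE j.name = N'MyServer-MyDb-MyMergePub-MySubscriber-1'
  AND s.step_id = 2   -- assumed to be the "Run agent" step

EXEC msdb.dbo.sp_update_jobstep
    @job_name = N'MyServer-MyDb-MyMergePub-MySubscriber-1',
    @step_id  = 2,
    @command  = @cmd + N' -MaxDownloadChanges 1500'
```

Restart the agent afterwards so the new command line takes effect.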
Hilary Cotter
Looking for a SQL Server replication book?
http://www.nwsu.com/0974973602.html
Looking for a FAQ on Indexing Services/SQL FTS
http://www.indexserverfaq.com
I see, so adding the parameter directly to the merge agent job will
override the value specified in the default merge agent profile and also
allow me to set it to whatever I need.
I wonder if Microsoft can tell us why they limit the MaxDownloadChanges
to 300 in the agent profile properties dialog.
Yes, it will override the profile setting. I don't know why the default setting in the profile is 300. The default is to download everything, with a minimum of 100 (should there be over 100 changes to be downloaded).
Hilary Cotter
Looking for a SQL Server replication book?
http://www.nwsu.com/0974973602.html
Looking for a FAQ on Indexing Services/SQL FTS
http://www.indexserverfaq.com

Friday, March 23, 2012

Merge ActiveX component removes subscriber records

Hi,
I am trying to synchronize data between a publisher running SQL Server 2000 Enterprise Edition and a subscriber running MSDE 2000, using SQL-DMO with the Merge ActiveX component. My program is in VB.NET.
Every time I try to synchronize data, the changes at the subscriber don't get reflected at the publisher. Here is the code segment:
objMerge.ReinitializeSubscription(true)
objMerge.Initialize()
objMerge.Run()
objMerge.Terminate()
We are using dynamic filters with SUSER_SNAME(). However, the data for specific users filtered by SUSER_SNAME works fine while getting data from publisher to the subscriber.
Thanks.
You shouldn't be reinitializing. This is one way you can guarantee you will lose your Subscriber records.
If you want to upload the Subscriber changes to the Publisher before the reinitialization runs, set the bUploadBeforeReinit parameter of the ReinitializeSubscription method to true.
Looking for a SQL Server replication book?
http://www.nwsu.com/0974973602.html
Hi,
I don't follow your answer. First it says not to use ReinitializeSubscription, then it says to use ReinitializeSubscription with the bUploadBeforeReinit parameter set to true.
Please explain.
Thanks.
Hmm, sorry; I missed that you were using objMerge.ReinitializeSubscription(True) in your first post. I was somehow under the impression you were calling objMerge.ReinitializeSubscription() or objMerge.ReinitializeSubscription(False).
Basically, what I was trying to ask is why you are doing a reinitialization at all. You normally only do this if there has been a schema or publication change.
If there is no need to reinitialize, I would not use this statement.
If you are calling objMerge.ReinitializeSubscription(True), then transactions that occur on the Subscriber should not be deleted unless there are conflicts. Does the conflict viewer reveal anything?
Looking for a SQL Server replication book?
http://www.nwsu.com/0974973602.html
Thanks Hilary for the suggestion. You are right: there were conflicts. The Identity Range tab for the articles was not enabled, so we could not set a proper identity range. We generated the publication script, manually added the identity range settings to sp_addmergearticle, and then created the publication by running that script. Now everything works just fine, even without using ReinitializeSubscription(True).
Now the question is why the Identity Range tab for the articles was not active in the first place?
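For anyone else hitting this, the manual setting looks roughly like the following sketch. The publication, article, and range values are placeholders, not the actual script; the identity-range parameters are the documented sp_addmergearticle options, but verify them against your version's Books Online.

```sql
-- Sketch: add a merge article with an explicit identity range.
-- Publication/article names and range sizes are placeholders.
EXEC sp_addmergearticle
    @publication         = N'MyMergePub',
    @article             = N'MyTable',
    @source_owner        = N'dbo',
    @source_object       = N'MyTable',
    @auto_identity_range = N'true',
    @pub_identity_range  = 10000,  -- range reserved for the publisher
    @identity_range      = 1000,   -- range handed to each subscriber
    @threshold           = 80      -- % used before a new range is assigned
```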

Mental Block on SQL query. Joining one to many with one of the many

Sorry for the awkward title; I'm not sure how to phrase this. I have a Person table and an Addresses table. Each person may have many address records, as shown below:

Person
--
PersonID | AutoNum
fName | varchar
lName | varchar
...etc

Addresses
--
addressID | AutoNum
personID | int (FK)
address1 | varchar
city | varchar
state | varchar
isPrimary | bit
...etc

What I'm trying to do is select everything from Person plus the city and state of each person's primary address. A little voice keeps saying subquery... but I can't figure it out. So far I have the SQL below, but if there is no address, or no address where isPrimary = 1, it fails to return the person record. I need the person record regardless of whether they have a primary address. Does that make sense?

Doesn't return all Person records:

SELECT Person.*, Address.City, Address.State

FROM Person LEFT OUTER JOIN Address

ON Person.PersonID = Address.PersonID
WHERE (Address.isPrimary= 1)
ORDER BY Person.lName, Person.fName

Try:

Code Snippet

SELECT Person.*, Address.City, Address.State, Address.isPrimary

FROM Person LEFT OUTER JOIN Address

ON Person.PersonID = Address.PersonID
AND (Address.isPrimary= 1)
ORDER BY Person.lName, Person.fName
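The reason this works: a condition in the ON clause filters the Address rows before the LEFT OUTER JOIN preserves unmatched people, while the same condition in the WHERE clause runs after the join and discards the NULL-extended rows, silently turning the outer join into an inner join. If you'd rather use the subquery your little voice suggested, an equivalent sketch (same assumed schema, and assuming at most one primary address per person) is:

```sql
-- Sketch: same result via correlated subqueries instead of a join.
SELECT Person.*,
       (SELECT a.City  FROM Address a
        WHERE a.PersonID = Person.PersonID AND a.isPrimary = 1) AS City,
       (SELECT a.State FROM Address a
        WHERE a.PersonID = Person.PersonID AND a.isPrimary = 1) AS State
FROM Person
ORDER BY Person.lName, Person.fName
```

Every person row is returned either way; the subqueries simply yield NULL when there is no primary address.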

Perfect, thank you.

Wednesday, March 21, 2012

Memory used

I've got this problem: I execute a bulk insert of about 500,000 records via a stored procedure. The bulk insert works fine, but the memory used grows very fast and then never goes down.
So if I try to do another bulk insert via the same stored procedure, the procedure doesn't work and the statement is terminated without any error.
The server is a Pentium 4 (single processor) at 1.5 GHz with 1 GB of memory, running Windows 2000 Server SP3, and is used for development purposes.
Thank you, Marco
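For context, a minimal sketch of such a procedure is below. The table name, file path, and options are placeholders, not the original code; the BATCHSIZE option is one documented way to commit the load in chunks rather than as one giant transaction.

```sql
-- Sketch: a stored procedure wrapping BULK INSERT.
-- Table name, file path, and delimiters are placeholders.
CREATE PROCEDURE dbo.usp_LoadStaging
AS
BEGIN
    BULK INSERT dbo.Staging
    FROM 'C:\data\load.txt'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        BATCHSIZE       = 50000,  -- commit in batches to limit log growth
        TABLOCK
    )
END
```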
Are you sure there are no other activities on the server at the same time?
Have you looked at Profiler?

Friday, March 9, 2012

Memory Settings

Hi,
I have SQL Server Standard Edition running on Windows 2000 Server SP3 with 1 GB of RAM. The server records plant data for an industrial application every minute.
After a week of use I have noticed in Task Manager that memory consumption has increased, yet in Performance Monitor total server memory is equal to target server memory. Is this correct, given that the memory settings limit SQL Server to 511 MB and the SQL Server process is larger than this?
Surely when SQL Server reaches its limit it should release memory back to the system. Is this a memory leak, or normal?
After another couple of days SQL Server crashes and the only way out is a reboot. Can anyone help, please? Very frustrating.
If SQL Server were crashing, you should raise a case with Microsoft Product
Support so the crash can be diagnosed and remedied.
Hope this helps,
--
Eric Cárdenas
Senior support professional
This posting is provided "AS IS" with no warranties, and confers no rights.


Wednesday, March 7, 2012

Memory problems using Sql Server 2000

I have an application which inserts/updates upwards of hundreds of thousands of records into a SQL Server table. I am using ADO: I make my ADO connection, then use ADO commands to insert or update each record as appropriate. At the end of the insert/update I release the connection, then repeat the processing using a different table in the same database.
The memory usage of Sql Server 2000 running under Win2k SP4 shows
approximately 675 MB after inserting a few hundred thousand records,
while the memory of the application stays very reasonable. This seems
excessive even under today's systems and my 1 GB of real memory. Why is
the Sql Server 2000 memory consumption building up so greatly ? I can
expect a certain hit as a connection to a SqlServer database is
established, and as each command is issued I can expect a certain hit,
but I would think that memory is being returned after each command is
executed, and after the connection is closed. Yet the memory keeps
building to a great amount. Even after my application is finished, and
exits, and all connections are closed, Task Manager shows the 675 MB of
memory still being used by Sql Server 2000. Any ideas on keeping this
memory to a reasonable amount, or why such a huge amount of memory is being used by Sql Server 2000, would be appreciated.

SQL Server consumes memory as required, but doesn't give it back unless the OS or other applications need it. The assumption is, I suppose, that if you need the memory now, you'll probably need it again in five minutes, so why go through the hassle of releasing it just to grab it again, if no other apps are demanding it.
Are you seeing performance issues? Do you have other apps on the box that
need the memory? Or are you just watching task manager? Typically, in this
kind of scenario, you wouldn't have other apps or services on the box that
require memory at levels that compete with SQL Server.
As for "keeping this memory to a reasonable amount" yes it is possible in
the server properties to limit the upper bound or to even fix the amount of
memory used by SQL Server. Of course, this will affect the performance of
the server when it actually would have used more than what *you* consider to
be a "reasonable amount" so I don't recommend this unless you absolutely
need to reserve the memory for other processes.
A
By default, SQL Server memory allocation is set to dynamic. This means SQL Server
will take as much memory as necessary. It's more efficient for SQL Server to read data from memory than from disk.
If other applications request more memory SQL will relinquish what it is
holding.
You can change this dynamic memory setting if you wish and "cap" the amount
of memory SQL consumes.
HTH. Ryan
Aaron Bertrand [SQL Server MVP] wrote:
> SQL Server consumes memory as required, but doesn't give it back unless the
> os or other applications need it.
That is an odd use of memory. Most modules return dynamic memory after
it is no longer being used. This makes it much easier for the Windows
memory management to find needed memory for other modules.

> The assumption is, I suppose, that if you
> need the memory now, you'll probably need it again in five minutes, so why
> go through the hassle of releasing it just to grab it again, if no other
> apps are demanding it.
When it builds up to 675 MB it definitely is slowing anything else that
is running, including the module(s) which are using Sql Server for
database access.

> Are you seeing performance issues? Do you have other apps on the box that
> need the memory? Or are you just watching task manager?
See comment above.

> Typically, in this
> kind of scenario, you wouldn't have other apps or services on the box that
> require memory at levels that compete with SQL Server.
No, there are no other modules which consume 675 MB of memory.

> As for "keeping this memory to a reasonable amount" yes it is possible in
> the server properties to limit the upper bound or to even fix the amount of
> memory used by SQL Server. Of course, this will affect the performance of
> the server when it actually would have used more than what *you* consider to
> be a "reasonable amount" so I don't recommend this unless you absolutely
> need to reserve the memory for other processes.
I need SQL Server not to slow down the system when it gets up to using 675 MB. How does one limit the maximum amount of dynamic memory SQL Server uses? Is there really no way to tell it, programmatically, to release the dynamic memory it has accumulated for a given connection when that connection closes? Frankly, I am surprised at this "hoggish" behavior of SQL Server: it just holds on to all the memory it has accumulated for a given connection even after that connection has closed. This appears to mean that as long as SQL Server is running in a particular Windows session on a machine, it will never use less than the maximum amount of memory it has allocated during that session.

> A

Ryan wrote:
> By default SQL memory allocation is set to dynamic. This means SQL Server
> will take as much memory as neccessary. Its more efficient for SQL to read
> data from memory than disk.
> If other applications request more memory SQL will relinquish what it is
> holding.
> You can change this dynamic memory setting if you wish and "cap" the amount
> of memory SQL consumes.
How does one cap the amount of memory which it consumes, programmatically or otherwise? Is there really no way to tell SQL Server, programmatically, to release whatever memory has been allocated for a given connection when that connection has closed? I find it very odd that SQL Server simply holds on to dynamic memory allocated for a particular connection after that connection is closed.

> That is an odd use of memory.
Maybe to you... it sounds like you have much more experience in runtime
applications than with services.

> Most modules return dynamic memory after it is no longer being used. This
> makes it much easier for the Windows memory management to find needed
> memory for other modules.
SQL Server expects to be the only major application/service on a server.
That is why we have terms like "database server" and "application server."
I understand that budgets sometimes dictate they are one and the same, but
when you have fewer servers, expect tradeoffs.

> When it builds up to 675 MB it definitely is slowing anything else that is
> running, including the module(s) which are using Sql Server for database
> access.
What else is running *on the same box*? I can't imagine a scenario where
some program needs to be running on the same box, and the only time it needs
675 MB to operate just happens to coincide with the times when SQL Server
*does not* need 675 MB. Does this magical app have anything to do with the
database at all? Have you tried running the app on a different server?

> I need for Sql Server to not slow down the system when it gets up to using
> 675 MB.
How have you proven that it is SQL Server that is slowing down the system? Do you have any actual metrics? Unless your app is using the other 500 MB and then some and is paging to disk... in which case I would look at the app before looking at the database. SQL Server is in use in thousands upon thousands of environments; do you think you are the only one witnessing this "bizarre phenomenon" of SQL Server "stealing" memory and not giving it back?
Besides, what kind of server that you expect good performance from has 1 GB of memory these days? Maybe you should consider adding more memory instead of intentionally crippling the database.

> How does one limit the maximum amount of dynamic memory Sql Server uses ?
Do you have Enterprise Manager? Have you tried right-clicking on the server
name, selecting Properties, and moving to the Memory tab? Now, you can
change these settings and restart SQL Server, but let me be the first to
predict that you will be back here complaining that it made matters worse.
SQL Server is using 675 MB because it NEEDS it. Why should it give it back?
If you run a query that returns a resultset that is sized 500 MB, SQL Server
may be able to keep that in memory so that the *next* time you call the same
query, it will be faster. If you go and do something foolish like limit the
amount of memory SQL Server can use, to something less than this query
requires, (a) it is going to be slower because it may have to page some of
the results to disk, and (b) successive invocations of the same query will
have to do the same thing, so you lose economies of scale, too. Unless you
have SQL Server for no reason and can eliminate it from the equation, I
don't see how this can possibly make the system faster.
Maybe you could focus on tuning your queries and limiting the amount of sorting and data retrieval that is going on, so that SQL Server uses less memory. How much memory do you think is appropriate? What kind of testing and benchmarking have you done? What kind of testing and benchmarking are you going to do after you make the change? Or are you just going to walk away, happy knowing that your app will always have free rein over the memory on the system?

> Frankly I am surprised at this "hoggish" behavior of Sql Server to just
> hold on to all the memory it has accumulated for a given connection even
> when that connection has closed.
Yes, this surprises a lot of people. Unfortunately for you, this is how it
works. And the only solution you have come up with so far is *not* going to
make things better.

> This appears to mean to me that as long as Sql Server is running in a
> particular Windows session on a machine it will never use less than the
> maximum amount of memory it has allocated for that session.
That is correct. Next you'll be asking how to limit the disk space and
number of CPUs SQL Server uses, because all of your other apps need that
stuff sometimes too. :-(
A

In Enterprise Manager, right-click the server name, select Properties, and go to the Memory tab. You will see two radio buttons, dynamic and fixed memory, with a slider.
Alternatively, the same can be achieved in T-SQL via sp_configure. The setting is the value in MB, so 2048 for 2 GB, etc.
You will need to restart the SQL Server service for this to take effect if you are on SQL 2000 or earlier; I don't think that's the case for 2005, but I'm not sure.
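As a sketch, the sp_configure route looks like this. 'max server memory' is an advanced option, so 'show advanced options' must be enabled first; the 511 here simply mirrors the limit mentioned in the earlier thread.

```sql
-- Sketch: cap SQL Server's memory at 511 MB via sp_configure.
EXEC sp_configure 'show advanced options', 1
RECONFIGURE

EXEC sp_configure 'max server memory', 511   -- value in MB
RECONFIGURE
```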
HTH. Ryan
Ryan wrote:
> In EnterPrise Manager Right Click the ServerName
> Properties
> Memory Tab
> You will see x2 radio buttons, dynamic and fixed memory with a slide rule.
> OR in TSQL the same can be achevied via sp_configure. The setting is the
> value in MB, so 2048 for 2 gb etc.
> You will need to restart the SQL Service for this to take effect if you are
> on SQL2000 or earlier, don't think that's the case for 2005 but i'm not
> sure.
> sure.
Thanks for the information.

> How does one cap the amount of memory which it consumes, programmatically
> or otherwise?
Egads, have you even OPENED Enterprise Manager? Maybe a picture would help?
http://www.aspfaq.com/img/memory_tab.gif
Do you know what Books Online is?
http://www.aspfaq.com/2229
http://msdn.microsoft.com/library/e...tun_1a_6rn7.asp
http://msdn.microsoft.com/library/e...config_9zfy.asp
How about Google?
http://tinyurl.com/dzl96 (find the third instance of "release" on the
page)
http://www.sql-server-performance.com/q&a115.asp
And Google Groups?
http://tinyurl.com/bjttb
And the Knowledge Base?
http://support.microsoft.com/kb/321363/
http://support.microsoft.com/kb/319942/

> I find it very odd that Sql Server simply holds on to dynamic memory
> allocated for a particular connection, after that connection is closed.
Why on earth do you think that the memory is used only for that connection?
Isn't it even remotely possible that five minutes (or five seconds) from
now, some other connection might query the same data pages and require the
same amount of memory? Why make SQL Server start from scratch, instead of
allowing it to keep the most frequently accessed data pages in memory,
speeding up the majority of your queries? Limiting SQL Server's memory is
like cutting off your left foot while lying on the couch. You don't need it
right now, so what good is it, right?

> Limiting SQL Server's memory is
> like cutting off your left foot while lying on the couch. You don't need it
> right now, so what good is it, right?
This belongs on a T-shirt. :)
ML
http://milambda.blogspot.com/