Showing posts with label newbie. Show all posts

Wednesday, March 28, 2012

Merge Dynamic filter problem

Hi!

I'm a merge newbie and have a couple of questions. I'm about to set up merge replication with SQL Server 2005 as the Publisher and SQL Server CE as a Subscriber. The situation is this: we have 10 service technicians using PDAs.

I want each PDA user to have their own data. From what I understand, I need to use a dynamic filter and SUSER_NAME()? Do I need to create a "translation" table to map my system's UserId against SUSER_NAME()? How have you solved this problem?

/Magnus

Hello Magnus,

One easy approach is to have a column in the table for filtering purposes.

Please take a look at sp_addmergearticle (Transact-SQL) at http://msdn2.microsoft.com/en-us/library/ms174329.aspx.


[ @subset_filterclause = ] 'subset_filterclause'
Is a WHERE clause specifying the horizontal filtering of a table article, without the word WHERE included. subset_filterclause is nvarchar(1000), with a default of an empty string.

Important:
For performance reasons, we recommend that you not apply functions to column names in parameterized row filter clauses, such as LEFT([MyColumn]) = SUSER_SNAME(). If you use HOST_NAME in a filter clause and override the HOST_NAME value, you might have to convert data types by using CONVERT. For more information about best practices for this case, see the section "Overriding the HOST_NAME() Value" in Parameterized Row Filters.

Parameterized Row Filters - http://msdn2.microsoft.com/en-us/library/ms152478.aspx.

and

[ @partition_options = ] partition_options
Defines the way in which data in the article is partitioned, which enables performance optimizations when all rows belong in only one partition or in only one subscription. partition_options is tinyint, and can be one of the following values.

0 (default): The filtering for the article either is static or does not yield a unique subset of data for each partition, that is, an "overlapping" partition.

1: The partitions are overlapping, and data manipulation language (DML) updates made at the Subscriber cannot change the partition to which a row belongs.

2: The filtering for the article yields non-overlapping partitions, but multiple Subscribers can receive the same partition.

3: The filtering for the article yields non-overlapping partitions that are unique for each subscription.
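Putting the two options together, a dynamic filter for the PDA scenario might look like the sketch below. This is only a sketch: the publication, table, and column names (PdaPub, SalesOrders, AssignedLogin) are hypothetical, and it assumes each technician connects with their own SQL login, so SUSER_SNAME() identifies them directly and no separate translation table is needed.

```sql
-- Hypothetical example: filter each technician's rows by their SQL login.
-- Assumes a column AssignedLogin was added to the table to hold the login name.
EXEC sp_addmergearticle
    @publication         = N'PdaPub',        -- hypothetical publication name
    @article             = N'SalesOrders',   -- hypothetical article/table
    @source_object       = N'SalesOrders',
    @subset_filterclause = N'AssignedLogin = SUSER_SNAME()', -- no WHERE keyword
    @partition_options   = 3;  -- non-overlapping partitions, unique per subscription
```

With @partition_options = 3 the filter must yield a unique partition per subscription; if two technicians could ever share rows, the default of 0 would be the safer choice.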

Merge and Join

Hi, I am a newbie to the world of Microsoft SQL Server Integration Services 2005. I am trying to merge two tables using a specified join, such as an equi-join, with a complex set of criteria on top. I know this can't be done with the Merge/Merge Join transformations, so how do I go about it, other than using the "Execute SQL Task", which doesn't help me, since I need the data derived from the complex SQL as an input to the transformations in my next Data Flow task.

Hi Ken Hu,

could you be a bit more specific about what you are trying to achieve? (e.g. by giving us some pseudo-code?)

This way more people will be able to give you bits of answers.

cheers

Thibaut
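One common workaround is to put the complex join in the source itself: an OLE DB Source in the Data Flow can use a SQL command as its data access mode, so the joined result feeds the downstream transformations directly. A hedged sketch, with hypothetical table and column names:

```sql
-- Hypothetical source query for an OLE DB Source (data access mode: SQL command).
-- The equi-join plus the extra criteria run in the database, and the result
-- flows into the Data Flow as a single input.
SELECT o.OrderID, o.OrderDate, c.CustomerName
FROM   Orders o
JOIN   Customers c
  ON   c.CustomerID = o.CustomerID          -- the equi-join
WHERE  o.OrderDate >= '20050101'            -- example of additional criteria
  AND  c.Region IN ('North', 'East');
```

This keeps the complexity out of the package and avoids the sorting requirements of the Merge Join transformation.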

Friday, March 23, 2012

Merge Replication Failed

Hi guys, I'm a newbie but need to learn more about this, so please bear with me and help solve my problem.

My SQL Server starts merging and begins dumping the schema and data; after some time the status shows failed, without giving any error message. Please let me know what to check and how to resolve this.

The merge agent must be saying something. Are you looking at the SQL Server Agent job history?

Also, you can try running the merge agent from the command line with increased verbosity: -OutputVerboseLevel 2.

|||

yeah there is an error saying that

"The process could not deliver the snapshot to the Subscriber. Note: The step was retried the requested number of times (10) without succeeding. The step failed."

So please let me know what to do about this. Thanks and praises in advance.

|||

Try one or all of the following:

1. Expand all the + signs in the job history view and see if there is any relevant information there.

2. Run the merge agent from the command-line tool D:\Program Files\Microsoft SQL Server\90\COM\replmerg.exe with all the relevant parameters, and also add -OutputVerboseLevel 2.

3. Look in distribution..MSmerge_history for the error message for this session.
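For step 3, a query along these lines should surface the session's error text (assuming the distribution database has the default name, distribution):

```sql
-- Look up recent merge agent history on the Distributor, most recent first.
SELECT TOP (20) time, comments, error_id
FROM   distribution.dbo.MSmerge_history
ORDER  BY time DESC;
```

A non-zero error_id can then be looked up in distribution.dbo.MSrepl_errors for the full error detail.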

Monday, March 19, 2012

Memory usage in Task Manager

Hello,

I am a newbie with SQL Server, and my problem is this:

My SQL Server runs with many client queries, and after a while I can observe that the memory usage of SQL Server shown in the Windows Task Manager keeps growing (e.g. 260 MB!).

I checked Books Online and found the settings "min/max server memory", which I set to

min = 4 MB
max = 20 MB

using Enterprise Manager.

Then I restarted SQL Server, checked the memory settings again in Enterprise Manager, and started many client queries. Nevertheless, the memory usage in Task Manager exceeded 20 MB.

What am I doing wrong? How can I limit the memory usage of SQL Server?

Thank you very much for any help...

After the change, did you stop and restart SQL?
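For reference, the same settings can be applied in T-SQL with sp_configure, which also makes it easy to verify they took effect. The values below mirror the ones in the question; note that max server memory only caps the buffer pool, so Task Manager can still show somewhat more than the cap.

```sql
-- Set min/max server memory (values in MB) and apply immediately.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'min server memory', 4;
EXEC sp_configure 'max server memory', 20;
RECONFIGURE;
-- Verify the running values.
EXEC sp_configure 'max server memory';
```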

Wednesday, March 7, 2012

Memory problems & package performance

Hi,

I know the SSIS memory problem has probably been covered quite a bit, but being a newbie to SSIS, I'm not quite sure how to improve the performance of my SSIS package.

Basically, I have a package that loops through all the subdirectories within a specified directory; it then loops through each file in each subdirectory and, within the Data Flow, processes each file (according to its filename) with a Script Component to insert data into a SQL database.

Each subdirectory has up to 15 different CSV files, but each is less than 5 kB. I have about 100 subdirectories.

When I run the package, it functions properly at first, but after a while it stalls (no error, just stuck in one Data Flow), and when I checked, the CPU was running at 100%.

I'm not sure how to fix this or improve the memory allocation. I was not expecting any memory problems, as the file sizes are small and the number of rows going into and out of the Script Component is no more than 20.

Any advice? Thanks.

Just an update: I observed that running my package from Visual Studio (the development environment) takes up a large chunk of memory, so I closed VS and ran the package through DTEXEC.

From the Windows Task Manager, I observe that the memory usage of DTEXEC.exe keeps increasing, so I'm guessing that state from all the previously processed files is still being held in memory. Is there a way to flush the memory each time I process a new file, since I don't need to keep data that has already been processed?

I'm not keen on increasing the PC's memory as a solution, as I don't think the SSIS memory usage should keep growing. Any suggestions?
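One way to keep the working set bounded, sketched below under the assumption that the package can take the subdirectory as a variable: run DTEXEC once per subdirectory from a batch file, so each run's memory is released when the process exits. The package path, data path, and variable name (MyPackage.dtsx, C:\Data, User::SourceDir) are all hypothetical.

```bat
@echo off
rem Hypothetical sketch: invoke the package once per subdirectory so memory
rem is released between runs. Paths and the variable name are assumptions.
for /d %%D in ("C:\Data\*") do (
    dtexec /File "C:\Packages\MyPackage.dtsx" /Set \Package.Variables[User::SourceDir].Value;"%%D"
)
```

The trade-off is process-startup overhead per subdirectory, but with roughly 100 small directories that is likely cheaper than an ever-growing single process.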