Search Results

Search found 38535 results on 1542 pages for 'sql memory'.


  • How can I execute a .sql from C#?

    - by J. Pablo Fernández
    For some integration tests I want to connect to the database and run a .sql file that has the schema needed for the tests to actually run, including GO statements. How can I execute the .sql file? (Or is this totally the wrong way to go?) I've found a post in the MSDN forum showing this code:

        using System.Data.SqlClient;
        using System.IO;
        using Microsoft.SqlServer.Management.Common;
        using Microsoft.SqlServer.Management.Smo;

        namespace ConsoleApplication1
        {
            class Program
            {
                static void Main(string[] args)
                {
                    string sqlConnectionString = "Data Source=(local);Initial Catalog=AdventureWorks;Integrated Security=True";
                    FileInfo file = new FileInfo("C:\\myscript.sql");
                    string script = file.OpenText().ReadToEnd();
                    SqlConnection conn = new SqlConnection(sqlConnectionString);
                    Server server = new Server(new ServerConnection(conn));
                    server.ConnectionContext.ExecuteNonQuery(script);
                }
            }
        }

    But on the last line I'm getting this error:

        System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. --- System.TypeInitializationException: The type initializer for '' threw an exception. --- .ModuleLoadException: The C++ module failed to load during appdomain initialization. --- System.DllNotFoundException: Unable to load DLL 'MSVCR80.dll': The specified module could not be found. (Exception from HRESULT: 0x8007007E).

    I was told to go and download that DLL from somewhere, but that sounds very hacky. Is there a cleaner way? Is there another way to do it? What am I doing wrong? I'm doing this with Visual Studio 2008, SQL Server 2008, .NET 3.5 SP1 and C# 3.0.
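
    A sketch of an alternative that avoids SMO entirely (and with it the MSVCR80.dll dependency): split the script on the GO batch separators and send each batch through a plain SqlCommand. The connection string and file path below are the ones from the question; the line-based split is a simplification that assumes GO always appears on its own line and never inside strings or comments.

        using System;
        using System.Data.SqlClient;
        using System.IO;
        using System.Text.RegularExpressions;

        class RunSqlScript
        {
            static void Main()
            {
                string connectionString = "Data Source=(local);Initial Catalog=AdventureWorks;Integrated Security=True";
                string script = File.ReadAllText("C:\\myscript.sql");

                // GO is a batch separator understood by the client tools, not by the
                // server, so split the script into batches and send each one separately.
                string[] batches = Regex.Split(script, @"^\s*GO\s*$",
                                               RegexOptions.Multiline | RegexOptions.IgnoreCase);

                using (SqlConnection conn = new SqlConnection(connectionString))
                {
                    conn.Open();
                    foreach (string batch in batches)
                    {
                        if (batch.Trim().Length == 0)
                            continue;
                        using (SqlCommand cmd = new SqlCommand(batch, conn))
                        {
                            cmd.ExecuteNonQuery();
                        }
                    }
                }
            }
        }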

    Read the article

  • Protecting an Application's Memory From Tampering

    - by Changeling
    We are adding AES 256-bit encryption to our server and client applications to encrypt the TCP/IP traffic that carries sensitive information. We will be rotating the keys daily, so the keys will be stored in memory with the applications. Key distribution process:

    - Each server and client has a list of initial Key Encryption Keys (KEKs), one per day.
    - If the client or the server has just started up, the client requests the daily key from the server using the initial key. The server responds with the daily key, encrypted with the initial key. The daily key is a randomly generated string of alphanumeric characters; we are using AES 256-bit encryption.
    - All subsequent communication is encrypted using that daily key.
    - Nightly, the client requests the new daily key from the server, using the current daily key as the current KEK. After the client gets the new key, the new daily key replaces the old one.

    Is it possible for another, malicious application to gain access to this memory illegally, or is this protected in Windows? The key will not be written to a file, only stored in a variable in memory. If an application can access the memory illegally, how can you protect the memory from tampering? We are using C++ on XP (Vista/7 may be an option in the future, so I don't know whether that changes the answer).
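
    Nothing fully stops a process running as administrator (or holding SeDebugPrivilege) from reading another process's memory with ReadProcessMemory, but the window during which the key sits in cleartext can be shrunk. A sketch using CryptProtectMemory, which exists on Vista and later only (on XP the rough equivalent is RtlEncryptMemory); the buffer and function names are illustrative:

        // Requires linking against Crypt32.lib.
        #include <windows.h>
        #include <dpapi.h>

        // CryptProtectMemory needs a length that is a multiple of
        // CRYPTPROTECTMEMORY_BLOCK_SIZE (16); a 32-byte AES-256 key qualifies.
        static unsigned char g_dailyKey[32];   // filled in by the key-exchange code

        void LockKey()
        {
            // Encrypt the key in place so it is not readable between uses.
            CryptProtectMemory(g_dailyKey, sizeof(g_dailyKey),
                               CRYPTPROTECTMEMORY_SAME_PROCESS);
        }

        void UnlockKeyForUse()
        {
            // Decrypt in place just before the key is needed.
            CryptUnprotectMemory(g_dailyKey, sizeof(g_dailyKey),
                                 CRYPTPROTECTMEMORY_SAME_PROCESS);
        }

        void WipeKey()
        {
            // Scrub the buffer when the key is replaced by the next daily key.
            SecureZeroMemory(g_dailyKey, sizeof(g_dailyKey));
        }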

    Read the article

  • Memory Warning but Small Live Bytes

    - by Kamchatka
    Hi everyone, in my application I get a memory warning of level 1 and then level 2 after repeating some action (choosing a picture + processing) several times, and then a crash. The Leaks tool doesn't show any leak. I'm also following the Allocations tool in Instruments: my live bytes are roughly 4 MB, and overall I allocate 113 MB. At maximum I have maybe 20 MB in memory when the picture is loaded. Since I have to repeat an action to get to the crash, it is very likely to be a memory leak. However, I don't know how to locate it, since my live bytes stay around 4 MB and everything allocated seems to be something that is supposed to be there (apart from a small leak of ~100 KB in UIImagePickerController). How much can I trust the memory leak/allocation tools? Would you have any advice to help me locate the reason for the problem?

    Read the article

  • SQL Server 2005 Agent not working

    - by flaggers
    SQL Server 2005, Service Pack 2, version 9.00.3042.00. All maintenance plans fail with the same error. The details of the error are:

        Execute Maintenance Plan
        Execute maintenance plan. test7 (Error)
        Messages
        Execution failed. See the maintenance plan and SQL Server Agent job history logs for details.

    The advanced information section shows the following:

        Job 'test7.Subplan_1' failed. (SqlManagerUI)
        Program Location:
        at Microsoft.SqlServer.Management.SqlManagerUI.MaintenancePlanMenu_Run.PerformActions()

    At this point the following appears in the Windows event log:

        Event Type: Error
        Event Source: SQLISPackage
        Event Category: None
        Event ID: 12291
        Date: 28/05/2009
        Time: 16:09:08
        User: 'DOMAINNAME\username'
        Computer: SQLSERVER4
        Description: Package "test7" failed.

    and also this:

        Event Type: Warning
        Event Source: SQLSERVERAGENT
        Event Category: Job Engine
        Event ID: 208
        Date: 28/05/2009
        Time: 16:09:10
        User: N/A
        Computer: SQLSERVER4
        Description: SQL Server Scheduled Job 'test7.Subplan_1' (0x96AE7493BFF39F4FBBAE034AB6DA1C1F) - Status: Failed - Invoked on: 2009-05-28 16:09:02 - Message: The job failed. The Job was invoked by User 'DOMAINNAME\username'. The last step to run was step 1 (Subplan_1).

    There are no entries in the SQL Agent log at all.

    Read the article

  • Objective-C for the iPhone: Mystery memory leak

    - by user200341
    My application seems to have 4 memory leaks (on the device, running Instruments). The memory leaks seem to come from this code:

        NSURL *url = [self getUrl:destination];
        [destination release];
        NSMutableURLRequest *request = [[NSMutableURLRequest alloc] initWithURL:url];
        [url release];
        [request setHTTPMethod:@"GET"];
        [request addValue:@"application/json" forHTTPHeaderField:@"content-type"];
        NSURLConnection *connection = [[NSURLConnection alloc] initWithRequest:request delegate:self];
        [request release];
        [connection release];

    I am releasing all my objects as far as I can see, but it's still showing this as the source of the 4 memory leaks. This is on the device running 3.1.3. Is it acceptable to have a few memory leaks in your app, or do they all have to go?

    Read the article

  • C# performance varying due to memory

    - by user1107474
    I hope this is a valid post here; it's a combination of C# issues and hardware. I am benchmarking our server because we have found problems with the performance of our quant library (written in C#). I have simulated the same performance issues with some simple C# code that does very heavy memory usage. The code below is in a function which is spawned from a threadpool, up to a maximum of 32 threads (because our server has 4 CPUs with 8 cores each). This is all on .NET 3.5. The problem is that we are getting wildly differing performance. I run the function below 1000 times. The average time taken for the code to run might be, say, 3.5 s, but the fastest will be only 1.2 s and the slowest will be 7 s - for the exact same function! I have graphed the memory usage against the timings and there doesn't appear to be any correlation with the GC kicking in. One thing I did notice is that when running in a single thread the timings are identical and there is no wild deviation. I have also tested CPU-bound algorithms and the timings are identical too. This has made us wonder if the memory bus just cannot cope. I was wondering, could this be another .NET or C# problem, or is it something related to our hardware? Would the experience be the same if I had used C++ or Java? We are using 4x Intel X7550 with 32 GB RAM. Is there any way around this problem in general?

        Stopwatch watch = new Stopwatch();
        watch.Start();
        List<byte> list1 = new List<byte>();
        List<byte> list2 = new List<byte>();
        List<byte> list3 = new List<byte>();
        int Size1 = 10000000;
        int Size2 = 2 * Size1;
        int Size3 = Size1;
        for (int i = 0; i < Size1; i++)
        {
            list1.Add(57);
        }
        for (int i = 0; i < Size2; i = i + 2)
        {
            list2.Add(56);
        }
        for (int i = 0; i < Size3; i++)
        {
            byte temp = list1.ElementAt(i);
            byte temp2 = list2.ElementAt(i);
            list3.Add(temp);
            list2[i] = temp;
            list1[i] = temp2;
        }
        watch.Stop();

    (The code is just meant to stress the memory.) I would include the threadpool code, but we used a non-standard threadpool library.

    EDIT: I have reduced Size1 to 100000, which basically doesn't use much memory, and I still get a lot of jitter. This suggests it's not the amount of memory being transferred, but the frequency of memory grabs?
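
    One knob worth ruling out before blaming the hardware (a guess based on the symptoms, not a diagnosis): a plain console process defaults to the workstation GC, and on a 32-core box the server GC, with one heap and one collector thread per core, often gives steadier timings for allocation-heavy code like the lists above. It is a one-line config change:

        <!-- app.config of the benchmark process -->
        <configuration>
          <runtime>
            <gcServer enabled="true"/>
          </runtime>
        </configuration>

    If the jitter persists with server GC and with pre-sized arrays instead of List<byte>, that points more strongly at memory-bandwidth contention between the sockets.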

    Read the article

  • Ways to avoid Memory Leaks in C/C++

    - by Ankur
    What are some tips I can use to avoid memory leaks in my applications? In my current project I use a tool ("INSURE++") which finds memory leaks and generates a report. Apart from such a tool, is there any method to identify memory leaks and overcome them?
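
    Most of the standard advice boils down to not calling delete/free by hand at all: give every resource an owner whose destructor releases it (RAII). A small illustrative sketch using C++11 and later; the types and file name are made up for the example:

        #include <fstream>
        #include <memory>
        #include <vector>

        struct Packet { unsigned char bytes[256]; };

        void process()
        {
            // Ownership lives in objects whose destructors free the resource,
            // so every exit path (including exceptions) releases the memory.
            std::vector<Packet> buffer(1024);           // no new[]/delete[] to forget
            std::unique_ptr<Packet> p(new Packet());    // freed when p goes out of scope
            std::ofstream log("run.log");               // file handle closed automatically
            // ... use buffer, p and log ...
        }   // everything above is released here, even if an exception was thrown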

    Read the article

  • How to research unmanaged memory leaks in .NET?

    - by Brandon
    I have a WCF service running over MSMQ. Memory gradually increases over time, indicating that there is some sort of memory leak. I ran the service locally and monitored some counters using PerfMon. The total CLR managed heap bytes remain relatively constant, while the process's private bytes increase over time. This leads me to believe that there is some sort of unmanaged memory leak. Assuming that an unmanaged memory leak is the issue, how do I address it? Are there any tools I could use to give me hints as to what is causing it? Also, all my service does is read from the transactional queue and write to a database, all as part of a DTC transaction (handled under the hood by requiring a transaction on the service contract). I am not doing anything explicitly with COM or DllImports. Thanks in advance!

    Read the article

  • C#: simulate memory leaks

    - by dotnet-practitioner
    Hi, I would like to write the following code in C#: (a) a small console application that simulates a memory leak; (b) a small console application that invokes the application above and kills it right away, simulating management of the memory-leak problem. In other words, application (b) would continuously start and release application (a), to simulate how the "rebellious" leaking application is contained without addressing the root cause, which is application (a). Some sample code for applications (a) and (b) would be very helpful. Thanks
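
    A minimal sketch of both pieces, assuming two separate console projects and placeholder paths and timings: LeakyApp grabs unmanaged memory it never frees, so its working set climbs steadily; Supervisor launches it, lets it run for a while, then kills and restarts it, containing the symptom without touching the cause.

        // ===== LeakyApp\Program.cs : application (a), leaks ~1 MB of unmanaged memory per second =====
        using System;
        using System.Runtime.InteropServices;
        using System.Threading;

        class LeakyApp
        {
            static void Main()
            {
                while (true)
                {
                    // Allocate unmanaged memory and deliberately "lose" the pointer;
                    // the GC never sees it, so it is never reclaimed.
                    Marshal.AllocHGlobal(1024 * 1024);
                    Thread.Sleep(1000);
                }
            }
        }

        // ===== Supervisor\Program.cs : application (b), recycles the leaky process on a schedule =====
        using System;
        using System.Diagnostics;
        using System.Threading;

        class Supervisor
        {
            static void Main()
            {
                while (true)
                {
                    using (Process leaky = Process.Start("LeakyApp.exe"))   // placeholder path
                    {
                        Thread.Sleep(TimeSpan.FromSeconds(30));             // let the leak accumulate
                        leaky.Refresh();
                        Console.WriteLine("Working set is {0:N0} bytes; recycling.", leaky.WorkingSet64);
                        if (!leaky.HasExited)
                            leaky.Kill();
                        leaky.WaitForExit();
                    }
                }
            }
        }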

    Read the article

  • SQL Server CE, Visual Studio 2008/2010 RC, and Linq-to-Sql

    - by blu
    I added an .sdf to my project, added a table, added a LINQ-to-SQL .dbml, and tried to add the table to the .dbml. The result was an error: "The selected object(s) are an unsupported data provider". This happens in both VS 2008 Professional SP1 and 2010 RC Ultimate. I found someone talking about using SQL Metal to generate the file, but I didn't enjoy that 2 years ago, and after a little playing around I recall why. Does anyone know if this is going to be supported in the release version? Should I abandon SQL Server CE and just use SQLite (with DbLinq)? Thanks for any insight.

    Read the article

  • Nested SELECT clause in SQL Compact 3.5

    - by Sasha
    In this post "select with nested select" I read that SQL Compact 3.5 (SP1) support nested SELECT clause. But my request not work: t1 - table 1 t2 - table 2 c1, c2 = columns select t1.c1, t1.c2, (select count(t2.c1) from t2 where t2.id = t1.id) as count_t from t1 Does SQL Compact 3.5 SP1 support nested SELECT clause in this case? Update: SQL Compact 3.5 SP1 work with this type of nested request: SELECT ... from ... where .. IN (SELECT ...) SELECT ... from (SELECT ...)

    Read the article

  • SQL DB problem with Windows authentication

    - by Jimmy
    I have a SQL Server 2008 database which I connect to with Windows Authentication. It has worked fine for 7-8 months, but when I came to work today it would no longer connect, without my having changed anything. The error message was: "Cannot open user default database. Login failed. Login failed for user 'Jimmy-PC\Jimmy'.", where the first part is the computer name and the second is the user. The problem seems to be that it tries to connect to the default database. I have tried to change it without success. I do not have the SQL Server management tools for SQL 2008, only for 2005. Has anyone had a similar experience? Nothing was touched over the weekend, and it worked last Friday without any problems.
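
    For what it's worth, the usual way out of "Cannot open user default database" is to connect while explicitly naming a database that does exist, then repoint the login's default. A sketch using sqlcmd and the login name from the error message:

        -- connect with a known-good database first, e.g.:  sqlcmd -E -S Jimmy-PC -d master
        ALTER LOGIN [Jimmy-PC\Jimmy] WITH DEFAULT_DATABASE = master;

    The SSMS 2005 tools can do the same thing: on the Connect dialog, Options > Connection Properties > Connect to database lets you pick master instead of the broken default.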

    Read the article

  • Write a SQL query to get the last data

    - by Lu Lu
    Hello everyone, I have a Realtime table with example data:

        Symbol   Date                 Value
        ABC      1/3/2009 03:05:01    327    -- is last data for 'ABC'
        ABC      1/2/2009 03:05:01    326
        ABC      1/2/2009 02:05:01    323
        ABC      1/2/2009 01:05:01    313
        BBC      1/3/2009 03:05:01    458    -- is last data for 'BBC'
        BBC      1/2/2009 03:05:01    454
        BBC      1/2/2009 02:05:01    453
        BBC      1/2/2009 01:05:01    423

    Please help me write a SQL query that returns the last data for every symbol. The result should be:

        Symbol   Date                 Value
        ABC      1/3/2009 03:05:01    327
        BBC      1/3/2009 03:05:01    458

    P.S.: I use SQL Server 2005, and the Realtime data is very big, so please optimize the SQL code. Thanks.
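
    On SQL Server 2005 the usual pattern is ROW_NUMBER partitioned by symbol; a supporting index on (Symbol, Date) helps keep it cheap. A sketch, assuming the table is named Realtime as in the question:

        SELECT Symbol, [Date], [Value]
        FROM (
            SELECT Symbol, [Date], [Value],
                   ROW_NUMBER() OVER (PARTITION BY Symbol ORDER BY [Date] DESC) AS rn
            FROM Realtime
        ) AS latest
        WHERE rn = 1;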

    Read the article

  • Need help with SQL query on SQL Server 2005

    - by Avinash
    We're seeing strange behavior when running two versions of a query on SQL Server 2005.

    Version A:

        SELECT otherattributes.*
        FROM listcontacts
        JOIN otherattributes ON listcontacts.contactId = otherattributes.contactId
        WHERE listcontacts.listid = 1234
        ORDER BY name ASC

    Version B:

        DECLARE @Id AS INT;
        SET @Id = 1234;
        SELECT otherattributes.*
        FROM listcontacts
        JOIN otherattributes ON listcontacts.contactId = otherattributes.contactId
        WHERE listcontacts.listid = @Id
        ORDER BY name ASC

    Both queries return 1000 rows; version A takes on average 15s; version B on average takes 4s. Could anyone help us understand the difference in execution times of these two versions of SQL? If we invoke this query via named parameters using NHibernate, we see the following query via SQL Server Profiler:

        EXEC sp_executesql N'SELECT otherattributes.* FROM listcontacts JOIN otherattributes ON listcontacts.contactId = otherattributes.contactId WHERE listcontacts.listid = @id ORDER BY name ASC', N'@id INT', @id=1234;

    ...and this tends to perform as badly as version A.
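
    A likely factor (a guess from the symptoms, not a certain diagnosis): the literal 1234 and the sp_executesql call are both compiled for that specific value, while a local variable is not sniffed and gets a generic, density-based estimate, and here the generic plan happens to be the faster one. A sketch of how to compare them on 2005; OPTION (RECOMPILE) lets the variable form compile with the current value, which should make it behave like version A if plan choice is the cause:

        SET STATISTICS IO ON;
        SET STATISTICS TIME ON;

        DECLARE @Id AS INT;
        SET @Id = 1234;
        SELECT otherattributes.*
        FROM listcontacts
        JOIN otherattributes ON listcontacts.contactId = otherattributes.contactId
        WHERE listcontacts.listid = @Id
        ORDER BY name ASC
        OPTION (RECOMPILE);   -- recompiles this statement using the actual value of @Id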

    Read the article

  • SQL Server ORDER BY/WHERE with nested select

    - by Echilon
    I'm trying to get SQL Server to order by a column from a nested select. I know this isn't the best way of doing this, but it needs to be done. I have two tables, Bookings and BookingItems. BookingItems contains StartDate and EndDate fields, and there can be multiple BookingItems on a Booking. I need to find the earliest StartDate and latest EndDate from BookingItems, then filter and sort by these values. I've tried a nested select, but when I try to use one of the selected columns in a WHERE or ORDER BY, I get an "Invalid column name" error:

        SELECT b.*,
               (SELECT COUNT(*)         FROM bookingitems i WHERE b.BookingID = i.BookingID) AS TotalRooms,
               (SELECT MIN(i.StartDate) FROM bookingitems i WHERE b.BookingID = i.BookingID) AS StartDate,
               (SELECT MAX(i.EndDate)   FROM bookingitems i WHERE b.BookingID = i.BookingID) AS EndDate
        FROM bookings b
        LEFT JOIN customers c ON b.CustomerID = c.CustomerID
        WHERE StartDate >= '2010-01-01'

    Am I missing something about SQL ordering? I'm using SQL Server 2008.
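
    Aliases defined in the SELECT list are not visible to the WHERE clause, because it is evaluated before the SELECT, which is what produces the "Invalid column name" error. Wrapping the query in a derived table turns the aliases into real columns; a sketch, assuming Bookings itself has no StartDate/EndDate columns that would collide with the aliases:

        SELECT x.*
        FROM (
            SELECT b.*,
                   (SELECT COUNT(*)         FROM bookingitems i WHERE i.BookingID = b.BookingID) AS TotalRooms,
                   (SELECT MIN(i.StartDate) FROM bookingitems i WHERE i.BookingID = b.BookingID) AS StartDate,
                   (SELECT MAX(i.EndDate)   FROM bookingitems i WHERE i.BookingID = b.BookingID) AS EndDate
            FROM bookings b
            LEFT JOIN customers c ON b.CustomerID = c.CustomerID
        ) AS x
        WHERE x.StartDate >= '2010-01-01'
        ORDER BY x.StartDate;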

    Read the article

  • Creating multiple users for a C#.NET WinForms application using SQL Server Express

    - by sqlchild
    I have a single SQL database in SQL Server Express, and a login, MAINLOGIN, for this database. I want data to be inserted into this database by multiple users, U1, U2, U3, each having a different user id and password. These users would be created by MAINLOGIN, either manually or via the WinForms application. So while creating MAINLOGIN, I would give it permission to create further logins. What should I do for this? I cannot create multiple users, because for one database only one user can be created under one login. So should I create multiple logins, L1, L2, L3, and then map U1, U2, U3 to them? Or is there a better way to do this, like application roles? I don't want to use Windows authentication, because if I know the system password, then I could simply connect to SQL via the application and insert wrong data.
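
    A login is a server-level principal and a user is its database-level mapping, so the usual shape is one login per person, each mapped to one user in the database and granted only what it needs. A T-SQL sketch of what MAINLOGIN could run (it needs ALTER ANY LOGIN on the server and ALTER ANY USER in the database; the names, password and table are placeholders):

        CREATE LOGIN U1 WITH PASSWORD = 'Str0ng#Passw0rd1';
        GO
        USE MyAppDb;
        GO
        CREATE USER U1 FOR LOGIN U1;
        GRANT INSERT ON dbo.MyTable TO U1;   -- or: EXEC sp_addrolemember 'db_datawriter', 'U1';

    The application can issue the same statements through a SqlCommand when it creates accounts, substituting the name and password as parameters it builds itself.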

    Read the article

  • Use LINQ to SQL results inside SQL Server stored procedure

    - by ifwdev
    Note: I'm not trying to call a SQL Server stored proc using a L2SQL DataContext. I use LINQPad for some fairly complex "reporting" that takes L2SQL output saved to an array and processes it further. For example, it's usually much easier to do multiple levels of grouping with LINQ to Objects than to try to optimize a T-SQL query to run in a reasonable amount of time. What would be the easiest way to take the end result of one of these "applications" and use it in a SQL Server 2008 stored proc? The idea is to use the data for a Reporting Services report, rather than copying and pasting into Excel (manual labor). The reports need to be accessible on the report server (not using the Report Server control in an application). I could output CSV and read that somehow via a command-line exec, but that seems like a hack. Thanks for your help.
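
    One low-friction route is to have the LINQPad script push its final array into a staging table with SqlBulkCopy, and let the stored procedure (and the Reporting Services dataset) read from that table. A sketch with made-up column names, a placeholder connection string and a hypothetical dbo.ReportStaging table:

        using System.Data;
        using System.Data.SqlClient;

        class PushResults
        {
            static void Main()
            {
                // Stand-in for the array produced by the LINQ-to-Objects step.
                var results = new[]
                {
                    new { Region = "East", Total = 1234.50m },
                    new { Region = "West", Total =  987.00m }
                };

                var table = new DataTable();
                table.Columns.Add("Region", typeof(string));
                table.Columns.Add("Total", typeof(decimal));
                foreach (var row in results)
                    table.Rows.Add(row.Region, row.Total);

                // Bulk-load into the staging table that the stored proc / report reads.
                using (var bulk = new SqlBulkCopy(@"Data Source=.;Initial Catalog=Reporting;Integrated Security=True"))
                {
                    bulk.DestinationTableName = "dbo.ReportStaging";
                    bulk.WriteToServer(table);
                }
            }
        }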

    Read the article

  • Cannot login to SQL Server 2008 R2 with Windows authentication

    - by Ian Boyd
    When I try to connect to SQL Server (2008 R2) using Windows authentication, I cannot. Checking the Windows Application event log, I find the error:

        Login failed for user 'AVATOPIA\ian'. Reason: Token-based server access validation failed with an infrastructure error. Check for previous errors. [CLIENT: ]
        Log Name: Application
        Source: MSSQLSERVER
        Event ID: 18456
        Level: Information
        User: AVATOPIA\ian
        OpCode:
        Task Category: Logon

    I can log into the computer itself using Windows authentication, and I can log into SQL Server using the local Windows Administrator account. We can connect to 8 other SQL Servers on the domain using Windows authentication; just this one, which is the only one that is 2008 R2, is failing, so I assume it's a bug with 2008 R2. Note: I cannot log on locally or remotely using Windows authentication, but I can log in locally and remotely using SQL Server authentication.

    Update: It's not limited to SQL Server Management Studio; standalone applications that connect using Windows authentication also fail. It's not a client problem, as we can connect fine to other (non-SQL Server 2008 R2) machines. I'm sure there's a technote or knowledge base article describing why SQL Server 2008 R2 is broken by default, but I can't find it.

    Update 2: Matt figured out the change that Microsoft made so that SQL Server 2008 R2 is broken by default: Administrators are no longer administrators. All that remains is to figure out how to make Administrators administrators. One of these days I'm going to start a list of changes around Microsoft's "broken by default" initiative.

    Steps to reproduce the problem (how do I add a group to the sysadmin fixed server role?). Here are the steps I try, that don't work:

    1. Click Add.
    2. Click Object Types and note that you have no ability to add groups; click OK.
    3. Under Enter the object names to select, enter Administrators.
    4. Click Check Names and note that you are not allowed to add groups; click Cancel.
    5. Click Browse... and note that you have no ability to add groups.

    You should now still not have added any group to the sysadmin role.

    Additional information: SQL Server Management Studio is being run as an administrator, and SQL Server is set to use Windows authentication. This was tried while logged into SQL with both sa and the only other sysadmin domain account (screenshots can be supplied for those who don't believe).
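
    Since SQL Server authentication still works, the group (or an individual account) can be granted sysadmin from T-SQL even though the SSMS dialog refuses to browse groups. A sketch, run while connected as the sa login mentioned above; BUILTIN\Administrators stands in for whichever Windows group or account should be sysadmin:

        CREATE LOGIN [BUILTIN\Administrators] FROM WINDOWS;
        EXEC sp_addsrvrolemember @loginame = N'BUILTIN\Administrators', @rolename = N'sysadmin';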

    Read the article

  • SQL 2008 R2 Named Instance Client Connectivity Issues?

    - by Jerry Dodge
    We're upgrading our software from SQL 2000 to 2008 R2. Our customers will be installing an update which uninstalls 2000 and installs 2008 R2 under the same instance; if no instance existed, then no instance name will be set (default). However, the problem starts with the customers who have a named SQL instance. Starting in 2008 R2 (not sure about earlier versions), for some reason a client connecting to the server by its instance name is unsuccessful. I'm testing from Management Studio - if I can't connect with this, then nothing can connect. I browse network servers and find the specific server\instance in the list, but upon trying to connect to an instance name like MyServer\INST, I get:

        A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified) (Microsoft SQL Server, Error: -1)

    I do in fact have the TCP/IP and Named Pipes protocols enabled; this is the first thing I did. When I connect to the server using a comma and a port number, like MyServer,49195, it works just fine. So it appears that client computers are just unable to resolve the instance names. This has happened on all our installations of SQL 2008 R2 and from all client computers, including Win 7, XP, Vista, Server 2008, and Server 2003. We never experienced such issues on earlier versions of SQL. The problem even persists if the firewalls and antiviruses are all disabled. Now, this is a large update which we will be distributing soon to all our customers, and we want to minimize the interaction they need with us to get it installed. We absolutely hate the idea of using a port number, because it will always be different, and we would have to modify each client to point to this server/port. Some of our customers may have hundreds of client computers. How do I make client connections to a named SQL instance work again? After all, this is the whole purpose of named instances, and if a client can't connect to an instance by its name, then what is it even named for?

    EDIT: It was mentioned to make sure SQL Browser is running, so I checked, and it is running. The server is also able to connect to itself (locally) - just external connections are refused.

    UPDATE: After more careful checking, I learned the firewall wasn't completely disabled when testing; upon disabling it completely, this works. So it appears that SQL Browser is being blocked by the firewall from external clients.
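
    That matches the way named instances resolve: the client asks the SQL Browser service on UDP 1434 for the instance's (usually dynamic) TCP port, so a firewall that only opens the default 1433 breaks ServerName\Instance while ServerName,port keeps working. If the fix needs to be scripted on customer machines, these are the two Windows Firewall rules involved (a sketch; the rule names are arbitrary and the sqlservr.exe path varies per install):

        netsh advfirewall firewall add rule name="SQL Browser (UDP 1434)" dir=in action=allow protocol=UDP localport=1434
        netsh advfirewall firewall add rule name="SQL Server (named instance)" dir=in action=allow program="C:\Program Files\Microsoft SQL Server\MSSQL10_50.MYINSTANCE\MSSQL\Binn\sqlservr.exe"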

    Read the article

  • How to analyze 'dbcc memorystatus' result in SQL Server 2008

    - by envykok
    Currently I am facing a SQL memory pressure issue. I have run DBCC MEMORYSTATUS; here is part of the result:

        Memory Manager                        KB
          VM Reserved                         23617160
          VM Committed                        14818444
          Locked Pages Allocated              0
          Reserved Memory                     1024
          Reserved Memory In Use              0

        Memory node Id = 0                    KB
          VM Reserved                         23613512
          VM Committed                        14814908
          Locked Pages Allocated              0
          MultiPage Allocator                 387400
          SinglePage Allocator                3265000

        MEMORYCLERK_SQLBUFFERPOOL (node 0)    KB
          VM Reserved                         16809984
          VM Committed                        14184208
          Locked Pages Allocated              0
          SM Reserved                         0
          SM Committed                        0
          SinglePage Allocator                0
          MultiPage Allocator                 408

        MEMORYCLERK_SQLCLR (node 0)           KB
          VM Reserved                         6311612
          VM Committed                        141616
          Locked Pages Allocated              0
          SM Reserved                         0
          SM Committed                        0
          SinglePage Allocator                1456
          MultiPage Allocator                 20144

        CACHESTORE_SQLCP (node 0)             KB
          VM Reserved                         0
          VM Committed                        0
          Locked Pages Allocated              0
          SM Reserved                         0
          SM Committed                        0
          SinglePage Allocator                3101784
          MultiPage Allocator                 300328

        Buffer Pool                           Value
          Committed                           1742946
          Target                              1742946
          Database                            1333883
          Dirty                               940
          In IO                               1
          Latched                             18
          Free                                89
          Stolen                              408974
          Reserved                            2080
          Visible                             1742946
          Stolen Potential                    1579938
          Limiting Factor                     13
          Last OOM Factor                     0
          Page Life Expectancy                5463

        Process/System Counts                 Value
          Available Physical Memory           258572288
          Available Virtual Memory            8771398631424
          Available Paging File               16030617600
          Working Set                         15225597952
          Percent of Committed Memory in WS   100
          Page Faults                         305556823
          System physical memory high         1
          System physical memory low          0
          Process physical memory low         0
          Process virtual memory low          0

        Procedure Cache                       Value
          TotalProcs                          11382
          TotalPages                          430160
          InUsePages                          28

    Can you help me analyze this result? Is it that a lot of execution plans have been cached, causing the memory issue, or is there another reason?
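
    The striking numbers above are CACHESTORE_SQLCP at about 3 GB of single-page allocations and a procedure cache of 430,160 pages (roughly 3.3 GB) with only 28 pages in use; that points at the plan cache rather than the buffer pool. A follow-up query to see whether it is dominated by single-use plans (a sketch, not a full diagnosis):

        SELECT objtype,
               cacheobjtype,
               COUNT(*)                                     AS plan_count,
               SUM(CAST(size_in_bytes AS bigint)) / 1048576 AS total_mb,
               SUM(CASE WHEN usecounts = 1
                        THEN CAST(size_in_bytes AS bigint) ELSE 0 END) / 1048576 AS single_use_mb
        FROM sys.dm_exec_cached_plans
        GROUP BY objtype, cacheobjtype
        ORDER BY total_mb DESC;

    If most of the space is single-use Adhoc plans, parameterizing the offending queries or enabling the "optimize for ad hoc workloads" setting (sp_configure, SQL Server 2008 and later) usually relieves this kind of pressure.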

    Read the article

  • Emgu CV - memory leaks (memory consumption)

    - by martin pilch
    I am using EmguCV, the OpenCV wrapper for .NET. I am disposing of all created objects, but my app still uses more and more memory (in the Release configuration too). I have profiled the app with .NET Memory Profiler and got this result: http://img532.imageshack.us/img532/2503/screenqv.png - the instance counts of all objects oscillate, but the GCHandle instance count keeps increasing until my machine is unusable. The garbage collector does not seem to release the memory. I am using VS 2008 Professional and Win7 Professional 32-bit, both up to date, and the last stable version of EmguCV. I can post some app code if it will help. Thanks, and sorry for my English. Martin

    Read the article

  • SQL Server Agent job to execute SSIS package fails, package succeeds if run manually

    - by growse
    I've got an SSIS package installed on a SQL Server (SQL Server 2012). It's fairly simple and just fetches data from a remote data source and adds it into a local table. The remote connection string uses SQL Server authentication, while the local connection uses Windows auth. The remote connection password is protected, and the package was imported with the protection level set to "Rely on server storage and roles for access control".

    - If I run the SSIS package manually, it works.
    - If I run it from the command line using dtexec, it works.
    - If I use runas to switch to the domain account that the SQL Server Agent is running under, and then run the package using dtexec, it works.
    - If I create a SQL Agent job with a single step to run the package, it fails, providing very little detail as to what's going on.

    I'm guessing it's not able to get the password to log into the remote SQL Server, because it fails very quickly. Also, if I tick 'log to table' and view the resulting file, I get the following:

        Description: ADO NET Source has failed to acquire the connection {0D8F2CD4-A763-4AEB-8B52-B8FAE0621ED3} with the following error message: "Login failed for user 'username'.".

    If I try to add the password to the connection string manually under Data Sources in the job step dialog, it refuses to save it, always seeming to remove the 'password' part of the connection string. I thought SQL Server Agent jobs always ran under the context of the account which the SQL Server Agent is running under. This account is a sysadmin on the local SQL Server, and the package works using dtexec under that account, so why would it fail when run as an Agent job?

    Read the article

  • SQL to select random mix of rows fairly [migrated]

    - by Matt Sieker
    Here's my problem: I have a set of tables in a database populated with data from a client that contains product information. In addition to the basic product information, there is also information about the manufacturer, the categories those products belong to (a product can be in one or more categories, referred to as "Product Categories"), and which stores the products are available at. These tables are updated once a week from a feed from the customer. Since some of the product categories are the same, or closely related, for our purposes there is another level of categories called "General Categories"; a general category can have one or more product categories. For the scope of these tables, here are some rough numbers:

    Data tables:
    - Products: 475,000
    - Manufacturers: 1,300
    - Stores: 150
    - General Categories: 245
    - Product Categories: 500

    Mapping tables:
    - Product Category -> Product: 655,000
    - Stores -> Products: 50,000,000

    Now, for the actual problem: as part of our software, we need to select n random products, given a store and a general category. However, we also need to ensure a good mix of manufacturers, as in some categories a single manufacturer dominates the results, and selecting rows at random causes the results to strongly favor that manufacturer. The solution currently in place, which works for most cases, involves selecting all of the rows that match the store and category criteria, partitioning them by manufacturer and including each row's number within its partition, then selecting from that where the row number for that manufacturer is less than n, and using ROWCOUNT to clamp the total rows returned to n. The query looks something like this:

        SET ROWCOUNT 6
        select p.Id, GeneralCategory_Id, Product_Id, ISNULL(m.DisplayName, m.Name) AS Vendor,
               MSRP, MemberPrice, FamilyImageName
        from (select p.Id, gc.Id GeneralCategory_Id, p.Id Product_Id, ctp.Store_id, Manufacturer_id,
                     ROW_NUMBER() OVER (PARTITION BY Manufacturer_id ORDER BY NEWID()) AS 'VendorOrder',
                     MSRP, MemberPrice, FamilyImageName
              from GeneralCategory gc
              inner join GeneralCategoriesToProductCategories gctpc ON gc.Id = gctpc.GeneralCategory_Id
              inner join ProductCategoryToProduct pctp on gctpc.ProductCategory_Id = pctp.ProductCategory_Id
              inner join Product p on p.Id = pctp.Product_Id
              inner join StoreToProduct ctp on p.Id = ctp.Product_id
              where gc.Id = @GeneralCategory and ctp.Store_id = @StoreId and p.Active = 1 and p.MemberPrice > 0) p
        inner join Manufacturer m on m.Id = p.Manufacturer_id
        where VendorOrder <= 6
        order by NEWID()
        SET ROWCOUNT 0

    (I've tried to somewhat format it to make it cleaner, but I don't think it really helps.) Running this query with an execution plan shows that for the majority of these tables it's doing a Clustered Index Seek. Two operations take up roughly 90% of the time:

    - Index Seek (Nonclustered) on StoreToProduct: 17%. This table just contains the key of the store and the key of the product. It seems that NHibernate decided not to make a composite key when making this table, but I'm not concerned about this at this point, compared to the other seek...
    - Clustered Index Seek on Product: 69%. I really have no clue how I could make this one more performant.

    On categories without a lot of products, performance is acceptable (<50 ms); however, larger categories can take a few hundred ms, with the largest category (which has about 170k products) taking 3 s.
    It seems I have two ways to go from this point:

    1. Somehow optimize the existing query and table indices to lower the query time. As almost every expensive operation is already a clustered index seek, I don't know what could be done there. The inner query could be tuned to not return all of the possible rows for that category, but I am unsure how to do this and still maintain the requirements (random products, with a good mix of manufacturers).
    2. Denormalize this data for the purpose of this query when doing the once-a-week import. However, I am unsure how to do this and maintain the requirements.

    Does anyone have any input on either of these items?

    Read the article

  • Importing a SQL Server 2005 database into SQL Server 2008 Express

    - by Matthew Kanwisher
    Is there any way to import a database backup from 2005 into 2008 Express Edition? What I've had to resort to is scripting the database and then importing all the data through DTS. Whenever I try to import straight from a backup file, it either says something about not being able to import into a new version of SQL Server, or I get the error below:

        Title: Microsoft SQL Server Management Studio
        Specified cast is not valid. (SqlManagerUI)
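
    For what it's worth, restoring (rather than attaching or using the import wizard on) a 2005 .bak onto a 2008 instance normally works; the database is upgraded during the restore. A minimal T-SQL sketch, run from a query window connected to the Express instance, with placeholder file and logical names (RESTORE FILELISTONLY shows the logical names to use in the MOVE clauses):

        RESTORE FILELISTONLY FROM DISK = N'C:\Backups\MyDb2005.bak';

        RESTORE DATABASE MyDb
        FROM DISK = N'C:\Backups\MyDb2005.bak'
        WITH MOVE N'MyDb_Data' TO N'C:\Data\MyDb.mdf',
             MOVE N'MyDb_Log'  TO N'C:\Data\MyDb_log.ldf',
             REPLACE;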

    Read the article

  • Sync LINQ-to-SQL DBML schema with SQL Server database

    - by Maxim Z.
    After creating a SQL Server 2008 database, I made a Linq-to-SQL schema in Visual Studio. Next, in the .dbml visual editor (in Visual Studio), I added PK-to-FK and PK-to-PK associations to the schema. How do I copy those associations that I created in Visual Studio over to the database? In other words, how do I sync with the DB?

    Read the article
