Search Results

Search found 47492 results on 1900 pages for 'sql management studio'.

  • SQL 2008 R2 Named Instance Client Connectivity Issues?

    - by Jerry Dodge
    We're upgrading our software from SQL 2000 to 2008 R2. Our customers will be installing an update which uninstalls 2000 and installs 2008 R2 under the same instance. So if no named instance existed, no instance name will be set (default). However, the problem starts with the customers who have a named SQL instance. Starting in 2008 R2 (not sure about earlier versions), for some reason, a client connecting to the server by its instance name is unsuccessful. I'm testing from Management Studio - if I can't connect with this, then nothing can connect. I browse network servers and find the specific server\instance in the list. But upon trying to connect to an instance name like MyServer\INST, I get: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified) (Microsoft SQL Server, Error: -1) I do in fact have the TCP/IP and Named Pipes protocols enabled; this was the first thing I did. When I connect to the server using a comma and a port number, like MyServer,49195, it works just fine. So it appears that client computers are simply unable to resolve the instance names. This has happened on all our installations of SQL 2008 R2 and from all client computers, including Win 7, XP, Vista, Server 2008, and Server 2003. We never experienced such issues on earlier versions of SQL. The problem even persists if the firewalls and antivirus software are all disabled. Now, this is a large update which we will be distributing soon to all our customers, and we want to minimize the interaction they need with us to get it installed. We absolutely hate the idea of using a port number, because it will always be different, and we would have to modify each client to point to this server/port. Some of our customers may have hundreds of client computers. How do I make client connections to a named SQL instance work again? After all, this is the whole purpose of named instances, and if a client can't connect to an instance by its name, then what is it even named for? EDIT: It was mentioned to make sure SQL Browser is running, so I checked, and it is running. The server is also able to connect to itself (locally) - just external connections are refused. UPDATE: After more careful checking, I learned the firewall wasn't completely disabled during testing; upon disabling it completely, this works. So it appears the firewall was blocking external clients from reaching the SQL Browser service.
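
    For anyone hitting the same wall: instance-name resolution is answered by the SQL Browser service on UDP port 1434, while the named instance itself listens on a dynamic TCP port, so the firewall has to allow both. A minimal T-SQL check (run on the server, assuming permission to view server state) to find the dynamic port the instance is actually using:

      -- List the TCP port(s) the current instance is listening on, so a
      -- firewall exception can be scoped to them alongside UDP 1434 for
      -- the SQL Browser service.
      SELECT DISTINCT local_net_address, local_tcp_port
      FROM sys.dm_exec_connections
      WHERE local_tcp_port IS NOT NULL;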

  • How to configure Visual Studio 2010 code coverage for ASP.NET MVC unit tests

    - by DigiMortal
    I just got Visual Studio 2010 code coverage to work with ASP.NET MVC application unit tests. Everything is simple after you have spent some time with forums, blogs and Google. To save your valuable time I wrote this posting to guide you through the process of making code coverage work with ASP.NET MVC application unit tests. After some fighting with Visual Studio I got everything to work as expected. I am still not very sure why users must deal with this mess, but okay - I survived it. Before you start configuring Visual Studio I expect your solution to meet the following needs: there is at least one library that will be tested, there is at least one library that contains the tests to be run, there are some classes and some tests for them, and, of course, you are using a version of Visual Studio 2010 that supports tests (I have Visual Studio 2010 Ultimate). Now open the Visual Studio 2010 Test Settings window and follow the steps given below. Double-click Local.testsettings under Solution Items; the Test Settings window will open. Select "Data and Diagnostics" from the left pane. Mark the checkboxes "ASP.NET Profiler" and "Code Coverage". Move the cursor to the "Code Coverage" line and press the Configure button, or double-click the line. The assemblies selection window will open. Mark the checkboxes in front of the assemblies you want code coverage reports for, and apply the settings. Save your project and close Visual Studio. Run Visual Studio as Administrator and run the tests. NB! Select Test => Run => Tests in Current Context from the menu. When the tests have run, you can open the code coverage results by selecting Test => Windows => Code Coverage Results from the menu. In my Visual Studio 2010 Test Results window, all my tests passed this time. :) And in my Code Coverage Results window, I clearly need a lot more tests. As you can see, everything was pretty simple, but it took me some time to figure out how to get everything to work as expected. Problems? You may face some problems when making code coverage work. Here is my short list of possible problems. Make sure you have all assemblies available for code coverage. In some cases more libraries need to be referenced than you currently have; for example, I had to add some more Enterprise Library assemblies to my project. You can use Event Viewer to discover errors that were raised during testing. Make sure you selected all testable assemblies in the Code Coverage settings as shown above, otherwise you may get empty results. Tests with code coverage are slower because they need the ASP.NET profiler. If your machine slows down, try to free more resources.

  • What is the best way to bookmark positions in code in Visual Studio 2008/2010?

    - by Edward Tanguay
    I find myself going to about five or six main places in my code 80% of the time and would like a way to go to them fast, even if all files are closed. I would like to be able to open a solution in Visual Studio and, with no file open, see a list of self-labeled bookmarks like this: LoadNext Settings page refresh app.config connections app settings stringhelpers top stringhelpers bottom. I click one of these and it opens that file and jumps to that position. How can I best make bookmarks like this in Visual Studio 2008/2010?

  • How to add WCF templates to Visual Studio Express?

    - by Mike Kantor
    I am working through the book Learning WCF by Michele Bustamante, and trying to do it using Visual Studio C# Express 2008. The instructions say to use WCF project and item templates, which are not included with VS C# Express. There are templates for these types included with Visual Studio Web Developer Express, and I've tried to copy them over into the right directories for VS C# Express to find, but the IDE doesn't find them. Is there some registration process? Or config file somewhere?

  • SQL Server unique constraint problem

    - by b0x0rz
    How do I create a unique constraint on a varchar(max) field in Visual Studio, visually? The problem is when I try it (Manage Indexes and Keys > Add Columns), I can only choose the bigint columns, not any of the varchar(max) ones. Do I maybe have to use check constraints? If yes, what do I put in the expression? Thanks for the info
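
    One workaround, sketched for a hypothetical table dbo.Documents with a varchar(max) column Body: index keys are limited to 900 bytes, which is why the designer refuses varchar(max) columns, but a persisted hash of the value can be indexed instead. Note that before SQL Server 2016, HASHBYTES input is capped at 8000 bytes.

      -- Enforce uniqueness of a varchar(max) column through a hashed,
      -- persisted computed column; a unique index directly on
      -- varchar(max) is not allowed.
      ALTER TABLE dbo.Documents
          ADD BodyHash AS CAST(HASHBYTES('SHA1', Body) AS varbinary(20)) PERSISTED;

      CREATE UNIQUE INDEX UX_Documents_BodyHash
          ON dbo.Documents (BodyHash);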

  • The free version of Visual Studio 2010 is available; Visual Studio 2010 Express introduces development tools for Windows Phone 7

    Visual Studio 2010 Express is available. The free version of Visual Studio 2010 now offers development tools for Windows Phone 7. While Visual Studio 2010 arrives with its new features (see also here), its free version, Visual Studio 2010 Express, has just been made available to developers on Microsoft's official site. This version admittedly offers limited functionality compared to the full version (it is intended...

  • Custom forms in SharePoint with MS SQL Server as backend. Is it possible?

    - by Kaan
    We're evaluating using SharePoint 2010 as our project management tool. Specifically, the system needs to satisfy the following: Discussion groups Project management (simple issue tracking, no complex workflows or vcs integrations) News feed for the project(s) File sharing based on authorization/user-roles Custom homepage Custom forms using MS SQL Server as a backend and contents of old forms searchable from the user interface. Now, I think [1-5] is possible using SharePoint (Comments are always welcome :)). I'm not sure about [6]. Is it possible? For instance, can an admin or a user of the SharePoint portal, create a custom form (without any programming) that uses MS SQL Server as a backend and publish it to the portal so that other users can also perform data entry? If it can be done (be it with or without some programming), can users perform text search on form data using the SharePoint interface?

  • SOA Governance Starts with People and Processes

    - by Jyothi Swaroop
    While we all agree that SOA governance is about people, processes and technology, some experts are of the opinion that SOA governance begins with people and processes but needs to be empowered with technology to achieve the best results. Here's an interesting piece from David Linthicum on eBizq: In the world of SOA, the concept of SOA governance is getting a lot of attention. However, how SOA governance is defined and implemented really depends on the SOA governance vendor who just left the building within most enterprises. Indeed, confusion is a huge issue when considering SOA governance, and the core issues are more about the fundamentals of people and processes, and not about the technology. SOA governance is a concept used for activities related to exercising control over services in an SOA, including tracking the services, monitoring the services, and controlling changes made to the services, simply put. The trouble comes in when SOA governance vendors attempt to define SOA governance around their technology, all with different approaches to SOA governance. Thus, it's important that those building SOAs within the enterprise take a step back and understand what is really needed to support the concept of SOA governance. The value of SOA governance is pretty simple. Since services make up the foundation of an SOA, and are at their essence the behavior and information from existing systems externalized, it's critical to make sure that those accessing, creating, and changing services do so using a well-controlled and orderly mechanism. Those of you who already have governance in place, typically around enterprise architecture efforts, will be happy to know that SOA governance does not replace those processes, but becomes a mechanism within the larger enterprise governance concept. People and processes are the first things on the list to get under control before you begin to toss technology at this problem. This means establishing an understanding of SOA governance among the team members, including why it's important, who's involved, and the core processes that are to be followed to make SOA governance work. Indeed, the core SOA governance strategy should really be created independently of the technology. The technology will change over the years, but the core processes and discipline should be relatively durable over time.

  • Implementing Database Settings Using Policy Based Management

    - by Ashish Kumar Mehta
    Introduction: Database administrators have always had a tough time ensuring that all the SQL Servers they administer are configured according to the policies and standards of the organization. Using SQL Server's Policy Based Management feature, DBAs can now manage one or more instances of SQL Server 2008 and check for policy compliance issues. In this article we will utilize the Policy Based Management (aka Declarative Management Framework, or DMF) feature of SQL Server to implement and verify database settings on all production databases. It is best practice to enforce the settings below on each production database. However, it can be tedious to go through each database and check whether these settings are implemented. In this article I will explain how to utilize the Policy Based Management feature of SQL Server 2008 to create a policy that verifies these settings on all databases and, in cases of non-compliance, how to bring them back into compliance. Database settings to enforce on each user database: Auto Close and Auto Shrink properties of the database set to False; Auto Create Statistics and Auto Update Statistics set to True; Compatibility Level of all user databases set to 100; Page Verify set to CHECKSUM; Recovery Model of all user databases set to Full; Restrict Access set to MULTI_USER. Configure a Policy to Verify Database Settings: 1. Connect to a SQL Server 2008 instance using SQL Server Management Studio. 2. In the Object Explorer, click Management > Policy Management and you will see Policies, Conditions & Facets as child nodes. 3. Right-click Policies and then select New Policy… from the drop-down list to open the Create New Policy popup window. 4. In the Create New Policy popup window, provide the name of the policy as "Implementing and Verify Database Settings for Production Databases" and then click the drop-down list under Check Condition. Click the New Condition… option to open the Create New Condition window. 5. In the Create New Condition popup window, provide the name of the condition as "Verify and Change Database Settings". In the Facet drop-down list, choose the Database Options facet. Under Expression, select the Field value @AutoClose, choose the Operator '=', and finally choose the Value False. Now that you have successfully added the first field, you can go ahead and add the rest of the fields in the same way. Once you have successfully added all the fields of the Database Options facet, click OK to save the changes and return to the parent Create New Policy window, where you will see that the newly created condition "Verify and Change Database Settings" is selected by default. Continues…
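
    For reference, the settings the policy verifies map directly onto ALTER DATABASE options. A sketch against a hypothetical database named ProdDB, in case a single non-compliant database needs to be fixed by hand:

      -- Apply the same settings the policy checks.
      ALTER DATABASE ProdDB SET AUTO_CLOSE OFF;
      ALTER DATABASE ProdDB SET AUTO_SHRINK OFF;
      ALTER DATABASE ProdDB SET AUTO_CREATE_STATISTICS ON;
      ALTER DATABASE ProdDB SET AUTO_UPDATE_STATISTICS ON;
      ALTER DATABASE ProdDB SET COMPATIBILITY_LEVEL = 100;  -- SQL Server 2008
      ALTER DATABASE ProdDB SET PAGE_VERIFY CHECKSUM;
      ALTER DATABASE ProdDB SET RECOVERY FULL;
      ALTER DATABASE ProdDB SET MULTI_USER;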

  • Visual Studio Pro extensions with Express editions

    - by espais
    I am curious if it is somehow possible to get an extension written for Visual Studio 2010 to work with Visual Studio 2010 Express. My problem is that I've upgraded to 2010 Express, but my company is not ready to buy the full version yet. There is an extension I would like to use, but unfortunately I cannot import it as it was built for the standard edition. Is there any way to hack it in somehow?

  • SQL Server Agent job to execute SSIS package fails, package succeeds if run manually

    - by growse
    I've got an SSIS package installed on a SQL Server instance (SQL Server 2012). It's fairly simple and just fetches data from a remote data source and adds it into a local table. The remote connection string is using SQL Server authentication, while the local connection is using Windows auth. The remote connection password is protected, and the package was imported setting the protection level to Rely on server storage and roles for access control. If I run the SSIS package manually, it works. If I run it from the command line using dtexec, it works. If I use runas to switch to the domain account that the SQL Server Agent is running under, and then run the package using dtexec, it works. If I create a SQL Agent job with a single step to run the package, it fails, providing very little detail as to what's going on. I'm guessing it's not able to get the password to log into the remote SQL Server, because it fails very quickly. Also, if I tick 'log to table' and view the resulting file, I get the following: Description: ADO NET Source has failed to acquire the connection {0D8F2CD4-A763-4AEB-8B52-B8FAE0621ED3} with the following error message: "Login failed for user 'username'.". If I try to add the password to the connection string manually under data sources in the job step dialog, it refuses to save it, always seeming to remove the 'password' part of the connection string. I thought that SQL Server Agent jobs always ran under the context of the account the SQL Server Agent is running under. This account is a sysadmin on the local SQL Server, and the package works using dtexec under that account, so why would it fail when run as an agent job?
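
    One arrangement often suggested for this situation, sketched here with hypothetical names: instead of relying on the Agent service account, run the package step under a proxy backed by a stored credential, so the step executes with a known identity.

      -- Create a credential and an SSIS proxy for the job step to run under.
      CREATE CREDENTIAL SsisRunner
          WITH IDENTITY = N'DOMAIN\SsisRunner', SECRET = N'StrongPassword!';

      EXEC msdb.dbo.sp_add_proxy
          @proxy_name = N'SsisProxy',
          @credential_name = N'SsisRunner',
          @enabled = 1;

      EXEC msdb.dbo.sp_grant_proxy_to_subsystem
          @proxy_name = N'SsisProxy',
          @subsystem_id = 11;  -- 11 = the SSIS package execution subsystem

    The proxy then appears in the job step's "Run as" list.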

  • SQL to select random mix of rows fairly [migrated]

    - by Matt Sieker
    Here's my problem: I have a set of tables in a database populated with data from a client that contains product information. In addition to the basic product information, there is also information about the manufacturer and categories for those products (a product can be in one or more categories); these categories are referred to as "Product Categories". The data also records which stores these products are available at. These tables are updated once a week from a feed from the customer. Since some of the product categories are the same or closely related for our purposes, there is another level of categories called "General Categories"; a general category can have one or more product categories. For the scope of these tables, here are some rough numbers: Data Tables: Products: 475,000; Manufacturers: 1300; Stores: 150; General Categories: 245; Product Categories: 500. Mapping Tables: Product Category -> Product: 655,000; Stores -> Products: 50,000,000. Now, for the actual problem: As part of our software, we need to select n random products, given a store and a general category. However, we also need to ensure a good mix of manufacturers, as in some categories a single manufacturer dominates, and selecting rows at random causes the results to strongly favor that manufacturer. The solution currently in place, which works for most cases, involves selecting all of the rows that match the store and category criteria, partitioning them on manufacturer and including each row's number within its partition, then selecting from that where the row number for that manufacturer is less than n, and using ROWCOUNT to clamp the total rows returned to n. The query looks something like this: SET ROWCOUNT 6 select p.Id, GeneralCategory_Id, Product_Id, ISNULL(m.DisplayName, m.Name) AS Vendor, MSRP, MemberPrice, FamilyImageName from (select p.Id, gc.Id GeneralCategory_Id, p.Id Product_Id, ctp.Store_id, Manufacturer_id, ROW_NUMBER() OVER (PARTITION BY Manufacturer_id ORDER BY NEWID()) AS 'VendorOrder', MSRP, MemberPrice, FamilyImageName from GeneralCategory gc inner join GeneralCategoriesToProductCategories gctpc ON gc.Id=gctpc.GeneralCategory_Id inner join ProductCategoryToProduct pctp on gctpc.ProductCategory_Id = pctp.ProductCategory_Id inner join Product p on p.Id = pctp.Product_Id inner join StoreToProduct ctp on p.Id = ctp.Product_id where gc.Id = @GeneralCategory and ctp.Store_id=@StoreId and p.Active=1 and p.MemberPrice >0) p inner join Manufacturer m on m.Id = p.Manufacturer_id where VendorOrder <=6 order by NEWID() SET ROWCOUNT 0 (I've tried to somewhat format it to make it cleaner, but I don't think it really helps.) Running this query with an execution plan shows that for the majority of these tables it's doing a Clustered Index Seek. Two operations take up roughly 90% of the time: Index Seek (Nonclustered) on StoreToProduct: 17%. This table just contains the key of the store and the key of the product. It seems that NHibernate decided not to make a composite key when creating this table, but I'm not concerned about that at this point, compared to the other seek... Clustered Index Seek on Product: 69%. I really have no clue how I could make this one more performant. On categories without a lot of products, performance is acceptable (<50ms); however, larger categories can take a few hundred ms, with the largest category taking 3s (it has about 170k products).
    It seems I have two ways to go from this point: Somehow optimize the existing query and table indices to lower the query time. As almost every expensive operation is already a clustered index seek, I don't know what could be done there. The inner query could be tuned to not return all of the possible rows for that category, but I am unsure how to do this while maintaining the requirements (random products, with a good mix of manufacturers). Or, denormalize this data for the purposes of this query during the once-a-week import. However, I am unsure how to do this and maintain the requirements. Does anyone have any input on either of these items?

  • Importing a SQL Server 2005 database into SQL Server 2008 Express

    - by Matthew Kanwisher
    Is there any way to import a database backup from 2005 into the 2008 Express edition? What I've had to resort to is scripting the database, then importing all the data through DTS. Whenever I try to import straight from a backup file, it says something about not being able to import into a newer version of SQL Server, or I'll get the error below. Title: Microsoft SQL Server Management Studio - Specified cast is not valid. (SqlManagerUI)
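
    Restoring a 2005 backup onto a 2008 instance is supported (the database is upgraded in place); it's the reverse direction that's blocked, and the "Specified cast is not valid" message tends to come from the SSMS GUI rather than the engine. A plain T-SQL restore is worth trying, shown here with hypothetical names and paths:

      -- Discover the logical file names inside the backup first.
      RESTORE FILELISTONLY FROM DISK = N'C:\Backups\MyDb.bak';

      -- Then restore, relocating the files for the new instance.
      RESTORE DATABASE MyDb
      FROM DISK = N'C:\Backups\MyDb.bak'
      WITH MOVE N'MyDb'     TO N'C:\Data\MyDb.mdf',
           MOVE N'MyDb_log' TO N'C:\Data\MyDb_log.ldf';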

  • 'The default schema does not exist' on deploy of SQL CLR assembly onto SQL Server 2008

    - by abatishchev
    I'm deploying an example SQL CLR stored procedure which takes a SQL CLR type as a parameter, using Visual Studio 2008 and the menu Project -> Deploy. public partial class StoredProcedures { [Microsoft.SqlServer.Server.SqlProcedure] public static void TakeTariff(TariffInfo tariffInfo) { } } public class TariffInfo { public SqlDecimal Amount { get; private set; } } But I get this strange error: The default schema does not exist. How can I fix that? My user was created this way: CREATE USER myUser FOR LOGIN myLogin_mod WITH DEFAULT_SCHEMA = mySchema
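
    The message suggests the schema named in WITH DEFAULT_SCHEMA was never actually created, so deployment under that user cannot resolve it. A minimal sketch of the usual fix, reusing the names from the question:

      -- Create the missing schema so myUser's default schema exists.
      CREATE SCHEMA mySchema AUTHORIZATION myUser;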

  • SQL Agent Command Line Not Saved

    - by Greg
    I have an SSIS package I am trying to schedule. I create a new job under SQL Server Agent. On the Command line tab of the job step, I choose "Edit the command-line manually". The changes are retained as I switch from tab to tab within the job step, but whenever I exit and save the job, the changes are lost. Any ideas what's going on? I'm on SQL Server 2008.
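
    If the dialog keeps discarding the edit, one possible workaround (a sketch with a hypothetical job name and dtexec arguments) is to set the step's command line directly through msdb instead of the dialog:

      -- Update the job step's command line without going through the UI.
      EXEC msdb.dbo.sp_update_jobstep
          @job_name = N'Nightly SSIS Load',
          @step_id  = 1,
          @command  = N'/FILE "C:\Packages\Load.dtsx" /CHECKPOINTING OFF';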

  • Default Value or Binding in "Transfer SQL Server Object Task"

    - by Kronass
    Hi, I want to move 500 tables from one database to another with their data and constraints; all the tables have columns with default values. Using SSIS's "Transfer SQL Server Object Task", I chose to copy all tables, copy data, and copy primary keys. It copies the tables, except for the default bindings. I tried the CopyAllDRIObjects property in SQL Server 2008, but got the same result. How can I copy all tables from one database to another with their data while maintaining their constraints?
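
    As a stopgap while the task drops them, the default constraints can be read from the source database's catalog views and re-scripted by hand on the destination. A helper query along these lines:

      -- List every default constraint with its definition so the
      -- defaults can be re-applied on the destination database.
      SELECT t.name  AS TableName,
             c.name  AS ColumnName,
             dc.name AS ConstraintName,
             dc.definition
      FROM sys.default_constraints AS dc
      JOIN sys.tables  AS t ON dc.parent_object_id = t.object_id
      JOIN sys.columns AS c ON dc.parent_object_id = c.object_id
                           AND dc.parent_column_id = c.column_id;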

  • How To Automatically Script SQL Server: 'Generate Scripts' for SQL Database

    - by skimania
    I want to run scheduled nightly exports of my database code into my SVN source. It's easy to schedule automated check-ins into SVN from a folder, but the export from SQL Server Management Studio is manual: right-click the target database, choose Tasks > Generate Scripts, follow the wizard, and presto, you've got scripts in a folder. Is it possible to extract a single script that the wizard generates and stuff it into a stored proc which I can run nightly? Ideas?

  • SQL Server: import operation won't copy full schema

    - by P a u l
    I recall that the import tool in SQL Server 2000 would copy indexes, relationships, etc. In SQL Server 2005/2008, the import tool in SSMS will only create the tables and copy the data; the keys, indexes, and relationships are missing. I can find no option in the import wizard to enable this. What am I missing here? Is this no longer possible, for any good reason?

  • Update SQL table from values in Excel

    - by user175084
    I am using SQL Developer or SQL Express. How do I get the values from an Excel sheet and update a column of my database with them? Please help, thanks. I have this, and when I run it I get an error: SELECT * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0', 'Excel 8.0;Database=C:\books.xls', 'SELECT * FROM [Sheet1$]') The error is: OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "(null)" returned message "Could not find installable ISAM.". Thanks
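
    The query shape itself is fine; "Could not find installable ISAM" usually points at the Jet provider or the connection-string syntax rather than the SQL. A sketch under assumptions (32-bit Jet provider installed, ad hoc distributed queries allowed, first row of the sheet holds headers):

      -- Enable ad hoc OPENROWSET/OPENDATASOURCE queries (off by default).
      EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
      EXEC sp_configure 'Ad Hoc Distributed Queries', 1; RECONFIGURE;

      -- HDR=YES treats the first row of Sheet1 as column headers.
      SELECT *
      FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                      'Excel 8.0;Database=C:\books.xls;HDR=YES',
                      'SELECT * FROM [Sheet1$]');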

  • How to convert a SQL query to LINQ to SQL

    - by name1ess0ne
    I have this SQL query: SELECT News.NewsId, News.Subject, Cm.Cnt FROM dbo.News LEFT JOIN (SELECT Comments.OwnerId, COUNT(Comments.OwnerId) as Cnt FROM Comments WHERE Comments.CommentType = 'News' Group By Comments.OwnerId) Cm ON Cm.OwnerId = News.NewsId But I want a LINQ to SQL query. How can I convert this to LINQ?
