Search Results

Search found 42321 results on 1693 pages for 'sql reporting services 05'.


  • How do I best display CheckBoxes in SQL Server Reporting Services?

    - by Yadyn
    One of the many quirks of Reporting Services we've run across is the complete and utter lack of a CheckBox control, or even anything remotely similar. We have a form that should appear automatically filled out based on information pulled from a database, and we have several bit-datatype fields. Printing out "True" or "False" just looks silly; this is supposed to look like a form that has been auto-filled, so we want a series of checkboxes and labels that are either checked or unchecked. We are running SSRS 2005, but I'm not aware of SSRS 2008 having added a CheckBox control; even if it did, we'd need an alternative for the time being. The best approaches we've found so far are: (1) use Wingdings, (2) use images, or (3) use text boxes with borders and print a blank/space or a capital X. All three approaches require IIF expression shenanigans. The Wingdings approach seemed to work acceptably and was the most aesthetically pleasing, except that for whatever reason it didn't always print correctly. More importantly, PDF exports (also for whatever reason) converted all fonts, generally, to Arial, so we got funky letters instead of the Wingdings dingbats. Images, being pixel-based rasters, don't do so well when printed alongside vector content like text; unless handled carefully, they tend to stretch, pixelate, and do other unprofessional-looking things. While these methods do work (some with the limitations mentioned above), none of them is particularly elegant. Are we missing something obvious? Not so obvious? Does someone at Microsoft have a good reason why such a control was not provided in SSRS 2000, let alone two versions and eight years later? This can't be the first time this issue has come up...

    Read the article

  • Are SQL Reporting Services Report Parameters deprecated in VS.NET 2010?

    - by Jason Kealey
    We use Reporting Services inside an ASP.NET web application (we have an *.rdlc which is presented to the ReportViewer web control in our page). Our ASPX page wires up a few report parameters in code: var parameters = new List<ReportParameter>(); parameters.Add(new ReportParameter("StoreAddress", InvoiceStoreAddress)); parameters.Add(new ReportParameter("LogoURL", InvoiceLogoURL)); parameters.Add(new ReportParameter("StoreName", InvoiceStoreName)); ReportViewer1.LocalReport.SetParameters(parameters); These are just general parameters that are passed to the report rather than being hooked up to a data source. Recently we upgraded to VS.NET 2010, upgraded the *.rdlc to the newest version, and also upgraded the ReportViewer control used by ASP.NET. Everything works as it did before. However, I now want to add a new report parameter to my *.rdlc. I used to right-click on the top left corner and click "Report Parameters" to add it. With the new VS.NET I cannot find this option anywhere - it is not even in the report properties. Where did it go? Are they deprecating this feature? How should I be passing general parameters now?

    Read the article

  • Run Reporting Service in local mode and generate columns automatically?

    - by grady
    Hi, I have a SQL query which I want to use with MS Reporting Services in my ASP.NET application. I created a report in local mode (rdlc) and attached it to a ReportViewer. Since my query uses parameters, I created a stored procedure that takes exactly those parameters. In addition, I have some textboxes used for entering the parameters for the query and a button that calls the stored procedure and fills the dataset bound to the ReportViewer. This works: I press the button and, according to what I entered, the correct data is shown. Now my question: in the future I plan to have multiple reports (selected from a dropdown), and I wonder if I can somehow just call the correct stored procedure and have the report show whichever columns are SELECTed in that procedure. Example: I select report1 from the dropdown (the procedure for report 1 is called) and 5 columns are shown in the ReportViewer; I select report2 from the dropdown (the procedure for report 2 is called) and 8 columns are shown. Is that possible somehow? Thanks :-)
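
    For reference, a minimal sketch of the single-report setup described above, assuming hypothetical procedure, report, and control names (the data source name must match the DataSet defined inside the .rdlc):

        // Hedged sketch: fill a DataTable from a stored procedure and bind it to a
        // local-mode ReportViewer. Procedure, report path, and control names are assumptions.
        using System;
        using System.Data;
        using System.Data.SqlClient;
        using Microsoft.Reporting.WebForms;

        public partial class ReportPage : System.Web.UI.Page
        {
            // ReportViewer1 and FromTextBox come from the page markup / designer file.
            protected void RunReport_Click(object sender, EventArgs e)
            {
                var table = new DataTable();
                using (var conn = new SqlConnection("...connection string..."))
                using (var cmd = new SqlCommand("dbo.usp_Report1", conn))
                {
                    cmd.CommandType = CommandType.StoredProcedure;
                    cmd.Parameters.AddWithValue("@From", FromTextBox.Text);   // parameter taken from a textbox
                    new SqlDataAdapter(cmd).Fill(table);
                }

                ReportViewer1.ProcessingMode = ProcessingMode.Local;
                ReportViewer1.LocalReport.ReportPath = "Reports/Report1.rdlc";
                ReportViewer1.LocalReport.DataSources.Clear();
                // "DataSet1" must match the DataSet name declared in the .rdlc.
                ReportViewer1.LocalReport.DataSources.Add(new ReportDataSource("DataSet1", table));
                ReportViewer1.LocalReport.Refresh();
            }
        }

    Whether the rdlc's columns can adapt automatically to whatever each procedure SELECTs is a separate matter; the sketch only covers the fixed-column case the question starts from.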

    Read the article

  • Columnstore Case Study #2: Columnstore faster than SSAS Cube at DevCon Security

    - by aspiringgeek
    Preamble
    This is the second in a series of posts documenting big wins encountered using columnstore indexes in SQL Server 2012 & 2014. Many of these can be found in my big deck along with details such as internals, best practices, caveats, etc. The purpose of sharing the case studies in this context is to provide an easy-to-consume quick-reference alternative. See also Columnstore Case Study #1: MSIT SONAR Aggregations.

    Why Columnstore?
    As stated previously, if we're looking for a subset of columns from one or a few rows, given the right indexes, SQL Server can do a superlative job of providing an answer. If we're asking a question which by design needs to hit lots of rows—DW, reporting, aggregations, grouping, scans, etc., SQL Server has never had a good mechanism—until columnstore. Columnstore indexes were introduced in SQL Server 2012; however, they're still largely unknown. Some adoption blockers existed, yet columnstore was nonetheless a game changer for many apps. In SQL Server 2014, potential blockers have been largely removed & they're going to profoundly change the way we interact with our data. The purpose of this series is to share the performance benefits of columnstore & to document why columnstore is a compelling reason to upgrade to SQL Server 2014.

    The Customer
    DevCon Security provides home & business security services & has been in business for 135 years. I met DevCon personnel while speaking to the Utah County SQL User Group on 20 February 2012. (Thanks to TJ Belt (b|@tjaybelt) & Ben Miller (b|@DBADuck) for the invitation, which serendipitously coincided with the height of ski season.)

    The App: DevCon Security Reporting: Optimized & Ad Hoc Queries
    DevCon users interrogate a SQL Server 2012 Analysis Services cube via SSRS. In addition, the SQL Server 2012 relational back end is the target of ad hoc queries; this DW back end is refreshed nightly during a brief maintenance window via conventional table partition switching.

    SSRS, SSAS, & MDX
    Conventional relational structures were unable to provide adequate performance for user interaction for the SSRS reports. An SSAS solution was implemented, requiring personnel to ramp up technically, including learning enough MDX to satisfy requirements.

    Ad Hoc Queries
    Even though the fact table is relatively small—only 22 million rows & 33GB—the table was a typical DW table in terms of its width: 137 columns, any of which could be the target of ad hoc interrogation. As is common in DW reporting scenarios such as this, it is often nearly impossible to optimize for such queries using conventional indexing. DevCon DBAs & developers attended PASS 2012 & were introduced to the marvels of columnstore in a session presented by Klaus Aschenbrenner (b|@Aschenbrenner).

    The Details
    Classic vs. columnstore before-&-after metrics are impressive.

    Scenario         Conventional Structures             Columnstore     Improvement
    SSRS via SSAS    10 - 12 seconds                     1 second        >10x
    Ad Hoc           5 - 7 minutes (300 - 420 seconds)   1 - 2 seconds   >100x

    Here are two charts characterizing this data graphically. The first is a linear representation of Report Duration (in seconds) for Conventional Structures vs. Columnstore Indexes. As is so often the case when we chart such significant deltas, the linear scale doesn't expose some of the dramatically improved values corresponding to the columnstore metrics. Just to make it fair, here's the same data represented logarithmically; yet even here the values corresponding to 1 - 2 seconds aren't visible.

    The Wins
    Performance: Even prior to columnstore implementation, at 10 - 12 seconds, canned report performance against the SSAS cube was tolerable. Yet the 1-second performance afterward is clearly better. As significant as that is, imagine the user experience re: ad hoc interrogation. The difference between several minutes vs. one or two seconds is a game changer, literally changing the way users interact with their data—no mental context switching, no wondering when the results will appear, no preoccupation with the spinning mind-numbing hurry-up-&-wait indicators. As we've commonly found elsewhere, columnstore indexes here provided performance improvements of one, two, or more orders of magnitude.
    Simplified Infrastructure: Because in this case a nonclustered columnstore index on a conventional DW table was faster than an Analysis Services cube, the entire SSAS infrastructure was rendered superfluous & was retired.
    PASS Rocks: Once again, the value of attending PASS is proven out. The trip to Charlotte combined with eager & enquiring minds led directly to this success story. Find out more about the next PASS Summit here, hosted this year in Seattle on November 4 - 7, 2014.
    DevCon BI Team Lead Nathan Allan provided this unsolicited feedback: "What we found was pretty awesome. It has been a game changer for us in terms of the flexibility we can offer people that would like to get to the data in different ways."

    Summary
    For DW, reports, & other BI workloads, columnstore often provides significant performance enhancements relative to conventional indexing. I have documented here, in the second of a series of reports on columnstore implementations, results from DevCon Security, a live customer production app for which performance increased by factors of 10x to 100x for all report queries: canned report times dropped from 10 - 12 seconds to 1 second, and ad hoc query times from 5 - 7 minutes to 1 - 2 seconds. As a result of columnstore performance, the customer retired their SSAS infrastructure. I invite you to consider leveraging columnstore in your own environment. Let me know if you have any questions.

    Read the article

  • SQL Triggers and when (or when not) to use them

    - by John Mitchell
    When I was originally learning about SQL I was always told: only use triggers if you really need to, and opt for stored procedures instead if possible. Unfortunately, at the time (a good few years ago) I wasn't as curious and caring about fundamentals as I am now, so I never did ask the reason why. What's the community's opinion on this? Is it just someone's personal preference, or should triggers be avoided (just like cursors) unless there is a good reason for them?

    Read the article

  • Create SQL Server Analysis Services Partitions using AMO

    When you have SSAS cubes with millions of rows of data, it is very helpful to create partitions. If you have a few cubes you could probably do this manually, but if there are many or if you want to automate this process you should look for smarter solutions such as programming the creation of partitions dynamically.
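
    As a rough illustration of the programmatic route, here is a hedged AMO sketch; the server, database, cube, measure group, and source query below are placeholders, not taken from the article:

        // Hedged sketch only: add and process one partition on an existing measure group
        // using AMO (Microsoft.AnalysisServices). All object names are assumptions.
        using Microsoft.AnalysisServices;

        class PartitionBuilder
        {
            static void Main()
            {
                var srv = new Server();
                srv.Connect("Data Source=localhost");                      // assumed SSAS instance

                Database db = srv.Databases.FindByName("SalesDW");         // hypothetical names
                Cube cube = db.Cubes.FindByName("Sales");
                MeasureGroup mg = cube.MeasureGroups.FindByName("Fact Sales");

                // One partition per month, bound to a filtered query against the fact table.
                Partition p = mg.Partitions.Add("Fact Sales 2014-05");
                p.StorageMode = StorageMode.Molap;
                p.Source = new QueryBinding(db.DataSources[0].ID,
                    "SELECT * FROM dbo.FactSales WHERE DateKey BETWEEN 20140501 AND 20140531");

                p.Update(UpdateOptions.ExpandFull);   // push the new partition definition to the server
                p.Process(ProcessType.ProcessFull);   // then load its data
                srv.Disconnect();
            }
        }

    In a real automation job the partition name and the WHERE clause would typically be derived from a date range in a loop, skipping partitions that already exist.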

    Read the article

  • SQL Server 2012 Integration Services - Package and Project Configurations

    Marcin Policht examines SSIS 2012 package and project configurations, which offer different ways of modifying values of variables and parameters without having to directly edit content of the packages and projects of which they are a part.

    Read the article

  • Using Microsoft.Reporting.WebForms.ReportViewer in a custom SharePoint WebPart

    - by iHeartDucks
    I have a requirement where I have to display some data (from a custom DB) and let the user export it to an Excel file. I decided to use the ReportViewer control in Microsoft.Reporting.WebForms. I added the required assembly to the project references and added the following code to test it out: protected override void CreateChildControls() { base.CreateChildControls(); objRV = new Microsoft.Reporting.WebForms.ReportViewer(); objRV.ID = "objRV"; this.Controls.Add(objRV); } The first error asked me to add a line to the web.config, which I did, and the next error says: The type 'Microsoft.SharePoint.Portal.Analytics.UI.ReportViewerMessages, Microsoft.SharePoint.Portal, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c' does not implement IReportViewerMessages. Is it possible to use ReportViewer in my custom Web Part? I'd rather not bind a repeater and write my own export-to-Excel code; I want to use something that is already built by Microsoft. Any ideas on what I can reuse? Edit: I commented out the line <add key="ReportViewerMessages"... and now, after adding a data source, my code looks like this: protected override void CreateChildControls() { base.CreateChildControls(); objRV = new Microsoft.Reporting.WebForms.ReportViewer(); objRV.ID = "objRV"; objRV.Visible = true; Microsoft.Reporting.WebForms.ReportDataSource datasource = new Microsoft.Reporting.WebForms.ReportDataSource("test", GroupM.Common.DB.GetAllClientCodes()); objRV.LocalReport.DataSources.Clear(); objRV.LocalReport.DataSources.Add(datasource); objRV.LocalReport.Refresh(); this.Controls.Add(objRV); } But now I do not see any data on the page. I did check my DB call and it does return a data table with 15 rows. Any ideas why I don't see anything on the page?
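
    For comparison, here is a hedged sketch of how a local report is typically wired up; the report path, embedded-resource name, and dataset name below are assumptions, not taken from the question, but the data source name does have to match the DataSet declared inside the .rdlc:

        // Hedged sketch: point the LocalReport at a report definition and use a
        // data source name that matches the DataSet defined in the .rdlc.
        protected override void CreateChildControls()
        {
            base.CreateChildControls();

            objRV = new Microsoft.Reporting.WebForms.ReportViewer();
            objRV.ID = "objRV";
            objRV.ProcessingMode = Microsoft.Reporting.WebForms.ProcessingMode.Local;

            // The report definition must be reachable: either a file path on disk...
            objRV.LocalReport.ReportPath = @"C:\Reports\ClientCodes.rdlc";        // assumed path
            // ...or an embedded resource compiled into the web part assembly:
            // objRV.LocalReport.ReportEmbeddedResource = "MyWebParts.Reports.ClientCodes.rdlc";

            // "ClientCodesDataSet" is a placeholder for the DataSet name inside the .rdlc.
            var ds = new Microsoft.Reporting.WebForms.ReportDataSource(
                "ClientCodesDataSet", GroupM.Common.DB.GetAllClientCodes());
            objRV.LocalReport.DataSources.Clear();
            objRV.LocalReport.DataSources.Add(ds);

            this.Controls.Add(objRV);
        }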

    Read the article

  • Can't step into stored procedure on remote SQL Server 2008

    - by abatishchev
    I have a domain installed on a virtual Windows Server 2008 x64. SQL Server 2008 Express x64 is running on Windows Server 2008 x64 and the client is on Windows 7 RTM x86; both are in the domain. I'm starting both Visual Studio 2008 and SQL Server Management Studio 2008 under a domain admin user. This account is a member of the sysadmin group on SQL Server. The server has firewall exceptions for both TCP and UDP on ports 135-139 and 1433-1434. The Visual Studio 2008 Remote Debugger service is started on the server and the Domain Admins group is allowed to debug. When I start debugging a query in SSMS I get this error: Failed to start debugger Error HRESULT E_FAIL has been returned from a call to a COM component. (mscorlib) Program Location: at System.Runtime.InteropServices.Marshal.ThrowExceptionForHRInternal(Int32 errorCode, IntPtr errorInfo) at Microsoft.SqlServer.Management.UI.VSIntegration.DebugSession.DebugCallbacks.OnSqlInitializeDebuggingEvent(ISqlInitializeDebuggingEvent sqlInitializeDebuggingEvent) at Microsoft.SqlServer.Management.UI.VSIntegration.DebugSession.DebugCallbacks.Microsoft.VisualStudio.Debugger.Interop.IDebugEventCallback2.Event(IDebugEngine2 debugEngine, IDebugProcess2 debugProcess, IDebugProgram2 debugProgram, IDebugThread2 debugThread, IDebugEvent2 debugEvent, Guid& riidEvent, UInt32 attribute) and Unable to access the SQL Server debugging interface. The Visual Studio debugger cannot connect to the remote computer. A firewall may be preventing communication via DCOM to the remote computer. Please see Help for assistance. and Unable to start program MSSSQL://server.mydomain.local/master/sys/=0 And when stepping into a stored procedure using VS I get the first one and this: Exception from HRESULT: 0x89710016 What should I do?

    Read the article

  • How to Link VS2010 Database Project and LINQ to SQL

    - by Jason
    As I am working with the new database projects in VS2010, and as I am learning LINQ to SQL, I am curious about the best way to link the two so that when I update one, the other updates along with it. From my research here at SO, as well as on Google, the general rule of thumb appears to be: "Build the database, and then create your LINQ to SQL classes." Of course, if I make a change in my database, the LINQ to SQL classes don't update automatically and I have to do it by hand. This is fairly simple right now as my database is small, but I am curious whether there is an easier way for this to happen. In addition, the LINQ to SQL tool is pretty nice: the ability to create tables, add associations, and even create inheritance is very simple. As my second question, I am curious whether VS2010 can work the other way - I design the database in the .dbml file and then link it back to my database project. I appreciate any help with either of these two questions. I'm really interested in making this as easy as possible to reduce errors during development and improve the speed at which changes can be made.

    Read the article

  • SQL Server Management Studio – tips for improving the TSQL coding process

    - by kristof
    I used to work in a place where a common practice was pair programming. I remember how many small things we could learn from each other when working together on code; picking up new shortcuts, code snippets, etc. over time significantly improved our efficiency of writing code. Since I started working with SQL Server I have been left on my own, and the good habits I would normally pick up from working with other people I cannot pick up now. So here is the question: what are your tips on efficiently writing TSQL code using SQL Server Management Studio? Please keep the tips to 2 - 3 things/shortcuts that you think improve your speed of coding. Please stay within the scope of TSQL and SQL Server Management Studio 2005/2008. If the feature is specific to the version of Management Studio, please indicate it: e.g. "Works with SQL Server 2008 only". Thanks EDIT: I am afraid that I could have been misunderstood by some of you. I am not looking for tips for writing efficient TSQL code but rather for advice on how to efficiently use Management Studio to speed up the coding process itself. The type of answers that I am looking for are: use of templates, keyboard shortcuts, use of IntelliSense, plugins, etc. Basically those little things that make the coding experience a bit more efficient and pleasant. Thanks again

    Read the article

  • Display a Photo Gallery using Asp.Net and SQL

    - by sweetcoder
    I have recently added photos to my SQL database and have displayed them on an *.aspx page using Asp:Image. The ImageUrl for this control points to a separate *.aspx page. This works great for profile pictures. I now have a new issue: I need each user to be able to have their own photo gallery page, with the photos stored in the SQL database. Storing the photos is not difficult; the issue is displaying them. I want the photos to be displayed in a thumbnail grid, and when the user clicks on a photo it should bring up that photo on a separate page. What is the best way to do this? Obviously it is not to use Asp:Image. I am curious whether I should use a GridView; if so, how do I do that, and should there be a thumbnail size stored in the database for this? Once the picture is clicked, how does the other page know which image to display? I would think it is not correct to send the photoId through the URL. Below is code from the page I use to display profile pictures. protected void Page_Load(object sender, EventArgs e) { string sql = "SELECT [ProfileImage] FROM [UserProfile] WHERE [UserId] = '" + User.Identity.Name.ToString() + "'"; string strCon = System.Web.Configuration.WebConfigurationManager.ConnectionStrings["SocialSiteConnectionString"].ConnectionString; SqlConnection conn = new SqlConnection(strCon); SqlCommand comm = new SqlCommand(sql, conn); conn.Open(); Response.ContentType = "image/jpeg"; Response.BinaryWrite((byte[])comm.ExecuteScalar()); conn.Close(); }
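
    One hedged sketch of the "separate page" part: a generic handler that streams a single photo by id, so the gallery page only needs to render img tags pointing at it (handler, table, and column names here are assumptions, not from the question):

        // Hedged sketch of a Photo.ashx-style handler. Table and column names are
        // hypothetical; the photo id arrives on the query string as ?id=123.
        using System;
        using System.Configuration;
        using System.Data.SqlClient;
        using System.Web;

        public class PhotoHandler : IHttpHandler
        {
            public void ProcessRequest(HttpContext context)
            {
                int photoId = int.Parse(context.Request.QueryString["id"]);
                string strCon = ConfigurationManager
                    .ConnectionStrings["SocialSiteConnectionString"].ConnectionString;

                using (var conn = new SqlConnection(strCon))
                using (var comm = new SqlCommand(
                    "SELECT [PhotoImage] FROM [UserPhoto] WHERE [PhotoId] = @id", conn))
                {
                    comm.Parameters.AddWithValue("@id", photoId);   // parameterized, not concatenated
                    conn.Open();
                    context.Response.ContentType = "image/jpeg";
                    context.Response.BinaryWrite((byte[])comm.ExecuteScalar());
                }
            }

            public bool IsReusable { get { return true; } }
        }

    If exposing raw ids in the URL is a concern, a non-guessable key (for example a GUID column) could be used in place of the integer id, and the same handler could serve resized thumbnails for the grid.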

    Read the article

  • Issues Connecting to SQLExpress using Oracle SQL Developer

    - by ArtDeveloper
    Hey guys, I'm trying to create a connection inside Oracle SQL Developer to a SQLExpress database I have. Everything resides on the same machine, so there shouldn't be any network issues to deal with, but every time I follow the instructions and try to connect I get the following message: "Failure - Unable to get information from SQL Server: localhost." I can connect to the SQLExpress DB using SQL Server Management Studio and through an ODBC connection. I've installed the third-party extensions and enabled the TCP protocol in SQL Server Configuration Manager, as well as enabled the IP addresses. I'm assuming that the SQLExpress database is on port 1433 because I didn't change this when I installed, but if someone can tell me how to double-check that I would appreciate that info as well. I set up the new connection with the following information: name: databasename; username and password are not filled in because I'm using Windows authentication; host: localhost; port: 1433/databasename;instance=SQLEXPRESS. (*databasename is replaced with the actual DB name - I've just changed the name here to protect the innocent.) I've spent about a full day trying to get this connected, and in many Google results other people have had this issue and solved it through various methods, but the ones I've tried haven't resolved my issue. Any information would be much appreciated. Thank you in advance, AD

    Read the article

  • WCF fault logging and SQL Exception 4060 error.

    - by Bill
    I have been attempting for several days to compile and run a sample WCF application from the website of Juval Lowy (author of Programming WCF Services and founder of IDesign). The sample app uses Juval's ServiceModelEx library, which logs faults/errors to a "WCFLogbook" SQL database. Unfortunately, when the sample app faults, I get the following error: SQL Exception 4060: "Cannot open database \"WCFLogbook\" requested by the login. The login failed.\r\nLogin failed for user 'Bill-PC\Bill'." I confirmed that the WCFLogbook SQL database has been created, and I have granted all of the appropriate permissions for my account (Bill-PC\Bill) to access the database. Additionally, port 8006 and port 1433 have been opened in the firewall, TCP/IP has been enabled, and "Allow remote connections to this server" has been checked. I am using the following endpoint within the App.Config file: <client> <endpoint name="LogbookTCP" address="net.tcp://Bill-PC:8006/LogbookManager" binding="netTcpBinding" contract="ILogbookManager" /> </client> Unfortunately SQL is a 'world' that I hadn't needed to venture into before now, and I am terribly frustrated with my lack of success. Would anyone have any other suggestions on how to get this working? Have I missed anything?

    Read the article

  • Can't connect to SQL Server 2008 - looks like Shared Memory problem

    - by Proposition Joe
    I am unable to connect to my local instance of SQL Server 2008 Express using SQL Server Management Studio. I believe the problem is related to a change I made to the connection protocols. Before the error occurred, I had Shared Memory enabled and Named Pipes and TCP/IP disabled. I then enabled both Named Pipes and TCP/IP, and this is when I started experiencing the problem. When I try to connect to the server with SSMS (with either my SQL Server sysadmin login or with Windows authentication), I get the following error message: A connection was successfully established with the server, but then an error occurred during the login process. (provider: Named Pipes Provider, error: 0 - No process is on the other end of the pipe.) (Microsoft SQL Server, Error: 233) Why is it returning a Named Pipes error? Why would it not just use Shared Memory, which has a higher priority in the list of connection protocols? It seems like it is not listening on Shared Memory for some reason. When I set Named Pipes to enabled and try to connect, I get the same error message. My Windows account does not have administrator privileges on this computer - perhaps this makes a difference in some way (as some of the discussions in this post about a "SuperSocketNetLib\Lpc" registry key seem to suggest).

    Read the article

  • Package creation issues using SQL Developer

    - by Carter
    So I've never worked with stored procedures and don't have a whole lot of DB experience in general, and I've been assigned a task that requires I create a package, and I'm stuck. Using SQL Developer, I'm trying to create a package called JUMPTO with this code... create or replace package JUMPTO is type t_locations is ref cursor; procedure procGetLocations(locations out t_locations); end JUMPTO; When I run it, it spits out this PL/SQL code block... DECLARE LOCATIONS APPLICATION.JUMPTO.t_locations; BEGIN JUMPTO.PROCGET_LOCATIONS( LOCATIONS = LOCATIONS ); -- Modify the code to output the variable -- DBMS_OUTPUT.PUT_LINE('LOCATIONS = ' || LOCATIONS); END; A tutorial I found said to uncomment that second line; I've tried with and without the comment. When I hit "OK" I get the error... ORA-06550: line 2, column 32: PLS-00302: component 'JUMPTO' must be declared ORA-06550: line 2, column 13: PL/SQL: item ignored ORA-06550: line 6, column 18: PLS-00320: the declaration of the type of this expression is incomplete or malformed ORA-06550: line 5, column 3: PL/SQL: Statement ignored ORA-06512: at line 58 I really don't have any idea what's going on; this is all completely new territory for me. I tried creating a body that just selected some stuff from the database, but nothing is working the way it seems like it should in my head. Can anyone give me any insight into this?

    Read the article

  • Replacing XML reserved characters in SQL Server 2005

    - by Barn
    I'm working on a system that takes relational data from a SQL Server DB and uses SSIS to produce an XML extract using SQL Server 2005's 'FOR XML PATH' command and a schema. The problem lies with replacing the XML reserved characters: 'FOR XML PATH' is only replacing <, >, and &, not ' and ", so I need a way of replacing these myself. I've tried pre-processing the fields in the database to replace XML reserved characters with their entitised equivalents (e.g. & becomes &amp;), but once these fields are used to construct XML using FOR XML, the leading & is replaced with &amp;, so I end up with &amp;amp; where I should have &amp;. What I've tried so far is altering the element's contents after the XML has been constructed, using XQuery inside SQL Server, like so: DECLARE @data VARCHAR(MAX) SET @data = CONVERT(VARCHAR(MAX), [my xml column].query('data(/root/node_i_want)')) SELECT @data = [function to replace quotes etc](@data) SET [my xml column].modify('replace value of (/root/node_i_want)[1] with sql:variable("@data")') but I get the same problem. Essentially, is there something wrong with what I'm doing above, or is there a way to tell FOR XML to entitise other characters, or something like that? Basically anything short of having to write a program to change the XML after it has been assembled in large batches and saved to files!

    Read the article

  • Python: How to execute a SQL file or command

    - by Mestika
    Hi, I have this Python script: import sys import getopt import timeit import random import os import re import ibm_db import time from string import maketrans runs=5 queries=50 file = open("results.txt", "a") for r in range(5): print "Run %s\n" % r os.system("python reads.py -r1 -pquery1.sql -q50 -sespec") file.write('END QUERY READ 01') file.close() os.system("python query_read_02.py") Everything here is working: it creates the results.txt file, it runs reads.py via os.system("python reads.py..."), and that script does everything it's supposed to. But the problem comes when it goes on to run the query_read_02.py file. That file should execute a SQL command or a SQL file on my database, so I can create an index and see what the performance of that input is - but how do I do it? I create the connection to the database in the reads.py file, but it's hard to create the queries in there because it doesn't keep track of which file it has reached; it just executes commands based on the parameters it is given. I hope I've explained myself clearly enough; otherwise please let me know. I just want to execute a SQL command or file with each query_read_0x.py file. Sincerely Mestika

    Read the article

  • Project setup for an ADO.NET/WCF DataService

    - by Slauma
    I'd like to implement an ADO.NET/WCF DataService and I am wondering what's the best way to set up a project for this purpose in VS2008 SP1. Currently I have an ASP.NET web application project (not of the "WebSite" project type). The data access layer is an Entity model (EF version 1) with a SQL Server database. I have the Entity model in a separate DLL project and the web application project references this assembly for all data access. The ADO.NET/WCF DataService needs to communicate with the Entity model/database as well, and it has to be hosted on the same web server (IIS 7.5) together with the web application. Since the DataService is not directly related to that specific web application (though it will provide and modify data from/in the same database the web application uses), my basic idea was to separate the DataService into its own new project (which also references the Entity model DLL). Now I have seen that there is no "ADO.NET/WCF DataService" project type in VS2008 SP1; it seems only possible to add a DataService as an element to other existing projects, for instance web application projects. Why isn't there a separate DataService project type? Does this mean that I have to add the DataService as an element to my web application project? Or shall I create a new web application project and add a DataService to it? (I could delete the pregenerated default.aspx since I do not need any web pages in this project.) What's the best way? Thank you in advance for suggestions!

    Read the article

  • SQL Server 2005 Installation on Windows 2000 Server

    - by Raghu
    Hi all, we want to install SQL Server 2005 on Windows 2000 Server. We are able to install the database engine, the Analysis Services engine, and Integration Services, but we are unable to install workstation components like SSMS. Has anybody faced this problem? What is the fix for this? Thanks, Raghu

    Read the article

  • Cannot connect to database: Login failed for user

    - by kishorejangid
    Yesterday my database was connecting perfectly. We have installed Windows Server 2008 R2 on our server and I have added the user 'jangid' on the client PC named SMTECH5. Now my database is not connecting using Windows authentication; only the master database is connecting. Here is the error thrown: TITLE: Connect to Database Engine Cannot connect to SMTECH5\COLLEGEERP. ADDITIONAL INFORMATION: Cannot open user default database. Login failed. Login failed for user 'SMTECH\jangid'. (Microsoft SQL Server, Error: 4064) For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&EvtSrc=MSSQLServer&EvtID=4064&LinkId=20476 BUTTONS: OK This happens on all the PCs in our lab.

    Read the article

  • Impact of ordering of correlated subqueries within a projection

    - by Michael Petito
    I'm noticing something a bit unexpected with how SQL Server (SQL Server 2008 in this case) treats correlated subqueries within a select statement. My assumption was that a query plan should not be affected by the mere order in which subqueries (or columns, for that matter) are written within the projection clause of the select statement. However, this does not appear to be the case. Consider the following two queries, which are identical except for the ordering of the subqueries within the CTE: --query 1: subquery for Color is second WITH vw AS ( SELECT p.[ID], (SELECT TOP(1) [FirstName] FROM [Preference] WHERE p.ID = ID AND [FirstName] IS NOT NULL ORDER BY [LastModified] DESC) [FirstName], (SELECT TOP(1) [Color] FROM [Preference] WHERE p.ID = ID AND [Color] IS NOT NULL ORDER BY [LastModified] DESC) [Color] FROM Person p ) SELECT ID, Color, FirstName FROM vw WHERE Color = 'Gray'; --query 2: subquery for Color is first WITH vw AS ( SELECT p.[ID], (SELECT TOP(1) [Color] FROM [Preference] WHERE p.ID = ID AND [Color] IS NOT NULL ORDER BY [LastModified] DESC) [Color], (SELECT TOP(1) [FirstName] FROM [Preference] WHERE p.ID = ID AND [FirstName] IS NOT NULL ORDER BY [LastModified] DESC) [FirstName] FROM Person p ) SELECT ID, Color, FirstName FROM vw WHERE Color = 'Gray'; If you look at the two query plans, you'll see that an outer join is used for each subquery and that the order of the joins is the same as the order in which the subqueries are written. There is a filter applied to the result of the outer join for color, to filter out rows where the color is not 'Gray'. (It's odd to me that SQL would use an outer join for the color subquery since I have a non-null constraint on the result of the color subquery, but OK.) Most of the rows are removed by the color filter. The result is that query 2 is significantly cheaper than query 1 because fewer rows are involved with the second join. All reasons for constructing such a statement aside, is this an expected behavior? Shouldn't SQL Server opt to move the filter as early as possible in the query plan, regardless of the order in which the subqueries are written?

    Read the article

  • Problem with Linq query and date format

    - by Alan T
    Hi all, I have a C# console application written using Visual Studio 2008. My system culture is en-GB. I have a LINQ query that looks like this: var myDate = "19-May-2010"; var cus = from x in _dataContext.testTable where x.CreateDate == Convert.ToDateTime(myDate) select x; The resulting SQL query generates an error because the date is rendered as "19/05/2010", which SQL Server interprets as an invalid date. For some reason, even though my system culture is set to en-GB, it looks like it's trying to interpret it as an en-US date. Any ideas how I get around this? Thanks. Alan T
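
    One hedged workaround, assuming the goal is a plain equality match: parse the string on the client with an explicit culture (or exact format) before the query, so LINQ to SQL sends a typed DateTime parameter instead of asking SQL Server to convert the string:

        // Sketch only: parse once, client-side, then compare against the typed value.
        using System;
        using System.Globalization;
        using System.Linq;

        DateTime myDate = DateTime.ParseExact(
            "19-May-2010", "dd-MMM-yyyy", CultureInfo.GetCultureInfo("en-GB"));

        var cus = from x in _dataContext.testTable
                  where x.CreateDate == myDate   // travels as a datetime parameter
                  select x;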

    Read the article

  • ETL mechanisms for MySQL to SQL Server over WAN

    - by Troy Hunt
    I’m looking for some feedback on mechanisms to batch data from MySQL Community Server 5.1.32 with an external host down to an internal SQL Server 05 Enterprise machine over VPN. The external box accumulates data throughout business hours (about 100Mb per day), which then needs to be transferred internationally across a WAN connection to an internal corporate environment before some BI work is performed. This should just be change-sets making their way down each night. I’m interested in thoughts on the ETL mechanisms people have successfully used in similar scenarios before. SSIS seems like a potential candidate; can anyone comment on the suitability for this scenario? Alternatively, other thoughts on how to do this in a cost-conscious way would be most appreciated. Thanks!

    Read the article
