Search Results

Search found 28900 results on 1156 pages for 'sql 2005'.

  • Unable to cast object of type 'System.Object[]' to type 'System.String[]'

    - by salvationishere
    I am developing a C# VS 2008 / SQL Server website application. I am a newbie to ASP.NET. However, I am getting the above error on the last line of the following code. Can you give me advice on how to fix this? This compiles correctly, but I encounter this error after running it. DataTable dt; Hashtable ht; string[] SingleRow; ... SqlConnection conn2 = new SqlConnection(connString); SqlCommand cmd = conn2.CreateCommand(); cmd.CommandText = "dbo.AppendDataCT"; cmd.Connection = conn2; SingleRow = (string[])dt.Rows[1].ItemArray; My error: System.InvalidCastException was caught Message="Unable to cast object of type 'System.Object[]' to type 'System.String[]'." Source="App_Code.g68pyuml" StackTrace: at ADONET_namespace.ADONET_methods.AppendDataCT(DataTable dt, Hashtable ht) in c:\Documents and Settings\Admin\My Documents\Visual Studio 2008\WebSites\Jerry\App_Code\ADONET methods.cs:line 88 InnerException:

    Read the article

  • How to create a UDF that takes a query string and returns the query's resultset

    - by Martin
    I want to create a stored procedure that takes a simple SELECT statement and returns the resultset as a CSV string. So the basic idea is to get the SQL statement from user input, run it using EXEC(@stmt) and convert the resultset to text using cursors. However, SQL Server doesn't allow select * from storedprocedure(@sqlStmt) or a UDF with EXEC(@sqlStmt), so I tried Insert into #tempTable EXEC(@sqlStmt), but this doesn't work (error = "invalid object name #tempTable"). I'm stuck. Could you please shed some light on this matter? Many thanks. EDIT: Actually the output (e.g. a CSV string) is not important. The problem is I don't know how to assign a cursor to the resultset returned by EXEC. SPs and UDFs do not work with Exec(), while creating a temp table before inserting values is impossible without knowing the input statement. I thought of OPENQUERY but it does not accept variables as its parameters.
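
    One pattern that works inside a stored procedure (though not in a UDF, since UDFs cannot run dynamic SQL) is to let EXEC() itself create the holding table with SELECT ... INTO a global temp table, so nothing has to be defined up front. A sketch, assuming @sqlStmt holds a plain SELECT whose columns all have names:

      DECLARE @sqlStmt nvarchar(max);
      SET @sqlStmt = N'SELECT name, object_id FROM sys.objects';   -- example input

      EXEC (N'SELECT * INTO ##QueryResult FROM (' + @sqlStmt + N') AS q');

      -- ##QueryResult survives the EXEC, so it can now be cursored over / converted to CSV
      SELECT * FROM ##QueryResult;
      DROP TABLE ##QueryResult;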

    Read the article

  • IdeaBlade Update

    - by Tolu
    Hi, I'm using IdeaBlade version 3.6. I noticed the following generated SQL update query : (@P1 nchar(32),@P2 nvarchar(32),@P3 nvarchar(512),@P4 nchar(32),@P5 int,@P6 nvarchar(32),@P7 int,@P8 datetime,@P9 datetime,@P10 datetime,@P11 int,@P12 datetime,@P13 int,@P14 int,@P15 int,@P16 nvarchar(32),@P17 nvarchar(128),@P18 nvarchar(32),@P19 nvarchar(32),@P20 datetime,@P21 datetime,@P22 bit,@P23 nvarchar(32),@P24 nvarchar(64),@P25 nchar(32))update "dbo"."GSS_Documents" set "DocumentID"=@P1,"FileName"=@P2,"FilePath"=@P3,"BusinessOfficeID"=@P4,"Pages"=@P5,"FileSize"=@P6,"DocumentType"=@P7,"DateCreated"=@P8,"EffectiveDateCreated"=@P9,"DateProcessed"=@P10,"ProcessorID"=@P11,"DateReviewed"=@P12,"ReviewerID"=@P13,"WorkflowStatus"=@P14,"ApprovalStatus"=@P15,"AccountNumber"=@P16,"AccountName"=@P17,"SerialNumber"=@P18,"TransactionID"=@P19,"CriticalDate"=@P20,"EmergencyDate"=@P21,"GenerateSMSAlert"=@P22,"CustomerPhoneNumber"=@P23,"CustomerEmailAddress"=@P24 where "DocumentID"=@P25 Problem is DocumentID is the primary key. This update appears to be updating the primary key as well! Any ideas on how to stop this?

    Read the article

  • Reporting Services is displaying extra rows for minimised row groups

    - by Graphain
    Hi, I have a fairly basic SQL Server Reporting Services report that is using nested row groups. Each sub-group depends on expanding its parent to be visible, which is all pretty standard. The layout is something like this: { Company { { Car SUM(Price) { { { Part Price My desired result when expanded is something like this (which I get fine): - SuperCarCompany - SuperCar 20 Door 20 - SuperCar2 70 Door 30 Window 40 - OtherCarCompany - SuperCar2 50 /* Same SuperCar2 */ Door 50 - MoreCarCompany - BestCarEver 535 Engine 500 Door 30 Window 5 And when opened initially something like this: + SuperCarCompany + OtherCarCompany + MoreCarCompany However, I'm getting this: + SuperCarCompany + SuperCar2 70 (i.e. sum of all SuperCar2) + OtherCarCompany + SuperCar 20 + MoreCarCompany + BestCarEver 535 and I can even expand these superfluous rows like this: + SuperCarCompany - SuperCar2 70 (i.e. sum of all SuperCar2) Door 30 (i.e. first child of any SuperCar2) The superfluous rows disappear immediately when I expand the expected row above it (i.e. I'd need to expand all expected rows to get rid of all superfluous rows). Any idea on the cause?

    Read the article

  • Unable to create index because of duplicate that doesn't exist?

    - by Alex Angas
    I'm getting an error running the following Transact-SQL command: CREATE UNIQUE NONCLUSTERED INDEX IX_TopicShortName ON DimMeasureTopic(TopicShortName) The error is: Msg 1505, Level 16, State 1, Line 1 The CREATE UNIQUE INDEX statement terminated because a duplicate key was found for the object name 'dbo.DimMeasureTopic' and the index name 'IX_TopicShortName'. The duplicate key value is (). When I run SELECT * FROM sys.indexes WHERE name = 'IX_TopicShortName' or SELECT * FROM sys.indexes WHERE object_id = OBJECT_ID(N'[dbo].[DimMeasureTopic]') the IX_TopicShortName index does not display. So there doesn't appear to be a duplicate. I have the same schema in another database and can create the index without issues there. Any ideas why it won't create here?
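
    The "duplicate key value is ()" part of the message usually means the duplicate is in the data, not in sys.indexes: more than one row has an empty (or all-blank) TopicShortName, so the unique index cannot be built. A quick check, as a sketch:

      SELECT TopicShortName, COUNT(*) AS Occurrences
      FROM dbo.DimMeasureTopic
      GROUP BY TopicShortName
      HAVING COUNT(*) > 1;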

    Read the article

  • MD5 hash for file without file attributes

    - by Glennular
    Using the following code to compute MD5 hashes of files: Private _MD5Hash As String Dim md5 As New System.Security.Cryptography.MD5CryptoServiceProvider Dim md5hash() As Byte md5hash = md5.ComputeHash(Me._BinaryData) Me._MD5Hash = ByteArrayToString(md5hash) Private Function ByteArrayToString(ByVal arrInput() As Byte) As String Dim sb As New System.Text.StringBuilder(arrInput.Length * 2) For i As Integer = 0 To arrInput.Length - 1 sb.Append(arrInput(i).ToString("X2")) Next Return sb.ToString().ToLower End Function We are getting different hashes depending on the create-date and modify-date of the file. We are storing the hash and the binary file in a SQL DB. This works fine when we upload the same instance of a file. But when we save a new instance of the file from the DB (with today's date as the create/modify) on the file-system and then check the new hash versus the MD5 stored in the DB, they do not match, and therefore fail a duplicate check. How can we check for a file hash excluding the file attributes? Or is there a different issue here?

    Read the article

  • Changing a table-valued function that a stored procedure calls is not recognized?

    - by Peter
    Hey all, I have a stored procedure sp that calls a table-valued function tvf. Sometimes I modify the tvf, but when subsequently executing sp, the output from sp is the same as before the modification. It seems like it is cached or compiled or something. If I make some dummy change to the sp, then I get the right output of the sp. Is there some way I can overcome this problem? In Oracle it is possible to re-compile all stored procedures, but I haven't been able to figure out how to do this in SQL Server. Any help is highly appreciated.
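
    In SQL Server the usual tools are sp_recompile, which marks a module so its plan is rebuilt on the next execution, and (on 2005 and later) sp_refreshsqlmodule, which refreshes the metadata of a non-schema-bound module after an object it depends on changes. A sketch with a hypothetical procedure name:

      EXEC sp_recompile N'dbo.usp_MyReport';             -- plan is rebuilt on the next run
      EXEC sys.sp_refreshsqlmodule N'dbo.usp_MyReport';  -- refresh metadata after the TVF changed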

    Read the article

  • Is there a difference between SMO ServerConnection transaction methods and using the SqlConnectionObject?

    - by YWE
    I am using SMO to create databases and tables on a SQL Server. I want to do so in a transaction. Are both of these methods of doing so valid and equivalent: First method: Server server; //... server.ConnectionContext.BeginTransaction(); //... server.ConnectionContext.CommitTransaction(); Second method: Server server; // ... SqlConnection conn = server.ConnectionContext.SqlConnectionObject; SqlTransaction trans = conn.BeginTransaction(); // ... trans.Commit();

    Read the article

  • Sorting the data returned by a database

    - by Rishabh Ohri
    Hi all, in our project we have a requirement that when a set of records is returned by the database, the records should be sorted with respect to the TITLE field in the record. The records have to be sorted alphabetically, but if the title of a record has a number in it then it should come after the records whose titles consist only of letters. Details: we are using SQL Server and C#. The data from the database comes to an Entity class which forwards the data to other layers. So, what would be a possible and effective solution for this requirement?
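
    On the SQL Server side this can be expressed with a CASE in the ORDER BY: titles containing any digit sort after purely alphabetic titles, and alphabetically within each group. A sketch (the table name is an assumption):

      SELECT Title
      FROM dbo.Records
      ORDER BY
          CASE WHEN Title LIKE '%[0-9]%' THEN 1 ELSE 0 END,
          Title;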

    Read the article

  • nHibernate: Query tree nodes where self or ancestor matches condition

    - by Famous Nerd
    I have seen a lot of competing theories about hierarchical queries in fluent-nHibernate or even basic nHibernate and how they're a difficult beast. Does anyone have any knowledge of good resources on the subject? I find myself needing to do queries similar to: (using a file system analog) select folderObjects from folders where folder.Permissions includes :myPermissionLevel or [any of my ancestors] includes :myPermissionLevel This is a one-to-many tree; no node has multiple parents. I'm not sure how to describe this in nHibernate-specific terms or even SQL terms. I've seen the phrase "nested sets" mentioned; is this applicable? I'm not sure. Can anyone offer any advice on approaches to writing this sort of nHibernate query?
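
    Whatever the nHibernate mapping ends up looking like, the underlying SQL Server 2005 query is typically a recursive CTE that walks up the parent chain and checks the permission on the folder or any ancestor. A sketch only, with assumed table and column names (the nHibernate side is not shown):

      DECLARE @folderID int, @myPermissionLevel int;
      SET @folderID = 42;            -- hypothetical folder
      SET @myPermissionLevel = 3;    -- hypothetical permission level

      WITH Ancestors AS
      (
          SELECT FolderID, ParentFolderID, Permissions
          FROM dbo.Folders
          WHERE FolderID = @folderID
          UNION ALL
          SELECT f.FolderID, f.ParentFolderID, f.Permissions
          FROM dbo.Folders AS f
          INNER JOIN Ancestors AS a ON f.FolderID = a.ParentFolderID
      )
      SELECT o.*
      FROM dbo.FolderObjects AS o
      WHERE o.FolderID = @folderID
        AND EXISTS (SELECT 1 FROM Ancestors WHERE Permissions = @myPermissionLevel);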

    Read the article

  • schema for storing different varchar fields over time?

    - by Henry
    This app I'm working on needs to store some metadata fields about an entity. The problem is that we can already foresee that these fields are going to change a lot in the future. I'm using Hibernate (in ColdFusion) and each entity's properties are translated to one column in the entity table, but altering table columns later down the road will be costly and error-prone, right? Should I go for something like this? MetaDataField ----- metaDataFieldID (PK), name FieldValue ---------- EntityID (PK, FK), metaDataFieldID (PK, FK), value [varchar(255)] Is this common? Anything to watch out for? P.S. I also thought of using XML on SQL Server 05+. After talking to some people, it seems like it is not a viable solution because it will be too slow for doing certain queries for reporting purposes.
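
    A minimal DDL sketch of the two-table design described above (names and types are assumptions):

      CREATE TABLE dbo.MetaDataField
      (
          MetaDataFieldID int IDENTITY(1, 1) PRIMARY KEY,
          Name            varchar(100) NOT NULL
      );

      CREATE TABLE dbo.FieldValue
      (
          EntityID        int NOT NULL,
          MetaDataFieldID int NOT NULL REFERENCES dbo.MetaDataField (MetaDataFieldID),
          Value           varchar(255) NULL,
          CONSTRAINT PK_FieldValue PRIMARY KEY (EntityID, MetaDataFieldID)
      );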

    Read the article

  • How can I iterate over a recordset within a stored procedure?

    - by David
    I need to iterate over a recordset from a stored procedure and execute another stored procedure using each row's fields as arguments. I can't complete this iteration in the code. I have found samples on the internet, but they all seem to deal with a counter. I'm not sure if my problem involves a counter. I need the T-SQL equivalent of a foreach. Currently, my first stored procedure stores its recordset in a temp table, #mytemp. I assume I will call the secondary stored procedure like this: while (something) execute nameofstoredprocedure arg1, arg2, arg3 end
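
    The T-SQL equivalent of a foreach over a result set is a cursor; no counter is needed. A sketch using the temp table and procedure names from the question (the argument types are assumptions):

      DECLARE @arg1 int, @arg2 varchar(50), @arg3 datetime;

      DECLARE row_cursor CURSOR LOCAL FAST_FORWARD FOR
          SELECT field1, field2, field3 FROM #mytemp;

      OPEN row_cursor;
      FETCH NEXT FROM row_cursor INTO @arg1, @arg2, @arg3;

      WHILE @@FETCH_STATUS = 0
      BEGIN
          EXEC nameofstoredprocedure @arg1, @arg2, @arg3;
          FETCH NEXT FROM row_cursor INTO @arg1, @arg2, @arg3;
      END

      CLOSE row_cursor;
      DEALLOCATE row_cursor;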

    Read the article

  • Create view or SP, only if the DB contains a pattern

    - by Randall Salas
    Hi all: I am working on a script that needs to be run on many different SQL servers. Some of them share the same structure; in other words, they are identical, but the filegroups and the DB names are different, because there is one per client. Anyway, when running a script, if I choose the wrong DB, I would like it not to be executed. I am trying to maintain a clean DB. Here is my example, which only works for dropping a view if it exists, but does not work for creating a new one. I also wonder how it would work for creating a stored procedure. Thx a lot. if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[ContentModDate]') and OBJECTPROPERTY(id, N'IsView') = 1) AND CHARINDEX('Content', DB_NAME()) > 0 drop view [dbo].[ContentModDate] GO IF (CHARINDEX('Content', DB_NAME()) > 0) BEGIN CREATE VIEW [dbo].[Rx_ContentModDate] AS SELECT 'Table1' AS TableName, MAX(ModDate) AS ModDate FROM Tabl1 WHERE ModDate IS NOT NULL UNION SELECT 'Table2', MAX(ModDate) AS ModDate FROM Table2 WHERE ModDate IS NOT NULL END END GO
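
    CREATE VIEW (and CREATE PROCEDURE) must be the only statement in its batch, so it cannot sit directly inside an IF ... BEGIN ... END block; wrapping the CREATE in EXEC() gives it its own batch. A sketch with the view body abbreviated:

      IF CHARINDEX('Content', DB_NAME()) > 0
      BEGIN
          IF EXISTS (SELECT * FROM sys.views
                     WHERE object_id = OBJECT_ID(N'[dbo].[ContentModDate]'))
              DROP VIEW [dbo].[ContentModDate];

          EXEC (N'CREATE VIEW [dbo].[ContentModDate] AS
                  SELECT ''Table1'' AS TableName, MAX(ModDate) AS ModDate
                  FROM dbo.Table1 WHERE ModDate IS NOT NULL');
      END
      GO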

    Read the article

  • SSIS - Can I get the column schema for a flat file source from a database?

    - by Steve Clement
    We receive a nightly data export from a vendor in the form of about 10 tab-delimited flat files without column headers. In addition, the vendor provides us with the SQL scripts for the database tables so that we can import the files into our system. Unfortunately, the vendor recently changed the schema for the flat files. Each file has upwards of 150 columns, and having to go through the DB schema and adjust column types on a Flat File Data Source in SSIS is extremely time-consuming, not to mention a royal pain. Since I know the file data layout in the database schema, is there any way I can dynamically pull that into a Flat File source to set the columns correctly? Or am I just stuck with manually setting everything?
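
    SSIS does not, out of the box, rebuild a Flat File connection's column list at run time, but the layout can at least be generated from the database's own metadata rather than typed by hand, for example to feed a script task or to paste into the connection manager. A sketch (the table name is an assumption):

      SELECT COLUMN_NAME, ORDINAL_POSITION, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
      FROM INFORMATION_SCHEMA.COLUMNS
      WHERE TABLE_SCHEMA = 'dbo'
        AND TABLE_NAME = 'VendorExportTable'
      ORDER BY ORDINAL_POSITION;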

    Read the article

  • Inline Conditional Statement in Stored Procedure

    - by Jason
    Here is the pseudo-code for my inline query in my code: select columnOne from myTable where columnOne = '#variableOne#' if len(variableTwo) gt 0 and columnTwo = '#variableTwo#' end I would like to move this into a stored procedure but am having trouble building the query correctly. I assume it would be something like select columnOne from myTable where columnOne = @variableOne CASE WHEN len(@variableTwo) <> 0 THEN and columnTwo = @variableTwo END This is giving me a syntax error. Could someone tell me what I've got wrong? Also, I would like to keep it to only one query and not just have one if statement. Also, I do not want to build the SQL in the stored procedure and run Exec() on it.
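
    CASE is an expression, not a way to splice predicates in and out of a WHERE clause, which is why the attempt above produces a syntax error. The usual single-query form of an optional filter looks like this (a sketch; add an IS NULL check if @variableTwo can be NULL):

      SELECT columnOne
      FROM myTable
      WHERE columnOne = @variableOne
        AND (LEN(@variableTwo) = 0 OR columnTwo = @variableTwo);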

    Read the article

  • How to position columns in select list based on a variable.

    - by Rohit
    I am creating a dynamic grid whose layout is selected from the format defined in the XML below. My select list includes all columns in this XML, and I want to position these columns on the basis of the Position attribute in the XML. <gridFormat> <column property="CustomerID" dbName="CustName" HeaderText="Customer" IsVisible="0" Position="1" /> <column property="CustomerName" dbName="FacilityName" HeaderText="Facility" IsVisible="1" Position="3" /> <column property="FacilityInternalID" dbName="Pname" HeaderText="Patient" IsVisible="1" Position="2" /> </gridFormat> I cannot select them in the order specified by the Position attribute. I can do it in C# by looping over the returned DataTable and explicitly positioning columns using the datatable.columns.setordinal() method. Is there any better way in SQL or C# to accomplish this?
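
    If the ordering is to happen on the SQL side, one approach is to shred the format XML, build the column list in Position order, and run it as dynamic SQL. A sketch (the source table name is an assumption):

      DECLARE @format xml, @cols nvarchar(max), @sql nvarchar(max);
      SET @format = N'<gridFormat>
        <column property="CustomerID" dbName="CustName" HeaderText="Customer" IsVisible="0" Position="1" />
        <column property="CustomerName" dbName="FacilityName" HeaderText="Facility" IsVisible="1" Position="3" />
        <column property="FacilityInternalID" dbName="Pname" HeaderText="Patient" IsVisible="1" Position="2" />
      </gridFormat>';

      SELECT @cols = STUFF((
          SELECT ', ' + QUOTENAME(c.value('@dbName', 'nvarchar(128)'))
          FROM @format.nodes('/gridFormat/column') AS t(c)
          ORDER BY c.value('@Position', 'int')
          FOR XML PATH('')), 1, 2, '');

      SET @sql = N'SELECT ' + @cols + N' FROM dbo.GridSourceTable';
      EXEC sp_executesql @sql;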

    Read the article

  • Best way to correct garbled data caused by false encoding

    - by ercan
    Hi all, I have a set of data that contains garbled text fields because of encoding errors during many import/exports from one database to another. Most of the errors were caused by converting UTF-8 to ISO-8859-1. Strangely enough, the errors are not consistent: the word 'München' appears as 'München' in some places and as 'MÃœnchen'. Is there a trick in SQL Server to correct this kind of crap? The first thing that I can think of is to exploit the COLLATE clause, so that ü is interpreted as ü, but I don't exactly know how. If it isn't possible to do it at the DB level, do you know any tool that helps with a bulk correction? (no manual find/replace tool, but a tool that guesses the garbled text somehow and corrects it)
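
    At the database level the common (if blunt) fix is to replace the known UTF-8-read-as-Latin-1 byte pairs back to the intended characters, forcing a binary collation so an accent-insensitive collation does not match the wrong strings. A sketch for one column and three characters (table and column names are assumptions; extend the list as needed):

      UPDATE dbo.SomeTable
      SET TextField = REPLACE(REPLACE(REPLACE(
              TextField COLLATE Latin1_General_BIN,
              N'Ã¼', N'ü'), N'Ã¶', N'ö'), N'Ã¤', N'ä')
      WHERE TextField COLLATE Latin1_General_BIN LIKE N'%Ã%';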

    Read the article

  • Errors with large data sources

    - by The Sheek Geek
    I'm doing some benchmarking on large data sources and binding/exporting data for reporting. I started by using a DataSet, filling it with 100000 rows and then attempting to open a Crystal Report with the retrieved data. I noticed that the data set filled just fine (it took about 779 milliseconds); however, when attempting to export the data to the report or even bind to a GridView, the application would fail with an OutOfMemoryException. Has anyone experienced this before or have an idea of how to get around it? It is very possible that clients will run reports for years' worth of data, and 100000 rows are not inconceivable. The application and the benchmark code are written in C# using ORACLE and SQL Server databases. I still have some data sources to test, but would like to know how to get around this just in case I don't find a better solution.
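
    One way to keep memory in check regardless of how many rows a report covers is to page on the server with SQL Server 2005's ROW_NUMBER and bind or export one page at a time. A sketch (table and column names are assumptions):

      DECLARE @startRow int, @endRow int;
      SET @startRow = 1;
      SET @endRow = 5000;

      SELECT OrderID, OrderDate, Amount
      FROM (
          SELECT OrderID, OrderDate, Amount,
                 ROW_NUMBER() OVER (ORDER BY OrderID) AS RowNum
          FROM dbo.Orders
      ) AS numbered
      WHERE RowNum BETWEEN @startRow AND @endRow;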

    Read the article

  • Creating persisted computed columns when the user-defined scalar function appears to be non-deterministic

    - by Ralph Shillington
    I have a scalar UDF that I know to be deterministic, but SQL Server doesn't. Is there a way to declare it as deterministic so that I can then use it in a persisted computed column definition? Further clarification: The purpose of this exercise is that I need to harvest out specific values from an XML column on the row. I can't use the value method of the xml column in my computed column definition, but I can use it in a UDF. I know the XPath query in the value method will produce the same output given the same input, so while I certainly understand that not all calls to value will be deterministic, I want to assert that mine is.
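
    There is no DETERMINISTIC keyword in T-SQL; SQL Server derives the property itself, and a UDF is only eligible once it is created WITH SCHEMABINDING. A sketch of the usual "promote an XML value into a column" pattern (function, table, and path names are hypothetical):

      CREATE FUNCTION dbo.udf_GetCustomerCode (@doc xml)
      RETURNS nvarchar(32)
      WITH SCHEMABINDING
      AS
      BEGIN
          RETURN @doc.value('(/Document/CustomerCode)[1]', 'nvarchar(32)');
      END
      GO

      ALTER TABLE dbo.Documents
          ADD CustomerCode AS dbo.udf_GetCustomerCode(DocumentXml) PERSISTED;
      GO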

    Read the article

  • Will creating an index help in this case

    - by The King
    I'm still a learning user of SQL Server 2005. Here is my table structure: CREATE TABLE [dbo].[Trn_PostingGroups]( [ControlGroup] [char](5) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL, [PracticeCode] [char](5) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL, [ScanDate] [smalldatetime] NULL, [DepositDate] [smalldatetime] NULL, [NameOfFile] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL, [DepositValue] [decimal](11, 2) NULL, [RecordStatus] [char](1) COLLATE SQL_Latin1_General_CP1_CI_AS NULL, CONSTRAINT [PK_Trn_PostingGroups_1] PRIMARY KEY CLUSTERED ( [ControlGroup] ASC, [PracticeCode] ASC )WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY] ) ON [PRIMARY] Scenario 1: Suppose I have a query like this... Select * from Trn_PostingGroups where PracticeCode = 'ABC' Will indexing on PracticeCode separately help me in making my query faster? Scenario 2: Select * from Trn_PostingGroups where ControlGroup = 12701 and PracticeCode = 'ABC' and NameOfFile = 'FileName1' Will indexing on NameOfFile separately help me in making my query faster?
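
    Because the clustered primary key leads on (ControlGroup, PracticeCode), a filter on PracticeCode alone cannot seek it, so scenario 1 does benefit from its own nonclustered index; scenario 2 can already seek the primary key on ControlGroup and PracticeCode, so an extra index on NameOfFile usually adds little. A sketch:

      -- Helps scenario 1 (filter on PracticeCode only)
      CREATE NONCLUSTERED INDEX IX_Trn_PostingGroups_PracticeCode
          ON dbo.Trn_PostingGroups (PracticeCode);

      -- For scenario 2 the clustered PK already covers the seek; measure before adding this
      CREATE NONCLUSTERED INDEX IX_Trn_PostingGroups_NameOfFile
          ON dbo.Trn_PostingGroups (NameOfFile);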

    Read the article

  • Timeout Expired on INSERT Query

    - by Sachin Gaur
    I am trying to insert a row in a table using a simple INSERT query in a transaction. It works fine in SQL Server, but I am not able to insert the data using my business object. I am calling a SELECT query using the Command as: Using cm As New SqlCommand With cm .Connection = tr.Connection .Transaction = tr .CommandType = CommandType.Text .CommandText = Some Select Query .ExecuteScalar() '' Do something .CommandText = Insert Query .ExecuteNonQuery() End With End Using I am getting the Timeout period expired error at the ".ExecuteNonQuery()" line. Any other DML query is running perfectly fine at this point. Can anyone tell me the reason?
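
    A timeout on a plain INSERT that runs instantly in Management Studio is very often blocking: another connection (frequently an earlier transaction that was never committed) is holding locks on the target table. A sketch for checking while the INSERT is hanging (SQL Server 2005 DMV):

      SELECT session_id, blocking_session_id, wait_type, wait_time, command
      FROM sys.dm_exec_requests
      WHERE blocking_session_id <> 0;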

    Read the article

  • Expanded securityadmin

    - by user80652
    I'm aware that sysadmin is documented as the server role necessary for creating logins (SQL/Windows-integrated); nevertheless, I'm tasked to find out if there's any other server role (built-in or otherwise) that can be used. To be specific, I'm looking to set up one or two logins with access to create logins, create [database] users, and assign users to [database] roles. Potentially reset passwords, but most of the logins are Windows-integrated and it's not necessary. They cannot have access to data at all, nor can these logins have rights to update tables or create/update roles. It seems my only options so far are to set these 2 logins with the securityadmin server role and, for the specific databases, configure them with db_securityadmin and db_accessadmin... but this configuration doesn't allow for creating logins.
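
    Rather than a broader role, SQL Server 2005's granular permissions may cover this: ALTER ANY LOGIN at the server level for creating logins, plus ALTER ANY USER and ALTER ANY ROLE in each database for creating users and managing role membership, with no data access granted. A sketch (login, user, and database names are hypothetical):

      USE master;
      GRANT ALTER ANY LOGIN TO [HelpdeskAdminLogin];
      GO
      USE AppDatabase;
      GRANT ALTER ANY USER TO [HelpdeskAdminUser];
      GRANT ALTER ANY ROLE TO [HelpdeskAdminUser];
      GO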

    Read the article

  • SQL Server stored procedures - update column based on variable name..?

    - by ClarkeyBoy
    Hi, I have a data-driven site with many stored procedures. What I want to eventually be able to do is to say something like: For Each @variable in sproc inputs UPDATE @TableName SET @variable.toString = @variable Next I would like it to be able to accept any number of arguments. It will basically loop through all of the inputs and update the column with the name of the variable with the value of the variable - for example column "Name" would be updated with the value of @Name. I would like to basically have one stored procedure for updating and one for creating. However to do this I will need to be able to convert the actual name of a variable, not the value, to a string. Question 1: Is it possible to do this in T-SQL, and if so how? Question 2: Are there any major drawbacks to using something like this (like performance or CPU usage)? I know if a value is not valid then it will only prevent the update involving that variable and any subsequent ones, but all the data is validated in the VB.NET code anyway so will always be valid on submitting to the database, and I will ensure that only variables where the column exists are able to be submitted. Many thanks in advance, Regards, Richard Clarke Edit: I know about using SQL strings and the risk of SQL injection attacks - I studied this a bit in my dissertation a few weeks ago. Basically the website uses an object-oriented architecture. There are many classes - for example Product - which have many "Attributes" (I created my own class called Attribute, which has properties such as DataField, Name and Value where DataField is used to get or update data, Name is displayed on the administration frontend when creating or updating a Product and the Value, which may be displayed on the customer frontend, is set by the administrator). DataField is the field I will be using in the "UPDATE Blah SET @Field = @Value". I know this is probably confusing but it's really complicated to explain - I have a really good understanding of the entire system in my head but I can't put it into words easily. Basically the structure is set up such that no user will be able to change the value of DataField or Name, but they can change Value. I think if I were to use dynamic parameterised SQL strings there will therefore be no risk of SQL injection attacks. I mean basically loop through all the attributes so that it ends up like: UPDATE Products SET [Name] = '@Name', Description = '@Description', Display = @Display Then loop through all the attributes again and add the parameter values - this will have the same effect as using stored procedures, right? I don't mind adding to the page load time since this is mainly going to affect the administration frontend, and will marginally affect the customer frontend.
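
    T-SQL has no direct way to turn a parameter's own name into a string at run time, so this is normally done by building the UPDATE dynamically from the known column (DataField) names while still passing the values as real parameters through sp_executesql, which keeps the injection surface limited to the column names. A sketch (table and column names are assumptions):

      DECLARE @sql nvarchar(max);
      SET @sql = N'UPDATE dbo.Products
                   SET [Name] = @Name, [Description] = @Description
                   WHERE ProductID = @ProductID';

      EXEC sp_executesql @sql,
           N'@Name nvarchar(100), @Description nvarchar(max), @ProductID int',
           @Name = N'Example product', @Description = N'Example description', @ProductID = 1;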

    Read the article

  • Dataset.ReadXml() - Invalid character in the given encoding

    - by NLV
    Hello all I have saved a dataset in the sql database in an xml column using the following code. XmlDataDocument dd = new XmlDataDocument(dataset); and passing this xml document as sql parameter using param.value = new XmlNodeReader(dd); The XML is like <NewDataSet><SubContractChangeOrders><AGCol>1</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>006</Contract_x0020_Number><ContractID>30</ContractID><ChangeOrderID>211</ChangeOrderID><Amount>0.0000</Amount><Udf_CostReimbursableFlag>false</Udf_CostReimbursableFlag><Udf_CustomerCode /><Udf_SubChangeOrderStatus /></SubContractChangeOrders><SubContractChangeOrders><AGCol>2</AGCol><SCO_x0020_Number>002</SCO_x0020_Number><Contract_x0020_Number>006</Contract_x0020_Number><ContractID>30</ContractID><ChangeOrderID>212</ChangeOrderID><Amount>0.0000</Amount><Udf_CostReimbursableFlag>false</Udf_CostReimbursableFlag></SubContractChangeOrders><SubContractChangeOrders><AGCol>3</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>111</Contract_x0020_Number><ContractID>87</ContractID><ChangeOrderID>12</ChangeOrderID><Amount>300.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>4</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>222</Contract_x0020_Number><ContractID>80</ContractID><ChangeOrderID>6</ChangeOrderID><Amount>100.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>5</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>777</Contract_x0020_Number><ContractID>79</ContractID><ChangeOrderID>5</ChangeOrderID><Amount>200.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>6</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>786</Contract_x0020_Number><ContractID>77</ContractID><ChangeOrderID>3</ChangeOrderID><Amount>100.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>7</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>787</Contract_x0020_Number><ContractID>78</ContractID><ChangeOrderID>4</ChangeOrderID><Amount>500.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>8</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con 009</Contract_x0020_Number><ContractID>219</ContractID><ChangeOrderID>78</ChangeOrderID><Amount>9000.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>9</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con 010</Contract_x0020_Number><ContractID>220</ContractID><ChangeOrderID>79</ChangeOrderID><Amount>13000.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>10</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con 012</Contract_x0020_Number><ContractID>222</ContractID><ChangeOrderID>83</ChangeOrderID><Amount>2300.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>11</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con 020</Contract_x0020_Number><ContractID>226</ContractID><ChangeOrderID>86</ChangeOrderID><Amount>5400.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>12</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con 
021</Contract_x0020_Number><ContractID>227</ContractID><ChangeOrderID>87</ChangeOrderID><Amount>2300.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>13</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con001</Contract_x0020_Number><ContractID>208</ContractID><ChangeOrderID>72</ChangeOrderID><Amount>3000.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>14</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con002</Contract_x0020_Number><ContractID>209</ContractID><ChangeOrderID>73</ChangeOrderID><Amount>400.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>15</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con003</Contract_x0020_Number><ContractID>210</ContractID><ChangeOrderID>74</ChangeOrderID><Amount>6000.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>16</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con004</Contract_x0020_Number><ContractID>211</ContractID><ChangeOrderID>75</ChangeOrderID><Amount>9000.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>17</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con005</Contract_x0020_Number><ContractID>213</ContractID><ChangeOrderID>76</ChangeOrderID><Amount>17000.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>18</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Cont001</Contract_x0020_Number><ContractID>228</ContractID><ChangeOrderID>89</ChangeOrderID><Amount>2000.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>19</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>PUR001</Contract_x0020_Number><ContractID>229</ContractID><ChangeOrderID>88</ChangeOrderID><Amount>1000.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>20</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>PUR002</Contract_x0020_Number><ContractID>230</ContractID><ChangeOrderID>90</ChangeOrderID><Amount>3000.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>21</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>SC-002</Contract_x0020_Number><ContractID>2</ContractID><ChangeOrderID>7</ChangeOrderID><Amount>200.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>22</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>SC-004</Contract_x0020_Number><ContractID>7</ContractID><ChangeOrderID>65</ChangeOrderID><Amount>1000.0000</Amount></SubContractChangeOrders></NewDataSet> I'm trying to read it back as follows using (SqlConnection con = new SqlConnection("Server=#####;Initial Catalog=#####;User ID=####;Pwd=########")) { using (SqlCommand com = new SqlCommand("select * from dbo.tbl_#####", con)) { using (SqlDataAdapter ada = new SqlDataAdapter(com)) { ada.Fill(dt); MemoryStream ms = new MemoryStream(); object contractXML1 = dt.Rows[0]["SCOXML1"]; System.Runtime.Serialization.Formatters.Binary.BinaryFormatter bf = new System.Runtime.Serialization.Formatters.Binary.BinaryFormatter(); bf.Serialize(ms, contractXML1); ms.Seek(0, SeekOrigin.Begin); ds.ReadXml(ms); } } } I'm getting the following error Data at the root level is invalid. Line 1, position 6. Any ideas? NLV

    Read the article

  • Change the default SqlCommand CommandTimeout with configuration rather than recompile?

    - by robertc
    I am supporting an ASP.Net 3.5 web application and users are experiencing a timeout error after 30 seconds when trying to run a report. Looking around the web it seems it's easy enough to change the timeout in the code, unfortunately I'm not able to access the code and recompile. Is there anyway to configure the default for either the web app, the worker process, IIS or the whole machine? Here is the stack trace up to the point where it's in System.Data in case I'm missing some other problem: [SqlException (0x80131904): Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.] System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection) +1948826 System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection) +4844747 System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj) +194 System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj) +2392 System.Data.SqlClient.SqlDataReader.ConsumeMetaData() +33 System.Data.SqlClient.SqlDataReader.get_MetaData() +83 System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString) +297 System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async) +954 System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, DbAsyncResult result) +162 System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method) +32 System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method) +141 System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior) +12 System.Data.Common.DbCommand.System.Data.IDbCommand.ExecuteReader(CommandBehavior behavior) +10 System.Data.Common.DbDataAdapter.FillInternal(DataSet dataset, DataTable[] datatables, Int32 startRecord, Int32 maxRecords, String srcTable, IDbCommand command, CommandBehavior behavior) +130 System.Data.Common.DbDataAdapter.Fill(DataTable[] dataTables, Int32 startRecord, Int32 maxRecords, IDbCommand command, CommandBehavior behavior) +162 System.Data.Common.DbDataAdapter.Fill(DataTable dataTable) +115 --Edit There must be something outside the code itself - I've downloaded the database and run it against the same web site installed on a test server and it runs for longer than 30 seconds and returns the report. I've compared the machine.config and web.config files from the .Net directory on the live and test and they seem the same, compared the two IIS setups, also looked at the SQL Server configuration and the only difference is that the live server is clustered on 64bit W2K3 while the test server is on 32bit.

    Read the article
