Search Results

Search found 33316 results on 1333 pages for 'sql team'.

Page 96/1333 | < Previous Page | 92 93 94 95 96 97 98 99 100 101 102 103  | Next Page >

  • Linq to SQL Problem System.Data.Linq.IdentityManager.StandardIdentityManager.MultiKeyManager

    - by luckyluke
    I have a really tricky problem here. My project has around 100 tables and they are all mapped by LINQ. Everything works fine in the dev and test environments, which are MS Win 2008 R2 servers with SQL 2008 SP1 databases; IIS and SQL are on different machines. On the production environment, which is an MS Win 2003 x64 web farm plus geo-clustered SQL 2008, it does NOT work. All I get is this exception:

        System.IndexOutOfRangeException: Index was outside the bounds of the array.
           at System.Data.Linq.IdentityManager.StandardIdentityManager.MultiKeyManager3.TryCreateKeyFromValues(Object[] values, MultiKey& k)
           at System.Data.Linq.IdentityManager.StandardIdentityManager.IdentityCache2.Find(Object[] keyValues)
           at System.Data.Linq.ChangeProcessor.GetOtherItem(MetaAssociation assoc, Object instance)
           at System.Data.Linq.ChangeProcessor.BuildEdgeMaps()
           at System.Data.Linq.ChangeProcessor.SubmitChanges(ConflictMode failureMode)
           at System.Data.Linq.DataContext.SubmitChanges(ConflictMode failureMode)
           at ERS.IIMP.Services.ExposuresSrv.Update(Int32 ExpID, Int32 AssID) in Services\ExposuresSrv.cs

    My question is: what is going on? Both environments have precisely the same DBML, the databases have exactly the same structure (when I take the DB from prod to test and mount it, everything works just fine), and the binaries on the web server are the same. I seriously do not know what to do. Has anyone found that LINQ works in one environment and not in another? I am really lost here. I really hope you can help me :)

    Read the article

  • Embedded SQL in OO languages like Java

    - by Steve De Caux
    One of the things that annoys me about working with SQL in OO languages is having to define SQL statements in strings. When I used to work on IBM mainframes, the languages used an SQL preprocessor to parse the SQL statements out of the native code, so the statements could be written in cleartext SQL without the obfuscation of strings. For instance, Cobol has an EXEC SQL .... END-EXEC syntax construct that allows pure SQL statements to be embedded in the Cobol code:

        <pure cobol code, including assignment of value to local variable HOSTVARIABLE>
        EXEC SQL
          SELECT COL_A, COL_B, COL_C
          INTO :COLA, :COLB, :COLC
          FROM TAB_A
          WHERE COL_D = :HOSTVARIABLE
        END-EXEC
        <more cobol code, variables COLA, COLB, COLC have been set>

    ...this makes the SQL statement really easy to read and check for errors. Between the EXEC SQL .... END-EXEC tokens there are no constraints on indentation, line breaking etc., so you can format the SQL statement according to taste. Note that this example is for a single-row select; when a multiple-row result set is expected, the coding is different (but still very easy to read). So, taking Java as an example: what made the "old COBOL" approach undesirable? Not only SQL but also system calls could be made much more readable with that approach. Let's call it the embedded foreign language preprocessor approach. Would an embedded foreign language preprocessor for SQL be useful to implement? Would you see a benefit in being able to write native SQL statements inside Java code? Edit: I'm really asking whether you think SQL in OO languages is a throwback, and if not, what could be done to make it better.

    Read the article

  • SQL exception when transferring a project from USB to C:\

    - by jello
    I'm working on a C# Windows program with Visual Studio 2008. Usually I work from school, directly on my USB drive, but when I copy the folder to my hard drive at home, an SQL exception is unhandled whenever I try to write to the database. It is thrown at the conn.Open(); line. Here's the unhandled exception:

        Database 'L:\system\project\the_project\the_project\bin\Debug\PatientMonitoringDatabase.mdf' already exists. Choose a different database name. Cannot attach the file 'C:\Documents and Settings\Administrator\My Documents\system\project\the_project\the_project\bin\Debug\PatientMonitoringDatabase.mdf' as database 'PatientMonitoringDatabase'.

    It's weird, because my connection string uses |DataDirectory|, so it should work on any drive. Here's my connection string:

        string connStr = "Data Source=.\\SQLEXPRESS;AttachDbFilename=|DataDirectory|\\PatientMonitoringDatabase.mdf; " +
                         "Initial Catalog=PatientMonitoringDatabase; " +
                         "Integrated Security=True";

    Someone told me to connect to localhost with SQL Server Management Studio Express and remove/detach the existing PatientMonitoringDatabase database, since whether it's a persistent database or one only active within a running application, you can't have two databases with the same name attached to a SQL Server instance at the same time. So I did that, and now it gives me:

        Directory lookup for the file "C:\Documents and Settings\Administrator\My Documents\system\project\the_project\the_project\bin\Debug\PatientMonitoringDatabase.mdf" failed with the operating system error 5 (Access is denied.). Cannot attach the file 'C:\Documents and Settings\Administrator\My Documents\system\project\the_project\the_project\bin\Debug\PatientMonitoringDatabase.mdf' as database 'PatientMonitoringDatabase'.

    I checked the files' properties, and Everyone is allowed access. Does anyone know what's going on here?
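    For reference, a minimal sketch of that detach step, run against the local .\SQLEXPRESS instance. It assumes the stale copy is still attached under the name shown in the error message; detaching rather than dropping leaves the .mdf/.ldf files on disk.

        -- Sketch: run in SSMS against .\SQLEXPRESS (database name taken from the error message).
        USE master;
        GO

        -- Force out any lingering connections before detaching.
        ALTER DATABASE PatientMonitoringDatabase
            SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
        GO

        -- Detach rather than DROP so the data files are left in place.
        EXEC sp_detach_db @dbname = N'PatientMonitoringDatabase';
        GO

    The follow-up error (operating system error 5) is usually a file permission problem: the account the SQL Server service runs under needs NTFS access to the .mdf/.ldf files in that My Documents folder, not just your own user account.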

    Read the article

  • Must read books for a programming team leader

    - by takeshin
    As a programming team leader, which books do you recommend? Books about HR, good programming practices, etc. I have recently seen PHP Team Development, but it is not mind-blowing for experienced developers. One of the best I can recommend is the Microsoft Manual of Style for Technical Publications, which helped me a lot in improving the language and practices of documenting code.

    Read the article

  • How do large blobs affect SQL delete performance, and how can I mitigate the impact?

    - by Max Pollack
    I'm currently experiencing a strange issue that my understanding of SQL Server doesn't quite mesh with. We use SQL Server as the file storage for our internal storage service, and our database has about half a million rows in it. Most of the files (86%) are 1 MB or under, but even on fresh copies of our database, where we simply populate the table with data for the purposes of a test, rows with large amounts of data stored in a BLOB frequently cause timeouts when our SQL Server is under load. My understanding of how SQL Server deletes rows is that it's a garbage collection process, i.e. the row is marked as a ghost and is later removed by the ghost cleanup process after the changes are copied to the transaction log. This suggests to me that, regardless of the size of the data in the blob, row deletion should be close to instantaneous. However, when deleting these rows we are definitely experiencing large numbers of timeouts and astoundingly low performance. In our test data set, it's files over 30 MB that cause the issue. This is an edge case; we don't encounter these often, and even though we're looking into SQL FILESTREAM as a solution to some of our problems, we're trying to narrow down where these issues originate. We ARE performing our deletes inside a transaction. We're also performing updates to metadata such as file size stats, but these live in a separate table away from the file data itself. Hierarchy data is stored in the table that contains the file information. Really, in the end it's not so much what we're doing around the deletes that matters; we just can't find any references to poor delete performance on rows that contain a large amount of BLOB data. We are trying to determine whether this is even an avenue worth exploring, or if it has to be one of our processes around the delete that's causing the issue. Are there any situations in which this could occur? Is it common for a database server to reach the point of complete timeouts when many of these deletes are occurring simultaneously? Is there a way to combat this issue if it exists? (Cross-posted from Stack Overflow.)
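    One mitigation worth testing while the root cause is narrowed down (the table and column names below are invented for illustration, not taken from the question): delete the large-blob rows in small batches, so each transaction stays short and lock waits and log growth never pile up behind one huge delete.

        -- Sketch: batched delete of rows flagged for removal, assumed schema.
        DECLARE @BatchSize int = 500;

        WHILE 1 = 1
        BEGIN
            DELETE TOP (@BatchSize)
            FROM dbo.StoredFiles
            WHERE IsMarkedForDeletion = 1;

            -- Stop once a batch comes back short.
            IF @@ROWCOUNT < @BatchSize
                BREAK;
        END;

    Keeping each batch in its own short transaction also gives the ghost cleanup task a chance to keep pace instead of being handed an enormous set of ghosted rows at once.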

    Read the article

  • SQL query recursion for a web-like structure

    - by MickeyD
    I have a table here, named "Foo". The data is set up something like this:

        ID  TableReference  DataId0  DataId1  DataId2
        --  --------------  -------  -------  -------
        1   Prize           3        4        5
        2   Prize           4        5        NULL
        3   Cash            1        NULL     NULL
        4   Prize           8        NULL     12
        5   Foo             2        3        NULL
        6   Cash            8        1        10
        7   Foo             5        1        2

    And so on. The data is horribly set up, I know, but I didn't set it up that way. :) I'm only dealing with the after-effects. I'm trying to come up with a way to essentially "flatten" the table; that is, to display all the data to the point where the table "Foo" no longer references itself, and I'm trying to figure out a SQL query that will get me there. Usually when I deal with recursion I have (or can establish) parent IDs and set it up that way, but in this table there are seemingly multiple child and parent IDs, creating a web-like structure instead of a hierarchy, so I'm at a loss where to even begin writing a SQL query for something like this. Note: there is no infinite looping (where one Foo points to another Foo, which points back to the original Foo) from what I've found. Using T-SQL. Thanks for any assistance, if at all possible.
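    A minimal sketch of one way to do the flattening with a recursive CTE, assuming the layout above (a table dbo.Foo with the three DataId columns). The first CTE unpivots each row into individual links; the recursive part follows any link whose TableReference is 'Foo' back into the table and attributes whatever it finds to the original row.

        -- Sketch: resolve Foo-to-Foo references until only external references remain.
        WITH Links AS
        (
            -- One row per (ID, reference) pair instead of three DataId columns.
            SELECT f.ID, f.TableReference, d.DataId
            FROM dbo.Foo AS f
            CROSS APPLY (VALUES (f.DataId0), (f.DataId1), (f.DataId2)) AS d(DataId)
            WHERE d.DataId IS NOT NULL
        ),
        Flattened AS
        (
            -- Anchor: every link, attributed to its own row.
            SELECT l.ID AS RootID, l.TableReference, l.DataId
            FROM Links AS l

            UNION ALL

            -- Recurse: when a link points back into Foo, follow it and keep the original RootID.
            SELECT fl.RootID, l.TableReference, l.DataId
            FROM Flattened AS fl
            JOIN Links AS l ON l.ID = fl.DataId
            WHERE fl.TableReference = 'Foo'
        )
        SELECT RootID, TableReference, DataId
        FROM Flattened
        WHERE TableReference <> 'Foo'   -- drop the intermediate self-references
        ORDER BY RootID;

    A DISTINCT in the final SELECT would collapse the duplicates that appear when two paths reach the same row.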

    Read the article

  • SQL - Updating records based on most recent date

    - by Remnant
    I am having difficulty updating records within a database based on the most recent date and am looking for some guidance. By the way, I am new to SQL. As background, I have a Windows Forms application with SQL Express, and I am using ADO.NET to interact with the database. The application is designed to let the user track employee attendance on various courses that must be attended on a periodic basis (e.g. every 6 months, every year, etc.). For example, they can pull back data to see the last time employees attended a given course, and also update attendance dates if an employee has recently completed a course. I have three data tables:

        EmployeeDetailsTable - a simple list of employees' names, email addresses etc., each with a unique ID
        CourseDetailsTable - a simple list of courses, each with a unique ID (e.g. 1, 2, 3 etc.)
        AttendanceRecordsTable - has four columns: { EmployeeID, CourseID, AttendanceDate, Comments }

    For any given course, an employee will have an attendance history, i.e. if the course needs to be attended each year then they will have one record for as many years as they have been at the company. What I want to be able to do is update the 'Comments' field for a given employee and given course based on the most recent attendance date. What is the 'correct' SQL syntax for this? I have tried many things (like below) but cannot get it to work:

        UPDATE AttendanceRecordsTable
        SET Comments = @Comments
        WHERE AttendanceRecordsTable.EmployeeID =
              (SELECT EmployeeDetailsTable.EmployeeID
               FROM EmployeeDetailsTable
               WHERE (EmployeeDetailsTable.LastName = @ParameterLastName
                      AND EmployeeDetailsTable.FirstName = @ParameterFirstName)
               AND AttendanceRecordsTable.CourseID =
                   (SELECT CourseDetailsTable.CourseID
                    FROM CourseDetailsTable
                    WHERE CourseDetailsTable.CourseName = @CourseName))
        GROUP BY MAX(AttendanceRecordsTable.LastDate)

    After much googling, I discovered that MAX is an aggregate function and so I need to use GROUP BY. I have also tried using the HAVING keyword, but without success. Can anybody point me in the right direction? What is the 'conventional' syntax to update a database record based on the most recent date?
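    A minimal sketch of one way to scope the update to the most recent attendance row. It assumes the date column is AttendanceDate, as in the schema described above (the attempt quoted earlier refers to LastDate); MAX moves into a correlated subquery, so no GROUP BY is needed.

        -- Sketch: update the comment only on the latest attendance for this employee and course.
        UPDATE ar
        SET ar.Comments = @Comments
        FROM AttendanceRecordsTable AS ar
        JOIN EmployeeDetailsTable AS e ON e.EmployeeID = ar.EmployeeID
        JOIN CourseDetailsTable   AS c ON c.CourseID   = ar.CourseID
        WHERE e.LastName  = @ParameterLastName
          AND e.FirstName = @ParameterFirstName
          AND c.CourseName = @CourseName
          AND ar.AttendanceDate = (SELECT MAX(ar2.AttendanceDate)
                                   FROM AttendanceRecordsTable AS ar2
                                   WHERE ar2.EmployeeID = ar.EmployeeID
                                     AND ar2.CourseID   = ar.CourseID);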

    Read the article

  • Export large amount of data from Oracle 10G to SQL Server 2005

    - by uniball
    Dear all, I need to export 100 million rows (average row length ~100 bytes) from an Oracle 10G database table into SQL Server (over a WAN/VLAN with 6 Mbit/s capacity) on a regular basis. So far, these are the options I have tried, with a quick summary of each. Has anyone tried this before? Are there other, better options? Which option would be best in terms of performance and reliability? The times below were calculated from tests on smaller amounts of data and extrapolated to estimate the time required.

    1. Using the data import wizard on the SQL Server, or SSIS packages, to import the data. It will take around 150 hours to complete the task.

    2. Using an Oracle batch job to spool the data into a comma-delimited flat file, then using an SSIS package to FTP this file to the SQL Server and load it directly from the flat file. The issue here is the size of the flat file, which is expected to run into gigabytes.

    3. Although this option is drastically different, I am even considering using a linked server to query the Oracle data directly at run time, to avoid bringing the data across at all. Performance is a big problem, and I have limited control over the Oracle database in terms of creating table indexes.

    Regards, Uniball
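    For option 2, the load at the SQL Server end can be a plain BULK INSERT once the file has arrived; the staging table name and file path below are invented for illustration.

        -- Sketch: load the spooled flat file into a staging table on the SQL Server side.
        BULK INSERT dbo.OracleStagingTable
        FROM 'D:\Transfer\oracle_extract.csv'
        WITH
        (
            FIELDTERMINATOR = ',',
            ROWTERMINATOR   = '\n',
            TABLOCK,             -- allows a minimally logged load into a heap under simple/bulk-logged recovery
            BATCHSIZE = 500000   -- commit in chunks so a failure doesn't roll back the whole load
        );

    Compressing the file before the FTP step is usually worthwhile at 6 Mbit/s, since this kind of delimited extract tends to compress well.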

    Read the article

  • SQL dealing with rubbish in a phone number field

    - by DoctaJonez
    Hello stackers! I've got a wonderfully fun little SQL problem to solve today and thought I'd ask the community to see what solutions you come up with. We've got a really cool email-to-text service that we use: you just send an email to [email protected] and it will send a text message to the desired phone number. For example, to send a text to 0790 0006006, you send an email to [email protected]. Pretty neat, huh? The problem is with the phone numbers in our database. Most of the phone numbers are fine, but some of them have "rubbish" mixed in with the phone number. Take these wonderful examples of the rubbish you need to deal with (I've anonymised the phone numbers by placing zeroes in):

        07800 000647(mobile)
        07500 000189 USE 1ST SEE NOTES
        07900 000415 HO ONLY
        try 1st 0770 0000694 then home
        07500 000465 Cannot

    Requirements: the solution needs to be in SQL (for MS SQL Server). So the challenge is as follows: we need to get the phone number without spaces and without any of the rubbish seen in the samples. For example, this:

        try 1st 0770 0000694 then home

    should become this:

        07700000694

    Anything without a phone number in it (e.g. "SEE NOTES") should be null.
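    A minimal sketch of one approach, assuming a helper function is acceptable: strip everything that isn't a digit, then discard anything too short to be a phone number (the function name and the 10-digit threshold are assumptions).

        -- Sketch: keep digits only; return NULL when what's left can't be a phone number.
        CREATE FUNCTION dbo.CleanPhoneNumber (@Raw varchar(100))
        RETURNS varchar(100)
        AS
        BEGIN
            DECLARE @Pos int;
            SET @Pos = PATINDEX('%[^0-9]%', @Raw);

            -- Remove non-digit characters one at a time.
            WHILE @Pos > 0
            BEGIN
                SET @Raw = STUFF(@Raw, @Pos, 1, '');
                SET @Pos = PATINDEX('%[^0-9]%', @Raw);
            END;

            -- Rows like 'SEE NOTES' end up with too few digits to be a number.
            RETURN CASE WHEN LEN(@Raw) >= 10 THEN @Raw ELSE NULL END;
        END;

    One caveat: in the "try 1st 0770 0000694 then home" example a pure digit-strip also keeps the 1 from "1st", so rubbish words that themselves contain digits need extra handling (for instance, locating the leading 0 of the 11-digit number with PATINDEX before stripping).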

    Read the article

  • Move SELECT to SQL Server side

    - by noober
    Hello all, I have an SQLCLR trigger that contains a large and messy SELECT, with parts like:

        (CASE WHEN EXISTS(SELECT * FROM INSERTED I WHERE I.ID = R.ID)
              THEN '1' ELSE '0' END) AS IsUpdated -- Is the selected row newly added?

    as well as JOINs etc. I like to have the result as a single table with everything included. Question 1: Can I move this SELECT to the SQL Server side? If yes, how? By "move" I mean creating a stored procedure, or something else, that can be executed before reading the dataset in the while loop. The two following questions only make sense if the answer is "yes". Why do I want to move the SELECT? First off, I don't like mixing SQL with C# code. Second, I suppose that server-side queries run faster, since the server has more chance to cache them. Question 2: Am I right? Is this some sort of optimization? Also, the SELECT contains constant strings, but they are localizable. For instance, in WHERE R.Status = "Enabled", "Enabled" should be changed for French, German etc. So I want to write two static methods -- OnCreate and OnDestroy -- and mark them as stored procedures. When registering/unregistering my assembly on the server side, I would just call them respectively. In OnCreate I would format the SELECT string, replacing {0}, {1}... with the required values from the assembly resources. Then I can localize the resources only, not every script. Question 3: Is this a good idea? Is there an existing attribute to mark methods to be executed by SQL Server automatically after (un)registering an assembly? Regards,
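    A minimal sketch of the kind of object OnCreate could emit, with every name invented for illustration: the localized literal from the assembly resources is baked in when the procedure is created, so the C# code only ever calls the procedure. (The INSERTED pseudo-table is only visible inside the trigger itself, so that part of the query would either stay in the trigger or have its keys passed in.)

        -- Sketch: text built by OnCreate, with N'Enabled' substituted into the {0} placeholder.
        CREATE PROCEDURE dbo.GetEnabledRecords
        AS
        BEGIN
            SET NOCOUNT ON;

            SELECT R.ID,
                   R.Status,
                   R.ModifiedDate
            FROM dbo.Records AS R
            WHERE R.Status = N'Enabled';   -- this literal came from {0}
        END;

    The usual route is simply to call OnCreate and OnDestroy from the same script that runs CREATE ASSEMBLY / DROP ASSEMBLY.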

    Read the article

  • LINQ to SQL - database relationships won't update after submit

    - by Quantic Programming
    I have a database with the tables Users and Uploads. The important columns are: Users -> UserID; Uploads -> UploadID, UserID. The primary key in the relationship is Users -> UserID and the foreign key is Uploads -> UserID. In LINQ to SQL, I do the following operations to retrieve files and add an upload: var upload = new Upload(); upload.UserID = user.UserID; upload.UploadID = XXX; db.Uploads.InsertOnSubmit(upload); db.SubmitChanges(); If I do that and rerun the application (and the db object is rebuilt, of course), then something like this: foreach(var upload in user.Uploads) gets me all the uploads with that user's ID (like the one added in the previous example). The problem is that my application, after adding an upload and submitting changes, doesn't update the user.Uploads collection; i.e. I don't get the newly added uploads. The user object is stored in the Session object. At first, I thought that the LINQ to SQL framework doesn't update the reference of the object, and therefore I should simply "reset" the user object from a new SQL request, i.e.: Session["user"] = db.Users.Where(u => u.UserID == user.UserID).SingleOrDefault(); (where user is the previous user), but it didn't help. Please note: after rerunning the application, user.Uploads does have the new upload! Has anyone experienced this type of problem, or is it normal behaviour? I am a newbie to this framework and would gladly take any advice. Thank you!

    Read the article

  • How to Store and Retrieve Images Using SQL Server (Server Management Studio)

    - by Joe Majewski
    I am having difficulty inserting files into a SQL Server database. I'll try to break this down as best as I can:

    What data type should I be using to store image files (jpeg/png/gif/etc.)? Right now my table uses the image data type, but I am curious whether varbinary would be a better option.

    How would I go about inserting the image into the database? Does Microsoft SQL Server Management Studio have any built-in functions that allow insertion of files into tables? If so, how is that done? Also, how could this be done through an HTML form, with PHP handling the input data and placing it into the table?

    How would I fetch the image from the table and display it on the page? I understand how to SELECT the cell's contents, but how would I go about translating that into a picture? Would I have to send a header (Content-Type: image/jpeg)?

    I have no problem doing any of these things with MySQL, but the SQL Server environment is still new to me, and I am working on a project for my job that requires the use of stored procedures to grab various data. Any and all help is appreciated.
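    A minimal sketch under a few assumptions (the table layout, file name and path are invented): varbinary(max) is generally the better choice today, since the image data type is deprecated, and OPENROWSET(BULK ..., SINGLE_BLOB) is the closest thing Management Studio offers to a built-in "insert this file" facility.

        -- Sketch: a table for image files and a file insert done entirely in T-SQL.
        CREATE TABLE dbo.Pictures
        (
            PictureID int IDENTITY(1,1) PRIMARY KEY,
            FileName  nvarchar(260)  NOT NULL,
            MimeType  varchar(100)   NOT NULL,
            FileData  varbinary(max) NOT NULL
        );

        -- The path must be visible to the SQL Server service account, not just to your login.
        INSERT INTO dbo.Pictures (FileName, MimeType, FileData)
        SELECT N'logo.jpg',
               'image/jpeg',
               BulkColumn
        FROM OPENROWSET(BULK N'C:\Temp\logo.jpg', SINGLE_BLOB) AS img;

    Retrieval is then just selecting FileData for the wanted PictureID and having the page send the stored MimeType as its Content-Type header before writing the bytes.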

    Read the article

  • SELECT * FROM Sql tweeters WHERE location = ‘UK’

    - by blakmk
    Alright, this is actually a follow-up to Gethyn Ellis's post SELECT * FROM SQLBLOGGERS WHERE LOCATION = ‘UK’, where he compiled a list of UK bloggers. So I thought I'd put together a list of SQL folk who tweet, but rather than make the list static I will just point you towards the list, which I will keep up to date: http://twitter.com/#!/blakmk/sqlserver-uk It actually summarises people's titles pretty well when viewed through DABR: http://dabr.co.uk/lists/blakmk/sqlserver-uk I will keep this list updated, so you are welcome to follow it if you find it useful. If anyone feels left out, contact me and I will happily add you to the list.

    Read the article

  • New SQL Down Under podcast episode: Bill Ramos

    - by DavidWimbush
    I thought Greg Low had stopped doing his excellent podcast a while back, but every now and then I go and check (just in case). This time I found a new episode: http://www.sqldownunder.com/PreviousShows/tabid/98/Default.aspx. Great! As far as I can see, Greg just slipped this one out without any mention on his blog. I hope there are plenty more to come, as there's no shortage of developments to discuss. It's funny to think that when I got into SQL Server, in 2000, one of the things I liked was that it only changed in occasional small increments. Really! This was a relief compared to keeping up with Visual Basic and Visual Studio (and .NET and C# and...). What happened? Did I miss a meeting? Still, I'm not complaining - there's no danger of getting bored!

    Read the article

  • SQL In The City Charlotte - Fundamentals of Database Design

    - by drsql
    Next Monday, October 14, at Red Gate's SQL in the City conference in Charlotte, NC (one day before PASS), I will be presenting my Fundamentals of Database Design session. It is my big-time chestnut session, the one that I do the most and have the most fun with. This will be the "single" version of the session, weighing in at just under an hour, and that is a lot of material to go over (even with no code samples to go awry and take up time). In this hour-long session (presented in widescreen...(read more)

    Read the article

  • SQL in Boston -- Red Gate Style

    - by Adam Machanic
    You might have heard of Red Gate's famous SQL in the City events: free, full-day educational events where you can learn from Red Gate's own evangelists in addition to various MVPs and other guests. With just a tiny bit of marketing thrown in for good measure (don't worry, it's not a daylong sales pitch). Red Gate is doing a US tour this fall, and I'm happy to note that my fair city of Boston is one of the stops ... and I am one of the speakers. The event takes place on October 8. I'll be delivering...(read more)

    Read the article

  • SQL Server 2012 edition comparison details are published

    - by DavidWimbush
    Interesting stuff, particularly if you're doing BI. BISM tabular and Power View will not be in Standard Edition, only in the new - presumably more expensive - Business Intelligence Edition. That kind of makes sense as you need a fairly pricey edition of SharePoint to really get all the benefits, but it's a shame there won't be some kind of limited version in Standard Edition. And Always On will be in Standard Edition but limited to 2 nodes. I really expected Always On to be Enterprise-only so this is a great decision. It allows those of us working at a more modest scale to benefit and raises the fault tolerance of SQL Server as a product to a new level. Read all about it here: http://www.microsoft.com/sqlserver/en/us/future-editions/sql2012-editions.aspx

    Read the article

  • T-SQL Jokes

    - by Tomaz.tsql
    An SQL table walks into a psychiatrist, Dr. Index. Table: "Doctor, I have a problem." Dr: "What kind of problem?" Table: "I'm a mess. I have things all over the place; I always have to look for my stuff." Dr: "No problem. I will get you in order." Index and Table are reading the book "Index-sutra". Table: "Oh baby, tonight we can try a clustered position." Index: "Yeah baby, we can also try a covered position." Table: "Or maybe a multiple clustered position"...(read more)

    Read the article

  • Annual SQL Server conference in Poland - SQLDay 2014

    - by Damian
    We had a great three-day conference this year in Poland. SQLDay (7th edition) is an annual community conference. We started in 2008 as part of the C2C (community to communities) conference, and since 2009 SQLDay has been an independent event dedicated to SQL Server specialists. This year we had almost 300 people and speakers like Bob Ward, Klaus Aschenbrenner and Alberto Ferrari. Of course there were also many local Polish leaders (MVPs and an MCM :) ). If you are curious how we played in Wroclaw this year, just visit the link http://goo.gl/cgNzDl (or try this one: https://plus.google.com/photos/100738200012412193487/albums/6010410545898180113?authkey=CITqmqmkrKK8Tw) Visit the conference site: http://conference.plssug.org.pl/

    Read the article

  • T-SQL User-Defined Functions: the good, the bad, and the ugly (part 1)

    - by Hugo Kornelis
    So you thought that encapsulating code in user-defined functions for easy reuse is a good idea? Think again! SQL Server supports three types of user-defined functions. Only one of them qualifies as good. The other two – well, the title says it all, doesn’t it? The bad: scalar functions A scalar user-defined function (UDF) is very much like a stored procedure, except that it always returns a single value of a predefined data type – and because of that property, it isn’t invoked with an EXECUTE statement,...(read more)
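    As a concrete illustration of the distinction the excerpt draws (the schema and names here are invented, not taken from the article): the same lookup written as a scalar UDF, which gets called row by row, and as an inline table-valued function, which the optimizer can expand into the calling query like a parameterized view - presumably the kind that qualifies as good.

        -- Sketch: scalar UDF vs. inline table-valued function for the same lookup.
        CREATE FUNCTION dbo.GetCustomerName (@CustomerID int)
        RETURNS nvarchar(100)
        AS
        BEGIN
            -- Evaluated once per row of the calling query.
            RETURN (SELECT Name FROM dbo.Customers WHERE CustomerID = @CustomerID);
        END;
        GO

        CREATE FUNCTION dbo.CustomerName (@CustomerID int)
        RETURNS TABLE
        AS
        -- Single-statement body the optimizer can inline into the calling plan.
        RETURN (SELECT Name FROM dbo.Customers WHERE CustomerID = @CustomerID);
        GO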

    Read the article

  • T-SQL Tuesday : Reflections on the PASS Summit and our community

    - by AaronBertrand
    Last week I attended the PASS Summit in Seattle. I blogged from both keynotes (Keynote #1 and Keynote #2), as well as the WIT Luncheon - which SQL Sentry sponsored. I had a fantastic time at the conference, even though these days I attend far fewer sessions than I used to. As a company, we were overwhelmed by the positive energy in the Expo Hall. I really liked the notebook idea, where board members were assigned notebooks to carry around and take ideas from attendees. I took full advantage when...(read more)

    Read the article

  • In-Memory OLTP Sample for SQL Server 2014 RTM

    - by Damian
    I have just found a very good resource about Hekaton (the In-Memory OLTP feature in SQL Server 2014). On the Codeplex site you can find the newest Hekaton samples - https://msftdbprodsamples.codeplex.com/releases/view/114491. The previous samples were built for the CTP2 version, but the newest ones work with the RTM version. Some issues you might have hit when running the earlier samples against RTM have been fixed: Update (Apr 28, 2014): Fixed an issue where the isolation level for sample stored procedures demonstrating integrity checks was too low. The transaction isolation level for the following stored procedures was updated: Sales.uspInsertSpecialOfferProductinmem, Sales.uspDeleteSpecialOfferinmem, Production.uspInsertProductinmem, and Production.uspDeleteProductinmem.
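    For context, a minimal sketch (names invented, loosely modelled on the sample procedures) of where that isolation level lives in a natively compiled procedure; the level shown is only a placeholder, and the release notes list the exact level each sample procedure was raised to.

        -- Sketch: the ATOMIC block is where a natively compiled procedure declares its isolation level.
        CREATE PROCEDURE dbo.uspDeleteSpecialOfferDemo
            @SpecialOfferID int
        WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
        AS
        BEGIN ATOMIC WITH
        (
            TRANSACTION ISOLATION LEVEL = SERIALIZABLE,  -- the fix raised this from a lower level
            LANGUAGE = N'us_english'
        )
            DELETE FROM dbo.SpecialOffer_inmem
            WHERE SpecialOfferID = @SpecialOfferID;
        END;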

    Read the article
