Search Results

Search found 36773 results on 1471 pages for 'sql statement syntax'.


  • SQL Server 2012 Memory Manager KB articles

    - by SQLOS Team
    Since the release of SQL Server 2012 with its redesigned memory manager, a steady stream of KB articles has been produced by CSS to provide guidance on the new or changed options, as well as on fixes that have been published.

    How has memory sizing changed in SQL 2012?
    2663912 Memory configuration and sizing considerations in SQL Server 2012 - http://support.microsoft.com/default.aspx?scid=kb;EN-US;2663912

    Setting "locked pages" to avoid SQL Server memory pages getting swapped has been simplified, particularly for Standard Edition; the details can be found here:
    2659143 How to enable the "locked pages" feature in SQL Server 2012 - http://support.microsoft.com/default.aspx?scid=kb;EN-US;2659143

    Note the following deprecation (particularly relevant for 32-bit installations):
    2644592 The "AWE enabled" SQL Server feature is deprecated - http://support.microsoft.com/default.aspx?scid=kb;EN-US;2644592

    Note the following fixes available:
    2708594 FIX: Locked page allocations are enabled without any warning after you upgrade to SQL Server 2012 - http://support.microsoft.com/kb/2708594/EN-US
    2688697 FIX: Out-of-memory error when you run an instance of SQL Server 2012 on a computer that uses NUMA - http://support.microsoft.com/kb/2688697/EN-US

    Originally posted at http://blogs.msdn.com/b/sqlosteam/

    Read the article

  • How do I make complex SQL queries easier to write?

    - by DragonLord
    I'm finding it very difficult to write complex SQL queries involving joins across many (at least 3-4) tables and several nested conditions. The queries I'm being asked to write are easily described in a few sentences, but can require a deceptive amount of code to complete. I find myself often using temporary views to write these queries, which seems like a bit of a crutch. What tips can you provide that I can use to make these complex queries easier? More specifically, how do I break these queries down into the steps I need to use to actually write the SQL code?

    Note that the SQL I'm being asked to write is part of homework assignments for a database course, so I don't want software that will do the work for me. I want to actually understand the code I'm writing.

    More technical details: the database is hosted on a PostgreSQL server running on the local machine. The database is very small: there are no more than seven tables, and the largest table has less than about 50 rows. The SQL queries are being passed unchanged to the server via LibreOffice Base.

    Read the article

  • Is comparing an OO compiler to a SQL compiler/optimizer valid?

    - by Brad
    I'm now doing a lot of SQL development at my new job, whereas before I was doing Object Oriented desktop app stuff. I keep running across very large scripts (thousands of lines) and wanting to refactor in some way. I am seeing that SQL is a different sort of beast and it's probably fine to have these big scripts for the most part, but while explaining this to me, people are also insisting that the whole idea of refactoring is bad: that things like the .NET compiler are actually burdened by refactored code, and that a big wall of code is more efficient and better design than code designed for reuse, readability and scalability. The other argument is that OO compilers are almost dangerously inefficient, don't have efficient memory management, or run too many CPU instructions compared to older "simpler" compilers and compared to SQL.

    Are these valid complaints? Even if some compiler like a C compiler is modestly more "efficient" (whatever that means at this high a level without seeing code), would you want to write applications in C over C# or Java? Is comparing an OO compiler to a SQL compiler/optimizer even valid?

    Read the article

  • Where should SQL/DB Queries be encapsulated in a software system?

    - by Stephen Bennet
    I frequently write small applications (either web based or otherwise) that require heavy database usage. I've attempted various ways of handling where to put the actual SQL queries (sort of ad-hoc ORM systems). These include:

    - Models that build themselves up, with SQL only allowed inside a model.
    - A sort of factory-style method where the models are built by a factory class that is allowed to know about SQL.
    - A third entity that maps models into the database based on their fields/keys and generates the SQL code on the fly.

    Is there common knowledge of which method is best? Or another way I have missed? Clearly a lot of it will depend on the context of the system itself, which for me is usually producing lightweight tools or utility frameworks. In experimenting, I've never found any of them that feel intuitively "right" and not clunky, but I also do not want to go for a full framework such as Django or Ruby - both because the tools I create are in a variety of languages and because they usually do not warrant that level of surrounding footprint.

    Read the article

  • NEW CERTIFICATION: Oracle Certified Expert, Oracle Database 11g Release 2 SQL Tuning

    - by Brandye Barrington
    Oracle Certification announces the release of the new Oracle Certified Expert, Oracle Database 11g Release 2 SQL Tuning certification. This certification is designed for Developers, Database Administrators and SQL developers who are proficient at tuning efficient SQL statements. It covers core topics such as: identifying and tuning inefficient SQL statements, using automatic SQL tuning, managing optimizer statistics on database objects, implementing partitioning and analyzing queries. Beta testing for the Oracle Database 11g Release 2: SQL Tuning exam (1Z1-117) is now underway, so the exam is available at the greatly discounted rate of $50 USD. Visit pearsonvue.com/oracle and register for exam 1Z1-117. You can get all preparation details on the Oracle Certification website, including exam objectives, number of questions, time allotments, and pricing.

    QUICK LINKS:
    Certification Track: Oracle Certified Expert, Oracle Database 11g Release 2 SQL Tuning
    Certification Exam: Oracle Database 11g Release 2: SQL Tuning (1Z0-117)
    Certification Website: About Beta Exams
    Register Now: Pearson VUE

    Read the article

  • SQL query. An unusual join. DB implemented in sqlite-3

    - by user02814
    This is essentially a question about constructing an SQL query. The db is implemented with sqlite3. I am a relatively new user of SQL. I have two tables and want to join them in an unusual way. The following is an example to explain the problem.

        Table 1 (t1):
        id    year   name
        -------------------------
        297   2010   Charles
        298   2011   David
        300   2010   Peter
        301   2011   Richard

        Table 2 (t2):
        id    year   food
        ---------------------------
        296   2009   Bananas
        296   2011   Bananas
        297   2009   Melon
        297   2010   Coffee
        297   2012   Cheese
        298   2007   Sugar
        298   2008   Cereal
        298   2012   Chocolate
        299   2000   Peas
        300   2007   Barley
        300   2011   Beans
        300   2012   Chickpeas
        301   2010   Watermelon

    I want to join the tables on id and year. The catch is that id must match exactly, but if there is no exact match in Table 2 for the year in Table 1, then I want to choose the next (lower) available year. A selection of the kind that I want would give the following result:

        id    year   matchyr   name      food
        -------------------------------------------------
        297   2010   2010      Charles   Coffee
        298   2011   2008      David     Cereal
        300   2010   2007      Peter     Barley
        301   2011   2010      Richard   Watermelon

    To summarise, id=297 has an exact match for year=2010 given in Table 1, so the corresponding row for id=297, year=2010 is chosen from Table 2. id=298, year=2011 did not have a matching year in Table 2, so the next available year (less than 2011) is chosen. As you can see, I would also like to know what that matched year (whether exact or not) actually was. I would very much appreciate (1) an indication (yes/no answer) of whether this is possible in SQL alone, or whether I need to look outside SQL, and (2) a solution, if that is not too onerous.
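
    For illustration, a minimal SQLite sketch under the naming in the example (t1, t2): a correlated subquery picks, for each id, the greatest t2.year that does not exceed t1.year, which yields the exact match when one exists and the next lower year otherwise.

        SELECT t1.id,
               t1.year,
               t2.year AS matchyr,
               t1.name,
               t2.food
        FROM   t1
        JOIN   t2 ON t2.id = t1.id
               AND t2.year = (SELECT MAX(x.year)
                              FROM   t2 AS x
                              WHERE  x.id = t1.id
                                AND  x.year <= t1.year);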

    Read the article

  • Enzo Backup for SQL Azure Beta Released!

    - by ScottKlein
    Blue Syntax is happy to announce the release of their SQL Azure database backup product! Enzo Backup for SQL Azure offers unparalleled backup and restore functionality and flexibility for a SQL Azure database. You can download the beta release here: http://www.bluesyntax.net/backup.aspx

    With Enzo Backup for SQL Azure, you can:

    - Create a backup blob, or a backup file, from a SQL Azure database
    - Restore a SQL Azure database from a backup blob, or a backup file
    - Perform limited backup and restore of SQL Server databases (see details)
    - Run backups entirely in the cloud using a remote agent
    - Back up a single schema of a database
    - Restore specific tables only
    - Copy backup devices from on-premise to the cloud
    - Use a command-line utility to perform backup operations
    - Perform transactionally consistent backups for SQL Azure

    Please download it and provide us with your feedback!

    Read the article

  • The World of SQL Database Deployment

    - by GGBlogger
    In my early development days, I used Microsoft Access for building databases. It made things easy, since I only needed to package the database with the installation package so my clients would have access to it. When we began the development of a new package in Visual Studio .NET, I decided to use SQL Server Express. It was free and provided good tools - also free. I thought it was a tremendous idea until it came time to distribute our new software! What a surprise.

    The nightmare

    Ah, the choices! Detach the database and have the client reattach it to a newly installed - oh wait. FIRST my new client needs to download and install SQL Server Express with SQL Server Management Studio. That's not a great thing, but it is one more nightmare step for users who may have other versions of SQL installed. Then the question became - do we detach and reattach, or do we do a backup? It was too late (bad planning) to revert to Microsoft Access, but we badly needed a simple way to package and distribute both the database AND sample contents.

    Red Gate to the rescue

    It took me a while to find an answer, but I did find it in a package called SQL Packager, sold by a relatively unpublicized company in England called Red Gate. They call their products "ingeniously simple" and I must agree with that description. With SQL Packager you point to the database (more in a minute) you want to distribute. A few mouse clicks and dialogs and you have an executable file that you can ship virtually anywhere and virtually any way which, when run, installs the database on your destination SQL Server instance! It really is that simple.

    Easier to show than tell

    Let's explore a hypothetical case. Let's say you have a local SQL database of customers and you have decided you want to share it with your subsidiaries or partners. Here is the underlying screen you will see on starting SQL Packager. There are a bunch of possibilities here, but I'm going to keep this relatively simple. At this point I simply want to illustrate the simplicity of generating an executable to deliver your database. You will notice that you can set up a new package, edit an existing package, or change a bunch of options.

    Start SQL Packager

    And the following is the default dialog you get on startup. In the next dialog, I've selected the Server and Database. I've also selected Windows Authentication. Pressing Next causes SQL Packager to run a number of checks and produce a report. Now you're given a comprehensive list of what is going to be packaged, and you're allowed to change it if you desire. I've never made any changes here so I can't really make any suggestions. That just illustrates the comprehensive nature of so many Red Gate products, including this one. Clicking Next gives you still further options. SQL Packager then works its magic and shows you a dialog with the results. Packager then gives you a dialog of the scripts it has generated; the capture above only shows 1 of 4 tabs. Finally, pressing Next gives you the option to generate a .NET executable or a C# project. I've only generated an executable, so I'm not in a position to tell you what the C# project looks like. That may be the subject of further discussions. You can rename the package and tell SQL Packager where to save it.

    I've skipped a lot, but this will serve to illustrate the comprehensive (and ingenious) things Red Gate does. All in all, it's a superb way to distribute populated SQL databases. Oh - we'll save running the resulting executable for later also, but believe me, it's insanely simple.

    Read the article

  • The new SSIS in SQL2005/SQL2008 are oversized

    - by Ice
    I studied the new MERGE statement and there is a nice example for importing a flat file:

        INSERT <Table> SELECT * FROM OPENROWSET BULK <Import-Flat-File>, <Format-File> ...

    seems to be a good replacement for such a simple job and avoids building an SSIS package. And

        EXEC XP_CMDSHELL bcp <Table or View> out <Flat-File> ...

    is almost simpler than building an SSIS package, isn't it? (I know that the MERGE statement doesn't run on SQL 2005.)
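
    For illustration, a minimal T-SQL sketch (SQL Server 2008 or later, since it uses MERGE) of a flat-file import via OPENROWSET(BULK ...); the file paths, format file and column names are hypothetical:

        MERGE dbo.TargetTable AS tgt
        USING OPENROWSET(BULK 'C:\import\data.txt',
                         FORMATFILE = 'C:\import\data.fmt') AS src
           ON tgt.Id = src.Id
        WHEN MATCHED THEN
            UPDATE SET tgt.Value = src.Value
        WHEN NOT MATCHED THEN
            INSERT (Id, Value) VALUES (src.Id, src.Value);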

    Read the article

  • Re-using aggregate level formulas in SQL - any good tactics?

    - by Cade Roux
    Imagine this case, but with a lot more component buckets and a lot more intermediates and outputs. Many of the intermediates are calculated at the detail level, but a few things are calculated at the aggregate level:

        DECLARE @Profitability AS TABLE
            (
              Cust INT NOT NULL
             ,Category VARCHAR(10) NOT NULL
             ,Income DECIMAL(10, 2) NOT NULL
             ,Expense DECIMAL(10, 2) NOT NULL
            ) ;

        INSERT INTO @Profitability VALUES ( 1, 'Software', 100, 50 ) ;
        INSERT INTO @Profitability VALUES ( 2, 'Software', 100, 20 ) ;
        INSERT INTO @Profitability VALUES ( 3, 'Software', 100, 60 ) ;
        INSERT INTO @Profitability VALUES ( 4, 'Software', 500, 400 ) ;
        INSERT INTO @Profitability VALUES ( 5, 'Hardware', 1000, 550 ) ;
        INSERT INTO @Profitability VALUES ( 6, 'Hardware', 1000, 250 ) ;
        INSERT INTO @Profitability VALUES ( 7, 'Hardware', 1000, 700 ) ;
        INSERT INTO @Profitability VALUES ( 8, 'Hardware', 5000, 4500 ) ;

        SELECT  Cust
               ,Profit = SUM(Income - Expense)
               ,Margin = SUM(Income - Expense) / SUM(Income)
        FROM    @Profitability
        GROUP BY Cust

        SELECT  Category
               ,Profit = SUM(Income - Expense)
               ,Margin = SUM(Income - Expense) / SUM(Income)
        FROM    @Profitability
        GROUP BY Category

        SELECT  Profit = SUM(Income - Expense)
               ,Margin = SUM(Income - Expense) / SUM(Income)
        FROM    @Profitability

    Notice how the same formulae have to be used at the different aggregation levels. This results in code duplication. I have thought of using UDFs (either scalar or table valued with an OUTER APPLY, since many of the final results may share intermediates which have to be calculated at the aggregate level), but in my experience the scalar and multi-statement table-valued UDFs perform very poorly. Also thought about using more dynamic SQL and applying the formulas by name, basically. Any other tricks, techniques or tactics to keeping these kinds of formulae which need to be applied at different levels in sync and/or organized?
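
    One possibility, sketched below under assumed names (a permanent dbo.Profitability table rather than the table variable): centralise the formula in an inline table-valued function, which, unlike scalar and multi-statement UDFs, is expanded into the calling query, and apply it with CROSS APPLY after each GROUP BY.

        CREATE FUNCTION dbo.ProfitMargin (@Income DECIMAL(18,2), @Expense DECIMAL(18,2))
        RETURNS TABLE
        AS RETURN
            SELECT  Profit = @Income - @Expense
                   ,Margin = (@Income - @Expense) / NULLIF(@Income, 0) ;
        GO

        -- the same formula reused at the customer level; the Category and grand-total
        -- queries would differ only in their inner GROUP BY
        SELECT  p.Cust, m.Profit, m.Margin
        FROM    ( SELECT  Cust
                         ,Income  = SUM(Income)
                         ,Expense = SUM(Expense)
                  FROM    dbo.Profitability
                  GROUP BY Cust
                ) AS p
        CROSS APPLY dbo.ProfitMargin(p.Income, p.Expense) AS m ;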

    Read the article

  • How do you concat multiple rows into one column in SQL Server?

    - by Jason
    I've searched high and low for the answer to this, but I can't figure it out. I'm relatively new to SQL Server and don't quite have the syntax down yet. I have this data structure (simplified):

        Table "Users":            Table "Tags":
        UserID  UserName          TagID  UserID  PhotoID
        1       Bob               1      1       1
        2       Bill              2      2       1
        3       Jane              3      3       1
        4       Sam               4      2       2

        Table "Photos":           Table "Albums":
        PhotoID  UserID  AlbumID  AlbumID  UserID
        1        1       1        1        1
        2        1       1        2        3
        3        1       1        3        2
        4        3       2
        5        3       2

    I'm looking for a way to get all the photo info (easy) plus all the tags for that photo concatenated, like CONCAT(username, ', ') AS Tags, of course with the last comma removed. I'm having a bear of a time trying to do this. I've tried the method in this article but I get an error when I try to run the query saying that I can't use DECLARE statements... do you guys have any idea how this can be done? I'm using VS08 and whatever DB is installed with it (I normally use MySQL so I don't know what flavor of DB this really is... it's an .mdf file?)
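
    For illustration, a hedged sketch of the usual SQL Server approach (FOR XML PATH plus STUFF), assuming the simplified tables above; an .mdf attached through VS08 is typically a SQL Server Express database, so this syntax should apply:

        SELECT  p.PhotoID
               ,STUFF((SELECT ', ' + u.UserName
                       FROM   Tags  t
                       JOIN   Users u ON u.UserID = t.UserID
                       WHERE  t.PhotoID = p.PhotoID
                       FOR XML PATH('')), 1, 2, '') AS Tags
        FROM    Photos p ;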

    Read the article

  • Database not updating after UPDATE SQL statement in ASP.net

    - by Ronnie
    I currently have a problem attempting to update a record within my database. I have a webpage that displays a user's details in text boxes; these details are taken from the session upon login. The aim is to update the details when the user overwrites the current text in the text boxes. I have a function that runs when the user clicks the 'Save Details' button and it appears to work, as I have tested for the number of rows affected and it outputs 1. However, when checking the database, the record has not been updated and I am unsure as to why. I have checked the SQL statement that is being processed by displaying it as a label, and it looks like so:

        UPDATE [users]
        SET [email]=@email, [firstname]=@firstname, [lastname]=@lastname, [promo]=@promo
        WHERE ([users].[user_id] = 16)

    The function and other relevant code is:

        Sub Button1_Click(sender As Object, e As EventArgs)
            changeDetails(emailBox.text, firstBox.text, lastBox.text, promoBox.text)
        End Sub

        Function changeDetails(ByVal email As String, ByVal firstname As String, ByVal lastname As String, ByVal promo As String) As Integer
            Dim connectionString As String = "Provider=Microsoft.Jet.OLEDB.4.0; Ole DB Services=-4; Data Source=C:\Documents an"& _
                "d Settings\Paul Jarratt\My Documents\ticketoffice\datab\ticketoffice.mdb"
            Dim dbConnection As System.Data.IDbConnection = New System.Data.OleDb.OleDbConnection(connectionString)

            Dim queryString As String = "UPDATE [users] SET [email]=@email, [firstname]=@firstname, [lastname]=@lastname, "& _
                "[promo]=@promo WHERE ([users].[user_id] = " + session.contents.item("ID") + ")"

            Dim dbCommand As System.Data.IDbCommand = New System.Data.OleDb.OleDbCommand
            dbCommand.CommandText = queryString
            dbCommand.Connection = dbConnection

            Dim dbParam_email As System.Data.IDataParameter = New System.Data.OleDb.OleDbParameter
            dbParam_email.ParameterName = "@email"
            dbParam_email.Value = email
            dbParam_email.DbType = System.Data.DbType.[String]
            dbCommand.Parameters.Add(dbParam_email)

            Dim dbParam_firstname As System.Data.IDataParameter = New System.Data.OleDb.OleDbParameter
            dbParam_firstname.ParameterName = "@firstname"
            dbParam_firstname.Value = firstname
            dbParam_firstname.DbType = System.Data.DbType.[String]
            dbCommand.Parameters.Add(dbParam_firstname)

            Dim dbParam_lastname As System.Data.IDataParameter = New System.Data.OleDb.OleDbParameter
            dbParam_lastname.ParameterName = "@lastname"
            dbParam_lastname.Value = lastname
            dbParam_lastname.DbType = System.Data.DbType.[String]
            dbCommand.Parameters.Add(dbParam_lastname)

            Dim dbParam_promo As System.Data.IDataParameter = New System.Data.OleDb.OleDbParameter
            dbParam_promo.ParameterName = "@promo"
            dbParam_promo.Value = promo
            dbParam_promo.DbType = System.Data.DbType.[String]
            dbCommand.Parameters.Add(dbParam_promo)

            Dim rowsAffected As Integer = 0
            dbConnection.Open
            Try
                rowsAffected = dbCommand.ExecuteNonQuery
            Finally
                dbConnection.Close
            End Try

            labelTest.text = rowsAffected.ToString()

            If rowsAffected = 1 Then
                labelSuccess.text = "* Your details have been updated and saved"
            Else
                labelError.text = "* Your details could not be updated"
            End If
        End Function

    Any help would be greatly appreciated.

    Read the article

  • SQL statement to split a table based on a join

    - by williamjones
    I have a primary table for Articles that is linked by a join table Info to a table Tags that has only a small number of entries. I want to split the Articles table, by either deleting rows or creating a new table with only the entries I want, based on the absence of a link to a certain tag. There are a few million articles. How can I do this? Not all of the articles have any tag at all, and some have many tags. Example:

        table Articles
            primary_key id

        table Info
            foreign_key article_id
            foreign_key tag_id

        table Tags
            primary_key id

    It was easy for me to segregate the articles that do have the match right off the bat, so I thought maybe I could do that and then use a NOT IN statement, but that is so slow running it's unclear if it's ever going to finish. I did that with these commands:

        INSERT INTO matched_articles
        SELECT * FROM articles a
        LEFT JOIN info i ON a.id = i.article_id
        WHERE i.tag_id = 5;

        INSERT INTO unmatched_articles
        SELECT * FROM articles a
        WHERE a.id NOT IN (SELECT m.id FROM matched_articles m);

    If it makes a difference, I'm on Postgres.
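
    As an illustration only, a Postgres sketch of the second step using NOT EXISTS against the join table directly, which avoids both the intermediate matched_articles pass and the NOT IN subquery (table names and the tag value are taken from the example above):

        INSERT INTO unmatched_articles
        SELECT a.*
        FROM   articles a
        WHERE  NOT EXISTS (SELECT 1
                           FROM   info i
                           WHERE  i.article_id = a.id
                             AND  i.tag_id = 5);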

    Read the article

  • Passing in a lambda to a Where statement

    - by sonicblis
    I noticed today that if I do this:

        var items = context.items.Where(i => i.Property < 2);
        items = items.Where(i => i.Property > 4);

    then once I access the items var, it executes only the first line as the data call and then does the second call in memory. However, if I do this:

        var items = context.items.Where(i => i.Property < 2).Where(i => i.Property > 4);

    I get only one expression executed against the context that includes both where statements. I have a host of variables that I want to use to build the expression for the LINQ lambda, but their presence or absence changes the expression such that I'd have to have a ridiculous number of conditionals to satisfy all cases. I thought I could just add the Where() statements as in my first example above, but that doesn't end up in a single expression that contains all of the criteria. Therefore, I'm trying to create just the lambda itself, as such:

        // bogus syntax
        if (var1 == "something")
            var expression = Expression<Func<item, bool>>(i => i.Property == "Something");
        if (var2 == "somethingElse")
            expression = expression.Where(i => i.Property2 == "SomethingElse");

    And then pass that in to the Where of my context.Items to evaluate. A) is this right, and B) if so, how do you do it?

    Read the article

  • Linq2Sql - attempting to update but the Set statement in sql is empty

    - by MrTortoise
    This is weird ... I've done updates loads of times before but cannot spot why this is different. I have a Client class from the dbml, to which I added an update method:

        public void UpdateSingle()
        {
            L2SDataContext dc = new L2SDataContext();
            Client c = dc.Clients.Single<Client>(p => p.ID == this.ID);
            c.CopyToMe(this);
            c.updatedOn = DateTime.Now;
            dc.SubmitChanges();
            dc.Dispose();
        }

    The CopyToMe method:

        public void CopyToMe(Client theObject)
        {
            if (ID != theObject.ID)
            {
                ID = theObject.ID;
            }
            /// this is redundant as generated code checks the field for a change.
            deleted = theObject.deleted;
            deletedBy = theObject.deletedBy;
            deletedOn = theObject.deletedOn;
            insertedBy = theObject.insertedBy;
            insertedOn = theObject.insertedOn;
            name = theObject.name;
            updatedBy = theObject.updatedBy;
            updatedOn = theObject.updatedOn;
        }

    I'm taking a client that was selected, changing its name and then calling this update method. The generated SQL is as follows:

        exec sp_executesql N'UPDATE [dbo].[tblClient]
        SET
        WHERE ([ID] = @p0) AND ([name] = @p1) AND ([insertedOn] = @p2) AND ([insertedBy] = @p3)
          AND ([updatedOn] = @p4) AND ([updatedBy] = @p5) AND ([deletedOn] IS NULL)
          AND ([deletedBy] IS NULL) AND (NOT ([deleted] = 1))',
        N'@p0 int,@p1 varchar(8000),@p2 datetime,@p3 int,@p4 datetime,@p5 int',
        @p0=103,@p1='UnitTestClient',@p2=''2010-05-17 11:33:22:520'',@p3=3,
        @p4=''2010-05-17 11:33:22:520'',@p5=3

    I have no idea why this is not working ... I've used this kind of "select object, set field to new value, submit the selected object" pattern many times and not had this problem. There is also nothing obviously wrong with the dbml - although this is probably a false statement. Any ideas?

    Read the article

  • Should we use Visual Studio 2010 for all SQL Server Database Development?

    - by Luke
    Our company currently has seven dedicated SQL Server 2008 servers each running an average of 10 databases. All databases have many stored procedures and UDFs that commonly reference other databases both on the same server and also across linked servers. We currently use SSMS for all database related administration and development but we have recently purchased Visual Studio 2010 primarily for ongoing C# WinForms and ASP.NET development. I have used VS2010 to perform schema comparisons when rolling out changes from a development server into production and I'm finding it great for this task. We would like to consider using VS2010 for all database development going forward but as far as I understand, we would have to set up ALL databases as projects because of the dependencies on linked servers etc. My question is, do you have any experience using VS2010 for database development in a similar environment? Is it easy to use in tandem with SSMS or is it a one way street once VS2010 projects have been set up for all databases? Can you make any recommendations/impart any experience with a similar scenario? Thanks, Luke

    Read the article

  • Is there a better way to convert SQL datetime from hh:mm:ss to hhmmss?

    - by Johann J.
    I have to write an SQL view that returns the time part of a datetime column as a string in the format hhmmss (apparently SAP BW doesn't understand hh:mm:ss). This code is the SAP recommended way to do this, but I think there must be a better, more elegant way to accomplish it:

        TIME = case len(convert(varchar(2), datepart(hh, timecolumn)))
                 when 1 then /* Hour Part of TIMES */
                   case convert(varchar(2), datepart(hh, timecolumn))
                     when '0' then '24' /* Map 00 to 24 ( TIMES ) */
                     else '0' + convert(varchar(1), datepart(hh, timecolumn))
                   end
                 else convert(varchar(2), datepart(hh, timecolumn))
               end
             + case len(convert(varchar(2), datepart(mi, timecolumn)))
                 when 1 then '0' + convert(varchar(1), datepart(mi, timecolumn))
                 else convert(varchar(2), datepart(mi, timecolumn))
               end
             + case len(convert(varchar(2), datepart(ss, timecolumn)))
                 when 1 then '0' + convert(varchar(1), datepart(ss, timecolumn))
                 else convert(varchar(2), datepart(ss, timecolumn))
               end

    This accomplishes the desired result: 21:10:45 is displayed as 211045. I'd love something more compact and easily readable, but so far I've come up with nothing that works.
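
    As a sketch only: CONVERT style 108 already yields hh:mm:ss, so stripping the colons gives hhmmss in one expression. Note that it keeps midnight as '00' rather than mapping it to '24' as the SAP code above does, so that special case would still need handling separately:

        TIME = REPLACE(CONVERT(varchar(8), timecolumn, 108), ':', '')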

    Read the article

  • Linq insert statement inserts nothing, does not fail either

    - by pietjepoeier
    I am trying to insert a new account in my Accounts table with LINQ. I tried using the EntityModel and Linq2Sql. I get no insert into my database, nor an exception of any kind.

        public static Linq2SQLDataContext dataContext
        {
            get { return new Linq2SQLDataContext(); }
        }

        try
        {
            // EntityModel
            Accounts acc = Accounts.CreateAccounts(0, Voornaam, Straat, Huisnummer, Stad, Land, 15, EmailReg, Password1);
            Entities.AddToAccounts(acc);
            Entities.SaveChanges();

            // Linq 2 SQL
            Account account = new Account
            {
                City = Stad,
                Country = Land,
                EmailAddress = EmailReg,
                Name = Voornaam,
                Password = Password1,
                Street = Straat,
                StreetNr = Huisnummer,
                StreetNrAdd = Toevoeging,
                Points = 25
            };

            dataContext.Accounts.InsertOnSubmit(account);

            var conf = dataContext.ChangeConflicts;      // No changeConflicts
            ChangeSet set = dataContext.GetChangeSet();  // 0 inserts, 0 updates, 0 deletes

            try
            {
                dataContext.SubmitChanges();
            }
            catch (Exception ex) { }
        }
        catch (EntityException ex) { }

    Read the article

  • Select Statement to show missing records (Easy Question)

    - by Gerhard Weiss
    I need some T-SQL that will show missing records. Here is some sample data:

        Emp 1
        01/01/2010
        02/01/2010
        04/01/2010
        06/01/2010

        Emp 2
        02/01/2010
        04/01/2010
        05/01/2010

    etc... I need to know:

        Emp 1 is missing 03/01/2010 05/01/2010
        Emp 2 is missing 01/01/2010 03/01/2010 06/01/2010

    The range to check will start with today's date and go back 6 months. In this example, let's say today's date is 06/12/2010, so the range is going to be 01/01/2010 thru 06/01/2010. The day is always going to be the 1st in the data. Thanks a bunch. :)

    Gerhard Weiss
    Secretary of Great Lakes Area .NET Users Group
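
    For illustration, a hedged T-SQL sketch (SQL Server 2008 syntax; the table and column names Attendance(EmpID, MonthDate) are assumed): generate the six first-of-month dates ending with the current month, cross join them to the employees, and keep the combinations that have no matching row.

        ;WITH Months AS (
            SELECT DATEADD(MONTH, -n, DATEADD(MONTH, DATEDIFF(MONTH, 0, GETDATE()), 0)) AS MonthDate
            FROM   (VALUES (0),(1),(2),(3),(4),(5)) AS v(n)
        )
        SELECT  e.EmpID, m.MonthDate AS MissingMonth
        FROM    (SELECT DISTINCT EmpID FROM Attendance) AS e
        CROSS JOIN Months AS m
        WHERE   NOT EXISTS (SELECT 1
                            FROM   Attendance a
                            WHERE  a.EmpID = e.EmpID
                              AND  a.MonthDate = m.MonthDate)
        ORDER BY e.EmpID, m.MonthDate;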

    Read the article

  • SQL Query syntax, I want to use INNER JOIN

    - by amol kadam
    Hi. I'm working on a Windows application project using "vb.net" as the front end and "MS Access" as the back end. I have a problem writing an SQL query. There are 5 tables: Transactions, Items, ItemTitle, ItemType and UserInfo. Please check the following query and, with this as a reference, change it into the correct query. Thank you.

        SELECT TRANSACTIONS.ACCESSIONNO AS ACCESSIONNO, TRANSACTIONS.TYPEID,
               TRANSACTIONS.CHECKOUTDATE AS CHECKOUTDATE, ITEMTITLE.ITEMTITLE,
               TRANSACTIONS.CHECKEDOUTBY, USERINFO.FULLNAME_ENG, USERINFO.FULLNAME_MAR,
               TRANSACTIONS.ACCOUNTNO, ITEMTYPE.TYPES_MAR, ITEMTYPE.TYPES_ENG
        FROM   TRANSACTIONS, ITEMTYPE, ITEMTITLE, USERINFO
        WHERE  TRANSACTIONS.ACCOUNTNO = USERINFO.ACCOUNTNO
          AND  TRANSACTIONS.ACCESSIONNO = ITEMS.ACCESSIONNO
          AND  ITEMS.ITEMTITLEID = ITEMTITLE.ITEMTITLEID
          AND  TRANSACTIONS.TYPEID = ITEMTYPE.TYPEID
          AND  TRANSACTIONS.STATUS = 'Enabled'
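
    For illustration, a hedged sketch of the same query rewritten with explicit INNER JOINs. Note that the original FROM list omits the ITEMS table even though the WHERE clause references it, and that Access usually expects multiple joins to be nested in parentheses as below:

        SELECT T.ACCESSIONNO, T.TYPEID, T.CHECKOUTDATE, IT.ITEMTITLE, T.CHECKEDOUTBY,
               U.FULLNAME_ENG, U.FULLNAME_MAR, T.ACCOUNTNO, TY.TYPES_MAR, TY.TYPES_ENG
        FROM   (((TRANSACTIONS AS T
               INNER JOIN USERINFO  AS U  ON T.ACCOUNTNO   = U.ACCOUNTNO)
               INNER JOIN ITEMS     AS I  ON T.ACCESSIONNO = I.ACCESSIONNO)
               INNER JOIN ITEMTYPE  AS TY ON T.TYPEID      = TY.TYPEID)
               INNER JOIN ITEMTITLE AS IT ON I.ITEMTITLEID = IT.ITEMTITLEID
        WHERE  T.STATUS = 'Enabled'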

    Read the article

  • Algorithms to trim leading zeroes from a SQL field?

    - by froadie
    I just came across the interesting problem of trying to trim the leading zeroes from a non-numeric field in SQL. (Since it can contain characters, it can't just be converted to a number and then back.) This is what we ended up using:

        SELECT REPLACE(LTRIM(REPLACE(fieldWithLeadingZeroes, '0', ' ')), ' ', '0')

    It replaces the zeroes with spaces, left trims it, and then puts the zeroes back in. I thought this was a very clever and interesting way to do it, although not so readable if you've never come across it before. Are there any clearer ways to do this? Any more efficient ways to do this? Or any other ways to do this, period? I was intrigued by this problem and would be interested to see any methods of getting around it.
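
    For comparison, a hedged sketch of another common SQL Server approach using PATINDEX to locate the first non-zero character (the column name is taken from the snippet above; the appended '.' makes an all-zero value return an empty string, matching the REPLACE/LTRIM version):

        SELECT SUBSTRING(fieldWithLeadingZeroes,
                         PATINDEX('%[^0]%', fieldWithLeadingZeroes + '.'),
                         LEN(fieldWithLeadingZeroes))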

    Read the article

  • Returning several COUNT results from one ASP SQL statement

    - by user89691
    Say I have a table like this:

        Field1   Field2   Field3   Field4
        fred     tom      fred     harry
        tom      tom      dick     harry
        harry

    and I want to determine what proportion of it has been completed for each field. I can execute:

        SELECT COUNT (Field1) WHERE (Field1 <> '') AS Field1Count
        SELECT COUNT (Field2) WHERE (Field2 <> '') AS Field2Count
        SELECT COUNT (Field3) WHERE (Field3 <> '') AS Field3Count
        SELECT COUNT (Field4) WHERE (Field4 <> '') AS Field4Count

    Is it possible to roll up these separate SQL statements into one that will return the 4 results in one hit? Is there any performance advantage to doing so (given that the number of columns and rows may be quite large in practice)?
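
    For illustration, a hedged sketch of one way to roll the four counts into a single pass over the table (the table name is assumed): COUNT ignores NULLs, and NULLIF turns empty strings into NULL.

        SELECT COUNT(NULLIF(Field1, '')) AS Field1Count,
               COUNT(NULLIF(Field2, '')) AS Field2Count,
               COUNT(NULLIF(Field3, '')) AS Field3Count,
               COUNT(NULLIF(Field4, '')) AS Field4Count
        FROM   SomeTable   -- table name assumed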

    Read the article

  • How do I return an empty result set from a procedure using T-SQL?

    - by Kivin
    I'm interested in returning an empty result set from SQL Server stored procedures in certain events. The intended behaviour is that a L2SQL DataContext.SPName().SingleOrDefault() will result in a CLR null value. I'm presently using the following solution, but I'm unsure whether it would be considered bad practice, a performance hazard (I could not find one by reading the execution plan), or if there is simply a better way:

        SELECT * FROM [dbo].[TableName] WHERE 0 = 1;

    The execution plan is a constant scan with a trivial cost associated with it. The reason I am asking this instead of simply not running any SELECTs is because I'm concerned previous SELECT @scalar or SELECT INTO statements could cause unintended result sets to be served back to L2SQL. Am I worrying over nothing?
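
    As an aside, a hedged sketch of an equivalent that some find more explicit about its intent; like the WHERE 0 = 1 version, it returns no rows while preserving the same column metadata:

        SELECT TOP (0) * FROM [dbo].[TableName];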

    Read the article

  • Is using Natural Join or Implicit column names not a good practice when writing SQL in a programming

    - by Jian Lin
    When we use a Natural Join, we are joining the tables on the columns that have the same names in both tables. But what if we write it in PHP and then the DBA adds some more fields to both tables? Then the Natural Join can break. The same goes for INSERT: if we do

        insert into gifts values (NULL, "chocolate", "choco.jpg", now());

    then it will break the code, as well as contaminate the table, when the DBA adds some fields to the table (for example as column 2 or 3). So it is always best to spell out the column names when the SQL statements are written inside a programming language and stored in a file in a big project.
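
    For illustration, a hedged MySQL sketch of the explicit-column form of that insert; the column names are invented for the example, since the gifts schema isn't shown:

        -- column names are hypothetical
        INSERT INTO gifts (gift_id, gift_name, image_file, created_at)
        VALUES (NULL, 'chocolate', 'choco.jpg', NOW());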

    Read the article
