Search Results

Search found 42327 results on 1694 pages for 'microsoft sql t sql'.


  • Testing for existence using SELECT WHERE HAVING and NOT HAVING in a grouped subset

    - by IanC
    I have data on which I need to count +1 if a particular condition exists or another condition doesn't exist. I'm using SQL Server 2008. I shred the following simplified sample XML into a temp table and validate it: <product type="1"> <param type="1"> <item mode="0" weight="1" /> </param> <param type="2"> <item mode="1" weight="1" /> <item mode="0" weight="0.1" /> </param> <param type="3"> <item mode="1" weight="0.75" /> <item mode="1" weight="0.25" /> </param> </product> The validation in concern is the following rule: For each product type, for each param type, mode may be 0 & (1 || 2). In other words, there may be a 0(s), but then 1s or 2s are required, or there may be only 1(s) or 2(s). There cannot be only 0s, and there cannot be 1s and 2s. The only part I haven't figured out is how to detect if there are only 0s. This seems like a "not having" problem. The validation code (for this part): WITH t1 AS ( SELECT SUM(t.ParamWeight) AS S, COUNT(1) AS C, t.ProductTypeID, t.ParamTypeID, t.Mode FROM @t AS t GROUP BY t.ProductTypeID, t.ParamTypeID, t.Mode ), ... UNION ALL SELECT TOP (1) 1 -- only mode 0 & (1 || 2) is allowed FROM t1 WHERE t1.Mode IN (1, 2) GROUP BY t1.ProductTypeID, t1.ParamTypeID HAVING COUNT(1) > 1 UNION ALL ... ) SELECT @C = COUNT(1) FROM t2 This will show if any mode 1s & 2s are mixed, but not if the group contains only a 0. I'm sure there is a simple solution, but it's evading me right now. EDIT: I thought of a "cheat" that works perfectly. I added the following to the above: SELECT TOP (1) 1 -- only mode 0 & (null || 1 || 2) is allowed FROM t1 GROUP BY t1.ProductTypeID, t1.ParamTypeID HAVING SUM(t1.Mode) = 0 However, I'd still like to know how to do this without cheating.
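
    A minimal sketch of one way to catch the "only 0s" case without the SUM() workaround: since the only valid modes are 0, 1 and 2, a (product, param) group whose highest mode is 0 can contain nothing but 0s. The branch below would slot into the poster's existing UNION ALL; table and column names follow the @t/t1 definitions in the post and are otherwise assumptions.

        SELECT TOP (1) 1                               -- flag: this param group contains only mode 0
        FROM t1
        GROUP BY t1.ProductTypeID, t1.ParamTypeID
        HAVING MAX(t1.Mode) = 0;                       -- no mode 1 or 2 row exists in the group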

    Read the article

  • linq to sql with nservicebus table lock issue

    - by IGoor
    I am building a system using NServiceBus and my DataLayer is using Linq 2 SQL. The system is made up of 2 services. Service1 receives messages from NSB. It will query Table1 in my database and inserts a record into Table1 If a certain condition is met a new NSB message is sent to the 2nd service Service2 will update records also in Table1 when it receives messages from Service1 and it does some other non database related work. Service2 is a long running process. The problem I am having is the moment Service2 updates a record in Table1, the table is locked. The lock seems to be in place until Service2 has completed all it is processing. i.e. The lock is not released after my datacontext is disposed. This causes the query in Service1 to timeout. Once Service2 completes processing, Service1 resumes processing again without problem. So for example Service1 code may look like: int x =0; using (DataContext db = new DataContext()) { x = (from dp in db.Table1 select dp).Count(); // this line will timeout while service2 is processing Table1 t = new Table1(); t.Data = "test"; db.Table1.InsertOnSubmit(t); db.SubmitChanges(); } if(x % 50 == 0) CallService2(); The code in service2 may look like: using (DataContext db = new DataContext()) { Table1 t = db.Table1.Where(t => t.id == myId); t.Data = "updated"; db.SubmitChanges(); } // I would have expected the lock to have been released at this point, but this is not the case. DoSomeLongRunningTasks(); // lock will be released once service2 exits I don't understand why the lock is not released when the datacontext is disposed in Service2. To get around the problem I have been calling: db.ExecuteCommand("SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED"); and this works, but I am not happy using it. I want to solve this problem properly. Has any one experienced this sort of problem before and does any one know how to solve it? Why is the lock not released after the datacontext has been disposed? Thanks in advance. p.s. sorry for the extremely long post.
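
    One commonly suggested alternative to READ UNCOMMITTED is row versioning at the database level. This is a sketch only (the database name is a placeholder), and it assumes the blocking comes from the ambient transaction that NServiceBus typically keeps open until the whole message handler finishes, which would also explain why disposing the DataContext does not release the lock.

        -- With READ_COMMITTED_SNAPSHOT on, readers such as the COUNT() in Service1 read
        -- the last committed row versions instead of blocking on Service2's open
        -- transaction, without resorting to dirty reads.
        ALTER DATABASE MyServiceDb
            SET READ_COMMITTED_SNAPSHOT ON
            WITH ROLLBACK IMMEDIATE;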

    Read the article

  • Unable to add item to dataset in Linq to SQL

    - by Mike B
    I am having an issue adding an item to my dataset in Linq to SQL. I am using the exact same method in other tables with no problem. I suspect I know the problem but cannot find an answer (I also suspect all i really need is the right search term for Google). Please keep in mind this is a learning project (Although it is in use in a business) I have posted my code and datacontext below. What I am doing is: Create a view model (Relevant bits are shown) and a simple wpf window that allows editing of 3 properties that are bound to the category object. Category is from the datacontext. Edit works fine but add does not. If I check GetChangeSet() just before the db.submitChanges() call there are no adds, edits or deletes. I suspect an issue with the fact that a Category added without a Subcategory would be an orphan but I cannot seem to find the solution. Command code to open window: CategoryViewModel vm = new CategoryViewModel(); AddEditCategoryWindow window = new AddEditCategoryWindow(vm); window.ShowDialog(); ViewModel relevant stuff: public class CategoryViewModel : ViewModelBase { public Category category { get; set; } // Constructor used to Edit a Category public CategoryViewModel(Int16 categoryID) { db = new OITaskManagerDataContext(); category = QueryCategory(categoryID); } // Constructor used to Add a Category public CategoryViewModel() { db = new OITaskManagerDataContext(); category = new Category(); } } The code for saving changes: // Don't close window unless all controls are validated if (!vm.IsValid(this)) return; var changes = vm.db.GetChangeSet(); // DEBUG try { vm.db.SubmitChanges(ConflictMode.ContinueOnConflict); } catch (ChangeConflictException) { vm.db.ChangeConflicts.ResolveAll(RefreshMode.KeepChanges); vm.db.SubmitChanges(); } The Xaml (Edited fror brevity): <TextBox Text="{Binding category.CatName, Mode=TwoWay, ValidatesOnDataErrors=True, UpdateSourceTrigger=PropertyChanged}" /> <TextBox Text="{Binding category.CatDescription, ValidatesOnDataErrors=True, UpdateSourceTrigger=PropertyChanged}" /> <CheckBox IsChecked="{Binding category.CatIsInactive, Mode=TwoWay}" /> IssCategory in the Issues table is the old, text based category. This field is no longer used and will be removed from the database as soon as this is working and pushed live.
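
    A hedged guess at the missing step, based on the empty change set: a newly constructed entity is not tracked until it is attached to the DataContext, so the "add" constructor would need an InsertOnSubmit call. The Categories table-property name is an assumption taken from the poster's model (it may be singular depending on how the designer pluralized it).

        // Constructor used to Add a Category -- sketch of the likely fix
        public CategoryViewModel()
        {
            db = new OITaskManagerDataContext();
            category = new Category();
            db.Categories.InsertOnSubmit(category); // without this, GetChangeSet() reports no inserts
        }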

    Read the article

  • Stored procedure to remove FK of a given table

    - by Nicole
    I need to create a stored procedure that: Accepts a table name as a parameter Find its dependencies (FKs) Removes them Truncate the table I created the following so far based on http://www.mssqltips.com/sqlservertip/1376/disable-enable-drop-and-recreate-sql-server-foreign-keys/ . My problem is that the following script successfully does 1 and 2 and generates queries to alter tables but does not actually execute them. In another word how can execute the resulting "Alter Table ..." queries to actually remove FKs? CREATE PROCEDURE DropDependencies(@TableName VARCHAR(50)) AS BEGIN SELECT 'ALTER TABLE ' + OBJECT_SCHEMA_NAME(parent_object_id) + '.[' + OBJECT_NAME(parent_object_id) + '] DROP CONSTRAINT ' + name FROM sys.foreign_keys WHERE referenced_object_id=object_id(@TableName) END EXEC DropDependencies 'TableName' Any idea is appreciated! Update: I added the cursor to the SP but I still get and error: "Msg 203, Level 16, State 2, Procedure DropRestoreDependencies, Line 75 The name 'ALTER TABLE [dbo].[ChildTable] DROP CONSTRAINT [FK__ChileTable__ParentTable__745C7C5D]' is not a valid identifier." Here is the updated SP: CREATE PROCEDURE DropRestoreDependencies(@schemaName sysname, @tableName sysname) AS BEGIN SET NOCOUNT ON DECLARE @operation VARCHAR(10) SET @operation = 'DROP' --ENABLE, DISABLE, DROP DECLARE @cmd NVARCHAR(1000) DECLARE @FK_NAME sysname, @FK_OBJECTID INT, @FK_DISABLED INT, @FK_NOT_FOR_REPLICATION INT, @DELETE_RULE smallint, @UPDATE_RULE smallint, @FKTABLE_NAME sysname, @FKTABLE_OWNER sysname, @PKTABLE_NAME sysname, @PKTABLE_OWNER sysname, @FKCOLUMN_NAME sysname, @PKCOLUMN_NAME sysname, @CONSTRAINT_COLID INT DECLARE cursor_fkeys CURSOR FOR SELECT Fk.name, Fk.OBJECT_ID, Fk.is_disabled, Fk.is_not_for_replication, Fk.delete_referential_action, Fk.update_referential_action, OBJECT_NAME(Fk.parent_object_id) AS Fk_table_name, schema_name(Fk.schema_id) AS Fk_table_schema, TbR.name AS Pk_table_name, schema_name(TbR.schema_id) Pk_table_schema FROM sys.foreign_keys Fk LEFT OUTER JOIN sys.tables TbR ON TbR.OBJECT_ID = Fk.referenced_object_id --inner join WHERE TbR.name = @tableName AND schema_name(TbR.schema_id) = @schemaName OPEN cursor_fkeys FETCH NEXT FROM cursor_fkeys INTO @FK_NAME,@FK_OBJECTID, @FK_DISABLED, @FK_NOT_FOR_REPLICATION, @DELETE_RULE, @UPDATE_RULE, @FKTABLE_NAME, @FKTABLE_OWNER, @PKTABLE_NAME, @PKTABLE_OWNER WHILE @@FETCH_STATUS = 0 BEGIN -- create statement for dropping FK and also for recreating FK IF @operation = 'DROP' BEGIN -- drop statement SET @cmd = 'ALTER TABLE [' + @FKTABLE_OWNER + '].[' + @FKTABLE_NAME + '] DROP CONSTRAINT [' + @FK_NAME + ']' EXEC @cmd -- create process DECLARE @FKCOLUMNS VARCHAR(1000), @PKCOLUMNS VARCHAR(1000), @COUNTER INT -- create cursor to get FK columns DECLARE cursor_fkeyCols CURSOR FOR SELECT COL_NAME(Fk.parent_object_id, Fk_Cl.parent_column_id) AS Fk_col_name, COL_NAME(Fk.referenced_object_id, Fk_Cl.referenced_column_id) AS Pk_col_name FROM sys.foreign_keys Fk LEFT OUTER JOIN sys.tables TbR ON TbR.OBJECT_ID = Fk.referenced_object_id INNER JOIN sys.foreign_key_columns Fk_Cl ON Fk_Cl.constraint_object_id = Fk.OBJECT_ID WHERE TbR.name = @tableName AND schema_name(TbR.schema_id) = @schemaName AND Fk_Cl.constraint_object_id = @FK_OBJECTID -- added 6/12/2008 ORDER BY Fk_Cl.constraint_column_id OPEN cursor_fkeyCols FETCH NEXT FROM cursor_fkeyCols INTO @FKCOLUMN_NAME,@PKCOLUMN_NAME SET @COUNTER = 1 SET @FKCOLUMNS = '' SET @PKCOLUMNS = '' WHILE @@FETCH_STATUS = 0 BEGIN IF @COUNTER > 1 BEGIN SET @FKCOLUMNS = @FKCOLUMNS + ',' SET @PKCOLUMNS = 
@PKCOLUMNS + ',' END SET @FKCOLUMNS = @FKCOLUMNS + '[' + @FKCOLUMN_NAME + ']' SET @PKCOLUMNS = @PKCOLUMNS + '[' + @PKCOLUMN_NAME + ']' SET @COUNTER = @COUNTER + 1 FETCH NEXT FROM cursor_fkeyCols INTO @FKCOLUMN_NAME,@PKCOLUMN_NAME END CLOSE cursor_fkeyCols DEALLOCATE cursor_fkeyCols END FETCH NEXT FROM cursor_fkeys INTO @FK_NAME,@FK_OBJECTID, @FK_DISABLED, @FK_NOT_FOR_REPLICATION, @DELETE_RULE, @UPDATE_RULE, @FKTABLE_NAME, @FKTABLE_OWNER, @PKTABLE_NAME, @PKTABLE_OWNER END CLOSE cursor_fkeys DEALLOCATE cursor_fkeys END For running use: EXEC DropRestoreDependencies dbo, ParentTable
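
    The Msg 203 error is what EXEC raises when the variable's value is treated as a procedure name rather than a batch to run. A sketch of the usual fix is to push the dynamic string through sp_executesql (or EXEC with parentheses), using the same @cmd the procedure already builds:

        SET @cmd = 'ALTER TABLE [' + @FKTABLE_OWNER + '].[' + @FKTABLE_NAME
                 + '] DROP CONSTRAINT [' + @FK_NAME + ']';
        EXEC sp_executesql @cmd;    -- or: EXEC (@cmd);
        -- EXEC @cmd (no parentheses) looks up a stored procedure whose *name* is the whole string.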

    Read the article

  • A better UPDATE method in LINQ to SQL

    - by Refracted Paladin
    The below is a typical, for me, Update method in L2S. I am still fairly new to a lot of this(L2S & business app development) but this just FEELs wrong. Like there MUST be a smarter way of doing this. Unfortunately, I am having trouble visualizing it and am hoping someone can provide an example or point me in the right direction. To take a stab in the dark, would I have a Person Object that has all these fields as Properties? Then what, though? Is that redundant since L2S already mapped my Person Table to a Class? Is this just 'how it goes', that you eventually end up passing 30 parameters(or MORE) to an UPDATE statement at some point? For reference, this is a business app using C#, WinForms, .Net 3.5, and L2S over SQL 2005 Standard. Here is a typical Update Call for me. This is in a file(BLLConnect.cs) with other CRUD methods. Connect is the name of the DB that holds tblPerson When a user clicks save() this is what is eventually called with all of these fields having, potentially, been updated-- public static void UpdatePerson(int personID, string userID, string titleID, string firstName, string middleName, string lastName, string suffixID, string ssn, char gender, DateTime? birthDate, DateTime? deathDate, string driversLicenseNumber, string driversLicenseStateID, string primaryRaceID, string secondaryRaceID, bool hispanicOrigin, bool citizenFlag, bool veteranFlag, short ? residencyCountyID, short? responsibilityCountyID, string emailAddress, string maritalStatusID) { using (var context = ConnectDataContext.Create()) { var personToUpdate = (from person in context.tblPersons where person.PersonID == personID select person).Single(); personToUpdate.TitleID = titleID; personToUpdate.FirstName = firstName; personToUpdate.MiddleName = middleName; personToUpdate.LastName = lastName; personToUpdate.SuffixID = suffixID; personToUpdate.SSN = ssn; personToUpdate.Gender = gender; personToUpdate.BirthDate = birthDate; personToUpdate.DeathDate = deathDate; personToUpdate.DriversLicenseNumber = driversLicenseNumber; personToUpdate.DriversLicenseStateID = driversLicenseStateID; personToUpdate.PrimaryRaceID = primaryRaceID; personToUpdate.SecondaryRaceID = secondaryRaceID; personToUpdate.HispanicOriginFlag = hispanicOrigin; personToUpdate.CitizenFlag = citizenFlag; personToUpdate.VeteranFlag = veteranFlag; personToUpdate.ResidencyCountyID = residencyCountyID; personToUpdate.ResponsibilityCountyID = responsibilityCountyID; personToUpdate.EmailAddress = emailAddress; personToUpdate.MaritalStatusID = maritalStatusID; personToUpdate.UpdateUserID = userID; personToUpdate.UpdateDateTime = DateTime.Now; context.SubmitChanges(); } }
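
    One common alternative is to pass the entity itself instead of twenty-odd scalar parameters and re-attach it as modified. This is a sketch under the assumption that the tblPerson mapping has a timestamp/rowversion column (or suitable UpdateCheck settings), which LINQ to SQL requires before it will accept a detached entity as modified; member names follow the poster's model.

        public static void UpdatePerson(tblPerson person, string userID)
        {
            using (var context = ConnectDataContext.Create())
            {
                // Attach the detached entity and mark it as modified so that
                // SubmitChanges() generates the UPDATE statement.
                context.tblPersons.Attach(person, true);
                person.UpdateUserID = userID;
                person.UpdateDateTime = DateTime.Now;
                context.SubmitChanges();
            }
        }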

    Read the article

  • Problems Enforcing Referential Integrity on SQL Server Tables

    - by SidC
    Hello All, I have a SQL Server 2005 database comprised of Customer, Quote, QuoteDetail tables. I want/need to enforce referential integrity such that when an insert is made on quotedetail, the quote and customer tables are also affected. I have tried my best to set up primary/foreign keys on my tables but need some help. Here's the scripts for my tables as they stand now (please don't laugh): Customers: USE [Diel_inventory] GO /****** Object: Table [dbo].[Customers] Script Date: 05/08/2010 03:39:04 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TABLE [dbo].[Customers]( [pkCustID] [int] IDENTITY(1,1) NOT NULL, [CompanyName] [nvarchar](50) NULL, [Address] [nvarchar](50) NULL, [City] [nvarchar](50) NULL, [State] [nvarchar](2) NULL, [ZipCode] [nvarchar](5) NULL, [OfficePhone] [nvarchar](12) NULL, [OfficeFAX] [nvarchar](12) NULL, [Email] [nvarchar](50) NULL, [PrimaryContactName] [nvarchar](50) NULL, CONSTRAINT [PK_Customers] PRIMARY KEY CLUSTERED ([pkCustID] ASC)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY] Quotes: USE [Diel_inventory] GO /****** Object: Table [dbo].[Quotes] Script Date: 05/08/2010 03:30:46 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TABLE [dbo].[Quotes]( [pkQuoteID] [int] IDENTITY(1,1) NOT NULL, [fkCustomerID] [int] NOT NULL, [QuoteDate] [timestamp] NOT NULL, [NeedbyDate] [datetime] NULL, [QuoteAmt] [decimal](6, 2) NOT NULL, [QuoteApproved] [bit] NOT NULL, [fkOrderID] [int] NOT NULL, CONSTRAINT [PK_Bids] PRIMARY KEY CLUSTERED ( [pkQuoteID] ASC)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY] GO ALTER TABLE [dbo].[Quotes] WITH CHECK ADD CONSTRAINT [fkCustomerID] FOREIGN KEY([fkCustomerID]) REFERENCES [dbo].[Customers] ([pkCustID]) GO ALTER TABLE [dbo].[Quotes] CHECK CONSTRAINT [fkCustomerID] QuoteDetail: USE [Diel_inventory] GO /****** Object: Table [dbo].[QuoteDetail] Script Date: 05/08/2010 03:31:58 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TABLE [dbo].[QuoteDetail]( [ID] [int] IDENTITY(1,1) NOT NULL, [fkQuoteID] [int] NOT NULL, [fkCustomerID] [int] NOT NULL, [fkPartID] [int] NULL, [PartNumber1] [float] NOT NULL, [Qty1] [int] NOT NULL, [PartNumber2] [float] NULL, [Qty2] [int] NULL, [PartNumber3] [float] NULL, [Qty3] [int] NULL, [PartNumber4] [float] NULL, [Qty4] [int] NULL, [PartNumber5] [float] NULL, [Qty5] [int] NULL, [PartNumber6] [float] NULL, [Qty6] [int] NULL, [PartNumber7] [float] NULL, [Qty7] [int] NULL, [PartNumber8] [float] NULL, [Qty8] [int] NULL, [PartNumber9] [float] NULL, [Qty9] [int] NULL, [PartNumber10] [float] NULL, [Qty10] [int] NULL, [PartNumber11] [float] NULL, [Qty11] [int] NULL, [PartNumber12] [float] NULL, [Qty12] [int] NULL, [PartNumber13] [float] NULL, [Qty13] [int] NULL, [PartNumber14] [float] NULL, [Qty14] [int] NULL, [PartNumber15] [float] NULL, [Qty15] [int] NULL, [PartNumber16] [float] NULL, [Qty16] [int] NULL, [PartNumber17] [float] NULL, [Qty17] [int] NULL, [PartNumber18] [float] NULL, [Qty18] [int] NULL, [PartNumber19] [float] NULL, [Qty19] [int] NULL, [PartNumber20] [float] NULL, [Qty20] [int] NULL, CONSTRAINT [PK_QuoteDetail] PRIMARY KEY CLUSTERED ( [ID] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY] GO ALTER TABLE [dbo].[QuoteDetail] WITH CHECK ADD CONSTRAINT 
[FK_QuoteDetail_Customers] FOREIGN KEY ([fkCustomerID]) REFERENCES [dbo].[Customers] ([pkCustID]) GO ALTER TABLE [dbo].[QuoteDetail] CHECK CONSTRAINT [FK_QuoteDetail_Customers] GO ALTER TABLE [dbo].[QuoteDetail] WITH CHECK ADD CONSTRAINT [FK_QuoteDetail_PartList] FOREIGN KEY ([fkPartID]) REFERENCES [dbo].[PartList] ([RecID]) GO ALTER TABLE [dbo].[QuoteDetail] CHECK CONSTRAINT [FK_QuoteDetail_PartList] GO ALTER TABLE [dbo].[QuoteDetail] WITH CHECK ADD CONSTRAINT [FK_QuoteDetail_Quotes] FOREIGN KEY([fkQuoteID]) REFERENCES [dbo].[Quotes] ([pkQuoteID]) GO ALTER TABLE [dbo].[QuoteDetail] CHECK CONSTRAINT [FK_QuoteDetail_Quotes] Your advice/guidance on how to set these up so that customer ID in Customers is the same as in Quotes (referential integrity) and that CustomerID is inserted on Quotes and Customers when an insert is made to QuoteDetial would be much appreciated. Thanks, Sid
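
    As an aside to the foreign-key question, a hedged sketch of a more normalized QuoteDetail: one row per quoted part, with the chain QuoteDetail -> Quotes -> Customers carrying the customer reference. A foreign key only guarantees that the parent rows already exist; it never inserts them for you, so an insert into QuoteDetail requires the quote (and therefore the customer) to be created first. Names are taken from the posted scripts.

        CREATE TABLE dbo.QuoteDetail
        (
            ID        int IDENTITY(1,1) NOT NULL CONSTRAINT PK_QuoteDetail PRIMARY KEY,
            fkQuoteID int NOT NULL
                CONSTRAINT FK_QuoteDetail_Quotes   REFERENCES dbo.Quotes   (pkQuoteID),
            fkPartID  int NOT NULL
                CONSTRAINT FK_QuoteDetail_PartList REFERENCES dbo.PartList (RecID),
            Qty       int NOT NULL
        );
        -- The customer is reachable through Quotes.fkCustomerID, so QuoteDetail does
        -- not need its own fkCustomerID column at all.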

    Read the article

  • SQL Server architecture guidance

    - by Liam
    Hi, We are designing a new version of our existing product on a new schema. It's an internal web application with possibly 100 concurrent users (max). This will run on a SQL Server 2008 database. One of the discussion items recently is whether we should have a single database or split the database for performance reasons across 2 separate databases. The database could grow anywhere from 50-100GB over 5 years. We are Developers and not DBAs so it would be nice to get some general guidance. [I know the answer is not simple as it depends on the schema, archiving policy, amount of data etc.] Option 1: Single Main Database [This is my preferred option]. The plan would be to have all the tables in a single database and possibly to use file groups and partitioning to separate the data if required across multiple disks. [Use schemas if appropriate]. This should deal with the performance concerns. One of the comments wrt this was that a single server instance would still be processing this data so there would still be a processing bottleneck. For reporting we could have a separate reporting DB but this is still being discussed. Option 2: Split the database into 2 separate databases. DB1 - Customers, Accounts, Customer resources etc. DB2 - This would contain the bulk of the data [i.e. vehicle tracking data, financial transaction tables etc]. These tables would typically contain a lot of data. [It could reside on a separate server if required.] This plan would involve keeping the main data in a smaller database [DB1] and retaining the [mainly] read-only transaction-type data in a separate DB [DB2]. The UI would mainly read from DB1 and thus be more responsive. [I'm aware that this option makes it harder for referential integrity to be enforced.] Points for consideration: As we are at the design stage we can at least make proper use of indexes to deal with performance issues, so that's why option 1 to me is attractive and it's more of a standard approach. For both options we are considering implementing an archiving database. Apologies for the long question. In summary, the question is: 1 DB or 2? Thanks in advance, Liam
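
    A minimal sketch of the filegroup side of option 1 (database name, file path, size and table definition are all placeholders): the bulky tracking data stays in the same database but on its own filegroup/disk, which is usually the first step before adding partitioning.

        ALTER DATABASE AppDb ADD FILEGROUP TrackingData;
        ALTER DATABASE AppDb ADD FILE
            (NAME = TrackingData1, FILENAME = 'E:\SQLData\AppDb_Tracking1.ndf', SIZE = 10GB)
            TO FILEGROUP TrackingData;

        CREATE TABLE dbo.VehicleTracking
        (
            TrackingID bigint IDENTITY(1,1) NOT NULL CONSTRAINT PK_VehicleTracking PRIMARY KEY,
            VehicleID  int NOT NULL,
            RecordedAt datetime NOT NULL
        ) ON TrackingData;   -- swap for a partition scheme if date-based partitioning is added later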

    Read the article

  • SQL-How to retrieve the correct data using php

    - by Programatt
    I am new to SQL so please excuse my question if it is simple. I have a database with a few tables. 1 is a users table, the others are application tables that contain the users preferences for receiving notifications about that application based on the country they are interested in. What I want to do, is retrieve the e-mail address of all users that have an interest in that country. I am struggling to think about how to do this. I currently have the following query constructed, and the code to populate the values function check($string) { if (isset($_POST[$string])) { $print = implode(', ', $_POST[$string]); //Converts an array into a single string $imanageSQLArr = Array(); if (substr_count($print,'Benelux') > 0) { $imanageSQLArr[0] = "checked"; } else { $imanageSQLArr[0] = "off"; } if (substr_count($print, 'France') > 0) { $imanageSQLArr[1] = "checked"; } else { $imanageSQLArr[1] = "off"; } if (substr_count($print, 'Germany') > 0) { $imanageSQLArr[2] = "checked"; } else { $imanageSQLArr[2] = "off"; } if (substr_count($print, 'Italy') > 0) { $imanageSQLArr[3] = "checked"; } else { $imanageSQLArr[3] = "off"; } if (substr_count($print, 'Netherlands') > 0) { $imanageSQLArr[4] = "checked"; } else { $imanageSQLArr[4] = "off"; } if (substr_count($print, 'Portugal') > 0) { $imanageSQLArr[5] = "checked"; } else { $imanageSQLArr[5] = "off"; } if (substr_count($print, 'Spain') > 0) { $imanageSQLArr[6] = "checked"; } else { $imanageSQLArr[6] = "off"; } if (substr_count($print, 'Sweden') > 0) { $imanageSQLArr[7] = "checked"; } else { $imanageSQLArr[7] = "off"; } if (substr_count($print, 'Switzerland') > 0) { $imanageSQLArr[8] = "checked"; } else { $imanageSQLArr[8] = "off"; } if (substr_count($print, 'UK') > 0) { $imanageSQLArr[9] = "checked"; } else { $imanageSQLArr[9] = "off"; } and the query $tocheck = $db->prepare( "SELECT users.email FROM users,app WHERE users.id=app.userid AND BENELUX=:BENELUX AND FRANCE=:FRANCE AND GERMANY=:GERMANY AND ITALY=:ITALY AND NETHERLANDS=:NETHERLANDS AND PORTUGAL=:PORTUGAL AND SPAIN=:SPAIN AND SWEDEN=:SWEDEN AND SWITZERLAND=:SWITZERLAND AND UK=:UK"); $tocheck->execute($country); $row = $tocheck->fetchAll(); This does retrieve data, but only people who's preferences match EXACTLY what is put (so what they haven't selected is taken into account as much as what they have). Any help would be greatly appreciated.
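
    A sketch of the likely fix on the SQL side: with AND across every country column, a user only matches if their stored preferences are identical to the submitted form. Filtering with OR across just the countries that were actually ticked (that list being built in PHP before binding) returns anyone interested in at least one of them. Column values follow the 'checked'/'off' convention from the code above; the three countries shown are placeholders.

        SELECT users.email
        FROM users
        JOIN app ON app.userid = users.id
        WHERE app.FRANCE  = 'checked'   -- one predicate per country the visitor ticked,
           OR app.GERMANY = 'checked'   -- OR'd together instead of AND'ing all ten columns
           OR app.SPAIN   = 'checked';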

    Read the article

  • jQuery - Save to SQL via PHP

    - by Kenny Bones
    This is probably easy for you guys, but I can't understand it. I want to save the filename of an image to it's own row in the SQL base. Basically, I log on to the site where I have my own userID. And each user has its own column for background images. And the user can choose his own image if he wants to. So basically, when the user clicks on the image he wants, a jquery click event occurs and an ajax call is made to a php file which is supposed to take care of the actual update. The row for each user always exist so there's only an update of the data that's necessary. First, I collect the filename of the css property 'background-image' and split it so I get only the filename. I then store that filename in a variable I call 'filename' which is then passed on to this jQuery snippet: $.ajax({ url: 'save_to_db.php', data: filename, dataType:'Text', type: 'POST', success: function(data) { // Just for testing purposes. alert('Background changed to: ' + data); } }); And this is the php: <?php require("dbconnect.php") ?> <?php $uploadstring = ($_POST['filename']); mysql_query("UPDATE brukere SET brukerBakgrunn = $uploadstring WHERE brukerID=" .$_SESSION['id'] .""; mysql_close(); ?> Basically, each user has their own ID and this is called 'brukerID' The table everything is in is called 'brukere' and the column I'm supposed to update is the one called 'brukerBakgrunn' When I just run the javascript snippet, I get this message box in return where it says: Background changed to: Parse error: syntax error, unexpected ';' in /var/www/clients/client2/web8/web/save_to_db.php on line 8 I actualle get this messagebox twice, not sure why. Line 8 in 'save_to_db.php' is this one: mysql_query("UPDATE brukere SET brukerBakgrunn = $uploadstring WHERE brukerID=" .$_SESSION['id'] .""; Not sure if you need to see db_connect.php as well. I can add that later if you need to see it. So what am I missing here?
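
    A hedged sketch of the PHP side with the two most visible problems addressed: the mysql_query() call is missing its closing parenthesis (hence the parse error on line 8), and the filename needs quoting and escaping inside the UPDATE. It also assumes the jQuery call posts the value as data: { filename: filename } so that $_POST['filename'] is actually populated, and that the session is started inside dbconnect.php.

        <?php
        require("dbconnect.php");

        $uploadstring = mysql_real_escape_string($_POST['filename']);
        $id = (int) $_SESSION['id'];

        // String values must be quoted in SQL; the id is cast to int as a cheap guard.
        mysql_query("UPDATE brukere SET brukerBakgrunn = '$uploadstring' WHERE brukerID = $id");
        mysql_close();

        echo $uploadstring;   // echoed back so the ajax success handler has something to display
        ?>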

    Read the article

  • Issue with SQL query for activity stream/feed

    - by blabus
    I'm building an application that allows users to recommend music to each other, and am having trouble building a query that would return a 'stream' of recommendations that involve both the user themselves, as well as any of the user's friends. This is my table structure: Recommendations ID Sender Recipient [other columns...] -- ------ --------- ------------------ r1 u1 u3 ... r2 u3 u2 ... r3 u4 u3 ... Users ID Email First Name Last Name [other columns...] --- ----- ---------- --------- ------------------ u1 ... ... ... ... u2 ... ... ... ... u3 ... ... ... ... u4 ... ... ... ... Relationships ID Sender Recipient Status [other columns...] --- ------ --------- -------- ------------------ rl1 u1 u2 accepted ... rl2 u3 u1 accepted ... rl3 u1 u4 accepted ... rl4 u3 u2 accepted ... So for user 'u4' (who is friends with 'u1'), I want to query for a 'stream' of recommendations relevant to u4. This stream would include all recommendations in which either the sender or recipient is u4, as well as all recommendations in which the sender or recipient is u1 (the friend). This is what I have for the query so far: SELECT * FROM recommendations WHERE recommendations.sender IN ( SELECT sender FROM relationships WHERE recipient='u4' AND status='accepted' UNION SELECT recipient FROM relationships WHERE sender='u4' AND status='accepted') OR recommendations.recipient IN ( SELECT sender FROM relationships WHERE recipient='u4' AND status='accepted' UNION SELECT recipient FROM relationships WHERE sender='u4' AND status='accepted') UNION SELECT * FROM recommendations WHERE recommendations.sender='u4' OR recommendations.recipient='u4' GROUP BY recommendations.id ORDER BY datecreated DESC Which seems to work, as far as I can see (I'm no SQL expert). It returns all of the records from the Recommendations table that would be 'relevant' to a given user. However, I'm now having trouble also getting data from the Users table as well. The Recommendations table has the sender's and recipient's ID (foreign keys), but I'd also like to get the first and last name of each as well. I think I require some sort of JOIN, but I'm lost on how to proceed, and was looking for help on that. (And also, if anyone sees any areas for improvement in my current query, I'm all ears.) Thanks!
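
    For the name lookup, a sketch of joining Users twice, once as the sender and once as the recipient; the first/last name column names are assumptions, and the existing friendship filter from the query above would stay in the WHERE clause (it is shortened here to keep the example readable).

        SELECT r.*,
               s.firstname AS sender_firstname,    s.lastname AS sender_lastname,
               t.firstname AS recipient_firstname, t.lastname AS recipient_lastname
        FROM recommendations r
        JOIN users s ON s.id = r.sender        -- sender's profile
        JOIN users t ON t.id = r.recipient     -- recipient's profile
        WHERE r.sender = 'u4' OR r.recipient = 'u4'   -- plus the relationship subqueries from above
        ORDER BY r.datecreated DESC;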

    Read the article

  • two sql queries in one place

    - by Luke
    <?php $results = mysql_query("SELECT * FROM ".TBL_SUB_RESULTS." WHERE user_submitted != '$_SESSION[username]' AND home_user = '$_SESSION[username]' OR away_user = '$_SESSION[username]' ") ; $num_rows = mysql_num_rows($results); if ($num_rows > 0) { while( $row = mysql_fetch_assoc($results)) { extract($row); $q = mysql_query("SELECT name FROM ".TBL_FRIENDLY." WHERE id = '$ccompid'"); while( $row = mysql_fetch_assoc($q)) { extract($row); ?> <table cellspacing="10" style='border: 1px dotted' width="300" bgcolor="#eeeeee"> <tr> <td><b><? echo $name; ?></b></td> </tr><tr> <td width="100"><? echo $home_user; ?></td> <td width="50"><? echo $home_score; ?></td> <td>-</td> <td width="50"><? echo $away_score; ?></td> <td width="100"><? echo $away_user; ?></td> </tr><tr> <td colspan="2"><A HREF="confirmresult.php?fixid=<? echo $fix_id; ?>">Accept / Decline</a></td> </tr></table><br> <? } } } else { echo "<b>You currently have no results awaiting confirmation</b>"; } ?> I am trying to run two queries as you can see. But they aren't both working. Is there a better way to structure this, I am having a brain freeze! Thanks OOOH by the way, my SQL wont stay in this form! I will protect it afterwards
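
    One way to avoid running a query inside the result loop is a single JOIN that returns the friendly-match name with each pending result. This is a sketch only: 'session_username' is a literal placeholder for the bound session value, TBL_SUB_RESULTS/TBL_FRIENDLY stand in for the PHP constants, and ccompid is assumed to be a column of the results table (as the extract() usage suggests). Note the parentheses around the OR: without them the original WHERE mixes AND/OR precedence in a way that is probably not intended.

        SELECT r.*, f.name
        FROM TBL_SUB_RESULTS AS r
        JOIN TBL_FRIENDLY    AS f ON f.id = r.ccompid
        WHERE r.user_submitted <> 'session_username'
          AND (r.home_user = 'session_username' OR r.away_user = 'session_username');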

    Read the article

  • Week in Geek: USDA Chooses Microsoft for Cloud Services Edition

    - by Asian Angel
    This week we learned how to create geeky LED holiday lights with old bottles, dig deeper in Windows Defrag via the command prompt, use Google Chrome’s drag/drop feature to upload files easier, find great gift recommendations by looking through the How-To Geek holiday gift guide, and have fun adding Merry Christmas fonts to our computers. Photo by ntr23. Random Geek Links It has been a busy week, so we have extra news link goodness with information that is good for you to know. USDA making the move to Microsoft The U.S. Department of Agriculture has announced that it has chosen Microsoft to host things like e-mail, instant messaging, and collaboration through the software giant’s Business Productivity Online Suite. Google says it was cut off from USDA project bid Google is claiming that it was not given a chance to bid on a cloud-computing project for the U.S. Department of Agriculture, for which the contract was awarded to rival Microsoft. Apache is being forced into a Java Fork When Oracle rolled over Apache and Google’s objections to its Java plans in December, the scene was set for Apache to leave and, eventually, force a Java code fork. Tumblr explains daylong outage After experiencing an outage that started on Sunday afternoon and stretched through most of the day yesterday, Tumblr has explained what happened. Google demos Chrome OS, launches pilot program During a press briefing this week in San Francisco, Google launched the Chrome application store and demonstrated Chrome OS, its browser-centric netbook operating system. Don’t expect Spotify in U.S. this holiday season As of last week, Spotify had yet to sign a single licensing deal with a major label, after spending more than a year negotiating, multiple music sources told CNET. December 2010 Patch Tuesday will come with most bulletins ever According to the Microsoft Security Response Center, Microsoft will issue 17 Security Bulletins addressing 40 vulnerabilities on Tuesday, December 14. It will also host a webcast to address customer questions the following day. Hacker plants back door in Symbian firmware Indian hacker Atul Alex has had a look at the firmware for Symbian S60 smartphones and come up with a back door for it. PC quarantines raise tough complexities The concept of quarantining PCs to prevent widespread infection is “interesting, but difficult to implement, with far too many problems”, said security experts. Symantec: DDoS attacks hard to defend It has surfaced that the distributed denial of service (DDoS) attacks on Visa and MasterCard Web sites on Wednesday were carried out by a toolkit known as low orbit ion cannon (LOIC). Web Sockets and the risks of unfinished standards Enthusiasm for a promising new standard called Web Sockets has quickly cooled in some quarters as a potential security problem led some browser makers to hastily postpone support. Internet Explorer 9 to get tracking protection Microsoft is making changes to Internet Explorer 9’s security features that will better enable users to keep sites from tracking their activity across browsing sessions. NASA sold PCs with sensitive data NASA failed to remove sensitive data from computers that it sold, according to an audit report released this week. Cybercrooks create fake Amazon receipts The bad guys have created yet another online scam, this one involving fake Amazon receipts. 
World of Warcraft character move fees waived Until December 22, Blizzard will allow free realm transfers from 25 highly populated servers to alleviate log-in queues or performance issues. (The free transfers are one-way and one-time only.) SpaceX Dragon reaches orbit atop a Falcon with a fiery tail The Space Exploration Technologies corporation has become the first nongovernmental entity to put a vehicle into low Earth orbit. Geek Video of the Week If birds have wings, then why are the Angry Birds using slingshots? Photo by Dorkly Bits. Wait… Birds have Wings, Why are the Angry Ones Using Slingshots? Sysadmin Geek Tips How To Setup Email Alerts on Linux Using Gmail or SMTP Linux machines may require administrative intervention in countless ways, but without manually logging into them how would you know about it? Here’s how to setup emails to get notified when your machines want some tender love and attention. Random TinyHacker Links Red Panda Webcam Support Firefox and the Knoxville Zoo’s Red Panda program. Christmas Icons (Icons we like) Superb set of holiday icons by lgp85 at deviantArt. Download the .zip and use as .png or convert to .ico at Convertico.com or with tiny app Imagicon. Super User Questions Enjoy reading the great answers to this week’s popular questions from Super User Useful USB boot disks? DVD/CD burning .zip: is it more reliable, faster, longer lasting to burn a zip of files rather than the files as a folder? What are other ways to backup my files if I do not have an external drive? Anti virus what is the difference between these all? How can I block all Facebook elements/content? How-To Geek Weekly Article Recap Have you had a busy week between work and preparing for the holidays? Get caught up on your HTG reading with our hottest articles of the week. 20 Windows Keyboard Shortcuts You Might Not Know The 50 Best Registry Hacks that Make Windows Better LCD? LED? Plasma? The How-To Geek Guide to HDTV Technology HTG Explains: Which Linux File System Should You Choose? How to Use and Customize Google Chrome Web Apps One Year Ago on How-To Geek This week’s batch of retro geeky goodness is all about customizing Windows 7. ClassicShell Adds Classic Start Menu and Explorer Features to Windows 7 Get an Aero-Styled Classic Start Menu in Windows 7 Customize the Windows 7 Logon Screen Get the Classic Style Network Activity Indicator Back in Windows 7 How To Enable Check Boxes for Items In Windows 7 The Geek Note We would like you to join us in welcoming Jason Fitzpatrick to the writing staff here at How-To Geek. He started with us this past week, so take some time to read through his articles about the Wii, Kindle, & PlayStation 2 Peripherals and leave a friendly comment to say “Hi”! Got a great tip to share? Make sure to send it in to us at [email protected]. Photo by real00. Latest Features How-To Geek ETC The 50 Best Registry Hacks that Make Windows Better The How-To Geek Holiday Gift Guide (Geeky Stuff We Like) LCD? LED? Plasma? The How-To Geek Guide to HDTV Technology The How-To Geek Guide to Learning Photoshop, Part 8: Filters Improve Digital Photography by Calibrating Your Monitor Our Favorite Tech: What We’re Thankful For at How-To Geek Settle into Orbit with the Voyage Theme for Chrome and Iron Awesome Safari Compass Icons Set Escape from the Exploding Planet Wallpaper Move Your Tumblr Blog to WordPress Pytask is an Easy to Use To-Do List Manager for Your Ubuntu System Snowy Christmas House Personas Theme for Firefox

    Read the article

  • Spatial data in the UK

    - by simonsabin
    I am just loving the fact that the Ordnance Survey has now released a huge amount of data that can be used freely. I've downloaded the Panorama (tm) data http://www.ordnancesurvey.co.uk/oswebsite/products/land-form-panorama-contours/index.html , which is all the contours for the UK. This I've loaded into SQL Server using Safe Software's FME ( http://www.safe.com/ ). This is because the data is an Autocad DXF file and translating that to SQL Server spatial data is not easy. The FME workbench is not...(read more)

    Read the article

  • SQL Saturday Richmond, VA

    - by Mike
    Very excited to announce that I'll be holding 2 sessions at SQL Saturday in VA on April 10th. If there are any frequent readers of SQLTeam.com attending, please make sure to say hi! Topics I'm covering are partitioning & loading data in real time and an introduction to performance tuning. Hope to see you there! SQL Saturday Richmond Schedule

    Read the article

  • Enjoy Discounts as High as 25 Percent on Online Training Products in the Microsoft Training Catalog

    Visit the Microsoft Training Catalog to find training and certification resources for Microsoft technologies including SharePoint 2010 and the .NET Framework. Receive discounts on the purchase of online training products in the catalog.

    Read the article

  • Working with Temporal Data in SQL Server

    - by Dejan Sarka
    My third Pluralsight course, Working with Temporal Data in SQL Server, has been published. I am really proud of the second part of the course, where I discuss optimization of temporal queries. This was a nearly impossible task for decades; the first solutions appeared only recently. I present six solutions altogether (and one more that is not really a solution), and I invented four of them. http://pluralsight.com/training/Courses/TableOfContents/working-with-temporal-data-sql-server

    Read the article

  • Microsoft F#

    - by Aamir Hasan
    F# brings a type-safe, succinct, efficient and expressive functional programming language to the .NET platform. It is a simple and pragmatic language, with particular strengths in data-oriented programming, parallel I/O programming, parallel CPU programming, scripting and algorithmic development. F# cannot solve any problem that C# could not, but it is a functional, statically typed language that also supports object-oriented programming. References: http://msdn.microsoft.com/en-us/fsharp/cc835246.aspx http://research.microsoft.com/en-us/um/cambridge/projects/fsharp/

    Read the article

  • An overview of Microsoft Online Services, by Michaël Todorovic

    An overview of Microsoft Online Services, by Michaël Todorovic. Quote: This article presents the Microsoft Online Services platform through the BPOS (Business Productivity Online Standard) offering. You can give your opinion on this article by replying to this discussion, and give it a score by rating the discussion. If you run into problems with the ...

    Read the article

  • Microsoft forced to drop the "SkyDrive" name following a court ruling in favour of BSkyB

    Microsoft forced to drop the "SkyDrive" name following a court ruling in favour of BSkyB. Microsoft will have to change the "SkyDrive" name used by its cloud file-storage platform. The same scenario is playing out as with the "Metro" name, which the software giant had to abandon for the Windows 8 user interface. On 28 June, a court ruled that Microsoft was infringing the rights of the British broadcasting group BSkyB to the "Sky" trademark. BSkyB owns several television channels bearing the pr...

    Read the article

  • Microsoft Access 2007 Certification

    Certification doesn't make one an IT superhero, but it's something every developer should consider. Some might argue that there aren't any certifications for Microsoft Access application developers, and they would be correct; however, the Microsoft Access 2007 Application Specialist (MCAS) exam might prove helpful.

    Read the article

  • Finalists for the Microsoft Accelerator for Windows Azure

    - by ScottGu
    Today, I am pleased to announce the ten finalists for the Microsoft Accelerator for Windows Azure powered by TechStars. These startups are about to launch into a three-month program where they will develop new products and businesses using Windows Azure. The response to the program has been fantastic - we received nearly 600 applications from entrepreneurs in 69 countries around the world, spanning a host of industries including retail, travel, entertainment, banking, real estate and more.  There were so many innovative ideas and amazing teams that it really made the selection process hard.  We finally landed on 10 finalists, based on their experience, qualifications, and innovative business ideas built on the cloud. This fall’s Windows Azure class includes: Advertory – Berlin, Germany. Advertory helps local businesses increase revenue and build customer loyalty. Appetas – Seattle, WA. Appetas' mission is to make restaurants look as beautiful online as they do on the plate! BagsUp – Sydney, Australia. Find great places from people you trust. Embarke – San Diego, CA. Embarke allows developers and companies the ability to integrate with any human communication channel (Facebook, Email, Text Message, Twitter) without having to learn the specifics, write code, or spend time on any of them. Fanzo – Seattle, WA. Fanzo puts sports fans in the spotlight. Find other fans, show off your fanswagger and get rewarded for your passion. MetricsHub – Bellevue, WA. A service providing cloud monitoring with incident detection and prebuilt workflows for remedying common problems. Mobilligy – Bellevue, WA. Mobilligy revolutionizes how people pay their bills by bringing convenient, secure, and instant bill payment support to mobile devices. Realty Mogul – Los Angeles, CA. Realty Mogul is a crowdfunding platform for real estate where accredited investors pool capital and invest in properties that are acquired, managed and eventually resold by professional private real estate companies and their management teams. Staq – San Francisco, CA. Back-end as a service for APIs. Socedo – Bellevue, WA. A simple and effective web application for lead generation and relationship management on Twitter. Each startup will be hosted in Seattle and mentored by entrepreneurs and venture capitalists as well as leaders from Windows Azure and other Microsoft organizations. The teams will spend the first month ideating and refining their business concepts with input and advice from their mentors as well as Microsoft customers, followed by two months of design and development. They will present their results to investors and Microsoft partners at an event in mid-January. We are really looking forward to seeing how their businesses evolve.  These teams have demonstrated incredible energy, passion, and innovative capabilities – and they are ready to show the world what’s possible with Windows Azure. Thanks, Scott P.S. And if you are new to Twitter you can also optionally follow me: @scottgu

    Read the article

  • Making Use of Plan Explorer in my own Environment

    - by Jonathan Kehayias
    Back in October 2010, I briefly blogged about the SQL Sentry Plan Explorer in my blog post wrap-up for SQL Bits 7 and how impressed I was with what I saw, from an Alpha demo standpoint, from Greg Gonzalez ( Blog | Twitter ) while I was at SQLBits 7 in York.  To be 100% honest and transparent, Greg gave me early access to this tool after discussing it at SQLBits 7, and I had the opportunity to test a number of pre-Beta releases where I was able to offer significant feedback and submit bugs in the...(read more)

    Read the article

  • Microsoft Claims Success Versus Autorun Malware

    Microsoft recently used a post on the Threat Research and Response Blog section of its Malware Protection Center to describe how it is winning the battle against autorun malware. Although there is certainly no shortage of malware in the virtual world, Microsoft has plenty of statistics to back its claims and it has displayed them with pride.

    Read the article

  • Some thoughts on the Virtualization Feedback in the SSWUG Newsletters

    - by Jonathan Kehayias
    Last Thursday, March 25, 2010, the topic of virtualization of SQL Server came up in the SSWUG Newsletter, with Steven Wynkoop asking if people's perceptions and experiences have changed since the last time he covered virtualizing SQL Server.  I unfortunately missed the last coverage of this topic, but it appears from the newsletter that there was a general consensus that a “low-traffic solution could be fine, but if you had a heavy-hitting application, the net advice was to avoid a virtual environment...(read more)

    Read the article
