Search Results

Search found 24674 results on 987 pages for 'visual studio 2005 tips'.


  • SQL Server 2005: Insert a row in a table and update the same row

    - by vikas
    I have a table with the columns: pkey (a GUID), annualpay, datefrom, dateto (NULL means the current record) and percentannualincrease. The percent annual increase should be calculated only when the newly inserted pay differs from the last previously existing differing value: percentannualincrease = ((newannualpay - previous differing pay) / newannualpay) * 100. For example:

        newid(), 5000, today,   null*, 0    -- very first row
        newid(), 5000, today+1, null*, 0
        newid(), 5500, today+2, null,  ???  -- needs to be calculated before insert

    (* the insert should close the previous record by updating its dateto from NULL to today's date.) How can I do this stuff in a trigger?
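
    A sketch of one possible INSTEAD OF INSERT trigger (the table name PayHistory and the MONEY types are assumptions, since the question does not name them; it also assumes single-row inserts, as in the example):

        CREATE TRIGGER trg_PayHistory_Insert
        ON dbo.PayHistory
        INSTEAD OF INSERT
        AS
        BEGIN
            DECLARE @newpay MONEY, @lastpay MONEY;

            SELECT @newpay = annualpay FROM inserted;

            -- Most recent pay that differs from the newly inserted value
            SELECT TOP 1 @lastpay = annualpay
            FROM dbo.PayHistory
            WHERE annualpay <> @newpay
            ORDER BY datefrom DESC;

            -- Close the current record
            UPDATE dbo.PayHistory
            SET dateto = GETDATE()
            WHERE dateto IS NULL;

            -- Insert the new current record with the calculated percentage
            INSERT INTO dbo.PayHistory
                (pkey, annualpay, datefrom, dateto, percentannualincrease)
            SELECT NEWID(), @newpay, GETDATE(), NULL,
                   CASE WHEN @lastpay IS NULL THEN 0
                        ELSE ((@newpay - @lastpay) / @newpay) * 100
                   END;
        END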

    Read the article

  • Visual Studio 2005 Build Order

    - by user77826
    I have a web application solution. Sometimes I want to debug a console app that is within the solution. Why is it that when I right-click on it and select Debug, it builds every single web page and library, which takes a while? When I look at the build dependencies for the console app, it lists everything in the solution and tells me to go to the build order to change it. In the build order everything checks out: I only have the libraries that this console app needs checked. I also tried setting this console app as the startup project. Same thing. How do I get it so that when I want to debug this console app, only that project is built and run, instead of the entire solution being built first?

    Read the article

  • Execute Statements in a Transaction - Sql Server 2005

    - by Shrewd Demon
    Hi, I need to update a database where some of the tables have changed (columns have been added). I want to perform this action in a proper transaction: if the code executes without any problem I will commit the changes, otherwise I will roll the database back to its original state. I want to do something like this:

        BEGIN TRANSACTION
        -- ...execute some SQL statements here...
        COMMIT TRANSACTION   -- when everything goes well
        ROLLBACK TRANSACTION -- when something goes wrong

    Please tell me the best way to do this. I know there is a @@TRANCOUNT variable but I don't know its exact purpose. Thanks.
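
    One common pattern on SQL Server 2005 uses TRY/CATCH (a sketch; for reference, @@TRANCOUNT simply reports how many BEGIN TRANSACTION statements are currently open on the connection):

        BEGIN TRY
            BEGIN TRANSACTION;

            -- ...execute some SQL statements here...

            COMMIT TRANSACTION;
        END TRY
        BEGIN CATCH
            -- Roll back only if a transaction is still open
            IF @@TRANCOUNT > 0
                ROLLBACK TRANSACTION;

            -- Re-raise the error so the caller knows the script failed
            DECLARE @msg NVARCHAR(2048);
            SELECT @msg = ERROR_MESSAGE();
            RAISERROR(@msg, 16, 1);
        END CATCH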

    Read the article

  • Multiple column sorting (SQL SERVER 2005)

    - by Newbie
    I have a table which looks like this:

        Col1 col2 col3 col4 col5
        1    5    1    4    6
        1    4    0    3    7
        0    1    5    6    3
        1    8    2    1    5
        4    3    2    1    4

    The script is:

        declare @t table(col1 int, col2 int, col3 int, col4 int, col5 int)
        insert into @t
        select 1,5,1,4,6 union all
        select 1,4,0,3,7 union all
        select 0,1,5,6,3 union all
        select 1,8,2,1,5 union all
        select 4,3,2,1,4

    I want the output to have every column sorted in ascending order independently, i.e.

        Col1 col2 col3 col4 col5
        0    1    0    1    3
        1    3    1    1    4
        1    4    2    3    5
        1    5    2    4    6
        4    8    5    6    7

    I have already solved the problem with the following query:

        Select x1.col1, x2.col2, x3.col3, x4.col4, x5.col5
        From (Select Row_Number() Over(Order By col1) rn1, col1 From @t) x1
        Join (Select Row_Number() Over(Order By col2) rn2, col2 From @t) x2 On x1.rn1 = x2.rn2
        Join (Select Row_Number() Over(Order By col3) rn3, col3 From @t) x3 On x1.rn1 = x3.rn3
        Join (Select Row_Number() Over(Order By col4) rn4, col4 From @t) x4 On x1.rn1 = x4.rn4
        Join (Select Row_Number() Over(Order By col5) rn5, col5 From @t) x5 On x1.rn1 = x5.rn5

    But I am not happy with this solution. Is there a better way to achieve the same result using a set-based approach? If so, could anyone please show an example? Thanks
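
    One alternative that stays set-based is to UNPIVOT the values, number them per column, and PIVOT them back (a sketch against the @t table variable above; note that UNPIVOT discards NULLs, which does not matter for this data):

        SELECT col1, col2, col3, col4, col5
        FROM (
            SELECT col, val,
                   ROW_NUMBER() OVER (PARTITION BY col ORDER BY val) AS rn
            FROM @t
            UNPIVOT (val FOR col IN (col1, col2, col3, col4, col5)) AS u
        ) AS s
        PIVOT (MAX(val) FOR col IN (col1, col2, col3, col4, col5)) AS p
        ORDER BY rn;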

    Read the article

  • Unexpected behaviour of Order by clause(SQL SERVER 2005)

    - by Newbie
    I have a table which looks like this:

        Col1 col2 col3 col4 col5
        1    5    1    4    6
        1    4    0    3    7
        0    1    5    6    3
        1    8    2    1    5
        4    3    2    1    4

    The script is:

        declare @t table(col1 int, col2 int, col3 int, col4 int, col5 int)
        insert into @t
        select 1,5,1,4,6 union all
        select 1,4,0,3,7 union all
        select 0,1,5,6,3 union all
        select 1,8,2,1,5 union all
        select 4,3,2,1,4

    If I do a sorting (ascending), the output is:

        Col1 col2 col3 col4 col5
        0    1    5    6    3
        1    4    0    3    7
        1    5    1    4    6
        1    8    2    1    5
        4    3    2    1    4

    The query is:

        Select * from @t order by col1, col2, col3, col4, col5

    But as can be seen, the sorting output is wrong (col2 to col5). Why so, and how to overcome this? Thanks in advance

    Read the article

  • Declare global variables for a batch of execution statements - sql server 2005

    - by Shrewd Demon
    Hi, I have a SQL script that tries to update a table on the client's machine. The script is as follows:

        BEGIN TRANSACTION

        DECLARE @CreatedBy INT

        SELECT @CreatedBy = [User_Id]
        FROM Users
        WHERE UserName = 'Administrator'

        PRINT @CreatedBy -- works fine here and shows me the output

        PRINT N'Rebuilding [dbo].[Some_Master]'
        ALTER TABLE [dbo].[Some_Master]
            ADD [CreatedBy] [BIGINT] NULL,
                [Reason] [VARCHAR](200) NULL
        GO

        PRINT @CreatedBy -- does not work here and throws an error

        PRINT N'Updating data in [Some_Master] table'
        UPDATE Some_Master SET CreatedBy = @CreatedBy

        COMMIT TRANSACTION

    But I am getting the following error: Must declare the scalar variable "@CreatedBy". I have observed that if I put the PRINT statement above the ALTER command it works fine and shows me the value, but if I try to print the value after the ALTER command it throws the error above. I don't know why. Please help! Thank you
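
    The cause is that GO is a batch separator and a DECLAREd variable only lives for the duration of one batch (the ALTER TABLE needs its own batch here so that the later UPDATE compiles against the new column). One workaround is to carry the value across the batch boundary in a temp table; a sketch:

        BEGIN TRANSACTION

        -- Temp tables, unlike variables, survive the GO batch boundary
        SELECT [User_Id] AS CreatedBy
        INTO #vars
        FROM Users
        WHERE UserName = 'Administrator'

        ALTER TABLE [dbo].[Some_Master]
            ADD [CreatedBy] [BIGINT] NULL,
                [Reason] [VARCHAR](200) NULL
        GO

        -- New batch: re-declare the variable and reload it
        DECLARE @CreatedBy INT
        SELECT @CreatedBy = CreatedBy FROM #vars

        UPDATE Some_Master SET CreatedBy = @CreatedBy

        DROP TABLE #vars
        COMMIT TRANSACTION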

    Read the article

  • T-SQL 2005 - Divide by zero error encountered

    - by Grant
    Hi, I am trying to get some percentage data from a stored procedure using code similar to the line below. Obviously this is going to cause a problem (Divide by zero error encountered) when base.[XXX_DataSetB] returns 0, which may happen some of the time. Does anyone know how I can deal with this in the most efficient manner? Note: there would be about 20+ lines looking like the one below...

        cast((base.[XXX_DataSetB] - base.[XXX_DataSetA]) as decimal) / base.[XXX_DataSetB] as [XXX_Percentage]
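
    A common idiom is to let NULLIF turn the zero denominator into NULL, so the division returns NULL instead of raising an error, and then map the NULL back to 0 if required (a sketch based on the line above):

        ISNULL(
            CAST(base.[XXX_DataSetB] - base.[XXX_DataSetA] AS DECIMAL(18, 4))
                / NULLIF(base.[XXX_DataSetB], 0),
            0) AS [XXX_Percentage]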

    Read the article

  • subqueries in UPDATE SET (sql server 2005)

    - by itdebeloper
    I have a question about using subqueries in an UPDATE statement. My example:

        UPDATE TRIPS
        SET locations = city + ', '
        FROM (select Distinct city from poi where poi.trip_guid = trips.guid)

    Is it possible to refer to a value from the table being updated (trips.guid) in the subquery? When I try to use trips.guid I get the error: "The multi-part identifier "trips.guid" could not be bound."
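
    The FROM clause of an UPDATE cannot be correlated back to the target table that way; the usual T-SQL pattern is to join the target into the FROM clause instead (a sketch using the tables named above; if a trip has several distinct cities, one of them is applied arbitrarily):

        UPDATE t
        SET    t.locations = p.city + ', '
        FROM   TRIPS AS t
        JOIN   (SELECT DISTINCT trip_guid, city FROM poi) AS p
               ON p.trip_guid = t.guid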

    Read the article

  • Strange Locking Behaviour in SQL Server 2005

    - by SQL Learner
    Can anyone please tell me why the following statement inside a given stored procedure returns repeated results, even with locks on the rows used by the first SELECT statement?

        BEGIN TRANSACTION

        DECLARE @Temp TABLE ( ID INT )

        INSERT INTO @Temp
        SELECT ID FROM SomeTable WITH (ROWLOCK, UPDLOCK, READPAST)
        WHERE SomeValue <= 10

        INSERT INTO @Temp
        SELECT ID FROM SomeTable WITH (ROWLOCK, UPDLOCK, READPAST)
        WHERE SomeValue >= 5

        SELECT * FROM @Temp

        COMMIT TRANSACTION

    Any values in SomeTable for which SomeValue is between 5 and 10 will be returned twice, even though they were locked in the first SELECT. I thought that locks were in place for the whole transaction, and so I wasn't expecting the query to return repeated results. Why is this happening?

    Read the article

  • Is this possible in sql server 2005?

    - by chandru_cp
    These are my queries:

        select ClientName, ClientMobNo from Clients
        select DriverName, DriverMobNo from Drivers

    They give me two result tables, but I want to combine both results into a single table. I tried UNION and UNION ALL, but they don't give me what I want. Note: there is no relationship between the two tables. There may be 200 clients and 100 drivers.
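
    If the aim is to show the two unrelated result sets side by side as four columns, one approach is to number the rows in each set and join on the row number (a sketch; rows are paired arbitrarily, and the driver columns return NULL once the 100 drivers run out):

        SELECT c.ClientName, c.ClientMobNo, d.DriverName, d.DriverMobNo
        FROM (SELECT ClientName, ClientMobNo,
                     ROW_NUMBER() OVER (ORDER BY ClientName) AS rn
              FROM Clients) AS c
        FULL OUTER JOIN
             (SELECT DriverName, DriverMobNo,
                     ROW_NUMBER() OVER (ORDER BY DriverName) AS rn
              FROM Drivers) AS d
          ON c.rn = d.rn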

    Read the article

  • How to convert lots of database files from MSSQL 2000 to MSSQL 2005?

    - by Tech
    Hi all, I am moving from MSSQL 2000 to MSSQL 2005, and I found an article on the web about it: http://www.aspfree.com/c/a/MS-SQL-Server/Moving-Data-from-SQL-Server-2000-to-SQL-Server-2005/ It works, but the problem is that it only moves databases one by one. Because I have so many databases, is there an easier way to do this, or a batch process or utility that would let me do so? Thank you.

    Read the article

  • SSAS 2008 R2 - Little Gems

    - by ACALVETT
    I have spent the last few days working with SSAS 2008 R2 and noticed a few small enhancements that many people probably won't notice, but I will list them here and explain why they are important to me. New profiler events - Commit: this is a new subclass event for "Progress Report End". It represents the elapsed time taken for the server to commit your data. It is important because, for the duration of this event, a server-level lock is in place blocking all incoming connections and causing time out...(read more)

    Read the article

  • SSIS - XML Source Script

    - by simonsabin
    The XML Source in SSIS is great if you have a 1 to 1 mapping between entity and table. You can do more complex mapping, but it becomes very messy and won't perform. What other options do you have?

    The challenge with XML processing is to not need a huge amount of memory. I remember using the early versions of Biztalk, which loaded the whole document into memory to map from one document type to another. This was fine for small documents but was an absolute killer for large documents. You therefore need a streaming approach.

    For flexibility, however, you want to be able to generate your rows easily, and if you've ever used the XmlReader you will know it's ugly code to write. That brings me on to LINQ. There is an implementation of LINQ over XML which is really nice. You can write nice LINQ queries instead of the XmlReader stuff. The downside is that by default LINQ to XML requires a whole XML document to work with. No streaming. Your code would look like this: we create an XDocument and then enumerate over a set of anonymous types we generate from our LINQ statement.

        XDocument x = XDocument.Load("C:\\TEMP\\CustomerOrders-Attribute.xml");

        foreach (var xdata in (from customer in x.Elements("OrderInterface").Elements("Customer")
                               from order in customer.Elements("Orders").Elements("Order")
                               select new { Account = customer.Attribute("AccountNumber").Value,
                                            OrderDate = order.Attribute("OrderDate").Value }))
        {
            Output0Buffer.AddRow();
            Output0Buffer.AccountNumber = xdata.Account;
            Output0Buffer.OrderDate = Convert.ToDateTime(xdata.OrderDate);
        }

    As I said, the downside to this is that you are loading the whole document into memory. I did some googling and came across some helpful videos from a nice UK DPE, Mike Taulty: http://www.microsoft.com/uk/msdn/screencasts/screencast/289/LINQ-to-XML-Streaming-In-Large-Documents.aspx. They show you how you can combine LINQ and the XmlReader to get a semi-streaming approach. I took what he did and implemented it in SSIS.

    What I found odd was that when I ran it I got different numbers between the streamed and non-streamed versions. I found the cause was a little bug in Mike's code that causes the pointer in the XmlReader to progress past the start of the element and thus skip elements. The streaming version looks like this:

        foreach (var xdata in (from customer in StreamReader("C:\\TEMP\\CustomerOrders-Attribute.xml", "Customer")
                               from order in customer.Elements("Orders").Elements("Order")
                               select new { Account = customer.Attribute("AccountNumber").Value,
                                            OrderDate = order.Attribute("OrderDate").Value }))
        {
            Output0Buffer.AddRow();
            Output0Buffer.AccountNumber = xdata.Account;
            Output0Buffer.OrderDate = Convert.ToDateTime(xdata.OrderDate);
        }

    These look very similar, and they are; the key element is the method we are calling, StreamReader. This method is what gives us streaming: it returns an enumerable list of elements, and because of the way that LINQ works this results in the data being streamed in.
        static IEnumerable<XElement> StreamReader(String filename, string elementName)
        {
            using (XmlReader xr = XmlReader.Create(filename))
            {
                xr.MoveToContent();
                while (xr.Read()) // reads the first element
                {
                    while (xr.NodeType == XmlNodeType.Element && xr.Name == elementName)
                    {
                        XElement node = (XElement)XElement.ReadFrom(xr);
                        yield return node;
                    }
                }
                xr.Close();
            }
        }

    This code is specifically designed to return a list of the elements with a specific name. The first Read reads the root element, and then the inner while loop checks whether the current element is the type we want. If not, we do the xr.Read() again until we find the element type we want. We then use the neat function XElement.ReadFrom to read an element and all its sub-elements into an XElement. This is what is returned and can be consumed by the LINQ statement. Essentially, once one element has been read, we need to check whether we are still on the same element type and name (the inner loop). This was Mike's mistake: if we called .Read again we would advance the XmlReader beyond the start of the element and so the ReadFrom method wouldn't work.

    So with the code above you can use whatever LINQ statement you like to flatten your XML into the rowsets you want. You could even have multiple outputs and generate your own surrogate keys.

    Read the article

  • From the Tips Box: Pre-installation Prep Work Makes Service Pack Upgrades Smoother

    - by Jason Fitzpatrick
    Last month Microsoft rolled out Windows 7 Service Pack 1 and, like many SP releases, quite a few people are hanging back to see what happens. If you want to update but still err on the side of caution, reader Ron Troy offers a step-by-step guide. Ron's cautious approach does an excellent job of minimizing the number of issues that could crop up in a Service Pack upgrade, by thoroughly updating your driver sets and clearing out old junk before you roll out the update. Read on to see how he does it:

    Just wanted to pass on a suggestion for people worried about installing Service Packs. I came up with a 'method' a couple of years back that seems to work well.

    1. Run Windows / Microsoft Update to get all updates EXCEPT the Service Pack.
    2. Use Secunia PSI to find any other updates you need.
    3. Use CCleaner or the Windows disk cleanup tools to get rid of all the old garbage out there. Make sure that you include old system updates.
    4. Obviously, back up anything you really care about. An image backup can be really nice to have if things go wrong.
    5. Download the correct SP version from Microsoft.com; do not use Windows / Microsoft Update to get it. Make sure you have the 64-bit version if that's what you have installed on your PC.
    6. Make sure that EVERYTHING that affects the OS is up to date. That includes all sorts of drivers, starting with video and audio. If you have an Intel chipset, use the Intel Driver Utility to update those drivers; it's very quick and easy. For the video and audio drivers, some can be updated by Intel, some by utilities on the vendor web sites, and some you just have to figure out yourself. But don't be lazy here; old drivers and Windows Service Packs are a poor mix.
    7. If you have third-party software, check whether it has any updates for you. They might not say they are for the Service Pack, but you cut the risk of things not working if you do this.
    8. Shut off the antivirus software (especially if third-party).
    9. Reboot, hitting F8 to get the Safe Mode menu, and choose Safe Mode with Networking.
    10. Log into the Administrator account to ensure that you have the right to install the SP.
    11. Run the SP. It won't be very fancy this way. Maybe 45 minutes later it will reboot and then finish configuring itself, finally letting you log in.

    Total installation time on most of my PCs was about 1 hour, but that followed hours of preparation on each.

    On a separate note, I recently got on the Nvidia web site and their utility told me I had a new driver available for my GeForce 8600M GS. This laptop had come with Vista and now has Win 7 SP1. I had a big surprise from this driver update; the Windows Experience score on the graphics side went way up. Kudos to Nvidia for doing a driver update that actually helps day-to-day usage. And unlike ATI's updates (which I need for my AGP-based system), this update was fairly quick and very easy. Also, Nvidia drivers have never, as I can recall, given me BSODs, many of which I've gotten from ATI (TDR errors).

    Read the article

  • Have you really fixed that problem?

    - by DavidWimbush
    The day before yesterday I saw our main live server's CPU go up to a constant 100%, with just the occasional short drop to a lower level. The exact opposite of what you'd want to see. We're log shipping every 15 minutes, and part of that involves calling WinRAR to compress the log backups before copying them over. (We're on SQL 2005 so there's no native compression, and we have bandwidth issues with the connection to our remote site.) I realised the log shipping jobs were taking about 10 minutes, and that most of that was spent shipping a 'live' reporting database that is completely rebuilt every 20 minutes. (I'm just trying to keep this stuff alive until I can improve it.) We can rebuild this database in minutes if we have to fail over, so I disabled log shipping of that database. The log shipping went down to less than 2 minutes and I went off to the SQL Social evening in London feeling quite pleased with myself. It was a great evening: fun, educational and thought-provoking. Thanks to Simon Sabin & co for laying that on, and thanks too to the guests for making the effort when they must have been pretty worn out after doing DevWeek all day first. The next morning I came down to earth with a bump: CPU still at 100%. WTF? I looked in the activity monitor, but it was confusing because some sessions had been running for a long time, so it's not a good guide to what's using the CPU now. I tried the standard reports showing queries by CPU (average and total), but they only show the top 10, so they just showed my big overnight archiving and data-cleaning stuff. But the Profiler showed it was four queries used by our new website usage tracking system. Four simple indexes later, the CPU was back where it should be: about 20% with occasional short spikes. So the moral is: even when you're convinced you've found the cause and fixed the problem, you HAVE to go back and confirm that the problem has gone. And, yes, I have checked the CPU again today and it's still looking sweet.

    Read the article

  • SSAS Native v .net Provider

    - by ACALVETT
    Recently I was investigating why a new server, which is in its parallel running phase, was taking significantly longer to process the daily data than the server it is due to replace. The server has SQL Server & SSAS installed, so the problem was not likely to be in the network transfer as it is using shared memory. As I dug around the SQL DMVs I noticed in sys.dm_exec_connections that the SSAS connection had a packet size of 8000 bytes instead of the usual 4096 bytes, and from there I found that the datasource...(read more)
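
    A quick way to see the packet size each current connection negotiated (a sketch using the DMV mentioned above):

        SELECT session_id, net_transport, net_packet_size
        FROM sys.dm_exec_connections;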

    Read the article

  • Change default language settings in Visual Studio 2012

    - by sreejukg
    The first thing you need to do after installing Visual Studio 2012 is to choose your IDE preferences. Once you select your preferred collection of settings, the IDE will always choose dialogs and other options according to your selection. Nowadays developers need to work with different programming environments, and so they might need to reset the default settings. In this article, I am going to demonstrate how you can change the default settings in Visual Studio 2012. For the purpose of this demonstration, I have installed Visual Studio 2012 and selected C++ as my default environment settings. So now when I go to File -> New Project, it gives me C++ templates by default. If you want to select another language, you need to expand the Other Languages section and select C# or VB. Now I am going to change these default settings, so that the default language preference becomes C#. In Visual Studio 2012, go to the Tools menu and select Import and Export Settings. Here you have three options: export the current settings so that they are saved for future use, import previously saved settings, or reset the settings to the defaults. It is a good idea to export your settings so you can import them again at a later stage. To reset the settings to the defaults, select the Reset option and click Next. Visual Studio will now ask whether you would like to save the current settings, which can be used in future to restore them. Select an option and click Next; for the purpose of this demo, I have chosen not to save the settings. Visual Studio will then bring up a dialog similar to the one that appears just after installation, asking you to select your IDE settings. Select the required settings from the available list and click the Finish button. If everything is OK, you will see a success message. Now go to File -> New Project and you will see the selected language appear by default; since I selected C# in the previous step, the New Project dialog now shows C# templates. Changing IDE settings in Visual Studio 2012 is very easy and straightforward.

    Read the article

  • Enabling super single user mode with SQL Server

    - by simonsabin
    I recently got an email from a fellow MVP about single user mode. It made me think about some features I had just been looking at, so I started playing. The annoyance with single user mode in SQL Server is that it's not really single user mode, but more like single connection mode. So how can you get round it? Well, there is an extension to the -m startup option that allows you to specify an application name, so that only connections with that application name can connect. This is very useful if you have...(read more)

    Read the article

  • Scream if you want to go faster

    - by simonsabin
    My session for 24 Hours of PASS on High Performance functions will be starting at 11:00 GMT; that's midnight for folks in the UK. To attend, follow this link: https://www.livemeeting.com/cc/8000181573/join?id=N5Q8S7&role=attend&pw=d2%28_KmN3r The rest of the sessions can be found here: http://www.sqlpass.org/24hours/2010/Sessions/ChronologicalOrder.aspx So far the sessions have been great, so no pressure :( See you there in 4.5 hrs...(read more)

    Read the article
