Search Results

Search found 16890 results on 676 pages for '2008 archive'.

Page 272/676

  • Simple ASP.NET Database Query

    - by Steven
    I have loaded my database with one table under "Data Connections" in the "Server Explorer" pane. What is the standard / best-practices way to handle a simple query in a VB ASPX page? My left <div> would be a set of form elements to filter rows, and when the button is clicked, the main <div> would show the columns I want for the rows returned. Note: Answers in C# are okay too, I'll just translate.

    Read the article

  • C#: standard Windows menu bars in Windows Forms

    - by BoltClock
    I noticed that adding a MenuStrip (from the VS Toolbox) to my form design doesn't yield a menu bar like many native Windows applications. Instead I get a menu bar like VS's own. None of the style settings for MenuStrip appear to mimic the much more common native menu bar. Is there a way to add a menu bar to my Windows Forms application that looks the same as the one you see in Notepad, Task Manager, Windows Explorer and others? (Preferably with the designer, but I wouldn't mind adding it programmatically either.)

    Read the article

  • T-SQL out of order insert

    - by tearman
    Basically I have a user-defined table type (used as a table-valued variable) that I reference in a stored procedure. The procedure effectively just calls two other stored procedures and inserts their results into the table variable, that is:
      INSERT INTO @tableValuedVariable (var1, var2, var3, var4, var5) EXEC [dbo].StoredProcedure1;
      INSERT INTO @tableValuedVariable (var1, var2, var5) EXEC [dbo].StoredProcedure2;
    You can probably already tell what I'm going to ask. StoredProcedure2 only returns a few of the columns the table type is defined to hold, and I'd like the remaining columns to simply be NULL (their defined default). However, SQL Server complains that I'm not supplying values for all of the columns in the table type. The result sets can be quite sizable, so I'd like to avoid loops and the like for obvious reasons. Thanks for any help.
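
    One hedged workaround sketch (the table type name, column names and data types below are placeholders, not taken from the question): stage the narrower result set of StoredProcedure2 in an intermediate table variable whose shape matches its output exactly, then copy it into the typed table variable with an explicit column list so the unlisted columns fall back to their NULL default.

      -- Assumed shapes; adjust to the real table type and procedure output.
      DECLARE @tableValuedVariable dbo.MyTableType;          -- holds var1..var5
      DECLARE @narrow TABLE (var1 int, var2 int, var5 int);  -- matches StoredProcedure2's result set

      -- INSERT ... EXEC has to line up with the procedure's result set column-for-column,
      -- so capture the narrower result set first.
      INSERT INTO @narrow (var1, var2, var5)
      EXEC [dbo].StoredProcedure2;

      -- Now copy it across; var3 and var4 are simply left at their defaults (NULL).
      INSERT INTO @tableValuedVariable (var1, var2, var5)
      SELECT var1, var2, var5
      FROM @narrow;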

    Read the article

  • Help me find a dependency list

    - by Pearl
    I have two tables: an employee table and an employee dependency table. The employee table (E) is populated like this:
      insert into E values(1,'Adam')
      insert into E values(2,'Bob')
      insert into E values(3,'Candy')
      insert into E values(4,'Doug')
      insert into E values(5,'Earl')
      insert into E values(6,'Fran')
    The employee dependency table (Ed) is populated like this:
      insert into Ed values(3,'2')
      insert into Ed values(3,'5')
      insert into Ed values(2,'1')
      insert into Ed values(2,'4')
      insert into Ed values(5,'6')
    I need to produce a dependency list like this:
      Eid  Ename  Dname
      3    Candy  Bob,Fran
    Please help me produce the above.
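
    The question doesn't show the Ed column names or exactly how the desired Dname is derived, so the following is only a sketch of the usual SQL Server 2008 technique for collapsing related rows into one comma-separated column (FOR XML PATH plus STUFF). The Ed column names (Eid, DependsOn) are assumptions, and the join logic would need adapting to however the dependency rows are meant to be interpreted.

      -- Assumed schema: E(Eid, Ename), Ed(Eid, DependsOn).
      SELECT e.Eid,
             e.Ename,
             -- Concatenate the names of everyone this employee depends on.
             STUFF((SELECT ',' + d.Ename
                    FROM Ed
                    JOIN E AS d ON d.Eid = Ed.DependsOn
                    WHERE Ed.Eid = e.Eid
                    FOR XML PATH('')), 1, 1, '') AS Dname
      FROM E AS e
      WHERE EXISTS (SELECT 1 FROM Ed WHERE Ed.Eid = e.Eid);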

    Read the article

  • Getting a Target to run BEFORE anything else runs when building from Visual Studio

    - by damageboy
    Hi, I'm trying to get a one-time, costly target to run only when building a certain top-level project (one that has many dependencies). I have no problem getting this working from a plain msbuild / command-line build; I do it by setting InitialTargets on the project, or alternatively with a <BeforeBuild /> target. The tricky part is Visual Studio. When I build the same project from VS, VS builds the dependencies before even invoking my .csproj, so my target (which affects how the other projects are built) doesn't get to run until they have already been built. Is there some way to force VS to run a target before invoking the dependencies? I'm currently working around this by running the same costly target from my most low-level project (the one that always gets built...) using: Condition=" $(BuildingInsideVisualStudio) " Any ideas on how to get this done "properly"? Again, I'm looking for a solution that will work FROM VS.

    Read the article

  • filling in the holes in the result of a query

    - by ????? ????????
    My query is returning:
      +------+------+------+------+------+------+------+-------+------+------+------+------+-----+
      | Jan  | Feb  | Mar  | Apr  | May  | Jun  | Jul  | Aug   | Sep  | Oct  | Nov  | Dec  | Bla |
      +------+------+------+------+------+------+------+-------+------+------+------+------+-----+
      |    0 |    0 |    0 |    0 |    0 |    0 |    0 |     0 |    0 |    0 |    2 |    0 |  13 |
      |    1 |    0 |    0 |    0 |    0 |    0 |    0 |     0 |    0 |    2 |    0 |    0 |  14 |
      |    0 |    0 |    0 |    0 |    0 |    9 |    0 |     0 |    0 |    0 |    8 |   37 |  29 |
      |    0 |    0 |    0 |    0 |    0 |    0 |    0 |     0 |    0 |    1 |    0 |  374 |  30 |
      |    0 |    0 |    1 |    0 |   78 |    2 |    4 |     8 |   57 |  169 |  116 |  602 |  31 |
      |  156 |  255 |   79 |   75 |  684 |  325 |  289 |   194 |  407 |  171 |  584 |  443 |  32 |
      | 1561 | 2852 | 2056 |  796 | 2004 | 1755 |  879 |  1052 | 1490 | 1683 | 2532 | 2381 |  33 |
      | 4167 | 3841 | 4798 | 3399 | 4132 | 5849 | 3157 |  4381 | 4424 | 4487 | 4178 | 5343 |  34 |
      | 5472 | 5939 | 5768 | 4150 | 7483 | 6836 | 6346 |  6288 | 6850 | 7155 | 5706 | 5231 |  35 |
      | 5749 | 4741 | 5264 | 4045 | 6544 | 7405 | 7524 |  6625 | 6344 | 5508 | 6513 | 3854 |  36 |
      | 5464 | 6323 | 7074 | 4861 | 7244 | 6768 | 6632 |  7389 | 8077 | 8745 | 6738 | 5039 |  37 |
      | 5731 | 7205 | 7476 | 5734 | 9103 | 9244 | 7339 |  8970 | 9726 | 9089 | 6328 | 5512 |  38 |
      | 7262 | 6149 | 8231 | 6654 | 9886 | 9834 | 9306 | 10065 | 9983 | 9984 | 6738 | 5806 |  39 |
      | 5886 | 6934 | 7137 | 6978 | 9034 | 9155 | 7389 |  9437 | 9711 | 8665 | 6593 | 5337 |  40 |
      +------+------+------+------+------+------+------+-------+------+------+------+------+-----+
    As you can see, the Bla column starts at 13. I want it to start from 1, then 2, then 3, and so on, with no gaps in the data. The gaps are there because all of the months are 0 for those specific Bla values. How do I get the result set to include ALL values for Bla, even ones whose months are all 0?
    Here are the desired results:
      +-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+
      | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec | Bla |
      +-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+
      |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   1 |
      |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   2 |
      |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   3 |
      |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   4 |
      |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   5 |
      |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   6 |
      |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   7 |
      |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   8 |
      |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   9 |
      |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |  10 |
      |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |  11 |
      |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |  12 |
      |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |  13 |
      |   1 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   2 |   0 |   0 |  14 |
      |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |   0 |  15 |
      |   … |   … |   … |   … |   … |   … |   … |   … |   … |   … |   … |   … |   … |
      +-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+-----+
    Here's my query:
      WITH CTE AS
      (
          select
              sum(case when datepart(month,[datetime entered]) = 1 then 1 end) as Jan,
              sum(case when datepart(month,[datetime entered]) = 2 then 1 end) as Feb,
              sum(case when datepart(month,[datetime entered]) = 3 then 1 end) as Mar,
              sum(case when datepart(month,[datetime entered]) = 4 then 1 end) as Apr,
              sum(case when datepart(month,[datetime entered]) = 5 then 1 end) as May,
              sum(case when datepart(month,[datetime entered]) = 6 then 1 end) as Jun,
              sum(case when datepart(month,[datetime entered]) = 7 then 1 end) as Jul,
              sum(case when datepart(month,[datetime entered]) = 8 then 1 end) as Aug,
              sum(case when datepart(month,[datetime entered]) = 9 then 1 end) as Sep,
              sum(case when datepart(month,[datetime entered]) = 10 then 1 end) as Oct,
              sum(case when datepart(month,[datetime entered]) = 11 then 1 end) as Nov,
              sum(case when datepart(month,[datetime entered]) = 12 then 1 end) as Dec,
              DATEPART(yyyy,[datetime entered]) as [Year],
              bla = CASE WHEN datediff(d, CAST([datetime entered] as DATE), CAST([datetime completed] as DATE))*24
                              + CONVERT(CHAR(2),[datetime completed],108) > 191
                         THEN 192
                         ELSE datediff(d, CAST([datetime entered] as DATE), CAST([datetime completed] as DATE))*24
                              + CONVERT(CHAR(2),[datetime completed],108)
                    END
              --,datediff(d, CAST([datetime entered] as DATE), CAST([datetime completed] as DATE)) AS Sort_Days,
              --DATEPART(hour, [datetime completed] ) AS Sort_Hours
          from [TurnAround]
          group by
              datediff(d, CAST([datetime entered] as DATE), CAST([datetime completed] as DATE))*24
                  + CONVERT(CHAR(2),[datetime completed],108),
              DATEPART(yyyy,[datetime entered]),
              [datetime entered]
              --[DateTime Completed]
      )
      SELECT
          ISNULL(SUM(Jan),0) Jan,
          ISNULL(SUM(Feb),0) Feb,
          ISNULL(SUM(Mar),0) Mar,
          ISNULL(SUM(Apr),0) Apr,
          ISNULL(SUM(May),0) May,
          ISNULL(SUM(Jun),0) Jun,
          ISNULL(SUM(Jul),0) Jul,
          ISNULL(SUM(Aug),0) Aug,
          ISNULL(SUM(Sep),0) Sep,
          ISNULL(SUM(Oct),0) Oct,
          ISNULL(SUM(Nov),0) Nov,
          ISNULL(SUM(Dec),0) Dec,
          [year],
          --,Sort_Hours,
          --Sort_Days,
          A.RN Bla
      FROM (SELECT *, RN=ROW_NUMBER() OVER(ORDER BY object_id) FROM sys.all_objects) A
      LEFT JOIN CTE B ON A.RN = CASE WHEN B.Bla > 191 THEN 192 ELSE B.Bla END
      WHERE A.RN BETWEEN 1 AND 192
      GROUP BY A.RN, [year]
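
    For reference, a minimal, self-contained sketch of the general "fill the holes" pattern the query is aiming for: generate the complete 1..N sequence first, then LEFT JOIN the (possibly gappy) aggregate onto it and wrap the aggregates in ISNULL so missing rows come back as 0. The Agg CTE below is a stand-in for the real monthly aggregate; it is not taken from the question.

      -- A complete number sequence on the left-hand side, the sparse aggregate on the right.
      WITH Numbers AS
      (
          SELECT TOP (192) ROW_NUMBER() OVER (ORDER BY object_id) AS Bla
          FROM sys.all_objects
      ),
      Agg AS
      (
          -- Placeholder for the real aggregate keyed by Bla.
          SELECT 13 AS Bla, 2 AS Nov
      )
      SELECT n.Bla,
             ISNULL(a.Nov, 0) AS Nov    -- repeat for the other month columns
      FROM Numbers AS n
      LEFT JOIN Agg AS a ON a.Bla = n.Bla
      ORDER BY n.Bla;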

    Read the article

  • Is SSIS able to query flat files from another Windows Server?

    - by atricapilla
    I'm a pretty new SQL Server Integration Services (SSIS) user. Is SSIS able to query data from text files located on another Windows server? I mean, when SSIS is installed on Windows Server A, can it query data from, say, a folder containing text files on Windows Server B (in the same domain)? I have only used the SAP BO Data Integrator ETL tool, and it cannot query flat files from another server: during execution, all files must be located on the Job Server machine that executes the job.

    Read the article

  • Upgrade from .NET 2.0 to .NET 3.5 problems

    - by Bashir Magomedov
    I'm trying to upgrade our solution from VS2005/.NET 2.0 to VS2008/.NET 3.5. I converted the solution using the VS2008 conversion wizard. All of the projects (about 50) remained targeted at .NET Framework 2.0; moreover, if I change the target framework manually for one of the projects, all referenced DLLs (System, System.Core, System.Data, etc.) still point to Framework 2.0. The only way I have found to completely change the target framework is to remove these references and add them again using the proper version of the framework. Doing it manually is not the best choice, I think: 50 projects, roughly 10 references each, at about 0.5 minutes per reference adds up to hours of tedious work. Am I missing something? Is there any other way of converting a full solution from .NET 2.0 to .NET 3.5? Thank you.

    Read the article

  • Any way to separate unit tests from integration tests in VS2008?

    - by AngryHacker
    I have a project full of tests, unit and integration alike. Integration tests require that a pretty large database be present, so it's difficult to make it a part of the build process simply because of the time that it takes to re-initialize the database. Is there a way to somehow separate unit tests from integration tests and have the build server just run the unit tests? I see that there is an Ordered Unit test in VS2008, which allows you to pick and choose tests, but I can't make it just execute alone, without all the others. Is there a trick that I am missing? Or perhaps I could adorn the unit tests with an attribute? What are some of the approaches people are using? P.S. I know I could use mocking for integration tests (just to make them go faster) but then it wouldn't be a true integration test.

    Read the article

  • add ANOTHER primary key to a table which is UNIQUE

    - by gdubs
    So I'm having problems with adding another unique key to my table. I have 3 columns: 1. AccountID (identity), 2. EmailID, 3. a data field. When I made the table, I used this to make AccountID and EmailID unique: PRIMARY KEY (AccountID, EmailID). I thought that would make my EmailID unique, but when I tried inserting another row with the same EmailID it went through, so I figured I missed something. Now for my questions: (1) If I had to use ALTER, how do I alter the table / PK constraint to make the EmailID column unique on its own? (2) If I decided to drop the table and make a new one, how do I make both of those columns unique? Thanks a bunch!!
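
    A hedged sketch, assuming the table is called dbo.Accounts and guessing at data types (neither is given in the question): the composite PRIMARY KEY (AccountID, EmailID) only enforces uniqueness of the combination, which is why duplicate EmailID values slipped through. A table can have only one primary key, but a separate UNIQUE constraint on EmailID gives the behaviour being asked for, either via ALTER TABLE or in a fresh CREATE TABLE (pick one of the two options below).

      -- Option 1: keep the existing table and add a UNIQUE constraint on EmailID.
      ALTER TABLE dbo.Accounts
      ADD CONSTRAINT UQ_Accounts_EmailID UNIQUE (EmailID);

      -- Option 2: recreate the table with AccountID as the primary key
      -- and EmailID constrained to be unique on its own.
      CREATE TABLE dbo.Accounts
      (
          AccountID int IDENTITY(1,1) NOT NULL,   -- assumed types throughout
          EmailID   varchar(256)      NOT NULL,
          Data      varchar(max)      NULL,
          CONSTRAINT PK_Accounts PRIMARY KEY (AccountID),
          CONSTRAINT UQ_Accounts_EmailID UNIQUE (EmailID)
      );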

    Read the article

  • What are the Pros & Cons of using SQL Azure for existing apps on dedicated servers

    - by Mark Redman
    We currently own our own servers and rent a rack in a datacentre. Looking at the pricing, scalability and SLAs for SQL Azure, I'm thinking it might be viable to use only SQL Azure for the database while continuing to run our existing applications on our own servers in the datacentre. This would let us stop worrying about the database and its infrastructure so we can concentrate on building an application server farm with disk storage for files, etc. Our application is quite big, has various Windows services, and parts of it use unmanaged libraries that may not be feasible in the cloud, so we probably couldn't have everything in Azure. The pros: reduced total cost of ownership (no database servers, no SQL Server licenses). The cons: I guess there would be overhead in transferring data between the Azure cloud and our datacentre (e.g. the cloud may be in the US while the datacentre is in the UK), but would that overhead be acceptable?

    Read the article

  • SQL Server Table Partitioning, what is happening behind the scenes?

    - by user404463
    I'm working with table partitioning on an extremely large fact table in a warehouse. I have executed the script a few different ways, with and without non-clustered indexes. With the indexes it appears to dramatically expand the log file, while without the non-clustered indexes it doesn't expand the log file as much but takes more time to run due to rebuilding the indexes. What I am looking for is any links or information on what is happening behind the scenes, specifically to the log file, when you split a table partition.
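
    For reference, the operation in question is a partition split; a minimal hedged sketch follows (the partition function/scheme names and the boundary value are made up for illustration, not taken from the question). When the new boundary lands inside a partition that already contains rows, SQL Server has to physically move those rows, and that movement is logged, which is one common source of the log growth being described.

      -- Illustrative names only: pf_FactDate / ps_FactDate are not from the question.
      -- Tell the partition scheme which filegroup the new partition should use...
      ALTER PARTITION SCHEME ps_FactDate NEXT USED [PRIMARY];

      -- ...then split an existing range by introducing a new boundary value.
      -- Rows falling on the "new" side of the boundary are moved, and logged, as part of the split.
      ALTER PARTITION FUNCTION pf_FactDate() SPLIT RANGE ('2008-01-01');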

    Read the article

  • Need help with SQL table structure transformation

    - by Arnis L.
    I need to perform an update/insert while simultaneously changing the structure of the incoming data. Think of shops that have a defined work time for each day of the week. Hopefully this explains what I'm trying to achieve:
      worktimeOrigin table columns: shop_id, day, val
      worktimeOrigin data:
        123 | "monday"    | "9:00 AM - 18:00"
        123 | "tuesday"   | "9:00 AM - 18:00"
        123 | "wednesday" | "9:00 AM - 18:00"
      shop table columns: id, worktimeDestination.id
      worktimeDestination table columns: id, monday, tuesday, wednesday
    My aim: I would like to insert the data from worktimeOrigin into worktimeDestination and point the shop at the appropriate worktimeDestination row. Afterwards:
      shop data: 123 | 1 (updated)
      worktimeDestination data: 1 | "9:00 AM - 18:00" | "9:00 AM - 18:00" | "9:00 AM - 18:00" (inserted)
    Any ideas how to do that?
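
    A hedged sketch of one way to do this in T-SQL, assuming worktimeDestination.id is an IDENTITY column and that the linking column on shop is called worktime_id (the question only hints at its name): PIVOT the day/value rows into one wide row, insert it, then point the shop at the newly generated id.

      -- Assumed linking column name: shop.worktime_id; assumed IDENTITY on worktimeDestination.id.
      DECLARE @newId int;

      -- Pivot the per-day rows for shop 123 into a single wide row and insert it.
      INSERT INTO worktimeDestination (monday, tuesday, wednesday)
      SELECT [monday], [tuesday], [wednesday]
      FROM (SELECT [day], val FROM worktimeOrigin WHERE shop_id = 123) AS src
      PIVOT (MAX(val) FOR [day] IN ([monday], [tuesday], [wednesday])) AS p;

      -- Capture the identity value just generated and attach it to the shop.
      SET @newId = SCOPE_IDENTITY();

      UPDATE shop
      SET worktime_id = @newId
      WHERE id = 123;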

    Read the article

  • Resharper Autocomplete Issue

    - by Gene
    Hi all, I've been using ReSharper with the ReSharper autocomplete turned off (I just find VS's own completion better for my purposes currently), but despite every setting I've tried, whenever I use a ReSharper template (such as tab-completing an if block), the ReSharper autocomplete dialog comes up in addition to the Visual Studio one. So if they don't autocomplete to the same thing, or I accidentally hit Enter, whatever I originally typed is replaced with the wrong ReSharper suggestion, which is exactly why I turned it off in the first place. A. Has anyone ever seen this before? (I ask since I might have turned a number of settings on/off in a strange or incompatible way, and perhaps a clean install might clear it up.) B. Any suggestions? :) (Visual Studio 2008 SP1, ReSharper 4.5.2; the only other tools installed are DevExpress, but they have long since been disabled.) Thanks all!

    Read the article

  • Sql Server SMO connection timeout not working

    - by Uros Calakovic
    I have the following PowerShell code:
      function Get-SmoConnection
      {
          param ([string] $serverName = "", [int] $connectionTimeout = 0)

          if($serverName.Length -eq 0)
          {
              $serverConnection = New-Object `
                  Microsoft.SqlServer.Management.Common.ServerConnection
          }
          else
          {
              $serverConnection = New-Object `
                  Microsoft.SqlServer.Management.Common.ServerConnection($serverName)
          }

          if($connectionTimeout -ne 0)
          {
              $serverConnection.ConnectTimeout = $connectionTimeout
          }

          try
          {
              $serverConnection.Connect()
              $serverConnection
          }
          catch [System.Management.Automation.MethodInvocationException]
          {
              $null
          }
      }

      $connection = Get-SmoConnection "ServerName" 2

      if($connection -ne $null)
      {
          Write-Host $connection.ServerInstance
          Write-Host $connection.ConnectTimeout
      }
      else
      {
          Write-Host "Connection could not be established"
      }
    It seems to work, except for the part that attempts to set the SMO connection timeout. If the connection is successful, I can verify that ServerConnection.ConnectTimeout is set to 2 (seconds), but when I supply a bogus name for the SQL Server instance, it still attempts to connect to it for ~15 seconds (which I believe is the default timeout value). Does anyone have experience with setting the SMO connection timeout? Thank you in advance.

    Read the article

  • Execution Plan Optimization when where clause is removed then added back

    - by nmushov
    I have a stored procedure that uses a table-valued function and executes in 9 seconds. If I alter the table-valued function and remove the WHERE clause, the stored procedure executes in 3 seconds; if I then add the WHERE clause back, it still executes in 3 seconds. Looking at the execution plans, after I remove the WHERE clause the plan includes parallelism, and the scan counts for two of my tables drop from 50000 and 65000 down to 5 and 3. After I add the WHERE clause back, the optimized plan is still used unless I run DBCC FREEPROCCACHE. Questions: 1. Why would SQL Server start using the optimized execution plan for both queries only after I first remove the WHERE clause? 2. Is there a way to force SQL Server to use this execution plan? Also, this is a parameterized all-in-one query that uses the (@Parameter IS NULL OR column LIKE @Parameter) pattern in the WHERE clause, which I believe is bad for performance. The function body is below (see also the sketch after it):
      RETURNS TABLE
      AS
      RETURN
      (
          SELECT TOP (@PageNumber * @PageSize)
              CASE
                  WHEN @SortOrder = 'Expensive'   THEN ROW_NUMBER() OVER (ORDER BY SellingPrice DESC)
                  WHEN @SortOrder = 'Inexpensive' THEN ROW_NUMBER() OVER (ORDER BY SellingPrice ASC)
                  WHEN @SortOrder = 'LowMiles'    THEN ROW_NUMBER() OVER (ORDER BY Mileage ASC)
                  WHEN @SortOrder = 'HighMiles'   THEN ROW_NUMBER() OVER (ORDER BY Mileage DESC)
                  WHEN @SortOrder = 'Closest'     THEN ROW_NUMBER() OVER (ORDER BY P1.Distance ASC)
                  WHEN @SortOrder = 'Newest'      THEN ROW_NUMBER() OVER (ORDER BY [Year] DESC)
                  WHEN @SortOrder = 'Oldest'      THEN ROW_NUMBER() OVER (ORDER BY [Year] ASC)
                  ELSE ROW_NUMBER() OVER (ORDER BY InventoryID ASC)
              END as rn,
              P1.InventoryID,
              P1.SellingPrice,
              P1.Distance,
              P1.Mileage,
              Count(*) OVER () RESULT_COUNT,
              dimCarStatus.[year]
          FROM (SELECT InventoryID, SellingPrice, Zip.Distance, Mileage, ColorKey, CarStatusKey, CarKey
                FROM facInventory
                JOIN @ZipCodes Zip ON Zip.DealerKey = facInventory.DealerKey) as P1
          JOIN dimColor ON dimColor.ColorKey = P1.ColorKey
          JOIN dimCarStatus ON dimCarStatus.CarStatusKey = P1.CarStatusKey
          JOIN dimCar ON dimCar.CarKey = P1.CarKey
          WHERE (@ExteriorColor is NULL OR dimColor.ExteriorColor like @ExteriorColor)
            AND (@InteriorColor is NULL OR dimColor.InteriorColor like @InteriorColor)
            AND (@Condition is NULL OR dimCarStatus.Condition like @Condition)
            AND (@Year is NULL OR dimCarStatus.[Year] like @Year)
            AND (@Certified is NULL OR dimCarStatus.Certified like @Certified)
            AND (@Make is NULL OR dimCar.Make like @Make)
            AND (@ModelCategory is NULL OR dimCar.ModelCategory like @ModelCategory)
            AND (@Model is NULL OR dimCar.Model like @Model)
            AND (@Trim is NULL OR dimCar.Trim like @Trim)
            AND (@BodyType is NULL OR dimCar.BodyType like @BodyType)
            AND (@VehicleTypeCode is NULL OR dimCar.VehicleTypeCode like @VehicleTypeCode)
            AND (@MinPrice is NULL OR P1.SellingPrice >= @MinPrice)
            AND (@MaxPrice is NULL OR P1.SellingPrice < @MaxPrice)
            AND (@Mileage is NULL OR P1.Mileage < @Mileage)
          ORDER BY
              CASE
                  WHEN @SortOrder = 'Expensive'   THEN -SellingPrice
                  WHEN @SortOrder = 'Inexpensive' THEN SellingPrice
                  WHEN @SortOrder = 'LowMiles'    THEN Mileage
                  WHEN @SortOrder = 'HighMiles'   THEN -Mileage
                  WHEN @SortOrder = 'Closest'     THEN P1.Distance
                  WHEN @SortOrder = 'Newest'      THEN -[YEAR]
                  WHEN @SortOrder = 'Oldest'      THEN [YEAR]
                  ELSE InventoryID
              END
      )
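
    Not a definitive answer, but the standard mitigation for this kind of catch-all (@Param IS NULL OR column LIKE @Param) predicate is to ask for a fresh plan per execution with OPTION (RECOMPILE). The hint generally can't go inside an inline table-valued function, so it would be added to the statement that calls it; the function name and parameter list below are placeholders, since the real ones aren't shown in the question.

      -- Hypothetical call site; dbo.SearchInventory and its parameter list are placeholders.
      SELECT rn, InventoryID, SellingPrice, Distance, Mileage, RESULT_COUNT, [year]
      FROM dbo.SearchInventory(@PageNumber, @PageSize, @SortOrder,
                               @ExteriorColor, @InteriorColor, @Condition,
                               @Year, @Certified, @Make, @ModelCategory,
                               @Model, @Trim, @BodyType, @VehicleTypeCode,
                               @MinPrice, @MaxPrice, @Mileage)
      OPTION (RECOMPILE);   -- compile a plan for the actual parameter values on each execution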

    Read the article

  • MDX query for SSRS2008 parameter, to be used as filter in dataset

    - by adolf garlic - SAVE BBC6MUSIC
    What I want to do: The report parameter that the user can select from is a subset of the data available in a dimension. I have a query to retrieve this subset and it works fine. I am then trying to use the selection as a filter in a dataset. In the dataset properties, you can add a parameter based on the ones you have specified on the report, but when you look in the filters pane, there seems to be no way to select this. If I try and select something in there, it creates a whole new parameter and dataset. How can I get a dataset to consume and filter by a parameter that the user has selected?

    Read the article

  • Can you add a folder structure to IIS7?

    - by tigermain
    I am in the process of setting up a new server that I share with 2 colleagues. Is it possible to get a folder structure into IIS7 at all (in the MMC) so we can keep our sites separate? In the IIS7 management console I would like a set of folders, one for each of my colleagues, so that each of our websites sits within its own subfolder.

    Read the article

  • How to get Resharper to show a Refactoring that it already has.

    - by AngryHacker
    Whenever Resharper encounters code like this: (treeListNode.Tag as GridLine).AdvertiserSeparation = 5; it presents you with a possible fix (since treeListNode.Tag as GridLine might be null). It says: 'Replace with Direct Cast', which turns the code into the following: ((GridLine) treeListNode.Tag).AdvertiserSeparation = 5; This is great. However, when it encounters code like this: GridLine line = treeListNode.Tag as GridLine; line.AdvertiserSeparation = 5; Resharper simply displays a warning 'Possible System.NullReferenceException', but does not offer me to 'Replace with Direct Cast'. Is there a way to make Resharper offer me this refactoring, since it already has it?

    Read the article

  • Yahoo Web Scrapes: What are the limits?

    - by bvandrunen
    We are using a web scraper and have it set up with a sleep function that uses a random delay (so that the time between scrapes isn't always the same), but we are still getting blocked by Yahoo after 20-30 requests. Does anyone know if there is a limit (e.g. 20 requests per minute, 200 an hour)? Right now our average delay between requests is around 3-6 seconds. Thanks for any help.

    Read the article
