Search Results

Search found 28327 results on 1134 pages for 'sql improvements'.

  • Passing dynamic parameters to a stored procedure in SQL Server 2008

    - by themhz
    I have this procedure that executes another procedure, passed in as a parameter, along with its parameters @pdatefrom and @pdateto. CREATE procedure [dbo].[execute_proc] @procs varchar(200), @pdatefrom date, @pdateto date as exec @procs @datefrom=@pdatefrom,@dateto=@pdateto But I also need to pass the parameters dynamically, without having to edit them inside the procedure. For example, what I am imagining is something like this: CREATE procedure [dbo].[execute_proc] @procs varchar(200), @params varchar(max) as exec @procs @params where @params is a string like @param1=1,@param2='somethingelse'. Is there a way to do this?
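    Not part of the original question, but a minimal sketch of one common workaround: build the entire call as a string and execute it. The procedure signature below is the one from the question; everything else is illustrative, and concatenating @params like this is open to SQL injection, so it should only ever receive trusted input.

    CREATE PROCEDURE [dbo].[execute_proc]
        @procs  varchar(200),
        @params varchar(max)
    AS
    BEGIN
        -- Builds e.g. "EXEC dbo.some_proc @param1=1,@param2='somethingelse'" and runs it
        DECLARE @sql nvarchar(max) = N'EXEC ' + @procs + N' ' + @params;
        EXEC sp_executesql @sql;
    END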

    Read the article

  • SQL Server 2008 : Standard or SQL Express

    - by dr
    Which is the better choice on a development box if you primarily develop ASP.NET applications and SSRS reports? I have never had to use the Express editions, so I don't really know the pros or cons. The cons I have listed for the Standard+ editions are: the toll they take on system resources; the pain of attaching databases for projects; the pain of detaching unused databases; $$$. The pros: you have everything you need; the full set of Management Studio features; an easy move to production.

    Read the article

  • Is there a way to prevent Triggers from being disabled?

    - by HAdes
    I have a trigger on a table that should never be disabled. It performs certain checks, and there have been occasions when other developers have disabled it to get around it. This is not good, so I want to be able to prevent the trigger from being disabled on this table alone. Is this possible? If not, any suggestions please. Thanks.
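    There is no built-in switch that makes a trigger impossible to disable, but a hedged sketch of one workaround is a database-level DDL trigger that rolls the attempt back. The table name dbo.MyTable is a placeholder, and this assumes that disabling a table trigger surfaces as an ALTER_TABLE event, which is worth verifying on your build before relying on it (developers with enough rights can still drop the DDL trigger itself).

    CREATE TRIGGER ddl_protect_table_triggers
    ON DATABASE
    FOR ALTER_TABLE
    AS
    BEGIN
        -- EVENTDATA() carries the statement text; undo anything that tries to
        -- disable a trigger on the protected table
        DECLARE @cmd nvarchar(max) =
            EVENTDATA().value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'nvarchar(max)');
        IF @cmd LIKE '%DISABLE%TRIGGER%' AND @cmd LIKE '%MyTable%'
        BEGIN
            RAISERROR('Disabling triggers on dbo.MyTable is not allowed.', 16, 1);
            ROLLBACK;
        END
    END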

    Read the article

  • SQL Server 2008 - Query takes forever to finish even though work is actually done

    - by Brian
    Running the following simple query in SSMS: UPDATE tblEntityAddress SET strPostCode= REPLACE(strPostCode,' ','') The update to the data (at least in memory) is complete in under a minute. I verified this by performing another query with transaction isolation level read uncommitted. The update query, however, continues to run for another 30 minutes. What is the issue here? Is this caused by a delay to write to disk? TIA
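    Not from the thread, but a hedged diagnostic sketch: while the statement still shows as executing, querying sys.dm_exec_requests from another session shows whether it is genuinely writing, waiting on the transaction log, or blocked.

    SELECT r.session_id,
           r.status,
           r.command,
           r.wait_type,            -- e.g. WRITELOG, PAGEIOLATCH_EX, LCK_M_*
           r.wait_time,
           r.blocking_session_id,
           r.percent_complete
    FROM sys.dm_exec_requests AS r
    WHERE r.session_id = 53;       -- illustrative SPID; use the one running the UPDATE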

    Read the article

  • Trace Flag 1211 Not Working - SQL Server 2008 R2

    - by psam
    During an SSIS load, when an employee table is being updated, locking comes into effect. However, I have disabled lock escalation on the table using the following statements: ALTER TABLE dbo.Employee SET (LOCK_ESCALATION = DISABLE) DBCC TRACEON (1211,-1) Despite this, the table (object) still gets locked, and the lock is held for almost an hour. The total number of statements (inserts, updates, deletes) is approximately 200,000.
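    A short sanity check, using the table name from the question, that both settings are actually in effect on the instance doing the load. Note that neither setting shortens how long individual row or page locks are held inside a long-running transaction; they only prevent escalation to a table lock.

    -- Should report DISABLE for the table
    SELECT name, lock_escalation_desc
    FROM sys.tables
    WHERE name = 'Employee';

    -- Should list trace flag 1211 as enabled globally
    DBCC TRACESTATUS (1211, -1);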

    Read the article

  • How to switch sql-2005 Select Case When T-sql Naming?

    - by soe
    select (Case Remark01 When 'l1' then type1 when 'l2' then type2 end) AS [?] -- I want to switch the column name here from mytable For example, what I am after is something like this (pseudo-code): select (Case level When 'l1' then type1 ('l1' means: compare against that constant string) when 'l2' then type2 ('l2' means: compare against that constant string) end) AS (Case when 'l1' then [type01] Else [type02]) from mytable select level, type1, type2 from mytable I use this one table from two programs: one program's menu should show only type1, the other's only type2, and I want a single view to serve both programs.
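    A column alias cannot be chosen by a CASE expression at runtime. A hedged sketch of the two usual workarounds, reusing the table and column names from the question (the @level variable is illustrative): expose both fixed aliases in one view and let each program read the column it needs, or build the statement dynamically.

    -- Option 1: one view, two fixed column names
    -- CREATE VIEW dbo.menu_view AS
    --     SELECT [level], type1 AS [type01], type2 AS [type02] FROM mytable;

    -- Option 2: pick the alias with dynamic SQL
    DECLARE @level varchar(10) = 'l1';   -- illustrative parameter
    DECLARE @sql nvarchar(max) =
        N'SELECT ' + CASE WHEN @level = 'l1'
                          THEN N'type1 AS [type01]'
                          ELSE N'type2 AS [type02]' END
      + N' FROM mytable';
    EXEC sp_executesql @sql;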

    Read the article

  • SQL Server: String Manipulation, Unpivoting

    - by OMG Ponies
    I have a column called body, which contains body content for our CMS. The data looks like: ...{cloak:id=1.1.1}...{cloak}...{cloak:id=1.1.2}...{cloak}...{cloak:id=1.1.3}...{cloak}... An example, tweaked moderately for readability: ## h5. A formal process for approving and testing all external network connections and changes to the firewall and router configurations? {toggle-cloak:id=1.1.1}{tree-plus-icon} *Compliance:* {color:red}{*}Partial{*}{color} (?) {cloak:id=1.1.1} || Date: | 2010-03-15 || || Owner: | Brian || || Researched by: | || || Narrative: | Jira tickets are normally used to approve and track network changes\\ || || Artifacts: | Jira.bccampus.ca\\ || || Recommendation: | Need to update policy that no Jira = no change\\ || || Proposed Remedy(ies): | || || Approved Remedy(ies): | || || Date: | || || Reviewed by: | || || Remarks/comments: | || {cloak}## h5. Current network diagrams with all connections to cardholder data, including any wireless networks? {toggle-cloak:id=1.1.2}{tree-plus-icon} *Compliance:* {color:red}{*}TBD{*}{color} (?) {cloak:id=1.1.2} I'd like to get the cloak values out in the following format:

    requirement_num
    -----------------
    1.1.1
    1.1.2
    1.1.3

    I'm looking at using UNIONs - does anyone have a better recommendation? Forgot to mention: I can't use regex, because CLR isn't enabled on the database. The numbers aren't sequential; the current record jumps from 1.1.6 to 1.2.1.
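    A hedged sketch that avoids UNIONs and CLR, assuming the rows live in a table such as dbo.cms_content(id, body): walk each body with CHARINDEX in a recursive CTE and slice out whatever sits between every '{cloak:id=' and the following '}'.

    WITH starts AS
    (
        SELECT c.id, c.body, CHARINDEX('{cloak:id=', c.body) AS pos
        FROM dbo.cms_content AS c
        WHERE CHARINDEX('{cloak:id=', c.body) > 0
        UNION ALL
        SELECT s.id, s.body, CHARINDEX('{cloak:id=', s.body, s.pos + 1)
        FROM starts AS s
        WHERE CHARINDEX('{cloak:id=', s.body, s.pos + 1) > 0
    )
    SELECT DISTINCT
           SUBSTRING(body, pos + 10,                         -- 10 = LEN('{cloak:id=')
                     CHARINDEX('}', body, pos) - (pos + 10)) AS requirement_num
    FROM starts
    OPTION (MAXRECURSION 0);   -- a body can hold more than the default 100 tags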

    Read the article

  • help with t-sql data aggregation

    - by stackoverflowuser
    Based on the following table:

    Area  S1  S2  S3  S4
    --------------------
    A1     5  10  20   0
    A2    11  19  15  20
    A3     0   0   0  20

    I want to generate an output that will give the number of columns not having "0". So the output would be:

    Area  S1  S2  S3  S4  Count
    ---------------------------
    A1     5  10  20   0      3
    A2    11  19  15  20      4
    A3     0   0   0  20      1
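    A minimal sketch, assuming the table is called dbo.AreaScores: each CASE contributes 1 when its column is non-zero, and the sum becomes the Count column.

    SELECT Area, S1, S2, S3, S4,
           CASE WHEN S1 <> 0 THEN 1 ELSE 0 END
         + CASE WHEN S2 <> 0 THEN 1 ELSE 0 END
         + CASE WHEN S3 <> 0 THEN 1 ELSE 0 END
         + CASE WHEN S4 <> 0 THEN 1 ELSE 0 END AS [Count]
    FROM dbo.AreaScores;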

    Read the article

  • t-sql getting leaf nodes

    - by stackoverflowuser
    Based on the following table (blank lines were kept between the rows for clarity):

    Path
    -----------
    \node1\node2\node3
    \node1\node2\node3\node5
    \node1\node6\node3
    \node1\node4\node3
    \node1\node4\node3\node7
    \node1\node4\node3\node8
    \node1\node4\node3\node9
    \node1\node4\node3\node9\node10

    I want to get all the paths that end in a leaf node. So, for instance, the following would be considered leaf paths under \node1\node4\node3:

    \node1\node4\node3\node7
    \node1\node4\node3\node8
    \node1\node4\node3\node9\node10

    The following will be the output:

    Output
    ---------------------------
    \node1\node2\node3\node5
    \node1\node6\node3
    \node1\node4\node3\node7
    \node1\node4\node3\node8
    \node1\node4\node3\node9\node10

    Please suggest. Thanks.
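    A hedged sketch, assuming the rows sit in dbo.NodePaths(Path): a row is a leaf path when no other row extends it with a further '\segment'.

    SELECT p.Path
    FROM dbo.NodePaths AS p
    WHERE NOT EXISTS
    (
        SELECT 1
        FROM dbo.NodePaths AS child
        WHERE child.Path LIKE p.Path + '\%'   -- some longer path starts with p.Path + '\'
    );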

    Read the article

  • How to realize this in Transact SQL ( SQL Server 2008 )

    - by David
    I just want an update trigger like this postgresql version... It seems to me there is no NEW. and OLD. in MSSQL? CREATE OR REPLACE FUNCTION "public"."ts_create" () RETURNS trigger AS DECLARE BEGIN NEW.ctime := now(); RETURN NEW; END; Googled already, but nothing to be found unfortunately... Ideas?
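    Correct - T-SQL triggers do not expose NEW and OLD rows; they expose the inserted and deleted pseudo-tables instead, and they fire once per statement rather than once per row. A minimal sketch of the equivalent, assuming a table dbo.MyTable with a primary key id and a datetime column ctime (for a pure creation timestamp, a DEFAULT GETDATE() constraint on ctime is usually simpler):

    CREATE TRIGGER dbo.ts_create
    ON dbo.MyTable
    AFTER INSERT
    AS
    BEGIN
        SET NOCOUNT ON;
        -- "inserted" plays the role of NEW (and "deleted" the role of OLD)
        UPDATE t
        SET    t.ctime = GETDATE()
        FROM   dbo.MyTable AS t
        JOIN   inserted    AS i ON i.id = t.id;
    END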

    Read the article

  • Introducing Microsoft SQL Server 2008 R2 - Business Intelligence Samples

    - by smisner
    On April 14, 2010, Microsoft Press (blog | twitter) released my latest book, co-authored with Ross Mistry (twitter), as a free ebook download - Introducing Microsoft SQL Server 2008 R2. As the title implies, this ebook is an introduction to the latest SQL Server release. Although you'll find a comprehensive review of the product's features in this book, you will not find the step-by-step details that are typical in my other books. For those readers who are interested in a more interactive learning experience, I have created two sample files for download:

        IntroSQLServer2008R2Samples project
        Sales Analysis workbook

    Here's a recap of the business intelligence chapters and the samples I used to generate the screen shots by chapter:

    Chapter 6: Scalable Data Warehousing covers a new edition of SQL Server, Parallel Data Warehouse. Understandably, Microsoft did not ship me the software and hardware to set up my own Parallel Data Warehouse environment for testing purposes and consequently you won't see any screenshots in this chapter. I received a lot of information and a lot of help from the product team during the development of this chapter to ensure its technical accuracy.

    Chapter 7: Master Data Services is a new component in SQL Server. After you install Master Data Services (MDS), which is a separate installation from SQL Server although it's found on the same media, you can install sample models to explore (which is what I did to create screenshots for the book). To do this, you deploy the packages found at \Program Files\Microsoft SQL Server\Master Data Services\Samples\Packages. You will first need to use the Configuration Manager (in the Microsoft SQL Server 2008 R2\Master Data Services program group) to create a database and a Web application for MDS. Then when you launch the application, you'll see a Getting Started page which has a Deploy Sample Data link that you can use to deploy any of the sample packages.

    Chapter 8: Complex Event Processing is an introduction to another new component, StreamInsight. This topic was way too large to cover in-depth in a single chapter, so I focused on information such as architecture, development models, and an overview of the key sections of code you'll need to develop for your own applications. StreamInsight is an engine that operates on data in-flight and as such has no user interface that I could include in the book as screenshots. The November CTP version of SQL Server 2008 R2 included code samples as part of the installation, but these are not the official samples that will eventually be available in Codeplex. At the time of this writing, the samples are not yet published.

    Chapter 9: Reporting Services Enhancements provides an overview of all the changes to Reporting Services in SQL Server 2008 R2, and there are many! In previous posts, I shared more details than you'll find in the book about new functions (Lookup, MultiLookup, and LookupSet), properties for page numbering, and the new global variable RenderFormat. I will confess that I didn't use actual data in the book for my discussion on the Lookup functions, but I did create real reports for the blog posts and will upload those separately. For the other screenshots and examples in the book, I have created the IntroSQLServer2008R2Samples project for you to download. To preview these reports in Business Intelligence Development Studio, you must have the AdventureWorksDW2008R2 database installed, and you must download and install SQL Server 2008 R2. For the map report, you must execute the PopulationData.sql script that I included in the samples file to add a table to the AdventureWorksDW2008R2 database. The IntroSQLServer2008R2Samples project includes the following files:

        01_AggregateOfAggregates.rdl to illustrate the use of embedded aggregate functions
        02_RenderFormatAndPaging.rdl to illustrate the use of page break properties (Disabled, ResetPageNumber), the PageName property, and the RenderFormat global variable
        03_DataSynchronization.rdl to illustrate the use of the DomainScope property
        04_TextboxOrientation.rdl to illustrate the use of the WritingMode property
        05_DataBar.rdl
        06_Sparklines.rdl
        07_Indicators.rdl
        08_Map.rdl to illustrate a simple analytical map that uses color to show population counts by state
        PopulationData.sql to provide the data necessary for the map report

    Chapter 10: Self-Service Analysis with PowerPivot introduces two new components to the Microsoft BI stack, PowerPivot for Excel and PowerPivot for SharePoint, which you can learn more about at the PowerPivot site. To produce the screenshots for this chapter, I created the Sales Analysis workbook which you can download (although you must have Excel 2010 and the PowerPivot for Excel add-in installed to explore it fully). It's a rather simple workbook because space in the book did not permit a complete exploration of all the wonderful things you can do with PowerPivot. I used a tutorial that was available with the CTP version as a basis for the report so it might look familiar if you've already started learning about PowerPivot. In future posts, I'll continue exploring the new features in greater detail. If there are any special requests, please let me know!

    Read the article

  • Support for 15,000 Partitions in SQL Server 2008 SP2 and SQL Server 2008 R2 SP1

    In SQL Server 2008 and SQL Server 2008 R2, the number of partitions on tables and indexes is limited to 1,000. This paper discusses how SQL Server 2008 SP2 and SQL Server 2008 R2 SP1 address this limitation by providing an option to increase the limit to 15,000 partitions. It describes how support for 15,000 partitions can be enabled and disabled on a database. It also talks about performance characteristics, certain limitations associated with this support, known issues, and their workarounds. This support is targeted to enterprise customers and ISVs with large-scale decision support or data warehouse requirements.

    Read the article

  • Data-tier Applications in SQL Server 2008 R2

    - by BuckWoody
    I had the privilege of presenting to the Adelaide SQL Server User Group in Australia last evening, and I covered the Data Access Component (DAC) and the Utility Control Point (UCP) from SQL Server 2008 R2. Here are some links from that presentation:

        Whitepaper: http://msdn.microsoft.com/en-us/library/ff381683.aspx
        Tutorials: http://msdn.microsoft.com/en-us/library/ee210554(SQL.105).aspx
        From Visual Studio: http://msdn.microsoft.com/en-us/library/dd193245(VS.100).aspx
        Restrictions and capabilities by Edition: http://msdn.microsoft.com/en-us/library/cc645993(SQL.105).aspx
        Glen Berry's Blog entry on scripts for UCP/DAC: http://www.sqlservercentral.com/blogs/glennberry/archive/2010/05/19/sql-server-utility-script-from-24-hours-of-pass.aspx
        Objects supported by a DAC: http://msdn.microsoft.com/en-us/library/ee210549(SQL.105).aspx

    Read the article

  • SQL Developer at Oracle Open World 2012

    - by thatjeffsmith
    We have a lot going on in San Francisco this fall. One of the most personal exciting bits, for what will be my 4th or 5th Open World, is that this will be my FIRST as a member of Team Oracle. I’ve presented once before, but most years it was just me pressing flesh at the vendor booths. After 3-4 days of standing and talking, you’re ready to just go home and not do anything for a few weeks. This time I’ll have a chance to walk around and talk with our users and get a good idea of what’s working and what’s not. Of course it will be a great opportunity for you to find us and get to know your SQL Developer team! 3.4 miles across and back – thanks Ashley for signing me up for the run! This year is going to be a bit crazy. Work wise I’ll be presenting twice, working a booth, and proctoring several of our Hands-On Labs. The fun parts will be equally crazy though – running across the Bay Bridge (I don’t run), swimming the Bay (I don’t swim), having my wife fly out on Wednesday for the concert, and then our first WhiskyFest on Friday (I do drink whisky though.) But back to work – let’s talk about EVERYTHING you can expect from the SQL Developer team. Booth Hours We’ll have 2 ‘demo pods’ in the Exhibition Hall over at Moscone South. Look for the farm of Oracle booths, we’ll be there under the signs that say ‘SQL Developer.’ There will be several people on hand, mostly developers (yes, they still count as people), who can answer your questions or demo the latest features. Come by and say ‘Hi!’, and let us know what you like and what you think we can do better. Seriously. Monday 10AM – 6PM Tuesday 9:45AM – 6PM Wednesday 9:45AM – 4PM Presentations Stop by for an hour, pull up a chair, sit back and soak in all the SQL Developer goodness. You’ll only have to suffer my bad jokes for two of the presentations, so please at least try to come to the other ones. We’ll be talking about data modeling, migrations, source control, and new features in versions 3.1 and 3.2 of SQL Developer and SQL Developer Data Modeler. Day Time Event Monday 10:454:45 What’s New in SQL Developer Why Move to Oracle Application Express Listener Tueday 10:1511:455:00 Using Subversion in Oracle SQL Developer Data Modeler Oracle SQL Developer Tips & Tricks Database Design with Oracle SQL Developer Data Modeler Wednesday 11:453:30 Migrating Third-Party Databases and Applications to Oracle Exadata 11g Enterprise Options and Management Packs for Developers Hands On Labs (HOLs) The Hands On Labs allow you to come into a classroom environment, sit down at a computer, and run through some exercises. We’ll provide the hardware, software, and training materials. It’s self-paced, but we’ll have several helpers walking around to answer questions and chat up any SQL Developer or database topic that comes to mind. If your employer is sending you to Open World for all that great training, the HOLs are a great opportunity to capitalize on that. They are only 60 minutes each, so you don’t have to worry about burning out. And there’s no homework! Of course, if you do want to take the labs home with you, many are already available via the Developer Day Hands-On Database Applications Developer Lab. You will need your own computer for those, but we’ll take care of the rest. 
Wednesday PL/SQL Development and Unit Testing with Oracle SQL Developer 10:15 Performance Tuning with Oracle SQL Developer 11:45 Thursday The Soup to Nuts of Data Modeling with Oracle SQL Developer Data Modeler 11:15 Some Parting Advice Always wanted to meet your favorite Oracle authors, speakers, and thought-leaders? Don’t be shy, walk right up to them and introduce yourself. Normal social rules still apply, but at the conference everyone is open and up for meeting and talking with attendees. Just understand if there’s a line that you might only get a minute or two. It’s a LONG conference though, so you’ll have plenty of time to catch up with everyone. If you’re going to be around on Tuesday evening, head on over to the OTN Lounge from 4:30 to 6:30 and hang out for our Tweet Meet. That’s right, all the Oracle nerds on Twitter will be there in one place. Be sure to put your Twitter handle on your name tag so we know who you are!

    Read the article

  • How to Automate your Database Documentation

    - by Jonathan Hickford
    In my previous post, “Automating Deployments with SQL Compare command line” I looked at how teams can automate the deployment and post deployment validation of SQL Server databases using the command line versions of Red Gate tools. In this post I’m looking at another use for the command line tools, namely using them to generate up-to-date documentation with every database change. There are many reasons why up-to-date documentation is valuable. For example when somebody new has to work on or administer a database for the first time, or when a new database comes into service. Having database documentation reduces the risks of making incorrect decisions when making changes. Documentation is very useful to business intelligence analysts when writing reports, for example in SSRS. There are a couple of great examples talking about why up to date documentation is valuable on this site:  Database Documentation – Lands of Trolls: Why and How? and Database Documentation Using SQL Doc. The short answer is that it can save you time and reduce risk when you need that most! SQL Doc is a fast simple tool that automatically generates database documentation. It can create documents in HTML, Word or pdf files. The documentation contains information about object definitions and dependencies, along with any other information you want to associate with each object. The SQL Doc GUI, which is included in Red Gate’s SQL Developer Bundle and SQL Toolbelt, allows you to add additional notes to objects, and customise which objects are shown in the docs.  These settings can be saved as a .sqldoc project file. The SQL Doc command line can use this project file to automatically update the documentation every time the database is changed, ensuring that documentation that is always up to date. The simplest way to keep documentation up to date is probably to use a scheduled task to run a script every day. However if you have a source controlled database, or are using a Continuous Integration (CI) server or a build server, it may make more sense to use that instead. If  you’re using SQL Source Control or SSDT Database Projects to help version control your database, you can automatically update the documentation after each change is made to the source control repository that contains your database. To get this automation in place,  you can use the functionality of a Continuous Integration (CI) server, which can trigger commands to run when a source control repository has changed. A CI server will also capture and save the documentation that is created as an artifact, so you can always find the exact documentation for a specific version of the database. This forms an always up to date data dictionary. If you don’t already have a CI server in place there are several you can use, such as the free open source Jenkins or the free starter editions of TeamCity. I won’t cover setting these up in this article, but there is information about using CI servers for automating database tasks on the Red Gate Database Delivery webpage. You may be interested in Red Gate’s SQL CI utility (part of the SQL Automation Pack) which is an easy way to update a database with the latest changes from source control. The PowerShell example below shows how to create the documentation from a database. That database might be your integration database or a shared development database that is always up to date with the latest changes. 
$serverName = "server\instance" $databaseName = "databaseName" # If you want to document multiple databases use a comma separated list $userName = "username" $password = "password" # Path to SQLDoc.exe $SQLDocPath = "C:\Program Files (x86)\Red Gate\SQL Doc 3\SQLDoc.exe" $arguments = @( "/server:$($serverName)", "/database:$($databaseName)", "/username:$($userName)", "/password:$($password)", "/filetype:html", "/outputfolder:.", # "/project:$args[0]", # If you already have a .sqldoc project file you can pass it as an argument to this script. Values in the project will be overridden with any options set on the command line "/name:$databaseName Report", "/copyrightauthor:$([Environment]::UserName)" ) write-host $arguments & $SQLDocPath $arguments There are several options you can set on the command line to vary how your documentation is created. For example, you can document multiple databases or exclude certain types of objects. In the example above, we set the name of the report to match the database name, and use the current Windows user as the documentation author. For more examples of how you can customise the report from the command line please see the SQL Doc command line documentation If you already have a .sqldoc project file, or wish to further customise the report by including or excluding specific objects, you can use this project on the command line. Any settings you specify on the command line will override the defaults in the project. For details of what you can customise in the project please see the SQL Doc project documentation. In the example above, the line to use a project is commented out, but you can uncomment this line and then pass a path to a .sqldoc project file as an argument to this script.  Conclusion Keeping documentation about your databases up to date is very easy to set up using SQL Doc and PowerShell. By using a CI server to run this process you can trigger the documentation to be run on every change to a source controlled database, and keep historic documentation available. If you are considering more advanced database automation, e.g. database unit testing, change script generation, deploying to large numbers of targets and backup/verification, please email me at [email protected] for further script samples or if you have any questions.

    Read the article

  • Best practice to query data from MS SQL Server in C Sharp?

    - by Bruno
    What is the best way to query data from a MS SQL Server in C Sharp? I know that it is not good practice to have an SQL query in the code. Is the best way to create a stored procedure and call it from C Sharp with parameters? using (var conn = new SqlConnection(connStr)) using (var command = new SqlCommand("StoredProc", conn) { CommandType = CommandType.StoredProcedure }) { conn.Open(); command.ExecuteNonQuery(); conn.Close(); }
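    Whether a stored procedure or parameterized inline SQL is "best" is debatable, but the server-side half of the pattern in the question would look roughly like this hedged sketch (procedure, table, and column names are made up). On the C# side, add the value via command.Parameters and use ExecuteReader rather than ExecuteNonQuery when you expect rows back.

    CREATE PROCEDURE dbo.GetOrdersForCustomer
        @CustomerId int
    AS
    BEGIN
        SET NOCOUNT ON;
        -- The typed parameter keeps the call plan-cache friendly and injection-safe
        SELECT OrderId, OrderDate, TotalDue
        FROM   dbo.Orders
        WHERE  CustomerId = @CustomerId;
    END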

    Read the article

  • Announcing the Release of Visual Studio 2013 and Great Improvements to ASP.NET and Entity Framework

    - by ScottGu
    Today we released VS 2013 and .NET 4.5.1. These releases include a ton of great improvements, and include some fantastic enhancements to ASP.NET and the Entity Framework.  You can download and start using them now. Below are details on a few of the great ASP.NET, Web Development, and Entity Framework improvements you can take advantage of with this release.  Please visit http://www.asp.net/vnext for additional release notes, documentation, and tutorials. One ASP.NET With the release of Visual Studio 2013, we have taken a step towards unifying the experience of using the different ASP.NET sub-frameworks (Web Forms, MVC, Web API, SignalR, etc), and you can now easily mix and match the different ASP.NET technologies you want to use within a single application. When you do a File-New Project with VS 2013 you’ll now see a single ASP.NET Project option: Selecting this project will bring up an additional dialog that allows you to start with a base project template, and then optionally add/remove the technologies you want to use in it.  For example, you could start with a Web Forms template and add Web API or Web Forms support for it, or create a MVC project and also enable Web Forms pages within it: This makes it easy for you to use any ASP.NET technology you want within your apps, and take advantage of any feature across the entire ASP.NET technology span. Richer Authentication Support The new “One ASP.NET” project dialog also includes a new Change Authentication button that, when pushed, enables you to easily change the authentication approach used by your applications – and makes it much easier to build secure applications that enable SSO from a variety of identity providers.  For example, when you start with the ASP.NET Web Forms or MVC templates you can easily add any of the following authentication options to the application: No Authentication Individual User Accounts (Single Sign-On support with FaceBook, Twitter, Google, and Microsoft ID – or Forms Auth with ASP.NET Membership) Organizational Accounts (Single Sign-On support with Windows Azure Active Directory ) Windows Authentication (Active Directory in an intranet application) The Windows Azure Active Directory support is particularly cool.  Last month we updated Windows Azure Active Directory so that developers can now easily create any number of Directories using it (for free and deployed within seconds).  It now takes only a few moments to enable single-sign-on support within your ASP.NET applications against these Windows Azure Active Directories.  Simply choose the “Organizational Accounts” radio button within the Change Authentication dialog and enter the name of your Windows Azure Active Directory to do this: This will automatically configure your ASP.NET application to use Windows Azure Active Directory and register the application with it.  Now when you run the app your users can easily and securely sign-in using their Active Directory credentials within it – regardless of where the application is hosted on the Internet. For more information about the new process for creating web projects, see Creating ASP.NET Web Projects in Visual Studio 2013. Responsive Project Templates with Bootstrap The new default project templates for ASP.NET Web Forms, MVC, Web API and SPA are built using Bootstrap. Bootstrap is an open source CSS framework that helps you build responsive websites which look great on different form factors such as mobile phones, tables and desktops. 
For example in a browser window the home page created by the MVC template looks like the following: When you resize the browser to a narrow window to see how it would like on a phone, you can notice how the contents gracefully wrap around and the horizontal top menu turns into an icon: When you click the menu-icon above it expands into a vertical menu – which enables a good navigation experience for small screen real-estate devices: We think Bootstrap will enable developers to build web applications that work even better on phones, tablets and other mobile devices – and enable you to easily build applications that can leverage the rich ecosystem of Bootstrap CSS templates already out there.  You can learn more about Bootstrap here. Visual Studio Web Tooling Improvements Visual Studio 2013 includes a new, much richer, HTML editor for Razor files and HTML files in web applications. The new HTML editor provides a single unified schema based on HTML5. It has automatic brace completion, jQuery UI and AngularJS attribute IntelliSense, attribute IntelliSense Grouping, and other great improvements. For example, typing “ng-“ on an HTML element will show the intellisense for AngularJS: This support for AngularJS, Knockout.js, Handlebars and other SPA technologies in this release of ASP.NET and VS 2013 makes it even easier to build rich client web applications: The screen shot below demonstrates how the HTML editor can also now inspect your page at design-time to determine all of the CSS classes that are available. In this case, the auto-completion list contains classes from Bootstrap’s CSS file. No more guessing at which Bootstrap element names you need to use: Visual Studio 2013 also comes with built-in support for both CoffeeScript and LESS editing support. The LESS editor comes with all the cool features from the CSS editor and has specific Intellisense for variables and mixins across all the LESS documents in the @import chain. Browser Link – SignalR channel between browser and Visual Studio The new Browser Link feature in VS 2013 lets you run your app within multiple browsers on your dev machine, connect them to Visual Studio, and simultaneously refresh all of them just by clicking a button in the toolbar. You can connect multiple browsers (including IE, FireFox, Chrome) to your development site, including mobile emulators, and click refresh to refresh all the browsers all at the same time.  This makes it much easier to easily develop/test against multiple browsers in parallel. Browser Link also exposes an API to enable developers to write Browser Link extensions.  By enabling developers to take advantage of the Browser Link API, it becomes possible to create very advanced scenarios that crosses boundaries between Visual Studio and any browser that’s connected to it. Web Essentials takes advantage of the API to create an integrated experience between Visual Studio and the browser’s developer tools, remote controlling mobile emulators and a lot more. You will see us take advantage of this support even more to enable really cool scenarios going forward. ASP.NET Scaffolding ASP.NET Scaffolding is a new code generation framework for ASP.NET Web applications. It makes it easy to add boilerplate code to your project that interacts with a data model. In previous versions of Visual Studio, scaffolding was limited to ASP.NET MVC projects. With Visual Studio 2013, you can now use scaffolding for any ASP.NET project, including Web Forms. 
When using scaffolding, we ensure that all required dependencies are automatically installed for you in the project. For example, if you start with an ASP.NET Web Forms project and then use scaffolding to add a Web API Controller, the required NuGet packages and references to enable Web API are added to your project automatically.  To do this, just choose the Add->New Scaffold Item context menu: Support for scaffolding async controllers uses the new async features from Entity Framework 6. ASP.NET Identity ASP.NET Identity is a new membership system for ASP.NET applications that we are introducing with this release. ASP.NET Identity makes it easy to integrate user-specific profile data with application data. ASP.NET Identity also allows you to choose the persistence model for user profiles in your application. You can store the data in a SQL Server database or another data store, including NoSQL data stores such as Windows Azure Storage Tables. ASP.NET Identity also supports Claims-based authentication, where the user’s identity is represented as a set of claims from a trusted issuer. Users can login by creating an account on the website using username and password, or they can login using social identity providers (such as Microsoft Account, Twitter, Facebook, Google) or using organizational accounts through Windows Azure Active Directory or Active Directory Federation Services (ADFS). To learn more about how to use ASP.NET Identity visit http://www.asp.net/identity.  ASP.NET Web API 2 ASP.NET Web API 2 has a bunch of great improvements including: Attribute routing ASP.NET Web API now supports attribute routing, thanks to a contribution by Tim McCall, the author of http://attributerouting.net. With attribute routing you can specify your Web API routes by annotating your actions and controllers like this: OAuth 2.0 support The Web API and Single Page Application project templates now support authorization using OAuth 2.0. OAuth 2.0 is a framework for authorizing client access to protected resources. It works for a variety of clients including browsers and mobile devices. OData Improvements ASP.NET Web API also now provides support for OData endpoints and enables support for both ATOM and JSON-light formats. With OData you get support for rich query semantics, paging, $metadata, CRUD operations, and custom actions over any data source. Below are some of the specific enhancements in ASP.NET Web API 2 OData. Support for $select, $expand, $batch, and $value Improved extensibility Type-less support Reuse an existing model OWIN Integration ASP.NET Web API now fully supports OWIN and can be run on any OWIN capable host. With OWIN integration, you can self-host Web API in your own process alongside other OWIN middleware, such as SignalR. For more information, see Use OWIN to Self-Host ASP.NET Web API. More Web API Improvements In addition to the features above there have been a host of other features in ASP.NET Web API, including CORS support Authentication Filters Filter Overrides Improved Unit Testability Portable ASP.NET Web API Client To learn more go to http://www.asp.net/web-api/ ASP.NET SignalR 2 ASP.NET SignalR is library for ASP.NET developers that dramatically simplifies the process of adding real-time web functionality to your applications. Real-time web functionality is the ability to have server-side code push content to connected clients instantly as it becomes available. SignalR 2.0 introduces a ton of great improvements. 
We’ve added support for Cross-Origin Resource Sharing (CORS) to SignalR 2.0. iOS and Android support for SignalR have also been added using the MonoTouch and MonoDroid components from the Xamarin library (for more information on how to use these additions, see the article Using Xamarin Components from the SignalR wiki). We’ve also added support for the Portable .NET Client in SignalR 2.0 and created a new self-hosting package. This change makes the setup process for SignalR much more consistent between web-hosted and self-hosted SignalR applications. To learn more go to http://www.asp.net/signalr. ASP.NET MVC 5 The ASP.NET MVC project templates integrate seamlessly with the new One ASP.NET experience and enable you to integrate all of the above ASP.NET Web API, SignalR and Identity improvements. You can also customize your MVC project and configure authentication using the One ASP.NET project creation wizard. The MVC templates have also been updated to use ASP.NET Identity and Bootstrap as well. An introductory tutorial to ASP.NET MVC 5 can be found at Getting Started with ASP.NET MVC 5. This release of ASP.NET MVC also supports several nice new MVC-specific features including: Authentication filters: These filters allow you to specify authentication logic per-action, per-controller or globally for all controllers. Attribute Routing: Attribute Routing allows you to define your routes on actions or controllers. To learn more go to http://www.asp.net/mvc Entity Framework 6 Improvements Visual Studio 2013 ships with Entity Framework 6, which bring a lot of great new features to the data access space: Async and Task<T> Support EF6’s new Async Query and Save support enables you to perform asynchronous data access and take advantage of the Task<T> support introduced in .NET 4.5 within data access scenarios.  This allows you to free up threads that might otherwise by blocked on data access requests, and enable them to be used to process other requests whilst you wait for the database engine to process operations. When the database server responds the thread will be re-queued within your ASP.NET application and execution will continue.  This enables you to easily write significantly more scalable server code. Here is an example ASP.NET WebAPI action that makes use of the new EF6 async query methods: Interception and Logging Interception and SQL logging allows you to view – or even change – every command that is sent to the database by Entity Framework. This includes a simple, human readable log – which is great for debugging – as well as some lower level building blocks that give you access to the command and results. Here is an example of wiring up the simple log to Debug in the constructor of an MVC controller: Custom Code-First Conventions The new Custom Code-First Conventions enable bulk configuration of a Code First model – reducing the amount of code you need to write and maintain. Conventions are great when your domain classes don’t match the Code First conventions. For example, the following convention configures all properties that are called ‘Key’ to be the primary key of the entity they belong to. This is different than the default Code First convention that expects Id or <type name>Id. Connection Resiliency The new Connection Resiliency feature in EF6 enables you to register an execution strategy to handle – and potentially retry – failed database operations. 
This is especially useful when deploying to cloud environments where dropped connections become more common as you traverse load balancers and distributed networks. EF6 includes a built-in execution strategy for SQL Azure that knows about retryable exception types and has some sensible – but overridable – defaults for the number of retries and time between retries when errors occur. Registering it is simple using the new Code-Based Configuration support: These are just some of the new features in EF6. You can visit the release notes section of the Entity Framework site for a complete list of new features. Microsoft OWIN Components Open Web Interface for .NET (OWIN) defines an open abstraction between .NET web servers and web applications, and the ASP.NET “Katana” project brings this abstraction to ASP.NET. OWIN decouples the web application from the server, making web applications host-agnostic. For example, you can host an OWIN-based web application in IIS or self-host it in a custom process. For more information about OWIN and Katana, see What's new in OWIN and Katana. Summary Today’s Visual Studio 2013, ASP.NET and Entity Framework release delivers some fantastic new features that streamline your web development lifecycle. These feature span from server framework to data access to tooling to client-side HTML development.  They also integrate some great open-source technology and contributions from our developer community. Download and start using them today! Scott P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • The best, in the West

    - by Fatherjack
    As many of you know, I run the SQL South West user group and we are currently in full flow preparing to stage the UK’s second SQL Saturday. The SQL Saturday spotlight is going to fall on Exeter in March 2013. We have full-day session on Friday 8th with some truly amazing speakers giving their insights and experience into some vital areas of working with SQL Server: Dave Ballantyne and Dave Morrison – TSQL and internals Christian Bolton and Gavin Payne – Mission critical data platforms on Windows Server 2012 Denny Cherry – SQL Server Security André Kamman – Powershell 3.0 for SQL Server Administrators and Developers Mladen Prajdic – From SQL Traces to Extended Events – The next big switch. A number of people have claimed that the choice is too good and they’d have trouble selecting just one session to attend. I can see how this is a problem but hope that they make their minds up quickly. The venue is a bespoke conference suite in the centre of Exeter but has limited capacity so we are working on a first-come first-served basis. All the session details and booking and travel information can be found on our user group website. The Saturday will be a day of free, 50 minute sessions on all aspects SQL Server from almost 30 different speakers. If you would like to submit a session then get a move on as submissions close on 8th January 2013 (That’s less than a month away). We are really interested in getting new speakers started so we have a lightning talk session where you can come along and give a small talk (anywhere from 5 to 15 minutes long) about anything connected with SQL Server as a way to introduce you to what it’s like to be a speaker at an event. Details on registering to attend and to submit a session (Lightning talks need to be submitted too please) can be found on our SQL Saturday pages. This is going to be the biggest and best bespoke SQL Server conference to ever take place this far South West in the UK and we aim to give everyone who comes to either day a real experience of the South West so we have a few surprises for you on the day.

    Read the article

  • Idea to develop a caching server between IIS and SQL Server

    - by John
    I work on a few high traffic websites that all share the same database and that are all heavily database driven. Our SQL server is max-ed out and, although we have already implemented many changes that have helped but the server is still working too hard. We employ some caching in our website but the type of queries we use negate using SQL dependency caching. We tried SQL replication to try and kind of load balance but that didn't prove very successful because the replication process is quite demanding on the servers too and it needed to be done frequently as it is important that data is up to date. We do use a Varnish web caching server (Linux based) to take a bit of the load off both the web and database server but as a lot of the sites are customised based on the user we can only do so much. Anyway, the reason for this question... Varnish gave me an idea for a possible application that might help in this situation. Just like Varnish sits between a web browser and the web server and caches response from the web server, I was wondering about the possibility of creating something that sits between the web server and the database server. Imagine that all SQL queries go through this SQL caching server. If it's a first time query then it will get recorded, and the result requested from the SQL server and stored locally on the cache server. If it's a repeat request within a set time then the result gets retrieved from the local copy without the query being sent to the SQL server. The caching server could also take advantage of SQL dependency caching notifications. This seems like a good idea in theory. There's still the same amount of data moving back and forward from the web server, but the SQL server is relieved of the work of processing the repeat queries. I wonder about how difficult it would be to build a service that sort of emulates requests and responses from SQL server, whether SQL server's own caching is doing enough of this already that this wouldn't be a benefit, or even if someone has done this before and I haven't found it? I would welcome any feedback or any references to any relevant projects.

    Read the article

  • SQL Server for the Oracle DBA Links

    - by BuckWoody
    I do a presentation (and a class) called "SQL Server for the Oracle DBA". It's a non-marketing overview that gives you the basics of working with SQL Server if you're already familiar with how Oracle works. This class and these links DO NOT help you with "Why should I use Oracle/SQL Server instead of Oracle/SQL Server" - I'll assume you're already there, and if not, there are LOTS of sites to help you make that decision. Although these links might contain slight marketing slants (I don't control them) I've tried to get the best links I can. Feel free to comment here to add more/better links. As such, these aren't links that help you work with Oracle - they are links to help you work with SQL Server. Some of them contain more information than you actually need, others don't have near enough. Taken together (and with the class) you're able to get done what you need to do.

        "Practical SQL Server for Oracle Professionals" - A Microsoft Whitepaper, probably the best place to get started: http://download.microsoft.com/download/6/9/d/69d1fea7-5b42-437a-b3ba-a4ad13e34ef6/SQLServer2008forOracle.docx
        Free Training: http://technet.microsoft.com/en-us/sqlserver/dd548020.aspx
        Classroom training (will cost you): http://www.microsoft.com/learning/en/us/course.aspx?ID=50068A&locale=en-us
        Terminology Differences: http://www.associatedcontent.com/article/2383466/oracle_and_sql_server_basic_terminology.html
        Datatype mapping between Oracle and SQL Server: http://msdn.microsoft.com/en-us/library/ms151817.aspx
        The "other" direction - can still be useful for the Oracle professional to see the other side: http://blog.benday.com/archive/2008/10/23/23195.aspx

    Read the article

  • Is inline SQL still classed as bad practice now that we have Micro ORMs?

    - by Grofit
    This is a bit of an open ended question but I wanted some opinions, as I grew up in a world where inline SQL scripts were the norm, then we were all made very aware of SQL injection based issues, and how fragile the sql was when doing string manipulations all over the place. Then came the dawn of the ORM where you were explaining the query to the ORM and letting it generate its own SQL, which in a lot of cases was not optimal but was safe and easy. Another good thing about ORMs or database abstraction layers were that the SQL was generated with its database engine in mind, so I could use Hibernate/Nhibernate with MSSQL, MYSQL and my code never changed it was just a configuration detail. Now fast forward to current day, where Micro ORMs seem to be winning over more developers I was wondering why we have seemingly taken a U-Turn on the whole in-line sql subject. I must admit I do like the idea of no ORM config files and being able to write my query in a more optimal manner but it feels like I am opening myself back up to the old vulnerabilities such as SQL injection and I am also tying myself to one database engine so if I want my software to support multiple database engines I would need to do some more string hackery which seems to then start to make code unreadable and more fragile. (Just before someone mentions it I know you can use parameter based arguments with most micro orms which offers protection in most cases from sql injection) So what are peoples opinions on this sort of thing? I am using Dapper as my Micro ORM in this instance and NHibernate as my regular ORM in this scenario, however most in each field are quite similar. What I term as inline sql is SQL strings within source code. There used to be design debates over SQL strings in source code detracting from the fundamental intent of the logic, which is why statically typed linq style queries became so popular its still just 1 language, but with lets say C# and Sql in one page you have 2 languages intermingled in your raw source code now. Just to clarify, the SQL injection is just one of the known issues with using sql strings, I already mention you can stop this from happening with parameter based queries, however I highlight other issues with having SQL queries ingrained in your source code, such as the lack of DB Vendor abstraction as well as losing any level of compile time error capturing on string based queries, these are all issues which we managed to side step with the dawn of ORMs with their higher level querying functionality, such as HQL or LINQ etc (not all of the issues but most of them). So I am less focused on the individual highlighted issues and more the bigger picture of is it now becoming more acceptable to have SQL strings directly in your source code again, as most Micro ORMs use this mechanism. Here is a similar question which has a few different view points, although is more about the inline sql without the micro orm context: http://stackoverflow.com/questions/5303746/is-inline-sql-hard-coding

    Read the article

  • WordPress is now nicely supported on SQL Server (and SQL Azure for that matter)

    - by Eric Nelson
    WordPress is enormously popular for blogs and full websites thanks to an awesome eco system which has built up around it, the simplicity (relatively) of getting it up and running plus the flexibility to “bend it” in all sorts of directions. When I say bend, check out the following which are all WordPress sites My “back up blog” http://iupdateable.wordpress.com/  My groups “odd site” :) http://ubelly.com My favourite “cheap games” site http://www.frugalgaming.co.uk/  WordPress users typically run their sites on Linux and MySQL, although PHP (the language in which WordPress is written) can be happily run on Windows. Both fine technologies in their own right, but for me (and probably a fair few others) I would love to use WordPress but with the technologies I know best (aka Windows, IIS and SQL Server). However, that has proven to be actually rather tricky in practice to get working – until now. Earlier last month OmniTI released a patch for WordPress which provides SQL Server and SQL Azure support.  In parallel with that some fine folks inside Microsoft have also created http://wordpress.visitmix.com which contains information about running WordPress on the Microsoft platform with a particular focus on SQL Server and SQL Azure.  Top stuff! To run WordPress with SQL Server: Download and Install the WordPress on SQL Server Distro/Patch And then you will quite likely need to migrate: Check out how to Migrate to Windows and SQL Server by Zach Owens who is moving his blog to Windows and SQL Server Enjoy Related Links Running PHP on IIS on Windows http://php.iis.net/  If PHP is not your thing, then the following Blog engines are .NET based BlogEngine http://www.dotnetblogengine.net/ DasBlog http://www.dasblog.info/ Subtext http://subtextproject.com/ (which happens to power http://geekswithblogs.net where my main blog is http://geekswithblogs.net/iupdateable)

    Read the article

  • SQL Server: Must numbers all be specified with latin numeral digits?

    - by Ian Boyd
    Does SQL Server expect numbers to be specified with digits from the Latin alphabet, e.g.: 0123456789 Is it valid to give SQL Server digits in other alphabets? Rosetta Stone:

    Latin:   0123456789
    Arabic:  ٠١٢٣٤٥٦٧٨٩
    Bengali: ০১২৩৪৫৬৭৮৯

    I know that the client (ADO) will convert 8-bit strings to 16-bit Unicode strings using the current culture. But the client is also converting numbers to strings using the current culture, e.g.: SELECT * FROM Inventory WHERE Quantity > ???,?? which throws SQL Server into fits. I know that the server/database has its defined code page and locale, but that is for strings. Will SQL Server interpret numbers using the active (or per-login specified) locale, or must all numeric values be specified with Latin numeral digits?
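    Numeric literals inside a T-SQL statement must be written with the Latin digits 0-9. The usual way to keep the client culture's digits and separators out of the statement entirely is to send the value as a typed parameter, as in this hedged sketch (table name taken from the question, the value illustrative):

    EXEC sp_executesql
         N'SELECT * FROM Inventory WHERE Quantity > @qty',
         N'@qty decimal(10,2)',
         @qty = 250.50;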

    Read the article
