Search Results

Search found 19170 results on 767 pages for 'microsoft azure'.


  • Is there a definitive list of the differences between the current version of SQL Azure and SQL Server 2008?

    - by Aim Kai
    I am a relative newbie when it comes to SQL Azure! I was wondering if there is a definitive list somewhere of what is and is not supported by SQL Azure compared to SQL Server 2008. I have had a look through Google, but I've noticed some of the blog posts are missing things I have found through my own testing. For example, quite a lot is summarised in this blog entry http://www.keepitsimpleandfast.com/2009/12/main-differences-between-sql-azure-and.html, which lists the following as unsupported:
    - Common Language Runtime (CLR)
    - Database file placement
    - Database mirroring
    - Distributed queries
    - Distributed transactions
    - Filegroup management
    - Global temporary tables
    - Spatial data and indexes
    - SQL Server configuration options
    - SQL Server Service Broker
    - System tables
    - Trace flags
    This is a repeat of the MSDN page http://msdn.microsoft.com/en-us/library/ff394115.aspx. From my own testing, the following also seem to have issues when migrating from SQL Server 2008 to SQL Azure: XML types (MSDN does mention large custom types - I guess that may include this, even if the data schema is really small?) and multi-part views. I've been using SQL Azure Migration Wizard v3.1.8 to migrate local databases into the cloud. Could anyone point me to a complete list, or give me any information on when these features are likely to be included in SQL Azure?

    Read the article

  • SQLAuthority News – Top 5 Latest Microsoft Certifications of 2013 – Guest Post

    - by Pinal Dave
    With the IT job market getting more and more competitive by the day, certifications are a must for anyone who wishes to get a strong foothold in the industry. The Microsoft community comes up with regular updates and enhancements to its existing products to keep up with the rapidly evolving requirements of the ICT industry. We bring you a list of the five latest Microsoft certifications that you must consider acquiring this year.
    MCSE: SharePoint
    Learn all about Windows Server 2012 and Microsoft SharePoint 2013, which brings an advanced set of features to the fore in this latest version. It introduces new capabilities for business intelligence, social media, branding, search, identity management and mobile devices, among other features. Enjoy a great user experience with sharing and collaboration in community forums, within a pixel-perfect SharePoint website. Data connectivity and business intelligence tools allow users to process and access data, analyze reports, and share and collaborate with each other more conveniently.
    Microsoft Specialist: Microsoft Project 2013
    The only project management system that works seamlessly with Microsoft's other applications and cloud solutions, MS Project 2013 offers more than meets the eye. It provides for easier management and monitoring of projects so that users can ensure timely delivery while improving productivity significantly. So keep all your projects on track and collaborate with your team like never before with this enhanced release! This one's a must for all project managers.
    MCSE Messaging
    Another one of Microsoft's gems is its messaging environment, which has also seen its latest release, Microsoft Exchange Server 2013. Messaging administrators can take up this training and validate their expertise in Unified Messaging, Exchange Online, PowerShell and Virtualization strategies through the MCSE Messaging certification in Exchange Server. If you wish to enhance the productivity and data security of your organization while being flexible and extremely efficient, this is the right certification for you.
    MCSE Communication
    An enterprise can function optimally on the strength of its information flow and communication systems. With Lync Server 2013, you can introduce a whole new world of unified communications, consisting of audio/video conferencing, dial-in, Persistent Chat, instant chat, and EDGE services, in your organization. Utilize IT to serve and support business objectives by mastering this UC technology with the latest MCSE Communication course on using Microsoft Lync Server 2013.
    MCSE: SQL Server 2012 BI Platform
    The decision-making process is largely influenced by the underlying enterprise information used by management for business intelligence. Therefore, a robust business intelligence platform that anchors enterprise IT and transforms it into operational efficiencies is the need of the hour. The SQL Server 2012 BI Platform certification helps professionals implement, manage and maintain a BI database infrastructure effectively. IT professionals with BI skills are highly sought after these days.
    MCSD: Windows Store Apps
    A Microsoft Certified Solutions Developer certification in Windows Store Apps validates your potential in designing interactive apps. Learn the Essentials of Developing Windows Store Apps using HTML5 and JavaScript and establish yourself as an ace developer capable of creating fast and fluid Metro style apps for Windows 8 that are accessible on a variety of devices.
    You can also go ahead and take the Essentials of Developing Windows Store Apps using C# route if you're already familiar with and working in the C# programming language. Developers are thus free to choose their favorite development stream, which opens the door for them to get ready for the latest and most exciting application development platform: Windows Store apps. Software developers with these skills are in great demand in the industry today. In order to stay competitive in their respective fields, it is imperative that IT personnel update their knowledge on a regular basis. Certifications are a means to achieve this goal. No longer considered an optional pre-requisite, major IT certifications such as these are now essential to stay afloat in a cut-throat industry where technologies change on a daily basis. This blog post is written by Aruneet Anand of Koenig Solutions. Koenig Solutions provides training for all of the above courses. For more information, visit the website. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology Tagged: Microsoft Certifications

    Read the article

  • Logging Output of Azure Startup Tasks to the Event Log

    - by Your DisplayName here!
    This can come in handy when troubleshooting:

    using System;
    using System.Diagnostics;
    using System.Text;

    namespace Thinktecture.Azure
    {
        class Program
        {
            static EventLog _eventLog = new EventLog("Application", ".", "StartupTaskShell");
            static StringBuilder _out = new StringBuilder(64);
            static StringBuilder _err = new StringBuilder(64);

            static int Main(string[] args)
            {
                if (args.Length != 1)
                {
                    Console.WriteLine("Invalid arguments: " + String.Join(", ", args));
                    _eventLog.WriteEntry("Invalid arguments: " + String.Join(", ", args));
                    return -1;
                }

                var task = args[0];

                ProcessStartInfo info = new ProcessStartInfo()
                {
                    FileName = task,
                    WorkingDirectory = Environment.CurrentDirectory,
                    UseShellExecute = false,
                    ErrorDialog = false,
                    CreateNoWindow = true,
                    RedirectStandardOutput = true,
                    RedirectStandardError = true
                };

                var process = new Process();
                process.StartInfo = info;

                process.OutputDataReceived += (s, e) =>
                {
                    if (e.Data != null)
                    {
                        _out.AppendLine(e.Data);
                    }
                };
                process.ErrorDataReceived += (s, e) =>
                {
                    if (e.Data != null)
                    {
                        _err.AppendLine(e.Data);
                    }
                };

                process.Start();
                process.BeginOutputReadLine();
                process.BeginErrorReadLine();
                process.WaitForExit();

                var outString = _out.ToString();
                var errString = _err.ToString();

                if (!string.IsNullOrWhiteSpace(outString))
                {
                    outString = String.Format("Standard Out for {0}\n\n{1}", task, outString);
                    _eventLog.WriteEntry(outString, EventLogEntryType.Information);
                }

                if (!string.IsNullOrWhiteSpace(errString))
                {
                    errString = String.Format("Standard Err for {0}\n\n{1}", task, errString);
                    _eventLog.WriteEntry(errString, EventLogEntryType.Error);
                }

                return 0;
            }
        }
    }

    You then wrap your startup tasks with the StartupTaskShell and you'll be able to see stdout and stderr in the application event log.
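    One detail worth noting: WriteEntry only succeeds if the "StartupTaskShell" event source exists or can be created, which requires elevation. As a minimal sketch (not part of the original post), you could guard for this explicitly at the top of Main, assuming the startup task runs with an elevated execution context:

            // Hypothetical addition: make sure the event source exists before logging.
            // Creating a source needs admin rights, which an elevated startup task has.
            if (!EventLog.SourceExists("StartupTaskShell"))
            {
                EventLog.CreateEventSource("StartupTaskShell", "Application");
            }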

    Read the article

  • SanjayP's venture after Microsoft involves no Microsoft

    - by eddraper
    When I was at Microsoft, I always found Sanjay Parthasarathy to be a bright and passionate leader. While he was a bit disconnected at times from what was really going on out in the trenches, I always thought he was a true believer in what we in Developer Platform and Evangelism (DPE) were doing. He got it. He had started DPE and kicked a lot of doors down up in Redmond to make it happen. Back in the early 2000s, battles over platform choices at large customers were trench warfare… bayonets and hand grenades at the P-Code level. This model was not at all suited to Microsoft's org structure at the time. While there were plenty of people fully able to have competitive conversations around Windows Server, or AD, or Exchange, or the desktop, there weren't many who could have deep technical conversations around Java vs .NET and the platform "stack" as a cohesive, unified unit of value. This task fell to DPE.
    Sanjay ended up leaving Microsoft a number of months before me in 2009, and I remember thinking these exact words: "holy shit, SanjayP left Microsoft." When SanjayP left DPE years before that, Sheila Gulati had stepped into his shoes and I thought we were starting to miss a beat. Sheila had built an amazing business at Microsoft India, but I don't recall being inspired by her as a leader. SanjayP's talks felt like the opening scene of "Patton" with George C. Scott pacing in front of the American flag. Sheila was a voice on a con-call. When she moved on in 2007, Walid Abu-Hadba was given the reins. Personally, I don't ever recall even seeing his face. I think I might recall hearing his voice on some con-calls, but for all intents and purposes he was invisible to me. Perhaps this was the beginning of my carelessness around seeking "visibility."
    Fast forward to Build 2011. First off, we have no PDC – we have Build. Microsoft had made an 11-year investment by this time in building an organization to make its technology relevant to developers. One would think such an org would be in the driver's seat of such an event, but we see Windows product group people on the podiums. Watching, I could see the messaging unfold… but no story. It was like the old days. Demos and PowerPoints by team members building the tech, and in many cases VPs. The ensuing confusion is almost legendary now. Windows 8 was, and is, a pretty big deal… but who is telling the story – not just features and benefits, but the story of how it all fits together?
    Having been out of Microsoft for two years now, and looking in, I can only conclude that the "DPE of old" has at best been emasculated, and at worst been completely marginalized by internal politics, or perhaps the eternal march of the corporate entropy generator that resides at all large companies. I don't think this is a good thing for anyone.
    And now, back to Sanjay, who is the father of Microsoft DPE… I noticed that he has moved back to India and is doing start-up work. His current company Indix looks to be doing some interesting things with "big data", and here's their stack: nary a trace of anything Microsoft. What could account for this? I wonder…. Better availability of labor and expertise in India for this stack? Dunno, but even in India, leet R and Hadoop skills have to be hard to find. Technical superiority? This, I sincerely doubt. This stack, with SanjayP's name as CEO, leaves me with an unsettling feeling. If he did believe, he no longer does. One doesn't place bets with real money on things they don't believe in.
Perhaps he never did believe, and was a corporate creature seeking to find a niche for himself after which he manipulated me and others.  Or perhaps… anger… be it passive aggression or an outright “in your face F*** you” to his former masters. I guess in the end, only he knows the true reason… But I have my theory...

    Read the article

  • Windows Azure ASP.NET MVC Role behaves strangely when redirecting from HTTP to HTTPS

    - by Rinat Abdullin
    Subj. I've got an ASP.NET MVC 2 worker role application that does not differ much from the default template. When attempting a redirect from HTTP to HTTPS (this happens when we access a controller secured by the usual RequireSSL attribute implementation) we get a blank page with a "Bad Request" message. IntelliTrace shows this:
    Thrown: "The file '/Views/Home/LogOnUserControl.aspx' does not exist." (System.Web.HttpException)
    The call stack is really short:
    [External Code]
    App_Web_vfahw7gz.dll!ASP.views_shared_site_master.__Render__control1(System.Web.UI.HtmlTextWriter __w = {unknown}, System.Web.UI.Control parameterContainer = {unknown})
    [External Code]
    App_Web_bsbqxr44.dll!ASP.views_home_index_aspx.ProcessRequest(System.Web.HttpContext context = {unknown})
    [External Code]
    The user control reference is the usual one in /Views/Shared/Site.Master:
    <div id="logindisplay">
        <% Html.RenderPartial("LogOnUserControl"); %>
    </div>
    And the partial view LogOnUserControl.ashx is located in Views/Shared (and it is ASHX, not ASPX). The problem shows up when we try to access site pages that require auth and redirect. These pages are secured by the RequireSSL attribute (Redirect == true):

    [AttributeUsage(AttributeTargets.Method | AttributeTargets.Class, Inherited = true, AllowMultiple = false)]
    public sealed class RequireSslAttribute : FilterAttribute, IAuthorizationFilter
    {
        public bool Redirect { get; set; }

        // Methods
        public void OnAuthorization(AuthorizationContext filterContext)
        {
            // this gets messy when we are running custom ports
            // within the local dev fabric,
            // hence we disable this code in debug builds
    #if !DEBUG
            if (filterContext == null)
            {
                throw new ArgumentNullException("filterContext");
            }
            if (filterContext.HttpContext.Request.IsSecureConnection) return;

            var canRedirect = string.Equals(filterContext.HttpContext.Request.HttpMethod, "GET", StringComparison.OrdinalIgnoreCase);
            if (canRedirect && Redirect)
            {
                var builder = new UriBuilder
                {
                    Scheme = "https",
                    Host = filterContext.HttpContext.Request.Url.Host,
                    Path = filterContext.HttpContext.Request.RawUrl
                };
                filterContext.Result = new RedirectResult(builder.ToString());
            }
            else
            {
                throw new HttpException(0x193, "Access forbidden. The requested resource requires an SSL connection.");
            }
    #endif
        }
    }

    Obviously we compile in RELEASE for this case. Does anybody have any idea what could cause this strange exception and how to get rid of it?

    Read the article

  • Azure table storage: maximum variable size ?

    - by Egon
    I will be using table storage to store a lot of blob names in a single string, appended to each other using some special character. This string will skyrocket in size pretty soon. But is there a maximum length for a property of a particular entity (in my case, the string)?

    Read the article

  • Microsoft and jQuery

    - by Rick Strahl
    The jQuery JavaScript library has been steadily getting more popular and with recent developments from Microsoft, jQuery is also getting ever more exposure on the ASP.NET platform including now directly from Microsoft. jQuery is a light weight, open source DOM manipulation library for JavaScript that has changed how many developers think about JavaScript. You can download it and find more information on jQuery on www.jquery.com. For me jQuery has had a huge impact on how I develop Web applications and was probably the main reason I went from dreading to do JavaScript development to actually looking forward to implementing client side JavaScript functionality. It has also had a profound impact on my JavaScript skill level for me by seeing how the library accomplishes things (and often reviewing the terse but excellent source code). jQuery made an uncomfortable development platform (JavaScript + DOM) a joy to work on. Although jQuery is by no means the only JavaScript library out there, its ease of use, small size, huge community of plug-ins and pure usefulness has made it easily the most popular JavaScript library available today. As a long time jQuery user, I’ve been excited to see the developments from Microsoft that are bringing jQuery to more ASP.NET developers and providing more integration with jQuery for ASP.NET’s core features rather than relying on the ASP.NET AJAX library. Microsoft and jQuery – making Friends jQuery is an open source project but in the last couple of years Microsoft has really thrown its weight behind supporting this open source library as a supported component on the Microsoft platform. When I say supported I literally mean supported: Microsoft now offers actual tech support for jQuery as part of their Product Support Services (PSS) as jQuery integration has become part of several of the ASP.NET toolkits and ships in several of the default Web project templates in Visual Studio 2010. The ASP.NET MVC 3 framework (still in Beta) also uses jQuery for a variety of client side support features including client side validation and we can look forward toward more integration of client side functionality via jQuery in both MVC and WebForms in the future. In other words jQuery is becoming an optional but included component of the ASP.NET platform. PSS support means that support staff will answer jQuery related support questions as part of any support incidents related to ASP.NET which provides some piece of mind to some corporate development shops that require end to end support from Microsoft. In addition to including jQuery and supporting it, Microsoft has also been getting involved in providing development resources for extending jQuery’s functionality via plug-ins. Microsoft’s last version of the Microsoft Ajax Library – which is the successor to the native ASP.NET AJAX Library – included some really cool functionality for client templates, databinding and localization. As it turns out Microsoft has rebuilt most of that functionality using jQuery as the base API and provided jQuery plug-ins of these components. Very recently these three plug-ins were submitted and have been approved for inclusion in the official jQuery plug-in repository and been taken over by the jQuery team for further improvements and maintenance. 
Even more surprising: The jQuery-templates component has actually been approved for inclusion in the next major update of the jQuery core in jQuery V1.5, which means it will become a native feature that doesn’t require additional script files to be loaded. Imagine this – an open source contribution from Microsoft that has been accepted into a major open source project for a core feature improvement. Microsoft has come a long way indeed! What the Microsoft Involvement with jQuery means to you For Microsoft jQuery support is a strategic decision that affects their direction in client side development, but nothing stopped you from using jQuery in your applications prior to Microsoft’s official backing and in fact a large chunk of developers did so readily prior to Microsoft’s announcement. Official support from Microsoft brings a few benefits to developers however. jQuery support in Visual Studio 2010 means built-in support for jQuery IntelliSense, automatically added jQuery scripts in many projects types and a common base for client side functionality that actually uses what most developers are already using. If you have already been using jQuery and were worried about straying from the Microsoft line and their internal Microsoft Ajax Library – worry no more. With official support and the change in direction towards jQuery Microsoft is now following along what most in the ASP.NET community had already been doing by using jQuery, which is likely the reason for Microsoft’s shift in direction in the first place. ASP.NET AJAX and the Microsoft AJAX Library weren’t bad technology – there was tons of useful functionality buried in these libraries. However, these libraries never got off the ground, mainly because early incarnations were squarely aimed at control/component developers rather than application developers. For all the functionality that these controls provided for control developers they lacked in useful and easily usable application developer functionality that was easily accessible in day to day client side development. The result was that even though Microsoft shipped support for these tools in the box (in .NET 3.5 and 4.0), other than for the internal support in ASP.NET for things like the UpdatePanel and the ASP.NET AJAX Control Toolkit as well as some third party vendors, the Microsoft client libraries were largely ignored by the developer community opening the door for other client side solutions. Microsoft seems to be acknowledging developer choice in this case: Many more developers were going down the jQuery path rather than using the Microsoft built libraries and there seems to be little sense in continuing development of a technology that largely goes unused by the majority of developers. Kudos for Microsoft for recognizing this and gracefully changing directions. Note that even though there will be no further development in the Microsoft client libraries they will continue to be supported so if you’re using them in your applications there’s no reason to start running for the exit in a panic and start re-writing everything with jQuery. Although that might be a reasonable choice in some cases, jQuery and the Microsoft libraries work well side by side so that you can leave existing solutions untouched even as you enhance them with jQuery. 
The Microsoft jQuery Plug-ins – Solid Core Features One of the most interesting developments in Microsoft’s embracing of jQuery is that Microsoft has started contributing to jQuery via standard mechanism set for jQuery developers: By submitting plug-ins. Microsoft took some of the nicest new features of the unpublished Microsoft Ajax Client Library and re-wrote these components for jQuery and then submitted them as plug-ins to the jQuery plug-in repository. Accepted plug-ins get taken over by the jQuery team and that’s exactly what happened with the three plug-ins submitted by Microsoft with the templating plug-in even getting slated to be published as part of the jQuery core in the next major release (1.5). The following plug-ins are provided by Microsoft: jQuery Templates – a client side template rendering engine jQuery Data Link – a client side databinder that can synchronize changes without code jQuery Globalization – provides formatting and conversion features for dates and numbers The first two are ports of functionality that was slated for the Microsoft Ajax Library while functionality for the globalization library provides functionality that was already found in the original ASP.NET AJAX library. To me all three plug-ins address a pressing need in client side applications and provide functionality I’ve previously used in other incarnations, but with more complete implementations. Let’s take a close look at these plug-ins. jQuery Templates http://api.jquery.com/category/plugins/templates/ Client side templating is a key component for building rich JavaScript applications in the browser. Templating on the client lets you avoid from manually creating markup by creating DOM nodes and injecting them individually into the document via code. Rather you can create markup templates – similar to the way you create classic ASP server markup – and merge data into these templates to render HTML which you can then inject into the document or replace existing content with. Output from templates are rendered as a jQuery matched set and can then be easily inserted into the document as needed. Templating is key to minimize client side code and reduce repeated code for rendering logic. Instead a single template can be used in many places for updating and adding content to existing pages. Further if you build pure AJAX interfaces that rely entirely on client rendering of the initial page content, templates allow you to a use a single markup template to handle all rendering of each specific HTML section/element. I’ve used a number of different client rendering template engines with jQuery in the past including jTemplates (a PHP style templating engine) and a modified version of John Resig’s MicroTemplating engine which I built into my own set of libraries because it’s such a commonly used feature in my client side applications. jQuery templates adds a much richer templating model that allows for sub-templates and access to the data items. Like John Resig’s original Micro Template engine, the core basics of the templating engine create JavaScript code which means that templates can include JavaScript code. To give you a basic idea of how templates work imagine I have an application that downloads a set of stock quotes based on a symbol list then displays them in the document. 
To do this you can create an ‘item’ template that describes how each of the quotes is renderd as a template inside of the document: <script id="stockTemplate" type="text/x-jquery-tmpl"> <div id="divStockQuote" class="errordisplay" style="width: 500px;"> <div class="label">Company:</div><div><b>${Company}(${Symbol})</b></div> <div class="label">Last Price:</div><div>${LastPrice}</div> <div class="label">Net Change:</div><div> {{if NetChange > 0}} <b style="color:green" >${NetChange}</b> {{else}} <b style="color:red" >${NetChange}</b> {{/if}} </div> <div class="label">Last Update:</div><div>${LastQuoteTimeString}</div> </div> </script> The ‘template’ is little more than HTML with some markup expressions inside of it that define the template language. Notice the embedded ${} expressions which reference data from the quote objects returned from an AJAX call on the server. You can embed any JavaScript or value expression in these template expressions. There are also a number of structural commands like {{if}} and {{each}} that provide for rudimentary logic inside of your templates as well as commands ({{tmpl}} and {{wrap}}) for nesting templates. You can find more about the full set of markup expressions available in the documentation. To load up this data you can use code like the following: <script type="text/javascript"> //var Proxy = new ServiceProxy("../PageMethods/PageMethodsService.asmx/"); $(document).ready(function () { $("#btnGetQuotes").click(GetQuotes); }); function GetQuotes() { var symbols = $("#txtSymbols").val().split(","); $.ajax({ url: "../PageMethods/PageMethodsService.asmx/GetStockQuotes", data: JSON.stringify({ symbols: symbols }), // parameter map type: "POST", // data has to be POSTed contentType: "application/json", timeout: 10000, dataType: "json", success: function (result) { var quotes = result.d; var jEl = $("#stockTemplate").tmpl(quotes); $("#quoteDisplay").empty().append(jEl); }, error: function (xhr, status) { alert(status + "\r\n" + xhr.responseText); } }); }; </script> In this case an ASMX AJAX service is called to retrieve the stock quotes. The service returns an array of quote objects. The result is returned as an object with the .d property (in Microsoft service style) that returns the actual array of quotes. The template is applied with: var jEl = $("#stockTemplate").tmpl(quotes); which selects the template script tag and uses the .tmpl() function to apply the data to it. The result is a jQuery matched set of elements that can then be appended to the quote display element in the page. The template is merged against an array in this example. When the result is an array the template is automatically applied to each each array item. If you pass a single data item – like say a stock quote – the template works exactly the same way but is applied only once. Templates also have access to a $data item which provides the current data item and information about the tempalte that is currently executing. This makes it possible to keep context within the context of the template itself and also to pass context from a parent template to a child template which is very powerful. Templates can be evaluated by using the template selector and calling the .tmpl() function on the jQuery matched set as shown above or you can use the static $.tmpl() function to provide a template as a string. This allows you to dynamically create templates in code or – more likely – to load templates from the server via AJAX calls. 
In short there are options The above shows off some of the basics, but there’s much for functionality available in the template engine. Check the documentation link for more information and links to additional examples. The plug-in download also comes with a number of examples that demonstrate functionality. jQuery templates will become a native component in jQuery Core 1.5, so it’s definitely worthwhile checking out the engine today and get familiar with this interface. As much as I’m stoked about templating becoming part of the jQuery core because it’s such an integral part of many applications, there are also a couple shortcomings in the current incarnation: Lack of Error Handling Currently if you embed an expression that is invalid it’s simply not rendered. There’s no error rendered into the template nor do the various  template functions throw errors which leaves finding of bugs as a runtime exercise. I would like some mechanism – optional if possible – to be able to get error info of what is failing in a template when it’s rendered. No String Output Templates are always rendered into a jQuery matched set and there’s no way that I can see to directly render to a string. String output can be useful for debugging as well as opening up templating for creating non-HTML string output. Limited JavaScript Access Unlike John Resig’s original MicroTemplating Engine which was entirely based on JavaScript code generation these templates are limited to a few structured commands that can ‘execute’. There’s no code execution inside of script code which means you’re limited to calling expressions available in global objects or the data item passed in. This may or may not be a big deal depending on the complexity of your template logic. Error handling has been discussed quite a bit and it’s likely there will be some solution to that particualar issue by the time jQuery templates ship. The others are relatively minor issues but something to think about anyway. jQuery Data Link http://api.jquery.com/category/plugins/data-link/ jQuery Data Link provides the ability to do two-way data binding between input controls and an underlying object’s properties. The typical scenario is linking a textbox to a property of an object and have the object updated when the text in the textbox is changed and have the textbox change when the value in the object or the entire object changes. The plug-in also supports converter functions that can be applied to provide the conversion logic from string to some other value typically necessary for mapping things like textbox string input to say a number property and potentially applying additional formatting and calculations. In theory this sounds great, however in reality this plug-in has some serious usability issues. Using the plug-in you can do things like the following to bind data: person = { firstName: "rick", lastName: "strahl"}; $(document).ready( function() { // provide for two-way linking of inputs $("form").link(person); // bind to non-input elements explicitly $("#objFirst").link(person, { firstName: { name: "objFirst", convertBack: function (value, source, target) { $(target).text(value); } } }); $("#objLast").link(person, { lastName: { name: "objLast", convertBack: function (value, source, target) { $(target).text(value); } } }); }); This code hooks up two-way linking between a couple of textboxes on the page and the person object. 
The first line in the .ready() handler provides mapping of object to form field with the same field names as properties on the object. Note that .link() does NOT bind items into the textboxes when you call .link() – changes are mapped only when values change and you move out of the field. Strike one. The two following commands allow manual binding of values to specific DOM elements which is effectively a one-way bind. You specify the object and a then an explicit mapping where name is an ID in the document. The converter is required to explicitly assign the value to the element. Strike two. You can also detect changes to the underlying object and cause updates to the input elements bound. Unfortunately the syntax to do this is not very natural as you have to rely on the jQuery data object. To update an object’s properties and get change notification looks like this: function updateFirstName() { $(person).data("firstName", person.firstName + " (code updated)"); } This works fine in causing any linked fields to be updated. In the bindings above both the firstName input field and objFirst DOM element gets updated. But the syntax requires you to use a jQuery .data() call for each property change to ensure that the changes are tracked properly. Really? Sure you’re binding through multiple layers of abstraction now but how is that better than just manually assigning values? The code savings (if any) are going to be minimal. As much as I would like to have a WPF/Silverlight/Observable-like binding mechanism in client script, this plug-in doesn’t help much towards that goal in its current incarnation. While you can bind values, the ‘binder’ is too limited to be really useful. If initial values can’t be assigned from the mappings you’re going to end up duplicating work loading the data using some other mechanism. There’s no easy way to re-bind data with a different object altogether since updates trigger only through the .data members. Finally, any non-input elements have to be bound via code that’s fairly verbose and frankly may be more voluminous than what you might write by hand for manual binding and unbinding. Two way binding can be very useful but it has to be easy and most importantly natural. If it’s more work to hook up a binding than writing a couple of lines to do binding/unbinding this sort of thing helps very little in most scenarios. In talking to some of the developers the feature set for Data Link is not complete and they are still soliciting input for features and functionality. If you have ideas on how you want this feature to be more useful get involved and post your recommendations. As it stands, it looks to me like this component needs a lot of love to become useful. For this component to really provide value, bindings need to be able to be refreshed easily and work at the object level, not just the property level. It seems to me we would be much better served by a model binder object that can perform these binding/unbinding tasks in bulk rather than a tool where each link has to be mapped first. I also find the choice of creating a jQuery plug-in questionable – it seems a standalone object – albeit one that relies on the jQuery library – would provide a more intuitive interface than the current forcing of options onto a plug-in style interface. Out of the three Microsoft created components this is by far the least useful and least polished implementation at this point. 
jQuery Globalization http://github.com/jquery/jquery-global Globalization in JavaScript applications often gets short shrift and part of the reason for this is that natively in JavaScript there’s little support for formatting and parsing of numbers and dates. There are a number of JavaScript libraries out there that provide some support for globalization, but most are limited to a particular portion of globalization. As .NET developers we’re fairly spoiled by the richness of APIs provided in the framework and when dealing with client development one really notices the lack of these features. While you may not necessarily need to localize your application the globalization plug-in also helps with some basic tasks for non-localized applications: Dealing with formatting and parsing of dates and time values. Dates in particular are problematic in JavaScript as there are no formatters whatsoever except the .toString() method which outputs a verbose and next to useless long string. With the globalization plug-in you get a good chunk of the formatting and parsing functionality that the .NET framework provides on the server. You can write code like the following for example to format numbers and dates: var date = new Date(); var output = $.format(date, "MMM. dd, yy") + "\r\n" + $.format(date, "d") + "\r\n" + // 10/25/2010 $.format(1222.32213, "N2") + "\r\n" + $.format(1222.33, "c") + "\r\n"; alert(output); This becomes even more useful if you combine it with templates which can also include any JavaScript expressions. Assuming the globalization plug-in is loaded you can create template expressions that use the $.format function. Here’s the template I used earlier for the stock quote again with a couple of formats applied: <script id="stockTemplate" type="text/x-jquery-tmpl"> <div id="divStockQuote" class="errordisplay" style="width: 500px;"> <div class="label">Company:</div><div><b>${Company}(${Symbol})</b></div> <div class="label">Last Price:</div> <div>${$.format(LastPrice,"N2")}</div> <div class="label">Net Change:</div><div> {{if NetChange > 0}} <b style="color:green" >${NetChange}</b> {{else}} <b style="color:red" >${NetChange}</b> {{/if}} </div> <div class="label">Last Update:</div> <div>${$.format(LastQuoteTime,"MMM dd, yyyy")}</div> </div> </script> There are also parsing methods that can parse dates and numbers from strings into numbers easily: alert($.parseDate("25.10.2010")); alert($.parseInt("12.222")); // de-DE uses . for thousands separators As you can see culture specific options are taken into account when parsing. The globalization plugin provides rich support for a variety of locales: Get a list of all available cultures Query cultures for culture items (like currency symbol, separators etc.) Localized string names for all calendar related items (days of week, months) Generated off of .NET’s supported locales In short you get much of the same functionality that you already might be using in .NET on the server side. The plugin includes a huge number of locales and an Globalization.all.min.js file that contains the text defaults for each of these locales as well as small locale specific script files that define each of the locale specific settings. It’s highly recommended that you NOT use the huge globalization file that includes all locales, but rather add script references to only those languages you explicitly care about. Overall this plug-in is a welcome helper. 
Even if you use it with a single locale (like en-US) and do no other localization, you’ll gain solid support for number and date formatting which is a vital feature of many applications. Changes for Microsoft It’s good to see Microsoft coming out of its shell and away from the ‘not-built-here’ mentality that has been so pervasive in the past. It’s especially good to see it applied to jQuery – a technology that has stood in drastic contrast to Microsoft’s own internal efforts in terms of design, usage model and… popularity. It’s great to see that Microsoft is paying attention to what customers prefer to use and supporting the customer sentiment – even if it meant drastically changing course of policy and moving into a more open and sharing environment in the process. The additional jQuery support that has been introduced in the last two years certainly has made lives easier for many developers on the ASP.NET platform. It’s also nice to see Microsoft submitting proposals through the standard jQuery process of plug-ins and getting accepted for various very useful projects. Certainly the jQuery Templates plug-in is going to be very useful to many especially since it will be baked into the jQuery core in jQuery 1.5. I hope we see more of this type of involvement from Microsoft in the future. Kudos!© Rick Strahl, West Wind Technologies, 2005-2010Posted in jQuery  ASP.NET  

    Read the article

  • Windows Azure Error: Failed to start Storage Emulator: the SQL Server instance ‘localhost\SQLExpress’ could not be found

    - by DigiMortal
    When running some of your Windows Azure applications while the storage emulator is not configured, you may get the following error: "Windows Azure Tools: Failed to initialize Windows Azure storage emulator. Unable to start Development Storage. Failed to start Storage Emulator: the SQL Server instance ‘localhost\SQLExpress’ could not be found. Please configure the SQL Server instance for Storage Emulator using the ‘DSInit’ utility in the Windows Azure SDK.". Here's how to solve this problem. You need to run the DSInit utility to create the database. For Windows Azure SDK 1.6 the location of the DSInit utility is: C:\Program Files\Windows Azure Emulator\emulator\devstore
    By default DSInit expects your database server to be (local)\SQLEXPRESS, but you can change that easily. If you have a SQL Server instance called SQLEXPRESS then it is enough to just run DSInit. If you need some other instance, run the following command:
    DSInit /sqlinstance:<instance name>
    For the default instance use "." as the instance name:
    DSInit /sqlinstance:.
    You can find more information about sqlinstance and the other switches in the DSInit documentation. If the database was correctly created you should see a dialog confirming this. When the storage database is ready you can run your application.
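    Once DSInit has created the database, your role code can talk to the emulator through the standard development storage account. A minimal sketch of a smoke test (not from the original post; the container name is made up):

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    class EmulatorSmokeTest
    {
        static void Main()
        {
            // "UseDevelopmentStorage=true" points the client at the local storage emulator.
            var account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");

            // Create a test container to confirm the emulator is up and initialized.
            var blobClient = account.CreateCloudBlobClient();
            var container = blobClient.GetContainerReference("emulator-smoke-test");
            container.CreateIfNotExist();
        }
    }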

    Read the article

  • How to delete just one LINE of text (NOT a table-row!) with a single KEYBOARD shortcut in Microsoft Office Word 2010?

    - by Sk8erPeter
    Are there any shortcuts to delete just one line (which is NOT a table row, just a single line of text) in Microsoft Office Word 2010? If not, how can I assign one to do it? In the worst case, can I make a macro (in VB) which could do the same with a custom shortcut? To clarify my problem: I would like to avoid multiple clicks and/or pushing multiple buttons, even if I click in the middle of the line of text. :) For example, in Notepad++ I can delete the entire current line with Ctrl+L, in NetBeans I can delete an entire line with Ctrl+E, in Eclipse I can delete the current line with Ctrl+D, etc., where it doesn't really matter where my cursor actually is... so these simple solutions exist, and I'm looking for the same in Word. It really would simplify my work in huge documents.

    Read the article

  • Azure WNS to Win8 - Push Notifications for Metro Apps

    - by JoshReuben
    Background
    The Windows Azure Toolkit for Windows 8 allows you to build a Windows Azure cloud service that can send push notifications to registered Metro apps via the Windows Notification Service (WNS). Some configuration is required - you need to:
    - Register the Metro app for Windows Live Application Management
    - Provide the Package SID & Client Secret to WNS
    - Modify the Azure cloud app cscfg file and the Metro app package.appxmanifest file to contain a matching Metro package name, SID and client secret.
    The Mechanism
    These notifications take the form of XAML Tile, Toast, Raw or Badge UI notifications. The core engine is provided via the WNS nuget recipe, which exposes an API for constructing payloads and posting notifications to WNS. An application receives push notifications by requesting a notification channel from WNS, which returns a channel URI that the application then registers with a cloud service. In the cloud service, a WnsAccessTokenProvider authenticates with WNS by providing its credentials, the package SID and secret key, and receives in return an access token that the provider caches and can reuse for multiple notification requests. The cloud service constructs a notification request by filling out a template class that contains the information that will be sent with the notification, including text and image references. Using the channel URI of a registered client, the cloud service can then send a notification whenever it has an update for the user. The package contains the NotificationSendUtils class for submitting notifications.
    The Windows Azure Toolkit for Windows 8 (WAT) provides the PNWorker sample pair of solutions - the Azure server side contains a WebRole & a WorkerRole. The WebRole allows submission of new push notifications into an Azure Queue which the WorkerRole extracts and processes.
    Further background resources:
    - http://watwindows8.codeplex.com/ - Windows Azure Toolkit for Windows 8
    - http://watwindows8.codeplex.com/wikipage?title=Push%20Notification%20Worker%20Sample - WAT WNS sample setup
    - http://watwindows8.codeplex.com/wikipage?title=Using%20the%20Windows%208%20Cloud%20Application%20Services%20Application - using Windows 8 with Cloud Application Services
    A bit of Configuration
    Register the Metro app for Windows Live Application Management: from the Publish tab of your Metro app's current app manifest, copy the Package Display Name and the Publisher. From https://manage.dev.live.com/Build/:
    - Package name: <-- we need to change this
    - Client secret: keep this
    - Package Security Identifier (SID): keep this
    Verify the app here: https://manage.dev.live.com/Applications/Index - so this step is done. "If you wish to send push notifications in your application, provide your Package Security Identifier (SID) and client secret to WNS."
    Provide the Package SID & Client Secret to WNS:
    - http://msdn.microsoft.com/en-us/library/windows/apps/hh465407.aspx - How to authenticate with WNS
    - https://appdev.microsoft.com/StorePortals/en-us/Account/Signup/PurchaseSubscription - register the app with the dashboard - you need a registration code, or register a new account & pay $170 shekels
    - http://msdn.microsoft.com/en-us/library/windows/apps/hh868184.aspx - Registering for a Windows Store developer account
    - http://msdn.microsoft.com/en-us/library/windows/apps/hh868187.aspx - Picking a Microsoft account for the Windows Store
    The WNS Nuget Recipe
    The WNS Recipe is a nuget package that provides an API for authenticating against WNS, constructing payloads and posting notifications to WNS.
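    Before drilling into the recipe's server-side API, here is a minimal sketch of the client half of the mechanism described above - the Metro app asking WNS for a channel and handing the channel URI to the cloud service. This is not taken from the toolkit sample; the registration endpoint and type names are assumptions:

    using System;
    using System.Net.Http;
    using System.Threading.Tasks;
    using Windows.Networking.PushNotifications;

    public static class PushChannelRegistration
    {
        // Hypothetical helper: open a WNS channel and register its URI with the cloud service.
        public static async Task RegisterWithCloudServiceAsync()
        {
            // Ask WNS for a notification channel; the returned URI identifies this app/device.
            PushNotificationChannel channel =
                await PushNotificationChannelManager.CreatePushNotificationChannelForApplicationAsync();

            // Hand the channel URI to the cloud service (endpoint is an assumption).
            using (var client = new HttpClient())
            {
                await client.PostAsync(
                    new Uri("https://myservice.cloudapp.net/api/register"),
                    new StringContent(channel.Uri));
            }
        }
    }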
After installing this package, a WnsRecipe assembly is added to project references. To send notifications using WNS, first register the application at the Windows Push Notifications & Live Connect portal to obtain Package Security Identifier (SID) and a secret key that your cloud service uses to authenticate with WNS. An application receives push notifications by requesting a notification channel from WNS, which returns a channel URI that the application then registers with a cloud service. In the cloud service, the WnsAccessTokenProvider authenticates with WNS by providing its credentials, the package SID and secret key, and receives in return an access token that the provider caches and can reuse for multiple notification requests. The cloud service constructs a notification request by filling out a template class that contains the information that will be sent with the notification, including text and image references.Using the channel URI of a registered client, the cloud service can then send a notification whenever it has an update for the user. var provider = new WnsAccessTokenProvider(clientId, clientSecret); var notification = new ToastNotification(provider) {     ToastType = ToastType.ToastText02,     Text = new List<string> { "blah"} }; notification.Send(channelUri); the WNS Recipe is instrumented to write trace information via a trace listener – configuratively or programmatically from Application_Start(): WnsDiagnostics.Enable(); WnsDiagnostics.TraceSource.Listeners.Add(new DiagnosticMonitorTraceListener()); WnsDiagnostics.TraceSource.Switch.Level = SourceLevels.Verbose; The WAT PNWorker Sample The Azure server side contains a WebRole & a WorkerRole. The WebRole allows submission of new push notifications into an Azure Queue which the WorkerRole extracts and processes. Overview of Push Notification Worker Sample The toolkit includes a sample application based on the same solution structure as the one created by theWindows 8 Cloud Application Services project template. The sample demonstrates how to off-load the job of sending Windows Push Notifications using a Windows Azure worker role. You can find the source code in theSamples\PNWorker folder. This folder contains a full version of the sample application showing how to use Windows Push Notifications using ASP.NET Membership as the authentication mechanism. The sample contains two different solution files: WATWindows.Azure.sln: This solution must be opened with Visual Studio 2010 and contains the projects related to the Windows Azure web and worker roles. WATWindows.Client.sln: This solution must be opened with Visual Studio 11 and contains the Windows Metro style application project. Only Visual Studio 2010 supports Windows Azure cloud projects so you currently need to use this edition to launch the server application. This will change in a future release of the Windows Azure tools when support for Visual Studio 11 is enabled. Important: Setting up the PNWorker Sample Before running the PNWorker sample, you need to register the application and configure it: 1. Register the app: To register your application, go to the Windows Live Application Management site for Metro style apps at https://manage.dev.live.com/build and sign in with your Windows Live ID. In the Windows Push Notifications & Live Connect page, enter the following information. Package Display Name PNWorker.Sample Publisher CN=127.0.0.1, O=TESTING ONLY, OU=Windows Azure DevFabric 2. 3. 
Once you register the application, make a note of the values shown in the portal for Client Secret,Package Name and Package SID. 4. Configure the app - double-click the SetupSample.cmd file located inside the Samples\PNWorker folder to launch a tool that will guide you through the process of configuring the sample. setup runs a PowerShell script that requires running with administration privileges to allow the scripts to execute in your machine. When prompted, enter the Client Secret, Package Name, and Package Security Identifier you obtained previously and wait until the tool finishes configuring your sample. Running the PNWorker Sample To run this sample, you must run both the client and the server application projects. 1. Open Visual Studio 2010 as an administrator. Open the WATWindows.Azure.sln solution. Set the start-up project of the solution as the cloud project. Run the app in the dev fabric to test. 2. Open Visual Studio 11 and open the WATWindows.Client.sln solution. Run the Metro client application. In the client application, click Reopen channel and send to server. à the application opens the channel and registers it with the cloud application, & the Output area shows the channel URI. 3. Refresh the WebRole's Push Notifications page to see the UI list the newly registered client. 4. Send notifications to the client application by clicking the Send Notification button. Setup 3 command files + 1 powershell script: SetupSample.cmd –> SetupWPNS.vbs –> SetupWPNS.cmd –> SetupWPNS.UpdateWPNSCredentialsInServiceConfiguration.ps1 appears to set PackageName – from manifest Client Id package security id (SID) – from registration Client Secret – from registration The following configs are modified: WATWindows\ServiceConfiguration.Cloud.cscfg WATWindows\ServiceConfiguration.Local.cscfg WATWindows.Client\package.appxmanifest WatWindows.Notifications A class library – it references the following WNS DLL: C:\WorkDev\CountdownValue\AzureToolkits\WATWindows8\Samples\PNWorker\packages\WnsRecipe.0.0.3.0\lib\net40\WnsRecipe.dll NotificationJobRequest A DataContract for triggering notifications:     using System.Runtime.Serialization; using Microsoft.Windows.Samples.Notifications;     [DataContract]     [KnownType(typeof(WnsAccessTokenProvider))] public class NotificationJobRequest     {               [DataMember] public bool ProcessAsync { get; set; }          [DataMember] public string Payload { get; set; }         [DataMember] public string ChannelUrl { get; set; }         [DataMember] public NotificationType NotificationType { get; set; }         [DataMember] public IAccessTokenProvider AccessTokenProvider { get; set; }         [DataMember] public NotificationSendOptions NotificationSendOptions{ get; set; }     } Investigated these types: WnsAccessTokenProvider – a DataContract that contains the client Id and client secret NotificationType – an enum that can be: Tile, Toast, badge, Raw IAccessTokenProvider – get or reset the access token NotificationSendOptions – SecondsTTL, NotificationPriority (enum), isCache, isRequestForStatus, Tag   There is also a NotificationJobSerializer class which basically wraps a DataContractSerializer serialization / deserialization of NotificationJobRequest The WNSNotificationJobProcessor class This class wraps the NotificationSendUtils API – it periodically extracts any NotificationJobRequest objects from a CloudQueue and submits them to WNS. 
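    Before looking at how the processor dequeues these requests, it helps to picture how one gets onto the queue in the first place: the producing side simply serializes a NotificationJobRequest and drops it on the same CloudQueue. A rough sketch follows, assuming the serializer exposes a Serialize counterpart to the Deserialize call shown below, and with the queue name and storage account made up rather than read from the sample's configuration:

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;
    using Microsoft.Windows.Samples.Notifications;

    public class NotificationJobEnqueuer
    {
        // Hypothetical helper: serialize a job request and drop it on the worker's queue.
        public void Enqueue(string channelUrl, string payload, string clientId, string clientSecret)
        {
            var request = new NotificationJobRequest
            {
                ChannelUrl = channelUrl,
                Payload = payload,
                NotificationType = NotificationType.Toast,
                AccessTokenProvider = new WnsAccessTokenProvider(clientId, clientSecret),
                NotificationSendOptions = new NotificationSendOptions(),
                ProcessAsync = false
            };

            // Assumed: NotificationJobSerializer wraps a DataContractSerializer and
            // offers Serialize alongside the Deserialize used by the worker.
            string xml = new NotificationJobSerializer().Serialize(request);

            // Queue name is an assumption; the sample reads it from RequestQueueName config.
            var queueClient = CloudStorageAccount.DevelopmentStorageAccount.CreateCloudQueueClient();
            var queue = queueClient.GetQueueReference("pushnotificationrequests");
            queue.CreateIfNotExist();
            queue.AddMessage(new CloudQueueMessage(xml));
        }
    }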
The ProcessJobMessageRequest method – this is the punchline: it will deserialize a CloudQueueMessage into a NotificationJobRequest & send pass its contents to NotificationUtils to SendAsynchronously / SendSynchronously, (and then dequeue the message).     public override void ProcessJobMessageRequest(CloudQueueMessage notificationJobMessageRequest)         { Trace.WriteLine("Processing a new Notification Job Request", "Information"); NotificationJobRequest pushNotificationJob =                 NotificationJobSerializer.Deserialize(notificationJobMessageRequest.AsString); if (pushNotificationJob != null)             { if (pushNotificationJob.ProcessAsync)                 { Trace.WriteLine("Sending the notification asynchronously", "Information"); NotificationSendUtils.SendAsynchronously( new Uri(pushNotificationJob.ChannelUrl),                         pushNotificationJob.AccessTokenProvider,                         pushNotificationJob.Payload,                         result => this.ProcessSendResult(pushNotificationJob, result),                         result => this.ProcessSendResultError(pushNotificationJob, result),                         pushNotificationJob.NotificationType,                         pushNotificationJob.NotificationSendOptions);                 } else                 { Trace.WriteLine("Sending the notification synchronously", "Information"); NotificationSendResult result = NotificationSendUtils.Send( new Uri(pushNotificationJob.ChannelUrl),                         pushNotificationJob.AccessTokenProvider,                         pushNotificationJob.Payload,                         pushNotificationJob.NotificationType,                         pushNotificationJob.NotificationSendOptions); this.ProcessSendResult(pushNotificationJob, result);                 }             } else             { Trace.WriteLine("Could not deserialize the notification job", "Error");             } this.queue.DeleteMessage(notificationJobMessageRequest);         } Investigation of NotificationSendUtils class - This is the engine – it exposes Send and a SendAsyncronously overloads that take the following params from the NotificationJobRequest: Channel Uri AccessTokenProvider Payload NotificationType NotificationSendOptions WebRole WebRole is a large MVC project – it references WatWindows.Notifications as well as the following WNS DLL: \AzureToolkits\WATWindows8\Samples\PNWorker\packages\WnsRecipe.0.0.3.0\lib\net40\NotificationsExtensions.dll Controllers\PushNotificationController.cs Notification related namespaces:     using Notifications;     using NotificationsExtensions;     using NotificationsExtensions.BadgeContent;     using NotificationsExtensions.RawContent;     using NotificationsExtensions.TileContent;     using NotificationsExtensions.ToastContent;     using Windows.Samples.Notifications; TokenProvider – initialized from the Azure RoleEnvironment:   IAccessTokenProvider tokenProvider = new WnsAccessTokenProvider(         RoleEnvironment.GetConfigurationSettingValue("WNSPackageSID"),         RoleEnvironment.GetConfigurationSettingValue("WNSClientSecret")); SendNotification method – calls QueuePushMessage method to create and serialize a NotificationJobRequest and enqueue it in a CloudQueue [HttpPost]         public ActionResult SendNotification(             [ModelBinder(typeof(NotificationTemplateModelBinder))] INotificationContent notification,             string channelUrl,             NotificationPriority priority = NotificationPriority.Normal)         {             var payload = 
    [HttpPost]
    public ActionResult SendNotification(
        [ModelBinder(typeof(NotificationTemplateModelBinder))] INotificationContent notification,
        string channelUrl,
        NotificationPriority priority = NotificationPriority.Normal)
    {
        var payload = notification.GetContent();
        var options = new NotificationSendOptions()
        {
            Priority = priority
        };
        var notificationType =
            notification is IBadgeNotificationContent ? NotificationType.Badge :
            notification is IRawNotificationContent ? NotificationType.Raw :
            notification is ITileNotificationContent ? NotificationType.Tile :
            NotificationType.Toast;

        this.QueuePushMessage(payload, channelUrl, notificationType, options);

        object response = new
        {
            Status = "Queued for delivery to WNS"
        };

        return this.Json(response);
    }

GetSendTemplate method – creates the cshtml partial rendering based on the notification type:

    [HttpPost]
    public ActionResult GetSendTemplate(NotificationTemplateViewModel templateOptions)
    {
        PartialViewResult result = null;
        switch (templateOptions.NotificationType)
        {
            case "Badge":
                templateOptions.BadgeGlyphValueContent = Enum.GetNames(typeof(GlyphValue));
                ViewBag.ViewData = templateOptions;
                result = PartialView("_" + templateOptions.NotificationTemplateType);
                break;
            case "Raw":
                ViewBag.ViewData = templateOptions;
                result = PartialView("_Raw");
                break;
            case "Toast":
                templateOptions.TileImages = this.blobClient.GetAllBlobsInContainer(ConfigReader.GetConfigValue("TileImagesContainer")).OrderBy(i => i.FileName).ToList();
                templateOptions.ToastAudioContent = Enum.GetNames(typeof(ToastAudioContent));
                templateOptions.Priorities = Enum.GetNames(typeof(NotificationPriority));
                ViewBag.ViewData = templateOptions;
                result = PartialView("_" + templateOptions.NotificationTemplateType);
                break;
            case "Tile":
                templateOptions.TileImages = this.blobClient.GetAllBlobsInContainer(ConfigReader.GetConfigValue("TileImagesContainer")).OrderBy(i => i.FileName).ToList();
                ViewBag.ViewData = templateOptions;
                result = PartialView("_" + templateOptions.NotificationTemplateType);
                break;
        }

        return result;
    }

Investigated these types:
ToastAudioContent – an enum of different Win8 sound effects for toast notifications
GlyphValue – an enum of different Win8 icons for badge notifications
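The QueuePushMessage helper that SendNotification calls above is not reproduced in these notes; a plausible sketch of what it does is shown here. The field names (tokenProvider, notificationJobSerializer, pushNotificationRequestsQueue) and the ProcessAsync choice are assumptions, not the toolkit's actual code.

    using Microsoft.WindowsAzure.StorageClient;

    // Hypothetical sketch of the controller's QueuePushMessage helper:
    // wrap the notification in a NotificationJobRequest, serialize it,
    // and drop it on the Azure queue that the worker role polls.
    private void QueuePushMessage(
        string payload,
        string channelUrl,
        NotificationType notificationType,
        NotificationSendOptions options)
    {
        var job = new NotificationJobRequest
        {
            Payload = payload,
            ChannelUrl = channelUrl,
            NotificationType = notificationType,
            NotificationSendOptions = options,
            AccessTokenProvider = this.tokenProvider, // WnsAccessTokenProvider built from WNSPackageSID / WNSClientSecret
            ProcessAsync = true                       // assumption: let the worker role send asynchronously
        };

        string content = this.notificationJobSerializer.Serialize(job);
        this.pushNotificationRequestsQueue.AddMessage(new CloudQueueMessage(content));
    }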
Infrastructure\NotificationTemplateModelBinder.cs

WNS namespace references:

    using NotificationsExtensions.BadgeContent;
    using NotificationsExtensions.RawContent;
    using NotificationsExtensions.TileContent;
    using NotificationsExtensions.ToastContent;

Various NotificationFactory-derived types can serve as bindable models in MVC for creating INotificationContent types. Default values are also set for IWideTileNotificationContent and IToastNotificationContent.

    Type factoryType = null;
    switch (notificationType)
    {
        case "Badge":
            factoryType = typeof(BadgeContentFactory);
            break;
        case "Tile":
            factoryType = typeof(TileContentFactory);
            break;
        case "Toast":
            factoryType = typeof(ToastContentFactory);
            break;
        case "Raw":
            factoryType = typeof(RawContentFactory);
            break;
    }

Investigated these types:
BadgeContentFactory – CreateBadgeGlyph, CreateBadgeNumeric (???)
TileContentFactory – many notification content creation methods, apparently one for every tile layout type
ToastContentFactory – many notification content creation methods, apparently one for every toast layout type (see the sketch below)
RawContentFactory – passing strings
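As a quick, hedged illustration of these factories used by hand rather than through the model binder: only the factory class names and GetContent() are confirmed by the notes above; the CreateToastText01 method and its TextBodyWrap property are assumptions borrowed from the Windows 8 NotificationsExtensions sample library, so the exact names may differ in the WnsRecipe build.

    using NotificationsExtensions.ToastContent;

    // Assumed sketch: one factory method per toast template, returning a typed
    // content object whose GetContent() yields the XML payload WNS expects.
    var toast = ToastContentFactory.CreateToastText01();           // assumed method name (ToastText01 template)
    toast.TextBodyWrap.Text = "Hello from the Azure worker role";  // assumed property name
    string payload = toast.GetContent();                           // this string becomes NotificationJobRequest.Payload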
WorkerRole

WNS namespace references:

    using Notifications;
    using Notifications.WNS;
    using Windows.Samples.Notifications;

OnStart() method – on worker role startup, initialize the NotificationJobSerializer, the CloudQueue, and the WNSNotificationJobProcessor:

    _notificationJobSerializer = new NotificationJobSerializer();
    _cloudQueueClient = this.account.CreateCloudQueueClient();
    _pushNotificationRequestsQueue = _cloudQueueClient.GetQueueReference(ConfigReader.GetConfigValue("RequestQueueName"));
    _processor = new WNSNotificationJobProcessor(_notificationJobSerializer, _pushNotificationRequestsQueue);

Run() method – poll the Azure queue for NotificationJobRequest messages and process them:

    while (true)
    {
        Trace.WriteLine("Checking for Messages", "Information");
        try
        {
            Parallel.ForEach(
                this.pushNotificationRequestsQueue.GetMessages(this.batchSize),
                this.processor.ProcessJobMessageRequest);
        }
        catch (Exception e)
        {
            Trace.WriteLine(e.ToString(), "Error");
        }

        Trace.WriteLine(string.Format("Sleeping for {0} seconds", this.pollIntervalMiliseconds / 1000));
        Thread.Sleep(this.pollIntervalMiliseconds);
    }

How I learned to appreciate Win8

There is really only one application architecture for Windows 8 apps: a Metro client side and an Azure back end – and that is a good thing. With WNS, tier integration is so automated that you don't even have to leverage an HTTP push API such as SignalR. This is a pretty powerful development paradigm, and it has changed the way I look at Windows 8 for RAD business apps. When I originally looked at Win8 and the WinRT API, my first opinion on Win8 dev was as follows – GOOD: WinRT, WRL, C++/CX, WinJS, XAML (and ease of Direct3D integration); BAD: low projected market penetration, .NET lobotomized (only 8% of .NET 4.5 classes can be used in Win8 non-desktop apps - http://bit.ly/HRuJr7); UGLY: Metro pastel tiles! Perhaps my 80s teenage years gave me a punk reactionary sense of revulsion towards the Partridge Family 70s style that the Metro UX seems to have appropriated. On second thought though, it simplifies UI dev to a single paradigm (although UX guys will need to change career) – you will not find an easier app dev environment. Speculation: if LightSwitch is going to support HTML5 client app generation, then it's a safe guess that vnext will support Win8 Metro XAML – a much easier port from Silverlight XAML. Given the VS2012 LightSwitch integration as a thumbs up from the powers that be at MS, and given that Win8 C#/XAML Metro apps tend towards a streamlined 'golden straight-jacket', cookie-cutter app dev style with an Azure back end supporting Win8 push notifications, it's easy to extrapolate that LightSwitch vnext could well be the Win8 Metro XAML to Azure RAD tool of choice! The hook is already there :) Why else leave the space next to the HTML Client box? This high level of application development abstraction will facilitate rapid, cookie-cutter architecture/infrastructure frameworks for wrapping any app. This will allow me to avoid too much XAML code-monkeying around and focus on my area of interest: Technical Computing.

    Read the article

  • SQLAuthority News – Statistics Used by the Query Optimizer in Microsoft SQL Server 2008 – Microsoft Whitepaper

    - by pinaldave
I recently presented a session on Statistics and Best Practices at Virtual Tech Days on Nov 22, 2010. The session was very popular and I got many questions right after it. The number one question I received was where everybody can get further information. I am very happy that my session created some curiosity about one of the most important features of SQL Server. Statistics are the heart of SQL Server. Microsoft has published a white paper on how statistics are used by the Query Optimizer. Here is the abstract of that white paper from Microsoft. Statistics Used by the Query Optimizer in Microsoft SQL Server 2008 Writers: Eric N. Hanson and Yavor Angelov Microsoft SQL Server 2008 collects statistical information about indexes and column data stored in the database. These statistics are used by the SQL Server query optimizer to choose the most efficient plan for retrieving or updating data. This paper describes what data is collected, where it is stored, and which commands create, update, and delete statistics. By default, SQL Server 2008 also creates and updates statistics automatically, when such an operation is considered to be useful. This paper also outlines how these defaults can be changed on different levels (column, table, and database). In addition, it presents how certain query language features, such as Transact-SQL variables, interact with use of statistics by the optimizer, and it provides guidance for using these features when writing queries so you can obtain good query performance. Link to white paper: Statistics Used by the Query Optimizer in Microsoft SQL Server 2008 Reference: Pinal Dave (http://blog.SQLAuthority.com)   Filed under: Pinal Dave, SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, SQL White Papers, SQLAuthority News, T SQL, Technology

    Read the article

  • Launch of the Microsoft Online Services platform: test it and come discuss it with Microsoft

    After launching its Azure platform at the beginning of the year, Microsoft launched its new MOS platform, Microsoft Online Services, in early March – a platform of outsourced communication tools that nevertheless remain at the service of the enterprise. It is a service intended for professionals only, allowing them to entrust certain functions to Microsoft: collaborative messaging (Exchange), collaborative work (SharePoint), real-time communications (Office Communications, Live Meeting, Communicator) and office productivity (Office).

    Read the article

  • Microsoft MVP 2012 – ASP.NET/IIS

    - by hajan
    It’s Sunday. I wasn’t really sure whether I should expect something today or not, although it’s the 1st of July, when we all know that the new and re-awarded MVPs should get the ‘Congratulations’ email from Microsoft. And YES! I GOT IT! This is my second year, and first time re-awarded… Microsoft MVP 2012 The feeling is exactly the same as the first time… I am honored, privileged, veeeery happy and thankful to Microsoft for this prestigious award! The past year was really great with all the events, speaking engagements at various conferences and camps, many other community activities and my first visit to the MVP Global Summit. I am looking forward to boosting the Microsoft community activities even more in the next year... And… part of the email message: Dear Hajan Selmani, Congratulations! We are pleased to present you with the 2012 Microsoft® MVP Award! This award is given to exceptional technical community leaders who actively share their high quality, real world expertise with others. We appreciate your outstanding contributions in ASP.NET/IIS technical communities during the past year. I would like to say a big THANK YOU to all stakeholders. First of all, THANK YOU MICROSOFT for this prestigious award. Thanks to the CEE & Italy Region MVP Lead, Alessandro Teglia, who did a great job helping and supporting MVPs throughout the past year; I hope we will continue collaborating in the same way in the forthcoming year! Thanks to my family, friends, supporters, followers, those who read my blogs regularly and have brought me thousands of comments on my ASP.NET blog :), and those who collaborate and work with me on a daily basis and support me in all my community activities. Thank You Everyone! There are a lot of new, exciting, great and innovative technologies in the Microsoft technology stack. I am excited and really looking forward to rocking the community in the years to come! THANK YOU! Hajan

    Read the article

  • Azure Flavor for the Sharepoint Media Component

    - by spano
    Some time ago I wrote about a Media Processing Component for SharePoint that I was working on. It is a Media Assets list for SharePoint that lets you choose where to store the blob files. It also provides intelligence for encoding videos, generating thumbnail and poster images, obtaining media metadata, etc. In that first post the component was explained in detail, with the original 3 storage flavors: SharePoint list, Virtual Directory, or FTP. The storage manager is extensible, so a new flavor was...(read more)

    Read the article

  • Convert ADO.Net EF Connection String To Be SQL Azure Cloud Connection String Compatible!?

    - by Goober
    The Scenario I have written a Silverlight 3 Application that uses an SQL Server database. I'm moving the application onto the Cloud (Azure Platform). In order to do this I have had to setup my database on SQL Azure. I am using the ADO.Net Entity Framework to model my database. I have got the application running on the cloud, but I cannot get it to connect to the database. Below is the original localhost connection string, followed by the SQL Azure connection string that isn't working. The application itself runs fine, but fails when trying to retrieve data. The Original Localhost Connection String <add name="InmZenEntities" connectionString="metadata=res://*/InmZenModel.csdl|res://*/InmZenModel.ssdl|res://*/InmZenModel.msl; provider=System.Data.SqlClient; provider connection string=&quot; Data Source=localhost; Initial Catalog=InmarsatZenith; Integrated Security=True; MultipleActiveResultSets=True&quot;" providerName="System.Data.EntityClient" /> The Converted SQL Azure Connection String <add name="InmZenEntities" connectionString="metadata=res://*/InmZenModel.csdl|res://*/InmZenModel.ssdl|res://*/InmZenModel.msl; provider=System.Data.SqlClient; provider connection string=&quot; Server=tcp:MYSERVER.ctp.database.windows.net; Database=InmarsatZenith; UserID=MYUSERID;Password=MYPASSWORD; Trusted_Connection=False; MultipleActiveResultSets=True&quot;" providerName="System.Data.EntityClient" /> The Question Anyone know if this connection string for SQL Azure is correct? Help greatly appreciated.
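    For reference, a hedged sketch of the provider connection string shape that usually works against SQL Azure follows; it is not a definitive fix, the server/user/password values are placeholders, and the main differences from the attempt above are the "User ID=user@server" form and an explicit Encrypt=True. Note also that MultipleActiveResultSets was not supported by SQL Azure at the time, so that option may need to be removed rather than carried over.

    // Hedged sketch only: the inner "provider connection string" portion for SQL Azure.
    // MYSERVER / MYUSERID / MYPASSWORD are placeholders from the question, not real values.
    string providerConnectionString =
        "Server=tcp:MYSERVER.ctp.database.windows.net;" +
        "Database=InmarsatZenith;" +
        "User ID=MYUSERID@MYSERVER;" +   // SQL Azure typically expects user@server
        "Password=MYPASSWORD;" +
        "Trusted_Connection=False;" +
        "Encrypt=True;";                 // encryption is generally required for SQL Azure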

    Read the article

  • Windows Azure Mobile Services Updates Keep Coming

    - by Clint Edmonson
    Some exciting new Windows Azure Mobile Services features were delivered to production this week. The highlights include:
iPhone and iPad connectivity support via a new iOS SDK
Integrated authentication so developers can configure user authentication via Microsoft Account, Facebook, Twitter, and Google
New server-side Mobile Service script modules
Access to structured storage, Windows Azure Blob, Table, Queues, and Service Bus
Email services through a partnership with SendGrid
SMS & voice services through a partnership with Twilio
Mobile Services hosting expanded to the west coast US
The iOS SDK I’m excited to share that we've announced the release of an under-development iOS client SDK for Windows Azure Mobile Services. The iOS SDK joins the Windows 8 SDK launched with Windows Azure Mobile Services as well as client SDKs released by Xamarin for MonoTouch and MonoDroid. The native iOS SDK is for developers programming in Objective-C on the iPhone and iPad platforms. The SDK gives developers the same level of access to data storage using dynamic schematization that is available for Windows 8. Also, iOS applications can use the same authentication options available in Mobile Services. While full iOS support is still in development, the libraries are currently available on GitHub. There’s a great getting started tutorial to walk you through building a simple iOS “Todo List” app that stores data in Windows Azure. These additional tutorials explore how to use the iOS client libraries to store data and authenticate users: Get Started with data in Mobile Services for iOS Get Started with authentication in Mobile Services for iOS What’s New in Authentication Available to both iOS and Windows 8 developers, Mobile Services has expanded its authentication options. Developers can now use Microsoft, Facebook, Twitter, and Google authentication. As with Microsoft accounts, developers must register their app through Facebook, Twitter, or Google's developer portal in order to authenticate through them. These tutorials walk through how to register your Mobile Service with an identity provider: How to register your app with Microsoft Account How to register your app with Facebook How to register your app with Twitter How to register your app with Google And these tutorials walk through authenticating against Mobile Services: Get started with authentication in Mobile Services for Windows Store (C#) Get started with authentication in Mobile Services for Windows Store (JavaScript) Get started with authentication in Mobile Services for iOS What’s New in Mobile Service Scripts Some great new functionality is now available in the Mobile Service script layer. These server-side scripts are triggered off of any CRUD operation on a Mobile Service's table and can already handle data and query validation, filtering, web requests and more. Today, the Azure SDK module is now available to these scripts, giving them access to blob storage, service bus, and table storage. Check out the new tutorials on the Windows Azure Node.js developer center to learn more about working with Blob, Tables, Queues and Service Bus using the azure module. In addition, SendGrid and Twilio are now available via modules that can be called from the scripts as well. This gives developers the ability to send emails (SendGrid) or SMS text messages (Twilio) whenever a script is fired. Windows Azure customers receive a special offer of 25,000 free emails per month from SendGrid and 1000 free text messages from Twilio. 
Expanded Data Center Availability In addition to Mobile Services being available in our US East data center, they can now be spun up in US West. The above features are all now live in production and are available to use immediately.  If you don’t already have a Windows Azure account, you can sign-up for a free trial and start using Mobile Services today. The Windows Azure Mobile Developer Center has been updated with new tutorials that cover these new features in detail. And don’t forget - Windows Azure Mobile Services are still free for your first ten applications running on shared compute instances. Stay tuned to my twitter feed for Windows Azure announcements, updates, and links: @clinted

    Read the article

  • Setting default version on Azure Blob Storage?

    - by Erik
    What is the easiest way, without having to create your own utility, to set the default service version to the latest in Azure Blob Storage? http://msdn.microsoft.com/en-us/library/windowsazure/dd894041 There is basically nothing to be set in the Azure portal, and I am having a difficult time finding working utilities for Azure. For some reason Azure is defaulting to the oldest version, which does not send things like the HTTP Range header, for example. Any utility that can do this? Thank you.
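    One approach, sketched below without having been verified against this exact account, is to set DefaultServiceVersion through the blob service properties using the .NET storage client. This assumes a 2.x-era Microsoft.WindowsAzure.Storage library; the connection string values are placeholders and the version string is only an example – use whatever the latest supported service version is.

    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    // Hedged sketch: read the current blob service properties, set the default
    // service version, and write the properties back to the storage account.
    var account = CloudStorageAccount.Parse(
        "DefaultEndpointsProtocol=https;AccountName=ACCOUNT;AccountKey=KEY");
    CloudBlobClient blobClient = account.CreateCloudBlobClient();

    var properties = blobClient.GetServiceProperties();
    properties.DefaultServiceVersion = "2011-08-18";   // example version string
    blobClient.SetServiceProperties(properties);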

    Read the article

  • Should I move to an Azure database?

    - by Mike Flynn
    I am currently using a database from an installed instance on my web server in a VM on Azure. I was thinking about moving it to an actual Azure Database so I could load balance the web server. Does the Azure database work the same way as a straight install? Will it cost more or the same since I am just moving to a new server? Basically looking for advantages and disadvantages. I am familiar with SQL Server Management Studio.

    Read the article

  • Cannot access Azure Portal: We were unable to find any subscriptions associated with your account

    - by Marc
    I am trying to get access to my Windows Azure portal. Although I can access my user profile, which shows two active subscriptions, I cannot access the portal (https://manage.windowsazure.com/). I get redirected to https://manage.windowsazure.com/Error/NoSubscriptions, saying "We were unable to find any subscriptions associated with your account." Now, this is clearly an issue for Azure support, but unfortunately, since Azure thinks I don't have any subscriptions, I cannot get support either, because I get redirected to the same error page. I've spent two hours talking and chatting with support people who couldn't help. I've also submitted feedback on the Azure site but have no idea whether I will ever get a response. Maybe somebody has had this issue before, or can hint at which way to go?

    Read the article

  • How to access Windows Azure management portal from Linux/Ubuntu

    - by Zaar Hai
    Good day everyone! I have only Ubuntu/Debian machines in my office. I've registered for the Windows Azure platform and am trying to enter its management portal (https://windows.azure.com/default.aspx), but it requires Microsoft Silverlight. I've tried installing the latest Moonlight plugin available for Ubuntu 11.04, but without any success – after login I see a blank page and nothing more (tried both Firefox 7.0.1 and Chromium 14). Does anyone know how to enter the Microsoft Azure management portal from Ubuntu 11.04 or any other Linux? Thank you, Zaar

    Read the article

  • Azure VM : Connection refused by host

    - by Simon Kérouack
    I recently stopped a subscription with 14 VMs in it and restarted it a few days later. Now all my VMs are working just fine, with the exception of 6 used for MongoDB. They respond to ping, so they show as online in the Azure dashboard, but they do not answer anything else. I tried (from different locations, in and out of the Azure cloud):
ssh: connect to host * port *: Connection refused
telnet: Unable to connect to remote host: Connection refused
mongo: exception: connect failed
The ports for SSH and Mongo are opened in Azure. I tried restarting the VMs a few times through the Azure dashboard; they seem to restart successfully but still refuse all connections. I have already looked for similar issues and the best solution I found was to wait... the issue has been going on for 7 days and waiting is no longer an option.

    Read the article

  • azure website restart and take old dll version

    - by vipul dumaniya
    One of my sites is hosted on Windows Azure. When Azure restarts the site from the Manage Windows Azure panel, it picks up an old version of a DLL and the site is down until we restart it ourselves, by redeploying global.asax or changing web.config. After deploying global.asax or changing web.config the site restarts and then works perfectly with the latest DLL. So if there were an issue in my code, the site should not work after that restart either – which is why I think the issue is not in my code. The error is something like "Could not load type 'DSF.DATA.Repository.RecurringOrderLogResposity' from 'DSF.DATA Version 1.0.0'". I deploy the changed DLL using FTP, restart the site, and it takes effect successfully. I have already resolved this error and uploaded the latest DLL, but when the site is restarted from the Azure panel the error comes back and the site is down until I restart it by deploying the global.asax file. Please help – I am in big trouble, as this is a live site with a lot of traffic. Thanks, Vipul

    Read the article

  • Determining Azure SQL Database requirements

    - by Gerald
    I'm looking into moving an SQL Server database project to the cloud using Azure SQL Database. I'm just wondering what metrics I can use from SQL Server to help determine what my needs will be on Azure. The size of the database is around 150GB, so I understand what my needs are in terms of storage; I'm just not sure what metrics I can use to translate my database usage to the DTU benchmark metrics that the various service tiers on Azure SQL use.

    Read the article

< Previous Page | 9 10 11 12 13 14 15 16 17 18 19 20  | Next Page >