Search Results

Search found 4410 results on 177 pages for 'azure worker roles'.


  • How to fix “The requested service, ‘net.pipe://localhost/SecurityTokenServiceApplication/appsts.svc’ could not be activated.”

    - by ybbest
    Problem: When I tried to publish a SharePoint 2013 workflow, I received the error: The requested service, ‘net.pipe://localhost/SecurityTokenServiceApplication/appsts.svc’ could not be activated. After that, my workflow stopped working, and every time I started a workflow I received the following error message: System.ApplicationException: PreconditionFailed ---> System.ApplicationException: Error in the application. --- End of inner exception stack trace --- at System.Activities.Statements.Throw.Execute(CodeActivityContext context) at System.Activities.CodeActivity.InternalExecute(ActivityInstance instance, ActivityExecutor executor, BookmarkManager bookmarkManager) at System.Activities.Runtime.ActivityExecutor.ExecuteActivityWorkItem.ExecuteBody(ActivityExecutor executor, BookmarkManager bookmarkManager, Location resultLocation) Analysis: I found the underlying error by browsing to http://localhost:32843/SecurityTokenServiceApplication/securitytoken.svc (the error page is shown in a screenshot in the original post). Solution: The fix is essentially to give the server more memory. In a development environment, you can restart noderunner.exe or some other services to release memory. To verify that you have enough memory, browse to http://localhost:32843/SecurityTokenServiceApplication/securitytoken.svc again; it should return the normal service information page. Then republish your workflow and it will work like a charm.

    Read the article

  • How to Fix “Error occurred in deployment step ‘Activate Features’: System.TimeoutException:”

    - by ybbest
    Problem: When deploying a SharePoint 2013 workflow using Visual Studio, I got the following error: Error occurred in deployment step ‘Activate Features’: System.TimeoutException: The HTTP request has timed out after 20000 milliseconds. —> System.Net.WebException: The request was aborted: The request was canceled. at System.Net.HttpWebRequest.EndGetResponse(IAsyncResult asyncResult) at Microsoft.Workflow.Client.HttpGetResponseAsyncResult`1.OnGotResponse(IAsyncResult result) — End of inner exception stack trace — at Microsoft.Workflow.Common.AsyncResult.End[TAsyncResult](IAsyncResult result) at Microsoft.Workflow.Client.Ht Analysis: After reading AC’s blog post, I found that the issue had to do with the Service Bus: the following services were not started. Solution: I started the Service Bus Gateway and Service Bus Message Broker services and the problem went away. References: SharePoint 2013 Workflow – Advanced Workflow Debugging with Fiddler

    Read the article

  • EFMVC Migrated to .NET 4.5, Visual Studio 2012, ASP.NET MVC 4 and EF 5 Code First

    - by shiju
    I have just migrated my EFMVC app from .NET 4.0 and ASP.NET MVC 4 RC to .NET 4.5, ASP.NET MVC 4 RTM and Entity Framework 5 Code First. In this release, the EFMVC solution is built with Visual Studio 2012 RTM. The migration process was very smooth, and I did not make any major changes other than adding simple unit tests with NUnit and Moq. I will add more unit tests later and will also modify the existing solution. Source Code You can download the source code from http://efmvc.codeplex.com/
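    For readers curious what "simple unit tests with NUnit and Moq" look like in an ASP.NET MVC project, here is a minimal, hypothetical sketch. The Category, ICategoryRepository and CategoryController types below are stand-ins, not the actual EFMVC classes; see the Codeplex source for the real tests.

    using System.Collections.Generic;
    using System.Linq;
    using System.Web.Mvc;
    using Moq;
    using NUnit.Framework;

    // Stand-in domain types; the real EFMVC classes differ in detail.
    public class Category { public int Id { get; set; } public string Name { get; set; } }

    public interface ICategoryRepository { IEnumerable<Category> GetAll(); }

    public class CategoryController : Controller
    {
        private readonly ICategoryRepository _repository;
        public CategoryController(ICategoryRepository repository) { _repository = repository; }

        public ActionResult Index()
        {
            return View(_repository.GetAll().ToList());
        }
    }

    [TestFixture]
    public class CategoryControllerTests
    {
        [Test]
        public void Index_Returns_View_With_All_Categories()
        {
            // Arrange: mock the repository so no database is touched.
            var repository = new Mock<ICategoryRepository>();
            repository.Setup(r => r.GetAll())
                      .Returns(new[] { new Category { Id = 1, Name = "Books" } });

            var controller = new CategoryController(repository.Object);

            // Act
            var result = (ViewResult)controller.Index();
            var model = (List<Category>)result.Model;

            // Assert: the view model contains the mocked data and the repository was called once.
            Assert.AreEqual(1, model.Count);
            repository.Verify(r => r.GetAll(), Times.Once());
        }
    }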

    Read the article

  • SO-Aware at the Atlanta Connected Systems User Group

    - by gsusx
    Today my colleague Don Demsak will be presenting a session about WCF management, testing and governance using SO-Aware and the SO-Aware Test Workbench at the Connected Systems User Group in Atlanta. Don is a very engaging speaker and has prepared some very cool demos based on lessons from real-world WCF solutions. If you are in the ATL area and interested in WCF, AppFabric or BizTalk, you should definitely swing by Don’s session. Don’t forget to heckle him a bit (you can blame me for it ;) )...(read more)

    Read the article

  • Using a service registry that doesn’t suck part I: UDDI is dead

    - by gsusx
    This is the first of a series of posts in which I am hoping to detail some of the most common SOA governance scenarios in the real world, their challenges, and the approach we’ve taken to address them in SO-Aware. This series is not intended to be a marketing pitch for SO-Aware. Instead, I would like to use it to foster an honest dialog between SOA governance technologists. For the opening post I decided to focus on the aspect that was once considered the keystone of SOA governance: service discovery...(read more)

    Read the article

  • ASP.NET and WIF: Showing custom profile username as User.Identity.Name

    - by DigiMortal
    I am building an ASP.NET MVC application that uses external services to authenticate users. For ASP.NET, users are fully authenticated as soon as they are redirected back from the external service. In my system they are logically authenticated only once they have created a user profile. In this posting I will show you how to make User.Identity.Name return the username from a custom user profile. Using external authentication sources with AppFabric Suppose you want to be user-friendly and you don’t force users to keep in mind yet another username/password when they visit your site. You can accept logins from popular sites like Windows Live, Facebook, Yahoo, Google and many more. If a user has an account with one of these services, then he or she can use that account to log in to your site. If you have a community site then you usually have support for user profiles too. Some of these providers give you some information about users and others don’t. So the only thing you get from all of these providers is a unique ID that identifies the user in that service. The image above shows how a new user joins your site. Existing users who already have a profile are directed to the user homepage after they are authenticated. You can read more about how to solve the semi-authorized users problem in my blog posting ASP.NET MVC: Using ProfileRequiredAttribute to restrict access to pages. The other problem is related to usernames, which we don’t get from all identity providers. Why is IIdentity.Name sometimes empty? The problem is described in more detail in my blog posting Identifying AppFabric Access Control Service users uniquely. In short, not all providers supply the claim called http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name. The following diagram illustrates what happens when the user gets a token from AppFabric ACS and is redirected to your site. So, when the user was authenticated using Windows Live ID, there is no name claim in the token and that’s why User.Identity.Name is empty. Okay, we could force the nameidentifier claim to be used as the name (we can do that in the web.config file), but we have user profiles and we want the username from the profile to be shown whenever a username is asked for. Modifying the name claim Now let’s force IClaimsIdentity to use the username from our user profiles. You can read more about my profiles topic in my blog posting ASP.NET MVC: Using ProfileRequiredAttribute to restrict access to pages, and you can find some useful extension methods for the claims identity in my blog posting Identifying AppFabric Access Control Service users uniquely. Here is what we do to set User.Identity.Name: we check if the user has a profile; if the user has a profile, we check whether User.Identity.Name matches the name given by the profile; if the names do not match, then the identity provider probably returned some other name for the user, so we remove the name claim and recreate it with the correct username; finally we add the new name claim to the claims collection. All this happens in the Application_AuthorizeRequest event of our web application. The code is here.
    protected void Application_AuthorizeRequest()
    {
        if (string.IsNullOrEmpty(User.Identity.Name))
        {
            var identity = User.Identity;
            var profile = identity.GetProfile();
            if (profile != null)
            {
                if (profile.UserName != identity.Name)
                {
                    // The identity provider returned no name (or a different one),
                    // so replace the name claim with the profile username.
                    identity.RemoveName();

                    var claim = new Claim("http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name", profile.UserName);
                    var claimsIdentity = (IClaimsIdentity)identity;
                    claimsIdentity.Claims.Add(claim);
                }
            }
        }
    }

    The RemoveName extension method is simple – it looks for name claims in the IClaimsIdentity claims collection and removes them.

    public static void RemoveName(this IIdentity identity)
    {
        if (identity == null)
            return;

        var claimsIdentity = identity as IClaimsIdentity;
        if (claimsIdentity == null)
            return;

        // Walk backwards so RemoveAt does not shift the remaining indexes.
        for (var i = claimsIdentity.Claims.Count - 1; i >= 0; i--)
        {
            var claim = claimsIdentity.Claims[i];
            if (claim.ClaimType == "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name")
                claimsIdentity.Claims.RemoveAt(i);
        }
    }

    And we are done. Now User.Identity.Name returns the username from the user profile, and you can use it to show the current user's username everywhere on your site. Conclusion Mixing AppFabric Access Control Service and Windows Identity Foundation with custom authorization logic is not impossible, but it is a little bit tricky. This posting finishes my little series about AppFabric ACS and WIF for now, and hopefully you found some useful tricks, tips, hacks and code pieces you can use in your own applications.

    Read the article

  • Access Control Service: Passive/Active Transition Sample

    - by Your DisplayName here!
    Here you can find my updated ACS2 sample. In addition to the existing front ends (web [WS-Federation], console [SOAP & REST], Silverlight [REST]) and error handling, it now also includes a WPF client that shows the passive/active transition with a SOAP service, as illustrated here. All of the ACS interaction is encapsulated in a WPF user control that retrieves the JSON feed, displays a list of supported identity providers, triggers the sign-in via a browser control, retrieves the token response, and packages the token as a GenericXmlSecurityToken (to be used directly with the WIF ChannelFactory extension methods). All you need to supply is the ACS namespace and the realm. Have fun!

    Read the article

  • Migration & Modernization: Windows/VB6 Apps to ASP.NET HTML5

    - by Visual WebGui
    I would like to invite you to a webinar we are doing in collaboration with Jeffrey S. Hammond, Principal Analyst serving Application Development & Delivery Professionals at Forrester Research. The webinar is free and it will introduce the substantial changes brought on by the move to Web Applications and Open Web architectures, and the challenges this places on application development shops. We’ll also introduce how we at Gizmox are helping clients navigate this mobile shift and evolve existing...(read more)

    Read the article

  • Access Control Service: Protocol and Token Transition

    - by Your DisplayName here!
    ACS v2 supports a number of protocols (WS-Federation, WS-Trust, OpenId, OAuth 2 / WRAP) and a number of token types (SWT, SAML 1.1/2.0) – see Vittorio’s infographic here. Some protocols are designed for active clients (WS-Trust, OAuth / WRAP) and some are designed for passive clients (WS-Federation, OpenID). One of the most obvious advantages of ACS is that it allows you to transition between the various protocols and token types. One example would be using WS-Federation/SAML between your application and ACS to sign in with a Google account. Google uses OpenId and non-SAML tokens, but ACS transitions that into WS-Federation and sends back a SAML token. This way your application only needs to understand a single protocol, whereas ACS acts as a protocol bridge (see my ACS2 sample here). Another example would be the transformation of a SAML token to a SWT. This is achieved by using the WRAP endpoint – you send a SAML token (from a registered identity provider) to ACS, and ACS turns it into a SWT token for the requested relying party, e.g. (using the WrapClient from Thinktecture.IdentityModel):

    [TestMethod]
    public void GetClaimsSamlToSwt()
    {
        // get SAML token from IdP
        var samlToken = Helper.GetSamlIdentityTokenForAcs();

        // send to ACS for SWT conversion
        var swtToken = Helper.GetSimpleWebToken(samlToken);

        var client = new HttpClient(Constants.BaseUri);
        client.SetAccessToken(swtToken, WebClientTokenSchemes.OAuth);

        // call REST service with SWT
        var response = client.Get("wcf/client");

        Assert.AreEqual<HttpStatusCode>(HttpStatusCode.OK, response.StatusCode);
    }

    There are more protocol transitions possible – but they are not so obvious. A popular example would be how to call a REST/SOAP service using e.g. a LiveId login. In the next post I will show you how to approach that scenario.

    Read the article

  • How to use the call web service action in a SharePoint 2013 workflow

    - by ybbest
    In SharePoint 2013 workflows, you can use the call web service action together with loops. In this post, I will show you how to achieve this. 1. Create a list workflow called CallWebService. 2. Create a variable called listurl and assign it the value http://sp2010/_vti_bin/listdata.svc. 3. Create a dictionary variable called RequestHeaders and add the following key/value pairs. 4. Call the web service with the HttpHeaders you just built in the previous step and store the response in the variable ResponseContent. 5. The ResponseContent variable is of the dynamic value type (in SharePoint Designer it is called the dictionary type), which is a new feature of SharePoint 2013 workflows. We can use the following actions to count the number of items in the variable. 6. You can use a loop in the SharePoint 2013 workflow to output each list title as shown below.
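    For comparison, here is a rough, hypothetical C# equivalent of what the workflow's call web service step does against listdata.svc. The Accept header value and the plain GET are assumptions for illustration; the post itself configures the URL and headers in SharePoint Designer via the listurl variable and the RequestHeaders dictionary.

    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Threading.Tasks;

    class ListDataSvcCall
    {
        static async Task Main()
        {
            using (var client = new HttpClient())
            {
                // Ask the OData endpoint for JSON instead of the default ATOM feed
                // (this mirrors what the RequestHeaders dictionary is typically used for).
                client.DefaultRequestHeaders.Accept.Add(
                    new MediaTypeWithQualityHeaderValue("application/json"));

                // Same endpoint the workflow stores in its listurl variable.
                var responseContent = await client.GetStringAsync(
                    "http://sp2010/_vti_bin/listdata.svc");

                // The workflow keeps the parsed result in the ResponseContent
                // dictionary variable; here we just dump the raw JSON.
                Console.WriteLine(responseContent);
            }
        }
    }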

    Read the article

  • ASP.NET MVC: Using ProfileRequiredAttribute to restrict access to pages

    - by DigiMortal
    If you are using AppFabric Access Control Services to authenticate users when they log in to your community site using Live ID, Google or some other popular identity provider, you need more than AuthorizeAttribute to make sure that users can access the content that is there for authenticated users only. In this posting I will show you how to extend AuthorizeAttribute so that users must also have their user profile filled in. Semi-authorized users When a user is authenticated through an external identity provider, not all identity providers give us a user name or the other information we ask users for when they join our site. What all identity providers have in common is a unique ID that helps you identify the user. Example: users authenticated through Windows Live ID by AppFabric ACS have no name specified. Google’s identity provider is able to provide you with the user name and e-mail address if the user agrees to publish this information to you. Both give you a unique ID for the user when the user is successfully authenticated in their service. There is a logical shift between ASP.NET and my site when considering a user as authorized. For ASP.NET MVC, a user is authorized when the user has an identity. For my site, a user is authorized when the user has a profile and a row in my users table. Having a profile means that the user has a unique username in my system and is always identified by this username by other users. My solution is simple: I created my own action filter attribute that makes sure the user has a profile before accessing a given method, and if the user has no profile then the browser is redirected to the join page. Illustrating the problem Usually we restrict access to a page using AuthorizeAttribute. The code is something like this.

    [Authorize]
    public ActionResult Details(string id)
    {
        var profile = _userRepository.GetUserByUserName(id);
        return View(profile);
    }

    If this page is only for site users and we have user profiles, then all users – the ones that have a profile and all the others that are merely authenticated – can access the information. That is okay, because all of these users have successfully logged in to some service that is supported by AppFabric ACS. On my site, the users with no profile are in a grey spot. They are only halfway to being users, because they have no username and no profile on my site yet. So, looking at the image above again, we need something that adds a profile-existence condition to the user-only content.

    [ProfileRequired]
    public ActionResult Details(string id)
    {
        var profile = _userRepository.GetUserByUserName(id);
        return View(profile);
    }

    Now, this attribute will solve our problem as soon as we implement it. ProfileRequiredAttribute: Profiles are required to be fully authorized Here is my implementation of ProfileRequiredAttribute. It is pretty new and right now it is more like a working draft, but you can already play with it.
    public class ProfileRequiredAttribute : AuthorizeAttribute
    {
        private readonly string _redirectUrl;

        public ProfileRequiredAttribute()
        {
            _redirectUrl = ConfigurationManager.AppSettings["JoinUrl"];

            if (string.IsNullOrWhiteSpace(_redirectUrl))
                _redirectUrl = "~/";
        }

        public override void OnAuthorization(AuthorizationContext filterContext)
        {
            base.OnAuthorization(filterContext);

            var httpContext = filterContext.HttpContext;
            var identity = httpContext.User.Identity;

            // Authenticated but without a profile: send the user to the join page.
            if (!identity.IsAuthenticated || identity.GetProfile() == null)
                if (filterContext.Result == null)
                    httpContext.Response.Redirect(_redirectUrl);
        }
    }

    All methods with this attribute work as follows: if the user is not authenticated, then he or she is redirected to the AppFabric ACS identity provider selection page; if the user is authenticated but has no profile, then the user is by default redirected to the main page of the site, but if you have an application setting named JoinUrl then the user is redirected to that URL instead. The first case is handled by AuthorizeAttribute and the second one by the custom logic in the ProfileRequiredAttribute class. GetProfile() extension method To get the user profile with less code in the places where profiles are needed, I wrote a GetProfile() extension method for the IIdentity interface. There are some more extension methods that read the user and identity provider identifiers out of the claims, and based on this information the user profile is read from the database. If you just copy and paste this code I am sure it won't work for you as-is, but you get the idea.

    public static User GetProfile(this IIdentity identity)
    {
        if (identity == null)
            return null;

        var context = HttpContext.Current;
        if (context.Items["UserProfile"] != null)
            return context.Items["UserProfile"] as User;

        var provider = identity.GetIdentityProvider();
        var nameId = identity.GetNameIdentifier();

        var rep = ObjectFactory.GetInstance<IUserRepository>();
        var profile = rep.GetUserByProviderAndNameId(provider, nameId);

        // Cache the profile for the duration of the current request.
        context.Items["UserProfile"] = profile;

        return profile;
    }

    To avoid round trips to the database, I cache the user profile for the current request, because the chance that the profile changes in the meantime is minimal. The other reason is maybe more tricky – the profile objects come from an Entity Framework context, and that context also has the HTTP request as its lifecycle. Conclusion This posting gave you some ideas on how to finish the user profile story when you use AppFabric ACS as an external authentication provider. Although there was a small shift between us and ASP.NET MVC in the interpretation of “authorized”, we were easily able to solve the problem by extending AuthorizeAttribute to get all our requirements fulfilled. We also wrote an extension method for IIdentity that returns the user profile and caches it in the HTTP request scope.

    Read the article

  • DevIntersection Conference Dec 9th-12th

    - by ScottGu
    I’m excited to be presenting a keynote at the DevIntersection conference this coming Dec 9th–12th in Las Vegas. This conference has an awesome set of speakers from a variety of backgrounds. A number of people from my team (including Scott Hanselman, Scott Hunter and Daniel Roth from the ASP.NET team) will be presenting in addition to me. You can learn more about the conference and check out the schedule here. Attendees who register by November 20th will receive a free Windows 8 tablet – so if you are interested in attending, sign up soon! Hope to see some of you there, Scott

    Read the article

  • Access Control Service: Home Realm Discovery (HRD) Gotcha

    - by Your DisplayName here!
    I really like ACS2. One feature that is very useful is home realm discovery. ACS provides a Nascar-style list as well as discovery based on email addresses. You can take control of the home realm selection process yourself by downloading the JSON feed or by manually setting the home realm parameter. Plenty of options – the only option missing is turning it off… In other words, when you set up your ACS namespace and realm and register identity providers, there is no way to keep the list of identity providers secret. An interested “user” can always retrieve all registered identity providers (using the browser or by downloading the JSON feed). This may not be an issue with web identity providers, but when you use ACS to federate with customers or business partners, you may not want to disclose that list to the public (or to other customers). This is an adoption blocker for certain situations. I hope this feature will be added soon. In addition, I would also like to see a feature I call “home realm aliases” – some random string that I could use as the whr parameter instead of the real issuer URI.

    Read the article

  • Access Control Service: Transitioning between Active and Passive Scenarios

    - by Your DisplayName here!
    As I mentioned in my last post, ACS features a number of ways to transition between protocol and token types. One not so widely known transition is between passive sign-ins (browser) and active service consumers. Let’s see how this works. We all know the usual WS-Federation handshake via passive redirect. But ACS also allows you to drive the sign-in process yourself via specially crafted WS-Federation query strings. So you can use the following URL to sign in using LiveID via ACS. ACS will then redirect back to the registered reply URL in your application:

    GET /login.srf?
      wa=wsignin1.0&
      wtrealm=https%3a%2f%2faccesscontrol.windows.net%2f&
      wreply=https%3a%2f%2fleastprivilege.accesscontrol.windows.net%3a443%2fv2%2fwsfederation&
      wp=MBI_FED_SSL&
      wctx=pr%3dwsfederation%26rm%3dhttps%253a%252f%252froadie%252facs2rp%252frest%252f

    The wsfederation bit in the wctx parameter indicates that the response to the token request will be transmitted back to the relying party via a POST. So far so good – but how can an active client receive that token? ACS knows an alternative way to send the token request response. Instead of redirecting back to the RP, it emits a page that in turn echoes the token response using JavaScript’s window.external.notify. The URL would look like this:

    GET /login.srf?
      wa=wsignin1.0&
      wtrealm=https%3a%2f%2faccesscontrol.windows.net%2f&
      wreply=https%3a%2f%2fleastprivilege.accesscontrol.windows.net%3a443%2fv2%2fwsfederation&
      wp=MBI_FED_SSL&
      wctx=pr%3djavascriptnotify%26rm%3dhttps%253a%252f%252froadie%252facs2rp%252frest%252f

    ACS would then render a page that contains the following script block:

    <script type="text/javascript">
        try {
            window.external.Notify('token_response');
        }
        catch (err) {
            alert("Error ACS50021: windows.external.Notify is not registered.");
        }
    </script>

    where token_response is a JSON encoded string with the following format:

    {
      "appliesTo":"...",
      "context":null,
      "created":123,
      "expires":123,
      "securityToken":"...",
      "tokenType":"..."
    }

    OK – so how does this all come together? As an active client application (Silverlight, WPF, WP7, WinForms etc.), you would host a browser control and use the above URL to trigger the right series of redirects. All of these browser controls support some way to register a callback that fires whenever the window.external.notify function is called. This way you get the JSON string from ACS back into the hosting application – and voilà, you have the security token. If you selected the SWT token format in ACS, you can use that token e.g. for REST services. If you selected SAML, you can use the token e.g. for SOAP services. In the next post I will show how to retrieve these URLs from ACS, along with a practical example using WPF.
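    For illustration only, here is a minimal C# sketch of how the hosting application might deserialize that JSON string once the notify callback fires. The property names mirror the JSON fields shown above; the class and method names are made up, and a real client would go on to wrap securityToken for use with REST or SOAP calls.

    using System.IO;
    using System.Runtime.Serialization;
    using System.Runtime.Serialization.Json;
    using System.Text;

    // Shape of the JSON that ACS hands back through window.external.Notify.
    [DataContract]
    public class AcsTokenResponse
    {
        [DataMember(Name = "appliesTo")] public string AppliesTo { get; set; }
        [DataMember(Name = "context")] public string Context { get; set; }
        [DataMember(Name = "created")] public long Created { get; set; }
        [DataMember(Name = "expires")] public long Expires { get; set; }
        [DataMember(Name = "securityToken")] public string SecurityToken { get; set; }
        [DataMember(Name = "tokenType")] public string TokenType { get; set; }
    }

    public static class AcsTokenResponseParser
    {
        // Parse the string received in the browser control's notify callback.
        public static AcsTokenResponse Parse(string json)
        {
            var serializer = new DataContractJsonSerializer(typeof(AcsTokenResponse));
            using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(json)))
            {
                return (AcsTokenResponse)serializer.ReadObject(stream);
            }
        }
    }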

    Read the article

  • How stable are Single Page Applications (SPA) built with Microsoft .NET for enterprise applications [on hold]

    - by Husrat Mehmood
    Imagine a situation where your data is loaded into your application via a REST API and you are building a responsive (AJAX-driven) application for an enterprise. What potential problems might I run into with a single page application (SPA) built as a Microsoft ASP.NET web application using the MVC template? Are there advantages to just designing a multi-page application using ASP.NET MVC 5 instead? Remember that I am using the SPA for an enterprise application where there are role-based views for the users.

    Read the article

  • Cloud – the forecast is improving

    - by Rob Farley
    There is a lot of discussion about “the cloud”, and how that affects people’s data stories. Today the discussion enters the realm of T-SQL Tuesday, hosted this month by Jorge Segarra. Over the years, companies have invested a lot in making sure that their data is good, and I mean every aspect of it – the quality of it, the security of it, the performance of it, and more. Experts such as those of us at LobsterPot Solutions have helped these companies with this, and continue to work with clients to make sure that data is a strong part of their business, not an oversight. Whether business intelligence systems are being utilised or not, every business needs to be able to rely on its data, and have the confidence in it. Data should be a foundation upon which a business is built. In the past, data had been stored in paper-based systems. Filing cabinets stored vital information. Today, people have server rooms with storage of various kinds, recognising that filing cabinets don’t necessarily scale particularly well. It’s easy to ‘lose’ data in a filing cabinet, when you have people who need to make sure that the sheets of paper are in the right spot, and that you know how things are stored. Databases help solve that problem, but still the idea of a large filing cabinet continues, it just doesn’t involve paper. If something happens to the physical ‘filing cabinet’, then the problems are larger still. Then the data itself is under threat. Many clients have generators in case the power goes out, redundant cables in case the connectivity dies, and spare servers in other buildings just in case they’re required. But still they’re maintaining filing cabinets. You see, people like filing cabinets. There’s something to be said for having your data ‘close’. Even if the data is not in readable form, living as bits on a disk somewhere, the idea that its home is ‘in the building’ is comforting to many people. They simply don’t want to move their data anywhere else. The cloud offers an alternative to this, and the human element is an obstacle. By leveraging the cloud, companies can have someone else look after their filing cabinet. A lot of people really don’t like the idea of this, partly because the administrators of the data, those people who could potentially log in with escalated rights and see more than they should be allowed to, who need to be trusted to respond if there’s a problem, are now a faceless entity in the cloud. But this doesn’t mean that the cloud is bad – this is simply a concern that some people may have. In new functionality that’s on its way, we see other hybrid mechanisms that mean that people can leverage parts of the cloud with less fear. Companies can use cloud storage to hold their backup data, for example, backups that have been encrypted and are therefore not able to be read by anyone (including administrators) who don’t have the right password. Companies can have a database instance that runs locally, but which has its data files in the cloud, complete with Transparent Data Encryption if needed. There can be a higher level of control, making the change easier to accept. Hybrid options allow people who have had fears (potentially very justifiable) to take a new look at the cloud, and to start embracing some of the benefits of the cloud (such as letting someone else take care of storage, high availability, and more) without losing the feeling of the data being close. @rob_farley

    Read the article

  • Dev'Camps: Microsoft breaks down Windows Azure on June 20th in a free session on its Cloud platform for developers

    Dev'Camps: Microsoft breaks down Windows Azure on June 20th in a free session on its Cloud platform for developers. Microsoft is organizing a series of events across France on its current platforms and technologies. These Dev'Camps are a new event format, 100% focused on development, run live with Microsoft experts to help you with your application projects. One of the most anticipated events in this series is the Azure Camp, which will take place on Wednesday, June 20th, at Microsoft's headquarters in Issy-les-Moulineaux. The Azure Camps will let attendees quickly deepen their knowledge of Microsoft's new flagship tool ...

    Read the article

  • Access Control Service: Programmatically Accessing Identity Provider Information and Redirect URLs

    - by Your DisplayName here!
    In my last post I showed you that different redirect URLs trigger different response behaviors in ACS. Where did I actually get these URLs from? The answer is simple – I asked ACS ;) ACS publishes a JSON encoded feed that contains information about all registered identity providers, their display names, logos and URLs. With that information you can easily write a discovery client which, at its very heart, does this:

    public void GetAsync(string protocol)
    {
        var url = string.Format(
            "https://{0}.{1}/v2/metadata/IdentityProviders.js?protocol={2}&realm={3}&version=1.0",
            AcsNamespace,
            "accesscontrol.windows.net",
            protocol,
            Realm);

        _client.DownloadStringAsync(new Uri(url));
    }

    The protocol can be one of these two values: wsfederation or javascriptnotify. Based on that value, the returned JSON will contain the URLs for either the redirect or the notify method. Now, with the help of some JSON serializer, you can turn that information into CLR objects and display them in some sort of selection dialog. The next post will have a demo and source code.
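    As a hedged illustration of the "some JSON serializer" step, the sketch below deserializes the downloaded feed with DataContractJsonSerializer. The property names (Name, LoginUrl, ImageUrl, EmailAddressSuffixes) are my assumption of the feed's shape, so verify them against the actual IdentityProviders.js output for your namespace.

    using System.Collections.Generic;
    using System.IO;
    using System.Runtime.Serialization;
    using System.Runtime.Serialization.Json;
    using System.Text;

    // Assumed shape of one entry in the IdentityProviders.js feed.
    [DataContract]
    public class IdentityProviderEntry
    {
        [DataMember(Name = "Name")] public string Name { get; set; }
        [DataMember(Name = "LoginUrl")] public string LoginUrl { get; set; }
        [DataMember(Name = "ImageUrl")] public string ImageUrl { get; set; }
        [DataMember(Name = "EmailAddressSuffixes")] public List<string> EmailAddressSuffixes { get; set; }
    }

    public static class IdentityProviderFeedParser
    {
        // Turn the downloaded JSON string into CLR objects for the selection dialog.
        public static List<IdentityProviderEntry> Parse(string json)
        {
            var serializer = new DataContractJsonSerializer(typeof(List<IdentityProviderEntry>));
            using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(json)))
            {
                return (List<IdentityProviderEntry>)serializer.ReadObject(stream);
            }
        }
    }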

    Read the article

  • Is there any way to optimize my blob search program?

    - by Vicky
    I wrote this code to search blob items (text files) on the basis of their content. For example, if I search for "Good", then the names of the files that contain the word "Good" or "good" should appear in the search results. My code is working, but I want to optimize it.

    class BlobSearch
    {
        public static int num = 1;

        static void Main(string[] args)
        {
            string accountName = "accountName";
            string accessKey = "accesskey";
            string azureConString = "DefaultEndpointsProtocol=https;AccountName=" + accountName + ";AccountKey=" + accessKey;
            string blob = "MyBlobContainer";
            string searchText = string.Empty;

            Console.WriteLine("Type and enter to search : ");
            searchText = Console.ReadLine();

            CloudStorageAccount account = CloudStorageAccount.Parse(azureConString);
            CloudBlobClient blobClient = account.CreateCloudBlobClient();
            CloudBlobContainer blobContainer = blobClient.GetContainerReference(blob);
            blobContainer.FetchAttributes();

            var blobItemList = blobContainer.ListBlobs();
            GetBlobList(searchText, blobContainer, blobItemList);

            Console.ReadLine();
        }

        private static async void GetBlobList(string searchText, CloudBlobContainer blobContainer, IEnumerable<IListBlobItem> blobItemList)
        {
            foreach (var item in blobItemList)
            {
                CloudBlockBlob blockBlob = blobContainer.GetBlockBlobReference(item.Uri.ToString());
                if (blockBlob.Name.Contains(".txt"))
                {
                    await Search(searchText, blockBlob);
                }
            }
        }

        private async static Task Search(string searchText, CloudBlockBlob blockBlob)
        {
            string text = await blockBlob.DownloadTextAsync();
            if (text.ToLower().IndexOf(searchText.ToLower()) != -1)
            {
                Console.WriteLine("Result : " + num + " => " + blockBlob.Name.Substring(blockBlob.Name.LastIndexOf('/') + 1));
                num++;
            }
        }
    }

    I think blobContainer.ListBlobs(); is blocking code, because the search will not work until all the blob items are loaded. Is there any way to optimize it, or anywhere else in my code? Thanks
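    One answer-style sketch, assuming the same usings and BlobSearch class as the code above (plus System.Linq): instead of awaiting each blob search sequentially, start all the per-blob searches and await them together with Task.WhenAll. It reuses only the SDK members already present in the question (ListBlobs, GetBlockBlobReference, DownloadTextAsync); whether it helps in practice depends on bandwidth and blob sizes.

    // Search all .txt blobs concurrently instead of one at a time.
    private static async Task<List<string>> SearchAllAsync(string searchText, CloudBlobContainer blobContainer)
    {
        var tasks = new List<Task<string>>();

        foreach (var item in blobContainer.ListBlobs())
        {
            var blockBlob = blobContainer.GetBlockBlobReference(item.Uri.ToString());
            if (blockBlob.Name.EndsWith(".txt", StringComparison.OrdinalIgnoreCase))
            {
                // Start the download now; all downloads are awaited together below.
                tasks.Add(SearchOneAsync(searchText, blockBlob));
            }
        }

        var names = await Task.WhenAll(tasks);

        // Keep only the blobs that actually contained the search text.
        return names.Where(n => n != null).ToList();
    }

    private static async Task<string> SearchOneAsync(string searchText, CloudBlockBlob blockBlob)
    {
        var text = await blockBlob.DownloadTextAsync();
        return text.IndexOf(searchText, StringComparison.OrdinalIgnoreCase) >= 0
            ? blockBlob.Name.Substring(blockBlob.Name.LastIndexOf('/') + 1)
            : null;
    }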

    Read the article

  • Microsoft releases version 1.4 of the Windows Azure SDK, fixing numerous bugs from the previous version

    Microsoft releases version 1.4 of the Windows Azure SDK, which fixes numerous bugs from the previous version. The Windows Azure development team has just made the new version of its SDK available. Windows Azure SDK 1.4 is now downloadable and brings a few updates. In particular, it fixes several issues found in the previous version (1.3), including: - a fix for the IIS failure when the web.config file is configured as read-only - a fix for errors in IIS packages that caused them to double in size when packaged - a fix for the recycling of an entire IIS web role when the diagnostic sto...

    Read the article

  • New Windows Azure: persistent Linux virtual machines, IaaS, and even more open-source technologies supported

    New Windows Azure: persistent Linux virtual machines, IaaS, and even more supported open-source technologies. Windows Azure, Microsoft's Cloud platform for developers, continues to gain momentum. Since yesterday, several new services have been officially announced. Among them, one of the most anticipated (and the one that fuelled the most rumors) is the arrival of persistent virtual machines capable of running Linux distributions (Ubuntu, OpenSuse, CentOS, SUSE Linux Enterprise Server). Azure now combines infrastructure and platform services for "greater flexibility in the way of con...

    Read the article

  • Microsoft adds a new Build service to the hosted version of Team Foundation Server, enabling builds on Windows Azure

    Microsoft adds a new Build service to the hosted version of Team Foundation Server, making it possible to compile source code on Windows Azure. Microsoft has updated the hosted version of Team Foundation Server on Windows Azure. Team Foundation Server is a collaborative work solution covering source control, builds, work item tracking, planning, and performance analysis. A year ago, Microsoft ported the solution to its Windows Azure Cloud hosting platform. The company has just announced that a new Build service has been added to Team Foundation. This service reduces the...

    Read the article

  • Azure VS Tools and SDK - systray already running…

    - by Shawn Cicoria
    If you get the message “Systray already running…” from within Visual Studio when you start the Compute Emulator, one fix is to check the image name under which the process is loaded. For some reason, on two of my machines the image was loading in the 8.3 format. This caused the logic in the VS tools not to find the process. So, to fix it, I just did a little copy/rename magic:

    C:\Program Files\Windows Azure SDK\v1.3\bin>copy csmonitor.exe csmonitor-a.exe
            1 file(s) copied.
    C:\Program Files\Windows Azure SDK\v1.3\bin>del csmonitor.exe
    C:\Program Files\Windows Azure SDK\v1.3\bin>copy csmonitor-a.exe csmonitor.exe
            1 file(s) copied.

    If you bring up Task Manager and see something like CSMON~1.EXE in the Image Name column, you probably have this issue.

    Read the article
