Search Results

Search found 1774 results on 71 pages for 'azure'.


  • SQL Azure and VS 2010B2 or SSMSE 2008

    - by Vulgrin
    Ok, I see that people have asked this question before, but I'm seeing some conflicting statements. Can I, or can I not, connect directly to my SQL Azure database from SSMSE 2008? I see posts from before November that the SSMS 2008 RC would be able to connect directly - so I don't understand why the newest SSMSE cannot connect. Is it just a problem with the Express version of SSMS? Where can I find the "non-Express" version, if there is one? I can connect to the database via the cancel and connect method - however, you don't get object explorer that way. I see that there are add-ons for VS.Net to allow you to explore the database but I wanted to do it with the base apps if possible. Thanks

    Read the article

  • Checking if a blob exists in Azure Storage

    - by John
    Hi, I've got a very simple question (I hope!) - I just want to find out if a blob (with a name I've defined) exists in a particular container. I'll be downloading it if it does exist, and if it doesn't then I'll do something else. I've done some searching on the intertubes and apparently there used to be a function called DoesExist or something similar... but as with so many of the Azure APIs, this no longer seems to be there (or if it is, has a very cleverly disguised name). Or maybe I'm missing something simple... :) John
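
    For what it's worth, with the StorageClient library of that era the usual workaround is to call FetchAttributes and catch the not-found error. A rough sketch (container and blob names are placeholders, not from the question):

        using System.Net;
        using Microsoft.WindowsAzure.StorageClient;

        public static bool BlobExists(CloudBlobClient client, string containerName, string blobName)
        {
            var blob = client.GetContainerReference(containerName).GetBlobReference(blobName);
            try
            {
                // FetchAttributes issues a lightweight HEAD request; it throws if the blob is missing.
                blob.FetchAttributes();
                return true;
            }
            catch (StorageClientException ex)
            {
                if (ex.StatusCode == HttpStatusCode.NotFound)
                {
                    return false;
                }
                throw; // some other storage failure
            }
        }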

    Read the article

  • Windows Azure: Exception while creating a blob container

    - by veda
    I followed a tutorial on creating a blob on Windows Azure. But when I do that, I get an exception error: Error while creating containerThe server encountered an unknown failure: The remote server returned an error: (300) Ambiguous Redirect. The code is:

        private void SetContainersAndPermission()
        {
            try
            {
                // create a container
                var CloudAccountStorage = CloudStorageAccount.FromConfigurationSetting("BlobConnectionString");
                cloudBlobClient = CloudAccountStorage.CreateCloudBlobClient();
                CloudBlobContainer blobContainer = cloudBlobClient.GetContainerReference("documents");
                blobContainer.CreateIfNotExist();

                // permissions
                var containerPermissions = blobContainer.GetPermissions();
                containerPermissions.PublicAccess = BlobContainerPublicAccessType.Container;
                blobContainer.SetPermissions(containerPermissions);
            }
            catch (Exception ex)
            {
                throw new Exception("Error while creating container" + ex.Message);
            }
        }

    Can anyone tell me how to solve this problem?
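
    Not part of the original question, but a (300) Ambiguous Redirect at container creation often traces back to a malformed account name or endpoint in the storage connection string, so it is worth checking that "BlobConnectionString" parses into the endpoints you expect. A small diagnostic sketch with placeholder values:

        using System;
        using Microsoft.WindowsAzure;

        class ConnectionStringCheck
        {
            static void Main()
            {
                // Placeholder connection string - the account name must match the storage account exactly.
                var cs = "DefaultEndpointsProtocol=http;AccountName=myaccount;AccountKey=<base64 key>";

                var account = CloudStorageAccount.Parse(cs);

                // If these URIs don't point at <account>.blob.core.windows.net (or the local
                // development storage endpoints), container calls can be redirected unexpectedly.
                Console.WriteLine(account.BlobEndpoint);
                Console.WriteLine(account.TableEndpoint);
                Console.WriteLine(account.QueueEndpoint);
            }
        }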

    Read the article

  • Windows Azure access POST data

    - by Mohamed Nuur
    Ok, so I can't seem to find decent Windows Azure examples. I have a simple hello world application that's based on this tutorial. I want to have custom output instead of JSON or XML. So I created my interface like:

        [ServiceContract]
        public interface IService
        {
            [OperationContract]
            [WebInvoke(UriTemplate = "session/create", Method = "POST")]
            string createSession();
        }

        public class MyService : IService
        {
            public string createSession()
            {
                // get access to POST data here: user, pass
                string sessionid = Session.Create(user, pass);
                return "sessionid=" + sessionid;
            }
        }

    For the life of me, I can't seem to figure out how to access the POST data. Please help. Thanks!
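
    One common way to get at the raw POST body with WCF's webHttpBinding (a sketch, not taken from the tutorial; the form-field names "user" and "pass" are assumptions) is to give the operation a Stream parameter and parse the body yourself:

        using System.IO;
        using System.ServiceModel;
        using System.ServiceModel.Web;
        using System.Web;

        [ServiceContract]
        public interface IService
        {
            [OperationContract]
            [WebInvoke(UriTemplate = "session/create", Method = "POST")]
            string createSession(Stream body);
        }

        public class MyService : IService
        {
            public string createSession(Stream body)
            {
                // Read the raw request body, e.g. "user=joe&pass=secret" for a form POST.
                string raw = new StreamReader(body).ReadToEnd();
                var fields = HttpUtility.ParseQueryString(raw);

                string sessionid = Session.Create(fields["user"], fields["pass"]);
                return "sessionid=" + sessionid;
            }
        }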

    Read the article

  • LinqPad with Azure Table Storage

    - by Sarang
    LinqPad, as we all know, has been a wonderful tool for running ad-hoc queries. With Windows Azure Table storage in the picture, LinqPad was no longer an option, and we shifted focus to Cloud Storage Studio, only to realize the limited and strange querying capabilities of CSS. With some tweaking to LinqPad we can get the comfortable old shoe of ad-hoc queries back for Windows Azure Table storage. Steps:
    1. Start LinqPad.
    2. Right-click in the query window and select "Query Properties".
    3. In the Additional References, add references to Microsoft.WindowsAzure.StorageClient, System.Data.Services.Client.dll and the assembly containing the implementation of the DataServiceContext class tied to the Windows Azure Table storage.
    4. In the Additional Namespace Imports, import the same three namespaces mentioned above.
    5. Then we need to provide the following details:
       a. Table storage account name and shared key.
       b. The DataServiceContext-implementing class in your code.
       c. A LINQ query, e.g.

        var storageAccountName = "myStorageAccount";  // Enter valid storage account name
        var storageSharedKey = "mysharedKey"; // Enter valid storage account shared key
        var uri = new System.Uri("http://table.core.windows.net/");
        var storageAccountInfo = new CloudStorageAccount(new StorageCredentialsAccountKey(storageAccountName, storageSharedKey), false);
        var serviceContext = new TweetPollDataServiceContext(storageAccountInfo); // Specify the DataServiceContext implementation
        // The query
        var query = from row in serviceContext.Table
                    select row;
        query.Dump();

    Thanks LinqPad! Technorati Tags: LinqPad, Azure Table Storage, Linq

    Read the article

  • Miami 311: Built on Windows Azure

    - by Josh Holmes
    This is a cool use of Azure. The city of Miami took their "311" data around potholes, trash pickup issues, recycling issues, broken sidewalks and the like and put that data in Azure. The next step is that they leveraged Bing Maps and Silverlight to visualize those issues spread on a map of the city. The solution takes advantage of virtually unlimited storage and processing power, provides the ability to quickly address service requests and implement updates even during peak times such as hurricane season. If things change, the City can bring the solution on site or move to a physical facility, all based on need and cost-effectiveness. As a result, residents logging on to Miami 311 can see on average 4,500 issues in progress - not represented as a 'list', but located on a map in relation to other projects in their neighborhood. A simple click on the map allows them to easily drill down to more and more specific details if they want. In short, they have turned what used to be represented by a meaningless list of data into useful information, and created actionable and consumable knowledge that is relevant to the citizens of Miami. For Miami, their 'service call to the city' becomes an interactive process they can follow - and the City has a new tool to manage and deliver outcomes. … When the city made the move to the web, they chose tools they knew and software they trust. The Microsoft Windows Azure cloud platform made it easy to do, and they used both Bing mapping and Silverlight to build a user friendly front end. According to Port25 (Miami 311: Built on Windows Azure - Port 25: The Open Source Community at Microsoft), it took two people 8 days to implement the whole system and they are going to open source their solution so that other cities can leverage it. I haven't seen yet where and how they are going to release it but I'll keep you posted if I find out.

    Read the article

  • Windows Azure Upgrade Domain

    - by kaleidoscope
    Windows Azure automatically divides your role instances into "logical" domains called upgrade domains. During an upgrade, Azure updates these domains one by one. This is by-design behavior to avoid nasty situations. One of the recent feature additions to the platform was the ability to notify your role instances of "environment" changes, with instances being added or removed being the most common. In such a case, all your role instances get a notification of this change. Imagine if you had 50 or 60 role instances, all getting notified at once and starting various actions to react to this change. It would be a complete disaster for your service. The way to address this problem is upgrade domains. During an upgrade, Windows Azure updates them one by one, and only the role instances associated with a specific domain get notified of the changes taking place. Only a small number of your role instances get notified and react, while the rest remain intact, providing a seamless upgrade experience with no service disruption or downtime. http://www.kefalidis.me/archive/2009/11/27/windows-azure-ndash-what-is-an-upgrade-domain.aspx   Lokesh, M
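
    As a rough illustration of the notification mechanism described above (my sketch, not from the original post), a role instance can subscribe to the RoleEnvironment.Changing event in OnStart and decide how to react when its upgrade domain is being updated:

        using System.Linq;
        using Microsoft.WindowsAzure.ServiceRuntime;

        public class WorkerRole : RoleEntryPoint
        {
            public override bool OnStart()
            {
                // Raised before a configuration or topology change is applied to this instance.
                RoleEnvironment.Changing += (sender, e) =>
                {
                    if (e.Changes.OfType<RoleEnvironmentTopologyChange>().Any())
                    {
                        // Instances are being added or removed. Leave e.Cancel = false to
                        // handle the change while running, or set it to true to ask Azure
                        // to recycle this instance before applying the change.
                        e.Cancel = false;
                    }
                };
                return base.OnStart();
            }
        }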

    Read the article

  • Worker roles in Windows Azure to host a multiplayer server

    - by MrWiggels
    I've been doing research on where to host a simple multiplayer backend for a simple game I'm developing. As a first choice I downloaded the Windows Azure SDK, which provides a nice and simple emulator environment where you can test out your application before uploading. I also downloaded the Azure Social Game Toolkit (Visit), and followed it as far as my understanding can take me. So, down to the main question: is there anybody with experience developing Azure applications? I'm developing an Action RPG game, in a similar vein to Diablo III. I was thinking of putting up Matchmaking, Friends Lists, etc. Is there another way to connect to Azure services via something like UDP or TCP for sending packets, or does everything have to go through HTTP requests? Is it even possible to use HTTP request/response for something like this? All game commands will be simple. Because the game server and the clients will be kept in sync and will have deterministic actions, I'm just going to send actions like "Use Primary Skill" and "Use Secondary Skill". Any hints, ideas, light bulbs or a smack-in-the-face presentation will be much appreciated.
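
    Not from the original post, but on the TCP question: a worker role can expose a raw TCP input endpoint (declared in ServiceDefinition.csdef) and accept sockets directly. A minimal sketch, where the endpoint name "GameEndpoint" is an assumption:

        using System.Net.Sockets;
        using Microsoft.WindowsAzure.ServiceRuntime;

        public class GameServerRole : RoleEntryPoint
        {
            public override void Run()
            {
                // Resolve the IP/port Azure assigned to the input endpoint declared in ServiceDefinition.csdef.
                var endpoint = RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["GameEndpoint"].IPEndpoint;

                var listener = new TcpListener(endpoint);
                listener.Start();

                while (true)
                {
                    // Each connected client gets its own socket; game packets flow over plain TCP,
                    // with no HTTP request/response round-trips required.
                    TcpClient client = listener.AcceptTcpClient();
                    // ... hand the client off to a session handler ...
                }
            }
        }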

    Read the article

  • Automated backups for Windows Azure SQL Database

    - by Greg Low
    One of the questions that I've often been asked is how you can back up databases in Windows Azure SQL Database. What we have had access to was the ability to export a database to a BACPAC. A BACPAC is basically just a zip file that contains a bunch of metadata along with a set of bcp files for each of the tables in the database. Each table in the database is exported one after the other, so this does not produce a transactionally-consistent backup at a specific point in time. To get a transactionally-consistent copy, you need a database that isn't in use. The easiest way to get a database that isn't in use is to use CREATE DATABASE AS COPY OF. This creates a new database as a transactionally-consistent copy of the database that you are copying. You can then use the export options to get a consistent BACPAC created. Previously, I've had to automate this process by myself. Given there was also no SQL Agent in Azure, I used a job in my on-premises SQL Server to do this, using a linked server configuration. Now there's a much simpler way. Windows Azure SQL Database now supports an automated export function. On the Configuration tab for the database, you need to enable the Automated Export function. You can configure how often the operation is performed for you, and which storage account will be used for the backups. It's important to consider the cost impacts of this as well. You are charged for however many databases are on your server on a given day. So if you enable a daily backup, you will double your database costs. Do not schedule the backups just before midnight UTC, as that could cause you to have three databases each day instead of one. This is a much-needed addition to the capabilities. Scott Guthrie also posted about some other notable changes today, including a preview of a new premium offering for SQL Database. In addition to the Web and Business editions, there will now be a Premium edition that has reserved (rather than shared) resources. You can read about it all in Scott's post here: http://weblogs.asp.net/scottgu/archive/2013/07/23/windows-azure-july-updates-sql-database-traffic-manager-autoscale-virtual-machines.aspx
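
    For anyone still scripting the copy step themselves, a rough sketch of driving CREATE DATABASE ... AS COPY OF from C# might look like this (server, database and credential names are placeholders, and the command must be run against the master database):

        using System;
        using System.Data.SqlClient;
        using System.Threading;

        class DatabaseCopy
        {
            static void Main()
            {
                // Placeholder connection string for the master database on the SQL Azure server.
                var master = "Server=tcp:myserver.database.windows.net;Database=master;" +
                             "User ID=admin@myserver;Password=...;Encrypt=True;";

                using (var conn = new SqlConnection(master))
                {
                    conn.Open();

                    // Start a transactionally-consistent copy of the source database.
                    using (var copy = new SqlCommand("CREATE DATABASE MyDb_Copy AS COPY OF MyDb", conn))
                    {
                        copy.ExecuteNonQuery();
                    }

                    // Poll sys.dm_database_copies until the copy operation completes (the row disappears),
                    // then export the copy to a BACPAC.
                    using (var check = new SqlCommand("SELECT percent_complete FROM sys.dm_database_copies", conn))
                    {
                        object percent;
                        while ((percent = check.ExecuteScalar()) != null)
                        {
                            Console.WriteLine("Copying... {0}%", percent);
                            Thread.Sleep(10000);
                        }
                    }
                }
            }
        }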

    Read the article

  • The tale of how the PowerShell CmdLets got installed with Azure SDK 1.4

    - by Enrique Lima
    I installed the Azure SDK 1.4 while rebuilding my laptop and then ran the installation for the Windows Azure Service Management (WASM) PowerShell CmdLets. I kicked off the installation script by locating the path the WASM PowerShell CmdLets were deployed to and double-clicking the startHere command, which opens the WASM installation dialog. Click Next. Click Next again. Notice the red X next to Azure SDK 1.3; the problem is I have SDK 1.4. Here is the workaround: go back to the location of the deployed WASM sources. Go into the setup path, then scripts > dependencies > check. Now locate the CheckAzureSDK.ps1 file, right-click it, then choose Edit. The content of the ps1 file checks for a specific version of the Azure SDK; in this case, it is looking for version 1.3.11133.0038, and we need it to check for version 1.4.20227.1419 instead. Now save your ps1 file, go back to the open WASM install dialog, and click Rescan. This time it should pass; then click Next. A command prompt window will appear; press any key. This completes the installation; click Close.

    Read the article

  • WCF Service in Azure with ClaimsIdentity over SSL

    - by Sunil Ramu
    Hello, I created a WCF service as a WebRole using Azure, and a client Windows application which references this service. The cloud service refers to a certificate which was created using the "Hands On Lab" given in Windows Identity Foundation. The web service is hosted in IIS and works perfectly when executed. I've created a client Windows app which refers to this web service. Since WIF claims identity is used, I have a ClaimsAuthorizationManager class, and also a Policy class with a set of defined policies. The claims are set in the web.config file. When I execute the Windows app as the startup project, the app prompts for authentication, and when the account credentials are given as in the config file, it opens a new "Windows CardSpace" window and says "Incoming Policy Failed". When I close the window the system throws an exception: The incoming policy could not be validated. For more information, please see the event log. Event Log Details: Incoming policy failed validation. No valid claim elements were found in the policy XML. Additional Information: at System.Environment.get_StackTrace() at Microsoft.InfoCards.Diagnostics.InfoCardTrace.BuildMessage(InfoCardBaseException ie) at Microsoft.InfoCards.Diagnostics.InfoCardTrace.TraceAndLogException(Exception e) at Microsoft.InfoCards.Diagnostics.InfoCardTrace.ThrowHelperError(Exception e) at Microsoft.InfoCards.InfoCardPolicy.Validate() at Microsoft.InfoCards.Request.PreProcessRequest() at Microsoft.InfoCards.ClientUIRequest.PreProcessRequest() at Microsoft.InfoCards.Request.DoProcessRequest(String& extendedMessage) at Microsoft.InfoCards.RequestFactory.ProcessNewRequest(Int32 parentRequestHandle, IntPtr rpcHandle, IntPtr inArgs, IntPtr& outArgs) Details: System Provider [ Name] CardSpace 3.0.0.0 EventID 267 [ Qualifiers] 49157 Level 2 Task 1 Keywords 0x80000000000000 EventRecordID 6996 Channel Application EventData No valid claim elements were found in the policy XML. Additional Information: at System.Environment.get_StackTrace() at Microsoft.InfoCards.Diagnostics.InfoCardTrace.BuildMessage(InfoCardBaseException ie) at Microsoft.InfoCards.Diagnostics.InfoCardTrace.TraceAndLogException(Exception e) at Microsoft.InfoCards.Diagnostics.InfoCardTrace.ThrowHelperError(Exception e) at Microsoft.InfoCards.InfoCardPolicy.Validate() at Microsoft.InfoCards.Request.PreProcessRequest() at Microsoft.InfoCards.ClientUIRequest.PreProcessRequest() at Microsoft.InfoCards.Request.DoProcessRequest(String& extendedMessage) at Microsoft.InfoCards.RequestFactory.ProcessNewRequest(Int32 parentRequestHandle, IntPtr rpcHandle, IntPtr inArgs, IntPtr& outArgs)

    Read the article

  • Send Email from worker role (Azure) with attachment in c#

    - by simplyvaibh
    I am trying to send an email (in C#) from a worker role (Azure) with an attachment (from blob storage). I am able to send the email, but the attachment (a Word document) is blank. The following function is called from the worker role:

        public void sendMail(string blobName)
        {
            InitStorage(); // Initialize the storage
            var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
            container = blobStorage.GetContainerReference("Container Name");
            CloudBlockBlob blob = container.GetBlockBlobReference(blobName);

            if (File.Exists("demo.doc"))
                File.Delete("demo.doc");
            FileStream fs = new FileStream("demo.doc", FileMode.OpenOrCreate);
            blob.DownloadToStream(fs);

            Attachment attach = new Attachment(fs, "Report.doc");
            System.Net.Mail.MailMessage Email = new System.Net.Mail.MailMessage("[email protected]", "[email protected]");
            Email.Subject = "Text fax send via email";
            Email.Subject = "Subject Of email";
            Email.Attachments.Add(attach);
            Email.Body = "Body of email";

            System.Net.Mail.SmtpClient client = new SmtpClient("smtp.live.com", 25);
            client.DeliveryMethod = SmtpDeliveryMethod.Network;
            client.EnableSsl = true;
            client.Credentials = new NetworkCredential("[email protected]", Password);
            client.Send(Email);

            fs.Flush();
            fs.Close();
            Email.Dispose();
        }

    Please tell me where I am going wrong.
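
    For what it's worth (an observation, not part of the original question): DownloadToStream leaves the FileStream positioned at the end of the data, so the Attachment reads zero bytes from it. Rewinding the stream before building the attachment usually fixes the blank document, for example with a small helper along these lines:

        using System.IO;
        using System.Net.Mail;
        using Microsoft.WindowsAzure.StorageClient;

        static Attachment DownloadAsAttachment(CloudBlockBlob blob, string attachmentName)
        {
            var stream = new MemoryStream();
            blob.DownloadToStream(stream);

            // DownloadToStream leaves the position at the end of the stream;
            // rewind it, otherwise the attachment content will be empty.
            stream.Seek(0, SeekOrigin.Begin);

            return new Attachment(stream, attachmentName);
        }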

    Read the article

  • Windows Azure: asp.net <appSettings> for ChartHttpHandler not found

    - by veda
    I have a problem setting the ChartHttpHandler in web.config for Windows Azure. Initially I added a ChartHttpHandler to my web.config file, under the system.webServer section:

        <remove name="ChartImageHandler"/>
        <add name="ChartImageHandler" preCondition="integratedMode" verb="GET,HEAD" path="ChartImg.axd" type="System.Web.UI.DataVisualization.Charting.ChartHttpHandler, System.Web.DataVisualization, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"/>

    Then I got an error stating: Invalid temp directory in chart handler configuration [c:\TempImageFiles\]. Then I found that I should change the setting from

        <add key="ChartImageHandler" value="storage=file;timeout=20;dir=c:\TempImageFiles\;" />

    to

        <add key="ChartImageHandler" value="storage=file;timeout=20;" />

    but I am not able to find this section in my web.config file. So I tried to add it under one section, but it gave me an error that it is an invalid child. Likewise, I tried adding it under another section, and it gave the same error. So I added it separately; then I am not able to see anything - the web page is just blank. What do I have to do? Can anyone tell me how to solve it?

    Read the article

  • Can someone please debug this Windows Azure Application?

    - by Vimvq1987
    Here's the myTODO project from CodePlex: myTODO project. I added any necessary libraries, added storage, and changed obsolete types/methods, but everything went wrong when I debugged it. An exception was thrown here (in TableStorage.cs):

        public IEnumerable<TElement> ExecuteWithRetries(RetryPolicy retry)
        {
            IEnumerable<TElement> ret = null;
            if (retry == null)
            {
                throw new ArgumentNullException("retry");
            }
            retry(() =>
            {
                try
                {
                    ret = _query.Execute();
                }
                catch (InvalidOperationException e)
                {
                    if (TableStorageHelpers.CanBeRetried(e))
                    {
                        throw new TableRetryWrapperException(e);
                    }
                    throw;
                }
            });
            return ret;
        }

    I'm using Visual Studio 2008, SQL Server 2008, and Windows Azure SDK v1.1. Can anyone please debug this project for me, or suggest some way to get it working? This request is urgent. Any help is much appreciated. PS: If you can't download these files, please let me know and I'll upload them to another host.

    Read the article

  • Create a real time web application using .net framework and azure - very confused

    - by test
    Let us say I would like a simple (yet complex) web application where there is continuous reading and writing to the SQL Azure database. Let us say I am tracking a location, and I would like it to be updated very frequently (let's take the worst case: 1 second). From the little knowledge I have, I think this involves continuously writing the location to the database, and continuously reading from the database to update another person through a website. Do you have any suggestions as to which technologies I can use? Is there a simple way? I have heard about node.js and SignalR, but I have no idea how to use them, or whether they are really what I need. The last tutorial I checked out simply uses a while(true) loop... but I don't think it's a good idea to keep a thread continuously busy. Do I have to create some background task? Do I have to create some web service? This is a school project and I would prefer not to go for the most difficult option, but if there is some sort of solution, challenge accepted :) Can you please help me? I have asked many questions here and yet I have no solution in mind.
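
    For what it's worth, SignalR is probably the lightest path for the push side of this: a hub can broadcast location updates to connected browsers as they arrive, with no polling loop. A minimal sketch (hub and method names are made up for illustration):

        using Microsoft.AspNet.SignalR;

        // Server-side hub: the tracked client posts its position here, and every
        // connected browser receives the update immediately - no while(true) loop.
        public class LocationHub : Hub
        {
            public void UpdateLocation(string userId, double latitude, double longitude)
            {
                // Optionally persist the point to SQL Azure here, then broadcast it.
                Clients.All.locationUpdated(userId, latitude, longitude);
            }
        }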

    Read the article

  • Windows Azure Worldwide availability

    - by Insomniac
    Hi, I've been reviewing the Windows Azure platform for some time, and can't find the answer to one very important question. If I deploy my application in the cloud, how will it be reached from different places worldwide? For example, if I have a web application with a database and want it to be accessible to users in the UK, US, China, etc., can I be sure that any user in the world will get almost the same request processing time? I think of it this way:
    1. The user sends a request (navigates in their browser to my web site).
    2. The request arrives in the cloud at the nearest location (the MS data center closest to the user?).
    3. It is processed by an instance of my web application (in the nearest location, with a request to my centralized DB, which can be far away, but the SQL request goes via MS's internal network, which I believe should be very fast).
    4. The response is sent to the user.
    Please let me know if I'm wrong. Thanks.

    Read the article

  • Microsoft Townhall, An Example for Azure and MVC

    - by Shaun
    Microsoft just released an example named Microsoft Townhall which was built and deployed on Azure. It uses ASP.NET MVC as its website framework, with SQL Azure as the database and LINQ to SQL as the ORM framework. You can download the source code at the MSDN Code Gallery. Besides the Azure side, it might be even more useful for us to learn how they utilized ASP.NET MVC. From a very quick review I found it utilizes Enterprise Library Unity as the main IoC container for controllers, services and repositories, and customizes a lot of ModelBinders, Filters, etc.   Hope this helps, Shaun   All documents and related graphics, codes are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • Devfish Joe Healy in Fort Lauderdale - Cloud Computing and Azure - 03/11/2010 MSDN Tiki Hut

    - by Rainer
    Devfish Joe Healy, Brian Hitney, and Herve Rogero presented excellent sessions at today's MSDN Tiki Hut event about Cloud Computing and Azure. This was a developer-focused event, starting out with an overview of the structure and platform, followed by working code samples running on the platform, and all the information needed to get developers started on development for cloud applications. Participants had Q&A opportunities after each session and made good use of them. I am sure that a lot of developers will jump on the Azure train. Azure is on top of my dev project list after that great event! This platform offers endless opportunities for development and businesses. The cloud environment in general is safer, scales better, and is far more cost effective compared to running and maintaining your own data center. Posted: Rainer Habermann

    Read the article

  • PASS Conference 2011 Topic: Multitenant Design and Sharding with SQL Azure

    - by Herve Roggero
    I am really happy to announce that I have been accepted as a speaker at the 2011 PASS Conference in Seattle. The topic? It will be about SQL Azure scalability using shards, and the Data Federation feature of SQL Azure. I will also talk extensively about the community open-source sharding library Enzo SQL Shard (enzosqlshard.codeplex.com) and show how to make the most out of it. In general, the presentation will provide details about how to properly design an application for sharding, how to make it work for SQL Server and SQL Azure, and how to leverage the upcoming Data Federation technology that Microsoft is planning. The primary objective is to make sharding an implementation concern, not a development concern. Using a library like Enzo SQL Shard will help you achieve this objective. If you come to PASS Summit this year, come see me and mention you saw this blog!

    Read the article

  • MSDN Simulcast Event: Take Your Applications Sky-High with Cloud Computing and the Windows Azure Platform

    Join your local MSDN Events team as we take a deep dive into Microsoft Windows Azure. We'll start with a developer-focused overview of this brave new platform and the cloud computing services that can be used to build amazing applications. As the day unfolds, we'll explore data storage, Microsoft SQL Azure, and the basics of deployment with Windows Azure.

    Read the article

  • Azure Boot Camp

    - by Brian Schroer
    Belated thanks to Perficient for sponsoring (and providing lunch, which was a nice unadvertised surprise) and to Avichal Jain and Brian Blanchard for presenting at the St. Louis Azure Boot Camp May 13-14. There was a little more upfront discussion of "What is Cloud Computing and Why is it important?" than I thought necessary (I would think that people signing up for a two-day Azure event would already be convinced that it's a worthwhile thing), but we put on our boots and fired up Visual Studio soon enough. The good news for developers, as with most of Microsoft's recent initiatives (e.g. Silverlight and Windows Phone 7 development), is that you can leverage the skills you already have. If you've developed service-oriented applications, you've got a big head start. If a free Azure Boot Camp event is coming to your area (here's the schedule), be sure to check it out. If not, you can download the slides and labs from their web site and "throw your own".

    Read the article

  • What's new in My Life: Robotics, Azure

    - by sonam
    AZURE: I haven't blogged in a long time - I was actually busy doing some Azure. For anyone starting with Azure, I would recommend going with Neil: http://nmackenzie.spaces.live.com/Blog/cns!B863FF075995D18A!564.entry Awesome content.   Another thing that has come into my interests: Robotics. Yes, I am finally reading up on robotics, especially mobile robotics. Since I don't have any prof to guide me yet, I am doing it independently by reading research papers and books. My first robot is not autonomous, but I am actually making it for RoboWars. I got inspired by this video of Steve Jobs and I think I love to work on robotics. Perhaps that's my love. http://www.youtube.com/watch?v=Hd_ptbiPoXM Cya

    Read the article

  • Using Hadoop (HDInsight) with Microsoft - Two (OK, Three) Options

    - by BuckWoody
    Microsoft has many tools for “Big Data”. In fact, you need many tools – there’s no product called “Big Data Solution” in a shrink-wrapped box – if you find one, you probably shouldn’t buy it. It’s tempting to want a single tool that handles everything in a problem domain, but with large, complex data, that isn’t a reality. You’ll mix and match several systems, open and closed source, to solve a given problem. But there are tools that help with handling data at large, complex scales. Normally the best way to do this is to break up the data into parts, and then put the calculation engines for that chunk of data right on the node where the data is stored. These systems are in a family called “Distributed File and Compute”. Microsoft has a couple of these, including the High Performance Computing edition of Windows Server. Recently we partnered with Hortonworks to bring the Apache Foundation’s release of Hadoop to Windows. And as it turns out, there are actually two (technically three) ways you can use it. (There’s a more detailed set of information here: http://www.microsoft.com/sqlserver/en/us/solutions-technologies/business-intelligence/big-data.aspx, I’ll cover the options at a general level below)
    First Option: Windows Azure HDInsight Service
    Your first option is that you can simply log on to a Hadoop control node and begin to run Pig or Hive statements against data that you have stored in Windows Azure. There’s nothing to set up (although you can configure things where needed), and you can send the commands, get the output of the job(s), and stop using the service when you are done – and repeat the process later if you wish. (There are also connectors to run jobs from Microsoft Excel, but that’s another post) This option is useful when you have a periodic burst of work for a Hadoop workload, or the data collection has been happening into Windows Azure storage anyway. That might be from a web application, the logs from a web application, telemetrics (remote sensor input), and other modes of constant collection. You can read more about this option here: http://blogs.msdn.com/b/windowsazure/archive/2012/10/24/getting-started-with-windows-azure-hdinsight-service.aspx
    Second Option: Microsoft HDInsight Server
    Your second option is to use the Hadoop Distribution for on-premises Windows called Microsoft HDInsight Server. You set up the Name Node(s), Job Tracker(s), and Data Node(s), among other components, and you have control over the entire ecostructure. This option is useful if you want to have complete control over the system, leave it running all the time, or you have a huge quantity of data that you have to bulk-load constantly – something that isn’t going to be practical with a network transfer or disk-mailing scheme. You can read more about this option here: http://www.microsoft.com/sqlserver/en/us/solutions-technologies/business-intelligence/big-data.aspx
    Third Option (unsupported): Installation on Windows Azure Virtual Machines
    Although unsupported, you could simply use a Windows Azure Virtual Machine (we support both Windows and Linux servers) and install Hadoop yourself – it’s open-source, so there’s nothing preventing you from doing that. Aside from being unsupported, there are other issues you’ll run into with this approach – primarily involving performance and the amount of configuration you’ll need to do to access the data nodes properly. But for a single-node installation (where all components run on one system) such as learning, demos, training and the like, this isn’t a bad option. Did I mention that’s unsupported? :) You can learn more about Windows Azure Virtual Machines here: http://www.windowsazure.com/en-us/home/scenarios/virtual-machines/ And more about Hadoop and the installation/configuration (on Linux) here: http://en.wikipedia.org/wiki/Apache_Hadoop And more about the HDInsight installation here: http://www.microsoft.com/web/gallery/install.aspx?appid=HDINSIGHT-PREVIEW
    Choosing the right option
    Since you have two or three routes you can go, the best thing to do is evaluate the need you have, and place the workload where it makes the most sense. My suggestion is to install the HDInsight Server locally on a test system, and play around with it. Read up on the best ways to use Hadoop for a given workload, understand the parts, write a little Pig and Hive, and get your feet wet. Then sign up for a test account on HDInsight Service, and see how that leverages what you know. If you're a true tinkerer, go ahead and try the VM route as well. Oh - there’s another great reference on the Windows Azure HDInsight that just came out, here: http://blogs.msdn.com/b/brunoterkaly/archive/2012/11/16/hadoop-on-azure-introduction.aspx

    Read the article

  • Integration Patterns with Azure Service Bus Relay, Part 2: Anonymous full-trust .NET consumer

    - by Elton Stoneman
    This is the second in the IPASBR series, see also: Integration Patterns with Azure Service Bus Relay, Part 1: Exposing the on-premise service. Part 2 is nice and easy. From Part 1 we exposed our service over the Azure Service Bus Relay using the netTcpRelayBinding and verified we could set up our network to listen for relayed messages. Assuming we want to consume that service in .NET from an environment which is fairly unrestricted for us, but quite restricted for attackers, we can use netTcpRelay and shared secret authentication.
    Pattern applicability
    This is a good fit for scenarios where:
    - the consumer can run .NET in full trust
    - the environment does not restrict use of external DLLs
    - the runtime environment is secure enough to keep shared secrets
    - the service does not need to know who is consuming it
    - the service does not need to know who the end-user is
    So for example, the consumer is an ASP.NET website sitting in a cloud VM or Azure worker role, where we can keep the shared secret in web.config and we don't need to flow any identity through to the on-premise service. The service doesn't care who the consumer or end-user is - say it's a reference data service that provides a list of vehicle manufacturers. Provided you can authenticate with ACS and have access to the Service Bus endpoint, you can use the service and it doesn't care who you are. In this post, we'll consume the service from Part 1 in ASP.NET using netTcpRelay. The code for Part 2 (+ Part 1) is on GitHub here: IPASBR Part 2
    Authenticating and authorizing with ACS
    In this scenario the consumer is a server in a controlled environment, so we can use a shared secret to authenticate with ACS, assuming that there is governance around the environment and the codebase which will prevent the identity being compromised. From the provider's side, we will create a dedicated service identity for this consumer, so we can lock down their permissions. The provider controls the identity, so the consumer's rights can be revoked. We'll add a new service identity for the namespace in ACS, just as we did for the serviceProvider identity in Part 1. I've named the identity fullTrustConsumer. We then need to add a rule to map the incoming identity claim to an outgoing authorization claim that allows the identity to send messages to Service Bus (see Part 1 for a walkthrough creating Service Identities):
    Issuer: Access Control Service
    Input claim type: http://schemas.xmlsoap.org/ws/2005/05/identity/claims/nameidentifier
    Input claim value: fullTrustConsumer
    Output claim type: net.windows.servicebus.action
    Output claim value: Send
    This sets up a service identity which can send messages into Service Bus, but cannot register itself as a listener, or manage the namespace.
    Adding a Service Reference
    The Part 2 sample client code is ready to go, but if you want to replicate the steps, you're going to add a WSDL reference, add a reference to Microsoft.ServiceBus and sort out the ServiceModel config. In Part 1 we exposed metadata for our service, so we can browse to the WSDL locally at: http://localhost/Sixeyed.Ipasbr.Services/FormatService.svc?wsdl If you add a Service Reference to that in a new project you'll get a confused config section with a customBinding, and a set of unrecognized policy assertions in the namespace http://schemas.microsoft.com/netservices/2009/05/servicebus/connect. If you NuGet the ASB package ("windowsazure.servicebus") first and add the service reference, you'll get the same messy config.
    Either way, the WSDL should have downloaded and you should have the proxy code generated. You can delete the customBinding entries and copy your config from the service's web.config (this is already done in the sample project in Sixeyed.Ipasbr.NetTcpClient), specifying details for the client:

        <client>
          <endpoint address="sb://sixeyed-ipasbr.servicebus.windows.net/net"
                    behaviorConfiguration="SharedSecret"
                    binding="netTcpRelayBinding"
                    contract="FormatService.IFormatService" />
        </client>
        <behaviors>
          <endpointBehaviors>
            <behavior name="SharedSecret">
              <transportClientEndpointBehavior credentialType="SharedSecret">
                <clientCredentials>
                  <sharedSecret issuerName="fullTrustConsumer"
                                issuerSecret="E3feJSMuyGGXksJi2g2bRY5/Bpd2ll5Eb+1FgQrXIqo="/>
                </clientCredentials>
              </transportClientEndpointBehavior>
            </behavior>
          </endpointBehaviors>
        </behaviors>

    The proxy is straight WCF territory, and the same client can run against Azure Service Bus through any relay binding, or directly to the local network service using any WCF binding - the contract is exactly the same. The code is simple, standard WCF stuff:

        using (var client = new FormatService.FormatServiceClient())
        {
            outputString = client.ReverseString(inputString);
        }

    Running the sample
    First, update Solution Items\AzureConnectionDetails.xml with your service bus namespace, and your service identity credentials for the netTcpClient and the provider:

        <!-- ACS credentials for the full trust consumer (Part2): -->
        <netTcpClient identityName="fullTrustConsumer"
                      symmetricKey="E3feJSMuyGGXksJi2g2bRY5/Bpd2ll5Eb+1FgQrXIqo="/>

    Then rebuild the solution and verify the unit tests work. If they're green, your service is listening through Azure. Check out the client by navigating to http://localhost:53835/Sixeyed.Ipasbr.NetTcpClient. Enter a string and hit Go! - your string will be reversed by your on-premise service, routed through Azure. Using shared secret client credentials in this way means ACS is the identity provider for your service, and the claim which allows Send access to Service Bus is consumed by Service Bus. None of the authentication details make it through to your service, so your service is not aware who the consumer is (MSDN calls this "anonymous authentication").

    Read the article

  • Windows Azure SDK 1.2 Available - .NET 4.0 Support

    - by Shaun
    The Windows Azure team has just announced the release of the latest version of its tools and SDK (v1.2) at TechEd 2010 New Orleans. You can download it here. The biggest new feature/improvement of this version of the SDK is Visual Studio 2010 RTM and .NET 4.0 support. It gives us the ability to build our Azure-based applications on top of .NET 3.5 as well as 4.0. So those of us who are working on .NET 4, like me, or are going to be, had better have this SDK installed, I think. There is also some other information about the evolution of Windows Azure from this TechEd session, which you can find here.   Hope this helps, Shaun All documents and related graphics, codes are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article
