Search Results

Search found 1939 results on 78 pages for 'adventureworks on azure'.


  • Windows Azure: Storage Client Exception Unhandled

    - by veda
    I am writing code to upload large files into blobs using blocks. When I tested it, it threw a StorageClientException: "One of the request inputs is out of range." I got the exception on this line:

        blob.PutBlock(block, ms, null);

    Here is my code:

        protected void ButUploadBlocks_click(object sender, EventArgs e)
        {
            // store uploaded file as a blob
            if (uplFileUpload.HasFile)
            {
                name = uplFileUpload.FileName;
                byte[] byteArray = uplFileUpload.FileBytes;
                Int64 contentLength = byteArray.Length;
                int numBytesPerBlock = 250 * 1024; // 250 KB per block
                int blocksCount = (int)Math.Ceiling((double)contentLength / numBytesPerBlock); // number of blocks
                MemoryStream ms;
                List<string> BlockIds = new List<string>();
                string block;
                int offset = 0;

                // get reference to the cloud blob container
                CloudBlobContainer blobContainer = cloudBlobClient.GetContainerReference("documents");

                // set the name for the uploaded file
                string UploadDocName = name;

                // get the blob reference and set the metadata properties
                CloudBlockBlob blob = blobContainer.GetBlockBlobReference(UploadDocName);
                blob.Properties.ContentType = uplFileUpload.PostedFile.ContentType;

                for (int i = 0; i < blocksCount; i++, offset = offset + numBytesPerBlock)
                {
                    block = Convert.ToBase64String(BitConverter.GetBytes(i));
                    ms = new MemoryStream();
                    ms.Write(byteArray, offset, numBytesPerBlock);
                    blob.PutBlock(block, ms, null);
                    BlockIds.Add(block);
                }
                blob.PutBlockList(BlockIds);
                blob.Metadata["FILETYPE"] = "text";
            }
        }

    Can anyone tell me how to solve it?
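
    For anyone hitting the same error, two likely culprits in the loop above: the MemoryStream is never rewound before PutBlock (so zero bytes get sent), and the final block tries to write past the end of the byte array. A minimal sketch of a corrected loop, reusing the variables above and assuming the same 1.x storage client:

        for (int i = 0, offset = 0; i < blocksCount; i++, offset += numBytesPerBlock)
        {
            // The last block is usually smaller than 250 KB.
            int blockSize = (int)Math.Min(numBytesPerBlock, contentLength - offset);
            string blockId = Convert.ToBase64String(BitConverter.GetBytes(i));

            // Wrapping the array slice directly yields a readable stream at position 0.
            using (MemoryStream blockStream = new MemoryStream(byteArray, offset, blockSize))
            {
                blob.PutBlock(blockId, blockStream, null);
            }
            BlockIds.Add(blockId);
        }
        blob.PutBlockList(BlockIds);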

    Read the article

  • Upgrade to Azure 2.2 SDK is causing roles to fail

    - by Jon Leach
    I have three worker roles and a web role in my project, and I upgraded it to the new 2.2 SDK (required by VS 2013). Ever since the upgrade, all of the worker roles fail and instantly recycle as soon as they're started. When the roles start, I'm getting these messages:

        Microsoft.WindowsAzure.ServiceRuntime Information: 200 : Role entrypoint . CALLING OnStart()
        Microsoft.WindowsAzure.ServiceRuntime Information: 202 : Role entrypoint . COMPLETED OnStart()
        The thread 0x441c has exited with code 259 (0x103).
        Microsoft.WindowsAzure.ServiceRuntime Information: 203 : Role entrypoint . CALLING Run()
        Microsoft.WindowsAzure.ServiceRuntime Warning: 204 : Role entrypoint . COMPLETED Run() ==> ROLE RECYCLING INITIATED
        Microsoft.WindowsAzure.ServiceRuntime Information: 503 : Role instance recycling is starting
        The thread 0x2684 has exited with code 259 (0x103)

    Two things draw my attention: I've started to see a bunch of "Cannot find or open the PDB file" errors, but I don't know whether this is directly relevant. And while the project lists the SDK as 2.2, the references within the roles are still the 2.1 versions. Do I need to upgrade those components? Why wouldn't the project upgrade them automatically when I pulled it into VS 2013, given that it only supports 2.2? Any thoughts on how to attack this are appreciated.
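
    For what it's worth, that trace is the classic symptom of Run() returning: the runtime recycles a role instance the moment Run() completes, which matches the "COMPLETED Run() ==> ROLE RECYCLING INITIATED" line. Besides updating the Microsoft.WindowsAzure.* references in each role project to the 2.2 versions (the project upgrade does not do that for you), make sure each worker's Run() blocks forever. A minimal sketch of the shape Run() should have, inside the WorkerRole class:

        public override void Run()
        {
            while (true)
            {
                // Do the role's actual work here. Run() must never return;
                // returning is what triggers the recycle seen in the trace.
                System.Threading.Thread.Sleep(10000);
            }
        }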

    Read the article

  • Azure storage: Uploaded files with size zero bytes

    - by Fabio Milheiro
    When I upload an image file to a blob, the upload apparently succeeds (no errors), but when I go to Cloud Storage Studio the file is there with a size of 0 (zero) bytes. The following is the code that I am using:

        // These two methods belong to the ContentService class used to upload
        // files to storage.
        public void SetContent(HttpPostedFileBase file, string filename, bool overwrite)
        {
            CloudBlobContainer blobContainer = GetContainer();
            var blob = blobContainer.GetBlobReference(filename);
            if (file != null)
            {
                blob.Properties.ContentType = file.ContentType;
                blob.UploadFromStream(file.InputStream);
            }
            else
            {
                blob.Properties.ContentType = "application/octet-stream";
                blob.UploadByteArray(new byte[1]);
            }
        }

        public string UploadFile(HttpPostedFileBase file, string uploadPath)
        {
            if (file.ContentLength == 0)
            {
                return null;
            }
            string filename;
            int indexBar = file.FileName.LastIndexOf('\\');
            if (indexBar > -1)
            {
                filename = DateTime.UtcNow.Ticks + file.FileName.Substring(indexBar + 1);
            }
            else
            {
                filename = DateTime.UtcNow.Ticks + file.FileName;
            }
            ContentService.Instance.SetContent(file, Helper.CombinePath(uploadPath, filename), true);
            return filename;
        }

        // The above code is called by this code.
        HttpPostedFileBase newFile = Request.Files["newFile"] as HttpPostedFileBase;
        ContentService service = new ContentService();
        blog.Image = service.UploadFile(newFile,
            string.Format("{0}{1}", Constants.Paths.BlogImages, blog.RowKey));

    Before the image file is uploaded to storage, the InputStream property of the HttpPostedFileBase appears to be fine (the size of the image corresponds to what is expected, and no exceptions are thrown). And the really strange thing is that this works perfectly in other cases (uploading PowerPoints, or even other images from the worker role). The code that calls SetContent seems to be exactly the same, and the file seems to be correct, since a new file with zero bytes is created at the correct location. Does anyone have any suggestion? I debugged this code dozens of times and I cannot see the problem. Any suggestions are welcome! Thanks
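
    One thing worth checking in cases like this: if anything reads file.InputStream before the upload (validation, image inspection, an earlier failed attempt), the stream is left positioned at its end and UploadFromStream then writes zero bytes. A hedged sketch of the usual guard, rewinding before the upload:

        if (file != null)
        {
            blob.Properties.ContentType = file.ContentType;

            // Rewind in case the stream was already consumed earlier in the pipeline.
            if (file.InputStream.CanSeek)
            {
                file.InputStream.Seek(0, SeekOrigin.Begin);
            }
            blob.UploadFromStream(file.InputStream);
        }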

    Read the article

  • Windows Azure: Creation of a file on cloud blob container

    - by veda
    I am writing a program that will execute in the cloud. The program will generate output that should be written to a file, and the file should be saved in the blob container. I have no idea how to do that. Will this code

        FileStream fs = new FileStream(file, FileMode.Create);
        StreamWriter sw = new StreamWriter(fs);

    generate a file named "file" in the cloud?
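
    No: a FileStream writes to the local disk of the role instance, not to blob storage. A minimal sketch of writing generated output straight to a block blob instead, assuming a cloudBlobClient set up as in the other snippets on this page (the blob name "output.txt" is illustrative):

        CloudBlobContainer container = cloudBlobClient.GetContainerReference("documents");
        container.CreateIfNotExist(); // 1.x storage client API

        // The blob itself plays the part of the "file" in the cloud.
        CloudBlockBlob blob = container.GetBlockBlobReference("output.txt");
        blob.UploadText("program output goes here");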

    Read the article

  • Windows Azure: How to create a subdirectory in a blob container

    - by veda
    How do I create a subdirectory in a blob container? For example, in my blob container

        http://veda.blob.core.windows.net/document/

    if I store some files, they will be at

        http://veda.blob.core.windows.net/document/1.txt
        http://veda.blob.core.windows.net/document/2.txt

    Now, how do I create a subdirectory

        http://veda.blob.core.windows.net/document/folder/

    so that I can store files like

        http://veda.blob.core.windows.net/document/folder/1.txt
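
    Blob storage has a flat namespace: there are no real directories under a container, only blob names that happen to contain "/". So the "subdirectory" comes into existence simply by naming a blob with the prefix. A sketch, assuming a cloudBlobClient as in the other snippets on this page:

        CloudBlobContainer container = cloudBlobClient.GetContainerReference("document");

        // The slash in the name creates the virtual "folder".
        CloudBlockBlob blob = container.GetBlockBlobReference("folder/1.txt");
        blob.UploadText("hello"); // URI becomes .../document/folder/1.txt

        // The prefix can later be addressed as a virtual directory:
        CloudBlobDirectory dir = container.GetDirectoryReference("folder");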

    Read the article

  • Upgrading to EF6 blew up Universal Providers session state for Azure

    - by Ryan
    I have an ASP.NET MVC 4 application that uses the Universal Providers for session state:

        <sessionState mode="Custom" sqlConnectionString="DefaultConnection" customProvider="DefaultSessionProvider">
          <providers>
            <add name="DefaultSessionProvider"
                 type="System.Web.Providers.DefaultSessionStateProvider, System.Web.Providers, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
                 connectionStringName="DefaultConnection" />
          </providers>
        </sessionState>

    When I upgraded to Entity Framework 6, I started getting this error:

        Method not found: 'System.Data.Objects.ObjectContext System.Data.Entity.Infrastructure.IObjectContextAdapter.get_ObjectContext()'.

    I tried adding the reference to System.Data.Entity.dll back in, but that didn't work, and I know you're not supposed to add that with the new Entity Framework.
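
    The missing method is a side effect of EF6 moving ObjectContext out of System.Data.Objects (it now lives in System.Data.Entity.Core.Objects): System.Web.Providers 1.x was compiled against the old type and can no longer find it, so re-referencing System.Data.Entity.dll cannot help. One workaround, as a sketch only, is to keep the project hosting the providers on the EF5 package until a provider build compiled against EF6 is available (versions below are illustrative):

        <!-- packages.config (sketch): pin the provider host to EF 5.x,
             since System.Web.Providers 1.x predates the EF6 type moves -->
        <package id="EntityFramework" version="5.0.0" targetFramework="net45" />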

    Read the article

  • Windows Azure: Creating subdirectories inside a blob container

    - by veda
    I wanted to create some subdirectories inside my blob container, but it is not working out well. Here is my code:

        protected void ButUpload_click(object sender, EventArgs e)
        {
            // store uploaded file as a blob
            if (uplFileUpload.HasFile)
            {
                name = uplFileUpload.FileName;

                // get reference to the cloud blob container
                CloudBlobContainer blobContainer = cloudBlobClient.GetContainerReference("documents");

                if (textbox.Text != "")
                {
                    name = textbox.Text + "/" + name;
                }

                // set the name for the uploaded file
                string UploadDocName = name;

                // get the blob reference and set the metadata properties
                CloudBlockBlob blob = blobContainer.GetBlockBlobReference(UploadDocName);
                blob.Metadata["FILETYPE"] = "text";
                blob.Properties.ContentType = uplFileUpload.PostedFile.ContentType;

                // upload the blob to storage
                blob.UploadFromStream(uplFileUpload.FileContent);
            }
        }

    If I have to create a subdirectory, I enter its name in the textbox. For example, if I need to create a file named "test.txt" inside the subdirectory "files", then textbox.Text = "files" and uplFileUpload.FileName = "test.txt". I concatenate them and upload to the blob, but it is not working well: I am getting just

        https://test.core.windows.net/documents/files/

    when I was expecting

        https://test.core.windows.net/documents/files/test.txt

    What am I doing wrong? How do I create subdirectories inside a blob container?
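
    It is hard to say exactly what goes wrong without the failure itself, but since the container namespace is flat, "files/test.txt" as a single blob name should just work. A sketch that makes the concatenation explicit and checks the URI that is actually produced (which should narrow down whether the file name is getting lost before the upload):

        string fileName = Path.GetFileName(uplFileUpload.FileName);   // "test.txt"
        string blobName = fileName;
        if (!string.IsNullOrEmpty(textbox.Text))
        {
            blobName = textbox.Text.Trim('/') + "/" + fileName;       // "files/test.txt"
        }

        CloudBlockBlob blob = blobContainer.GetBlockBlobReference(blobName);
        blob.UploadFromStream(uplFileUpload.FileContent);

        // Should end in /documents/files/test.txt; if not, blobName was wrong going in.
        System.Diagnostics.Trace.WriteLine(blob.Uri);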

    Read the article

  • Update Azure Service Configuration File using PowerShell

    - by David Osborn
    I'm trying to write a PowerShell script that updates each of the DiagnosticsConnectionString and DataConnectionString values below, but I can't seem to find the individual Role nodes using

        $serviceConfig.ServiceConfiguration.SelectSingleNode("Role[@name='MyService_WorkerRole']")

    Doing

        echo $serviceConfig.ServiceConfiguration.Role

    lists both Role nodes for me, so I know it is working up to that point, but after that I am not having much success. $serviceConfig contains the XML below:

        <?xml version="1.0"?>
        <ServiceConfiguration serviceName="MyService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
          <Role name="MyService_WorkerRole">
            <Instances count="1" />
            <ConfigurationSettings>
              <Setting name="DiagnosticsConnectionString" value="really long string" />
              <Setting name="DataConnectionString" value="really long string 2" />
            </ConfigurationSettings>
          </Role>
          <Role name="MyService_WebRole">
            <Instances count="1" />
            <ConfigurationSettings>
              <Setting name="DiagnosticsConnectionString" value="really long string 3" />
              <Setting name="DataConnectionString" value="really long string 4" />
            </ConfigurationSettings>
          </Role>
        </ServiceConfiguration>
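
    The likely snag is the default XML namespace on the ServiceConfiguration root: SelectSingleNode with an unprefixed XPath matches nothing in a namespaced document, while PowerShell's dotted property access (which is why echo works) ignores namespaces. A sketch using an XmlNamespaceManager, with the save path as a placeholder:

        $ns = New-Object System.Xml.XmlNamespaceManager($serviceConfig.NameTable)
        $ns.AddNamespace('sc', 'http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration')

        # Prefix every element in the XPath with the registered namespace.
        $setting = $serviceConfig.SelectSingleNode(
            "//sc:Role[@name='MyService_WorkerRole']/sc:ConfigurationSettings/sc:Setting[@name='DataConnectionString']",
            $ns)
        $setting.value = 'new connection string'
        $serviceConfig.Save('ServiceConfiguration.cscfg')  # placeholder path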

    Read the article

  • Azure batch operations delete several blobs and tables

    - by reft
    I have a function that deletes every table entity and blob that belongs to the affected user:

        CloudTable uploadTable = CloudStorageServices.GetCloudUploadsTable();
        TableQuery<UploadEntity> uploadQuery = uploadTable.CreateQuery<UploadEntity>();
        List<UploadEntity> uploadEntity = (from e in uploadTable.ExecuteQuery(uploadQuery)
                                           where e.PartitionKey == "uploads" && e.UserName == User.Identity.Name
                                           select e).ToList();

        foreach (UploadEntity uploadTableItem in uploadEntity)
        {
            // Delete table entity
            TableOperation retrieveOperationUploads = TableOperation.Retrieve<UploadEntity>("uploads", uploadTableItem.RowKey);
            TableResult retrievedResultUploads = uploadTable.Execute(retrieveOperationUploads);
            UploadEntity deleteEntityUploads = (UploadEntity)retrievedResultUploads.Result;
            TableOperation deleteOperationUploads = TableOperation.Delete(deleteEntityUploads);
            uploadTable.Execute(deleteOperationUploads);

            // Delete blob
            CloudBlobContainer blobContainer = CloudStorageServices.GetCloudBlobsContainer();
            CloudBlockBlob blob = blobContainer.GetBlockBlobReference(uploadTableItem.BlobName);
            blob.Delete();
        }

    Each entity has its own blob, so if the list contains three upload entities, the three entities and the three blobs will all be deleted. I heard you can use table batch operations to reduce cost and load. I tried it, but failed miserably. Anyone interested in helping me? :) I'm guessing table batch operations are for tables only, so it's a no-go for blobs, right? How would you add TableBatchOperation to this code? Do you see any other improvements that could be made? Thanks!
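
    Right, TableBatchOperation only covers table entities (up to 100 per batch, all in the same partition); blobs still have to be deleted one by one. Since every entity here comes from the "uploads" partition and already carries an ETag from the query, the extra Retrieve round-trip can also be dropped. A sketch:

        CloudBlobContainer blobContainer = CloudStorageServices.GetCloudBlobsContainer();
        TableBatchOperation batch = new TableBatchOperation();

        foreach (UploadEntity entity in uploadEntity)
        {
            batch.Delete(entity);           // same partition, ETag already populated
            if (batch.Count == 100)         // hard limit per batch
            {
                uploadTable.ExecuteBatch(batch);
                batch.Clear();
            }

            // Blobs cannot be batched; delete them individually.
            blobContainer.GetBlockBlobReference(entity.BlobName).DeleteIfExists();
        }

        if (batch.Count > 0)
        {
            uploadTable.ExecuteBatch(batch);
        }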

    Read the article

  • Simplest Azure Storage Manipulation possible

    - by Hurricanepkt
    I need to integrate some blob storage into an existing ASP.NET MVC site. My hope is to be able to just add some references and then do puts and gets, but I cannot find any simple example of how to do this that hasn't been deprecated to the point that it no longer works. I have tried using StorageClient, but CreateCloudBlobClient() doesn't seem to work.
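
    For reference, a minimal put/get sketch against the 1.x StorageClient assembly that ships with the SDK; account name and key are placeholders:

        // Placeholder connection string; use UseDevelopmentStorage=true against the emulator.
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey");

        CloudBlobClient client = account.CreateCloudBlobClient();
        CloudBlobContainer container = client.GetContainerReference("mycontainer");
        container.CreateIfNotExist();

        CloudBlockBlob blob = container.GetBlockBlobReference("hello.txt");
        blob.UploadText("Hello, blob storage");     // put
        string roundTripped = blob.DownloadText();  // get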

    Read the article

  • Windows Azure: broken logging after migration to the new SDK 1.3

    - by cloud.dev
    Hi, please help. I've migrated to the new SDK 1.3 (full-IIS mode). I use the following logging:

        case TraceLevel.Error:
            Trace.TraceError(message);
            break;
        case TraceLevel.Warning:
            Trace.TraceWarning(message);
            break;
        case TraceLevel.Info:
            Trace.TraceInformation(message);
            break;
        case TraceLevel.Verbose:
            Trace.WriteLine(message);
            break;

    It worked fine until I migrated to the new SDK. Now logging works only for worker roles; the web role can log only inside the OnStart method of WebRole.cs, and in all other cases nothing is logged. I understand that full IIS means the role entry point and the web site run in different processes and app domains, so must I somehow call WaIIS.exe from w3wp.exe, or ...?
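
    That process split is exactly the issue: under full IIS, request code runs in w3wp.exe, which does not inherit the trace listener that the role host wires up for OnStart. The documented fix is to register the diagnostics listener in the web role's web.config, as in this sketch:

        <system.diagnostics>
          <trace>
            <listeners>
              <add name="AzureDiagnostics"
                   type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
            </listeners>
          </trace>
        </system.diagnostics>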

    Read the article

  • Windows Azure: Parallelization of the code

    - by veda
    I have some matrix multiplication operations, and I want to parallelize their execution across multiple processors. On a high-performance computing cluster this can be done using MPI (Message Passing Interface). Likewise, can I do some parallelization in the cloud using multiple worker roles? Is there any means of doing that?
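
    There is no built-in MPI equivalent, but the common pattern is to fan work out to multiple worker-role instances through a storage queue. A rough sketch, assuming a 1.x storage client and illustrative names; each message describes a range of rows to multiply:

        // Producer: enqueue one message per block of rows.
        CloudQueue queue = queueClient.GetQueueReference("matrixwork");
        queue.CreateIfNotExist();
        for (int row = 0; row < rows; row += blockSize)
        {
            int end = Math.Min(row + blockSize, rows);
            queue.AddMessage(new CloudQueueMessage(row + ":" + end));
        }

        // Each worker instance: dequeue, compute its row range, store the result.
        CloudQueueMessage msg = queue.GetMessage();
        if (msg != null)
        {
            string[] range = msg.AsString.Split(':');
            // ... multiply rows [range[0], range[1]) and write the result to a blob ...
            queue.DeleteMessage(msg);
        }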

    Read the article

  • dotTrace can't create deploy folder on Windows Azure Web Sites (WAWS)

    - by Orhan Maden
    I get the error message "Can't create deploy folder." when I try to profile a remote website on WAWS. Actions taken:

    - Downloaded and installed dotTrace Profiler 5.5 from the JetBrains website
    - Downloaded dotTrace.Performance.Remote version 5.5.0 from NuGet
    - Published the website to WAWS via Visual Studio 2013
    - Started the dotTrace application as Administrator
    - Connected to the remote https://subdomain.azurwebsites.net/AgentService.asmx (see image: http://1drv.ms/1nF5Cyh)
    - Selected the w3wp process and pressed Run
    - Got the error message "Can't create deploy folder." (see image: http://1drv.ms/U5h35A)

    I'm running dotTrace in trial mode at the moment. Swift help is much appreciated. Orhan

    Read the article

  • Details of 5GB and 50GB SQL Azure databases have now been released, along with new price points

    - by Eric Nelson
    Like many others signed up to the Windows Azure Platform, I received an email overnight detailing the upcoming database size changes for SQL Azure. I know from our work with early adopters over the last 12 months that the 1 GB and 10 GB limits were sometimes seen as blockers, especially when migrating existing applications to SQL Azure. On June 28th 2010, we will be increasing the size limits:

    - SQL Azure Web Edition database: from 1 GB to 5 GB
    - SQL Azure Business Edition database: from 10 GB to 50 GB

    Along with these changes come new price points, including the option to increase in increments of 10 GB.

    Web Edition:
    - Up to 1 GB relational database = $9.99 / month
    - Up to 5 GB relational database = $49.95 / month

    Business Edition:
    - Up to 10 GB relational database = $99.99 / month
    - Up to 20 GB relational database = $199.98 / month
    - Up to 30 GB relational database = $299.97 / month
    - Up to 40 GB relational database = $399.96 / month
    - Up to 50 GB relational database = $499.95 / month

    Check out the full SQL Azure pricing. Related links:

    - http://ukazure.ning.com - UK community site
    - Getting started with the Windows Azure Platform

    Read the article

  • Azure Virtual Machines - what fault tolerance do they provide?

    - by Borek
    We are thinking about moving our virtual machines (Hyper-V VHDs) to Windows Azure, but I haven't found much about what kind of fault tolerance that infrastructure provides. When I run a VHD in Azure, I've got two questions:

    1. Is my VHD and all the data in it safe? I think that uploaded VHDs use the "Storage" infrastructure, so they should be automatically replicated to multiple disks and geographically distributed, but should I still make a full-image backup just to be safe? (Note that of course I will be backing up the actual data inside the VMs that I care about; I just want to know if there is a chance greater than 0.0000001% that one day I will receive an email from Microsoft telling me that my VM is gone and that I should create or restore it from scratch.)

    2. Do I need to worry about other things regarding the availability of my VMs? I mean, when I have an on-premise server I need to worry about the hardware itself, about the host operating system, what would happen if my router failed, if my Hyper-V's C: drive failed, etc. Am I right in thinking that with Azure, their infrastructure takes care of all of this?

    Thanks.

    Read the article

  • Windows Azure: the Dev Camp is coming - discover the event's dedicated page on Developpez!

    The Dev Camp day dedicated to Azure takes place on June 20. The day is an opportunity to discover all of Azure's cloud services (SQL Azure, storage with Windows Azure Storage, back end, etc.) and to learn how to build projects and host applications - or web sites - on the platform. The Azure Dev Camp will also cover multi-tier applications and how to "migrate, integrate and extend your existing code and applications with Windows Azure". The day will also cover building web APIs to power mobile applications on iOS, Android and, of course, Windows Phone. Finally, the event...

    Read the article

  • Azure VM won't boot after sysprep; integration tools installed

    - by Mark Williams
    I have installed the Azure Integration Components and used sysprep on a Windows 2012 VM. Now the machine won't start up. I uploaded the VHD to Azure - it failed there too. When I start up the VM I get a PowerShell window that hangs around for a bit; eventually I get the following error, after which the machine restarts:

        New-Object : The dependency service or group failed to start. (Exception from HRESULT: 0x8007042C)
        At line:1 char:1
        + New-Object -comobject WaAgent.WindowsSetupComponent | % { $_.HandleSetupError() ...
        + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
            + CategoryInfo          : ResourceUnavailable (:) [New-Object], COMException
            + FullyQualifiedErrorId : NoCOMClassIdentified,Microsoft.PowerShell.Commands.NewObjectCommand

    I have tried renaming unattended.xml and turning on boot logging. Neither of those yielded much help. Is there a way I can disable the Azure components that run during OOBE? That seems to be the source of the problem. Mounting the VHD is easy. 0x8007042C looks like a firewall issue, based on my googling; unfortunately I can't get the machine to boot so I can figure that issue out. Also, I can't get around this problem by booting into safe mode. Thanks for your help, guys.

    Read the article

  • Azure Search Preview

    - by Greg Low
    One of the things I’ve been keeping an eye on for quite a while now is the development of the Azure Search system. While it’s not a full replacement for the full-text indexing service in SQL Server on-premises as yet, it’s a really, really good start. Liam Cavanagh, Pablo Castro and the team have done a great job bringing this to the preview stage and I suspect it could be quite popular. I was very impressed by how they incorporated quite a bit of feedback I gave them early on, and I’m sure that others involved would have felt the same. There are two tiers at present. One is a free tier and has shared resources; the other is currently $125/month and has reserved resources. I would like to see another tier between these two, much the same way that Azure websites work. If you have any feedback on this, now would be a good time to make it known. In the meantime, given there is a free tier, there’s no excuse to not get out and try it. You’ll find details of it here: http://azure.microsoft.com/en-us/documentation/services/search/ I’ll be posting more info about this service, and showing examples of it during the upcoming months.

    Read the article

  • Windows Azure AppFabric SDK - June CTP - Download issues

    - by Charles Young
    Microsoft has announced availability of the June CTP for Windows Azure AppFabric. See http://blogs.msdn.com/b/appfabric/archive/2011/06/20/announcing-the-windows-azure-appfabric-june-ctp.aspx. This is an exciting release and provides greater insight into where the AppFabric team is heading in terms of developer and management tooling. Microsoft is offering space in the cloud to experiment with the CTP, but this is limited, so register early to get a namespace! You can download the SDK for the June CTP. However, we ran into a lot of trouble trying to do this today. Whenever we followed the link, we ended up on the page for the May CTP. We found what appeared to be a workaround which we were able to repeat on another box (and which I reported on Connect), but then a few minutes later I couldn't repeat it. Just now, the given link appears to be working every time in IE, but not in Firefox!   Frankly, the behaviour seems random!   It looks like the same URL points to two different pages, and I suspect that which page you end up on is hit and miss. The link to the download page is http://www.microsoft.com/download/en/details.aspx?id=17691. If you end up on the wrong page, try again later and you may get to the right place. Or try googling "Windows Azure AppFabric SDK CTP – June Update" and following a link to this page. For some reason, that sometimes seems to work. Good luck!

    Read the article

  • PPT Leveraging Azure for Performance Testing

    - by Tarun Arora
    I recently presented a session on how you can leverage Azure for performance testing your application. It goes without saying that performance testing your application not only gives you the confidence that the application will work under heavy levels of stress, but also lets you test how scalable the architecture of your application is. It is important to know how much is too much for your application! Working with various clients in the industry, I have realized that the biggest barrier to load- and performance-testing adoption is the high infrastructure and administration cost that comes with this phase of testing. In the session I tried to demonstrate how you can use the power of Windows Azure to abstract away the administration cost of infrastructure management and lower the total cost of load and performance testing. You can view the session presentation here: http://www.slideshare.net/aroratarun/leveraging-azure-for-performance-testing I'll be adding a video on this subject shortly. If you have any feedback or further suggestions to add to the goodness of this solution, please get in touch.

    Read the article

  • Connect to running web role on Azure using Remote Desktop Connection and VS2012

    - by Magnus Karlsson
    We want to be able to collect IntelliTrace information from our running app and also use Remote Desktop to connect to the IIS machine and look around (probably for debugging).

    1. Create a certificate

    1.1 Right-click the cloud project (marked in red) and select "Configure remote desktop".
    1.2 In the drop-down list of certificates, choose <create> at the bottom.
    1.3 Follow the instructions; you can set it up with default values.
    1.4 When done, choose the certificate and click "Copy to File…" as seen in the left of the picture above.
    1.5 Save the file with any name you want. Next we will import it into the local certificate store, so that we can reach it from the Azure configuration manager in step 3.

    2. Save the certificate to local storage

    Now we need to add the certificate to our local certificate store. Microsoft provides the following steps (http://support.microsoft.com/kb/232137). To view the Certificates store on the local computer:

    - Click Start, and then click Run.
    - Type "MMC.EXE" (without the quotation marks) and click OK.
    - Click Console in the new MMC you created, and then click Add/Remove Snap-in.
    - In the new window, click Add.
    - Highlight the Certificates snap-in, and then click Add.
    - Choose the Computer option and click Next.
    - Select Local Computer on the next screen, and then click OK.
    - Click Close, and then click OK.

    You have now added the Certificates snap-in, which will allow you to work with any certificates in your computer's certificate store. You may want to save this MMC for later use. Now that you have access to the Certificates snap-in, you can import the server certificate into your computer's certificate store:

    - Open the Certificates (Local Computer) snap-in and navigate to Personal, and then Certificates. Note: certificates may not be listed; if not, that is because there are no certificates installed.
    - Right-click Certificates (or Personal if that option does not exist).
    - Choose All Tasks, and then click Import.
    - When the wizard starts, click Next.
    - Browse to the PFX file you created containing your server certificate and private key, and click Next.
    - Enter the password you gave the PFX file when you created it. Be sure the "Mark the key as exportable" option is selected if you want to be able to export the key pair again from this computer. As an added security measure, you may want to leave this option unchecked to ensure that no one can make a backup of your private key.
    - Click Next, and then choose the certificate store you want to save the certificate to. You should select Personal because it is a web server certificate. If you included the certificates in the certification hierarchy, they will also be added to this store.
    - Click Next. You should see a summary screen showing what the wizard is about to do. If this information is correct, click Finish.

    You will now see the server certificate for your web server in the list of Personal certificates, denoted by the common name of the server (found in the subject section of the certificate). Now that the certificate backup is imported into the certificate store, you can enable Internet Information Services 5.0 to use that certificate (and the corresponding private key):

    - Open the Internet Services Manager (under Administrative Tools) and navigate to the web site you want to enable secure communications (SSL/TLS) on.
    - Right-click the site and click Properties. You should now see the properties screen for the web site.
    - Click the Directory Security tab.
    - Under the Secure Communications section, click Server Certificate. This starts the Web Site Certificate Wizard; click Next.
    - Choose the "Assign an existing certificate" option and click Next.
    - You will now see a screen showing the contents of your computer's personal certificate store. Highlight your web server certificate (denoted by the common name), and then click Next.
    - You will now see a summary screen showing all the details about the certificate you are installing. Be sure that this information is correct, or you may have problems using SSL or TLS in HTTP communications.
    - Click Next, and then click OK to exit the wizard.

    You should now have an SSL/TLS-enabled web server. Be sure to protect your PFX files from any unwanted personnel. (The image shows a typical MMC.EXE with the certificates open.)

    3. Import the certificate into your Visual Studio project

    3.1 Right-click your equivalent to MvcWebRole1 (as seen in the first picture, under the red oval) and choose Properties.
    3.2 Choose Certificates. Click the ellipsis to the right of "Thumbprint" and you should be able to select your newly created certificate there. After selecting it, save the file.

    4. Upload the certificate to your Azure subscription

    4.1 Go to the Azure management portal, click the services menu icon on the left and choose the service. Click Upload in the bottom menu.

    5. Connect to the server

    Since I tried to use my account settings (you have to use another name), we have to set up a new name for the connection. No biggie.

    5.1 Go to the Azure management portal, select your service and, in the bottom menu, choose "REMOTE". This displays the configuration for the remote connection. It will actually change your ServiceConfiguration.cscfg file; after you change it here, it is a good idea to download it and replace the one in your project. Set a name that is not your Windows Azure account name and not Administrator.
    5.2 Go to Visual Studio, open Server Explorer, choose the role instance as selected in the picture below and click "Connect using remote desktop". You will now be able to log in with the name and password set up in step 5.1, and voilà! Windows Server 2012, IIS and other nice stuff!

    To do this I've been using http://msdn.microsoft.com/en-us/library/windowsazure/ff683671.aspx where you can collect some of this information and more.
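
    As an aside, step 2's MMC walk-through can also be scripted: on Windows 8 / Server 2012 the PKI module offers Import-PfxCertificate. A sketch, with path and password as placeholders:

        # Import the exported PFX into the local machine's Personal store.
        $pfxPassword = Read-Host -AsSecureString -Prompt 'PFX password'
        Import-PfxCertificate -FilePath C:\certs\remote.pfx `
            -CertStoreLocation Cert:\LocalMachine\My -Password $pfxPassword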

    Read the article
