Search Results

Search found 1914 results on 77 pages for 'platinum azure'.

Page 26 of 77

  • Windows Azure thoughts

    - by foxjazz
    Well, I had a relatively unpleasant experience with Microsoft's new offerings when it comes to Azure. And of course they deleted the forum trail. They don't want the consumer as a customer: expect to pay $200 - $250 or more a year for Azure, and that doesn't include a good email solution. This is for the smallest slice offering. If you want a simple website solution, 1and1.com isn't a bad choice, and it's less than $50 a year.

    Read the article

  • Azure Design Patterns

    - by kaleidoscope
    Design patterns are represented as relationships between classes and objects with defined responsibilities that act in concert to carry out the solution. Azure design patterns are design patterns applied on the Azure platform:

    · Cloud Hosting Patterns
    · Cloud Data Patterns
    · Cloud Communication & Sync Patterns
    · Cloud Security Patterns
    · Application Patterns

    More information: http://azuredesignpatterns.com/

    Ram, P

    Read the article

  • Deploy and Test an Azure App with Platform Ready

    Microsoft Platform Ready provides technical and marketing resources for companies building applications for the Microsoft platform. Currently they are working with The Code Project on a promotion that will pay $250 USD to companies for their FIRST Windows Azure application that is verified compatible using the Microsoft Platform Ready testing tools. The contest is valid only through 21 June 2011 12:00 PST and in the US only, but the walkthrough I'm about to show will work for any company that wishes to confirm and verify to customers that their application runs correctly on Windows Azure.

    Read the article

  • Windows Azure : Storage Client Exception Unhandled

    - by veda
    I am writing code to upload large files into blobs using blocks. When I tested it, it gave me a StorageClientException stating: "One of the request inputs is out of range." I get this exception at this line of the code: blob.PutBlock(block, ms, null); Here is my code:

        protected void ButUploadBlocks_click(object sender, EventArgs e)
        {
            // store the uploaded file in blob storage
            if (uplFileUpload.HasFile)
            {
                name = uplFileUpload.FileName;
                byte[] byteArray = uplFileUpload.FileBytes;
                Int64 contentLength = byteArray.Length;
                int numBytesPerBlock = 250 * 1024; // 250 KB per block
                int blocksCount = (int)Math.Ceiling((double)contentLength / numBytesPerBlock); // number of blocks
                MemoryStream ms;
                List<string> BlockIds = new List<string>();
                string block;
                int offset = 0;

                // get a reference to the cloud blob container
                CloudBlobContainer blobContainer = cloudBlobClient.GetContainerReference("documents");

                // set the name for the uploaded file
                string UploadDocName = name;

                // get the blob reference and set the metadata properties
                CloudBlockBlob blob = blobContainer.GetBlockBlobReference(UploadDocName);
                blob.Properties.ContentType = uplFileUpload.PostedFile.ContentType;

                for (int i = 0; i < blocksCount; i++, offset = offset + numBytesPerBlock)
                {
                    block = Convert.ToBase64String(BitConverter.GetBytes(i));
                    ms = new MemoryStream();
                    ms.Write(byteArray, offset, numBytesPerBlock);
                    blob.PutBlock(block, ms, null);
                    BlockIds.Add(block);
                }

                blob.PutBlockList(BlockIds);
                blob.Metadata["FILETYPE"] = "text";
            }
        }

    Can anyone tell me how to solve it?
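
    A sketch of the loop with two likely fixes, staying with the StorageClient API used in the question: after ms.Write the MemoryStream's position sits at its end, so PutBlock sends an empty body (which the service rejects as "out of range"), and the final block must be sized to the bytes actually remaining so the write stays inside the array:

        for (int i = 0; i < blocksCount; i++, offset += numBytesPerBlock)
        {
            // The last block is usually shorter than numBytesPerBlock.
            int bytesInBlock = (int)Math.Min((long)numBytesPerBlock, contentLength - offset);
            string blockId = Convert.ToBase64String(BitConverter.GetBytes(i));

            // Wrapping the bytes directly leaves the stream positioned at 0,
            // so PutBlock actually reads the block's contents.
            using (var blockStream = new MemoryStream(byteArray, offset, bytesInBlock))
            {
                blob.PutBlock(blockId, blockStream, null);
            }
            BlockIds.Add(blockId);
        }
        blob.PutBlockList(BlockIds);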

    Read the article

  • Upgrade to Azure 2.2 SDK is causing roles to fail

    - by Jon Leach
    I have 3 worker roles and a web role in my project, and I upgraded it to the new 2.2 SDK (required by VS 2013). Ever since the upgrade, all of the worker roles fail and instantly recycle as soon as they're started. When the roles start, I'm getting these messages:

        Microsoft.WindowsAzure.ServiceRuntime Information: 200 : Role entrypoint . CALLING OnStart()
        Microsoft.WindowsAzure.ServiceRuntime Information: 202 : Role entrypoint . COMPLETED OnStart()
        The thread 0x441c has exited with code 259 (0x103).
        Microsoft.WindowsAzure.ServiceRuntime Information: 203 : Role entrypoint . CALLING Run()
        Microsoft.WindowsAzure.ServiceRuntime Warning: 204 : Role entrypoint . COMPLETED Run() ==> ROLE RECYCLING INITIATED
        Microsoft.WindowsAzure.ServiceRuntime Information: 503 : Role instance recycling is starting
        The thread 0x2684 has exited with code 259 (0x103)

    Two things draw my attention:

    1. I've started to see a bunch of "Cannot find or open the PDB file." errors, but I don't know whether this is directly relevant.
    2. I'm using VS 2013, and while the project lists the SDK as 2.2, the references within the roles are still the 2.1 versions. Do I need to upgrade those components? Why wouldn't the project upgrade them automatically when I pulled it into VS, since it only supports 2.2?

    Any thoughts on how to attack this are appreciated.
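
    The trace points at what to check: the fabric recycles a role whenever Run() returns, and the "COMPLETED Run() ==> ROLE RECYCLING INITIATED" warning shows exactly that happening. The mixed 2.1/2.2 references are the likely trigger (they probably need to be updated to 2.2 by hand or through NuGet, since the conversion does not always do it), but a worker role also needs a Run() that never returns. A minimal sketch, with the work loop left hypothetical:

        using System;
        using System.Threading;
        using Microsoft.WindowsAzure.ServiceRuntime;

        public class WorkerRole : RoleEntryPoint
        {
            public override void Run()
            {
                // If Run() ever returns, the fabric logs the same
                // "COMPLETED Run() ==> ROLE RECYCLING INITIATED" warning
                // seen above and restarts the instance.
                while (true)
                {
                    // ... the role's actual work ...
                    Thread.Sleep(TimeSpan.FromSeconds(10));
                }
            }
        }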

    Read the article

  • Azure storage: Uploaded files with size zero bytes

    - by Fabio Milheiro
    When I upload an image file to a blob, the upload apparently succeeds (no errors), but when I look in Cloud Storage Studio the file is there with a size of 0 (zero) bytes. The following is the code that I am using:

        // These two methods belong to the ContentService class used to upload
        // files to storage.
        public void SetContent(HttpPostedFileBase file, string filename, bool overwrite)
        {
            CloudBlobContainer blobContainer = GetContainer();
            var blob = blobContainer.GetBlobReference(filename);
            if (file != null)
            {
                blob.Properties.ContentType = file.ContentType;
                blob.UploadFromStream(file.InputStream);
            }
            else
            {
                blob.Properties.ContentType = "application/octet-stream";
                blob.UploadByteArray(new byte[1]);
            }
        }

        public string UploadFile(HttpPostedFileBase file, string uploadPath)
        {
            if (file.ContentLength == 0)
            {
                return null;
            }

            string filename;
            int indexBar = file.FileName.LastIndexOf('\\');
            if (indexBar > -1)
            {
                filename = DateTime.UtcNow.Ticks + file.FileName.Substring(indexBar + 1);
            }
            else
            {
                filename = DateTime.UtcNow.Ticks + file.FileName;
            }

            ContentService.Instance.SetContent(file, Helper.CombinePath(uploadPath, filename), true);
            return filename;
        }

        // The above code is called by this code.
        HttpPostedFileBase newFile = Request.Files["newFile"] as HttpPostedFileBase;
        ContentService service = new ContentService();
        blog.Image = service.UploadFile(newFile, string.Format("{0}{1}", Constants.Paths.BlogImages, blog.RowKey));

    Before the image file is uploaded to storage, the InputStream property of the HttpPostedFileBase appears to be fine (the size of the image corresponds to what is expected, and no exceptions are thrown). The really strange thing is that this works perfectly in other cases (uploading PowerPoints, or even other images from the worker role). The code that calls the SetContent method seems to be exactly the same, and the file seems to be correct, since a new file with zero bytes is created at the correct location. Does anyone have any suggestions? I have debugged this code dozens of times and I cannot see the problem. Any suggestions are welcome! Thanks
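
    A hedged guess at the cause: if anything reads file.InputStream earlier in the request (validation, an image check, the FileBytes access in a sibling code path), the stream's position is left at the end, and UploadFromStream then writes zero bytes without error. Rewinding the stream just before the upload, inside SetContent, is a cheap test:

        if (file.InputStream.CanSeek)
        {
            // An already-consumed stream uploads "successfully"
            // but produces exactly the zero-byte blob described above.
            file.InputStream.Seek(0, System.IO.SeekOrigin.Begin);
        }
        blob.Properties.ContentType = file.ContentType;
        blob.UploadFromStream(file.InputStream);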

    Read the article

  • Windows Azure: Creation of a file on cloud blob container

    - by veda
    I am writing a program that will execute in the cloud. The program will generate output that should be written to a file, and the file should be saved in the blob container. I don't have an idea of how to do that. Will this code

        FileStream fs = new FileStream(file, FileMode.Create);
        StreamWriter sw = new StreamWriter(fs);

    generate a file named "file" in the cloud?
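
    A sketch under the v1.x StorageClient API: a FileStream writes to the role instance's local disk, which is not blob storage and is not durable. To persist generated output, write it into a blob instead (the setting name, container, and file name here are hypothetical):

        using Microsoft.WindowsAzure;
        using Microsoft.WindowsAzure.ServiceRuntime;
        using Microsoft.WindowsAzure.StorageClient;

        CloudStorageAccount account = CloudStorageAccount.Parse(
            RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));
        CloudBlobClient client = account.CreateCloudBlobClient();
        CloudBlobContainer container = client.GetContainerReference("output");
        container.CreateIfNotExist();

        // The blob is created on upload; no FileStream is involved.
        CloudBlob blob = container.GetBlobReference("results.txt");
        blob.UploadText("the program's generated output");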

    Read the article

  • Windows Azure: How to create sub directory in a blob container

    - by veda
    How do I create a subdirectory in a blob container? For example, in my blob container http://veda.blob.core.windows.net/document/, if I store some files they will be at:

        http://veda.blob.core.windows.net/document/1.txt
        http://veda.blob.core.windows.net/document/2.txt

    Now, how do I create a subdirectory

        http://veda.blob.core.windows.net/document/folder/

    so that I can store files like

        http://veda.blob.core.windows.net/document/folder/1.txt
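
    Blob storage has a flat namespace, so there is no create-directory call; a "/" inside a blob's name is what creates the virtual folder. A short sketch with the v1.x StorageClient API, using the names from the question (client is an assumed, already-configured CloudBlobClient):

        CloudBlobContainer container = client.GetContainerReference("document");

        // Naming the blob "folder/1.txt" creates the virtual subdirectory.
        CloudBlob blob = container.GetBlobReference("folder/1.txt");
        blob.UploadText("contents");

        // The virtual folder can later be enumerated:
        CloudBlobDirectory dir = container.GetDirectoryReference("folder");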

    Read the article

  • Upgrading to EF6 blew up Universal provider session state for Azure

    - by Ryan
    I have an ASP.NET MVC 4 application that uses the Universal providers for session state:

        <sessionState mode="Custom" sqlConnectionString="DefaultConnection" customProvider="DefaultSessionProvider">
          <providers>
            <add name="DefaultSessionProvider"
                 type="System.Web.Providers.DefaultSessionStateProvider, System.Web.Providers, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
                 connectionStringName="DefaultConnection" />
          </providers>
        </sessionState>

    When I upgraded to Entity Framework 6, I started getting this error:

        Method not found: 'System.Data.Objects.ObjectContext System.Data.Entity.Infrastructure.IObjectContextAdapter.get_ObjectContext()'.

    I tried adding the reference to System.Data.Entity.dll back in, but that didn't work, and I know you're not supposed to add that with the new Entity Framework.

    Read the article

  • Windows Azure: Creating subdirectories inside a blob container

    - by veda
    I wanted to create some subdirectories inside my blob container, but it is not working out well. Here is my code:

        protected void ButUpload_click(object sender, EventArgs e)
        {
            // store the uploaded file in blob storage
            if (uplFileUpload.HasFile)
            {
                name = uplFileUpload.FileName;

                // get a reference to the cloud blob container
                CloudBlobContainer blobContainer = cloudBlobClient.GetContainerReference("documents");

                if (textbox.Text != "")
                {
                    name = textbox.Text + "/" + name;
                }

                // set the name for the uploaded file
                string UploadDocName = name;

                // get the blob reference and set the metadata properties
                CloudBlockBlob blob = blobContainer.GetBlockBlobReference(UploadDocName);
                blob.Metadata["FILETYPE"] = "text";
                blob.Properties.ContentType = uplFileUpload.PostedFile.ContentType;

                // upload the blob to storage
                blob.UploadFromStream(uplFileUpload.FileContent);
            }
        }

    What I do is: if I have to create a subdirectory, I enter its name in the textbox. For example, if I need to create a file named "test.txt" inside the subdirectory "files", then textbox.Text = "files" and uplFileUpload.FileName = "test.txt". Now I concatenate them and upload to the blob, but it is not working well. I am getting just

        https://test.core.windows.net/documents/files/

    I am not getting the entire thing. I was expecting

        https://test.core.windows.net/documents/files/test.txt

    What am I doing wrong? How do I create subdirectories inside the blob container?
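
    The concatenation itself looks right; "files/test.txt" should produce exactly the virtual path expected. A hedged way to check what was actually uploaded, using the same v1.x client as the question: list the container flat, so virtual directories don't hide the blob names.

        // UseFlatBlobListing returns every blob under its full name instead
        // of grouping blobs beneath CloudBlobDirectory entries.
        var options = new BlobRequestOptions { UseFlatBlobListing = true };
        foreach (IListBlobItem item in blobContainer.ListBlobs(options))
        {
            // should include .../documents/files/test.txt
            System.Diagnostics.Trace.WriteLine(item.Uri);
        }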

    Read the article

  • Update Azure Service Configuration File using Powershell

    - by David Osborn
    I'm trying to write a PowerShell script that updates each of the DiagnosticsConnectionString and DataConnectionString values below, but I can't seem to find each individual Role node using

        $serviceConfig.ServiceConfiguration.SelectSingleNode("Role[@name='MyService_WorkerRole']")

    Doing

        echo $serviceConfig.ServiceConfiguration.Role

    lists out both Role nodes for me, so I know it is working up to that point, but after that I am not having much success. $serviceConfig contains the XML below:

        <?xml version="1.0"?>
        <ServiceConfiguration serviceName="MyService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
          <Role name="MyService_WorkerRole">
            <Instances count="1" />
            <ConfigurationSettings>
              <Setting name="DiagnosticsConnectionString" value="really long string" />
              <Setting name="DataConnectionString" value="really long string 2" />
            </ConfigurationSettings>
          </Role>
          <Role name="MyService_WebRole">
            <Instances count="1" />
            <ConfigurationSettings>
              <Setting name="DiagnosticsConnectionString" value="really long string 3" />
              <Setting name="DataConnectionString" value="really long string 4" />
            </ConfigurationSettings>
          </Role>
        </ServiceConfiguration>
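
    A likely culprit is the default XML namespace on ServiceConfiguration: XPath treats unprefixed names as belonging to no namespace, so "Role[@name=...]" matches nothing. A sketch of the fix via System.Xml, shown in C# for brevity; the identical calls work from PowerShell through New-Object System.Xml.XmlNamespaceManager (the file path is hypothetical):

        using System.Xml;

        var serviceConfig = new XmlDocument();
        serviceConfig.Load(@"ServiceConfiguration.cscfg");

        var ns = new XmlNamespaceManager(serviceConfig.NameTable);
        ns.AddNamespace("sc", "http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration");

        // With the prefix bound, SelectSingleNode finds the namespaced nodes.
        XmlNode setting = serviceConfig.SelectSingleNode(
            "//sc:Role[@name='MyService_WorkerRole']" +
            "/sc:ConfigurationSettings/sc:Setting[@name='DataConnectionString']", ns);
        setting.Attributes["value"].Value = "new really long string";

        serviceConfig.Save(@"ServiceConfiguration.cscfg");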

    Read the article

  • Azure batch operations delete several blobs and tables

    - by reft
    I have a function that deletes every table entity and blob that belongs to the affected user:

        CloudTable uploadTable = CloudStorageServices.GetCloudUploadsTable();
        TableQuery<UploadEntity> uploadQuery = uploadTable.CreateQuery<UploadEntity>();
        List<UploadEntity> uploadEntity =
            (from e in uploadTable.ExecuteQuery(uploadQuery)
             where e.PartitionKey == "uploads" && e.UserName == User.Identity.Name
             select e).ToList();

        foreach (UploadEntity uploadTableItem in uploadEntity)
        {
            // Delete the table entity
            TableOperation retrieveOperationUploads = TableOperation.Retrieve<UploadEntity>("uploads", uploadTableItem.RowKey);
            TableResult retrievedResultUploads = uploadTable.Execute(retrieveOperationUploads);
            UploadEntity deleteEntityUploads = (UploadEntity)retrievedResultUploads.Result;
            TableOperation deleteOperationUploads = TableOperation.Delete(deleteEntityUploads);
            uploadTable.Execute(deleteOperationUploads);

            // Delete the blob
            CloudBlobContainer blobContainer = CloudStorageServices.GetCloudBlobsContainer();
            CloudBlockBlob blob = blobContainer.GetBlockBlobReference(uploadTableItem.BlobName);
            blob.Delete();
        }

    Each entity has its own blob, so if the list contains 3 upload entities, the 3 entities and the 3 blobs will be deleted. I heard you can use table batch operations to reduce cost and load. I tried it, but failed miserably. Anyone interested in helping me? I'm guessing table batch operations are for tables only, so it's a no-go for blobs, right? How would you add a TableBatchOperation to this code? Do you see any other improvements that could be made? Thanks!
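
    On the blob side that guess is right: batches are a Table service feature, so blobs must still be deleted one by one. The table deletes, though, can share one TableBatchOperation because every entity here has the same partition key ("uploads"); a batch holds at most 100 operations, and the per-item re-retrieve can be dropped since ExecuteQuery already returned the entities. A sketch:

        var batch = new TableBatchOperation();
        foreach (UploadEntity entity in uploadEntity)
        {
            // The entity came from ExecuteQuery, so it carries an ETag
            // and can be deleted without re-fetching it first.
            batch.Add(TableOperation.Delete(entity));
            if (batch.Count == 100) // service limit per batch
            {
                uploadTable.ExecuteBatch(batch);
                batch.Clear();
            }
        }
        if (batch.Count > 0)
        {
            uploadTable.ExecuteBatch(batch);
        }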

    Read the article

  • Simplest Azure Storage Manipulation possible

    - by Hurricanepkt
    I need to integrate some blob storage into an existing ASP.NET MVC site. My hope is to be able to just add some references and then do puts and gets, but I cannot find any simple example of how to do this that hasn't been deprecated to the point that it no longer works. I have tried using StorageClient, but CreateCloudBlobClient() doesn't seem to work.
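
    A minimal put/get sketch, assuming the Microsoft.WindowsAzure.Storage NuGet package (version 2.1 or later), which replaced the deprecated StorageClient assembly; the connection string and names are placeholders:

        using Microsoft.WindowsAzure.Storage;
        using Microsoft.WindowsAzure.Storage.Blob;

        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...");
        CloudBlobClient client = account.CreateCloudBlobClient();
        CloudBlobContainer container = client.GetContainerReference("files");
        container.CreateIfNotExists();

        // put: write a string into a block blob
        CloudBlockBlob blob = container.GetBlockBlobReference("hello.txt");
        blob.UploadText("hello");

        // get: read it back
        string text = container.GetBlockBlobReference("hello.txt").DownloadText();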

    Read the article

  • Windows Azure: broken logging after migration to the new SDK 1.3

    - by cloud.dev
    Hi, please help. I've migrated to the new SDK 1.3 (Full-IIS mode). I use the following logging:

        case TraceLevel.Error:
            Trace.TraceError(message);
            break;
        case TraceLevel.Warning:
            Trace.TraceWarning(message);
            break;
        case TraceLevel.Info:
            Trace.TraceInformation(message);
            break;
        case TraceLevel.Verbose:
            Trace.WriteLine(message);
            break;

    It worked fine until I migrated to the new SDK. Now logging works only for worker roles; the web role can log only inside the OnStart method of WebRole.cs, and in other cases nothing is logged. I understand that Full IIS means the code runs in different processes, so must I somehow call WaIIS.exe from w3wp.exe, or ...?
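
    That is the right trail: under SDK 1.3 Full IIS, the RoleEntryPoint code and the site itself run in separate processes, and the Windows Azure trace listener is only wired up in the RoleEntryPoint process. One hedged workaround is to register the listener inside w3wp itself, for example in Global.asax as sketched below (the other common fix is declaring the listener in web.config under <system.diagnostics>):

        using System.Diagnostics;
        using Microsoft.WindowsAzure.Diagnostics;

        protected void Application_Start()
        {
            // This runs inside w3wp.exe, so traces raised by the web
            // application itself are now routed to Windows Azure Diagnostics.
            Trace.Listeners.Add(new DiagnosticMonitorTraceListener());
        }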

    Read the article

  • Windows Azure: Parallelization of the code

    - by veda
    I have some matrix multiplication operations, and I want to parallelize their execution across multiple processors. On a high-performance computing cluster this can be done with MPI (Message Passing Interface). Likewise, can I do some parallelization in the cloud using multiple worker roles? Is there any means of doing that?
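
    There is no MPI-style message passing between worker roles out of the box, but the usual substitute is a storage queue fanning independent work items out to several instances of one worker role. A sketch, assuming the v1.x StorageClient API (queueClient is an existing CloudQueueClient; the message format is hypothetical):

        CloudQueue queue = queueClient.GetQueueReference("matrix-tasks");
        queue.CreateIfNotExist();

        // Producer: enqueue one message per row (or block) of the product.
        for (int row = 0; row < rowCount; row++)
        {
            queue.AddMessage(new CloudQueueMessage(row.ToString()));
        }

        // Each worker-role instance claims one task at a time in its Run() loop.
        CloudQueueMessage msg = queue.GetMessage();
        if (msg != null)
        {
            int row = int.Parse(msg.AsString);
            // ... compute that row of the result, write it to blob or table storage ...
            queue.DeleteMessage(msg);
        }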

    Read the article

  • dotTrace can't create deploy folder on Windows Azure Web Sites (WAWS)

    - by Orhan Maden
    I get the error message 'Can't create deploy folder.' when I try to profile a remote website on WAWS. Actions taken:

    Downloaded and installed dotTrace Profiler 5.5 from the JetBrains website
    Downloaded dotTrace.Performance.Remote version 5.5.0 from NuGet
    Published the website to WAWS via Visual Studio 2013
    Started the dotTrace application as Administrator
    Connected to the remote https://subdomain.azurewebsites.net/AgentService.asmx (see image: http://1drv.ms/1nF5Cyh)
    Selected the w3wp process and pressed Run
    Got the error message 'Can't create deploy folder.' (see image: http://1drv.ms/U5h35A)

    I'm running dotTrace in trial mode at the moment. Swift help is much appreciated. Orhan

    Read the article

  • Details of 5GB and 50GB SQL Azure databases have now been released, along with new price points

    - by Eric Nelson
    Like many others signed up to the Windows Azure Platform, I received an email overnight detailing the upcoming database size changes for SQL Azure. I know from our work with early adopters over the last 12 months that the 1 GB and 10 GB limits were sometimes seen as blockers, especially when migrating existing applications to SQL Azure. On June 28th 2010, we will be increasing the size limits:

    SQL Azure Web Edition database: from 1 GB to 5 GB
    SQL Azure Business Edition database: from 10 GB to 50 GB

    Along with these changes come new price points, including the option to increase in increments of 10 GB:

    Web Edition:
    Up to 1 GB relational database = $9.99 / month
    Up to 5 GB relational database = $49.95 / month

    Business Edition:
    Up to 10 GB relational database = $99.99 / month
    Up to 20 GB relational database = $199.98 / month
    Up to 30 GB relational database = $299.97 / month
    Up to 40 GB relational database = $399.96 / month
    Up to 50 GB relational database = $499.95 / month

    Check out the full SQL Azure pricing. Related links: http://ukazure.ning.com (UK community site) · Getting started with the Windows Azure Platform

    Read the article

  • Azure Virtual Machines - what fault tolerance do they provide?

    - by Borek
    We are thinking about moving our virtual machines (Hyper-V VHDs) to Windows Azure, but I haven't found much about what kind of fault tolerance that infrastructure provides. When I run a VHD in Azure, I've got two questions:

    1. Is my VHD and all the data in it safe? I think uploaded VHDs use the Storage infrastructure, so they should be automatically replicated to multiple disks and geographically distributed, but should I still make a full-image backup just to be safe? (Note that of course I will be backing up the actual data inside the VMs that I care about; I just want to know if there is a chance greater than 0.0000001% that one day I will receive an email from Microsoft telling me that my VM is gone and that I should create or restore it from scratch.)

    2. Do I need to worry about other things regarding the availability of my VMs? I mean, with an on-premise server I need to worry about the hardware itself, the host operating system, what would happen if my router failed, if my Hyper-V host's C: drive failed, and so on. Am I right in thinking that with Azure, their infrastructure takes care of all of this?

    Thanks.

    Read the article

  • Azure VM won't boot after sysprep; integration tools installed

    - by Mark Williams
    I have installed the Azure Integration Components and used sysprep on a Windows 2012 VM. Now the machine won't start up. I uploaded the VHD to Azure; it failed there too. When I start up the VM I get a PowerShell window that hangs out for a bit; eventually I get the following error, after which the machine restarts:

        New-Object : The dependency service or group failed to start. (Exception from HRESULT: 0x8007042C)
        At line:1 char:1
        + New-Object -comobject WaAgent.WindowsSetupComponent | % { $_.HandleSetupError() ...
        + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
            + CategoryInfo          : ResourceUnavailable (:) [New-Object], COMException
            + FullyQualifiedErrorId : NoCOMClassIdentified,Microsoft.PowerShell.Commands.NewObjectCommand

    I have tried renaming unattended.xml and turning on boot logging; neither of those yielded much help. Is there a way I can disable the Azure components that run during OOBE? That seems to be the source of the problem. Mounting the VHD is easy. 0x8007042C looks like a firewall issue, based on my googling, but unfortunately I can't get the machine to boot so I can figure that issue out. Also, I can't get around this problem by booting into safe mode. Thanks for your help, guys.

    Read the article
