Search Results

Search found 2630 results on 106 pages for 'sergey p aka azure'.

Page 39/106 | < Previous Page | 35 36 37 38 39 40 41 42 43 44 45 46  | Next Page >

  • The web publishing extension is not installed which is required to publish

    - by Chong Wang
    I was trying to deploy an ASP.NET Web Application to a Windows Azure Web Site by following the tutorial at this link: https://www.windowsazure.com/en-us/develop/net/tutorials/get-started/ After downloading the publish profile, which is a ".PublishSettings" file, I went back to Visual Studio, right-clicked the project in Solution Explorer and selected Publish from the context menu, as the tutorial says. However, a warning box popped up and told me that "The Web Publishing extension is not installed which is required to publish. You can install it from http://go.microsoft.com/fwlink/?LinkID=208120." I have already installed "Windows Azure SDK for .NET (VS 2012)", and I also tried uninstalling it and installing it again, but the same problem is still there. Does anyone know how to solve this? Any help is really appreciated.

    Read the article

  • When is VS 2010 RC going to be released?

    - by Vadim
    I was looking to install the latest Windows Azure Tools for Microsoft Visual Studio. However, this version doesn't work with Visual Studio 2010 Beta 2. They say that the latest version of the Azure Tools & SDK will only work with VS 2008 and VS 2010 RC, and that for VS 2010 Beta 2 I still need to install the November release. I know that VS 2010 is going to be released on April 12th this year, but does anyone know when VS 2010 RC is coming out? EDIT: The full RTM version of Visual Studio 2010 is released as of April 14, 2010.

    Read the article

  • The underlying connection was closed: An unexpected error occurred on a receive

    - by Dave
    Using a console app which runs as a scheduled task on Azure to call a long(ish) running webpage on Azure. Problem after a few minutes: The underlying connection was closed: An unexpected error occurred on a receive

        // used because WebClient doesn't support a timeout
        public class ExtendedWebClient : WebClient
        {
            public int Timeout { get; set; }

            protected override WebRequest GetWebRequest(Uri address)
            {
                HttpWebRequest request = (HttpWebRequest)base.GetWebRequest(address);
                if (request != null)
                {
                    request.Timeout = Timeout;
                    request.KeepAlive = false;
                    request.ProtocolVersion = HttpVersion.Version10;
                }
                return request;
            }

            public ExtendedWebClient()
            {
                Timeout = 1000000; // in ms.. the standard is 100,000
            }
        }

        class Program
        {
            static void Main(string[] args)
            {
                var taskUrl = "http://www.secret.com/secret.aspx";

                // create a webclient and issue an HTTP get to our url
                using (ExtendedWebClient httpRequest = new ExtendedWebClient())
                {
                    var output = httpRequest.DownloadString(taskUrl);
                }
            }
        }
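
    A closely related knob, offered only as a hedged sketch against the code above and not a verified fix for the poster's error: HttpWebRequest distinguishes Timeout, which covers establishing the request and receiving the response headers, from ReadWriteTimeout, which governs how long reads on the response stream may block. For a long-running page it may be worth raising both in the same override:

        protected override WebRequest GetWebRequest(Uri address)
        {
            var request = (HttpWebRequest)base.GetWebRequest(address);
            if (request != null)
            {
                request.Timeout = Timeout;            // whole-request timeout, in ms
                request.ReadWriteTimeout = Timeout;   // response-stream read/write timeout, in ms
                request.KeepAlive = false;
                request.ProtocolVersion = HttpVersion.Version10;
            }
            return request;
        }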

    Read the article

  • Server configuration for our website [duplicate]

    - by Varun Varunesh
    This question already has an answer here: Can you help me with my capacity planning? 2 answers We are a start-up, and six months ago we launched the beta version of our website. Now we are building our website and web services for the final product. The website will be based on PHP, Python and a MySQL database, with a WAMP server. Right now, in the beta version, we are using an Azure VM for hosting, with 786 MB of RAM and a shared CPU. We have on average 200 users coming to our website daily. We are now trying to grow from 200 to 1,500 daily users, and I think our server should be able to handle at least 100 concurrent users. We have also developed web services for our mobile apps, which can add further load on the server. So here are the questions that bring me here: I am pretty confused about whether to go with shared hosting or VM-based hosting. If a VM, then what configuration will best fit the requirements described above? Our current VM is a Windows-based server and it is very simple to manage, so other than cost, why should I go for a Linux-based server? What other factors should I keep in mind while choosing a server for these requirements?

    Read the article

  • Apache/passenger/ree doesn't interpret .rb files

    - by Sergey
    I'm trying to get apache + passenger + ree to work. I think I did everything (except for setting up a rails env - for now I just want to run pure ruby) described here: http://rvm.beginrescueend.com/integration/passenger/ But when I try to go to localhost/test.rb it doesn't interpret that file and just downloads it. I don't know where I should look for mistakes, so here are a few files I think could be relevant:

    /var/log/apache2/error.log (these 2 lines are repeating)
        [Mon May 31 23:12:47 2010] [notice] Graceful restart requested, doing restart
        [Mon May 31 23:12:48 2010] [notice] Apache/2.2.14 (Ubuntu) PHP/5.3.2-1ubuntu4.2 with Suhosin-Patch Phusion_Passenger/2.2.11 configured -- resuming normal operations

    /etc/apache2/httpd.conf
        LoadModule passenger_module /home/sergey/.rvm/gems/ree-1.8.7-2010.01/gems/passenger-2.2.11/ext/apache2/mod_passenger.so
        PassengerRoot /home/sergey/.rvm/gems/ree-1.8.7-2010.01/gems/passenger-2.2.11
        PassengerRuby /home/sergey/.rvm/bin/passenger_ruby

    /var/www/test.rb
        puts "test"

    Read the article

  • How do I get an Enter USB TV Box TV tuner aka Gadmei UTV302 to work?

    - by Subhash
    Has anyone had any success in using the Enter USB TV Box from Enter Multimedia? It comes bundled with software that works in Windows. I have had no luck using it in Ubuntu 10.10.

    Update 1 Here is the output from lsusb:
        Bus 007 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 006 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 005 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 004 Device 003: ID 093a:2510 Pixart Imaging, Inc. Optical Mouse
        Bus 004 Device 002: ID 046d:c312 Logitech, Inc. DeLuxe 250 Keyboard
        Bus 004 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 003 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 001 Device 006: ID 1f71:3301
        Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
    I can't find the Enter USB TV Box listed in this. In the dmesg tail output, I found something that seems to be related to the card:
        usb 1-5: new high speed USB device using ehci_hcd and address 6
        usb 1-5: config 1 interface 0 altsetting 1 bulk endpoint 0x83 has invalid maxpacket 256

    Update 2 From Windows I learned that this USB TV tuner uses a chipset from Gadmei Corporation. All computer stores in India sell the Enter USB TV Box if you ask for a USB TV tuner. No other brand seems to be interested in this market.

    Update 3 I learned that this TV tuner is a rebranded version of the Gadmei UTV302 (USB TV Tuner Box).

    Update 4 I tried adding em28xx as the chipset (as suggested by user BOBBO below) for the tuner, but that did not work. I went back to my Pinnacle PCTV internal card. I don't think the tuner referred to by UbuntuForums (Gadmei UTV 330) and the tuner that I have (Gadmei UTV 302) are the same. My USB tuner is several times bigger, and it seems to be a newer device with a newer tuner chip. I will submit details of this device to the LinuxTV developers this weekend.

    Update 5 I opened the tuner box and found that it uses a tuner from a Chinese company, Tenas. The model is TNF 8022-DFA.

    Update 6 Tuner chip specs (retrieved from a supplier directory) for the Tenas TNF 8022-DFA: Supply voltage: true 5V device (low power dissipation). Control system: I2C bus control of tuning, address selection. Tuning system: PLL controlled tuning. Receiving system: PAL D/K, IF (Intermediate Frequency): 38MHz. Receiving channels: full frequency range from channel DS1 (49.75MHz) to channel DS57 (863.25MHz). Uses a Texas Instruments SN761678 IC solution, with mini install size.

    Update 7 Pictures: reverse side of the circuit board, and the TV tuner itself.

    Read the article

  • Why did Apple remove Python support in Mavericks, aka Mac OS X 10.9?

    - by alex gray
    Apple has removed Python support (at least on the Developer level) in 10.9. Python IS still on the machine in /System/Library/Frameworks/Python.framework... but trying to link to Python using the 10.9 SDK fails. /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.9.sdk/System/Library/Frameworks does not have Python. I'm not a Pythonista, but find it interesting that Apple has made this change. I don't understand why this is done and I'm a bit annoyed that I have to remove Python from my compilation units in order to compile with 10.9 SDK. Is this a statement by Apple, along the lines of "People aren't using Python very much anymore so we're going to phase out support"? Or was something else driving the change?

    Read the article

  • Restful Java based web services in json + html5 and javascript no templates (jsp/jsf/freemarker) aka fat/thick client

    - by Ismail Marmoush
    I have this idea of building a website that serves JSON data through a RESTful services framework, and will not use any template engine like JSP/JSF/FreeMarker - just pure HTML5 and JavaScript libs. What do you think of the pros and cons of such a design? Just for elaboration and brainstorming, a friend of mine argued with the following concerns:
    - sounds like GWT
    - this way you won't have any control over your service API; for example, say you want to charge the user per request - how will you handle it?
    - how will you control your design and themes?
    - what about the first request the browser makes? not easy with this
    - all of the user's requests will come with an "Accept" header of "application/json", so how will you separate a browser from an abuser? this way all of your public APIs will be used abusively by third-party apps, and you won't be able to lock them down since you won't be able to block the normal user's browser
    - we won't use compiled HTML anyway, but maybe something like FreeMarker, and in that case you won't expose any of your JSON resources to an unauthorized user; you will expose all the HTML, but any browser can access that anyway - all the well-known first-class services do this
    - can you send me links to what you've read?
    - keep in mind DOM-based XSS; it will be a nightmare, of course, if what you say is applicable.

    Read the article

  • Workaround the flip queue (AKA pre-rendered frames) in OpenGL?

    - by user41500
    It appears that some drivers implement a "flip queue" such that, even with vsync enabled, the first few calls to swap buffers return immediately (queuing those frames for later use). It is only after this queue is filled that buffer swaps will block to synchronize with vblank. This behavior is detrimental to my application. It creates latency. Does anyone know of a way to disable it or a workaround for dealing with it? The OpenGL Wiki on Swap Interval suggests a call to glFinish after the swap but I've had no such luck with that trick.
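
    A sketch of the glFinish-after-swap trick mentioned above, written against OpenTK's GameWindow purely as an illustration (OpenTK and the window class are assumptions here; the original question is windowing-API agnostic, and as noted the trick does not help on every driver):

        using OpenTK;
        using OpenTK.Graphics.OpenGL;

        public class LowLatencyWindow : GameWindow
        {
            protected override void OnRenderFrame(FrameEventArgs e)
            {
                GL.Clear(ClearBufferMask.ColorBufferBit | ClearBufferMask.DepthBufferBit);
                // ... draw the scene here ...

                SwapBuffers();

                // Wait for all queued commands, including the swap, to complete before
                // starting the next frame; on some drivers this keeps the flip queue
                // from building up several pre-rendered frames of latency.
                GL.Finish();
            }
        }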

    Read the article

  • Will HP take on Microsoft? The manufacturer is moving into cloud computing, where it could compete with Windows Azure

    Will HP take on Microsoft? The manufacturer is moving into cloud computing, where it could compete with Windows Azure. The CEO has changed, and so has the strategy. In Mark Hurd's day, the alliance between the hardware maker HP and the OS vendor Microsoft was clear: the two companies complemented their respective offerings with each other's strengths. HP's hardware and Microsoft's Azure platform formed complete, turnkey appliances aimed at private clouds and enterprise data centers. In theory this is still the case today. But since...

    Read the article

  • application custom stock icons not working in ubuntu unity top panel menu (aka appmenu) ("Menus Have Icons" ON)

    - by giuspen
    I recently noticed that in Ubuntu Unity the top menu of my apps does not show the (custom) icons I added to the gtk stock, but only the basic gtk stock icons. This happens only since the top menu is displayed in the Unity top panel (appmenu) and not in the application window. In place of the correct custom icons I see "gtk-missing-image". On my apps' toolbars and other menus those icons are displayed properly; the problem is only with the top menu. This happens both with pygtk2 (e.g. http://www.giuspen.com/cherrytree/) and with gobject introspection (e.g. http://www.giuspen.com/nautilus-pyextensions/). I use gtk ui manager after integrating the stock icons this way:

        factory = gtk.IconFactory()
        pixbuf = gtk.gdk.pixbuf_new_from_file(filepath)
        iconset = gtk.IconSet(pixbuf)
        factory.add(stock_name, iconset)
        factory.add_default()

    If anybody has solved this problem, please help. Cheers, Giuseppe.

    Read the article

  • SQL vs. Oracle Live Debate (AKA Smackdown!)

    - by Peter W. DeBetta
    A few years ago I was speaking at a conference in Raleigh, NC, where Ted Neward and I found a fun way to promote a Java vs. .NET debate that was planned for one evening. We stood in the middle of a crowd during one of the breaks and started “arguing” about Java vs. .NET with one another. Our voices quickly rose, and we ended it by slapping each other across the face with a glove to issue a challenge. It was a great way to segue to our announcing of the actual debate planned later that evening....(read more)

    Read the article

  • What is wiser for me? Eclipse or IntelliJ aka not another IDE war. [closed]

    - by Xorty
    I hope you guys won't take this as an attempt at a flamewar :) I am a quite happy Eclipse user atm, but I am having a little dilemma: all the big companies in my area are using Eclipse, and I quite like Eclipse, but IntelliJ is simply smarter and more comfortable in some things. What would you do? If I decide to switch to IDEA, I am afraid that I'll regret it someday ... And I don't want to have to remember all the shortcuts in both Eclipse & IDEA, that's simply too much :)

    Read the article

  • How to save one role implementing a client/server pattern in Azure?

    - by Alfredo Delsors
    Sometimes you need one instance performing a server role while the other instances play the client role. An example can be file sharing, as in this great post: http://blogs.msdn.com/b/mariok/archive/2011/02/11/sharing-folders-in-azure.aspx - one instance shares a folder that all the other instances use to write files that the server processes. The problem is that there is no discovery mechanism in Azure that allows one instance to know where the instance acting as the server is located. A first approach can be having a server role and a client role, as in the previous post. This means more instances, and more money. A way to avoid this extra "server" role is to use instance 0, which is always available, to act as the server. An instance can know that it should act as the server by checking RoleEnvironment.CurrentRoleInstance.Id.EndsWith(".0"). Other instances can iterate the RoleEnvironment instances collection to find the instance whose name ends with ".0", get its endpoints, and act as its clients.
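
    A minimal sketch of that check and lookup, assuming the classic Azure SDK RoleEnvironment API; the internal endpoint name "FileShare" is an illustrative assumption, not something from the post:

        using System.Linq;
        using System.Net;
        using Microsoft.WindowsAzure.ServiceRuntime;

        public static class ServerLocator
        {
            // True when the current instance should play the server role.
            public static bool IsServer()
            {
                return RoleEnvironment.CurrentRoleInstance.Id.EndsWith(".0");
            }

            // Clients locate the server by finding the instance whose Id ends with ".0"
            // and reading the endpoint it exposes.
            public static IPEndPoint FindServerEndpoint()
            {
                RoleInstance server = RoleEnvironment.CurrentRoleInstance.Role.Instances
                    .First(i => i.Id.EndsWith(".0"));
                return server.InstanceEndpoints["FileShare"].IPEndpoint; // hypothetical endpoint name
            }
        }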

    Read the article

  • Web Platform Installer issues deploying Azure SDK 1.4 on refreshed systems.

    - by Enrique Lima
    Recently I have been doing quite a bit of testing on different ways to deploy the Azure SDKs and such. After a very successful couple of systems, I started running into issues last night. Here is the problem: if I go to the Windows Azure website, go to Develop, click on SDK and Tools, then Get Tools & SDK, it launches the Web Platform Installer. All seems well at that point, except that it goes through the initial process and finds the SDK files for 1.4, but since the tools for Visual Studio are still 1.3, the location throws back a 404, which causes the installer to fail. NOTE: If you already had SDK 1.3 and the tools in place, it will go through. The fix is to go directly to the Microsoft Download Center location and download the files. Here is the link … http://www.microsoft.com/downloads/en/details.aspx?FamilyID=7a1089b6-4050-4307-86c4-9dadaa5ed018

    Read the article

  • Windows Azure: when the cloud sells planes, Boeing uses Microsoft's platform to model its aircraft

    Windows Azure: when the cloud sells planes. Boeing uses Microsoft's platform to model its aircraft. At last year's PDC (one of its developer conferences), Microsoft illustrated the computing power of Windows Azure with a playful and very concrete demonstration. Pixar Studios came on stage to show how they were using a variable number of hosted instances to render certain scenes from their films - a render they could not have obtained, or not as quickly, with their own in-house tools. [IMG]http://ftp-developpez.com/gordon-fowler/AzurePixar.jpg[/IMG] P...

    Read the article

  • Windows Azure: Microsoft and Oracle bury the hatchet with an agreement covering Java, the DBMS, Linux and Oracle's applications

    Windows Azure: Microsoft and Oracle bury the hatchet, with an agreement covering Java, the DBMS, Linux, the middleware and Oracle's applications. Who would have thought it? The sworn rivals Microsoft and Oracle have just struck a deal, and its effects will arrive in several stages. Starting today, Oracle-certified applications can run on Hyper-V (Windows Server's virtualization tool) and be hosted in Windows Azure. In the other direction, Microsoft's cloud platform becomes an Oracle-approved reseller of cloud services (the only one alongside AWS). But ...

    Read the article

  • Using SQL dB column as a lock for concurrent operations in Entity Framework

    - by Sid
    We have a long-running user operation that is handled by a pool of worker processes. Data input and output is from Azure SQL. The master Azure SQL table structure columns are approximately [UserId, col1, col2, ..., colN, beingProcessed, lastTimeProcessed], where beingProcessed is a boolean and lastTimeProcessed is a DateTime. The logic in every worker role is:

        public void WorkerRoleMain()
        {
            while (true)
            {
                try
                {
                    dbContext db = new dbContext();

                    // Read
                    foreach (UserProfile user in db.UserProfile
                        .Where(u => DateTime.UtcNow.Subtract(u.lastTimeProcessed) > TimeSpan.FromHours(24)
                                    & u.beingProcessed == false))
                    {
                        user.beingProcessed = true; // Modify
                        db.SaveChanges();           // Write

                        // Do some long drawn processing here
                        // ... ... ...

                        user.lastTimeProcessed = DateTime.UtcNow;
                        user.beingProcessed = false;
                        db.SaveChanges();
                    }
                }
                catch (Exception ex)
                {
                    LogException(ex);
                    Sleep(TimeSpan.FromMinutes(5));
                }
            } // while ()
        }

    With multiple workers processing as above (each with their own Entity Framework layer), beingProcessed is in essence being used as a lock for mutual exclusion. Question: how can I deal with concurrency issues on the beingProcessed "lock" itself under the above load? I think the read-modify-write operation on beingProcessed needs to be atomic, but I'm open to other strategies. Open to other code refinements too.
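
    One way to make that claim atomic, offered as a minimal sketch rather than a drop-in answer: issue a single conditional UPDATE and check the rows affected, so only one worker can flip beingProcessed for a given user. It assumes an EF DbContext like the one above and a physical table named UserProfiles (the table name is a guess):

        using System.Data.Entity;

        public static class UserClaim
        {
            // Returns true only for the single worker that wins the race for this user.
            public static bool TryClaim(DbContext db, int userId)
            {
                int rows = db.Database.ExecuteSqlCommand(
                    @"UPDATE UserProfiles
                         SET beingProcessed = 1
                       WHERE UserId = {0}
                         AND beingProcessed = 0
                         AND lastTimeProcessed < DATEADD(HOUR, -24, SYSUTCDATETIME())",
                    userId);

                // At most one concurrent UPDATE matches the WHERE clause; the rest see 0 rows.
                return rows == 1;
            }
        }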

    Read the article

  • EF Code First to SQL Azure

    - by Predrag Pejic
    I am using EF Code First to create a database on the local .\SQLEXPRESS. Among others, I have these 2 classes:

        public class Shop
        {
            public int ShopID { get; set; }

            [Required(AllowEmptyStrings = false, ErrorMessage = "You must enter a name!")]
            [MaxLength(25, ErrorMessage = "Name must be 25 characters or less")]
            public string Name { get; set; }

            [Required(AllowEmptyStrings = false, ErrorMessage = "You must enter an address!")]
            [MaxLength(30, ErrorMessage = "Address must be 30 characters or less")]
            public string Address { get; set; }

            [Required(AllowEmptyStrings = false, ErrorMessage = "You must enter a valid city name!")]
            [MaxLength(30, ErrorMessage = "City name must be 30 characters or less")]
            public string City { get; set; }

            [Required(AllowEmptyStrings = false, ErrorMessage = "You must enter a phone number!")]
            [MaxLength(14, ErrorMessage = "Phone number must be 14 characters or less")]
            public string Phone { get; set; }

            [MaxLength(100, ErrorMessage = "Description must be 50 characters or less")]
            public string Description { get; set; }

            [Required(AllowEmptyStrings = false, ErrorMessage = "You must enter a WorkTime!")]
            public DateTime WorkTimeBegin { get; set; }

            [Required(AllowEmptyStrings = false, ErrorMessage = "You must enter a WorkTime!")]
            public DateTime WorkTimeEnd { get; set; }

            public DateTime? SaturdayWorkTimeBegin { get; set; }
            public DateTime? SaturdayWorkTimeEnd { get; set; }
            public DateTime? SundayWorkTimeBegin { get; set; }
            public DateTime? SundayWorkTimeEnd { get; set; }

            public int ShoppingPlaceID { get; set; }
            public virtual ShoppingPlace ShoppingPlace { get; set; }
            public virtual ICollection<Category> Categories { get; set; }
        }

        public class ShoppingPlace
        {
            [Key]
            public int ShopingplaceID { get; set; }

            [Required(AllowEmptyStrings = false, ErrorMessage = "You must enter a name!")]
            [MaxLength(25, ErrorMessage = "Name must be 25 characters or less")]
            public string Name { get; set; }

            [Required(AllowEmptyStrings = false, ErrorMessage = "You must enter an address!")]
            [MaxLength(50, ErrorMessage = "Address must be 50 characters or less")]
            public string Address { get; set; }

            [Required(AllowEmptyStrings = false, ErrorMessage = "You must enter a city name!")]
            [MaxLength(30, ErrorMessage = "City must be 30 characters or less")]
            public string City { get; set; }

            [Required(AllowEmptyStrings = false, ErrorMessage = "You must enter a valid phone number!")]
            [MaxLength(14, ErrorMessage = "Phone number must be 14 characters or less")]
            public string Phone { get; set; }

            public int ShoppingCenterID { get; set; }
            public virtual ShoppingCenter ShoppingCenter { get; set; }
            public virtual ICollection<Shop> Shops { get; set; }
        }

    and this configuration in the DbContext:

        modelBuilder.Entity<Item>()
            .HasRequired(p => p.Category)
            .WithMany(a => a.Items)
            .HasForeignKey(a => a.CategoryID)
            .WillCascadeOnDelete(false);

        modelBuilder.Entity<Category>()
            .HasRequired(a => a.Shop)
            .WithMany(a => a.Categories)
            .HasForeignKey(a => a.ShopID)
            .WillCascadeOnDelete(false);

        modelBuilder.Entity<Shop>()
            .HasOptional(a => a.ShoppingPlace)
            .WithMany(a => a.Shops)
            .HasForeignKey(a => a.ShoppingPlaceID)
            .WillCascadeOnDelete(false);

        modelBuilder.Entity<ShoppingPlace>()
            .HasOptional(a => a.ShoppingCenter)
            .WithMany(a => a.ShoppingPlaces)
            .HasForeignKey(a => a.ShoppingCenterID)
            .WillCascadeOnDelete(false);

    Why can't I create a Shop without creating and populating a ShoppingPlace? How can I achieve that?

    EDIT: Tried with:

        modelBuilder.Entity<Shop>()
            .HasOptional(a => a.ShoppingPlace)
            .WithOptionalPrincipal();

        modelBuilder.Entity<ShoppingPlace>()
            .HasOptional(a => a.ShoppingCenter)
            .WithOptionalPrincipal();

    and it passed, but what is the difference? And why in SQL Server am I allowed to see both ShoppingPlaceID and ShoppingPlace_ShopingPlaceID, when in the case of Item and Category I see only one?
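
    A hedged observation that may explain both questions (it is a sketch based on common EF Code First behaviour, not a verified answer for this exact model): HasOptional() pairs naturally with a nullable foreign key. Because Shop.ShoppingPlaceID is a non-nullable int, the first mapping effectively still demands a ShoppingPlace, while the WithOptionalPrincipal() variant sets up an independent association with its own generated column (the ShoppingPlace_ShopingPlaceID seen in SQL Server) alongside the now-unmapped ShoppingPlaceID scalar, which is why two columns show up. Making the FK nullable keeps a single column and lets a Shop be saved on its own:

        public class Shop
        {
            public int ShopID { get; set; }

            // Nullable FK: a Shop can now be saved with ShoppingPlaceID = null.
            public int? ShoppingPlaceID { get; set; }
            public virtual ShoppingPlace ShoppingPlace { get; set; }
        }

        // The original mapping in OnModelCreating then behaves as an optional relationship:
        modelBuilder.Entity<Shop>()
            .HasOptional(a => a.ShoppingPlace)
            .WithMany(a => a.Shops)
            .HasForeignKey(a => a.ShoppingPlaceID)
            .WillCascadeOnDelete(false);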

    Read the article

  • How to use Azure storage for uploading and displaying pictures.

    - by Magnus Karlsson
    Basic setup of Azure storage for local development and production. This is something of a completion of the following guide from http://www.windowsazure.com/en-us/develop/net/how-to-guides/blob-storage/ that also adds a practical example I believe is commonly needed, i.e. uploading and presenting an image from a user.

    First we set up local storage, and then we configure it to work in a web role. Steps:
    1. Configure the connection string locally.
    2. Configure the model, controllers and Razor views.

    1. Set up the connection string
    1.1 Right-click your web role and choose "Properties".
    1.2 Click Settings.
    1.3 Add a setting.
    1.4 Name your setting. This will be the name of the connection string.
    1.5 Click the ellipsis to the right (the ellipsis appears when you select the field).
    1.6 In the window that appears, select "Windows Azure storage emulator" and click OK.

    Now we have a connection string to use. To be able to use it we need to make sure we have the Windows Azure tools for storage.
    2.1 Click Tools -> Library Package Manager -> Manage NuGet Packages for Solution.
    2.2 This is what it looks like after it has been added.

    Now on to what the code should look like.

    3.1 First we need a view which collects images to upload. Here Index.cshtml:

        @model List<string>

        @{
            ViewBag.Title = "Index";
        }

        <h2>Index</h2>
        <form action="@Url.Action("Upload")" method="post" enctype="multipart/form-data">
            <label for="file">Filename:</label>
            <input type="file" name="file" id="file1" />
            <br />
            <label for="file">Filename:</label>
            <input type="file" name="file" id="file2" />
            <br />
            <label for="file">Filename:</label>
            <input type="file" name="file" id="file3" />
            <br />
            <label for="file">Filename:</label>
            <input type="file" name="file" id="file4" />
            <br />
            <input type="submit" value="Submit" />
        </form>

        @foreach (var item in Model) {
            <img src="@item" alt="Alternate text"/>
        }

    3.2 We need a controller to receive the post. Notice the "containername" string I send to the BlobHandler. I use this as a folder for each user's pictures. If this is not a requirement, you could just call it "container", or anything in lowercase, directly when creating the container.

        public ActionResult Upload(IEnumerable<HttpPostedFileBase> file)
        {
            BlobHandler bh = new BlobHandler("containername");
            bh.Upload(file);
            var blobUris = bh.GetBlobs();

            return RedirectToAction("Index", blobUris);
        }

    3.3 The handler model. I'll let the comments speak for themselves.

        public class BlobHandler
        {
            // Retrieve storage account from connection string.
            CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
                CloudConfigurationManager.GetSetting("StorageConnectionString"));

            private string imageDirecoryUrl;

            /// <summary>
            /// Receives the user's Id for where the pictures are and creates
            /// a blob container with that name if it does not exist.
            /// </summary>
            /// <param name="imageDirecoryUrl"></param>
            public BlobHandler(string imageDirecoryUrl)
            {
                this.imageDirecoryUrl = imageDirecoryUrl;

                // Create the blob client.
                CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

                // Retrieve a reference to a container.
                CloudBlobContainer container = blobClient.GetContainerReference(imageDirecoryUrl);

                // Create the container if it doesn't already exist.
                container.CreateIfNotExists();

                // Make available to everyone.
                container.SetPermissions(
                    new BlobContainerPermissions
                    {
                        PublicAccess = BlobContainerPublicAccessType.Blob
                    });
            }

            public void Upload(IEnumerable<HttpPostedFileBase> file)
            {
                // Create the blob client.
                CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

                // Retrieve a reference to a container.
                CloudBlobContainer container = blobClient.GetContainerReference(imageDirecoryUrl);

                if (file != null)
                {
                    foreach (var f in file)
                    {
                        if (f != null)
                        {
                            CloudBlockBlob blockBlob = container.GetBlockBlobReference(f.FileName);
                            blockBlob.UploadFromStream(f.InputStream);
                        }
                    }
                }
            }

            public List<string> GetBlobs()
            {
                // Create the blob client.
                CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

                // Retrieve reference to a previously created container.
                CloudBlobContainer container = blobClient.GetContainerReference(imageDirecoryUrl);

                List<string> blobs = new List<string>();

                // Loop over blobs within the container and output the URI to each of them.
                foreach (var blobItem in container.ListBlobs())
                    blobs.Add(blobItem.Uri.ToString());

                return blobs;
            }
        }

    3.4 So, when the files have been uploaded we present them to our user on the index page. Pretty straightforward. In this example we only present the images by sending the URIs to the view. A better way would be to wrap them in a view model containing the URI, metadata, alternate text, and other relevant information, but for this example this is all we need.

    4. Now press F5 in your solution to try it out. You can see the storage emulator UI here:

    4.1 If you get any exceptions or errors, I suggest first checking whether the service is running correctly. I had problems with this; they seemed related to the installation, and a reboot fixed them.

    5. Set up for cloud storage. To do this we need to add configuration for the cloud just as we did for local storage in step one.
    5.1 We need our keys to do this. Go to the Windows Azure management portal, select the storage icon to the right and click "Manage keys". (Image from a different blog post, though.)
    5.2 Do as in step 1, but replace step 1.6 with:
    1.6 Choose "Manually entered credentials". Enter your account name.
    1.7 Paste your Account Key from step 5.1 and click OK.
    5.3 Save, publish and run!

    Please feel free to ask any questions using the comments form at the bottom of this page. I will get back to you to help you solve any questions. Our consultancy agency also provides services in the Nordic regions if you would like any further support.

    Read the article

  • Accessing SQL Data Services via ADO.NET Data Service Client Library

    - by Mehmet Aras
    Is this possible? Basically I would like to use the SQL Data Services REST interface and let the ADO.NET Data Services client library handle the communication details and generate the entities that I can use. I looked at the samples in the February release of the Azure Services Kit, but the samples in there use HttpWebRequest and HttpWebResponse to consume SQL Data Services RESTfully. I was hoping to use the ADO.NET Data Services client library to abstract the low-level details away.

    Read the article

  • cloud and existing enterprise applications technologies

    - by maxxxee
    What is the significance of new cloud platforms and services like Microsoft Azure and Amazon EC2? Are they a replacement for enterprise application platforms like .NET or JEE in a cloud environment? Is it necessary to use these or other cloud-specific platforms, or can we implement .NET or JEE in a cloud-based environment?

    Read the article
