Search Results

Search found 57848 results on 2314 pages for 'windows vista sp2'.


  • Upload File to Windows Azure Blob in Chunks through ASP.NET MVC, JavaScript and HTML5

    - by Shaun
    Originally posted on: http://geekswithblogs.net/shaunxu/archive/2013/07/01/upload-file-to-windows-azure-blob-in-chunks-through-asp.net.aspx

    Many people are using Windows Azure Blob Storage to store their data in the cloud. Blob storage provides 99.9% availability with an easy-to-use API available through the .NET SDK and HTTP REST. For example, we can store JavaScript files, images and documents in blob storage when we are building an ASP.NET web application on a Web Role in Windows Azure, or we can store our VHD files in blobs and mount them as hard drives in our cloud service. If you are familiar with Windows Azure, you know that there are two kinds of blob: page blobs and block blobs. The page blob is optimized for random read and write, which is very useful when you need to store VHD files. The block blob is optimized for sequential/chunked read and write, which is the more common usage. Since we can upload a block blob in blocks through BlockBlob.PutBlock and then commit them as a whole blob by invoking BlockBlob.PutBlockList, it is a very powerful way to upload large files: we can upload blocks in parallel and provide a pause/resume feature. There are many documents, articles and blog posts that describe how to upload a block blob. Most of them focus on the server side: once you have received a big file, stream or binary, how to upload it into blob storage in blocks through the .NET SDK. But the problem is: how can we upload these large files from the client side, for example a browser? This question came to me when I was working with a Chinese customer, helping them build a network disk product on top of Azure. The end users upload their files from the web portal, and the files are then stored in blob storage by the Web Role. My goal was to find the best way to transfer the file from the client (the end user's machine) to the server (the Web Role) through the browser. In this post I will demonstrate and describe what I did to upload large files in chunks at high speed and save them as blocks in Windows Azure Blob Storage.

    Traditional Upload, Works with Limitation

    The simplest way to implement this requirement is to create a web page with a form that contains a file input element and a submit button.

        @using (Html.BeginForm("About", "Index", FormMethod.Post, new { enctype = "multipart/form-data" }))
        {
            <input type="file" name="file" />
            <input type="submit" value="upload" />
        }

    Then, in the backend controller, we retrieve the whole content of the file and upload it into blob storage through the .NET SDK. We can split the file into blocks, upload them in parallel and commit. This code has been well blogged in the community.
        [HttpPost]
        public ActionResult About(HttpPostedFileBase file)
        {
            var container = _client.GetContainerReference("test");
            container.CreateIfNotExists();
            var blob = container.GetBlockBlobReference(file.FileName);
            var blockDataList = new Dictionary<string, byte[]>();
            using (var stream = file.InputStream)
            {
                var blockSizeInKB = 1024;
                var offset = 0;
                var index = 0;
                while (offset < stream.Length)
                {
                    var readLength = Math.Min(1024 * blockSizeInKB, (int)stream.Length - offset);
                    var blockData = new byte[readLength];
                    offset += stream.Read(blockData, 0, readLength);
                    blockDataList.Add(Convert.ToBase64String(BitConverter.GetBytes(index)), blockData);
                    index++;
                }
            }

            Parallel.ForEach(blockDataList, (bi) =>
            {
                blob.PutBlock(bi.Key, new MemoryStream(bi.Value), null);
            });
            blob.PutBlockList(blockDataList.Select(b => b.Key).ToArray());

            return RedirectToAction("About");
        }

    This works perfectly if we select an image, a song or a small video to upload. But if I select a large file, say a 6GB HD movie, then after uploading for a few minutes the request fails and the upload is terminated. In ASP.NET there is a request length limitation; the maximum request length is defined in the web.config file, and it must be a number less than about 4GB. So if we want to upload a really big file, we cannot simply implement it this way. Also, in Windows Azure, the cloud service network load balancer will terminate a connection that exceeds its timeout period; from my tests the timeout looks like 2 - 3 minutes. Hence, when we need to upload a large file we cannot just use the basic HTML elements. Besides the limitations mentioned above, a simple HTML file upload cannot provide a rich upload experience such as chunked upload, pause and pause/resume. So we need to find a better way to upload large files from the client to the server.

    Upload in Chunks through HTML5 and JavaScript

    In order to break through the limitations mentioned above, we will upload the large file in chunks. This gives us some benefits:
    - No request size limitation: since we upload in chunks, we can define the request size for each chunk regardless of how big the entire file is.
    - No timeout problem: the size of the chunks is controlled by us, so we can make sure the request for each chunk upload does not exceed the timeout period of either ASP.NET or the Windows Azure load balancer.

    It was a big challenge to upload a big file in chunks until we got HTML5. There are some new features and improvements introduced in HTML5, and we will use them to implement our solution.

    In HTML5, the File interface has been improved with a new method called "slice". It can be used to read part of a file by specifying the start byte index and the end byte index. For example, if the entire file is 1024 bytes, file.slice(512, 768) reads the part of the file from the 512th byte to the 768th byte and returns a new object of an interface called "Blob", which you can treat as an array of bytes. In fact, a Blob object represents a file-like object of immutable, raw data. The File interface is based on Blob, inheriting blob functionality and expanding it to support files on the user's system. For more information about Blob please refer here. File and Blob are very useful for implementing the chunked upload.
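    A quick aside on the request-size limit mentioned above: the ASP.NET cap is configured in web.config, and on IIS 7 and later there is a second, separate cap in the request filtering settings. Below is a minimal sketch; the element names are the standard ones, and the values shown are roughly the defaults, for illustration rather than recommendation.

        <configuration>
          <system.web>
            <!-- maxRequestLength is in KB; the ASP.NET default is 4096 (4MB) -->
            <httpRuntime maxRequestLength="4096" executionTimeout="110" />
          </system.web>
          <system.webServer>
            <security>
              <requestFiltering>
                <!-- maxAllowedContentLength is in bytes (IIS 7 and later); default is about 30MB -->
                <requestLimits maxAllowedContentLength="30000000" />
              </requestFiltering>
            </security>
          </system.webServer>
        </configuration>

    Raising these limits far enough for multi-GB uploads is exactly what the chunked approach below avoids: each chunk request stays comfortably under the defaults.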
    We will use the File interface to represent the file the user selected in the browser, and then use File.slice to read the file in chunks of the size we want. For example, if we want to upload a 10MB file with 512KB chunks, we can read it in 512KB blobs by using File.slice in a loop.

    Assume we have a web page as below: the user can select a file, an input box to specify the block size in KB, and a button to start the upload.

        <div>
            <input type="file" id="upload_files" name="files[]" /><br />
            Block Size: <input type="number" id="block_size" value="512" name="block_size" />KB<br />
            <input type="button" id="upload_button_blob" name="upload" value="upload (blob)" />
        </div>

    Then we have a JavaScript function to upload the file in chunks when the user clicks the button.

        <script type="text/javascript">
            $(function () {
                $("#upload_button_blob").click(function () {
                });
            });
        </script>

    First we need to ensure the client browser supports the interfaces we are going to use. Just try to invoke File, Blob and FormData from the "window" object. If any of them is "undefined", the condition will be "false", which means your browser doesn't support these premium features and it's time for you to get your browser updated. FormData is another new feature we are going to use later: it can generate a temporary form for us, and we will use this interface to create a form with the chunk and its associated metadata when invoking the service through ajax.

        $("#upload_button_blob").click(function () {
            // assert the browser supports html5
            if (window.File && window.Blob && window.FormData) {
                alert("Your browser is awesome, let's rock!");
            }
            else {
                alert("Oh man, please update to a modern browser before trying this cool stuff out.");
                return;
            }
        });

    Each browser supports these interfaces through its own implementation; currently Blob, File and File.slice are supported by Chrome 21, Firefox 13, IE 10, Opera 12 and Safari 5.1 or higher. After that we work on the files the user selected, one by one, since in HTML5 the user can select multiple files in one file input box.

        var files = $("#upload_files")[0].files;
        for (var i = 0; i < files.length; i++) {
            var file = files[i];
            var fileSize = file.size;
            var fileName = file.name;
        }

    Next, we calculate the start index and end index for each chunk based on the size the user specified in the browser. We put them into an array along with the file name and the index, which will be used when we upload the chunks into Windows Azure Blob Storage as blocks, since we need to specify the target blob name and the block index. At the same time we store the list of all indexes in another variable, which will be used to commit the blocks into a blob in Azure Storage once all chunks have been uploaded successfully.
        $("#upload_button_blob").click(function () {
            // assert the browser supports html5
            ... ...

            // start to upload each file in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;

                // calculate the start and end byte index for each block (chunk)
                // with the index, file name and index list for future use
                var blockSizeInKB = $("#block_size").val();
                var blockSize = blockSizeInKB * 1024;
                var blocks = [];
                var offset = 0;
                var index = 0;
                var list = "";
                while (offset < fileSize) {
                    var start = offset;
                    var end = Math.min(offset + blockSize, fileSize);

                    blocks.push({
                        name: fileName,
                        index: index,
                        start: start,
                        end: end
                    });
                    list += index + ",";

                    offset = end;
                    index++;
                }
            }
        });

    Now we have all the chunks' information ready. The next step is to upload them one by one to the server side; on the server side, each received chunk is uploaded as a block into Blob Storage, and finally they are committed with the index list through BlockBlob.PutBlockList. But all these invocations are ajax calls, which means they are not synchronous. So we need to introduce a JavaScript library to help us coordinate the asynchronous operations, named "async.js". You can download this JavaScript library here, and you can find the documentation here; I will not explain this library too much in this post. We put all the procedures we want to execute into a function array and pass it into the proper function defined in async.js to let it control the execution sequence, in series or in parallel. Hence we define an array and push the chunk upload functions into it.

        $("#upload_button_blob").click(function () {
            // assert the browser supports html5
            ... ...

            // start to upload each file in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;
                // calculate the start and end byte index for each block (chunk)
                // with the index, file name and index list for future use
                ... ...

                // define the function array and push all chunk upload operations into it
                blocks.forEach(function (block) {
                    putBlocks.push(function (callback) {
                    });
                });
            }
        });

    As you can see, I used the File.slice method to read each chunk based on the start and end byte indexes we calculated previously, and constructed a temporary HTML form with the file name, chunk index and chunk data through another new HTML5 feature named FormData. Then I posted this form to the backend server through jQuery.ajax. This is the key part of our solution.
        $("#upload_button_blob").click(function () {
            // assert the browser supports html5
            ... ...
            // start to upload each file in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;
                // calculate the start and end byte index for each block (chunk)
                // with the index, file name and index list for future use
                ... ...
                // define the function array and push all chunk upload operations into it
                blocks.forEach(function (block) {
                    putBlocks.push(function (callback) {
                        // load the blob based on the start and end index of each chunk
                        var blob = file.slice(block.start, block.end);
                        // put the file name, index and blob into a temporary form
                        var fd = new FormData();
                        fd.append("name", block.name);
                        fd.append("index", block.index);
                        fd.append("file", blob);
                        // post the form to the backend service (asp.net mvc controller action)
                        $.ajax({
                            url: "/Home/UploadInFormData",
                            data: fd,
                            processData: false,
                            contentType: "multipart/form-data",
                            type: "POST",
                            success: function (result) {
                                if (!result.success) {
                                    alert(result.error);
                                }
                                callback(null, block.index);
                            }
                        });
                    });
                });
            }
        });

    Then we invoke these functions one by one using async.js. Once all functions have executed successfully, I invoke another ajax call to the backend service to commit all these chunks (blocks) as the blob in Windows Azure Storage.

        $("#upload_button_blob").click(function () {
            // assert the browser supports html5
            ... ...
            // start to upload each file in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;
                // calculate the start and end byte index for each block (chunk)
                // with the index, file name and index list for future use
                ... ...
                // define the function array and push all chunk upload operations into it
                ... ...
                // invoke the functions one by one
                // then invoke the commit ajax call to put the blocks into a blob in azure storage
                async.series(putBlocks, function (error, result) {
                    var data = {
                        name: fileName,
                        list: list
                    };
                    $.post("/Home/Commit", data, function (result) {
                        if (!result.success) {
                            alert(result.error);
                        }
                        else {
                            alert("done!");
                        }
                    });
                });
            }
        });

    That's all on the client side. The outline of our logic:
    - Calculate the start and end byte index for each chunk based on the block size.
    - Define the functions that read the chunks from the file and upload the content to the backend service through ajax.
    - Execute the functions defined in the previous step with "async.js".
    - Finally, commit the chunks by invoking the backend service, which puts them into Windows Azure Storage.

    Save Chunks as Blocks into Blob Storage

    Above, we finished the client-side JavaScript code. It uploads the file in chunks to the backend service, which we are going to implement in this step. We will use ASP.NET MVC as our backend service; it will receive the chunks, upload them into Windows Azure Blob Storage as blocks, and finally commit them as one blob. Since the client side uploads chunks by invoking an ajax call to the URL "/Home/UploadInFormData", I created a new action under the Home controller that only accepts HTTP POST requests.

        [HttpPost]
        public JsonResult UploadInFormData()
        {
            var error = string.Empty;
            try
            {
            }
            catch (Exception e)
            {
                error = e.ToString();
            }

            return new JsonResult()
            {
                Data = new
                {
                    success = string.IsNullOrWhiteSpace(error),
                    error = error
                }
            };
        }

    Then I retrieved the file name, index and chunk content from the Request.Form object, which was passed from our client side.
    And then I used the Windows Azure SDK to create a blob container (in this case named "test") and create a blob reference with the blob name (same as the file name). Then I uploaded the chunk as a block of this blob with the index, since in Blob Storage each block must have an ID associated with it so that finally we can put all the blocks together as one blob by specifying their block ID list. (Note that all block IDs within a blob must be the same length, which is why a fixed 4-byte integer is base64-encoded rather than the index string itself.)

        [HttpPost]
        public JsonResult UploadInFormData()
        {
            var error = string.Empty;
            try
            {
                var name = Request.Form["name"];
                var index = int.Parse(Request.Form["index"]);
                var file = Request.Files[0];
                var id = Convert.ToBase64String(BitConverter.GetBytes(index));

                var container = _client.GetContainerReference("test");
                container.CreateIfNotExists();
                var blob = container.GetBlockBlobReference(name);
                blob.PutBlock(id, file.InputStream, null);
            }
            catch (Exception e)
            {
                error = e.ToString();
            }

            return new JsonResult()
            {
                Data = new
                {
                    success = string.IsNullOrWhiteSpace(error),
                    error = error
                }
            };
        }

    Next, I created another action to commit the blocks into the blob once all chunks have been uploaded. Similarly, I retrieved the blob name from Request.Form. I also retrieved the chunk ID list, which is the block ID list, from Request.Form as a string; I split it into a list and then invoked the BlockBlob.PutBlockList method. After that our blob appears in the container, ready to be downloaded.

        [HttpPost]
        public JsonResult Commit()
        {
            var error = string.Empty;
            try
            {
                var name = Request.Form["name"];
                var list = Request.Form["list"];
                var ids = list
                    .Split(',')
                    .Where(id => !string.IsNullOrWhiteSpace(id))
                    .Select(id => Convert.ToBase64String(BitConverter.GetBytes(int.Parse(id))))
                    .ToArray();

                var container = _client.GetContainerReference("test");
                container.CreateIfNotExists();
                var blob = container.GetBlockBlobReference(name);
                blob.PutBlockList(ids);
            }
            catch (Exception e)
            {
                error = e.ToString();
            }

            return new JsonResult()
            {
                Data = new
                {
                    success = string.IsNullOrWhiteSpace(error),
                    error = error
                }
            };
        }

    Now we have finished all the code we need; the original post illustrates the whole upload flow with a diagram. Below is the full client-side JavaScript code.
        <script type="text/javascript" src="~/Scripts/async.js"></script>
        <script type="text/javascript">
            $(function () {
                $("#upload_button_blob").click(function () {
                    // assert the browser supports html5
                    if (window.File && window.Blob && window.FormData) {
                        alert("Your browser is awesome, let's rock!");
                    }
                    else {
                        alert("Oh man, please update to a modern browser before trying this cool stuff out.");
                        return;
                    }

                    // start to upload each file in chunks
                    var files = $("#upload_files")[0].files;
                    for (var i = 0; i < files.length; i++) {
                        var file = files[i];
                        var fileSize = file.size;
                        var fileName = file.name;

                        // calculate the start and end byte index for each block (chunk)
                        // with the index, file name and index list for future use
                        var blockSizeInKB = $("#block_size").val();
                        var blockSize = blockSizeInKB * 1024;
                        var blocks = [];
                        var offset = 0;
                        var index = 0;
                        var list = "";
                        while (offset < fileSize) {
                            var start = offset;
                            var end = Math.min(offset + blockSize, fileSize);

                            blocks.push({
                                name: fileName,
                                index: index,
                                start: start,
                                end: end
                            });
                            list += index + ",";

                            offset = end;
                            index++;
                        }

                        // define the function array and push all chunk upload operations into it
                        var putBlocks = [];
                        blocks.forEach(function (block) {
                            putBlocks.push(function (callback) {
                                // load the blob based on the start and end index of each chunk
                                var blob = file.slice(block.start, block.end);
                                // put the file name, index and blob into a temporary form
                                var fd = new FormData();
                                fd.append("name", block.name);
                                fd.append("index", block.index);
                                fd.append("file", blob);
                                // post the form to the backend service (asp.net mvc controller action)
                                $.ajax({
                                    url: "/Home/UploadInFormData",
                                    data: fd,
                                    processData: false,
                                    contentType: "multipart/form-data",
                                    type: "POST",
                                    success: function (result) {
                                        if (!result.success) {
                                            alert(result.error);
                                        }
                                        callback(null, block.index);
                                    }
                                });
                            });
                        });

                        // invoke the functions one by one
                        // then invoke the commit ajax call to put the blocks into a blob in azure storage
                        async.series(putBlocks, function (error, result) {
                            var data = {
                                name: fileName,
                                list: list
                            };
                            $.post("/Home/Commit", data, function (result) {
                                if (!result.success) {
                                    alert(result.error);
                                }
                                else {
                                    alert("done!");
                                }
                            });
                        });
                    }
                });
            });
        </script>

    And below is the full ASP.NET MVC controller code.
        public class HomeController : Controller
        {
            private CloudStorageAccount _account;
            private CloudBlobClient _client;

            public HomeController()
                : base()
            {
                _account = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("DataConnectionString"));
                _client = _account.CreateCloudBlobClient();
            }

            public ActionResult Index()
            {
                ViewBag.Message = "Modify this template to jump-start your ASP.NET MVC application.";

                return View();
            }

            [HttpPost]
            public JsonResult UploadInFormData()
            {
                var error = string.Empty;
                try
                {
                    var name = Request.Form["name"];
                    var index = int.Parse(Request.Form["index"]);
                    var file = Request.Files[0];
                    var id = Convert.ToBase64String(BitConverter.GetBytes(index));

                    var container = _client.GetContainerReference("test");
                    container.CreateIfNotExists();
                    var blob = container.GetBlockBlobReference(name);
                    blob.PutBlock(id, file.InputStream, null);
                }
                catch (Exception e)
                {
                    error = e.ToString();
                }

                return new JsonResult()
                {
                    Data = new
                    {
                        success = string.IsNullOrWhiteSpace(error),
                        error = error
                    }
                };
            }

            [HttpPost]
            public JsonResult Commit()
            {
                var error = string.Empty;
                try
                {
                    var name = Request.Form["name"];
                    var list = Request.Form["list"];
                    var ids = list
                        .Split(',')
                        .Where(id => !string.IsNullOrWhiteSpace(id))
                        .Select(id => Convert.ToBase64String(BitConverter.GetBytes(int.Parse(id))))
                        .ToArray();

                    var container = _client.GetContainerReference("test");
                    container.CreateIfNotExists();
                    var blob = container.GetBlockBlobReference(name);
                    blob.PutBlockList(ids);
                }
                catch (Exception e)
                {
                    error = e.ToString();
                }

                return new JsonResult()
                {
                    Data = new
                    {
                        success = string.IsNullOrWhiteSpace(error),
                        error = error
                    }
                };
            }
        }

    If we select a file in the browser, our application uploads chunks of the size we specified to the server through ajax calls in the background, and then commits all the chunks into one blob. We can then find the blob in our Windows Azure Blob Storage.

    Optimized by Parallel Upload

    In the previous example we just uploaded our file in chunks. This solved both the ASP.NET MVC request content size limitation and the Windows Azure load balancer timeout, but it may introduce a performance problem, since the chunks are uploaded in sequence. In order to improve upload performance we can modify our client-side code a bit to make the upload operations run in parallel. The good news is that the "async.js" library provides a parallel execution function. If you remember, the code that invokes the service to upload chunks utilized "async.series", which means all functions are executed in sequence. Now we change this code to "async.parallel", which invokes all functions in parallel.
        $("#upload_button_blob").click(function () {
            // assert the browser supports html5
            ... ...
            // start to upload each file in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;
                // calculate the start and end byte index for each block (chunk)
                // with the index, file name and index list for future use
                ... ...
                // define the function array and push all chunk upload operations into it
                ... ...
                // invoke the functions in parallel
                // then invoke the commit ajax call to put the blocks into a blob in azure storage
                async.parallel(putBlocks, function (error, result) {
                    var data = {
                        name: fileName,
                        list: list
                    };
                    $.post("/Home/Commit", data, function (result) {
                        if (!result.success) {
                            alert(result.error);
                        }
                        else {
                            alert("done!");
                        }
                    });
                });
            }
        });

    In this way all chunks are uploaded to the server side at the same time to maximize bandwidth usage. This works well if the file is not very large and the chunk size is not very small. But for a large file it may introduce another problem: too many ajax calls are sent to the server at the same time. So the best solution is to upload the chunks in parallel with a maximum concurrency limitation. The code below sets the concurrency limit to 4, which means at most 4 ajax calls can be in flight at the same time.

        $("#upload_button_blob").click(function () {
            // assert the browser supports html5
            ... ...
            // start to upload each file in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;
                // calculate the start and end byte index for each block (chunk)
                // with the index, file name and index list for future use
                ... ...
                // define the function array and push all chunk upload operations into it
                ... ...
                // invoke the functions with a concurrency limit of 4
                // then invoke the commit ajax call to put the blocks into a blob in azure storage
                async.parallelLimit(putBlocks, 4, function (error, result) {
                    var data = {
                        name: fileName,
                        list: list
                    };
                    $.post("/Home/Commit", data, function (result) {
                        if (!result.success) {
                            alert(result.error);
                        }
                        else {
                            alert("done!");
                        }
                    });
                });
            }
        });

    Summary

    In this post we discussed how to upload files in chunks to a backend service and then upload them into Windows Azure Blob Storage in blocks. We focused on the frontend side and leveraged three new features introduced in HTML5:
    - File.slice: reads part of a file by specifying the start and end byte index.
    - Blob: a file-like interface that contains part of the file content.
    - FormData: a temporary form element through which we can pass a chunk along with some metadata to the backend service.

    Then we discussed the performance considerations of chunked uploading. Sequential upload cannot provide the maximum upload speed, but unlimited parallel upload might crash the browser and the server if there are too many chunks. So we finally came up with the solution of uploading chunks in parallel with a concurrency limitation. We also demonstrated how to utilize the "async.js" JavaScript library to help us control the asynchronous calls and the parallel limit.

    Regarding the chunk size and the parallel limit, there is no single "best" value. You need to test various combinations and find the best one for your particular scenario. It depends on the local bandwidth, the client machine's cores, and the server side's (Windows Azure Cloud Service virtual machine) cores, memory and bandwidth. Below is one of my performance test results. The client machine was Windows 8 with IE 10 and 4 cores, on the Microsoft corporate network. The web site was hosted in the Windows Azure China North data center (in Beijing) on one small web role (1 core CPU, 1.75GB memory, 100Mbps bandwidth).
    The test cases were:
    - Chunk size: 512KB, 1MB, 2MB, 4MB.
    - Upload mode: sequential, parallel (unlimited), parallel with a limit (4 threads, 8 threads).
    - Chunk format: base64 string, binary.
    - Target file: 100MB.
    - Each case was tested 3 times.

    (The test result chart is available in the original post.) Some thoughts, but not guidance or best practice:
    - Parallel gets better performance than sequential.
    - There is no significant performance difference between parallel with 4 threads and with 8 threads.
    - Transferring binary chunks provides better performance than base64.
    - In all cases, a chunk size of 1MB - 2MB gets the best performance.

    Hope this helps,
    Shaun

    All documents and related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • Unable to Turn On Media Streaming in Windows Media Player 12 on Windows 7

    - by Chau Chee Yang
    I have 2 PCs with Windows 7 and Media Player 12 installed. I would like to use the Play To feature on both PCs connected via LAN. Both PCs (A and B) run Media Player under a standard user account. I am able to turn on the media streaming option on PC A (with a privilege elevation prompt) without any problem. However, PC B also shows the elevation prompt but there is no response after entering the administrator password. Both PCs were configured with the same steps. I can "play to" PC A (in a standard user account) from other PCs without any problem, but I can't "play to" PC B in a standard user account; I can only run Media Player under an administrator account for "play to" to function. I have tried uninstalling and reinstalling Media Player via "Programs and Features" in Control Panel on PC B, but that doesn't work either. Has anyone had a similar experience of failing to turn on media streaming when running Windows Media Player under a standard user account?

    Read the article

  • Windows 2003 Remote Desktop fails to connect after Windows Update

    - by bookstorecowboy
    I have a number of Windows servers: Win2003, Win2003 R2 and Win2003 R2 64-bit. After installing Windows updates on the machines and rebooting, Terminal Services gives me an immediate "could not connect" message. The servers are running fine and you can log on at the physical machine. Once you reboot the box (with a remote restart in some cases, when it's not accessible), Terminal Services behaves itself and allows a connection. Whilst it's not a show stopper, it is rather annoying when you can't see your box. Anyone know why this is or what the cause is?

    Read the article

  • How to bypass resume from hibernate

    - by Daniel Trebbien
    I am attempting to resume a Windows Vista laptop from hibernate, but the resume process seems to be stuck in an endless loop in which Windows is repeatedly trying to read from the optical drive. When I press the Power On button on the laptop, the screen is black (not even the backlight turns on) and the following occurs in a loop: Five seconds pass and I hear the optical drive being accessed. (There's no disk in the drive, so it sounds like a short buzzing noise.) Two seconds pass and I hear the optical drive being accessed. Two seconds pass and I hear the optical drive being accessed. So it's three short buzzing noises in a row, over and over again. Eventually I have to abruptly power off the machine. I have tried inserting a data CD into the drive as well as a bootable CD (a live Linux distro boot disk). For both, the optical drive spins up for a bit, but stops after Windows decides that the disk is not what it is looking for. I have since lost the Windows Vista recovery DVD, but I don't know if inserting the recovery disk into the optical drive would have a different effect than the bootable CD. I have tried pressing F8 immediately after pressing the Power On button (hoping to enter System Restore), but that did not have an effect. Is there a special key sequence that will cause Windows to bypass resuming from hibernate, effectively ignoring hiberfil.sys?

    Read the article


  • Win 2008 R2 Server Not Recognizing Second Hard Drive

    - by Brian
    Hello, I just purchased a Dell server which has two hard drives and no RAID setup. I can currently only see one hard drive, and I'm not sure how to get it to recognize the other; I thought that being a new machine, this wouldn't be an issue. It is running Windows Server 2008 R2, which I loaded on myself. I'm a n00b to all of this so I'm not sure why this is failing to work. Any help appreciated. Thanks.
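    A typical first check for this situation: the second disk is usually just offline or uninitialized. In Disk Management (diskmgmt.msc) you can right-click the disk to bring it online and initialize it. A diskpart sketch of the same steps is below, assuming the missing disk shows up as disk 1 (the create/format steps are destructive and will erase that disk):

        diskpart
        DISKPART> list disk
        DISKPART> select disk 1
        DISKPART> online disk
        DISKPART> attributes disk clear readonly
        DISKPART> create partition primary
        DISKPART> format fs=ntfs quick label=Data
        DISKPART> assign letter=E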

    Read the article

  • Backup Your Windows Home Server Off-Site with Asus Webstorage

    - by Mysticgeek
    Windows Home Server lets you back up machines on your network easily. But what about backing up the server data? Today we take a look at ASUS WebStorage for Windows Home Server, which provides you with secure off-site backup for WHS. To use the ASUS WebStorage service you'll need to sign up for a free account. It offers 1GB of free storage; after that you can purchase an unlimited backup package for $39.99 for a one-year subscription. Note: they also offer online storage for individual PCs.

    Install ASUS WebStorage for WHS
    Browse to your shared folders on the server, open the Add-Ins folder and copy over the WHSConnectorSetup2.2.4.088.msi file (link below), then close out of the folder. Now launch the Windows Home Server Console from one of the computers on your network, click Settings, then Add-ins. Under Available Add-ins click the Available tab and you'll see the ASUS WebStorage installer file we just copied over. Click the Install button. Installation kicks off, and when it's complete you'll need to close out of the console and reconnect.

    Using the ASUS WebStorage WHS Connector
    When you reconnect to the WHS Console, scroll over to the ASUS WebStorage icon and click Settings. Now log into your ASUS account… then select the folders you want to back up to the WebStorage service. Select the radio button next to Enable to initialize the backup process… The backup process begins. You can change which folders are backed up simply by disabling the backup process, unchecking the folder(s), then enabling the backup again.

    ASUS WebStorage Site
    After you have files backed up to the ASUS site, log into your account and you're presented with an overview of the amount of storage you're using. It also shows what types of files are taking up space. You can browse through your backed-up files and folders; the site lets you share and sync backed-up data as well. Navigate to the file you want and you can easily download it by clicking on it, or share it out by clicking the share link below it. If you choose to share it, you're provided with a link to the file to send out to other users.

    Conclusion
    Users of Windows Home Server have been looking for an inexpensive cloud backup solution for quite some time. There are services such as JungleDisk, KeepVault, Wuala, etc. These services probably do a better job, but they can start getting expensive once you start uploading GBs of data. Another disappointment of ASUS WebStorage is that you can only back up your WHS shares (from what we've been able to determine); it's an "all or nothing" type of thing, and you cannot go in and select individual files and folders. The initial upload speeds can be a bit slow as well, although that might have something to do with the limited upload speed of the DSL connection we used to test it. Retrieving your data from the ASUS site is a breeze, though, and the data files are organized quite well. The WHS add-in is very easy to install and use. If you're looking for an off-site solution to back up your WHS data, you can test out ASUS WebStorage for free with the 1GB limit. This is good for trying the service, and it might be exactly what you're looking for. Other users may want a more advanced solution like KeepVault or CloudBerry, which is a front end for Amazon S3 storage.
    Download ASUS WebStorage WHS Addin
    Other WHS Offsite Backup Solutions: CloudBerry, JungleDisk, KeepVault, Wuala

    Read the article

  • Reduce the I/O priority of Windows Backup (Windows Server 2008 R2)

    - by HelloSam
    I have PostgreSQL running on a Windows Server 2008 R2 x64 box, and I have scheduled a backup every day from the RAID 1 DB disk to a dedicated standalone disk. The disks are 15k SAS on a Dell PERC 6i. I am using the built-in Windows Server Backup for this purpose. The problem is that whenever the backup process kicks in, database performance is hogged; I would say almost a 10x performance reduction. In Resource Monitor, the disk queue is in the double-digit range while backing up, and less than 1 during the day. Disk activity is around 30-50MB/s during backup, so I guess the hardware is acting normally, though wbengine.exe accounts for most of it. I think reducing the I/O priority of the backup process would be the answer, but I couldn't find a way to do it. Tuning the process CPU priority does not seem to help.

    Read the article

  • Should I use Windows deployment server to deploy Windows 7?

    - by Scottio
    Hi all, I've installed the Windows Deployment Services role on Windows 2008 R2. I took the Windows PE .wim file from the Windows 7 AIK and added it to my deployment server. On a client I can now do a network boot and have the Windows PE image load. On my previous Windows 2003 deployment server I'd then run WDSCapture and upload the Windows client image to the server, but WDSCapture is no longer present. Am I missing something? Should I be using Windows Deployment Services for Windows 7? Many thanks, Scott

    Read the article

  • Migrating Windows 2008 R2 to Windows 2012 (migrate all FSMO too)

    - by Mauro
    I own 2 servers running Windows 2008 R2, both domain controllers. The first one is, of course, the primary DC (holding all the FSMO roles). What I would like to do is demote the 2nd DC, remove it from the domain, and replace its Windows 2008 R2 with 2012. I will then rejoin this 2nd DC (now the new 2012 server) to the domain and promote it again (Server Manager). Once it is a DC again, I would like to temporarily transfer all the FSMO roles to this server while I do the same operation on what is currently the primary DC. Is this a stupid solution? What I want is a clean installation; I don't want to upgrade those systems in place. Suggestions? Ideas? Thanks, Mauro
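    For reference, a sketch of the FSMO transfer step using the Active Directory PowerShell module (available on 2008 R2 and later); "NEWDC2012" is a placeholder for the target DC's name:

        Import-Module ActiveDirectory
        # transfer all five FSMO roles to the new 2012 DC
        Move-ADDirectoryServerOperationMasterRole -Identity "NEWDC2012" `
            -OperationMasterRole SchemaMaster, DomainNamingMaster, PDCEmulator, RIDMaster, InfrastructureMaster

    ntdsutil can perform the same transfer interactively if the PowerShell module is not available.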

    Read the article

  • mapping server 2008 network drive to vista home premium x64

    - by rboorgapally
    Hi, we have a Windows Server 2008 box at my workplace. I want to map a drive from the server to my laptop. I use Windows Vista Home Premium x64 and I am connected to my workplace through VPN. I can map the drive when I use the administrator account on the server, but the logon is unsuccessful if I use my personal account on the server to map the drive. My personal account on the server is part of the Administrators group. Can anyone help me with this?
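    One thing worth trying from the command line, since it surfaces the actual logon error rather than a generic failure; the server name, share and account below are placeholders:

        net use Z: \\server2008\share /user:SERVER2008\yourpersonalaccount *

    The * prompts for the password, and the resulting system error code (e.g. 5 vs. 1326) narrows the problem down to permissions vs. credentials. Also worth noting, as an assumption to verify: if the personal account is a local (not domain) administrator, UAC remote token filtering can make it behave like a standard user over the network even though it is in the Administrators group.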

    Read the article

  • Dual booting Windows and Arch Linux (with GRUB2) - after using Windows, Windows Boot Manager made first in boot priority list

    - by louis058
    I am dual booting Windows 7 and Arch Linux (both 64-bit) with GRUB2, using the 64-bit EFI version. I partitioned my drive as GPT and installed Windows first according to this guide. I then installed Arch Linux using the Beginner's Guide, installing grub2-efi-x86_64 in the process. Everything is working fine now, with one problem. I can set the boot priority in the BIOS (or is it UEFI?) so that GRUB tries to boot before Windows Boot Manager, and then chainload Windows Boot Manager from GRUB. However, when I actually use Windows in this manner, upon shutting down and turning on again, or rebooting, Windows sets Windows Boot Manager first in the priority list again, with the result that I have to manually put GRUB first again, or I can't boot into Linux. My motherboard is an ASRock H61M/USB3, if that helps. I want to know how to turn off this behaviour.
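    A commonly cited workaround for firmware that keeps promoting Windows Boot Manager is to stop fighting the boot order and instead point the Windows Boot Manager entry itself at GRUB. A sketch, run from an elevated Command Prompt in Windows; the EFI path is an assumption and must match wherever grubx64.efi actually sits on your EFI system partition:

        rem path below is an assumption; point it at your GRUB EFI binary on the ESP
        bcdedit /set {bootmgr} path \EFI\grub\grubx64.efi

    After this, whichever way the firmware picks "Windows Boot Manager", you land in GRUB, and GRUB's Windows entry still chainloads the real bootmgfw.efi.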

    Read the article

  • Monitor flashes black when starting video files, opening folders with video files

    - by Bob
    My entire monitor flashes black for a second or so when I start any video file, and again when I close it. It also happens when I open a folder that contains video files with thumbnail view enabled, or when I do anything that presumably accesses the video stream inside a video file. It happens in VLC and Windows Media Player, though certain video editors I have, like Avidemux and VirtualDub, don't have that issue when opening video files. It also does not happen when watching Flash-based videos on web sites, and it doesn't seem to happen with QuickTime. I went through and uninstalled any extraneous programs a week or so ago and didn't see any change. I also updated the graphics card driver. What can I do to troubleshoot/fix this?

    Read the article

  • BUILD 2013 day 1 Keynote recap for devs

    - by pluginbaby
    Only 7 months after the previous BUILD event, Microsoft is hosting a new BUILD conference in San Francisco (June 26-28, 2013).

    Notable announcements of day 1:
    - The Windows 8.1 preview is available today
    - Preview of Internet Explorer 11
    - Visual Studio 2013 Preview and .NET Framework 4.5.1 Preview are both available
    - The Windows Store has been redesigned and is now much more interesting, both for users and developers
    - Windows Phone 8 annual Dev Center registration is reduced to $19 for the next 60 days! (normally $99 for individuals and companies)

    Also to check out:
    - Windows 8.1 Preview Product Guide for Developers: http://msdn.microsoft.com/en-US/windows/apps/bg184615
    - F12 Tools in Internet Explorer 11 Preview have been rebuilt from the ground up: http://msdn.microsoft.com/en-us/library/ie/bg182632

    Watch the entire keynote online: http://channel9.msdn.com/Events/Build/2013/1-001
    Read the full transcript: http://www.microsoft.com/en-us/news/Speeches/2013/06-26Build2013.aspx

    Read the article

  • What's wrong with my grub configuration?

    - by Daniel Medrano
    I have one hard drive with 2 partitions. I had Windows XP and Windows 7 installed. I deleted the Windows 7 partition and installed Ubuntu 11.10 on it. But when I turn on the computer and the GRUB menu appears on the screen, I can only see my Ubuntu installation and a "Windows 7 loader" entry. Windows XP is still installed on the other partition, but it doesn't boot. If I want to use Ubuntu alongside Windows XP, how can I fix the problem?
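    The usual fix on Ubuntu is to let os-prober find the XP partition and then regenerate the GRUB menu; a minimal sketch from an Ubuntu terminal:

        sudo apt-get install os-prober   # detects other installed OSes; usually already present
        sudo update-grub                 # re-scans partitions and rebuilds /boot/grub/grub.cfg

    If update-grub still doesn't list XP, it is worth checking that the XP partition is intact (for example with sudo fdisk -l) before going further.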

    Read the article

  • Install Windows 8 64-bit over the top of Windows 8 32-bit

    - by Andrew Gee
    I currently have Windows 8 32-bit installed from MSDN (I didn't realise at the time that my processor supports 64-bit). I understand that you can't upgrade from 32-bit to 64-bit directly from the ISO within Windows. I have burned the ISO to a DVD and have attempted booting from it. The problem I am encountering:

        The operating system couldn't be loaded because a required file is missing or contains errors.
        File: CI.dll
        Error code: 0xc0000221
        You'll need to use the recovery tools on your installation media. If you don't have any installation media (like a disc or USB device), contact your system administrator or PC manufacturer.

    Additional info:
    - Computer: HP Pavillion m9280.uk-a
    - Processor: AMD Phenom 9600 Quad-Core
    - RAM: 3 1GB sticks

    Thanks in advance!

    Read the article

  • Which tool / technology: System management for databases and dependent services

    - by Filburt
    I'm looking for advice on how to enable our team to take down and restart our company systems for maintenance purposes. The scenario includes:
    - several Oracle DB machines
    - several MS SQL Server machines with multiple instances
    - Windows services (IIS etc.)
    - a BizTalk EAI solution
    - Apache and Tomcat instances
    - lots of scheduled tasks on Win2003 and Win2008 machines (physical and virtual).
    The main focus is on capturing all the dependencies between said databases and the services and tasks connecting to them. At the moment an enterprise-class solution is not an option. We are considering developing a solution driven by PowerShell scripts (see the sketch below), but I hope for some more suggestions.
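    If the PowerShell route is chosen, the core building block is cheap to prototype: stop a service's running dependents first, then the service itself, and reverse the order on startup. A minimal sketch; 'MSSQLSERVER' is just an example service name:

        $name = 'MSSQLSERVER'   # hypothetical service name

        # take down: stop running dependents first, then the service itself
        Get-Service -Name $name -DependentServices |
            Where-Object { $_.Status -eq 'Running' } |
            Stop-Service -Verbose
        Stop-Service -Name $name -Verbose

        # bring up: the service first, then its dependents
        Start-Service -Name $name
        Get-Service -Name $name -DependentServices | Start-Service -Verbose

    The hard part the question is really about, cross-machine dependencies between databases and the consumers connecting to them, would still need an explicit dependency map (even a simple ordered list per system) that a driver script walks.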

    Read the article

  • Can I change the system's "Browse for Folder" dialog globally?

    - by Chris Phillips
    As far as I know, everyone hates the "Browse for Folder" dialog: This dialog is always too small, rarely remembers locations well, and worst of all: forces you to navigate your entire computer using a tedious tree structure. Now, to be fair, some of the problems are likely to do with how apps are invoking the control -- not setting a size or a default directory, etc. But the problem about the tedious tree control remains. Is there any way to customize your Windows installation to use a different control? Preferably an app/installer that does it for you safely, but dropping in a compatible DLL or similar technique would be okay too. Or are we stuck with this terrible control forever?
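    To illustrate the "it depends on how apps invoke it" point: in .NET, for example, an application gets a noticeably less painful dialog simply by seeding a start location. A minimal WinForms sketch (the path is just an example):

        using System;
        using System.Windows.Forms;

        class Demo
        {
            [STAThread]
            static void Main()
            {
                using (var dialog = new FolderBrowserDialog())
                {
                    dialog.Description = "Pick a destination folder";
                    // seeding SelectedPath spares the user the tree crawl from the Desktop root
                    dialog.SelectedPath = @"C:\Users";
                    if (dialog.ShowDialog() == DialogResult.OK)
                        Console.WriteLine(dialog.SelectedPath);
                }
            }
        }

    But that is the calling app's choice; short of shell extensions or per-app hacks, there is no supported global switch that swaps the control system-wide.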

    Read the article

  • Gladinet Cloud Desktop tool to manage Windows Azure Blob Storage from Windows Explorer

    - by kaleidoscope
    Gladinet Cloud Desktop is designed to make it easier for Windows Azure users to manage Windows Azure Blob storage directly from Windows Explorer. The solution makes it possible for Windows Azure Blob storage to be mapped as a virtual network drive. "You can map multiple Azure Blob Storage accounts as side-by-side virtual folders in the same network drive. You can drag & drop between a local drive and the Azure drive."
    For more information: http://www.ditii.com/2010/01/04/gladinet-cloud-desktop-tool-to-manage-windows-azure-blob-storage-from-windows-explorer/
    To download the tool: http://www.gladinet.com/p/download_starter_direct.htm

    Ritesh, D

    Read the article

  • windows server secondary plex or windows server default

    - by shiva
    I am new to handling issues on servers. When my system tries to reboot, I see two boot entries to choose from: Windows Server 2012 (the current OS) and Windows Server secondary plex. Whenever there is a system restart, the system stops at this screen, and since I connect to this server using RDP, I have to wait for the Hetzner console to click one of the OS entries to boot. Even though the current OS is set as the default and the time given is 30 seconds, it still waits for user input. So I want to know which of the two I should boot from; I just want one OS.
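    For background: a "secondary plex" entry is the fallback boot entry Windows creates when the boot volume is software-mirrored, so you would normally boot the default (current) entry. To inspect and adjust the boot menu from an elevated prompt, a sketch (the GUID is a placeholder copied from the /enum output):

        bcdedit /enum
        rem note the identifier of the "secondary plex" entry in the output
        bcdedit /default {current}
        bcdedit /timeout 5
        rem only delete the plex entry if you are sure the mirror no longer exists
        bcdedit /delete {xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx}

    Keep in mind the secondary plex entry is what lets you boot from the surviving half of a broken mirror; if the disks are still mirrored, shortening the timeout is safer than deleting the entry.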

    Read the article

  • Microsoft Keyboard(8000) play button opens Windows Media Player rather than Zune on Windows 7

    - by Chance
    I'm currently using the Microsoft 8000 keyboard on Windows 7. Before installing Microsoft's keyboard application, the play button opened Zune and played music. After installation, it now opens Windows Media Player rather than Zune. If I have Zune open and press play, it will open WMP, and upon pressing it again, it will kick off the play command in Zune. I've checked the default programs in Control Panel and have set Zune as the default wherever available, but that hasn't changed anything. Has anyone run into this before? I'm a bit stumped, as googling does not produce any relevant results. Thanks!

    Read the article

  • Firewall GPO not applying despite being enumerated by gpresult

    - by jshin47
    I have a need to open up the admin$ share on all of my domain's client PCs, and I am trying to do so using group policy. I defined computer policy for Windows Firewall with Advanced Security in a policy object linked to the appropriate container and added the appropriate rules. However, they are not being applied! I feel like I have tried all of the obvious steps: I've checked gpresult, and the resulting set of policy is the way I would expect it to look. I've run gpupdate /force and gpupdate /sync on a few client computers, but no matter what I do they don't seem to respond to my changes. I know that other computer policies in the GPO are being applied, so it is strange that these are not. I have also disabled exceptions on clients in the firewall GPO, but that doesn't seem to be applying either. Here is a screenshot of firewall.cpl from a client (screenshot omitted). Basically, although other options in the same GPO ARE applied as computer policy, the firewall settings seem to be ignored.
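    One way to split the problem is to check on a client whether the rules ever arrived locally, versus being present but ineffective. A sketch from an elevated prompt on a client; if the local command works where the GPO-delivered rule does not, the issue is delivery rather than the rule itself:

        rem 1) is a matching inbound rule actually present on the client?
        netsh advfirewall firewall show rule name=all dir=in | findstr /i /c:"File and Printer"
        rem 2) local equivalent of the rule group the admin$ share needs:
        netsh advfirewall firewall set rule group="File and Printer Sharing" new enable=Yes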

    Read the article

  • Bluescreen after Ubuntu 12.10 installation, after recommended boot repair "no filesystem found" please help!

    - by Phil
    After I tried to install Ubuntu 12.10 alongside my Windows 7, a black screen came up and nothing happened for over five minutes. So I forced the computer to shut down and booted from the Linux CD again. I partitioned the Linux partitions manually and installed Ubuntu. At the next reboot I got a bluescreen from Windows three seconds after it started loading. I tried to repair the problem using boot-repair, which produced this log: http://paste.ubuntu.com/1430803. After the next restart it told me that it didn't find the filesystem. Because it couldn't find the filesystem, the Windows CD was unable to repair it. Then I tried to repair it with TestDisk and was able to change the Windows partition back to NTFS, but I was not able to repair the Windows 7 boot partition. Now I get the message that no bootloader is found when I restart. Please help me.

    Read the article

  • Need to use wusa to stop endless rebooting on R2

    - by Jack
    I installed 3 updates which, after rebooting, resulted in a reboot loop that never gets past the "configuring your computer for Windows" stage. Last Known Good Configuration, safe mode, etc. make no difference (as expected). I have no System Restore images available. I used DISM to uninstall one of the updates, but the other two appear to be MSP patches, so I cannot use DISM to uninstall them. From what I have been reading it appears I must use wusa; however, wusa does not seem to be included in WinRE. Does anyone have a suggestion for how I can access wusa to uninstall these updates, and hopefully reverse this problem?
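    For reference, the wusa uninstall syntax, usable from a running Windows or any environment where wusa.exe and the servicing stack are available; the KB number below is a placeholder:

        rem substitute the KB numbers of the updates you installed
        wusa /uninstall /kb:2533552 /quiet /norestart

    Note that wusa only handles .msu-packaged updates, which is consistent with DISM and wusa both struggling with the MSP-based ones; MSP patches are Windows Installer patches, and msiexec is the tool that can remove those (see msiexec /? for the /uninstall ... /package syntax) once the system can boot far enough to run it.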

    Read the article

  • Imagine Cup 2012: the competition adds two new challenges on Windows Phone and Windows Azure

    Imagine Cup 2012: the competition adds two new challenges, on Windows Phone and Windows Azure. Imagine Cup, the annual digital innovation competition for students organized by Microsoft, was recently launched for its 10th edition. This year, competitors will have the opportunity to compete in two additional challenges that Microsoft has just announced. The new categories mainly concern Microsoft's cloud platform, Windows Azure, and the Windows Phone mobile operating system. Unlike the other categories, the Windows Azure and Windows Phone challenges will be run entirely online (submission, evaluation and selection of the winners). The...

    Read the article
