Search Results

Search found 128 results on 6 pages for 'jsonresult'.

Page 2/6 | < Previous Page | 1 2 3 4 5 6  | Next Page >

  • Upload File to Windows Azure Blob in Chunks through ASP.NET MVC, JavaScript and HTML5

    - by Shaun
Originally posted on: http://geekswithblogs.net/shaunxu/archive/2013/07/01/upload-file-to-windows-azure-blob-in-chunks-through-asp.net.aspx

    Many people use Windows Azure Blob Storage to store their data in the cloud. Blob storage provides 99.9% availability with an easy-to-use API through the .NET SDK and HTTP REST. For example, we can store JavaScript files, images and documents in blob storage when we are building an ASP.NET web application on a Web Role in Windows Azure, or we can store our VHD files in blob storage and mount them as hard drives in our cloud service. If you are familiar with Windows Azure, you know that there are two kinds of blob: page blobs and block blobs. The page blob is optimized for random read and write, which is very useful when you need to store VHD files. The block blob is optimized for sequential, chunked read and write, which is the more common usage. Since we can upload a block blob in blocks through BlockBlob.PutBlock and then commit them as a whole blob by invoking BlockBlob.PutBlockList, it is very powerful for uploading large files: we can upload the blocks in parallel and provide a pause-resume feature. There are many documents, articles and blog posts that describe how to upload a block blob. Most of them focus on the server side: once you have received a big file, stream or binary content, how to upload it into blob storage in blocks through the .NET SDK. But the question is, how can we upload these large files from the client side, for example a browser? This question came to me when I was working with a Chinese customer to help them build a network-disk product on top of Azure. The end users upload their files from the web portal, and the files are then stored in blob storage from the Web Role. My goal was to find the best way to transfer the file from the client (the end user’s machine) to the server (the Web Role) through the browser. In this post I will demonstrate and describe what I did to upload large files in chunks at high speed and save them as blocks in Windows Azure Blob Storage.   Traditional Upload, Works with Limitations   The simplest way to implement this requirement is to create a web page with a form that contains a file input element and a submit button.
        @using (Html.BeginForm("About", "Index", FormMethod.Post, new { enctype = "multipart/form-data" }))
        {
            <input type="file" name="file" />
            <input type="submit" value="upload" />
        }
    Then, in the backend controller, we retrieve the whole content of this file and upload it into blob storage through the .NET SDK. We can split the file into blocks, upload them in parallel and commit. The code has been well blogged in the community. 
1: [HttpPost] 2: public ActionResult About(HttpPostedFileBase file) 3: { 4: var container = _client.GetContainerReference("test"); 5: container.CreateIfNotExists(); 6: var blob = container.GetBlockBlobReference(file.FileName); 7: var blockDataList = new Dictionary<string, byte[]>(); 8: using (var stream = file.InputStream) 9: { 10: var blockSizeInKB = 1024; 11: var offset = 0; 12: var index = 0; 13: while (offset < stream.Length) 14: { 15: var readLength = Math.Min(1024 * blockSizeInKB, (int)stream.Length - offset); 16: var blockData = new byte[readLength]; 17: offset += stream.Read(blockData, 0, readLength); 18: blockDataList.Add(Convert.ToBase64String(BitConverter.GetBytes(index)), blockData); 19:  20: index++; 21: } 22: } 23:  24: Parallel.ForEach(blockDataList, (bi) => 25: { 26: blob.PutBlock(bi.Key, new MemoryStream(bi.Value), null); 27: }); 28: blob.PutBlockList(blockDataList.Select(b => b.Key).ToArray()); 29:  30: return RedirectToAction("About"); 31: } This works perfect if we selected an image, a music or a small video to upload. But if I selected a large file, let’s say a 6GB HD-movie, after upload for about few minutes the page will be shown as below and the upload will be terminated. In ASP.NET there is a limitation of request length and the maximized request length is defined in the web.config file. It’s a number which less than about 4GB. So if we want to upload a really big file, we cannot simply implement in this way. Also, in Windows Azure, a cloud service network load balancer will terminate the connection if exceed the timeout period. From my test the timeout looks like 2 - 3 minutes. Hence, when we need to upload a large file we cannot just use the basic HTML elements. Besides the limitation mentioned above, the simple HTML file upload cannot provide rich upload experience such as chunk upload, pause and pause-resume. So we need to find a better way to upload large file from the client to the server.   Upload in Chunks through HTML5 and JavaScript In order to break those limitation mentioned above we will try to upload the large file in chunks. This takes some benefit to us such as - No request size limitation: Since we upload in chunks, we can define the request size for each chunks regardless how big the entire file is. - No timeout problem: The size of chunks are controlled by us, which means we should be able to make sure request for each chunk upload will not exceed the timeout period of both ASP.NET and Windows Azure load balancer. It was a big challenge to upload big file in chunks until we have HTML5. There are some new features and improvements introduced in HTML5 and we will use them to implement our solution.   In HTML5, the File interface had been improved with a new method called “slice”. It can be used to read part of the file by specifying the start byte index and the end byte index. For example if the entire file was 1024 bytes, file.slice(512, 768) will read the part of this file from the 512nd byte to 768th byte, and return a new object of interface called "Blob”, which you can treat as an array of bytes. In fact,  a Blob object represents a file-like object of immutable, raw data. The File interface is based on Blob, inheriting blob functionality and expanding it to support files on the user's system. For more information about the Blob please refer here. File and Blob is very useful to implement the chunk upload. 
We will use File interface to represent the file the user selected from the browser and then use File.slice to read the file in chunks in the size we wanted. For example, if we wanted to upload a 10MB file with 512KB chunks, then we can read it in 512KB blobs by using File.slice in a loop.   Assuming we have a web page as below. User can select a file, an input box to specify the block size in KB and a button to start upload. 1: <div> 2: <input type="file" id="upload_files" name="files[]" /><br /> 3: Block Size: <input type="number" id="block_size" value="512" name="block_size" />KB<br /> 4: <input type="button" id="upload_button_blob" name="upload" value="upload (blob)" /> 5: </div> Then we can have the JavaScript function to upload the file in chunks when user clicked the button. 1: <script type="text/javascript"> 1: 2: $(function () { 3: $("#upload_button_blob").click(function () { 4: }); 5: });</script> Firstly we need to ensure the client browser supports the interfaces we are going to use. Just try to invoke the File, Blob and FormData from the “window” object. If any of them is “undefined” the condition result will be “false” which means your browser doesn’t support these premium feature and it’s time for you to get your browser updated. FormData is another new feature we are going to use in the future. It could generate a temporary form for us. We will use this interface to create a form with chunk and associated metadata when invoked the service through ajax. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: if (window.File && window.Blob && window.FormData) { 4: alert("Your brwoser is awesome, let's rock!"); 5: } 6: else { 7: alert("Oh man plz update to a modern browser before try is cool stuff out."); 8: return; 9: } 10: }); Each browser supports these interfaces by their own implementation and currently the Blob, File and File.slice are supported by Chrome 21, FireFox 13, IE 10, Opera 12 and Safari 5.1 or higher. After that we worked on the files the user selected one by one since in HTML5, user can select multiple files in one file input box. 1: var files = $("#upload_files")[0].files; 2: for (var i = 0; i < files.length; i++) { 3: var file = files[i]; 4: var fileSize = file.size; 5: var fileName = file.name; 6: } Next, we calculated the start index and end index for each chunks based on the size the user specified from the browser. We put them into an array with the file name and the index, which will be used when we upload chunks into Windows Azure Blob Storage as blocks since we need to specify the target blob name and the block index. At the same time we will store the list of all indexes into another variant which will be used to commit blocks into blob in Azure Storage once all chunks had been uploaded successfully. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 
4: // start to upload each files in chunks 5: var files = $("#upload_files")[0].files; 6: for (var i = 0; i < files.length; i++) { 7: var file = files[i]; 8: var fileSize = file.size; 9: var fileName = file.name; 10:  11: // calculate the start and end byte index for each blocks(chunks) 12: // with the index, file name and index list for future using 13: var blockSizeInKB = $("#block_size").val(); 14: var blockSize = blockSizeInKB * 1024; 15: var blocks = []; 16: var offset = 0; 17: var index = 0; 18: var list = ""; 19: while (offset < fileSize) { 20: var start = offset; 21: var end = Math.min(offset + blockSize, fileSize); 22:  23: blocks.push({ 24: name: fileName, 25: index: index, 26: start: start, 27: end: end 28: }); 29: list += index + ","; 30:  31: offset = end; 32: index++; 33: } 34: } 35: }); Now we have all chunks’ information ready. The next step should be upload them one by one to the server side, and at the server side when received a chunk it will upload as a block into Blob Storage, and finally commit them with the index list through BlockBlobClient.PutBlockList. But since all these invokes are ajax calling, which means not synchronized call. So we need to introduce a new JavaScript library to help us coordinate the asynchronize operation, which named “async.js”. You can download this JavaScript library here, and you can find the document here. I will not explain this library too much in this post. We will put all procedures we want to execute as a function array, and pass into the proper function defined in async.js to let it help us to control the execution sequence, in series or in parallel. Hence we will define an array and put the function for chunk upload into this array. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 4:  5: // start to upload each files in chunks 6: var files = $("#upload_files")[0].files; 7: for (var i = 0; i < files.length; i++) { 8: var file = files[i]; 9: var fileSize = file.size; 10: var fileName = file.name; 11: // calculate the start and end byte index for each blocks(chunks) 12: // with the index, file name and index list for future using 13: ... ... 14:  15: // define the function array and push all chunk upload operation into this array 16: blocks.forEach(function (block) { 17: putBlocks.push(function (callback) { 18: }); 19: }); 20: } 21: }); 22: }); As you can see, I used File.slice method to read each chunks based on the start and end byte index we calculated previously, and constructed a temporary HTML form with the file name, chunk index and chunk data through another new feature in HTML5 named FormData. Then post this form to the backend server through jQuery.ajax. This is the key part of our solution. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 4: // start to upload each files in chunks 5: var files = $("#upload_files")[0].files; 6: for (var i = 0; i < files.length; i++) { 7: var file = files[i]; 8: var fileSize = file.size; 9: var fileName = file.name; 10: // calculate the start and end byte index for each blocks(chunks) 11: // with the index, file name and index list for future using 12: ... ... 
13: // define the function array and push all chunk upload operation into this array 14: blocks.forEach(function (block) { 15: putBlocks.push(function (callback) { 16: // load blob based on the start and end index for each chunks 17: var blob = file.slice(block.start, block.end); 18: // put the file name, index and blob into a temporary from 19: var fd = new FormData(); 20: fd.append("name", block.name); 21: fd.append("index", block.index); 22: fd.append("file", blob); 23: // post the form to backend service (asp.net mvc controller action) 24: $.ajax({ 25: url: "/Home/UploadInFormData", 26: data: fd, 27: processData: false, 28: contentType: "multipart/form-data", 29: type: "POST", 30: success: function (result) { 31: if (!result.success) { 32: alert(result.error); 33: } 34: callback(null, block.index); 35: } 36: }); 37: }); 38: }); 39: } 40: }); Then we will invoke these functions one by one by using the async.js. And once all functions had been executed successfully I invoked another ajax call to the backend service to commit all these chunks (blocks) as the blob in Windows Azure Storage. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 4: // start to upload each files in chunks 5: var files = $("#upload_files")[0].files; 6: for (var i = 0; i < files.length; i++) { 7: var file = files[i]; 8: var fileSize = file.size; 9: var fileName = file.name; 10: // calculate the start and end byte index for each blocks(chunks) 11: // with the index, file name and index list for future using 12: ... ... 13: // define the function array and push all chunk upload operation into this array 14: ... ... 15: // invoke the functions one by one 16: // then invoke the commit ajax call to put blocks into blob in azure storage 17: async.series(putBlocks, function (error, result) { 18: var data = { 19: name: fileName, 20: list: list 21: }; 22: $.post("/Home/Commit", data, function (result) { 23: if (!result.success) { 24: alert(result.error); 25: } 26: else { 27: alert("done!"); 28: } 29: }); 30: }); 31: } 32: }); That’s all in the client side. The outline of our logic would be - Calculate the start and end byte index for each chunks based on the block size. - Defined the functions of reading the chunk form file and upload the content to the backend service through ajax. - Execute the functions defined in previous step with “async.js”. - Commit the chunks by invoking the backend service in Windows Azure Storage finally.   Save Chunks as Blocks into Blob Storage In above we finished the client size JavaScript code. It uploaded the file in chunks to the backend service which we are going to implement in this step. We will use ASP.NET MVC as our backend service, and it will receive the chunks, upload into Windows Azure Bob Storage in blocks, then finally commit as one blob. As in the client side we uploaded chunks by invoking the ajax call to the URL "/Home/UploadInFormData", I created a new action under the Index controller and it only accepts HTTP POST request. 1: [HttpPost] 2: public JsonResult UploadInFormData() 3: { 4: var error = string.Empty; 5: try 6: { 7: } 8: catch (Exception e) 9: { 10: error = e.ToString(); 11: } 12:  13: return new JsonResult() 14: { 15: Data = new 16: { 17: success = string.IsNullOrWhiteSpace(error), 18: error = error 19: } 20: }; 21: } Then I retrieved the file name, index and the chunk content from the Request.Form object, which was passed from our client side. 
And then, used the Windows Azure SDK to create a blob container (in this case we will use the container named “test”.) and create a blob reference with the blob name (same as the file name). Then uploaded the chunk as a block of this blob with the index, since in Blob Storage each block must have an index (ID) associated with so that finally we can put all blocks as one blob by specifying their block ID list. 1: [HttpPost] 2: public JsonResult UploadInFormData() 3: { 4: var error = string.Empty; 5: try 6: { 7: var name = Request.Form["name"]; 8: var index = int.Parse(Request.Form["index"]); 9: var file = Request.Files[0]; 10: var id = Convert.ToBase64String(BitConverter.GetBytes(index)); 11:  12: var container = _client.GetContainerReference("test"); 13: container.CreateIfNotExists(); 14: var blob = container.GetBlockBlobReference(name); 15: blob.PutBlock(id, file.InputStream, null); 16: } 17: catch (Exception e) 18: { 19: error = e.ToString(); 20: } 21:  22: return new JsonResult() 23: { 24: Data = new 25: { 26: success = string.IsNullOrWhiteSpace(error), 27: error = error 28: } 29: }; 30: } Next, I created another action to commit the blocks into blob once all chunks had been uploaded. Similarly, I retrieved the blob name from the Request.Form. I also retrieved the chunks ID list, which is the block ID list from the Request.Form in a string format, split them as a list, then invoked the BlockBlob.PutBlockList method. After that our blob will be shown in the container and ready to be download. 1: [HttpPost] 2: public JsonResult Commit() 3: { 4: var error = string.Empty; 5: try 6: { 7: var name = Request.Form["name"]; 8: var list = Request.Form["list"]; 9: var ids = list 10: .Split(',') 11: .Where(id => !string.IsNullOrWhiteSpace(id)) 12: .Select(id => Convert.ToBase64String(BitConverter.GetBytes(int.Parse(id)))) 13: .ToArray(); 14:  15: var container = _client.GetContainerReference("test"); 16: container.CreateIfNotExists(); 17: var blob = container.GetBlockBlobReference(name); 18: blob.PutBlockList(ids); 19: } 20: catch (Exception e) 21: { 22: error = e.ToString(); 23: } 24:  25: return new JsonResult() 26: { 27: Data = new 28: { 29: success = string.IsNullOrWhiteSpace(error), 30: error = error 31: } 32: }; 33: } Now we finished all code we need. The whole process of uploading would be like this below. Below is the full client side JavaScript code. 
1: <script type="text/javascript" src="~/Scripts/async.js"></script> 2: <script type="text/javascript"> 3: $(function () { 4: $("#upload_button_blob").click(function () { 5: // assert the browser support html5 6: if (window.File && window.Blob && window.FormData) { 7: alert("Your brwoser is awesome, let's rock!"); 8: } 9: else { 10: alert("Oh man plz update to a modern browser before try is cool stuff out."); 11: return; 12: } 13:  14: // start to upload each files in chunks 15: var files = $("#upload_files")[0].files; 16: for (var i = 0; i < files.length; i++) { 17: var file = files[i]; 18: var fileSize = file.size; 19: var fileName = file.name; 20:  21: // calculate the start and end byte index for each blocks(chunks) 22: // with the index, file name and index list for future using 23: var blockSizeInKB = $("#block_size").val(); 24: var blockSize = blockSizeInKB * 1024; 25: var blocks = []; 26: var offset = 0; 27: var index = 0; 28: var list = ""; 29: while (offset < fileSize) { 30: var start = offset; 31: var end = Math.min(offset + blockSize, fileSize); 32:  33: blocks.push({ 34: name: fileName, 35: index: index, 36: start: start, 37: end: end 38: }); 39: list += index + ","; 40:  41: offset = end; 42: index++; 43: } 44:  45: // define the function array and push all chunk upload operation into this array 46: var putBlocks = []; 47: blocks.forEach(function (block) { 48: putBlocks.push(function (callback) { 49: // load blob based on the start and end index for each chunks 50: var blob = file.slice(block.start, block.end); 51: // put the file name, index and blob into a temporary from 52: var fd = new FormData(); 53: fd.append("name", block.name); 54: fd.append("index", block.index); 55: fd.append("file", blob); 56: // post the form to backend service (asp.net mvc controller action) 57: $.ajax({ 58: url: "/Home/UploadInFormData", 59: data: fd, 60: processData: false, 61: contentType: "multipart/form-data", 62: type: "POST", 63: success: function (result) { 64: if (!result.success) { 65: alert(result.error); 66: } 67: callback(null, block.index); 68: } 69: }); 70: }); 71: }); 72:  73: // invoke the functions one by one 74: // then invoke the commit ajax call to put blocks into blob in azure storage 75: async.series(putBlocks, function (error, result) { 76: var data = { 77: name: fileName, 78: list: list 79: }; 80: $.post("/Home/Commit", data, function (result) { 81: if (!result.success) { 82: alert(result.error); 83: } 84: else { 85: alert("done!"); 86: } 87: }); 88: }); 89: } 90: }); 91: }); 92: </script> And below is the full ASP.NET MVC controller code. 
1: public class HomeController : Controller 2: { 3: private CloudStorageAccount _account; 4: private CloudBlobClient _client; 5:  6: public HomeController() 7: : base() 8: { 9: _account = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("DataConnectionString")); 10: _client = _account.CreateCloudBlobClient(); 11: } 12:  13: public ActionResult Index() 14: { 15: ViewBag.Message = "Modify this template to jump-start your ASP.NET MVC application."; 16:  17: return View(); 18: } 19:  20: [HttpPost] 21: public JsonResult UploadInFormData() 22: { 23: var error = string.Empty; 24: try 25: { 26: var name = Request.Form["name"]; 27: var index = int.Parse(Request.Form["index"]); 28: var file = Request.Files[0]; 29: var id = Convert.ToBase64String(BitConverter.GetBytes(index)); 30:  31: var container = _client.GetContainerReference("test"); 32: container.CreateIfNotExists(); 33: var blob = container.GetBlockBlobReference(name); 34: blob.PutBlock(id, file.InputStream, null); 35: } 36: catch (Exception e) 37: { 38: error = e.ToString(); 39: } 40:  41: return new JsonResult() 42: { 43: Data = new 44: { 45: success = string.IsNullOrWhiteSpace(error), 46: error = error 47: } 48: }; 49: } 50:  51: [HttpPost] 52: public JsonResult Commit() 53: { 54: var error = string.Empty; 55: try 56: { 57: var name = Request.Form["name"]; 58: var list = Request.Form["list"]; 59: var ids = list 60: .Split(',') 61: .Where(id => !string.IsNullOrWhiteSpace(id)) 62: .Select(id => Convert.ToBase64String(BitConverter.GetBytes(int.Parse(id)))) 63: .ToArray(); 64:  65: var container = _client.GetContainerReference("test"); 66: container.CreateIfNotExists(); 67: var blob = container.GetBlockBlobReference(name); 68: blob.PutBlockList(ids); 69: } 70: catch (Exception e) 71: { 72: error = e.ToString(); 73: } 74:  75: return new JsonResult() 76: { 77: Data = new 78: { 79: success = string.IsNullOrWhiteSpace(error), 80: error = error 81: } 82: }; 83: } 84: } And if we selected a file from the browser we will see our application will upload chunks in the size we specified to the server through ajax call in background, and then commit all chunks in one blob. Then we can find the blob in our Windows Azure Blob Storage.   Optimized by Parallel Upload In previous example we just uploaded our file in chunks. This solved the problem that ASP.NET MVC request content size limitation as well as the Windows Azure load balancer timeout. But it might introduce the performance problem since we uploaded chunks in sequence. In order to improve the upload performance we could modify our client side code a bit to make the upload operation invoked in parallel. The good news is that, “async.js” library provides the parallel execution function. If you remembered the code we invoke the service to upload chunks, it utilized “async.series” which means all functions will be executed in sequence. Now we will change this code to “async.parallel”. This will invoke all functions in parallel. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 4: // start to upload each files in chunks 5: var files = $("#upload_files")[0].files; 6: for (var i = 0; i < files.length; i++) { 7: var file = files[i]; 8: var fileSize = file.size; 9: var fileName = file.name; 10: // calculate the start and end byte index for each blocks(chunks) 11: // with the index, file name and index list for future using 12: ... ... 13: // define the function array and push all chunk upload operation into this array 14: ... ... 
15: // invoke the functions one by one 16: // then invoke the commit ajax call to put blocks into blob in azure storage 17: async.parallel(putBlocks, function (error, result) { 18: var data = { 19: name: fileName, 20: list: list 21: }; 22: $.post("/Home/Commit", data, function (result) { 23: if (!result.success) { 24: alert(result.error); 25: } 26: else { 27: alert("done!"); 28: } 29: }); 30: }); 31: } 32: }); In this way all chunks will be uploaded to the server side at the same time to maximize the bandwidth usage. This should work if the file was not very large and the chunk size was not very small. But for large file this might introduce another problem that too many ajax calls are sent to the server at the same time. So the best solution should be, upload the chunks in parallel with maximum concurrency limitation. The code below specified the concurrency limitation to 4, which means at the most only 4 ajax calls could be invoked at the same time. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 4: // start to upload each files in chunks 5: var files = $("#upload_files")[0].files; 6: for (var i = 0; i < files.length; i++) { 7: var file = files[i]; 8: var fileSize = file.size; 9: var fileName = file.name; 10: // calculate the start and end byte index for each blocks(chunks) 11: // with the index, file name and index list for future using 12: ... ... 13: // define the function array and push all chunk upload operation into this array 14: ... ... 15: // invoke the functions one by one 16: // then invoke the commit ajax call to put blocks into blob in azure storage 17: async.parallelLimit(putBlocks, 4, function (error, result) { 18: var data = { 19: name: fileName, 20: list: list 21: }; 22: $.post("/Home/Commit", data, function (result) { 23: if (!result.success) { 24: alert(result.error); 25: } 26: else { 27: alert("done!"); 28: } 29: }); 30: }); 31: } 32: });   Summary In this post we discussed how to upload files in chunks to the backend service and then upload them into Windows Azure Blob Storage in blocks. We focused on the frontend side and leverage three new feature introduced in HTML 5 which are - File.slice: Read part of the file by specifying the start and end byte index. - Blob: File-like interface which contains the part of the file content. - FormData: Temporary form element that we can pass the chunk alone with some metadata to the backend service. Then we discussed the performance consideration of chunk uploading. Sequence upload cannot provide maximized upload speed, but the unlimited parallel upload might crash the browser and server if too many chunks. So we finally came up with the solution to upload chunks in parallel with the concurrency limitation. We also demonstrated how to utilize “async.js” JavaScript library to help us control the asynchronize call and the parallel limitation.   Regarding the chunk size and the parallel limitation value there is no “best” value. You need to test vary composition and find out the best one for your particular scenario. It depends on the local bandwidth, client machine cores and the server side (Windows Azure Cloud Service Virtual Machine) cores, memory and bandwidth. Below is one of my performance test result. The client machine was Windows 8 IE 10 with 4 cores. I was using Microsoft Cooperation Network. The web site was hosted on Windows Azure China North data center (in Beijing) with one small web role (1.7GB 1 core CPU, 1.75GB memory with 100Mbps bandwidth). 
The test cases were:
    - Chunk size: 512KB, 1MB, 2MB, 4MB.
    - Upload mode: sequential, parallel (unlimited), parallel with limit (4 threads, 8 threads).
    - Chunk format: base64 string, binary.
    - Target file: 100MB.
    - Each case was tested 3 times.
    Below is the test result chart. Some thoughts, but not guidance or best practice:
    - Parallel gets better performance than sequential.
    - There is no significant performance difference between parallel with 4 threads and with 8 threads.
    - Transferring binary content gives better performance than base64.
    - In all cases, a chunk size of 1MB - 2MB gives the best performance.
    Hope this helps, Shaun. All documents and related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • Trying to get a json result back from method in another namespace, having issues

    - by Blankman
    I have a separate .js file and namespace for JSON requests, and another .js file and namespace for the actual logic. I can't seem to get the result back in my logic layer.

        var jsonResult = Blah.Data.LoadAggregates();
        alert(jsonResult);
        alert(jsonResult.d.length);
        alert(jsonResult.length);

    All of the above calls return undefined.

        Blah.RegisterNamespace("Blah.Data");
        (function(Data) {
            Data.LoadAggregates = function() {
                $.ajax({
                    type: "POST",
                    url: "asdf.asmx/GetAggregates",
                    data: "{}",
                    contentType: "application/json; charset=utf-8",
                    dataType: "json",
                    success: function(data) {
                        ???????
                    },
                    error: function(msg) {
                        alert("error" + msg);
                    }
                });
            };
        })(Blah.Data);

    Read the article

  • JqGrid doesn't work in ASP.NET MVC2

    - by Raouf
    I have a project in ASP.NET MVC1 using VB.NET controllers and jqGrid, and it works fine under MVC1. After migrating the project to ASP.NET MVC2, the grid is no longer populated. It seems that there are new restrictions on returning a JsonResult in MVC2. How can this be solved in VB.NET? The controller function populating the jqGrid looks something like this:

        Function GetGridRecordset(ByVal qry As String) As JsonResult
            Dim result = New JsonResult()
            ...
            ...
            Return result
        End Function

    Does anyone have a solution?
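    A likely cause (an assumption, since the actual error isn't shown) is that ASP.NET MVC 2 rejects JSON responses to GET requests unless the result explicitly allows them, and jqGrid requests its data with GET by default. A minimal C# sketch of the change; the VB.NET version would set the same JsonRequestBehavior property on the JsonResult. The grid-shaped payload below is placeholder data.

        [HttpGet]
        public JsonResult GetGridRecordset(string qry)
        {
            var rows = new[] { new { id = 1, cell = new[] { "a", "b" } } }; // placeholder data
            return new JsonResult
            {
                Data = new { total = 1, page = 1, records = rows.Length, rows },
                // MVC 2 denies JSON over GET by default; without this the response is blocked.
                JsonRequestBehavior = JsonRequestBehavior.AllowGet
            };
        }

    Alternatively, configuring jqGrid with mtype: 'POST' means the default DenyGet behavior never applies.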

    Read the article

  • Why doesn't jquery validation plugin's remote attribute work for me?

    - by Pandiya Chendur
    I use the jQuery validation plugin, and the remote attribute works with emailId but not with mobileNo.

        var validator = $("#addform").validate({
            rules: {
                Name: "required",
                MobileNo: {
                    required: true,
                    minlength: 10,
                    remote: '<%=Url.Action("getClientMobNo", "Clients") %>'
                },
                Address: "required"
            },
            messages: {
                Name: "please provide a client name",
                MobileNo: {
                    required: "Please provide a mobile phone no",
                    rangelength: jQuery.format("Enter at least {0} characters"),
                    remote: jQuery.format("This MobileNo is already in use")
                },
                Address: "please provide client address"
            },

    A null value is passed to my controller action. Any suggestions?

        public JsonResult getClientMobNo(string mobno)
        {
            JsonResult result = new JsonResult();
            string status = clirep.getClientMobNo(Convert.ToInt64(mobno));
            if (status == "Mobile No already exists")
            {
                result.Data = false;
            }
            else
            {
                result.Data = true;
            }
            return result;
        }
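    The validation plugin's remote rule sends the field value in a query-string parameter named after the form field, so the request looks like getClientMobNo?MobileNo=..., and a parameter called mobno never binds. A minimal sketch of the action with the parameter renamed to match (clirep and its method are the asker's repository, kept as-is):

        public JsonResult getClientMobNo(string MobileNo)
        {
            var result = new JsonResult();
            // The parameter name now matches the key the plugin sends.
            string status = clirep.getClientMobNo(Convert.ToInt64(MobileNo));
            result.Data = (status != "Mobile No already exists");
            return result;
        }

    If this is MVC 2, the remote call is a GET, so the result may also need JsonRequestBehavior.AllowGet.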

    Read the article

  • render json data returned from mvc controller

    - by user1765862
    I have a JS function which calls an MVC controller action that returns a list of data as JSON.

        function FillCountryCities(countryId) {
            $.ajax({
                type: 'GET',
                url: '/User/FillCityCombo',
                data: { countryId: countryId },
                contentType: 'application/json',
                success: function (data) {
                    alert(data[0].Name);
                }
                error: function () {
                    alert('something bad happened');
                }
        ....

    The data sent back from the controller has the format Name (string) and Id (Guid). Now I just want to alert the Name of the first item in the collection on success. I double-checked that the controller sends 20 records, so it should alert the first item, but instead I get the error alert 'something bad happened'.

    Update:

        public JsonResult FillCityCombo(Guid countryId)
        {
            var cities = repository.GetData()
                .Where(x => x.Country.Id == countryId).ToList();
            if (Request.IsAjaxRequest())
            {
                return new JsonResult { Data = cities, JsonRequestBehavior = JsonRequestBehavior.AllowGet };
            }
            else
            {
                return new JsonResult { Data = "Not Valid Request", JsonRequestBehavior = JsonRequestBehavior.AllowGet };
            }
        }
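    One common reason the error handler fires here (an assumption, since the server-side error isn't shown) is that serializing the full city entities drags in navigation properties such as Country and trips the JSON serializer with a circular reference. Projecting to a flat shape before returning avoids that; a minimal sketch, with the Id and Name properties taken from the format described above:

        public JsonResult FillCityCombo(Guid countryId)
        {
            // Project to plain values so the serializer never touches navigation properties.
            var cities = repository.GetData()
                .Where(x => x.Country.Id == countryId)
                .Select(x => new { Id = x.Id, Name = x.Name })
                .ToList();
            return new JsonResult { Data = cities, JsonRequestBehavior = JsonRequestBehavior.AllowGet };
        }

    On the client side it is also worth removing the contentType option (this is a GET with no JSON body) and adding the missing comma between the success and error callbacks.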

    Read the article

  • Can a View be returned as a JSON object in ASP.Net MVC

    - by Chev
    I want to know if it is possible to return a view as a JSON object. In my controller I want to do something like the following:

        [AcceptVerbs("Post")]
        public JsonResult SomeActionMethod()
        {
            return new JsonResult { Data = new { success = true, view = PartialView("MyPartialView") } };
        }

    In the HTML:

        $.post($(this).attr('action'), $(this).serialize(), function(Data) {
            alert(Data.success);
            $("#test").replaceWith(Data.view);
        });

    Any feedback greatly appreciated.
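    A PartialViewResult placed inside the JSON payload serializes as an object description, not as markup, so the usual approach is to render the partial to a string first and put that string in the JSON. A sketch of a controller helper, assuming ASP.NET MVC 2 or later (the ViewContext constructor that takes a TextWriter); StringWriter is from System.IO:

        protected string RenderPartialViewToString(string viewName, object model)
        {
            ViewData.Model = model;
            using (var sw = new StringWriter())
            {
                // Locate the partial view and render it into the string writer.
                ViewEngineResult viewResult = ViewEngines.Engines.FindPartialView(ControllerContext, viewName);
                var viewContext = new ViewContext(ControllerContext, viewResult.View, ViewData, TempData, sw);
                viewResult.View.Render(viewContext, sw);
                return sw.ToString();
            }
        }

        [AcceptVerbs("Post")]
        public JsonResult SomeActionMethod()
        {
            return Json(new { success = true, view = RenderPartialViewToString("MyPartialView", null) });
        }

    The client-side $("#test").replaceWith(Data.view) then receives real HTML rather than a serialized result object.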

    Read the article

  • How can I send a json string to view as a json object?

    - by yapiskan
    I have a JSON string in an action of an MVC controller. I want to send it to the view as a JSON object. How can I solve this?

        public JsonResult Json()
        {
            ... some code here ...
            string jsonString = "{\"Success\":true, \"Msg\":null}";
            return new JsonResult() { Data = jsonString, JsonRequestBehavior = JsonRequestBehavior.AllowGet };
        }
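    Passing an already-serialized string to JsonResult makes the framework JSON-encode it a second time, so the browser receives a quoted string rather than an object. Two sketches of ways around that (method names here are illustrative); ContentResult is the lighter option, and JavaScriptSerializer lives in System.Web.Script.Serialization:

        // Option 1: emit the string as-is with a JSON content type.
        public ActionResult Json()
        {
            string jsonString = "{\"Success\":true, \"Msg\":null}";
            return Content(jsonString, "application/json");
        }

        // Option 2: turn the string back into an object and let MVC serialize it once.
        public JsonResult JsonFromObject()
        {
            string jsonString = "{\"Success\":true, \"Msg\":null}";
            object data = new JavaScriptSerializer().DeserializeObject(jsonString);
            return Json(data, JsonRequestBehavior.AllowGet);
        }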

    Read the article

  • ASP.NET MVC 2 JQuery POST not displaying the model state errors

    - by Oshan
    Hello, I have been using asp.net mvc for a bit (but I'm still a beginer). I want to have the ability to update two views as a result of a jquery postback. Basically I have a list and a details view. The details view is presented using a jquery popup (using jquery-UI popup). I only want to update the list if the details save is successful (i.e. there are no validation errors on the details view). However, if there are any validation errros in the details view, I want to update the details view so that the user sees the validation errors. so I thought in my controller, I return a JsonResult instead of a View. [HttpPost] public ActionResult SavePersonInfo(Person p) { if(ModelState.Valid) { return View("PersonList"); } return Json({Error = true, View = PartialView("PersonDetails", p)}); } As you can see if there are no errors I return the person list view, but if there are any validation errors, I have return the details view. The reason that I'm returning a JsonResult is I need to tell my view there is an error so that the view (jquery) knows which section to update (as in whether to update the person list 'div' or the popup dialog 'div'). So, in my view, the jquery is as follows (please assume that there is a form for entering in the person details and "SubmitPersonForm();" function is called upon clicking on the "Save" button): <script type="text/javascript> $('#btnSave').click(function (event) { onBegin(); $.ajax( { type: "POST", url: "/Person/Save", data: $('form').serialize(), success: function (result) { if(result.Error) { $('#dvDetails').html($(result).View)); } else { $('#dvPersonList').html($result); } } }); }); </script> So the problem that I have now, is that when there is a validation error, I do see the correct, 'div' being updated, but I lose the asp.net mvc validation messages. I do not see any validation errors in red, as if ASP.NET MVC is completely ignored them. However, my ModelState does have those errros, just not displayed in the details view. I do have valication summary and Html.ValidationFor(m = ...) statements put in my details view. Could someone tell me why I'm not seeing the validation errors? although I'm using a JSonResult, I do use the right property which is a valid view when I render the 'dvDetails'. Am I doing something that I'm not suppose to in asp.net mvc? Btw I'm using asp.net mvc2 RC with Visual Studio 2010 RC. Thank you.

    Read the article

  • First Test Crashes using MSTEST with ASP.NET MVC 1

    - by Trey Carroll
    I'm trying to start using unit testing and I want to test the following controller:

        public class AjaxController : Controller
        {
            ...
            public JsonResult RateVideo(int userRating, long videoId)
            {
                string userName = User.Identity.Name;
                ...
            }
        }

    I have created a TestClass with the following method:

        [TestMethod]
        public void TestRateVideo()
        {
            //Arrange
            AjaxController c = new AjaxController();

            //Act
            JsonResult jr = c.RateVideo(1, 1);

            //Assert
            //Not implemented yet
        }

    I select debug and run the test. When the code reaches the first statement, string userName = User.Identity.Name;, debugging stops and I am presented with a message that says the test failed. Any guidance you can offer would be appreciated.
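    The crash almost certainly comes from User being null when the controller is created outside the MVC pipeline, so User.Identity.Name throws. A sketch of supplying a fake identity through the ControllerContext, assuming the Moq mocking library (any mock framework or a hand-rolled HttpContextBase stub works the same way); GenericIdentity and GenericPrincipal are from System.Security.Principal, RouteData from System.Web.Routing:

        [TestMethod]
        public void TestRateVideo()
        {
            // Arrange: fake an authenticated user so User.Identity.Name has a value.
            var identity = new GenericIdentity("TestUser");
            var principal = new GenericPrincipal(identity, new string[0]);

            var httpContext = new Mock<HttpContextBase>();
            httpContext.Setup(c => c.User).Returns(principal);

            var controller = new AjaxController();
            controller.ControllerContext =
                new ControllerContext(httpContext.Object, new RouteData(), controller);

            // Act
            JsonResult jr = controller.RateVideo(1, 1);

            // Assert
            // Not implemented yet
        }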

    Read the article

  • jQuery Form, ASP.NET MVC JSon Result

    - by Stacey
    I'm trying to return a json result from a jQuery Form instance - but it keeps prompting me to download a file instead of displaying it in the window like it is supposed to... $("#ajaxImageForm").ajaxForm({ iframe: true, type: "GET", dataType: "json", beforeSubmit: function() { $("#ajaxImageForm").block({ message: '<img src="/content/images/loader.gif" /> Uploading . . .' }); }, success: function(result) { $("#ajaxImageForm").unblock(); $.growlUI(null, result.message); } }); [AcceptVerbs(HttpVerbs.Post)] public JsonResult Edit(FormCollection collection) { // return Json to the jQuery Form Result return new JsonResult { Data = new { message = string.Format("edited successfully.") } }; }
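    Because the plugin uploads through a hidden iframe (iframe: true), the response comes back with an application/json content type, which the browser treats as a file to download rather than something the iframe can display. A common workaround is to send the JSON back as text/html so the iframe can read it; a minimal sketch of the change:

        [AcceptVerbs(HttpVerbs.Post)]
        public JsonResult Edit(FormCollection collection)
        {
            // "text/html" keeps the browser from prompting to download the iframe response.
            return Json(new { message = "edited successfully." }, "text/html");
        }

    Some setups also wrap the JSON in a textarea element, which the form plugin knows how to unwrap.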

    Read the article

  • How to update strongly typed Html.DropDownList using Jquery

    - by Remnant
    I have a webpage with two radiobuttons and a dropdownlist as follows: <div class="sectionheader">Course <div class="dropdown"><%=Html.DropDownList("CourseSelection", Model.CourseList, new { @class = "dropdown" })%> </div> <div class="radiobuttons"><label><%=Html.RadioButton("CourseType", "Advanced", false )%> Advanced </label></div> <div class="radiobuttons"><label><%=Html.RadioButton("CourseType", "Beginner", true )%> Beginner </label></div> </div> The dropdownlist is strongly typed and populated with Model.CourseList (NB - on the first page load, 'Beginner' is the default selection and the dropdown shows the beginner course options accordingly) What I want to be able to do is to update the DropDownList based on which radiobutton is selected i.e. if 'Advanced' selected then show one list of course options in dropdown, and if 'Beginner' selected then show another list of courses. The code I would like to call sits within my Controller: public JsonResult UpdateDropDown(string courseType) { IDropDownList dropdownlistRepository = new DropDownListRepository(); IEnumerable<SelectListItem> courseList = dropdownlistRepository.GetCourseList(courseType); return Json(courseList); } Edit - Updated below to show latest position Using examples provided in jQuery in Action, I now have the following jQuery code: $('.radiobuttons input:radio').click(function() { var courseType = $(this).val(); //Get selected courseType from radiobutton var dropdownList = $(".dropdown"); //Ref for dropdownlist $.getJSON("/ByCourse/UpdateDropDown", { courseType: courseType }, function(data) { $(dropdownList).loadSelect(data); }); }); The loadSelect function is taken straight from the book and is as follows: (function($) { $.fn.emptySelect = function() { return this.each(function() { if (this.tagName == 'SELECT') this.options.length = 0; }); } $.fn.loadSelect = function(optionsDataArray) { return this.emptySelect().each(function() { if (this.tagName == 'SELECT') { var selectElement = this; $.each(optionsDataArray, function(index, optionData) { var option = new Option(optionData.Text, optionData.Value); if ($.browser.msie) { selectElement.add(option); } else { selectElement.add(option, null); } }); } }); } })(jQuery); 1 day+ later I still cannot get this to work. Assuming the jQuery code is correct then I can only think that the issue is with retrieving the actual data with $getJSON. I have verified that JsonResult UpdateDropDown does actually retrieve valid data. What am I missing? Assembly reference? (NB: I have MicrosoftAjax.js and MicrosoftMvcAjax.js in my head tags of the master page Should JsonResult be ActionResult? (I have seen both used in samples on web) Do I need to register route Controller/UpdateDropDown in Global.asax? Any further guidance would be appreciated.

    Read the article

  • Is this a valid jquery getJSON call?

    - by Pandiya Chendur
    I am using jQuery getJSON with an ASP.NET MVC controller and I can't get it to work.

        public JsonResult GetMaterials(int currentPage, int pageSize)
        {
            var materials = consRepository.FindAllMaterials().AsQueryable();
            var results = new PagedList<MaterialsObj>(materials, currentPage - 1, pageSize);
            return Json(results);
        }

    I am calling this with:

        $.getJSON('Materials/GetMaterials', "{'currentPage':1,'pageSize':5}", function(data) {
        });

    This call doesn't seem to work. When I inspected it in Firebug I found this:

        The parameters dictionary contains a null entry for parameter 'currentPage' of non-nullable type 'System.Int32' for method 'System.Web.Mvc.JsonResult GetMaterials(Int32, Int32)' in 'CrMVC.Controllers.MaterialsController'. To make a parameter optional its type should be either a reference type or a Nullable type.
        Parameter name: parameters
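    The error message points at a server-side fix: the action parameters are non-nullable ints, so when model binding finds nothing it throws instead of defaulting. A sketch that makes the action tolerant; the fallback values of 1 and 5 are assumptions, and consRepository, PagedList and MaterialsObj are the asker's types kept as-is:

        public JsonResult GetMaterials(int? currentPage, int? pageSize)
        {
            // Fall back to defaults when the query string omits the values.
            int page = currentPage ?? 1;
            int size = pageSize ?? 5;

            var materials = consRepository.FindAllMaterials().AsQueryable();
            var results = new PagedList<MaterialsObj>(materials, page - 1, size);
            return Json(results);
        }

    The client-side half of the problem is that the data argument is a string; $.getJSON wants a plain object such as { currentPage: 1, pageSize: 5 } so the values actually reach the query string.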

    Read the article

  • "dynamic" keyword and JSON data

    - by Peter Perhác
    An action method in my ASP.NET MVC2 application returns a JsonResult object and in my unit test I would like to check that the returned JSON object indeed contains the expected values. I tried this: 1. dynamic json = ((JsonResult)myActionResult).Data; 2. Assert.AreEqual(JsonMessagesHelper.ErrorLevel.ERROR.ToString(), json.ErrorLevel); But I get a RuntimeBinderException "'object' does not contain a definition for 'ErrorLevel'". However, when I place a breakpoint on line 2 and inspect the json dynamic variable (see picture below), it obviously does contain the ErrorLevel string and it has the expected value, so if the runtime binder wasn't playing funny the test would pass. What am I not getting? What am I doing wrong and how can I fix this? How can I make the assertion pass?
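    Anonymous types are compiled as internal to the web project's assembly, so the C# runtime binder in a separate test assembly cannot see their members; that is what produces "'object' does not contain a definition for 'ErrorLevel'" even though the debugger shows the value. Two sketches of ways around it, assuming the test and web projects are separate assemblies (the test assembly name below is a placeholder); InternalsVisibleTo lives in System.Runtime.CompilerServices:

        // Option 1: in the web project's AssemblyInfo.cs, expose internals to the test assembly.
        [assembly: InternalsVisibleTo("MyApp.Tests")]

        // Option 2: read the property reflectively instead of through dynamic.
        object data = ((JsonResult)myActionResult).Data;
        string errorLevel = (string)data.GetType().GetProperty("ErrorLevel").GetValue(data, null);
        Assert.AreEqual(JsonMessagesHelper.ErrorLevel.ERROR.ToString(), errorLevel);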

    Read the article

  • Simple getJSON does not work...

    - by user54197
    JSON function(Index) does not fire. Any Ideas? <script type="text/javascript"> $(document).ready(function() { alert("This alert is displayed :("); $("form[action$='GetQuote']").submit(function() { $.getJSON($(this).attr("action"), $(this).serialize(), function(Result) { alert("This alert is not shown :("); $("#name").html(Result.name); $("#address").html(Result.address); }); return false; }); }); </script> CONTROLLERS... public JsonResult GetQuote(string dataName) { if (dataName != "" || dataName != null) return new JsonResult { Data = new Result { name = "Hello", address = "World" } }; else return null; }
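    If this project is ASP.NET MVC 2 (an assumption, since the excerpt doesn't say), the most likely reason the callback never runs is that JSON responses to GET requests are denied by default, so the $.getJSON request fails silently and success is never called. A sketch of the controller change, keeping the asker's Result type:

        public JsonResult GetQuote(string dataName)
        {
            if (!string.IsNullOrEmpty(dataName))
                return new JsonResult
                {
                    Data = new Result { name = "Hello", address = "World" },
                    // Required for GET requests in MVC 2; otherwise the action throws.
                    JsonRequestBehavior = JsonRequestBehavior.AllowGet
                };
            return null;
        }

    Note also that the original condition dataName != "" || dataName != null is always true; && is probably what was meant.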

    Read the article

  • C# DataTable to Json?

    - by AliRiza Adiyahsi
    I want to get DataTable as Json Format to show it on a chart. public JsonResult GetDataTable() { DataTable dt = new DataTable(); dt.Columns.Add("Jan"); dt.Columns.Add("Feb"); dt.Columns.Add("Mar"); dt.Columns.Add("Apr"); for (int i = 0; i < 10; i++) { dt.Rows.Add(i * 5, i * 10, i * 15, i * 11); } // JsonDataTable = dt to Json return new JsonResult { Data = new { success = true, chartData = JsonDataTable }, JsonRequestBehavior = JsonRequestBehavior.AllowGet }; } How Can I convert DataTable to Json? Thanks.
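    The built-in JavaScriptSerializer typically fails to serialize a DataTable directly, so the usual trick is to copy the rows into a list of dictionaries first; that shape serializes cleanly into an array of objects keyed by column name. A minimal sketch under that assumption:

        public JsonResult GetDataTable()
        {
            DataTable dt = new DataTable();
            dt.Columns.Add("Jan");
            dt.Columns.Add("Feb");
            dt.Columns.Add("Mar");
            dt.Columns.Add("Apr");
            for (int i = 0; i < 10; i++)
                dt.Rows.Add(i * 5, i * 10, i * 15, i * 11);

            // Flatten the DataTable into serializer-friendly dictionaries.
            var rows = new List<Dictionary<string, object>>();
            foreach (DataRow dr in dt.Rows)
            {
                var row = new Dictionary<string, object>();
                foreach (DataColumn col in dt.Columns)
                    row[col.ColumnName] = dr[col];
                rows.Add(row);
            }

            return new JsonResult
            {
                Data = new { success = true, chartData = rows },
                JsonRequestBehavior = JsonRequestBehavior.AllowGet
            };
        }

    Json.NET's JsonConvert.SerializeObject(dt) is another option if adding a library is acceptable.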

    Read the article

  • jQuery FullCalendar JSON date issue

    - by durilai
    I am integrating the jQuery plugin FullCalendar; overall it has been really straightforward. However, I have run into a problem with adding events to the calendar. I am using ASP.NET MVC 1.0 and have found and followed this post. I am returning JSON to FullCalendar and the events are getting bound, but they all show up as all-day events. I am formatting the dates in ISO8601 format.

    Calendar JavaScript:

        $('#calendar').fullCalendar({
            events: "/Calendar/GetEvents/"
        });

    JsonResult:

        public JsonResult GetEvents(double start, double end)
        {
            var fromDate = Utility.Dates.ConvertFromUnixTimestamp(start);
            var toDate = Utility.Dates.ConvertFromUnixTimestamp(end);
            List<GenericEventList> events = GETGENERICLISTOFEVENTS();
            return Json(events.ToArray());
        }

    JSON result value:

        [{"id":2,"title":"Test Event","start":"2010-03-14T11:00:00","end":"2010-03-14T16:00:00"},
         {"id":3,"title":"Test Event1asasas","start":"2010-03-14T10:00:00","end":"2010-03-14T14:00:00"},
         {"id":4,"title":"Test Event12","start":"2010-03-14T16:00:00","end":"2010-03-14T17:00:00"},
         {"id":6,"title":"Test Event1aaa","start":"2010-03-14T10:00:00","end":"2010-03-14T14:00:00"}]

    Any help is truly appreciated!
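    One common cause of every event rendering as all-day (an assumption, since the event-building code isn't shown) is that the serialized events never set the allDay flag, so FullCalendar falls back to its default. Explicitly marking timed events usually fixes it; a sketch of shaping the JSON on the server, with property names taken from the JSON output shown above:

        public JsonResult GetEvents(double start, double end)
        {
            var fromDate = Utility.Dates.ConvertFromUnixTimestamp(start);
            var toDate = Utility.Dates.ConvertFromUnixTimestamp(end);
            List<GenericEventList> events = GETGENERICLISTOFEVENTS();

            var payload = events.Select(e => new
            {
                id = e.id,
                title = e.title,
                start = e.start,   // already ISO8601 strings, as in the question
                end = e.end,
                allDay = false     // tell FullCalendar these are timed events
            }).ToArray();

            return Json(payload);
        }

    Setting allDayDefault: false in the fullCalendar() options is the client-side equivalent.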

    Read the article

  • Asp.net mvc json

    - by user310657
    Hi, I am working on an MVC project and having a problem with JSON. I have created a demo project with a list of colors:

        public JsonResult GetResult()
        {
            List<string> strList = new List<string>();
            strList.Add("white");
            strList.Add("blue");
            strList.Add("black");
            strList.Add("red");
            strList.Add("orange");
            strList.Add("green");
            return this.Json(strList);
        }

    I am able to get these on my page, but when I try to delete one color I send the following using jQuery:

        function deleteItem(item) {
            $.ajax({
                type: "POST",
                url: "/Home/Delete/white",
                data: "{}",
                contentType: "application/json; charset=utf-8",
                success: ajaxCallSucceed,
                dataType: "json",
                failure: ajaxCallFailed
            });
        }

    The controller action:

        public JsonResult Delete(string Color) {}

    Color always comes back null, even though I have specified "/Home/Delete/white" in the URL. I know I am doing something wrong or missing something, but I am not able to find out what. Please can anyone guide me in the right direction.
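    With the default route {controller}/{action}/{id}, the trailing segment of /Home/Delete/white binds to a parameter named id, so a parameter named Color never gets a value. Renaming the parameter (or adding a route that uses {Color}) is the simplest fix; a sketch, where RemoveColor is a hypothetical helper standing in for the actual delete logic:

        [HttpPost]
        public JsonResult Delete(string id)
        {
            // "white" arrives here because the default route calls the third URL segment "id".
            bool removed = RemoveColor(id);   // hypothetical helper
            return Json(removed);
        }

    Alternatively, keep the Color name, drop the JSON contentType, and pass data: { Color: 'white' } so the value goes as a normal form field the model binder can find by name.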

    Read the article

  • ASP.Net MVC Json Result: Parameters passed to controller method issue

    - by Moskie
    I'm having a problem getting a controller method that returns a JsonResult to accept parameters passed via the jQuery getJSON method. The code I'm working on works fine when the second parameter ("data") of the getJSON method call is null, but when I attempt to pass in a value there, it seems as if the controller method never even gets called. In this example case, I just want to use an integer. The getJSON call that works fine looks like this:

        $.getJSON("/News/ListNewsJson/", null, ListNews_OnReturn);

    The controller method is like this:

        public JsonResult ListNewsJson(int? id)
        {
            ...
            return Json(toReturn);
        }

    By putting a breakpoint in the ListNewsJson method, I see that this method gets called when the data parameter of getJSON is null, but when I replace it with a value, such as, say, 3:

        $.getJSON("/News/ListNewsJson/", 3, ListNews_OnReturn);

    ...the controller method/breakpoint is never hit. Any idea what I'm doing wrong? I should also mention that the controller method works fine if I manually go to the address via my browser ("/News/ListNewsJson/3").

    Read the article

  • Exception Handling in ASP.NET MVC and Ajax - [HandleException] filter

    - by Graham
    All, I'm learning MVC and using it for a business app (MVC 1.0). I'm really struggling to get my head around exception handling. I've spent a lot of time on the web but not found anything along the lines of what I'm after. We currently use a filter attribute that implements IExceptionFilter. We decorate a base controller class with this so all server side exceptions are nicely routed to an exception page that displays the error and performs logging. I've started to use AJAX calls that return JSON data but when the server side implementation throws an error, the filter is fired but the page does not redirect to the Error page - it just stays on the page that called the AJAX method. Is there any way to force the redirect on the server (e.g. a ASP.NET Server.Transfer or redirect?) I've read that I must return a JSON object (wrapping the .NET Exception) and then redirect on the client, but then I can't guarantee the client will redirect... but then (although I'm probably doing something wrong) the server attempts to redirect but then gets an unauthorised exception (the base controller is secured but the Exception controller is not as it does not inherit from this) Has anybody please got a simple example (.NET and jQuery code). I feel like I'm randomly trying things in the hope it will work Exception Filter so far... public class HandleExceptionAttribute : FilterAttribute, IExceptionFilter { #region IExceptionFilter Members public void OnException(ExceptionContext filterContext) { if (filterContext.ExceptionHandled) { return; } filterContext.Controller.TempData[CommonLookup.ExceptionObject] = filterContext.Exception; if (filterContext.HttpContext.Request.IsAjaxRequest()) { filterContext.Result = AjaxException(filterContext.Exception.Message, filterContext); } else { //Redirect to global handler filterContext.Result = new RedirectToRouteResult(new RouteValueDictionary(new { controller = AvailableControllers.Exception, action = AvailableActions.HandleException })); filterContext.ExceptionHandled = true; filterContext.HttpContext.Response.Clear(); } } #endregion private JsonResult AjaxException(string message, ExceptionContext filterContext) { if (string.IsNullOrEmpty(message)) { message = "Server error"; //TODO: Replace with better message } filterContext.HttpContext.Response.StatusCode = (int)HttpStatusCode.InternalServerError; filterContext.HttpContext.Response.TrySkipIisCustomErrors = true; //Needed for IIS7.0 return new JsonResult { Data = new { ErrorMessage = message }, ContentEncoding = Encoding.UTF8, }; } }

    Read the article

  • AjaxFileUpload return download panel when data is json

    - by Tr.Crab
    I use AjaxFileUpload (http://www.phpletter.com/Our-Projects/AjaxFileUpload/) to upload a file and get a JSON-typed response in Struts2 (code.google.struts2jsonresult.JSONResult), but the browser always pops up the download panel. Please give me some suggestions; thanks in advance. Here is my config in struts.xml:

        ......
        <result-type name="json" class="code.google.struts2jsonresult.JSONResult">
        ............
        <action name="doGetList" method="doGetList" class="main.java.GetListAction">
            <result type="json">
                <param name="target">jsonObject</param>
                <param name="deepSerialize">true</param>
                <param name="patterns"> -*.class</param>
            </result>
        </action>

    And the JS client:

        function ajaxFileUpload(){
            $("#loading").ajaxStart(function(){
                $(this).show();
            }).ajaxComplete(function(){
                $(this).hide();
            });
            $.ajaxFileUpload(
                {
                    url:'doGetList.do',
                    secureuri:false,
                    fileElementId:'uploadfile',
                    dataType: 'json',
                    success: function (data, status) {
                        if(typeof(data.error) != 'undefined') {
                            if(data.error != '') {
                                alert(data.error);
                            } else {
                                alert(data.msg);
                            }
                        }
                    },
                    error: function (data, status, e) {
                        alert(e);
                        alert(data.records);
                    }
                }
            )
            return false;
        }

    Read the article

  • Circular Reference exception with JSON Serialisation with MVC3 and EF4 CTP5w

    - by nakchak
    Hi, I'm having problems with a circular reference when I try to serialise an object returned via EF4 CTP5. I'm using the code-first approach and simple POCOs for my model. I have added [ScriptIgnore] attributes to any properties that provide a back-reference to another object, and annoyingly everything seems to work fine if I manually instantiate the POCOs, i.e. they serialise to JSON fine and the ScriptIgnore attribute is honoured. However, when I try to serialise an object returned from the DAL I get the circular reference exception "A circular reference was detected while serializing an object of type 'System.Data.Entity.DynamicProxies.xxxx'". I have tried several ways of retrieving the data but they all get stuck with this error:

        public JsonResult GetTimeSlot(int id)
        {
            TimeSlotDao tsDao = new TimeSlotDao();
            TimeSlot ts = tsDao.GetById(id);
            return Json(ts);
        }

    The method below works slightly better, as rather than the timeslot dynamic-proxied object causing the circular reference it is the Appointment object:

        public JsonResult GetTimeSlot(int id)
        {
            TimeSlotDao tsDao = new TimeSlotDao();
            var ts = from t in tsDao.GetQueryable()
                     where t.Id == id
                     select new { t.Id, t.StartTime, t.Available, t.Appointment };
            return Json(ts);
        }

    Any ideas or solutions to this problem?

    Update: I would prefer to use the out-of-the-box serialiser if possible, although Json.Net via NuGet is OK as an alternative; I would hope it's possible to use it as I intended as well.
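    The DynamicProxies name in the exception means EF handed back lazy-loading proxy subclasses rather than the plain POCOs, and the serializer can end up walking their navigation properties despite the [ScriptIgnore] attributes on the base classes. Turning proxy generation off for queries that feed JSON, or projecting to a flat shape, usually resolves it. A sketch assuming the CTP5 DbContext configuration API; tsDao.Context is a hypothetical hook for however the DAO exposes its DbContext:

        public JsonResult GetTimeSlot(int id)
        {
            TimeSlotDao tsDao = new TimeSlotDao();

            // Hypothetical hook: disable proxies before materializing entities to be serialized.
            tsDao.Context.Configuration.ProxyCreationEnabled = false;
            tsDao.Context.Configuration.LazyLoadingEnabled = false;

            TimeSlot ts = tsDao.GetById(id);
            return Json(ts);
        }

    Projecting to an anonymous type of plain values (select new { t.Id, t.StartTime, t.Available }) sidesteps the proxies entirely and is often the cleaner option for JSON endpoints.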

    Read the article

  • JQuery Ajax Get passing parameters

    - by George
    Hi, I am working on my first MVC application and am running into a bit of a problem. I have a data table and, when a row is clicked, I want to return the detail for that row. I have a function set up as:

        function rowClick(item) {
            $("#detailInfo").data("width.dialog", 800);
            $.ajax({
                type: "GET",
                contentType: "application/json; charset=utf-8",
                url: "<%= Url.Action("GetDetails", "WarningRecognition")%>",
                data: "",
                dataType: "json",
                success: function(data) {//do some stuff...and show results}
            }

    The problem I am running into is passing the "item". It calls the controller function, which looks like this:

        public JsonResult GetDetails(string sDetail)
        {
            Debug.WriteLine(Request.QueryString["sDetail"]);
            Debug.WriteLine("sDetail: " + sDetail);
            var myDetailsDao = new WarnRecogDetailsDao();
            return new JsonResult { Data = myDetailsDao.SelectDetailedInfo(Convert.ToInt32(sDetail)) };
        }

    But nothing ever shows up as "sDetail" — the function is hit, but nothing is passed to it. I have read that you pass the parameter via the data option, but I have tried every combination I can think of and it never shows up. Tried:

        data: {"item"}
        data: {sDetail[item]}
        data: {sDetail[" + item + "]}

    Any help is greatly appreciated. Thanks in advance. Geo...

    Read the article

  • Do you catch expected exceptions in the controller or business service of your asp.net mvc application

    - by Pascal
    I am developing an asp.net mvc application where user1 could delete data records which were just loaded before by user2. User2 either changes this non-existent data record (Update) or is doing an insert with this data in another table that a foreign-key constraint is violated. Where do you catch such expected exceptions? In the Controller of your asp.net mvc application or in the business service? Just a sidenote: I only catch the SqlException here if its a ForeignKey constraint exception to tell the user that another user has deleted a certain parent record and therefore he can not create the testplan. But this code is not fully implemented yet! Controller:   public JsonResult CreateTestplan(Testplan testplan)   {    bool success = false;    string error = string.Empty;    try   {    success = testplanService.CreateTestplan(testplan);    }   catch (SqlException ex)    {    error = ex.Message;    }    return Json(new { success = success, error = error }, JsonRequestBehavior.AllowGet);   } OR Business service: public Result CreateTestplan(Testplan testplan) { Result result = new Result(); try { using (var con = new SqlConnection(_connectionString)) using (var trans = new TransactionScope()) { con.Open(); _testplanDataProvider.AddTestplan(testplan); _testplanDataProvider.CreateTeststepsForTestplan(testplan.Id, testplan.TemplateId); trans.Complete(); result.Success = true; } } catch (SqlException e) { result.Error = e.Message; } return result; } then in the Controller: public JsonResult CreateTestplan(Testplan testplan)   {    Result result = testplanService.CreateTestplan(testplan);       return Json(new { success = result.success, error = result.error }, JsonRequestBehavior.AllowGet);   }

    Read the article

  • ASP.NET MVC 2.0 JsonRequestBehavior Global Setting

    - by beckelmw
    ASP.NET MVC 2.0 will now, by default, throw an exception when an action attempts to return JSON in response to a GET request. I know this can be overridden on a method by method basis by using JsonRequestBehavior.AllowGet, but is it possible to set on a controller or higher basis (possibly the web.config)? Update: Per Levi's comment, this is what I ended up using- protected override JsonResult Json(object data, string contentType, System.Text.Encoding contentEncoding) { return Json(data, contentType, JsonRequestBehavior.AllowGet); }

    Read the article
