Search Results

Search found 105167 results on 4207 pages for 'class file'.

  • adhoc struct/class in C#?

    - by acidzombie24
    Currently I am using reflection with SQL. When I want to make a specialized query, I find it easiest to get the results by creating a new class that inherits from another and adds the two members/columns for my specialized query. Then, thanks to the reflection in the library, in my C# code I can write:

        foreach (var v in list) { v.AnyMember and v.MyExtraMember }

    Now, instead of having these classes scattered around or modifying my main DB.cs file, can I define a class inside a function? I know I can create an anonymous object by writing new { name = val, name2 = ... }, but I need to pass this class to a generic function func(query, args).
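
    For illustration, a minimal C# sketch (the Row base class, Db helper and RunQuery<T> method are hypothetical stand-ins for the reflection library): C# does not allow a named class to be declared inside a method body, but a private nested class keeps the specialized shape next to the code that uses it instead of scattered around or in DB.cs.

        public class Reports
        {
            // Private nested class: visible only inside Reports.
            private class SpecialRow : Row          // Row: hypothetical base entity class
            {
                public int MyExtraMember { get; set; }
            }

            public void PrintSpecial(Db db)
            {
                // RunQuery<T> stands in for the generic func(query, args).
                var list = db.RunQuery<SpecialRow>("select ..., extra as MyExtraMember from t");
                foreach (var v in list)
                    Console.WriteLine($"{v.AnyMember} {v.MyExtraMember}");
            }
        }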

  • PHP File Downloading Questions

    - by nsearle
    Hey all! I am currently running into some problems with users downloading a file stored on my server. I have code set up to auto-download a file once the user hits the download button. It works for all files, but when the size gets larger than 30 MB it has issues. Is there a limit on user downloads? Also, I have supplied my example code and am wondering if there is a better practice than using the PHP function file_get_contents. Thank you all for the help!

        $path = $_SERVER['DOCUMENT_ROOT'] . '../path/to/file/';
        $filename = 'filename.zip';
        $filesize = filesize($path . $filename);

        @header("Content-type: application/zip");
        @header("Content-Disposition: attachment; filename=$filename");
        @header("Content-Length: $filesize");

        echo file_get_contents($path . $filename);

  • Reference inherited class's <T>ype in a derived class

    - by DRapp
    I don't know if it's possible or not, but here's what I need. I'm toying around with something and want to know if it's possible, since you can't create your own data type based on a sealed type such as int, Int32, Int64, etc. I want to create a top-level class of a given type with some common stuff. Then, derive this into two subclasses, where each class is based on either an int or an Int64 type. From THAT, create an instance of either one that knows its typed basis for parameter reference / return settings. So when I need to create an instance of ThisClass, I don't have to know its type basis of either int or Int64, yet IT will know the type and allow methods/functions to be called with that type. This way, if I want to change my ThisClass definition from SubLevel1 to SubLevel2, I don't have to dance around all the different data type definitions. Hope this makes sense.

        public class TopLevel<T> { ... }

        public class SubLevel1 : TopLevel<int> { ... }

        public class SubLevel2 : TopLevel<Int64> { ... }

        public class ThisClass : SubLevel1
        {
            ...
            public <based on the int data type from SubLevel1> SomeFunc()
            {
                return <the int value computed>;
            }
        }
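
    A minimal sketch of one way this is often expressed (illustrative, not from the question): put the typed members on the generic base so each derived level inherits them already bound to int or Int64, and no derived class has to restate the type.

        public abstract class TopLevel<T>
        {
            protected T Value;      // common storage, typed by the subclass's choice of T

            public T SomeFunc()     // returns int for SubLevel1, Int64 for SubLevel2
            {
                return Value;
            }
        }

        public class SubLevel1 : TopLevel<int> { }
        public class SubLevel2 : TopLevel<long> { }   // long is Int64

        public class ThisClass : SubLevel1 { }        // SomeFunc() already returns int here

    Switching ThisClass to derive from SubLevel2 changes SomeFunc's return type to Int64 with no other edits.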

  • Using Multiple File Handles for Single File

    - by Ryan Rosario
    I have an O(n^2) operation that requires me to read line i from a file, and then compare line i to every line in the file. This repeats for all i. I wrote the following code to do this with 2 file handles, but it does not yield the result I am looking for. I imagine this is a simple error on my part.

        IN1 = open("myfile.dat","r")
        IN2 = open("myfile.dat","r")
        for line1 in IN1:
            for line2 in IN2:
                print line1.strip(), line2.strip()
        IN1.close()
        IN2.close()

    The result:

        Hello Hello
        Hello World
        Hello This
        Hello is
        Hello an
        Hello Example
        Hello of
        Hello Using
        Hello Two
        Hello File
        Hello Pointers
        Hello to
        Hello Read
        Hello One
        Hello File

    The output should contain 15^2 lines.

  • jquery: set variable based on one class from an element that has more than one class

    - by John
    Hi, I'm trying to make a table whose columns and rows highlight on hover (I realise there are jQuery plugins out there that will do this, but I'm trying to learn, so thought I'd have a stab at doing it for myself). Here's what I've got so far:

        $('th:not(.features), td:not(.features)').hover(highlight);

        function highlight(){
            $('th.highlightCol, td.highlightCol').removeClass('highlightCol');
            var col = $(this).attr('class');
            $('.' + col).addClass('highlightCol');
        };

        $('tr').hover(highlightRowOn, highlightRowOff);

        function highlightRowOn(){
            $(this).children('td:not(.highlightCol)').addClass('highlightRow');
        };

        function highlightRowOff(){
            $(this).children('td:not(.highlightCol)').removeClass('highlightRow');
        };

    This works fine apart from one problem. Each td has a class specific to its column (package1, package2, package3, package4). It is this that gets passed to the variable col to add the class highlightCol to a column when one of its td's is hovered on. However, if you move the cursor to a new column along a highlighted row, the td you land on has two classes (highlightRow and package*). These both get passed to the variable, and as a result the new column does not receive the correct class to highlight. Is there a way for me to target just the package* class and pass that to the variable while ignoring the highlightRow class? I hope that's not too jumbled for someone to make sense of, and many thanks for any help offered.

  • Accessing a Class Member from a First-Class Function

    - by dbyrne
    I have a case class which takes a list of functions:

        case class A(q:Double, r:Double, s:Double, l:List[(Double)=>Double])

    I have over 20 functions defined. Some of these functions have their own parameters, and some of them also use the q, r, and s values from the case class. Two examples are:

        def f1(w:Double) = (d:Double) => math.sin(d) * w
        def f2(w:Double, q:Double) = (d:Double) => d * q * w

    The problem is that I then need to reference q, r, and s twice when instantiating the case class:

        A(0.5, 1.0, 2.0, List(f1(3.0), f2(4.0, 0.5))) // 0.5 is referenced twice

    I would like to be able to instantiate the class like this:

        A(0.5, 1.0, 2.0, List(f1(3.0), f2(4.0))) // f2 already knows about q!

    What is the best technique to accomplish this? Can I define my functions in a trait that the case class extends? EDIT: The real world application has 7 members, not 3. Only a small number of the functions need access to the members. Most of the functions don't care about them.

  • DBIx::Class base result class

    - by Rob
    Hi there, I am trying to create a model for Catalyst by using DBIx::Class::Schema::Loader. I want the result classes to have a base class I can add methods to, so MyTable.pm inherits from Base.pm, which inherits from DBIx::Class::Core (the default). Somehow I cannot figure out how to do this. My create script is below; can anyone tell me what I am doing wrong? The script creates my model OK, but all result classes just directly inherit from DBIx::Class::Core without my Base class in between.

        #!/usr/bin/perl
        use DBIx::Class::Schema::Loader qw/ make_schema_at /;

        # specifically for the entities many-to-many relation
        $ENV{DBIC_OVERWRITE_HELPER_METHODS_OK} = 1;

        make_schema_at(
            'MyApp::Schema',
            {
                dump_directory          => '/tmp',
                debug                   => 1,
                overwrite_modifications => 1,
                components              => ['EncodedColumn'], # encoded password column
                use_namespaces          => 1,
                default_resultset_class => 'Base'
            },
            [ 'DBI:mysql:database=mydb;host=localhost;port=3306', 'rob', '******' ],
        );

  • Instantiating a class within a class

    - by Ink-Jet
    Hello. I'm trying to instantiate a class within a class, so that the outer class contains the inner class. This is my code:

        #include <iostream>
        #include <string>

        class Inner {
        private:
            std::string message;
        public:
            Inner(std::string m);
            void print() const;
        };

        Inner::Inner(std::string m) {
            message = m;
        }

        void Inner::print() const {
            std::cout << message << std::endl;
            std::cout << message << std::endl;
        }

        class Outer {
        private:
            std::string message;
            Inner in;
        public:
            Outer(std::string m);
            void print() const;
        };

        Outer::Outer(std::string m) {
            message = m;
        }

        void Outer::print() const {
            std::cout << message << std::endl;
        }

        int main() {
            Outer out("Hello world.");
            out.print();
            return 0;
        }

    "Inner in" is my attempt at containing the inner within the outer; however, when I compile, I get an error that there is no matching function for call to Inner::Inner(). What have I done wrong? Thanks.

  • Upload File to Windows Azure Blob in Chunks through ASP.NET MVC, JavaScript and HTML5

    - by Shaun
    Originally posted on: http://geekswithblogs.net/shaunxu/archive/2013/07/01/upload-file-to-windows-azure-blob-in-chunks-through-asp.net.aspx

    Many people are using Windows Azure Blob Storage to store their data in the cloud. Blob storage provides 99.9% availability with an easy-to-use API through the .NET SDK and HTTP REST. For example, we can store JavaScript files, images and documents in blob storage when we are building an ASP.NET web application on a Web Role in Windows Azure, or we can store our VHD files in blob and mount them as hard drives in our cloud service. If you are familiar with Windows Azure, you should know that there are two kinds of blob: page blob and block blob. The page blob is optimized for random read and write, which is very useful when you need to store VHD files. The block blob is optimized for sequential/chunk read and write, which has more common usage. Since we can upload a block blob in blocks through BlockBlob.PutBlock, and then commit them as a whole blob by invoking BlockBlob.PutBlockList, it is very powerful for uploading large files, as we can upload blocks in parallel and provide a pause-resume feature.

    There are many documents, articles and blog posts that describe how to upload a block blob. Most of them focus on the server side: when you have received a big file, stream or binaries, how to upload them into blob storage in blocks through the .NET SDK. But the problem is, how can we upload these large files from the client side, for example, a browser? This question came to me when I was working with a Chinese customer to help them build a network disk product on top of Azure. The end users upload their files from the web portal, and then the files are stored in blob storage from the Web Role. My goal was to find the best way to transfer the file from the client (the end user's machine) to the server (Web Role) through the browser. In this post I will demonstrate and describe what I did to upload large files in chunks at high speed and save them as blocks into Windows Azure Blob Storage.

    Traditional Upload, Works with Limitation

    The simplest way to implement this requirement is to create a web page with a form that contains a file input element and a submit button.

        @using (Html.BeginForm("About", "Index", FormMethod.Post, new { enctype = "multipart/form-data" }))
        {
            <input type="file" name="file" />
            <input type="submit" value="upload" />
        }

    And then in the backend controller we retrieve the whole content of this file and upload it into blob storage through the .NET SDK. We can split the file into blocks, upload them in parallel and commit. The code has been well blogged in the community.

        [HttpPost]
        public ActionResult About(HttpPostedFileBase file)
        {
            var container = _client.GetContainerReference("test");
            container.CreateIfNotExists();
            var blob = container.GetBlockBlobReference(file.FileName);
            var blockDataList = new Dictionary<string, byte[]>();
            using (var stream = file.InputStream)
            {
                var blockSizeInKB = 1024;
                var offset = 0;
                var index = 0;
                while (offset < stream.Length)
                {
                    var readLength = Math.Min(1024 * blockSizeInKB, (int)stream.Length - offset);
                    var blockData = new byte[readLength];
                    offset += stream.Read(blockData, 0, readLength);
                    blockDataList.Add(Convert.ToBase64String(BitConverter.GetBytes(index)), blockData);
                    index++;
                }
            }

            Parallel.ForEach(blockDataList, (bi) =>
            {
                blob.PutBlock(bi.Key, new MemoryStream(bi.Value), null);
            });
            blob.PutBlockList(blockDataList.Select(b => b.Key).ToArray());

            return RedirectToAction("About");
        }

    This works perfectly if we select an image, a piece of music or a small video to upload. But if I select a large file, let's say a 6 GB HD movie, after uploading for a few minutes an error page is shown and the upload is terminated. In ASP.NET there is a limitation on request length; the maximum request length is defined in the web.config file, and it is a number less than about 4 GB. So if we want to upload a really big file, we cannot simply implement it in this way. Also, in Windows Azure, the cloud service network load balancer will terminate the connection if it exceeds the timeout period; from my tests the timeout looks like 2 - 3 minutes. Hence, when we need to upload a large file we cannot just use the basic HTML elements. Besides the limitations mentioned above, simple HTML file upload cannot provide a rich upload experience such as chunked upload, pause and pause-resume. So we need to find a better way to upload large files from the client to the server.

    Upload in Chunks through HTML5 and JavaScript

    In order to break the limitations mentioned above, we will upload the large file in chunks. This gives us some benefits:

    - No request size limitation: since we upload in chunks, we can define the request size for each chunk regardless of how big the entire file is.
    - No timeout problem: the size of the chunks is controlled by us, which means we can make sure the request for each chunk upload does not exceed the timeout period of either ASP.NET or the Windows Azure load balancer.

    It was a big challenge to upload big files in chunks until we had HTML5. There are some new features and improvements introduced in HTML5, and we will use them to implement our solution.

    In HTML5, the File interface has been improved with a new method called "slice". It can be used to read part of the file by specifying the start byte index and the end byte index. For example, if the entire file is 1024 bytes, file.slice(512, 768) will read the part of this file from the 512th byte to the 768th byte and return a new object of an interface called "Blob", which you can treat as an array of bytes. In fact, a Blob object represents a file-like object of immutable, raw data. The File interface is based on Blob, inheriting blob functionality and expanding it to support files on the user's system. For more information about the Blob please refer here. File and Blob are very useful for implementing the chunked upload.

    We will use the File interface to represent the file the user selected in the browser, and then use File.slice to read the file in chunks of the size we want. For example, if we wanted to upload a 10MB file with 512KB chunks, we could read it in 512KB blobs by using File.slice in a loop. Assume we have a web page as below: the user can select a file, an input box to specify the block size in KB, and a button to start the upload.

        <div>
            <input type="file" id="upload_files" name="files[]" /><br />
            Block Size: <input type="number" id="block_size" value="512" name="block_size" />KB<br />
            <input type="button" id="upload_button_blob" name="upload" value="upload (blob)" />
        </div>

    Then we can have the JavaScript function to upload the file in chunks when the user clicks the button.

        <script type="text/javascript">
            $(function () {
                $("#upload_button_blob").click(function () {
                });
            });
        </script>

    Firstly we need to ensure the client browser supports the interfaces we are going to use. Just try to invoke File, Blob and FormData from the "window" object. If any of them is "undefined", the condition result will be "false", which means your browser doesn't support these premium features and it's time for you to get your browser updated. FormData is another new feature we are going to use; it can generate a temporary form for us. We will use this interface to create a form with the chunk and associated metadata when invoking the service through ajax.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            if (window.File && window.Blob && window.FormData) {
                alert("Your brwoser is awesome, let's rock!");
            }
            else {
                alert("Oh man plz update to a modern browser before try is cool stuff out.");
                return;
            }
        });

    Each browser supports these interfaces with its own implementation; currently Blob, File and File.slice are supported by Chrome 21, FireFox 13, IE 10, Opera 12 and Safari 5.1 or higher. After that we work on the files the user selected, one by one, since in HTML5 the user can select multiple files in one file input box.

        var files = $("#upload_files")[0].files;
        for (var i = 0; i < files.length; i++) {
            var file = files[i];
            var fileSize = file.size;
            var fileName = file.name;
        }

    Next, we calculate the start index and end index for each chunk based on the size the user specified in the browser. We put them into an array with the file name and the index, which will be used when we upload chunks into Windows Azure Blob Storage as blocks, since we need to specify the target blob name and the block index. At the same time we store the list of all indexes in another variable, which will be used to commit the blocks into a blob in Azure Storage once all chunks have been uploaded successfully.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            ... ...

            // start to upload each files in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;

                // calculate the start and end byte index for each blocks(chunks)
                // with the index, file name and index list for future using
                var blockSizeInKB = $("#block_size").val();
                var blockSize = blockSizeInKB * 1024;
                var blocks = [];
                var offset = 0;
                var index = 0;
                var list = "";
                while (offset < fileSize) {
                    var start = offset;
                    var end = Math.min(offset + blockSize, fileSize);

                    blocks.push({
                        name: fileName,
                        index: index,
                        start: start,
                        end: end
                    });
                    list += index + ",";

                    offset = end;
                    index++;
                }
            }
        });

    Now we have all the chunks' information ready. The next step is to upload them one by one to the server side; on the server side, when a chunk is received, it will be uploaded as a block into Blob Storage, and finally committed with the index list through BlockBlob.PutBlockList. But since all these invocations are ajax calls, which are not synchronized, we need a JavaScript library to help us coordinate the asynchronous operations, named "async.js". You can download this JavaScript library here, and you can find the documentation here. I will not explain this library too much in this post. We put all the procedures we want to execute into a function array and pass it to the proper function defined in async.js to let it control the execution sequence, in series or in parallel. Hence we define an array and push the function for chunk upload into this array.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            ... ...

            // start to upload each files in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;
                // calculate the start and end byte index for each blocks(chunks)
                // with the index, file name and index list for future using
                ... ...

                // define the function array and push all chunk upload operation into this array
                blocks.forEach(function (block) {
                    putBlocks.push(function (callback) {
                    });
                });
            }
        });

    As you can see below, I used the File.slice method to read each chunk based on the start and end byte indexes we calculated previously, and constructed a temporary HTML form with the file name, chunk index and chunk data through another new feature in HTML5 named FormData. Then I post this form to the backend server through jQuery.ajax. This is the key part of our solution.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            ... ...
            // start to upload each files in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;
                // calculate the start and end byte index for each blocks(chunks)
                // with the index, file name and index list for future using
                ... ...
                // define the function array and push all chunk upload operation into this array
                blocks.forEach(function (block) {
                    putBlocks.push(function (callback) {
                        // load blob based on the start and end index for each chunks
                        var blob = file.slice(block.start, block.end);
                        // put the file name, index and blob into a temporary from
                        var fd = new FormData();
                        fd.append("name", block.name);
                        fd.append("index", block.index);
                        fd.append("file", blob);
                        // post the form to backend service (asp.net mvc controller action)
                        $.ajax({
                            url: "/Home/UploadInFormData",
                            data: fd,
                            processData: false,
                            contentType: "multipart/form-data",
                            type: "POST",
                            success: function (result) {
                                if (!result.success) {
                                    alert(result.error);
                                }
                                callback(null, block.index);
                            }
                        });
                    });
                });
            }
        });

    Then we invoke these functions one by one using async.js, and once all functions have executed successfully, I invoke another ajax call to the backend service to commit all these chunks (blocks) as the blob in Windows Azure Storage.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            ... ...
            // start to upload each files in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;
                // calculate the start and end byte index for each blocks(chunks)
                // with the index, file name and index list for future using
                ... ...
                // define the function array and push all chunk upload operation into this array
                ... ...
                // invoke the functions one by one
                // then invoke the commit ajax call to put blocks into blob in azure storage
                async.series(putBlocks, function (error, result) {
                    var data = {
                        name: fileName,
                        list: list
                    };
                    $.post("/Home/Commit", data, function (result) {
                        if (!result.success) {
                            alert(result.error);
                        }
                        else {
                            alert("done!");
                        }
                    });
                });
            }
        });

    That's all on the client side. The outline of our logic is:

    - Calculate the start and end byte index for each chunk based on the block size.
    - Define the functions that read the chunk from the file and upload the content to the backend service through ajax.
    - Execute the functions defined in the previous step with "async.js".
    - Finally, commit the chunks by invoking the backend service, which puts the blocks into Windows Azure Storage.

    Save Chunks as Blocks into Blob Storage

    Above, we finished the client side JavaScript code. It uploads the file in chunks to the backend service, which we are going to implement in this step. We will use ASP.NET MVC as our backend service; it will receive the chunks, upload them into Windows Azure Blob Storage as blocks, and finally commit them as one blob. Since on the client side we uploaded chunks by invoking an ajax call to the URL "/Home/UploadInFormData", I created a new action under the Home controller which only accepts HTTP POST requests.

        [HttpPost]
        public JsonResult UploadInFormData()
        {
            var error = string.Empty;
            try
            {
            }
            catch (Exception e)
            {
                error = e.ToString();
            }

            return new JsonResult()
            {
                Data = new
                {
                    success = string.IsNullOrWhiteSpace(error),
                    error = error
                }
            };
        }

    Then I retrieved the file name, index and the chunk content from the Request.Form object, which was passed from our client side. And then I used the Windows Azure SDK to create a blob container (in this case we use the container named "test") and create a blob reference with the blob name (the same as the file name). Then I uploaded the chunk as a block of this blob with the index, since in Blob Storage each block must have an index (ID) associated with it, so that finally we can put all blocks together as one blob by specifying their block ID list.

        [HttpPost]
        public JsonResult UploadInFormData()
        {
            var error = string.Empty;
            try
            {
                var name = Request.Form["name"];
                var index = int.Parse(Request.Form["index"]);
                var file = Request.Files[0];
                var id = Convert.ToBase64String(BitConverter.GetBytes(index));

                var container = _client.GetContainerReference("test");
                container.CreateIfNotExists();
                var blob = container.GetBlockBlobReference(name);
                blob.PutBlock(id, file.InputStream, null);
            }
            catch (Exception e)
            {
                error = e.ToString();
            }

            return new JsonResult()
            {
                Data = new
                {
                    success = string.IsNullOrWhiteSpace(error),
                    error = error
                }
            };
        }

    Next, I created another action to commit the blocks into the blob once all chunks have been uploaded. Similarly, I retrieved the blob name from the Request.Form. I also retrieved the chunk ID list, which is the block ID list, from the Request.Form as a string, split it into a list, and then invoked the BlockBlob.PutBlockList method. After that our blob is shown in the container and is ready to be downloaded.

        [HttpPost]
        public JsonResult Commit()
        {
            var error = string.Empty;
            try
            {
                var name = Request.Form["name"];
                var list = Request.Form["list"];
                var ids = list
                    .Split(',')
                    .Where(id => !string.IsNullOrWhiteSpace(id))
                    .Select(id => Convert.ToBase64String(BitConverter.GetBytes(int.Parse(id))))
                    .ToArray();

                var container = _client.GetContainerReference("test");
                container.CreateIfNotExists();
                var blob = container.GetBlockBlobReference(name);
                blob.PutBlockList(ids);
            }
            catch (Exception e)
            {
                error = e.ToString();
            }

            return new JsonResult()
            {
                Data = new
                {
                    success = string.IsNullOrWhiteSpace(error),
                    error = error
                }
            };
        }

    Now we have finished all the code we need. Below is the full client side JavaScript code.

        <script type="text/javascript" src="~/Scripts/async.js"></script>
        <script type="text/javascript">
            $(function () {
                $("#upload_button_blob").click(function () {
                    // assert the browser support html5
                    if (window.File && window.Blob && window.FormData) {
                        alert("Your brwoser is awesome, let's rock!");
                    }
                    else {
                        alert("Oh man plz update to a modern browser before try is cool stuff out.");
                        return;
                    }

                    // start to upload each files in chunks
                    var files = $("#upload_files")[0].files;
                    for (var i = 0; i < files.length; i++) {
                        var file = files[i];
                        var fileSize = file.size;
                        var fileName = file.name;

                        // calculate the start and end byte index for each blocks(chunks)
                        // with the index, file name and index list for future using
                        var blockSizeInKB = $("#block_size").val();
                        var blockSize = blockSizeInKB * 1024;
                        var blocks = [];
                        var offset = 0;
                        var index = 0;
                        var list = "";
                        while (offset < fileSize) {
                            var start = offset;
                            var end = Math.min(offset + blockSize, fileSize);

                            blocks.push({
                                name: fileName,
                                index: index,
                                start: start,
                                end: end
                            });
                            list += index + ",";

                            offset = end;
                            index++;
                        }

                        // define the function array and push all chunk upload operation into this array
                        var putBlocks = [];
                        blocks.forEach(function (block) {
                            putBlocks.push(function (callback) {
                                // load blob based on the start and end index for each chunks
                                var blob = file.slice(block.start, block.end);
                                // put the file name, index and blob into a temporary from
                                var fd = new FormData();
                                fd.append("name", block.name);
                                fd.append("index", block.index);
                                fd.append("file", blob);
                                // post the form to backend service (asp.net mvc controller action)
                                $.ajax({
                                    url: "/Home/UploadInFormData",
                                    data: fd,
                                    processData: false,
                                    contentType: "multipart/form-data",
                                    type: "POST",
                                    success: function (result) {
                                        if (!result.success) {
                                            alert(result.error);
                                        }
                                        callback(null, block.index);
                                    }
                                });
                            });
                        });

                        // invoke the functions one by one
                        // then invoke the commit ajax call to put blocks into blob in azure storage
                        async.series(putBlocks, function (error, result) {
                            var data = {
                                name: fileName,
                                list: list
                            };
                            $.post("/Home/Commit", data, function (result) {
                                if (!result.success) {
                                    alert(result.error);
                                }
                                else {
                                    alert("done!");
                                }
                            });
                        });
                    }
                });
            });
        </script>

    And below is the full ASP.NET MVC controller code.

        public class HomeController : Controller
        {
            private CloudStorageAccount _account;
            private CloudBlobClient _client;

            public HomeController()
                : base()
            {
                _account = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("DataConnectionString"));
                _client = _account.CreateCloudBlobClient();
            }

            public ActionResult Index()
            {
                ViewBag.Message = "Modify this template to jump-start your ASP.NET MVC application.";

                return View();
            }

            [HttpPost]
            public JsonResult UploadInFormData()
            {
                var error = string.Empty;
                try
                {
                    var name = Request.Form["name"];
                    var index = int.Parse(Request.Form["index"]);
                    var file = Request.Files[0];
                    var id = Convert.ToBase64String(BitConverter.GetBytes(index));

                    var container = _client.GetContainerReference("test");
                    container.CreateIfNotExists();
                    var blob = container.GetBlockBlobReference(name);
                    blob.PutBlock(id, file.InputStream, null);
                }
                catch (Exception e)
                {
                    error = e.ToString();
                }

                return new JsonResult()
                {
                    Data = new
                    {
                        success = string.IsNullOrWhiteSpace(error),
                        error = error
                    }
                };
            }

            [HttpPost]
            public JsonResult Commit()
            {
                var error = string.Empty;
                try
                {
                    var name = Request.Form["name"];
                    var list = Request.Form["list"];
                    var ids = list
                        .Split(',')
                        .Where(id => !string.IsNullOrWhiteSpace(id))
                        .Select(id => Convert.ToBase64String(BitConverter.GetBytes(int.Parse(id))))
                        .ToArray();

                    var container = _client.GetContainerReference("test");
                    container.CreateIfNotExists();
                    var blob = container.GetBlockBlobReference(name);
                    blob.PutBlockList(ids);
                }
                catch (Exception e)
                {
                    error = e.ToString();
                }

                return new JsonResult()
                {
                    Data = new
                    {
                        success = string.IsNullOrWhiteSpace(error),
                        error = error
                    }
                };
            }
        }

    If we select a file in the browser, we will see our application upload chunks of the size we specified to the server through ajax calls in the background, and then commit all chunks into one blob. Then we can find the blob in our Windows Azure Blob Storage.

    Optimized by Parallel Upload

    In the previous example we just uploaded our file in chunks. This solved the ASP.NET MVC request content size limitation as well as the Windows Azure load balancer timeout, but it might introduce a performance problem, since we uploaded the chunks in sequence. In order to improve upload performance we can modify our client side code a bit to invoke the upload operations in parallel. The good news is that the "async.js" library provides a parallel execution function. If you remember, the code that invokes the service to upload chunks used "async.series", which means all functions are executed in sequence. Now we change this code to "async.parallel", which invokes all functions in parallel.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            ... ...
            // start to upload each files in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;
                // calculate the start and end byte index for each blocks(chunks)
                // with the index, file name and index list for future using
                ... ...
                // define the function array and push all chunk upload operation into this array
                ... ...
                // invoke the functions in parallel
                // then invoke the commit ajax call to put blocks into blob in azure storage
                async.parallel(putBlocks, function (error, result) {
                    var data = {
                        name: fileName,
                        list: list
                    };
                    $.post("/Home/Commit", data, function (result) {
                        if (!result.success) {
                            alert(result.error);
                        }
                        else {
                            alert("done!");
                        }
                    });
                });
            }
        });

    In this way all chunks will be uploaded to the server side at the same time to maximize the bandwidth usage. This should work if the file is not very large and the chunk size is not very small. But for a large file this might introduce another problem: too many ajax calls are sent to the server at the same time. So the best solution is to upload the chunks in parallel with a maximum concurrency limitation. The code below sets the concurrency limit to 4, which means at most 4 ajax calls can be in flight at the same time.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            ... ...
            // start to upload each files in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;
                // calculate the start and end byte index for each blocks(chunks)
                // with the index, file name and index list for future using
                ... ...
                // define the function array and push all chunk upload operation into this array
                ... ...
                // invoke the functions in parallel with a concurrency limit of 4
                // then invoke the commit ajax call to put blocks into blob in azure storage
                async.parallelLimit(putBlocks, 4, function (error, result) {
                    var data = {
                        name: fileName,
                        list: list
                    };
                    $.post("/Home/Commit", data, function (result) {
                        if (!result.success) {
                            alert(result.error);
                        }
                        else {
                            alert("done!");
                        }
                    });
                });
            }
        });

    Summary

    In this post we discussed how to upload files in chunks to the backend service and then upload them into Windows Azure Blob Storage in blocks. We focused on the frontend side and leveraged three new features introduced in HTML5:

    - File.slice: read part of the file by specifying the start and end byte index.
    - Blob: a file-like interface which contains part of the file content.
    - FormData: a temporary form element through which we can pass the chunk along with some metadata to the backend service.

    Then we discussed the performance considerations of chunked uploading. Sequential upload cannot provide the maximum upload speed, but unlimited parallel upload might crash the browser and server if there are too many chunks. So we finally came up with the solution of uploading chunks in parallel with a concurrency limitation. We also demonstrated how to use the "async.js" JavaScript library to help us control the asynchronous calls and the parallel limit.

    Regarding the chunk size and the parallel limit value, there is no single "best" value. You need to test various combinations and find the best one for your particular scenario. It depends on the local bandwidth, the client machine cores, and the server side (Windows Azure Cloud Service Virtual Machine) cores, memory and bandwidth. Below is one of my performance test results. The client machine was Windows 8 with IE 10 and 4 cores. I was using the Microsoft corporate network. The web site was hosted in the Windows Azure China North data center (in Beijing) with one small web role (1 core CPU, 1.75GB memory, 100Mbps bandwidth).

    The test cases were:

    - Chunk size: 512KB, 1MB, 2MB, 4MB.
    - Upload mode: sequence, parallel (unlimited), parallel with limit (4 threads, 8 threads).
    - Chunk format: base64 string, binaries.
    - Target file: 100MB.
    - Each case was tested 3 times.

    Some thoughts from the test results, but not guidance or best practice:

    - Parallel gets better performance than series.
    - No significant performance improvement between parallel with 4 threads and 8 threads.
    - Transferring binaries provides better performance than base64.
    - In all cases, a chunk size of 1MB - 2MB gets better performance.

    Hope this helps,
    Shaun

    All documents and related graphics, codes are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

  • Java: reusable encapsulation with interface, abstract class or inner classes?

    - by HH
    I am trying to encapsulate. An exception from an interface; a static inner class working; a non-static inner class not working; and I cannot understand the terminology: nested classes, inner classes, nested interfaces, interface-abstract-class -- it sounds too repetitive!

    Exception 'illegal type' from the interface, apparently because its values are constants(?!):

        static interface userInfo {
            File startingFile = new File(".");
            String startingPath = "dummy";
            try {
                startingPath = startingFile.getCanonicalPath();
            } catch (Exception e) { e.printStackTrace(); }
        }

    Working code, but no success with the non-static inner class:

        import java.io.*;
        import java.util.*;

        public class listTest {
            public interface hello { String word = "hello word from Interface!"; }

            public static class hej {
                hej() {}
                private String hejo = "hello hallo from Static class with image";
                public void printHallooo() { System.out.println(hejo); }
            }

            public class nonStatic {
                nonStatic() {} // HOW TO USE IT?
                public void printNonStatic() { System.out.println("Inside static class with an image!"); }
            }

            public static void main(String[] args) {
                // INTERFACE TEST
                System.out.println(hello.word);

                // INNER CLASS STATIC TEST
                hej h = new hej();
                h.printHallooo();

                // INNER CLASS NON-STATIC TEST
                nonStatic ns = new nonStatic();
                ns.printNonStatic(); // IS there a way to do it without STATIC?
            }
        }

    Output (the above code works, but how to do it non-statically?):

        hello word from Interface!
        hello hallo from Static class with image!
        StaticPrint without an image of the class!

    Related: nesting classes, inner classes, nested interfaces.

  • C# "Rename" Property in Derived Class

    - by Eric
    When you read this you'll be awfully tempted to give advice like "this is a bad idea for the following reason...". Bear with me; I know there are other ways to approach this. This question should be considered trivia. Let's say you have a class Transaction that has properties common to all transactions such as Invoice, Purchase Order, and Sales Receipt. Take the simple example of the Transaction "Amount", which is the most important monetary amount for a given transaction.

        public class Transaction
        {
            public double Amount { get; set; }
            public TxnTypeEnum TransactionType { get; set; }
        }

    This Amount may have a more specific name in a derived type, at least in the real world. For example, the following values are all actually the same thing:

    - Transaction - Amount
    - Invoice - Subtotal
    - PurchaseOrder - Total
    - Sales Receipt - Amount

    So now I want a derived class Invoice that has a Subtotal rather than the generically-named Amount. Ideally both of the following would be true:

    - In an instance of Transaction, the Amount property would be visible.
    - In an instance of Invoice, the Amount property would be hidden, but the Subtotal property would refer to it internally.

    Invoice looks like this:

        public class Invoice : Transaction
        {
            new private double? Amount
            {
                get { return base.Amount; }
                set { base.Amount = value; }
            }

            // This property should hide the generic property "Amount" on Transaction
            public double? SubTotal
            {
                get { return Amount; }
                set { Amount = value; }
            }

            public double RemainingBalance { get; set; }
        }

    But of course Transaction.Amount is still visible on any instance of Invoice. Thanks for taking a look!
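
    A short illustrative sketch (not part of the question): hiding with the new keyword is resolved at compile time from the static type of the reference, which is why Transaction.Amount stays reachable whenever an Invoice travels through a Transaction-typed variable.

        Invoice invoice = new Invoice();
        invoice.SubTotal = 100.0;        // goes through the hidden, private Amount

        Transaction txn = invoice;       // upcast: static type is now Transaction
        Console.WriteLine(txn.Amount);   // prints 100 -- the base property is visible again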

  • How to remove CRUD operations from Entity Class

    - by GlutVonSmark
    I am trying to get my head around removing data-store access from my entity classes. Let's say I have an AccountsGroup entity class, and I put all the DB access into an AccountsGroupRepository class. Now, should I have a DeleteFromDB method in the AccountsGroup class that calls the repository?

        Public Sub DeleteFromDB
            Dim repository As New AccountsGroupRepository(Me)
            repository.DeleteFromDB
        End Sub

    Or should I just always use the repository whenever I need to delete an entity, and not have the CRUD methods in the entity class? What happens when there is some business-logic validation that needs to be done before the delete can proceed? For example, if the AccountsGroup still has some Accounts in it, the delete method should throw an exception. Where do I put that?
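
    For illustration, a minimal C# sketch of the repository-only approach (AccountsGroup, Account and IDbContext are hypothetical names): the entity stays persistence-ignorant, and the business-rule check lives next to the delete operation it guards.

        public class AccountsGroupRepository
        {
            private readonly IDbContext _db;    // hypothetical data-access abstraction

            public AccountsGroupRepository(IDbContext db) { _db = db; }

            public void Delete(AccountsGroup group)
            {
                // Validate the business rule before the delete can proceed.
                if (group.Accounts.Count > 0)
                    throw new InvalidOperationException(
                        "Cannot delete an AccountsGroup that still contains Accounts.");

                _db.Delete(group);
            }
        }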

  • Unable to access class A's variables in class B - Unity/MonoDevelop

    - by Syed
    I have made a class in MonoDevelop that includes some variables:

        public class GridInfo : MonoBehaviour {
            public float initPosX;
            public float initPosY;
            public bool inUse;
            public int f;
            public int g;
            public int h;
            public GridInfo parent;
            public int y, x;
        }

    Now I am using its class variables in another class, Map.cs:

        public class Map : MonoBehaviour {
            public static GridInfo[,] Tile = new GridInfo[17, 23];

            void Start() {
                Tile[0,0].initPosX = initPosX; // Line 49
            }
        }

    I am not getting any compile error, but when I play in Unity it gives me the error:

        NullReferenceException: Object reference not set to an instance of an object
        Map.Start () (at Assets/Scripts/Map.cs:49)

    I am not attaching this script to any GameObject, as Map.cs will make a GridInfo-type array. I have also tried accessing the variables using GetComponent. Where is the problem?
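
    A minimal sketch of what is likely going wrong (assumptions: Map is attached to an active GameObject so Start() actually runs, and one GridInfo per cell is wanted): new GridInfo[17, 23] only allocates 17*23 null references, so Tile[0,0].initPosX dereferences null. Each cell must be assigned an instance first, and since GridInfo derives from MonoBehaviour, instances come from AddComponent rather than new.

        void Start() {
            for (int y = 0; y < 17; y++) {
                for (int x = 0; x < 23; x++) {
                    // Create a holder object and attach a GridInfo component to it.
                    var holder = new GameObject("Cell_" + y + "_" + x);
                    Tile[y, x] = holder.AddComponent<GridInfo>();
                }
            }
            Tile[0, 0].initPosX = 0f;   // safe now: the element is no longer null
        }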

  • Making Class Diagram for MVC Pattern Project

    - by iMohammad
    I have a question about making a class diagram for an MVC-based college senior project. Say we have two actors of users in my system: Undergrad and Graduate students, which are the children of an abstract class called User (generalisation). Each actor has his own features. My question: in such a case, do we need to have these two actors as separate classes which inherit from the abstract class User, even though I'm going to implement them as roles using one model called User Model? I think you can see my confusion here; a compact sketch of the two options follows. I code using the MVC pattern, but I've never made a class diagram for this pattern. Thank you in advance!
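
    A compact C# sketch of the two options being weighed (purely illustrative): separate subclasses, as the generalisation in the diagram suggests, versus a single model with a role discriminator, as in the planned implementation.

        // Option 1: generalisation -- two actor classes inherit from an abstract User.
        public abstract class User { public string Name; }
        public class Undergrad : User { /* undergrad-specific features */ }
        public class Graduate  : User { /* graduate-specific features */ }

        // Option 2: one User model with roles -- behaviour switches on the role.
        public enum Role { Undergrad, Graduate }
        public class UserModel
        {
            public string Name;
            public Role Role;
        }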

  • 7 derived classes with one common base class

    - by user144905
    I have written the following code:

        // main.cpp
        #include <iostream>
        #include <string>
        #include "human.h"
        #include "computer.h"
        #include "referee.h"
        #include "RandomComputer.h"
        #include "Avalanche.h"
        #include "Bureaucrat.h"
        #include "Toolbox.h"
        #include "Crescendo.h"
        #include "PaperDoll.h"
        #include "FistfullODollors.h"
        using namespace std;

        int main()
        {
            Avalanche pla1;
            Avalanche pla2;
            referee f;
            pla1.disp();
            for (int i = 0; i < 5; i++)
            {
                cout << pla2.mov[i];
            }
            return 0;
        }

    In this program all included classes except referee.h and human.h are derived from computer.h. Each derived class has a char array variable which is initialized when a member of the derived class is declared. The problem is that when I declare two different derived class members, let's say Avalanche and Toolbox, upon printing the char array for one of them using the for loop it prints nothing. However, if I declare only one of them in main.cpp it works properly. The file for computer.h is as such:

        #ifndef COMPUTER_H
        #define COMPUTER_H

        class computer
        {
        public:
            int nump;
            char mov[];
            void disp();
        };

        #endif

    Toolbox.h is like this:

        #ifndef TOOLBOX_H
        #define TOOLBOX_H
        #include "computer.h"

        class Toolbox : public computer
        {
        public:
            Toolbox();
        };

        #endif

    Finally, Avalanche.h is as follows:

        #ifndef AVALANCHE_H
        #define AVALANCHE_H
        #include "computer.h"

        class Avalanche : public computer
        {
        public:
            Avalanche();
        };

        #endif

  • Using "public" vars or attributes in class calls, functional approach

    - by marw
    I was always wondering about two things I tend to do in my little projects. Sometimes I will have this design:

        class FooClass
            ...
            self.foo = "it's a bar"
            self._do_some_stuff(self)

            def _do_some_stuff(self):
                print(self.foo)

    And sometimes this one:

        class FooClass2
            ...
            self.do_some_stuff(foo="it's a bar")

            def do_some_stuff(self, foo):
                print(foo)

    Although I roughly understand the differences between functional and class approaches, I struggle with the design. For example, in FooClass the self.foo is always accessible as an attribute. If there are numerous calls to it, is that faster than making foo a local variable that is passed from method to method (like in FooClass2)? What happens in memory in both cases? If FooClass2 is preferred (ie. I don't need to access foo) and other attributes inside do not change their states (the class is executed once only and returns the result), should the code then be written as a series of functions in a module?

  • Nested class or not nested class?

    - by eriks
    I have a class A and a list of A objects. A has a function f that should be executed every X seconds (for the first instance every 1 second, for the second instance every 5 seconds, etc.). I have a scheduler class that is responsible for executing the functions at the correct time. What I thought to do is to create a new class, ATime, that will hold a pointer to an A instance and the time A::f should be executed. The scheduler will hold a min priority queue of ATime. Do you think this is the correct implementation? Should ATime be a nested class of the scheduler?

  • C++ threaded class design from non-threaded class

    - by macs
    I'm working on a library doing audio encoding/decoding. The encoder shall be able to use multiple cores (i.e. multiple threads, using the boost library), if available. What I have right now is a class that performs all encoding-relevant operations. The next step I want to take is to make that class threaded, so I'm wondering how to do this. I thought about writing a thread class, creating n threads for n cores and then calling the encoder with the appropriate arguments. But maybe this is overkill and there is no need for another class, so I would instead make use of the "user interface" for thread creation. I hope there are some suggestions.

  • Include a Class in another model / class / lib

    - by jaycode
    I need to use the function "image_path" in my lib class. I tried this (and a couple of other variations):

        class CustomHelpers::Base
          include ActionView::Helpers::AssetTagHelper

          def self.image_url(source)
            abs_path = image_path(source)
            unless abs_path =~ /^http/
              abs_path = "#{request.protocol}#{request.host_with_port}#{abs_path}"
            end
            abs_path
          end
        end

    But it didn't work. Am I doing it right? Another question is, how do I find the right class to include? For example, if I look at this module: http://api.rubyonrails.org/classes/ActionView/Helpers/AssetTagHelper.html -- is there a rule of thumb for how to include that module in a model / library / class / anything else?

  • How to call a method from another class that's been instantiated within the current class

    - by Pavan
    My screen has a few views like this:

         __________________
        |  _____           |
        | |     |          |     // viewX is a video screen
        | |viewX|    vY    |     // viewY is a custom UIView I created; it contains a method
        | |_____|          |     // which I would like to call that toggles the hidden property
        |__________________|     // of this view. When it hides, a little button replaces it in
        |                  |     // the top right corner, on top of viewX, the video layer
        |      viewZ       |
        |__________________|     // viewZ is a view containing many square views - thumbnails

    My question is, I don't know how to register for touch events so that any touch is recognised no matter which view the user touches on the screen. At the moment I'm handling the touch events for each view inside it, and all works well. However, what I'm trying to do is that when the user taps anywhere else on the screen but on viewY, viewY should disappear, by calling that method in the viewY class to toggle its hidden property. This viewY class is instantiated and has no xib file attached to it; the UIView is created programmatically in the viewY class. This whole class for viewY behaviour is instantiated in viewX - the video view. My boss says add delegates, although I have no clue how to do that. Any help? Is there any way I can just make it really simple and be able to say REMOVE VIEW no matter which class I'm calling from? Also, I've seen other people achieve this by using these funky arrows - ... <- etc., although I'm not sure if that's what I need or how to implement such a thing. Ah, I think I've made my question quite complicated, but I really mean it to be a simple one, and I know it can be done in an easy way!

  • Use nested static class as ActionListener for the Outer class

    - by Digvijay Yadav
    I want to use a nested static class as an ActionListener for the enclosing class's GUI elements. I did something like this:

        public class OuterClass {

            public static void myImplementation() {
                OuterClass.StartupHandler startupHandler = new OuterClass.StartupHandler();
                exitMenuItem.addActionListener(startupHandler); // error line
            }

            public static class StartupHandler implements ActionListener {
                @Override
                public void actionPerformed(ActionEvent e) {
                    //throw new UnsupportedOperationException("Not supported yet.");
                    if (e.getSource() == exitMenuItem) {
                        System.exit(1);
                    } else if (e.getSource() == helpMenuItem) {
                        // show help menu
                    }
                }
            }
        }

    But when I invoke this code I get a NullPointerException at the error line. Is this the right way to do this, or is there something I am missing?

  • Thread class closing from other Class (Activity) with protected void onStop() Android

    - by user1761337
    I have a problem with closing the thread. I want to close the thread from onStop, onPause and onDestroy. This is my source in the Activity class:

        @Override
        protected void onStop() {
            super.onStop();
            finish();
        }

        @Override
        protected void onPause() {
            super.onPause();
            finish();
        }

        @Override
        public void onDestroy() {
            this.mWakeLock.release();
            super.onDestroy();
        }

    And the Thread class:

        public class GameThread extends Thread {
            private SurfaceHolder mSurfaceHolder;
            private Handler mHandler;
            private Context mContext;
            private Paint mLinePaint;
            private Paint blackPaint;

            // for consistent rendering
            private long sleepTime; // amount of time to sleep for (in milliseconds)
            private long delay = 1000 / 30;

            // state of game (Running or Paused)
            int state = 1;
            public final static int RUNNING = 1;
            public final static int PAUSED = 2;
            public final static int STOPED = 3;

            GameSurface gEngine;

            public GameThread(SurfaceHolder surfaceHolder, Context context, Handler handler, GameSurface gEngineS) {
                // data about the screen
                mSurfaceHolder = surfaceHolder;
                mHandler = handler;
                mContext = context;
                gEngine = gEngineS;
            }

            // This is the most important part of the code. It is invoked when the call to start() is
            // made from the SurfaceView class. It loops continuously until the game is finished or
            // the application is suspended.
            private long beforeTime;

            @Override
            public void run() {
                // UPDATE
                while (state == RUNNING) {
                    Log.d("State", "Thread is runnig");
                    // time before update
                    beforeTime = System.nanoTime();
                    // This is where we update the game engine
                    gEngine.Update();

                    // DRAW
                    Canvas c = null;
                    try {
                        // lock canvas so nothing else can use it
                        c = mSurfaceHolder.lockCanvas(null);
                        synchronized (mSurfaceHolder) {
                            // clear the screen with the black painter.
                            // reset the canvas
                            c.drawColor(Color.BLACK);
                            // This is where we draw the game engine.
                            gEngine.doDraw(c);
                        }
                    } finally {
                        // do this in a finally so that if an exception is thrown
                        // during the above, we don't leave the Surface in an
                        // inconsistent state
                        if (c != null) {
                            mSurfaceHolder.unlockCanvasAndPost(c);
                        }
                    }

                    this.sleepTime = delay - ((System.nanoTime() - beforeTime) / 1000000L);
                    try {
                        // actual sleep code
                        if (sleepTime > 0) {
                            this.sleep(sleepTime);
                        }
                    } catch (InterruptedException ex) {
                        Logger.getLogger(GameThread.class.getName()).log(Level.SEVERE, null, ex);
                    }

                    while (state == PAUSED) {
                        Log.d("State", "Thread is pausing");
                        try {
                            this.sleep(1000);
                        } catch (InterruptedException e) {
                            // TODO Auto-generated catch block
                            e.printStackTrace();
                        }
                    }
                }
            }
        }

    How can I close the thread from the Activity class?

  • Add class to elements which already have a class

    - by bwstud
    I have a group of divs which I'm dynamically generating when a button is clicked, with the class "brick". This gives them dimensions and a starting position of top: 0. I'm trying to get them to animate to the bottom of the view using a CSS transition with a second class assignment which gives them a position of bottom: 0. I can't figure out the syntax for adding a second class to elements with a pre-existing class. On inspection they only show the original class of "brick".

    HTML:

        <!DOCTYPE html>
        <html>
        <head>
            <script src="http://code.jquery.com/jquery-2.1.0.min.js"></script>
            <meta charset="utf-8">
            <title>JS Bin</title>
        </head>
        <body>
            <div id="container">
                <div id="button">Click Me</div>
            </div>
        </body>
        </html>

    CSS:

        #container {
            width: 100%;
            height: 100vh;
            padding: 10vmax;
        }

        #button {
            position: fixed;
        }

        .brick {
            position: relative;
            top: 0;
            height: 10vmax;
            width: 20vmax;
            background: white;
            margin: 0;
            padding: 0;
            transition: all 1s;
        }

        .drop {
            transition: all 1s;
            bottom 0;
        }

    The offending JS:

        var brickCount = function() {
            var count = prompt("How many boxes you lookin' for?");
            for (var i = 0; i < count; i++) {
                var newBrick = document.createElement("div");
                newBrick.className = "brick";
                document.querySelector("#container").appendChild(newBrick);
            }
        };

        var getBricks = function() {
            document.getElementByClass("brick");
        };

        var changeColor = function() {
            getBricks.style.backgroundColor = '#' + Math.floor(Math.random() * 16777215).toString(16);
        };

        var addDrop = function() {
            getBricks.brick = "getBricks.brick" + " drop";
        };

        var multiple = function() {
            brickCount();
            getBricks();
            changeColor();
            addDrop();
        };

        document.getElementById("button").onclick = function() { multiple(); };

    Thanks!
