Search Results

Search found 73216 results on 2929 pages for 'header file'.


  • PHP File Downloading Questions

    - by nsearle
    Hey All! I am currently running into some problems with users downloading a file stored on my server. I have code set up to auto-download a file once the user hits the download button. It is working for all files, but when the size gets larger than 30 MB it has issues. Is there a limit on user downloads? Also, I have supplied my example code and am wondering if there is a better practice than using the PHP function 'file_get_contents'. Thank you all for the help! $path = $_SERVER['DOCUMENT_ROOT'] . '../path/to/file/'; $filename = 'filename.zip'; $filesize = filesize($path . $filename); @header("Content-type: application/zip"); @header("Content-Disposition: attachment; filename=$filename"); @header("Content-Length: $filesize"); echo file_get_contents($path . $filename);
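
    The ~30 MB cutoff is most likely PHP's memory_limit, since file_get_contents buffers the whole file in memory before echoing it. A minimal sketch of the usual streaming alternative (reusing the question's own path variables), which keeps memory use constant regardless of file size:

        $full = $path . $filename;
        @header("Content-type: application/zip");
        @header("Content-Disposition: attachment; filename=$filename");
        @header("Content-Length: " . filesize($full));
        $fp = fopen($full, 'rb');
        while (!feof($fp)) {
            echo fread($fp, 8192); // stream 8 KB at a time
            flush();               // push each chunk out to the client
        }
        fclose($fp);

    readfile($full) does the same in one call; either way, make sure output buffering is off (or call ob_end_clean() first) so the chunks are not accumulated in memory anyway.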

    Read the article

  • using macro defined in header files

    - by Neeraj
    I have a macro definition in a header file, like this: // header.h #define ARRAY_SZ(a) ((int)(sizeof(a)/sizeof((a)[0]))) This is defined in some header file, which includes some more header files. Now, I need to use this macro in a source file that has no other reason to include header.h or any of the headers that header.h includes, so should I redefine the macro in my source file or simply include the header file header.h? Will the latter approach affect the code size/compile time (I think yes), or runtime (I think no)? Your advice on this!
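
    The usual pattern (a sketch; the guard name is illustrative) is to wrap the header in an include guard and include it wherever the macro is needed. Including a macro-only header costs a little compile time for the extra preprocessing but nothing at runtime, since the macro expands at compile time; redefining it locally avoids the include but risks the two definitions drifting apart:

        /* header.h */
        #ifndef HEADER_H
        #define HEADER_H
        #define ARRAY_SZ(a) ((int)(sizeof(a)/sizeof((a)[0])))
        #endif

        /* source.c */
        #include "header.h"
        int main(void) {
            int values[10];
            return ARRAY_SZ(values); /* expands to 10 at compile time */
        }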

    Read the article

  • How to Customize the File Open/Save Dialog Box in Windows

    - by Lori Kaufman
    Generally, there are two kinds of Open/Save dialog boxes in Windows. One kind looks like Windows Explorer, with the tree on the left containing Favorites, Libraries, Computer, etc. The other kind contains a vertical toolbar, called the Places Bar. The Windows Explorer-style Open/Save dialog box can be customized by adding your own folders to the Favorites list. You can then click the arrows to the left of the main items, except the Favorites, to collapse them, leaving only the list of default and custom Favorites. The Places Bar is located along the left side of the File Open/Save dialog box and contains buttons providing access to frequently used folders. The default buttons on the Places Bar are links to Recent Places, Desktop, Libraries, Computer, and Network. However, you can change these links to custom folders of your choice. We will show you how to customize the Places Bar using the registry, and using a free tool in case you are not comfortable making changes in the registry.
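
    For reference, a hedged sketch of what the registry tweak commonly looks like (the PlacesBar key is created if it does not exist, Place0 through Place4 map to the five buttons, and the folder paths below are only examples - the article's step-by-step instructions remain the authoritative version):

        Windows Registry Editor Version 5.00

        ; Places Bar buttons for the legacy common Open/Save dialog.
        [HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\comdlg32\PlacesBar]
        "Place0"="C:\\Users\\Public\\Downloads"
        "Place1"="D:\\Projects"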

    Read the article

  • Opening a file opens the folder the file is in, not the file itself

    - by Pepe Lebuntu
    Whenever I try to open a file (such as an .odt or .doc) from, say, the Dash or the Firefox Downloads list, Ubuntu 11.10 opens Nautilus at the folder where the file is, rather than going to the application and loading the file straight away. In previous releases, when I clicked on a downloaded file, it went straight to LibreOffice, and it was fine. This is adding a superfluous step to the process. How do I associate the correct extensions?
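
    One way to fix an association from the command line is xdg-mime (a sketch; the .desktop file name varies by LibreOffice version, so check what is actually in /usr/share/applications first):

        # See what currently handles .odt files
        xdg-mime query default application/vnd.oasis.opendocument.text
        # Point the type back at LibreOffice Writer
        xdg-mime default libreoffice-writer.desktop application/vnd.oasis.opendocument.text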

    Read the article

  • Check to see if file transfer is complete

    - by Cymon
    We have a daily job that processes files delivered from an external source. The process usually runs fine, but every once in a while we attempt to process a file that has not been completely transferred. The external source SCPs these files from a UNIX server to our Windows server, and from there we try to process the files. Is there a way to check whether a file is still being transferred? Does UNIX put a lock on a file while SCPing it that we could check on the Windows side?
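
    scp takes no lock that Windows can observe, so the transfer usually has to be made detectable by convention. A common sketch (file and host names illustrative) is to copy to a temporary name and rename on completion, since the rename happens only after the copy has finished; if no SSH service runs on the Windows box, dropping a small "done" marker file after the scp works the same way:

        # On the UNIX sender:
        scp daily.dat user@winserver:/inbox/daily.dat.part \
          && ssh user@winserver "move C:\inbox\daily.dat.part C:\inbox\daily.dat"
        # The Windows job only picks up files without the .part suffix,
        # so it never sees a half-transferred file.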

    Read the article

  • How do I get the value of a custom SOAP header in WCF

    - by Jason Coyne
    I have created a custom soap header, and added it into my message via IClientMessageInspector: public object BeforeSendRequest(ref System.ServiceModel.Channels.Message request, System.ServiceModel.IClientChannel channel) { var header = new MessageHeader<AuthHeader>(); header.Content = new AuthHeader(Key); header.Actor = "Anyone"; var header2 = header.GetUntypedHeader("Auth", "xWow"); request.Headers.Add(header2); return null; } [DataContract(Name="Auth")] public class AuthHeader { public AuthHeader(string key) { this.Key = key; } [DataMember] public string Key { get; set; } } I also have an IDispatchMessageInspector, and I can find the correct header in the list. However, there is no value. I know that the value went across the wire correctly, because the message string is correct: <s:Envelope xmlns:s=\"http://schemas.xmlsoap.org/soap/envelope/\">\r\n <s:Header>\r\n <Auth s:actor=\"Anyone\" xmlns=\"xWow\" xmlns:i=\"http://www.w3.org/2001/XMLSchema-instance\">\r\n <Key xmlns=\"http://schemas.datacontract.org/2004/07/xWow.Lib\">HERE IS MY KEY VALUE!!!!</Key>\r\n </Auth>\r\n <To s:mustUnderstand=\"1\" xmlns=\"http://schemas.microsoft.com/ws/2005/05/addressing/none\">http://localhost:26443/AuthService.svc</To>\r\n <Action s:mustUnderstand=\"1\" xmlns=\"http://schemas.microsoft.com/ws/2005/05/addressing/none\">http://tempuri.org/IAuthService/GetPayload</Action>\r\n </s:Header>\r\n <s:Body>\r\n <GetPayload xmlns=\"http://tempuri.org/\" />\r\n </s:Body>\r\n</s:Envelope> But there does not seem to be any property to retrieve this value. The MessageHeaderInfo class has Actor, etc., but nothing else useful that I can find. On the client side I had to convert between the typed header and the untyped header; is there an equivalent operation on the server?
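
    For what it's worth, a sketch of the server-side counterpart: WCF can deserialize the typed content straight out of the incoming headers, looked up by the same name and namespace used on the client ("Auth", "xWow"):

        // Works inside a service operation; in an IDispatchMessageInspector,
        // use request.Headers instead of the OperationContext.
        var headers = OperationContext.Current.IncomingMessageHeaders;
        int index = headers.FindHeader("Auth", "xWow");
        if (index >= 0)
        {
            AuthHeader auth = headers.GetHeader<AuthHeader>(index);
            string key = auth.Key; // the value that went across the wire
        }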

    Read the article

  • CSS Relative sized header/footer

    - by superexsl
    Hi, I'm not a CSS expert so I might be missing something obvious. I'm trying to set a relatively sized header and footer (so that on larger monitors, the header and footer are larger than on smaller screens). To do this, I'm using a percentage height. However, this only works if I set the position to absolute. The problem is, setting it to absolute means that it overlaps the main part of the screen (in between the header and footer). Setting it to relative doesn't work since it relies on items being inside the header/footer. This is what my header looks like: .header { position:absolute; top:0px; background-color:white; width:100%; height:30%; } The ASPX page simply contains <div class="header"></div> Is there a way to get relatively proportioned headers and footers? Thanks
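
    A sketch of one way to get proportional regions without absolute positioning: percentage heights only resolve when every ancestor has an explicit height, so give html and body 100% and let the three regions share it (the .main and .footer class names here are assumptions to pair with the .header div above):

        html, body { height: 100%; margin: 0; }
        .header { height: 30%; background-color: white; }
        .main   { height: 40%; overflow: auto; } /* 100% minus header and footer */
        .footer { height: 30%; }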

    Read the article

  • Error 'duplicate definition' when compiling 2 c files that reference 1 header file

    - by super newbie
    I have two C files and one header file, as follows: header.h: char c = 0; file1.c: #include "header.h" file2.c: #include "header.h" I get a 'duplicate definition' warning when compiling. I understand the cause: the variable c ends up defined in both file1.c and file2.c; however, I do need to reference header.h in both C files. How should I overcome this issue?
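
    The usual fix is to put only a declaration in the header and exactly one definition in a single .c file - a sketch:

        /* header.h */
        #ifndef HEADER_H
        #define HEADER_H
        extern char c; /* declaration: no storage allocated here */
        #endif

        /* file1.c */
        #include "header.h"
        char c = 0; /* the one and only definition */

        /* file2.c */
        #include "header.h" /* refers to the same c via the extern declaration */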

    Read the article

  • Using Multiple File Handles for Single File

    - by Ryan Rosario
    I have an O(n^2) operation that requires me to read line i from a file, and then compare line i to every line in the file. This repeats for all i. I wrote the following code to do this with 2 file handles, but it does not yield the result I am looking for. I imagine this is a simple error on my part. IN1 = open("myfile.dat","r") IN2 = open("myfile.dat","r") for line1 in IN1: for line2 in IN2: print line1.strip(), line2.strip() IN1.close() IN2.close() The result: Hello Hello Hello World Hello This Hello is Hello an Hello Example Hello of Hello Using Hello Two Hello File Hello Pointers Hello to Hello Read Hello One Hello File The output should contain 15^2 lines.
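
    The missing piece is that a file object is a one-shot iterator: after the first outer iteration the inner handle sits at end-of-file, so the inner loop runs only once (hence 15 lines instead of 15^2). A sketch of the fix is to rewind the inner handle on every outer iteration:

        IN1 = open("myfile.dat", "r")
        IN2 = open("myfile.dat", "r")
        for line1 in IN1:
            IN2.seek(0)  # rewind so the inner loop sees every line again
            for line2 in IN2:
                print line1.strip(), line2.strip()
        IN1.close()
        IN2.close()

    If the file fits in memory, reading it once into a list (lines = IN1.readlines()) and looping over the list twice avoids the repeated disk reads.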

    Read the article

  • Upload File to Windows Azure Blob in Chunks through ASP.NET MVC, JavaScript and HTML5

    - by Shaun
    Originally posted on: http://geekswithblogs.net/shaunxu/archive/2013/07/01/upload-file-to-windows-azure-blob-in-chunks-through-asp.net.aspx

    Many people are using Windows Azure Blob Storage to store their data in the cloud. Blob storage provides 99.9% availability with an easy-to-use API through the .NET SDK and HTTP REST. For example, we can store JavaScript files, images and documents in blob storage when we are building an ASP.NET web application on a Web Role in Windows Azure, or we can store our VHD files in blob and mount them as hard drives in our cloud service. If you are familiar with Windows Azure, you should know that there are two kinds of blob: page blob and block blob. The page blob is optimized for random read and write, which is very useful when you need to store VHD files. The block blob is optimized for sequential/chunked read and write, which is the more common usage. Since we can upload a block blob in blocks through BlockBlob.PutBlock and then commit them as a whole blob by invoking BlockBlob.PutBlockList, it is very powerful for uploading large files, as we can upload blocks in parallel and provide a pause-resume feature. There are many documents, articles and blog posts describing how to upload a block blob. Most of them focus on the server side: once you have received a big file, stream or binaries, how to upload them into blob storage in blocks through the .NET SDK. But the problem is, how can we upload these large files from the client side, for example, a browser? This question came to me when I was working with a Chinese customer to help them build a network-disk product on top of Azure. The end users upload their files from the web portal, and the files are then stored in blob storage from the Web Role. My goal was to find the best way to transfer a file from the client (the end user's machine) to the server (the Web Role) through the browser. In this post I will demonstrate and describe what I did to upload large files in chunks at high speed, and save them as blocks into Windows Azure Blob Storage.

    Traditional Upload, Works with Limitation

    The simplest way to implement this requirement is to create a web page with a form that contains a file input element and a submit button.

        @using (Html.BeginForm("About", "Index", FormMethod.Post, new { enctype = "multipart/form-data" }))
        {
            <input type="file" name="file" />
            <input type="submit" value="upload" />
        }

    Then, in the backend controller, we retrieve the whole content of this file and upload it into blob storage through the .NET SDK. We can split the file into blocks, upload them in parallel and commit. The code has been well blogged in the community.
        [HttpPost]
        public ActionResult About(HttpPostedFileBase file)
        {
            var container = _client.GetContainerReference("test");
            container.CreateIfNotExists();
            var blob = container.GetBlockBlobReference(file.FileName);
            var blockDataList = new Dictionary<string, byte[]>();
            using (var stream = file.InputStream)
            {
                var blockSizeInKB = 1024;
                var offset = 0;
                var index = 0;
                while (offset < stream.Length)
                {
                    var readLength = Math.Min(1024 * blockSizeInKB, (int)stream.Length - offset);
                    var blockData = new byte[readLength];
                    offset += stream.Read(blockData, 0, readLength);
                    blockDataList.Add(Convert.ToBase64String(BitConverter.GetBytes(index)), blockData);
                    index++;
                }
            }
            Parallel.ForEach(blockDataList, (bi) =>
            {
                blob.PutBlock(bi.Key, new MemoryStream(bi.Value), null);
            });
            blob.PutBlockList(blockDataList.Select(b => b.Key).ToArray());
            return RedirectToAction("About");
        }

    This works perfectly if we select an image, a piece of music or a small video to upload. But if I select a large file, say a 6GB HD movie, then after uploading for a few minutes the request fails and the upload is terminated. In ASP.NET there is a limitation on request length; the maximum request length is defined in the web.config file, and it is a value less than about 4GB. So if we want to upload a really big file, we cannot simply implement it this way. Also, in Windows Azure, the cloud service network load balancer will terminate a connection that exceeds the timeout period; from my tests the timeout looks like 2 - 3 minutes. Hence, when we need to upload a large file we cannot just use the basic HTML elements. Besides the limitations mentioned above, the simple HTML file upload cannot provide a rich upload experience such as chunked upload, pause and pause-resume. So we need to find a better way to upload large files from the client to the server.

    Upload in Chunks through HTML5 and JavaScript

    In order to break the limitations mentioned above, we will upload the large file in chunks. This gives us some benefits:
    - No request size limitation: since we upload in chunks, we can define the request size for each chunk regardless of how big the entire file is.
    - No timeout problem: the size of the chunks is controlled by us, so we can make sure the request for each chunk upload does not exceed the timeout period of either ASP.NET or the Windows Azure load balancer.

    It was a big challenge to upload a big file in chunks until we got HTML5. There are some new features and improvements introduced in HTML5, and we will use them to implement our solution.

    In HTML5, the File interface has been improved with a new method called "slice". It can be used to read part of a file by specifying the start byte index and the end byte index. For example, if the entire file is 1024 bytes, file.slice(512, 768) reads the part of the file from byte 512 up to (but not including) byte 768, and returns a new object of an interface called "Blob", which you can treat as an array of bytes. In fact, a Blob object represents a file-like object of immutable, raw data. The File interface is based on Blob, inheriting blob functionality and extending it to support files on the user's system. For more information about the Blob please refer here. File and Blob are very useful for implementing the chunked upload.
    We will use the File interface to represent the file the user selected in the browser and then use File.slice to read the file in chunks of the size we want. For example, if we wanted to upload a 10MB file with 512KB chunks, we could read it in 512KB blobs by using File.slice in a loop.

    Assume we have a web page as below. The user can select a file, specify the block size in KB in an input box, and click a button to start the upload.

        <div>
            <input type="file" id="upload_files" name="files[]" /><br />
            Block Size: <input type="number" id="block_size" value="512" name="block_size" />KB<br />
            <input type="button" id="upload_button_blob" name="upload" value="upload (blob)" />
        </div>

    Then we can have a JavaScript function upload the file in chunks when the user clicks the button.

        <script type="text/javascript">
            $(function () {
                $("#upload_button_blob").click(function () {
                });
            });
        </script>

    First we need to ensure the client browser supports the interfaces we are going to use. Just try to invoke File, Blob and FormData from the "window" object. If any of them is "undefined", the condition result will be "false", which means your browser doesn't support these features and it's time for you to get your browser updated. FormData is another new feature we are going to use later. It can generate a temporary form for us; we will use this interface to create a form carrying a chunk and its associated metadata when we invoke the service through ajax.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            if (window.File && window.Blob && window.FormData) {
                alert("Your brwoser is awesome, let's rock!");
            }
            else {
                alert("Oh man plz update to a modern browser before try is cool stuff out.");
                return;
            }
        });

    Each browser supports these interfaces through its own implementation, and currently Blob, File and File.slice are supported by Chrome 21, FireFox 13, IE 10, Opera 12 and Safari 5.1 or higher. After that we work on the files the user selected, one by one, since in HTML5 the user can select multiple files in one file input box.

        var files = $("#upload_files")[0].files;
        for (var i = 0; i < files.length; i++) {
            var file = files[i];
            var fileSize = file.size;
            var fileName = file.name;
        }

    Next, we calculate the start index and end index for each chunk based on the size the user specified in the browser. We put them into an array along with the file name and the index, which will be used when we upload the chunks into Windows Azure Blob Storage as blocks, since we need to specify the target blob name and the block index. At the same time we store the list of all indexes in another variable, which will be used to commit the blocks into a blob in Azure Storage once all chunks have been uploaded successfully.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            ... ...
            // start to upload each file in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;

                // calculate the start and end byte index for each block (chunk)
                // with the index, file name and index list for future use
                var blockSizeInKB = $("#block_size").val();
                var blockSize = blockSizeInKB * 1024;
                var blocks = [];
                var offset = 0;
                var index = 0;
                var list = "";
                while (offset < fileSize) {
                    var start = offset;
                    var end = Math.min(offset + blockSize, fileSize);

                    blocks.push({
                        name: fileName,
                        index: index,
                        start: start,
                        end: end
                    });
                    list += index + ",";

                    offset = end;
                    index++;
                }
            }
        });

    Now we have all the chunks' information ready. The next step is to upload them one by one to the server side; on the server side, each received chunk is uploaded as a block into Blob Storage and finally committed with the index list through BlockBlob.PutBlockList. But all of these invocations are ajax calls, which means they are not synchronous, so we need a JavaScript library to help us coordinate the asynchronous operations, named "async.js". You can download this JavaScript library here, and you can find the document here. I will not explain this library too much in this post. We put all the procedures we want to execute into a function array and pass it to the proper function defined in async.js to let it control the execution sequence, in series or in parallel. Hence we define an array and push the chunk-upload functions into it.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            ... ...

            // start to upload each file in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;
                // calculate the start and end byte index for each block (chunk)
                // with the index, file name and index list for future use
                ... ...

                // define the function array and push all chunk upload operations into this array
                blocks.forEach(function (block) {
                    putBlocks.push(function (callback) {
                    });
                });
            }
        });

    As you can see, I used the File.slice method to read each chunk based on the start and end byte index we calculated previously, and constructed a temporary HTML form with the file name, chunk index and chunk data through another new HTML5 feature named FormData. Then I posted this form to the backend server through jQuery.ajax. This is the key part of our solution.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            ... ...
            // start to upload each file in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;
                // calculate the start and end byte index for each block (chunk)
                // with the index, file name and index list for future use
                ... ...
                // define the function array and push all chunk upload operations into this array
                blocks.forEach(function (block) {
                    putBlocks.push(function (callback) {
                        // load blob based on the start and end index for each chunk
                        var blob = file.slice(block.start, block.end);
                        // put the file name, index and blob into a temporary form
                        var fd = new FormData();
                        fd.append("name", block.name);
                        fd.append("index", block.index);
                        fd.append("file", blob);
                        // post the form to backend service (asp.net mvc controller action)
                        $.ajax({
                            url: "/Home/UploadInFormData",
                            data: fd,
                            processData: false,
                            contentType: "multipart/form-data",
                            type: "POST",
                            success: function (result) {
                                if (!result.success) {
                                    alert(result.error);
                                }
                                callback(null, block.index);
                            }
                        });
                    });
                });
            }
        });

    Then we invoke these functions one by one using async.js. Once all functions have been executed successfully, we invoke another ajax call to the backend service to commit all these chunks (blocks) as the blob in Windows Azure Storage.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            ... ...
            // start to upload each file in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;
                // calculate the start and end byte index for each block (chunk)
                // with the index, file name and index list for future use
                ... ...
                // define the function array and push all chunk upload operation into this array
                ... ...
                // invoke the functions one by one
                // then invoke the commit ajax call to put blocks into blob in azure storage
                async.series(putBlocks, function (error, result) {
                    var data = {
                        name: fileName,
                        list: list
                    };
                    $.post("/Home/Commit", data, function (result) {
                        if (!result.success) {
                            alert(result.error);
                        }
                        else {
                            alert("done!");
                        }
                    });
                });
            }
        });

    That's all on the client side. The outline of our logic is:
    - Calculate the start and end byte index for each chunk based on the block size.
    - Define the functions that read each chunk from the file and upload the content to the backend service through ajax.
    - Execute the functions defined in the previous step with "async.js".
    - Finally, commit the chunks by invoking the backend service, which puts them into the blob in Windows Azure Storage.

    Save Chunks as Blocks into Blob Storage

    Above we finished the client-side JavaScript code. It uploads the file in chunks to the backend service, which we are going to implement in this step. We will use ASP.NET MVC as our backend service; it will receive the chunks, upload them into Windows Azure Blob Storage as blocks, and finally commit them as one blob. Since the client side uploads chunks by invoking an ajax call to the URL "/Home/UploadInFormData", I created a new action on the Home controller that only accepts HTTP POST requests.

        [HttpPost]
        public JsonResult UploadInFormData()
        {
            var error = string.Empty;
            try
            {
            }
            catch (Exception e)
            {
                error = e.ToString();
            }

            return new JsonResult()
            {
                Data = new
                {
                    success = string.IsNullOrWhiteSpace(error),
                    error = error
                }
            };
        }

    Then I retrieved the file name, index and chunk content from the Request.Form object, which was passed from our client side.
    Then I used the Windows Azure SDK to create a blob container (in this case we use a container named "test") and create a blob reference with the blob name (the same as the file name), and uploaded the chunk as a block of this blob with the index. In Blob Storage each block must have an index (ID) associated with it, so that at the end we can put all blocks together as one blob by specifying their block ID list.

        [HttpPost]
        public JsonResult UploadInFormData()
        {
            var error = string.Empty;
            try
            {
                var name = Request.Form["name"];
                var index = int.Parse(Request.Form["index"]);
                var file = Request.Files[0];
                var id = Convert.ToBase64String(BitConverter.GetBytes(index));

                var container = _client.GetContainerReference("test");
                container.CreateIfNotExists();
                var blob = container.GetBlockBlobReference(name);
                blob.PutBlock(id, file.InputStream, null);
            }
            catch (Exception e)
            {
                error = e.ToString();
            }

            return new JsonResult()
            {
                Data = new
                {
                    success = string.IsNullOrWhiteSpace(error),
                    error = error
                }
            };
        }

    Next, I created another action to commit the blocks into the blob once all chunks have been uploaded. Similarly, I retrieved the blob name from the Request.Form, along with the chunk ID list (the block ID list) as a string, split it into a list, and then invoked the BlockBlob.PutBlockList method. After that our blob is shown in the container and is ready to be downloaded.

        [HttpPost]
        public JsonResult Commit()
        {
            var error = string.Empty;
            try
            {
                var name = Request.Form["name"];
                var list = Request.Form["list"];
                var ids = list
                    .Split(',')
                    .Where(id => !string.IsNullOrWhiteSpace(id))
                    .Select(id => Convert.ToBase64String(BitConverter.GetBytes(int.Parse(id))))
                    .ToArray();

                var container = _client.GetContainerReference("test");
                container.CreateIfNotExists();
                var blob = container.GetBlockBlobReference(name);
                blob.PutBlockList(ids);
            }
            catch (Exception e)
            {
                error = e.ToString();
            }

            return new JsonResult()
            {
                Data = new
                {
                    success = string.IsNullOrWhiteSpace(error),
                    error = error
                }
            };
        }

    Now we have finished all the code we need. Below is the full client-side JavaScript code.
1: <script type="text/javascript" src="~/Scripts/async.js"></script> 2: <script type="text/javascript"> 3: $(function () { 4: $("#upload_button_blob").click(function () { 5: // assert the browser support html5 6: if (window.File && window.Blob && window.FormData) { 7: alert("Your brwoser is awesome, let's rock!"); 8: } 9: else { 10: alert("Oh man plz update to a modern browser before try is cool stuff out."); 11: return; 12: } 13:  14: // start to upload each files in chunks 15: var files = $("#upload_files")[0].files; 16: for (var i = 0; i < files.length; i++) { 17: var file = files[i]; 18: var fileSize = file.size; 19: var fileName = file.name; 20:  21: // calculate the start and end byte index for each blocks(chunks) 22: // with the index, file name and index list for future using 23: var blockSizeInKB = $("#block_size").val(); 24: var blockSize = blockSizeInKB * 1024; 25: var blocks = []; 26: var offset = 0; 27: var index = 0; 28: var list = ""; 29: while (offset < fileSize) { 30: var start = offset; 31: var end = Math.min(offset + blockSize, fileSize); 32:  33: blocks.push({ 34: name: fileName, 35: index: index, 36: start: start, 37: end: end 38: }); 39: list += index + ","; 40:  41: offset = end; 42: index++; 43: } 44:  45: // define the function array and push all chunk upload operation into this array 46: var putBlocks = []; 47: blocks.forEach(function (block) { 48: putBlocks.push(function (callback) { 49: // load blob based on the start and end index for each chunks 50: var blob = file.slice(block.start, block.end); 51: // put the file name, index and blob into a temporary from 52: var fd = new FormData(); 53: fd.append("name", block.name); 54: fd.append("index", block.index); 55: fd.append("file", blob); 56: // post the form to backend service (asp.net mvc controller action) 57: $.ajax({ 58: url: "/Home/UploadInFormData", 59: data: fd, 60: processData: false, 61: contentType: "multipart/form-data", 62: type: "POST", 63: success: function (result) { 64: if (!result.success) { 65: alert(result.error); 66: } 67: callback(null, block.index); 68: } 69: }); 70: }); 71: }); 72:  73: // invoke the functions one by one 74: // then invoke the commit ajax call to put blocks into blob in azure storage 75: async.series(putBlocks, function (error, result) { 76: var data = { 77: name: fileName, 78: list: list 79: }; 80: $.post("/Home/Commit", data, function (result) { 81: if (!result.success) { 82: alert(result.error); 83: } 84: else { 85: alert("done!"); 86: } 87: }); 88: }); 89: } 90: }); 91: }); 92: </script> And below is the full ASP.NET MVC controller code. 
        public class HomeController : Controller
        {
            private CloudStorageAccount _account;
            private CloudBlobClient _client;

            public HomeController()
                : base()
            {
                _account = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("DataConnectionString"));
                _client = _account.CreateCloudBlobClient();
            }

            public ActionResult Index()
            {
                ViewBag.Message = "Modify this template to jump-start your ASP.NET MVC application.";

                return View();
            }

            [HttpPost]
            public JsonResult UploadInFormData()
            {
                var error = string.Empty;
                try
                {
                    var name = Request.Form["name"];
                    var index = int.Parse(Request.Form["index"]);
                    var file = Request.Files[0];
                    var id = Convert.ToBase64String(BitConverter.GetBytes(index));

                    var container = _client.GetContainerReference("test");
                    container.CreateIfNotExists();
                    var blob = container.GetBlockBlobReference(name);
                    blob.PutBlock(id, file.InputStream, null);
                }
                catch (Exception e)
                {
                    error = e.ToString();
                }

                return new JsonResult()
                {
                    Data = new
                    {
                        success = string.IsNullOrWhiteSpace(error),
                        error = error
                    }
                };
            }

            [HttpPost]
            public JsonResult Commit()
            {
                var error = string.Empty;
                try
                {
                    var name = Request.Form["name"];
                    var list = Request.Form["list"];
                    var ids = list
                        .Split(',')
                        .Where(id => !string.IsNullOrWhiteSpace(id))
                        .Select(id => Convert.ToBase64String(BitConverter.GetBytes(int.Parse(id))))
                        .ToArray();

                    var container = _client.GetContainerReference("test");
                    container.CreateIfNotExists();
                    var blob = container.GetBlockBlobReference(name);
                    blob.PutBlockList(ids);
                }
                catch (Exception e)
                {
                    error = e.ToString();
                }

                return new JsonResult()
                {
                    Data = new
                    {
                        success = string.IsNullOrWhiteSpace(error),
                        error = error
                    }
                };
            }
        }

    If we select a file in the browser, we will see our application upload chunks of the size we specified to the server through ajax calls in the background, and then commit all the chunks into one blob. We can then find the blob in our Windows Azure Blob Storage.

    Optimized by Parallel Upload

    In the previous example we just uploaded our file in chunks. This solved the ASP.NET MVC request content size limitation as well as the Windows Azure load balancer timeout, but it might introduce a performance problem, since we uploaded the chunks in sequence. To improve the upload performance we can modify our client-side code a bit to invoke the upload operations in parallel. The good news is that the "async.js" library provides a parallel execution function. If you remember the code that invokes the service to upload chunks, it used "async.series", which means all functions are executed in sequence. Now we will change this code to "async.parallel", which invokes all functions in parallel.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            ... ...
            // start to upload each file in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;
                // calculate the start and end byte index for each block (chunk)
                // with the index, file name and index list for future use
                ... ...
                // define the function array and push all chunk upload operations into this array
                ... ...
                // invoke the functions in parallel
                // then invoke the commit ajax call to put blocks into blob in azure storage
                async.parallel(putBlocks, function (error, result) {
                    var data = {
                        name: fileName,
                        list: list
                    };
                    $.post("/Home/Commit", data, function (result) {
                        if (!result.success) {
                            alert(result.error);
                        }
                        else {
                            alert("done!");
                        }
                    });
                });
            }
        });

    In this way all chunks are uploaded to the server side at the same time to maximize the bandwidth usage. This should work if the file is not very large and the chunk size is not very small. But for a large file it might introduce another problem: too many ajax calls are sent to the server at the same time. So the best solution is to upload the chunks in parallel with a maximum concurrency limit. The code below sets the concurrency limit to 4, which means at most 4 ajax calls can be in flight at the same time.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            ... ...
            // start to upload each file in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;
                // calculate the start and end byte index for each block (chunk)
                // with the index, file name and index list for future use
                ... ...
                // define the function array and push all chunk upload operations into this array
                ... ...
                // invoke the functions in parallel with a concurrency limit
                // then invoke the commit ajax call to put blocks into blob in azure storage
                async.parallelLimit(putBlocks, 4, function (error, result) {
                    var data = {
                        name: fileName,
                        list: list
                    };
                    $.post("/Home/Commit", data, function (result) {
                        if (!result.success) {
                            alert(result.error);
                        }
                        else {
                            alert("done!");
                        }
                    });
                });
            }
        });

    Summary

    In this post we discussed how to upload files in chunks to the backend service and then upload them into Windows Azure Blob Storage in blocks. We focused on the frontend side and leveraged three new features introduced in HTML5:
    - File.slice: read part of the file by specifying the start and end byte index.
    - Blob: a file-like interface which contains part of the file content.
    - FormData: a temporary form element through which we can pass the chunk along with some metadata to the backend service.

    Then we discussed the performance considerations of chunked uploading. Sequential upload cannot provide the maximum upload speed, while unlimited parallel upload might crash the browser and server if too many chunks are in flight. So we finally came up with the solution of uploading chunks in parallel with a concurrency limit. We also demonstrated how to utilize the "async.js" JavaScript library to help us control the asynchronous calls and the parallel limit.

    Regarding the chunk size and the parallel limit value, there is no "best" value. You need to test various combinations and find out the best one for your particular scenario. It depends on the local bandwidth, the client machine cores, and the server-side (Windows Azure Cloud Service Virtual Machine) cores, memory and bandwidth. Below is one of my performance test results. The client machine was Windows 8 with IE 10 and 4 cores. I was using the Microsoft corporate network. The web site was hosted in the Windows Azure China North data center (in Beijing) with one small web role (1.7GHz 1-core CPU, 1.75GB memory, 100Mbps bandwidth).
    The test cases were:
    - Chunk size: 512KB, 1MB, 2MB, 4MB.
    - Upload mode: sequential, parallel (unlimited), parallel with a limit (4 threads, 8 threads).
    - Chunk format: base64 string, binaries.
    - Target file: 100MB.
    - Each case was tested 3 times.

    (The test result chart is in the original post.) Some thoughts, but not guidance or best practice:
    - Parallel gets better performance than sequential.
    - There is no significant performance improvement between parallel with 4 threads and with 8 threads.
    - Transferring binaries provides better performance than base64.
    - In all cases, a chunk size of 1MB - 2MB gets the best performance.

    Hope this helps, Shaun. All documents and related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • Constructing radiotap header and ieee80211 header structures for packet injection

    - by hektor
    I am trying to communicate between two laptop machines using Wifi. The structure of the radiotap header and ieee80211 header I am using is: struct ieee80211_radiotap_header { unsigned char it_version; uint16_t it_len; uint32_t it_present; }; /* Structure for 80211 header */ struct ieee80211_hdr_3addr { uint16_t frame_ctl[2]; uint16_t duration_id; unsigned char addr1[ETH_ALEN]; unsigned char addr2[ETH_ALEN]; unsigned char addr3[ETH_ALEN]; uint16_t seq_ctl; }; struct packet { struct ieee80211_radiotap_header rtap_header; struct ieee80211_hdr_3addr iee802_header; unsigned char payload[30]; }; /* In main program */ struct packet mypacket; struct ieee80211_radiotap_header ratap_header; struct ieee80211_hdr_3addr iee802_header; unsigned char addr1[ETH_ALEN] = {0xFF,0xFF,0xFF,0xFF,0xFF,0xFF}; /* broadcast address */ unsigned char addr2[ETH_ALEN] = {0x28,0xcf,0xda,0xde,0xd3,0xcc}; /* mac address of network card */ unsigned char addr3[ETH_ALEN] = {0xd8,0xc7,0xc8,0xd7,0x9f,0x21}; /* mac address of access point i am trying to connect to */ /* Radio tap header data */ ratap_header.it_version = 0x00; ratap_header.it_len = 0x07; ratap_header.it_present = (1 << IEEE80211_RADIOTAP_RATE); mypacket.rtap_header = ratap_header; /* ieee80211 header data */ iee802_header.frame_ctl[0] = IEEE80211_FC0_VERSION_0 | IEEE80211_FC0_TYPE_MGT | IEEE80211_FC0_SUBTYPE_BEACON; iee802_header.frame_ctl[1] =IEEE80211_FC1_DIR_NODS; strcpy(iee802_header.addr1,addr1); strcpy(iee802_header.addr2,addr2); strcpy(iee802_header.addr3,addr3); iee802_header.seq_ctl = 0x1086; mypacket.iee802_header=iee802_header; /* Payload */ unsigned char payload[PACKET_LENGTH]="temp"; strcpy(mypacket.payload , payload); I am able to receive the packets when I test the transmission and reception on the same laptop. However I am not able to receive the packet transmitted on a different laptop. Wireshark does not show the packet as well. Can anyone point out the mistake I am making?
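
    One bug stands out: strcpy copies until the first zero byte and is meant for NUL-terminated strings, so it must not be used for 6-byte MAC addresses (an address of all 0xFF bytes has no terminator at all). A sketch of the fix follows; note that the header structs also need to be packed so the compiler doesn't insert padding between fields, and the radiotap it_len must match the real on-the-wire header length (a hedged guess below, to be verified against the radiotap spec):

        /* copy fixed-size MAC addresses with memcpy, not strcpy */
        memcpy(iee802_header.addr1, addr1, ETH_ALEN);
        memcpy(iee802_header.addr2, addr2, ETH_ALEN);
        memcpy(iee802_header.addr3, addr3, ETH_ALEN);

        /* declare both headers packed; the standard radiotap header also has
           a pad byte between the version and length fields */
        struct ieee80211_radiotap_header {
            unsigned char it_version;
            unsigned char it_pad;
            uint16_t it_len;     /* total radiotap header length on the wire */
            uint32_t it_present;
        } __attribute__((__packed__));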

    Read the article

  • UITableView Header View Scrolls

    - by Jim Bonner
    I am using a header view, not section header, with my UITableView. The documentation says that the header view sits on top of the table, but my header view scrolls just like a normal row. Is there a way to make the header view always visible at the top of the table?
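
    That is the documented behavior: tableHeaderView scrolls with the content; only section headers float, and only while the table's style is UITableViewStylePlain. A sketch of the two usual options (myHeaderView is an assumed property name):

        // Option 1: serve the view as a section header so it floats (plain style only).
        - (UIView *)tableView:(UITableView *)tableView viewForHeaderInSection:(NSInteger)section {
            return self.myHeaderView;
        }
        - (CGFloat)tableView:(UITableView *)tableView heightForHeaderInSection:(NSInteger)section {
            return 44.0;
        }
        // Option 2: add the header as a sibling view above the table,
        // and shrink the table view's frame so it starts below the header.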

    Read the article

  • Is this too much code for a header only library?

    - by Billy ONeal
    It seems like I had to inline quite a bit of code here. I'm wondering if it's bad design practice to leave this entirely in a header file like this: #pragma once #include <string> #include <boost/noncopyable.hpp> #include <boost/make_shared.hpp> #include <boost/iterator/iterator_facade.hpp> #include <Windows.h> #include "../Exception.hpp" namespace WindowsAPI { namespace FileSystem { class FileData; struct AllResults; struct FilesOnly; template <typename Filter_T = AllResults> class DirectoryIterator; namespace detail { class DirectoryIteratorImpl : public boost::noncopyable { WIN32_FIND_DATAW currentData; HANDLE hFind; std::wstring root; public: inline DirectoryIteratorImpl(); inline explicit DirectoryIteratorImpl(const std::wstring& pathSpec); inline void increment(); inline bool equal(const DirectoryIteratorImpl& other) const; inline const std::wstring& GetPathRoot() const; inline const WIN32_FIND_DATAW& GetCurrentFindData() const; inline ~DirectoryIteratorImpl(); }; } class FileData //Serves as a proxy to the WIN32_FIND_DATA struture inside the iterator. { boost::shared_ptr<detail::DirectoryIteratorImpl> iteratorSource; public: FileData(const boost::shared_ptr<detail::DirectoryIteratorImpl>& parent) : iteratorSource(parent) {}; DWORD GetAttributes() const { return iteratorSource->GetCurrentFindData().dwFileAttributes; }; bool IsDirectory() const { return (GetAttributes() | FILE_ATTRIBUTE_DIRECTORY) != 0; }; bool IsFile() const { return !IsDirectory(); }; bool IsArchive() const { return (GetAttributes() | FILE_ATTRIBUTE_ARCHIVE) != 0; }; bool IsReadOnly() const { return (GetAttributes() | FILE_ATTRIBUTE_READONLY) != 0; }; unsigned __int64 GetSize() const { ULARGE_INTEGER intValue; intValue.LowPart = iteratorSource->GetCurrentFindData().nFileSizeLow; intValue.HighPart = iteratorSource->GetCurrentFindData().nFileSizeHigh; return intValue.QuadPart; }; std::wstring GetFolderPath() const { return iteratorSource->GetPathRoot(); }; std::wstring GetFileName() const { return iteratorSource->GetCurrentFindData().cFileName; }; std::wstring GetFullFileName() const { return GetFolderPath() + GetFileName(); }; std::wstring GetShortFileName() const { return iteratorSource->GetCurrentFindData().cAlternateFileName; }; FILETIME GetCreationTime() const { return iteratorSource->GetCurrentFindData().ftCreationTime; }; FILETIME GetLastAccessTime() const { return iteratorSource->GetCurrentFindData().ftLastAccessTime; }; FILETIME GetLastWriteTime() const { return iteratorSource->GetCurrentFindData().ftLastWriteTime; }; }; struct AllResults : public std::unary_function<const FileData&, bool> { bool operator()(const FileData&) { return true; }; }; struct FilesOnly : public std::unary_function<const FileData&, bool> { bool operator()(const FileData& arg) { return arg.IsFile(); }; }; template <typename Filter_T> class DirectoryIterator : public boost::iterator_facade<DirectoryIterator<Filter_T>, const FileData, std::input_iterator_tag> { friend class boost::iterator_core_access; boost::shared_ptr<detail::DirectoryIteratorImpl> impl; FileData current; Filter_T filter; void increment() { do { impl->increment(); } while (! 
filter(current)); }; bool equal(const DirectoryIterator& other) const { return impl->equal(*other.impl); }; const FileData& dereference() const { return current; }; public: DirectoryIterator(Filter_T functor = Filter_T()) : impl(boost::make_shared<detail::DirectoryIteratorImpl>()), current(impl), filter(functor) { }; explicit DirectoryIterator(const std::wstring& pathSpec, Filter_T functor = Filter_T()) : impl(boost::make_shared<detail::DirectoryIteratorImpl>(pathSpec)), current(impl), filter(functor) { }; }; namespace detail { DirectoryIteratorImpl::DirectoryIteratorImpl() : hFind(INVALID_HANDLE_VALUE) { } DirectoryIteratorImpl::DirectoryIteratorImpl(const std::wstring& pathSpec) { std::wstring::const_iterator lastSlash = std::find(pathSpec.rbegin(), pathSpec.rend(), L'\\').base(); root.assign(pathSpec.begin(), lastSlash); hFind = FindFirstFileW(pathSpec.c_str(), &currentData); if (hFind == INVALID_HANDLE_VALUE) WindowsApiException::ThrowFromLastError(); while (!wcscmp(currentData.cFileName, L".") || !wcscmp(currentData.cFileName, L"..")) { increment(); } } void DirectoryIteratorImpl::increment() { BOOL success = FindNextFile(hFind, &currentData); if (success) return; DWORD error = GetLastError(); if (error == ERROR_NO_MORE_FILES) { FindClose(hFind); hFind = INVALID_HANDLE_VALUE; } else { WindowsApiException::Throw(error); } } DirectoryIteratorImpl::~DirectoryIteratorImpl() { if (hFind != INVALID_HANDLE_VALUE) FindClose(hFind); } bool DirectoryIteratorImpl::equal(const DirectoryIteratorImpl& other) const { if (this == &other) return true; return hFind == other.hFind; } const std::wstring& DirectoryIteratorImpl::GetPathRoot() const { return root; } const WIN32_FIND_DATAW& DirectoryIteratorImpl::GetCurrentFindData() const { return currentData; } } }}

    Read the article

  • PJSIP Custom Registration Header (iOS)

    - by Daniel Redington
    I am attempting to set up SIP communication with an internal server (using the PJSIP library); however, this server requires a custom header field with a specified header value in the REGISTER call. For example's sake we'll call this required header "MyHeader". From what I've found, the pjsua_acc_add() function will add an account and register it to the server using a config struct. The parameter "reg_hdr_list" of the config struct has the description "The optional custom SIP headers to be put in the registration request", which sounds like exactly what I need; however, it doesn't seem to have any effect on the call itself. Here's what I have so far: pjsip_hdr test; test.name = pj_str("MyHeader"); test.sname = pj_str("MyHdr"); test.type = PJSIP_H_OTHER; test.prev = cfg.reg_hdr_list.prev; test.next = cfg.reg_hdr_list.next; cfg.reg_hdr_list = test; pj_status_t status; status = pjsua_acc_add(&cfg, PJ_TRUE, &acc_id); From the server side, there are no extra header fields or anything. And the struct that is used to define the header (pjsip_hdr) has no "value" or equivalent field, so even if it did create the header, how does it set the value? Here's the reference for the header list definition: Link And the reference for the header struct: Link Any help would be greatly appreciated!
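
    A hedged sketch of the pattern PJSIP's own samples use: reg_hdr_list is a linked list, so headers are appended with pj_list_push_back, and a name/value header is expressed as a pjsip_generic_string_hdr (which, unlike the bare pjsip_hdr, actually carries a value):

        pj_str_t hname  = pj_str("MyHeader");
        pj_str_t hvalue = pj_str("my-header-value"); /* illustrative value */
        pjsip_generic_string_hdr myheader;
        /* init2 keeps pointers to hname/hvalue, so they must outlive this scope */
        pjsip_generic_string_hdr_init2(&myheader, &hname, &hvalue);
        pj_list_push_back(&cfg.reg_hdr_list, &myheader);
        status = pjsua_acc_add(&cfg, PJ_TRUE, &acc_id);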

    Read the article

  • PHP File Upload second file does not upload, first file does without error

    - by Curtis
    So I have a script I have been using and it generally works well with multiple files... When I upload a very large file in a multiple-file upload, only the first file is uploaded. I am not seeing any errors as to why. I figure this is related to a timeout setting but cannot figure it out - any ideas? I have the following set in my htaccess file:

        php_value post_max_size 1024M
        php_value upload_max_filesize 1024M
        php_value memory_limit 600M
        php_value output_buffering on
        php_value max_execution_time 259200
        php_value max_input_time 259200
        php_value session.cookie_lifetime 0
        php_value session.gc_maxlifetime 259200
        php_value default_socket_timeout 259200
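
    Two things worth checking, sketched below (assuming the form input is named files[]): PHP drops files that blow a limit without raising a page-level error, but it records why in $_FILES, and post_max_size has to cover the sum of all files in one request (there is also a max_file_uploads directive capping the file count per request):

        <?php
        // Inspect the per-file error codes instead of assuming success.
        foreach ($_FILES['files']['error'] as $i => $code) {
            if ($code !== UPLOAD_ERR_OK) {
                // e.g. UPLOAD_ERR_INI_SIZE (1) means upload_max_filesize was exceeded
                error_log("file $i failed with upload error code $code");
            }
        }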

    Read the article

  • scp No such file or directory

    - by Joe
    I've a confusing question for which superuser doesn't seem to have a good answer, and neither does Google. I'm trying to scp a file from a remote server to my local machine. The command is this: scp user@server:/path/to/source/file.gz /path/to/destination The error I get is: scp: /path/to/source/file.gz: No such file or directory user is my username on the server. The command syntax appears fine to me. ssh works fine and I can cd to the file, and it doesn't seem to be an access control issue? Thanks. Edit: Thank you John. I spotted the issue. ls returned this: -r--r--r-- 1 nobody users 168967171 Mar 10 2009 /path/to/source/file.gz So the file was on a read-only file system, and user is able to read it but not scp it. I just copied the file to a different directory, chowned it, and it worked fine. It would be good if someone could explain why this is the case though.

    Read the article

  • What is a header? Especially, what are POST&GET headers?

    - by brilliant
    Hello, I've been trying to find Python code that would log in to my Yahoo account from "Google App Engine". One supporter on "StackOverflow" gave me this three-step plan: Simulate a normal login and save the login page that you get; Save the POST&GET headers with "Wireshark"; Compare the login page with those headers and see what fields you need to include with your request. The problem here is that I have never used "Wireshark" before. Plus, I don't know what the POST&GET headers are. Can you please explain it to me (preferably with some example)? Thank you.
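
    As a concrete starting point, curl can show the same headers Wireshark would capture (a sketch; -v prints outgoing request headers prefixed with '>' and incoming response headers prefixed with '<', and the URLs here are illustrative):

        # A GET request: parameters travel in the URL
        curl -v "https://example.com/search?q=header"
        # A POST request: parameters travel in the request body
        curl -v -d "login=myname&passwd=secret" https://login.yahoo.com/

    A header is simply a "Name: value" line sent before the body - e.g. Content-Type, Cookie, User-Agent - and the login form's fields are what step 3 asks you to reproduce in your own request.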

    Read the article

  • Ironport X-Junk header.

    - by Kyle
    My school uses IronPort for filtering/monitoring web traffic. I'm a bit curious as to why it tacks an X-Junk: header onto everything. After going through a few curl tests, I've found no real connection between what is shown on the page and the X-Junk: header. Here's my curl request; any ideas?

        Anchorage:~ khotchkiss$ curl -I google.com
        HTTP/1.1 301 Moved Permanently
        Location: http://www.google.com/
        Content-Type: text/html; charset=UTF-8
        Date: Sun, 06 Feb 2011 04:37:25 GMT
        Expires: Tue, 08 Mar 2011 04:37:25 GMT
        Cache-Control: public, max-age=2592000
        Server: gws
        X-XSS-Protection: 1; mode=block
        Content-Length: 219
        Age: 108
        Via: 1.1 MC-IRONPORT.UNIVERSITY.LIBERTY.EDU:80 (IronPort-WSA/6.3.3-015)
        Connection: keep-alive
        X-Junk: xxxxxxxxxxxxxxxxxxxxxxx

    Read the article

  • Designing status management for a file processing module

    - by bot
    The background: One of the functionalities of a product that I am currently working on is to process a set of compressed files (containing XML files) that will be made available at a fixed location periodically (local or remote location - doesn't really matter for now) and dump the contents of each XML file into a database. I have taken care of the design for a generic parsing module that should be able to accommodate the parsing of any file type, as I have explained in my question linked below. There is no need to look at the following link to answer my question, but it would definitely provide better context for the problem: Generic file parser design in Java using the Strategy pattern.

    The goal: I want to be able to keep track of the status of each XML file and the status of each compressed file containing the XML files. I can probably have different statuses defined for the XML files, such as NEW, PROCESSING, LOADING, COMPLETE or FAILED. I can derive the status of a compressed file based on the status of the XML files within it, e.g. the status of the compressed file is COMPLETE if no XML file inside it is in a FAILED state, or the status of the compressed file is FAILED if the status of at least one XML file inside it is FAILED.

    A possible solution - the model: I need to maintain the status of each XML file and the compressed file. I will have to define some POJOs for holding the information about an XML file, as shown below. Note that there is no need to store the status of a compressed file, as it can be derived from the status of its XML files. public class FileInformation { private String compressedFileName; private String xmlFileName; private long lastModifiedDate; private int status; public FileInformation(final String compressedFileName, final String xmlFileName, final long lastModified, final int status) { this.compressedFileName = compressedFileName; this.xmlFileName = xmlFileName; this.lastModifiedDate = lastModified; this.status = status; } } I can then have a class called StatusManager that aggregates a Map of FileInformation instances and provides me the status of a given file at any given time in the lifetime of the application, as shown below: public class StatusManager { private Map<String,FileInformation> processingMap = new HashMap<String,FileInformation>(); public void add(FileInformation fileInformation) { fileInformation.setStatus(0); // 0 will indicates that the file is in NEW state. 1 will indicate that the file is in process and so on.. processingMap.put(fileInformation.getXmlFileName(),fileInformation); } public void update(String filename,int status) { FileInformation fileInformation = processingMap.get(filename); fileInformation.setStatus(status); } } That takes care of the model for the sake of explanation.

    So what's my question? (Edited after comments from Loki and answer from Eric.) I would like to know if there are any existing design patterns that I can refer to while coming up with a design. I would also like to know how I should go about designing the status management classes. I am more interested in understanding how I can model the status management classes. I am not interested in how other components are going to be updated about a change in status at the moment, as suggested by Eric.
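
    As a sketch of the derivation rule described above, a StatusManager method could compute a compressed file's status on demand (the status constants and the FileInformation getters are assumed here, since the POJO above only shows fields and setters):

        // Inside StatusManager: FAILED wins; otherwise COMPLETE only when
        // every XML file from this archive is COMPLETE.
        public int getCompressedFileStatus(String compressedFileName) {
            boolean allComplete = true;
            for (FileInformation info : processingMap.values()) {
                if (!info.getCompressedFileName().equals(compressedFileName)) {
                    continue;
                }
                if (info.getStatus() == FAILED) {
                    return FAILED; // one failure fails the whole archive
                }
                if (info.getStatus() != COMPLETE) {
                    allComplete = false;
                }
            }
            return allComplete ? COMPLETE : PROCESSING;
        }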

    Read the article

  • Blackberry read local properties file in project

    - by Dachmt
    Hi, I have a config.properties file at the root of my BlackBerry project (the same place as the Blackberry_App_Descriptor.xml file), and I try to access the file to read from and write to it. See my class below: public class Configuration { private String file; private String fileName; public Configuration(String pathToFile) { this.fileName = pathToFile; try { // Try to load the file and read it System.out.println("---------- Start to read the file"); file = readFile(fileName); System.out.println("---------- Property file:"); System.out.println(file); } catch (Exception e) { System.out.println("---------- Error reading file"); System.out.println(e.getMessage()); } } /** * Read a file and return it in a String * @param fName * @return */ private String readFile(String fName) { String properties = null; try { System.out.println("---------- Opening the file"); //to actually retrieve the resource prefix the name of the file with a "/" InputStream is = this.getClass().getResourceAsStream(fName); //we now have an input stream. Create a reader and read out //each character in the stream. System.out.println("---------- Input stream"); InputStreamReader isr = new InputStreamReader(is); char c; System.out.println("---------- Append string now"); while ((c = (char)isr.read()) != -1) { properties += c; } } catch (Exception e) { } return properties; } } I call my class constructor like this: Configuration config = new Configuration("/config.properties"); So in my class, "file" should hold all the content of the config.properties file, and "fileName" should have the value "/config.properties". But "file" ends up null because the resource cannot be found... I know it is the path to the file that must be wrong, but I don't know what I can change... The class is in the package com.mycompany.blackberry.utils Thank you!
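
    A hedged guess at the failure: getResourceAsStream only sees resources packaged into the application's .cod, and it returns null (rather than throwing) when the path doesn't match, so the failure only surfaces later. Note also that the read loop casts read() to char before comparing with -1, so the end-of-stream test can never succeed; keep the value an int. A sketch:

        InputStream is = getClass().getResourceAsStream("/config.properties");
        if (is == null) {
            System.out.println("---------- Resource not packaged or wrong path");
            return null;
        }
        InputStreamReader isr = new InputStreamReader(is);
        int ch; // keep it an int so -1 (end of stream) is detectable
        while ((ch = isr.read()) != -1) {
            properties += (char) ch;
        }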

    Read the article

  • batch file infinite loop when parsing file

    - by Bart
    Okay, this should be a really simple task but it's proving to be more complicated than I think it should be. I'm clearly doing something wrong, and would like someone else's input. What I would like to do is parse through a file containing paths to directories and set permissions on those directories. An example line of the input file (there are several lines, all formatted the same way, with a different path to a directory): E:\stuff\Things\something else (X)\ (The file in question is generated under Cygwin using find to list all directories with "(X)" in the name. The file is then passed through unix2win to make it Windows-compatible. I've also tried manually creating the input file from within Windows to rule out the file's creation method as the problem.) Here's where I'm stuck... I wrote the following quick and dirty batch file in Windows XP and it worked without any issues at all, but it will not work in Server 2k8. Batch file code to run through the file and set permissions: FOR /F "tokens=*" %%A IN (dirlist.txt) DO echo y| cacls "%%A" /T /C /G "Domain Admins":f "Some Group":f "some-security-group":f What this is SUPPOSED to do (and does in XP) is loop through the specified file (dirlist.txt) and run cacls.exe on each directory it pulls from the file. The "echo y|" is in there to automagically confirm when cacls helpfully asks "are you sure?" for every directory in the list. Unfortunately, however, what it DOES is fall into an infinite loop. I've tried surrounding everything after "DO" with quotes, which prevents the endless loop but confuses cacls so it throws an error. Interestingly, I've tried running the code from after "DO" manually (obviously replacing the variable with the full path, copied straight from the file) at a command prompt and it runs as expected. I don't think it's the file or the loop, as adding quotes to the command to be executed prevents the loop from continuing past where it's supposed to... I really have no idea at this point. Any help would be appreciated. I have a feeling it's going to be something incredibly stupid... but I'm pulling my hair out so I thought I'd ask.
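
    One sketch that sidesteps the prompt entirely on Server 2008: icacls (the designated replacement for the deprecated cacls there) does not ask for confirmation, so the fragile echo y| pipe inside the loop goes away; the group names follow the question:

        FOR /F "usebackq tokens=*" %%A IN ("dirlist.txt") DO icacls "%%A" /T /C /grant "Domain Admins":F "Some Group":F "some-security-group":F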

    Read the article

  • Microsoft Management Console stops working when I add snap-in to it

    - by JayaprakashReddy
    I have Windows 7 Ultimate. I open mmc.exe as administrator and try to add Certificates or any other snap-in; while the snap-in is loading, MMC breaks, displays the message below, and then closes automatically once I click the close button. What could be the problem? I tried the following to fix it, but none of them succeeded: I tried to repair the OS; I repaired files using this method; I even repaired the installation using this link.

    Update: @oldskool: Here is the debug process output (sorry, it's a long one). Nearly all of it is routine module-load lines of the form 'mmc.exe': Loaded 'C:\Windows\System32\<module>', Cannot find or open the PDB file (expected for system binaries), covering the core Windows DLLs (ntdll, kernel32, user32, gdi32, ole32, oleaut32, advapi32, shell32, and so on), the MMC libraries (mmcbase, mmcndmgr), the snap-in DLLs (certmgr, certcli, CertEnroll, cryptui, azroleui, gpedit, ipsmsnap, ipsecsnp, polstore, localsec, devmgr, dmdskmgr, mycomput, comsnap, wsecedit, scecli, filemgmt, pmcsnap, wdc, pdh, pla), and two third-party snap-in binaries: mscormmc.dll from the Visual Studio 8 SDK and SqlManager.dll from both the SQL Server 90 and 100 tool directories. A handful of resource DLLs report Binary was not built with debug information instead (odbcint, dmdskres, dmdskres2, mfc80ENU, SqlManager.rll). The last module loaded is C:\Windows\System32\wbem\wbemcntl.dll, after which the log ends with:

    The thread 'Win32 Thread' (0xf74) has exited with code 0 (0x0).
    Unhandled exception at 0x774d35e3 in mmc.exe: 0xC0000374: A heap has been corrupted.

    Read the article

  • Apache doesn't send the mp3 header even when I use the file's direct address

    - by user1728307
    Apache doesn't send the mp3 header even when I use the file's direct address. I can play the files with Flash audio players on my web pages, but when I try to download one from its direct address on my server, I get "Error 101 (net::ERR_CONNECTION_RESET): The connection was reset", or sometimes I get a file with an .mp3 extension that is only 13 bytes in size; when I open that file in gedit/notepad, it contains just: <html></html>. I don't have any problem with PHP files and images, but mp3 files are never sent to the browser for download or playback. I added this line to httpd.conf: AddType audio/mpeg .mp3, but it makes no difference! Thanks in advance.
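
    One way to narrow this down (a sketch, assuming a stock Apache 2.x with mod_rewrite; the domain and paths below are hypothetical placeholders) is to make sure no rewrite rule or handler intercepts .mp3 requests, and then to inspect the response headers directly:

        # httpd.conf or .htaccess -- serve .mp3 as static audio
        AddType audio/mpeg .mp3
        <IfModule mod_rewrite.c>
            RewriteEngine On
            # let existing .mp3 files bypass any catch-all rewrite to a front controller
            RewriteCond %{REQUEST_FILENAME} -f
            RewriteRule \.mp3$ - [L]
        </IfModule>

        # then check what the server actually returns:
        #   curl -I http://example.com/path/to/song.mp3
        # and confirm Content-Type: audio/mpeg plus a plausible Content-Length.

    A 13-byte <html></html> body usually means something other than the static-file handler answered the request (a rewrite to index.php, hotlink protection, or a security module), so the curl header check tells you whether the fix took.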

    Read the article

  • File corruption after copying files in Windows 7 64 bit using two methods

    - by DustByte
    I have 5000 pictures and other files in a directory taking up 35 GB, and I want to duplicate this directory.

    Method 1: I do a simple copy and paste of the directory in Explorer. I have the habit of checking checksums after copying important files, and in this case I noticed that around 2000 files failed the MD5 test. On closer inspection of a randomly chosen JPEG with differing checksums, it turned out that some XMP metadata had changed. In particular, the tag <MicrosoftPhoto:DateAcquired> had its date changed from 2009 to today (possibly around the time I was copying the files). I have no idea what triggered this XMP data to change, exactly when it changed, or why it affected these particular files, but at least it seems to explain the checksum discrepancy.

    Method 2: Since I want exact duplicates of the files, I tried the program FreeFileSync to mirror the directory, hoping no XMP metadata would mysteriously change. A checksum test, in addition to a thorough file-comparison test in FreeFileSync, led to two similar but different results: 31 files fail the checksum test, and 23 files fail the file-comparison test. The smaller set is not entirely contained in the bigger set, although many files occur in both. What is alarming here is that not only JPEGs are flagged as altered but also some AVIs, MPGs, and a large 7-zip file. Closer inspection of a JPEG indicates that it is indeed corrupt: the bottom half of the picture is simply plain gray. Due to the size of the 7-zip file, I have not been able to pin down its discrepancy.

    Note that in both methods, every file has the correct file size after being copied.

    Question: Any thoughts on what is possibly going on here? I have never had this problem before, and I am now terrified that files get corrupted after simple actions like copy/paste and file sync. Even if I manage to successfully copy the files somehow, I would still like an explanation for this.
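
    For anyone wanting to reproduce this kind of check, here is a minimal sketch in Python (the two paths are hypothetical placeholders; it assumes both copies are reachable as local directories) that hashes every file in the source tree and reports mismatches against the copy:

        import hashlib
        from pathlib import Path

        def md5sum(path, chunk=1 << 20):
            # Hash the file in 1 MiB chunks so large AVIs/7z archives fit in memory.
            h = hashlib.md5()
            with open(path, "rb") as f:
                for block in iter(lambda: f.read(chunk), b""):
                    h.update(block)
            return h.hexdigest()

        def compare_trees(src, dst):
            # Yield relative paths whose MD5 differs (or which are missing) in dst.
            src, dst = Path(src), Path(dst)
            for f in src.rglob("*"):
                if f.is_file():
                    rel = f.relative_to(src)
                    other = dst / rel
                    if not other.is_file() or md5sum(f) != md5sum(other):
                        yield rel

        for rel in compare_trees(r"E:\pictures", r"F:\pictures_copy"):
            print("MISMATCH:", rel)

    Running this after each copy method, and then again a second time, would show whether the mismatching set is stable or changes between runs, which helps separate flaky hardware from something actively rewriting metadata.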

    Read the article

  • How can I center a Silverlight DataGridTemplateColumn header?

    - by Mike Pateras
    I want to center the header on a Silverlight DataGridTemplateColumn. The following code gets me most of the way there:

        DataGridTemplateColumn column = new DataGridTemplateColumn();
        column.CellTemplate = Resources[templateName] as DataTemplate;
        column.Header = headerName;
        column.HeaderStyle = new Style { TargetType = typeof(DataGridColumnHeader) };
        column.HeaderStyle.Setters.Add(new Setter(DataGridColumnHeader.HorizontalAlignmentProperty, HorizontalAlignment.Center));

    The header is, indeed, centered, but if the column is expanded, the header doesn't stretch. It just remains its original width, leaving white gaps on either side of it, which looks terrible. What is the proper way to center the column header so that it still occupies the full width?
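
    One likely fix (a sketch, based on DataGridColumnHeader being a ContentControl; I haven't verified it against this exact code): center the header's content rather than the header element itself, so the header keeps stretching to the full column width while only the text inside it is centered:

        column.HeaderStyle = new Style { TargetType = typeof(DataGridColumnHeader) };
        // Center what's inside the header instead of shrinking the header itself;
        // HorizontalContentAlignment leaves the header stretched to the column width.
        column.HeaderStyle.Setters.Add(new Setter(
            DataGridColumnHeader.HorizontalContentAlignmentProperty,
            HorizontalAlignment.Center));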

    Read the article

< Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >