Search Results

Search found 72319 results on 2893 pages for 'file explorer'.


  • Using Javascript to generate and save a file

    - by Toji
    I've been fiddling with WebGL lately, and have gotten a Collada reader working. Problem is it's pretty slow (Collada is a very verbose format), so I'm going to start converting files to an easier-to-use format (probably JSON). Thing is, I already have the code to parse the file in JavaScript, so I may as well use it as my exporter too! The problem is saving. Now, I know that I can parse the file, send the result to the server, and have the browser request the file back from the server as a download. But in reality the server has nothing to do with this particular process, so why get it involved? I already have the contents of the desired file in memory. Is there any way that I could present the user with a download using pure JavaScript? (I doubt it, but might as well ask...) And to be clear: I am not trying to access the filesystem without the user's knowledge! The user will provide a file (probably via drag and drop), the script will transform the file in memory, and the user will be prompted to download the result. All of which should be "safe" activities as far as the browser is concerned.


  • Compile IL code at runtime using .NET 3.5 and C# from file

    - by nitefrog
    I would like to take a file that is an IL file, and at run time compile it back to an exe. Right now I can use Process.Start to fire off the command line with parameters (ilasm.exe), but I would like to automate this process from a C# service I will create. Is there a way to do this with reflection and Reflection.Emit? While this works:
        string rawText = File.ReadAllText(string.Format("c:\\temp\\{0}.il", Utility.GetAppSetting("baseName")), Encoding.ASCII);
        rawText = rawText.Replace("[--STRIP--]", guid);
        File.Delete(string.Format("c:\\temp\\{0}.il", Utility.GetAppSetting("baseName")));
        File.WriteAllText(string.Format("c:\\temp\\{0}.il", Utility.GetAppSetting("baseName")), rawText, Encoding.ASCII);
        pi = new ProcessStartInfo();
        pi.WindowStyle = ProcessWindowStyle.Hidden;
        pi.FileName = "\"" + ilasm + "\"";
        pi.Arguments = string.Format("c:\\temp\\{0}.il", Utility.GetAppSetting("baseName"));
        using (Process p = Process.Start(pi))
        {
            p.WaitForExit();
        }
    it is not ideal, as I really would like this to be a streamlined process. I have seen examples of creating the IL at runtime and then saving it, but I need to use the IL I already have in file form and compile it back to an exe. Thanks.


  • Download dynamic file with GWT

    - by Maksim
    I have a GWT page where the user enters data (start date, end date, etc.), then this data goes to the server via an RPC call. On the server I want to generate an Excel report with POI and let the user save that file on their local machine. This is my test code to stream the file back to the client, but for some reason I think it does not know how to stream the file to the client when I'm using RPC:
        public class ReportsServiceImpl extends RemoteServiceServlet implements ReportsService {
            public String myMethod(String s) {
                File f = new File("/excelTestFile.xls");
                String filename = f.getName();
                int length = 0;
                try {
                    HttpServletResponse resp = getThreadLocalResponse();
                    ServletOutputStream op = resp.getOutputStream();
                    ServletContext context = getServletConfig().getServletContext();
                    resp.setContentType("application/octet-stream");
                    resp.setContentLength((int) f.length());
                    resp.setHeader("Content-Disposition", "attachment; filename*=\"utf-8''" + filename + "");
                    byte[] bbuf = new byte[1024];
                    DataInputStream in = new DataInputStream(new FileInputStream(f));
                    while ((in != null) && ((length = in.read(bbuf)) != -1)) {
                        op.write(bbuf, 0, length);
                    }
                    in.close();
                    op.flush();
                    op.close();
                } catch (Exception ex) {
                    ex.printStackTrace();
                }
                return "Server says: " + filename;
            }
        }
    I've read somewhere on the internet that you can't stream a file back with RPC and that I have to use a servlet for that. Is there any example of how to use a servlet and how to call that servlet from ReportsServiceImpl? Do I really need to make a servlet, or is it possible to stream it back with my RPC?
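    A dedicated download servlet, outside the RPC mechanism, is the usual route here; below is a minimal sketch under that assumption. The servlet class name, its URL mapping and the report parameters are illustrative, not taken from the original post:
        // Hypothetical servlet, mapped to something like /reportDownload in web.xml
        public class ReportDownloadServlet extends javax.servlet.http.HttpServlet {
            @Override
            protected void doGet(javax.servlet.http.HttpServletRequest req,
                                 javax.servlet.http.HttpServletResponse resp) throws java.io.IOException {
                // In the real application the report would be generated with POI from request parameters.
                java.io.File f = new java.io.File("/excelTestFile.xls");
                resp.setContentType("application/vnd.ms-excel");
                resp.setContentLength((int) f.length());
                resp.setHeader("Content-Disposition", "attachment; filename=\"" + f.getName() + "\"");
                java.io.InputStream in = new java.io.FileInputStream(f);
                java.io.OutputStream out = resp.getOutputStream();
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);  // stream the file straight into the HTTP response
                }
                in.close();
                out.flush();
            }
        }
    On the GWT client the download would then typically be triggered with Window.open("reportDownload?start=...&end=...", "_blank", ""), passing the report criteria as query-string parameters instead of through the RPC call.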


  • How to download a file from a UNC mapped share via IIS and ASP

    - by helgeg
    I am writing an ASP application that will serve files to clients through the browser. The files are located on a file server that is available from the machine IIS is running on via a UNC path (\\server\some\path). I want to use something like the code below to serve the file. Serving files that are local to the machine IIS is running on is working well with this method; my trouble is being able to serve files from the UNC-mapped share:
        // Set the appropriate ContentType.
        Response.ContentType = "Application/pdf";
        // Get the physical path to the file.
        string FilePath = MapPath("acrobat.pdf");
        // Write the file directly to the HTTP content output stream.
        Response.WriteFile(FilePath);
        Response.End();
    My question is how I can specify a UNC path for the file name. Also, to access the file share I need to connect with a specific username/password. I would appreciate some pointers on how I can achieve this (either using the approach above or by other means).


  • Editing a remote file on-the-fly with PHP

    - by user275074
    Hi, I have a requirement to edit a remote text file on-the-fly; its content currently stands at ~1 MB. I have tried a couple of approaches and both seem to be clunky or hog memory, which I can't rely on. Thinking it out logically, what I'm trying to achieve is:
        1. FTP to a remote server.
        2. Download a copy of the file for backup purposes and store it somewhere locally.
        3. Open the remote file and add the necessary lines required.
        4. Remove lines from the remote file as per an array of un-required data generated from the local server.
    Is this possible? I've managed to code steps 1 and 2, but I'm having difficulty with 3 and 4. The way I'm doing it now is to use fgets and return the whole string. Really, I don't want to do this as it involves manipulating and re-generating the whole string (and it's large) and then re-inserting it between two markers in the remote file. Is there no way of manipulating the lines of text in the file on-the-fly?


  • PHP file upload issue

    - by Varun
    I am working on a PHP-based ticket management system. While creating a ticket, one can upload an attachment. I want to put a limit (say 10 MB) per file upload. To implement this I plan the following:
        1. In php.ini, set post_max_size = 10M.
        2. In the PHP script which receives the POST: since the file is larger than post_max_size, $_FILES[] will be empty, but I can still check the Content-Length header and discard the upload if the size is more than 10 MB.
    While testing this I tried uploading a file of 1 GB and analysed the HTTP traffic, and this is what I found:
        - The entire 1 GB of data is first uploaded to the server temporarily and discarded once the HTTP request completes. I couldn't find out exactly where the file was getting saved (it was not in the temporary directory on the server), but my HTTP traffic analyzer showed that the browser did send 1 GB of data to the server.
        - The PHP script execution started only after completion of the HTTP request (i.e. after uploading the entire 1 GB).
    Now I have two concerns:
        a) People may exploit my server bandwidth by trying to upload large files, which I will have to discard anyway.
        b) Even worse, if someone starts uploading a huge file (say 100 GB), the entire 100 GB of data is first uploaded to the server temporarily, which means that for that period it will consume that much memory on my server.
    What's the common solution for this? Am I missing something here?


  • In Visual Studio (2008) is there a way to have a custom dependent file on another custom file?

    - by rball
    Instead of a *.cs code-behind or code-beside I'd like to have a *.js file. I'm developing an MVC application and have no need for a code-beside because I have controllers, but in certain cases it'd be nice to have a JavaScript code-beside, or some way to associate the file with the page it's being used on. I suppose I could just name them similarly, but I want to show the association if possible so there's no question about what the file is for. Typically what I'm talking about is what Visual Studio does now with your Global.asax file, which has a plus sign to the left:
        + Global.asax
    Once you expand it you'll get:
        - Global.asax
            Global.asax.cs
    I'd like the same thing to happen:
        + Home.spark
        - Home.spark
            Home.spark.js
    Updated: My existing csproj file has a path to the actual file; I'm not sure if that's what is breaking it. I've currently got:
        <ItemGroup>
          <Content Include="Views\User\Profile.spark.js">
            <DependentUpon>Views\User\Profile.spark</DependentUpon>
          </Content>
        </ItemGroup>
        <ItemGroup>
          <Content Include="Views\User\Profile.spark" />
        </ItemGroup>
    and it's simply just showing the files beside each other.


  • we would like the user to be able to pick a file from native file system and drag & drop it to our a

    - by user261740
    We have an existing Java desktop application which starts when the user clicks our application icon (placed on the desktop) or double-clicks the executable (.exe). It opens a frame which allows the user to select a file from the native file system and uploads it to the server. Now we would like to let the user pick a file in Windows Explorer, drag it to the shortcut/application icon on the desktop and drop it there; this would start the upload of that file to the server. We need to capture the "drop" action and launch from the shortcut, which may not be related to Java at all; it could be generic to any application. We are using JSmooth to build an executable from the jar and NSIS for the installer. I would like to know:
        1. How can we launch the application if the user drops a local file onto the desktop icon?
        2. How do we get the absolute path of the file that has been dropped onto the executable?
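    For what it's worth, when a file is dropped onto a program's shortcut on Windows, the shell launches the target with the dropped file's full path as a command-line argument, so the JSmooth wrapper mainly has to pass its arguments through to main. A minimal sketch of the Java side; the upload call and the selection-frame entry point are hypothetical placeholders, not names from the original application:
        public class DropLauncher {
            public static void main(String[] args) {
                if (args.length == 0) {
                    // Started normally (icon double-clicked): show the usual file-selection frame.
                    // openSelectionFrame();  // hypothetical existing entry point
                    return;
                }
                // Started via drag & drop: each argument is the absolute path of a dropped file.
                for (String path : args) {
                    java.io.File dropped = new java.io.File(path);
                    System.out.println("Dropped file: " + dropped.getAbsolutePath());
                    // uploadToServer(dropped);  // hypothetical upload routine
                }
            }
        }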


  • Drag Drop copy file

    - by Graham Warrender
    I've perhaps done something marginally stupid, but can't see what it is!!
        string pegasusKey = @"HKEY_LOCAL_MACHINE\SOFTWARE\Pegasus\";
        string opera2ServerPath = @"Server VFP\";
        string opera3ServerPath = @"O3 Client VFP\";
        string opera2InstallationPath = null;
        string opera3InstallationPath = null;

        // Gets the Opera installation paths and reads them into opera*InstallationPath
        opera2InstallationPath = (string)Registry.GetValue(pegasusKey + opera2ServerPath + "System", "PathToServerDynamic", null);
        opera3InstallationPath = (string)Registry.GetValue(pegasusKey + opera3ServerPath + "System", "PathToServerDynamic", null);

        string Filesource = null;
        string[] FileList = (string[])e.Data.GetData(DataFormats.FileDrop, false);
        foreach (string File in FileList)
            Filesource = File;
        label.Text = Filesource;

        if (System.IO.Directory.Exists(opera3InstallationPath))
        {
            System.IO.File.Copy(Filesource, opera3InstallationPath);
            MessageBox.Show("File Copied from" + Filesource + "\n to" + opera3InstallationPath);
        }
        else
        {
            MessageBox.Show("Directory Doesn't Exist");
        }
    The user drags the file onto the window, and I then get the installation path of an application, which is used as the destination for the source file. When the application runs, it throws a "directory not found" error. But surely if the directory doesn't exist it should step into the else statement? A simple application that is becoming a headache!!


  • ResXResourceWriter Chops off end of file

    - by Aaron Salazar
    I'm trying to create a .resx file from a dictionary of strings. Luckily the .Net framework has a ResXResourceWriter class. The only problem is that when I create the file using the code below, the last few lines of my generated resx file are missing. I checked my Dictionary and it contains all the string pairs that I expect.
        public void ExportToResx(Dictionary<string, string> resourceDictionary, string fileName)
        {
            var writer = new ResXResourceWriter(fileName);
            foreach (var resource in resourceDictionary)
            {
                writer.AddResource(resource.Key, resource.Value);
            }
        }
    Unfortunately, it is a little difficult to show the entire resx file since it has 2198 (should have 2222) lines, but here is the last little bit:
        ...
        2195 <data name="LangAlign_ReportIssue" xml:space="preserve">
        2196   <value>Report an Issue</value>
        2197 </data>
        2198 <data name="LangAlign_Return
    BTW, notice that the file cuts off right at the end of that "n" in "LangAlign_Return". The string should read "LangAlign_ReturnToWorkspace". The file should also end at line 2222.


  • File.Replace throwing IOException

    - by WebDevHobo
    I have an app that can modify images. In some cases this makes the file size smaller, in some cases bigger. The program doesn't have an option to "not replace the file if the result has a bigger filesize", so I wrote a little C# app to try and solve this. Instead of overwriting the files, I make the app write the result to a folder under the current one and name that folder Test. The C# app I wrote grabs the contents of both folders and puts the full paths to the file(s) into two List objects. I then compare and replace. The replacing isn't working, however. I get the following IOException: "Unable to remove the file to be replaced". The location is on an external hard drive, on which I have full rights. Now, I know I can just do File.Delete and File.Move in that order, but this exception has gotten me interested in why this particular setup won't work. Here's the source code: http://pastebin.com/4Vq82Umu And yes, the file specified as the last argument of the Replace function does exist.


  • Java: fastest way to do random reads on huge disk file(s)

    - by cocotwo
    I've got a moderately big set of data, about 800 MB or so, that is basically some big precomputed table that I need in order to speed up some computation by several orders of magnitude (creating that file took several multi-core computers days to produce, using an optimized and multi-threaded algorithm... I really do need that file). Now that it has been computed once, that 800 MB of data is read-only. I cannot hold it in memory. As of now it is one big huge 800 MB file, but splitting it into smaller files isn't a problem if it can help. I need to read about 32 bits of data here and there in that file, a lot of times. I don't know beforehand where I'll need to read the data: the reads are uniformly distributed. What would be the fastest way in Java to do my random reads in such a file or files? Ideally I should be doing these reads from several unrelated threads (but I could queue the reads in a single thread if needed). Is Java NIO the way to go? I'm not familiar with 'memory mapped file': I think I don't want to map the 800 MB in memory. All I want is the fastest random reads I can get to access these 800 MB of disk-based data. btw in case people wonder this is not at all the same as the question I asked not long ago: http://stackoverflow.com/questions/2346722/java-fast-disk-based-hash-set
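    One common approach is to keep a single FileChannel open and issue positioned reads, which leave the channel's own position untouched and can therefore be shared by several threads without locking. A minimal sketch, assuming the table is addressed by byte offset:
        import java.io.EOFException;
        import java.io.IOException;
        import java.io.RandomAccessFile;
        import java.nio.ByteBuffer;
        import java.nio.channels.FileChannel;

        public class RandomReader {
            private final FileChannel channel;

            public RandomReader(String path) throws IOException {
                // One read-only channel shared by all reader threads.
                channel = new RandomAccessFile(path, "r").getChannel();
            }

            /** Reads the 4-byte (32-bit) value stored at the given byte offset. */
            public int readIntAt(long offset) throws IOException {
                ByteBuffer buf = ByteBuffer.allocate(4);
                while (buf.hasRemaining()) {
                    // Positioned read: the channel's position is not moved by this call.
                    if (channel.read(buf, offset + buf.position()) < 0) {
                        throw new EOFException("offset " + offset + " is past the end of the file");
                    }
                }
                buf.flip();
                return buf.getInt();
            }
        }
    FileChannel.map could also memory-map the file (800 MB is under the 2 GB per-mapping limit) and let the operating system's page cache absorb the hot regions; whether that beats positioned reads depends on the access pattern and the RAM available.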


  • Help file not working

    - by meryl
    Hi, can anyone help me? I want to play an audio file, and whenever I press the stop button, the already-played part of the file should be saved. Unfortunately, what I get is an audio file (.wav) which is actually unplayable. Thanks
        //****************************
        void play_cut() {
            try {
                // First, we get the format of the input file
                final AudioFileFormat.Type fileType = AudioSystem.getAudioFileFormat(inputAudio).getType();
                // Then, we get a clip for playing the audio.
                c = AudioSystem.getClip();
                // We get a stream for playing the input file.
                AudioInputStream ais = AudioSystem.getAudioInputStream(inputAudio);
                // We use the clip to open (but not start) the input stream
                c.open(ais);
                // We get the format of the audio codec (not the file format we got above)
                final AudioFormat audioFormat = ais.getFormat();
                c.start();
                AudioInputStream startStream = new AudioInputStream(new FileInputStream(inputAudio), audioFormat, c.getLongFramePosition());
                AudioSystem.write(startStream, fileType, outputAudio);
            } catch (UnsupportedAudioFileException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            } catch (LineUnavailableException e) {
                e.printStackTrace();
            }
        } // end play_cut
        //****************************
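    A possible culprit, offered as a guess rather than a confirmed diagnosis: the saved stream is built from a raw FileInputStream, so the WAV header bytes get written out as if they were samples, and c.getLongFramePosition() is read immediately after start(), when it is still close to zero. A sketch of a stop handler that re-opens a decoded stream and trims it to the frames actually played; the field names follow the snippet above:
        // Hedged sketch: assumes the fields inputAudio, outputAudio and the clip c from the code above.
        void stop_and_save() {
            try {
                long framesPlayed = c.getLongFramePosition();  // how many frames have actually played so far
                c.stop();
                AudioFileFormat.Type fileType = AudioSystem.getAudioFileFormat(inputAudio).getType();
                // Re-open a decoded stream so the WAV header is parsed rather than copied as data...
                AudioInputStream fresh = AudioSystem.getAudioInputStream(inputAudio);
                // ...and limit it to the number of frames that were played before stop was pressed.
                AudioInputStream played = new AudioInputStream(fresh, fresh.getFormat(), framesPlayed);
                AudioSystem.write(played, fileType, outputAudio);
                played.close();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }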


  • Upload file onto Server from the IPhone using ASIHTTPRequest

    - by Nick
    I've been trying to upload a file (login.zip) using the ASIHTTPRequest libraries from the iPhone onto the built-in Apache server in Mac OS X Snow Leopard. My code is:
        NSString *urlAddress = [[[NSString alloc] initWithString:self.uploadField.text] autorelease];
        NSURL *url = [NSURL URLWithString:urlAddress];
        ASIFormDataRequest *request;
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *dataPath = [documentsDirectory stringByAppendingPathComponent:@"login.zip"];
        NSData *data = [[[NSData alloc] initWithContentsOfFile:dataPath] autorelease];
        request = [[[ASIFormDataRequest alloc] initWithURL:url] autorelease];
        [request setPostValue:@"login.zip" forKey:@"file"];
        [request setData:data forKey:@"file"];
        [request setUploadProgressDelegate:uploadProgress];
        [request setShowAccurateProgress:YES];
        [request setDelegate:self];
        [request startAsynchronous];
    The PHP code is:
        <?php
        $target = "upload/";
        $target = $target . basename($_FILES['uploaded']['name']);
        $ok = 1;
        if (move_uploaded_file($_FILES['uploaded']['tmp_name'], $target)) {
            echo "The file " . basename($_FILES['uploadedfile']['name']) . " has been uploaded";
        } else {
            echo "Sorry, there was a problem uploading your file.";
        }
        ?>
    I don't quite understand why the file is not uploading. I would appreciate any help; I've been stuck on this for 5 days straight. Thanks in advance, Nik


  • How to make a custom ear file in maven

    - by Zombies
    Here is my challenge: I need to make an EAR file for a specific container. To be more specific about how this EAR will be created: this is a standard J2EE EAR file with one WAR in it, and the container it is deployed to will expect certain XML files (which can easily be found somewhere inside the source project). Here are my obstacles:
        - The source folder contains various container-specific XML files, but these files do not map directly to where the container expects them inside the EAR file. For example, there will be a file that this container expects to be at EARFILE.ear/config/connections.xml, but this file is located (in the source) at /some/obscure/unrelated/directory. This is the case for about 5-7 files.
        - I cannot change the original source project layout at all.
    So, how can I create the compliant EAR file that I need? There is NO plugin at this time for the container that I am using; I have certainly looked.



  • PHP Array saved to Text file

    - by coffeemonitor
    I've saved a response from an outside server to a text file, so I don't need to keep running connection requests. Instead, perhaps I can use the text file for my manipulation purposes until I'm ready to reconnect again (also, my connection requests to this outside server are limited). Here is what I've saved to the text file, records.txt:
        Array
        (
            [0] => stdClass Object
                (
                    [id] => 552
                    [date_created] => 2012-02-23 10:30:56
                    [date_modified] => 2012-03-09 18:55:26
                    [date_deleted] => 2012-03-09 18:55:26
                    [first_name] => Test
                    [middle_name] =>
                    [last_name] => Test
                    [home_phone] => (123) 123-1234
                    [email] => [email protected]
                )
            [1] => stdClass Object
                (
                    [id] => 553
                    [date_created] => 2012-02-23 10:30:56
                    [date_modified] => 2012-03-09 18:55:26
                    [date_deleted] => 2012-03-09 18:55:26
                    [first_name] => Test
                    [middle_name] =>
                    [last_name] => Test
                    [home_phone] => (325) 558-1234
                    [email] => [email protected]
                )
        )
    There's actually more in the Array, but I'm sure 2 are fine. Since this is a text file, and I want to pretend this is the actual outside server (sending me the same info), how do I make it a real array again? I know I need to open the file first:
        <?php
        $fp = fopen('records.txt', "r"); // open the file
        $theData = fread($fh, filesize('records.txt'));
        fclose($fh);
        echo $theData;
        ?>
    So far $theData is a string value. Is there a way to convert it back to the Array it originally came in as?


  • Stop writing blank line at the end of CSV file (using MATLAB)

    - by Grant M.
    Hello all ... I'm using MATLAB to open a batch of CSV files containing column headers and data (using the 'importdata' function), then I manipulate the data a bit and write the headers and data to new CSV files using the 'dlmwrite' function. I'm using the '-append' and 'newline' attributes of 'dlmwrite' to add each line of text/data on a new line. Each of my new CSV files has a blank line at the end, whereas this blank line was not there before when I read in the data ... and I'm not using 'newline' on my final call of 'dlmwrite'. Does anyone know how I can keep from writing this blank line to the end of my CSV files? Thanks for your help, Grant
    EDITED 5/18/10 1:35PM CST - Added information about code and text file per request ... you'll notice after performing the procedure below that there appears to be a carriage return at the end of the last line in the new text file. Consider a text file named 'textfile.txt' that looks like this:
        Column1, Column2, Column3, Column4, Column 5
        1, 2, 3, 4, 5
        1, 2, 3, 4, 5
        1, 2, 3, 4, 5
        1, 2, 3, 4, 5
        1, 2, 3, 4, 5
    Here's a sample of the code I am using:
        % import data
        importedData = importdata('textfile.txt');
        % manipulate data
        importedData.data(:,1) = 100;
        % store column headers into single comma-delimited
        % character array (for easy writing later)
        columnHeaders = importedData.textdata{1};
        for counter = 2:size(importedData.textdata,2)
            columnHeaders = horzcat(columnHeaders,',',importedData.textdata{counter});
        end
        % write column headers to new file
        dlmwrite('textfile_updated.txt',columnHeaders,'Delimiter','','newline','pc')
        % append all but last line of data to new file
        for dataCounter = 1:(size(importedData.data,2)-1)
            dlmwrite('textfile_updated.txt',importedData.data(dataCounter,:),'Delimiter',',','newline','pc','-append')
        end
        % append last line of data to new file, not
        % creating new line at end
        dlmwrite('textfile_updated.txt',importedData.data(end,:),'Delimiter',',','-append')


  • Approach for parsing file and creating dynamic data structure for use by another program

    - by user275633
    All, Background: I have a customer who has some build scripts for their datacenter based on Python that I've inherited. I did not work on the original design, so I'm somewhat limited in what I can and can't change. That said, my customer has a properties file that they use in their datacenter. Some of the values are used to build their servers, and unfortunately they have other applications that also use these values, so I cannot change them to make it easier for me. What I want to do is make the scripts more dynamic for distributing more hosts, so that I don't have to keep updating the scripts in the future and can just add more hosts to the property file. Unfortunately I can't change the current property file and have to work with it. The property file looks something like this:
        projectName.ClusterNameServer1.sslport=443
        projectName.ClusterNameServer1.port=80
        projectName.ClusterNameServer1.host=myHostA
        projectName.ClusterNameServer2.sslport=443
        projectName.ClusterNameServer2.port=80
        projectName.ClusterNameServer2.host=myHostB
    In their deployment scripts they basically have a lot of "if projectName.ClusterNameServerX" checks (where X is some number of entries defined) and then do something, e.g.:
        if projectName.ClusterNameServer1.host != "" do X
        if projectName.ClusterNameServer2.host != "" do X
        if projectName.ClusterNameServer3.host != "" do X
    Then when they add another host (say Server4) they add another if statement. Question: What I would like to do is make the scripts more dynamic, parse the properties file, put what I need into some data structure to pass to the deployment scripts, and then just iterate over the structure and do my deployment that way, so I don't have to constantly add a bunch of "if some host# do something" lines. I'm just curious to get some suggestions as to how others would parse the file, what sort of data structure they would use, and how they would group things together by ClusterNameServer# or something else. Thanks
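    The scripts in question are Python, but the shape of the lookup is language-neutral; here is a rough sketch of the grouping idea, shown in Java purely as an illustration: split each key on '.', group by the ClusterNameServerX token, and let the deployment code iterate over the result instead of hard-coding one if per server.
        import java.io.FileReader;
        import java.io.IOException;
        import java.util.LinkedHashMap;
        import java.util.Map;
        import java.util.Properties;

        public class ClusterConfig {
            /** Maps a server token ("ClusterNameServer1") to its settings ("host" -> "myHostA", ...). */
            public static Map<String, Map<String, String>> groupByServer(String path) throws IOException {
                Properties raw = new Properties();
                FileReader in = new FileReader(path);
                try {
                    raw.load(in);
                } finally {
                    in.close();
                }
                Map<String, Map<String, String>> servers = new LinkedHashMap<String, Map<String, String>>();
                for (String key : raw.stringPropertyNames()) {
                    String[] parts = key.split("\\.");  // projectName.ClusterNameServerX.property
                    if (parts.length != 3) {
                        continue;                        // ignore keys that don't fit the pattern
                    }
                    Map<String, String> entry = servers.get(parts[1]);
                    if (entry == null) {
                        entry = new LinkedHashMap<String, String>();
                        servers.put(parts[1], entry);
                    }
                    entry.put(parts[2], raw.getProperty(key));
                }
                return servers;
            }
        }
    The deployment step then loops over the map entries and does its work once per server, so adding a ClusterNameServer4 block to the properties file needs no script change.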


  • [PHP] Script looking for string in file breaks

    - by Kel
    Hey guys. I'm running a script that looks for a specific term in a PDF file. Well, actually I'm reading the PDF file as a text file and looking for the term there. The script processes over 20k files. But, unexpectedly, the script breaks after it hits a file that is over 50 MB in size. It stops. What could the reason be? Here's an excerpt of the script:
        // Proceed if file exists
        if (file_exists($sourcePath)) {
            echo "file exists\n";
            if (filesize($sourcePath) > 0) {
                echo "filesize is greater than 0\n";
                $pdfFile = fopen($sourcePath, "rb");
                $data = fread($pdfFile, filesize($sourcePath));
                fclose($pdfFile);
                // Search for string
                if (stripos($data, $searchFor)) {
                    echo "Success. encrypt found\r\n";
                    fwrite($errorFileHandler, "Success. encrypt found\r\n");
                } else {
                    .....
                }
        ...
        ...
    What could be the problem?


  • Obtain File size with os.path.getsize() in Python 2.7.5

    - by Ruxuan Ouyang
    I am new to Python. I am trying to use os.path.getsize() to obtain the file size. However, if the file name is not in English but in Chinese, German, or French, etc., Python cannot recognize it and does not return the size of the file. Could you please help me with it? How can I let Python recognize the file's name and return the size of these kinds of files? For example, the file's name is: ?????????? ????????????? ? ????????????? ???????? ?? 2030?.doc
        path="C:\xxxx\xxx\xxxx\?????????? ????????????? ? ????????????? ???????? ?? 2030?.doc"
    I'd like to use os.path.getsize(path), but it does not recognize the file name. Could you please kindly tell me what I should do? Thank you very much!


  • Problem reading in file written with xdr using c

    - by Inga
    I am using Ubuntu 10.4 and have two (long) C programs, one that writes a file using XDR, and one that uses this file as input. However, the second program does not manage to read the written file. Everything looks perfectly fine, it just does not work. More specifically, it fails at the last line added here with the error message "xdr_string()", which indicates that it cannot read the first line of the input file. I do not see any obvious errors. The input file is written out, has content, and I can see the right strings using strings -a -n 2 "inputfile". Anyone have any idea what is going wrong? Relevant parts of program 1 (writer):
        /**
         * create compressed XDR output stream
         */
        output_file = open_write_pipe(output_filename);
        xdrstdio_create(&xdrs, output_file, XDR_ENCODE);

        /**
         * print material name
         */
        if (xdr_string(&xdrs, &name, _POSIX_NAME_MAX) == FALSE)
            xdr_err("xdr_string()");
    Relevant parts of program 2 (reader):
        /**
         * open data file
         */
        input_file = open_data_file(input_filename, "r");
        if (input_file == NULL) {
            ERROR(input_filename);
            exit(EXIT_FAILURE);
        }

        /**
         * create input XDR stream
         */
        xdrstdio_create(&xdrs, input_file, XDR_DECODE);

        /**
         * read material name
         */
        if (xdr_string(&xdrs, &name, _POSIX_NAME_MAX) == FALSE)
            XDR_ERR("xdr_string()");


  • Simple Oracle File repository with folder hierarchy

    - by Ope
    I have an application that stores a large number of files (XML and binary) in folder hierarchies. Currently the main method is storing them in the file system or using a legacy CMS, which we want to get rid of. The CMS supports Oracle, and a customer wants to keep the files in Oracle because of enterprise policies (backup etc.). The question is: is there a simple implementation of a file repository with a folder hierarchy for Oracle? What I am looking for is a small .Net component or example code (PL/SQL and/or .Net) that would have the following methods:
        - Create, Delete, Exists for folders
        - CRUD for files
        - Move, and potentially Copy, for a file or directory
        - Access to files and folders with paths like "/root/folder1/folder2/file.xml"
        - Ability to get all the files and folders in a folder, and potentially also the entire directory tree
        - Tree traversal; getting the parent, all children etc. needs to be fast
    I need the implementation in .Net, but if it was just the stored procedures, I could create the .Net calling code. I have pointers to generic articles on creating hierarchies in a DB, so if I need to do it from scratch, I know where to start. What I am asking here: is there already an implementation that I could take without doing this from scratch? It seems like such a generic requirement... If the answer is a CMS, document management system or such, it should be open source or at least quite cheap (some hundreds per server) and it should be possible to deploy it XCopy-style, hopefully with only a couple of DLLs. I do not need, or want, a full-featured big CMS with dozens of DLLs, and especially not an MSI installation. Thanks, OPe


  • Batch file recursively find files and rar them

    - by b1gf00t
    Hi there, I have a parent directory which hosts many sub-directories, and in every sub-directory there are .mpg movies. Some of the directories might contain one or more .mpg movies. I would like to automate the process below, which I have been doing manually.
        Step One: If a directory has more than one .mpg file, I create a separate directory for each, and move each file into its own directory, naming the directory after the file.
        Step Two: I rar each video file in its directory as per one of my profiles; this splits the movie into 50 MB parts, tests the archive, deletes the source, and instructs WinRAR to wait if another rar is executing. I am doing this so I can queue jobs manually.
        Step Three: After having all the rars in the sub-directories, I create a checksum for every directory, leaving checksum.sfv in every directory.
        Step Four: I copy the parent folder and its sub-directories to my external drives.
    I was hoping that someone could assist me in creating a script. I was able to automate the process of creating directories named after each file and moving the files; however, I never succeeded in automating Step Two. I am using the software below:
        - WinRAR from rarlabs
        - exf from exactfile
    Appreciate your assistance.


  • .Net file writing and string splitting issues

    - by sagar
    I have a requirement where the file should be split using a given character. The default splitting options are CRLF and LF; in both these cases I am splitting the line by \r\n and \r respectively. I also have a requirement that any size of file should be processed. (The processing is basically inserting a given string into the file at a given position.) For this I am reading the file in chunks of 1024 bytes. Then I am applying the string.Split() method. Split() gives options for ignoring white spaces and none. I have to add these line-break characters back to the line; for this I am using a binary writer and I am writing the byte array to the new file. Issues:
        1) When the line break is CRLF and the split option is None, white spaces are also added to the split array. When the second option is given (to ignore white spaces), CRLF works properly.
        2) But the ignore-white-space option creates other problems: as I am reading the file byte by byte, I can't ignore a white space.
        3) When the line-break characters are other than the default (e.g. '|'), a null value is prepended to the resulting line.
    Can anybody give a solution to my issues?
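    The post is about .NET's string.Split, but the underlying need, splitting on a delimiter without losing it so the pieces can be written back verbatim, can be sketched with a look-behind regex; the snippet below is a rough Java illustration of that idea, not the poster's code:
        import java.util.regex.Pattern;

        public class KeepDelimiterSplit {
            public static void main(String[] args) {
                String chunk = "first line\r\nsecond line\r\nthird";  // stand-in for a 1024-byte chunk

                // "(?<=\r\n)" matches the empty position just after a CRLF, so the
                // line-break characters stay attached to the piece that precedes them.
                for (String p : chunk.split("(?<=\\r\\n)")) {
                    System.out.println("[" + p.replace("\r", "\\r").replace("\n", "\\n") + "]");
                }

                // The same idea with an arbitrary single-character delimiter such as '|'.
                String piped = "a|b|c";
                for (String p : piped.split("(?<=" + Pattern.quote("|") + ")")) {
                    System.out.println("[" + p + "]");
                }
            }
        }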

