Search Results

Search found 68478 results on 2740 pages for 'file descriptor'.


  • SVN delete file

    - by Josemalive
    Hi, my question is simple: if I delete a file in the trunk of an SVN repository, can I still retrieve it by requesting a previous revision, or is it gone forever? Thanks. Best regards, Jose

  • Problem using pantheios file stock back end

    - by Tanuj
    I am trying to use the Pantheios stock file back end to log from my DLL in C++. The file gets created, but nothing gets written to it. Here is a snippet of my code:

        if (pantheios::pantheios_init() < 0) {
            MessageBox(NULL, "Init Failed", "fvm", MB_OK);
        } else {
            MessageBox(NULL, "Init Passed", "fvm", MB_OK);
            pantheios::log_INFORMATIONAL("Logger enabled!");
        }
        DWORD pid = GetCurrentProcessId();
        sprintf_s(moduleName, sizeof(moduleName), "C:\\%d.log", pid);
        pantheios_be_file_setFilePath(moduleName, PANTHEIOS_BE_FILE_F_TRUNCATE, PANTHEIOS_BE_FILE_F_TRUNCATE, PANTHEIOS_BEID_LOCAL);
        pantheios::log(pantheios::debug, "Entering main");

    I get the "Init Passed" message box, but no log statements end up in the file.

  • Importing a large delimited file to a MySQL table

    - by Tom
    I have this large (and oddly formatted) txt file from the USDA's website, the NUT_DATA.txt file. The problem is that it is almost 27 MB. I successfully imported a few other smaller files, but my method used file_get_contents, so it makes sense that an error is thrown when I try to pull 27+ MB into RAM. How can I import this massive file into my MySQL DB without running into a timeout or a memory issue? I've tried getting one line at a time from the file, but that also ran into the timeout. I'm using PHP 5.2.0. Here is the old script (the fields in the DB are just numbers because I could not figure out which number represented which nutrient; I found this data very poorly documented. Sorry about the ugliness of the code):

        <?php
        $file = "NUT_DATA.txt";
        $data = split("\n", file_get_contents($file)); // split each line
        $link = mysql_connect("localhost", "username", "password");
        mysql_select_db("database", $link);
        for ($i = 0, $e = sizeof($data); $i < $e; $i++) {
            $sql = "INSERT INTO `USDA` (1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17) VALUES(";
            $row = split("\^", trim($data[$i])); // split each line by caret
            for ($j = 0, $k = sizeof($row); $j < $k; $j++) {
                $val = trim($row[$j], '~');
                $val = (empty($val)) ? 0 : $val;
                $sql .= ((empty($val)) ? 0 : $val) . ','; // this gets rid of the tildes and replaces empty strings with 0s
            }
            $sql = rtrim($sql, ',') . ");";
            mysql_query($sql) or die(mysql_error()); // query the db
        }
        echo "Finished inserting data into database.\n";
        mysql_close($link);
        ?>
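
    A minimal sketch of the line-by-line approach (assuming the same `USDA` table, the caret-delimited NUT_DATA.txt layout, and that every line supplies a value for each column, so the column list is omitted): fgets() reads one row at a time, so the whole 27 MB file never sits in memory, and set_time_limit(0) lifts the script timeout. Batching several rows into one INSERT would speed things up further.

        <?php
        // Stream the file row by row instead of loading it all with file_get_contents().
        set_time_limit(0); // no script timeout for the long-running import

        $link = mysql_connect("localhost", "username", "password");
        mysql_select_db("database", $link);

        $fh = fopen("NUT_DATA.txt", "r");
        if ($fh === false) {
            die("Could not open NUT_DATA.txt");
        }

        while (($line = fgets($fh)) !== false) {
            $fields = explode("^", trim($line));       // fields are caret-delimited
            $values = array();
            foreach ($fields as $field) {
                $field = trim($field, "~ \r\n");       // strip the tildes
                $values[] = "'" . mysql_real_escape_string($field === "" ? "0" : $field, $link) . "'";
            }
            $sql = "INSERT INTO `USDA` VALUES (" . implode(",", $values) . ")";
            mysql_query($sql, $link) or die(mysql_error());
        }

        fclose($fh);
        mysql_close($link);
        echo "Finished inserting data into database.\n";
        ?>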

  • Uploading a file to multiple servers?

    - by WarDoGG
    I'm creating an image uploader which will give the user the option to upload a file to a different server than the one the script is hosted on. Is making an FTP connection from the script server to the target upload server the only way to do this? Correct me if I'm wrong, but would the path of the file look like this: user computer (file) --- script server --- target server? Is there another way to do this?
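
    A minimal sketch of the FTP leg of that path (user --- script server --- target server), assuming hypothetical host and credentials for the target server. Alternatives would be an HTTP POST from the script server to an upload endpoint on the target (e.g. with cURL), SCP/rsync, or shared network storage.

        <?php
        // File just received from the user on the script server.
        $localFile  = $_FILES['image']['tmp_name'];
        $remoteFile = 'uploads/' . basename($_FILES['image']['name']);

        // Push it on to the target server over FTP (host and credentials are placeholders).
        $conn = ftp_connect('target.example.com');
        if ($conn && ftp_login($conn, 'ftpuser', 'ftppass')) {
            ftp_pasv($conn, true); // passive mode is friendlier to firewalls
            if (!ftp_put($conn, $remoteFile, $localFile, FTP_BINARY)) {
                echo 'Transfer to the target server failed';
            }
            ftp_close($conn);
        }
        ?>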

  • File Upload in GWT in a Special Case

    - by Maksud
    I am building software for a document system. In this system, when a user completes a document and wants to save it, the document is uploaded directly to the server without any user action. The system uses COM/ActiveX so that the user can work in native editors. My problem is this: suppose I have a file, say d:/notepad.txt. With the classical method the user browses to the file and uploads it; I can do that with Apache Commons IO and GWT's FormPanel and FileUpload. But if I already know the filename (d:/notepad.txt), is there any way to upload the file directly to the server without the user having to browse for it? I am currently doing this by having the ActiveX component call some HttpUpload methods with POST, but that does not maintain the session. Thanks

  • Extract log file

    - by k38
    Hi,

        xxxxxxxxmessageyyyyyyymessagexxxxxxxxxx
        xxxxxxxxmessagezzzzzzzmessagexxxxxxxxxx
        xxxxxxxxmessageaaaaaaamessagexxxxxxxxxx
        xxxxxxxxmessageyyyyyyymessagexxxxxxxxxx

    The above is my log file. I need to extract the phrase that sits inside the message tag and save the distinct messages to a file; in the above example I need to save zzzzzzz and aaaaaaa. What are the unix commands I need to use?

  • Joomla Development - Allow Direct File Access AND use Joomla-intern Framework

    - by Email
    Hi. As usual in Joomla development, you write defined('_JEXEC') or die('Restricted access'); at the top of a file. I am making a plugin that needs to be reached directly by PayPal/IPN, so I left that check out of that specific file. But I also need the Joomla-internal variables to access the database, so I tried this:

        require("../filewithaccesstoframework.php");

    or even:

        $baseurl = $_SERVER['HTTP_HOST'];
        $baseurl = "http://" . $baseurl . "/configuration.php";
        require($baseurl);

    With the first version I get "Restricted access", since the check comes from the included file, which I cannot leave out there. The second version does not seem to expose the variables defined in configuration.php, such as $host, $db and $password. The file configuration.php is chmod 444. Why does this happen, and is there a workaround that allows direct access to a file while still using the Joomla-internal framework (variables, functions)?
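
    A minimal sketch of one common workaround, assuming a Joomla 1.5-era directory layout and that the IPN entry script sits in the Joomla root next to configuration.php (include paths and API calls differ between Joomla versions). Requiring configuration.php over HTTP cannot work because the web server executes the file and returns its (empty) output rather than its PHP source, so $host, $db and $password never reach the calling script; instead, bootstrap the framework in the standalone script and use the normal Joomla objects.

        <?php
        // Standalone entry script reachable by PayPal IPN, placed in the Joomla root.
        define('_JEXEC', 1);                     // satisfies the "Restricted access" guard
        define('JPATH_BASE', dirname(__FILE__)); // path to the Joomla root
        define('DS', DIRECTORY_SEPARATOR);

        require_once JPATH_BASE . DS . 'includes' . DS . 'defines.php';
        require_once JPATH_BASE . DS . 'includes' . DS . 'framework.php';

        $mainframe = JFactory::getApplication('site');
        $mainframe->initialise();

        // From here on the usual framework objects are available, e.g. the database:
        $db = JFactory::getDBO();
        ?>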

  • How do I resize an image file to a user-selected size

    - by shuxer
    Hi. I have an image upload form where the user attaches an image file and selects a size to which the uploaded file should be reduced (200 KB, 500 KB, 1 MB, 5 MB, or original). My script then needs to resize the image file to the selected size, but I'm not sure how to implement this feature. For example, if the user uploads a 1 MB image and selects 200 KB, the script should save it at roughly 200 KB. Does anyone know how to do this, or have experience with a similar task? Thanks for your reply in advance.
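
    A minimal sketch of one way to do it, assuming JPEG input and the GD extension (shrinkToSize and its parameters are hypothetical names): re-encode the image at progressively lower JPEG quality until the result fits under the requested byte limit. A fuller solution would also scale the pixel dimensions down when even the lowest quality is still too large, and would handle PNG/GIF input.

        <?php
        function shrinkToSize($srcPath, $dstPath, $maxBytes) {
            $img = imagecreatefromjpeg($srcPath);
            if ($img === false) {
                return false;
            }
            for ($quality = 90; $quality >= 10; $quality -= 10) {
                ob_start();                        // capture the encoded image in memory
                imagejpeg($img, null, $quality);
                $data = ob_get_clean();
                if (strlen($data) <= $maxBytes) {  // small enough: write it out
                    file_put_contents($dstPath, $data);
                    imagedestroy($img);
                    return true;
                }
            }
            imagedestroy($img);
            return false;                          // could not reach the target size
        }

        // e.g. shrinkToSize($_FILES['image']['tmp_name'], 'resized.jpg', 200 * 1024);
        ?>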

  • Perl, efficient parsing of csv file

    - by Mike
    I'm working on a project that involves parsing a large CSV-formatted file in Perl, and I am looking to make the process more efficient. My approach has been to split() the file into lines first, and then split() each line again by commas to get the fields. But this is suboptimal, since at least two passes over the data are required: once to split into lines, then once more for each line. This is a very large file, so cutting the processing in half would be a significant improvement for the entire application. My question is: what is the most time-efficient way to parse a large CSV file using only built-in tools? Note: each line has a varying number of tokens, so we can't just ignore line boundaries and split by commas only. We can also assume the fields contain only alphanumeric ASCII data (no special characters or other tricks). I don't want to get into parallel processing, although it might work effectively.

  • PDF file creation in PHP

    - by pavun_cool
    I have used the following code to create a simple PDF file. It runs fine in the browser, but I am not able to get the PDF file, and when I run the code from the CLI it just prints some raw output. My question is: where do I need to specify the PDF file name (the file to be created)?

        <?php
        require('fpdf.php');
        $pdf = new FPDF();
        $pdf->AddPage();
        $pdf->SetFont('Arial','B',16);
        $pdf->Cell(40,10,'Hello World!');
        $pdf->Output();
        ?>

    CLI output:

        2 0 obj
        << /Type /Page /Parent 1 0 R /Contents 3 0 R
        endobj
        3 0 obj
        << /Length 4 0 R
        stream
        2.834646 0 0 2.834646 0 841.9 cm
        2 J
        0.2 w
        BT /F1 5.64 Tf ET
        BT 11 -16.692 Td (Hello World!) Tj ET
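
    A hedged note on where the file name goes: Output() with no arguments sends the generated document to the output (the browser, or just raw PDF text on the CLI) rather than writing a file to disk; to create a file you pass the 'F' destination together with a file name. The argument order differs between FPDF releases, so check the version installed. A minimal sketch (the file name hello.pdf is just an example):

        <?php
        require('fpdf.php');

        $pdf = new FPDF();
        $pdf->AddPage();
        $pdf->SetFont('Arial', 'B', 16);
        $pdf->Cell(40, 10, 'Hello World!');

        // 'F' = save to a local file; 'hello.pdf' is the file to create.
        // (Newer FPDF releases expect the destination first: Output('F', 'hello.pdf').)
        $pdf->Output('hello.pdf', 'F');
        ?>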

  • mod_rewrite if file exists

    - by Mathieu Parent
    Hi everyone. I already have two rewrite rules that work correctly for now, but some more configuration has to be added for everything to work perfectly. I have a website hosted at mydomain.com, and every subdom.mydomain.com is rewritten to mydomain.com/subs/subdom. My CMS has to handle the request if the requested file does not exist; that rewrite is done like so:

        RewriteCond $1 !^subs/
        RewriteCond %{HTTP_HOST} ^([^.]+)\.mydomain\.com$
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ subs/%1/index.php?page=$1 [L]

    My CMS handles the next part of the parsing as usual. If a file really exists, I need to serve it directly without passing through my CMS, which I managed to do like this:

        RewriteCond $1 !^subs/
        RewriteCond %{HTTP_HOST} ^([^.]+)\.mydomain\.com$
        RewriteCond %{REQUEST_FILENAME} -f [OR]
        RewriteCond %{REQUEST_FILENAME} -d
        RewriteRule ^(.*)$ subs/%1/$1 [L]

    So far this seems to work like a charm. Now I am being picky: I also have default files stored in subs/default/. If the file exists in the subdomain's folder, that one should be served; if not, the file should come from the default subdomain. And if the file does not exist anywhere, the 404 page of the current subdomain should be used, unless there is none. I hope that describes it well enough. Thank you for your time!

  • Import text file crunching library for Java/Groovy ?

    - by devdude
    In a lot of real-life applications we face the requirement to import some kind of (text) file. Usually we implement some (hardcoded?) logic to validate the file (e.g. proper header, proper number of delimiters, proper date/time values, etc.). Eventually we also need to check for the existence of related data in a table (e.g. the value of field 1 in the text file must have an entry in some master data table). While XML solves this (to some extent) with XSD and DTD, we end up hacking this again and again for proprietary text file formats. Is there any library or framework that allows the creation of templates similar to the XSD approach? This would make it much more flexible to react to file format changes or to implement new formats. Thanks for any hints. Sven

  • CFHTTP PUT without a physical file

    - by E-Madd
    I'm trying to communicate with an API that requires JSON payloads via the PUT method. Is it possible to do this with CFHTTP and CFHTTPPARAM without first writing a file? My code currently looks like this:

        <cfset json = "{'key':'value'}">
        <cffile action="write" file="#ExpandPath('./test.js')#" output="#json#" />
        <cfhttp method="put" url="http://servicedomain.com/api/method/" resolveurl="no"
                username="username" password="password">
            <cfhttpparam mimetype="application/json" type="file" name="payload"
                         file="#ExpandPath('./test.js')#" />
        </cfhttp>

  • Hosting Javascript/CSS file on CDN similar to Google hosting jQuery

    - by Alec Smart
    Hello. I am wondering whether there are any hosts, or whether I can host my files (JS & CSS) on Google, so that they are cached and load quickly (thanks to a CDN and gzip), similar to Google hosting jQuery. A number of my customers use these files, and I would prefer that they could simply include a URL to receive the JS file, ideally something like filename.js?publickey=sdfgsdfg (tied to a particular domain name). My hosting needs are very small, only about 100 KB. The problem is that the customers using the JS & CSS files have no clue about gzipping content or caching (their shared hosts do not support it), which causes the JS/CSS to take forever to load. I am wondering whether I can leverage an existing free service, though I do not mind paying either. Any suggestions? Thank you for your time.

  • Have no idea how to read a data file with python-excel

    - by Protoss Reed
    I am a student and don't have much experience with this kind of work. The problem is as follows. I have this piece of code:

        import matplotlib.pyplot as plt
        from pylab import *
        import cmath

        def sf(prompt):
            """ """
            error_message = "Value must be integer and greater or equal than zero"
            while True:
                val = raw_input(prompt)
                try:
                    val = float(val)
                except ValueError:
                    print(error_message)
                    continue
                if val <= 0:
                    print(error_message)
                    continue
                return val

        def petrogen_elements():
            """Input and calculations the main parameters for pertogen elements"""
            print "Please enter Petrogen elements: \r"
            SiO2 = sf("SiO2: ")
            Al2O3 = sf("Al2O3: ")
            Na2O = sf("Na2O: ")
            K2O = sf("K2O: ")
            petro = [SiO2, TiO2, Al2O3]
            Sum = sum(petro)
            Alcal = Na2O + K2O
            TypeAlcal = Na2O / K2O
            Ka = (Na2O + K2O) / Al2O3
            print '-'*20, "\r Alcal: %s \r TypeAlcal: %s \
            \r Ka: %s \r" % (Alcal, TypeAlcal, Ka)

        petrogen_elements()

    The problem is this: the program has to load and read an Excel file with all the data in it, and then calculate, for example, the alkalinity (Alcal), the type of alkalinity, and so on. The Excel file has only this structure:

             1        2       3       4      5
        1  name1    SiO2    Al2O3   Na2O   K2O
        2           32      12      0.21   0.1
        3  name2    SiO2    Al2O3   Na2O   K2O
        4           45      8       7.54   5
        5  name3    SiO2    Al2O3   Na2O   K2O
        6  ...      ...     ...     ...    ...

    Every Excel file has only 5 columns and an unlimited number of rows. The user has the choice of entering data by hand or importing an Excel file. The first part of the work I have done, but the bigger part remains: finally I need to read the whole file and calculate the values. I would be very grateful for some advice.

  • MySQL data file won't shrink

    - by Midhat
    My ibdata1 file for my MySQL databases grew to about 32 GB over time. Recently I deleted about 10 GB of data from my databases (and restarted MySQL for good measure), but the file won't shrink. Is there any way to reduce the size of this file?

  • Vim jump last mark-different file

    - by anon
    :bn and :bp just switch buffers. I want something like Ctrl-O, which walks through a 'stack' of marks; however, I want it to ignore marks in the same file. That is, I want my Ctrl-O to behave like Ctrl-O until it reaches a different file, and my Ctrl-I to behave like Ctrl-I until it reaches a different file. Is there something like this in Vim?

  • Delete Range of Data From Text File With PHP

    - by Evan Byrne
    I want to delete a range of data from a text file using PHP. Let's assume the file contains the following:

        Hello, World!

    I want to delete everything from character 2 to character 7. The actual file I need to do this with is very large, so I don't want to read the whole file just to delete a small, given range of data. The data contained within the given range is not known, so str_replace or preg_replace solutions won't work anyway. Thanks!
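
    A minimal sketch of an in-place approach, assuming the range is given as a byte offset and length (deleteRange is a hypothetical helper): copy the tail of the file over the unwanted range in small chunks and then truncate, so only a few KB are ever held in memory regardless of file size.

        <?php
        function deleteRange($path, $start, $length) {
            $fp = fopen($path, 'r+b');
            if ($fp === false) {
                return false;
            }
            $readPos  = $start + $length;   // first byte to keep after the range
            $writePos = $start;             // where the kept bytes get moved to
            while (true) {
                fseek($fp, $readPos);
                $chunk = fread($fp, 8192);
                if ($chunk === false || $chunk === '') {
                    break;                  // reached the end of the file
                }
                fseek($fp, $writePos);
                fwrite($fp, $chunk);
                $readPos  += strlen($chunk);
                $writePos += strlen($chunk);
            }
            ftruncate($fp, $writePos);      // cut off the now-duplicated tail
            fclose($fp);
            return true;
        }

        // e.g. remove characters 2 to 7 (6 bytes starting at 0-based offset 1):
        deleteRange('data.txt', 1, 6);
        ?>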
