Search Results

Search found 80596 results on 3224 pages for 'simplexml load file'.


  • Parsing a file with hierarchical structure in Python

    - by Kevin Stargel
    I'm trying to parse the output from a tool into a data structure, but I'm having some difficulty getting things right. The file looks like this:

         Fruits
           Apple
             Auxiliary
             Core
             Extras
           Banana
             Something
           Coconut
         Vegetables
           Eggplant
           Rutabaga

    You can see that top-level items are indented by one space, and items beneath that are indented by two spaces for each level. The items are also in alphabetical order. How do I turn the file into a Python list that's something like ["Fruits", "Fruits/Apple", "Fruits/Banana", ..., "Vegetables", "Vegetables/Eggplant", "Vegetables/Rutabaga"]?
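
    Assuming the indentation rule above (one space for the top level, two more spaces per nested level), a minimal sketch of one way to build that list; the input file name fruits.txt is hypothetical:

        def parse_paths(lines):
            """Turn indentation-structured lines into 'parent/child' path strings."""
            paths = []
            stack = []  # names of the current line's ancestors
            for raw in lines:
                name = raw.strip()
                if not name:
                    continue
                indent = len(raw) - len(raw.lstrip(" "))
                depth = (indent - 1) // 2   # 1 space = depth 0, each extra 2 spaces = one more level
                stack = stack[:depth]       # pop back to this line's parent
                stack.append(name)
                paths.append("/".join(stack))
            return paths

        with open("fruits.txt") as f:       # hypothetical input file
            print(parse_paths(f))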

    Read the article

  • Performance Tuning a High-Load Apache Server

    - by futureal
    I am looking to understand some server performance problems I am seeing with a (for us) heavily loaded web server. The environment is as follows:

    • Debian Lenny (all stable packages + patched to security updates)
    • Apache 2.2.9
    • PHP 5.2.6
    • Amazon EC2 large instance

    The behavior we're seeing is that the web typically feels responsive, but with a slight delay to begin handling a request -- sometimes a fraction of a second, sometimes 2-3 seconds in our peak usage times. The actual load on the server is being reported as very high -- often 10.xx or 20.xx as reported by top. Further, running other things on the server during these times (even vi) is very slow, so the load is definitely up there. Oddly enough Apache remains very responsive, other than that initial delay.

    We have Apache configured as follows, using prefork:

        StartServers 5
        MinSpareServers 5
        MaxSpareServers 10
        MaxClients 150
        MaxRequestsPerChild 0

    And KeepAlive as:

        KeepAlive On
        MaxKeepAliveRequests 100
        KeepAliveTimeout 5

    Looking at the server-status page, even at these times of heavy load we are rarely hitting the client cap, usually serving between 80-100 requests and many of those in the keepalive state. That tells me to rule out the initial request slowness as "waiting for a handler" but I may be wrong. Amazon's CloudWatch monitoring tells me that even when our OS is reporting a load of 15, our instance CPU utilization is between 75-80%.

    Example output from top:

        top - 15:47:06 up 31 days, 1:38, 8 users, load average: 11.46, 7.10, 6.56
        Tasks: 221 total, 28 running, 193 sleeping, 0 stopped, 0 zombie
        Cpu(s): 66.9%us, 22.1%sy, 0.0%ni, 2.6%id, 3.1%wa, 0.0%hi, 0.7%si, 4.5%st
        Mem: 7871900k total, 7850624k used, 21276k free, 68728k buffers
        Swap: 0k total, 0k used, 0k free, 3750664k cached

    The majority of the processes look like:

        24720 www-data 15 0 202m 26m 4412 S 9 0.3 0:02.97 apache2
        24530 www-data 15 0 212m 35m 4544 S 7 0.5 0:03.05 apache2
        24846 www-data 15 0 209m 33m 4420 S 7 0.4 0:01.03 apache2
        24083 www-data 15 0 211m 35m 4484 S 7 0.5 0:07.14 apache2
        24615 www-data 15 0 212m 35m 4404 S 7 0.5 0:02.89 apache2

    Example output from vmstat at the same time as the above:

        procs -----------memory---------- ---swap-- -----io---- -system-- ----cpu----
         r  b   swpd   free   buff   cache  si  so    bi    bo    in   cs us sy id wa
         8  0      0 215084  68908 3774864   0   0   154   228     5    7 32 12 42  9
         6 21      0 198948  68936 3775740   0   0   676  2363  4022 1047 56 16  9 15
        23  0      0 169460  68936 3776356   0   0   432  1372  3762  835 76 21  0  0
        23  1      0 140412  68936 3776648   0   0   280     0  3157  827 70 25  0  0
        20  1      0 115892  68936 3776792   0   0   188     8  2802  532 68 24  0  0
         6  1      0 133368  68936 3777780   0   0   752    71  3501  878 67 29  0  1
         0  1      0 146656  68944 3778064   0   0   308  2052  3312  850 38 17 19 24
         2  0      0 202104  68952 3778140   0   0    28    90  2617  700 44 13 33  5
         9  0      0 188960  68956 3778200   0   0     8     0  2226  475 59 17  6  2
         3  0      0 166364  68956 3778252   0   0     0    21  2288  386 65 19  1  0

    And finally, output from Apache's server-status:

        Server uptime: 31 days 2 hours 18 minutes 31 seconds
        Total accesses: 60102946 - Total Traffic: 974.5 GB
        CPU Usage: u209.62 s75.19 cu0 cs0 - .0106% CPU load
        22.4 requests/sec - 380.3 kB/second - 17.0 kB/request
        107 requests currently being processed, 6 idle workers

        C.KKKW..KWWKKWKW.KKKCKK..KKK.KKKK.KK._WK.K.K.KKKKK.K.R.KK..C.C.K
        K.C.K..WK_K..KKW_CK.WK..W.KKKWKCKCKW.W_KKKKK.KKWKKKW._KKK.CKK...
        KK_KWKKKWKCKCWKK.KKKCK..........................................
        ................................................................
    From my limited experience I draw the following conclusions/questions:

    • We may be allowing far too many KeepAlive requests.
    • I do see some time spent waiting for IO in the vmstat output, although not consistently and not a lot (I think?), so I am not sure whether this is a big concern; I am less experienced with vmstat.
    • Also in vmstat, I see in some iterations a number of processes waiting to be served, which is what I am attributing the initial page-load delay on our web server to, possibly erroneously.
    • We serve a mixture of static content (75% or higher) and script content, and the script content is often fairly processor intensive, so finding the right balance between the two is important; long term we want to move the static content elsewhere to optimize both servers, but our software is not ready for that today.

    I am happy to provide additional information if anybody has any ideas. The other note is that this is a high-availability production installation, so I am wary of making tweak after tweak, which is why I haven't played with things like the KeepAlive value myself yet.

    Read the article

  • Uploading file from file object with PyCurl

    - by Tom
    I'm attempting to upload a file like this:

        import pycurl
        c = pycurl.Curl()
        values = [
            ("name", "tom"),
            ("image", (pycurl.FORM_FILE, "tom.png"))
        ]
        c.setopt(c.URL, "http://upload.com/submit")
        c.setopt(c.HTTPPOST, values)
        c.perform()
        c.close()

    This works fine. However, it only works if the file is local. If I were instead to fetch the image like this:

        import urllib2
        resp = urllib2.urlopen("http://upload.com/people/tom.png")

    how would I pass resp.fp as a file object, instead of writing it to a file and passing the filename? Is this possible?
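
    One possible approach, sketched on the assumption that reading the whole response into memory is acceptable, is to skip the file object entirely and hand pycurl the downloaded bytes via FORM_BUFFER/FORM_BUFFERPTR (the URLs are the question's placeholders):

        import pycurl
        import urllib2

        resp = urllib2.urlopen("http://upload.com/people/tom.png")
        data = resp.read()  # pull the remote image into memory

        c = pycurl.Curl()
        values = [
            ("name", "tom"),
            # FORM_BUFFER supplies a filename for the part, FORM_BUFFERPTR supplies its bytes
            ("image", (pycurl.FORM_BUFFER, "tom.png", pycurl.FORM_BUFFERPTR, data)),
        ]
        c.setopt(c.URL, "http://upload.com/submit")
        c.setopt(c.HTTPPOST, values)
        c.perform()
        c.close()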

    Read the article

  • SSIS DTS Package flat file error - "The file name specified in the connection was not valid"

    - by MisterZimbu
    I have a pretty basic SSIS package that is attempting to read a file hosted on a share and import its contents to a database table. The package runs fine when I run it manually within SSIS. However, when I set up a SQL Agent job and attempt to execute it, I get the following error:

        Executed as user: DOMAIN\UserName. Microsoft (R) SQL Server Execute Package Utility Version 9.00.3042.00 for 64-bit
        Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
        Started: 10:14:17 AM
        Error: 2010-05-03 10:14:17.75 Code: 0xC001401E Source: DataImport Connection manager "Data File Local"
        Description: The file name "\\10.1.1.159\llpf\datafile.dat" specified in the connection was not valid. End Error
        Error: 2010-05-03 10:14:17.75 Code: 0xC001401D Source: DataAnimalImport
        Description: Connection "Data File Local" failed validation. End Error
        DTExec: The package execution returned DTSER_FAILURE (1).
        Started: 10:14:17 AM Finished: 10:14:17 AM Elapsed: 0.594 seconds.
        The package execution failed. The step failed.

    This leads me to believe it's a permissions issue, but every attempt I've made to fix it has failed. What I've tried so far:

    • Ran the job as the SQL Agent account (DOMAIN\SqlAgent) - yields the same error. DOMAIN\SqlAgent has "Full Control" permissions on both the share and the uploaded file.
    • Set up a proxy account with a different account's credentials (DOMAIN\Account) - yields the same error. As above, "Full Control" permissions were given over the share to that account.
    • Gave "Everyone" full control permissions over the share (temporarily!) - yields the same error.
    • Manually copied the file to a local path and tested with the SQL Agent account - worked properly.
    • Added an ActiveX script task that would first copy the remotely hosted file to a local path and then have the DTS package reference the local file - gave a completely nondescriptive (even by SSIS standards) error when trying to run the script.
    • Set up a proxy account using my own personal account's credentials - worked correctly. However, this is not an acceptable solution, as there are password policies in place on my account, and it is bad practice to set things up this way in general.

    Any ideas? I'm still convinced it's a permissions issue. However, what I've read from various searches more or less says that giving the executing account permissions on the share should work, which is not the case here (unless I'm missing something obscure when I'm setting up permissions on the share).

    Read the article

  • Skip Corrupt Revisions During SvnAdmin Load

    - by cisellis
    I have a dump file that I am generating from VSS with the VSS2SVN script. I've tested the generated dump file before, and some of the revisions are corrupt for one reason or another (binary data or long path strings seem to be the main culprits). This is fine. In the past I have used svndumpfilter to split the dump file, remove the corrupt revisions, and continue loading the repository. It worked, but it took a lot of manual effort to start the load, hit the bad revision, split the dump file, continue loading the repo, and so on. This dump file is pretty large (~5 GB) and takes several hours to load. I think I know the answer to this, but is there any way to simply tell svnadmin load to keep going and skip corrupt revisions? I know how to verify, back up, etc. the dump file and don't need any of that. I don't care about recovering corrupt revisions. I just want to start the load, walk away, and not worry about checking it every few hours to manually remove the corrupt revisions. Is that possible? Thanks.

    Read the article

  • execute python file from another file

    - by ariel
    Hi, I have a Python file that contains functions and classes. Now I am writing another program (in another file), and I want to start the new file by running the old file (with the functions and classes). I have tried using exec(path_2_oldFile.pyw) but it didn't work. Thanks for any help. Ariel
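
    A minimal sketch of two common approaches, assuming the old file is a hypothetical old_file.py; note that exec() wants the file's source text, not its path:

        # Option 1: import the old file as a module (preferred) and use its contents.
        import old_file              # hypothetical module name; its .py file must be on sys.path
        old_file.some_function()     # hypothetical function defined in the old file

        # Option 2: execute the old file's source in the current namespace.
        with open("path/to/old_file.pyw") as f:   # hypothetical path
            exec(f.read())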

    Read the article

  • file upload in ajax by using php

    - by princess123
    Hi. I would like to use file upload (AJAX/JavaScript with PHP) in a form that also has other controls. When an image is uploaded, it should display on the same form with a delete option. If I click submit, the image should go into a folder as well as into the database; if I click delete, it should delete the image. Can anybody help me? I want it to work exactly like the Gmail file upload.

    Read the article

  • SIMPLE file reading in Perl

    - by Befall
    Hey guys, The other answered questions were a bit complicated for me, as I'm extremely new to using Perl. I'm curious how Perl reads in the files, how to tell it to advance to the next line in the text file, and how to make it read all lines in the .txt file until, for example, it reaches item "banana". Any and all help would be appreciated, thanks!

    Read the article

  • Modify MP3 File

    - by arik-so
    Hello, I have an MP3 file uploader. I want to add an additional audio track to the file upon upload - via PHP. Is that possible? Thanks in advance!

    Read the article

  • How to Read Java File Structure using Java?

    - by Iso
    Hi, I'm trying to read a Java file and display in the console the package, class, and method names. Something like this:

    File: Test.java

        package tspec.test;

        public class Test {
            public void addTest () {}
            public void deleteTest () {}
        }

    Output:

        package name: tspec.test
        class name: Test
        method name: addTest deleteTest

    Thanks in advance :)

    Read the article

  • Read text file in google GWT?

    - by ipkiss
    Hello all, I am writing a webpage using GWT. Now I need to read a text file and display its content in the webpage, but I have no idea how to do that with GWT. It would be very nice if there is a way in GWT to read a .properties file. Does anyone have an idea, please? Thanks.

    Read the article

  • How to load an HTML file into an included PHP file from another PHP?

    - by Peter NGM
    I have 2 PHP files and 1 HTML file. I want to include file2.php in file1.php. file1.php is:

        <html>
        <head>...</head>
        <body>
        ...
        <?php include("file2.php"); ?>
        ...
        </body>
        </html>

    I want to load an HTML file in file2.php. file2.php is:

        <?php
        $doc = new DOMDocument();
        $doc->loadHTMLFile("sample.html");
        echo $doc->saveHTML();
        ?>

    and sample.html is:

        ...
        <b>Hello World!</b>
        ...

    My problem is this error:

        Warning: DOMDocument::loadHTMLFile() [domdocument.loadhtmlfile]: I/O warning : failed to load external entity "sample.html" in C:\xampp\htdocs\project\file2.php on line 3

    Please help me to solve this problem.

    Read the article

  • Free-as-in-beer binary file format inspector

    - by fbrereto
    I am looking for a utility that gives me the ability to specify a binary file format and then interpret a file of bytes according to that format. (Something along the lines of the 010 Editor, but infinitely more cost-effective). Something that runs on Mac OS X would be preferred, but I'm interested to see what all is out there in general (while more of a hassle I'd be willing to run a tool on Windows if it were superior.) What's your preference?

    Read the article

  • Best pattern to load enumerated values from DAL using WCF RIA Services

    - by Dale Halliwell
    I would like to be able to load several RIA entity sets in a single call, without chaining/nesting several small LoadOperations together so that they load sequentially. I have several pages that have a number of comboboxes on them. These comboboxes are populated with static values from a database (for example, status values). Right now I preload these values in my VM with one method that strings together a series of LoadOperations for each type that I want to load. For example:

        public void LoadEnums()
        {
            context.Load(context.GetMyStatusValues1Query()).Completed += (s, e) =>
            {
                this.StatusValues1 = context.StatusValues1;
                context.Load(context.GetMyStatusValues2Query()).Completed += (s1, e1) =>
                {
                    this.StatusValues2 = context.StatusValues2;
                    context.Load(context.GetMyStatusValues3Query()).Completed += (s2, e2) =>
                    {
                        this.StatusValues3 = context.StatusValues3;
                        // (...and so on)
                    };
                };
            };
        }

    While this works fine, it seems a bit nasty. Also, I would like to know when the last LoadOperation completes, so that I can then load whatever entity I want to work on and have these enumerated values resolve properly in form elements like comboboxes and listboxes. (I think) I can't do this easily above without creating a delegate and calling that on the completion of the last LoadOperation. So my question is: does anyone out there know a better pattern to use, ideally one where I can load all my static entity sets in a single LoadOperation?

    Read the article

  • Import and Export for CSV are both broken in Mathematica

    - by dreeves
    Consider the following 2 by 2 array:

        x = {{"a b c", "1,2,3"}, {"i \"comma-heart\" you", "i \",heart\" u, too"}}

    If we Export that to CSV and then Import it again, we don't get the same thing back:

        Import[Export["tmp.csv", x]]

    Looking at tmp.csv, it's clear that the Export didn't work, since the quotes are not escaped properly. According to the RFC, which I presume is summarized correctly in Wikipedia's entry on CSV, the right way to export the above array is as follows:

        a b c,"1,2,3"
        "i ""comma-heart"" you","i "",heart"" u, too"

    Importing the above does not yield the original array either, so Import is broken as well. I've reported these bugs to [email protected], but I'm wondering if others have workarounds in the meantime. One workaround is to just use TSV instead of CSV. I tested the above with TSV and it seems to work (even with tabs embedded in the entries of the array).

    Read the article
