Search Results

Search found 31269 results on 1251 pages for 'process management'.


  • Does a SELECT happen all at once, or progressively?

    - by AmbroseChapel
    I have a process which finds a list of files to be deleted using a SELECT where delete = 'Y'. I set this process running the other day, but it takes a while because it actually does the file deletions too. In the middle of its long operation, I was using the application and deleted one more file. At this point I realised I didn't know whether that file would be deleted, because I didn't know if the SELECT had found all the files at the start, or if it was finding them progressively and would get to my newly-deleted file eventually.

    Read the article

  • Improving HTML scraper efficiency with pcntl_fork()

    - by Michael Pasqualone
    With the help from two previous questions, I now have a working HTML scraper that feeds product information into a database. What I am now trying to do is improve efficiency by wrapping my brain around getting my scraper working with pcntl_fork. If I split my php5-cli script into 10 separate chunks, I improve total runtime by a large factor, so I know I am not I/O or CPU bound but just limited by the linear nature of my scraping functions. Using code I've cobbled together from multiple sources, I have this working test: <?php libxml_use_internal_errors(true); ini_set('max_execution_time', 0); ini_set('max_input_time', 0); set_time_limit(0); $hrefArray = array("http://slashdot.org", "http://slashdot.org", "http://slashdot.org", "http://slashdot.org"); function doDomStuff($singleHref,$childPid) { $html = new DOMDocument(); $html->loadHtmlFile($singleHref); $xPath = new DOMXPath($html); $domQuery = '//div[@id="slogan"]/h2'; $domReturn = $xPath->query($domQuery); foreach($domReturn as $return) { $slogan = $return->nodeValue; echo "Child PID #" . $childPid . " says: " . $slogan . "\n"; } } $pids = array(); foreach ($hrefArray as $singleHref) { $pid = pcntl_fork(); if ($pid == -1) { die("Couldn't fork, error!"); } elseif ($pid > 0) { // We are the parent $pids[] = $pid; } else { // We are the child $childPid = posix_getpid(); doDomStuff($singleHref,$childPid); exit(0); } } foreach ($pids as $pid) { pcntl_waitpid($pid, $status); } // Clear the libxml buffer so it doesn't fill up libxml_clear_errors(); Which raises the following questions: 1) Given that my hrefArray contains 4 URLs - if the array were to contain, say, 1,000 product URLs, would this code spawn 1,000 child processes? If so, what is the best way to limit the number of processes to, say, 10, and, again with 1,000 URLs as an example, split the child workload to 100 products per child (10 x 100)? 2) I've learned that pcntl_fork creates a copy of the process and all variables, classes, etc. What I would like to do is replace my hrefArray variable with a DOMDocument query that builds the list of products to scrape and then feeds them off to child processes to do the processing - so spreading the load across 10 child workers. My brain is telling me I need to do something like the following (obviously this doesn't work, so don't run it): <?php libxml_use_internal_errors(true); ini_set('max_execution_time', 0); ini_set('max_input_time', 0); set_time_limit(0); $maxChildWorkers = 10; $html = new DOMDocument(); $html->loadHtmlFile('http://xxxx'); $xPath = new DOMXPath($html); $domQuery = '//div[@id=productDetail]/a'; $domReturn = $xPath->query($domQuery); $hrefsArray[] = $domReturn->getAttribute('href'); function doDomStuff($singleHref) { // Do stuff here with each product } // To figure out: Split href array into $maxChildWorkers # of workArray1, workArray2 ... workArray10. $pids = array(); foreach ($workArray(1,2,3 ... 10) as $singleHref) { $pid = pcntl_fork(); if ($pid == -1) { die("Couldn't fork, error!"); } elseif ($pid > 0) { // We are the parent $pids[] = $pid; } else { // We are the child $childPid = posix_getpid(); doDomStuff($singleHref); exit(0); } } foreach ($pids as $pid) { pcntl_waitpid($pid, $status); } // Clear the libxml buffer so it doesn't fill up libxml_clear_errors(); But what I can't figure out is how to build my hrefsArray[] in the master/parent process only and feed it off to the child processes. Currently everything I've tried causes loops in the child processes; i.e. my hrefsArray gets built in the master and in each subsequent child process. I am sure I am going about this all totally wrong, so I would greatly appreciate just a general nudge in the right direction.

    Read the article

  • How to implement a mailing system with Rails that sends emails in the background

    - by Tam
    I want to implement a reliable mailing system with Ruby on Rails that sends emails in the background, since sending an email can take 10 seconds or more and I don't want the user to wait. Some ideas I thought of: 1) Write to a table in the DB and have a background process that goes over it and sends the emails (concern: potentially many reads/writes to the DB slowing down my application). 2) A messaging-queue background process / Rake task (concern: if the server crashes, queued mails will be lost; it might also eat up a lot of memory if there are many emails). I was wondering if you know of a good solution that provides a balance between reliability and performance.

    Read the article

  • ResGen.exe gets stuck when redirecting output

    - by Darqer
    Hello, I am trying to redirect the standard output from ResGen.exe. I use the following code: ProcessStartInfo psi = new ProcessStartInfo( "resxGen.exe" ); psi.CreateNoWindow = true; psi.Arguments = sb.ToString(); psi.UseShellExecute = false; psi.RedirectStandardOutput = true; Process p = Process.Start( psi ); p.WaitForExit(); StreamReader sr = p.StandardOutput; string message = p.StandardOutput.ReadToEnd(); It gets stuck on p.WaitForExit. When I turn off output stream redirection and do not read StandardOutput, it works correctly. What am I doing wrong? (A hedged sketch of the usual workaround follows this entry.)

    Read the article
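
    A likely cause, offered here as a hedged note rather than a confirmed diagnosis: calling WaitForExit before reading the redirected stream lets the child fill the stdout pipe and block, while the parent blocks waiting for it to exit. The sketch below reads the output first and only then waits; the "input.resx output.resources" arguments are placeholders, not the original command line.

        using System;
        using System.Diagnostics;

        class RunResGen
        {
            static void Main()
            {
                var psi = new ProcessStartInfo("resgen.exe", "input.resx output.resources")
                {
                    CreateNoWindow = true,
                    UseShellExecute = false,
                    RedirectStandardOutput = true
                };

                using (Process p = Process.Start(psi))
                {
                    // Drain stdout before waiting; waiting first can deadlock once the
                    // child has written more than the pipe buffer holds.
                    string message = p.StandardOutput.ReadToEnd();
                    p.WaitForExit();
                    Console.WriteLine(message);
                }
            }
        }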

  • How can I measure TForm deserialization time in Delphi?

    - by mjustin
    For performance tests I need a way to measure the time needed for a form to load its definition from the DFM. All existing forms inherit from a custom form class. To capture the current time, this base class needs overridden methods as "extension points": before the beginning of the deserialization process; after the completion of deserialization (which can be implemented by overriding the Loaded procedure); and the moment just before the execution of the OnFormCreate event. Which TObject (or TComponent) methods are best suited? Maybe there are other extension points in the form creation process; please feel free to make suggestions.

    Read the article

  • Is there a buffer size attached to stdout?

    - by jameswelle
    I am trying to find some information on data limits related to stdout on Windows. I can't seem to find the information on MSDN. Is there a limit to how much data can be written to stdout? If so, what happens if the limit is reached? Is the data lost? If stdout is redirected (for example, by launching the process from .NET and using the ProcessStartInfo.RedirectStandardOutput property), does that have any effect on how much data can be written? As I read from the stdout stream in the calling process, does that affect the limitations? Are these limits related in any way to named pipes? (A hedged sketch of reading the redirected stream asynchronously follows this entry.)

    Read the article
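
    As a hedged note on the redirected case: broadly speaking, the child's writes go into an operating-system pipe with a finite buffer, and when that buffer is full the child blocks until the parent reads; the data is not lost as long as the parent keeps draining it. Below is a minimal sketch of draining asynchronously so the child never stalls; "some_tool.exe" is only a placeholder process name.

        using System;
        using System.Diagnostics;

        class AsyncStdoutReader
        {
            static void Main()
            {
                var psi = new ProcessStartInfo("some_tool.exe")
                {
                    UseShellExecute = false,
                    RedirectStandardOutput = true
                };

                using (Process p = new Process { StartInfo = psi })
                {
                    // Each line arrives on a thread-pool callback as it is produced,
                    // so the child can keep writing no matter how much output it emits.
                    p.OutputDataReceived += (sender, e) =>
                    {
                        if (e.Data != null) Console.WriteLine(e.Data);
                    };

                    p.Start();
                    p.BeginOutputReadLine();
                    p.WaitForExit();
                }
            }
        }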

  • Reading and writing to SysV shared memory without synchronization (use of semaphores, C/C++, Linux)

    - by user363778
    Hi, I use SysV shared memory to let two processes communicate with each other. I do not want the code to become too complex, so I wondered if I really had to use semaphores to synchronize access to the shared memory. In my C/C++ program the parent process reads from the shared memory and the child process writes to it. I wrote two test applications to see if I could produce some kind of error, like a segmentation fault, but I couldn't (Ubuntu 10.04 64-bit). Even two processes writing non-stop in a while loop to the same shared memory did not produce any error. I hope someone has experience concerning this matter and can tell me if I really must use semaphores to synchronize the access, or if I am OK without synchronization. Thanks

    Read the article

  • urllib alternative for iPhone

    - by Pr301
    Hi, I am trying to create an iPhone application which at some point connects to the internet, fills in an online form, fetches the resulting website, parses it and returns a string to the user. I want this whole process to happen in the background. I know how to do this kind of thing with Python and urllib, but in Objective-C I can't find an alternative; from searching online I found either sites that explain how to use WebKit to retrieve webpages (I suppose this is for displaying them to the user) or how to parse an existing HTML file or string. Since I want the file to be retrieved from the internet and the whole process to run in the background, neither of these solutions covers my needs.

    Read the article

  • Launch x64 Windows application in C# while the project is set to x86

    - by daydr3am3r
    Hi all, I'm trying to launch osk.exe and I keep getting a "Could not start osk" message. The problem is that my project is set to x86 (I'm using an MS Access database). If I switch to x64 or Any CPU everything works fine, but the database will no longer work. I tried this: using System.Diagnostics; private void btnOSK_Click(object sender, EventArgs e) { Process.Start("osk.exe"); Process.Start(@"C:\windows\system32\osk.exe"); } I also tried to run SysWOW64\osk, but that didn't work either. Besides, my application should run on both x86 and x64 machines. Is there any way to bypass this? It's really frustrating. (A hedged sketch using the sysnative alias follows this entry.)

    Read the article
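
    A hedged sketch, not a confirmed fix for osk.exe in particular: for a 32-bit process on 64-bit Windows, C:\Windows\System32 is silently redirected to SysWOW64, and the "sysnative" alias (Windows Vista and later) reaches the real 64-bit System32. osk.exe itself is known to be picky when started from 32-bit code, so treat this only as a starting point.

        using System;
        using System.Diagnostics;
        using System.IO;

        class OskLauncher
        {
            static void Main()
            {
                string windir = Environment.GetEnvironmentVariable("windir");
                // "sysnative" only exists for 32-bit processes on a 64-bit OS;
                // fall back to the ordinary System32 path otherwise.
                string sysnative = Path.Combine(windir, @"sysnative\osk.exe");
                string system32 = Path.Combine(windir, @"System32\osk.exe");
                Process.Start(File.Exists(sysnative) ? sysnative : system32);
            }
        }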

  • Windows Shell Programming book suggestion

    - by Lijo
    Hi, I am a web developer using C#. I would like to experiment with Windows shell programming in C#. Many people suggest that using managed code for shell programming is dangerous (creation of separate instances for each process versus an in-process shell, plus version dependency). Frankly speaking, I am totally new to the shell. Is there a book available that treats these topics, through both managed and unmanaged code (but leaning towards managed code)? It would be great if that book were concise, aimed at a beginner, and gave some theoretical background on the shell. Please suggest. Thanks, Lijo

    Read the article

  • What is the easiest way to send a Javascript array via JSON to PHP?

    - by dscher
    I have a few arrays that I want to send to PHP for processing. Using json2.js I will stringify the arrays like so: var JSONlinks = JSON.stringify(link_array); var JSONnotes = JSON.stringify(note_array); but then I'm confused. Do I need to use an XMLHttpRequest object? Is there another way? If that is the simplest way, could someone please share the most basic example of the code needed to send the data to PHP, where I can then decode the JSON? I think it might really help others in the future. I'm currently using jQuery, and I know there are many frameworks out there and each one may or may not make this process easier. If you're using a framework in your reply, please mention why you'd choose that framework rather than plain JavaScript.

    Read the article

  • Using PowerShell class to invoke a "[namespace.class]::method" style command

    - by Marco
    Hello, I created a PowerShell object via .NET to invoke commands. When I invoke normal commands like 'Get-Process' I have no problems: ps.AddCommand("Get-Process").AddParameter(...).Invoke() but I'm not able to invoke a .NET method with the syntax "[namespace.class]::method", for example invoking [System.IO.File]::Exists("c:\boo.txt"). I tried ps.AddCommand("[System.IO.File]::Exists(\"c:\\boo.txt\")").Invoke() ps.AddCommand("[System.IO.File]::Exists").AddArgument("c:\\boo.txt").Invoke() and some others. It always throws an exception saying that the specified command is not recognized. Is there a way to invoke that type of command? Thanks (A hedged sketch using AddScript follows this entry.)

    Read the article
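
    A hedged sketch of one way around this: AddCommand expects the name of a cmdlet, function or executable, while a static method call such as [System.IO.File]::Exists(...) is a script expression, so it can be handed to AddScript instead. The file path below is only a placeholder.

        using System;
        using System.Management.Automation; // reference System.Management.Automation.dll

        class InvokeStaticMethod
        {
            static void Main()
            {
                using (PowerShell ps = PowerShell.Create())
                {
                    // AddScript evaluates an arbitrary expression, so the
                    // [namespace.class]::method syntax is accepted here.
                    ps.AddScript("[System.IO.File]::Exists('c:\\boo.txt')");
                    foreach (PSObject result in ps.Invoke())
                    {
                        Console.WriteLine(result); // True or False
                    }
                }
            }
        }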

  • How to self-host WCF without IIS

    - by dotnetcoder
    Reading up on WCF, we have the self-hosting option available; one limitation here is that we have to manage the host process lifecycle ourselves. What I am exploring here is running the service without IIS and doing the hosting myself. A few things come to mind: How will request management work here? In the case of IIS, it manages the request and gives control to .NET on a particular thread. In the absence of IIS, do we need to write code ourselves to manage incoming requests (say on a TCP port), or does WCF provide classes to manage requests and spawn threads to process each one? I am aware that in the case of self-hosting this needs to be a Windows service. Also, in the case of self-hosting, how can I control the number of simultaneous requests on the server? Can it be managed by limiting the thread pool, or can we configure this via WCF? (A hedged sketch follows this entry.) Thanks, dc

    Read the article
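
    A minimal self-hosting sketch, under the assumption that listening and dispatching are left to ServiceHost and that concurrency is capped with ServiceThrottlingBehavior; the IEcho contract, the address and the numbers are placeholders, and a real deployment would open the host in a Windows service's OnStart/OnStop rather than a console Main.

        using System;
        using System.ServiceModel;
        using System.ServiceModel.Description;

        [ServiceContract]
        public interface IEcho
        {
            [OperationContract]
            string Echo(string text);
        }

        public class EchoService : IEcho
        {
            public string Echo(string text) { return text; }
        }

        class Program
        {
            static void Main()
            {
                var host = new ServiceHost(typeof(EchoService),
                                           new Uri("net.tcp://localhost:8000/echo"));
                host.AddServiceEndpoint(typeof(IEcho), new NetTcpBinding(), "");

                // WCF listens and dispatches on its own threads; this behavior
                // caps how many calls it will service at the same time.
                host.Description.Behaviors.Add(new ServiceThrottlingBehavior
                {
                    MaxConcurrentCalls = 16,
                    MaxConcurrentSessions = 100
                });

                host.Open();
                Console.WriteLine("Service listening; press Enter to stop.");
                Console.ReadLine();
                host.Close();
            }
        }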

  • Help to run it in the background

    - by AlexPolo
    Here's a simple Python daemon I can't manage to run as a background process: #!/usr/bin/env python import socket host = '' port = 843 backlog = 5 size = 1024 s = socket.socket(socket.AF_INET, socket.SOCK_STREAM) s.bind((host,port)) s.listen(backlog) while 1: client, address = s.accept() data = client.recv(size) if data == '<policy-file-request/>\0': client.send('<?xml version="1.0"?><cross-domain-policy><allow-access-from domain="*" to-ports="*"/></cross-domain-policy>') client.close() It's a socket policy file server (you may have heard of the restriction Adobe put on socket connections - http://www.adobe.com/devnet/flashplayer/articles/socket_policy_files.html); it works well when run as an "ordinary" process - "python that_server.py" - but I have problems running it in the background. Running it like "that_server.py &" does not work.

    Read the article

  • Compilation hangs for a class with field double d = 2.2250738585072012e-308

    - by 01es
    I have come across an interesting situation. A coworker committed some changes which would not compile on my machine, either from the IDE (Eclipse) or from the command line (Maven). The problem manifested as the compilation process taking 100% CPU; only killing the process would stop it. After some analysis the cause of the problem was located and resolved. It turned out to be the line "double d = 2.2250738585072012e-308" (without a semicolon at the end) in one of the interfaces. The following snippet reproduces it: public class WeirdCompilationIssue { double d = 2.2250738585072012e-308 } Why would the compiler hang? A language edge case?

    Read the article

  • How do I view how many concurrent long polling requests there are on my server?

    - by Pascal
    My host is Joyent. My host says I have a 15-process limit, and prstat -J shows those processes, but that doesn't tell me how many long-polling requests are currently being served. I could record it myself, but that would add a lot of performance overhead. I need to know when the server is at its long-polling limits. I know this limit is reached far before the memory or CPU is used up. From experimentation, I've already verified that the number of long polls open is NOT equivalent to the number of processes running, probably because each process has multiple threads, each serving a request. Thanks.

    Read the article

  • Windows Azure local development environment speed

    - by Paperjam
    I've started porting an existing ASP.NET web app to Windows Azure and have noticed that the development process is really slow. Each time I make a change to my code and want to view it, I have to effectively redeploy it to the local dev cloud (using Start Debugging (F5) or Start Without Debugging (Ctrl-F5)). The process itself takes over a minute, during which time Visual Studio is completely unresponsive. Am I doing something wrong, or is that simply how things are when developing for Azure? My specs: Visual Studio 2008 9.0.30729.1 SP; 5 projects running on .NET 3.5 SP1; Azure SDK 1.1 (February 2010); a single instance of a single web role; dual-core AMD64 machine with 8GB RAM, 64-bit Windows 7, fully patched. The main project itself is quite large (3k files, ~200k lines) but compiles normally in 10-15 seconds.

    Read the article

  • How to run an exe which is added to the project

    - by Manish
    First of all, I did take a look at this one, but I didn't get the solution from it. The accepted answer says to use it like this: Process p = new Process(); p.StartInfo.UseShellExecute = false; p.StartInfo.RedirectStandardOutput = true; p.StartInfo.FileName = "myExec.exe"; p.Start(); But this is not working for me. The exception's message says "The system cannot find the specified file". Am I missing something? I added the exe directly to the project itself. (A hedged sketch follows this entry.)

    Read the article
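
    A hedged sketch of the usual remedy, assuming the exe added to the project has its "Copy to Output Directory" property set to "Copy if newer" so it lands next to the built assembly; building the full path avoids depending on the current working directory, which is typically what triggers "The system cannot find the specified file".

        using System;
        using System.Diagnostics;
        using System.IO;

        class RunBundledExe
        {
            static void Main()
            {
                // Resolve the exe relative to the application's own folder, not
                // whatever the current working directory happens to be.
                string exePath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "myExec.exe");

                var psi = new ProcessStartInfo(exePath)
                {
                    UseShellExecute = false,
                    RedirectStandardOutput = true
                };

                using (Process p = Process.Start(psi))
                {
                    string output = p.StandardOutput.ReadToEnd(); // read before waiting
                    p.WaitForExit();
                    Console.WriteLine(output);
                }
            }
        }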

  • Is Amazon SQS the right choice here? Rails performance issue.

    - by ole_berlin
    I'm close to releasing a Rails app with the common networking features (messaging, wall, etc.). I want to use some kind of background processing (most likely Bj) for off-loading tasks from the request/response cycle. This would happen when users invite friends via email to join, and for email notifications. I'm not sure if I should just drop these invites and notifications into my database using a model and then process them with a worker process every x minutes, or if I should go for Amazon SQS, storing the messages and invites there and letting my worker retrieve them from Amazon SQS for processing (sending the invites/notifications). The Amazon approach would take load off my database, but I guess it is slower to retrieve messages from there. What do you think?

    Read the article
