Search Results

Search found 3052 results on 123 pages for 'jawahar sync'.

Page 55/123

  • How to invalidate the OutputCache in a webfarm?

    - by Pure.Krome
    Hi folks, I've got a website that uses the OutputCache attribute to cache pages. Works great. Now I'm in the middle of R&D'ing scaling this site up to a web farm. Along with the usual suspects for web-farm pain, I've noticed (pretty quickly/obviously) that the OutputCache on Server_A doesn't invalidate the OutputCache on Server_B if I try to invalidate a single server's OutputCache. This makes total sense: how can S_A 'tell' S_B to invalidate when they are physically two separate machines?

    So, what are our options? Velocity? I understand this will move the caching to a different layer, which means the final result (output) will always have to be rendered, as opposed to the OutputCache, which remembers the final output content (yes, VaryBy gives different versions, etc., which is totally fine). So even though the POCOs or business objects are all synced, there's still that last rendering effort required (even if it's tiny compared to the effort to generate/sync the business objects). So yeah, I'm not sure of the options here; what do other people do?
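
    One direction worth noting here (an addition, not part of the original question): ASP.NET 4.0 lets you swap the default in-memory output cache for a custom OutputCacheProvider, which can delegate to a shared store (Velocity/AppFabric, memcached, a database) so that every server in the farm sees the same entries and the same invalidations. A minimal sketch follows; InMemorySharedStore is a hypothetical stand-in for whatever distributed cache client is actually used.

        using System;
        using System.Collections.Concurrent;
        using System.Web.Caching;

        // Hypothetical stand-in for a distributed cache client. In a real farm
        // this would talk to Velocity/AppFabric, memcached, etc.
        public class InMemorySharedStore
        {
            private readonly ConcurrentDictionary<string, Tuple<object, DateTime>> _data =
                new ConcurrentDictionary<string, Tuple<object, DateTime>>();

            public object Get(string key)
            {
                Tuple<object, DateTime> entry;
                if (_data.TryGetValue(key, out entry) && entry.Item2 > DateTime.UtcNow)
                    return entry.Item1;
                return null;
            }

            public void Put(string key, object value, DateTime utcExpiry)
            {
                _data[key] = Tuple.Create(value, utcExpiry);
            }

            public void Remove(string key)
            {
                Tuple<object, DateTime> ignored;
                _data.TryRemove(key, out ignored);
            }
        }

        public class FarmOutputCacheProvider : OutputCacheProvider
        {
            private static readonly InMemorySharedStore Store = new InMemorySharedStore();

            public override object Get(string key)
            {
                return Store.Get(key);
            }

            public override object Add(string key, object entry, DateTime utcExpiry)
            {
                // Add must return any entry that is already cached for this key.
                var existing = Store.Get(key);
                if (existing != null)
                    return existing;
                Store.Put(key, entry, utcExpiry);
                return entry;
            }

            public override void Set(string key, object entry, DateTime utcExpiry)
            {
                Store.Put(key, entry, utcExpiry);
            }

            public override void Remove(string key)
            {
                // With a genuinely shared store behind this, removing here
                // invalidates the cached page for the whole farm at once.
                Store.Remove(key);
            }
        }

    The provider is then named as the default in the outputCache section of web.config. Because the store is shared, invalidating on S_A invalidates for S_B as well; the per-miss rendering cost the question mentions remains.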

    Read the article

  • How to avoid chaotic ASP.NET web application deployment?

    - by emzero
    OK, so here's the thing. I'm developing an existing web application (it started as an ASP Classic app, so you can imagine :P) under ASP.NET 4.0 and SQL Server 2005. We are 4 developers using local instances of SQL Server 2005 Express, sharing the source code and a Visual Studio database project. This web app has several "universes" (that's what we call them). Every universe has its own database (currently on the same server), but they all share the same schema (tables, sprocs, etc.) and the same source/site code. So deploying manually is really annoying, because I have to deploy the source code and then run the SQL scripts by hand on each database. I know that manual deployment can cause problems, so I'm looking for a way to automate it. We've recently created a Visual Studio database project to manage the schema and generate the diff-schema scripts for different targets. I have no idea how to put the pieces together, but I would like to:

    - Have a way to make a "sync" deploy to a target server (thankfully I have full RDC access to the servers, so I can install things if required). By "sync" deploy I mean that I don't want to redeploy the whole application, because it has lots of files; I just want to deploy the files that are new or changed.
    - Generate diff-SQL update scripts for every database target and combine them into just one script. For this I should have a list of the database names somewhere.
    - Copy the site files and execute the generated SQL script in an easy, automated way.

    I've read about MSBuild, MS WebDeploy, NAnt, etc., but I don't really know where to start, and I really want to get rid of this manual deploy. If there is a better and easier way of doing it than what I enumerated, I'd be pleased to read your suggestions. I know this is not a very specific question, but I've googled a lot about it and I can't seem to figure out how to do it. I've never used any automation tool to deploy. Any help will be really appreciated. Thank you all. Regards

    Read the article

  • Publish failed using Ant publisher (Eclipse/datanucleus).

    - by aronp
    Dear All, I am being driven mad by the following (apparently hard) error from Eclipse:

        Publish failed using Ant publisher
        Resource is out of sync with the file system: '/MyServlet/build/classes/com/inver/hotzones/database/BaseNetworkData.class'.

    I have seen comments on similar errors where refreshing Eclipse's view of the project helps, but it is not helping me. I have tried cleaning the project, removing it from the web server, and deleting WAR files, but I can't seem to clear it. I have reset my TMPDIR variable so that it uses a directory on the same filesystem, as that appeared to be another possible cause. The error occurs on classes which have been enhanced by DataNucleus. I have auto-enhance on for the project. The other references to this problem indicate that it is due to Eclipse's view of the project being out of step with the filesystem, and I am guessing that this has something to do with the DataNucleus enhancement. Any ideas? Thanks. I am using Eclipse 3.5.2 with the latest DataNucleus plugins.

    Stack trace:

        org.eclipse.core.runtime.CoreException: Resource is out of sync with the file system: '/MyServlet/build/classes/com/inver/hotzones/database/BaseNetworkData.class'.
            at org.eclipse.jst.server.generic.core.internal.publishers.AbstractModuleAssembler.copyModule(AbstractModuleAssembler.java:172)
            at org.eclipse.jst.server.generic.core.internal.publishers.WarModuleAssembler.assemble(WarModuleAssembler.java:31)
            at org.eclipse.jst.server.generic.core.internal.publishers.AntPublisher.assembleModule(AntPublisher.java:167)
            at org.eclipse.jst.server.generic.core.internal.publishers.AntPublisher.publish(AntPublisher.java:128)
            at org.eclipse.jst.server.generic.core.internal.GenericServerBehaviour.publishModule(GenericServerBehaviour.java:82)
            at org.eclipse.wst.server.core.model.ServerBehaviourDelegate.publishModule(ServerBehaviourDelegate.java:949)
            at org.eclipse.wst.server.core.model.ServerBehaviourDelegate.publishModules(ServerBehaviourDelegate.java:1039)
            at org.eclipse.wst.server.core.model.ServerBehaviourDelegate.publish(ServerBehaviourDelegate.java:872)
            at org.eclipse.wst.server.core.model.ServerBehaviourDelegate.publish(ServerBehaviourDelegate.java:708)
            at org.eclipse.wst.server.core.internal.Server.publishImpl(Server.java:2731)
            at org.eclipse.wst.server.core.internal.Server$PublishJob.run(Server.java:278)
            at org.eclipse.core.internal.jobs.Worker.run(Worker.java:55)

    Read the article

  • Can a standalone ruby script (windows and mac) reload and restart itself?

    - by user30997
    I have a master-workers architecture where the number of workers is growing on a weekly basis. I can no longer be expected to ssh or remote-console into each machine to kill the worker, do a source-control sync, and restart. I would like to have the master put a message out on the network telling each machine to sync and restart. That's where I hit a roadblock. If I were using any sane platform, I could just do:

        exec('ruby', __FILE__)

    ...and be done. However, I did the following test:

        p Process.pid
        sleep 1
        exec('ruby', __FILE__)

    ...and on Windows, I get one Ruby instance for each call to exec. None of them die until I hit ^C on the window in question. On every platform I tried this on, it executes the new version of the file each time, which I have verified by making simple edits to the test script while the test marched along. The reason I'm printing the pid is to double-check the behavior I'm seeing. On Windows, I am getting a different pid with each execution, which I would expect, considering that I am seeing a new process in the task manager for each run. The Mac is behaving correctly: the pid is the same for every system call, and I have verified with dtrace that each run is triggering a call to the execve syscall.

    So, in short, is there a way to get a Windows Ruby script to restart its execution so it will be running any code - including itself - that has changed during its execution? Please note that this is not a Rails application, though it does use ActiveRecord.

    Read the article

  • Presenting an image cropping interface

    - by wkw
    I'm trying to engineer a UI for cropping images in iPhone OS and suspect I'm going about things the hard way. My goal is pretty much what the Tapbots duo have done with Pastebot. In that app, they dim the source image but provide a movable and resizable cropping view, and the image you're cropping is in a zoomable scroll view; when you resize or move the underlying image, the cropping view adjusts appropriately. I mocked up a composite image which will give a sense of the design I'm after, along with how I presently have my view hierarchy set up, viewable here.

    The approach I've started with is the following: a UIImageView with the image to crop sits in a scroll view, and a plain UIView with a black fill and a suitable transparency/alpha setting is added in front of the image view. I then use a custom UIView which is a sibling of the scroll view at a higher level; it implements the drawRect: method and for the most part calls CGImageCreateWithImageInRect to get the portion of the image's bitmap that matches the position of the cropping view, and draws that to the CGContext. In the view controller I'm using the UIScrollViewDelegate methods to track scrolling and passing those changes to the custom cropping UIView so it stays in sync with the scroll contentOffset. That much is finally working. But trying to keep in sync as the scroll view zoomScale changes is when I figured I should ask for help.

    Looking for suggestions or guidance. My initial approach just seems like more work than is required. Could this be done with a masking layer in the image view? And if so, how would I set up the tracking for moving and resizing the cropping rect? My experience working with layers is non-nil, but very limited thus far.

    Read the article

  • Serial: write() throttling?

    - by damian
    Hi everyone, I'm working on a project sending serial data to control animation of LED lights, which need to stay in sync with a sound engine. There seems to be a large serial write buffer (OS X (POSIX) + FTDI-chipset USB serial device), so without manually restricting the transmission rate, the animation system can get several seconds ahead of the serial transmission. Currently I'm manually restricting the serial write speed to the baud rate (8N1 = 10 bits on the wire per data byte, so at 19200 bps the link carries 1920 bytes per second max), but I am having a problem with the sound drifting out of sync over time - it starts fine, but after 10 minutes there's a noticeable (100ms+) lag between the sound and the lights.

    This is the code that's restricting the serial write speed (called once per animation frame; 'elapsed' is the duration of the current frame, 'baudrate' is the bps (19200)):

        void BufferedSerial::update( float elapsed )
        {
            baud_timer += elapsed;
            if ( bytes_written > 1024 )
            {
                // maintain baudrate
                float time_should_have_taken = (float(bytes_written)*10)/float(baudrate);
                float time_actually_took = baud_timer;
                // sleep if we have > 20ms lag between serial transmit and our write calls
                if ( time_should_have_taken - time_actually_took > 0.02f )
                {
                    float sleep_time = time_should_have_taken - time_actually_took;
                    int sleep_time_us = sleep_time*1000.0f*1000.0f;
                    //printf("BufferedSerial::update sleeping %i ms\n", sleep_time_us/1000 );
                    delayUs( sleep_time_us );

                    // subtract 128 bytes
                    bytes_written -= 128;
                    // subtract the time it should have taken to write 128 bytes
                    baud_timer -= (float(128)*10)/float(baudrate);
                }
            }
        }

    Clearly there's something wrong somewhere. A much better approach would be to determine the number of bytes currently in the transmit queue and try to keep that below a fixed threshold. Any advice appreciated.

    Read the article

  • How can I access mainframe data with .Net applications and SQL Queries?

    - by orandov
    We have a large amount of data stored on an IBM mainframe in VSAM files. A lot of this data is dropped on the network every night in the form of text files to be processed and dumped into FoxPro and SQL Server databases. There are also many text files produced nightly by custom applications that get uploaded to the mainframe to keep everything in sync. Keeping everything in sync is very tricky, to say the least. We are not getting rid of the mainframe any time soon, and we would like to replace all the nightly batch processing with real-time access to the mainframe data. We would like to be able to:

    - Read data directly from the mainframe and produce reports based on it, possibly using SQL queries.
    - Read and write data from custom .NET applications.

    We are not looking for a new platform to interface with the mainframe like Information Builders offers. We don't want to build application modules or reports with new "Business Intelligence" tools. We already know how to generate reports and write custom applications using SQL, .NET, Visual Studio, etc. All we are looking for is some sort of adapter to connect to our mainframe data. Any ideas are appreciated.
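
    For context (an addition, not from the original question): adapters in this space typically expose VSAM data sets through a standard data-access provider, e.g. an OLE DB provider such as the one shipped with Microsoft Host Integration Server, so a .NET application can query the mainframe like any other data source. A hedged sketch of what that looks like from the .NET side; the provider name, connection string, and the CUSTOMER_VSAM table/columns are hypothetical placeholders that depend entirely on the adapter product and host configuration.

        using System;
        using System.Data.OleDb;

        class VsamReportQuery
        {
            static void Main()
            {
                // Hypothetical connection string; the real one comes from the adapter product.
                const string connectionString =
                    "Provider=...;Data Source=MyMainframe;...";

                using (var conn = new OleDbConnection(connectionString))
                using (var cmd = new OleDbCommand(
                    "SELECT CUST_ID, CUST_NAME FROM CUSTOMER_VSAM", conn)) // hypothetical data set
                {
                    conn.Open();
                    using (var reader = cmd.ExecuteReader())
                    {
                        // Rows come back like any relational result set,
                        // ready for reporting or application use.
                        while (reader.Read())
                        {
                            Console.WriteLine("{0}: {1}", reader[0], reader[1]);
                        }
                    }
                }
            }
        }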

    Read the article

  • C# reference collection for storing reference types

    - by ivo s
    I'd like to implement a collection (something like List<T>) which would hold all the objects that I create over the entire life span of my application, as if it were an array of pointers in C++. The idea is that when my process starts I can use a central factory to create all objects and then periodically validate/invalidate their state. Basically I want to make sure that my process only deals with valid instances and that I don't re-fetch information I have already fetched from the database. So all my objects will basically be in one place: my collection.

    A cool thing I could do with this is avoid database calls to get data if I already got it (even if I updated it after retrieval, it's still up to date - unless of course some other process updated it, but that's a different concern). I don't want to be calling new Customer("James Thomas") again if I created James Thomas already sometime in the past. Currently I end up with multiple copies of the same object across the appdomain - some out of sync, others in sync - and even though I deal with this using a timestamp field on the MSSQL server, I'd like to keep only one copy per customer in my appdomain (per process would be even better, if possible).

    I can't use regular collections like List or ArrayList, for example, because I cannot pass parameters by their real local reference to their existing Add() methods (I'm creating the objects using ref), so that's no good, I think. So how can this be implemented - can it be implemented at all? A 'linked list' type of class with all methods working with ref and out params is what I'm thinking of now, but it may get ugly pretty quickly. Is there another way to implement such a collection, like RefList<T>.Add(ref T obj)?

    So, bottom line: I don't want to re-create an object if I've already created it before during the entire application life, unless I decide to re-create it explicitly (maybe it's out of date or something, so I have to fetch it again from the db). Are there alternatives, maybe?
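
    What the question describes is close to what ORMs call an identity map: one canonical instance per key, handed out by a central factory. A minimal sketch (an addition, not from the original post) using weak references so the map itself doesn't keep every object alive; the key and value types are placeholders for the poster's domain types.

        using System;
        using System.Collections.Generic;

        // One canonical instance per key for the life of the process.
        public class IdentityMap<TKey, TValue> where TValue : class
        {
            private readonly Dictionary<TKey, WeakReference> _items =
                new Dictionary<TKey, WeakReference>();
            private readonly object _sync = new object();

            // Returns the existing instance for a key, or builds and registers one.
            public TValue GetOrCreate(TKey key, Func<TValue> factory)
            {
                lock (_sync)
                {
                    WeakReference wr;
                    if (_items.TryGetValue(key, out wr))
                    {
                        var existing = wr.Target as TValue;
                        if (existing != null)
                            return existing; // no re-fetch: same reference every time
                    }
                    var created = factory(); // e.g. load from the database exactly once
                    _items[key] = new WeakReference(created);
                    return created;
                }
            }

            // Drop an entry known to be stale so the next request re-fetches it.
            public void Invalidate(TKey key)
            {
                lock (_sync) { _items.Remove(key); }
            }
        }

    Usage would be something like var james = customers.GetOrCreate(42, () => LoadCustomer(42)), where LoadCustomer is a hypothetical factory; because callers always receive the same reference for a given key, no ref-parameter Add() is needed in the first place.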

    Read the article

  • Is there such a thing as IMAP for podcasts?

    - by Gerrit
    Is there such a thing as IMAP for podcasts? I own a desktop, a laptop, an iPod, a smartphone and a web client, all downloading Stack Overflow podcasts (among others). They all tell me which episodes are available and which have already been played. Everything is a horrible mess, of course. My iPod is somewhat in sync with my desktop, but everything else is a random jungle.

    The same problem with e-mail is solved by IMAP. Every device gets content and meta-information from one server, and stays in sync with it. Per device, I can set preferences (do or do not download the complete archive, including junk mail). Can we implement the IMAP approach for podcasts? Or is there a better metaphor/standard to solve this problem? What would the adoption strategy look like?

    (By the way: except for the Windows smartphone, I own a full Apple stack of products. Even then, I run into this problem.)

    UPDATE: The RSS-to-IMAP project on SourceForge looks promising, but very alpha/experimental.

    UPDATE 2: The one thing RSS is missing is the command/method/parameter/attribute to delete/unread items. RSS can only add, not remove. If RSS(N+1) (3?) could add a value for unread="true|false", it would be solved. If I cached all my RSS feeds on my own server and added the attribute myself, I would only have to convince iTunes and every other client to respect it.

    Read the article

  • How to best future proof my application that needs to connect to Outlook?

    - by Troy
    I have a contact management application written in Delphi which has a "Sync with Outlook" feature that I developed 10 years ago. Now I'm going back to add some features and fix some bugs. This sync feature uses the Outlook object model to get started, but it has an optional mode called "Use MAPI Enhancements" where it uses pure MAPI to speed up how it looks for changes, and it allows notes to be synced with RTF instead of just plain text. I'm wondering if supporting two parallel paths of execution is a good idea or not.

    If I went with all MAPI, I believe I'd avoid some security prompts, and I'd avoid situations where anti-virus "script-blocking" features block my app from connecting to Outlook. But I believe that on the downside, my 32-bit app would not be able to connect to 64-bit Outlook 2010 using MAPI. And I wonder about the future of MAPI in general.

    If I stick with the Outlook object model, will my 32-bit app be able to connect to the Outlook object model (since it's out-of-process COM)? If so, this is a compelling reason to keep my Outlook object model execution path in place. But if not, and if my app needs to be compiled for x64, then why not just go with pure MAPI?

    Read the article

  • Is nested synchronized block necessary?

    - by Dan
    I am writing a multithreaded program, and I have a method with nested synchronized blocks. I was wondering if I need the inner sync or if the outer sync alone is good enough.

        public class Tester {
            private BlockingQueue<Ticket> q = new LinkedBlockingQueue<>();
            private ArrayList<Long> list = new ArrayList<>();

            public void acceptTicket(Ticket p) {
                try {
                    synchronized (q) {
                        q.put(p);
                        synchronized (list) {
                            if (list.size() < 5) {
                                list.add(p.getSize());
                            } else {
                                list.remove(0);
                                list.add(p.getSize());
                            }
                        }
                    }
                } catch (InterruptedException ex) {
                    Logger.getLogger(Consumer.class.getName()).log(Level.SEVERE, null, ex);
                }
            }
        }

    EDIT: This isn't a complete class, as I am still working on it. But essentially I am trying to emulate a ticket machine. The ticket machine maintains a list of tickets in the BlockingQueue q. Whenever a client adds a ticket to the machine, the machine also keeps track of the price of the last 5 tickets (the ArrayList list).

    Read the article

  • How accurately (in terms of time) does Windows play audio?

    - by MusiGenesis
    Let's say I play a stereo WAV file with 317,520,000 samples, which is theoretically 1 hour long. Assuming no interruptions of the playback, will the file finish playing in exactly one hour, or is there some occasional tiny variation in the playback speed such that it would be slightly more or slightly less (by some number of milliseconds) than one hour?

    I am trying to synchronize animation with audio, and I am using a System.Diagnostics.Stopwatch to keep the frames matching the audio. But if the playback speed of WAV audio in Windows can vary slightly over time, then the audio will drift out of sync with the Stopwatch-driven animation.

    Which leads to a second question: it appears that a Stopwatch - while highly granular and accurate for short durations - runs slightly fast. On my laptop, a Stopwatch run for exactly 24 hours (as measured by the computer's system time and a real stopwatch) shows an elapsed time of 24 hours plus about 5 seconds (not milliseconds). Is this a known problem with Stopwatch? (A related question would be "am I crazy?", but you can try it for yourself.) Given its usage as a diagnostics tool, I can see where a discrepancy like this would only show up when measuring long durations, for which most people would use something other than a Stopwatch.

    If I'm really lucky, then both Stopwatch and audio playback are driven by the same underlying mechanism, and thus will stay in sync with each other for days on end. Any chance this is true?
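
    One pattern worth noting here (an addition, not part of the question): rather than trusting either clock over long spans, periodically re-derive the animation time from the audio engine's own playback position, so the two cannot drift apart no matter which one runs fast. A minimal sketch; how the playback position is obtained (e.g. samples played divided by sample rate) depends on the audio API and is assumed here.

        using System;
        using System.Diagnostics;

        class AudioSyncedClock
        {
            private readonly Stopwatch _sw = Stopwatch.StartNew();
            private double _audioAnchorSeconds;  // audio position at the last resync
            private TimeSpan _stopwatchAnchor;   // stopwatch reading at the last resync

            // Call roughly once a second with the position reported by the audio API.
            public void Resync(double audioPositionSeconds)
            {
                _audioAnchorSeconds = audioPositionSeconds;
                _stopwatchAnchor = _sw.Elapsed;
            }

            // Drives animation frames: the audio position plus only the short
            // stopwatch interval since the last resync, so the Stopwatch's
            // long-term drift never gets a chance to accumulate.
            public double CurrentSeconds
            {
                get { return _audioAnchorSeconds + (_sw.Elapsed - _stopwatchAnchor).TotalSeconds; }
            }
        }

    Each animation frame then reads CurrentSeconds instead of the raw Stopwatch, making the audio stream the master clock.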

    Read the article

  • WinCE and PC USB communication

    - by sebeksd
    We are developing a device and we need to find a good solution for one piece of needed functionality. The thing is that we need to communicate between WinCE 6.0 (ARM) and Windows on a PC. The easiest way is of course a COM port, but in our case it is impossible (all serial ports are in use on the WinCE side and we don't want to add one more). The second option is LAN, but for us it is not the best option, for a few reasons. So there is a third option we could use: USB-to-USB communication. But how to do that? Of course WinCE is the USB device and the PC is the USB host, so all the hardware basics are met. We could use ActiveSync, but there are a few problems with it:

    - WinCE 6.0 does not work with WMDC (the drivers on the device just crash after connecting the device to the PC) and I didn't find any solution for it, so in this case we would need to use WinXP on the PC side (old ActiveSync).
    - We would need to restrict ActiveSync communication to only our application; no other unauthorized software should be allowed (as far as I know, this is impossible to achieve).

    So probably the best way to do what we need is to communicate through USB like a standard COM port (serial communication). The question is how this could be done. Do we need to write a driver on WinCE and also a driver on Windows (PC), or is there a better solution? Maybe some driver for WinCE 6.0 that would emulate a virtual COM port on the PC side (and of course allow standard Read/Write to it on the WinCE side)? Could someone tell me if something like that exists?

    Read the article

  • Is there a FAST way to export and install an app on my phone, while signing it with my own keystore?

    - by Alexei Andreev
    So, I've downloaded my own application from the Market and installed it on my phone. Now I am trying to install a temporary new version from Eclipse, but here is the message I get:

        Re-installation failed due to different application signatures.
        You must perform a full uninstall of the application. WARNING: This will remove the application data!
        Please execute 'adb uninstall com.applicationName' in a shell.
        Launch canceled!

    Now, I really, really don't want to uninstall the application, because I will lose all my data. One solution I found is to export my application, creating a new .apk, and then install it via HTC Sync (probably a different program depending on what phone you have). The problem is this takes a long time to do, since I need to enter the password for the keystore each time and then wait for HTC Sync. It's a pain in the ass!

    So the question is: is there a way to make Eclipse automatically use my keystore to sign the application (quickly and automatically)? Or perhaps to replace the debug keystore with my own? Or perhaps just tell it to remember the password, so I don't have to enter it every time? Or some other way to solve this problem?

    Read the article

  • For loop index out of range ArgumentOutOfRangeException when multithreading

    - by Lirik
    I'm getting some strange behavior... when I iterate over the dummyText List in the ThreadTest method I get an index out of range exception (ArgumentOutOfRangeException), but if I remove the threads and just print out the text, then everything works fine.

    This is my main method:

        public static Object sync = new Object();

        static void Main(string[] args)
        {
            ThreadTest();
            Console.WriteLine("Press any key to continue.");
            Console.ReadKey();
        }

    This method throws the exception:

        private static void ThreadTest()
        {
            Console.WriteLine("Running ThreadTest");
            List<String> dummyText = new List<string>()
                { "One", "Two", "Three", "Four", "Five", "Six", "Seven", "Eight", "Nine", "Ten" };

            for (int i = 0; i < dummyText.Count; i++)
            {
                Thread t = new Thread(() => PrintThreadName(dummyText[i])); // <-- Index out of range?!?
                t.Name = ("Thread " + (i));
                t.IsBackground = true;
                t.Start();
            }
        }

        private static void PrintThreadName(String text)
        {
            Random rand = new Random(DateTime.Now.Millisecond);
            while (true)
            {
                lock (sync)
                {
                    Console.WriteLine(Thread.CurrentThread.Name + " running " + text);
                    Thread.Sleep(1000 + rand.Next(0, 2000));
                }
            }
        }

    This does not throw the exception:

        private static void ThreadTest()
        {
            Console.WriteLine("Running ThreadTest");
            List<String> dummyText = new List<string>()
                { "One", "Two", "Three", "Four", "Five", "Six", "Seven", "Eight", "Nine", "Ten" };

            for (int i = 0; i < dummyText.Count; i++)
            {
                Console.WriteLine(dummyText[i]); // <-- No exception here
            }
        }

    Does anybody know why this is happening?
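
    For what it's worth, an aside not in the original question: the lambda captures the loop variable i itself, not its value at the moment the thread is created, so by the time a thread actually runs, i may already have advanced past the last valid index (or reached dummyText.Count). A minimal sketch of the standard workaround is to copy the loop variable into a local inside the loop body, giving each iteration its own captured variable:

        for (int i = 0; i < dummyText.Count; i++)
        {
            int index = i; // fresh variable per iteration; the lambda captures this copy
            Thread t = new Thread(() => PrintThreadName(dummyText[index]));
            t.Name = "Thread " + index;
            t.IsBackground = true;
            t.Start();
        }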

    Read the article

  • Omit return type in C++0x

    - by Clinton
    I've recently found myself using the following macro with gcc 4.5 in C++0x mode:

        #define RETURN(x) -> decltype(x) { return x; }

    And writing functions like this:

        template <class T>
        auto f(T&& x) RETURN(( g(h(std::forward<T>(x))) ))

    I've been doing this to avoid the inconvenience of having to effectively write the function body twice, and having to keep changes in the body and the return type in sync (which in my opinion is a disaster waiting to happen). The problem is that this technique only works on one-line functions. So when I have something like this (convoluted example):

        template <class T>
        auto f(T&& x) -> ...
        {
            auto y1 = f(x);
            auto y2 = h(y1, g1(x));
            auto y3 = h(y1, g2(x));
            if (y1) { ++y3; }
            return h2(y2, y3);
        }

    ...then I have to put something horrible in the return type. Furthermore, whenever I update the function, I'll need to change the return type, and if I don't change it correctly, I'll get a compile error if I'm lucky, or a runtime bug in the worst case. Having to copy and paste changes to two locations and keep them in sync is, I feel, not good practice. And I can't think of a situation where I'd want an implicit cast on return instead of an explicit cast. Surely there is a way to ask the compiler to deduce this information. What is the point of the compiler keeping it a secret? I thought C++0x was designed so such duplication would not be required.

    Read the article

  • Syncing two AS3 NetStreams

    - by Lowgain
    I'm writing an app that requires an audio stream to be recorded while a backing track is played. I have this working, but there is an inconsistent gap between playback and recording starting. I don't know if I can do anything to make the sync perfect every time, so I've been trying to track what time each stream starts so I can calculate the delay and trim it server-side. This has also proved to be a challenge, as no events seem to be sent when a connection starts (as far as I know). I've tried using various properties like the streams' buffer sizes, etc.

    I'm thinking now that, as my recorded audio is only mono, I may be able to put some kind of 'control signal' on the second stereo track which I could use to determine exactly when a sound starts recording (or stick the whole backing track in that channel so I can sync them that way). This leaves me with the new problem of properly injecting this sound into the NetStream.

    If anyone has any idea whether or not any of these ideas will work, how to execute them, or has some alternatives, that would be extremely helpful! I've been working on this issue for a while.

    Read the article

  • Ember Data Sync - LocalStorage+REST+RealTime+Online/Offline

    - by Miguel Madero
    We have a combination of requirements in terms of data access:

    * Pre-load some reference data. We need the reference data to survive browser restarts instead of just living in memory, to avoid loading it all the time. I'm currently using the LocalStorageAdapter for that. Once we have it, we would like to sync changes (polling, or using Socket.IO in the background and updating the LocalStorage, could do the trick).
    * There are other models that are more transactional, where we would need to go directly to the server and get/save them. It would be nice to use something like the RESTAdapter for that.
    * Lastly, there are some operations that should work offline, and the changes should be synced later.

    To make it more concrete:

    * We pre-load vendor and "favorite products" data into Local Storage. We work offline with those.
    * We need to sync server changes to vendor and product information.
    * If users search the full catalog, that requires them to be online.
    * When offline, we need to allow users to add something to their cart or even submit an order. We would like to queue this action and submit it when they have an Internet connection.

    So a few questions are derived from this:

    * Is there a way to use the RESTAdapter in combination with LocalStorage?
    * Is there some Socket.IO support? (Happy to do this part manually.)
    * Is there queueing support? Ideally at the Ember Data level.

    I know we will have to do a lot of this manually and pull together the different Lego pieces, but I wanted to ask for some perspective from experienced Ember devs.

    Read the article

  • When is the reintegrate option really necessary?

    - by Tor Hovland
    If you always sync a feature branch before you merge it back, why do you really have to use the --reintegrate option? The Subversion book says:

        When merging your branch back to the trunk, however, the underlying mathematics is quite different. Your feature branch is now a mishmosh of both duplicated trunk changes and private branch changes, so there's no simple contiguous range of revisions to copy over. By specifying the --reintegrate option, you're asking Subversion to carefully replicate only those changes unique to your branch. (And in fact, it does this by comparing the latest trunk tree with the latest branch tree: the resulting difference is exactly your branch changes!)

    So the --reintegrate option only merges the changes that are unique to the feature branch. But if you always sync before merging (which is a recommended practice, in order to deal with any conflicts on the feature branch), then the only changes between the branches are the changes that are unique to the feature branch, right? And if Subversion tries to merge code that is already on the target branch, it will just do nothing, right?

    In this blog post, Mark Phippard writes: http://blogs.open.collab.net/svn/2008/07/subversion-merg.html

        If we include those synched revisions, then we merge back changes that already exist in trunk. This yields unnecessary and confusing conflicts.

    Can somebody give me an example of when dropping --reintegrate gives me unnecessary conflicts?

    Read the article

  • Parallel processing from a command queue on Linux (bash, python, ruby... whatever)

    - by mlambie
    I have a list/queue of 200 commands that I need to run in a shell on a Linux server. I only want to have a maximum of 10 processes running (from the queue) at once. Some processes will take a few seconds to complete; other processes will take much longer. When a process finishes, I want the next command to be "popped" from the queue and executed. Does anyone have code to solve this problem?

    Further elaboration: there are 200 pieces of work that need to be done, in a queue of some sort. I want to have at most 10 pieces of work going on at once. When a thread finishes a piece of work, it should ask the queue for the next piece of work. If there's no more work in the queue, the thread should die. When all the threads have died, it means all the work has been done.

    The actual problem I'm trying to solve is using imapsync to synchronize 200 mailboxes from an old mail server to a new mail server. Some users have large mailboxes and take a long time to sync; others have very small mailboxes and sync quickly.
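
    As an aside not in the original question: GNU xargs can do this directly from a shell (its -P flag caps the number of concurrent processes). For the "whatever" languages the title invites, the worker-pool shape looks like the sketch below, written here in C#; commands.txt is a hypothetical file with one shell command per line, and the quoting is deliberately naive.

        using System;
        using System.Collections.Concurrent;
        using System.Diagnostics;
        using System.IO;
        using System.Threading;

        class CommandPool
        {
            static void Main()
            {
                // A thread-safe queue holding the 200 commands to run.
                var queue = new ConcurrentQueue<string>(File.ReadAllLines("commands.txt"));

                const int maxWorkers = 10;
                var workers = new Thread[maxWorkers];

                for (int w = 0; w < maxWorkers; w++)
                {
                    workers[w] = new Thread(() =>
                    {
                        string command;
                        // Each worker pops commands until the queue is empty, then dies.
                        while (queue.TryDequeue(out command))
                        {
                            // Naive quoting; fine for simple commands without embedded quotes.
                            var psi = new ProcessStartInfo("/bin/sh", "-c \"" + command + "\"")
                            {
                                UseShellExecute = false
                            };
                            using (var process = Process.Start(psi))
                            {
                                process.WaitForExit(); // finishing frees this worker for the next command
                            }
                        }
                    });
                    workers[w].Start();
                }

                // When every worker has died, all 200 commands have run.
                foreach (var t in workers) t.Join();
            }
        }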

    Read the article

  • Multi-Part HTTP Request through xcode

    - by devsri
    Hello Everyone, i want to upload image,video and audio files to a server. I have read this thread on the similar topic but wasn't able to understand completely the flow of the code. It would be great if you can suggest me some sample code or tutorial to start with. I am using the following code to connect without any media to the server [UIApplication sharedApplication].networkActivityIndicatorVisible = YES; NSString *url =[[NSString alloc]initWithFormat:@"%@",[NetworkConstants getURL]]; NSURL *theURL =[NSURL URLWithString:url]; [url release]; NSMutableURLRequest *theRequest =[NSMutableURLRequest requestWithURL:theURL cachePolicy:NSURLRequestReloadIgnoringCacheData timeoutInterval:0.0f]; [theRequest setHTTPMethod:@"POST"]; NSString *theBodyString = [NSString stringWithFormat:@"json1=%@&userID=%@",jsonObject,[GlobalConstants getUID]]; NSData *theBodyData = [theBodyString dataUsingEncoding:NSUTF8StringEncoding]; [theRequest setHTTPBody:theBodyData]; NSURLConnection *conn = [[NSURLConnection alloc] initWithRequest:theRequest delegate:self]; if (conn) { NSLog(@"Successful in sending sync"); } else { NSLog(@"Failed Connection in sending sync"); } [conn release]; It would be really convenient for me if anything could be done editing this part of code. Any form of help would be highly appreciated. Thanks in advance!!

    Read the article

  • Android Bluetooth syncing

    - by Darryl
    I am connecting to a Bluetooth-enabled camera, and I am able to connect using the methods found in the BluetoothChat example. I need to send commands to the camera. The issue is that I also need to get a response BACK from the camera after I send the command. So basically I need to write a command and receive a response. However, the commands sometimes don't generate a response. The documentation for the camera even says that you "have to send the sync command as many as 25 times on power up before you will get a response." So I cannot just write a command and wait for a response, as the read function blocks the thread.

    If I have the read function in another thread, like the BluetoothChat example, there seem to be sync issues: if I issue a write command, how can I know that it is reading, if that is happening in another thread? I did set a global variable to check for, but this seems iffy at best.

    So basically I need to write to the Bluetooth device and then attempt to read from it. However, I need to let that read time out, and if I haven't received a response, I need to write again until I get a response (or until it has tried a set number of times). I don't need the read function to be going all the time in the background. Any ideas? Thanks in advance for your time.

    Read the article

  • Need help profiling .NET caching extension method.

    - by rockinthesixstring
    I've got the following extension:

        Public Module CacheExtensions
            Sub New()
            End Sub

            Private sync As New Object()
            Public Const DefaultCacheExpiration As Integer = 1200 ''# 20 minutes

            <Extension()>
            Public Function GetOrStore(Of T)(ByVal cache As Cache, ByVal key As String, ByVal generator As Func(Of T)) As T
                Return cache.GetOrStore(key, If(generator IsNot Nothing, generator(), Nothing), DefaultCacheExpiration)
            End Function

            <Extension()>
            Public Function GetOrStore(Of T)(ByVal cache As Cache, ByVal key As String, ByVal generator As Func(Of T), ByVal expireInSeconds As Double) As T
                Return cache.GetOrStore(key, If(generator IsNot Nothing, generator(), Nothing), expireInSeconds)
            End Function

            <Extension()>
            Public Function GetOrStore(Of T)(ByVal cache As Cache, ByVal key As String, ByVal obj As T) As T
                Return cache.GetOrStore(key, obj, DefaultCacheExpiration)
            End Function

            <Extension()>
            Public Function GetOrStore(Of T)(ByVal cache As Cache, ByVal key As String, ByVal obj As T, ByVal expireInSeconds As Double) As T
                Dim result = cache(key)
                If result Is Nothing Then
                    SyncLock sync
                        If result Is Nothing Then
                            result = If(obj IsNot Nothing, obj, Nothing)
                            cache.Insert(key, result, Nothing, DateTime.Now.AddSeconds(expireInSeconds), cache.NoSlidingExpiration)
                        End If
                    End SyncLock
                End If
                Return DirectCast(result, T)
            End Function
        End Module

    From here, I'm using the extension in a TagService to get a list of tags:

        Public Function GetTagNames() As List(Of String) Implements Domain.ITagService.GetTags
            ''# We're not using a dynamic cache key because the list of TagNames
            ''# will persist across all users in all regions.
            Return HttpRuntime.Cache.GetOrStore(Of List(Of String))("TagNamesOnly",
                Function() _TagRepository.Read().Select(Function(t) t.Name).OrderBy(Function(t) t).ToList())
        End Function

    All of this is pretty much straightforward, except when I put a breakpoint on _TagRepository.Read(). The problem is that it is getting called on every request, when I thought it would only be called when result Is Nothing. Am I missing something here?
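
    An observation not in the original post: VB's If(condition, a, b) operator evaluates whichever operand it selects, so If(generator IsNot Nothing, generator(), Nothing) invokes generator() - and therefore _TagRepository.Read() - on every call, before the four-argument overload ever looks at the cache. The fix is to pass the delegate itself down and invoke it only on a cache miss. A sketch of that shape, rendered in C# purely for illustration (the re-check of the cache inside the lock also repairs the original's double-check, which re-tested a local variable rather than the cache):

        using System;
        using System.Web.Caching;

        public static class CacheExtensions
        {
            private static readonly object Sync = new object();
            private const double DefaultCacheExpiration = 1200; // 20 minutes

            public static T GetOrStore<T>(this Cache cache, string key, Func<T> generator)
            {
                return cache.GetOrStore(key, generator, DefaultCacheExpiration);
            }

            // The generator is passed through unevaluated; it runs only on a miss.
            public static T GetOrStore<T>(this Cache cache, string key,
                                          Func<T> generator, double expireInSeconds)
            {
                var result = cache[key];
                if (result == null)
                {
                    lock (Sync)
                    {
                        result = cache[key]; // re-check: another thread may have filled it
                        if (result == null)
                        {
                            result = generator(); // the expensive call happens only here
                            cache.Insert(key, result, null,
                                         DateTime.Now.AddSeconds(expireInSeconds),
                                         Cache.NoSlidingExpiration);
                        }
                    }
                }
                return (T)result;
            }
        }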

    Read the article

  • How can I synchronize one set of data with another?

    - by RenderIn
    I have an old database and a new database. The old records were converted to the new database recently. All our old applications continue to point to the old database, but the new applications point to the new database. Currently the old database is the only one being updated, so throughout the day the new database becomes out of sync. It is acceptable for the new database to be out of sync for a day, so until all our applications are pointed to the new database I just need to write a nightly cron job that will bring it up to date.

    I do not want to purge the new database and run the complete conversion script each night, as that would reduce uptime and would create a mess in our auditing of that table. I'm thinking about selecting all the data from the old database, converting it to the new database structure in memory, and then checking for the existence of each record before inserting it into the new database. After that's done, I'd select everything from the new database, check if it exists in the old one, and if not, delete it. Is this the simplest way to do this?
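
    A hedged sketch of the nightly job described above (the connection strings, OldTable/NewTable, and columns are all hypothetical placeholders; the real in-memory conversion would replace the trivial column copy). One detail worth noting: an existence check alone would miss rows that exist in both databases but changed during the day, so the sketch updates those too.

        using System;
        using System.Data.SqlClient;

        class NightlySync
        {
            static void Main()
            {
                using (var oldDb = new SqlConnection("...old connection string..."))
                using (var newDb = new SqlConnection("...new connection string..."))
                {
                    oldDb.Open();
                    newDb.Open();

                    // Phase 1: upsert every old record into the new structure.
                    var read = new SqlCommand("SELECT Id, Name FROM OldTable", oldDb);
                    using (var r = read.ExecuteReader())
                    {
                        while (r.Read())
                        {
                            var id = r.GetInt32(0);
                            var name = r.GetString(1); // conversion logic goes here

                            var upsert = new SqlCommand(
                                "IF EXISTS (SELECT 1 FROM NewTable WHERE Id = @id) " +
                                "  UPDATE NewTable SET Name = @name WHERE Id = @id " +
                                "ELSE " +
                                "  INSERT INTO NewTable (Id, Name) VALUES (@id, @name)", newDb);
                            upsert.Parameters.AddWithValue("@id", id);
                            upsert.Parameters.AddWithValue("@name", name);
                            upsert.ExecuteNonQuery();
                        }
                    }

                    // Phase 2 mirrors this in reverse: read NewTable's ids and
                    // delete any row whose id no longer exists in OldTable.
                }
            }
        }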

    Read the article

  • Syncing objects between two devices with different system times

    - by Mike Weller
    Hi there. I'm syncing objects between two devices. Objects have a lastModified property. If both devices have modified an object, then during the next sync the version of the object with the most recent lastModified is chosen on both devices. So we don't do fine-grained merging, only 'most recent version' merging.

    The problem is this: when one device receives a list of changed objects, it can't reliably compare the lastModified of received objects to its own, because the system times on the two devices may be different. I considered having each device send its current date/time during the sync. Then each calculates the difference between the remote time and the local time to compare the dates properly. But if there is lag between sending a date and the remote device receiving it, this causes incorrect comparisons for objects that were modified at the same time (or very close together in time), i.e. both devices think the remote object is newer, and they end up with different objects.

    I hope I have explained this clearly enough. There must be a common solution to this kind of problem, but my brain isn't coming up with anything. Any suggestions? Thanks in advance...
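
    One common solution worth naming here (an addition, not part of the post): replace the wall-clock comparison with a logical clock, so that "most recent wall-clock time wins" becomes "most recent logical version wins" and neither device's system time is ever compared. Each object carries a counter that is bumped on every local edit and raised to the maximum seen during a sync; a deterministic tie-break keeps both devices choosing the same winner. A minimal Lamport-style sketch (field names are illustrative):

        using System;

        class SyncedObject
        {
            public string Payload;
            public long Version;      // logical clock, never wall time
            public string LastWriter; // device id, used only to break ties

            public void LocalEdit(string deviceId, string newPayload)
            {
                Payload = newPayload;
                Version++; // every local change is ordered after everything seen so far
                LastWriter = deviceId;
            }

            // True if the remote copy should replace the local one.
            public bool RemoteWins(SyncedObject remote)
            {
                if (remote.Version != Version)
                    return remote.Version > Version;
                // Equal versions: a deterministic tie-break (device id) means
                // both devices independently pick the same winner.
                return string.CompareOrdinal(remote.LastWriter, LastWriter) > 0;
            }

            public void Merge(SyncedObject remote)
            {
                if (RemoteWins(remote))
                {
                    Payload = remote.Payload;
                    LastWriter = remote.LastWriter;
                }
                // Keep the higher counter so the next local edit outranks
                // every change either device has already exchanged.
                Version = Math.Max(Version, remote.Version);
            }
        }

    The trade-off is that "winner" now means latest in causal order rather than literally latest wall-clock time, which for coarse 'most recent version' merging is usually the intended behavior anyway.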

    Read the article
