Search Results

Search found 4646 results on 186 pages for 'multi'.

  • C# - Fast and simple multi-dimensional data structures?

    - by Jeremy Rudd
    I need to store multi-dimensional numeric data in a way that's easy to work with. I'm capturing data in real time, and once a batch has been processed I would discard the older data and let the GC reclaim it. The data structure must be fast so it doesn't hurt my overall app performance; the faster the better. What are my choices in terms of platform-supported data structures? I'm using VS 2010 and .NET 4.

    Read the article
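
    One option, sketched below under stated assumptions (the frame shape, the 100-frame cap and the CaptureBuffer class are illustrative, not from the question): hold each sample as a rectangular double[,] frame in a queue and drop the oldest frames so the GC can reclaim them; a jagged double[][] is the usual alternative when rows differ in length or raw element access speed matters.

        using System;
        using System.Collections.Generic;

        class CaptureBuffer
        {
            private readonly Queue<double[,]> frames = new Queue<double[,]>();

            public void Push(double[,] frame)
            {
                frames.Enqueue(frame);
                // Keep only the most recent 100 frames; dequeued frames become
                // unreachable and are collected by the GC.
                if (frames.Count > 100)
                    frames.Dequeue();
            }
        }

        // Jagged alternative, often measured a bit faster to index than double[,]:
        // double[][] samples = new double[3][];
        // samples[0] = new double[] { 1.0, 2.0, 3.0 };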

  • Two-pass multi-way merge sort?

    - by Nimesh
    Suppose I have a relation (an SQL table) that does not fit in memory and I want to sort it using TPMMS (the two-pass multi-way merge sort method). How would I divide the table into sub-tables (and how many) that fit in memory, and then merge them? Let's say I am using C#.

    Read the article
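
    For a relation of N rows where M rows fit in memory, the first pass produces roughly N / M sorted runs and the second pass merges them all at once; two passes suffice as long as the number of runs does not exceed the merge fan-in. A minimal C# sketch under illustrative assumptions (integer keys, one value per line, and a row budget standing in for the real memory budget):

        using System;
        using System.Collections.Generic;
        using System.IO;
        using System.Linq;

        static class TwoPassMergeSort
        {
            public static void Sort(string inputPath, string outputPath, int maxRowsInMemory)
            {
                // Pass 1: read memory-sized chunks, sort each, write it out as a run.
                var runFiles = new List<string>();
                foreach (var chunk in ReadChunks(inputPath, maxRowsInMemory))
                {
                    chunk.Sort();
                    string runPath = Path.GetTempFileName();
                    File.WriteAllLines(runPath, chunk.Select(v => v.ToString()));
                    runFiles.Add(runPath);
                }

                // Pass 2: k-way merge; always emit the smallest value among the run heads.
                var readers = runFiles.Select(f => new StreamReader(f)).ToList();
                var heads = new SortedSet<(int Value, int Run)>();
                for (int i = 0; i < readers.Count; i++)
                    if (TryReadInt(readers[i], out int v)) heads.Add((v, i));

                using (var output = new StreamWriter(outputPath))
                {
                    while (heads.Count > 0)
                    {
                        var min = heads.Min;
                        heads.Remove(min);
                        output.WriteLine(min.Value);
                        if (TryReadInt(readers[min.Run], out int next)) heads.Add((next, min.Run));
                    }
                }

                readers.ForEach(r => r.Dispose());
                runFiles.ForEach(File.Delete);
            }

            // Split the input into lists of at most maxRows values each.
            static IEnumerable<List<int>> ReadChunks(string path, int maxRows)
            {
                var chunk = new List<int>(maxRows);
                foreach (var line in File.ReadLines(path))
                {
                    chunk.Add(int.Parse(line));
                    if (chunk.Count == maxRows) { yield return chunk; chunk = new List<int>(maxRows); }
                }
                if (chunk.Count > 0) yield return chunk;
            }

            static bool TryReadInt(StreamReader reader, out int value)
            {
                string line = reader.ReadLine();
                value = 0;
                return line != null && int.TryParse(line, out value);
            }
        }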

  • Django 1.1 file-based session backend multi-threaded solution

    - by Satoru.Logic
    Hi, all. I read django.contrib.sessions.backends.file today; in the save method of SessionStore there is something like the following, used to keep saves consistent when several threads write at once:

        output_file_fd, output_file_name = tempfile.mkstemp(dir=dir, prefix=prefix + '_out_')
        renamed = False
        try:
            try:
                os.write(output_file_fd, self.encode(session_data))
            finally:
                os.close(output_file_fd)
            os.rename(output_file_name, session_file_name)
            renamed = True
        finally:
            if not renamed:
                os.unlink(output_file_name)

    I don't quite understand how this solves the integrity problem.

    Read the article
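
    The snippet writes the new session data to a private temporary file and only then renames it onto the real session file, so the visible file is replaced in a single step rather than being rewritten in place. A rough C# sketch of the same write-then-rename pattern (the file naming is illustrative, and the overwrite overload of File.Move needs .NET Core 3.0 or later):

        using System.IO;

        static class AtomicSave
        {
            public static void Write(string path, string contents)
            {
                string temp = path + ".tmp";
                File.WriteAllText(temp, contents);        // slow work happens on a file nobody reads
                File.Move(temp, path, overwrite: true);   // the visible file changes in one operation
            }
        }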

  • Security concerns for a multi-lingual web application.

    - by The Rook
    I am converting a PHP/MySQL web application written for the English language into a multi-language site. Do you know of any vulnerabilities that affect web applications in other languages? Or perhaps vulnerabilities that could be introduced while converting the code base to support multiple languages? (If you know any vulnerabilities of this type in another programming language I'll give you a +1.)

    Read the article

  • Confusion about multi-dimensional arrays in Java

    - by Alvin
    Hello, I'm not able to understand the following multi-dimensional array code. Could someone please explain it to me?

        int[][] myJaggedArr = new int[][] {
            new int[] {1,3,5,7,9},
            new int[] {0,2,4,6},
            new int[] {11,22}
        };

    May I know how it is different from the following code?

        int[][] myArr = new int[][] {
            {1,3,5,7,9},
            {0,2,4,6},
            {11,22}
        };

    Read the article

  • "Multi-threading" w/ NSTimers in an iPhone app

    - by MrDatabase
    Say I have two NSTimers in my iPhone app: timer1 and timer2. timer1 calls function1 30 times per second, and timer2 calls function2 30 times per second. Assume these two functions read and update the same integer variables. Are there any "multi-threading" issues here? If not, how does iPhone OS handle the execution of the two functions (in general)?

    Read the article

  • Multi-Module Project - Assembly plugin

    - by user209947
    I am using Maven 2.0.9 to build a multi-module project, and I have defined the assembly plugin in my parent pom. I can get my assemblies built using:

        mvn install assembly:assembly

    This command runs the tests twice, once during the install phase and again during assembly. I tried assembly:single but it throws an error. Any help getting my assemblies built without running the tests twice is much appreciated.

    Read the article

  • Multi-row header on Google Visualizations

    - by Elzo Valugi
    Hi, I am trying to create a DataTable with a multi-row header. Here is an example:

        |     2008      |     2009      |
        ---------------------------------
        | price |  qty. | price |  qty  |
        ---------------------------------
        | 93993 | 34434 | 34244 |  3434 |
        .....

    The year headers can be fixed, as I don't want to sort by them. Is there a way to do that in Google Visualizations?

    Read the article

  • Java multi-threading - Avoid duplicate request processing

    - by seawaves
    I have the following multi-threaded scenario: requests arrive at a method, and I want to avoid duplicate processing of concurrent requests, since multiple similar requests might be waiting to be processed in a blocked state. I used a Hashtable to keep track of processed requests, but it will create a memory leak. How should I keep track of processed requests and avoid processing the same requests again while they may be blocked?

    Read the article
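
    One common approach, sketched below in C# (the class and key names are hypothetical): keep only the keys of requests that are currently in flight in a concurrent map, reject a request whose key is already present, and always remove the key when processing finishes, so the map cannot grow without bound.

        using System;
        using System.Collections.Concurrent;

        class RequestProcessor
        {
            // Keys of requests currently being processed.
            private readonly ConcurrentDictionary<string, bool> inFlight =
                new ConcurrentDictionary<string, bool>();

            public bool TryProcess(string requestKey, Action work)
            {
                if (!inFlight.TryAdd(requestKey, true))
                    return false;                          // a duplicate is already being handled
                try
                {
                    work();
                    return true;
                }
                finally
                {
                    inFlight.TryRemove(requestKey, out _); // entry never outlives the request
                }
            }
        }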

  • Concatenate multiple values into one record

    - by mikehjun
    I joined two tables together, and what I would like to do is concatenate multiple values into one record without duplicate values.

        TAXLOT_ZONE
        TID   ZONE
        1     A
        1     A
        1     B
        1     C
        2     D
        2     D
        2     E
        3     A
        3     B
        4     C
        5     D

    The desired final table looks like:

        TID   ZONE
        1     A, B, C
        2     D, E
        3     A, B
        4     C
        5     D

    Read the article
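
    The SQL dialect isn't stated, so as a sketch of just the logic, here is the same grouping in C#/LINQ: group by TID, drop duplicate zones, and join the remainder with commas.

        using System;
        using System.Linq;

        class ZoneConcat
        {
            static void Main()
            {
                var taxlotZone = new[]
                {
                    (Tid: 1, Zone: "A"), (Tid: 1, Zone: "A"), (Tid: 1, Zone: "B"), (Tid: 1, Zone: "C"),
                    (Tid: 2, Zone: "D"), (Tid: 2, Zone: "D"), (Tid: 2, Zone: "E"),
                    (Tid: 3, Zone: "A"), (Tid: 3, Zone: "B"),
                    (Tid: 4, Zone: "C"), (Tid: 5, Zone: "D"),
                };

                var merged = taxlotZone
                    .GroupBy(r => r.Tid)
                    .Select(g => new
                    {
                        Tid = g.Key,
                        Zones = string.Join(", ", g.Select(r => r.Zone).Distinct())
                    });

                foreach (var row in merged)
                    Console.WriteLine($"{row.Tid}  {row.Zones}");   // 1  A, B, C ... 5  D
            }
        }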

  • Multi-key dictionaries (of another kind) in C#?

    - by Matthew Scharley
    Building on this question, is there a simple solution for having a multi-key dictionary where either key individually can be used to identify the value? i.e.:

        MultikeyDictionary<TKey1, TKey2, TValue> foo;
        foo.Add(key1, key2, value);
        myValue = foo[key1];   // value == myValue
        foo.Remove(key2);
        myValue = foo[key1];   // invalid, Exception or null returned

    Read the article
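
    The framework doesn't ship such a type, but a minimal sketch of the behaviour described above (the MultikeyDictionary name and its members are hypothetical) can be built from two ordinary dictionaries that point at a shared entry, so removing through one key also unlinks the other:

        using System;
        using System.Collections.Generic;

        public class MultikeyDictionary<TKey1, TKey2, TValue>
        {
            private sealed class Entry
            {
                public TKey1 Key1;
                public TKey2 Key2;
                public TValue Value;
            }

            private readonly Dictionary<TKey1, Entry> byKey1 = new Dictionary<TKey1, Entry>();
            private readonly Dictionary<TKey2, Entry> byKey2 = new Dictionary<TKey2, Entry>();

            public void Add(TKey1 key1, TKey2 key2, TValue value)
            {
                var entry = new Entry { Key1 = key1, Key2 = key2, Value = value };
                byKey1.Add(key1, entry);   // throws on duplicate keys, like Dictionary.Add
                byKey2.Add(key2, entry);
            }

            // Lookup by either key; ambiguous only if TKey1 and TKey2 are the same type.
            public TValue this[TKey1 key1] => byKey1[key1].Value;
            public TValue this[TKey2 key2] => byKey2[key2].Value;

            public bool Remove(TKey2 key2)
            {
                if (!byKey2.TryGetValue(key2, out var entry))
                    return false;
                byKey1.Remove(entry.Key1);   // drop both mappings together
                byKey2.Remove(key2);
                return true;
            }
        }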

  • Backing up a default Windows installation with dd from Linux running on another partition - is this feasible?

    - by Marek
    I am preparing to reinstall my system, and I am thinking about creating a multi-boot setup with a Linux distro and Windows 7 to choose from at startup.

    I would love to be able to skip the hassle of reinstalling Windows and all my programs when it starts becoming too slow in the future, so I would like to mirror my fresh Windows system partition with some programs preinstalled. I am thinking about installing Ubuntu, making a partition for Windows, installing Windows with the basic environment (Visual Studio, Office, etc.), then booting into Linux and making an image of the Windows partition with dd. I am not familiar with Linux at all, so I am a little afraid something may go wrong along the way.

    Is it possible to do it this way? Will I be able to partition my existing disk for multi-boot easily after I install Ubuntu? Will I be able to recover the Windows partition easily using dd when I need to re-create a fresh Windows partition in the future? What other (better) approach can you recommend to achieve the goal of easy disk mirroring (for free)?

    Read the article

  • How to handle multi-processing of libraries which already spawn sub-processes?

    - by exhuma
    I am having trouble coming up with a good solution to limit sub-processes in a script which uses a multi-processed library when the script itself is also multi-processed. Both the library and the script are modifiable by us. I believe the question is more about design than actual code, but for what it's worth, it's written in Python.

    The goal of the library is to hide implementation details of various internet routers. For that reason, the library has a "Proxy" factory method which takes the IP of a router as a parameter. The factory then probes the device using a set of possible proxies. Usually, there is one proxy which immediately knows that it is able to send commands to this device; all the others usually take some time to return (given a timeout). One thought was to simply query the device for an identifier and then select the proper proxy using that, but in order to do so, you would already need to know how to query the device. Abstracting that knowledge is one of the main purposes of the library, so this becomes a bit of a circular requirement/deadlock: to connect to a device, you need to know which proxy to use, and to know which proxy to create, you need to connect to the device. So probing the device is, as we can see, the best solution so far, apart from keeping a lookup table somewhere.

    The library currently kills all remaining processes once a valid proxy has been found, and yes, there is always only one good proxy per device. Currently there are about 12 proxies, so if one creates a proxy instance using the factory, 12 sub-processes are spawned. So far, this has been really useful and has worked very well.

    But recently someone else wanted to use this library to "broadcast" a command to all devices. So he took the library and wrote his own multi-processed script. This obviously spawned 12 * n processes, where n is the number of IPs he broadcasted to. This has given us two problems: the host on which the command was executed slowed down to a near halt, and aborting the script with CTRL+C ground the system to a total halt. Not even the hardware console responded anymore! This may be due to some Python strangeness which still needs to be investigated; it may be related to http://bugs.python.org/issue8296.

    The big underlying question is how to design a library which does multi-processing so that other applications which use this library and want to be multi-processed themselves do not run into system limitations. My first thought was to require a pool to be passed to the library and to execute all tasks in that pool; that way, the person using the library has control over the use of system resources. But my gut tells me that there must be a better solution.

    Disclaimer: my experience with multiprocessing is fairly limited. I have implemented a few straightforward cases which did not require access control to resources, so I do not yet have any practical experience with semaphores or mutexes.

    P.S.: In the future, we may have enough information to do this without the probing, but the database which would contain the proper information is not yet operational. Also, the design question of multiprocessing a multi-processed library intrigues me :)

    Read the article
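
    The "pass in a pool" idea can also be expressed as passing in a concurrency budget. A rough C# sketch of that design (names such as ProxyFactory.ProbeAsync are hypothetical): the library waits on a caller-supplied SemaphoreSlim before every probe, so the application decides how many probes run at once regardless of how many devices it queries in parallel.

        using System;
        using System.Threading;
        using System.Threading.Tasks;

        public static class ProxyFactory
        {
            public static async Task<string> ProbeAsync(string ip, SemaphoreSlim probeBudget)
            {
                await probeBudget.WaitAsync();       // wait for a free slot in the caller's budget
                try
                {
                    // ... try each candidate proxy against 'ip' here ...
                    await Task.Delay(100);           // stand-in for the real probing work
                    return $"proxy-for-{ip}";
                }
                finally
                {
                    probeBudget.Release();           // give the slot back
                }
            }
        }

        // Caller: one shared budget of 12 concurrent probes across every device,
        // no matter how many IPs are broadcast to.
        //   var budget = new SemaphoreSlim(12);
        //   var proxies = await Task.WhenAll(ips.Select(ip => ProxyFactory.ProbeAsync(ip, budget)));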

  • How to make Firefox spell-check in multiple languages simultaneously?

    - by Vi
    I want it to assume that text may be a mixture of languages and that words should be looked up in multiple dictionaries. (E.g. everything in en-GB, en-US, ru, be and be-classic should be considered good; everything else should be underlined, and corrections from all dictionaries should be offered.) Is there an add-on for multi-language spell-checking? Alternatively, can I merge all the dictionaries into one big combined dictionary?

    Read the article

  • How to limit a process to a single CPU core?

    - by Jonathan
    How do you limit a single-process program run in a Windows environment to a single CPU core on a multi-core machine? Is it the same for a windowed program and a command-line program? UPDATE: the reason for doing this is benchmarking various aspects of programming languages. I need something that works from the very start of the process, so @akseli's answer, although great for other cases, doesn't solve my case.

    Read the article
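
    On Windows the per-process affinity mask controls which cores a process may run on. A short C# sketch (benchmark.exe is a hypothetical target, and setting the mask after Process.Start means a few instructions may already have run on another core, so it does not fully satisfy the "from the very start" requirement):

        using System;
        using System.Diagnostics;

        class AffinityDemo
        {
            static void Main()
            {
                // Pin the current process to logical core 0 (bit 0 of the mask).
                Process.GetCurrentProcess().ProcessorAffinity = (IntPtr)0x1;

                // Or launch another program and pin it right after it starts.
                var child = Process.Start("benchmark.exe");
                child.ProcessorAffinity = (IntPtr)0x1;
                child.WaitForExit();
            }
        }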

  • Forcing a mixed ISO-8859-1 and UTF-8 multi-line string into UTF-8

    - by knorv
    Consider the following problem: a multi-line string $junk contains some lines which are encoded in UTF-8 and some in ISO-8859-1. I don't know a priori which lines are in which encoding, so heuristics will be needed. I want to turn $junk into pure UTF-8 with proper re-encoding of the ISO-8859-1 lines. Also, in the event of errors in the processing I want to provide a "best effort" result rather than throwing an error. My current attempt looks like this:

        $junk = &force_utf8($junk);

        sub force_utf8 {
            my $input = shift;
            my $output = '';
            foreach my $line (split(/\n/, $input)) {
                if (utf8::valid($line)) {
                    utf8::decode($line);
                }
                $output .= "$line\n";
            }
            return $output;
        }

    While this appears to work, I'm certain this is not the optimal solution. How would you improve my force_utf8(...) sub?

    Read the article
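
    The same line-by-line heuristic, sketched here in C# rather than the poster's Perl (assuming the input is available as raw bytes split on newlines, and that ISO-8859-1 is the only fallback encoding): attempt a strict UTF-8 decode of each line and re-interpret the bytes as Latin-1 only when they are not valid UTF-8.

        using System.Text;

        static class ForceUtf8
        {
            private static readonly Encoding StrictUtf8 =
                new UTF8Encoding(encoderShouldEmitUTF8Identifier: false, throwOnInvalidBytes: true);
            private static readonly Encoding Latin1 = Encoding.GetEncoding("ISO-8859-1");

            public static string DecodeLine(byte[] lineBytes)
            {
                try
                {
                    return StrictUtf8.GetString(lineBytes);   // already valid UTF-8: keep as-is
                }
                catch (DecoderFallbackException)
                {
                    return Latin1.GetString(lineBytes);       // otherwise treat it as ISO-8859-1
                }
            }
        }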
