Search Results

Search found 15224 results on 609 pages for 'parallel python'.

Page 315 of 609

  • Are mathematical algorithms protected by copyright?

    - by analogy
    I wish to implement an algorithm which I read in a journal paper in my commercial software, and I want to know whether this is allowed. The algorithm in question is described in http://arxiv.org/abs/0709.2938 It is a very simple algorithm, and a number of implementations exist in Python (http://igraph.sourceforge.net/) and Java. One of them is under the GPL; another, which I got from a different researcher, had no license attached. There are significant differences between the two implementations, e.g. the second one uses threads and multiple cores. It is possible to rewrite (not translate) the algorithm. So can I use it in my software or on a server for commercial purposes? Thanks

    Read the article

  • Encoding gives "'ascii' codec can't encode character … ordinal not in range(128)"

    - by user140314
    I am working through the Django RSS reader project here. The RSS feed will read something like "OKLAHOMA CITY (AP) — James Harden let". The RSS feed's encoding reads encoding="UTF-8", so I believe I am passing UTF-8 to markdown in the code snippet below. The em dash is where it chokes. I get the Django error "'ascii' codec can't encode character u'\u2014' in position 109: ordinal not in range(128)", which is a UnicodeEncodeError. In the variables being passed I see "OKLAHOMA CITY (AP) \u2014 James Harden". The line that is not working is:
        content = content.encode(parsed_feed.encoding, "xmlcharrefreplace")
    I am using markdown 2.0, Django 1.1, and Python 2.4. What is the magic sequence of encoding and decoding that I need to do to make this work? Thanks.
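
    A minimal sketch of the decode-early, encode-late pattern that usually resolves this kind of error in Python 2 (render and feed_encoding are hypothetical names; the assumption is that content may arrive as either bytes or unicode):

        import markdown  # the question uses markdown 2.0

        def render(content, feed_encoding='utf-8'):
            # If content arrived as a byte string, decode it to unicode first;
            # mixing str and unicode is what triggers the implicit ascii codec.
            if isinstance(content, str):
                content = content.decode(feed_encoding)
            html = markdown.markdown(content)        # keep everything unicode here
            # Encode exactly once, at the output boundary.
            return html.encode('utf-8', 'xmlcharrefreplace')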

    Read the article

  • Turning A Stacked List into workable data

    - by BoSox
    In Excel I have a list of names that appear stacked within a single cell, and I want each name in its own column. I was thinking Python may be a good way to do this. Example:
        Joe Smith
        John Hawk
        Mike Green
        Lauren Smith
    One cell will look exactly like that, with each name on its own line but all of the names contained in the one cell. I have 50 cells, each with 1-20 stacked names, and I want to put each name in its own cell on a given row. So, in my example, all of those names would occupy the same row but each would have its own column. Any ideas?
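
    A minimal sketch of one way to do this with xlrd and xlwt (an assumed choice of libraries; the file names are hypothetical), assuming the stacked names sit in column A of the first sheet and are separated by newlines inside each cell:

        import xlrd
        import xlwt

        book = xlrd.open_workbook('stacked.xls')
        sheet = book.sheet_by_index(0)

        out_book = xlwt.Workbook()
        out_sheet = out_book.add_sheet('unstacked')

        for row in range(sheet.nrows):
            cell_text = sheet.cell_value(row, 0)
            # Each name sits on its own line inside the cell; split and spread
            # the pieces across the columns of the same output row.
            names = [n.strip() for n in cell_text.splitlines() if n.strip()]
            for col, name in enumerate(names):
                out_sheet.write(row, col, name)

        out_book.save('unstacked.xls')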

    Read the article

  • How can one manage to fully use the newly enhanced Parallelism features in .NET 4.0?

    - by Will Marcouiller
    I am quite interested in using the newly enhanced parallelism features in .NET 4.0. I have also seen some possibilities of using them in F# as well as in C#. So far, though, I can only see what PLINQ has to offer, for example:
        var query = from c in Customers.AsParallel()
                    where c.Name.Contains("customerNameLike")
                    select c;
    There must surely be other uses of this parallelism support. Do you have any other examples of using it? Is it particularly geared toward PLINQ, or are there other usages as easy as PLINQ? Thanks! =)

    Read the article

  • Bulk get of child entities on Google app engine?

    - by dfrankow
    On Google App Engine in Python, I have a Visit entity whose parent is a Patient. A Patient may have multiple visits. I need to set the most_recent_visit (and some auxiliary visit data) somewhere for later querying, probably in another child entity that Brett Slatkin might call a "relationship index entity." I wish to do so in a bulk style as follows:
        1. Get 100 Patient keys.
        2. Get all Visits that have any of the Patients from step 1 as an ancestor.
        3. Go through the Visits and get the latest for each Patient.
    Is there any way to perform step 2 in a bulk get? I know there is a way to bulk-get entities from a list of keys, and there is a way to match a single entity by its ancestor.
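
    There is no single ancestor query that spans several parents, so the usual fallback is one ancestor query per Patient. A minimal sketch with the old db API (Visit is the model from the question; visit_date is a hypothetical property name):

        from google.appengine.ext import db

        def latest_visits(patient_keys):
            # The datastore has no bulk "ancestor IN (...)" query, so this issues
            # one ancestor query per Patient key.
            latest = {}
            for key in patient_keys:
                visit = (Visit.all()              # Visit is the model from the question
                              .ancestor(key)
                              .order('-visit_date')   # hypothetical timestamp property
                              .get())
                if visit is not None:
                    latest[key] = visit
            return latest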

    Read the article

  • PyQt4: Need to move DLLs to package root

    - by Xavier
    Hi guys, I've used the new installers from http://www.riverbankcomputing.co.uk/software/pyqt/download for Python 2.6 x86_64, and I have a small problem importing PyQt4 in one particular application. Here's the traceback:
        # ERROR : Traceback (most recent call last):
        #   File "<Script Block >", line 2, in <module>
        #     from PyQt4 import QtCore
        # ImportError: DLL load failed: The specified procedure could not be found.
        # - [line 2]
    This might look familiar. The funny thing is that it does work in a previous version of the 3D software (and from a standard command line), but not in the new version. I inspected sys.path (within the app) to see whether this path was there: C:\Python26\Lib\site-packages\PyQt4\bin. In both applications this path is present. I finally managed to make it work by copying the DLLs from C:\Python26\Lib\site-packages\PyQt4\bin to C:\Python26\Lib\site-packages\PyQt4. Is there any known reason for this? I'm having a hard time debugging this further (making sure everything is 64-bit, paths are correct, etc.). Thanks for your help.
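
    Windows resolves DLL dependencies via PATH rather than sys.path, so a commonly suggested workaround (a sketch, not a confirmed fix for this particular installer) is to prepend the PyQt4 bin directory to PATH before importing:

        import os

        # Prepend the PyQt4 DLL directory to PATH so the Windows loader can find
        # the Qt DLLs; the path below is the one from the question.
        pyqt_bin = r'C:\Python26\Lib\site-packages\PyQt4\bin'
        os.environ['PATH'] = pyqt_bin + os.pathsep + os.environ.get('PATH', '')

        from PyQt4 import QtCore  # should now resolve its DLL dependencies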

    Read the article

  • How do you PEP 8-name a class whose name is an acronym?

    - by Arrieta
    I try to adhere to the style guide for Python code (also known as PEP 8). Accordingly, the preferred way to name a class is using CamelCase: "Almost without exception, class names use the CapWords convention. Classes for internal use have a leading underscore in addition." How can I be consistent with PEP 8 if my class name is formed by two acronyms (which in proper English should be capitalized)? For instance, if my class name were 'NASA JPL', what would you name it?
        class NASAJPL():  # 1
        class NASA_JPL(): # 2
        class NasaJpl():  # 3
    I am using #1, but it looks weird; #3 looks weird too, and #2 seems to violate PEP 8. Thoughts?

    Read the article

  • Obtain Latitude and Longitude from a GeoTIFF File

    - by Mikee
    Using GDAL in Python, how do you get the latitude and longitude of a GeoTIFF file? GeoTIFFs do not appear to store any latitude/longitude information directly. Instead, they store the XY origin coordinates, but those XY coordinates do not give the latitude and longitude of the top-left and bottom-left corners. It appears I will need to do some math to solve this problem, but I don't have a clue where to start. What procedure is required? I know that the GetGeoTransform() method is important here, but I don't know what to do with it from there.
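
    A minimal sketch of the usual recipe: GetGeoTransform() defines an affine map from pixel/line to projected coordinates, and osr can then reproject those into latitude/longitude (the file name is hypothetical, and the returned axis order can differ on newer GDAL builds):

        from osgeo import gdal, osr

        ds = gdal.Open('example.tif')          # hypothetical file name
        gt = ds.GetGeoTransform()              # (origin_x, px_w, rot, origin_y, rot, px_h)

        def pixel_to_projected(col, row):
            x = gt[0] + col * gt[1] + row * gt[2]
            y = gt[3] + col * gt[4] + row * gt[5]
            return x, y

        # Build a transform from the raster's projection to its geographic CRS.
        src = osr.SpatialReference()
        src.ImportFromWkt(ds.GetProjection())
        dst = src.CloneGeogCS()
        to_latlong = osr.CoordinateTransformation(src, dst)

        for col, row in [(0, 0), (0, ds.RasterYSize)]:   # top-left and bottom-left corners
            x, y = pixel_to_projected(col, row)
            # With the traditional axis order this prints (longitude, latitude, height);
            # GDAL 3 builds may report latitude first.
            print to_latlong.TransformPoint(x, y)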

    Read the article

  • Match elements from two files: how to write the intended format to a new file

    - by user2489612
    I am trying to update a text file by matching its first column against the first column of another, updated file; after a match, the old file's row should be updated. Here is my old file:
        Name Chr Pos ind1 in2 in3 ind4
        foot 1 5 aa bb cc
        ford 3 9 bb cc 00
        fake 3 13 dd ee ff
        fool 1 5 ee ff gg
        fork 1 3 ff gg ee
    Here is the new file:
        Name Chr Pos
        foot 1 5
        fool 2 5
        fork 2 6
        ford 3 9
        fake 3 13
    The updated file should look like:
        Name Chr Pos ind1 in2 in3 ind4
        foot 1 5 aa bb cc
        fool 2 5 ee ff gg
        fork 2 6 ff gg ee
        ford 3 9 bb cc 00
        fake 3 13 dd ee ff
    Here is my code:
        #!/usr/bin/env python
        import sys

        inputfile_1 = sys.argv[1]
        inputfile_2 = sys.argv[2]
        outputfile = sys.argv[3]

        inputfile1 = open(inputfile_1, 'r')
        inputfile2 = open(inputfile_2, 'r')
        outputfile = open(outputfile, 'w')

        ind = inputfile1.readlines()
        cm = inputfile2.readlines()[1:]

        outputfile.write(ind[0]) # add header
        for i in ind:
            i = i.split()
            for j in cm:
                j = j.split()
                if j[0] == i[0]:
                    outputfile.writelines(j[0:3] + i[3:])

        inputfile1.close()
        inputfile2.close()
        outputfile.close()
    When I ran it, it returned a single column rather than the format I wanted. Any suggestions? Thanks!
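
    A minimal sketch of one likely fix, assuming whitespace-delimited columns as in the sample data: look up each name's trailing columns from the old file, then write every row with explicit tab separators and a newline; the missing separators are what collapse the output into a single column:

        #!/usr/bin/env python
        import sys

        old_lines = open(sys.argv[1]).readlines()
        new_lines = open(sys.argv[2]).readlines()
        out = open(sys.argv[3], 'w')

        # Map name -> trailing columns (ind1...) from the old file.
        extras = {}
        for line in old_lines[1:]:
            cols = line.split()
            extras[cols[0]] = cols[3:]

        out.write(old_lines[0])               # keep the old header
        for line in new_lines[1:]:            # preserve the new file's row order
            cols = line.split()
            # Join columns with tabs and end the row with a newline; without
            # explicit separators the output collapses into one long column.
            out.write('\t'.join(cols[:3] + extras.get(cols[0], [])) + '\n')
        out.close()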

    Read the article

  • Duplicate an AppEngine Query object to create variations of a filter without affecting the base query

    - by Steve Mayne
    In my AppEngine project I need to use a certain filter as a base and then apply various different extra filters to the end, retrieving the different result sets separately, e.g.:
        base_query = MyModel.all().filter('mainfilter', 123)
    Then I need to use the results of various sub-queries separately:
        subquery1 = base_query.filter('subfilter1', 'xyz')
        # Do something with subquery1 results here
        subquery2 = base_query.filter('subfilter2', 'abc')
        # Do something with subquery2 results here
    Unfortunately filter() affects the state of the base Query instance rather than just returning a modified version. Is there any way to duplicate the Query object and use it as a base? Is there perhaps a standard Python way of duplicating an object that could be used? The extra filters are actually applied by the results of different forms dynamically within a wizard, and they use the 'running total' of the query in their branch to decide whether to ask further questions. Obviously I could pass around a rudimentary stack of filter criteria, but I'd rather use the Query itself if possible, as it adds simplicity and elegance to the solution.
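
    A sketch of one workaround rather than a confirmed App Engine feature: wrap the base query in a small factory function so each branch builds a fresh Query object with the shared filter instead of mutating one shared instance (MyModel and the filters are from the question):

        def base_query():
            # Build the shared filter from scratch on every call so each caller
            # gets an independent Query object to add its own filters to.
            return MyModel.all().filter('mainfilter', 123)

        subquery1 = base_query().filter('subfilter1', 'xyz')
        subquery2 = base_query().filter('subfilter2', 'abc')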

    Read the article

  • Is this a correct way to generate RSA keys?

    - by calccrypto
    Is this code going to give me correct values for RSA keys (assuming that the other functions are correct)? I'm having trouble getting my program to decrypt properly, in that certain blocks are not decrypting properly. This is in Python:
        import random
        def keygen(bits):
            p = q = 3
            while p == q:
                p = random.randint(2**(bits/2-2),2**(bits/2))
                q = random.randint(2**(bits/2-2),2**(bits/2))
            p += not(p&1) # changes the values from
            q += not(q&1) # even to odd
            while MillerRabin(p) == False: # checks for primality
                p -= 2
            while MillerRabin(q) == False:
                q -= 2
            n = p * q
            tot = (p-1) * (q-1)
            e = tot
            while gcd(tot,e) != 1:
                e = random.randint(3,tot-1)
            d = getd(tot,e) # gets the multiplicative inverse
            while d<0: # i can probably replace this with mod
                d = d + tot
            return e,d,n
    One set of keys generated:
        e = 3daf16a37799d3b2c951c9baab30ad2d
        d = 16873c0dd2825b2e8e6c2c68da3a5e25
        n = dc2a732d64b83816a99448a2c2077ced
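
    A small hedged sanity check that can help localise the decryption problem: for a valid key set, pow(pow(m, e, n), d, n) must return m for every block m with 0 <= m < n. A sketch using the sample values from the question purely for illustration:

        import random

        e = 0x3daf16a37799d3b2c951c9baab30ad2d
        d = 0x16873c0dd2825b2e8e6c2c68da3a5e25
        n = 0xdc2a732d64b83816a99448a2c2077ced

        # Round-trip a handful of random blocks; blocks that are >= n can never
        # decrypt correctly, which is a common cause of "some blocks are wrong".
        for _ in range(1000):
            m = random.randint(0, n - 1)
            c = pow(m, e, n)
            assert pow(c, d, n) == m, "key pair fails the round trip for m=%d" % m
        print "round trip OK for 1000 random blocks"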

    Read the article

  • Faster way to convert from 24-bit WAV PCM format to float?

    - by LMO
    I need to read data in from a WAV file in 24-bit PCM format and convert it to float. I'm using Python 2.7.2. The wave package reads the data in as a string, so what I've tried is:
        # read in entire wav file
        wdata = f.readframes(nFrames)
        # unpack into signed integers and convert to float
        data = array.array('f')
        for i in range(0,nFrames*3,3):
            data.append(float(struct.unpack('<i', '\x00'+ wdata[i:i+3])[0]))
        # normalize sample values
        data = data / 0x800000
    This is quite a bit faster than my earlier approaches, but still quite slow. Can anyone suggest a more efficient method?
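
    A minimal sketch of a vectorised alternative, assuming NumPy is acceptable: pack the little-endian 3-byte samples into int32 values, sign-extend them with an arithmetic shift, and scale once at the end (f and nFrames are the objects from the question):

        import numpy as np

        raw = f.readframes(nFrames)
        bytes3 = np.frombuffer(raw, dtype=np.uint8).reshape(-1, 3)

        # Pack the three little-endian bytes of each sample into an int32 ...
        samples = (bytes3[:, 0].astype(np.int32)
                   | (bytes3[:, 1].astype(np.int32) << 8)
                   | (bytes3[:, 2].astype(np.int32) << 16))
        # ... then sign-extend the 24-bit value via an arithmetic shift round trip.
        samples = (samples << 8) >> 8

        data = samples.astype(np.float32) / float(0x800000)   # normalise to roughly [-1, 1)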

    Read the article

  • How to synchronize cuda threads when they are in the same loop and we need to synchronize them to ex

    - by Vickey
    Hi all, I have written a code and Now I want to implement this on cuda GPU but I'm new to synchronization so please help me with this, It's little urgent to me. Below I'm presenting the code and I want to that LOOP1 to be executed by all threads (heance I want to this portion to take advantage of cuda and the remaining portion (the portion other from the LOOP1) is to be executed by only a single thread. do{ point_set = master_Q[(*num_mas) - 1].q; List* temp = point_set; List* pa = point_set; if(master_Q[num_mas[0] - 1].max) max_level = (int) (ceilf(il2 * log(master_Q[num_mas[0] - 1].max))); *num_mas = (*num_mas) - 1; while(point_set){ List* insert_ele = temp; while(temp){ insert_ele = temp; if((insert_ele->dist[insert_ele->dist_index-1] <= pow(2, max_level-1)) || (top_level == max_level)){ if(point_set == temp){ point_set = temp->next; pa = temp->next; } else{ pa->next = temp->next; } temp = NULL; List* new_point_set = point_set; float maximum_dist = 0; if(parent->p_index != insert_ele->point_index){ List* tmp = new_point_set; float *b = &(data[(insert_ele->point_index)*point_len]); **LOOP 1:** while(tmp){ float *c = &(data[(tmp->point_index)*point_len]); float sum = 0.; for(int j = 0; j < point_len; j+=2){ float d1 = b[j] - c[j]; float d2 = b[j+1] - c[j+1]; d1 *= d1; d2 *= d2; sum = sum + d1 + d2; } tmp->dist[tmp->dist_index] = sqrt(sum); if(maximum_dist < tmp->dist[tmp->dist_index]) maximum_dist = tmp->dist[tmp->dist_index]; tmp->dist_index = tmp->dist_index+1; tmp = tmp->next; } max_distance = maximum_dist; } while(new_point_set || insert_ele){ List* far, *par, *tmp, *tmp_new; far = NULL; tmp = new_point_set; tmp_new = NULL; float level_dist = pow(2, max_level-1); float maxdist = 0, maxp = 0; while(tmp){ if(tmp->dist[(tmp->dist_index)-1] > level_dist){ if(maxdist < tmp->dist[tmp->dist_index-1]) maxdist = tmp->dist[tmp->dist_index-1]; if(tmp == new_point_set){ new_point_set = tmp->next; par = tmp->next; } else{ par->next = tmp->next; } if(far == NULL){ far = tmp; tmp_new = far; } else{ tmp_new->next = tmp; tmp_new = tmp; } if(parent->p_index != insert_ele->point_index) tmp->dist_index = tmp->dist_index - 1; tmp = tmp->next; tmp_new->next = NULL; } else{ par = tmp; if(maxp < tmp->dist[(tmp->dist_index)-1]) maxp = tmp->dist[(tmp->dist_index)-1]; tmp = tmp->next; } } if(0 == maxp){ tmp = new_point_set; aloc_mem[*tree_index].p_index = insert_ele->point_index; aloc_mem[*tree_index].no_child = 0; aloc_mem[*tree_index].level = max_level--; parent->children_index[parent->no_child++] = *tree_index; parent = &(aloc_mem[*tree_index]); tree_index[0] = tree_index[0]+1; while(tmp){ aloc_mem[*tree_index].p_index = tmp->point_index; aloc_mem[(*tree_index)].no_child = 0; aloc_mem[(*tree_index)].level = master_Q[(*cur_count_Q)-1].level; parent->children_index[parent->no_child] = *tree_index; parent->no_child = parent->no_child + 1; (*tree_index)++; tmp = tmp->next; } cur_count_Q[0] = cur_count_Q[0]-1; new_point_set = NULL; } master_Q[*num_mas].q = far; master_Q[*num_mas].parent = parent; master_Q[*num_mas].valid = true; master_Q[*num_mas].max = maxdist; master_Q[*num_mas].level = max_level; num_mas[0] = num_mas[0]+1; if(0 != maxp){ aloc_mem[*tree_index].p_index = insert_ele->point_index; aloc_mem[*tree_index].no_child = 0; aloc_mem[*tree_index].level = max_level; parent->children_index[parent->no_child++] = *tree_index; parent = &(aloc_mem[*tree_index]); tree_index[0] = tree_index[0]+1; if(maxp){ int new_level = ((int) (ceilf(il2 * log(maxp)))) +1; if (new_level < (max_level-1)) max_level = new_level; 
else max_level--; } else max_level--; } if( 0 == maxp ) insert_ele = NULL; } } else{ if(NULL == temp->next){ master_Q[*num_mas].q = point_set; master_Q[*num_mas].parent = parent; master_Q[*num_mas].valid = true; master_Q[*num_mas].level = max_level; num_mas[0] = num_mas[0]+1; } pa = temp; temp = temp->next; } } if((*num_mas) > 1){ List *temp2 = master_Q[(*num_mas)-1].q; while(temp2){ List* temp3 = master_Q[(*num_mas)-2].q; master_Q[(*num_mas)-2].q = temp2; if((master_Q[(*num_mas)-1].parent)->p_index != (master_Q[(*num_mas)-2].parent)->p_index){ temp2->dist_index = temp2->dist_index - 1; } temp2 = temp2->next; master_Q[(*num_mas)-2].q->next = temp3; } num_mas[0] = num_mas[0]-1; } point_set = master_Q[(*num_mas)-1].q; temp = point_set; pa = point_set; parent = master_Q[(*num_mas)-1].parent; max_level = master_Q[(*num_mas)-1].level; if(master_Q[(*num_mas)-1].max) if( max_level > ((int) (ceilf(il2 * log(master_Q[(*num_mas)-1].max)))) +1) max_level = ((int) (ceilf(il2 * log(master_Q[(*num_mas)-1].max)))) +1; num_mas[0] = num_mas[0]-1; } }while(*num_mas > 0);

    Read the article

  • Getting pixel averages of a vector sitting atop a bitmap...

    - by user346511
    I'm currently involved in a hardware project where I am mapping triangular-shaped LEDs to traditional bitmap images. I'd like to overlay a triangle vector onto an image and get the average pixel data within the bounds of that vector. However, I'm unfamiliar with the math needed to calculate this. Does anyone have an algorithm or a link that could point me in the right direction? (I tagged this as Python, which is preferred, but I'd be happy with the general algorithm!) I've created a basic image of what I'm trying to capture here: http://imgur.com/Isjip.gif
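
    A minimal sketch of one practical approach, assuming PIL and NumPy are available: rasterise the triangle into a mask with ImageDraw.polygon and average the pixels the mask selects (the file name and vertex list are hypothetical):

        import numpy as np
        from PIL import Image, ImageDraw

        image = Image.open('bitmap.png').convert('RGB')   # hypothetical file name
        triangle = [(10, 10), (60, 15), (30, 55)]         # hypothetical vertices (x, y)

        # Rasterise the triangle into a binary mask the same size as the image.
        mask = Image.new('L', image.size, 0)
        ImageDraw.Draw(mask).polygon(triangle, outline=1, fill=1)

        pixels = np.asarray(image)                        # shape (height, width, 3)
        inside = np.asarray(mask).astype(bool)            # True inside the triangle

        average_rgb = pixels[inside].mean(axis=0)         # per-channel average
        print average_rgb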

    Read the article

  • Returning more than one result

    - by Hairr
    I'm using the following code:
        def recentchanges(bot=False,rclimit=20):
            """
            @description: Gets the last 20 pages edited on the recent changes and the user who edited them
            """
            recent_changes_data = {
                'action':'query',
                'list':'recentchanges',
                'rcprop':'user|title',
                'rclimit':rclimit,
                'format':'json'
            }
            if bot is False:
                recent_changes_data['rcshow'] = '!bot'
            else:
                pass
            data = urllib.urlencode(recent_changes_data)
            response = opener.open('http://runescape.wikia.com/api.php',data)
            content = json.load(response)
            pages = tuple(content['query']['recentchanges'])
            for title in pages:
                return title['title']
    When I do recentchanges() I only get one result. If I print it, though, all the pages are printed. Am I just misunderstanding something, or is this a Python thing? Also, opener is:
        cj = CookieJar()
        opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
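
    The function returns inside the loop's first iteration, which is why only one title comes back. A minimal sketch of the usual fix, collecting all titles into a list (recentchanges_titles is a hypothetical helper name; yielding each title would work as well):

        def recentchanges_titles(pages):
            # `return` ends the function on the first loop iteration; collecting
            # the titles into a list (or yielding them) returns every result.
            return [page['title'] for page in pages]

        # Usage with the tuple built inside the question's function:
        # titles = recentchanges_titles(pages)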

    Read the article

  • MongoDB: What's the point of using MapReduce without parallelism?

    - by netvope
    Quoting http://www.mongodb.org/display/DOCS/MapReduce#MapReduce-Parallelism: "As of right now, MapReduce jobs on a single mongod process are single threaded." Without parallelism, what are the benefits of MapReduce compared to simpler or more traditional methods for queries and data aggregation? To avoid confusion: the question is NOT "what are the benefits of a document-oriented DB over a traditional relational DB".

    Read the article

  • My next programming language

    - by Betamoo
    Currently I can program in C#, C++, Java and PHP. Next summer I intend to start learning a new language. Can you help me decide what I should start reading about? I have heard about Perl, Python and Lisp, but I do not know whether any of them will be worth more than what I already have in my other languages. Please also mention how in-demand your suggested language is in the job market; I do not want to learn an obsolete language. Thanks

    Read the article

  • How to return a value when destroying/cleaning-up an object instance

    - by Mridang Agarwalla
    When I instantiate a class in Python, I give it some values. I then call a method on the class which does something. Here's a snippet:
        class TestClass():
            def __init__(self):
                self.counter = 0
            def doSomething(self):
                self.counter = self.counter + 1
                print 'Hiya'

        if __name__ == "__main__":
            obj = TestClass()
            obj.doSomething()
            obj.doSomething()
            obj.doSomething()
            print obj.counter
    As you can see, every time I call the doSomething method, it prints some text and increments an internal variable, i.e. counter. When I instantiate the class, I set the counter variable to 0. When I destroy the object, I'd like to return the internal counter variable. What would be a good way of doing this? I wanted to know if there are other ways apart from:
        1. accessing the variable directly, like obj.counter, or
        2. creating a method like getCounter.
    Thanks.
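
    Returning a value from __del__ is not useful because nothing receives it, so one common alternative (a sketch, not the only option) is to make the class a context manager: the clean-up runs at a well-defined point and the counter is read off the instance afterwards:

        class TestClass(object):
            def __init__(self):
                self.counter = 0

            def doSomething(self):
                self.counter += 1
                print 'Hiya'

            def __enter__(self):
                return self

            def __exit__(self, exc_type, exc_value, traceback):
                # Clean-up hook; the final count stays readable on the instance.
                print 'final count:', self.counter
                return False    # do not swallow exceptions

        if __name__ == "__main__":
            with TestClass() as obj:
                obj.doSomething()
                obj.doSomething()
            print obj.counter   # still accessible after the with-block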

    Read the article

  • django + xmpppy: send a message to two recipients

    - by Agrajag
    I'm trying to use xmpppy for sending Jabber messages from a Django website. This works entirely fine. However, the message only gets sent to the first of the recipients in the list. This happens when I run the following function from Django, and also if I run it from an interactive Python shell. The weird part, though, is that if I extract the body of the function and run that interactively, then all the recipients (there are just 2 at the moment) get the message. Also, I do know that the for-loop runs the correct number of times (2), because the print statement runs twice and returns two different message IDs. The function looks like this:
        def hello_jabber(request, text):
            jid=xmpp.protocol.JID(settings.JABBER_ID)
            cl=xmpp.Client(jid.getDomain(),debug=[])
            con=cl.connect()
            auth=cl.auth(jid.getNode(),settings.JABBER_PW,resource=jid.getResource())
            for friend in settings.JABBER_FRIENDS:
                id=cl.send(xmpp.protocol.Message(friend,friend + ' is awesome:' + text))
                print 'sent message with id ' + str(id)
            cl.disconnect()
            return render_to_response('jabber/sent.htm', locals())

    Read the article

  • What is the "task" in twitter Storm parallelism

    - by John Wang
    I'm trying to learn Twitter Storm by following the great article "Understanding the parallelism of a Storm topology". However, I'm a bit confused by the concept of a "task". Is a task a running instance of a component (spout or bolt)? Does an executor having multiple tasks mean that the same component is executed multiple times by that executor; am I correct? Moreover, in a general parallelism sense, Storm spawns a dedicated thread (executor) for a spout or bolt, but what does an executor (thread) having multiple tasks contribute to parallelism? I think that, since a thread executes sequentially, having multiple tasks in a thread only makes the thread a kind of "cached" resource that avoids spawning a new thread for the next task run. Am I correct? I may clear up this confusion myself after taking more time to investigate, but you know, we both love Stack Overflow ;-) Thanks in advance.

    Read the article

  • Using Property Builtin with GAE Datastore's Model

    - by ejel
    I want to make GAE Model properties behave like attributes with custom getters/setters, for cases such as converting the value to uppercase before storing it. For a plain Python class, I would do something like:
        class Foo(db.Model):
            def get_attr(self):
                return self.something
            def set_attr(self, value):
                self.something = value.upper() if value != None else None
            attr = property(get_attr, set_attr)
    However, GAE's Datastore has its own concept of a Property class. I looked into the documentation and it seems that I could override get_value_for_datastore(model_instance) to achieve my goal. Nevertheless, I don't know what model_instance is or how to extract the corresponding field from it. Is overriding GAE Property classes the right way to provide getter/setter-like functionality? If so, how do I do it? Added: one potential issue with overriding get_value_for_datastore that I can think of is that it might not get called until the object is put into the datastore. Hence, getting the attribute before storing the object would yield an incorrect value.
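
    A minimal sketch of the "how" part, assuming the old google.appengine.ext.db API: subclass a property type and override get_value_for_datastore, which receives the model instance so the stored value can be derived from it at put() time (the class and property names here are hypothetical):

        from google.appengine.ext import db

        class UpperCaseStringProperty(db.StringProperty):
            def get_value_for_datastore(self, model_instance):
                # Called when the entity is written; read the in-memory value
                # from the instance and transform it before it is stored.
                value = super(UpperCaseStringProperty, self).get_value_for_datastore(model_instance)
                return value.upper() if value is not None else None

        class Foo(db.Model):
            attr = UpperCaseStringProperty()

    As the question anticipates, this only normalises the value at write time; reads before a put() still see whatever was assigned, so a plain Python property or a normalising setter is the usual complement.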

    Read the article

  • Regex for finding valid sphinx fields

    - by mlissner
    I'm trying to validate that the fields given to Sphinx are valid, but I'm having difficulty. Imagine that the valid fields are cat, mouse, dog, puppy. Valid searches would then be:
        @cat search terms
        @(cat) search terms
        @(cat, dog) search term
        @cat searchterm1 @dog searchterm2
        @(cat, dog) searchterm1 @mouse searchterm2
    So I want to use a regular expression to find terms such as cat, dog and mouse in the above examples and check them against a list of valid terms. Thus, a query such as @(goat) would produce an error because goat is not a valid term. I've gotten as far as finding simple queries such as @cat with this regex:
        (?:@)([^( ]*)
    But I can't figure out how to find the rest. I'm using Python and Django, for what that's worth.
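
    A minimal sketch of one way to pull out both the bare @field and the @(field, field) forms and flag anything outside a whitelist (the whitelist and the function name are hypothetical):

        import re

        VALID_FIELDS = set(['cat', 'mouse', 'dog', 'puppy'])   # assumed whitelist

        def invalid_fields(query):
            fields = []
            # Grouped form: @(cat, dog)
            for group in re.findall(r'@\(([^)]*)\)', query):
                fields.extend(name.strip() for name in group.split(','))
            # Bare form: @cat
            fields.extend(re.findall(r'@(\w+)', query))
            return [f for f in fields if f not in VALID_FIELDS]

        # Example: invalid_fields('@(goat) search terms') -> ['goat']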

    Read the article

  • How do I redirect stdin/stdout when I have a sequence of commands in Bash?

    - by Tom
    I've currently got a Bash command being executed (via Python's subprocess.Popen) which reads from stdin, does something and outputs to stdout. Something along the lines of:
        pid = subprocess.Popen( ["-c", "cmd1 | cmd2"],
                                stdin = subprocess.PIPE,
                                stdout = subprocess.PIPE,
                                shell = True )
        output_data = pid.communicate( "input data\n" )
    Now, what I want to do is to execute another command in that same subshell that will alter the state before the next commands execute, so my shell command line will now (conceptually) be:
        cmd0; cmd1 | cmd2
    Is there any way to have the input sent to cmd1 instead of cmd0 in this scenario? I'm assuming the output will include cmd0's output (which will be empty) followed by cmd2's output. cmd0 shouldn't actually read anything from stdin; does that make a difference in this situation? I know this is probably just a dumb way of doing this; I'm trying to patch in cmd0 without altering the other code too significantly. That said, I'm open to suggestions if there's a much cleaner way to approach this.
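
    A hedged sketch of the common workaround: redirect cmd0's stdin to /dev/null inside the shell string so it cannot consume the data meant for cmd1 (the command names are the placeholders from the question, and the plain command-string form of Popen is used here):

        import subprocess

        # With shell=True and a string, this runs: /bin/sh -c 'cmd0 </dev/null; cmd1 | cmd2'
        pid = subprocess.Popen(
            "cmd0 </dev/null; cmd1 | cmd2",   # cmd0 cannot touch the piped input
            stdin=subprocess.PIPE,
            stdout=subprocess.PIPE,
            shell=True,
        )
        output_data, _ = pid.communicate("input data\n")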

    Read the article

  • Django or Drupal: which one best suits my needs?

    - by HJ-INCPP
    Hello, I want to learn and use Drupal or Django for the following: dynamic web sites, a medium-sized database, multi-level users, PayPal integration, content management, development speed, and security. I like MVC, ORM and object-oriented programming. Which is better to jump into? Which one is more mature, powerful, understandable, object-oriented and easier to use over time? What about Python Spring ... Also, which of these three is better documented, better for a CV, and has more extensions? Known languages: PHP, Java, MySQL. Thank you!

    Read the article
