Search Results

Search found 16680 results on 668 pages for 'python datetime'.

Page 433 of 668

  • program hangs during socket interaction

    - by herrturtur
    I have two programs, sendfile.py and recvfile.py, that are supposed to interact to send a file across the network. They communicate over TCP sockets. The communication is supposed to go something like this:

        sender =====filename=====> receiver
        sender <===== 'ok' ======= receiver
        or
        sender <===== 'no' ======= receiver
        if ok:
        sender ====== file ======> receiver

    The sender and receiver code is here.

    Sender:

        import sys
        from jmm_sockets import *

        if len(sys.argv) != 4:
            print "Usage:", sys.argv[0], "<host> <port> <filename>"
            sys.exit(1)

        s = getClientSocket(sys.argv[1], int(sys.argv[2]))

        try:
            f = open(sys.argv[3])
        except IOError, msg:
            print "couldn't open file"
            sys.exit(1)

        # send filename
        s.send(sys.argv[3])

        # receive 'ok'
        buffer = None
        response = str()
        while 1:
            buffer = s.recv(1)
            if buffer == '':
                break
            else:
                response = response + buffer

        if response == 'ok':
            print 'receiver acknowledged receipt of filename'
            # send file
            s.send(f.read())
        elif response == 'no':
            print "receiver doesn't want the file"

        # cleanup
        f.close()
        s.close()

    Receiver:

        from jmm_sockets import *

        s = getServerSocket(None, 16001)
        conn, addr = s.accept()

        buffer = None
        filename = str()

        # receive filename
        while 1:
            buffer = conn.recv(1)
            if buffer == '':
                break
            else:
                filename = filename + buffer

        print "sender wants to send", filename, "is that ok?"
        user_choice = raw_input("ok/no: ")

        if user_choice == 'ok':
            # send ok
            conn.send('ok')
            # receive file
            data = str()
            while 1:
                buffer = conn.recv(1)
                if buffer == '':
                    break
                else:
                    data = data + buffer
            print data
        else:
            conn.send('no')

        conn.close()

    I'm sure I'm missing something here, some sort of deadlock, but I don't know what it is.
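    The hang is a classic two-way wait: both recv loops only exit when recv() returns an empty string, which happens only when the peer closes (or shuts down) its end of the connection. The sender never closes after sending the filename, so the receiver sits in its filename loop, and the sender in turn blocks waiting for an 'ok' that never arrives. A minimal sketch of one way out, assuming a newline-delimited framing on both ends (the helper names are illustrative, not part of the original jmm_sockets module):

        # Frame each short message with a trailing newline so the peer knows where
        # it ends without waiting for the connection to close.
        def send_msg(sock, text):
            sock.sendall(text + "\n")

        def recv_msg(sock):
            chunks = []
            while True:
                byte = sock.recv(1)
                if byte == "" or byte == "\n":   # peer closed, or end of message
                    break
                chunks.append(byte)
            return "".join(chunks)

        # Sender side (sketch):
        #   send_msg(s, filename)
        #   response = recv_msg(s)      # returns as soon as "ok\n" or "no\n" arrives
        #   if response == "ok":
        #       s.sendall(f.read())
        #   s.close()                   # closing signals end-of-file to the receiver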

    Read the article

  • How do you get SQLAlchemy to override MySQL "on update CURRENT_TIMESTAMP"

    - by nocola
    I've inherited an older database that was set up with "on update CURRENT_TIMESTAMP" on a field that should only describe an item's creation. With PHP I have been using "timestamp=timestamp" in UPDATE clauses, but with SQLAlchemy I can't seem to force the system to keep the stored timestamp. Do I have no choice but to update the MySQL table (millions of rows)?

        foo = session.query(f).get(int(1))
        ts = foo.timestamp
        setattr(foo, 'timestamp', ts)
        setattr(foo, 'bar', bar)
        www_model.www_Session.commit()

    I have also tried:

        foo = session.query(f).get(int(1))
        setattr(foo, 'timestamp', foo.timestamp)
        setattr(foo, 'bar', bar)
        www_model.www_Session.commit()
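    Assigning an attribute its current value does not mark it dirty, so SQLAlchemy leaves the column out of the UPDATE and MySQL's ON UPDATE clause fires. One way to emulate PHP's "timestamp=timestamp" trick is to force the unchanged column back into the UPDATE; newer SQLAlchemy versions expose flag_modified for exactly that. A minimal sketch, assuming a mapped Foo class standing in for the model in the question:

        from sqlalchemy.orm.attributes import flag_modified

        foo = session.query(Foo).get(1)
        foo.bar = bar
        # Mark the unchanged column as dirty so it is re-sent with its current
        # value, which keeps MySQL's ON UPDATE CURRENT_TIMESTAMP from firing.
        flag_modified(foo, 'timestamp')
        session.commit()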

    Read the article

  • Various ways to send data to the web server

    - by Webrsk
    Client environment: Windows XP, internet connection available, PHP not installed. Server environment: CentOS, internet connection available, PHP and MySQL installed. Data is stored in files on the client machine; please suggest good ways to send data read from those files to the server. Normally I would send the data to the server with an HTTP request using cURL, but the client machine doesn't have PHP installed. What are the possible ways to send data to the server, and how do they compare?
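    cURL itself is a standalone tool (curl.exe runs on Windows XP without PHP), and any scripting runtime on the client can do a plain HTTP POST; only the receiving endpoint on the CentOS box matters. A minimal Python 2 sketch, assuming a hypothetical upload.php script on the server that accepts POSTed form fields (path and URL are placeholders):

        import urllib
        import urllib2

        # Read the locally stored data and POST it to the server's upload script.
        fh = open(r'C:\data\report.txt', 'rb')
        payload = urllib.urlencode({'filename': 'report.txt',
                                    'contents': fh.read()})
        fh.close()

        response = urllib2.urlopen('http://example.com/upload.php', payload)
        print response.read()   # whatever the server-side script echoes back

    FTP/SFTP or scp are reasonable alternatives when the files should land on disk unchanged rather than be processed by a web script.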

    Read the article

  • PyGTK drawing area and rich text editor

    - by crashekar
    I would like to include a rich text editor in a PyGTK drawing area for an application I am developing. The editor (a small, resizable widget) should be able to move around the drawing area like a rectangle. I am not sure how to start, as I am pretty new to PyGTK. Thank you!
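    A gtk.DrawingArea cannot host child widgets, but gtk.Layout behaves like a large drawing surface that also accepts children at absolute coordinates, which is one way to get a movable editor "rectangle". A minimal sketch, with a plain gtk.TextView standing in for the rich text editor widget:

        import gtk

        layout = gtk.Layout()            # drawable surface that accepts child widgets
        editor = gtk.TextView()          # placeholder for the rich text editor widget
        editor.set_size_request(200, 120)
        layout.put(editor, 40, 40)       # absolute placement inside the canvas

        # Later, e.g. from a drag handler, reposition the editor on the canvas:
        # layout.move(editor, 120, 80)

        win = gtk.Window()
        win.set_default_size(400, 300)
        win.add(layout)
        win.connect('destroy', gtk.main_quit)
        win.show_all()
        gtk.main()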

    Read the article

  • poplib and email module will not loop through a message it has already read

    - by user1440925
    I'm currently trying to write a script that gets messages from my Gmail account, but I'm noticing a problem: once poplib has looped through a message in my inbox, it will never loop through it again. Here is my code:

        import poplib, string, email

        user = "[email protected]"
        password = "p0ckystyx"
        message = ""

        mail = poplib.POP3_SSL('pop.gmail.com')
        mail.user(user)
        mail.pass_(password)

        iMessageCount = len(mail.list()[1])
        message = ""
        msg = mail.retr(iMessageCount)
        str = string.join(msg[1], "\n")
        frmMail = email.message_from_string(str)

        for part in frmMail.walk():
            if part.get_content_type() == "text/plain":
                print part.get_payload()

        mail.quit()

    Every time I run this script it goes to the next newest email and just skips over the email that was shown the last time it was run.
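    Gmail's POP3 interface is the likely culprit: by default it only exposes messages that have not already been handed to a POP client, so a message fetched once disappears from later sessions no matter what poplib does (prefixing the POP username with "recent:" is the usual workaround to see the last 30 days again). Separately, the code above only ever retrieves the single newest message. A minimal sketch that walks every message the server currently exposes, using only the poplib/email calls already in the question:

        import poplib, email

        mail = poplib.POP3_SSL('pop.gmail.com')
        mail.user(user)
        mail.pass_(password)

        count = len(mail.list()[1])
        for num in range(1, count + 1):        # POP3 message numbers start at 1
            lines = mail.retr(num)[1]
            msg = email.message_from_string("\n".join(lines))
            for part in msg.walk():
                if part.get_content_type() == "text/plain":
                    print part.get_payload()

        mail.quit()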

    Read the article

  • How to do a back-reference on Google App Engine?

    - by jCuga
    I'm trying to access an object that is linked to by a db.ReferenceProperty in Google App Engine. Here's the model's code:

        class InquiryQuestion(db.Model):
            inquiry_ref = db.ReferenceProperty(reference_class=GiftInquiry,
                                               required=True,
                                               collection_name="inquiry_ref")

    And I am trying to access it in the following way:

        linkedObject = question.inquiry_ref

    and then

        linkedKey = linkedObject.key

    but it's not working. Can anyone please help?
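    Two details stand out: key is a method on db.Model instances (linkedObject.key(), not linkedObject.key), and collection_name names the automatic back-reference that App Engine adds to GiftInquiry, so reusing the forward property's own name invites a clash. A minimal sketch of the usual arrangement, assuming the GiftInquiry model and a question instance from the question:

        class InquiryQuestion(db.Model):
            inquiry_ref = db.ReferenceProperty(reference_class=GiftInquiry,
                                               required=True,
                                               collection_name="questions")

        linked_object = question.inquiry_ref     # dereferences (one datastore get)
        linked_key = linked_object.key()         # key() is a method, not an attribute

        # Key only, without fetching the referenced entity:
        linked_key = InquiryQuestion.inquiry_ref.get_value_for_datastore(question)

        # Back-reference: all questions that point at one inquiry.
        for q in some_inquiry.questions:
            print q.key()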

    Read the article

  • Reordering fields in Django model

    - by Alex Lebedev
    I want to add a few fields to every model in my Django application. This time it's created_at, updated_at and notes. Duplicating the code for each of 20+ models seems dumb, so I decided to use an abstract base class to add these fields. The problem is that fields inherited from the abstract base class come first in the field list in the admin. Declaring the field order for every ModelAdmin class is not an option; that is even more duplicated code than declaring the fields manually. In my final solution, I modified the model constructor to reorder the fields in _meta before creating a new instance:

        class MyModel(models.Model):
            # Service fields
            notes = my_fields.NotesField()
            created_at = models.DateTimeField(auto_now_add=True)
            updated_at = models.DateTimeField(auto_now=True)

            class Meta:
                abstract = True

            last_fields = ("notes", "created_at", "updated_at")

            def __init__(self, *args, **kwargs):
                new_order = [f.name for f in self._meta.fields]
                for field in self.last_fields:
                    new_order.remove(field)
                    new_order.append(field)
                self._meta._field_name_cache.sort(key=lambda x: new_order.index(x.name))
                super(MyModel, self).__init__(*args, **kwargs)

        class ModelA(MyModel):
            field1 = models.CharField()
            field2 = models.CharField()
            # etc.

    It works as intended, but I'm wondering: is there a better way to achieve my goal?
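    Sorting _meta._field_name_cache leans on a private cache that can change between Django versions. Since the ordering only matters in the admin, one less invasive alternative is a single shared ModelAdmin base class that pushes the service fields to the end of every fieldset. A rough sketch under that assumption (class name is illustrative):

        from django.contrib import admin

        class ServiceFieldsLastAdmin(admin.ModelAdmin):
            last_fields = ("notes", "created_at", "updated_at")

            def get_fieldsets(self, request, obj=None):
                fieldsets = super(ServiceFieldsLastAdmin, self).get_fieldsets(request, obj)
                for name, options in fieldsets:
                    # Keep the declared order, but move the service fields to the end.
                    fields = [f for f in options['fields'] if f not in self.last_fields]
                    fields += [f for f in self.last_fields if f in options['fields']]
                    options['fields'] = fields
                return fieldsets

        # admin.site.register(ModelA, ServiceFieldsLastAdmin)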

    Read the article

  • GQL request BadArgument error. How do I get around it in my case?

    - by awegawef
    My query is essentially the following:

        entries = Entry.all().order("-votes").order("-date").filter("votes >", VOTE_FILTER).fetch(PAGE_SIZE+1, page*PAGE_SIZE)

    I want to grab N of the latest entries that have a voting score above some benchmark (VOTE_FILTER). Google currently says that I cannot filter on 'votes' because I order by 'date'. I don't see a way to do this the way I want to, so I'd appreciate any advice.
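    The datastore requires that the property used in an inequality filter also be the first sort order, so 'votes' has to come before 'date' in the ordering, or the date ordering has to happen in memory on the fetched page. A minimal sketch of both options, assuming the Entry model from the question:

        # Option 1: keep the inequality property as the first sort order.
        q = (Entry.all()
                  .filter("votes >", VOTE_FILTER)
                  .order("-votes")
                  .order("-date"))
        entries = q.fetch(PAGE_SIZE + 1, page * PAGE_SIZE)

        # Option 2: order by votes only, then sort the small page by date in Python.
        entries = (Entry.all()
                        .filter("votes >", VOTE_FILTER)
                        .order("-votes")
                        .fetch(PAGE_SIZE + 1, page * PAGE_SIZE))
        entries.sort(key=lambda e: e.date, reverse=True)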

    Read the article

  • OpenGL embedded in GTK displays colours badly

    - by Sardathrion
    Note that this is a rewrite now that I have more clues as to where the problem could be. I am creating a GTK GUI that contains two embedded OpenGL displays. Both use the same shader code (compiled once for each). On my normal hardware this works fine; on a virtual machine running on the same hardware I get horrible colours -- see images. I suspect the shader code is at fault -- certainly dropping in a simpler shader does make the problem moot. However, I do need both diffuse and spot lights in my shader, which makes it non-trivial. Has anyone seen this before?

    Read the article

  • default model field attribute in Django

    - by Rosarch
    I have a Django model:

        @staticmethod
        def getdefault():
            print "getdefault called"
            return cPickle.dumps(set())

        _applies_to = models.TextField(db_index=True, default=getdefault)

    For some reason, getdefault() is never called, even as I construct instances of this model and save them to the database. This seems to contradict the Django documentation:

        Field.default
        The default value for the field. This can be a value or a callable object.
        If callable it will be called every time a new object is created.

    Am I doing something wrong?

    Update: Originally I had this, but then I switched to the above version to debug:

        _applies_to = models.TextField(db_index=True, default=cPickle.dumps(set()))

    I'm not sure why that wouldn't work.
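    Inside a class body in Python 2, getdefault is still a staticmethod descriptor at the point where the field is declared, and a staticmethod object is not itself callable, so Django treats it as a plain value instead of calling it. The earlier variant is different again: cPickle.dumps(set()) is evaluated once at class-definition time, so the field gets that fixed string rather than a per-object call. A minimal sketch that keeps the default callable by moving it to module level:

        import cPickle
        from django.db import models

        def getdefault():
            # Called for each new object, so every row gets a freshly pickled set.
            return cPickle.dumps(set())

        class MyModel(models.Model):
            _applies_to = models.TextField(db_index=True, default=getdefault)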

    Read the article

  • Online Game programming in Google App Engine: AI

    - by Hortinstein
    I am currently in the planning stages of a game for Google App Engine, but I cannot wrap my head around how I am going to handle AI. I intend to have persistent NPCs that will move about the map, but short of writing a program that generates the same XML requests I use to control player actions and running it on another server, I am stuck on how to do it. I have looked at the Task Queue feature, but because long-running processes are not an option on App Engine, I am a little stuck. I intend to run multiple server instances with 200+ persistent NPC entities that I will need to update. Most of the action is slow roaming based on player movements/concentrations, and attacking close-range players (you can probably guess the type of game I'm developing).
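    One pattern that fits App Engine's short-request limit is to update the NPCs in small batches from the task queue and have each task enqueue the next batch, so no single request runs long and nothing needs an external server. A rough sketch under those assumptions; the NPC model, handler URL, batch size and the per-NPC update are illustrative, and older SDKs import the queue from google.appengine.api.labs instead:

        from google.appengine.api import taskqueue
        from google.appengine.ext import db, webapp

        class NPC(db.Model):                   # stand-in for the game's NPC entity
            x = db.IntegerProperty(default=0)
            y = db.IntegerProperty(default=0)

        BATCH = 20  # NPCs updated per task, keeping each request well under the limit

        class UpdateNPCs(webapp.RequestHandler):
            def post(self):
                offset = int(self.request.get('offset', 0))
                npcs = NPC.all().fetch(BATCH, offset)
                for npc in npcs:
                    npc.x += 1                 # stand-in for the real AI step
                    npc.put()
                if len(npcs) == BATCH:         # more NPCs left: chain the next batch
                    taskqueue.add(url='/tasks/update_npcs',
                                  params={'offset': offset + BATCH})

    A cron entry that enqueues the first batch every minute or two gives the NPCs a steady heartbeat.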

    Read the article

  • integrate / build 'weather radar' widget

    - by zack
    I'm looking to integrate a 'weather radar' widget into a site I'm building. The only available resource I can find is http://www.meteoonline.co.uk/gadgets/Europe/Netherlands/135, which basically delivers a Flash movie in an iframe -- urrrgh! -- and, of course, 'permission denied' for any JavaScript interaction with the iframe. Can anyone suggest an alternative resource or approach? I'm happy to work with the raw data if I can get it; any ideas welcome! The site is HTML5 + jQuery on Django.

    Read the article

  • Pyjamas import statements

    - by Gordon Worley
    I'm starting to use Pyjamas and I'm running into some annoyances. I have to import a lot of things to make a script work well. For example, to make a button I first need from pyjamas.ui.Button import Button, and then I can use Button. Note that import pyjamas.ui.Button followed by Button.Button doesn't work (it results in errors when you build to JavaScript, at least in 0.7pre1). Does anyone have a better example of a good way to write the import statements in Pyjamas than what the Pyjamas folks have on their site? Doing things their way is possible, but ugly and overly complicated from my perspective, especially when you want to use a dozen or more UI components.
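    One way to contain the boilerplate is to do the noisy from-imports once in a small module of your own and pull widgets from there; the Pyjamas translator still only ever sees the plain "from module import Name" form it handles. A sketch, assuming a project-local widgets.py and the pyjamas.ui.<Widget> module layout from the question:

        # widgets.py -- the one place that knows Pyjamas' import layout
        from pyjamas.ui.Button import Button
        from pyjamas.ui.Label import Label
        from pyjamas.ui.TextBox import TextBox
        from pyjamas.ui.VerticalPanel import VerticalPanel
        from pyjamas.ui.RootPanel import RootPanel

        # elsewhere in the application:
        #   from widgets import Button, VerticalPanel, RootPanel
        #   panel = VerticalPanel()
        #   panel.add(Button("Click me"))
        #   RootPanel().add(panel)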

    Read the article

  • removing elements incrementally from a list

    - by Javier
    Dear all, I have a list of float numbers and I would like to delete incrementally a set of elements in a given range of indexes, something like:

        for j in range(beginIndex, endIndex+1):
            print ("remove [%d] => val: %g" % (j, myList[j]))
            del myList[j]

    However, since I'm iterating over the same list, the indexes (range) are no longer valid for the new list. Does anybody have suggestions on how to delete the elements properly? Best wishes
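    Each deletion shifts everything after the removed index one place to the left, so the remaining indexes in the range point at the wrong (or missing) elements. Two simple fixes: delete the whole range with one slice deletion, or delete from the highest index down so earlier positions are untouched. A minimal sketch:

        # Option 1: one slice deletion covers the whole index range at once.
        del myList[beginIndex:endIndex + 1]

        # Option 2: delete one by one, walking from the end of the range backwards.
        for j in range(endIndex, beginIndex - 1, -1):
            print("remove [%d] => val: %g" % (j, myList[j]))
            del myList[j]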

    Read the article

  • how to fetch more than 1000 entities without a key-based query?

    - by user291071
    If I should be approaching this problem through a different method, please say so. I am creating an item-based collaborative filter. I populate the db with the LinkRating2 class, and for each link there are more than 1000 users whose ratings I need to fetch to perform calculations, which I then use to create another table. So I need to fetch more than 1000 entities for a given link. For instance, let's say over 1000 users rated 'link1': there will be over 1000 instances of this class with that link property that I need to fetch. How would I complete this example?

        class LinkRating2(db.Model):
            user = db.StringProperty()
            link = db.StringProperty()
            rating2 = db.FloatProperty()

        query = LinkRating2.all()
        link1 = 'link string name'
        a = query.filter('link = ', link1)
        aa = a.fetch(1000)  # how would I get more than 1000 for a given link1 as shown?

    The key-based example below (from another post) gets past 1000, but it counts every entity; I need a method that works on a filtered subset, not on keys alone:

        class MyModel(db.Expando):
            @classmethod
            def count_all(cls):
                """
                Count *all* of the rows (without maxing out at 1000)
                """
                count = 0
                query = cls.all().order('__key__')
                while count % 1000 == 0:
                    current_count = query.count()
                    if current_count == 0:
                        break
                    count += current_count
                    if current_count == 1000:
                        last_key = query.fetch(1, 999)[0].key()
                        query = query.filter('__key__ > ', last_key)
                return count
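    The same key-paging trick works on a filtered subset, because an equality filter can be combined with ordering on __key__: fetch pages of 1000, and on each pass restrict __key__ to values greater than the last key already seen. A minimal sketch, assuming the LinkRating2 model above:

        def all_ratings_for_link(link1):
            ratings = []
            query = (LinkRating2.all()
                                .filter('link =', link1)
                                .order('__key__'))
            while True:
                batch = query.fetch(1000)
                if not batch:
                    break
                ratings.extend(batch)
                last_key = batch[-1].key()
                query = (LinkRating2.all()
                                    .filter('link =', link1)
                                    .filter('__key__ >', last_key)
                                    .order('__key__'))
            return ratings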

    Read the article

  • wxpython - Running threads sequentially without blocking GUI

    - by ryantmer
    I've got a GUI script with all my wxPython code in it, and a separate testSequences module that has a bunch of tasks that I run based on input from the GUI. The tasks take a long time to complete (from 20 seconds to 3 minutes), so I want to thread them; otherwise the GUI locks up while they're running. I also need them to run one after another, since they all use the same hardware. (My rationale behind threading is simply to prevent the GUI from locking up.) I'd like to have a "Running" message (with a varying number of periods after it, i.e. "Running", "Running.", "Running..", etc.) so the user knows that progress is occurring, even though it isn't visible. I'd like this script to run the test sequences in separate threads, but sequentially, so that the second thread won't be created and run until the first is complete. Since this is kind of the opposite of the purpose of threads, I can't really find any information on how to do this. Any help would be greatly appreciated. Thanks in advance!

    gui.py:

        import testSequences
        from threading import Thread

        # wxPython code for setting everything up here...

        for j in range(5):
            testThread = Thread(target=testSequences.test1)
            testThread.start()
            while testThread.isAlive():
                # wait until the previous thread is complete
                time.sleep(0.5)
                i = (i+1) % 4
                self.status.SetStatusText("Running" + '.'*i)

    testSequences.py:

        import time

        def test1():
            for i in range(10):
                print i
                time.sleep(1)

    (Obviously this isn't the actual test code, but the idea is the same.)
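    The sleep loop above runs in the GUI thread, so the interface locks up just as it would without threads. One common shape is a single worker thread that runs all of the sequences in order (so they never overlap on the shared hardware), while the GUI animates its status from a wx.Timer and the worker reports completion through wx.CallAfter. A sketch under those assumptions; attribute and method names outside wx and testSequences are illustrative:

        import wx
        import testSequences
        from threading import Thread

        def run_all_tests(frame):
            # Worker thread: the sequences run one after another right here.
            for j in range(5):
                testSequences.test1()
            wx.CallAfter(frame.status.SetStatusText, "Done")   # safe GUI update

        # In the GUI, e.g. a button handler:
        #   self.worker = Thread(target=run_all_tests, args=(self,))
        #   self.worker.start()
        #
        # A wx.Timer bound on the frame keeps the "Running..." animation going
        # without ever blocking the event loop:
        #   self.timer = wx.Timer(self)
        #   self.Bind(wx.EVT_TIMER, self.on_tick, self.timer)
        #   self.timer.Start(500)
        #
        # def on_tick(self, event):
        #     self.i = (self.i + 1) % 4
        #     self.status.SetStatusText("Running" + '.' * self.i)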

    Read the article

  • threading.Event wait function not signaled when subclassing Process class

    - by user1313404
    The following code never gets past the wait call in run(). I'm certain I'm doing something ridiculously stupid, but since I'm not smart enough to figure out what, I'm asking. Any help is appreciated. Here is the code:

        import threading
        import multiprocessing
        from multiprocessing import Process

        class SomeClass(Process):
            def __init__(self):
                Process.__init__(self)
                self.event = threading.Event()
                self.event.clear()

            def continueExec(self):
                print multiprocessing.current_process().name
                print self
                print "Set:" + str(self.event.is_set())
                self.event.set()
                print "Set:" + str(self.event.is_set())

            def run(self):
                print "I'm running with it"
                print multiprocessing.current_process().name
                self.event.wait()
                print "I'm further than I was"
                print multiprocessing.current_process().name
                self.event.clear()

        def main():
            s_list = []
            for t in range(3):
                s = SomeClass()
                print "s:" + str(s)
                s_list.append(s)
                s.start()

            raw_input("Press enter to send signal")

            for t in range(3):
                print "s_list[" + str(t) + "]:" + str(s_list[t])
                s_list[t].continueExec()

            raw_input("Press enter to send signal")

            for t in range(3):
                s_list[t].join()

            print "All Done"

        if __name__ == "__main__":
            main()
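    A threading.Event only signals threads within one process; once start() spawns the Process, the child has its own copy of the object, so set() in the parent never reaches the wait() in the child. Replacing it with multiprocessing.Event, which is designed to be shared across processes, is the usual fix. A minimal self-contained sketch of the change:

        import multiprocessing
        from multiprocessing import Process

        class SomeClass(Process):
            def __init__(self):
                Process.__init__(self)
                # multiprocessing.Event is backed by a shared semaphore, so set()
                # in the parent process wakes the wait() in the child process.
                self.event = multiprocessing.Event()

            def run(self):
                print "waiting in", multiprocessing.current_process().name
                self.event.wait()
                print "released in", multiprocessing.current_process().name

        if __name__ == "__main__":
            p = SomeClass()
            p.start()
            raw_input("Press enter to send signal")
            p.event.set()        # reaches the child because the Event is shared
            p.join()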

    Read the article

  • How do I do a semijoin using SQLAlchemy?

    - by Jason Baker
    http://en.wikipedia.org/wiki/Relational_algebra#Semijoin

    Let's say that I have two tables: A and B. I want to make a query that would work similarly to the following SQL statement, using the SQLAlchemy ORM:

        SELECT A.*
        FROM A, B
        WHERE A.id = B.id
          AND B.type = 'some type';

    The thing is that I'm trying to separate A's and B's logic into different places. So I'd like to make two queries that I can define in separate places: one where A uses B as a subquery, but only returns rows from A. I'm sure this is fairly easy to do, but an example would be nice if someone could show me.
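    A semijoin maps naturally onto an IN (or EXISTS) subquery: the inner query can live with B's logic and the outer one with A's. A minimal sketch, assuming mapped classes A and B with id and type columns:

        # Defined wherever B's logic lives: ids of B rows of the wanted type.
        def b_ids_of_type(session, type_name):
            return session.query(B.id).filter(B.type == type_name)

        # Defined wherever A's logic lives: rows of A whose id appears in that set.
        def a_semijoin_b(session, type_name):
            return session.query(A).filter(A.id.in_(b_ids_of_type(session, type_name)))

        rows = a_semijoin_b(session, 'some type').all()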

    Read the article

  • How can I identify an element from a list within another list

    - by Alex
    I have been trying to write a block of code that finds the index of the largest bid for each item. Then I was going to use that index to identify the name of the person who paid that much money. However, no matter what I try, I can't link a person and what they have gained from the auction together. Here is the code I have been writing (it has to work with any information entered):

        def sealedBids():
            n = int(input('\nHow many people are in the group? '))
            z = 0
            g = []
            s = []
            b = []
            f = []
            w = []  # goes by number of items
            q = []
            while z < n:
                b.append([])
                z = z + 1
            z = 0
            while z < n:
                g.append(input('Enter a bidders name: '))
                z = z + 1
            z = 0
            i = int(input('How many items are being bid on?'))
            while z < i:
                s.append(input('Enter the name of an item: '))
                w.append(z)
                z = z + 1
            z = 0
            for j in range(n):      # specifies whose bids you are taking
                for k in range(i):  # specifies which item is being bid on
                    b[j].append(int(input('How much money has {0} bid on the {1}? '.format(g[j], s[k]))))
            print(' ')
            for j in range(n):      # calculates fair share
                f.append(sum(b[j])/n)
            for j in range(i):      # finds the largest amount of money bid on each item
                for k in range(n):
                    if w[j] < b[k][j]:
                        w[j] = b[k][j]
                        q.append(k)

    Any advice is much appreciated.
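    The winning index and the winning bidder's name can be pulled out together if each item's bids are paired with the bidder indexes; max over enumerate does exactly that. A minimal sketch using the question's variable names (g is the list of bidder names, s the item names, and b[k][j] is bidder k's bid on item j):

        winners = []                      # winners[j] -> (name, bid) for item j
        for j in range(i):
            bids_for_item = [b[k][j] for k in range(n)]
            best_k, best_bid = max(enumerate(bids_for_item), key=lambda pair: pair[1])
            winners.append((g[best_k], best_bid))
            print('{0} wins {1} with a bid of {2}'.format(g[best_k], s[j], best_bid))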

    Read the article

  • MongoDB lists with pagination?

    - by Timmy
    For documents with lists that need pagination, is it better to embed or to use references? I'm reading about the custom type "SONManipulator", and it appears to transform everything on retrieval, even the subdocuments. I want to keep the list in the document sorted; should this impact anything?
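    If the list stays embedded (and is kept sorted on write), pages can be pulled straight from the server with a $slice projection, so embedding does not force the whole array to be loaded per request; references are mainly worth the extra round trips when the list grows without bound or items are shared between documents. A minimal PyMongo sketch, assuming a posts collection with an embedded comments list and a known post_id (Connection was the client class of that PyMongo era):

        from pymongo import Connection

        db = Connection().mydb
        page, per_page = 2, 10

        # Return only the requested slice of the embedded, pre-sorted list.
        doc = db.posts.find_one(
            {'_id': post_id},
            {'comments': {'$slice': [page * per_page, per_page]}})
        comments = doc.get('comments', []) if doc else []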

    Read the article
