Search Results

Search found 15187 results on 608 pages for 'boost python'.


  • Limit calls to external database with Python CGI

    - by Matt Ball
    I've got a Python CGI script that pulls data from a GPS service; I'd like this information to be updated on the webpage about once every 10 seconds (the maximum allowed by the GPS service's TOS). But there could be, say, 100 users viewing the webpage at once, all calling the script. I think the users' scripts need to grab data from a buffer page that itself only updates once every ten seconds. How can I make this buffer page auto-update if no one is directly viewing the content (and therefore no one is hitting the CGI)? Are there better ways to accomplish this?

    Read the article
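
    A minimal sketch of the shared-cache approach, assuming the cache path and the fetch_gps_data() helper are placeholders rather than anything from the question: the first request after the 10-second window refreshes a cache file, and every other request reads the cached copy, so the upstream service is called at most once per interval no matter how many users are viewing.

        import json
        import os
        import time

        CACHE_PATH = "/tmp/gps_cache.json"   # hypothetical cache location
        MAX_AGE = 10                         # seconds allowed by the GPS service's TOS

        def fetch_gps_data():
            """Placeholder for the real call to the GPS service."""
            raise NotImplementedError

        def get_gps_data():
            # Refresh the cache only if it is missing or older than MAX_AGE;
            # every other request inside the window is served from disk.
            try:
                age = time.time() - os.path.getmtime(CACHE_PATH)
            except OSError:
                age = MAX_AGE + 1
            if age > MAX_AGE:
                data = fetch_gps_data()
                with open(CACHE_PATH, "w") as f:
                    json.dump(data, f)
            with open(CACHE_PATH) as f:
                return json.load(f)

    With lazy refreshing there is nothing to auto-update while nobody is viewing; the cache simply goes stale and is rebuilt on the next request.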

  • Update existing columns and rows within csv file using Python

    - by wilbev
    So I've been attempting to use the csv module in Python to add data to existing rows and columns, but only to specific columns of each row. For example, let's say my existing csv file has the following:

        id, name, city, age
        1, Ed,, 34
        2, Pat,, 23

    So basically the city of each person is missing, and I would like to update each row with that person's city. However, the writerow method only seems to replace the existing data within the csv file, and changing the open file to append mode just adds the data to a new row. Is there any way to skip the existing data and only add the city to each row? Thanks

    Read the article
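
    The csv module cannot edit a file in place, so one common pattern is to read every row, fill in the missing column, and write the whole file back out. A minimal sketch, assuming the city values come from a hypothetical cities lookup and the file is named people.csv:

        import csv

        cities = {"1": "Boston", "2": "Denver"}   # hypothetical id -> city lookup

        with open("people.csv", newline="") as f:
            rows = list(csv.DictReader(f, skipinitialspace=True))

        for row in rows:
            row["city"] = cities.get(row["id"], row["city"])

        with open("people.csv", "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["id", "name", "city", "age"])
            writer.writeheader()
            writer.writerows(rows)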

  • Scraping *.aspx content using Python

    - by tomato
    I'm having difficulty scraping a dynamically generated table in ASPX. I'm trying to scrape the gas prices from a site like this GasPrices page. I can extract all the information in the gas price table (address, time submitted, etc.) except for the actual gas price. Is there a way I could scrape the gas prices, i.e. somehow get a text representation of them? I'm not very familiar with ASP/ASPX, but whatever is being generated is not showing up in the final HTML. I'm using Python to do the scraping, but that's irrelevant unless there's a specific library...

    Read the article
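
    A first diagnostic step, sketched with requests and BeautifulSoup (the URL is a placeholder): dump what the server actually returns. If the price is not in the raw HTML, it is probably drawn as an image or injected by JavaScript, and no amount of HTML parsing will find it; you would then need to look at the image URLs or the page's XHR requests instead.

        import requests
        from bs4 import BeautifulSoup

        url = "http://example.com/gasprices.aspx"   # placeholder URL
        html = requests.get(url).text
        soup = BeautifulSoup(html, "html.parser")

        # Print every table cell so you can check whether the price text
        # is present in the server-rendered HTML at all.
        for cell in soup.find_all("td"):
            print(cell.get_text(strip=True))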

  • Maintaining Logging and/or stdout/stderr in Python Daemon

    - by dave mankoff
    Every recipe that I've found for creating a daemon process in Python involves forking twice (for Unix) and then closing all open file descriptors. (See http://www.jejik.com/articles/2007/02/a_simple_unix_linux_daemon_in_python/ for an example.) This is all simple enough, but I seem to have an issue. On the production machine that I am setting up, my daemon is aborting - silently, since all open file descriptors were closed. I am having a tricky time debugging the issue and am wondering what the proper way to catch and log these errors is. What is the right way to set up logging so that it continues to work after daemonizing? Do I just call logging.basicConfig() a second time after daemonizing? What's the right way to capture stdout and stderr? I am fuzzy on the details of why all the files are closed. Ideally, my main code could just call daemon_start(pid_file) and logging would continue to work.

    Read the article
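
    A minimal sketch of one way to keep logging alive, assuming the standard double-fork recipe that ends by pointing descriptors 0-2 at /dev/null (the log path is a placeholder): configure the logging module after daemonizing, and dup the log file over stdout/stderr so stray prints and uncaught tracebacks are captured too.

        import logging
        import os
        import sys

        def setup_daemon_logging(logfile="/var/log/mydaemon.log"):   # placeholder path
            # Call this *after* daemonizing, because daemonizing closed the old fds.
            logging.basicConfig(
                filename=logfile,
                level=logging.INFO,
                format="%(asctime)s %(levelname)s %(message)s",
            )
            # Redirect the process-level stdout/stderr descriptors into the same
            # file so print statements and interpreter tracebacks are not lost.
            out = open(logfile, "a")
            os.dup2(out.fileno(), sys.stdout.fileno())
            os.dup2(out.fileno(), sys.stderr.fileno())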

  • Python: Check if all dictionaries in list are empty

    - by Brant
    I have a list of dictionaries. I need to check if all the dictionaries in that list are empty, and I'm looking for a simple statement that will do it in one line. Is there a single-line way to do the following (not including the print)?

        l = [{},{},{}]  # this list is generated elsewhere...

        all_empty = True
        for i in l:
            if i:
                all_empty = False

        print all_empty

    I'm somewhat new to Python and don't know if there is a shorthand built-in way to check this. Thanks in advance.

    Read the article
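
    A one-liner that relies on empty dicts being falsy: any(l) is True only if at least one dict has contents, so its negation answers the question.

        l = [{}, {}, {}]          # generated elsewhere
        all_empty = not any(l)    # empty dicts are falsy
        print(all_empty)          # True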

  • StringListProperty limited to 500 char strings (Google App Engine / Python)

    - by MarcoB
    It seems that StringListProperty can only contain strings up to 500 chars each, just like StringProperty... Is there a way to store longer strings than that? I don't need them to be indexed or anything. What I need is something like a "TextListProperty", where each string in the list can be any length and is not limited to 500 chars. Can I create a property like that? Or can you experts suggest a different approach? Perhaps I should use a plain list and pickle/unpickle it in a Blob field, or something like that? I'm a bit new to Python and GAE and I would greatly appreciate some pointers instead of spending days on trial and error... Thanks!

    Read the article
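
    With the old db API a list of db.Text values behaves like the "TextListProperty" described here: db.Text is not indexed and is not subject to the 500-character limit. A minimal sketch (the model and property names are made up):

        from google.appengine.ext import db

        class Snippet(db.Model):                      # hypothetical model
            texts = db.ListProperty(db.Text)          # long, unindexed strings

        s = Snippet(texts=[db.Text(u"a string well over 500 characters..."),
                           db.Text(u"another long one")])
        s.put()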

  • delete all records except the id I have in a python list

    - by jay_t
    Hi all, I want to delete all records in a MySQL db except for the record ids I have in a list. The length of that list can vary and could easily contain 2000+ ids. Currently I convert my list to a string so it fits in something like this:

        cursor.execute("""delete from table where id not in (%s)""", (list))

    This doesn't feel right, and I have no idea how long the list is allowed to be. What's the most efficient way of doing this from Python? Altering the structure of the table with an extra field to mark/unmark records for deletion would be great but is not an option. Having a dedicated table storing the ids would indeed be helpful, since this could then just be done through a SQL query... but I would really like to avoid these options if possible. Thanks,

    Read the article
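
    Rather than formatting the ids into the SQL string yourself, you can build one placeholder per id and let MySQLdb do the quoting. A minimal sketch with made-up credentials and table name:

        import MySQLdb

        conn = MySQLdb.connect(host="localhost", user="u", passwd="p", db="mydb")
        cursor = conn.cursor()

        ids = [3, 7, 42]                                    # ids to keep, generated elsewhere
        placeholders = ",".join(["%s"] * len(ids))          # "%s,%s,%s"
        cursor.execute("DELETE FROM mytable WHERE id NOT IN (%s)" % placeholders, ids)
        conn.commit()

    The practical limit is MySQL's max_allowed_packet rather than the length of the IN list itself, so a couple of thousand ids is normally fine; for much larger lists, inserting the ids into a temporary table and deleting with a join scales better.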

  • Filtering Data in a Text File with Python

    - by YAS
    I'm new to Python (like Zygote new), and this is just to supplement another program. I have a text file that describes a group of items for a game, formatted like so:

        [1]
        Name=Blah
        Faction=Blahdiddly
        Cost=1000

        [2]
        Name=Meh
        Faction=MehMeh
        Cost=2000

        [3]
        Name=Lollypop
        Faction=Blahdiddly
        Cost=100

    I need to be able to find out which groups (the numbers in brackets) have matching values. So if I search for Faction=Blahdiddly, groups 1 and 3 should come up. I unfortunately have NO idea how to do this. Can anyone help?

    Read the article
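
    The file is already in INI format, so ConfigParser can read it and the search becomes a filter over sections. A minimal sketch, assuming the file is called items.txt:

        try:
            from configparser import ConfigParser    # Python 3
        except ImportError:
            from ConfigParser import ConfigParser    # Python 2

        def groups_matching(path, key, value):
            parser = ConfigParser()
            parser.read(path)
            return [section for section in parser.sections()
                    if parser.has_option(section, key)
                    and parser.get(section, key) == value]

        print(groups_matching("items.txt", "Faction", "Blahdiddly"))   # ['1', '3']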

  • python -> combinations of numbers and letters

    - by tekknolagi
        #!/usr/bin/python
        import random

        lower_a = ['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j', 'k', 'l', 'm', 'n', 'o', 'p', 'q', 'r', 's', 't', 'u', 'v', 'w', 'x', 'y', 'z']
        upper_a = ['A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z']
        num = ['0', '1', '2', '3', '4', '5', '6', '7', '8', '9']

        all = []
        all = " ".join("".join(lower_a) + "".join(upper_a) + "".join(num))
        all = all.split()

        x = 1
        c = 1
        while x < 10:
            y = []
            for i in range(c):
                a = random.choice(all)
                y.append(a)
            print "".join(y)
            x += 1
            c += 1

    What I have now outputs something like the following:

        5
        hE
        HAy
        1kgy
        Pt6JM
        2pFuCb
        Jv5osaX
        5q8PwWAO
        SvHWRKfI5

    How can I make it systematically go through every combination of letters (upper and lowercase) for a given length, then add 1 to that length and repeat the process?

    Read the article
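
    itertools.product with a growing repeat count enumerates every combination of a given length in order, then moves on to the next length. A short sketch (note that the number of combinations grows exponentially, so this will effectively never finish for large lengths):

        import itertools
        import string

        chars = string.ascii_lowercase + string.ascii_uppercase + string.digits

        for length in itertools.count(1):             # 1, 2, 3, ...
            for combo in itertools.product(chars, repeat=length):
                print("".join(combo))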

  • Python sqlite3 and concurrency

    - by RexE
    I have a Python program that uses the "threading" module. Once every second, my program starts a new thread that fetches some data from the web and stores it to my hard drive. I would like to use sqlite3 to store these results, but I can't get it to work. The issue seems to be with the following line:

        conn = sqlite3.connect("mydatabase.db")

    If I put this line of code inside each thread, I get an OperationalError telling me that the database file is locked. I guess this means that another thread has mydatabase.db open through a sqlite3 connection and has locked it. If I put this line of code in the main program and pass the connection object (conn) to each thread, I get a ProgrammingError saying that SQLite objects created in a thread can only be used in that same thread. Previously I was storing all my results in CSV files and did not have any of these file-locking issues. Hopefully this will be possible with sqlite. Any ideas?

    Read the article
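
    A common workaround is to give sqlite a single writer thread that owns the connection and feeds it through a Queue, so the fetch threads never touch the database directly. A minimal sketch (the table schema is made up):

        import queue
        import sqlite3
        import threading

        write_queue = queue.Queue()

        def db_writer():
            # The connection is created and used only in this thread, which
            # satisfies sqlite3's same-thread requirement.
            conn = sqlite3.connect("mydatabase.db")
            conn.execute("CREATE TABLE IF NOT EXISTS results (data TEXT)")
            while True:
                item = write_queue.get()
                if item is None:                      # sentinel to shut down
                    break
                conn.execute("INSERT INTO results (data) VALUES (?)", (item,))
                conn.commit()
            conn.close()

        threading.Thread(target=db_writer, daemon=True).start()

        # Fetch threads just enqueue whatever they downloaded.
        write_queue.put("some fetched data")

    Alternatively, sqlite3.connect(..., check_same_thread=False) plus your own locking, or opening a new connection per thread and retrying when the database is locked, can also work.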

  • Adding anchors to h2 in text using python and regexp

    - by Oli
    I'm trying to add anchors to all h2's in my HTML using Python. This code will add the anchors, but I need to fill in the name of each anchor too. Any idea how the name could be the number of the match in the loop, or a slugified version of the text between the h2 tags? Here's the code so far:

        regex = '(?P<name><h2>.*?</h2>)'
        text = re.sub(regex, "<a name=''/>" + r"\g<name>", text)

    Read the article
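
    re.sub accepts a function as the replacement, which makes it easy to number the anchors or slugify the heading text. A minimal sketch (the slug rule is just an example):

        import re

        def add_anchors(html):
            def repl(match):
                inner = re.sub(r"<[^>]+>", "", match.group("name"))   # text between the h2 tags
                slug = re.sub(r"[^a-z0-9]+", "-", inner.lower()).strip("-")
                return '<a name="%s"></a>%s' % (slug, match.group("name"))
            return re.sub(r"(?P<name><h2>.*?</h2>)", repl, html)

        print(add_anchors("<h2>First Section</h2><p>text</p><h2>Second!</h2>"))
        # <a name="first-section"></a><h2>First Section</h2><p>text</p><a name="second"></a><h2>Second!</h2>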

  • google app engine db.Model in python only display user-defined fields

    - by MattM
    I'm a Python newbie, so I apologize in advance if this question has been asked before. I am building out an application in GAE and need to generate a report that contains the values for a user-defined subset of fields. For example, in my db model, CrashReport, I have the following fields:

        entry_type
        entry_date
        instance_id
        build_id
        crash_text
        machine_info

    I present the user with the above list as a checkbox group, from which they select. Whichever fields the user selects, I then create a report showing all the values in the datastore, but only for the fields that they selected. For example, if from the above list the user selects the build_id and crash_text fields, the output might look like this:

        build_id    crash_text
        0.8.2       blown gasket
        0.8.2       boom!
        0.8.1       crack!
        ...

    So the question is, how exactly do I access only the values for the fields which the user has selected?

    Read the article
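
    Model properties can be read by name with getattr(), so the user's selection can drive the report directly. A minimal sketch against the old db API (the property types are guesses):

        from google.appengine.ext import db

        class CrashReport(db.Model):
            entry_type = db.StringProperty()
            entry_date = db.DateTimeProperty()
            instance_id = db.StringProperty()
            build_id = db.StringProperty()
            crash_text = db.TextProperty()
            machine_info = db.TextProperty()

        def report_rows(selected_fields):
            # selected_fields comes from the checkbox group, e.g. ['build_id', 'crash_text']
            return [[getattr(entity, field) for field in selected_fields]
                    for entity in CrashReport.all()]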

  • How to integrate Python scripting in my Android App (like SL4A)

    - by Seraphim's host
    I need to add a scripting layer to my Android app, so that I can remotely prepare a script that my app downloads from a web service and executes on the user's device. I found an interesting project called Scripting Layer for Android (SL4A) here: http://code.google.com/p/android-scripting/ I'm not sure I can execute Python scripts without installing PythonForAndroid_r4.apk first, and I can't force my customers to install that application! So my question is: can the SL4A layer be integrated into my app without the need to install another apk? I need to execute actions like updating data in the DB and creating/reading/deleting a file on the SD card... Nothing too complex, but I see SL4A can do a lot of things like these. Are there other scripting libraries? EDIT: I also found MVEL: http://mvel.codehaus.org/ but I think it would need to be integrated to execute complex operations like accessing a DB...

    Read the article

  • making errorbars not clipped in matplotlib with Python

    - by user248237
    I am using matplotlib in Python to plot a line with error bars as follows:

        plt.errorbar(xvalues, up_densities, yerr=ctl_sds, fmt='-^', lw=1.2,
                     markersize=markersize, markeredgecolor=up_color,
                     color=up_color, label="My label", clip_on=False)
        plt.xticks(xvalues)

    I set the ticks on the x-axis using xticks. However, the error bars of the last point in xvalues (i.e. xvalues[-1]) are clipped on the right, meaning only half an error bar appears. This is true even with the clip_on=False option. How can I fix this, so that the error bars appear in full even though their right side is technically outside xvalues[-1]? Thanks.

    Read the article
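
    clip_on in the errorbar() call appears to only reach the main data line; the caps and bar segments are separate artists that have to be unclipped individually (adding a little margin on the x-axis also works). A minimal sketch with made-up data:

        import matplotlib.pyplot as plt

        xvalues = [1, 2, 3, 4]                      # example data, not from the question
        up_densities = [0.2, 0.4, 0.5, 0.9]
        ctl_sds = [0.05, 0.1, 0.1, 0.2]

        plotline, caplines, barlinecols = plt.errorbar(
            xvalues, up_densities, yerr=ctl_sds, fmt='-^', lw=1.2)

        # Unclip the error-bar caps and the bar line collections themselves.
        for artist in list(caplines) + list(barlinecols):
            artist.set_clip_on(False)

        plt.margins(0.05)      # or: give the last point some breathing room
        plt.xticks(xvalues)
        plt.show()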

  • Calling Google's Custom Search API via Python

    - by user353829
    I am writing a Python module that will query Google's Custom Search API and return all listings for the domain 'example.com'. I have been reading the instructions at https://code.google.com/apis/customsearch/v1/getting_started.html and am a little stumped at the moment. Are my assumptions listed below correct? For example, to search for results that have 'example.com' in the URL, the query is:

        https://www.googleapis.com/customsearch/v1?key=my_key&cx=017576662512468239146:omuauf_lfve&q=site:example.com

    key=my_key: value of the key given by Google
    cx=017576662512468239146: name of the search engine (Google)?
    omuauf_lfve: I have no idea what this is
    q=site:example.com: this should return all results with 'example.com', e.g. www.a.example.com, b.example.com, example.com

    Read the article
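
    A minimal sketch of calling the JSON API directly (the key is a placeholder; the cx value is the search-engine ID from the question, and the part after the colon belongs to that ID rather than being a separate parameter). Results come back under "items", and the "start" parameter pages through them.

        import json
        import urllib.request
        from urllib.parse import urlencode

        API_KEY = "my_key"                                # placeholder
        CX = "017576662512468239146:omuauf_lfve"          # search engine ID from the question

        def search_site(domain, start=1):
            params = urlencode({
                "key": API_KEY,
                "cx": CX,
                "q": "site:%s" % domain,
                "start": start,                           # 1-based index for paging
            })
            url = "https://www.googleapis.com/customsearch/v1?" + params
            with urllib.request.urlopen(url) as resp:
                return json.loads(resp.read().decode("utf-8"))

        for item in search_site("example.com").get("items", []):
            print(item["link"])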

  • unittest in python: ignore an import from the code I want to test

    - by vaidab
    I have a Python program that imports pythoncom (and uses pythoncom.CoCreateInstance from it). I want to create a unit test for the program logic without it importing pythoncom (so I can run the test on Linux as well). What options are there? Can I do it without modifying the system under test? What I found so far:

        sys.modules["pythoncom"] = "test"
        import module_that_imports_pythoncom

    My problem with this is that if I have:

        from pythoncom.something import something

    I'll get:

        ImportError: No module named something.something

    and neither sys.modules["something.something"] nor sys.modules["pythoncom.something.something"] works. Any ideas?

    Read the article
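
    The trick is to register real module objects (not strings) for both pythoncom and the submodule path before the system under test is imported. A minimal sketch using types.ModuleType; the attribute names mirror the question:

        import sys
        import types

        fake_pythoncom = types.ModuleType("pythoncom")
        fake_sub = types.ModuleType("pythoncom.something")
        fake_sub.something = object()                     # whatever gets imported from it
        fake_pythoncom.something = fake_sub
        fake_pythoncom.CoCreateInstance = lambda *args, **kwargs: None

        sys.modules["pythoncom"] = fake_pythoncom
        sys.modules["pythoncom.something"] = fake_sub

        import module_that_imports_pythoncom              # now picks up the fakes

    On Python 3, unittest.mock.patch.dict(sys.modules, {...}) wraps the same idea with automatic cleanup.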

  • limiting the rate of emails using python

    - by Ali
    I have a Python script which reads email addresses from a database for a particular date (for example, today) and sends out an email message to them one by one. It reads the data from MySQL using the MySQLdb module, stores all results in a dictionary, and sends out the emails using:

        rows = cursor.fetchall()  # all email addresses that are supposed to go out on today's date
        for row in rows:
            # send email

    However, my hosting service only lets me send out 500 emails per hour. How can I make my script ensure that only 500 emails are sent in an hour, then check the database for whether more emails are left for today, and send those in the next hour? The script is activated using a cron job.

    Read the article
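
    Since cron already triggers the script, the simplest approach is to make each hourly run self-limiting: fetch at most 500 unsent addresses, send them, and mark them sent so the next run picks up the remainder. A minimal sketch with made-up credentials, table, and column names:

        import MySQLdb

        MAX_PER_HOUR = 500

        def send_email(address):
            """Placeholder for the existing send routine."""

        conn = MySQLdb.connect(host="localhost", user="u", passwd="p", db="mail")  # placeholders
        cursor = conn.cursor()

        cursor.execute(
            "SELECT id, email FROM recipients "
            "WHERE send_date = CURDATE() AND sent = 0 LIMIT %s", (MAX_PER_HOUR,))
        for row_id, address in cursor.fetchall():
            send_email(address)
            cursor.execute("UPDATE recipients SET sent = 1 WHERE id = %s", (row_id,))
            conn.commit()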

  • Python Regular Expression TypeError

    - by spaghettiwestern
    I am writing my first Python program and I am running into a problem with regex. I am using a regular expression to search for a specific value in a registry key.

        import _winreg
        import re

        key = _winreg.OpenKey(_winreg.HKEY_LOCAL_MACHINE, "Software\\Microsoft\\Windows\\CurrentVersion\\Uninstall\\{26A24AE4-039D-4CA4-87B4-2F83216020FF}")
        results = []
        v = re.compile(r"(?i)Java")
        try:
            i = 0
            while 1:
                name, value, type = _winreg.EnumValue(key, i)
                if v.search(value):
                    results.append((name, value, type))
                i += 1
        except WindowsError:
            print

        for x in results:
            print "%-50s%-80s%-20s" % x

    I am getting the following error:

        exceptions.TypeError: expected string or buffer

    I can use the "name" variable and my regex works fine. For example, if I make the following changes, regex doesn't complain:

        v = re.compile(r"(?i)DisplayName")
        if v.search(name):

    Thanks for any help.

    Read the article
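
    The likely cause is that registry values are not always strings: _winreg.EnumValue returns an int for REG_DWORD values and raw bytes for REG_BINARY, and re.search() only accepts strings, hence "expected string or buffer" on value but not on name. A sketch of the loop with a type guard, kept in the question's Python 2 style:

        import _winreg
        import re

        key = _winreg.OpenKey(
            _winreg.HKEY_LOCAL_MACHINE,
            "Software\\Microsoft\\Windows\\CurrentVersion\\Uninstall"
            "\\{26A24AE4-039D-4CA4-87B4-2F83216020FF}")
        pattern = re.compile(r"(?i)Java")
        results = []
        i = 0
        try:
            while 1:
                name, value, vtype = _winreg.EnumValue(key, i)
                # Skip (or str()-convert) anything that is not a string before matching.
                if isinstance(value, basestring) and pattern.search(value):
                    results.append((name, value, vtype))
                i += 1
        except WindowsError:
            pass

        for name, value, vtype in results:
            print "%-50s%-80s%-20s" % (name, value, vtype)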

  • Looking for python lib to manage remote tasks

    - by Riz
    Hi, I have a server with Django on it; this server runs some manage.py commands and updates the database. Now I need to move some of these tasks to different servers. I don't want to allow remote db access, and I need some tool/lib that lets the main server start tasks on remote servers on command, as well as update task code and add new tasks. I have ssh access to every server, all servers run Debian, and all the code is in Python. I was thinking about creating my own XMPP-based solution (the main server sends messages to slave servers with commands to execute, like "update task" or "run task"), or maybe some low-level ssh-based solution where the main server logs in to the slave servers and executes bash commands. But I would be happy to hear any advice.

    Read the article
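
    Fabric (ssh-based task execution) and Celery (queue-based remote workers) are the usual libraries for this. Since ssh access is already in place, a dependency-free sketch is also possible; the host names, project path, and command are placeholders:

        import subprocess

        SLAVES = ["worker1.example.com", "worker2.example.com"]   # placeholder hosts

        def run_remote(host, command):
            # Relies on key-based ssh authentication already being set up.
            return subprocess.check_output(["ssh", host, command])

        def update_task_code(host):
            return run_remote(host, "cd /srv/myproject && git pull")

        def run_task(host, task_name):
            return run_remote(host, "cd /srv/myproject && python manage.py %s" % task_name)

        for host in SLAVES:
            update_task_code(host)
            print(run_task(host, "my_management_command"))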

  • autocomplete-like feature with a python dict

    - by tipu
    In PHP, I had this line:

        matches = preg_grep('/^for/', array_keys($hash));

    What it would do is grab the words fork, form, etc. that are keys in $hash. In Python, I have a dict with 400,000 words. Its keys are words I'd like to present in an autocomplete-like feature (the values in this case are meaningless). How would I be able to return the keys from my dictionary that match the input? For example (as used earlier), if I have

        my_dict = {"fork": True, "form": True, "fold": True, "fame": True}

    and I get the input "for", it'll return a list of "fork", "form", "fold".

    Read the article
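
    The direct equivalent of preg_grep over the keys is a comprehension with str.startswith. A minimal sketch; for 400,000 words, a sorted key list searched with the bisect module (or a trie) avoids scanning the whole dict on every keystroke.

        my_dict = {"fork": True, "form": True, "fold": True, "fame": True}

        def complete(prefix, words=my_dict):
            return [word for word in words if word.startswith(prefix)]

        print(complete("for"))      # ['fork', 'form']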

  • Creating interruptible process in python

    - by Glycerine
    I'm creating a Python script which parses a large (but simple) CSV. It'll take some time to process. I would like the ability to interrupt the parsing of the CSV so I can continue at a later stage. Currently I have this - it lives in a larger class: (unfinished)

    Edit: I have some changed code, but the system will parse over 3 million rows.

        def parseData(self):
            reader = csv.reader(open(self.file))
            for id, title, disc in reader:
                print "%-5s %-50s %s" % (id, title, disc)
                l = LegacyData()
                l.old_id = int(id)
                l.name = title
                l.disc_number = disc
                l.parsed = False
                l.save()

    This is the old code:

        def parseData(self):
            # first line start
            fields = self.data.next()
            for row in self.data:
                items = zip(fields, row)
                item = {}
                for (name, value) in items:
                    item[name] = value.strip()
                self.save(item)

    Thanks guys.

    Read the article
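
    One way to make the parse interruptible is to checkpoint a row counter to disk and skip that many rows on the next run, catching KeyboardInterrupt so a manual stop also saves progress. A minimal sketch; the checkpoint file name and the process_row callback are placeholders:

        import csv
        import itertools
        import os

        PROGRESS_FILE = "parse_progress.txt"            # hypothetical checkpoint file

        def load_progress():
            if os.path.exists(PROGRESS_FILE):
                with open(PROGRESS_FILE) as f:
                    return int(f.read().strip() or 0)
            return 0

        def save_progress(row_number):
            with open(PROGRESS_FILE, "w") as f:
                f.write(str(row_number))

        def parse(path, process_row):
            start = load_progress()
            row_number = start
            with open(path) as f:
                reader = csv.reader(f)
                try:
                    # Skip rows handled by a previous run, then continue from there.
                    for row_number, row in enumerate(itertools.islice(reader, start, None), start):
                        process_row(row)
                        if row_number % 1000 == 0:
                            save_progress(row_number + 1)   # periodic checkpoint
                except KeyboardInterrupt:
                    save_progress(row_number + 1)           # remember where we stopped
                    raise
            save_progress(0)                                # finished: reset for the next run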

  • Extract substructure from a text file using bash or python

    - by Werner
    Hi, I have a huge text file which follows this structure:

        SET
        TAG1
        ...
        ...
        SET
        ...
        SET
        TAG2
        ...
        ...
        SET
        ...
        ...

    I would like to extract, for a specific TAG (i.e. TAG54), its individual "substructure", which would be:

        SET
        TAG54
        ...
        ...
        SET

    Each substructure for a given TAG_i always contains:

        first line: SET
        second line: TAG_i (in this case TAG54)
        an arbitrary number of lines
        last line: SET

    I wonder what would be the best way to do this, whether in bash or Python, so that for a given TAG one can "extract" this substructure. Thanks

    Read the article
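
    A minimal Python sketch of the extraction, assuming each substructure starts with a line reading SET, followed by the TAG line, and ends at the next SET line (the file name is a placeholder):

        def extract(path, tag):
            """Return the lines of the substructure whose second line is `tag`."""
            block = []
            inside = False
            with open(path) as f:
                for line in f:
                    line = line.rstrip("\n")
                    if inside:
                        block.append(line)
                        if line == "SET":               # closing SET ends the block
                            return block
                    elif line == "SET":
                        block = [line]                  # possible start of a substructure
                    elif block and line == tag:
                        block.append(line)              # confirmed: this is the TAG we want
                        inside = True
                    else:
                        block = []                      # some other TAG; discard
            return None

        print("\n".join(extract("huge.txt", "TAG54") or []))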

  • Send files between python and C#

    - by SuitUp
    Hi, I would like to know the best way to send files between Python and C#, and vice versa. I have my own protocol which works at the socket level, and I can send strings and numbers in both directions; loops work too. With this I can send pretty much anything, like a package of user ids, as long as it is simple data. But soon I will start sending whole files, maybe XML or executables. A simple file server is not an option because I want to send files from the client too. I was thinking about serialization, but I don't know if it is the best solution; if it is, I would love some tips from the stackoverflow community.

    Read the article
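
    A simple framing that both sides can implement is a length prefix followed by the raw bytes: Python can build the prefix with struct, and the C# side can mirror it with a BinaryReader/BinaryWriter over the NetworkStream. A minimal sketch of the Python half (the 8-byte big-endian size field is just one possible convention):

        import os
        import struct

        def send_file(sock, path):
            # 8-byte big-endian size, then the file's raw bytes.
            sock.sendall(struct.pack("!Q", os.path.getsize(path)))
            with open(path, "rb") as f:
                while True:
                    chunk = f.read(64 * 1024)
                    if not chunk:
                        break
                    sock.sendall(chunk)

        def recv_exactly(sock, n):
            data = b""
            while len(data) < n:
                more = sock.recv(n - len(data))
                if not more:
                    raise ConnectionError("socket closed mid-transfer")
                data += more
            return data

        def recv_file(sock, path):
            remaining = struct.unpack("!Q", recv_exactly(sock, 8))[0]
            with open(path, "wb") as f:
                while remaining:
                    chunk = sock.recv(min(64 * 1024, remaining))
                    if not chunk:
                        raise ConnectionError("socket closed mid-transfer")
                    f.write(chunk)
                    remaining -= len(chunk)

    This works for any payload (XML, executables) because nothing is interpreted; serialization formats only become necessary when structured objects, rather than whole files, need to cross the boundary.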

  • Using try vs if in python

    - by artdanil
    Is there a rationale for deciding which of the try or if constructs to use when testing whether a variable has a value? For example, there is a function that returns either a list or no value. I want to check the result before processing it. Which of the following would be more preferable, and why?

        result = function();
        if (result):
            for r in result:
                #process items

    or

        result = function();
        try:
            for r in result:
                #process items
        except TypeError:
            pass;

    Related discussion: Checking for member existence in Python

    Read the article
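
    A small illustration of the trade-off, with a stand-in function that returns a list or None as in the question: the if version (LBYL) reads well when a missing result is common, while the try version (EAFP) is idiomatic when it is exceptional, but except TypeError will also swallow unrelated TypeErrors raised inside the loop body.

        def function(flag):
            # Hypothetical stand-in: returns a list or None.
            return [1, 2, 3] if flag else None

        result = function(False)

        # LBYL: test first.
        if result:
            for r in result:
                print(r)

        # EAFP: just try it, handle the failure.
        try:
            for r in result:
                print(r)
        except TypeError:
            pass

        # A third option that sidesteps the check entirely:
        for r in result or []:
            print(r)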

  • getting smallest of coordinates that differ by N or more in Python

    - by user248237
    Suppose I have a list of coordinates:

        data = [[(10, 20), (100, 120), (0, 5), (50, 60)],
                [(13, 20), (300, 400), (100, 120), (51, 62)]]

    and I want to take all tuples that either appear in each list in data, or differ from all tuples in lists other than their own by 3 or less. How can I do this efficiently in Python? For the above example, the results should be:

        [[(100, 120),           # since it occurs in both lists
          (10, 20), (13, 20),   # since they differ by only 3
          (50, 60), (51, 62)]]

    (0, 5) and (300, 400) would not be included, since they don't appear in both lists and do not differ from elements in lists other than their own by 3 or less. How can this be computed? Thanks.

    Read the article
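
    A minimal sketch, reading "differ by 3 or less" as both coordinates being within 3, which matches the (10, 20)/(13, 20) and (50, 60)/(51, 62) pairs in the example:

        data = [[(10, 20), (100, 120), (0, 5), (50, 60)],
                [(13, 20), (300, 400), (100, 120), (51, 62)]]

        def close(a, b, tol=3):
            return abs(a[0] - b[0]) <= tol and abs(a[1] - b[1]) <= tol

        def matching_points(data, tol=3):
            keep = []
            for i, points in enumerate(data):
                others = [p for j, lst in enumerate(data) if j != i for p in lst]
                for point in points:
                    if (point in others or any(close(point, o, tol) for o in others)) \
                            and point not in keep:
                        keep.append(point)
            return keep

        print(matching_points(data))
        # [(10, 20), (100, 120), (50, 60), (13, 20), (51, 62)]

    For large inputs, sorting the tuples or bucketing them into a grid with cell size tol avoids the quadratic all-pairs comparison.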
