Search Results

Search found 4921 results on 197 pages for 'django transactions'.


  • Using set with values from a table

    - by gozzilli
    I'm writing a database of all the DVDs I have at home. I would like one of the fields, actors, to be a set of values taken from another table, which stores actors. So for every film I want to store a list of actors, each selected from the list of actors in that other table. Is this possible, and how do I do it? It would basically be a set of foreign keys. I'm using a MySQL database for a Django application (Python), so any hint in SQL or Python would be much appreciated. I hope the question is clear; many thanks.
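    A minimal sketch of one way to model this in Django, assuming hypothetical Film and Actor models: a ManyToManyField gives you exactly the "set of foreign keys" described, backed by a join table in MySQL.

        from django.db import models

        class Actor(models.Model):
            name = models.CharField(max_length=200)

        class Film(models.Model):
            title = models.CharField(max_length=200)
            # Django creates and manages the film<->actor join table automatically.
            actors = models.ManyToManyField(Actor, related_name='films')

        # Usage: film.actors.add(actor); film.actors.all() lists the film's actors.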

    Read the article

  • How do I get the position of a result in the list after an order_by?

    - by Bob Bob
    I'm trying to find an efficient way to find the rank of an object in the database relative to its score. My naive solution looks like this:

        rank = 0
        for q in Model.objects.all().order_by('score'):
            if q.name == 'searching_for_this':
                return rank
            rank += 1

    It should be possible to get the database to do the filtering, using order_by: Model.objects.all().order_by('score').filter(name='searching_for_this'). But there doesn't seem to be a way to retrieve the index from the order_by step after the filter. Is there a better way to do this? (Using Python/Django and/or raw SQL.) My next thought is to pre-compute ranks on insert, but that seems messy.
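    A sketch of letting the database compute the rank directly, assuming the name lookup returns a single row: the rank is simply the number of rows with a lower score.

        target = Model.objects.get(name='searching_for_this')
        # Rows scoring below the target; add a tie-breaking rule if scores can repeat.
        rank = Model.objects.filter(score__lt=target.score).count()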

    Read the article

  • Increasing speed of webservice - howto

    - by Koran
    Hi, our client-server product uses XML over HTTP as its protocol: the client sends a GET/POST request to the web server and the server responds with XML. The server is written in Django, and it has to be on the web because many clients across the world use it. The server code uses extensive memoization and makes very few DB queries - most requests hit the database zero times, and some at most once. The biggest problem is speed: every request takes close to 5 seconds to get a reply, even though the response payload is small, in the range of 4-6 KB. What are the mechanisms for improving the speed of the web service? Is this the usual way of writing a client-server system, or are there other technologies we are missing out on? Thank you, K
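    Before changing technologies, it can help to measure where the 5 seconds actually go. A minimal sketch of a timing middleware (old-style Django middleware API; the class and header names are my own) that stamps each response with how long the server side took:

        import time

        class RequestTimingMiddleware(object):
            def process_request(self, request):
                request._start_time = time.time()

            def process_response(self, request, response):
                if hasattr(request, '_start_time'):
                    # If this number is small but clients still see ~5s,
                    # the delay is in the network or the client, not Django.
                    response['X-Request-Duration'] = '%.3fs' % (time.time() - request._start_time)
                return response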

    Read the article

  • Parsing a list of dictionaries passed as a POST parameter

    - by andyashton
    I have a list of Python dictionaries that looks like this:

        sandwiches = [
            {'bread': 'wheat', 'topping': 'tomatoes', 'meat': 'bacon'},
            {'bread': 'white', 'topping': 'peanut butter', 'meat': 'bacon'},
            {'bread': 'sourdough', 'topping': 'cheese', 'meat': 'bacon'},
        ]

    I want to pass this as a POST parameter to another Django app. What does the receiving app need to do to iterate through the list? I want to do something like:

        for sandwich in request.POST['sandwiches']:
            print "%s on %s with %s is yummy!" % (sandwich['meat'], sandwich['bread'], sandwich['topping'])

    But I don't seem to have a list of dicts when my data arrives at the receiving app.
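    A sketch of one common approach, assuming a Python 2 / old-Django setup like the snippet above (the URL is hypothetical): JSON-encode the list into a single POST field on the sending side and decode it in the receiving view.

        import json
        import urllib
        import urllib2
        from django.http import HttpResponse

        # Sending side: one field, 'sandwiches', containing the JSON text.
        payload = urllib.urlencode({'sandwiches': json.dumps(sandwiches)})
        urllib2.urlopen('http://other-app.example.com/sandwiches/', payload)

        # Receiving Django view:
        def receive_sandwiches(request):
            sandwiches = json.loads(request.POST['sandwiches'])
            for sandwich in sandwiches:
                print "%s on %s with %s is yummy!" % (
                    sandwich['meat'], sandwich['bread'], sandwich['topping'])
            return HttpResponse('ok')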

    Read the article

  • Why can't I see any data in the Google App Engine *Development* Console?

    - by willem
    I run my Google App Engine application in one of two ways: directly, by using the application at http://localhost:8080, or by executing unit tests from http://localhost:8080/test. When I create entities by using the application directly, the data is visible in the Development Console (Datastore view). However, when I execute the unit tests - even if they succeed and I can put() and get() data - the data does not show up in the Datastore view. Any idea why I can't see my data, even though it is there? Notes: I use GAEUnit for unit tests. The data stored mostly consists of StringProperties. I use Python and run Django on top of GAE; I don't know if that matters.

    Read the article

  • What does this 'content_type' mean?

    - by zjm1126
        content_type = ContentType.objects.get_for_model(Map)
        maps = maps.extra(select=SortedDict([
            ('member_count', MEMBER_COUNT_SQL),
            ('topic_count', TOPIC_COUNT_SQL),
        ]), select_params=(content_type.id,))

    and the ContentType model is:

        class ContentType(models.Model):
            name = models.CharField(max_length=100)
            app_label = models.CharField(max_length=100)
            model = models.CharField(_('python model class name'), max_length=100)
            objects = ContentTypeManager()

            class Meta:
                verbose_name = _('content type')
                verbose_name_plural = _('content types')
                db_table = 'django_content_type'
                ordering = ('name',)
                unique_together = (('app_label', 'model'),)

            def __unicode__(self):
                return self.name

            def model_class(self):
                "Returns the Python model class for this type of content."
                from django.db import models
                return models.get_model(self.app_label, self.model)

            def get_object_for_this_type(self, **kwargs):
                """
                Returns an object of this type for the keyword arguments given.
                Basically, this is a proxy around this object_type's get_object() model
                method. The ObjectNotExist exception, if thrown, will not be caught,
                so code that calls this method should catch it.
                """
                return self.model_class()._default_manager.using(self._state.db).get(**kwargs)

            def natural_key(self):
                return (self.app_label, self.model)

    I want to know: what is 'content_type' used for?
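    For context, a sketch of what django.contrib.contenttypes is typically used for (model and field names below are hypothetical, Django 1.x-era imports): a ContentType row identifies a model class, so a row elsewhere can point at "an object of any model" via a generic foreign key. In the snippet above, get_for_model(Map) looks up the ContentType row for the Map model so that its id can be used as a parameter in the raw count subqueries.

        from django.db import models
        from django.contrib.contenttypes.models import ContentType
        from django.contrib.contenttypes import generic

        class Comment(models.Model):
            # Which model the comment is attached to...
            content_type = models.ForeignKey(ContentType)
            # ...and which row of that model.
            object_id = models.PositiveIntegerField()
            content_object = generic.GenericForeignKey('content_type', 'object_id')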

    Read the article

  • Keyword extraction in Python

    - by oliland
    I'm building a website in django that needs to extract key words from short (twitter-like) messages. I've looked at packages like topia.textextract and nltk - but both seem to be overkill for what I need to do. All I need to do is filter words like "and", "or", "not" while keeping nouns and verbs that aren't conjunctives or other parts of speech. Are there any "simpler" packages out there that can do this? EDIT: This needs to be done in near real-time on a production website, so using a keyword extraction service seems out of the question, based on their response times and request throttling.
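    A minimal sketch of the "simpler" route with no external packages: keep a stopword set and drop anything in it. The tiny word list below is only an illustration; a real list would be much longer, and true part-of-speech filtering would still need something like nltk.

        STOPWORDS = set(['and', 'or', 'not', 'the', 'a', 'an', 'is', 'are', 'to', 'of', 'in'])

        def extract_keywords(message):
            keywords = []
            for word in message.lower().split():
                word = word.strip('.,!?#@')
                if word.isalpha() and word not in STOPWORDS:
                    keywords.append(word)
            return keywords

        print extract_keywords("coffee and cake are not a bad idea")
        # ['coffee', 'cake', 'bad', 'idea']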

    Read the article

  • python interactive web data/forms/interface communicating with remote server

    - by decipher
    What's an efficient (and preferably simple) method for communicating with a remote server and allowing the user to interact with it via the web browser - e.g. submitting commands through a text box, with a text area for output, or through various command-less abstracted interfaces? I have the standalone Python code for the communication finished and working (terminal/console based right now). My primary concern is refactoring the code to suit the web, which involves establishing a connection (Python sockets) and maintaining that connection while the user is logged on. Some further details: I'm currently using the Django framework for the basic back end/templates.

    Read the article

  • IN statement performance in PostgreSQL (and in general)

    - by Vasil
    I know this has probably been asked before, but I can't find it with SO's search. Let's say I have TABLE1 and TABLE2. How should I expect the performance of a query such as this to behave as the number of rows in TABLE1 and TABLE2 grows, given that id is a primary key on TABLE1?

        SELECT * FROM TABLE1 WHERE id IN SUBQUERY_ON_TABLE2;

    Yes, I know using IN is such a n00b mistake, but TABLE2 has a generic relation (a Django generic relation) to multiple other tables, so I can't think of another way to filter the data. At approximately what number of rows in TABLE1 and TABLE2 should I expect to notice performance issues because of this? Will performance degrade linearly, exponentially, etc. depending on the number of rows?

    Read the article

  • measuring performance - using real clicks vs "ab" command

    - by shanyu
    I have a web site in closed beta, developed in Django, running with MySQL on Debian. In the last few days the main page has been showing a slowdown: for every ten clicks, one or two receive an extremely slow response (10 seconds or more), while the others are as fast as they used to be. While searching for the problem, I ran into an issue I couldn't grasp: top shows that when I request the main page, mysql shoots up to 90-100% CPU usage, and I get the page just as the CPU use drops back to normal. So I thought it was the DB. But when I ran ab with parameters -n 1000 -c 5, I got decent performance, about 100 pages per second, just as it was before the slowdown. I would have expected worse, given that 10-20% of real requests take 10 seconds to load. Is this conflict between ab and "real" clicks normal, or am I using ab with the wrong configuration?

    Read the article

  • Google App Engine Project hierarchy

    - by Ron
    Hey guys, I'm working on a Google App Engine project (with Django) and I just can't figure out a good practice for the folder hierarchy. I've looked at this: Project structure for Google App Engine, but one thing isn't clear - what if I have a static folder (e.g. JS files) that is unique to my app, not to the project? Where does it go? My current hierarchy is:

        proj
            static
                js
                css
            myapp
                templates

    So when a template inside my app sends a GET for js/script.js, this gets redirected to /myapp/js/script.js, which my server doesn't recognize. Here is my project urls.py:

        urlpatterns = patterns('',
            (r'^myapp/', include('myapp.urls')),
        )

    and here is my myapp/urls.py:

        urlpatterns = patterns('myapp.views',
            (r'^$', 'myapp.views.index'),
        )

    How should I rearrange this to make it work? Thanks!

    Read the article

  • basic unique ModelForm field for Google App Engine

    - by Alexander Vasiljev
    I do not care about concurrency issues. It is relatively easy to build a unique form field:

        from django import forms

        class UniqueUserEmailField(forms.CharField):
            def clean(self, value):
                self.check_uniqueness(super(UniqueUserEmailField, self).clean(value))

            def check_uniqueness(self, value):
                same_user = users.User.all().filter('email', value).get()
                if same_user:
                    raise forms.ValidationError('%s already_registered' % value)

    so one could add users on the fly. Editing an existing user is tricky: this field would not allow saving a user with another user's email, but it would also reject saving an existing user with their own, unchanged email (because it matches a stored user). What code do you use to put a field with a uniqueness check into a ModelForm?
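    A sketch of the usual pattern, using the App Engine djangoforms ModelForm and a hypothetical Member model (the names and query style mirror the snippet above and are not a definitive API): do the check in clean_email and skip the error when the email is unchanged on the instance being edited.

        from google.appengine.ext import db
        from google.appengine.ext.db import djangoforms
        from django import forms

        class Member(db.Model):            # hypothetical user model
            email = db.StringProperty()

        class MemberForm(djangoforms.ModelForm):
            class Meta:
                model = Member

            def clean_email(self):
                email = self.cleaned_data['email']
                # An edit that keeps the same email needs no uniqueness check.
                if self.instance is not None and self.instance.email == email:
                    return email
                if Member.all().filter('email =', email).get() is not None:
                    raise forms.ValidationError('%s already registered' % email)
                return email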

    Read the article

  • AppEngine dev_appserver.py aborts with no error message

    - by Gj
    I have an app which works well live on App Engine. However, when I try to run it locally with dev_appserver.py, it aborts within ~1 second with:

        ~/ dev_appserver.py --debug_imports myapp
        /opt/local/share/google_appengine/google/appengine/api/datastore_file_stub.py:40: DeprecationWarning: the md5 module is deprecated; use hashlib instead
          import md5
        /opt/local/share/google_appengine/google/appengine/api/memcache/__init__.py:31: DeprecationWarning: the sha module is deprecated; use the hashlib module instead
          import sha

    I'm on OS X 10.6.3, Python 2.6.4 + Django 1.1.1 + App Engine 1.3.1 (all installed via MacPorts). Any ideas? Thanks!

    Read the article

  • ProgrammingError: (1146, "Table 'test_<DB>.<TABLE>' doesn't exist") when running unit test for Django

    - by abigblackman
    I'm running a unit test using the Django framework and get this error. Running the actual code does not have this problem; running the unit tests creates a test database on the fly, so I suspect the issue lies there. The code that throws the error looks like this:

        member = Member.objects.get(email=email_address)

    and the model looks like:

        class Member(models.Model):
            member_id = models.IntegerField(primary_key=True)
            created_on = models.DateTimeField(editable=False, default=datetime.datetime.utcnow())
            flags = models.IntegerField(default=0)
            email = models.CharField(max_length=150, blank=True)
            phone = models.CharField(max_length=150, blank=True)
            country_iso = models.CharField(max_length=6, blank=True)
            location_id = models.IntegerField(null=True, blank=True)
            facebook_uid = models.IntegerField(null=True, blank=True)
            utc_offset = models.IntegerField(null=True, blank=True)
            tokens = models.CharField(max_length=3000, blank=True)

            class Meta:
                db_table = u'member'

    There's nothing too odd there that I can see. The user running the tests has the same permissions on the database server as the user that runs the website. Where else can I look to see what's going wrong? Why is this table not being created?

    Read the article

  • How do I track images embedded in HTML?

    - by ycseattle
    Hi, I'd like to track the views/impressions of images on web pages while still allowing the images to be embedded in HTML, like in an <img src="http://mysite.com/upload/myimage.jpg"/> element. I know that on Windows I can write a handler for ".jpg" so the URL triggers a handling function instead of loading the image from disk. Is it possible to do that in Python/Django on an Ubuntu server? Can the web browser still cache the JPG files if the URL is not a straight file path? It looks to me like this is how Google picasaweb handles the image file names. I'd like to get some ideas on how to implement this. Thanks! -Yi
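    A sketch of the Django-side equivalent of that handler (the URL pattern, tracking model, and upload directory are hypothetical): route the image URLs to a view that records the impression and then serves the bytes. Browsers can still cache the response if you send normal caching headers, since caching depends on the URL and headers, not on whether a real file sits behind it.

        import os
        from django.http import HttpResponse, Http404

        UPLOAD_DIR = '/srv/mysite/upload'   # hypothetical location of the images

        def tracked_image(request, filename):
            ImageImpression.objects.create(filename=filename)   # hypothetical tracking model
            path = os.path.join(UPLOAD_DIR, os.path.basename(filename))
            if not os.path.exists(path):
                raise Http404
            return HttpResponse(open(path, 'rb').read(), mimetype='image/jpeg')

        # urls.py: (r'^upload/(?P<filename>[\w.-]+)$', 'myapp.views.tracked_image'),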

    Read the article

  • Regex for finding valid sphinx fields

    - by mlissner
    I'm trying to validate that the fields given to Sphinx are valid, but I'm having difficulty. Imagine that the valid fields are cat, mouse, dog, puppy. Valid searches would then be:

        @cat search terms
        @(cat) search terms
        @(cat, dog) search term
        @cat searchterm1 @dog searchterm2
        @(cat, dog) searchterm1 @mouse searchterm2

    So I want to use a regular expression to find terms such as cat, dog, mouse in the above examples and check them against a list of valid terms. Thus, a query such as @(goat) would produce an error because goat is not a valid term. I've gotten to the point where I can find simple queries such as @cat with this regex:

        (?:@)([^( ]*)

    But I can't figure out how to find the rest. I'm using Python & Django, for what that's worth.
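    A sketch of one regex that handles both the @field form and the @(field, field) form, plus a whitelist check; the field list below just mirrors the example.

        import re

        VALID_FIELDS = set(['cat', 'mouse', 'dog', 'puppy'])
        # Matches either @(comma, separated, list) or a bare @field token.
        FIELD_RE = re.compile(r'@(?:\(([^)]*)\)|([^\s(]+))')

        def invalid_fields(query):
            names = []
            for grouped, single in FIELD_RE.findall(query):
                if single:
                    names.append(single)
                else:
                    names.extend(name.strip() for name in grouped.split(','))
            return [name for name in names if name not in VALID_FIELDS]

        print invalid_fields('@(cat, dog) searchterm1 @goat searchterm2')   # ['goat']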

    Read the article

  • Intentionally get a "MySQL server has gone away" error

    - by Jonathan
    I'm trying to cope with MySQL's "MySQL server has gone away" error in a Django environment. The quick workaround was to set the global wait_timeout MySQL variable to a huge value, but in the long run this would accumulate many open connections. I figured I would read the wait_timeout variable and poll the server at shorter intervals. After implementing this I tried to test it, but I am failing to get the error. I set global wait_timeout=15 and even set global interactive_timeout=15, but the connection refuses to disappear. I'm sure I'm polling the database at intervals longer than 15 seconds. What could be the reason I can't recreate this error?
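    A sketch for reproducing the error deliberately, assuming it runs inside one long-lived process (e.g. a ./manage.py shell session): the timeout only bites when a single already-open connection sits idle past wait_timeout and is then reused, so a setup that opens a fresh connection for each poll will never see it.

        import time
        from django.db import connection

        cursor = connection.cursor()
        cursor.execute("SELECT 1")    # the connection is now open
        time.sleep(20)                # idle longer than wait_timeout=15
        cursor.execute("SELECT 1")    # expected to raise OperationalError 2006,
                                      # "MySQL server has gone away"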

    Read the article

  • Avoiding dog-piling or thundering herd in a memcached expiration scenario

    - by Quintin Par
    I have the result of a query that is very expensive: it is the join of several tables plus a map-reduce job. The result is cached in memcached for 15 minutes. Once the cache expires, the queries are obviously run again and the cache is rewarmed, but at the point of expiration the thundering herd problem can happen. One way I fix this right now is to run a scheduled task that kicks in at the 14th minute, but somehow that looks very suboptimal to me. Another approach I like is nginx's proxy_cache_use_stale updating; mechanism: the web server continues to deliver the stale cache while a thread kicks in the moment expiration happens and updates the cache. Has anyone applied this to a memcached scenario, even though I understand it is a client-side strategy? If it helps, I use Django.
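    A sketch of a client-side version of that "serve stale while one worker refreshes" idea, assuming Django's cache API over memcached (run_expensive_query and the key names are hypothetical): store the value with a soft deadline inside an entry whose real memcached TTL is longer, and let only the request that wins an add() lock recompute.

        import time
        from django.core.cache import cache

        HARD_TTL = 20 * 60   # memcached evicts the entry after 20 minutes
        SOFT_TTL = 15 * 60   # we treat the value as stale after 15 minutes

        def get_report():
            entry = cache.get('expensive_report')
            if entry is None:
                value = run_expensive_query()                        # hypothetical
                cache.set('expensive_report', (value, time.time() + SOFT_TTL), HARD_TTL)
                return value
            value, soft_deadline = entry
            if time.time() > soft_deadline and cache.add('expensive_report_lock', 1, 60):
                # Only the request that won the lock recomputes; everyone else
                # keeps serving the stale value until the fresh one lands.
                value = run_expensive_query()
                cache.set('expensive_report', (value, time.time() + SOFT_TTL), HARD_TTL)
                cache.delete('expensive_report_lock')
            return value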

    Read the article

  • Best option for Google App Engine Datastore and external database?

    - by Alex
    I need to get an App Engine app talking to, and sharing data with, an external database. The best option I can come up with is exporting the external database's data to an XML file, then processing that in my App Engine app and storing it in the datastore. However, the data being shared is sensitive (such as login details), so dumping it to an XML file is not exactly a great idea. Is it possible for the App Engine app to query the database directly, or is there a secure way to use XML files? Oh, and I'm using Python/Django, and the external database will be hosted on another domain.

    Read the article

  • Counts of events grouped by date in python?

    - by Sologoub
    This is no doubt another noobish question, but I'll ask it anyway: I have a data set of events with exact datetimes in UTC. I'd like to create a line chart showing the total number of events per day (date) in a specified date range. Right now I can retrieve the full data set for the needed date range, but then I need to go through it and count the events for each date. The app is running on Google App Engine and uses Python. What is the best way to create a new data set of dates and corresponding counts (including dates with no events) that I can then pass to a Django template? The model for this example looks like this:

        class Event(db.Model):
            event_name = db.StringProperty()
            doe = db.DateTimeProperty()
            dlu = db.DateTimeProperty()
            user = db.UserProperty()

    Ideally, I want something with each date and the count for that date. Thanks, and please let me know if anything else is needed to answer this question!
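    A sketch of the counting step in plain Python, assuming the events for the range have already been fetched and that doe is the datetime to bucket on: pre-fill every date in the range with zero, then tally.

        import datetime

        def daily_counts(events, start_date, end_date):
            counts = {}
            day = start_date
            while day <= end_date:
                counts[day] = 0
                day += datetime.timedelta(days=1)
            for event in events:
                event_day = event.doe.date()
                if event_day in counts:
                    counts[event_day] += 1
            # Sorted list of (date, count) pairs, ready to hand to the template.
            return sorted(counts.items())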

    Read the article

  • Conventional Approaches for Passing Data to Back-End?

    - by Calvin
    Hi guys, I'm fairly new to web development, so please pardon the painfully newbie question that's about to follow. My computer science class group and I are developing a web application for class, which is built in Python (under Django) and uses jQuery on the front end. It's primarily an AJAX-ified application, and passing data from the backend to the front end is done through AJAX calls to specific URLs which return JSON. This is probably a stupid question, but what's the conventional approach for passing data in the opposite direction? We don't want to reload the page or anything, so is it an AJAX pass going the other way or something? Thanks in advance for your help!

    Read the article

  • Update Facebook Page's status using pyfacebook

    - by thornomad
    I am attempting to add functionality to my Django app: when a new post is approved, I want to automatically update the corresponding Facebook Page's status with a message and a link to the post - a basic status update. I have downloaded and installed pyfacebook, and I have read through the tutorial from Facebook. I have also seen this suggestion here on SO:

        import facebook

        fb = facebook.Facebook('YOUR_API_KEY', 'YOUR_SECRET_KEY')
        fb.auth.createToken()
        fb.login()  # THIS IS AS FAR AS I CAN GET
        fb.auth.getSession()
        fb.set_status('Checking out StackOverFlow.com')

    When I get to the login() call, however, pyfacebook tries to open lynx so I can log in to Facebook 'via the web'. This is obviously not going to work for me, because the system is supposed to be automated. I've been looking but can't find out how to keep this working from the script without having to log in via a web browser. Any ideas?

    Read the article

  • Serialize Dictionary with a string key and List[] value to JSON

    - by Patrick
    How can I serialize a Python dictionary to JSON and pass it back to JavaScript, where the key is a string and the value is a list (i.e. [])?

        if request.is_ajax() and request.method == 'GET':
            groupSet = GroupSet.objects.get(id=int(request.GET["groupSetId"]))
            groups = groupSet.groups.all()
            group_items = []       # list
            groups_and_items = {}  # dictionary
            for group in groups:
                group_items.extend([group_item for group_item in group.group_items.all()])
                # use group as the key and group_items (a list) as the value
                groups_and_items[group] = group_items
            data = serializers.serialize("json", groups_and_items)
            return HttpResponse(data, mimetype="application/json")

    The result:

        [{"pk": 5, "model": "myApp.group", "fields": {"name": "\u6fb4\u9584", "group_items": [13]}}]

    The group_items should contain many group_item entries, and each group_item should have a "name" rather than only its id (in this case the id is 13). I need to serialize the group name, as well as each group_item's id and name, as JSON and pass it back to JavaScript. I am new to Python and Django; please advise me if you have a better way to do this. Thank you so much. :)
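    A sketch of one way around this, assuming each group and group_item has a name field: django.core.serializers only understands querysets of model instances, so building a plain dict keyed by group name and dumping it with simplejson is often simpler.

        from django.http import HttpResponse
        from django.utils import simplejson

        def group_items_json(request):
            groupSet = GroupSet.objects.get(id=int(request.GET["groupSetId"]))
            groups_and_items = {}
            for group in groupSet.groups.all():
                groups_and_items[group.name] = [
                    {'id': item.id, 'name': item.name}
                    for item in group.group_items.all()
                ]
            return HttpResponse(simplejson.dumps(groups_and_items),
                                mimetype="application/json")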

    Read the article

  • Show output of a file on the client side using jQuery + JavaScript

    - by tazimk
    Hi, I have written some code in my view function. It reads a file from the server, stores the lines in a list, and passes them to the client:

        def showfiledata(request):
            f = open("/home/tazim/webexample/test.txt")
            list = f.readlines()
            return_dict = {'list': list}
            json = simplejson.dumps(list)
            return HttpResponse(json, mimetype="application/json")

    On the client side, the $.ajax callback function receives this list of lines. Now, my question: I have to display these lines in a textarea, but they should not all be displayed at once - each line should be appended to the textarea with some delay (using setInterval, as far as I know). I am using jQuery in my templates and the server is Django. Please provide some solution; some sample code would be quite helpful.

    Read the article

  • How to "select file" with Python script? . Google App Engine . Python .

    - by draconisthe0ry
    I'm trying to create an online application for a Python function I have written. In my script I pass in the path of a file on my computer (input_path = '/users/user/desktop/input.txt'), but I'm not sure how to go about this using Google App Engine. I have the choice between 3 templates: Flask, Django, and Bottle. I really do believe this question is relevant to people transitioning from scripts to web-based applications. Do I need to incorporate GUI stuff from Tkinter or something? There has to be a way to simply select a file to use as the input path interactively from a Python script.
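    A sketch of the web equivalent of choosing a local path, using Django (one of the template choices mentioned); the form field name and my_python_function are placeholders, and CSRF handling is omitted for brevity. The browser's file picker replaces the hard-coded input_path, and the view reads the uploaded bytes.

        from django.http import HttpResponse

        def upload(request):
            if request.method == 'POST':
                text = request.FILES['input_file'].read()   # contents of the chosen file
                result = my_python_function(text)           # the existing script logic
                return HttpResponse(result)
            return HttpResponse(
                '<form method="post" enctype="multipart/form-data">'
                '<input type="file" name="input_file"> <input type="submit" value="Run">'
                '</form>')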

    Read the article
