Search Results

Search found 28 results on 2 pages for 'referenceproperty'.

Page 2 of 2

  • Python: why does this code take forever (infinite loop?)

    - by Rosarch
    I'm developing an app in Google App Engine. One of my methods never completes, which makes me think it's caught in an infinite loop. I've stared at it, but can't figure it out. Disclaimer: I'm using GAEUnit (http://code.google.com/p/gaeunit) to run my tests. Perhaps it's acting oddly? This is the problematic function:

        def _traverseForwards(course, c_levels):
            ''' Looks forwards in the dependency graph '''
            result = {'nodes': [], 'arcs': []}
            if c_levels == 0:
                return result
            model_arc_tails_with_course = set(_getListArcTailsWithCourse(course))
            q_arc_heads = DependencyArcHead.all()
            for model_arc_head in q_arc_heads:
                for model_arc_tail in model_arc_tails_with_course:
                    if model_arc_tail.key() in model_arc_head.tails:
                        result['nodes'].append(model_arc_head.sink)
                        result['arcs'].append(_makeArc(course, model_arc_head.sink))
                        # rec_result = _traverseForwards(model_arc_head.sink, c_levels - 1)
                        # _extendResult(result, rec_result)
            return result

    Originally, I thought it might be a recursion error, but I commented out the recursion and the problem persists. If this function is called with c_levels = 0, it runs fine. The models it references:

        class Course(db.Model):
            dept_code = db.StringProperty()
            number = db.IntegerProperty()
            title = db.StringProperty()
            raw_pre_reqs = db.StringProperty(multiline=True)
            original_description = db.StringProperty()

            def getPreReqs(self):
                return pickle.loads(str(self.raw_pre_reqs))

            def __repr__(self):
                return "%s %s: %s" % (self.dept_code, self.number, self.title)

        class DependencyArcTail(db.Model):
            ''' A list of courses that is a pre-req for something else '''
            courses = db.ListProperty(db.Key)

            def equals(self, arcTail):
                for this_course in self.courses:
                    if not (this_course in arcTail.courses):
                        return False
                for other_course in arcTail.courses:
                    if not (other_course in self.courses):
                        return False
                return True

        class DependencyArcHead(db.Model):
            ''' Maintains a course, and a list of tails with that course as their sink '''
            sink = db.ReferenceProperty()
            tails = db.ListProperty(db.Key)

    Utility functions it references:

        def _makeArc(source, sink):
            return {'source': source, 'sink': sink}

        def _getListArcTailsWithCourse(course):
            ''' returns a LIST, not SET; there may be duplicate entries '''
            q_arc_heads = DependencyArcHead.all()
            result = []
            for arc_head in q_arc_heads:
                for key_arc_tail in arc_head.tails:
                    model_arc_tail = db.get(key_arc_tail)
                    if course.key() in model_arc_tail.courses:
                        result.append(model_arc_tail)
            return result

    Am I missing something pretty obvious here, or is GAEUnit acting up?

    Read the article
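
    An aside, not part of the original question: the code above is probably not an infinite loop but a large number of datastore round-trips. _getListArcTailsWithCourse issues one db.get() per tail key for every DependencyArcHead, and _traverseForwards then scans every head again, so the work grows with heads x tails and can look like a hang. A minimal sketch, assuming the old google.appengine.ext.db API, of batching those per-key fetches (the helper name is hypothetical):

        # Sketch, not from the original post: db.get() accepts a list of keys,
        # so all tails of an arc head can be fetched in one datastore call.
        from google.appengine.ext import db

        def _getListArcTailsWithCourse_batched(course):
            ''' Same contract as the original helper: returns a LIST, duplicates possible. '''
            result = []
            course_key = course.key()
            for arc_head in DependencyArcHead.all():
                # One batched read for all tails of this head instead of one read per key.
                tails = db.get(arc_head.tails)
                for model_arc_tail in tails:
                    if model_arc_tail is not None and course_key in model_arc_tail.courses:
                        result.append(model_arc_tail)
            return result

    The batched helper returns the same list as before, but removes most of the per-key round-trips that dominate the runtime.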

  • How to get the key name of a referenced entity from an entity instance without a datastore read in Google App Engine?

    - by Sumeet Pareek
    Consider that I have the following models:

        class Team(db.Model):
            # say I have just 5 teams
            name = db.StringProperty()

        class Player(db.Model):
            # say I have thousands of players
            name = db.StringProperty()
            team = db.ReferenceProperty(Team, collection_name="player_set")

    The key name for each Team entity is 'team_' followed by a number, and for each Player entity 'player_' followed by a number. By some prior arrangement I have the Team entities' (key_name, name) mappings available to me, for example (team_01, United States Of America), (team_02, Russia), etc. I have to show all the players and their teams on a page. One way of doing this would be:

        players = Player.all().fetch(1000)  # This is 1 DB read
        for player in players:              # This will iterate 1000 times
            self.response.out.write(player.name)       # This is obviously not a DB read
            self.response.out.write(player.team.name)  # This is a total of 1 x 1000 = 1000 DB reads

    That is 1001 DB reads for a silly thing. The interesting part is that when I do db.to_dict() on players, it shows that for every player in that list the 'name' of the player and the 'key_name' of the team are available too. So how can I do the below?

        players = Player.all().fetch(1000)  # This is 1 DB read
        for player in players:              # This will iterate 1000 times
            self.response.out.write(player.name)  # This is obviously not a DB read
            # Here 'team_list' already has (key_name, name) for all 5 teams
            self.response.out.write(team_list[player.<SOME WAY OF GETTING TEAM KEY NAME>])

    I have been struggling with this for a long time and have read every piece of available documentation. I could just hug the person who can help me here :-) Disclaimer: the above problem description is not a real scenario; it is a simplified arrangement that represents my problem exactly. I have run into it in a rather complex and big GAE application.

    Read the article
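
    An aside, not part of the original question: in the old google.appengine.ext.db API, a ReferenceProperty stores the referenced entity's Key on the referencing entity itself, and the class-level property's get_value_for_datastore() returns that Key without fetching the Team. A minimal sketch under that assumption, reusing the question's Player model and 'team_list' mapping:

        # Sketch, not from the original post: read the stored Key off the entity
        # instead of dereferencing player.team (which triggers a datastore get).
        players = Player.all().fetch(1000)      # 1 DB read
        for player in players:
            self.response.out.write(player.name)
            team_key = Player.team.get_value_for_datastore(player)  # no DB read
            self.response.out.write(team_list[team_key.name()])     # key name, e.g. 'team_01'

    Key.name() returns the key name the Team entity was created with, so the team_list lookup happens entirely in memory.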

  • Optimizing tasks to reduce CPU in a trading application

    - by Joel
    Hello, I have designed a trading application that handles customers' stock investment portfolios. I am using two datastore kinds:

    - Stocks - contains a unique stock name and its daily percent change.
    - UserTransactions - contains information regarding a specific purchase of a stock made by a user: the value of the purchase, along with a reference to the Stock for that purchase.

    The db.Model Python classes:

        class Stocks(db.Model):
            stockname = db.StringProperty(multiline=True)
            dailyPercentChange = db.FloatProperty(default=1.0)

        class UserTransactions(db.Model):
            buyer = db.UserProperty()
            value = db.FloatProperty()
            stockref = db.ReferenceProperty(Stocks)

    Once an hour I need to update the database: update the daily percent change in Stocks and then update the value of all UserTransactions entities that refer to that stock. The following Python module iterates over all the stocks, updates the dailyPercentChange property, and enqueues a task that goes over all UserTransactions entities referring to the stock and updates their value:

    Stocks.py

        # Iterate over all stocks in the datastore
        for stock in Stocks.all():
            # update daily percent change in the datastore
            db.run_in_transaction(updateStockTxn, stock.key())
            # create a task to update all user transaction entities referring to this stock
            taskqueue.add(url='/task', params={'stock_key': str(stock.key()),
                                               'value': self.request.get('some_val_for_stock')})

        def updateStockTxn(stock_key):
            # fetch the stock again - necessary to avoid concurrent updates
            stock = db.get(stock_key)
            stock.dailyPercentChange = data.get('some_val_for_stock')  # I get this value from outside
            ... some more calculations here ...
            stock.put()

    Task.py (/task)

        # Amount of transactions per task
        amountPerCall = 10
        stock = db.get(self.request.get("stock_key"))
        # Get all user transactions which point to the current stock
        user_transaction_query = stock.usertransactions_set
        cursor = self.request.get("cursor")
        if cursor:
            user_transaction_query.with_cursor(cursor)
        # Spawn another task if more than 10 transactions are in the datastore
        transactions = user_transaction_query.fetch(amountPerCall)
        if len(transactions) == amountPerCall:
            taskqueue.add(url='/task', params={'stock_key': str(stock.key()),
                                               'value': self.request.get('some_val_for_stock'),
                                               'cursor': user_transaction_query.cursor()})
        # Iterate over all transactions pointing to the stock and update their value
        for transaction in transactions:
            db.run_in_transaction(updateUserTransactionTxn, transaction.key())

        def updateUserTransactionTxn(transaction_key):
            # fetch the transaction again - necessary to avoid concurrent updates
            transaction = db.get(transaction_key)
            transaction.value = transaction.value * self.request.get('some_val_for_stock')
            db.put(transaction)

    The problem: currently the system works great, but it is not scaling well. I have around 100 Stocks with 300 UserTransactions, and I run the update every hour. In the dashboard I see that Task.py takes around 65% of the CPU (Stocks.py takes around 20%-30%), and I am using almost all of the 6.5 free CPU hours App Engine gives me. I have no problem enabling billing and paying for additional CPU, but the real issue is how the system scales: using 6.5 CPU hours for 100 stocks is very poor. I was wondering, given the requirements described above, whether there is a better and more efficient implementation (or just a small change that would help the current implementation) than the one presented here. Thanks!! Joel

    Read the article
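
    An aside, not part of the original question: most of the task's CPU probably goes to running one transaction and one put per UserTransactions entity. A minimal sketch, assuming the old google.appengine.ext.db API, of updating each page of transactions with a single batched db.put() instead; the function name and the 'factor' parameter (standing in for the task's 'value' param) are illustrative, and dropping run_in_transaction trades away the per-entity concurrency protection of the original code:

        # Sketch, not from the original post: one batched write per page of
        # transactions instead of one transaction + put per entity.
        from google.appengine.ext import db

        def update_transaction_batch(stock_key, factor, cursor=None, batch_size=100):
            stock = db.get(stock_key)
            query = stock.usertransactions_set
            if cursor:
                query.with_cursor(cursor)
            transactions = query.fetch(batch_size)
            for transaction in transactions:
                transaction.value *= factor
            db.put(transactions)  # one batched write for the whole page
            # Return a cursor only if there may be more work to enqueue.
            return query.cursor() if len(transactions) == batch_size else None

    db.put() accepts a list of entities, so a page of 100 transactions costs one API call rather than 100 separate transactional writes, and larger pages also mean fewer task invocations per hourly run.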
