Search Results

Search found 9788 results on 392 pages for 'character limit'.

Page 206/392 | < Previous Page | 202 203 204 205 206 207 208 209 210 211 212 213  | Next Page >

  • Is there a max size for POST parameter content?

    - by l3dx
    I'm troubleshooting a Java app where XML is sent between two systems using HTTP POST and a servlet. I suspect the problem is that the XML is growing far too big. Is it possible that this is the problem? Is there a limit? When it doesn't work, request.getParameter("message") on the consumer side returns null. Both apps are running on Tomcat.
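
    A likely place to look, assuming a stock Tomcat configuration: the HTTP connector's maxPostSize attribute caps the POST body that Tomcat will parse into request parameters (2 MB by default), and when the cap is exceeded getParameter() quietly returns null rather than throwing. A minimal server.xml sketch that raises the cap (the port and timeout values are just placeholders):

        <!-- maxPostSize is in bytes; -1 disables the check entirely. -->
        <Connector port="8080" protocol="HTTP/1.1"
                   connectionTimeout="20000"
                   maxPostSize="10485760" />

    If the XML really is the payload, sending it as the raw request body (request.getInputStream()) instead of a form parameter also sidesteps the parameter-parsing limit.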

    Read the article

  • Limiting the size of a python dictionary

    - by anthony
    I'd like to work with a dict in Python, but limit the number of key/value pairs to X. In other words, if the dict is currently storing X key/value pairs and I perform an insertion, I would like one of the existing pairs to be dropped. It would be nice if it were the least recently inserted/accessed key, but that's not completely necessary. If this exists in the standard library, please save me some time and point it out!
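
    There is no bounded dict in the standard library as such, but collections.OrderedDict remembers insertion order, which makes "drop the oldest pair on overflow" a few lines of subclassing. A sketch, assuming evicting the least recently inserted key is acceptable (the class name is made up):

        from collections import OrderedDict

        class BoundedDict(OrderedDict):
            """A dict that evicts its oldest entry once it exceeds max_items pairs."""

            def __init__(self, max_items, *args, **kwargs):
                self.max_items = max_items
                super().__init__(*args, **kwargs)

            def __setitem__(self, key, value):
                super().__setitem__(key, value)
                if len(self) > self.max_items:
                    self.popitem(last=False)  # last=False pops the least recently inserted key

        d = BoundedDict(2)
        d["a"], d["b"], d["c"] = 1, 2, 3
        print(list(d.items()))  # [('b', 2), ('c', 3)]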

    Read the article

  • Need MYSQL query for finding lowest score per game player

    - by Chris Barnhill
    I have a game on Facebook called Rails Across Europe. I have a Best Scores page where I show the players with the best 20 scores, which in game terms means the lowest winning turn. The problem is that a small number of players play frequently, and their scores dominate the page. I'd like to open the scores page up to more players, so I thought I could display the single lowest winning turn for each player instead of displaying all of the lowest winning turns across all players. The problem is that the query for this eludes me, so I hope one of you brilliant StackOverflow folks can help me. I have included the relevant MySQL table schemas below.

    Here are the table relationships: player_stats contains statistics for either a game in progress or a completed game. If a game is in progress, winning_turn is zero (which means that games with a winning_turn of zero should not be included in the query). player_stats has a game_player table id reference. game_player contains data describing games currently in progress; game_player has a player table id reference. player contains data describing a person who plays the game.

    Here's the query I'm currently using:

        'SELECT p.fb_user_id, ps.winning_turn, gp.difficulty_level,
                c.name as city_name, g.name as goods_name, d.cost
         FROM game_player as gp, player as p, player_stats as ps,
              demand as d, city as c, goods as g
         WHERE p.status = "ACTIVE"
           AND gp.player_id = p.id
           AND ps.game_player_id = gp.id
           AND d.id = ps.highest_demand_id
           AND c.id = d.city_id
           AND g.id = d.goods_id
           AND ps.winning_turn > 0
         ORDER BY ps.winning_turn ASC, d.cost DESC
         LIMIT '.$limit.';';

    Here are the relevant table schemas:

        -- Table structure for table `player_stats`
        CREATE TABLE IF NOT EXISTS `player_stats` (
          `id` int(11) NOT NULL auto_increment,
          `game_player_id` int(11) NOT NULL,
          `winning_turn` int(11) NOT NULL,
          `highest_demand_id` int(11) NOT NULL,
          PRIMARY KEY (`id`),
          KEY `game_player_id` (`game_player_id`,`highest_demand_id`)
        ) ENGINE=InnoDB DEFAULT CHARSET=utf8 AUTO_INCREMENT=3814 ;

        -- Table structure for table `game_player`
        CREATE TABLE IF NOT EXISTS `game_player` (
          `id` int(10) unsigned NOT NULL auto_increment,
          `game_id` int(10) unsigned NOT NULL,
          `player_id` int(10) unsigned NOT NULL,
          `player_number` int(11) NOT NULL,
          `funds` int(10) unsigned NOT NULL,
          `turn` int(10) unsigned NOT NULL,
          `difficulty_level` enum('STANDARD','ADVANCED','MASTER','ULTIMATE') NOT NULL,
          `date_last_used` datetime NOT NULL,
          PRIMARY KEY (`id`),
          KEY `game_id` (`game_id`,`player_id`)
        ) ENGINE=InnoDB DEFAULT CHARSET=utf8 AUTO_INCREMENT=3814 ;

        -- Table structure for table `player`
        CREATE TABLE IF NOT EXISTS `player` (
          `id` int(11) NOT NULL auto_increment,
          `fb_user_id` char(255) NOT NULL,
          `fb_proxied_email` text NOT NULL,
          `first_name` char(255) NOT NULL,
          `last_name` char(255) NOT NULL,
          `birthdate` date NOT NULL,
          `date_registered` datetime NOT NULL,
          `date_last_logged_in` datetime NOT NULL,
          `status` enum('ACTIVE','SUSPENDED','CLOSED') NOT NULL,
          PRIMARY KEY (`id`),
          KEY `fb_user_id` (`fb_user_id`)
        ) ENGINE=InnoDB DEFAULT CHARSET=utf8 AUTO_INCREMENT=1646 ;
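
    One way to get a single best (lowest) winning turn per player is a plain GROUP BY over the joined tables. A sketch against the schemas above; it deliberately leaves out the demand/city/goods columns, which would need an extra join back to the specific row that produced each player's minimum:

        SELECT p.fb_user_id, MIN(ps.winning_turn) AS best_turn
        FROM player AS p
        JOIN game_player AS gp ON gp.player_id = p.id
        JOIN player_stats AS ps ON ps.game_player_id = gp.id
        WHERE p.status = 'ACTIVE'
          AND ps.winning_turn > 0
        GROUP BY p.id, p.fb_user_id
        ORDER BY best_turn ASC
        LIMIT 20;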

    Read the article

  • Calculating percent of votes inside mysql statement.

    - by Beck
        UPDATE polls_options
        SET `votes` = `votes` + 1,
            `percent` = ROUND((`votes` + 1) /
                (SELECT voters FROM polls WHERE poll_id = ? LIMIT 1) * 100, 1)
        WHERE option_id = ? AND poll_id = ?

    I don't have table data yet to test it properly. :) And by the way, what column type should the percentage values be stored in? Thanks for the help!
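
    Since the query rounds to one decimal place, a DECIMAL column sized for 0.0-100.0 is a natural fit (an integer type would throw the fraction away). A sketch, reusing the table and column names from the question:

        ALTER TABLE polls_options
            MODIFY `percent` DECIMAL(4,1) NOT NULL DEFAULT 0.0;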

    Read the article

  • PHP/MySQL to ASP.NET/MSSQL: suggest if it's worth the trouble.

    - by user302656
    Hello guys, we have been using PHP/MySQL for our web application, which has been growing a lot: the database is around 4-5 GB, and one of the tables is sometimes 2 GB, so things slow down whenever queries against that table are run. Should we just try to optimize, or are we pushing MySQL past its limits? Will switching our web app to .NET/MSSQL resolve the issues? Thanks

    Read the article

  • Unknown column '' in 'field list'

    - by Rixius
    I am running a SQL query in PHP that is dying with the mysql_error() of Unknown column '' in 'field list'. The query:

        SELECT `standard` AS fee
        FROM `corporation_state_fee`
        WHERE `stateid` = '8'
        LIMIT 1

    When I run the query in phpMyAdmin, it returns the information without flagging the error.

    Read the article

  • How secure are .htaccess-protected pages?

    - by Steven smethurst
    Are there any known flaws with .htaccess-protected pages? I know they are susceptible to brute-force attacks, as there is no limit to the number of times someone can attempt to log in. And if a user can upload and execute a file on the server, all bets are off... Any other .htaccess flaws?

    Read the article

  • JPA and compatibility with persistence providers and database vendors

    - by Kartoch
    JPA promises to be vendor-neutral with respect to both the persistence provider and the database. But I already know that some persistence frameworks like Hibernate are not perfect (character encoding, null comparison) and that you need to adapt your schema for each database. Because there are two layers (the persistence framework and the database), I would imagine some work is needed to keep JPA code portable... Does anyone have experience supporting multiple vendors, and if so, what are the tricks and recommendations for avoiding such incompatibilities?

    Read the article

  • How to sort by hexadecimal values in XSLT?

    - by fielding
    Hi, I'm trying to sort my transformed output by an element which contains a hex value:

        <xsl:sort select="Generation/Sirio/Code" data-type="number"/>

    The values are plain old hex: 00 01 02 ... 0A ... FF, but they are getting sorted like this: 0A FF 00 01 02, which indicates the sorting method fails as soon as letters are involved. How can I work around this? Thank you very much!
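
    One thing worth checking against the actual data: data-type="number" turns values like 0A or FF into NaN (they are not valid XPath numbers), which is likely why they bubble to the front. If every Code value is fixed-width (always two digits) and uses a consistent letter case, a plain text sort already orders hex correctly, because 0-9 sort before A-F character by character. A sketch under that assumption:

        <xsl:sort select="Generation/Sirio/Code" data-type="text" order="ascending"/>

    Variable-width or mixed-case values would need to be padded or normalized first.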

    Read the article

  • MySQL: auto-increment a column or just have an integer, what's the difference?

    - by David19801
    Hi, suppose I have a column that is the primary index and typed as INT. If I don't make it auto-increment and instead insert random integers that are unique into it, does that slow down future queries compared to auto-incrementing? Does it speed things up if I run OPTIMIZE on a table whose primary and only index is an INT? (Assume only two columns, and the second column is just some INT value.) (The main worry is the upper limit on the auto-increment value, as there are lots of adds and deletes in my table.)

    Read the article

  • iTunes RSS Feed returning max. 100 items instead of 300

    - by TheEye
    I tried the iTunes RSS generator at http://itunes.apple.com/rss/generator/ to download the newest 300 games, which gave me the RSS feed URL http://itunes.apple.com/us/rss/newapplications/limit=300/genre=6014/xml. However, only 100 items are returned, and they are sorted alphabetically, so the list stops at the letter E... Did Apple restrict the number of items one can get without updating their RSS Feed Generator? Or am I missing something?

    Read the article

  • Reading a large text file to memory in C++

    - by NoneType
    Is there a way to read a large text file (~60 MB) into memory at once (like a compiler flag to increase the program's memory limit)? Currently, ifstream's open function throws a segmentation fault while trying to read this file:

        ifstream fis;
        fis.open("my_large_file.txt"); // Segfaults here

    The file just consists of rows of the form number_1<tabspace>number_2, i.e., two numbers separated by a tab.
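
    For what it's worth, 60 MB fits comfortably in an ordinary heap allocation, so no compiler flag should be needed; if open() itself segfaults, the cause is probably elsewhere (an invalid stream object, a fixed-size buffer, memory corruption earlier in the program, and so on). A minimal, self-contained sketch that slurps the file and then parses the tab-separated pairs, reusing the file name from the question:

        #include <fstream>
        #include <iostream>
        #include <sstream>
        #include <string>
        #include <utility>
        #include <vector>

        int main() {
            std::ifstream fis("my_large_file.txt");   // file name taken from the question
            if (!fis) {
                std::cerr << "could not open file\n";
                return 1;
            }

            // Read the whole file into a single string in one go.
            std::ostringstream buffer;
            buffer << fis.rdbuf();
            std::string contents = buffer.str();
            std::cout << "read " << contents.size() << " bytes\n";

            // Parse the rows (number<tab>number per line) from the in-memory copy.
            std::vector<std::pair<long long, long long>> rows;
            std::istringstream in(contents);
            long long a = 0, b = 0;
            while (in >> a >> b) {
                rows.emplace_back(a, b);
            }
            std::cout << "parsed " << rows.size() << " rows\n";
            return 0;
        }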

    Read the article

  • String literals and escape characters in postgresql

    - by rjohnston
    Attempting to insert an escape character into a table results in a warning. For example:

        create table EscapeTest (text varchar(50));
        insert into EscapeTest (text) values ('This is the first part \n And this is the second');

    produces the warning:

        WARNING: nonstandard use of escape in a string literal

    (Using PSQL 8.2.) Anyone know how to get around this?
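
    On 8.x the usual way to keep the backslash escape and silence the warning is to mark the literal as an escape string with the E prefix; a sketch against the example table above:

        insert into EscapeTest (text) values (E'This is the first part \n And this is the second');

    Alternatively, the escape_string_warning setting controls whether the warning is emitted at all, though leaving it on and being explicit with E'...' is the safer habit.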

    Read the article

  • GdkEventKey. Key value question

    - by spajak
    Hi, how do I translate a GdkEventKey.keyval to the key value it would have at shift level 0? For example: GDK_ISO_Left_Tab ==> GDK_Tab. Of course, not by using a switch/case for every key. I need to identify the particular key, not a character.

    Read the article

  • Efficiently fetching and storing tweets from a few hundred twitter profiles?

    - by MSpreij
    The site I'm working on needs to fetch the tweets from 150-300 people, store them locally, and then list them on the front page. The profiles sit in groups. The pages will be showing:

    - the last 20 tweets (or 21-40, etc.) by date, group of profiles, single profile, search, or "subject" (which is sort of a different group... I think...)
    - a live, context-aware tag cloud (based on the last 300 tweets of the current search, group of profiles, or single profile shown)
    - various statistics (group stuff, most active, etc.), which depend on the type of page shown.

    We're expecting a fair bit of traffic. The last, similar site peaked at nearly 40K visits per day, and ran into trouble before I started caching pages as static files and disabling some features (some accidentally...). This was caused mostly by the fact that a page load would also fetch the last x tweets from the 3-6 profiles which had gone longest without an update. With this new site I can fortunately use cron to fetch tweets, so that helps. I'll also be denormalizing the db a little so it needs fewer joins, optimizing it for faster selects instead of size.

    Now, the main question: how do I figure out which profiles to check for new tweets in an efficient manner? Some people will be tweeting more often than others, and some will tweet in bursts (this happens a lot). I want to keep the front page of the site as "current" as possible. If it comes to, say, 300 profiles, and I check 5 every minute, some tweets will only appear an hour after the fact. I can check more often (up to 20K), but I want to optimize this as much as possible, both to avoid hitting the rate limit and to avoid running out of resources on the local server (it hit MySQL's connection limit with that other site).

    Question 2: since cron only "runs" once a minute, I figure I have to check multiple profiles each minute; as stated, at least 5, possibly more. To try and spread it out over that minute I could have it sleep a few seconds between batches or even between single profiles. But then if it takes longer than 60 seconds altogether, the script will run into itself. Is this a problem? If so, how can I avoid that?

    Question 3: any other tips? Readmes? URLs?
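
    Not from the thread itself, but one common answer to the "which profiles do I poll this cron run" question is an adaptive schedule: score each profile by how long it has waited and how actively it has been tweeting, then poll the top N. The sketch below is Python used as pseudocode (the site itself is PHP), and every field name and weight in it is made up for illustration:

        import time

        def pick_profiles_to_poll(profiles, batch_size=5, now=None):
            """Return the profiles most 'overdue' for a poll.

            Each profile is a dict with (hypothetical) keys:
              last_checked   - unix timestamp of the previous poll
              tweets_per_day - recent activity estimate, refreshed after each poll
            """
            now = now or time.time()

            def urgency(p):
                waited = now - p["last_checked"]        # seconds since the last poll
                rate = max(p["tweets_per_day"], 0.1)    # floor so silent users still get polled
                return waited * rate / 86400.0          # ~ expected tweets missed so far

            return sorted(profiles, key=urgency, reverse=True)[:batch_size]

        if __name__ == "__main__":
            demo = [
                {"name": "quiet",  "last_checked": time.time() - 3600, "tweets_per_day": 1},
                {"name": "chatty", "last_checked": time.time() - 600,  "tweets_per_day": 50},
                {"name": "stale",  "last_checked": time.time() - 7200, "tweets_per_day": 5},
            ]
            print([p["name"] for p in pick_profiles_to_poll(demo, batch_size=2)])

    Bursty users climb the ranking quickly as their activity estimate rises, while inactive profiles still get picked up eventually as their waiting time grows. For question 2, a lock file (or checking whether the previous run is still alive) is the usual guard against a cron job overlapping itself.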

    Read the article

  • Why is Dictionary.First() so slow?

    - by Rotsor
    Not a real question, because I already found out the answer, but it's still an interesting thing. I always thought that a hash table is the fastest associative container if you hash properly. However, the following code is terribly slow. It executes only about 1 million iterations and takes more than 2 minutes on a Core 2 CPU. The code does the following: it maintains a collection todo of items it needs to process. At each iteration it takes an item from this collection (doesn't matter which item), deletes it, processes it if it wasn't processed already (possibly adding more items to process), and repeats this until there are no items left. The culprit seems to be the Dictionary.Keys.First() operation. The question is: why is it slow?

        Stopwatch watch = new Stopwatch();
        watch.Start();
        HashSet<int> processed = new HashSet<int>();
        Dictionary<int, int> todo = new Dictionary<int, int>();
        todo.Add(1, 1);
        int iterations = 0;
        int limit = 500000;
        while (todo.Count > 0)
        {
            iterations++;
            var key = todo.Keys.First();
            var value = todo[key];
            todo.Remove(key);
            if (!processed.Contains(key))
            {
                processed.Add(key);
                // process item here
                if (key < limit)
                {
                    todo[key + 13] = value + 1;
                    todo[key + 7] = value + 1;
                } // doesn't matter much how
            }
        }
        Console.WriteLine("Iterations: {0}; Time: {1}.", iterations, watch.Elapsed);

    This results in:

        Iterations: 923007; Time: 00:02:09.8414388.

    Simply changing Dictionary to SortedDictionary yields:

        Iterations: 499976; Time: 00:00:00.4451514.

    300 times faster while having only 2 times fewer iterations. The same happens in Java, using HashMap instead of Dictionary and keySet().iterator().next() instead of Keys.First().
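
    A hedged note on the likely cause (it is not spelled out in the post): Keys.First() starts an enumerator that walks the dictionary's internal entry array from the start, skipping freed slots, so it can degrade to O(n) per call, and this add/remove pattern makes the scans long. If any pending item will do, a stack or queue avoids the scan entirely. Here is a sketch of the same work-list rewritten in Java (which the post also mentions), using an ArrayDeque so that taking the next item is O(1):

        import java.util.ArrayDeque;
        import java.util.Deque;
        import java.util.HashSet;
        import java.util.Set;

        public class WorkList {
            public static void main(String[] args) {
                long start = System.nanoTime();
                Set<Integer> processed = new HashSet<>();
                Deque<int[]> todo = new ArrayDeque<>();   // each entry is {key, value}
                todo.push(new int[] {1, 1});
                int iterations = 0;
                final int limit = 500000;
                while (!todo.isEmpty()) {
                    iterations++;
                    int[] item = todo.pop();              // O(1): no scan over empty slots
                    int key = item[0], value = item[1];
                    if (processed.add(key)) {             // add() is false if already processed
                        // process item here
                        if (key < limit) {
                            todo.push(new int[] {key + 13, value + 1});
                            todo.push(new int[] {key + 7, value + 1});
                        }
                    }
                }
                System.out.printf("Iterations: %d; Time: %d ms%n",
                        iterations, (System.nanoTime() - start) / 1_000_000);
            }
        }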

    Read the article

  • How to display Kannada (Indic) fonts in a Java application

    - by AJ
    I am trying to display a Kannada character in a Java app.

        String fonts[] = ge.getAvailableFontFamilyNames();

    This shows that there is a font family by the name "BRH Kannada".

        Font f = new Font("BRH Kannada", Font.PLAIN, 20);

    Then I do button.setFont(f); now when I set the button text, the text on the button should ideally use the BRH Kannada font. However, what string should I give it, and how? Rgds, AJ
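
    Assuming "BRH Kannada" is a Unicode font that covers the Kannada block (U+0C80-U+0CFF), the button text is just an ordinary Java String holding those code points, e.g. written with \u escapes. A small Swing sketch (if the font turns out to be a legacy, ASCII-mapped Kannada font rather than a Unicode one, a Unicode Kannada font would be needed instead):

        import java.awt.Font;
        import javax.swing.JButton;
        import javax.swing.JFrame;
        import javax.swing.SwingUtilities;

        public class KannadaButton {
            public static void main(String[] args) {
                SwingUtilities.invokeLater(() -> {
                    JFrame frame = new JFrame("Kannada test");
                    frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);

                    JButton button = new JButton();
                    button.setFont(new Font("BRH Kannada", Font.PLAIN, 20));
                    // "Kannada" written in Kannada script: KA, NA, VIRAMA, NA, DDA.
                    button.setText("\u0C95\u0CA8\u0CCD\u0CA8\u0CA1");

                    frame.add(button);
                    frame.pack();
                    frame.setVisible(true);
                });
            }
        }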

    Read the article
