Search Results

Search found 6311 results on 253 pages for 'limit clause'.


  • New table for every user?

    - by SonOfOmer
    Hi everyone, I want to create a new table for each new user on the web site, and I assume there will be many users. I am sure that search performance will be good, but what about maintenance? It is MySQL, which has no limit on the number of tables. Thanks a lot.

    Read the article

  • Is there a max size for POST parameter content?

    - by l3dx
    I'm troubleshooting a Java app where XML is sent between two systems using HTTP POST and a servlet. I suspect that the problem is that the XML is growing way too big. Is it possible that this is the problem? Is there a limit? When it doesn't work, request.getParameter("message") on the consumer side returns null. Both apps are running on Tomcat.
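
    If size is indeed the problem, one common culprit (an assumption, not confirmed by the question) is Tomcat's maxPostSize connector attribute, which caps how much POSTed form data the container will parse; when the limit is exceeded, parameters come back null. A sketch of raising it in server.xml:

        <!-- conf/server.xml: raise the POST parsing limit (value in bytes;
             the default is roughly 2 MB). Which connector the consumer app
             listens on is an assumption about the deployment. -->
        <Connector port="8080" protocol="HTTP/1.1"
                   connectionTimeout="20000"
                   maxPostSize="10485760" />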

    Read the article

  • Limiting the size of a python dictionary

    - by anthony
    I'd like to work with a dict in Python, but limit the number of key/value pairs to X. In other words, if the dict is currently storing X key/value pairs and I perform an insertion, I would like one of the existing pairs to be dropped. It would be nice if it were the least recently inserted/accessed key, but that's not completely necessary. If this exists in the standard library, please save me some time and point it out!
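
    There is no capped dict in the standard library, but a minimal sketch of one can be built on collections.OrderedDict (the class name and cap below are illustrative, and eviction is by oldest insertion rather than true LRU):

        from collections import OrderedDict

        class BoundedDict(OrderedDict):
            """A dict that keeps at most max_items pairs, dropping the oldest insertion."""
            def __init__(self, max_items, *args, **kwargs):
                self.max_items = max_items
                super().__init__(*args, **kwargs)

            def __setitem__(self, key, value):
                super().__setitem__(key, value)
                if len(self) > self.max_items:
                    # popitem(last=False) removes the oldest (first-inserted) pair
                    self.popitem(last=False)

        cache = BoundedDict(3)
        for i in range(5):
            cache[i] = i * i
        print(list(cache))   # only the 3 most recently inserted keys remain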

    Read the article

  • Limiting the number of rows a query returns in SQL Server 2008

    - by Jose Sotero Villegas III
    This is my query:

        SELECT Fullname, rank, id_no, TIN, birthdate, hair, eyes, Blood, height, weight, marks, name, address
        FROM [******_DOMAIN\****_*****].*******view

    The problem is that the source table has too many duplicates. How do I limit my query to the latest row in the database? I'm using SQL Server 2008. Thanks in advance. My next problem is that the view gives me the birthdate as a string in yyyymmdd format, and I need to change it to mm/dd/yyyy. Can you please provide a function, using the same string above?
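
    A sketch of both pieces, assuming "latest" can be defined by some ordering column (the date_modified column below is hypothetical, since the view doesn't show one) and that birthdate really is a yyyymmdd string:

        -- Keep only the newest row per person using ROW_NUMBER()
        SELECT Fullname, rank, id_no, TIN, birthdate, hair, eyes, Blood,
               height, weight, marks, name, address
        FROM (
            SELECT *,
                   ROW_NUMBER() OVER (PARTITION BY id_no
                                      ORDER BY date_modified DESC) AS rn
            FROM [******_DOMAIN\****_*****].*******view
        ) AS ranked
        WHERE rn = 1;

        -- Reformat a yyyymmdd string as mm/dd/yyyy
        -- (style 112 parses yyyymmdd, style 101 formats as mm/dd/yyyy)
        SELECT CONVERT(varchar(10), CONVERT(datetime, birthdate, 112), 101);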

    Read the article

  • Zend_Paginator / Doctrine 2

    - by Kevin
    I'm using Doctrine 2 with my Zend Framework application, and a typical query could yield a million (or more) results. I want to use Zend_Paginator with this result set. However, I don't want to return all the results as an array and use the Array adapter, as this would be inefficient; instead I would like to supply the paginator with the total row count and then an array of results based on limit/offset amounts. Is this doable with the Array adapter, or would I need to create my own pagination adapter?
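
    The Array adapter needs the whole result set up front, so this usually means a small custom adapter. A rough sketch against Zend_Paginator_Adapter_Interface; the entity alias and count column are assumptions about the application:

        <?php
        // A sketch: the adapter only ever fetches one page from Doctrine 2.
        class My_Paginator_DoctrineAdapter implements Zend_Paginator_Adapter_Interface
        {
            private $_qb;   // Doctrine\ORM\QueryBuilder for the search

            public function __construct(Doctrine\ORM\QueryBuilder $qb)
            {
                $this->_qb = $qb;
            }

            // Total number of rows, via a separate COUNT query
            public function count()
            {
                $countQb = clone $this->_qb;
                return (int) $countQb->select('COUNT(e.id)')
                                     ->getQuery()
                                     ->getSingleScalarResult();
            }

            // Only the rows for the requested page
            public function getItems($offset, $itemCountPerPage)
            {
                return $this->_qb->getQuery()
                                 ->setFirstResult($offset)
                                 ->setMaxResults($itemCountPerPage)
                                 ->getResult();
            }
        }

        // $paginator = new Zend_Paginator(new My_Paginator_DoctrineAdapter($qb));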

    Read the article

  • How do I use a MySQL subquery to count the number of rows in a foreign table?

    - by James Skidmore
    I have two tables, users and reports. Each user has zero, one, or multiple reports associated with it, and the reports table has a user_id field. I have the following query, and I need to add to each row a count of how many reports the user has:

        SELECT * FROM users LIMIT 1, 10

    Do I need to use a subquery, and if so, how can I use it efficiently? The reports table has thousands and thousands of rows.
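
    A sketch using a correlated subquery, assuming reports.user_id references users.id and is indexed, so each count is an index lookup on just the 10 returned rows:

        SELECT u.*,
               (SELECT COUNT(*)
                FROM reports r
                WHERE r.user_id = u.id) AS report_count
        FROM users u
        LIMIT 1, 10;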

    Read the article

  • PHP/MySQL to ASP.NET/MSSQL: is it worth the trouble?

    - by user302656
    Hello guys, we have been using PHP/MySQL for our web application, which has been growing a lot. The database is around 4-5 GB, and one of the tables is sometimes around 2 GB, which slows things down whenever any query hits that table. Should we just try to optimize, or are we using MySQL beyond its limits? Will switching our web app to .NET/MSSQL resolve the issues? Thanks

    Read the article

  • Need MySQL query for finding lowest score per game player

    - by Chris Barnhill
    I have a game on Facebook called Rails Across Europe. I have a Best Scores page where I show the players with the best 20 scores, which in game terms refers to the lowest winning turn. The problem is that there are a small number of players who play frequently, and their scores dominate the page. I'd like to make the scores page open to more players, so I thought I could display the single lowest winning turn for each player instead of displaying all of the lowest winning turns for all players. The problem is that the query for this eludes me, so I hope that one of you brilliant StackOverflow folks can help me with this. I have included the relevant MySQL table schemas below. Here are the table relationships:

    - player_stats contains statistics for either a game in progress or a completed game. If a game is in progress, winning_turn is zero (which means that games with a winning_turn of zero should not be included in the query). player_stats has a game_player table id reference.
    - game_player contains data describing games currently in progress. game_player has a player table id reference.
    - player contains data describing a person who plays the game.

    Here's the query I'm currently using:

        'SELECT p.fb_user_id, ps.winning_turn, gp.difficulty_level, c.name as city_name, g.name as goods_name, d.cost
         FROM game_player as gp, player as p, player_stats as ps, demand as d, city as c, goods as g
         WHERE p.status = "ACTIVE" AND gp.player_id = p.id AND ps.game_player_id = gp.id
           AND d.id = ps.highest_demand_id AND c.id = d.city_id AND g.id = d.goods_id AND ps.winning_turn > 0
         ORDER BY ps.winning_turn ASC, d.cost DESC LIMIT '.$limit.';';

    Here are the relevant table schemas:

        --
        -- Table structure for table `player_stats`
        --
        CREATE TABLE IF NOT EXISTS `player_stats` (
          `id` int(11) NOT NULL auto_increment,
          `game_player_id` int(11) NOT NULL,
          `winning_turn` int(11) NOT NULL,
          `highest_demand_id` int(11) NOT NULL,
          PRIMARY KEY (`id`),
          KEY `game_player_id` (`game_player_id`,`highest_demand_id`)
        ) ENGINE=InnoDB DEFAULT CHARSET=utf8 AUTO_INCREMENT=3814 ;

        --
        -- Table structure for table `game_player`
        --
        CREATE TABLE IF NOT EXISTS `game_player` (
          `id` int(10) unsigned NOT NULL auto_increment,
          `game_id` int(10) unsigned NOT NULL,
          `player_id` int(10) unsigned NOT NULL,
          `player_number` int(11) NOT NULL,
          `funds` int(10) unsigned NOT NULL,
          `turn` int(10) unsigned NOT NULL,
          `difficulty_level` enum('STANDARD','ADVANCED','MASTER','ULTIMATE') NOT NULL,
          `date_last_used` datetime NOT NULL,
          PRIMARY KEY (`id`),
          KEY `game_id` (`game_id`,`player_id`)
        ) ENGINE=InnoDB DEFAULT CHARSET=utf8 AUTO_INCREMENT=3814 ;

        --
        -- Table structure for table `player`
        --
        CREATE TABLE IF NOT EXISTS `player` (
          `id` int(11) NOT NULL auto_increment,
          `fb_user_id` char(255) NOT NULL,
          `fb_proxied_email` text NOT NULL,
          `first_name` char(255) NOT NULL,
          `last_name` char(255) NOT NULL,
          `birthdate` date NOT NULL,
          `date_registered` datetime NOT NULL,
          `date_last_logged_in` datetime NOT NULL,
          `status` enum('ACTIVE','SUSPENDED','CLOSED') NOT NULL,
          PRIMARY KEY (`id`),
          KEY `fb_user_id` (`fb_user_id`)
        ) ENGINE=InnoDB DEFAULT CHARSET=utf8 AUTO_INCREMENT=1646 ;
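
    A sketch of the "one best score per player" approach: first collapse player_stats to each player's minimum winning turn in a derived table, then join back for the rest of the detail. The join back to demand/city/goods follows the same pattern as the original query and may return ties if a player hit the same best turn in more than one game:

        SELECT p.fb_user_id, best.best_turn, gp.difficulty_level,
               c.name AS city_name, g.name AS goods_name, d.cost
        FROM (
            SELECT gp.player_id, MIN(ps.winning_turn) AS best_turn
            FROM player_stats ps
            JOIN game_player gp ON gp.id = ps.game_player_id
            WHERE ps.winning_turn > 0
            GROUP BY gp.player_id
        ) AS best
        JOIN player p        ON p.id = best.player_id AND p.status = 'ACTIVE'
        JOIN game_player gp  ON gp.player_id = best.player_id
        JOIN player_stats ps ON ps.game_player_id = gp.id
                            AND ps.winning_turn = best.best_turn
        JOIN demand d ON d.id = ps.highest_demand_id
        JOIN city c   ON c.id = d.city_id
        JOIN goods g  ON g.id = d.goods_id
        ORDER BY best.best_turn ASC, d.cost DESC
        LIMIT 20;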

    Read the article

  • SQL Server Maximum row size

    - by DannySmurf
    I came across this error today and am wondering if anyone can tell me what it means: "Cannot sort a row of size 9522, which is greater than the allowable maximum of 8094." Is that 8094 bytes? Characters? Fields? Is this a problem with joining multiple tables whose combined row exceeds some limit?

    Read the article

  • Unknown column '' in 'field list'

    - by Rixius
    I am running a SQL query in PHP that is dying with the mysql_error() of "Unknown column '' in 'field list'". The query:

        SELECT `standard` AS fee FROM `corporation_state_fee` WHERE `stateid` = '8' LIMIT 1

    When I run the query in phpMyAdmin, it returns the information without flagging the error.
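
    The error names an empty column (''), which usually means the query string PHP actually sends differs from the one pasted into phpMyAdmin, e.g. an unset variable interpolated into the field list. A small debugging sketch; the $query variable name is an assumption about the surrounding code:

        // Echo exactly what PHP sends to MySQL before running it, so the
        // string can be compared with the one tested in phpMyAdmin.
        echo $query;
        $result = mysql_query($query);
        if ($result === false) {
            // Report both the error and the literal query that caused it
            die(mysql_error() . "\nQuery was: " . $query);
        }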

    Read the article

  • How do I select a random record efficiently in MySQL?

    - by user198729
    mysql> EXPLAIN SELECT * FROM urls ORDER BY RAND() LIMIT 1;

        +----+-------------+-------+------+---------------+------+---------+------+-------+---------------------------------+
        | id | select_type | table | type | possible_keys | key  | key_len | ref  | rows  | Extra                           |
        +----+-------------+-------+------+---------------+------+---------+------+-------+---------------------------------+
        |  1 | SIMPLE      | urls  | ALL  | NULL          | NULL | NULL    | NULL | 62228 | Using temporary; Using filesort |
        +----+-------------+-------+------+---------------+------+---------+------+-------+---------------------------------+

    The above doesn't qualify as efficient; how should I do it properly?
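
    One common trick, sketched below, assumes an auto-increment primary key id with reasonably few gaps (large gaps bias the pick toward the rows that follow them):

        SELECT u.*
        FROM urls u
        JOIN (SELECT FLOOR(MIN(id) + RAND() * (MAX(id) - MIN(id))) AS rand_id
              FROM urls) AS pick
          ON u.id >= pick.rand_id
        ORDER BY u.id
        LIMIT 1;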

    Read the article

  • How secure are .htaccess-protected pages

    - by Steven smethurst
    Are there any known flaws with .htaccess-protected pages? I know they are susceptible to brute-force attacks, as there is no limit on the number of times someone can attempt to log in. And if a user can upload and execute a file on the server, all bets are off... Any other .htaccess flaws?

    Read the article

  • Why is Dictionary.First() so slow?

    - by Rotsor
    Not a real question, because I already found out the answer, but still an interesting thing. I always thought that a hash table is the fastest associative container if you hash properly. However, the following code is terribly slow. It executes only about 1 million iterations and takes more than 2 minutes on a Core 2 CPU.

    The code does the following: it maintains the collection todo of items it needs to process. At each iteration it takes an item from this collection (doesn't matter which item), deletes it, processes it if it wasn't processed (possibly adding more items to process), and repeats this until there are no items to process. The culprit seems to be the Dictionary.Keys.First() operation. The question is: why is it slow?

        Stopwatch watch = new Stopwatch();
        watch.Start();
        HashSet<int> processed = new HashSet<int>();
        Dictionary<int, int> todo = new Dictionary<int, int>();
        todo.Add(1, 1);
        int iterations = 0;
        int limit = 500000;
        while (todo.Count > 0)
        {
            iterations++;
            var key = todo.Keys.First();
            var value = todo[key];
            todo.Remove(key);
            if (!processed.Contains(key))
            {
                processed.Add(key);
                // process item here
                if (key < limit)
                {
                    todo[key + 13] = value + 1;
                    todo[key + 7] = value + 1;
                } // doesn't matter much how
            }
        }
        Console.WriteLine("Iterations: {0}; Time: {1}.", iterations, watch.Elapsed);

    This results in:

        Iterations: 923007; Time: 00:02:09.8414388.

    Simply changing Dictionary to SortedDictionary yields:

        Iterations: 499976; Time: 00:00:00.4451514.

    300 times faster while having only 2 times fewer iterations. The same happens in Java, using HashMap instead of Dictionary and keySet().iterator().next() instead of Keys.First().
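
    (The usual explanation is that removals leave empty slots near the front of the Dictionary's internal entry array, so each Keys.First() call scans further and further before reaching a live entry, making the loop roughly quadratic.) A sketch of the same traversal with an explicit worklist, in the same fragment style as the snippet above, where taking "any" item is O(1):

        var processed = new HashSet<int>();
        var todo = new Queue<KeyValuePair<int, int>>();
        todo.Enqueue(new KeyValuePair<int, int>(1, 1));
        int limit = 500000;
        while (todo.Count > 0)
        {
            var item = todo.Dequeue();
            if (processed.Add(item.Key))   // Add returns false if already processed
            {
                // process item here
                if (item.Key < limit)
                {
                    todo.Enqueue(new KeyValuePair<int, int>(item.Key + 13, item.Value + 1));
                    todo.Enqueue(new KeyValuePair<int, int>(item.Key + 7, item.Value + 1));
                }
            }
        }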

    Read the article

  • Problem using generics in function

    - by JAVA
    Hi all, I have these two functions and need to make them one function. The only difference is the type of the input variable sourceColumnValue. This variable can be a String or an Integer, but the return value of the function must always be an Integer. I know I need to use generics but can't do it.

        public Integer selectReturnInt(String tableName, String sourceColumnName, String sourceColumnValue, String targetColumnName) {
            Integer returned = null;
            String query = "SELECT "+targetColumnName+" FROM "+tableName+" WHERE "+sourceColumnName+"='"+sourceColumnValue+"' LIMIT 1";
            try {
                Connection connection = ConnectionManager.getInstance().open();
                java.sql.Statement statement = connection.createStatement();
                statement.execute(query.toString());
                ResultSet rs = statement.getResultSet();
                while (rs.next()) {
                    returned = rs.getInt(targetColumnName);
                }
                rs.close();
                statement.close();
                ConnectionManager.getInstance().close(connection);
            } catch (SQLException e) {
                System.out.println("???????? ?? ???? ?? ???? ?????????!");
                System.out.println(e);
            }
            return returned;
        }

        // SELECT (RETURN INTEGER)
        public Integer selectIntReturnInt(String tableName, String sourceColumnName, Integer sourceColumnValue, String targetColumnName) {
            Integer returned = null;
            String query = "SELECT "+targetColumnName+" FROM "+tableName+" WHERE "+sourceColumnName+"='"+sourceColumnValue+"' LIMIT 1";
            try {
                Connection connection = ConnectionManager.getInstance().open();
                java.sql.Statement statement = connection.createStatement();
                statement.execute(query.toString());
                ResultSet rs = statement.getResultSet();
                while (rs.next()) {
                    returned = rs.getInt(targetColumnName);
                }
                rs.close();
                statement.close();
                ConnectionManager.getInstance().close(connection);
            } catch (SQLException e) {
                System.out.println("???????? ?? ???? ?? ???? ?????????!");
                System.out.println(e);
            }
            return returned;
        }
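
    A sketch of a single generic version, keeping the ConnectionManager helper from the question; the assumption is that String.valueOf() of the argument is what should end up in the WHERE clause, and a PreparedStatement avoids the manual quoting:

        public <T> Integer selectReturnInt(String tableName, String sourceColumnName,
                                           T sourceColumnValue, String targetColumnName) {
            Integer returned = null;
            // The value is bound as a string; MySQL compares it against either
            // a string or an integer column, same as the original quoted literal.
            String query = "SELECT " + targetColumnName + " FROM " + tableName +
                           " WHERE " + sourceColumnName + " = ? LIMIT 1";
            try {
                Connection connection = ConnectionManager.getInstance().open();
                PreparedStatement statement = connection.prepareStatement(query);
                statement.setString(1, String.valueOf(sourceColumnValue));
                ResultSet rs = statement.executeQuery();
                while (rs.next()) {
                    returned = rs.getInt(targetColumnName);
                }
                rs.close();
                statement.close();
                ConnectionManager.getInstance().close(connection);
            } catch (SQLException e) {
                System.out.println(e);
            }
            return returned;
        }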

    Read the article

  • modalpopup.js not working properly in Windows

    - by jpallavi
    I have applied modalpopup.js to display an error and restrict messages to a 300-word limit. Although it displays the error message, the user can still type more than 300 words on the Windows operating system. It works fine on Fedora. Can somebody help solve the issue?

    Read the article

  • Retrieving substring of a bound value

    - by shivesh
    I am binding some data to a control, but want to limit a specific field to its first 30 characters. I want to do it, if possible, in the aspx page. I tried this:

        Text='<%# String.Format("{0}", Eval("Title")).Substring(0,30) %>'

    But got this error: "Index and length must refer to a location within the string. Parameter name: length"
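
    That error means some titles are shorter than 30 characters, so Substring(0, 30) runs past the end of the string. A sketch that guards the length inline, still in the aspx markup:

        Text='<%# Eval("Title").ToString().Length > 30 ? Eval("Title").ToString().Substring(0, 30) : Eval("Title").ToString() %>'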

    Read the article

  • Limiting XML read to 5 records

    - by Jean
    Hello, I have an XML file with 100 records, but I want to limit it to just 5 records.

        for ($i = 0; $i <= 5; $i++) {
            foreach ($xml->entry as $result) {
                if ($result->updated == $result->published) {
                }
            }
        }

    When I put in the above code, it displays one record 5 times. Thanks, Jean
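
    A sketch that counts matches inside the loop and stops after five, instead of wrapping the whole foreach in a for loop (the outer for is what repeats the same output five times):

        $shown = 0;
        foreach ($xml->entry as $result) {
            if ($result->updated == $result->published) {
                // ... output the record here ...
                $shown++;
                if ($shown >= 5) {
                    break;   // stop after the fifth matching record
                }
            }
        }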

    Read the article

  • Efficient SQL to count an occurrence in the latest X rows

    - by pulegium
    For example I have:

        create table a (i int);

    Assume there are 10k rows. I want to count 0's in the last 20 rows. Something like:

        select count(*) from (select i from a limit 20) where i = 0;

    Is it possible to make it more efficient? Like a single SQL statement or something? PS: the DB is SQLite3, if that matters at all...
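
    It already is a single statement; the main issue is that LIMIT without ORDER BY doesn't pin down which 20 rows come back. A sketch that makes "last" explicit, assuming SQLite's implicit rowid reflects insertion order for this table:

        select count(*)
        from (select i from a order by rowid desc limit 20)
        where i = 0;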

    Read the article

  • Efficiently fetching and storing tweets from a few hundred twitter profiles?

    - by MSpreij
    The site I'm working on needs to fetch the tweets from 150-300 people, store them locally, and then list them on the front page. The profiles sit in groups. The pages will be showing:

    - the last 20 tweets (or 21-40, etc.) by date, group of profiles, single profile, search, or "subject" (which is sort of a different group... I think)
    - a live, context-aware tag cloud (based on the last 300 tweets of the current search, group of profiles, or single profile shown)
    - various statistics (group stuff, most active, etc.) which depend on the type of page shown.

    We're expecting a fair bit of traffic. The last, similar site peaked at nearly 40K visits per day, and ran into trouble before I started caching pages as static files and disabling some features (some accidentally...). This was caused mostly by the fact that a page load would also fetch the last x tweets from the 3-6 profiles which had not been updated the longest. With this new site I can fortunately use cron to fetch tweets, so that helps. I'll also be denormalizing the db a little so it needs fewer joins, optimizing it for faster selects instead of size.

    Now, the main question: how do I figure out which profiles to check for new tweets in an efficient manner? (A rough scheduling sketch follows below.) Some people will be tweeting more often than others, and some will tweet in bursts (this happens a lot). I want to keep the front page of the site as "current" as possible. If it comes to, say, 300 profiles, and I check 5 every minute, some tweets will only appear an hour after the fact. I can check more often (up to 20K) but want to optimize this as much as possible, both to not hit the rate limit and to not run out of resources on the local server (it hit MySQL's connection limit with that other site).

    Question 2: since cron only "runs" once a minute, I figure I have to check multiple profiles each minute - as stated, at least 5, possibly more. To try and spread it out over that minute I could have it sleep a few seconds between batches or even single profiles. But then if it takes longer than 60 seconds altogether, the script will run into itself. Is this a problem? If so, how can I avoid it?

    Question 3: any other tips? Readmes? URLs?
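
    One way to prioritize polling, as a sketch only: have the cron job maintain hypothetical last_checked and tweets_per_day columns on a profiles table, and each minute poll the profiles with the largest expected backlog of unseen tweets.

        -- Pick the next batch of profiles to poll: those most likely to have
        -- accumulated new tweets since they were last checked.
        SELECT id
        FROM profiles
        ORDER BY (UNIX_TIMESTAMP() - UNIX_TIMESTAMP(last_checked)) * tweets_per_day DESC
        LIMIT 20;

    Burst-prone accounts can then be handled by temporarily bumping their tweets_per_day estimate whenever a fetch comes back with a full page of new tweets.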

    Read the article

  • iTunes RSS Feed returning max. 100 items instead of 300

    - by TheEye
    I tried the iTunes RSS generator at http://itunes.apple.com/rss/generator/ to download the newest 300 games, which gave me the RSS feed URL http://itunes.apple.com/us/rss/newapplications/limit=300/genre=6014/xml. However, only 100 items are returned, alphabetically sorted, so the list stops at the letter E... Did Apple restrict the number of items one can get without updating their RSS Feed Generator? Or am I missing something?

    Read the article
