Search Results

Search found 40581 results on 1624 pages for 'mysql select db'.


  • MDB2, Pear, Mysql error

    - by Kyle Hudson
    Hi guys, I have PEAR, MDB2, and the MDB2 MySQL driver installed, but I keep getting: Fatal error: Call to undefined function: MDB2_Driver_mysql::_isNewLinkSet(). in /home/**/PEAR/MDB2.php on line 1937. The server is CentOS. I am stuck; any help would be appreciated. Thanks :)


  • Importing JSON data into MySQL?

    - by AP257
    Pretty much what the title says :) At the moment I'm using Python to turn the JSON data into a plain-text tab-separated file, and then mysqlimport to pull that into my MySQL tables. Does anyone know a nicer / more direct way?
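
    One possibility, since mysqlimport is essentially a command-line wrapper around LOAD DATA INFILE: issue the equivalent statement directly from the importing script and skip the extra tool. A minimal sketch, assuming the default tab-separated, newline-terminated format; the file, table, and column names are hypothetical:

        LOAD DATA LOCAL INFILE '/tmp/data.tsv'
        INTO TABLE my_table
        FIELDS TERMINATED BY '\t'
        LINES TERMINATED BY '\n'
        (col1, col2, col3);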


  • Query MySQL with unicode char code.

    - by Ben
    Hi, I have been having trouble searching through a MySQL table, trying to find entries containing the character U+200E (the left-to-right mark) in a particular column. This character has no visible glyph, so pasting it into my search term doesn't seem to work. Is there a way to specify characters by their code point in a query instead? Thanks, -Ben
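
    A sketch of one approach, assuming the column uses a Unicode character set (the table and column names below are made up): CHAR(... USING ucs2) builds the character directly from its code point, so nothing has to be pasted into the query.

        SELECT *
        FROM my_table
        WHERE my_column LIKE CONCAT('%', CHAR(0x200E USING ucs2), '%');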


  • Data aggregation mongodb vs mysql

    - by Dimitris Stefanidis
    I am currently researching a backend to use for a project with demanding data-aggregation requirements. The main project requirements are the following.

    Store millions of records for each user. Users might have more than 1 million entries per year, so even with 100 users we are talking about 100 million entries per year.

    Data aggregation on those entries must be performed on the fly. Users need to be able to filter the entries by a ton of available filters and then see summaries (totals, averages, etc.) and graphs of the results. Obviously I cannot precalculate any of the aggregation results, because the filter combinations (and thus the result sets) are huge.

    Users are going to have access to their own data only, but it would be nice if anonymous stats could be calculated across all the data.

    The data is going to arrive in batches most of the time, e.g. the user will upload the data every day and it could be around 3000 records. In some later version there could be automated programs that upload every few minutes in smaller batches of 100 items, for example.

    I made a simple test, creating a table with 1 million rows and performing a simple sum over one column, both in MongoDB and in MySQL, and the performance difference was huge. I do not remember the exact numbers, but it was something like MySQL = 200 ms, MongoDB = 20 sec. I have also run the test with CouchDB and had much worse results.

    What seems promising speed-wise is Cassandra, which I was very enthusiastic about when I first discovered it. However, the documentation is scarce and I haven't found any solid examples of how to perform sums and other aggregate functions on the data. Is that possible? As it seems from my test (maybe I have done something wrong), with the current performance it is impossible to use MongoDB for such a project, although the automated sharding functionality seems like a perfect fit for it. Does anybody have experience with data aggregation in MongoDB, or any insights that might be of help for the implementation of the project? Thanks, Dimitris


  • MySQL SELECT and GROUP BY date

    - by Gil
    I'm not sure how to create the right query to get the result I'm looking for. What I have is 2 tables: the first has ID and Name columns, and the second has a date and an adminID, which references column ID in the first table. Now, what I want to get is basically the number of times each admin logged in per day during the month. From a structure like the one below I want per-day and per-month data, so the result would be something like 1, 2, 2 per day and a March total of 5 for admin 4.

        ID | Date
        ------------------
        4  | 2010/03/01
        4  | 2010/03/04
        4  | 2010/03/04
        4  | 2010/03/05
        4  | 2010/03/05
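
    A sketch of the usual GROUP BY approach, using the column names from the question (the table name logins is made up). The first query gives the per-day counts; grouping by year and month instead gives the monthly total.

        -- logins per admin per day
        SELECT adminID, DATE(Date) AS day, COUNT(*) AS logins
        FROM logins
        GROUP BY adminID, DATE(Date);

        -- logins per admin per month
        SELECT adminID, YEAR(Date) AS yr, MONTH(Date) AS mth, COUNT(*) AS logins
        FROM logins
        GROUP BY adminID, YEAR(Date), MONTH(Date);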


  • BULK INSERT problem in mysql

    - by kartiku
    Hi, I get an error with the following SQL command for a bulk insert; any help would be appreciated.

        BULK INSERT libra.faculty
        FROM 'd\:faculty.csv'
        WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' );

    Here's the error message:

        ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds
        to your MySQL server version for the right syntax to use near 'BULK INSERT libra.faculty
        FROM 'd:\faculty.csv' WITH ( FIELDTERMINATOR = ',', RO' at line 1
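
    For context, BULK INSERT is SQL Server (T-SQL) syntax, which MySQL does not recognize; hence the 1064 syntax error. MySQL's closest equivalent is LOAD DATA INFILE. A sketch for the same file and table (forward slashes sidestep Windows path-escaping issues; the column layout is assumed to match the CSV):

        LOAD DATA INFILE 'd:/faculty.csv'
        INTO TABLE libra.faculty
        FIELDS TERMINATED BY ','
        LINES TERMINATED BY '\n';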


  • MySQL high IO usage queries

    - by jack
    MySQL has a built-in slow query logger. Are there any options or third-party tools that can detect the queries causing high IO usage, in the same way the slow query logger does?


  • PHP & MySQL Pagination

    - by Francesc
    Hi. I have a MySQL query:

        SELECT * FROM `redirect` WHERE `user_id` = \''.$_SESSION['user_id'].'\' ORDER BY `timestamp`

    I want to paginate 10 results per page. How can I do it?
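
    A sketch of the standard LIMIT/OFFSET approach, assuming ten rows per page and a 1-based page number (the literal values below stand in for whatever comes from the request): the offset is (page - 1) * 10.

        SELECT * FROM `redirect`
        WHERE `user_id` = '...'        -- the session's user id, as in the question
        ORDER BY `timestamp` DESC
        LIMIT 10 OFFSET 20;            -- page 3: offset = (3 - 1) * 10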


  • parse results in MySQL via REGEX

    - by Derek Adair
    Hi, I'm a bit confused about the REGEX support in MySQL, and I have yet to find a solid example of how to split a result apart with REGEX within an SQL statement. Example: how could I pull data from a table emails that looks something like...

        +---------------------------+
        | Emails                    |
        +---------------------------+
        | some.email@yourdomain.com |
        +---------------------------+

    and return something through an SQL statement that looks like...

        +------------+------------+-----+
        | Username   | Domain     | TLD |
        +------------+------------+-----+
        | some.email | yourdomain | com |
        +------------+------------+-----+
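
    Worth knowing: MySQL's REGEXP operator (in the MySQL versions of this era) only tests whether a pattern matches; it cannot extract submatches. For a fixed shape like an e-mail address, SUBSTRING_INDEX does the splitting instead. A sketch against the table above, assuming single-label domains (a multi-part TLD such as .co.uk would need extra handling):

        SELECT
            SUBSTRING_INDEX(Emails, '@', 1)                            AS Username,
            SUBSTRING_INDEX(SUBSTRING_INDEX(Emails, '@', -1), '.', 1)  AS Domain,
            SUBSTRING_INDEX(Emails, '.', -1)                           AS TLD
        FROM emails;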


  • mysql query - blog posts and comments with limit

    - by Lemon
    Hi, I have 2 tables, comments and posts, and I'd like to display a list of 15 posts with a maximum of the 2 most recent comments under each blog post. The database schema looks like this:

        posts_table:    post_id, post_txt, post_timestamp
        comments_table: post_id, comment_txt, comment_timestamp

    What should the MySQL query look like to select 15 posts and their related comments (max 2 most recent ones per post)? Thanks, Leo
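
    This is the classic greatest-n-per-group problem; a sketch of one pre-8.0 way to attack it. The correlated subquery in the join condition counts how many newer comments each comment has and keeps only those with fewer than 2 (ties on comment_timestamp can let extra rows through):

        SELECT p.post_id, p.post_txt, c.comment_txt, c.comment_timestamp
        FROM (SELECT *
              FROM posts_table
              ORDER BY post_timestamp DESC
              LIMIT 15) AS p
        LEFT JOIN comments_table AS c
               ON c.post_id = p.post_id
              AND 2 > (SELECT COUNT(*)
                       FROM comments_table c2
                       WHERE c2.post_id = c.post_id
                         AND c2.comment_timestamp > c.comment_timestamp)
        ORDER BY p.post_timestamp DESC, c.comment_timestamp DESC;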


  • MySQL - How do I inner join sorting the joined data

    - by Gary
    I'm trying to write a report that joins a person, their work, and their hourly wage at the time of the work. I cannot seem to figure out the best way to join the person's cost when the cost date is less than the date of the work. Let's say a person cost $30 per hour at the start of the year, then got a $10 raise on Feb 5 and another raise on Mar 1:

        01/01/2010  $30.00 (per hour)
        02/05/2010  $40.00
        03/01/2010  $45.00

    The person put in hours on several days which span the raises:

        01/05/2010  10 hours (should be at $30/hr)
        01/27/2010   5 hours (again at $30)
        02/10/2010  10 hours (at $40/hr)
        03/03/2010   5 hours (at $45/hr)

    I'm trying to write one SQL statement which will pull the hours, the cost per hour, and hours * cost. The cost is the hourly rate last entered into the system, so the cost date is less than the work date, ordered by cost date, limit 1.

        SELECT person.id, person.name, work.hours, person_costs.value,
               work.hours * person_costs.value AS value
        FROM person
        INNER JOIN work ON (person.id = work.person_id)
        INNER JOIN person_costs ON (person.id = person_costs.person_id
                                    AND person_costs.date < work.date)
        WHERE person.id = 1234
        ORDER BY work.date ASC

    The problem I'm having is that person_costs isn't ordered by date in descending order; it's pulling out "any" value (naturally sorted by record position) which matches the condition. How do I select the first person_cost value which is older than the work date? Thanks!
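
    A sketch of one common fix: pin the join to the latest cost row before each work date with a correlated MAX() subquery (table and column names are taken from the question; whether < or <= is right depends on how a same-day raise should apply):

        SELECT person.id, person.name, work.hours, person_costs.value,
               work.hours * person_costs.value AS cost
        FROM person
        INNER JOIN work ON person.id = work.person_id
        INNER JOIN person_costs ON person_costs.person_id = person.id
               AND person_costs.date = (SELECT MAX(pc2.date)
                                        FROM person_costs pc2
                                        WHERE pc2.person_id = person.id
                                          AND pc2.date < work.date)
        WHERE person.id = 1234
        ORDER BY work.date ASC;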


  • Fail2Ban - Log to MySQL

    - by user319660
    Hi! We have a few servers with SSH open to the public (used for sFTP). Obviously, the attacks are too many. We want to put the ban logs into a MySQL DB to produce stats, etc. Has anyone tried this? Thanks
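
    A hedged sketch of one way this is commonly wired up: fail2ban actions are just shell commands, so a custom action file can insert a row per ban. Everything below (file name, table, credentials) is hypothetical; <name> and <ip> are fail2ban's action tags for the jail name and the banned address.

        # /etc/fail2ban/action.d/mysql-log.conf (hypothetical sketch)
        [Definition]
        actionstart =
        actionstop  =
        actioncheck =
        actionban   = mysql -u fail2ban -pSECRET fail2ban_db -e "INSERT INTO bans (jail, ip, banned_at) VALUES ('<name>', '<ip>', NOW());"
        actionunban =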


  • MySQL ORDER BY date and team

    - by Michael
    I would like to order by date and then team in a MySQL query. It should be something similar to this:

        SELECT * FROM games ORDER BY gamedate ASC, team_id

    And it should output something like this:

        2010-04-12 10:20  Game 1  Team 1
        2010-04-12 11:00  Game 3  Team 1
        2010-04-12 10:30  Game 2  Team 2
        2010-04-14 10:00  Game 4  Team 1

    So that Team 1's games sit together on the same date, but separately on a new date.
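
    A sketch of one way to get that grouping, assuming gamedate is a DATETIME column: sort by the calendar date first, then by team, then by the full timestamp within each team's block for that day.

        SELECT *
        FROM games
        ORDER BY DATE(gamedate) ASC, team_id ASC, gamedate ASC;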


  • MySQL ORDER BY DESC is fast but ASC is very slow

    - by Pepper
    Hello, I'm completely stumped on this one. For some reason, when I sort this query by DESC it's super fast, but if sorted by ASC it's extremely slow. This takes about 150 milliseconds:

        SELECT posts.id FROM posts USE INDEX (published)
        WHERE posts.feed_id IN ( 4953,622,1,1852,4952,76,623,624,10 )
        ORDER BY posts.published DESC LIMIT 0, 50;

    This takes about 32 seconds:

        SELECT posts.id FROM posts USE INDEX (published)
        WHERE posts.feed_id IN ( 4953,622,1,1852,4952,76,623,624,10 )
        ORDER BY posts.published ASC LIMIT 0, 50;

    The EXPLAIN is the same for both queries:

        id  select_type  table  type   possible_keys  key        key_len  ref   rows  Extra
        1   SIMPLE       posts  index  NULL           published  5        NULL  50    Using where

    I've tracked it down to "USE INDEX (published)". If I take that out, it's the same performance both ways, but the EXPLAIN shows the query is less efficient overall:

        id  select_type  table  type   possible_keys  key      key_len  ref  rows  Extra
        1   SIMPLE       posts  range  feed_id        feed_id  4        \N   759   Using where; Using filesort

    And here's the table:

        CREATE TABLE `posts` (
          `id` int(20) NOT NULL AUTO_INCREMENT,
          `feed_id` int(11) NOT NULL,
          `post_url` varchar(255) NOT NULL,
          `title` varchar(255) NOT NULL,
          `content` blob,
          `author` varchar(255) DEFAULT NULL,
          `published` int(12) DEFAULT NULL,
          `updated` datetime NOT NULL,
          `created` datetime NOT NULL,
          PRIMARY KEY (`id`),
          UNIQUE KEY `post_url` (`post_url`,`feed_id`),
          KEY `feed_id` (`feed_id`),
          KEY `published` (`published`)
        ) ENGINE=InnoDB AUTO_INCREMENT=196530 DEFAULT CHARSET=latin1;

    Is there a fix for this? Thanks!
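
    For what it's worth, the usual remedy for this query shape (filter on feed_id, sort on published) is a composite index covering both columns, so the USE INDEX hint can be dropped; a hypothetical sketch. MySQL still sorts across the different feed_id ranges of the IN list, but it only touches matching rows instead of walking the whole published index:

        ALTER TABLE posts ADD INDEX feed_id_published (feed_id, published);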


  • Optimize GROUP BY & ORDER BY query

    - by Jan Hancic
    I have a web page where users upload & watch videos. Last week I asked what the best way is to track video views, so that I could display the most viewed videos this week (videos from all dates). Now I need some help optimizing the query with which I get the videos from the database. The relevant tables are these:

        video (~239371 rows):        VID (int), UID (int), title (varchar), status (enum), type (varchar),
                                     is_duplicate (enum), is_adult (enum), channel_id (tinyint)
        signup (~115440 rows):       UID (int), username (varchar)
        videos_views (~359202 rows after 6 days of collecting data, so this table will grow rapidly):
                                     videos_id (int), views_date (date), num_of_views (int)

    The table video holds the videos, signup holds users, and videos_views holds data about video views (each video can have one row per day in that table). I have this query that does the trick, but it takes ~10 s to execute, and I imagine this will only get worse over time as videos_views grows in size.

        SELECT v.VID, v.title, v.vkey, v.duration, v.addtime, v.UID,
               v.viewnumber, v.com_num, v.rate, v.THB,
               s.username, SUM(vvt.num_of_views) AS tmp_num
        FROM video v
        LEFT JOIN videos_views vvt ON v.VID = vvt.videos_id
        LEFT JOIN signup s ON v.UID = s.UID
        WHERE v.status = 'Converted' AND v.type = 'public'
          AND v.is_duplicate = '0' AND v.is_adult = '0'
          AND v.channel_id <> 10
          AND vvt.views_date >= '2001-05-11'
        GROUP BY vvt.videos_id
        ORDER BY tmp_num DESC
        LIMIT 8

    And here is a screenshot of the EXPLAIN result. So, how can I optimize this?
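
    One hedged starting point, given that the query filters videos_views by views_date and groups by videos_id: a composite index on videos_views so both the join and the date filter are index-resolved (the index name is made up). Note also that the views_date condition in the WHERE clause discards the NULL rows a LEFT JOIN would produce, so that join is effectively an INNER JOIN.

        ALTER TABLE videos_views
            ADD INDEX idx_video_date (videos_id, views_date);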


  • Error querying database in PHP, MySQL

    - by user296516
    Hi guys, I have this code in PHP. It connects to the DB fine, but pops an error when trying to insert the info.

        $dbc = mysqli_connect('localhost', 'root', 'marina', 'aliendatabase')
            or die('Error connecting to MySQL server.');
        $query = "INSERT INTO aliens_abduction (name, email)
                  VALUSE ('John', '[email protected]')";
        $result = mysqli_query($dbc, $query)
            or die('Error querying database.');
        mysqli_close($dbc);

    Here's a screenshot: http://img532.imageshack.us/img532/2930/63306356.jpg Thanks, R
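
    For what it's worth, the INSERT statement misspells the VALUES keyword as VALUSE, which is enough to make mysqli_query fail with exactly this kind of generic die() message. With the keyword corrected, the statement would read:

        INSERT INTO aliens_abduction (name, email)
        VALUES ('John', '[email protected]');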

