Search Results

Search found 34274 results on 1371 pages for 'mysql table'.


  • Most efficient way to make an activity log

    - by Nathan
    I am making a "recent activity" tab for profiles on my site, and I am also going to have a log for moderators to see everything that happens on the site. This requires an activity log of some sort, and I don't know which approach would be better. I have two options:
    1. Make a table called "activity" and, every time someone does something, add a record to it with the type of action, user id, timestamp, etc. Problem: the table could get very long.
    2. Join all three tables (questions, answers, answer_comments) and somehow show everything on the page in the order in which the actions were taken. Problem: this would be extremely hard, because I have no clue how I could make it say "John commented on an answer on Question Title Here" by just joining three tables.
    Does anyone know of a better way of making an activity log in this situation? I am using PHP and MySQL. If this is too inefficient or too hard I will probably just drop the Recent Activity tab on profiles, but I still need an activity log for moderators. Here is some SQL I started writing for option 2, but it would not work because there is no way of detecting whether the action is a comment, question, or answer when I echo the info in a while loop:
      SELECT q.*, a.*, ac.*
      FROM questions q
      JOIN answers a ON a.questionid = q.qid
      JOIN answer_comments ac ON ac.answerid = a.ans_id
      WHERE q.user = $userid AND a.userid = $userid AND ac.userid = $userid
      ORDER BY q.created DESC, a.created DESC, ac.created DESC
    Thanks in advance for any help!
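
    One way to label each row with its type, sketched below, is a UNION ALL over the three tables with a literal activity column; the table and column names (qid, ans_id, created, user/userid) are taken from the question, and the comment id column is an assumption:
      SELECT 'question' AS activity, q.qid AS item_id, q.created
        FROM questions q WHERE q.user = $userid
      UNION ALL
      SELECT 'answer', a.ans_id, a.created
        FROM answers a WHERE a.userid = $userid
      UNION ALL
      SELECT 'comment', ac.id, ac.created
        FROM answer_comments ac WHERE ac.userid = $userid
      ORDER BY created DESC
      LIMIT 25;
    Option 1 (a dedicated activity table) remains the more flexible design for a moderator log, since old rows can be archived or pruned independently of the content tables.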

    Read the article

  • CREATE PROCEDURE fails!?

    - by Mark
    Hi, when trying to create a simple procedure in MySQL 5.1.47-community it fails every time. I've tried everything, even simple things like this:
      DELIMITER //
      CREATE PROCEDURE two ()
      begin
        SELECT 1+1;
      end;
      //
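
    For reference, a minimal sketch of the same procedure as it would be entered in the mysql command-line client; whether the statement is actually being run there or in a GUI tool is an assumption:
      DELIMITER //
      CREATE PROCEDURE two()
      BEGIN
        SELECT 1 + 1;
      END//
      DELIMITER ;
      CALL two();
    DELIMITER is a command of the mysql client, not of the server, so a tool that sends the text straight to the server will reject it and the procedure body has to be submitted through whatever stored-routine dialog that tool provides.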

    Read the article

  • Select multiple records by one query

    - by kofto4ka
    Hello there. Please give me advice on how to construct a select query. I have a table `table` with fields type and obj_id. I want to select all records matching the following array:
      $arr = array(
          0 => array('type' => 1,  'obj_id' => 5),
          1 => array('type' => 3,  'obj_id' => 15),
          2 => array('type' => 4,  'obj_id' => 14),
          3 => array('type' => 12, 'obj_id' => 17),
      );
    I want to select the needed rows with one query; is that possible? Something like:
      select * from `table` where type in (1,3,4,12) and obj_id in (5,15,14,17)
    But this query also returns records with type = 3 and obj_id = 14, or, for example, type = 1 and obj_id = 17. P.S. Moderators, please fix my title, I don't know how to describe my question.
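
    A minimal sketch that matches each (type, obj_id) pair together rather than filtering the two columns independently, using MySQL row constructors; the table and column names are taken from the question:
      SELECT *
      FROM `table`
      WHERE (type, obj_id) IN ((1, 5), (3, 15), (4, 14), (12, 17));
    The pair list would be built in PHP from $arr and, ideally, bound as parameters rather than interpolated into the query string.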

    Read the article

  • MySQL: universal host name change

    - by ctrlShiftBryan
    I am making some updates to a PHP site which I did not design. I have a local copy of the site. At the top of each page there are settings for the host name for the DB connection. Is there some way I can set up a pointer to the remote address? The address is 'mysqlhost', for example, and I want that to point to 'mysql.myhost.com'. I tried creating a HOSTS entry for mysqlhost pointing to the IP address it resolves to, but that doesn't work. If I put 'mysql.myhost.com' in the connection it works; if I put that IP address it doesn't, so that is probably why the HOSTS entry idea doesn't work. Other than creating a local copy of the DB, is there a quick way so that I don't have to modify each file in my dev environment and then again when I redeploy?

    Read the article

  • Data pagination with "saved" search

    - by TiuTalk
    I'm creating a page that shows a search result. When you're viewing one of the results, at the bottom of the page I need to insert "Next result" and "Previous result" links, like pagination, but based on (I think) a saved search. How would you do this? Note: I'll use CakePHP (PHP) and MySQL.
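
    One possible sketch in SQL: persist the search parameters (or the ordered list of matching ids), re-run the saved search with a deterministic ORDER BY, and fetch the neighbours of the current result with LIMIT/OFFSET. The table and column names below are assumptions, not from the question:
      -- current result is at 0-based position 42 within the saved search
      SELECT id, title
      FROM saved_search_results        -- the saved search re-executed, or its ids stored per search
      ORDER BY score DESC, id
      LIMIT 3 OFFSET 41;               -- returns the previous, current and next results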

    Read the article

  • How to fix "The requested URL /phpMyAdmin was not found on this server"

    - by user1392840
    I have installed Apache, PHP and MySQL on Mac OS X 10.8.1. After installing, when I type this in my web browser, it gives the error message:
      Not Found
      The requested URL /News-2012-Academy-Awards-53.html was not found on this server.
      Apache/2.2.22 (Unix) mod_fastcgi/2.4.6 mod_ssl/2.2.22 OpenSSL/0.9.8r DAV/2 PHP/5.3.13 with Suhosin-Patch mod_wsgi/3.3 Python/2.7.2 Server at clontarf.girlsacademy.com.au Port 80
    Please help me to solve it.

    Read the article

  • PHP and storing stats

    - by John
    Using PHP5 and the latest version of MySQL, I want to be able to track impressions and clicks for business listings. My question is: if I did this myself, what would be the best method of storing the data so I can run reports? Previously I just had a table that held the listing id, the user's IP address, whether it was a click or an impression, and the date it was tracked. However, the database is approaching 2 GB of data and it's very slow. Part of the problem is that it's a pretty simple script that records impressions and clicks from anyone, including search engines and basically anyone or anything that accesses the listing page. Is there an API or file out there with an up-to-date list that can detect whether the visitor is actually a person and not a spider, so I don't fill up the database with unneeded stats? Just looking for suggestions: do I keep a raw table that just collects the hits, then run a cron job at night to tally up the day for each listing and each IP and store the cumulative stats in a different table? Also, what storage engine should it be? InnoDB? MyISAM?
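
    A minimal sketch of the raw-hits-plus-nightly-rollup approach described above; all table and column names are assumptions:
      -- raw hit log, written on every impression or click
      CREATE TABLE listing_hits (
        listing_id INT UNSIGNED NOT NULL,
        ip         VARBINARY(16) NOT NULL,
        hit_type   ENUM('impression', 'click') NOT NULL,
        hit_at     DATETIME NOT NULL,
        KEY idx_hit_at (hit_at)
      ) ENGINE=InnoDB;

      -- compact per-day summary that reports actually read
      CREATE TABLE listing_daily_stats (
        listing_id  INT UNSIGNED NOT NULL,
        stat_date   DATE NOT NULL,
        impressions INT UNSIGNED NOT NULL,
        clicks      INT UNSIGNED NOT NULL,
        PRIMARY KEY (listing_id, stat_date)
      ) ENGINE=InnoDB;

      -- nightly cron job: roll yesterday's raw hits into the summary
      INSERT INTO listing_daily_stats (listing_id, stat_date, impressions, clicks)
      SELECT listing_id,
             DATE(hit_at),
             SUM(hit_type = 'impression'),
             SUM(hit_type = 'click')
      FROM listing_hits
      WHERE hit_at >= CURDATE() - INTERVAL 1 DAY
        AND hit_at <  CURDATE()
      GROUP BY listing_id, DATE(hit_at);
    The raw table can then be pruned or archived after the rollup, which keeps the working set small regardless of bot traffic.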

    Read the article

  • MySQLdb query not returning all rows

    - by RBK
    I am trying to do a simple fetch using MySQLdb in Python. I have 2 tables (Accounts & Products). I have to look up the Accounts table, get acc_id from it, and query the Products table using it. The Products table has more than 10 rows, but when I run this code it randomly returns between 0 and 6 rows each time. Here's the code snippet:
      # Set up connection
      con = mdb.connect('db.xxxxx.com', 'user', 'password', 'mydb')
      # Create cursor
      cur = con.cursor()
      # Execute query
      # account_num is alpha-numeric and comes from the preceding part of the program
      cur.execute("SELECT acc_id FROM Accounts WHERE ext_acc = '%s'" % account_num)
      # A tuple is returned, so get the 0th item from it
      acc_id = cur.fetchone()[0]
      print "account_id = ", acc_id
      # Close the cursor - I was not sure if I can reuse it
      cur.close()
      # Reopen the cursor
      cur = con.cursor()
      # Second query
      cur.execute("SELECT * FROM Products WHERE account_id = %d" % acc_id)
      keys = cur.fetchall()
      print cur.rowcount  # This prints an incorrect row count
      for key in keys:
          # Does not print all rows. Tried to print keys directly instead of iterating - same result :(
          print key
      # Closing the cursor & connection
      cur.close()
      con.close()
    The weird part is, I tried to step through the code using a debugger (PyDev on Eclipse) and it correctly gets all rows (both the value stored in the variable 'keys' and the console output are correct). I am sure my DB has correct data, since I ran the same SQL on the MySQL console and got the correct result. Just to be sure I was not improperly closing the connection, I tried using "with con" instead of manually closing the connection, and it's the same result. I did RTFM but I couldn't find much in it to help me with this issue. Where am I going wrong? Thank you. EDIT: I noticed another weird thing now. In the line cur.execute("SELECT * FROM Products WHERE account_id = %d" % acc_id), I hard-coded the acc_id value, i.e. made it cur.execute("SELECT * FROM Products WHERE account_id = %d" % 322), and it returns all rows.

    Read the article

  • Calculating average (AVG) and grouping by week on large data set takes too long

    - by caioiglesias
    I'm getting average prices by week on 7 million rows, and it takes around 30 seconds to get the job done. This is the simple query:
      SELECT AVG(price) AS price, YEARWEEK(FROM_UNIXTIME(timelog)) AS week
      FROM pricehistory
      WHERE timelog > $range AND product_id = $id
      GROUP BY week
    The only week whose data actually changes, and is therefore worth averaging every time, is always the last one, so running this calculation over the whole period is a waste of resources. I just wanted to know if MySQL has a tool to help out with this.
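
    A common sketch is to pre-aggregate the closed weeks into a summary table and only compute the current week live; the summary table name and columns below are assumptions:
      CREATE TABLE weekly_price_avg (
        product_id INT NOT NULL,
        week       INT NOT NULL,          -- YEARWEEK() value
        price      DECIMAL(12,4) NOT NULL,
        PRIMARY KEY (product_id, week)
      );

      -- refresh closed weeks (run once, then again whenever a week ends)
      INSERT INTO weekly_price_avg (product_id, week, price)
      SELECT product_id, YEARWEEK(FROM_UNIXTIME(timelog)), AVG(price)
      FROM pricehistory
      WHERE YEARWEEK(FROM_UNIXTIME(timelog)) < YEARWEEK(NOW())
      GROUP BY product_id, YEARWEEK(FROM_UNIXTIME(timelog))
      ON DUPLICATE KEY UPDATE price = VALUES(price);

      -- at query time: stored weeks plus the live current week
      SELECT week, price FROM weekly_price_avg WHERE product_id = $id
      UNION ALL
      SELECT YEARWEEK(FROM_UNIXTIME(timelog)), AVG(price)
      FROM pricehistory
      WHERE product_id = $id
        AND YEARWEEK(FROM_UNIXTIME(timelog)) = YEARWEEK(NOW());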

    Read the article

  • How to parse one table from a webpage that contains many tables

    - by Harikrishna
    There are many tables in one webpage, and I want to extract the data from only one of them. I am using Html Agility Pack to parse the HTML. I can find the table I want to extract the data from; the problem is, once I have found that table, what should I do to extract the data from only that table?

    Read the article

  • Detection of 'flush tables with read lock' in PHP

    - by theduke0
    I would like to know from my application whether a MyISAM table can accept writes (i.e. is not locked). If an exception is thrown, everything is fine, as I can catch it and log the failed statement to a file. However, if a 'flush tables with read lock' command has been issued (possibly for a backup), the query I send will pretty much hang forever. If one table is locked at a time, INSERT DELAYED works well, but when this global lock is applied, my query just waits. The query I run is an INSERT statement. If this statement fails or hangs, user experience is degraded. I need a way to send the query to the server and (pretty much) forget about it. Does anyone have any suggestions on how to deal with this?
    - set a query timeout?
    - run an asynchronous request and allow the lock to expire while the application continues?
    - fork my PHP process?
    Please let me know if I can provide any clarification or details.

    Read the article

  • Can PHP query the results from a previous query?

    - by eaolson
    In some languages (ColdFusion comes to mind), you can run a query on the result set from a previous query. Is it possible to do something like that in PHP (with MySQL as the database)? I sort of want to do:
      $rs1 = do_query( "SELECT * FROM animals WHERE type = 'fish'" );
      $rs2 = do_query( "SELECT * FROM rs1 WHERE name = 'trout'" );
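
    On the MySQL side, the closest equivalent is a derived table (a subquery in the FROM clause); a minimal sketch using the tables from the question:
      SELECT *
      FROM (SELECT * FROM animals WHERE type = 'fish') AS rs1
      WHERE rs1.name = 'trout';
    In PHP itself, the first result set would otherwise have to be fetched into an array and filtered in application code.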

    Read the article

  • Is it possible to capture data from a WHERE clause?

    - by Kristopher Ives
    I have a scenario where I'm calculating something in the WHERE clause of my SQL, but I also want to retrieve that calculation, since it's expensive. Is it possible to get the result of something done in the WHERE clause, like this:
      SELECT `foo` FROM `table` WHERE (foo = LongCalculation())
    Wishful thinking, or possible with MySQL?
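
    A minimal sketch of the usual MySQL workaround: compute the value once in the SELECT list and filter on the alias with HAVING, which MySQL (as an extension) allows to reference select aliases. LongCalculation stands in for whatever expensive expression is actually used:
      SELECT `foo`, LongCalculation() AS calc
      FROM `table`
      HAVING foo = calc;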

    Read the article

  • jQuery + ajax livesearch

    - by andrei
    I am doing a MySQL database search and retrieving some results via an AJAX live search, using the example on W3Schools, and I want to manipulate those results (drag and drop them). But I'm having a problem: the script loads before you enter the search and get the results, so it does absolutely nothing on the search results. Any thoughts on this matter?

    Read the article

  • How to Get / Set Div and Table Width / Height

    - by Nasser Hajloo
    I have a table (or a region) and want to set its width and height values on another div (or region). The second one is actually an Ajax indicator modal which displays loading text when the page posts back asynchronously. Here is the example:
      <table id="MainTable">
        <tr>
          <td>
            Content ....
          </td>
        </tr>
      </table>
      <div id="Progress">
        Ajax Indicator
      </div>
    The following JavaScript didn't work:
      document.getElementById("Progress").style.width = document.getElementById("MainTable").style.width;
      document.getElementById("Progress").style.height = document.getElementById("MainTable").style.height;
    It should work on both IE and Firefox. How can I correct it? I checked some other solutions on Stack Overflow but I couldn't fix it. I'm waiting to hear from you.

    Read the article

  • Upload data into the database

    - by rajson
    Hi friends, I wish to store a user's image and resume in the database. I am using a MySQL database and PHP5. I want to know which data type I should set for those fields, and how to set the range (maximum size) for the uploaded data.
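
    A minimal sketch of a table that stores the uploaded files as BLOBs; the table and column names are assumptions. MEDIUMBLOB holds up to 16 MB and LONGBLOB up to 4 GB, and the effective upload size is also capped by the server's max_allowed_packet setting:
      CREATE TABLE user_uploads (
        user_id      INT UNSIGNED NOT NULL,
        photo        MEDIUMBLOB,            -- up to 16 MB
        resume       MEDIUMBLOB,            -- up to 16 MB
        resume_mime  VARCHAR(100),
        PRIMARY KEY (user_id)
      );
    An equally common design is to store only a file path in a VARCHAR column and keep the files themselves on disk.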

    Read the article

  • mysqldump problem with case sensitivity? Win -> Linux

    - by acidzombie24
    When I dump a table with uppercase letters using mysqldump, it comes out as lowercase in my dump.sql file. I found a report about this from 2006, almost 4 years old: http://bugs.mysql.com/bug.php?id=19967 A solution there suggests making Linux case-insensitive. I'd rather not, if possible. What's the easiest way to copy a Win32 DB onto Linux?

    Read the article

  • What is wrong with this SQL syntax?

    - by Finbarr
      UPDATE files SET filepath = REPLACE(filepath, `sites/somedomain.com/files/`, `sites/someotherdomain.com/files/`);
    I have a table called files with a field called filepath. MySQL returns this error:
      Unknown column 'sites/somedomain.com/files/' in 'field list'
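
    The backticks quote identifiers (table and column names), which is why MySQL complains about an unknown column; string literals need single quotes. A corrected sketch of the same statement:
      UPDATE files
      SET filepath = REPLACE(filepath, 'sites/somedomain.com/files/', 'sites/someotherdomain.com/files/');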

    Read the article

  • How to get the latest entry of an item when the item has multiple rows?

    - by I Like PHP
    I have a table tbl_exp:
      id | exp_id | qnty | last_update
       1 |     12 |   10 | 2010-05-18 19:34:29
       2 |     13 |   50 | 2010-05-19 19:34:29
       3 |     12 |   50 | 2010-05-19 19:34:29
       4 |     15 |   50 | 2010-05-18 19:34:29
       5 |     18 |   50 | 2010-05-20 19:34:29
       6 |     13 |   70 | 2010-05-20 19:34:29
    Now I need only the latest entry for each exp_id:
      id | exp_id | qnty | last_update
       3 |     12 |   50 | 2010-05-19 19:34:29
       6 |     13 |   70 | 2010-05-20 19:34:29
       4 |     15 |   50 | 2010-05-18 19:34:29
       5 |     18 |   50 | 2010-05-20 19:34:29
    Please suggest the MySQL query to retrieve the above result. Thanks!
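
    A minimal groupwise-maximum sketch: find the newest last_update per exp_id, then join back to pick up the full row (this assumes last_update is unique per exp_id, as in the sample data):
      SELECT t.*
      FROM tbl_exp t
      JOIN (SELECT exp_id, MAX(last_update) AS last_update
            FROM tbl_exp
            GROUP BY exp_id) latest
        ON latest.exp_id = t.exp_id
       AND latest.last_update = t.last_update;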

    Read the article

  • Skip all databases and run only a specific one

    - by Ergec
    I have a SQL file generated by "mysqldump --all-databases". There are many databases in it. What I want to do is update my local copy of only one specific database, not all of them. I tried "mysql --database=db_name < file.sql", but it updated all databases. Is there a way to skip every database except the one that I want?

    Read the article
