Search Results

Search found 40581 results on 1624 pages for 'mysql select db'.

Page 196/1624 | < Previous Page | 192 193 194 195 196 197 198 199 200 201 202 203  | Next Page >

  • Load Balancing of PHP/MYSQL script without big code changes

    - by DR.GEWA
    Sorry for the dumb question, but... I am building a script in PHP/MySQL (CodeIgniter) and I am extremely interested to know whether there is a way, without big architectural changes to the script, to set up load balancing. Right now I will rent a medium dedicated server with 2GB RAM, 200GB of disk and a good processor, and this will be enough for, say, half a year for the users who will come. But as they grow in number (it's a social network, and at night the server may be handling 500-1500 or even 5000-8000 users online), I wonder if there is a way to simply add a second server with some configuration that takes on the extra load, then another one after that, and so on. <? if ($answer == YES) { how(??); } else { whatToDo(??); } ?> If there is no way, then maybe you could point me to the easiest load balancing solution. I would also be extremely thankful if you could tell me whether, for this purpose, I should move to, say, PostgreSQL or Firebird, and which of them will be easier to handle in the future. On the mysite.com/users/show/$userId page I am running something like 60 queries for all the data... maybe too much, but anyway, after some optimization it could be 20-30.

    Read the article

  • Escaping single quote in PHP when inserting into MySQL

    - by hairdresser-101
    PLEASE NOTE: I am reposting as the original was not answered correctly... I AM LOOKING FOR THE WHY, NOT THE SOLUTION. I KNOW THE SOLUTION; WHAT I DON'T UNDERSTAND IS THE WHY! I have a perplexing issue that I can't seem to comprehend, and I'm hoping someone here might be able to point me in the right direction. I have two SQL statements: the first enters information from a form into the database; the second takes data from the database entered above, sends an email, and then logs the details of the transaction. The problem is that it appears a single quote is triggering a MySQL error on the second entry only! The first instance works without issue, but the second instance triggers mysql_error(). Does data submitted from a form get handled differently from data read back out of the database?

    Query #1 - This works without issue (and without escaping the single quote):

        $result = mysql_query("INSERT INTO job_log (order_id, supplier_id, category_id, service_id, qty_ordered, customer_id, user_id, salesperson_ref, booking_ref, booking_name, address, suburb, postcode, state_id, region_id, email, phone, phone2, mobile, delivery_date, stock_taken, special_instructions, cost_price, cost_price_gst, sell_price, sell_price_gst, ext_sell_price, retail_customer, created, modified, log_status_id) VALUES ('$order_id', '$supplier_id', '$category_id', '{$value['id']}', '{$value['qty']}', '$customer_id', '$user_id', '$salesperson_ref', '$booking_ref', '$booking_name', '$address', '$suburb', '$postcode', '$state_id', '$region_id', '$email', '$phone', '$phone2', '$mobile', STR_TO_DATE('$delivery_date', '%d/%m/%Y'), '$stock_taken', '$special_instructions', '$cost_price', '$cost_price_gst', '$sell_price', '$sell_price_gst', '$ext_sell_price', '$retail_customer', '".date('Y-m-d H:i:s', time())."', '".date('Y-m-d H:i:s', time())."', '1')");

    Query #2 - This fails when entering a name with a single quote (i.e. O'Brien):

        $query = mysql_query("INSERT INTO message_log (order_id, timestamp, message_type, email_from, supplier_id, primary_contact, secondary_contact, subject, message_content, status) VALUES ('$order_id', '".date('Y-m-d H:i:s', time())."', '$email', '$from', '$row->supplier_id', '$row->primary_email', '$row->secondary_email', '$subject', '$message_content', '1')");
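
    One likely explanation of the "why" (an assumption, since the PHP configuration isn't shown): on older PHP setups with magic_quotes_gpc enabled, form input in $_POST is escaped automatically, so query #1 survives, while values read back out of the database are raw strings again (O'Brien) and so break query #2. A minimal sketch of guarding the second query, using the legacy mysql_* API from the question and only a few of its columns:

        <?php
        // Values fetched from the database are raw strings again, so they must
        // be escaped before being re-embedded in a new SQL statement.
        $subject         = mysql_real_escape_string($subject);
        $message_content = mysql_real_escape_string($message_content);

        $query = mysql_query("INSERT INTO message_log (order_id, subject, message_content, status)
                              VALUES ('$order_id', '$subject', '$message_content', '1')");
        if (!$query) {
            die('Invalid query: ' . mysql_error());
        }
        ?>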

    Read the article

  • MySQL BinLog Statement Retrieval

    - by Jonathon
    I have seven 1GB MySQL binlog files that I have to use to retrieve some "lost" information. I only need to get certain INSERT statements from the log (e.g. where the statement starts with "INSERT INTO table SET field1="). If I just run mysqlbinlog (even per database, and using --short-form), I get a text file that is several hundred megabytes, which makes it almost impossible to then parse with any other program. Is there a way to retrieve only certain SQL statements from the log? I don't need any of the ancillary information (timestamps, auto-increment numbers, etc.). I just need a list of SQL statements that match a certain string. Ideally, I would like to have a text file that just lists those SQL statements, such as: INSERT INTO table SET field1='a'; INSERT INTO table SET field1='tommy'; INSERT INTO table SET field1='2'; I could get that by running mysqlbinlog to a text file and then parsing the results based on a string, but the text file is way too big. It just times out any script I run and even makes it impossible to open in a text editor. Thanks for your help in advance.
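
    One rough approach, sketched below, is to stream the mysqlbinlog output through a pipe and keep only the matching lines, so the full multi-hundred-megabyte dump never has to sit in a file or in memory (the binlog filename and match string are placeholders):

        <?php
        // Stream mysqlbinlog output line by line and keep only matching INSERTs.
        $pipe = popen('mysqlbinlog --short-form mysql-bin.000001', 'r');
        $out  = fopen('matching-inserts.sql', 'w');

        while (($line = fgets($pipe)) !== false) {
            if (stripos($line, "INSERT INTO table SET field1=") === 0) {
                fwrite($out, $line);
            }
        }

        pclose($pipe);
        fclose($out);
        ?>

    The same filtering can also be done on the command line by piping mysqlbinlog into grep.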

    Read the article

  • conditional update records mysql query

    - by Shakti Singh
    Is there any single MySQL query which can update the customers' DOB? I want to update the DOB of those customers who have a DOB greater than the current date. For example, if a customer has a DOB in 2034, update it to 1934; if 2068, update it to 1968. There was a bug in my system: if you entered a date earlier than 1970, it was stored a century later (2070 instead of 1970). The bug is solved now, but the customers who already have a wrong DOB still need fixing, so I have to update their DOB. All customers are stored in the customer_entity table, and entity_id is the customer id. Details are as follows:

        mysql> desc customer_entity;
        +------------------+----------------------+------+-----+---------------------+----------------+
        | Field            | Type                 | Null | Key | Default             | Extra          |
        +------------------+----------------------+------+-----+---------------------+----------------+
        | entity_id        | int(10) unsigned     | NO   | PRI | NULL                | auto_increment |
        | entity_type_id   | smallint(8) unsigned | NO   | MUL | 0                   |                |
        | attribute_set_id | smallint(5) unsigned | NO   |     | 0                   |                |
        | website_id       | smallint(5) unsigned | YES  | MUL | NULL                |                |
        | email            | varchar(255)         | NO   | MUL |                     |                |
        | group_id         | smallint(3) unsigned | NO   |     | 0                   |                |
        | increment_id     | varchar(50)          | NO   |     |                     |                |
        | store_id         | smallint(5) unsigned | YES  | MUL | 0                   |                |
        | created_at       | datetime             | NO   |     | 0000-00-00 00:00:00 |                |
        | updated_at       | datetime             | NO   |     | 0000-00-00 00:00:00 |                |
        | is_active        | tinyint(1) unsigned  | NO   |     | 1                   |                |
        +------------------+----------------------+------+-----+---------------------+----------------+
        11 rows in set (0.00 sec)

    The DOB is stored in the customer_entity_datetime table; the value column contains the DOB. This table also holds the values of the other attributes (fname, lname, etc.); the attribute with attribute_id 11 is the DOB attribute.

        mysql> desc customer_entity_datetime;
        +----------------+----------------------+------+-----+---------------------+----------------+
        | Field          | Type                 | Null | Key | Default             | Extra          |
        +----------------+----------------------+------+-----+---------------------+----------------+
        | value_id       | int(11)              | NO   | PRI | NULL                | auto_increment |
        | entity_type_id | smallint(8) unsigned | NO   | MUL | 0                   |                |
        | attribute_id   | smallint(5) unsigned | NO   | MUL | 0                   |                |
        | entity_id      | int(10) unsigned     | NO   | MUL | 0                   |                |
        | value          | datetime             | NO   |     | 0000-00-00 00:00:00 |                |
        +----------------+----------------------+------+-----+---------------------+----------------+
        5 rows in set (0.01 sec)

    Thanks.
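
    A single-statement sketch, assuming (as described above) that attribute_id 11 is the DOB attribute and that every future-dated value is exactly one century off:

        UPDATE customer_entity_datetime
           SET value = value - INTERVAL 100 YEAR
         WHERE attribute_id = 11
           AND value > NOW();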

    Read the article

  • highlighting search results in php/mysql

    - by fusion
    How do I highlight search results from a MySQL query using PHP? This is my code:

        $search_result = "";
        $search_result = $_GET["q"];
        $result = mysql_query('SELECT cQuotes, vAuthor, cArabic, vReference FROM thquotes WHERE cQuotes LIKE "%' . $search_result . '%" ORDER BY idQuotes DESC', $conn) or die ('Error: ' . mysql_error());

        function h($s) {
            echo htmlspecialchars($s, ENT_QUOTES);
        }
        ?>
        <div class="center_div">
          <table>
            <caption>Search Results</caption>
            <?php while ($row = mysql_fetch_array($result)) { ?>
            <tr>
              <td style="text-align:right; font-size:15px;"><?php h($row['cArabic']) ?></td>
              <td style="font-size:16px;"><?php h($row['cQuotes']) ?></td>
              <td style="font-size:12px;"><?php h($row['vAuthor']) ?></td>
              <td style="font-size:12px; font-style:italic; text-align:right;"><?php h($row['vReference']) ?></td>
            </tr>
            <?php } ?>
          </table>
        </div>
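
    A possible sketch (the CSS class name is made up): HTML-escape the text first, then wrap case-insensitive matches of the (also escaped) search term:

        <?php
        // Escape for HTML first, then wrap each case-insensitive match of the
        // search term in a span that can be styled as a highlight.
        function highlight($text, $term) {
            $safeText = htmlspecialchars($text, ENT_QUOTES);
            $safeTerm = preg_quote(htmlspecialchars($term, ENT_QUOTES), '/');
            return preg_replace('/(' . $safeTerm . ')/i', '<span class="hl">$1</span>', $safeText);
        }

        // e.g. instead of h($row['cQuotes']):
        // echo highlight($row['cQuotes'], $search_result);
        ?>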

    Read the article

  • MySQL efficiency as it relates to the database/table size

    - by mlissner
    I'm building a system using django, Sphinx and MySQL that's very quickly becoming quite large. The database currently has about 2000 rows, and I've written a program that's going to populate it with another 40,000 rows in a couple days. Since the database is live right now, and since I've never had a database with this much information in it, I'm worried about some things: Is adding all these rows going to seriously degrade the efficiency of my django app? Will I need to go back through it and optimize all my database calls so they're doing things more cleverly? Or will this make the database slow all around to the extent that I can't do anything about it at all? If you scoff at my 40k rows, then, my next question is, at what point SHOULD I be concerned? I will likely be adding another couple hundred thousand soon, so I worry, and I fret. How is sphinx going to feel about all this? Is it going to freak out when it realizes it has to index all this data? Or will it be fine? Is this normal for it? If it is, at what point should I be concerned that it's too much data for Sphinx? Thanks for any thoughts.

    Read the article

  • security deleting a mysql row with jQuery $.post

    - by FFish
    I want to delete a row in my database and found an example of how to do this with jQuery's $.post(). Now I am wondering about security, though: can someone send a POST request to my delete-row.php script from another website?

    JS:

        function deleterow(id) {
            // alert(typeof(id)); // number
            if (confirm('Are you sure want to delete?')) {
                $.post('delete-row.php', {album_id:+id, ajax:'true'}, function() {
                    $("#row_"+id).fadeOut("slow");
                });
            }
        }

    PHP (delete-row.php):

        <?php
        require_once("../db.php");
        mysql_connect(DB_SERVER, DB_USER, DB_PASSWORD) or die("could not connect to database " . mysql_error());
        mysql_select_db(DB_NAME) or die("could not select database " . mysql_error());
        if (isset($_POST['album_id'])) {
            $query = "DELETE FROM albums WHERE album_id = " . $_POST['album_id'];
            $result = mysql_query($query);
            if (!$result) die('Invalid query: ' . mysql_error());
            echo "album deleted!";
        }
        ?>
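
    Yes, any site (or a plain curl call) can POST to the script, so it has to protect itself. A rough sketch, assuming a logged-in session and a per-session token echoed into the page (the session keys and token handling are assumptions):

        <?php
        require_once("../db.php");
        session_start();

        // Only accept the request from a logged-in user with a valid token.
        if (!isset($_SESSION['user_id'], $_POST['album_id'], $_POST['token'])
            || $_POST['token'] !== $_SESSION['csrf_token']) {
            die('not allowed');
        }

        // Casting to int also stops SQL injection through album_id.
        $album_id = (int) $_POST['album_id'];

        mysql_connect(DB_SERVER, DB_USER, DB_PASSWORD) or die(mysql_error());
        mysql_select_db(DB_NAME) or die(mysql_error());

        $result = mysql_query("DELETE FROM albums WHERE album_id = " . $album_id);
        if (!$result) die('Invalid query: ' . mysql_error());
        echo "album deleted!";
        ?>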

    Read the article

  • MySQL GIS and Spatial Extensions - how to map regions and query against them

    - by chibineku
    I am trying to make a smartphone app which will return a list of users within a certain proximity, say 100m. It's easy to get the coordinates of my BlackBerry and write them to a database, but in order to return a list of other users within 100m, I need to pull every other record from the database and compare the distance between the two points, checking to see if it's within range, before outputting that user's information. This is going to be time consuming if there are many users involved. So I would like to map areas (countries, cities, I'm not yet sure of the resolution I'll need) so that I can first target a smaller subset of all users. This will save on processing time. I have read the basics of GIS and spatial querying on the mysql website but to be honest the query is over my head and I hate copying and pasting code without understanding it. Plus it only checks for proximity - I want to first check if a coordinate falls within a certain area. Does anyone have any experience of such matters and feel like giving me some pointers? Resources such as any preexisting databases of points describing countries as polygons would be really helpful too. Many thanks to anyone who takes the time :)
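
    Before reaching for the spatial extensions, a common first cut is a plain bounding-box pre-filter on indexed lat/lng columns, so only nearby rows ever get an exact distance check; a sketch (the table and column names are made up):

        <?php
        // Roughly 111,320 m per degree of latitude; longitude degrees shrink
        // by the cosine of the latitude.
        $radius   = 100; // metres
        $latDelta = $radius / 111320;
        $lngDelta = $radius / (111320 * cos(deg2rad($lat)));

        $sql = sprintf(
            "SELECT user_id, lat, lng FROM positions
              WHERE lat BETWEEN %F AND %F
                AND lng BETWEEN %F AND %F",
            $lat - $latDelta, $lat + $latDelta,
            $lng - $lngDelta, $lng + $lngDelta
        );
        // The few rows this returns can then be distance-checked exactly.
        ?>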

    Read the article

  • MySQL multidimensional arrays...

    - by jay
    What is the best way to store data that is dynamic in nature using MySQL? Let's say I have a table in which one item is "dynamic". For some entries I need to store one value, but for others it could be one hundred values. For example, let's say I have the following simple table: CREATE TABLE manager ( name char(50), worker_1_name char(50), worker_2_name char(50), ... worker_N_name char(50) ); Clearly, this is not an ideal way to set up a database. Because I have to accommodate the largest group that a manager could potentially have, I am wasting a lot of space in the database. What I would prefer is to have a table that I can use as a member of another table (like I would do in C++ through inheritance) that can be used by the "manager" table to handle the variable number of employees. It might look something like this: CREATE TABLE manager ( name char(50), underlings WORKERS ); CREATE TABLE WORKERS ( name char(50) ); I would like to be able to add a variable number of workers to each manager. Is this possible, or am I constrained to enumerating all the possible numbers of employees even though I will use the full complement only rarely?
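
    The usual relational shape for this is a second table with one row per worker, pointing back at its manager; a sketch:

        CREATE TABLE manager (
            id   INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
            name CHAR(50)
        );

        CREATE TABLE worker (
            id         INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
            manager_id INT UNSIGNED NOT NULL,  -- which manager this worker reports to
            name       CHAR(50),
            KEY (manager_id)
        );

        -- One manager can have any number of workers, with no wasted columns:
        SELECT w.name FROM worker w WHERE w.manager_id = 1;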

    Read the article

  • Load Spikes on a Apache MySQL Server with Wordpress MU

    - by Vikram Goyal
    Hi there, I am trying to investigate the reasons for some mysterious load spikes on a Linux Apache server (2.2.14) running PHP 5.2.9 on a dedicated server with enough processing power and memory. My primary web application is a Wordpress MU (2.9.2) installation. I have investigated and ruled out DOS attack, MySQL or Apache configuration issues. The log files don't give me anything of interest, except to tell me that there is severe load. The load (which can go up to 100) just seems to come and go. It helps that I have a script that checks every 3 minutes for the load, and restarts Apache. Restarting it helps, and the server comes back, till it happens again. There seems to be no set time frame, or visitor numbers on the site that can trigger this. Even a low number of concurrent visitors (20) can trigger it. I am almost convinced that there is a rewrite loop somewhere that is causing Apache to go mad. Apache is trying to serve something that is causing it to spawn more and more processes till it keels over. My question is: Given that I am convinced that this is a rewrite issue or something similar, how can I try and figure out what the issue is? What should I monitor? Apache logs are voluminous, and not very helpful. Of course, if this is not the issue, then at least knowing what to look for will help me eliminate this as an issue and look for something else. Thanks! Vikram

    Read the article

  • Let multiple highcharts charts appear automatically from mysql data

    - by martini1993
    I have the following problem: I want to make multiple Highcharts web charts appear automatically, based on the data from the database. Let's say we have the following database (this is an example; there are many more rows in the real database):

        +------+-------+----+-------------+------+--------+
        | Year | Month | ID | Name User   | Wins | Losses |
        +------+-------+----+-------------+------+--------+
        | 2013 | 1     | 21 | Tony Stark  | 3    | 12     |
        | 2013 | 1     | 52 | Bruce Wayne | 5    | 4      |
        | 2013 | 1     | 76 | Clark Kent  | 9    | 5      |
        +------+-------+----+-------------+------+--------+

    And I have the following query:

        SELECT a.year AS year1, a.month AS month1, a.id AS id, a.name AS nameuser,
               a.wins AS wins, a.losses AS losses
          FROM Sales a
         WHERE a.month = 1 AND a.year = YEAR(NOW())

    With this, it is very easy to hardcode a chart with Highcharts. But what I want is one web chart per user. So instead of a single chart with all the users in it, I want multiple charts next to each other, based on the data from the database. So instead of this: http://jsfiddle.net/CWSb6/ I want this (but then next to each other): http://jsfiddle.net/DReMD/ It has to be generated automatically with PHP and MySQL. So if a new user starts this month and is saved in the database, the page automatically displays the new user with the related web chart. I find this very hard to accomplish and I need some help to get pointed in the right direction. Many thanks in advance! (Sorry for my bad English.)

    Read the article

  • php mysql parallel array checkboxes

    - by gramware
    I have an array of checkboxes that I edit at once to set a 'tinyint' field. The problem comes in when I uncheck a checkbox and post the values to MySQL. Since the form posts an array of checkboxes and another, parallel array of values to edit, unchecking a checkbox results in the 0 value being ignored by $_POST; hence the checkbox array will be shorter by the number of unchecked values in the form, while the array to be edited will have all the records in the form. Here is the submit code:

        while($row=mysql_fetch_array($result)) {
            $checked = ($row[active]==1) ? 'checked="checked"' : '';
            ...
            echo "<input type='hidden' name='TrID[]' value='$TrID'>";
            echo "<input type='checkbox' name='active1[]' value='$row[active]''$checked' >";
            ...

    And the processing PHP script:

        $userid = ($_POST['TrID']);
        $checked = ($_POST['active']);
        $i = 0;
        foreach ($userid as $usid) {
            if ($checked[$i] == 1) {
                $check = 1;
            } else {
                $check = 0;
            }
            $qry1 = "UPDATE `epapers`.`clientelle` SET `active` = '$check' WHERE `clientelle`.`user_id` = '$usid' ";
            $result = mysql_query($qry1);
            $i++;
        }
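
    One common fix, sketched with the question's own table and column names: key the checkbox by the record id so the two arrays no longer have to stay parallel, and treat a missing key as 0:

        // In the form: the checkbox is keyed by the record's id.
        echo "<input type='hidden' name='TrID[]' value='$TrID'>";
        echo "<input type='checkbox' name='active[$TrID]' value='1' $checked>";

        // In the processing script: an unchecked box simply has no entry.
        foreach ($_POST['TrID'] as $usid) {
            $usid  = (int) $usid;
            $check = isset($_POST['active'][$usid]) ? 1 : 0;
            mysql_query("UPDATE `epapers`.`clientelle`
                            SET `active` = '$check'
                          WHERE `clientelle`.`user_id` = '$usid'");
        }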

    Read the article

  • question about MySQL database migration

    - by WilliamLou
    I have a MySQL database with several tables on a live server, and I would like to migrate this database to another server. The migration also involves schema changes to some tables, for example: adding some new columns to several tables, adding some new tables, etc. Right now, the only method I can think of is to use a PHP or Python script (the two languages I know) to connect to both databases, dump the data from the old database, and then write it into the new database. However, this method is not efficient at all. For example: in the old database, table A has 28 columns; in the new database, table A has 29 columns, and the extra column should get the default value 0 for all the old rows. My script still needs to dump the data row by row and insert each row into the new database. Are there any tools, or a better method, than writing a script yourself? I don't need to worry about concurrent writes here; the old database will be down (not open to public use, only for the upgrade) for a while. Thanks!

    Read the article

  • Best way to handle Many-to-Many relationships in PHP MySQL

    - by Jayrox
    I am looking for the best way to handle a database of many-to-many relationships in PHP and MySQL. Right now I have 2 tables: Users (id, user_name, first_name, last_name) and Connections (id_1, id_2). In the Users table, id is auto-incremented on insert and user_name is unique but can be changed. Unfortunately, I don't have control over the user_name and its ability to be changed, but I must account for it. The Connections table holds, obviously, user1's and user2's ids. The connection table needs to account for these possible relations: user1 --> user2 (user1 is friends with user2 but user2 is not friends with user1); user2 --> user1 (user2 is friends with user1 but user1 is not friends with user2); user1 <--> user2 (user1 and user2 are mutually friends); user1 <-!-> user2 (user1 and user2 are not friends). That part is not the problem. The problem I am having is keeping these relations unique when and if they change in batches. Possible solution 1: delete all of user1's relations and re-add them with the updated list. I think this might be too slow for my needs. Solution 2? Has anyone else encountered this problem? How should I best handle this? Update, on distinguishing relationships: I store relationships like this: (user1, user2), (user1, user3), (user2, user1). In that example the following is true: user1 follows user2 and user3; user2 only follows user1 and doesn't follow user3; user3 doesn't follow either user1 or user2.
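
    One sketch of keeping the pairs unique without the delete-and-re-add pass: make the pair itself the primary key, so a batch can be replayed safely and a mutual relationship is simply "both directions exist":

        CREATE TABLE connections (
            follower_id INT UNSIGNED NOT NULL,  -- id_1: the user doing the following
            followed_id INT UNSIGNED NOT NULL,  -- id_2: the user being followed
            PRIMARY KEY (follower_id, followed_id)
        );

        -- Re-running a batch cannot create duplicate rows:
        INSERT IGNORE INTO connections (follower_id, followed_id)
        VALUES (1, 2), (1, 3), (2, 1);

        -- user1 <--> user2 (mutual) means both (1,2) and (2,1) are present:
        SELECT 1
          FROM connections a
          JOIN connections b ON b.follower_id = a.followed_id
                            AND b.followed_id = a.follower_id
         WHERE a.follower_id = 1 AND a.followed_id = 2;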

    Read the article

  • Having an issue with Nullable MySQL columns in SubSonic 3.0 templates

    - by omegawkd
    Looking at this line in Settings.ttinclude:

        string CheckNullable(Column col){
            string result="";
            if(col.IsNullable && col.SysType !="byte[]" && col.SysType !="string")
                result="?";
            return result;
        }

    It describes how the template determines whether the column is nullable and returns either "" or "?" to the generated code. Now, I'm not too familiar with the ? nullable type operator, but from what I can see a cast is required. For instance, if I have a nullable integer MySQL column and I generate the code using the default template files, it produces a line similar to this: int? _User_ID; When trying to compile the project I get the error: Cannot implicitly convert type 'int?' to 'int'. An explicit conversion exists (are you missing a cast?) I checked the Settings files for the other database types and they all seem to have the same routine. So my question is: is this behaviour expected, or is it a bug? I need to solve it one way or the other before I can proceed. Thanks for your help.

    Read the article

  • mysql/algorithm: Weighting an average to accentuate differences from the mean

    - by Sai Emrys
    This is for a new feature on http://cssfingerprint.com (see /about for general info). The feature looks up the sites you've visited in a database of site demographics, and tries to guess what your demographic stats are based on that. All my demographics are in 0..1 probability format, not ratios or absolute numbers or the like. Essentially, you have a large number of data points that each tend you towards their own demographics. However, just taking the average is poor, because it means that by adding in a lot of generic data, the number goes down. For example, suppose you've visited sites S0..S50. All except S0 are 48% female; S0 is 100% male. If I'm guessing your gender, I want to have a value close to 100%, not just the 49% that a straight average would give. Also, consider that most demographics (i.e. everything other than gender) do not have their average at 50%. For example, the average probability of having kids 0-17 is ~37%. The more a given site's demographics differ from this average (e.g. maybe it's a site for parents, or for child-free people), the more it should count in my guess of your status. What's the best way to calculate this? For extra credit: what's the best way to calculate this that is also cheap and easy to do in MySQL?
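
    One standard way to get this "pull away from the mean" behaviour is to combine the sites in log-odds space relative to the population average instead of averaging the probabilities directly; a PHP sketch (the clamping bounds are arbitrary):

        <?php
        function logit($p) {
            $p = min(max($p, 0.001), 0.999);  // clamp so 0% / 100% stay finite
            return log($p / (1 - $p));
        }

        // Combine per-site probabilities against a baseline such as 0.37 for
        // "has kids 0-17". Sites near the baseline contribute almost nothing;
        // sites far from it dominate the result.
        function combine(array $siteProbs, $prior) {
            $score = logit($prior);
            foreach ($siteProbs as $p) {
                $score += logit($p) - logit($prior);
            }
            return 1 / (1 + exp(-$score));    // back to a 0..1 probability
        }
        ?>

    The inner sum is also cheap to do in MySQL, e.g. SUM(LN(p / (1 - p)) - LN(prior / (1 - prior))) over the visited sites, with the final 1/(1+e^-x) step applied in PHP.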

    Read the article

  • Rolling back file moves, folder deletes and mysql queries

    - by Workoholic
    This has been bugging me all day and there is no end in sight. When the user of my PHP application adds a new update and something goes wrong, I need to be able to undo a complex batch of mixed commands. They can be MySQL UPDATE and INSERT queries, file deletes, and folder renames and creations. I can track the status of all INSERT commands and undo them if an error is thrown, but how do I do this with the UPDATE statements? Is there a smart way (some design pattern?) to keep track of such changes, both in the file structure and in the database? My database tables are MyISAM. It would be easy to just convert everything to InnoDB so that I can use transactions; that way I would only have to deal with the file and folder operations. Unfortunately, I cannot assume that all clients have InnoDB support. It would also require me to convert many tables in my database to InnoDB, which I am hesitant to do.

    Read the article

  • Adding Mysql Columns in Rails Rake

    - by Gigg
    I have a rake file that does a series of calculations on a database. Basically it adds usage on equipment regularly. At the end of the day it needs to add that days total to a monthly total table and update that same table. i use the following simple concepts: To get data from the database is pretty simple: @usages = Usage.find(:all) time = Time.new for usage in @usages sql = ActiveRecord::Base.connection To insert it into the database: sql.execute "INSERT (or UPDATE) into usages ## add values and options as per MySQL But how do I: take a column from a database, add all of its values together that have a common value in another column, (i.e. if column x == value y) and then insert it into another column in another table, say dailyusages? I have tried these options: task (:monthly => :environment) do @dailyusages = Dailyusage.find(:all) for dailyusage in @dailyusages sql = ActiveRecord::Base.connection time = Time.new device = monthlyusages.device month = time.month if device == dailyusages.device ##&& month == dailyusages.month total = (dailyusage.total.sum.to_i) @monthlyusages = Monthlyusage.find(:all) for monthlyusage in @monthlyusages sql = ActiveRecord::Base.connection old_total = monthlyusage.total.to_i new_total = (old_total + total) sql.execute "UPDATE monthlyusages ( year, month, total, device ) values('#{time.year}', '#{time.month}', '#{total}', '#{dailyusage.device}' )" end I obviously have uncommented options and tried all sorts of things. Any help would really save me a load of trouble. Thanks in advance. (** BTW - I am new to rails, so go easy on me **)
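
    Much of this can be collapsed into one SQL statement run from the rake task via sql.execute, letting MySQL do the per-device summing; a sketch (it assumes a date column on dailyusages and a unique key on (device, year, month) in monthlyusages, neither of which is shown in the question):

        INSERT INTO monthlyusages (device, year, month, total)
        SELECT device, YEAR(CURDATE()), MONTH(CURDATE()), SUM(total)
          FROM dailyusages
         WHERE YEAR(created_at)  = YEAR(CURDATE())
           AND MONTH(created_at) = MONTH(CURDATE())
         GROUP BY device
        ON DUPLICATE KEY UPDATE total = VALUES(total);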

    Read the article

  • smallest mysql type that accomodates single decimal

    - by donpal
    Database newbie here. I'm setting up a MySQL table. One of the fields will accept a value in increments of 0.5, e.g. 0.5, 1.0, 1.5, 2.0, ..., 200.5, etc. I've tried INT, but it doesn't capture the decimals: `value` int(10). What would be the smallest type that can accommodate this value, considering it's only a single decimal place? I was also considering that, because the decimal will always be 0.5 if present at all, I could store it in a separate boolean field, so I would have 2 fields instead. Is this a stupid or somewhat overcomplicated idea? I don't know if it really saves me any memory, and it might get slower now that I'm accessing 2 fields instead of 1: `value` int(10), `half` bool (or something similar to a boolean). What are your suggestions? Is the first option better, and what's the smallest data type in that case that would get me the 0.5?
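
    Two small options, sketched below: a one-decimal DECIMAL column (exact, no float rounding), or storing the value doubled in an integer so 1.5 becomes 3:

        -- Option 1: exact fixed-point with a single decimal place (covers 0.0 to 999.9)
        `value` DECIMAL(4,1)

        -- Option 2: store value * 2 in a 2-byte integer (divide by 2 when reading)
        `value_halves` SMALLINT UNSIGNED

    The separate boolean column would work too, but it moves arithmetic into every query, so it is usually more trouble than the byte or two it saves.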

    Read the article

  • Good PHP / MYSQL hashing solution for large number of text values

    - by Dave
    Short description: I need a hashing algorithm solution in PHP for a large number of text values. Long description: PRODUCT_OWNER_TABLE: serial_number (auto_inc), product_name, owner_id. OWNER_TABLE: owner_id (auto_inc), owner_name. I need to maintain a database of 200,000 unique products and their owners (AND all subsequent changes to ownership). Each product has one owner, but an owner may have MANY different products. Owner names are "Adam Smith", "John Reeves", etc., just text values (quite likely to be Unicode as well). I want to optimize the database design, so what I was thinking was: every week when I run this script, it fetches the owner of a product, then checks against a table similar to PRODUCT_OWNER_TABLE, fetching the owner_id. It then looks up owner_id in OWNER_TABLE. If it matches, then it's the same, so it moves on. The problem is when it's different... To optimize the database, I think I should be checking against the other "owner_name" entries in OWNER_TABLE to see if that value exists there. If it does, then I should use that owner_id. If it doesn't, then I should add another entry. Note that there is nothing special about the "name". As long as I maintain the correct linkages AND make OWNER_TABLE a "read-only, append-new" type table, I should be able to create a historical archive of ownership. I need to do this check for 200,000 entries, with I don't know how many unique owner names (~50,000?). I think I need a hashing solution - OWNER_TABLE won't be sorted, so search algorithms won't be optimal. The programming language is PHP; the database is MySQL.
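
    An alternative to hashing in PHP is to let MySQL do the lookup: put a UNIQUE index on owner_name and use the insert-or-fetch idiom, so one statement either creates the owner or hands back the existing id; a sketch using the tables described above:

        ALTER TABLE owner_table ADD UNIQUE KEY uq_owner_name (owner_name);

        -- Inserts a new owner, or on a duplicate name re-selects the existing id,
        -- so mysql_insert_id() afterwards returns the right owner_id either way.
        INSERT INTO owner_table (owner_name)
        VALUES ('Adam Smith')
        ON DUPLICATE KEY UPDATE owner_id = LAST_INSERT_ID(owner_id);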

    Read the article

  • PHP & MySQL - Array to string conversion error problem.

    - by comma
    I keep getting the following "Array to string conversion" error on the line listed below. How can I fix this problem? $skill = explode('', $_POST['skill']); Here is the PHP & MySQL code:

        $skill = explode('', $_POST['skill']);
        $experience = explode('', $_POST['experience']);
        $years = explode('', $_POST['years']);
        for ($s = 0; $s < count($skill); $s++){
            for ($x = 0; $x < count($experience); $x++){
                for ($g = 0; $g < count($years); $g++){
                    if (mysqli_num_rows($dbc) == 0) {
                        $mysqli = mysqli_connect("localhost", "root", "", "sitename");
                        $dbc = mysqli_query($mysqli,"INSERT INTO learned_skills (user_id, skill, experience, years, date_created) VALUES ('" . $user_id . "', '" . $skill[$s] . "', '" . $experience[$x] . "', '" . $years[$g] . "', NOW())");
                    }
                    if ($dbc == TRUE) {
                        $dbc = mysqli_query($mysqli,"UPDATE learned_skills SET skill = '$skill', experience = '$experience', years = '$years', date_created = NOW() WHERE user_id = '$user_id'");
                        echo '<p class="changes-saved">Your changes have been saved!</p>';
                    }
                    if (!$dbc) {
                        print mysqli_error($mysqli);
                        return;
                    }
                }
            }
        }
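
    For what it's worth, that notice on the explode() line usually means $_POST['skill'] is already an array (e.g. the form fields are named skill[]), so PHP converts it to the literal string "Array" before exploding; explode() also rejects an empty delimiter outright. A sketch of the two usual fixes (which one applies depends on how the form is built, which the question doesn't show):

        <?php
        // If the form posts one comma-separated string per field:
        $skill = explode(',', $_POST['skill']);

        // If the form posts array-style fields (name="skill[]"), the values are
        // already arrays and need no explode() at all:
        $skill = is_array($_POST['skill']) ? $_POST['skill'] : array($_POST['skill']);
        ?>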

    Read the article

  • mysql to xml (DOM question)

    - by Jerry
    I am new to PHP DOM and am trying to transfer MySQL data into XML. My current XML output is like this:

        <markers>
          <city>
            <name>Seattle</name>
            <size>medium</size>
            <name>New York</name>
            <size>big</size>
          </city>
        </markers>

    but I want to change it to:

        <markers>
          <city>
            <name>Seattle</name>
            <size>medium</size>
          </city>
          <city>
            <name>New York</name>
            <size>big</size>
          </city>
        </markers>

    My PHP:

        $dom=new DOMDocument("1.0");
        $node=$dom->createElement("markers");
        $parnode=$dom->appendChild($node);
        $firstElement=$dom->createElement("city");
        $parnode->appendChild($firstElement);
        $getLocationQuery=mysql_query("SELECT * FROM location",$connection);
        header("Content-type:text/xml");
        while($row=mysql_fetch_assoc($getLocationQuery)){
            foreach ($row as $fieldName=>$value){
                $child=$dom->createElement($fieldName);
                $value=$dom->createTextNode($value);
                $child->appendChild($value);
                $firstElement->appendChild($child);
            }
        }

    I can't figure out how to change my PHP code. Please help me with it. Thanks a lot.
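
    A sketch of the same loop with the <city> element created once per row instead of once before the loop:

        <?php
        $dom     = new DOMDocument("1.0");
        $parnode = $dom->appendChild($dom->createElement("markers"));

        while ($row = mysql_fetch_assoc($getLocationQuery)) {
            // One <city> element per database row.
            $cityNode = $parnode->appendChild($dom->createElement("city"));
            foreach ($row as $fieldName => $value) {
                $child = $cityNode->appendChild($dom->createElement($fieldName));
                $child->appendChild($dom->createTextNode($value));
            }
        }

        header("Content-type: text/xml");
        echo $dom->saveXML();
        ?>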

    Read the article

  • MySQL: Get average of time differences?

    - by Nebs
    I have a table called Sessions with two datetime columns: start and end. For each day (YYYY-MM-DD) there can be many different start and end times (HH:ii:ss). I need to find a daily average of all the differences between these start and end times. An example of a few rows would be: start: 2010-04-10 12:30:00 end: 2010-04-10 12:30:50 start: 2010-04-10 13:20:00 end: 2010-04-10 13:21:00 start: 2010-04-10 14:10:00 end: 2010-04-10 14:15:00 start: 2010-04-10 15:45:00 end: 2010-04-10 15:45:05 start: 2010-05-10 09:12:00 end: 2010-05-10 09:13:12 ... The time differences (in seconds) for 2010-04-10 would be: 50 60 300 5 The average for 2010-04-10 would be 103.75 seconds. I would like my query to return something like: day: 2010-04-10 ave: 103.75 day: 2010-05-10 ave: 72 ... I can get the time difference grouped by start date but I'm not sure how to get the average. I tried using the AVG function but I think it only works directly on column values (rather than the result of another aggregate function). This is what I have: SELECT TIME_TO_SEC(TIMEDIFF(end,start)) AS timediff FROM Sessions GROUP BY DATE(start) Is there a way to get the average of timediff for each start date group? I'm new to aggregate functions so maybe I'm misunderstanding something. If you know of an alternate solution please share. I could always do it ad hoc and compute the average manually in PHP but I'm wondering if there's a way to do it in MySQL so I can avoid running a bunch of loops. Thanks.
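
    AVG() does accept an expression, so one sketch is simply to wrap the existing TIMEDIFF in it and group by the day:

        SELECT DATE(start) AS day,
               AVG(TIME_TO_SEC(TIMEDIFF(end, start))) AS ave
          FROM Sessions
         GROUP BY DATE(start);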

    Read the article

  • How to create a chart from mysql data?

    - by user187580
    Hello, I have some data and want to create some dynamic charts. I have looked on Google visualisation api .. It looks great but the problem is I am not very familiar with it. Any ideas, how I can set the data.setValue from mysql data. <script type='text/javascript'> google.load('visualization', '1', {'packages': ['geomap']}); google.setOnLoadCallback(drawMap); function drawMap() { var data = new google.visualization.DataTable(); data.addRows(6); data.addColumn('string', 'Country'); data.addColumn('number', 'Popularity'); data.setValue(0, 0, 'Germany'); data.setValue(0, 1, 200); data.setValue(1, 0, 'United States'); data.setValue(1, 1, 300); data.setValue(2, 0, 'Brazil'); data.setValue(2, 1, 400); data.setValue(3, 0, 'Canada'); data.setValue(3, 1, 500); data.setValue(4, 0, 'France'); data.setValue(4, 1, 600); data.setValue(5, 0, 'RU'); data.setValue(5, 1, 700); var options = {}; options['dataMode'] = 'regions'; var container = document.getElementById('map_canvas'); var geomap = new google.visualization.GeoMap(container); geomap.draw(data, options); }; </script> I can create chart using some other methods but just interested in using Google Visualisation API. Thanks.
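
    One rough way to wire it up is to let PHP emit the setValue calls while building the page; a sketch (the table and column names are made up):

        <?php
        // Emit one pair of data.setValue calls per database row, inside drawMap().
        $result = mysql_query("SELECT country, popularity FROM country_stats");
        echo "data.addRows(" . mysql_num_rows($result) . ");\n";

        $i = 0;
        while ($row = mysql_fetch_assoc($result)) {
            $country = addslashes($row['country']);
            echo "data.setValue($i, 0, '$country');\n";
            echo "data.setValue($i, 1, " . (int) $row['popularity'] . ");\n";
            $i++;
        }
        ?>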

    Read the article
