Search Results

Search found 5642 results on 226 pages for 'coding efficiency'.


  • MySQL query, 2 similar servers, 2 minute difference in execution times

    - by mr12086
    I had a similar question on Stack Overflow, but it seems to be more server/MySQL setup related than coding. The queries below all execute instantly on our development server, whereas on production they can take up to 2 minutes 20 seconds. The query execution time seems to be affected by how ambiguous the LIKE strings are: if they closely match a country that has few matches it takes less time, and if you use something like 'ge' for Germany it takes longer to execute. But it doesn't always work out like that; at times it's quite erratic. "Sending data" appears to be the culprit, but why, and what does that mean? Also, free memory on production looks to be quite low.
    Production:
        Intel Quad Xeon E3-1220 3.1GHz
        4GB DDR3
        2x 1TB SATA in RAID1
        Network speed 100Mb
        Ubuntu
    Development:
        Intel Core i3-2100, 2C/4T, 3.10GHz
        500 GB SATA - No RAID
        4GB DDR3
    UPDATE 2: mysqltuner output:
    [prod]
        -------- General Statistics --------------------------------------------------
        [--] Skipped version check for MySQLTuner script
        [OK] Currently running supported MySQL version 5.1.61-0ubuntu0.10.04.1
        [OK] Operating on 64-bit architecture
        -------- Storage Engine Statistics -------------------------------------------
        [--] Status: +Archive -BDB -Federated +InnoDB -ISAM -NDBCluster
        [--] Data in MyISAM tables: 103M (Tables: 180)
        [--] Data in InnoDB tables: 491M (Tables: 19)
        [!!] Total fragmented tables: 38
        -------- Security Recommendations -------------------------------------------
        [OK] All database users have passwords assigned
        -------- Performance Metrics -------------------------------------------------
        [--] Up for: 77d 4h 6m 1s (53M q [7.968 qps], 14M conn, TX: 87B, RX: 12B)
        [--] Reads / Writes: 98% / 2%
        [--] Total buffers: 58.0M global + 2.7M per thread (151 max threads)
        [OK] Maximum possible memory usage: 463.8M (11% of installed RAM)
        [OK] Slow queries: 0% (12K/53M)
        [OK] Highest usage of available connections: 22% (34/151)
        [OK] Key buffer size / total MyISAM indexes: 16.0M/10.6M
        [OK] Key buffer hit rate: 98.7% (162M cached / 2M reads)
        [OK] Query cache efficiency: 20.7% (7M cached / 36M selects)
        [!!] Query cache prunes per day: 3934
        [OK] Sorts requiring temporary tables: 1% (3K temp sorts / 230K sorts)
        [!!] Joins performed without indexes: 71068
        [OK] Temporary tables created on disk: 24% (3M on disk / 13M total)
        [OK] Thread cache hit rate: 99% (690 created / 14M connections)
        [!!] Table cache hit rate: 0% (64 open / 85M opened)
        [OK] Open file limit used: 12% (128/1K)
        [OK] Table locks acquired immediately: 99% (16M immediate / 16M locks)
        [!!] InnoDB data size / buffer pool: 491.9M/8.0M
        -------- Recommendations -----------------------------------------------------
        General recommendations:
            Run OPTIMIZE TABLE to defragment tables for better performance
            Enable the slow query log to troubleshoot bad queries
            Adjust your join queries to always utilize indexes
            Increase table_cache gradually to avoid file descriptor limits
        Variables to adjust:
            query_cache_size (> 16M)
            join_buffer_size (> 128.0K, or always use indexes with joins)
            table_cache (> 64)
            innodb_buffer_pool_size (>= 491M)
    [dev]
        -------- General Statistics --------------------------------------------------
        [--] Skipped version check for MySQLTuner script
        [OK] Currently running supported MySQL version 5.1.62-0ubuntu0.11.10.1
        [!!] Switch to 64-bit OS - MySQL cannot currently use all of your RAM
        -------- Storage Engine Statistics -------------------------------------------
        [--] Status: +Archive -BDB -Federated +InnoDB -ISAM -NDBCluster
        [--] Data in MyISAM tables: 185M (Tables: 632)
        [--] Data in InnoDB tables: 967M (Tables: 38)
        [!!] Total fragmented tables: 73
        -------- Security Recommendations -------------------------------------------
        [OK] All database users have passwords assigned
        -------- Performance Metrics -------------------------------------------------
        [--] Up for: 1d 2h 26m 9s (5K q [0.058 qps], 1K conn, TX: 4M, RX: 1M)
        [--] Reads / Writes: 99% / 1%
        [--] Total buffers: 58.0M global + 2.7M per thread (151 max threads)
        [OK] Maximum possible memory usage: 463.8M (11% of installed RAM)
        [OK] Slow queries: 0% (0/5K)
        [OK] Highest usage of available connections: 1% (2/151)
        [OK] Key buffer size / total MyISAM indexes: 16.0M/18.6M
        [OK] Key buffer hit rate: 99.9% (60K cached / 36 reads)
        [OK] Query cache efficiency: 44.5% (1K cached / 2K selects)
        [OK] Query cache prunes per day: 0
        [OK] Sorts requiring temporary tables: 0% (0 temp sorts / 44 sorts)
        [OK] Temporary tables created on disk: 24% (162 on disk / 666 total)
        [OK] Thread cache hit rate: 99% (2 created / 1K connections)
        [!!] Table cache hit rate: 1% (64 open / 4K opened)
        [OK] Open file limit used: 8% (88/1K)
        [OK] Table locks acquired immediately: 100% (1K immediate / 1K locks)
        [!!] InnoDB data size / buffer pool: 967.7M/8.0M
        -------- Recommendations -----------------------------------------------------
        General recommendations:
            Run OPTIMIZE TABLE to defragment tables for better performance
            Enable the slow query log to troubleshoot bad queries
            Increase table_cache gradually to avoid file descriptor limits
        Variables to adjust:
            table_cache (> 64)
            innodb_buffer_pool_size (>= 967M)
    UPDATE 1: When testing the queries listed here there is usually no more than one other query taking place, and usually none. Because production is actually handling Apache requests that development sees very few of (only myself and one other person access it), could the 4GB of RAM be getting exhausted by using the single machine for both Apache and the MySQL server?
    Production:
        sudo hdparm -tT /dev/sda
        /dev/sda:
        Timing cached reads: 24872 MB in 2.00 seconds = 12450.72 MB/sec
        Timing buffered disk reads: 368 MB in 3.00 seconds = 122.49 MB/sec
        sudo hdparm -tT /dev/sdb
        /dev/sdb:
        Timing cached reads: 24786 MB in 2.00 seconds = 12407.22 MB/sec
        Timing buffered disk reads: 350 MB in 3.00 seconds = 116.53 MB/sec
        Server version (MySQL + Ubuntu versions): 5.1.61-0ubuntu0.10.04.1
    Development:
        sudo hdparm -tT /dev/sda
        /dev/sda:
        Timing cached reads: 10632 MB in 2.00 seconds = 5319.40 MB/sec
        Timing buffered disk reads: 400 MB in 3.01 seconds = 132.85 MB/sec
        Server version (MySQL + Ubuntu versions): 5.1.62-0ubuntu0.11.10.1
    ORIGINAL DATA: This query is NOT the query in question, but it is related, so I'll post it.
        SELECT f.form_question_has_answer_id
        FROM form_question_has_answer f
        INNER JOIN project_company_has_user p ON f.form_question_has_answer_user_id = p.project_company_has_user_user_id
        INNER JOIN company c ON p.project_company_has_user_company_id = c.company_id
        INNER JOIN project p2 ON p.project_company_has_user_project_id = p2.project_id
        INNER JOIN user u ON p.project_company_has_user_user_id = u.user_id
        INNER JOIN form f2 ON p.project_company_has_user_project_id = f2.form_project_id
        WHERE (f2.form_template_name = 'custom'
            AND p.project_company_has_user_garbage_collection = 0
            AND p.project_company_has_user_project_id = '29')
        AND (LCASE(c.company_country) LIKE '%ge%' OR LCASE(c.company_country) LIKE '%abcde%')
        AND f.form_question_has_answer_form_id = '174'
    The explain plan for the above query (run on both dev and production; both produce the same plan):
        id | select_type | table | type | possible_keys | key | key_len | ref | rows | Extra
        1 | SIMPLE | p2 | const | PRIMARY | PRIMARY | 4 | const | 1 | Using index
        1 | SIMPLE | f | ref | form_question_has_answer_form_id,form_question_has_answer_user_id | form_question_has_answer_form_id | 4 | const | 796 | Using where
        1 | SIMPLE | u | eq_ref | PRIMARY | PRIMARY | 4 | new_klarents.f.form_question_has_answer_user_id | 1 | Using index
        1 | SIMPLE | p | ref | project_company_has_user_unique_key,project_company_has_user_user_id,project_company_has_user_company_id,project_company_has_user_project_id | project_company_has_user_user_id | 4 | new_klarents.f.form_question_has_answer_user_id | 1 | Using where
        1 | SIMPLE | f2 | ref | form_project_id | form_project_id | 4 | const | 15 | Using where
        1 | SIMPLE | c | eq_ref | PRIMARY | PRIMARY | 4 | new_klarents.p.project_company_has_user_company_id | 1 | Using where
    This query takes 2 minutes ~20 seconds to execute.
    The query that is ACTUALLY being run on the server is this one:
        SELECT COUNT(*) AS num_results FROM (
            SELECT f.form_question_has_answer_id
            FROM form_question_has_answer f
            INNER JOIN project_company_has_user p ON f.form_question_has_answer_user_id = p.project_company_has_user_user_id
            INNER JOIN company c ON p.project_company_has_user_company_id = c.company_id
            INNER JOIN project p2 ON p.project_company_has_user_project_id = p2.project_id
            INNER JOIN user u ON p.project_company_has_user_user_id = u.user_id
            INNER JOIN form f2 ON p.project_company_has_user_project_id = f2.form_project_id
            WHERE (f2.form_template_name = 'custom'
                AND p.project_company_has_user_garbage_collection = 0
                AND p.project_company_has_user_project_id = '29')
            AND (LCASE(c.company_country) LIKE '%ge%' OR LCASE(c.company_country) LIKE '%abcde%')
            AND f.form_question_has_answer_form_id = '174'
            GROUP BY f.form_question_has_answer_id
        ) dctrn_count_query;
    With explain plans (again, the same on dev and production):
        id | select_type | table | type | possible_keys | key | key_len | ref | rows | Extra
        1 | PRIMARY | NULL | NULL | NULL | NULL | NULL | NULL | NULL | Select tables optimized away
        2 | DERIVED | p2 | const | PRIMARY | PRIMARY | 4 | | 1 | Using index
        2 | DERIVED | f | ref | form_question_has_answer_form_id,form_question_has_answer_user_id | form_question_has_answer_form_id | 4 | | 797 | Using where
        2 | DERIVED | p | ref | project_company_has_user_unique_key,project_company_has_user_user_id,project_company_has_user_company_id,project_company_has_user_project_id,project_company_has_user_garbage_collection | project_company_has_user_user_id | 4 | new_klarents.f.form_question_has_answer_user_id | 1 | Using where
        2 | DERIVED | f2 | ref | form_project_id | form_project_id | 4 | | 15 | Using where
        2 | DERIVED | c | eq_ref | PRIMARY | PRIMARY | 4 | new_klarents.p.project_company_has_user_company_id | 1 | Using where
        2 | DERIVED | u | eq_ref | PRIMARY | PRIMARY | 4 | new_klarents.p.project_company_has_user_user_id | 1 | Using where; Using index
    On the production server the information I have is as follows.
    Upon execution:
        +-------------+
        | num_results |
        +-------------+
        |           3 |
        +-------------+
        1 row in set (2 min 14.28 sec)
    Show profile:
        starting | 0.000016
        checking query cache for query | 0.000057
        Opening tables | 0.004388
        System lock | 0.000003
        Table lock | 0.000036
        init | 0.000030
        optimizing | 0.000016
        statistics | 0.000111
        preparing | 0.000022
        executing | 0.000004
        Sorting result | 0.000002
        Sending data | 136.213836
        end | 0.000007
        query end | 0.000002
        freeing items | 0.004273
        storing result in query cache | 0.000010
        logging slow query | 0.000001
        logging slow query | 0.000002
        cleaning up | 0.000002
    On development the results are as follows.
        +-------------+
        | num_results |
        +-------------+
        |           3 |
        +-------------+
        1 row in set (0.08 sec)
    Again, the profile for this query:
        starting | 0.000022
        checking query cache for query | 0.000148
        Opening tables | 0.000025
        System lock | 0.000008
        Table lock | 0.000101
        optimizing | 0.000035
        statistics | 0.001019
        preparing | 0.000047
        executing | 0.000008
        Sorting result | 0.000005
        Sending data | 0.086565
        init | 0.000015
        optimizing | 0.000006
        executing | 0.000020
        end | 0.000004
        query end | 0.000004
        freeing items | 0.000028
        storing result in query cache | 0.000005
        removing tmp table | 0.000008
        closing tables | 0.000008
        logging slow query | 0.000002
        cleaning up | 0.000005
    If I remove the user and/or project inner joins, the query time is reduced to 30s.
    Last bit of information I have: the MySQL server and Apache are on the same box; there is only one box for production.
    Production output from top, before and after:
        top - 15:43:25 up 78 days, 12:11, 4 users, load average: 1.42, 0.99, 0.78
        Tasks: 162 total, 2 running, 160 sleeping, 0 stopped, 0 zombie
        Cpu(s): 0.1%us, 50.4%sy, 0.0%ni, 49.5%id, 0.0%wa, 0.0%hi, 0.0%si, 0.0%st
        Mem: 4037868k total, 3772580k used, 265288k free, 243704k buffers
        Swap: 3905528k total, 265384k used, 3640144k free, 1207944k cached

        top - 15:44:31 up 78 days, 12:13, 4 users, load average: 1.94, 1.23, 0.87
        Tasks: 160 total, 2 running, 157 sleeping, 0 stopped, 1 zombie
        Cpu(s): 0.2%us, 50.6%sy, 0.0%ni, 49.3%id, 0.0%wa, 0.0%hi, 0.0%si, 0.0%st
        Mem: 4037868k total, 3834300k used, 203568k free, 243736k buffers
        Swap: 3905528k total, 265384k used, 3640144k free, 1207804k cached
    But this isn't a good representation of production's normal status, so here is a grab from today, outside of executing the queries:
        top - 11:04:58 up 79 days, 7:33, 4 users, load average: 0.39, 0.58, 0.76
        Tasks: 156 total, 1 running, 155 sleeping, 0 stopped, 0 zombie
        Cpu(s): 3.3%us, 2.8%sy, 0.0%ni, 93.9%id, 0.0%wa, 0.0%hi, 0.0%si, 0.0%st
        Mem: 4037868k total, 3676136k used, 361732k free, 271480k buffers
        Swap: 3905528k total, 268736k used, 3636792k free, 1063432k cached
    Development (this one doesn't change during or after):
        top - 15:47:07 up 110 days, 22:11, 7 users, load average: 0.17, 0.07, 0.06
        Tasks: 210 total, 2 running, 208 sleeping, 0 stopped, 0 zombie
        Cpu(s): 0.1%us, 0.2%sy, 0.0%ni, 99.7%id, 0.0%wa, 0.0%hi, 0.0%si, 0.0%st
        Mem: 4111972k total, 1821100k used, 2290872k free, 238860k buffers
        Swap: 4183036k total, 66472k used, 4116564k free, 921072k cached
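
    As a sketch (not from the original post): one way to compare the two servers directly is to run the slow COUNT(*) query against each host and capture its SHOW PROFILE breakdown, so the "Sending data" step can be compared side by side. The snippet below is TypeScript on Node using the mysql2 driver; the host names, credentials, and the slow-query.sql file are placeholders, not details from the question.
        // Sketch only: times the query on two MySQL hosts and prints the profiling
        // breakdown. Assumes the slow query has been saved to slow-query.sql.
        import { readFileSync } from 'node:fs';
        import mysql from 'mysql2/promise';

        const sql = readFileSync('slow-query.sql', 'utf8');

        async function profileOn(label: string, host: string): Promise<void> {
          const conn = await mysql.createConnection({
            host,
            user: 'bench_user',          // placeholder
            password: 'bench_password',  // placeholder
            database: 'new_klarents',
          });
          await conn.query('SET profiling = 1');      // per-session profiling (MySQL 5.1)
          const start = Date.now();
          const [rows] = await conn.query(sql);
          console.log(`${label}: ${((Date.now() - start) / 1000).toFixed(2)}s`, rows);
          const [profile] = await conn.query('SHOW PROFILE'); // Status/Duration rows for the last query
          console.table(profile);
          await conn.end();
        }

        async function main() {
          await profileOn('production', 'prod.example.com');   // placeholder host
          await profileOn('development', 'dev.example.com');   // placeholder host
        }

        main().catch(console.error);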


  • Why isn't this javascript with else if working?

    - by Uni
    I'm sorry I can't be any more specific - I have no idea where the problem is. I'm a total beginner, and I've added everything I know to add to the code, but nothing happens when I push the button. I don't know at this point if it's an error in the logic or a syntax error that makes it not work. Basically I am trying to get this function "RipIt" to go through the list of Dewey decimal numbers, change some of them, and return the new number and a message saying it's been changed. There is also one labeled "no number" that has to return an error (not necessarily an alert box; a message in the same space is okay). I am a total beginner and not particularly good at this stuff, so please be gentle! Many thanks!
        <!DOCTYPE html>
        <html>
        <head>
        <script type="text/javascript">
        function RipIt() {
          for (var i = l; i <=10 i=i+l) {
            var dewey=document.getElementById(i);
            dewey=parseFloat(dewey);
            if (dewey >= 100 && 200 >= dewey) {
              document.getElementById('dewey'+ 100)
            } else if (dewey >= 400 && 500 >= dewey) {
              document.getElementById('dewey'+ 200)
            } else if (dewey >= 850 && 900 >= dewey) {
              document.getElementById('dewey'-100)
            } else if (dewey >= 600 && 650 >= dewey) {
              document.getElementById('dewey'+17)
            }
          }
        }
        </script>
        </head>
        <body>
        <h4>Records to Change</h4>
        <ul id="myList">
          <li id ="1">101.33</li> <li id = "2">600.01</li> <li id = "3">001.11</li> <li id = "4">050.02</li> <li id = "5">199.52</li>
          <li id = "6">400.27</li> <li id = "7">401.73</li> <li id = "8">404.98</li> <li id = "9">no number</li> <li id = "10">850.68</li>
          <li id = "11">853.88</li> <li id = "12">407.8</li> <li id = "13">878.22</li> <li id = "14">175.93</li> <li id = "15">175.9</li>
          <li id = "16">176.11</li> <li id = "17">190.97</li> <li id = "18">90.01</li> <li id = "19">191.001</li> <li id = "20">600.95</li>
          <li id = "21">602.81</li> <li id = "22">604.14</li> <li id = "23">701.31</li> <li id = "24">606.44</li> <li id = "25">141.77</li>
        </ul>
        <b> </b>
        <input type="button" value="Click To Run" onclick="RipIt()">
        <!-- <input type="button" value="Click Here" onClick="showAlert();"> -->
        </body>
        </html>
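
    For reference (not part of the original post), a sketch of how the loop could look once the obvious issues are fixed: the original for-header uses a lowercase letter l where the number 1 is intended and is missing a semicolon, it only loops to 10 although there are 25 items, and it passes the element object itself to parseFloat instead of its text. The version below is TypeScript and assumes the same HTML list and that "change some of them" means applying the offsets that appear in the original if/else-if chain; both are assumptions about intent.
        // Sketch under assumptions: list items keep the ids "1".."25" from the
        // question, and the intended change is to shift each number by the offsets
        // shown in the original code and write the result back into the list item.
        function ripIt(): void {
          for (let i = 1; i <= 25; i++) {                      // was: var i = l; i <=10 i=i+l
            const item = document.getElementById(String(i));
            if (!item) continue;
            const dewey = parseFloat(item.textContent ?? "");  // parse the text, not the element
            if (Number.isNaN(dewey)) {
              item.textContent = "error: no number";           // handles the "no number" entry
              continue;
            }
            let updated = dewey;
            if (dewey >= 100 && dewey <= 200) updated = dewey + 100;
            else if (dewey >= 400 && dewey <= 500) updated = dewey + 200;
            else if (dewey >= 850 && dewey <= 900) updated = dewey - 100;
            else if (dewey >= 600 && dewey <= 650) updated = dewey + 17;
            if (updated !== dewey) {
              item.textContent = `${updated.toFixed(2)} (changed from ${dewey.toFixed(2)})`;
            }
          }
        }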


  • Why is my second monitor not working?

    - by StampedeXV
    Since I got my new computer, I have a very weird problem. Facts:
    New computer:
        Motherboard: ASRock Z77 Pro 3
        Graphics card: Asus 1GB D5 X EN GTX560 DCII OC/2DI R
        CPU: Intel i5-3570
        Windows 7 64bit
        500W beQuiet special edition (92% efficiency)
        8GB 1333MHz DDR3 Corsair RAM (CL9)
        Scythe Mugen 2
        2 magnetic HDDs + 1 SSD
        1 DVD-R
    Old computer:
        Motherboard: Asus P55 something
        Graphics card: Asus 1GB D5 X EN GTX560 DCII OC/2DI R
        CPU: Intel i7-870
        Windows 7 64bit
        550W Corsair
        8GB 1333MHz DDR3 Corsair RAM (CL9)
        Scythe Mugen 3
        2 magnetic HDDs + 1 SSD
        1 DVD-R
    On the old computer it worked fine with two monitors. Moving to the new one (I took the same graphics card with me), it only works with one. The weird thing is: it doesn't matter which one. But if I plug both in, only one is available. There is no reaction at start-up, where normally (at least if I remember correctly) the monitor briefly goes from "standby" to "on". Windows does not recognize a second monitor in the Device Manager. I have the latest drivers for the motherboard and graphics card, and the latest BIOS. I am out of ideas.
    Edit: completed computer setup


  • php, mySQL & AJAX: Unable to use sessions across the scripts in the same domain

    - by Devner
    Hi all, I have the following pages: page1.php, page2.php and page3.php. The code in each of them is below.
    page1.php:
        <script type="text/javascript">
        $(function(){
          $('#imgID').upload({
            submit_to_url: "page2.php",
            file_name: 'myfile1',
            description : "Image",
            limit : 1,
            file_types : "*.jpg",
          })
        });
        </script>
        <body>
        <form action="page3.php" method="post" enctype="multipart/form-data" name="frm1" id="frm1">
        //Some other text fields
        <input type="submit" name="submit" id="submit" value="Submit" />
        </form>
        </body>
    page2.php:
        <?php
        session_start();
        $a = $_SESSION['a'];
        $b = $_SESSION['b'];
        $c = $_SESSION['c'];
        $res = mysql_query("SELECT col FROM table WHERE col1 = $a AND col2 = $b AND col3 = $c LIMIT 1");
        $num_rows = mysql_num_rows($res);
        echo $num_rows; // echoes 0 when in fact it should have been 1, because the data in the session exists.
        // OK, let's proceed further
        // ... Do some stuff ...
        // Store some more values and create new session variables (and assume that page1.php is going to be able to use them)
        $_SESSION['d'] = 'd';
        $_SESSION['e'] = 'e';
        $_SESSION['f'] = 'f';
        if (move_uploaded_file($_FILES['file']['tmp_name'], $file)) {
            echo "success";
        } else {
            echo "error ".$_FILES['file']['error'];
        }
        ?>
    page3.php:
        <?php
        session_start();
        if( isset($_POST['submit']) ) {
            // These sessions are non-existent, although the AJAX request to page2.php
            // may have created them when called via AJAX from within page1.php
            echo $_SESSION['d'].$_SESSION['e'].$_SESSION['f'];
        }
        ?>
    As the code shows, I am posting some info via an AJAX call from page1.php to page2.php. page2.php is supposed to be able to use the session values from page1.php, i.e. $_SESSION['a'], $_SESSION['b'] and $_SESSION['c'], but it does not. Why? How can I fix this? page2.php creates some more session variables after some processing is done, and a response is sent back to page1.php. The submit button of the form on page1.php is hit and the page gets POSTed to page3.php. But when the session info that was created in page2.php is echoed, it's blank, signifying that the sessions from page2.php are not used. How can I fix this? I have looked over a lot of information and spent about 50 hours trying different things with my scripts before arriving at the above conclusions. My app is custom made using functions (not OOP) and does not use any PHP frameworks, and I am not about to use any, as my knowledge of OOP concepts is limited and many frameworks are object oriented. I came across race conditions, but the solutions provided don't help much. One more solution, using the DB to hold sessions and retrieving them from the DB, is the last thing on my mind; I really want to avoid creating tables and writing and maintaining code for a task as simple as keeping sessions across pages in the same domain. So my request is: is there a way I can solve the above problem(s) via simple coding under the present conditions? Any help is appreciated. Thank you.
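
    Not from the original post, but one common cause worth checking: if the upload widget posts to page2.php from a context that does not send the PHPSESSID cookie (Flash-based uploaders are known for this), PHP starts a brand-new, empty session for that request, which would explain both symptoms. Below is a sketch of a cookie-preserving upload using plain fetch, written in TypeScript; the element id and field name are taken from the question but treated as assumptions about the markup.
        // Sketch only: upload the chosen file to page2.php while explicitly sending
        // the browser's cookies, so page2.php joins the session started by page1.php.
        async function uploadWithSession(): Promise<void> {
          const input = document.getElementById('imgID') as HTMLInputElement | null;
          const file = input?.files?.[0];
          if (!file) return;

          const body = new FormData();
          body.append('file', file);          // matches $_FILES['file'] on page2.php

          const response = await fetch('page2.php', {
            method: 'POST',
            body,
            credentials: 'same-origin',       // send the PHPSESSID cookie with the request
          });
          console.log(await response.text()); // row count / "success" / "error ..." echoed by page2.php
        }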


  • WMII Terminal Width of 80 Columns for xterm (colrules)

    - by BCable
    I'm trying to get WMII to split horizontally at 80 columns for xterm, but I'm only seeing a way to do this via percentages. It would be nice to be able to set it by something other than a percentage for various resolutions, but if I have to deal with that I will. The problem is that even percentages don't work at my resolution (1366x768): 47+47 in /colrules yields 79 characters and 48+48 yields 81 characters. As far as I can tell, decimals are not allowed, so I can't specify 47.5, for instance. I came from Ion3 and I'm used to 80-column terminals, resizable by keyboard, to get a reasonable cut-off point for Vim when I'm coding. I would settle for using the mouse, but WMII seems to be much more fluid than Ion3, so I would have to do it a LOT, which sounds annoying. Any ideas?


  • Recommendations for managing dedicated server DNS

    - by KP Overflow
    I've rented a dedicated server for several years, with a number of domains. I've got a coding background, so I'm comfortable with that side of the tech, but I hate that I still don't truly understand DNS settings. Example: my provider (HostGator) just told me that my parent nameservers are not correctly configured, as there is no A record for my primary nameserver. What book/link/tutorial should I read to go from kind of understanding that comment to really understanding it and knowing exactly what I need to do to fix it, rather than the trial and error I usually fall back on? Thanks. BTW, I'm using a WHM/cPanel Linux setup at HostGator but am eager to learn the fundamentals.
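
    As a sketch (not from the original post) of what the provider's comment means in practice: the nameserver host itself (e.g. ns1.yourdomain.com, a placeholder here) must resolve to an IP via an A record, otherwise the delegation at the parent zone points at a name nobody can look up. Something like the following, in TypeScript on Node using the built-in dns module, checks whether that A record exists; the domain and nameserver names are assumptions.
        // Sketch only: ns1.example.com stands in for the primary nameserver name
        // registered as the domain's parent nameserver.
        import { resolve4, resolveNs } from 'node:dns/promises';

        async function checkNameserver(domain: string, nameserver: string): Promise<void> {
          const delegated = await resolveNs(domain);       // nameservers the parent zone points at
          console.log(`${domain} is delegated to:`, delegated);

          try {
            const addresses = await resolve4(nameserver);  // the A record the provider says is missing
            console.log(`${nameserver} ->`, addresses);
          } catch (err) {
            console.log(`${nameserver} has no A record:`, (err as Error).message);
          }
        }

        checkNameserver('example.com', 'ns1.example.com').catch(console.error);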


  • Choosing A Power Supply [closed]

    - by Geeks On Hugs
    Possible Duplicate: "Power Supply Capacity Formula" / "How can I check if my system needs more PSU power?"
    I'm not sure if it's OK to ask a hardware question here. If not, please let me know a good place, but I've always got good info here, so I thought I'd give it a shot. I'm custom building a new workstation for coding (Linux/Eclipse). How do I determine how much power the power supply needs? I'm building a mini-ITX system on a budget, so I need the smallest supply that is still sufficient. I'll have a mini-ITX mobo with onboard wifi and bluetooth, 8 GB RAM, an Intel i3 3.1 GHz processor, a 64 GB SSD and a slim optical drive. In the future I might add a discrete GPU, 16 GB RAM and a 128 GB SSD. What is the minimum power I need, and how do I calculate that?
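
    Not from the original post, but the usual back-of-the-envelope method is to add up the worst-case draw of each component and leave generous headroom so the PSU runs near its efficiency sweet spot and survives later upgrades. A sketch in TypeScript; the per-component wattages are rough assumptions for illustration, not measured values.
        // Rough, assumed peak draws in watts - check the actual datasheets.
        const components: Record<string, number> = {
          'Intel i3, ~3.1 GHz': 65,
          'Mini-ITX motherboard + onboard wifi/BT': 30,
          '8 GB DDR3 RAM': 6,
          '64 GB SSD': 5,
          'Slim optical drive': 10,
          'Fans / USB peripherals': 10,
        };

        const headroom = 1.4; // ~40% margin for aging, spikes, and future upgrades (assumption)

        const peak = Object.values(components).reduce((sum, w) => sum + w, 0);
        console.log(`Estimated peak draw: ${peak} W`);
        console.log(`Suggested PSU size: ~${Math.ceil((peak * headroom) / 10) * 10} W`);
        // With a future discrete GPU, add its TDP (often 75-150 W) before applying the margin.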


  • Syntax Highlight and Autocomplete in Geany for GTK+ (C)

    - by Prasanna Choudhari
    I have just started GTK+ coding in C. I was curious whether I can get syntax highlighting and auto-completion working for my GTK code, because as a beginner it would be helpful. I was completely convinced that it was not possible, until I came across this video on YouTube: https://www.youtube.com/watch?v=AyeQrO1VDFM&feature=plcp I asked the uploader for help, but it turns out his last activity on YouTube was in September :( I also tried opening the gtk.h file with Geany, as I had read somewhere that that worked, but unfortunately it didn't work either. Any help? :'(


  • TextMate - completion using an external file or file contained in project?

    - by Neil Baldwin
    Does anyone know how to get TextMate to search an external file (or even the files contained in a TextMate "project") with which to perform word completion? I'm coding some stuff on the C64 (using TextMate to write the code) and I have an external file containing labels for all of the hardware registers/KERNAL routines, e.g. VIC2InteruptStatus = $D019. It would be really handy to be able to type, say, 'VIC2I', then press the key for word completion and have TextMate find matches in the external library file, rather than how I'm doing it at the moment: opening the library file and copy-pasting the register names into my code.


  • How to install latest version of imagick on centos 5.8 64bit using bash

    - by user57221
    How can I download and install the latest version of imagick on CentOS 5.8 64-bit, using bash, for PHP 5.4?
        > yum info php
        Loaded plugins: fastestmirror
        Loading mirror speeds from cached hostfile
         * base: mirror.ellogroup.com
         * epel: mirror01.th.ifl.net
         * extras: mirror.ellogroup.com
         * updates: mirror.ellogroup.com
        Installed Packages
        Name        : php
        Arch        : x86_64
        Version     : 5.4.3
        Release     : 1.el5.remi
        Size        : 8.8 M
        Repo        : installed
        Summary     : The PHP HTML-embedded scripting language. (PHP: Hypertext Preprocessor)
        URL         : http://www.php.net/
        License     : PHP
        Description : PHP is an HTML-embedded scripting language. PHP attempts to make it
                    : easy for developers to write dynamically generated webpages. PHP also
                    : offers built-in database integration for several commercial and
                    : non-commercial database management systems, so writing a
                    : database-enabled webpage with PHP is fairly simple. The most common
                    : use of PHP coding is probably as a replacement for CGI scripts.
                    :
                    : The php package contains the module which adds support for the PHP
                    : language to Apache HTTP Server.


  • Workflow: suggest a versioning and file control for Designer and Developer

    - by Pennf0lio
    Our company is having a hard time managing project files and versions of PSD, HTML, PHP and CSS files. Can anyone recommend a good piece of software or a workflow to handle files and versions? Here's my common scenario: I work on a project on my computer; it could be a website mockup or a coding project. I save all the files locally on my workstation, then upload all the project files to the server on our network to have a backup. In my files, I usually append "r1" for revisions, like "WebsiteMockup_r1" or "WebsiteMockup_r2". I need some way to synchronize all my local files to the server and have some versioning options.


  • Indenting an x number of lines in vim

    - by Mack Stump
    I've been coding in Java for a job recently and I've noticed that I'll write some code and then determine that I need to wrap the code in a try/catch block. I've just been moving to the beginning of a line and adding a tab. 0 i <tab> <esc> k (repeat process until at beginning or end of block) Now this was fine the first three or four times I had to indent but now it's just become tedious and I'm a lazy person. Could someone suggest an easier way I could deal with this problem?


  • How do anti-viruses work?

    - by Phoshi
    So I was thinking about viruses recently, and wondering how exactly antiviruses keep up? Considering anybody who'd been coding for a few weeks could hack together something to do nasty, nasty things to somebody's PC, the quantity alone would make a simple list of hashes prohibitive, so how do antiviruses do it? Do they monitor process activity and have a 3-strikes rule for doing virus-like things? And if so, what's stopping it from triggering on perfectly harmless things (like me moving files around in \system32)? I did a bit of googling, but the regular places didn't particularly help, and I couldn't find a dupe here, so I thought it'd be good to ask :)
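
    Not part of the original question, but a toy illustration of the "list of hashes" half of it: classic signature scanning is essentially hashing (or pattern-matching) file contents against a known-bad list, which is exactly why heuristics and behaviour monitoring get layered on top. A sketch in TypeScript on Node; the "signature" entry and the scanned directory are made up for illustration.
        // Toy signature scanner: hash each file and compare against a known-bad set.
        // Real engines also match byte patterns and watch behaviour; this only shows
        // why a plain hash list alone cannot keep up with trivially modified malware.
        import { createHash } from 'node:crypto';
        import { readFile, readdir } from 'node:fs/promises';
        import { join } from 'node:path';

        const knownBadSha256 = new Set<string>([
          'e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855', // placeholder entry (hash of empty input)
        ]);

        async function scanDirectory(dir: string): Promise<void> {
          for (const name of await readdir(dir)) {
            const path = join(dir, name);
            try {
              const digest = createHash('sha256').update(await readFile(path)).digest('hex');
              if (knownBadSha256.has(digest)) {
                console.log(`MATCH  ${path}`);
              }
            } catch {
              // directories, unreadable files, etc. - skipped in this sketch
            }
          }
        }

        scanDirectory('./downloads').catch(console.error);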


  • Load images in parallel - supported by browser or a feature to implement?

    - by Michael Mao
    Hi all: I am not a pro in web development and the Apache server still remains a mystery to me. We've got a project which runs on LAMP, pretty much like all the commercial hosting plans. I am confused about one problem: do modern browsers support loading images in parallel, or does this require some special feature/config on the server side? Can it be done with PHP coding or by some server-side configuration? Is a special content delivery network needed for this? The benchmark demonstration would be the Flickr website. I am surprised to see how all the image thumbnails load in such a short time after a search, as if there were only one image to load. Sorry I cannot present any code to you... completely lost in this :(
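
    For what it's worth (not part of the original question): browsers already download images in parallel on their own, limited only by the number of connections they open per host, which is one reason large photo sites spread thumbnails across several image hostnames. The sketch below, in TypeScript for the browser, makes the same idea explicit by preloading a batch of thumbnails concurrently and timing it; the URLs are placeholders for whatever the PHP page emits.
        // Sketch only: thumbnail URLs are placeholders.
        const thumbnailUrls: string[] = [
          '/thumbs/photo1.jpg',
          '/thumbs/photo2.jpg',
          '/thumbs/photo3.jpg',
        ];

        function loadImage(url: string): Promise<HTMLImageElement> {
          return new Promise((resolve, reject) => {
            const img = new Image();
            img.onload = () => resolve(img);
            img.onerror = () => reject(new Error(`failed to load ${url}`));
            img.src = url;              // the browser fetches all of these concurrently
          });
        }

        async function loadAllThumbnails(): Promise<void> {
          const start = performance.now();
          const images = await Promise.all(thumbnailUrls.map(loadImage));
          images.forEach((img) => document.body.appendChild(img));
          console.log(`Loaded ${images.length} thumbnails in ${(performance.now() - start).toFixed(0)} ms`);
        }

        loadAllThumbnails().catch(console.error);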


  • Is it possible to code on two different computers simultaneously?

    - by Muhammad
    I want to work with another programmer, and I want the source code to be live in real time on both of our screens. Is this possible on Mac OS X or Linux? We're going to be using OS X, but occasionally we might need to add an Ubuntu computer too. Is there a way I can do this using ssh, any shell-based program, or even a good GUI? I thought Coda might be capable of this, but it's not really working. Has anyone ever done this? I'm not looking for git/svn or any other version control system. This is more of a live coding session. :)


  • How to cope with developing against a poor 3rd party API/application?

    - by wsanville
    I'm a web developer, and my organization has recently started to use a proprietary ASP.NET CMS for our web sites. I was excited to get started using the CMS, thinking it would bring a lot of value to our end users and be fun to work with, since my skills are a good match for the types of projects we're using it for. That was about a year ago. Since then, we've run into all kinds of issues, from blatant bugs in the product, to nasty edge cases in the APIs, to extremely poor documentation for developers. On about a weekly basis we are forced to pursue workarounds and rewrite some of the out-of-the-box functionality, and we even find some of the basic features unusable. In many cases, since this is a closed source application (and obfuscated, of course), there's nothing we can do as developers to solve these issues. So my question is: how does one attempt to develop a good application in such a scenario? The application mostly works when using the exact out-of-the-box behavior or one of the company's starter sites. However, my attempts to use the underlying APIs to implement slightly different, yet reasonable, behavior have proved extremely time consuming (not to mention just as buggy), given the lack of good information about the APIs. I've given this a lot of thought, and my conflicting viewpoints are the following:
    Strongly advise against any customization to the CMS, as development time will rise exponentially, or even have an extremely high chance of failing. While this is accurate, I do not want to give the impression that I am not willing to code my own solutions to problems and take the initiative to implement something difficult or complex. I don't want to be perceived as someone who is unmotivated, lazy, or not knowledgeable enough to do anything complex, because this is simply not the case. I love coding my own solutions and trying new/difficult things; I just dislike the vendor app we're using.
    Continue on the path I'm on now, which is hacking my way past all the issues I encounter and trying my best to deliver an application that meets the needs and specs exactly. My goals are to make it as seamless and easy to use as possible for the end user, even when integrating the CMS with our other applications internally. The problem I'm finding with this approach is that it is very time consuming. I open support cases with the vendor on a regular basis to solve issues and to gain knowledge of their APIs, but this is extremely time consuming, and in some cases it leads to dead ends. I post on the vendor's forums regularly but have become frustrated as most of my posts get 0 replies.
    So, what would you, a reasonable developer, do in this case? How can I make the best of the situation? And just for fun, here are some of the code smells and anti-patterns I've dealt with using the product (aside from their own code blatantly failing):
    Use of StringBuilder to concatenate a giant string that is hard coded and does not change; they use it to concatenate their Javascript and write it out into the body tags of their pages.
    Methods that accept object or Microsoft.VisualBasic.Collection as parameters. In the case of the VB Collection, the data is not a list of any kind; it's used instead of making a class.
    Methods that return a Hashtable of VB Collections.
    Method names of the form MethodName_v45, MethodName_v20, etc.
    Multiple classes with the same name in different namespaces with different functionality/behavior.
    IntelliSense that reads "Note: this parameter is non functional".
    Complete lack of coding standards; the API is filled with magic numbers and magic strings.
    Properties with a getter of type object that accept totally different things, like enums or strings, and throw exceptions at runtime when you pass in something not supported.
    And much, much more...


  • Ideal laptop specs for a Computer Science Masters student?

    - by Ayush
    I have an HP Pavilion with a 2 GHz Core 2 Duo and 4 GB RAM, and it is painful to use this machine for any kind of coding. Eclipse (especially Juno) literally takes 5 minutes to load, and even after that everything is laggy. Apart from school stuff, I also use my computer as a television: I watch Hulu, Netflix, YouTube etc. in 720p, and this laptop gets hot as hell and the fans are loud enough to wake somebody up from deep sleep. I DON'T use my laptop for gaming or video/photo editing. I'm looking to buy a new laptop on which the most widely used IDEs work smoothly and playing hi-def video isn't too much for the machine to handle. Any suggestions (on hardware specs) would be greatly appreciated. Thanks


  • Suggestions for Backup solution

    - by jiewmeng
    I am considering between:
    - Windows Home Server
    - a simple NAS
    - extra HDDs in my desktop
    BTW, I will be the main user. I am looking to fulfil the following needs:
    - Reliability: I am thinking RAID 1 or 5.
    - Not so prone to virus/malware infections: will using a separate NAS or home server help? A Windows Home Server is still a Windows PC, just separated by the network, isn't it?
    - Power efficiency: e.g. spin down when not in use.
    - Downloads: e.g. I may want to download big files/torrents overnight, and I may not want to use a full-powered PC for it. Is the power usage of a full PC vs. a NAS different enough to justify the cost of a new system, especially since I am the only user?
    - Performance: I guess I'd like to write/access my files fast. On second thought, maybe for backup I can forgo this? Maybe a WD Green HDD? But how much slower will it be? Plus, since I am the only user, the whole HDD will be mine, I think.


  • Wi-fi signal with keeping the internet cable

    - by daGrevis
    So the situation is that I have an Ethernet cable which provides internet to my computer. What I want is to have a wi-fi connection in my house and still have an Ethernet cable (like I have now) to use for my PC. I will use the wi-fi for my laptop and mobile phone. I think I need a router for that, and I'm looking at the Asus RT-N16 (suggested on Coding Horror), but I am not sure. Is it the right thing for me, and will I be able to get a wi-fi signal and keep the wired Ethernet connection? I guess the setup will be that the current cable goes into the router, the router provides the wi-fi signal and gives back a new cable... or something like that. Thanks for any advice! And sorry if this topic isn't on the right site.


  • Best Configuration For Youtube HD 720p

    - by Eray Alakese
    Hello, I'm recording a screencast with CamStudio (http://camstudio.org/). My computer's screen resolution is 1280x800, so the video's resolution is 1280x800 too. I'm using the Microsoft Video 1 codec when recording. I recorded a 9 minute video and its size is 214 MB. I will upload this video to YouTube. I'm coding a web site in the video; because of this, the video must be high quality (720p). I want to reduce the file size before uploading. I'm using Total Video Converter (http://www.effectmatrix.com/total-video-converter/), but when I convert to FLV the video's size increases to 250 MB :) I don't know how to configure the settings, or which file type I should choose. (Screenshot here: http://i.imgur.com/Se0EP.jpg)
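
    Not from the original post, but the arithmetic behind the file size may help: output size is roughly (video bitrate + audio bitrate) x duration, regardless of container, so picking a target bitrate is what actually shrinks the file. A sketch in TypeScript; the candidate bitrates are illustrative assumptions, not recommendations from the converter's documentation.
        // Estimate output size for a 9-minute screencast at an assumed bitrate.
        function estimateSizeMB(videoKbps: number, audioKbps: number, durationSec: number): number {
          const totalKilobits = (videoKbps + audioKbps) * durationSec;
          return totalKilobits / 8 / 1024;   // kilobits -> kilobytes -> megabytes
        }

        const durationSec = 9 * 60;

        // The original 214 MB over 9 minutes works out to about 3250 kbps overall.
        console.log('current ~', Math.round((214 * 1024 * 8) / durationSec), 'kbps');

        // Screencasts compress well; ~1000-2000 kbps video plus 128 kbps audio is a
        // common starting range for 720p screen capture (assumption, tune to taste).
        for (const videoKbps of [1000, 1500, 2000]) {
          console.log(`${videoKbps} kbps video -> ~${estimateSizeMB(videoKbps, 128, durationSec).toFixed(0)} MB`);
        }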


  • Changing keyboard layout for login screen on Mac OS X

    - by R.A
    I have a Mac Mini with Mac OS X Lion. Basically I am a developer and I'm used to coding on Dvorak keyboards, so I changed my keyboard layout to Dvorak and I'm working with it. I'm using the admin user on my Mac Mini. When I restart my computer, during login the layout is changed to QWERTY and it's hard to type the password. It happens only for the admin user, though. If I create another user, set the layout to Dvorak and restart, that user has the Dvorak layout at login. Note: I have a QWERTY keyboard; I'm only changing the layout in the keyboard preferences. So, to summarize, how can I get the Dvorak keyboard layout for the admin login as well?


  • How to better copy&paste big files over RDP?

    - by WebMAOhist
    Recently I made a few attempts to copy & paste a big (1.2 GB) file to a remote computer over RDP. The remote computer is a virtual testing machine with MS Windows Server 2008 Datacenter. First I tried to copy & paste before midnight, when the transfer speed was limited by the client computer's ISP to 100 kB/s. It would have required a few hours, and I was forced to cancel the transfer since the remote desktop became too unresponsive and sluggish. So I restarted it after midnight, when my local transfer speed is over 4 MB/s. My impression is that, independent of the transfer speed, the remote computer becomes sluggish while copying over RDP. At the same time, downloading from the internet doesn't make the remote host sluggish. As far as I understand, it is because the clipboard of the remote computer, and so its memory, becomes overloaded by the transfer. How can I control (restrict) the usage of the clipboard for a specific process (pasting of a file)? What are the possible ways to control it?
    Update: After reading that the slow transfer speed is caused by the encryption used for copy & pasting over RDP, and since I am more interested in overall efficiency (both the time it takes to get the file and the possibility to keep working without waiting), I changed the question title from "How to control the usage of the remote desktop clipboard for pasting a big file?" to "How to better copy & paste big files over RDP?". For example, is it better to copy & paste one huge (zip) archive, or to unzip it and copy & paste a folder with the unzipped files? More exactly, I wanted to ask: what are the possible ways to improve the overall experience, namely:
    - the speed of transfer (i.e. availability of the needed file)
    - responsiveness of the remote host (making the remote computer available for work before the copy & paste completes)?


  • Server speed: sharing one script.php or using many copies the same script.php

    - by Marco Demaio
    Let's assume I have thousands of domains on the same Apache server. Each domain is in a folder under the server's public_html document folder, so it can be accessed by calling "www.somedomain.com" or by calling "www.serverdomain.com/somedomain_folder". In each domain there is a website that needs a certain script.php (identical for each domain). From a coding point of view, it's obviously better to use a single script.php: when I update it with new features/bug fixes etc., I only need to update one file on the server and it will work for all domains. But from a server point of view? If I use a single shared script, all domains will access it at the same time; will the server run slower compared to the situation where each domain calls its own copy of the script?


  • Server slowdown

    - by Clinton Bosch
    I have a GWT application running on Tomcat on a cloud Linux (Ubuntu) server. Recently I released a new version of the application and suddenly my server response times have gone from 500ms average to 15s average. I have run every monitoring tool I know:
    - iostat says my disks are 0.03% utilised
    - mysqltuner.pl says I am OK (see below)
    - top says my processor is 99% idle and load average: 0.20, 0.31, 0.33
    - memory usage is 50% (-/+ buffers/cache: 3997 3974)
    mysqltuner output:
        [OK] Logged in using credentials from debian maintenance account.
        -------- General Statistics --------------------------------------------------
        [--] Skipped version check for MySQLTuner script
        [OK] Currently running supported MySQL version 5.1.63-0ubuntu0.10.04.1-log
        [OK] Operating on 64-bit architecture
        -------- Storage Engine Statistics -------------------------------------------
        [--] Status: +Archive -BDB -Federated +InnoDB -ISAM -NDBCluster
        [--] Data in MyISAM tables: 370M (Tables: 52)
        [--] Data in InnoDB tables: 697M (Tables: 1749)
        [!!] Total fragmented tables: 1754
        -------- Security Recommendations -------------------------------------------
        [OK] All database users have passwords assigned
        -------- Performance Metrics -------------------------------------------------
        [--] Up for: 19h 25m 41s (1M q [28.122 qps], 1K conn, TX: 2B, RX: 1B)
        [--] Reads / Writes: 98% / 2%
        [--] Total buffers: 1.0G global + 2.7M per thread (500 max threads)
        [OK] Maximum possible memory usage: 2.4G (30% of installed RAM)
        [OK] Slow queries: 0% (1/1M)
        [OK] Highest usage of available connections: 34% (173/500)
        [OK] Key buffer size / total MyISAM indexes: 16.0M/279.0K
        [OK] Key buffer hit rate: 99.9% (50K cached / 40 reads)
        [OK] Query cache efficiency: 61.4% (844K cached / 1M selects)
        [!!] Query cache prunes per day: 553779
        [OK] Sorts requiring temporary tables: 0% (0 temp sorts / 34K sorts)
        [OK] Temporary tables created on disk: 4% (4K on disk / 102K total)
        [OK] Thread cache hit rate: 84% (185 created / 1K connections)
        [!!] Table cache hit rate: 0% (256 open / 27K opened)
        [OK] Open file limit used: 0% (20/2K)
        [OK] Table locks acquired immediately: 100% (692K immediate / 692K locks)
        [OK] InnoDB data size / buffer pool: 697.2M/1.0G
        -------- Recommendations -----------------------------------------------------
        General recommendations:
            Run OPTIMIZE TABLE to defragment tables for better performance
            MySQL started within last 24 hours - recommendations may be inaccurate
            Enable the slow query log to troubleshoot bad queries
            Increase table_cache gradually to avoid file descriptor limits
        Variables to adjust:
            query_cache_size (> 16M)
            table_cache (> 256)
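
    Not part of the original post, but since every system-level metric looks healthy, one useful next step is to measure where the 15 seconds actually go, starting with the raw response time of the application endpoint over time. A minimal probe sketch in TypeScript, assuming Node 18+ for the built-in fetch; the URL is a placeholder.
        // Sketch only: polls an application URL and logs the response time, so a jump
        // from ~500 ms to ~15 s can be correlated with load, GC pauses, or DB activity.
        const url = 'http://your-server.example.com/app/rpc';  // placeholder endpoint

        async function probeOnce(): Promise<void> {
          const start = performance.now();
          try {
            const res = await fetch(url, { method: 'GET' });
            const elapsed = performance.now() - start;
            console.log(`${new Date().toISOString()} ${res.status} ${elapsed.toFixed(0)} ms`);
          } catch (err) {
            console.log(`${new Date().toISOString()} request failed:`, (err as Error).message);
          }
        }

        // One sample every 10 seconds; leave running while the slowdown is happening.
        setInterval(probeOnce, 10_000);
        probeOnce();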

