Search Results

Search found 36111 results on 1445 pages for 'mysql update'.

Page 333/1445

  • Updating a Safari Extension?

    - by Ricky Romero
    Hi there, I'm writing a simple Safari Extension, and I'm trying to figure out how to get the update mechanism working. Apple's documentation here is delightfully vague: http://developer.apple.com/safari/library/documentation/Tools/Conceptual/SafariExtensionGuide/UpdatingExtensions/UpdatingExtensions.html And here's my manifest, based on that documentation:

        <?xml version="1.0" encoding="UTF-8"?>
        <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
        <plist version="1.0">
        <dict>
            <key>Extension Updates</key>
            <array>
                <dict>
                    <key>CFBundleIdentifier</key>
                    <string>net.rickyromero.safari.shutup</string>
                    <key>Team Identifier</key>
                    <string>TMM5P68287</string>
                    <key>CFBundleVersion</key>
                    <string>1</string>
                    <key>CFBundleShortVersionString</key>
                    <string>1.0</string>
                    <key>URL</key>
                    <string>http://rickyromero.net/misc/SafariExtensions/ShutUp.safariextz</string>
                </dict>
            </array>
        </dict>
        </plist>

    I don't know where to get "YourCertificateID," for example. And when I increment the values for CFBundleVersion and CFBundleShortVersionString, it doesn't trigger an update. I know Safari is hitting my manifest, though, because I'm watching HTTP traffic. Thoroughly stumped. Any ideas, guys?

    Read the article

  • Reversed Latitude/Longitude US TIGER/Line Shape File to MySQL w/ OGR2OGR

    - by Dave LeJeune
    Hi - I've downloaded the latest set (2010) of TIGER edge shape files (ESRI shapefile format) from the US Census website and am loading them into MySQL using the GDAL ogr2ogr utility. A new table (geotest) does get created with a SHAPE column whose geometry is defined as a LINESTRING. However, the latitude and longitude values come out reversed when running the following command: ogr2ogr -f "MySQL" MySQL:"geo,user=lejeuned,host=localhost,password=cnickl234" -nln geotest -nlt LINESTRING -append -a_srs "EPSG:4326" -lco engine=MYISAM tl_2010_01021_edges.shp Mapping the latitude/longitude (after reversing them, of course) they appear to be spot on, so I suspect there is just something I am doing wrong or a flag I am missing that is causing the latitude and longitude to be transposed. When I select the SHAPE column using astext() I get the following result: LINESTRING(-86.69863 32.973164,-86.69853 32.97302,-86.69856 32.97287,-86.698613 32.972825,-86.6988 32.972825,-86.6989 32.972892,-86.6989 32.973002,-86.69874 32.97316,-86.69864 32.97318,-86.69863 32.973164) Any ideas what I am doing wrong?
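
    Worth checking before assuming a bug: WKT orders coordinates as X Y, which for EPSG:4326 means longitude first, so the LINESTRING above is actually in the conventional order. A quick sanity check using MySQL 5.1's spatial accessors (database, table, and column names taken from the question):

        mysql geo -e "SELECT X(StartPoint(SHAPE)) AS lon, Y(StartPoint(SHAPE)) AS lat FROM geotest LIMIT 1;"

    If an application genuinely needs latitude-first coordinates, swapping at query time is usually simpler than changing the load step.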

    Read the article

  • Dedicated server automatic backup solution

    - by Luigi
    I have a dedicated Ubuntu web server in a cloud environment, and I am looking for a nice way to do automated backups. I would like to back up some directories with web apps, and all my MySQL databases. As for destination: make snapshots every two hours locally, and every six hours to a remote FTP server. Also delete backup archives older than seven days (locally + FTP), and notify on any problems by email. To achieve some of this functionality I currently use cron + a shell script, and http://www.mysqldumper.net/, but that really doesn't answer my needs: MySQLDumper doesn't automatically learn about new databases, and the shell script does not notify on problems. It's something I have to check on from time to time, and I don't trust it. I googled a while, and it seems most people solve this stuff with shell scripts. Is this a method you can trust? Are there any web-GUI tools I'm missing? Maybe there is a smarter strategy for doing this? I'm a little bit confused.
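
    For reference, a minimal cron-driven sketch covering the points above; all paths, credentials, and the FTP host are placeholder assumptions, and mail delivery presumes a working local MTA:

        #!/bin/bash
        # Hedged sketch: dump all databases, archive the web apps, push to FTP
        # every sixth hour, rotate after 7 days, and mail on any failure.
        set -euo pipefail
        BACKUP_DIR=/var/backups/snapshots            # assumption
        FTP_URL=ftp://user:pass@backup.example.com   # assumption
        ADMIN=admin@example.com                      # assumption
        STAMP=$(date +%Y%m%d-%H%M)
        trap 'echo "backup failed $(date)" | mail -s "backup FAILED on $(hostname)" "$ADMIN"' ERR
        mkdir -p "$BACKUP_DIR"
        # --all-databases picks up newly created databases automatically
        mysqldump --all-databases --single-transaction -u backup -pSECRET \
            | gzip > "$BACKUP_DIR/mysql-$STAMP.sql.gz"
        tar czf "$BACKUP_DIR/webapps-$STAMP.tar.gz" /var/www
        # on every sixth hour, also push the fresh archives to the FTP server
        if [ $(( 10#$(date +%H) % 6 )) -eq 0 ]; then
            curl -sS -T "$BACKUP_DIR/mysql-$STAMP.sql.gz" "$FTP_URL/"
            curl -sS -T "$BACKUP_DIR/webapps-$STAMP.tar.gz" "$FTP_URL/"
        fi
        # local rotation; remote rotation would need lftp or a server-side job
        find "$BACKUP_DIR" -name '*.gz' -mtime +7 -delete

    Scheduled from cron as, e.g., 0 */2 * * * /usr/local/sbin/backup.sh, this covers the two-hour local and six-hour remote cadence in one script.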

    Read the article

  • Redmine plug-in fails at rake db:migrate_plugins

    - by Drew
    Hey, first post, so I hope I'm in the right place. While trying to install the Redmine plug-in 'Wiki Extensions', I keep getting stuck when I run the "rake db:migrate_plugins RAILS_ENV=production" command. I am moving servers and I'm a bit in over my head. I haven't found anything on Google that has helped me much, though I might have missed something. I have pasted in the output with --trace:

        (in /srv/www/vastpark.org/redmine)
        ** Invoke db:migrate_plugins (first_time)
        ** Invoke environment (first_time)
        ** Execute environment
        ** Execute db:migrate_plugins
        Migrating engines...
        Migrating acts_as_activity_provider...
        Migrating acts_as_attachable...
        Migrating acts_as_customizable...
        Migrating acts_as_event...
        Migrating acts_as_list...
        Migrating acts_as_searchable...
        Migrating acts_as_tree...
        Migrating acts_as_versioned...
        Migrating acts_as_watchable...
        Migrating awesome_nested_set...
        Migrating classic_pagination...
        Migrating coderay-0.9.2...
        Migrating gravatar...
        Migrating open_id_authentication...
        Migrating prepend_engine_views...
        Migrating redmine_wiki_extensions...
        == CreateWikiExtensionsComments: migrating ===================================
        -- create_table(:wiki_extensions_comments)
        rake aborted!
        An error has occurred, all later migrations canceled:
        Mysql::Error: Table 'wiki_extensions_comments' already exists: CREATE TABLE 'wiki_extensions_comments' ('id' int(11) DEFAULT NULL auto_increment PRIMARY KEY, 'wiki_page_id' int(11), 'key_word' varchar(255), 'user_id' int(11), 'comment' text, 'created_at' datetime, 'updated_at' datetime) ENGINE=InnoDB
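
    If a first attempt at this migration died partway through, the table may simply be left over from that half-run. A hedged way out, assuming the Redmine database is named redmine (verify the table is actually empty before dropping anything):

        mysql -u root -p redmine -e "SELECT COUNT(*) FROM wiki_extensions_comments;"
        mysql -u root -p redmine -e "DROP TABLE wiki_extensions_comments;"

    After that, re-running rake db:migrate_plugins RAILS_ENV=production should let the CreateWikiExtensionsComments migration create the table itself.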

    Read the article

  • Apache crashes every 5min

    - by Simon
    I'm relatively new to server issues, having a site of mine that I started early in the year grow beyond my capabilities of managing it. I need help. I recently moved out of my shared hosting environment onto a dedicated virtual server from Mediatemple. Each week, I run a script that fetches data from my DB, fetches data from last.fm's API, and then tweets information to Twitter. My server uses Virtuozzo, and when the script runs, Apache crashes every 5 minutes. I checked and saw that the 'kmemsize' parameter reaches its cap (it's 13MB). I realise my problem: the MySQL process stays open for a long time while Apache has to handle lots of incoming traffic (about 200,000 pageviews for that day according to my previous host's AWSTATS). Yes, I'm quite inexperienced at this, and I'm clearly killing the server with too much incoming traffic while it has to manage updating the DB. So that's the background; I have a few questions. 1) Why did my shared hosting environment not crash Apache every 5 minutes? It ran fine; the site only slowed a lot. Clearly, it must be the virtual container and the kmemsize limit? 2) Where do I go from here? Would a physical server (not a virtual container) encounter the same problems? I sent a support request to Mediatemple as well. I need all the help I can get.
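
    On Virtuozzo (OpenVZ) containers the kernel publishes every beancounter, including kmemsize, together with a failure counter, so the limit theory is easy to confirm while the script runs; this file is standard inside such containers:

        # a growing failcnt on the kmemsize row means that limit, not Apache
        # itself, is what keeps killing processes
        grep -E 'uid|failcnt|kmemsize|privvmpages' /proc/user_beancounters

    A rising kmemsize failcnt during the weekly run would also answer question 1: the shared host imposed no such per-container limit, so the same load merely slowed things down instead of crashing them.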

    Read the article

  • A few tables are still out of sync after running mk-table-sync

    - by smusumeche
    I have 1 master and 2 slaves. I am using MySQL 5.1.42 on all servers. I am attempting to use mk-table-checksum to verify that their data is in sync, but I am getting unexpected results on one of the slaves. First, I generate the checksums on the master like this: mk-table-checksum h=localhost --databases MYDB --tables {$table_list} --replicate=MYDB.mk_checksum --chunk-size=10M My understanding is that this runs the checksum queries on the master, which then propagate via normal replication to the slaves, so no locking is needed because the slaves will be at the same logical point in time when they run the checksum queries on themselves. Is this correct? Next, to verify that the checksums match, I run this on the master: mk-table-checksum --databases MYDB --replicate=IRC.mk_checksum --replicate-check 1 h=localhost,u=maatkit,p=xxxx If there are any differences, I repair the slaves like this: mk-table-sync --execute --verbose --replicate IRC.mk_checksum h=localhost,u=maatkit,p=xxxx After doing all of this, I repaired both slaves with mk-table-sync. However, every time I run this sequence (after everything has already been repaired), one slave is perfectly in sync but the other always has a few tables out of sync. I am 99.999% sure that the data on the slaves matches, since I repaired everything and the tables were not even updated on the master between runs of the checksum script. What would cause a few tables to always show out of sync on only one of the slaves? I am stuck. Here is the output:

        Differences on h=x.x.x.x,p=...,u=maatkit
        DB   TBL                CHUNK  CNT_DIFF  CRC_DIFF  BOUNDARIES
        IRC  product            10     0         1         product_id >= 147377 AND product_id < 162085
        IRC  post_order_survey  0      0         1         1=1
        IRC  mk_heartbeat       0      0         1         1=1
        IRC  mailing_list       0      0         1         1=1
        IRC  honey_pot_log      0      0         1         1=1
        IRC  product            12     0         1         product_id >= 176793 AND product_id < 191501
        IRC  product            18     0         1         product_id >= 265041
        IRC  orders             26     0         1         order_id >= 694472
        IRC  orders_product     6      0         1         op_id >= 935375
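
    One low-risk way to tell whether those recurring differences are real rows or checksum noise (floating-point columns and row-order-sensitive checksums are classic causes) is to ask mk-table-sync what it would change without executing anything, run against the odd slave (hostname assumed):

        mk-table-sync --sync-to-master --print --databases IRC \
            h=slave2.example.com,u=maatkit,p=xxxx

    An empty --print run for a table that still reports CRC_DIFF points at the checksum method rather than the data.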

    Read the article

  • Server Recovery from Denial of Service

    - by JMC
    I'm looking at a server that might be misconfigured for handling a denial of service. The database was knocked offline by the attack and then failed to restart once the attack subsided. Details of the attack: the attacker, intentionally or not, sent thousands of search queries through the application's search URL within a couple of seconds. It looks like the server was overwhelmed, which caused the database to log the messages below. Server specs: 1.5GB of dedicated memory. Are there any obvious misconfigurations here that I'm missing?

        **mysql.log**
        121118 20:28:54 mysqld_safe Number of processes running now: 0
        121118 20:28:54 mysqld_safe mysqld restarted
        121118 20:28:55 [Warning] option 'slow_query_log': boolean value '/var/log/mysqld.slow.log' wasn't recognized. Set to OFF.
        121118 20:28:55 [Note] Plugin 'FEDERATED' is disabled.
        121118 20:28:55 InnoDB: The InnoDB memory heap is disabled
        121118 20:28:55 InnoDB: Mutexes and rw_locks use GCC atomic builtins
        121118 20:28:55 InnoDB: Compressed tables use zlib 1.2.3
        121118 20:28:55 InnoDB: Using Linux native AIO
        121118 20:28:55 InnoDB: Initializing buffer pool, size = 512.0M
        InnoDB: mmap(549453824 bytes) failed; errno 12
        121118 20:28:55 InnoDB: Completed initialization of buffer pool
        121118 20:28:55 InnoDB: Fatal error: cannot allocate memory for the buffer pool
        121118 20:28:55 [ERROR] Plugin 'InnoDB' init function returned error.
        121118 20:28:55 [ERROR] Plugin 'InnoDB' registration as a STORAGE ENGINE failed.
        121118 20:28:55 [ERROR] Unknown/unsupported storage engine: InnoDB
        121118 20:28:55 [ERROR] Aborting

        **ulimit -a**
        core file size (blocks, -c) 0
        data seg size (kbytes, -d) unlimited
        scheduling priority (-e) 0
        file size (blocks, -f) unlimited
        pending signals (-i) 13089
        max locked memory (kbytes, -l) 64
        max memory size (kbytes, -m) unlimited
        open files (-n) 1024
        pipe size (512 bytes, -p) 8
        POSIX message queues (bytes, -q) 819200
        real-time priority (-r) 0
        stack size (kbytes, -s) 8192
        cpu time (seconds, -t) unlimited
        max user processes (-u) 1024
        virtual memory (kbytes, -v) unlimited
        file locks (-x) unlimited

        **httpd.conf**
        StartServers 10
        MinSpareServers 8
        MaxSpareServers 12
        ServerLimit 256
        MaxClients 256
        MaxRequestsPerChild 4000

        **my.cnf**
        innodb_buffer_pool_size=512M
        # Increase Innodb Thread Concurrency = 2 * [numberofCPUs] + 2
        innodb_thread_concurrency=4
        # Set Table Cache
        table_cache=512
        # Set Query Cache_Size
        query_cache_size=64M
        query_cache_limit=2M
        # A sort buffer is used for optimizing sorting
        sort_buffer_size=8M
        # Log slow queries
        slow_query_log=/var/log/mysqld.slow.log
        long_query_time=2
        #performance_tweak
        join_buffer_size=2M

        **php.ini**
        memory_limit = 128M
        post_max_size = 8M
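
    One reading of the log, offered as a hypothesis: errno 12 is ENOMEM, so by the time mysqld_safe restarted MySQL, Apache (up to 256 prefork children after the query flood) had already claimed the memory InnoDB needed to mmap its 512MB buffer pool. A hedged starting point for a 1.5GB box, with illustrative numbers only:

        # my.cnf -- leave headroom for Apache and the OS
        innodb_buffer_pool_size = 256M
        query_cache_size        = 32M

        # httpd.conf -- worst case is roughly MaxClients x per-child RSS;
        # 256 children at ~25MB each cannot fit in 1.5GB next to MySQL
        ServerLimit  48
        MaxClients   48

    Whatever the exact numbers, the invariant is that Apache's worst case plus MySQL's fixed allocations must stay below physical RAM, or the next flood reproduces this failure.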

    Read the article

  • Server configuration for our website [duplicate]

    - by Varun Varunesh
    This question already has an answer here: Can you help me with my capacity planning? We are a start-up, and six months back we launched our beta website. Now we are building the website and web services for the final product. The site will be based on PHP, Python, a MySQL database, and a WAMP server. Right now, in the beta, we are hosting on an Azure VM with 786MB RAM and a shared CPU, and we average 200 daily users. Now we are trying to grow from 200 to 1,500 daily users, and I am thinking our server should be able to handle at least 100 concurrent users. We have also developed web services for our mobile apps, which can add further load on the server. So here are the questions that bring me here: I am pretty much confused about whether to go with shared hosting or VM-based hosting. If a VM, what configuration will best fit the requirements I discussed above? Our current VM is a Windows-based server and is very simple to manage, so other than the cost factor, why should I go for a Linux-based server? What other factors should I keep in mind while choosing a server for these requirements?

    Read the article

  • mysqldump skip one table

    - by danneth
    I'm running a cronjob to back up our system using mysqldump. The database contains 90 or so tables. One of these tables is HUGE and every once in a while causes the dump to fail. From the manual I see that you can specify specific tables to dump: shell> mysqldump [options] db_name [tbl_name ...] This got me thinking: what if I have two jobs, one for dumping the huge table, and one for all the others? To accomplish this it would be nice if I could do something like shell> mysqldump -u backupuser -p database huge_table > db_huge_table.sql shell> mysqldump -u backupuser -p database --skip huge_table > db_rest.sql Unfortunately I'm not seeing such an option. I could of course explicitly state the other 89 tables, but that just seems like a mess. Another option would be a script of some sort, but before checking that route I'll try this resource. MySQL is 5.1.61 on CentOS 6.2
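
    For what it's worth, mysqldump does have a real counterpart to the hypothetical --skip flag sketched above: --ignore-table, given once per excluded table and qualified with the database name. Using the names from the question:

        # everything except the huge table
        mysqldump -u backupuser -p database --ignore-table=database.huge_table > db_rest.sql
        # the huge table on its own schedule
        mysqldump -u backupuser -p database huge_table > db_huge_table.sql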

    Read the article

  • Web Hosting: Any web host that supports more than 50,000 files?

    - by Devner
    Hi all, For my PHP & MySQL based application, I am trying to buy website hosting from a host that does not limit the number of files I carry in my hosting account. Almost all the hosts I have found share a common limit of 50,000 files (some call it 50,000 nodes); the rest (to the extent of my search) are not even close. I have gone through the various websites, googled a lot of information, and spoken with the customer service of the hosting companies, and they said that they have a limit of 50,000 files and that's why they call it the LIMIT. Now I have my application, which is a kind of social networking website, where people can upload various files of varying size. So if 50,000 users were to join the website and upload one file each, the limit of 50,000 would be reached very easily, and my 50,001st customer would start facing file upload problems (and so would my account). So I would like to know if there are any website hosting services that do NOT levy such restrictions. In summary, I need the following options: No maximum file limit (more than 50,000 files in an account). No maximum file upload limit in the server settings (10MB, 12MB, 15MB, 20MB, etc.). Ability to upload files of various types (zip, flv, jpg, png, etc.). Ability to stream audio and video (live audio & video not necessary). Access to .htaccess. Access to php.ini, my.cnf or my.ini (this would be a plus). Supports SSL. Provides dedicated hosting (& IP) as well. Monthly payments without contracts are a plus. If you know of any such website hosting services, please post a reply (a link will be appreciated). Thank you.

    Read the article

  • Developing a high-performance and scalable Zend Framework website [on hold]

    - by Daniel
    We are going to develop an ads website like http://www.gumtree.com/ (it will not be like this one, but just to give you an idea), and we are having some issues regarding performance and scalability. We are planning on using Zend Framework for this project, but that is all I'm sure of at this point. I don't think a classic approach like Zend Framework (PHP) + MySQL + Memcache + jQuery (and I would throw Doctrine 2 in there too) will result in a high-performance application. I was thinking of making this a RESTful application (with Zend Framework) + NGINX (or maybe MongoDB) + Memcache (or eAccelerator -- I understand this will create problems with scalability on multiple servers) + jQuery, or maybe throw Backbone.js in there, plus a CDN for static content, a server for images, and a scalable server for the requests and the rest. My questions are: What do you think about my approach? What solutions would you recommend for developing a high-performance, scalable application expected to have a lot of traffic using PHP (Zend Framework 2)? I would be interested in your approach. I should note that I'm a Zend developer; I have been working with Zend for over 3 years, which is why I'm choosing it.

    Read the article

  • Choosing hosting for a custom ecommerce site: shared or dedicated, and what to look for?

    - by spirytus
    Hi, I have (almost) finished developing a website for my client and now need to decide on hosting. Most of the users of the site will be located in Australia, and so am I, and so is my client. Now, I want to consider everything before deciding on a host, and a few questions come to mind: I cannot afford the website being down, and all hosts say something like "99% uptime guaranteed". Should that alone be enough, or should I ask hosts for some stats? Does it make any difference whether the servers and the whole hosting company are located in Australia or outside? I've been hosting a few sites with JustHost.com on shared hosting (cheapest plan, servers in the US I believe) and never seen any delays, but could that be an issue? I would prefer an Australian company so I can actually go to them and give them a piece of my mind if something goes wrong, but US servers seem cheaper. Would shared hosting do? It's a custom-built ecommerce PHP application, and I know there are security issues with sessions etc. on shared hosting. I will take precautions of course, but could shared hosting be an issue? Would dedicated be a worthy option considering that my knowledge of servers is very limited? I need to run PHP/MySQL, with preferably unlimited bandwidth, as with my experience I cannot tell what amount of traffic would be sufficient. Please let me know if I haven't provided enough information to answer my questions; I will gladly explain further. Thanks in advance for any answers :)

    Read the article

  • High-performance Academic Server [closed]

    - by PHPsmith
    Suppose I want to build a server for a university's academic needs. The server is dedicated to a single site, where users (students and lecturers) just view and fill in academic data. But at peak times (e.g., once a semester), about 12,000 students will access the site simultaneously. Due to limited resources, I have to build the server using free software (except for the operating system, Windows 7, which the university has already provided). The hardware is also limited to an ordinary 4-core desktop (e.g., an Ivy Bridge Intel Core i7-3770) with approximately 16GB of memory (DDR3 1600MHz), equipped with an RJ-45 port (Intel 82579 Gigabit Ethernet). With all these limitations, I have to choose software (web server, database, etc.) appropriate to the goal. I have decided to build the site in PHP. Please help me by answering the following questions based on your expertise (my prime candidates after googling are in parentheses). Which web server is fastest, most stable, and most secure when implemented and optimized for PHP, and why? (nginx) Which PHP accelerator is fastest and most stable and compatible with the selected web server, and why? (APC with Zend Optimizer+) Which database is fastest, most stable, and most secure when implemented and optimized for the selected web server and PHP accelerator? (MySQL) Are there any mistakes in my assumptions so far? If so, please enlighten me. Is there anything else I need to know in order to achieve this goal? I understand that performance also depends on the implementation of the source code, so assume the site will be written with best efficiency in mind (e.g., using AJAX).
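
    Should the nginx candidate win out, the core of a PHP setup on Windows is a FastCGI hand-off to a php-cgi (or PHP-FPM) process. A minimal, hedged server block; the document root and backend port are assumptions (9000 is merely the conventional choice):

        server {
            listen 80;
            root   C:/sites/academic/public;   # assumed document root
            index  index.php;
            location / {
                try_files $uri $uri/ /index.php?$args;
            }
            # hand .php requests to the FastCGI backend
            location ~ \.php$ {
                fastcgi_pass  127.0.0.1:9000;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                include       fastcgi_params;
            }
        }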

    Read the article

  • Multi-value MySQL inserts using HibernateTemplate

    - by Langali
    I am using Spring's HibernateTemplate and need to insert hundreds of records into a MySQL database every second. I'm not sure what the most performant way of doing it is, but I am trying to see how multi-value MySQL inserts do under Hibernate. final String query = "insert into user(age, name, birth_date) values(24, 'Joe', '2010-05-19 14:33:14'), (25, 'Joe1', '2010-05-19 14:33:14')"; getHibernateTemplate().execute(new HibernateCallback(){ public Object doInHibernate(Session session) throws HibernateException, SQLException { return session.createSQLQuery(query).executeUpdate(); } }); But I get this error: 'could not execute native bulk manipulation query.' Please check your query ..... Any idea if I can use a multi-value MySQL insert through Hibernate? Or is my query incorrect? Are there any other ways I can improve the performance? I did try the saveOrUpdateAll() method, and that wasn't good enough!

    Read the article

  • Why is my server performance degrading to the point of stopping, periodically?

    - by Pascal Aschwanden
    So, once in a while, I see in Firebug that a request takes over 15 or even 60 seconds to respond, and sometimes never does. Here is what I've ruled out: It's not the CPU, because every time I check the server load it's less than 6 for all 3 numbers. It's not the memory, because that's fairly low too, less than 50%. It's not the I/O anymore, because I've seen the graphs that Joyent sent back to me when I requested them, and they show less than 3MB of I/O (mostly all reads). It's not the SQL performance - I've profiled every last SQL command that runs, and they're all (99.9% of them anyway) running in less than 30ms; most run in less than 5ms. Oh, and I've been profiling all the script execution times, and even when the problem occurs, the script always manages to finish in 50ms or less (1/20th of a second). Now, I do run a lot of Ajax calls: one every 2 seconds per user, and I have 300+ DAU. But even if all 300 are playing simultaneously, that's still only 150 calls per second max. The only other thing I can think of is that one of my neighbors is funky. The problem is highly intermittent. 99% of the time it works perfectly and there's excellent performance, but 99%+ is not good enough. Eventually the performance gets so bad I have to restart the server, at which point everything is fine again. I've done this about 4 times now. Any ideas? Note: this is on Joyent, VPS, intro package, 256MB of RAM with bursting. Here is the MySQL traffic info:

        Traffic                          ø per hour
        Received    18 MiB               29 MiB
        Sent        134 MiB              221 MiB
        Total       151 MiB              251 MiB

        Connections                      ø per hour   %
        max. concurrent connections   5  ---          ---
        Failed attempts               0  0.00         0.00%
        Aborted                       0  0.00         0.00%
        Total                     9,418  15.59 k      100.00%
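
    Since the stalls are intermittent, a cheap watcher left running captures the next one with context; everything below is standard tooling, with only the log path assumed:

        # sample system and MySQL state every 10 seconds
        while true; do
            { date
              vmstat 1 2 | tail -1                        # run queue, swap, cpu
              mysql -e 'SHOW FULL PROCESSLIST\G' | head -50
            } >> /var/log/stall-watch.log
            sleep 10
        done

    Comparing a sample taken during a stall against the healthy baseline usually points at swap, a blocked query, or, on a burstable VPS, a throttled neighbor.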

    Read the article

  • mysqldump --where with = operator doesn't get all rows - Help!

    - by JonathanLIVE
    I have a situation with a particular table that now thinks it contains 4 petabytes of data. I know that sounds cool, but I assure you, it is only on a 60GB partition. This table has 9 fields in it. One of them is a domain_id field. It is the best field to identify rows by, as there are only approximately 6,300 distinct values. The only other candidate field has over 2 million values, and that's just more difficult. I cannot do a straight mysqldump because it will attempt to output all 4PB of data and fill the drive long before it gets close to that, so I need to surgically remove the good stuff, destroy the db, and recreate it. I believe if I can do a dump for each domain_id value, then I will get most of the usable data out of it. This is what I am trying to use: mysqldump -u root --skip-opt -q --no-create-info --skip-add-drop-table --max_allowed_packet=1000000000 database table --where="domain_id=10" > domains10.sql Using this I expect every row with domain_id 10 to be exported. However, when I check the export, I am only getting 1 row, whereas when I look at the db, there are many, many rows. It is as though the operator just finds one, then gives up. I have tried various operators. Using the < or > operators I am able to get more of the data, but the export stops short at certain rows where the data has been compromised. With over 6,000 to go through, I can't narrow down which rows are being affected in the export easily enough. So, what I need is an operator that will basically do what I thought = would do: simply give me an export of all records that match the specific field. Also note, the only way I got this DB even accessible is through innodb_force_recovery = 3. So I need to get this right, because after this is done, I have to drop the db in order to make MySQL functional again. Looking forward to any helpful answers.
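
    Assuming the per-domain approach pans out, the export can be scripted rather than run ~6,300 times by hand; a hedged sketch, with the database and table names left as the placeholders used in the question:

        for id in $(mysql -N -e "SELECT DISTINCT domain_id FROM database.table"); do
            mysqldump -u root --skip-opt -q --no-create-info --skip-add-drop-table \
                --where="domain_id = $id" database table > "domains$id.sql" \
                || echo "domain_id $id failed" >> failed_domains.log
        done

    The failure log matters here: with corrupted pages some chunks will abort, and the log shows exactly which domain_ids need recovering by other means.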

    Read the article

  • Good Hosting Providers With Zend Framework Support [closed]

    - by manyxcxi
    I currently use ixwebhosting for my hosting services. They're cheap and work (most of the time). The databases are horribly slow, the servers are horribly slow, and their support (though usually prompt) is tough to deal with. That being said, they're cheap, I've got about 20 domains hosted in my account, none of them are high volume, and they work just well enough, until today. This isn't meant to be a condemnation of ixwebhosting, though. Their prices are very low for what they do offer, and most things work just fine, most of the time. I need to be able to host web apps written with Zend Framework in a fairly easy fashion. The server performance can't be worse than what I've already had (a pretty low hurdle to clear), and I don't want to spend $30/mo. These are not money-making websites; they're projects. My requirements are PHP 5.3, ZF support, MySQL databases, and multiple domains - not much. Who should I look at, and who should I look out for?

    Read the article

  • Server specification recommendation

    - by foo
    To cut the story short, I can't buy an item (server/CPU/motherboard/RAM) that costs more than USD 330. However, I can combine them; meaning, I can buy a CPU that costs USD 330 and a motherboard that costs USD 330. With this limitation, I can't buy a powerful 1U server, which would definitely cost more than USD 330. With that in mind, I was hoping to build a powerful desktop PC to be used as a database server. However, in my experience desktop PCs don't last very long; usually the motherboard just dies by itself after 1 or 2 years. So, what would you recommend I buy with this kind of budget? Every item must be <= USD 330. It will be used as a MySQL server. RAID would be nice. 1TB is plenty for my data. I do not need an external graphics card (onboard will do just fine), mouse, keyboard, or monitor. Linux friendly. One Ethernet port is good enough. It's important that the hardware is built from components that will last (at least 3 years or so). The server will be placed in an air-conditioned room, but good ventilation for the server is always preferred. I won't overclock it. An Intel processor is preferred. Thanks in advance.

    Read the article

  • 5-year-old server upgrade

    - by rizzo0917
    I am looking to upgrade a server for a web app. Currently the application is running very sluggishly. We've made some adjustments to MySQL (that's another issue in itself), and we route the heaviest queries to a copy of the database on another server we keep as a backup; however, this will not last much longer, and we are looking to upgrade. Currently the server's CPUs are (4) Intel(R) XEON(TM) CPU 2.00GHz, with 1GB of RAM. The database is 442.5 MiB, with about 1,743,808 records. There are two parts to the program: one side, side A, inserts and updates most of the data; side B reads the data and does some minor updates. Currently our biggest day for side A is 800 users (of 40,000 users all year) entering data into the system. Our side B load is currently unknown, but we have a total of 1,000 clients. The system will most likely cap out at 5,000 side-B clients, with about 300,000 side-A users a year. The current database is 5 years old, so we can expect it to grow rapidly, possibly doubling each year (we can most likely archive older records if it comes to that). So with that being said, should we get a server for each side of the app, side A being the master and side B the slave, with any updates made on side B routed to side A? So the question is: should I get two of these, or one? 2 x Intel Nehalem Xeon E5520 2.26GHz (8 cores), 12GB DDR3 memory, 500GB SATA II HDD, 100Mbps port speed. And naturally I would need a redundant backup, so it could potentially be four of them.
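
    On the master/slave split itself: MySQL 5.x replication needs only a few my.cnf directives plus one statement on the slave. A minimal, hedged sketch; the host, credentials, and binlog coordinates are placeholders to be taken from SHOW MASTER STATUS:

        # side A (master) my.cnf
        server-id = 1
        log-bin   = mysql-bin

        # side B (slave) my.cnf
        server-id = 2
        read_only = 1

        -- run once on side B after loading a snapshot of side A:
        CHANGE MASTER TO MASTER_HOST='side-a.example.com',
            MASTER_USER='repl', MASTER_PASSWORD='secret',
            MASTER_LOG_FILE='mysql-bin.000001', MASTER_LOG_POS=106;
        START SLAVE;

    One caveat: replication is one-way and asynchronous, so "updates made on side B routed to side A" has to happen in the application (side B writing through a connection to A), not via replication itself.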

    Read the article

  • Please take a look at my server's RAM usage

    - by user66779
    Hi, I am a noob with servers. I have a CentOS 5.5 VPS with 512MB of RAM. My goal is to have it host just one Magento store. I've installed Magento on the server without any control panel, by installing LAMP myself along with whatever PHP extensions were necessary to get Magento to install. As soon as I visit my Magento store, suddenly the RAM on the VPS is almost completely used, with only about 100MB left. Please see this screenshot of htop taken just after I visited the website: http://img714.imageshack.us/img714/1944/screenouv.png As you can see, there's only around 100MB left. Is that normal? I'm wondering if I might have done something stupid with the server that makes it very resource hungry. I installed Apache from the CentOS base repo, PHP 5.3 from the IUS repository, and MySQL 5.1 also from the IUS repo. I haven't changed any of the default config files for any of these, except to set memory_limit to 256 in php.ini. Is there anything I can do to free more RAM? I'm clueless, but I see each Apache daemon is using 8% of available RAM, and AFAIK each visitor needs one Apache daemon, so I would run out of RAM with just a handful of visitors. Thanks for your advice.
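
    Before concluding the box is full, it's worth checking how much of that "used" RAM is just the kernel's page cache, which Linux hands back on demand; free separates the two (numbers below are illustrative, not from this VPS):

        $ free -m
                     total   used   free   shared   buffers   cached
        Mem:           512    410    102        0        40       180
        -/+ buffers/cache:    190    322
        Swap:            0      0      0

    The "-/+ buffers/cache" row is the one that matters: in this illustration only ~190MB is truly committed. If that row is also nearly full, capping Apache's prefork MaxClients to what 512MB can hold is the usual next step for a single-store Magento box.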

    Read the article

  • Validation in a DataGrid during insert/update in ASP.NET

    - by abhi
    <%@ Page Language="C#" AutoEventWireup="true" CodeFile="Default7.aspx.cs" Inherits="Default7" %> <%@ Register Assembly="Telerik.Web.UI" Namespace="Telerik.Web.UI" TagPrefix="telerik" %> Untitled Page   <telerik:GridTemplateColumn Visible="false"> <ItemTemplate> <asp:Label ID="lblEmpID" runat="server" Text='<%# bind("pid") %>'> </asp:Label> </ItemTemplate> </telerik:GridTemplateColumn> <telerik:GridBoundColumn UniqueName="Fname" HeaderText="Fname" DataField="Fname" CurrentFilterFunction="NotIsNull" SortExpression="Fname"> </telerik:GridBoundColumn> <telerik:GridBoundColumn UniqueName="Lname" HeaderText="Lname" DataField="Lname" CurrentFilterFunction="NotIsNull" SortExpression="Lname"> </telerik:GridBoundColumn> <telerik:GridBoundColumn UniqueName="Designation" HeaderText="Designation" DataField="Designation" CurrentFilterFunction="NotIsNull" SortExpression="Designation"> </telerik:GridBoundColumn> <telerik:GridEditCommandColumn> </telerik:GridEditCommandColumn> <telerik:GridButtonColumn CommandName="Delete" Text="Delete" UniqueName="column"> </telerik:GridButtonColumn> </Columns> <EditFormSettings> <FormTemplate> <table> <tr> <td>Fname*</td> <td> <asp:HiddenField ID="Fname" runat="server" Visible="false" /> <asp:TextBox ID="txtFname" runat="server" Text='<%#("Fname")%>'> </asp:TextBox> <asp:RequiredFieldValidator ID="EvalFname" ControlToValidate="txtFname" ErrorMessage="Enter Name" runat="server" ValidationGroup="Update"> *</asp:RequiredFieldValidator> </td> </tr> <tr> <td>Lname*</td> <td> <asp:HiddenField ID="HiddenField1" runat="server" Visible="false" /> <asp:TextBox ID="txtLname" runat="server" Text='<%#("Lname")%>'> </asp:TextBox> <asp:RequiredFieldValidator ID="RequiredFieldValidator1" ControlToValidate="txtLname" ErrorMessage="Enter Name" runat="server" ValidationGroup="Update"> *</asp:RequiredFieldValidator> </td> </tr> <tr> <td>Designation* </td> <td> <asp:HiddenField ID="HiddenField2" runat="server" Visible="false" /> <asp:TextBox ID="txtDesignation" runat="server" Text='<%#("Designation")%>'> </asp:TextBox> <asp:RequiredFieldValidator ID="RequiredFieldValidator2" ControlToValidate="txtDesignation" ErrorMessage="Enter Designation" runat="server" ValidationGroup="Update"> *</asp:RequiredFieldValidator> </td> </tr> </table> </FormTemplate> <EditColumn UpdateText="Update record" UniqueName="EditCommandColumn1" CancelText="Cancel edit"> </EditColumn> </EditFormSettings> </MasterTableView> </telerik:RadGrid> </form> This is my code; I want to perform validation using the RequiredFieldValidators, but I think I'm missing something, so please help. Here's my code-behind: using System; using System.Data; using System.Configuration; using System.Collections; using System.Web; using System.Web.Security; using System.Web.UI; using System.Web.UI.WebControls; using System.Web.UI.WebControls.WebParts; using System.Web.UI.HtmlControls; using System.Data.SqlClient; using Telerik.Web.UI; public partial class Default7 : System.Web.UI.Page { string strQry; public SqlDataAdapter da; public DataSet ds; public SqlCommand cmd; public DataTable dt; string strCon = "Data Source=MINETDEVDATA; Initial Catalog=ML_SuppliersProd; User Id=sa; Password=@MinetApps7;"; public SqlConnection con; protected void Page_Load(object sender, EventArgs e) { } protected void RadGrid1_NeedDataSource(object source, Telerik.Web.UI.GridNeedDataSourceEventArgs e) { dt = new DataTable(); con = new SqlConnection(strCon); da = new SqlDataAdapter(); try { strQry = "SELECT * FROM table2"; con.Open(); da.SelectCommand = new SqlCommand(strQry,con);
da.Fill(dt); RadGrid1.DataSource = dt; } finally { con.Close(); } } protected void RadGrid1_DeleteCommand(object source, Telerik.Web.UI.GridCommandEventArgs e) { con = new SqlConnection(strCon); cmd = new SqlCommand(); GridDataItem item = (GridDataItem)e.Item; string pid = item.OwnerTableView.DataKeyValues[item.ItemIndex]["pid"].ToString(); con.Open(); string delete = "DELETE from table2 where pid='"+pid+"'"; cmd.CommandText = delete; cmd.Connection = con; cmd.ExecuteNonQuery(); con.Close(); } protected void RadGrid1_UpdateCommand(object source, GridCommandEventArgs e) { GridEditableItem radgriditem = e.Item as GridEditableItem; string pid = radgriditem.OwnerTableView.DataKeyValues[radgriditem.ItemIndex]["pid"].ToString(); string firstname = (radgriditem["Fname"].Controls[0] as TextBox).Text; string lastname = (radgriditem["Lname"].Controls[0] as TextBox).Text; string designation = (radgriditem["Designation"].Controls[0] as TextBox).Text; con = new SqlConnection(strCon); cmd = new SqlCommand(); try { con.Open(); string update = "UPDATE table2 set Fname='" + firstname + "',Lname='" + lastname + "',Designation='" + designation + "' WHERE pid='" + pid + "'"; cmd.CommandText = update; cmd.Connection = con; cmd.ExecuteNonQuery(); con.Close(); } catch (Exception ex) { RadGrid1.Controls.Add(new LiteralControl("Unable to update Reason: " + ex.Message)); e.Canceled = true; } } protected void RadGrid1_InsertCommand(object source, GridCommandEventArgs e) { GridEditFormInsertItem insertitem = (GridEditFormInsertItem)e.Item; string firstname = (insertitem["Fname"].Controls[0] as TextBox).Text; string lastname = (insertitem["Lname"].Controls[0] as TextBox).Text; string designation = (insertitem["Designation"].Controls[0] as TextBox).Text; con = new SqlConnection(strCon); cmd = new SqlCommand(); try { con.Open(); String insertQuery = "INSERT into table1(Fname,Lname,Designation) Values ('" + firstname + "','" + lastname + "','" + designation + "')"; cmd.CommandText = insertQuery; cmd.Connection = con; cmd.ExecuteNonQuery(); con.Close(); } catch(Exception ex) { RadGrid1.Controls.Add(new LiteralControl("Unable to insert Reason:" + ex.Message)); e.Canceled = true; } } }

    Read the article

  • Trouble with DNS and Debian update

    - by Sean
    I tried to update my Debian DreamPlug server by running apt-get update as root, and received these errors:

        Err http://security.debian.org lenny/updates Release.gpg  Could not resolve 'security.debian.org'
        Err http://security.debian.org lenny/updates/main Translation-en_US  Could not resolve 'security.debian.org'
        Err http://security.debian.org lenny/updates/contrib Translation-en_US  Could not resolve 'security.debian.org'
        Err http://security.debian.org lenny/updates/non-free Translation-en_US  Could not resolve 'security.debian.org'
        Err http://www.backports.org lenny-backports Release.gpg  Could not resolve 'www.backports.org'
        Err http://www.backports.org lenny-backports/main Translation-en_US  Could not resolve 'www.backports.org'
        Err http://www.backports.org lenny-backports/contrib Translation-en_US  Could not resolve 'www.backports.org'
        Err http://www.backports.org lenny-backports/non-free Translation-en_US  Could not resolve 'www.backports.org'
        Err http://ftp.us.debian.org lenny Release.gpg  Could not resolve 'ftp.us.debian.org'
        Err http://ftp.us.debian.org lenny/main Translation-en_US  Could not resolve 'ftp.us.debian.org'
        Err http://ftp.us.debian.org lenny/contrib Translation-en_US  Could not resolve 'ftp.us.debian.org'
        Err http://ftp.us.debian.org lenny/non-free Translation-en_US  Could not resolve 'ftp.us.debian.org'
        Err http://http.us.debian.org stable Release.gpg  Could not resolve 'http.us.debian.org'
        Err http://http.us.debian.org stable/main Translation-en_US  Could not resolve 'http.us.debian.org'
        Err http://http.us.debian.org stable/contrib Translation-en_US  Could not resolve 'http.us.debian.org'
        Err http://http.us.debian.org stable/non-free Translation-en_US  Could not resolve 'http.us.debian.org'
        Reading package lists... Done
        W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny/Release.gpg  Could not resolve 'ftp.us.debian.org'
        W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny/main/i18n/Translation-en_US.gz  Could not resolve 'ftp.us.debian.org'
        W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny/contrib/i18n/Translation-en_US.gz  Could not resolve 'ftp.us.debian.org'
        W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny/non-free/i18n/Translation-en_US.gz  Could not resolve 'ftp.us.debian.org'
        W: Failed to fetch http://http.us.debian.org/debian/dists/stable/Release.gpg  Could not resolve 'http.us.debian.org'
        W: Failed to fetch http://http.us.debian.org/debian/dists/stable/main/i18n/Translation-en_US.gz  Could not resolve 'http.us.debian.org'
        W: Failed to fetch http://http.us.debian.org/debian/dists/stable/contrib/i18n/Translation-en_US.gz  Could not resolve 'http.us.debian.org'
        W: Failed to fetch http://http.us.debian.org/debian/dists/stable/non-free/i18n/Translation-en_US.gz  Could not resolve 'http.us.debian.org'
        W: Failed to fetch http://security.debian.org/dists/lenny/updates/Release.gpg  Could not resolve 'security.debian.org'
        W: Failed to fetch http://security.debian.org/dists/lenny/updates/main/i18n/Translation-en_US.gz  Could not resolve 'security.debian.org'
        W: Failed to fetch http://security.debian.org/dists/lenny/updates/contrib/i18n/Translation-en_US.gz  Could not resolve 'security.debian.org'
        W: Failed to fetch http://security.debian.org/dists/lenny/updates/non-free/i18n/Translation-en_US.gz  Could not resolve 'security.debian.org'
        W: Failed to fetch http://www.backports.org/debian/dists/lenny-backports/Release.gpg  Could not resolve 'www.backports.org'
        W: Failed to fetch http://www.backports.org/debian/dists/lenny-backports/main/i18n/Translation-en_US.gz  Could not resolve 'www.backports.org'
        W: Failed to fetch http://www.backports.org/debian/dists/lenny-backports/contrib/i18n/Translation-en_US.gz  Could not resolve 'www.backports.org'
        W: Failed to fetch http://www.backports.org/debian/dists/lenny-backports/non-free/i18n/Translation-en_US.gz  Could not resolve 'www.backports.org'
        W: Some index files failed to download, they have been ignored, or old ones used instead.
        W: You may want to run apt-get update to correct these problems

    I am able to ping IP addresses but not hostnames, and I can't seem to figure out the problem. My /etc/resolv.conf file contains nameserver 192.168.1.2, which is my router.
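
    Given that IPs ping but names don't resolve, a quick way to isolate whether the router at 192.168.1.2 is the broken link (8.8.8.8 is just a well-known public resolver used for comparison):

        # does the router answer DNS at all?
        nslookup ftp.us.debian.org 192.168.1.2
        # does a public resolver work from this box?
        nslookup ftp.us.debian.org 8.8.8.8
        # if only the public one works, try it temporarily:
        echo 'nameserver 8.8.8.8' >> /etc/resolv.conf

    If the public resolver fixes apt-get, the router's DNS forwarding (or the box's network config) is the thing to repair.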

    Read the article

  • Unable to update GridView

    - by bhakti
    Please help. I have added update/edit command buttons to a GridView to update data in my SQL Server database, but I am unable to: the data is not updated in the database. ======code for onrowupdate======================================== protected void gRowUpdate(object sender, GridViewUpdateEventArgs e) { Books b = null; b = new Books(); DataTable dt=null; GridView g = (GridView)sender; try { dt=new DataTable(); b = new Books(); b.author = Convert.ToString(g.Rows[e.RowIndex].FindControl("Author")); b.bookID = Convert.ToInt32(g.Rows[e.RowIndex].FindControl("BookID")); b.title = Convert.ToString(g.Rows[e.RowIndex].FindControl("Title")); b.price = Convert.ToDouble(g.Rows[e.RowIndex].FindControl("Price")); // b.rec = Convert.ToString(g.Rows[e.RowIndex].FindControl("Date_of_reciept")); b.ed = Convert.ToString(g.Rows[e.RowIndex].FindControl("Edition")); b.bill = Convert.ToString(g.Rows[e.RowIndex].FindControl("Edition")); b.cre_by = Convert.ToString(g.Rows[e.RowIndex].FindControl("Edition")); b.src = Convert.ToString(g.Rows[e.RowIndex].FindControl("Edition")); b.pages = Convert.ToInt32(g.Rows[e.RowIndex].FindControl("Edition")); b.pub = Convert.ToString(g.Rows[e.RowIndex].FindControl("Edition")); b.mod_by = Convert.ToString(g.Rows[e.RowIndex].FindControl("Edition")); b.remark = Convert.ToString(g.Rows[e.RowIndex].FindControl("Edition")); // b.year = Convert.ToString(g.Rows[e.RowIndex].FindControl("Edition")); b.updatebook(b); g.EditIndex = -1; dt = b.GetAllBooks(); g.DataSource = dt; g.DataBind(); } catch (Exception ex) { throw (ex); } finally { b = null; } } =================== My stored procedure for updating a book (it updates the database fine when executed in SQL Server Management Studio) ========================== set ANSI_NULLS ON set QUOTED_IDENTIFIER ON go ALTER PROCEDURE [dbo].[usp_updatebook] @bookid bigint, @author varchar(50), @title varchar(50), @price bigint, @src_equisition varchar(50), @bill_no varchar(50), @publisher varchar(50), @pages bigint, @remark varchar(50), @edition varchar(50), @created_by varchar(50), @modified_by varchar(50) /*@date_of_reciept datetime, @year_of_publication datetime*/ AS declare @modified_on datetime set @modified_on=getdate() UPDATE books SET author=@author, title=@title, price=@price, src_equisition=@src_equisition, bill_no=@bill_no, publisher=@publisher, /*Date_of_reciept=@date_of_reciept,*/ pages=@pages, remark=@remark, edition=@edition, /*Year_of_publication=@year_of_publication,*/ created_by=@created_by, modified_on=@modified_on, modified_by=@modified_by WHERE bookid=@bookid ======================== class library function for update ==================== public void updatebook(Books b) { DataAccess dbAccess = null; SqlCommand cmd = null; try { dbAccess = new DataAccess(); cmd = dbAccess.GetSQLCommand("usp_updatebook", CommandType.StoredProcedure); cmd.Parameters.Add("@bookid", SqlDbType.VarChar, 50).Value = b.bookID; cmd.Parameters.Add("@author", SqlDbType.VarChar, 50).Value = b.author; cmd.Parameters.Add("@title", SqlDbType.VarChar, 50).Value = b.title; cmd.Parameters.Add("@price", SqlDbType.Money).Value = b.price; cmd.Parameters.Add("@publisher", SqlDbType.VarChar, 50).Value = b.pub; // cmd.Parameters.Add("@year_of_publication", SqlDbType.DateTime).Value =Convert.ToDateTime( b.year); cmd.Parameters.Add("@src_equisition", SqlDbType.VarChar, 50).Value = b.src; cmd.Parameters.Add("@bill_no", SqlDbType.VarChar, 50).Value = b.bill; cmd.Parameters.Add("@remark", SqlDbType.VarChar, 50).Value = b.remark; cmd.Parameters.Add("@pages", SqlDbType.Int).Value = b.pages;
    cmd.Parameters.Add("@edition", SqlDbType.VarChar, 50).Value = b.ed; // cmd.Parameters.Add("@date_of_reciept", SqlDbType.DateTime).Value = Convert.ToDateTime(b.rec); // cmd.Parameters.Add("@created_on", SqlDbType.DateTime).Value = Convert.ToDateTime(b.cre_on); cmd.Parameters.Add("@created_by", SqlDbType.VarChar, 50).Value = b.cre_by; //cmd.Parameters.Add("@modified_on", SqlDbType.DateTime).Value = Convert.ToDateTime(b.mod_on); cmd.Parameters.Add("@modified_by", SqlDbType.VarChar, 50).Value = b.mod_by; cmd.ExecuteNonQuery(); } catch (Exception ex) { throw (ex); } finally { if (cmd.Connection != null && cmd.Connection.State == ConnectionState.Open) cmd.Connection.Close(); dbAccess = null; cmd = null; } } I have also tried to do the update the following way: protected void gv1_updating(object sender, GridViewUpdateEventArgs e) { GridView g = (GridView)sender; abc a = new abc(); DataTable dt = new DataTable(); try { a.cd_Id = Convert.ToInt32(g.DataKeys[e.RowIndex].Values[0].ToString()); //TextBox b = (TextBox)g.Rows[e.RowIndex].Cells[0].FindControl("cd_id"); TextBox c = (TextBox)g.Rows[e.RowIndex].Cells[2].FindControl("cd_name"); TextBox d = (TextBox)g.Rows[e.RowIndex].Cells[3].FindControl("version"); TextBox f = (TextBox)g.Rows[e.RowIndex].Cells[4].FindControl("company"); TextBox h = (TextBox)g.Rows[e.RowIndex].Cells[6].FindControl("created_by"); TextBox i = (TextBox)g.Rows[e.RowIndex].Cells[8].FindControl("modified_by"); //a.cd_Id = Convert.ToInt32(b.Text); a.cd_name = c.Text; a.ver = d.Text; a.comp = f.Text; a.cre_by = h.Text; a.mod_by = i.Text; a.updateDigi(a); g.EditIndex = -1; dt = a.GetAllDigi(); g.DataSource = dt; g.DataBind(); } catch(Exception ex) { throw (ex); } finally { dt = null; a = null; g = null; } } =================== but I get an 'Index out of range' exception ========= Please do reply, thanks in advance

    Read the article

  • Ruby Gem LoadError mysql2/mysql2 required

    - by Kalli Dalli
    I'm trying to set up my Rails server on OS X 10.8, but I can't get the Rails server to run. - Currently I'm using Zend Server with MySQL 5.1. - I also have installed brew and brewed MySQL. - And I used: gem install mysql2 -- --srcdir=/usr/local/mysql/include --with-opt-include=/usr/local/mysql/include The server worked before, but now I always get the LoadError below. This is the output of bundle install: ralphs-macbook-pro:admin-mockup zero$ bundle install Using rake (10.0.2) Using i18n (0.6.1) Using multi_json (1.3.7) Using activesupport (3.2.7) Using builder (3.0.4) Using activemodel (3.2.7) Using erubis (2.7.0) Using journey (1.0.4) Using rack (1.4.1) Using rack-cache (1.2) Using rack-test (0.6.2) Using hike (1.2.1) Using tilt (1.3.3) Using sprockets (2.1.3) Using actionpack (3.2.7) Using mime-types (1.19) Using polyglot (0.3.3) Using treetop (1.4.12) Using mail (2.4.4) Using actionmailer (3.2.7) Using arel (3.0.2) Using tzinfo (0.3.35) Using activerecord (3.2.7) Using activeresource (3.2.7) Using annotate (2.5.0) Using coffee-script-source (1.4.0) Using execjs (1.4.0) Using coffee-script (2.2.0) Using rack-ssl (1.3.2) Using json (1.7.5) Using rdoc (3.12) Using thor (0.16.0) Using railties (3.2.7) Using coffee-rails (3.2.2) Using columnize (0.3.6) Using debugger-ruby_core_source (1.1.5) Using debugger-linecache (1.1.2) Using debugger (1.2.2) Using formtastic (2.2.1) Using haml (3.1.7) Using haml-rails (0.3.5) Using hirb (0.7.0) Using hpricot (0.8.6) Using jquery-rails (2.1.4) Using kgio (2.7.4) Using mysql2 (0.3.11) Using php_serialize (1.2) Using polyamorous (0.5.0) Using rabl (0.7.8) Using railroady (1.1.0) Using bundler (1.2.3) Using rails (3.2.7) Using raindrops (0.10.0) Using randumb (0.3.0) Using sass (3.2.3) Using sass-rails (3.2.5) Using squeel (1.0.13) Using uglifier (1.3.0) Using unicorn (4.4.0) Your bundle is complete! Use `bundle show [gemname]` to see where a bundled gem is installed. And after starting rails s: /Users/zero/.rvm/gems/ruby-1.9.3-p327/gems/mysql2-0.3.11/lib/mysql2.rb:9:in `require': cannot load such file -- mysql2/mysql2 (LoadError) from /Users/zero/.rvm/gems/ruby-1.9.3-p327/gems/mysql2-0.3.11/lib/mysql2.rb:9:in `<top (required)>' from /Users/zero/.rvm/gems/ruby-1.9.3-p327/gems/bundler-1.2.3/lib/bundler/runtime.rb:68:in `require' from /Users/zero/.rvm/gems/ruby-1.9.3-p327/gems/bundler-1.2.3/lib/bundler/runtime.rb:68:in `block (2 levels) in require' from /Users/zero/.rvm/gems/ruby-1.9.3-p327/gems/bundler-1.2.3/lib/bundler/runtime.rb:66:in `each' from /Users/zero/.rvm/gems/ruby-1.9.3-p327/gems/bundler-1.2.3/lib/bundler/runtime.rb:66:in `block in require' from /Users/zero/.rvm/gems/ruby-1.9.3-p327/gems/bundler-1.2.3/lib/bundler/runtime.rb:55:in `each' from /Users/zero/.rvm/gems/ruby-1.9.3-p327/gems/bundler-1.2.3/lib/bundler/runtime.rb:55:in `require' from /Users/zero/.rvm/gems/ruby-1.9.3-p327/gems/bundler-1.2.3/lib/bundler.rb:128:in `require' from /Users/zero/GitHub/admin-mockup/config/application.rb:7:in `<top (required)>' from /Users/zero/.rvm/gems/ruby-1.9.3-p327/gems/railties-3.2.7/lib/rails/commands.rb:53:in `require' from /Users/zero/.rvm/gems/ruby-1.9.3-p327/gems/railties-3.2.7/lib/rails/commands.rb:53:in `block in <top (required)>' from /Users/zero/.rvm/gems/ruby-1.9.3-p327/gems/railties-3.2.7/lib/rails/commands.rb:50:in `tap' from /Users/zero/.rvm/gems/ruby-1.9.3-p327/gems/railties-3.2.7/lib/rails/commands.rb:50:in `<top (required)>' from script/rails:6:in `require' from script/rails:6:in `<main>' Thanks for any help!
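
    A common cause of this particular LoadError on OS X is a mysql2 gem compiled against one MySQL installation (here /usr/local/mysql) while the runtime library now comes from another (the Homebrew one). A hedged first step is rebuilding the gem against a single, explicit MySQL via its mysql_config:

        gem uninstall mysql2
        gem install mysql2 -- --with-mysql-config=$(brew --prefix mysql)/bin/mysql_config
        bundle install

    With two MySQL installations genuinely present, removing one (or always pointing builds at the same mysql_config) keeps the mismatch from coming back on the next bundle.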

    Read the article

  • Unable to edit a database row from JSF

    - by user1924104
    Hi guys, I have a data table in JSF which displays all of the contents of my database table, and it displays fine. I also have a delete function that can successfully delete from the database and update the data table. However, when I try to update the database I get the error java.lang.IllegalArgumentException: Cannot convert richard.test.User@129d62a7 of type class richard.test.User to long. Below is the code that I have been using to delete rows in the database, which works fine: public void delete(long userID) { PreparedStatement ps = null; Connection con = null; if (userID != 0) { try { Class.forName("com.mysql.jdbc.Driver"); con = DriverManager.getConnection("jdbc:mysql://localhost:3306/test", "root", "root"); String sql = "DELETE FROM user1 WHERE userId=" + userID; ps = con.prepareStatement(sql); int i = ps.executeUpdate(); if (i > 0) { System.out.println("Row deleted successfully"); } } catch (Exception e) { e.printStackTrace(); } finally { try { con.close(); ps.close(); } catch (Exception e) { e.printStackTrace(); } } } } I simply wanted to edit the above code so it would update records instead of deleting them, so I edited it to look like: public void editData(long userID) { PreparedStatement ps = null; Connection con = null; if (userID != 0) { try { Class.forName("com.mysql.jdbc.Driver"); con = DriverManager.getConnection("jdbc:mysql://localhost:3306/test", "root", "root"); String sql = "UPDATE user1 set name = '"+name+"', email = '"+ email +"', address = '"+address+"' WHERE userId=" + userID; ps = con.prepareStatement(sql); int i = ps.executeUpdate(); if (i > 0) { System.out.println("Row updated successfully"); } } catch (Exception e) { e.printStackTrace(); } finally { try { con.close(); ps.close(); } catch (Exception e) { e.printStackTrace(); } } } } and the XHTML is: <p:dataTable id="dataTable" var="u" value="#{userBean.getUserList()}" paginator="true" rows="10" editable="true" paginatorTemplate="{CurrentPageReport} {FirstPageLink} {PreviousPageLink} {PageLinks} {NextPageLink} {LastPageLink} {RowsPerPageDropdown}" rowsPerPageTemplate="5,10,15"> <p:column> <f:facet name="header"> User ID </f:facet> #{u.userID} </p:column> <p:column> <f:facet name="header"> Name </f:facet> #{u.name} </p:column> <p:column> <f:facet name="header"> Email </f:facet> #{u.email} </p:column> <p:column> <f:facet name="header"> Address </f:facet> #{u.address} </p:column> <p:column> <f:facet name="header"> Created Date </f:facet> #{u.created_date} </p:column> <p:column> <f:facet name="header"> Delete </f:facet> <h:commandButton value="Delete" action="#{user.delete(u.userID)}" /> </p:column> <p:column> <f:facet name="header"> Delete </f:facet> <h:commandButton value="Edit" action="#{user.editData(u)}" /> </p:column> Currently, when you press the edit button it only updates the row with the same values, as I haven't yet managed to make the data table editable against the database. I have seen a few examples where the data table gets its values from an ArrayList, but never from a database, so if you have any advice on this too it would be great. Thanks.
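
    For what it's worth, the exception lines up with the two buttons passing different things: the working delete button passes u.userID into delete(long), while the edit button passes the whole u object into editData(long). A minimal change that at least makes the signatures agree, using only names already present above:

        <h:commandButton value="Edit" action="#{user.editData(u.userID)}" />

    The edited values still won't reach the bean that way, of course; the name, email, and address fields would additionally need to come from bound inputs rather than the unmodified row.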

    Read the article
