Search Results

Search found 68715 results on 2749 pages for 'mysql data'.

  • MySQL "Host" permissions

    - by Wayne M
    What exactly is the best way to configure this? I have a user account set up for a web app, but I also want to connect to the database via a GUI. The host is specified as %, but the GUI tool repeatedly says access denied even though I am using the proper password. If I change the host to localhost, then I can connect via the command line but not via the GUI. If I add two entries, then I can connect via the command line and not the GUI. Leaving only the % entry doesn't let me connect via the command line OR the GUI. I want to be able to connect both on the actual server (via the web app itself) AND via the GUI tool.
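    A minimal sketch of one way this is usually set up; the account name, database name, and password below are placeholders. MySQL treats 'user'@'localhost' and 'user'@'%' as separate accounts, so both rows need to exist and both need the password and grants you expect:

      -- placeholders: webapp, webapp_db, secret
      CREATE USER 'webapp'@'localhost' IDENTIFIED BY 'secret';
      CREATE USER 'webapp'@'%'         IDENTIFIED BY 'secret';
      GRANT ALL PRIVILEGES ON webapp_db.* TO 'webapp'@'localhost';
      GRANT ALL PRIVILEGES ON webapp_db.* TO 'webapp'@'%';
      FLUSH PRIVILEGES;

    A GUI client connecting over TCP from another machine matches the '%' row, while the command-line client on the server itself matches the 'localhost' row.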

    Read the article

  • SASL (Postfix) authentication with MySQL and SHA1 pre-encrypted passwords

    - by webo
    I have a Rails app that uses the Devise authentication gem for user registration and login. I want the table that Devise populates when a user registers to also be the table Postfix uses to authenticate users. The table has all the fields Postfix may want for SASL authentication, except that Devise hashes the password with SHA1 before storing it in the database. How could I get Postfix/SASL to verify passwords against those hashes so that users can be authenticated properly? Devise salts the password, so I'm not sure if that helps. Any suggestions? I'd likely want to do something similar with Dovecot or Courier; I'm not attached to either one quite yet.
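    A hedged sketch of a Dovecot SQL password lookup (dovecot-sql.conf); the table and column names are assumptions based on a typical Devise schema, and the scheme is an assumption too. Devise salts and stretches its SHA1 digests, so a stock Dovecot password scheme may not match them directly, and a custom scheme or a shared hashing convention between the Rails app and Dovecot may still be needed:

      # table/column names and credentials below are assumptions
      driver = mysql
      connect = host=localhost dbname=railsapp user=mailauth password=secret
      default_pass_scheme = SSHA
      password_query = SELECT email AS user, encrypted_password AS password \
        FROM users WHERE email = '%u'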

    Read the article

  • How do I mirror a MySQL database?

    - by user45745
    I'm running two load balanced servers for one website, and I'd like the databases to be synchronized. Queries may be run on either of the two servers because they are both production sites, so the replication can't just work one way. It doesn't have to be in real-time, just fairly accurate so people don't notice a difference when they get switched to a different server.
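    A minimal sketch of the usual answer here, MySQL master-master (circular) replication, where each server is also a slave of the other; the server IDs and offsets below are assumptions. The auto_increment settings keep the two sides from generating colliding primary keys:

      # --- server A: /etc/my.cnf ---
      server-id                = 1
      log-bin                  = mysql-bin
      auto_increment_increment = 2
      auto_increment_offset    = 1

      # --- server B: /etc/my.cnf ---
      server-id                = 2
      log-bin                  = mysql-bin
      auto_increment_increment = 2
      auto_increment_offset    = 2

    Each server is then pointed at the other with CHANGE MASTER TO ... and START SLAVE. Replication is asynchronous, which matches the "not real-time, just fairly accurate" requirement.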

    Read the article

  • Extracting a line section of mysql backup using sed

    - by carpii
    I occasionally need to extract a single record from a mysqldump backup. To do this, I first extract the single table I want from the backup:

      sed -n -e '/CREATE TABLE.*usertext/,/CREATE TABLE/p' 20120930_backup.sql > table.sql

    In table.sql the records are batched using extended inserts (maybe 100 records per insert before a new line starting with INSERT INTO), so they look like:

      INSERT INTO usertext VALUES (1, field2 etc), (2, field2 etc), ...
      INSERT INTO usertext VALUES (101, field2 etc), (102, field2 etc), ...

    I'm trying to extract record 239560 from this, using:

      sed -n -e '/(239560.*/,/)/p' table.sql > record.sql

    i.e. start streaming when it finds 239560 and stop when it hits the closing bracket. But this isn't working as I hoped; it just results in the full insert batch being output. Can someone please give me some pointers as to where I'm going wrong? Would I be better off using awk for extracting segments of lines, and sed for extracting lines within a file?
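    One sketch of a workaround, assuming GNU sed and assuming no field value contains the literal text "),(" or "(239560,": split the extended INSERT into one record per line first, then match on the leading id instead of trying to match a range within one long line.

      # break each "),(" onto a new line, then keep the record whose first column is 239560
      sed 's/),(/),\n(/g' table.sql | grep '(239560,' > record.sql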

    Read the article

  • MySQL gzipped Export in PhpMyAdmin has wrong size in Mozilla

    - by Michal Gow
    This is really strange. I am using phpMyAdmin 2.11.9.6 on Linux hosting. When I export databases using "gzipped" compression in Firefox, I get files the size of the uncompressed database, but they seem to download at an incredible speed (10 times quicker than my ISP makes possible). So, in the end, for a database of 10 MB:
    - I get a 10 MB .gz downloaded in milliseconds
    - it does indeed show a 10 MB size on disk
    - it is corrupted
    Zip compression works just fine (I get a file of roughly 1 MB with the correct contents of the compressed database). And the weirdest thing: this happens only in Mozilla Firefox (13.0.1); Internet Explorer 9 downloads correct gzipped files... Any hint?

    Read the article

  • Backing up a 22 GB MySQL database daily

    - by unknown (yahoo)
    Right now I am able to do the backup using mysqldump, but I have to take down the web server AND the backup takes around 5 minutes. If I don't take down the web server, it takes forever and never finishes, and the website becomes inaccessible during the backup. Is there a quicker/better way to back up my 22 GB and growing database? All the tables are MyISAM.
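    A hedged sketch of the usual direction; the database name and backup path are placeholders. With MyISAM tables a dump has to hold read locks, so the common way to keep the backup from blocking the site is to dump from a replication slave (or from an LVM/filesystem snapshot) rather than from the live master:

      # stream rows instead of buffering them (--quick) and compress on the fly;
      # point -h at a slave so the master keeps serving the website
      mysqldump --quick --lock-tables -h slave-host bigdb | gzip > /backups/bigdb-$(date +%F).sql.gz

    Run against a slave or a snapshot mount, this can take as long as it likes without visitors noticing.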

    Read the article

  • innodb memory usage mysql

    - by Tiddo
    I have a small VPS with only 256 MB of RAM and a maximum burst up to 512 MB. When I configure my VPS without InnoDB, it only uses 130 MB of RAM, so that is no problem for me. But when I turn on InnoDB, memory usage grows to about 300-400 MB. Is it possible to run InnoDB such that I won't exceed the 256 MB? Preferably I don't want to use more than 100 MB for InnoDB. I have already come across some sites which said I could limit the memory usage, but if I limit it to only 100 MB, will the db run well enough (compared to, for example, the MyISAM storage engine)? If 100 MB is too little memory for InnoDB, can you recommend any other storage engine which supports transactions?
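    A minimal my.cnf sketch of the knobs that cap InnoDB's footprint on a small VPS; the values are illustrative assumptions, not tuned recommendations. With a buffer pool this small InnoDB still runs, just with more disk reads; whether that is "well enough" depends on how big the working set is.

      [mysqld]
      innodb_buffer_pool_size         = 64M
      innodb_additional_mem_pool_size = 2M
      innodb_log_buffer_size          = 1M
      max_connections                 = 30   # per-connection buffers add up too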

    Read the article

  • Centos/MySQL Server Tweaking

    - by josephs8
    If I wanted a professional to tweak my server, where would I go about finding someone who does this? I have a web server that gets a lot of traffic, and I'm learning all about managing web servers, but before my traffic gets out of hand I want the server tuned up. I already have some high load issues, so I wanted to see if they could help with that. Then I would continue my learning on my own. Thank you.

    Read the article

  • MySQL Non Index Queries Analysis

    - by Markii
    I'm using the log for queries not using indexes (log_queries_not_using_indexes), but it also logs queries that do use indexes and are just more advanced or use IFs. Is there a parser or a program out there that can analyze the log and give me a literal output saying "table.column should be an index"? Thanks.
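    One commonly used analyzer for this kind of log (a tool suggestion, not something from the question itself) is pt-query-digest from Percona Toolkit. It will not literally name the missing index, but it groups and ranks the logged queries so the worst offenders stand out; the log path below is an assumption.

      # summarize the slow/unindexed query log; the most expensive query patterns come out ranked first
      pt-query-digest /var/log/mysql/mysql-slow.log > digest.txt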

    Read the article

  • Auto-restart mysql when it dies

    - by Los Frijoles
    I have a Rackspace server that I have been renting to run my personal projects on. Since I am cheap, it has 256 MB of RAM and honestly can't handle a lot. Every once in a while, when there is a sharp uptick in traffic, the server starts killing processes, and mysqld seems to be a popular one for it to kill. I try to visit my site and am greeted with the message that there was an error establishing the database connection. Inspection of the logs reveals that mysqld was killed due to lack of memory. Since I am still as poor as I was yesterday and don't want to upgrade my Rackspace VM's RAM, is there a way I can tell it to automagically restart mysqld when it dies? My thought was to use something like crontab, but alas, I don't know exactly what to do there either. I guess I am a product of the "Linux on your desktop" generation, since I can do most things on my desktop and laptop (which run Linux almost exclusively) but still lack a lot of server administration skills. The server runs CentOS 6.3.
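    A minimal crontab sketch for this, assuming the stock CentOS 6 init script name (mysqld): check every five minutes and start the service if it is not running.

      # crontab -e (as root)
      */5 * * * * /sbin/service mysqld status >/dev/null 2>&1 || /sbin/service mysqld start

    The underlying memory pressure is still there, so pairing this with a smaller innodb_buffer_pool_size or some swap is worth considering.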

    Read the article

  • CentOS not allowing remote MySQL connections

    - by nd8ad
    When I assign a user from a remote IP and try to connect to a database, the connection fails. It also fails to connect as root, so something is wrong. The bind address is turned off, and I have also tried disabling iptables; still no dice. Port 3306 is forwarded. I'm running CentOS 5.6, using phpMyAdmin, but I have also tried assigning the user and creating a new database via the command line; still not working. I've been googling and troubleshooting for hours now, no dice.
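    A short checklist sketch of the usual culprits; the config path and the test account are assumptions.

      # on the server: is mysqld allowed to listen on the network at all?
      grep -E 'skip-networking|bind-address' /etc/my.cnf
      netstat -lnt | grep 3306          # expect 0.0.0.0:3306, not 127.0.0.1:3306

      # on the server: does an account row exist for the remote host?
      mysql -u root -p -e "SELECT user, host FROM mysql.user;"

      # from the remote machine: test the connection directly
      mysql -h server-ip -u appuser -p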

    Read the article

  • Realtime website slowness (PHP/MYSQL)

    - by 3s2ng
    We have a website with realtime transactions. Over the past few days we have experienced some slowness during certain periods; usually the slowness lasts about 5 minutes. During the slowness we also noticed that sometimes we cannot connect to SSH and FTP, or connecting to those services is very slow. Currently we are trying to identify the issue. We have already set up a database monitoring tool and are now about to sign up with pingdom.com. My questions are: if the website is slow due to the database (table or row locks), will it affect other services like SSH and FTP? And does the ping correlate with the page load and the connection between my PC and the server? Thanks, Mark
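    Roughly: a table or row lock only stalls MySQL clients, but if the box is swapping or the disk is saturated, SSH and FTP will feel it too, so both symptoms appearing together usually point at a resource problem rather than pure locking. A small diagnostic sketch to run during a slow period, using standard tools assumed to be installed:

      mysql -e 'SHOW FULL PROCESSLIST'   # long-running or "Locked" queries?
      vmstat 1 5                         # swapping (si/so) or CPU saturation?
      iostat -x 1 5                      # disk utilisation near 100%?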

    Read the article

  • How do you setup Postfix/Dovecot/MySQL to not look for local accounts?

    - by thiesdiggity
    I am having an issue with one of my Postfix/Dovecot mail servers and I'm unsure how to fix the problem. I will try to explain it in detail; here it goes: I have an Ubuntu server set up for virtual hosting with Postfix, Dovecot and MySQL. We have one domain set up as a virtual domain; for this example I am going to use mail.example.com. Under that domain we have one email address. I have another server (MS Exchange) set up on another one of my sub-domains, ex.example.com. The problem is that when I SMTP into the account on mail.example.com and try to send an email to an account on ex.example.com, the email is returned with an "unknown host" error. Now, I know that the mail.example.com server can resolve the ex.example.com domain, because I can ping/dig it while SSH'd in. I can also log into Postfix via Telnet and send an email to an ex.example.com mailbox. I'm guessing it has something to do with Postfix/Dovecot looking locally for the domain in the virtual domain list because of the shared base domain (example.com)? If that's the case, how do I get Postfix/Dovecot to look locally only for the full domain name (mail.example.com) and, if it doesn't find it, send the mail to the correct server by looking up the MX/A records (which I know exist and are set up correctly)? I have been working on this all day and any guidance would be GREATLY appreciated! Thanks for your time!
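    The usual cause of this pattern is the destination domain (or a parent of it) appearing in Postfix's local or virtual domain lists, which makes Postfix treat the Exchange addresses as local and bounce them as unknown. A hedged main.cf sketch of the relevant lines; the map file name is the conventional one and the values are assumptions:

      # /etc/postfix/main.cf
      mydestination           = localhost
      virtual_mailbox_domains = mysql:/etc/postfix/mysql-virtual-mailbox-domains.cf
      # ex.example.com must NOT match mydestination, the virtual-domain query,
      # or any row in the virtual domains table; Postfix then falls back to
      # normal MX/A lookups and relays the mail to the Exchange server.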

    Read the article

  • SL3/SL4 - Ado.Net Data Services Error during new DataServiceCollection<T>(queryResponse)

    - by Soulhuntre
    Hey all, I have two functions in a SL project (VS2010) that do almost exactly the same thing, yet one throws an error and the other does not. It seems to be related to the projections, but I am unsure about the best way to resolve it. The function that works is:

      public void LoadAllChunksExpandAll(DataHelperReturnHandler handler, string orderby)
      {
          DataServiceCollection<CmsChunk> data = null;
          DataServiceQuery<CmsChunk> theQuery = _dataservice
              .CmsChunks
              .Expand("CmsItemState")
              .AddQueryOption("$orderby", orderby);

          theQuery.BeginExecute(
              delegate(IAsyncResult asyncResult)
              {
                  _callback_dispatcher.BeginInvoke(() =>
                  {
                      try
                      {
                          DataServiceQuery<CmsChunk> query = asyncResult.AsyncState as DataServiceQuery<CmsChunk>;
                          if (query != null)
                          {
                              // create a tracked DataServiceCollection from the result of the asynchronous query.
                              QueryOperationResponse<CmsChunk> queryResponse = query.EndExecute(asyncResult) as QueryOperationResponse<CmsChunk>;
                              data = new DataServiceCollection<CmsChunk>(queryResponse);
                              handler(data);
                          }
                      }
                      catch
                      {
                          handler(data);
                      }
                  });
              },
              theQuery);
      }

    This compiles and runs as expected. A very, very similar function (shown below) fails:

      public void LoadAllPagesExpandAll(DataHelperReturnHandler handler, string orderby)
      {
          DataServiceCollection<CmsPage> data = null;
          DataServiceQuery<CmsPage> theQuery = _dataservice
              .CmsPages
              .Expand("CmsChildPages")
              .Expand("CmsParentPage")
              .Expand("CmsItemState")
              .AddQueryOption("$orderby", orderby);

          theQuery.BeginExecute(
              delegate(IAsyncResult asyncResult)
              {
                  _callback_dispatcher.BeginInvoke(() =>
                  {
                      try
                      {
                          DataServiceQuery<CmsPage> query = asyncResult.AsyncState as DataServiceQuery<CmsPage>;
                          if (query != null)
                          {
                              // create a tracked DataServiceCollection from the result of the asynchronous query.
                              QueryOperationResponse<CmsPage> queryResponse = query.EndExecute(asyncResult) as QueryOperationResponse<CmsPage>;
                              data = new DataServiceCollection<CmsPage>(queryResponse);
                              handler(data);
                          }
                      }
                      catch
                      {
                          handler(data);
                      }
                  });
              },
              theQuery);
      }

    Clearly the issue is the Expand projections that involve a self-referencing relationship (pages can contain other pages). This is under SL4 or SL3 using ADO.NET Data Services SL3 Update CTP3. I am open to any workaround or pointers to good information; a Google search for the error results in two hits, neither of which is particularly helpful as far as I can decipher. The short error is "An item could not be added to the collection. When items in a DataServiceCollection are tracked by the DataServiceContext, new items cannot be added before items have been loaded into the collection." The full error is:

      System.Reflection.TargetInvocationException was caught
        Message=Exception has been thrown by the target of an invocation.
        StackTrace:
          at System.RuntimeMethodHandle.InvokeMethodFast(IRuntimeMethodInfo method, Object target, Object[] arguments, SignatureStruct& sig, MethodAttributes methodAttributes, RuntimeType typeOwner)
          at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture, Boolean skipVisibilityChecks)
          at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)
          at System.Reflection.MethodBase.Invoke(Object obj, Object[] parameters)
          at System.Data.Services.Client.ClientType.ClientProperty.SetValue(Object instance, Object value, String propertyName, Boolean allowAdd)
          at System.Data.Services.Client.AtomMaterializer.ApplyItemsToCollection(AtomEntry entry, ClientProperty property, IEnumerable items, Uri nextLink, ProjectionPlan continuationPlan)
          at System.Data.Services.Client.AtomMaterializer.ApplyFeedToCollection(AtomEntry entry, ClientProperty property, AtomFeed feed, Boolean includeLinks)
          at System.Data.Services.Client.AtomMaterializer.MaterializeResolvedEntry(AtomEntry entry, Boolean includeLinks)
          at System.Data.Services.Client.AtomMaterializer.Materialize(AtomEntry entry, Type expectedEntryType, Boolean includeLinks)
          at System.Data.Services.Client.AtomMaterializer.DirectMaterializePlan(AtomMaterializer materializer, AtomEntry entry, Type expectedEntryType)
          at System.Data.Services.Client.AtomMaterializerInvoker.DirectMaterializePlan(Object materializer, Object entry, Type expectedEntryType)
          at System.Data.Services.Client.ProjectionPlan.Run(AtomMaterializer materializer, AtomEntry entry, Type expectedType)
          at System.Data.Services.Client.AtomMaterializer.Read()
          at System.Data.Services.Client.MaterializeAtom.MoveNextInternal()
          at System.Data.Services.Client.MaterializeAtom.MoveNext()
          at System.Linq.Enumerable.d_b11.MoveNext()
          at System.Data.Services.Client.DataServiceCollection`1.InternalLoadCollection(IEnumerable`1 items)
          at System.Data.Services.Client.DataServiceCollection`1.StartTracking(DataServiceContext context, IEnumerable`1 items, String entitySet, Func`2 entityChanged, Func`2 collectionChanged)
          at System.Data.Services.Client.DataServiceCollection`1..ctor(DataServiceContext context, IEnumerable`1 items, TrackingMode trackingMode, String entitySetName, Func`2 entityChangedCallback, Func`2 collectionChangedCallback)
          at System.Data.Services.Client.DataServiceCollection`1..ctor(IEnumerable`1 items)
          at Phinli.Dashboard.Silverlight.Helpers.DataHelper.<>c__DisplayClass44.<>c__DisplayClass46.<LoadAllPagesExpandAll>b__43()
        InnerException: System.InvalidOperationException
          Message=An item could not be added to the collection. When items in a DataServiceCollection are tracked by the DataServiceContext, new items cannot be added before items have been loaded into the collection.
          StackTrace:
            at System.Data.Services.Client.DataServiceCollection`1.InsertItem(Int32 index, T item)
            at System.Collections.ObjectModel.Collection`1.Add(T item)
          InnerException:

    Thanks for any help!
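    A hedged sketch of one possible workaround, not a confirmed fix: constructing the collection without change tracking sidesteps the code path named in the error, at the cost of losing automatic change tracking for the loaded pages.

      // TrackingMode.None: materialize the self-referencing expansion without
      // asking the DataServiceContext to track items as they are added.
      data = new DataServiceCollection<CmsPage>(queryResponse, TrackingMode.None);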

    Read the article

  • What's wrong with my MySQL query?!

    - by Anytime
    This is a query I am running against MySQL from PHP. This is the query line:

      <?php
      $query = "SELECT * FROM node WHERE type = 'student_report' AND uid = '{$uid}' LIMIT 1 ORDER BY created DESC";
      ?>

    I get the following error:

      You have an error in your SQL syntax; check the manual that corresponds to your MySQL
      server version for the right syntax to use near 'ORDER BY created DESC' at line 1
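    The likely fix: in MySQL's SELECT syntax the ORDER BY clause has to come before LIMIT, so swapping the two clauses should clear the error.

      <?php
      // ORDER BY precedes LIMIT in MySQL
      $query = "SELECT * FROM node
                WHERE type = 'student_report' AND uid = '{$uid}'
                ORDER BY created DESC
                LIMIT 1";
      ?>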

    Read the article

  • MySQL error using Rails -- please help

    - by Cypher
    Alright, I am sorry for the noob question, but this has been driving me up a wall, especially because I got it to work yesterday and I can't remember what I did. I am just trying to use MySQL with Rails and a Mongrel server. I set up the server fine and can run Rails applications that don't need MySQL, but when I create a project using (for example)

      rails -d mysql blog

    and then create a simple controller, e.g.

      ruby script/generate controller Test

    and put this code in the controller:

      class TestController < ApplicationController
        def index
          render :text => 'WORK'
        end
      end

    then when I start the server and open localhost:3000/test I get the following error:

      => Booting Mongrel
      => Rails 2.3.5 application starting on http://0.0.0.0:3000
      => Call with -d to detach
      => Ctrl-C to shutdown server
      /!\ FAILSAFE /!\  Mon May 10 20:15:06 -0500 2010
        Status: 500 Internal Server Error
        Can't connect to MySQL server on 'localhost' (10061)
        C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/mysql_adapter.rb:589:in 'real_connect'
        C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/mysql_adapter.rb:589:in 'connect'
        C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/mysql_adapter.rb:203:in 'initialize'
        C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/mysql_adapter.rb:75:in 'new'
        C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/mysql_adapter.rb:75:in 'mysql_connection'
        C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/abstract/connection_pool.rb:223:in 'send'
        C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/abstract/connection_pool.rb:223:in 'new_connection'
        C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/abstract/connection_pool.rb:245:in 'checkout_new_connection'
        C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/abstract/connection_pool.rb:188:in 'checkout'
        C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/abstract/connection_pool.rb:184:in 'loop'
        C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/abstract/connection_pool.rb:184:in 'checkout'
        C:/Ruby/lib/ruby/1.8/monitor.rb:242:in 'synchronize'
        C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/abstract/connection_pool.rb:183:in 'checkout'
        C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/abstract/connection_pool.rb:98:in 'connection'
        C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/abstract/connection_pool.rb:326:in 'retrieve_connection'
        C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/abstract/connection_specification.rb:123:in 'retrieve_connection'
        C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/abstract/connection_specification.rb:115:in 'connection'
        etc...

    In the browser I get a 'We're sorry, but something went wrong' page. Does anyone know what I am doing wrong?
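    Error 10061 on Windows means the TCP connection to localhost:3306 was refused, i.e. nothing is listening there, so the first thing to confirm is that the MySQL service is actually running. Beyond that, a minimal config/database.yml sketch for the development environment; the database name and credentials are assumptions:

      # config/database.yml
      development:
        adapter:  mysql
        database: blog_development
        username: root
        password: your_password
        host:     127.0.0.1
        port:     3306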

    Read the article

  • Delphi - MySQL Best Data aware components to use

    - by Brad
    I need my application to connect to my web server's MySQL database; what is the best option for this? A data-aware component is preferred. I tried Zeos 7, but I keep getting the error "SQL error: Client does not support authentication protocol requested by server; consider upgrading MySQL client" and have not been able to fix it. Thanks -Brad
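    That error usually means an older client library (pre-4.1 password protocol) is talking to a newer MySQL server. A hedged sketch of the common workaround, run on the server with placeholder account details; the cleaner fix is to ship a newer libmysql with the application or use components built against it.

      -- give this one account an old-format password hash that the old client library understands
      SET PASSWORD FOR 'webuser'@'%' = OLD_PASSWORD('secret');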

    Read the article
