Search Results

Search found 10492 results on 420 pages for 'online backups'.


  • Affordable SEO Service is the Best Option

    An affordable SEO service is the key to a prosperous online marketing business. There are many ways of doing internet marketing, but the most popular methods on the rise include Search Engine Optimization (SEO), Pay Per Click (PPC), website lead generation, Google AdSense, online newsletters, and online magazines. SEO services in particular are on the rise.

    Read the article

  • Implement Search Engine Optimization Techniques on Your Website and Gain More Network Traffic

    With the increasing number of internet users, many companies now run their business primarily by targeting these online users. Having an online presence has become mandatory for doing well in business. Winning online customers is not easy, and attracting more of them to your website requires getting a lot of things right. Search Engine Optimization plays a major role in achieving the required traffic for any website.

    Read the article

  • The Lure of Simplicity in IT

    A deceptively simple solution to a business re-engineering problem can beguile companies into selecting a compromise that doesn't actually meet all their needs. Simple is great, but not at the expense of functionality. Some IT solutions are complex because the problem is complex, but they can be made conceptually clearer.

    Read the article

  • The Perils of Running Database Repair

    In a perfect world, everyone has the right backups to recover within their downtime and data-loss service level agreements when accidental data loss or corruption occurs. Unfortunately, we don't live in a perfect world, and many people find they don't have the backups they need when faced with corruption.

    Read the article

  • SQL Server Prefetch and Query Performance

    Prefetching can make a surprising difference to SQL Server query execution times where there is a high incidence of waiting for disk I/O operations, but the benefits come at a cost. Mostly, the Query Optimizer gets it right, but occasionally there are queries that would benefit from tuning.

    Read the article

  • Developing an SSRS report using an SSAS Data Source

    After designing several SSRS reports based on regular relational databases, your boss would now like several new reports to be designed and rolled out to production based on your organization's SSAS OLAP cube. How do you get started with designing a report based on a cube?

    Read the article

  • Tips For a Successful Link Building Strategy

    If you want to become a successful online marketer and make your online marketing campaign succeed, you will need to work on building backlinks to your website. Link building can decide the success or failure of your online marketing campaign.

    Read the article

  • SQL VIEW Basics

    SQL Views are essential for the database developer. However, it is common to see them misused, or neglected. Joe Celko tackles an introduction to the subject, but there is something about the topic that makes it likely that even the experienced developer will find out something new from reading it.

    Read the article

  • SQL Server 2012 Integration Services - Project Deployment

    SQL Server 2012 Integration Services parameters introduce a new way of dealing with package development, deployment, and execution. In order to truly appreciate their relevance, it is necessary to take a look at the new Project Deployment Model.

    Read the article

  • Is RapidSSL WildCard Cert suitable for my eCommerce Web site?

    - by Eian
    We recently launched an online T-shirt shop built on an eCommerce platform, but we have been fielding questions about transaction security: customers want assurance that their confidential information is safe when they shop on the site. A friend of mine uses a RapidSSL Wildcard Certificate from RapidSSLonline.com. To be clear, we don't know much about SSL certificate security, but we have learned that SSL certificates assure online visitors that their digital transactions are safe. Is a RapidSSL Wildcard Certificate the right choice for an eCommerce shop?
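
    Whatever the vendor, you can verify from a shell what a wildcard certificate actually covers once it is installed: a wildcard for *.example.com secures www.example.com, shop.example.com, and so on, though typically not the bare example.com unless the CA adds it as a SAN. A small sketch using OpenSSL (shop.example.com is a placeholder for your own hostname):

    # Show the subject and validity dates of the certificate served on port 443
    # (-servername matters when the server uses SNI)
    echo | openssl s_client -connect shop.example.com:443 -servername shop.example.com 2>/dev/null | openssl x509 -noout -subject -dates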

    Read the article

  • SEO Tools to Help You in Your Business

    SEO and other online strategies are being used more and more by businesses today. As the market develops and becomes more educated, it is increasingly important to stay on top of your game and understand the impact that SEO and other online tactics have on your website. This article discusses some of the things to watch out for in your online activities.

    Read the article

  • SSL and mod_rewrite error

    - by wnoveno
    Hi, I have HTTPS on my site. Pages with rewritten URLs are inaccessible, while direct URLs (folders) work. Here's the .htaccess:

    ## 2009-12-17 16:52 JGC [START]
    ## Enable http compression for this site
    <IfModule mod_deflate.c>
    SetOutputFilter DEFLATE
    ## Directive "DeflateCompressionLevel" not allowed in .htaccess, only valid in server config and virtual hosts
    # DeflateCompressionLevel 9
    # file-types indicated will not be compressed
    SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png)$ no-gzip dont-vary
    SetEnvIfNoCase Request_URI \.(?:swf|flv|pdf)$ no-gzip dont-vary
    SetEnvIfNoCase Request_URI \.(?:exe|t?gz|zip|gz2|sit|rar)$ no-gzip dont-vary
    <IfModule mod_headers.c>
    Header append Vary User-Agent
    </IfModule>
    </IfModule>
    ## 2009-12-17 16:52 JGC [END]

    ## 2010-03-05 16:05 JGC [START]
    #<IfModule mod_alias.c>
    #RedirectMatch 301 ^(/)$ /online-casino-poker-register.html
    #RedirectMatch 301 ^(/en)$ /en/online-casino-poker-register.html
    #RedirectMatch 301 ^(/en/)$ /en/online-casino-poker-register.html
    #RedirectMatch 301 ^(/en\.html)$ /en/online-casino-poker-register.html
    #RedirectMatch 301 ^(/sc)$ /sc/online-casino-poker-register.html
    #RedirectMatch 301 ^(/sc/)$ /sc/online-casino-poker-register.html
    #RedirectMatch 301 ^(/sc\.html)$ /sc/online-casino-poker-register.html
    #RedirectMatch 301 ^(/ch)$ /ch/online-casino-poker-register.html
    #RedirectMatch 301 ^(/ch/)$ /ch/online-casino-poker-register.html
    #RedirectMatch 301 ^(/ch\.html)$ /ch/online-casino-poker-register.html
    #</IfModule>
    ## 2010-03-05 16:05 JGC [END]

    ##
    # @version $Id: htaccess.txt 10492 2008-07-02 06:38:28Z ircmaxell $
    # @package Joomla
    # @copyright Copyright (C) 2005 - 2008 Open Source Matters. All rights reserved.
    # @license http://www.gnu.org/copyleft/gpl.html GNU/GPL
    # Joomla! is Free Software
    ##

    #####################################################
    # READ THIS COMPLETELY IF YOU CHOOSE TO USE THIS FILE
    #
    # The line just below this section: 'Options +FollowSymLinks' may cause problems
    # with some server configurations. It is required for use of mod_rewrite, but may already
    # be set by your server administrator in a way that disallows changing it in
    # your .htaccess file. If using it causes your server to error out, comment it out (add # to
    # beginning of line), reload your site in your browser and test your sef url's. If they work,
    # it has been set by your server administrator and you do not need it set here.
    #####################################################

    ## Can be commented out if causes errors, see notes above.
    Options +FollowSymLinks

    # mod_rewrite in use
    RewriteEngine On
    #RewriteLog "/var/www/html/dafa888/rewrite.log"
    #RewriteLogLevel 3

    RewriteCond %{HTTP_COOKIE} !jfcookie\[lang\] [NC]
    RewriteCond %{HTTP:Accept-Language} ^zh-cn [NC]
    RewriteRule ^$ /sc/ [L,R=301]
    RewriteCond %{HTTP_COOKIE} !jfcookie\[lang\] [NC]
    RewriteCond %{HTTP:Accept-Language} ^zh-tw [NC]
    RewriteRule ^$ /ch/ [L,R=301]
    #RewriteCond %{HTTP_COOKIE} !jfcookie[lang] [NC]
    #RewriteCond %{HTTP_COOKIE} jfcookie\[lang\] [NC]
    #RewriteCond %{HTTP_COOKIE} jfcookie\[lang\]=([^;]+) [NC]
    #RewriteRule ^(.*)$ /%1/$1 [NC,QSA]

    ########## Begin - Rewrite rules to block out some common exploits
    ## If you experience problems on your site block out the operations listed below
    ## This attempts to block the most common type of exploit `attempts` to Joomla!
    #
    # Block out any script trying to set a mosConfig value through the URL
    RewriteCond %{QUERY_STRING} mosConfig_[a-zA-Z_]{1,21}(=|\%3D) [OR]
    # Block out any script trying to base64_encode crap to send via URL
    RewriteCond %{QUERY_STRING} base64_encode.*\(.*\) [OR]
    # Block out any script that includes a <script> tag in URL
    RewriteCond %{QUERY_STRING} (\<|%3C).*script.*(\>|%3E) [NC,OR]
    # Block out any script trying to set a PHP GLOBALS variable via URL
    RewriteCond %{QUERY_STRING} GLOBALS(=|\[|\%[0-9A-Z]{0,2}) [OR]
    # Block out any script trying to modify a _REQUEST variable via URL
    RewriteCond %{QUERY_STRING} _REQUEST(=|\[|\%[0-9A-Z]{0,2})
    # Send all blocked request to homepage with 403 Forbidden error!
    RewriteRule ^(.*)$ index.php [F,L]
    #
    ########## End - Rewrite rules to block out some common exploits

    # Uncomment following line if your webserver's URL
    # is not directly related to physical file paths.
    # Update Your Joomla! Directory (just / for root)
    RewriteBase /
    #RewriteCond %{HTTP_HOST} ^(.*)$ [NC]
    #RewriteRule ^(.*)$ http://www.%1/$1 [R=301]

    ########## Begin - Joomla! core SEF Section
    #
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteCond %{REQUEST_URI} !^/index.php
    RewriteCond %{REQUEST_URI} (/|\.php|\.html|\.htm|\.feed|\.pdf|\.raw|/[^.]*)$ [NC]
    RewriteRule (.*) index.php
    RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization},L]
    #
    ########## End - Joomla! core SEF Section
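
    One likely culprit (an assumption, since the :443 virtual host isn't shown above): these rules live in a per-directory .htaccess, so if the SSL virtual host uses a different DocumentRoot or sets AllowOverride None, mod_rewrite never fires for HTTPS requests and only direct folder URLs work. A quick check, sketched with typical Red Hat-style paths:

    # List all configured vhosts and which file defines the :443 host
    apachectl -S

    # See whether the SSL vhost allows .htaccess overrides and where its DocumentRoot points
    # (/etc/httpd/conf.d/ssl.conf is the usual location; adjust as needed)
    grep -nE "DocumentRoot|AllowOverride" /etc/httpd/conf.d/ssl.conf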

    Read the article

  • Script launching 3 copies of rsync

    - by organicveggie
    I have a simple script that uses rsync to copy a Postgres database to a backup location for use with Point In Time Recovery. The script is run every 2 hours via a cron job for the postgres user. For some strange reason, I can see three copies of rsync running in the process list. Any ideas why this might be the case? Here's the cron entry:

    # crontab -u postgres -l
    PATH=/bin:/usr/bin:/usr/local/bin
    0 */2 * * * /var/lib/pgsql/9.0/pitr_backup.sh

    And here's the ps list, which shows two copies of rsync running and one sleeping:

    # ps ax | grep rsync
    9102 ? R 2:06 rsync -avW /var/lib/pgsql/9.0/data/ /var/lib/pgsql/9.0/backups/pitr_archives/20110629100001/ --exclude pg_xlog --exclude recovery.conf --exclude recovery.done --exclude pg_log
    9103 ? S 0:00 rsync -avW /var/lib/pgsql/9.0/data/ /var/lib/pgsql/9.0/backups/pitr_archives/20110629100001/ --exclude pg_xlog --exclude recovery.conf --exclude recovery.done --exclude pg_log
    9104 ? R 2:51 rsync -avW /var/lib/pgsql/9.0/data/ /var/lib/pgsql/9.0/backups/pitr_archives/20110629100001/ --exclude pg_xlog --exclude recovery.conf --exclude recovery.done --exclude pg_log

    And here's the uber-simple script that seems to be the cause of the problem:

    #!/bin/sh
    LOG="/var/log/pgsql-pitr-backup.log"
    base_backup_dir="/var/lib/pgsql/9.0/backups"
    wal_archive_dir="$base_backup_dir/wal_archives"
    pitr_archive_dir="$base_backup_dir/pitr_archives"
    timestamp=`date +%Y%m%d%H%M%S`
    backup_dir="$pitr_archive_dir/$timestamp"
    mkdir -p $backup_dir
    echo `date` >> $LOG
    /usr/bin/psql -U postgres -c "SELECT pg_start_backup('$backup_dir');"
    rsync -avW /var/lib/pgsql/9.0/data/ $backup_dir/ --exclude pg_xlog --exclude recovery.conf --exclude recovery.done --exclude pg_log
    /usr/bin/psql -U postgres -c "SELECT pg_stop_backup();"
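
    Worth noting before assuming a bug: a single local rsync invocation normally forks into multiple cooperating processes (sender, receiver, and generator), so three rsync entries in ps with identical arguments and the same timestamped destination can all come from one run. If the real worry is overlapping cron runs piling up, a hedged sketch of a lock wrapper that cron could call instead of the script directly (the lock path is an arbitrary choice):

    #!/bin/sh
    # Run the PITR backup under an exclusive lock so overlapping cron
    # invocations exit immediately instead of stacking up. -n = fail fast if locked.
    LOCK=/var/lock/pgsql-pitr-backup.lock
    exec flock -n "$LOCK" /var/lib/pgsql/9.0/pitr_backup.sh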

    Read the article

  • DRBD Replication failure

    - by user62513
    A couple of weeks ago I set up a two-node CRM cluster, with MySQL over DRBD as one of the managed resources. Today I restarted both nodes for maintenance, but now they can't connect to each other anymore. DRBD fell out of sync, and I followed this guide to get it reconnected, but it only completes successfully on one node. And this strange thing happens:

    If I crm node standby both nodes and run crm node online node0 before crm node online node1, all the CRM resources start successfully, but the DRBD partitions are still in StandAlone state.

    If I run crm node online node1 before crm node online node0, the DRBD resource fails to start, which in turn keeps MySQL from starting.

    If I standby both nodes and call crm node online node0 alone, it times out and prints this error:

    Error setting standby=off (section=nodes, set=<null>): Remote node did not respond
    Error performing operation: Remote node did not respond

    Is there anything I'm doing wrong here? An alternative would be plain MySQL replication, but I'm not sure how to promote a slave to master when the master database is not available.
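
    Two nodes that both come up StandAlone after a simultaneous restart is the classic DRBD split-brain signature. Assuming that's what happened here, and assuming the resource is named r0 (an assumption; check /etc/drbd.conf or /etc/drbd.d/ for the real name), manual recovery looks roughly like this sketch. The exact syntax varies slightly between DRBD versions; 8.3 uses drbdadm -- --discard-my-data connect r0 instead.

    # On the node whose recent changes you are willing to discard (the "victim"):
    drbdadm secondary r0
    drbdadm connect --discard-my-data r0

    # On the node whose data should win (the "survivor"):
    drbdadm connect r0

    # Watch the resync progress:
    cat /proc/drbd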

    Read the article

  • Missing whole disk device in OpenSolaris

    - by Jeff Mc
    I have begun experimenting with Solaris and ZFS as a NAS. All was going very smoothly until I had a drive failure. When I replaced the drive, I no longer have a device file mapped to the whole disk: /dev/dsk/c7t3d0 does not exist, but c7t2d0 and c7t4d0 both do. Also the sd@3,0:wd file under the /devices/ tree is non-existent. Do I have to prepare/partition the disk somehow to cause the whole disk device to exist? Here are a few outputs that might be useful.

    jeffmc@ats-ds2:/dev/dsk$ zpool status
      pool: datapool
     state: DEGRADED
    status: One or more devices could not be opened. Sufficient replicas exist
            for the pool to continue functioning in a degraded state.
    action: Attach the missing device and online it using 'zpool online'.
       see: http://www.sun.com/msg/ZFS-8000-2Q
     scrub: none requested
    config:

            NAME        STATE     READ WRITE CKSUM
            datapool    DEGRADED     0     0     0
              mirror-0  DEGRADED     0     0     0
                c7t2d0  ONLINE       0     0     0
                c7t3d0  UNAVAIL      0     0     0  cannot open
              mirror-1  ONLINE       0     0     0
                c7t4d0  ONLINE       0     0     0
                c7t5d0  ONLINE       0     0     0

    jeffmc@ats-ds2:/dev/dsk$ zpool replace datapool c7t3d0
    cannot open 'c7t3d0': no such device in /dev/dsk
    must be a full path or shorthand device name

    jeffmc@ats-ds2:/dev/dsk$ sudo format
    Searching for disks...done

    AVAILABLE DISK SELECTIONS:
           0. c7t0d0 /pci@0,0/pci8086,3599@6/pci8086,330@0/pci1014,2cc@7,1/sd@0,0
           1. c7t1d0 /pci@0,0/pci8086,3599@6/pci8086,330@0/pci1014,2cc@7,1/sd@1,0
           2. c7t2d0 /pci@0,0/pci8086,3599@6/pci8086,330@0/pci1014,2cc@7,1/sd@2,0
           3. c7t3d0 /pci@0,0/pci8086,3599@6/pci8086,330@0/pci1014,2cc@7,1/sd@3,0
           4. c7t4d0 /pci@0,0/pci8086,3599@6/pci8086,330@0/pci1014,2cc@7,1/sd@4,0
           5. c7t5d0 /pci@0,0/pci8086,3599@6/pci8086,330@0/pci1014,2cc@7,1/sd@5,0
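
    For what it's worth, a sketch of the usual recovery path on (Open)Solaris when a swapped disk leaves stale device links: devfsadm rebuilds the /dev and /devices entries, after which zpool replace can usually find the disk. If the controller never registered the hot-swap, a cfgadm configure step may be needed first; treat the commands below as a starting point, not a guaranteed fix.

    # Clean up dangling /dev links and create nodes for newly attached devices
    sudo devfsadm -Cv

    # If /dev/dsk/c7t3d0 now exists, re-attach it to the mirror
    sudo zpool replace datapool c7t3d0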

    Read the article

  • zfs pool error, how to determine which drive failed in the past

    - by Kendrick
    I had been copying data off my pool so that I could rebuild it with a different version, moving away from Solaris 11 to something portable between FreeBSD/OpenIndiana etc. It was copying at 20 MB/s the other day, which is about all my desktop drive can handle writing from the network. Suddenly last night it dropped to 1.4 MB/s. I ran zpool status today and got this:

      pool: store
     state: ONLINE
    status: One or more devices has experienced an unrecoverable error. An
            attempt was made to correct the error. Applications are unaffected.
    action: Determine if the device needs to be replaced, and clear the errors
            using 'zpool clear' or replace the device with 'zpool replace'.
       see: http://www.sun.com/msg/ZFS-8000-9P
      scan: none requested
    config:

            NAME          STATE     READ WRITE CKSUM
            store         ONLINE       0     0     0
              raidz1-0    ONLINE       0     0     0
                c8t3d0p0  ONLINE       0     0     2
                c8t4d0p0  ONLINE       0     0    10
                c8t2d0p0  ONLINE       0     0     0

    It is currently a 3 x 1 TB drive array. What tools would best be used to determine what the error was and which drive is failing? Per the admin doc:

    The second section of the configuration output displays error statistics. These errors are divided into three categories:
    READ – I/O errors occurred while issuing a read request.
    WRITE – I/O errors occurred while issuing a write request.
    CKSUM – Checksum errors. The device returned corrupted data as the result of a read request.

    It says low counts could be anything from a power flux to a disk event, but gives no suggestions as to which tools to use to check and make that determination.
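
    A hedged sketch of the usual first diagnostic steps on Solaris-family systems: a scrub forces a read of every block so a marginal drive reveals itself, and iostat -En shows the kernel's per-device error counters. smartctl is only available if the smartmontools package is installed, and the device path below is just an example.

    # Force a read of every block; new errors will show up in zpool status
    zpool scrub store
    zpool status -v store

    # Per-device soft/hard/transport error counters kept by the kernel
    iostat -En

    # Drive self-reported SMART health, if smartmontools is installed
    smartctl -a /dev/rdsk/c8t4d0p0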

    Read the article

  • No, iCloud Isn’t Backing Them All Up: How to Manage Photos on Your iPhone or iPad

    - by Chris Hoffman
    Are the photos you take with your iPhone or iPad backed up in case you lose your device? If you’re just relying on iCloud to manage your important memories, your photos may not be backed up at all.

    Apple’s iCloud has a photo-syncing feature in the form of “Photo Stream,” but Photo Stream doesn’t actually perform any long-term backups of your photos.

    iCloud’s Photo Backup Limitations

    Assuming you’ve set up iCloud on your iPhone or iPad, your device is using a feature called “Photo Stream” to automatically upload the photos you take to your iCloud storage and sync them across your devices. Unfortunately, there are some big limitations here.

    1000 Photos: Photo Stream only backs up the latest 1000 photos. Do you have 1500 photos in your Camera Roll folder on your phone? If so, only the latest 1000 photos are stored in your iCloud account online. If you don’t have those photos backed up elsewhere, you’ll lose them when you lose your phone. If you have 1000 photos and take one more, the oldest photo will be removed from your iCloud Photo Stream.

    30 Days: Apple also states that photos in your Photo Stream will be automatically deleted after 30 days “to give your devices plenty of time to connect and download them.” Some people report photos aren’t deleted after 30 days, but it’s clear you shouldn’t rely on iCloud for more than 30 days of storage.

    iCloud Storage Limits: Apple only gives you 5 GB of iCloud storage space for free, and this is shared between backups, documents, and all other iCloud data. This 5 GB can fill up pretty quickly. If your iCloud storage is full and you haven’t purchased any more storage from Apple, your photos aren’t being backed up.

    Videos Aren’t Included: Photo Stream doesn’t include videos, so any videos you take aren’t automatically backed up.

    It’s clear that iCloud’s Photo Stream isn’t designed as a long-term way to store your photos, just a convenient way to access recent photos on all your devices before you back them up for real.

    iCloud’s Photo Stream is Designed for Desktop Backups

    If you have a Mac, you can launch iPhoto and enable the Automatic Import option under Photo Stream in its preferences pane. Assuming your Mac is on and connected to the Internet, iPhoto will automatically download photos from your photo stream and make local backups of them on your hard drive. You’ll then have to back up your photos manually so you don’t lose them if your Mac’s hard drive ever fails.

    If you have a Windows PC, you can install the iCloud Control Panel, which will create a Photo Stream folder on your PC. Your photos will be automatically downloaded to this folder and stored in it. You’ll want to back up your photos so you don’t lose them if your PC’s hard drive ever fails.

    Photo Stream is clearly designed to be used along with a desktop application. Photo Stream temporarily backs up your photos to iCloud so iPhoto or iCloud Control Panel can download them to your Mac or PC and make a local backup before they’re deleted. You could also use iTunes to sync your photos from your device to your PC or Mac, but we don’t really recommend it — you should never have to use iTunes.

    How to Actually Back Up All Your Photos Online

    So Photo Stream is actually pretty inconvenient — or, at least, it’s just a way to temporarily sync photos between your devices without storing them long-term. But what if you actually want to automatically back up your photos online without them being deleted automatically? The solution here is a third-party app that does this for you, offering automatic photo uploads with long-term storage. There are several good services with apps in the App Store:

    Dropbox: Dropbox’s Camera Upload feature allows you to automatically upload the photos — and videos — you take to your Dropbox account. They’ll be easily accessible anywhere there’s a Dropbox app and you can get much more free Dropbox storage than you can iCloud storage. Dropbox will never automatically delete your old photos.

    Google+: Google+ offers photo and video backups with its Auto Upload feature, too. Photos will be stored in your Google+ Photos — formerly Picasa Web Albums — and will be marked as private by default so no one else can view them. Full-size photos will count against your free 15 GB of Google account storage space, but you can also choose to upload an unlimited amount of photos at a smaller resolution.

    Flickr: The Flickr app is no longer a mess. Flickr offers an Auto Upload feature for uploading full-size photos you take and free Flickr accounts offer a massive 1 TB of storage for you to store your photos. The massive amount of free storage alone makes Flickr worth a look.

    Use any of these services and you’ll get an online, automatic photo backup solution you can rely on. You’ll get a good chunk of free space, your photos will never be automatically deleted, and you can easily access them from any device. You won’t have to worry about storing local copies of your photos and backing them up manually.

    Apple should fix this mess and offer a better solution for long-term photo backup, especially considering the limitations aren’t immediately obvious to users. Until they do, third-party apps are ready to step in and take their place.

    You can also automatically back up your photos to the web on Android with Google+’s Auto Upload or Dropbox’s Camera Upload.

    Image Credit: Simon Yeo on Flickr

    Read the article

  • Tomcat with virtual hosts - 404

    - by Thardas
    I have a CentOS 5.2 server set up with Apache 2.2.3 and Tomcat 5.5.27. The server hosts multiple virtual hosts connected to multiple Tomcats. For instance, we have one Tomcat for development and testing and one for production: project.demo.us.com points to the dev Tomcat and project.us.com points to the production Tomcat. Here's the virtual host's configuration:

    <VirtualHost *:80>
        ServerName project.demo.us.com
        CustomLog logs/project.demo.us.com/access_log combined env=!VLOG
        ErrorLog logs/project.demo.us.com/error_log
        DocumentRoot /var/www/vhosts/project.demo.us.com
        <Directory /var/www/vhosts/project.demo.us.com>
            Allow from all
            AllowOverride All
            Options -Indexes FollowSymLinks
        </Directory>
        ########## ########## ##########
        JkMount /project/* online
    </VirtualHost>

    The JkMount line says we use the "online" worker, and our workers.properties contains:

    worker.list=..., online, ...
    worker.online.port=7703
    worker.online.host=localhost
    worker.online.type=ajp13
    worker.online.lbfactor=1

    And Tomcat's conf/server.xml contains:

    <Connector port="7703" enableLookups="false" redirectPort="8443" protocol="AJP/1.3" URIEncoding="UTF-8" maxThreads="80" minSpareThreads="10" maxSpareThreads="15"/>

    I'm not sure what redirectPort is, but I tried to telnet to that port and there's no one answering, so it shouldn't matter? Tomcat's webapps directory contains project.war, and the server automatically deployed it under the project directory, which contains index.jsp and hello.html. The latter is for static debugging purposes. Now when I try to access http://project.demo.us.com/project/index.jsp, I get Tomcat's "HTTP Status 404 - The requested resource () is not available." The same thing happens with hello.html, so it's not working with static content either. Apache's access_log contains:

    88.112.152.31 - - [10/Aug/2009:12:15:14 +0300] "GET /demo/index.jsp HTTP/1.1" 404 952 "-" "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; en-US; rv:1.9.1.2) Gecko/20090729 Firefox/3.5.2"

    I couldn't find any mention of the request in Tomcat's logs. If I shut down this specific Tomcat, I no longer get Tomcat's 404 but Apache's 503 Service Temporarily Unavailable, so I should be configuring the correct Tomcat. Is there something obvious I'm missing? Is there any place where I could find out what path Tomcat uses to look for requested files?
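
    A debugging sketch for separating mod_jk problems from deployment problems: talk to Tomcat's own HTTP connector directly and watch its logs while you do. The 8080 port and log path below are Tomcat defaults and an assumption for this setup; check this instance's server.xml and logs directory.

    # Does Tomcat itself serve the app? (bypasses Apache and mod_jk entirely)
    curl -i http://localhost:8080/project/hello.html

    # Did the request reach this Tomcat at all?
    tail -f /usr/local/tomcat/logs/catalina.out

    Also note that the access_log entry shows GET /demo/index.jsp while the browser asked for /project/index.jsp. That may just be inconsistent redaction in the question, but if some other rewrite or alias rule really is translating the path, the JkMount /project/* pattern would never match it.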

    Read the article

  • What's wrong with this function?

    - by ct2k7
    Hello, I'm using this function to determine whether my application should be online or offline:

    function online() {
        if ($online == "0") {
            if ($_SESSION['exp_user']['userlevel'] != "1") {
                include("error/offline.php");
                exit();
            }
        }
    }

    However, with the value set to 0 in the database, and $online equal to '0', why is error/offline.php not included for those whose user level is not 1? Thanks :)

    Read the article

  • Read about Interface-Based Programming in C#

    - by Editor
    Learn to program using interfaces by reading C# Online.NET articles like Interfaces and Abstract Classes. And, here is an excerpt from a VSLive! article on Interface-Based Programming in C#. "Interfaces help define a contract, or agreement, between your application and other objects. This agreement indicates what sort of methods, properties and events are exposed by an object. [...]

    Read the article

  • Mysqldump causes "Too many connections"

    - by vbachev
    A scheduled backup using mysqldump on one of our databases is causing "Too many connections" errors. The database is around 500 MB, with a mix of InnoDB and MyISAM tables, and the "Too many connections" state lasts for about 2-3 minutes. We understand that mysqldump locks the tables, causing all other queries and connections to pile up and jam the MySQL server. We need frequent backups, and we cannot afford server downtime or putting websites into maintenance mode while backing up. Our websites are global and traffic is high around the clock, so it's hard to find a quiet moment for backups. How can we avoid downtime during backups? Is there a way to use mysqldump so that it does not lock all tables at the same time? Is there an alternative to backing up with mysqldump?
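
    For the InnoDB tables there is a standard answer: mysqldump's --single-transaction option takes a consistent snapshot inside a transaction instead of locking tables. A minimal sketch (the user and database names are placeholders), with the caveat that MyISAM tables are not transactional, so they still get locked unless they are converted to InnoDB or dumped separately:

    # Consistent, mostly lock-free dump of InnoDB tables.
    # --single-transaction: snapshot read instead of table locks
    # --quick: stream rows instead of buffering whole tables in memory
    mysqldump --single-transaction --quick -u backup_user -p mydb > /backups/mydb-$(date +%F).sql

    The other common approach is to dump from a replication slave, which takes the backup load off the master entirely.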

    Read the article
