Search Results

Search found 14789 results on 592 pages for 'pro backup'.


  • rsync generates far more traffic than expected

    - by user109459
    I use rsync to back up one of my servers, which holds about 4GB of files. When I transfer these files, the traffic isn't the expected 4GB; it is much higher, about 60GB. I checked the traffic on my server, on the backup server and on the router, and all three report roughly 60GB, yet at the end rsync says that it only transferred 4GB. Another problem is that I can't debug it, because the issue occurs randomly.
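    A minimal sketch of how the transfer could be instrumented to see where the extra traffic goes, with placeholder paths and host name; rsync's --stats output separates literal from matched data, and a log file captures the runs where the blow-up actually happens:

        #!/bin/sh
        # Placeholder source and destination, for illustration only.
        SRC=/var/www/
        DEST=backup@backuphost:/backups/www/

        # --stats prints total bytes sent/received and literal vs. matched data,
        # --itemize-changes shows why each file was (re)transferred,
        # --log-file keeps a record so an intermittent blow-up can be inspected later.
        rsync -az --stats --itemize-changes \
              --log-file=/var/log/rsync-backup.log \
              "$SRC" "$DEST"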

    Read the article

  • Nightly backups (and maybe other tasks) causing server alerts

    - by J. Pablo Fernández
    I have two independent alert notification systems for my servers. The server is a virtual machine on Linode, and one set of alerts comes from Linode; the other monitoring system we use is New Relic. Both watch IO utilization, and every night I get alerts from both of them because the server is using too much IO. I run quite a few tasks in the middle of the night, but the one I have confirmed can cause the IO warnings is the backup, which is done by s3cmd sync. I tried ionice, but it still generates the warnings. Getting warnings every night reduces the efficacy of warnings when they happen for real. For Linode I could raise the level at which a warning is issued, but setting it too high might make the alert useless. What would be the proper solution for this?
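    A minimal sketch of one way to tame the nightly job, assuming the backup is a plain s3cmd sync invocation from cron (bucket and paths are placeholders); ionice puts the process in the idle I/O class and nice lowers its CPU priority, though on a virtual machine the idle class may not be enough to silence the alerts on its own:

        #!/bin/sh
        # Placeholder bucket and path, for illustration only.
        # ionice -c3 = idle I/O class (only runs when the disk is otherwise idle),
        # nice -n 19 = lowest CPU priority.
        ionice -c3 nice -n 19 s3cmd sync --delete-removed \
            /var/backups/ s3://example-backup-bucket/nightly/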

    Read the article

  • Should the virtualization host be allowed to run any service?

    - by Giordano
    I recently set up a virtualization server for the small company I'm running. This server runs a few virtual machines that are used for development, testing, etc. My business partner works from a remote location, so I also installed a VPN server on the virtualization host to make it possible for him to reach the company services safely. Moreover, again on the virtualization host, I installed Bacula to perform the backup of the data. Is it advisable/good practice to do so, or should I create one more virtual machine for backups and VPN? Is it a bad idea to run these services on the host itself? If yes, why? Thanks in advance!

    Read the article

  • Back up Linux environment

    - by joesavage
    I'm currently in the process of installing a bunch of stuff I need and getting my recently purchased Linode set up. Being a Linux newbie, I'm doing pretty well, but one small mistake could screw everything up for me. I currently have apache2 and some other things successfully installed and configured the way I want them, and DO NOT want that ruined by some newbie mistake. What is the easiest way to back up the actual environment itself, so that I can restore the backed-up environment (with apache2 and everything fully working) if I mess up?
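    A crude but common sketch for a Debian/Ubuntu box such as a Linode, assuming root shell access (file names and the destination host are illustrative): record the installed package list and archive /etc, so a broken environment can be rebuilt or compared against a known-good state:

        #!/bin/sh
        STAMP=$(date +%Y%m%d)

        # Installed package list; can be replayed later with
        #   dpkg --set-selections < packages.list && apt-get dselect-upgrade
        dpkg --get-selections > /root/packages-$STAMP.list

        # Archive configuration (Apache vhosts, PHP settings, etc.) plus the package list.
        tar -czf /root/etc-backup-$STAMP.tar.gz /etc /root/packages-$STAMP.list

        # Copy the archive off the machine (placeholder destination).
        scp /root/etc-backup-$STAMP.tar.gz user@othermachine:/backups/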

    Read the article

  • tar gzip slowing down server

    - by Josir
    I have a backup script that: compresses some files, generates an MD5, and copies the compressed file to another server; the other server then finishes by comparing MD5s (to find copy errors). Here is the core of the script:

        nice -n 15 tar -czvf $BKP $PATH_BKP/*.* \
          | xargs -I '{}' sh -c "test -f '{}' && md5sum '{}'" \
          | tee $MD5
        scp -l 80000 $BKP $SCP_BKP
        scp $MD5 $SCP_BKP

    This routine drives the CPU to 90% during the gzip step, slowing down the production server. I tried adding nice -n 15, but the server still bogs down. I have already read [1], but the conversation didn't help me. What is the best approach to solve this issue? I am open to new architectures/solutions :)
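    A sketch of two common mitigations, not verified against this exact setup: drop the compression level (gzip's default level is where most of the CPU goes) and put the whole job in the idle I/O class. Note that this variant checksums the finished archive rather than each input file, so the receiving side would compare a single MD5:

        #!/bin/sh
        # Variable names follow the original script; gzip -1 trades compression
        # ratio for far less CPU, ionice -c3 only uses otherwise-idle disk time.
        ionice -c3 nice -n 19 tar -cvf - "$PATH_BKP"/*.* | gzip -1 > "$BKP"
        md5sum "$BKP" > "$MD5"
        scp -l 80000 "$BKP" "$SCP_BKP"
        scp "$MD5" "$SCP_BKP"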

    Read the article

  • Setting what-opens-what once and for all (Backing up File Associations)

    - by ldigas
    Every time I switch machines (as in, get a new one, or reinstall an OS, or something like that) my precious file associations get lost, and the next six months pass slowly until I have set them all up right again. Is there a program that allows me to: set all the extensions I would like to open with, let's say, Vim, without setting each one of them individually (something like: Vim opens: ... list of extensions ...); and/or a program which lets me back up my current settings, so that when I copy them to a new machine I only have to adjust the paths where I put the applications in question, and it does the rest (again, associating each program with all the extensions it opened before).

    Read the article

  • Why is Ubuntu One slow to sync in 11.10, whether for backups or sub-folder contents?

    - by pst007x
    I have been trying to sync my Documents folder of 1.4GB; it still hasn't worked, and it has been syncing for a month. The top level syncs (files and folders in the Documents folder), but the contents of sub-folders just hang. (I gave up and stopped syncing this folder.) However, I have tried using the backup facility in 11.10 to back up to Ubuntu One, after upgrading my HDD space there. It has been going for roughly 24 hours now and has only backed up what looks like a couple of percent. (By the way, what an excellent idea to back up to Ubuntu One, if only we could get it to actually work! :-o) The odd thing is that I can sync to Dropbox within hours rather than months. This is bad, and it has been an issue since Ubuntu One's release. I have reported this problem, and there were promises it would be fixed in later releases, but it hasn't been. Canonical cannot help either. I posted on several blogs; a lot of people have the same problem but no fixes. So do I use Dropbox or another service until it is sorted? Since Ubuntu does not seem to see this as an issue, I think a fix will be a long time coming. (However, I love the potential of Ubuntu One and its integration with the OS.) Yes, my internet speeds are fine, etc. :-) No firewall (sudo ufw status: STATUS: INACTIVE), no proxy, etc. NB: I have raised this as a separate question from others posted here because my question relates to Ubuntu 11.10 (though I have commented elsewhere for help), and it also relates to deja-dup backup to Ubuntu One. Thanks

    Read the article

  • trying to back up a mysql database using php

    - by user225269
    I got this code from this site: http://www.php-mysql-tutorial.com/wikis/mysql-tutorials/using-php-to-backup-mysql-databases.aspx But I'm just a beginner, so I don't know what config.php and opendb.php are supposed to be. Do I have to create those two files for this code to work? If so, how do I create them? That isn't explained on the site.

        <?php
        include 'config.php';
        include 'opendb.php';
        $tableName = 'mypet';
        $backupFile = 'backup/mypet.sql';
        $query = "SELECT * INTO OUTFILE '$backupFile' FROM $tableName";
        $result = mysql_query($query);
        include 'closedb.php';
        ?>

    Can I just include these lines at the top of the code, so that I no longer need the include 'opendb.php'?

        $con = mysql_connect("localhost","root","");
        if (!$con) {
            die('Could not connect: ' . mysql_error());
        }
        mysql_select_db("Hospital", $con);
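    Presumably config.php and opendb.php are just small helper files that hold the connection settings and call mysql_connect()/mysql_select_db(), which is what the lines quoted above do. As a side note, a common alternative sketch is to skip SELECT ... INTO OUTFILE and call mysqldump from the shell instead (credentials are placeholders; the database and table names are taken from the snippet above):

        #!/bin/sh
        # Dump one table to a .sql file; the file can later be replayed with
        #   mysql -u root -p Hospital < backup/mypet.sql
        mysqldump -u root -p Hospital mypet > backup/mypet.sql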

    Read the article

  • New ways for backup, recovery and restore of Essbase Block Storage databases – part 2 by Bernhard Kinkel

    - by Alexandra Georgescu
    After discussing new options for general Essbase backup and restore in the first part of this article, this second part deals with the also rather new feature of Transaction Logging and Replay, which was released in version 11.1 and enhances the existing restore options. Tip: Transaction logging and replay cannot be used for aggregate storage databases. Please refer to the Oracle Hyperion Enterprise Performance Management System Backup and Recovery Guide (rel. 11.1.2.1).

    Even if backups are done on a regular, frequent basis, subsequent data entries, loads or calculations are not reflected in a restored database. Activating Transaction Logging fills that gap and gives you an option to capture these post-backup transactions for later replay. (A table in the original article lists the transactions that can be logged when Transaction Logging is enabled.)

    To activate it, add corresponding statements to the Essbase.cfg file using the TRANSACTIONLOGLOCATION command. The complete syntax reads:

        TRANSACTIONLOGLOCATION [appname [dbname]] LOGLOCATION NATIVE ENABLE | DISABLE

    Here appname and dbname are optional parameters that, in combination with ENABLE or DISABLE, let you switch Transaction Logging on for certain applications or databases or exclude them from being logged. If only an appname is specified, the setting applies to all databases in that application; if neither appname nor dbname is defined, all applications and databases are covered. LOGLOCATION specifies the directory to which the log is written, e.g. D:\temp\trlogs; this directory must already exist or must be created before log information can be written to it. NATIVE is a reserved keyword that should not be changed. The following example first enables logging at a general level for all databases in the application Sample, then disables it at a more granular level for only the Basic database in that application, hence excluding it from being logged:

        TRANSACTIONLOGLOCATION Sample Hyperion/trlog/Sample NATIVE ENABLE
        TRANSACTIONLOGLOCATION Sample Basic Hyperion/trlog/Sample NATIVE DISABLE

    Tip: After applying changes to the configuration file you must restart the Essbase server in order to initialize the settings.

    Replaying logged transactions after restoring a database, should that be required, can be done only by administrators. The following options are available: in Administration Services, select Replay Transactions on the database's right-click menu. Here you can choose to replay transactions logged after the last replay request was originally executed or after the time of the last restored backup (whichever occurred later), or transactions logged after a specified time. Alternatively, you can replay transactions selectively based on a range of sequence IDs, which can be viewed using Display Transactions on the database's right-click menu. These sequence IDs are assigned to each logged transaction and indicate the order in which the transactions were performed. This helps to ensure the integrity of the restored data after a replay, as transactions are replayed in the same order in which they were originally performed: for example, a calculation originally run after a data load cannot be replayed before the data load itself has been replayed. After a transaction is replayed, you can replay only transactions with a greater sequence ID. For example, replaying the transaction with sequence ID 4 includes all preceding transactions, and afterwards you can only replay transactions with a sequence ID of 5 or greater. Tip: After restoring a database from a backup you should always completely replay all logged transactions that were executed after the backup, before executing new transactions.

    It is not only the transaction information itself that needs to be logged and stored in the specified directory as described above. During transaction logging, Essbase also creates archive copies of data load and rules files in the following default directory:

        ARBORPATH/app/appname/dbname/Replay

    These files are then used during the replay of a logged transaction. By default Essbase archives only data load and rules files for client data loads, but to specify the type of data to archive when logging transactions you can use the command TRANSACTIONLOGDATALOADARCHIVE as an additional entry in the Essbase.cfg file. The syntax for the statement is:

        TRANSACTIONLOGDATALOADARCHIVE [appname [dbname]] [OPTION]

    The [appname [dbname]] argument behaves the same way as for TRANSACTIONLOGLOCATION; the valid values for the OPTION argument are listed in a table in the original article. Choose the setting according to which file copies should be archived, considering from which location transactions usually take place. Selecting the NONE option prevents Essbase from saving the respective files, and the data load then cannot be replayed; in that case you must first manually load the data before you can replay the transactions. Tip: If you use server or SQL data and the data and rules files are not archived in the Replay directory (for example, you did not use the SERVER or SERVER_CLIENT option), Essbase replays the data that is actually in the data source at the moment of the replay, which may or may not be the data that was originally loaded.

    You can find more detailed information in the following documents: the Oracle Hyperion Enterprise Performance Management System Backup and Recovery Guide (rel. 11.1.2.1), the Oracle Essbase Online Documentation (rel. 11.1.2.1), and the Enterprise Performance Management System Documentation (including previous releases), or on the Oracle Technology Network. If you are also interested in other new features and smart enhancements in Essbase or Hyperion Planning, stay tuned for coming articles or check our training courses and web presentations. You can find general information about offerings for the Essbase and Planning curriculum or other Oracle-Hyperion products here (please make sure to select your country/region at the top of that page), or in the OU Learning Paths section, where Planning, Essbase and other Hyperion products can be found under the Fusion Middleware heading (again, please select the right country/region). Or drop me a note directly: [email protected].

    About the Author: Bernhard Kinkel started working for Hyperion Solutions as a Presales Consultant and Consultant in 1998 and moved to Hyperion Education Services in 1999. He joined Oracle University in 2007, where he is a Principal Education Consultant. Based on these many years of working with Hyperion products, he has detailed product knowledge across several versions. He delivers both classroom and live virtual courses. His areas of expertise are Oracle/Hyperion Essbase, Oracle Hyperion Planning and Hyperion Web Analysis.

    Disclaimer: All methods and features mentioned in this article must be considered and tested carefully in relation to your environment, processes and requirements. For guidance, please always refer to the available software documentation. This article does not recommend or advise any explicit action or change, hence the author cannot be held responsible for any consequences due to the use or implementation of these features.

    Read the article

  • MySQL backup and restore

    - by Codezy
    I am having trouble getting MySQL backups to run properly when there are views in the database. I think this might have something to do with needing a placeholder object for them. In any event I run this command:

        mysqldump -u myuser -pmypassword mydatabase | mysql -u myuser -pmypassword -C mydatabase_Beta

    The user has full privileges, and I get this: View mydatabase_beta.yadayada references invalid tables or columns or functions or definer/invoker of view lack rights to use them. How can I back it up so that it restores all of my database properly? In the example I am restoring it to a different name, but I do need to be able to restore a working copy. I think it is probably an additional mysqldump parameter, or maybe a hot copy would work better. Thoughts?
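    A sketch of one frequently used workaround, not verified against this schema: the error usually means the views' DEFINER refers to an account that does not exist (or lacks rights) on the restore target, so stripping the DEFINER clauses from the dump before loading it often gets the restore through:

        #!/bin/sh
        # Credentials and names follow the original command; the sed is a crude
        # strip of DEFINER=`user`@`host` tokens and should be checked on a sample dump.
        mysqldump -u myuser -pmypassword --routines --triggers mydatabase \
          | sed -e 's/DEFINER=[^ ]*//g' \
          | mysql -u myuser -pmypassword -C mydatabase_Beta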

    Read the article

  • Using NTBackup to Backup Exchange 2003 Mail Stores

    - by Kyle Brandt
    I have NetBackup backing up my Exchange stores to tape, but I would like to make the restore process faster. I have plenty of room on the array attached to the mail server, so I was thinking I could use NTBackup to do weekly backups in addition to my tape backups. Has anyone used this with good success?

    Read the article

  • SQL Server Restore from Backup, Just primary File Group

    - by bladefist
    Thankfully, this question is just a what-if, and I am not in an emergency right now. I have created a file group in my database (SQL Server 2008) and moved some massive data tables over to it, leaving my website's central tables in the primary file group. In the event of a restore, can I restore just the primary file group and have a working database, or do I have to restore both file groups? I don't want my site down for ages while it restores the second file group.
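    A sketch of the piecemeal (partial) restore syntax this scenario would rely on, run here through sqlcmd; database name, file path and credentials are placeholders, and the usual caveats apply (appropriate recovery model, and filegroups that are not restored stay offline until restored later):

        # Restore only the PRIMARY filegroup from a full backup; the other
        # filegroup remains in a recovery-pending state until it is restored.
        sqlcmd -S localhost -U sa -P 'secret' -Q "
        RESTORE DATABASE MyDb
            FILEGROUP = 'PRIMARY'
            FROM DISK = 'D:\Backups\MyDb_full.bak'
            WITH PARTIAL, RECOVERY;
        "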

    Read the article

  • Backup the Windows user folder in the cloud?

    - by Benjamin
    As I understand it, Google Drive and Dropbox, the two cloud storage providers I happen to know, can only sync a predefined folder that is created upon installation. I'd be happy to have an automated synchronisation of my folders in the cloud, but I'm not ready to change my habits, and start saving all my documents in the folder imposed by the provider. Is it possible with one of these, or any other you might know, to sync the full Windows user folder instead?

    Read the article

  • migrating puppet clients to a new puppet master (old puppet master server gone, only using backup)

    - by user47650
    My puppet master server had a hardware failure, and I have restored it to another box; however, this box has different hardware and a different hostname. If I restore the existing /etc/puppet directory to the new server, the puppetmaster will not start, giving the following error:

        # puppetmasterd --debug --verbose
        Could not prepare for execution: Retrieved certificate does not match private key; please remove certificate from server and regenerate it with the current key

    So what steps do I need to take to allow the new puppetmaster to start, and to generate a new puppetmaster certificate using the old CA? Also, will the puppet clients actually report in to a different puppet server using a server certificate that has been generated with the old CA?
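    A rough sketch of the usual approach, with the caveat that the exact CLI differs between Puppet versions (this assumes a 2.7/3.x-style puppet cert command, the default /var/lib/puppet/ssl layout, and placeholder host names): keep the restored CA directory, discard the old master's own certificate and key, and have the CA issue a certificate for the new hostname, listing the old name as a DNS alt name so agents that still point at it keep validating. Because the CA itself is unchanged, clients it signed should continue to trust the new master:

        #!/bin/sh
        SSLDIR=/var/lib/puppet/ssl   # adjust to your ssldir

        # Remove the old master's own cert and key, but keep the restored CA.
        rm -f $SSLDIR/certs/oldmaster.example.com.pem \
              $SSLDIR/private_keys/oldmaster.example.com.pem

        # Issue a cert for the new hostname; the old name goes in as an alt name.
        puppet cert generate newmaster.example.com \
            --dns_alt_names=puppet,oldmaster.example.com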

    Read the article

  • IT lead does not have a backup, DR plan in writing

    - by Alex
    This is a general management question to the IT managers out there. We are a small firm with about 4 servers in our colo cabinet and no full-time IT manager, but we do have one person on a monthly contract, and I am having a terrible time getting him to share what these plans actually are. I am sure he HAS a plan (and it's probably in his head), but that does us no good if he gets hit by a bus. How would you guys handle this? He is a long-time friend, but I fear this is dangerous for us long term. I have confronted him on several occasions about this, and he tells me not to worry, he has got it covered. Thanks.

    Read the article

  • How to backup or export PowerStrip display profiles?

    - by Sk8erPeter
    I would like to save two of my saved PowerStrip display profiles. Earlier I assigned a 720x540 resolution and some other settings (frequency, etc.) to another display device, usually used in extended mode, which is now NOT connected. But when I go to "Advanced timing options", I see some different settings. I thought I could copy settings with the copy icon, but that way I would copy the wrong ones, not the predefined ones (with the 720x540 resolution). What is the best method to "export" these settings before formatting the hard drive?

    Read the article

  • OSX 10.9 Time Machine backup to NAS

    - by user214577
    I recently upgraded from 10.6.8 to 10.9. On Snow Leopard I was able to make Time Machine backups over the network to my NAS; I think I had to tweak some settings, but I don't recall what I did. Now that I have upgraded to Mavericks, I cannot back up to my NAS using Time Machine. My question is: what do I have to do to allow Time Machine backups over the network in 10.9? I tried looking for solutions online but did not find anything relating to Mavericks.
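    A sketch of the Mavericks-era approach, assuming the NAS exposes an AFP share that can be mounted manually (address, share name and credentials are placeholders); the old TMShowUnsupportedNetworkVolumes trick from 10.6 is widely reported not to carry over, and tmutil is the supported way to set a destination:

        #!/bin/sh
        # Mount the NAS share, then point Time Machine at it.
        mkdir -p /Volumes/TMBackup
        mount_afp "afp://user:pass@nas.local/TMBackup" /Volumes/TMBackup

        # tmutil has shipped with OS X since 10.7.
        sudo tmutil setdestination /Volumes/TMBackup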

    Read the article

  • Secretary cannot add appointments to boss's calendar after exchange restore from backup

    - by therulebookman
    The calendar is the boss's calendar on Exchange. I have set permissions for it through his Outlook to give the secretary and a few other people "Editor" access to his calendar. All the editors can view the calendar, but only he can add new appointments. Anyone else who tries to add an appointment gets "The item cannot be saved in this folder. The folder was deleted or moved or you do not have permission." The permissions are correct (Editor), the item hasn't been deleted or moved, and it is in his mailbox on Exchange. The message says something about the mailbox size, but he is well under the size limit anyway. He is using Outlook 2003, and I have tried accessing it from 2003 and 2007, but I don't think that is related. I tried clearing the forms cache and enabling disabled items: there were no disabled items, and clearing the cache didn't help. I also tried "Allow all forms", but this apparently doesn't apply in this scenario as we are not using any custom forms. Is there any way to delete just his calendar so that I can ExMerge it back in (after exporting to PST, of course)? I really can't ExMerge out his mailbox, delete it, and ExMerge it back in, because he works all sorts of hours, but if this is the only way, then I'll have to do it. Is there any other possible solution?

    Read the article

  • Recover backup copy of a ubuntu linux installation on a usb stick using dd

    - by user10826
    Hi, I installed Ubuntu 10.04 on a USB stick in persistent install mode, so I could boot the laptop or my desktop computer from the stick. At some point I needed the 8GB stick for another purpose, so I thought about copying it to my desktop, doing this from Mac OS X:

        dd if=/dev/disks3s of=/Users/jack/Desktop/usb_copy

    Now I am trying to do the opposite, after having used the stick (which was reformatted to NTFS), just doing:

        dd if=/Users/jack/Desktop/usb_copy of=/dev/disks3s

    but although I can see that almost all of the files are there, I cannot boot from it again. It is also strange that the file permissions look odd, something like _user. What can I do? Thanks
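    A sketch of what probably went wrong and how to avoid it: /dev/disks3s looks like a single slice (partition), so the image holds the filesystem but not the partition table or boot code, which is why the restored stick no longer boots. Imaging the whole device keeps those. Device names below are examples only and must be checked with diskutil list before running dd:

        #!/bin/sh
        # Identify the stick first; writing to the wrong device is irreversible.
        diskutil list

        # Unmount (not eject) every volume on the stick, then image the WHOLE device.
        diskutil unmountDisk /dev/disk3
        sudo dd if=/dev/rdisk3 of=/Users/jack/Desktop/usb_full.img bs=1m

        # Restoring is the same in the other direction.
        diskutil unmountDisk /dev/disk3
        sudo dd if=/Users/jack/Desktop/usb_full.img of=/dev/rdisk3 bs=1m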

    Read the article

  • SSH + MysqlDump Remote Backup Script

    - by bundini
    I'm trying to issue a remote mysqldump command, redirect stdout to a dump file, then tar that up. I'm a bit confused as to how to do the redirection bits over ssh, i.e.:

        ssh [email protected] mysqldump $dbname -u admin -p > dbdump.dmp && tar cvzf dbdump.tar.gz dbdump.dmp

    Issues: 1) I'm not providing the password because I want it to prompt me; will an ssh remote command deal with this? 2) What's the deal with the syntax? Do I want to use quotations or don't I? What happens with the redirects and pipes? Do those have to be escaped or formatted in some special fashion?
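    A sketch of the quoting, with a placeholder host name (the question's address is redacted): without quotes, the local shell handles the > and &&, so the dump file is written on the local machine and the tar runs locally too; quoting the whole command makes the redirect and the tar happen on the remote host, and forcing a terminal with -t lets mysqldump prompt for the password interactively:

        #!/bin/sh
        dbname=mydb   # placeholder

        # Everything inside the quotes runs on the remote host; note that $dbname
        # is expanded locally before ssh sends the command, which is fine here.
        ssh -t admin@db.example.com \
          "mysqldump $dbname -u admin -p > dbdump.dmp && tar czvf dbdump.tar.gz dbdump.dmp"

        # Unquoted, the same line would create dbdump.dmp on the local machine,
        # because the local shell consumes the > and && before ssh ever runs.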

    Read the article
