Search Results

Search found 7545 results on 302 pages for 'backup and restore'.


  • How to set up an rsync backup to Ubuntu securely?

    - by ws_e_c421
    I have been following various tutorials and blog posts on setting up an Ubuntu machine as a backup "server" (I'll call it a server, but it's just running Ubuntu desktop) that I push new files to with rsync. Right now, I am able to connect to the server from my laptop using rsync and ssh with an RSA key that I created, with no password prompt, when my laptop is connected to the same home router as the server. I would like to be able to send files from my laptop when I am away from home. Some of the tutorials I have looked at had brief suggestions about security, but they didn't focus on it. What do I need to do to let my laptop send files to the server without making it too easy for someone else to hack into the server? Here is what I have done so far:
    - Ran ssh-keygen and ssh-copy-id to create a key pair for my laptop and server.
    - Created a script on the server to write its public IP address to a file, encrypt the file, and upload it to an FTP server I have access to (I know I could sign up for a free dynamic DNS account for this part, but since I have the FTP account and don't really need to make the IP publicly accessible, I thought this might be better).
    Here are the things I have seen suggested:
    - Port forwarding: I know I need to assign the server a fixed IP address on the router and then tell the router to forward a port or ports to it. Should I just use port 22, or choose a random port and use that?
    - Turn on the firewall (ufw). Will this do anything, or will my router already block everything except the port I want?
    - Run fail2ban.
    Are all of those things worth doing? Should I do anything else? Could I set up the server to allow connections with the RSA key only (and not with a password), or will fail2ban provide enough protection against malicious connection attempts? Is it possible to limit the kinds of connections the server allows (e.g. only ssh)? I hope this isn't too many questions. I am pretty new to Ubuntu (but use the shell and bash scripts on OS X). I don't need the absolute most secure setup; I'd like something that is reasonably secure without being so complicated that it could easily break in a way that would be hard for me to fix.
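
    A minimal hardening sketch along these lines, assuming OpenSSH on the server (the port number and user name are placeholders):

        # /etc/ssh/sshd_config -- key-only logins, no root, one allowed user
        PasswordAuthentication no
        PermitRootLogin no
        AllowUsers backupuser
        Port 22022   # any non-standard port; forward only this one on the router

        # shell: restrict the firewall to that single port, then add fail2ban
        sudo ufw default deny incoming
        sudo ufw allow 22022/tcp
        sudo ufw enable
        sudo apt-get install fail2ban

    With password logins disabled, key-only auth does the real protective work; fail2ban then mostly keeps the logs quiet, and a non-standard port cuts scanner noise. The router's port forward and ufw overlap, but the local firewall still helps if the router's rules ever change.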

    Read the article

  • Marking Changes to database...

    - by KoolKabin
    Hi guys, I am developing an application to be run on a central server and on distributed computers. I am supposed to write an application to back up the data from the distributed machines and merge it into the central server. I thought of compressing the whole local database and sending it to the server for merging, but as the database grows, the compressed file grows with it. So is there any way to merge the data into the central server without sending the whole database? I need to do this on a daily basis: take a backup daily and send it to the server.
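
    If the tables carry a last-modified timestamp, one sketch is to ship only the last day's rows as REPLACE statements and replay them centrally. This assumes MySQL and an updated_at column; the database, table and host names are placeholders:

        #!/bin/sh
        # Export only rows changed in the last day and ship the delta.
        STAMP=$(date +%F)
        mysqldump --no-create-info --replace \
          --where="updated_at >= NOW() - INTERVAL 1 DAY" \
          localdb orders customers | gzip > "delta-$STAMP.sql.gz"
        scp "delta-$STAMP.sql.gz" central:/incoming/$(hostname)/

    Replaying the file on the central server merges the changed rows, but note two gaps: deletions are never captured, and primary keys must not collide across the distributed machines (e.g. use key ranges or a machine-id column). If those bite, MySQL replication or binary-log shipping is the sturdier route.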

    Read the article

  • Backup & recovery of multiple MySQL databases (InnoDB & MyISAM)

    - by Cymon
    I am working on nightly and hourly backups of MySQL databases. There are multiple MySQL databases, each of which is either InnoDB or MyISAM (note: each database is one or the other for a reason). With the two different types, I want to make sure I am grabbing everything that is needed for backup and recovery. Here is my current plan:
    - Nightly: mysqldump of each DB, stored locally and remotely.
    - Hourly: flush binary logs and store them locally and remotely.
    - Weekly: expire binary logs older than a week.
    I feel like I am grabbing everything that is needed for the MyISAM databases, but I am concerned about the InnoDB databases and the log files (ib_logfile0, ib_logfile1, ibdata1) they create. Should I back up these files? Nightly? Hourly? Both? Do I really need them if I am already doing the above nightly and hourly backups?
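
    A sketch of dump commands that line up with this plan, assuming binary logging is already enabled (database names are placeholders):

        # InnoDB: consistent snapshot without blocking writes; --flush-logs
        # rotates the binlog at the dump point, --master-data=2 records it
        mysqldump --single-transaction --flush-logs --master-data=2 innodb_db > innodb_db.sql

        # MyISAM: lock tables for a consistent dump
        mysqldump --lock-tables myisam_db > myisam_db.sql

        # Hourly: close the current binlog so it can be copied off-box
        mysqladmin flush-logs

        # Weekly: drop binlogs older than 7 days
        mysql -e "PURGE BINARY LOGS BEFORE DATE_SUB(NOW(), INTERVAL 7 DAY);"

    On the InnoDB question: ib_logfile0/ib_logfile1/ibdata1 only matter for physical (file-level) backups taken while mysqld is stopped; with logical dumps plus archived binlogs as above, point-in-time recovery works without copying them.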

    Read the article

  • Tools to backup an external hard disk

    - by Kaushik Gopal
    Hey people, what's the best method to take an exact copy of my external hard disk? A guru suggested rsync, but I was wondering if there's an easier alternative. I do remember reading somewhere that Acronis also does this. Was looking for your advice on the best option. I'm running Windows. Essentially I have an external HDD which has a lot of stuff synchronized across various PCs. I wish to take a backup of this external hard disk (ext. HDDs aren't entirely reliable, so I want to keep a backup of my ext. HDD). Cheers. K
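
    Since you're on Windows, the built-in robocopy can mirror the disk with no extra software; a sketch, where the drive letters and log path are placeholders (note that /MIR also deletes destination files that have vanished from the source):

        rem Mirror the external disk X: onto Y:\ext-backup
        robocopy X:\ Y:\ext-backup /MIR /R:2 /W:5 /LOG:C:\logs\ext-backup.log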

    Read the article

  • How do I restore a backup of my keyring (containing ssh key passphrases, nautilus remote filesystem passwords and wifi passwords)?

    - by con-f-use
    I changed the disk in my laptop and installed Ubuntu on the new disk. The old disk had 12.04, upgraded to 12.10, on it. Now I want to copy my old keyring with WiFi passwords, FTP passwords for nautilus, and ssh key passphrases. I have all the data from the old disk available (it is now a USB disk and I haven't deleted the old data yet or done anything with it; I could still put it back in the laptop and boot from it like nothing happened). The old methods of just copying ~/.gconf/... and ~/.gnome2/keyrings won't work. Did I miss something?
    Edit 1: I figure one needs to copy files not located in the user's home directory as well. I copied the whole old /home/confus (which is my home directory) to the fresh install, to no effect. That whole copy is now reverted, so my /home/confus is as it was after the fresh install.
    Edit 2: The folder /etc/NetworkManager/system-connections seems to be the place for WiFi passwords. Could be that /usr/share/keyrings is important as well for ssh keys; that's the only sensible thing a search came up with: find /usr/ -name "*keyring*"
    Edit 3: Still no ssh and ftp passwords from the keyring. What I did:
    - Converted the old hard drive to a USB drive.
    - Put the new drive in the laptop and installed a fresh version of 12.10 there.
    - Booted from the old HDD via USB and copied its /etc/NetworkManager/system-connections, ~/.gconf/, ~/.gnome2/keyrings and ~/.ssh over to the new disk.
    - Confirmed that all keys on the old install work.
    - Booted from the new disk.
    Result: no passphrases for ssh keys, no FTP passwords in the keyring. At least the WiFi passwords were migrated.
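
    One detail that may explain this: 12.10 moved GNOME keyrings from ~/.gnome2/keyrings to ~/.local/share/keyrings, so files copied to the old path are never read. A sketch of the copy, assuming the old disk is mounted at /media/olddisk (a placeholder path):

        # New keyring location on 12.10; put the old keyring files there
        mkdir -p ~/.local/share/keyrings
        cp -a /media/olddisk/home/confus/.gnome2/keyrings/* ~/.local/share/keyrings/

        # NetworkManager WiFi profiles must be root-owned and not world-readable
        sudo cp /media/olddisk/etc/NetworkManager/system-connections/* \
                /etc/NetworkManager/system-connections/
        sudo chown root:root /etc/NetworkManager/system-connections/*
        sudo chmod 600 /etc/NetworkManager/system-connections/*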

    Read the article

  • Windows reinstallation

    - by Xaver
    Hi, I made a PXE installation of Win XP. Now I want to arrange it so that, before a Windows reinstallation, the files, settings, software and software settings are saved, and after the installation they are loaded again. Can a freeware systems management system help me?
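
    Microsoft's free User State Migration Tool (USMT) targets exactly this save-then-reload step; a sketch using the stock config files it ships with (the share path is a placeholder, and note that USMT carries files and settings but not the installed programs themselves, which must be reinstalled, e.g. by the same PXE/deployment process):

        rem Before reinstalling: capture files and settings to a network store
        scanstate.exe \\server\migstore\%COMPUTERNAME% /i:migapp.xml /i:miguser.xml /o

        rem After reinstalling: load them back onto the fresh Windows
        loadstate.exe \\server\migstore\%COMPUTERNAME% /i:migapp.xml /i:miguser.xml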

    Read the article

  • Creating diff from a tar

    - by Hulk
    There is a tar file which was created a few days ago from the /files directory. Since then, new files have been uploaded to /files. My question is: how do I create another tar file containing only the newly uploaded files? Thanks.
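
    A sketch with GNU tar, using the old archive's own modification time as the cutoff (the paths are placeholders):

        # Collect only files newer than the existing archive and tar them up.
        find /files -type f -newer /backups/files.tar -print0 \
          | tar --null --files-from=- -czf /backups/files-new.tar.gz

    GNU tar's -N/--newer DATE option filters by date directly, and --listed-incremental is the sturdier mechanism if this needs to run regularly.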

    Read the article

  • I want to version control my entire slice

    - by Tom
    I'm renting a slice (i.e., a VPS) from Slicehost. I've spent a day or two filling up /usr with my favorite packages, /etc with configs and init scripts, and so on. Now I want to:
    1. save this whole setup somewhere (e.g., to load onto another machine);
    2. see what changes I've made to which files;
    3. revert changes, tag revisions, and all that other good version control stuff.
    Saving a disk image gives me (1), but not (2) and (3). Using Subversion (svn import / svn://someotherhost) might give me all three, but I expect problems if I actually try to check a project out into / and maintain .svn directories in root-owned areas. And to load my setup onto a fresh slice, I'd need to install an svn client on it first. Is there a good way to do what I want to do?
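
    For the configuration half of this, etckeeper keeps /etc in a git repository (hooked into the package manager) without .svn litter in root-owned areas; a sketch on Debian/Ubuntu:

        sudo apt-get install etckeeper git
        sudo etckeeper init                    # puts /etc under git
        sudo etckeeper commit "baseline config"

        # later: inspect and revert with ordinary git commands
        cd /etc && sudo git log --stat
        sudo git checkout -- ssh/sshd_config   # revert a single file

    For /usr, the usual practice is to version the package list rather than the binaries, e.g. dpkg --get-selections > packages.txt, and keep that file in the same repository.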

    Read the article

  • Copy all installed programs & files in a hard disk (which has 32 bit Windows 7) and clone/transfer it to another computer which has 64 bit Windows 7

    - by galacticninja
    I recently got a new PC which has 64-bit Windows 7 installed. The current PC that I am using has 32-bit Windows 7 installed. I would like to know if there is software that can copy all my installed programs and files from the hard disk of the 32-bit Windows 7 PC and transfer them to the newer PC's hard disk, which has a 64-bit version of Windows 7. This is essentially like "cloning" a hard disk, but I would like to use a 64-bit OS on the target drive instead of also using the 32-bit OS of the source drive. I would like to do this so I can avoid reinstalling and reconfiguring my installed programs and files on the new PC. If possible, I would like the new PC to work as it did on my previous PC, with the installed programs, configuration and files intact, except that the OS is now 64-bit and the hard disk has a larger capacity. I have heard of programs that can clone a hard disk, but my concern is that the 32-bit Windows 7 OS will also be cloned to the new 64-bit PC. If it is not possible to transfer my installed programs and settings the way I described, is there software that can make it easier to migrate my installed programs, their configurations and my files from a 32-bit Windows 7 PC to a 64-bit Windows 7 PC? Details: I have a SATA-to-USB connector/adapter to copy files from the current hard disk to the newer one. The two PCs are connected through LAN, so I can also transfer files through LAN. Both PCs have only one hard disk.

    Read the article

  • Ghost/Acronis/Clonezilla Live Image Creation (Without Rebooting)

    - by user39621
    I know Ghost and Clonezilla aren't able to build images of a system while the system is running (without rebooting). I haven't checked Acronis, though, but I don't sympathize with proprietary solutions. Question: is there a software solution which is able to build a "live" image? Answers would be appreciated, since I'm one step away from building a Clonezilla test environment and this will help my decision. Thank you.
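
    On Linux, the usual trick for a consistent image of a running system is a snapshot at the block layer. A sketch assuming the root filesystem lives on LVM with spare space in the volume group (volume and path names are placeholders):

        # Freeze a point-in-time view of the root logical volume
        sudo lvcreate --snapshot --size 2G --name rootsnap /dev/vg0/root

        # Image the snapshot while the system keeps running
        sudo dd if=/dev/vg0/rootsnap of=/backup/root.img bs=4M

        # Drop the snapshot when done
        sudo lvremove -f /dev/vg0/rootsnap

    This is essentially what VSS-based Windows imaging tools do with a shadow copy; without LVM (or a filesystem that can snapshot itself) there is no equally convenient live option, which is why Clonezilla insists on rebooting.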

    Read the article

  • Protecting Backups from Viruses

    - by Frank Thornton
    Currently we are using a ReadyNAS RN102 by NETGEAR to hold all the images from all our machines. My question is: can viruses spread from the machines and infect the NAS drive and the backups? We are using AOMEI Backupper. EDIT: Basically, I want to keep my clean backups clean. I keep daily, weekly and monthly archived backups; my worry is that a virus might infect these backups, making them worthless.
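
    Anything the clients can write to, malware on those clients can encrypt or corrupt, so the common defence is to keep an archive somewhere the clients have no credentials for and pull copies from the NAS. A sketch of a pull-style rotation, assuming the NAS share is reachable over rsync/ssh (hostnames and paths are placeholders):

        #!/bin/sh
        # Pull dated, hard-linked snapshots of the NAS share onto a box
        # the client machines cannot log in to.
        TODAY=$(date +%F)
        rsync -a --delete --link-dest=/archive/latest \
          nas:/backups/ "/archive/$TODAY/"
        ln -sfn "/archive/$TODAY" /archive/latest

    Unchanged files are hard-linked between snapshots, so keeping daily/weekly/monthly generations costs little extra space, and an infected client can at worst dirty the next pull, not the history.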

    Read the article

  • Backing up data (including mysqldumps) to S3

    - by seengee
    We have a web app on a number of servers and we want to add an additional layer of redundancy by backing up the key data to S3. The key data is the MySQL database and a folder containing dynamically created site assets, predominantly images. Some kind of rsync-based solution would initially seem the best plan. A couple of years ago we played with s3cmd (in particular s3cmd sync) with some success, but we didn't find it particularly reliable, although this may have changed since. It's occurred to me, though, that an rsync solution might not work particularly well with a single db.sql file created with mysqldump, and I assume this means the whole database gets transferred each time; with multiple databases of over 1GB this is going to add up to a lot of traffic (and $s) very quickly. With the image files I could simply transfer only files modified within the last day, which would be far simpler. What approach should I look at?
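
    A sketch of the usual split (bucket, path and database names are placeholders): dump and upload the databases as dated compressed objects, and let s3cmd sync move only the changed image files:

        #!/bin/sh
        STAMP=$(date +%F)

        # Databases: full dump, compressed, one dated object per database
        for DB in shop_db blog_db; do
          mysqldump --single-transaction "$DB" | gzip > "/tmp/$DB-$STAMP.sql.gz"
          s3cmd put "/tmp/$DB-$STAMP.sql.gz" "s3://my-backup-bucket/mysql/$DB/"
        done

        # Images: only new/modified files go over the wire
        s3cmd sync /var/www/assets/ s3://my-backup-bucket/assets/

    Gzipped dumps defeat delta transfer anyway (a one-byte change reshuffles the compressed stream), so if the database uploads must shrink, the lever is per-table dumps or shipping binary logs rather than a cleverer sync.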

    Read the article

  • Automating the Backup of a SQL Server 2008 Express Database

    - by JaydPage
    Steps involved:
    1) Create a database backup script.
    2) Create a scheduled task to run the backup script.
    1. Create a database backup script
    a) Download and install SQL Server Management Studio. This is a free tool available on the Microsoft website.
    b) Once Management Studio is installed, launch it and connect to the SQL Server instance that contains the database you want to back up.
    c) Right-click on the database and in the menu choose Tasks -> Back Up...
    d) This will open a window where you can choose your backup options; once you are happy with the options, click the "Script" button near the top and select the "Script Action to File" option.
    e) Save the file.
    2. Create a scheduled task to run the backup script
    a) Open Windows Task Scheduler.
    b) Create a new task using the wizard; when asked to select a program, browse to C:\Program Files\Microsoft SQL Server\100\Tools\binn\SQLCMD.exe
    c) There are two arguments that need to be set: -S \SERVER_INSTANCE_NAME -i "PATH_OF_SQLBACKUP_SCRIPT", where SERVER_INSTANCE_NAME is the name of the instance of SQL Server that contains your database, e.g. (local), and PATH_OF_SQLBACKUP_SCRIPT is the path of your backup script, e.g. "C:\Program Files\Microsoft SQL Server\DatastoreBackup.sql"
    d) Adjust the task to run at the desired times and you are done.
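
    For reference, the same thing works without the wizard: the generated .sql file boils down to a BACKUP DATABASE statement, and the task can be registered from a command line. A sketch where the instance, database name, paths and schedule are placeholders:

        rem nightly-backup.cmd -- the generated script is essentially:
        rem   BACKUP DATABASE [Datastore] TO DISK = N'C:\Backups\Datastore.bak' WITH INIT;
        "C:\Program Files\Microsoft SQL Server\100\Tools\binn\SQLCMD.exe" -S .\SQLEXPRESS -i "C:\Scripts\DatastoreBackup.sql"

        rem one-off: run the wrapper every night at 02:00
        schtasks /Create /SC DAILY /ST 02:00 /TN "SQL Express Backup" /TR "C:\Scripts\nightly-backup.cmd"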

    Read the article

  • SQL Server 2008 Restore from Backup fails with error 3241 'cannot process this media family'

    - by pearcewg
    I am attempting to back up a database from a SQL Server instance on one machine and restore it to another, and I am encountering the frequently discussed 'SQL Server cannot process this media family' error. Both of my instances are SQL Server 2008, but with different patch levels:
    - Restore: 10.0.2531.0 (Express)
    - Backup: 10.0.1600.22 ((SQL_PreRelease).080709-1414), edition unknown
    The backup instance is on a virtual private server. The restore is on my development box. When I restore to a different database on the source (backup) server, it restores fine. Lots of stuff on Google about this issue, some on Stack Overflow, but nothing matching this exact situation. Any thoughts? It should be straightforward to do a backup and restore from one machine to another (having done this thousands of times with SQL 6.5, 7, 2000 and 2005). Any ideas how to restore a database in this situation, which gives this error when attempting to restore?
    PARTIAL RESOLUTION: When I restored to a different box, running SQL 2008 Express on Windows Server 2003, all worked well. It just wouldn't work on the Windows 7 box. Not sure why. If anyone else has a similar experience, please let me know (there are many similar issues in different forums out there).
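
    When chasing error 3241, it helps to ask the restoring server what it actually sees in the file; a sketch of the two standard diagnostics through sqlcmd (instance name and path are placeholders):

        rem Show the backup set's version and media information
        sqlcmd -S .\SQLEXPRESS -Q "RESTORE HEADERONLY FROM DISK = N'C:\temp\db.bak'"

        rem List the data/log files inside the backup (also validates the media)
        sqlcmd -S .\SQLEXPRESS -Q "RESTORE FILELISTONLY FROM DISK = N'C:\temp\db.bak'"

    If even HEADERONLY fails with 3241 on the target box, that copy of the .bak is damaged; a transfer in FTP ASCII mode or a truncated download is the classic cause, so re-copy in binary mode and compare file sizes.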

    Read the article

  • SQL SERVER – Backup SQL databases to Box or SkyDrive

    - by Pinal Dave
    To ensure your SQL Server or Azure databases remain safe, you should back up your databases periodically, and it is important to store the backups in a reliable location. Microsoft SkyDrive currently offers 7GB free and Box offers 5GB free; both are reliable, and it is simple to send your backups there. SQLBackupAndFTP, in its latest version 9, added the option to back up to SkyDrive and Box (in addition to local/network folder, NAS drive, FTP, Dropbox, Google Drive and Amazon S3). Just select the databases that you'd like to back up and choose to store the backups in SkyDrive or Box. Below I will show you how to do it in detail.
    Select databases to backup: First connect to your SQL Server or Azure SQL Database, then select the databases you'd like to back up.
    Connect to SkyDrive or Box cloud: If you have a free version of SQLBackupAndFTP, the Box destination is included, but the SkyDrive destination will be disabled, as it is available in the Standard version or above. Click "Try now" to get a 30-day trial of all options. On the "SkyDrive Settings" form you'll need to authorize SQLBackupAndFTP to access your SkyDrive. Click "Authorize..." to open the SkyDrive authorization page in your browser, sign in to your SkyDrive account and click "Allow". On the next page you will see the field with the authorization code; copy it to the clipboard. Box operation is just the same. After that, return to SQLBackupAndFTP, paste the authorization code and click "OK". After you are authorized, you can enter the path to a backup folder; SQLBackupAndFTP will create the folder if it does not exist. That's all that has to be done to back up to the SkyDrive or Box cloud. You can now click the "Run Now" button to test this job.
    Conclusion: Whatever your preference for storing SQL backups, it is easy with SQLBackupAndFTP. Note that at the time of this writing they are running a very rare promotion on volume licenses: 5-9 licenses: 20% off; 10-19 licenses: 35% off; more than 20 licenses: 50% off. Please let me know your favorite options for storing backups.
    Reference: Pinal Dave (http://blog.sqlauthority.com)

    Read the article

  • Recommended (remote) backup technique for SQL Compact?

    - by Cool Jon
    Hello. Is there a generally recommended approach to backing up an SQL CE/SQLite database over the Internet? The client source is .NET/Windows based; the backup destination runs Ubuntu. I am using a small SQL CE database and have been trying to figure out the most reasonable approach to doing this. The file size (in terms of transfer time/bandwidth) isn't a big deal. I had a look around, and so far the options I've given thought to are:
    1. Online backup services (Dropbox, Mozy)
    2. Opening an FTP/SFTP connection
    3. Writing a custom protocol with public/private keys
    Unsure regarding #1 because I doubt they would like it if somebody transferred gigabytes of data using a POST, and they do not seem to offer native (or .NET) APIs. FTP/SFTP seems risky in terms of security and privileges (as the password/key would need to be stored on the client side); with the right user group/user privileges this may work. A custom protocol seems overkill, which is why I am hoping somebody has already defined a reasonable API for language/platform-independent backups over the Internet. Any hints, S.O.?
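
    Option 2 can be made reasonably safe by giving the client a dedicated key that can only drop files into one directory. A sketch of the Ubuntu-side sshd configuration (the user name and paths are placeholders):

        # /etc/ssh/sshd_config on the Ubuntu box: lock the backup account
        # into SFTP only, jailed to one directory
        Match User dbbackup
            ChrootDirectory /srv/backups       # must be root-owned, not group-writable
            ForceCommand internal-sftp
            AllowTcpForwarding no
            X11Forwarding no

    The .NET client can then use any SFTP library with a key pair; even if that key leaks, it only grants the ability to write into the jail (give the account a writable subdirectory such as /srv/backups/incoming), so storing it client-side is a contained risk.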

    Read the article

  • What is the suggested approach to Syncing/Backing up/Restoring from SQL Server 2008 to SQL Server 2005?

    - by Eoin Campbell
    I only have SQL Server 2008 (Dev Edition) on my development machine. I only have SQL Server 2005 available with my hosting company (and I don't have direct connection access to this database). I'm just wondering what the best approach is for:
    1. getting the initial DB structure and data into production, and
    2. keeping any structural changes/data changes in sync in future.
    As far as I can see:
    - Replication: not an option because I can't connect to the production DB.
    - Restoring a backup: not an option because, as far as I can see, you cannot export a DB from 2008 that is restorable in 2005 (even with the 2008 DB set to 2005 compatibility mode), and it wouldn't make sense to restore production over the top of my dev version anyway.
    - Dump all the scripts from my 2008 database, revert my dev machine from 2008 to 2005, and recreate the database from the scripts; then use backup and restore to get the initial DB into production and run scripts through the web panel from that point onwards.
    - Dump all the scripts from my 2008 database and generate the entire 2005 DB from scripts in production; then run scripts through the web panel from that point onwards.
    With the last two options, I'd probably need to script all the data inserts as well, using some tool (which I presume exists on the web). Are there any other possible solutions that I'm not considering?

    Read the article

  • Moving from SVN to HG : branching and backup

    - by rorycl
    My company runs svn right now and we are very familiar with it. However, because we do a lot of concurrent development, merging can become very complicated. We've been playing with hg and we really like the ability to make fast and effective clones on a per-feature basis. We've got two main issues we'd like to resolve before we move to hg.
    Branches for erstwhile svn users: I'm familiar with the "four ways to branch in Mercurial" as set out in Steve Losh's article. We think we should "materialise" the branches, because the dev team will find this the most straightforward way of migrating from svn. Consequently I guess we should follow the "branching with clones" model, which means that separate clones for branches are made on the server. While this means that every clone/branch needs to be made on the server and published separately, this isn't too much of an issue for us, as we are used to checking out svn branches, which come down as separate copies. I'm worried, however, that merging changes and following history may become difficult between branches in this model.
    Backup: If programmers in our team make local clones of a branch, how do they back up the local clone? We're used to seeing svn commit messages like this on a feature branch: "Interim commit: db function not yet working". I can't see a way of doing this easily in hg.
    Advice gratefully received. Rory
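
    On the backup point, a sketch of how this usually works in hg: interim commits stay local until pushed, and pushing to a private per-developer clone on the server backs them up without publishing them to the team (repository paths are placeholders):

        # Commit work-in-progress locally, as often as you like
        hg commit -m "Interim commit: db function not yet working"

        # Back it up by pushing to a per-developer clone on the server
        hg push ssh://server//srv/hg/backups/feature-x-rory

        # Colleagues pull from the official branch clone, not the backup
        hg pull ssh://server//srv/hg/feature-x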

    Read the article

  • NetApp NDMP backup with BE 2010 R2 works, restore fails

    - by uuwe
    Hi, I'm having some issues with a new Backup Exec 2010 R2 installation. I configured a NetApp FAS2020 as an NDMP device and want to back up files from the NAS to a tape drive connected to my backup server. I set up ndmpd according to this document (http://www.symantec.com/business/support/index?page=content&id=TECH48957) and created a separate backup user (http://filers.blogspot.com/2006/09/setting-veritas-netbackup-with-non.html). Backup works perfectly, but restoring any file gives me an authentication failed error. The NDMP device has a "global" ndmp user configured in the device tab (tried this with the newly created ndmpd backup user and the NetApp root) and I can also configure separate resource credentials in the BE restore job. I have tried setting the same accounts for the "global" ndmp device and the restore credentials, and have also tried setting different accounts for them. NDMP debug level is at 5, and this is what shows up in /etc/messages; the session is closed immediately after it has been granted:
    16:12:07 PST [Java_Thread:info]: ndmpdserver: ndmpd.access allowed for version = 4, sessionId = 51, from src ip = 192.168.11.17, dst ip = FAS2020-1/192.168.11.75, src port = 50857, dst port = 10000
    16:12:07 PST [Java_Thread:info]: Ndmpd51: ndmpd session closed successfully for version = 4, sessionId = 51, from src ip = 192.168.11.17, dst ip = FAS2020-1/192.168.11.75, src port = 50857, dst port = 10000
    Running Wireshark on the backup server doesn't produce much. It shows a SYN - SYN/ACK - NDMP CONNECT_CLOSE request from the backup server. The resource credentials for the restore job behave very oddly: if I enter NDMP credentials and do "Test All", it fails; if I use my regular domain backup account, it is successful. There are no failed or succeeded logons in the NetApp ndmp log, and tracing this check shows that it doesn't even connect to the NAS. This makes me think this is more likely flaky BE behaviour than misconfiguration of the NAS. Here is the options ndmp output:
    FAS2020-1> options ndmp
    ndmpd.access                 all
    ndmpd.authtype               challenge
    ndmpd.connectlog.enabled     on
    ndmpd.enable                 on
    ndmpd.ignore_ctime.enabled   off
    ndmpd.offset_map.enable      on
    ndmpd.password_length        16
    ndmpd.preferred_interface    disable
    ndmpd.tcpnodelay.enable      off
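
    One thing worth checking in this setup: with ndmpd.authtype set to challenge, Data ONTAP expects the NDMP-specific password for non-root users, which is not the same as the user's login password. A sketch of generating and checking it on the filer console (the user name is a placeholder):

        FAS2020-1> ndmpd password backupuser    # prints the NDMP-specific password
        FAS2020-1> ndmpd status                 # lists active NDMP sessions

    The generated string, not the account's login password, is what belongs in the restore job's resource credentials when challenge authentication is in use.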

    Read the article

  • What is a ‘best practice’ backup plan for a website?

    - by HollerTrain
    I have a website which is very large and has a large user base. I am trying to think of a 'best practice' way to create a backup or mirror website, so that if something happens on domain.com, I can quickly point the site to backup.domain.com via a 302 (temporary) redirect. This would give me time to troubleshoot domain.com while everyone is viewing backup.domain.com and not knowing the difference. Is my method the ideal one, or have you enacted better methods for creating a backup site? I don't want to have the site go down and then get yelled at every minute while I'm trying to fix it. Ideally I would just 'flip the switch' and it would redirect users to the backup. Any insight would be greatly appreciated.
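
    A sketch of the two moving parts this needs, assuming Apache with mod_rewrite on the primary and an rsync cron job (e.g. hourly: rsync -az --delete /var/www/domain.com/ mirror:/var/www/backup.domain.com/) keeping the mirror fresh; all names and paths are placeholders:

        #!/bin/sh
        # flip.sh -- switch visitors to the mirror and back.
        # redirect.htaccess contains the pre-written rule:
        #   RewriteEngine On
        #   RewriteRule ^(.*)$ http://backup.domain.com/$1 [R=302,L]
        case "$1" in
          on)  cp /etc/site/redirect.htaccess /var/www/domain.com/.htaccess ;;
          off) rm -f /var/www/domain.com/.htaccess ;;
          *)   echo "usage: $0 on|off" >&2; exit 1 ;;
        esac

    Keeping the redirect pre-written and tested means the "flip" is one command under pressure, and using a 302 (rather than a 301) stops browsers and crawlers from caching the detour once domain.com is healthy again.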

    Read the article
