Search Results

Search found 6798 results on 272 pages for 'continuous backup'.

  • How to obtain a list of installed packages based on a backup?

    - by ujjain
    I have made a backup of my entire computer, and I have just reinstalled the OS. I know I should have run dpkg -l to save a list of installed packages, but I did not; I only made a tarball of the entire disk. How can I recover a list of the installed packages from this data? I would like to install the same packages I had on my previous setup, but there no longer seems to be a way to get the list, since I have already reinstalled Ubuntu and only have the tarball backup.
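
    A minimal sketch of one way to recover the list, assuming the tarball preserved /var/lib/dpkg/status and stores paths without the leading slash (the archive and output file names are placeholders):

        # List installed packages recorded in the backed-up dpkg database
        tar -xOf backup.tar var/lib/dpkg/status \
          | awk '/^Package:/ { pkg = $2 } /^Status:.*ok installed/ { print pkg }' \
          | sort > packages.txt

        # Replay the selection on the freshly installed system
        xargs -a packages.txt sudo apt-get install -y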

    Read the article

  • Do you know of reputable backup software that can capture ONLY file system structure + attributes, WITHOUT file content?

    - by bogdan
    Is there reputable backup software for Windows capable of capturing ONLY a file system's directory and file structure, along with each item's attributes, WITHOUT capturing the actual file content (all files should be zero-length in the backup)? I thoroughly searched the web and wasn't able to find such a solution. Scenario where this would be very useful: I have a large drive with a huge number of files. If the drive dies, I don't care much about the content of those files (I can always download it again from the Internet), but I care HUGELY about the names of the files that were on it, possibly also their MD5 hashes and the classic file attributes (especially created date / modified date). The functionality I need is present to an extent in "media"/file-cataloging software (e.g. WhereIsIt) and, to a lesser extent, in a set of Total Commander extensions (DiskDir, DiskDirExtended). The huge drawback of cataloging software is that it isn't designed to store previous versions of each item (AFAIK) and, most importantly, its backup capabilities are very weak. I managed to think of a hack, but I hope there's backup software out there that already has this capability and I just failed to find it, hence this question. The hack: RoboCopy could be used with /CREATE (create directory tree and zero-length files only), or with /COPY without the D (data) flag, to clone a directory structure into one where all files are zero-length but have the desired attributes; I would then back up the cloned structure with a reputable backup tool (a sketch follows below). I would really love to avoid a hack like this, if possible. Thanks, Bogdan
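
    For reference, a minimal sketch of the RoboCopy hack described above (paths are placeholders; /CREATE writes the directory tree and zero-length files, /COPY:AT keeps attributes and timestamps without data):

        rem Clone D:\data into E:\skeleton as zero-length files with original attributes
        robocopy D:\data E:\skeleton /E /CREATE /COPY:AT /DCOPY:T /LOG:clone.log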

    Read the article

  • Mac OS X - Time Machine backup fails verification - What can I do to save the history?

    - by usermac75
    Hi, how do I make Time Machine perform a new complete backup without losing older versions of backed-up files? Verbose: I am using Time Machine on OS X (Snow Leopard) to back up the whole computer to an external drive. I especially like the "history", i.e. the feature that lets you restore an older version of a file. Problem: I had some data corruption on my external backup drive, so I had Disk Utility repair it; it found and fixed some faults, and after that the external drive was OK and I could use Time Machine again. I let Time Machine run one more backup, then verified it following http://superuser.com/questions/47628/verifying-time-machine-backups, namely along the lines of sudo diff -qr . $HOME/Desktop 2>&1 | tee $HOME/timemachine-diff.log However: the command reported differences and missing files, approx. 200 files in all. While some of the missing files were caches or excluded directories, the differences do bother me, especially as some of my important documents are listed as differing. How can I make sure the data on the external drive is synced correctly? Is it possible to have Time Machine do a complete new backup without losing the history? Or to have Time Machine compare all files and re-copy those that differ? Or can I set some flag on the files that do not match to have them copied again (like the archive flag in Windows/DOS)? I'd rather not touch the files myself, because I would like to keep their last-changed and creation dates. Thank you for your thoughts!

    Read the article

  • PHP code uploaded via FTP to back up a DB using a URL

    - by Giom
    I found this code and uploaded it via FTP as backup.php, and it works great, but it puts the backup file in the root folder (homez/). 1. I would like to put the backup files in homez/backupDB instead - where in the script do I set this path? 2. The DB backup files (DB.sql.bz2) always have the same name - is it possible to name each one with its creation date? (I launch this PHP script via its URL.)

        <?php
        echo "Votre base est en cours de sauvegarde....... "; // "Your database is being backed up"
        $db = "nom_de_ma_base";
        $status = system("mysqldump --host=mysql5-1.perso --user=$_POST[login] --password=$_POST[password] $db > ../$db.sql");
        echo $status;
        echo "Compression du fichier..... "; // "Compressing the file"
        system("bzip2 -f ../$db.sql");
        echo "C'est fini. Vous pouvez récupérer la base par FTP \n "; // "Done. You can fetch the database over FTP"
        ?>

    Thanks!
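
    A minimal sketch of both changes, assuming a homez/backupDB directory already exists one level above the web root so the script can reach it as ../backupDB (the path and date format are assumptions to adapt):

        <?php
        // Write the dump into ../backupDB with a creation-date suffix,
        // e.g. nom_de_ma_base-2013-05-21.sql.bz2
        $db   = "nom_de_ma_base";
        $file = "../backupDB/" . $db . "-" . date("Y-m-d") . ".sql";
        system("mysqldump --host=mysql5-1.perso --user=$_POST[login] --password=$_POST[password] $db > $file");
        system("bzip2 -f $file");
        ?>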

    Read the article

  • Implementing Cluster Continuous Replication, Part 3

    Cluster continuous replication (CCR) uses log shipping and failover to provide a more resilient email system with faster recovery. Once it is installed, a clustered server requires different management routines, carried out either with a GUI tool, the Failover Cluster Management Console, or from the command line with the Exchange Management Shell; PowerShell can handle some tasks as well. Confused? Not for long, since Brien Posey is once more here to help.

    Read the article

  • Continuous Deployment to Azure powered by Git

    Today Scott Guthrie announced several updated capabilities for Azure Web Sites: "Announcing: Great Improvements to Windows Azure Web Sites". I recommend you check out the full post; there are some really cool improvements. My favorite is the ability to enable Continuous Deployment from your CodePlex project into Azure. David Ebbo has a great video walk-through.

    Read the article

  • Automatically starting a CrashPlan backup when a USB hard disk is connected

    - by Stefan Armbruster
    Using CrashPlan, I've configured two backup sets: an online backup to CrashPlan's cloud (this is running perfectly), and a local backup to a USB hard disk connected directly to the laptop. The USB drive is only connected rarely, when I'm at home, and it mounts automatically when connected. Is there a way to start the local backup whenever the USB disk is connected? My guess is that using udev it should be possible to "somehow" tell CrashPlan to re-evaluate the presence of the backup location. Any ideas how to do this?
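
    A minimal sketch of the udev approach, assuming the drive can be matched by its filesystem UUID and that restarting the CrashPlan engine is enough to make it re-check its destinations (the UUID and helper-script name are hypothetical):

        # /etc/udev/rules.d/99-crashplan-usb.rules
        ACTION=="add", SUBSYSTEM=="block", ENV{ID_FS_UUID}=="1234-ABCD", RUN+="/usr/local/bin/kick-crashplan.sh"

        # /usr/local/bin/kick-crashplan.sh (mark it executable)
        #!/bin/sh
        sleep 15                       # give the automounter time to mount the drive
        service crashplan restart      # make the engine re-scan its backup destinations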

    Read the article

  • How to set up an rsync backup to Ubuntu securely?

    - by ws_e_c421
    I have been following various tutorials and blog posts on setting up an Ubuntu machine as a backup "server" (I'll call it a server, but it's just running Ubuntu desktop) that I push new files to with rsync. Right now I can connect to the server from my laptop using rsync and ssh, with an RSA key I created and no password prompt, whenever the laptop is on the same home router as the server. I would like to be able to send files from my laptop when I am away from home. Some of the tutorials had brief suggestions about security, but they didn't focus on it. What do I need to do to let my laptop send files to the server without making it too easy for someone else to hack in? Here is what I have done so far: ran ssh-keygen and ssh-copy-id to create a key pair for my laptop and server; and created a script on the server that writes its public IP address to a file, encrypts the file, and uploads it to an FTP server I have access to (I know I could sign up for a free dynamic-DNS account instead, but since I have the FTP account and don't really need the IP to be publicly accessible, this seemed better). Here are the things I have seen suggested: port forwarding (I know I need to give the server a fixed IP address on the router and forward a port to it - should I just use port 22, or choose a random port?); turning on the firewall (ufw) (will this do anything, or will my router already block everything except the forwarded port?); and running fail2ban. Are all of those worth doing? Should I do anything else? Could I set up the server to allow connections with the RSA key only (and not with a password), or does fail2ban provide enough protection against malicious connection attempts? Is it possible to limit the kinds of connections the server allows (e.g. only ssh)? I hope this isn't too many questions. I am pretty new to Ubuntu (but I use the shell and bash scripts on OS X). I don't need the absolute most secure setup; I'd like something reasonably secure without being so complicated that it could easily break in a way that would be hard for me to fix.
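
    A minimal sketch of the key-only, nonstandard-port setup discussed above (the port number is an arbitrary example):

        # /etc/ssh/sshd_config, then: sudo service ssh restart
        #   Port 2222
        #   PasswordAuthentication no
        #   PermitRootLogin no

        # Firewall: deny all inbound traffic except the forwarded SSH port
        sudo ufw default deny incoming
        sudo ufw allow 2222/tcp
        sudo ufw enable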

    Read the article

  • Problem restoring a SQL Server backup

    - by Elmex
    I have a SQL Server 2008 instance that is part of a domain. I made a backup of a database on this server and restored it on a SQL Server that is not part of a domain. I have a C# application that uses this database. On the non-domain machine I now get exceptions like: "Cannot execute as the database principal because the principal "dbo" does not exist, this type of principal cannot be impersonated, or you do not have permission." I think the problem is that the database owner is a domain user, and this user doesn't exist on the target (restore) machine. How can I solve this?
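
    A common fix is to re-map the restored database to a principal that exists on the target server. A minimal sketch, with a hypothetical database name:

        -- Make the local 'sa' login the owner of the restored database
        ALTER AUTHORIZATION ON DATABASE::MyRestoredDb TO sa;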

    Read the article

  • Backup & recovery of multiple MySQL databases (InnoDB & MyISAM)

    - by Cymon
    I am working on nightly and hourly backups of MySQL databases. There are multiple databases, each of which is either InnoDB or MyISAM (note: each database uses one engine or the other for a reason). With the two different engine types, I want to make sure I am grabbing everything needed for backup and recovery. Here is my current plan: nightly - a mysqldump of each DB, stored locally and remotely; hourly - flush the binary logs and store them locally and remotely; weekly - expire binary logs older than a week. I feel like I am covering the MyISAM databases, but I am concerned about the InnoDB databases and the files they create (ib_logfile0, ib_logfile1, ibdata1). Should I back up these files? Nightly? Hourly? Both? Do I really need them if I am already doing the nightly and hourly backups above?
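
    A minimal sketch of the nightly/hourly/weekly steps above, assuming binary logging is enabled in my.cnf (paths and database names are placeholders):

        # Nightly: logical dump; --single-transaction gives a consistent InnoDB
        # snapshot (for pure-MyISAM databases the default --lock-tables is safer)
        mysqldump --single-transaction --flush-logs --master-data=2 \
          --databases mydb > /backup/mydb-$(date +%F).sql

        # Hourly: close the current binary log so finished logs can be shipped offsite
        mysqladmin flush-logs

        # Weekly: drop binary logs older than seven days
        mysql -e "PURGE BINARY LOGS BEFORE NOW() - INTERVAL 7 DAY;"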

    Read the article

  • Tools to back up an external hard disk

    - by Kaushik Gopal
    Hey people, what's the best method to take an exact copy of my external hard disk? A guru suggested rsync, but I was wondering if there's an easier alternative; I remember reading somewhere that Acronis also does this. I'm running Windows. Essentially I have an external HDD with a lot of stuff synchronized across various PCs, and I want to keep a backup of it, since external HDDs aren't entirely reliable. Was looking for your advice on the best option. Cheers, K
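
    Since you're on Windows, one low-tech option is the built-in RoboCopy; a minimal sketch (drive letters are placeholders):

        rem Mirror the external drive onto a second drive; /MIR also deletes files
        rem from the copy that no longer exist on the source, keeping it exact
        robocopy E:\ F:\ext-hdd-backup /MIR /R:1 /W:1 /LOG:C:\backup.log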

    Read the article

  • Things to consider when building a continuous integration server?

    - by Dave
    I'm new to continuous integration, but I immediately see its value and want to get a server set up right away. I have played with TeamCity and have it working great in a VM. Now, since I don't want to spend money on another system, I was planning to run the VM again on a faster machine (i.e. my dev system). A few questions come to mind with this: Hard disk allocation - how big should it be? Sure, 60 GB seems like more than enough, but people also used to think we'd never need more than 64 KB of RAM. Backups - is it even important to back up the integration server? I guess it's nice not to have to go through the entire configuration process again, but that's about it; I could snapshot my VM on every configuration change and then back up applications only (ignoring the buildAgent stuff). Migration - if I later want to move from a VM on my dev system to a real server, perhaps even one running Windows Server 2003, is that easy enough? Perhaps that last point is better suited for Stack Overflow.

    Read the article

  • Microsoft ReportViewer SetParameters continuous refresh issue

    - by Ilya Verbitskiy
    Originally posted on: http://geekswithblogs.net/ilich/archive/2013/10/16/microsoft-reportviewer-setparameters-continuous-refresh-issue.aspx
    I am a big fan of using ASP.NET MVC for building web applications. It allows us to create simple, robust and testable solutions. However, the .NET world is not perfect: there are tons of code written in ASP.NET Web Forms, and you cannot simply ignore it, even if you want to. Sometimes ASP.NET Web Forms controls bring us non-obvious issues. A good example is the Microsoft ReportViewer control. I have an example for you.

        <%@ Page Language="C#" AutoEventWireup="true" CodeFile="Default.aspx.cs" Inherits="_Default" %>
        <%@ Register Assembly="Microsoft.ReportViewer.WebForms, Version=11.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91" Namespace="Microsoft.Reporting.WebForms" TagPrefix="rsweb" %>

        <!DOCTYPE html>
        <html xmlns="http://www.w3.org/1999/xhtml">
        <head runat="server">
            <title>Report Viewer Continuous Refresh Issue Example</title>
        </head>
        <body>
            <form id="form1" runat="server">
                <div>
                    <asp:ScriptManager runat="server"></asp:ScriptManager>
                    <rsweb:ReportViewer ID="_reportViewer" runat="server" Width="100%" Height="100%"></rsweb:ReportViewer>
                </div>
            </form>
        </body>
        </html>

    The back-end code is simple as well. I want to show a report with some parameters to the user.

        protected void Page_Load(object sender, EventArgs e)
        {
            _reportViewer.ProcessingMode = ProcessingMode.Remote;
            _reportViewer.ShowParameterPrompts = false;

            var serverReport = _reportViewer.ServerReport;
            serverReport.ReportServerUrl = new Uri("http://localhost/ReportServer_SQLEXPRESS");
            serverReport.ReportPath = "/Reports/TestReport";

            var reportParameter1 = new ReportParameter("Parameter1");
            reportParameter1.Values.Add("Hello World!");

            var reportParameter2 = new ReportParameter("Parameter2");
            reportParameter2.Values.Add("10/16/2013");

            var reportParameter3 = new ReportParameter("Parameter3");
            reportParameter3.Values.Add("10");

            serverReport.SetParameters(new[] { reportParameter1, reportParameter2, reportParameter3 });
        }

    I set ShowParameterPrompts to false because I do not want the user to refine the search. It looks good until you run the report: the report refreshes itself continuously. The problem is caused by the ServerReport.SetParameters call in Page_Load. That method causes the ReportViewer control to execute the report on the NEXT post-back, which is why the page keeps posting back. The fix is very simple: do nothing when Page_Load runs during a post-back.

        protected void Page_Load(object sender, EventArgs e)
        {
            // Setting parameters triggers a report run on the next post-back,
            // so only do it on the initial GET request
            if (IsPostBack)
            {
                return;
            }

            _reportViewer.ProcessingMode = ProcessingMode.Remote;
            _reportViewer.ShowParameterPrompts = false;

            var serverReport = _reportViewer.ServerReport;
            serverReport.ReportServerUrl = new Uri("http://localhost/ReportServer_SQLEXPRESS");
            serverReport.ReportPath = "/Reports/TestReport";

            var reportParameter1 = new ReportParameter("Parameter1");
            reportParameter1.Values.Add("Hello World!");

            var reportParameter2 = new ReportParameter("Parameter2");
            reportParameter2.Values.Add("10/16/2013");

            var reportParameter3 = new ReportParameter("Parameter3");
            reportParameter3.Values.Add("10");

            serverReport.SetParameters(new[] { reportParameter1, reportParameter2, reportParameter3 });
        }

    You can download the sample code from GitHub - https://github.com/ilich/Examples/tree/master/ReportViewerContinuousRefresh

    Read the article

  • Windows reinstallation

    - by Xaver
    Hi, I did a PXE installation of Windows XP. Now I want a setup where, before a Windows reinstallation, the user's files, settings, software, and software settings are saved, and after the installation they are restored. Can a freeware systems-management solution help me with this?

    Read the article

  • Creating a diff from a tar

    - by Hulk
    There is a tar file that was created a few days ago from the /files directory. Since then, new files have been uploaded to /files. My question is: how do I create another tar containing only the newly uploaded files? Thanks.
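
    A minimal sketch using the old tar's own modification time as the cutoff (file names are placeholders):

        # Archive only files created or changed after the original tar was written
        find /files -type f -newer /backup/files.tar -print0 \
          | tar --null --files-from=- -cf /backup/files-new.tar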

    Read the article

  • Carbonite Restore Error after Windows Update

    - by Rev
    I recently installed Windows 8 on my main machine, and I'm having an issue with Carbonite: I can't restore any files from my previous OS, only files that were backed up from the current OS. I can download them from the web service, but I can't restore them with the client. The error says "You don't have the necessary security permissions to restore this file." (The file on top was backed up from the new install, while the one on the bottom was from the old install.) I'm also having another, probably related issue: Carbonite doesn't think it's running in admin mode, and when I right-click to restore a file I can't choose a location to restore to. I contacted Carbonite, and they said it's a known issue that apparently happens when upgrading from XP to 7, but they wouldn't help me since I'm running a beta OS. Hopefully one of you fine folks will have an idea.

    Read the article

  • I want to version control my entire slice

    - by Tom
    I'm renting a slice (i.e., a VPS) from Slicehost. I've spent a day or two filling up /usr with my favorite packages, /etc with configs and init scripts, and so on. Now I want to: (1) save this whole setup somewhere, e.g. to load onto another machine; (2) see what changes I've made to which files; (3) revert changes, tag revisions, and all that other good version-control stuff. Saving a disk image gives me (1), but not (2) and (3). Using Subversion (svn import / svn://someotherhost) might give me all three, but I expect problems if I actually try to check a project out into / and maintain .svn directories in root-owned areas. And to load my setup onto a fresh slice, I'd need to install an svn client on it first. Is there a good way to do what I want to do?
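
    A minimal sketch of the usual compromise: put /etc (rather than all of /) under git, which is what the etckeeper tool automates; the file path in the revert example is hypothetical:

        # Track configuration changes in /etc with git
        cd /etc
        sudo git init
        sudo git add -A
        sudo git commit -m "baseline: freshly configured slice"

        # Later: see what changed, and revert a single file if needed
        sudo git diff
        sudo git checkout -- apache2/apache2.conf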

    Read the article

  • Copy all installed programs & files from a hard disk with 32-bit Windows 7 and clone/transfer them to another computer with 64-bit Windows 7

    - by galacticninja
    I recently got a new PC with 64-bit Windows 7 installed; my current PC runs 32-bit Windows 7. Is there software that can copy all my installed programs and files from the 32-bit PC's hard disk to the new 64-bit PC's hard disk? This is essentially like cloning a hard disk, except I want to keep the 64-bit OS on the target drive instead of also cloning the 32-bit OS from the source drive. I would like to do this so I can avoid reinstalling and reconfiguring my programs and files on the new PC. If possible, I would like the new PC to work just as the old one did, with installed programs, configuration, and files intact, except that the OS is now 64-bit and the hard disk has a larger capacity. I have heard of programs that can clone a hard disk, but my concern is that the 32-bit Windows 7 OS would also be cloned onto the new 64-bit PC. If a transfer like this isn't possible, is there software that can at least make it easier to migrate my installed programs, their configurations, and my files from a 32-bit Windows 7 PC to a 64-bit one? Details: I have a SATA-to-USB adapter to copy files from the current hard disk to the newer one, and the two PCs are connected over LAN, so I can also transfer files that way. Both PCs have only one hard disk.

    Read the article

  • Ghost/Acronis/Clonezilla Live Image Creation (Without Rebooting)

    - by user39621
    I know Ghost and Clonezilla aren't able to build images of a system while the system is running (without rebooting). I haven't checked Acronis, but I don't sympathize with proprietary solutions. Question: is there a software solution that can build a "live" image? I'd appreciate answers, since I'm one step away from building a Clonezilla test environment, and this will help my decision. Thank you.
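
    One approach that can image a running system, sketched minimally: a snapshot-based copy, which assumes the volume sits on LVM with free space in the volume group (device names are placeholders):

        # Freeze a point-in-time view of the running volume, image it, drop the snapshot
        lvcreate --size 5G --snapshot --name rootsnap /dev/vg0/root
        dd if=/dev/vg0/rootsnap bs=4M | gzip > /mnt/backup/root.img.gz
        lvremove -f /dev/vg0/rootsnap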

    Read the article

  • Protecting Backups from Viruses

    - by Frank Thornton
    We currently use a NETGEAR ReadyNAS RN102 to hold images of all our machines, made with AOMEI Backupper. My question is: can viruses spread from the machines and infect the NAS drive and the backups? EDIT: Basically, I want to keep my clean backups clean. I keep daily, weekly, and monthly archived backups; my worry is that a virus might infect these backups, making them worthless.

    Read the article
