Search Results

Search found 9847 results on 394 pages for 'cloud backup'.

Page 11/394 | < Previous Page | 7 8 9 10 11 12 13 14 15 16 17 18  | Next Page >

  • What makes Erlang suitable for cloud applications?

    - by Duncan
    We are starting a new project, implemented on our corporation's instantiation of an OpenStack cloud (see http://www.openstack.org/). The project is security tooling for our corporation. We currently run many hundreds of dedicated servers for security tools and are moving them to our corporation's instantiation of OpenStack. Other projects in my company already use Erlang in several distributed server applications, and other Q&A posts point out that Erlang is used in several popular cloud services. I am trying to convince others to consider where it might be applicable on our project. What are Erlang's strengths for cloud programming? In which areas is it particularly appropriate to use Erlang?

    Read the article

  • Easing the Journey to the Private Cloud with Oracle Consulting

    - by MichaelM-Oracle
    By Sanjai Marimadaiah, Senior Director, Strategy & Business Development – Cloud Solutions, Oracle Consulting Services

    Business leaders are now leading the charge on how their firms can profit from cloud solutions. Agility and innovation are becoming the primary drivers of the business case for the cloud, even more than the anticipated cost savings. Leaders need to find the right strategy and optimize the use of cloud-based applications across their enterprise-computing infrastructure.

    The Problem - Current State
    With prevalent IT practices, many organizations find that they run multiple IT solutions serving similar business needs. This has led to the proliferation of technology stacks, for example: Oracle 10g on Sun T4 running Solaris 9; Oracle 11g on Exadata running Linux; or Oracle 12c on commodity x86 servers. This variance has a huge impact on an organization's agility and expenses, and requires IT professionals with varied skills as well as ongoing training for different systems and tools. Fortunately, there is a practical business strategy to overcome this unneeded redundancy. Thus begins the journey to the right cloud computing solution.

    The Solution - Cloud Services from Oracle Consulting Services (OCS)
    Oracle Consulting Services (OCS) works closely with our clients as trusted advisors to proactively respond to business needs and IT concerns. OCS understands that making the transition to cloud solutions begins with a strategic conversation, based on its deep expertise in successfully completing private cloud service engagements with several companies. For the journey to the cloud, Oracle Consulting Services leads the client through four phases - standardization, consolidation, service delivery, and enterprise cloud - to achieve optimal returns.

    Phase 1 - Standardization
    OCS works with clients to evaluate their business requirements and propose a set of standard solution stacks for various IT solutions. This is an opportune time to evaluate cloud-ready solutions, such as Oracle 12c, Oracle Exadata, and the Oracle Database Appliance (ODA). The OCS consultants, together with the delivery team, then turn to upgrading and migrating existing solution stacks to standardized offerings. OCS has the expertise and tools to complete this stage in a fraction of the time required by other IT services companies. Clients quickly realize cost savings in tools, processes, and the type and number of resources required. This standardization also improves the agility of IT organizations and their ability to respond to the needs of various business units.

    Phase 2 - Consolidation
    During the consolidation phase, OCS consultants programmatically consolidate hundreds of databases onto a smaller number of servers to improve utilization, reduce floor space, and optimize maintenance costs. Consolidation helps clients realize huge savings in CapEx investments and shrink OpEx costs. The use of engineered systems, such as Oracle Exadata, greatly reduces the client's risk of moving to a new solution stack. OCS recommends that clients pursue Phase 1 (Standardization) and Phase 2 (Consolidation) simultaneously to reduce the overall time, effort, and expense of the cloud journey.

    Phase 3 - Service Delivery
    Once a client is on the path of standardization and consolidation, OCS consultants create service catalogues based on SLA requirements and the criticality of the solutions. The number and types of service catalogues (Platinum, Gold, Silver, Bronze, etc.) vary from client to client. OCS consultants also implement a variety of value-added cloud solutions, including monitoring, metering, and charge-back solutions. At this stage, clients reach a high level of maturity in their cloud journey: their IT organizations operate efficiently and respond more nimbly to the needs of business units.

    Phase 4 - Enterprise Cloud
    In the final phase of the cloud journey, the economics of the IT organization change. Business units can request services on demand; applications can be deployed and consumed on a pay-as-you-go model. OCS has the expertise and capabilities to establish the processes, programs, and solutions required for IT organizations to transform how they interact with business units.

    The Promise of Cloud Solutions
    Depending on the size and complexity of their business model, some clients are able to abbreviate some phases of their cloud journey. Cloud solutions are still evolving, and a rapid pace of innovation is transforming how IT organizations operate. The lesson is clear: cloud solutions hold a lot of promise for business agility. Business leaders can now leverage an additional set of capabilities and services, ramp up their pace of innovation, and, with cloud maturity, compete more effectively in their respective markets. But there are certainly challenges ahead. A skilled consulting services partner can play a pivotal role as a trusted advisor in the successful adoption of cloud solutions. Oracle Consulting Services has the expertise and a portfolio of services to help clients succeed on their journey to the cloud.

    Read the article

  • Incremental backup with stsadm site collection backup

    - by TPOL
    I was looking into doing backups with stsadm, and I found that with catastrophic backup we can achieve differential and full backups. But with catastrophic stsadm backup, can we achieve this at the site collection level or the web application level? Basically, I have a web application of which I want to make full and incremental backups. Any suggestions?
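
    A hedged sketch of the two stsadm forms in question; the share, URL, and item name are placeholders. The catastrophic form (-directory) accepts -backupmethod full or differential and can target a single web application with -item, while the site collection form (-url) has no -backupmethod switch and always produces a full backup:

        REM Catastrophic backup of one web application, full then differential.
        REM The -item value must match the node name shown for your web
        REM application in the Central Administration backup tree (assumption here).
        stsadm -o backup -directory \\backupserver\share -backupmethod full -item "SharePoint - 80"
        stsadm -o backup -directory \\backupserver\share -backupmethod differential -item "SharePoint - 80"

        REM Site collection backup: always a full backup.
        stsadm -o backup -url http://myserver/sites/mysite -filename \\backupserver\share\mysite.bak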

    Read the article

  • How to back up non-standard directories in my user profile with Windows Backup?

    - by James Johnston
    I'm using Windows Backup to back up my Win7 Pro laptop. I'd like to use it to back up my complete user profile, but I only see standard profile directories (e.g. C:\Users\JohnstonJ\Documents) in the list. Non-standard ones aren't there (e.g. C:\Users\JohnstonJ\MyCustomDirectory). What's the best way to handle this? The only thing I can think of is to browse under the "Computer" entry, navigate directly to C:\Users\JohnstonJ, and check off the entire profile (to get what's in there, and any new directories that come up). But is that going to back up the profile twice? Will it cause other unforeseen problems, given that I checked it off by navigating through the computer rather than picking it under the "Data Files" category (e.g. backing up temporary file garbage, files-in-use problems, etc., which the "Data Files" category might be handling better)? I'm looking for solutions that other people use, that are known to work well, and that still use the Windows Backup software - I don't really want to fuss with 3rd-party backup software. Example - as you can see, I have two directories in my profile that Windows Backup is not offering to back up: "Dropbox" and "New folder": (Link to images album because I don't have enough reputation to directly embed them: http://imgur.com/a/Xyv5u)

    Read the article

  • How to back up BOINC

    - by Stephen Judge
    I have BOINC Manager installed from the PPA, version 6.10.17, and I am about to upgrade my Ubuntu install with a clean install. I would like to know how I can back up my work done in BOINC so I don't lose what I have already done and have to start from scratch again. For example, I am running the Climate Prediction project and it runs for a year or so; I'm at 30% work done, so I want to back up that 30%. Also, as an addition to this, can someone advise me on the best way to upgrade BOINC when new versions are released on their website but are not available in the PPA yet? I know you can install BOINC anywhere, but I want to install it to the same place the PPA install does, so all my settings and work done are recognised. Thanks in advance.
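
    A minimal sketch of the backup half, assuming the Ubuntu PPA package keeps its working data in /var/lib/boinc-client (the usual location for the packaged client); stopping the client first avoids copying task files mid-write:

        # Stop the client so work-unit files are quiescent, then archive the data dir.
        sudo /etc/init.d/boinc-client stop
        sudo tar -czf ~/boinc-backup-$(date +%F).tar.gz /var/lib/boinc-client
        sudo /etc/init.d/boinc-client start

    After the clean install, restoring is the reverse: stop the client, untar the archive over /var/lib/boinc-client, and start it again. Restoring onto the same BOINC version is safest, since the client state file is version-sensitive.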

    Read the article

  • Backup all home folders on usb disk and accessibility

    - by PatrickV
    I am using Ubuntu 12.04 and have multiple family members working on it, each with their own home folder. I have a USB disk and want to use it to back up my home folders. Trying this, I ran into some questions. When the disk auto-mounts, it is not visible to every user; it seems to be visible only to the user who was logged in when I connected the USB disk. I want to create one folder per home on the USB disk to back up the data to. But when I format the disk as EXT4 or FAT, for example, it is read-only. How can I format the disk so it is accessible to every user? Best Regards, Patrick
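
    A minimal sketch of one way to do this, assuming the USB partition is /dev/sdb1 and it mounts at /media/backup (both are assumptions; check with mount after plugging it in). FAT carries no Unix ownership at all, and a fresh ext4 filesystem is owned by root, which is why it appears read-only to everyone else; one directory per user, owned by that user, fixes it:

        sudo mkfs.ext4 -L backup /dev/sdb1        # destroys existing data on the partition
        sudo mount /dev/sdb1 /media/backup
        for u in patrick alice bob; do            # hypothetical user names
            sudo mkdir -p /media/backup/"$u"
            sudo chown "$u":"$u" /media/backup/"$u"
        done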

    Read the article

  • Backup Windows files using Ubuntu - Unable to find Win partition

    - by Siva
    I am using a Dell laptop with Windows 7; all of a sudden the HDD is not recognized by Win7. I wanted to back up the data in Windows, so I made an Ubuntu 12.04.1 live CD and booted from it. I am using Ubuntu without installing it on my laptop. My problem is that I don't see the Windows partitions in Ubuntu 12.04.1, because of which I am unable to back up the data. Any suggestion in this regard would be very helpful. PS: I checked the SMART status of the HDD; it reports 2 bad sectors, and when I attempted an extended self-test I got a Read Failed message, though the short test goes through fine. Thank You, Siva
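
    A hedged sketch of the usual first steps from the live CD; /dev/sda2 is an assumption, so substitute whichever partition fdisk lists as NTFS. Given the bad sectors, imaging the disk with ddrescue (package gddrescue) onto an external drive before anything else is the cautious route:

        sudo fdisk -l                                        # find the NTFS (Windows) partition
        sudo mkdir -p /mnt/windows
        sudo mount -t ntfs-3g -o ro /dev/sda2 /mnt/windows   # read-only mount for copying files off
        # If the mount fails or reads stall, rescue a raw image of the disk first:
        sudo ddrescue -n /dev/sda /media/external/sda.img /media/external/sda.log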

    Read the article

  • Backup system config files

    - by David ???
    I'm planning on installing nVidia proprietary drivers on my Ubuntu 10.10. Historically this always ends up with me being left with no graphical interface, no ability to revert, and reinstalling the whole system. So now, before trying this anew, I wish to back up all relevant config files. I'll try one of two methods; I'll list each one's commands. I'd appreciate it if anyone can tell me how to back up the relevant files, or what the reverse of each operation is. Thanks, David

    Method I - as described here:
        apt-get --purge remove xserver-xorg-video-nouveau
    As described in this answer: edit /etc/default/grub and add the line GRUB_CMDLINE_LINUX="nouveau.modeset=0", then:
        sudo update-grub
    Reboot, then install the original drivers downloaded from the nVidia site.

    Method II - as described here:
        sudo apt-get purge nvidia*
    [possibly 'sudo gedit /etc/modprobe.d/blacklist.conf', adding 'vga16fb' and 'nouveau']
        sudo apt-get install nvidia-glx-185
        sudo modprobe nvidia
        sudo lsmod | grep -i nvidia
        sudo nvidia-xconfig
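
    A minimal sketch of the backup half, covering the files those two methods touch; the reverse of each edit is restoring the saved copy and re-running the matching command (update-grub after restoring /etc/default/grub):

        BK=~/config-backup-$(date +%F)
        mkdir -p "$BK"
        sudo cp -a /etc/default/grub /etc/modprobe.d/blacklist.conf "$BK"/
        [ -f /etc/X11/xorg.conf ] && sudo cp -a /etc/X11/xorg.conf "$BK"/   # created/overwritten by nvidia-xconfig
        # Restore example: sudo cp -a "$BK"/grub /etc/default/grub && sudo update-grub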

    Read the article

  • How do I take an image/backup of Ubuntu partition and restore to VirtualBox VM

    - by whizkid
    I have Ubuntu 10.04 installed on an older hard disk. I recently bought a new disk and have already installed Windows 7. I don't want to use the older disk anymore, and I would like to keep on using Ubuntu in a virtual machine on the new disk (to avoid the possible mess-ups of dual boot; I found VirtualBox is the best free tool for this). I wish to keep the exact same data, programs, configurations, and settings I had been using in Ubuntu for so long, and avoid the tedious part of having to reconfigure so many things. How do I backup/restore Ubuntu to another disk? I would prefer a free tool for the backup/restore.
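
    A hedged sketch of one route: attach the old disk to the new system (assumed here to appear as /dev/sdb; verify with fdisk -l), image it raw, then convert the image to a VDI that VirtualBox can attach as the VM's boot disk. It needs free space at least the size of the old disk, and the VM may still need GRUB or /etc/fstab adjustments afterwards:

        sudo dd if=/dev/sdb of=ubuntu-disk.img bs=4M conv=noerror,sync
        VBoxManage convertfromraw ubuntu-disk.img ubuntu-disk.vdi --format VDI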

    Read the article

  • Setup CRON weekly backup

    - by sadmicrowave
    I want to make a backup of my /var/lib/mysql and /var/www folders and save them as tar.gz files to my mounted network file server (uslons001). Here is my bash file, located at /etc/cron.weekly/mysqlbackup.sh:

        #!/bin/bash
        mkdir ~/uslons001/`date +%d%m%y`
        tar -czf ~/uslons001/`date +%d%m%y`/mysql.tar.gz /var/lib/mysql
        tar -czf ~/uslons001/`date +%d%m%y`/www.tar.gz /var/www
        tar -czf ~/uslons001/`date +%d%m%y`.tar.gz ~/uslons001/`date +%d%m%y`
        echo Backup Completed `date` >> ~/backuplog

    This works PERFECTLY fine when I execute it in a shell, but when I set up the cron job it never runs, so I'm not setting the cron job up properly. My cron job looks like this:

        30 7 * * fri /etc/cron.weekly/mysqlbackup.sh

    which should execute at 7:30AM every Friday. What am I doing wrong? UPDATE1 - changed the cron job line to the following:

        44 8 * * 5 /etc/cron.weekly/mysqlbackup.sh

    with still no luck. Is there a cron error log file that I can read to help pinpoint where the problem is?
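
    A hedged list of the usual suspects, none of them certain for this system. On Ubuntu, /etc/cron.weekly is executed by run-parts, which silently skips any file whose name contains a dot, so mysqlbackup.sh never fires from that directory; scripts placed there also run on cron.weekly's own schedule, not on a custom day/time line:

        # Either rename the script so run-parts will pick it up weekly...
        sudo mv /etc/cron.weekly/mysqlbackup.sh /etc/cron.weekly/mysqlbackup
        sudo chmod +x /etc/cron.weekly/mysqlbackup
        # ...or keep the custom schedule in a crontab and capture output for diagnosis:
        # 44 8 * * 5 /etc/cron.weekly/mysqlbackup.sh >> /tmp/mysqlbackup.log 2>&1
        # Cron's own activity (including jobs it refused to start) lands in syslog:
        grep CRON /var/log/syslog | tail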

    Read the article

  • Top Three Reasons to Move to the Cloud Before Your Next Upgrade

    - by yaldahhakim
    1) Reduced Cost - During major upgrades, most organizations typically need to replace or invest in extra hardware and other IT resources to support the upgrade. With the cloud, this can become more of an OpEx discussion. The flexibility and scalability of the cloud also allow new business solutions to be set up more quickly, with the ability to scale IT resources to closely map to changing business requirements. This enables more and faster innovation, because you are spending money on core business initiatives instead of setting up complex environments.
    2) Reduced Risk - This is especially true when you are working with a cloud provider that possesses substantial in-house expertise. Oracle Managed Cloud Services has been hosting and managing customers' business applications for over a decade and has helped hundreds of customers upgrade and adopt new technologies faster and better. Customers have access to over 15,000 Oracle experts in operations centers around the world that can work around the clock, and have direct access to Oracle Development to optimize our customers' upgrade experience.
    3) Reduced Downtime - Whether a customer is looking to upgrade their E-Business Suite, PeopleSoft, JD Edwards, or Fusion applications, we've developed standardized best practices and tools across the technology stack to accelerate the upgrade and migration with substantially reduced timelines and risk. And because the process is repeatable, customers stay more current on the latest releases, continuously taking advantage of the newest innovations - without the headache. By leveraging the economies and expertise of scale that belong to Oracle, you can sleep better at night knowing that your next major application upgrade is taken care of. Check out the video of this Managed Cloud Services customer to learn more about their experience.

    Read the article

  • Backup dedicated server running Ubuntu 10.04 and Plesk 11.01 prior to updating OS to Ubuntu 12.04

    - by timmob
    I would like to back up my dedicated server, which is my web server hosting various sites and email, so that I can update the OS to Ubuntu 12.04 and basically restore back to 10.04 if things go wrong. I have a local machine that I can install 12.04 onto, and then I was going to rsync between the two, but I am fairly clueless when it comes to Linux. I can ssh into the remote server and gain root access. Can anyone explain if I need to back up the whole server hard drive or just some of the files? Thanks, Timmo.
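
    A minimal sketch of a file-level safety net, run from the local machine, assuming root SSH access and a stock Plesk layout (the mail and vhosts paths vary between Plesk versions, so verify them on the server before relying on this):

        mkdir -p /backup/server
        rsync -aAXv --numeric-ids root@server:/etc /backup/server/
        rsync -aAXv --numeric-ids root@server:/var/www /backup/server/          # web content
        rsync -aAXv --numeric-ids root@server:/var/qmail /backup/server/        # Plesk mail store (assumption)
        rsync -aAXv --numeric-ids root@server:/var/lib/mysql /backup/server/    # stop MySQL or use mysqldump for consistency

    This captures data and configuration, not a bootable system; rolling back from 12.04 to 10.04 would still mean reinstalling 10.04 and restoring these trees on top.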

    Read the article

  • Windows 7 file-based backup service

    - by Ben Voigt
    I'm looking for a good replacement for Lazy Mirror, since it doesn't support Windows 7 well.
    Pros: One of the things I really loved about Lazy Mirror is that it always maintains a "full" backup, but does so by only copying modified files. As each file was copied, the old version got archived (moved to an out-of-the-way location). So after mirroring ran, there'd be a complete copy of the file system, which could even be booted if necessary. At the same time, extra space on the backup media was used to store as many older versions of files as possible, without wasting space storing multiple copies of the same version. (It seems that with Windows 7 Backup, there'd be wasted space storing the same data in both the system image and the file backup.) It was completely file-based, but also aware of the registry (it had a feature to dump the live registry to hive files in the correct format). The backups were normal NTFS filesystems; no special tool was needed to read them. It automatically cleaned out the oldest previous versions when space ran out (unlike Windows 7 Backup, which apparently simply starts failing when the backup media fills). It copied all file attributes including security.
    Cons: It doesn't deal well with junction points, symbolic links, and hard links. It didn't run as a service without lots of help from firesrv or srvany, and then you couldn't interact with the GUI. Running as a service was necessary to be able to mirror protected OS files. It didn't have open-file handling, except for registry hives. I guess that the file-by-file archive and replacement could leave mismatched sets of files if the mirror was interrupted. This would be the advantage of incremental backup techniques that require the old full backup plus all intermediate incremental backups to restore. But I don't see this as presenting much of a problem: you'd really only have a boot failure if you had a mixture of pre- and post-service-pack files, and I can run a full image backup using another tool before applying a service pack.
    Does anyone know of a tool that does both full-system backup and storage of old versions of files like Lazy Mirror did (without storing the same data multiple times), and can also run as a service in Windows 7? Free is best of course, but a reasonably priced paid program would be fine too. (It would be absolutely awesome if it also triggered a backup/mirror pass when a particular external drive was plugged in and generated popup warnings if backups hadn't been run recently.)

    Read the article

  • Share Your Top 30 Visited Domains with Visitation Cloud for Firefox

    - by Asian Angel
    Curious about the domains that you visit most, or perhaps you want a way to share that information on a social website? Now you can see and share the 30 most visited domains in your browser's history with the Visitation Cloud extension.
    Accessing Visitation Cloud
    As soon as you install the extension you can get started using it. Depending on how your browser's UI is set up, there are three methods for accessing Visitation Cloud: a "Visitation Cloud Button" inserted at the end of your "Bookmarks Toolbar", a menu listing in the "Tools Menu", and a "Toolbar Button" (not shown here).
    Visitation Cloud in Action
    As soon as you activate Visitation Cloud, a new window will appear with your top domains displayed in a cloud format. Keep in mind that this is more than just a static image… each listing is actually a clickable link. Clicking on any of the listings will open that domain in a new tab or window, depending on your particular browser settings. If you feel that you have a great set of links and want to share it with your friends, that is easy to do. Right-click anywhere within the Visitation Cloud window and select "Save as…". The "cloud image" can be saved in .png, .jpg, or Scalable Vector Graphics (.svg) format. For our example we chose the .svg format. Perhaps you love the set of links but not the layout… right-click and select "Randomize" to change how the cloud looks. Here is our cloud after being "Randomized". Things definitely got moved around…
    Accessing the Visitation Cloud Image in other Browsers
    Once you have your "cloud image" saved, you can share it with friends or save it for your own future use in other browsers. Here is our "cloud image" open in Opera with link opening in progress. The same "cloud image" open in Google Chrome. Very nice…
    Conclusion
    While this may not be something that everyone will use, Visitation Cloud does make for a rather unique, interesting, and fun way to access and share your most visited domains.
    Links
    Download the Visitation Cloud extension (Mozilla Add-ons)

    Read the article

  • Need Suggestions on Backup Strategies and Alternatives?

    - by Leejo
    I'm not sure where else to post this question since it is not exactly code- or development-related... but I know Stack Overflow is very responsive to questions... Currently, I use Mozy Home to perform an online backup of my laptop. So far, this works well, since I only have one laptop that needs to be backed up. But this may soon change, and I want to explore alternatives to performing an online backup on all machines. Ideally, I want to set up a network computer (laptop/desktop) with enough storage to hold the backups for all the other machines I would have. Each machine should be responsible for performing its backup (to the network computer). This would require a capability like Mozy's incremental backup strategy, but instead of an online backup, I would prefer it to be done locally to the network computer. Can you recommend local backup software (backup to a network PC, incremental backup, good restore options)? I'm also looking for any ideas on a local backup strategy, even if it's different from what I've stated. What works and what doesn't? Thanks in advance for your help!
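
    A minimal sketch of what such a local scheme can look like with rsync over SSH, using --link-dest so each run is a browsable full tree while unchanged files are hard links into the previous run (only changed files cost space and bandwidth). "backupbox" and the paths are assumptions:

        # One-time setup on the backup machine: ssh backupbox mkdir -p /backups/laptop
        PREV=$(ssh backupbox 'ls -1d /backups/laptop/* 2>/dev/null | tail -n 1')
        rsync -a --delete ${PREV:+--link-dest="$PREV"} \
            ~/ backupbox:/backups/laptop/$(date +%F)/

    Restoring is just a copy back from whichever dated tree you want, which covers the "good restore options" requirement in a bare-bones way.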

    Read the article

  • TFS 2012: Backup Plan Fails with empty log file

    - by Vitor
    I have a Team Foundation Server 2012 installation with Power Tools, and I defined a backup plan using the wizard found in the "Database Backup Tools" in the Team Foundation Server Administration Console. I set the backup plan to do a full database backup on Sunday mornings, to another server in the network. I followed the wizard with no problems and the backup plan was set successfully. However, when the backup runs it returns Error as its result, and when I go to the log file I only get the header and no further info:

        [Info @01:00:01.078] ====================================================================
        [Info @01:00:01.078] Team Foundation Server Administration Log
        [Info @01:00:01.078] Version  : 11.0.50727.1
        [Info @01:00:01.078] DateTime : 11/25/2012 02:00:01
        [Info @01:00:01.078] Type     : Full Backup Activity
        [Info @01:00:01.078] User     : <backup user>
        [Info @01:00:01.078] Machine  : <TFS Server>
        [Info @01:00:01.078] System   : Microsoft Windows NT 6.2.9200.0 (AMD64)
        [Info @01:00:01.078] ====================================================================

    I can imagine it's a permission problem, but I have no idea where to start... Can anyone help? Thank you for your time!
    EDIT: I'm not sure if it is related, but I logged in as the backup user on the TFS server and there was a crash window open saying "TFS Power Tool Shell Extension (TfsComProviderSvr) has stopped working". The full crash log is here:

        Problem signature:
        Problem Event Name:       APPCRASH
        Application Name:         TfsComProviderSvr.exe
        Application Version:      11.0.50727.0
        Application Timestamp:    5050cd2a
        Fault Module Name:        StackHash_e8da
        Fault Module Version:     6.2.9200.16420
        Fault Module Timestamp:   505aaa82
        Exception Code:           c0000374
        Exception Offset:         PCH_72_FROM_ntdll+0x00040DA8
        OS Version:               6.2.9200.2.0.0.272.7
        Locale ID:                1043
        Additional Information 1: e8da
        Additional Information 2: e8dac447e1089515a72386afa6746972
        Additional Information 3: d903
        Additional Information 4: d9036f986c69f4492a70e4cf004fb44d

    Does it help? Thanks everyone!

    Read the article

  • Cloud Evolving, SQL Server Responding

    - by KKline
    Brent Ozar ( blog | twitter ) and I did an interview with TechTarget's Brendan Cournoyer at last summer's Tech-Ed, which has turned into a podcast titled "Cloud efforts advance, SQL Server evolves." The podcast covers all the major trends at the conference (like BI), virtualization features in Quest's products (like Spotlight), Brent's new book and MCM certification, and more. Here's a link to hear it, appearing on 6/11/10: http://searchsqlserver.techtarget.com/podcast/Cloud-efforts-advance-SQL-Server-evolves....(read more)

    Read the article

  • Desktop Fun: Cloud Chaser Wallpaper Collection Series 2

    - by Asian Angel
    Last year we shared a wonderful collection of cloud wallpapers with you and today we are back with more cloudy goodness. Float away with the clouds on your desktop with the second in our series of Cloud Chaser Wallpaper collections.

    Read the article

  • selective backup script in bash

    - by Sake
    Hi, I've been using this simple command (that's all I can do :)) to back up the whole tree of my user data on a NAS server for a year:

        cp -r /STORAGE /BACKUP-STORAGE/YYYY-MM-DD

    Unfortunately, after a year of service, my users started filling the space with lots of photos and clipart (jpg, gif, bmp), and that started to make my backup process much slower. Space is also a big issue: I no longer have enough room for a week-long daily backup set. I think I want to change from backing up everything to backing up only non-image data. How can I exclude jpg, gif, and bmp files from the backup? It's quite easy with the DOS XCOPY command, but I really have no idea how to do that in bash. Thanks
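
    A minimal sketch using rsync in place of cp, since rsync has the exclude support that cp lacks; the patterns are ordinary globs in the XCOPY spirit (add upper-case variants if your users' cameras write .JPG):

        rsync -r --exclude='*.jpg' --exclude='*.gif' --exclude='*.bmp' \
              /STORAGE /BACKUP-STORAGE/$(date +%Y-%m-%d)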

    Read the article

  • Duplication of Windows 7 Backup

    - by Steven Pickles
    I use the built-in backup utility for Windows 7 because it's automated and flexible enough to allow me to schedule a daily shadow-copy backup of particular files and folders directly to a separate internal RAID 0 array (2 x 1TB). It's also lightweight and stays out of the way. For off-site backup purposes, each week I copy the contents of the internal backup from the RAID 0 array to an external 1 TB drive, and then move this drive to a different building. The copy from the internal backup to the external backup typically works like this: mount and erase the contents of the external drive; highlight the backup folder on the internal drive and hit CTRL+C; CTRL+V on the root directory of the external drive. Is there a better way to synchronize? Microsoft's SyncToy application does a pitiful job, and often leaves the folders not truly synchronized... which completely defeats the ability to use the backup's restore feature.
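
    A minimal sketch of a one-line replacement for the manual erase-and-paste, assuming the internal backup lives under D:\Backup and the external drive is E: (both assumptions). /MIR mirrors the tree, copying only what changed and deleting leftovers, and /COPYALL preserves attributes and ACLs; run it from an elevated prompt, since /COPYALL needs backup privileges:

        robocopy D:\Backup E:\Backup /MIR /COPYALL /R:1 /W:1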

    Read the article

  • Secure, efficient, version-preserving, filename-hiding backup implemented in this way?

    - by barrycarter
    I tried writing a "perfect" backup program (below), but ran into problems (also below). Is there an efficient/working version of this?
    Assumptions: you're backing up from 'local', which you own and which has limited disk space, to 'remote', which has infinite disk space and belongs to someone else, so you need encryption. Network bandwidth is finite. 'local' keeps a db of backed-up files with this data for each file: filename (including full path), the file's last modified time (mtime), the sha1sum of the file's unencrypted contents, and the sha1sum of the file's encrypted contents.
    Given a list of files to back up (some perhaps already backed up), the program runs 'find' and gets the full path/mtime for each file (this is fairly efficient; conversely, computing the sha1sum of each file would NOT be efficient). The program discards files whose filename and mtime are in the 'local' db. The program then computes the sha1sum of the (unencrypted) contents of each remaining file. If the sha1sum matches one in the 'local' db, we create a special entry in the 'local' db that points this file/mtime to the file/mtime of the existing entry. Effectively, we're saying "we have a backup of this file's contents, but under another filename, so no need to back it up again". For each remaining file, we encrypt the file, take the sha1sum of the encrypted file's contents, and rsync the file to a path derived from its sha1sum. Example: if the file's encrypted sha1sum was da39a3ee5e6b4b0d3255bfef95601890afd80709, we'd rsync it to /some/path/da/39/a3/da39a3ee5e6b4b0d3255bfef95601890afd80709 on 'remote'. Once the step above succeeds, we add the file to the 'local' db. Note that we efficiently avoid computing sha1sums and encrypting unless absolutely necessary. Note: I don't specify the encryption method; this would be the user's choice.
    The problems: We must encrypt and back up the 'local' db regularly. However, the 'local' db grows quickly, and rsync'ing encrypted files is inefficient, since a small change in the 'local' db means a big change in the encrypted version of the 'local' db. We create a file on 'remote' for each file on 'local', which is ugly and excessive. We query the 'local' db frequently; even with indexes, these queries are slow, since we're often making one query per file. It would be nice to speed this up by batching queries or something. And probably other problems that I've now forgotten.
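
    A minimal sketch of the encrypt-hash-upload step, with openssl standing in for the user-chosen cipher and a passphrase file at ~/.backup-pass; every name here is an assumption. It also includes the ssh mkdir that rsync needs, since rsync won't create nested remote parent directories on its own:

        f=/path/to/somefile                         # hypothetical input file
        enc=$(mktemp)
        openssl enc -aes-256-cbc -salt -pass file:"$HOME/.backup-pass" -in "$f" -out "$enc"
        sum=$(sha1sum "$enc" | awk '{print $1}')
        ssh remote mkdir -p "/some/path/${sum:0:2}/${sum:2:2}/${sum:4:2}"
        rsync "$enc" remote:"/some/path/${sum:0:2}/${sum:2:2}/${sum:4:2}/$sum"
        rm -f "$enc"

    One caveat that bears on the design: openssl's random salt makes the ciphertext, and therefore its sha1sum, differ on every run, so the encrypted-contents sha1sum in the db is only stable if the encryption is deterministic.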

    Read the article

  • How to Backup Your Web-Based Email Account Using Thunderbird

    - by Jason Fitzpatrick
    If the Gmail scare earlier this week has you thinking about backing up your Gmail or other web-based email account, we're here to help. Read on to learn how to back up your web-based email using the open-source email application Thunderbird. In case you missed it, earlier this week Gmail suffered an unusual series of glitches that led to 0.02% of Gmail users finding their inboxes totally empty. The good news is that the glitch was fixed and no actual data was lost (they restored the missing email from tape backups that were unaffected by the issue). While it's wonderful that nobody lost any important emails, it's also very unsettling; not every "Oops, we lost your data!" scenario ends so well. Today we're going to walk you through backing up your email using the free and robust open-source application Thunderbird.

    Read the article

  • Help with a simple incremental backup script

    - by Evan
    I'd like to run the following incomplete script weekly as a cron job to back up my home directory to an external drive mounted as /mnt/backups:

        #!/bin/bash
        #
        TIMEDATE=$(date +%b-%d-%Y-%k:%M)
        LASTBACKUP=pathToDirWithLastBackup
        rsync -avr --numeric-ids --link-dest=$LASTBACKUP /home/myfiles /mnt/backups/myfiles$TIMEDATE

    My first question is: how do I correctly set LASTBACKUP to the most recently created directory in /mnt/backups? Secondly, I'm under the impression that using --link-dest means that files in previous backups will not be copied in later backups if they still exist, but will rather link back to the originally copied files? However, I don't want to retain old files forever. What would be the best way to remove all the backups before a certain date, without losing files that newer backups may link to in those older backups? Basically, I'm looking to merge all the files before a certain date up to that date, if that makes more sense than the way I initially framed the question :). Can --link-dest create hard links, and if so, would deleting previous directories not actually remove linked files? Finally, I'd like to add a line to my script that compresses each newly created backup folder (/mnt/backups/myfiles$TIMEDATE). Based on reading this question, I was wondering if I could just use the line

        gzip --rsyncable /backups/myfiles$TIMEDATE

    after I run rsync, so that sequential rsync --link-dest executions would find already copied and compressed files? I know that's a lot, so many thanks in advance for your help!
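
    On the hard-link question: --link-dest does create hard links, so deleting an old backup directory only removes names; any file still linked from a newer backup keeps its data until the last link is gone, which means old directories can simply be deleted. For the first question, a minimal sketch (assuming only this script writes to /mnt/backups, so newest mtime means last backup):

        LASTBACKUP=$(ls -1dt /mnt/backups/myfiles* 2>/dev/null | head -n 1)
        rsync -avr --numeric-ids ${LASTBACKUP:+--link-dest="$LASTBACKUP"} \
            /home/myfiles "/mnt/backups/myfiles$TIMEDATE"

    The ${LASTBACKUP:+...} guard skips --link-dest on the very first run, when there is nothing to link against. Note that compressing a finished backup folder would defeat this scheme: rsync compares against the uncompressed trees named by --link-dest, so compressed folders would no longer serve as link targets.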

    Read the article

  • Update from Ola Hallengren: Target multiple devices during SQL Server backup

    - by Greg Low
    Ola has produced another update of his database management scripts. If you haven't taken a look at them, you should. At the very least, they'll give you good ideas about what to implement and how others have done so. The latest update allows targeting multiple devices during backup. This is available in native SQL Server backup and can be helpful with very large databases. Ola's scripts now support it as well. Details are here:
    http://ola.hallengren.com/sql-server-backup.html
    http://ola.hallengren.com/versions.html
    The following example shows it backing up to 4 files on 4 drives, one file on each drive:

        EXECUTE dbo.DatabaseBackup
        @Databases = 'USER_DATABASES',
        @Directory = 'C:\Backup, D:\Backup, E:\Backup, F:\Backup',
        @BackupType = 'FULL',
        @Compress = 'Y',
        @NumberOfFiles = 4

    And this example shows backing up to 16 files on 4 drives, 4 files on each drive:

        EXECUTE dbo.DatabaseBackup
        @Databases = 'USER_DATABASES',
        @Directory = 'C:\Backup, D:\Backup, E:\Backup, F:\Backup',
        @BackupType = 'FULL',
        @Compress = 'Y',
        @NumberOfFiles = 16

    Ola mentioned that you can now back up to up to 64 drives.

    Read the article

< Previous Page | 7 8 9 10 11 12 13 14 15 16 17 18  | Next Page >