Search Results

Search found 14173 results on 567 pages for 'online backup'.

Page 89/567 | < Previous Page | 85 86 87 88 89 90 91 92 93 94 95 96  | Next Page >

  • Online payment service recommendation?

    - by Shadowman
    We're currently in the process of looking for an online payment service that will allow us to accept credit cards, etc. However, our business model also involves revenue sharing in a model similar to that of iTunes: content creators will be able to sell content through our site, and we take a small percentage of the revenue. Can anyone recommend an online payment service that supports this model? We're also interested in: accepting all major credit cards; handling international transactions in the appropriate local currency; and recurring transactions (monthly, yearly, etc.). Additionally, if the service provided a Java API for integration or the ability to broker PayPal transactions, that would be an added bonus. I know Amazon provides a hosted payment service, but I'd prefer not to require all of our customers to have an Amazon account. That is an additional barrier to entry we'd prefer to avoid. I'd appreciate any recommendations you can provide!

    Read the article

  • s3fs Input/output error

    - by shadow_of__soul
    I'm trying to set up a backup system with s3fs and the Amazon S3 service. I followed these two guides: http://qugstart.com/blog/linux/how-to-mount-an-amazon-s3-bucket-as-virtual-drive-on-centos-5-2/ http://blog.eberly.org/2008/10/27/how-i-automated-my-backups-to-amazon-s3-using-rsync/ Anyway, tailing /var/log/messages I get: Aug 28 13:37:46 server s3fs:###response=403 I already tried creating the authentication file at /etc/passwd-s3fs with the access and secret keys in it, and passing them through the command line; I checked the credentials several times, and they work with S3Fox. I also set the machine's time (with the date command) to match the Amazon S3 servers (I got the S3 server's time by uploading a file with the file manager). Not only does rsync not work; commands like ls or cp in /mnt/s3 don't work either. Any help on how I can solve or debug this? Regards, Shadow.
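
    For reference, a minimal sketch of a typical s3fs setup that may help isolate where the 403 comes from; the bucket name, keys and mount point below are placeholders, and ntpdate is included only because S3 also returns 403-class errors when the client clock drifts too far:

        # credentials file: one line in the form accessKeyId:secretAccessKey (placeholders)
        echo 'AKIAXXXXXXXXXXXX:YOUR_SECRET_KEY_HERE' > /etc/passwd-s3fs
        chmod 600 /etc/passwd-s3fs

        # re-sync the clock; a skewed timestamp makes S3 reject requests
        ntpdate pool.ntp.org

        # mount the bucket and test with plain ls before trying rsync
        s3fs mybucket /mnt/s3 -o allow_other
        ls /mnt/s3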

    Read the article

  • Remotely sync Time Machine drives

    - by Off Rhoden
    I have an Xserve that runs Time Machine to a local terabyte drive. I also connected my external terabyte drive for a time and had Time Machine use it to establish the seed data. I plan to take my drive back home with me (out of state) and have the Xserve return to using its local drive for Time Machine. But when I get back home, is there a way to keep my external drive's copy of the Time Machine Backups folder in sync with the Backups folder back on the Xserve? I want a full copy of the history (it makes an awesome remote backup). I've thought of using the Unix command rsync. In fact, that's how I had been doing it, but I was after the compactness that Time Machine is able to achieve. Thanks.
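
    For the rsync route, a minimal sketch of what pulling the history down might look like; the hostname and volume names are placeholders, and the -H (hard links), -A (ACLs) and -X (extended attributes) flags assume a reasonably recent rsync build rather than the stock Apple one:

        # pull the Xserve's backup history onto the locally attached drive;
        # hard links, ACLs and extended attributes all matter to the
        # Backups.backupdb structure, hence -aHAX
        rsync -aHAX --delete \
          admin@xserve.example.com:/Volumes/TimeMachine/Backups.backupdb/ \
          /Volumes/ExternalTM/Backups.backupdb/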

    Read the article

  • DPM 2010 RC Mailbox Recovery Fails

    - by ITGuy24
    I am testing DPM 2010 with Exchange 2010. I am attempting to restore a single mailbox from a previous backup to a Recovery Database. I created and mounted the Recovery Database on the Exchange 2010 server and set the overwrite property. When I run the restore for the mailbox and point it to the recovery database, I get the following error: The recovery jobs for Exchange Mailbox Database MailboxDatabase01 that started at with the destination of EXCHANGE2010.domain.com, have completed. Most or all jobs failed to recover the requested data. (ID 3111) DPM encountered an error while performing an operation for E:\DatabaseFiles\MailboxDatabase01.edb on EXCHANGE2010.domain.com (ID 2033 Details: The process cannot access the file because it is being used by another process (0x80070020)) MailboxDatabase01 is one of our MDBs, not the RDB I set up for the recovery. I am confused as to why it is even trying to access this, as I have triple-checked that the recovery points to the RDB. Any idea what I am doing wrong?

    Read the article

  • CentOS live CD of current installation

    - by mplacona
    I'm trying to create a live CD of my current CentOS installation. I want it to be almost like a backup: whenever I want to copy my current installation to another computer, I would simply install from my custom live CD. I know this is possible, and I found some resources on the net, but they all seem to create only a minimal version of CentOS, and I want all of the functionality I have at present, including all of my development tools and my Apache and Samba settings, etc. I did this (on Debian, though) a few years ago, but can't remember how. Could anyone please shed some light on this? Thanks in advance

    Read the article

  • Need to make a scheduled task run as another user but keep the current user’s environment

    - by Chad Marmon
    I need to back up users' .pst files. The current method I am trying is making a shadow copy using Diskshadow. My script works great, but Diskshadow needs to be run as administrator while also retaining the logged-on user's environment variables; specifically, the %USERNAME% and %HOMESHARE% variables, so the right user's files get copied up to the right network location. I have for the most part got this to work, but there's no straightforward (or secure, at least) way to pass the password. If I set up a scheduled task to run the script as a domain user with local admin privs, the environment variables get lost. I need to run this script automatically, with no user interaction. If I could figure out how to make a scheduled task run as another user but keep the current user's environment, I think this would work, but I've been beating my head against that for a while now, without any luck.

    Read the article

  • Moving files fails due to privileges but they seem to be OK

    - by joaoc
    I am trying to copy old files from an OS X 10.5.8 machine to a new external HD. When trying to copy a folder I get the message: The operation cannot be completed because you do not have sufficient privileges for some of the items. I've checked the privileges and they seem OK (read & write for me). What is curious is that the folder is created empty on the target drive, and I can then copy the contents from inside the original folder to inside the new folder without changing anything else. This happens with several folders, but not all, and it is making backup a pain. I have to figure out which folder broke the copy, copy its contents to the external disk, and then select what hasn't been copied yet and copy that (and eventually stop at some point and repeat the whole process).

    Read the article

  • Is it possible to stream input into RAR?

    - by Dscoduc
    I'm using RARLAB's RAR.exe to archive/back up my server data. I am familiar with using RAR to create an archive and add files from a folder, but what about streaming data directly into an archive? For example, when backing up my MySQL databases I use the mysqldump command and pipe the output into a text file. It would be nice to skip the file step and go directly into an archive file using something like the following syntax: mysqldump -uUserName -pPassword --all-databases > rar.exe newarchivename.rar Does anyone know if what I have described, or something similar, is even possible?
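
    RAR's -si switch reads the archived data from standard input, which fits this case; a minimal sketch, with the credentials, archive name and stored file name as placeholders:

        # pipe the dump straight into RAR; the data is stored inside the
        # archive under the name given after -si (here alldatabases.sql)
        mysqldump -uUserName -pPassword --all-databases \
          | rar a -sialldatabases.sql backup.rar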

    Read the article

  • Mysqldump causes "Too many connections"

    - by vbachev
    A scheduled backup using mysqldump on one of our databases is causing "Too many connections" errors. The database has both InnoDB and MyISAM tables and is around 500 MB in size. The "Too many connections" error appears for about 2-3 minutes. We understand that mysqldump locks the tables and causes all other queries and connections to pile up and jam the MySQL server. We need frequent backups, and we cannot afford server downtime or putting websites in maintenance mode while doing them. Our websites are global and traffic is high all the time, so it's hard to find a quiet moment for backups. How can we avoid downtime during backups? Is there maybe a way to use mysqldump so that it does not lock all tables at the same time? Is there an alternative to backing up with mysqldump?
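
    One option worth noting for the InnoDB tables is mysqldump's --single-transaction flag, which dumps from a consistent snapshot without holding table locks (it does not help the MyISAM tables, which are still locked while they are dumped). A minimal sketch, with credentials and paths as placeholders:

        # --single-transaction: consistent InnoDB snapshot without table locks
        # --quick: stream rows instead of buffering whole tables in memory
        mysqldump -uUserName -pPassword --single-transaction --quick --all-databases \
          | gzip > /backups/all-databases.sql.gz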

    Read the article

  • Can DPM 2007 back up Active Directory?

    - by rbeier
    We're installing Microsoft Data Protection Manager 2007 - we'll be using it to back up Exchange and SQL Server among other things. Does anyone know if DPM can also back up Active Directory? It sounds like the answer is "not really". You can install the DPM agent on a domain controller and make system state backups. But if your Active Directory is out of commission, there will be no way to restore the backups, since DPM depends on AD. Currently we're just using Windows Backup (ntbackup) to take system state backups on one of the DCs. Should we just continue with that? Thanks, Richard

    Read the article

  • Ultrium 3 tape drive shoe-shining, 3Mb/s: and it's not the cable

    - by mowsala
    I have an HP 960 Ultrium 3 tape drive. Since I got it (second-hand, £90), I've been experiencing shoe-shining. Writing with tar in Linux, I average about 3Mb/s write speed. I've tried replacing both the SCSI card and the cable now, and neither made any difference at all. A curious observation I have made is that the write rate is not consistent. Sometimes it will write for over a minute without shoe-shining, but more often just a few seconds. I've also tried several tapes, different source drives, and even writing from Windows Backup, to no avail.

    Read the article

  • VSS Not Creating Shadow Set

    - by Jeff Leyser
    I'm trying to set up backup scripts on WinXP to use Volume Shadow Sets. I downloaded the VSS 7.2 SDK from MSFT, and used the included vshadow.exe to create a shadow set: vshadow -script=vss-setvar.cmd f: (note that I've tried both f: and c:) vshadow executes just fine, giving no errors and reporting that the shadow is created. However, executing vshadow -q as the very next command results in "There are no shadows on the system" and, indeed, if I use dosdev to try to map the shadow set named in vss-setvar.cmd, it will not work. Am I missing a step?

    Read the article

  • Is there a way to import email from the raw email files?

    - by Chris Schmitz
    I have a client who recently switched hosts. When they switched hosts they didn't back up their email before updating their configuration settings, so they lost everything. However, I was able to log in to their old hosting control panel and download their mail folder. I am wondering if there is a way to extract their emails and/or contacts from the files. I'm not sure what type of files they are (there is no extension), but the directory is structured like this: mail/ .Drafts/ .Sent/ .Trash/ cur/ new/ theirdomain.com/ tmp/ [email protected] maildir Inside of the theirdomain.com folder there is a folder for each account, and inside of that is a folder called "cur" which has a whole bunch of files with names like 1292945327.H169813P25958.uscentral21.myserverhosts.com,S=10117/2,S and if I preview those files I can see the actual email messages inside them, but I have no idea how to get that information from those files into an email client. Anyone know of a way to work with these files? Thanks in advance for any insight you can share!
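
    The cur/new/tmp layout is the standard Maildir format, in which each file is one complete message. As an illustration only (assuming procmail's formail utility is available, and with the account path as a placeholder), the messages can be concatenated into a single mbox file that most desktop mail clients can import:

        # each Maildir file holds one message; formail with no options forces
        # it into mbox format, so appending them all yields an importable mailbox
        for f in mail/theirdomain.com/someuser/cur/*; do
            formail < "$f" >> someuser.mbox
        done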

    Read the article

  • Are whole VM images backed up on Amazon EC2/S3?

    - by John
    I've been trying to get my head around Amazon Web Services as a VPS provider. My understanding is that an EC2 instance running Windows is basically a Windows VM, very similar to renting a VPS from a more traditional hosting provider. I don't want complex backups, either to administer or to restore; if my restore involves installing SVN, MySQL, Jira, etc. on a new box before I can even try to restore the backup, then it's not great for me. What I really want is a service that backs up my entire VM: if the PC running the VPS dies, then the VM image is installed on a new PC and off we go again. With Amazon being all about flexibility and elasticity, I wondered whether they have this service. I can't figure it out from reading their docs.

    Read the article

  • Best way to Duplicate a Laptop's Hard Drive One-to-One

    - by Urda
    I have a Lenovo X61 Tablet computer with a plain SATA drive inside. I have Windows 7 and Ubuntu 9.10 dual-booting on the computer. I want to back up both of these OSes and their special partitions (Windows 7 has one, and of course there is the Linux swap). I want a one-to-one backup; all of my mission-critical data is already backed up, but I would like to get a snapshot and store it on a larger file server at home for quick recovery. What is the best approach to do this?
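
    As one illustration of a one-to-one approach (not the only option): boot the laptop from a live CD so neither OS has the disk mounted, then stream a raw image of the whole drive to the file server over SSH. The device name, server and paths below are placeholders:

        # image the entire disk, including both OS partitions and swap
        dd if=/dev/sda bs=4M conv=sync,noerror \
          | gzip -c \
          | ssh user@fileserver 'cat > /backups/x61-full-disk.img.gz'

        # restoring later reverses the pipe (again from a live CD):
        # ssh user@fileserver 'cat /backups/x61-full-disk.img.gz' | gunzip | dd of=/dev/sda bs=4M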

    Read the article

  • Concurrent backups in SQL Server?

    - by Mikey Cee
    We currently have our backups managed by a third-party company. There are a bunch of agent jobs created that take full backups (4 times a day) and transaction log backups (4 times an hour). We now want to manage our backups in house, but we don't want to disable the third party's jobs until we are sure that we have everything configured correctly internally. So I am proposing a short period (say, a couple of days) during which backups are taken both by the old and the new system. I am wondering what the ramifications are of having these two different systems both manage backups, and what the potential pitfalls are of having backups taken simultaneously. Is this even supported? If so, and bearing in mind that the system can cope with one backup without any noticeable performance degradation, is it fairly logical to assume that it should be able to cope with two simultaneous backups? Currently the load on the server is fairly light and it rarely struggles. Any advice is appreciated.

    Read the article

  • How to connect to WPA2 encrypted wireless-network when booted from CloneZilla Live-CD?

    - by caligula
    My intention is to perform a backup of my laptop's (Dell Vostro 3350) sda1 partition to my desktop. After some googling I decided to use CloneZilla for that purpose. I have an OpenSSH server installed and running on my desktop. So I inserted the CloneZilla CD into the CD-ROM drive, booted from it, then chose an option something like "use ssh server to store image". Then I got an invitation to choose a network interface. I chose wlan0 and entered the shell to manually configure the connection. And that's where I got into trouble, because the Wi-Fi network I want to use is WPA2-encrypted, and I don't know how to connect to it from the command line. Can somebody assist me? Thanks in advance.
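
    A minimal sketch of connecting from the live environment's shell, assuming it ships wpa_supplicant and dhclient (the SSID and passphrase are placeholders):

        # generate a wpa_supplicant configuration from the network name and passphrase
        wpa_passphrase "MyHomeSSID" "my-secret-passphrase" > /tmp/wpa.conf

        # associate with the access point in the background, then request an IP address
        wpa_supplicant -B -i wlan0 -c /tmp/wpa.conf
        dhclient wlan0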

    Read the article

  • Are Plesk server backups useful?

    - by Michael T. Smith
    I'm working for a startup now, and I'm the programmer. Because of our small team size, I'm also handling the server management for now (until we get a dedicated server administrator). I've never used Plesk before, and the server we're using (a Media Temple Dedicated Virtual server) had it installed when I got here. One of my first jobs was to set up backups: Plesk was already running its nightly server-wide backups. I created a small script to dump the web app, its DBs and any assets, tar them, store them, and then copy them to another small server we have (to back up the backups). But we're constantly running into hard drive space issues because of the Plesk backups, and I'm wondering: are they useful? If I have the web app and all of its assets, I could easily enough get another server up and running. Do we need to keep running Plesk's backups? Thoughts?

    Read the article

  • Should I be running my scheduled backups as SYSTEM or as our domain admin?

    - by MetalSearGolid
    I have a daily backup which is scheduled through the Task Scheduler. It failed with a strange error code last night, but I was able to search and find a blog post on how to avoid the error in the future. However, one of its recommendations was to run the backups as the domain's Administrator user. Since all of the files being backed up are local to this system, should I continue to have the backups run as SYSTEM? Or is it actually better to run them as a different user? I have been running these backups for well over a year now and have only had a handful of failures, but ironically, when it does fail, the error code means it was a permissions issue (or so I read; this code seems to be undocumented by Microsoft). Thanks in advance for any insight into this. Might as well post the error code here too, in case anyone would like to share their insight on it as well, but I rarely ever get this error, so I don't care too much about it: 4294967294

    Read the article

  • How to give a Linux user permission to create backups, but not permission to delete them?

    - by ChocoDeveloper
    I want to set up automated backups that are kept safe from myself (in case a virus pwns me). The problem is that the "create" and "delete" permissions are the same thing: write permission. So what can I do about it? Is it possible to decouple the create and delete permissions? Another option could be to let the user "root" make the backups. The problem is that my home directory is encrypted, and I don't want to back up everything. Any ideas? For the backups I'm using Deja Dup, which is installed by default in Fedora and Ubuntu.
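
    As one illustration (not from the question itself, and only one of several possible approaches): finished backups can be marked append-only with chattr, so the unprivileged account can still write new backup files but cannot delete, rename or truncate old ones. A minimal sketch, with the path as a placeholder:

        # run as root after each backup completes
        chattr +a /backups/home-backup.tar.gz   # append-only: non-root users can no longer
                                                # delete, rename or truncate the file
        lsattr /backups/home-backup.tar.gz      # verify that the attribute is set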

    Read the article

  • How to automatically mount a LUKS partition only when the disk is plugged in

    - by Frederick Roth
    I have the following scenario: I want to automatically back up some data from my laptop (Fedora 17) to an external encrypted (LUKS) hard disk. The disk can be opened with a key file, which lies on the (also encrypted) root partition of my laptop. The hard disk is attached to my docking station and is therefore only "present" when I am at home (which is roughly half of the time the laptop runs). I have everything set up the way I want it, with one exception: I can't find a decent way to mount the hard disk automatically at boot if and only if it is present. If I add it to crypttab and fstab without noauto, it tries to mount it at boot, which costs a lot(!) of time and produces error messages when the disk is not present. If I add noauto, well, it does not mount automatically ;) Is there a way to configure LUKS/crypttab to do the following: check whether the disk is present; if yes, decrypt and mount; if no, just don't?
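
    One way to illustrate the "check first, then mount" idea (not from the question itself) is a small script run at boot or from a udev rule; a minimal sketch, with the partition UUID, key file and mount point below as placeholders:

        #!/bin/sh
        # placeholders: the LUKS partition's UUID, the key file on the encrypted
        # root, and the mount point for the backup disk
        DEV=/dev/disk/by-uuid/REPLACE-WITH-REAL-UUID
        KEY=/root/keys/backup-disk.key
        MNT=/mnt/backup

        if [ -b "$DEV" ]; then                      # disk is present (laptop is docked)
            cryptsetup luksOpen --key-file "$KEY" "$DEV" backupdisk
            mount /dev/mapper/backupdisk "$MNT"
        fi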

    Read the article

  • Rolling Back Microsoft CRM during testing

    - by npeterson
    Process-related question: we currently have a multi-tenant installation of MS CRM 4.0 on three servers: Dev, Test, and Live. We are actively working on customizing one of the tenants, but the others are static. During user testing, we often find it necessary to 'start fresh' in one of the tenants. Is it better to try to delete the changes from the tenant (created accounts, leads, etc.), or to just revert the database to a backup from before the testing started? Are there compelling reasons why bulk delete is not advisable for MS CRM, or why reverting the database frequently could cause issues?

    Read the article
