Search Results

Search found 14789 results on 592 pages for 'pro backup'.

Page 58/592 | < Previous Page | 54 55 56 57 58 59 60 61 62 63 64 65  | Next Page >

  • Windows 8 Pro Remote Desktop issue

    - by Mike C.
    I have a weird issue here. I'm running Windows 8 Pro, and the client computer is also running Windows 8 Pro. Remote Desktop works when I'm on the same network. I tried connecting using my external IP address and my DynDNS hostname; neither works. I disabled Windows Firewall and set up a DMZ for my computer on the router; still can't get Remote Desktop to work. I verified with www.canyouseeme.org that port 3389 is open, which is obvious since I'm running a DMZ! My ISP, Bell Canada (modem/router: Sagemcom F@st 2864), blocks ports 80 and 25, but I don't need those for RDP, do I? The funny thing is that RDP rejects the connection instantaneously for my IP or DynDNS hostname, while it takes a while to fail for any other address. Thank you, Michael
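
    A quick cross-check that doesn't depend on the router's own reporting (a sketch; the hostname below is a placeholder for the actual DynDNS name) is to probe the port from a machine outside the LAN with netcat:

        # Probe TCP port 3389 from outside the network; an immediate refusal
        # usually points at the router or host, a timeout at a filter upstream.
        nc -vz myhost.dyndns.org 3389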

    Read the article

  • Mac OS X: easiest (free, non-QuickTime Pro) application for converting a numbered folder of images to a movie

    - by Jared Updike
    I'd like to convert a folder of PNGs into a QuickTime .mov with PNG compression (it's a folder of frames from a fractal animation; PNG compression works great here, and the losslessness is important). What programs will do this with minimal fuss? (I don't have, or want to pay for, a full license of QuickTime Pro.) UPDATE: Let me make this clearer. Minimal fuss means: I download some EncoderMagic.app (for example), I double-click it to launch it, I select the folder with my numbered images, and out pops my movie. No mess. No resizing. ... Perhaps this doesn't exist (or is called QuickTime Pro?)
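
    If a command-line tool is acceptable, ffmpeg (free, installable via MacPorts or Homebrew) can write a lossless PNG-codec .mov straight from a numbered sequence. A minimal sketch, assuming the frames are named frame0001.png, frame0002.png, and so on:

        # Turn a numbered PNG sequence into a QuickTime movie using the
        # lossless PNG video codec at 24 frames per second.
        ffmpeg -framerate 24 -i frame%04d.png -c:v png fractal.mov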

    Read the article

  • Compiling zip component for PHP 5.2.11 in MAMP PRO

    - by Zlatoroh
    Hello. I installed MAMP PRO on my MacBook Pro (10.6) some time ago. Now I would like to use the zip functions in PHP. I found that I must add zip.so to my extension folder, and I edited php.ini. On my computer I have two different versions of PHP: one in the MAMP folder and another in /usr/lib, which was pre-installed on the system. Now I wish to compile the zip library for the MAMP version. I got the zip sources for my version of PHP, then in the terminal ran /Applications/MAMP/bin/php5/bin/phpize (so it uses the MAMP PHP version), followed by ./configure and make, then moved the compiled zip.so to extensions/no-debug-non-zts-20060613. When MAMP is launched it returns this error:

        [11-Apr-2010 16:33:27] PHP Warning: PHP Startup: zip: Unable to initialize module
        Module compiled with module API=20090626, debug=0, thread-safety=0
        PHP compiled with module API=20060613, debug=0, thread-safety=0
        These options need to match in Unknown on line 0

    Can somebody explain how to do this the right way?
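
    The API numbers in that warning say the module was built against PHP 5.3 headers (API 20090626) while MAMP's PHP 5.2.11 expects API 20060613, which usually means configure picked up the system PHP. A sketch worth trying (the source path and the php-config path below are assumptions; adjust to your layout):

        # Build the zip extension explicitly against MAMP's PHP.
        cd /path/to/php-5.2.11/ext/zip        # wherever the extension source lives
        /Applications/MAMP/bin/php5/bin/phpize
        ./configure --with-php-config=/Applications/MAMP/bin/php5/bin/php-config
        make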

    Read the article

  • 24TB RAID 6 configuration

    - by Phil
    I am in charge of a new website in a niche industry that stores lots of data (10+ TB per client, growing to 2 or 3 clients soon). We are considering ordering about $5000 worth of 3 TB drives (10 in a RAID 6 configuration and 10 for backup), which will give us approximately 24 TB of production storage. The data will be written once and remain unmodified for the lifetime of the website, so we only need to do a backup one time. I understand basic RAID theory, but I am not experienced with it. My question is: does this sound like a good configuration? What potential problems could this setup cause? Also, what is the best way to do a one-time backup? Have two RAID 6 arrays, one for offsite backup and one for production? Or should I back up the RAID 6 production array to a JBOD? EDIT: The data server is running Windows Server 2008 x64. EDIT 2: To reduce rebuild time, what would you think about using two RAID 5s instead of one RAID 6?
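
    For reference, the capacity arithmetic (standard RAID math, independent of vendor): RAID 6 yields (n - 2) x drive size of usable space, so ten 3 TB drives give 8 x 3 = 24 TB, matching the figure above. Two five-drive RAID 5 arrays also give (5 - 1) x 3 TB x 2 = 24 TB, with smaller rebuild domains, but each array then survives only a single drive failure, whereas RAID 6 survives any two.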

    Read the article

  • Windows 8 Restore Problems

    - by Joe
    I created a Windows 8 system image backup on a separate drive before I installed Linux, and during the Linux installation process I accidentally wiped out Windows. I now want to restore the Windows 8 backup that I have on the separate drive. I created a repair USB stick and followed the directions according to this article. After selecting the image on the hard drive, I get this error: "To restore this computer, Windows needs to format the hard drive." I don't know what this means! The drive partitions are different now than they were when I backed up, so I don't know if that matters. I re-installed Windows and I can restore my files from this backup, but I don't think that covers the registry, etc. I want a full restore. Does anyone know how to fix this problem, or how to restore in a different way? Thanks!

    Read the article

  • Upgrading Windows 8 Consumer Preview to Pro

    - by elvispt
    Currently I have Windows 8 Consumer Preview installed. I tried running the Windows 8 Upgrade Assistant, and after checking for compatibility it tells me that I can buy Windows 8 Pro for 29.99 euros. Will I have any issues with this? Will I be able to perform a clean install from the downloaded file? I heard that I could not get Windows 8 Pro at this price unless I had Windows 7 installed. The bottom line is: what kind of issues can I expect if I decide to go down this path? Will it ask me later for a Windows 7 key to validate? Thanks.

    Read the article

  • Bash Backup Script Backs Up Itself

    - by Jay LaCroix
    I have the following bash script that creates a tar.gz of my filesystem on a Kubuntu PC. The problem is that it also tries to back up the tar.gz backup file, even though I am storing the backup in /tmp and omitting /tmp from the backup. I am wondering why it's backing up the file in /tmp even though I told it not to.

        #!/bin/bash
        # init
        DATE=$(date +20%y%m%d)
        sudo tar -cvpzf /tmp/`hostname`_$DATE.tar.gz \
            --exclude=/proc \
            --exclude=/lost+found \
            --exclude=/sys \
            --exclude=/mnt \
            --exclude=/media \
            --exclude=/dev \
            --exclude=/tmp \
            --exclude=/home/jlacroix/Desktop \
            --exclude=/home/jlacroix/Documents \
            --exclude=/home/jlacroix/Music \
            --exclude=/home/jlacroix/Pictures \
            --exclude=/home/jlacroix/Projects \
            --exclude=/home/jlacroix/Roms \
            --exclude=/home/jlacroix/Videos \
            --exclude=/home/jlacroix/.VirtualBox\ VMs \
            --exclude=/home/jlacroix/.SpiderOak \
            /
        scp /tmp/`hostname`_$DATE.tar.gz jlacroix@Pluto:/share/Recovery/Snapshots
        sudo rm /tmp/`hostname`_$DATE.tar.gz
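
    One way to sidestep the staging file entirely (a sketch built from the script above, with the home-directory excludes elided for brevity) is to stream the archive over ssh, so no local copy ever exists for tar to pick up:

        # Write the tarball to stdout and pipe it straight to the remote host.
        # tar's -v listing goes to stderr, so it does not corrupt the stream.
        DATE=$(date +20%y%m%d)
        sudo tar -cvpzf - --exclude=/proc --exclude=/lost+found --exclude=/sys \
            --exclude=/mnt --exclude=/media --exclude=/dev --exclude=/tmp / \
          | ssh jlacroix@Pluto "cat > /share/Recovery/Snapshots/$(hostname)_$DATE.tar.gz"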

    Read the article

  • Why can't this user connect to domain share?

    - by Saariko
    As part of reorganizing credentials in the domain, I have created several users that will be used solely for services (backup, LDAP, etc.). The idea is that each system needing specific access uses a dedicated service user that gives it exactly what it needs. However, I am having trouble getting the permissions right. For this example, I have a NAS (ReadyNAS 1100 by Netgear) that runs its own backup jobs. The job reads from a domain share, \\domain\qa, and copies all data to another location. When using domain\administrator everything works. When I input the domain\srv.backup user, I get an error connecting to the folder. srv.backup is part of the 'Domain Admins' group, which is a member of 'Administrators'. I thought there might be propagation issues, but even when srv.backup was a direct member of 'Administrators' the error still occurred. I have 2 DCs (W2K8R2 replicas); I thought that could also cause a problem, but as far as I can tell it's not the issue. Sharing permissions are open to Everyone, and the folder security and the connection-test window from the NAS dashboard both look right (screenshots omitted here). I double-checked that srv.backup is part of the 'Domain Admins' group, and I also tried with a simple 1-9 password. What else do I need to check? Thanks.
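
    Independent of the NAS job, smbclient (from any Linux box, or the NAS itself if it offers a shell) can test the service account's access to the share directly; its errors are usually more specific than a dashboard failure. A sketch using the names from above:

        # Try to browse the share as the service account; NT_STATUS_* codes
        # in the output help separate authentication from authorization problems.
        smbclient //domain/qa -U 'DOMAIN\srv.backup'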

    Read the article

  • Reinstalled Windows XP Home, shows as XP Pro, 'invalid' errors

    - by jan
    We reinstalled Windows XP Home after a hard drive crash. The system now shows as XP Pro, we're constantly getting 'invalid software' popups, and images display in solid black. We called Microsoft; they said the serial number is valid, etc. We're assuming that the popups are because it is now XP Pro instead of Home. Or could it be something else? How can we determine what the issue is in order to correct it? Thanks for any advice on this. jb

    Read the article

  • Linux Logitech QuickCam Pro 9000 - microphone issues

    - by drahnr
    I got a Logitech QuickCam Pro 9000; the cam itself is working, as it honors the UVC spec. This fancy webcam has an integrated mic which worked for some time but now does no more. (Note: I use PulseAudio, as it is a USB mic and I am not really keen on the hassle of a raw ALSA setup.) Things I checked already: whether it gets detected at all:

        $ lsusb | grep Logi
        Bus 002 Device 002: ID 046d:0809 Logitech, Inc. Webcam Pro 9000

    whether it is muted in alsamixer (not the case; volume at 100); pavucontrol shows it too, but no input level bar! On top of that, if I open the GNOME 3 (fallback mode) audio panel and disable/re-enable the device in the hardware tab, it works "for some time"... Any hints? Any ideas? I am really out of options for now, and the fact that it worked perfectly about six months ago makes it no better.
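
    When the input level bar disappears in pavucontrol, it can help to check whether PulseAudio still lists the source at all and what state it reports; a couple of standard diagnostic commands (nothing camera-specific):

        # List capture sources; the webcam mic should appear as a usb-* source.
        pactl list sources short
        # Full details per source, including mute state and active port.
        pactl list sources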

    Read the article

  • Non-Apple RAID card for Mac Pro (tower)

    - by Arthor
    I have the following: Mac Pro (model number A1186, PCIe slots). At present I am using software RAID; however, I wish to move to hardware RAID for the following reasons: performance (4 x 300 GB SATA II in RAID 5) and redundancy (with RAID 5, one drive can fail and the system stays online). I do not wish to use the Apple RAID card (very expensive); I would like to use a cheaper aftermarket one. Questions: Does anyone have a WORKING aftermarket RAID card in their Mac Pro (tower)? (I have done some research, e.g. RocketRAID, and need confirmation.) If so, does it work from boot? Thanks

    Read the article

  • Permissions Required for Sharepoint Backups

    - by Wyatt Barnett
    We are in the process of rolling out an extranet for some of our partners, using WSS 3.0 as the platform. We already use it internally for a variety of things, and we use the following PowerShell script to back up the server:

        param( $url="http://localhost", $backupFolder="c:\" )
        [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
        $site = new-Object Microsoft.SharePoint.SPSite($url)
        $names = $site.WebApplication.Sites.Names
        foreach ($name in $names) {
            $n2 = ""
            if ($name.Length -eq 0) { $n2 = "ROOT" } else { $n2 = $name }
            $tmp = $n2.Replace("/", "_") + ".sbk"
            $saveas = ""
            if ($backupFolder.Length -eq 0) { $saveas = $tmp }
            else { $saveas = join-path -path $backupFolder -childPath $tmp }
            $site.WebApplication.Sites.Backup($name, $saveas, "true")
            write-host "$n2 backed up to $saveas."
        }

    This script works perfectly on the current installation when run as our domain backup user. On the new box it fails when run as the backup user, claiming "The web application located at http://extranet/ could not be found." That URL does in fact work, so I'm fairly certain it isn't anything that dumb and is rather some permissions issue, especially because the script works perfectly when executed from my security context. I have tried making the backup user a farm owner, as well as adding him to the various site collection admin groups on the extranet. The one major difference between the extranet and the intranet server is that the extranet has an alternate access mapping (for https://xnet.example.com) and also uses forms authentication for that mapping. Anyhow, what permissions (or other voodoo) do I need to set up to get this script to work properly?
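
    One avenue worth checking, offered as an educated guess rather than a verified fix: the WSS 3.0 object model talks to the configuration and content databases under the calling account's own credentials, so a script like the one above generally needs that account to have rights on those SQL databases as well; farm owner or site collection admin membership by itself does not grant SQL access.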

    Read the article

  • Writing to Samba share as different user?

    - by Hamid Elaosta
    I have a Samba share on my NAS drive, mounted as follows:

        mount -t smbfs -o username=backup,password=backups_password //sharebox/SVNBackup /mnt/SVNBackup

    I am then trying to run:

        sudo svnadmin dump /usr/local/svn/repos/testrepo > /mnt/SVNBackup/test1.svn

    but I get:

        bash: /mnt/SVNBackup/test1.svn: Permission denied

    The backup location is set up to accept access only from the user "backup" (who doesn't exist on the local system). How do I go about solving this problem? Thanks
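
    One general shell detail that commonly causes exactly this error: the > redirection is performed by the calling shell before sudo ever runs, so the output file is opened with the invoking user's privileges, not root's. Two standard workarounds (sketches; whether they help here also depends on how the smbfs mount maps uids):

        # Option 1: let tee, running under sudo, open the output file.
        sudo svnadmin dump /usr/local/svn/repos/testrepo | sudo tee /mnt/SVNBackup/test1.svn > /dev/null

        # Option 2: run the whole command line, redirection included, as root.
        sudo sh -c 'svnadmin dump /usr/local/svn/repos/testrepo > /mnt/SVNBackup/test1.svn'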

    Read the article

  • Current alternative to the old CHECKSUM program

    - by faulty
    I'm looking for an application that does md5/sha hash checks on specific files/folders periodically and stores an index file per folder for future verification. I remember such applications existing in the DOS days, to detect files infected by viruses. The main purpose is to detect corrupted copies of backups, as I understand consumer-grade hardware is not 100% error free when doing backups or file transfers from device to device. The hashes could also be used to generate a list of changed files for backup. Most of the software I can find hashes only on demand, manually. EDIT: Windows-based application, preferably a shell extension so I can right-click on a folder and checksum/verify all files in that folder. Even better if it can integrate with a backup/sync program like BeyondCopy.
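
    For what it's worth, the free and open-source md5deep/hashdeep suite (which ships Windows builds) covers the manifest-plus-audit workflow from the command line, though it is not a shell extension. A sketch of the typical usage:

        # Record hashes of everything under the current folder into a manifest.
        hashdeep -r -l . > manifest.txt
        # Later: audit the folder against the manifest and report mismatches.
        hashdeep -r -l -a -k manifest.txt .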

    Read the article

  • Backing up data stored on Amazon S3

    - by Fiver
    I have an EC2 instance running a web server that stores users' uploaded files to S3. The files are written once and never change, but are retrieved occasionally by the users. We will likely accumulate somewhere around 200-500 GB of data per year. We would like to ensure this data is safe, particularly from accidental deletions, and would like to be able to restore files that were deleted regardless of the reason. I have read about the versioning feature for S3 buckets, but I cannot seem to find whether recovery is possible for files with no modification history. See the AWS docs here on versioning: http://docs.aws.amazon.com/AmazonS3/latest/dev/ObjectVersioning.html In those examples, they don't show the scenario where data is uploaded but never modified, and then deleted. Are files deleted in this scenario recoverable? Then, we thought we might just back up the S3 files to Glacier using object lifecycle management: http://docs.aws.amazon.com/AmazonS3/latest/dev/object-lifecycle-mgmt.html But it seems this will not work for us, as the file object is not copied to Glacier but moved to Glacier (more accurately, it seems it is an object attribute that is changed, but anyway...). So it seems there is no direct way to back up S3 data, and transferring the data from S3 to local servers may be time-consuming and may incur significant transfer costs over time. Finally, we thought we would create a new bucket every month to serve as a monthly full backup, and copy the original bucket's data to the new one on Day 1. Then, using something like duplicity (http://duplicity.nongnu.org/), we would synchronize the backup bucket every night. At the end of the month we would put the backup bucket's contents in Glacier storage, and create a new backup bucket using a new, current copy of the original bucket... and repeat this process. This seems like it would work and minimize the storage / transfer costs, but I'm not sure if duplicity allows bucket-to-bucket transfers directly without bringing data down to the controlling client first. So, I guess there are a couple of questions here. First, does S3 versioning allow recovery of files that were never modified? Is there some way to "copy" files from S3 to Glacier that I have missed? Can duplicity or any other tool transfer files between S3 buckets directly to avoid transfer costs? Finally, am I way off the mark in my approach to backing up S3 data? Thanks in advance for any insight you could provide!
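
    Two notes that may help. On versioning: in a version-enabled bucket, a plain delete only inserts a delete marker, so an object that was uploaded once and never modified remains recoverable as its original version. On bucket-to-bucket copies: the AWS CLI performs server-side copies when both sides are S3 URIs, so nothing is downloaded to the machine issuing the command; a sketch with placeholder bucket names:

        # Server-side sync; objects are copied within S3, not through this host.
        aws s3 sync s3://my-prod-bucket s3://my-backup-bucket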

    Read the article

  • 13-inch MacBook Pro Battery Not Charging

    - by hkhalidz
    I have a 13-inch MacBook Pro. Since last night it says 'Battery Not Charging'. The charger's light is always green, and if I remove the charger the machine turns off (meaning the battery is completely empty). I tried resetting the PRAM and SMC, but no dice. I also tried different chargers without any success. The MacBook Pro is from the newer generation and is over a year old. The Power Information section indicates 458 power cycles. Any help would be greatly appreciated.

    Read the article

  • ntbackup workalike for ad hoc full backups in Windows 7 that's free and preferably open source

    - by Justin Dearing
    On Windows 2000 and XP machines I used to be able to do the following:

        ntbackup backup systemstate c: /f e:\backups\machineName\machineName-full+systemstate_200101206.bkf

    This gave me a full backup of the system that I could use to do a system restore after a bare-bones OS install. Windows 7 has a great utility for regular backups, with alerting and all that stuff, but it does not seem to have command-line support. I'd like a backup solution for my Windows 7 systems that has the following features: it is free; it is open source (preferably); it works while the system is booted and leaves the system functional (Clonezilla is great for offline backups, and I use that too); it gives me a backup suited for a full or partial system restore (ruling out most imaging software, even if it could work while the system is booted via some sort of shadow-copy voodoo); and it can work via the command line. Compression would be nice; the ability to pipe output would be better.
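
    For what it's worth, Windows 7 does ship a command-line interface to its backup engine: wbadmin (free and built in, though not open source). For example, "wbadmin start backup -backupTarget:E: -include:C: -allCritical -quiet" creates a system image restorable from the recovery environment; it writes VHD files to the target, however, so it will not satisfy the pipe-output wish.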

    Read the article

  • Error loading operating system WinXP Pro

    - by Jakesan
    So I'm getting the error "Error loading operating system" when the computer tries to boot to a fresh install of WinXP Pro. To get to this point, I: shrunk the only partition to 33 GB with GParted; copied the partition to the end of the 200 GB drive; enlarged the first one to fill the space; formatted the first partition to NTFS; and set the first partition to boot, tagged the latter as hidden, and removed its boot flag. This was all done under Hiren's BootCD. Now this is where it goes down the drain. I installed XP Pro SP1a from its CD and chose to quick-format the partition. Now, after the OS was installed, I can't start XP without using the default menu action from Hiren's BootCD. All I am greeted with is the "error loading operating system" message. I tried using the XP recovery console to run fixboot, fixmbr, and bootcfg /rebuild (I don't remember if the commands were exactly like this; anyway, the three suggested commands). This did nothing. What am I missing here?

    Read the article

  • Backups of Exchange 2007 SP3 using VSS are abnormally large

    - by Stew
    I have recently implemented Veeam Backup & Replication 6.0 and have noticed that, when backing up my Exchange server via incremental updates, it transfers far more data than expected. The backup is incremental and set up to use VSS, and VSS is stable and healthy according to vssadmin. Exchange 2007 SP3 running on Windows Server 2008 R2; just last weekend I installed the latest rollup for Exchange. I thought the nightly incrementals were large, but perhaps my users really are sending that much mail, so I tested by taking one incremental backup, waiting 10 minutes, and taking a second. The second incremental backup transferred 5.8 GB of data. We as an organization are absolutely NOT putting 5.8 GB of data on the mail server every 10 minutes. Are there other Veeam users who have seen something similar? Is my test flawed? Are there other considerations for VSS?

    Read the article

  • Backing up an NTFS disk using rsync on Ubuntu

    - by user70366
    For a long time I was using Windows. I have a separate drive I use to keep copies of my media files, photos, etc., which I periodically back up to an external drive. In Windows I used SyncToy to do this. After my Windows stopped booting, I decided to switch to Linux (Ubuntu 10.10). That seems to be going fine, but now I want to back up my drive to the external drive like before. Mostly the two drives will already be the same, with maybe about 10 GB of extra files added. So I try to use rsync to synchronize the two drives like this:

        rsync --dry-run -rvlt --modify-window=1 /media/Antonio1TB/Backup /media/FREECOM\ HDD/Backup

    The problem is that the dry run indicates every file on the drive will be copied, not just the files I have recently added. What is the correct command to sync two NTFS drives under Ubuntu so that files that already exist don't get copied again? Thanks.
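
    Before suspecting anything NTFS-specific, note one standard rsync behaviour: without a trailing slash, the source directory itself is copied into the destination, so the command above targets /media/FREECOM HDD/Backup/Backup, which doesn't exist yet, and everything therefore looks new. A sketch of the slash-corrected command:

        # Trailing slash on the source: sync the folder's contents into the
        # destination folder instead of nesting a second Backup directory.
        rsync -rvlt --modify-window=1 --dry-run "/media/Antonio1TB/Backup/" "/media/FREECOM HDD/Backup/"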

    Read the article
