Search Results

Search found 16404 results on 657 pages for 'easy transfer'.

Page 161/657

  • Need deleted data back from Pen Drive, Please help

    - by Manav Sharma
    All, I am using a pen drive to transfer files from one system to another. I think I deleted some files from the pen drive that I now need back, and there is a chance that I have since overwritten the space where the deleted data used to live. Is there any software that can do something about that? I understand that whatever has been overwritten cannot be fetched back, but I would still like to see what current tools can recover. Any help is appreciated. Thanks
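
    For reference, a minimal recovery sketch with the open-source PhotoRec tool; this is only an illustration, and the device path and output directory below are placeholders:

        # Recover deleted files from the pen drive's partition into a folder
        # on another disk (never recover onto the drive being scanned).
        sudo photorec /log /d /home/user/recovered /dev/sdb1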

    Read the article

  • Empty /var/log after running cron bash script

    - by Ortix92
    I wrote a little bash script, and all of a sudden my /var/log folder is completely empty except for the log I created for the script. This is the script I'm running every hour with cron:

        #!/bin/bash
        STL_DIR=/path/to/some/folder/i/hid
        LOGFILE=/var/log/stl_upload.log
        now=`date`
        echo "----------Start of Transmission----------" 2>&1 | tee -a $LOGFILE
        echo "Starting transfer at $now" 2>&1 | tee -a $LOGFILE
        rsync -av -e ssh $STL_DIR [email protected]:/users/path/folder 2>&1 | tee -a $LOGFILE
        echo "----------End of transmission----------" 2>&1 | tee -a $LOGFILE
        printf "\n" 2>&1 | tee -a $LOGFILE

    I want to be clear that I'm not 100% certain this is related to the empty log folder, but if anyone could give me a pointer as to why /var/log is empty, that would be great.
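
    As an aside on the logging pattern itself, here is a minimal sketch of the same script appending all of its output in one place instead of teeing every line; this is only an illustration, not a diagnosis of the empty /var/log (the paths are the ones from the script above):

        #!/bin/bash
        # Send everything the grouped block prints (stdout and stderr) to one log file.
        STL_DIR=/path/to/some/folder/i/hid
        LOGFILE=/var/log/stl_upload.log
        {
            echo "----------Start of Transmission----------"
            echo "Starting transfer at $(date)"
            rsync -av -e ssh "$STL_DIR" [email protected]:/users/path/folder
            echo "----------End of transmission----------"
            echo
        } >> "$LOGFILE" 2>&1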

    Read the article

  • Corporate IM with video that actually works, suggestions?

    - by Erik P. Skaalerud
    Hi. Does anyone here have a suggestion for a cross-platform IM solution which will work with VoIP/video on both Windows (XP and 7) and Mac OS X from 10.4 upwards? Right now we're in a kind of mixed environment, with some Mac users on iChat Server since they need video support (conferencing across several offices over VPN), but it won't work on Windows clients. The rest of us are happily using Openfire+Spark, but there's no VoIP or video available from what I've found, unless you want to add in several pieces of third-party software (like Red5 and Asterisk). Requirements:

    - As said before, must work on both Windows and Mac
    - Internal server (no Skype etc.)
    - File transfer between platforms
    - SSO (Single Sign-On) via Active Directory authentication
    - Some sort of screen sharing would be a plus, like switching over to a screen capture (PowerPoint, software training etc.)

    We can afford to buy software if that's needed to get this working without any hiccups across platforms. Thanks in advance to anyone who gives suggestions.

    Read the article

  • Including file server into domain

    - by user23419
    We have a file server on our LAN running Windows Server 2003. It was not included in our domain; every user has a local account on that server, and NTFS permissions are granted to those local accounts. Now we have established a domain controller and every user has a domain account. How do we join the file server to the domain so that the permissions carry over from the local accounts to the corresponding domain accounts? Is it possible to automate this process somehow?

    Read the article

  • Apache only transferring partial content from a Samba share

    - by thaBadDawg
    I have an Apache server running on CentOS 5.3. It currently hosts 12 sites with no known issues (I mention this to point out that, up to now, my Apache installation has performed flawlessly). I'm adding a new site where the DocumentRoot of the new VirtualHost is a Samba share. At the command line of the server I can cp video.m4v ~ and the whole file is copied properly to my home directory. But when I try to access the file from IE/Firefox/Safari/Chrome, only a partial result of 33 KB comes back. The same thing happens with my image and audio files. If I make the files local to the server by copying them from the share and serving them from there, the files transfer fine. Any ideas?
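
    A hedged aside: Apache's documentation notes that sendfile and memory mapping can misbehave when content lives on a network filesystem, so one common experiment is to switch both off for that share. A minimal sketch, assuming a CentOS-style layout; the vhost file and mount point are placeholders:

        # EnableSendfile and EnableMMAP are standard Apache 2.x directives;
        # append a per-directory override for the Samba-backed DocumentRoot.
        conf=/etc/httpd/conf.d/newsite.conf
        {
            echo '<Directory "/mnt/samba_share">'
            echo '    EnableSendfile Off'
            echo '    EnableMMAP Off'
            echo '</Directory>'
        } >> "$conf"
        # Check the syntax, then reload Apache.
        httpd -t && service httpd reload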

    Read the article

  • rsync for copying files

    - by vinayrks
    I am migrating my old server to a new server; I used the old one for hosting websites. First I tried SFTP, but due to the huge number of files and connection timeouts it simply didn't work. Then I tried rsync. rsync works well, but the one problem I'm facing is that it updates existing files nicely and quickly yet does not copy new files. Please help, because I still need to transfer lots of files. I am using this command: rsync -anv -e ssh oldserver:/path/ /path
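
    One detail worth flagging for context: in rsync the -n flag is --dry-run, so the command above only reports what would be transferred. A minimal sketch of the same copy with the dry-run flag dropped (the host and paths are the placeholders from the question):

        # -a preserves permissions/times and recurses, -v is verbose;
        # without -n the files are actually written to the destination.
        rsync -av -e ssh oldserver:/path/ /path/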

    Read the article

  • reaching 99.9999% uptime

    - by user35204
    I am currently developing a project which is mission-critical. The actual domain name is registered with 1&1, and I plan on purchasing DynDNS Custom DNS service (which has 5 different geographical locations for DNS) and then a secondary DNS service to make my DNS as failover-safe as possible. Does it matter that the registration is with 1&1? Are they a weak link in the chain? All I really use them for is to declare DynDNS as my primary DNS nameserver and my secondary DNS provider as the other nameserver. I can transfer the registration to DynDNS; I'm just not sure whether it really matters or not. Thanks

    Read the article

  • Windows File Sharing - Long Initial Delay

    - by Isaac Sutherland
    I have two Windows 7 machines connected to a router. I created a shared folder on machine A, and I can access it from machine B. The transfer speed is great. However, there is sometimes a long initial delay when I try to access the shared folder from machine B: I'll click to open the folder, and Windows Explorer pauses for a few minutes before actually loading its contents. After it loads, I can navigate the subfolders and edit files with no noticeable delay. Then, some time later, I'll get the same huge delay on saving a file, after which subsequent saves have no delay. What is the problem here, and how can I fix it?

    Read the article

  • Apple Automator "New PDF from Images" maintaining same filename

    - by mech
    I will potentially have 26k old legacy PICT images to convert to PDF as a first migration step. I am using Apple Automator, with "Dispense Items Incrementally" to loop through them. However, I can't get "New PDF from Images" to remember the original filename. Can anyone offer some advice? :) FYI, I am converting to PDF because I can't use ImageMagick to convert directly to my ultimate JPEG format: the PICTs were created very long ago and trigger a "convert: improper image header" error (see this ticket for more information). So I am doing an intermediate PICT-to-PDF conversion first, then converting that PDF to JPEG. :) The only thing left is the "Output File Name" setting, which does not let me keep the original filename. See the attached screenshot.
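
    For illustration, a minimal sketch of a batch conversion that keeps each base filename, using macOS's built-in sips rather than Automator; it is an assumption that sips will accept these particular PICT files, and the directories are placeholders:

        # Convert every PICT in one folder to a PDF that keeps the same base name.
        mkdir -p ~/pdf_out
        for f in ~/pict_originals/*.pict; do
            base=$(basename "$f" .pict)
            sips -s format pdf "$f" --out ~/pdf_out/"$base".pdf
        done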

    Read the article

  • Can I rely on S3 to keep my data secure?

    - by Jamie Hale
    I want to back up sensitive personal data to S3 via an rsync-style interface. I'm currently using s3cmd - a great tool - but it doesn't yet support encrypted syncs. This means that while my data is encrypted (via SSL) during transfer, it's stored on their end unencrypted. I want to know if this is a big deal. The S3 FAQ says "Amazon S3 uses proven cryptographic methods to authenticate users... If you would like extra security, there is no restriction on encrypting your data before storing it in Amazon S3." Why would I like extra security? Is there some way my buckets could be opened to prying eyes without my knowing? Or are they just trying to save you when you accidentally change your ACLs and make your buckets world-readable?
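
    For context, one hedged workaround while s3cmd lacks encrypted sync is to encrypt locally before uploading, so only ciphertext ever reaches the bucket. A minimal sketch with GnuPG symmetric encryption; the file names, passphrase file and bucket are placeholders:

        # Encrypt the archive locally (symmetric cipher, passphrase kept outside S3).
        gpg --batch --symmetric --passphrase-file ~/.backup_passphrase \
            --output backup.tar.gpg backup.tar
        # Upload the encrypted copy; s3cmd still protects it with SSL in transit.
        s3cmd put backup.tar.gpg s3://my-backup-bucket/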

    Read the article

  • CentOS centralised logging, syslogd, rsyslog, syslog-ng, logstash sender?

    - by benbradley
    I'm trying to figure out the best way to set up a central place to store and interrogate server logs: syslog, Apache, MySQL, etc. I've found a few different options but I'm not sure what would be best. I'm looking for something that is easy to install and keep updated on many virtual machines. I can add it to a VM template going forward, but I'd also like it to be easy to install to keep VM complexity down. The options I've found so far are:

    - syslogd
    - syslog-ng
    - rsyslog
    - syslogd/syslog-ng/rsyslog forwarding to logstash/ElasticSearch
    - a logstash agent in each log "client" sending to Redis/logstash/ElasticSearch

    And all sorts of permutations of the above. What's the most resilient and lightest from the log "client" perspective? I'd like to avoid the situation where log "clients" hang because they are unable to send their logs to the logging server. I would also still like to keep local logging and the rotation/retention provided by logrotate in place. Any ideas/suggestions or reasons for or against any of the above? Or suggestions of a different structure entirely? Cheers, B
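
    As one concrete point of reference (a sketch only): rsyslog, the CentOS default, can forward over TCP while spooling to a local disk queue when the central server is unreachable, which keeps clients from blocking and leaves local logging and logrotate untouched. The host name below is a placeholder; the directives are rsyslog's documented legacy-format queue options:

        # Drop a forwarding rule with a disk-assisted queue into rsyslog's include dir.
        {
            echo '$ActionQueueType LinkedList'       # in-memory queue by default...
            echo '$ActionQueueFileName fwd_central'  # ...spilling to disk when needed
            echo '$ActionQueueMaxDiskSpace 1g'
            echo '$ActionQueueSaveOnShutdown on'
            echo '$ActionResumeRetryCount -1'        # retry forever instead of dropping
            echo '*.* @@loghost.example.com:514'     # @@ = TCP, a single @ = UDP
        } > /etc/rsyslog.d/central.conf
        service rsyslog restart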

    Read the article

  • Forgot to unmount/eject external hard drive, lost moved files. Mac OS X

    - by balupton
    So I was using my Mac with my external hard drive connected via USB. I moved about 10 GB of data to it (via drag and drop while holding down the Command key, to move the files rather than copy them). They moved to the drive all right, but I was having some issues: the Finder crashed after the transfer, I was unable to eject the volume, and later everything froze, so I had to do a hard restart (hold the power button). When I remounted the volume (plugged the external hard drive back in), it no longer had any of the files I had moved onto it. As it was a lot of data, how can I recover these files?

    Read the article

  • How can I solve Windows PPTP VPN issues?

    - by Robin M
    I'm having persistent problems with Windows PPTP VPN connections. The VPN appears to be up, but the tunnel won't transfer traffic (a ping to a remote IP within the VPN works for a while and then fails). The client receives routing information via DHCP. When the connection fails, the routing table is still correct, so I don't think it's a routing problem. My internet connection is an ADSL2 line. There is software to deal with PPTP problems, like TunnelRat, but I don't want to install v1.1 of the .NET Framework, and I'd rather get to the bottom of the problem (I have multiple VPN connections and some are more unreliable than others). What can I do to get to the bottom of this? Alternatively, what can I do to keep the connection alive?

    Read the article

  • Improving server security [closed]

    - by Vicenç Gascó
    I've been developing webapps for a while, and I always had a sysadmin who made the environment perfect for running my apps with no worries. But now I am starting a project on my own, and I need to set up a server while knowing next to nothing about it. All I need is Linux with a web server (I usually use Apache), PHP and MySQL. I'll also need SSH, SSL to serve https://, and FTP to transfer files. I know how to install almost everything (I need advice about SSL) on Ubuntu Server, but I am concerned about security: firewall, open/closed ports, PHP security, etc. Where can I find a good guide covering these topics? Everything else on the server I don't need, and I want to know how to remove it to avoid wasting resources. Final note: I'll be running the webapp on Amazon EC2 or Rackspace Cloud servers. Thanks in advance!!
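
    As a small illustration of the firewall piece on Ubuntu Server, a minimal ufw baseline assuming only SSH, HTTP and HTTPS should be reachable from outside (FTP is left out here on purpose, since its passive port range needs separate thought):

        # Deny everything inbound by default, then open only what the stack needs.
        sudo ufw default deny incoming
        sudo ufw default allow outgoing
        sudo ufw allow 22/tcp     # SSH
        sudo ufw allow 80/tcp     # HTTP
        sudo ufw allow 443/tcp    # HTTPS
        sudo ufw enable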

    Read the article

  • Current alternative to the old CHECKSUM program

    - by faulty
    I'm looking for an application that performs an md5/sha hash check on specific files/folders periodically and stores an index file per folder for future verification. I remember such applications existed in the DOS days, to detect files infected by viruses. The main purpose here is to detect corrupted copies of backups, as I understand that consumer-grade hardware is not 100% error-free when backing up or transferring files from device to device. The hashes could also be used to generate a list of changed files for backup. Most of the software I can find requires hashing manually. EDIT: I want a Windows-based application, preferably a shell extension so I can right-click on a folder and checksum/verify all files in that folder. Even better if it can integrate with a backup/sync program like BeyondCopy.
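
    For what it's worth, the underlying mechanism is easy to sketch with standard command-line tools; the example below uses Unix md5sum purely to illustrate the per-folder index idea, not as the Windows shell extension being asked for, and the folder path is a placeholder:

        # Build a per-folder index of hashes, then verify against it later.
        cd /path/to/backup/folder
        find . -type f ! -name MD5SUMS -exec md5sum {} + > MD5SUMS   # create the index
        md5sum -c MD5SUMS                                            # periodic verification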

    Read the article

  • Custom firmware for Asus WL-520g

    - by Jaroslav Záruba
    What custom firmware works with the Asus WL-520g? (Note this is not the 520gU, 520gC, etc.) I failed to flash it with Tomato (Tomato_1_28_ND.zip): the admin UI does not accept the file, and when trying to tftp the file as advised for the 520gU, all I get is this: "Transfering file tomato-ND.trx to server in octet mode... Error occurred during the file transfer (Error code = 0): Error in SendPacket() call." I just rescued the router from a rather unsuccessful flash to DD-WRT (after a few hours the router fell into a coma), and I'd like to keep it as a backup should the new one die or whatever. (Unfortunately the stock firmware does not support WOL.)
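
    For reference, a sketch of the usual recovery-mode TFTP push from a Linux box, assuming the router is sitting in its restore mode on 192.168.1.1 (the address and this model's behaviour in restore mode are assumptions):

        # Feed the stock interactive tftp client its commands on stdin:
        # switch to binary (octet) mode, upload the image, quit.
        printf 'binary\nput tomato-ND.trx\nquit\n' | tftp 192.168.1.1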

    Read the article

  • Best way to keep configuration for server reinstallation?

    - by Gunnar
    I have a server at home running Ubuntu 12.04 which has grown messy over the years. I have fiddled with various packages, desktop environments (for VNC) and so on, and I would like to reinstall it to start again and have better control over what goes into the box. But I want to keep much of the configuration after reinstallation: the LVM configuration, apache2, samba, etc. Ideally there would be a program that could analyse /etc and the installed packages, store that information, and selectively put it back into the new installation. I am even considering installing Ubuntu Server in a virtual machine, just to be able to compare the contents of /etc with a clean installation, and even to perform a migration to the virtual machine first, to verify that the transfer process works. How does one go about performing this kind of reinstallation? Has anyone seen any resources on the topic?
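
    As a rough sketch of the manual baseline this usually starts from: capture the package selection and a copy of /etc before wiping, then replay the package list on the fresh install and cherry-pick configuration files by hand. The paths are placeholders:

        # On the old installation: record installed packages and snapshot /etc.
        dpkg --get-selections > packages.list
        sudo tar czf etc-backup.tar.gz /etc

        # On the fresh installation: replay the package list, then restore
        # individual configs (apache2, samba, LVM notes, ...) from the tarball.
        sudo dpkg --set-selections < packages.list
        sudo apt-get dselect-upgrade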

    Read the article

  • Server hard disk read speed and client download speed, is there a connection? [closed]

    - by Mywiki Witwiki
    OK, so a client's download speed is only as fast as the server's upload speed, and vice versa, based on the answers to this post: Does upload speed depend upon download speed of the server? In other words, the data transfer rate between the two computers is only as fast as the "bottleneck". Let's pretend the two computers are on two different networks and both have 100 Mbps internet connections. Ben wants a copy of a file on Mark's computer, whose hard disk has a 30 Mbps read speed. Does this mean that Ben can only download the file at around 30 Mbps, despite having an internet connection faster than 30 Mbps?

    Read the article

  • How to FTP multiple folders to another server using mput in Unix?

    - by Mircea
    I am logged in to a server (using PuTTY). From there I'm connecting via FTP to another server. I want to copy several folders from the first server to the second using mput, like this at the ftp prompt: mput folder1 folder2 folder3. But I get "folder1: not a plain file." and so on. Each of these folders has subfolders and files (some binary, some not). How can I accomplish this without zipping everything up and then transferring it? Thanks.
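
    One hedged way around plain ftp's file-only mput, assuming lftp can be installed on the first server (the host, credentials and folder names below are placeholders):

        # lftp's "mirror -R" uploads a directory tree recursively over plain FTP.
        lftp -u myuser,mypassword \
             -e "mirror -R folder1 folder1; mirror -R folder2 folder2; mirror -R folder3 folder3; bye" \
             ftp.example.com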

    Read the article

  • Home Server: cpu virtualisation, what to choose?

    - by Huygens
    I'm looking for virtualisation solutions for storage and OS for a home server: a sort of private cloud where I manage the storage space independently of the VM one. This question focuses on VM (or compute instance) management and what would best suit my needs. (I have another question related to the storage management.) My use cases are:

    - A backup server: rsync and other services running.
    - A personal cloud server: a kind of self-owned Dropbox system, à la ownCloud; … users foreseen.
    - A media server: streaming videos and displaying photos.

    Here are my environment and wishes:

    - Server: HP ProLiant MicroServer with 8 GB RAM (AMD Turion dual core with AMD-V technology)
    - OS types: only Linux (perhaps a *BSD VM in the future); the Linux distribution does not matter, I'm familiar with RHEL, Fedora, Suse and Ubuntu, but any other recommendation will be fine
    - 2-3 VMs foreseen: backup server, ownCloud server and media server (optional); these are only servers, so no graphical console is needed (I don't need VirtualBox)
    - By VM I mean a virtualised environment like KVM, Xen, etc., or a compute instance as with OpenStack
    - Storage should be "virtualised/cloudified"; see my other question
    - A VM should be able to migrate to another server in the future if the current server can no longer meet the performance needs
    - It does not matter if installing such a setup is complicated, as long as the management tools allow easy maintenance
    - I don't have Windows at home, so the solution should be Linux friendly; web based would be nice, but native apps are OK too
    - The system should be easy to grow by adding a new server and migrating some of the VMs to it, so it's really a kind of private cloud on which I can run some Linux OSes
    - I would prefer free (libre, as in free speech) and open source tools, but it does not have to be free as in free beer

    So Xen, KVM, VirtualBox or OpenStack? What would you recommend?
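
    Purely as an illustration of what the KVM route looks like on an AMD-V box, a minimal sketch; the package names are the Ubuntu ones of that era, the VM name and ISO path are placeholders, and this is not meant as a verdict among the four options:

        # A non-zero count means hardware virtualisation (svm on AMD) is available.
        egrep -c '(vmx|svm)' /proc/cpuinfo

        # Install KVM plus the libvirt tooling, then define a headless guest.
        sudo apt-get install qemu-kvm libvirt-bin virtinst
        sudo virt-install --name backup-vm --ram 2048 --vcpus 2 \
             --disk size=20 --cdrom /var/lib/libvirt/images/ubuntu-server.iso \
             --graphics none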

    Read the article

  • A simple Volume Replication Tool for large data set?

    - by Jin
    I'm looking for a solution to the following:

    - Server A (Site A): Windows 2008 R2, approx 10 TB (15 TB max) of data, well over 8 million files
    - Server B (Site B): Windows 2008 R2

    I want to asynchronously replicate Server A's volume to a volume on Server B for data redundancy: something that lets me say to my users, "go here for data" when/if Server A goes belly up due to machine problems, disaster, etc. Windows 2008 R2 does have DFS, but Microsoft apparently does not support this large a dataset (or more accurately, more than 8 million files, according to the docs I could find). I also looked at Veritas Volume Replication, but this seems like too much, as I would also need Veritas Volume Manager. There is plenty of "backup" software that makes a 1:1 copy, which would be OK, but since the data will be transferred over the internet, I'd like something that compresses during transfer, as DFS does. Does anyone have any suggestions?

    Read the article

  • Win2008 DC in a Windows 2000 domain: can I keep the old DC?

    - by gravyface
    We will be putting a new Windows 2008 SE server into a single-domain network with two domain controllers, both running Windows 2000 Server. The functional level of the domain is mixed mode/2000. Until a second 2008 DC can be purchased, I'd like to leave the current Win2k operations master DC as a backup DC, since the other member servers, which run 2003, have either accounting/SQL or Exchange on them. Eventually all the Win2k servers will be decommissioned, but until then I need another DC for redundancy. Following the standard process for adding a new DC, can I leave the old operations master DC (or the other backup DC) running after I transfer the FSMO roles to the new server? Will this cause any issues?

    Read the article

  • Strange FTP issues - some files are not downloaded

    - by FractalizeR
    I have a machine which cannot fetch some files from remote servers by FTP. The machine runs CentOS. I tested FTP with three files:

        12.09.2012 21:21    166 007  ll091212.002
        13.09.2012 11:32     23 040  ll091212.003
        13.09.2012 11:50     61 313  ll091212.004

    Of these, I can always successfully download only one, ll091212.004. The other two download to about 90% (I can see them on disk) and then the FTP transfer hangs without any error message. I have moved and copied the files around on the remote server with no luck. Another machine in the same subnet can download all three of them easily. I just don't know what the reason for this is.

    Read the article

  • HTC Android Fails to mount- Mount from computer?

    - by Ben Franchuk
    I have an HTC Incredible S (S-Off, rooted, ViperVIVO 1.3.0 ICS) that seemingly no longer mounts its SD storage on my computer. Whenever I plug the device in to transfer files between the computer and the phone, the computer cannot actually access the phone. I get a prompt on the phone asking which mode I want to put the device into (charge mode, tether mode, etc.), and even if I select "Disk Drive" the phone still does not mount on the computer. The phone itself unmounts the SD card and says that the computer is connected, but again, it doesn't work. Is there any way to force-mount the device from my computer, either via a command or otherwise? The idea being that if I unmount the SD card from the phone, I should be able to mount it to my computer, from my computer. Correct?

    Read the article
