Search Results

Search found 16404 results on 657 pages for 'easy transfer'.


  • I run about 100 small-traffic websites; what host would you recommend (expansion is planned)?

    - by MALON
    I know there are plenty of suggestions like A Small Orange, Linode, etc., but how well do these apply to someone who is running 100 sites? Traffic can be anywhere from zero hits a month up to about 1,000. The host I'm using right now doesn't allow access to httpd.conf or other important Apache features. If I had to guess, it seems like Linode or similar services are right up my alley; however, I am not great with Linux. I can get by all right in Ubuntu, but that's about it. Will this knowledge be enough to get by with Linode? What about domain name transfers? The way it works now is that if someone has an existing site, I ask them to get the domain transfer code, I send that code to my current host, and they take care of the rest. Does Linode handle domain name transfers? How do I do it?


  • Where can someone store >100GB of pictures online? [closed]

    - by sbi
    A person who is not very computer-savvy needs to store 130GB of photos. The key parameters are:
      - a non-negligible probability that the company selling the storage will still exist, and the data still be accessible, for at least five years
      - data should be considered safe once uploaded
      - reasonable terms of service: Google Drive reserving the right to do literally anything they want with their users' data is not acceptable; the possibility that the CIA might look at those pictures is not considered a threat
      - easy to use from Windows, preferably as a drive
      - no nerve-wracking limitations ("cannot upload 10GB/day", "files < 500MB", etc.) that serve no purpose other than pushing the user to the next-higher price plan
      - some upgrade plan: there are currently 10-30GB of new photos per year, with a tendency to increase, which might bust a 150GB limit next January
      - ability to somehow sort the pictures: currently they are sorted into folders, but something like tags would be just as good, if easy enough to apply
      - of course, the pricing is important (although there's a reason this is the last bullet; reasonable data safety is considered more important)
    Nice to have, but not necessary, would be:
      - additional features related to photos (thumbnail generation, album sharing, etc.)
      - access from the web and from platforms other than Windows (smartphones)
    Let me stress this again: the person in need of this is able to copy pictures from the camera to the computer, can copy files in Explorer, and uses a web email service. That's about it; there's almost no understanding of what happens under the hood.


  • Connecting to a 2.5" laptop drive using USB

    - by user27449
    So I swapped my MBP hard drive for an SSD, and I want to connect to my old hard drive. I bought a NexStar 3 external 2.5" drive enclosure that has a USB 2.0 interface. I watched an online video of an older version of the enclosure, and in the video they used an ATA cable to connect to the drive. My enclosure didn't come with anything like that, and there doesn't seem to be a way to connect to my drive (my old drive is a Seagate 500GB Momentus 7200 RPM). My laptop is about 2 years old now (MBP 17"). The drive doesn't have pins to connect; it has a wide plug-type connector, and also a USB connector (I was able to plug the USB cable directly into the drive, but didn't connect it to my computer; I was just testing things out). Is it safe for me to directly connect the cable that came with the enclosure to the drive, and then connect the other end to my computer? Does this single cable provide both power and data transfer?


  • Folder sync application which can sync over the Internet (the other machine specified by an IP)?

    - by Adal
    I need to sync some folders between two Win 7 machines. While they are connected to the same LAN, they can't see each other over Windows networking, since sharing is disabled on both of them (for security reasons). Do you know any sync app which can work over IP? The folder I need to sync has 500,000 files in it (80 GB in total), so the sync app should be pretty efficient. At the moment I copy the files from one machine to the other over FTP, but it takes forever, since a separate connection is opened for each file. Or maybe you know of some app which can efficiently transfer a large number of files between two machines over the Internet?


  • Hardware recommendations for building an Ubuntu encrypted file server

    - by Robert Mashlan
    I would like to build a file server for my home network using Ubuntu. It will serve files from RAID1-configured disks, either in the OS or in hardware. It will be connected to a Gigabit Ethernet LAN. The disks will use an encrypted file system. It will serve Samba shares. I would like a recommendation on what kind of processing power/memory I would need to build a box that could sustain the full capacity of the Gigabit Ethernet connection in a file transfer for a single connection, with the overhead of serving from an encrypted disk. I'm not looking to build a dream server; I just want enough processing capacity for high-performance (and reliable) file sharing while spending as little as possible. This may be tangential, but what kind of hardware would I need for the server to be able to reliably go into a low-power mode when no requests are being made of it?


  • HDDs randomly dropping out of the RAID

    - by michael
    I really need help on this; it's the Saturday of a long weekend and there's no customer service help :(. I recently built a new server/light-duty desktop. Its main purpose is only file sharing. RAID configuration: Adaptec 6805, 8x 3TB WD Red HDDs, Intel RES2SV240 expander, RAID 6, installed in an Intel DZ77GA-70K motherboard. I upgraded the firmware, but I'm having a strange problem. During Build/Verify, segment 7 went missing. I just reinserted the drive into the hot-swap bay and it started to rebuild the array. After the rebuild was done, segments 0 and 5 went missing during build/verify. I reinserted the drives and now I'm praying that the RAID is going to rebuild successfully from the remaining 6 drives, because I already transferred some data onto it (I know it was a bad idea). I checked S.M.A.R.T. on the missing drives; it only shows a link failure and aborted commands on one of them. No errors on the HDDs. Connections and cables are good. I added 2 fans blowing on the RAID controller because it was getting too hot, so I guess overheating shouldn't be an issue. What can possibly be wrong? Thank you for your help.


  • Remote screen control software (such as VNC) with permission mechanism

    - by xuhdev
    Does anyone know of software that allows controlling another computer, where the server can configure permissions? For example, the server can define five users, one of whom can control the screen while the other four can only view it. File transfer support is also required. I've tried TightVNC; it works fine in every other respect, but it just does not support per-user permission control. RAdmin can do this job, but I wish to find software which is free. Thank you!


  • How can I solve Windows PPTP VPN issues?

    - by Robin M
    I'm having persistent problems with Windows PPTP VPN connections. The VPN appears to be up while the tunnel won't transfer traffic (a ping to a remote IP within the VPN works for a while, and then fails). The client receives routing information via DHCP. When the connection fails, the routing table is still correct, so I don't think it's a routing problem. My internet connection is via an ADSL2 line. There's software to deal with PPTP problems, like TunnelRat, but I don't want to install v1.1 of the .NET Framework and I'd rather get to the bottom of the problem (I have multiple VPN connections and some are more unreliable than others). What can I do to get to the bottom of this? Alternatively, what can I do to keep the connection alive?


  • Online FTP or file sharing service [on hold]

    - by Frede
    We need to share large files with clients, e.g. clients upload a large file, we modify it and later make it available for download. Up until now we've used FTP, but this has a number of drawbacks: a lot of file management, setting up accounts, etc. We are therefore considering online alternatives. Requirements:
      - Cheap 8-)
      - Easy to use, ideally just requiring a web browser, but also possible for power users to connect e.g. via FTPS/SFTP
      - No registration required for users to upload/download files. We ourselves of course need to be able to log in, view uploaded files and upload new files
      - No per-user fee
      - High bandwidth. As files may be GBs in size, neither upload nor download speed can be too slow
      - Secure. Encryption during upload/download. No way for users to access uploaded files: once a user has uploaded a file, they (or anyone else besides us) should not be able to access it. To download files, users get a link with a password. Ideally the link expires after a set time
      - No software installation
    We do NOT need any sync features, backup, versioning, etc. Just a quick, easy, secure way for us to share files with our clients. Services like JustCloud, DriveHQ, etc. seem bloated and "too much" for what we need. What other alternatives exist? Thanks!


  • Rejecting new HTTP requests when server reaches a certain throughput

    - by user56221
    I have a requirement to run an HTTP server that rejects new HTTP requests (with a 503, or similar) when the global transfer rate of current HTTP responses exceeds a certain level. For example, if the web server is transferring at 98Mbps, and a new HTTP request arrives, we would want to reject this (as we couldn't guarantee a good speed). I've had a look at mod_cband for Apache, limit_req for nginx, and lighttpd's rate limiting features, but none of them seem to handle my (rather contrived, granted) use case. I should add that I'm open to using pretty much any web server, and am open to implementing this in iptables rules if someone can craft such a rule! (Refusing the TCP connection is fine, it doesn't have to respond with an HTTP 503). Any suggestions?
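
    One way to approximate this outside the web server is a small watchdog that samples the interface's transmit counter and, while the measured rate is above a threshold, inserts a firewall rule that refuses new connections with a TCP reset (which the question says is acceptable). The sketch below is illustrative only: the interface name (eth0), the port (80), the exact threshold, and the iptables rule itself are assumptions, not a tested configuration.

        /* Illustrative sketch: gate new HTTP connections on measured TX throughput.
         * Assumptions: Linux, interface eth0, port 80, ~98 Mbit/s threshold; the
         * iptables rule is an example, not a recommendation for production use. */
        #include <stdio.h>
        #include <stdlib.h>
        #include <unistd.h>

        #define TX_COUNTER "/sys/class/net/eth0/statistics/tx_bytes"
        #define LIMIT_BPS  (98.0 * 1000 * 1000 / 8)   /* ~98 Mbit/s in bytes per second */

        static long long read_tx_bytes(void) {
            long long v = -1;
            FILE *f = fopen(TX_COUNTER, "r");
            if (f) { fscanf(f, "%lld", &v); fclose(f); }
            return v;
        }

        int main(void) {
            /* Rule that refuses new connections to port 80 with a TCP reset. */
            const char *add = "iptables -I INPUT -p tcp --dport 80 --syn -j REJECT --reject-with tcp-reset";
            const char *del = "iptables -D INPUT -p tcp --dport 80 --syn -j REJECT --reject-with tcp-reset";
            int blocking = 0;
            long long prev = read_tx_bytes();

            for (;;) {
                sleep(1);
                long long cur = read_tx_bytes();
                double rate = (double)(cur - prev);   /* bytes sent in the last second */
                prev = cur;

                if (!blocking && rate > LIMIT_BPS) {
                    if (system(add) == 0) blocking = 1;          /* start refusing new connections */
                } else if (blocking && rate < LIMIT_BPS * 0.9) { /* small hysteresis before unblocking */
                    if (system(del) == 0) blocking = 0;
                }
            }
            return 0;
        }

    Existing connections keep transferring; only new SYNs are reset while the gate is closed, which roughly matches the "reject new requests above a throughput level" requirement.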


  • How can I create a bootable CD-ROM/USB Drive? (With NTFS + USB Drive support)

    - by RonK
    My motherboard got fried and I was forced to get a new hardware set (MB+CPU+RAM), so in all likelihood I'll need to reinstall my Windows 7. I usually follow procedure and put the OS on the primary partition and my data on a logical partition - so I can format the primary without concern - but this time I made a mistake and left some crucial items on the primary partition. I want to create a bootable CD-ROM/USB drive which can read NTFS so I can access this data. If booting via a CD-ROM, I would prefer being able to connect a disk-on-key/My Passport to the computer and access it to transfer the data to it. How can I do it? (Free 3rd-party applications are most welcome.)


  • Need hosting (e-mail, http) for external domains

    - by disappointed
    This may not be the right place, but since it is a more technical aspect of the hosting world, I am taking the liberty to ask: I'm currently running a virtual server with nginx and Postfix for web and e-mail, but I can't handle the administration and, due to frequent problems with the e-mail services, I need to resolve this by moving to an almost-standard hosting package (anything should work; even 5 MB of static files would be OK). The exception is that I would like to use several domains, hosted with different registrars, for web and e-mail. Currently, this is a very simple configuration in my setup. All hosters I have looked at seem to think this is a costly business (more than the domain registration costs), and of course they recommend transferring the domains to them (they want the $$). Does anyone know of a hosting company that allows its customers to freely manage domains registered somewhere else?


  • rsync generates far more traffic than expected

    - by user109459
    I use rsync for backing up one of my servers, which has 4GB of files. When I transfer these files, the traffic isn't the expected 4GB - it is a lot higher, about 60GB. I checked the traffic on my server, the backup server and the router, and all three report about 60GB of traffic. Yet at the end rsync says that it has only transferred 4GB. Another problem is that I can't debug it, because the problem occurs randomly.


  • OpenVPN / iptables restrict some access

    - by RitonLaJoie
    I want to set up an OpenVPN service on a dedicated server I have, for some friends, so that they are able to play online games faster. Is there an easy way to restrict which traffic I allow them with iptables? It seems iptables is not very easy to maintain, and we could easily get locked out of our own server; rebooting into rescue mode every time I get locked out because of bad iptables rules would be a pain. As far as I understand, the tun interface would be providing the access. What kind of iptables rule would I have to implement to restrict their access to only one IP? Also, I don't want this VPN to be the default gateway for all traffic. I guess I should go with the option of pushing a route to the clients, so that they reach the IP of the game server through the VPN and use their regular routes through their ISP for all other traffic? As a side note, it seems OpenVPN AS is not very robust. Is there some other product (commercial is OK) that would give me the same administration options through a web interface? Is Webmin the only other solution? Thanks!


  • MySQL skip-name-resolve

    - by user72459
    I recently purchased a new server and transferred all my accounts via WHM Transfer. The problem is that when WHM takes its daily backup, it outputs a message such as: DBD::mysql::st execute failed: There is no such grant defined for user 'abc' on host 'myhostname'. The problem is solved when I remove skip-name-resolve from my my.cnf file. Though I don't find any difference in speed when I don't add it, it is often mentioned in forums that adding skip-name-resolve improves MySQL performance. Does adding skip-name-resolve really help if one has a dedicated server?


  • Is it possible to access the internet on a smartphone through USB (no Wi-Fi), using a laptop as the modem?

    - by Prahlad Yeri
    One of my relatives has a Nokia 5235 smartphone. It is good, but its one limitation is that there is no Wi-Fi; Bluetooth and USB are the only connectivity options. His laptop has unlimited broadband but no Bluetooth support, so USB is basically the only way he can connect the phone to the laptop. He can do file transfers between the two devices, but is there any way to share the laptop's internet connection with the smartphone over USB? So, basically it's like this: broadband modem = laptop =USB= smartphone (want to use the internet here). Are there any software utilities available to achieve this?


  • SFTP (or similar) server automated setup for group spaces

    - by spikeheap
    I need to build a dedicated machine which will be used to allow our clients to upload and download files in a secure manner. Each client has multiple users, and I would rather not hand out generic per-client accounts shared by multiple people. Each client should have access to their own files only, and no others. There is no use case (yet) for multiple clients interacting with a single file or space. Is there an existing solution for automating the creation and maintenance of these accounts, preferably with a view to LDAP integration? Currently it looks like if we want to use SFTP with chrooted spaces, they will need to be set up manually (or the automation hand-rolled). If a solution exists for a different (but still secure) transfer method, such as FTPS, I'm all ears.


  • SFTP sending files between laptops on Ubuntu

    - by twigg
    I want to transfer files between two Ubuntu systems using SFTP. I have got it set up and I can connect to the other laptop, ping it, and see its file list using sftp> dir. I can see the files on the other system. But when I call get filename.deb it comes up saying Fetching /home/user/filename.deb to filename.deb 0% 0 0.0KB/s --:-- ETA and then drops back to the sftp command prompt without transferring anything. Have I missed something?


  • Slow Gmail Attachment Download over HTTPS [closed]

    - by Todd
    It only affects Gmail. I have tested other HTTP, FTP and HTTPS websites and downloading is fine, at the exact same time the problem is affecting a Gmail attachment download. The symptoms are both a long wait before the download starts (on the order of 30 seconds) and very slow download transfer speeds (on the order of 1 kbps). It's a very strange problem, and I didn't find any recent issues relating to Google Australia. The account works fine at another site with a different ISP, so it's either a computer configuration or an ISP issue - I lean toward the latter.


  • How to copy data (clone) from one partition to another in Windows XP?

    - by Martin
    I have installed a new hard drive in our PC running Windows XP, and I wonder how to transfer the data from the old (small) data partition to the new (large) one. My question concerns only a data partition containing files and folders (not the boot partition with the operating system files!). Is it OK to just copy the folders in Windows XP Explorer to the new partition? Could anything be lost this way (hidden folders, metadata, ...)? What is the best way to clone a data partition in Windows XP?


  • Hard drive speed drops a lot!

    - by AZ
    The hard drive is used for BitTorrent with uTorrent. Recently uTorrent began to report an "I/O device error", so I used HD Tune to test the drive, and it turns out the transfer rate is only 16 MB/sec. I tested a similar hard drive at the same time, and it rates at 120 MB/sec. They are both 7200 RPM desktop hard drives. I ran chkdsk to fix it, but the rate didn't change. Is this a symptom of hard drive failure? Should I back up the content on the disk ASAP, or is there any other tool that can fix or diagnose it?


  • Large OS X 10.6 to Windows 7 SMB transfers fail?

    - by user41724
    I'm connecting to a Windows 7 box from an OS X 10.6 box via SMB: smb://ftp1. The connection works fine and I can transfer individual files one at a time, but as soon as I try to move an entire folder I get the following error: The Finder can’t complete the operation because some data in “test” can’t be read or written. (Error code -36). This error happens on all our OS X boxes when trying to push an entire folder to the Win7 box. The folder "test" in the above error message has 777 permissions, set recursively. I can move every image file to the Windows box one by one with no errors, but if I try to move the entire folder - bam, it errors out. This error seems to kill the SMB client on the Windows box as well. There's an FTP server on the Windows box, and I can FTP in from the OS X box and move everything just fine. What is going on here?


  • Can I move files from a laptop hard drive with a corrupt sector to a USB hard drive?

    - by Corey
    I have a hard drive that is on its way out and won't boot into Windows 7. The Windows partition takes up the whole disk. I thought I would try to recover some recent files that hadn't been backed up. Assuming the files are recoverable, how can I explore the drive that has the corrupt sector and transfer files to a USB hard drive? If it helps, the laptop is able to see the USB drive when choosing a boot order. Some searching led me to WinPE 3.0, part of the Windows Automated Installation Kit. Is that a viable method?


  • What are my options for a secure External File Share in Server 2008 R2?

    - by Nitax
    Hi, I have a Windows Server 2008 R2 machine installed on a home network, with a number of files that need to be shared in a few different scenarios. I would like all three scenarios to have a solution with some sort of encryption to protect the data during transfer.
    Scenario 1: I need to access files from my laptop (Mac OS X) or another computer outside of the network. This option seems like the easy one, in that I could use LogMeIn, the Windows VPN, etc. to create such a connection.
    Scenario 2: I need to provide access to another user with minimal installation/configuration on his or her end. This makes me think of the new FTP 7.5 provided with Server 2008 R2, but I'm not sure of the details: does it support SSH or some other form of encryption? Can an OS X user connect? etc.
    My question here is: what are my options? I really just don't know where to get started...


  • Calculator using reverse Polish notation and a stack

    - by programmer
    Hello everyone, I have a segmentation fault, can you help please? If I have the expression "3 5 +", that means 3+5, and likewise "9 8 * 5 + 4 + sin" means sin(((9*8)+5)+4). My idea is to check whether the first and second tokens are numbers and push them onto the stack; then, when I reach an operator, I pop the numbers, do the calculation, and push the answer back.

        typedef struct st_node {
            float val;
            struct st_node *next;
        } t_node;

        typedef t_node t_stack;

        // a function to allocate memory for a cell and return it
        t_stack* fnewCell() {
            t_stack* ret;
            ret = (t_stack*) malloc(sizeof(t_stack));
            return ret;
        }

        // a function to allocate a cell, fill it with value v and pointer n, and return it
        t_stack* fnewCellFilled(float v, t_stack* n) {
            t_stack* ret;
            ret = fnewCell();
            ret->val = v;
            ret->next = n;
            return ret;
        }

        // function to initialize the stack
        void initstack(t_stack** stack) {
            fnewCellFilled(0, NULL);
        }

        // add new cell
        void insrtHead(t_stack** head, float val) {
            *head = fnewCellFilled(val, *head);
        }

        // function to push the value val onto the stack s
        void push(t_stack **s, float val) {
            insrtHead(s, val);
        }

        // function to pop a value from the stack and return it
        int pop(t_stack **s) {
            t_stack* tmp;
            int ret;
            tmp = (*s)->next;
            ret = (*s)->val;
            free(*s);
            (*s) = tmp;
            return ret;
        }

        int isempty(t_stack *t) {
            return t == NULL;
        }

        // function to convert a string (str) to an int (value)
        // returns -1 on success, the failing index otherwise
        int str2int(char *str, int *value) {
            int i;
            *value = 0;
            int sign = (str[0] == '-' ? -1 : 1);
            for (i = (str[0] == '-' ? 1 : 0); str[i] != 0; i++) {
                if (!(str[i] >= 48 && str[i] <= 57)) // ASCII '0' to '9'
                    return i;
                *value = *value * 10 + (str[i] - 48);
            }
            *value = *value * sign;
            return -1;
        }

        // a function that takes a string, converts tokens to integers and evaluates them using a stack
        void function(t_stack *stack, char *str) {
            char x[10] = " ";
            int y, j, i = 0, z;
            printf("++\n");
            if (str[i] != '\0') {
                strcpy(x, strtok(str, " "));
                z = str2int(x, &y);
                if (z == -1) {
                    push(&stack, y);
                    i = i + 2;
                }
            }
            while (str[i] != '\0') {
                strcpy(x, strtok(NULL, " "));
                z = str2int(x, &y);
                if (z == -1) {
                    printf("yes %d", y);
                    push(&stack, y);
                    i = i + 2;
                } else {
                    y = pop(&stack);
                    j = pop(&stack);
                    if (x[0] == '+')
                        push(&stack, y + j);
                    else if (x[0] == '-')
                        push(&stack, j - y);
                    else if (x[0] == '*')
                        push(&stack, j * y);
                    else if (x[0] == '/')
                        push(&stack, j / y);
                }
            }
        }

        int main() {
            t_stack *s;
            initstack(&s);
            char *str = "3 5 +";
            function(s, str);
            return 0;
        }
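
    The push-numbers/pop-on-operator idea described above is sound. As an observation, two things in the posted code are likely culprits for this kind of crash: strtok() writes into the string it tokenizes, so calling it on the string literal char *str = "3 5 +" is undefined behavior; and the loop keeps calling strtok()/strcpy() after the last token, so strcpy() eventually receives a NULL pointer (initstack() also never assigns to the caller's pointer). The sketch below is not a fix of the original code but a minimal illustration of the same algorithm, driven by strtok()'s return value and using a writable array; the type and helper names are invented for the example, and only binary operators are handled.

        /* Minimal RPN-evaluation sketch (illustrative only). */
        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        typedef struct node { float val; struct node *next; } node;

        static void push(node **s, float v) {
            node *n = malloc(sizeof *n);   /* new cell on top of the stack */
            n->val = v;
            n->next = *s;
            *s = n;
        }

        static float pop(node **s) {
            node *top = *s;                /* remove and return the top value */
            float v = top->val;
            *s = top->next;
            free(top);
            return v;
        }

        int main(void) {
            char expr[] = "9 8 * 5 + 4 +"; /* writable array: safe to tokenize with strtok */
            node *stack = NULL;            /* an empty stack is just a NULL head */

            for (char *tok = strtok(expr, " "); tok != NULL; tok = strtok(NULL, " ")) {
                char *end;
                float v = strtof(tok, &end);
                if (end != tok) {          /* token parsed as a number: push it */
                    push(&stack, v);
                } else {                   /* otherwise treat it as a binary operator */
                    float b = pop(&stack), a = pop(&stack);
                    switch (tok[0]) {
                        case '+': push(&stack, a + b); break;
                        case '-': push(&stack, a - b); break;
                        case '*': push(&stack, a * b); break;
                        case '/': push(&stack, a / b); break;
                    }
                }
            }
            printf("result: %g\n", pop(&stack));   /* prints 81 for the expression above */
            return 0;
        }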

