Search Results

Search found 75611 results on 3025 pages for 'copy file'.

  • Weird NFS performance: 1 thread better than 8, 8 better than 2!

    - by Joe
    I'm trying to determine the cause of poor NFS performance between two Xen virtual machines (client and server) running on the same host. Specifically, the speed at which I can sequentially read a 1GB file on the client is much lower than would be expected from the measured network speed between the two VMs and the measured speed of reading the file directly on the server. The VMs are running Ubuntu 9.04 and the server is using the nfs-kernel-server package.

    According to various NFS tuning resources, changing the number of nfsd threads (in my case kernel threads) can affect performance. Usually this advice is framed in terms of increasing the number from the default of 8 on heavily-used servers. What I find in my current configuration:

    - RPCNFSDCOUNT=8 (default): 13.5-30 seconds to cat a 1GB file on the client, so 35-80MB/s
    - RPCNFSDCOUNT=16: 18 seconds to cat the file, 60MB/s
    - RPCNFSDCOUNT=1: 8-9 seconds to cat the file (!!?!), 125MB/s
    - RPCNFSDCOUNT=2: 87 seconds to cat the file, 12MB/s

    I should mention that the file I'm exporting is on a RevoDrive SSD mounted on the server using Xen's PCI passthrough; on the server I can cat the file in a few seconds (around 250MB/s). I am dropping caches on the client before each test. I don't really want to leave the server configured with just one thread, as I'm guessing that won't work so well when there are multiple clients, but I might be misunderstanding how that works. I have repeated the tests a few times (changing the server config in between) and the results are fairly consistent. So my question is: why is the best performance with 1 thread?

    A few other things I have tried changing, to little or no effect:

    - increasing /proc/sys/net/ipv4/ipfrag_low_thresh and /proc/sys/net/ipv4/ipfrag_high_thresh to 512K and 1M from the defaults of 192K and 256K
    - increasing /proc/sys/net/core/rmem_default and /proc/sys/net/core/rmem_max to 1M from the default of 128K
    - mounting with the client options rsize=32768,wsize=32768

    From the output of sar -d I understand that the actual read sizes going to the underlying device are rather small (<100 bytes), but this doesn't cause a problem when reading the file locally on the client. The RevoDrive actually exposes two "SATA" devices, /dev/sda and /dev/sdb, and dmraid picks up a fakeRAID-0 striped across them, which I have mounted at /mnt/ssd and then bind-mounted to /export/ssd. I've done local tests on my file using both locations and see the good performance mentioned above. If answers/comments ask for more details I will add them.
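
    A minimal sketch of the per-test measurement described above (the mount point and file name are placeholders, not the asker's actual paths):

        sync && sudo sh -c 'echo 3 > /proc/sys/vm/drop_caches'   # flush the client's page cache before each run
        time cat /mnt/nfs/bigfile > /dev/null                    # for a 1GB file, MB/s is roughly 1024 / elapsed seconds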

  • Strange Windows Server 2008 R2 (FTP Server) Error - Caused by a specific combination of characters in the filename of an uploaded file

    - by Steven
    We are running Windows Server 2008 R2, which is set up as an FTP server. Everything seemed to be working fine until one of our clients started complaining about their uploads being halted with the message "Connection with server reset". Further diagnosis revealed that a specific combination of characters in the filename will cause a repeatable error. I am hoping that a forum expert can confirm the error or perhaps provide a solution. This is an example filename that will always cause the error: REPORT_FILED_000000001 (the extension does not matter). Any help would be greatly appreciated! We need files named like this to work properly with our FTP server.

  • Rebuilding a Mac Mini (early 2009)

    - by Kelly Jones
    This weekend I decided to rebuild the family’s Mac Mini.  It’s the early 2009 model and I hadn’t done it since we got it in March of 2009.  Even worse, I had done the import data step (or whatever Apple calls it) which brought over all of the data files and apps from our previous Mac.  AND that install goes back to before 2005, as far as I can remember.  SO, to say that “cruft” had built up in the operating system, is probably a bit of an understatement.

    The rebuild went pretty smoothly, especially since I had a couple of spare hard drives.  I hooked up a spare USB drive and formatted it for use with the Mac.  I then used Carbon Copy to clone the internal hard drive onto the USB drive.  (Carbon Copy is a great little app that I used several years ago and I was happy to see it was not only still around, but updated as well.)

    Once I had my backup, I shut down the Mac and replaced the internal hard drive.  I had purchased the hard drive last fall to use with my work laptop, but I got a new work laptop (with awesome dual SSDs) so I wasn’t using it anymore.  The replacement drive (Seagate Momentus 7200.4 ST9500420AS 500GB 7200 RPM 2.5" SATA 3.0Gb/s Internal Notebook Hard Drive) has more than double the original’s capacity and is also faster.  I’ll have to keep an eye on the temperature, since that 7200 drive will run hotter.

    Opening the Mac Mini is not for the easily intimidated!  That cool little case is quite the pain to open.  Luckily, OWC put a video together here.  After replacing the drive, I then installed a clean copy of OS 10.5 using the DVDs that came with the Mac.  After the OS, it was time to reinstall the apps.  I downloaded some of the freeware, just to make sure I had the latest versions.  For the rest, I just copied from the backup cloned drive to the new drive.  (I love the way most Mac apps are written – with almost everything contained within a “package” that I can just copy from one drive to another.  MUCH better than the Windows way of using shared DLLs and the registry to store critical pieces that the app needs in order to run!)

    The whole process took longer than I would have preferred, but it was long overdue.  It definitely “feels” faster, especially boot time and application launches.

  • Delete registry key or value via a CMD script?

    - by Derek
    How do I edit an already-in-production .cmd script file so that the script deletes a certain registry key in the Windows registry? Firstly, is this even possible, and secondly (if it's not), could I create a .reg file and execute that file from within the .cmd file? From within the .cmd script, this is not working:

        del "[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\CurrentVersion\SampleKey]"

    This method hasn't worked for me either:

        cmd "\\networkdrive\regfiles\deleteSampleKey.reg"

    with this in the .reg file:

        Windows Registry Editor Version 5.00

        [-HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon]
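
    For what it's worth, a sketch using the built-in reg.exe, which can be called directly from a .cmd script (the key path is copied from the question; /f suppresses the confirmation prompt):

        reg delete "HKLM\SOFTWARE\Microsoft\CurrentVersion\SampleKey" /f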

  • Create/rename a file/folder that begins with a dot in Windows?

    - by Adventure10
    Many programs need folder names that start with a dot, like .emacs.d, .gimp-2.2, .jedit, etc. How do I create such a folder? When using Windows Explorer in Windows 2000 (and other versions), I get an error message saying "You have to enter a filename". The only solution I have come up with is to open a command prompt (Start, Run, "CMD", OK) and enter "mkdir .mydir". Why does Microsoft show this error in Explorer but not in the command shell? Is there a registry hack out there to fix this, so that I can enter the folder name directly in Explorer?

  • How can I pinpoint a USB file transfer bottleneck in Unix?

    - by HankHendrix
    I'm experiencing very slow data transfer speeds over USB 2.0 on my *nix box and was wondering how I can pinpoint the cause of the problem. I've looked into iotop and top, but the CPU and memory figures look normal (compared to guides I have checked). The affected box is Ubuntu 12.04 32-bit Server running on an Asus EEE 701 2G model, and I am transferring from the OS over USB 2.0 to an external HDD (which transfers at 30MB/s+ under Windows 7 on another machine). I get rsync write speeds of 1MB/s from the OS to the USB HDD, which seems ridiculously slow. These speeds are consistent across other USB HDDs and sticks.
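
    A diagnostic sketch for narrowing this down (the device and mount paths are placeholders); the idea is to rule out a USB 1.1 fallback first, then separate raw device throughput from filesystem overhead:

        lsusb -t    # negotiated speed per device: 480M means USB 2.0 high speed, 12M means a USB 1.1 fallback
        sudo dd if=/dev/sdb of=/dev/null bs=1M count=256 iflag=direct                # raw sequential read from the external disk
        sudo dd if=/dev/zero of=/mnt/usbdisk/test.bin bs=1M count=256 oflag=direct   # sequential write through the filesystem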

  • How do I rsync an entire folder based on the existence of a specific file type in that folder

    - by inquam
    I have a server set up that receives movies into a folder, and I then serve these movies using DLNA. But all kinds of files end up in that initial folder: pictures, music, documents, etc. I thought I'd fix this by running the following script inside that folder:

        rsync -rvt --include='*/' --include='*.avi' --include='*.mkv' --exclude='*' . ../Movies/

    This works: it scans the given folder and moves all the found movies of the given extension types to the Movies folder. But I wonder if there is any way to tell rsync that, if a folder is found containing a movie of the given extension types, it should sync the entire folder, including other files such as .srt. This would make it easier for me to get subtitles moved along with the movie. I have a solution figured out via a script made in PHP (yes, I actually do most of my scripting in Linux using PHP... just a habit that stuck a long time ago), but if rsync can handle it from the start, that would be super.

    Also, I have noticed that this rsync invocation actually copies all the root folders in the given folder: if no movie is in a folder, it will create an empty folder. How do I prevent rsync from doing this, saving me the trouble of deleting all the empty folders in Movies?
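
    A sketch of one way to get the whole-folder behaviour with a small wrapper rather than a single rsync invocation; the paths mirror the question, but the script itself is an assumption, not the asker's PHP solution:

        # sync every subdirectory that contains at least one movie file,
        # bringing along whatever else (e.g. .srt) lives next to the movie
        find . -mindepth 2 -type f \( -name '*.avi' -o -name '*.mkv' \) -printf '%h\n' | sort -u |
        while IFS= read -r dir; do
            rsync -rvt "$dir" ../Movies/
        done

    On the empty-folder complaint: rsync's -m (--prune-empty-dirs) option skips directories that would end up empty after include/exclude filtering, which should remove the need to clean up Movies afterwards.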

  • Push, parse & import selected data/text/info blobs from webpages or emails as events/appointments into a standard calendar, directly or as an .ics file?

    - by Alex S
    Is there any tool, plugin, extension, or script/code to push selected data, text, or information blobs from web pages, emails, etc., have them parsed, and import them as a structured event or appointment (e.g. .ics) into a standard calendar like Outlook, Google, or iCal? If not, what and how could I use scripting, coding, or existing tools and extensions to build this on top? I come across a lot of unstructured information on webpages, emails, FB events, etc., where I just want to add that information to my calendar. Instead of entering all the information by hand every time, there should be an easy enough way to have it parsed, organized, and imported into a calendar: either directly into the calendar from the source, or translated to a standard format such as .ics that can be imported and saved easily. I would love to see some suggestions for this incorporating one or more of the following:

    - on Windows with Chrome & Outlook
    - on iPhone/iPad into its Calendar

    PS: I'll come back and see if I can add more information to this question, and to answer it as well; I have not found a solution yet.
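
    For anyone scripting this, a minimal sketch of the .ics target format; every value below is a made-up placeholder. Outlook, Google Calendar, and iCal will all import a file like this (UID and DTSTAMP are mandatory per RFC 5545):

        BEGIN:VCALENDAR
        VERSION:2.0
        PRODID:-//example//clip-to-calendar//EN
        BEGIN:VEVENT
        UID:20120315T120000-0001@example.com
        DTSTAMP:20120315T120000Z
        DTSTART:20120320T180000Z
        DTEND:20120320T200000Z
        SUMMARY:Dinner copied from an FB event page
        LOCATION:123 Example St
        END:VEVENT
        END:VCALENDAR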

  • Is there an open source solution that I can host on a web server that will allow users to anonymously upload a file to me?

    - by mjn12
    I'm looking for some kind of web application I can host on my Linux web server that will allow users to upload files of arbitrary size to me from their browser without requiring them to log in. Ideally this application would allow me to generate a link to my website that allowed for a one-time use upload. It might contain a unique, random key that was only good for that session. I could email them the link, they click it and are taken to a page where they can upload their file to me. I'm mainly targeting friends and family that need to send me files that are too large for email. I don't want to require them to install anything (dropbox), sign up and log in, etc. I'm definitely not teaching them to use FTP. This wouldn't be a difficult project for me to roll on my own but I'd like to take something off the shelf if it is possible. Does anything like this exist that my google-foo isn't turning up?

    Read the article

  • Is it logical that file system ACLs would be corrupted in a way that adds permission for another user?

    - by wilbbe01
    I was having issues on a shared hosting provider: the host's web server instance was not serving some of my files. I asked the company's support about the issue; they responded with the results of getfacl on my home directory and added the necessary line to give their web server the permissions it needed. All is working happily now, but I noticed a line in the getfacl output for what appeared to be another username to which I have no relation. I asked them about this, and their response was that it was likely some minor corruption and that I could remove the unwanted line with the setfacl -x option. I know I never added that user to my home directory, and I also find it hard to believe that this could truly happen through corruption. So now that it is fixed, I'm a little bit wary: were they trying to cover up accidentally giving someone permissions to my account, or can this kind of thing really get corrupted in that way? Especially when that user is a real user on the same server. Any thoughts? Thanks.
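
    A sketch of the inspect-and-prune cycle in question (the username is a placeholder):

        getfacl ~                  # list the ACL; look for an unexpected "user:someuser:..." entry
        setfacl -x u:someuser ~    # remove that user's entry, per the suggested -x option
        getfacl ~                  # confirm the entry is gone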

  • How to tell IIS7 to allow POST to a text file (to solve 405)?

    - by meticulous
    Suppose I want to allow HTTP POST to text files, *.txt (i.e. I'm taking an example of what could be any static resource normally accessible by GET). The error is: "Server Error 405 - HTTP verb used to access this page is not allowed. The page you are looking for cannot be displayed because an invalid method (HTTP verb) was used to attempt access." How can I accomplish this? Background: I'm using apps.facebook.com to hit my hosted Facebook app, and Facebook now sends an HTTP POST through to the iframe hosting my app. This Facebook behaviour has been around for a while, but it's now being forced, which in turn forces me to make static resources available to the POST verb.

  • Windows 7 Explorer - Activate mouse focus to be able to scroll in tree and file view?

    - by none
    I'd like to be able to scroll in the tree view without having to click in it first. Is there a way to do this? I once used a tool that gives focus to whatever window is underneath the mouse cursor, but it caused some other glitches, so I would like to achieve this without an extra tool. I also know there are programs that embed Windows Explorer and offer more features, including the behaviour I'm after. But maybe a registry value change is all that is needed?

  • Possible to load entries into hosts.deny from a text file?

    - by Tar
    I have around 96 million IP addresses that I have collected and routinely validate to be VPN providers, proxies, etc. I want these blocked. Currently I include the list, formatted as "deny ip;" lines, in nginx, and that works perfectly. I want to use this list on another server, but nginx isn't an option there, and I don't trust Apache to handle it without slowing down. Is there a way to load this list into hosts.deny via some command like aclexec or something? Are there other alternatives, like setting up a DNSBL or using hosts.deny in conjunction with one?
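
    A sketch of the format conversion, assuming the nginx list really is one "deny x.x.x.x;" per line (file names are placeholders):

        # rewrite "deny 203.0.113.7;" as the hosts.deny form "ALL: 203.0.113.7"
        sed -n 's/^deny \(.*\);$/ALL: \1/p' blocklist.conf >> /etc/hosts.deny

    One caveat: TCP wrappers re-reads and linearly scans hosts.deny on every incoming connection, so a 96-million-line file is likely to hurt far more than the in-memory nginx list did; a kernel-side set (ipset matched from iptables) scales much better for lists of this size.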

  • How can I view a .eml file from the command line in Windows Vista?

    - by Nosrettap
    OK, so my parents' computer crashed (horribly: a corrupted registry) and I'm trying to access one of their e-mail files that is saved locally on the hard drive. I can't launch Windows itself, so right now I am in a boot-up command prompt. I've navigated to where the e-mails appear to be stored, C:\Users\[userName]\AppData\Local\Microsoft\Windows Mail\Local Folders\Inbox, and it shows a list of what appear to be the files. The problem is, they are .eml files and I can't seem to be able to open them. I've tried the 'vim' and 'vi' commands, but it tells me that 'vim' is not recognized as an internal or external command. Does anyone know how I can view .eml files from the command line? Thanks

  • VirtualBox always gets back the same IP

    - by user1012451
    I exported a virtual machine from my workstation, and I am trying to renew its IP address at home. I put one copy on my laptop and one copy on my PC. I've done sudo rm /etc/udev/rules.d/70-persistent-net.rules and changed the MAC address, but they keep coming back with 192.168.1.165. I need them to be different because I need to run two of these exports at the same time, so I cannot afford to have the same IP. What can I do? Thanks.
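
    A sketch of the usual sequence, run for each copy in turn; the VM name is a placeholder, and the guest NIC is assumed to be adapter 1 / eth0:

        VBoxManage modifyvm "imported-vm" --macaddress1 auto   # give NIC 1 a fresh random MAC (VM must be powered off)
        # then, inside the guest:
        sudo rm /etc/udev/rules.d/70-persistent-net.rules      # drop the cached MAC-to-eth0 mapping
        sudo dhclient -r eth0 && sudo dhclient eth0            # release the old lease and request a new one

    If both copies still come up as 192.168.1.165 with genuinely distinct MACs, look at the DHCP server: a static reservation, or a lease keyed on a client identifier stored inside the guest, would produce exactly this symptom.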

  • How to tune Windows 2008 R2 and IIS to maximize single file download speeds?

    - by uSlackr
    We recently put up an IIS site (on Windows Server 2008 R2) that is used almost exclusively for downloading files over the internet. The data is a large collection of .zip files ranging from 1MB to 35GB in size. We want to allow a lot of downloads per day (more than 500GB) but have implemented an outbound throttle of 60Mbps on the ASA in order to preserve bandwidth for other uses; the total link speed is 100Mbps. Here's the interesting part: while we can serve multiple downloads that together hit the 60Mbps cap, we cannot get any single download to exceed 2.5 Mbytes/sec (20Mbits/s). Is there any TCP or IIS tuning we can do to push up individual download speeds? Or something else to look at?
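
    One back-of-the-envelope check, under assumptions the question doesn't state: a single TCP stream cannot exceed its receive window divided by the round-trip time, so with an untuned 64KB window and a typical 25ms internet RTT:

        throughput <= window / RTT = 65536 bytes / 0.025 s ~ 2.6 MB/s

    That lands suspiciously close to the observed 2.5MB/s ceiling, so the clients' window scaling and the path RTT are worth measuring before touching IIS itself.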

  • Zipping only files using PowerShell

    - by SteB
    I'm trying to zip all the files in a single directory to a different folder as part of a simple backup routine. The code runs OK but doesn't produce a zip file:

        $srcdir = "H:\Backup"
        $filename = "test.zip"
        $destpath = "K:\"
        $zip_file = (new-object -com shell.application).namespace($destpath + "\" + $filename)
        $destination = (new-object -com shell.application).namespace($destpath)
        $files = Get-ChildItem -Path $srcdir
        foreach ($file in $files) {
            $file.FullName;
            if ($file.Attributes -cne "Directory") {
                $destination.CopyHere($file, 0x14);
            }
        }

    Any ideas where I'm going wrong?
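
    A sketch of the usual fixes, keeping the Shell.Application approach (drive letters are the question's): NameSpace() returns $null unless the zip file already exists, $destpath already ends in a backslash so the concatenation yields "K:\\test.zip", and CopyHere() is asynchronous, so the script can exit before the copy finishes:

        $srcdir  = "H:\Backup"
        $zippath = Join-Path "K:\" "test.zip"

        # Shell.Application can only open an existing zip, so create an empty one first
        # (the 22-byte end-of-central-directory record: "PK", 0x05, 0x06, 18 zero bytes)
        Set-Content $zippath ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))

        $shell = New-Object -ComObject Shell.Application
        $zip   = $shell.NameSpace($zippath)

        Get-ChildItem -Path $srcdir | Where-Object { -not $_.PSIsContainer } | ForEach-Object {
            $zip.CopyHere($_.FullName, 0x14)   # 0x14 = no progress UI, yes to all prompts
            Start-Sleep -Milliseconds 500      # crude wait; CopyHere is asynchronous
        }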
