Search Results

Search found 45804 results on 1833 pages for 'large files'.


  • linux: upload / download difference on network shares

    - by Batsu
    I have a Red Hat Enterprise Linux 6 machine (with SELinux) that shows a significant speed difference between download and upload (the latter significantly slower) for files shared over the LAN. The bottleneck seems to be the output of the Linux machine, since I get a rate of around 1 Mb/s when:

    - WinXP machines download files shared (using samba) by the RHEL machine
    - uploading files from the RHEL machine to a WinXP shared folder

    while:

    - uploading from the XP machines to the Linux shares
    - downloading the XPs' shares on the RHEL machine
    - any transfer between Windows machines

    all run smoothly (around 50 Mb/s). Since the upload from RHEL to the WinXP share is slowed down too, I would exclude an issue in the samba configuration. What could possibly determine this limit in the upload speed? Update: iptables doesn't show any output rule, and disabling it makes no noticeable difference, so I would rule it out too.
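
    A quick way to confirm whether the limit sits in samba or in the network path itself is to measure raw TCP throughput in both directions with iperf (a sketch; iperf must be installed on both ends, and the hostnames are placeholders):

        # on the RHEL machine (server side)
        iperf -s

        # on a WinXP machine: measures XP -> RHEL throughput
        iperf -c rhel-host

        # swap roles (server on XP) to measure RHEL -> XP, the slow direction
        iperf -c winxp-host

    If iperf shows the same asymmetry, the problem is below samba (driver, duplex negotiation, offload settings); if iperf is symmetric, samba or SELinux labeling is back on the table.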

    Read the article

  • Issue with aborted MySQL connections (error code: 4)

    - by arikfr
    Some of the connections between my application server (Ubuntu, Apache, PHP) and my DB server (Ubuntu, MySQL) are failing with error code 4. According to the documentation, error code 4 is:

        OS error code 4: Interrupted system call

    At first I thought the issue might be that the DB server has too many connections and fails because there are too many open files. But that seems not to be the case, because:

    - "Too many open files" has a different error code (24).
    - I've checked, and during peak time the server had 497 files open (checked using the lsof command) while the maximum is 1024.

    The TCP settings were already checked (see prior question). Any ideas what this can be, or what I should check?
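
    For mapping these numbers, MySQL ships a small utility called perror; a quick sketch of checking both codes and the current open-file count (the mysqld PID lookup is an assumption about the setup):

        $ perror 4
        OS error code   4:  Interrupted system call

        $ perror 24
        OS error code  24:  Too many open files

        # files currently held open by mysqld
        $ lsof -p $(pidof mysqld) | wc -l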

    Read the article

  • Find out the size of a .tar.gz archive in the terminal without unpacking

    - by Sven
    I have a 32GB .tar.gz archive and I'd like to know the size of the files if I unpack this compressed archive. I'd like to avoid unpacking the archive first and then using e.g. du. Is it possible to find out the size of the contained files without unpacking the compressed archive (on a Linux and/or Mac OS X system)? I know that another archive also contains .tar.gz files. Is it also possible to calculate the size of the unpacked archives that are contained within an archive? (For example by setting a level to which the "unpacking" should be simulated?)
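
    For a single archive, the verbose listing already carries each member's unpacked size, so summing that column answers the first part without extracting anything to disk; a minimal sketch:

        # total unpacked size in bytes (third column of the listing)
        tar -tzvf archive.tar.gz | awk '{sum += $3} END {print sum}'

        # or: size of the raw uncompressed tar stream (contents plus tar headers)
        zcat archive.tar.gz | wc -c

    Nested .tar.gz members are only counted at their compressed size this way; getting their unpacked size too would mean decompressing each inner archive's stream as well, which no standard tar flag simulates.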

    Read the article

  • Resize the /var directory in redhat enterprise edition 4

    - by Sri
    I am running NDB MySQL. The log files fill up the /var directory, so now I can't start the ndbd service. As a temporary fix, I deleted the log files and it's working fine again, but the log files keep filling up the /var directory. I have plenty of space in another partition, so I would like to move space from that partition to /var. Here is my output from df -h:

        Filesystem                      Type   Size  Used Avail Use% Mounted on
        /dev/mapper/VolGroup00-LogVol00 ext3    54G  2.9G   49G   6% /
        /dev/cciss/c0d0p1               ext3    99M   14M   81M  14% /boot
        none                            tmpfs 1013M     0 1013M   0% /dev/shm
        /dev/cciss/c0d0p2               ext3   9.7G  9.7G     0 100% /var

    There is plenty of space on /dev/mapper/VolGroup00-LogVol00, so I would like to move about 10G from there to /var. Could you please help me solve this problem?
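
    Note that /var here lives on a plain partition (/dev/cciss/c0d0p2), not in the LVM volume group, so space can't simply be shifted onto it with lvextend. One low-risk workaround is to relocate the NDB data/log directory onto the roomy root filesystem and symlink it back; a sketch (the directory path is an assumption -- check DataDir in your cluster config, and stop the cluster first):

        mkdir -p /data
        mv /var/lib/mysql-cluster /data/
        ln -s /data/mysql-cluster /var/lib/mysql-cluster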

    Read the article

  • Mounting share over VPN

    - by user1337
    I have a CentOS 5 web server which currently mounts an NFS export on my Mac OS X 10.7 laptop. It works great, except that over VPN I can't get it to mount at all. I tried SMBUp but haven't been able to get it working even locally. It doesn't look like there's an easy way to install netatalk for CentOS 5, and even then, I'm not sure that's the best way to do it. I tried a GUI SSH client that can "mount an FTP disk" and it would work, except the files require root access, there's no external root access, and the client can't elevate permissions. The basic thing I need is for the server to be able to read the files off of my laptop, connected via VPN. The files are frequently updated (every 5-20 seconds), so I don't want to do that manually via SSH. Which protocol can work with both platforms and easily handle the latency introduced by VPN (and potentially mobile broadband)? Thanks
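
    One option that runs entirely over SSH (and therefore copes with the VPN the same way an interactive session does) is sshfs on the CentOS side, mounting the laptop's directory onto the server. A sketch, assuming fuse-sshfs is available to the server (e.g. from EPEL) and the laptop has Remote Login enabled; the host and paths are placeholders:

        yum install fuse-sshfs
        mkdir -p /mnt/laptop
        sshfs user@laptop-vpn-ip:/Users/user/project /mnt/laptop -o reconnect,ro

    Running the mount as root (or adding -o allow_other) also sidesteps the root-access problem the FTP-disk client ran into.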

    Read the article

  • Hard drive placement

    - by zm15
    I'm a video editor working with large HD files, and I am building a new computer. I will be running 2 hard drives: one with the operating system and all the programs, and one with all the project files I will be working from. I am keeping these separate. I will be purchasing one 10k rpm hard drive, so I will have a 10k rpm drive and a 7200 rpm drive. Should I put the OS on the faster drive, or put my working files on the faster drive?

    Read the article

  • Opening a Corrupted winmail.dat file

    - by tearman
    I have a set of winmail.dat files that apparently come from a set of emails with corrupted headers. It looks like Exchange 2010 changed the headers around sometime last September and basically rendered exported .eml files unopenable. The HTML/plain-text emails seem to be OK, but the ones that use Rich Text (specifically Microsoft's TNEF format) will not open in any program, Microsoft or not. I've attempted many different non-Microsoft converters, and they see the message as corrupted as well. If I remove the headers of the email and rename it as a winmail.dat file, some emails will open in Word, but most won't. If you look at the email in a text editor, there are null characters everywhere that distort the email itself. Anybody have any experience with this and/or suggestions on how to at least open it?
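
    One more decoder worth throwing at these is the open-source tnef utility, which tends to be more tolerant of slightly malformed streams than Outlook. Note that TNEF is a binary format, so null bytes in a text editor are normal and not by themselves a sign of corruption; a more telling check is whether the file still starts with the TNEF signature. A sketch:

        # a valid TNEF stream begins with 0x223e9f78 (on disk: 78 9f 3e 22)
        head -c 4 winmail.dat | od -An -tx1

        # attempt extraction; any recoverable attachments land in the current directory
        tnef winmail.dat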

    Read the article

  • Entourage to Outlook Migration questions

    - by George Bluff
    I am currently migrating a user's information from a POP email account to my Exchange server. I have already migrated them over to my hosted Exchange, and their email is flowing properly. Now the user is moving from Entourage on a Mac (10.7) to Outlook 2010 on a PC (Windows 7), and I was wondering what the easiest way to migrate him is, since there are no .pst files. I have been able to get his email over by dragging the inbox from Entourage to the desktop, converting the files to .eml using IMAPSize, importing them into Outlook Express (which only works on Windows XP), exporting to a .pst, and then importing that into the new account. It takes a while with large mailboxes, but it works. The issue I am now having is with calendar items: I exported the calendar and got a folder with all the .ics files, but Outlook 2010 doesn't seem to have an easy way to import all of them. Any thoughts?
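
    Outlook 2010 will import a single .ics containing many events, so one workaround is to merge the exported files into one calendar first. A rough sketch, assuming each export is a standalone single-event VCALENDAR (test on a copy -- anything outside the VEVENT blocks, such as time-zone definitions, is dropped by this):

        # merge every VEVENT from the exported files into merged.ics
        {
          echo "BEGIN:VCALENDAR"
          echo "VERSION:2.0"
          sed -n '/BEGIN:VEVENT/,/END:VEVENT/p' *.ics
          echo "END:VCALENDAR"
        } > merged.ics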

    Read the article

  • Load testing nginx inside AWS

    - by andy
    I'm trying to load test nginx running on AWS, and I need to optimise it to handle 1Gbps of inbound traffic. Currently I've got it to peak at 85 Mbit/s: nginx runs on an m1.large, with 4 other machines hitting it using ab with -i (for HEAD requests), -k (keepalives), -r (ignore failed requests), -n 500000 and -c 20000. I'm struggling to generate more than 85 Mbit/s from the 4 machines, yet when I scp a large file I get nearly 0.25 Gbit/s over the same network. Are there any tools or approaches I could use to load test nginx that might generate more load? I'm only interested in inbound traffic, so perhaps a DoS tool could help if it throws away responses? I'm hitting a very small (40 byte) static asset, and have peaked at handling 50K concurrent connections and 25k reqs/s when using just a single load generator machine.
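
    One cheap thing to try before reaching for new tools: ab is single-threaded, so a single instance often saturates one core long before the NIC is full. Running several ab processes per generator spreads the work across cores; a sketch (the URL is a placeholder):

        # 4 ab instances per load generator, splitting the request budget
        for i in 1 2 3 4; do
          ab -i -k -r -n 125000 -c 5000 http://target/tiny.bin &
        done
        wait

    Also, with HEAD requests for a 40-byte asset each request is tiny, so the test is request-rate-bound rather than byte-bound; if inbound Mbit/s is the figure being optimised, raising bytes per request (e.g. POSTing a payload with ab -p file -T content-type) moves it faster than adding connections.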

    Read the article

  • Aligning Numbered Bullet Points in Word 2007

    - by Frustratedwithbullets
    Hello, I am putting together a very large business manual which incorporates numbered headings, steps to follow, diagrams, etc. When I use bullet points, they align perfectly as I work through the processes. However, when I include a diagram, or something different from the "norm" of text, the alignment changes. I would like all the bullet points to be aligned consistently in the whole document, regardless of where they appear. Is there a way to save the settings so that the bullets always appear in the same position? Currently I am having to reset the indents by dragging the tabs on the ruler, and since this will be a large document, I don't want to manually adjust the numbered bullets every time. Help would be greatly appreciated. Thanks very much.

    Read the article

  • XAMPP won't start Apache on Mac OS

    - by Paul Masri
    When I try starting Apache from the XAMPP control panel (Mac OS X Snow Leopard), I get the following error popup and Apache won't start:

        /Applications/XAMPP/xamppfiles/bin/apachectl: line 70: ulimit: open files: cannot modify limit: Invalid argument
        (48)Address already in use: make_sock: could not bind to address [::]:80

    XAMPP was running perfectly 10 minutes earlier. I stopped Apache to add some .conf files and it failed on restart. I removed all the new .conf files (i.e. reverted it to how it was before), but now I get the above message. EDIT: I've checked Activity Monitor and I see the "httpd" processes (one by _www nested within root). I just tried quitting these, but they're auto-restarted under new process IDs and it didn't solve the problem.
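
    "Address already in use" on port 80 means some process (here, the leftover httpd workers) is still bound there, so the new Apache can't start. A quick check and cleanup from Terminal, sketched:

        # who currently owns port 80?
        sudo lsof -i :80

        # stop the stray Apache processes (the root-owned parent respawns children,
        # so it has to go too)
        sudo killall httpd

    Once nothing is listening on port 80, XAMPP's start button should work again; the ulimit line on its own is usually a harmless warning.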

    Read the article

  • Less daunting front end for SQL Server

    - by Martin
    We currently have a few users who have been using Access very successfully to throw around large amounts of data. We've now got to the point where the data is just too large to be held in Access, and we also want to hold it in a single place where multiple users can access it, so we have moved the data over to SQL Server. I want to provide a general tool that they can use to view the data on the server and do some simple things like run queries and filters and export the data for offline manipulation. I don't want the support headaches that might come with rolling out SQL Management Studio, and neither do I want to have to create an Access database with links for each current database or ones that are created in the future. Can anyone recommend a simple tool that will connect to a server, list all the databases and allow a user to drill into a table and look at the data? Many thanks.

    Read the article

  • How to configure DNS BIND to work locally on one computer?

    - by user619656
    I want to make some changes to the BIND source code. In order to test those changes, I want to be able to post queries to my local BIND server and have it use only the local zone files. I know how to make the zone files and, more or less, the named.conf file, but what should I put in /etc/resolv.conf? Currently resolv.conf contains the line "nameserver 192.168.0.1", which I guess is my router's IP address, so queries go through the router to my ISP. I want those queries to go to the local BIND server and be answered from the zone files I provided. Is there a way to do this using the resolv.conf file, or should I do something else?
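
    Pointing the resolver at the local named instance is enough for queries issued on that machine; a minimal sketch:

        # /etc/resolv.conf -- send all lookups to the local BIND
        nameserver 127.0.0.1

    For testing it is often easier to skip resolv.conf entirely and aim each query at the server explicitly (the zone name is a placeholder):

        dig @127.0.0.1 myzone.example A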

    Read the article

  • $DISPLAY dependent gtk themes

    - by Vlad Seghete
    I have a computer at home that I log into remotely. The "monitor" for it is a TV, so I want gtk applications to use a large font and icon theme, which I managed to do by editing the ~/.gtkrc-2.0 file and some other similar stuff. What I want is a separate theme for when I'm logged in remotely. The best way to explain it is that I would like my gtk theme choice to depend on the X display that the application is started on. For example, if I start something on :0.0, that is the TV and I want large fonts; but if I start it on localhost:10.0, I want a regular-size font, because it will be rendered on my laptop screen. The elegant solution would be some sort of IF statement in the .gtkrc-2.0 file that checks the $DISPLAY variable and behaves accordingly. The problem is I can't find any documentation on control structures in .gtkrc files, or whether that's even possible.
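
    gtkrc files support include directives but, as far as I know, no conditionals, so the usual workaround is to pick the rc file outside GTK via the GTK2_RC_FILES environment variable, keyed on $DISPLAY. A sketch for ~/.bashrc or a wrapper script (the rc file names are placeholders):

        # choose a gtk2 theme based on where the X connection renders
        case "$DISPLAY" in
          :0*) export GTK2_RC_FILES="$HOME/.gtkrc-2.0-tv" ;;      # local TV output
          *)   export GTK2_RC_FILES="$HOME/.gtkrc-2.0-laptop" ;;  # forwarded displays
        esac

    Applications started from that shell (or via the wrapper) pick up the matching theme.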

    Read the article

  • SSD Performance for PHP?

    - by Andrew Fashion
    My programmer just built an application with PHP using the Doctrine ORM (it will be a high-traffic social networking website), and it's very heavy on PHP/Apache and CPU. The queries are wonderfully fast and MySQL is barely using any CPU; it's just Apache. I was curious whether an SSD would help speed up PHP/Apache, because I know the bottleneck is PHP reading multiple files, class files, and loading up a bunch of data. Common sense makes me think that if PHP is reading multiple PHP files, an SSD would only help as far as read/write goes? I was thinking of using a high-performance SSD for the PHP application, but for user image uploads I would continue using a 15k SAS drive. Are there any performance issues with using an SSD in this kind of situation? And would it prove to help speed up PHP/Apache and help the CPU problem?
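
    Before buying hardware, it may be worth measuring how much of a request is actually spent opening and reading files versus executing code; a rough sketch against a single CLI run of the app's front controller (the path is a placeholder):

        # per-syscall time summary for one request's worth of includes
        strace -c -f -e trace=open,read,stat php /var/www/index.php > /dev/null

    If the hot class files are already sitting in the OS page cache (likely on a busy server), a faster disk changes little, which would point at CPU-bound PHP execution rather than reads.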

    Read the article

  • How to download a URL as a file?

    - by Michelle
    A website has "hidden" some MP3 files by embedding them as Shockwave files, as follows:

        <span class="caption"><!-- Odeo player --><embed src="http://odeo.com/flash/audio_player_tiny_gray.swf"
          quality="high" name="audio_player_tiny_gray" align="middle" allowScriptAccess="always"
          wmode="transparent" type="application/x-shockwave-flash"
          flashvars="valid_sample_rate=true external_url=http://podcast.cbc.ca/mp3/sundayeditionstream_20081125_9524.mp3"
          pluginspage="http://www.macromedia.com/go/getflashplayer"></embed></span>

    How can I download the files for off-line listening? I've found two methods:

    1. The Stack Overflow method: create a new local HTML file with just the links, for example:

        <a href="http://podcast.cbc.ca/mp3/sundayeditionstream_20081125_9524.mp3">Sunday Edition 25Nov2008</a>

       Open the file in the browser, right-click the link, and choose "Save Link As".

    2. The Super User method: install the Firefox add-in iGet (be sure to use the right version for your Firefox version), then go to Tools > Downloads and enter the URL in the field.

    Are there any other ways?
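
    Since the real MP3 address is sitting in the flashvars external_url parameter, any command-line downloader can fetch it directly; for example:

        # keep the remote filename (-O) while downloading
        curl -O http://podcast.cbc.ca/mp3/sundayeditionstream_20081125_9524.mp3

        # or
        wget http://podcast.cbc.ca/mp3/sundayeditionstream_20081125_9524.mp3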

    Read the article

  • Writing to network share failed

    - by Unreason
    I have Outlook files stored on a network share and accessed by clients directly. Outlook is version 2003, the clients are Windows XP, and the server is Windows Server 2003. The files are quite big, at around 3GB. One common problem is that I get "delayed write failed" errors, and this happens only on these PST files; when it happens, I have to run scanpst.exe to fix the PST file. I did not find any entries in the event logs that I could relate to the issue. What would you suggest changing to fix the issue, or where should I look to diagnose it further? EDIT: No loss on ping, and ping times are within normal range for a LAN.

    Read the article

  • Unix file permission for groups

    - by GOPI
    I am working on an HP-UX server. I have a directory in which users from different groups need to create files. The users of a given group should have complete access to the files created by their group, and only read access to the files created by other groups. I tried setting the sticky bit on the directory to restrict access for other groups, but I face the following problem: I created File1 as user1 of GroupA, and when I tried to run 'rm' as user2 of the same GroupA, it was refused because user2 is not the owner of the file. Can setgid (at the directory level) or some other command help me sort this out?
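
    The sticky bit is exactly what forbids user2's rm (on a sticky directory, only a file's owner may delete it), so it has to come off. A setgid bit plus a group-friendly umask then covers the "same group has full access" part; a sketch for one group's directory (the names are placeholders):

        chgrp GroupA /shared/groupa
        chmod 2775 /shared/groupa    # setgid: new files inherit GroupA; others read-only
        chmod -t /shared/groupa      # make sure the sticky bit is off

        # each user needs a umask that leaves group write enabled
        umask 002

    With several groups writing side by side, a per-group subdirectory like this is the manageable layout, since setgid can only force one group per directory.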

    Read the article

  • How to cleanup tmp folder safely on Linux

    - by Syncopated
    I use RAM for my tmpfs /tmp -- 2GB, to be exact. Normally this is enough, but sometimes processes create files in there and fail to clean up after themselves, for instance when they crash. I need to delete these orphaned tmp files, or else future processes will run out of space on /tmp. How can I safely garbage-collect /tmp? Some people do it by checking the last-modification timestamp, but this approach is unsafe because there can be long-running processes that still need those files. A safer approach is to combine the last-modification-timestamp condition with the condition that no process has an open file handle for the file. Is there a program/script/etc. that embodies this approach, or some other approach that is also safe? Incidentally, does Linux/Unix allow a mode of file opening with creation wherein the created file is deleted when the creating process terminates, even if it's from a crash?
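
    Both conditions fit into a single find invocation, using fuser as the "no open handle" test; a sketch (run as root, and dry-run it with -print in place of -delete first):

        # remove regular files untouched for 2+ days AND not open in any process
        find /tmp -xdev -type f -mtime +2 ! -exec fuser -s {} \; -delete

    This is still racy in principle (a process could open the file between the fuser check and the delete). On the incidental question: yes -- a process can open a temp file and immediately unlink it; the name disappears at once and the storage is reclaimed automatically when the last descriptor closes, even on a crash. That is what tmpfile(3) does.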

    Read the article

  • How to migrate Notepad++ settings?

    - by NoCatharsis
    I am trying to make every program I use portable if possible, and Notepad++ is on the list. The only problem is that I've had a native installation until now, so I'm not totally sure which settings files need to be moved to the portable directory. Surely there's a function tucked away somewhere in NPP exactly for this purpose, or some plugin out there? The developers have literally thought of everything else, yet this is the one thing I cannot find anywhere in the NPP wiki or otherwise, and I don't want to miss an important file. The closest I've gotten are the documentation pages "Notepad++'s configuration files" and "Where are all the files?". Should I just copy every configuration file listed on the first link?

    Read the article

  • rsync --link-dest behaviour when run as sudo

    - by fotNelton
    In order to create regular backups, I'm using rsync together with --link-dest so as to create hard links for unchanged files. For example:

        rsync -ax \
            --partial --delete --delete-excluded --inplace \
            --exclude-from=/tmp/temp_excludes \
            --link-dest=/Volumes/Backup/current \
            /Users /Volumes/Backup/2012-06-25

    This works very well as long as I start the process from my normal user account. But as soon as I start it using sudo, it behaves erratically: rsync copies all the unchanged files instead of hard-linking them. Since sudo modifies the environment, I've also tried sudo -E in conjunction with making sure that my sudoers file has the corresponding option set; that didn't work either. So, the question is: how can I run rsync using sudo? Whereas the above example only shows a backup of the Users directory, I also need to back up some system files that I can only access as root.
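
    One likely mechanism, worth verifying rather than taking as the answer: --link-dest only hard-links a file when every preserved attribute matches, and -a behaves differently under sudo -- as a normal user rsync cannot preserve ownership, but as root it does, so source files whose owner differs from the user-made snapshot in current no longer "match" and get copied. An itemized dry run shows which attribute blocks the link:

        # letters in the change column (o = owner, g = group, ...) name the mismatch
        sudo rsync -axn -i \
            --link-dest=/Volumes/Backup/current \
            /Users /Volumes/Backup/probe

        # identical inode numbers confirm a hard link between snapshots
        ls -i /Volumes/Backup/current/somefile /Volumes/Backup/2012-06-25/somefile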

    Read the article

  • Trying to unpack 2.5GB .tar.gz file on Linux but getting "An error occurred while trying to open the archive"

    - by TMM
    Hi, is there a limit on Linux for the file size of a .tar.gz (or of its contents)? I am currently creating a .tar.gz (both through the UI's "Compress As" and through the command line) for 2 files (6GB and 2GB), and even though it is created successfully, when I try to unpack it using Ark it throws the error "An error occurred while trying to open the archive". I have seen suggestions that it might be better to split the archive into several smaller .tar.gz files, but I was wondering exactly how to do this (and subsequently unpack the files). Also, is it totally impossible to use the single .tar.gz approach, as this would be much simpler? Thanks in advance, Tim
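
    Splitting at creation time and re-joining with cat is the usual recipe; a sketch:

        # create: pipe tar's output through split into 1GB pieces
        tar czf - file1 file2 | split -b 1024m - backup.tar.gz.part_

        # unpack: concatenate the pieces and untar in one stream
        cat backup.tar.gz.part_* | tar xzf -

    The single-archive approach itself is fine on modern systems (tar and gzip handle multi-GB files), so the failure here is more plausibly the unpacking tool; trying tar xzf directly on the command line before splitting would confirm whether Ark, rather than the archive, is the problem.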

    Read the article

  • Windows 7 PATH not expanding

    - by trinithis
    I am using Control Panel > All Control Panel Items > System > Advanced system settings > Environment Variables to create and edit environment variables for Windows 7. Under System variables I have the following pertinent variables:

        PROG32=C:\Program Files (x86)
        REALDWG_SDK_DIR=%PROG32%\Autodesk\RealDWG 2011
        Path=%REALDWG_SDK_DIR%;%PROG32%\Haskell\bin

    However, the following happens:

        C:\>echo %PROG32%
        C:\Program Files (x86)

        C:\>echo %Path%
        %REALDWG_SDK_DIR%;C:\Program Files (x86)\Haskell\bin

    Is it possible to have a chain of variables expand? If I rename Path to something else, I sometimes get the problem and sometimes I don't.

    Read the article

  • CGI Script not running in PHP file

    - by Unykvis
    I have a CGI script on the server called script.cgi, and I have added the following to the domain vhost:

        Action add-footer /cgi-bin/script.cgi
        AddHandler add-footer .htm .html

    I have changed it to:

        Action add-footer /cgi-bin/script.cgi
        AddHandler add-footer .htm .html .php

    If the page is HTML the code runs, but if the page is PHP it does not. Is there anything I need to add to the vhost so that PHP files also run this script? EDIT: I want to "inject" an HTML snippet into every possible page on the server, both HTML and PHP files. The code only works for HTML files and I don't know why.
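
    A plausible reason the handler never fires for .php is that those files are already claimed by the PHP module's own handler, which takes precedence over the AddHandler mapping. For the PHP half of the pages, PHP itself can inject a footer into every response without touching the CGI route; a sketch (the footer path is a placeholder):

        # php.ini
        auto_append_file = /var/www/footer.php

        # or per-vhost with mod_php:
        php_value auto_append_file /var/www/footer.php

    Keeping the Action/AddHandler pair for .htm/.html and auto_append_file for .php would cover both, at the cost of maintaining the footer in two mechanisms.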

    Read the article

  • Importing Bookmarks from a Text File (to any browser/website)

    - by Gary Oldfaber
    I have dozens of text files, each containing around 60 URLs, accumulated over years of browsing on multiple computers. I wish to import these into any browser, to then allow me to use cross-browser importing. My ultimate goal is to import the bookmarks into somewhere like delicious, which will automatically tag the links, allowing me to sort each page by subject. The closest I've managed to find is "Import bookmarks to firefox from txt file"; however, while that plugin imports from a text file, it has no correlation with Firefox's bookmarks and only allows you to export back to csv/txt files. I understand that the problem with importing from text files is that bookmarks need a title, and so I wish to use a given page's existing title. I've been unable to find any such tool on the net.
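
    Every major browser (and delicious) imports the old Netscape bookmark HTML format, so one route is converting each text file into that first. A minimal sketch that uses the URL itself as a stand-in title (pulling each page's real <title> would need an extra fetch per link, e.g. with curl):

        # url-per-line text file -> importable bookmarks.html
        {
          echo '<!DOCTYPE NETSCAPE-Bookmark-file-1>'
          echo '<TITLE>Bookmarks</TITLE><H1>Bookmarks</H1>'
          echo '<DL><p>'
          while read -r url; do
            [ -n "$url" ] && echo "  <DT><A HREF=\"$url\">$url</A>"
          done < urls.txt
          echo '</DL><p>'
        } > bookmarks.html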

    Read the article
