Search Results

Search found 40479 results on 1620 pages for 'binary files'.


  • How to use chain.p7b with Apache?

    - by Debianuser
    I wanted to set up an SSL website on Apache and applied for a certificate from my local ISP. All they sent me was a single file named chain.p7b. I have always used certificates from other vendors without any issues, but they usually provide two files, to be configured as SSLCertificateFile and SSLCertificateChainFile in Apache. Following instructions from several online resources, I opened the p7b file in Windows and extracted 4 certificates from it. I then tried configuring Apache with one of them and it worked, but the browser shows a warning: "The certificate is not trusted because no issuer chain was provided." I thought I had to use the remaining 3 files as SSLCertificateChainFile and/or SSLCACertificateFile. I tried that but it didn't work, so I am assuming it might be something completely different. Has anyone faced this issue before? The following page http://www-01.ibm.com/support/docview.wss?uid=swg21458997 talks about using a keystore, but is that relevant to Apache?
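
    A .p7b file is a PKCS#7 bundle, and openssl can unpack it into the PEM certificates Apache expects; a minimal sketch, assuming the bundle is PEM-encoded (add -inform DER if openssl rejects it):

        # extract every certificate in the PKCS#7 bundle into one PEM file
        openssl pkcs7 -print_certs -in chain.p7b -out chain.pem

    The server's own certificate from chain.pem then goes in SSLCertificateFile, and the remaining (intermediate) certificates go in the file named by SSLCertificateChainFile, which is what silences the "no issuer chain" warning.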


  • Find out the size of a .tar.gz archive in the terminal without unpacking

    - by Sven
    I have a 32GB .tar.gz archive and I'd like to know the size of the files if I unpack this compressed archive. I'd like to avoid unpacking the archive first and then using e.g. du. Is it possible to find out the size of the contained files without unpacking the compressed archive (on a Linux and/or Mac OS X system)? I also know that another archive contains .tar.gz files itself. Is it possible to calculate the size of the unpacked archives that are contained within an archive? (For example, by setting a level to which the "unpacking" should be simulated?)
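
    One common approach: tar can list the archive's contents (decompressing the stream on the fly without writing anything to disk), and awk can sum the size column. A sketch; note the size is field 3 in GNU tar's verbose listing but field 5 in the BSD tar shipped with Mac OS X:

        # total unpacked size, in bytes, without extracting (GNU tar)
        tar tzvf archive.tar.gz | awk '{sum += $3} END {print sum, "bytes"}'

    Nested .tar.gz members are only counted at their compressed size this way; measuring their unpacked contents would need a second pass that pipes each inner archive through the same listing.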


  • Hard drive placement

    - by zm15
    I'm a video editor working with large HD files, and I am building a new computer and need some help. I will be running 2 hard drives: one with the operating system and all the programs, and one with all the project files I will be working from. I am keeping these separate. I will be purchasing a 10k rpm hard drive, so I will have a 10k rpm drive and a 7200 rpm drive. Should I put the OS on the faster drive, or my working files on the faster drive?


  • Mounting share over VPN

    - by user1337
    I have a CentOS 5 web server which currently mounts an NFS export on my Mac OS X 10.7 laptop. It works great, except that over VPN I can't get it to mount at all. I tried SMBUp but haven't been able to get it working even locally, and it doesn't look like there's an easy way to install netatalk for CentOS 5; even so, I'm not sure that's the best way to do it. I tried using a GUI SSH client that can "mount a FTP disk", and it would work except that the files require root access, there's no external root access, and the client can't elevate permissions. The basic thing I need is for the server to be able to read the files off my laptop, connected via VPN. The files are frequently updated (every 5-20 seconds), so I don't want to do that manually via SSH. Which protocol can work with both platforms and easily handle the latency introduced by VPN (and potentially mobile broadband)? Thanks
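
    One candidate that fits both ends (OS X ships sshd; CentOS 5 has fuse-sshfs in EPEL) is sshfs, which also tolerates flaky links reasonably well via its reconnect option. A hedged sketch from the server side, with the host address and paths purely illustrative:

        # mount the laptop's directory on the server over SSH, via the VPN address
        sshfs user@10.8.0.2:/Users/me/shared /mnt/laptop \
            -o reconnect,ServerAliveInterval=15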


  • Resize the /var directory in redhat enterprise edition 4

    - by Sri
    I am running NDB MySQL. The log files fill up the /var directory, so now I can't start the ndbd service. As a temporary fix I deleted the log files and everything worked again, but the log files keep filling up /var. I have plenty of space in another partition, so I would like to give some of that space to /var. Here is my output from df -h:

        Filesystem                      Type    Size  Used Avail Use% Mounted on
        /dev/mapper/VolGroup00-LogVol00 ext3     54G  2.9G   49G   6% /
        /dev/cciss/c0d0p1               ext3     99M   14M   81M  14% /boot
        none                            tmpfs  1013M     0 1013M   0% /dev/shm
        /dev/cciss/c0d0p2               ext3    9.7G  9.7G     0 100% /var

    There is plenty of space on /dev/mapper/VolGroup00-LogVol00, so I would like to move 10G from it to /var. Could you please help me solve this problem?
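
    Since / lives on LVM while /var is a plain partition, one route is to create a new logical volume for /var rather than resize the partition. A hedged sketch, assuming VolGroup00 still has unallocated extents (check with vgdisplay; if not, the root LV and its filesystem must be shrunk first, which is riskier) and that the copy happens from single-user mode:

        lvcreate -L 10G -n LogVolVar VolGroup00     # carve out a 10G volume
        mkfs.ext3 /dev/VolGroup00/LogVolVar
        mount /dev/VolGroup00/LogVolVar /mnt/newvar
        cp -a /var/. /mnt/newvar/                   # preserve owners and modes
        # finally, point the /var entry in /etc/fstab at the new volume and reboot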


  • Issue with aborted MySQL connections (error code: 4)

    - by arikfr
    Some of the connections between my application server (Ubuntu, Apache, PHP) and my DB server (Ubuntu, MySQL) are failing with error code 4. According to the documentation, error code 4 is:

        OS error code 4: Interrupted system call

    At first I thought the issue might be that the DB server has too many connections and fails because there are too many open files. But that seems not to be the case, because "too many open files" has a different error code (24), and I've checked that during peak time the server had 497 files open (checked using the lsof command) while the maximum is 1024. The TCP settings were already checked (see my prior question). Any ideas what this can be, or what I should check?


  • Opening a Corrupted winmail.dat file

    - by tearman
    I have a set of winmail.dat files that apparently evolved from a set of emails with corrupted headers. It looks like Exchange 2010 changed the headers around sometime last September and essentially rendered the exported .eml files unable to open. The HTML/plain-text emails seem to do OK, but the files that use RichText (specifically Microsoft's TNEF format) will not open in any program, Microsoft or not. I've attempted many different non-Microsoft converters, and they see the messages as corrupted as well. If I remove the headers of an email and rename it as a winmail.dat file, some emails will open in Word, but most won't. If you look at such an email in a text editor, there are null characters EVERYWHERE that distort the email itself. Anybody have any experience with this and/or suggestions on how to at least open it?
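
    One more thing that may be worth a try, offered as an assumption rather than a known fix for this particular corruption: the open-source tnef utility (packaged for most Linux distributions) parses Microsoft's TNEF container directly and can sometimes extract the attachments even when mail clients refuse the file:

        # attempt to unpack attachments from the TNEF stream
        tnef -f winmail.dat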


  • XAMPP won't start Apache on Mac OS

    - by Paul Masri
    When I try starting Apache from the XAMPP control panel (Mac OS X Snow Leopard), I get the following error popup and Apache won't start:

        /Applications/XAMPP/xamppfiles/bin/apachectl: line 70: ulimit: open files: cannot modify limit: Invalid argument
        (48)Address already in use: make_sock: could not bind to address [::]:80

    XAMPP was running perfectly 10 minutes earlier. I stopped Apache to add some .conf files and it failed on restart. I removed all the new .conf files (i.e. reverted it to how it was before), but now I get the above message. EDIT: I've checked Activity Monitor and I see the httpd processes (one owned by _www nested under one owned by root). I just tried quitting these, but they're auto-restarted with new process IDs, and it didn't solve the problem.
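
    The "Address already in use" line means something else is listening on port 80; a hedged diagnostic sketch (the respawning behaviour suggests Mac OS X's built-in Apache, which launchd restarts when killed):

        # show which process is actually holding port 80
        sudo lsof -nP -iTCP:80 -sTCP:LISTEN
        # if it is the OS-bundled Apache ("Web Sharing"), stop it cleanly
        sudo /usr/sbin/apachectl stop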


  • Entourage to Outlook Migration questions

    - by George Bluff
    I am currently migrating a user's information from a POP email account to my Exchange server. I have already migrated them over to my hosted Exchange, and their email is flowing properly. Now the user is moving from Entourage on a Mac (10.7) to Outlook 2010 on a PC (Windows 7). I was wondering what the easiest way is to migrate him, since there are no .pst files. I have been able to get his email over by dragging the inbox from Entourage to the desktop, converting the files to .eml using IMAPSize, importing them into Outlook Express (which only works on Windows XP), exporting to a .pst, then importing that into the new account. It takes a while with large mailboxes, but it works. The issue I am now having is with calendar items. I exported the calendar and got a folder with all the .ics files, but Outlook 2010 doesn't seem to have an easy way to import all of them. Any thoughts?


  • How to configure DNS BIND to work locally on one computer?

    - by user619656
    I want to make some changes to the BIND source code. In order to test those changes, I want to be able to post queries to my local BIND server and have it use only the local zone files. I know how to write the zone files and, more or less, the named.conf file, but what should I put in /etc/resolv.conf? Currently resolv.conf contains the line nameserver 192.168.0.1, which I guess is my router's IP address, so queries go through the router to my ISP. I want those queries to go to the local BIND server and be answered from the zone files I provided. Is there a way to do this with the resolv.conf file, or should I do something else?
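
    To send queries to the local server instead of the router, point the resolver at the loopback address:

        # /etc/resolv.conf
        nameserver 127.0.0.1

    Each zone to be answered locally then needs a stanza in named.conf; a minimal sketch, with the zone name and file path purely illustrative:

        zone "example.test" {
            type master;
            file "/var/named/example.test.zone";
        };

    For testing, dig @127.0.0.1 example.test queries the local server directly without touching resolv.conf at all.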


  • SSD Performance for PHP?

    - by Andrew Fashion
    My programmer just built an application with PHP using the Doctrine ORM (it will be a high-traffic social networking website), and it's very heavy on PHP/Apache and CPU. The queries are wonderfully fast and MySQL is barely using any CPU; it's just Apache. I was curious whether an SSD would help speed up PHP/Apache, because I know the bottleneck is PHP reading multiple files, class files, and loading up a bunch of data. So common sense makes me think that if PHP is reading multiple PHP files, an SSD would help on the read side? I was thinking of using a high-performance SSD for the PHP application, but for user image uploads I would continue using a 15k SAS drive. Are there any performance issues with using an SSD in this kind of situation? And would it prove to help speed up PHP/Apache and help with the CPU problem?
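
    An aside that may matter before buying hardware, offered on the assumption that repeated script reads really are the bottleneck: an opcode cache keeps compiled PHP in shared memory, so class and include files are barely touched on disk after warm-up. A minimal php.ini sketch using APC, the common choice at the time:

        extension = apc.so
        apc.enabled = 1
        ; shared-memory size; older APC versions expect a bare number of MB
        apc.shm_size = 64M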


  • How to download a URL as a file?

    - by Michelle
    A website has "hidden" some MP3 files by embedding them as Shockwave files, as follows:

        <span class="caption"><!-- Odeo player --><embed
            src="http://odeo.com/flash/audio_player_tiny_gray.swf"
            quality="high" name="audio_player_tiny_gray" align="middle"
            allowScriptAccess="always" wmode="transparent"
            type="application/x-shockwave-flash"
            flashvars="valid_sample_rate=true external_url=http://podcast.cbc.ca/mp3/sundayeditionstream_20081125_9524.mp3"
            pluginspage="http://www.macromedia.com/go/getflashplayer"></embed></span>

    How can I download the files for off-line listening? I've found two methods:

    1. The Stack Overflow method: create a local HTML file containing just the links, for example:

           <a href="http://podcast.cbc.ca/mp3/sundayeditionstream_20081125_9524.mp3">Sunday Edition 25Nov2008</a>

       Open the file in the browser, right-click the link, and choose "Save Link As".

    2. The Super User method: install the Firefox add-on Iget (be sure to use the right version for your Firefox version), then go to Tools > Downloads and enter the URL in the field.

    Are there any other ways?
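
    A third way: once the MP3 URL has been fished out of the flashvars attribute, any command-line downloader will fetch it directly, for example:

        # -O keeps the remote filename, -L follows any redirects
        curl -L -O http://podcast.cbc.ca/mp3/sundayeditionstream_20081125_9524.mp3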


  • Writing to network share failed

    - by Unreason
    I have Outlook files stored on a network share and accessed by clients directly. Outlook is version 2003, the clients are Windows XP, and the server is Windows Server 2003. The files are quite big, at around 3GB. One common problem is that I get "delayed write failed", and this happens only on these PST files. When it happens I have to run scanpst.exe to fix the PST file. I did not find any entries in the event logs that I could relate to the issue. What would you suggest changing to fix the issue, or where should I look to diagnose it further? EDIT: No packet loss on ping, and ping times are normal for a LAN.


  • How to migrate Notepad++ settings?

    - by NoCatharsis
    I am trying to portabilize every program I use if possible, and Notepad++ is on the list. The only problem is that I've had a native installation until now, so I'm not totally sure which settings files need to be moved to the portable directory. Surely there's a function tucked away somewhere in NPP exactly for this purpose, or some plugin out there? The developers have literally thought of everything else, yet this is the one thing I cannot find addressed anywhere in the NPP wiki or elsewhere, and I don't want to miss an important file. Here is the closest I've gotten: "Notepad++'s configuration files" and "Where are all the files?". Should I just copy every configuration file listed in the first link?
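
    A hedged sketch of the usual manual route: with a native install the per-user XML settings live under %APPDATA%\Notepad++, and the portable build reads its settings from its own directory (that is what the bundled doLocalConf.xml switches on), so copying those files across carries the configuration over. The destination path here is illustrative:

        rem copy the per-user settings next to the portable notepad++.exe
        xcopy "%APPDATA%\Notepad++\*.xml" "X:\Portable\Notepad++\" /Y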


  • Unix file permission for groups

    - by GOPI
    I am working on an HP-UX server. I have a directory in which users from different groups need to create files. The users of a given group should have complete access to the files created by their group, and only read access to the files created by other groups. I tried setting the sticky bit on the directory to restrict access for other groups, but I ran into the following problem: I created File1 as user1 of GroupA, and when I tried to run rm as user2 of the same GroupA, it was refused because user2 is not the owner of the file. Can setgid (at the directory level) or some other command help me sort this out?
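
    That refusal is exactly what the sticky bit does: in a sticky directory only a file's owner (or root) may delete it, regardless of group. A hedged sketch of the common alternative, one setgid subdirectory per group (names illustrative):

        mkdir /shared/groupA
        chgrp GroupA /shared/groupA
        chmod 2775 /shared/groupA   # setgid: files created here inherit GroupA;
                                    # the trailing 5 gives other groups read access
        # group members also need umask 002 so their files are group-writable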


  • How to cleanup tmp folder safely on Linux

    - by Syncopated
    I use RAM for my tmpfs /tmp: 2GB, to be exact. Normally this is enough, but sometimes processes create files in there and fail to clean up after themselves, which can happen if they crash. I need to delete these orphaned tmp files, or else future processes will run out of space on /tmp. How can I safely garbage-collect /tmp? Some people do it by checking the last-modification timestamp, but this approach is unsafe because there can be long-running processes that still need those files. A safer approach is to combine the last-modification-timestamp condition with the condition that no process has a file handle open on the file. Is there a program/script/etc. that embodies this approach, or some other approach that is also safe? Incidentally, does Linux/Unix allow a mode of file opening-with-creation wherein the created file is deleted when the creating process terminates, even from a crash?
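
    A hedged sketch of the combined check: find selects the old files, and fuser (which exits with status 0 when some process holds the file open) vetoes the deletion of anything still in use:

        # delete regular files under /tmp untouched for 7+ days
        # that no running process currently has open
        find /tmp -xdev -type f -mtime +7 ! -exec fuser -s {} \; -delete

    As for the last question: the classic Unix idiom is to open the file and immediately unlink it. The name vanishes at once, but the data survives until the last file descriptor is closed, which happens automatically even on a crash; tmpfile(3) does exactly this.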


  • Windows 7 PATH not expanding

    - by trinithis
    I am using Control Panel\All Control Panel Items\System -> Advanced system settings -> Environment Variables to create and edit environment variables on Windows 7. Under "System variables" I have the following pertinent variables:

        PROG32=C:\Program Files (x86)
        REALDWG_SDK_DIR=%PROG32%\Autodesk\RealDWG 2011
        Path=%REALDWG_SDK_DIR%;%PROG32%\Haskell\bin

    However, the following happens:

        C:\>echo %PROG32%
        C:\Program Files (x86)

        C:\>echo %Path%
        %REALDWG_SDK_DIR%;C:\Program Files (x86)\Haskell\bin

    Is it possible to have a chain of variables expand? If I rename Path to something else, I sometimes get the problem and sometimes I don't.
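
    One likely culprit, offered as a hedged guess: in the registry, an environment variable only expands %...% references if its value is stored as type REG_EXPAND_SZ; a value saved as plain REG_SZ is passed through literally. Expansion of variables that themselves reference other variables also depends on the order in which Windows processes them, which would fit the "sometimes" behaviour. The stored types can be checked with:

        reg query "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Environment" /v Path
        reg query "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Environment" /v REALDWG_SDK_DIR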


  • Trying to unpack 2.5GB .tar.gz file on Linux but getting "An error occurred while trying to open the archive"

    - by TMM
    Hi, is there a limit on Linux for the file size of a .tar.gz (or its contents)? I am currently creating a .tar.gz file (both through the UI's "Compress As" and through the command line) for 2 files (6GB and 2GB), and even though it is created successfully, when I try to unpack it using Ark it throws the error "An error occurred while trying to open the archive". I have seen suggestions in some places that it might be better to split the archive into several smaller .tar.gz files, but I was wondering exactly how to do this (and subsequently unpack the files). Also, is it totally impossible to use the single .tar.gz approach? That would be much simpler. Thanks in advance, Tim
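
    A hedged sketch of the splitting approach: split cuts the finished archive into fixed-size pieces, and concatenating the pieces reproduces the original byte-for-byte, so nothing needs to be re-archived:

        # cut the archive into 1GB pieces named archive.tar.gz.part_aa, _ab, ...
        split -b 1G archive.tar.gz archive.tar.gz.part_
        # later: rejoin and unpack in one stream
        cat archive.tar.gz.part_* | tar xzf -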


  • rsync --link-dest behaviour when run as sudo

    - by fotNelton
    In order to create regular backups, I'm using rsync together with --link-dest so as to create hard links for unchanged files. For example:

        rsync -ax \
            --partial --delete --delete-excluded --inplace \
            --exclude-from=/tmp/temp_excludes \
            --link-dest=/Volumes/Backup/current \
            /Users /Volumes/Backup/2012-06-25

    This works very well as long as I start the process from my normal user account. But as soon as I start the process using sudo, it behaves erratically: rsync copies all the unchanged files instead of hard-linking them. Since sudo modifies the environment, I've also tried sudo -E, in conjunction with making sure that my sudoers file has the corresponding option set; that didn't work either. So, the question is, how can I run rsync using sudo? While the example above only backs up the Users directory, I also need to back up some system files that I can only access as root.
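
    A hedged diagnostic: --link-dest only hard-links a file when everything rsync compares matches the copy under the link-dest directory, and because -a implies -o/-g/-p, that includes owner, group, and permissions. A run as root preserves ownerships that the earlier unprivileged runs could not, so the metadata no longer matches and every file gets copied afresh. An itemized dry run shows exactly which attribute differs per file:

        sudo rsync -axn --itemize-changes \
            --link-dest=/Volumes/Backup/current \
            /Users /Volumes/Backup/2012-06-25 | head -n 20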


  • Nagios and rrd on a old server

    - by Pier
    I have an old server (P4-based) on which Nagios (and all the other monitoring tools) is running. In the last few weeks we have been seeing strange behaviour: in /var/spool/pnp4nagios (where temporary files are stored before being processed by the pnp4nagios daemon) we have many files like perfdata.1274949941-PID-18839, and we get errors in npcd.log:

        [05-27-2010 11:17:46] NPCD: ThreadCounter 0/15 File is perfdata.1274951306-PID-27849
        [05-27-2010 11:17:46] NPCD: File 'perfdata.1274951306-PID-27849' is an already in process PNP file. Leaving it untouched.

    Sometimes some graphs are not drawn. The server is pretty loaded (load average around 5-6 normally), and I suspect that npcd times out and leaves those files behind. What could I do (apart from changing the server)? A few facts about the system: CentOS 5.5, Nagios 3.2.1, pnp4nagios 0.6 (built from source). Thanks
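
    A hedged stop-gap while the underlying load problem remains: stale spool files that npcd refuses to touch can be swept out periodically, e.g. from cron, so the spool directory does not grow without bound; the affected data points are lost either way, and the one-day age threshold is illustrative:

        # remove perfdata spool files older than 24 hours
        find /var/spool/pnp4nagios -name 'perfdata.*-PID-*' -mmin +1440 -delete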


  • Importing Bookmarks from a Text File (to any browser/website)

    - by Gary Oldfaber
    I have dozens of text files, each containing around 60 URLs, accumulated over years of browsing on multiple computers. I wish to import these into any browser, which would then allow me to use cross-browser importing. My ultimate goal is to import the bookmarks into somewhere like delicious, which will automatically tag the links, allowing me to sort each page by subject. The closest I've managed to find is "Import bookmarks to firefox from txt file"; however, while that plugin imports from a text file, it has no correlation with Firefox's own bookmarks and only allows you to export back to csv/txt files. I understand that the problem with importing from text files is that bookmarks need a title, and so I wish to use each page's existing title. I've been unable to find any such tool on the net.
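
    A hedged do-it-yourself sketch: every browser imports the old Netscape bookmark-file format, which is plain HTML, so a file of URLs (one per line, links.txt here) can be wrapped into an importable file with a few lines of shell. Titles are set to the URL itself, on the assumption that delicious (or the browser) fills in real titles later:

        {
          echo '<!DOCTYPE NETSCAPE-Bookmark-file-1>'
          echo '<TITLE>Bookmarks</TITLE><H1>Bookmarks</H1>'
          echo '<DL><p>'
          while read -r url; do
            printf '  <DT><A HREF="%s">%s</A>\n' "$url" "$url"
          done < links.txt
          echo '</DL><p>'
        } > bookmarks.html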


  • SSH and Latent Connections (e.g., satellite connections)

    - by user71494
    Most of the week I live in the city, where I have a typical broadband connection, but most weekends I'm out of town and only have access to a satellite connection. Trying to work over SSH on a satellite connection, while possible, is hardly desirable due to the high latency (> 1 second). My question is this: is there any software that will do something like buffering keystrokes on my local machine before they're sent over SSH, to help make the lag on individual keystrokes a bit more transparent? Essentially I'm looking for something that reduces the effects of the high latency for everything except commands (e.g., opening files, changing to a new directory, etc.). I've already discovered that vim can open remote files locally and write them back remotely, which, while a huge help, is not quite what I'm looking for, since it only works when editing files and requires opening a connection every time a read/write occurs. (For anyone who may not know how to do this and is curious, just use the command 'vim scp://host/file/path/here'.)
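
    One tool built for exactly this, which appeared around the time of the question, is mosh (mobile shell): it predictively echoes keystrokes locally and reconciles with the real terminal state over an SSH-authenticated UDP channel, so typing feels responsive even at satellite latencies:

        # used just like ssh for an interactive session
        mosh user@remotehost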


  • Tools for tracking disk usage

    - by Carey
    I manage a number of Linux fileservers. These all run applications written anywhere from 0 to 10 years ago. As sometimes happens, a machine will come close to, or run out of, disk space. Reasons include applications not rotating log files, a machine with 500GB of disk producing 150GB of new files every month that were not written to tape, databases gradually increasing in size, people doing silly things... generally a bit of chaos. Anyway, when a machine unexpectedly goes from 50% to 100% full in a couple of hours, I figure out what broke (lots of "du") and delete files or contact someone. I can also look at Cacti graphs to figure out what the machine's normal disk usage is (e.g. for /home). Does anyone know of any tools that will give finer-grained information on historical usage than a Cacti/RRD graph? Like "/home/abc/xyz increased 50GB in the last day".
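
    Absent a dedicated tool, a hedged low-tech sketch: a nightly cron job that snapshots per-directory usage gives exactly that "what grew yesterday" view with a diff (paths illustrative):

        # nightly, from cron: record per-directory usage, one file per day
        du -sm /home/* > /var/log/du-snapshots/$(date +%F).txt
        # later: see which directories changed between two days
        diff /var/log/du-snapshots/2010-06-01.txt /var/log/du-snapshots/2010-06-02.txt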


  • CGI Script not running in PHP file

    - by Unykvis
    I have a CGI script on the server called script.cgi, and I have added the following code to the domain's vhost:

        Action add-footer /cgi-bin/script.cgi
        AddHandler add-footer .htm .html

    I have changed it to:

        Action add-footer /cgi-bin/script.cgi
        AddHandler add-footer .htm .html .php

    If the page is HTML the code runs, but if the page is PHP it does not. Is there any directive I need to add to the vhost so that PHP files run this script too? EDITED: I want to "inject" an HTML snippet into every possible page on the server, HTML and PHP files alike. The code only works for HTML files, and I don't know why.
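
    A hedged explanation: an extension gets exactly one handler, and .php is already claimed by the PHP module, so the add-footer handler never fires for PHP pages. For those, PHP's own append mechanism can do the injection instead; a sketch for the vhost, with the footer path illustrative:

        # mod_php appends this file to every PHP page it serves
        php_value auto_append_file /var/www/footer.html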


  • Automate opening HTML and printing to PDF

    - by craigpatik
    I need a way to automate the following process in Windows 7:

    1. Open an .html file in Internet Explorer
    2. Print it to PDF
    3. Save the PDF with a patterned file name (i.e., original_name_YYYY-MM-DD.pdf)

    Ideally, I could drag and drop several files, or open a whole folder of files at once, and a PDF would be created for each one. A command-line solution is also acceptable. The files have to be opened in the browser because parts of the page are rendered with JavaScript on page load; in other words, if you simply right-click on the file in Explorer and choose "print", the resulting file is not the same, because the JS didn't run. If it helps, Internet Explorer can be set as the default browser, and a PDF printer can be set as the default printer.
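
    A hedged command-line sketch using wkhtmltopdf, a free tool that renders pages with a real WebKit engine (JavaScript included), so no browser or printer driver is involved; the PowerShell loop and the 2-second JavaScript delay are illustrative:

        Get-ChildItem *.html | ForEach-Object {
          $out = "{0}_{1}.pdf" -f $_.BaseName, (Get-Date -Format yyyy-MM-dd)
          & wkhtmltopdf --javascript-delay 2000 $_.FullName $out
        }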

