Search Results

Search found 11671 results on 467 pages for 'man pages'.

Page 194/467 | < Previous Page | 190 191 192 193 194 195 196 197 198 199 200 201  | Next Page >

  • PHP `virtual()` with Apache MultiViews not working after upgrade to Ubuntu 12.04

    - by Izzy
    I use PHP's virtual() function quite a lot on one of my sites, including for central elements. This worked fine for the last ~10 years -- but after upgrading (or rather moving, as it is on a new machine) to Ubuntu 12.04 it somehow got broken.

    Example setup (simplified): To make it easier to understand, I have simplified some things. Say I need an HTML fragment like <P>For further instructions, please look <A HREF='foobar'>here</A></P> in multiple pages. 10 years ago I used SSI for that, so the fragment sits in a file in a central place -- if e.g. the targeted URL changes, I only need to update it in one place. To serve multiple languages, I have Apache's MultiViews enabled, and at $DOCUMENT_ROOT/central/ there are the files foobar.html (English variant, and the default) and foobar.html.de (German variant). In the PHP code I simply placed <? virtual("/central/foobar"); ?> and let Apache take care of delivering the correct language variant.

    The problem: As said, this worked fine for about 10 years: German visitors got the German variant, all others the English one (depending on their preferred language). But after upgrading to Ubuntu 12.04 it no longer works: either nothing is delivered by the virtual() call, or (in connection with framesets) it even ends up as binary gibberish. Trying to figure out what happens, I played with a lot of things. I first thought MultiViews was (somehow) no longer available -- but calling http://<server>/central/foobar showed the right variant, depending on the configured language preferences. This also proved there was nothing wrong with file permissions. The error.log gave no clues either (no error message thrown). Finally, as a last resort, I changed the PHP command to <? virtual("central/foobar.html"); ?> -- and that very file was in fact included. So PHP's virtual() function basically works -- but the language-dependent negotiation obviously no longer works together with it as it did before. Of course I tried to find some relevant change (most likely in PHP's virtual()), using Google a lot and also searching the questions here -- unfortunately to no avail.

    Finally, the question: Putting "design questions" aside (surely today I would design things differently -- but at the moment I lack the time to change that for quite a huge number of pages): what can be done to make this work again? I surely missed something, but I cannot figure out what...
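
    For reference, a minimal sketch of the setup the question describes (the directory path and config context are illustrative, not taken from the poster's actual configuration): MultiViews has to be enabled for the directory holding the fragments, and the PHP page then includes the language-negotiated fragment via an Apache subrequest.

        # httpd.conf / vhost snippet (sketch): enable content negotiation for
        # the directory that holds the language variants
        <Directory /var/www/docroot/central>
            Options +MultiViews
        </Directory>

        # page.php (sketch): ask Apache for the extensionless name and let
        # MultiViews pick foobar.html or foobar.html.de
        <?php virtual("/central/foobar"); ?>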

    Read the article

  • How to associate localhost in Snow Leopard with a specific IP?

    - by Patrick
    I've disabled the web server on Snow Leopard, and I'm using an emulated Ubuntu machine with the Lighttpd web server. In order to access the web pages I need to specify the IP of the emulated machine. However, I now need to associate that IP with "localhost" in the Snow Leopard environment: when I type localhost on the Mac, I actually want to reach the web server on the Ubuntu machine. Do I need Apache on Snow Leopard to do the forwarding, or can I change network settings instead? Thanks.

    Read the article

  • what is the meaning of *this* crontab setting?

    - by aXqd
        * */1 * * * sh foo.sh

    I found this setting on one production machine, and foo.sh was executed every minute. I am guessing that the original author wanted it to be executed every hour, but I cannot find the official meaning of this setting in the crontab man page, so please help.

    UPDATE: I extracted these log timestamps from that machine, but I cannot work out the pattern behind them:

        2013-06-29 20:47:01
        2013-06-29 20:50:02
        2013-06-29 20:51:01
        2013-06-29 20:53:01
        2013-06-29 20:54:01
        2013-06-29 20:57:01
        2013-06-29 20:58:01
        2013-06-29 21:00:01
        2013-06-29 21:05:02
        2013-06-29 21:10:02
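
    For comparison, a short sketch of the two crontab lines (foo.sh is the script from the question; the comments are mine): in the five-field syntax the first field is the minute, so a bare * there fires every minute regardless of the hour field, while */1 in the hour field is just another way of writing *.

        # runs every minute: the minute field is *, and */1 in the hour field
        # means "every hour", which restricts nothing
        * */1 * * * sh foo.sh

        # runs once per hour, at minute 0
        0 * * * * sh foo.sh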

    Read the article

  • How to limit the number of concurrent CGI script invocations in Apache 2.2?

    - by hsivonen
    How can I limit the number of concurrent CGI invocations in Apache 2.2.x? More specifically, my problem is this: I have Apache hosting a Bugzilla instance and other stuff on one server. There's very little legitimate concurrent use of Bugzilla. However, it's trivial to mount a Denial of Service attack on the whole server by ignoring robots.txt and simply fetching a lot of bug pages that fork a process and hit a database.

    Read the article

  • Spam is Killing Me - Can I use GMail as a spam filter?

    - by kirkouimet
    I'm getting at least 50 Viagra ads a day and it's driving me insane. I currently have a hosted MS Exchange account and a Gmail account. My Gmail account forwards to my Exchange account. Both of my addresses are used evenly, and it has been really nice to have all of my e-mail end up in my Exchange box. I like replying from one address consistently, which is my Exchange address. Spam sent to my Gmail address is always caught, whereas spam sent to my Exchange address passes straight through to me. I don't want two spam filtration systems with quarantines that I need to check frequently for false positives. Here is my question: Can I set up my MX records such that all e-mail sent to my Exchange address is forwarded to my Gmail account, which will then forward it to my Exchange account? Kind of like using Gmail as the middle man.

    Read the article

  • place php errors in log file

    - by Gatura
    I am running Mac OS X 10.6.4 on an iMac and am using it as a development server. I have Apache and Entropy PHP 5 installed. When I write my applications, some pages won't run when PHP hits errors, but these errors are not recorded in a log file. I created a php_errors.log file and entered the following in php.ini: error_log = /usr/local/php5/logs/php_errors.log. However, errors are not written to this file, even though I also have log_errors = true. What could be the problem?
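
    As a point of reference, a minimal sketch of the relevant php.ini settings (the log path is the one from the question; which php.ini Apache actually reads on this setup is an assumption to verify with phpinfo()). The directives only take effect after Apache is restarted, and the log file has to be writable by the user Apache runs as.

        ; log runtime errors to a file instead of (or in addition to) displaying them
        log_errors = On
        error_log = /usr/local/php5/logs/php_errors.log

        ; optional on a development box: also show errors in the browser
        display_errors = On
        error_reporting = E_ALL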

    Read the article

  • Changing URL of Wordpress Post Completely

    - by HollerTrain
    I have a WP site where I want only specific Pages to have a completely different url. For example, the Page url might be 'www.domain.com/page_name' and I would like for it to be 'www.someotherdomain.com/page_name'. Is this possible with WP and if so how do I go about doing it? Any help would be greatly appreciated.

    Read the article

  • How do I use tar -C on Snow Leopard when creating an archive?

    - by ssc
    The man page states:

        -C directory
            In c and r mode, this changes the directory before adding the following files.

    However, tar does not change to the directory I specify, but instead reports

        tar: <folder name>: Cannot stat: No such file or directory

    for every folder in the directory I run the tar command in. Do I really have to do something like cd <folder> && tar ... && cd -, or is there a way to get this to work?
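
    A small sketch of how the option is normally used (the archive name and paths are illustrative): -C must appear before the file operands it is meant to affect, and the names that follow it are then interpreted relative to that directory.

        # archive file1 and subdir, which live inside /path/to/folder
        tar -cvf backup.tar -C /path/to/folder file1 subdir

        # archive the entire contents of /path/to/folder without cd-ing first
        tar -cvf backup.tar -C /path/to/folder .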

    Read the article

  • Ubuntu 9.10 upgraded with Internet connection fail

    - by Neofizz
    I am pretty sure that the latest Ubuntu release does not support my wireless card. Everything was working fine in 9.04, but since upgrading my browser will not load web pages. I can see and connect to my WEP-encrypted network, and I can ping google.com without losing any packets. I am at the end of my tether; what else can I try? Is it possible to download a driver to re-enable my wireless?

    Read the article

  • How can I crop every page of a large PDF file?

    - by Andrew
    I have a 1300 page PDF file of a scanned book that was unfortunately not cropped when scanned. The actual book page dimensions are around 6x9", but each scanned page is 8.5x11", the size of the scanner bed. For much smaller PDF files I could throw it into Photoshop and crop the page, but this is a huge file. What is the best way to losslessly crop all of the pages of the file, in either Windows or OS X?
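
    One possible command-line approach (a sketch, assuming the pdfcrop script that ships with TeX Live is installed; the file names and point values are illustrative, and using negative margins to trim a fixed amount from every edge is an assumption about that tool rather than something stated in the question). Because it only adjusts each page's crop box, the scanned page images themselves should not be re-rendered or resampled.

        # trim roughly 1.25" (90 pt) from the left/right and 1" (72 pt) from the
        # top/bottom of every page; adjust the four values to taste
        pdfcrop --margins '-90 -72 -90 -72' scanned-book.pdf scanned-book-cropped.pdf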

    Read the article

  • SSL totally stopped working in Windows

    - by Dims
    Apparently, on my notebook, I have suddenly lost the ability to use any network connection involving SSL and/or data encryption provided by Microsoft:

    1) Remote Desktop connections: "Because of an error in data encryption, this session will end"
    2) Browsing HTTPS sites: can't load HTTPS pages (TLS error)
    3) Communicating over WiFi, while wired is OK

    Is there one possible central reason for all of these problems in Windows? Third-party applications, like PuTTY, work fine. Is it possible to reset/repair the certificate store or something similar in Windows?

    Read the article

  • Unable to list contents/remove directory (linux ext3)

    - by RedKrieg
    System is CentOS 5 x86_64, completely up to date. I've got a folder that can't be listed (ls just hangs, eating memory until it is killed). The directory size is nearly 500k:

        root@server [/home/user/public_html/domain.com/wp-content/uploads/2010/03]# stat .
          File: `.'
          Size: 458752      Blocks: 904        IO Block: 4096   directory
        Device: 812h/2066d  Inode: 44499071    Links: 2
        Access: (0755/drwxr-xr-x)  Uid: ( 3292/    user)   Gid: ( 3287/    user)
        Access: 2012-06-29 17:31:47.000000000 -0400
        Modify: 2012-10-23 14:41:58.000000000 -0400
        Change: 2012-10-23 14:41:58.000000000 -0400

    I can see the file names if I use ls -1f, but it just repeats the same 48 files ad infinitum, all of which have non-ASCII characters somewhere in the file name:

        La-critic\363-al-servicio-la-privacidad-300x160.jpg

    When I try to access the files (say to copy them or remove them) I get messages like the following:

        lstat("/home/user/public_html/domain.com/wp-content/uploads/2010/03/Sebast\355an-Pi\361era-el-balc\363n-150x120.jpg", 0x7fff364c52c0) = -1 ENOENT (No such file or directory)

    I tried altering the code found on this man page and modified it to call unlink for each file. I get the same ENOENT error from the unlink call:

        unlink("/home/user/public_html/domain.com/wp-content/uploads/2010/03/Marca-naci\363n-Madrid-150x120.jpg") = -1 ENOENT (No such file or directory)

    I also straced a "touch", grabbed the syscalls it makes and replicated them, then tried to unlink the resulting file by name. This works fine, but the folder still contains an entry by the same name after the operation completes, and the program runs for an arbitrarily long time (strace output ended up at 20GB after 5 minutes and I stopped the process).

    I'm stumped on this one. I'd really prefer not to have to take this production machine (hundreds of customers) offline to fsck the filesystem, but I'm leaning toward that being the only option at this point. If anyone's had success using other methods for removing files (by inode number; I can get those with the getdents code) I'd love to hear them. (Yes, I've tried find . -inum <inode> -exec rm -fv {} \; and it still has the problem with unlink returning ENOENT.)

    For those interested, here's the diff between that man page's code and mine. I didn't bother with error checking on mallocs, etc. because I'm lazy and this is a one-off:

        root@server [~]# diff -u listdir-orig.c listdir.c
        --- listdir-orig.c      2012-10-23 15:10:02.000000000 -0400
        +++ listdir.c   2012-10-23 14:59:47.000000000 -0400
        @@ -6,6 +6,7 @@
         #include <stdlib.h>
         #include <sys/stat.h>
         #include <sys/syscall.h>
        +#include <string.h>

         #define handle_error(msg) \
                do { perror(msg); exit(EXIT_FAILURE); } while (0)
        @@ -17,7 +18,7 @@
             char           d_name[];
         };

        -#define BUF_SIZE 1024
        +#define BUF_SIZE 1024*1024*5

         int main(int argc, char *argv[])
         {
        @@ -26,11 +27,16 @@
            struct linux_dirent *d;
            int bpos;
            char d_type;
        +   int deleted;
        +   int file_descriptor;

            fd = open(argc > 1 ? argv[1] : ".", O_RDONLY | O_DIRECTORY);
            if (fd == -1)
                handle_error("open");

        +   char* full_path;
        +   char* fd_path;
        +
            for ( ; ; ) {
                nread = syscall(SYS_getdents, fd, buf, BUF_SIZE);
                if (nread == -1)
        @@ -55,7 +61,24 @@
                    printf("%4d %10lld  %s\n", d->d_reclen,
                            (long long) d->d_off, (char *) d->d_name);
                    bpos += d->d_reclen;
        +           if ( d_type == DT_REG )
        +           {
        +               full_path = malloc(strlen((char *) d->d_name) + strlen(argv[1]) + 2); //One for the /, one for the \0
        +               strcpy(full_path, argv[1]);
        +               strcat(full_path, (char *) d->d_name);
        +
        +               //We're going to try to "touch" the file.
        +               //file_descriptor = open(full_path, O_WRONLY|O_CREAT|O_NOCTTY|O_NONBLOCK, 0666);
        +               //fd_path = malloc(32); //Lazy, only really needs 16
        +               //sprintf(fd_path, "/proc/self/fd/%d", file_descriptor);
        +               //utimes(fd_path, NULL);
        +               //close(file_descriptor);
        +               deleted = unlink(full_path);
        +               if ( deleted == -1 ) printf("Error unlinking file\n");
        +               break; //Break on first try
        +           }
                }
        +       break; //Break on first try
            }
            exit(EXIT_SUCCESS);

    Read the article

  • How do I turn on basic HTTP-auth for a page in jboss?

    - by Electrons_Ahoy
    I'm setting up a JBoss server for testing some Java code that talks to HTTP servers. That's pretty easy. However, one of the things I'm testing is interfacing with classic "old-school" HTTP-Auth protected pages, and for the life of me I can't figure out how to turn that on in JBoss (and my google-fu seems to have let me down). So, how do I add a basic username and password to a single HTML (or JSP) file in JBoss using HTTP Basic Access Authentication?

    Read the article

  • Remove DRM From a WMV file I own

    - by Rev
    Alright, first, let me explain. I purchased some content through Microsoft's Zune/Xbox Video service, and man that was a mistake. After trying several times to get the video to play, I received an error along the lines of "out of licenses." Lucky for me, I was able to recover the file I was looking for off of a backup, but now I'm having problems playing it. It works fine in Windows Media Player, but not in Zune or Xbox Video (on Windows 8). I contacted Zune's customer support, and of course they couldn't help me. So, I legally own the content, I just want to be able to play it where I want to play it. It's ridiculous that I can't. I know there are ways to do this out there, I just can't figure it out (I keep getting directed to this piece of junk thing called Almedia, which kind of seems to work, but was only putting out audio in the demo version). Thanks!

    Read the article

  • How to make scp copy hidden files?

    - by rascher
    I often use scp to copy files around, particularly web-related files. The problem is that whenever I do this, I can't get my command to copy hidden files (e.g. .htaccess). I typically invoke this: scp -rp src/ user@server:dest/ -- but this doesn't copy hidden files. I don't want to have to invoke the command a second time (by doing something like scp -rp src/.* ..., which has strange . and .. implications anyway). I didn't see anything in the scp man page about an "include hidden files" option. How can I accomplish this?
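
    Two approaches that are commonly suggested for this (sketches only; the src and dest paths are the ones from the question, and whether the first form behaves identically on every scp version is an assumption):

        # name the directory itself ("src/.") instead of a glob, so the remote
        # side receives its entire contents, dotfiles included
        scp -rp src/. user@server:dest/

        # or use rsync over ssh, which copies hidden files by default
        rsync -av -e ssh src/ user@server:dest/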

    Read the article

  • How do I generate a summary of tracked changes in a word document?

    - by david-ocallaghan
    Is there a way to generate an attractive summary of tracked changes in a word document? If I'm working on a ~100 page document and, say, I change two paragraphs on p.37 and update a table on p.74, is there a way to produce a Word document showing just the changed pages? Perhaps something like the summary / diff available in some wikis (e.g. https://en.wikipedia.org/w/index.php?title=Stack_Exchange_Network&action=historysubmit&diff=461134938&oldid=458998783)

    Read the article

  • Script or Utility to convert .nab to .csv without importing double entries in Outlook

    - by Chris
    Our environment is currently migrating from GroupWise 7 to Outlook 2003, and we have multiple users with mission-critical outside contacts in their frequent contacts that will have to be imported into Outlook. Our only solution so far is to export the GroupWise contacts to a .nab file, import that into Excel to scrub out the contacts in our own domain (to avoid double entries), and convert to .csv. This will require a lot of man hours of hand-holding, because most of our users are not technically savvy AT ALL and are frankly too busy to do this themselves. Does anyone know of any kind of tool or script to assist with this?

    Read the article

  • How do I mount a sparse disk image permanently?

    - by Mike
    On Mac OS X 10.6.7, when I mount a sparse disk image (either by double-clicking it or using hdid from the command line), the image appears on my desktop and needs to be re-mounted every time I log in. I'd like to set up the equivalent of an /etc/fstab entry which will mount the image when the system boots and make it permanent, so I don't have to worry about whether my symbolic links will resolve or not. Is this more trouble than it's worth on a Mac? I noticed that there is no /etc/fstab, and /etc/fstab.hd contains a dire warning: "IGNORE THIS FILE. This file does nothing, contains no useful data, and might go away in future releases. Do not depend on this file or its contents." I tried sudo hdid -notremovable <image>, which seemed like half of what I wanted (according to man hdid), but it failed with an error: hdid: attach failed - no mountable file systems.

    Read the article

  • "service"-command and environment variables

    - by varesa
    I am trying to start a service that requires an environment variable to be set to a certain path. I set this variable in /etc/profile.d/, but when I start the service using the service command, it doesn't work. From man service:

        service runs a System V init script in as predictable environment as possible, removing most environment variables and with current working directory set to /.

    So it seems that service is removing my variables. How should I set the variables up to keep them from being removed? Or is that something I should not do? I could start the service manually using the init scripts, or even hardcode the path into the script, but I'd like to know how to use it with the service command.
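
    One common arrangement on Red Hat-style systems is sketched below (the variable, path, and service names are illustrative, and whether a given init script actually sources its /etc/sysconfig file is an assumption to check in the script itself): the value comes from a file the init script reads, or is exported by the script, so it survives the environment scrubbing that service performs.

        # /etc/sysconfig/myservice -- many Red Hat-style init scripts source this
        # file near the top, e.g.: [ -f /etc/sysconfig/myservice ] && . /etc/sysconfig/myservice
        MY_APP_HOME=/opt/myapp
        export MY_APP_HOME

        # alternatively, export the variable directly inside /etc/init.d/myservice,
        # before the daemon is launched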

    Read the article

  • Do cheap color laserjets come with toner?

    - by jblocksom
    I'm thinking of getting a budget color laserjet. Will I need to buy four toner cartridges at the same time I get the printer, or do they come with starter toner cartridges? If there is toner with it, any idea how many pages I can get on what's in the box? The Samsung CL-315 or the HP Color LaserJet CP1215, both under $200, are good examples of the class of printer I'm looking at. Thanks!

    Read the article

  • A way to extract layer sets from PSD files without Photoshop?

    - by chaddjohnson
    Are there any tools for Mac, Linux, or Windows (in that order of preference) that can extract an image per layer set from a PSD file, without using Photoshop? A web designer sent me a PSD file containing multiple pages, and each page is apparently in a layer set; GIMP cannot handle these -- it doesn't understand layer groupings. A command-line tool, a rinky-dink Windows tool... anything will work, so long as it's free (preferably) or under $30, does not leave watermarks or anything of the sort, and works well.

    Read the article

  • PHP file download sometimes

    - by user42092
    Hello all. I have a little problem with my Apache server on CentOS, running PHP 5.0 and MySQL. When I open multiple files, sometimes the server offers the .php pages for download instead of executing them. I think PHP has a problem, because it happens only when I open many browser windows (around 30). How can I solve this issue?

    Read the article

  • How do I use XQuartz with ssh on OS X?

    - by cwd
    I've downloaded the latest stable version of XQuartz on my Snow Leopard machine, and I'm trying to make an ssh connection with X forwarding, but X11 keeps opening. How can I get OS X to use XQuartz instead? The situation:

    - I had X11 installed.
    - I downloaded and installed XQuartz.
    - X11 is not open/running; XQuartz is open and running.
    - I try to connect to a remote system using iTerm2: ssh user@remote -X, and X11 opens.

    XQuartz is still open, but I doubt it is doing anything. I also tried moving X11 to the trash, but then the ssh connection will not complete, even though XQuartz is open. I also get these two warnings, which I don't understand how to fix even after reading the ssh man page:

        Warning: untrusted X11 forwarding setup failed: xauth key data not generated
        Warning: No xauth data; using fake authentication data for X11 forwarding.

    Read the article
