Search Results

Search found 49518 results on 1981 pages for 'configuration files'.


  • Why isn't this rewrite rule (nginx) applied? (trying to set up WordPress multisite)

    - by Brian Park
    Hi, I'm trying to set up WordPress multisite (subfolder structure) with nginx, but I'm having a problem with this rewrite rule. Below is the Apache .htaccess, which I have to translate into nginx configuration:

        RewriteEngine On
        RewriteBase /blogs/
        RewriteRule ^index\.php$ - [L]
        # uploaded files
        RewriteRule ^([_0-9a-zA-Z-]+/)?files/(.+) wp-includes/ms-files.php?file=$2 [L]
        # add a trailing slash to /wp-admin
        RewriteRule ^([_0-9a-zA-Z-]+/)?wp-admin$ $1wp-admin/ [R=301,L]
        RewriteCond %{REQUEST_FILENAME} -f [OR]
        RewriteCond %{REQUEST_FILENAME} -d
        RewriteRule ^ - [L]
        RewriteRule ^([_0-9a-zA-Z-]+/)?(wp-(content|admin|includes).*) $2 [L]
        RewriteRule ^([_0-9a-zA-Z-]+/)?(.*\.php)$ $2 [L]
        RewriteRule . index.php [L]

    Below is what I came up with:

        server {
            listen 80;
            server_name example.com;
            server_name_in_redirect off;
            expires 1d;
            access_log /srv/www/example.com/logs/access.log;
            error_log /srv/www/example.com/logs/error.log;
            root /srv/www/example.com/public;
            index index.html;
            try_files $uri $uri/ /index.html;

            # rewriting uploaded files
            rewrite ^/blogs/(.+/)?files/(.+) /blogs/wp-includes/ms-files.php?file=$2 last;
            # add a trailing slash to /wp-admin
            rewrite ^/blogs/(.+/)?wp-admin$ /blogs/$1wp-admin/ permanent;

            if (!-e $request_filename) {
                rewrite ^/blogs/(.+/)?(wp-(content|admin|includes).*) /blogs/$2 last;
                rewrite ^/blogs/(.+/)?(.*\.php)$ /blogs/$2 last;
            }

            location /blogs/ {
                index index.php;
                #try_files $uri $uri/ /blogs/index.php?q=$uri&$args;
            }

            location ~ \.php$ {
                include /etc/nginx/fastcgi_params;
                fastcgi_pass 127.0.0.1:9000;
                fastcgi_index index.php;
                fastcgi_param SCRIPT_FILENAME /srv/www/example.com/public$fastcgi_script_name;
            }

            # static assets
            location ~* ^.+\.(manifest)$ {
                access_log /srv/www/example.com/logs/static.log;
            }

            location ~* ^.+\.(ico|ogg|ogv|svg|svgz|eot|otf|woff|mp4|ttf|css|rss|atom|js|jpg|jpeg|gif|png|ico|zip|tgz|gz|rar|bz2|doc|xls|exe|ppt|tar|mid|midi|wav|bmp|rtf)$ {
                # only set expires max IFF the file is a static file and exists
                if (-f $request_filename) {
                    expires max;
                    access_log /srv/www/example.com/logs/static.log;
                }
            }
        }

    In the above configuration, I believe the rule "rewrite ^/blogs/(.+/)?(.*\.php)$ /blogs/$2 last;" has no effect, because when I look at the error_log file I see the following line:

        2010/09/15 01:14:55 [error] 10166#0: *8 "/srv/www/example.com/public/blogs/test/index.php" is not found (2: No such file or directory), request: "GET /blogs/test/ HTTP/1.1"

    (Here, 'test' is the second blog created using the multisite feature.) What I'm expecting is that /blogs/test/index.php gets rewritten to /blogs/index.php, but that doesn't seem to happen... Am I overlooking something obvious? Thanks!
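
    A minimal sketch of what is probably going wrong, offered as an untested suggestion rather than the answer: the two rewrites inside the if block only ever see the original URI (/blogs/test/), which matches neither pattern, so the request falls through to the index module, which internally redirects it to the non-existent /blogs/test/index.php and hands that to PHP. Using the try_files fallback already commented out in the /blogs/ location sends anything that is not a real file or directory to the shared front controller instead; the paths are taken from the question:

        location /blogs/ {
            index index.php;
            # anything that is not an existing file or directory is handled by the
            # network-wide front controller, e.g. /blogs/test/ -> /blogs/index.php
            try_files $uri $uri/ /blogs/index.php?q=$uri&$args;
        }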


  • Can't change permission/ownership/group of external hard drive on Ubuntu.

    - by MikeN
    I have an external hard drive connected to my Linux box. I wanted to set up a web server to access files on it, but the permissions on all files and directories on the drive are "rwx" for the owner, which is my local login, and the group is the "root" group. I need the files readable by the Apache user, so I tried to set all files with "chmod a+rwx -R *", but this does nothing (it gives no errors, it just has no effect). I tried changing the group using "chgrp" to my user group, but that won't work either: it gives me errors that I lack permission, even when I run all those commands with sudo! What's up with this hard drive? "sudo chmod a+rwx *" should work on anything, right?
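
    A sketch of one likely explanation, assuming the drive is formatted as FAT32 or NTFS: those filesystems have no Unix permission bits, so chmod and chgrp silently do nothing (or are refused), and the apparent owner and group are fixed at mount time. Remounting with explicit uid/gid/umask options aimed at Apache's user (www-data on Ubuntu) would make the files readable; the device name and mount point below are illustrative.

        # see how the drive is currently mounted (filesystem type and options)
        mount | grep media

        # example remount for an NTFS partition on /dev/sdb1
        sudo umount /media/external
        sudo mount -t ntfs-3g -o uid=$(id -u www-data),gid=$(id -g www-data),umask=022 /dev/sdb1 /media/external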


  • Mass renaming, *nix version

    - by Paolo B.
    I was looking for a way to rename a huge number of similarly-named files, much like this one (a Windows-related question), except that I'm using *nix (Ubuntu and FreeBSD, separately). To sum up, while using the shell (Bash, csh, etc.), how do I mass-rename a number of files such that, for example, the following files:

        Beethoven - Fur Elise.mp3
        Beethoven - Moonlight Sonata.mp3
        Beethoven - Ode to Joy.mp3
        Beethoven - Rage Over the Lost Penny.mp3

    are renamed like this?

        Fur Elise.mp3
        Moonlight Sonata.mp3
        Ode to Joy.mp3
        Rage Over the Lost Penny.mp3

    The reason I want to do this is that this collection of files will go into a directory named "Beethoven" (i.e. the filenames' prefix), so having that information in the filename itself would be redundant.
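
    A minimal sketch of one way to do it with plain Bash parameter expansion, which works the same on Ubuntu and FreeBSD (the prefix is assumed to be the literal string "Beethoven - "):

        for f in "Beethoven - "*.mp3; do
            mv -- "$f" "${f#Beethoven - }"
        done

    On Ubuntu the Perl rename utility offers a one-liner alternative (rename 's/^Beethoven - //' *.mp3), but the command named rename differs between distributions and isn't the same tool on FreeBSD.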


  • If I use a video format converter to change a movie from AVI to MKV, will quality stay the same?

    - by Matt
    I know they're both container formats and that what matters is the actual codec used, but what I don't know is whether video converting software will do anything to change the codec, or whether it just repackages the streams. The reason I need to know is that I have several .avi files with subtitle files, and I want to turn them into .mkv so I can attach the subs and not need a separate subtitle file anymore. Will my new .mkv files be identical in video and audio quality to the original .avi files?
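
    For what it's worth, a remux (copying the existing streams into the new container) leaves quality untouched, whereas a re-encode does not, so it depends on what the converter is configured to do. A sketch of a pure remux with ffmpeg, assuming one .avi plus a matching .srt file (the filenames are illustrative):

        # copy video and audio as-is, add the subtitles as a text track
        ffmpeg -i movie.avi -i movie.srt -map 0 -map 1 -c copy -c:s srt movie.mkv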


  • Exchange Server 2007 message tracking log tuning?

    - by Albert Widjaja
    Hi All, what is the best practice if I want a retention of, let's say, 6 months? I'm confused about which parameter should or can be changed.

        Get-ExchangeServer | where {$_.isHubTransportServer -eq $true} | Get-TransportServer | select Name, *MessageTracking* | ft -AutoSize

        Name        MessageTrackingLogEnabled MessageTrackingLogMaxAge MessageTrackingLogMaxDirectorySize MessageTrackingLogMaxFileSize MessageTrackingLogPath
        ----        ------------------------- ------------------------ ---------------------------------- ----------------------------- ----------------------
        ExHTServer1 True                      20.00:00:00              250MB                              10MB                          D:\Program Files\M...
        ExHTServer2 True                      20.00:00:00              250MB                              10MB                          D:\Program Files\M...
        ExHTServer3 True                      20.00:00:00              250MB                              10MB                          D:\Program Files\M...

    Thanks, Albert
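
    A sketch of the change that seems relevant here, with the caveat that the numbers are illustrative rather than a sizing recommendation: MessageTrackingLogMaxAge is the retention window, but Exchange also deletes the oldest tracking logs once the directory reaches MessageTrackingLogMaxDirectorySize, so for six months of history both values usually have to grow together (check the directory size against real daily log volume).

        Get-ExchangeServer | where {$_.isHubTransportServer -eq $true} | foreach {
            Set-TransportServer $_.Name `
                -MessageTrackingLogMaxAge 180.00:00:00 `
                -MessageTrackingLogMaxDirectorySize 40GB
        }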


  • Google Search Appliance not following HTML files in a crawled folder

    - by kourosh
    I am working on a Google box, something like this: http://mytwentyfive.com/blog/wp-content/uploads/byme/Google%20Search%20Appliances.jpg I am pointing the crawler at a folder that contains HTML files. Previously the crawler was crawling the files and indexing them, but right now it finds the pattern (the folder) without following any of the HTML files within it. I have tried everything I could think of, but I'm out of ideas. Can someone help? Thanks


  • Differential backup missing moved folders (flawed archive attribute logic)

    - by Max
    Recently I've discovered that my backup system is flawed: there are situations where various files/folders are missed. I back up from a local disk to a network NAS. I use Cobian Backup, and I have set up the backup software to create one full backup every week and one differential backup every day. Now, the backup software (to my knowledge, any backup software works this way) decides which files go into the differential backup by looking at the file archive attribute. If the attribute is set, the file goes into the backup. Now, when you move a file to a new location on Windows systems, the archive attribute gets set and the file is included in the backup, and that's fine... but when you move an entire folder, no archive attribute is set, neither on the folder nor on any of the files inside it, so the moved folder isn't included in the differential backup! So, if you have a full backup plus a differential backup and you have moved folders around, it's impossible to reconstruct the original file/folder structure from the full+differential backup, because the backup software didn't include the moved folders in the differential backup. So my differential backups are useless... Why does Windows set the archive attribute when moving a file, but not when moving a folder? How can I deal with this issue? Is there a way to create a differential backup that works as it's supposed to? Doing a full backup every day is not practical, because the changed data is about 0.1% per day (by using a differential backup I can keep 4 weeks of file history without using too much disk space).
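
    One workaround, sketched on the assumption that the moves are done by hand and can be followed by a manual step: re-set the archive attribute on everything under a tree after moving it, so the next differential run picks the whole tree up. The path is illustrative.

        rem mark every file and subfolder under the moved tree as changed
        attrib +A "D:\Data\MovedFolder\*" /S /D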


  • Ubuntu Software RAID 0 on AWS Does Not Survive Reboot

    - by Eric J.
    I'm experimenting with creating a software RAID 0 device from 4 EBS volumes on Ubuntu 9.10 running on Amazon AWS, following this guide: http://alestic.com/2009/06/ec2-ebs-raid The device appears (and according to SysBench it is 3.5x faster than a regular attached EBS volume). The problem is that when I reboot the instance, all files on the RAID device are gone. The device is available and mounted where expected, but contains no files. I am able to write new files to it, which survive until the next reboot.
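
    A sketch of what is commonly missing in this scenario, offered as a guess since the instance's boot configuration isn't shown: unless the array is recorded in mdadm.conf and listed in /etc/fstab, a rebooted instance can end up with something other than the assembled /dev/md0 mounted at the expected path, which looks exactly like an empty device that happily accepts new files. Device name, filesystem and mount point below are illustrative.

        # record the array so it is reassembled at boot, and rebuild the initramfs
        sudo sh -c 'mdadm --detail --scan >> /etc/mdadm/mdadm.conf'
        sudo update-initramfs -u

        # make sure fstab mounts the md device itself, not a member volume
        echo '/dev/md0 /mnt/raid0 xfs noatime 0 0' | sudo tee -a /etc/fstab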


  • split shell command

    - by pedro
    I want to split a file into multiple files with at most 25 lines each. I'm using this:

        split -l 25 /etc/adduser.conf /home/ubuntu/PL/trab3/rc_

    But I do not get the files I expect. How can I get files with filenames like rc_01, rc_02, etc.?
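
    A minimal sketch, assuming GNU coreutils (the default on Ubuntu): split uses alphabetic suffixes (rc_aa, rc_ab, ...) unless told otherwise; the -d flag switches to numeric suffixes and -a controls their width. Numbering starts at 00, and newer coreutils also accept --numeric-suffixes=1 to start at 01.

        # produces rc_00, rc_01, rc_02, ...
        split -d -a 2 -l 25 /etc/adduser.conf /home/ubuntu/PL/trab3/rc_

        # with a recent coreutils, start counting at 01 instead
        split --numeric-suffixes=1 -a 2 -l 25 /etc/adduser.conf /home/ubuntu/PL/trab3/rc_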


  • Fetch MP3 ID3 tags under Linux

    - by exic
    Hi, I have a few MP3 files which are not tagged. Winamp has a nice feature, which I think is called "autotag", that is very good at finding out the artist and title for files. I'd like something like this for Unix, so that I could get artists and titles for my untagged files. Do you know a program which does this? Thanks.
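
    One candidate, sketched under the assumption that looking the audio up in an online database is acceptable: the beets tagger matches untagged files against MusicBrainz (much like Winamp's autotag) and writes the ID3 tags back; by default it also organises the files into its configured library directory, which can be disabled in its config. The path is illustrative.

        # Debian/Ubuntu-style install
        sudo apt-get install beets

        # identify the files against MusicBrainz and tag them
        beet import /path/to/untagged/mp3s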


  • Recover data from a Thecus N4100 NAS jbod partition

    - by TimothyP
    I have a Thecus N4100 NAS which had a 2TB drive in it configured as a JBOD partition. Later I tried adding a second drive to expand the available space. I added it to the JBOD configuration but I did not get any extra space. Then I tried removing the second drive, but the NAS system then indicated the JBOD configuration was damaged. After a reboot it told me there is no configuration, that I need to create a new RAID/JBOD configuration, and that I would lose all data on the drive. Of course I did not do this. I took the 2TB drive and checked with Linux whether the partition was still there, and it is... completely intact. I found that Linux recognizes it as a /dev/mdX device and that I should be able to mount it, but I don't know the filesystem type. Anyway... is there a way to recover the data? Since everything has always been on a single drive it should all be there, right? I can connect the drive to Windows, Linux or MacOS, so whatever gets the job done.
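
    A sketch of a read-only recovery attempt from a Linux machine, assuming the partition really is an md member as described; the device names are illustrative, and everything below is assembled and mounted read-only so nothing on the disk is modified. blkid answers the "I don't know the filesystem type" part once the md device exists.

        # inspect the RAID metadata and assemble the single-member array read-only
        sudo mdadm --examine /dev/sdb1
        sudo mdadm --assemble --run --readonly /dev/md0 /dev/sdb1

        # identify the filesystem, then mount it read-only and copy the data off
        sudo blkid /dev/md0
        sudo mkdir -p /mnt/recovery
        sudo mount -o ro /dev/md0 /mnt/recovery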


  • Apache multiple vhost logs, stored locally and sent to remote logstash

    - by benbradley
    I'm investigating centralised logging, and it seems there are so many different ways this can be done. I don't want to run logstash as a log "sender", preferring to keep the web servers as lean and simple as possible. So that means either using syslog, syslog-ng or the one I'm testing now, rsyslog. But I would like to have separate vhost log files on the web server, in addition to these logs being sent to a remote log collector. I've tested rsyslog using the imfile module to watch the Apache log files, but this means I have to hard-code each vhost log file into my rsyslog.conf. Not ideal, as people will invariably forget when they add/remove sites on the server. The reason I'm using rsyslog's imfile is that Apache doesn't appear to let you log to file and syslog. And I want to keep vhost-specific log files on the web server. So how can I do this? Is there a way of having rsyslog produce local log files and forward the logs to a remote collector? I am prepared to change my Apache config to log to a single access/error log for all vhosts, so long as there are vhost-specific log files produced somewhere on the web server machine. I just don't want to lose any logging info if the remote log collector can't be contacted for any reason. Any comments/suggestions? Cheers, B
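
    One pattern worth sketching, given the constraint that per-vhost files must stay on the web server while rsyslog does the forwarding: Apache can log each vhost twice, once to its local file and once through a piped logger into a syslog facility, so rsyslog only has to forward that facility (and can spool it locally if the collector is unreachable). Facility, tags, paths and the collector hostname below are assumptions, not a tested configuration.

        # in each <VirtualHost>: keep the local per-vhost file and also feed syslog
        CustomLog /var/log/apache2/example.com-access.log combined
        CustomLog "|/usr/bin/logger -t apache-example.com -p local6.info" combined

        # /etc/rsyslog.d/30-apache-forward.conf: ship the local6 facility to the collector
        local6.*    @@logcollector.example.net:514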


  • Windows Server 2008 R2 DFS Root Namespace Required?

    - by caleban
    I would prefer to set up our DFS namespaces like this:

        \\domain.local\users
        \\domain.local\customers
        \\domain.local\support

    etc. Is this a problem? Do I need to instead set all of the above folders as targets under a single root, such as:

        \\domain.local\files\users
        \\domain.local\files\customers
        \\domain.local\files\support

    Other than the path being shorter in the top example, which is what I would prefer, is there a difference in functionality in Windows DFS between the two examples shown? Thanks in advance.


  • Saving a compressed text attachment results in empty file

    - by Brandon
    I have a text document with compressed text in it; the text is auto-generated by a program. The text document is fine on my machine (Vista 32-bit) and can be used normally. The other person can also create and use these files just fine (XP 32-bit). However, when I email it to someone else (Outlook 2003 on both machines) the attachment is sent fine (5 KB), but when the other person tries to save it somewhere, the saved file is empty (64 bytes). At first I thought Outlook didn't like compressed text files (security risk, maybe?), but I can receive the text files just fine. Is there a setting somewhere on the other person's machine that tells Outlook not to trust compressed text? Can anyone think of a reason why these files are being saved as empty text documents?


  • Backup Dropbox to Amazon Glacier

    - by joekr
    I'm using Dropbox for backup, which means I keep all my files in my Dropbox folder (encrypted using encfs, but that should not be relevant). I like this solution because it is automatic and keeps copies of my files on several machines at different locations. The only thing I could see go wrong is Dropbox having some sort of bug that tells all my machines to delete the files. So currently I also back up the Dropbox folder to an external hard drive. With Amazon Glacier it seems affordable to automate backup snapshots of my Dropbox. What I am looking for is a tool that will do this for me; the ideal scenario would be that files go from Dropbox (using their API) directly to Amazon, as uploading the ~80GB from my home connection would take forever... Thanks!
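
    If running the copy from a machine that already has the Dropbox folder synced locally is acceptable (i.e. not the direct Dropbox-to-Amazon transfer hoped for above), one sketch is the AWS command-line tools pushing dated snapshots into S3, combined with an S3 lifecycle rule on the bucket that transitions that prefix to Glacier after a day. The bucket name and prefix are illustrative.

        # push the locally synced Dropbox folder into a dated snapshot prefix in S3
        aws s3 sync ~/Dropbox "s3://my-dropbox-archive/snapshots/$(date +%Y-%m-%d)/"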


  • Mercurial Scenario

    - by richzilla
    Hi all, I have a scenario in Mercurial, and I can't find anything that tells me how to solve it. Basically, I have a Mercurial repository with numerous branches for stable, development, experimental features, etc. However, I've found a bug in a set of core application files that are common to each branch. Is there a way to modify these files and then push the changes to those common files to all the other branches, without sending any other changes? Any help would be appreciated.
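
    A sketch of the usual approach, assuming a reasonably recent Mercurial: commit the fix once on one branch, then copy just that changeset onto each of the other branches with hg graft (hg transplant is the older extension that does the same job). The branch names and the revision number are illustrative.

        # fix the core files and commit once, e.g. on the development branch
        hg update development
        hg commit -m "Fix bug in core application files"

        # replay only that changeset onto the other branches
        hg update stable
        hg graft -r 1234
        hg update experimental
        hg graft -r 1234

        # publish all branches
        hg push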


  • Wrong java -version being reported

    - by Malachi
    I am running Windows 7 Professional x64 and have the following Java versions installed:

        x64 (C:\Program Files\Java):       jdk1.6.0_24, jdk1.7.0_04, jdk1.7.0_07, jre6, jre7
        x86 (C:\Program Files (x86)\Java): jre1.6.0_07, jre6, jre7

    In my environment variables I have PATH containing C:\Program Files\Java\jdk1.6.0_24\bin and JAVA_HOME set to C:\Program Files\Java\jdk1.6.0_24\bin. However, running java -version reports:

        java version "1.7.0_07"
        Java(TM) SE Runtime Environment (build 1.7.0_07-b10)
        Java HotSpot(TM) 64-Bit Server VM (build 23.3-b01, mixed mode)

    How is this the case when there is no reference to this version of Java in my environment variables? Any help on this issue would be great, as I am trying to run Apache Ant using Java 1.6.
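
    A sketch of how this usually gets tracked down, under the assumption that a stock JRE 7 installer has been run on the machine: the installer drops a java.exe into C:\Windows\System32, and since System32 normally precedes the JDK entry in PATH, that copy wins no matter what JAVA_HOME says (JAVA_HOME is only read by tools such as Ant, and is conventionally set to the JDK directory without \bin). The where command lists every java.exe in PATH order:

        where java
        rem typical output when a JRE 7 launcher shadows the JDK (illustrative):
        rem   C:\Windows\System32\java.exe
        rem   C:\Program Files\Java\jdk1.6.0_24\bin\java.exe

        rem put the JDK 1.6 bin directory ahead of System32 for the current session
        set "PATH=C:\Program Files\Java\jdk1.6.0_24\bin;%PATH%"
        java -version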


  • Deploying website content via Subversion

    - by Johann
    we have recently set up a new development infrastructure and process for one of our clients. This involves the strict use of Subversion as a central source code repository. The svn repositories contain a separate branch for code on the live system (/branches/live/). The repositories are used for PHP content (mainly WordPress blogs), but in future they may hold ASP code as well; bonus points for a solution which works more or less the same way with ASP code on Windows Server 2008 R2.

    We have two servers: one staging system and one live system. The staging system is updated regularly with the code of the trunk; the live system is updated manually. Each webroot on the servers is a working copy of either the trunk (staging system) or the live branch (live system). The current workflow is: develop on the dev's box - commit into the trunk - auto-deploy on the staging system - test on the staging system - merge into /branches/live/ - deploy manually on the live system.

    This works very well for one-way changes; however, we run into trouble on every WordPress (or plugin) update: the WP update process removes the directories and unpacks the archive of the new version, which removes the svn admin areas as well and produces a lot of errors. We could switch to SVN 1.7 with a single, global admin area, but this would only solve one part of the problem. In the end, we did the update via the WP GUI, restored the svn admin areas, added/removed the files and committed the changes to the trunk. After testing, we had to do basically the same thing on the live server (except the commit; we just reverted the changes and merged the new files from the staging system to the live system).

    I'm currently thinking of the following: the htdocs of each website is an svn export; each website has an svn working copy beside the htdocs directory; and a script "replays" the changes from htdocs into the working copy after an update in WP (rsync the changed files to the working copy, rsync and svn add the new files, and finally svn delete the deleted files). The script would have to exclude some files (like wp-config.php, uploads/temp directories, etc.). Are there better ways to do this? Unfortunately, a complete CI server is out of scope due to time and budget limitations.
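
    A minimal sketch of the "replay" script described in the last paragraph, assuming the export and the working copy sit side by side; the paths, the exclude list and the automatic commit are illustrative and would need review before being trusted with a real site.

        #!/bin/sh
        # replay changes made in the exported htdocs back into the svn working copy
        HTDOCS=/var/www/example.com/htdocs
        WC=/var/www/example.com/wc

        # copy changed and new files, drop files deleted in htdocs,
        # but never touch svn metadata or site-local files
        rsync -a --delete \
              --exclude='.svn' \
              --exclude='wp-config.php' \
              --exclude='wp-content/uploads/' \
              "$HTDOCS/" "$WC/"

        cd "$WC" || exit 1
        # schedule additions and deletions so the working copy matches htdocs, then commit
        # (filenames containing spaces would need more careful handling than this)
        svn status | awk '/^\?/ {print $2}' | xargs -r svn add
        svn status | awk '/^!/ {print $2}' | xargs -r svn delete
        svn commit -m "Replay WordPress update from htdocs"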


  • Deploy Windows 7 Backup set to Windows 8

    - by Matthias
    Situation: we have a laptop here that's completely fubar, i.e. the hard drive is filled to the brim with bad sectors. Luckily, backups have been made using the built-in Windows 7 backup feature. This produces folders named "Backup Set 2012-11-09 003009", containing folders like "Backup Files 2012-11-09 003009", containing zip files like "Backup files 1", "Backup files 2", "Backup files 3", and so on. Our brand new laptop comes with Windows 8. Now: can we, using the standard backup and restore feature in Windows 8, restore all the documents, music, etc. from the Windows 7 backup files? Thanks. (FYI: we also took a normal copy of all the documents just to be sure, of course. I'm just curious what would happen. I would test it out, but the new laptop hasn't arrived yet and I wanted to make sure my efforts would not be in vain.)


  • LED Display control software

    - by user978733
    My university bought a new big LED display from a Chinese manufacturer. What I want to do is show music visualizations on it (like Windows Media Player, Winamp or iTunes do); I would just drag the application window onto the screen. The main problem is that the software that controls the display (called "Led Vision") doesn't support showing application windows: it only shows a limited set of file types, such as PowerPoint presentations, video files, picture files, etc. So the question is: where can I find visualization video files, something created in After Effects that looks like that?


  • What kinds of protections against viruses does Linux provide out of the box for the average user?

    - by ChocoDeveloper
    I know others have asked this, but I have other questions related to it. In particular, I'm concerned about the damage that a virus can do to the user himself (his files), not to the OS in general nor to other users of the same machine. This question came to my mind because of that ransomware virus that is encrypting machines all over the world and then asking the user to send a payment in Bitcoin if he wants to recover his files. I have already received and opened the email that is supposed to contain the virus, so I guess I didn't do too badly, because nothing happened. But would I have survived if I had opened the attachment and it was aimed at Linux users? I guess not. One of the advantages is that files are not executable by default right after downloading them. Is that just a bad default in Windows that could be fixed with proper configuration? As a Linux user, I thought my machine was pretty secure by default, and I was even told that I shouldn't bother installing an antivirus. But I have read some people saying that the most important (or only?) difference is that Linux is just less popular, so almost no one writes viruses for it. Is that right? What else can I do to be safe from this kind of ransomware virus? Not automatically executing random files from unknown sources seems to be more than enough, but is it? I can't think of many other things a user can do to protect his own files (not the OS, not other users), because he has full permissions on them.
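
    For what it's worth, the execute-bit point can be checked directly: a file saved from a browser or mail client arrives without the execute permission and refuses to run until one is added deliberately. The filename below is illustrative.

        $ ls -l ~/Downloads/invoice.pdf.sh
        -rw-rw-r-- 1 user user 1337 Jan  1 12:00 /home/user/Downloads/invoice.pdf.sh
        $ ~/Downloads/invoice.pdf.sh
        bash: /home/user/Downloads/invoice.pdf.sh: Permission denied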


  • FTP Access Denied when uploading to server

    - by Albert
    OK, here's the story. I have a server running FTP 'out there'. I can connect to it using the admin account, browse files and download files. When I try to upload files, I get "550 Access Denied". I have tried through FileZilla and the command line. I have Windows Firewall turned off (on my machine). I can upload files from another machine (using the same admin account) on our local network (that means the same public IP). What is the problem? I am running Windows 7, build 7100, and the other machine on the network is running XP SP3. The thing that gets me, though, is that this worked for the last four months or so without a problem; I got back to the office after a weekend today and it won't work...

