Search Results

Search found 28288 results on 1132 pages for 'home directory'.

  • How can I override mod-php5's .php mapping to php4-cgi per VirtualHost or Directory?

    - by geocoo
    I am running Debian Linux with apache2 and libapache2-mod-php5 5.3.3-7. I have one VirtualHost which requires PHP 4, so I researched and compiled php4-cgi. However, I cannot seem to:

    1. Override mod-php5's mapping of .php in that vhost (or even globally, without disabling PHP completely).
    2. Even find where that mapping is made, in the hope of disabling it and enabling mod-php5 or php4-cgi per vhost.

    This is my php4-cgi mapping (inside the one PHP 4 vhost):

        ScriptAlias /php4 /usr/local/php4/bin
        <Directory /usr/local/php4/bin>
            Options +ExecCGI +FollowSymLinks
        </Directory>
        <Directory /www/test>
            AddHandler php4-cgi-script .php
            Action php4-cgi-script /php4/php
            Options +ExecCGI
        </Directory>

    This does not work; mod-php5 still runs all .php files in that vhost/directory. If I change the file extension in the AddHandler above from .php to .php4, then .php4 files do run php4-cgi as expected, but I can't change all the files in the app to .php4. I thought maybe I could disable mod-php5's mapping in my vhost or directory and then do my CGI config (as above), but many combinations of these in different contexts did not work:

        RemoveHandler .php
        RemoveType .php
        php_flag engine off   (this seems to disable my php4-cgi too, so that won't work)

    The only other place I can find any mapping is in /etc/mime.types, but commenting out the relevant lines and restarting apache2 does not affect mod-php5's .php mapping. I have searched as much as I can; it is now a mystery to me. Any help or direction would be greatly appreciated.

    Read the article

  • Why /home folder is missing from my backup archive created by tar?

    - by Konstantin Pereyaslov
    So I'm doing a full backup of my VPS using the following command (as root, of course):

        tar czvf 20120604.tar.gz /

    Everything seems to be fine; all files seem to appear in the list. The archive is 6 GB, and the gunzipped version is 11 GB, which should include /home, because I have 11 GB of data on the VPS in total. But when I actually try to unpack the archive, or open it using mc or WinRAR, there's no /home folder. And WinRAR reports "20120604.tar.gz - TAR+GZIP archive, unpacked size 894 841 346 bytes". It can't be WinRAR's bug, because when I type tar xzvf 20120604.tar.gz, the /home folder isn't unpacked either. Why is the /home folder missing from my archive? And what can I do to include it there? tar --version outputs the following:

        tar (GNU tar) 1.15.1
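
    As a quick sanity check, the archive's member list can be inspected directly rather than by unpacking. Below is a minimal sketch using Python's standard tarfile module (the archive name is taken from the question); if "home" never shows up in the output, tar really did skip it at creation time:

        import tarfile

        # Collect the unique top-level entries stored in the archive.
        top_level = set()
        with tarfile.open("20120604.tar.gz", "r:gz") as archive:
            for member in archive:
                name = member.name.lstrip("./")
                if name:
                    top_level.add(name.split("/", 1)[0])

        print(sorted(top_level))  # "home" should appear here if it was archived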

    Read the article

  • How to add an Active Directory user to a SharePoint group

    - by standley-nguyen
    Hi all. I get an exception when executing this code snippet:

        SPSecurity.RunWithElevatedPrivileges(delegate()
        {
            using (SPSite site = new SPSite(siteUrl.Trim()))
            {
                using (SPWeb web = site.OpenWeb())
                {
                    try
                    {
                        web.AllowUnsafeUpdates = true;
                        SPUser spUser = web.AllUsers[userName];
                        if (spUser != null)
                        {
                            SPGroup spGroup = web.Groups[groupName];
                            if (spGroup != null)
                                spGroup.AddUser(spUser);
                        }
                    }
                    catch (Exception ex)
                    {
                        this.TraceData(LogLevel.Error, "Error at function Named [AddUserToSPGroupWidget.AddUserToGroup] . With Error Message: " + ex.ToString());
                    }
                    finally
                    {
                        web.AllowUnsafeUpdates = false;
                    }
                }
            }
        });

    Please guide me. Thanks in advance.

    Read the article

  • Recursive FTP directory listing in shell/bash with a single session (using cURL or ftp)

    - by Timo
    I am writing a little shell script that needs to go through all folders and files on an FTP server (recursively). So far everything works fine using cURL, but it's pretty slow, because cURL starts a new session for every command. So for 500 directories, cURL performs 500 logins. Does anybody know whether I can stay logged in using cURL (this would be my favourite solution), or how I can use ftp with only one session in a shell script? I know how to execute a set of ftp commands and retrieve the response, but for the recursive listing, it has to be a little more dynamic... Thanks for your help!
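
    One way to keep the whole walk inside a single login is to drive the session from a scripting language instead of invoking cURL per command. Here is a minimal sketch using Python's standard ftplib module; the host and credentials are placeholders, and note that servers differ in whether NLST returns bare names or full paths, which the path handling below tries to absorb:

        from ftplib import FTP, error_perm

        def walk(ftp, path):
            """Recursively print entries below path, reusing the one open session."""
            for name in ftp.nlst(path):
                if name.rsplit("/", 1)[-1] in (".", ".."):
                    continue
                full = name if "/" in name else path.rstrip("/") + "/" + name
                try:
                    ftp.cwd(full)          # succeeds only for directories
                except error_perm:
                    print(full)            # a plain file
                else:
                    print(full + "/")
                    walk(ftp, full)

        ftp = FTP("ftp.example.com")       # placeholder host
        ftp.login("user", "password")      # placeholder credentials
        walk(ftp, "/")
        ftp.quit()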

    Read the article

  • Parse Directory Structure (Strings) to JSON using PHP

    - by Ecropolis
    I have an array of file-path strings like this:

        videos/funny/jelloman.wmv
        videos/funny/bellydance.flv
        videos/abc.mp4
        videos/june.mp4
        videos/cleaver.mp4
        fun.wmv
        jimmy.wmv
        herman.wmv

    Is there a library, or an easy way, to get from these to a data structure in JSON or XML? (I see there are a lot of snippets available for traversing actual folders, but again, I just have strings.) Something like this:

        {
            files: {
                file: [
                    { filename: 'fun.wmv' },
                    { filename: 'jimmy.wmv' },
                    { filename: 'herman.wmv' }
                ],
                folder: {
                    foldername: 'videos',
                    file: [
                        { filename: 'abc.mp4' },
                        { filename: 'june.mp4' },
                        { filename: 'cleaver.mp4' }
                    ],
                    folder: {
                        foldername: 'funny',
                        file: [
                            { filename: 'jelloman.wmv' },
                            { filename: 'bellydance.flv' }
                        ]
                    }
                }
            }
        }
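
    Since the input is just strings, no directory traversal is needed: split each path on "/" and descend a nested map, creating folder nodes on the way down. A minimal sketch of that logic in Python (the question is about PHP, but the approach translates directly, and the exact output shape can be massaged afterwards):

        import json

        paths = [
            "videos/funny/jelloman.wmv",
            "videos/funny/bellydance.flv",
            "videos/abc.mp4",
            "videos/june.mp4",
            "videos/cleaver.mp4",
            "fun.wmv",
            "jimmy.wmv",
            "herman.wmv",
        ]

        tree = {"files": [], "folders": {}}
        for path in paths:
            node = tree
            parts = path.split("/")
            for folder in parts[:-1]:   # walk/create the intermediate folder nodes
                node = node["folders"].setdefault(folder, {"files": [], "folders": {}})
            node["files"].append(parts[-1])   # the last component is the file name

        print(json.dumps(tree, indent=2))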

    Read the article

  • EnterpriseLibrary.Logging directory

    - by MüllerDK
    Hi, I'm using MS EnterpriseLibrary.Logging and that works perfectly. But the log file(s) are placed where the program file is located. How do I get it to place my log files in each user's ApplicationData folder? The folder I'm talking about is the one you get by calling Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData). Any help appreciated :-) Best regards, Søren

    Read the article

  • Commit all folders and files in a directory using commandline

    - by Shaharyar
    Hello everybody. We are having trouble finding a generalized approach to committing with a batch file using command-line svn. We've got a backup script that creates a new folder, named with the current date, containing the dumps of our database. (Yes, we version control our database.) Now, how can I use the svn commit command to include all directories that are new in the project? Is there an approach without using the svn add command? Thanks for all advice!
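
    If an automated svn add is acceptable, one common pattern (a sketch of one approach, not the only one) is to let svn schedule everything unversioned right before the commit: svn add --force . recurses from the working-copy root and skips paths that are already under version control. Wrapped in Python so it can run as a batch step (the path and commit message are placeholders):

        import subprocess

        working_copy = r"C:\backups\db"   # placeholder: path to the working copy

        # Schedule every unversioned file and folder for addition; --force makes
        # "svn add" recurse and skip already-versioned paths instead of failing.
        subprocess.run(["svn", "add", "--force", "."], cwd=working_copy, check=True)

        # Commit everything that is now scheduled.
        subprocess.run(["svn", "commit", "-m", "nightly database dump"],
                       cwd=working_copy, check=True)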

    Read the article

  • ack: Excluding only one directory but keeping all others with the same name

    - by mattalexx
    My folder structure looks like this:

        /app
        /app/data
        ...
        /app/secondary
        /app/secondary/data

    I want to recursively search /app, including /app/data. I do not want to search /app/secondary/data, however. This is what I have so far:

        ack --ignore-dir=data searchtext
        ack --ignore-dir=secondary/data searchtext

    The first command ignores both directories, and the second one ignores neither of them. From within the app folder, what should my ack command look like?
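
    For comparison, the selection rule the question is after (skip data only when its parent is secondary) is straightforward to express as a path test. A minimal sketch with Python's os.walk, assuming the search is rooted at /app:

        import os

        root = "/app"   # assumed search root

        for dirpath, dirnames, filenames in os.walk(root):
            # Prune only the "data" directory that lives directly under "secondary";
            # /app/data survives because its parent is not "secondary".
            if os.path.basename(dirpath) == "secondary" and "data" in dirnames:
                dirnames.remove("data")
            for name in filenames:
                print(os.path.join(dirpath, name))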

    Read the article

  • Is it possible to change "working directory" of XeTeX?

    - by Herbert Sitz
    Using XeTeX, many working files get created in the process of producing the PDF, and they litter the directory where my main .tex file is. Is it possible to change the working directory of XeTeX so that it stores all these scratch files in some other directory, out of the way?

    There is a previous question on Superuser.com that discusses a utility that cleans up the working files by deleting them after they're produced: http://superuser.com/questions/95712/how-to-avoid-littering-ones-tex-directories-with-intermediate-files

    That solution doesn't work for me, since I'm using XeTeX, but it also seems like it would be preferable to simply designate a "scratch" directory where all working files are saved. I haven't been able to find any info on how to do that, though. Is there a way?

    (My question is prompted partly by the fact that I often work with files in a directory that is shared using DropBox, so it creates a lot of unnecessary traffic if files are getting created and destroyed willy-nilly. I don't know if it affects speed in any way, but the idea of having a separate working directory that is not shared/replicated by DropBox would be a cleaner solution, even if I could use the method suggested in the earlier thread.)
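
    For what it's worth, the web2c-based engines in TeX Live (xetex/xelatex included) accept an -output-directory flag that redirects the intermediate files (.aux, .log, .toc, ...) along with the PDF. Assuming your distribution supports it, a wrapper is tiny; here is a sketch in Python with a placeholder file name:

        import os
        import subprocess

        os.makedirs("build", exist_ok=True)   # the engine won't create it for you

        # All intermediate files (and the PDF) land in ./build instead of ".".
        subprocess.run(
            ["xelatex", "-output-directory=build", "main.tex"],  # placeholder .tex file
            check=True,
        )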

    Read the article

  • How do I redirect from a subdirectory to another directory, but strip off the original subdirectory?

    - by Eric Cope
    I have a Ruby on Rails app running on port 12001. I am currently redirecting a subdomain to 127.0.0.1:12001 using some RewriteCond detection. Now I want to redirect my subdirectory to that Rails app:

        http[s]://domain.com/redmine  ->  127.0.0.1:12001

    The current rules apply REQUEST_URI to the above Rails path, but I need to strip "/redmine" from the front of REQUEST_URI... Any ideas?

    Read the article

  • Get All Users in an Active Directory Group

    - by Matt Hanson
    I'm using the following code sample to get a list of all users in a specified AD group (in this case, all users in the "Domain Users" group). My listed code works great, with one exception: it won't return users who have their primary group set to "Domain Users". How can I get a list of all users in the group, including those who have it set as their primary group?

        Private Sub GetUsers()
            Dim groupSearcher As New DirectorySearcher
            Dim groupSearchRoot As New DirectoryEntry("LDAP://OU=Users,DC=domain,DC=com")
            With groupSearcher
                .SearchRoot = groupSearchRoot
                .Filter = "(&(ObjectClass=Group)(CN=Domain Users))"
            End With
            Dim members As Object
            members = groupSearcher.FindOne.GetDirectoryEntry.Invoke("Members", Nothing)
            For Each member As Object In CType(members, IEnumerable)
                Console.WriteLine(New DirectoryEntry(member).Name.Remove(0, 3))
            Next
        End Sub
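
    The underlying behaviour here is that Active Directory never lists a user's primary group in the group's member attribute; instead, the user object carries the group's RID in its primaryGroupID attribute (513 is the well-known RID of Domain Users). So a second query against user objects is needed. Below is a minimal sketch of that query using the third-party Python ldap3 package for illustration; the server, credentials, and base DN are placeholders:

        from ldap3 import Server, Connection, ALL

        server = Server("dc.domain.com", get_info=ALL)           # placeholder DC
        conn = Connection(server, user="DOMAIN\\user",           # placeholder
                          password="secret", auto_bind=True)     # credentials

        # Users whose *primary* group is Domain Users carry its RID (513) in
        # primaryGroupID and never appear in the group's "member" attribute.
        conn.search(
            "OU=Users,DC=domain,DC=com",
            "(&(objectClass=user)(primaryGroupID=513))",
            attributes=["sAMAccountName"],
        )
        for entry in conn.entries:
            print(entry.sAMAccountName)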

    Read the article

  • git reference common directory/repo

    - by phillee
    Project layout:

        /project_a
            /shared
        /project_b
            /shared
        /shared

    project_a and project_b both need to contain the shared folder. With svn, we used svn:externals and that worked fine, since svn can reference subdirectories (with relative paths, too). However, we moved to git, and it seems not to support checking out subdirectories. Our solution now is to put project_a, project_b, and shared all in different git repos, and use git submodules in project_a and project_b. However, this seems much more complicated than one monolithic svn repo with svn:externals. What's the correct way to handle common elements in git?

    Read the article

  • Delete all files and folders in multiple directories but leave the directories

    - by NightsEVil
    Well, I like this nice piece of code right here; it seems to work awesome, but I can't seem to add any more directories to it:

        DirectoryInfo dir = new DirectoryInfo(@"C:\temp");
        foreach (FileInfo files in dir.GetFiles())
        {
            files.Delete();
        }
        foreach (DirectoryInfo dirs in dir.GetDirectories())
        {
            dirs.Delete(true);
        }

    I would also like to add in special folders as well, like History and Cookies and such. How would I go about doing that? (I would like to include at least 4-5 different folders.)
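
    Generalizing the snippet is essentially a loop over a list of target directories, emptying each one while leaving the directory itself in place. A sketch of that shape in Python for comparison (the question is about C#; the folder paths, including the special-folder lookups, are placeholders):

        import os
        import shutil

        # Placeholder targets; real code would resolve the per-user special
        # folders (History, Cookies, ...) through the proper platform APIs.
        targets = [
            r"C:\temp",
            os.path.expandvars(r"%USERPROFILE%\Cookies"),
        ]

        for target in targets:
            for entry in os.listdir(target):
                path = os.path.join(target, entry)
                if os.path.isdir(path):
                    shutil.rmtree(path)   # delete the subfolder and its contents
                else:
                    os.remove(path)       # delete a plain file
        # Each directory in "targets" is now empty but still exists.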

    Read the article

  • Recursively find files/directories within a directory

    - by enchilada
    I want to use enumeratorAtURL:includingPropertiesForKeys:options:errorHandler: to do this, as I'm coding under Snow Leopard. I know how to use this method. However, since for my purposes it's not adequate to load the contents lazily, I have the following question: how can I create an object hierarchy (of, say, a custom FileOrDirectory class with the standard children, count, and isLeaf attributes) from the information I load via the enumeratorAtURL: method? I need to load this data beforehand and show it in an outline view.

    Read the article

  • Sort directory listing using RecursiveDirectoryIterator

    - by Scott Saunders
    I'm using RecursiveDirectoryIterator and RecursiveIteratorIterator to build a file listing tree using code like the below. I need the list to be sorted: either directories first and then files, alphabetically, or just alphabetically. Can anyone tell me how to sort the file list?

        $dir_iterator = new RecursiveDirectoryIterator($groupDirectory);
        $iterator = new RecursiveIteratorIterator($dir_iterator, RecursiveIteratorIterator::SELF_FIRST);
        foreach ($iterator as $file) {
            // do stuff with $file
        }
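
    One common route in PHP is to pull the iterator into an array with iterator_to_array() and sort it with usort() before rendering. The ordering rule itself (directories first, then files, each group alphabetical) is independent of the language; here is a minimal sketch of it in Python, sorting each level before descending:

        import os

        def walk_sorted(root):
            """Yield paths under root: directories first, then files, alphabetically."""
            entries = sorted(
                os.listdir(root),
                key=lambda e: (not os.path.isdir(os.path.join(root, e)), e.lower()),
            )
            for entry in entries:
                path = os.path.join(root, entry)
                yield path
                if os.path.isdir(path):
                    yield from walk_sorted(path)

        for path in walk_sorted("."):
            print(path)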

    Read the article

  • Connection to Google, Yahoo, Bing, Ask, etc. compromised via all devices on my home network - How?

    - by jt0dd
    I'm a very computer-savvy guy (although not very networking-savvy), and I may still be wrong about this, but I think my home network may be compromised somehow. I'd like to know if it's possible for someone to have hijacked my network's connection to Google.com and other popular websites.

    Update: The issue seems to take effect with all popular websites. I can connect to small (non-popular) websites without issue, but Facebook, Google, Yahoo, and Bing cannot be accessed by any device on my home network.

    On all devices using my home network, I'm being shown

        http://www.google.com
        WARNING! Internet Explorer is currently out of date. Please update to continue.

    when I attempt to connect to google.com. I wouldn't be surprised by this at all if it were just the laptop. It's the fact that this is happening on all devices on my network that confuses me. Here's the screenshot from my iPhone, for reference. Can my home network be compromised? Is that even possible? How can something like this happen across all platforms on all devices in the same way? I wouldn't imagine every device/platform on the network would get the same virus. Should I assume that my network's security is totally compromised?

    Update: All mobile devices and laptops on my home network are experiencing the same alert when attempting to connect to google.com.

    Read the article

  • FileSystemWatcher keeping parent directory

    - by Henry Jackson
    I am using FileSystemWatcher to monitor a folder, and it seems to be preventing the folder's parent from being deleted. For example, I have the file structure:

        C:\Root\FolderToWatch\...

    with the FileSystemWatcher targeting FolderToWatch. While my program is running, if I go to Windows Explorer and try to delete Root, I get an error "Cannot delete Root: access is denied". However, if I delete FolderToWatch first, I can then delete Root without incident. (Needless to say, if the FileSystemWatcher is not enabled, I have no problem deleting either folder.) What gives? Why does the FileSystemWatcher hang onto its target's parent like that? How (if possible) can I stop this behavior? I would like the user to be able to freely move or delete directories in Windows Explorer.

    Read the article

  • Directory of "index.html" on site

    - by Camran
    I wonder if the index.html MUST be in the "www" folder on the server after uploading the site. I ask because I have actually made everything in a folder called "SV", so my site is located at "www/SV/index.html". My question is: on the server, could I just create a folder named "SV" under "www" and expect index.html to be automatically displayed once users type in the web address of my site? Thanks

    Read the article

  • ASP.NET forgets dlls in bin directory

    - by Timothy Strimple
    We have a plugin system on a WCF service that checks libraries placed in the bin folder for certain assembly-level attributes and loads them. This allows customization of certain service calls based on which client is making the call. This works great most of the time. However, sometimes it seems to lose the DLL, which causes the service to revert back to the default implementation for every client. The solution so far has been to just move the DLL file out of the bin folder and back in. This causes ASP.NET to pick up the file, and customizations start working again. I'm at a loss for why the assembly is getting missed like that after a certain amount of time. Any ideas as to what might be causing this?

    Read the article

  • eCryptFS: How to mount a backup of an encrypted home dir?

    - by Boldewyn
    I use eCryptFS to encrypt the home directory of my laptop. My backup script copies the encrypted files to a server (together with everything else in /home/.ecryptfs). How can I mount the encrypted files of the backup? I'd like to verify that I can do that, and that everything is in place. My naive try with

        mount -t ecryptfs /backup/home/.ecryptfs/boldewyn /mnt/test

    didn't work; eCryptFS wanted to create a new partition.

    Read the article
