Search Results

Search found 16797 results on 672 pages for 'directory traversal'.

  • Installing HTK Error

    - by Alex Madill
    I am having an issue when I try to make the package. ./configure worked perfectly fine for me, but when I try to make:

        zodiac@Zodiac:~/Downloads/htk$ make all
        (cd HTKTools && make all) \
        || case "" in *k*) fail=yes;; *) exit 1;; esac;
        make[1]: Entering directory `/home/zodiac/Downloads/htk/HTKTools'
        make[1]: Nothing to be done for `all'.
        make[1]: Leaving directory `/home/zodiac/Downloads/htk/HTKTools'
        (cd HLMTools && make all) \
        || case "" in *k*) fail=yes;; *) exit 1;; esac;
        make[1]: Entering directory `/home/zodiac/Downloads/htk/HLMTools'
        make[1]: Nothing to be done for `all'.
        make[1]: Leaving directory `/home/zodiac/Downloads/htk/HLMTools'

    Thanks in advance
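    "Nothing to be done for `all'" means make considers every target up to date, which right after a fresh configure usually points at stale state from an earlier build attempt. A minimal sketch of a clean rebuild, assuming the stock HTK source layout:

        cd ~/Downloads/htk
        make clean        # discard objects left over from the earlier attempt
        ./configure
        make all
        sudo make install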

  • Deploying Asp.net MVC web application [migrated]

    - by Pankaj Upadhyay
    I have been trying to find a neat tutorial, guide, or set of step-by-step instructions for deploying an ASP.NET MVC3 web app, but have found nothing so far. Everyone tells their own version of the story, for different MVC versions. Right now, I have built a simple ASP.NET MVC web application which I need to deploy on my shared hosting account. Put very simply, I need to know which files I should copy. Do I upload everything in my web project directory to the server, including the Controllers directory, Views, Models, Content, and bin directory? What about Global.asax, web.config, packages.config, and myapp.publish.xml? In short, I have no idea which files should be uploaded and which should not. I am sure of one thing: I need a few DLLs (the MVC and Razor DLLs) in the bin directory. Just treat me as someone who has never deployed any website. NOTE: I don't have VS SP1 installed, and it won't install either, so basically I need a manual procedure.
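    For reference, a hedged sketch of a manual MVC3 deployment: upload the compiled output and content, not the C# sources or packaging metadata. The folder names below are the standard project-template ones, so treat them as assumptions:

        # run from the project root after a Release build; "upload/" stands in
        # for the FTP target on the shared host
        cp -r bin Views Content Scripts Global.asax Web.config upload/
        # leave behind: Controllers/ and Models/ (their .cs files are already
        # compiled into bin/), packages.config, *.publish.xml, obj/, *.csproj

    If the host does not have MVC3 installed, bin must also carry System.Web.Mvc.dll, System.Web.Razor.dll, and the WebPages/Infrastructure assemblies; marking those references as Copy Local in Visual Studio puts them there.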

  • Configure httpd.conf alias/subdirectory point to another server

    - by azrim
    Hi, I'm running a web server for testing purposes that hosts my domain http://www.domain.com, which runs perfectly. Below are the server specs:

        OS: FreeBSD 7.2
        MySQL 5.1.33
        Apache 2.2.11
        PHP 5.2.9

    I can define aliases in my httpd.conf so that my domain can host subdirectories on the same server, such as http://domain.com/subdomain1, http://domain.com/subdomain2, and so on. All my subdomain1 and subdomain2 directory folders reside on the same web server, just in different locations. Below is the example from my httpd.conf for the subdomain1 alias block:

        Alias /subdomain1 "/usr/local/www/subdomain1"
        <Directory "/usr/local/www/subdomain1">
            Options +Indexes
            AllowOverride None
            allow from all
        </Directory>

    I'm looking for a way to have the subdomain1 and subdomain2 directories read from another server on my LAN while remaining hosted as http://domain.com/subdomain1. I'd really appreciate it if anyone knows how to do this. Thanks,
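    A hedged sketch of one common approach: a reverse proxy with mod_proxy, so Apache fetches the subdirectory from the LAN machine while the public URL stays the same (the internal address is an assumption):

        # httpd.conf, with mod_proxy and mod_proxy_http loaded
        ProxyPass        /subdomain1 http://192.168.1.10/subdomain1
        ProxyPassReverse /subdomain1 http://192.168.1.10/subdomain1

    An alternative that keeps the existing Alias block unchanged is mounting the remote directory onto /usr/local/www/subdomain1 over NFS.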

  • Windows 7 Permissions

    - by Scott
    I have an odd problem with a Windows 7 laptop. It's currently a single-user installation, a fresh install on an Asus laptop. I have an svn repo checked out on my second partition, and a directory in it which I have added to the svn:ignore list because it is for tmp files. This specific directory shows as read-only, and I need write access on it for my project to function properly. If I right-click, set the directory to not be read-only, and apply this recursively, it is immediately reverted back to read-only. I have also modified Apache's service to run as myself, to no avail. I'm stumped... Any ideas?
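    Worth knowing here: for folders, the Explorer "Read-only" checkbox reflects a shell attribute that Windows 7 reports as set for most directories regardless of actual writability, which is why it appears to revert. Effective write access is governed by NTFS ACLs instead. A hedged sketch for clearing the attribute and granting modify rights (the path and account are assumptions; run from an elevated command prompt):

        attrib -R /S /D D:\repo\tmpdir
        icacls D:\repo\tmpdir /grant %USERNAME%:(OI)(CI)M

    If Apache still cannot write afterwards, granting the same modify right to the account the Apache service runs under would be the next thing to try.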

  • Cannot access Windows share

    - by Wrigley
    I have an accounting program specifically tailored to the IT industry; it's called Fincon (exe-based). It basically works on a client/server shared-directory system. The server side currently runs on a Windows 7 machine with an NTFS partition. I have installed Wine and mounted the shared Windows directory with what I assume is the correct command:

        mount -t smbfs //servername/sharedir /mnt/fincon -o username=username,password=password

    I can see the shared dir, although it takes a while to access it on the Ubuntu machine via the mounted directory (it is instant via normal network browsing). I have also mapped the mounted directory to my D: drive in Wine and pointed fincon.ini to read the server field from D: directly. Here's my issue: for some odd reason I can't write to the mounted directory from Ubuntu, yet I can from my Windows machines. The permissions are set correctly on the Windows 7 share, and I really don't know what I'm missing. I'm quite a Linux noob; I just switched yesterday. Thanks, guys, for any help in this; it would be much appreciated, as I'm pulling out my hair here and really want to migrate my work PCs to a Linux OS, since it gives fewer issues than Windows ever does.
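    When Windows clients can write but the Linux side cannot, the usual cause is that the share was mounted without a uid/gid mapping, so the mounted files belong to root locally. A hedged sketch using the current cifs driver (smbfs is deprecated; the uid/gid values are assumptions for the first desktop user):

        sudo mount -t cifs //servername/sharedir /mnt/fincon \
            -o username=username,password=password,uid=1000,gid=1000,rw

    With uid/gid set to the account that runs Wine, writes through the D: mapping should behave like local file access.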

  • How can the shared hosting server provide unlimited physical subdomains as opposed to unlimited virtual subdomains?

    - by xport
    Some hosting companies offer unlimited subdomains. There are two kinds of subdomains: physical and virtual. A physical subdomain has its own site directory rather than being nested inside the site directory of its parent domain. A virtual subdomain's site directory, on the other hand, is nested inside the site directory of its parent domain. I wonder how a shared hosting company can provide (theoretically) unlimited physical subdomains. In my understanding, each physical subdomain represents a new site (rather than a new application or virtual directory) in IIS. Please correct me if my mental model is wrong.
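    For comparison, this is how Apache-style hosts make "unlimited" physical subdomains cheap: one catch-all definition plus a wildcard DNS record maps every subdomain name onto its own directory, so no per-site configuration is ever written. A hedged sketch using mod_vhost_alias (domain and paths are assumptions); IIS hosts get the same effect with a wildcard host-header binding plus a provisioning script that registers each new site through the IIS management API:

        # httpd.conf -- requires mod_vhost_alias and a DNS record for *.example.com
        UseCanonicalName Off
        <VirtualHost *:80>
            ServerAlias *.example.com
            # %1 expands to the leftmost label of the Host header,
            # e.g. "blog" for blog.example.com
            VirtualDocumentRoot /var/www/vhosts/%1
        </VirtualHost>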

  • How do I use tar -C on Snow Leopard when creating an archive?

    - by ssc
    The man page states:

        -C directory
            In c and r mode, this changes the directory before adding the following files.

    However, tar does not change to the directory I specify, but instead reports

        tar: <folder name>: Cannot stat: No such file or directory

    for every folder in the directory I run the tar command in. Do I really have to do something like cd <folder> && tar ... && cd -, or is there a way to get this to work?
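    The detail that usually bites here: -C only affects the member names that come after it, those names must exist inside the new directory, and a shell glob like * expands in your current directory before tar ever runs. A hedged sketch (paths are assumptions):

        # correct: the member name comes after -C and is relative to it
        tar -cvf archive.tar -C /path/to/parent foldername

        # classic gotcha: * expands to the names in your cwd, which tar then
        # fails to stat inside /path/to/parent -- one error per folder
        tar -cvf archive.tar -C /path/to/parent *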

  • C#: Parsing information out of a path

    - by mbcrump
    If you have a path, for example \\MyServer\MyFolderA\MyFolderB\MyFile.jpg, and you need to parse out the file name, the directory name, or the directory's parent name, you should use the FileInfo class instead of a regex or a string split. See the example code below:

        using System;
        using System.IO;

        class Test
        {
            static void Main(string[] args)
            {
                string file = @"\\MyServer\MyFolderA\MyFolderB\MyFile.jpg";
                FileInfo fi = new FileInfo(file);
                Console.WriteLine(fi.Name);                  // Prints MyFile.jpg
                Console.WriteLine(fi.Directory.Name);        // Prints MyFolderB
                Console.WriteLine(fi.Directory.Parent.Name); // Prints MyFolderA
            }
        }
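    If only the strings are needed, the static System.IO.Path methods do the same parsing without constructing a FileInfo; a small sketch:

        using System;
        using System.IO;

        class PathDemo
        {
            static void Main()
            {
                string file = @"\\MyServer\MyFolderA\MyFolderB\MyFile.jpg";
                Console.WriteLine(Path.GetFileName(file));      // MyFile.jpg
                Console.WriteLine(Path.GetDirectoryName(file)); // \\MyServer\MyFolderA\MyFolderB
            }
        }

    Path operates purely on the string and never touches the file system, whereas FileInfo also gives access to metadata such as size and timestamps.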

  • Syncing Large Directories/Filesystems using USB Drive [closed]

    - by Alan Lue
    Does anyone have a solution for syncing large directories/filesystems using just a USB flash drive (and specifically without using a network connection)? The objective is simply to sync a user directory between two computers. The contents of the user directory could amount to a large quantity of data, say, a quantity larger than could be stored on any single USB drive, but the aggregate size of changes that must be propagated by a single sync could easily fit on a USB drive. As an example, suppose a user directory is already synchronized between a desktop and a laptop computer. Here's a use case:

        1. Some changes are made in the user directory on the desktop.
        2. We mount a USB drive onto the desktop and copy whatever changes need to be applied to the laptop user directory in order to synchronize the desktop and laptop user directories.
        3. We now mount the USB drive onto the laptop and apply the changes.
        4. The desktop and laptop user directories are now synchronized.

    Any ideas? Alan
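    rsync's batch mode was built for exactly this sneakernet scenario: the desktop keeps a mirror of the laptop's last-known state, rsync writes the delta to a batch file on the USB drive, and the laptop replays it. A hedged sketch (paths are assumptions):

        # on the desktop: diff the live directory against the mirror, record the
        # delta on the USB drive, and bring the mirror up to date in one pass
        rsync -a --write-batch=/media/usb/home.batch ~/userdir/ ~/laptop-mirror/

        # on the laptop: replay the recorded delta against the real directory
        rsync -a --read-batch=/media/usb/home.batch ~/userdir/

    The mirror costs disk space on the desktop, but it spares the USB drive from ever holding more than one sync's worth of changes.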

  • Emacs: Changing the location of auto-save files

    - by Dominic Rodger
    I've currently got:

        (setq backup-directory-alist
              `((".*" . ,temporary-file-directory)))
        (setq auto-save-file-name-transforms
              `((".*" ,temporary-file-directory t)))

    in my .emacs, but that doesn't seem to have changed where auto-save files get saved (it has changed where backup files get saved). M-x describe-variable shows that temporary-file-directory is set to /tmp/, but when I edit a file called testing.md and have unsaved changes, I get a file called .#testing.md in the same directory. How can I make that file go somewhere else (e.g. /tmp/)? I've had no luck with these suggestions, so any suggestions are welcome! If it helps, I'm on GNU Emacs 23.3.1, running Ubuntu.
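    A detail that likely resolves this: .#testing.md is not an auto-save file but a lock (interlock) file; auto-saves are named #testing.md#, and the transforms above affect only the latter. Lock files are symlinks Emacs drops next to a modified file to warn other sessions, and their location cannot be redirected. Newer Emacs versions can disable them entirely; a hedged sketch:

        ;; Emacs 24.3 and later only -- this variable does not exist in 23.3,
        ;; where lock-file creation cannot be turned off
        (setq create-lockfiles nil)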

  • Mac OS X 10.6 executable not found without full path

    - by Danack
    I just installed Apache via MacPorts. It seems that my Mac was absolutely confused about which version of the Apache executables to run. After moving the Apache executables that ship with the Mac to a directory that is not listed in the PATH variable, trying to run the httpd built by MacPorts fails even though the correct directory (/opt/local/apache2/bin) is listed in the PATH variable. If I navigate to the directory /opt/local/apache2/bin and type the command httpd, I still get the error message:

        -bash: httpd: command not found

    If I type the command with the full path /opt/local/apache2/bin/httpd, it works fine. I've run the alias command to see if something was clashing, but the only thing listed is:

        alias wget='curl -O'

    How do I find what is intercepting the command and preventing the executable from being found, even when I'm inside the same directory? By the way, the httpd file is executable:

        -rwxr-xr-x  1 root  admin  442496  9 May 2012 httpd
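    Two things worth checking, sketched below. First, being inside /opt/local/apache2/bin does not help by itself: bash resolves a bare name only through PATH (unless "." is on it), so the real question is what this particular shell's PATH contains. Second, bash caches command locations in a hash table, and stale entries from before the executables were moved can linger until the cache is cleared:

        echo "$PATH" | tr ':' '\n'   # is /opt/local/apache2/bin really listed here?
        hash -r                      # drop bash's cached command locations
        type -a httpd                # show everywhere 'httpd' now resolves to

    If PATH looks right only in some terminals, the export is probably sitting in the wrong startup file (~/.bash_profile vs ~/.bashrc) for the way the shell is launched.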

  • Dark NetBeans

    - by Geertjan
    Let's make NetBeans IDE look like this. Not saying it's a nice color or anything, just that it's possible to do so. I changed the coloring in the Java editor by going to Tools | Options, then chose "Fonts & Colors", then selected the "Norway Today" profile and changed the background setting to Dark Gray.

    Next, I put this themes.xml file into the "config" folder of the NetBeans IDE user directory, which you can identify by going to Help | About in the IDE. Go to the exact location defined by "User directory" in Help | About, and then go to the "config" folder within that folder. The "config" folder of the user directory is the readable/writable root of the NetBeans IDE virtual filesystem. If a themes.xml file is found there, it is used, as described here.

    Then, in the netbeans.conf file, which is not in the NetBeans user directory but in the NetBeans installation directory, within its "etc" folder, I added the following to "netbeans_default_options":

        -J-Dnetbeans.useTheme=true --laf Metal

    The first of these enables usage of the themes.xml file, i.e., it notifies NetBeans IDE at startup to load the themes.xml file and apply its content to the relevant UI components, while the second is needed because most (if not all) of the themes only work if you're using the Metal look and feel.

    Note: I must add that in most cases, whatever it is you're trying to achieve via a themes.xml file can probably be achieved in a different, and better, way. The themes.xml mechanism has been there forever, but it is not actively supported or tested, though it may work for the specific thing you're trying to do anyway. For example, if you're trying to change the background color of a TopComponent, use the paintComponent method of the TopComponent instead of using a themes.xml file.
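    Pulled together, the edited line in <NetBeans install dir>/etc/netbeans.conf would look roughly like this (the pre-existing default options vary per release, so the elided part is whatever was already there):

        netbeans_default_options="... -J-Dnetbeans.useTheme=true --laf Metal"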

  • How do I map some subdirectories to run alongside a Drupal site?

    - by paradroid
    I have a Drupal site running on Apache using the following vhosts file:

        <VirtualHost xx.xx.xx.xx:80>
            ServerName bananas.net
            ServerAlias www.bananas.net
            DocumentRoot /var/www/drupal/
            RewriteEngine On
            RewriteCond %{HTTP_HOST} !=bananas.net [NC]
            RewriteRule ^(.*)$ http://bananas.net$1 [L,R=301]
            <Directory /var/www/bananas.net/>
                Options -Indexes FollowSymlinks
                AllowOverride All
                Order allow,deny
                Allow from all
            </Directory>
            CustomLog ${APACHE_LOG_DIR}/access.log combined
            ErrorLog ${APACHE_LOG_DIR}/error.log
        </VirtualHost>

    I set it up some time ago, so I am not sure what the <Directory /var/www/bananas.net/> directive was meant for. That directory is currently empty. With the vhosts file the way it is, does the Directory directive have any effect at all? I want to add some content which is separate from the Drupal site. How do I add subdirectories within /var/www/bananas.net/ which can be accessed alongside the Drupal site running at the root? As they have nothing to do with the Drupal site, I want to keep the files separate, but still use the same domain.
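    A hedged sketch of one way to do this: because Alias mappings are consulted before DocumentRoot translation, extra content can live outside the Drupal tree yet still be served on the same domain (the /extras name is an assumption):

        Alias /extras /var/www/bananas.net/extras
        <Directory /var/www/bananas.net/extras>
            Options -Indexes
            AllowOverride None
            Order allow,deny
            Allow from all
        </Directory>

    Drupal's own rewrite rules live in the .htaccess under /var/www/drupal, so they never see requests the Alias has already routed elsewhere. This also answers the first question: a <Directory> block by itself maps nothing; it only sets policy for requests that already resolve into that directory, so with the directory empty it currently has no effect.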

  • VMware Workstation 7&8&9 does not generate /etc/vmware/networking upon installation

    - by dash17291
    When I install VMware Workstation on Arch Linux, virtual ethernet is not working:

        $ sudo tail /var/log/vnetlib
        Aug 28 22:20:33 VNLFileExists - Cannot check for file or directory: /etc/vmware/networking , error: No such file or directory
        Aug 28 22:20:33 VNLNetCfgLoad - Import file does not exist
        Aug 28 22:20:33 VNL_Load - Error loading the vnet configuration, file used: /etc/vmware/networking
        Aug 28 22:20:33 VNLNetCfgUnload - Requested cache is not loaded
        Database file is not present. Failed to initialize
        Aug 28 22:20:41 VNLFileExists - Cannot check for file or directory: /etc/vmware/networking , error: No such file or directory
        Aug 28 22:20:41 VNLNetCfgLoad - Import file does not exist
        Aug 28 22:20:41 VNL_Load - Error loading the vnet configuration, file used: /etc/vmware/networking
        Aug 28 22:20:41 VNLNetCfgUnload - Requested cache is not loaded
        Required modules compiled.

    Previously I copied that file (or directory, I don't remember which) over from a working installation, but now I need a real solution. It may not even be specific to Arch: the same thing happens with Ubuntu on the same computer, which makes me wonder about a hardware issue.
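    A hedged sketch of a commonly suggested recovery path: recent Workstation builds on Linux ship a vmware-networks helper that can regenerate the virtual-network database, so it is worth checking that these commands exist in your installation before relying on them:

        # stop, then restart the virtual networking service; with
        # /etc/vmware/networking missing, this should recreate a default config
        sudo vmware-networks --stop
        sudo vmware-networks --start

        # or configure the virtual networks interactively
        sudo vmware-netcfg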

  • Apache says DocumentRoot doesn't exist when it does

    - by Jakobud
    I used Webmin to create the following virtual host:

        <VirtualHost *:80>
            DocumentRoot "/var/www/whatever"
            ServerName whatever.ourdomain
            <Directory "/var/www/whatever">
                allow from all
                Options +Indexes
            </Directory>
        </VirtualHost>

    And when restarting Apache I get:

        Starting httpd: Warning: DocumentRoot [/var/www/whatever] does not exist

    The thing is, the directory absolutely DOES exist. I'm staring right at it; pwd shows me that's my current directory, etc. It's not that hard to spell it right. I can't find any other errors or warnings in the httpd logs. apache:apache owns the directory and all subdirectories/files, and there aren't any symlinks or anything involved here. What am I missing, or what else should I look at to determine why this is? The OS is CentOS 6.0.
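    On CentOS, the classic cause of "DocumentRoot does not exist" for a directory that plainly exists is SELinux: a directory created elsewhere and moved into /var/www keeps its old security context, and httpd's startup check then refuses it. A hedged sketch for verifying and fixing that:

        ls -Zd /var/www/whatever                 # context should include httpd_sys_content_t
        sudo restorecon -Rv /var/www/whatever    # reapply the default /var/www labeling
        sudo service httpd restart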

  • What is the logic behind the PHP extensions/modules directory structure?

    - by dotpointer
    I'm trying to configure/build PHP 5.3.10 on Linux/Slackware 12, but the extensions end up in the wrong directory when I run make install. In the php.ini file, the extension dir is defined as:

        /usr/lib/php/extensions

    The problem is that when I run "make install", the newly built extensions are copied to a subfolder of the extensions directory:

        /usr/lib/php/extensions/no-debug-non-zts-20090626

    What am I supposed to do with this: copy the files down from the no-debug-non-zts-20090626 directory into the extensions directory, create symlinks from extensions to the modules in the no-debug-non-zts-20090626 directory (which would take a lot of time), or what? (I know I can do any of them, but I want to know the correct way...)
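    The layout is deliberate: PHP's build system appends the Zend module API date (20090626 belongs to the PHP 5.3 series) plus the debug/thread-safety flavour to the extension directory, so modules built against different ABIs can never be mixed. The conventional fix is to point php.ini at the versioned directory rather than moving files; a sketch:

        ; php.ini -- match the directory make install actually used
        extension_dir = "/usr/lib/php/extensions/no-debug-non-zts-20090626"

    Running php-config --extension-dir prints the directory the build was configured for, which helps keep the two in sync.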

  • How do I not show files on the main screen

    - by ChuckMcM
    OK, so I upgraded to 12.10 and have been trying out Unity, and my screen has become a complete mess of folders and files. "Back in the day", the folders shown on my screen were the ones in the ~/Desktop directory; now it seems like all the files in my login directory are there (and that is a lot of files). Is there some way to set the files being displayed to come from a specific directory? If so, how? I think I've gone through every panel of the System Settings application.
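    What the desktop shows is determined by the XDG user-dirs configuration rather than by Unity itself: if XDG_DESKTOP_DIR has fallen back to $HOME (which can happen across upgrades), every file in the home directory appears on the desktop. A hedged sketch for checking and restoring it:

        xdg-user-dir DESKTOP      # prints the directory currently used as the desktop

        # in ~/.config/user-dirs.dirs, the entry should read:
        XDG_DESKTOP_DIR="$HOME/Desktop"

        # then log out and back in for the change to take effect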

  • Why do I get this error trying to compile libxml2?

    - by bfaskiplar
    Although I have installed libxml2 once and reinstalled it a few times, I cannot compile C source code because the compiler cannot find the header file. I am able to locate where it is (in the folder where I downloaded the tar.gz package), but I have a feeling in my gut that this package isn't installed correctly, because when I try sudo make install, it says:

        /bin/bash: /home/bfaskiplar/Downloads/tar.gz: No such file or directory
        make[2]: *** [install-libLTLIBRARIES] Error 127
        make[2]: Leaving directory `/home/bfaskiplar/Downloads/tar.gz packages/libxml2-2.8.0'
        make[1]: *** [install-am] Error 2
        make[1]: Leaving directory `/home/bfaskiplar/Downloads/tar.gz packages/libxml2-2.8.0'
        make: *** [install-recursive] Error 1

    That's why I installed the Synaptic package manager and reinstalled it, but in that case, isn't it supposed to put the header files in the default directory where gcc normally searches? Currently I am able to compile with the -I option, but I wonder why I have to copy headers manually even when I use Synaptic for the installation, and why I get Error 1 and Error 2 when trying to install the package manually. Thanks in advance.
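    The log itself points at the likely culprit: the source tree lives under "Downloads/tar.gz packages", and the space in that path makes the install rule execute "/home/bfaskiplar/Downloads/tar.gz" as a command, hence Error 127 (command not found). Autotools builds are generally unsafe in paths containing spaces. A hedged sketch of the workaround:

        # rebuild from a space-free path (the target directory is an assumption)
        mv ~/Downloads/"tar.gz packages/libxml2-2.8.0" ~/libxml2-2.8.0
        cd ~/libxml2-2.8.0
        ./configure && make && sudo make install

    As for the Synaptic route: distributions ship libxml2's headers in a separate development package, and they land in /usr/include/libxml2 rather than /usr/include, so compiles still need the include path, conventionally obtained via xml2-config --cflags.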

  • Apache2 WebServer not allowing me to view website/files in /var/www

    - by CitadelCSAlum
    I used to be able to access websites/files that were stored in the directory /var/www. I have not used this for a while, but now I need to store media in this directory or in /var/www/images. I noticed that my Apache web server wasn't running correctly, so I did a complete package removal and then reinstalled, but I am still unable to access a test page index.html in /var/www by going to http://myipaddresshere/index.html. Is there some initial configuration I need to do to allow me to store HTML and media files in this directory and access them from the browser? I don't remember having to do anything before.
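    A hedged first-pass checklist, assuming a stock Debian/Ubuntu-style Apache2 whose default site serves /var/www:

        sudo service apache2 status            # is Apache actually running?
        sudo netstat -tlnp | grep :80          # is anything listening on port 80?
        ls -l /var/www/index.html              # does the page exist, world-readable?
        tail -n 20 /var/log/apache2/error.log  # any hint logged on failed requests?

    If the reinstall left the default site disabled, re-enabling it would be the remaining step (on that era of Ubuntu, sudo a2ensite default followed by a reload).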

  • Help converting to C++ (6 replies)

    I have two lines of basic C# code:

        System.IO.Directory.CreateDirectory(path);
        return System.IO.Directory.Exists(path);

    I want to bury this in a C++ Win32 project. My effort so far (which does not work, and I do not understand the error messages) is:

        #using mscorlib.dll
        using namespace System;
        using namespace System::IO;
        bool* clrcall;bool CDirectory(String path)
        {
            DirectoryInfo d Directory::CreateDirec...
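    For reference, a hedged sketch of the equivalent in C++/CLI (the managed C++ dialect this attempt resembles; it must be compiled with /clr, and the function name follows the poster's):

        // compile with: cl /clr /c createdir.cpp
        #using <mscorlib.dll>
        using namespace System;
        using namespace System::IO;

        bool CDirectory(String^ path)
        {
            // CreateDirectory is a no-op if the directory already exists
            Directory::CreateDirectory(path);
            return Directory::Exists(path);
        }

    Managed reference types need handle syntax (String^ rather than String), and the stray "bool* clrcall" in the attempt looks like a mangled __clrcall calling-convention specifier.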

  • Comparison in Monit Permissions Testing

    - by beanland
    I'm trying to use Monit to check the permissions of a particular directory, but I only care that it's readable by all users. I don't care about any of the other permissions (write, execute) for the owner, group, or others, nor about any special permission bits. Given that I can't change the permissions of this directory, and that another administrator might change them without affecting the processes of mine that rely on it (e.g., granting or revoking the group's write access), is it possible to check for a minimum permission in Monit? I have this, which currently works:

        check directory archive path /var/home/archive/
            if failed perm 0755 then alert

    But I would like to have something like this:

        check directory archive path /var/home/archive/
            if failed perm > 444 then alert

    This is failing for me. Is it possible to use comparison operators in Monit's permissions testing? If not, are there any workarounds?
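    Monit's perm test is an exact match, so one common workaround (on Monit versions that support "check program", 5.x and later) is to delegate the comparison to a small script and alert on its exit status. A hedged sketch, with the script path as an assumption:

        #!/bin/sh
        # /usr/local/bin/check-readable.sh
        # succeed only when owner, group, and others all have the read bit set
        find /var/home/archive/ -maxdepth 0 ! -perm -444 | grep -q . && exit 1
        exit 0

    and on the Monit side:

        check program archive_readable with path /usr/local/bin/check-readable.sh
            if status != 0 then alert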

  • I deleted all files and folders (including hidden) from /home/username/, now in big trouble

    - by jeffery_the_wind
    I am logged into a remote Ubuntu server, and I accidentally erased the entire /home/username/ directory for the current user. The only thing left is a hidden directory called .gvfs. I don't need any of the Documents/Music/etc. content. Now it is not letting me cd into the /var/www/ directory, which has permissions 666 and is owned by the current user. I am afraid to disconnect from my ssh session because I don't know if I will be able to get back on. Have I created a permanent problem? Is there a way I can replace the most important files in the /home/username/ directory? Thanks!

    ** EDIT **

    Thanks everyone for the help. The problem with cd into /var/www/ was actually the permissions on /var/www/ itself: it was set to 666; I changed it to 755 and everything was good again. It doesn't look like anything systemic was broken by deleting the contents of the user folder.
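    For restoring the standard dotfiles, the stock copies live in /etc/skel, which is what adduser seeds new home directories from; a hedged sketch (username is a placeholder):

        cp -r /etc/skel/. /home/username/      # the trailing /. also copies hidden files
        sudo chown -R username:username /home/username

    The EDIT's finding is consistent with how directory permissions work: mode 666 lacks the execute (search) bit, and without x you cannot cd into or traverse a directory regardless of ownership, which is why 755 fixed it.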

  • How do I copy a file from source to destination only once, without retrying?

    - by Viswa
    I have to copy a file from my desktop to my mounted directory. I was using the following command:

        os.system("cp -f /home/Desktop/filename /media/folder_1")

    It works fine. The problem is that while copying the file from my source to the mounted directory (folder_1), if any interruption happens, such as the network going down, then the system keeps trying continuously; it can't skip that process. Then, when the network comes back, the files are copied to my mounted directory again. Due to this continuous retrying, the next time I try to move the content it throws a "permission denied" error. How do I copy the file only once, so that if any network issue happens it does not keep trying to copy, but throws an error instead? If you know, let me know; it would be very useful to me.
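    The endless retry is typically the hard-mount behaviour of the network filesystem (I/O blocks until the server responds) rather than cp itself. From Python, one hedged sketch is to run the copy once under a timeout and surface the failure instead of hanging (requires Python 3.5+ for subprocess.run; the timeout value is an assumption):

        import subprocess

        def copy_once(src, dst, timeout_s=30):
            """Attempt the copy a single time; raise instead of retrying."""
            try:
                subprocess.run(["cp", "-f", src, dst],
                               check=True, timeout=timeout_s)
            except subprocess.TimeoutExpired:
                raise OSError("copy timed out; network share may be down")
            except subprocess.CalledProcessError as e:
                raise OSError("cp failed with exit code %d" % e.returncode)

        copy_once("/home/Desktop/filename", "/media/folder_1/filename")

    Mounting the share with an option that makes I/O fail rather than block (e.g. NFS "soft" with a timeo value) addresses the same problem at the filesystem level.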
