Search Results

Search found 16683 results on 668 pages for 'directory junction'.

  • Fluent nHibernate - How to map a non-key column on a junction table?

    - by The Matt
    Taking an example that is provided on the Fluent nHibernate website, I need to extend it slightly: I need to add a 'Quantity' column to the StoreProduct table. How would I map this using nHibernate? An example mapping is provided for the given scenario above, but I'm not sure how I would get the Quantity column to map to a property on the Product class: public class StoreMap : ClassMap<Store> { public StoreMap() { Id(x => x.Id); Map(x => x.Name); HasMany(x => x.Employee) .Inverse() .Cascade.All(); HasManyToMany(x => x.Products) .Cascade.All() .Table("StoreProduct"); } }
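    A common way to handle an extra column on a junction table is to promote the junction row to an entity of its own, so Quantity becomes an ordinary mapped property. The sketch below is only an illustration, not the article's code: the class and column names (StoreProduct, StoreId, ProductId, StoreProducts) are assumptions.

        // Minimal Fluent NHibernate sketch: map the StoreProduct row itself.
        using FluentNHibernate.Mapping;

        public class StoreProduct
        {
            public virtual Store Store { get; set; }
            public virtual Product Product { get; set; }
            public virtual int Quantity { get; set; }

            // Composite-id entities need value equality; the two key references are the usual choice.
            public override bool Equals(object obj)
            {
                var other = obj as StoreProduct;
                return other != null && Equals(other.Store, Store) && Equals(other.Product, Product);
            }
            public override int GetHashCode()
            {
                return (Store == null ? 0 : Store.GetHashCode()) ^ (Product == null ? 0 : Product.GetHashCode());
            }
        }

        public class StoreProductMap : ClassMap<StoreProduct>
        {
            public StoreProductMap()
            {
                Table("StoreProduct");
                CompositeId()
                    .KeyReference(x => x.Store, "StoreId")      // column names assumed
                    .KeyReference(x => x.Product, "ProductId");
                Map(x => x.Quantity);
            }
        }

        // In StoreMap the many-to-many is then replaced with something like:
        //   HasMany(x => x.StoreProducts).KeyColumn("StoreId").Inverse().Cascade.AllDeleteOrphan();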

    Read the article

  • Why does my code fail to create a directory in "C:\Program Files" under Windows 7?

    - by sunil.nishad87
    I am using Windows 7 and need to run a program there that works correctly on Windows XP. It is a Visual C++ program built with Visual Studio 2008. When I run my application it does not throw any errors, but it also does not create a directory under "C:\Program Files\". Can anyone help me get the directory and the exe created? This is the code I am using:

    char szAppPath[MAX_PATH];
    char szFileName[MAX_PATH];
    DWORD dwResult;
    WIN32_FIND_DATA FindFileData;
    HANDLE hFind;
    dwResult = ExpandEnvironmentStrings( NULL, szAppPath, MAX_PATH); // "%ProgramFiles%"
    // do same for NSim directory
    strcat(szAppPath,"\\NSim");
    hFind = FindFirstFile(szAppPath, &FindFileData);
    if (hFind == INVALID_HANDLE_VALUE)
    {
        // Directory doesn't exist, create a new one
        if(!CreateDirectory(szAppPath,NULL)) // Throw error
        {
            MessageBox("Unable to Create N-SIM directory","NSim Installer");
            return ;
        }
    }
    else
    {
        // Check whether it is a directory or not
        if(!(FindFileData.dwFileAttributes & FILE_ATTRIBUTE_DIRECTORY))
        {
            MessageBox("Can't Create N-SIM directory\n Another file with same name exists","NSim Installer");
            return ;
        }
        FindClose(hFind);
    }
    //*************************** N-SIM Application ***************************
    strcpy(szFileName, szAppPath);
    HRSRC hRes;
    if( bRegister == FALSE)
    {
        strcat(szFileName,"\\NSim.exe"); // use the same name for the Client & Server under Program Files
        hRes = FindResource(NULL, MAKEINTRESOURCE(IDR_LANSIMSERVER),RT_RCDATA);
        if(flagUpgrade ==0)
        {
            CString trial = installationDate();
            // ----- Determine expiry date -----
            setRegistry(trial);
        }
    }
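    Two things stand out in the snippet: ExpandEnvironmentStrings is called with NULL as the source string, so szAppPath is never filled in, and on Windows 7 a non-elevated process cannot create directories under C:\Program Files (the write is blocked or virtualized). Below is a minimal sketch of the corrected path handling, assuming the installer runs elevated (e.g. via a requireAdministrator manifest); it is an illustration, not the original program.

        #include <windows.h>
        #include <shlobj.h>
        #include <stdio.h>
        #include <string.h>

        int main()
        {
            char szAppPath[MAX_PATH] = {0};

            // Expand the variable by name (the original call passed NULL as the source).
            DWORD dwResult = ExpandEnvironmentStringsA("%ProgramFiles%", szAppPath, MAX_PATH);
            if (dwResult == 0 || dwResult > MAX_PATH)
                return 1;

            // Alternative: SHGetFolderPathA(NULL, CSIDL_PROGRAM_FILES, NULL, 0, szAppPath);

            strcat_s(szAppPath, MAX_PATH, "\\NSim");

            if (!CreateDirectoryA(szAppPath, NULL) && GetLastError() != ERROR_ALREADY_EXISTS)
            {
                // On Windows 7 this fails unless the process runs elevated.
                printf("CreateDirectory failed, error %lu - is the installer elevated?\n", GetLastError());
                return 1;
            }
            return 0;
        }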

    Read the article

  • Microsoft updates Windows Azure Web Sites and Azure Active Directory for identity management and Web hosting in the cloud

    Microsoft updates Windows Azure Web Sites and Azure Active Directory for Web hosting and identity management in the cloud. Through Scott Guthrie, Vice President of the Server and Business Tools division, Microsoft has just announced an update to Windows Azure Web Sites as well as to the Windows Azure Active Directory (WAAD) service. Azure Web Sites is a platform for hosting websites and Web applications in the Azure cloud. The goal of the Azure Web Sites infrastructure is to make hosting available both in the cloud and locally on serv...

    Read the article

  • How do you get the solution directory in C# (VS 2008) in code?

    - by IsaacB
    Hi, I've got an annoying problem here. I have an NHibernate/WinForms application I'm working on through SVN. I made some of my own controls, but when I drag and drop those (or view some form editors where I have already dragged and dropped) onto some of my other controls, Visual Studio decides it needs to execute some of the code I wrote, including the part that looks for hibernate.cfg.xml. I have no idea why this is, but (sometimes!) when it executes the code during my form load or drag and drop it switches the current directory to C:\program files\vs 9.0\common7\ide, and then NHibernate throws an exception that it can't find hibernate.cfg.xml, because I'm searching for that via a relative path. Now, I don't want to hard-code the location of hibernate.cfg.xml, or just copy hibernate.cfg.xml to the IDE directory (which would work). I want a solution that finds the solution directory while the current directory is Common7\IDE, something that will let someone view my forms in the designer on a fresh checkout to an arbitrary directory on an arbitrary machine. And no, I'm not about to load the controls in code. I have so many controls within controls that it is a nightmare to line everything up without the designer. I tried a pre-build event that wrote a file containing the solution directory, but of course how can I find that file from Common7\IDE? All the project files need to be in the solution directory because of SVN. Thanks for your help guys, I've already spent a few hours fiddling with this in vain.
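    A sketch of the two-part workaround usually suggested for this kind of problem: skip the NHibernate bootstrap entirely while the designer is instantiating the control, and when the code does run for real, resolve hibernate.cfg.xml relative to the loaded assembly rather than the current directory. The class and method names below are made up, and since the designer shadow-copies assemblies the walk-up may need adjusting.

        using System;
        using System.ComponentModel;
        using System.IO;
        using System.Reflection;

        public static class ConfigLocator
        {
            public static bool InDesignMode()
            {
                // True while Visual Studio's designer is instantiating the control.
                return LicenseManager.UsageMode == LicenseUsageMode.Designtime;
            }

            public static string FindHibernateConfig()
            {
                // Start at the folder the assembly was loaded from and walk upwards
                // until hibernate.cfg.xml turns up (e.g. in the project root).
                string dir = Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location);
                while (dir != null)
                {
                    string candidate = Path.Combine(dir, "hibernate.cfg.xml");
                    if (File.Exists(candidate))
                        return candidate;
                    dir = Path.GetDirectoryName(dir);
                }
                throw new FileNotFoundException("hibernate.cfg.xml not found above the assembly location.");
            }
        }

        // In the control/form: if (ConfigLocator.InDesignMode()) return;  before any NHibernate work.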

    Read the article

  • Du tells me it can't find the current directory?

    - by C. Ross
    I'm on AIX, and in some directories I can't use the du command. I get the following error message: du: 0653-175 Cannot find the current directory. Obviously the current directory exists, and I have permissions to it. I can list the directory and create files in it, both before and after running du. What could possibly be wrong here? The du command works just fine in my home directory. A quick Google search turns up a bunch of forum posts describing the same problem, but no clear answers.
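    For what it's worth (a guess, not from the question): this message usually means getcwd() is failing, either because the directory was removed and recreated underneath the shell, or because a parent directory lacks read/execute permission for the user. A couple of quick checks:

        cd "$PWD" && du -sk .             # re-enter the directory by its path
        ls -ld "$PWD" $(dirname "$PWD")   # inspect the parent's permissions too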

    Read the article

  • Maximum number of files in one ext3 directory while still getting acceptable performance?

    - by knorv
    I have an application writing to an ext3 directory which over time has grown to roughly three million files. Needless to say, reading the file listing of this directory is unbearably slow. I don't blame ext3. The proper solution would have been to let the application code write to sub-directories such as ./a/b/c/abc.ext rather than using only ./abc.ext. I'm changing to such a sub-directory structure, and my question is simply: roughly how many files should I expect to store in one ext3 directory while still getting acceptable performance? What's your experience? Or, in other words: assuming that I need to store three million files in the structure, how many levels deep should the ./a/b/c/abc.ext structure be? Obviously this is a question that cannot be answered exactly, but I'm looking for a ballpark estimate.
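    There is no single magic number, but as a rough illustration of how quickly fan-out brings the per-directory count down, here is a small hashing sketch (names made up): two levels of 256 buckets give 65,536 leaf directories, so three million files works out to about 46 per leaf.

        import hashlib
        import os

        def shard_path(root, filename, levels=2, width=2):
            """Place a file under nested bucket directories derived from a hash of its name."""
            digest = hashlib.md5(filename.encode("utf-8")).hexdigest()
            parts = [digest[i * width:(i + 1) * width] for i in range(levels)]
            return os.path.join(root, *(parts + [filename]))

        print(shard_path("/data", "abc.ext"))   # e.g. /data/90/01/abc.ext (buckets depend on the hash)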

    Read the article

  • Why is .htaccess not allowed in a directory but is allowed in another?

    - by JD Isaacks
    I have apache2 installed on Ubuntu 10.04. Inside my /var/www/ directory (among others) I have cakephp and dvdcatalog directories, each of which has CakePHP 1.3 installed. I can access them both via localhost/cakephp and localhost/dvdcatalog, but dvdcatalog shows up with no CSS styling. They both have these files: /var/www/cakephp/app/webroot/css/cake.generic.css /var/www/dvdcatalog/app/webroot/css/cake.generic.css When I go to http://localhost/cakephp/css/cake.generic.css it sees the file, but it does not see the file when I go to http://localhost/dvdcatalog/css/cake.generic.css. I think this means the cakephp folder is able to use .htaccess and dvdcatalog is not. I set up the cakephp directory last month when I was following the blog tutorial. I am setting up the dvdcatalog directory now for a different tutorial, so I am not sure if I am missing a step. In my /etc/apache2/apache2.conf file I have this:

        <Directory "/var/www/*">
            Order allow,deny
            Allow from all
            AllowOverride All
        </Directory>

    Which I thought gave .htaccess access to all. Does anyone have any ideas what the problem is?
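    Missing CSS in a CakePHP app is the classic symptom of mod_rewrite/.htaccess not taking effect for that directory. A few things worth checking (paths taken from the question, commands are standard Ubuntu apache2 tooling); this is a checklist sketch, not a confirmed diagnosis:

        apache2ctl -M | grep rewrite     # is mod_rewrite loaded at all?
        ls -a /var/www/dvdcatalog/ /var/www/dvdcatalog/app/ /var/www/dvdcatalog/app/webroot/
                                         # CakePHP ships three .htaccess files; copies and
                                         # archive extractions often drop them
        sudo a2enmod rewrite && sudo /etc/init.d/apache2 restart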

    Read the article

  • How can I tell SELinux to give vsftpd write access in a specific directory?

    - by Arcturus
    Hello. I've set up vsftpd on my Fedora 12 server, and I'd like to have the following configuration. Each user should have access to: his home directory (/home/USER); the web directory I created for him (/web/USER). To achieve this, I first configured vsftpd to chroot each user to his home directory. Then, I created /web/USER with the correct permissions, and used mount --bind /web/USER /home/USER/Web so that the user may have access to /web/USER through /home/USER/Web. I also turned on the SELinux boolean ftp_home_dir so that vsftpd is allowed to write in users' home directories. This works very well, except that when a user tries to upload or rename a file in /home/USER/Web, SELinux forbids it because the change must also be done to /web/USER, and SELinux doesn't give vsftpd permission to write anything to that directory. I know that I could solve the problem by turning on the SELinux boolean allow_ftpd_full_access, or ftpd_disable_trans. I also tried to use audit2allow to generate a policy, but what it does is generate a policy that gives ftpd write access to directories of type public_content_t; this is equivalent to turning on allow_ftpd_full_access, if I understood it correctly. I'd like to know if it's possible to configure SELinux to allow FTP write access to the specific directory /web/USER and its contents, instead of disabling SELinux's FTP controls entirely.
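    A sketch of the narrower SELinux route, using the labels and booleans shipped with the targeted policy (the path pattern comes from the question): label just the /web tree as public_content_rw_t and turn on the boolean that lets ftpd write to that type, leaving the rest of the FTP policy intact.

        semanage fcontext -a -t public_content_rw_t "/web(/.*)?"
        restorecon -Rv /web
        setsebool -P allow_ftpd_anon_write on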

    Read the article

  • How to make new file permission inherit from the parent directory?

    - by Wai Yip Tung
    I have a directory called data, and I am running a script under the user id 'robot'. robot writes to the data directory and updates files inside it. The idea is that data is open for both me and robot to update, so I set up the permissions and owner group like this: drwxrwxr-x 2 me robot-grp 4096 Jun 11 20:50 data where both me and robot belong to 'robot-grp'. I changed the permissions and the owner group recursively, like the parent directory. I regularly upload new files into the data directory using rsync. Unfortunately, newly uploaded files do not inherit the parent directory's permissions as I had hoped. Instead they look like this: -rw-r--r-- 1 me users 6 Jun 11 20:50 new-file.txt When robot tries to update new-file.txt, it fails due to lack of file permission. I'm not sure if setting umask helps; in any case the new files do not really follow it. $ umask -S u=rwx,g=rx,o=rx I'm often confounded by Unix file permissions. Do I even have the right plan? I'm using Debian lenny.
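    A sketch of the usual fixes (directory name taken from the question): the setgid bit makes new entries inherit the directory's group, and either a default ACL or an rsync --chmod supplies the group write bit that the uploads are currently missing.

        chmod g+s data                            # new files/dirs inherit the robot-grp group
        setfacl -d -m u::rwx,g::rwx,o::r-x data   # default ACL: new entries get group rw
        # or force it at upload time:
        rsync -av --chmod=ug=rwX,o=rX ./src/ data/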

    Read the article

  • Directory directive: AuthType None but still need an AuthProvider?

    - by Steffen Winkler
    For now I just need the server to let me download files from one specific folder (in my case I chose /opt/myFolder for that task). The distribution is Debian 6.0. *edit_start* Apache version is 2.4; according to the official documentation, the Order/Allow clauses are deprecated and should not be used anymore. I'm an idiot: Apache version is 2.2. *edit_end* My directory directives in apache2.conf look like this:

        <IfModule dir_module>
            DirectoryIndex index.html index.htm index.php
        </IfModule>

        ServerRoot "/etc/apache2"
        DocumentRoot "/opt/myFolder"

        <Directory />
            Options FollowSymLinks
            AuthType None
            AllowOverride None
            Require all denie
        </Directory>

        <Directory "/opt/myFolder/*">
            Options FollowSymLinks MultiViews
            AllowOverride None
            AuthType None
            Require all allow
        </Directory>

    When I try to access a file inside that folder (http://myserver.de/aTestFile.zip) I get an Internal Server Error, and Apache writes the following error to its log: configuration error: couldn't check user. Check your authn provider!: /aTestFile.zip Why would I need an authn provider if I don't want any authentication? I also hope someone can explain what kind of authentication provider I'd need for that. Every time I search for those things I get pointed at people asking how to protect files/directories with passwords or restrict access to some IP addresses, which doesn't really help me. OK, since I have Apache version 2.2, here is the error I get when using the Order/Deny/Allow commands instead of AuthType/Require: Invalid command 'Order', perhaps misspelled or defined by a module not included in the server configuration.
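    For reference, a sketch of the 2.2-style equivalents of those two blocks. On 2.2 the Require/AuthType access-control form shown above is not available, and even on 2.4 the spellings would have to be "Require all denied" / "Require all granted"; "denie"/"allow" are not accepted.

        <Directory />
            Options FollowSymLinks
            AllowOverride None
            Order deny,allow
            Deny from all
        </Directory>

        <Directory "/opt/myFolder/">
            Options FollowSymLinks MultiViews
            AllowOverride None
            Order allow,deny
            Allow from all
        </Directory>

        # If Order itself is rejected, check that the authorization modules are loaded:
        #   apache2ctl -M | grep authz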

    Read the article

  • How do I correct a directory incorrectly copied into itself?

    - by Peter Boughton
    Given the following situation... <path>/mydir1/mydir2 ...where mydir2 should have overwritten mydir1, but was instead placed inside, and both directories actually have the same filename. How is that fixed? Attempting to do mv <path>/mydir/mydir/* <path>/mydir/ or mv <path>/mydir <path>/ results in: mv: cannot move `<path>/mydir/mydir` to a subdirectory of itself, `<path>/mydir` This seems stupidly simple, but it's late here and I can't figure it out. There are seventeen such directories to fix (path differs for each, but same mydir name). To confirm, the error message can be caused with this: # cd /path/to/directory # mv mydir/mydir ./ mv: cannot move `mydir/mydir' to a subdirectory of itself, `./mydir' Also tried: # mv mydir/mydir/* mydir/ mv: cannot move `mydir/mydir/otherdir1' to a subdirectory of itself, `mydir/otherdir1' mv: cannot move `mydir/mydir/otherdir2' to a subdirectory of itself, `mydir/otherdir2' and... # mv /path/to/directory/mydir/mydir/otherdir1 /path/to/directory/mydir/ mv: cannot move `/path/to/directory/mydir/mydir/otherdir1' to a subdirectory of itself, `/path/to/directory/mydir/otherdir1' and using a temporary directory: # mv mydir/mydir ./mydir-temp # mv mydir-temp/* mydir/ mv: cannot move `mydir-temp/otherdir1' to a subdirectory of itself, `mydir/otherdir1' mv: cannot move `mydir-temp/otherdir2' to a subdirectory of itself, `mydir/otherdir2' I found a similar question "How to recursively move all files (including hidden) in a subfolder into a parent folder in *nix?" which suggested that mv bar/{,.}* . would do this. But this also gives the same errors, as well as confusingly picking up . and .. from somewhere. # cd mydir # mv mydir/{,.}* . mv: cannot move `mydir/otherdir1' to a subdirectory of itself, `./otherdir1' mv: cannot move `mydir/otherdir2' to a subdirectory of itself, `./otherdir2' mv: cannot move `mydir/.' to `./.': Device or resource busy mv: cannot move `mydir/..' to `./..': Device or resource busy mv: overwrite `./.file'? y Another similar question "linux mv command weirdness" suggests that mv doesn't overwrite and a copy is required. # cd mydir # cp -rf ./mydir/* ./ cp: overwrite `./otherdir1/file1'? y cp: overwrite `./otherdir1/file2'? y cp: overwrite `./otherdir1/file3'? This appears to be working... except there's a lot of files (and dirs) - I don't want to confirm every one! Isn't the f there supposed to prevent this? Ok, so cp was aliased to cp -i (which I found out with type cp), and bypassed by using \cp -rf ./mydir/* ./ which seems to have worked. Although I've solved the problem of getting dirs/files from one place to another, I'm still curious as to what's going on with the mv stuff - is this really a deliberate feature as suggested by Warner?
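    Since the inner tree is the one that should survive, one way to sidestep mv's self-subdirectory check entirely is to merge with rsync (which is non-interactive, so no per-file prompts) and then delete the inner copy. A sketch using the question's directory names:

        rsync -a mydir/mydir/ mydir/   # trailing slash: merge contents over the outer copy
        rm -rf mydir/mydir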

    Read the article

  • How do I create an ISO image from a directory structure on CentOS?

    - by tom smith
    I'm trying to figure out the exact mkisofs cmd to create the ISO with the following directory and file structure. I've tried different commands, but when I mount the ISO that is created the directory tree has not been reproduced. The initial directory tree is: master.iso:: mount -o loop /apps/vmware/master.iso /mnt/vmtest ls /mnt/vmtest isolinux ks.cfg upgra32 upgra64 upgrade.sh ls /mnt/vmtest/isolinux boot.cat initrd.img isolinux.bin isolinux.cfg vmlinuz I've used different variations of the following mkisofs command without success: mkisofs -o '/foo/test.iso' -b 'isolinux.bin' -c 'boot.cat' -no-emul-boot -boot-load-size 4 -boot-info-table 'isolinux' How do I make an ISO that captures a directory's exact structure?
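    A sketch of a mkisofs invocation that should reproduce such a tree. The key details are that -b and -c take paths relative to the root of the tree being mastered, that -R -J preserve the long names, and that the source directory itself goes last (the source path below is a placeholder):

        mkisofs -o /foo/test.iso \
          -R -J \
          -b isolinux/isolinux.bin -c isolinux/boot.cat \
          -no-emul-boot -boot-load-size 4 -boot-info-table \
          /path/to/master-tree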

    Read the article

  • How to figure out which directory is web server root?

    - by matt
    I want to view websites hosted on my Mac when running Windows VMware Fusion. I have an entry in the Windows hosts file to enable the routing: #ip of my mac domain i use on the VM to access it 192.168.1.70 mymac However, it resolves to an empty directory as a 404 is generated. I can see the access log on my Mac that everything is OK access wise. Firefox on VMware states the following response headers: Server Apache/2.2.14 (Unix) mod_ssl/2.2.14 OpenSSL/0.9.8l DAV/2 PHP/5.3.1 Any ideas how I can figure out what directory is being served? I am lost in a maze of twisty httpd.conf passages. localhost on my Mac resolves to my ~/Sites directory. 192.168.1.70 resolves to the same empty directory/404. Thanks.
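    A sketch of how to ask Apache itself which document root that request ends up in; the commands are standard httpd/apachectl options, and the config path is the typical Mac OS X location (assumed here):

        httpd -V | grep -i server_config_file   # which httpd.conf is actually loaded
        apachectl -S                            # virtual host summary: which vhost answers 192.168.1.70
        grep -Ri "documentroot" /etc/apache2/   # every DocumentRoot mentioned in the config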

    Read the article

  • How can I set up an FTP user with a home directory inside another user's home folder?

    - by simon180
    Hi I have an Ubuntu (Hardy) server which I am using to host multiple websites. All of the sites are stored in subfolders of a public_html folder for my main login to the server and accessed via a single SSH account. I now have a website user who wants FTP (or similar) access to enable them to upload various files etc to the directory where their website is situated, however I still need the SSH account to have access to this directory as I may need to make changes using my master account. Basically I want to create an FTP account (I have VSFTPD installed) for a user with the home directory inside my own user account but they should only be able to read/write to this folder or its subfolders but not go further up the directory tree. How can I achieve this? Thanks
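    One common approach with vsftpd on Ubuntu is sketched below; the user name clientftp and the path are made up for illustration, and depending on PAM settings the chosen shell may need to be listed in /etc/shells for FTP logins to work.

        # create the FTP-only account with its home inside the site folder
        sudo useradd -d /home/simon/public_html/clientsite -s /bin/false clientftp
        sudo passwd clientftp
        sudo chown clientftp /home/simon/public_html/clientsite   # or use a shared group so SSH keeps write access

        # /etc/vsftpd.conf
        #   local_enable=YES
        #   write_enable=YES
        #   chroot_local_user=YES    # jail each local user to their home directory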

    Read the article

  • Is there a tool for verifying the contents of a Zip archive against the source directory's contents?

    - by Basil
    Here's the scenario: I create a ZIP archive using some GUI package like WinZip, 7-Zip or whatever by right-clicking on a directory "somename" and selecting "Compress to archive 'somename.zip'" When the archive is completed, I open it and discover that some files don't exist in the archive (for reasons yet unknown). I want to find all files that are missing from the archive without having to extract the archive to another directory, then doing directory diff, etc. So.. Is there a tool (GUI or command-line, standalone or built into a compressor, for Windows or Linux, I don't care) that can walk through an archive and compare its contents against a directory on the filesystem?
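    On the command line the archive's name list can be diffed against the directory without extracting anything. A sketch for Linux using Info-ZIP's zipinfo, assuming the archive stores paths without a leading somename/ prefix:

        ( cd somename && find . -type f | sed 's|^\./||' | sort ) > ondisk.txt
        zipinfo -1 somename.zip | grep -v '/$' | sort > inzip.txt
        # if the archive was built with the top-level folder included, strip it first:
        #   zipinfo -1 somename.zip | sed 's|^somename/||' | grep -v '/$' | sort > inzip.txt
        diff ondisk.txt inzip.txt    # "<" lines are on disk but missing from the zip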

    Read the article

  • How do I change the .NET Framework version of a virtual directory without the ASP.NET tab?

    - by Brandon
    I have a website running v2.0 but I want the virtual directory running under it to use v4.0. I've already set the virtual directory as an application and given it its own application pool. The server is running Windows Server 2003 SP2 (64-bit); however, it has the Enable32BitAppOnWin64 flag enabled, which means the ASP.NET tabs on the properties dialog of the websites/virtual directories are missing. .NET 4.0 is installed, aspnet_regiis -lv lists the 32-bit and 64-bit versions of .NET 2.0 and .NET 4.0, and the Web Server Extensions are enabled. I can't disable the Enable32BitAppOnWin64 flag to get the ASP.NET tab back, so is there a way to do this from the command line without potentially breaking something? I ran aspnet_regiis -lk to find the paths so I could try aspnet_regiis -sn, but it only returns one record: W3SVC/ 2.0.50727.0 (there are three separate websites and a virtual directory running on the server, though). How can I change the framework version of the virtual directory without the ASP.NET tab?
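    A sketch of doing it from the command line with the 32-bit v4.0 aspnet_regiis (since Enable32BitAppOnWin64 is set). The metabase path W3SVC/1/ROOT/MyVDir is a placeholder; the real one has to come from the IIS metabase (e.g. via adsutil.vbs or the -lk output once the application is registered).

        cd %windir%\Microsoft.NET\Framework\v4.0.30319
        aspnet_regiis.exe -lk
        rem register only this application's script maps, without touching the parent site
        aspnet_regiis.exe -s W3SVC/1/ROOT/MyVDir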

    Read the article

  • Complete Guide to Symbolic Links (symlinks) on Windows or Linux

    - by Matthew Guay
    Want to easily access folders and files from different folders without maintaining duplicate copies?  Here’s how you can use Symbolic Links to link anything in Windows 7, Vista, XP, and Ubuntu. So What Are Symbolic Links Anyway? Symbolic links, otherwise known as symlinks, are basically advanced shortcuts. You can create symbolic links to individual files or folders, and then these will appear like they are stored in the folder with the symbolic link even though the symbolic link only points to their real location. There are two types of symbolic links: hard and soft. Soft symbolic links work essentially the same as a standard shortcut.  When you open a soft link, you will be redirected to the folder where the files are stored.  However, a hard link makes it appear as though the file or folder actually exists at the location of the symbolic link, and your applications won’t know any different. Thus, hard links are of the most interest in this article. Why should I use Symbolic Links? There are many things we use symbolic links for, so here’s some of the top uses we can think of: Sync any folder with Dropbox – say, sync your Pidgin Profile Across Computers Move the settings folder for any program from its original location Store your Music/Pictures/Videos on a second hard drive, but make them show up in your standard Music/Pictures/Videos folders so they’ll be detected my your media programs (Windows 7 Libraries can also be good for this) Keep important files accessible from multiple locations And more! If you want to move files to a different drive or folder and then symbolically link them, follow these steps: Close any programs that may be accessing that file or folder Move the file or folder to the new desired location Follow the correct instructions below for your operating system to create the symbolic link. Caution: Make sure to never create a symbolic link inside of a symbolic link. For instance, don’t create a symbolic link to a file that’s contained in a symbolic linked folder. This can create a loop, which can cause millions of problems you don’t want to deal with. Seriously. Create Symlinks in Any Edition of Windows in Explorer Creating symlinks is usually difficult, but thanks to the free Link Shell Extension, you can create symbolic links in all modern version of Windows pain-free.  You need to download both Visual Studio 2005 redistributable, which contains the necessary prerequisites, and Link Shell Extension itself (links below).  Download the correct version (32 bit or 64 bit) for your computer. Run and install the Visual Studio 2005 Redistributable installer first. Then install the Link Shell Extension on your computer. Your taskbar will temporally disappear during the install, but will quickly come back. Now you’re ready to start creating symbolic links.  Browse to the folder or file you want to create a symbolic link from.  Right-click the folder or file and select Pick Link Source. To create your symlink, right-click in the folder you wish to save the symbolic link, select “Drop as…”, and then choose the type of link you want.  You can choose from several different options here; we chose the Hardlink Clone.  This will create a hard link to the file or folder we selected.  The Symbolic link option creates a soft link, while the smart copy will fully copy a folder containing symbolic links without breaking them.  These options can be useful as well.   Here’s our hard-linked folder on our desktop.  
Notice that the folder looks like its contents are stored in Desktop\Downloads, when they are actually stored in C:\Users\Matthew\Desktop\Downloads.  Also, when links are created with the Link Shell Extension, they have a red arrow on them so you can still differentiate them. And, this works the same way in XP as well. Symlinks via Command Prompt Or, for geeks who prefer working via command line, here’s how you can create symlinks in Command Prompt in Windows 7/Vista and XP. In Windows 7/Vista In Windows Vista and 7, we’ll use the mklink command to create symbolic links.  To use it, we have to open an administrator Command Prompt.  Enter “command” in your start menu search, right-click on Command Prompt, and select “Run as administrator”. To create a symbolic link, we need to enter the following in command prompt: mklink /prefix link_path file/folder_path First, choose the correct prefix.  Mklink can create several types of links, including the following: /D – creates a soft symbolic link, which is similar to a standard folder or file shortcut in Windows.  This is the default option, and mklink will use it if you do not enter a prefix. /H – creates a hard link to a file /J – creates a hard link to a directory or folder So, once you’ve chosen the correct prefix, you need to enter the path you want for the symbolic link, and the path to the original file or folder.  For example, if I wanted a folder in my Dropbox folder to appear like it was also stored in my desktop, I would enter the following: mklink /J C:\Users\Matthew\Desktop\Dropbox C:\Users\Matthew\Documents\Dropbox Note that the first path was to the symbolic folder I wanted to create, while the second path was to the real folder. Here, in this command prompt screenshot, you can see that I created a symbolic link of my Music folder to my desktop.   And here’s how it looks in Explorer.  Note that all of my music is “really” stored in C:\Users\Matthew\Music, but here it looks like it is stored in C:\Users\Matthew\Desktop\Music. If your path has any spaces in it, you need to place quotes around it.  Note also that the link can have a different name than the file it links to.  For example, here I’m going to create a symbolic link to a document on my desktop: mklink /H “C:\Users\Matthew\Desktop\ebook.pdf”  “C:\Users\Matthew\Downloads\Before You Call Tech Support.pdf” Don’t forget the syntax: mklink /prefix link_path Target_file/folder_path In Windows XP Windows XP doesn’t include built-in command prompt support for symbolic links, but we can use the free Junction tool instead.  Download Junction (link below), and unzip the folder.  Now open Command Prompt (click Start, select All Programs, then Accessories, and select Command Prompt), and enter cd followed by the path of the folder where you saved Junction. Junction only creates hard symbolic links, since you can use shortcuts for soft ones.  To create a hard symlink, we need to enter the following in command prompt: junction –s link_path file/folder_path As with mklink in Windows 7 or Vista, if your file/folder path has spaces in it make sure to put quotes around your paths.  Also, as usual, your symlink can have a different name that the file/folder it points to. Here, we’re going to create a symbolic link to our My Music folder on the desktop.  We entered: junction -s “C:\Documents and Settings\Administrator\Desktop\Music” “C:\Documents and Settings\Administrator\My Documents\My Music” And here’s the contents of our symlink.  
Note that the path looks like these files are stored in a Music folder directly on the Desktop, when they are actually stored in My Documents\My Music.  Once again, this works with both folders and individual files. Please Note: Junction would work the same in Windows 7 or Vista, but since they include a built-in symbolic link tool we found it better to use it on those versions of Windows. Symlinks in Ubuntu Unix-based operating systems have supported symbolic links since their inception, so it is straightforward to create symbolic links in Linux distros such as Ubuntu.  There’s no graphical way to create them like the Link Shell Extension for Windows, so we’ll just do it in Terminal. Open terminal (open the Applications menu, select Accessories, and then click Terminal), and enter the following: ln –s file/folder_path link_path Note that this is opposite of the Windows commands; you put the source for the link first, and then the path second. For example, let’s create a symbolic link of our Pictures folder in our Desktop.  To do this, we entered: ln -s /home/maguay/Pictures /home/maguay/Desktop   Once again, here is the contents of our symlink folder.  The pictures look as if they’re stored directly in a Pictures folder on the Desktop, but they are actually stored in maguay\Pictures. Delete Symlinks Removing symbolic links is very simple – just delete the link!  Most of the command line utilities offer a way to delete a symbolic link via command prompt, but you don’t need to go to the trouble.   Conclusion Symbolic links can be very handy, and we use them constantly to help us stay organized and keep our hard drives from overflowing.  Let us know how you use symbolic links on your computers! Download Link Shell Extension for Windows 7, Vista, and XP Download Junction for XP

    Read the article

  • Command line tool in python in a fixed root directory ...

    - by koleto
    I would like to install my Python application as a command line tool that should work entirely inside the install directory (for example C:\Python26\Lib\site-packages\application). The problem is that I would like to refer at runtime to the submodules and resources within the application directory tree. If I install the app with the [console_scripts] option, the default path is the current directory. Is there an elegant way to keep the application's execution path pointed at the site-packages directory? Thanks
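    A minimal sketch of the usual workaround: resolve files relative to the package's own __file__ instead of relying on the process's working directory. The helper name and the example subfolder below are made up.

        import os

        PACKAGE_DIR = os.path.dirname(os.path.abspath(__file__))

        def resource_path(*parts):
            """Absolute path to a file shipped inside the installed package."""
            return os.path.join(PACKAGE_DIR, *parts)

        # e.g. resource_path("templates", "default.cfg") resolves under
        # C:\Python26\Lib\site-packages\application\ no matter where the
        # console_scripts entry point was launched from.
        # (pkg_resources.resource_filename("application", "templates/default.cfg")
        #  is the setuptools equivalent.)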

    Read the article

  • How do I optimize Perl code that checks whether directories exist?

    - by SCNCN2010
    sub DirectoryExists {
        my $param = shift;

        # Remove first element of the array
        shift @{$param};

        # Loop through each directory to see if it exists
        foreach my $directory (@{$param}) {
            unless (-e $directory && -d $directory) {
                return 0;
            }
        }

        # True
        return 1;
    }

    Is there any good way to optimize this code?
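    One small simplification, sketched below: -d already returns false for paths that do not exist, so the -e test adds nothing, and the loop can collapse to a grep (this copy also avoids mutating the caller's array, unlike the original shift).

        sub DirectoryExists {
            my $param = shift;
            my @dirs  = @{$param};
            shift @dirs;                          # drop the first element, as before
            return 0 if grep { !-d $_ } @dirs;    # any missing path or non-directory fails
            return 1;
        }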

    Read the article

  • Problem occurs during installation of Moses scripts

    - by lenny99
    we got error when compile moses-script. process of it as follows: minakshi@minakshi-Vostro-3500:~/Desktop/monu/moses/scripts$ make release # Compile the parts make all make[1]: Entering directory `/home/minakshi/Desktop/monu/moses/scripts' # Building memscore may fail e.g. if boost is not available. # We ignore this because traditional scoring will still work and memscore isn't used by default. cd training/memscore ; \ ./configure && make \ || ( echo "WARNING: Building memscore failed."; \ echo 'training/memscore/memscore' >> ../../release-exclude ) checking for a BSD-compatible install... /usr/bin/install -c checking whether build environment is sane... yes checking for gawk... no checking for mawk... mawk checking whether make sets $(MAKE)... yes checking for g++... g++ checking whether the C++ compiler works... yes checking for C++ compiler default output file name... a.out checking for suffix of executables... checking whether we are cross compiling... no checking for suffix of object files... o checking whether we are using the GNU C++ compiler... yes checking whether g++ accepts -g... yes checking for style of include used by make... GNU checking dependency style of g++... gcc3 checking for gcc... gcc checking whether we are using the GNU C compiler... yes checking whether gcc accepts -g... yes checking for gcc option to accept ISO C89... none needed checking dependency style of gcc... gcc3 checking for boostlib >= 1.31.0... yes checking for cos in -lm... yes checking for gzopen in -lz... yes checking for cblas_dgemm in -lgslcblas... no checking for gsl_blas_dgemm in -lgsl... no checking how to run the C++ preprocessor... g++ -E checking for grep that handles long lines and -e... /bin/grep checking for egrep... /bin/grep -E checking for ANSI C header files... yes checking for sys/types.h... yes checking for sys/stat.h... yes checking for stdlib.h... yes checking for string.h... yes checking for memory.h... yes checking for strings.h... yes checking for inttypes.h... yes checking for stdint.h... yes checking for unistd.h... yes checking n_gram.h usability... no checking n_gram.h presence... no checking for n_gram.h... no checking for size_t... yes checking for ptrdiff_t... yes configure: creating ./config.status config.status: creating Makefile config.status: creating config.h config.status: config.h is unchanged config.status: executing depfiles commands make[2]: Entering directory `/home/minakshi/Desktop/monu/moses/scripts/training/memscore' make all-am make[3]: Entering directory `/home/minakshi/Desktop/monu/moses/scripts/training/memscore' make[3]: Leaving directory `/home/minakshi/Desktop/monu/moses/scripts/training/memscore' make[2]: Leaving directory `/home/minakshi/Desktop/monu/moses/scripts/training/memscore' touch release-exclude # No files excluded by default pwd=`pwd`; \ for subdir in cmert-0.5 phrase-extract symal mbr lexical-reordering; do \ make -C training/$subdir || exit 1; \ echo "### Compiler $subdir"; \ cd $pwd; \ done make[2]: Entering directory `/home/minakshi/Desktop/monu/moses/scripts/training/cmert-0.5' make[2]: Nothing to be done for `all'. make[2]: Leaving directory `/home/minakshi/Desktop/monu/moses/scripts/training/cmert-0.5' ### Compiler cmert-0.5 make[2]: Entering directory `/home/minakshi/Desktop/monu/moses/scripts/training/phrase-extract' make[2]: Nothing to be done for `all'. 
make[2]: Leaving directory `/home/minakshi/Desktop/monu/moses/scripts/training/phrase-extract' ### Compiler phrase-extract make[2]: Entering directory `/home/minakshi/Desktop/monu/moses/scripts/training/symal' make[2]: Nothing to be done for `all'. make[2]: Leaving directory `/home/minakshi/Desktop/monu/moses/scripts/training/symal' ### Compiler symal make[2]: Entering directory `/home/minakshi/Desktop/monu/moses/scripts/training/mbr' make[2]: Nothing to be done for `all'. make[2]: Leaving directory `/home/minakshi/Desktop/monu/moses/scripts/training/mbr' ### Compiler mbr make[2]: Entering directory `/home/minakshi/Desktop/monu/moses/scripts/training/lexical-reordering' make[2]: Nothing to be done for `all'. make[2]: Leaving directory `/home/minakshi/Desktop/monu/moses/scripts/training/lexical-reordering' ### Compiler lexical-reordering ## All files that need compilation were compiled make[1]: Leaving directory `/home/minakshi/Desktop/monu/moses/scripts' /bin/sh: ./check-dependencies.pl: not found make: *** [release] Error 127 We don't know why this error occurs; the check-dependencies.pl file exists in the scripts folder ...
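    For what it's worth, "/bin/sh: ./check-dependencies.pl: not found" for a file that does exist usually points at the executable bit or at the interpreter named in its #! line. A few quick checks, run from the scripts directory, that might narrow it down:

        ls -l check-dependencies.pl     # present, and executable?
        head -1 check-dependencies.pl   # does the #! line point at an existing perl?
        chmod +x check-dependencies.pl
        perl ./check-dependencies.pl    # bypass the shebang entirely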

    Read the article

  • Access Officejet Pro L7590 memory card reader

    - by luri
    I can't manage to access my printer's memory card reader in Nautilus. I can just access it with hp-unload. Here's a sample output from this command: lubuntu@L-X6:~$ hp-unload hp:/net/Officejet_Pro_L7500?zc=HP065193 HP Linux Imaging and Printing System (ver. 3.10.6) Photo Card Access Utility ver. 3.3 Copyright (c) 2001-9 Hewlett-Packard Development Company, LP This software comes with ABSOLUTELY NO WARRANTY. This is free software, and you are welcome to distribute it under certain conditions. See COPYING file for more details. Using device: hp:/net/Officejet_Pro_L7500?zc=HP065193 |error: Photo card write failed (Card may be write protected) / Photocard on device hp:/net/Officejet_Pro_L7500?zc=HP065193 mounted DO NOT REMOVE PHOTO CARD UNTIL YOU EXIT THIS PROGRAM warning: Photo card is write protected. Type 'help' for a list of commands. Type 'exit' to quit. pcard: / > ls \ Name Size Type dcim/ directory eos_digi.tal 0 B unknown/unknown 1 files, 0 B pcard: / > cd dcim |pcard: /dcim > ls | Name Size Type . directory .. directory 100eos5d/ directory 267canon/ directory 270canon/ directory 271canon/ directory 272canon/ directory 0 files, 0 B pcard: /dcim > cd 272canon -pcard: /dcim/272canon > ls \ Name Size Type . directory .. directory _mg_7201.jpg 3.1 MB image/jpeg ...........(some more files)................. _mg_7281.jpg 2.5 MB image/jpeg _mg_7282.jpg 2.5 MB image/jpeg 82 files, 241.6 MB (253377883) How can I acess it from nautilus or mount it as a filesystem? Note that this is similar to this other question: Can't get HP Officejet 6500 card reader to work. but actually there seemed to be no supported device here, while in my case I manage to access the memory card from hp-unload.

    Read the article
