Search Results

Search found 16602 results on 665 pages for 'directory'.

Page 477/665 | < Previous Page | 473 474 475 476 477 478 479 480 481 482 483 484  | Next Page >

  • eCryptfs on Ubuntu Server: how to keep the home directory mounted when not connected over SSH?

    - by Bebeoix
    I have a daemon that needs to read a file saved somewhere in my home folder, but every time I close my SSH connection the daemon can't read the file, because eCryptfs appears to unmount the home directory. Is there an option to force eCryptfs to keep the home mounted when no SSH connection is open? I couldn't find one. Thanks. PS: I know this thread, http://askubuntu.com/questions/165608/why-is-ecryptfs-only-mounting-private-home-directory-over-ssh, but it doesn't deal with the problem properly.
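
    A hedged sketch of one workaround, assuming the stock ecryptfs-utils PAM setup: pam_ecryptfs unmounts the encrypted home when the last login session for the user ends, and it only does so while the auto-umount flag file exists.

        # Remove the flag so the PAM module stops unmounting at last logout
        # (assumption: this box uses the standard ecryptfs-setup-private layout):
        rm ~/.ecryptfs/auto-umount
        # Alternatively, keep one login session alive for the daemon, e.g. inside tmux/screen.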

  • Using robocopy and excluding multiple directories

    - by GorrillaMcD
    I'm trying to copy some directories from a server before I restore from backup (my latest backup was corrupt, so I have to use an older one :( ). I'm in the Windows Recovery Environment and have access to the server's file system G:\ and my backup media C:\. But since I'm more familiar with Linux, I'm having a bit of trouble with the command line in Windows, specifically robocopy. I want to copy multiple directories (maintaining the same directory structure) from G:\ to C:\ while excluding others (namely, the Windows and Program Files folders). I can't figure out the syntax for the /XD option. I was hoping to do something like:

        robocopy G: C:\backup /CREATE /XD "dir1","dir2", ...
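
    A hedged sketch of the syntax: /XD takes a space-separated list (no commas), and the entries may be full paths. Note also that /CREATE copies only a zero-length skeleton of the tree; /E is probably what's wanted for a real copy.

        robocopy G:\ C:\backup /E /XD "G:\Windows" "G:\Program Files"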

  • DVD won't mount on Ubuntu 12.04

    - by CyborgGold
    I can't seem to mount my optical drive. I have tried numerous solutions from this site with no results, and I am not able to see the device inside the file browser either, even though there is a DVD in the drive. I am running 12.04 on an HP g60-235dx laptop (link to the specs below). I know the drive is functioning, because just before Windows 7 crashed and my MBR went fubar I was watching movies just fine. I am fairly new to Linux, so don't assume I know anything. OK, here is what I have tried (everything I can find back right now):

        sudo wget --output-document=/etc/apt/sources.list.d/medibuntu.list http://www.medibuntu.org/sources.list.d/$(lsb_release -cs).list
        sudo apt-get --quiet update
        sudo apt-get --yes --quiet --allow-unauthenticated install medibuntu-keyring
        sudo apt-get --quiet update
        sudo apt-get install libdvdcss2
        dmesg | grep sr0                # no output
        apt-get install libdvdnav4      # already installed, and up to date
        sudo /usr/share/doc/libdvdread4/install-css.sh

        ls -l /dev/cdrom /dev/cdrw /dev/dvd /dev/dvdrw /dev/scd0 /dev/sr0
        ls: cannot access /dev/scd0: No such file or directory
        lrwxrwxrwx 1 root root 3 Sep 10 03:51 /dev/cdrom -> sr0
        lrwxrwxrwx 1 root root 3 Sep 10 03:51 /dev/cdrw -> sr0
        lrwxrwxrwx 1 root root 3 Sep 10 03:51 /dev/dvd -> sr0
        lrwxrwxrwx 1 root root 3 Sep 10 03:51 /dev/dvdrw -> sr0
        brw-rw----+ 1 root cdrom 11, 0 Sep 10 03:51 /dev/sr0

        wodim --devices
        wodim: Overview of accessible drives (1 found) :
        -------------------------------------------------------------------------
        0 dev='/dev/sg1' rwrw-- : 'TSSTcorp' 'CDDVDW TS-L633M'
        -------------------------------------------------------------------------

        sudo lshw      # optical section:
        *-cdrom
            description: DVD-RAM writer
            product: CDDVDW TS-L633M
            vendor: TSSTcorp
            physical id: 1
            bus info: scsi@1:0.0.0
            logical name: /dev/cdrom
            logical name: /dev/cdrw
            logical name: /dev/dvd
            logical name: /dev/dvdrw
            logical name: /dev/sr0
            version: 0200
            capabilities: removable audio cd-r cd-rw dvd dvd-r dvd-ram
            configuration: ansiversion=5 status=nodisc

        sudo lshw | grep cdrom
        *-cdrom
            logical name: /dev/cdrom

    Spec sheet for the laptop: http://www.cnet.com/laptops/hp-g60-235dx/4507-3121_7-33496192.html - if you need any more information than all of that, please let me know.
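
    A hedged next diagnostic, given the "status=nodisc" line above (which suggests the drive isn't detecting the disc at all): try a manual mount to take the desktop automounter out of the picture. The mount point is arbitrary.

        sudo mkdir -p /mnt/dvd
        sudo mount -t udf,iso9660 /dev/sr0 /mnt/dvd   # video DVDs are UDF/ISO9660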

  • Process that needs a volume starts before the volume mounts

    - by user36126
    The destination for incoming CrashPlan backups on my server (11.04) is /media/SeagateBig (SeagateBig is the volume name of my 2TB USB drive). When the server boots, two things happen: 1) SeagateBig auto-mounts, and 2) CrashPlan starts. The problem is that these two things often don't happen in that order. Then CrashPlan starts, looks for /media/SeagateBig, doesn't find it, and instead of waiting for it, CREATES IT. Now it's backing up onto my / filesystem. NOT COOL. Meanwhile, when SeagateBig finally gets around to mounting, it finds that /media/SeagateBig already exists, shrugs, and creates /media/SeagateBig_ as its mount point. What I need is a way to enforce the order, so that SeagateBig mounts and then, and only then, the CrashPlan service is started. Unless I learn that CrashPlan can be told to wait for its destination directory and never create it, which I am also investigating. But the CrashPlanEngine script is installed by the product, so I am loath to modify it, though I know I could make it loop until df greps successfully for "SeagateBig".
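
    One way to enforce the ordering without editing CrashPlanEngine itself is a small wrapper started in place of the stock init entry; a sketch, with the service name and paths as assumptions:

        #!/bin/bash
        # Block until SeagateBig is a real mount point, then start CrashPlan.
        until mountpoint -q /media/SeagateBig; do
            sleep 5
        done
        service crashplan start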

  • Mounting a Solaris UFS partition on Debian (with FreeBSD kernel)

    - by hayalci
    I have some disks that were being used on a Solaris system. The disks are formatted as UFS. I attached them to a Debian system (with a FreeBSD kernel, i.e. Debian/kFreeBSD), but I cannot mount them:

        $ mount -t ufs /dev/da2s1 /mnt/diska
        mount: /dev/da2s1 : Invalid argument

    tunefs.ufs does not work either:

        $ tunefs.ufs -p /dev/da2s1
        tunefs.ufs: /dev/da2s1: could not read superblock to fill out disk

    Is there an incompatibility between FreeBSD UFS and Solaris UFS? Is it possible to mount one under the other OS? Note: tunefs.ufs works on the root partition:

        $ tunefs.ufs -p /dev/da7s2
        tunefs.ufs: ACLs: (-a)                                         disabled
        tunefs.ufs: MAC multilabel: (-l)                               disabled
        tunefs.ufs: soft updates: (-n)                                 disabled
        tunefs.ufs: gjournal: (-J)                                     disabled
        tunefs.ufs: maximum blocks per file in a cylinder group: (-e)  2048
        tunefs.ufs: average file size: (-f)                            16384
        tunefs.ufs: average number of files in a directory: (-s)       64
        tunefs.ufs: minimum percentage of free space: (-m)             8%
        tunefs.ufs: optimization preference: (-o)                      time
        tunefs.ufs: volume label: (-L)
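
    For what it's worth, a hedged note: FreeBSD's UFS code apparently cannot read Solaris UFS superblocks, while the Linux kernel's ufs driver can, read-only, if told the flavor explicitly. A sketch for a Linux-kernel box (device name is an assumption):

        mount -t ufs -o ro,ufstype=sun    /dev/sdb1 /mnt/diska   # Solaris/SPARC UFS
        mount -t ufs -o ro,ufstype=sunx86 /dev/sdb1 /mnt/diska   # Solaris/x86 UFS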

  • In Mac OS X Finder's column view, how do you show all columns, up to the list of volumes?

    - by John Douthat
    In OS X's olden times, column view always allowed you to scroll left back to the list of volumes. In recent versions, however, the Finder hides parents and ancestors. For example, when you select a favorite "place" in the sidebar, no ancestor of that folder is visitable without pressing Cmd+Up, but hitting Cmd+Up causes the current directory to lose focus, or disappear entirely, depending on the number of levels. Clicking "Back" sends you back to the folder you were in, but it also re-hides all of its ancestors :( I really wish I could see the entire hierarchy. Is that possible?

  • Cannot get script to run at startup (tried all the simple answers)

    - by Carey Head
    I have Ubuntu Desktop 12.04 LTS running great on an older Acer desktop, and I want to use this machine as an in-home server for hosting Minecraft. The command to start the Minecraft server is

        java -Xmx1024M -Xms1024M -jar minecraft_server.jar nogui

    and that works great when I cd into the correct directory and execute the above. I created a script to do this:

        #!/bin/bash
        cd /home/myuser/minecraft-server1
        java -Xmx1024M -Xms1024M -jar minecraft_server.jar nogui &
        cd /home/myuser/minecraft-server2
        java -Xmx1024M -Xms1024M -jar minecraft_server.jar nogui &
        exit 0

    I made this .sh file executable, and it too runs great when I start it manually from the terminal. The problem I'm having is getting it to execute at startup. My user account on this machine logs in automatically. I have tried the following:

    - Adding sh /home/myuser/myscript.sh to "Startup Applications" (nothing happens on reboot).
    - Adding the same to /etc/rc.local (nothing happens on reboot). I even tested this one by running /etc/rc.local from the terminal, and it executed great - just not at boot/auto login.
    - Adding the lines from the script directly to rc.local (nothing happens on reboot).

    I can't help but think there's something I'm missing. The script executes great when run manually, but will not run at boot/auto login. Many thanks in advance.
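
    Since both rc.local and Startup Applications run with a minimal environment, one hedged guess is that java simply isn't on the PATH at boot (using the full path, e.g. /usr/bin/java, inside the script is worth trying). A sketch of a low-tech alternative that sidesteps the desktop session entirely, plus a log to see what actually happens (log path is an assumption):

        # crontab -e (as myuser):
        @reboot /bin/bash /home/myuser/myscript.sh >> /home/myuser/minecraft-boot.log 2>&1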

  • IIS: get the full error message for failed requests

    - by BetaRide
    I have IIS set up, serving my web service. Unfortunately, if the web service throws an exception, all I get is a blue box titled "failed request". What options do I have to actually see what went wrong? I'd prefer to get the exception message and a stack trace. I already set up "Failed Request Tracing", but the directory remains empty. If possible I'd prefer to get the stack trace directly in the browser. In case it matters: this is IIS 7.5 on a Win 7 64 Pro box, and the web service is a WCF C# project.
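
    For WCF specifically, a development-only sketch: the standard serviceDebug behavior returns exception detail in the fault, which most clients will then display (goes in the service's behavior in web.config; never leave this on in production).

        <serviceBehaviors>
          <behavior>
            <serviceDebug includeExceptionDetailInFaults="true" />
          </behavior>
        </serviceBehaviors>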

  • Why doesn't OpenDialog.py work in my Jotty application?

    - by venerable13
    I'm a Python beginner. I installed Quickly, followed the "quickly tutorial" in the terminal, and completed all the steps up to "However, the application is not complete. There are a few things left for you to do:". The remaining steps aren't finished yet, because when I use the open dialog and select one of the saved files, the content of the file is not shown in textview1 - why? Only the content already written gets deleted. Before, without the dialog, it worked great, and SaveDialog.py works great. on_mnu_new_activate(self, widget, data=None) doesn't work either, and if I use the handler names below in place of the original (commented) ones, it doesn't work:

        ### def open_file(self, widget, data=None):
        def on_mnu_open_activate(self, widget, data=None):

        ### def save_file(self, widget, data=None):
        def on_mnu_save_activate(self, widget, data=None):

    To view the code, go to the link above, unrar the archive, install Quickly if you don't have it yet, change into the jotty directory, then run "quickly run", "quickly edit" or "quickly design", depending on what you want to do. "Code" is the problematic code with OpenDialog implemented; "Code-part1" works OK, but without OpenDialog. Mainly, I need the OpenDialog function to work.
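
    For comparison, a minimal sketch of an open handler that loads the chosen file into the text view (assumes the PyGTK widgets of the Quickly jotty tutorial, with "import gtk" at the top of the module; self.builder and the textview1 name are assumptions):

        def on_mnu_open_activate(self, widget, data=None):
            dialog = gtk.FileChooserDialog("Open...", None,
                gtk.FILE_CHOOSER_ACTION_OPEN,
                (gtk.STOCK_CANCEL, gtk.RESPONSE_CANCEL,
                 gtk.STOCK_OPEN, gtk.RESPONSE_OK))
            if dialog.run() == gtk.RESPONSE_OK:
                with open(dialog.get_filename()) as f:
                    text = f.read()
                # replace the buffer content instead of just clearing it
                self.builder.get_object("textview1").get_buffer().set_text(text)
            dialog.destroy()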

  • Ubuntu install can't find hard drives

    - by Casey Hungler
    I recently got a Dell Inspiron Special Edition 7720 and am trying to install Ubuntu alongside Windows. When I use the WUBI installer, the installation of Ubuntu works as long as I do not boot into Windows; if I boot into Windows and then go back into Ubuntu, I get a variety of error messages claiming a corrupt or missing kernel, root directory, etc. I have been working on this problem for about a week and have reinstalled Ubuntu MANY times. So far, I have eliminated all of the following causes: a corrupt WUBI download (downloaded multiple times, used on other systems); the install media (I have tried a CD and a flash drive, both of which work on other computers); a program within Ubuntu creating the problem; and the OS itself, since others have successfully installed Ubuntu on a computer with my operating system (Windows 7 SP1). This is a much-shortened version of the original question, which has been up for about 5 days and included a more detailed description of the problem, but left everyone clueless as to its source. When I spoke with the Dell service technician who came over today to replace my keyboard, he suggested that the driver for my HDD is so new that it is not compatible with the current version of Ubuntu. His reasoning is as follows: 1) during an install from a flash drive or CD, where I am supposed to get the option to wipe my system or create a dual boot, I get a window that asks me to select a hard drive partition, but none are listed; 2) this model of computer was made public in June of this year, while this Ubuntu release came out in April. Adopting this theory, it would seem that the WUBI install fails after booting into Windows because Ubuntu can no longer find the files it needs to load. Does this theory seem at all plausible to anyone? I just want to install Ubuntu and have it stay on my computer. I don't care how I put it there, I just need it to work, so I would TRULY appreciate any advice or suggestions anyone could give. Thanks so much for your time and support!!!
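
    If the installer really shows no partitions, one hedged thing to check from the live session is whether the disk controller is in RAID mode (some 7720 configurations pair the HDD with an mSATA cache via Intel Smart Response, which is known to hide partitions from installers - that pairing is an assumption here):

        sudo parted -l    # does the kernel enumerate the disk and its partitions?
        sudo dmraid -r    # is the disk claimed by a fake-RAID/caching set?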

  • PHP session files have permissions of 000 - They're unusable

    - by vanced
    I kept having issues with a Document Management System I'm trying to install: at the first step of the installation process, it would error with:

        Warning: Unknown: open(/tmp/sess_d39cac7f80834b2ee069d0c867ac169c, O_RDWR) failed: Permission denied (13) in Unknown on line 0
        Warning: Unknown: Failed to write session data (files). Please verify that the current setting of session.save_path is correct (/tmp) in Unknown on line 0

    I looked in /tmp and saw that the sess_* files have the following permissions:

        ---------- 1 vanced vanced 1240 Jan 20 08:48 sess_d39cac7f80834b2ee069d0c867ac169c

    All the session files look like this. So obviously, they're unusable by PHP, and it's causing me lots of problems. How can I get PHP to set the correct permissions? I've tried changing the directory which php.ini uses to /tmp/phpsessions, and the same thing occurs. The directories are a+rwx.
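
    One hedged avenue: php.ini's session.save_path accepts an extended "N;MODE;/path" form whose MODE sets the permissions of newly created session files explicitly (the depth 0 and mode below are example values, not a known fix for this DMS):

        ; php.ini - a sketch, values are assumptions
        session.save_path = "0;0660;/tmp/phpsessions"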

  • How can I get rsync to ignore missing files?

    - by Joe Casadonte
    I'm executing a command like the following against several different systems:

        $ rsync -a -v [email protected]:'/path/to/first/*.log path/to/second.txt' /dest/folder/0007/.

    Sometimes *.log does not exist, and that's OK, but rsync generates the following error:

        receiving file list ... rsync: link_stat "/path/to/first/*.log" failed: No such file or directory (2)
        done

    Is there any way to suppress that? The only way I can think of is to use include and exclude filters, which just seem a PITA to me. Thanks!
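
    Two hedged options, depending on the rsync version available (user@host and paths below are placeholders):

        # rsync 3.1.0+ has a flag for exactly this case:
        rsync -a -v --ignore-missing-args user@host:'/path/to/first/*.log' /dest/folder/0007/.

        # Older rsync: treat exit code 23 (partial transfer, which covers a
        # missing source) as success in the calling script:
        rsync -a -v user@host:'/path/to/first/*.log' /dest/folder/0007/. ; rc=$?
        [ "$rc" -eq 23 ] && rc=0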

  • How can I view and sort by page count for multiple PDF files in a Windows file explorer?

    - by grunwald2.0
    I unsuccessfully used the "pages" feature in Windows Explorer, as well as in Directory Opus 10 and Free Commander XT (which I installed just for that reason, to try it out), to display the page count of multiple PDFs in a folder. All my PDFs are free to edit, i.e. not write-protected. I don't understand why any PDF reader can display the (correct) page count, yet none of the file explorers can (in the "details" view, of course). The only documents whose page count is displayed are MS Word documents. Do I have to use Adobe Bridge? (I haven't tried it.) On a side note: did that change in Windows 8? Initial research: a Google search was unsuccessful; the only slightly related SE topic I found was "How to count pages in multiple PDF files?".
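
    As a workaround sketch, if installing a command-line tool is acceptable, poppler's pdfinfo can feed PowerShell a sortable table (pdfinfo being on PATH is the assumption):

        Get-ChildItem *.pdf | ForEach-Object {
            $pages = [int]((pdfinfo $_.FullName | Select-String '^Pages:') -replace '\D', '')
            New-Object PSObject -Property @{ Name = $_.Name; Pages = $pages }
        } | Sort-Object Pages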

  • How to force Windows XP to rename a file with a special character?

    - by codeLes
    I have a song that Windows can't play because there is a question mark in the name of the file:

        "Where Have All the Cowboys Gone?.ogg"    (as an example)

    When I try to rename it, Windows complains, whether I try it in Explorer or from the command prompt. The error I get when trying to copy, rename, or move it is:

        The filename, directory name, or volume label syntax is incorrect

    Is there a Windows way to force a rename in this case? Update: I'll keep an eye on this question, but after 13 answers and many attempts (aside from 3rd-party solutions), it seems that Windows can't do this (or at least my Windows can't - no short names). So I'm accepting the answer that was my original solution anyway: using Linux. It would be nice to see Windows handle this somehow, so don't stop just because I've accepted this answer; the question still stands!
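
    One more hedged idea before falling back to Linux: the \\?\ prefix tells the Win32 layer to skip most filename parsing. Whether cmd's built-in ren honors it varies by version, so this is strictly a sketch (path is an example):

        ren "\\?\C:\music\Where Have All the Cowboys Gone?.ogg" "Where Have All the Cowboys Gone.ogg"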

  • Cron job failing to back up a Postgres database

    - by user705142
    I'm unsure what's going on here: I've got a backup script which runs fine under root. It produces a 300kb database dump in the proper directory. When running it as a cron job with exactly the same command, however, an empty gzip file appears with nothing in it. The cron log shows no error, just that the command has been run. This is the script:

        #! /bin/bash
        DIR="/opt/backup"
        YMD=$(date "+%Y-%m-%d")
        su -c "pg_dump -U postgres mydatabasename | gzip -6 > "$DIR/database_backup.$YMD.gz" " postgres

        # delete backup files older than 60 days
        OLD=$(find $DIR -type d -mtime +60)
        if [ -n "$OLD" ] ; then
            echo deleting old backup files: $OLD
            echo $OLD | xargs rm -rfv
        fi

    And the cron job:

        01 10 * * * root sh /opt/daily_backup_script.sh

    It produces a database_backup file, just an empty one. Anyone know what's going on here?
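
    Two classic cron pitfalls seem worth ruling out here, sketched below: cron's minimal PATH (so use absolute paths), and the nested double quotes in the su -c line, which end the quoted command early. As a side note, the cleanup find uses -type d although it appears to be aimed at files.

        #!/bin/bash
        DIR="/opt/backup"
        YMD=$(date "+%Y-%m-%d")
        # absolute path to pg_dump; one clean layer of quoting around the su command
        su - postgres -c "/usr/bin/pg_dump -U postgres mydatabasename" | gzip -6 > "$DIR/database_backup.$YMD.gz"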

  • Permission denied on network share

    - by Philipp
    I have a Windows 8 host system running a virtual (Hyper-V) Debian 6 client with a LAMP environment. My development environment runs under Windows, and I mapped the folder with my PHP files to a network drive so Apache has access to them (mount.cifs //pc/share /var/share). So far, no problems - I see my app in the browser on Windows. The problem is that PHP can't write anything to the shared folder; every time, I get a permission-denied message in my error logs. For testing purposes I tried to change the directory permissions of /var/share with chmod -R 777 /var/share, without success. Now I'm a little bit stumped... does anyone have an idea how to solve this?
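
    A hedged sketch of the usual fix: permissions on a CIFS mount are fixed by mount options at mount time, so chmod on the mounted tree is a no-op. Granting the Apache user when mounting is the standard route (ids, modes, and credentials below are assumptions):

        mount.cifs //pc/share /var/share -o uid=www-data,gid=www-data,file_mode=0664,dir_mode=0775,username=winuser,password=secret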

  • Error connecting to the application.

    - by ahmed
    Hi guys, let me explain the current scenario. We have an ASP.NET application on Framework 2 running on the intranet (Windows Server 2003 and SQL Server 2000). Now we have an XP machine where we installed and configured IIS with a virtual directory pointing at the local XP machine; this machine is connected to our intranet, and we copied the same application files from the server to it. The thing is, the application's connection string/database still points at the intranet server. The problem is that when we try to run the application on the XP machine, we get this error:

        An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)

    Is this query related or concerned with this site or stackoverflow?
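
    Assuming the XP copy should keep talking to the intranet SQL Server, one sketch is to make the connection string explicit about the server and force TCP instead of named pipes (all names below are placeholders):

        <connectionStrings>
          <add name="Main"
               connectionString="Data Source=tcp:INTRANETSRV,1433;Initial Catalog=MyDb;User ID=appuser;Password=secret"
               providerName="System.Data.SqlClient" />
        </connectionStrings>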

  • Copying files between servers by creation time

    - by driftux
    My bash scripting knowledge is very weak, which is why I'm asking for help here. What is the most efficient bash script for finding and copying files from one Linux server to another, per the specification below? I need a script that finds only new files, created on server A within the last 10 minutes, in directories named "Z", and then transfers them to server B. I think it can be done by building a command and executing it for each new file found:

        scp /X/Y.../Z/file root@hostname:/X/Y.../Z/

    If the script finds no such remote path on server B, it should continue with the next file, whose directory may exist. Files should be copied with their permissions, group, owner, and creation time. X/Y... are various directory paths. I want to set up a cron job to execute this script every 10 minutes, so performance is very important in this case. Thank you.
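
    A minimal sketch under those specs (host and top-level path are assumptions; scp -p preserves times and modes, though not owner/group - rsync -a preserves those too, if it is allowed; note Linux find has no true "creation time", so -mmin on modification time is the usual stand-in):

        #!/bin/bash
        find /X -type f -mmin -10 -path '*/Z/*' -print0 |
        while IFS= read -r -d '' f; do
            # scp fails, and the loop simply continues, when the remote directory is missing
            scp -p "$f" "root@hostname:$f" || echo "skipped: $f" >&2
        done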

  • PowerShell enters foreach loop with null object

    - by SteB
    I'm listing all backups in a given directory:

        $backups = Get-ChildItem -Path $zipFilepath |
                   Where-Object {($_.lastwritetime -lt (Get-Date).addDays(-7)) -and (-not $_.PSIsContainer) -and ($_.Name -like "backup*")}

    If I set it to deliberately return no files (.addDays(-600)), then the following prints "Empty" (as expected):

        if (!$backups) { "Empty" }

    If I try to list the names with an empty $backups variable:

        foreach ($file in $backups) { $file.FullName; }

    I get nothing (as expected). If I change this to:

        "test" + $file.FullName;

    then I get a single "test" under the "Empty". How is this possible if $backups is empty?
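
    A hedged explanation with a guard: PowerShell v2's foreach enters the loop body once even when the collection is $null (v3 changed this), so $file is $null and "test" + $null still prints "test". A sketch:

        if ($backups) {
            foreach ($file in $backups) { $file.FullName }
        }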

  • Prevent Nautilus from displaying thumbnails on a specific mount

    - by Zakhar
    I have written a filesystem over FUSE to access a remote pseudo-NAS (the French "Freebox V6"; I'll publish it as GPL3 soon, when it's a little more polished!). The NAS is connected to a home ADSL line, so data comes down at ADSL upload speed, which is 1 Mbps at best. My mount works fine (read-only at the moment), but Nautilus sees the mount point (and all subdirectories) as a "local" filesystem and tries to make thumbnails. I have a directory full of images, which makes this quite horrible: Nautilus opens ALL the images to try to build the thumbnails. I could switch the Nautilus thumbnail preference to "Never", but then I'd lose thumbnails on my "real" local filesystem. So the question is: with the preference set to "Only for local filesystems", how can I tell Nautilus that my mount point is in fact NOT a local mount, so that it stops trying to draw thumbnails on that specific mount but continues thumbnailing on mounts that really are local? Edit note: the same thing happens with "standard worldwide" mounts such as sshfs or davfs, as long as you mount over a relatively slow network (ADSL) and have images/movies in the mounted tree.
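
    One hedged experiment: FUSE filesystems can advertise a subtype, and GIO's local-versus-remote guess keys off the reported filesystem type; whether Nautilus then skips thumbnailing is exactly the assumption to test here (binary and mount point names are placeholders):

        # mount with an explicit subtype so the fs reports as e.g. "fuse.mynas":
        ./freeboxfs /mnt/freebox -o ro,subtype=mynas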

  • Setting up a Linux server with multiple access rights

    - by Mark
    I am a graduate student and want to set up a Linux server (preferably Ubuntu) in my office. I also want to give my friends SSH access to that box. My question is: can I set up my server such that I give one of my friends the right to install software on the machine, while he cannot browse around outside the directory he is allowed in? Can I also set up multiple Apache instances (on different ports) for different people, so each has access to his own Apache instance?
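
    For the "cannot browse outside his directory" half, a sketch using OpenSSH's built-in chroot; note that a chrooted user cannot run the system package manager, so "installing software" would mean building into a prefix inside the jail (user and path are placeholders):

        # /etc/ssh/sshd_config
        Match User friend1
            ChrootDirectory /srv/jail/friend1
            AllowTcpForwarding no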

  • Linux find/search root partition ONLY?

    - by ~sd-imi
    Say I need to run:

        find / -name somefile.txt

    and say the root partition / is mounted on /dev/sda5; however, I also have 250 GB partitions (/dev/sda6, /dev/sda7) mounted in /media - and in another location that I cannot currently remember. Say, also, that I know the file I'm looking for is on /dev/sda5. Obviously, the above command will also descend into /media and that other directory holding the big partitions, wasting time looking for the file in the wrong place. Is there a way to instruct find (or another command) to search only / on /dev/sda5, and NOT to descend into directories if they are on different partitions? Thanks, cheers!
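
    find has a switch for exactly this: it prunes any directory that sits on a different filesystem from the starting point.

        find / -xdev -name somefile.txt    # -mount is an equivalent spelling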

  • ACL permissions not behaving as expected

    - by Yarin
    I set the following ACL on my web directory:

        setfacl -R -d -m mask:002 /var/www

    and then created a file as root that I expected to be readable by the default group (apache):

        -rw--w-r--+ 1 root apache 0 Dec 17 22:32 newfile.py

    When I run getfacl on the file, I get:

        # file: newfile.py
        # owner: root
        # group: apache
        user::rw-
        group::rwx    #effective:-w-
        mask::-w-
        other::r--

    I'm not sure how to read this - but all I know is that the web server throws a permissions error because apache can't read the file. Can anyone explain what is going on here?
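
    A short reading of that output, with a sketch of the fix: the ACL mask is a cap on the group-class permissions, and mask:002 literally sets the mask to -w-, which is why group::rwx collapses to an effective -w-. Raising the default mask, or granting the group directly, restores read access:

        setfacl -R -d -m mask:rwx /var/www        # stop capping group rights
        setfacl -R -d -m g:apache:rwX /var/www    # or grant the group explicitly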

  • Using Quest AD cmdlets in an imported session

    - by ASTX813
    We are trying to use remote PowerShell with our Exchange system:

        $rs = New-PSSession -ConnectionUri <uri> -ConfigurationName Microsoft.Exchange -Authentication Basic -Credential <username> -AllowRedirection
        Import-PSSession $rs

    After these commands, we can run Exchange cmdlets and all is well. However, we're unable to run any Quest Active Directory cmdlets. Yes, Quest is installed on the remote machine (as well as on our local machines), and yes, we are able to run those commands when running PowerShell locally on the server. I tried -AllowClobber, but that didn't have an effect. Is there a way to get access to QAD?
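
    A hedged sketch of a workaround: the Microsoft.Exchange endpoint is a constrained session configuration, so extra snap-ins can't be loaded into it. Opening a second, default session on the same server just for Quest is one route (server name is a placeholder; the snap-in name is Quest's standard one):

        $ad = New-PSSession -ComputerName exch01
        Invoke-Command -Session $ad { Add-PSSnapin Quest.ActiveRoles.ADManagement }
        Import-PSSession $ad -Module Quest.ActiveRoles.ADManagement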

  • How to migrate Notepad++ settings?

    - by NoCatharsis
    I am trying to make every program I use portable if possible, and Notepad++ is on the list. The only problem is that I've had a native installation until now, so I'm not totally sure which settings files need to be moved to the portable directory. Surely there's a function tucked away somewhere in NPP exactly for this purpose, or some plugin out there? The developers have literally thought of everything else, yet this is the one thing I cannot find anywhere in the NPP wiki or otherwise, and I don't want to miss an important file. Here is the closest I've gotten: "Notepad++'s configuration files" and "Where are all the files?". Should I just copy every configuration file listed in the first link?
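
    A sketch of the brute-force route, based on the file lists in the links above (the exact set varies by Notepad++ version, and the portable build also expects an empty doLocalConf.xml beside the exe so it keeps settings local):

        :: cmd, run from the portable Notepad++ directory
        copy /Y "%APPDATA%\Notepad++\config.xml" .
        copy /Y "%APPDATA%\Notepad++\stylers.xml" .
        copy /Y "%APPDATA%\Notepad++\shortcuts.xml" .
        copy /Y "%APPDATA%\Notepad++\session.xml" .
        copy /Y "%APPDATA%\Notepad++\userDefineLang.xml" .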
