Search Results

Search found 17278 results on 692 pages for 'directory conventions'.

  • (How) does deleting open files on Linux and a FAT file system work?

    - by lxgr
    It's clear to me how deleting open files works on filesystems that use inodes: unlink() just decrements the link count, and once it reaches zero and the last file handle is closed, the inode is removed. But how does it work with a file system that doesn't use inodes, like FAT32, under Linux? Some experiments suggest that deleting open files is still possible (unlike on Windows, where the unlink call would fail), but what happens when the file system is uncleanly unmounted? How does Linux mark the files as unlinked when the file system itself doesn't support such an operation? Is the directory entry deleted immediately but retained in memory (which would guarantee deletion after unmounting in any case, but would leave the file system in an inconsistent state)? Or is the deletion only recorded in memory and written out when the last file handle is closed, avoiding possible corruption but resurrecting the deleted files after an unclean unmount?
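
    For anyone who wants to reproduce the experiment, a quick shell session (a sketch; any throwaway file will do, and running it on a FAT mount is the interesting case) makes the Linux behaviour visible:

        echo hello > testfile
        exec 3< testfile   # keep a read handle open on file descriptor 3
        rm testfile        # the unlink succeeds even though fd 3 is still open
        cat <&3            # still prints "hello"; the data survives until the last handle closes
        exec 3<&-          # close the handle; the space can now be reclaimed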

  • How to perform a nested mount when using chroot?

    - by user55542
    Note that this question is prompted by the circumstances detailed by me (as Xl1NntniNH7F) in http://www.linuxquestions.org/questions/linux-desktop-74/boot-failure-upon-updating-e2fsprogs-in-ubuntu-10-10-a-947328/. So if you could address the underlying cause of the boot failure, I would very much appreciate it. I'm trying to replicate the environment of my Ubuntu installation (where the home folder is on a separate partition) in order to run make uninstall. I'm using a live CD. How do I mount one partition (sda3, which holds the home folder) onto a directory inside another mounted partition (sda2)? I did chroot /mnt/sda2, but I don't know how to mount sda3 to /home, and my various attempts didn't work. As I am unfamiliar with chroot, my approach could be wrong, so it would be great if you could suggest what I should do given my circumstances.
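
    For reference, the usual pattern from a live CD is to assemble the whole tree before chrooting, with an ordinary nested mount plus bind mounts for the virtual filesystems. A sketch, assuming (as the question suggests) that sda2 holds the root filesystem and sda3 the home partition:

        mkdir -p /mnt/sda2
        mount /dev/sda2 /mnt/sda2          # the root filesystem
        mount /dev/sda3 /mnt/sda2/home     # nest the home partition inside it
        mount --bind /dev  /mnt/sda2/dev   # virtual filesystems the chroot may need
        mount --bind /proc /mnt/sda2/proc
        mount --bind /sys  /mnt/sda2/sys
        chroot /mnt/sda2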

  • Can TFS 2010 be installed on a single server and in a workgroup (not AD)?

    - by Pure.Krome
    Hi folks, we're currently using TFS 2010 at our office and we're about to move. Part of that move is a split of teams: our team will get its own servers, so we need to build our own TFS server and add our current projects to it. Right now our TFS setup spans TWO servers - one for TFS itself and one for our continuous integration (I think that's a build controller or something). That really sucks for us - two servers instead of one for all our source control. We love CI and how it works (after the massive, massive pain it was to get our VS2010 solution doing CI + Web Deploy), and it does work. So - can we do this with ONE server? Also, we don't want to have an Active Directory. Will that work too?

  • Horrible time with WAMP, Joomla and Windows 7 (Permissions)

    - by jax
    I have WAMP and Joomla installed on Windows 7. I have made a virtual host outside of the www directory that houses a test Joomla site, and I have built a Joomla component that I want to install and test on this virtual host. I am having constant problems with permissions. I have given all users full access to the folder containing the files under the Security tab in Windows 7. It seems to me that when Apache/PHP creates a file, it applies restrictive permissions that no longer allow it to delete or edit the file, so I have to go and delete the files manually when I want to uninstall my component. Also, I cannot install the component from a .zip - instead I have to unzip it manually into the tmp folder, because when I upload the zip in the Joomla installer I get the same permissions problems as above. How can I get rid of these issues once and for all?

  • Rewrite 2 different directories with htaccess?

    - by jason
    I have a tricky problem (for me at least). I'm trying to rewrite / to a folder, /webroot/www. I have some simple code and it works: RewriteRule ^$ /webroot/www/ [L] However, if the URL starts with components followed by anything else (e.g. foo, as in /components/foo), and foo is an actual directory that exists inside components, I should rewrite to /components/foo/www instead. How can I achieve that? I can't seem to figure it out. I'm using Apache with .htaccess.
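
    One possible .htaccess sketch (untested; it leans on mod_rewrite's -d directory test and assumes the rules live in the document root):

        RewriteEngine On
        # Don't rewrite URLs that already point into a www subfolder (avoids a rewrite loop)
        RewriteCond %{REQUEST_URI} !^/components/[^/]+/www/
        # Only rewrite when /components/<name> is a real directory
        RewriteCond %{DOCUMENT_ROOT}/components/$1 -d
        RewriteRule ^components/([^/]+)(/(.*))?$ /components/$1/www/$3 [L]
        # Everything else at the root goes to /webroot/www
        RewriteRule ^$ /webroot/www/ [L]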

  • How to create a Service Connection Point for Exchange (Manually)

    - by Ionoxx
    I'm being cautious here: before I remove anything, I want to be able to put it back. I'm having issues with a domain-joined computer that is using SCP to get Exchange autodiscover information. It's getting information for the now-unused internal Exchange server through SCP, even though the profile is using Office 365 on another domain. According to this conversation, I can simply remove the object from Active Directory Sites and Services. I want to know how to add it back in, should this create more problems or should we reinstate the Exchange server. Right-clicking on the parent "autodiscover" node doesn't allow me to create a Service Connection Point. Would simply running the cmdlet "Set-ClientAccessServer -identity servername -AutodiscoverServiceInternalUri url" be enough to recreate the object? Thank you!

  • How do I pull a backup from a Linux server to my Windows PC using rsync?

    - by Nogwater
    I'm currently using sftp to download nightly backups (.tar.gz) from my web host to my desktop computer. I think I'd like to switch to rsync to minimize the bandwidth (and time). I have Cygwin installed on my PC, but don't use it for much. I have shell access to my web host via ssh (PuTTY). Let's say my source directory is myserver.com:/home/username/backups/; I want to grab all of the .tar.gz files from there and save them to C:\Backups\ locally.
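
    A minimal sketch from Cygwin's bash, assuming the rsync and openssh packages are installed on both ends (note that C:\Backups appears as /cygdrive/c/Backups inside Cygwin):

        rsync -avz -e ssh 'username@myserver.com:/home/username/backups/*.tar.gz' /cygdrive/c/Backups/

    The -z flag compresses in transit, and rsync only transfers files (or parts of files) that changed since the last run, which is where the bandwidth saving comes from.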

  • Active Directory GPO

    - by Phillip R.
    I am looking into some weird issues with Active Directory and group policy. This domain has been upgraded from Windows NT and has had a few different administrators over the years. I am looking through the Default Domain group policy and the Default Domain Controllers group policy. In the security areas - I'll use "log on locally" as an example - it shows SIDs that begin with asterisks and are quite long; they look sort of like the following: *S-1-5-21-787626... Normally, when I see something like this, I would think that the user account was no longer there and this was never cleaned up. Am I wrong in my assumption? Thanks in advance

  • Suspicious activity in access logs - someone trying to find phpmyadmin dir - should I worry?

    - by undefined
    I was looking over the access logs for a server that we are running on Amazon Web Services. I noticed that someone was obviously trying to find a phpMyAdmin directory - they (or a bot) were trying different paths, e.g. admin/phpmyadmin/, db_admin, ... and the list goes on. Actually there isn't a database on this server, so this was not a problem - they were never going to find it - but should I be worried about such snooping? Is this just a really basic attempt at getting into our system? Our actual database is held on another managed server, which I assume is protected from such intrusions. What are your views on such sneaky activity?

  • Emacs doesn't use ~/.ssh/config when accessing files on a remote machine

    - by Yotam
    I have a fresh install of Arch Linux. I've installed Emacs from the repos, and my home directory is mounted from a separate partition. I have old settings in my ~/.ssh/config, along with authentication keys I've used regularly before. Now, when I try to connect to a remote machine using Emacs, it asks for my password and uses the wrong username - clearly, Emacs isn't reading my config file. When I ssh or scp directly to the machine, things work fine. What do I need to update?
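
    For reference, a minimal ~/.ssh/config entry of the kind TRAMP's ssh-based methods are supposed to pick up (host, user, and key names here are placeholders):

        Host myhost
            HostName myhost.example.com
            User myuser
            IdentityFile ~/.ssh/id_rsa

    Opening /ssh:myhost:/path/to/file from Emacs (C-x C-f) forces an ssh-based TRAMP method, which runs the real ssh binary and therefore honours the config file. If that works but the default method doesn't, the likely suspects are the default TRAMP method or the ownership/permissions of ~/.ssh on the separately mounted home partition, since ssh refuses a config file with the wrong owner.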

  • How can I reset Windows 7 file permissions?

    - by ssb
    I looked at this post, and it seemed close to what I want, but my case might be a little worse: "How can I reset my Windows 7 file permissions to a rational state?" Basically, a while back I (very stupidly) changed the permissions on all sorts of system folders and eventually rendered my computer virtually unusable. I managed to hack administrator privileges back onto key folders and got it working, but in doing so I only moved the permissions further away from their natural state. I'm looking at this icacls stuff, but ultimately I need to reset EVERYTHING back to what it was in The Beginning, before I messed with it, from the C: directory all the way down. Right now Application Data is what's giving me problems, and I can't get it to work no matter how much I fiddle with those specific permissions. I will be forever grateful for help on how to do this without having to reformat.

  • How do I get transparent, efficient, file system snapshotting or versioning on ext3/4?

    - by shovas
    I've long thought about versioning file systems. This is a killer feature, and I've looked at Wayback, ext3cow, ZFS, FUSE-based solutions, and plain cvs/svn/git overlays. I consider ext3cow the model for my requirements: transparent and efficient, though I could do without the extra ls abc@timestamp feature, as long as I somehow get automated, transparent versioning of my files. It could be instantaneous, or it could be based on snapshots at intervals of 10s, 30s, 1m, 5m, 15m, etc. - just something that will efficiently deal with thousands of files in a given directory, all of various sizes, most small but some upwards of 100 MB to 1 GB. ZFS isn't really an option as I'm on Linux (and I would prefer not to use it through FUSE, since I already have an ext3 setup I want to version, not something new). What solutions are out there?
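
    Not per-file versioning, but as a point of comparison: if the ext3/4 volume sits on LVM, interval snapshots can be scripted from cron. A sketch, assuming a volume group vg0 containing a logical volume data, with free extents available for snapshots:

        #!/bin/bash
        # Take a read-only snapshot of /dev/vg0/data, named by timestamp.
        # Run from cron at the desired interval, e.g.: */5 * * * * /usr/local/bin/snap.sh
        lvcreate --snapshot --size 1G --name "data-$(date +%Y%m%d-%H%M)" /dev/vg0/data

    Snapshots are copy-on-write, so they are cheap to take, but each one consumes space as the origin changes and old ones must be pruned with lvremove.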

  • Delete protected files in NTFS?

    - by Balchev
    I want to delete an old Windows directory from my system drive (C:), but I am unable to due to NTFS permissions. I tried from Windows 7 and Windows 2003, but can't do it; I tried safe mode as well, with the same result. Is there any way to work around this (other than formatting the drive)? Perhaps changing the owner or something? It errors on files like "oldwin/bfsvc.exe". Is there some "superuser" in Windows similar to the Linux root account? Thanks

  • Samba Share writable

    - by Chris
    I have a problem writing to a Samba share. I believe the advice below is the answer, but I do not know how to carry it out - does someone know how? Thank you very much.

    "On the Samba server, you need to ensure that the nobody user has write permissions to /Windows_Backups/DC. You're forcing everyone to be impersonated by the nobody account, so that account will need file-level permissions on that share directory. Samba will respect local permissions when figuring out who can write where; in this case it is somewhat like Windows."
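
    A sketch of the file-level side on the Samba host, assuming the share path /Windows_Backups/DC from the quoted answer and that guest access maps to the nobody account:

        chown -R nobody:nogroup /Windows_Backups/DC   # give the guest account ownership
        chmod -R u+rwX /Windows_Backups/DC            # and read/write plus directory traversal

    The group name for nobody varies by distribution (nogroup on Debian-family systems, nobody on some others), so check /etc/passwd first.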

  • View web server from internal and external networks

    - by Just4Net
    We have a web server hosted out of our office, and it works fine when people access the site from outside the office. The problem is that when people are inside the office and go to the website (www.example.com), it's very slow, because the traffic goes out over the internet and comes back in. Our LAN is Windows with Active Directory, and our web server is CentOS with its own internet connection. How can I arrange things so that when users in my office visit the website, it is served over the local network instead of going out one internet connection and back in through the other?

  • rsync stuck with the --checksum option

    - by billc.cn
    I use back-in-time to back up my Linux installation; it serves as an advanced wrapper around the rsync command. Today I tried to add /var/log to the list of folders to be backed up, and it caused some serious performance problems: the job seems to get stuck on a particular file, and the CPU usage of the rsync parent process reaches 100%. I then used lsof to see which file caused the problem, and it seems to be the /var/log directory. I did some googling and some experiments with different rsync options, and found --checksum to be the offender. Without the parameter, an incremental backup finishes properly in minutes; with it, the process gets stuck when rsync tries to sync a constantly changing log file. That kind of makes sense, but it still seems like a bug to me. Am I using the option correctly? Is there a workaround for this?

  • Apache Alias - Chiliproject

    - by asdz
    I'm trying to set up ChiliProject (a Ruby application for project management), and I have already set up Apache. I want ChiliProject to live at http://abc.com/chiliproject, because abc.com itself will be used for another application. The following is my ChiliProject vhost setting:

        ServerName abc.com
        DocumentRoot /var/www/chiliproject/public
        Alias /chiliproject /var/www/chiliproject/public
        Options -MultiViews
        AllowOverride all

    When I go to abc.com, the ChiliProject page appears, but when I go to abc.com/chiliproject I reach a 404 page instead. If I change the DocumentRoot to /var/www, then abc.com behaves the way I want, but abc.com/chiliproject shows the directory view of my page.
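
    For what it's worth, ChiliProject is a Rails application, so if it is served by Phusion Passenger, the usual sub-URI pattern is a symlink plus RailsBaseURI rather than a bare Alias. A sketch (untested; assumes Passenger and a DocumentRoot of /var/www/html):

        # Shell: link the app's public dir under the DocumentRoot
        ln -s /var/www/chiliproject/public /var/www/html/chiliproject

        # Apache vhost:
        #   DocumentRoot /var/www/html
        #   RailsBaseURI /chiliproject

    A plain Alias doesn't tell Passenger that a Rails app is rooted at that URI, which would explain the 404.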

  • Hyper-V share a folder between host and instance

    - by Fly_Trap
    I have a Hyper-V server and several VMs (virtual machines), all connected to an external network. I have tried sharing a folder on the host and connecting from a VM; this works, but I'm prompted for a user name and password (as you would expect). I do not want to grant the "Everyone" group permissions, as the physical host server is on a network with other servers. I have created a new virtual internal network in Hyper-V and given its adapter a static IP of 33.0.0.100. I have added the virtual adapter to one of the VMs and set its IP to 33.0.0.2 (as advised here). Again this seems to work, but I'm still prompted for a user name and password. Am I on the right lines here? I just want to share a directory from the host to the VMs without exposing the share to other servers on the network.

  • How to rsync just the current folder?

    - by Michael
    I want to sync some files from a remote server to my local computer. How can I make rsync copy only the files with a certain file extension in the directory, but no subdirectories? I assumed this would be an easy task, but embarrassingly I haven't figured it out in nearly two hours. So could someone give me an example? I did various experiments with something like the following command:

        rsync -a --include=what? --exclude=what? -e ssh [email protected]:/test /test
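
    The pattern usually suggested for this (a sketch; the .jpg extension, host, and paths are placeholders): include the wanted files, then exclude everything else. Because the exclude also matches directories, rsync never descends into subdirectories:

        rsync -av -e ssh --include='*.jpg' --exclude='*' 'user@server:/test/' /test/

    The trailing slash on the source matters: it copies the directory's contents rather than recreating the test directory itself inside the destination.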

  • how do I concatenate a regex in a bash alias?

    - by Rodreegez
    Hello, I can't for the life of me figure out how to create an alias that will switch to a given project directory. I keep all my projects in a folder called Projects, i.e. ~/Projects/blog, ~/Projects/whatever. I'd like to have an alias along the lines of p whatever that would equate to cd ~/Projects/$1, where $1 is whatever is given to p. I have tried various combinations of alias p="cd ~/Projects/\$1" with all the usual suspects for regex escaping, but I can't quite get it. Any ideas?
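
    For the record, bash aliases never receive positional arguments - $1 doesn't expand inside an alias - so the usual fix is a shell function (a sketch for ~/.bashrc, assuming the ~/Projects layout above):

        # p <name>: jump to a project directory under ~/Projects
        p() {
            cd ~/Projects/"$1" || return
        }

    Usage: p blog is then equivalent to cd ~/Projects/blog, with an error message if the directory doesn't exist.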

  • How can I set clean URLs (enable rewrite) if I don't have a domain?

    - by Patrick
    In order to enable clean URLs in Drupal, I add the lines below to the lighttpd configuration file. However, I'm now working on a local server and I don't have a domain available, so I need to work with this address: http://local.ip/Sites/mywebsite. I've tried to replace ["host"] with ["socket"] and to replace the domain with the IP and subfolder (see the address above), but without success. How can I set up the configuration file to get clean URLs even though I don't have a domain? Thanks

        $HTTP["host"] =~ "(^|\.)mywebsite\.com" {
            server.document-root = "/var/www/sites/mywebsite"
            server.errorlog = "/var/log/lighttpd/mywebsite/error.log"
            server.name = "mywebsite.com"
            accesslog.filename = "/var/log/lighttpd/mywebsite/access.log"
            include_shell "./drupal-lua-conf.sh mywebsite.com"
            url.access-deny += ( "~", ".inc", ".engine", ".install", ".info", ".module", ".sh", "sql", ".theme", ".tpl.php", ".xtmpl", "Entries", "Repository", "Root" )
            # "Fix" for Drupal SA-2006-006, requires lighttpd 1.4.13 or above
            # Only serve .php files of the drupal base directory
            $HTTP["url"] =~ "^/.*/.*\.php$" {
                fastcgi.server = ()
                url.access-deny = ("")
            }
            magnet.attract-physical-path-to = ("/etc/lighttpd/drupal-lua-scripts/p-.lua")
        }
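
    One avenue that may work (a sketch, untested): lighttpd's $SERVER["socket"] conditional matches the listening address/port with == (not =~), so the block can be keyed to the local IP instead of a host name. The IP below is a placeholder:

        $SERVER["socket"] == "192.168.1.10:80" {
            server.document-root = "/var/www/sites/mywebsite"
            # ... rest of the block unchanged ...
        }

    Note this matches everything arriving on that address, and since the site lives under /Sites/mywebsite, the rewrite rules generated by drupal-lua-conf.sh would also need to account for the subfolder prefix.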

  • How to set my Ubuntu account to super user at all times?

    - by iaddesign
    I have the latest Ubuntu installed, and I'll be the only one using it, off the network. My question is: how can I make myself superuser at all times? When I try to delete a file, it says I don't have the privileges to do so. I know you are going to say it's a security risk, but I'm off the network and want to learn all that I can. I don't want to delete the files through the terminal but through the user interface/explorer. I've installed LAMP and can't copy my site to the www directory; I've tried to remove the preinstalled index file and it won't let me.
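
    Rather than running the whole desktop as root, a common middle ground for the LAMP case is to take ownership of the web root (a sketch; assumes the stock /var/www and a single user administering it):

        sudo usermod -aG www-data "$USER"        # join the web server's group (log out and in to apply)
        sudo chown -R "$USER":www-data /var/www  # own the web root yourself
        sudo chmod -R g+rwX /var/www             # let the group read/write as well

    After that, files under /var/www can be managed from the regular file browser without elevation.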

  • Problem with saving pictures in certain ways [migrated]

    - by user132750
    I am making a Garfield comic viewer in C#. I have a button that saves the comic on screen to the computer. When I run it from Visual C# Express, it saves perfectly; when I run the .exe file from the directory, I get an unhandled exception stating "A generic error occurred in GDI+". Here is my saving code:

        private void save_Click(object sender, EventArgs e)
        {
            using (SaveFileDialog sfdlg = new SaveFileDialog())
            {
                sfdlg.Title = "Save Dialog";
                sfdlg.Filter = "Bitmap Images (*.bmp)|*.bmp|All files(*.*)|*.*";
                if (sfdlg.ShowDialog(this) == DialogResult.OK)
                {
                    using (Bitmap bmp = new Bitmap(pictureBox1.Image.Width, pictureBox1.Image.Height))
                    {
                        pictureBox1.DrawToBitmap(bmp, new Rectangle(0, 0, bmp.Width, bmp.Height));
                        pictureBox1.Image = new Bitmap(pictureBox1.Image.Width, pictureBox1.Image.Height);
                        pictureBox1.Image.Save("c://cc.Jpg");
                        bmp.Save(sfdlg.FileName);
                        pictureBox1.ImageLocation = "http://garfield.com/uploads/strips" + "/" + whole.ToString("yyyy-MM-dd") + ".jpg";
                        MessageBox.Show("Comic Saved.");
                    }
                }
            }
        }

    What can I do?

  • Ubuntu - executable file - variable assignment throwing error on script run

    - by newcoder
    I am trying to run a small script, test, on an Ubuntu box. It is as follows:

        var1 = bash
        var2 = /home/test/directory
        ...
        ...
        <some more variable assignments and then program operations here>
        ...
        ...

    Now every time I run it, it throws errors:

        root@localhost# /opt/test
        /opt/test: line 1: var1: command not found
        /opt/test: line 3: var2: command not found
        ...
        ... more similar errors ...

    Can someone help me understand what is wrong in this script? Many thanks.
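
    For what it's worth, that error is exactly what the shell prints when an assignment has spaces around the equals sign: var1 = bash is parsed as a command named var1 with two arguments. A corrected sketch:

        #!/bin/bash
        # No spaces around '=' in shell assignments
        var1=bash
        var2=/home/test/directory

    (The shebang line is also worth adding, so the script is always run by bash rather than whatever shell happens to invoke it.)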

  • Why can't I set a Windows 7 folder to writable?

    - by Clay Nichols
    I moved a SATA hard drive from one PC to another and copied almost all the files from the old drive to the new one, except for one folder ("oldFolder") and its subfolders and files. I tried to copy just a single file, to simplify things: I can't copy that file into the same directory (I get the above error), but I CAN copy the file to the Desktop. I can not copy the file's container folder to the Desktop. Under Properties for that oldFolder: Read-only is checked; under Security, Everyone is set to Allow everything except "special permissions", and all users are set to allow WRITE.
