Search Results

Search found 37883 results on 1516 pages for 'sparse files'.


  • Access denied for user who has full access to some files in their own folder

    - by steve02a
    I have a very similar case to this user: Access denied on some files on Win2008R2 DC share. The server is Windows Server 2008 R2 and the user is on Windows 7 Pro. The user has her own home folder on the server, and she can read/write/modify every file in it at will, except one. For that single file she gets "access denied". I can open it (as domain admin), and another user can open it (because she is in the domain admins group). I ran the AccessEnum tool and the read/write permissions are identical for all of her files, so I can't explain why she can't open this one file. Everything else in her sub-folders is fine; this one file is causing a headache. What do you think could be wrong here?
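    A rough diagnostic sketch for this kind of case, not from the original question (the paths are placeholders, and it assumes you can run these on the server or against the share): compare the ACL and owner of the failing file with a sibling that opens fine, and check whether the file happens to be EFS-encrypted, since an encrypted file returns "access denied" even when the NTFS permissions look identical.

      rem list the ACL of the failing file and of a file that opens fine
      icacls "D:\Homes\jane\problem-file.docx"
      icacls "D:\Homes\jane\working-file.docx"
      rem show the owner of the failing file
      dir /q "D:\Homes\jane\problem-file.docx"
      rem show whether EFS encryption restricts who can read it
      cipher /c "D:\Homes\jane\problem-file.docx"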

    Read the article

  • How to recover lost files after an install

    - by Gentry McColm
    I'm a newbie learning as I go. I recently installed a second HDD in my Ubuntu box. The first is about 160 GB and runs Ubuntu 12.04; the new one is 1 TB and is used for holding videos. I set the second drive up as ext3, I believe, created folders on it to hold the videos, and it worked great. I also thought I had set it up to auto-mount, and I was able to read and write to it. Then the computer froze, so I had to reboot it. When I did, the system would not come back up: it hung on the Ubuntu screen with the five dots. I hit a few keys and a console came up indicating that the second HDD would not mount, which stopped the whole boot. Rebooting again didn't help, so I had to reinstall Ubuntu on the first HDD; I did not (apparently) touch the second one. When I got the system up and running again, the second HDD mounted automatically (yeah!), but now I cannot find the videos that were already on it. I had only put about 30 GB of videos on it, yet Properties now says about 50 GB is in use, so I'm wondering if my 17 videos are buried somewhere in there. Any help in recovering them? Thanks!
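    A rough first-look sketch, not from the original question; it assumes the drive is mounted at a placeholder path like /media/videos and that the videos use common extensions. It only finds files that still exist in the filesystem; it does not undelete anything.

      # what is the mount point, and what is actually using the ~50 GB?
      mount | grep media
      sudo du -h --max-depth=2 /media/videos | sort -h | tail
      # an fsck run on ext3 sometimes drops recovered files into lost+found
      sudo ls -lh /media/videos/lost+found
      # look for video files anywhere on the drive
      sudo find /media/videos -type f \( -iname '*.mp4' -o -iname '*.avi' -o -iname '*.mkv' \) -exec ls -lh {} +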

    Read the article

  • Logging to a log file won't stop after renaming/moving it... how do I stop it?

    - by Jakobud
    I just discovered that logrotate is not rotating our firewall log, so it's up to 12 GB in size. I need to split the file into smaller chunks and start rotating them manually so I can get things back on track. But before I start splitting the firewall log up, I need to stop the firewall from logging to the current file and force it to start logging to a new, empty one; that way I'm not trying to split or rotate a log file that is still constantly growing. I tried simply doing this: mv firewall firewall.old; touch firewall. I expected the new empty firewall file to start growing in size, but no... firewall.old is still being written to. Then I tried stopping and starting iptables: no change, firewall.old is still the log file. I tried moving it to another directory; that didn't help. I tried stopping iptables, renaming the file, creating a new firewall file, and then starting iptables again, but still no change. How do I stop the logging to this file and force logging to a new file?
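    A hedged sketch of the usual fix, assuming the firewall entries are written by a syslog daemon such as rsyslogd rather than by iptables itself (the path is a placeholder): the daemon holds the renamed file open by file descriptor, so it keeps writing to it until it is told to reopen its output files.

      # find out which process still has the renamed file open
      sudo lsof /var/log/firewall.old
      # if it is the syslog daemon, restart it (or send SIGHUP) so it reopens its log files
      sudo service rsyslog restart
      # equivalent: sudo kill -HUP $(pidof rsyslogd)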

    Read the article

  • /var/run/utmp is getting large and slowing my server down

    - by Travis
    I removed it and touched an empty version a few weeks ago and noticed a big upswing in performance on my server; the file was 400+ MB. I've been keeping an eye on it since, and it is growing fairly quickly. I tailed the file and I'm seeing a lot of "TTYXXLOGIN" entries. Should I be concerned? Is there a way to minimize its logging? Should I just logrotate it and forget about it? Thanks in advance.
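    A minimal sketch of the logrotate route, modeled on the stock wtmp/utmp stanza that ships in many distributions' /etc/logrotate.conf; treat the path, mode and ownership as assumptions to verify against your own system.

      # contents of /etc/logrotate.d/utmp (an assumed stanza, patterned on the stock wtmp entry)
      /var/run/utmp {
          missingok
          monthly
          create 0664 root utmp
          rotate 1
      }
      # dry-run it with: sudo logrotate -d /etc/logrotate.d/utmp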

    Read the article

  • Selectively including files in a C#/.NET web application [migrated]

    - by segnosaur
    I am attempting to modify an application with the following characteristics: written in C#/.NET, using Visual Studio 2010, and using a master page to keep the layout common across pages. The master page starts with: <%@ Master Language="C#" AutoEventWireup="true" CodeFile="mysheet.master.cs" Inherits="master_mysheet" %> Currently, the master page pulls in a common footer with an include: #include file="inc/my-footer.inc" Here's what I want to do: I would like to modify the master page so that it reads in a footer based on the value of a session variable, i.e. (not real code, just to give an idea of what I want): if session("x") = "a" then #include file="inc/my-footer1.inc" else #include file="inc/my-footer2.inc" My first instinct was to go with some VBScript: <script type="text/vbscript" language="vbscript"> document.write("vbscript example.") </script> However, the VBScript does not run automatically on page load. Does anyone know the syntax I need to get this to work, i.e. to have the script run automatically on page load and do the include? Or is there a better way to go about this (perhaps by doing it in C#)? Note: I am experienced in C#, but I haven't done any VBScript since the days of classic ASP, so my knowledge there is out of date.

    Read the article

  • "Permission denied" when accessing files outside Desktop folder after installing Ubuntu via WUBI

    - by Zhenyi
    After installing Ubuntu using the Windows installer for Ubuntu Desktop (Wubi), I found that I can only run executable programs from the "Desktop" folder. In any other folder, when I type something like ./a.out, I get a "Permission denied" error. Also, in the Properties dialog of the file I want to execute, the "Allow executing file as program" checkbox cannot be ticked if the file is not in the Desktop folder. What should I type in the terminal to fix the problem?
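    A short diagnostic sketch, not from the original question (./a.out is just a placeholder): check the execute bit, and check which filesystem and mount options the folder lives on, since chmod +x has no lasting effect on FAT/NTFS partitions or on filesystems mounted with noexec.

      ls -l ./a.out
      chmod +x ./a.out
      # what filesystem is the current directory on, and how is it mounted?
      df -T .
      mount | grep -E 'noexec|ntfs|vfat'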

    Read the article

  • Only show changed files with verbose option

    - by qox
    I would like rsync to print modified and deleted files. The verbose option (-v) does print modified files, but it also prints the list of subdirectories, possibly because touched directories count as modified. Since I sync a lot of files across a lot of subdirectories, it's impossible to see the actual changes. So, is there a way to stop rsync printing directories? I'm not looking for grep -v "*/$"-style answers, since that would also hide new directories. The command I am using: rsync -avh --delete /media/data/src /media/data/bkp And every time it prints the list of all directories: src/dir1/ src/dir1/sdir1/ src/dir1/sdir2/ src/dir2/ ..... Thanks for your help.
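    One possible approach, sketched here rather than taken from the question: rsync's --itemize-changes (-i) prefixes each line with a change code, so existing directories whose only change is metadata (lines starting with ".d") can be filtered out, while newly created directories (cd+++++++++) and changed or deleted files still show up.

      rsync -ah --delete --itemize-changes /media/data/src /media/data/bkp | grep -v '^\.d'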

    Read the article

  • Ubuntu crashed during the update to 12.04, now I can't recover my files, help please

    - by mrah
    I'm pretty new to Linux and only installed Ubuntu because I couldn't afford to buy Windows; it worked well and I liked it. After a prompt I chose to upgrade to the newest version, but the upgrade froze and the machine became unresponsive, which forced me to hard-reboot it. Now nothing seems to load and I've reached my wits' end (mainly because I'm lost in all the command lines). I've decided to try to recover my data from the hard drive (only two folders) by choosing the "Try Ubuntu" option when I boot from the OS CD. The problem I'm experiencing now is that it won't let me copy my folders; I get: 'The folder contents could not be displayed. You do not have the permissions necessary to view the contents of "folder_name".' Does anyone know how I can recover this data?
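    A minimal sketch of copying the data out with root privileges from the "Try Ubuntu" live session; the device name, user name and destination are placeholders, so check them first (e.g. with sudo fdisk -l).

      sudo mkdir -p /mnt/olddisk
      sudo mount /dev/sda1 /mnt/olddisk          # the old Ubuntu root partition
      # -a copies recursively and preserves permissions; the destination could be a mounted USB drive
      sudo cp -a /mnt/olddisk/home/yourname/Folder1 /media/USBSTICK/
      sudo cp -a /mnt/olddisk/home/yourname/Folder2 /media/USBSTICK/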

    Read the article

  • Standardize SQL Server Installations with Configuration Files

    If you need to install multiple SQL Server instances with the same settings, you most likely want to do it without walking through the numerous manual installation steps each time. The tip below guides you through installing a SQL Server instance with less effort.
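    A rough sketch of the kind of command such a tip ends with (the switches are real setup parameters, but the file name is a placeholder and the available options vary by SQL Server version): run one installation interactively, keep the ConfigurationFile.ini that setup writes out, then replay it silently on the other servers.

      rem run from the SQL Server installation media
      Setup.exe /ConfigurationFile=ConfigurationFile.ini /Q /IAcceptSQLServerLicenseTerms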

    Read the article

  • Ubuntu does not recognize install files (easytether)

    - by Esbilick
    I am running Ubuntu 12.04, 64-bit (amd64). I can't get EasyTether to install regardless of which version I use (32- or 64-bit). When I try to install it from the command line I get an error telling me it is not a valid deb file, and if I double-click it on the desktop the install manager comes up but the Install button is not highlighted, so I can't install it that way either. Please help; I'll be using my phone as an internet modem.
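    A short sketch of checking and installing the package from a terminal; the file name is a placeholder for whatever EasyTether shipped.

      # confirm the download really is a Debian package and matches your architecture
      file easytether_amd64.deb
      dpkg --print-architecture
      sudo dpkg -i easytether_amd64.deb
      sudo apt-get -f install     # fix up any missing dependencies afterwards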

    Read the article

  • What is the best tool to aggregate traffic stats from multiple nginx servers?

    - by gekkz
    The setup: two or more nginx machines, each with the same virtual hosts, with traffic load-balanced across them via DNS. I need to figure out the best tools for gathering traffic stats; I'm mostly interested in the number of hits and the total traffic in gigabytes. Obviously the log information will come from nginx, formatted like this: log_format main '$remote_addr $host $remote_user [$time_local] "$request" ' '$status $body_bytes_sent "$http_referer" "$http_user_agent" "$gzip_ratio"';
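    Not a monitoring tool, but a quick sketch of aggregating the combined logs by hand with awk, assuming the log format above and a normal three-token request line (which puts $host in field 2 and $body_bytes_sent in field 10); the file names are placeholders.

      cat server1-access.log server2-access.log | \
        awk '{ hits[$2]++; bytes[$2] += $10 }
             END { for (h in hits) printf "%-30s %10d hits %8.2f GB\n", h, hits[h], bytes[h]/1024/1024/1024 }'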

    Read the article

  • Other than using `split`, is there a way around the Apache 2.0 maximum file size limit of 2GB?

    - by warren
    I have some ISOs that need to be available across a WAN, so we are using an HTTP server to host them (this allows non-authenticated, read-only access to the data store, beyond requiring the VPN). The server the ISOs reside on runs CentOS 4 and Apache 2.0.58. Is there a way around the 2 GB file-size limit in Apache 2.0 without using the split utility to chunk the ISOs down to less than 2 GB each?
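    For reference, the split/reassemble fallback the question is trying to avoid looks roughly like this (file names are placeholders); clients download the parts and concatenate them back together.

      # on the server: cut the ISO into chunks under 2 GB and publish a checksum
      split -b 1024m CentOS-disc1.iso CentOS-disc1.iso.part-
      md5sum CentOS-disc1.iso > CentOS-disc1.iso.md5
      # on the client: reassemble and verify
      cat CentOS-disc1.iso.part-* > CentOS-disc1.iso
      md5sum -c CentOS-disc1.iso.md5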

    Read the article

  • Simplest way to get files from a server

    - by acidzombie24
    I am on Windows and my server is Linux. I would like to grab files from the server automatically with a script, and maybe execute a bash script remotely as well, though I may not need that. I need to connect securely, and I'd like some kind of password so that not just anyone can connect. I need to download files, I'd like to get every file in a set of folders, and I do not want to download files again if they already exist locally. What is the easiest way to do this? I thought of creating a simple .NET site with the data in App_Data (so it can't be reached from the outside), but I have a feeling an easier way exists. I'd like to use scp from a shell, but I am on Windows, and I'm also unsure how to iterate through folders and only get the files that don't already exist.
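    A minimal sketch of one way to do this, assuming rsync and ssh are available on the Windows side (for example via Cygwin); the host and paths are placeholders.

      # --ignore-existing skips any file that already exists locally, so nothing is downloaded twice
      rsync -avz --ignore-existing -e ssh user@server.example.com:/srv/data/folder1 /cygdrive/c/backup/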

    Read the article

  • Find copies of folders? (Not files)

    - by acidzombie24
    I have a dozen folders that are duplicates, and within them a few dozen folders that are also duplicates, so I have a few thousand copies of the same files and folders. Many of them are exactly the same, while others have changes in a few files. What utility can I use to delete folders that are exact copies of others? If one or more files in a folder have been changed, I don't want it deleted (and I'd like the removed subfolders to be replaced with shortcuts to a remaining copy, but that's not required). Is there a utility to do this?
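    A small sketch for checking whether a suspected duplicate really is an exact copy before deleting it (paths are placeholders; on Windows this assumes a Unix-style shell such as Cygwin or Git Bash).

      # diff -rq prints nothing and exits 0 only when every file in both trees is identical
      diff -rq "original/folder" "suspected-copy/folder" && echo "exact copy - safe to delete"
      # or compare checksums of the two trees
      ( cd "original/folder" && find . -type f -exec md5sum {} + | sort -k2 ) > /tmp/a.sum
      ( cd "suspected-copy/folder" && find . -type f -exec md5sum {} + | sort -k2 ) > /tmp/b.sum
      diff /tmp/a.sum /tmp/b.sum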

    Read the article

  • Getting ANT to scp only new/changed files

    - by Artem
    I would like to optimize my scp deployment, which currently copies all files, so that it only copies files that have changed since the last build. I believe it should be possible with my current setup somehow, but I don't know how. I have the following: Project/src/blah/blah/ <---- the files I am editing (mostly PHP in this case, plus some static assets); Project/build <------- a local build step copies the files here. I have an scp task right now that copies all of Project/build out to a remote server when I need it. Is it possible to take advantage of this extra "build" directory to accomplish what I want, meaning I only want to upload the "diff" between src/** and build/**? Is it possible to retrieve this as a fileset in Ant and then scp that? I do realize this means that if I delete or change files on the server in between, the Ant script would not notice, but that's okay for me.
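    Not the Ant fileset answer the question asks for, but for comparison, the same "only upload what changed" behaviour expressed with rsync over ssh (host and paths are placeholders); rsync compares size and modification time (or checksums with -c) and transfers only the differences.

      rsync -avz -e ssh Project/build/ user@deploy.example.com:/var/www/app/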

    Read the article

  • nmake makefile: linking object files in a subfolder

    - by Gauthier
    My makefile defines a link command: prod_link = $(LINK) $(LINK_FLAGS) -o$(PROD_OUT) $(PROD_OBJS) where $(PROD_OBJS) is a list of object files of the form: PROD_OBJS = objfile1.obj objfile2.obj objfile3.obj ... objfileN.obj The makefile itself is at the root of my project directory. It gets messy to have object and listing files at the root, so I'd like to put them in a subfolder. Building and outputting the .obj files to a subfolder works; I'm doing it with suffixes and inference: .s.obj: $(ASSEMBLY) $(FLAGS) $*.s -o Objects\$*.obj The problem is passing the Objects folder to the link command. I tried: prod_link = $(LINK) $(LINK_FLAGS) -o$(PROD_OUT) Objects\$(PROD_OBJS) but only the first file in the list gets the folder name prepended. How can I apply the Objects subfolder to every file in $(PROD_OBJS)? EDIT: I also tried PROD_OBJS = $(patsubst %.ss,Object\%.obj, $(PROD_SRC)) but got: makefile(51) : fatal error U1000: syntax error : ')' missing in macro invocation Stop. This is quite strange...

    Read the article
