Search Results

Search found 61297 results on 2452 pages for 'open files'.

  • Blocking a specific URL by IP (a URL created by mod_rewrite)

    - by Alex
    We need to block a specific URL for anyone not on a local IP (anyone without a 192.168.x.x address). However, we cannot use Apache's

        <Directory /var/www/foo/bar>
            Order allow,deny
            Allow from 192.168
        </Directory>

        <Files /var/www/foo/bar>
            Order allow,deny
            Allow from 192.168
        </Files>

    because these block specific files or directories; we need to block a specific URL that is created by mod_rewrite, where the page is dynamically generated by PHP. Any ideas would be greatly appreciated.
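    One approach (a sketch, assuming Apache 2.2-style access control; /secret-page is a placeholder for the rewritten URL): <Location> matches the request URL before it is mapped to the filesystem, so it catches mod_rewrite-generated paths.

        # Allow only 192.168.x.x clients on the rewritten URL;
        # <Location> works on URL space, not on files or directories.
        <Location /secret-page>
            Order deny,allow
            Deny from all
            Allow from 192.168
        </Location>

        # Alternatively, refuse non-local clients in the rewrite rules:
        RewriteEngine On
        RewriteCond %{REMOTE_ADDR} !^192\.168\.
        RewriteRule ^/?secret-page$ - [F]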

  • xargs -I replace-str option difference

    - by foresightyj
    From my understanding, the following should mean exactly the same thing:

        ls -1 | xargs file {}
        ls -1 | xargs -I{} file {}

    if the -I option is not specified, it defaults to -I{}. I want to list all files in the current directory and run the file command on each of them. Some have spaces in their names. However, I noticed a difference. See below:

        $ ls -1
        Hello World
        $ ls -1 | xargs file {}
        {}: ERROR: cannot open `{}' (No such file or directory)
        Hello: ERROR: cannot open `Hello' (No such file or directory)
        World: ERROR: cannot open `World' (No such file or directory)
        $ ls -1 | xargs -I{} file {}
        Hello World: directory

    With -I{} explicitly specified, blanks in file names are treated as expected.
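    For reference (a sketch, assuming GNU findutils): -I is not the default - it switches xargs to one-argument-per-input-line mode, which is why whitespace stops acting as a separator. A NUL-delimited pipeline sidesteps the problem for any file name:

        # Whitespace-safe: delimit names with NUL bytes instead of relying on -I.
        find . -mindepth 1 -maxdepth 1 -print0 | xargs -0 file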

  • IIS: redirect everything to another URL, except for one Directory

    - by DrStalker
    I have an IIS server (IIS 6, Win 2003) that hosts the site http://www.foo.com. I want any request to http://foo.com (no matter what path/filename is used) to redirect to http://www.bar.org/AwesomePage.html, UNLESS the request is for http://www.foo.com/specialdir, in which case the HTML files in the local directory specialdir should be used. The problem I have is that once the redirect is set, it also affects /specialdir - even if I right-click on that directory and select "content should come from ... local directory", the change does not take effect, and the directory still shows as redirecting to http://www.bar.org/AwesomePage.html. The same thing happens if I try to set individual files to load from the local system instead of redirecting - IIS gives no error, but the change does not take effect and the files still show as being redirected. How can I set specialdir to override the redirection to the new URL?
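    One thing worth trying (a sketch, assuming the default web site, metabase path W3SVC/1, and the stock AdminScripts location): the redirect is stored as an inherited HttpRedirect metabase property, and deleting it at the subdirectory level sometimes works when the UI change does not stick.

        REM Clear the inherited redirect on /specialdir only:
        cd C:\Inetpub\AdminScripts
        cscript adsutil.vbs DELETE W3SVC/1/ROOT/specialdir/HttpRedirect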

  • SBSMonitoring.mdf reached limit

    - by Bastien974
    I have SBS 2008 Standard. I am getting errors from MSSQL$SBSMONITORING in my Event Viewer, event IDs 1105 and 1827:

        Could not allocate space for object 'dbo.EventLog'.'PK_EventLog' in database 'SBSMonitoring' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.

        CREATE DATABASE or ALTER DATABASE failed because the resulting cumulative database size would exceed your licensed limit of 4096 MB per database.

    I tried to shrink the database; that worked for SBSMonitoring_log.LDF, but SBSMonitoring.mdf is still 4096 MB. I don't know how to reinstall the monitoring. Thanks for your help.
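    Shrinking cannot help here, because the data itself has hit the 4 GB per-database cap; space has to be freed inside the database first. A sketch (the dbo.EventLog table name comes from the error above, but the date column name is a guess - check the actual schema before running anything):

        -- Purge monitoring rows older than 30 days, then reclaim the space.
        USE SBSMonitoring;
        DELETE FROM dbo.EventLog
        WHERE EventTime < DATEADD(DAY, -30, GETDATE());   -- EventTime: placeholder column name
        DBCC SHRINKDATABASE(SBSMonitoring);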

  • Backup Client/Server Software that Syncs only the Delta?

    - by Urda
    I have a co-located server and a desktop computer. I push small things and large amounts of small files (like my iTunes library) into a JungleDisk cloud. If a few files change there, no big deal - the file gets re-uploaded. For larger files, JungleDisk backup isn't helpful: things like movies and VMware images change a lot, but I want them backed up - just not to JungleDisk, since that would cost me even more money. I am looking for a product, closed or open source (preferably open source), that will sync the change, or delta, to my personal server on a schedule. That way I can keep a copy of my larger things without paying JungleDisk a ton more, since they run to many gigabytes. Right now these few items are backed up over FTP, and it takes forever. Both the client and server are Windows environments.
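    For illustration (a sketch, not a product recommendation): rsync's delta-transfer algorithm sends only the changed blocks of large files, which is exactly the behaviour described; on Windows it is typically run via cwRsync or Cygwin and scheduled with Task Scheduler.

        # Push only the changed blocks of the VM images to the server;
        # host, user and paths are placeholders.
        rsync -av --partial --progress /cygdrive/c/VMs/ backupuser@myserver:/backup/VMs/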

  • wsgi - narrowing user permissions

    - by Tomasz Wysocki
    I have the following Apache configuration, and my application is working fine:

        <VirtualHost *:80>
            ServerName ig-test.example.com
            WSGIScriptAlias / /home/ig-test/src/repository/django.wsgi
            WSGIDaemonProcess ig-test user=ig-test
        </VirtualHost>

    But I want to protect my files from other users, so I do:

        chown ig-test /home/ig-test/ -R
        chmod og-rwx /home/ig-test/ -R

    and the application stops working:

        (13)Permission denied: /home/ig-test/.htaccess pcfg_openfile: unable to check htaccess file, ensure it is readable

    Is it possible to achieve what I'm doing with WSGI? If I have to give read permissions to some files, that is fine, but there are files I have to protect (like the file with the DB configuration, or the business logic of the application).
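    A possible middle ground (a sketch, assuming the Apache worker runs as www-data and mod_wsgi daemon mode runs the code as ig-test): Apache itself only needs to traverse the path and read the entry point; the secrets can stay owner-only, because the daemon process reads them as ig-test.

        # Group-own the tree by the Apache user and allow traversal/read only:
        chown -R ig-test:www-data /home/ig-test
        find /home/ig-test -type d -exec chmod 750 {} +
        find /home/ig-test -type f -exec chmod 640 {} +
        # Lock sensitive files down to the daemon user alone
        # (settings.py is a placeholder for the DB-config file):
        chmod 600 /home/ig-test/src/repository/settings.py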

  • firefox: getting access to the list of tabs/windows to restore on startup

    - by robb
    Sometimes Firefox fails to restore the previously open tabs/windows. This might happen when some of the URLs to be opened are no longer reachable (e.g. behind a VPN) or after the underlying OS (Windows) has been forcibly restarted (e.g. to complete an automated patch installation). Anyway, after restarting, can this list of URLs be recovered somehow? Say, for example, I was daft enough to have clicked on "start new session" - can I still get access to the old list of open URLs? There is the browser history, of course, but it contains a lot of stuff - the URLs that were open when Firefox last exited are not obvious. It would be neat if they were marked in some way - tagged, for example. .robb
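    One recovery avenue (a sketch, assuming a Firefox 3.x-era profile): the session is kept as JSON in sessionstore.js in the profile folder, and sessionstore.bak usually still holds the previous session after a "start new session" misclick. A crude way to pull the URLs out:

        # The profile lives under %APPDATA%\Mozilla\Firefox\Profiles\ on Windows.
        grep -o '"url":"[^"]*"' sessionstore.bak | sort -u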

  • IIS 6.0 FTP Folder Permissions

    - by Beuy
    I have an IIS FTP site set up like this: \ftp\users\domain\public\public. Software that runs on clients' computers logs into the FTP server by specifying domain\public, moves to public, and then uploads or downloads files/folders into that area. I want to restrict permissions on \ftp\users\domain\public so that nothing and nobody can write files or folders there, only to \ftp\users\domain\public\public. I set the NTFS permissions so that domain\users, public and server\users do not have Modify rights, yet I can still upload and modify files. I have also disabled inheritance from the parent folder of \ftp\users\domain\public. Any ideas on what I'm missing here? P.S. I know this is a stupid setup and makes no sense; it's some bizarre legacy application that I need to migrate to a safer environment until it can be replaced.
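    One thing to check beyond the allow ACLs (a sketch; FtpUsers is a hypothetical group standing in for whatever accounts the clients use): an explicit folder-only deny of the file/folder-creation rights blocks writes to the parent without propagating to the child, since icacls entries without (OI)(CI) flags apply to that folder alone. Also verify the FTP virtual directory's own Write checkbox in IIS, which is evaluated in addition to NTFS.

        REM Deny "create file" (WD) and "create folder" (AD) on the parent only:
        icacls \ftp\users\domain\public /deny FtpUsers:(WD,AD)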

  • How to configure nginx to serve static contents from RAM?

    - by Vijayendra Tripathi
    I want to set up nginx as my web server. I want image files cached in memory (RAM) rather than on disk. I am serving a small page and want a few images always served from RAM. I don't wish to use varnish (or any other such tool) for this, as I believe nginx has a capability to cache contents in RAM. I am not sure how I should configure nginx for this; I tried a few combinations, but they didn't work - nginx uses the disk all the time to get images. For example, when I tried the Apache benchmark tool with the following command:

        ab -c 500 -n 1000 http://localhost/banner.jpg

    I get the following error:

        socket: Too many open files (24)

    I guess this means nginx is trying to open too many files simultaneously from the disk, and the OS is not allowing this. Can anyone please suggest a correct configuration? Thanks for considering this message.
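    For what it's worth (a sketch): nginx has no directive that pins whole file contents in RAM - open_file_cache keeps descriptors and metadata hot while the OS page cache holds the bytes of frequently read files, and a tmpfs mount is the blunt way to force files into memory. The "Too many open files" error is a file-descriptor limit, raised with worker_rlimit_nofile (and ulimit -n in the shell running ab itself).

        # main context
        worker_rlimit_nofile 8192;

        http {
            open_file_cache          max=1000 inactive=60s;
            open_file_cache_valid    120s;
            open_file_cache_min_uses 1;
            open_file_cache_errors   on;
        }

        # Or keep the images on a RAM-backed filesystem:
        #   mount -t tmpfs -o size=64m tmpfs /var/www/images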

  • Apache - building extensions with apxs

    - by Brian
    Hello, Pardon the newbie question - I haven't worked with manually compiling Apache modules (or anything) before. I am trying to get the mod_concat module going. It seems simple enough - it just requires downloading the mod_concat.c file and then running:

        apxs -c mod_concat.c

    This is new to me. Does it matter which directory I put mod_concat.c in before running this command? I ran it from my home directory, and I see some new files - mod_concat.la, mod_concat.lo, mod_concat.o, and mod_concat.slo - along with a new subfolder called .libs/ that contains mod_concat.so and some other files. I'm not sure where to go from here; I have a feeling these files were created in the wrong place. Don't I need mod_concat.so to be in my Apache modules directory with the rest? Thanks for the help, Brian
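    The build directory doesn't matter much - what's missing is the install step. apxs itself can copy the module into Apache's modules directory and add the LoadModule line (a sketch; -i installs, -a activates the module in httpd.conf):

        apxs -c -i -a mod_concat.c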

  • Windows 7 - sharing file - slow seek time?

    - by progtick
    I do not want to copy the files. I simply have video files (.flv) on one computer, and I would like another computer's user to watch them without copying. Playback is fine, but seeking (moving the cursor to skip a portion of the video) takes forever! I thought wireless speed might be the culprit, so I wired the two computers together. Maybe I saw some improvement, but it is still very bad. It's 1 Gbps! I know the real speed will vary, but before I monitor the real speed and such: is this a reasonable issue, or am I bound to have very slow seek times? What is going on? I must mention that some of these files are huge!
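    A quick way to separate raw network throughput from player/seek behaviour (a sketch; the share and file names are placeholders): time a plain sequential copy of one of the large files. If the copy is fast, the bottleneck is how the player requests byte ranges over SMB rather than the wire.

        robocopy \\OTHERPC\videos C:\temp bigvideo.flv /NP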

  • SQL Maintenance Cleanup Task Working but Not Deleting

    - by Alex
    I have a Maintenance Plan that is supposed to go through the BACKUP folder and remove all .bak files older than 5 days. When I run the job, it reports success, but older .bak files are still present. I've tried the steps from the following question: "SQL Maintenance Cleanup Task 'Success' But not deleting files" - the result is column IsDamaged = 0. I've also checked this question, and it is not my issue: "Maintenance Cleanup Task(s) running 'successfully' but not deleting back up files". I've even tried deleting the Job and Maintenance Plan and recreating them, but to no avail. Any ideas?
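    One way to see what the task is actually doing (a sketch; D:\Backups\ is a placeholder path): the Maintenance Cleanup task wraps the undocumented xp_delete_file procedure, and running it by hand exposes argument problems the plan hides. A classic gotcha is the extension field - it must be bak with no leading dot.

        -- First argument 0 = backup files; last argument 1 = include subfolders.
        DECLARE @cutoff datetime;
        SET @cutoff = DATEADD(DAY, -5, GETDATE());
        EXEC master.dbo.xp_delete_file 0, N'D:\Backups\', N'bak', @cutoff, 1;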

  • Data Archiving vs not

    - by Recursion
    For the sake of data integrity, is it wiser to archive your files or just leave them unarchived? No compression is being used. My thinking is that if you leave your files unarchived, any corruption will only hurt a smaller number of files. Whereas if you archive, let's say, all of your documents, even the slightest corruption can make the entire archive unrecoverable. So what's the best way to keep a clean file system without being subject to data corruption?
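    A third option worth knowing about (a sketch, assuming the par2cmdline tool): parity files add redundancy alongside the originals, so corruption can be detected and repaired without bundling everything into one fragile container.

        par2 create -r10 documents.par2 ~/documents/*   # 10% recovery data
        par2 verify documents.par2                      # detect corruption
        par2 repair documents.par2                      # repair from parity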

  • Create Virtual Image of Laptop before Formatting

    - by Simon Mark Smith
    I have a 3-year-old laptop running Windows XP that I used for business. Although I have not used the laptop in over a year, I now want to re-commission it with a fresh install of Windows 7. Before I do the fresh install, I want to create a virtual image of the laptop that I can keep and potentially run on my desktop machine, should I ever need to access any of the old files/projects it currently contains. I know most people will say to just copy the files over to the desktop, but my concern is the configuration of the laptop. I used to use it for development, and it has older versions of Visual Studio, SQL Server, ActiveX controls, etc. than I currently use, so I really want to preserve the environment, not just the files. So really I am asking: what is the best tool-set/method to achieve this? I understand there are free VM tools available, but I have never done this before and would appreciate any help.
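    One well-trodden route (a sketch, not the only option) is a physical-to-virtual (P2V) conversion, which preserves the whole environment. Sysinternals Disk2vhd, for example, images a running system into a VHD file that Windows Virtual PC or VirtualBox can boot; VMware's free vCenter Converter does the equivalent for VMware formats.

        REM Run on the laptop itself; writes a VHD of the system drive
        REM to an external disk (E: here is a placeholder).
        disk2vhd C: E:\laptop-backup\laptop.vhd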

  • Git pull with unstaged changes

    - by Peter
    Attempting a git pull when you have unstaged changes will fail, saying you can commit or stash them. I suppose a workaround is to git stash, git pull, then git stash pop. However, is there an alternative way to do this? I would like to forcefully git pull even if there are unstaged changes, but only if the files being brought down do not overwrite the modified files. I.e., if I have a repo with the files "derp1", "derp2", "derp3" and modify "derp1" locally, a git pull should bring down and overwrite everything except the "derp1" file. I assume a git stash + pull + stash pop achieves this already? And is there a better way? I suppose this could also work differently if it occurs in a submodule.
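    For reference (a sketch): the stash round-trip does behave as described, and plain git pull is already lazier than it first appears - the underlying merge only refuses to run when the incoming changes touch a file with unstaged edits.

        git stash        # shelve local changes
        git pull         # merge from upstream
        git stash pop    # reapply; conflicts surface here if derp1 changed on both sides

        # Equivalent but more explicit, with an inspection point in between:
        git fetch origin
        git diff --name-only origin/master   # see what differs before merging
        git merge origin/master              # aborts rather than clobber unstaged edits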

  • Apache stopping downloads part way through

    - by Ben Smiley
    On my site there are some digital files which can be downloaded through a PHP script. The script works fine for small files, but large files (e.g. 115 MB) cannot be downloaded successfully. The connection dies after around 15 minutes, but it's not consistent - sometimes longer, sometimes shorter. I don't think it's the script timing out, because the download time isn't consistent. Equally, it doesn't seem like a memory-limit problem, because the amount downloaded varies each time. Does anyone know of any Apache or PHP settings which could cause this kind of problem?
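    Some usual suspects to rule out (a sketch of a chunked PHP sender; $path stands in for the file being served): PHP's max_execution_time, output buffering that accumulates the whole file in memory, and mod_deflate trying to compress the stream can all kill long transfers in ways that look random.

        <?php
        set_time_limit(0);              // lift the script's execution limit
        while (ob_get_level()) {
            ob_end_clean();             // drop any output buffers
        }
        $fh = fopen($path, 'rb');
        while (!feof($fh)) {
            echo fread($fh, 8192);      // stream in 8 KB chunks, flat memory use
            flush();
        }
        fclose($fh);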

  • Mass remove passwords from rar archives

    - by ldigas
    Is there a way to mass-remove passwords from a bunch of RAR files? (I'm using the WinRAR demo, but I'm willing to change to whatever is needed.) Problem description: for reasons unknown to me, some archiving was done for two-and-something years in RAR format, and all the archives have passwords. I have a list of them, and they are all similar (mostly something like John-03, John-04, John-05, i.e. name-month), but I need to manipulate the files in bulk, and removing passwords or extracting all those archives while entering passwords manually is a real problem. What would be my best options? Ideally, I'm looking for some kind of archiver which tries out a predefined list of passwords and asks only if none of them cracks the safe. AFAIK, WinRAR has no such feature.
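    No archiver I know of has that built in, but it can be scripted (a sketch in Windows batch, assuming WinRAR's command-line unrar.exe is on the PATH and passwords.txt holds one candidate per line; extraction exits with errorlevel 0 only when the password is right):

        @echo off
        for %%A in (*.rar) do (
            for /f %%P in (passwords.txt) do (
                unrar x -p%%P -y -idq "%%A" extracted\ && echo %%A: %%P >> found.txt
            )
        )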

  • MacOS X 10.6 Portable Home Directory sync fails due to FileSync agent crashing

    - by tegbains
    On one of our cleanly installed Mac Pro machines running Mac OS X 10.6.6, connected to our Mac OS X 10.6.6 Server, syncing data using Portable Home Directories fails. It seems to be due to the FileSync agent crashing during the home sync. We get -41 and -8062 errors, which we suspect indicate that there is too much data or that the agent can't read the files. The user is the owner of the files and can read/write all of them.

        < Logout 0:: [11/02/04 13:10:42.751] Error -41 copying /Volumes/RCAUsers/earlpeng/Library/Mail/Mailboxes/email from old imac./Attachments/12081/2.2. (source = NO)
        < Logout 0:: [11/02/04 13:10:42.758] Error -8062 copying /Volumes/RCAUsers/earlpeng/Library/Mail/Mailboxes/email from old imac./Attachments/12081/2.2/[email protected]. (source = NO)
        < Logout 1:: [11/02/04 13:10:42.758] -[DeepCopyContext deepCopyError:sourceError:sourceRef:]: error = -8062, wasSource = NO: return shouldContinue = NO

  • Is it possible to force a credential check every time a network share is opened in Windows 7?

    - by Logan VanCuren
    I am running Windows 7 Ultimate and I have mapped a network share to a drive letter. The first time that I open the share, I am prompted for login credentials as expected. I do not select the "Remember my credentials" checkbox and can login successfully, but every time that I re-open the share during the session, I do not need to re-enter my credentials. Is there any way to force a credential check every time that I re-open the share? I do not want to perform a reboot every time that I want to "re-lock" the share.
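    Part of the explanation, and a manual workaround (a sketch; the share name is a placeholder): Windows keeps the authenticated SMB session alive for the duration of the logon session, which is why no second prompt appears. Tearing the connection down forces fresh authentication on the next open, with no reboot:

        net use \\SERVER\share /delete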

  • The specified module (mod_h264_streaming) could not be found (Apache2)?

    - by rphello101
    I'm trying to get mod_h264_streaming to work with my Apache2 server. I downloaded a precompiled version of the module from here. I read here that all I have to do is extract the file to my modules folder, which I did, and add the following to httpd.conf, which I also did:

        LoadModule h264_streaming_module modules/mod_h264_streaming.so
        AddHandler h264-streaming.extensions .mp4

    However, I get this error when I restart Apache:

        Syntax error on line 173 of C:/Program Files (x86)/Apache Group/Apache2/conf/httpd.conf:
        Cannot load C:/Program Files (x86)/Apache Group/Apache2/modules/mod_h264_streaming.so into server: The specified module could not be found.
        Note the errors or messages above, and press the <ESC> key to exit. 26...

    Even though the file exists right here:

        C:\Program Files (x86)\Apache Group\Apache2\modules\mod_h264_streaming.so

    Can anyone tell me what I'm doing wrong?

  • Office 2007 network share access denied

    - by Rodent43
    Hope I have not duplicated an issue already posted, but I could not find anything from the search... Right, here is the problem: we have recently updated all our desktops to the MS Office 2007 suite, and people have issues trying to open simple files like Word documents. The systems are:

        Windows XP (SP3)
        Novell network with the Novell client
        Office 2007

    When they try to open a Word document from a usual network share, Word presents a message reporting "Access Denied. Contact Administrator". So we assumed network permissions, none of which have changed... Try the same file with WordPad and it opens fine, albeit with formatting issues, of course. Now copy the file to your desktop, which is not redirected, and you can open the file in Word as normal. So, does anyone know if Office 2007 uses some new permission when opening files? Does it create temp files or something? Any pointers would be appreciated.

  • Booting Linux from External HDD, with persistence

    - by Moriarty
    I am trying to install Linux - specifically Lubuntu or BackTrack 5 - on an external HDD (Seagate FreeAgent GoFlex), but I have had no luck using YUMI or UNetbootin to get it working. I want the hard drive to be able to save data within Linux (as in, if I install a program, it will stay there). I also tried doing this with a flash drive, which does boot, but it does not save data (I tried following Pendrivelinux's tutorial on creating a casper-rw file and adding "persistent" to various files, but I cannot get it to save files). Basically, I just want a form of Linux on a portable device that will save files and settings between boots. Note: I do not have a CD to install from. Any help would be greatly appreciated. Thanks!
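    For the live-system variant, the usual casper persistence recipe looks like this (a sketch for Ubuntu-family live images, Lubuntu included; run from a Linux shell with the drive's FAT32 partition mounted at /mnt/usb). A full, normal installation onto the external HDD avoids the persistence layer altogether.

        # Create and format a 2 GB persistence file named exactly casper-rw:
        dd if=/dev/zero of=/mnt/usb/casper-rw bs=1M count=2048
        mkfs.ext3 -F /mnt/usb/casper-rw
        # Then add the word "persistent" to the kernel/append line in the
        # drive's boot config (syslinux.cfg or text.cfg) - not to other files.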

  • IIS's SMTP Pickup timing

    - by fatcat1111
    I have IIS's SMTP server set up as a closed relay, and it's working nicely. I also have an application that writes EML files. If the EML files are written to a temporary directory, then moved to the server's Pickup directory, email is sent as expected. However, if I have the application write the EML files directly to the Pickup directory, the email will often fail to send. This seems to be a race condition: the server starts processing the EML file as soon as it detects it in Pickup, even though the application hasn't completed writing it. The result is the server considers the EML to be malformed, and it punts it to Badmail. While I very much appreciate the server's earnestness, it seems that I need to dial it back a bit for this scenario. Does anybody know if IIS's SMTP server's polling frequency can be configured? I am using IIS7, Windows Server 2008 R2. The application that writes the EML cannot be modified.
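    The usual answer is to sidestep the race rather than slow the poller (a sketch; paths are placeholders): a rename on the same NTFS volume is atomic, so files moved into Pickup appear fully formed. Since the writing application cannot be modified, pointing its output at a staging directory on the same volume and sweeping with a scheduled task gets the same effect; a file still being written is locked, fails to move, and is retried on the next sweep.

        REM Runs every minute from Task Scheduler; move is atomic on-volume.
        move C:\mailstage\*.eml C:\Inetpub\mailroot\Pickup\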

  • How to fix Windows 2008 R2 BOOTMGR is missing

    - by RichardTheKiwi
    BOOTMGR IS MISSING
    PRESS CTRL+ALT+DEL TO RESTART

    (Note: this is a VM on a VMware ESX server, but that should not matter.) I put in the 2008 R2 x64 install DVD and can get to recovery, but it lists no operating systems. Clicking Next brings me to:

        System Recovery Options
        Choose a recovery tool
        Operating system: Unknown or (Unknown) Local Disk
        .....
        Command Prompt

    I start the command prompt, go to C:\ and run dir /a. Apart from files I put there myself, these are showing:

        $Recycle.Bin
        Documents and Settings [C:\Users]
        Program Files
        Program Files (x86)
        ProgramData
        Recovery
        System Volume Information
        Temp
        Users
        Windows

    Where to go next? Is it like the NTLDR problem with Windows 2003, where I can just drop a file in there and it will be hunky-dory again?
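    A sketch of the standard repair sequence from that recovery Command Prompt (all stock tools on the 2008 R2 DVD). Unlike the NTLDR case, BOOTMGR needs both the boot files and a valid BCD store, so there is no single file to drop in:

        bootrec /fixmbr
        bootrec /fixboot
        bootrec /rebuildbcd
        REM If the BCD store is beyond repair, rebuild the boot files outright:
        bcdboot C:\Windows /s C: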

  • Specific font in Windows 7 works, but in Windows Server 2003 doesn't. Why?

    - by Vinicius Ottoni
    I have a .TTF font, and when I open it in Windows 7 everything is OK: the characters appear in various sizes, etc. But when I open it in Windows Server 2003, nothing appears inside it - it shows up as a "blank font", without the characters. I need that font for my app, which has to work on both systems... Note: all other fonts are OK in Windows Server 2003; when I open any of them, the characters appear. -- EDIT: I copied the font to another Windows Server 2003 machine... and it works fine. Does anyone have any idea?
