Search Results

Search found 49518 results on 1981 pages for 'configuration files'.


  • IIS: redirect everything to another URL, except for one Directory

    - by DrStalker
    I have an IIS server (IIS 6, Win 2003) that hosts the site http://www.foo.com. I want any request to http://foo.com (no matter what path/filename is used) to redirect to http://www.bar.org/AwesomePage.html UNLESS the request is for http://www.foo.com/specialdir, in which case the HTML files in the local directory specialdir should be used. The problem I have is that once the redirect is set it also affects /specialdir - even if I right-click on that directory and select "content should come from ... local directory", that change does not take effect, and the directory still shows as redirecting to http://www.bar.org/AwesomePage.html. The same thing happens if I try to set individual files to load from the local system instead of redirecting - IIS gives no error, but the change does not take effect and the files still show as being redirected. How can I set specialdir to override the redirection to the new URL?
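
    One angle worth trying (a sketch only, not a verified fix): in IIS 6 the redirect is stored as the HttpRedirect metabase property and is inherited by child nodes, so clearing it explicitly on the specialdir node with adsutil.vbs may let the local content win where the MMC change does not stick. The site ID 1 below is an assumption; check the real identifier in IIS Manager.

        cd %SystemDrive%\Inetpub\AdminScripts

        rem Inspect the (inherited) redirect on the subdirectory -- site ID 1 is assumed
        cscript adsutil.vbs GET W3SVC/1/ROOT/specialdir/HttpRedirect

        rem Remove the property so specialdir serves its local HTML again
        cscript adsutil.vbs DELETE W3SVC/1/ROOT/specialdir/HttpRedirect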

    Read the article

  • wsgi - narrow user permissions.

    - by Tomasz Wysocki
    I have the following Apache configuration and my application is working fine:

        <VirtualHost *:80>
            ServerName ig-test.example.com
            WSGIScriptAlias / /home/ig-test/src/repository/django.wsgi
            WSGIDaemonProcess ig-test user=ig-test
        </VirtualHost>

    But I want to protect my files from other users, so I do:

        chown ig-test /home/ig-test/ -R
        chmod og-rwx /home/ig-test/ -R

    and the application stops working:

        (13)Permission denied: /home/ig-test/.htaccess pcfg_openfile: unable to check htaccess file, ensure it is readable

    Is it possible to achieve what I'm doing with WSGI? If I have to give read permissions to some files it will be fine, but there are files I have to protect (like the file with the DB configuration, or the business logic of the application).
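
    One direction to try, as a sketch under assumptions (Apache runs as www-data and mod_wsgi is in daemon mode, so the application code itself is read by the ig-test daemon user): keep everything unreadable to "other", but give the Apache group traverse access on directories so its .htaccess checks succeed, and read access only on the WSGI entry point.

        # Sketch only -- the www-data group name and the paths are assumptions from the question
        chown -R ig-test:www-data /home/ig-test/
        chmod -R o-rwx /home/ig-test/

        # Directories: execute (traverse) only for the group, so Apache can walk the path
        find /home/ig-test/ -type d -exec chmod g+x,g-rw {} +
        # Files: no group access at all...
        find /home/ig-test/ -type f -exec chmod g-rwx {} +
        # ...except the WSGI script, which Apache must be able to map
        chmod g+r /home/ig-test/src/repository/django.wsgi

    Setting AllowOverride None for the tree in a <Directory> block may also help, since it stops Apache looking for .htaccess files there at all.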

    Read the article

  • bind9 named.conf zones size limit

    - by mox601
    I am trying to set up a test environment on my local machine, and I am trying to start a DNS daemon that loads the configuration from a named.conf.custom file. As long as that file contains only 3-4 zones, the bind9 daemon loads fine, but when I use the config file I need (about 10000 lines long), bind can't start up and in the syslog I find this message:

        starting BIND 9.7.0-P1 -u bind
        Jun 14 17:06:06 cibionte-pc named[9785]: built with '--prefix=/usr' '--mandir=/usr/share/man' '--infodir=/usr/share/info' '--sysconfdir=/etc/bind' '--localstatedir=/var' '--enable-threads' '--enable-largefile' '--with-libtool' '--enable-shared' '--enable-static' '--with-openssl=/usr' '--with-gssapi=/usr' '--with-gnu-ld' '--with-dlz-postgres=no' '--with-dlz-mysql=no' '--with-dlz-bdb=yes' '--with-dlz-filesystem=yes' '--with-dlz-ldap=yes' '--with-dlz-stub=yes' '--with-geoip=/usr' '--enable-ipv6' 'CFLAGS=-fno-strict-aliasing -DDIG_SIGCHASE -O2' 'LDFLAGS=-Wl,-Bsymbolic-functions' 'CPPFLAGS='
        Jun 14 17:06:06 cibionte-pc named[9785]: adjusted limit on open files from 1024 to 1048576
        Jun 14 17:06:06 cibionte-pc named[9785]: found 1 CPU, using 1 worker thread
        Jun 14 17:06:06 cibionte-pc named[9785]: using up to 4096 sockets
        Jun 14 17:06:06 cibionte-pc named[9785]: loading configuration from '/etc/bind/named.conf'
        Jun 14 17:06:06 cibionte-pc named[9785]: /etc/bind/named.conf.saferinternet:1: unknown option 'zone'
        Jun 14 17:06:06 cibionte-pc named[9785]: loading configuration: failure
        Jun 14 17:06:06 cibionte-pc named[9785]: exiting (due to fatal error)

    Are there any limits on the file size bind9 is allowed to load?
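
    For reference, named does not impose a practical size limit on included zone lists; an "unknown option 'zone'" error on line 1 of an included file usually points at a context or syntax problem (for example the include landing inside an options {} block, or an unclosed brace in a file included before it) rather than at file size. A top-level zone statement that named accepts looks roughly like this (zone name and paths are made up for illustration):

        // Sketch -- /etc/bind/named.conf.custom, pulled in from named.conf with:
        //   include "/etc/bind/named.conf.custom";
        zone "example.test" {
            type master;
            file "/etc/bind/db.example.test";
        };

    Running named-checkconf /etc/bind/named.conf before restarting points at the offending line without bouncing the daemon.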

    Read the article

  • IIS 6.0 FTP Folder Permissions

    - by Beuy
    I have an IIS FTP website set up like this: \ftp\users\domain\public\public. Software that runs on clients' computers logs into the FTP server by specifying domain\public and moving to public; it then uploads or downloads files and folders into that area. I want to restrict permissions on \ftp\users\domain\public so that nothing/nobody can write files or folders there, only to \ftp\users\domain\public\public. I set up the NTFS permissions of the folder so that domain\users, public and server\users no longer have Modify rights, yet I can still upload/modify files. I have disabled inheritance from the parent folder of \ftp\users\domain\public as well. Any ideas on what I'm missing here? P.S. I know this is a stupid setup and makes no sense; it's some bizarre legacy application that I need to migrate to a safer environment until it can be replaced.
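
    One thing worth checking is which account the FTP session actually maps to (anonymous FTP in IIS 6 uses the IUSR_<machine> account by default), since an Allow entry for that account or for a broad group like Everyone will trump the changes made to domain\users. A rough sketch of locking the parent folder once the right account is known (icacls needs Server 2003 SP2 or later; the account name and D: drive are placeholders):

        rem Deny write on the parent folder itself (no inheritance flags, so the child is untouched)
        icacls "D:\ftp\users\domain\public" /deny "DOMAIN\ftpusers":(W)

        rem Allow modify underneath the public\public area
        icacls "D:\ftp\users\domain\public\public" /grant "DOMAIN\ftpusers":(OI)(CI)M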

    Read the article

  • How to configure nginx to serve static contents from RAM?

    - by Vijayendra Tripathi
    I want to set up nginx as my web server. I want to have image files cached in memory (RAM) rather than on disk. I am serving a small page and want a few images always served from RAM. I don't wish to use Varnish (or any other such tool) for this, as I believe nginx has the capability to cache contents in RAM. I am not sure how to configure nginx for this; I did try a few combinations but they didn't work, and nginx uses the disk all the time to get images. For example, when I tried Apache Bench with the following command:

        ab -c 500 -n 1000 http://localhost/banner.jpg

    I get the following error:

        socket: Too many open files (24)

    I guess this means nginx is trying to open too many files simultaneously from the disk and the OS is not allowing this operation. Can anyone please suggest a correct configuration? Thanks for considering this message.
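
    For what it's worth, nginx does not copy file contents into a RAM cache of its own; hot static files end up in memory via the OS page cache. What nginx can cache is open file descriptors and stat metadata, and its own file-handle limits can be raised. A sketch with illustrative values, not a verified config for this site:

        worker_rlimit_nofile 8192;            # raise nginx's own open-file limit

        events {
            worker_connections 4096;
        }

        http {
            sendfile on;                                      # let the kernel push file contents
            open_file_cache          max=1000 inactive=60s;   # cache descriptors + stat info
            open_file_cache_valid    30s;
            open_file_cache_min_uses 2;
            open_file_cache_errors   on;
        }

    Note also that the "socket: Too many open files (24)" line is printed by ab itself when the benchmarking client hits its own descriptor limit, so the client-side ulimit may need raising for the test regardless of the nginx configuration.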

    Read the article

  • Remote SQL server connection failure

    - by Sevki
    I am trying to connect to my MSSQL Server 2008 Web instance and I'm failing horribly... I get error 26, and before you jump on me, I have already done all of these:

    - Check the spelling of the SQL Server instance name that is specified in the connection string.
    - Use the SQL Server Surface Area Configuration tool to enable SQL Server to accept remote connections over the TCP or named pipes protocols. For more information about the SQL Server Surface Area Configuration tool, see Surface Area Configuration for Services and Connections.
    - Make sure that you have configured the firewall on the server instance of SQL Server to open ports for SQL Server and the SQL Server Browser port (UDP 1434).
    - Make sure that the SQL Server Browser service is started on the server.

    In addition to these I have disabled the firewall completely and tried other ports; nothing works. The same credentials work on the server but not on the client. This is the exact error message:

        A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified) (.Net SqlClient Data Provider)

    Can anybody help?
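
    One low-level test that often narrows error 26 down (a sketch; the server name, port and credentials are placeholders): connect with sqlcmd while forcing TCP and the instance's actual port, which bypasses the SQL Server Browser / UDP 1434 lookup entirely. The port can be read from the SQL Server error log or from Configuration Manager under the instance's TCP/IP properties.

        sqlcmd -S tcp:yourserver.example.com,1433 -U youruser -P yourpassword -Q "SELECT @@SERVERNAME"

    If this works while connecting by servername\instancename does not, it is the browser/UDP resolution path that is being blocked between client and server.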

    Read the article

  • Windows 7 - sharing file - slow seek time?

    - by progtick
    I do not want to copy the files. I simply have video files (.flv) on one computer and I would like another computer user to watch them without copying. Playback is fine, but seek time (as in, if you move the cursor to skip some portion of the video) takes forever! I thought wireless speed might be the culprit, so I wired the two computers. Maybe I saw some improvement, but it is still very bad. It's 1 Gbps! (I know the real speed will vary, but before I monitor real throughput and such, is my expectation reasonable? Or am I bound to have very slow seek times?) What is going on? I must mention some of these files are huge!

    Read the article

  • Apache - building extensions with apxs

    - by Brian
    Hello, pardon the newbie question - I haven't worked with manually compiling Apache modules (or anything) before. I am trying to get the mod_concat module going. It seems simple enough - it just requires downloading the mod_concat.c file and then running: apxs -c mod_concat.c. This is new to me. Does it matter which directory I put mod_concat.c in before running this command? I ran it from my home directory, and I see some new files - mod_concat.la, mod_concat.lo, mod_concat.o, and mod_concat.slo - along with a new subfolder called .libs/ that contains mod_concat.so along with some other files. I'm not sure where to go from here; I have a feeling these files were created in the wrong place. Don't I need mod_concat.so to be in my Apache modules directory with the rest? Thanks for the help, Brian
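
    The build location does not matter; what is missing is the install step. A sketch, assuming the module compiles cleanly and the apxs being used belongs to the running Apache: apxs can copy the built .so into the server's modules directory and activate it in one go.

        # -c compiles, -i installs the resulting .so into Apache's modules directory,
        # -a appends/enables a LoadModule line in httpd.conf
        apxs -c -i -a mod_concat.c

        # then check the config and restart Apache
        apachectl configtest && apachectl restart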

    Read the article

  • Data Archiving vs not

    - by Recursion
    For the sake of data integrity, is it wiser to archive your files or just leave them unarchived? No compression is being used. My thinking is that if you leave your files unarchived, any corruption will only hurt a smaller number of files. But if you archive, let's say, all of your documents, then even the slightest corruption can make the entire archive unrecoverable. So what's the best way to keep a clean file system without being subject to data corruption?

    Read the article

  • SQL Maintenance Cleanup Task Working but Not Deleting

    - by Alex
    I have a Maintenance Plan that is supposed to go through the BACKUP folder and remove all .bak files older than 5 days. When I run the job, it gives me a success message but older .bak files are still present. I've tried the steps at the following question: SQL Maintenance Cleanup Task 'Success' But not deleting files - the result is column IsDamaged = 0. I've verified with the following question and this is not my issue: Maintenance Cleanup Task(s) running 'successfully' but not deleting back up files. I've also tried deleting the Job and Maintenance Plan and recreating them, but to no avail. Any ideas?
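
    The cleanup task ends up calling xp_delete_file, so running it by hand can expose a path or extension mismatch that the plan swallows as "success". The values below are placeholders (a sketch, not the plan's actual parameters); two classic gotchas are the extension being given with a leading dot and the folder path missing its trailing backslash.

        -- file type 0 = backup files; the extension takes NO leading dot
        EXECUTE master.dbo.xp_delete_file
            0,
            N'D:\Backups\',
            N'bak',
            N'2012-01-01T00:00:00',  -- delete .bak files older than this date
            1;                       -- 1 = include first-level subfolders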

    Read the article

  • Apache ScriptAlias and access error?

    - by Parhs
    First of all, after much pain I figured out how to make this work on Apache 2.4 on Windows. Here is my configuration, which works successfully for git clone, push and everything else. The problem: there is a "Require all denied" at the / directory, because I want only Git functionality and nothing else. Example request from a Git client:

        192.168.100.252 - - [07/Oct/2012:04:44:51 +0300] "GET /git/simple/info/refs?service=git-upload-pack HTTP/1.1" 200 264

    Error caused by that request:

        [Sun Oct 07 04:44:51.903334 2012] [authz_core:error] [pid 6988:tid 956] [client 192.168.100.252:13493] AH01630: client denied by server configuration: C:/git-server/web/simple

    There isn't any error on the Git client - everything works fine - but I get this in the error log. Is there any solution so this error does not appear? I worry about the log size.

        <VirtualHost *:80>
            DocumentRoot "C:\git-server\web"
            ServerName git.****censored****
            DirectoryIndex index.php
            SetEnv GIT_PROJECT_ROOT c:/git-server/repositories
            SetEnv GIT_HTTP_EXPORT_ALL
            SetEnv REMOTE_USER=$REDIRECT_REMOTE_USER
            ScriptAlias /git/ "C:/Program Files (x86)/Git/libexec/git-core/git-http-backend.exe/"
            <LocationMatch "^/.*/git-receive-pack$">
                Options +ExecCGI
                AuthType Basic
                AuthName intranet
                AuthUserFile "C:/git-server/config/users"
                Require valid-user
            </LocationMatch>
            <Directory />
                Options All
                Require all denied
            </Directory>
            <Directory "C:\Program Files (x86)\Git\libexec\git-core">
                Options +ExecCGI
                Options All
                Require all granted
            </Directory>
        </VirtualHost>
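
    One thing that may silence the AH01630 lines (an untested sketch; the exact behaviour depends on how the ScriptAlias and the Directory sections interact in 2.4): give the /git/ URL space an explicit authorization result of its own instead of letting it fall through to the "Require all denied" at /. The existing LocationMatch would still demand credentials for git-receive-pack.

        <Location /git/>
            Require all granted
        </Location>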

    Read the article

  • Create Virtual Image of Laptop before Formatting

    - by Simon Mark Smith
    I have a 3-year-old laptop running Windows XP that I used for business. Although I have not used the laptop in over a year, I now want to re-commission it with a fresh install of Windows 7. Before I do the fresh install I want to create a virtual image of the laptop that I can keep and potentially run on my desktop machine, should I ever need to access any of the old files/projects it currently contains. I know that most people will say just copy the files over to your desktop, but my concern is the configuration of the laptop. I used to use it for development and it has older versions of Visual Studio, SQL Server, ActiveX controls, etc., than I currently use, so I really want to preserve the environment, not just the files. So really I am asking: what is the best tool-set/method to achieve this? I understand there are free VM tools available, but I have never done this before and would appreciate any help.
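
    One commonly used route (a sketch, not a recommendation specific to this laptop) is a physical-to-virtual capture with Sysinternals Disk2vhd, run on the XP machine itself; the resulting VHD can then be attached to Windows Virtual PC, VirtualBox or Hyper-V on the desktop. The output path below is an example.

        rem "*" captures all volumes on the laptop; write the VHD to a drive with enough free space
        disk2vhd.exe * E:\old-laptop.vhd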

    Read the article

  • Git pull with unstaged changes

    - by Peter
    Attempting a git pull when you have unstaged changes will fail, saying you should commit or stash them. I suppose a workaround is to git stash, git pull, then git stash pop. However, is there an alternative way to do this? I would like to forcefully git pull even if there are unstaged changes, but only if the files being brought down do not overwrite the modified files. I.e., if I have a repo with the files "derp1", "derp2", "derp3" and modify "derp1" locally, a git pull would bring down and overwrite everything except the "derp1" file. I assume a git stash + pull + stash pop achieves this already? And is there a better way? I suppose this could also work differently if it occurs in a submodule.
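
    For reference, the stash round-trip spelled out (nothing here is specific to this repo); note that if upstream also changed derp1, the conflict surfaces at the pop step rather than the file being silently left alone, which is slightly different from the behaviour described above.

        git stash        # set local modifications aside
        git pull         # fast-forward/merge the remote changes
        git stash pop    # re-apply local modifications; conflicts (e.g. on derp1) stop here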

    Read the article

  • Apache stopping downloads part way through

    - by Ben Smiley
    On my site there are some digital files which can be downloaded through a PHP script. The script works fine for small files, but large files (e.g. 115 MB) cannot be downloaded successfully. The connection dies after around 15 minutes, but it's not consistent - sometimes longer, sometimes shorter. I don't think it's a problem with the script timing out, because the download time isn't consistent. Equally, it doesn't seem like a memory limit problem, because the amount downloaded varies each time. Does anyone know of any Apache or PHP related settings which could cause this kind of problem?
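
    A few settings that commonly cut off long PHP-mediated downloads, collected as a sketch (whether each one applies depends on how the script streams the file and on whether PHP runs as mod_php or behind FastCGI):

        <?php
        // In the download script itself:
        set_time_limit(0);   // lifts max_execution_time for this request

        // Server-side values worth checking (not set from PHP):
        //   php.ini:    max_execution_time, output_buffering
        //   httpd.conf: Timeout, plus any mod_fcgid / mod_proxy_fcgi idle or I/O timeouts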

    Read the article

  • Mass remove passwords from rar archives

    - by ldigas
    Is there a way (I'm using the WinRAR demo, but I'm willing to change to whatever is needed) to mass remove passwords from a bunch of archives? Problem description: for reasons unknown to me, some archiving was done for two-and-something years in RAR format, and all the archives have passwords. I have a list of them, all of them similar (mostly something like John-03, John-04, John-05... e.g. name-month), but I need to manipulate the files in bulk, and it is a real problem removing them and/or de-archiving all those files while entering passwords manually. What would be my best options? Ideally, I'm looking for some kind of archiver which tries out a predefined list of passwords, and asks only if none of them cracks the safe. AFAIK, WinRAR has no such feature.
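
    A rough sketch of the "try a password list" idea with the unrar command-line tool (assumes unrar.exe is on PATH and passwords.txt holds one candidate per line; adjust paths as needed). unrar t only returns success when the password is right, so the matching password for each archive gets logged:

        @echo off
        for %%A in (*.rar) do (
          for /f "usebackq delims=" %%P in ("passwords.txt") do (
            unrar t -inul -p%%P "%%A" && echo %%A : %%P >> found-passwords.txt
          )
        )

    From there the found passwords can feed a second pass that extracts each archive, or re-packs it without a password.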

    Read the article

  • MacOS X 10.6 Portable Home Directory sync fails due to FileSync agent crashing

    - by tegbains
    On one of our cleanly installed MacPro machines running MacOS X 10.6.6, connected to our MacOS X 10.6.6 Server, syncing data using Portable Home Directories fails. It seems to be due to the FileSync agent crashing during the home sync. We get -41 and -8062 errors, which we suspect indicate that there is too much data or that the FileSync agent can't read the files. The user is the owner of the files and can read/write all of them.

        < Logout 0:: [11/02/04 13:10:42.751] Error -41 copying /Volumes/RCAUsers/earlpeng/Library/Mail/Mailboxes/email from old imac./Attachments/12081/2.2. (source = NO)
        < Logout 0:: [11/02/04 13:10:42.758] Error -8062 copying /Volumes/RCAUsers/earlpeng/Library/Mail/Mailboxes/email from old imac./Attachments/12081/2.2/[email protected]. (source = NO)
        < Logout 1:: [11/02/04 13:10:42.758] -[DeepCopyContext deepCopyError:sourceError:sourceRef:]: error = -8062, wasSource = NO: return shouldContinue = NO

    Read the article

  • Booting Linux from External HDD, with persistence

    - by Moriarty
    I am trying to install Linux, specifically Lubuntu or BackTrack 5, on an external HDD (Seagate FreeAgent GoFlex), but I have had no luck using YUMI or UNetbootin to get it working. I want the hard drive to be able to save data within Linux (as in, if I install a program, it will stay there). I also tried doing this with a flash drive, which does boot, but it does not save data (I tried following Pendrive's tutorial on creating a casper-rw file and adding "persistent" to various files, but I cannot get it to save files). Basically, I just want a form of Linux on a portable device that will save files and settings between boots. Note: I do not have a CD to install from. Any help would be greatly appreciated. Thanks!
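
    For the persistence part, a minimal sketch of what the casper-rw route usually involves on an Ubuntu/Lubuntu live image (the mount point, size and bootloader file name are assumptions; persistence in this form is for casper-based images, and BackTrack may behave differently):

        # Create a 2 GB overlay file on the live partition (FAT32 caps a single file at 4 GB)
        dd if=/dev/zero of=/media/live/casper-rw bs=1M count=2048
        mkfs.ext3 -F /media/live/casper-rw

        # Then add the word "persistent" to the kernel line of the boot entry,
        # after the existing "boot=casper" options (e.g. in syslinux.cfg / text.cfg).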

    Read the article

  • The specified module (mod_h264_streaming) could not be found (Apache2)?

    - by rphello101
    I'm trying to get mod_h264_streaming to work with my Apache2 server. I downloaded a precompiled version of the module from here. I read here that all I have to do is extract the file to my modules folder, which I did, and add the following to httpd.conf, which I also did:

        LoadModule h264_streaming_module modules/mod_h264_streaming.so
        AddHandler h264-streaming.extensions .mp4

    However, I get this error when I restart Apache:

        Syntax error on line 173 of C:/Program Files (x86)/Apache Group/Apache2/conf/httpd.conf: Cannot load C:/Program Files (x86)/Apache Group/Apache2/modules/mod_h264_streaming.so into server: The specified module could not be found.
        Note the errors or messages above, and press the <ESC> key to exit. 26...

    Even though the file exists right here: C:\Program Files (x86)\Apache Group\Apache2\modules\mod_h264_streaming.so. Can anyone tell me what I'm doing wrong?

    Read the article

  • IIS's SMTP Pickup timing

    - by fatcat1111
    I have IIS's SMTP server set up as a closed relay, and it's working nicely. I also have an application that writes EML files. If the EML files are written to a temporary directory, then moved to the server's Pickup directory, email is sent as expected. However, if I have the application write the EML files directly to the Pickup directory, the email will often fail to send. This seems to be a race condition: the server starts processing the EML file as soon as it detects it in Pickup, even though the application hasn't completed writing it. The result is the server considers the EML to be malformed, and it punts it to Badmail. While I very much appreciate the server's earnestness, it seems that I need to dial it back a bit for this scenario. Does anybody know if IIS's SMTP server's polling frequency can be configured? I am using IIS7, Windows Server 2008 R2. The application that writes the EML cannot be modified.

    Read the article

  • How to fix Windows 2008 R2 BOOTMGR is missing

    - by RichardTheKiwi
        BOOTMGR IS MISSING
        PRESS CTRL+ALT+DEL TO RESTART

    Note: this is a VM on a VMware ESX server, but that should not matter. I put in the 2008 R2 x64 install DVD and can get to recovery, but it lists no operating systems. Clicking Next brings me to the System Recovery Options screen ("Choose a recovery tool", operating system: Unknown or (Unknown) Local Disk), where I pick Command Prompt. I start the command prompt, go to C:\ and run dir /a. Apart from files I put there myself, these are showing:

        $Recycle.Bin
        Documents and Settings [C:\Users]
        Program Files
        Program Files (x86)
        ProgramData
        Recovery
        System Volume Information
        Temp
        Users
        Windows

    Where to go next? Is it like the NTLDR problem with Windows 2003, where I can just drop a file in there and it will be hunky dory again?
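
    For what it's worth, the 2008 R2 equivalent of "drop NTLDR back in" is usually a couple of recovery-console commands rather than a file copy. A sketch, run from that recovery Command Prompt (the Windows volume may not be C: inside WinRE, so confirm with dir first):

        bootrec /fixmbr
        bootrec /fixboot
        bootrec /rebuildbcd
        bcdboot C:\Windows /s C: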

    Read the article

  • Can't Create New folder in my video folder

    - by tiki
    My OS is Windows 7 Ultimate 32-bit. My video folder location is C:\Users\User\Documents\Downloads\Video. I can't create a folder in the Video folder or see the files I saved there. But I can see the files in the actual location of the folder, i.e. by browsing to it from My Computer: I can create and find files when I go to My Computer > C > Users > User > Documents > Downloads > Video. This is an unusual problem I have never seen before. Even as a system administrator I haven't been able to fix it. Please help me out. Thanks in advance.

    Read the article

  • Backup Source (non source control)?

    - by acidzombie24
    I back up my code with SVN. I have project files in there, however I ignore selected things. I also ignore jpg, ogg, etc. Right now I would like to back up everything. However, the resulting zip is 1 GB (I have a lot of code), and I know I can cut the file size down by 60%+. Is there an app I can use which will back up everything except the bin and obj folders? Perhaps keeping ogg, json and jpg files but ignoring .svn and .pdb files?
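
    A sketch of one way to do this with the 7-Zip command line (assumes 7z.exe is on PATH and the command is run from the source root; the excluded names are the ones from the question):

        rem -xr! excludes a name or wildcard pattern recursively
        7z a backup.7z * -xr!bin -xr!obj -xr!.svn -xr!*.pdb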

    Read the article

  • Nginx Restart Issues

    - by heavymark
    All of a sudden, when restarting nginx I get the following error:

        Restarting nginx: [alert]: could not open error log file: open() "/var/log/nginx/error.log" failed (13: Permission denied)
        2011/02/16 17:20:58 [warn] 23925#0: the "user" directive makes sense only if the master process runs with super-user privileges, ignored in /etc/nginx/nginx.conf:1
        the configuration file /etc/nginx/nginx.conf syntax is ok
        2011/02/16 17:20:58 [emerg] 23925#0: open() "/var/run/nginx.pid" failed (13: Permission denied)
        configuration file /etc/nginx/nginx.conf test failed

    On the front end, part of the site loads, but some files - the CSS in particular - are not loading. They exist on the server, but when loading the resources directly in Chrome they say "Oops this page can't be found." I set up a special user and group to run my Apache domain files using suEXEC. I think the nginx files are owned by root, however, which I'm assuming is the problem - but which nginx file ownerships would I change?
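
    The "(13: Permission denied)" lines on the log and pid files typically mean the restart was issued without root privileges, rather than that ownership of the nginx files themselves is wrong (the master process normally starts as root and then drops to the configured user). A quick check, as a sketch:

        sudo /etc/init.d/nginx restart                        # or: sudo service nginx restart
        ls -l /var/log/nginx/error.log /var/run/nginx.pid     # see who owns what it complained about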

    Read the article

  • Can I list file names (or their parent directories) that were recently deleted using rm in OS X?

    - by Andrew Grimm
    Is it possible to find out which files and directories have recently been deleted by rm in OS X? Or, failing that, is it possible to find which parent directories have had files or directories within them deleted? The OS version is Snow Leopard. Background: last night, rvm (ruby version manager) did rm -rf of the ~/ruby directory from the home directory. (This bug has since been fixed.) Ideally, I'd like to know what files within the ~/ruby directory were deleted, but failing that, I'd like to know if rvm deleted anything outside of ~/ruby. In case anyone's wondering about backups: just about everything within ~/ruby is a git project that has a remote repo, and I have a fairly recent Time Machine backup (only 20 days old).
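
    rm itself leaves no record, but given the Time Machine backup, a dry-run rsync comparison can at least list what exists in the backup and no longer on disk. A sketch (the backup path, machine name and user name are placeholders):

        # -n = dry run: nothing is copied, rsync just reports what it *would* restore,
        # i.e. files present in the backup but missing (or changed) in the live tree.
        rsync -avn \
          "/Volumes/Time Machine Backups/Backups.backupdb/<machine name>/Latest/Macintosh HD/Users/<user>/ruby/" \
          ~/ruby/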

    Read the article

  • How do I secure Sql Server 2008 R2

    - by Mark Tait
    I have both a dedicated server and a VPS (from Fasthosts) - the web sites/applications I run on these access SQL Server hosted on the same web server. Until now, I have logged onto SQL Server on both the dedicated and the VPS server from SQL Server Management Studio - until I noticed in my server application logs multiple attempts to log on to SQL Server using the 'sa' username, with a failed password. So someone (or a bot) is trying hard (repeatedly, every couple of hours, roughly 20 attempts each time) to log on... so obviously I have to lock down remote access to SQL Server. What I have done is go into Configuration Manager and, under SQL Server Network Configuration - Protocols for Sql2008 and also under SQL Native Client 10.0 Configuration - Client Protocols, disable Named Pipes and TCP/IP (VIA was already disabled by default). I have left Shared Memory enabled. Under SQL Server Services I also disabled the SQL Server Browser. Now the only way I can manage the databases on these servers is by logging on to them via Remote Desktop. Can anyone confirm whether this is the correct way of stopping anyone maliciously logging on to SQL Server? (I'm not a DBA or security expert - there are hundreds of articles advising all different ways - but I was hoping the experts here could confirm, or otherwise, whether what I've done is correct.) Thank you, Mark
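
    On top of the protocol lockdown, the 'sa' brute-force noise itself can be addressed at the login level; a sketch in T-SQL (assumes another sysadmin login already exists, otherwise this removes the only administrative login):

        -- Stop 'sa' from being usable as a logon target at all
        ALTER LOGIN sa DISABLE;
        -- Optionally also rename it, so even a re-enabled 'sa' is harder to guess
        ALTER LOGIN sa WITH NAME = [admin_renamed];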

    Read the article
