We have a smallish web farm of fewer than 5 Windows 2008 servers. Some handle data; most do IIS hosting. Is it a good or bad idea to set up a domain controller and put them all in the same "production" domain?
We want to avoid a world where we have to sync multiple admin passwords between the boxes (or share admin credentials among the team).
Presumably, the DC would be just another VM, so hardware cost doesn't enter into the discussion.
I want to take screenshots in a web browser so that the resulting graphic (.png, .jpg) is a specific resolution, for example: 1024 x 768
Besides adjusting my monitor resolution, is there any way to do this with an application or a browser plug-in/add-on? There is a Firefox plug-in called Window Resizer, but it does not work with newer versions of Firefox.
We are running IIS6 on a Windows Server 2003 machine. We have a PHP-based site (MediaWiki) and an ASP.NET MVC site. How can we configure IIS and Windows to allow access to these web pages only for selected AD domain groups, preferably without touching the applications?
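For the ASP.NET site, I assume a web.config along these lines would do it (a minimal sketch; MYDOMAIN\WebUsers is a made-up placeholder group, and it assumes Integrated Windows Authentication is enabled and anonymous access is disabled in IIS):

    <configuration>
      <system.web>
        <!-- Let IIS/Windows identify the caller -->
        <authentication mode="Windows" />
        <authorization>
          <!-- MYDOMAIN\WebUsers is a placeholder AD group name -->
          <allow roles="MYDOMAIN\WebUsers" />
          <deny users="*" />
        </authorization>
      </system.web>
    </configuration>

What I can't see is how to get the same effect for the PHP/MediaWiki site without touching the application.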
I've installed the beta version of Ubuntu 10.04 server edition (x64), but the system doesn't have an internet connection. Is there a way I can find out what packages are in the apt repository with nothing more than a web browser?
The reason I'm asking is that I will have an internet connection available when the production system goes live, but it simply isn't possible to connect my development system to the internet.
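For reference, I assume the package indexes I'm after live at URLs like these (10.04 = "lucid"; amd64 since the install is x64), which a browser can fetch directly:

    http://archive.ubuntu.com/ubuntu/dists/lucid/main/binary-amd64/Packages.gz
    http://archive.ubuntu.com/ubuntu/dists/lucid/universe/binary-amd64/Packages.gz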
I need some good tutorials that cover the initial configuration of a dedicated server as a secure, powerful, and up-to-date web server.
The server will have CentOS 6 and cPanel already installed, and it will be hosting WordPress and Joomla websites. Security and flexibility are my big concerns.
I know the basics of setting up a website, but I need something more advanced and optimized for WordPress and Joomla.
I am looking for the same JavaScript functionality (Errors/Warnings/etc.) that the Web Developer add-on for Firefox has. Does such a thing exist and where can I find it?
I'm considering hosting a publicly available web application from my home with the latest ubuntu server, the ruby on rails framework, some kind of SSL, and mongrel. What security issues will I face and what should I do about them? I'd appreciate any help.
I'm looking for a way to implement web site blacklisting in ISA server 2006.
I know how to manually define a destination set and block access to it, and I also know how to import XML lists.
What I'm looking for is some publicly available and actively updated blacklist (e.g. "porn sites" or "gambling sites") from a trustworthy source, and a way to automatically fetch updated versions when they are released and use them in ISA.
Can this be done, and how?
What is the best Web Application Firewall (WAF) for IIS? What makes it better than the others? How useful is it at blocking attacks against poorly written code, i.e. when acting as an Intrusion Prevention System (IPS)?
WAFs are required by the PCI-DSS, so if I have to get one, then it should be the best one.
I'd like to avoid the "Expires" header, and use "Cache-Control" only - or maybe the other way around. The headers will account for a significant percentage of my traffic, so I'd prefer not to "use both".
AFAIK, the "Cache-Control" header was standardized in HTTP 1.1, but are there still web caches/proxies in use, which don't understand it?
Note: this could help answer part of my Stack Overflow (bounty) question.
We're adding a second server with Windows NLB for a bit of redundancy (i.e. in case the power goes out on one of the servers; I know it's not the best solution).
How can we keep the data identical between the servers? We don't want to use a SAN or NAS, as that's just something else to go wrong. Customers can upload images with the web app, so changes could be made on either server, as well as us uploading the occasional changed file.
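For context, a simple one-way scheduled mirror like the following (a sketch; the server and share names are made up) wouldn't be enough, since uploads can land on either box:

    rem Hypothetical one-way mirror from WEB1 to WEB2; it would clobber
    rem anything customers had uploaded to WEB2 directly.
    robocopy \\WEB1\wwwroot \\WEB2\wwwroot /MIR /R:1 /W:1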
Thanks
I have a web server in my home network and I'm using ZoneEdit for dynamic DNS. It's perfectly accessible to everyone outside of my local network, but since I switched to a Gateway 2Wire DSL modem/router I'm unable to use the domain to access the server from within my network. I can access it via a local IP or by putting it in my Windows hosts file, but this is annoying to do on every computer and for every subdomain.
Any idea how I can fix this? Thanks!
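For reference, the per-machine workaround I mentioned is just hosts entries like these (the IP and names are made-up examples):

    # C:\Windows\System32\drivers\etc\hosts on each LAN machine
    192.168.1.10   www.example.com
    192.168.1.10   blog.example.com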
I have a Linksys WRT54GL with Tomato installed. Unfortunately I can't use VLANs with this firmware, so I have to switch to OpenWRT (or DD-WRT). Is it safe to flash via the web interface, or should I use the TFTP method?
When trying to save zoning changes in the Zone Administration tool in Brocade Web Tools, I get the status "Failed to commit changes to fabric" and the messages window shows:
    --- start of commit (Enable Config) at: Fri Jul 23 2010 19:43:40 GMT+00:00
    Invalid Transaction
    --- end of commit at: Fri Jul 23 2010 19:43:47 GMT+00:00
I've tried refreshing the config and just re-saving what is already on the switch, but can't get this message to go away.
Is there any advantage to using a Linux machine to develop on instead of Windows?
Everyone at work tells me to switch to Linux, since I'm developing hard-core on Linux anyway.
I manage 40 servers, and do everything from DB to data-backend to developing web services.
I don't find anything wrong with PuTTY. I'm just too lazy to install another OS...
What do you guys think?
I am trying to use rsync to replicate all the files from one web server to another server that could act as a backup if the first one went down. The problem I am having is that the .htaccess file requires the AuthUserFile directive to have the fully qualified path to the .htpasswd file, and I cannot make the paths the same on the two machines.
Does anyone know how I might use the same .htaccess file on two different servers?
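For reference, the relevant part of the .htaccess looks roughly like this (the path is just an example); it's the hard-coded AuthUserFile path that differs between the two machines:

    AuthType Basic
    AuthName "Restricted Area"
    # This absolute path is what breaks when the file is rsynced,
    # because the two servers have different directory layouts.
    AuthUserFile /home/sitea/.htpasswd
    Require valid-user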
Thanks for any help that can be provided.
I'm writing a download manager, and I've noticed that all the web pages I've encountered don't seem to set the Content-Length header, whereas other media types (e.g. images) do. However, when I load a page (in Firefox), I get a progress bar as it loads. What's that based on, if not file size?
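Here is roughly how I'm checking for the header (a minimal Python sketch; example.com is a placeholder URL):

    # Minimal sketch: see whether a server reports a length up front.
    import http.client

    conn = http.client.HTTPConnection("example.com")
    conn.request("HEAD", "/")
    resp = conn.getresponse()

    # Dynamic pages often omit Content-Length and send the body with
    # chunked transfer encoding instead, so the total size isn't known
    # in advance.
    print("Content-Length:", resp.getheader("Content-Length"))
    print("Transfer-Encoding:", resp.getheader("Transfer-Encoding"))
    conn.close()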
If I have a script that is to be executed by the nobody user, why is there a need to assign group read and execute permissions? For example, the article at http://www.zzee.com/solutions/unix-permissions.shtml notes that the permission 755 should be assigned to scripts on a web server. I understand that the user nobody is treated as "others", and as the owner of the script I would like full permissions. Am I missing something?
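To spell out my reading of 755:

    755  =  rwx r-x r-x
            |   |   '-- others (where nobody lands): read + execute
            |   '------ group: read + execute
            '---------- owner: read + write + execute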
We recently installed a web proxy at my workplace. I don't understand why; should this be OK? What are the reasons to use a proxy at work? In my opinion, using a proxy at work is the wrong decision.
I have several web sites set up on one IIS 6 server, distinguished by Host Header.
However, I wish to have one of the sites served by a Linux / Apache server on my network. Do I need to use a reverse proxy add-in for IIS, or is there a simple way to tell IIS to pass on all requests to another server?