Search Results

Search found 2130 results on 86 pages for 'serve u'.

Page 7 of 86

  • How do I serve only internal intranet requests for a site with Apache?

    - by purpletonic
    I have an externally facing web server on our domain that we use for testing multiple sites. I have a site on this server that I want only people from within our intranet to view. How do I prevent requests originating from outside the intranet from seeing this website? I tried the following in my apache config file, but I get a 403 error. <Directory /> Options FollowSymLinks Order Deny,Allow Allow from domain.com Allow from 10.0.0.0/10.255.255.255 Deny from All AllowOverride None </Directory> <Directory /var/www/sitename/public> Options Indexes FollowSymLinks MultiViews Order Deny,Allow Allow from domain.com Allow from 10.0.0.0/10.255.255.255 Deny from All AllowOverride None </Directory>
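    A minimal sketch of a configuration that should achieve this on Apache 2.2, assuming the intranet is the 10.0.0.0/8 range. Note that the 10.0.0.0/10.255.255.255 form above is not a meaningful network/netmask pair, so intranet clients may never match the Allow rule, which would explain the 403:

    ```apache
    # Hedged sketch for Apache 2.2: deny everyone, then allow only the intranet.
    <Directory /var/www/sitename/public>
        Options Indexes FollowSymLinks MultiViews
        AllowOverride None
        Order Deny,Allow
        Deny from all
        # CIDR (or network/netmask such as 10.0.0.0/255.0.0.0) is accepted here.
        Allow from 10.0.0.0/8
        # Host-name rules rely on double-reverse DNS lookups; drop this if unneeded.
        Allow from domain.com
    </Directory>
    ```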

    Read the article

  • Do parent website application pools serve child application pools as well?

    - by Mike G
    I am running a .NET web application in its own application pool on IIS7. The parent website is set to run in its own application pool. Today we noticed a huge number of connections going to IIS. I tried to browse a plain ol' .html page in the directory of the web application and it hangs. I then try to browse another plain .html file in the root of the parent website, and it too hangs. In Performance Monitor, I see there are some 8k connections to the default website and climbing. I can't seem to understand if my application was the problem, or IIS itself. If it was my application, wouldn't the HTML page in the root of the parent website still be able to be served? Edit: Also, if I shut down the app pool for my application, the HTML page in the root of the parent website still cannot be displayed.

    Read the article

  • Why won't IIS serve my website? - 404 Page Not Found

    - by Giffyguy
    Built a brand new server, with a fresh copy of Windows Server 2003 Enterprise x86 Edition. Installed the .NET Framework 1.1, 2.0, 3.5, and 4.0. Added the "Domain Controller" and "Application Server" roles. Created a new website, pointed it to a local directory: C:\Inetpub\angryoctopus.net\. Added the appropriate headers: angryoctopus.net, www.angryoctopus.net, TCP port 80, all IPs. Moved the website content into the local directory. Configured the default document in IIS: Default.aspx. Enabled ASP.NET for this website, and set it to the correct version: 2.0.50727. Configured the zone angryoctopus.net in DNS. Tested DNS lookup here to ensure DNS was functional. Opened website in VS 2008 and re-built (and debugged) to ensure the content was functional. I can clearly see that IIS is responding normally, by browsing directly to my server's IP address. Since this does not use the angryoctopus HTTP header, the default website is displayed instead: the "Under Construction" page. And yet, after all of this, angryoctopus.net still returns 404. Does anybody know what could be wrong? What troubleshooting steps have I forgotten? Is there a command-line diagnostic that might provide more information?
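    Two command-line checks worth running on the box itself (both tools ship with Windows Server 2003 / IIS 6; the site ID below is a placeholder): confirm what the server resolves the name to, and confirm which host headers IIS actually has bound to the site.

    ```
    REM Does the server itself resolve angryoctopus.net to the expected IP?
    nslookup angryoctopus.net

    REM List all IIS 6 sites with their site IDs, IPs, ports and host headers.
    iisweb /query

    REM Dump the raw binding list for one site (replace 1 with the site ID from above).
    cscript %SystemDrive%\Inetpub\AdminScripts\adsutil.vbs get W3SVC/1/ServerBindings
    ```

    If DNS is pointing somewhere unexpected, or the bindings are not what the IIS console appears to show, these two checks usually make it visible.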

    Read the article

  • Kickstart: Serve dynamic kickstart images via a CGI or PHP script?

    - by Stefan Lasiewski
    I'd like to kickstart a couple dozen RHEL6/SL6 servers. However, some of these servers are different and I don't want to create a new ks.cfg file for each class of server. Are there any products which can generate a Kickstart file dynamically on the fly, from a template? For example, if I append a line like this to the KERNEL: APPEND ks=http://192.168.1.100/cgi-bin/ks.cgi Then the script ks.cgi can determine what host this is (via the MAC address), and print out Kickstart options which are appropriate for that host. I could optionally override some options by passing parameters to the script, like this: APPEND ks=http://192.168.1.100/cgi-bin/ks.cgi?NODETYPE=production&IP=192.168.2.80 After we kickstart the server, we activate Cfengine/Puppet on this system and manage the system using our favorite Configuration Management product. We're experimenting with xCAT but it is proving too cumbersome. I've looked into Cobbler, but I'm not sure it does this. Update: A roll-your-own solution is discussed in the O'Reilly book Managing RPM-Based Systems with Kickstart and Yum, Chapter 3 (Customizing Your Kickstart Install), under "Dynamic ks.cfg", which echoes some of the comments in this thread: To implement such a tool is beyond the scope of this Short Cut, but I can walk through the high-level design. Any such solution would mix a data store (the things that change) with a templating solution (the things that don’t change). The data store would hold the per-machine data, such as the IP address and hostname. You would also need a unique identifier, perhaps the hostname, such that you could pick up a given machine’s data. The data store could be a flat file, XML data, or a relational database such as PostgreSQL or MySQL. In turn, to invoke the system, you pass a machine’s unique identifier as a URL parameter. For example: boot: linux ks=http://your.kickstart.server/gen_config?host-server25 In this example, the CGI (or servlet, or whatever) generates a ks.cfg for the machine server25. But where, oh where, is the code for ks.cgi?
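    The book stops short of providing ks.cgi, but a minimal version along the lines it describes is short. The sketch below is illustrative only (the paths, parameter names, and per-host data file are assumptions, not anything from the book or this question): it reads the IP/NODETYPE parameters shown above and substitutes per-host values into a ks.cfg template.

    ```python
    #!/usr/bin/env python
    # Hypothetical ks.cgi sketch: render a kickstart file from a template plus
    # per-host data. Paths, field names and the data-file layout are illustrative.
    import cgi
    import json
    from string import Template

    HOSTS_FILE = "/var/www/kickstart/hosts.json"      # per-host data store
    TEMPLATE_FILE = "/var/www/kickstart/ks.cfg.tmpl"  # template using $hostname, $ip, ...

    form = cgi.FieldStorage()
    ip = form.getfirst("IP", "")
    nodetype = form.getfirst("NODETYPE", "default")

    with open(HOSTS_FILE) as f:
        hosts = json.load(f)   # e.g. {"192.168.2.80": {"hostname": "web01", "disk": "sda"}}

    host = hosts.get(ip, {})

    with open(TEMPLATE_FILE) as f:
        template = Template(f.read())

    body = template.safe_substitute(host, ip=ip, nodetype=nodetype)

    # Anaconda just needs the rendered kickstart back as plain text.
    print("Content-Type: text/plain")
    print("")
    print(body)
    ```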

    Read the article

  • How to configure nginx to serve static contents from RAM?

    - by Vijayendra Tripathi
    I want to set up nginx as my web server. I want to have image files cached in memory (RAM) rather than on disk. I am serving a small page and want a few images always served from RAM. I don't wish to use varnish (or any other such tools) for this, as I believe nginx has the capability to cache contents in RAM. I am not sure how to configure nginx for this. I did try a few combinations but they didn't work; nginx uses the disk all the time to get images. For example, when I tried Apache Benchmark to test with the following command - ab -c 500 -n 1000 http://localhost/banner.jpg - I get the following error - socket: Too many open files (24) - I guess this means nginx is trying to open too many files simultaneously from the disk and the OS is not allowing this operation. Can anyone please suggest a correct configuration? Thanks for considering this message.
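    nginx itself does not hold file bodies in RAM (open_file_cache only caches descriptors and metadata), so the usual approach is to let the OS do it: either rely on the kernel page cache, or explicitly put the hot images on a tmpfs mount (e.g. mount -t tmpfs -o size=64m tmpfs /var/www/ramdisk and copy the images there). A hedged sketch of the nginx side, with illustrative paths and limits:

    ```nginx
    # Relevant fragments of nginx.conf (illustrative values only).
    worker_rlimit_nofile 8192;          # raise nginx's own descriptor limit

    events {
        worker_connections 4096;
    }

    http {
        server {
            listen 80;
            # Serve the hot images straight from the tmpfs mount.
            location ~ \.(jpg|jpeg|png|gif)$ {
                root /var/www/ramdisk;
                expires 30d;
            }
        }
    }
    ```

    Incidentally, the "socket: Too many open files (24)" message is normally printed by ab itself when it hits its own per-process descriptor limit at -c 500; raising the shell's ulimit -n before benchmarking makes it go away and says nothing about nginx.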

    Read the article

  • How to make MAMP PRO secure enough to serve as webserver, if possible?

    - by Andrei
    Hi, my task is to set up a MAMP web server for our website in the easiest way possible, so that it can be managed by my colleagues without experience in server administration. MAMP PRO is an excellent solution, but some people advise against using it to serve external requests. Could you explain why it is a bad idea (in detail if possible) and how to make it secure enough to be a full-scale, not-only-local web server? Is there a better solution?

    Read the article

  • What is the cheapest non-colocation way to serve about 10 static files at a rate of 100 megabits per second?

    - by Mark Maunder
    I've looked at Amazon S3 and it costs roughly $4746 per month for 100 megabits/s (which translates into 31,640 gigabytes of data transferred; that's at a rate of $0.15 per gig). I haven't found a cheaper "cloud" option. I'm curious if there's any other cloud hosting option out there cheaper than S3. Uptime is not an issue because I can build failover for most things into the browser; e.g., I can use JavaScript to say "if the image didn't load then go to this other URL instead." FYI I'm currently using a colocation facility which is about 30% cheaper than S3, and I'm familiar with colo prices - so this question is really about "cloud" services, and by that I mean services where I don't have to worry about the infrastructure.
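    For comparing providers it helps to make the arithmetic behind that figure explicit. A quick sketch (the $0.15/GB rate is the one quoted above; the rest is unit conversion using the same GB convention as the quoted figure):

    ```python
    # Reproduce the S3 estimate above: sustained 100 Mbit/s for a 30-day month.
    def monthly_transfer_cost(mbps, usd_per_gb, days=30):
        megabytes = mbps / 8.0 * days * 24 * 3600   # Mbit/s -> MB transferred
        gigabytes = megabytes / 1024                # same GB convention as the figure above
        return gigabytes, gigabytes * usd_per_gb

    gb, cost = monthly_transfer_cost(100, 0.15)
    print("%.0f GB/month -> about $%.0f" % (gb, cost))   # ~31,641 GB -> ~$4,746
    ```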

    Read the article

  • SharePoint MOSS - Serve HTTP content on an HTTPS page without Mixed Content Warning?

    - by kcb263
    Our "portal-like" SharePoint site is served using HTTPS/SSL. So a user goes to https://web.company.com and sees content and different Web Parts. So far, no problem. The desire now is to have new Web Parts added that either frame HTTP content (such as Weather Bug) or HTTP RSS feeds. The issue that arises is that by doing this, results in a "Mixed Content" warning in the browser. Has anybody successfully been able to implement such a scenario, or one similar to it? The options we have looked at, unsuccessfully, have been: using Apache Reverse Proxy Server mirror an external site Custom Web Parts

    Read the article

  • Is it possible for the Subversion Apache module to serve html files with an html content-type without using the svn:mime-type property?

    - by Martin Pain
    I am aware that if you set the svn:mime-type Subversion property on a .html file to text/html then when viewing the file in a browser through the Subversion module in Apache httpd it will be served with a Content-Type: text/html header, enabling the browser to render it as HTML rather than plain text. However, I am looking for a way to do this without using the svn:mime-type property. I'm aware that you can configure your svn client to automatically add the property - this is not what I want, as I do not want to ensure all users have these settings. I'm also aware that I could create a pre-commit hook that rejects the commit if the properties are not set, in order to force users to set the property - I might fall back to that, but I'm looking for something less intrusive. I'm also aware that I could use a post-commit hook to add the properties automatically on the server-side. I'd rather not do that (as users then have to update immediately after their commit, and it's not trivial to write) - I'm looking for a better alternative. Perhaps something with rewrite rules in the Apache server?
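    One less intrusive idea that is sometimes suggested (worth verifying against your httpd and mod_dav_svn versions, since behaviour has varied) is to let mod_mime assign the type from the requested path inside the Subversion location, instead of relying on svn:mime-type. A sketch:

    ```apache
    <Location /svn>
        DAV svn
        SVNParentPath /var/svn
        # Let mod_mime inspect the path (e.g. a ".html" suffix) even though
        # mod_dav_svn handles the request, so browsed files get a real Content-Type.
        ModMimeUsePathInfo On
        AddType text/html .html .htm
    </Location>
    ```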

    Read the article

  • Is there a switch that will connect directly to my modem and allow my router to serve only as a WiFi connection?

    - by Abner
    Details

    Devices: Internet - 50 Mbps cable internet; Modem - Motorola Surfboard Extreme; Router - Netgear WNDR3700v3; Switch - D-Link DGS-1008G; Wiring - Cat6 24 AWG Ethernet cable; Device configuration - Modem\Router\Switch.

    Internet usage: Wired demand - Xbox 360, 1 gaming PC, 2 PCs for HD video; WiFi demand - 3 Android devices + 1 laptop for browsing and group video chat simultaneously.

    Specifics

    I am experiencing problems with network speed and reliability on both wired and wireless connections. On many occasions I see WiFi speeds that vary from 15 Mbps down to 0.50 Mbps (or less) and ping ranging from 15 ms to 500 ms. These results are from running speedtest.net whenever I notice internet lag. (I have a stretched-out floor plan and old building materials drastically affecting my cellphone signal strength as well.) After reading the "Known Issues" section at http://www.dd-wrt.com/wiki/index.php/Netgear_WNDR3700#Known_Issues I bought the switch and Cat6 cable to increase speed and relieve stress on the router in an attempt to fix the symptoms. I thought I'd use the router in a Modem\Switch\Router configuration, and only use the router for mobile WiFi connections like Android devices or laptops when necessary (hopefully eliminating the problem caused by the router when it is subjected to all those demanding Ethernet connections). When I started unboxing the switch, I noticed the manual of this DGS-1008G shows it being connected in the Modem\Router\Switch order and not in the Modem\Switch\Router configuration I was aiming for. I have not been able to find a solid plan to remedy my specific problem without buying another expensive router. I would like to get the speeds I am paying for without buying another router (my WiFi adapters would also need to be updated if a new router is required, meaning more $$$). I can always sell the switch and get a better one that will bypass the router, because my most demanding internet connections are wired.

    Questions: Can I accomplish a Modem\Switch\Router configuration with the current switch? Is there a different way to get the wired speed I need while providing WiFi only when necessary?

    Read the article

  • What is the technique used to make my IIS 7 serve all pages with an injected iframe

    - by Andre Carlucci
    Since my previous question was closed without an answer, I'm changing it a bit and asking again. All my pages are being served with a malicious iframe injected just before the html tag. The code looks like this: <iframe src= http://117.21.247.171:700/1.htm width=0 height=0></iframe> <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml" dir="ltr" lang="pt-BR"> ... At first I thought it could be something related to WordPress, but my ASP.NET sites are also infected, and even if I create a static HTML file with nothing inside, the iframe is injected. I'm using Windows Server 2008 R2 Standard with IIS 7.5 (build 7600). Does anyone know how this is done in IIS?

    Read the article

  • What methods are there to configure puppet to serve resources for multiple environments?

    - by cclark
    I seem to come across two ways for using puppet in multiple environments: 1) Install a puppetmaster in each environment and only update the recipes from source control for that environment when ready to deploy the recipes in that environment. 2) Use one puppetmaster and use a variable in the puppet.conf of each client to specify the environment and then in the puppetmaster specify a different modulepath for each environment and each of those paths is updated to the branch of the recipe repository intended for that environment (e.g. dev, staging, production). Only running one puppetmaster seems like it is one less piece of infrastructure to keep running but there is some additional complexity in the configuration. Are there additional pros or cons to one of these methods or something which I'm missing entirely?
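    For option 2, config-file environments on a single puppetmaster look roughly like this (a sketch for puppet.conf-style environments; the directory layout and branch names are placeholders):

    ```ini
    # puppet.conf on the puppetmaster: one section per environment, each pointing
    # at a checkout of the matching branch of the recipe repository.
    [production]
        modulepath = /etc/puppet/environments/production/modules
        manifest   = /etc/puppet/environments/production/manifests/site.pp

    [development]
        modulepath = /etc/puppet/environments/development/modules
        manifest   = /etc/puppet/environments/development/manifests/site.pp

    # puppet.conf on each agent: pick that node's environment.
    [agent]
        environment = development
    ```

    Agents can also override per run with puppet agent --environment development, which keeps the single-master option flexible at the cost of the extra configuration noted above.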

    Read the article

  • Why does IIS refuse to serve ASP.NET content?

    - by Michael Haren
    My Windows Server 2003 Std server refuses to serve ASP.NET content. It serves regular HTML just fine, but anything .NET, even a one-line HTML file with an .aspx extension, fails silently. Things I've tried: Nothing in the event log or IIS WWW logs when it fails. Fiddler shows no response. I reinstalled .NET with C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\aspnet_regiis.exe -U and C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\aspnet_regiis.exe -I. I gave obscenely high permissions on everything I can think of (full control, read, write, etc.) to all possibly relevant users (IUSR*, ASP.NET, etc.). I confirmed that the ASP.NET v1 and v2 Web Service Extensions are "Allowed" in IIS. Confirmed that the Server Manager had the IIS and ASP.NET roles enabled. Again, this is the scenario: http://localhost/Test/Default.htm <-- Works great! http://localhost/Test/Default.aspx <-- Bombs silently with no message at all. Any guidance will be much appreciated! Solution: I reinstalled per the instructions below and it works now. Thanks all!

    Read the article

  • How do I serve Ruby on Rails applications on Windows Server 2008?

    - by Adam Lassek
    I have spent the last several hours attempting to get Ruby on Rails running on a Windows server, with no luck. At first I tried configuring a test application through IIS7's FastCGI support, but the documentation for this is not very good. I've been following this blog entry, and this one, and this one, and this one, but everything seems to be missing major steps or to be out of date. And every article keeps linking back to this Howto from rubyonrails.org that doesn't exist. The sense I'm getting is that even if I manage to make this work, IIS's FastCGI isn't good enough to use in a production environment anyway. So it looks like my best bet is to set up a reverse proxy in IIS that points to Apache and Mongrel/Passenger, using ARR and URL Rewrite. Is there anybody else out there stuck deploying a Rails application on a Windows stack? Am I on the right track? Can you give me a better idea of how to configure this? I believe Plesk already installed an instance of Apache/Tomcat running on this server using a different port, so adding another virtual host shouldn't be difficult; the hardest part seems to be setting up the reverse proxy through IIS.
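    For the reverse-proxy route, once ARR and URL Rewrite are installed the IIS side reduces to a single rewrite rule that forwards everything to the Rails backend. A sketch (the backend host and port are assumptions, e.g. Mongrel or Apache/Passenger listening on 3000):

    ```xml
    <!-- web.config for the IIS site acting as the reverse proxy.
         Requires the ARR and URL Rewrite modules, with proxying enabled in ARR. -->
    <configuration>
      <system.webServer>
        <rewrite>
          <rules>
            <rule name="Rails reverse proxy" stopProcessing="true">
              <match url="(.*)" />
              <action type="Rewrite" url="http://localhost:3000/{R:1}" />
            </rule>
          </rules>
        </rewrite>
      </system.webServer>
    </configuration>
    ```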

    Read the article

  • Why does it sometimes take over a minute for Apache2.2 to serve a static page?

    - by Jason Lamoreux
    We have a Windows Server 2003 machine running Apache 2.2. Most of the time there is no load on the server, but we have a notification program on 3400 PCs that can request a small web page that plays a 64KB .wav file. When an event occurs, those 3400 PCs all request the web page over the course of 3 minutes. On a few machines we saw the browser sit in the "connecting" state for a little over a minute before the page painted. What is happening, and how can we speed this up?
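    One thing worth ruling out on Windows is plain thread and backlog exhaustion: with the winnt MPM a single child process serves everything, and a burst of roughly 3400 connections can queue far behind the defaults, which would look exactly like a one-minute "connecting" stall. A hedged sketch of the relevant httpd.conf knobs (values are illustrative, not recommendations):

    ```apache
    # httpd.conf (Apache 2.2 on Windows, mpm_winnt) - illustrative values only
    <IfModule mpm_winnt_module>
        ThreadsPerChild      512    # one thread per simultaneous connection
        MaxRequestsPerChild    0
    </IfModule>

    ListenBacklog    1024           # queue for connections not yet accepted
    KeepAlive        On
    KeepAliveTimeout 2              # keep idle keep-alives from pinning threads
    ```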

    Read the article

  • Any cloud storage service that lets us authenticate the file when we serve the file to our visitors?

    - by TORr0t
    Let's say I want to restrict a file to my visitors. I mean, I have an xx.avi file to be streamed/downloaded, and the visitor paid me for the bandwidth and the size of the file. In Amazon S3 I can't control the file at all (there is a very basic control mechanism, which is not OK for me). The only way is for my server to proxy the file: it fetches the file from the Amazon S3 storage node and sends it to the visitor after authentication approval by a PHP script. But this way I would double up the bandwidth usage, and again there would be a latency problem since my server needs to get the file from Amazon S3. So I was wondering if there is a better solution, or any cloud storage service that lets us control file restrictions for our visitors. Thanks
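    For what it's worth, S3 does have a mechanism aimed at exactly this: time-limited signed (query-string authenticated) URLs. The PHP script only authorizes the visitor and hands out a short-lived link, and S3 serves the bytes directly, so there is no proxying or doubled bandwidth. A sketch with the Python SDK (bucket, key and expiry are placeholders; the PHP SDKs expose the same signing):

    ```python
    # Hand an authorized visitor a pre-signed S3 URL that expires after 10 minutes.
    # Bucket and key names are placeholders.
    import boto3

    s3 = boto3.client("s3")
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": "my-video-bucket", "Key": "xx.avi"},
        ExpiresIn=600,  # seconds the link stays valid
    )
    print(url)  # redirect the paying visitor here; S3 serves the file itself
    ```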

    Read the article

  • What's a good box to serve files on my local network, cross platform?

    - by rogpeppe
    I've installed CAT5e cable and gigabit switches in my house with the goal of having an "always-on" file server in the loft, accessible to both my MacBook and my partner's Windows box. I'd like to find a solution which: uses minimal power; allows me to access as much disk bandwidth as possible; provides glitch-free file access to both Mac OS and Windows; is as cheap as possible, while remaining reliable. Optional, but desirable extras: software or hardware RAID; open source solutions. A SheevaPlug with eSATA seems one possibility, but I'm sure there are any number of other good options.

    Read the article

  • How should I configure my Apache Hosts File to serve a different site for localhost than for my domain/publicip?

    - by rofls
    I'm trying to test out a LAMP setup (with PHP5 specifically) alongside Django, which is already serving a website. I want to do the PHP stuff on localhost for now, so that when I do something like this: curl http://localhost/database/script.php?var=1, I get a response from the PHP server. Right now I'm getting a Django error. I tried something like this in the default file in sites-available: Listen 80 <VirtualHost aaa.bbb.ccc.ddd> ServerName localhost DocumentRoot /home/phpsite </VirtualHost> where aaa.bbb.ccc.ddd is the local IP address, and changing my actual site's settings to specify the public IP, like this: Listen 80 <VirtualHost www.xxx.yyy.zzz> ServerName mysite.com DocumentRoot /srv/www/mysite WSGIScriptAlias / /srv/www/mysite.wsgi </VirtualHost> but then I start getting all kinds of errors when I start Apache, such as "port ::[80] is already in use" or something. I noticed that the hosts file located in /etc/apache2/ is apparently pointing everything to mysite.com, including my local IP as well as 127.0.0.1 and 127.0.1.1. Do I need to change the configuration there too?
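    A hedged sketch of the usual name-based virtual host layout. Listen 80 should appear exactly once in the main config, not in each vhost file; duplicating it is the typical cause of the "already in use" error. IPs, paths and names below are placeholders:

    ```apache
    # ports.conf (or the main config): listen exactly once.
    Listen 80
    NameVirtualHost *:80

    # sites-available/mysite - first vhost, so it is also the default for requests
    # that match no ServerName (e.g. hitting the bare public IP).
    <VirtualHost *:80>
        ServerName mysite.com
        DocumentRoot /srv/www/mysite
        WSGIScriptAlias / /srv/www/mysite.wsgi
    </VirtualHost>

    # sites-available/php-local - only answers when the Host header is "localhost".
    <VirtualHost *:80>
        ServerName localhost
        DocumentRoot /home/phpsite
    </VirtualHost>
    ```

    With this layout, curl http://localhost/database/script.php?var=1 is matched by ServerName localhost and reaches PHP, while requests for mysite.com (or the raw public IP) still go to the Django vhost.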

    Read the article

  • What video format(s) should be used to serve Macs, PCs, and Mobile Devices?

    - by Jeffrey Blake
    In 2007, I started a site based on streaming and downloading poker strategy videos. At that point in time, the best solution I came up with for supporting users of Macs and PCs was to provide the videos in both WMV and FLV formats. Later we added an M4V version to support iPhones/iPods. Obviously, things have changed a bit since that time. I would like to revisit our format decision to see if there is anything better that we could offer, preferably with wider support among all devices (so that we can reduce the number of formats offered, if possible). Is FLV + WMV + M4V the best solution? Is there something else we should consider? What about Android devices?

    Read the article
