Search Results

Search found 4930 results on 198 pages for 'isapi redirect'.


  • Disk usage on IIS, PHP5, performance problems.

    - by Jacob84
    Hi everybody, I'm quite worried about a performance problem I'm facing on one of our production servers. I work for a hosting company, so you can imagine how heterogeneous the applications running here are. It all started with a call from a client complaining about the loading speed of a Joomla site. The setup is IIS6 (Windows 2003) with PHP5 and FastCGI, which normally works pretty well. I tested the loading time and indeed, he was right: 7 or 8 seconds to load, when usually this can be accomplished in 2. Seeing these results, I first checked CPU and RAM. Everything normal: 2 GB of RAM free, 3%-8% CPU activity. That's what I call a relaxed server ;). Unfortunately, digging a little deeper I found the 'PhysicalDisk' counters quite high (above 10), especially the read queues. I used Process Explorer to see which of those processes had the highest deltas, but everything seemed normal. As the problem is specifically related to PHP pages, I checked specific IIS counters, such as current connections, number of CGI requests, and number of ISAPI requests:

        CGI         -> 3 to 7
        ISAPI       -> 5 to 9
        Connections -> 90 to 120 (which appears at the top of the graph)

    More than a solution (I know this is hard to find), I would like to know if you have a specific methodology for facing this kind of problem. Thanks a lot, as always.

    Read the article

  • What are the Consequences of using Relative Location Headers?

    - by Alan Storm
    According to the spec, the Location header used in a redirect requires an absolute URI, including the server name:

        HTTP/1.1 301 Moved Permanently
        ...
        Location: http://example.com/foo/baz/bar

    However, in 2012, most web browsers will recognize a relative path and redirect you to the new location using the original server name:

        HTTP/1.1 301 Moved Permanently
        ...
        Location: /foo/baz/bar

    Are there any negative/surprising consequences to using relative URLs in Location headers? My particular concern is how Google/search engines will interpret this, but if there's anything else I'm not thinking about I'd love to hear it.
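
    For context, most servers will build the absolute form for you. A minimal Apache sketch, where example.com and the paths are placeholders: mod_alias's Redirect emits an absolute target verbatim, and mod_rewrite expands a path-only target into an absolute Location header when the R flag forces an external redirect:

        # mod_alias: the Location header is sent exactly as written here
        Redirect permanent /old/page http://example.com/foo/baz/bar

        # mod_rewrite: a path-only substitution still produces an absolute
        # Location header, built from the host that was requested
        RewriteEngine On
        RewriteRule ^old/(.*)$ /foo/baz/$1 [R=301,L]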

    Read the article

  • Ecommerce item deleted by user: 301 redirect to HOME PAGE or 404 Not Found?

    - by Marco Demaio
    I know this question is somewhat similar to this one, where they recommend using 404, but after reading this other one, where they suggest using 301 when changing site URLs (in that specific case it was due to a redesign/refactoring), I got a bit confused, and I hope someone could clarify this specific example:

    Let's say I have an ecommerce site, and the final user inserted some interesting items in the site, so the ecommerce webapp created the item pages at the URLs http://...?id=20, http://...?id=30, etc. Now let's say some of these items got many external links from other sites, because some people found those items very interesting and linked to them. After some years the final user deletes those items, so obviously the pages/URLs http://...?id=20, http://...?id=30, etc. do not exist anymore, but many pages on the web still link to them. What should the ecommerce site do now, just show a 404 page for those items? But, I'm confused: wouldn't this lose all the Google PR passed by the external links to the item pages? So isn't it better to 301 redirect to the HOME PAGE, which at least passes the PR to the HOME PAGE?

    EDIT: Well, according to the answers, the best thing to do so far is a 404/410. To make this question more complete, I would like to talk about a special case, just to make sure I understood properly. Let's say the user creates those items again (the ones he previously deleted at point 4); maybe he changes their names and descriptions a bit, but they are basically the same items. The webapp has no way to know these newly added items are the old items, so it obviously creates them as new items with new URLs http://...?id=100, http://...?id=101. Does it make sense at this point to 301 redirect the old URLs to the new ones?

    MORE EDIT (it would be VERY IMPORTANT TO UNDERSTAND): Well, according to the clever answers received so far, it seems that for the special case explained in my last EDIT I could use 301, since it's nothing deceptive: the new page is basically a replacement for the old page in terms of content. This is done to keep the PR passed from external links and also for a better user experience. But besides the user experience, which is debatable (*1), in order to preserve PR from external broken links why not just always use 301? In my understanding Google dislikes duplicate content, but are we sure that a 301 redirect to the HOME PAGE is seen as duplicate content by Google? Google itself suggests 301 redirecting index.html to the document root, so if they considered 301 to be duplicate content, wouldn't that be duplicate content too? Why do they suggest it? Let me provoke you: why not just add a 301 to the HOME PAGE for every not-found page?

    (*1) As a user, when I follow a broken URL from some external link to some website's page, I would stick around more on that website if I got redirected to the HOME PAGE rather than seeing a 404 page, where I would think the website doesn't even exist anymore and maybe not even try to go to its HOME PAGE.
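
    For reference, a minimal .htaccess sketch of the distinction the answers draw (Apache mod_rewrite; item.php and the IDs are hypothetical stand-ins for the real URLs): an item deleted with no replacement gets a 410 Gone, while a recreated item gets a 301 to its specific successor, never to the HOME PAGE:

        RewriteEngine On
        # Item 20 was deleted for good: tell crawlers it is gone (410)
        RewriteCond %{QUERY_STRING} ^id=20$
        RewriteRule ^item\.php$ - [G,L]
        # Item 30 was recreated as item 101: pass its link equity along (301)
        RewriteCond %{QUERY_STRING} ^id=30$
        RewriteRule ^item\.php$ /item.php?id=101 [R=301,L]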

    Read the article

  • Rules of Holes #2: You Are Still in a Hole

    - by ArnieRowland
    OK. So you followed the First Rule of Holes: you stopped digging yourself in deeper. But now what? You are still in a Hole. Your situation has not changed much, but at least you are no longer making it worse. You need to redirect the digging effort into escape and avoidance efforts. The Hole has a singular purpose: consuming all of your time and effort. AND it has succeeded! But now you are going to redirect your efforts for your own survival. You have encountered the Second Rule of Holes: When you...(read more)

    Read the article

  • Search engine bots accessing strange URLs

    - by casasoft
    We have ELMAH enabled on our site and get errors whenever a Page Not Found error is triggered on the website. We recently redesigned the website, so we understand that search engine robots might try to access previously indexed pages, resulting in Page Not Found errors. For this reason, we have set up permanent redirects from such previously indexed pages to the respective new pages. The website in question is www.chambercollege.com; for example, a previously indexed URL was www.chambercollege.com/special-offers.aspx. This page is no longer accessible, so we have created the necessary permanent redirect to the respective page at www.chambercollege.com/en/content/special-offers-161/. Now we are starting to receive Page Not Found errors from search engine bots (e.g. the MSN bot) trying to access the URL www.chambercollege.com/special-offers.aspx/images/shadow_right.jpg/. Any idea how a search engine could make up that strange URL, and do you have any suggestions of what best to do?

    Read the article

  • Prevent Azure subdomain indexation

    - by Leg10n
    Let me explain my situation: I have an Azure website (with an azurewebsites.net subdomain) and a custom domain.com, built with ASP.NET MVC. Both are being indexed by Google, but I've noticed the custom domain is being penalized and doesn't show up in results; it only shows when I search for "site:domain.com". I want to remove and block the azurewebsites.net subdomain from Google. I've read the "possible" solutions:

    - Adding robots.txt: won't work, because the subdomain and the domain serve the exact same content, so subdomain.azurewebsites.net/robots.txt leads to the same file as domain.com/robots.txt, removing the domain as well.
    - Adding the robots meta tag: the same situation as the previous point.
    - I'm using a CNAME record to point the domain at the subdomain, so I can't redirect to a subdirectory.

    Do you have any other ideas?
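
    One commonly suggested workaround is to serve a different robots.txt depending on the Host header, so the two hostnames stop sharing one file. A conceptual sketch in Apache mod_rewrite terms (on Azure the same idea would be expressed as an IIS URL Rewrite rule in web.config; robots-block.txt is a hypothetical file containing a blanket Disallow):

        RewriteEngine On
        # Only requests arriving via *.azurewebsites.net see the blocking file
        RewriteCond %{HTTP_HOST} \.azurewebsites\.net$ [NC]
        RewriteRule ^robots\.txt$ /robots-block.txt [L]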

    Read the article

  • Running .NET code in XML file [closed]

    - by Stuart McIntosh
    We have 2 servers: one already configured with .NET, which works fine, and a new one which appears to be configured the same, but when I open an XML page in Internet Explorer it complains about the <% tag. We have IIS on Windows Server 2003 SP2. The website is configured with .NET 1.1.4322. In the ISAPI extensions we have set the .XML extension to use c:\windows\microsoft.net\framework\v1.1.4322\aspnet_isapi.dll. But the page:

        <property name="documentmaxage" value="0"/>
        <property name="documentmaxstale" value="0"/>
        <var name="m_Prompt_Path" />
        <form id="InitVoiceXmlDoc">
            <block>
                <assign name="m_Prompt_Path" expr="&quot;<% Response.Write(Request.QueryString["m_Prompt_Path"]); %>&quot;"/>
            </block>
        </form>

    gives the error:

        The XML page cannot be displayed
        Cannot view XML input using XSL style sheet. Please correct the error and then click the Refresh button, or try again later.
        The character '<' cannot be used in an attribute value. Error processing resource 'http://localhost:11119/fails.xml'. Lin...
        &quo...

    We have the same config on another server which works fine. So are there other options, apart from the ISAPI extensions, that I need to look at?

    Read the article

  • Personalized URL with pre-populated data

    - by AttackingHobo
    I think I am going to use Google spreadsheets and an embedded form. I see that it is possible to pre-populate the fields with URL parameters. So I think I just need a way to redirect with the correct data from a database when a user goes to their personalized URL. I have no idea how to do the second part, or if this is even the best way to do it. How do I set up personalized URLs? How do I redirect from those URLs to the spreadsheets? Is this even the best method for pre-filled forms?
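
    One way to do the second part, sketched in Apache terms under stated assumptions: keep a username-to-value lookup in a RewriteMap (which must live in the server or vhost config, not .htaccess) and redirect each personalized URL to the pre-populated form. The map file path, FORM_KEY, and the entry_0 field name are placeholders to check against the actual embedded form:

        # /etc/apache2/purl-users.txt maps usernames to prefill values, e.g.:
        #   jsmith  John%20Smith
        RewriteEngine On
        RewriteMap users txt:/etc/apache2/purl-users.txt
        RewriteRule ^u/([a-zA-Z0-9]+)$ https://docs.google.com/spreadsheet/viewform?formkey=FORM_KEY&entry_0=${users:$1} [R=302,L]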

    Read the article

  • Moving between sites using SAML

    - by System Down
    I'm tasked with developing an SSO system, and was guided towards using the SAML spec. After some research I think I understand the interaction between a Service Provider and an ID Provider and how a user's identity is confirmed. But what happens when I redirect the user to another Service Provider? How do I ascertain the user's identity there? Do I send his SAML assertion tokens along with the redirect request? Or does the second Service Provider need to contact the ID Provider all over again?

    Read the article

  • Checking virtual subdomains

    - by Persian.
    I created a project that checks the subdomain and redirects to the existing subdomain (username), but I can't figure out why, when the username is in the database, it can't show it. On my local system it works fine, but when I upload it to the server it doesn't work. (For testing, of course, I switched the commented lines to the uncommented ones.) It shows this error: "Object reference not set to an instance of an object." My code in Page_Load is:

        //Uri MyUrl = new Uri(Request.Url.ToString());
        //string Url = MyUrl.Host.ToString();
        Uri MyUrl = new Uri("http://Subdomain.Mydomain.com/");
        string Url = MyUrl.Host.ToString();
        string St1 = Url.Split('.')[0];
        if ((St1.ToLower() == "Mydomain") || (St1.ToLower() == "Mydomain")) {
            Response.Redirect("Intro.aspx");
        } else if (St1.ToLower() == "www") {
            string St2 = Url.Split('.')[1];
            if ((St2.ToLower() == "Mydomain") || (St2.ToLower() == "Mydomain")) {
                Response.Redirect("Intro.aspx");
            } else {
                object Blogger = ClsPublic.GetBlogger(St2);
                if (Blogger != null) {
                    lblBloger.Text = Blogger.ToString();
                    if (Request.QueryString["id"] != null) {
                        GvImage.DataSourceID = "SqlDataSourceImageId";
                        GvComments.DataSourceID = "SqlDataSourceCommentsId";
                        this.BindItemsList();
                        GetSubComments();
                    } else {
                        SqlConnection scn = new SqlConnection(ClsPublic.GetConnectionString());
                        SqlCommand scm = new SqlCommand("SELECT TOP (1) fId FROM tblImages WHERE (fxAccepted = 1) AND (fBloging = 1) AND (fxSender = @fxSender) ORDER BY fId DESC", scn);
                        scm.Parameters.AddWithValue("@fxSender", lblBloger.Text);
                        scn.Open();
                        lblLastNo.Text = scm.ExecuteScalar().ToString();
                        scn.Close();
                        GvImage.DataSourceID = "SqlDataSourceLastImage";
                        GvComments.DataSourceID = "SqlDataSourceCommentsWId";
                        this.BindItemsList();
                        GetSubComments();
                    }
                    if (Session["User"] != null) {
                        MultiViewCommenting.ActiveViewIndex = 0;
                    } else {
                        MultiViewCommenting.ActiveViewIndex = 1;
                    }
                } else {
                    Response.Redirect("Intro.aspx");
                }
            }
        } else {
            object Blogger = ClsPublic.GetBlogger(St1);
            if (Blogger != null) {
                lblBloger.Text = Blogger.ToString();
                if (Request.QueryString["id"] != null) {
                    GvImage.DataSourceID = "SqlDataSourceImageId";
                    GvComments.DataSourceID = "SqlDataSourceCommentsId";
                    this.BindItemsList();
                    GetSubComments();
                } else {
                    SqlConnection scn = new SqlConnection(ClsPublic.GetConnectionString());
                    SqlCommand scm = new SqlCommand("SELECT TOP (1) fId FROM tblImages WHERE (fxAccepted = 1) AND (fBloging = 1) AND (fxSender = @fxSender) ORDER BY fId DESC", scn);
                    scm.Parameters.AddWithValue("@fxSender", lblBloger.Text);
                    scn.Open();
                    lblLastNo.Text = scm.ExecuteScalar().ToString();
                    scn.Close();
                    GvImage.DataSourceID = "SqlDataSourceLastImage";
                    GvComments.DataSourceID = "SqlDataSourceCommentsWId";
                    this.BindItemsList();
                    GetSubComments();
                }
                if (Session["User"] != null) {
                    MultiViewCommenting.ActiveViewIndex = 0;
                } else {
                    MultiViewCommenting.ActiveViewIndex = 1;
                }
            } else {
                Response.Redirect("Intro.aspx");
            }
        }

    and my class:

        public static object GetBlogger(string User) {
            SqlConnection scn = new SqlConnection(ClsPublic.GetConnectionString());
            SqlCommand scm = new SqlCommand("SELECT fUsername FROM tblMembers WHERE fUsername = @fUsername", scn);
            scm.Parameters.AddWithValue("@fUsername", User);
            scn.Open();
            object Blogger = scm.ExecuteScalar();
            if (Blogger != null) {
                SqlCommand sccm = new SqlCommand("SELECT COUNT(fId) AS Exp1 FROM tblImages WHERE (fxSender = @fxSender) AND (fxAccepted = 1)", scn);
                sccm.Parameters.AddWithValue("fxSender", Blogger);
                object HasQuty = sccm.ExecuteScalar();
                scn.Close();
                if (HasQuty != null) {
                    int Count = Int32.Parse(HasQuty.ToString());
                    if (Count < 10) {
                        Blogger = null;
                    }
                }
            }
            return Blogger;
        }

    Where in my code is the problem?

    Read the article

  • Google indexed my home page incorrectly: How can I fix it?

    - by louis_coetzee
    I finished my website and launched it. I think I had a problem with my robots.txt, so I changed it (on 03/08/2012) to look like this:

        # Allows all bots
        Sitemap: http://www.mysite.co.za/sitemap.xml

        User-agent: *
        Disallow: /dashboard/

    When I google my domain.co.za, I get this back:

        Home
        A description for this result is not available because of this site's robots.txt – learn more.
        You've visited this page 3 times. Last visit: 2012/08/15

    Now, since I fixed this and added a 301 redirect to send mysite.co.za to www.mysite.co.za, I would love it if Googlebot would come for a visit. Is there anything I can do to get this fixed?
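
    For reference, the canonical-host 301 mentioned above is usually a two-line rule. A minimal .htaccess sketch, assuming Apache mod_rewrite, with mysite.co.za standing in for the real domain:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^mysite\.co\.za$ [NC]
        RewriteRule ^(.*)$ http://www.mysite.co.za/$1 [R=301,L]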

    Read the article

  • Rules of Holes -#2: You Are Still in a Hole

    - by ArnieRowland
    OK. So you followed the First Rule of Holes: you stopped digging yourself in deeper. But now what? You are still in a Hole. Your situation has not changed much, but at least you are no longer making it worse. You need to redirect the digging effort into escape and avoidance efforts. The Hole has a singular purpose: consuming all of your time and effort. AND it has succeeded! But now you are going to redirect your efforts for your own survival. You need to look around, take stock of the situation....(read more)

    Read the article

  • What are the disadvantages of installing an SSL certificate for the naked domain?

    - by user1744649
    I might buy an SSL certificate for my site. I know that it will help me in many ways, but will there be disadvantages too? E.g.:

    - If I load an image from another server (using plain http), will that alert the customer that something is wrong?
    - Will I be able to use all existing scripts like phpBB, AWStats, etc. without a problem?
    - Will there be any issue if I redirect a page from my domain.com to my subdomain.domain.com using a meta refresh or .htaccess?
    - Will there be any issue if I redirect a page from my subdomain.domain.com to my domain.com using a meta refresh or .htaccess?
    - Any other issue that I might run into?

    Thanks.
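
    On the first point: yes, browsers flag plain-http resources embedded in an https page as mixed content and warn the visitor. On the redirects, a server-side 301 is generally preferred over a meta refresh, and it behaves the same under SSL; note, though, that a certificate for the naked domain alone will not cover subdomain.domain.com, so an https request to the subdomain warns before any redirect can run. A minimal .htaccess sketch, assuming Apache mod_rewrite and treating the hostnames as placeholders:

        RewriteEngine On
        # 301 any page on the naked domain to the same path on the subdomain
        RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
        RewriteRule ^(.*)$ http://subdomain.domain.com/$1 [R=301,L]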

    Read the article

  • Simple mod_rewrite Question

    - by user5358
    Hello, I want to have everything that looks like this: /1/2/3/4/5/[...] redirect to this: /index.php?u=/1/2/3/4/5/[...] unless the requested string is a specific file. So anything that doesn't have a ".", I want to rewrite to "index.php?u=[...]". I'll then parse the URI segments in PHP to determine what the user is requesting. I've been looking around for how to do this, but I have only a very rough understanding of regular expressions and have been unable to find an example of how to do it. Thanks!
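
    A minimal .htaccess sketch of that front-controller pattern, assuming Apache mod_rewrite (the "no dot" test is the questioner's own rule; many setups also add a RewriteCond %{REQUEST_FILENAME} !-f guard to skip real files explicitly):

        RewriteEngine On
        # Leave any request containing a dot (real files: .css, .js, images) alone
        RewriteCond %{REQUEST_URI} !\.
        # Everything else is rewritten internally to the front controller
        RewriteRule ^(.*)$ /index.php?u=/$1 [L,QSA]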

    Read the article

  • Drupal with clean urls turned on is putting question marks in URL

    - by aussiegeek
    I have a Drupal site with clean URLs. The pages load correctly, but then the URL is rewritten with a question mark in it, which I don't want the user to see. My .htaccess is:

        <IfModule mod_rewrite.c>
          RewriteEngine on

          # If your site can be accessed both with and without the 'www.' prefix, you
          # can use one of the following settings to redirect users to your preferred
          # URL, either WITH or WITHOUT the 'www.' prefix. Choose ONLY one option:
          #
          # To redirect all users to access the site WITH the 'www.' prefix,
          # (http://example.com/... will be redirected to http://www.example.com/...)
          # adapt and uncomment the following:
          # RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
          # RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]
          #
          # To redirect all users to access the site WITHOUT the 'www.' prefix,
          # (http://www.example.com/... will be redirected to http://example.com/...)
          # uncomment and adapt the following:
          # RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
          # RewriteRule ^(.*)$ http://example.com/$1 [L,R=301]

          # Modify the RewriteBase if you are using Drupal in a subdirectory or in a
          # VirtualDocumentRoot and the rewrite rules are not working properly.
          # For example if your site is at http://example.com/drupal uncomment and
          # modify the following line:
          # RewriteBase /drupal
          #
          # If your site is running in a VirtualDocumentRoot at http://example.com/,
          # uncomment the following line:
          RewriteBase /

          # Rewrite URLs of the form 'x' to the form 'index.php?q=x'.
          RewriteCond %{REQUEST_URI} !(connect|administration)
          RewriteCond %{REQUEST_FILENAME} !-f
          RewriteCond %{REQUEST_FILENAME} !-d
          RewriteCond %{REQUEST_URI} !=/favicon.ico
          RewriteRule ^(.*)$ index.php?q=$1 [L,QSA]
        </IfModule>

    Read the article

  • How can I erase the traces of Folder Redirection from the Default Domain Policy

    - by bruor
    I've taken over from an IT outsourcer and have hit a struggle now that we're starting a migration to Windows 7. Someone decided to set up Folder Redirection in the Default Domain Policy. I've since configured redirection in another policy at an OU level. No matter what I do, the Windows 7 systems pick up the Default Domain Policy folder redirection settings only. I keep getting entries in the event log showing that the previously redirected folders "need to be redirected" with a status of 0x80000004. From what I can tell this just means that it's redirecting them locally. Is there a way I can wipe that section of the GPO clean so it's no longer there? I'm hesitant to try to reset the Default Domain Policy to complete defaults.

    UPDATE 6-26: I found that the following condition occurred and was causing the grief here. I had already implemented the new policies for clients, and for some reason XP was working great but 7 was refusing to process. The DDP was enforced. Because of this, and the fact that the folder redirection policies were set to redirect back to the local profile upon removal, it was forcing clients to pick up its "redirect to local" settings. Steps to recreate the issue:

    - Create a new test OU and policy.
    - Create some folder redirection settings, set to redirect to local upon removal.
    - Remove the settings on that GPO.
    - Refresh your view of the GPO and check the settings. You'll notice that the settings show "not configured" entries for folder redirection.
    - Enforce this GPO.
    - Create another sub-OU.
    - Create a GPO linked to this sub-OU and configure some folder redirection settings.
    - Watch as the enforced GPO's "not configured" settings override the policy you just defined.

    I've had to relink the DDP to all OUs that have "block inheritance" enabled, and disable the "enforced" option on the DDP as a workaround. I'd love to re-enable enforcement of the DDP, but until I can erase the traces of folder redirection settings from the DDP, I think I'm stuck.

    Read the article

  • Apache2, Tomcat6, and proxy redirects

    - by Randal Hale
    So here is my question - go easy and slow. I'm a GIS consultant and a general hack with Linux. I inherited this volunteer job essentially because I knew more than the rest of the team - or the rest of the team isn't as stubborn as I am... With that said, a number of people were mucking around in the server before I got involved, so I've been cleaning up a lot of things. The domain names have been changed to protect the innocent.

    I have a server running Apache2 (port 80) and Tomcat6 (8080) on Ubuntu Server 10.04. There is a virtual host on Apache2 called "Runner" (the domain is runner.org). I have mod_proxy loaded. I am trying to redirect everyone that visits runner.org to http://some.ip.address:8080/openrunner-webapp/. So far I've gotten runner.org assigned to the Apache2 server. Someone set up a redirect in the httpd.conf file, but I believe it needs to go into the virtual host. I tried setting the redirect in the virtual host as:

        ProxyPass / http://localhost:8080/openrunner-webapp

    All that does is show me the root of the Apache web server. Anyway, I'm stuck.
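
    A minimal sketch of the virtual-host form, assuming mod_proxy and mod_proxy_http are both enabled (trailing slashes matter here, and ProxyPassReverse keeps the backend's own redirects from leaking the :8080 address back to the browser):

        <VirtualHost *:80>
            ServerName runner.org
            ProxyPreserveHost On
            ProxyPass        / http://localhost:8080/openrunner-webapp/
            ProxyPassReverse / http://localhost:8080/openrunner-webapp/
        </VirtualHost>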

    Read the article

  • iptables port forwarding works only for localhost

    - by Venki
    Below is my iptables config. I use it to access a Node.js website running on port 9000 through port 80. This works fine only if I access the website through localhost/loopback. When I try to use the eth0 IP assigned by my router through DHCP, e.g. 192.168.0.103, it does not work. I am not able to figure out what is wrong here; I've already burnt a day on this and still can't figure it out :(

    Edit (more information): Earlier I was using this configuration to develop the website, and I had configured the domain name to point to 127.0.0.1 in the /etc/hosts file. It was working fine, but now I am trying to deploy the website on a VPS with a static IP. This configuration does not work with the static IP either.

        # redirect port 80 to port 9000
        *nat
        :PREROUTING ACCEPT [57:3896]
        :INPUT ACCEPT [0:0]
        :OUTPUT ACCEPT [4229:289686]
        :POSTROUTING ACCEPT [4239:290286]
        -A PREROUTING -p tcp -m tcp --dport 80 -j REDIRECT --to-ports 9000
        -A OUTPUT -d 127.0.0.1/32 -p tcp -m tcp --dport 80 -j REDIRECT --to-ports 9000
        COMMIT

        # Allow HTTP and HTTPS connections from anywhere (the normal ports for websites and SSL).
        -A INPUT -p tcp --dport 80 -j ACCEPT
        -A INPUT -p tcp --dport 443 -j ACCEPT
        -A INPUT -p tcp --dport 9000 -j ACCEPT
        -A INPUT -j REJECT
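
    One thing worth checking, offered as an assumption rather than a confirmed fix: packets generated on the machine itself traverse the nat OUTPUT chain, not PREROUTING, so browsing to 192.168.0.103 from the same box never matches the loopback-only OUTPUT rule above. Adding a matching rule for the eth0 address (or dropping the -d restriction) covers the local test case:

        # nat table: also redirect locally generated traffic aimed at the eth0 address
        -A OUTPUT -d 192.168.0.103/32 -p tcp -m tcp --dport 80 -j REDIRECT --to-ports 9000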

    Read the article

  • How can I avoid an error in this .htaccess file?

    - by mipadi
    I have a blog. The blog is stored under the /blog/ prefix on my website. It has the usual URLs for a blog, so articles have URLs in the format /blog/:year/:month/:day/:title/. First and foremost, I want to automatically redirect visitors to the www subdomain (in case they leave that off), and internally rewrite the root URL to /blog/, so that the front page of the blog appears on the front page of the site. I have accomplished that with the following set of rewrite rules in my .htaccess file:

        RewriteEngine On

        # Rewrite monkey-robot.com to www.monkey-robot.com
        RewriteCond %{HTTP_HOST} ^monkey-robot\.com$
        RewriteRule ^(.*)$ http://www.monkey-robot.com/$1 [R=301,L]

        RewriteRule ^$ /blog/ [L]
        RewriteRule ^feeds/blog/?$ /feeds/blog/atom.xml [L]

    That works fine. The problem is that the front page of the blog now appears at two distinct URLs: / and /blog/. So I'd like to redirect the /blog/ URL to the root URL. Initially I tried to accomplish this with the following set of rewrite rules:

        RewriteEngine On

        # Rewrite monkey-robot.com to www.monkey-robot.com
        RewriteCond %{HTTP_HOST} ^monkey-robot\.com$
        RewriteRule ^(.*)$ http://www.monkey-robot.com/$1 [R=301,L]

        RewriteRule ^$ /blog/ [L]
        RewriteRule ^blog/?$ / [R,L]
        RewriteRule ^feeds/blog/?$ /feeds/blog/atom.xml [L]

    But that gave me an infinite redirect (maybe because of the preceding rule?). So then I tried this set:

        RewriteEngine On

        # Rewrite monkey-robot.com to www.monkey-robot.com
        RewriteCond %{HTTP_HOST} ^monkey-robot\.com$
        RewriteRule ^(.*)$ http://www.monkey-robot.com/$1 [R=301,L]

        RewriteRule ^$ /blog/ [L]
        RewriteRule ^blog/?$ http://www.monkey-robot.com/ [R,L]
        RewriteRule ^feeds/blog/?$ /feeds/blog/atom.xml [L]

    But I got a 500 Internal Server Error with the following log message:

        Invalid command '[R,L]', perhaps misspelled or defined by a module not included in the server configuration

    What gives? I don't think [R,L] is a syntax error.
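
    Two observations, offered as a sketch rather than a verified fix. The loop is real: / is internally rewritten to /blog/, which the next rule redirects back to /, forever; testing the original request line instead of the rewritten path breaks the cycle. And that "Invalid command '[R,L]'" error is what Apache reports when the flags end up on a line of their own, so the 500 is likely a stray line break in the file rather than the [R,L] syntax:

        RewriteEngine On
        # Only externally requested /blog URLs get redirected; the internal
        # rewrite of / to /blog/ never appears in THE_REQUEST, so no loop.
        RewriteCond %{THE_REQUEST} ^[A-Z]+\ /blog[/\ ]
        RewriteRule ^blog/?$ / [R=301,L]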

    Read the article

  • How can I tell GoogleBot that a subdirectory is now a subdomain? [migrated]

    - by cwd
    I had about a million pages of a catalog indexed under a subdirectory, and now that's moved to a subdomain. GoogleBot is crawling each one of them and getting a 301 redirect to the new location. Even though I have set up the redirect rule in the Apache sites-enabled configuration file (i.e. Apache does the redirect early on; PHP is not even getting loaded), the server isn't handling the load well. GoogleBot is making around 5 requests per second, and on top of my normal traffic that is hiking up the CPU for a few hours at a time. I checked in Webmaster Tools and the corresponding documentation for a way to let Google know that the content had moved from a subdirectory to a subdomain, but with little luck. Basically the most helpful thing I saw said to just send 301 headers for the new location. How can I tell GoogleBot that a subdirectory is now a subdomain? If that is not an option, how can I more efficiently send 301 redirects out for a particular subdomain? I was thinking perhaps of the Nginx server, but I'm not sure that I can run both Apache and Nginx side by side on port 80 for different subdomains.
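
    On the efficiency point, a minimal sketch under stated assumptions (Apache with mod_alias; /catalog/ and the hostnames are hypothetical stand-ins): a single RedirectMatch in the virtual host is about as cheap as Apache can serve a 301, since it is one compiled regex with no per-directory .htaccess lookups and no application code:

        <VirtualHost *:80>
            ServerName www.example.com
            # One rule covers every page that moved to the subdomain
            RedirectMatch 301 ^/catalog/(.*)$ http://catalog.example.com/$1
        </VirtualHost>

    If the CPU is still suffering, Webmaster Tools also offers a crawl-rate setting that can slow GoogleBot down while it works through the backlog.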

    Read the article

  • Performance-wise .htaccess

    - by purpler
    Here's my .htaccess template; I wonder if anything could be added to increase website performance:

        # Defaults
        AddDefaultCharset UTF-8
        DefaultLanguage en-US
        ServerSignature Off
        FileETag None
        Header unset ETag
        Options -MultiViews
        #Options All -Indexes

        # Force the latest IE version or ChromeFrame
        <IfModule mod_setenvif.c>
          <IfModule mod_headers.c>
            BrowserMatch MSIE ie
            Header set X-UA-Compatible "IE=Edge,chrome=1" env=ie
          </IfModule>
        </IfModule>

        # Proxy X-UA Setup
        <IfModule mod_headers.c>
          Header append Vary User-Agent
        </IfModule>

        # Rewrites
        Options +FollowSymlinks
        RewriteEngine On
        RewriteBase /

        # Redirect to non-WWW
        RewriteCond %{HTTPS} !=on
        RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
        RewriteRule ^(.*)$ http://%1/$1 [R=301,L]

        # Redirect to WWW
        RewriteCond %{HTTP_HOST} ^domain.com
        RewriteRule (.*) http://www.domain.com/$1 [R=301,L]

        # Redirect index to root
        RewriteRule ^(.*)index\.(php|html)$ /$1 [R=301,L]

        # Caching
        ExpiresActive On
        ExpiresDefault A0
        Header set Cache-Control "public"

        # 1 Year Long Cache
        <FilesMatch "\.(flv|fla|ico|pdf|avi|mov|ppt|doc|mp3|wmv|wav|png|jpg|jpeg|gif|swf|js|css|ttf|eot|woff|svg|svgz)$">
          ExpiresDefault A31622400
        </FilesMatch>

        # Proxy Caching
        <FilesMatch "\.(css|js|png)$">
          ExpiresDefault A31622400
          Header set Cache-Control "private"
        </FilesMatch>

        # Protect against DOS attacks by limiting file upload size
        LimitRequestBody 10240000

        # Proper SVG serving
        AddType image/svg+xml svg svgz
        AddEncoding gzip svgz

        # GZip Compression
        <IfModule mod_deflate.c>
          <FilesMatch "\.(php|html|css|js|xml|txt|ttf|otf|eot|svg)$">
            SetOutputFilter DEFLATE
          </FilesMatch>
        </IfModule>

        # Error page
        ErrorDocument 404 /404.html

        # Deny access to sensitive files
        <FilesMatch "\.(htaccess|ini|log|psd)$">
          Order Allow,Deny
          Deny from all
        </FilesMatch>

    Read the article

  • 9000+ different subdomains 301 to main domain, .htaccess apache

    - by Karim
    I bought a domain that had various subdomains with pages such as:

        Kim.domain.com/whatever
        John.domain.com/whatever1
        Lizo.domain.com/whatever2
        Simon.domain.com/whatever1

    These were in the thousands, and there are also links to these pages. I'd like to do a 301 redirect for all these URLs to http://domain.com. Any idea how this could be done? This is for an Apache web server and needs to be done via .htaccess. I have implemented the solution from the answer below:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} !^www.domain.com$
        RewriteCond %{HTTP_HOST} !^$
        RewriteRule ^/(.*)$ http://www.domain.com/$1 [L,R=301]

    However I have a slight problem: I would like to redirect all subdomains and subfolders to http://www.domain.com/, with the exception of http://domain.com/subfolder/, which should redirect to http://www.domain.com/subfolder/ (i.e. an exception when there is no subdomain). I'm guessing I need to add an exception; what can I do to implement this?
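
    A sketch of that exception, assuming Apache mod_rewrite in a per-directory .htaccess (where the pattern must not expect a leading slash): handle the bare domain first, preserving the path, then send every other non-www host to the homepage:

        RewriteEngine On
        # Bare domain: keep the requested path, just add www
        RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
        RewriteRule ^(.*)$ http://www.domain.com/$1 [L,R=301]
        # Any other subdomain: collapse everything to the homepage
        RewriteCond %{HTTP_HOST} !^www\.domain\.com$ [NC]
        RewriteCond %{HTTP_HOST} !^$
        RewriteRule ^.*$ http://www.domain.com/ [L,R=301]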

    Read the article

  • Ubuntu Pound Reverse Proxy Load Balancing Based off active server load?

    - by Andrew
    I have Pound installed on a load balancer. It seems to work okay, except that it randomly assigns the backend server to forward the request to. I've put one backend machine under so much load that it went into swap, and I can't even ssh into it to test this scenario. I would like the load balancer to realize that the machine is overloaded and send requests to a different backend machine, but it doesn't. I've read the man page and it seems like the directive "DynScale 1" is what should monitor this, but it still redirects to the overloaded server. I've also put "HAport 22" on the backends, figuring that since I can't ssh in, neither could the load balancer, and it would consider the backend server dead until it sheds the load and responds again, but that didn't help either. If anyone could help with this, I'd appreciate it. My current config is below.

        ######################################################################
        ## global options:
        User "www-data"
        Group "www-data"
        #RootJail "/chroot/pound"

        ## Logging: (goes to syslog by default)
        ## 0 no logging
        ## 1 normal
        ## 2 extended
        ## 3 Apache-style (common log format)
        LogLevel 3

        ## check backend every X secs:
        Alive 5
        DynScale 1
        Client 1200
        TimeOut 1500

        # poundctl control socket
        Control "/var/run/pound/poundctl.socket"

        ######################################################################
        ## listen, redirect and ... to:

        ## redirect all requests on port 80 to SSL
        ListenHTTP
            Address 192.168.1.XX
            Port 80
            Service
                Redirect "https://xxx.com/"
            End
        End

        ListenHTTPS
            Address 192.168.1.XX
            Port 443
            Cert "/files/www.xxx.com.pem"
            Service
                BackEnd
                    Address 192.168.1.1
                    Port 80
                    HAport 22
                End
                BackEnd
                    Address 192.168.1.2
                    Port 80
                    HAport 22
                End
            End
        End

    Read the article

  • Host Primary Domain from a subfolder

    - by TandemAdam
    I am having a problem making a subdirectory act as the public_html for my main domain, and getting a solution that works with that domain's subdirectories too. My hosting allows me to host multiple sites, which are all working great. I have set up a subfolder under my ~/public_html/ directory called /domains/, where I create a folder for each separate website, e.g.:

        public_html
          domains
            websiteone
            websitetwo
            websitethree
            ...

    This keeps my sites nice and tidy. The only issue was getting my "main domain" to fit into this system. It seems my main domain is somehow tied to my account (or to Apache, or something), so I can't change the "document root" of this domain. I can define the document roots for any other domains ("Addon Domains") that I add in cPanel, no problem, but the main domain is different. I was told to edit the .htaccess file to redirect the main domain to a subdirectory. This seemed to work great, and my site works fine on its home/index page. The problem I'm having is that if I try to navigate my browser to, say, the images folder (just for example) of my main site, like this:

        www.yourmaindomain.com/images/

    then it seems to ignore the redirect and shows the entire server directory in the URL, like this:

        www.yourmaindomain.com/domains/yourmaindomain/images/

    It still actually shows the correct "Index of /images" page and the list of all my images. Here is an example of the .htaccess file that I am using:

        RewriteEngine on
        RewriteCond %{HTTP_HOST} ^(www.)?yourmaindomain.com$
        RewriteCond %{REQUEST_URI} !^/domains/yourmaindomain/
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ /domains/yourmaindomain/$1
        RewriteCond %{HTTP_HOST} ^(www.)?yourmaindomain.com$
        RewriteRule ^(/)?$ domains/yourmaindomain/index.html [L]

    Does this .htaccess file look correct? I just need to make my main domain behave like an addon domain, so its subdirectories adhere to the redirect rules.
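
    One likely mechanism, offered as an assumption: if a request arrives without its trailing slash (/images), the internal rewrite maps it to /domains/yourmaindomain/images, and mod_dir then issues its trailing-slash 301 against the rewritten filesystem path, exposing it in the address bar. Redirecting to the canonical slashed URL before the main rewrite runs avoids that:

        RewriteEngine on
        # Add the trailing slash at the public URL level first, so mod_dir
        # never gets a chance to redirect to the internal /domains/... path.
        RewriteCond %{HTTP_HOST} ^(www.)?yourmaindomain.com$
        RewriteCond %{DOCUMENT_ROOT}/domains/yourmaindomain%{REQUEST_URI} -d
        RewriteRule ^(.*[^/])$ /$1/ [R=301,L]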

    Read the article
