Search Results

Search found 2190 results on 88 pages for 'htaccess'.


  • Is it okay to use random URLs instead of passwords?

    - by stew
    Is it considered "safe" to use a URL constructed from random characters, like this? http://example.com/EU3uc654/Photos I'd like to put some files/picture galleries on a webserver that are only to be accessed by a small group of users. My main concern is that the files should not get picked up by search engines or curious power users who poke around my site. I've set up an .htaccess file, only to notice that clicking http://user:pass@url/ links doesn't work well with some browsers/email clients, prompting dialogs and warning messages that confuse my not-too-computer-savvy users.
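
    If you do keep the random-URL approach, you can at least tell well-behaved search engines not to index anything under it. A minimal sketch, assuming mod_headers is available (the path is the example from the question):

        # .htaccess inside /EU3uc654/: ask crawlers not to index or follow the gallery
        <IfModule mod_headers.c>
            Header set X-Robots-Tag "noindex, nofollow"
        </IfModule>

    This guards against indexing, not against anyone who learns the URL; the random path itself remains the only secret.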

    Read the article

  • Using mod_rewrite to change http to https? (not redirect)

    - by PaulHanak
    This might sound a little crazy, but bear with me. I basically have an include file, let's say inc-navigation.html, that has absolute paths (http://www.pathtoimage.com/image.com) and is on EVERY PAGE. Using SSL, I can't use that same include file because it doesn't reference https:// What a pain! So I was thinking of using .htaccess to rewrite all references to HTTP into HTTPS when the browser requests an https page. Just to be clear, I don't want to "redirect", just "replace". So I have this:

        RewriteCond %{HTTPS} !=on
        RewriteRule ^http$ https

    but it doesn't seem to be working. I probably have the syntax wrong, though. :) That is, if this type of thing is even possible!?
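
    mod_rewrite only operates on request URLs, not on the HTML a page emits, so no rewrite rule can replace links inside the response body. What the question describes is output filtering, which mod_substitute can do. A hedged sketch (domain taken from the question; assumes mod_substitute is loaded):

        # rewrite http:// links to https:// in outgoing HTML
        AddOutputFilterByType SUBSTITUTE text/html
        Substitute "s|http://www.pathtoimage.com|https://www.pathtoimage.com|i"

    The simpler fix, though, is usually protocol-relative paths in the include itself (src="//www.pathtoimage.com/..."), which follow whatever scheme the page was requested with.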

    Read the article

  • Apache: lists of all directives for a context?

    - by ajsie
    In the Apache online documentation, each directive belongs to a context, e.g. server config, virtual host, directory, .htaccess and so on. I wonder if there is a list of all directives belonging to each context, e.g. a list of all directives valid in the virtual host context, so I know exactly which ones I can use? Also, where can I find the directives for Apache modules: on a central page, or does each module have its own documentation page (e.g. mod_rewrite)?

    Read the article

  • Can you have multiple PHP 5.x versions with cPanel 11.3x?

    - by atomicguava
    I've been Googling around for a while on this one but I haven't found a good answer yet! Is it possible to set up cPanel 11.3x so that it can run different versions of PHP 5.x (e.g. 5.2, 5.3 and later on 5.4) for each of the configured Apache vhosts/domains? It would be great to do this using .htaccess, php.ini or a setting within cPanel itself. I've seen EasyApache 3 mentioned in the documentation, but even after reading through several times I haven't seen a definitive yes or no on whether this is possible. Please let me know if you need any more info. This was the documentation I found for EA3: http://goo.gl/IH1sP
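
    Where a host has compiled multiple PHP binaries as CGI/FastCGI handlers, the usual per-directory switch is an AddHandler line in .htaccess. A hedged sketch only: the handler name below is an assumption, since it depends entirely on how EasyApache registered the extra PHP builds:

        # .htaccess for a vhost that should run PHP 5.2 (handler name hypothetical)
        AddHandler application/x-httpd-php52 .php

    Whether such handlers exist at all is exactly the EasyApache question; a stock cPanel build of that era shipped a single system PHP.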

    Read the article

  • Redirect SSL pages using a header statement based on port

    - by bob's your brother
    I found this in the header.php file of an e-commerce site. Is this better done in a .htaccess file? Also, what would happen to any POST parameters that get caught in the header statement?

        // flip between secure and non-secure pages
        $uri = $_SERVER['REQUEST_URI'];
        // move to secure SSL pages if required
        if (substr($uri,1,12) == "registration") {
            if($_SERVER['SERVER_PORT'] != 443) {
                header("HTTP/1.1 301 Moved Permanently");
                header("Location: https://".$_SERVER['HTTP_HOST'].$_SERVER['REQUEST_URI']);
                exit();
            }
        }
        // otherwise use regular non-SSL pages
        else {
            if($_SERVER['SERVER_PORT'] == 443) {
                header("HTTP/1.1 301 Moved Permanently");
                header("Location: http://".$_SERVER['HTTP_HOST'].$_SERVER['REQUEST_URI']);
                exit();
            }
        }
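
    The POST question has a concrete answer: on a 301 (or 302), browsers re-issue the request as a GET, so any POST body is lost across the redirect; a 307 preserves the method and body. The same logic as a hedged .htaccess sketch:

        RewriteEngine On
        # force HTTPS on the registration pages
        RewriteCond %{HTTPS} !=on
        RewriteRule ^registration https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
        # force plain HTTP everywhere else
        RewriteCond %{HTTPS} =on
        RewriteCond %{REQUEST_URI} !^/registration
        RewriteRule ^ http://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

    Either way the redirect happens before any form handler runs, so forms that POST across the HTTP/HTTPS boundary will silently drop their data unless a 307 is used.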

    Read the article

  • Multi-site Drupal install with sites on different ports using Apache ip-based hosting?

    - by MattB
    In the past we've used name-based virtual hosting in Apache. We recently converted websites to SSL and had to go the IP-based route. As a result, we currently have an instance that is set up as follows (both on the same IP):

        www.domain.com  using port 80
        dev.domain.com  using port 8080

    Is this scenario possible using Drupal multi-site functionality? We find that dev.domain.com works and reads the correct "dev" database (using the dev settings), but it reads theme files from the "www" site instead, which is not what we want. Is the culprit the dev site's .htaccess file? Apache is listening on 8080 and does use the proper DB settings, just not the correct theme files. One other note: browsing dev.domain.com:8080 gives an error: "The page isn't redirecting properly". Should we just purchase a new IP address for the dev website, or would this not help? Any advice would be appreciated. Thanks.
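
    Theme and asset paths in Drupal are built from $base_url, so a dev site that reads the right database but pulls theme files from the live site usually has the live URL baked into (or defaulted into) its settings. A hedged sketch for Drupal 6/7, with the path and domain taken from the question:

        // sites/dev.domain.com/settings.php
        // include the port so theme/asset URLs point at the dev vhost
        $base_url = 'http://dev.domain.com:8080';

    Note that Drupal's multi-site directory lookup also understands port-prefixed names (e.g. sites/8080.dev.domain.com), which can matter when two sites share a hostname and differ only by port.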

    Read the article

  • Web-Server directory permissions

    - by MLS
    Hello All, I would like some help understanding web-server directory permissions (Apache, CentOS, PHP, MySQL). For example, I have multiple sites in /var/www/html, in paths like /var/www/html/www_domainname_com. Inside each site I might have a path like /lib/mysql/ holding PHP connection code, database config, etc. What should my permissions be so that someone cannot just browse to that directory? Should I just .htaccess them? I have apache:apache as the owner of all my web directories. Can I prevent someone from crawling certain directories of my web server? I have a robots.txt, but what is to say a crawler obeys it? So to sum up: 1. What is the best owner/permission set for sensitive files that the web server, PHP or MySQL needs, but that I don't want people browsing to? 2. Can I prevent outright crawling of portions of the site?
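
    A hedged sketch of the .htaccess approach, using Apache 2.2 syntax since CentOS of that era shipped 2.2 (the path is from the question):

        # /var/www/html/www_domainname_com/lib/mysql/.htaccess
        # deny all HTTP access; PHP can still include these files from disk
        Order deny,allow
        Deny from all

    Includes happen through the filesystem, not HTTP, so blocking web access doesn't break require_once(). The more robust variant of the same idea is moving config files above the document root entirely, where no URL maps to them.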

    Read the article

  • Apply rewrite rule to all but the files (recursive) in a subdirectory?

    - by user784637
    I have an .htaccess file in the root of the website that looks like this:

        RewriteRule ^some-blog-post-title/ http://website/read/flowers/a-new-title-for-this-post/ [R=301,L]
        RewriteRule ^some-blog-post-title2/ http://website/read/flowers/a-new-title-for-this-post2/ [R=301,L]
        <IfModule mod_rewrite.c>
        RewriteEngine On
        ## Redirects for all pages except for files in wp-content to website/read
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_URI} !/wp-content
        RewriteRule ^(.*)$ http://website/read/$1 [L,QSA]
        #RewriteRule ^http://website/read [R=301,L]
        RewriteBase /
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
        </IfModule>

    My intent is to redirect people to the new blog post location if they request one of those special blog posts (condition 1). If that's not the case, they should be redirected to http://website.com/read (condition 2). Nothing from http://website.com/wp-content/* should be redirected (condition 3). So far conditions 1 and 3 are being met. How can I meet condition 2?
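
    One plausible reason condition 2 fails: the !-f and !-d conditions exempt every real file and directory, so anything that exists on disk falls through to the WordPress block below, and requests already under /read would loop if they didn't. A hedged sketch of a catch-all that only guards the two paths that must not move:

        # redirect everything that is not wp-content and not already /read
        RewriteCond %{REQUEST_URI} !^/wp-content/
        RewriteCond %{REQUEST_URI} !^/read/
        RewriteRule ^(.*)$ http://website/read/$1 [R=301,L,QSA]

    The explicit R=301 also matters: without it, a rule whose target is a full URL still redirects, but the status defaults to 302.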

    Read the article

  • Multi-server management

    - by user788721
    We are running a website that allows users to create their own content, then share it through an iframe. We would like to add more servers to host the user content, keeping the main one for our website. Each user has a link like xxxx.com/content989856 or xxxx.com/content45454545. We were thinking of two options:
    1. using an .htaccess on the main server that redirects to the right server; but if the main server is down, then all the content is down as well;
    2. using a subdomain that depends on where the content is hosted; but then if we move a user's content from one domain to another, we have to change his links as well.
    Do you know a better option, or are those really the only two available? I am wondering how big websites like YouTube handle this problem. Thank you very much for your help,
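
    For the first option, the usual refinement is a lookup table the rewrite engine consults, so the main server stays a thin router rather than hosting content itself. A hedged sketch (RewriteMap only works in server/vhost config, not .htaccess; proxying requires mod_proxy; file paths and hostnames are invented for illustration):

        # httpd.conf: map content IDs to the backend that stores them
        RewriteEngine On
        RewriteMap contenthost txt:/etc/apache2/content-hosts.txt
        RewriteRule ^/content([0-9]+)$ http://${contenthost:$1|store1.example.com}/content$1 [P,L]

    with /etc/apache2/content-hosts.txt containing lines like "989856 store2.example.com". The single-point-of-failure concern remains, which is why large sites put this routing behind redundant load balancers or DNS rather than one Apache box.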

    Read the article

  • Mac OS X default editor for .dotsystemfiles

    - by jasonkuhrt
    This is a seemingly simple question, but I can't find an answer so far despite searching quite a bit. I'd like that when I open a .dotsystemfile in Finder (e.g. .htaccess or .vimrc) it opens in a different editor than TextEdit. Doing the regular change-all in the info panel won't do the trick, as it gives the following error: "An error occurred while changing the application that opens ".vimrc" because not enough information is available. Do you want to open ".vimrc" with "MacVim.app"?" This isn't a huge issue, but it is like a small splinter that I'd love removed. Thanks for any helpful information.

    Read the article

  • How can I disallow a user's scripts from accessing anything above their user folder?

    - by Jaxo
    This is probably an extremely simple question to answer for anybody who knows what they're doing, but I can't find any answers myself. I'm trying to set up a subdirectory for my good friend to test his PHP scripts on my (Apache) hosting plan. I don't want to let him access anything else on my server, however, for obvious reasons. His FTP login already leads him to the proper directory, which does not allow navigating any higher than its root (mydomain.com/friend/). I would like the same behavior to be applied to any scripts, so he cannot simply <?php print_r(glob("../*")); ?> and view all my files. I'm thinking this can be done with an .htaccess file setting the DocumentRoot somewhere, but I can't have the file available for modification inside the user directory. Is this possible without majorly rewiring the web server? I've tried Googling all sorts of things to describe my problem, but without the proper terminology, all I get is "shared hosting" websites and people trying to sell me security packages.
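
    What the question describes is what PHP's open_basedir is for: it confines a script's filesystem access to listed paths, regardless of what the FTP chroot allows. A hedged sketch for mod_php, placed in the vhost rather than in a user-modifiable .htaccess, with paths assumed from the question:

        <Directory /var/www/mydomain.com/friend>
            # glob("../*"), fopen(), include etc. now fail outside these paths
            php_admin_value open_basedir "/var/www/mydomain.com/friend:/tmp"
        </Directory>

    php_admin_value cannot be overridden from .htaccess or ini_set(), which is the point; under suPHP/FastCGI the equivalent setting lives in that user's php.ini instead.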

    Read the article

  • ASP 3.0 Folder/File Permissions Settings (ASP Classic)

    - by ASP Pee-Wee
    Dear Stack Exchange, Hi, I have built a form input page in HTML that posts to a handler/processor .asp file. The handler .asp file contains only <% Insert VBScript Here %> and no HTML output whatsoever. It was never intended to be a "web viewable" page the way an .asp home page or HTML file would be. It's supposed to be for my eyes only, not the public's; however, it does need to take info posted by the public and do something with it on its end. I have used VBScript/ASP 3.0 to build the handler file and would like to know how to keep someone from viewing the actual VBScript in it. I am aware of obfuscation, but I would like to keep prying eyes from even being able to look at the obfuscated code. I realize that the server executes the .asp file before outputting anything to the browser, so I guess my main concern is that someone might somehow "download" the handler .asp file and then view its contents on their machine. Assuming the handler .asp file is where it is, behind the root, and is on a Windows server (no .htaccess approach), how could one protect it so that it could never be viewed or simply pulled down via anonymous FTP or something like that? Is there something like "script only" permissions that the system administrator could set up for a particular folder? Remember, with shared hosting I can't go above the root. If so, would the form still be able to post? How would any of you go about protecting the .asp file, in addition to obfuscation? Any help would be greatly appreciated. Thanks, ASP Pee-Wee

    Read the article

  • Google Analytics www 301 causing issues with In-Page Analytics

    - by conrad10781
    The closest question I could find to my problem is this one. The similarity: I have a profile in Google Analytics (GA) that has been collecting data for a year. The domain setting in GA is "http://example.com". The site, however, redirects any non-www request to www.example.com via a typical .htaccess refinement. We do this to keep the traffic on the load balancers. I don't know the method the original poster had in place, but we're doing a 301 on any non-www request to the www equivalent, which I believe is fairly standard. Where I differ from that question is the error message I receive when trying to load In-Page Analytics. I'm instead receiving: "Error: The Website in your settings (http://example.com), redirects into a different domain. (http://www.example.com). In-Page Analytics currently works on only one domain. Note that www.example.com and example.com are NOT considered to be on the same domain. Also, make sure you're not redirecting from http:// to https:// or vice versa." I understand what's being explained; it just seems as though this can't be the end-all. I tried updating the Analytics settings, which from day one have been set to "One domain with multiple subdomains", but I don't see any option to change the URL (which is currently set to http://example.com and not http://www.example.com). I'd prefer not to have to change the URL if that was at all possible, but I can't seem to find any documentation that provides a possible solution.
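
    For reference, the "typical .htaccess refinement" described, as a hedged sketch with example.com standing in for the real domain:

        RewriteEngine On
        # 301 any bare-domain request to the www host
        RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

    Since In-Page Analytics compares the profile URL against where the page actually lives, the usual resolution suggested is making the profile's default URL match the post-redirect host (http://www.example.com) rather than removing the redirect.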

    Read the article

  • Lost website rankings as a result of using a 302 redirect instead of a 301

    - by george peris
    I have an online store, and during the last 2 days I noticed a big change in its SERPs for some keywords. The site is indexed in local Google for 5 keywords, and the online store is based on Magento 1.7. 5 days ago, I set up an SSL certificate on the online store in order to get better ranking results. I set up the SSL, followed the instructions from Magento to make permanent 301 redirects from the HTTP URLs to the HTTPS URLs, and replaced all the URLs of the online store with HTTPS. When I saw in the SERPs that the rankings for some keywords were going up and down, I checked some URLs to see if all was going well with the 301 redirects, and found that the redirects were 302 and not 301, which is a big bug caused by Magento. I solved the problem with the .htaccess, but the rankings for the homepage have still disappeared. I fetched the site again using Google Webmaster Tools and am waiting to see. Can you please advise if I did something wrong, and what else I could do?

    Read the article

  • Switching website DNS with minimal downtime

    - by Gavin
    I need to change my domain's DNS, but here in Thailand the ISPs don't follow any standards for refreshing their cached records, so it can take the full two days. I have a new website with about 1,000 members who are currently sending messages and uploading photos. I don't want both sites running on the old and new servers at the same time, because then I'd have to manually merge the databases and transfer uploaded photos; it would be a huge pain. Both servers also have a unique URL from my host, e.g. website12345.hostingcompany.com (old host) and website67890.hostingcompany.com (new host). I don't have much experience with this, but I think what I can do is use .htaccess on www.mysite.com to do a masked redirect to the new server's URL (website67890.hostingcompany.com). Is it possible to do this and keep all URLs masked? For example, www.mysite.com/profile/username would actually be loading website67890.hostingcompany.com/profile/username. From Google searches it sounds like this is possible, but I don't understand why it's allowed, given the security issues: what's keeping people from masking their site as URLs like facebook.com? I could really use some advice here! Thanks!
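
    The masking described is really server-side proxying: www.mysite.com keeps answering every request itself and quietly fetches the content from the new box. A hedged sketch (requires mod_proxy and a host that permits it; hostnames from the question):

        # .htaccess on the old server: serve everything from the new server, URL bar unchanged
        RewriteEngine On
        RewriteRule ^(.*)$ http://website67890.hostingcompany.com/$1 [P,L]

    That also answers the Facebook worry: anyone can proxy public content through their own domain, but the address bar still shows their domain, not facebook.com, and cookies stay bound to the real host. Usefully here, proxying from old to new means visitors whose ISPs still resolve to the old server have their messages and uploads land on the new server anyway, so nothing needs merging afterwards.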

    Read the article

  • Am I correctly handling duplicate URLs for my homepage?

    - by Rob Goldstein
    I own a job-search site named www.conservationjobboard.com and have a concern about how the domain is viewed by search engines. The issue is that when the site was first designed, the default page was left as default.php, but the homepage was actually JobBoard.php. To handle this, the default.php page performed a redirect to the JobBoard.php file whenever www.conservationjobboard.com/ was requested. The main problem resulted because the redirect was a temporary redirect, causing search engines to index conservationjobboard.com/ and conservationjobboard.com/JobBoard.php as 2 separate pages. This has since been corrected via the .htaccess file, so that JobBoard.php is now the default file for the root directory, eliminating the need for the redirect. Problem is that search engines still show both URLs in search results (one including JobBoard.php and one that ends with /). Another potential problem is that some of my early backlinks point to conservationjobboard.com/JobBoard.php while the rest point to conservationjobboard.com. The 2 outstanding questions are as follows:
    1. Is my domain still being penalized by search engines like Google for having duplicate homepage URLs?
    2. Are all of the backlinks to my homepage now being counted together, or is the total number of backlinks being split between the 2 different URLs?
    If you think there are still issues with how we have this set up, I was wondering if you could give me advice on what we should do differently. Thanks.
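
    A hedged sketch of the standard consolidation step, so the legacy JobBoard.php links pass their value to the root URL (domain from the question; assumes mod_rewrite):

        RewriteEngine On
        # 301 direct requests for /JobBoard.php to the canonical homepage,
        # matching THE_REQUEST so the internal default-document mapping isn't caught in a loop
        RewriteCond %{THE_REQUEST} \s/JobBoard\.php [NC]
        RewriteRule ^JobBoard\.php$ http://www.conservationjobboard.com/ [R=301,L]

    With a permanent redirect in place, engines fold the two entries together over time, and backlinks to either URL count toward the same page; a rel="canonical" link in the homepage head pointing at the / URL is a belt-and-braces addition.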

    Read the article

  • Problem with Google Webmaster Tools sitemap

    - by Alex
    I have a WP MU 3.6.1 install with Domain Mapping (0.5.4.3), W3TC (0.9.3) and Google XML Sitemaps (4.0 BETA). I have 4 different sitemaps:

        sub-1.com/sitemap.xml
        sub-2.com/sitemap.xml
        sub-3.com/sitemap.xml
        sub-4.com/sitemap.xml

    In Google Webmaster Tools I got 59 errors and 14 warnings.
    Sitemap errors: "We encountered an error while trying to access your Sitemap. Please ensure your Sitemap follows our guidelines and can be accessed at the location you provided and then resubmit. General HTTP error: 404 not found. Sitemap: sub-2.com/sitemap-pt-post-2011-02.xml" etc. (but when I click on my sitemap links they work fine).
    Sitemap warnings: "URLs not accessible. When we tested a sample of the URLs from your Sitemap, we found that some URLs were not accessible to Googlebot due to an HTTP status error. All accessible URLs will still be submitted. Sitemap: sub-2.com/sitemap-misc.xml HTTP Error: 404 URL: /sitemap.html" (but when I click on my sitemap links they work fine).
    Sitemap index errors: "URLs not accessible. When we tested a sample of the URLs from your Sitemap, we found that some URLs were not accessible to Googlebot due to an HTTP status error. All accessible URLs will still be submitted. HTTP Error: 404 URL: /sitemap-pt-post-2010-09.xml" (again, the links work fine in a browser).
    Web pages: 3,276 submitted, 3,247 indexed.
    What do I have to put in Network Admin > Performance (W3TC) > Page Cache > Cache Preload > Sitemap URL? I have added "/sitemap.xml".
    My robots.txt: http://pastebin.com/3K2U0mQa
    My .htaccess: http://pastebin.com/efJJ6zwy
    How can I make it work right?

    Read the article

  • Apache Server-Side Includes Refuse to Work (Tried everything in the docs but still no joy)

    - by raindog308
    Trying to get Apache server-side includes to work. Really simple: just want to include a footer on each page. Apache 2.2:

        # ./httpd -v
        Server version: Apache/2.2.21 (Unix)
        Server built: Dec 4 2011 18:24:53
        Cpanel::Easy::Apache v3.7.2 rev9999

    mod_include is compiled in:

        # /usr/local/apache/bin/httpd -l | grep mod_include
        mod_include.c

    And it's in httpd.conf:

        # grep shtml httpd.conf
        AddType text/html .shtml
        DirectoryIndex index.html.var index.htm index.html index.shtml index.xhtml index.wml index.perl index.pl index.plx index.ppl index.cgi index.jsp index.js index.jp index.php4 index.php3 index.php index.phtml default.htm default.html home.htm index.php5 Default.html Default.htm home.html
        AddHandler server-parsed .shtml
        AddType text/html .shtml

    In the web directory I created a .htaccess with:

        Options +Includes

    And then in the document, I have:

        <h1>next should be the include</h1>
        <!--#include virtual="/footer.html" -->
        <h1>include done</h1>

    And I see nothing in between those headers. Tried file=, also with/without absolute path. Is there something else I'm missing? I see the same thing on another unrelated server (more or less stock CentOS 6), so I suspect the problem is between keyboard and chair...
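
    Two things commonly bite here. First, Options in .htaccess only takes effect if the enclosing <Directory> has AllowOverride Options (or All); otherwise the line is silently ignored or triggers a 500, depending on the override settings. Second, current Apache docs favour the filter-based wiring over the old server-parsed handler. A hedged sketch combining both checks:

        # httpd.conf (inside the relevant <Directory>)
        AllowOverride Options
        # filter-based SSI setup
        AddType text/html .shtml
        AddOutputFilter INCLUDES .shtml
        Options +Includes

    If parsing happens but the include itself fails, an error is normally logged and "[an error occurred while processing this directive]" appears in the page; seeing nothing at all points at the directives never being parsed, which makes AllowOverride the first suspect.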

    Read the article

  • Conditional attribute in XML - most concise solution?

    - by Lech Rzedzicki
    I am tasked with setting up conditional profiling: a method of tagging chunks of XML with an attribute, which is then used as a conditional value to extract a subset of that XML. Have a look at another definition/example: DITA profiling. The XML is documents that are equivalent to printed books, i.e. documents that are often looked at by a human, even if indirectly. Therefore I am looking at a few requirements here:
    1. keeping the value list brief, so it doesn't affect the readability of the document;
    2. being able to process it with standard XML tools (a space-separated list inside an attribute is still probably fine, but I'd rather not use too much regexp for this);
    3. being obvious to various users, including 3rd parties, as to which content goes where;
    4. being easy to maintain going forward.
    One easy solution is an attribute that explicitly lists every output a chunk applies to. The problems with this: 1. as the list grows, the value of the attribute can get a bit verbose; 2. one needs to explicitly state every value, even in a scenario of one output vs everything else. Therefore I am also looking at other approaches, such as:
    1. Using + and - modifiers, Apache .htaccess style, to override the default cascading of profiling: by default all content goes everywhere, and if we want to exclude a bit we just say "-kindle". It does require parsing the whole tree, is not supported by editing tools, and one needs to regexp the attribute value a bit deeper...
    2. Using an intermediate file to define groups of values such as "other" or "non-print" (there is an example of this in DITA). It allows concise XML as well as different grouping and values for each document, but it does create a level of abstraction that may make it a little less obvious to a 3rd party.
    Altogether, if you received such XML and were tasked to process it, which option would you rather receive? If you have any experiences like this, even in unrelated areas such as builds, don't hesitate to comment!
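
    A hedged illustration of the two flavours being compared (element and attribute names invented for the example):

        <!-- explicit list: every target named -->
        <para audience="print web kindle">...</para>

        <!-- modifier style: everything by default, with exclusions -->
        <para audience="-kindle">...</para>

    The DITA analogue of the intermediate-file approach is a grouping definition (e.g. "non-print" expanding to web and kindle) resolved at build time, which keeps the inline attributes short at the cost of one more file to maintain.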

    Read the article

  • Apache, Rewrite Rule and Directories

    - by milo5b
    My sites-available/ file looks something like the following:

        <VirtualHost *:80>
            ServerAdmin webmaster@mysite
            ServerName mysite.co.uk
            ServerAlias www.mysite.co.uk
            DocumentRoot /home/mysite.co.uk/htdocs/
            <Directory /home/mysite.co.uk/htdocs/>
                Options -Indexes FollowSymlinks MultiViews
                AllowOverride All
                Order allow,deny
                allow from all
            </Directory>
            ErrorLog ${APACHE_LOG_DIR}/mysite.co.uk/error.log
            LogLevel warn
            CustomLog ${APACHE_LOG_DIR}/mysite.co.uk/access.log combined
        </VirtualHost>

    In .htaccess (in htdocs/), I have (amongst others) the following rewrite rule:

        RewriteRule ^enquiries$ /enquiries.php

    Somehow I also have a directory named "enquiries" (/home/mysite.co.uk/htdocs/enquiries/), and when I hit the URL www.mysite.co.uk/enquiries I get:

        HTTP/1.1 301 Moved Permanently
        Date: Mon, 10 Dec 2012 18:53:37 GMT
        Server: Apache/2.2.16 (Debian)
        Location: http://www.mysite.co.uk/enquiries/
        Vary: Accept-Encoding
        Content-Type: text/html; charset=iso-8859-1

    and a browser would display the directory's content. Now, I could easily rename the folder and get it sorted, but I would like to understand what's going on here. What would be the correct way to configure Apache so that it won't behave this way, and instead listens to the rewrite rule? If I did not explain myself clearly, please feel free to ask more questions; I'd be happy to answer them. Thanks!
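
    The 301 is mod_dir at work: when a URL matches a real directory without a trailing slash, mod_dir redirects to the slash form, and here that wins over the per-directory rewrite. A hedged sketch of the usual counter:

        # .htaccess: keep mod_dir from appending the slash so the rewrite rule wins
        DirectorySlash Off
        RewriteEngine On
        RewriteRule ^enquiries$ /enquiries.php [L]

    DirectorySlash Off applies to everything under it, so with autoindexing already disabled (Options -Indexes) the main side effect is that any legitimate directory URLs now need their trailing slash written explicitly.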

    Read the article

  • Server outputs the source code of a PHP page

    - by Akhilesh B Chandran
    I have a Shared Hosting package with HostGator, on which I'm hosting around 4 websites. They are just some simple sites that aren't likely to attract many visitors. A few days ago, when I accessed one of my sites via a browser, it output the PHP source of index.php instead of the rendered HTML. I think the server was a bit busy or something at the time. I've heard that Facebook once had a similar incident where the home page's source was exposed. So how do I take preventive measures against this? I always use phpBB's style of coding: each subpage, common functions, etc. are separated into subfolders, and in PHP I just include_once() or require_once() them. These subfolders also have a .htaccess file that denies outside access to the files inside. In addition, the main page (index) defines a constant, and the first line of each subpage (in the separate folders) checks whether this constant is set; if not, it calls die(). I am looking for ways to prevent raw code from being output when a page is accessed. Thanks in advance :)
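
    For reference, a hedged sketch of the guard pattern the question describes (constant and file names invented):

        // index.php: the entry point defines the constant, then pulls in helpers
        define('IN_SITE', true);
        require_once 'lib/common.php';

        // lib/common.php: refuse to run when fetched directly
        if (!defined('IN_SITE')) {
            die('Direct access not allowed.');
        }

    Note that none of this helps when the PHP handler itself fails (the incident described): the server then serves .php files as plain text, guards and all. The realistic mitigation on shared hosting is keeping credentials and config above the document root, so a handler failure exposes no secrets.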

    Read the article

  • https:// search results appearing on Google for purely http:// site

    - by hydrurga
    I started weeding through my site's search results from Google today, using a site: search, to determine if there are any links that cause 404s and thus need redirecting. To my amazement I noticed numerous https:// results relating to various pages. My site doesn't have an SSL certificate, doesn't serve such pages, doesn't internally link to https:// pages, doesn't include any such files in its sitemap.xml and, for all of these, never has. I decided to do a Google search for https://<my site> and found one site that incorrectly refers to the root of my site with an https:// prefix; I will try to contact them to get them to correct this. I'm not sure, however, how Googlebot managed to index the non-root files as https://. I can't find any external links to them, and surely, without certification, Googlebot should have stalled at the first request? I've just added the following lines to the site's .htaccess (although the surfer still has to click through the browser's "This site is a security risk. Abandon hope all ye who enter here!" message(s) first to get there):

        RewriteEngine On
        RewriteCond %{HTTPS} on
        RewriteRule ^(.*)$ http://www.<my site>.org/$1 [R=301,L]

    replacing <my site> with my domain name. My big question is this, though: I would like to use the Google Webmaster Tools Remove URLs feature to remove the https:// pages from the index. Can I be guaranteed that this will only remove the https:// versions of each relevant page and not the valid http:// versions? My thanks to anyone who can help me out with this particular question and the issue in general.

    Read the article

  • Set user group and permissions of FTP folder back to default

    - by OrangeTux
    Argh. I tried to create a new FTP user via the command line, but I did something wrong, and now I can connect to the server via FTP but can't see any files, no matter which user I use. ls -la shows:

        drwxr-xr-x 13 root ftp 4096 2012-03-30 09:47 .
        drwxr-xr-x  7 web6 ftp 4096 2012-03-26 09:28 ..
        drwxr-xr-x  4 web6 ftp 4096 2012-03-26 13:31 actions
        drwxr-xr-x  2 web6 ftp 4096 2012-03-26 11:46 bin
        -rwxr-xr-x  1 web6 ftp 1520 2012-03-24 23:32 changelog.txt
        drwxr-xr-x  2 web6 ftp 4096 2012-03-26 13:30 css
        drwxr-xr-x  8 web6 ftp 4096 2012-03-24 22:43 external
        -rwxr-xr-x  1 web6 ftp  333 2012-03-26 15:12 .htaccess
        drwxr-xr-x  3 web6 ftp 4096 2012-02-27 15:07 images
        -rwxr-xr-x  1 web6 ftp 1606 2012-03-26 21:25 index.php
        drwxr-xr-x  2 web6 ftp 4096 2012-02-18 13:20 js
        drwxr-xr-x  2 web6 ftp 4096 2012-02-03 00:34 layout
        drwxr-xr-x  2 web6 ftp 4096 2012-03-29 23:35 library
        drwxr-xr-x  2 web6 ftp 4096 2012-03-30 09:47 log
        -rwxr-xr-x  1 web6 ftp  396 2012-03-24 15:04 menu.php
        drwxr-xr-x  2 web6 ftp 4096 2012-03-30 12:01 python
        drwxr-xr-x  2 web6 ftp 4096 2012-03-23 10:51 todo

    I can't see any dirs or files, because I changed the group owner, or the rights of the group owner, of the FTP dir. How can I set the ownership of the files back to default so I can access the files via FTP again? Many thanks in advance
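
    One plausible culprit in the listing is the current directory itself: "." is owned by root while everything inside is still web6:ftp, and depending on the FTP daemon's chroot and hide settings, a home directory the user doesn't own can list as empty. A hedged sketch of the repair, assuming web6:ftp was the original owner, as the other entries suggest (the path is a placeholder; the question doesn't name it):

        # give the directory itself back to the site user (run as root, from one level up)
        chown web6:ftp /path/to/this/ftp/dir
        # if the permission bits were changed too, restore sane defaults
        chmod 755 /path/to/this/ftp/dir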

    Read the article

  • Does purposely linking to an invalid URL and then using 301 affect SEO?

    - by Mike
    On a section of my site, I am currently using .htaccess rewrites to put the ID as part of the URL instead of in the query string, like so:

        RewriteRule ^([a-z_]+)?/?tours/([0-9]+)/(.*) /tours/tour_text.php?lang=$1&id=$2&urlstr=$3 [L]

    For example, if someone goes to /en/tours/12/some-text-here it will rewrite it to /tours/tour_text.php?lang=en&id=12&urlstr=some-text-here. However, I don't want users to be able to put just any text, so if they type the wrong some-text-here part, it will 301 redirect them to the right page. This works perfectly, but I can see a potential problem arising when localizing the website, so I just wanted to make sure it's not actually a problem. As it is now, if someone goes to /en/tours/12/some-text-here, the anchor to the Spanish version of that page will be /es/tours/12/some-text-here (i.e. only changing the "en" to "es"), and the script will then 301 them to the correct Spanish text (something like /es/tours/12/algun-texto-aqui). The reverse works the same way: the anchor on the Spanish version pointing to the English version would be /en/tours/12/algun-texto-aqui, and visitors get forwarded with a 301 back to /en/tours/12/some-text-here. Basically, the anchor changes the language and the 301 changes the string at the end. So I have two questions:
    1. Does purposely and permanently having invalid URLs on your site that get 301'ed to the correct ones have any effect on SEO? I could make it show the correct URL to begin with, but this is a significant amount of work due to how I am handling the translations, so I would prefer just to 301 them.
    2. Will the invalid URLs contained in the links be added to search engine indexes even if they get 301'ed to another page?
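
    Whatever the crawling behaviour, the standard way to make the intent explicit to engines is annotating each page with its canonical URL and its language alternates. A hedged sketch for the English tour page (paths from the question, domain invented):

        <link rel="canonical" href="http://example.com/en/tours/12/some-text-here" />
        <link rel="alternate" hreflang="es" href="http://example.com/es/tours/12/algun-texto-aqui" />

    With the canonical and hreflang targets always being the post-redirect URLs, the 301-only anchors become an internal detail rather than something engines need to guess about.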

    Read the article

  • Best way to host multiple WordPress sites on a single VPS [migrated]

    - by Ben
    Not sure if this is a webmaster or a WordPress question; it's a bit half and half, sorry if I'm posting in the wrong place. Without using Multi-Site or installing new WordPress CMS' in second-level domains, what's the best way to get multiple WordPress installs running on my VPS (running Linux-powered CentOS 6 with WHM and cPanel)? It's currently working, but only by setting the permalinks option to the default setting, so the URLs aren't human-friendly. I have come across something called WPSiteStack, though I'd really rather not go down this route. Long story short, I need the following:
    1. separate installs, so one core/theme/plugin update doesn't affect all sites, and increased security for all sites;
    2. 'pretty' permalinks;
    3. each WordPress install in the root of its own domain, to ensure that I can accurately measure my clients' quotas.
    It may also be worth noting that some functions within each install use the $_SERVER['DOCUMENT_ROOT'] and $_SERVER['HOST'] variables. I have already edited the httpd-vhosts.conf, httpd.conf and .htaccess files, but this hasn't made any changes. So any ideas what I'm missing or doing wrong? Any help is much appreciated.
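
    Pretty permalinks failing across otherwise-working vhosts usually comes down to two things: mod_rewrite not loaded, or AllowOverride not letting each install's .htaccess run. A hedged sketch of what each site needs; the vhost paths are invented, and the .htaccess block is the stock one WordPress writes itself:

        # per-site vhost (httpd-vhosts.conf): each install rooted in its own domain
        <VirtualHost *:80>
            ServerName site1.example.com
            DocumentRoot /home/site1/public_html
            <Directory /home/site1/public_html>
                AllowOverride All
            </Directory>
        </VirtualHost>

        # /home/site1/public_html/.htaccess: standard WordPress permalink rules
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
        </IfModule>

    With each install in its own DocumentRoot, $_SERVER['DOCUMENT_ROOT'] also resolves per site, which keeps the functions mentioned above working without edits.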

    Read the article
