Search Results

Search found 21347 results on 854 pages for 'www mortenbock dk'.

Page 51/854 | < Previous Page | 47 48 49 50 51 52 53 54 55 56 57 58  | Next Page >

  • Punch Line Marketing

    - by jackman
    There are so many "punch line" websites like: http://www.thatswhyyoufail.com www.canrailsscale.com/ www.nooooooooooooooo.com/ but it's a mystery how they ever get so popular. I have an idea for a punch line website too, but I want to make it BIG! Does anyone have any tips for marketing these kinds of sites? p.s. and no, I do not own any of these sites, and am not disguising it as a question to market them lol.

    Read the article

  • Use Google Analytics to target different sections of a blog

    - by Emily Yao
    I have a blog that targets different regions. The Europe region blog has sections in different local languages, such as English, French and German. I wonder how to track and analyze the different sections. My initial thought was to filter on the URL, but I found it is not reliable. For example, the URL for the Europe blog is like www.myblog.com/europe. If you click the French section, the URL is like www.myblog.com/europe/language/french. But if you click an article in the French section, it is like www.myblog.com/article_name. Notice the article link is not www.myblog.com/language/french/article_name!
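
    Since the article URLs drop the /europe/language/french prefix, one workaround is to stop inferring the section from the URL and instead have each page report its own section. Below is a minimal sketch assuming Universal Analytics (analytics.js) and a template that already knows which section it belongs to; the property ID and group name are placeholders, not values from the question.

        // Hypothetical snippet for the French section's page template.
        // The standard analytics.js loader is assumed to be present above this.
        ga('create', 'UA-XXXXXXX-1', 'auto');         // placeholder property ID
        ga('set', 'contentGroup1', 'Europe/French');  // section reported by the template, not parsed from the URL
        ga('send', 'pageview');

    Each section can then be analyzed under the Content Grouping dimension instead of a URL filter.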

    Read the article

  • Content API for Shopping Office Hours - June 12, 2012

    Content API for Shopping Office Hours - June 12, 2012. A Hangout discussing Product Listing Ads (PLAs) and the Google Affiliate Network (GAN) with guests Mark Coppin (GAN) and Claire Hugo (PLAs) of Google. In the Hangout, we reference the video "How to create a new Product Listing Ads campaign" (www.youtube.com), which can be found on the Getting Started page of the Shopping/Ads integration site (www.google.com). Also, check out the GAN site to learn more: www.google.com. From: GoogleDevelopers Views: 703 6 ratings Time: 31:23 More in Science & Technology

    Read the article

  • Can a NodeJS webserver handle multiple hostnames on the same IP?

    - by Matthew Patrick Cashatt
    I have just begun learning NodeJS and LOVE it so far. I have set up a Linux box to run it and, in learning to use the event-driven model, I am curious if I can use a common IP for multiple domain names. Could I point, for example, www.websiteA.com, www.websiteB.com, and www.websiteC.com all to the same IP (node webserver) and then route to the appropriate source files based on the request? Would this cause certain doom when it came to scaling to any reasonable size?
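
    For reference, plain Node can do name-based virtual hosting by dispatching on the Host header of each request. A minimal sketch (the hostnames and handler bodies are placeholders; binding to port 80 needs root or a proxy in front):

        var http = require('http');

        // One handler per hostname; in a real setup these would be the separate sites' apps.
        var sites = {
          'www.websitea.com': function (req, res) { res.end('site A'); },
          'www.websiteb.com': function (req, res) { res.end('site B'); },
          'www.websitec.com': function (req, res) { res.end('site C'); }
        };

        http.createServer(function (req, res) {
          // the Host header may carry a port, e.g. "www.websitea.com:80"
          var host = (req.headers.host || '').split(':')[0].toLowerCase();
          var handler = sites[host];
          if (handler) {
            handler(req, res);
          } else {
            res.writeHead(404);
            res.end('Unknown host\n');
          }
        }).listen(80);

    Many deployments instead put nginx (or middleware such as Connect's vhost) in front to do this dispatching, which also makes it easier to split the sites onto separate processes or machines when scaling.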

    Read the article

  • Is it possible to use canonical tag in Blogger posts?

    - by John Sanjay
    I found that one of my blog posts was cached by Google (www.example.com/post.html). I also found that the comment page of the post was cached (www.example.com/post.html?showComment=1372054729698). Both pages show up in the Google SERP when I check the cached posts of my blog. Is it possible to use a canonical tag on www.example.com/post.html?showComment=1372054729698 so that Google won't penalize my original post? Are there any other ways to redirect a blog post?
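
    For reference, a canonical hint is just a link element in the page head pointing at the preferred URL, so the ?showComment variant defers to the original post. A minimal sketch (the URL is illustrative; in Blogger it would have to be emitted from the template's head section):

        <link rel="canonical" href="http://www.example.com/post.html" />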

    Read the article

  • Reverse proxying only a specific URL

    - by Bart Silverstrim
    I have a web server at www.ourcompany.com running Apache2. Using the proxy modules, I am able to (for example) get 172.16.0.5, an internal IP device, to be accessed on www.ourcompany.com/device. The trouble is that anyone can play with or explore the device using strings sent to www.ourcompany.com/device/change/settings/here.html. I'd like the reverse proxy to only work for a specific URL; www.ourcompany.com/device/you/must/use/this while anything else will be rejected if requested. Is there a setting that can be used to do this, or is it a simple rewrite condition placed in the virtualhost for the site under sites-enabled? What is the simplest, most maintainable way to sanitize requests to the internal device through the reverse proxy? Running Apache2 on Ubuntu.
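
    A minimal sketch of one way to do it with mod_proxy alone, assuming mod_proxy and mod_proxy_http are enabled and that only the one path should ever reach the device (the path and internal IP are taken from the question; the access directives use the 2.2-style Order/Deny syntax of that era, Apache 2.4 would use Require instead):

        ProxyRequests Off
        # Only this exact path is forwarded to the internal device...
        ProxyPass        /device/you/must/use/this http://172.16.0.5/you/must/use/this
        ProxyPassReverse /device/you/must/use/this http://172.16.0.5/you/must/use/this

        # ...and everything else under /device is refused outright.
        <Location /device>
            Order allow,deny
            Deny from all
        </Location>
        <Location /device/you/must/use/this>
            Order deny,allow
            Allow from all
        </Location>

    Later <Location> sections override earlier ones for the same request, so the second block re-opens just the whitelisted path; a RewriteRule with the [P] flag guarded by a RewriteCond on the URI is an equally workable alternative.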

    Read the article

  • Set default owner/user

    - by Daniel Hollands
    I'm a web developer, and so have set up an old machine in the office as an Ubuntu server for the purposes of testing websites. I've set up LAMP and created a /var/www folder, from which all my local sites are served. The issue is one of user permissions, i.e. any files that I copy into that folder (from my Windows machine via the network) automatically take on me (daniel) as their owner. The problem is that I want www-data to become the owner. I did some research and saw that it should be possible to use setuid (and setgid) to automatically set www-data as the owner of all files put into /var/www, but so far I've not had any luck making it work. Can someone help please? Thank you. UPDATE: Would this do what I want it to do? Default file permissions for php user www-data UPDATE 2: I've kinda fixed my issue by changing my Samba settings. Using Webmin, I was able to go in and change the default settings (as seen here: http://imageshack.us/photo/my-images/521/captureon.png/)
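
    For what it's worth, the setgid bit only controls the group of newly created files, never the owner, so the usual pattern is to make www-data the group and rely on group permissions rather than ownership. A minimal sketch (the ACL lines are optional and assume the acl package is installed):

        sudo chgrp -R www-data /var/www
        sudo chmod -R g+rw /var/www
        # setgid on each directory: files and subdirs created inside inherit the www-data group
        sudo find /var/www -type d -exec chmod g+s {} \;
        # optional: default ACLs so group access also applies to files copied in later
        sudo setfacl -R -m g:www-data:rwx /var/www
        sudo setfacl -R -d -m g:www-data:rwx /var/www

    Changing the actual owner of files copied over the network is better handled at the Samba layer (for example the share-level "force user" option), which appears to be roughly what the Webmin change in the update above achieved.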

    Read the article

  • Redirect To Domain Before SSL Is Read

    - by Devin Dixon
    I had to switch servers and I want to redirect all SSL URLs to the non-SSL site. The problem I am running into is that the https site still throws an invalid certificate error even though Apache has the redirect implemented. <VirtualHost *:443> ServerAdmin [email protected] DocumentRoot /data/sites/www.example.com/main/ RewriteEngine on Redirect 301 / http://www.example.com SSLEngine on SSLCertificateFile /etc/httpd/ssl/www.examplecom/ssl-cert-snakeoil.pem SSLCertificateKeyFile /etc/httpd/ssl/www.example.com/ssl-cert-snakeoil.key ServerName www.example.com ErrorLog "logs/example.com-error_log" CustomLog "logs/example.com-access_log" common </VirtualHost> My question is: how can I do a redirect and avoid the invalid SSL certificate error in the browser?

    Read the article

  • Why does 301 redirect work for http but not for https?

    - by Tom G
    Through my domain registrar I have set up a domain, essayme.co.uk, to automatically forward to https://google.com. If I go to http://essayme.co.uk it works as expected and redirects me to https://google.com. $curl -i http://essayme.co.uk HTTP/1.1 301 Moved Permanently Cache-Control: max-age=900 Content-Type: text/html Location: https://google.com Server: Microsoft-IIS/7.5 X-AspNet-Version: 4.0.30319 X-Powered-By: ASP.NET Date: Sat, 07 Jun 2014 11:14:16 GMT Content-Length: 0 Age: 0 Connection: keep-alive However, if I go to https://essayme.co.uk it just freezes and times out. $curl -i https://essayme.co.uk curl: (7) Failed connect to essayme.co.uk:443; Operation timed out What is happening in the second case? (and, if possible, how can I get the redirect to work for https?) Problem background/clarification: I don't have an SSL certificate for the essayme.co.uk domain above, but I do for my live domain (let's call it mywebsite.com), and I was seeing the exact same problem on this domain (hence why I'm trying to debug the problem). Unfortunately I can't experiment with the live domain (as it's live) and I would like to avoid having to buy a second certificate for essayme.co.uk just for debugging (unless absolutely necessary). The problem I was seeing: my live domain, mywebsite.com (not its real name), has a valid SSL certificate. Visiting https://www.mywebsite.com displayed the webpage as expected. I had set up forwarding (like in the question above) from the naked domain (mywebsite.com) to https://www.mywebsite.com) Visiting http://mywebsite.com redirected to https://www.mywebsite.com as expected. However, visiting https://mywebsite.com would freeze and time out (as in the question above). I also tried forwarding it to http://www.otherwebsite.com as an experiment (i.e. forwarding to another site that does not use SSL), but the result was the same: Visiting http://mywebsite.com redirected to http://www.otherwebsite.com as expected. Visiting https://mywebsite.com would freeze and time out again. So I set up essayme.co.uk as an experiment to try and understand why it doesn't work.

    Read the article

  • How to properly remove URLs from Google's index?

    - by ElHaix
    On some of our sites, we now have several thousand pages that dilute our website's keyword density. The website is an MVC site with SEO routing. If I submit a new sitemap with, say, only the 2000 or so pages that we want indexed, even though navigating to the diluting pages still works, will Google re-index the site with only those 2000 pages, dropping the superfluous ones? For example, I want to keep roughly 2000 of the following: www.mysite.com/some-search-term-1/some-good-keywords www.mysite.com/some-search-term-2/some-more-good-keywords and remove several thousand of the following that have already been indexed: www.mysite.com/some-search-term-xx/some-poor-keywords www.mysite.com/some-search-term-xx/some-poor-more-keywords These pages are not actually "removed", as navigating to these URLs still renders a page. Even though there are potentially hundreds of thousands of pages, I only want, say, 2000 to be re-indexed and retained, with the others removed (without having to do this manually). Thanks.
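
    One option, assuming the diluting pages can keep returning 200: serve a robots noindex hint on just those pages and leave them out of the sitemap, so Google drops them on recrawl without manual removal requests (the crawler must still be allowed to fetch the pages in order to see the tag):

        <meta name="robots" content="noindex, follow" />

    A sitemap on its own only signals which URLs you consider important; it does not remove URLs that are still crawlable and returning content.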

    Read the article

  • Nginx Subdomain Problem

    - by user292299
    i can't access my subdomain on localhost. my localdomain is localhost.dev and it's work.but i want to auto subdomain for php script (username.localhost.dev) i try this server { listen 80 default_server; listen [::]:80 default_server ipv6only=on; access_log /var/www/access.log; error_log /var/www/error.log; root /var/www; index index.php index.html index.htm; # Make site accessible from http://localhost/ server_name localhost.dev ***.localhost.dev**; location / { # First attempt to serve request as file, then # as directory, then fall back to displaying a 404. try_files $uri $uri/ /index.html; # Uncomment to enable naxsi on this location # include /etc/nginx/naxsi.rules } location /f2/public/ { try_files $uri $uri/ /f2/public/index.php?$args; } location /doc/ { alias /usr/share/doc/; autoindex on; allow 127.0.0.1; allow ::1; deny all; } # Only for nginx-naxsi used with nginx-naxsi-ui : process denied requests #location /RequestDenied { # proxy_pass http://127.0.0.1:8080; #} #error_page 404 /404.html; # redirect server error pages to the static page /50x.html # #error_page 500 502 503 504 /50x.html; #location = /50x.html { # root /usr/share/nginx/html; #} # pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000 # location ~ \.php$ { # fastcgi_split_path_info ^(.+\.php)(/.+)$; # # NOTE: You should have "cgi.fix_pathinfo = 0;" in php.ini # # # With php5-cgi alone: # fastcgi_pass 127.0.0.1:9000; # # With php5-fpm: # fastcgi_pass unix:/var/run/php5-fpm.sock; # fastcgi_index index.php; # include fastcgi_params; include /etc/nginx/fastcgi_params; try_files $uri =404; fastcgi_pass 127.0.0.1:9000; fastcgi_index index.php; fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name; } # deny access to .htaccess files, if Apache's document root # concurs with nginx's one # #location ~ /\.ht { # deny all; #} } it's not working.i change server_name for testing server_name localhost.dev asd.localhost.dev; i can't access asd.localhost.dev and i try this double server{} section # You may add here your # server { # ... # } # statements for each of your virtual hosts to this file ## # You should look at the following URL's in order to grasp a solid understanding # of Nginx configuration files in order to fully unleash the power of Nginx. # http://wiki.nginx.org/Pitfalls # http://wiki.nginx.org/QuickStart # http://wiki.nginx.org/Configuration # # Generally, you will want to move this file somewhere, and start with a clean # file but keep this around for reference. Or just disable in sites-enabled. # # Please see /usr/share/doc/nginx-doc/examples/ for more detailed examples. ## server { listen 80 default_server; listen [::]:80 default_server ipv6only=on; access_log /var/www/access.log; error_log /var/www/error.log; root /var/www; index index.php index.html index.htm; # Make site accessible from http://localhost/ server_name localhost.dev; location / { # First attempt to serve request as file, then # as directory, then fall back to displaying a 404. 
try_files $uri $uri/ /index.html; # Uncomment to enable naxsi on this location # include /etc/nginx/naxsi.rules } location /f2/public/ { try_files $uri $uri/ /f2/public/index.php?$args; } location /doc/ { alias /usr/share/doc/; autoindex on; allow 127.0.0.1; allow ::1; deny all; } # Only for nginx-naxsi used with nginx-naxsi-ui : process denied requests #location /RequestDenied { # proxy_pass http://127.0.0.1:8080; #} #error_page 404 /404.html; # redirect server error pages to the static page /50x.html # #error_page 500 502 503 504 /50x.html; #location = /50x.html { # root /usr/share/nginx/html; #} # pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000 # location ~ \.php$ { # fastcgi_split_path_info ^(.+\.php)(/.+)$; # # NOTE: You should have "cgi.fix_pathinfo = 0;" in php.ini # # # With php5-cgi alone: # fastcgi_pass 127.0.0.1:9000; # # With php5-fpm: # fastcgi_pass unix:/var/run/php5-fpm.sock; # fastcgi_index index.php; # include fastcgi_params; include /etc/nginx/fastcgi_params; try_files $uri =404; fastcgi_pass 127.0.0.1:9000; fastcgi_index index.php; fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name; } # deny access to .htaccess files, if Apache's document root # concurs with nginx's one # #location ~ /\.ht { # deny all; #} } ############################### server { access_log /var/www/access.log; error_log /var/www/error.log; root /var/www; index index.php index.html index.htm; # Make site accessible from http://localhost/ server_name asd.localhost.dev; location / { # First attempt to serve request as file, then # as directory, then fall back to displaying a 404. try_files $uri $uri/ /index.html; # Uncomment to enable naxsi on this location # include /etc/nginx/naxsi.rules } location /f2/public/ { try_files $uri $uri/ /f2/public/index.php?$args; } location /doc/ { alias /usr/share/doc/; autoindex on; allow 127.0.0.1; allow ::1; deny all; } # Only for nginx-naxsi used with nginx-naxsi-ui : process denied requests #location /RequestDenied { # proxy_pass http://127.0.0.1:8080; #} #error_page 404 /404.html; # redirect server error pages to the static page /50x.html # #error_page 500 502 503 504 /50x.html; #location = /50x.html { # root /usr/share/nginx/html; #} # pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000 # location ~ \.php$ { # fastcgi_split_path_info ^(.+\.php)(/.+)$; # # NOTE: You should have "cgi.fix_pathinfo = 0;" in php.ini # # # With php5-cgi alone: # fastcgi_pass 127.0.0.1:9000; # # With php5-fpm: # fastcgi_pass unix:/var/run/php5-fpm.sock; # fastcgi_index index.php; # include fastcgi_params; include /etc/nginx/fastcgi_params; try_files $uri =404; fastcgi_pass 127.0.0.1:9000; fastcgi_index index.php; fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name; } # deny access to .htaccess files, if Apache's document root # concurs with nginx's one # #location ~ /\.ht { # deny all; #} } # another virtual host using mix of IP-, name-, and port-based configuration # #server { # listen 8000; # listen somename:8080; # server_name somename alias another.alias; # root html; # index index.html index.htm; # # location / { # try_files $uri $uri/ =404; # } #} # HTTPS server # #server { # listen 443; # server_name localhost; # # root html; # index index.html index.htm; # # ssl on; # ssl_certificate cert.pem; # ssl_certificate_key cert.key; # # ssl_session_timeout 5m; # # ssl_protocols SSLv3 TLSv1; # ssl_ciphers ALL:!ADH:!EXPORT56:RC4+RSA:+HIGH:+MEDIUM:+LOW:+SSLv3:+EXP; # ssl_prefer_server_ciphers on; # # location / { # try_files $uri 
$uri/ =404; # } #} I can't get this to work either.
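
    For what it's worth, a wildcard name in nginx is written with a single leading asterisk, and it only takes effect once the hostname actually resolves to the machine; /etc/hosts has no wildcard support, so each test subdomain needs its own entry (or a local resolver such as dnsmasq). A minimal sketch of the relevant part of the first server block, everything else left as in the original:

        server {
            listen 80 default_server;
            # a leading *. matches any subdomain of localhost.dev
            server_name localhost.dev *.localhost.dev;
            root /var/www;
            # ... rest of the block unchanged ...
        }

    Since that block is already marked default_server, it would answer for asd.localhost.dev anyway once the request reaches nginx, which suggests the name simply is not resolving to 127.0.0.1 in the first place.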

    Read the article

  • Rewrite rule to redirect all subpages to a single page?

    - by user784637
    I have two files, /etc/apache2/sites-available/foo and /etc/apache2/sites-available/foo_maintenance. The rewrite rule I use in /etc/apache2/sites-available/foo is <Directory /var/www/public_html> Options +FollowSymlinks RewriteOptions inherit RewriteEngine on # RewriteCond %{HTTP_HOST} ^mysite\.com [NC] RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L] </Directory> so that all mysite.com/* requests redirect to www.mysite.com. After I take my site down for maintenance, if the user navigates to a subpage of the site like mysite.com/subdir/something.php, I would like to redirect them to www.mysite.com so the index.html of the maintenance page is displayed. What is the rewrite rule to redirect all traffic from any subpage to www.mysite.com?
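
    A minimal sketch of what the maintenance vhost's rule set could look like, assuming the maintenance page lives at /index.html inside the same document root (paths and domain follow the question; any CSS or images the page needs would have to be excluded the same way):

        <Directory /var/www/public_html>
            Options +FollowSymlinks
            RewriteEngine on
            # let the landing page itself through (both "/" and "/index.html")
            RewriteCond %{REQUEST_URI} !^/(index\.html)?$
            # send every other request, e.g. /subdir/something.php, to the maintenance page
            RewriteRule ^ http://www.mysite.com/ [R=302,L]
        </Directory>

    Using 302 rather than 301 keeps browsers and crawlers from caching the maintenance redirect once the real site comes back.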

    Read the article

  • apt-get issue after upgrading to 12.04

    - by user83906
    I have recently upgraded my cluster from 11.10 to 12.04. After the upgrade, I am having trouble running apt-get on the cluster nodes. I can ssh between the nodes (client-to-client; client-to-head; client-to-external etc.). However, sudo apt-get update produces the following errors: Ign http://us.archive.ubuntu.com precise InRelease Ign http://security.ubuntu.com precise-security InRelease Ign http://www.openfoam.org maverick InRelease Ign http://us.archive.ubuntu.com precise-updates InRelease Err http://security.ubuntu.com precise-security Release.gpg Something wicked happened resolving 'security.ubuntu.com:http' (-5 - No address associated with hostname) Err http://www.openfoam.org maverick Release.gpg Something wicked happened resolving 'www.openfoam.org:http' (-5 - No address associated with hostname) Ign http://us.archive.ubuntu.com precise-backports InRelease Ign http://www.openfoam.org maverick Release Ign http://security.ubuntu.com precise-security Release Err http://us.archive.ubuntu.com precise Release.gpg Something wicked happened resolving 'us.archive.ubuntu.com:http' (-5 - No address associated with hostname) Ign http://security.ubuntu.com precise-security/main Sources/DiffIndex Err http://us.archive.ubuntu.com precise-updates Release.gpg Something wicked happened resolving 'us.archive.ubuntu.com:http' (-5 - No address associated with hostname) Ign http://security.ubuntu.com precise-security/restricted Sources/DiffIndex 15% [Connecting to us.archive.ubuntu.com] [Connecting to security.ubuntu.com] [Connecting to www.openfoam.org] On the headnode, I have in /etc/network/iterfaces: auto eth0 iface eth0 inet static address 192.168.0.1/24 On the client nodes, I have /etc/network/iterfaces: auto eth0 iface eth0 inet static address 192.168.0.101 netmask 255.255.255.0 gateway 192.168.0.1 Please advise.
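
    The errors are DNS failures ("No address associated with hostname"), which fits static stanzas that define an address and gateway but no resolver. A minimal sketch of a client node's /etc/network/interfaces, assuming the head node (or any other reachable resolver) provides DNS and that the head node NATs the clients' traffic out; the nameserver addresses are illustrative:

        auto eth0
        iface eth0 inet static
            address 192.168.0.101
            netmask 255.255.255.0
            gateway 192.168.0.1
            # resolvconf (default on 12.04) picks these up for /etc/resolv.conf
            dns-nameservers 192.168.0.1 8.8.8.8

    After editing, restart networking (or ifdown/ifup eth0) and check that something like "nslookup us.archive.ubuntu.com" succeeds on a client before retrying apt-get.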

    Read the article

  • Redirects in .htaccess to avoid crawl errors

    - by user71698
    I am getting a lot of errors in Webmaster Tools; basically there are a lot of links ending like this: mydomainname.com/links.php. How can I redirect these links to shave off this part at the end? For instance, there is a link in Google: http://www.onlineglobalbiz.com/article-marketing/www.onlineglobalbiz.com/links.php This should be: http://www.onlineglobalbiz.com/article-marketing/ Using .htaccess, how can I redirect the incorrect links?
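
    A minimal sketch of a rule that strips that stray tail wherever it appears, assuming mod_rewrite is enabled and the rule sits in the document root's .htaccess (the domain is taken from the example above):

        RewriteEngine On
        # /article-marketing/www.onlineglobalbiz.com/links.php  ->  /article-marketing/
        RewriteRule ^(.*/)?www\.onlineglobalbiz\.com/links\.php$ /$1 [R=301,L]

    The longer-term fix is to locate whatever template or plugin is emitting "www.onlineglobalbiz.com/links.php" as a relative link (a missing http:// is the usual cause of this pattern), since the 301 only cleans up after it.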

    Read the article

  • Mounting NFS directory causes creation of zero-byte files

    - by Alaa
    I have two servers, Server X (IP 192.168.1.1) and Server Y (IP 192.168.1.2), both running Ubuntu 9.1. I have created a Varnish load balancer on them for my Drupal website (Pressflow 6.22). I have mounted the imagecache directory from server X on Y as below: @X:/etc/exports == /var/www/proj/htdocs/sites/default/files/images 192.168.1.2(rw,async,no_subtree_check) @Y:/etc/fstab == 192.168.1.1:/var/www/proj/htdocs/sites/default/files/images var/www/proj/htdocs/sites/default/files/images nfs defaults 0 0 I also ran this on server X: X:/var/www/proj/htdocs/sites/default/files$ chmod -R 777 images I tried to touch, rm, vim, and cat files in the images directory mounted on Y and everything went fine. However, ALWAYS when server Y's imagecache tries to create an image in the images directory, the image is created with a ZERO byte file size. Has anyone faced this before? Any idea how to fix this problem or what might cause it? Thanks for your help.

    Read the article

  • When will my old page stop appearing on Google?

    - by Bane
    I recently bought a new address for my Blogger blog, from yannbane.blogspot.com to www.yannbane.com. However, www.yannbane.com addresses do not appear when they are searched for! Is this natural? How much time will it take for Google to update its index? yannbane.blogspot.com 301's to www.yannbane.com. Both are added to my Webmaster Tools account, but it shows no data for www.yannbane.com (strangely). And, finally, is there something I could do to speed up the process?

    Read the article

  • Live Webcasts of the Transit of Venus

    - by TATWORTH
    Space.com have published a list of webcams for the Transit of Venus at http://www.space.com/14568-venus-transits-sun-2012-skywatching.html Live Webcasts Around the World: Here is a list of observatories and organizations providing live webcasts on June 5 of the Venus transit of 2012: NASA webcast from Mauna Kea, Hawaii: http://venustransit.nasa.gov/2012/transit/webcast.php Exploratorium (in San Francisco, Calif.) webcast from Mauna Loa, Hawaii: http://www.exploratorium.edu/venus/ Slooh Space Camera telescope feed from around the world: http://www.slooh.com/transit-of-venus/ Astronomers Without Borders webcast from the Mount Wilson Observatory in California: http://www.astronomerswithoutborders.org/projects/transit-of-venus.html I intend to publish a single list later.

    Read the article

  • Why isn't my website ranked by Alexa? [closed]

    - by arshen
    I created a website with WordPress and posted 10+ articles over a period of two months, but Alexa doesn't rank my website. I tried changing my theme, URL and other related things, and submitted my website URL manually to the Alexa dashboard. I get around 200 page views a day, but it is still not ranked. My website URL: http://daskhat.ir, robots file: http://daskhat.ir/robots.txt, Alexa page: www.alexa.com/siteinfo/daskhat.ir, domain whois: whois.domaintools.com/daskhat.ir, and website SEO rank: www.woorank.com/en/www/daskhat.ir

    Read the article

  • Why will Google Analytics not allow our URL?

    - by Linda H
    In Google Analytics we're trying to add a property for each website of a WordPress multisite network. It works well for all of the sub-sites but one. In the Default URL field we used the following: www.mywebsite.no/ - works www.mywebsite.no/nurse - works, as do a few other sub-sites www.mywebsite.no/doctor - doesn't work In the last case we get the following error: Value is not a valid domain. (e.g. example.com, www.example.com) We can't change the name of the sub-site, but GA just won't accept the URL. Why is this, and what can we do to solve it?

    Read the article

  • How can I use domain masking without having self referral in Google Analytics

    - by Cdore
    I have one old domain that points to a website server's IP (let's call it www.oldsite.com). I have a new one, www.newsite.com, that is set up to be forwarded to a specific page on the website. Due to the way the host of newsite.com places the website in a frame, in Google Analytics newsite.com is listed as the source rather than the source the visitor actually came from, causing a self-referral. One solution I looked up is to edit the code of the iframe, but of course there's no way to edit the host's masking source code. Another solution I tried previously was to have www.newsite.com point to the address that www.oldsite.com pointed to. It solved the analytics problem, but in exchange the URL masking no longer worked: in the address bar it came up as www.oldsite.com. Is there a way to keep URL masking and still have the forwarding play nicely with Google Analytics? The website is hosted on a cloud server, if that is any more information.

    Read the article

  • How to get Visual Studio 2010 Express Edition on Windows 7

    - by thanigai
    Visual Studio 2010 is an amazing release from Microsoft. I tried beta 1 and 2 of Visual Studio 2010, and finally the full version has been released. I am also interested in the latest edition of Windows, which is nothing but Windows 7. Next to Vista, I like this version very much. Out of curiosity I installed a pre-release build of Windows 7. I tried installing the Express edition there and it failed, which was disappointing. I tried two or three times, and finally I decided to download the trial version of Windows 7. After that I could install the Visual Studio 2010 Express edition easily. I have given the link below from where I downloaded the file. http://www.microsoft.com/express/downloads/ The link given here is through the Web PI installer. The other option is to download the ISO image file and burn it to a disc, or use a virtual disc. Visual Web Developer will provide only the SQL Server engine. To get SQL Server Management Studio, use the following link: http://www.microsoft.com/express/Database/InstallOptions.aspx That's it; all the things necessary for web application programming are ready. Ah, I forgot to mention Silverlight. Please find the latest Silverlight 4 tools at the links below (WCF RIA Services is the main update): http://www.silverlight.net/getstarted/ Silverlight 4 Tools (http://www.microsoft.com/downloads/details.aspx?FamilyID=eff8a0da-0a4d-48e8-8366-6ddf2ecad801&displaylang=en) Expression Blend 4 trial (http://www.microsoft.com/downloads/details.aspx?FamilyID=88484825-1b3c-4e8c-8b14-b05d025e1541&displaylang=en) I hope the reader has enjoyed learning how to get these things. Please let me know if you are not clear on any of these things.  Thanks, Thani

    Read the article

  • How to allow Google Images search to bypass hotlink protection?

    - by Marco Demaio
    I saw that Google Images seems to index my images only if hotlink protection is off. I use hotlink protection anyway because I don't like the idea of people sucking my bandwidth; I simply use this code to protect my sites from being hotlinked: RewriteEngine on RewriteCond %{HTTP_REFERER} !^$ RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?mydomain\.com/.*$ [NC] RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?mydomain\.com$ [NC] RewriteRule .*\.(jpg|jpeg|png|gif)$ - [F,NC,L] But in order to allow Google Images search to bypass my hotlink protection (I want Google Images search to show my images), would it suffice to add lines like these: RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?google\.com/.*$ [NC] RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?google\.com$ [NC] Because I'm wondering: does the crawler crawl just from google.com? And what about google.it / google.co.uk, etc.? FYI: I did not find info about this in Google's official guidelines. I suppose hotlink protection prevents Google Images from showing my images in its results, because I did some tests and it seems hotlink protection does prevent my images from being shown in Google Images search.
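
    A hedged sketch of one way to cover the country domains and the crawler itself, rather than listing each TLD by hand (the referer regex is illustrative and deliberately loose; note that the crawler sends no referer at all, so the existing empty-referer exemption already lets it fetch images):

        RewriteEngine on
        # never block Google's crawler, whatever referer it sends
        RewriteCond %{HTTP_USER_AGENT} !Googlebot [NC]
        RewriteCond %{HTTP_REFERER} !^$
        RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?mydomain\.com(/.*)?$ [NC]
        # any google.<tld> or google.co.<tld> referer, with or without a subdomain, is allowed through
        RewriteCond %{HTTP_REFERER} !^http(s)?://([a-z0-9-]+\.)*google\.([a-z]{2,6})(\.[a-z]{2})?(/.*)?$ [NC]
        RewriteRule .*\.(jpe?g|png|gif)$ - [F,NC,L]

    In practice the empty-referer exemption is what matters most, since visitors arriving from Google Images may send a google.* referer or none at all depending on the browser.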

    Read the article

  • Why am I getting domainpark.cgi being called from my website?

    - by Sean
    I used to test my site on www.exampleone.com; I have now moved to the real domain www.realdomain.com, and www.exampleone.com is now parked by 1and1 (default). Now when I test to see which requests are made by www.realdomain.com, I see domainpark.cgi and park.js from Sedo Parking also being requested, as well as the JS that serves the ads by adclicks. How do I get rid of this? It's not on the index page at all, and it's causing a lot of strain and slowing my site down.

    Read the article

  • Alternative for Subdomains [duplicate]

    - by Raj
    This question already has an answer here: Should I choose sub-directories over sub-domains in this case? 2 answers I have a company and a website like www.example.com. We have one industry with product 1 and product 2, and another industry with product 3 and product 4. All these products are different from each other. My question is: should we have subdomains like www.industry1.example.com, or paths like www.example.com/industry1? If it is industry1.example.com it might seem like a different domain; if it is example.com/industry1, the number of folders might increase. Please suggest the best solution for this. Thanks, Raj

    Read the article
