Search Results

Search found 11873 results on 475 pages for 'adobe acrobat pro'.

Page 120 of 475

  • MS Bing web crawler out of control causing our site to go down

    - by akaDanPaul
    Here is a weird one that I am not sure what to do about. Today our company's e-commerce site went down. I tailed the production log and saw that we were receiving a ton of requests from the IP range 157.55.98.0 - 157.55.100.0. I googled around and found out that it belongs to the MSN/Bing web crawler. So essentially the MS web crawler overloaded our site and caused it to stop responding, even though our robots.txt file contains: Crawl-delay: 10. What I did was ban the IP range in iptables, but what I am not sure about is how to follow up from here. I can't find anywhere to contact Bing about this issue, and I don't want to keep those IPs blocked because I am sure we will eventually get de-indexed from Bing. It doesn't really seem like this has happened to anyone else before. Any suggestions?

    Update - my server / web stats: our web server runs Nginx, Rails 3, and 5 Unicorn workers. We have 4 GB of memory and 2 virtual cores. We have been running this setup for over 9 months now and never had an issue; 95% of the time the system is under very little load. On average we receive 800,000 page views a month, and that never comes close to bringing down or slowing our web server. Looking at the logs, we were receiving anywhere from 5 up to 40 requests/second from this IP range. In all my years of web development I have never seen a crawler hit a website so many times. Is this new with Bing?
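
    For reference, a minimal sketch of the two measures described above: the Crawl-delay directive (which Bing documents that it honors) and a temporary iptables block of the range. The range boundaries are the ones quoted in the question; treat the exact rule as an assumption about this particular setup rather than a recommendation:

        # robots.txt - ask bingbot to wait 10 seconds between requests
        User-agent: bingbot
        Crawl-delay: 10

        # temporary firewall block of the offending range (run as root);
        # remove it again once the crawl rate is back under control
        iptables -A INPUT -m iprange --src-range 157.55.98.0-157.55.100.0 -j DROP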

    Read the article

  • DNS for domain shows old website for www version

    - by user3745746
    I bought 2 domains from GoDaddy, but with both I am seeing the same problem: the www version of the domain goes to the old site, which is still being hosted. I have checked the IntoDNS website and the www record shows: Your www.example.com A record is: www.example.com -> example.typepad.com -> cname-cloudflare.typepad.com. What can I do to stop this from happening? Will it eventually be removed automatically and fix itself? Obviously it has not fixed itself during the long, drawn-out expiry process... it has been quite a while for one of them and the www still hasn't propagated. I'm not having any problems with the plain example.com part of the site.
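
    What the IntoDNS output describes is a www record that is still a CNAME chain ending at the old Typepad host; DNS will not "expire" that on its own, it has to be edited in the zone at the registrar. A rough sketch, in zone-file notation, of what the corrected record might look like (the IP address here is a placeholder, not a value from the question):

        ; before: www is still an alias of the old host
        www   IN  CNAME  example.typepad.com.

        ; after: point www at the new site, either directly...
        www   IN  A      203.0.113.10
        ; ...or as an alias of the bare domain, if that already resolves to the new site
        www   IN  CNAME  example.com.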

    Read the article

  • Recommend good shared hosting [closed]

    - by Django Reinhardt
    It seems that everyone has something bad to say about the "big" shared hosting companies like 1and1, HostGator, GoDaddy, etc., but which are the ones you've had GOOD experiences with? I'm going to focus this question on LAMP stacks, given that they're the most popular option for shared hosting, but feel free to answer if you have had an especially good experience with a different stack. Good shared hosting should be:

      - Competitively priced - but not at the expense of...
      - Fully featured - email, PHP, MySQL, but what else?
      - Highly customizable - do you have access to advanced features, like being able to deliver static content?
      - Up to date - do they run PHP4 as standard, or do they run the latest version?
      - Customer service - when you have a problem, are they rude and unhelpful? Do they take ages to reply?

    So how about it? Who have YOU had a good experience with?

    Read the article

  • .XML Sitemaps and HTML Sitemaps Clarification

    - by MSchumacher
    I've got a website with about 170 pages and I want to create an effective sitemap for it, as it is long overdue. The website is internally linked very well, but I still want to take advantage of a sitemap to let search engines crawl my site more easily and hopefully increase my website's PR. I am slightly confused, though, about what I must do. Is it necessary to create a .xml sitemap AND an HTML sitemap (both)? I've never worked with .xml... where do I put this file once it's created? In the root folder? I assume that sitemap.xml is ONLY to be read by spiders and NOT by website visitors, i.e. no visitor on my website is going to visit the page sitemap.xml, am I correct? Hence why I should also create an HTML sitemap (sitemap.htm)?
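
    For what it's worth, a minimal XML sitemap is just a flat list of URLs; it normally lives in the site root (e.g. /sitemap.xml), is read by crawlers rather than visitors, and can be announced in robots.txt. A sketch with placeholder URLs (not taken from the question):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://www.example.com/</loc>
            <lastmod>2012-06-01</lastmod>
          </url>
          <url>
            <loc>http://www.example.com/about</loc>
          </url>
        </urlset>

        # optional line in robots.txt so crawlers can find it
        Sitemap: http://www.example.com/sitemap.xml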

    Read the article

  • Service and/or tool to monitor performance?

    - by chris
    I am seeing wildly different performance from a client's web site, and would like to set up some sort of monitoring. What I'm looking for is a service that will issue requests to a couple of URLs and report on the time it took to process the page - both TTFB and the time to download the entire page - which means I need something that will also process JavaScript and CSS. Are there services like this? I've seen a few that monitor uptime, but they don't seem to report on overall page performance.
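
    Not a monitoring service, but for quick spot-checks curl can already report the TTFB and total transfer time for a single URL; a sketch (note this does NOT execute JavaScript or fetch assets referenced from CSS, so it only covers part of what the question asks for):

        curl -o /dev/null -s -w "ttfb: %{time_starttransfer}s  total: %{time_total}s\n" http://www.example.com/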

    Read the article

  • Will my new HTML5 website decrease my Google ranking?

    - by Joshua
    Hi, I have a traditional HTML website that loads pages/sections of the site when people click on menu items. Pretty standard. Currently, I'm working on relaunching my website with brand-new HTML5 and jQuery code that loads the whole thing at once and just slides from one section to the next, sort of like this website: http://www.mino.pl/ My concern is that this will affect my ranking with Google and websiteoutlook.com, because it may seem like the website only has one page now instead of 8, making it look like I have fewer pageviews and making my site less relevant for search engine rankings. Are my concerns legit? If so, do you have suggestions on how to avoid the problem? I really like the idea of a page that 'slides' to different sections better than having pages load all the time. Any suggestions/thoughts would be very much appreciated. Thanks.
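
    One common way to keep per-section URLs (and therefore separately indexable pages) while still sliding between sections is the HTML5 History API. A minimal sketch, assuming the sections are triggered from nav links and a hypothetical showSection() function in the jQuery code; for this to actually help search engines, the server also has to return real content when those URLs are requested directly:

        // when the user slides to a section, record a real URL for it
        $('nav a').on('click', function (e) {
          e.preventDefault();
          var section = $(this).attr('href');        // e.g. "/portfolio"
          showSection(section);                      // hypothetical: animate the slide
          history.pushState({ section: section }, '', section);
        });

        // handle back/forward so each URL still shows the right section
        window.onpopstate = function (e) {
          if (e.state) { showSection(e.state.section); }
        };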

    Read the article

  • How to ensure images have all loaded before I reference them in my HTML canvas [closed]

    - by mark stephens
    I want to draw some images on an HTML canvas with context.drawImage(Im1, 205, 18, 184, 38). In order to make sure each image has loaded, I need to put in code like this, but then I cannot draw other things with it:

        var Im1 = new Image();
        Im1.src = "rechnung11014page1/img/1/Im1.png";
        Im1.onload = function() {
            context.drawImage(Im1, 205, 18, 184, 38);
        };

    Is there a way to load all the images and then execute a block of code using several images?
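
    A common pattern for this is to count onload callbacks and only run the drawing code once every image has fired; a minimal sketch along those lines (the second file name is made up for illustration):

        function loadImages(sources, callback) {
          var images = {}, remaining = sources.length;
          sources.forEach(function (src) {
            var img = new Image();
            img.onload = function () {
              if (--remaining === 0) { callback(images); }  // all images are ready
            };
            img.src = src;
            images[src] = img;
          });
        }

        loadImages([
          "rechnung11014page1/img/1/Im1.png",
          "rechnung11014page1/img/1/Im2.png"   // hypothetical second image
        ], function (images) {
          context.drawImage(images["rechnung11014page1/img/1/Im1.png"], 205, 18, 184, 38);
          // ...draw the rest here, knowing everything has loaded
        });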

    Read the article

  • DNS question and Google PageRank from domains

    - by Beck
    I'm not very good with DNS at all :) just the basics. A while ago I noticed that my blog has different PageRanks: PR 3 for the domain www.example.com and PR 1 for the domain example.com. In my DNS records I have this setup:

        A  -  IP       -  www.example.com
        A  -  same IP  -  example.com

    Should I replace the "A - same IP - example.com" record with a CNAME row instead of an A record? Like this:

        CNAME  -  same IP  -  example.com  -  alias of www.example.com

    Will this combine the PageRank value of both domains? Or can I just create a 302 redirect inside the .htaccess file, verify example.com (without www) inside Google Webmaster Tools as my domain, and in the www.example.com options set it as the main domain? Thanks ;)
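
    For reference, the usual way to consolidate the two host names is a permanent (301, not 302) redirect from one to the other in .htaccess (note that the bare domain at the zone apex normally has to stay an A record; only the www name can be a CNAME). A minimal sketch, assuming mod_rewrite is available and www is the preferred host:

        RewriteEngine On
        # send example.com (and anything that is not www.example.com) to www.example.com
        RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]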

    Read the article

  • General website publishing questions involving domain forwarding issue

    - by Gorgeousyousuf
    Even though I have a certain level of knowledge and experience with web development, I have never been interested in obtaining a domain and publishing a website from my own server until now. Today I have been struggling with getting my own domain and configuring it using online resources. I started by learning the outline of the web publishing process: web server installation, deploying a website for testing purposes, router port forwarding, getting a domain, and forwarding the domain to my router, which will in turn forward HTTP requests to my web server. I am confused about some parts and so far could not get the web site accessed from outside the network. Everything I am trying to do is for learning purposes, so I am not paying much attention to security issues for now.

    I have Server 2008 and IIS 7.5 installed. I use a laptop, have access to the modem over wireless, and my modem is a Zoom X6 5590. I will continue by explaining what I have done so far and what I think should happen after each step. I have successfully accessed my website from any local computer by entering the internal IP address and port of the host machine in a browser. Next, I forwarded port 80 of my host machine by creating a virtual server entry like 10.0.0.x (internal static IP of the host) - TCP - start port: 80 - end port: 80 in the router options. Now I suppose every request that comes to the public IP on port 80 will be forwarded to my host machine (10.0.0.x) on port 80. So if everything went as desired, the website listening on port 80 would accept the request, process it, and finally respond.

    I expected to be able to access my website from outside the network by entering http://MyPublicIp:80 in a browser, but I could not accomplish this, despite using GoDaddy's domain forwarding tool. I do see a small preview of my website when I click the "preview" button that checks whether the address I entered as the forwarding target (http://publicip/Index.aspx) is available or not. I am sure that configuring the domain does not play a role in this problem, since using the public IP and port directly does not work either. So here is the first question: why am I facing this problem?

    After that, I have a couple of questions regarding domain forwarding using the GoDaddy tool. Can I forward my domain to any port, for example port 8080, other than the default HTTP port 80? Additionally, can I use a sub-domain to forward to a different port on the host? What I want to design is: if a client enters www.mydomain.com, website1 responds on a specified port, and when a client enters info.mydomain.com, another website listening on a different port responds. I tried to add a sub-domain and forward it to an address like http://www.mydomain.com:8080/Index.aspx with no success. Can I really do that? Finally, what if I have an FTP site listening on the default port 21 and I create a domain like ftp.mydomain.com that forwards to that FTP site address - is it possible to use sub-domains for FTP site access? I know I am more than confused, but however you reply, you will help me get a clearer view of this subject. Thank you very much.
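
    One note that may help with the sub-domain questions: a DNS record (or a forwarded domain) can only point at an IP address, never at a port, so the usual way to serve www.mydomain.com and info.mydomain.com from one public IP is to let both names resolve to the same address and have IIS pick the site by host header (alternatively, bind the second site to a non-standard port and have visitors include the port in the URL). A rough sketch using appcmd, assuming two sites already exist in IIS under the made-up names below:

        rem give the second site a host-header binding on the shared port 80
        %windir%\system32\inetsrv\appcmd set site "InfoSite" /+bindings.[protocol='http',bindingInformation='*:80:info.mydomain.com']

        rem the main site keeps (or gets) a binding for the www host name
        %windir%\system32\inetsrv\appcmd set site "MainSite" /+bindings.[protocol='http',bindingInformation='*:80:www.mydomain.com']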

    Read the article

  • Ask for Budget vs. Give Proposal

    - by Miro
    Should I ask a prospect what his budget is, or just give out a price? He needs: "a new web site, with nice effects but at same time very simple & funtional for my costumers & guests". It's a 5-page website for an mp3 guided tour, with 2-3 paragraphs of text on each page and 5-8 images in total, plus a logo that needs a redesign. It's my first 'over distance' job (I don't know the guy personally and have never met him). Please let me know what good practice is and how to proceed. P.S. Also, what is an average price for a simple 5-page Flash website with some custom graphics? Thanks

    Read the article

  • HTML favicon won't show in Google Chrome

    - by Nick
    I am making an HTML page that is unpublished and that I have just started. The first thing I wanted to do was add a favicon to appear next to the title. I'm using Google Chrome, and I noticed that other websites have favicons that appear next to the title in the browser, but mine won't show up. I'm new to favicons. The site is in a folder on my desktop named "site". This is the code:

        <!DOCTYPE html>
        <html>
        <head>
            <title></title>
            <link rel="shortcut icon" href="favicon.ico" />
        </head>
        <body>
        </body>
        </html>

    Read the article

  • DNS slowdowns on development environment

    - by Sequenzia
    I have a local development environment set up on my Mac. I am running an Ubuntu web server inside a VirtualBox VM. I set up a hosts file entry on my Mac that points my dev site to the IP of the Ubuntu virtual server. Everything works well, except that a lot of the time (not always) it takes more than 5 seconds to load a page. I used Firebug to track down where the problem is, and when it's slow, the DNS part of my request is taking over 5 seconds. Like I said, it's not all the time; sometimes it resolves and loads the page within milliseconds. The same page will be super fast on one click and then take over 5 seconds the next time. It's really slowing me down and I am not sure what is causing it. Anyone have any ideas? Any help would be great. Thanks
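
    For reference, the setup described is just a static hosts entry on the Mac; a sketch of what such an entry typically looks like (the IP and host names here are placeholders, not values from the question - listing every name variant you actually use keeps requests from falling through to real DNS):

        # /etc/hosts on the Mac - point the dev names straight at the VM
        192.168.56.101   mydevsite.local   www.mydevsite.local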

    Read the article

  • Why does modx-based site start using different domains for some content?

    - by naxa
    Situation: I have a MODX site on a VPS with multiple domain and subdomain names. The MODX site should use what I call the 'primary' domain name's 'primary' subdomain, i.e. www.intendedname.tld. The problem is that as time passes, the site mysteriously starts using another subdomain for links to content like videos, images, and even pages and (internal) links. The other subdomains don't serve this content, of course. If I clear the MODX cache, the original state is restored; however, the problem comes back again later. The VPS has a domain registered and multiple A records pointing to the VPS's IP as subdomains. There is the 'primary' one, which is intended to be used as the public content server; the other ones are docs., test., etc. On top of that, I have a dynamic-DNS client from No-IP installed on the machine with a dynamic domain name bound to it, which gives a completely different domain name; I originally used it for SSH login and to serve a completely different site. An nginx server is put to good use to rewrite the different subdomains to the right places.

    Edit: the MODX templates use <base href="[[++site_url]]" />.

    Current attempt to fix: the current 'solution' is to also use nginx to rewrite everything to the 'primary' domain and subdomain. In the nginx config file for the site, a server block dedicated to this task uses the rewrite directive (unsurprisingly) to rewrite the unexpected server_name entries (i.e. the other subdomains). With this, the main site basically works (sort of), but it renders all the other functions (docs) useless. Before this rewrite was set up, the 'solution' was to clear the MODX cache on a regular basis. The original MODX content is not getting corrupted, only the files in the cache are. What can I do to find out what the actual problem is and fix it?
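
    For comparison, a common nginx pattern that canonicalizes the public host name without breaking the other subdomains is to give every legitimate name its own server block and send only unmatched hosts to the primary domain via a default_server catch-all; this addresses the symptom (wrong host in cached URLs) rather than the cache-generation cause. A rough sketch, with the layout of the real config being an assumption:

        # the primary site only answers for the intended public host
        server {
            listen 80;
            server_name www.intendedname.tld;
            # ... root, PHP handling for MODX, etc.
        }

        # docs., test., etc. keep their own server blocks as before

        # anything else that reaches this IP (bare domain, the dyndns name, ...) gets a 301
        server {
            listen 80 default_server;
            server_name _;
            return 301 http://www.intendedname.tld$request_uri;
        }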

    Read the article

  • SEO optimization for AJAX site and dynamic HTML canvas

    - by Christian Benincasa
    I have a site that uses AJAX to query the Last.fm database and then dynamically draws a graph of the results on an HTML canvas. In the search function, I have a command that sets window.location.hash to the search parameters. I also have a function that checks whether a hash was provided in the URL and, if so, generates the page. For example, http://www.thenlistento.com/#!/led+zeppelin will automatically navigate to a search page for Led Zeppelin. My question is, how do I optimize this setup for SEO? Can it be done at all? I've taken a look at the Google Webmaster docs and read over the hashbang protocol, but I'm not totally sure how to apply it to my situation... or even whether I can at all. Any help/suggestions would be greatly appreciated. Link to the site: http://www.thenlistento.com
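
    For background, the hashbang scheme works by Googlebot translating the #! URL into a query parameter and expecting a pre-rendered HTML snapshot back; e.g. http://www.thenlistento.com/#!/led+zeppelin is fetched as http://www.thenlistento.com/?_escaped_fragment_=/led+zeppelin. A very rough server-side sketch - Node/Express is an assumption here, as is the renderSnapshot helper; the site's actual stack isn't stated in the question:

        var express = require('express');
        var app = express();

        app.get('/', function (req, res, next) {
          var fragment = req.query._escaped_fragment_;
          if (fragment !== undefined) {
            // Googlebot asked for the snapshot of "#!<fragment>" - return static HTML
            return res.send(renderSnapshot(fragment));   // hypothetical snapshot renderer
          }
          next();   // normal visitors get the regular AJAX page
        });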

    Read the article

  • Search with mobile: does Google only look at mobile-optimized websites?

    - by Alireza Fallah
    When someone searches for a keyword on a mobile device, does Google search the desktop version of all websites, find the proper results, and then prioritize them according to responsiveness, mobile-optimization and so on, OR does it only search the mobile versions of mobile-optimized websites? I want to create a website with a responsive design, and I was wondering whether I should care about SEO in the mobile version of the website, or just optimize the desktop version for search engines and only care about the design of the mobile version.

    Read the article

  • What about SEO in a one-page website with AJAX-loaded content?

    - by Azimbadwewe
    As my title says, I'd like to build a website with just one text input for searching restaurants, and I would like to load the results as a list via AJAX on the same page. After the list is loaded, clicking on a row loads all of that restaurant's details, again via AJAX. What about SEO in a website structured like this? Is there a way to index every single restaurant? I'm pretty new to SEO and every comment will for sure be important to me in order to understand and learn more about it. Cheers

    Read the article

  • Google webmaster showing duplicate meta descriptions for search directory

    - by Mike Flynn
    What is the best way to get rid of this error in Google Webmaster Tools? Do I really need to add "- Page 2" at the end of the description?

        Description: Kansas basketball tournaments posted by organizations and teams for youth, AAU, and NCAA certified e
        Pages:
            /youth-basketball-tournaments/kansas
            /youth-basketball-tournaments/kansas?page=2
            /youth-basketball-tournaments/kansas?page=3
            /youth-basketball-tournaments/kansas?page=9
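
    One common way (at the time of this question) to tell Google that these are pages of one paginated series rather than duplicates is rel="next"/rel="prev" link elements in the head of each page; a sketch for page 2 of the Kansas listing:

        <!-- on /youth-basketball-tournaments/kansas?page=2 -->
        <link rel="prev" href="/youth-basketball-tournaments/kansas" />
        <link rel="next" href="/youth-basketball-tournaments/kansas?page=3" />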

    Read the article

  • Website performance tips?

    - by Michael Schinis
    I'm kind of having some trouble with the loading of my website. Here's a link to the website: link. It sometimes loads fast, but then when you refresh it, most of the time it will just keep trying to load images, keep doing that for a minute or so, and none of the JavaScript will execute. I have followed most of the tips given by Yahoo, except caching, which I couldn't get working properly. Does anyone know how to do proper caching of image and JavaScript files using .htaccess? Most of the code I found online won't work. Any advice whatsoever is extremely helpful. Thanks
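
    For the caching part, the usual .htaccess approach is mod_expires; a minimal sketch, assuming the host has mod_expires enabled (the lifetimes are illustrative, not recommendations):

        <IfModule mod_expires.c>
            ExpiresActive On
            ExpiresByType image/png  "access plus 1 month"
            ExpiresByType image/jpeg "access plus 1 month"
            ExpiresByType image/gif  "access plus 1 month"
            ExpiresByType text/css   "access plus 1 week"
            ExpiresByType application/javascript "access plus 1 week"
        </IfModule>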

    Read the article

  • Apache Server-Side Includes Refuse to Work (Tried everything in the docs but still no joy)

    - by raindog308
    Trying to get Apache server-side includes to work. Really simple - just want to include a footer on each page.

    Apache 2.2:

        # ./httpd -v
        Server version: Apache/2.2.21 (Unix)
        Server built:   Dec 4 2011 18:24:53
        Cpanel::Easy::Apache v3.7.2 rev9999

    mod_include is compiled in:

        # /usr/local/apache/bin/httpd -l | grep mod_include
        mod_include.c

    And it's in httpd.conf:

        # grep shtml httpd.conf
        AddType text/html .shtml
        DirectoryIndex index.html.var index.htm index.html index.shtml index.xhtml index.wml index.perl index.pl index.plx index.ppl index.cgi index.jsp index.js index.jp index.php4 index.php3 index.php index.phtml default.htm default.html home.htm index.php5 Default.html Default.htm home.html
        AddHandler server-parsed .shtml
        AddType text/html .shtml

    In the web directory I created a .htaccess with:

        Options +Includes

    And then in the document I have:

        <h1>next should be the include</h1>
        <!--#include virtual="/footer.html" -->
        <h1>include done</h1>

    And I see nothing in between those headers. Tried file=, also with/without an absolute path. Is there something else I'm missing? I see the same thing on another unrelated server (a more or less stock CentOS 6), so I suspect the problem is between keyboard and chair...
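
    Two things commonly bite with this setup: the Options +Includes line in .htaccess only takes effect if AllowOverride permits it for that directory, and on Apache 2.x the output-filter form is the documented alternative to the old server-parsed handler. A hedged sketch of a combination known to work in general; whether it matches this cPanel build's AllowOverride settings and paths is an assumption:

        # in httpd.conf, for the directory in question
        <Directory "/home/user/public_html">
            Options +Includes
            AllowOverride Options
        </Directory>

        # and either the handler already shown above, or the filter form:
        AddType text/html .shtml
        AddOutputFilter INCLUDES .shtml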

    Read the article

  • Authorship tag or Customer reviews to enhance click-through rate

    - by Prashant Singh
    I was first using the authorship tag on all the pages of my website. That gave me a pretty decent improvement in click-through rate. However, I have recently added ratings to my website, i.e. all my pages are being rated by the readers, and the same has been made available to Google via rich snippets. The result is that the image of the author has been removed from the Google search results; it now shows the ratings given by customers and just writes the name of the author. What will the impact on the click-through rate be? Is it OK to keep the ratings, or should I switch back to only the authorship tag as I was using before? Please comment if I am unclear in stating my problem. Thanks :)

    Read the article

  • Do the "Contact us" and "Privacy policy" pages affect SEO?

    - by Gkhan14
    Just like the title says, what are the effects of having a "Contact us" and a "Privacy policy" page on your site? I've read that they could build up your trust with Google; is this true? I've also read that some people say you should add a noindex tag to your "Privacy policy" page; would this be a good idea? I ask because many websites have similar privacy policies, and I don't want any duplicate-content issues (for example, many people could be using the same WordPress privacy policy generator). I'm wondering the same things about the "Contact us" page as well.
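
    For reference, the noindex tag mentioned above is just a robots meta element in the head of the page in question; the "follow" part keeps the page's links crawlable even though the page itself stays out of the index:

        <!-- in the <head> of the privacy policy page -->
        <meta name="robots" content="noindex, follow" />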

    Read the article

  • How to remove trailing slashes from URL with .htaccess?

    - by Matt
    The situation: across the entire domain, we'd like the URLs to hide file extensions and remove trailing slashes, independent of the domain name itself (as in, it should work on any domain).

    A sample of our directory structure (we're not using index.* files except for the homepage):

        /
        /index.php
        /account.php
        /account/
            subscriptions.php
        /login.php
        /login/
            reset-password.php

    The goal - some examples of how these files might be requested, and how they should look in the browser:

        /  and  /index.php                                    ->  mydomain.com  (literally just the bare domain name)
        /account.php  or  /account/  or  /account             ->  mydomain.com/account
        /account/subscriptions.php  or  /account/subscriptions/  or  /account/subscriptions
                                                              ->  mydomain.com/account/subscriptions

    As you can see, there are several ways to access each web page, but no matter which of the 2 or 3 ways you use to get there, only the one preferred URL is shown in the browser.

    The question: how is this done with .htaccess using mod_rewrite? I've banged my head against the wall trying to figure this out, but in general the rewrite flow would seem to be something like this: an external 301 redirect ( mydomain.com/account/ -> mydomain.com/account ), then internally append .php ( mydomain.com/account -> mydomain.com/account.php ). I've been Googling this all day, have read thousands of lines of documentation and config texts, and have tried several dozen times... I think more brains on this would help a lot.

    UPDATE: we found an answer to our question (see below).
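
    A rough sketch of rules matching that two-step flow (redirect trailing slashes externally, then map extensionless URLs back onto the .php files internally). This is the generic pattern rather than the answer the poster eventually found, and it assumes the rules live in the document root's .htaccess with AllowOverride permitting them:

        RewriteEngine On
        # mod_dir would otherwise re-add the slash for real directories like /account
        DirectorySlash Off

        # 1) external 301: strip a trailing slash
        RewriteRule ^(.+)/$ /$1 [R=301,L]

        # 2) internal rewrite: /account -> /account.php when that file exists
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME}.php -f
        RewriteRule ^(.+)$ $1.php [L]

        # (redirecting direct requests for *.php to the extensionless form would be a
        #  third step, not shown here, and needs a THE_REQUEST guard to avoid loops)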

    Read the article

  • Can you install Ubuntu on an XP machine and then uninstall XP? How?

    - by Eli
    I have a problem with my PC; you can read about it here if you like: http://yhoo.it/qIQyMw. Anyway, I might go for Ubuntu. The thing is, I'm in Lebanon, and here few, very few people use Linux; most of them have never heard about Ubuntu lol, therefore you'll be really lucky if you can buy an Ubuntu CD or even find someone capable of installing it. So when they fix my PC they might install XP, because they don't have a Linux operating system, and I hate Win 7 and Vista, so I'll have to download Ubuntu and install it by myself. I don't want to dual boot because I don't have a super computer lol. I have used Ubuntu on my VPS, never on a desktop before, so I would like to know: can you download Ubuntu, install it on an XP Pro machine, and then remove XP Pro? Is there any tutorial? Thank you

    Read the article

  • Can CSS be copyrighted?

    - by Emily
    I know the CSS on a website is protected under the website's copyright, since it is considered part of the overall design. I also know that images used in CSS are copyrightable. How about when CSS is used to create images? There is a CSS3 icon set that has a $25 license fee. Another developer claims those images are copyrighted and that it is illegal to use any of the icons unless you pay the fee. I say you cannot copyright a chunk of code, and if I recreate an arrow or disc icon in my CSS (whether I copy his code or write my own) he has no recourse. Can CSS, by itself, be copyrighted?

    Read the article

  • Analytics: Test events not showing up - how to troubleshoot?

    - by David Parks
    I've got 3 profiles: Master, Raw Data, and Test; on the Test profile I have no filters configured. I want to test using some events, so I created a local HTML file, shown below, to generate some test data that I could play with in Analytics. But the events never showed up in Analytics. I wonder what I might be doing wrong? Is the lack of a domain an issue, maybe?

        <html><head></head><body>Login_popup_complete_Facebook
        <script type="text/javascript">
          var _gaq = _gaq || [];
          _gaq.push(['_setAccount', 'UA-28554309-1']);
          _gaq.push(['_trackPageview']);
          _gaq.push(['_trackEvent', 'Login popup completed', 'Facebook']);

          (function() {
            var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
          })();
        </script>
        </body></html>
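
    One possibly relevant detail for local testing with classic ga.js: when the page is opened without a real domain (e.g. from a file:// path), the tracker's cookies can fail to be set, and disabling the cookie domain was the commonly suggested tweak for local tests. A hedged sketch of that change, applied to the _gaq calls above; whether it explains the missing events in this case is an assumption:

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-28554309-1']);
        _gaq.push(['_setDomainName', 'none']);   // lets ga.js set cookies without a real domain
        _gaq.push(['_trackPageview']);
        _gaq.push(['_trackEvent', 'Login popup completed', 'Facebook']);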

    Read the article
