Search Results

Search found 9721 results on 389 pages for 'quicktest pro'.


  • How do I deal with content scrapers? [closed]

    - by aem
    Possible Duplicate: How to protect SHTML pages from crawlers/spiders/scrapers? My Heroku (Bamboo) app has been getting a bunch of hits from a scraper identifying itself as GSLFBot. Googling for that name produces various reports from people who've concluded that it doesn't respect robots.txt (e.g., http://www.0sw.com/archives/96). I'm considering updating my app to keep a list of banned user-agents, serving those user-agents a 400 or similar response, and adding GSLFBot to that list. Is that an effective technique, and if not, what should I do instead? (As a side note, it seems weird for an abusive scraper to use a distinctive user-agent.)
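
    A minimal sketch of that blocklist idea in PHP, assuming the app can inspect the User-Agent header before doing any real work; the bot name comes from the question, everything else is illustrative:

        <?php
        // Hypothetical front-controller snippet: refuse known-abusive
        // scrapers up front. List contents are illustrative.
        $bannedAgents = array('GSLFBot');

        $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
        foreach ($bannedAgents as $bot) {
            if (stripos($ua, $bot) !== false) {
                header('HTTP/1.1 403 Forbidden');
                exit; // serve nothing further to the scraper
            }
        }

    Whether a 400, 403, or empty 200 is the better response is debatable; the point is to stop spending rendering time on the bot as early as possible.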

    Read the article

  • Share links with <script src=""> and SEO

    - by gansbrest
    Hi, I would like to create a share link to my website using JavaScript: script src="[url-to-my-script]". Basically, the main idea behind this is to render an HTML block with an image and a link to the website. By using a JavaScript approach I can control the look and feel of that link, and I could potentially change the link HTML in the future without touching partner websites. The only potential problem I see with this approach is SEO. I'm not sure if Google will be able to index the link to my website, since it's generated by JavaScript.
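
    A minimal sketch of what the script endpoint could look like, assuming [url-to-my-script] points at a PHP file (the filename and markup are illustrative). Since crawlers of that era generally did not execute JavaScript, a plain HTML anchor on the partner page, e.g. inside <noscript>, is what would actually get indexed:

        <?php
        // share-widget.php (hypothetical): emits JavaScript that renders
        // the share link, so partner sites only embed one <script> tag.
        header('Content-Type: application/javascript');

        $html = '<a href="http://www.example.com/">'
              . '<img src="http://www.example.com/badge.png" alt="My site" />'
              . '</a>';

        // document.write injects the block where the <script> tag sits.
        echo 'document.write(' . json_encode($html) . ');';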

    Read the article

  • Using PHP Redirect Script together with Custom Fields (WordPress)? [on hold]

    - by Alex Scherer
    I am currently trying to make Yoast's link cloaking script ( Yoast.com script manual // Github script files ) work together with the WordPress plugin Advanced Custom Fields. The script fetches two values (redirect id, redirect url) via GET and then redirects to the URL, which is defined in a .txt file called redirects.txt. I would like to change the script so that I can define both the id and the redirect URL via custom fields on each post in my WP dashboard. I would be really happy if someone could help me code something that does the same as the script above, but reads those values from custom fields instead of saving them in a redirects.txt file. Best regards! Alex
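
    A rough sketch of that idea as a small WordPress snippet (e.g. in the theme's functions.php), assuming an ACF field named redirect_url on each post; the field name and the goto query variable are illustrative, not part of the Yoast script:

        <?php
        // Hypothetical replacement for the redirects.txt lookup: resolve
        // a cloaked link like /?goto=<post-id> from an ACF custom field.
        add_action('template_redirect', 'my_cloaked_redirect');

        function my_cloaked_redirect() {
            if (!isset($_GET['goto'])) {
                return;
            }
            $post_id = (int) $_GET['goto'];
            $url     = get_field('redirect_url', $post_id); // ACF API

            if ($url) {
                wp_redirect($url, 302);
                exit;
            }
        }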

    Read the article

  • Determining cause of random latency/loading issues

    - by Sherwin Flight
    I'm not sure exactly what details to post in regards to my issue, because I'm not sure what is relevant. Prior to the end of September my websites all loaded quickly, in almost all cases; loading time wasn't usually more than a few seconds. Since the end of September, however, I have noticed a big increase in page loading times, with some pages taking 30 seconds or more to load. I have a remote monitoring service watching some of the sites, and the image below shows the response times over the past month. The response times at the beginning of the graph are what the usual response times were prior to this issue; you can see a significant increase from the beginning to the end of the graph.

    The thing is, the problem is not happening 100% of the time. If I click through the site, or even just keep refreshing the page, about 25% of the time the pages load quickly; the remaining 75% of the time they load slowly. Sometimes the pages take so long that they time out and don't load at all. I have contacted my hosting provider, and they said things at their end were fine. I don't believe the problem is my home internet provider, because all other websites load without a problem.

    The server is located in Texas, USA, which raises another interesting point. My remote monitor checks my site from two locations: California, USA, and London, England. As you can see in the chart below, the response time is actually shorter when checked from London, which doesn't seem to make sense, since the server is physically closer to the California monitoring location. I would have expected the London monitoring location to have higher response times, since it is physically farther away.

    I should also point out that in some traceroute tests I've done, the first connection to the server seems to take the longest; after that, the rest of the page loads quickly. Below is a little chart showing the times for the first connection to the server. Sending the request to the server is very quick, and receiving the reply back seems pretty quick, but the WAIT time is really long: it connects, sends the request, and then waits close to 30 seconds before it starts receiving data back.

    I am aware that there are things I can do to speed up page loading times in general, like reducing the number of CSS/JS files used on a page, compressing images, etc. That is not the source of this problem, though, because nothing has really changed on the site since before the problem started, and other sites on the same server are loading slowly as well. So, what could be causing this problem, and what steps can I take to resolve it, or at least narrow it down? Any help or advice is much appreciated.

    Read the article

  • Rewrite rule to show as directory using .htaccess

    - by chanchal1987
    I want to implement a rewrite rule in my .htaccess file to show a specific URL as a directory of my server. See the code below that I wrote:

        RewriteRule ^(.*)/$ ?page=$1 [NC]

    This rewrites URLs like www.mysite.com/abc/ to www.mysite.com/index.php?page=abc. But if I request www.mysite.com/abc (without the trailing slash), it throws a 404 error. How can I write a rewrite rule that matches both www.mysite.com/abc and www.mysite.com/abc/? Edit: My current .htaccess file (after the third revision of Litso's answer) is as below:

        ErrorDocument 401 /index.php?error=401
        ErrorDocument 400 /index.php?error=400
        ErrorDocument 403 /index.php?error=403
        ErrorDocument 500 /index.php?error=500
        ErrorDocument 404 /index.php?error=404
        DirectoryIndex index.htm index.html index.php
        RewriteEngine on
        RewriteBase /
        Options +FollowSymlinks
        RewriteRule ^(.+)\.html?$ $1.php
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)/$ ?page=$1 [NC,L]
        RewriteCond %{REQUEST_URI} !index.php
        RewriteRule ^(.*)$ ?page=$1 [NC,L]
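
    For the record, one way to accept both forms is to make the trailing slash optional in a single rule; a minimal sketch under the same ?page= scheme as in the question, guarded so real files and directories are left alone:

        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        # ^(.+?)/?$ matches both /abc and /abc/
        RewriteRule ^(.+?)/?$ ?page=$1 [NC,L]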

    Read the article

  • WordPress blog penalized by Google search - what's wrong?

    - by pawelbrodzinski
    I have a blog (http://blog.brodzinski.com), which is a wordpress.org blog running the pretty popular Thesis theme with almost no other customizations. Some time ago it was penalized by Google search: it simply stopped appearing in search results, even for search terms for which it used to be the top result, like my name, Pawel Brodzinski, which isn't anything close to a popular search term. To be exact, the site was penalized on Nov 18. It started popping up in search results on Dec 23, but only for a few days; since Dec 27 it is out again. I know the Google guidelines and I'm not aware of breaking any of them. I submitted a reconsideration request after I noticed the penalty. It was processed, and there was no change whatsoever (no surprise, as it seems the site was penalized again). I checked diagnostics in Webmaster Tools: no malware was detected and no strange search terms popped up. I read related threads on the Google Webmasters forum but found none of the solutions working for me. I posted a thread there (http://www.google.com/support/forum/p/Webmasters/thread?tid=546339f49d4a03bc&hl=en) and the only answer I got was to check for duplicate content. Well, there is some duplicate content published on the web, but that is true for the vast majority of blogs and it doesn't seem to be a reason for a penalty. Also, before Dec 27 I was able to get my duplicate content removed from a couple of sites which were republishing my feed, but this didn't change the situation; the site was penalized again. The problem is I have no idea what could be wrong with the website or how to find out. To make the problem worse, I'm no webmaster; I just run a WordPress blog, which is supposed to be easy.

    Read the article

  • Is browser and bot whitelisting a practical approach?

    - by Sn3akyP3t3
    With blacklisting, it takes plenty of time to monitor events, uncover undesirable behavior, and then take corrective action. I would like to avoid that daily drudgery if possible. I'm thinking whitelisting would be the answer, but I'm unsure if that is a wise approach, due to its deny-all, allow-only-a-few nature. My fear is that eventually someone out there will be blocked unintentionally. Even so, whitelisting would block plenty of undesired traffic to pay-per-use items such as the Google Custom Search API, as well as preserve bandwidth and my sanity. I'm not running Apache, but I'm assuming the idea would be the same: I would essentially be depending on the User-Agent identifier to determine who is allowed to visit. I've tried to account for accessibility, because some web browsers are more geared toward those with disabilities, although I'm not aware of any specific ones at the moment. I fully understand the need not to depend on whitelisting alone to keep the site away from harm; other means to protect the site still need to be in place. I intend to have a honeypot, a checkbox CAPTCHA, use of OWASP ESAPI, and blacklisting of previously known bad IP addresses.
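
    As a sketch of the mechanics (not an endorsement), a User-Agent allowlist in PHP might look like the following; the token list is illustrative and deliberately incomplete, which is exactly the risk the question worries about:

        <?php
        // Hypothetical allowlist: only User-Agents containing one of
        // these tokens get through. Real UA strings vary enormously,
        // so a list like this needs ongoing curation.
        $allowedTokens = array('Mozilla', 'Opera', 'Googlebot', 'bingbot');

        $ua      = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
        $allowed = false;
        foreach ($allowedTokens as $token) {
            if (stripos($ua, $token) !== false) {
                $allowed = true;
                break;
            }
        }
        if (!$allowed) {
            header('HTTP/1.1 403 Forbidden');
            exit;
        }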

    Read the article

  • How can I edit (in MS Expression Web) FrontPage Site Parameters (Substitutions)?

    - by Clay Nichols
    Or, asked another way: where are the values for MS FrontPage Substitutions (Site Parameters) stored, so that I can edit them in Expression Web? Background: I'm ashamed to admit that I've been maintaining our company's website in MS FrontPage for over 9 years. I'm moving it to Expression Web, which will display the Substitutions (stored as Site Parameters), but I can't figure out where to edit them. I tried searching the website's source folders (on my development PC) for the name of the parameter (s-Variable=hoursOfOperation) but did not find it (other than in the files it was actually used in).

    Read the article

  • High quality/performance shared hosting (in northern Europe)

    - by Bente
    I work as a web developer on almost all levels. However, my typical customer is a 1-5 person consulting business. They have (or want) a web page with some kind of CMS so they can perform most (or all) editing themselves. I normally opt for Concrete5 as my default CMS because it's the most user-friendly (and free) CMS I have found. I host my good recurring customers on my own server as a service, but I need a good host for the customers where I want to deliver a product and not be responsible for whatever may happen in the future. However, I still struggle with hosting! Experience shows that the typical ~1$ shared hosting is way too slow to run Concrete5 smoothly, and a VPS is out of the question because I don't want to maintain it. So, where can I find a fast (from northern Europe), reliable shared host where I can put a site and not have to worry about the server going down or being unmaintained? I expect this should cost around $10-$20, but I'm open to all kinds of suggestions because different customers have different budgets.

    Read the article

  • Google Webmaster Tools showing 6 pages submitted, 0 indexed, yet I can see them all when I search in Google?

    - by sam
    I have a small 'brochure' type site with 6 pages, and I can see all of the pages showing up in Google when I search for my site. But in Webmaster Tools, under the sitemaps section, it says 6 submitted (the blue bar of the graph), while the indexed pages (the red bar) show 0, even though the pages seem to be indexed. Any idea why this is? I don't really think it's that important, as the pages are still indexed, but it just seems odd. UPDATE 9/3/12: Having just looked in Google Webmaster Tools, it shows 11 pages indexed under the health / index status tab, but under the optimization / sitemap tab it shows 6 URLs submitted and only 1 indexed. Please see the referenced screenshots: index status and sitemap status.

    Read the article

  • Still no structured data detected in Google Webmaster Tools [on hold]

    - by user6211
    Can you give me some suggestions about what's wrong with my structured data? Google still cannot read it. It looks like this:

        <div class="identity">
          <div itemscope itemtype="http://schema.org/LocalBusiness">
            <a itemprop="url" href="http://MYDOMAIN.co.uk/"><div itemprop="name"><strong>MY_COMPANY</strong></div></a>
            <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
              <span itemprop="streetAddress">MY_ADDRESS</span>,
              <span itemprop="addressLocality">London</span>,
              <span itemprop="postalCode">SE5 MY_XYZ</span>,
              <span itemprop="addressCountry">UK</span>
            </div>
          </div>
        </div>

    Read the article

  • Is there a way I can filter traffic by page-type based upon URL structure in Google-Analytics or Google Webmaster Tools?

    - by Felix
    I have a local business directory site. I'm trying to segment my incoming traffic by page type, so that I can find out what percentage of traffic goes exclusively to zip code pages and what percentage goes to city/state level pages. I basically want to filter by URL structure to find out what percentage of total traffic the zip code pages account for. Could Google Tag Manager help with this? Here are the two URL paths: http://www.example.com/ny/new-york/10011/ http://www.example.com/ny/new-york
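
    As a sketch of the pattern matching this would need (the regexes are illustrative and assume the URL structure shown above), two patterns on the page path could drive a pair of advanced segments or profile filters in Google Analytics:

        # Zip code pages, e.g. /ny/new-york/10011/
        ^/[a-z]{2}/[^/]+/[0-9]{5}/?$

        # City/state pages, e.g. /ny/new-york
        ^/[a-z]{2}/[^/]+/?$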

    Read the article

  • Robots meta tag with "noimageindex"

    - by jimy
    I have some doubts regarding the noimageindex value in the robots meta tag. If I add this tag on my page, say http://www.example.com/somepage/someaction.php, and the images on that page are served from another domain, say http://www.exampleimg.com, will the tag have any effect? That is, will the images be ignored by bots? Or is exampleimg.com unaffected by that tag, so all the images will be indexed anyway? Note: we want to stop indexing of the images on that particular page.
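
    For reference, a sketch of the two mechanisms involved (paths and file extensions are illustrative): noimageindex is declared on the page itself, while an X-Robots-Tag response header can be sent from wherever the image files are actually served, which is closer to what the question is after:

        <!-- On the page itself (somepage/someaction.php): -->
        <meta name="robots" content="noimageindex" />

        # On the image host (exampleimg.com), in Apache config or
        # .htaccess; requires mod_headers:
        <FilesMatch "\.(png|jpe?g|gif)$">
            Header set X-Robots-Tag "noindex"
        </FilesMatch>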

    Read the article

  • Make a custom Apache handler (for multiple PHP versions)

    - by user10580
    I've successfully installed PHP 5.2 and it runs as CGI. Ultimately I want to put something like AddHandler application/x-httpd-php52 in the .htaccess file of the dir I want to run it on. However, this only really works in the virtual hosts, because I can't wrap my head around how to define a custom handler. My current config is:

        <FilesMatch "\.php">
            SetHandler application/x-httpd-php5
        </FilesMatch>
        ScriptAlias /php52-cgi /usr/lib/cgi-bin/php52-cgi
        Action application/x-httpd-php5 /php52-cgi
        AddHandler application/x-httpd-php5 .php

    How can I do something like AddHandler application/x-httpd-php52 in the .htaccess?
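
    A sketch of the usual wiring for this (paths carried over from the question, otherwise untested): define the PHP 5.2 handler name once in the server-wide or vhost config, then opt in per directory from .htaccess. Note that AddHandler in .htaccess requires AllowOverride FileInfo for that directory:

        # In the vhost/server config (needs mod_actions and mod_alias):
        ScriptAlias /php52-cgi /usr/lib/cgi-bin/php52-cgi
        Action application/x-httpd-php52 /php52-cgi

        # In the .htaccess of the directory that should run PHP 5.2:
        AddHandler application/x-httpd-php52 .php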

    Read the article

  • Website signaled as containing malware

    - by Bakaburg
    I've got a nasty problem with one of our websites: Google and other agencies have flagged it as containing malware. We haven't been able to work out how to deal with the problem. Could anyone point us in the right direction? UPDATE: I used Google Webmaster Tools to review the suspicious website, and now it says it's OK, even though I didn't change anything! How could that be? A false alarm?

    Read the article

  • Using PayPal to process credit cards in Sweden through an API [on hold]

    - by Mastikator
    I'm looking for a PayPal API that lets me process credit card payments without being redirected to a PayPal site and without forcing consumers to use a PayPal account. And it needs to work in Sweden. Of the ones I've looked at (DoDirectPayment, Express Checkout, PayPal Pro gateway), none let me process credit cards in Sweden via an API that doesn't force the user to visit the PayPal login site. I have a form on my webpage where the user types their credit card number, CVV2, expiration, name, address, etc. I need an API that works in Sweden and simply processes the request, without the step of being redirected to a PayPal website. The ones I have found only work in a select few countries; is there an international solution? I've already spent over 12 work hours just looking for an API that meets my requirements.

    Read the article

  • Does the EU cookie law apply to an EU site that is hosted outside of the EU?

    - by mickburkejnr
    I have been reading up on this EU cookie law, and have also had in-depth conversations with my girlfriend, who is a solicitor/lawyer, and with colleagues while building websites. While we are now working towards implementing a way to abide by the law, I have thought of something which no one really knows the answer to, and it has caused a few arguments. It's my understanding that any website in the EU must abide by these cookie laws, which is understandable. However, say I were to have a .co.uk or .eu domain name pointing to a website which is hosted in America, for example; do I still need to abide by the EU laws even though the website is hosted outside of the EU? One person I have asked has said that because the domain name is .co.uk or .eu (a European TLD), the website is still accountable under EU law. Another person has said that because the actual website is hosted outside of the EU, it doesn't have to bother with this law.

    Read the article

  • URL rewriting via forward proxy

    - by Biggroover
    I have an app that runs inside my firewall and talks out to multiple endpoints via HTTP/HTTPS on a non-standard port, e.g. http://endpoint1.domain.com:7171, http://endpoint2.domain.com:7171. What I want to do is route these requests through a forward proxy that rewrites the URL to something like http://allendpoints.domain.com/endpoint1 (port 80 or 443), and then on the other end have a reverse proxy that unwinds what I did on the forward proxy to reach the specific endpoints. The result would be that I can route existing app requests to specific endpoints across the internet without having to change my app software. My questions are: Is this even possible? Is it a good idea, or are there better ways to do this? Can this be done with IIS and Apache as the proxies?
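
    The reverse-proxy half, at least, is standard mod_proxy territory in Apache; a minimal sketch, with hostnames carried over from the question and everything else illustrative:

        # On allendpoints.domain.com (requires mod_proxy and
        # mod_proxy_http): map path prefixes back to the real endpoints.
        <VirtualHost *:80>
            ServerName allendpoints.domain.com

            ProxyPass        /endpoint1/ http://endpoint1.domain.com:7171/
            ProxyPassReverse /endpoint1/ http://endpoint1.domain.com:7171/

            ProxyPass        /endpoint2/ http://endpoint2.domain.com:7171/
            ProxyPassReverse /endpoint2/ http://endpoint2.domain.com:7171/
        </VirtualHost>

    The forward side could do its rewriting with mod_rewrite's [P] proxy flag, though if the app's endpoint URLs are configurable it may be simpler to point it at the allendpoints URLs directly and skip the forward proxy.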

    Read the article

  • Domain name only works with www. but not without the www

    - by Rodrigo Salazar
    I have searched for this problem numerous times, but I can't quite apply the solutions to my own case. The usual solution I find is that I need to add an A record to my DNS; currently I only have a CNAME record for host 'www'. The problem is, I'm having trouble understanding what goes in the 'host' section of the record when trying to create an A record. Here is a screenshot of my domain registrar where I am trying to create the record: http://i.imgur.com/M18zm.png. I have tried creating an A record with the 'host' field set to *, but that is not a valid entry, nor is leaving it blank. Anyone have any suggestions? The domain is swageroo.com, but it's only reachable at http://www.swageroo.com (with the www).
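
    For what it's worth, registrars usually expect @ in the host field to mean the bare domain (the zone apex); a sketch of the usual record pair in zone-file notation, with a placeholder IP address:

        ; swageroo.com zone (IP address is illustrative)
        @      IN  A      203.0.113.10    ; bare domain -> web server
        www    IN  CNAME  swageroo.com.   ; www alias -> bare domain

    If the registrar's form rejects @ as well, their support docs usually name whatever stand-in they use for the apex.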

    Read the article

  • Is it safe to have no TOS or PP?

    - by JamerTheProgrammer
    I have coded my own forums from the ground up. I have tried my best to make my code as secure as possible and to encrypt everything I can. I want to use this forum for a Minecraft server. I have one concern, however: I would like to set the forum up now, but having no TOS or Privacy Policy has put me off. Will having neither cause me any legal trouble in the unlikely event of a data leak? Thanks

    Read the article

  • Gaming Community CMS, with forum integration [closed]

    - by Tillman32
    Possible Duplicate: Which Content Management System (CMS) should I use? I've had a simple website that I coded myself for a while now; the site is a gaming community. It's very forum- and news-driven. It was a HORRIBLE idea to take on coding this thing myself. Although we've used it for about a year now, we're just getting too big, and I need to streamline our work. I need writers to be able to post news, etc.; I've been doing it through code (a year ago I thought that would be a cool idea). Anyway, I've been messing with just about every CMS out there, and I'm struggling to find something that I really like. The main issues I'm facing are a good news system and good forum integration. I'm sort of picky when it comes to looks; it's a curse. Reading on here, I see a lot of people saying Drupal is the best for the things I need: news, community interaction, and forums. The main issue I ran into with Drupal was ease of use, and themes. I am not a web designer, and I need a good theme. For an idea of what I'm looking for, go check out http://www.clgaming.net: they have forums integrated, a nice news area on the home page/news section, and nice user accounts. It looks very professional, and I doubt I'll get close to that with a free theme, but their functionality is exactly what I need. Any ideas would be greatly appreciated.

    Read the article

  • PHP application deployment to Red Hat [closed]

    - by Subhash Dike
    I come from a .NET background, with most of my work in a Windows environment with IIS, etc. I have been given the task of deploying a PHP application to a Red Hat Linux box, and all I have is the credentials and the server IP of that box. Can someone help me get started creating a new website on Apache 2.0 on the Red Hat box and then putting the PHP code onto it? I do not have physical access to that server. I know it's a very basic question, but I am in the dark here. Even pointing me to some documentation is fine, or at least the analogy with IIS would also help to a certain extent.
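
    A rough sketch of the moving parts on a typical Red Hat box of that era, assuming root SSH access (package names are the stock ones; the domain and paths are illustrative, and some of this may already be installed):

        # Install and start Apache (httpd) and mod_php:
        yum install httpd php
        service httpd start

        # Minimal vhost, e.g. in /etc/httpd/conf.d/mysite.conf:
        <VirtualHost *:80>
            ServerName   www.example.com
            DocumentRoot /var/www/html/mysite
        </VirtualHost>

        # Copy the PHP code into the document root (e.g. with scp),
        # then reload Apache:
        service httpd reload

    The rough IIS analogy: httpd.conf and conf.d/*.conf play the role of the IIS site bindings, and /var/www/html is the default wwwroot.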

    Read the article

  • Is it possible to block traffic originating from a specific country?

    - by mickburkejnr
    Hi guys, my personal website is currently getting a lot of spam comments, and most of them originate from Russia (I've used Google Analytics to identify the traffic, and a lot of the links point to Russian sites). As it's a pain to keep deleting these comments, I would like to stop people from there from commenting on, or even visiting, the website. Is this possible? The website is running WordPress. Many thanks!
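
    One common approach is country blocking at the web server; a hedged .htaccess sketch assuming Apache 2.2 with mod_geoip and its country database installed (WordPress plugins can do the same at the application level):

        # Requires mod_geoip with a GeoIP country database.
        GeoIPEnable On

        # Flag requests whose source IP geolocates to Russia (RU),
        # then deny them.
        SetEnvIf GEOIP_COUNTRY_CODE RU BlockCountry
        Order Allow,Deny
        Allow from all
        Deny from env=BlockCountry

    IP geolocation is approximate and determined spammers use proxies, so this is best combined with comment-level defenses (e.g. a spam-filtering plugin) rather than used alone.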

    Read the article

  • WordPress plugin installation error

    - by Steve
    I'm trying to upload secure-wordpress.1.0.6, and I receive the following error:

        Warning: touch() [function.touch]: open_basedir restriction in effect.
        File(/abs_path/wordpress/tmp/secure-wordpress.tmp) is not within the
        allowed path(s): (/abs_path/:/abs_path/:/usr/local/lib/php:/tmp/php_upload)
        in /abs_path/public/www/wordpress/wp-admin/includes/file.php on line 199
        Download failed. Could not create Temporary file.

    The /wp-content folder and all its subfolders have 777 permissions. I've added the following two lines to wp-config.php:

        putenv('TMPDIR='.ini_get('upload_tmp_dir') );
        define('WP_TEMP_DIR', ABSPATH .'wp-content/uploads/');

    What else should I try? I am using WordPress 3.0.4 in a PHP 4.4.9 environment.
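
    The paths in the error are anonymized, but it shows the temp file still being created under wordpress/tmp and rejected by open_basedir, which suggests the WP_TEMP_DIR override isn't taking effect. A hedged sketch of what's usually checked (the directory name is illustrative): the define should sit near the top of wp-config.php, before WordPress loads, and the directory must already exist and be writable by the web server:

        <?php
        // Near the top of wp-config.php: force temporary files into a
        // directory that open_basedir permits. Create it first, e.g.
        // wp-content/tmp, with write permission for the web server.
        define('WP_TEMP_DIR', dirname(__FILE__) . '/wp-content/tmp/');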

    Read the article

  • Preventing Duplicates on Google

    - by abel
    I am currently using a rewrite rule to enable access to .php pages without the .php extension. To prevent old links from breaking, the pages can still be accessed via links containing the .php extension too. For example, domain.com/page.php can now be accessed at domain.com/page. All the links within the site now use the domain.com/page form. However, older incoming links will still point to the .php pages, meaning Google will index both URLs and mark them as duplicates. I have two plans to remedy the situation. 1. Use a PHP 301 redirect: when a page is accessed with the .php extension, redirect it individually, using PHP, with a 301 to the extensionless URL. 2. Use a canonical tag: place a canonical tag on each page, pointing to the ".php"-less version. My question: are both methods equally efficacious in preventing Google from indexing my ".php" pages? Which method should be preferred, by convention or otherwise?
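
    A minimal sketch of the 301 option in PHP, written once as a shared snippet rather than per page (assuming every page can include it before any output is sent; query strings, if present, would need carrying over as well):

        <?php
        // redirect-canonical.php (hypothetical), included at the top of
        // each page: if the request used the .php extension, issue a
        // permanent redirect to the extensionless URL.
        $path = strtok($_SERVER['REQUEST_URI'], '?'); // drop query string
        if (substr($path, -4) === '.php') {
            header('HTTP/1.1 301 Moved Permanently');
            header('Location: ' . substr($path, 0, -4));
            exit;
        }

    The canonical alternative is a single <link rel="canonical" href="..."> in each page's head; the 301 has the advantage of also consolidating visitors, not just crawlers, onto one URL.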

    Read the article
