Search Results

Search found 9721 results on 389 pages for 'quicktest pro'.

Page 106/389

  • Best way to block "comment spam" postings to web forms? [closed]

    - by David Jones
    Possible Duplicate: Make your site anti-bot? I have a custom web form on my PHP-based site. Recently it has been getting a regular stream of comment-spam postings from a few specific IP addresses. Question: what is a good way to block a small set of blacklisted IP addresses from accessing my site? I was thinking it should be possible to use .htaccess to respond with status code 403 (Forbidden) to all HTTP requests from the blacklisted addresses, but I am not sure exactly how to do that. If anyone knows the .htaccess syntax needed to accomplish this, please let me know. Thanks in advance.
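
    A minimal sketch of one way to do this in .htaccess, assuming Apache 2.2-style mod_authz_host directives and documentation-range placeholder addresses (on Apache 2.4 the equivalent is a <RequireAll> block with "Require all granted" plus "Require not ip ..."):

        # Return 403 to the blacklisted addresses, allow everyone else
        Order Allow,Deny
        Allow from all
        Deny from 203.0.113.45
        Deny from 198.51.100.0/24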

  • IE6 Support - When to drop it? [closed]

    - by Scott Brown
    Possible Duplicate: Should I bother supporting IE6? I'm thinking about IE6 support and when to give up on it. Do you have a percentage of total visitors in mind at which to drop support? Would you let a trend develop past that figure, or would you just take the first opportunity? I've seen a 44% drop in IE6 visitors in the past 12 months, from 23%(ish) of visitors down to 13%(ish). Even if it were 5%, it would still seem too early to me to drop support (that's still 1 in every 20 users). What are people's thoughts on this?

  • Google webmaster tools / Geographic location settings

    - by JochemTheSchoolKid
    I am building a website on a .nl domain. At the moment it only shows up on google.nl, and I would like it to be findable on all the Google sites (google.com, google.co.uk, and so on). The Google forums say to go to Webmaster Tools and change the geographic target there, but I have added the site and cannot change it: there is no select box. I have no idea where else to search (yes, I searched on Google, of course) or where to ask about this particular problem, so maybe someone here can point me in the right direction or explain what is and is not possible. The question is: can I make a .nl domain findable on (almost) all Google search sites, and if so, how? Picture of my Google Webmaster Tools (nl): http://i.stack.imgur.com/ZuP4L.png

  • Google Webmaster Central tells me that robots is blocking access to the sitemap

    - by Gaia
    This is my robots.txt:

        User-agent: *
        Disallow: /wp-admin/
        Disallow: /wp-includes/
        Sitemap: http://www.mydomain.org/sitemap.xml.gz

    But Google Webmaster Central tells me that robots.txt is blocking access to the sitemap: "We encountered an error while trying to access your Sitemap. Please ensure your Sitemap follows our guidelines and can be accessed at the location you provided and then resubmit: URL restricted by robots.txt". I have read that Google Webmaster Central caches robots.txt, but the file was updated more than 10 hours ago.

  • Multiple 301 redirects, do search engines/viewers see them all?

    - by Karim
    I've put in place lots of different 301 rules to deal with numerous URL changes, and for certain URLs there are 3-4 different 301 redirects before the visitor lands on the new URL. I heard that a 301 loses some PageRank/link juice. All the 301s are on-site, for the same domain, with a mix of PHP 301s and .htaccess 301s. So for instance:

        articles/news.php?id=2  ->  articles/blog.php?id=2    [filename change]
        articles/*              ->  /*                        [subdir to root]
        /blog.php?id=2          ->  /title-of-post            [mod_rewrite URL change]

    So if you were to visit /articles/news.php?id=2 there would be two 301 redirects until you land on /yellow-wellington-boots/. My question is: does Google see the intermediate redirects, or just the final page the 301s redirect to?
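
    A hedged sketch of the usual fix for the chain itself, reusing the example URLs above and assuming Apache mod_rewrite rules in the site root: point the oldest pattern straight at its final destination, so each legacy URL answers with a single 301 instead of several. (The hard-coded id=2 / /title-of-post pair is just the example from the question; a real rule set would map every id, for instance via a RewriteMap or on the PHP side.)

        # Hypothetical rule: send the old article URL directly to its final form
        RewriteEngine On
        RewriteCond %{QUERY_STRING} ^id=2$
        RewriteRule ^articles/news\.php$ /title-of-post? [R=301,L]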

  • Authentication system brainstorm

    - by gansbrest
    Hi. We have multiple small websites (microsites) and one main, high-traffic site with a big user base. The requirement is to build an authentication system that lets users log in with the same identity across the network. All the sites run on different domains, are powered by the Drupal 6 CMS, and have separate databases (so sharing tables via a prefix is not an option, and it creates a huge mess in the DB anyway). Here is the set of core requirements I came up with:

        1. Users should be able to log in with the same credentials on all sites within the network.
        2. User data is shared between the main site (the storage) and all microsites within the network.
        3. Data is synchronized across the network when the user changes it (updates an email address or password, for example).
        4. The login/registration process should be seamless and consistent.
        5. Users can register on any of the sites across the network and use that identity to log in later on.

    In the future there might be a need to add OpenID authentication options. Basically we are looking at something similar to what Stack Exchange does, but I am not sure whether they have a central user base or not. I was thinking about a custom solution with two parts (modules): one lives on the main site, storing user data and responding to requests from clients; the second is placed on each microsite and sends requests to the master. Some kind of client-server setup. One complication I see right away is #3, data synchronization across the network. I just don't want to reinvent the wheel, and maybe some work has already been done in this direction. Looking forward to your ideas on how to approach this project. EDIT: we use a MySQL database.
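
    A hedged sketch of the client-server idea described above, in PHP (every function name, endpoint and secret here is a hypothetical placeholder, not something from the post): the microsite module forwards the login attempt to an auth endpoint on the main site over HTTPS and trusts its verdict.

        <?php
        // Hypothetical microsite-side check: POST the credentials to the main
        // site's auth endpoint and read back a JSON verdict. A per-site shared
        // secret is sent along so the master only answers known microsites.
        function network_login($username, $password) {
            $payload = http_build_query(array(
                'username' => $username,
                'password' => $password,
                'site_key' => 'PER-SITE-SHARED-SECRET',  // placeholder
            ));
            $context = stream_context_create(array('http' => array(
                'method'  => 'POST',
                'header'  => "Content-Type: application/x-www-form-urlencoded\r\n",
                'content' => $payload,
            )));
            $response = file_get_contents('https://main.example.org/auth/verify', false, $context);
            if ($response === false) {
                return false;  // master unreachable: fail closed
            }
            $result = json_decode($response, true);
            return isset($result['ok']) && $result['ok'] === true;
        }

    Keeping the main site as the single writer for account data, with the microsites only reading or proxying, is one way to sidestep most of the synchronization problem in requirement #3.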

  • What is the proper way to setup Google Apps email accounts for a subdomain?

    - by binaryorganic
    Let's say I've registered domain.com at enom and set it up to use Google Apps for email by rerouting DNS to enom's servers and editing the MX records there. That works flawlessly. Now let's say I want to have email at a subdomain for that same site. I already have a working subdomain at the host, but I want to catch email traffic at enom before it gets that far. I've set up Google Apps as a new account for the subdomain, successfully verified domain ownership, and now they want me to update MX records. What's the right format? For domain.com, I just put @ for the hostname, and then provided the Address and Pref values that Google gave me. I tried putting subdomain.domain.com as new values under hostname for the subdomain, but that doesn't seem to work. What am I doing wrong?
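
    A hedged sketch of what the subdomain's records might look like, written zone-file style to mirror enom's Host / Address / Pref columns ("sub" is a placeholder label; many registrar UIs expect just the bare subdomain label in the host field rather than sub.domain.com, which may be why the fully qualified form did not take):

        ; Mail for anything@sub.domain.com
        sub    IN  MX  1   ASPMX.L.GOOGLE.COM.
        sub    IN  MX  5   ALT1.ASPMX.L.GOOGLE.COM.
        sub    IN  MX  5   ALT2.ASPMX.L.GOOGLE.COM.

    The exact host names and priority values to use are whichever ones Google's setup instructions show for the new account.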

  • Maximum file size for iFrame in IE7

    - by Peter Turner
    I've got a "super secure" javascript downloader* that I wrote, and it usually works alright. But I noticed, while trying to download a 90 meg file with it on a client's machine that on IE7, it's getting hung up about 1/3rd of the way through. I've never tried to send a file that large through the iFrame and it works fine in other browsers. Is there a size restriction on files that IE7 can read in an iFrame? * It's really just a PHP line that sets header("location: http://someplace/downloadbigthing.exe"); after it does some logging and verification.

  • Lost Traffic from Google After Adding a Meta Tag

    - by Marian
    I have a site, aroundnails.com, with an English version on the subdomain en.aroundnails.com. After reading about language-related meta tags for Google, I placed this tag on the main page of the main site: <link rel="alternate" hreflang="en" href="http://en.aroundnails.com/" /> In this way I tried to tell Google that the site at en.aroundnails.com is the English version of the main site, not a duplicate. Within a fortnight I had lost a huge part of my traffic from Google, more than half. At the beginning of September I removed the tag, but traffic has remained at the same level. I hope somebody can help me solve this issue.
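
    For comparison, a hedged sketch of how hreflang annotations are normally paired: the tags go on both versions, and each page lists itself as well as its alternates (the "xx" code below is a placeholder for the main site's actual language, which the post does not name).

        <!-- on http://aroundnails.com/ -->
        <link rel="alternate" hreflang="xx" href="http://aroundnails.com/" />
        <link rel="alternate" hreflang="en" href="http://en.aroundnails.com/" />

        <!-- and the same two lines again on http://en.aroundnails.com/ -->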

  • How long for data highlighter mark up to appear in structured data tool?

    - by Max
    I used the Data Highlighter in Webmaster Tools over 3 weeks ago to mark up some local business data, but no structured data is being detected in Webmaster Tools yet. Does anybody have any experience of roughly how long it takes for Google Webmaster Tools to start reporting structured data that has been marked up with the Data Highlighter? I'm asking specifically about reporting in the Webmaster Tools Structured Data section, as opposed to actually appearing in the SERPs.

  • Fix 403 errors in Google Webmaster Tools

    - by Justin
    Hi team, I have a domain that has "fallen off a cliff" for searches in Google. Searches that used to be in position 1-4 are now gone from page 1. The same search in Bing shows the typical expected position (top 5 results). In reviewing Google Webmaster Tools, I am seeing two problems:

        1. The sitemap is reporting two errors: "General HTTP error: HTTP 403 error (Forbidden)" and "URLs not accessible". However, the URL they flag as not accessible is accessible: I can click the link Google provides and it works fine.
        2. There are 6,000 crawl errors of type 403. Again, most of the pages reported as 403 are accessible in my browser (I tried various browsers as well). About half are from January, the other half from November.

    There are no IP-specific firewall rules on ports 80 and 443 that could block the Googlebot. Using the user-agent switcher add-on for Firefox, I confirmed that the pages load when the user agent is the Googlebot, and I can confirm that most of the pages reported as 403 are accessible. A search for just "site:thedomain.com" confirms there are over 9,000 pages in the index, but most searches don't return the site. I believe the 403 issues are the cause of the fall in search rankings, but I can't seem to find any information online about how to address this. Any ideas? jpe
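
    A hedged way to double-check those 403s from the command line (assuming curl is available; the path below is a placeholder for one of the URLs from the crawl-error report): request the page with a Googlebot user-agent string and see whether the server answers 200 or 403.

        curl -I -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" \
             "http://thedomain.com/some-page-from-the-error-report/"

    Since -I sends a HEAD request and some servers treat HEAD differently from GET, it is worth repeating the check without -I as well.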

  • Prevent mail flagged as spam when switching mail servers (new SPF records)?

    - by Jakobud
    For our business, we send out a significant number of newsletter alerts to customers who sign up for them on our website. We used to send this mail directly from our web server via PHP, but because the web server limited the number of emails we could send per day, we purchased a VM server at a different host (one that doesn't throttle email) and we are going to use that account solely for sending out the emails. Now that the SPF records are going to be different from what they used to be and the source mail server is different, what steps need to be taken to prevent these emails from being flagged as spam? I know Gmail is pretty smart about determining whether the party actually sending the email is sending it from the server it expects (for flagging phishing emails, etc.), and we don't want that to happen to our emails. Just sending a couple of test emails out, Gmail shows the SPF result as:

        Authentication-Results: mx.google.com; spf=neutral
          (google.com: XXX.XXX.23.176 is neither permitted nor denied by best
          guess record for domain of [email protected]) [email protected]

    So is there anything we need to do with regard to SPF records as we move forward?
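
    A hedged sketch of the kind of SPF record that turns that "neutral" into a "pass", written zone-file style ("ourdomain.com" is a placeholder for the actual sending domain, and the ip4 value is the masked VM address from the header above):

        ; "mx a" keeps the existing mail hosts authorized, the ip4 term adds the
        ; new VM, and "~all" soft-fails everything else
        ourdomain.com.   IN  TXT  "v=spf1 mx a ip4:XXX.XXX.23.176 ~all"

    Alongside SPF, giving the new VM a matching reverse-DNS (PTR) name and setting up DKIM signing are the other usual steps before moving bulk mail to it.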

  • Ideal web application framework for newcomers, and whether it is better to use a Java-based or PHP-based framework?

    - by Pawan
    My primary question is whether a Java-based web application framework or a PHP-based one is better, and why. Moreover, if I were just starting web development, what would be some good frameworks to start with, considering I may want to build a full CMS out of it later? I am not looking for a 'best', just some good recommendations, as I understand CodeIgniter may not have much of a future: http://heybigname.com/2012/05/06/why-codeigniter-is-dead/

  • Domains with similar names and issues

    - by abel
    I recently purchased one of those domain names like del.ico.us. While registering, I found that delicious.com was already in use. Argument: delicious.com belongs to the same category as my to-be website; it serves premium delicious dishes. Counter-argument: my to-be domain, though belonging to the same category, specializes in serving free but delicious dishes, or in giving out (affiliate) links to other sites serving premium delicious dishes. Additional counter-arguments: 1. delicious.com is not in English. 2. The del.icio.us in my domain name, though spelled the same, is not going to be used in the same fashion. For example (this may not make sense, because the names have been changed), the d in delicious on my website actually stands for the Greek letter delta (Δ/d), and since internationalized domains are still not easily typable, I am going for the English equivalent. The prefix holds importance for the theme of the service my website intends to offer. My question: can I use the domain name del.icio.us for my website? How are these kinds of matters dealt with? (The domain names used are fictitious. I have already registered the domain but have not started using it. I chose this domain name because it was short, easy to remember and suited the theme of my website.)

  • Facebook share and Open Graph

    - by hannit cohen
    I'm managing a video blog. The blog has a main YouTube video on the homepage and several thumbnail images of other videos elsewhere. I have a share button for the main page and for the single-post pages, and I have added Open Graph tags to specify which image should be used. For some reason Facebook ignores my Open Graph image and uses other images it finds on the page. The header looks like this (for the homepage; the Hebrew text in the content attributes was lost to encoding and shows as question marks):

        <!-- Facebook Opengraph -->
        <meta property="fb:app_id" content="155967927783206" />
        <meta property="og:url" content="http://mttv.co.il/2010/12/%d7%9e%d7%a4%d7%92%d7%a9-%d7%a1%d7%99%d7%99%d7%a2%d7%95%d7%aa-%d7%92%d7%a0%d7%a0%d7%95%d7%aa-%d7%9c%d7%93%d7%95%d7%a8%d7%aa%d7%99%d7%94%d7%9d-1959-2010-%d7%91%d7%9e%d7%a2%d7%9c%d7%95%d7%aa/"/>
        <meta property="og:title" content="???? ?????? ????? ???????? 1959-2010 ??????" />
        <meta property="og:description" content="???? ????? ?? ?????? ????? ?????? ????????? ???? ?????? 1959 ??? 2010 ?????? ????? ??????? ???' ??? ??? ????? ???? ??????? ???????? ?????? ?&quot;? ???? ??? ?&quot;? ????? ?????? ????? ???? 08.12.10 " />
        <meta property="og:type" content="article" />
        <meta property="og:image" content="http://www.mttv.co.il/wp-content/uploads/2010/12/Gvi91UEjCAw_mid-135x77.jpg" />

    The website address is: http://mttv.co.il Any help will be appreciated.

  • Moving blog to hosted location - impact to SEO?

    - by j0nes
    I run a blog under mywebsite.com/blog. This is a self-hosted Movable Type installation. Now I want to get rid of that installation and use a hosted (WordPress.com) solution instead. The blog is only a minor part of my website, but before making this change I want to make sure I don't lose SEO value. What is the best way to make this change? Bonus question: how big is the SEO impact of a subdirectory (mywebsite.com/news) on the PageRank of the main domain (mywebsite.com)?
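
    Not asked directly, but the usual companion to such a move, sketched here on the assumption that Apache serves mywebsite.com and with a placeholder address for the hosted blog: permanent redirects from the old /blog URLs pass most of the existing link value on to the new location.

        # Hypothetical mod_alias rule in mywebsite.com's .htaccess or vhost config
        RedirectMatch 301 ^/blog/(.*)$ http://myblog.wordpress.com/$1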

  • How important is the choice of domain registrars? [closed]

    - by Harry Muscle
    We're consolidating all of our hosting onto one provider; however, this provider is strictly a hosting provider, not a domain registrar. The question I have is: how important is the choice of registrar? In other words, if I point the domains to the hosting company's name servers, can the reliability of my registrar affect my websites in any way? If the registrar were to go down for a few days, would that impact the accessibility of the websites? What would it impact? Thanks, Harry

  • PCI compliance when using third-party processing

    - by Moses
    My company is outsourcing the development of our new e-commerce site to a third-party web development company. The way they have set up our site to handle transactions is by having the user enter the necessary payment info, passing that data to a third-party merchant that processes the payment, and then completing the transaction if everything is good. When the issue of PCI DSS compliance was raised, they said: "You won't need PCI certification because the client's browser will send the sensitive information directly to the third-party merchant when the transaction is processed. However, the process will be transparent to the user because all interface and displays are controlled by us. The only server required to be compliant is the third-party merchant's, because no sensitive card data ever touches your server or web app." Even though I very much trust and respect the knowledge of our web developers, what they are saying raises some serious red flags for me. The way the site is described, I am sure we will not be using a hosted payment page like PayPal or Google Checkout offers (how could we maintain control over the UI if we were?), and while my knowledge of e-commerce is laughable at best, it seems like the only other option would be to use XML direct to communicate with our third-party merchant for processing. My two questions are as follows: 1. Based on everything you've read, is "XML direct" the only option they could conceivably be using, or is there another method I don't know of that they could be implementing? 2. Most importantly, is it true that our site does not need PCI certification? As I understand it, using the XML direct method means that we do have to be PCI DSS certified, and the only way around getting certified is through a hosted payment page (i.e. PayPal).
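
    To make the mechanism being described concrete, a hedged sketch of a "direct post" checkout form (every field name and URL is a hypothetical placeholder): the card fields sit in a form whose action points at the gateway, so the card data travels from the browser straight to the processor while the merchant's pages still control the look and feel.

        <!-- Hypothetical checkout form: the page is served by the merchant,
             but the POST goes directly to the payment gateway -->
        <form action="https://gateway.example.com/transact" method="post">
          <input type="hidden" name="merchant_id" value="OUR-MERCHANT-ID">
          <input type="hidden" name="return_url" value="https://shop.example.com/order/complete">
          <label>Card number <input type="text" name="card_number" autocomplete="off"></label>
          <label>Expiry (MM/YY) <input type="text" name="card_expiry"></label>
          <label>CVV <input type="text" name="card_cvv"></label>
          <button type="submit">Pay</button>
        </form>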

  • What Ranking Factors Are Used For International Search?

    - by Itai
    Google.com, Google.ca, Google.co.uk (etc.) all rank their results differently; the intention is to return more locally relevant content. What factors, other than the ones below, are used to determine local relevancy? I already know about the TLD (.com, .ca, etc.) and probably the server IP address, but there has to be more, as this would not explain some search results I noticed this week. In particular, I see a US-based site ranking #3 for some keywords on Google.com, ranking #5 on Google.ca, and not ranking within the first pages on Google.co.uk. On Google.com it outranks an Australian site which outranks it on Google.ca. The site itself is relevant for all English-speaking locations, yet it is being outranked by sites from different regions on different Google TLDs (but not by sites from the same region as the TLD).

  • Inheriting projects - General Rules?

    - by pspahn
    This is an area I have long been curious about, but I generally lack the experience to give myself an answer I would fully trust. We've all been there: a new client shows up with a half-complete project they are looking to finish and launch. For whatever reason, they fired their previous developer, and it's now up to you to save the day. I am just finishing a code review for a new client, and in my estimation it would be better to scrap what the previous developers built and start from scratch. There are a ton of reasons why I am leaning that way, but it still makes me nervous, since the client isn't going to want to hear "those last guys built you a big turd, and I can either polish it or throw it in the trash". What are your general rules for accepting these projects? How do you determine whether it is better to start from scratch or continue with the existing code base? What extra steps might you take to help manage client expectations, since the previous developer may have inflated those expectations beyond a reasonable level? Any other general advice?

  • .htaccess does not work without index.php on CodeIgniter

    - by Mattia
    I have read a lot of topics describing the same problem, but I have not found a solution. I have a LAMP stack on an Ubuntu server. My document root is /home/utente/, and inside it there is another directory (turni) containing a CodeIgniter web app. The web app works fine with index.php in the URL, but I want to eliminate it. I have this configuration:

    config.php in CodeIgniter:

        $config['index_page'] = '';

    .htaccess:

        RewriteEngine On
        RewriteBase /
        RewriteCond %{REQUEST_URI} ^system.*
        RewriteRule ^(.*)$ /index.php?/$1 [L]
        RewriteCond %{REQUEST_URI} ^application.*
        RewriteRule ^(.*)$ /index.php?/$1 [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ index.php?/$1 [L]

    /etc/apache2/sites-available/default:

        <VirtualHost *:80>
            ServerAdmin webmaster@localhost
            DocumentRoot /home/utente
            <Directory />
                Options FollowSymLinks
                AllowOverride None
            </Directory>
            <Directory /home/utente/>
                Options Indexes FollowSymLinks MultiViews
                AllowOverride None
                Order allow,deny
                allow from all
            </Directory>
            ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
            <Directory "/usr/lib/cgi-bin">
                AllowOverride None
                Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
                Order allow,deny
                Allow from all
            </Directory>
            ErrorLog ${APACHE_LOG_DIR}/error.log
            # Possible values include: debug, info, notice, warn, error, crit,
            # alert, emerg.
            LogLevel warn
            CustomLog ${APACHE_LOG_DIR}/access.log combined
            Alias /doc/ "/usr/share/doc/"
            <Directory "/usr/share/doc/">
                Options Indexes MultiViews FollowSymLinks
                AllowOverride None
                Order deny,allow
                Deny from all
                Allow from 127.0.0.0/255.0.0.0 ::1/128
            </Directory>

    When I open a link of the web app without index.php in the URL, the server shows me this error: "The requested URL /turni/auth/login was not found on this server." Why? If I put index.php back in, like /turni/index.php/auth/login, everything works fine.
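
    One hedged observation on the configuration above, offered as a change to test rather than a confirmed fix: with AllowOverride None set for /home/utente/, Apache never reads the .htaccess file at all, so none of the rewrite rules in it can run. Allowing overrides for that directory (and reloading Apache) is the usual first step; if the rules then fire but still miss, the RewriteBase may also need to be /turni/ rather than /, since the app lives in that subdirectory.

        <Directory /home/utente/>
            Options Indexes FollowSymLinks MultiViews
            AllowOverride All
            Order allow,deny
            allow from all
        </Directory>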

  • Is Movable Type among the most secure PHP blogs? How secure are the various PHP blog applications?

    - by user6025
    Basically I'm trying to find a blog platform for a website, and security is the highest priority in our case. We don't need any features that I would imagine are special. WordPress was our first idea, but its reputation precedes it, and though it may have cleaned up its act lately, I'm not seeing much solid evidence. I get the impression that Movable Type (at least the Perl version) has a much better reputation for security than WordPress (historically, at least). I'm not sure I want to take a chance with WordPress at this point, but is there some objective source I can go to to back up (or counter) the notion that MT is at least among the best? Secunia doesn't recommend using their stats for comparisons, and securityfocus.com doesn't have stats at all that I can see. Searching http://web.nvd.nist.gov makes MT look much better than WP (at least in 2007), but this site was referenced by MT's own page boasting about their security, so I don't know how relevant it is or how seriously people take it. Any suggestions for sites where I could make a somewhat objective comparison?

  • Privacy policy and terms of use language

    - by L. De Leo
    I have a Czech-registered business through which I'm serving a web app aimed mostly (but not exclusively) at Italian customers. The server is in Amsterdam. The site will be multilingual (with 4 languages supported), but for now it is Italian only. What language should the privacy policy and the terms and conditions be in? What law should they refer to? Could I just offer these two documents in English? (Easier to write and to maintain.)

  • Why is my page ranked *very* low in Google? [closed]

    - by duality_
    I have created a web site mianatura.net that doesn't even rank in the top 100 results in Google for the query "Mia Natura". I have the text Mia Natura in the domain, <title>, <h1>, I have the site in Google Webmaster Tools, the site is crawled (finding 172 results for site:mianatura.net). I have checked my standing manually (going through the SERPs), using What Page of Search Am I on and diyseo. The site is not involved in any dubious link building campaigns (as far as I know). So what's going on?

  • 3 Scenarios for most relevant keywords in website. Which one is best?

    - by Sam
    A webpage about tomato soup has one of the three following filenames:

        Scenario 1: website.org/en/tomato-soup
        Scenario 2: website.org/en/tomato-soup-healthy-soups-recipes
        Scenario 3: website.org/en/tomato-why-sandra-is-so-wild-about-her-healthy-tomato-soup-recipes

    Q1. Which one of the above would you go for?
    Q2. Which one of these would be ranked as most relevant by Google?
    Q3. Would any of these be penalized for keyword stuffing?
