Search Results

Search found 9721 results on 389 pages for 'quicktest pro'.


  • Google Sites page never shows up in Google Search organic results?

    - by gus
    I use Google Sites (i.e.: https://sites.google.com/site/EXAMPLE/ ) as a convenient way to maintain up-to-date info on several residential properties - info that's often requested by my property agents. The site has been around for about a year, but I can never get it to appear in organic Google or Bing search results, even when I search for specific keywords such as the street names. I submitted the URL manually to the search engines, knowing that my Sites page probably has very few incoming links. Is this expected behavior? The content of my page is simple formatted text, with outgoing links to Picasa/G+/imgur photo albums. Am I doing something wrong, or do all Google Sites pages have poor organic search rank? Thank you very much.

  • Internet Explorer menu z-order problem [migrated]

    - by robgt
    I have what appears to be a z-order problem with Internet Explorer 9. It might affect other IE versions also, but I haven't tested them, so I have to assume it does. This page: http://www.modelhelicopters.co.uk/partsfinder/trex500esp/frames If you hover over the "All pages for this model" menu item on the parts finder menu bar (below the currency selector), it should drop down a list of all the parts finder pages for the selected model helicopter. If you view the same page in Firefox or Chrome, you will see how it should appear. In IE9, the menu gets cut off at the top of the main exploded-view image, suggesting the z-order is wrong. I have tried amending this with a jQuery snippet, but it didn't fix IE9. I know the code was inserted by jQuery, as shown by Firebug in Firefox:

        $j('div.std img[src*="/partsfinder/img"]').attr("style","position:relative;z-index:-100;");

    I really do not know why this is not working.

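    For what it's worth, a sketch of the usual alternative: instead of pushing the image down the stacking order with a negative z-index (which IE versions handle inconsistently), raise the dropdown's positioned ancestor above the page content. The selectors here are hypothetical stand-ins for the real menu markup:

        /* Hypothetical selectors - substitute the real menu bar and dropdown. */
        .menu-bar {
            position: relative;  /* establishes the stacking context */
            z-index: 1000;       /* above the exploded-view image */
        }
        .menu-bar .dropdown {
            position: absolute;  /* inherits the menu bar's stacking context */
        }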

  • MCrypt Module, Rijndael-256

    - by WernerCD
    An outside company is redoing our company intranet. During some basic usage I discovered that the "User Edit" screens, with their "Password: *" boxes, have the password in plain text, relying on the text box's type="password" to "hide" it. The passwords are not stored in the database as plain text; they are stored encrypted with the Rijndael-256 cipher using the mcrypt module. I know that if I encrypt a password with SHA* (really a one-way hash), the password is "unrecoverable". Is the same true of mcrypt's Rijndael-256 encryption? Shouldn't an encrypted password be unrecoverable? Are they blowing smoke up my rear, or just using the wrong technology?

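    The distinction at issue: SHA-* is a one-way hash, while Rijndael-256 via mcrypt is a reversible cipher - anyone holding the key (which must live somewhere the intranet code can read) can decrypt every password. A minimal PHP sketch of that reversibility, using the era-appropriate mcrypt API (deprecated in PHP 7.1 and removed in 7.2), with password_hash() shown as the one-way alternative:

        <?php
        // Encryption is reversible by design: the key that encrypted the
        // password decrypts it, so ciphers are the wrong tool for passwords.
        $key = str_repeat('k', 32);  // 256-bit key (demo value only)
        $iv  = mcrypt_create_iv(
            mcrypt_get_iv_size(MCRYPT_RIJNDAEL_256, MCRYPT_MODE_CBC),
            MCRYPT_DEV_URANDOM
        );
        $cipherText = mcrypt_encrypt(MCRYPT_RIJNDAEL_256, $key, 'secret',
                                     MCRYPT_MODE_CBC, $iv);
        $plainText  = rtrim(mcrypt_decrypt(MCRYPT_RIJNDAEL_256, $key, $cipherText,
                                           MCRYPT_MODE_CBC, $iv), "\0");
        echo $plainText;  // "secret" - fully recoverable

        // A hash cannot be reversed; verification re-hashes the candidate.
        $hash = password_hash('secret', PASSWORD_DEFAULT);  // PHP 5.5+
        var_dump(password_verify('secret', $hash));         // bool(true)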

  • Connecting a remote MySQL database to a local MySQL database? [migrated]

    - by Shashank
    I want to write PHP code to be embedded in a Drupal 7 module. I want to call a procedure that copies newly generated data in my local MySQL database to a remote MySQL database. When data is inserted into table 'A' of my local database, it should be copied to the specific table 'B' of the remote MySQL server's database. Table 'A' is on localhost; table 'B' is on the remote server. Insert data into 'A' - the data is copied into 'B'. Is this possible? Thanks for the help.

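    A minimal sketch of the PHP side, assuming plain mysqli rather than Drupal 7's database API, and with hypothetical hosts, credentials and columns:

        <?php
        // Hypothetical hosts, credentials and schema - adjust to your setup.
        $local  = new mysqli('localhost', 'user', 'pass', 'localdb');
        $remote = new mysqli('db.remote-example.com', 'user', 'pass', 'remotedb');

        $name  = 'foo';
        $value = 'bar';

        // Insert into local table A...
        $stmt = $local->prepare('INSERT INTO A (name, value) VALUES (?, ?)');
        $stmt->bind_param('ss', $name, $value);
        $stmt->execute();

        // ...then replay the same row into remote table B.
        $stmt = $remote->prepare('INSERT INTO B (name, value) VALUES (?, ?)');
        $stmt->bind_param('ss', $name, $value);
        $stmt->execute();

    The remote server must accept connections from the web host (bind-address and a matching GRANT). Doing the copy inside MySQL itself - replication, or a FEDERATED table plus a trigger - avoids the second connection entirely.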

  • Online-Based Conferencing for an Admin Panel

    - by Tim Marshall
    I'm working on my admin panel's workstation section, which will allow the admin to use simple office tools such as word editors, calculators and whatnot. Under this section of the admin panel, I would like to add a conferencing feature that allows any administrator to connect with another administrator one-to-one under the 1-2-1 Conferencing section, or to join a meeting under the 'live meeting' section. Upon searching for online-based options, I found what seem to be great applications out there; however, not all of them are online-based, and they come with a price! I would like a purely online-based conferencing/meeting/1-2-1 feature that allows administrators to communicate via text or sound, stream their video, share their screen and send files. I know this sounds like a big feature, and I am not asking for someone to answer by programming one for my website! Does anyone know the best solution for getting this feature onto my website - any tutorials, or free-to-use, free-to-modify web plugins? Thank you for your help, Best Regards, Tim

  • AdSense sent an email saying my account has been approved when it was already approved

    - by moomoochoo
    My account has been approved and running adverts for quite some time now. However, today I got a message (it seems legitimate) from Google AdSense saying: "Congratulations, your AdSense account has been approved to show AdSense ads on your own website. Within a few hours, you will begin to see live ads." Should I be concerned? They say that they review accounts to check for compliance - could this be some odd way of saying they rechecked my sites and the sites complied?

  • Firefox and Chrome Display "top: -5px" Differently

    - by Kevin
    Using Google Web Toolkit, I have a DIV parent with DIV and anchor children:

        <div class="unseen activity">
          <div class="unseen-label"/>
          <a href .../>
        </div>

    With the following CSS, Chrome shows the "unseen-label" slightly below the anchor, which is positioned correctly in both Chrome and Firefox. Firefox, however, shows the label in line with the anchor.

        .unseen-activity div.unseen-label {
          display: inline-block;
          position: relative;
          top: -5px;
        }

    and

        .unseen-activity a {
          background: url('style/images/refreshActivity.png') no-repeat;
          background-position: 0 2px;
          height: 20px;
          overflow: hidden;
          margin-left: 10px;
          display: inline-block;
          margin-top: 2px;
          padding-left: 20px;
          padding-right: 10px;
          position: relative;
          top: 2px;
        }

    Please tell me how to change my CSS so that Chrome renders the label centered on the anchor, while keeping Firefox rendering correctly.

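    One possible direction, sketched on the assumption that the goal is simply to center the two inline-block siblings on each other: let vertical-align do the centering instead of per-element top offsets, which the two engines round differently:

        /* A sketch: align both inline-block children through vertical-align
           instead of browser-sensitive relative top offsets. */
        .unseen-activity div.unseen-label,
        .unseen-activity a {
            display: inline-block;
            vertical-align: middle;  /* replaces top: -5px and top: 2px */
            position: static;
        }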

  • Is a 302 redirect to a random URL from the homepage an SEO problem?

    - by CookieMonster
    I originally posted this on Stack Overflow, but I believe this is a better place to ask. My web application is very similar to notepad.cc, which redirects to a randomly generated URL upon access, e.g. http://myapp.com/roTr94h4Gd. (Please note that notepad.cc is not my site.) Probably because of this redirect feature, when I do "fetch as Google" or "fetch as Bingbot", I get a 302 and no HTML content - not even an <html></html> tag:

        HTTP/1.1 302 Moved Temporarily
        Server: nginx/1.4.1
        Date: Tue, 01 Oct 2013 04:37:37 GMT
        Content-Type: text/html
        Transfer-Encoding: chunked
        Connection: keep-alive
        X-Powered-By: PHP/5.4.17-1~dotdeb.1
        Set-Cookie: PHPSESSID=vp99q5e5t5810e3bnnnvi6sfo2; expires=Thu, 03-Oct-2013 04:37:37 GMT; path=/
        Expires: Thu, 19 Nov 1981 08:52:00 GMT
        Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
        Pragma: no-cache
        Location: /roTr94h4Gd

    How should I avoid the 302 in this case? I suppose I could modify my site to prevent the redirect, but generating a random URL on each access is a necessary feature of my web app. I added a <meta name="fragment" content="!"> tag to my index page and set it to return a static snapshot of the page when the flag is set, but this still returns a 302. I also added a header to return 200 before redirecting, but this had no effect either. Could someone suggest a good way to solve this problem?

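    One workaround often suggested for this pattern, sketched in plain PHP with illustrative names: give the crawlable root URL a real 200 page, and only generate-and-redirect when a visitor explicitly asks for a new note:

        <?php
        // Sketch: indexable content at "/", a fresh pad only on demand.
        $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

        if ($path === '/') {
            header('HTTP/1.1 200 OK');  // crawlers get content, not a 302
            echo '<html><body>';
            echo '<h1>My notepad app</h1>';
            echo '<p><a href="/new">Create a new note</a></p>';
            echo '</body></html>';
        } elseif ($path === '/new') {
            // Not cryptographically strong - illustration only.
            $id = substr(str_shuffle('abcdefghijkmnopqrstuvwxyz'
                       . 'ABCDEFGHJKLMNPQRSTUVWXYZ23456789'), 0, 10);
            header('Location: /' . $id, true, 302);
        }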

  • Google Keyword Competition rating

    - by Eric
    Google offers a keyword tool that allows me to see the number of times a particular query has been made in Google. There is a column in the results named "Competition" (actually it's "Concurrence" in French; I'm just translating). It's a rating from 0 to 1, i.e. a percentage. What indicator is that? EDIT: Is this something useful I should rely on? I'm not sure how to interpret this data. Should I go for less competitive keywords with a lower number of searches, or not worry about it and go for the highly searched keywords anyway? Is 50% considered high? What about 75%? I have a very niche market that sells expensive offline services, so the very long tail is my goal (I assume). If you hadn't already figured it out, I'm very new to SEO =)

  • URL changed to www... how much time does Google take to reindex?

    - by user20321
    It's been about a week since I changed my URL from the non-www to the www version; it was done by my host provider. Now I want to ask how much time Google takes to remove sitename.com from its index and replace it with www.sitename.com (the site redirection has already been done by the host provider), as it is still showing the old URLs. My new URLs are indexed, but my main URL, www.sitename.com, is not. Or do I have to remove those old URLs personally? It's already been about 5-6 days.

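    It's worth confirming the host used a permanent (301) redirect rather than a temporary (302) one, since a 302 tells Google to keep the old URLs indexed. The usual non-www to www rule, as a .htaccess sketch:

        # Permanently redirect non-www requests to the www hostname.
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^sitename\.com$ [NC]
        RewriteRule ^(.*)$ http://www.sitename.com/$1 [R=301,L]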

  • How do you determine whether a website is a scam [closed]

    - by Tom
    What's the best way to determine if a website is a scam? For example, at first sight (no pun intended) the following website seems to be legitimate, but the price of the product is suspiciously low (all the reviews point to an RRP of approximately £1000): http://www.maxiargos.com/index.php/asus-zenbook-ux31e-dh72-13-3-inch-thin-and-light-ultrabook-silver-aluminum.html Other indications are the lack of SSL on the checkout page, and the lack of useful information in the WHOIS record:

        Registration Service Provided By: TMDHOSTING
        Contact: +1.8665325635
        Domain Name: MAXIARGOS.COM
        Registrant:
            PrivacyProtect.org Domain Admin ([email protected])
            ID#10760, PO Box 16
            Note - All Postal Mails Rejected, visit Privacyprotect.org
            Nobby Beach, QLD 4218, AU
            Tel. +45.36946676
        Creation Date: 09-Nov-2011
        Expiration Date: 09-Nov-2012
        Domain servers in listed order:
            ns1.tmdhosting410.com
            ns2.tmdhosting410.com
        Administrative, Technical and Billing Contact (identical in each case):
            PrivacyProtect.org Domain Admin ([email protected])
            ID#10760, PO Box 16
            Note - All Postal Mails Rejected, visit Privacyprotect.org
            Nobby Beach, QLD 4218, AU
            Tel. +45.36946676

  • What does Enable/Disable mean in Bing's URL Normalization feature?

    - by DisgruntledGoat
    I'm in Bing Webmaster Tools, under Index URL Normalization. Many parameters are listed in the table with 3 other columns: Status, Source, Date. The "Source" column says "Webmaster" where I have added parameters, and "Bing" where I assume the parameter has been auto-detected. "Date" is probably the last date it detected the parameter. I've tried searching the help files but I can't find what the Status column means. The top of the page says: This feature allows you to specify query parameters for Bing’s crawler to ignore. But it's not clear whether "Enable" or "Disable" is related to this, and if so what happens in each case. Does anyone know?

  • Strategy for hosting 700+ domains, each with a static HTML site

    - by jonschlinkert
    I have a portfolio of more than 700 domain names, and ideally I'd like to put up a single-page HTML/CSS/JavaScript site for each domain. Is there a system/strategy/workflow that will allow me to do the following?

    1. Automate the deployment of new websites, quickly and easily, without having to manually initiate each new website in an admin panel. For instance, I've seen Dropbox-based solutions that claim to make it simple to set up new websites on your Dropbox account, but you still have to set each one up in an admin interface first. It would be so much easier to have a folder naming convention that allowed the user to easily clone/copy/duplicate sites inside their Dropbox App folder (https://www.dropbox.com/developers/blog/23) to create new ones. Sounds interesting, however...

    2. It's easy to manage CNAMEs on the registrar side, but is there a way to quickly associate CNAMEs with new websites, maybe gh-pages-style (https://help.github.com/articles/setting-up-a-custom-domain-with-pages)? With GitHub's gh-pages, all you have to do is drop a file called CNAME into your repo, with the domain name you want associated with the repo inside the file. Unfortunately, gh-pages isn't a good solution for what I'm doing, though.

    I'm also a front-end developer specializing in rapid web development and "front-end build systems", so building and maintaining static assets for hundreds of sites is no problem. It's the hosting side that I really struggle with. Any suggestions?

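    For the hosting side specifically, Apache's mod_vhost_alias is one way to get exactly this folder-naming-convention behavior: a single wildcard vhost maps each requested hostname to a directory, so deploying site number 701 is just creating a folder. A sketch, with illustrative paths:

        # One catch-all vhost serves every domain pointed at this server.
        # A request for www.example-domain.com is served from
        # /var/www/sites/www.example-domain.com, so a new site is just
        # "mkdir + upload" - no per-site admin step.
        <VirtualHost *:80>
            UseCanonicalName Off
            VirtualDocumentRoot /var/www/sites/%0
        </VirtualHost>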

  • Using mod_speling with multi-level .htaccess and RewriteRules

    - by michaelcgorman
    We recently switched formats for managing our 301s. For the most part, everything went well, but it seems to have stopped mod_speling from working properly. Here's what we changed.

    Old /var/www/html/.htaccess:

        RewriteEngine on
        RewriteBase /
        # Change SHTML to HTML
        RewriteRule ^(.*)\.shtml$ $1.html [R=permanent,L]
        # Change PCF to HTML ('cause, you know, we probably have CMS users like that...)
        RewriteRule ^(.*)\.pcf$ $1.html [R=permanent,L]
        # Force WWW subdomain for all requests
        RewriteCond %{HTTP_HOST} !^www.example.edu$ [NC]
        RewriteRule ^(.*)$ http://www.example.edu/$1 [R,L]
        # User accounts are on sun.example.edu
        RedirectMatch ^/~(.*)$ http://sun.example.edu/~$1
        # Remove index.html at the end of URLs
        RewriteCond %{REQUEST_URI} ^(.*/)index\.html$ [NC]
        RewriteRule . %1 [R=301,NE,L]
        Redirect 301 /academics/calendar2012-13.html http://www.example.edu/academics/calendar.html
        Redirect 301 /academics/departments/ http://www.example.edu/majors/
        Redirect 301 /academics/Pre-Medical.pdf http://www.example.edu/academics/Pre-Medicine.pdf
        Redirect 301 ...

    New /var/www/html/.htaccess:

        RewriteEngine on
        RewriteBase /
        # Change SHTML to HTML
        RewriteRule ^(.*)\.shtml$ $1.html [R=permanent,L]
        # Change PCF to HTML ('cause, you know, we probably have CMS users like that...)
        RewriteRule ^(.*)\.pcf$ $1.html [R=permanent,L]
        # Force WWW subdomain for all requests
        RewriteCond %{HTTP_HOST} !^www.example.edu$ [NC]
        RewriteRule ^(.*)$ http://www.example.edu/$1 [R,L]
        # User accounts are on sun.example.edu
        RedirectMatch ^/~(.*)$ http://sun.example.edu/~$1
        # Remove index.html at the end of URLs
        RewriteCond %{REQUEST_URI} ^(.*/)index\.html$ [NC]
        RewriteRule . %1 [R=301,NE,L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*) 404/$1

    And then we added a new file at /var/www/html/404/.htaccess:

        RewriteEngine on
        RewriteBase /404
        RewriteRule ^academics/calendar2012-13.html$ /academics/calendar.html [R=302,L]
        RewriteRule ^academics/departments/$ /majors/ [R=301,L]
        RewriteRule ^academics/Pre-Medical.pdf$ /academics/Pre-Medicine.pdf [R=301,L]
        RewriteRule ...

    I do have (Webmin-based) access to the httpd.conf (though we don't want to store all our 301s there, if possible). We're running Apache 2.2.15 on RHEL 6 on a server in our own data center. Like I said, the only problem we're seeing is that mod_speling isn't doing its magic anymore. The new format has so many advantages over the old that we really don't want to go back, but mod_speling is so nice to have that we'd also really like it to work if possible. Any ideas for how we might be able to fix mod_speling?

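    A tentative guess at the cause: the new catch-all rewrite claims every URL that isn't an existing file or directory, so misspelled URLs are rewritten into /404/... before mod_speling ever sees them as misses to correct. Since Webmin access to httpd.conf exists, one sketch of an alternative keeps the 301 list in an external RewriteMap file (RewriteMap itself cannot live in .htaccess; the file path is illustrative) and leaves all other URLs untouched for mod_speling:

        # httpd.conf (RewriteMap is not allowed in .htaccess).
        # redirects.txt holds one "old-path new-url" pair per line, e.g.
        #   academics/departments/  http://www.example.edu/majors/
        RewriteMap redirects txt:/etc/httpd/redirects.txt

        # .htaccess: 301 only URLs present in the map; everything else is
        # left alone, so mod_speling still sees - and can fix - near-misses.
        RewriteCond ${redirects:$1} !=""
        RewriteRule ^(.+)$ ${redirects:$1} [R=301,L]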

  • I have a large number of links on every page; for design reasons I want to keep them, but is it hurting my SEO?

    - by Callum Rexter
    The site is http://www.centralsaddlery.co.uk. We have other issues which we are tackling in terms of content etc., but the question I have is: "Is my main navigation hurting us in SEO?" It's a lot of links, and it's on a lot of pages. If so, what is a way to get Google to ignore the links below the top level? I had thought Google would see that the links are hidden by default and only shown on hover, but I can't verify this at all. We absolutely want to keep the menu - our customers like it and so do we - and we think it is pretty usable, as we have a lot of products to look at. Any advice is appreciated (and any tips for any part of the SEO are welcome too).

  • Can I include a robots meta tag outside of the head in HTML snippets intended to be SSIed?

    - by Dan
    I have a number of files in my site which are not intended for independent viewing, but rather to be AJAXed into content within the site. They obviously don't meet HTML standards (no body, head, etc.) as independent entities. I would like to prevent search engines from indexing these pages, but I do not have access to /robots.txt (which would be much more ideal). My question is: could I include the following at the top of these partial HTML files and get the desired results?

        <meta name="robots" content="noindex, noarchive">

    I guess there are two parts to this question: 1. Will this cause any rendering issues in any browsers? 2. Will search engines (at least Google and Bing) interpret this as intended?

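    If .htaccess is available even though /robots.txt is not, one alternative sidesteps the malformed-HTML question entirely: send the directive as an X-Robots-Tag response header, which Google documents support for (Bing is generally reported to honor it too). A sketch, assuming the fragments share a recognizable suffix and mod_headers is enabled:

        # Assumes the AJAX fragments share a naming convention such as
        # *.part.html, and that mod_headers is enabled.
        <FilesMatch "\.part\.html$">
            Header set X-Robots-Tag "noindex, noarchive"
        </FilesMatch>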

  • How can I edit (in MS Expression Web) FrontPage Site Parameters (Substitutions)?

    - by Clay Nichols
    Or, asked another way: where are the values for MS FrontPage Substitutions (Site Parameters) stored, so that I can edit them in Expression Web? Background: I'm ashamed to admit that I've been maintaining our company's website in MS FrontPage for over 9 years. I'm moving it to Expression Web, which will display the Substitutions (stored as Site Parameters), but I can't figure out where to edit them. I tried searching the website's source folders (on my development PC) for the name of the parameter (s-Variable=hoursOfOperation) but did not find it (other than in the files it was actually used in).

  • Is browser and bot whitelisting a practical approach?

    - by Sn3akyP3t3
    With blacklisting, it takes plenty of time to monitor events, uncover undesirable behavior and then take corrective action, and I would like to avoid that daily drudgery if possible. I'm thinking whitelisting would be the answer, but I'm unsure if that is a wise approach, due to its deny-all, allow-only-a-few nature. My fear is that eventually someone out there will be blocked unintentionally. Even so, whitelisting would also block plenty of undesired traffic to pay-per-use services such as the Google Custom Search API, as well as preserve bandwidth and my sanity. I'm not running Apache, but I'm assuming the idea would be the same. I would essentially be depending on the User-Agent identifier to determine who is allowed to visit. I've tried to take accessibility into account, because some web browsers are geared specifically toward those with disabilities, although I'm not aware of any specific ones at the moment. I fully understand the need not to depend on whitelisting alone to keep the site away from harm; other means to protect the site still need to be in place. I intend to have a honeypot, a checkbox CAPTCHA, use of OWASP ESAPI, and blacklisting of previously known bad IP addresses.

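    The asker isn't running Apache, but as a sketch of the general shape in Apache 2.2 terms (the browser tokens listed are illustrative, not a vetted accessibility-safe list):

        # Mark requests whose User-Agent contains a known-good token...
        SetEnvIfNoCase User-Agent "(Firefox|Chrome|Safari|Opera|MSIE|Googlebot|bingbot)" allowed_ua

        # ...and deny everything else (Apache 2.2 syntax).
        Order Deny,Allow
        Deny from all
        Allow from env=allowed_ua

    The usual caveat: User-Agent strings are trivially spoofed, so a whitelist like this only raises the bar for unsophisticated bots - which is why the other defenses mentioned still matter.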

  • Gaming Community CMS, with forum integration [closed]

    - by Tillman32
    Possible Duplicate: Which Content Management System (CMS) should I use? I've had a simple website that I coded myself for a while now; the site is a gaming community. It's very forum- and news-driven. It was a HORRIBLE idea to take on coding this thing myself. Although we've used it for about a year now, we're just getting too big, and I need to streamline our work. I need writers to be able to post news, etc.; I've been doing it through code (a year ago I thought it would be a cool idea). Anyway, I've been messing with just about every CMS out there, and I'm struggling to get something that I really like. The main issues I'm facing are a good news system and good forum integration. I'm sort of picky when it comes to looks - it's a curse. Reading on here, I see a lot of people saying Drupal is the best for the three things I need: community interaction, news, and forums. I think the main issues I ran into with Drupal were ease of use and themes. I am not a web designer, and I need a good theme. For an idea of what I'm looking for, go check out http://www.clgaming.net: they have forums integrated, a nice news area on the home page/news section, and nice user accounts. It looks very professional, and I doubt I'll get close to that with a free theme, but their functionality is exactly what I need. Any ideas would be greatly appreciated.

  • Is it possible to block traffic originating from a specific country?

    - by mickburkejnr
    Hi guys, My personal website is getting a lot of spam comments at the moment, and most of them originate from Russia (I've used Google Analytics to identify the traffic, and a lot of the links point to Russian sites). As it's a pain to keep deleting these comments, I would like to stop people from there commenting on, or visiting, the website. Is this possible? Also, the website is using WordPress. Many thanks!

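    A sketch of one common server-level approach, assuming the host has MaxMind's mod_geoip module and a country database installed (WordPress-side alternatives, such as comment-moderation or GeoIP-blocking plugins, achieve the same without server access):

        # Requires MaxMind's mod_geoip and a country database.
        GeoIPEnable On
        SetEnvIf GEOIP_COUNTRY_CODE RU BlockCountry
        Order Allow,Deny
        Allow from all
        Deny from env=BlockCountry

    Note that spammers routinely use proxies in other countries, so country blocking reduces, rather than eliminates, the problem.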

  • WordPress blog penalized by Google search - what's wrong?

    - by pawelbrodzinski
    I have a blog (http://blog.brodzinski.com), which is a wordpress.org blog using the pretty popular Thesis theme with almost no other customizations. Some time ago it was penalized by Google search - it simply stopped appearing in search results, even for search terms for which it used to be the top result, like my name - Pawel Brodzinski - which isn't anything close to a popular search term. To be exact, the site was penalized on Nov 18. It started popping up in search results on Dec 23, but only for a few days; since Dec 27 it has been out again. I know Google's guidelines, and I'm not aware of breaking any of them. I submitted a reconsideration request after I noticed the penalty. It was processed, and there was no change whatsoever (no surprise, as it seems the site was penalized again). I checked diagnostics in Webmaster Tools: no malware was detected, and no strange search terms popped up. I read related threads on the Google Webmasters forum but found none of the solutions working for me. I posted a thread there (http://www.google.com/support/forum/p/Webmasters/thread?tid=546339f49d4a03bc&hl=en), and the only answer I got was to check for duplicate content. Well, there is some duplicate content published on the web, but that's true for the vast majority of blogs, and it doesn't seem to be a reason for a penalty. Also, before Dec 27 I was able to get duplicate content removed from a couple of sites which were republishing my feed, but this didn't change the situation - the site was penalized again. The problem is I have no idea what could be wrong with the website or how to find out. To make the problem worse, I'm no webmaster; I just run a WordPress blog, which is supposed to be easy.

  • Rewrite rule to show as directory using .htaccess

    - by chanchal1987
    I want to implement a rewrite rule in my .htaccess file to show a specific URL as a directory of my server. See the code I have written below:

        RewriteRule ^(.*)/$ ?page=$1 [NC]

    This rewrites URLs like www.mysite.com/abc/ to www.mysite.com/index.php?page=abc. But if I request www.mysite.com/abc, it throws a 404 error. How can I write a rewrite rule which will match both www.mysite.com/abc and www.mysite.com/abc/? Edit: My current .htaccess file (after Litso's answer's 3rd revision) is as below:

        ##
        ErrorDocument 401 /index.php?error=401
        ErrorDocument 400 /index.php?error=400
        ErrorDocument 403 /index.php?error=403
        ErrorDocument 500 /index.php?error=500
        ErrorDocument 404 /index.php?error=404
        DirectoryIndex index.htm index.html index.php
        RewriteEngine on
        RewriteBase /
        Options +FollowSymlinks
        RewriteRule ^(.+)\.html?$ $1.php
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)/$ ?page=$1 [NC,L]
        RewriteCond %{REQUEST_URI} !index.php
        RewriteRule ^(.*)$ ?page=$1 [NC,L]
        ##

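    A sketch of one way to match both forms is to make the trailing slash optional in the pattern itself, kept behind file/directory checks so real paths still resolve (index.php is assumed as the handler, per the URLs in the question):

        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        # The non-greedy (.+?) keeps any trailing slash out of the capture.
        RewriteRule ^(.+?)/?$ index.php?page=$1 [NC,L]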

  • First steps on setting up a query-based server [closed]

    - by asghar ashgari
    I've got a physical server at home, and I want to do the following silly project to learn the concepts behind server/backend development before doing a real project later on. Idea: turn the server into a calculator. I want any person to be able to publicly send a query to the server (e.g. 2+2) from the terminal and have the server give back the result. So the question is basically where to start: what sort of software do I need to install, and what sort of program do I need to write?

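    As a concrete starting point, the smallest version of this is a short script behind any web server with PHP; the "terminal client" is then just curl. A sketch that deliberately avoids eval() by accepting only "number operator number" (file name and URL are illustrative):

        <?php
        // calc.php - answers queries such as ?q=2%2B2
        // Only "number operator number" is accepted; nothing is eval()'d.
        $q = isset($_GET['q']) ? $_GET['q'] : '';
        if (!preg_match('/^\s*(-?\d+\.?\d*)\s*([+\-*\/])\s*(-?\d+\.?\d*)\s*$/',
                        $q, $m)) {
            header('HTTP/1.1 400 Bad Request');
            exit("usage: ?q=2%2B2\n");
        }
        $a = (float) $m[1];
        $b = (float) $m[3];
        switch ($m[2]) {
            case '+': $r = $a + $b; break;
            case '-': $r = $a - $b; break;
            case '*': $r = $a * $b; break;
            case '/': $r = ($b == 0.0) ? 'division by zero' : $a / $b; break;
        }
        echo $r, "\n";

    From a terminal, usage would look like curl "http://yourserver/calc.php?q=2%2B2" - the + has to be URL-encoded as %2B, or PHP decodes it as a space.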

  • Lazyloading images and SEO

    - by surpr
    I'm lazyloading images with a noscript fallback. Should I expect any damage in the SERPs? The site is completely thumbnail-based. Also, should I put a smaller image size in the noscript fallback to increase crawlability? We have nearly 1 million thumbs, so it's a decision I'm hesitant about. The reason I'm thinking about it in the first place is that we're upping thumbnail size by about 50%, which will add about 10% to the page size.

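    For reference, the usual shape of the markup under discussion, with illustrative attribute names: the real thumbnail URL sits in a data- attribute for the lazyload script to swap in, and the noscript copy is what a non-JS crawler indexes - so whatever size goes in the fallback is the size search engines see:

        <!-- Lazy version: tiny placeholder in src, real thumb in data-src
             for the script to swap in once the image scrolls into view. -->
        <img class="lazy" src="placeholder.gif"
             data-src="/thumbs/12345.jpg" alt="Product thumbnail">

        <!-- What crawlers and no-JS users get. Putting a smaller variant
             here changes which file gets crawled and indexed. -->
        <noscript>
          <img src="/thumbs/12345.jpg" alt="Product thumbnail">
        </noscript>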
