Search Results



  • Can I include a robots meta tag outside of the head in HTML snippets intended to be SSIed?

    - by Dan
    I have a number of files in my site which are not intended for independent viewing, but rather to be AJAXed into content within the site. They obviously don't meet HTML standards (no body, head, etc.) as independent entities. I would like to prevent search engines from indexing these pages, but do not have access to /robots.txt (which would be a much better option). My question is, could I include the following at the top of these partial HTML files and get the desired result?

        <meta name="robots" content="noindex, noarchive">

    I guess there are two parts to this question: Will this cause any rendering issues in any browsers? Will search engines (at least Google & Bing) interpret this as intended?
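    Roughly the kind of fragment I mean (a sketch only; the class names and content are just illustrative):

        <!-- partial.html: an HTML fragment fetched via AJAX, not a complete document -->
        <meta name="robots" content="noindex, noarchive">
        <div class="news-item">
            <h2>Example headline</h2>
            <p>Fragment content that gets injected into an existing page...</p>
        </div>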


  • Will adding q&a help my site's rankings, and if so, what are the implications of a sub-domain for q&a rather than a path on the site? [closed]

    - by ElHaix
    Possible Duplicate: Subdomain versus subdirectory

    One of our web properties is doing quite well without any additional links being created on the site, and our link inventory is tightly managed - no user-generated links. To introduce a community aspect to the site, we want to implement a q&a forum. Once in place, new links will populate our link inventory with keywords that are not necessarily targeted to the site. With the q&a on a sub-domain, would that not affect the main site's rankings? What's the best approach for this?


  • Google analytics/adwords account and leaking of private data

    - by Satellite
    I am frequently asked to log into clients' Google Analytics and AdWords accounts. If I forget to log out before visiting other Google properties (Google search, YouTube, etc.), this leaves tracks of my views and searches, exposing my activities to the client. Summary:

    1. Client gives me access to their Google Analytics / AdWords account
    2. I log into the client's Analytics account and do some stuff
    3. In another tab I perform some related Google searches to solve some related issues
    4. Issues solved, I close the Analytics tab
    5. I then visit google.com and perform some unrelated searches
    6. I then visit YouTube and view some unrelated videos
    7. All web and YouTube searches are recorded in the client's Google account, thus leaking potentially sensitive data

    Even assuming that I remember to log out correctly at step 4 (as I do 95% of the time), anything I do at step 3 is exposed to the client. I would be surprised if this is not a very common issue. I'm looking for a technical solution to ensure that this can never happen. Any ideas?


  • How do I deal with content scrapers? [closed]

    - by aem
    Possible Duplicate: How to protect SHTML pages from crawlers/spiders/scrapers?

    My Heroku (Bamboo) app has been getting a bunch of hits from a scraper identifying itself as GSLFBot. Googling for that name produces various results from people who've concluded that it doesn't respect robots.txt (e.g., http://www.0sw.com/archives/96). I'm considering updating my app to keep a list of banned user-agents (adding GSLFBot to that list) and serving all requests from those user-agents a 400 or similar. Is that an effective technique, and if not, what should I do instead? (As a side note, it seems weird to have an abusive scraper with a distinctive user-agent.)
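    The kind of check I'm considering, sketched here in PHP purely for illustration (the agent list and status code are placeholders):

        <?php
        // Sketch: reject requests from known-abusive user agents before doing any real work.
        $bannedAgents = array('GSLFBot');  // placeholder list of banned user-agent substrings
        $userAgent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

        foreach ($bannedAgents as $banned) {
            if (stripos($userAgent, $banned) !== false) {
                header('HTTP/1.1 400 Bad Request');  // or 403, whichever seems more appropriate
                exit;
            }
        }
        // ...normal request handling continues here...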


  • Font displays differently in Firefox vs. Chrome

    - by Goro
    It seems that my menu bar is displayed with a different font stretch in Firefox than it is in Chrome. See the following: Here is the CSS applied to this element:

        font-variant: small-caps;
        font-size: 13px;
        letter-spacing: 0px;
        font-family: Arial;
        font-stretch: normal;
        text-decoration: none;

    As far as I can tell everything regarding that font is exactly the same, yet they still display differently (see pic). Any ideas? Thanks,


  • Unified data source for Joomla websites with K2 installed

    - by Özkan ÖZLÜ
    I am responsible for a few web sites of my organization. I use Joomla! 2.5.9 for those web sites, and they all run on the same server. I use the K2 component for content management. I have a general website which shows all the staff information on the 'Staff' page. Some of those people and their content are also shown on another department's website. There is a separate database for each web site. For example: on the general website (let's say general.org), when I click on the 'Staff' menu item, the page shows all of the people who work at my organization, across different departments. On another web site (e.g. education.general.org), when I click on the 'Staff' menu item, it shows the people who work at the education department. But each web site has its own user accounts, which means a modification in one of them does not affect the other. If one of the education staff changes his profile picture on the education web site, he also has to do it on the general web site. And sometimes one person might work at two departments, so he has to edit his data three times. Is it possible to merge the records for all websites? In other words, I want everyone to insert/update their data on the general web site, and the other web sites to be updated automatically.


  • Will search engines discover that our old pages have been 301 redirected if there are no more links to them in the old site?

    - by Obay
    We've moved our website to a new domain. Thousands of our pages come from one PHP file in the old site (e.g. oldsite.com/news.php?id=<id>). So we added some code to the news.php file to do a 301 redirect to the specific corresponding news article on the new website (newsite.com/news/<id>). We have not yet done a 301 redirect for the root of the old site (so we can display a notice to our users that we've moved), but all links inside it are already 301 redirected. My concern is that, when Google crawls our old website, it will no longer be able to find the old news articles and discover that they have been 301 redirected -- is this correct? If so, does that mean our PageRank won't be carried over to the new site? I've also read that we would need to create a sitemap for the new site. Is it possible to indicate in the sitemap the old and new locations of specific pages? Because if not, how will Google know? (I'm not sure the Change of Address tool in Webmaster Tools would be specific enough.)
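    The redirect in news.php is essentially along these lines (a simplified sketch, not the exact code; the domains are the placeholders used above):

        <?php
        // Simplified sketch of the 301 redirect added to the old news.php.
        $id = isset($_GET['id']) ? (int) $_GET['id'] : 0;

        header('HTTP/1.1 301 Moved Permanently');
        header('Location: http://newsite.com/news/' . $id);
        exit;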


  • MCrypt Module, Rijndael-256

    - by WernerCD
    An outside company is redoing our company Intranet. During some basic usage I discovered that the "User Edit" screens, with the "Password: *" boxes, have the password in plain text, with the text box "type=password" to "hide" the password. The passwords are not stored in the database as plain text; they are stored encrypted using the "rijndael-256" cipher via the mcrypt module. I know that if I hash a password with SHA*, the password is unrecoverable (one-way). Is the same true of MCrypt Rijndael-256 encryption? Shouldn't an encrypted password be unrecoverable? Are they blowing smoke up my rear or just using the wrong technology?
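    For comparison, what I expected for password storage is a one-way hash rather than reversible encryption, roughly this (a sketch using PHP's password hashing API, which assumes PHP 5.5+):

        <?php
        // Sketch: one-way password storage (assumes PHP 5.5+ for password_hash/password_verify).
        $plain = 'example-password';

        // Store only the hash; the original password cannot be recovered from it.
        $hash = password_hash($plain, PASSWORD_DEFAULT);

        // At login time, compare the submitted password against the stored hash.
        if (password_verify($plain, $hash)) {
            echo "Password matches.\n";
        }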


  • Internet Explorer menu z-order problem [migrated]

    - by robgt
    I have what appears to be a z-order problem with Internet Explorer 9. It might be present in other IE versions also, but I haven't tested them, so I have to assume so. This page: http://www.modelhelicopters.co.uk/partsfinder/trex500esp/frames If you hover over the "All pages for this model" menu item on the parts finder menu bar (below the currency selector), it should drop down a list of all the parts finder pages for the selected model helicopter. If you view the same page in Firefox or Chrome etc., you will see how it should appear. In IE9, the menu gets cut off at the top of the main exploded-view image - suggesting the z-order is wrong. I have tried amending this with a jQuery snippet, but it didn't fix IE9. I know the code was inserted by jQuery, as shown by Firebug in Firefox.

        $j('div.std img[src*="/partsfinder/img"]').attr("style","position:relative;z-index:-100;");

    I really do not know why this is not working.
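    The jQuery line above is just trying to apply the equivalent of this CSS, shown here only to make the intent clear (the selector and value mirror the snippet; both are illustrative):

        /* Sketch: push the exploded-view image down the stacking order,
           behind the drop-down menu. */
        div.std img {
            position: relative;
            z-index: -100;
        }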


  • Gzip compress offline?

    - by shoosh
    I've configured my site to serve compressed content by putting this line in .htaccess:

        AddOutputFilterByType DEFLATE text/html text/plain text/xml text/javascript text/css application/javascript application/json

    This works perfectly for almost all files except a few large JSON files that are above 200 KB. For some reason they are not being compressed - I can see this in the Net tab in Firebug and the Network section in Chrome. So as a workaround I thought I could compress these files offline and have Apache serve them compressed. What tool should I use to compress them? Is the Linux gzip command the one? Are there any special flags or something I should use? What should I put in .htaccess so that the server knows to serve these files with Content-Encoding: gzip?
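    What I have in mind is pre-compressing each file (e.g. gzip -9 -k data.json, or gzip -9 < data.json > data.json.gz if -k isn't available) and then teaching Apache to hand out the .gz variant. A sketch only, since I don't know whether these are the right directives:

        # .htaccess sketch (assumes mod_rewrite and mod_headers are available):
        # if a pre-compressed .gz sits next to the .json and the client accepts gzip, serve it.
        RewriteEngine On
        RewriteCond %{HTTP:Accept-Encoding} gzip
        RewriteCond %{REQUEST_FILENAME}.gz -f
        RewriteRule ^(.*)\.json$ $1.json.gz [L]

        <FilesMatch "\.json\.gz$">
            Header set Content-Encoding gzip
            ForceType application/json
        </FilesMatch>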


  • Do Not Track feature of IE10

    - by Pete Herbert Penito
    One of our clients is getting a bit worried about the new "Do Not Track" feature of Internet Explorer 10. Her site is heavily dependent on PHP sessions (as I imagine many other sites are). This was what she was reading: http://www.bbc.co.uk/news/technology-18288710 I need some clarification: will this affect how sessions (or cookies) work on normal web sites that use the PHP $_SESSION array? Or does it only affect how advertising works (Engadget's article seems to insinuate this)? Can anyone provide a more technical overview (and the ramifications) for PHP-powered websites?
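    From what I've read so far, the feature just adds a DNT: 1 request header, which a PHP script can inspect like this (a sketch):

        <?php
        // Sketch: the session is created as usual; DNT only arrives as a request header.
        session_start();

        $dnt = isset($_SERVER['HTTP_DNT']) ? $_SERVER['HTTP_DNT'] : null;

        if ($dnt === '1') {
            // Visitor has asked not to be tracked: e.g. skip analytics or ad beacons here.
        } else {
            // No Do Not Track preference expressed.
        }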


  • PHP code works on Chrome, but not on Firefox or IE (send email via HTML form) [on hold]

    - by Cachirro
    My brother has this form:

        <form id="lista" action="lista2.php" method="post">
            <input name="cf_name" type="text" size="50" hidden="yes" class="obscure">
            <input name="cf_email" type="text" size="50" hidden="yes" class="obscure">
            <textarea name="cf_message" cols="45" rows="10" hidden="yes" class="obscure"> </textarea>
            <input type="image" name="submit" value="Enviar Lista por Email" src="imagens/lista_email.png" width="40" height="40" onclick="this.form.elements['cf_message'].value = lista_mail;this.form.elements['cf_name'].value = prompt('Escreva o seu nome:', '');this.form.elements['cf_email'].value = prompt('Escreva o seu email:', '');">
            <input name="submit2" type="submit" value="Enviar" hidden="yes" class="obscure">
        </form>

    That calls this PHP file:

        <?php
        if ( isset($_POST['submit']) ) {
            // SMTP authentication details
            $smtpinfo['host'] = 'localhost';
            $smtpinfo['port'] = '25';
            $smtpinfo['auth'] = true;
            $smtpinfo['username'] = 'xxx';
            $smtpinfo['password'] = 'xxx';

            // Data received from the form
            $nome = $_POST['cf_name'];
            $email = $_POST['cf_email'];
            $mensagem = $_POST['cf_message'];

            // PEAR include. Make sure PEAR is enabled on your hosting
            require_once "Mail.php";

            // Message body
            $body = "Nome: ".$nome;
            $body.= "\n\n";
            $body.= nl2br($mensagem);

            $headers = array ('From' => $email, 'To' => $smtpinfo["username"], 'Subject' => 'Encomenda Website');

            $mail_object = Mail::factory('smtp', $smtpinfo);
            $mail = $mail_object->send($smtpinfo["username"], $headers, $body);

            if ( PEAR::isError($mail) ) {
                echo ("<p>" . $mail->getMessage() . "</p>");
            } else {
                echo ('<b><font color="FFFF00">Mensagem enviada com sucesso.<br><br></b>Seu email: ' . $email . '<br><br></font>');
            }
        }
        ?>

    This basically sends an email with some selected products, name and email. The problem is that it works perfectly in Chrome, but not in Firefox or IE. When the submit image is pressed, the URL changes to the PHP file, but it displays a blank page. After activating error display:

        ini_set('display_errors',1);
        ini_set('display_startup_errors',1);
        error_reporting(-1);

    Firefox/IE still display a blank page and the email isn't sent, while Chrome sends the email and displays this:

        Strict Standards: Non-static method Mail::factory() should not be called statically in /var/www/vhosts/[site url]/httpdocs/lista2.php on line 33
        Strict Standards: Non-static method PEAR::isError() should not be called statically, assuming $this from incompatible context in /usr/share/php/Mail/smtp.php

    I don't know if that helps. So, what is causing the email to be sent in Chrome and not in Firefox or IE? Thank you.


  • Should I incorporate microdata into my ASP.NET website with 47 pages, and if so, how?

    - by Jason Weber
    I have an ASP.NET (VB) website with 47 pages. The problem is that it's in 10 different languages, although 98% of visitors just use English. I have 5 master pages. I've read through Google Webmaster Tools, but I'm still confounded. I'm reading about how microdata is the way to go. Does this mean I should put itemtype and itemprop span and div tags in my master pages, or should I do all of my 47 pages (.resx resource files) separately? The main key phrase I want throughout search results is "machine vision". For instance, the first couple of sentences on my "about.aspx" page are:

        <span itemprop="name">USS Vision Inc.</span> (USS) is a privately-owned company with headquarters in <span itemprop="locality">Detroit, Michigan, USA</span>. We design, engineer, produce, and integrate special machine vision error-proofing products and <a href="http://www.ussvision.com/services/" target="_self" itemprop="url">services</a> that create lean factories by improving the quality of manufactured products, and by significantly reducing manufacturing costs through advanced automation.

    Am I doing this right, or how would I do this if I'm not? Should I use the itemprop="url" or other rich snippets for every link in my website? I mean, do I need to add an itemprop to just about everything, or can I just alter my master pages? Any guidance in this regard to help improve my SEO and SERPs would be greatly appreciated!
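    As I understand it, the itemprop attributes are supposed to sit inside an itemscope/itemtype container, roughly like this (a sketch; I'm assuming schema.org/Organization is the right type here):

        <!-- Sketch: wrap the properties in an itemscope so they belong to a typed item. -->
        <div itemscope itemtype="http://schema.org/Organization">
            <span itemprop="name">USS Vision Inc.</span> (USS) is a privately-owned company
            with headquarters in
            <span itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
                <span itemprop="addressLocality">Detroit</span>,
                <span itemprop="addressRegion">Michigan</span>, USA
            </span>.
            <a href="http://www.ussvision.com/services/" itemprop="url">services</a>
        </div>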


  • International TLD's vs. duplicate content

    - by Litso
    Hey all, I currently work at a pretty big website that has visitors from around the globe. My job is to help out on the SEO, and one thing we've been discussing lately is the use of international TLDs. The ones we use range between:

    - (partly) translated websites like .es and .de that serve most of the content in the country's language
    - non-translated (English) websites for non-English languages (due to a lack of translations), like .ro and .cz
    - English websites for English-speaking countries with localized TLDs (.co.nz, .co.uk)

    On one hand I really have the feeling this is causing a lot of duplicate content, especially for the last two categories of TLDs. On the other hand, it seems a lot like country-specific TLDs tend to score a lot better in that country's Google. Would it be advisable to keep on using these domains, or should we canonicalize them all to the .com version?
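    Canonicalizing the last two categories would, as I understand it, just mean adding something like this to the head of each country-specific page (a sketch; example.com stands in for our real domain):

        <!-- On e.g. example.co.nz/about, point search engines at the .com version. -->
        <link rel="canonical" href="http://www.example.com/about">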


  • Commenting System For A Website

    - by lvil
    I hope this is the right place for such a question. I am developing a website that has no user system, and I am looking for a commenting system for it. Requirements:

    - Ajax commenting
    - Flagging comments
    - Administration (deleting comments)
    - PHP using my DB, or an external service
    - No registration, no FB comments
    - Option for a captcha
    - Hebrew or customizable interface

    Can you please suggest such a system?


  • Google Sites page never shows up in Google Search organic results?

    - by gus
    I use Google Sites (i.e. https://sites.google.com/site/EXAMPLE/ ) as a convenient way to maintain up-to-date info on several residential properties - info that's often requested by my property agents. It's been around for about 1 year, but I still can never get it to appear in organic Google or Bing search results, even if I search for specific keywords such as the street names. I submitted the URL manually to the search engines, knowing that my Sites page probably has very few incoming links. Is this expected behavior? The content of my page has simple formatted text and outgoing links to Picasa/G+/imgur photo albums. Am I doing something wrong, or do all Google Sites pages have poor organic search rank? Thank you very much.


  • Which is better for search engines, repeated phrases or different phrases with the same meaning?

    - by George Botros
    When I'm designing an ads website I have two options:

    1. Let the advertiser choose from some predefined lists to create the new ad. For example: a product list (T-Shirt, Shorts, Suit, ...) and a color list (Black, Red, ...)
    2. Let the advertiser write his own descriptive content for the product. For example: "Amazing suit with a good price"

    I like the first scenario, but which is better for search engine optimization (SEO): repeated phrases or different phrases with the same meaning? Note: assume each page will contain one or more ads.


  • Connecting a remote MySQL database to a local MySQL database? [migrated]

    - by Shashank
    I want to write PHP code to be embedded in a Drupal 7 module. I want to call a procedure which can copy newly generated data in the local MySQL database to a remote MySQL database. When data is inserted into table 'A' of my local database, it should be copied to the specific table 'B' of the remote MySQL server's database. Table 'A' is on localhost. Table 'B' is on the remote server. Insert data into 'A' - the data gets copied into 'B'. Is this possible? Thanks for the help.
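    The kind of helper I want to call from the module, presumably from something like hook_node_insert(), sketched with mysqli (hostnames, credentials, table and column names are all placeholders, and the remote server would need to allow the connection):

        <?php
        // Sketch: copy a newly inserted row from local table A to remote table B.
        // Hostnames, credentials, and column names are placeholders.
        function copy_row_to_remote($row_id) {
            $local  = new mysqli('localhost', 'local_user', 'local_pass', 'local_db');
            $remote = new mysqli('remote.example.com', 'remote_user', 'remote_pass', 'remote_db');

            if ($local->connect_error || $remote->connect_error) {
                return FALSE;
            }

            // Read the new row from the local table A.
            $stmt = $local->prepare('SELECT col1, col2 FROM A WHERE id = ?');
            $stmt->bind_param('i', $row_id);
            $stmt->execute();
            $stmt->bind_result($col1, $col2);
            $found = $stmt->fetch();
            $stmt->close();

            if ($found) {
                // Insert the same values into the remote table B.
                $ins = $remote->prepare('INSERT INTO B (col1, col2) VALUES (?, ?)');
                $ins->bind_param('ss', $col1, $col2);
                $ins->execute();
                $ins->close();
            }

            $local->close();
            $remote->close();
            return (bool) $found;
        }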


  • Moving large amounts of data between shared hosts

    - by Bryan M.
    I recently acquired a client who is a photographer and was interested in moving web hosts since his current host had threatened to throw him off due to CPU spiking. The migration went fairly easily, with about 350MBs of website and media files. Then I discovered about 60GBs of client galleries he had failed to mention. I am unable to move this much data myself, since I'm capping out at about 20kb/s on the FTP connection. Has anyone encountered a situation where they needed to migrate this much data between cheap hosting? Should we contact the hosting companies about this (he is moving from Westhost to MediaTemple)?


  • When Google search gives incorrect results, how can it be reported?

    - by vgv8
    If Google search query results are incorrect: How can it be reported? What is the procedure to correct it?

    @Lèse majesté: Incorrect results are results that do not contain any of the searched keywords in them, as in this question of mine.
    @John Conde: Yes, I believe it is the right definition.
    @DisgruntedGoat: Even when there are a lot of results for keycaptcha for "Past 24 hours", Google.com presents results only for reCAPTCHA. Anyway, they are different from what Google shows if you search for "keycaptcha" (in quotes), and from what other search engines show. Does everybody think that searches for one keyword should be sneakily substituted with Google's own promoted brand products?


  • Using PHP Redirect Script together with Custom Fields (WordPress)? [on hold]

    - by Alex Scherer
    I am currently trying to make Yoast's link cloaking script ( Yoast.com script manual // Github script files ) work together with the WordPress plugin Advanced Custom Fields. The script fetches 2 values (redirect id, redirect URL) via GET and then redirects to that particular URL, which is defined in a .txt file called redirects.txt. I would like to change the script so that I can define both the id and the redirection URL via custom fields on each post in my WP dashboard. I would be really happy if someone could help me code something that does the same as the script above, but gets those values from custom fields instead of saving them in a redirects.txt file. Best regards! Alex
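    What I'm picturing is roughly this (a sketch; redirect_id and redirect_url are hypothetical ACF field names, and I'm not sure where exactly it should hook into the original script):

        <?php
        // Sketch: resolve the redirect target from ACF custom fields instead of redirects.txt.
        // 'redirect_id' and 'redirect_url' are hypothetical field names created in ACF.
        $id = isset($_GET['id']) ? sanitize_text_field($_GET['id']) : '';

        // Find the post whose 'redirect_id' field matches the requested id.
        $posts = get_posts(array(
            'post_type'      => 'post',
            'posts_per_page' => 1,
            'meta_key'       => 'redirect_id',
            'meta_value'     => $id,
        ));

        if (!empty($posts)) {
            $url = get_field('redirect_url', $posts[0]->ID);  // ACF helper
            if ($url) {
                wp_redirect($url, 302);
                exit;
            }
        }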


  • Using mod_speling with multi-level htaccess and rewriterules

    - by michaelcgorman
    We recently switched formats for managing our 301s. For the most part, everything went well, but it seems to have stopped mod_speling from working properly. Here's what we changed:

    old /var/www/html/.htaccess:

        RewriteEngine on
        RewriteBase /

        # Change SHTML to HTML
        RewriteRule ^(.*)\.shtml$ $1.html [R=permanent,L]

        # Change PCF to HTML ('cause, you know, we probably have CMS users like that...)
        RewriteRule ^(.*)\.pcf$ $1.html [R=permanent,L]

        # Force WWW subdomain for all requests
        RewriteCond %{HTTP_HOST} !^www.example.edu$ [NC]
        RewriteRule ^(.*)$ http://www.example.edu/$1 [R,L]

        # User accounts are on sun.example.edu
        RedirectMatch ^/~(.*)$ http://sun.example.edu/~$1

        # Remove index.html at the end of URLs
        RewriteCond %{REQUEST_URI} ^(.*/)index\.html$ [NC]
        RewriteRule . %1 [R=301,NE,L]

        Redirect 301 /academics/calendar2012-13.html http://www.example.edu/academics/calendar.html
        Redirect 301 /academics/departments/ http://www.example.edu/majors/
        Redirect 301 /academics/Pre-Medical.pdf http://www.example.edu/academics/Pre-Medicine.pdf
        Redirect 301 ...

    new /var/www/html/.htaccess:

        RewriteEngine on
        RewriteBase /

        # Change SHTML to HTML
        RewriteRule ^(.*)\.shtml$ $1.html [R=permanent,L]

        # Change PCF to HTML ('cause, you know, we probably have CMS users like that...)
        RewriteRule ^(.*)\.pcf$ $1.html [R=permanent,L]

        # Force WWW subdomain for all requests
        RewriteCond %{HTTP_HOST} !^www.example.edu$ [NC]
        RewriteRule ^(.*)$ http://www.example.edu/$1 [R,L]

        # User accounts are on sun.example.edu
        RedirectMatch ^/~(.*)$ http://sun.example.edu/~$1

        # Remove index.html at the end of URLs
        RewriteCond %{REQUEST_URI} ^(.*/)index\.html$ [NC]
        RewriteRule . %1 [R=301,NE,L]

        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*) 404/$1

    And then we added a new file at /var/www/html/404/.htaccess:

        RewriteEngine on
        RewriteBase /404

        RewriteRule ^academics/calendar2012-13.html$ /academics/calendar.html [R=302,L]
        RewriteRule ^academics/departments/$ /majors/ [R=301,L]
        RewriteRule ^academics/Pre-Medical.pdf$ /academics/Pre-Medicine.pdf [R=301,L]
        RewriteRule ...

    I do have (Webmin-based) access to the httpd.conf (though we don't want to store all our 301s there, if possible). We're running Apache 2.2.15 on RHEL 6 on a server in our own data center. Like I said, the only problem we're seeing is that mod_speling isn't doing its magic anymore. The new format has so many advantages over the old that we really don't want to go back, but mod_speling is so nice to have that we'd also really like it to work if possible. Any ideas for how we might be able to fix mod_speling?


  • Firefox and Chrome display "top: -5px" differently

    - by Kevin
    Using Google Web Toolkit, I have a DIV parent with a DIV and anchor children:

        <div class="unseen activity">
            <div class = "unseen-label"/>
            <a href .../>
        </div>

    With the following CSS, Chrome shows the "unseen-label" slightly below the anchor, which is positioned correctly in both Chrome and Firefox. However, Firefox shows the label in line with the anchor.

        .unseen-activity div.unseen-label {
            display: inline-block;
            position: relative;
            top: -5px;
        }

    and

        .unseen-activity a {
            background: url('style/images/refreshActivity.png') no-repeat;
            background-position: 0 2px;
            height: 20px;
            overflow: hidden;
            margin-left: 10px;
            display: inline-block;
            margin-top: 2px;
            padding-left: 20px;
            padding-right: 10px;
            position: relative;
            top: 2px;
        }

    Please tell me how to change my CSS so that Chrome renders the label centered on the anchor, while keeping Firefox happy and rendering correctly.


  • Is it possible to block traffic originating from a specific country?

    - by mickburkejnr
    Hi guys, my personal website is getting a lot of spam comments at the moment, and most of them originate from Russia (I've used Google Analytics to identify the traffic, and a lot of the links point to Russian sites). As it's a pain to keep deleting these comments, I would like to prevent people from there from commenting on or visiting the website. Is this possible? Also, the website is using WordPress. Many thanks!


  • Free online PHP hosting [closed]

    - by Anthony Newman
    Possible Duplicate: How to find web hosting that meets my requirements?

    I have a PHP script that can take $_GET parameters from a URL (i.e. http://www.example.com/test.php?name=george). I'd like to be able to host this script online so that others can pass parameters to it to obtain the returned data. Anyone know of a free PHP hosting site that would allow for this functionality? (PS: I can't host it myself) Thanks!

