Search Results

Search found 9721 results on 389 pages for 'quicktest pro'.


  • Is there a way I can filter traffic by page type based on URL structure in Google Analytics or Google Webmaster Tools?

    - by Felix
    I have a local business directory site. I'm trying to segment my incoming traffic by page type, so that I can find out what percentage of traffic goes exclusively to zip code pages and what percentage goes to city/state-level pages. Basically, I want to filter by URL structure to find out what percentage of total traffic the zip code pages account for. Could Google Tag Manager help with this? Here are the two URL paths: http://www.example.com/ny/new-york/10011/ http://www.example.com/ny/new-york
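
    For illustration, a minimal PHP sketch of the URL classification being asked about; the regular expressions are assumptions inferred from the two example paths, and similar patterns could likely be reused in a Google Analytics segment or content grouping:

        <?php
        // Classify a request path as a zip-code page or a city/state page.
        // The patterns below are assumptions based on the two example URLs.
        function page_type(string $path): string
        {
            if (preg_match('#^/[a-z]{2}/[a-z-]+/\d{5}/?$#', $path)) {
                return 'zip';         // e.g. /ny/new-york/10011/
            }
            if (preg_match('#^/[a-z]{2}/[a-z-]+/?$#', $path)) {
                return 'city-state';  // e.g. /ny/new-york
            }
            return 'other';
        }

        echo page_type('/ny/new-york/10011/') . "\n";  // zip
        echo page_type('/ny/new-york') . "\n";         // city-state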

    Read the article

  • MCrypt Module, Rijndael-256

    - by WernerCD
    An outside company is redoing our company intranet. During some basic usage I discovered that the "User Edit" screens, with their "Password: *" boxes, show the password in plain text, with the text box set to type=password to "hide" it. The passwords are not stored in the database as plain text; they are stored encrypted with the rijndael-256 cipher using the mcrypt module. I know that if I hash a password with SHA*, the password is "unrecoverable" because hashing is one-way. Is the same true of MCrypt Rijndael-256 encryption? Shouldn't an encrypted password be unrecoverable? Are they blowing smoke up my rear or just using the wrong technology?
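
    A minimal PHP sketch of the distinction at issue, using AES-256 via OpenSSL rather than the deprecated mcrypt extension from the question: symmetric encryption is reversible by anyone holding the key, while a hash can only be re-computed and compared.

        <?php
        // Symmetric encryption: reversible with the key and IV.
        $key = random_bytes(32);
        $iv  = random_bytes(openssl_cipher_iv_length('aes-256-cbc'));
        $cipher = openssl_encrypt('s3cret-password', 'aes-256-cbc', $key, 0, $iv);
        var_dump(openssl_decrypt($cipher, 'aes-256-cbc', $key, 0, $iv)); // "s3cret-password" -- recoverable

        // Hashing: one-way; you can only hash a candidate and compare.
        $hash = password_hash('s3cret-password', PASSWORD_DEFAULT);
        var_dump(password_verify('s3cret-password', $hash)); // bool(true)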

    Read the article

  • First steps in setting up a query-based server [closed]

    - by asghar ashgari
    I have a physical server at home and I want to do the following silly project to learn the concepts behind server/backend development, and then do a real project later on. Idea: turn the server into a calculator. I want anyone to be able to publicly send a query to the server (e.g., 2+2) from the terminal and have the server return the result. So the question is basically where to start, what sort of software I need to install, and what sort of program I need to write.
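
    As a rough sketch of how small the "program to write" part can be (assuming PHP is the language chosen; any web-capable language would do), here is a minimal calculator endpoint served by PHP's built-in web server and queried with curl from any terminal:

        <?php
        // calc.php -- run with:  php -S 0.0.0.0:8080 calc.php
        // then query it with:    curl "http://your-server:8080/?q=2%2B2"
        $query = $_GET['q'] ?? '';

        // Accept only the very limited form "number+number"; never eval() raw input.
        if (preg_match('/^\s*(-?\d+)\s*\+\s*(-?\d+)\s*$/', $query, $m)) {
            echo ((int)$m[1] + (int)$m[2]) . "\n";
        } else {
            http_response_code(400);
            echo "Send a query like ?q=2%2B2\n";
        }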

    Read the article

  • Internet Explorer menu z-order problem [migrated]

    - by robgt
    I have what appears to be a z-order problem with Internet Explorer 9. It might affect other IE versions too, but I haven't tested them; I have to assume so. This page: http://www.modelhelicopters.co.uk/partsfinder/trex500esp/frames If you hover over the "All pages for this model" menu item on the parts finder menu bar (below the currency selector), it should drop down a list of all the parts finder pages for the selected model helicopter. If you view the same page in Firefox or Chrome, you will see how it should appear. In IE9, the menu gets cut off at the top of the main exploded-view image, suggesting the z-order is wrong. I have tried amending this with a jQuery snippet, but it didn't fix IE9. I know the code was inserted by jQuery, as shown by Firebug in Firefox. $j('div.std img[src*="/partsfinder/img"]').attr("style","position:relative;z-index:-100;"); I really do not know why this is not working.

    Read the article

  • Share links with <script src=""> and SEO

    - by gansbrest
    I would like to create a share link to my website using JavaScript: script src="[url-to-my-script]". Basically, the main idea behind this is to render an HTML block with an image and a link to the website. By using the JavaScript approach I can control the look and feel of that link, and potentially I could change the link HTML in the future without touching partner websites. The only potential problem I see with this approach is SEO. I'm not sure if Google will be able to index that link to my website, since it's generated by JavaScript.
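
    A rough PHP sketch of the kind of widget endpoint being described (the file name and URLs are placeholders): partner sites embed a script tag pointing at it, and the script writes the HTML block into their page. Whether crawlers credit a link injected this way is exactly the uncertainty raised above.

        <?php
        // badge.php -- embedded by partners as:
        //   <script src="https://www.example.com/badge.php"></script>
        header('Content-Type: application/javascript; charset=utf-8');

        $html = '<a href="https://www.example.com/">'
              . '<img src="https://www.example.com/img/badge.png" alt="Example badge">'
              . '</a>';

        // json_encode produces a safely escaped JavaScript string literal.
        echo 'document.write(' . json_encode($html) . ');';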

    Read the article

  • How do I deal with content scrapers? [closed]

    - by aem
    Possible Duplicate: How to protect SHTML pages from crawlers/spiders/scrapers? My Heroku (Bamboo) app has been getting a bunch of hits from a scraper identifying itself as GSLFBot. Googling for that name produces various results from people who've concluded that it doesn't respect robots.txt (e.g., http://www.0sw.com/archives/96). I'm considering updating my app to keep a list of banned user-agents, serve all requests from those user-agents a 400 or similar, and add GSLFBot to that list. Is that an effective technique, and if not, what should I do instead? (As a side note, it seems weird to have an abusive scraper with a distinctive user-agent.)
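
    The shape of the check being proposed, sketched in PHP; the real app runs on Heroku, so the actual check would live in that app's own middleware, and the second bot name below is a placeholder.

        <?php
        // Refuse requests whose User-Agent matches an entry in a blocklist.
        $banned = ['GSLFBot', 'SomeOtherBot'];  // 'SomeOtherBot' is a placeholder
        $ua = $_SERVER['HTTP_USER_AGENT'] ?? '';

        foreach ($banned as $bot) {
            if (stripos($ua, $bot) !== false) {
                http_response_code(403);  // the question suggests 400; any 4xx refusal works
                exit('Forbidden');
            }
        }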

    Read the article

  • Google Keyword Competition rating

    - by Eric
    Google offers a keyword application that allows me to see the number of times a particular query has been made in Google. There is a column in the results named "Competition" (actually it's "Concurrence" in French; I'm just translating). It's a rating from 0 to 1, like a percentage. What indicator is that? EDIT: Is this something useful I should rely on? I'm not sure how to interpret this data. Should I go for less competitive keywords with a lower number of searches, or not worry about it and go for the highly searched keywords anyway? Is 50% considered high? What about 75%? I have a very niche market that sells expensive offline services, so the very long tail is my goal (I assume). If you haven't already figured it out, I'm very new to SEO =)

    Read the article

  • WordPress blog penalized by Google search - what's wrong?

    - by pawelbrodzinski
    I have a blog (http://blog.brodzinski.com), which is a wordpress.org blog with the fairly popular Thesis theme and almost no other customizations. Some time ago it was penalized by Google search: it simply stopped appearing in search results, even for search terms it used to be the top result for, like my name, Pawel Brodzinski, which isn't anything close to a popular search term. To be exact, the site was penalized on Nov 18. It started popping up in search results on Dec 23, but only for a few days; since Dec 27 it has been out again. I know the Google guidelines and I'm not aware of breaking any of them. I submitted a reconsideration request after I noticed the penalty. It was processed and there was no change whatsoever (no surprise, as it seems the site was penalized again). I checked the diagnostics in Webmaster Tools: no malware was detected and no strange search terms popped up. I read related threads on the Google Webmasters forum but found none of the solutions worked for me. I posted a thread on the Google Webmasters forum (http://www.google.com/support/forum/p/Webmasters/thread?tid=546339f49d4a03bc&hl=en) and the only answer I got was to check for duplicate content. Well, there is some duplicate content published on the web, but that is true for the vast majority of blogs and it doesn't seem to be a reason for a penalty. Also, before Dec 27 I was able to remove duplicate content from a couple of sites which were republishing my feed, but this didn't change the situation: the site was penalized again. The problem is I have no idea what could be wrong with the website or how to find out. To make the problem worse, I'm no webmaster; I just run a WordPress blog, which is supposed to be easy.

    Read the article

  • Google Webmaster Tools showing 6 pages submitted, 0 indexed, yet I can see them all when I search in Google?

    - by sam
    I have a small 'brochure' type site with 6 pages, and I can see all of the pages showing up in Google when I search for my site. But in Webmaster Tools, under the sitemaps section, it says 6 submitted (the blue bar of the graph), while the indexed pages bar (the red one) shows 0 indexed pages, even though they seem to be indexed. Any idea why this is? I don't really think it's that important, as the pages are still indexed, but it just seems odd. UPDATE 9/3/12: Having just looked in Google Webmaster Tools, it's showing that there are 11 pages indexed under the Health > Index Status tab, but under the Optimization > Sitemaps tab it shows 6 URLs submitted and only 1 indexed. Please see the images below. Index status: Sitemap status:

    Read the article

  • Rewrite rule to show as directory using .htaccess

    - by chanchal1987
    I want to implement a rewrite rule in my .htaccess file to show a specific URL as a directory on my server. See the rule I have written:
    RewriteRule ^(.*)/$ ?page=$1 [NC]
    This rewrites URLs like www.mysite.com/abc/ to www.mysite.com/index.php?page=abc. But if I request www.mysite.com/abc then it throws a 404 error. How can I write a rewrite rule which will match both www.mysite.com/abc and www.mysite.com/abc/? Edit: My current .htaccess file (after Litso's answer's 3rd revision) is as below:
    ##
    ErrorDocument 401 /index.php?error=401
    ErrorDocument 400 /index.php?error=400
    ErrorDocument 403 /index.php?error=403
    ErrorDocument 500 /index.php?error=500
    ErrorDocument 404 /index.php?error=404
    DirectoryIndex index.htm index.html index.php
    RewriteEngine on
    RewriteBase /
    Options +FollowSymlinks
    RewriteRule ^(.+)\.html?$ $1.php
    RewriteCond !-d
    RewriteRule ^(.*)/$ ?page=$1 [NC,L]
    RewriteCond %{REQUEST_URI} !index.php
    RewriteRule ^(.*)$ ?page=$1 [NC,L]
    ##

    Read the article

  • Does the EU cookie law apply to an EU site that is hosted outside of the EU?

    - by mickburkejnr
    I have been reading up about this EU cookie law, and have also had in-depth conversations with my girlfriend, who is a solicitor/lawyer, and with colleagues while building websites. While we are now working towards implementing a way to abide by the EU law, I have thought of something which no one really knows the answer to and which has caused a few arguments. It's my understanding that any website in the EU must abide by these cookie laws, which is understandable. However, say I were to have a .co.uk or .eu domain name pointing to a website which is hosted in America, for example; do I still need to abide by the EU laws even though the website is hosted outside of the EU? One person I have asked has said that because the domain name is .co.uk or .eu (a European TLD), the website is still accountable under EU law. Another person has said that because the actual website is hosted outside of the EU, it doesn't actually have to bother with this law.

    Read the article

  • Is it possible to block traffic originating from a specific country?

    - by mickburkejnr
    My personal website is currently getting a lot of spam comments, and most of them originate from Russia (I've used Google Analytics to identify the traffic, and a lot of the links point to Russian sites). As it's a pain to keep deleting these comments, I would like to stop people from there from commenting on or visiting the website. Is this possible? Also, the website is using WordPress. Many thanks!
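
    One way the blocking could look at the application level, sketched in PHP and assuming the PECL geoip extension is installed; country blocking is also commonly done at the web server or firewall instead, and a WordPress anti-spam plugin may be the simpler fix for comment spam specifically.

        <?php
        // Look up the visitor's country from their IP and refuse the request.
        $country = @geoip_country_code_by_name($_SERVER['REMOTE_ADDR']);
        if ($country === 'RU') {
            http_response_code(403);
            exit('Commenting is not available from your region.');
        }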

    Read the article

  • Domain name only works with www, but not without the www

    - by Rodrigo Salazar
    I have searched for this problem numerous times, but I can't quite apply the solutions to my situation. The usual solution I find is that I need to add an A record to my DNS; currently I only have a CNAME record for host 'www'. The problem is, I'm having trouble understanding what goes in the 'host' field of the record when trying to create an A record. Here is a screenshot of my domain registrar where I am trying to create the record: http://i.imgur.com/M18zm.png I have tried creating an A record with the 'host' field set to *, but that is not a valid entry, nor is leaving it blank. Does anyone have any suggestions? The domain is swageroo.com, but it's only reachable at http://www.swageroo.com (with the www).

    Read the article

  • URL rewriting via forward proxy

    - by Biggroover
    I have an app that runs inside my firewall and talks out to multiple endpoints via HTTP/HTTPS on a non-standard port, e.g. http://endpoint1.domain.com:7171, http://endpoint2.domain.com:7171. What I want to do is route these requests through a forward proxy that rewrites the URL to something like http://allendpoints.domain.com/endpoint1 (port 80 or 443), and then on the other end have a reverse proxy that unwinds what I did on the forward proxy to reach the specific endpoints. The result would be that I can route existing app requests through to specific endpoints across the internet without having to change my app software. My questions are: is this even possible? Is it a good idea, or are there better ways to do this? Can this be done with IIS and Apache as the proxies?

    Read the article

  • Strategy for hosting 700+ domains, each with static HTML site

    - by jonschlinkert
    I have a portfolio of more than 700 domain names, and ideally I'd like to put up a single-page HTML/CSS/JavaScript site for each domain. Is there a system/strategy/workflow that will allow me to automate the deployment of new websites, quickly and easily, without having to manually initiate each new website in an admin panel? For instance, I've seen Dropbox-based solutions that claim to make it simple to set up new websites on your Dropbox account, but you still have to set each one up in an admin interface first. It would be so much easier to have a folder naming convention that allowed the user to easily clone/copy/duplicate sites inside their Dropbox App folder (https://www.dropbox.com/developers/blog/23) to create new ones. Sounds interesting, however... It's easy to manage CNAMEs on the registrar side; is there a way to quickly associate CNAMEs with new websites, maybe gh-pages-style (https://help.github.com/articles/setting-up-a-custom-domain-with-pages)? With GitHub's gh-pages, all you have to do is drop a file called CNAME into your repo, with the domain name you want associated with the repo inside the file. Unfortunately, gh-pages isn't a good solution for what I'm doing, though. I'm also a front-end developer, specializing in rapid web development and "front-end build systems", so building and maintaining static assets for hundreds of sites is no problem. It's the hosting side that I really struggle with. Any suggestions?
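
    One possible shape for the hosting side, sketched in PHP under the assumption of a single catch-all virtual host: a tiny front controller maps the requested Host header onto a per-domain folder, so deploying a new site is just adding a directory (the paths are illustrative).

        <?php
        // index.php -- catch-all front controller for many single-page sites.
        $host = strtolower(preg_replace('/^www\./', '', $_SERVER['HTTP_HOST'] ?? ''));
        $file = __DIR__ . '/sites/' . basename($host) . '/index.html';  // basename() blocks path tricks

        if ($host !== '' && is_file($file)) {
            readfile($file);
        } else {
            http_response_code(404);
            echo 'No site configured for this domain.';
        }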

    Read the article

  • URL changed to www: how much time does Google take to reindex?

    - by user20321
    It's been about a week since I changed my URL from the non-www to the www version; it was done by my host provider. Now I want to ask how much time Google takes to remove sitename.com from its index and replace it with www.sitename.com (the site redirection has already been set up by the host provider), as it is still showing the old ones. My new URLs are indexed, but my main URL www.sitename.com is not indexed. Or do I have to remove those old URLs personally? It's already been about 5-6 days.

    Read the article

  • Google Analytics/AdWords accounts and leaking of private data

    - by Satellite
    I am frequently asked to log into clients' Google Analytics and AdWords accounts. If I forget to log out before visiting other Google properties (Google Search, YouTube, etc.), this leaves traces of my views and searches, exposing my activities to the client. Summary: (1) the client gives me access to their Google Analytics / AdWords account; (2) I log into the client's Analytics account and do some work; (3) in another tab I perform some related Google searches to solve related issues; (4) issues solved, I close the Analytics tab; (5) I then visit google.com and perform some unrelated searches; (6) I then visit YouTube and view some unrelated videos. All web and YouTube searches are recorded in the client's Google account, thus leaking potentially sensitive data. Even assuming that I remember to log out correctly at step 4 (as I do 95% of the time), anything I do at step 3 is exposed to the client. I would be surprised if this is not a very common issue. I'm looking for a technical solution to ensure that this can never happen. Any ideas?

    Read the article

  • WordPress plugin installation error

    - by Steve
    I'm trying to upload secure-wordpress.1.0.6, and I receive the following error:
    Warning: touch() [function.touch]: open_basedir restriction in effect. File(/abs_path/wordpress/tmp/secure-wordpress.tmp) is not within the allowed path(s): (/abs_path/:/abs_path/:/usr/local/lib/php:/tmp/php_upload) in /abs_path/public/www/wordpress/wp-admin/includes/file.php on line 199
    Download failed. Could not create temporary file.
    The /wp-content folder and all its subfolders have 777 permissions. I've added the following two lines to wp-config.php:
    putenv('TMPDIR='.ini_get('upload_tmp_dir') );
    define('WP_TEMP_DIR', ABSPATH .'wp-content/uploads/');
    What else should I try? I am using WordPress 3.0.4 in a PHP 4.4.9 environment.

    Read the article

  • Preventing Duplicates on Google

    - by abel
    I am currently using a rewrite rule to enable access to .php pages without using the .php extension. However, to prevent old links from breaking, the pages can still be accessed via links containing the .php extension too. For example, domain.com/page.php can now be accessed at domain.com/page, and all the links within the site now use the domain.com/page style. However, older incoming links will still point to the .php pages, meaning Google will index both pages and mark them as duplicates. I have two plans to remedy the situation. Use a PHP 301 redirect: when a page is accessed with the .php extension, I can redirect each page individually using a 301 redirect in PHP. Use a canonical tag: place a canonical tag on each page, pointing to the ".php"-less version. My question: are both methods equally efficacious in preventing Google from indexing my ".php" pages? Which method should be preferred, by convention or otherwise?
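
    A minimal sketch of what the first option (a PHP 301 redirect) could look like, assuming the extensionless URL is the one that should be indexed and that this runs before any output is sent:

        <?php
        // If the request arrived with a ".php" suffix, permanently redirect
        // to the extensionless form, preserving any query string.
        $uri = $_SERVER['REQUEST_URI'];
        if (preg_match('#^(/[^?]*)\.php(\?.*)?$#', $uri, $m)) {
            header('Location: ' . $m[1] . ($m[2] ?? ''), true, 301);
            exit;
        }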

    Read the article

  • Moving large amounts of data between shared hosts

    - by Bryan M.
    I recently acquired a client who is a photographer and was interested in moving web hosts, since his current host had threatened to throw him off due to CPU spiking. The migration went fairly easily, with about 350 MB of website and media files. Then I discovered about 60 GB of client galleries he had failed to mention. I am unable to move this much data myself, since I'm capping out at about 20 kb/s on the FTP connection. Has anyone encountered a situation where they needed to migrate this much data between cheap hosts? Should we contact the hosting companies about this (he is moving from Westhost to MediaTemple)?

    Read the article

  • Is it safe to have no TOS or PP?

    - by JamerTheProgrammer
    I have coded my own forums from the ground up. I have tried my best to make my code as secure as possible, encrypting everything I can. I want to use this forum for a Minecraft server. I have one concern, however: I would like to set up this forum now, but having no TOS or Privacy Policy has put me off. Will having neither cause me any legal trouble in the unlikely event of a data leak? Thanks

    Read the article

  • Connecting a remote MySQL database to a local MySQL database? [migrated]

    - by Shashank
    I want to write PHP code to be embedded in a Drupal 7 module. I want to call a procedure which can copy newly generated data in the local MySQL database to a remote MySQL database. When data is inserted into table 'A' of my local database, it should be copied to a specific table 'B' in the remote MySQL server's database. Table 'A' is on localhost; table 'B' is on the remote server. Insert data into 'A', and the data is copied into 'B'. Is this possible? Thanks for the help.
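
    A rough sketch of the copy step in plain PHP with PDO, not Drupal-specific; the hostnames, credentials, and column names are placeholders, and for continuous syncing MySQL replication would be the more usual tool.

        <?php
        // Copy the most recently inserted row from local table A to remote table B.
        $local  = new PDO('mysql:host=localhost;dbname=local_db', 'user', 'pass');
        $remote = new PDO('mysql:host=remote.example.com;dbname=remote_db', 'user', 'pass');

        $row = $local->query('SELECT id, payload FROM A ORDER BY id DESC LIMIT 1')
                     ->fetch(PDO::FETCH_ASSOC);

        if ($row) {
            $stmt = $remote->prepare('INSERT INTO B (id, payload) VALUES (:id, :payload)');
            $stmt->execute($row);
        }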

    Read the article

  • Lazyloading images and SEO

    - by surpr
    I'm lazyloading images with a noscript fallback. Should I expect any damage in the SERPs? The site is completely thumbnail-based. Also, should I put a smaller image in the noscript fallback to increase crawlability? We have nearly 1 million thumbs, so it's a decision I'm hesitant to make. The reason I'm thinking about it in the first place is that we're upping thumbnail size by about 50%, which will add about 10% to the page size.
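
    For reference, a small PHP helper sketching the markup pattern in question: the real image URL sits in a data attribute for the JavaScript lazyloader, with a plain <img> inside <noscript> as the no-JS/crawler fallback (the attribute name depends on the lazyload library used).

        <?php
        // Emit a lazyloaded thumbnail with a <noscript> fallback.
        function lazy_thumb(string $src, string $alt): string
        {
            $src = htmlspecialchars($src, ENT_QUOTES);
            $alt = htmlspecialchars($alt, ENT_QUOTES);
            return "<img class=\"lazy\" data-src=\"$src\" alt=\"$alt\">"
                 . "<noscript><img src=\"$src\" alt=\"$alt\"></noscript>";
        }

        echo lazy_thumb('/thumbs/1234.jpg', 'Example thumbnail');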

    Read the article

  • Is WordPress a good CMS for an Event Site? [closed]

    - by Roland
    I plan on building a gallery/exhibition event site. Locations are usually the same and fall into 3 categories (gallery, offspace, institution). Then there is the exhibition title, the date, and the participating artists. So I was wondering if WordPress could handle such a site. It should be very data-driven, though, so that all the information is in a list view on one page and can be ordered and queried (which artists took part in which exhibitions, and so on). Please tell me the pros and cons of using WordPress for such a site, and problems I could run into if I plan to broaden the scope later on. Thanks!

    Read the article

  • Using paypal to process credit cards in Sweden through an API [on hold]

    - by Mastikator
    I'm looking for a PayPal API that lets me process credit cards to make payments without being redirected to a PayPal site and without forcing consumers to use a PayPal account. And it needs to work in Sweden. Of the ones I've looked at (DoDirectPayment, Express Checkout, PayPal Pro gateway), none have let me process credit cards in Sweden via an API that doesn't force the user to visit the PayPal login site. I have a form on my webpage where the user types their credit card number, CVV2, expiration, name, address, etc. I need an API that works in Sweden and simply processes the request, without the step of being redirected to a PayPal website. The ones I have found only worked in a select few countries; is there an international solution? I've already spent over 12 work hours just looking for an API that meets my requirements.

    Read the article
