Search Results

Search found 9721 results on 389 pages for 'quicktest pro'.

Page 243/389 | < Previous Page | 239 240 241 242 243 244 245 246 247 248 249 250  | Next Page >

  • Google Stats: how to get more info?

    - by Ant's
    I created a blog very recently and I'm watching my traffic and audience using the Stats feature built into Google Blogger. I have a few questions about it:
    1) Is the number of visitors shown by Stats rough or accurate?
    2) How do I find out whether my pages were visited by people or by search engines?
    3) Is Blogger Stats the best option for a beginner like me, or is there a better tool?
    Correct me if I'm wrong about anything.

    Read the article

  • Will uploading our .docx files on scribd and embedding the files on our website affect search engine rankings?

    - by user1439968
    We have prepared notes for university students in .docx format, and we want to put them on our website for viewing. We have tried one option: uploading the files to Scribd and embedding them on our website in the Scribd viewer. Will making the documents available through the Scribd viewer on our website affect our search engine rankings? Will search engines treat this as duplicate content, since the documents are already uploaded to Scribd and we are only embedding them on our website? (On Scribd we have set the uploaded documents to 'private', though.) And if it does affect rankings, can you suggest a suitable way to make the .docx files viewable on our website that doesn't?

    Read the article

  • Quality web hosts not using cPanel [closed]

    - by J4G
    Possible Duplicate: How to find web hosting that meets my requirements?
    I was an iPower web hosting customer before I encountered major problems with their MySQL databases. I recently tried A Small Orange, whose GUI was not compelling, and I quickly learned to loathe cPanel. I looked into using GoDaddy, but reviews of their service have been very negative. I was satisfied with iPower's control panel, so something similar would be appropriate. Can anyone recommend a quality web host that includes the following features?
    * Unlimited bandwidth (or at least 200 GB)
    * Unlimited storage (or at least 10 GB)
    * High uptime (preferably 95% or higher)
    * Does not use cPanel or other difficult-to-use control panels
    * Supports multiple MySQL databases
    * Uses a recent version of phpMyAdmin

    Read the article

  • How would a search engine see URL-encoded characters?

    - by K20GH
    I've got my URLs set up, but some of the strings contain &. Obviously I can't use a raw & as best practice, so I've replaced it with +. However, if I encoded the & instead, it would become %26. How would a search engine see that? Would it treat %26 as an & and still return the URL, or would it just see it as a literal %26? I.e. would www.example.com/sweet?m&m show as that, or would they see it as www.example.com/sweet?m%26m?
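
    For reference, this is roughly how the encoding round-trip behaves in PHP (a minimal sketch; the query-string value is just an example):

        <?php
        // Percent-encoding an ampersand for use inside a query-string value.
        $value   = 'm&m';
        $encoded = urlencode($value);    // "m%26m"

        // A crawler (or any HTTP client) that decodes the URL gets the original back.
        $decoded = urldecode($encoded);  // "m&m"

        echo $encoded . "\n" . $decoded . "\n";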

    Read the article

  • How to compete in a Saturated Niche? [closed]

    - by jasondavis
    Possible Duplicate: Does the position of the content in the source make a difference to SEO?
    When entering a hugely oversaturated niche such as web design, is it even possible to compete with the big sites that have been on Google's first results page for years? Also, I have read about how important titles, link anchors, and headings are for SEO, and how content is the most important factor. So let's say we are building a standard header/body/sidebar/footer page. In the actual markup, would it be better to make sure the main content comes before the sidebar, or does this probably not make a difference?

    Read the article

  • What is the best taxonomy from Google's perspective?

    - by ZakGottlieb
    I was wondering what the best way is to structure a new website in Google's eyes. Currently it contains two top-level categories (X and Y), and clicking a term under either one results in the URL www.nameofsite.com/X/x-type-term or www.nameofsite.com/Y/y-type-term. Technically it is correct to group all "X type" terms under X and all "Y type" terms under Y, but we could probably be more granular and break all articles into 5-6 top-level categories by splitting Y into more specific categories. Given that the current URL structure will eventually result in thousands of "X type" and "Y type" terms under just two top-level categories, would it be more advisable to have several top-level categories, as suggested? Thank you in advance.

    Read the article

  • Server Setups for Agencies [closed]

    - by styks1987
    We are considering consolidating our server administration by cutting down on the number of systems we currently use to manage all our websites (~65 websites). Currently we have a testing server and 3 production servers (2 running Cherokee on Linode, 1 running Apache on (mt)). We don't have a dedicated server admin, so I am stuck with managing all these servers, and as a developer I don't want to deal with all the server hassle. My main goal is to cut down on the time spent messing with the servers. We have looked at Pagoda Box and AppFog as possibilities. I am not sure Pagoda Box would be cost-effective: with 65+ websites we may end up paying anywhere from $0 to $50+ per website per month, whereas right now we pay about $250 per month for the 4 VPS servers mentioned above. We already use Capistrano for deployment. I have the opportunity to completely overhaul the entire setup, and I would like some feedback on where you found your information for large-scale server management, or on how you currently do it. Articles are welcome. In summary:
    1. What is new (in the past 2 years) in the simple server management arena?
    2. If you work at an agency or have had agency experience, how do/did you manage your sites?
       a. What is the level of effort for SSL, new site setup, database management, and extension management?
       b. How did you handle datacenter outages?
    3. Anyone with Pagoda Box experience: do you like it, and did you have problems with WordPress, CakePHP, Drupal, ExpressionEngine or Magento?
       a. Is it expensive for you?
       b. How has server uptime been?
    Your direction and comments are greatly appreciated.

    Read the article

  • Which status code to use for a temporarily inactive page

    - by aji
    I was wondering if someone could help me with how to manage temporarily inactive pages with respect to SEO and search engines. The situation: I manage a big e-commerce site, and sometimes I need to take a page (or pages) down. It could be for days, weeks, or months; it depends on our vendor. If visitors land on a page that has been temporarily deactivated, I can show them a message that the vendor they are looking for is not available at this time and that they can check back later OR check another vendor with similar products. But how do I send that message to search engine robots? If I use a 301 and forward the URL to a page of similar products, the chance of the current URL being deindexed is high, while I still want to use that URL in the future if the vendor wants to re-join. Any advice will be highly appreciated.
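
    For context, this is roughly what the statuses discussed above look like at the HTTP level in PHP (a minimal sketch with hypothetical paths, not a recommendation for this particular site):

        <?php
        // Option A: the 301 mentioned above. It tells crawlers the move is
        // permanent, so the old URL tends to drop out of the index over time.
        // header('Location: /vendors/similar-products', true, 301);

        // Option B: a temporary redirect to the same destination, signalling
        // that the original URL is expected to come back.
        header('Location: /vendors/similar-products', true, 302);

        // Option C: keep the URL responding, but mark it temporarily
        // unavailable and suggest when to retry.
        // http_response_code(503);
        // header('Retry-After: 604800'); // one week, in seconds
        exit;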

    Read the article

  • Backlinks pointing to non-existent pages

    - by Michal
    I've bought a domain that was previously used by somebody else back in 2007. Now I've realized that the internet is full of backlinks that point to pages that no longer exist under my domain (for example to mypage.com/whatever, where 'whatever' is not present on my website, so a 404 error is shown). I want to ask: are these links counted by Google (for PageRank) and other search engines, or not? And do I have to redirect these links to existing pages in order for them to be counted?
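
    If redirecting does turn out to be worthwhile, one common pattern is a small lookup table of legacy paths handled in a catch-all or 404 script. A minimal PHP sketch (all paths here are hypothetical):

        <?php
        // Map legacy URLs (targets of old backlinks) to the closest current page.
        $legacy = [
            '/whatever'    => '/',                  // old page with no real equivalent
            '/old-article' => '/blog/new-article',
        ];

        $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

        if (isset($legacy[$path])) {
            // Permanent redirect so the link points at a live page.
            header('Location: ' . $legacy[$path], true, 301);
        } else {
            http_response_code(404); // genuinely gone
        }
        exit;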

    Read the article

  • How to prevent search engines from indexing a section of a page?

    - by BrunoLM
    I have many pages with lots of text on them, but each page always has two sections of text: one section must be indexed, and the other I want to keep out of search results.
    <p class="please-index-me">text</p>
    <p class="get-out">never index me please</p>
    I thought that if I loaded the "please don't index me" text with JavaScript, search engines might not pick it up, but I am not sure that would work, and it is not a very nice solution. I was wondering if there is a way to tell search engines "hey, you can't grab this text, move on". So, is there a way to do it?

    Read the article

  • Weird URLs being accessed by Googlebot

    - by Avishai
    Lately I've been seeing all sorts of strange URLs show up as errors in my Webmaster Tools account, but they are URLs that don't actually exist on my site, nor are they linked from the pages that Google claims they're linked from.
        URL                                          Response code   Detected
        yR3kna/5RfA4+ndtn/X4zcevudMlXbqbIrnPbH9irw=  404             9/16/12
        OK4iaOVdr6Ocjmz+u1kuR5Q486mhDo/e45nwjl2+y8=  404             9/9/12
        pxGz/oHEA0BS8U3VFBzJcZnnIHMsFXb3/rIxMxh2ws=  404             9/16/12
        Af8tbvQ0HniIpf53I8Txz1hM1/JxxrFQxgqPuErWII=  404             9/9/12
        7Bk7c0LDmm4PHyTjml017EGwNNPCn/p/0xMSWWPDic=  404             9/16/12
        umCwnDvTE8ybpUB19MIb+VRj5xRJncyYGGfAQ2Mxn0=  404             9/1/12
        # etc...
    Do you know how to make these stop? It's not at all clear to me why Googlebot would be requesting these URLs in the first place.

    Read the article

  • Does Googlebot (and/or search engines) index a forwarded page? [duplicate]

    - by user2889419
    This question already has an answer here: HTTP and HTTPS impacts on SEO
    Let's say I have the domain example.com and I force users to use HTTPS instead of HTTP. Since browsers simply accept and load the forwarded/new page (when a request for http://example.com is redirected to https://example.com), does Googlebot (or another search engine's bot) accept the forwarded page, index the new page, and just ignore the old one? In other words, do search engines accept HTTPS alongside HTTP?
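
    For reference, "forcing" HTTPS as described above usually amounts to a permanent redirect issued before any output. A minimal PHP sketch (assuming the redirect is not already handled at the web-server level):

        <?php
        // Redirect any plain-HTTP request to the HTTPS version of the same URL.
        $isHttps = !empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off';

        if (!$isHttps) {
            $target = 'https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'];
            header('Location: ' . $target, true, 301); // permanent: index the HTTPS URL
            exit;
        }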

    Read the article

  • Using Subdomains for Newly Regional Company

    - by Taylord22
    The company I work for is expanding their business into new territories. I've got a lot of stabilization to do in the region/state where we're one of the most well-known companies of our kind. Currently we have 3 distinct product lines, which are distinguished by 3 separate URLs. This is affecting the user flow of our site, so we'd like to clean it up before launching our products into the various regions. The business has decided to grow into 5 new states (one of which consists of a single county), none of which will feature all 3 products. Our home-base state is the only one that will have all 3 products this year. My initial thought was to use subdomains to separate the regions; that way we could use a canonical tag to stabilize the root domain (which would feature home-state content, plus support content for all regions) and remove us from potential duplicate-content penalization. Our product content will be nearly identical across the regions for the first year. I second-guessed myself by thinking that it was perhaps better to use a "[product].root/region" URL instead. And I'm currently stuck wondering whether it would be better to build out subdomains for both products and regions, using one modifier or the other as a funnel/branding page into the other. For instance, a user lands on "region.root.com" and sees exactly what products we offer in that region (basically a tailored landing page), while the bulk of the product content actually lives under "product.root.com/region/page". My head is spinning. While searching for similar questions I also bumped into a reference to another tag meant to be used in some cases similar to mine. I feel like there are a lot of risks involved in this subdomain strategy, but I also can't help but see the benefits in the user flow.
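
    As an illustration of the canonical-tag idea mentioned above, a regional page can point search engines at the root-domain version of the same content. A minimal PHP sketch (domain names are hypothetical):

        <?php
        // On e.g. https://texas.example.com/product-a/, declare the root-domain
        // page as the canonical version while the regional content is near-identical.
        $canonical = 'https://www.example.com' . $_SERVER['REQUEST_URI'];
        ?>
        <head>
            <link rel="canonical" href="<?= htmlspecialchars($canonical, ENT_QUOTES) ?>">
        </head>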

    Read the article

  • Rowspan not using top row

    - by DaAwesomeP
    I don't understand why my column won't span to the top and bottom rows I created. It is supposed to look like the "Today" column is taller at the top and bottom than the other columns. I tried accomplishing this by adding rows above and below it and then using rowspan. For some reason, it only works on the bottom row. It's a lot of code, and I wasn't sure what I could cut without deforming it all or adding a new variable (it needs a fluid height). JSFiddle: http://jsfiddle.net/DaAwesomeP/aU9Le/ Basic HTML layout (the full fiddle has the CSS):
    <table id="weatherForecast">
      <tr class="weatherForecast-row-outer">
        <td></td>
      </tr>
      <tr id="weatherForecast-row">
        <td id="weatherForecast-DATE" class="weatherForecast-day weatherForecast-day-today" rowspan="2">
          <!-- Cell Content for "Today" Here -->
        <td id="weatherForecast-DATE" class="weatherForecast-day ">
          <!-- Cell Content Here. There are 4 of these -->
        </td>
      </tr>
      <tr class="weatherForecast-row-outer">
        <td></td>
      </tr>
    </table>
    An image showing what I am trying to accomplish: http://s14.postimg.org/ba3xwcm75/Rowspan_not_using_top_row.png

    Read the article

  • Is the PHP function md5() secure? Can it be used for passwords? [migrated]

    - by awiebe
    Submitting a form to a PHP script causes the form values to be sent to the server, where they are then processed. If you want to store a password in your database, you want it to be a cryptographic hash. So, to keep the client side secure, can you generate an MD5 hash with PHP securely (i.e. without submitting the user:password pair in the clear), or is there an alternative standard method of doing this that keeps the unencrypted password from ever leaving the client's machine? Sorry if this is a stupid question, I'm kind of new at this. I think this can be done somehow using HTTPS, and on that note, if a site's login page does not use HTTPS, does that mean that while the database storage is secure, the transport is not?
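
    For reference, the standard server-side approach in current PHP is the password_hash()/password_verify() pair rather than md5(); a minimal sketch:

        <?php
        // Hash the password on the server when the account is created.
        // md5() is a fast, unsalted hash and is not suitable for passwords.
        $hash = password_hash($_POST['password'], PASSWORD_DEFAULT);
        // ... store $hash in the database ...

        // Later, at login, compare the submitted password against the stored hash.
        if (password_verify($_POST['password'], $hash)) {
            // valid credentials
        }

    Either way the plain password still travels from the browser to the server with the request, which is why the HTTPS part of the question matters for transport security.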

    Read the article

  • Hide a site from Google while developing

    - by user210757
    I will be building a (WordPress) website. While I am developing it, other team members will be pushing content. I'd like to keep it hidden from Google while it is under development. It will be hosted on GoDaddy. I have thought of not pointing the domain name at it until it goes live and using "preview DNS", or buying a static IP during development. Or hosting the dev site in a subdirectory ("/dev/") until it's ready and then moving it up a level; if it's in the dev directory I'd add .htaccess rules or a robots.txt to block crawling. Is any of this a bad idea? Will Google penalize any of it, for example by finding the site by IP and then associating that with the domain later on? Any better ideas?
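
    One application-level variant of the "block crawling" idea is to send a noindex header from the development environment. A minimal PHP sketch (the ENVIRONMENT variable is hypothetical; use whatever flag distinguishes dev from production):

        <?php
        // Ask search engines not to index anything served by the dev environment.
        // A robots.txt disallow only blocks crawling (the URL can still appear in
        // the index), whereas X-Robots-Tag explicitly asks for it not to be indexed.
        if (getenv('ENVIRONMENT') === 'dev') {
            header('X-Robots-Tag: noindex, nofollow');
        }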

    Read the article

  • The best way to snatch an expiring domain?

    - by SilvrSun
    There's a domain that I've been looking to acquire that expires on the 30th of this month. I don't think it is very popular, and the owner doesn't seem to have updated the website in two years. I was doing some research and came across a site that reviews some domain "snatching" services, but the article is quite outdated. So I'm wondering if anyone can offer newer information on the topic, or whether they recommend any services to help me acquire the domain in question.

    Read the article

  • Can I host my website through GitLab like you can with GitHub Pages?

    - by BenWatkinsArt
    I would love to use Git to host my website, along with a platform I can log into online to manage it (something like GitHub). You would think, in that case, that GitHub Pages would be the perfect route for me, but I don't want to use GitHub Pages: I would like to host all of this on my own servers, like you do with GitHub Enterprise (but for free). I have found GitLab and was wondering if there is any way I can use GitLab like GitHub Pages. Is it possible?

    Read the article

  • SimpleViewer + lightbox

    - by singles
    Is it possible to integrate any kind of Lightbox with SimpleViewer? I don't want to display SimpleViewer inside a Lightbox; I want a Lightbox to open when I click one of the images in SimpleViewer. Has anyone tried that successfully? EDIT: I have a SimpleViewer page now. I just want to bind a handler to clicking an image (as you would on a normal HTML page), fetch the big image's URL, and show that image (not SimpleViewer!) in Lightbox/ThickBox/FancyBox etc.

    Read the article

  • .htaccess and browser caching

    - by Tim
    I ran across these suggested .htaccess edits. Is this good practice? Is it something I should implement on my WordPress site?
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType image/jpg "access plus 1 year"
        ExpiresByType image/jpeg "access plus 1 year"
        ExpiresByType image/gif "access plus 1 year"
        ExpiresByType image/png "access plus 1 year"
        ExpiresByType text/css "access plus 1 month"
        ExpiresByType application/pdf "access plus 1 month"
        ExpiresByType text/x-javascript "access plus 1 month"
        ExpiresByType application/x-shockwave-flash "access plus 1 month"
        ExpiresByType image/x-icon "access plus 1 year"
        ExpiresDefault "access plus 2 days"
    </IfModule>

    Read the article

  • Transferring users and search engines to a new domain

    - by eftpotrm
    I've been asked to take over the maintenance of an existing site that's being reworked. At present it serves localised content for several languages, but via a fairly unhelpful mechanism which means that search engines essentially only have it indexed in English, and any deep links will de facto appear in English as well. So new localised sites are being built under separate domains (not just for this reason; there are other benefits). What we're then looking to do is redirect users correctly to the new sites, where appropriate. For humans this isn't a problem: we can send them through a gateway page on their first visit, grab their language preference, put it in a cookie, and then redirect them to the new localised content as soon as it's available. For search engines this isn't so good... In principle I'm happy to simply bypass the gateway page and redirect known spiders to the new site, but this means we're serving radically different content (a different URL, even!) to human and robot users. Won't this therefore be regarded as cloaking and cause us grief? Does anyone know a better way to handle this?
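
    For illustration, the human-visitor part of the scheme described above might look something like this. A minimal PHP sketch (domain names, cookie name and the language parameter are all hypothetical):

        <?php
        // Map of language preferences to the new localised domains (hypothetical).
        $localisedSites = [
            'fr' => 'https://www.example.fr/',
            'de' => 'https://www.example.de/',
        ];

        // On the gateway page the visitor picks a language, which we remember.
        if (isset($_GET['lang']) && isset($localisedSites[$_GET['lang']])) {
            setcookie('site_lang', $_GET['lang'], time() + 86400 * 365, '/');
            header('Location: ' . $localisedSites[$_GET['lang']], true, 302);
            exit;
        }

        // Returning visitors with a stored preference skip the gateway entirely.
        if (isset($_COOKIE['site_lang']) && isset($localisedSites[$_COOKIE['site_lang']])) {
            header('Location: ' . $localisedSites[$_COOKIE['site_lang']], true, 302);
            exit;
        }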

    Read the article

  • Facebook likes reset after moving to HTTPS

    - by aarondicks
    I've got a question about the Facebook Like button. We worked on a piece recently that embeds a number of social share buttons (please see the source code below). When the piece was released it was served over HTTP and received over 2k likes (the URL 'slug' hasn't changed at all). The site was recently migrated to permanent-on HTTPS, and the like data has been reset; we've been left with 50 new, recent likes. As you can see in the source code, the URL is set explicitly to like the HTTP version, which I believe to be correct. Can anyone help me work out what's happened here? Here's the HTML for the Like button:
    <div class="fb-like" data-href="http://www.harveywatersofteners.co.uk/history-interior-design" data-layout="box_count" data-action="like" data-show-faces="false" data-share="false"></div>

    Read the article

  • Google Analytics shows wrong number of page views on an ASP.NET website

    - by f_karlsson
    Sometimes Analytics shows, for example, 4500 requests, and a few hours later it shows a few thousand fewer. What is wrong? It looks like Analytics corrects itself. I changed from classic Analytics to Universal Analytics a few months ago; I do not know if that has anything to do with this. In the master page:
    <script>
        (function (i, s, o, g, r, a, m) {
            i['GoogleAnalyticsObject'] = r;
            i[r] = i[r] || function () {
                (i[r].q = i[r].q || []).push(arguments)
            }, i[r].l = 1 * new Date();
            a = s.createElement(o), m = s.getElementsByTagName(o)[0];
            a.async = 1;
            a.src = g;
            m.parentNode.insertBefore(a, m)
        })(window, document, 'script', '//www.google-analytics.com/analytics.js', 'ga');
        ga('create', 'UA-xxxxxxxx-1', 'xxxxx.se');
        ga('send', 'pageview');
    </script>

    Read the article

  • Connect divs with (non-straight) lines [migrated]

    - by Snailer
    I'd like to develop my site with a layout that looks somewhat like houses with connected plumbing, or multiple computers connected to a network. Basically, there will be boxes floating in space, with lines connecting some of the boxes. I'd like these lines to have some turns in them as well (just simple 90-degree corners) rather than being straight lines. My question is what the best way is to achieve this, and perhaps a small example. My thoughts were to use:
    * PHP and CSS: I could create a background grid and then, with some complicated algorithms, draw paths using the grid's borders. This would be more dynamic, but I'm not sure I can work out the maths all by myself.
    * Just CSS: Perhaps this is as simple as making some pre-drawn line images like L-shapes and T-junctions, then just placing and scaling them. But I don't believe there's a way to scale an image by slicing it, so the line width would be scaled too and each image would look different.
    Any thoughts?
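
    As a rough illustration of the "PHP and CSS" idea, a 90-degree connector between two boxes can be built from two thin, absolutely positioned divs (one horizontal, one vertical). A minimal sketch with hypothetical coordinates and class names:

        <?php
        // Emit an L-shaped connector from (x1, y1) to (x2, y2): a horizontal
        // segment at y1, then a vertical segment meeting it at the corner (x2, y1).
        function lConnector(int $x1, int $y1, int $x2, int $y2, int $thickness = 2): string
        {
            $horizontal = sprintf(
                '<div class="wire" style="position:absolute;left:%dpx;top:%dpx;width:%dpx;height:%dpx;background:#333;"></div>',
                min($x1, $x2), $y1, abs($x2 - $x1) + $thickness, $thickness
            );
            $vertical = sprintf(
                '<div class="wire" style="position:absolute;left:%dpx;top:%dpx;width:%dpx;height:%dpx;background:#333;"></div>',
                $x2, min($y1, $y2), $thickness, abs($y2 - $y1) + $thickness
            );
            return $horizontal . $vertical;
        }

        // Example: connect a box at (120, 80) to a box at (360, 240).
        echo lConnector(120, 80, 360, 240);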

    Read the article

  • List of SEO tools [on hold]

    - by Felix
    I'd like to crowdsource a list of SEO platforms/tools. There is an abundance of options out there...
    * Analytics SEO
    * Blueprint
    * BrightEdge
    * Conductor
    * Ginzametrics
    * gShift Labs
    * RankAbove
    * Raven Internet Marketing Tools
    * Rio SEO
    * Searchmetrics
    * seoClarity
    * SEOlytics
    * SyCara
    For each tool suggestion, please provide a brief overview of what the tool is used for and what differentiates it from its competitors.

    Read the article

< Previous Page | 239 240 241 242 243 244 245 246 247 248 249 250  | Next Page >