Search Results

Search found 9728 results on 390 pages for 'zee pro'.

Page 220/390

  • Does CSS Positioning Affect SEO [duplicate]

    - by etangins
    This question is an exact duplicate of: Make Offscreen Sliding Content Without Hurting SEO [duplicate]. If I positioned the very first content that appears in my code below the fold, would that content be given less weight and therefore be less effective for SEO? Likewise, if a large image took up most of the top of the screen, pushing my content below the fold or toward the bottom of the screen, would that content be given less weight? Note: this is content that occurs early in my code. I'm not asking whether content that occurs later in a long page is given less weight, but whether content that occurs early in the code yet ends up below the fold is given less weight.

    Read the article

  • Can't get Rewrite rule to keep original URL

    - by user38100
    I have these rewrites, but I would like the URL to stay the same as what was typed originally. I thought removing the [R] flags would stop the redirect, but it hasn't:

        RewriteCond %{HTTP_HOST} ^examplea\.example\.com$ [NC]
        RewriteRule (.*) http://examplea.example.com:32400/web [L]

        RewriteCond %{HTTP_HOST} ^exampleb\.example\.com$ [NC]
        RewriteRule (.*) http://exampleb.example.com:9091 [L]

    Edit: would this work better?

        RewriteCond %{HTTP_HOST} ^hello.example.com$
        RewriteRule ^(/)?$ welcome [L]
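    One possible direction, not from the original thread: a RewriteRule whose substitution points at a different host or port is turned into an external redirect even without [R], so the address bar changes anyway. Keeping the typed URL usually means proxying the request instead. A minimal sketch, assuming mod_proxy and mod_proxy_http are enabled; the hostnames and ports are the ones from the question:

        RewriteCond %{HTTP_HOST} ^examplea\.example\.com$ [NC]
        RewriteRule ^(.*)$ http://examplea.example.com:32400/web/$1 [P,L]

        RewriteCond %{HTTP_HOST} ^exampleb\.example\.com$ [NC]
        RewriteRule ^(.*)$ http://exampleb.example.com:9091/$1 [P,L]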

    Read the article

  • Domain name made of keywords redirecting to main website's page

    - by ivanivan
    Let's say I have a website called books.com where I sell books. I've read on Redirecting different domains to your main site that it's not a bad idea to register another domain that does a 301 redirect to my website, like booksforsale.com. Now, say I want to target only a specific category within my website, like books.com/sci-fi/, so I register sci-fi-books.com and do a 301 redirect. Would this improve my search rankings? Thanks.
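    For reference, a sketch of the mechanics rather than a verdict on the SEO question: a whole-domain 301 to a category is usually a couple of lines in the extra domain's .htaccess, assuming Apache with mod_rewrite and using the example names from the question:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?sci-fi-books\.com$ [NC]
        RewriteRule ^(.*)$ http://books.com/sci-fi/ [R=301,L]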

    Read the article

  • Is it possible to pay customers with PayPal?

    - by CJxD
    Usually with PayPal we buy goods and services by sending money from customer to business. Now I want my business to pay my customers: I want to allow them to withdraw money from a virtual wallet on my website. I did notice there is an 'Adaptive Payments API' which mentions something vaguely similar to this, but I haven't been able to find enough information to come to any conclusion. So, is it possible to send money from business to customer (autonomously) with PayPal? If not, are there any alternatives?

    Read the article

  • Should I add rel nofollow to internal links which already have meta noindex?

    - by CamSpy
    Let's say I have a products page listing products, and the page has pagination. I would like the first page to have all the SE ranking weight, so I decided to put meta noindex on the rest of the paginated pages (from page 2 to N). My common sense says that if I don't want these pages to get indexed, I also shouldn't pass link/PR juice to them. (Is that smart?) What happens if I set rel="nofollow" on all pagination links from page 2 to page N?
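    For illustration only, and not from the question (the URL paths below are made up): the two mechanisms live in different places, so the setup being described looks roughly like this:

        <!-- in the <head> of /products?page=2 ... /products?page=N -->
        <meta name="robots" content="noindex, follow">

        <!-- pagination links on page 1, with the proposed nofollow -->
        <a href="/products?page=2" rel="nofollow">2</a>
        <a href="/products?page=3" rel="nofollow">3</a>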

    Read the article

  • Webmail with option to change password for email account?

    - by arma
    I've been testing out different webmail options (so far AfterLogic and Horde), and it seems there is no option for a user to change their password. It's really bad that I have to go to the server and manually change passwords for users. Is there any webmail solution that lets a user change their password so that it also changes on the server? Or is there a server setting I must enable first? Or is it not possible? EDIT: Note that I have a cPanel host.

    Read the article

  • Massive 404 attack with non-existent URLs. How to prevent this?

    - by tattvamasi
    The problem is a whole load of 404 errors, as reported by Google Webmaster Tools, for pages and queries that have never existed. One of them is viewtopic.php, and I've also noticed a scary number of attempts to check whether the site is a WordPress site (wp_admin) and to reach the cPanel login. I already block TRACE, and the server is equipped with some defense against scanning/hacking, but this doesn't seem to stop it. The referrer is, according to Google Webmaster Tools, totally.me. I have looked for a solution to stop this, because it certainly isn't good for the poor real users, let alone the SEO concerns. I am using the Perishable Press mini blacklist (found here), a standard referrer blocker (for porn, herbal, and casino sites), and even some software to protect the site (XSS blocking, SQL injection, etc.). The server is using other measures as well, so one would assume the site is safe (hopefully), but it isn't ending. Does anybody else have the same problem, or am I the only one seeing this? Is it what I think, i.e. some sort of attack? Is there a way to fix it, or better, prevent this useless waste of resources? EDIT: I've never used the question itself to thank people for the answers, and I hope this can be done. Thank you all for your insightful replies, which helped me find my way out of this. I have followed everyone's suggestions and implemented the following: a honeypot; a script that listens for suspect URLs in the 404 page and emails me the user agent/IP while returning a standard 404 header; and a script that rewards legitimate users, in the same custom 404 page, in case they end up clicking on one of those URLs. In less than 24 hours I was able to isolate some suspect IPs, all listed in Spamhaus. All the IPs logged so far belong to spam VPS hosting companies. Thank you all again; I would have accepted all answers if I could.
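    To make the approach in the edit concrete, a minimal sketch of the kind of custom 404 handler described, not the asker's actual script; the probe patterns and the notification address are assumptions:

        <?php
        // 404.php -- custom error document, wired up with "ErrorDocument 404 /404.php"
        header('HTTP/1.1 404 Not Found');

        $uri   = isset($_SERVER['REQUEST_URI'])     ? $_SERVER['REQUEST_URI']     : '';
        $ip    = isset($_SERVER['REMOTE_ADDR'])     ? $_SERVER['REMOTE_ADDR']     : '';
        $agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

        // Patterns typical of the probes described in the question (assumed list).
        if (preg_match('#(viewtopic\.php|wp-login|wp-admin|cpanel)#i', $uri)) {
            // Placeholder address: notify the admin about the probe.
            mail('admin@example.com', 'Suspect 404 probe', "URI: $uri\nIP: $ip\nUA: $agent");
        }

        echo 'Page not found.';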

    Read the article

  • What is the SEO-recommended method for using underscores and dashes in URLs that contain geographic locations?

    - by ElHaix
    In reading through this article: In Subfolder & File Names, Use Dashes, Not Underscores. Good: http://www.domain.com/sub-folder/file-name.htm. Bad: http://www.domain.com/sub_folder/file_name.htm. In my URLs, I may have one or two city names, ending with the province/state: Burnaby_New_Westminister-BC/[some search term]. My URL rules are currently defined such that everything after the dash is the province/state. Some geographic locations already contain dashes: Notre-Dame-de-Grâce (in QC), which I would convert to ~/Notre_Dame_de_Grace-QC/. I thought of placing the province/state after another "/"; however, in some cases the province/state name may not exist, giving ~/Notre_Dame_de_Grace/, so the first segment after the domain name contains the geo location {city, city_name-state}. I am now revisiting this and wondering if this rule set should change, and if so, what is the recommended way of implementing it? -- UPDATE -- After reviewing this video, I see that I should be using dashes rather than underscores. However, since I still want my geo locations in the first URL section, is there anything wrong with using a double-dash separator, i.e. /city-name--state/?
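    As an illustration of the double-dash idea rather than a recommendation from the thread: the separator stays unambiguous as long as single dashes are reserved for word breaks, so a rewrite can split on the first "--". A sketch in .htaccess-style mod_rewrite terms with made-up parameter names:

        # e.g. /notre-dame-de-grace--qc/some-search-term
        RewriteRule ^([a-z0-9-]+?)--([a-z]{2})/(.*)$ index.php?city=$1&state=$2&q=$3 [QSA,L]

        # City only, no province/state
        RewriteRule ^([a-z0-9-]+)/(.*)$ index.php?city=$1&q=$2 [QSA,L]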

    Read the article

  • Domain mapping issues

    - by Nadya
    I have two domain names, a .com and a .co.uk, bought with 123-reg, and just one student Windows hosting pack, associated with the .co.uk domain. The .com domain is the main one people will be trying to access, so I just mapped that domain to the hosting this morning. The problem is that I would really like it to be functional by tomorrow morning, and the usual waiting time is 24-48 hours. Is there any point in stopping the process and trying to forward it with a CNAME record instead; does that take less time? (I can go back and do proper domain mapping during the weekend.) Also, is there a way to check whether the domain mapping has been done correctly before these 24-48 hours are up? From some computers I get a 404 error on the homepage.
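    A generic way to check before the caches catch up, not from the original question, is to ask the authoritative name servers directly and compare their answer with what your local resolver returns. A sketch using dig; the domain and name server below are placeholders:

        # What the authoritative servers answer right now
        dig +short NS example.com
        dig +short www.example.com @ns1.your-registrars-nameserver.example

        # What your local (possibly cached) resolver answers
        dig +short www.example.com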

    Read the article

  • Is there a way to return a response every x seconds or so to a single http request?

    - by luis
    I'm wondering if it's possible to send a response every second or so to a single HTTP request. For example, the client makes an HTTP request, then the server sends a space character every second. This could be never-ending or limited, for example to a minute. I think the word 'response' is misleading in this context, since I don't necessarily mean a separate HTTP response each time: the whole HTTP response could be composed of the space characters, which would mean a single HTTP response to a single HTTP request, except that it takes a minute to complete. I tried chunked encoding but I don't think it works, or at least my implementation is wrong.
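    As a point of reference rather than the asker's code: most server-side languages can stream a long-lived response by flushing output as it is produced; the usual obstacles are output buffering in the language runtime and buffering in any proxy in front of it. A minimal PHP sketch, assuming the host allows buffering to be disabled:

        <?php
        header('Content-Type: text/plain');

        // Drop PHP-level output buffers so each byte leaves as soon as it is produced.
        while (ob_get_level() > 0) {
            ob_end_flush();
        }

        // One space per second, for one minute, on the same response.
        for ($i = 0; $i < 60; $i++) {
            echo ' ';
            flush();
            sleep(1);
        }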

    Read the article

  • Why is Cloudflare waiting for name servers for over 4 days?

    - by user29175
    I've registered for Cloudflare's free plan and have completed the process of redirecting the DNS as instructed, including changing the name servers. This was done 4 days ago. The problem is that Cloudflare is showing me, under "Websites": "Finishing up. Waiting for your name servers to change to * Please allow up to 24 hours to complete this process (info)", and under "Dashboards": "Analytics data could not be loaded. You do not have any initialized zones". I can see via traceroute that CloudFlare is the DNS for my site. Also, somehow this has messed up my Google Analytics account, so I have no idea whether I get visitors to my site or not. What should be done to fix this?
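    A generic check, not taken from the original thread: Cloudflare only marks a zone active once a public NS lookup for the bare domain returns the name servers it assigned, so it is worth confirming what the outside world actually sees. A sketch with dig; the domain is a placeholder:

        # Should list the *.ns.cloudflare.com servers assigned to the account
        dig +short NS example.com

        # Follow the delegation from the root, bypassing cached answers
        dig +trace NS example.com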

    Read the article

  • Simple SQL GROUP BY with custom groups question [migrated]

    - by alex
    Imagine a MySQL table that has only 2 columns, an id and the name of a color. With this query I know how many ids I have for each color:

        SELECT color_name, count(id) FROM color_table GROUP BY (color_name);

        red: 10
        blue: 5
        yellow: 3
        green: 1

    My question is: is there a way I can specify custom groups in the GROUP BY? I mean, is there a query that results in this?

        red: 10
        colors different than red: 9
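    One common way to get custom buckets, offered here as a sketch rather than as the thread's accepted answer, is to wrap the grouping column in a CASE expression and group by that:

        SELECT
            CASE WHEN color_name = 'red' THEN 'red'
                 ELSE 'colors different than red'
            END AS color_group,
            COUNT(id)
        FROM color_table
        GROUP BY color_group;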

    Read the article

  • Deleted files still accessible without www in url

    - by phlegma
    I have deleted all files and all hidden files from my server; there is nothing left but log files, which cannot be deleted. Ironically, the files are still accessible even though nothing is there. The cache has been cleared, and multiple browsers and computers/devices have been checked. The files show when I exclude "www" from the URL:

        http://sarastringfellow.com/assets/photo/c.jpg
        http://www.sarastringfellow.com/assets/photo/c.jpg

    What does this mean?
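    A generic first check, not part of the original post: when the bare domain and the www host behave differently, they often resolve to different servers (or an old record is still cached somewhere), which a DNS lookup can confirm. The hostnames below are the ones from the question:

        dig +short sarastringfellow.com
        dig +short www.sarastringfellow.com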

    Read the article

  • How do I get this to work? [closed]

    - by user1867842
    This is my code... it speaks for itself.

        <?php
        define("html","<html>");
        define("htmlEnd","</html>");
        etc... etc...
        ?>

    What I'm trying to do is make a wrapper for HTML's tags so they won't be needed anymore. But I can't get any of the attributes for HTML elements to be defined in PHP. This again speaks for itself... I don't know any other way of saying it. I guess what I'm asking is: how would I make another markup language like HTML, without any tags, but still keep everything about HTML?
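    For illustration, and not from the original question: constants can only hold fixed strings, so attributes are exactly where a small function works better than define(). A minimal sketch of such a wrapper; the function and parameter names are made up:

        <?php
        // Hypothetical helper: tag('a', 'click', array('href' => '/home'))
        // returns '<a href="/home">click</a>'.
        function tag($name, $content = '', array $attrs = array())
        {
            $attrString = '';
            foreach ($attrs as $key => $value) {
                $attrString .= ' ' . $key . '="' . htmlspecialchars($value, ENT_QUOTES) . '"';
            }
            return "<$name$attrString>$content</$name>";
        }

        echo tag('html', tag('body', tag('p', 'Hello', array('class' => 'intro'))));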

    Read the article

  • Why do spammers use CELESTRON NEXTAR 6SE?

    - by fmz
    I am running a website for a volunteer organization that hosts an annual event. There is a form where people can volunteer to bring items for the event. All too frequently I get spam from users across the globe who enter things like this:

        Country - 1: Australia    Material - 1: CELESTRON NEXTAR 6SE
        Country - 2: Australia    Material - 2: C8 Newton
        Country - 3: Australia    Material - 3: ETX 125EC
        Country - 4: Australia    Material - 4: ETX 125EC
        Country - 5: Australia    Material - 5: CELESTRON NEXTAR 6SE

    I don't really care about the country, but what is it with the telescope stuff? Is there some hidden meaning behind all this, or is it some astronomy group that moonlights as spammers?

    Read the article

  • Website creation preparation [closed]

    - by Loki
    I am in the pre-coding phase of creating a website. I know that it will be account-based (users have to register/log in to use the features). I also know that the server will have to perform certain timer-based operations, that is to say, users will have events that trigger at a point chosen by the user and then do something. I am searching for a good choice of server-side technology and was wondering what my options are and what the best choice is. I would prefer open technology and something that doesn't use interpreted languages (Java, .NET). My first thought is PHP + PGSQL for the server side and HTML + CSS + JS for the client, but I am still looking at my options.
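    As an aside on the timer-based part, not from the original post: with a PHP + PostgreSQL stack this is commonly handled by a cron-scheduled worker that fires whatever is due. A sketch with made-up table, column, and credential names:

        <?php
        // run_due_events.php -- hypothetical worker, scheduled with a crontab line such as:
        //   * * * * * php /var/www/app/run_due_events.php
        $db  = new PDO('pgsql:host=localhost;dbname=app', 'app_user', 'secret');
        $due = $db->query('SELECT id, action FROM events WHERE fire_at <= now() AND done = FALSE');
        foreach ($due as $event) {
            // ... perform $event['action'] here ...
            $db->prepare('UPDATE events SET done = TRUE WHERE id = ?')
               ->execute(array($event['id']));
        }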

    Read the article

  • URL Frame redirection CakePHP

    - by themanbehindoftheprojectmayhem
    I need to redirect my CakePHP installation host to my domain. Location of my CakePHP installation: myhosting.com/newsite/. Domain: www.mydomain.com. I'm currently using URL frame forwarding to direct www.mydomain.com to myhosting.com/newsite/. Problem: when I load www.mydomain.com, every link on the site points to the hosting location, for example myhosting.com/newsite/product/1, when it should point to www.mydomain.com/product/1. Any simple way to fix this? It's probably very simple to solve, but I can't bend it. Help much appreciated.
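    For orientation only, and not the thread's answer: URL frame forwarding just embeds the old address in a frame, so the application keeps building links from the host it is actually served on. The usual fix is to point the domain's DNS at the hosting account and serve the install directly from a virtual host for that domain. A hedged Apache sketch using the names from the question; the filesystem path and the webroot layout are assumptions:

        <VirtualHost *:80>
            ServerName www.mydomain.com
            ServerAlias mydomain.com
            DocumentRoot /path/to/newsite/app/webroot
            <Directory /path/to/newsite/app/webroot>
                AllowOverride All
            </Directory>
        </VirtualHost>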

    Read the article

  • Move site to new domain divided by language across subdomains

    - by mark
    I managed to find a nice domain for a fairly fledgling site of mine that actually hasn't been parked by scumbag squatters. Given the upcoming move, I'm thinking I'd take the opportunity to split the content across subdomains according to language, much like Wikipedia for example.

    Current:

        www.old-domain.com/en/subject    # English
        www.old-domain.com/subjecto      # Spanish (default, so no locale in the URL)

    Proposed:

        en.new-domain.com/subject
        es.new-domain.com/subjecto

    The advantage of doing this is a fairly competitive keyword, such that I may wish to put a copy of my application on a Spanish slice in order to gain a few SERPs. Also pure vanity. Google's Webmaster Tools allows me to move to the new domain, and I can add the root domain and the subdomains but forward to only one. I'll 301 from the old domain appropriately, but is there anything I should know about Webmaster Tools in this respect, where effectively I'm moving to two addresses? (Feel free to dissuade me from doing this in the comments if it's a bad idea.) I've now asked this same question on Google's forums.
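    Independent of the Webmaster Tools question, the 301 mapping itself is mechanical; a sketch in .htaccess terms using the example hostnames above, assuming Apache with mod_rewrite on the old domain:

        RewriteEngine On

        # English content lives under /en/ on the old domain
        RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
        RewriteRule ^en/(.*)$ http://en.new-domain.com/$1 [R=301,L]

        # Everything else was the Spanish default
        RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
        RewriteRule ^(.*)$ http://es.new-domain.com/$1 [R=301,L]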

    Read the article

  • How can I have more clicks than page views in AdSense

    - by ArcticLlama
    One of my AdSense ad units (in the new beta interface) occasionally shows more clicks than page views, which gives a CTR of over 100%. Does anyone know how this happens? I'm assuming it has something to do with when a page view is recorded versus when someone clicks, but it happens regularly enough (on a daily report) that it can't just be a handful of users clicking an ad before the page displays fully.

    Read the article

  • What are the different ways of making a Joomla! website mobile friendly?

    - by Treebranch
    I am involved in the development of a number of Joomla! websites, and we would like to make these websites mobile friendly. I have done a bit of searching online and can't seem to find any standard way of doing this. I have come across a few Joomla! extensions that claim to make a site mobile friendly for this device or that device. However, I am wary of just starting to try these out. Do any of you know of standard ways to make a Joomla! site mobile friendly?

    Read the article

  • Recommendation for a webshop with an API

    - by m.sr
    I'm searching for a webshop. The problem with my search is that the webshop software of my choice needs to have a usable API or some interface for external applications. For example, I need to place orders from an external application, or get product descriptions or warehouse stock from the external application. I would like a webshop where the web interface is just one way to interact with the whole system. There are some other requirements which have to be fulfilled, but I guess they are fairly common: running on Linux; MySQL (we already have MySQL replication and backup in place); I like open source, but I'm willing to pay for it if it's worth it. I found some webshops on the net, but perhaps you can tell me whether there's any hope of a webshop with a good API before I go and test all of them; at first glance I didn't find any docs about an interface to external applications for any of my search results. Thank you!

    Read the article

  • How to configure Google sitemap links in Wordpress? (without editing its HTML or PHP source code) [duplicate]

    - by Alexander Farber
    This question already has an answer here: What are the most important things I need to do to encourage Google Sitelinks? (5 answers). I run a WordPress 3.7.1-de_DE site, but don't have much experience with it yet. When my site comes up in a Google search, there are 2 links displayed underneath it. I believe these links are called Google sitelinks, and my question is how to configure them in WordPress. While the right link points to the /ueber-mich URL on the website, the left link was pointing to a non-existent /imprint, and I had to add that page as a workaround for now. And I'd like to change /imprint to the German /impressum anyway (currently I use mod_rewrite to redirect). UPDATE: Dear downvoters and movers, would you mind reading my question, please? My question is about how to configure Google sitelinks in WordPress. So it is NOT A DUPLICATE (I do not want to edit the HTML code, I want to find the correct configuration in WordPress), and my question SHOULDN'T HAVE BEEN MOVED AWAY from wordpress.stackexchange.com.

    Read the article

  • robots.txt disallow URL containing string with a '/' at the end

    - by thanili
    I have a website with thousands of dynamic pages. I want to use the robots.txt file to disallow certain URL patterns corresponding to pages with duplicate content. For example, I have a page for article itemA belonging to category catA/subcatA, with the URL /catA/subcatA/itemA; this is the URL that I want indexed by Google. The article is also visible via tagging in various other places on the web site. The URLs produced via tagging look like /tagA1/itemA; these URLs I do NOT want indexed by Google. However, I do want all the tag listings indexed: /tagA1. So how can I achieve this? Disallow URLs that contain a specific string followed by a '/'? /tagA1/itemA - disallow; /tagA1 - allow.
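    For reference, a sketch of the pattern being asked about rather than the thread's accepted answer: a Disallow rule with a trailing slash only matches paths that continue past that slash, so the bare tag listing stays crawlable:

        User-agent: *
        # Blocks /tagA1/itemA, /tagA1/anything-else, ...
        Disallow: /tagA1/
        # ...while /tagA1 itself (no trailing slash) remains allowed.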

    Read the article

  • IP fail-over address. Do I need it?

    - by Jon
    I received an email from my web hosting provider, where I have 2 dedicated servers, saying that from now on I have to pay for my IP fail-over addresses. The server we have hosts a tool used internally by our company. Traffic to it is quite low; no more than 3 people will use it at the same time. If something happens, we can wait a day to have the tool up and running again. Is it worth keeping these fail-over addresses? Thanks

    Read the article

  • Configuring httpd.conf to handle wildcard domains and multiple scripts?

    - by Steve
    I have a full-blown site like:

        http://www.example.com (uses index.php)
        http://www.example.com/scriptA.php
        http://www.example.com/scriptB.php

    I now want to have the possibility of setting up subsites like:

        http://alpha.example.com
        http://alpha.example.com/scriptA.php
        http://alpha.example.com/scriptB.php

    From http://stackoverflow.com/questions/2844004/subdomain-url-rewriting-and-web-apps/2844033#2844033 , I understand that I have to do:

        RewriteCond %{HTTP_HOST} ^([^./]+)\.example\.com$
        RewriteCond %1 !=www
        RewriteRule ^ index.php?domain=%1

    But what about the other scripts like scriptA and scriptB? How do I tell httpd.conf to handle those properly as well? How can I tell httpd.conf to handle everything after the forward slash exactly as it does on the main site, but pass a parameter flag like ?domain=alpha? (Cross-posted at: http://stackoverflow.com/questions/11365566/configuring-httpd-conf-to-handle-wildcard-domains-and-multiple-scripts)
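    One possible generalisation, a sketch rather than the accepted answer: capture the requested script name, append the subdomain as an extra query parameter, and keep any existing query string with QSA. The parameter name follows the question's ?domain= convention:

        RewriteEngine On

        # Root of a subdomain -> index.php?domain=<sub>
        RewriteCond %{HTTP_HOST} ^([^./]+)\.example\.com$ [NC]
        RewriteCond %1 !=www
        RewriteRule ^/?$ /index.php?domain=%1 [QSA,L]

        # Any script (scriptA.php, scriptB.php, ...) keeps its own name
        RewriteCond %{HTTP_HOST} ^([^./]+)\.example\.com$ [NC]
        RewriteCond %1 !=www
        RewriteCond %{QUERY_STRING} !(^|&)domain=
        RewriteRule ^/?(.+\.php)$ /$1?domain=%1 [QSA,L]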

    Read the article
