  • Why is my website not ranking on the first page of Google? [on hold]

    - by India SEO Analyst
    I am handling the website www.usamovingandstorage.com, targeting the keyword "chicago movers", but my site is on the third page. My site has good backlinks, and I recently removed irrelevant backlinks as well. I looked at competitors' websites such as www.ampolmoving.com and www.chicagomovers.com: they have no such strong backlinks, yet they rank on the first page of Google. I compared the three websites on www.opensiteexplorer.org, and my site shows good results there. So how has this happened? I need a full comparison: why is my site ranking on the third page, and what actions do I need to take to rank on the first page?

  • How should I study a competitor's off-page SEO?

    - by Chris Adragna
    What do I need to do, and with what tools, to learn what is working for a competitor off-page (please suggest both free and paid tools)? First of all, I suppose I want to see all of the sites linking in and what anchor text is used. Is there something that reports on the anchor text of inbound links, such as counting the keyword phrases used as anchor text? It would also be helpful to see where the PageRank is coming from, for example a list of inbound links sorted by the PR of the linking page. Lastly, if I'm missing something here in the way of off-page attributes, please say so.

  • SEO: meta description crawling issue [duplicate]

    - by user3707382
    This question already has an answer here: Meta Descriptions not working for google search (3 answers). I have the following code, in which I include the title and the meta description for the page, but Google crawled only the title, not the meta description; the description shown in search results was instead taken from keywords present in the page's HTML. Please tell me where I am going wrong:
        <!DOCTYPE html>
        <html>
        <head>
          <title>title inserted here</title>
          <meta http-equiv="Content-Type" content="text/html;charset=utf-8">
          <meta name="description" content="description here"/>
        </head>

  • Hosting server application for global SME

    - by BBe
    We are planning to set up a complete ERP and CRM system for a medium-sized global company; it might turn into an essential tool for all locations once deployed. For now those locations include the USA, Germany, China, and Indonesia, but the list is growing quickly. My question is: where is it best to physically locate the server so that access times are optimal from all (future) locations? In my mind, I am picturing multiple connected servers (a cloud?), where each of our users is served by the physically closest server. Being in a very competitive field, we would also like to rule out any data being stored in mainland China... Thanks for any advice and pointers!

  • What is the correct frequency for changing site content?

    - by SSRB
    What is the correct frequency for changing site content? Suppose I have a site, "Seven Sea", with five links: Home, About Us, Product, Sitemap, Contact Us. It is good for a site to change its content regularly, but is there a minimum or maximum frequency for doing this? If I change my content daily, is that good from an SEO point of view? Or if I change my content only once a year, is that bad for SEO? What is the better choice? A request: if this type of question has already been answered, give me the link to that answer rather than closing the question.

  • Apache: DoS with mod_deflate & range requests, Tomcat also? [migrated]

    - by VextoR
    I know that Apache has a security bug (http://seclists.org/fulldisclosure/2011/Aug/175). If you run this command:
        curl -I -H "Range: bytes=0-1,0-2" -s www.yandex.ru/robots.txt
    and it answers HTTP/1.1 206 Partial Content, that supposedly means the problem exists. But the fact is that for Apache Tomcat (our server) curl reports 206 Partial Content as well, so we need to fix it. I found a solution for Apache HTTP Server (.htaccess, mod_headers) but not for Tomcat. I'm a complete newbie with server matters, so please help.
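
    For context, that advisory describes CVE-2011-3192 ("Apache Killer"), a bug specific to Apache httpd's byterange filter; a 206 Partial Content response by itself only shows that a server honours Range requests, not that it is vulnerable. The workaround published for httpd drops abusive Range headers with mod_headers; a minimal .htaccess sketch, assuming mod_setenvif and mod_headers are enabled:

        # Flag requests asking for five or more ranges, then drop their Range header
        SetEnvIf Range (?:,.*?){5,5} bad-range=1
        RequestHeader unset Range env=bad-range

        # The legacy Request-Range header can be removed outright
        RequestHeader unset Request-Range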

  • RewriteRule for URL Subdirectory Root

    - by JYerdon
    Have not found this in my searches on SE. I need this scenario to work:
    • If a user visits someurl.com/news/somefolder or someurl.com/news/somefolder/, they get redirected to someurl.com/somefolder.
    • If the user visits just someurl.com/news or someurl.com/news/, they are allowed through to /news.
    Here is my current rule: RewriteRule ^news/(.*) /$1 [NC,R=301,L] How do I make it allow the second bullet point? The first seems to work with no issues. Thanks all!
    POST UPDATE: I now have this code:
        RewriteCond %{REQUEST_URI} ^news
        RewriteRule ^/news news/ [NC,L]
        RewriteCond %{REQUEST_URI} ^/news/(.*)$
        RewriteRule ^news/(.*) /$1 [NC,R=301,L]
    BUT it still doesn't allow me to go to the URL something.com/news/ Any thoughts?
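
    A minimal sketch of one way to express both requirements, assuming the rules live in the site root's .htaccess: match /news or /news/ exactly first and stop rewriting, then redirect anything deeper.

        # Leave /news and /news/ untouched
        RewriteRule ^news/?$ - [L]
        # Redirect /news/anything-deeper to /anything-deeper
        RewriteRule ^news/(.+)$ /$1 [NC,R=301,L]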

  • JavaScript-only search method [on hold]

    - by user2118228
    I need to put a search function on a website that is going to be distributed on a CD-ROM with no access to the internet. It has 80 pages and about 500 'items', so I'd prefer not to have to hard-code hundreds of if statements if possible. I've found a few programs you can buy that will index pages and generate results (Zoom Search, JSS Index, The German Guys') but there are odd quirks with each one. Besides, I would rather code it myself to get complete control over it and to really understand what it's doing. Basically, searching for a few words would display the product image and description; clicking on that would take you to the related URL. I can't find an easy solution that doesn't involve hundreds of if statements. Has anyone ever created anything like this, or does anyone know a better method? I've used PHP/MySQL for search results before, but this cannot run any PHP.
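
    One common approach that avoids per-item if statements: ship the index with the disc as a JavaScript array and filter it on every search. A minimal sketch, in which the field names, paths, and sample entry are all hypothetical:

        // One record per item on the disc (about 500 in total).
        var index = [
          { title: "Acme Big Screwdriver",
            url: "products/acme-big-screwdriver.html",
            image: "images/acme-big-screwdriver.jpg",
            keywords: "screwdriver philips hand tool" }
          // ...remaining entries...
        ];

        // Return every item whose title or keywords contain all the search terms.
        function search(query) {
          var terms = query.toLowerCase().split(/\s+/).filter(Boolean);
          return index.filter(function (item) {
            var haystack = (item.title + " " + item.keywords).toLowerCase();
            return terms.every(function (t) {
              return haystack.indexOf(t) !== -1;
            });
          });
        }

    Rendering is then a matter of writing each matching item's image, description, and link into the page.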

  • Microformats, Reviews and Duplicate Content

    - by Nicholas
    Let's say I have a site that sells widgets, and the URL structure is like so: /[type-of-widget]/[sub-type]/[widget-name]/ So a URL for a widget might be: /screwdrivers/philips-screwdrivers/acme-big-screwdriver/ We show reviews on the widget page and use the appropriate microformat data, so Google knows it's a review, etc. Now, what if I want to show random reviews on the "sub-type" and "type-of-widget" landing pages? Will Google ding me for duplicate content, or is it smart enough to know (based on the microformat data, etc.) that this is not duplicate content?

  • Learning PHP and MySQL [closed]

    - by Keith Groben
    I've been a designer for several years now, and I'm now very interested in learning PHP and MySQL. I've read through W3Schools already, and it helped me become familiar with the basics. I now want to know where I can go to start building some simple database and PHP applications, so I can become familiar with coding and start to improve. I think hands-on application is the best way to learn. Since I'm a beginner, much of what I've found so far is rather advanced for me.

  • How do I get dynamic subdomains on domain aliases in Plesk? [closed]

    - by mitchdesign
    I am running a site that is available via many domains: all domains serve the same code, but the requested domain determines which information is shown to the user. In Plesk, I have one domain set up, with domain aliases for all the others. This works fine. Now I'd like to be able to use subdomains on those aliased domains. Like the domains, the subdomains are used as input for the system to display the correct information, so they are not a fixed set. What I have done so far:
    • In DNS, I added *.example.com CNAME example.com to the main domain. Plesk automatically copies this to all the aliases, so DNS-wise I think it's fine.
    • I added a vhost.conf file to the main domain containing ServerAlias *.maindomain.tld. This works for the main domain, but still not for the aliased domains, and there is no place to put vhost.conf files for the aliased domains.
    How do I tell the server to respond to those domains' subdomains as well?

  • How hard is it to be the anonymous owner of a website?

    - by silla
    I'd like to create a website with a very radical political message. It won't be unethical (encouraging violence, etc.), but I feel the points I plan to make in it will definitely earn me a lot of enemies. How hard would it be to protect my identity and prevent anyone from finding out who I am? I know registrars usually offer an option of around $10/year for privatizing your registration information, but is there any other protection I should think about? Thanks!

  • Funnel Visualisation not showing drop out in steps

    - by cjk
    I have a site where users will (hopefully) follow this path:
        mysite.com/entity
        mysite.com/entity/pay
        mysite.com/entity/pay/details
        mysite.com/entity/pay/confirmation
        mysite.com/entity/pay/payment
        mysite.com/entity/paymentsuccessful?lotsofuniquestuff
    The entity section could be any one of a multitude of entries, but once a user is on a journey it stays consistent. I have a goal set up for this as a regular expression, with the match being .*PaymentSuccessful.*. This detects goal successes perfectly (as checked against data stored by the site directly). I have set the following steps within the goal:
        /[a-zA-Z0-9]+/pay
        /[a-zA-Z0-9]+/pay/details
        /[a-zA-Z0-9]+/pay/confirmation
        /[a-zA-Z0-9]+/pay/payment
    My funnel detects entry into the funnel (e.g. 47 users), of which say 10 complete the funnel. However, all drop-offs are shown after the first step, even though the exit page is listed as the second step! My data also suggests users are getting further into the funnel before failing to reach the goal. What have I done wrong?

  • Proper caching method with .htaccess

    - by mark075
    There are a lot of snippets that enable caching on a website, and I don't know which one I should use. The most popular is something like this:
        <IfModule mod_expires.c>
          ExpiresActive On
          ExpiresByType image/jpg "access 1 year"
          ExpiresByType image/png "access 1 year"
          ExpiresByType text/css "access 1 month"
          ExpiresDefault "access 2 days"
        </IfModule>
    I also found something similar, but with the keyword 'plus', like this:
        ExpiresByType image/png "access plus 2592000 seconds"
    What does it mean? I didn't find anything about it in the documentation. Another snippet I found:
        <IfModule mod_headers.c>
          <FilesMatch "\.(ico|jpe?g|png|gif|swf)$">
            Header set Cache-Control "max-age=2592000, public"
          </FilesMatch>
          <FilesMatch "\.(css)$">
            Header set Cache-Control "max-age=604800, public"
          </FilesMatch>
          <FilesMatch "\.(js)$">
            Header set Cache-Control "max-age=216000, private"
          </FilesMatch>
          <FilesMatch "\.(x?html?|php)$">
            Header set Cache-Control "max-age=600, private, must-revalidate"
          </FilesMatch>
        </IfModule>
    What is the best practice?
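
    One note on the 'plus' keyword: the mod_expires alternate syntax is "<base> [plus] <num> <type>", and the keyword plus is optional, so the following two directives should be equivalent:

        ExpiresByType image/png "access 1 month"
        ExpiresByType image/png "access plus 1 month"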

  • How to figure out the recent PageRank of websites or any particular page (homepage)

    - by rajesh.magar
    This question comes up because the recently published algorithm changes by Google have affected my website traffic, and I suspect my homepage PageRank has also dropped from 6 to 4 (I am not sure). I am not using any advanced SEO tools like SEOmoz or Majestic SEO, so it's quite difficult for me to confirm whether the PageRank has really been affected or not. Can anyone please suggest a good resource, tactic, or trick to address this question? Thanks!

  • How to stop browsers from rejecting my downloads

    - by melki0795
    I have a portfolio site where I host some of my work so people can download it. Some of these files are .exe executables, and some are .jar executables run through batch files. When a user tries to download my apps, the browser says that the file is not commonly downloaded and may be harmful, and therefore blocks the download. If I zip the folders, it still does the same thing; whatever format I choose, the downloads are blocked. How can I stop Chrome from doing this? Is there a way I can verify my files so they will be considered trusted? Thanks in advance!

  • One-click backup for my websites

    - by Si Philp
    I have a Windows reseller account that I only really use for personal purposes. The host company doesn't currently offer a one-click backup. I am looking for something to automate a backup that does the following: backs up all files; backs up all databases. I know other companies offer such a tool, but I am not looking to switch hosts. I have thought about writing a tool that does this myself, but I figured there might be something out there that does it already?

  • Fresh start outside Google's crapbox [on hold]

    - by Krzysztof Minister Bytu
    I might have been experimenting with my website too much: Google first cut the flow of visitors considerably, and now I haven't had a single one for four days. It's a joke that they've done this, because I've put a lot of work into the site, but that's a topic for another day. My question is about avoiding this in the future. I want to take the partly improved design from that website to a new one, under a new domain name. The question is: in that case, do I have to change the hosting option as well (the hosting address contains my old website's name), or is changing the domain enough for Google to treat it as something new from a "fresh user"? In other words, does Google look behind the domain to the actual hosting address? I'd hate to waste another few months of hard work, so I'd rather take every possible precaution, but not paying for another hosting plan would make things easier on the wallet.

  • Using Drupal to build a directory listing? [on hold]

    - by Jim
    I am trying to create a form inside Drupal that will let me build a directory similar to this diagram: http://i.imgur.com/EtChBbG.jpg I tried looking into HTML tables, but they are too basic for what I'm trying to do. How do I create a directory that can archive data in alphabetical order? It also has to be able to sort by letter and by other categories. Does anyone have an idea of how I should go about doing this? Thanks!

  • I want something ready to start with

    - by BDotA
    I am looking for something quick, like a weblog on WordPress or maybe Blogspot, where I can simply post what I write. For example, if I write something about .NET, Java, or databases, it would be a quick tutorial with some small code samples that visitors can use. I don't know anything about web design, and I just want a ready-made thing to use for this purpose. What do you suggest? Are there any examples I can take a look at?

  • When removing a bunch of files from a website, is 301 to the start page appropriate? [closed]

    - by Uwe Keim
    Possible Duplicate: What to do with random pages after a 301 redirect? We are currently reworking the content of a website for one of our products. Since we are repositioning the product, we would like to delete roughly 50% (around 200) of the website's pages. My questions, in terms of SEO, are: Would it have a negative effect if Google suddenly sees the large drop in pages? Is it OK to serve a 301 to the start page of the website for all removed pages?
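
    If the blanket redirect is the chosen route, it can be expressed in one line of Apache configuration; a mod_alias sketch, assuming (hypothetically) that the removed pages share a /legacy/ prefix:

        # Permanently redirect everything under /legacy/ to the start page
        RedirectMatch 301 ^/legacy/ /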

  • Website Remodel Redirects

    - by inKit
    We've recently built a site for a new client who has not carried over all the content from their old site. A lot of the content is also dynamic, with IDs that do not match between the old site and the new one. We have added dynamic redirects for most of the patterns we could find among pages that were 404ing, but there are still many pages, some that had content and some with just jumbled URLs, that we cannot match to content pages on the new site. Is it better to redirect these leftover pages to the homepage, or to leave them 404ing?

  • SEO - Hidden content before main site content

    - by 0pt1m1z3
    I have two hidden divs before my main site content: one with the login form and another with the signup form. Login and signup buttons within the page use jQuery to show or hide these divs. I like the effect this setup offers, dropping down from the top of the page and pushing the rest of the content down. However, I have recently been getting serious about SEO, and I am wondering if these divs have been affecting my SERP rankings. Basically, every logged-out page (everything bots see) has the same two display:none; divs at the top of the document flow. Is that bad? Should I re-engineer these forms and the way they are displayed?

  • Google Analytics - how to track clicks on a screen?

    - by milesmeow
    Can I track the click of every link, button, dropdown select, etc. on a screen and have it tracked in Google Analytics? I want to create a page and collect data on which widgets users use most. What about AJAX interactions? If you're using jQuery or MooTools, can you get their handlers to register a fake URL with GA based on user interaction? I used to do this with Flash: every time you clicked a button, it could initiate a fake URL request, and I would make URLs such as ".../customize/eyes/" or ".../customize/nose". Just wondering if I can do the same with JavaScript on the page. I've also posted this at StackOverflow.
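
    With the classic asynchronous Google Analytics snippet, exactly this kind of "fake URL" can be pushed as a virtual pageview from any click handler. A minimal jQuery sketch, in which the .widget selector and the /customize/ URL scheme are hypothetical:

        // Report a virtual pageview whenever an element with class "widget" is clicked.
        $('.widget').click(function () {
          if (window._gaq) {
            _gaq.push(['_trackPageview', '/customize/' + this.id]);
          }
        });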

  • Online job application

    - by Fred
    I am trying to add a feature to my site where I can post job openings at my company and allow people to apply online. Can someone recommend a service or an existing app for this purpose? I tried googling it, but could not find a set of search terms that did not return endless sites aimed at job seekers. This is a (very) small business and I do not expect to have more than a few openings at any time; what I am actually interested in is having a repository of interested job seekers on file. Then, when people ask me about openings, I could just refer them to the page and they could apply. If we have an opening and cannot fill the position from that list, we could post the job and advertise to fill it.
