Search Results

Search found 20852 results on 835 pages for 'local seo'.

  • [SEO] sitemap.xml: What is the precision of the priority field?

    - by Christoph
    Unfortunately the specification says nothing about precision. The XML schema definition states that the field is of type xsd:decimal: <xsd:restriction base="xsd:decimal"> <xsd:minInclusive value="0.0"/> <xsd:maxInclusive value="1.0"/> </xsd:restriction> I have a sitemap generator that uses up to 10 positions after the decimal point, where often only the last few positions differ. These numbers are perfectly valid according to the XSD, but I have found some pages (3, 4) that state that only 0.0, 0.1, 0.2, ..., 1.0 are valid values. How will the search engines react to such a sitemap? Will some just round the value? I know it is unlikely that anyone can answer that question unless they work for one of the search engines, but I think experience reports will do as well.
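
    A practical workaround, if the coarse 0.0–1.0 scale worries you, is to have the generator clamp and round priorities to one decimal place before writing the sitemap, so the output is valid under even the strictest reading. A minimal Python sketch (the URLs and priority values are made up for illustration):

        # Emit sitemap <url> entries with priorities clamped to [0.0, 1.0]
        # and rounded to one decimal place.
        pages = [
            ("http://example.com/", 1.0),
            ("http://example.com/a", 0.6472918345),
            ("http://example.com/b", 0.6472918346),  # differs only in the last position
        ]

        entries = []
        for loc, priority in pages:
            p = max(0.0, min(1.0, priority))  # stay inside the xsd:decimal bounds
            entries.append(
                "  <url>\n"
                f"    <loc>{loc}</loc>\n"
                f"    <priority>{p:.1f}</priority>\n"
                "  </url>"
            )

        print('<?xml version="1.0" encoding="UTF-8"?>')
        print('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">')
        print("\n".join(entries))
        print("</urlset>")

    Note that the two near-identical priorities above both come out as 0.6, which is presumably what a rounding search engine would see anyway.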

  • SEO - Do Google and other search engines index links within <noscript> tags?

    - by Joe
    I have set up some dropdown menus allowing users to find pages on my website by selecting options across multiple dropdowns, e.g. color of car and year. This would generate a link like: mysite.xyz/blue/2010/ The only problem is, because this link is dynamically assembled with JavaScript, I have also had to assemble each possible combination from the dropdowns into a list like: <noscript> No javascript enabled? Here are all the links: <a href='mysite.xyz/blue/2009/'>mysite.xyz/blue/2009/</a> <a href='mysite.xyz/blue/2010/'>mysite.xyz/blue/2010/</a> <a href='mysite.xyz/red/2009/'>mysite.xyz/red/2009/</a> <a href='mysite.xyz/red/2010/'>mysite.xyz/red/2010/</a> </noscript> My question is: if I put these links in a <noscript> tag like this, will I be penalized by search engines such as Google? I have already been doing so for some navigational elements which required offsets etc. However, now I would be listing a whole list of links here too. I want to provide them here mostly so that Google can actually index my pages, but those without JavaScript can still navigate too. Your thoughts? Also, even though some links appear to have been indexed, I am not 100% sure, which is why I'm asking :P

  • Best SEO method for dates in URL structure? [closed]

    - by Haroldo
    I'm working on an events website, so dates are very important search terms, e.g. "what's on on Fri 14th September". I've seen it done in various ways, for example: domain/whats-on/city-hall/14-09-2010/event-name.html domain/whats-on/city-hall/2010/09/14/event-name.html The first is "shallower"; the second could be easier for Google to recognise as a date. Does anyone else have experience or input?

  • How to deal with missing items the SEO way?

    - by Brandon Montgomery
    I am working on a public-facing web site which serves up articles for people to read. After some time, articles become stale and we remove them from the site. My question is this: what is the best way to handle the situation when a search engine visits a URL corresponding to a removed article? Should the app respond with a permanent redirect (301 Moved Permanently) to an "article not found" page, or is there a better way to handle this?
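
    For context, the usual options here are a 301 to related content, a plain 404, or a 410 Gone, which signals that the removal is deliberate. A minimal Flask-style sketch of how the three responses might be wired up (the route, data structures, and names are hypothetical, not taken from the original site):

        from flask import Flask, abort

        app = Flask(__name__)

        ARTICLES = {1: "A current article"}   # live content
        REMOVED_IDS = {2, 3}                  # articles we deliberately took down

        @app.route("/articles/<int:article_id>")
        def show_article(article_id):
            if article_id in ARTICLES:
                return ARTICLES[article_id]
            if article_id in REMOVED_IDS:
                # 410 Gone tells crawlers the page existed and was removed
                # on purpose, so it can be dropped from the index.
                abort(410)
            abort(404)  # never existed

        if __name__ == "__main__":
            app.run()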

  • SEO problem for new dictionary site: Google hasn't indexed the content

    - by John
    I loaded about 15,000 pages, letters A & B of a dictionary, and submitted a text sitemap to Google. I'm planning on using Google's search with advertisement as the mechanism for navigating my site. Google Webmaster Tools accepted the sitemap as good, but then did not index the pages. My index page has been indexed by Google, but at this point it does not link to any of the content pages. So, to get Google's search to work, I need to get all my content indexed. It appears Google will not index from the sitemap alone, so I was thinking of adding pages that spider in links from the main index page. But I don't want to create a bunch of pages that programmatically link all of the pages without knowing whether this has a chance of working. Eventually I plan on having about 150,000 pages, each page being a word or phrase being defined. I wrote a program that is pulling this from a dictionary database. I would like to show the content I have to anyone interested, to demonstrate the value of the dictionary in relation to the dictionary software that I'm completing. Any suggestions for getting the entire site indexed by Google so I can appear in the search results? Thanks
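
    One way to try the "spider-in links" idea without hand-building pages is to generate a shallow alphabetical index: the home page links to one page per letter, and each letter page links to its definitions, so every entry sits two clicks from the root. A rough sketch, assuming a plain list of words pulled from the dictionary database (the file names and URL scheme are invented):

        import collections
        import html

        words = ["aardvark", "abacus", "ballast", "banjo"]  # from the database

        by_letter = collections.defaultdict(list)
        for word in sorted(words):
            by_letter[word[0].upper()].append(word)

        # One static page per letter, each linking to its definition pages,
        # so a crawler can reach every entry from the home page in two hops.
        for letter, entries in by_letter.items():
            links = "\n".join(
                f'<li><a href="/define/{html.escape(w)}">{html.escape(w)}</a></li>'
                for w in entries
            )
            with open(f"index-{letter}.html", "w", encoding="utf-8") as f:
                f.write(f"<html><body><h1>{letter}</h1><ul>\n{links}\n</ul></body></html>")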

  • How do I change the domain name of my AD DS? [closed]

    - by Gaate
    I recently set up a server with AD DS and used a mydomain.local address for it. I would now like to be able to access the server through Remote Desktop from outside my local network, so I have purchased a domain name, set it up with my router for DDNS, and forwarded it to the IP of my server. I was wondering a few things. A) Is there a way I can point the DDNS name at my current AD DS x.local address, so I wouldn't have to change the domain to log in from outside the local network? B) If there is not a way to do what I mentioned above, what is the easiest way to change the domain name (mydomain.local) in AD DS? Should I completely remove it, or is there a way to change it? I am using Windows Server 2012.

  • Successfully Deliver on State and Local Capital Projects through Project Portfolio Management

    - by Sylvie MacKenzie, PMP
    While the debate continues on Capitol Hill about which federal programs to cut and which to keep, communities and towns across America are feeling the budget crunch closer to home. State and local governments are trying to save as many projects as they can without promising too much to constituents – and they, in turn, want to know where their tax dollars are going. Fortunately, with the right planning and management, you can deliver successful projects and portfolios on a limited budget. Watch the replay of our recent webcast with Oracle Primavera Industry and Product Manager Garrett Harley, which demonstrates how state and local governments can get the most out of their capital projects, and learn how two Oracle Primavera customers have implemented project portfolio management practices to:
    - Predict the cost of long-term capital programs and projects
    - Assess risk and mitigation strategies
    - Collaborate and track performance across government agencies
    Speakers:
    - Garrett Harley, Industry and Product Manager, Oracle Primavera
    - Cory Davis, Director of Capital Renovation and New Construction, Chicago Public Schools
    - Julie Owen, PSP™, CCC™, Sr. Project Controls Manager, LA Metro Transit Authority

  • GNU Screen with a local scrollback buffer?

    - by Hugh Perkins
    I'm using a remote server over a very slow and unreliable network connection, so I want to use GNU Screen in order not to lose what I'm doing whenever I get disconnected. But I want a local scrollback buffer, on my local computer, so that scrolling back doesn't have to go across the network, which is incredibly slow. Is there either something like GNU Screen but with a local scrollback buffer, or else a way of using GNU Screen with a local scrollback buffer?

  • sshfs mount won't start from /etc/rc.local

    - by Alex Flo
    I have the following commands in /etc/rc.local:

        chmod 666 /dev/fuse
        chmod +x /usr/bin/fusermount
        /bin/su someuser -c "/usr/bin/sshfs someuser@someserver:/usr/local/storage /usr/local/storage_remote -o nonempty -o reconnect"

    If I run them from the command line as root, they work. If I reboot the server, they won't run from /etc/rc.local. I'm trying to figure out what I'm doing wrong, but I don't have console access and I couldn't find any sshfs-related errors in /var/log.

  • Exchange 2010 internal Autodiscover: migrate away from the current .local DNS name

    - by Bryan
    We have an Exchange 2010 server, running within our Active Directory domain, with an internal hostname of server.example.local. The server is configured for Outlook Anywhere, but currently has a self-signed certificate with a name of server.example.local installed. Internally, clients connect and work fine, but externally we are getting certificate errors, as you would expect. I'm about to purchase a UCC SSL certificate to install on the server, with all the relevant SANs on the certificate to correct this, but due to the obvious problem of obtaining a trusted certificate with .local as a subject alternative name, I'm looking to configure clients on the internal network so that they don't use any reference to the .local hostname.

    I've configured our external DNS name for the server as exchange.example.com, and have created a CNAME for autodiscover.example.com which also (correctly) points to exchange.example.com. I've also configured internal DNS records for these two hostnames which point to the internal interface of the same server. I don't anticipate any problems here. I'm now trying to reconfigure Autodiscover internally, so that Outlook attempts to connect to exchange.example.com. I've followed the steps in KB940726 to prepare for this, and this appeared to work fine: no errors were generated and I was able to verify the CAS name in AD using ADSI Edit. I've just tried testing this with a newly created test user account, complete with a new Exchange mailbox, and Outlook 2007 connects fine on the internal network, but looking deeper in the Exchange profile, Outlook is still resolving the server name as server.example.local. Could it be the self-signed certificate that is causing Outlook to display the server name as server.example.local, or is there still something wrong with my internal Autodiscover configuration?

    Edit: I've proven it isn't the certificate that is responsible for Outlook returning server.example.local, by installing another self-signed certificate with a name of test.example.com. When creating a new Outlook profile, I get the mismatch error I'm expecting, but after accepting the certificate and finishing the configuration of the Outlook profile, it still shows server.example.local as the server name. This means that if I were to purchase the UCC certificate now, external clients would work fine, but internal clients would show a certificate name mismatch. Any ideas where to start diagnosing this?

  • Implement an abstract class as a local class? Pros and cons

    - by sinec
    Hi, for some reason I'm thinking of implementing an interface within some function (method) as a local class. Consider the following:

        class A {
        public:
            virtual void MethodToOverride() = 0;
            virtual ~A() {}  // virtual destructor added so that delete through an A* is well-defined
        };

        A * GetPtrToAImplementation() {
            class B : public A {
            public:
                B() {}
                ~B() {}
                void MethodToOverride() {
                    // do something
                }
            };
            return static_cast<A *>(new B());
        }

        int _tmain(int argc, _TCHAR* argv[]) {
            A * aInst = GetPtrToAImplementation();
            aInst->MethodToOverride();
            delete aInst;
            return 0;
        }

    The reasons why I'm doing this are:
    - I'm too lazy to implement class B in separate files
    - MethodToOverride just delegates the call to another class
    - class B shouldn't be visible to other users
    - no need to worry about deleting aInst, since smart pointers are used in the real implementation

    So my question is: am I doing this right? Thanks in advance!

  • Can rel="next" and rel="prev" be ignored in blog listings?

    - by Saahil Sinha
    We have a blog which is currently spread across 9 pages. Every page has a unique title – page 1, page 2, page 3 and so on – and, as it's a blog, every page has 10 unique listing entries. Can rel="prev" and rel="next" be safely ignored, given that these are listing pages and not the content pages of an article? Everything I have read in Google's documentation says that rel="next" and rel="prev" should be applied where content is spread across multiple pages. But, as it's a blog, it has blog listings, and every listing has unique content. This is the blog: http://www.mycarhelpline.com/index.php?option=com_easyblog&view=latest&Itemid=91. Could you advise whether, by ignoring rel="next" and rel="prev", we are inviting Google to treat the blog listing pages as duplicates?

  • Can Google “see” this custom JavaScript code which displays links from an external site on mine?

    - by webmasters
    I have JavaScript code on my site which displays links from another site. This is what my source contains before the page loads: <script language="JavaScript" type="text/javascript">showLink(1);</script> This is what I copied from my source after the page has loaded: <script language="JavaScript" type="text/javascript">showLink(1);</script><a rel="nofollow" target="_blank" class="anc" href="http://x5.external_site.net/sc/out.php?s=5483&amp;o=http%3A%2F%2Fwww.bluetooth.com">Bluetooth Devices</a> Can Google see this link?

  • Interesting links week #24 and #25

    - by erwin21
    Below is a list of interesting links that I found this week:

    Interaction:
    - Design Usability and All About It
    Frontend:
    - CSS Lint – CSS Cleaning Tool
    - 10 HTML Entity Crimes You Really Shouldn’t Commit
    Development:
    - OWASP Top 10 for .NET developers part 7: Insecure Cryptographic Storage
    - C#/.NET Fundamentals: Choosing the Right Collection Class
    Mobile:
    - Tips to Design a Website for Mobile
    Marketing:
    - 30 (New) Google Ranking Factors You May Over- or Underestimate
    Other:
    - 5 Little-Known Web Files That Can Enhance Your Website

    Interested in more interesting links? Follow me on Twitter: http://twitter.com/erwingriekspoor

  • Googlebot sees “Sorry, we have no imagery here” on pages with Google Maps

    - by friism
    I have a site with Google Maps on most of its pages. When inspecting content keywords in Google Webmaster Tools, the content keywords identified by Googlebot for the site include "imagery", "sorry" and "here". These turn out to be part of an error message returned by Google Maps: "Sorry, we have no imagery here". I cannot reproduce this error with normal clients, nor does "Fetch as Google" show it. The problem is presumably that Googlebot tries to execute some of the Google Maps JavaScript but then shoots itself in the foot and records the error message. A Google search for "Sorry, we have no imagery here" shows that this problem is endemic to sites across the internet, including Yelp and many others. I'd like to convince Google that my site is not about imagery and being sorry, but I'd also like to keep the maps in place. I guess one option would be to switch to static maps, but that's not a great alternative. There's some related discussion on WebmasterWorld, with no resolution.

  • Should each page of a blog listing have its own title?

    - by RandomBen
    Should example.com/Blog?Page=1, example.com/Blog?Page=2, etc. have the same title? I have done some research on this: SEOmoz's tools say I have duplicate titles, and so does Google Webmaster Tools. But if you look at top-end examples like http://www.seobook.com/blog and http://www.seomoz.org/blog, they both use the same title across all of their ?Page=X URLs. So what is the better choice, or does it even matter?

  • When the canonical page itself changes URL

    - by lulalala
    This is a continuation of the question "How to handle canonical URL changes like Stack Overflow". Say I have the canonical URL: questions/11/car <---canonically-linked-from--- questions/11/ What will happen if I want to change the canonical URL to questions/11/car-with-sgx? Obviously, questions/11/ will point to the new canonical URL. But how should the old questions/11/car change to the new one? There are two ways: 301 redirect the old canonical URL to the new canonical URL, or have the old canonical URL canonically link to the new canonical URL. According to this post: "[By using a canonical link instead of a redirect,] OldPage.html's rankings will drop over time due to fewer internal links, but the canonical tag won't make it disappear entirely. It could theoretically remain in their index until one of the following occurs: it is redirected permanently via 301; it returns a 404 for an extended period of time (they will keep checking for a while before dropping a URL); a meta robots 'noindex' tag is added." If this is true, I really need to redirect from the old canonical URL to the new canonical URL, which means I need to keep a log of the previous canonical URLs of this content so I know what to redirect. This is a bit of a hassle to do.
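
    If the 301 route wins out, the bookkeeping can be light: since the question id is already in the URL, any request whose slug differs from the current canonical slug can be redirected without storing the full slug history. A hedged Flask-style sketch (the route shape and lookup table are hypothetical):

        from flask import Flask, abort, redirect

        app = Flask(__name__)

        CURRENT_SLUG = {11: "car-with-sgx"}  # question id -> current canonical slug

        @app.route("/questions/<int:qid>/<slug>")
        def question(qid, slug):
            current = CURRENT_SLUG.get(qid)
            if current is None:
                abort(404)
            if slug != current:
                # Old canonical URL (e.g. /questions/11/car): 301 to the new one
                # so its ranking signals are transferred rather than decaying.
                return redirect(f"/questions/{qid}/{current}", code=301)
            return f"Question {qid}"

    The id acts as the stable key, so no log of previous canonical URLs is needed: every outdated slug, however old, redirects in one hop.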

  • Is Google Analytics part of Google's search engine algorithm?

    - by ub3rst4r
    I was wondering if anyone knows whether Google uses the data it receives from Google Analytics to help determine a website's position in the SERPs (Search Engine Results Pages). For example, if my website gets 1,000 visitors from Canada and only 100 visitors from the USA, does that mean my website will be ranked higher on Google.ca and lower on Google.com? And if a website uses Google Analytics, will it be ranked higher for its organic search keywords?

  • Adding slugs to URLs after the fact

    - by altuure
    We have had a website for the last 5 months and we did not use slugs on the bottom-level elements, so URLs looked like /apps/webmasters/badges/1100. Would it make sense to add the name to the URL at this point and redirect to the new ones? I am interested in matching more search terms and increasing page rank: /apps/webmasters/badges/1100 would redirect to, and be served at, /apps/webmasters/badges/1100-supporter. Or should I keep the old URLs as they are and create new URLs with slugs? I would also appreciate some advice on URLs already shared on Facebook or Twitter in those cases. Thanks in advance...
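
    If you do add slugs, a common pattern is to keep the numeric id authoritative and 301 the old slug-less form to the new one, so links already shared on Facebook or Twitter keep working and pass their value on. A sketch under those assumptions (the badge lookup is invented for illustration):

        from flask import Flask, abort, redirect

        app = Flask(__name__)

        BADGE_SLUGS = {1100: "supporter"}  # badge id -> slug

        @app.route("/apps/webmasters/badges/<int:badge_id>")
        def badge_without_slug(badge_id):
            if badge_id not in BADGE_SLUGS:
                abort(404)
            # Old slug-less URL keeps working, but permanently redirects
            # to the slugged form.
            return redirect(
                f"/apps/webmasters/badges/{badge_id}-{BADGE_SLUGS[badge_id]}",
                code=301,
            )

        @app.route("/apps/webmasters/badges/<int:badge_id>-<slug>")
        def badge(badge_id, slug):
            if badge_id not in BADGE_SLUGS:
                abort(404)
            return f"Badge {badge_id}: {BADGE_SLUGS[badge_id]}"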

  • Can I use <link> tags in the body of an HTML document?

    - by Edward Touw
    Can I use <link> tags in the body of an HTML page? I tried to find the answer to this question, but found contradictory information. When adding Schema.org microdata markup to an HTML page, I want to add canonical info in a link tag like this: <div itemscope itemtype="http://schema.org/Book"> <span itemprop="name">The Catcher in the Rye</span>— <link itemprop="url" href="http://en.wikipedia.org/wiki/The_Catcher_in_the_Rye" /> by <span itemprop="author">J.D. Salinger</span> </div> I got the example code above from Schema.org. According to them, this is the way to go for people who want to add a canonical reference to an itemprop but don't want to place a visible hyperlink on their website. The W3C, however, clearly states that <link> tags should only be placed within the head section, thus making the Schema.org example invalid. If I want to stick to correct markup, which advice should I follow?

  • A frequently updated mixed-bag blog OR several seldom-updated niche sites?

    - by Melanie
    Background: I am a member of the website HubPages, where I have about a hundred articles (and I'm always writing more). HubPages' revenue model is a 40% ad share for them and a 60% ad share for users. While the userbase there is really friendly, the site is REALLY slow and buggy, and there is a ton of content on HubPages that is copied from other sources. After flagging these articles, it takes a ton of time for the mods to remove them, and it's just generally dragging down my stuff. Furthermore, HubPages was hit really hard by Google's Panda update: http://www.google.com/search?hl=en&rlz=1B3GGLL_enUS426US426&tbm=nws&q=google+panda&

    Aside from the temporary problems I would deal with when removing content from HubPages and putting it on my own domain (duplicate content, etc.), I have another problem: which option would be best for my articles? I have tons of articles in a wide variety of niches and would like to do whatever would help them perform best. I'm not a huge niche writer, and I have received wide criticism from the HubPages community that my articles don't perform as well as they could because I don't use enough keywords within the text. I prefer to write more naturally, in a way that appeals to an audience, rather than keyword-stuff. Anyway, this is beside the point.

    My question: after removing my articles from HubPages, should I put them on one domain or spread them across multiple domains, grouped roughly by topic? For example: a-bunch-of-articles.com OR travel-articles.com and financial-articles.com and knitting-articles.com (I know those domains aren't available, but it's just kind of an example.) Here are the pros and cons of each:
    - a mixed-bag site like a-bunch-of-articles.com may not perform as well because of its mixed-bag nature
    - a mixed-bag site would be updated far more frequently than several niche sites; some niche sites might be updated so infrequently that a year could pass before one sees a new article
    - a mixed-bag site would be like putting all my eggs in one basket, whereas having several niche sites would spread out my portfolio, so to speak
    - a mixed-bag site would be cheaper: $14 (two-year registration) to start out with, plus hosting, and I'm good to go
    - a mixed-bag site wouldn't allow me to easily target keywords, but then again, isn't HubPages pretty much a mixed-bag site?
