Search Results

Search found 9960 results on 399 pages for 'iwork pages'.


  • Google Webmaster Tools Index dropped to Zero [closed]

    - by Brian Anderson
    Earlier this year I rebuilt my website using ZenCart. Immediately I saw the index status drop from 59 pages to 0. I then signed up for Google Webmaster Tools and watched the index status take that dramatic drop and never recover. I have worked to add content, and I know I am not done, but I have not seen any recovery of the index since. What confuses me is that when I look at the sitemap status under Optimization, it shows 1239 pages submitted and 1127 indexed. Most of my pages have fallen off page one for relevant search terms, and some are as far back as page 7 or 8 where they used to be on the first page. I have made some changes to robots.txt and sitemap.xml in the past week, but have not seen any improvement. Can anyone tell me what might be going on here? My website is andersonpens.net. Thanks! Brian
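
    One thing worth ruling out first, since the drop coincided with the rebuild: a rebuilt site sometimes ships with a leftover development robots.txt, and a single Disallow: / is enough to de-index everything. A healthy file looks roughly like this sketch; the /admin/ path is an assumption for illustration, and the sitemap is assumed to live at the default location:

        User-agent: *
        Disallow: /admin/
        Sitemap: http://www.andersonpens.net/sitemap.xml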

    Read the article

  • Splitting a sitemap by content type

    - by James
    I am currently tasked with submitting our website's sitemap to the search engines every week. We have a module that offers sitemap generation, but it does not work very well: not all pages are included, and it does not split the sitemap by content type. I've used various tools (online and offline) to generate the sitemaps, and that part is not the problem. The problem is that after every generation (which takes most of each Monday) I have to go through the sitemap manually and categorise the links into products, pages, categories, and subcategories. I've experimented successfully with XSL to split the sitemap, but it is still a labour-intensive process. Does anyone know of a good method to split the sitemap? Currently there are around 20,000 links (iirc) in total.
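
    For reference, the categorising step itself can be scripted. A minimal PHP sketch, assuming the links can be bucketed by URL path; the /product and /categor patterns are assumptions, not taken from the question, and would need adjusting to the real URL scheme:

        <?php
        // Split one large sitemap into per-content-type sitemaps by
        // bucketing each <loc> URL on a path pattern.
        $ns      = 'http://www.sitemaps.org/schemas/sitemap/0.9';
        $buckets = [
            'products'   => '~/product~',
            'categories' => '~/categor~',
            'pages'      => '~.~',       // catch-all for everything else
        ];

        $entries = array_fill_keys(array_keys($buckets), []);
        foreach (simplexml_load_file('sitemap.xml')->children($ns)->url as $url) {
            $loc = (string) $url->children($ns)->loc;
            foreach ($buckets as $name => $pattern) {
                if (preg_match($pattern, $loc)) {
                    $entries[$name][] = $url->asXML();
                    break;               // first matching bucket wins
                }
            }
        }

        foreach ($entries as $name => $urls) {
            file_put_contents(
                "sitemap-$name.xml",
                "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n"
                . "<urlset xmlns=\"$ns\">\n" . implode("\n", $urls) . "\n</urlset>"
            );
        }

    Run weekly from cron, this reduces the Monday job to reviewing the output rather than sorting 20,000 links by hand.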

    Read the article

  • SEO Mapping, Tracking and Reporting

    Linking to the pages of a website is done because search engines become more aware of a site's presence when its pages are found at the other end of industry terms used as anchor text in content at other locations. The number and quality of those links are factors that help promote rankings; when placed for SEO purposes they should be one-way links rather than reciprocal, since reciprocal links earn no ranking credit and it is prohibitively time-consuming to administer a thousand of them. This is not to be confused with link exchanges; when you can...

    Read the article

  • Recommended flexible website solution?

    - by Omega
    My site has a MyBB forum installation, and that is pretty much all I need. Forums. However, I need a homepage and a couple of other static pages for showing relevant information, links, etc. I don't need anything fancy; all I need is something very flexible regarding theme and style editing, plus a couple of simple modules, like public polls. That's all. I am very visually oriented, and I am looking for something that lets me edit pretty much every aspect of the site. These are mainly static pages, so I don't need something very complex. Some people tell me to use Dreamweaver, but honestly, that is not what I am looking for, even though it does offer a lot of flexibility. I want something like Drupal, or some other simple web platform that is deeply editable in terms of graphics and style. What would you recommend? Thank you.

    Read the article

  • Is there a media player that works on HTTPS sites?

    - by Iain Hallam
    I'm currently using Yahoo! Media Player for a site that needs to play MP3 files stored on our server. In total there is quite a bit more than SoundCloud's free limits allow, but each file is only a few minutes long. YMP is pretty good, but it causes mixed-content security warnings on HTTPS pages because it can only be served via HTTP. Is there an equivalent free player I can embed on the HTTPS pages? EDIT: Just to clarify, I'm initially looking for something that will scan the page and turn media links playable.
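
    One option that sidesteps third-party players entirely is the HTML5 audio element, which loads over the page's own scheme and so raises no mixed-content warnings (browser MP3 support permitting, which in 2012 was still uneven). A minimal sketch, assuming the pages are generated by PHP; the function name is made up here:

        <?php
        // Replace plain MP3 links in a page fragment with HTML5 <audio>
        // players, served over the same scheme as the page itself.
        function mp3_links_to_players($html)
        {
            return preg_replace_callback(
                '~<a\s+href="([^"]+\.mp3)"[^>]*>(.*?)</a>~i',
                function ($m) {
                    $src = htmlspecialchars($m[1], ENT_QUOTES);
                    return '<figure><figcaption>' . $m[2] . '</figcaption>'
                         . '<audio controls src="' . $src . '"></audio></figure>';
                },
                $html
            );
        }

        // Usage: echo mp3_links_to_players($pageBody);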

    Read the article

  • Should webmasters index dashboard and account-edit pages?

    - by francoboy7
    New here; I did my research and found nothing, but sorry if this has already been asked. As webmasters, should we let Google and other search engines INDEX our members' dashboard and account-edit pages? For example, my member John has access to a page named "Edit your account" where he can fill in some fields and update his info, and other pages where John can manage his posts (edit, delete). Such pages are of no interest to other people, so should we let Google and others INDEX them, or should we NOINDEX them? Thanks for your time! Franck
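
    For what it's worth, the usual way to keep such pages out of the index while still letting logged-in users reach them is a robots meta tag or an X-Robots-Tag header. A minimal sketch, assuming the account pages are PHP scripts; either line is enough on its own:

        <?php
        // Keep private account pages out of search indexes. The header
        // variant must be sent before any output and also covers non-HTML.
        header('X-Robots-Tag: noindex, nofollow');
        ?>
        <meta name="robots" content="noindex, nofollow">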

    Read the article

  • Determining cause of random latency/loading issues

    - by Sherwin Flight
    I'm not sure exactly which details to post regarding my issue, because I'm not sure what is relevant. Prior to the end of September my websites all loaded quickly, in almost all cases; loading time wasn't usually more than a few seconds. Since the end of September, however, I have noticed a big increase in page loading times. In some cases pages take 30 seconds or more to load. I have a remote monitoring service watching some of the sites, and the image below shows the response times over the past month. The response times at the beginning of this graph are what the usual response times were before the issue occurred; you can see a significant increase from the beginning to the end of the graph. The thing is, the problem does not happen 100% of the time. If I click through the site, or even just keep refreshing a page, about 25% of the time the pages load quickly and the remaining 75% of the time they load slowly. Sometimes pages take so long that they time out and don't load at all. I have contacted my hosting provider, and they said things at their end were fine. I don't believe the problem is my home internet provider, because all other websites load without a problem. The server is located in Texas, USA. This raises another interesting point: my remote monitor checks my site from two locations, California, USA and London, England. As you can see in the chart below, the response time is actually shorter when checked from London, which doesn't seem to make sense, since the server is physically closer to the California monitoring location; I would have expected the London location to show higher response times since it is physically farther away. I should also point out that in some traceroute tests I've done, the first connection to the server seems to take the longest; after that, the rest of the page loads quickly. Below is a little chart showing the times for the first connection to the server. Sending the request to the server is very quick, and receiving the reply back seems pretty quick, but the WAIT time is really long: it connects, sends the request, then waits close to 30 seconds before it starts receiving data back. I am also aware that there are things I can do to speed up page loading times, like reducing the number of CSS/JS files used on a page, compressing images, etc. That is not the source of this problem, though, because nothing has really changed on the site since before the problem started, and other sites on the same server are loading slowly as well. So, what could be causing this, and what steps can I take to resolve it or at least narrow down the problem? Any help or advice is much appreciated.
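
    To put numbers on where those 30 seconds go, the request can be timed phase by phase. A minimal PHP sketch using cURL's timing counters (the URL is a placeholder); a long gap between connect time and first byte points at the server or application rather than the network:

        <?php
        // Break a request into DNS, connect, and wait (time-to-first-byte)
        // phases to see which one accounts for the ~30 s stalls.
        $ch = curl_init('http://www.example.com/');
        curl_setopt_array($ch, [
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_TIMEOUT        => 60,
        ]);
        curl_exec($ch);

        printf("DNS lookup:  %.3f s\n", curl_getinfo($ch, CURLINFO_NAMELOOKUP_TIME));
        printf("TCP connect: %.3f s\n", curl_getinfo($ch, CURLINFO_CONNECT_TIME));
        printf("First byte:  %.3f s\n", curl_getinfo($ch, CURLINFO_STARTTRANSFER_TIME));
        printf("Total:       %.3f s\n", curl_getinfo($ch, CURLINFO_TOTAL_TIME));
        curl_close($ch);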

    Read the article

  • Rehosting content from another server

    - by Lana_M
    We have a set of static pages that will augment a customer's existing site. The pages will not reside on the customer's servers, for logistical reasons and because we need to maintain control of the content. The plan is for the customer to set up a mod_rewrite rule that funnels certain types of URLs to a single server-side handler script, which grabs the appropriate file from a CDN and just outputs its content. This illustrates the approach:

        <?php
        echo file_get_contents(str_replace($customer_host, $cdn_host, $_SERVER['REQUEST_URI']));
        ?>

    Can anyone think of pitfalls, or offer up a different approach? Is there some way to circumvent a script altogether?
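
    Two pitfalls stand out in the snippet above: $_SERVER['REQUEST_URI'] normally holds only the path and query string, not the host, so the str_replace() will usually match nothing and file_get_contents() will try to open a local path; and any status code, content type, or error from the CDN is silently dropped. A slightly more defensive sketch of the same handler (the $cdn_host value is a placeholder assumed to be configured elsewhere):

        <?php
        // Build the CDN URL explicitly and relay status code and
        // content type to the client instead of blindly echoing.
        $cdn_host = 'https://cdn.example.com';
        $url = $cdn_host . $_SERVER['REQUEST_URI'];

        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $body = curl_exec($ch);

        if ($body === false) {
            http_response_code(502);   // CDN unreachable
            exit;
        }

        http_response_code(curl_getinfo($ch, CURLINFO_HTTP_CODE));
        if ($type = curl_getinfo($ch, CURLINFO_CONTENT_TYPE)) {
            header('Content-Type: ' . $type);
        }
        echo $body;
        curl_close($ch);

    As for circumventing a script altogether: with mod_proxy enabled, the same mod_rewrite rule can proxy directly, e.g. RewriteRule ^/docs/(.*)$ http://cdn.example.com/docs/$1 [P], which keeps PHP out of the request path entirely.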

    Read the article

  • My Sites Were Hacked. What To Do?

    - by Vad
    I host multiple domains with a very popular hosting provider, and I just went to one of my sites and... I see a black page with the message "Hacked by...". I checked, and all my sites with this provider are showing this same page. In the file system I can see the hacker placed default.* and index.* files with this message; he overwrote all the index pages and placed new ones under every, and I say again, every folder. Cleaning this up will be a most horrible job. What should I do (right now I am awaiting the restore of files from the hosting provider)? How do I prevent this? Whom do I blame?

    Read the article

  • SproutCore: a JavaScript framework to enrich web interfaces and make them resemble desktop applications

    SproutCore: a JavaScript framework to enrich UIs and make them resemble those of desktop applications. SproutCore is a JavaScript framework that is still little known but is starting to get talked about. Published by the company Sproutit, the technology is aimed at web developers. It lets them enrich user interfaces to make them nearly indistinguishable from those of desktop applications. "AJAX has been used to create web pages that can update without reloading in the browser. But they still look like web pages, and they are generally limited in interactivity," one can read on ...

    Read the article

  • Pros and Cons of Session Replication

    - by techsjs2012
    Do I really need session replication? I am working on a number of web projects for a firm. Most of the projects involve one or two pages of input and then a save to a MySQL database; very basic projects. My SAs are pushing to get session replication working in JBoss, but I don't really see any need for it, or for all of its overhead. We need load balancing and clustering, so that if a server goes down we can move new requests to the backup instance, but I am not sold on session replication. These are very low-volume projects; in my eyes, what are the odds of a user being in the middle of those one or two pages at the exact moment the server goes down? I need to convince the SAs that session replication is an unnecessary complication in this instance, so I am looking for pros and cons of session replication to better structure my argument.
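
    For context on what is actually being argued about: in JBoss-era servlet containers, replication is switched on per webapp with the distributable flag in web.xml. A minimal sketch:

        <!-- web.xml: marking the webapp distributable asks the container
             to replicate HTTP sessions across the cluster. Everything put
             into the session must then be Serializable, and each session
             write is shipped to the other nodes; that is the overhead. -->
        <web-app xmlns="http://java.sun.com/xml/ns/javaee" version="2.5">
            <distributable/>
        </web-app>

    A common middle ground for low-volume apps is sticky sessions at the load balancer: new requests fail over to the backup node, and only users mid-session on the failed node lose anything.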

    Read the article

  • Search Engine Query Word Order

    - by EoghanM
    I have pages with titles like 'Alpha with Beta'. For every such page there is an inverse page, 'Beta with Alpha', and both pages link to each other. When someone on Google searches for 'Beta with Alpha', I'd like them to land on the matching page, but sometimes 'Alpha with Beta' ranks higher (or vice versa). I was thinking of inspecting the referral link when a visitor arrives on my site and silently redirecting them to the correct page based on what they actually searched for. I'm just wondering whether this could be penalized by Google as 'cloaking/sneaky redirects'? Or is there a better way to ensure that the correct page on my site ranks higher for the matching query?
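
    For concreteness, the check being described would look something like the sketch below (the paths are hypothetical, and at the time Google still passed the search terms in the referrer's q parameter). Note that this is shown only to make the question concrete: redirecting users based on their referrer is precisely the pattern Google's guidelines describe as a sneaky redirect.

        <?php
        // Illustration only: read the search query from the referrer and
        // redirect to the page whose word order matches what was searched.
        $ref = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
        parse_str((string) parse_url($ref, PHP_URL_QUERY), $params);
        $q = strtolower(isset($params['q']) ? $params['q'] : '');

        if ($q === 'beta with alpha' && $_SERVER['REQUEST_URI'] === '/alpha-with-beta') {
            header('Location: /beta-with-alpha', true, 302);
            exit;
        }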

    Read the article

  • Redesigning an old site, structure changes etc.

    - by RhymeGuy
    I have an old site built in 2006; it has around 200 pages and 500 images. Every single page is of course indexed, as are the images. It is very well ranked for its targeted keywords and I receive a good amount of SEO traffic (I guess that's due to the various campaigns, branding, PPC, etc.). Problem: the site has an outdated design, pages and images have poorly chosen names, there are no heading or alt tags, it was built with tables, inline CSS, etc. Goal: completely redesign the site, use divs, change file names, add proper metadata, alt tags, etc. Question: how can this affect current SEO positions? I will redirect (301) every single page to its new URL and build a sitemap, but what do I do with the images? Do I need to redirect them also? Any other suggestions?
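
    Renamed images should get the same 301 treatment as pages, since they carry their own image-search rankings. A minimal .htaccess sketch (the file names below are made up for illustration):

        # 301 old page URLs to their new equivalents.
        Redirect 301 /products.html /products/
        # Redirect renamed images too, so image-search rankings carry over.
        Redirect 301 /images/img0042.jpg /media/blue-widget-large.jpg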

    Read the article

  • How to organize my site's file system properly?

    - by Wolfpack'08
    Doing some reading on Stack Overflow, I've found a lot of information suggesting that proper organization of a file system is crucial to a well-written web app. One of the key pieces of evidence is the high frequency of references to "separation of concerns" in questions related to keeping programs organized. Now, I've found some information on organizing file systems (the Filesystem Hierarchy Standard) from 2004. It raises only two concerns: first, the standard is a bit dated, so I believe it may be possible to do better given the changes in technology over the past 8 years; second, and most important, my application is very small compared to an entire Linux distro, so I think its file system should be organized very differently. Here's what I'm looking at currently:

        /scripts
        /databases
        /www -> /dev, /production -> login, router, admin pages
        /sites -> content types, static pages
        /modules, /includes, /css
        /media -> /module-specific-media

    Read the article

  • List of backlinks to a specific website, listed by decreasing PageRank

    - by Nicolas Raoul
    With backlinkwatch.com I can get a list of pages that link to a particular website. Unfortunately, it lists tons of obscure blogs and small forums, and it is hard to tell which links are really important. Is there a similar service where the links are displayed sorted by "importance"? For instance, a link from the New York Times would be shown at the top of the list, while links from small blogs would not appear until a few pages in. "Importance" can be subjective, so I suggest using PageRank, but other metrics would be fine too.

    Read the article

  • Disqus 2012 comments NOT being indexed by Google

    - by Buckers
    We run a high-traffic website at http://www.onedirection.net and we've been using Disqus throughout this year, initially to great effect. We accepted the upgrade to Disqus 2012 back in June, loving the improved user experience and the better community feel, albeit back inside an iframe again. The fact that we were specifically told the comments are now indexed by Google was great, and the dynamic nature of the iframe suited our site (all our pages are cached, so by using Disqus the comments are updated straight away). However, it seems that the Disqus 2012 comments are not being indexed, and we've noticed an obvious fall in traffic over the last few months. Initially we didn't put this down to Disqus and focused on other issues (Google algorithm updates, etc.), but we're quickly coming around to the reasoning that our pages now contain less indexable text, and we are getting less traffic because of it. We've tried emailing Disqus directly, but they're very slow and don't seem keen to help. Any thoughts on this?

    Read the article

  • Google Site Search (commercial) not indexing files in sitemap

    - by melat0nin
    I have a client for whom we have purchased Google Site Search. It works well for HTML pages served by the CMS, but files aren't being reliably indexed. I wrote a script to generate an XML feed (sitemap) of all the files in the CMS, which I've plugged into Google Webmaster Tools for the site. It says that 923 URLs have been submitted for that sitemap, but only 26 have been indexed. The client relies heavily on searching within files, which is why we decided to use Google search, so this is a bit of a problem. Many of the files aren't linked to from any page on the site, as they are old and therefore don't merit pages of their own, but they still need to be accessible through search for archiving purposes. The file-archive XML can be found at www.sniffer.org.uk/file-archive and the standard XML sitemap (of pages) at www.sniffer.org.uk/sitemap.xml. Any thoughts would be much appreciated!
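
    For reference, a file sitemap of this kind can be generated with a short script. A minimal PHP sketch, assuming the files live under a single directory; the paths below are placeholders, not taken from the site above:

        <?php
        // Walk a directory tree and emit one sitemap <url> entry per file,
        // using each file's mtime as its <lastmod>.
        $docRoot = '/var/www/files';                      // placeholder
        $baseUrl = 'http://www.example.org/file-archive'; // placeholder

        $it = new RecursiveIteratorIterator(
            new RecursiveDirectoryIterator($docRoot, FilesystemIterator::SKIP_DOTS)
        );

        echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
        echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
        foreach ($it as $file) {
            $rel = substr($file->getPathname(), strlen($docRoot));
            printf(
                "  <url><loc>%s</loc><lastmod>%s</lastmod></url>\n",
                htmlspecialchars($baseUrl . $rel),
                date('Y-m-d', $file->getMTime())
            );
        }
        echo '</urlset>' . "\n";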

    Read the article

  • How do I optimize SEO in a multiblog WordPress install?

    - by user35585
    We are about to launch two product pages plus a corporate website. The goal is to keep a blog on all of the sites, but the question is how to do this in a way that keeps everything unified without confusing Google's crawlers. We considered the following options: running a single blog from which we pull two categories, styled with custom CSS, so that one blog effectively splits into two category-dependent blogs; this way we can take the feeds from it and point the product pages at it. Running two product blogs whose posts are pulled into a bigger, corporate blog. Or running three independent blogs. Although I was for the first option, so that we only have to address our content from the product pages, I would sincerely like to hear your opinion. We are afraid that duplicate content or strange linking games may make us lose PageRank. How would you do it?

    Read the article

  • SEO Impact of using Responsive Design and the serving of different content on the same URL

    - by bmenekl
    We are currently working on redesigning our site and making it responsive. It is a search-intensive site with complex functionality, a lot of search filters, and a lot of content. Our mobile versions of certain pages need to hide some functionality (i.e. search filters) that exists in the desktop version, and/or content (mainly blocks of text that are not necessary or that increase page load on mobile devices). My question is this: does having the same URL (a responsive site) serve slightly different content (text and/or search filters) for certain pages on different devices affect our SEO (SERPs or otherwise)?

    Read the article

  • How Web Optimization Services Work to Increase Your Online Reputation

    SEO stands for search engine optimization, and it is key to a business's success: no site counts for much if it is not properly promoted. Whenever a web user looks for particular merchandise, services, or information, he uses the easiest method available, searching, and it is the habit of many individuals to look only at the first five or six results. Nobody has time to look through 100 pages of search-engine results, and there is no need to when what they want appears on the first pages.

    Read the article

  • Do I need multiple accounts in Facebook for each of my product sites?

    - by John
    I have a dozen sites, including for-profit ones as well as charities. For each site I've created a Facebook company/charity account. After creating those accounts it dawned on me that I could just as well have created a page for each of my sites from my personal account alone, even if a site has multiple product pages. What would be the right strategy? Also, per Facebook's terms we can have only a single personal account. I do have a single personal account, and for each site I've created only company pages; I hope I'm not violating the Facebook terms.

    Read the article

  • Google web search shows dateCreated instead of dateModified metadata

    - by LonelyPixel
    So today I discovered that the pages from my website are listed with an unexpected date value. I specify the schema.org properties dateCreated and dateModified on most of my content pages. I'd expect search results to show when a page was last updated, to give a sense of how current the page is, but Google is showing the date of first publication, which may be years in the past. That's a bit unsatisfying, but I don't want to misuse the metadata just because Google reads it wrong. Some search terms for you to try it out: "gitrevisiontool"; "easyxml"; "multiselecttreeview" (look for the results on dev.unclassified.de; the human- and machine-readable dates come at the end of the page). Does anybody know more about what's wrong here? Or does it work as designed? (What a stupid design that would be.)
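
    For reference, this is roughly what the markup in question looks like in microdata form. A generic sketch, not copied from the site mentioned: the name comes from the search terms above, but the dates and surrounding HTML are invented for illustration.

        <!-- Machine-readable creation and modification dates on a page,
             microdata style; both properties come from CreativeWork. -->
        <article itemscope itemtype="http://schema.org/SoftwareApplication">
          <h1 itemprop="name">GitRevisionTool</h1>
          <p>
            Published <time itemprop="dateCreated" datetime="2010-04-01">1 April 2010</time>,
            last updated <time itemprop="dateModified" datetime="2012-11-20">20 November 2012</time>.
          </p>
        </article>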

    Read the article
