Search Results

Search found 24735 results on 990 pages for 'site ranking'.

Page 24/990

  • Problem with ranking of search results in SharePoint 2007 if using the CONTAINS predicate

    - by mythicdawn
    While writing a front-end for the SharePoint Search web service for work, I did some quick testing with the MOSS Search Tool to make sure things were working right under the hood. What I found was that queries composed only of CONTAINS predicates (FREETEXT ones were fine) would have a rank of 1000 for any results that were returned. According to the documentation (http://msdn.microsoft.com/en-us/library/ms544086.aspx): "If the query returns a document because a non–full-text predicate evaluates to TRUE for that document, the rank value is calculated as 1000." Given that the behaviour I am seeing seems to contradict the documentation, is it the case that all queries that use only the CONTAINS predicate will produce a fixed rank of 1000 like this?
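
    For context, here is a minimal sketch of the two query styles being compared, written in SharePoint's search SQL syntax (the selected properties and search terms are illustrative only):

        -- FREETEXT query: results carry relevance-computed rank values
        SELECT Title, Path, Rank FROM SCOPE() WHERE FREETEXT(DefaultProperties, 'sharepoint search')

        -- CONTAINS-only query: the style observed to come back with Rank = 1000 for every hit
        SELECT Title, Path, Rank FROM SCOPE() WHERE CONTAINS(Title, '"sharepoint"')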

    Read the article

  • MySQL Query Ranking

    - by eft0
    Hey there, I have this MySQL query that lists a "Vote Ranking" for "Actions" and it works fine, but I want the id_user column not to repeat, as if a DISTINCT were applied to that field.

        SELECT count(v.id) votos, v.id_action, a.id_user, a.id_user_to, a.action, a.descripcion
        FROM votes v, actions a
        WHERE v.id_action = a.id
        GROUP BY v.id_action
        ORDER BY votos DESC

    The result:

        votes  act  id_user
        3      3    745059251
        2      20   1245069513
        2      23   1245069513
        2      26   100000882722297
        2      29   1245069513
        2      44   1040560484
        2      49   1257441644
        2      50   1040560484

    The expected result:

        votes  act  id_user
        3      3    745059251
        2      20   1245069513
        2      26   100000882722297
        2      44   1040560484
        2      49   1257441644
        2      50   1040560484

    Thanks in advance!
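
    One possible direction, offered purely as a sketch: wrap the existing query in a derived table and group the outer query by id_user. Note that keeping the highest-voted row per user this way relies on MySQL's non-standard handling of GROUP BY over an ordered subquery, so it is not guaranteed behaviour.

        SELECT t.votos, t.id_action, t.id_user, t.id_user_to, t.action, t.descripcion
        FROM (
            SELECT count(v.id) votos, v.id_action, a.id_user, a.id_user_to, a.action, a.descripcion
            FROM votes v, actions a
            WHERE v.id_action = a.id
            GROUP BY v.id_action
            ORDER BY votos DESC
        ) t
        GROUP BY t.id_user
        ORDER BY t.votos DESC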

    Read the article

  • Can google “see” this custom javascript code which displays links from an external site to mine

    - by webmasters
    I have some JavaScript code on my site that displays links from another site. This is what I have in my source before the page loads:

        <script language="JavaScript" type="text/javascript">showLink(1);</script>

    This is what I have copied from my source after the page has loaded:

        <script language="JavaScript" type="text/javascript">showLink(1);</script><a rel="nofollow" target="_blank" class="anc" href="http://x5.external_site.net/sc/out.php?s=5483&amp;o=http%3A%2F%2Fwww.bluetooth.com">Bluetooth Devices</a>

    Can Google see this link?
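
    Whether a crawler executes that script is exactly the open question here; if the link has to be discoverable no matter what, a hedged fallback is to also emit it as static markup, for example in a noscript block pointing at the final destination rather than the tracking redirect (a sketch only):

        <noscript>
            <a href="http://www.bluetooth.com">Bluetooth Devices</a>
        </noscript>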

    Read the article

  • .htaccess: Redirect Hotlink Flash --> Site with embed Flash

    - by user5571
    I have some .php pages that embed .swf files. These .swf files are now being hotlinked by other people, and I don't want them to simply open the SWF directly; I want to force a redirect to the page where the Flash is embedded. Data:

        Site:  www.example.com/1 (www.example.com/2, www.example.com/3 and so on)
        Flash: www.example.com/flash/flash_NUMBER.swf

    So for www.example.com/1:

        Site:  www.example.com/1
        Flash: www.example.com/flash/flash_1.swf

    I now want a user who types "www.example.com/flash/flash_1.swf" into the address bar to be redirected to www.example.com/1. The problem: the Flash still needs to be accessible via www.example.com/1, where it is embedded, and I can't get that part working. The tool I would like to use for this is .htaccess and RewriteRule. I hope someone can help me out.
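
    A rough sketch of the kind of rule set that could do this, assuming mod_rewrite is enabled and the .htaccess file sits in the site root (the domain, flags and referer check are illustrative and not tested against this exact setup; clients that strip the Referer header will always be redirected):

        RewriteEngine On
        # requests for the .swf referred from our own pages pass through,
        # so the player embedded on /1, /2, ... keeps working
        RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
        # anyone opening flash_N.swf directly is sent to the page /N that embeds it
        RewriteRule ^flash/flash_([0-9]+)\.swf$ /$1 [R=302,L]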

    Read the article

  • How to create Office365 SharePoint site using SharePoint2010 template

    - by ybbest
    Recently I worked with a client whose Office 365 tenant had been upgraded to SharePoint 2013, but they still wanted to create SharePoint sites using the old SharePoint 2010 template. If you would like to know how, here are the steps:

    1. Go to your Office 365 portal https://portal.microsoftonline.com/admin/default.aspx and then go to the SharePoint admin page.
    2. Next, go to the settings page.
    3. Change the Global Experience Version Settings.
    4. Finally, you will be able to create a SharePoint site using the 2010 template.

    Read the article

  • Google I/O 2010 - Optimize your site with Page Speed

    Google I/O 2010 - Optimize every bit of your site serving and web pages with Page Speed. Tech Talk by Richard Rabbat and Bryan McQuade. Page Speed is an open-source Firefox/Firebug add-on. Webmasters and web developers can use Page Speed to evaluate the performance of their web pages and to get suggestions on how to improve them. Learn about the latest rules of web development we've added, updated optimizations, a refreshed UI, how to collect data through beacons to track progress over time, cut-and-paste fixes, and how to work with third-party libraries more effectively, including Google Analytics. For all I/O 2010 sessions, please go to code.google.com/events/io/2010/sessions.html. From: GoogleDevelopers. Time: 47:15.

    Read the article

  • Site Review: MortgageCalculator.org - Forms Evaluation

    This site allows users to enter basic loan information into a form, and when the user clicks the submit button the information is used to calculate a loan summary which includes: monthly payment, total interest paid, and the last payment date. The site uses server-side validation and replaces any value not within a normal range with the calculator default for that form field. In addition, it also uses server-side code to calculate the items on the loan summary, which is then displayed to the user. I personally think that adding client-side validation would improve the user's experience, because it would ensure that the data being submitted is within an acceptable range and, if it was not, would let the user adjust the data before submitting.
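
    As a purely hypothetical sketch of the client-side check the review suggests (field names and limits are invented, and the server would still re-validate everything):

        // run from the form's onsubmit handler; returns false to block submission
        function validateLoanForm(form) {
            var amount = parseFloat(form.loanAmount.value);
            var rate   = parseFloat(form.interestRate.value);
            var years  = parseInt(form.termYears.value, 10);
            if (isNaN(amount) || amount <= 0) {
                alert('Please enter a loan amount greater than zero.');
                return false;
            }
            if (isNaN(rate) || rate <= 0 || rate > 30) {
                alert('Please enter an interest rate between 0 and 30 percent.');
                return false;
            }
            if (isNaN(years) || years < 1 || years > 50) {
                alert('Please enter a term between 1 and 50 years.');
                return false;
            }
            return true;
        }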

    Read the article

  • Google I/O 2010 - SEO site advice from the experts

    Google I/O 2010 - SEO site advice from the experts. Tech Talk by Matt Cutts, Greg Grothaus and Evan Roseman. A perfect opportunity to get your website reviewed by the experts in the Google Search Quality team. Attendees can get concrete search engine optimization (SEO) feedback on their own sites. We'll also answer real-life questions that affect developers when it comes to optimizing their websites for search. For all I/O 2010 sessions, please go to code.google.com. From: GoogleDevelopers. Time: 01:00:38.

    Read the article

  • Understanding Ajax crawling of search site

    - by vacuum
    I have a couple of questions about AJAX crawling for a site that is itself a kind of search engine. The base article explains the mechanism for making an AJAX application crawlable. All this stuff with HTML snapshots is clear and easy to implement, but I can't understand where the Google bot will find "a pretty AJAX URL" (i.e. www.example.com/ajax.html#key=value) to work with. The first thing that came to mind is breadcrumbs: in the sitemap we can specify pages with breadcrumbs on them, so the bot will go to those pages and get HTML snapshots from there. But I'm sure other ways exist to give the bot this "pretty AJAX URL". In our case we have a simple search site where the user enters a keyword, presses "Find", JS executes an AJAX request, receives a JSON response and fills the page with results (without any refresh, of course). In this case, how do I make the Google bot crawl all the results in addition to the sitemap? Is there some example of a solution as described in the article above?
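
    For what it's worth, the scheme in that article keys off the hashbang ("#!") form of the URL; here is a sketch of the round trip it describes, with illustrative URLs:

        pretty URL the bot discovers:   www.example.com/ajax.html#!key=value
        URL the bot actually fetches:   www.example.com/ajax.html?_escaped_fragment_=key=value
        what your server returns:       an HTML snapshot of the results for key=value
        what the sitemap can list:      the pretty #! URLs themselves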

    Read the article

  • Recommended requirements when outsourcing xhtml/css site building?

    - by András Szepesházi
    I'm considering outsourcing part of our web application development project to freelancers, namely the site building part. What I mean by site building is the process of creating the XHTML/CSS template files, with dummy content, from a PSD file (or any other graphical layout file). The resulting XHTML/CSS files will be used by our developers as templates for CMS-based page rendering. The CMS in this case is Drupal, but that might not be of much relevance. I'm looking for a good set of requirements that will result in good-quality XHTML/CSS code complying with today's standards, and that leaves little to the freelance developer's imagination in terms of what I need. I'm thinking about requirements like:

    - Valid XHTML 1.0 Transitional document type, validated by validator.w3.org
    - Identical rendering in all modern browsers (FF, Chrome, Safari, Opera, IE7-8) and also in IE6
    - All opening and closing block-level elements should be properly commented, referencing the functional part of the user interface they belong to (menu, toolbar, content, etc); a small sketch of this convention follows below
    - No inline CSS definitions
    - And so on.

    How would you organize a list like that? What requirements would you add?
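
    A purely illustrative sketch of the block-level commenting convention mentioned in the list above (the names are invented):

        <!-- begin: menu -->
        <div id="menu">
            <!-- menu markup goes here -->
        </div>
        <!-- end: menu -->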

    Read the article

  • Why Does the Same Site in Different SERPs Contain and Not Contain Google Sitelinks

    - by frank13
    Is there a way to get sitelinks on a Google SERP when searching on a site's name vs. the site's web address? For example, if you search "twin city kings", the first result is the website for Twin City Kings without sitelinks. But if you search "twincitykings.com", the first result is the website for Twin City Kings with sitelinks. Is it possible to get sitelinks on both SERPs? Thanks for helping or clarifying. Note: this question does not pertain to "how to get a sitelink"; it pertains to why the same site comes up in different SERPs with and without sitelinks.

    Read the article

  • SEO optimization for AJAX site and dynamic HTML canvas

    - by Christian Benincasa
    I have a site that uses AJAX to query the Last.fm database and then dynamically draws a graph of the results on an HTML canvas. In the search function, I have a command that sets window.location.hash to the search parameters. I also have a function that checks whether a hash was provided in the URL and, if so, generates the page. For example, http://www.thenlistento.com/#!/led+zeppelin will automatically navigate to a search page for Led Zeppelin. My question is, how do I optimize this setup for SEO? Can it be done at all? I've taken a look at the Google Webmaster docs and read over the hashbang protocol, but I'm not totally sure how to apply it to my situation, or even if I can at all. Any help/suggestions would be greatly appreciated. Link to the site: http://www.thenlistento.com
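
    For reference, a minimal sketch of the load-time hash check described above, assuming a runSearch(term) function already exists on the page (the function name is a hypothetical stand-in, not the site's actual code):

        window.onload = function () {
            var hash = window.location.hash;                       // e.g. "#!/led+zeppelin"
            if (hash.indexOf('#!/') === 0) {
                var term = decodeURIComponent(hash.slice(3)).replace(/\+/g, ' ');
                runSearch(term);                                   // re-run the AJAX search and redraw the canvas
            }
        };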

    Read the article

  • Setup site folders on Apache and PHP [closed]

    - by Cobus Kruger
    I'm trying to set up my first Apache server on my Windows PC at home, and I have real trouble finding out which configuration settings go where. I downloaded and installed XAMPP, which seemed to get everything nicely set up, and I can see a working website at http://localhost. So far so good. The point of this is to develop a website, of course, and to make my life easier (irony?) I wanted to point the web site root at my Eclipse project folder. So I opened httpd.conf, uncommented a VirtualHost block and changed its DocumentRoot to my local path. Now when I try to load http://localhost I get a 403 (Access denied) error. So where do I configure permissions for my folder? And is that all I need to let my site run from the folder specified, or am I going to have to clear another hurdle?
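
    Apache denies access to directories it has not been told about, so a DocumentRoot outside the default htdocs usually needs a matching Directory section. A sketch only (the path is an example; the Order/Allow lines are Apache 2.2 syntax as shipped with XAMPP at the time, while Apache 2.4 would use "Require all granted"):

        <VirtualHost *:80>
            ServerName localhost
            DocumentRoot "C:/workspace/mysite"
            <Directory "C:/workspace/mysite">
                Options Indexes FollowSymLinks
                AllowOverride All
                Order allow,deny
                Allow from all
            </Directory>
        </VirtualHost>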

    Read the article

  • Sub routing in a SPA site

    - by Anders
    I have a SPA site that I'm working on, and I have a requirement that a page view model can have sub-routes. I'm currently using this 'pattern' for the site:

        MyApp.FooViewModel = MyApp.define({
            meta: {
                query: MyApp.Core.Contracts.Queries.FooQuery,
                title: "Foo"
            },
            init: function (queryResult) {
            },
            prototype: {
            }
        });

    In the master view model I have a route table:

        this.navigation(new MyApp.RoutesViewModel({
            Home: {
                model: MyApp.HomeViewModel,
                route: String.empty
            },
            Foo: {
                model: MyApp.FooViewModel
            }
        }));

    The meta object defines which query should populate the top-level view model when it is invoked through sammy.js. This is all fine, but it does not support sub-routing. My plan is to change the meta object so that it can (optionally, of course) look like this:

        meta: {
            query: MyApp.Core.Contracts.Queries.FooQuery,
            title: "Foo",
            route: {
                barId: MyApp.BarViewModel
            }
        }

    When sammy.js detects a barId in the query string, the Bar model will be executed and populated through its own meta object. Is this a good design?
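
    A purely hypothetical sketch of how a dispatcher could consume the proposed optional route map (none of these helpers exist in the question; the shape is only meant to make the idea concrete):

        // given a view model's meta object and the current query-string parameters,
        // return a populated sub view model, or null if no sub-route applies
        function resolveSubRoute(meta, params) {
            if (!meta.route) {
                return null;
            }
            for (var key in meta.route) {                  // e.g. "barId"
                if (meta.route.hasOwnProperty(key) && params[key] !== undefined) {
                    var SubModel = meta.route[key];        // e.g. MyApp.BarViewModel
                    return new SubModel(params[key]);      // let it populate via its own meta/query
                }
            }
            return null;
        }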

    Read the article

  • Adding my face to my web-site in Google's search result

    - by Roman Matveev
    I'm trying to add a rich snippet to the template of my future web-site. The data format is review, and I used microdata formatting to add all the necessary information to the web page. The Structured Data Testing Tool shows the rating, author information and review date. However, my face image is missing and the sections related to authorship are empty, even though I did everything recommended to link my Google+ profile to the web-site. Did I do something wrong? Or will I never see my face in the testing tool, even though it will show up in the real SERPs?
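
    For reference, the authorship linking of that era boiled down to markup along these lines; this is only a sketch, the profile ID is a placeholder, and the Google+ profile also has to link back to the site under "Contributor to":

        <!-- in the page head, or as a visible rel="author" link in the byline -->
        <link rel="author" href="https://plus.google.com/111111111111111111111"/>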

    Read the article

  • Which kind of public sitemap should I build for a search based navigation site

    - by Noam
    I have a search-based navigation web-site. Each query has filters as well as a sort-by. The search results point to end-pages inside the site, and each of those pages has many outlinks to other end-pages. Currently I have an XML sitemap which directs crawlers to all the end pages. I'm trying to add a silo sitemap directory to improve SEO. Assuming this is the right direction, I have a couple of options:

    - end pages sorted alphabetically
    - pages by major search filters, then divided alphabetically
    - pages for every filter and cross option between them and the sort-by

    Which would you recommend and why? NOTE: I'm not referring to an XML sitemap.
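
    Purely as an illustration of the second option (a hypothetical URL layout, not a recommendation):

        /sitemap/                      links to a page per major filter
        /sitemap/filter-foo/           links to /sitemap/filter-foo/a/ through /sitemap/filter-foo/z/
        /sitemap/filter-foo/a/         links to the end pages under that filter whose titles start with "a"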

    Read the article

  • Tracking unique views for a site showing my advertisements [on hold]

    - by user580950
    I am in trouble. I placed an advertisement on a website in 2012. The website said they got 950,000 unique visits each month. Early in 2012 I advertised with them, but the advertisement didn't work out. When I checked 2-3 months later, I saw that the site's unique visitors were around 8,000 at that time, and I immediately closed the account. I don't remember which site I used to check the unique visitors. The advertising company has filed a dispute against me. So is there any tool that can show me the 2012 stats for any website? I tried Google Trends but it doesn't show statistics.

    Read the article

  • iPhone site optimization: Custom viewport size

    - by Brandon Durham
    I've got a site that should max out at 575px on the iPhone, and I wanted to know the best method for defaulting the viewport to this size. Currently I am using this meta tag:

        <meta name="viewport" content="width=575; user-scalable=no;">

    This displays some odd behavior in that it loads fully zoomed out and then, once the page is loaded, zooms in to 575. What are the best methods to ensure that my site reliably displays at 575px wide in mobile browsers?
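
    One thing worth checking, offered only as a sketch: viewport directives are normally comma-separated rather than semicolon-separated, so a first step could be

        <meta name="viewport" content="width=575, user-scalable=no">

    Whether that alone stops the zoom-out-then-zoom-in behavior will vary by device, so it is worth testing alongside an explicit initial-scale value.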

    Read the article

  • Meta description not displaying in custom site search results page

    - by Stephen Connolly
    We have Google Custom Site Search implemented on our company website. Looking at the results page, I noticed that the meta description is not being displayed; it just seems to be reading the link titles from our drop-down menu and using those as a description. When I search for the same page via google.com, the meta description is pulled in correctly. Any thoughts on why this might be happening? I can't see anything in the Custom Site Search settings.

    Read the article
