Search Results

Search found 83878 results on 3356 pages for 'google data api'.

Page 416/3356 | < Previous Page | 412 413 414 415 416 417 418 419 420 421 422 423  | Next Page >

  • SEO for maps-based websites that require user interaction

    - by j0nes
    I have a website that basically shows a lot of locations worldwide on a Google Maps-like interface. The map itself is built with the Leaflet library and OpenStreetMap tiles. On the map I show a marker at each location. Clicking a marker opens a popup window that shows additional information and contains links to "detail" pages for that location. I fetch the location data for the current viewport via an AJAX call to my server, so the additional information is not available in the HTML page itself. The detail pages are the pages my users are interested in. My normal users load the map, search for the location they are interested in, click on a marker and then click on a link in the popup window. For search engines, however, this might look different. As this navigation pattern relies on user interaction, I think they might not be able to find the detail pages. My questions: Are search engines able to follow a navigation path like the one outlined above? How can I improve the navigation for search engines (for example, textual links below the map, sitemaps...)? How important are internal links for SEO?
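
    One way to make the detail pages crawlable is to render plain HTML links (and/or an XML sitemap) from the same location data the AJAX endpoint already serves. A minimal sketch of that idea in Python; the data layout and field names are illustrative, not taken from the site:

    # Sketch: emit crawlable links and a sitemap from the server-side location data.
    locations = [
        {"slug": "berlin-alexanderplatz", "name": "Berlin Alexanderplatz"},
        {"slug": "paris-louvre", "name": "Paris Louvre"},
    ]

    # Plain <a> links rendered below the map so crawlers can reach every detail page.
    links_html = "\n".join(
        '<a href="/locations/{slug}/">{name}</a>'.format(**loc) for loc in locations
    )

    # A simple XML sitemap listing the same detail-page URLs.
    sitemap_xml = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "".join(
            "  <url><loc>https://example.com/locations/{}/</loc></url>\n".format(loc["slug"])
            for loc in locations
        )
        + "</urlset>"
    )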

    Read the article

  • Paging problem in Data Form Webpart SP2010

    - by Patrick Olurotimi Ige
    I was working on a web part in SharePoint Designer 2010 and decided to use the default custom paging. But I noticed the "previous page" link wasn't working: it basically just takes me back to the first page of the list rather than the previous page. After a closer look I noticed Microsoft is using "history.back()", which is supposed to work but doesn't work well for paged data. Anyway, before I started investigating further I found Hani Amr's solution at the right time, and that did the trick. Hope that helps.

    Read the article

  • Can a version update (from 12.04 to 12.10) give driver problems?

    - by Ruben
    I'm new here. I recently installed Ubuntu 12.04. I'm not completely new to the Linux world, but I wanted to ask something: I had a problem with my video drivers, and I fixed it by completely reinstalling the operating system. If I install the new version using the Update Manager (so without a complete reinstallation), will my drivers stay the same as they are now? And what about my data? Thanks, and sorry for my bad English.

    Read the article

  • How to Keep Mobile Cloud Data Safe

    As the use of mobile devices continues to soar, enterprise cloud applications are now resident in the palm of your hand. With this mobility comes ever greater responsibility to keep enterprise data safe.

    Read the article

  • SQL Down Under Podcast - Gadi Peleg - Data Quality Services

    - by Greg Low
    Well, it's been a few months, but I'm back on a roll creating some SQL Down Under podcasts. The first out the door is an interview with Gadi Peleg from the SQL Server team, introducing Data Quality Services. Gadi came to Microsoft when Zoomix was acquired. Details of this podcast (and other available podcasts) are here: http://www.sqldownunder.com/Resources/Podcast.aspx Hope you enjoy it, even though there are some telling signs that I recorded it at 3 AM :-) If you are using iTunes, you can also subscribe here: http://itunes.apple.com/au/podcast/sql-down-under/id503822116?mt=2

    Read the article

  • Unity calendar lens not showing events in Ubuntu 12.04

    - by David_G
    I'm trying to get proper/useful calendar integration into Ubuntu 12.04. I have a Google Calendar (& account) and I want to be able to use this without opening the browser. I want to get the Unity Calendar lens working, so that it shows events coming up, and it allows me a quick way to add new events. However, after installing it, it does not find any events, nor allow me to add a new event. Note that I've installed Lightning 1.4, Evolution mirror 0.2.3, Evolution, and unity-calendar lens. I've also installed Calendar-indicator. I suspect that somehow the lens is not getting the calendar information from Thunderbird via Evolution. A bit of searching around led me to try this command: /usr/lib/calendar-lens/calendar-lens-daemon.py. With this result:

    /usr/lib/python2.7/dist-packages/gobject/constants.py:24: Warning: g_boxed_type_register_static: assertion `g_type_from_name (name) == 0' failed
      import gobject._gobject
    Traceback (most recent call last):
      File "/usr/lib/calendar-lens/calendar-lens-daemon.py", line 324, in <module>
        daemon = Daemon()
      File "/usr/lib/calendar-lens/calendar-lens-daemon.py", line 80, in __init__
        for calendar in evolution.ecal.list_calendars():
    AttributeError: 'NoneType' object has no attribute 'list_calendars'

    Any ideas?
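
    A quick way to check whether the missing Evolution Python bindings are the culprit is to inspect the same attribute the daemon uses. This is only a diagnostic sketch built from the call shown in the traceback above:

    import evolution  # the module the calendar-lens daemon imports

    # In the traceback, evolution.ecal is None, so list_calendars() blows up.
    # If this prints None, the ecal bindings aren't available to Python.
    print(evolution.ecal)
    if evolution.ecal is not None:
        print(evolution.ecal.list_calendars())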

    Read the article

  • How to optimise mesh data

    - by Wardy
    So I have some procedurally generated mesh data and I want to reduce it down to its minimum number of verts. In case it matters, this is a Unity project. Working from a simple example, let's assume a typical flat surface of points 2 by 3. The point/vertex at [1,1] is used in many triangles. I've generated mesh data for a voxel-type engine that adds verts to a list based on face visibility, and now I want to remove all the duplicates. Can anyone come up with an efficient way of doing this? Because what I have is so bad it's not even funny (and I don't even think it's logically correct)...

    private void Optimize()
    {
        Vector3 v;
        Vector3 v2;
        for (int i = 0; i < Vertices.Count; i++)
        {
            v = Vertices[i];
            // Compare against every later vertex and merge exact duplicates.
            for (int j = i + 1; j < Vertices.Count; j++)
            {
                v2 = Vertices[j];
                if (v.x == v2.x && v.y == v2.y && v.z == v2.z)
                {
                    // Re-point indices at the kept vertex and shift the rest down.
                    for (int ind = 0; ind < Indices.Count; ind++)
                    {
                        if (Indices[ind] == j)
                            Indices[ind] = i;
                        else if (Indices[ind] > j && Indices[ind] > 0)
                            Indices[ind]--;
                    }
                    Vertices.RemoveAt(j);
                    Uvs.RemoveAt(j);
                    Normals.RemoveAt(j);
                }
            }
        }
    }

    EDIT: OK, I managed to get this (code sample above updated) to render an "optimised" set of verts, but the UV data is all wrong now, which would make sense because I'm basically just removing any UV vector that represents a UV coord for a removed vert and not actually considering what I need to do to "fix the tri", so to speak. The code now seemingly does work, but it's quite time-consuming; I'm still looking to optimise it further.
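
    For what it's worth, the usual way to avoid the O(n^2) pairwise scan is to build the deduplicated arrays in a single pass with a dictionary keyed on position, remapping indices as you go. A sketch of the idea in Python (the project itself is Unity/C#, but the structure ports over directly); all names are illustrative:

    # Sketch: one-pass vertex welding keyed on position.
    # vertices, uvs and normals are parallel lists; indices refers into vertices.
    def weld(vertices, uvs, normals, indices):
        seen = {}          # position -> new index
        new_verts, new_uvs, new_normals = [], [], []
        remap = []         # old index -> new index
        for old_i, pos in enumerate(vertices):
            key = tuple(pos)  # exact match; round the components here for a tolerance
            if key not in seen:
                seen[key] = len(new_verts)
                new_verts.append(pos)
                new_uvs.append(uvs[old_i])
                new_normals.append(normals[old_i])
            remap.append(seen[key])
        new_indices = [remap[i] for i in indices]
        return new_verts, new_uvs, new_normals, new_indices

    Note that two verts sharing a position but carrying different UVs or normals generally should not be merged (which is likely why the UVs broke); keying on the (position, uv, normal) tuple instead avoids that.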

    Read the article

  • Facebook Like javascript related to Time Spent Downloading a page Increase in GWT?

    - by donaldthe
    Hi, I installed the JavaScript version of the Facebook Like button on my website on December 15th. Take a look at this report from Google Webmaster Central (crawl stats: Googlebot activity in the last 90 days). The crawl stats are from Googlebot, which as far as I know doesn't execute JavaScript. Could the Facebook Like JavaScript code (the XFBML version) be related to the large spike in "Time spent downloading a page"? (By the way, the huge spike in November was caused by a mistake where every image request was getting a 301.) I'm not sure what caused the spike to go down by half somewhere in December; it may have been related to a faulty setting in web.config. I'm at a loss as to what I can do about this, or even how to tell whether this is my problem or Googlebot's crawl problem. Here is the Facebook code I am using to create the Like button. It is right after the opening body tag:

    <div id="fb-root"></div>
    <script>
      // Initialise the SDK once it has loaded.
      window.fbAsyncInit = function() {
        FB.init({appId: 'xxxxx', status: true, cookie: true, xfbml: true});
      };
      // Load the SDK asynchronously so it doesn't block page rendering.
      (function() {
        var e = document.createElement('script');
        e.async = true;
        e.src = document.location.protocol + '//connect.facebook.net/en_US/all.js';
        document.getElementById('fb-root').appendChild(e);
      }());
    </script>

    and this creates the Like box:

    <fb:like show_faces="false"></fb:like>

    If the JavaScript can't be the problem, any ideas on where to start looking would be appreciated.

    Read the article

  • Tournament bracket method to put distance between teammates

    - by Fred Thomsen
    I am using a proper binary tree to simulate a tournament bracket. It's preferred that competitors in the bracket who are teammates don't meet each other until the later rounds. What is an efficient method by which I can ensure that teammates in the bracket have as much distance as possible from each other? Are there any other data structures besides a tree that would be better for this purpose? EDIT: There can be more than two teams represented in a bracket.
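
    One simple heuristic is to treat the bracket as an array of slots, list the competitors team by team, and place each competitor using a bit-reversal permutation of their list index: consecutive entries (teammates) then land in different halves and quarters of the bracket, so they tend to meet as late as possible. A rough sketch; the function and data are illustrative:

    # Sketch: spread teammates across a single-elimination bracket via bit reversal.
    def seed_bracket(teams, bracket_size):
        """teams: list of lists of players; bracket_size: a power of two."""
        bits = bracket_size.bit_length() - 1
        players = [p for team in teams for p in team]
        players += [None] * (bracket_size - len(players))  # byes
        slots = [None] * bracket_size
        for index, player in enumerate(players):
            rev = int(format(index, "0{}b".format(bits))[::-1], 2)  # bit-reversed index
            slots[rev] = player
        return slots  # slots[0] plays slots[1], slots[2] plays slots[3], ...

    print(seed_bracket([["A1", "A2", "A3"], ["B1", "B2"], ["C1"]], 8))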

    Read the article

  • finding houses within a radius

    - by paul smith
    During an interview I was asked the following: given a real estate application that lists all houses currently on the market (i.e., for sale) within a given distance (say, for example, the user wants to find all houses within 20 miles), how would you design your application (both the data structure and the algorithm) to build this type of service? Any ideas? How would you implement it? I told him I didn't know because I've never done any geo-related stuff before.
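
    A common answer is to bucket the houses into a coarse latitude/longitude grid (or a geohash/spatial index) so a radius query only scans the nearby cells, then filter those candidates with an exact great-circle distance check. A minimal sketch of that design; the cell size and helper names are made up for illustration:

    import math
    from collections import defaultdict

    EARTH_RADIUS_MILES = 3959.0
    CELL_DEG = 0.5  # ~35 miles of latitude per cell; tune to the typical query radius

    def haversine_miles(lat1, lon1, lat2, lon2):
        # Great-circle distance between two points, in miles.
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dlat, dlon = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
        return 2 * EARTH_RADIUS_MILES * math.asin(math.sqrt(a))

    def cell_of(lat, lon):
        return (int(lat // CELL_DEG), int(lon // CELL_DEG))

    index = defaultdict(list)  # cell -> list of (house_id, lat, lon)

    def add_house(house_id, lat, lon):
        index[cell_of(lat, lon)].append((house_id, lat, lon))

    def houses_within(lat, lon, radius_miles):
        # Scan only the grid cells that can overlap the radius (longitude cells
        # shrink toward the poles, which this simple version ignores).
        span = int(radius_miles / (CELL_DEG * 69.0)) + 1
        base_lat, base_lon = cell_of(lat, lon)
        hits = []
        for dy in range(-span, span + 1):
            for dx in range(-span, span + 1):
                for house_id, hlat, hlon in index[(base_lat + dy, base_lon + dx)]:
                    if haversine_miles(lat, lon, hlat, hlon) <= radius_miles:
                        hits.append(house_id)
        return hits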

    Read the article

  • Tips on how to notify a user of new features in your game

    - by brent777
    I have noticed a problem when releasing new features for a game that I wrote for Android and published on the Google Play Store. Because my game is "stage-based" - and not a game like Hay Day, for example, where users go into the game every day since it can't really be finished - my users are not aware of new features that I release for the game. For example, if I publish a new version of my game containing a couple of new stages, most of their devices will just auto-update the game; they don't even notice this, so they never think to check out what's new. This is why an approach like popping open a dialog that showcases the new feature(s) the first time they open the game after the update is not really sufficient. I am looking for tips on an approach that will draw my users back into the game, where they could then read more detail about the new features in such a dialog. I was thinking of something like a notification that tells them to check out the new features after an update is done, but I am not sure if this is a good idea. Any suggestions to help me solve this problem would be awesome.

    Read the article

  • This Week on the Green Data Center Management Front

    Among the big news this week for those looking to make their data center more environmentally friendly: Two IBM POWER7-based servers become the first four-processor systems in the industry to qualify for Energy Star status; NetApp announces plans to have execs and others on hand to discuss green computing at SNW Spring 2010; and the feds are examining how the cloud will save money and energy.

    Read the article

  • Can I improve my AdWords quality scores with better landing pages?

    - by Eric
    I noticed that I have some keywords in my AdWords that are totally applicable to my site but the quality score of the keyword is 4 or 5. I'd like to get it up higher by creating custom versions of my site's home page (landing page) targeted specifically for people searching on those keywords. So for example, if we pretend my site sells pet food, my current home page has the phrase "dog food." I have a specific AdWords campaign for people searching on cat food (with cat food-specific ads). I'm thinking about changing the URL on those ads to something like http://mysite.com/cat.html, so a different home page comes up with the phrase "cat food." My thinking is that will help Google see that this new landing page is appropriate for the keywords and will raise my quality score for the "cat food" keywords. (Note that none of what I'm doing is shady or misleading; nobody would disagree that all of the keywords and ads I've created are perfect and appropriate for what my site offers.) Question: is what I describe the correct way to raise poor quality scores on keywords, and will it help?

    Read the article

  • Updated Master Data Services Documentation and Resources

    - by mattande
    (This post was contributed by Reagan Templin, Lead Technical Writer for the MDS Team) With the release of SQL Server 2008 R2, it’s a great time to check out the updated documentation and resources for the release, and for SQL Server 2008 R2 Master Data Services ("MDS") in particular. As you saw in the last post (New White Papers Available), there are some great white papers available on MSDN to get you going with MDS. Below you’ll find more information about other updated and newly published content... (read more)

    Read the article

  • How to manage many mobile device users at server side?

    - by Rami
    I built a social Android application in which users can see other users around them by GPS location. At the beginning things went well, as I had a low number of users, but now that the number is growing (about 1,500, plus 100 every day) it has revealed a major problem in my design. In my Google App Engine servlet I have a static HashMap that holds all the user profile objects, currently 1,500, and this number will increase as more users register. Why am I doing it this way? Every user who requests the users around him compares his GPS position with the other users and checks whether they are within a 10 km radius. This happens every five minutes on average. Consequently, I can't fetch the users from the DB every time, because the GAE read/write operation quota would tear me apart. The problem with this design? As the number of users increases, the HashMap turns to null every 4-6 hours; I think this interval is getting shorter, but I'm not sure. I'm fixing it by reloading the users from the DB every time I detect that it has become null, but this causes a 30-second outage for my users, so I'm looking for a better solution. I'm guessing it happens because of the size of the HashMap. Am I right? I have been advised to use a spatial database, but that would mean I can't work with GAE any more, and that I'd need to build my big server all over again and lose my existing DB. Is there something I can do with the existing tools? Thanks.
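
    One pattern that stays within plain GAE (no spatial database) is to bucket users into coarse latitude/longitude grid cells, so a "who is near me" request only touches the handful of cells that can overlap the 10 km radius instead of the whole user list, and each cell stays small enough to keep in memcache or the datastore rather than in one static map. A language-agnostic sketch in Python (the app itself is a Java servlet); every name below is illustrative:

    import math

    CELL_DEG = 0.1  # ~11 km of latitude per cell, roughly the query radius

    def cell_key(lat, lon):
        # One small, independently cacheable bucket per grid cell.
        return "cell:{}:{}".format(int(lat // CELL_DEG), int(lon // CELL_DEG))

    def neighbour_keys(lat, lon):
        # The 3x3 block of cells that can contain anyone within ~10 km
        # (widen the longitude range at high latitudes, where cells shrink).
        base_lat, base_lon = int(lat // CELL_DEG), int(lon // CELL_DEG)
        return ["cell:{}:{}".format(base_lat + dy, base_lon + dx)
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)]

    def distance_km(lat1, lon1, lat2, lon2):
        # Equirectangular approximation; accurate enough at a 10 km scale.
        x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
        y = math.radians(lat2 - lat1)
        return 6371.0 * math.hypot(x, y)

    def users_nearby(lat, lon, cells, radius_km=10.0):
        # "cells" maps cell keys to lists of (user_id, lat, lon), however stored.
        hits = []
        for key in neighbour_keys(lat, lon):
            for user_id, ulat, ulon in cells.get(key, []):
                if distance_km(lat, lon, ulat, ulon) <= radius_km:
                    hits.append(user_id)
        return hits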

    Read the article

  • Moving from a static site to a CMS with new URLs and meta-data for pages

    - by Chris J
    Hi, I am in the process of rebuilding a site from static pages to a CMS, which will be using mod_rewrite to generate the new page URLs. In this process our marketing people and I have decided to tidy up the descriptions, keywords and titles. E.g.: a page whose URL is currently "website-name/about_us.html" and has a title of "website-name - something not quite page specific" will change to "website-name/about-us/" with the title "about us - website-name", and may have a few keywords and the description changed. Our goal with updating the metadata is to improve our page rankings and keep in line with SEO best practices. Though our current page rankings are quite good in many respects, there is room for improvement. All of the pages will also have content changes (like rearranged heading tags, a new menu on all pages, new content in the footer, and extra pieces of dynamic content relating to other pages). In this process I plan to use 301 redirects for all the old URLs, pointing them to the new URLs. My question is: what can I expect to happen to the page rankings in Google, in the short term and long term? Will this be like kicking off a new site, which will have to build up trust over time, or will the original page rankings still have an effect?

    Read the article
