Search Results

Search found 31918 results on 1277 pages for 'google maps api 3'.


  • Twitter API and retweets

    - by Juan Manuel
    I'm trying to get the last tweet from the people I follow using the Twitter API (http://api.twitter.com/1/statuses/friends.json?screen_name=[username]), but I noticed that if the user's last tweet is a retweet, the JSON data does not contain a "status" element. Using the "user timeline" API does not work either; the last tweet is the last non-retweeted tweet. Is there a way to get the real last status, even if it's an RT, through the Twitter API?
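
    For what it's worth, the old v1 REST API also exposed an include_rts parameter on statuses/user_timeline that makes native retweets show up in the timeline. A minimal C# sketch (the screen name and count are placeholders, and this assumes the old unauthenticated v1 endpoint) might look like:

        // Minimal sketch, not the poster's code: pull the most recent status,
        // retweets included, via include_rts on statuses/user_timeline.
        using System;
        using System.Net;

        class LastStatus
        {
            static void Main()
            {
                // screen_name is a placeholder
                string url = "http://api.twitter.com/1/statuses/user_timeline.json"
                           + "?screen_name=someuser&count=1&include_rts=true";

                using (var client = new WebClient())
                {
                    // Returns a JSON array whose first element is the latest tweet,
                    // whether or not it is a retweet.
                    Console.WriteLine(client.DownloadString(url));
                }
            }
        }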

    Read the article

  • How do I get all the calendar entries for a particular time range using Google Calendar API

    - by BeWarned
    I want to view events over a specific time range for a specific calendar, but am having trouble using the API. It is a generic API, and it reminds me of using the DOM. The problem is that it seems difficult to work with because much of the information is in generic base classes. How do I get the events for a calendar using Groovy or Java? Does anybody have an example of passing credentials using curl? Example code would be appreciated.
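
    As a rough sketch (not tested against the current API): the GData-style Calendar feeds accept start-min and start-max query parameters for restricting the time range, and a ClientLogin token can be passed in an Authorization header; the same URL and header also work from curl via -H. The calendar address, dates, and token below are placeholders.

        // Rough C# sketch - CALENDAR_ID, the dates, and YOUR_TOKEN are placeholders.
        using System;
        using System.Net;

        class CalendarRange
        {
            static void Main()
            {
                string feedUrl = "https://www.google.com/calendar/feeds/CALENDAR_ID/private/full"
                               + "?start-min=2010-05-01T00:00:00"
                               + "&start-max=2010-05-31T23:59:59";

                using (var client = new WebClient())
                {
                    // Token previously obtained from the ClientLogin endpoint.
                    client.Headers.Add("Authorization", "GoogleLogin auth=YOUR_TOKEN");
                    Console.WriteLine(client.DownloadString(feedUrl)); // Atom feed of events
                }
            }
        }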

    Read the article

  • How do I change the text location on my Google map for map clusters in Fluster2?

    - by user347480
    I'm using Google Maps to create a bunch of POIs on a map. I'm using the Fluster2 library to cluster the POIs together when there are too many in a certain location. This all works fine and I'm very happy, but I would like to move where the Fluster2 text is written so it matches my custom icon, i.e. it has a marker with 5 written over it so that you know there are 5 POIs under that cluster. Does anyone know how to change this or set the text to a specific location? Thanks, The Net Duck

    Read the article

  • Where Next for Google Translate? And What of Information Quality?

    - by ultan o'broin
    Fascinating article in the UK Guardian newspaper called Can Google break the computer language barrier? In it, Andreas Zollman, who works on Google Translate, comments that the quality of Google Translate's output relative to the amount of data required to create that output is clearly now falling foul of the law of diminishing returns. He says: "Each doubling of the amount of translated data input led to about a 0.5% improvement in the quality of the output," he suggests, but the doublings are not infinite. "We are now at this limit where there isn't that much more data in the world that we can use," he admits. "So now it is much more important again to add on different approaches and rules-based models."

    The Translation Guy has a further discussion on this, called Google Translate is Finished. He says: "And there aren't that many doublings left, if any. I can't say how much text Google has assimilated into their machine translation databases, but it's been reported that they have scanned about 11% of all printed content ever published. So double that, and double it again, and once more, shoveling all that into the translation hopper, and pretty soon you get the sum of all human knowledge, which means a whopping 1.5% improvement in the quality of the engines when everything has been analyzed. That's what we've got to look forward to, at best, since Google spiders regularly surf the Web, which in its vastness dwarfs all previously published content. So to all intents and purposes, the statistical machine translation tools of Google are done. Outstanding job, Googlers. Thanks."

    Surprisingly, all this analysis hasn't raised that much comment from the fans of machine translation, or its detractors either for that matter. Perhaps it's the season of goodwill? What is clear to me, however, is that Google Translate isn't really finished (in any sense of the word). I am sure Google will investigate and come up with new rule-based translation models to enhance what they have already, and that these will also scale effectively where others didn't. So too will they harness human input, which really is the way to go to train MT in the quality direction.

    But that aside, what does it say about the quality of the data that is being used for statistical machine translation in the first place? From the Guardian article it's clear that a huge humanly translated corpus drove the gains for Google Translate, and now what's left is the dregs of badly translated and poorly created source materials that just can't deliver quality translations. There's a message about information quality there, surely.

    In the enterprise applications space, where we have some control over content, this whole debate reinforces the relationship between information quality at source and translation efficiency, regardless of the technology used to do the translation. But as more automation comes to the fore, that information quality is even more critical if you want anything approaching a scalable solution. This is important for user experience professionals. Issues like user generated content translation, multilingual personalization, and scalable language quality are central to a superior global UX; it's a competitive issue we cannot ignore.

    Read the article

  • OAuth 2.0 for Google Drive and the AdSense API

    OAuth 2.0 for Google Drive and the AdSense API Google engineers Nicolas Garnier, Ali Afshar, and Sergio Gomes discuss the OAuth 2.0 playground and how to use it with the Google Drive and AdSense APIs. OAuth 2.0 and its inner workings are explained in detail, and usage of the OAuth 2.0 playground in the context of Google Drive and the AdSense API is demonstrated thoroughly. The session wraps up with some discussion of questions from live viewers. From: GoogleDevelopers Views: 9 0 ratings Time: 57:02 More in Science & Technology

    Read the article

  • Introducing the Store Locator Library for Google Maps API

    Introducing the Store Locator Library for Google Maps API In this screencast, Chris Broadfoot gives an overview of the Store Locator library, a new open-source utility library that makes it simple for developers to create useful, valuable store locators. Documentation: goo.gl Follow Chris on G+: chrisbroadfoot.id.au From: GoogleDevelopers Views: 197 0 ratings Time: 03:42 More in Science & Technology

    Read the article

  • REL ME tag - trying to figure it out

    - by nekdo
    Regarding http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1229920, scroll down to the ''Examples'' section, point ''1.'', second code line, which is: <a rel="me" href="https://plus.google.com/105240469625818678725/"> <img src="//www.google.com/images/icons/ui/gprofile_button-16.png"></a> The page says that I have to add this line to the Contact Me page of my own website in order to get the Google Profile button. The exact code which one should copy and paste I am able to get here: http://www.google.com/webmasters/profilebutton/

    Questions: 1). As you can see at the second URL, to make the Google Profile button I need to use the "author" tag and not the "me" tag. But the first URL which I showed (the line in this message above) shows that I have to use the "me" tag, and even without this: width="32" height="32". I am already aware that I have to type (second URL) my own Google Profile URL. So do I just MANUALLY ( ! ) change this: <a rel="author" href="https://profiles.google.com/109412257237874861202"> <img src="http://www.google.com/images/icons/ui/gprofile_button-32.png" width="32" height="32"></a> to this (note: two changes done): <a rel="me" href="https://profiles.google.com/109412257237874861202"> <img src="http://www.google.com/images/icons/ui/gprofile_button-32.png"></a> Is this correct? I assume that plus.google.com is the same as profiles.google.com (both are Google Profile URLs).

    2). If I was wrong in my first question then this second one probably won't even be useful, but still: where exactly should I paste the code: <a rel="me" href="https://profiles.google.com/109412257237874861202"> <img src="http://www.google.com/images/icons/ui/gprofile_button-32.png"></a> inside the Author page of my own website? I think it doesn't matter where. Also: will this icon definitely be enough, or do I also have to make such an anchor with rel="me" in the shape of text (a word or sentence such as ''Look At My Google Profile'')? Or is just the icon really enough?

    3). In the same section (''1.'') of the same page (the first link provided above) it says that I first need to use the author tag to link to the Author/Contact Me page of my own website in order to later use the me tag. But I think there is a little mistake in the explanation. Shouldn't it be, instead of: <a rel="author" href="http://www.cnet.com/profile/iamjaygreene/">Jay Greene</a> this: <a rel="author" href="http://www.cnet.com/profile/iamjaygreene.html">Jay Greene</a> ?

    Read the article

  • Content API for Shopping Office Hours - June 12, 2012

    Content API for Shopping Office Hours - June 12, 2012 Hangout discussing Product Listing Ads (PLAs) and the Google Affiliate Network (GAN) with guests Mark Coppin (GAN) and Claire Hugo (PLAs) of Google. In the Hangout, we reference the video "How to create a new Product Listing Ads campaign" (www.youtube.com), which can be found on the Getting Started page of the Shopping/Ads integration site (www.google.com). Also, check out the GAN site to learn more: www.google.com From: GoogleDevelopers Views: 703 6 ratings Time: 31:23 More in Science & Technology

    Read the article

  • GDL Presents: Creative Sandbox | Google+ API

    GDL Presents: Creative Sandbox | Google+ API Tune in to hear about two cool, new campaigns that use the Google+ API from the core creative teams at Goodby Silverstein & Partners, Hook and RESN in conversation with a Google+ Developer Relations expert. They'll talk about how they pushed the possibilities of the Google+ API - and will inspire you to do the same. From: GoogleDevelopers Views: 0 0 ratings Time: 01:00:00 More in Science & Technology

    Read the article

  • Force Google to reindex [migrated]

    - by Matthias
    I changed the structure of my URLs. The pages are indexed by Google and have the following structure: http://mypage.com/myfolder/page.aspx. The new structure is http://mypage.com/page.aspx. Now all the URLs that Google knows are wrong. How can I tell Google to reindex and that the structure has changed? Internally I redirect in ASP.NET when the URL contains myfolder, but I want Google to update the URLs.
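
    The usual way to tell Google the structure changed is to answer requests for the old URLs with a 301 (Moved Permanently) pointing at the new location; Googlebot updates its index as it recrawls, and resubmitting a sitemap with the new URLs in Webmaster Tools speeds that along. A minimal ASP.NET sketch (in Global.asax, using the asker's folder name) rather than a silent internal rewrite:

        // Sketch only: 301 the old /myfolder/ URLs to their new location so
        // crawlers learn about the move, instead of rewriting internally.
        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            string path = Request.Url.AbsolutePath;
            if (path.StartsWith("/myfolder/", StringComparison.OrdinalIgnoreCase))
            {
                Response.StatusCode = 301; // Moved Permanently
                Response.AddHeader("Location", path.Replace("/myfolder/", "/"));
                Response.End();
            }
        }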

    Read the article

  • Automated emails from PHP using Google Apps?

    - by haroldo
    I've got Google Apps set up, with the MX records working. Any emails to [email protected] now go to my Google Apps account, great. Now I want to use Google Apps for sending automated emails (i.e. lost password, registration). How do I set it up so that PHP's mail() uses Google Apps? Is this correct? Also, is there any way to test (sending myself multiple emails) without Google thinking I'm spamming?
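
    PHP's mail() hands the message to the local sendmail binary and doesn't do authenticated SMTP, so the usual route is an SMTP library (PHPMailer, Zend_Mail, or similar) pointed at smtp.gmail.com on port 587 with TLS and the Google Apps account's credentials. The same idea, sketched in C# purely for illustration (addresses and password are placeholders):

        // Illustration only - in PHP you'd configure an SMTP library the same way.
        using System.Net;
        using System.Net.Mail;

        class AppsMailer
        {
            static void Main()
            {
                var client = new SmtpClient("smtp.gmail.com", 587)
                {
                    EnableSsl = true, // STARTTLS on port 587
                    Credentials = new NetworkCredential("noreply@yourdomain.com", "password")
                };

                client.Send("noreply@yourdomain.com", "user@example.com",
                            "Lost password", "Your reset link goes here.");
            }
        }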

    Read the article

  • Set up Gmail with Google Apps for own domain

    - by erdomester
    I rent a server from a German company. I have remote access to it as well as WHM and cPanel. I decided to use Google's mail servers for obvious reasons. I am not an admin, just an average guy trying to set up what needs to be set up. The problem is I am unable to make the necessary settings. I watched YouTube tutorials and followed written ones as well as Google's help, but there is (at least) one serious problem with my domain settings. The domain console always says: Your MX records are incorrect

    When I check dappwall.com in mxtoolbox.com it says:
    Pref Hostname IP Address TTL
    10 mail.dappwall.com 46.4.88.247 24 hrs

    But this is not the host name. I checked WHM and my hostname is server1.dappwall.com. I can confirm it by typing the hostname command in PuTTY. However, if I do an MX lookup at mxtoolbox.com on server1.dappwall.com or mail.dappwall.com I get: Lookup failed after 1 name servers timed out or responded non-authoritatively

    I ran checks with the Google Apps Toolbox on dappwall.com and two problems emerged:

    1. No Google mail exchangers found. Relayhost configuration? 10 mail.dappwall.com

    In Google Apps > Settings for Gmail > Advanced settings it also says that my current MX record for dappwall.com is: Priority Points to 10 MAIL.DAPPWALL.COM. So mail.dappwall.com again. I also have access to a robot provided by the company I rent the server from. There I see this mail host in two places, but how should I (if it's necessary) modify this? I set Email routing to Automatically Detect Configuration.

    2. There SHOULD be a valid SPF record: "v=spf1 include:_spf.google.com ~all"

    In the DNS Zone Editor I added this SPF record:
    Name TTL Class Type Record
    dappwall.com. 1440 IN TXT v=spf1 include:_spf.google.com ~all

    On the cPanel Email Authentication page it says: SPF: Status: Enabled. Warning: cPanel is unable to verify that this server is an authoritative nameserver for dappwall.com. [?] Your current raw SPF record is: v=spf1 include:_spf.google.com ~all

    How can I confirm that my server is an authoritative nameserver for dappwall.com? In WHM > Service Configuration > Mailserver Selection, Dovecot was set but I disabled it (I don't know if that's OK). What am I missing here? Where is that mail.dappwall.com coming from?
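
    For reference, the MX record set Google documented for Google Apps at the time was roughly the following (verify against the current Google Apps help before changing the zone). An MX lookup on dappwall.com should return these Google hostnames, so the 10 mail.dappwall.com entry in the zone is the likely culprit:

        Priority   Points to
        1          ASPMX.L.GOOGLE.COM
        5          ALT1.ASPMX.L.GOOGLE.COM
        5          ALT2.ASPMX.L.GOOGLE.COM
        10         ASPMX2.GOOGLEMAIL.COM
        10         ASPMX3.GOOGLEMAIL.COM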

    Read the article

  • How to remove/de-index a page from Google?

    - by Jason
    On the results page when I Google "e-luminate", the 3rd and 4th links seem to point to a specific directory deep within the folders which store the images. How can I get rid of these 2 results from the Google search results? How can I get Google to de-index them? I checked on the server and the folders did not seem different from other folders, but these 2 paths seem to get indexed by Google. Thank you.
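
    One common route, sketched here with a made-up path since the real folder isn't given: block the image directory in robots.txt and then request removal of the already-indexed URLs through the URL removal tool in Google Webmaster Tools.

        # robots.txt at the site root - the folder name here is hypothetical
        User-agent: *
        Disallow: /images/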

    Read the article

  • How do I show full names of folders that contain bookmarks in the Google Chrome Bookmark Bar?

    - by earthsmurf
    I have Google Chrome installed on Ubuntu and I have made a number of folders of bookmarks placed within my bookmark bar. The problem is that some of the folders' names are abbreviated, like "Ru---" for "Ruby". This is kind of annoying when I have more than one folder that starts with "Ru". Also, my bookmark bar is not completely full, so there is plenty of space for these names to show. Does anyone know how to fix this?

    Read the article

  • Where to get Google Chrome 12 for Linux i386?

    - by pts
    I need a .deb download of Google Chrome 12 (not the newest version, but version 12) for i386 Linux. Where can I get it? I need this version because my operating system doesn't have the libraries required by newer Chrome .deb packages. In this question I am not interested in upgrading or changing my operating system, and I'm not interested in using different browsers, and I'm aware of the security implications of using that particular old browser. Please give me a working download link if you know one.

    Read the article

  • How to think in data stores instead of databases?

    - by Jim
    As an example, Google App Engine uses data stores, not a database, to store data. Does anybody have any tips for using data stores instead of databases? It seems I've trained my mind to think 100% in object relationships that map directly to table structures, and now it's hard to see anything differently. I can understand some of the benefits of data stores (e.g. performance and the ability to distribute data), but some good database functionality is sacrificed (e.g. joins). Does anybody who has worked with data stores like BigTable have any good advice on working with them?
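
    One habit that helps with the mindset shift, shown here as a generic illustration rather than any particular datastore's API: denormalize at write time so an entity already carries the fields a query needs, instead of joining at read time.

        // Generic illustration of trading read-time joins for write-time denormalization.
        // Relational habit: Order stores only CustomerId and you JOIN to get the name.
        // Datastore habit: copy the fields your queries and screens need onto the entity.
        class Order
        {
            public string Id { get; set; }
            public string CustomerId { get; set; }

            // Denormalized copies, written when the order is created, so listing
            // orders never requires a join against a Customer table.
            public string CustomerName { get; set; }
            public string CustomerEmail { get; set; }

            public decimal Total { get; set; }
        }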

    Read the article

  • Where to start with Google Reader as an API?

    - by EAMann
    I want to build a widget for WordPress that simultaneously displays my latest Google Reader items on the front page and allows for management from behind the WordPress dashboard. I can already add my "shared" items using code I've found in various Google searches, but that's not exactly what I'm looking for. I like the functionality of the Google Reader widget in iGoogle, and I want to replicate that on the WordPress dashboard and build a read-only version for the WordPress front-end. Where do I start in the API (public or 'unofficial') to get this built?

    Read the article

  • Introducing the Earthquake Locator – A Bing Maps Silverlight Application, part 1

    - by Bobby Diaz
    Update: Live demo and source code now available!  The recent wave of earthquakes (no pun intended) being reported in the news got me wondering about the frequency and severity of earthquakes around the world. Since I’ve been doing a lot of Silverlight development lately, I decided to scratch my curiosity with a nice little Bing Maps application that will show the location and relative strength of recent seismic activity. Here is a list of technologies this application will utilize, so be sure to have everything downloaded and installed if you plan on following along. Silverlight 3 WCF RIA Services Bing Maps Silverlight Control * Managed Extensibility Framework (optional) MVVM Light Toolkit (optional) log4net (optional) * If you are new to Bing Maps or have not signed up for a Developer Account, you will need to visit www.bingmapsportal.com to request a Bing Maps key for your application. Getting Started We start out by creating a new Silverlight Application called EarthquakeLocator and specify that we want to automatically create the Web Application Project with RIA Services enabled. I cleaned up the web app by removing the Default.aspx and EarthquakeLocatorTestPage.html. Then I renamed the EarthquakeLocatorTestPage.aspx to Default.aspx and set it as my start page. I also set the development server to use a specific port, as shown below. RIA Services Next, I created a Services folder in the EarthquakeLocator.Web project and added a new Domain Service Class called EarthquakeService.cs. This is the RIA Services Domain Service that will provide earthquake data for our client application. I am not using LINQ to SQL or Entity Framework, so I will use the <empty domain service class> option. We will be pulling data from an external Atom feed, but this example could just as easily pull data from a database or another web service. This is an important distinction to point out because each scenario I just mentioned could potentially use a different Domain Service base class (i.e. LinqToSqlDomainService<TDataContext>). Now we can start adding Query methods to our EarthquakeService that pull data from the USGS web site. Here is the complete code for our service class: using System; using System.Collections.Generic; using System.IO; using System.Linq; using System.ServiceModel.Syndication; using System.Web.DomainServices; using System.Web.Ria; using System.Xml; using log4net; using EarthquakeLocator.Web.Model;   namespace EarthquakeLocator.Web.Services {     /// <summary>     /// Provides earthquake data to client applications.     /// </summary>     [EnableClientAccess()]     public class EarthquakeService : DomainService     {         private static readonly ILog log = LogManager.GetLogger(typeof(EarthquakeService));           // USGS Data Feeds: http://earthquake.usgs.gov/earthquakes/catalogs/         private const string FeedForPreviousDay =             "http://earthquake.usgs.gov/earthquakes/catalogs/1day-M2.5.xml";         private const string FeedForPreviousWeek =             "http://earthquake.usgs.gov/earthquakes/catalogs/7day-M2.5.xml";           /// <summary>         /// Gets the earthquake data for the previous week.         
/// </summary>         /// <returns>A queryable collection of <see cref="Earthquake"/> objects.</returns>         public IQueryable<Earthquake> GetEarthquakes()         {             var feed = GetFeed(FeedForPreviousWeek);             var list = new List<Earthquake>();               if ( feed != null )             {                 foreach ( var entry in feed.Items )                 {                     var quake = CreateEarthquake(entry);                     if ( quake != null )                     {                         list.Add(quake);                     }                 }             }               return list.AsQueryable();         }           /// <summary>         /// Creates an <see cref="Earthquake"/> object for each entry in the Atom feed.         /// </summary>         /// <param name="entry">The Atom entry.</param>         /// <returns></returns>         private Earthquake CreateEarthquake(SyndicationItem entry)         {             Earthquake quake = null;             string title = entry.Title.Text;             string summary = entry.Summary.Text;             string point = GetElementValue<String>(entry, "point");             string depth = GetElementValue<String>(entry, "elev");             string utcTime = null;             string localTime = null;             string depthDesc = null;             double? magnitude = null;             double? latitude = null;             double? longitude = null;             double? depthKm = null;               if ( !String.IsNullOrEmpty(title) && title.StartsWith("M") )             {                 title = title.Substring(2, title.IndexOf(',')-3).Trim();                 magnitude = TryParse(title);             }             if ( !String.IsNullOrEmpty(point) )             {                 var values = point.Split(' ');                 if ( values.Length == 2 )                 {                     latitude = TryParse(values[0]);                     longitude = TryParse(values[1]);                 }             }             if ( !String.IsNullOrEmpty(depth) )             {                 depthKm = TryParse(depth);                 if ( depthKm != null )                 {                     depthKm = Math.Round((-1 * depthKm.Value) / 100, 2);                 }             }             if ( !String.IsNullOrEmpty(summary) )             {                 summary = summary.Replace("</p>", "");                 var values = summary.Split(                     new string[] { "<p>" },                     StringSplitOptions.RemoveEmptyEntries);                   if ( values.Length == 3 )                 {                     var times = values[1].Split(                         new string[] { "<br>" },                         StringSplitOptions.RemoveEmptyEntries);                       if ( times.Length > 0 )                     {                         utcTime = times[0];                     }                     if ( times.Length > 1 )                     {                         localTime = times[1];                     }                       depthDesc = values[2];                     depthDesc = "Depth: " + depthDesc.Substring(depthDesc.IndexOf(":") + 2);                 }             }               if ( latitude != null && longitude != null )             {                 quake = new Earthquake()                 {                     Id = entry.Id,                     Title = entry.Title.Text,                     Summary = entry.Summary.Text,                     Date = entry.LastUpdatedTime.DateTime,                     Url = 
entry.Links.Select(l => Path.Combine(l.BaseUri.OriginalString,                         l.Uri.OriginalString)).FirstOrDefault(),                     Age = entry.Categories.Where(c => c.Label == "Age")                         .Select(c => c.Name).FirstOrDefault(),                     Magnitude = magnitude.GetValueOrDefault(),                     Latitude = latitude.GetValueOrDefault(),                     Longitude = longitude.GetValueOrDefault(),                     DepthInKm = depthKm.GetValueOrDefault(),                     DepthDesc = depthDesc,                     UtcTime = utcTime,                     LocalTime = localTime                 };             }               return quake;         }           private T GetElementValue<T>(SyndicationItem entry, String name)         {             var el = entry.ElementExtensions.Where(e => e.OuterName == name).FirstOrDefault();             T value = default(T);               if ( el != null )             {                 value = el.GetObject<T>();             }               return value;         }           private double? TryParse(String value)         {             double d;             if ( Double.TryParse(value, out d) )             {                 return d;             }             return null;         }           /// <summary>         /// Gets the feed at the specified URL.         /// </summary>         /// <param name="url">The URL.</param>         /// <returns>A <see cref="SyndicationFeed"/> object.</returns>         public static SyndicationFeed GetFeed(String url)         {             SyndicationFeed feed = null;               try             {                 log.Debug("Loading RSS feed: " + url);                   using ( var reader = XmlReader.Create(url) )                 {                     feed = SyndicationFeed.Load(reader);                 }             }             catch ( Exception ex )             {                 log.Error("Error occurred while loading RSS feed: " + url, ex);             }               return feed;         }     } }   The only method that will be generated in the client side proxy class, EarthquakeContext, will be the GetEarthquakes() method. The reason being that it is the only public instance method and it returns an IQueryable<Earthquake> collection that can be consumed by the client application. GetEarthquakes() calls the static GetFeed(String) method, which utilizes the built in SyndicationFeed API to load the external data feed. You will need to add a reference to the System.ServiceModel.Web library in order to take advantage of the RSS/Atom reader. The API will also allow you to create your own feeds to serve up in your applications. Model I have also created a Model folder and added a new class, Earthquake.cs. The Earthquake object will hold the various properties returned from the Atom feed. Here is a sample of the code for that class. Notice the [Key] attribute on the Id property, which is required by RIA Services to uniquely identify the entity. using System; using System.Collections.Generic; using System.Linq; using System.Runtime.Serialization; using System.ComponentModel.DataAnnotations;   namespace EarthquakeLocator.Web.Model {     /// <summary>     /// Represents an earthquake occurrence and related information.     /// </summary>     [DataContract]     public class Earthquake     {         /// <summary>         /// Gets or sets the id.         
/// </summary>         /// <value>The id.</value>         [Key]         [DataMember]         public string Id { get; set; }           /// <summary>         /// Gets or sets the title.         /// </summary>         /// <value>The title.</value>         [DataMember]         public string Title { get; set; }           /// <summary>         /// Gets or sets the summary.         /// </summary>         /// <value>The summary.</value>         [DataMember]         public string Summary { get; set; }           // additional properties omitted     } }   View Model The recent trend to use the MVVM pattern for WPF and Silverlight provides a great way to separate the data and behavior logic out of the user interface layer of your client applications. I have chosen to use the MVVM Light Toolkit for the Earthquake Locator, but there are other options out there if you prefer another library. That said, I went ahead and created a ViewModel folder in the Silverlight project and added a EarthquakeViewModel class that derives from ViewModelBase. Here is the code: using System; using System.Collections.ObjectModel; using System.ComponentModel.Composition; using System.ComponentModel.Composition.Hosting; using Microsoft.Maps.MapControl; using GalaSoft.MvvmLight; using EarthquakeLocator.Web.Model; using EarthquakeLocator.Web.Services;   namespace EarthquakeLocator.ViewModel {     /// <summary>     /// Provides data for views displaying earthquake information.     /// </summary>     public class EarthquakeViewModel : ViewModelBase     {         [Import]         public EarthquakeContext Context;           /// <summary>         /// Initializes a new instance of the <see cref="EarthquakeViewModel"/> class.         /// </summary>         public EarthquakeViewModel()         {             var catalog = new AssemblyCatalog(GetType().Assembly);             var container = new CompositionContainer(catalog);             container.ComposeParts(this);             Initialize();         }           /// <summary>         /// Initializes a new instance of the <see cref="EarthquakeViewModel"/> class.         /// </summary>         /// <param name="context">The context.</param>         public EarthquakeViewModel(EarthquakeContext context)         {             Context = context;             Initialize();         }           private void Initialize()         {             MapCenter = new Location(20, -170);             ZoomLevel = 2;         }           #region Private Methods           private void OnAutoLoadDataChanged()         {             LoadEarthquakes();         }           private void LoadEarthquakes()         {             var query = Context.GetEarthquakesQuery();             Context.Earthquakes.Clear();               Context.Load(query, (op) =>             {                 if ( !op.HasError )                 {                     foreach ( var item in op.Entities )                     {                         Earthquakes.Add(item);                     }                 }             }, null);         }           #endregion Private Methods           #region Properties           private bool autoLoadData;         /// <summary>         /// Gets or sets a value indicating whether to auto load data.         
/// </summary>         /// <value><c>true</c> if auto loading data; otherwise, <c>false</c>.</value>         public bool AutoLoadData         {             get { return autoLoadData; }             set             {                 if ( autoLoadData != value )                 {                     autoLoadData = value;                     RaisePropertyChanged("AutoLoadData");                     OnAutoLoadDataChanged();                 }             }         }           private ObservableCollection<Earthquake> earthquakes;         /// <summary>         /// Gets the collection of earthquakes to display.         /// </summary>         /// <value>The collection of earthquakes.</value>         public ObservableCollection<Earthquake> Earthquakes         {             get             {                 if ( earthquakes == null )                 {                     earthquakes = new ObservableCollection<Earthquake>();                 }                   return earthquakes;             }         }           private Location mapCenter;         /// <summary>         /// Gets or sets the map center.         /// </summary>         /// <value>The map center.</value>         public Location MapCenter         {             get { return mapCenter; }             set             {                 if ( mapCenter != value )                 {                     mapCenter = value;                     RaisePropertyChanged("MapCenter");                 }             }         }           private double zoomLevel;         /// <summary>         /// Gets or sets the zoom level.         /// </summary>         /// <value>The zoom level.</value>         public double ZoomLevel         {             get { return zoomLevel; }             set             {                 if ( zoomLevel != value )                 {                     zoomLevel = value;                     RaisePropertyChanged("ZoomLevel");                 }             }         }           #endregion Properties     } }   The EarthquakeViewModel class contains all of the properties that will be bound to by the various controls in our views. Be sure to read through the LoadEarthquakes() method, which handles calling the GetEarthquakes() method in our EarthquakeService via the EarthquakeContext proxy, and also transfers the loaded entities into the view model’s Earthquakes collection. Another thing to notice is what’s going on in the default constructor. I chose to use the Managed Extensibility Framework (MEF) for my composition needs, but you can use any dependency injection library or none at all. To allow the EarthquakeContext class to be discoverable by MEF, I added the following partial class so that I could supply the appropriate [Export] attribute: using System; using System.ComponentModel.Composition;   namespace EarthquakeLocator.Web.Services {     /// <summary>     /// The client side proxy for the EarthquakeService class.     /// </summary>     [Export]     public partial class EarthquakeContext     {     } }   One last piece I wanted to point out before moving on to the user interface, I added a client side partial class for the Earthquake entity that contains helper properties that we will bind to later: using System;   namespace EarthquakeLocator.Web.Model {     /// <summary>     /// Represents an earthquake occurrence and related information.     /// </summary>     public partial class Earthquake     {         /// <summary>         /// Gets the location based on the current Latitude/Longitude.         
/// </summary>         /// <value>The location.</value>         public string Location         {             get { return String.Format("{0},{1}", Latitude, Longitude); }         }           /// <summary>         /// Gets the size based on the Magnitude.         /// </summary>         /// <value>The size.</value>         public double Size         {             get { return (Magnitude * 3); }         }     } }   View Now the fun part! Usually, I would create a Views folder to place all of my View controls in, but I took the easy way out and added the following XAML code to the default MainPage.xaml file. Be sure to add the bing prefix associating the Microsoft.Maps.MapControl namespace after adding the assembly reference to your project. The MVVM Light Toolkit project templates come with a ViewModelLocator class that you can use via a static resource, but I am instantiating the EarthquakeViewModel directly in my user control. I am setting the AutoLoadData property to true as a way to trigger the LoadEarthquakes() method call. The MapItemsControl found within the <bing:Map> control binds its ItemsSource property to the Earthquakes collection of the view model, and since it is an ObservableCollection<T>, we get the automatic two way data binding via the INotifyCollectionChanged interface. <UserControl x:Class="EarthquakeLocator.MainPage"     xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"     xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"     xmlns:d="http://schemas.microsoft.com/expression/blend/2008"     xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"     xmlns:bing="clr-namespace:Microsoft.Maps.MapControl;assembly=Microsoft.Maps.MapControl"     xmlns:vm="clr-namespace:EarthquakeLocator.ViewModel"     mc:Ignorable="d" d:DesignWidth="640" d:DesignHeight="480" >     <UserControl.Resources>         <DataTemplate x:Key="EarthquakeTemplate">             <Ellipse Fill="Red" Stroke="Black" StrokeThickness="1"                      Width="{Binding Size}" Height="{Binding Size}"                      bing:MapLayer.Position="{Binding Location}"                      bing:MapLayer.PositionOrigin="Center">                 <ToolTipService.ToolTip>                     <StackPanel>                         <TextBlock Text="{Binding Title}" FontSize="14" FontWeight="Bold" />                         <TextBlock Text="{Binding UtcTime}" />                         <TextBlock Text="{Binding LocalTime}" />                         <TextBlock Text="{Binding DepthDesc}" />                     </StackPanel>                 </ToolTipService.ToolTip>             </Ellipse>         </DataTemplate>     </UserControl.Resources>       <UserControl.DataContext>         <vm:EarthquakeViewModel AutoLoadData="True" />     </UserControl.DataContext>       <Grid x:Name="LayoutRoot">           <bing:Map x:Name="map" CredentialsProvider="--Your-Bing-Maps-Key--"                   Center="{Binding MapCenter, Mode=TwoWay}"                   ZoomLevel="{Binding ZoomLevel, Mode=TwoWay}">             <bing:MapItemsControl ItemsSource="{Binding Earthquakes}"                                   ItemTemplate="{StaticResource EarthquakeTemplate}" />         </bing:Map>       </Grid> </UserControl>   The EarthquakeTemplate defines the Ellipse that will represent each earthquake, the Width and Height that are determined by the Magnitude, the Position on the map, and also the tooltip that will appear when we mouse over each data point. 
Running the application will give us the following result (shown with a tooltip example): That concludes this portion of our show but I plan on implementing additional functionality in later blog posts. Be sure to come back soon to see the next installments in this series. Enjoy!   Additional Resources USGS Earthquake Data Feeds Brad Abrams shows how RIA Services and MVVM can work together

    Read the article

  • ASP.NET Web API - Screencast series Part 2: Getting Data

    - by Jon Galloway
    We're continuing a six part series on ASP.NET Web API that accompanies the getting started screencast series. This is an introductory screencast series that walks through from File / New Project to some more advanced scenarios like Custom Validation and Authorization. The screencast videos are all short (3-5 minutes) and the sample code for the series is both available for download and browsable online. I did the screencasts, but the samples were written by the ASP.NET Web API team. In Part 1 we looked at what ASP.NET Web API is, why you'd care, did the File / New Project thing, and did some basic HTTP testing using browser F12 developer tools.

    This second screencast starts to build out the Comments example - a JSON API that's accessed via jQuery. This sample uses a simple in-memory repository. At this early stage, the GET /api/values/ just returns an IEnumerable<Comment>. In part 4 we'll add on paging and filtering, and it gets more interesting.

    The get by id (e.g. GET /api/values/5) case is a little more interesting. The method just returns a Comment if the Comment ID is valid, but if it's not found we throw an HttpResponseException with the correct HTTP status code (HTTP 404 Not Found). This is an important thing to get - HTTP defines common response status codes, so there's no need to implement any custom messaging here - we tell the requestor that the resource they requested wasn't there.

        public Comment GetComment(int id)
        {
            Comment comment;
            if (!repository.TryGet(id, out comment))
                throw new HttpResponseException(HttpStatusCode.NotFound);
            return comment;
        }

    This is great because it's standard, and any client should know how to handle it. There's no need to invent custom messaging here, and we can talk to any client that understands HTTP - not just jQuery, and not just browsers. But it's crazy easy to consume an HTTP API that returns JSON via jQuery. The example uses Knockout to bind the JSON values to HTML elements, but the thing to notice is that calling into this /api/comments is really simple, and the return from the $.get() method is just JSON data, which is really easy to work with in JavaScript (since JSON stands for JavaScript Object Notation and is the native serialization format in JavaScript).

        $(function() {
            $("#getComments").click(function () {
                // We're using a Knockout model. This clears out the existing comments.
                viewModel.comments([]);
                $.get('/api/comments', function (data) {
                    // Update the Knockout model (and thus the UI) with the comments
                    // received back from the Web API call.
                    viewModel.comments(data);
                });
            });
        });

    That's it! Easy, huh? In Part 3, we'll start modifying data on the server using POST and DELETE.
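
    As a taste of the modification side coming in Part 3 (this is a hedged sketch, not the series' actual sample code), a POST action typically returns the created resource with a 201 Created status and a Location header pointing at the new comment:

        // Sketch only - names like repository and comment.ID are assumed from context.
        public HttpResponseMessage PostComment(Comment comment)
        {
            comment = repository.Add(comment); // assume the repository assigns the ID

            var response = Request.CreateResponse(HttpStatusCode.Created, comment);
            response.Headers.Location = new Uri(Url.Link("DefaultApi", new { id = comment.ID }));
            return response;
        }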

    Read the article

  • How does Google maintain its code?

    - by John Maxim
    The PageRank algorithm is not revealed to any of their associate programmers, but is only accessible by Larry Page or maybe Sergey Brin. I wonder how they go about managing their coding. There are times when you need to build something up and may need more hands to help with the coding, but you may also want to keep some secrets to yourself. I'm not saying I have secrets, but I wonder how they manage their coding. I'm sure there are some ways to do it decently and professionally. One of the factors in why Friendster failed was that they lost control over their coding. I think this is an interesting question, but not an easy one to answer; maybe only a marginal few know.

    Read the article

  • Explain DNS/Content/Registration with services such as Blogger and Go Daddy

    - by user8926
    I have this kind of setup for Google Sites and Blogger in GoDaddy, below. I cannot get URL framing (not URL masking) working with them. I am unsure what the problem is, and cannot understand what services such as Blogger and GoDaddy really do.

    Wrong A records in GoDaddy!

    ; A Records
    @ 3600 IN A 216.239.32.21
    art 3600 IN A 64.202.189.170
    abc 3600 IN A 64.202.189.170
    @ 3600 IN A 216.239.34.21
    @ 3600 IN A 216.239.36.21
    @ 3600 IN A 216.239.38.21
    lol 3600 IN A 64.202.189.170

    ; CNAME Records
    www 3600 IN CNAME ghs.google.com
    mobilemail 3600 IN CNAME mobilemail-v01.prod.mesa1.secureserver.net
    pda 3600 IN CNAME mobilemail-v01.prod.mesa1.secureserver.net
    email 3600 IN CNAME email.secureserver.net
    imap 3600 IN CNAME imap.secureserver.net
    mail 3600 IN CNAME pop.secureserver.net
    pop 3600 IN CNAME pop.secureserver.net
    smtp 3600 IN CNAME smtp.secureserver.net
    ftp 3600 IN CNAME @
    webmail 3600 IN CNAME webmail.secureserver.net
    e 3600 IN CNAME email.secureserver.net

    Please explain the "custom domain" and how I can hide my Blogger URL. I am still unsure what the "custom domain" in Blogger really means. Does it mean that the content hosting is moved to some other site? Or does it mean that it just hides the blogspot URL behind another URL? Or is it this so-called "301" thing or "URL redirection" or something else?

    Related question: Control Content Hosting, DNS Hosting and Registration with command line?

    Read the article
