Search Results

Search found 31448 results on 1258 pages for 'google analytics api'.

Page 21/1258 | < Previous Page | 17 18 19 20 21 22 23 24 25 26 27 28  | Next Page >

  • Google Calendar feed api deleted events

    - by hsmit
    I'm syncing the Google Calendar with my application (I store events in a database). When an event is updated I can easily find the last updates by sorting the event feed on the 'updated' order. However, if an event is removed / deleted, how can I track this update from the feed?

    Read the article

  • Removing Directions markers from the Google Maps API V3

    - by anonymous
    To remove a normal marker from a map, I understand you simply call marker.setMap(null), but the Google Maps directions service automatically adds markers A and B onto the map (when calculating directions from point A to point B). I do not have control over these markers, so I cannot remove them in the normal way. So how can I remove these markers (I have custom markers on the map instead)?
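
    A common way to handle this (a sketch, not part of the original question) is to suppress the default A/B markers when constructing the DirectionsRenderer; suppressMarkers is a documented option of the Maps JavaScript API v3 renderer, and the map, origin and destination values below are placeholder assumptions:

    // Sketch: render a route without the default A/B markers, so only
    // your own custom markers appear on the map.
    var directionsService = new google.maps.DirectionsService();
    var directionsRenderer = new google.maps.DirectionsRenderer({
        suppressMarkers: true // do not draw the automatic A and B markers
    });
    directionsRenderer.setMap(map); // assumes 'map' is an existing google.maps.Map

    directionsService.route({
        origin: 'point A address',
        destination: 'point B address',
        travelMode: google.maps.TravelMode.DRIVING
    }, function (result, status) {
        if (status === google.maps.DirectionsStatus.OK) {
            directionsRenderer.setDirections(result);
        }
    });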

    Read the article

  • View Your Google Calendar in Outlook 2010

    - by Mysticgeek
    Google Calendar is a great way to share appointments and synchronize your schedule with others. Here we show you how to view your Google Calendar in Outlook 2010 too. In Google Calendar, log in and under My Calendars click on Settings. Now click on the calendar you want to view in Outlook. Scroll down the page and click the ICAL button in the Private Address section (or Calendar Address if it’s a public calendar), then copy the address to your clipboard. In Outlook 2010, open up your calendar, click the Home tab on the Ribbon, and under Manage Calendars click Open Calendar \ From Internet… Now enter the link location into the New Internet Calendar field and click OK. Click Yes in the dialog box that comes up verifying you want to subscribe to it. If you want more subscription options, click the Advanced button. Here you can name the folder, type in a description, and choose whether you want to download attachments. That is all there is to it! Now you will be able to view your Google Calendar in Outlook 2010. You’ll also be able to view your local computer calendar and the Google Calendar side by side. Keep in mind that this only gives you the ability to view the Google Calendar…it’s read-only. Any changes you make on the Google Calendar site will show up when you do a send/receive. If you live out of Outlook during the day, you might want the ability to view what is going on with your Google Calendar(s) as well. If you’re an Outlook 2007 user, check out our article on how to view your Google Calendar in Outlook 2007.

    Read the article

  • Adding AjaxOnly Filter in ASP.NET Web API

    - by imran_ku07
    Introduction: Currently, ASP.NET MVC 4, ASP.NET Web API and ASP.NET Single Page Application are the hottest topics in the ASP.NET community. In particular, a lot of developers love the inclusion of ASP.NET Web API in ASP.NET MVC. ASP.NET Web API makes it very simple to build HTTP RESTful services, which can be easily consumed from desktop/mobile browsers, Silverlight/Flash applications and many different types of clients. Client-side Ajax may be a very important consumer for various service providers. Sometimes a service provider may need some (or all) of its services to be accessible only from Ajax. In this article, I will show you how to implement an AjaxOnly filter in an ASP.NET Web API application.

    Description: First of all, create a new ASP.NET MVC 4 (Web API) application. Then create a new AjaxOnly.cs file and add the following lines to it:

    public class AjaxOnlyAttribute : System.Web.Http.Filters.ActionFilterAttribute
    {
        public override void OnActionExecuting(System.Web.Http.Controllers.HttpActionContext actionContext)
        {
            var request = actionContext.Request;
            var headers = request.Headers;
            if (!headers.Contains("X-Requested-With") || headers.GetValues("X-Requested-With").FirstOrDefault() != "XMLHttpRequest")
                actionContext.Response = request.CreateResponse(HttpStatusCode.NotFound);
        }
    }

    This is an action filter which simply checks the request for an X-Requested-With header with the value XMLHttpRequest. If the X-Requested-With header is not present in the request, or its value is not XMLHttpRequest, the filter returns a 404 (NotFound) response to the client. Now just register this filter on an action:

    [AjaxOnly]
    public string GET(string input)

    You can also register this filter globally if your Web API application is targeted only at Ajax consumers.

    Summary: ASP.NET Web API provides a framework for building RESTful services. Sometimes you may need certain API services to be accessible only from Ajax. In this article, I showed you how to add an AjaxOnly action filter in ASP.NET Web API. Hopefully you will enjoy this article too.
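
    As a quick client-side illustration (not part of the original article), the filter above keys off the X-Requested-With: XMLHttpRequest header; jQuery adds it automatically to same-origin Ajax requests, while a hand-rolled XMLHttpRequest has to set it explicitly. The /api/values URL below is a placeholder:

    // jQuery adds the X-Requested-With header for you on same-origin requests:
    $.get('/api/values', function (data) {
        console.log(data);
    });

    // With a plain XMLHttpRequest you have to add the header yourself:
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/api/values');
    xhr.setRequestHeader('X-Requested-With', 'XMLHttpRequest');
    xhr.onload = function () { console.log(xhr.responseText); };
    xhr.send();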

    Read the article

  • google maps api v2 - tens of thousands of markers

    - by Adam
    Hello, my problem is with tens of thousands of markers; at the moment I have 7k markers and there will be more and more. The problem is the marker database, which is currently 4MB (link to my DB http://tinyurl.com/ybau9ce). How can I load that fast? For example, by downloading only the markers that are shown right now. I say download because display is already handled with ClusterMarker, so the problem is not with the JavaScript but with downloading that database, I think...
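
    One common approach (a sketch under assumptions, not from the question itself) is to load only the markers inside the current viewport: wait for the map to go idle, read its bounds, and ask the server for just those points. The sketch below uses Maps API v3 syntax and a placeholder /markers endpoint; the question targets v2, which has equivalent bounds and event calls:

    // Sketch: fetch only the markers that fall inside the visible map area.
    google.maps.event.addListener(map, 'idle', function () {
        var bounds = map.getBounds();
        var sw = bounds.getSouthWest();
        var ne = bounds.getNorthEast();

        // '/markers' is an assumed endpoint that accepts a bounding box.
        var url = '/markers?swLat=' + sw.lat() + '&swLng=' + sw.lng() +
                  '&neLat=' + ne.lat() + '&neLng=' + ne.lng();

        var xhr = new XMLHttpRequest();
        xhr.open('GET', url);
        xhr.onload = function () {
            var points = JSON.parse(xhr.responseText);
            // hand the returned points to the clusterer / map here
        };
        xhr.send();
    });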

    Read the article

  • Internal and external API architecture

    - by Tacomanator
    The company I work for maintains a successful SaaS product that grew "organically" over the years. We are planning to expand the line with a suite of new products that will share data with the existing product. To support this, we are looking to consolidate business logic into a single place: a web service layer. The WS layer will be used by the web applications, a tool to import data, and a tool to integrate with other client software (not an API per se). We also want to create an API that can be used by those of our customers who are capable of using it to create their own integrations. We are struggling with the following question: should the internal API (aka the WS layer) and the external API be one and the same, with security and permission settings to control what can be done by whom, or should they be two separate applications, where the external API just calls the internal API like any other application? So far in our debate it seems that separating them may be more secure, but will add overhead. What have others done in a similar situation?

    Read the article

  • Blog not even ranking for exact title match, after domain has been dropped twice [on hold]

    - by Akshay Hallur
    Consider a blog related to blogging and SEO. The domain has been dropped (expired) twice before acquisition, and the current owner is the third owner of the domain, for the past 5 months. Blog posts are not ranking, even for exact title matches; Google+ or other shares show up instead of the content, and some blog posts are not even indexed. Let us say it gets around 7 organic visits per day. The dropped domain was most likely not used for spam (the Wayback Machine shows only 3 captures since 2004 across the 2 drops; I don't know whether there was email spam), and there are no manual actions in WMT, so no reconsideration request is possible. What could be the reason for this? How can Google be told that ownership has changed and the domain is now spam-free? Would this domain be salvageable, or does this only change after relocating to another domain?

    Read the article

  • Automate Your Google Analytics Reporting with Apps Script

    Do you rely on Google Analytics reporting to make sure you're making the most of your web traffic? Does your current process for exporting and analyzing your Analytics data feel clunky? Join Nick Mihailovski and Ikai Lan from the Analytics and Apps Script teams to learn how to integrate Google Analytics with Apps Script and save your sanity in the process. From: GoogleDevelopers. Time: 30:00.
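
    For a flavor of what the talk covers, here is a minimal sketch (not taken from the video) of pulling Analytics data from Apps Script; it assumes the Analytics advanced service is enabled for the script, and 'ga:12345' is a placeholder profile ID:

    // Sketch: fetch visits per day for May 2012 and write them to the log.
    function reportVisits() {
      var results = Analytics.Data.Ga.get(
          'ga:12345',      // placeholder profile (view) ID
          '2012-05-01',    // start date
          '2012-05-31',    // end date
          'ga:visits',     // metric
          { dimensions: 'ga:date' });

      var rows = results.getRows();
      for (var i = 0; i < rows.length; i++) {
        Logger.log(rows[i][0] + ': ' + rows[i][1] + ' visits');
      }
    }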

    Read the article

  • API design and versioning using EJB

    - by broschb
    I have an API that is EJB based (i.e. there are remote interfaces defined) that most of the clients use. As the client base grows, there are issues with updates to the API forcing clients to update to the latest version and interface definition. I would like to look at having a couple of versions of the API deployed at a time (i.e. multiple EAR files deployed with different versions of the API) so that clients are not forced to update as frequently. I am not concerned about the actual deployment of this, but instead am looking for thoughts and experiences that others have had using EJBs as an API. How do you support updating versions, and are clients required to update? Does anyone run multiple versions in a production environment? Are there pros and cons? Any other experiences or thoughts on this approach, and on having an EJB-centric API?

    Read the article

  • Is there an API for determining congressional districts?

    - by ardavis
    I'm looking to determine the congressional district based on an address my user is providing. This will avoid having the user look it up themselves. Does an API of this sort exist? Note: Through my attempts to find one, I've only come across these: http://www.govtrack.us/developers/api (not sure how to submit an address or zip code, however) The following resources are available in the API ...Bills and resolutions in the U.S. Congress since 1973 (the 93rd Congress). ...A (bill, person) pair indicating cosponsorship, with join and withdrawn dates. ...Members of Congress and U.S. Presidents since the founding of the nation. ...Terms held in office by Members of Congress and U.S. Presidents. Each term corresponds with an election, meaning each term in the House covers two years (one 'Congress'), as President four years, and in the Senate six years (three 'Congresses'). ...Roll call votes in the U.S. Congress since 1789. How people voted is accessed through the Vote_voter API. ...How people voted on roll call votes in the U.S. Congress since 1789. See the Vote API. Filter on the vote field to get the results of a particular vote... http://www.opencongress.org/api (seems to be a way to find congress information, but not districts) This API provides programmers with structured access to all the data on OpenCongress, everything from official bill info to news and blog coverage to user-generated votes on bills and much more... This API defaults to returning XML. All queries can also return JSON... https://groups.google.com/forum/?fromgroups=#!topic/opendems-discuss/CeKyi_aANaE (similar question, no resolution) I've been looking over Open Dems, and seeing what's exposed at this point and what isn't. I work with Democrats Abroad, and am interested in using stuff from the lab for their sites. I quickly looked over the Precinct API, which does both more and less than what I'd need. An ideal resource would be any way of translating addresses into CD at the very least (getting state district data would be good as well), since that would make it easier for DA's membership to make a difference in races like last month's NY26 race... Update: I'm looking at the source for the govtrack.us website and the 'doGeoCode' function may be useful. view-source:http://www.govtrack.us/congress/members If no one has any suggestions, I will try to go off of what they are doing.

    Read the article

  • Debugging a Google Web Toolkit application that has an error when deployed on Google App Engine

    - by gerdemb
    I have a Google Web Toolkit application that I am deploying to Google App Engine. In the deployed application, I am getting a JavaScript error Uncaught TypeError: Cannot read property 'f' of null. This sounds like the JavaScript equivalent of a Java NullPointerException. The problem is that the GWT JavaScript is obfuscated, so it's impossible to debug in the browser, and I can't reproduce the same problem in hosted mode where I could use the Java debugger. I think the reason I'm only seeing the error on the deployed application is that the database I'm using on the GAE server is triggering something differently than the test database I'm using during testing and development. So, any ideas about the best way to proceed? I've thought of the following things: 1) Deploy a non-obfuscated version of my application. Despite a lot of Googling, I can't figure out how to do this using the automatic deploy script provided with the Google Eclipse Plugin. Does anyone know? 2) Download and copy my GAE data to the local server. 3) Somehow point my development code to use the GAE server for data instead of the local test database. This seems like the best idea... Can anyone suggest how to proceed here? Finally, is there a way to catch these JavaScript errors on the production server and log them somewhere? Without logging, I won't have any way to know if my users are having errors that don't occur on the server. The GWT.log() function is automatically stripped out of the production code...

    Read the article

  • App Engine - Save response from an API in the data store as file (blob)

    - by herrherr
    Hi there, I'm banging my head against the wall with this one: what I want to do is store a file that is returned from an API in the datastore as a blob. Here is the code that I use on my local machine (which of course works because there is a real file system): client.convertHtml(html, open('html.pdf', 'wb')) Since I cannot write to a file on App Engine, I tried several ways to store the response, without success. Any hints on how to do this? I tried it with StringIO and managed to capture the response, but then wasn't able to store it as a blob in the data store. Thanks, Chris

    Read the article

  • Tasks API using Google Apps Script (complete task)

    - by Cartman
    I am trying to set the status of a task to completed using the Tasks API. The code runs without errors, but the task is not being marked as completed. Also, when I try to get the status of the task after the update, it still shows the status as "needsAction". Here is my code:

    function setTaskStatus() {
      // Suppose a task named "MyTaskName" is contained
      // within a task list named "MyTaskListName"
      var tasklist = Tasks.Tasklists.list().getItems();
      var title = 'MyTaskListName';
      var id;
      for (var i in tasklist) {
        if (title == tasklist[i].getTitle()) {
          id = tasklist[i].getId();
        }
      }
      // Get the task list items
      var tasks = Tasks.Tasks.list(id).getItems();
      for (var i in tasks) {
        if (tasks[i].getTitle() == 'MyTaskName') {
          tasks[i].setStatus("completed"); // set status completed
          Logger.log(tasks[i].getStatus()); // this logs "completed"
          // but the change is not actually reflected in Google Tasks
        }
      }
    }
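
    One likely explanation (a sketch under assumptions, not a confirmed fix): setStatus only changes the local object returned by list(); nothing is sent back to the Tasks service unless the modified resource is written with Tasks.Tasks.update (or patch). A minimal sketch of the missing step, assuming the advanced Tasks service signature Tasks.Tasks.update(resource, taskListId, taskId):

    // Sketch: persist the status change back to the Tasks service.
    // 'id' is the task list ID and tasks[i] the modified task from the loop above.
    var task = tasks[i];
    task.setStatus('completed');
    // Without this call the change stays in memory only (assumed signature):
    Tasks.Tasks.update(task, id, task.getId());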

    Read the article

  • Custom Extensions on Managed Chromebooks

    - by user417669
    I am a developer looking for the best way to set up different schools with their own custom, private extensions (i.e. School A should be the only one with access to Extension A). Theoretically, I am aware that there are a few ways to get a custom, private extension pushed out on a domain: 1) Host the .crx on a server and click "Specify a Custom App" in the management console. 2) Create a Domain App by uploading a zip to the Chrome Web Store. 3) Upload the extension from my developer account to the Chrome Web Store and publish it to a single "trusted tester," or make it unlisted. Option (1), hosting the .crx, has not been working. I am not sure why, but the extension is simply not pushing out. I link directly to the crx file, which has the right ID and MIME type; still, no dice. If anyone has any tips or suggestions for getting this to work, I would love to hear them! Option (2), having the school create a domain app, seems a bit inefficient because it requires all schools to upload their own zip. So essentially I would have to email a zip file to the school and have them publish it. All updates to the extension would also require a similar process, so this doesn't seem ideal. I doubt that option (3) would work. If I published to the admin as a "trusted tester", I don't think that the other people in the domain would be able to access it. If it is unlisted, I do not know how an admin could find it in the Chrome Web Store dialog. Also, I would rather avoid security through obscurity. Has anyone had success with hosting the extension and using the Specify a Custom App feature? Any other suggestions for getting a custom extension pushed out by the management console? Thanks so much!

    Read the article

  • Can't connect in C# and google spreadsheet api

    - by RockySanders99
    Trying to access Google Spreadsheets using their API. Following their example, the code doesn't work, and it's not obvious why. All I'm trying to do is connect, and I keep getting back the same error. This is with their code set as of 4/15/10. Can anyone offer any suggestion on what I'm doing wrong? Code:

    using System;
    using Google.GData.Client;
    using Google.GData.Extensions;
    using Google.GData.Spreadsheets;

    namespace google_spreadsheet
    {
        class Program
        {
            static void Main(string[] args)
            {
                SpreadsheetsService myService = new SpreadsheetsService("MySpreadsheet");
                myService.setUserCredentials("[email protected]", "xxxxxxx");
                string token1 = myService.QueryClientLoginToken();
                Console.WriteLine("token is {0}", token1);
                Console.ReadLine();
                SpreadsheetQuery query = new SpreadsheetQuery();
                SpreadsheetFeed feed = myService.Query(query);
                Console.WriteLine("list");
                foreach (SpreadsheetEntry entry in feed.Entries)
                {
                    Console.WriteLine("Value: {0}", entry.Title.Text);
                }
            }
        }
    }

    When I run this, it keeps erroring out at the myService.Query statement, with the following error:

    Google.GData.Client.GDataRequestException was unhandled
    Message=Execution of request failed: http://spreadsheets.google.com/feeds/spreadsheets/private/full
    Source=Google.GData.Client
    ResponseString=<HTML> <HEAD> <TITLE>Not Found</TITLE> </HEAD> <BODY BGCOLOR="#FFFFFF" TEXT="#000000"> <H1>Not Found</H1> <H2>Error 404</H2> </BODY> </HTML>
    StackTrace:
    at Google.GData.Client.GDataRequest.Execute()
    at Google.GData.Client.GDataGAuthRequest.Execute(Int32 retryCounter)
    at Google.GData.Client.GDataGAuthRequest.Execute()
    at Google.GData.Client.Service.Query(Uri queryUri, DateTime ifModifiedSince, String etag, Int64& contentLength)
    at Google.GData.Client.Service.Query(Uri queryUri, DateTime ifModifiedSince)
    at Google.GData.Client.Service.Query(FeedQuery feedQuery)
    at Google.GData.Spreadsheets.SpreadsheetsService.Query(SpreadsheetQuery feedQuery)
    at google_spreadsheet.Program.Main(String[] args) in C:\Development Items\VS Projects\VS2008\google_spreadsheet\google_spreadsheet\Program.cs:line 21
    at System.AppDomain._nExecuteAssembly(Assembly assembly, String[] args)
    at Microsoft.VisualStudio.HostingProcess.HostProc.RunUsersAssembly()
    at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
    at System.Threading.ThreadHelper.ThreadStart()
    InnerException: System.Net.WebException
    Message=The remote server returned an error: (404) Not Found.
    Source=System
    StackTrace:
    at System.Net.HttpWebRequest.GetResponse()
    at Google.GData.Client.GDataRequest.Execute()
    InnerException:

    Yet, I can take the url http://spreadsheets.google.com/feeds/spreadsheets/private/full and manually type it in with my username/password, and it works fine. Any suggestions? Thanks, Rocky Sanders

    Read the article

  • Google Maps API v3: Turning user input coordinates to latlng?

    - by Leventhan
    I'm new to Google Maps. I'm trying to take coordinates a user inputs and move a marker on my map to those coordinates. For instance, if the user inputs 50.75, 74.1 the marker will pan to that coordinate. Unfortunately, I couldn't get it to work. Here's my function:

    function moveMarker() {
        var Markerloc = document.getElementById("Markerloc").value;
        var newLatlng = new google.maps.LatLng(Markerloc);
        marker.setPosition(newLatLng)
    }

    Here's my HTML code:

    <input type="text" id="Markerloc" value="Paste your coordinates" />
    <input type="button" onclick="moveMarker()" value="Update Marker">

    EDIT: I thought that'd fix it, but somehow, it still doesn't work. Here's my code:

    <script type="text/javascript">
    var map;
    var zoomLevel;

    function initialize() {
        var myLatlng = new google.maps.LatLng(40.65, -74);
        var myOptions = {
            zoom: 2,
            center: myLatlng,
            mapTypeId: google.maps.MapTypeId.ROADMAP,
        }
        var map = new google.maps.Map(document.getElementById('map_canvas'), myOptions);
        var zoomLevel = map.getZoom();
        var marker = new google.maps.Marker({
            position: myLatlng,
            map: map,
            draggable: true
        });

        // Update current position info.
        updateMarkerPosition(myLatlng);

        // Add dragging event listeners.
        google.maps.event.addListener(marker, 'dragstart', function() {
            updateMarkerAddress('Dragging...');
        });

        google.maps.event.addListener(map, 'zoom_changed', function() {
            updateZoomLevel(map.getZoom());
        });

        google.maps.event.addListener(marker, 'drag', function() {
            updateMarkerStatus('Dragging...');
            updateMarkerPosition(marker.getPosition());
        });

        google.maps.event.addListener(marker, 'dragend', function() {
            updateMarkerStatus('Drag ended');
            geocodePosition(marker.getPosition());
        });
    };

    var geocoder = new google.maps.Geocoder();

    function geocodePosition(pos) {
        geocoder.geocode({
            latLng: pos
        }, function(responses) {
            if (responses && responses.length > 0) {
                updateMarkerAddress(responses[0].formatted_address);
            } else {
                updateMarkerAddress('Cannot determine address at this location.');
            }
        });
    }

    function updateZoomLevel(zoomLevel) {
        document.getElementById('zoomLevel').innerHTML = zoomLevel;
    }

    function updateMarkerStatus(str) {
        document.getElementById('markerStatus').innerHTML = str;
    }

    function updateMarkerPosition(myLatlng) {
        document.getElementById('info').innerHTML = [
            myLatlng.lat(),
            myLatlng.lng()
        ].join(', ');
    }

    function updateMarkerAddress(str) {
        document.getElementById('address').innerHTML = str;
    }

    function moveMarker() {
        var lat = parseFloat(document.getElementById('markerLat').value);
        var lng = parseFloat(document.getElementById('markerLng').value);
        var newLatLng = new google.maps.LatLng(lat, lng);
        marker.setPosition(newLatLng)
    }

    google.maps.event.addDomListener(window, 'load', initialize);
    </script>
    </head>
    <body>
    <div id="map_canvas" style="width: 29%; height: 50%; float: left; position: relative; background-color: rgb(229, 227, 223); overflow: hidden;"></div>
    <b>Zoom level: </b><div id="zoomLevel"> Scroll mousewheel</div>
    </div>
    <input type='text' id='markerLat' value='Target Latitude' />
    <input type='text' id='markerLng' value='Target Longitude' />
    <input type="button" onclick="moveMarker()" value="Move Marker">
    </body>
    </html>
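
    A likely culprit (an observation and sketch, not a confirmed answer): in the edited code both map and marker are re-declared with var inside initialize(), so the global marker that moveMarker() refers to is never assigned. A minimal sketch of the usual fix is to declare the marker once in the outer scope and only assign it inside initialize():

    var map;
    var marker; // declared once, visible to both initialize() and moveMarker()

    function initialize() {
        var myLatlng = new google.maps.LatLng(40.65, -74);
        // assign to the outer variables instead of re-declaring them with 'var'
        map = new google.maps.Map(document.getElementById('map_canvas'),
            { zoom: 2, center: myLatlng, mapTypeId: google.maps.MapTypeId.ROADMAP });
        marker = new google.maps.Marker({ position: myLatlng, map: map, draggable: true });
    }

    function moveMarker() {
        var lat = parseFloat(document.getElementById('markerLat').value);
        var lng = parseFloat(document.getElementById('markerLng').value);
        marker.setPosition(new google.maps.LatLng(lat, lng));
    }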

    Read the article

  • Google Spreadsheet API problem: memory exceeded

    - by Robbert
    Hi guys, I don't know if anyone has experience with the Google Spreadsheets API or the Zend_GData classes, but it's worth a go: when I try to insert a value into a 750-row spreadsheet, it takes ages and then throws an error that my memory limit (which is 128 MB!) was exceeded. I also got this when querying all records of this spreadsheet, but that I can imagine, because it's quite a lot of data. But why does this happen when inserting a row? That's not too complex, is it? Here's the code I used:

    public function insertIntoSpreadsheet($username, $password, $spreadSheetId, $data = array())
    {
        $service = Zend_Gdata_Spreadsheets::AUTH_SERVICE_NAME;
        $client = Zend_Gdata_ClientLogin::getHttpClient($username, $password, $service);
        $client->setConfig(array('timeout' => 240));
        $service = new Zend_Gdata_Spreadsheets($client);
        if (count($data) == 0) {
            die("No valid data");
        }
        try {
            $newEntry = $service->insertRow($data, $spreadSheetId);
            return true;
        } catch (Exception $e) {
            return false;
        }
    }

    Read the article

  • New Experts Direct Contribution - Multiple Currency in Analytics

    - by Cheryl
    We do our best to anticipate what you need to know when we design and write our courses for CRM On Demand. But we know that we cannot hit on every situation or implementation scenario that you might encounter. That's why I love our Experts Direct program - this is where we encourage our wide network of CRM On Demand experts to contribute knowledge that they have gained from working directly with companies on their specific challenges or questions. (See Direct From Our Experts!) The latest Experts Direct contribution comes from Leon Dolman, who works with CRM On Demand customers every day. Leon addresses what you should expect to see in your reports and in the application when your company's users enter opportunity revenue information in more than one currency. He works through a scenario to show how currency settings can affect the data that you see in your reports. For example, do you know what will you see in your Opportunity reports if you have two different currencies represented, besides your company's default currency, but your company administrator has only set exchange rates for one of them? Leon knows...and now he has shared that knowledge - and more - with the rest of us. Go to the Multiple Currency in Analytics item in the Training and Support Center to read more - and while you're there, take a look at the other Experts Direct content to tap into that expert knowledge that we're collecting for you. Just click the Browse More Topics link in the Experts Direct box on the home page to see the full list. And let us know if there are other topics that you'd like to see our experts address. Post a comment to start a conversation or send us an email.

    Read the article

  • Social Analytics in your current data

    - by Dan McGrath
    By now everyone is aware of the massive boom in social networking (Twitter, Facebook, LinkedIn), and obviously a big part of its business model revolves around being able to mine this data to create information that can be used to make money for someone. Gartner has identified 'Social Analytics' as one of the top 10 strategic technologies for 2011. Has anyone looked at their existing data structures to determine if they could extract a social graph and then perform further data mining against it? How does it fit in with your other strategic development plans? What information are you trying to extract from the data? Take, for example, a bank. It could conceivably determine a social graph through account relationships and transactions. Obviously there would be open edges on the graph where funds enter/leave the institution, but that shouldn't detract from the usefulness of the data. I'm looking for actual examples in the answers, as well as why/how they did it. References to other sites will be greatly appreciated. Note: I'm not at all referring to mining data out of actual social networks.

    Read the article

  • Correct configuration of multiple Analytics trackers per page, spanning domains and subdomains

    - by Eliot Shepard
    My company publishes sites on a somewhat convoluted domain structure, and we're having trouble getting accurate numbers in Analytics when we have multiple trackers on the page. We publish under two brands (A, B). Each brand has a "national" site at A.com, B.com, as well as per-city "local" sites at e.g. ny.A.com, la.A.com, sf.A.com, etc. Right now we're trying to track in these dimensions: full network (A.com, ny.A.com, B.com, la.B.com, etc.), all sites in a brand (A.com, ny.A.com, la.A.com, etc.), and individual site (ny.A.com). Here are the commands we're using on an individual site:

    _gaq.push(
      ['t0._setAccount', 'UA-XXXXXX-1'], // full network
      ['t0._setDomainName', 'none'],
      ['t0._setAllowLinker', true],
      ['t0._trackPageview'],
      ['t1._trackPageLoadTime'],

      ['t1._setAccount', 'UA-XXXXXX-2'], // brand
      ['t1._setDomainName', 'none'],
      ['t1._setAllowLinker', true],
      ['t1._trackPageview'],
      ['t1._trackPageLoadTime'],

      ['t2._setAccount', 'UA-XXXXXX-3'], // individual
      ['t2._setDomainName', 'none'],
      ['t2._setAllowLinker', true],
      ['t2._trackPageview'],
      ['t2._trackPageLoadTime']
    );

    We send the same commands to each account because we've had strange results when trackers were configured differently in the past. However, right now we're seeing inflated numbers for uniques on all three trackers. What is the correct way to configure this setup? Thanks for your time.
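
    One configuration issue worth checking (a hedged sketch, not a verified answer for this setup): with classic asynchronous Analytics, _setDomainName('none') keeps each hostname's cookies separate, so the same person visiting A.com and ny.A.com can be counted as two visitors on the network and brand trackers. The usual cross-subdomain setup scopes the cookie to the shared parent domain for the trackers that aggregate subdomains, roughly like this (account IDs and domain names are placeholders):

    _gaq.push(
      ['t0._setAccount', 'UA-XXXXXX-1'],  // full-network tracker
      ['t0._setDomainName', 'A.com'],     // share cookies across *.A.com
      ['t0._setAllowLinker', true],       // still needed to link sessions across A.com and B.com
      ['t0._trackPageview'],

      ['t2._setAccount', 'UA-XXXXXX-3'],  // individual-site tracker
      ['t2._setDomainName', 'ny.A.com'],  // scoped to this one host
      ['t2._trackPageview']
    );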

    Read the article

  • Google Maps API vs Multimap/Bing Maps API

    - by mdresser
    I want to know if anyone who has experience using both the Google Maps API and the Multimap API can give a good reason as to why one is better than the other - or maybe a list of pros and cons? I will be working on a complete re-development of a site which currently uses the Multimap (Classic) API and want to consider the possibility of using the Google Maps API instead of Multimap (now MS Bing), but I need a compelling reason to justify this decision. The site currently provides a search mechanism allowing users to search for addresses using a postcode/partial postcode or city. The current system has a SQL Server database back-end containing full address details and also uploads (geocodes) this information to Multimap with a daily scheduled task. I'm wondering if it's possible with the Google API to avoid the need for the daily upload and just use its geocoding API instead (though this is limited by Google's restriction of a certain number of geocoding requests per day).
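
    For reference (a sketch, not part of the original question), the Maps JavaScript API exposes a client-side geocoder that can turn a postcode or city into coordinates on the fly, which is the sort of call that could replace the daily upload; the callback structure below follows the documented v3 Geocoder, and the input value is a placeholder:

    // Sketch: geocode a user-entered postcode or city and centre the map on it.
    var geocoder = new google.maps.Geocoder();

    function searchAddress(query) {
        geocoder.geocode({ address: query, region: 'uk' }, function (results, status) {
            if (status === google.maps.GeocoderStatus.OK && results.length > 0) {
                map.setCenter(results[0].geometry.location); // assumes 'map' already exists
            } else {
                // handle "no results" / over-quota statuses here
            }
        });
    }

    searchAddress('SW1A 1AA'); // placeholder postcode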

    Read the article

  • Google Analytics API - Tying Behavior to Specific Dates

    - by DavidS
    I am using the API to understand the performance of Adwords ad campaigns. I need to know how to attribute metrics back to the date dimension. For instance, for a given date, if I have 20 clicks, 18 visits, and 3 goal completions, does it mean that: 1) All of these actions happened on the day in question and are otherwise independent (meaning that the 3 goals could have been for people that clicked any time in the past 30 days, not who clicked on that day) 2) The on-site actions are a subset of the click activity on that day (i.e. on that day, 20 people clicked, 18 registered a real visit, and 3 completed a goal) If it is scenario 2, does that mean there is a need to refresh old rows every day? Thanks!

    Read the article

< Previous Page | 17 18 19 20 21 22 23 24 25 26 27 28  | Next Page >