Search Results

Search found 22758 results on 911 pages for 'google tv'.

Page 108/911 | < Previous Page | 104 105 106 107 108 109 110 111 112 113 114 115  | Next Page >

  • google maps api v3 - loop through overlays - overlayview methods

    - by user317005
    What's wrong with the code below? When I execute it, the map doesn't even show up. But when I put the OverlayView methods outside the for loop and manually assign a lat/lng, it magically works. Does anyone know how I can loop through an array of lats/lngs (= items) using the OverlayView methods? I hope this makes sense; I just don't know how else to explain it. And unfortunately, I run my code on my localhost. var overlay; OverlayTest.prototype = new google.maps.OverlayView(); [taken out: options] var map = new google.maps.Map(document.getElementById('map_canvas'), options); var items = [ ['lat','lng'],['lat','lng'] ]; for (var i = 0; i < items.length; i++) { var latlng = new google.maps.LatLng(items[i][0], items[i][1]); var bounds = new google.maps.LatLngBounds(latlng); overlay = new OverlayTest(map, bounds); function OverlayTest(map, bounds) { [taken out: not important] this.setMap(map); } OverlayTest.prototype.onAdd = function() { [taken out: not important] } OverlayTest.prototype.draw = function() { [taken out: not important] } }
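
    A minimal sketch of one way this kind of loop is often restructured: the constructor and its prototype methods are declared once, before the loop, and only the instantiation stays inside it. It assumes map is already created and items holds numeric lat/lng pairs; the element created in onAdd is just a placeholder:

        function OverlayTest(map, bounds) {
          this.bounds_ = bounds;
          this.div_ = null;
          this.setMap(map); // triggers onAdd/draw once the map is ready
        }
        OverlayTest.prototype = new google.maps.OverlayView();

        OverlayTest.prototype.onAdd = function() {
          this.div_ = document.createElement('div'); // placeholder overlay content
          this.getPanes().overlayLayer.appendChild(this.div_);
        };

        OverlayTest.prototype.draw = function() {
          var pos = this.getProjection().fromLatLngToDivPixel(this.bounds_.getSouthWest());
          this.div_.style.position = 'absolute';
          this.div_.style.left = pos.x + 'px';
          this.div_.style.top = pos.y + 'px';
        };

        var overlays = [];
        for (var i = 0; i < items.length; i++) {
          var latlng = new google.maps.LatLng(items[i][0], items[i][1]);
          overlays.push(new OverlayTest(map, new google.maps.LatLngBounds(latlng)));
        }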

    Read the article

  • How to use semantic markup and Google Places to assist in local search SEO?

    - by ElHaix
    In this article, adding additional localized markup is supposed to help your site's SEO, e.g. <div itemscope itemtype="http://data-vocabulary.org/Organization"> <span itemprop="name">Search Engine People</span> <span itemprop="address" itemscope itemtype="http://data-vocabulary.org/Address"> <span itemprop="street-address">100 Westney Road South Unit 200, Building E</span> <span itemprop="locality">Ajax</span>, <span itemprop="region">ON</span> <span itemprop="country-name">Canada</span> <span itemprop="postal-code">L1S 7H3</span> </div> What about a site that contains valid localized results, where the actual business location is not relevant? For example, a site with valid local results from San Francisco, CA and Phoenix, AZ. Should these tags be added to the localized results, and has anyone got any experience with how much adding these tags has improved results? In terms of Google Places, however, they seem to ask for the business' actual physical location. Is there a way to use Google Places in the aforementioned example to assist in SEO?

    Read the article

  • how to use appcfg.py for google-app-engine projects created using google's eclipse plugin?

    - by Aadith
    I have created a Google App Engine Java project in Eclipse using Google's Eclipse plugin. My previous attempt to deploy failed. Now, when I retry, I get the following message: Unable to update app: Error posting to URL: http://appengine.google.com/api/appversion/create?app_id=mybdaywisher&version=1 409 Conflict Another transaction for this user is already in progress for this app and major version. That user can undo the transaction with appcfg.py's "rollback" command. Now, I have only ever used the App Engine features from inside Eclipse and have no clue how to run the appcfg.py command. I could not get much help from the documentation available on the internet. The only thing I could make out was that on Mac (I'm on a Mac) the command to use is appcfg.sh. Inside Eclipse, I looked up where the App Engine SDK is located on my machine and went to that location, and even found appcfg.sh there. But when I try to run it, it only reports the error "command not found". I tried various alternatives (running it with sudo, running it as ./appcfg.sh from the directory where it is located) but had no success. Can someone please tell me the steps I have to follow to run the appcfg command?

    Read the article

  • Google Map GEO Results

    - by Lee
    Hey all, I'm getting really frustrated with Google geo results and hope someone can advise me on the best way to go. I have created an AutoSuggest feature where you can start typing an address and Google responds with suggestions. The user then selects an address to move on. But before I let them continue to the next page I want to validate their selection. I would have thought this would be easy, as we are only checking against what Google has already given. But when I do my validation lookup it displays no results. Some example code: let's say I picked this address from the suggestions: Suffield, CT 06078, USA. Then on validation I do a second lookup with this address, i.e. $string = "Suffield, CT 06078, USA"; echo 'http://maps.google.com/maps/geo?output=json&oe=utf8&gl=us&sensor=false&key=[MyKey]&q='.urlencode($string).''; It gives me Error code 602 (G_GEO_UNKNOWN_ADDRESS). How can it not be found when it gave me the address in the first place? Any suggestions on how I can get around this?
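
    A hedged sketch of one possible workaround: validate the chosen suggestion in the browser with the Maps JavaScript API (v3) geocoder rather than making a second server-side /maps/geo call, assuming the Maps API is already loaded on the page for the AutoSuggest feature:

        var geocoder = new google.maps.Geocoder();
        geocoder.geocode({ address: 'Suffield, CT 06078, USA' }, function(results, status) {
          if (status === google.maps.GeocoderStatus.OK && results.length > 0) {
            // the address resolved; let the user continue to the next page
          } else {
            // surface a validation error instead of the 602 failure
          }
        });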

    Read the article

  • PHP CURL Google Calendar using Private URL

    - by MooCow
    I'm trying to get an array of events from Google Calendar using the Private URL. I read the Google API document but I want to try doing this without using the ZEND library since I have no idea what the eventual server file structure is and avoid having other people edit the codes. I also did a search before posting and ran into the same condition where PHP CURL_EXEC returns false with the URL but I get a JSON file if the URL is open using a web browser. Since I'm using the Private URL, do I really need to authenticate against the Google server using ZEND? I'm trying to have PHP clean up the array before encoding it for Flash. $URL = <string of the private URL from Google Calendar> $ch = curl_init($URL); curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); $data = curl_exec($ch); curl_close($ch); $result = json_decode($data); print '<pre>'.var_export($data,1).'</pre>'; Screen output >>> false

    Read the article

  • Will Google treat this JavaScript code as a bad practice?

    - by Mathew Foscarini
    I have a website that provides a custom UX experience implemented via JavaScript. When JavaScript is disabled in the browser the website falls back to CSS for the layout. To make this possible I've added a noJS class to the <body> and quickly remove it via JavaScript. <body class="noJS layout-wide"> <script type="text/javascript">var b=document.getElementById("body");b.className=b.className.replace("noJS","");</script> This causes a problem when the page loads with JavaScript enabled. The body immediately has its noJS class removed, and this causes the layout to appear messed up until the JavaScript code for the layout is executed (at the bottom of the page). To solve this I hide each article via JavaScript by adding a CSS class, fix, which is display:none, as each article is loaded. <article id="q-3217">....</article> <script type="text/javascript">var b=document.getElementById("q-3217");b.className=b.className+" fix";</script> After the page is ready I show all the articles in the correct layout. I've read many times in Google's documentation not to hide content. So I'm worried that Google will penalize my website for doing this.
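
    A minimal sketch of one common alternative: swap a single class on the root element in the <head>, before anything renders, and let the stylesheet decide what starts hidden, so no per-article script tags are needed. The class names here are just illustrative:

        (function() {
          // runs in the <head>, before the body is parsed, so the wrong layout never flashes
          var root = document.documentElement;
          root.className = root.className.replace(/\bnoJS\b/, 'js');
        })();

    The stylesheet can then key article visibility off the js class (leaving everything visible when the class is absent), and the layout script at the bottom of the page reveals the articles once it has run.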

    Read the article

  • How to reload google dfp in ajax content? [migrated]

    - by cj333
    Google DFP supports ads in AJAX content, but if I put all the code in the main page it always shows the same ads, even when turning the page reloads the AJAX content. I read an article at http://productforums.google.com/forum/#!msg/dfp/7MxNjJk46DQ/4SAhMkh2RU4J, but my code does not work. Any working code suggestions? Thanks. Main page code: <script type='text/javascript'> $(document).ready(function(){ $('#next').live('click',function(){ var num = $(this).html(); $.ajax({ url: "album-slider.php", dataType: "html", type: 'POST', data: 'photo=' + num, success: function(data){ $("#slider").center(); googletag.cmd.push(function() { googletag.defineSlot('/1*******/ads-728-90', [728, 90], 'div-gpt-ad-1**********-'+ num).addService(googletag.pubads()); googletag.pubads().enableSingleRequest(); googletag.enableServices(); }); } }); }); }); </script> album-slider.php <!-- ads-728-90 --> <div id='div-gpt-ad-1**********-<?php echo $_GET['photo']; ?>' style='width:728px; height:90px;'> <script type='text/javascript'> googletag.cmd.push(function() { googletag.display('div-gpt-ad-1**********-<?php echo $_GET['photo']; ?>); }); </script>
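
    A hedged sketch of the usual pattern for this situation: define and display the slot once when the page loads, then request a fresh ad in the existing slot whenever new AJAX content arrives, instead of calling defineSlot again for every slide. The ad unit path and div id below are placeholders:

        var sliderSlot;
        googletag.cmd.push(function() {
          sliderSlot = googletag.defineSlot('/12345678/ads-728-90', [728, 90], 'div-gpt-ad-slider')
                                .addService(googletag.pubads());
          googletag.pubads().enableSingleRequest();
          googletag.enableServices();
          googletag.display('div-gpt-ad-slider');
        });

        // called from the $.ajax success handler after the new slide is inserted
        function refreshSliderAd() {
          googletag.cmd.push(function() {
            googletag.pubads().refresh([sliderSlot]);
          });
        }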

    Read the article

  • Sync Google Contacts with QuickBooks

    - by dataintegration
    The RSSBus ADO.NET Providers offer an easy way to integrate with different data sources. In this article, we include a fully functional application that can be used to synchronize contacts between Google and QuickBooks. Like our QuickBooks ADO.NET Provider, the included application supports both the desktop versions of QuickBooks and QuickBooks Online Edition. Getting the Contacts Step 1: Google accounts include a number of contacts. To obtain a list of a user's Google Contacts, issue a query to the Contacts table. For example: SELECT * FROM Contacts. Step 2: QuickBooks stores contact information in multiple tables. Depending on your use case, you may want to synchronize your Google Contacts with QuickBooks Customers, Employees, Vendors, or a combination of the three. To get data from a specific table, issue a SELECT query to that table. For example: SELECT * FROM Customers Step 3: Retrieving all results from QuickBooks may take some time, depending on the size of your company file. To narrow your results, you may want to use a filter by including a WHERE clause in your query. For example: SELECT * FROM Customers WHERE (Name LIKE '%James%') AND IncludeJobs = 'FALSE' Synchronizing the Contacts Synchronizing the contacts is a simple process. Once the contacts from Google and the customers from QuickBooks are available, they can be compared and synchronized based on user preference. The sample application does this based on user input, but it is easy to create one that does the synchronization automatically. The INSERT, UPDATE, and DELETE statements available in both data providers makes it easy to create, update, or delete contacts in either data source as needed. Pre-Built Demo Application The executable for the demo application can be downloaded here. Note that this demo is built using BETA builds of the ADO.NET Provider for Google V2 and ADO.NET Provider for QuickBooks V3, and will expire in 2013. Source Code You can download the full source of the demo application here. You will need the Google ADO.NET Data Provider V2 and the QuickBooks ADO.NET Data Provider V3, which can be obtained here.

    Read the article

  • Google Analytics Event Tracking and Variable visibility.

    - by Jeow
    Hi guys, I have added the latest standard snippet to my HTML page to get Google Analytics to work: ... ... var _gaq = _gaq || []; _gaq.push(['_setAccount', 'UA-15080849-1']); _gaq.push(['_trackPageview']); (function() { var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true; ga.src = 'http://www.google-analytics.com/ga.js'; (document.getElementsByTagName('head')[0] || document.getElementsByTagName('body')[0]).appendChild(ga); })(); Now, looking at the official 'event tracking guide', Google says to add a snippet such as: pageTracker._trackEvent('Videos', 'Play', 'Gone With the Wind'); My question is: where is pageTracker coming from? Is it a global object in ga.js? And if it is, why didn't Google mention that it runs the risk of breaking some scripts? I must be missing something; any help is really appreciated.
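
    For context, pageTracker is the tracker variable created by the older synchronous setup (var pageTracker = _gat._getTracker('UA-XXXXX-X')); it is not defined by ga.js itself. With the asynchronous snippet shown above, the equivalent event call is pushed onto the _gaq queue instead:

        // queued the same way as _setAccount and _trackPageview in the async snippet
        _gaq.push(['_trackEvent', 'Videos', 'Play', 'Gone With the Wind']);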

    Read the article

  • How do I mashup Google Maps with geolocated photos from one or more social networks?

    - by PureCognition
    I'm working on a proof of concept for a project, and I need to pin random photos to a Google Map. These photos can come from another social network, but need to be non-porn. I've done some research so far: Google's Image Search API is deprecated, so one has to use the Custom Search API. A lot of the images aren't photos, and I'm not sure how well it handles geolocation yet. Twitter seems a little better suited, except for the fact that people can post pictures of pretty much anything. I was also going to look into the APIs for other networks such as Flickr, Picasa, Pinterest and Instagram. I know there are some aggregate services out there that might have done some of this mash-up work for me as well. If there is anyone out there who has a handle on social APIs and where I should look for this type of solution, I would really appreciate the help. Also, in cases where server-side implementation matters, I'm a .NET developer by experience.
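
    Whichever network ends up supplying the photos, the map side is fairly uniform. A network-agnostic sketch, assuming the geotagged photo records (the lat, lng, caption and thumbnailUrl fields are invented here) have already been fetched and filtered:

        var map = new google.maps.Map(document.getElementById('map_canvas'), {
          center: new google.maps.LatLng(0, 0),
          zoom: 2,
          mapTypeId: google.maps.MapTypeId.ROADMAP
        });

        function pinPhotos(photos) {
          photos.forEach(function(photo) {
            var marker = new google.maps.Marker({
              position: new google.maps.LatLng(photo.lat, photo.lng),
              map: map,
              title: photo.caption || ''
            });
            var info = new google.maps.InfoWindow({
              content: '<img src="' + photo.thumbnailUrl + '" alt="">'
            });
            google.maps.event.addListener(marker, 'click', function() {
              info.open(map, marker);
            });
          });
        }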

    Read the article

  • why the difference in google search result using script for search and using a browser for search

    - by Jayapal Chandran
    I wrote some code to find the position of a search keyword in Google search results, and I also did the same search in the browser. The two results are different. Let me explain in detail. I have a website and I wanted to know on which page number my domain appears for a search string. For example, when I search for 'code snippets' I want to find out on which page of the Google results a certain domain appears. I wrote PHP code to search page by page, starting from page 1 up to page n, and did the same task using a browser. The script returned page 4, but when browsing I can see the domain appearing on the second page. Here is the search string I use in my code: /search?hl=en&output=search&sclient=psy-ab&q=code+snippets&start=0&btnG= and for each request I change start=0 to start=1, start=2, etc., and in the response I check whether my domain appears in it. Any idea why the search results differ?
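
    For reference, a small sketch of how the start parameter paginates: in Google's result URLs it is a result offset rather than a page number, so with the default 10 results per page the pages begin at start=0, 10, 20, and so on; this is worth checking against how the script increments start. The question's script is PHP; this sketch is in JavaScript purely for illustration:

        // page 1 -> start=0, page 2 -> start=10, page 3 -> start=20, ...
        function searchUrlForPage(page) {
          var resultsPerPage = 10;
          var start = (page - 1) * resultsPerPage;
          return '/search?hl=en&output=search&sclient=psy-ab' +
                 '&q=' + encodeURIComponent('code snippets') +
                 '&start=' + start + '&btnG=';
        }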

    Read the article

  • One site being on a subdirectory of another. Does google count this against you?

    - by Mick
    I have created two similar websites (relating to monetary systems). So far, one appears to be loved by Google and the other hated. I'm struggling to work out why. This is a mystery to me because both sites were created by me with the same design philosophy, both in pure html. Both are packed to the rafters with references to, and information about, their respective subjects. One issue I'm worried may be the cause is to do with the location of the sites. I got a web hosting package from hostmonster.com for the successful one, but less liked one is just an "add-on" which sits on a subdirectory of the successful one. I wonder if Google somehow detects this and treats it as a less significant website? EDIT: Just to clarify, even though one site is an add-on that sits on a subdirectory of the other, the URL is arranged to look like it is a root. I.e. the unpopular site can be accessed directly with a simple www.myunpopularsite.com name, without specifying any subdirectory. EDIT: Just in case its important... say the popular site is called pop.com and the unpopular one unpop.com. In the webspace I've purchased, there is a directory called public_html. This is where I put the index.htm and all the other files of my popular site. When I purchased the add-on unpop.com. I made a subdirectory of public_html called unpop. It is within this "public_html\unpop\" that I place the index.htm and all the other files of my unpopular site. Typing www.unpop.com into the address bar of a browser links directly to the contents of "public_html\unpop\" and the user is not aware that this site is sitting on a subdirectory of another site. BUT if you type "www.pop.com/unpop" into the address bar of a browser you DO see the unpopular site.

    Read the article

  • Creating sitemap for google bot - how to mark dynamic content / dynamic subpages?

    - by ojek
    I have a website that is an internet forum. This forum has many categories, and a single category page contains a lot of subpages with listed threads. The forum is brand new, and about a week ago I filled it with a few hundred thousand threads. I then looked at the Google Webmasters page to see any changes in indexing, but the index only went up from 300 to about 1200, so it did not index my added threads (although it added something). This is what my sitemap.xml, which I uploaded on their website, contains (of course there is a lot more of the code; this is just a snippet for a single category, and in my real sitemap file I have all the categories listed as below): <url> <loc>http://mysite.com/Forums/Physics</loc> <changefreq>hourly</changefreq> </url> Now, I would expect Googlebot to go into http://mysite.com/Forums/Physics, move through all the subpages with thread links, and then get inside each thread and index its content. How can I do this? Also, if this is unclear, I can add a real link to my website.

    Read the article

  • Trying to update a google visualization using jquery

    - by Mark in A2
    I'm relatively inexperienced, so please bear with me. I'm developing a simple dashboard using the Google Visualization API, and I'm developing in VB.NET. I have the Annotated Timeline, the Intensity Map, and a set of tables on my aspx. What I am trying to do is update the Intensity Map and tables based on the date range the user selects using the Annotated Timeline tool. I was hoping to update only these visualizations without doing a full page load. Apparently, a great way to do this is to separate the visualizations into self-contained aspx pages and use jQuery to "load" them into a div. I say apparently, as this is not working. When I try to update an aspx page containing a Google visualization using jQuery, I get the message "Loading data from www.google.com..." in the browser and it just runs continuously and never returns. I ran this by an experienced developer and he was stumped, but thought it must be a conflict between the Google API and jQuery. Any tips, advice, or alternative solutions are greatly appreciated! Thanks, Mark in Ann Arbor
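
    A hedged sketch of one alternative that avoids re-running the Google loader on every update: keep the Intensity Map on the main page, fetch only the new data with jQuery, and redraw the existing chart in place. It assumes the intensitymap package is already loaded on the dashboard page; the handler URL and the shape of the returned rows are placeholders:

        var intensityMap = new google.visualization.IntensityMap(document.getElementById('intensity_div'));

        function refreshIntensityMap(startDate, endDate) {
          $.getJSON('DashboardData.ashx', { from: startDate, to: endDate }, function(rows) {
            // rows is assumed to look like [['US', 120], ['CA', 45], ...]
            var data = new google.visualization.DataTable();
            data.addColumn('string', 'Country');
            data.addColumn('number', 'Orders');
            data.addRows(rows);
            intensityMap.draw(data, {});
          });
        }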

    Read the article

  • Ideas to tackle unwanted bad press/review on Google's SERP?

    - by Rob
    After Googling our company name, to our horror we found that someone on Yelp.co.uk has reviewed our company. On the SERP your eye is immediately drawn to the 2 star review some complete stranger has written, which to be honest is pure slander! The most infuriating thing is that the person who reviewed our company has never even been a client/customer. It's a bit like me reviewing a restaurant having never eaten or even been in there! We've sent her a private message on Yelp asking her to remove the review and also sent a complaint to Yelp themselves, but have yet to get a reply. We've resisted going mad at the reviewer and also requested that she re-review us having just relaunched our new website (it still riles us that she's not even a client though!). We've had genuine customers/clients review us on Yelp, yet this 2 star review remains on Google's SERP. Roughly how long would it take for our new reviews to overtake this one? Does anyone have any suggestions as to how we can push the review off the first page of Google's SERP, or any creative ways in which we can tackle this issue?

    Read the article

  • How to make Google recognize language for a multilingual website?

    - by Julien Fouilhé
    A few weeks ago I implemented translation functionality for my company's website. The website is now available in French and English, and I looked on the internet for the best way to do this without losing any ranking and while keeping our pages on Google. Here is what I did: I set a response header: Content-Language:en and Content-Language:fr My URLs are formatted as: http://www.website.com/en/... and http://www.website.com/fr/... My html tag is set with a lang attribute: <html lang="en"> and <html lang="fr"> There is a <link rel="alternate" hreflang="en" href="EnglishPageUrl"> on French pages and a <link rel="alternate" hreflang="en" href="frenchPageUrl"> on English pages. But Google keeps returning some English pages when I search on the French engine, even though the website was at first only available in English. Is that normal? Do I still have to wait? It has been almost one month now; I thought it would be okay by now.

    Read the article

  • Is there an elegant way to track multiple domains under separate accounts with google analytics?

    - by J_M_J
    I have a situation where a content management system uses the same template for multiple websites with different domain names and I can't make a separate template for each. However, each website needs to be tracked with Google analytics. Would this be appropriate to track each domain like this by putting in some conditional code? And would this be robust enough not to break? Is there a more elegant way to do this? <script type="text/javascript"> var _gaq = _gaq || []; switch (location.hostname){ case 'www.aaa.com': _gaq.push(['_setAccount', 'UA-xxxxxxx-1']); break; case 'www.bbb.com': _gaq.push(['_setAccount', 'UA-xxxxxxx-2']); break; case 'www.ccc.com': _gaq.push(['_setAccount', 'UA-xxxxxxx-3']); break; } _gaq.push(['_trackPageview']); (function() { var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true; ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js'; (document.getElementsByTagName('head')[0] || document.getElementsByTagName('body')[0]).appendChild(ga); })(); </script> Just to be clear, each website is a separate domain name and must be tracked separately, NOT different domains with same pages on one analytics profile.
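
    As a hedged variant of the same idea, the switch can be collapsed into a hostname-to-account map with an explicit fallback, which keeps the snippet identical across sites and still tracks an unrecognised host somewhere (the account IDs below are placeholders):

        var accounts = {
          'www.aaa.com': 'UA-xxxxxxx-1',
          'www.bbb.com': 'UA-xxxxxxx-2',
          'www.ccc.com': 'UA-xxxxxxx-3'
        };
        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', accounts[location.hostname] || 'UA-xxxxxxx-0']);
        _gaq.push(['_trackPageview']);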

    Read the article

  • get_by_id method on Model classes in Google App Engine Datastore

    - by tarn
    I'm unable to work out how you can get objects from the Google App Engine Datastore using get_by_id. Here is the model: from google.appengine.ext import db class Address(db.Model): description = db.StringProperty(multiline=True) latitude = db.FloatProperty() longitdue = db.FloatProperty() date = db.DateTimeProperty(auto_now_add=True) I can create them, put them, and retrieve them with GQL. address = Address() address.description = self.request.get('name') address.latitude = float(self.request.get('latitude')) address.longitude = float(self.request.get('longitude')) address.put() A saved address has values for >> address.key() aglndWVzdGJvb2tyDQsSB0FkZHJlc3MYDQw >> address.key().id() 14 I can find them using the key: from google.appengine.ext import db address = db.get('aglndWVzdGJvb2tyDQsSB0FkZHJlc3MYDQw') But I can't find them by ID: >> from google.appengine.ext import db >> address = db.Model.get_by_id(14) The address is None, and when I try >> Address.get_by_id(14) AttributeError: type object 'Address' has no attribute 'get_by_id' How can I find them by ID? EDIT: It turns out I'm an idiot and was trying to find an Address model in a function called Address. Thanks for your answers; I've marked Brandon as the correct answer as he got in first and demonstrated it should all work.

    Read the article

  • How to enable logging for Google Chrome in Ubuntu 12.04?

    - by skytreader
    I'm trying to capture the logs for a certain bug I'm having with Google Chrome. However, I can't find/enable logs for GC. According to this Chromium project page, I just need to add the flags --enable-logging --v=1 and a chrome_debug.log file will appear in my user data directory. However, after running GC (and closing it through the 'X' title bar button) there is no chrome_debug.log file in the specified directory. I even tried running as root, as it may have something to do with write permissions, but GC refuses to start as root. Another thing: GC also prints messages when invoked from the command line. I tried capturing these and redirecting them to a file via $ google-chrome > today.log but the messages are still printed in the command line, and the file I specify gets created but remains empty. Note that I can't just copy-paste the messages printed on the terminal after my bug occurs, as the bug freezes up my whole system so badly that, when it occurs, my only option is to turn off my computer straight via the power button. I've seen a few similar bugs already posted but I find that they don't exactly describe my situation, so I'd really like to get some logs for this. So how do I enable logging or, at least, get those terminal messages into a file?

    Read the article

  • Google Analytics API data for goals (funnels) doesn't match - how do they reconcile?

    - by bkgraham
    I have a Google Analytics account with a well-functioning funnel made up of 4 goals. I can query the API and get the data out, but it does not match the funnel report in Analytics. Without getting into specific values, I can give you an example with faked data. Here's how the funnel might look: Shopping Cart 100 > 100 > 20 80 (80%) Address Page 5 > 85 > 25 60 (71%) Payment Page 2 > 62 > 10 52 (84%) Checkout 1 > 53 (49.07% funnel conversion rate) Okay, so you would expect the API to output data something like this: goal1Starts goal1Completions goal1Abandons 100 80 20 goal2Starts goal2Completions goal2Abandons 85 60 25 goal3Starts goal3Completions goal3Abandons 62 52 10 goal4Starts goal4Completions goal4Abandons 53 53 0 Instead, it's different. Firstly, the abandons are associated with the following goal (so goal1 always has 0 abandons and goal4 always has 0 abandons. Okay, I can work with that. What's confusing is that the numbers are always a little different. The goal1Completions always match the report, as do the goal4Completions, but everything else is off by a small amount. Sometimes it's only 2 visits, other times it's off by 50. For the report above here's the kind of results I would tend to get: goal1Starts goal1Completions goal1Abandons 100 100 0 goal2Starts goal2Completions goal2Abandons 105 84 21 goal3Starts goal3Completions goal3Abandons 90 65 25 goal4Starts goal4Completions goal4Abandons 58 53 5 Here's what I know: Goal(n)Completions + Goal(n)Abandons = Goal(n)Starts Goal(n)Starts = Goal(n-1)Completions Goal(n)Starts - Goal(n-1)Completions != reported number entering at that level That third one is particularly disappointing. So, here's my question: What data do I need to pull from the API in order to recreate the counts in the Funnel report in Google Analytics? I don't need the pages exited to entering from - just the counts at every level.

    Read the article

  • Running a Silverlight application in the Google App Engine platform

    - by rajbk
    This post shows you how to host a Silverlight application in the Google App Engine (GAE) platform. You deploy and host your Silverlight application on Google’s infrastructure by creating a configuration file and uploading it along with your application files. I tested this by uploading an old demo of mine - the four stroke engine silverlight demo. It is currently being served by the GAE over here: http://fourstrokeengine.appspot.com/ The steps to run your Silverlight application in GAE are as follows: Account Creation Create an account at http://appengine.google.com/. You are allocated a free quota at signup. Select “Create an Application”   Verify your account by SMS   Create your application by clicking on “Create an Application”   Pick an application identifier on the next screen. The identifier has to be unique. You will use this identifier when uploading your application. The application you create will by default be accessible at [applicationidentifier].appspot.com. You can also use custom domains if needed (refer to the docs).   Save your application. Download SDK  We will use the  Windows Launcher for Google App Engine tool to upload our apps (it is possible to do the same through command line). This is a GUI for creating, running and deploying applications. The launcher lets you test the app locally before deploying it to the GAE. This tool is available in the Google App Engine SDK. The GUI is written in Python and therefore needs an installation of Python to run. Download and install the Python Binaries from here: http://www.python.org/download/ Download and install the Google App Engine SDK from here: http://code.google.com/appengine/downloads.html Run the GAE Launcher. Select Create New Application.   On the next dialog, give your application a name (this must match the identifier we created earlier) For Parent Directory, point to the directory containing your Silverlight files. Change the port if you want to. The port is used by the GAE local web server. The server is started if you choose to run the application locally for testing purposes. Hit Save. Configure, Test and Upload As shown below, the files I am interested in uploading for my Silverlight demo app are The html page used to host the Silverlight control The xap file containing the compiled Silverlight application A favicon.ico file.   We now create a configuration file for our application called app.yaml. The app.yaml file specifies how URL paths correspond to request handlers and static files.  We edit the file by selecting our app in the GUI and clicking “Edit” The contents of file after editing is shown below (note that the contents of the file should be in plain text): application: fourstrokeengine version: 1 runtime: python api_version: 1 handlers: - url: /   static_files: Default.html   upload: Default.html - url: /favicon.ico   static_files: favicon.ico   upload: favicon.ico - url: /FourStrokeEngine.xap   static_files: FourStrokeEngine.xap   upload: FourStrokeEngine.xap   mime_type: application/x-silverlight-app - url: /.*   static_files: Default.html   upload: Default.html We have listed URL patterns for our files, specified them as static files and specified a mime type for our xap file. The wild card URL at the end will match all URLs that are not found to our default page (you would normally include a html file that displays a 404 message).  To understand more about app.yaml, refer to this page. Save the file. Run the application locally by selecting “Browse” in the GUI. 
A web server listening on the port you specified is started (8080 in my case). The app is loaded in your default web browser pointing to http://localhost:8080/. Make sure the application works as expected. We are now ready to deploy. Click the “Deploy” icon. You will be prompted for your username and password. Hit OK. The files will get uploaded and you should get a dialog telling you to “close the window”. We are done uploading our Silverlight application. Go to http://appengine.google.com/ and launch the application by clicking on the link in the “Current Version” column.   You should be taken to a URL which points to your application running in Google’s infrastructure : http://fourstrokeengine.appspot.com/. We are done deploying our application! Clicking on the link in the Application column will take you to the Admin console where you can see stats related to system usage.  To learn more about the Google Application Engine, go here: http://code.google.com/appengine/docs/whatisgoogleappengine.html

    Read the article

  • Reusing OAuth request token when user refresh page - Twitter4j on GAE

    - by Tahir Akram
    Hi I am using Twitter4J API on GAE/J. I want to use the request token when user came to my page. (called back URL). And press refresh button. I write following code for that. But When user press refresh button. I got Authentication credentials error. Please see the stacktrance. It works fine when user first time used that token. HomeServlet.java code: HttpSession session = request.getSession(); twitter.setOAuthConsumer(FFConstants.CONSUMER_KEY, FFConstants.CONSUMER_SECRET); String token = (String) session.getAttribute("token"); String authorizedToken = (String)session.getAttribute("authorizedToken"); User user = null; if (!token.equals(authorizedToken)){ AccessToken accessToken = twitter.getOAuthAccessToken( token, (String) session .getAttribute("tokenSecret")); twitter.setOAuthAccessToken(accessToken); user = twitter.verifyCredentials(); session.setAttribute("authorizedToken", token); session.setAttribute("user", user); }else{ user = (User)session.getAttribute("user"); } TwitterUser twitterUser = new TwitterUser(); twitterUser.setFollowersCount(user.getFollowersCount()); twitterUser.setFriendsCount(user.getFriendsCount()); twitterUser.setFullName(user.getName()); twitterUser.setScreenName(user.getScreenName()); twitterUser.setLocation(user.getLocation()); Please suggest how I can do that. I have seen on many website. They retain the user with the same token. Even if user press browser refresh buttion again and again. Please help. Exception stacktrace: Reason: twitter4j.TwitterException: 401:Authentication credentials were missing or incorrect. /friends/ids.xml This method requires authentication. at twitter4j.http.HttpClient.httpRequest(HttpClient.java:469) at twitter4j.http.HttpClient.get(HttpClient.java:412) at twitter4j.Twitter.get(Twitter.java:276) at twitter4j.Twitter.get(Twitter.java:228) at twitter4j.Twitter.getFriendsIDs(Twitter.java:1819) at com.tff.servlet.HomeServlet.doGet(HomeServlet.java:86) at javax.servlet.http.HttpServlet.service(HttpServlet.java:693) at javax.servlet.http.HttpServlet.service(HttpServlet.java:806) at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:487) at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1093) at com.google.apphosting.utils.servlet.ParseBlobUploadFilter.doFilter(ParseBlobUploadFilter.java:97) at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1084) at com.google.apphosting.runtime.jetty.SaveSessionFilter.doFilter(SaveSessionFilter.java:35) at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1084) at com.google.apphosting.utils.servlet.TransactionCleanupFilter.doFilter(TransactionCleanupFilter.java:43) at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1084) at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:360) at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216) at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:181) at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:712) at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:405) at com.google.apphosting.runtime.jetty.AppVersionHandlerMap.handle(AppVersionHandlerMap.java:238) at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:139) at org.mortbay.jetty.Server.handle(Server.java:313) at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:506) at 
org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:830) at com.google.apphosting.runtime.jetty.RpcRequestParser.parseAvailable(RpcRequestParser.java:76) at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:381) at com.google.apphosting.runtime.jetty.JettyServletEngineAdapter.serviceRequest(JettyServletEngineAdapter.java:135) at com.google.apphosting.runtime.JavaRuntime.handleRequest(JavaRuntime.java:235) at com.google.apphosting.base.RuntimePb$EvaluationRuntime$6.handleBlockingRequest(RuntimePb.java:5235) at com.google.apphosting.base.RuntimePb$EvaluationRuntime$6.handleBlockingRequest(RuntimePb.java:5233) at com.google.net.rpc.impl.BlockingApplicationHandler.handleRequest(BlockingApplicationHandler.java:24) at com.google.net.rpc.impl.RpcUtil.runRpcInApplication(RpcUtil.java:363) at com.google.net.rpc.impl.Server$2.run(Server.java:838) at com.google.tracing.LocalTraceSpanRunnable.run(LocalTraceSpanRunnable.java:56) at com.google.tracing.LocalTraceSpanBuilder.internalContinueSpan(LocalTraceSpanBuilder.java:536) at com.google.net.rpc.impl.Server.startRpc(Server.java:793) at com.google.net.rpc.impl.Server.processRequest(Server.java:368) at com.google.net.rpc.impl.ServerConnection.messageReceived(ServerConnection.java:448) at com.google.net.rpc.impl.RpcConnection.parseMessages(RpcConnection.java:319) at com.google.net.rpc.impl.RpcConnection.dataReceived(RpcConnection.java:290) at com.google.net.async.Connection.handleReadEvent(Connection.java:466) at com.google.net.async.EventDispatcher.processNetworkEvents(EventDispatcher.java:759) at com.google.net.async.EventDispatcher.internalLoop(EventDispatcher.java:205) at com.google.net.async.EventDispatcher.loop(EventDispatcher.java:101) at com.google.net.rpc.RpcService.runUntilServerShutdown(RpcService.java:251) at com.google.apphosting.runtime.JavaRuntime$RpcRunnable.run(JavaRuntime.java:394) at java.lang.Thread.run(Unknown Source)

    Read the article

  • fitbounds() in Google maps api V3 does not fit bounds

    - by jul
    hi, I'm using the geocoder from Google API v3 to display a map of a country. I get the recommended viewport for the country but when I want to fit the map to this viewport, it does not work (see the bounds before and after calling the fitBounds function in the code below). What am I doing wrong? How can I set the viewport of my map to results[0].geometry.viewport? thanks jul var geocoder = new google.maps.Geocoder(); if(geocoder) { geocoder.geocode({'address': '{{countrycode}}'}, function(results, status) { var bounds = new google.maps.LatLngBounds(); bounds = results[0].geometry.viewport; console.log(bounds); //((35.173, -12.524), (45.244, 5.098)) console.log(map.getBounds()); //((34.628757195038844, -14.683750000000012), (58.28355197864712, 27.503749999999986)) map.fitBounds(bounds); console.log(map.getBounds()); //((25.740113512090183, -24.806750000000005), (52.44233664523719, 17.380749999999974)) }); } }
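
    A hedged sketch of what is usually going on here: fitBounds() is applied asynchronously, so a getBounds() call on the very next line still reports the old viewport, and the final bounds are generally somewhat larger than the requested viewport because the map snaps to discrete zoom levels. Reading the bounds once the map has settled shows the fitted result (the '{{countrycode}}' placeholder is kept from the question):

        geocoder.geocode({ 'address': '{{countrycode}}' }, function(results, status) {
          if (status === google.maps.GeocoderStatus.OK) {
            var viewport = results[0].geometry.viewport;
            google.maps.event.addListenerOnce(map, 'idle', function() {
              console.log(map.getBounds()); // now reflects the fitted viewport
            });
            map.fitBounds(viewport);
          }
        });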

    Read the article

  • ASP.NET ReportViewer Google Chrome CPU usage

    - by Phil
    Hello, we have found an interesting issue between ASP.NET 3.5 and ReportViewer with Google Chrome. Our set of pages works fine until a ReportViewer control displays a report. Google Chrome then eats up 50% of the CPU doing nothing, it seems. I've extracted the ReportViewer control to a blank Web Forms project to confirm it's that control and not a rogue bit of my code. I'm using ReportViewer in local mode (RDLC file), so I presume it's the 2005 version? Has anyone seen this before and found a solution? Phil Edit: Google Chrome 3.0.195.33 on Vista Business x64 Edit 2: Added bounty for help fixing this

    Read the article
