Search Results

Search found 24081 results on 964 pages for 'google docs'.

Page 9/964 | < Previous Page | 5 6 7 8 9 10 11 12 13 14 15 16  | Next Page >

  • Scraping Google Docs (can't use the API)

    - by Andy Waite
    I'm building an iPhone app which needs a piece of metadata from a user's Google Spreadsheet. Unfortunately the metadata I need is not exposed by the API, so I will need to scrape it from the document's HTML source (it would not be present in any of the exported variants). Is there any way to include authentication parameters in a call such as: http://spreadsheets.google.com/ccc?key=abc123&username=...&password=... (A token-based sketch follows this entry.)

    Read the article
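
    A minimal sketch, for the question above, of a token-based alternative to putting credentials in the URL, assuming the (long-deprecated) ClientLogin endpoint and its "wise" service code for Spreadsheets; the credentials, app name and Node 18+ runtime are placeholders for illustration, not details from the question.

      // Sketch only: ClientLogin is deprecated; "wise" was the Spreadsheets service code.
      async function fetchSpreadsheetHtml(key, email, passwd) {
        const login = await fetch('https://www.google.com/accounts/ClientLogin', {
          method: 'POST',
          headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
          body: new URLSearchParams({
            accountType: 'GOOGLE',
            Email: email,
            Passwd: passwd,
            service: 'wise',            // Google Spreadsheets
            source: 'example-scraper'   // hypothetical app identifier
          })
        });
        // The response body contains SID=..., LSID=..., Auth=... lines; keep the Auth token.
        const token = (await login.text()).match(/^Auth=(.+)$/m)[1];

        // Send the token as a header instead of credentials in the query string.
        const page = await fetch('http://spreadsheets.google.com/ccc?key=' + key, {
          headers: { Authorization: 'GoogleLogin auth=' + token }
        });
        return page.text();
      }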

  • Google Analytics and Whos.amung.us real-time visitors: why such an enormous discrepancy?

    - by jacouh
    For years I have used both Google Analytics and Whos.amung.us on the same site; both tracking scripts are inserted in the same pages in the tracked part of the site. Why is there such an enormous discrepancy in real-time visitors? For example, at the moment Google Analytics gives me 9 visitors while whos.amung.us indicates 59, a ratio of about 6. Why is whos.amung.us six times more optimistic than Google Analytics in terms of real-time visitors? My questions are: does whos.amung.us not detect robots while Google does? Does GA ignore visitors from some countries, but not whos.amung.us? Do some robots/bots execute the whos.amung.us tracking JavaScript, while none of them execute the tracking JavaScript provided by Google Analytics? To facilitate your analysis, I copy the JS code used below.

    Google Analytics:

      <script type="text/javascript">
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', 'MyGaAccountNo']);
      _gaq.push(['_trackPageview']);
      (function() {
        var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
        ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
        var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
      })();
      </script>

    Whos.amung.us:

      <script>
      var _wau = _wau || [];
      _wau.push(["tab", "MyWAUAccountNo", "c6x", "right-upper"]);
      (function() {
        var s = document.createElement("script"); s.async = true;
        s.src = "http://widgets.amung.us/tab.js";
        document.getElementsByTagName("head")[0].appendChild(s);
      })();
      </script>

    I already signaled this to the WAU staff some time ago (NR); I have not done the same with Google, as they don't handle this kind of feedback. Thank you for your explanations.

    Read the article

  • Google Chrome freezes for seconds

    - by Levan
    I do not know if this is the right place to write about Google Chrome, so sorry if it is not. After the update to Google Chrome 20.0.1132.47, Chrome started to lag: it gets stuck for a couple of seconds and then resumes. I think it starts lagging when I enter any Flash site. When I start Chrome from the terminal it shows the output below; before the update none of this appeared (the same group of lines repeats many times):

      ALSA lib pcm_dmix.c:957:(snd_pcm_dmix_open) The dmix plugin supports only playback stream
      ALSA lib pcm_direct.c:980:(snd1_pcm_direct_initialize_slave) unable to install hw params
      ALSA lib pcm_dsnoop.c:623:(snd_pcm_dsnoop_open) unable to initialize slave

    Firefox does not have any problem. Is there a way to fix this? Thank you for your time.

    Read the article

  • Track an Adobe Flash app hosted on multiple domains with Google Analytics

    - by roberkules
    I'm working on a Flash app that will be distributed to more and more partners (and obviously domains). It needs to be tracked both aggregated and separately. I implemented Google Analytics using gaforflash, tracking virtual pageviews and events inside the Flash app. What I want to achieve:

      - View an aggregated report of all partners.
      - Identify the partner not by the domain (where the Flash app is used), but by a partnerID.
      - Give each partner access to the report for his domain (no admin rights needed).

    I came up with this solution: use only one web property in Google Analytics (UA-XXXXXX-4, .example.com); set a custom/virtual hostname per partner (GA's "utmhn" parameter), e.g. partner1.example.com and partner2.example.com; and create a profile for each partner, with a filter that includes only the relevant "subdomain" (profile 1: include traffic from hostname ^partner1\., profile 2: include traffic from hostname ^partner2\.). Problems that came up: the gaforflash library doesn't support overriding the host name (a possible workaround is that the gaforflash source code is available, so I could add the functionality), and goals from the "master" profile are not copied to the partner profiles. Is it (very) bad to fake the hostname? Are there better approaches, or improvements you could think of? (A hedged sketch of a custom-variable alternative follows this entry.)

    UPDATE: I'm looking primarily for a solid data structure inside Google Analytics, regardless of the Flash implementation. The only limitations: we need an aggregated view across all partners, our partners need access to their subset of data, and we want to identify the partner by a custom partnerID, not the domain.

    Read the article
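
    A minimal sketch of the custom-variable alternative mentioned above, written as plain ga.js JavaScript rather than gaforflash (the ActionScript calls may differ, so treat the exact method names as an assumption): carry the partnerID in a visitor-level custom variable and/or a virtual path prefix, and filter the per-partner profiles on that instead of a faked hostname. The partner IDs and paths are hypothetical.

      // Sketch only: one shared web property, partner identified by a custom variable.
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', 'UA-XXXXXX-4']);

      // Visitor-level custom variable carrying the partner ID (slot 1, scope 1 = visitor).
      _gaq.push(['_setCustomVar', 1, 'partnerID', 'partner1', 1]);

      // Optionally prefix virtual pageviews with the partner ID so per-partner
      // profiles can filter on the request path instead of the hostname.
      _gaq.push(['_trackPageview', '/partner1/game/level-2']);
      _gaq.push(['_trackEvent', 'partner1', 'levelComplete', 'level-2']);

    The aggregated view then stays in the unfiltered master profile, while each partner profile filters on the custom variable value or the /partnerN/ path prefix.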

  • Major increase in Google "not able to follow" errors since introducing 301s to the site

    - by jakob
    Recently we implemented Varnish in front of our web nodes so that the backend would get some rest from time to time. Since Varnish is case sensitive and our app was not, we implemented a 301 in Varnish to redirect to lower case. Example: if you search for PlumBer StockHOLM you get a 301 redirect to plumber stockholm, and then plumber stockholm is cached. This worked like a charm, but when checking Google Webmaster Tools we suddenly got a crazy amount of "Status - Not able to follow" errors (screenshot not reproduced here). This of course stirred up some panic and I started to read the documentation once again; clicking one of the links took me to the help section. As the day progressed more and more errors were thrown by Google, so we took the decision to make Varnish return 200 instead of the 301. Now when testing the links that appear in the "Not able to follow" section I get a 200 back. I have tested with Chrome, curl and the Lynx reader and everything looks OK, but the number of errors is still increasing. What is a little bit comforting is that the links that appear in the "Not able to follow" section are dated before the 200 change in Varnish. Why do I get these errors and why do they keep increasing? Did Google release something new on October 31? Maybe I do not understand the docs correctly?

    Read the article

  • https:// search results appearing on Google for purely http:// site

    - by hydrurga
    I started weeding through my site's search results from Google today, using a site: search, to determine if there are any links that cause 404s and thus need redirecting. To my amazement I noticed numerous https:// results relating to various pages. My site doesn't have an SSL certificate, doesn't serve such pages, doesn't internally link to https:// pages, doesn't include any such files in its sitemap.xml and, for all of these, never has. I decided to do a Google search for https://<my site> and found one site that incorrectly refers to the root of my site with an https:// prefix; I will try to contact them to get them to correct this. I'm not sure, however, how Googlebot managed to index the non-root files as https://. I can't find any external links to them and surely, without a certificate, Googlebot should have stalled at the first request? I've just added the following lines to the site's .htaccess, replacing <my site> with my domain name (although the surfer still has to navigate through the browser's "This site is a security risk. Abandon hope all ye who enter here!" message(s) first to get there):

      RewriteEngine On
      RewriteCond %{HTTPS} on
      RewriteRule ^(.*)$ http://www.<my site>.org/$1 [R=301,L]

    My big question is this, though: I would like to use the Google Webmaster Tools Remove URLs feature to remove the https:// pages from the index. Can I be guaranteed that this will only remove the https:// versions of each relevant page and not the valid http:// versions? My thanks to anyone who can help me out with this particular question and the issue in general.

    Read the article

  • My parked domain was de-indexed by Google - what to do?

    - by Programmer Joe
    I have a question about how to handle my domain. In a nutshell, I bought a domain last year from Go Daddy. My intention was to launch a real site with this domain, and I have spent the last year working on the site. For the last year I have been using the default Go Daddy page for an up-and-coming site. When I first bought the domain it was indexed by Google: you could search for "alphabanter" and my site would show up on Google's results page. Several months ago Google seemed to de-index my domain, and if you type "alphabanter" it no longer shows up in the search results. However, searching for "www.alphabanter.com" is now the only way it shows up in Google's results. Anyway, I am about to launch my site for real, but I don't quite know if I can get my site back into Google's index. I have a few questions:

      1) Was my domain permanently penalized by Google and removed from their index just because it was a parked domain? I don't believe I have done anything abusive other than using the Go Daddy default page for almost a year because my site was not ready.
      2) Should I just launch my site, put a few backlinks to it, and hope that Google indexes it again?
      3) Should I submit my site to Google via its "Submit your content" page?

    I assume asking Google to reconsider my site is the last option if none of the above works.

    Read the article

  • Google suddenly only indexes https and not http

    - by spender
    All of a sudden, searches for our site "radiotuna" return the result as an HTTPS link: https://www.google.com/?q=radiotuna#hl=en&safe=off&output=search&sclient=psy-ab&q=radiotuna&oq=radiotuna&gs_l=hp.12...0.0.0.3499.0.0.0.0.0.0.0.0..0.0.les%3B..0.0...1c.LnOvBvgDOBk&pbx=1&bav=on.2,or.r_gc.r_pw.r_qf.&fp=177c7ff705652ec3&biw=1366&bih=602 We only use HTTPS for the download of two specific files (these URLs are resources used for the auto-update functionality of an app we distribute). All other parts of the site should be served over HTTP. We wouldn't like to see any other traffic over HTTPS, nor any of our site links appearing in search engines as HTTPS. I'd like to address this issue. It seems the following solutions are available: hand out an HTTPS-specific robots.txt such as

      User-agent: *
      Disallow: /

    and/or, at app level, 301 permanent redirect all requests (except the two above) to HTTP if they come in as HTTPS. My concern with the robots method is that if, for some reason, Google decided not to index the HTTP pages, disallowing the HTTPS pages might mean Google has nothing left to index, with disastrous consequences for our ranking. This means I'm inclined to go with a 301 redirect (a hedged sketch of such a redirect follows this entry). Any thoughts?

    Read the article
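
    A minimal sketch of the app-level 301 mentioned above, written as Express-style Node.js middleware purely for illustration; the question does not say what stack radiotuna runs on, and the exempted update paths are hypothetical.

      const express = require('express');
      const app = express();

      // Hypothetical paths for the two files that must stay on HTTPS.
      const HTTPS_ALLOWED = ['/updates/app.xml', '/updates/app.zip'];

      app.use((req, res, next) => {
        // req.secure covers direct TLS; x-forwarded-proto covers a proxy or load balancer.
        const isHttps = req.secure || req.headers['x-forwarded-proto'] === 'https';
        if (isHttps && !HTTPS_ALLOWED.includes(req.path)) {
          return res.redirect(301, 'http://' + req.headers.host + req.originalUrl);
        }
        next();
      });

      app.listen(8080);

    Because the redirect is permanent, crawlers that picked up the HTTPS URLs should eventually consolidate on the HTTP versions.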

  • Does Google Maps API v3 allow larger zoom values?

    - by Dr1Ku
    If you use the satellite GMapType in this Google-provided example in v3 of the API, the maximum zoom level has a scale of 2 m / 10 ft, whereas using the v2 version of another Google-provided example (I had to use another one since control-simple doesn't have the scale control) yields a maximum scale of 20 m / 50 ft. Is this a new "feature" of v3? I have to mention that I've tested the examples in the same GLatLng regions, so my guess is that the tile detail level doesn't influence it; am I mistaken? As mentioned in another question, v3 should be considered of very Labs-y/beta quality, so its use in production should be discouraged for the time being. I've been drawn to the subject since I have to "increase the zoom level of a GMap"; the answers here seem to suggest using GTileLayer, and I'm considering GMapCreator, although this will involve some effort. What I'm trying to achieve is a larger zoom level: a scale of 2 m / 10 ft would be perfect. I have a map where the tiles aren't that hi-res and quite a few markers. Seeing that the area doesn't have hi-res tiles, the distance between the markers is really tiny, creating some problematic overlapping. Or better yet, how can you create a custom map which allows higher zoom levels, as over the Google campus where the 2 m / 10 ft scale is achieved, without using your own tile server? I've seen an example in a fellow Stack Overflower's GMaps sandbox where the tiles are created manually based on the zoom level. (A hedged v3 sketch of a custom tile layer with a higher maxZoom follows this entry.) I don't think I'm making any more sense, so I'm just going to end this big question here; I've been wandering around trying to find a solution for hours now. Hope that someone comes to my aid though! Thank you in advance!

    Read the article
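
    A minimal sketch of a custom tile layer with a higher maxZoom in Maps API v3, as referenced above; the tile URL, the zoom limit of 22 and the map element id are assumptions for illustration, not details from the question.

      // Sketch only: hypothetical tile server and zoom limit.
      const customType = new google.maps.ImageMapType({
        name: 'Hi-res',
        tileSize: new google.maps.Size(256, 256),
        maxZoom: 22, // higher than the default imagery allows in this area
        getTileUrl: function (coord, zoom) {
          return 'https://tiles.example.com/' + zoom + '/' + coord.x + '/' + coord.y + '.png';
        }
      });

      const map = new google.maps.Map(document.getElementById('map'), {
        center: new google.maps.LatLng(37.422, -122.084),
        zoom: 20,
        mapTypeControlOptions: { mapTypeIds: ['hires'] }
      });
      map.mapTypes.set('hires', customType);
      map.setMapTypeId('hires');

    This only helps if tiles actually exist at the higher zoom levels; without hi-res imagery or a tile source of your own, the overlapping-marker problem is usually handled with marker clustering instead.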

  • Google Spreadsheet API from Android: google-api-java-client or handmade?

    - by yetanothercoderu
    For working with the Google Spreadsheet API from Android (2.2), Google suggests using google-api-java-client for Android. For that you have to include 5 jars in your Android application:

      guava-r09.jar
      google-http-client-extensions-android2-1.6.0-beta.jar
      google-api-client-extensions-android2-1.6.0-beta.jar
      google-http-client-1.6.0-beta.jar
      google-api-client-1.6.0-beta.jar

    and dig into the google-api-java-client javadocs for a fast-changing API. Is it worth the effort, in terms of Android specifics and device fragmentation? Isn't it more reasonable to write your own simple HTTP response parser, or to take a small existing library like google-spreadsheet-lib-android? Thanks! UPDATE: I chose google-api-java-client in the end, as it has all the routine stuff (like parsing HTTP and XML) out of the box.

    Read the article

  • Google Document export via API

    - by micco
    After using Zend_GData to retrieve a document list feed, I can use the content URLs in the form http://docs.google.com/document/edit?id=<docid>&hl=en, but the source URLs in the form http://docs.google.com/feeds/download/documents/Export?docId=<docid>&exportFormat=html are returning 404 errors. That URL should return the content of the document in the requested format, but it is returning 404. This problem is mentioned without resolution on a Google API forum. As indicated in that forum post, it only seems to affect new documents: my code works perfectly when retrieving old documents, but new ones return 404. Has something changed in the way Google references new documents or in the way permissions are assigned? The code I'm using is essentially the same as the code on this page, but this does not seem to be an issue specific to PHP/Zend_Gdata.

    Read the article

  • Explore Six of the Ocean’s Incredible Coral Reefs with Google Maps

    - by Asian Angel
    Are you ready to view some gorgeous underwater photography and explore the ocean's coral reefs from the comfort of your desktop? Then you will definitely enjoy this wonderful collection of underwater 'street view level' coral reef images from Google Maps. Once you get started you can easily lose yourself in the tranquillity and beauty of these oceanic kingdoms. Here is a quick peek at the collection that is available for your viewing pleasure…

    Read the article

  • Google Webmaster Tools search queries position

    - by user1592845
    In my website's account on Google Webmaster Tools, some search queries show an average position of 1.0. This makes me think my site should be displayed as the first result. But when I search for such a query, I am not able to find my website's page in the results! In some cases I navigate to the third or fourth results page and still cannot find it. What factors make my website lose its average position for a search query? And when does Google Webmaster Tools update these values?

    Read the article

  • Remove URLs to unindex blog content from Google, then copy the blog's content to a new blog [closed]

    - by sam
    Possible Duplicate: migrating PR / rankings from one site to another

    I've been writing a blog for the past year or so, with about 300 published articles; the blog has been running under a subdomain, blog.mysite.com. We are now looking to change the URL of our site, so the blog is going to have to be ported over to a subdomain on the new site. We would really like to keep the backlog/archive of all the articles we have written, but we don't want to be penalized for having duplicate content. Could we just remove/unindex the URLs from Google in Webmaster Tools, then export the blog and import it back into our new blog? Would Google still see this as duplicate content, or, because I've removed the URLs, do they no longer have a copy of it? Thanks

    Read the article

  • How to Create Custom Personalized Maps in Google Maps [Video]

    - by Asian Angel
    Though the custom maps feature has been out for a while, you may have forgotten about it or it may be all new to you. Either way, this wonderful video shows you how to create your own custom maps and enjoy an awesome feature of this popular Google service. 'My Maps' is mentioned in the video, but is now referred to as 'My Places' in Google Maps. There is also a nice interactive tutorial available.

    Read the article

  • Google Webmaster Tools: parameters that only apply on some pages

    - by Imagine digital
    I'm trying to get my e-commerce website on Google and am still figuring out how it all works. I have seen the feature named URL parameters, which allows me to set parameters that affect the page content to be indexed (one can also set parameters that do not affect the page, but for me that does not apply). My question is whether and how I should add parameters that only exist on some pages of my site. Example: the homepage of my site is www.mysite.nl, with no parameters at all. But when a user clicks the navigation bar, it links to www.mysite.nl/itemList.php?category=...&subCategory=.... The parameters category and subCategory define whether there is content on my itemList page and what that content is; matching products are pulled out of my database based on those two variables. The question: how do I apply Google's URL parameters feature correctly for my website?

    Read the article

  • Google Analytics Goal Tracking for Sub-Domains?

    - by Hasan Khan
    I am trying to track goals in Google Analytics for a website that has the goal URL on a subdomain. The main domain, for example, is domain.com, and the subdomain is my.domain.com. I have Google Analytics configured to track the domain and all subdomains, and I've even set up an advanced filter so I can see traffic to my subdomains in Analytics. However, for goal tracking you're supposed to enter only the URL path (so if it were domain.com/conversions/ you'd put in just /conversions/). Since for me it would be my.domain.com/conversions/, how would I input that URL into Analytics to track it? Would Analytics automatically determine the URL to be on the subdomain? (A hedged sketch of a common cross-subdomain setup follows this entry.) Thanks!

    Read the article
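
    A minimal sketch of the classic ga.js cross-subdomain setup that this kind of goal tracking is usually paired with; the property ID is a placeholder, and the profile filter mentioned afterwards is one common convention rather than something stated in the question.

      <script type="text/javascript">
        // Same snippet on domain.com and my.domain.com so one tracker covers both.
        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-XXXXXX-1']);     // placeholder property ID
        _gaq.push(['_setDomainName', '.domain.com']);  // share cookies across subdomains
        _gaq.push(['_trackPageview']);
      </script>

    With this in place, a profile filter that prepends the hostname to the request URI turns the conversion page into /my.domain.com/conversions/, which can then be entered as the goal's destination without colliding with identical paths on other subdomains.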

  • Managing multiple Adwords accounts from one Google account

    - by CJM
    I have a Gmail account linked to numerous Analytics accounts and a couple of AdWords accounts; that is, I can track stats from a dozen or so sites and have administration rights for a couple of AdWords campaigns. However, a client has already set up their AdWords account and has invited me to help administer their campaign. When I try to accept, I get the following error: "The Google Account xxx already has access to an AdWords account (Customer ID: )." As many have discovered, for some reason Google won't let an account that already owns an AdWords campaign join another account. I'm wondering if there is any workaround for this? Temporarily I'm using a separate Gmail account, but what is the longer-term solution? Going forwards, some clients will be happy for me to 'host' their campaigns (while providing them with access), but I'm equally sure that many will want to retain greater control. Surely there must be a better way than creating an additional Gmail account for each client? How do web/SEO agencies handle this?

    Read the article

  • Resetting Google's data for the website

    - by Giorgi
    I own a domain which I was using for experimenting with different platforms and content management systems. I really did not care about SEO while I was building the website, so I guess my website has quite a 'bad reputation' in Google search. My website is almost finished and I am planning to launch it soon, but I would like to tell Google to forget everything it knows about it and start over. What's the correct way of doing this? I found "Requesting reconsideration of site", but I'm not sure if that's the best way to do it. Any advice?

    Read the article

  • Why is Google not indexing my forum?

    - by jsoldi
    I have this web site that, as you can see, has a "Forums" button on the top right that links to my forums. My problem is that if, for instance, I do the Google search "site:heliumscraper.com answers", I don't get any results, even though the word "answers" is in the forum's board index and has been there for a few months already. I don't have a robots file. I also uploaded a sitemap to Google that contained the forum. I'm thinking the problem might be the fact that the only link from my main page to the forum opens in a new window. Could this be the problem?

    Read the article

  • Wrong content for URL cache on Google

    - by user32592
    I have this website, natural-track.com, and when I do a cache check I get a completely different website: "This is Google's cache of http://www.backpackers-planet.com/modules.php?name=Web_Links&l_op=visit&lid=3379", which is unrelated to my site. I have checked with the host; they say all is well on their side. How can we fix it? The site has also dropped out of Google Search. We are about to rebuild this site on a better, professional platform, but first we would like to have an idea of what happened and how to fix it.

    Read the article

  • Reading 'Index Status' graph in Google Webmaster tools

    - by sam
    I recently found a bunch of old files that had been FTP'ed to a live production server by mistake on a static (HTML/CSS/JS) site. I manually deleted these files, but today when checking Google Webmaster Tools I found the Index Status graph below. The 'update' marker is from 3/9/14. What I can't work out is what Google is trying to tell me. Are they saying that there was a ranking update like Penguin or Panda and they penalized my site and de-indexed a load of pages which they thought were junk; or is this showing that I updated the site by deleting the files on the server on 3/9/14; or is this something else?

    Read the article

  • Google Webmaster Tools showing 6 pages submitted, 0 indexed, yet I can see them all when I search in Google?

    - by sam
    I have a small 'brochure' type site with 6 pages; I can see all of them showing up in Google when I search for my site. But in Webmaster Tools, under the Sitemaps section, it says 6 submitted (the blue bar of the graph), while the indexed pages (the red bar) show 0, even though they seem to be indexed. Any idea why this is? I don't really think it's that important, as the pages are still indexed, but it just seems odd.

    UPDATE 9/3/12: Having just looked in Google Webmaster Tools, it shows 11 pages indexed under the Health > Index Status tab, but under the Optimization > Sitemaps tab it shows 6 URLs submitted and only 1 indexed. (Index Status and Sitemap status screenshots not reproduced here.)

    Read the article

  • Google shows "search instead for" when searching for our website

    - by Athanatos
    Our website is new and its name is similar (only one letter different) to another website, though a completely different type of site and company. Searching for xxxxxA works OK in Google and we find relatively good results. However, searching for xxxxxA.com returns results for the other website and gives us the following options: "Showing results for xxxxxE.com" and "Search instead for xxxxxA.com" (a hyperlink which, when clicked, correctly searches for our site). Questions: Do we need to contact Google to correct this, and if yes, how? If not, will it be corrected automatically when the site becomes more popular, and what is the process? How do we make the process quicker?

    Read the article

  • How to track registrations that come from Google AdWords ads in Google Analytics?

    - by automatix
    I created a campaign in Google AdWords, created some ads in it, and gave them URLs like:

      mydomain.tld/registration/?utm_campaign=mycampaing&ad=x
      mydomain.tld/registration/?utm_campaign=mycampaing&ad=y
      mydomain.tld/registration/?utm_campaign=mycampaing&ad=z

    All ads lead to the registration page. A registration is a visit to the page mydomain.tld/registration-complited/?user={ID}, so I can track the registrations in Google Analytics: I just go to Behavior -> Site Content -> All Pages and filter the pages to registration-complited. But how can I see how many and which users have registered after they came from an ad of a campaign, e.g. utm_campaign? And how can I also track this for a single ad of the campaign, e.g. x? (A hedged sketch of standard campaign tagging follows this entry.)

    Read the article
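
    A minimal sketch for the question above. Google Analytics only picks up the utm_* parameters automatically, so a custom ad= parameter will not appear in the campaign reports; tagging each ad's destination URL with utm_content instead (for example ...?utm_source=google&utm_medium=cpc&utm_campaign=mycampaing&utm_content=x), or simply enabling AdWords auto-tagging, is an assumption about the intent here. The snippet below records the completed registration as a virtual pageview that can be defined as a goal; the property ID and virtual path are placeholders.

      <script type="text/javascript">
        // On mydomain.tld/registration-complited/ : record the conversion as a
        // virtual pageview so it can be used as a goal and segmented by campaign
        // and by ad variant (utm_content) in the Traffic Sources reports.
        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-XXXXXX-1']);  // placeholder property ID
        _gaq.push(['_trackPageview', '/goal/registration-completed']);
      </script>

    With the goal defined on /goal/registration-completed, the Traffic Sources reports can then be segmented by campaign and by ad content to see which ad each registration came from.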
