Search Results

Search found 25492 results on 1020 pages for 'google cloud endpoints'.

  • how to: dynamically load google ajax api into chrome extension content script

    - by Hoff
    Hi there, I'm trying to make use of Google's AJAX APIs in a Chrome extension's "content script". On a regular HTML page, I would just do this:

      <script src="http://www.google.com/jsapi"></script>
      <script> google.load("language", "1"); </script>

    But since I'm trying to load the translation library dynamically from JS code, I've tried:

      script = document.createElement("script");
      script.src = "http://www.google.com/jsapi";
      script.type = "text/javascript";
      document.getElementsByTagName("head")[0].appendChild(script);
      google.load('language', '1');

    but the last line throws the following error: Uncaught TypeError: Object # has no method 'load'. Funnily enough, when I enter the same google.load('language','1') in Chrome's JS console, it works as intended... I've also tried jQuery's .getScript(), but the same problem persists... Does anybody have any clue what the problem might be and how it could be solved? Many thanks in advance! Martin
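    Two things are worth noting here. First, appendChild() only starts an asynchronous download, so google.load runs before jsapi has arrived (which is why the same call works later from the console). Second, Chrome content scripts run in an isolated world, so a <script> tag appended to the page loads into the page's context, not the content script's, which by itself can produce exactly this "no method 'load'" error. Below is a minimal sketch of the timing fix, assuming the script is allowed to load in your context; note the Google Loader also requires a callback option when invoked after page load:

      // Sketch: wait for jsapi's onload, then load the language API with a callback.
      var script = document.createElement("script");
      script.src = "http://www.google.com/jsapi";
      script.type = "text/javascript";
      script.onload = function () {
        // The google object exists only once jsapi has finished loading.
        google.load("language", "1", {
          callback: function () {
            // The translation API is ready; safe to use it here.
            google.language.translate("hello", "en", "es", function (result) {
              console.log(result.translation);
            });
          }
        });
      };
      document.getElementsByTagName("head")[0].appendChild(script);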

  • Get Chinese Romanization from Google Translate API

    - by krubo
    The Google language translate API works cleanly to translate into Chinese:

      <script type="text/javascript" src="http://www.google.com/jsapi"></script>
      <script>
        google.load('language', '1');
        function googletrans(text) {
          google.language.translate(text, 'en', 'zh', function(result) {
            alert(result.translation);
          });
        }
      </script>
      <input onchange="googletrans(this.value);">

    Example input: "Hello" Result: "你好" My problem is I can't get the romanization (the pronunciation in English letters). This is a known issue. Now, the data is right there on translate.google.com (example input: "Hello" Result: "Ni hao") and I can even see it by pointing my browser to:

      http://translate.google.com/translate_a/t?client=t&text=hello&hl=en&sl=en&tl=zh-CN&otf=2&pc=0

    Result: {"sentences":[{"trans":"你好","orig":"hello","translit":"Ni hao"}], "dict":[{"pos":"interjection","terms":["?"]}],"src":"en"} But somehow when I try to fetch this URL with ajax it fails (XMLHttpRequest Exception 101). Is there any way to retrieve this romanization data with ajax?
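    The XMLHttpRequest failure is almost certainly the same-origin policy: translate.google.com doesn't grant cross-origin access to that endpoint, so a browser XHR from another domain is refused. One hedged workaround, assuming you control a server, is to route the request through your own origin; the /translit-proxy path below is hypothetical, standing in for a server-side handler you would write to forward the query to translate.google.com and return the JSON:

      // Sketch: fetch the romanization via a same-origin proxy (hypothetical endpoint).
      function googletranslit(text, callback) {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', '/translit-proxy?text=' + encodeURIComponent(text), true);
        xhr.onreadystatechange = function () {
          if (xhr.readyState === 4 && xhr.status === 200) {
            var data = JSON.parse(xhr.responseText);
            // "translit" holds the romanization, e.g. "Ni hao" for 你好.
            callback(data.sentences[0].translit);
          }
        };
        xhr.send(null);
      }

    Keep in mind the translate_a/t endpoint is undocumented, so it may change or be blocked without notice.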

  • Google Reader API - feed/[FEEDURL]/ is coming back as Not found

    - by JustinXXVII
    There is one feed I'm subscribed to which always turns up as NOT FOUND when I try to use the API. I return an array of dictionaries, containing 3 objects. The first in the list represents the user himself, like so:

      { FeedID = "user/MY_UNIQUE_NUMBER/state/com.google/reading-list"; Timestamp = 1273448807271463; Unread = 59; }

    The Unread count is very important. My client depends on downloading 59 items from Google before it refreshes. If a feed doesn't download properly, the count is off and the client won't update. An example of a working feed is here:

      { FeedID = "feed/http://arstechnica.com/index.rssx"; Timestamp = 1273447158484528; Unread = 13; }

    The FeedID value combines with a specially formatted URL string and gives back a list of articles. The above example works fine. However, the following feed always returns NOT FOUND on Google, and if I paste the URL verbatim into a browser, it never turns up. See here:

      { FeedID = "feed/http://www.peopleofwalmart.com/?feed=rss2"; Timestamp = 1273424138183529; Unread = 6; }

      http://www.google.com/reader/api/0/stream/contents/feed/http://www.peopleofwalmart.com/?feed=rss2?ot=1&r=n&xt=user/-/state/com.google/read&n=6&ck=1273449028&client=testClient

    If you are at all proficient with the API, can you please help me? Like I said, since Google always says NOT FOUND when I search for that feed, my download count is off by N articles and won't update. I would rather not hack around it, honestly. Thanks!
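    One thing that stands out in the failing URL is that the feed ID's own query string is pasted in raw: the ? and = in http://www.peopleofwalmart.com/?feed=rss2 get parsed as part of the Reader request itself, and the request's real parameters are then introduced by a second ? instead of &. A hedged sketch of building the stream URL with the feed portion percent-encoded:

      // Sketch: percent-encode the feed URL before splicing it into the Reader API path.
      var feedUrl = 'http://www.peopleofwalmart.com/?feed=rss2';
      var streamUrl = 'http://www.google.com/reader/api/0/stream/contents/feed/' +
          encodeURIComponent(feedUrl) +   // escapes the embedded :, /, ? and =
          '?ot=1&r=n' +
          '&xt=' + encodeURIComponent('user/-/state/com.google/read') +
          '&n=6&ck=1273449028&client=testClient';
      // The trailing parameters now attach to the Reader request itself,
      // not to the embedded feed URL.

    Whether Reader then finds the feed under exactly that ID is a separate question, but at least the request will be parsed the way the FeedID intends.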

  • How to get the equivalent of the accuracy in Google Map Geocoder V3

    - by Scorpi0
    Hi, I want to get geocodes from Google, and I used to do it with V2 of the API. Google sends in the JSON a pretty useful piece of information, the accuracy; reference here: http://code.google.com/intl/fr-FR/apis/maps/documentation/javascript/v2/reference.html#GGeoAddressAccuracy In V3, Google doesn't seem to send me exactly the same information. There is the array "address_components", which seems bigger when the accuracy is better, but not reliably so. For example, I have a request accurate to the street number, and the array is of size 8. Another query is accurate only to the route, so less accurate, but the array is still of size 8, because there is a 'sublocality' row which did not appear in the first case. Now, for a result Google does send a 'types' field, which holds the 'best' accuracy. These types are listed here: http://code.google.com/intl/fr-FR/apis/maps/documentation/geocoding/#Types But there is no real order to them, and if I want only the results better than postal_code, I have no clue how to do that. So, how can I get the equivalent of the V2 accuracy, without some dumb and horrible code?
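    Since V3 only gives you the types list, one workable approach is to impose your own ordering on it. A sketch follows, with the caveat that the ranking below is my own rough precision order over the documented types, not anything Google specifies:

      // Sketch: map V3 result types onto a V2-like accuracy scale (the ordering is an assumption).
      var ACCURACY_ORDER = [
        'country', 'administrative_area_level_1', 'administrative_area_level_2',
        'locality', 'sublocality', 'postal_code', 'neighborhood',
        'route', 'intersection', 'street_address', 'premise'
      ];

      function accuracyOf(result) {
        var best = -1;
        for (var i = 0; i < result.types.length; i++) {
          var rank = ACCURACY_ORDER.indexOf(result.types[i]);
          if (rank > best) best = rank;
        }
        return best; // higher means more precise; -1 means an unranked type
      }

      // e.g. keep only results more precise than postal_code:
      // results.filter(function (r) { return accuracyOf(r) > ACCURACY_ORDER.indexOf('postal_code'); });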

  • Move file or folder to a different folder in google document using api problem

    - by Minh Nguyen
    In Google Documents I have this structure:

      Folder1
      +------Folder1-1
      +------+------File1-1-1
      +------Folder1-2
      +------File1-1
      Folder2

    I want to move "File1-1" to "Folder2" using the .NET Google API library (Google Data API SDK):

      public static void moveFolder(string szUserName, string szPassword, string szResouceID, string szToFolderResourceID)
      {
          string szSouceUrl = "https://docs.google.com/feeds/default/private/full" + "/" +
              HttpContext.Current.Server.UrlEncode(szResouceID);
          Uri sourceUri = new Uri(szSouceUrl);

          // create an atom entry
          AtomEntry atom = new AtomEntry();
          atom.Id = new AtomId(szSouceUrl);

          string szTargetUrl = "http://docs.google.com/feeds/default/private/full/folder%3Aroot/contents/";
          if (szToFolderResourceID != "")
          {
              szTargetUrl = "https://docs.google.com/feeds/default/private/full" + "/" +
                  HttpContext.Current.Server.UrlEncode(szToFolderResourceID) + "/contents";
          }
          Uri targetUri = new Uri(szTargetUrl);

          DocumentsService service = new DocumentsService(SERVICENAME);
          ((GDataRequestFactory)service.RequestFactory).KeepAlive = false;
          service.setUserCredentials(szUserName, szPassword);
          service.EntrySend(targetUri, atom, GDataRequestType.Insert);
      }

    After running this function I have:

      Folder1
      +------Folder1-1
      +------+------File1-1-1
      +------Folder1-2
      +------File1-1
      Folder2
      +------File1-1

    "File1-1" appears in both "Folder1" and "Folder2", and when I delete it from one folder it is deleted from the other folder as well. (Expected: "File1-1" appears only in "Folder2".) What is happening? How can I solve this problem?

  • Exposing SOAP, OData, and JSON Endpoints for RIA Services (Silverlight TV 26)

    In this video, John meets with Deepesh Mohnani from the WCF RIA Services team. Deepesh demonstrates how to expose various endpoints from WCF RIA Services. This is a great explanation and walkthrough of how to open RIA Services domain services to clients, including:

      • Silverlight clients (of course)
      • Creating an OData endpoint and showing how Excel can use it
      • Creating a SOAP endpoint to a domain service and using it from a Windows Phone 7 application
      • Creating a JSON endpoint and having...

  • Can I retain a Google apps session token permanently for a specific user who logs into my google app

    - by Ali
    Hi guys, is it possible to retain, upon authorization, a single session token for a user who signs into my Google application? Currently my application seems to require the user to authenticate into Google Apps every now and then. I think it has to do with the session dying out or so. I have the following code:

      function getCurrentUrl() {
          global $_SERVER;
          $php_request_uri = htmlentities(substr($_SERVER['REQUEST_URI'], 0,
              strcspn($_SERVER['REQUEST_URI'], "\n\r")), ENT_QUOTES);
          if (isset($_SERVER['HTTPS']) && strtolower($_SERVER['HTTPS']) == 'on') {
              $protocol = 'https://';
          } else {
              $protocol = 'http://';
          }
          $host = $_SERVER['HTTP_HOST'];
          if ($_SERVER['SERVER_PORT'] != '' &&
              (($protocol == 'http://' && $_SERVER['SERVER_PORT'] != '80') ||
               ($protocol == 'https://' && $_SERVER['SERVER_PORT'] != '443'))) {
              $port = ':' . $_SERVER['SERVER_PORT'];
          } else {
              $port = '';
          }
          return $protocol . $host . $port . $php_request_uri;
      }

      function getAuthSubUrl($n = false) {
          $next = $n ? $n : getCurrentUrl();
          $scope = 'http://docs.google.com/feeds/documents https://www.google.com/calendar/feeds/ https://spreadsheets.google.com/feeds/ https://www.google.com/m8/feeds/ https://mail.google.com/mail/feed/atom/';
          $secure = false;
          $session = true;
          //echo Zend_Gdata_AuthSub::getAuthSubTokenUri($next, $scope, $secure, $session);
          return Zend_Gdata_AuthSub::getAuthSubTokenUri($next, $scope, $secure, $session)
              . (isset($_SESSION['domain']) ? '&hd=' . $_SESSION['domain'] : '');
      }

      function _regenerate_token() {
          global $BASE_URL;
          if (!$_SESSION['token']) {
              if (isset($_GET['token'])):
                  $_SESSION['token'] = Zend_Gdata_AuthSub::getAuthSubSessionToken($_GET['token']);
                  return;
              else:
                  _regenerate_sessions();
                  _redirect(getAuthSubUrl($BASE_URL . '/index.php?' . $_SERVER['QUERY_STRING']));
              endif;
          }
      }

      _regenerate_token();

    I know I'm doing it all wrong here and I don't know why :( I have a CONSUMER SECRET code but only use it wherever I need to access a Google service. However, something is wrong with my authentication, as the user has to periodically 'grant access to my application' and reauthorize himself... help please.

  • Javascript and the Google Maps API

    - by Tiny Giant Studios
    Hiya coding ninjas! I'm in a spot of bother and my hairline is on the chopping block. When I integrated the Maps API on this site, ritaknoetze.com, everything worked perfectly. However, after copying that exact code to a different demo website, scarabpaper, the map doesn't show up at all. Could someone show me the ropes on what I'm doing wrong? Here's the code I got from Google itself that I modified for my WordPress theme/installation.

    JavaScript:

      <meta name="viewport" content="initial-scale=1.0, user-scalable=no" />
      <script type="text/javascript" src="http://maps.google.com/maps/api/js?sensor=false"></script>
      <script type="text/javascript">
        function initialize() {
          var myLatlng = new google.maps.LatLng(-34.009839, 22.78101);
          var myOptions = {
            zoom: 9,
            center: myLatlng,
            navigationControl: true,
            mapTypeControl: false,
            scaleControl: false,
            mapTypeId: google.maps.MapTypeId.ROADMAP
          }
          var map = new google.maps.Map(document.getElementById("map_canvas"), myOptions);
          var image = '<?php bloginfo('template_url')?>/assets/googlemaps_marker.png';
          var myLatLng = new google.maps.LatLng(-34.009839, 22.78101);
          var beachMarker = new google.maps.Marker({
            position: myLatLng,
            map: map,
            icon: image
          });
        }
      </script>

    My HTML where the JavaScript goes:

      <div class="contact_container">
        <div id="map_canvas"></div>
        <div class="clearfloat"></div>
      </div>

    My CSS for the affected divs:

      #map_canvas {
        width: 880px;
        height: 300px;
        margin-left: 10px;
        margin-bottom: 30px;
        margin-top: 10px;
        float: left;
        border: 1px solid #dedcdc;
      }
      .contact_container { /* container for ALL the contact info */
        background-color: #fff;
        border: 1px solid #dedcdc;
        width: 900px;
        margin-top: 30px;
        padding: 20px;
        padding-bottom: 0;
      }

    Any help would be greatly appreciated...
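    One detail worth checking: in the code above, initialize() is defined but nothing shown ever calls it, so if the working site invokes it from a body onload attribute that didn't make it into this theme, the map will silently never draw. A minimal sketch of wiring it up without touching the body tag (a guess at the likely fix, not a certain diagnosis):

      // Sketch: make sure initialize() actually runs once the page has loaded.
      google.maps.event.addDomListener(window, 'load', initialize);
      // or, without the Maps helper:
      // window.onload = initialize;

    It's also worth opening the JavaScript console on the broken page; an earlier script error in the theme can stop initialize() from ever being reached.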

  • how to upload a audio file using REST webservice in Google App Engine for Java

    - by sathya
    I am using Google App Engine with the Eclipse IDE and trying to upload an audio file. I used the file upload in Google App Engine for Java and am able to upload the file successfully. Now I am planning to use a REST web service for it. I looked through developers.google.com but failed to find what I needed. Can anyone suggest how to implement REST web services in Google App Engine using Eclipse? The code Google provided is shown below:

      // file Upload.java
      public class Upload extends HttpServlet {
          private BlobstoreService blobstoreService = BlobstoreServiceFactory.getBlobstoreService();

          public void doPost(HttpServletRequest req, HttpServletResponse res) throws ServletException, IOException {
              Map<String, BlobKey> blobs = blobstoreService.getUploadedBlobs(req);
              BlobKey blobKey = blobs.get("myFile");
              if (blobKey == null) {
                  res.sendRedirect("/");
              } else {
                  res.sendRedirect("/serve?blob-key=" + blobKey.getKeyString());
              }
          }
      }

      // file Serve.java
      public class Serve extends HttpServlet {
          private BlobstoreService blobstoreService = BlobstoreServiceFactory.getBlobstoreService();

          public void doGet(HttpServletRequest req, HttpServletResponse res) throws IOException {
              BlobKey blobKey = new BlobKey(req.getParameter("blob-key"));
              blobstoreService.serve(blobKey, res);
          }
      }

      // file index.jsp
      <%@ page import="com.google.appengine.api.blobstore.BlobstoreServiceFactory" %>
      <%@ page import="com.google.appengine.api.blobstore.BlobstoreService" %>
      <% BlobstoreService blobstoreService = BlobstoreServiceFactory.getBlobstoreService(); %>
      <form action="<%= blobstoreService.createUploadUrl("/upload") %>" method="post" enctype="multipart/form-data">
          <input type="file" name="myFile">
          <input type="submit" value="Submit">
      </form>

      // web.xml
      <servlet>
          <servlet-name>Upload</servlet-name>
          <servlet-class>Upload</servlet-class>
      </servlet>
      <servlet>
          <servlet-name>Serve</servlet-name>
          <servlet-class>Serve</servlet-class>
      </servlet>
      <servlet-mapping>
          <servlet-name>Upload</servlet-name>
          <url-pattern>/upload</url-pattern>
      </servlet-mapping>
      <servlet-mapping>
          <servlet-name>Serve</servlet-name>
          <url-pattern>/serve</url-pattern>
      </servlet-mapping>

    Now, how do I provide a REST web service for the above code? Kindly suggest an idea.

  • Rackspace Cloud Sites API (Not Cloud Servers)

    - by Jeff
    I'm looking for a way to pull data from my Rackspace Cloud SITES account (not Cloud Servers). The data I want to pull is bandwidth, disk space, and compute cycles (all available from the control panel). I'd like to set up my own warning system so I'm notified if I'm close to my limits in any given month. Does anyone know of a way/API to do this?

  • Backup Google Calendar programmatically: https://www.google.com/calendar/exporticalzip

    - by Michael
    I'm struggling with writing a Python script that automatically grabs the zip file containing all my Google calendars and stores it (as a backup) on my hard disk. I'm using ClientLogin to get an authentication token (and can successfully obtain the token). Unfortunately, I'm unable to retrieve the file at https://www.google.com/calendar/exporticalzip It always asks me for the login credentials again by returning a login page as HTML (instead of the zip). Here's the critical code:

      post_data = urllib.urlencode({'auth': token, 'continue': zip_url})
      request = urllib2.Request('https://www.google.com/calendar', post_data, header)
      try:
          f = urllib2.urlopen(request)
          result = f.read()
      except:
          print "Error"

    Anyone any ideas, or done this before? Or an alternative idea for how to back up all my calendars (automatically!)?

  • What cloud backup solution supports a "backup server to cloud" configuration?

    - by Gepeto
    What online backup tool allows you to:

      A) Back up Windows, Linux, and optionally Mac desktops and servers to the cloud
      B) Do so by first backing up to a central server or appliance
      C) Allow restoring from that appliance when possible, and if not, go to the cloud

    For now the best option I have seen is i365 by Seagate, with an appliance between the local computers and the cloud. I know Microsoft also has an i365 plugin for DPM, as well as an Iron Mountain plugin. However, I feel that there must be a simpler way to do this. Can any of the "simpler" solutions (Jungle Disk or anything else backed by S3, Mozy, Carbonite, CrashPlan, etc.) do this? Thank you

  • Swap files in Cloud Infrastructures

    - by ffeldhaus
    At our company we set up an OpenStack cloud and are currently creating internal guidelines for the creation of OS templates/images. One controversial topic was whether we should provide swap inside the VM templates. Therefore I'd like to ask the following questions:

      • From an elastic cloud provider's point of view, does it make sense to offer swap partitions/files in the VM templates, or is swap not needed when a VM can be resized?
      • Which scenarios necessarily demand a swap file to be present?
      • What kind of storage should be used for swap files (e.g. local/central, FC/iSCSI/NFS)?
      • Are there any best practices for offering swap files in a performant way in cloud infrastructures?

  • We want to setup low cost private cloud [closed]

    - by Virtual Jasper
    We are a small company with very limited funds. In order to improve our server reliability, we are studying a migration to the cloud. We have seen some cloud providers that charge by resources such as CPU, RAM, disk space, high availability, and so on. We have a server team, so we are also considering building a private cloud. We have looked at Windows 8 Server, but it requires a license fee. So, looking at the Linux side, we are considering Ubuntu and OpenStack. What is the difference between the Ubuntu and OpenStack solutions? Are both free of software license fees, with payment only for technical support?

  • Lending Club Selects Oracle ERP Cloud Service

    - by Di Seghposs
    Another Oracle ERP Cloud Service customer turning to Oracle to help increase efficiencies and lower costs!! Lending Club, the leading platform for investing in and obtaining personal loans, has selected Oracle Fusion Financials to help improve decision-making and workflow, implement robust reporting, and take advantage of the scalability and cost savings provided by the cloud. After an extensive search, Lending Club selected Oracle due to the breadth and depth of capabilities and ongoing innovation of Oracle ERP Cloud Service. Since their online lending platform is internally developed, they chose Oracle Fusion Financials in the cloud to easily integrate with current systems, keep IT resources focused on the organization's own platform, and reap the benefits of lowered costs in the cloud. The automation, communication, and collaboration features in Oracle ERP Cloud Service will help Lending Club achieve their efficiency goals through better workflow, as well as provide greater control over financial data. Lending Club is also implementing robust analytics and reporting to improve decision-making through embedded business intelligence. "Oracle Fusion Financials is clearly the industry leader, setting an entirely new level of insight and efficiencies for Lending Club," said Carrie Dolan, CFO, Lending Club. "We are not only incredibly impressed with the best-of-breed capabilities and business value from our adoption of Oracle Fusion Financials, but also the commitment from Oracle to its partners, customers, and the ongoing promise of innovation to come."

    Resources:

      • Oracle ERP Cloud Service Video
      • Oracle ERP Cloud Service Executive Strategy Brief
      • Oracle Fusion Financials
      • Quick Tour of Oracle Fusion Financials

    If you haven't heard about Oracle ERP Cloud Service, check it out today!

  • Podcast Show Notes: Conversations in the Cloud

    - by Bob Rhubart
    The centerpiece of every OTN Architect Day event is a panel discussion that gathers all of the session speakers together to respond to questions from the audience. I generally try to record these discussions, usually by sticking my iPad on top of one of the PA speakers, with mixed results. Fortunately, the A/V tech at the venue for the Los Angeles event, held on October 25, 2012, had the necessary gear to get a good-quality recording of the panel discussion. So starting this week the OTN ArchBeat Podcast will feature a short series of highlights from those discussions.

      • Listen to Part 1: Dude, What's My Role? Members of the Architect Day panel respond to an audience question about what happens to traditional IT roles in a cloud environment.
      • Listen to Part 2: Migrating Mission-Critical Applications to the Cloud (Nov 21). The panel offers advice and examples in response to an audience question about dealing with mission-critical applications.
      • Listen to Part 3: All Clouds Are Not Equal (Nov 28). The panel responds to a challenging question about cloud strategy with a discussion of enterprise-grade cloud services.
      • Listen to Part 4: Cloud Security and Auditing (Dec 5). The last segment in the series is a short discussion in response to an audience question about auditing and security in the cloud.

    The Panelists (listed alphabetically):

      • Ashok Aletty, Senior Director of Product Management, Oracle Cloud Application Foundation
      • Dr. James Baty, Vice President, Oracle Global Enterprise Architecture Program
      • Dave Chappelle, Enterprise Architect, Oracle Global Enterprise Architecture Program
      • Jeff Davies, Senior Principal Product Manager, Oracle Corporation
      • Anbu Krishnaswamy, Enterprise Architect, Oracle Global Enterprise Architecture Program
      • Dhanraj Pondicherry, Sales Consulting Manager, Oracle Exadata
      • Perren Walker, Senior Principal Product Manager, Oracle Enterprise Manager

    Coming Soon: Upcoming programs will focus on DevOps and Continuous Integration, and on Oracle's Java Cloud and Developer Cloud services. Stay tuned: RSS

  • IOUG SIG Webcast on October 30th : Performance Tuning your DB Cloud

    - by Anand Akela
    The Oracle Enterprise Manager Special Interest Group (SIG) is a growing body of IOUG members who manage or are interested in all aspects of Oracle Enterprise Manager. This IOUG SIG is managed by volunteers and supported by Oracle Enterprise Manager product managers and developers. The purpose of the SIG is to bring relevant information and education through webcasts, discussions, and networking to users interested in learning more about the product, and to share user experiences.

    On October 30th at 10 AM Pacific time, the Oracle Enterprise Manager SIG is hosting a webcast on "Performance Tuning your DB Cloud in OEM 12c Cloud Control - 360 Degrees". In this webcast, Tariq Farooq, CEO, BrainSurface, and Mike Ault, Oracle, will provide a tutorial on how to monitor and performance-tune the Oracle database cloud environment. You will learn how to leverage Oracle Enterprise Manager for tuning, troubleshooting, and monitoring your Oracle Database Cloud ecosystem. The session covers lessons learned, tips/tricks, recommendations, best practices, gotchas, and a whole lot more on how to effectively use Oracle Enterprise Manager Cloud Control 12c for quick, easy, and intuitive performance tuning of your Oracle Database Cloud.

    Session objectives:

      • Leveraging OEM 12c Cloud Control for Oracle DB tuning/monitoring
      • Limited deep-dive on AWR
      • Oracle DB Cloud performance tuning
      • Best practices for DB Cloud maintenance/monitoring

    Register Now!

  • Securing a Cloud-Based Data Center

    - by Orgad Kimchi
    No doubt, with all the media reports about stolen databases and private information, a major concern when committing to a public or private cloud must be preventing unauthorized access to data and applications. In this article, we discuss the security features of Oracle Solaris 11 that provide a bullet-proof cloud environment. As an example, we show how the Oracle Solaris Remote Lab implementation utilizes these features to provide a high level of security for its users. Note: This is the second article in a series on cloud building with Oracle Solaris 11. See Part 1 here.

    When we build a cloud, the following aspects related to the security of the data and applications in the cloud become a concern:

      • Sensitive data must be protected from unauthorized access while residing on storage devices, during transmission between servers and clients, and when it is used by applications.
      • When a project is completed, all copies of sensitive data must be securely deleted and the original data must be kept permanently secure.
      • Communications between users and the cloud must be protected to prevent exposure of sensitive information from "man in the middle" attacks.
      • Limiting the operating system's exposure protects against malicious attacks and penetration by unauthorized users or automated "bots" and "rootkits" designed to gain privileged access.
      • Strong authentication and authorization procedures further protect the operating system from tampering.
      • Denial-of-service attacks, whether they are started intentionally by hackers or accidentally by other cloud users, must be quickly detected and deflected, and the service must be restored.

    In addition to the security features in the operating system, deep auditing provides a trail of actions that can identify violations, issues, and attempts to penetrate the security of the operating system. Combined, these threats and risks reinforce the need for enterprise-grade security solutions that are specifically designed to protect cloud environments. With Oracle Solaris 11, the security of any cloud is ensured. This article explains how.

  • How could Google Latitude find my exact PC location with no GPS or public wifi?

    - by Mike
    I found a similar question here but I still don't get it. You see, I live in a small town, and every time I check my IP location via online services or speed-test websites, my location appears to be my ISP server's location (which in my case is 250 miles away). But when I tried Google Latitude, it pinpointed my exact location to within less than 100 meters! I use Windows Vista and Google Chrome, and when I got the message that "Google is trying to locate you", I agreed, just to check what the result would be. It was scary, very scary! What I've come up with after reading the above link is that Google has a kind of extensive database of WiFi locations. That would be understandable in the case of public and open WiFi networks that are used by a lot of people. Some of those users might be running applications that gather location data, and somehow this information ends up in giant Google databases. From those, Google could pinpoint a WiFi location based on its MAC address along with the bits of info that have been gathered via various sources. The issue here is that my WiFi is private; I don't even broadcast my WiFi name. So how on earth did Google find my exact PC location? Please break down the answer in layman's terms as much as possible.
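    For what it's worth, the prompt you saw comes from the browser's standard geolocation machinery rather than anything Google-specific on the page. A page simply asks the browser, roughly as sketched below, and it is then Chrome that collects details of nearby WiFi access points (which your wireless card can see even for networks that hide their SSID) and sends them to a location service:

      // Sketch: the W3C Geolocation API call that triggers the "trying to locate you" prompt.
      navigator.geolocation.getCurrentPosition(
        function (position) {
          // Resolved from WiFi access points in range, not from your IP alone.
          console.log(position.coords.latitude, position.coords.longitude,
                      'accuracy (m):', position.coords.accuracy);
        },
        function (error) {
          console.log('Location request failed or was denied:', error.code);
        }
      );

    Note that hiding your SSID doesn't hide your access point's MAC address from nearby receivers, which is how a "private" network can still end up in a WiFi location database.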

  • Syncing Multiple Google Calendars and with Outlook and Android

    - by Fred Thomas
    Perhaps this is a multipart question, but I deal with a lot of calendars in my life and want to know if there is some way to sync them all together while maintaining appropriate privacy. I have a family calendar that my ex and I maintain for kid events, I have a personal calendar for my own life, and I have an Outlook work calendar for work. Ideally I'd look at my calendar on my Android phone. Is it possible to sync them all together? Is it possible for there to be one calendar to rule them all on my phone, while the other calendars only block out the times taken by events from elsewhere, without showing their details? (I don't want my date with Miss Hottie to appear that way on the family calendar, and I probably don't want my visit to the proctologist to appear on the corporate Exchange server.) Are there tools available to do this? Bonus question: can I do the same with my to-do lists? Double bonus question: how can I solve world hunger and help us all live together in peace? :-)

  • Unable to Connect to just Google Servers

    - by Akshat Mittal
    I am in an extremely strange situation: I am unable to connect to just Google's servers. I am not able to access any site related to Google (Google.com, YouTube, Google+, Webmaster Tools, the jQuery CDN); nothing is working. I am able to open any other website (as I am posting this question on Super User), and even the Google DNS servers (8.8.8.8 and 8.8.4.4) are offline. Please help!!

    Update 1: The Google DNS servers are back online, and YouTube is back online. But websites on the google.com domain are still not working (e.g. play.google.com, maps.google.com, google.com/search, etc.).

    Update 2: I am able to access Google.com (only) with (one of) its IP addresses listed below:

      74.125.227.41
      74.125.227.46
      74.125.227.32
      74.125.227.33
      74.125.227.34
      74.125.227.35
      74.125.227.36
      74.125.227.37
      74.125.227.38
      74.125.227.39
      74.125.227.40

    Update 3: I consulted my friends nearby and they said that they are also experiencing the same problem. Seems like this is a major problem in this area (or in India!). The problem is now solved! I am able to open Google.com.

  • Google chrome disable url suggestions from history

    - by Tural Aliyev
    I was searching for a solution that will help me disable URL auto-suggestions (from history) while I type a URL in the address bar, but I haven't found anything. I tried unchecking "Use a prediction service to help complete searches and URLs typed in the address bar" in the privacy settings, but it doesn't help. Is there any way to disable history, or to disable URL suggestions from history?
