Search Results

Search found 25492 results on 1020 pages for 'google cloud endpoints'.


  • Closing InfoWindow with Google Maps API V3

    - by Oscar Godson
    I've seen the other posts, but they don't have the markers being looped through dynamically like mine. How do I create an event that will close the infowindow when another marker is clicked, using the following code? $(function(){ var latlng = new google.maps.LatLng(45.522015,-122.683811); var settings = { zoom: 10, center: latlng, disableDefaultUI:false, mapTypeId: google.maps.MapTypeId.SATELLITE }; var map = new google.maps.Map(document.getElementById("map_canvas"), settings); $.getJSON('api',function(json){ for (var property in json) { if (json.hasOwnProperty(property)) { var json_data = json[property]; var the_marker = new google.maps.Marker({ title:json_data.item.headline, map:map, clickable:true, position:new google.maps.LatLng( parseFloat(json_data.item.geoarray[0].latitude), parseFloat(json_data.item.geoarray[0].longitude) ) }); function buildHandler(map, marker, content) { return function() { var infowindow = new google.maps.InfoWindow({ content: '<div class="marker"><h1>'+content.headline+'</h1><p>'+content.full_content+'</p></div>' }); infowindow.open(map, marker); }; } new google.maps.event.addListener(the_marker, 'click',buildHandler(map, the_marker, {'headline':json_data.item.headline,'full_content':json_data.item.full_content})); } } }); });
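    A minimal sketch of one common approach, assuming a single InfoWindow instance is shared by all markers so that opening it on a newly clicked marker implicitly closes it on the previous one (the shared variable name is illustrative, not part of the original code):

        // create one InfoWindow up front and reuse it for every marker
        var sharedInfoWindow = new google.maps.InfoWindow();

        function buildHandler(map, marker, content) {
          return function () {
            // setContent + open on the shared instance moves the window,
            // closing it wherever it was previously open
            sharedInfoWindow.setContent('<div class="marker"><h1>' + content.headline +
              '</h1><p>' + content.full_content + '</p></div>');
            sharedInfoWindow.open(map, marker);
          };
        }

    The existing google.maps.event.addListener(the_marker, 'click', buildHandler(...)) wiring inside the loop can stay as it is in the question.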

    Read the article

  • PHP tag cloud ( and finding similar ones )

    - by mfolnovich
    Hello! I have articles on my site, and I would like to add tags describing each article, but I'm having trouble designing the MySQL table for tags. I have two ideas: 1) each article would have a field "tags", with tags stored in the format "tag1,tag2,tag3"; 2) create another table called tags with the fields tag_name, article_id, so when I want the tags for the article with ID 1, I would run SELECT ... FROM tags WHERE article_id=1; But I would also like to find the 3 most similar articles by comparing tags, so if I have an article with the tags "php,mysql,erlang", and 5 articles with the tags "php,mysql", "erlang,ruby", "php erlang", "mysql,erlang,javascript", I would choose 1., 3. and 4., since those three share the most tags with the main article. Thanks!

    Read the article

  • Leverage cloud and programming to share GB's of photos

    - by jcmoney
    My friends and I went on a trip and we have over 8 GB of photos we want to share. We live in different geographic locations and all of us (14 people) have a part of the 8 GB. I was wondering if there's a way to leverage my PHP skills to share all these photos. My current plan is to make a simple site where you can upload a bunch of files and also list those files for people to download (probably a compressed folder of selected ones), but I was wondering if there's a better way or if I'm grossly underestimating the scalability issues. All of us have high-speed internet (essentially T1) and I was planning on using Amazon EC2, since this is a heavy task but only for a short time period. That's also the reason I can't use Dropbox or similar services, since they have a 2 GB cap (and I don't want to have everyone sign up and install something). I also don't want to set up anything too tricky, since not all of them are tech savvy.

    Read the article

  • Google local search API - callback never called

    - by Laurent Luce
    Hello, I am trying to retrieve some local restaurants using the LocalSearch Google API. I am initializing the search objects in the OnLoad function, and I am calling the searchControl execute function when the user clicks on a search button. The problem is that my function attached to setSearchCompleteCallback never gets called. Please let me know what I am doing wrong here. <script src="http://www.google.com/jsapi"></script> <script type="text/javascript"> google.load('maps' , '2'); google.load('search' , '1'); var searcher, searchControl; function OnLoad() { searchControl = new google.search.SearchControl(); searcher = new google.search.LocalSearch(); // create the object // Add the searcher to the SearchControl searchControl.addSearcher(searcher); searchControl.setSearchCompleteCallback(searcher , function() { var results = searcher.results; // Grab the results array // We loop through to get the points for (var i = 0; i < results.length; i++) { var result = results[i]; } }); $('#user_search_address').live('click', function() { g_searchControl.execute('pizza san francisco'); }); } google.setOnLoadCallback(OnLoad); </script>
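    For illustration only, a minimal sketch of the same wiring with one consistent control variable (the click handler in the question calls g_searchControl while the control is stored in searchControl, so execute() never runs on the control that owns the callback); all API calls are the ones already used above, and the logging line is a hypothetical placeholder:

        var searchControl = new google.search.SearchControl();
        var searcher = new google.search.LocalSearch();
        searchControl.addSearcher(searcher);

        // fires only after execute() has actually run on this same control
        searchControl.setSearchCompleteCallback(searcher, function () {
          var results = searcher.results || [];
          for (var i = 0; i < results.length; i++) {
            console.log(results[i]); // inspect each local result here
          }
        });

        $('#user_search_address').live('click', function () {
          searchControl.execute('pizza san francisco'); // same variable as above
        });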

    Read the article

  • DotNetOpenId - Using OpenIdButton for Google Apps

    - by JediPotPie
    I am an ASP.NET newbie and I am trying to design an OpenID/SSO system for an internal web application. The web application is pretty simple and the authentication is currently being managed by a database with usernames and passwords. I want to replace the existing accounts stored in the database with Google Apps accounts. I have downloaded the latest DotNetOpenAuth-3.4.3.10103 package and got the OpenIdRelyingPartyWebForms sample up and running on IIS. I have built my own login page using just a OpenIdButton object that points to a development Google domain. The button seems to work fine in FireFox, at least it is forwarding me to the Google Apps login, but nothing happens when I load the same page in IE. When I click the Google button, nothing happens, zip. The same is true for the Yahoo button in the login.aspx page given in the sample. Here is the .aspx code I am using... <rp:OpenIdButton runat="server" ImageUrl="http://www.google.com/accounts/google_transparent.gif" Text="Login with Google!" ID="googleLoginButton" Identifier="https://www.google.com/accounts/o8/site-xrds?hd=dev.connexcloud.com" />

    Read the article

  • Google Analytics version 3 - How to apply it correctly?

    - by ephramd
    I've added Google Analytics to my app with the intention of obtaining information about the screens users view and of sending custom events. I am getting duplicate tracking ... Also, I get different results: "com.package.app.MainScreen" - 300 views and "Main Screen" - 200 views. I am only interested in tracking the custom name of the activity, not the package name. And in any case, why do the two show different results? public class MainScreen extends Activity { private static final String GA_PROPERTY_ID = "UA-12345678-9"; private static final String SCREEN_LABEL = "Main Screen"; Tracker mTracker; EasyTracker easyTracker; @Override protected void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.main_screen); mTracker = GoogleAnalytics.getInstance(this).getTracker(GA_PROPERTY_ID); mTracker.set(Fields.SCREEN_NAME, SCREEN_LABEL); // For Custom Name from activity mTracker.send(MapBuilder.createAppView().build()); easyTracker = EasyTracker.getInstance(this); // Analytics Events ... easyTracker.send(MapBuilder.createEvent("MainScreen", "Play", category.get(1), null).build()); //AnalyticsEvents ... } @Override public void onStart() { super.onStart(); EasyTracker.getInstance(this).activityStart(this); } @Override public void onStop() { super.onStop(); EasyTracker.getInstance(this).activityStop(this); } } And analytics.xml: <?xml version="1.0" encoding="utf-8" ?> <resources xmlns:tools="http://schemas.android.com/tools" tools:ignore="TypographyDashes"> <!--Replace placeholder ID with your tracking ID--> <string name="ga_trackingId">UA-12345678-9</string> <!--Enable automatic activity tracking--> <bool name="ga_autoActivityTracking">true</bool> <!--Enable automatic exception tracking--> <bool name="ga_reportUncaughtExceptions">true</bool> </resources> Google Analytics Dev Guide

    Read the article

  • Turn image in google maps V3?

    - by Ilrodri
    Hi, I need help with JavaScript in Google Maps v3. I have an image and I need to be able to rotate it. That part works, but the real problem is that I can't affect the marker, because I don't know how to reference it and modify it. Here is part of the code: Marker: sURL = 'http://www.sl2o.com/tc/picture/Fleche.PNG'; iWidth = 97; iHeight = 100; mImage = new google.maps.MarkerImage(sURL, new google.maps.Size(iWidth,iHeight), new google.maps.Point(0,0), new google.maps.Point(Math.round(iWidth/2),Math.round(iHeight/2))); var oMarker = new google.maps.Marker({ 'position': new google.maps.LatLng(iStartLat,iStartLon), 'map': map, 'title': 'mon point', 'icon': mImage }); Then I have this: onload=function(){ rotate.call(document.getElementById('im'),50); } </script> <img id="im" src="http://www.sl2o.com/tc/picture/Fleche.PNG" width="97" height="100" /> As you can see, I'm rotating this image, but in fact I need to rotate the marker. How can I do it? I've been working on this for hours. Thank you!
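    A minimal sketch of one way to rotate the marker itself rather than the standalone <img>, by drawing the icon onto a canvas and feeding the rotated result back to the marker with setIcon(); this assumes the icon is served from the same origin as the page (a cross-origin image would taint the canvas and make toDataURL() throw), and the helper name is illustrative:

        function setRotatedIcon(marker, url, angleDeg, w, h) {
          var img = new Image();
          img.onload = function () {
            var canvas = document.createElement('canvas');
            canvas.width = w;
            canvas.height = h;
            var ctx = canvas.getContext('2d');
            // rotate around the centre of the icon
            ctx.translate(w / 2, h / 2);
            ctx.rotate(angleDeg * Math.PI / 180);
            ctx.drawImage(img, -w / 2, -h / 2, w, h);
            marker.setIcon(canvas.toDataURL());
          };
          img.src = url;
        }

        // reuse the oMarker reference created above
        setRotatedIcon(oMarker, sURL, 50, iWidth, iHeight);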

    Read the article

  • How do I return a variable in javascript?

    - by bmckim
    I am working with the google maps API and whenever I return the variable to the initialize function from the codeLatLng function it claims undefined. If I print the variable from the codeLatLng it shows up fine. var geocoder; function initialize() { geocoder = new google.maps.Geocoder(); var latlng = new google.maps.LatLng(40.730885,-73.997383); var addr = codeLatLng(); document.write(addr); } function codeLatLng() { var latlng = new google.maps.LatLng(40.730885,-73.997383); if (geocoder) { geocoder.geocode({'latLng': latlng}, function(results, status) { if (status == google.maps.GeocoderStatus.OK) { if (results[1]) { return results[1].formatted_address; } else { alert("No results found"); } } else { alert("Geocoder failed due to: " + status); } }); } } prints out undefined If I do: var geocoder; function initialize() { geocoder = new google.maps.Geocoder(); var latlng = new google.maps.LatLng(40.730885,-73.997383); codeLatLng(); } function codeLatLng() { var latlng = new google.maps.LatLng(40.730885,-73.997383); if (geocoder) { geocoder.geocode({'latLng': latlng}, function(results, status) { if (status == google.maps.GeocoderStatus.OK) { if (results[1]) { document.write(results[1].formatted_address); } else { alert("No results found"); } } else { alert("Geocoder failed due to: " + status); } }); } } prints out New York, NY 10012, USA
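    The geocoder call is asynchronous, so codeLatLng cannot return the address directly; by the time the result arrives, initialize has already finished. A minimal sketch of the usual callback style, built from the same API calls as the question (the callback parameter is an addition for illustration):

        function initialize() {
          geocoder = new google.maps.Geocoder();
          codeLatLng(function (addr) {
            // runs later, once the geocoder has answered
            document.write(addr);
          });
        }

        function codeLatLng(callback) {
          var latlng = new google.maps.LatLng(40.730885, -73.997383);
          geocoder.geocode({'latLng': latlng}, function (results, status) {
            if (status == google.maps.GeocoderStatus.OK && results[1]) {
              callback(results[1].formatted_address);
            } else {
              alert("Geocoder failed due to: " + status);
            }
          });
        }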

    Read the article

  • Google Maps Controls panel size is displayed wrong

    - by Andrea Giachetto
    I have a weird problem with the Google Maps controls. I've tested my custom map with custom markers in a blank page and everything seems to be OK, including the control panel. When I import all my code into the page I'm working on (I use a full-screen fluid grid system), the control panel is displayed at a strange size. I tried everything to disable/enable the Google Maps UI, but the problem remains. The map code is exactly the same in my blank page and in my site, but on the site the UI control panel is displayed very strangely. Here's the code: <div id="map_canvas2" style="height: 580px; width: 100%;"></div> <script> var image = 'path/to/your/image.png'; var mapOptions = { zoom: 17, center: new google.maps.LatLng(45.499290, 12.621510), mapTypeId: google.maps.MapTypeId.ROADMAP, scrollwheel: false } var map = new google.maps.Map(document.getElementById('map_canvas2'), mapOptions); var myPos = new google.maps.LatLng(45.499290,12.621510); var myMarker = new google.maps.Marker({position: myPos, map: map, icon: 'http://www.factory42.it/jtaca/wordpress/wp-content/uploads/2014/06/pin-map.png' }); </script> </div> Here's an img: http://www.factory42.it/jtaca/wordpress/wp-content/uploads/2014/06/img-maps.png

    Read the article

  • Getting Google repositories to work with apt-get on Ubuntu Hardy

    - by Justin
    I've installed Google Chrome on Hardy via the .deb file and would like to configure apt-get for automatic updates. [I have another machine running Ubuntu Karmic where this works fine; apt-get knows the package as 'google-chrome'; I'm now using a Dell Mini 10 with Ubuntu 8.04 LTS installed] As part of the .deb install, two entries have been added to the third- party software sources tab: http://dl.google.com/linux/deb stable main http://dl.google.com/linux/deb stable non-free main However if I check for updates with either of these clicked, I get the following error: Failed to fetch http://dl.google.com/linux/deb/dists/stable/Release Unable to find expected entry main/binary-lpia/Packages in Meta-index file (malformed Release file?) There is a thread here which indicates others have had the same problem: http://www.google.co.uk/support/forum/p/Chrome/thread?tid=097d103f87b49abe&hl=en This references a further thread: http://code.google.com/p/chromium/issues/detail?id=38608 which suggests the problem has been fixed. Despite this I remain unable to get it to work, and none of the suggested workarounds seem to work either. Ideas ? Thanks.

    Read the article

  • Android Google cloud messaging - not certain what parameters I should put when creating the push notification

    - by Genadinik
    I am working on a PHP script to send the notification to the GCM server, and I am working from this example: public function send_notification($registatoin_ids, $message) { // include config include_once './config.php'; // Set POST variables $url = 'https://android.googleapis.com/gcm/send'; $fields = array( 'registration_ids' => $registatoin_ids, 'data' => $message, ); $headers = array( 'Authorization: key=' . GOOGLE_API_KEY, 'Content-Type: application/json' ); // Open connection $ch = curl_init(); // Set the url, number of POST vars, POST data curl_setopt($ch, CURLOPT_URL, $url); curl_setopt($ch, CURLOPT_POST, true); curl_setopt($ch, CURLOPT_HTTPHEADER, $headers); curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // Disabling SSL Certificate support temporarly curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false); curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($fields)); // Execute post $result = curl_exec($ch); if ($result === FALSE) { die('Curl failed: ' . curl_error($ch)); } // Close connection curl_close($ch); echo $result; } But I am not certain what the values should be for the options CURLOPT_POSTFIELDS, CURLOPT_SSL_VERIFYPEER, CURLOPT_RETURNTRANSFER, CURLOPT_HOST and CURLOPT_URL. Would anyone happen to know what the values for these should be? Thank you!

    Read the article

  • Google Apps Script: how to make suggest box library to work?

    - by Pythonista's Apprentice
    I'm trying to add an autocomplete feature in my Google Spreadsheet using this Google Apps Script suggest box library from Romain Vialard and James Ferreira's book: function doGet() { // Get a list of all my Contacts var contacts = ContactsApp.getContacts(); var list = []; for(var i = 0; i < contacts.length; i++){ var emails = contacts[i].getEmails(); if(emails[0] != undefined){ list.push(emails[0].getAddress()); } } var app = UiApp.createApplication(); var suggestBox = SuggestBoxCreator.createSuggestBox(app, 'contactPicker', 200, list); app.add(suggestBox); return app; } function onEdit() { var s = SpreadsheetApp.getActiveSheet(); if( s.getName() == "my_sheet_name" ) { //checks that we're on the correct sheet var r = s.getActiveCell(); if( r.getColumn() == 1) { doGet(); } } } But when I start editing column 1 of "my_sheet_name" nothing happens (if I replace doGet() with another function, that function runs OK). I've already installed the Suggest Box library. So why doesn't the doGet() function work?
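    One thing worth noting: returning a UiApp from doGet() only displays the UI when the script is deployed as a web app; when the function is called from a spreadsheet trigger, the app has to be shown against the spreadsheet instead. A minimal sketch of that variant, assuming the SuggestBoxCreator library is already referenced as in the question:

        function showContactPicker() {
          var contacts = ContactsApp.getContacts();
          var list = [];
          for (var i = 0; i < contacts.length; i++) {
            var emails = contacts[i].getEmails();
            if (emails[0] != undefined) {
              list.push(emails[0].getAddress());
            }
          }
          var app = UiApp.createApplication().setTitle('Contact picker');
          app.add(SuggestBoxCreator.createSuggestBox(app, 'contactPicker', 200, list));
          // show the UI against the active spreadsheet instead of returning it
          SpreadsheetApp.getActiveSpreadsheet().show(app);
        }

    Note also that a simple onEdit trigger runs with limited authorization, so the ContactsApp call may need an installable trigger to work at all.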

    Read the article

  • Google App Engine/GWT/Eclipse Plugin Newbie Question- how to autobuild client side resources?

    - by Dieter Hanover
    Hi there, I'm tinkering with the default GWT application generated by the Google Eclipse plugin when I click the Google "New Web Application Project" button in Eclipse 3.5. This will no doubt be familiar to many of you.. basically there is an h1 title stating "Web Application Starter Project," a text field, and a Send button. What I've found is that whenever I make changes to the client side resources, e.g. change the text on the Send button to "Submit" in the .java file, Eclipse does not appear to autobuild these resources. In fact I have to rebuild the entire project in order for these changes to be reflected in my browser. I do have "build automatically" selected in eclipse. I should state that this is my second GWT project, the first was almost entirely server side (restlet on GAE) and everything built automatically nicely. When I first tried this new project with updated client resources, on refreshing my browser, the browser stated "you may need to (re)compile your project." I'm not sure if this is relevant but I thought I'd mention it all the same. So what's going on? How do I get Eclipse/GWT to autobuild these client side resources? Cheers for any help you can offer! :-)

    Read the article

  • Oracle Customer Reference Forum – Apex IT – Oracle Sales Cloud

    - by Richard Lefebvre
    Apex IT, an Oracle Platinum Partner, wins Nucleus Research's ROI Award with a 724% return. Learn how you can improve your ROI with Oracle Sales and Marketing Cloud. We are pleased to invite you to a discussion with Apex IT on industry trends, why sales automation is important, the decision making process for choosing Oracle Sales Cloud, and benefits achieved since going live. Apex IT works with clients large and small, assisting them at all stages in the process: organizing ideas and developing strategies, selecting the most appropriate package, implementing it for best results, and keeping systems optimized with long-term support. Please plan to register at least three hours prior to the event taking place in order to participate and get the dial-in information associated in due time. Speakers: Bryan Hinz, Vice President of Business Development, Apex IT (Speaker) Chris Haven, Senior Director Product Management, Oracle (Moderator) Organization Profile: Since 1997, Apex IT has helped public sector, corporate and higher education clients use technology to streamline their processes and increase productivity and profitability. Based on products and best practices from Oracle our experts provide a full range of enterprise solutions including CX/CRM and related applications that support marketing, sales, and service; HR and HR Helpdesk; and Business Intelligence. Our project approach is results-driven and our attitude is people-focused. Industry: Professional Services Products/Services: Oracle Sales Cloud Organization Website: http://apexit.com/ Event Description: In this informal reference call, you will have the opportunity to hear Apex IT discuss industry trends, why sales automation is important, the decision making process for choosing Oracle Sales Cloud, and benefits achieved since going live. The call will open with a brief overview, followed by discussion, and an open question and answer session. Please allow one hour for the call. Why Oracle: Apex IT needed a mobile-enabled sales force automation tool that could promote account collaboration and integrate with Microsoft Outlook. Oracle Sales Cloud met these needs and Apex IT's requirements for: Improved collaborative selling Improved quality of customer engagement and information Improved business development Improved pipeline management Please plan to register at least three hours prior to the event taking place in order to participate and get the dial-in information associated in due time. After you register your information will be forwarded through an Approval Process. Once your registration request has been validated against the invitation database, you will receive an email confirmation with your registration details as long as there is availability. Please be advised that Apex IT will revise the registrants list and may dismiss registrations as they see fit. Note: To access more information at the corporate site you would need an Oracle.com account. If you do not already have an account, getting one is easy and free. Click on the link and you will be prompted to create an account. After you have created your account, you will be automatically returned to the full page description of this event. Register Now!

    Read the article

  • Looking for HTML 5 Presentations? Download Google’s HTML 5 presentation with embedded demos

    - by Gopinath
    Are you interested in learning HTML 5 and looking for a good presentation? Do you want to give a session on HTML 5 to your colleagues or students and need a presentation? If so, your search is about to end. The Google Chrome team has created an online HTML 5 presentation to showcase the bleeding-edge features of modern desktop and mobile browsers. You can access the presentation at http://slides.html5rocks.com and present it to an audience with working demos of various HTML 5 features. If you want offline access to the presentation, you can download the entire source code from http://code.google.com/p/html5rocks and play it offline on your computer. The presentation is regularly updated by the Google Chrome team, and as I write this post it showcases the following features: Offline / Storage, Real-time / Communication, File / Hardware Access, Semantics & Markup, Graphics / Multimedia, CSS3, and Nuts & Bolts. The best part of this presentation is the embedded demos that let you showcase the features as you present them, with live hands-on experience. For example, in the Offline / Storage slide you can create a Web SQL database, create tables, add new rows, retrieve data and drop the tables. The interface of the demos is very simple and easy to present. As they are built by the Google Chrome team to showcase features built into Chrome, it's recommended to use the Chrome browser for the presentation walkthrough. Link to HTML 5 Presentation: http://slides.html5rocks.com

    Read the article

  • Can't verify my site on Google (error 403 Forbidden). I have other sites on the same host with no problems whatsoever

    - by Rosamunda Rosamunda
    I can't verify my site on Google. I've done this several times for several sites, all on the same host. I've tried the HTML tag method, HTML upload, Domain Name provider (I can't find the options that Google tells me I should activate...), and Google Analytics. I always get this response: Verification failed for http://www.mysite.com/ using the Google Analytics method (1 minute ago). Your verification file returns a status of 403 (Forbidden) instead of 200 (OK). I've checked the server headers, and I get this result: REQUESTING: http://www.mysite.com GET / HTTP/1.1 Connection: Keep-Alive Keep-Alive: 300 Accept:/ Host: www.mysite.com Accept-Language: en-us Accept-Encoding: gzip, deflate User-Agent: Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 6.0) SERVER RESPONSE: HTTP/1.1 403 Forbidden Date: Wed, 19 Sep 2012 03:25:22 GMT Server: Apache/2.2.19 (Unix) mod_ssl/2.2.19 OpenSSL/0.9.8e-fips-rhel5 mod_bwlimited/1.4 PHP/5.2.17 Connection: close Content-Type: text/html; charset=iso-8859-1 Final Destination Page (It shows my actual homepage). What can I do? The hosting is the very same as for my other sites, where I didn't have any issue at all! Thanks for your help! Note: As I have a Drupal 7 site, I've tried a "Drupal solution" first, but haven't found any that solved this issue... How can it be forbidden when I can access the link perfectly OK? Is there any solution to this? Thanks!

    Read the article

  • In Google webmaster tools, can a "soft 404" be triggered by the text on the page?

    - by Stephen Ostermiller
    I just ran across an error in Google Webmaster Tools that I have never seen before. I manage the website for my local community band (I play trombone). One of the pages on the site is a list of our upcoming performances. It is powered by a WordPress events plugin that uses a database of upcoming events that are entered through the administration interface. We just finished up our summer and fall concerts and our next performance will be our Christmas concert. I hadn't gotten around to adding that into the website yet, so there are no upcoming events shown on the page. In fact the text on the page says: No upcoming events listed under Performance. Check out past events for this category or view the full calendar. Then in Google Webmaster Tools, this page is showing up as a "soft 404": The page is returning a 200 status and Google is indicating that the 404 is "soft". I wouldn't have expected Googlebot to be sophisticated enough to parse that particular sentence. Is Googlebot able to detect that the text on the page indicates that there is currently no content and then treat it as a 404 page because of that? If Google is treating this page as a soft 404 because of the text on the page, does that mean that like regular 404 pages, the page won't show up in search results?

    Read the article

  • After force-installing a 32-bit deb failed, how can I install the 64-bit version?

    - by TryTryAgain
    I tried to dpkg -i --force-architecture google-earth-stable_i386.deb and it failed. But now when I try to install the amd64.deb it fails saying dpkg: error processing google-earth-stable_current_amd64.deb (--install): google-earth-stable: 6.2.2.6613-r0 (Multi-Arch: no) is not co-installable with google-earth-stable:i386 6.2.2.6613-r0 (Multi-Arch: no) which is currently installed Errors were encountered while processing: google-earth-stable_current_amd64.deb somehow it thinks the i386 version is installed. No google-earth files or directories even exist. sudo dpkg --configure -a outputs: dpkg: dependency problems prevent configuration of google-earth-stable:i386: google-earth-stable:i386 depends on lsb-core (= 3.2). dpkg: error processing google-earth-stable:i386 (--configure): dependency problems - leaving unconfigured Errors were encountered while processing: google-earth-stable:i386 so it does exist in some capacity. sudo apt-get -f install does nothing out of the ordinary: Reading package lists... Done Building dependency tree Reading state information... Done 0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. The weird thing is that synaptic doesn't show any google earth package available let alone installed, nothing under the broken filter either. I have also tried sudo apt-get autoremove and sudo apt-get autoclean So, my question: How can I get rid of this issue?

    Read the article

  • Google Chrome Updates; Faster, Cleaner Menus, Encrypted Password Syncing, and More

    - by ETC
    Google Chrome has rolled out a new update that includes a host of improvements such as easier-to-navigate menus, encrypted password syncing, overall speed improvements, Flash sandboxing, and more. Google Chrome's new update has a host of subtle but powerful improvements. The browser is faster, security is improved thanks to the addition of encrypted password syncing and sandboxing of the integrated Flash player, and the settings menu has been restructured and cleaned up for easy navigation. Check out the video above to take a peek at some of the changes or hit up the link below to read more. Speedier, Simpler, and Safer: Chrome's Basics Get Even Better [The Official Google Blog]

    Read the article

  • How do I get Google to crawl my content when it's only displayed when you fill in a form?

    - by Sarang Patil
    I have a webpage. It has a form, and the "results" section is blank. When the user searches for items, a list pops up; he/she chooses one option from the list, and then the corresponding results are displayed in the results section. I once decided to log the IP, URL, and time of every visit to my page. One IP was 66.249.73.26, and on doing a Google search I came to know it is the IP of Googlebot. link for whatmyipaddress google bot Now when I looked at the links that this IP visited, it was like this: search?id=100 search?id=110 ... search?id=200 ... and then afterwards it incremented in steps of 1, like 400, 401... But people search for strings, not numbers. And because Googlebot searches for numbers like this, I think the corresponding content is never displayed, and so my page content is never indexed, even though it has rich content. So my question is: in order to show Googlebot all the content that the webpage has, should I list all the results on the index page and ask users to enter a string to filter the results?

    Read the article

  • Cloud Backup: Getting the Users' Backs Up

    - by Tony Davis
    On Wednesday last week, Microsoft announced that as of July 1, all data transfers into its Microsoft Azure cloud will be free (though you have to pay for transferring data out). On Thursday last week, SQL Azure in Western Europe went down. It was a relatively short outage, but since SQL Azure currently provides no easy way to take a standard backup of a database and store it locally, many people had no recourse but to wait patiently for their cloud-based app to resume. It seems that Microsoft are very keen to encourage developers to move their data onto their cloud, but are developers ready to do it, given that such basic backup capabilities are lacking? Recently on Simple-Talk, Mike Mooney described a perfect use case for the Microsoft Cloud. They had a simple web-based application with a SQL Server backend; they could move the application to Windows Azure, and the data into SQL Azure and in the process free themselves from much of the hassle surrounding management and scaling of the hardware, network and so on. It was a great fit and yet it nearly didn't happen; lack of support for the BACKUP command almost proved a show-stopper. Of course, backups of Azure databases are always and have always been taken automatically, for disaster recovery purposes, but these are strictly on-cloud copies and as of now it is not possible to use them to restore a database to a particular point in time. It seems that none of those clever Microsoft people managed to predict the need to perform basic backups of Azure databases so that copies could be stored locally, outside the Azure universe. At the very least, as Mike points out, performing a local backup before a new deployment is more or less mandatory. Microsoft did at least note the sound of gnashing teeth and, as a stop-gap measure, offered SQL Azure Database Copy, which basically allows you to create an online clone of your database, but this doesn't allow for storing local archives of the data. To that end MS has provided SQL Azure Import/Export, to package up and export a database and its data, using BACPACs. These BACPACs do not guarantee transactional consistency; for example, if a child table is modified after the parent is copied, then the copied database will be in an inconsistent state (meaning, to add to the fun, BACPACs need to be created from a database copy). In any event, widespread problems with BACPAC's evil cousin, the DACPAC, have been well-documented, and it seems likely that many will also give BACPAC the bum's rush. Finally, in a TechEd 2011 presentation tagged "SQL Azure Advanced Administration", it was announced that "backup and restore" were coming in the next SQL Azure CTP. And yet this still doesn't mean that we'll get simple backups as DBAs know and love them. What it does mean, at least, is the ability to restore any given database to a point in time within a 2-week window. For the time being, if you want a local copy of your data and don't want to brave the BACPAC, one is left with SSIS or BCP, creative use of schema and data comparison tools, or use of SQL Azure Backup (currently in beta) in order to perform this simple but vital task. Cheers, Tony.

    Read the article

  • Authenticating your windows domain users in the cloud

    - by cibrax
    Moving to the cloud can represent a big challenge for many organizations when it comes to reusing existing infrastructure. For applications that drive existing business processes in the organization, reusing IT assets like Active Directory represents a good part of that challenge. For example, a new web mobile application that sales representatives can use for interacting with an existing CRM system in the organization. In the case of Windows Azure, the Access Control Service (ACS) already provides some integration with ADFS through WS-Federation. That means any organization can create a new trust relationship between the STS running in the ACS and the STS running in ADFS. As the following image illustrates, the ADFS running in the organization should be somehow exposed outside the network boundaries to talk to the ACS. This is usually accomplished through an ADFS proxy running in a DMZ. This is the official story for authenticating existing domain users with the ACS. Getting an ADFS up and running in the organization, which talks to a proxy and also trusts the ACS, could be a painful experience. It basically requires advanced knowledge of ADFS and exhaustive testing to get everything right. However, if you want to get an infrastructure ready for authenticating your domain users in the cloud in a matter of minutes, you will probably want to take a look at the sample I wrote for talking to an existing Active Directory using a regular WCF service through the Service Bus Relay Binding. You can use the WCF ability for self-hosting the authentication service within any program running in the domain (typically a Windows service). The service will not require opening any port, as it opens an outbound connection to the cloud through the Relay Service. In addition, the service will be protected from being invoked by any unauthorized party with the ACS, which will act as a firewall between any client and the service. In that way, we can get a very safe solution up and running almost immediately. To make the solution even more convenient, I implemented an STS in the cloud that internally invokes the service running on premises for authenticating the users. Any existing web application in the cloud can just establish a trust relationship with this STS, and authenticate the users via the WS-Federation passive profile with regular HTTP calls, which makes this very attractive for web mobile, for example. This is what the WCF service running on premises looks like: [ServiceBehavior(Namespace = "http://agilesight.com/active_directory/agent")] public class ProxyService : IAuthenticationService { IUserFinder userFinder; IUserAuthenticator userAuthenticator;   public ProxyService() : this(new UserFinder(), new UserAuthenticator()) { }   public ProxyService(IUserFinder userFinder, IUserAuthenticator userAuthenticator) { this.userFinder = userFinder; this.userAuthenticator = userAuthenticator; }   public AuthenticationResponse Authenticate(AuthenticationRequest request) { if (userAuthenticator.Authenticate(request.Username, request.Password)) { return new AuthenticationResponse { Result = true, Attributes = this.userFinder.GetAttributes(request.Username) }; }   return new AuthenticationResponse { Result = false }; } } Two external dependencies are used by this service for authenticating users (IUserAuthenticator) and for retrieving user attributes from the user's directory (IUserFinder). The UserAuthenticator implementation is just a wrapper around the LogonUser Win API.
The UserFinder implementation relies on Directory Services in .NET for searching the user attributes in an existing directory service like Active Directory or the local user store. public UserAttribute[] GetAttributes(string username) { var attributes = new List<UserAttribute>();   var identity = UserPrincipal.FindByIdentity(new PrincipalContext(this.contextType, this.server, this.container), IdentityType.SamAccountName, username); if (identity != null) { var groups = identity.GetGroups(); foreach(var group in groups) { attributes.Add(new UserAttribute { Name = "Group", Value = group.Name }); } if(!string.IsNullOrEmpty(identity.DisplayName)) attributes.Add(new UserAttribute { Name = "DisplayName", Value = identity.DisplayName }); if(!string.IsNullOrEmpty(identity.EmailAddress)) attributes.Add(new UserAttribute { Name = "EmailAddress", Value = identity.EmailAddress }); }   return attributes.ToArray(); } As you can see, the code is simple and uses all the existing infrastructure in Azure to simplify a problem that looks very complex at first glance with ADFS. All the source code for this sample is available to download (or change) in this GitHub repository, https://github.com/AgileSight/ActiveDirectoryForCloud

    Read the article

  • How long does it take to delete a Google Apps for Domain account and allow recreation?

    - by Wil
    Basically, I set up a Google Apps for domain account for one of my clients, but upon trying to upload a CSV of users, only half were created, and then I kept getting timeouts, 404s and other errors which I never saw when setting up another client. I was not sure how, but I thought I might have caused an error, or that there was an error on Google's side when the account was created, so I thought it best to delete the domain and start from scratch. Little did I know you had to wait to recreate the domain! As far as I can tell from what I have read, there is a five-day wait before similar user names can be recreated; however, I can't see anything that shows how long you have to wait for the domain account. It has now been 5/6 days and I can't recreate it. The cancellation email I got says "...If at any time you would like to sign-up for Google Apps for this or another domain, you can do so by visiting..." but it does not mention how long! I tried emailing Google directly but got no response. Does anyone know?

    Read the article

  • 301 redirect from HTTP to HTTPS - how to be sure Google is fetching the correct information?

    - by user33692
    I'm hoping somebody might be able to provide a bit of advice on an issue I am having. I have one site where we implemented a 301 redirect on the homepage from HTTP to HTTPS. We have links on the homepage to other parts of the site that are not under SSL (in fact there is only one other page under SSL). When I go to our Webmaster Tools account I notice that we are not being provided with any webmaster information (e.g., search queries, backlinks, etc...) related to our homepage under SSL. I performed a Fetch as Google on the homepage and the information it returned is: HTTP/1.1 301 Moved Permanently Date: Fri, 08 Nov 2013 17:26:24 GMT Server: Apache/2.2.16 (Debian) Location: https://mysite.com/ Vary: Accept-Encoding Content-Encoding: gzip Content-Length: 242 Keep-Alive: timeout=15, max=100 Connection: Keep-Alive Content-Type: text/html; charset=iso-8859-1 <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN"> <html><head> <title>301 Moved Permanently</title> </head><body> <h1>Moved Permanently</h1> <p>The document has moved <a href="https://mysite.com/">here</a>.</p> <hr> <address>Apache/2.2.16 (Debian) Server at mysite.com</address> </body></html> I am worried by the fact that Google fetch is not getting the correct Title tags and Meta information from our homepage and that this is hurting our search results. Additionally, I am worried that we need to do something specific with the sitemap to ensure that Google is correctly indexing all our pages and being able to flow from the HTTPS to the HTTP without issues. Does anybody have any advice on how we can correctly set this up or be sure that Google is fetching the correct information?

    Read the article

  • I.T. Chargeback : Core to Cloud Computing

    - by Anand Akela
    Contributed by Mark McGill Consolidation and Virtualization have been widely adopted over the years to help deliver benefits such as increased server utilization, greater agility and lower cost to the I.T. organization. These are key enablers of cloud, but in themselves they do not provide a complete cloud solution. Building a true enterprise private cloud involves moving from an admin driven world, where the I.T. department is ultimately responsible for the provisioning of servers, databases, middleware and applications, to a world where the consumers of I.T. resources can provision their infrastructure, platforms and even complete application stacks on demand. Switching from an admin-driven provisioning model to a user-driven model creates some challenges. How do you ensure that users provisioning resources will not provision more than they need? How do you encourage users to return resources when they have finished with them so that others can use them? While chargeback has existed as a concept for many years (especially in mainframe environments), it is the move to this self-service model that has created a need for a new breed of chargeback applications for cloud. Enabling self-service without some form of chargeback is like opening a shop where all of the goods are free. A successful chargeback solution will be able to allocate the costs of shared I.T. infrastructure based on the relative consumption by the users. Doing this creates transparency between the I.T. department and the consumers of I.T. When users are able to understand how their consumption translates to cost they are much more likely to be prudent when it comes to their use of I.T. resources. This also gives them control of their I.T. costs, as moderate usage will translate to a lower charge at the end of the month. Implementing Chargeback successfully create a win-win situation for I.T. and the consumers. Chargeback can help to ensure that I.T. resources are used for activities that deliver business value. It also improves the overall utilization of I.T. infrastructure as I.T. resources that are not needed are not left running idle. Enterprise Manager 12c provides an integrated metering and chargeback solution for Enterprise Manager Targets. This solution is built on top of the rich configuration and utilization information already available in Enterprise Manager. It provides metering not just for virtual machines, but also for physical hosts, databases and middleware. Enterprise Manager 12c provides metering based on the utilization and configuration of the following types of Enterprise Manager Target: Oracle VM Host Oracle Database Oracle WebLogic Server Using Enterprise Manager Chargeback, administrators are able to create a set of Charge Plans that are used to attach prices to the various metered resources. These plans can contain fixed costs (eg. $10/month/database), configuration based costs (eg. $10/month if OS is Windows) and utilization based costs (eg. $0.05/GB of Memory/hour) The self-service user provisioning these resources is then able to view a report that details their usage and helps them understand how this usage translates into cost. Armed with this information, the user is able to determine if the resources are delivering adequate business value based on what is being charged. Figure 1: Chargeback in Self-Service Portal Enterprise Manager 12c provides a variety of additional interfaces into this data. The administrator can access summary and trending reports. 
Summary reports allow the administrator to drill-down through the cost center hierarchy to identify, for example, the top resource consumers across the organization. Figure 2: Charge Summary Report Trending reports can be used for I.T. planning and budgeting as they show utilization and charge trends over a period of time. Figure 3: CPU Trend Report We also provide chargeback reports through BI Publisher. This provides a way for users who do not have an Enterprise Manager login (such as Line of Business managers) to view charge and usage information. For situations where a bill needs to be produced, chargeback can be integrated with billing applications such as Oracle Billing and Revenue Management (BRM). Further information on Enterprise Manager 12c’s integrated metering and chargeback: White Paper Screenwatch Cloud Management on OTN

    Read the article
