Search Results

Search found 41930 results on 1678 pages for 'google product search'.

Page 17/1678 | < Previous Page | 13 14 15 16 17 18 19 20 21 22 23 24  | Next Page >

  • Configuring Expert Search in Communicator 14 and SharePoint 2010

    Communicator 14 provides functionality to search for contacts not just by name, but by skill. For example, a customer service agent at an airline can search for other agents with Travel Advisory experience by typing the search criteria into the Communicator search box and performing a search by keyword. The search results will return users who have specified that skill in their profile on their SharePoint My Site. This is actually pretty easy to configure; I'll show you how. Create Search and People Search Results Pages in SharePoint: Communicator 14 Expert Search works by using the SharePoint 2010 Search Service to search SharePoint for user profiles with matching keywords. This requires that you have an Enterprise Search site in your site collection, which includes the search service and the People Results pages. The easiest way to do this is to create a Search Center site in your site collection. Note: I get an error when trying to create an Enterprise Search site in a Team Site in the SharePoint 2010 RTM bits, so I created it as a site collection, as is evident in the URLs you see below. In the screenshots below, you can see that the URL of the SharePoint search service in the Search site collection is http://sps2010/sites/search/_vti_bin/search.asmx, and the URL of the People Search Results page is http://sps2010/sites/Search/Pages/peopleresults.aspx. Point Communications Server 14 to the Search and People Search Results Pages: For Communicator 14 to be able to perform an Expert Search, you need to configure Communications Server 14 to point to the Search Service and People Search Results page URLs. From a server with the OCS Core bits installed, fire up the Communications Server Management Shell and type Get-CsClientPolicy. Scroll down to the bottom of the output; we're interested in setting the values of SPSearchInternalURL, SPSearchExternalURL, SPSearchCenterInternalURL, and SPSearchCenterExternalURL. SPSearchInternalURL and SPSearchExternalURL correspond to the internal and external URLs of the SharePoint search service in the Search site collection, while SPSearchCenterInternalURL and SPSearchCenterExternalURL correspond to the internal and external URLs of the people search results pages. We'll use the Communications Server Management Shell to set the values of these CS policy properties. For simplicity, I'm only going to set the internal URLs here: Set-CsClientPolicy -SPSearchInternalURL http://sps2010/sites/search/_vti_bin/search.asmx -SPSearchCenterInternalURL http://sps2010/sites/Search/Pages/peopleresults.aspx. Log out and back into Communicator. You can verify that these settings were applied by running the Get-CsClientPolicy cmdlet again from the Communications Server Management Shell. However, there's another super-secret ninja trick to verify that the settings were applied: find the Communicator icon in the Windows System Tray, hold down the Ctrl key, and left-click the icon (keeping Ctrl held down). You should now see an extra menu item called Configuration Information; click it. Scroll down and locate the Expert Search URL and SharePoint Search Center URL keys and verify that their values correspond to those you set using the Set-CsClientPolicy PowerShell cmdlet. Configure a SharePoint User Profile Import: I'm not going to provide detailed steps here except to say that you need to configure the SharePoint 2010 User Profile Service Application to import user account details from Active Directory on a scheduled basis.
    This is a critical step because there are several user profile properties, e.g. SipAddress, that are only populated by a user profile import. When performing an Expert Search, Communicator can only render results for users who have a SipAddress specified. Add Skills to User Profiles: Navigate to your My Site and click on My Profile. This page allows you to set many contact details that are searchable in SharePoint. We're particularly interested in the Ask Me About property of a user's profile; Expert Search searches against this property to find users with matching skills. Configure a SharePoint Search Crawl: Ensure that you have a scheduled job to crawl your Local SharePoint Sites content source. Depending on how you have this configured, it will also crawl the My Site site collection and add user properties such as Ask Me About to the search index. That's it! SharePoint 2010 provides new social and collaboration features to help users find other users with similar skills or interests. Expert Search extends this functionality directly into Microsoft Communicator 14, allowing you to interact with the users directly from the search results.
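    Before setting the client policy, it can save a round of troubleshooting to confirm that the two URLs you are about to configure actually respond. Below is a minimal sanity-check sketch in Python; the URLs are the example ones from this walkthrough, so adjust them for your farm (behind Windows authentication a 401 still proves the path exists, while a 404 means it does not).

        # Sanity-check the SharePoint search service and People Search Results
        # page URLs before pointing Set-CsClientPolicy at them.
        # Assumption: the example URLs from this walkthrough; adjust for your farm.
        import urllib.request
        import urllib.error

        urls = [
            "http://sps2010/sites/search/_vti_bin/search.asmx",
            "http://sps2010/sites/Search/Pages/peopleresults.aspx",
        ]

        for url in urls:
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    print(f"{url} -> HTTP {resp.status}")
            except urllib.error.HTTPError as err:
                # 401/403 still proves the path exists; 404 means it does not.
                print(f"{url} -> HTTP {err.code}")
            except urllib.error.URLError as err:
                print(f"{url} -> unreachable: {err.reason}")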

    Read the article

  • Google I/O 2010 - Fireside chat with the Google Wave team

    Google I/O 2010 - Fireside chat with the Google Wave team (Fireside Chats, Wave). Speakers: Lars Rasmussen, Douwe Osinga, Jochen Bekmann, Alan Green, Pamela Fox, Dan Peterson, Stephanie Hannon. Join the Google Wave team around the campfire to chat about all things Wave: the product, the API platform, and the wave federation protocol. Come to learn about the new Wave API features, get tips on how to build the best extensions, discuss how to take advantage of the open source code available, and hear more about what users are doing with the product. This is an excellent opportunity to ask the engineering team questions directly and learn more about where Wave is heading. For all I/O 2010 sessions, please go to code.google.com. From: GoogleDevelopers. Time: 56:17.

    Read the article

  • Google I/O 2010 - Google Wave API design principles

    Google I/O 2010 - Google Wave API design principles + anatomy of a great extension (Wave 201). Speakers: Pamela Fox, Michael Goderbauer (Hasso Plattner Institute). Google Wave is all about collaboration. The most successful extensions are user-friendly and collaborative. Wave robots should be as intuitive to communicate with as a human, and play well with other robots; Wave gadgets should extend the metaphors of textual collaboration into the visual. In this talk, we'll discuss the design and privacy principles you should consider while building extensions, and show examples of extensions that demonstrate these principles. For all I/O 2010 sessions, please go to code.google.com/events/io/2010/sessions.html. From: GoogleDevelopers. Time: 01:01:54.

    Read the article

  • Google I/O 2010 - Google Chrome's developer tools

    Google I/O 2010 - Google Chrome's developer tools (Chrome 101). Speakers: Pavel Feldman, Anders Sandholm. In this session we'll give an overview of the Developer Tools for Google Chrome, which are part of the standard Chrome distribution. Chrome Developer Tools allow inspecting, debugging, and tuning web applications, and much more. In addition to this overview, we would like to share some implementation details of the Developer Tools features and call for your contribution. For all I/O 2010 sessions, please go to code.google.com. From: GoogleDevelopers. Time: 43:30.

    Read the article

  • How to return the relevant country domain in rich snippets pulled in from Google Places?

    - by Baumr
    Background: A site has multiple ccTLDs: example.com for people in the US, example.co.uk for UK users, example.de for Germans, etc. Googling for certain city keywords will return rich snippets with a list of Google Places. Problem: When searching on Google Germany, the domain for US users (example.com) appears instead of the corresponding ccTLD (example.de) aimed at German users. This is not good user experience, as users would most likely prefer to book on a site localized for them (e.g. language and currency). Question: What solutions are there? Is it possible to return different ccTLDs in rich snippets for Google searches in Germany/UK? If so, how? Ideas (stabs in the dark): Would implementing the hreflang annotation resolve this? (GWMT geotargeting is already set.) What about entering multiple corresponding URLs in the structured data markup? (As far as I know, Google Places accepts only a single website URL.)
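    If the hreflang route is taken, every country page needs to list the full set of alternates (including itself), and every alternate must link back. Here is a hedged sketch of generating that tag set once so the same block can be emitted on each ccTLD version of a page; the domains and path are illustrative only, mirroring the example.com / example.co.uk / example.de setup described above.

        # Generate the rel="alternate" hreflang tag set for a page that exists
        # on several ccTLDs. Each country version should carry the same full
        # set, so every page references all the others (and itself).
        # Assumption: illustrative domains and path only.
        ALTERNATES = {
            "en-us": "https://example.com",
            "en-gb": "https://example.co.uk",
            "de-de": "https://example.de",
        }

        def hreflang_tags(path: str, default_lang: str = "en-us") -> str:
            tags = [
                f'<link rel="alternate" hreflang="{lang}" href="{root}{path}" />'
                for lang, root in ALTERNATES.items()
            ]
            # x-default tells Google which version to show for unmatched locales.
            tags.append(
                f'<link rel="alternate" hreflang="x-default" '
                f'href="{ALTERNATES[default_lang]}{path}" />'
            )
            return "\n".join(tags)

        print(hreflang_tags("/hotels/berlin/"))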

    Read the article

  • How to Use and Customize Google Chrome Web Apps

    - by The Geek
    Google announced their new Chrome Web Store today, with loads of web sites and games that can be installed as applications in your browser, synced across every PC, and customized to launch the way you want them to. Here's how it all works. Note: this guide really isn't aimed at expert geeks, though you're more than welcome to leave your thoughts in the comments. What Are Chrome Web Apps Again? The new Chrome Web Apps are really nothing more than regular web sites, optimized for Chrome and then wrapped up with a pretty icon and installed in your browser. Some of these sites, especially web-based games, can also be purchased through the Chrome Web Store for a small fee, though the majority of services are free.

    Read the article

  • Google I/O 2010 - Tips and tricks for Google Earth API and KML

    Google I/O 2010 - Mapping in 3D: Tips and tricks for Google Earth API and KML (Geo 201). Speakers: Josh Livni, Mano Marks. Google Earth and the Earth API can handle a tremendous amount of data. But you always have more. We will talk about integrating large datasets efficiently, coding for optimal performance, and taking advantage of advanced features in KML and the Earth API. For all I/O 2010 sessions, please go to code.google.com. From: GoogleDevelopers. Time: 01:01:18.

    Read the article

  • Google I/O 2010 - Google Wave Media APIs

    Google I/O 2010 - Google Wave Media APIs: Attachments can surf too! (Wave 201). Speakers: Seth Covitz, Jimin Li, Phil Liao. Google Wave is used by diverse groups to communicate and collaborate on projects from work to school to plain old having fun. To make users even more productive, we are providing capabilities that enable them to collaborate on and around any piece of third-party content (e.g. attachments). In this session, we will introduce the Wave Media APIs, which enable robots and gadgets to create, access, and modify third-party content in Wave. For all I/O 2010 sessions, please go to code.google.com. From: GoogleDevelopers. Time: 41:04.

    Read the article

  • Google Webmaster Tools incorrect rel-alternate-hreflang implementation warning message

    - by Noam
    I'm getting this warning message in Google Webmaster Tools: Incorrect rel-alternate-hreflang implementation. In particular, there seems to be a problem with missing or incorrect bi-directional linking (when page A links with hreflang to page B, there must be a link back from B to A as well). This message seems pretty straightforward, but when checking their example pages, I'm not finding anything wrong. I'm using alternate for translation of the main site menu, titles, etc. In each page I have this:
      <link rel="alternate" hreflang="en" href="http://mydomain.com/page" />
      <link rel="alternate" hreflang="jp" href="http://ja.mydomain.com/page" />
      <link rel="alternate" hreflang="ko" href="http://ko.mydomain.com/page" />
      <link rel="alternate" hreflang="th" href="http://th.mydomain.com/page" />
      <link rel="alternate" hreflang="es" href="http://es.mydomain.com/page" />
      <link rel="alternate" hreflang="pt" href="http://pt.mydomain.com/page" />
    I've double-checked that this exists in all 6 pages. This is the first time I've seen this message, although I implemented this at least 6 months ago and the implementation hasn't changed. Is there any way to check a specific set of pages for these things? Am I missing something in my implementation? We're auto-redirecting people from a location to their specific language, and giving them an option to change this manually. I've also just found out about the suggestion to use the Vary HTTP header - is that relevant and important here?
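    One way to hunt down the missing return link the warning describes is to fetch every alternate URL and check that it references the original page in its own markup. A rough sketch, assuming the subdomain scheme shown above; a real check should parse the HTML for the actual link elements rather than substring-match.

        # Crude reciprocity check for rel="alternate" hreflang links: fetch each
        # alternate URL and verify it mentions the original page, since Google
        # requires the return link from B back to A.
        import urllib.request

        PAGE = "http://mydomain.com/page"
        ALTERNATES = [
            "http://ja.mydomain.com/page",
            "http://ko.mydomain.com/page",
            "http://th.mydomain.com/page",
            "http://es.mydomain.com/page",
            "http://pt.mydomain.com/page",
        ]

        for alt in ALTERNATES:
            try:
                html = urllib.request.urlopen(alt, timeout=10).read().decode("utf-8", "replace")
            except OSError as err:
                print(f"{alt}: fetch failed ({err})")
                continue
            status = "has" if PAGE in html else "MISSING"
            print(f"{alt}: {status} return link to {PAGE}")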

    Read the article

  • Google I/O 2010 - Testing techniques for Google App Engine

    Google I/O 2010 - Testing techniques for Google App Engine (App Engine 201). Speaker: Max Ross. We typically write tests assuming that our development stack closely resembles our production stack. What if our target environment only lives in the cloud? We will highlight the key differences between typical testing techniques and Google App Engine testing techniques. We will also present concrete strategies for testing against local and cloud-based implementations of App Engine services. Finally, we will explain how to use App Engine as a highly parallel test harness that runs existing test suites without modification. For all I/O 2010 sessions, please go to code.google.com/events/io/2010/sessions.html. From: GoogleDevelopers. Time: 54:29.

    Read the article

  • Google I/O 2010 - Customizing Google Apps

    Google I/O 2010 - Customizing Google Apps & integrating with customer environments (Enterprise 201). Speakers: Mike O'Brien, Matt Pruden (Appirio), Adam Graff (Genentech), Don Dodge (moderator). Learn from real-life examples of customizing Google Apps to meet customer requirements. Hear from the customer (Genentech) and the system integrator (Appirio). Explore integration issues and deployment best practices with people who have done it. Get your questions answered in this session. For all I/O 2010 sessions, please go to code.google.com. From: GoogleDevelopers. Time: 52:00.

    Read the article

  • Google analytics - drop in traffic

    - by Andy
    Bit of a general question here. We are in the process of converting a number of our clients from older web sites to new ones. The problem we are getting, and sorry for being so general here, is a sharp decline in traffic as reported by Google Analytics. It's not a gradual decline; it seems to hit almost as soon as the new site goes live. I've just got a few questions to see if there is something we are doing wrong: a) We are using the same analytics accounts going from the old to the new site. Is this a bad idea? b) The actual analytics code is integrated into the pages using a server-side include. Is this a bad idea? c) We structure our sites differently from the old sites. The old sites would pretty much have all the web pages in the root directory, and hyperlinks would link to the page files, e.g. <a href="somepage.aspx">Link</a>. Our new sites have a directory structure that pretty much reflects the navigation structure, and hyperlinks link to the page's directory instead of the actual page, e.g. <a href="/new-items/shoes/">New shoes</a>. Is this a bad idea? I'm really searching for a needle in a haystack here. Would appreciate any help or advice as to why we are getting such a sharp and sudden drop in traffic. Again, sorry this is such a general question. Thanks in advance.

    Read the article

  • Google I/O 2010 - Connect enterprise apps w/ Google Docs

    Google I/O 2010 - Connecting your enterprise applications with Google Docs and Sites (Enterprise 201). Speakers: Eric Bidelman, Vijay Bangaru, Matthew Tonkin (Memeo Inc). Learn how your organization can harness the power of Google Docs and Sites directly from within your existing enterprise systems using our extensive APIs. Integrate with data from behind the firewall using Secure Data Connector. Upload, share, collaborate, and sync any file to Docs. Even automate the creation of project and team workspaces in a single click in Sites from within your CRM. For all I/O 2010 sessions, please go to code.google.com. From: GoogleDevelopers. Time: 53:19.

    Read the article

  • Sharding / indexing strategy for multi-faceted search

    - by Graham
    I'm currently thinking about our database structure and how we modify it for scale. Specifically, we're thinking about using ElasticSearch to provide our search functionality. One common pattern with ElasticSearch seems to be the 'user-routing' pattern; that is, using routing to ensure that any one user's data resides on the same shard. This is great for client-specific search, e.g. Gmail. Our application has a constraint such that any user will have a maximum of a few thousand documents, so this pattern seems like a good candidate. However, our search needs to work across all users, as well as targeting a specific user (so I might search my content, Alice's content, or all content). Similarly, we need to provide full-text search across any timeframe: recent months to several years ago. I'm thinking of combining the 'user-routing' and 'index-per-time-interval' patterns: I create an index for each month; by default, searches are aliased against the most recent X months; if no results are found, we can search against the previous X months; as we grow, we can reduce the interval X; and each document is routed by the user ID. So, this should let us do the following: search by user (this will search all indices, hitting 1 shard in each) and search by time (this will search ~2 indices, by default, across all shards). Is this a reasonable approach, considering we may scale to multi-million+ documents? Or should I be denormalizing the data somehow, so that user searches are performed on a totally separate index from date searches? Thanks for any pros/cons of the above scenario.
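    A rough sketch of that combined layout, assuming the elasticsearch-py 8.x client and made-up index and field names, just to show where the routing and the monthly indices fit.

        # Sketch: index-per-month plus user routing, with an alias for the
        # "recent" window. Index and field names are illustrative only.
        from elasticsearch import Elasticsearch

        es = Elasticsearch("http://localhost:9200")

        # One index per month; route each document by its owner's user ID so a
        # single user's documents land on one shard of each monthly index.
        es.index(
            index="docs-2012-06",
            id="doc-42",
            routing="user-alice",
            document={"owner": "user-alice", "body": "quarterly report", "ts": "2012-06-14"},
            refresh=True,  # only so the demo search below can see the document
        )

        # Alias the most recent X months so default searches hit only those.
        # (Resolved at call time; re-point it monthly or manage via templates.)
        es.indices.put_alias(index="docs-2012-0*", name="docs-recent")

        # Search one user's content: routed, so only one shard per index is hit.
        mine = es.search(index="docs-recent", routing="user-alice",
                         query={"match": {"body": "report"}})

        # Search all users over a time window: no routing (all shards), but only
        # the monthly indices covering that window.
        everyone = es.search(index=["docs-2012-05", "docs-2012-06"],
                             ignore_unavailable=True,
                             query={"match": {"body": "report"}})

        print(mine["hits"]["total"]["value"], everyone["hits"]["total"]["value"])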

    Read the article

  • No description for any page on the website is available in Google despite robots.txt allowing crawling

    - by Abhijit
    I seem to have the weirdest issue with search engine optimization. I have asked the IT folks at my university, I have asked people on the Joomla forums, and I have been trying to sort this issue out using Google Webmaster Tools for more than 2 months, to little avail. I want to know if I have some blatantly wrong configuration somewhere that is causing search engines to be unable to index this site. I noticed a similar issue with another website I searched for online (ECEGSA - The University of British Columbia at gsa.ece.ubc.ca), making me believe this might be a concern that other people are looking for an answer to. Here are the details: The website in question is http://gsa.ece.umd.edu/. It runs Joomla 2.5.x (latest). The site has been up since around mid-December 2013, and I noticed right from the get-go that it was not being indexed correctly on Google. Specifically, I see the following message when I search for the website on Google: "A description for this result is not available because of this site's robots.txt – learn more." The thing is, from December until around March I used the default Joomla robots.txt file, which is:
      User-agent: *
      Disallow: /administrator/
      Disallow: /cache/
      Disallow: /cli/
      Disallow: /components/
      Disallow: /images/
      Disallow: /includes/
      Disallow: /installation/
      Disallow: /language/
      Disallow: /libraries/
      Disallow: /logs/
      Disallow: /media/
      Disallow: /modules/
      Disallow: /plugins/
      Disallow: /templates/
      Disallow: /tmp/
    Nothing there should stop Google from searching my website. And even more confusingly, in Google Webmaster Tools, under the "Blocked URLs" tab, when I try many of the links on the site, they all show up as "Allowed". I then tried adding a sitemap and putting it in the robots.txt file. That did not help: the same exact search result, the same behavior in the "Blocked URLs" tab in Webmaster Tools. Additionally, the "Sitemaps" tab shows an error for several links saying "URL is robotted out". I tried those exact links in "Blocked URLs" and they are allowed! I then tried deleting the robots.txt file. No use; the same exact problem. At this point I cannot give a rational explanation for why this is happening, and neither can anyone in the IT department here. No one on the Joomla forums seems to understand what is going on. Based on what I explained, does it seem that I have somehow set something incorrectly in robots.txt, in .htaccess, or somewhere else?
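    For what it's worth, the check that the "Blocked URLs" tab performs can be reproduced locally with Python's standard-library robots.txt parser, which at least confirms whether the file itself is the blocker. A small sketch using the site URL from the question:

        # Does the robots.txt currently served actually block Googlebot? With
        # the default Joomla file in place, /images/ should report "blocked"
        # while the pages themselves report "allowed".
        from urllib.robotparser import RobotFileParser

        rp = RobotFileParser()
        rp.set_url("http://gsa.ece.umd.edu/robots.txt")
        rp.read()

        for path in ["/", "/index.php", "/images/logo.png"]:
            url = "http://gsa.ece.umd.edu" + path
            print(url, "->", "allowed" if rp.can_fetch("Googlebot", url) else "blocked")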

    Read the article

  • Why does Google not ignore the word "languages", although I have set it to be ignored in advanced search settings?

    - by jitendra1234
    Why does Google not ignore the word "languages", although I have set it to be ignored in advanced search settings? Here is the term I am using in Google search: -languages site:http://en.wikipedia.org/wiki/ And here is the first result, where the word "languages" is still present (you can do a quick Ctrl+F to find out): http://en.wikipedia.org/wiki/Walter_Bedell_Smith I am just curious to know why Google has not ignored the word "languages".

    Read the article

  • Will this sitemap get me de-indexed from Google?

    - by heavy rocker dude
    My site's URL (web address) is http://quantlabs.net/private/sitemap.xml. Will this sitemap get me de-indexed from Google? My new sitemap just got spidered by Google for some reason. It is located at http://quantlabs.net/private/sitemap.xml; is this in danger of getting me de-indexed from Google's index? Does it look like spam even though it is not meant to be? I am trying to figure out the limitation in terms of Google's threshold before they deem it a spammy sitemap. This sitemap contains automated postings, which differ in the stock symbol provided. The sitemap contains quite a few postings generated within a small amount of time.
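    As a reference point, the sitemap protocol itself caps a single sitemap file at 50,000 URLs (larger sets have to be split and listed in a sitemap index), so counting the entries is a reasonable first sanity check. A hedged sketch; it says nothing about whether Google considers the content spammy, only whether the file is oversized.

        # Count <url> entries in a sitemap and compare against the protocol's
        # 50,000-URLs-per-file limit.
        import urllib.request
        import xml.etree.ElementTree as ET

        SITEMAP_URL = "http://quantlabs.net/private/sitemap.xml"
        NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

        xml_bytes = urllib.request.urlopen(SITEMAP_URL, timeout=30).read()
        root = ET.fromstring(xml_bytes)
        urls = root.findall(f"{NS}url")

        print(f"{len(urls)} URLs in sitemap (protocol limit: 50,000 per file)")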

    Read the article

  • Why do my websites have a first page rank on Bing and Yahoo but not Google? [closed]

    - by Linda Cullum
    I have 3 websites suffering from a drop in ranking with Google and hence a huge drop in traffic. The instant drop occurred in September and I have not been able to remedy it. For the past 6-10 years my main website http://LearnToSail.Net has ranked from #3 to #1 on the 1st page of Google and all the other engines for the search term "learn to sail". Now it shows on the 1st page of Bing and Yahoo but does not show up on ANY page of Google. The only way it does come up is if I add "cd" to the "learn to sail" phrase. We sell a sailing CD on that website. The other websites are http://LearnToSailOnLine.com (search terms "learn to sail online" or "learntosailonline") and historyofthepilgrims.com (search terms "history of the pilgrims" or "historyofthepilgrims"). I get the same result: gone on Google but 1st page for Bing and Yahoo. I have researched, edited, updated blogs, made sitemaps, prayed to the universe, and used Google Webmaster Tools, but nothing is changing and I have lost a lot of business. I host with 1and1.com and have been back and forth with them, but to no avail and no change in traffic. I thought maybe some DNS mapping was off. I used to have a lot of traffic; now I have hardly any. Any advice would be greatly appreciated. I am still in the process of working on the issue, of course! This is a really great website here and I am glad I came across it. Thank you, LS Cullum, Little Pines Multimedia

    Read the article

  • What factors help in getting a site indexed by Google fast?

    - by ekaj
    What should I do to get a site indexed on Google, fast? Taking Super User as an example, I just did a quick Google search for a question that was 20 minutes old, to look for an answer, and it was already on Google Search - how is this possible? I glanced over this article, which seems to suggest that SU has added RSS feeds (which SU has, but when I opened the feed the latest item said it was posted 6 minutes ago, yet when Googled it shows as 11 hours old) - which leads me to think (based on that article; I don't know much about search indexing but I am reading about it at the moment) that most of this indexing is done thanks to the sitemap. Is there anything else I am unaware of that helps SU questions get on Google so fast?
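    Beyond simply having a sitemap, resubmitting it whenever content changes is what keeps the crawl cycle short. Google documented a simple sitemap "ping" endpoint for this at the time; a hedged sketch (the sitemap URL below is a placeholder):

        # Notify Google that a sitemap has been updated via the ping endpoint
        # documented at the time. The sitemap URL is a placeholder.
        import urllib.parse
        import urllib.request

        sitemap = "http://example.com/sitemap.xml"
        ping = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap, safe="")

        with urllib.request.urlopen(ping, timeout=10) as resp:
            print("Ping returned HTTP", resp.status)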

    Read the article

  • Google Rolls Out Secured Search. It’s Slightly Different From Regular Search

    - by Gopinath
    Google rolled out a secured version of its search engine at https://google.com (did you notice https instead of http?). This search engine lets everyone use Google search in a secured way. How is it secured? When you use https://google.com, the data exchanged between your browser and Google servers is encrypted to make sure that no one can sniff it. Is my search history secured from Google? No. The search queries you submit to Google are stored on Google servers. There is no change to Google's search history recording. Any differences between regular search and secured search results? Yes. Secured search is slightly different from regular search. When you are accessing Google Secured Search, image search options will not be available in the left sidebar. The site may also respond more slowly than the regular search site, as there is overhead in establishing a secure connection between your browser and the server.

    Read the article

  • How to search the web for programming related solutions?

    - by Bob
    I have the impression that Google has become unusable when searching for programming-related questions. Example: I'm Googling for XML-RPC Redstone Cookie. I'm expecting results that contain all three terms; I don't care for results where one term is missing. I believe that until some months ago Google just worked this way, i.e. all terms were included. Somehow this behavior is gone now (Google apparently thinks it is more intelligent than the user and knows what the user is searching for). So I helped myself by putting a + in front of every word. This is, however, a bit cumbersome. And for the last few weeks it doesn't even work in all cases anymore; Google ignores the +. So how do you search for programming-related problems? Do you still use Google? If yes, which techniques do you use to get the right results? Or do you use another search engine? Which one?

    Read the article

  • Google Local Search API

    - by Gublooo
    Hey guys, a couple of quick questions. 1) In the local search results we get a lot of parameters, like street title, address, city, state, lat, long, URL, etc. In order to uniquely identify a record, can I consider the URL to be unique to an address, or should I use the concatenation of latitude and longitude? Ref: http://code.google.com/apis/ajaxsearch/documentation/reference.html#_class_GlocalResult 2) In terms of usage, depending upon what the user enters, I'm displaying a list of local businesses for the user to choose from. Now, when a user selects a particular business address, is it legal for me to store that business address along with the latitude and longitude information in my database for future lookups? I've seen a lot of blogs talking about storing the lat/long info, but I just want to be sure that I'm not violating any Google rules. Thanks
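    On question 1: neither the URL nor the raw lat/long is guaranteed to be stable on its own (URLs change, and coordinates can wobble slightly between responses), so one common compromise is a composite key built from a few normalized fields. A hypothetical sketch of such a key, not an identifier defined by the API:

        # Derive a stable-ish key for a local search result from normalized
        # fields. Rounding the coordinates absorbs small jitter between API
        # responses; the title and street distinguish businesses that share a
        # building. Illustrative convention only.
        import hashlib

        def local_result_key(title: str, street: str, lat: float, lng: float) -> str:
            parts = [
                title.strip().lower(),
                street.strip().lower(),
                f"{lat:.4f}",   # ~11 m precision
                f"{lng:.4f}",
            ]
            return hashlib.sha1("|".join(parts).encode("utf-8")).hexdigest()

        print(local_result_key("Joe's Coffee", "123 Main St", 37.422006, -122.084095))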

    Read the article

  • Removed URL from Google Webmaster Tools, now Google doesn't show my website in search

    - by vicky
    I developed the site, and I removed the URL, which was like "http://example.com/", from Google Webmaster Tools. I did this because Google showed it in search with an "under construction" title, which was from my previous page. When I completed the website, I removed the URL from there, and added a sitemap etc. to get a new copy of the site indexed. Now I see in Webmaster Tools that all pages are indexed, but still no success. Yahoo and Bing are showing my page when searched. Help me to fix this problem. Regards, vicky

    Read the article
