Search Results

Search found 14829 results on 594 pages for 'sharepoint api'.

Page 128/594

  • "Error in the Site Data Web Service." when performing crawl

    - by Janis Veinbergs
    Installed SharePoint Services v3 (SP2, October 2009 cumulative updates, language pack) and attached a content database I had from before (all works). Installed Search Server 2008 Express (with language pack) on top of WSS, and crawl does not work. However, it works for a newly created web application + database. I was playing around with accounts and permissions to try to get it working. Currently I have a WSS_Crawler account with these permissions:

    Office Search Server runs with the WSS_Crawler account
    The config database has read permissions for WSS_Crawler
    The content database has read permissions for WSS_Crawler
    WSS_Crawler is owner of the search database
    WSS_Crawler was added to the SQL Server browser user group and administrators

    Yes, I've given more permissions than needed, but it doesn't work even with that, and I don't know if it is a permission problem or something else. The crawl log says there is "Error in the Site Data Web Service.", nothing more. There were known issues with a similar error, "Error in the Site Data Web Service. (Value does not fall within the expected range.)", but this is not the case here, as that is an old issue and I hope the fix has been included in SP2. The logs below run from oldest to newest (descending order); they don't appear to be very helpful.

    Crawl log
    http://serveris Crawled Local Office SharePoint Server sites 3/15/2010 9:39 AM
    sts3://serveris Crawled Local Office SharePoint Server sites 3/15/2010 9:39 AM
    sts3://serveris/contentdbid={55180cfa-9d2d-46e4... Crawled Local Office SharePoint Server sites 3/15/2010 9:39 AM
    http://serveris/test Error in the Site Data Web Service. Local Office SharePoint Server sites 3/15/2010 9:39 AM
    http://serveris Error in the Site Data Web Service. Local Office SharePoint Server sites 3/15/2010 9:39 AM

    EventLog
    No errors in the event log, just some Information events from Office Server Search:
    The search service started.
    Successfully stored the application configuration registry snapshot in the database. Context: Application 'SharedServices Component: da1288b2-4109-4219-8c0c-3a22802eb842
    Catalog: Portal_Content. A master merge was started due to an external request. Component: da1288b2-4109-4219-8c0c-3a22802eb842
    A master merge has completed for catalog Portal_Content. Component: da1288b2-4109-4219-8c0c-3a22802eb842
    Catalog: AnchorProject. A master merge was started due to an external request. Component: da1288b2-4109-4219-8c0c-3a22802eb842
    A master merge has completed for catalog AnchorProject.

    ULS Log
    Just some information, but no exceptions or unexpected errors:
    03/15/2010 09:03:28.28 mssearch.exe (0x1B2C) 0x0E8C Search Server Common GatherStatus 0 Monitorable Insert crawl 771 to inprogress queue hr 0x00000000 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:6591
    03/15/2010 09:03:28.28 mssearch.exe (0x1B2C) 0x0E8C Search Server Common GatherStatus 0 Monitorable Request Start Crawl 1, project Portal_Content, crawl 771 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:2875
    03/15/2010 09:03:28.28 mssearch.exe (0x1B2C) 0x0E8C Search Server Common GatherStatus 0 Monitorable Advise status change 1, project Portal_Content, crawl 771 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4853
    03/15/2010 09:03:28.28 w3wp.exe (0x1D98) 0x0958 Search Server Common MS Search Administration 8wn6 Information A full crawl was started on 'Local Office SharePoint Server sites' by BALTICOVO\janis.veinbergs.
    03/15/2010 09:03:28.43 mssdmn.exe (0x1750) 0x10F8 ULS Logging Unified Logging Service 8wsv High ULS Init Completed (mssdmn.exe, Microsoft.Office.Server.Native.dll)
    03/15/2010 09:03:30.48 mssdmn.exe (0x1750) 0x09C0 Search Server Common MS Search Indexing 8z0v Medium Create CCache
    03/15/2010 09:03:30.56 mssdmn.exe (0x1750) 0x09C0 Search Server Common MS Search Indexing 8z0z Medium Create CUserCatalogCache
    03/15/2010 09:03:32.06 w3wp.exe (0x1D98) 0x0958 Search Server Common MS Search Administration 90ge Medium SQL: dbo.proc_MSS_PropagationGetQueryServers
    03/15/2010 09:03:32.09 w3wp.exe (0x1D98) 0x0958 Search Server Common MS Search Administration 7phq High GetProtocolConfigHelper failed in GetNotesInterface().
    03/15/2010 09:03:34.26 mssearch.exe (0x1B2C) 0x16A4 Search Server Common GatherStatus 0 Monitorable Advise status change 12, project Portal_Content, crawl -1 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4853
    03/15/2010 09:03:35.92 mssearch.exe (0x1B2C) 0x16A4 Search Server Common GatherStatus 0 Monitorable Advise status change 12, project Portal_Content, crawl -1 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4853
    03/15/2010 09:03:37.32 mssearch.exe (0x1B2C) 0x16A4 Search Server Common GatherStatus 0 Monitorable Advise status change 12, project Portal_Content, crawl -1 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4853
    03/15/2010 09:03:37.23 mssdmn.exe (0x1750) 0x1850 Search Server Common MS Search Indexing 8z14 Medium Test TRACE (NULL):(null), (NULL)(null), (CrLf): , end
    03/15/2010 09:03:39.04 mssearch.exe (0x1B2C) 0x16A4 Search Server Common GatherStatus 0 Monitorable Advise status change 12, project Portal_Content, crawl -1 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4853
    03/15/2010 09:03:40.98 mssdmn.exe (0x1750) 0x0B24 Search Server Common MS Search Indexing 7how Monitorable GetWebDefaultPage fail. error 2147755542, strWebUrl http://serveris
    03/15/2010 09:03:41.87 mssdmn.exe (0x1750) 0x1260 Search Server Common PHSts 0 Monitorable CSTS3Accessor::GetSubWebListItemAccessURL GetAccessURL failed: Return error to caller, hr=80042616 - File:d:\office\source\search\search\gather\protocols\sts3\sts3acc.cxx Line:505
    03/15/2010 09:03:41.87 mssdmn.exe (0x1750) 0x1260 Search Server Common PHSts 0 Monitorable CSTS3Accessor::Init: GetSubWebListItemAccessURL failed. Return error to caller, hr=80042616 - File:d:\office\source\search\search\gather\protocols\sts3\sts3acc.cxx Line:348
    03/15/2010 09:03:41.87 mssdmn.exe (0x1750) 0x1260 Search Server Common PHSts 0 Monitorable CSTS3Accessor::Init fails, Url sts3://serveris/siteurl=test/siteid={390611b2-55f3-4a99-8600-778727177a28}/weburl=/webid={fb0e4bff-65d5-4ded-98d5-fd099456962b}, hr=80042616 - File:d:\office\source\search\search\gather\protocols\sts3\sts3handler.cxx Line:243
    03/15/2010 09:03:41.87 mssdmn.exe (0x1750) 0x1260 Search Server Common PHSts 0 Monitorable CSTS3Handler::CreateAccessorExB: Return error to caller, hr=80042616 - File:d:\office\source\search\search\gather\protocols\sts3\sts3handler.cxx Line:261
    03/15/2010 09:03:40.98 mssdmn.exe (0x1750) 0x1260 Search Server Common MS Search Indexing 7how Monitorable GetWebDefaultPage fail. error 2147755542, strWebUrl http://serveris/test
    03/15/2010 09:03:41.90 mssdmn.exe (0x1750) 0x0B24 Search Server Common PHSts 0 Monitorable CSTS3Accessor::GetSubWebListItemAccessURL GetAccessURL failed: Return error to caller, hr=80042616 - File:d:\office\source\search\search\gather\protocols\sts3\sts3acc.cxx Line:505
    03/15/2010 09:03:41.90 mssdmn.exe (0x1750) 0x0B24 Search Server Common PHSts 0 Monitorable CSTS3Accessor::Init: GetSubWebListItemAccessURL failed. Return error to caller, hr=80042616 - File:d:\office\source\search\search\gather\protocols\sts3\sts3acc.cxx Line:348
    03/15/2010 09:03:41.90 mssdmn.exe (0x1750) 0x0B24 Search Server Common PHSts 0 Monitorable CSTS3Accessor::Init fails, Url sts3://serveris/siteurl=/siteid={505443fa-ef12-4f1e-a04b-d5450c939b78}/weburl=/webid={c5a4f8aa-9561-4527-9e1a-b3c23200f11c}, hr=80042616 - File:d:\office\source\search\search\gather\protocols\sts3\sts3handler.cxx Line:243
    03/15/2010 09:03:41.90 mssdmn.exe (0x1750) 0x0B24 Search Server Common PHSts 0 Monitorable CSTS3Handler::CreateAccessorExB: Return error to caller, hr=80042616 - File:d:\office\source\search\search\gather\protocols\sts3\sts3handler.cxx Line:261
    03/15/2010 09:03:43.26 mssearch.exe (0x1B2C) 0x0750 Search Server Common GatherStatus 0 Monitorable Advise status change 24, project Portal_Content, crawl 771 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4853
    03/15/2010 09:03:43.26 mssearch.exe (0x1B2C) 0x1804 Search Server Common GatherStatus 0 Monitorable Remove crawl 771 from inprogress queue - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:6722
    03/15/2010 09:03:43.26 mssearch.exe (0x1B2C) 0x0750 Search Server Common GatherStatus 0 Monitorable Advise status change 12, project Portal_Content, crawl -1 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4853
    03/15/2010 09:03:44.65 mssearch.exe (0x1B2C) 0x1804 Search Server Common GatherStatus 0 Monitorable Insert crawl 772 to inprogress queue hr 0x00000000 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:6591
    03/15/2010 09:03:44.65 mssearch.exe (0x1B2C) 0x1804 Search Server Common GatherStatus 0 Monitorable Request Start Crawl 0, project AnchorProject, crawl 772 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:2875
    03/15/2010 09:03:44.65 mssearch.exe (0x1B2C) 0x1804 Search Server Common GatherStatus 0 Monitorable Advise status change 0, project AnchorProject, crawl 772 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4853
    03/15/2010 09:03:44.65 mssearch.exe (0x1B2C) 0x1804 Search Server Common GatherStatus 0 Monitorable Unlock Queue, project Portal_Content - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:2922
    03/15/2010 09:03:44.82 mssearch.exe (0x1B2C) 0x1DD0 Search Server Common GathererSql 0 Monitorable CGatherer::LoadTransactionsFromCrawlInternal Flush anchor, count 0 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4943
    03/15/2010 09:03:44.95 mssearch.exe (0x1B2C) 0x0750 Search Server Common GatherStatus 0 Monitorable Advise status change 12, project AnchorProject, crawl -1 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4853
    03/15/2010 09:03:46.51 mssearch.exe (0x1B2C) 0x0750 Search Server Common GatherStatus 0 Monitorable Advise status change 12, project AnchorProject, crawl -1 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4853
    03/15/2010 09:03:46.39 mssearch.exe (0x1B2C) 0x1E4C Search Server Common GathererSql 0 Monitorable CGatherer::LoadTransactionsFromCrawlInternal Flush anchor, count 0 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4943
    03/15/2010 09:03:49.01 mssearch.exe (0x1B2C) 0x1C6C Search Server Common GathererSql 0 Monitorable CGatherer::LoadTransactionsFromCrawlInternal Flush anchor, count 1 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4943
    03/15/2010 09:03:49.87 mssearch.exe (0x1B2C) 0x155C Search Server Common GathererSql 0 Monitorable CGatherer::LoadTransactionsFromCrawlInternal Flush anchor, count 1 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4943
    03/15/2010 09:03:49.29 mssearch.exe (0x1B2C) 0x155C Search Server Common GathererSql 0 Monitorable CGatherer::LoadTransactionsFromCrawlInternal Flush anchor, count 1 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4943
    03/15/2010 09:03:49.53 mssearch.exe (0x1B2C) 0x155C Search Server Common GathererSql 0 Monitorable CGatherer::LoadTransactionsFromCrawlInternal Flush anchor, count 1 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4943
    03/15/2010 09:03:49.67 mssearch.exe (0x1B2C) 0x155C Search Server Common GathererSql 0 Monitorable CGatherer::LoadTransactionsFromCrawlInternal Flush anchor, count 1 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4943
    03/15/2010 09:03:49.82 mssearch.exe (0x1B2C) 0x155C Search Server Common GathererSql 0 Monitorable CGatherer::LoadTransactionsFromCrawlInternal Flush anchor, count 1 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4943
    03/15/2010 09:03:49.84 mssearch.exe (0x1B2C) 0x155C Search Server Common GathererSql 0 Monitorable CGatherer::LoadTransactionsFromCrawlInternal Flush anchor, count 0 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4943
    03/15/2010 09:03:49.89 mssearch.exe (0x1B2C) 0x155C Search Server Common GathererSql 0 Monitorable CGatherer::LoadTransactionsFromCrawlInternal Flush anchor, count 0 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4943
    03/15/2010 09:03:49.90 mssearch.exe (0x1B2C) 0x0750 Search Server Common GatherStatus 0 Monitorable Advise status change 12, project AnchorProject, crawl -1 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4853
    03/15/2010 09:03:51.42 mssearch.exe (0x1B2C) 0x1E4C Search Server Common GatherStatus 0 Monitorable Advise status change 4, project AnchorProject, crawl 772 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4853
    03/15/2010 09:03:51.00 mssearch.exe (0x1B2C) 0x1E4C Search Server Common GathererSql 0 Monitorable CGatherer::LoadTransactionsFromCrawlInternal Flush anchor, count 0 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4943
    03/15/2010 09:03:51.42 mssearch.exe (0x1B2C) 0x1CCC Search Server Common GatherStatus 0 Monitorable Remove crawl 772 from inprogress queue - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:6722
    03/15/2010 09:03:52.96 mssearch.exe (0x1B2C) 0x1CCC Search Server Common GatherStatus 0 Monitorable Insert crawl 773 to inprogress queue hr 0x00000000 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:6591
    03/15/2010 09:03:52.96 mssearch.exe (0x1B2C) 0x1CCC Search Server Common GatherStatus 0 Monitorable Request Start Crawl 0, project AnchorProject, crawl 773 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:2875
    03/15/2010 09:03:55.29 mssearch.exe (0x1B2C) 0x1CCC Search Server Common GatherStatus 0 Monitorable Unlock Queue, project AnchorProject - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:2922
    03/15/2010 09:03:55.29 mssearch.exe (0x1B2C) 0x1CCC Search Server Common GatherStatus 0 Monitorable Removed start crawl request from Queue 0, crawl 773 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:2942
    03/15/2010 09:03:55.29 mssearch.exe (0x1B2C) 0x1CCC Search Server Common GatherStatus 0 Monitorable Request Start Crawl 0, project AnchorProject, crawl 773 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:2875
    03/15/2010 09:03:55.29 mssearch.exe (0x1B2C) 0x1CCC Search Server Common GatherStatus 0 Monitorable Advise status change 0, project AnchorProject, crawl 773 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4853
    03/15/2010 09:03:55.37 mssearch.exe (0x1B2C) 0x1CCC Search Server Common GathererSql 0 Monitorable CGatherer::LoadTransactionsFromCrawlInternal Flush anchor, count 0 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4943
    03/15/2010 09:03:55.37 mssearch.exe (0x1B2C) 0x0750 Search Server Common GatherStatus 0 Monitorable Advise status change 12, project AnchorProject, crawl -1 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4853
    03/15/2010 09:03:56.71 mssearch.exe (0x1B2C) 0x1E4C Search Server Common GathererSql 0 Monitorable CGatherer::LoadTransactionsFromCrawlInternal Flush anchor, count 0 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4943
    03/15/2010 09:03:56.78 mssearch.exe (0x1B2C) 0x0750 Search Server Common GatherStatus 0 Monitorable Advise status change 12, project AnchorProject, crawl -1 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4853
    03/15/2010 09:03:58.40 mssearch.exe (0x1B2C) 0x155C Search Server Common GathererSql 0 Monitorable CGatherer::LoadTransactionsFromCrawlInternal Flush anchor, count 0 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4943
    03/15/2010 09:03:58.89 mssearch.exe (0x1B2C) 0x155C Search Server Common GatherStatus 0 Monitorable Advise status change 4, project AnchorProject, crawl 773 - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:4853
    03/15/2010 09:03:58.89 mssearch.exe (0x1B2C) 0x1130 Search Server Common GatherStatus 0 Monitorable Remove crawl 773 from inprogress queue - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:6722
    03/15/2010 09:03:58.89 mssearch.exe (0x1B2C) 0x1130 Search Server Common GatherStatus 0 Monitorable Unlock Queue, project AnchorProject - File:d:\office\source\search\search\gather\server\gatherobj.cxx Line:2922

    What could be wrong here - any clues?

    Read the article

  • JPA 2 Criteria API: why is isNull being ignored when in conjunction with equal?

    - by Vítor Souza
    I have the following entity class (ID inherited from the PersistentObjectSupport class):

        @Entity
        public class AmbulanceDeactivation extends PersistentObjectSupport implements Serializable {
            private static final long serialVersionUID = 1L;

            @Temporal(TemporalType.DATE) @NotNull private Date beginDate;
            @Temporal(TemporalType.DATE) private Date endDate;
            @Size(max = 250) private String reason;
            @ManyToOne @NotNull private Ambulance ambulance;

            /* Get/set methods, etc. */
        }

    If I do the following query using the Criteria API:

        CriteriaBuilder cb = em.getCriteriaBuilder();
        CriteriaQuery<AmbulanceDeactivation> cq = cb.createQuery(AmbulanceDeactivation.class);
        Root<AmbulanceDeactivation> root = cq.from(AmbulanceDeactivation.class);
        EntityType<AmbulanceDeactivation> model = root.getModel();
        cq.where(cb.isNull(root.get(model.getSingularAttribute("endDate", Date.class))));
        return em.createQuery(cq).getResultList();

    I get the following SQL printed in the log:

        FINE: SELECT ID, REASON, ENDDATE, UUID, BEGINDATE, VERSION, AMBULANCE_ID FROM AMBULANCEDEACTIVATION WHERE (ENDDATE IS NULL)

    However, if I change the where() line in the previous code to this one:

        cq.where(cb.isNull(root.get(model.getSingularAttribute("endDate", Date.class))),
                 cb.equal(root.get(model.getSingularAttribute("ambulance", Ambulance.class)), ambulance));

    I get the following SQL:

        FINE: SELECT ID, REASON, ENDDATE, UUID, BEGINDATE, VERSION, AMBULANCE_ID FROM AMBULANCEDEACTIVATION WHERE (AMBULANCE_ID = ?)

    That is, the isNull criterion is totally ignored. It is as if it wasn't even there (if I provide only the equal criterion to the where() method, I get the same SQL printed). Why is that? Is it a bug, or am I missing something?
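
    Editor's note: per the JPA 2 spec, passing several restrictions to where() should combine them with AND, so the provider appears to be dropping the isNull predicate. A workaround worth trying (a sketch only, not a confirmed fix; it assumes the varargs overload of where() is at fault) is to build a single conjunction explicitly with CriteriaBuilder.and(), reusing the variables already defined above (Predicate comes from javax.persistence.criteria):

        Predicate endDateIsNull = cb.isNull(root.get(model.getSingularAttribute("endDate", Date.class)));
        Predicate sameAmbulance = cb.equal(root.get(model.getSingularAttribute("ambulance", Ambulance.class)), ambulance);

        // One combined predicate instead of two separate arguments to where().
        cq.where(cb.and(endDateIsNull, sameAmbulance));
        return em.createQuery(cq).getResultList();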

    Read the article

  • How can I insert a new sitemap with the Google GData API? It returns 400 Bad Request

    - by wingoo
    I am trying to insert a new sitemap into Google Webmaster Tools using the API, but I can't get it to work. This is the method:

        var fullDomainUrl = "http://www.example.com/";
        var entry = new SitemapsEntry();
        entry.Id = new AtomId(fullDomainUrl + "sitemap.xml");
        entry.Categories.Add(new AtomCategory("http://schemas.google.com/webmasters/tools/2007#site-info", new AtomUri("http://schemas.google.com/g/2005#kind")));
        entry.SitemapType = "WEB";
        myService.Insert(new Uri(string.Format("https://www.google.com/webmasters/tools/feeds/{0}/sitemaps/", HttpUtility.UrlEncode(fullDomainUrl))), entry);

    This returns a 400 Bad Request, so I tried another method:

        var settings = new RequestSettings("TesterApp1", domain.GoogleAuthToken, CommonService.GetRsaPrivateKey(Context));
        var request = new WebmasterToolsRequest(settings);
        var sitemap = new Sitemap();
        sitemap.Id = fullDomainUrl + "sitemap.xml";
        sitemap.Categories.Add(new AtomCategory("http://schemas.google.com/webmasters/tools/2007#site-info", new AtomUri("http://schemas.google.com/g/2005#kind")));
        sitemap.SitemapType = "WEB";
        //request.AddSitemap(fullDomainUrl, sitemap);
        request.Insert(new Uri(string.Format("https://www.google.com/webmasters/tools/feeds/{0}/sitemaps/", HttpUtility.UrlEncode(fullDomainUrl))), sitemap);

    This also returns a 400 Bad Request. I then tried to use HttpWebRequest to POST the Atom entry to Google, but that also returns a 400 Bad Request. I can insert and update sites successfully, but I can't insert a new sitemap. Can anyone share working .NET code for this?

    Read the article

  • Google Maps API v3: click event raised when clicking MarkerClusterer?

    - by lucian.jp
    I have a Google Maps API v3 map object on a page that uses MarkerClusterer. I have a function that needs to run when the map is clicked, so it is registered as:

        google.maps.event.addListener(map, 'click', function (event) {
            CallMe(event.latLng);
        });

    My problem is as follows: when I click on a MarkerClusterer cluster, instead of behaving like a marker (which does not raise the map's click event, only its own), the map's click event is also fired. To test this I added an alert to the MarkerClusterer click:

        google.maps.event.addListener(markerClusterer, 'clusterclick', function (cluster) {
            alert('MarkerClusterer click event');
        });

    The clusterclick event fires after the map object's click event, so I can't simply remove the map object's listener as a solution. Is there any way to test, inside the map's click handler, whether the click was actually on a cluster? Or a way to replicate the marker behaviour so the map's click event is not raised when clusterclick is fired? Google and the documentation didn't help me. Thanks.
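
    Editor's note: one possible workaround (a sketch, not from the original question; the 300 ms delay is an arbitrary assumption) is to defer the map's own handling briefly and cancel it if a clusterclick arrives, since the cluster click is reported after the unwanted map click:

        var pendingMapClick = null;

        google.maps.event.addListener(map, 'click', function (event) {
            // Defer the real work so a following clusterclick can cancel it.
            pendingMapClick = window.setTimeout(function () {
                pendingMapClick = null;
                CallMe(event.latLng);
            }, 300);
        });

        google.maps.event.addListener(markerClusterer, 'clusterclick', function (cluster) {
            if (pendingMapClick !== null) {
                window.clearTimeout(pendingMapClick);   // swallow the spurious map click
                pendingMapClick = null;
            }
            // cluster-specific handling goes here
        });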

    Read the article

  • What's a good Java API for creating Word documents?

    - by Bill James
    I have a new app I'll be working on where I have to generate a Word document that contains tables, graphs, a table of contents and text. What's a good API to use for this? How sure are you that it supports graphs, ToCs, and tables? What are some hidden gotchas in using them? Some clarifications:

    I can't output a PDF; they want a Word doc.
    They're using MS Word 2003 (or 2007), not OpenOffice.
    The application is running on a *nix app server.
    It'd be nice if I could start with a template doc and just fill in some spaces with tables, graphs, etc.

    Thanks for the help.

    Edit: Several good answers below, each with their own faults as far as my current situation goes. It's hard to pick a "final answer" from them, so I think I'll leave it open and hope for better solutions to be created.

    Edit: The OpenOffice UNO project does seem to be closest to what I asked for. While POI is certainly more mainstream, it's too immature for what I want.
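
    Editor's note: for readers weighing the POI option mentioned above, here is a minimal sketch of what generating a document with Apache POI's XWPF API looks like (illustrative only; it assumes a recent POI release on the classpath, produces .docx, which Word 2003 needs the compatibility pack to open, and it does not attempt graphs or a table of contents, which is where the limitations noted above show up):

        import java.io.FileOutputStream;
        import org.apache.poi.xwpf.usermodel.XWPFDocument;
        import org.apache.poi.xwpf.usermodel.XWPFParagraph;
        import org.apache.poi.xwpf.usermodel.XWPFRun;
        import org.apache.poi.xwpf.usermodel.XWPFTable;

        public class WordExportSketch {
            public static void main(String[] args) throws Exception {
                XWPFDocument doc = new XWPFDocument();      // a new, empty .docx

                XWPFParagraph title = doc.createParagraph();
                XWPFRun run = title.createRun();
                run.setBold(true);
                run.setText("Generated report");

                // A simple 2 x 3 table filled cell by cell.
                XWPFTable table = doc.createTable(2, 3);
                table.getRow(0).getCell(0).setText("Region");
                table.getRow(0).getCell(1).setText("Revenue");
                table.getRow(0).getCell(2).setText("Growth");
                table.getRow(1).getCell(0).setText("EMEA");
                table.getRow(1).getCell(1).setText("1,200");
                table.getRow(1).getCell(2).setText("4%");

                FileOutputStream out = new FileOutputStream("report.docx");
                doc.write(out);
                out.close();
            }
        }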

    Read the article

  • Google Maps API V3: How to jump to a specific marker from outside the map?

    - by Tom
    I have a map with two markers on it. The initial view of the map only shows one marker, and I want to provide a link next to the map that will move the map to the second marker when clicked. Here's a demo of what I want, using v2 of the API: http://arts.brighton.ac.uk/contact-university-of-brighton-faculty-of-arts (note the links below the map). Here's what I have so far:

        <script type="text/javascript">
        function initialize() {
            var latlng = new google.maps.LatLng(50.823817, -0.135634);
            var myOptions = {
                zoom: 13,
                center: latlng,
                mapTypeId: google.maps.MapTypeId.ROADMAP,
                mapTypeControlOptions: { style: google.maps.MapTypeControlStyle.DROPDOWN_MENU },
                scaleControl: false
            };
            var map = new google.maps.Map(document.getElementById("map"), myOptions);

            // 1st marker
            var marker1 = new google.maps.Marker({
                position: new google.maps.LatLng(50.823817, -0.135634),
                map: map,
                title: 'my title'
            });
            var infowindow = new google.maps.InfoWindow({ content: 'my window content' });
            google.maps.event.addListener(marker1, 'click', function() {
                infowindow.open(map, marker1);
            });

            // 2nd marker
            var marker2 = new google.maps.Marker({
                position: new google.maps.LatLng(51.5262405, -0.074549),
                map: map,
                title: 'my 2nd title'
            });
            var infowindow2 = new google.maps.InfoWindow({ content: 'content for my 2nd window' });
            google.maps.event.addListener(marker2, 'click', function() {
                infowindow2.open(map, marker2);
            });
        }
        </script>

    So what I'd like to add is a link to marker2, to move the map some 50-odd miles up, e.g. <a href="#marker2">Second location</a>. How would I do this?
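
    Editor's note: one way to do this (a sketch, not from the original question; it assumes map and marker2 are moved out to script scope so the link's handler can reach them) is to expose a small helper and call it from the link:

        var map, marker2;   // declared at script scope instead of inside initialize()

        function goToMarker2() {
            map.panTo(marker2.getPosition());
            // infowindow2.open(map, marker2);   // optionally open its info window too
            return false;                        // stop the browser following the href
        }

    And the link itself becomes:

        <a href="#" onclick="return goToMarker2();">Second location</a>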

    Read the article

  • This web site needs a different Google Maps API key. A new key can be generated at http://code.googl

    - by MJI
    I'm not a developer. I have no domain, nor do I wish to have one at the moment. Rather, I'm just a regular person who likes to upload photos to some photo-related sites. My uploading process constantly gets interrupted by one of these annoying API errors. I get it at least twice: once when I click the page to upload, and again right after the upload finishes. It also pops up if I go to edit or delete a photo. This interrupts my browsing until I click OK. I just want a fix for the annoyance without having to register for a key; I tried before and it required a web domain, and I'd rather not create a domain and jump through such hoops just to fix this. Is there a solution to this problem that doesn't require registration? Another thing to note: I have used two computers. One shows the pop-up message and the other doesn't. What is different about the two computers?

    Read the article

  • Why does my Google maps api v3 and side panel not fill my page upon resizing?

    - by Gavin
    I'm developing a web page with a side panel on the left containing a search bar, and a Google Maps API v3 map filling the rest of the page to the right. When I make the browser very small vertically, there is white space between the bottom of the side panel and map and the bottom of the browser, even though the text continues to the bottom of the browser. Here's my CSS code:

        <style type="text/css">
        body {margin:0;}
        #panel {height:100%; width:300px; position:absolute; padding:0; background-color:#8C95A0;}
        #header {padding:2px; text-align:center}
        #address_instruction {position:relative; top:7%; padding:2px; text-align:center}
        #geocoder {position:relative; top:8%; padding:2px; text-align:center}
        #toggle_instruction {position:relative; top:22%; padding:2px; text-align:center}
        #layers {position:relative; top:25%; padding:2px; text-align:center}
        #layer0 {padding:2px; text-align:center}
        #layer1 {padding:2px; text-align:center}
        #layer2 {padding:2px; text-align:center}
        #link {top:50%; position:relative; padding:2px; text-align:center}
        #map_canvas {height:100%; left:300px; right:0px; position:absolute; padding:0;}
        </style>

    The IDs within #panel refer to the items on the left-hand side in the panel. Why don't the side panel background color and the map extend to the bottom of the browser?
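
    Editor's note: a likely cause (a guess based on the CSS above, not a confirmed diagnosis) is that height:100% is resolved against one viewport height, so when the panel's content overflows a short window the colored background stops while the text keeps going. A minimal sketch of the usual fix, assuming nothing else in the page constrains the layout:

        html, body { height: 100%; margin: 0; }

        /* Let the panel grow with its content instead of stopping at one viewport height. */
        #panel      { min-height: 100%; height: auto; width: 300px; position: absolute; background-color: #8C95A0; }
        #map_canvas { height: 100%; left: 300px; right: 0; position: absolute; }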

    Read the article

  • How do I post a link to the feed of a page via the Facebook Graph API *as* the page?

    - by jsdalton
    I'm working on a plugin for a Wordpress blog that posts a link to every article published to a Facebook Page associated with the blog. I'm using the Graph API and I have authenticated myself, for the time being, via OAuth. I can successfully post a message to the page using curl via a POST request to https://graph.facebook.com/mypageid/feed with e.g. message = "This is a test" and it published the message. The problem is that the message is "from" my user account. I'm an admin on this test page, and when I go to Facebook and post an update from the web, the link comes "from" my page. That's how I'd like this to be set up, because it looks silly if all the shared links are coming from a user account. Is there a way to authenticate myself as a page? Or is there an alternate way to POST to a page feed that doesn't end up being interpreted as a comment from a user? Thanks for any thoughts or suggestions.
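
    Editor's note: the usual pattern for this (sketched here from the Graph API's page-token flow; historically it required the manage_pages permission, and exact details can vary by API version) is to fetch a Page access token via /me/accounts and then post with that token, so the story is attributed to the Page rather than to your user account:

        # 1) List the pages the authenticated user administers; each entry in the
        #    response includes a page-specific access_token.
        curl "https://graph.facebook.com/me/accounts?access_token=USER_ACCESS_TOKEN"

        # 2) Post to the page feed using the *page* token instead of the user token;
        #    PAGE_ID and both tokens are placeholders.
        curl -F "message=This is a test" \
             -F "access_token=PAGE_ACCESS_TOKEN" \
             "https://graph.facebook.com/PAGE_ID/feed"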

    Read the article

  • How can I make my Google Maps api v3 address search bar work by hitting the enter button on the keyboard?

    - by Gavin
    I'm developing a web page and would like to make it more user friendly. I have a functional Google Maps API v3 map and an address search bar. Currently I have to use the mouse and click the Search button to trigger the geocoding function. How can I make the map return a placemark by hitting the Enter key on the keyboard instead? I just want to make it as user-friendly as possible. Here are the JavaScript and the div, respectively, that I created for the address bar:

        var geocoder;
        function initialize() {
            geocoder = new google.maps.Geocoder();

            function codeAddress() {
                var address = document.getElementById("address").value;
                geocoder.geocode({ 'address': address }, function(results, status) {
                    if (status == google.maps.GeocoderStatus.OK) {
                        map.setCenter(results[0].geometry.location);
                        marker.setPosition(results[0].geometry.location);
                        map.setZoom(14);
                    } else {
                        alert("Geocode was not successful for the following reason: " + status);
                    }
                });
            }

        <div id="geocoder">
            <input id="address" type="textbox" value="">
            <input type="button" value="Search" onclick="codeAddress()">
        </div>

    Thank you in advance for your help.
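
    Editor's note: one way to wire this up (a sketch, assuming codeAddress is reachable from the scope where the listener is attached, as the button's onclick already implies; older IE would need attachEvent instead of addEventListener):

        document.getElementById('address').addEventListener('keydown', function (e) {
            if (e.keyCode === 13) {      // Enter key
                e.preventDefault();      // don't submit a surrounding form, if any
                codeAddress();
            }
        });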

    Read the article

  • Introducing Microsoft SQL Server 2008 R2 - Business Intelligence Samples

    - by smisner
    On April 14, 2010, Microsoft Press (blog | twitter) released my latest book, co-authored with Ross Mistry (twitter), as a free ebook download - Introducing Microsoft SQL Server 2008 R2. As the title implies, this ebook is an introduction to the latest SQL Server release. Although you'll find a comprehensive review of the product's features in this book, you will not find the step-by-step details that are typical in my other books. For those readers who are interested in a more interactive learning experience, I have created two sample files for download:

    IntroSQLServer2008R2Samples project
    Sales Analysis workbook

    Here's a recap of the business intelligence chapters and the samples I used to generate the screen shots, by chapter:

    Chapter 6: Scalable Data Warehousing covers a new edition of SQL Server, Parallel Data Warehouse. Understandably, Microsoft did not ship me the software and hardware to set up my own Parallel Data Warehouse environment for testing purposes, and consequently you won't see any screenshots in this chapter. I received a lot of information and a lot of help from the product team during the development of this chapter to ensure its technical accuracy.

    Chapter 7: Master Data Services is a new component in SQL Server. After you install Master Data Services (MDS), which is a separate installation from SQL Server although it's found on the same media, you can install sample models to explore (which is what I did to create screenshots for the book). To do this, you deploy packages found at \Program Files\Microsoft SQL Server\Master Data Services\Samples\Packages. You will first need to use the Configuration Manager (in the Microsoft SQL Server 2008 R2\Master Data Services program group) to create a database and a Web application for MDS. Then when you launch the application, you'll see a Getting Started page which has a Deploy Sample Data link that you can use to deploy any of the sample packages.

    Chapter 8: Complex Event Processing is an introduction to another new component, StreamInsight. This topic was way too large to cover in depth in a single chapter, so I focused on information such as architecture, development models, and an overview of the key sections of code you'll need to develop for your own applications. StreamInsight is an engine that operates on data in flight and as such has no user interface that I could include in the book as screenshots. The November CTP version of SQL Server 2008 R2 included code samples as part of the installation, but these are not the official samples that will eventually be available in Codeplex. At the time of this writing, the samples are not yet published.

    Chapter 9: Reporting Services Enhancements provides an overview of all the changes to Reporting Services in SQL Server 2008 R2, and there are many! In previous posts, I shared more details than you'll find in the book about new functions (Lookup, MultiLookup, and LookupSet), properties for page numbering, and the new global variable RenderFormat. I will confess that I didn't use actual data in the book for my discussion on the Lookup functions, but I did create real reports for the blog posts and will upload those separately. For the other screenshots and examples in the book, I have created the IntroSQLServer2008R2Samples project for you to download. To preview these reports in Business Intelligence Development Studio, you must have the AdventureWorksDW2008R2 database installed, and you must download and install SQL Server 2008 R2. For the map report, you must execute the PopulationData.sql script that I included in the samples file to add a table to the AdventureWorksDW2008R2 database. The IntroSQLServer2008R2Samples project includes the following files:

    01_AggregateOfAggregates.rdl to illustrate the use of embedded aggregate functions
    02_RenderFormatAndPaging.rdl to illustrate the use of page break properties (Disabled, ResetPageNumber), the PageName property, and the RenderFormat global variable
    03_DataSynchronization.rdl to illustrate the use of the DomainScope property
    04_TextboxOrientation.rdl to illustrate the use of the WritingMode property
    05_DataBar.rdl
    06_Sparklines.rdl
    07_Indicators.rdl
    08_Map.rdl to illustrate a simple analytical map that uses color to show population counts by state
    PopulationData.sql to provide the data necessary for the map report

    Chapter 10: Self-Service Analysis with PowerPivot introduces two new components to the Microsoft BI stack, PowerPivot for Excel and PowerPivot for SharePoint, which you can learn more about at the PowerPivot site. To produce the screenshots for this chapter, I created the Sales Analysis workbook, which you can download (although you must have Excel 2010 and the PowerPivot for Excel add-in installed to explore it fully). It's a rather simple workbook because space in the book did not permit a complete exploration of all the wonderful things you can do with PowerPivot. I used a tutorial that was available with the CTP version as a basis for the report, so it might look familiar if you've already started learning about PowerPivot. In future posts, I'll continue exploring the new features in greater detail. If there are any special requests, please let me know!

    Read the article

  • C# 5.0 Async/Await Demo Code

    - by Paulo Morgado
    I've published the sample code I use to demonstrate the use of async/await in C# 5.0. You can find it here.

    Projects

    PauloMorgado.AyncDemo.WebServer
    This project is a simple web server implemented as a console application using Microsoft ASP.NET Web API self-hosting, and it serves an image (with a delay) that is accessed by the other projects. This project has a dependency on Json.NET because the Microsoft ASP.NET Web API hosting has a dependency on Json.NET. The application must be run from a command prompt with administrative privileges, or a urlacl must be added with the following command:

        netsh http add urlacl url=http://+:9090/ user=machine\username

    To remove the urlacl, just use the following command:

        netsh http delete urlacl url=http://+:9090/

    PauloMorgado.AsyncDemo.WindowsForms
    This Windows Forms project contains three regions that must be uncommented one at a time:

    Sync with WebClient - retrieves the image through a synchronous call using the WebClient class.
    Async with WebClient - retrieves the image through an asynchronous call using the WebClient class.
    Async with HttpClient with cancelation - retrieves the image through an asynchronous call with cancelation using the HttpClient class.

    PauloMorgado.AsyncDemo.Wpf
    This WPF project contains the same three regions, which must also be uncommented one at a time:

    Sync with WebClient - retrieves the image through a synchronous call using the WebClient class.
    Async with WebClient - retrieves the image through an asynchronous call using the WebClient class.
    Async with HttpClient with cancelation - retrieves the image through an asynchronous call with cancelation using the HttpClient class.
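
    Editor's note: for context, here is a minimal sketch (not the published sample itself; the URL and names are illustrative) of the "async with HttpClient with cancelation" pattern the demo regions exercise:

        using System;
        using System.Net.Http;
        using System.Threading;
        using System.Threading.Tasks;

        public static class ImageDownloader
        {
            private static readonly HttpClient client = new HttpClient();

            // Downloads the demo image asynchronously, honouring a caller-supplied
            // cancellation token (the pattern the "Async with HttpClient" regions use).
            public static async Task<byte[]> GetImageAsync(Uri address, CancellationToken token)
            {
                HttpResponseMessage response = await client.GetAsync(address, token);
                response.EnsureSuccessStatusCode();
                return await response.Content.ReadAsByteArrayAsync();
            }
        }

        // Typical call site, e.g. from a button click handler in the WinForms/WPF projects:
        //   var cts = new CancellationTokenSource();
        //   byte[] bytes = await ImageDownloader.GetImageAsync(
        //       new Uri("http://localhost:9090/"), cts.Token);
        //   // Calling cts.Cancel() from a Cancel button aborts the pending request.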

    Read the article

  • Advanced donut caching: using dynamically loaded controls

    - by DigiMortal
    Yesterday I solved a caching problem with a local community portal. I enabled output caching on SharePoint Server 2007 to make the site faster. Although caching works fine, I needed to do some additional work because there are some controls that show different content to different users. In this example I will show you how to use "donut caching" with user controls - a powerful way to drive some content around the cache.

    About donut caching
    Donut caching means that although you are caching your content, you leave some holes in it so you can still affect the output that goes to the user. For example, you can cache the front page of your site and still show a welcome message that contains the correct user name. To get a better idea about donut caching, I suggest you read ScottGu's posting Tip/Trick: Implement "Donut Caching" with the ASP.NET 2.0 Output Cache Substitution Feature. Basically, donut caching uses the ASP.NET Substitution control. In the output this control is replaced by the string you return from the static method bound to the Substitution control. Again, take a look at the ScottGu blog posting I referred to above.

    Problem
    If you look at Scott's example, it is pretty plain and easy in its output. All it does is write out the current user name as a string. My login area has separate views for anonymous and authenticated users. It is clear that outputting the mark-up for these views as a string is pretty lame to implement at the string level in code. Every little change in the design would end up in a new version of the controls library because some parts of the design "live" there.

    Solution: using user controls
    I worked out an easy solution to my problem. I used cache substitution and user controls together. I have three user controls:

    LogInControl - the proxy control that checks which "real" control to load.
    AnonymousLogInControl - template and logic for the anonymous users' login area.
    AuthenticatedLogInControl - template and logic for the authenticated users' login area. This is the control we render for each user separately because it contains the user name and the profile fill percentage.

    The anonymous control is not very interesting because it only keeps the mark-up in a separate file. The interesting parts are LogInControl and AuthenticatedLogInControl.

    Creating the proxy control
    The first thing was to create a control that has a substitution area where the "real" control is loaded. This proxy control should also be able to decide which control to load. The definition of the control is very primitive:

        <%@ Control EnableViewState="false" Inherits="MyPortal.Profiles.LogInControl" %>
        <asp:Substitution runat="server" MethodName="ShowLogInBox" />

    But the code is a little bit tricky. Based on the current user instance we decide which login control to load. Then we create a page instance and load our control through it. When the control is loaded we call its DataBind() method. In this method we evaluate all fields in the loaded control (it was the best choice, as Load and other events will not be fired). Take a look at the code:

        public static string ShowLogInBox(HttpContext context)
        {
            var user = SPContext.Current.Web.CurrentUser;
            string controlName;

            if (user != null)
                controlName = "AuthenticatedLogInControl.ascx";
            else
                controlName = "AnonymousLogInControl.ascx";

            var path = "~/_controltemplates/" + controlName;
            var output = new StringBuilder(10000);

            using(var page = new Page())
            using(var ctl = page.LoadControl(path))
            using(var writer = new StringWriter(output))
            using(var htmlWriter = new HtmlTextWriter(writer))
            {
                ctl.DataBind();
                ctl.RenderControl(htmlWriter);
            }
            return output.ToString();
        }

    When the control is bound to data, we ask it to render its contents to the StringBuilder. Now we have the output of the control as a string and we can return it from our method. Of course, notice how careful I am with disposing resources. :) The method that returns the contents for the Substitution control is a static method that has no connection with a control instance, because when the page is read from cache there are no instances of controls available.

    Conclusion
    As you saw, it is not very hard to use donut caching with user controls. Instead of writing the mark-up of controls into the static method that is bound to the Substitution control, we can still use our user controls.

    Read the article

  • Wolfram is out, any alternatives? Or how to go custom?

    - by Patrick
    We were originally planning on using the Wolfram Alpha API for a new project, but unfortunately the cost was entirely too high for what we were using it for. Essentially, what we were doing is calculating the nutrition facts for food (http://www.wolframalpha.com/input/?i=chicken+breast+with+broccoli). Before taking the step of trying to build something that may work in its place for this use case, is there any open source code anywhere that can do this kind of analysis and compile the data? The hardest part, in my opinion, is what it uses for assumptions and where it gets the data that powers the calculations. Another way to put it: I cannot seem to wrap my head around building something that interprets user input and returns facts and knowledge. I know that if I can convert the user input into some standardized form, I can then compare that against a nutrition-facts database to pull in the information I need. Does anyone know of any solutions to re-create this, or APIs that can provide this kind of analysis? Thanks for any advice. I am trying to figure out if this project is dead in the water before it even starts. This kind of programming is well beyond me, so I can only hope for an API, open source code, or some kind of analysis engine to interpret user input when I know what kind of data they are entering (measurements and food).

    Read the article

  • Design Application to "Actively" Invite Users (pretend they have privileges)

    - by user3086451
    I am designing an application where users message one another privately, and may send messages to any Entity in the database (an Entity may not have a user account yet; it is a professional database). I am not sure how to best design the database and the API to allow messaging unregistered users. The application should remain secure, and data should only be accessed by those with the correct permissions. Messages sent to persons without user accounts serve as an invitation. The invited person should be able to view the message, act on it, and complete the user registration upon receiving an InviteMessage. In simple terms, I have:

    User: misc user fields (email, pw, dateJoined)
    Entity (large professional dataset): personalDetails..., user->User (may be null)
    UserMessage: sender->User, recipient->User, dateCreated, messageContent, other fields...
    InviteMessage: sender->User, recipient->Entity, expiringUrl, inviteeEmail, inviteePhone

    I plan to alert the user when selecting a recipient that is not registered yet, and to let him send the message as an invitation by providing an email and phone where we can send the invitation. Invitations will have a unique, one-time-use URL, e.g. uuid.uuid4(). When accessed, the invitee will see the InviteMessage and details about completing his/her registration profile. When registration is complete, the InviteMessage details are converted to a new instance of UserMessage (so their data is not lost) and assigned to the newly created User. The ability to interact with and invite persons who do not yet have accounts is a key feature of the application, and it seems better to separate the invitation from the private in-app messages (easier to keep the functionality separate, better if the data model changes). Is this a reasonable, good design? If not, what would you suggest? Do you have any improvements? Am I correct to choose to create a separate endpoint for creating invitations via the API?
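
    Editor's note: for illustration only (the post does not name a framework; Django-style models and field names are assumed here), the invitation-to-message conversion described above might look roughly like this:

        import uuid
        from django.db import models
        from django.utils import timezone


        class InviteMessage(models.Model):
            sender = models.ForeignKey('User', on_delete=models.CASCADE)
            recipient = models.ForeignKey('Entity', on_delete=models.CASCADE)
            invitee_email = models.EmailField()
            invitee_phone = models.CharField(max_length=32, blank=True)
            message_content = models.TextField()
            token = models.UUIDField(default=uuid.uuid4, unique=True)  # one-time URL key
            expires_at = models.DateTimeField()
            redeemed = models.BooleanField(default=False)

            def redeem(self, new_user):
                """Convert this invitation into a regular UserMessage once the
                invitee has completed registration, so the data is not lost."""
                if self.redeemed or timezone.now() > self.expires_at:
                    raise ValueError("invitation expired or already used")
                UserMessage.objects.create(
                    sender=self.sender,
                    recipient=new_user,
                    message_content=self.message_content,
                )
                self.redeemed = True
                self.save()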

    Read the article

  • For a Javascript library, what is the best or standard way to support extensibility

    - by Michael Best
    Specifically, I want to support "plugins" that modify the behavior of parts of the library. I couldn't find much information on the web about this subject, but here are my ideas for how a library could be extensible.

    1. The library exports an object with both public and "protected" functions. A plugin can replace any of those functions, thus modifying the library's behavior. Advantages of this method are that it's simple and that the plugin's functions have full access to the library's "protected" functions. Disadvantages are that the library may be harder to maintain with a larger set of exposed functions, and it could be hard to debug if multiple plugins are involved (how do you know which plugin modified which function?).

    2. The library provides an "add plugin" function that accepts an object with a specific interface. Internally, the library will use the plugin instead of its own code where appropriate. With this method, the internals of the library can be rearranged more freely as long as it still supports the same plugin interface. This could also support having different plugin interfaces to modify different parts of the library. A disadvantage of this method is that the plugins may have to re-implement code that is already part of the library, since the library's internal functions are not exported.

    3. The library provides a "set implementation" function that accepts an object inherited from a specific base object. The library's public API calls functions in the implementation object for any functionality that can be modified, and the base implementation object includes the core functionality, with both external (to the API) and internal functions. A plugin creates a new implementation object, which inherits from the base object and replaces any functions it wants to modify. This combines advantages and disadvantages of both of the other methods.
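
    Editor's note: for concreteness, a small sketch (illustrative only; all names are made up) of the second approach, where plugins register objects that implement a known interface:

        // Library side: keep a registry of plugins and consult it before the built-in code.
        var myLib = (function () {
            var plugins = [];

            function addPlugin(plugin) {
                if (typeof plugin.canHandle !== 'function' || typeof plugin.handle !== 'function') {
                    throw new Error('plugin must implement canHandle() and handle()');
                }
                plugins.push(plugin);
            }

            function defaultProcess(item) {
                return item;                              // stand-in for the library's own logic
            }

            function process(item) {
                for (var i = 0; i < plugins.length; i++) {
                    if (plugins[i].canHandle(item)) {
                        return plugins[i].handle(item);   // plugin overrides the default behaviour
                    }
                }
                return defaultProcess(item);
            }

            return { addPlugin: addPlugin, process: process };
        }());

        // Plugin side: supply an object implementing the agreed interface.
        myLib.addPlugin({
            canHandle: function (item) { return typeof item === 'string'; },
            handle: function (item) { return item.toUpperCase(); }
        });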

    Read the article

  • How should we deal with multiple transaction-report requests?

    - by Mithir
    We are developing a system for the retail market, one of whose features will enable clients (actually consumer clubs) to go through all transactions made by end-clients. One of the ways to get this information will be via an API. The idea is that there will be requests for reports with a start date and an end date, and the response will contain all the transactions between those dates. We are worried that some reports may be very large, and that some clients will repeatedly request reports; in that case the DB and CPU will be heavily overloaded. The same server that will service those requests also takes care of the actual retail transactions (received by proprietary devices) and a Web application. We are not sure how to limit the report requests from the API so that they won't affect the system too much. So, how should we deal with this scenario? Any thoughts? EDIT, just to make clear: when I mentioned proprietary devices, I meant "on-location" devices which are used during sales with end-clients. This means that those requests shouldn't get delayed, and that is the main concern.

    Read the article

  • Should I use heroku or should I have my own ssl? [closed]

    - by user1744649
    Based on your experience, can you please advise what would be better for me?

    Issue: I build applications and there are two major constraints.
    1. SSL is needed since I use the Facebook APIs, so only Heroku seems like a good option.
    2. My web components tend to hit Max_Execution_Time very often, since I pull a lot of data using the Facebook API.

    Future possible purposes of this site:
    1. Will use more APIs from Google, Twitter, etc. in the future.
    2. Might ask for donations.
    3. Just for hobby.

    I have two options:
    1. Create the web site on Heroku itself, converting all the PHP components to a background worker in Python using Django.
    2. Don't use Heroku at all; do the complete hosting with GoDaddy (shared plan) and buy an SSL certificate so that I can use the Facebook APIs, etc.

    In this scenario, what do you suggest I do?

    Read the article

  • Upcoming Events

    - by MOSSLover
    Here is a list of the events I will be speaking at in the next few months (yes, I am a sado-masochist):

    SharePoint Saturday Atlanta, Saturday, April 17th
    SharePoint .org Conference, April 18th-20th
    TEC, April 25th-28th
    SharePoint Saturday Huntsville, Saturday, May 1st
    SharePoint Saturday Philadelphia, Saturday, May 8th
    SharePoint Saturday DC, Saturday, May 15th
    SharePoint Saturday Ozarks, Saturday, June 13th

    I am trying to organize a SharePint at TEC after the LA SPUG meeting with Joel on Tuesday, April 27th. Does anyone know if there is a good spot around the JW Marriott in downtown LA? Maybe we can get a bunch of local SharePointers to attend.

    Technorati Tags: SharePoint Events, SharePoint Saturdays, SharePoint Conferences

    Read the article

  • ASP.NET MVC 4/Web API Single Page App for Mobile Devices ... Needs Authentication

    - by lmttag
    We have developed an ASP.NET MVC 4/Web API single-page, mobile website (also using jQuery Mobile) that is intended to be accessed only from mobile devices (e.g., iPads, iPhones, Android tablets and phones, etc.), not desktop browsers. This mobile website will be hosted internally, like an intranet site. However, since we're accessing it from mobile devices, we can't use Windows authentication. We still need to know which user (and their role) is logging in to the mobile website app.

    We tried simply using ASP.NET's forms authentication and membership provider, but couldn't get it working exactly the way we wanted. What we need is for the user to be prompted for a user name and password only the first time they access the site on their mobile device. After they enter a correct user name and password and have been authenticated once, each subsequent time they access the site they should just go right in. They shouldn't have to re-enter their credentials (i.e., something needs to be saved locally to each device to identify the user after the first time).

    This is where we had trouble. Everything worked as expected the first time. That is, the user was prompted to enter a user name and password, and, after doing that, was authenticated and allowed into the site. The problem is that every time after the browser was closed on the mobile device, the device and user were not known and the user had to re-enter their user name and password.

    We tried lots of things too. We tried setting persistent cookies in JavaScript. No good; the cookies weren't there to be read the second time. We tried manually setting persistent cookies from ASP.NET. No good. We, of course, used FormsAuthentication.SetAuthCookie(model.UserName, true); as part of the forms authentication framework. No good. We tried using HTML5 local storage. No good. No matter what we tried, if the user was on a mobile device, they would have to log in every single time. (Note: we've tried on an iPad and iPhone running both iOS 5.1 and 6.0, with Safari configured to allow cookies, and we've tried on Android 2.3.4.)

    Is there some trick to getting a scenario like this working? Or do we have to write some sort of custom authentication mechanism? If so, how? And what? Or should we use something like claims-based authentication and WIF? Any help is appreciated. Thanks!
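
    Editor's note: one thing worth double-checking (a sketch only, not a confirmed fix for the mobile Safari behaviour described above): a forms-authentication cookie is only persisted for as long as the ticket's expiration allows, which defaults to the timeout value in web.config. Building the ticket and cookie explicitly makes the persistence and expiry visible:

        // Illustrative sketch: issue the forms-authentication cookie manually so the
        // persistence flag and the expiration date are explicit rather than inherited
        // from the web.config <forms timeout="..."> default.
        var ticket = new FormsAuthenticationTicket(
            1,                              // version
            model.UserName,                 // user name from the login model
            DateTime.Now,                   // issue time
            DateTime.Now.AddDays(90),       // explicit, far-future expiration
            true,                           // isPersistent
            string.Empty);                  // userData

        var cookie = new HttpCookie(FormsAuthentication.FormsCookieName,
                                    FormsAuthentication.Encrypt(ticket))
        {
            Expires = ticket.Expiration,    // without an Expires value the cookie is session-only
            HttpOnly = true,
            Path = FormsAuthentication.FormsCookiePath
        };

        Response.Cookies.Add(cookie);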

    Read the article

  • IE Tab 2 vs IE Tab Plus

    - by johnthexiii
    I'm wondering what the differences are, especially as they pertain to SharePoint. Why should I pick one over the other, or does it matter? Also, are there any Linux browser solutions for SharePoint besides Wine + IE 6?

    Read the article

  • How to set up loopback on my Windows Server 2008 Virtual PC VM?

    - by user39846
    I am trying to set up a virtual SharePoint environment for development and need to be able to access my SharePoint sites on my host machine. From hours of Google research, I gather that I must set up loopback, but I haven't been able to get it to work and can't find a detailed guide. Can anyone please post details on how to set this up?

    Read the article
