Search Results

Search found 14829 results on 594 pages for 'sharepoint api'.

Page 33/594 | < Previous Page | 29 30 31 32 33 34 35 36 37 38 39 40  | Next Page >

  • Compiling error with API 10

    - by Suhel De Corser
    I was following the basic tutorials on developers.android.com and got to the step of creating the activity named DisplayMessageActivity. It is a blank activity with all the specifications given in the tutorial. FYI, I am using Min SDK = API 8, Target SDK = API 16, Compile with = API 10. The build produces two errors: "The method getActionBar() is undefined for the type DisplayMessageActivity" and "home cannot be resolved or is not a field". I tried changing the compile API to 14, which raised another problem: it wants the minimum API to be 11. Raising it solves these errors, but the real problem is that so many devices still run Gingerbread or even Froyo. Can't I write for them? Do I have to require a higher API level? How do I support them?

    Read the article

  • A Console Application or Windows Application in VS 2010 for SharePoint 2010: A Common Error

    - by Gino Abraham
    I have seen many SharePoint newbies cracking their heads trying to create a console/Windows application in VS 2010 and make it talk to a SharePoint 2010 server. I had the same problem when I started with SharePoint. It is important to remember that SharePoint 2010 is based on .NET Framework version 3.5, not version 4.0. When you create a console/Windows application in VS 2010, make sure you select .NET Framework 3.5 in the New Project dialog. If you missed it while creating the project, go to the Application tab of the project properties and verify that .NET Framework 3.5 is selected as the target framework. Now that you have selected the correct framework, will it work? No: if the application is configured as x86, it will not work. SharePoint is a 64-bit application, and a Windows application that talks to SharePoint through the server object model must also be 64-bit. Go to Configuration Manager and select x64. If x64 is not available, select <New…>, and in the New Solution Platform dialog box choose x64 as the new platform, copying settings from x86 and checking the "Create new project platforms" check box. This does not apply if you are writing a console application that talks to SharePoint with the Client Object Model.
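    For illustration, a minimal sketch of the kind of console program this advice applies to; the server object model call only works once the project targets .NET 3.5 and the x64 platform (the site URL below is a placeholder, not from the post):

        using System;
        using Microsoft.SharePoint;   // server object model; must run on a machine in the SharePoint farm

        class Program
        {
            static void Main()
            {
                // Placeholder URL - any SharePoint 2010 site on the local farm
                using (SPSite site = new SPSite("http://sp2010"))
                using (SPWeb web = site.OpenWeb())
                {
                    Console.WriteLine(web.Title);
                }
            }
        }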

    Read the article

  • Read-only lock on a SharePoint site collection, or Why can't I edit anymore?

    - by PeterBrunone
    Monday morning, the calls started.  For some reason, long-time users were unable to edit list items.  I figured we had a permissions issue, so I popped in to look at the Site Settings -- and found that I couldn't.  A quick trip to Central Administration showed that I was still listed as a Site Collection Administrator, but I had no power at all on the site collection in question.  A quick glance at the logs told me that the server had recently shut down unexpectedly (this is a Hyper-V virtual machine).  Apparently, in the confusion, somehow SharePoint decided to lock the site collection as Read Only.  This can be remedied in one of two ways:
    1) In Central Administration, go to Application Management -> SharePoint Site Management -> Site collection quotas and locks.  Once you have arrived, select the correct application and site collection, and you will have the opportunity to view and set the lock status of the collection (it most likely will be set to "Read-only", and you'll want to move that radio button to "Not locked").
    2) Fire up stsadm and issue the following command: stsadm -o setsitelock -url http://myportalsitecollection -lock none
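    A third, programmatic route (sketched here for illustration; it reuses the same placeholder URL as the stsadm example and assumes the Microsoft.SharePoint server object model is available) is to clear the lock in code:

        using System;
        using Microsoft.SharePoint;

        class UnlockSiteCollection
        {
            static void Main()
            {
                using (SPSite site = new SPSite("http://myportalsitecollection"))
                {
                    // Clearing the read-only flag releases the lock left behind by the unexpected shutdown
                    site.ReadOnly = false;
                }
            }
        }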

    Read the article

  • Slides, Materials, and Pictures from SharePoint Saturday Virginia Beach 2011

    - by Brian Jackett
    This past weekend I presented “Managing SharePoint 2010 Farms with PowerShell” and “SharePoint 2010 and Integrating Line of Business Applications” at SharePoint Saturday Virginia Beach.  A big thanks to everyone who attended my sessions.  I had a great time presenting, getting to meet new folks, and exploring a little bit of the local life.  Below are slides, materials, and pictures from the event.  Let me know if you have any comments, questions, or feedback.  Thanks.
    Slides and Materials: Managing SharePoint 2010 Farms with PowerShell; SharePoint 2010 and Integrating Line of Business Applications
    Photos: Pictures on Facebook (Click Here); Pictures on Windows Live (higher res) - SharePoint Saturday Virginia Beach Jan 2011
    Side Note: SavePSToSP CodePlex Project - During my “Managing SharePoint 2010 Farms with PowerShell” session I made mention of a CodePlex project I am working on called SavePSToSP.  Click here for the link to that project.  I have been pushing out updates roughly once a month or more.  If you have any feedback or find it helpful feel free to let me know.         -Frog Out

    Read the article

  • SharePoint 2013 Developer Ramp-Up - Part 2

    As stated yesterday, today I continued with the available course material on Pluralsight. The second part of the series covers interesting topics, though not the field I'm going to work in later. During the course you get a lot of information about how to create and deploy SharePoint solutions and hosted SharePoint apps.
    Today's resource(s): Apart from some blog articles, I watched the following course today: SharePoint 2013 Developer Ramp-Up - Part 2 - Developing SharePoint Solutions and Apps. Not thrilling, but still two solid hours to get through.
    Takeaway: One of the coolest aspects I figured out today is that SharePoint development can be done in either JavaScript or C# - just as you like or prefer. It's pretty cool to see that you can integrate external JS libraries like datajs, knockout.js and so forth in order to implement your solution. Another takeaway is that you should be very familiar with Microsoft PowerShell, not only to simplify some repetitive work but also to be able to get things going in SharePoint at all. Having a decent background in Linux, I find this pretty amusing and remember the initial baby steps when PowerShell was introduced some years back (note: German-language link). The outcry as well as the hype was too funny. Honestly, I have mixed feelings about today's progress. There was certainly interesting information about developing extensions directly for and in SharePoint... I'll leave it for now; it might be helpful someday.
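    To make the JavaScript-or-C# point concrete, here is a hedged C# sketch using the client-side object model (CSOM); the site URL is a placeholder, and the JavaScript object model exposes an equivalent API:

        using System;
        using Microsoft.SharePoint.Client;

        class CsomDemo
        {
            static void Main()
            {
                // Placeholder URL for a SharePoint 2013 developer site
                using (ClientContext ctx = new ClientContext("http://sp2013/sites/dev"))
                {
                    Web web = ctx.Web;
                    ctx.Load(web, w => w.Title);   // ask only for the properties we need
                    ctx.ExecuteQuery();            // one round trip to the server
                    Console.WriteLine(web.Title);
                }
            }
        }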

    Read the article

  • Difference between an FQL query and Graph API object access

    - by jwynveen
    What's the difference between accessing user data with the Facebook Graph API (http://graph.facebook.com/btaylor) and using the Graph API to make an FQL query for the same user (https://api.facebook.com/method/fql.query?query=QUERY)? Also, does anyone know which of them the Facebook Developer Toolkit (for ASP.NET) uses? The reason I ask is that I'm trying to access the logged-in user's birthday after they begin a Facebook Connect session on my site, but when I use the toolkit it doesn't return it. However, if I make a manual call to the Graph API for that user object, it does return it. It's possible I have something wrong in my call from the toolkit. I think I may need to include the session key, but I'm not sure how to get it. Here's the code I'm using:
        _connectSession = new ConnectSession(APPLICATION_KEY, SECRET_KEY);
        try
        {
            if (!_connectSession.IsConnected())
            {
                // Not authenticated, proceed as usual.
                statusResponse = "Please sign-in with Facebook.";
            }
            else
            {
                // Authenticated, create API instance
                _facebookAPI = new Api(_connectSession);

                // Load user
                user user = _facebookAPI.Users.GetInfo();
                statusResponse = user.ToString();
                ViewData["fb_user"] = user;
            }
        }
        catch (Exception ex)
        {
            // An error happened, so disconnect session
            _connectSession.Logout();
            statusResponse = "Please sign-in with Facebook.";
        }

    Read the article

  • API Wrapper Architecture Best Practice

    - by Adam Taylor
    I'm writing a Perl wrapper module around a REST web service and I'm hoping for some advice on how best to architect the module. I've been looking at a couple of different Perl modules for inspiration. Flickr::Simple2 is basically one big file with methods wrapping the different methods in the Flickr API, e.g. getPhotos() etc. Flickr::API is a subclass of another module (LWP) for making HTTP requests; it doesn't define any wrapper methods itself, but instead exposes generic methods, e.g. request() / response(), that take an API method name as an argument and construct the correct API call. An alternative design would be like the first one described, but less monolithic, with separate classes for separate "areas" of the API. I'd like to follow modern, best-practice Perl, so I'm using Dist::Zilla to build the module and Moose for the OO side, but I'd appreciate some input on how to actually design/architect my wrapper. Guides/tutorials or pointers to other well-designed modules would be appreciated. Cheers

    Read the article

  • Does anyone know of a train-timetable API service?

    - by DaNieL
    Hi guys, I'm wondering if there is any online service that provides an API for train timetables (arrivals, departures, etc.), at least for European stations. I know www.bahn.de, which provides accurate timetables for many European countries, but I didn't find anything resembling an API service there. My goal (well, just a future project) is to develop an application like Google Maps, but instead of giving the car trip I would like to give the train trip: the user sets the departure day, time and station, then the first arrival, maybe then adds another leg, and so on. So, is there anything like that which can be queried from PHP/Python/JavaScript? Edit: if it helps, I'm hoping to build a service to plan an Interrail trip (and any help will be really appreciated). I'm stuck. Google's public-transport API relies on the information that each local company makes available, and many branches are missing; for example, it is almost impossible to use it to build an itinerary that touches two or more regions of Europe. As I said before, the only service I know that does this well is bahn.de - they really have all the data, but no API. I tried to parse their result pages (in order to use them as an API), but it seems the markup has been built exactly to prevent that. Maybe they have business plans behind it or whatever, but I don't think they will ever release an API, so my project is going on without this feature (P.S.: my project is for non-profit cultural organizations, we won't make war on anyone ;P). @El Goorf: if you find a way and consider sharing it, count on my help if you need it!

    Read the article

  • Throttling outbound API calls generated by a Rails app

    - by Sharpie
    I am not a professional web developer, but I like to wrench on websites as a hobby. Recently, I have been playing with developing a Rails app as a project to help me learn the framework. The goal of my toy app is to harvest data from another service through their API and make it available for me to query using a search function. However, the service I want to pull data from imposes a rate limit on the number of API calls that may be executed per minute. I plan on having my app run a daily update which may generate a burst of API calls that far exceeds the limit provided by the external service. I wish to respect the performance of the external site and so would like to throttle the rate at which my app executes the calls. I have done a little bit of searching and the overwhelming amount of tutorial material and pre-built libraries I have found cover throttling inbound API calls to a web app and I can find little discussion of controlling the flow of outbound calls. Being both an amateur web developer and a rails newbie, it is entirely possible that I have been executing the wrong searches in the wrong places. Therefore my questions are: Is there a nice website out there aggregating Rails tutorials that has material related to throttling outbound API requests? Are there any ruby gems or other libraries that would help me throttle the requests? I have some ideas of how I might go about writing a throttling system using a queue-based worker like DelayedJob or Resque to manage the API calls, but I would rather spend my weekends building the rest of the site if there is a good pre-built solution out there already.

    Read the article

  • Rate Limit Calls To Api Using Cache

    - by namtax
    I am using ColdFusion to call the last.fm API, using a CFC bundle sourced from here. I am concerned about going over the request limit, which is 5 requests per originating IP address per second, averaged over a 5-minute period. The CFC bundle has a central component which calls all the other components, which are split up into sections like "artist", "track", etc. This central component, "lastFmApi.cfc", is initiated in my application and persisted for the lifespan of the application:
        // Application.cfc example
        <cffunction name="onApplicationStart">
            <cfset var apiKey = '[your api key here]' />
            <cfset var apiSecret = '[your api secret here]' />
            <cfset application.lastFm = CreateObject('component', 'org.FrankFusion.lastFm.lastFmApi').init(apiKey, apiSecret) />
        </cffunction>
    Now if I want to call the API through a handler/controller, for example my artist handler, I can do this:
        <cffunction name="artistPage" cache="5 mins">
            <cfset qAlbums = application.lastFm.user.getArtist(url.artistName) />
        </cffunction>
    I am a bit confused about caching. I am caching each call to the API in this handler for 5 minutes, but does that make any difference? Each time someone hits a new artist page, won't it still count as a fresh hit against the API? Wondering how best to tackle this. Thanks

    Read the article

  • How to hide helper functions from a public API in C

    - by emge
    I'm working on a project and I need to create an API. I am using sockets to communicate between the server (my application) and the clients (the other applications using my API). This project is in C, not C++. I come from a Linux background and this is my first project using Windows, Visual Studio 2008, and DLL libraries. I have communication working between the client and server, but I have some code that is duplicated in both projects. I would like to create a library (probably a DLL) that both projects can link to, so I don't have to maintain the duplicate code. I also have to create the library that exposes the API for my clients. The public API functions call these "duplicated code" helper functions; I don't want to expose the helpers to my clients, but I do want my server to be able to use them. How can I do this? I will try to clarify with an example. This is what I started with:
        Server Project:
        int Server_GetPacket(SOCKET sd);
        int ReceiveAll(SOCKET sd, char *buf, int len);
        int VerifyLen(char *buf);

        Client Project:
        int Client_SendCommand(int command);
        int Client_GetData(int command, char *buf, int len);
        int ReceiveAll(SOCKET sd, char *buf, int len);
        int VerifyLen(char *buf);
    This is kind of what I would like to end up with:
        // Server Project:
        int Server_GetPacket(SOCKET sd);

        // library with public and private types

        // private API (not exposed to my client)
        int ReceiveAll(SOCKET sd, char *buf, int len);
        int VerifyLen(char *buf);

        // public API (header file available for client)
        int Client_SendCommand(int command);
        int Client_GetData(int command, char *buf, int len);
    Thanks, any help would be appreciated.

    Read the article

  • How difficult is it to write our own Robots API, similar to the Google Wave Robots API? Please read the details

    - by user169650
    Consider the following entities:
    a) my own wave server
    b) my own Robots API
    c) Tomcat
    d) the Google Wave server or any other wave server
    Assume that a) and d) interact with one another via the Google Wave federation protocol. Now, I want to write my own Robots API in Java (similar to the Google Wave Robots API) with which I want to create robots, which I want to host in entity c), which may in turn connect to a) for listening to events and responding with operations. Assume that a) is already in place, i.e. implemented. Also assume that the robot running on Tomcat and entity a) are co-located, so that we do not need to use JSON-RPC for receiving events/sending operations; instead we can use Java interfaces. My questions are:
    1. How much of an effort is it to write my own Robots API to run in a Tomcat container?
    2. What are the salient points to take care of? Am I missing some important point here?
    3. How can I reuse some of the existing classes/packages/interfaces (e.g. com.google.wave.api.AbstractRobot, com.google.wave.api.event) with little or no change?

    Read the article

  • SharePoint Saturday Michigan 2010 Recap, Slides, and Photos

    - by Brian Jackett
    This past weekend I attended SharePoint Saturday Michigan (SPSMI) in Ann Arbor, Michigan.  For those unfamiliar, SharePoint Saturday is a community driven event where various speakers gather to present at a FREE conference on all topics related to SharePoint.  This made my third SharePoint Saturday attended and second I’ve spoken at.  I believe today it was announced that about 210 people total attended the event.  I was very happy with the turnout, especially the ratio of male to female attendees.  Typically with computer related conferences the ratio leans towards more males attending, but both Peter Serzo (one of conference organizers) and I both commented to each other that at the end of the day it appeared to be close to 40% women in the crowd.  So here’s my recap of the weekend. Arrival     Friday afternoon I drove up from Columbus, OH to Ann Arbor, MI and arrived around 4pm.  I was attempting to avoid the rush hour traffic and construction backups.  Turned out to be a good idea because other speakers coming up Friday got stuck on a highway which literally closed down in both directions due to a bad accident.  I was talking my friend Sean McDonough through the highway closing and this was the first time I had seen a solid black traffic line on Google Maps.  Most of us are familiar with Green, Yellow, and Red, but this line was black if that tells you how bad it got. Speaker “Dinner”     Fast forward a few hours and it was time for the speaker “dinner.”  I put “dinner” in quotes because with this night alone SPSMI set a new bar for nicest and most extravagant speaker appreciation events for SharePoint Saturday.  By tapping into some very influential contacts, the conference organizers were able to provide a truck limo (yep you heard right) with refreshments, access to an underground suite at the Palace of Auburn Hills, and courtside tickets to see the Detroit Pistons play that night.  Being a Michigan native I have to say that I was absolutely floored by this experience and very thankful to our conference organizers Peter, Sebastian, and Jesse along with Trillium Teamologies. Sessions     The actual conference started Saturday morning at 9am with the keynote by Rob Collie who is the Microsoft program manager for PowerPivot.  The day continued and I attended the following sessions: Mike Watson (@mikewat) – “SharePoint 2010 Fight Night: Devs vs. Admins” Karl Swedeberg (@kswedberg) – “A Walk on the Client Side with jQuery“ [my session] Brian Jackett (@briantjackett) - “Real World Deployment of SharePoint 2007 Solutions” Jeff Willinger (@jwillie) - “Social Computing and Collaboration Inside and Outside the 4 Walls” Paul Schaeflein (@paulschaeflein) – “PowerShell for the SharePoint Developer” My Presentation     I had a great time presenting my session on Deploying SharePoint 2007 Solutions, but it wasn’t without its fair share of technical issues.  As my session was right after lunch I came in to my room 10 mins early to set up my laptop, slides, and demos.  As a quick background note, a few months ago I got an upgraded laptop from my company Sogeti and have been dual booting it between XP (factory installed) and Windows Server 2008 R2 w/ Hyper-V.  As such I had prepared all of my demo virtual machines to run under Hyper-V.  
About 3 minutes before my session was scheduled to start though it became apparent that I did not have the correct display drivers to connect Windows Server 2008 R2 to the projector…     As you can imagine this was a slight cause for concern as I was potentially going to be unable to give my presentation.  Luckily for me I usually prepare for such unforeseen issues and had my presentation and some spare VMs that would run on XP on my external hard drive.  Knowing this I rebooted my machine into XP and began my presentation without slides until about 5 mins into the session when everything was up and running on XP.  Despite this being the first time I gave this presentation I have to say it was one of my favorites I’ve given so far.  The audience was very engaged in the session and I received some great, positive feedback afterwards.  Thanks to all who attended my session, I appreciate it very much. Link to Presentation Files     For those of you who attended my session and would like my slides or demo PowerShell scripts they can be found on my SkyDrive at the link below.  Also, if you have a few minutes and wouldn’t mind rating my session I have this session posted on SpeakerRate.  As speakers we always appreciate any and all feedback attendees offer, so thank you if you are able to provide any. SkyDrive folder with session files Rate my SharePoint 2007 Solutions session   Picture Albums     For everyone else, here are my pictures from the weekend.  The first link is to my FaceBook album which will have tagging (recommend this one.)  The second is to my Live album if you care for higher resolution images. http://www.facebook.com/album.php?aid=2154482&id=21905041&l=a3fb72ee8c View Full Album Conclusion     A big thank you goes out to all of the organizers, speakers, sponsors, and attendees of SPSMI.  As I’ve said so many times, without each and every one of you these events wouldn’t be possible.  I thoroughly enjoyed this trip back to my home state and presenting a new session.  For those interested in my upcoming schedule I will be giving two sessions on PowerShell at SharePoint Saturday Charlotte in April, helping plan Stir Trek: Iron Man Edition in May, and I’m submitting sessions to Day of .Net Ann Arbor in May as well.  Beyond that I haven’t planned out any travels.  Thanks for reading my recap.  Look forward to more technical posts now that I have a short break in conferences.         -Frog Out   links: Michigan image

    Read the article

  • SharePoint Upgrade Global Nav Quirks?

    - by elorg
    We're working on a parallel install/upgrade of SharePoint. The client has WSS 2003 on some old hardware. We've installed MOSS 2007 in a medium farm environment. They want to use this as an opportunity to not just upgrade and use the new features, but to also better organize their content and categorize between different site collections. To accommodate, we've created a few site collections per their specifications in the new environment, and when we ran an upgrade test run we ran into a few .. quirks. We made a backup of the old content database, copied it over to the new environment and restored it as a new database. Created a new web app and attached the migrated data to do an in-place upgrade in this new "test" area. This seems pretty standard - no issues. We have to do a little bit of cleanup (e.g. reset pages to site definition, reset themes, and inherit the global nav / top link bar, etc.). Once that's done, we're using stsadm export/import to copy the individual sites over to their ultimate destinations in the various different site collections. So far so good. But then we ran into one particular site that has a link to an .aspx page in the top link bar in WSS 2003 that's not behaving properly after the upgrade. It's just a link to a "dashboard" .aspx page in a doc library - nothing special. It doesn't seem to matter what we do, or what order we do it (in the "test" web app, in the destination web app, or both). In the end, this ONE site will not allow us to create a link/tab in the global nav. It can inherit the global nav just fine. We can break the inheritance just fine. But if we want to manually add a link in the top link bar - we go through the steps that I've done 1,000x before and click OK - and the tab never appears. It doesn't matter if it's to a page within the site itself, or to Google. We can migrate over other sites into the same site collection and add a tab without issue. If we migrate this quirky site over to another site collection we run into the same issue. Yet, in the "test" web app that we're using to upgrade the data we can add a tab? If we add the tab before we export/import to the final destination, the tab is lost during the process? Has anyone run into anything like this? Any ideas? I've tried every combination of everything that I can think of and nothing works. Unless we can figure out how to get this to work, we're going to just add this tab to the global nav for the entire site collection and inherit it for this site (but that adds the link to all of the site that will inherit, which is both a pro & con for them).
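    One way to take the UI out of the equation while troubleshooting (an illustrative sketch only; the URL, web name and link target are placeholders, not values from the post) is to add the top link bar entry through the server object model and see whether it persists:

        using Microsoft.SharePoint;
        using Microsoft.SharePoint.Navigation;

        class AddTopNavLink
        {
            static void Main()
            {
                using (SPSite site = new SPSite("http://portal/sites/upgraded"))   // placeholder
                using (SPWeb web = site.OpenWeb("quirkysite"))                     // placeholder web
                {
                    web.Navigation.UseShared = false;   // manage the top link bar locally, not inherited

                    SPNavigationNodeCollection topNav = web.Navigation.TopNavigationBar;
                    // isExternal = true skips URL validation, handy for page URLs inside doc libraries
                    SPNavigationNode node = new SPNavigationNode("Dashboard",
                        "/quirkysite/DashboardLib/dashboard.aspx", true);
                    topNav.AddAsLast(node);
                    web.Update();
                }
            }
        }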

    Read the article

  • SPException: Catastrophic failure (Exception from HRESULT: 0x8000FFFF (E_UNEXPECTED)) in SharePoint

    - by BeraCim
    Hi all: I've been trying to programmatically copy a custom content type and its custom columns from one web to another for some time now, and every time I get some sort of error or exception. After yet more tries, I received another strange and cryptic exception from SharePoint after clicking on a newly copied custom column in a custom content type. I checked the logs, and this is what I got:
        Failed to find the content type schema for ct-1033-0x1000blahblahblahcontenttypeId while caching feature data.
        Unknown SPRequest error occurred. More information: 0x8000ffff
        Unable to locate the xml-definition for CType with SPContentTypeId '0x0100MorecontenttypeId', exception: Microsoft.SharePoint.SPException: Catastrophic failure (Exception from HRESULT: 0x8000FFFF (E_UNEXPECTED)) ---> System.Runtime.InteropServices.COMException (0x8000FFFF): Catastrophic failure...
        ... at Microsoft.SharePoint.Library.SPRequestInternalClass.GetGlobalContentTypeXml(String bstrUrl, Int32 type, UInt32 lcid, Object varIdBytes...
    It failed to find quite a few content type schemas. I'm confused about what SharePoint is trying to do here, and why the seemingly simple process of copying a custom content type from one web to another just doesn't work, in contrast to the information found on the web, e.g. this. I'd appreciate any help getting past this problem. Thanks.
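    For comparison, here is a hedged sketch of one way such a copy routine can be structured (the URLs and the content type name are placeholders; it assumes the parent content type and the underlying site columns already exist in the destination web):

        using Microsoft.SharePoint;

        class CopyContentType
        {
            static void Main()
            {
                using (SPSite site = new SPSite("http://server/sites/source"))    // placeholder
                using (SPWeb sourceWeb = site.OpenWeb())
                using (SPSite destSite = new SPSite("http://server/sites/dest"))  // placeholder
                using (SPWeb destWeb = destSite.OpenWeb())
                {
                    SPContentType sourceCt = sourceWeb.ContentTypes["My Custom Type"];   // placeholder name
                    SPContentType parent = destWeb.AvailableContentTypes[sourceCt.Parent.Id];

                    SPContentType copy = new SPContentType(parent, destWeb.ContentTypes, sourceCt.Name);
                    copy = destWeb.ContentTypes.Add(copy);

                    foreach (SPFieldLink link in sourceCt.FieldLinks)
                    {
                        if (copy.FieldLinks[link.Id] == null)
                        {
                            // Assumes the matching site column has already been provisioned in the destination
                            SPField destField = destWeb.AvailableFields.GetFieldByInternalName(link.Name);
                            copy.FieldLinks.Add(new SPFieldLink(destField));
                        }
                    }
                    copy.Update();
                }
            }
        }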

    Read the article

  • Creating a Sharepoint Development Environment from an Existing Production Environment

    - by Starky
    I have very little experience using Sharepoint but a good amount using Visual Studio 2008, SQL Server 2005, Windows Server 2003 and IIS6. I need to create a development environment for a SharePoint 2007 system that will be used internally. The system is already deployed over two servers - one of the servers simply holds the database and everything else is on the other server. We are also using WSS 3.0. I have created a Virtual Machine with all the required software including a clean installation of SharePoint Server 2007 and I wish to use this single Virtual Machine as the development environment. Right now there are no custom assemblies being used on the production server as far as I am aware. There are 3 websites, one over port 80 for user accesss, one over a custom port for central administration, and one over another custom port. Not sure what the last one is for but my blank instance of Sharepoint on my Virtual Machine also has something similar. I attempted to use the STSADM tool to backup and restore these 3 sites from my production environment to my development environment and while the operations completed succesfully, the central administration site in my development environment acted strangely and I could not access port 80 - I did not seem to have correct credentials for it. I suspected that it would not have been so simple so could I please have advice on how to create my development environment so that I can use it to deploy updates to the production one.

    Read the article

  • Consume a WCF Web Service in Sharepoint Services 3.0

    - by Filip Ekberg
    I've seen this question, and since it doesn't answer my question and its topic is fairly misleading in my opinion, I feel the urge to ask this again. I have a SharePoint web part that I deploy using Visual Studio SharePoint Tools 1.2 to a SharePoint Services 3.0 instance on my local Windows 2003 server. All works great; however, as soon as I add a WCF service it won't run the code. All I get is a "File not found" error. I've added this to my Web.config, which is a copy of App.config:
        <system.serviceModel>
          <bindings>
            <basicHttpBinding>
              <binding name="BasicHttpBinding_Services_xxxService" closeTimeout="00:01:00"
                       openTimeout="00:01:00" receiveTimeout="00:10:00" sendTimeout="00:01:00"
                       allowCookies="false" bypassProxyOnLocal="false"
                       hostNameComparisonMode="StrongWildcard" maxBufferSize="65536"
                       maxBufferPoolSize="524288" maxReceivedMessageSize="65536"
                       messageEncoding="Mtom" textEncoding="utf-8" transferMode="Buffered"
                       useDefaultWebProxy="true">
                <readerQuotas maxDepth="32" maxStringContentLength="8192" maxArrayLength="16384"
                              maxBytesPerRead="4096" maxNameTableCharCount="16384" />
                <security mode="None">
                  <transport clientCredentialType="None" proxyCredentialType="None" realm="" />
                  <message clientCredentialType="UserName" algorithmSuite="Default" />
                </security>
              </binding>
            </basicHttpBinding>
          </bindings>
          <client>
            <endpoint address="http://localhost:1196/xxxService.svc" binding="basicHttpBinding"
                      bindingConfiguration="BasicHttpBinding_ervices_xxxgService"
                      contract="xxxService.TestService" name="BasicHttpBinding_ServicesxxxService" />
          </client>
        </system.serviceModel>
    I cannot even do using(var proxy = new xxxService.TestService()); if that line is added, the new DLL is not deployed to SharePoint Services. Any suggestions? I also found this blog post and this forum thread, but I don't think they are too helpful.
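    One commonly suggested workaround, sketched here with an assumed contract interface and operation (the real generated proxy is the xxxService.TestService mentioned above), is to build the WCF client in code with ChannelFactory so that nothing has to be resolved from the SharePoint web.config at runtime:

        using System;
        using System.ServiceModel;

        // Assumed stand-in for the real service contract behind xxxService.svc
        [ServiceContract]
        public interface ITestService
        {
            [OperationContract]
            string Ping();
        }

        class WcfFromWebPart
        {
            static void Main()
            {
                BasicHttpBinding binding = new BasicHttpBinding();
                EndpointAddress address = new EndpointAddress("http://localhost:1196/xxxService.svc");

                ChannelFactory<ITestService> factory = new ChannelFactory<ITestService>(binding, address);
                try
                {
                    ITestService proxy = factory.CreateChannel();
                    Console.WriteLine(proxy.Ping());
                }
                finally
                {
                    factory.Close();
                }
            }
        }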

    Read the article

  • Efficient way to create a large number of SharePoint folders

    - by BeraCim
    Hi all: I'm currently creating a large number of SharePoint folders within a list (e.g. ~800 folders), with each folder containing a different number of items. The way it is currently done is that the code reads the content types, items, event receivers and the like from the corresponding folder in another web, then creates the same folder in the current web. That ran reasonably fast in a dev environment. However, when it runs in an environment with multiple WFEs and a real farm, it slows down a lot. I have checked that there are no leaks in the code and that the code follows SharePoint coding best practices. At the moment I'm looking at it at the code level. From your experience, are there any efficient ways of creating a large number of SharePoint folders, lists and items? EDIT: I'm currently using the SharePoint server API, but will look at moving to the web services in the future; I'm interested in both options though. Code-wise, it just reads a folder, its content types, and its items and their details, then creates the same folder in the same list with the same content types, then copies over the items using patch update. I want to know whether there are more efficient ways of doing the above. Thanks.
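    For the folder-creation part specifically, a lightweight approach (an illustrative sketch; the URL, list name and folder-name pattern are placeholders) is to add folders with SPList.AddItem and SPFileSystemObjectType.Folder, which avoids materialising the whole item collection on every add:

        using Microsoft.SharePoint;

        class BulkCreateFolders
        {
            static void Main()
            {
                using (SPSite site = new SPSite("http://server/sites/target"))   // placeholder
                using (SPWeb web = site.OpenWeb())
                {
                    SPList list = web.Lists["Shared Documents"];                  // placeholder list
                    for (int i = 0; i < 800; i++)
                    {
                        // Create the folder directly under the list's root folder
                        SPListItem folder = list.AddItem(list.RootFolder.ServerRelativeUrl,
                            SPFileSystemObjectType.Folder, "Folder" + i.ToString("D3"));
                        folder.Update();
                    }
                }
            }
        }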

    Read the article

  • Sharepoint user details not visible to other users

    - by richardoz
    I am managing a SharePoint site that uses Form Based Authentication. We have several generic lists, document libraries and active task lists that users can create update and delete. Users can use the people pickers to select/search for everyone. But the users cannot see other users names, email addresses etc. in display lists or the people pickers. If I log in as the site collection administrator, I can see everyones details. So I know the data is available. Updated details on this problem (non-administrators) SharePoint users cannot see other users information. Example: User A assigns a task to user B. User A creates a new task and uses the people picker to find user B. User B is only visible by the login name “bname” and any information about user B is not visible or searchable within the people picker. Once user B is assigned the task, user A no longer sees the name in the task list – even though user A created it. No modified by, created by, assigned to or owner field data is visible to non-administrator users. Facts: Extranet site is configured to use Forms Based Authentication. Intranet uses windows based authentication Users of both the intranet and extranet have the same problem All databases are local The site uses SSRS integration SharePoint WSS on Windows 2003 Std -- After activating the verbose logging it looks like SharePoint is definately asking SQL server for only the user info for the currently logged in user: SELECT TOP 6 /lots-of-columns/ FROM UserData INNER MERGE JOIN Docs AS t1 ON ( 1 = 1 AND UserData.[tp_RowOrdinal] = 0 AND t1.SiteId = UserData.tp_SiteId AND t1.SiteId = @L2 AND t1.DirName = UserData.tp_DirName AND t1.LeafName = UserData.tp_LeafName AND t1.Level = UserData.tp_Level AND t1.IsCurrentVersion = 1 AND (1 = 1) ) LEFT OUTER JOIN AllUserData AS t2 ON ( UserData.[tp_Author]=t2.[tp_ID] AND UserData.[tp_RowOrdinal] = 0 AND t2.[tp_RowOrdinal] = 0 AND ( (t2.tp_IsCurrent = 1) ) AND t2.[tp_CalculatedVersion] = 0 AND t2.[tp_DeleteTransactionId] = 0x AND t2.tp_ListId = @L3 AND UserData.tp_ListId = @L4 AND t2.[tp_Author]=162 /* this is the currently logged in user */ ) WHERE (UserData.tp_IsCurrent = 1) AND UserData.tp_SiteId=@L2 AND (UserData.tp_DirName=@DN) AND UserData.tp_RowOrdinal=0 AND ( ( (UserData.[datetime1] IS NULL ) OR (UserData.[datetime1] = @L5DTP) ) AND t1.SiteId=@L2 AND (t1.DirName=@DN) ) ORDER BY UserData.[tp_Modified] Desc, UserData.[tp_ID] Asc Again, any ideas would be appreciated.

    Read the article

  • win32 API vs Linux Kernel API

    - by Nik
    Is there anything that can be done using the Win32 API that cannot be done using the Linux kernel API? I'm asking because in the lab where I work we use Ixia and Agilent Technologies hardware, and these devices (signal processors and packet generators) worth tens of thousands of dollars run Windows as their OS. Why didn't they choose Linux? I've seen Linux in routers and firewalls but not in real heavy-duty hardware like Ixia's. This preference for Windows over Linux made me wonder: is there some limitation in the Linux API, or is it just a licensing thing?

    Read the article

  • Web-based SVG or JavaScript Org Chart or Tree Graph Plotting Visualization API

    - by asoltys
    Hi, I'm looking to build an interactive web-based org chart for a large organization. I somewhat like the interface at ancestry.com where you can hover over people and pan/zoom around and click on different nodes to make them the root. Ideally, I'd like it if people could belong to multiple organizational entities like committees, working groups, etc. In other words the API should support graphs in general, not just trees. I'd like to be able to visually explode each organizational substructure into substituents by clicking on it, with a nice animation of the employees ballooning or spilling out so you can really interactively drill down through the organization. I found http://code.google.com/apis/visualization/documentation/gallery/orgchart.html but it looks a bit rudimentary. I know there are desktop tools like OrgPlus and Visio that can build static charts but I'm really looking for a free, web-based API with open standards-based output like SVG or HTML5 Canvas elements rather than Flash or some proprietary output. Something I can embed into a custom web application and style myself. Something interactive.

    Read the article

  • Is there a way to allow administrators to change or reset user passwords?

    - by Jon Seigel
    We have a custom MembershipProvider implementation using forms-based authentication (FBA) under SharePoint 2007. I've searched high and low on Google, but only found:
    - Active Directory and FBA implementations that allow users to change their own password
    - Active Directory instructions (including video!) for administrators to change other users' passwords
    Have we missed an option to enable the latter under FBA? Should this work by default, and is the MembershipProvider misbehaving? The same procedure as under Active Directory would be ideal, but the "Change Password" link does not appear in the Edit User screen. We verified that the logged-in user is a site collection administrator.
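    If the provider itself supports it, the plumbing is already there in the ASP.NET Membership API. Here is a hedged sketch (the class, method and error handling are illustrative only, and it assumes enablePasswordReset="true" and requiresQuestionAndAnswer="false" on the custom provider) of what an admin-only page or web part could call:

        using System;
        using System.Web.Security;

        public static class FbaPasswordAdmin
        {
            // Resets another user's password to a value chosen by the administrator
            public static void AdminSetPassword(string userName, string newPassword)
            {
                MembershipUser user = Membership.GetUser(userName);
                if (user == null)
                {
                    throw new ArgumentException("No such membership user: " + userName);
                }

                string temporary = user.ResetPassword();       // provider generates a temporary password
                user.ChangePassword(temporary, newPassword);   // then swap in the chosen one
            }
        }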

    Read the article

  • How to update Sharepoint 2010 user profile for user whose account name has changed in AD?

    - by Daniel Root
    We have an issue with User Profile Sync in SharePoint 2010 when the following happens:
    1. A new user is added to AD (i.e. DOMAIN\jdoh)
    2. The user is synced successfully to SharePoint
    3. Time passes
    4. The user's account name is changed in AD (i.e. because it was originally misspelled: DOMAIN\jdoe)
    5. The user is re-synced to SharePoint
    The behavior appears to be that the account name is not changed. In the above example, the account name will continue to be DOMAIN\jdoh in SharePoint, though other properties are synced correctly - I would assume by SID. This means that the user's My Profile and My Site links still refer to the 'old' name (i.e. Person.aspx?accountname=DOMAIN\jdoh). What steps should be taken to fix this in SharePoint when an account name is changed in AD?

    Read the article

  • Moving distribution lists in Public Folders to Sharepoint Contacts?

    - by Mike
    I know that if I connect to a contact list in SharePoint (through Outlook) and drag and drop everything from the Exchange Public Folder contact list to the SharePoint contact list, it will transfer every contact. What about distribution lists? Has anyone found a workaround for this? If a contact list is full of distribution lists, the contacts won't migrate over, and the Sync Issues - Local Failures folder is populated with all the distribution lists that couldn't be migrated. Is there a way to migrate distribution lists? Any ideas on how to set up a SharePoint contact list like a public folder contact list of distribution lists? How would that contact list look in SharePoint? Should I just leave the contact lists that contain distribution lists in public folders?

    Read the article
