Search Results

Search found 5018 results on 201 pages for 'sharepoint workflow'.

Page 77/201 | < Previous Page | 73 74 75 76 77 78 79 80 81 82 83 84  | Next Page >

  • How do you access a secure website within a SharePoint web part?

    - by Bill
    How do you access a secure website from within a SharePoint web part? The following code works fine as a console application, but if you run it in a web part you get an access violation:

        WebRequest request = WebRequest.Create("https://somesecuresite.com");
        WebResponse firstResponse = null;
        try
        {
            firstResponse = request.GetResponse();
        }
        catch (WebException ex)
        {
            writer.WriteLine("Error: " + ex.ToString());
            return;
        }

    If you access a non-secure site, it also works. Any ideas?

        Error: System.Net.WebException: The underlying connection was closed: An unexpected error occurred on a receive. ---> System.AccessViolationException: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
        at System.Net.UnsafeNclNativeMethods.NativePKI.CertVerifyCertificateChainPolicy(IntPtr policy, SafeFreeCertChain chainContext, ChainPolicyParameter& cpp, ChainPolicyStatus& ps)
        at System.Net.PolicyWrapper.VerifyChainPolicy(SafeFreeCertChain chainContext, ChainPolicyParameter& cpp)
        at System.Net.Security.SecureChannel.VerifyRemoteCertificate(RemoteCertValidationCallback remoteCertValidationCallback)
        at System.Net.Security.SslState.CompleteHandshake()
        at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)
        at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)
        at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)
        at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)
        at System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequest asyncRequest)
        at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)
        at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)
        at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)
        at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)
        at System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequest asyncRequest)
        at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)
        at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)
        at System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest)
        at System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest)
        at System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequest asyncRequest)
        at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest)
        at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)
        at System.Net.Security.SslState.ForceAuthentication(Boolean receiveFirst, Byte[] buffer, AsyncProtocolRequest asyncRequest)
        at System.Net.Security.SslState.ProcessAuthentication(LazyAsyncResult lazyResult)
        at System.Net.TlsStream.CallProcessAuthentication(Object state)
        at System.Threading.ExecutionContext.runTryCode(Object userData)
        at System.Runtime.CompilerServices.RuntimeHelpers.ExecuteCodeWithGuaranteedCleanup(TryCode code, CleanupCode backoutCode, Object userData)
        at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state)
        at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
        at System.Net.TlsStream.ProcessAuthentication(LazyAsyncResult result)
        at System.Net.TlsStream.Write(Byte[] buffer, Int32 offset, Int32 size)
        at System.Net.PooledStream.Write(Byte[] buffer, Int32 offset, Int32 size)
        at System.Net.ConnectStream.WriteHeaders(Boolean async)
        --- End of inner exception stack trace ---
        at System.Net.HttpWebRequest.GetResponse()
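    One workaround that is often suggested for SSL failures like this (the certificate-chain check in the stack trace runs under the impersonated SharePoint user) is to make the call outside the impersonated context. The sketch below assumes that is the cause; it is not a verified fix for this exact access violation, and the URL is a placeholder:

        using System;
        using System.IO;
        using System.Net;
        using Microsoft.SharePoint;

        public static class SecureRequestSample
        {
            public static string FetchOverSsl(string url)
            {
                string result = null;

                // Run the request under the application pool identity instead of the
                // impersonated site visitor; the failing CertVerifyCertificateChainPolicy
                // call happens during certificate validation on the impersonated thread.
                SPSecurity.RunWithElevatedPrivileges(delegate()
                {
                    WebRequest request = WebRequest.Create(url);
                    using (WebResponse response = request.GetResponse())
                    using (StreamReader reader = new StreamReader(response.GetResponseStream()))
                    {
                        result = reader.ReadToEnd();
                    }
                });

                return result;
            }
        }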

    Read the article

  • What is a proper way to store site-level global variables in a SharePoint site?

    - by ccomet
    One thing that has driven me nuts about SharePoint 2007 is the apparent inability to have definable settings that apply specifically to a site or site collection itself, and not to the content. I mean, you have some pre-defined settings like the Site Logo, the Site Name, and various other things, but there doesn't appear to be anywhere to add new kinds of settings. The application I am working on needs to be able to create multiple kinds of "project site collections" that all follow a basic template, but have certain additional settings that apply specifically to that site collection and that one alone. In addition to the standard site name we also need to define the Project Number, the Project Name, and the Client Name. And given the requests of some of our clients, we also reach a point where we have to have configurable settings that alter how some of the workflows work, like whether files are marked with Letters or Numbers.

    Our current solution, which I'm hesitant about, has been to store an XML file on the SharePoint server. This file contains one node for each site collection, identified by the URL of the root site. Inside the node are all of the elements that need to be defined for that site collection. When we need them, we have to access the XML file (which always requires SPSecurity.RunWithElevatedPrivileges to access files right on the server) every time to load it and retrieve the data. There are a lot of automated processes which will have to do this, and I'm hesitant about the stability of this method when we reach hundreds of sites with thousands of files running tens of thousands of workflows, all wanting to access this file. Maybe they're unfounded worries, but I'd rather worry than risk everything breaking in a couple of years.

    I was looking into the SPWeb object and found the AllProperties hashtable. It looks like just the kind of thing which might work, but I don't know how safe it is to modify it. I read through both MSDN and the WSS SDK but found nothing that clarified adding completely new properties to AllProperties. Is it safe to use AllProperties for this kind of thing? Or is there yet another feature that I am missing, which could handle the concept of global variables at the site collection or site scope?
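    For reference, a minimal sketch of the AllProperties pattern being asked about, using the server object model (the URL and key names are placeholders; whether this is "safe" for custom settings is exactly the open question):

        using System;
        using Microsoft.SharePoint;

        class SiteSettingsSample
        {
            static void Main()
            {
                using (SPSite site = new SPSite("http://server/sites/project1"))
                using (SPWeb rootWeb = site.OpenWeb())
                {
                    // Write a custom, site-collection-scoped value into the property bag.
                    rootWeb.AllProperties["ProjectNumber"] = "P-1234";
                    rootWeb.Update();

                    // Read it back later from anywhere that can open the root web.
                    Console.WriteLine(rootWeb.AllProperties["ProjectNumber"]);
                }
            }
        }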

    Read the article

  • Why does my SharePoint web part event handler lose the sender value on postback?

    - by vishal shah
    I have a web part which is going to be part of a pair of connected web parts. For simplicity, I am just describing the consumer web part. This web part has 10 link buttons on it, and they are rendered in the Render method instead of CreateChildControls, as this web part will be receiving values based on input from the provider web part. Each LinkButton has text which is decided dynamically based on the input from the provider web part. When I click on any of the LinkButtons, the event handler is triggered, but the text on the LinkButton shows up as the one set in CreateChildControls. When I trace the code, I see that CreateChildControls gets called before the event handler (and I think that resets my LinkButtons). How do I get the event handler to show me the dynamic text instead? Here is the code...

        public class consWebPart : Microsoft.SharePoint.WebPartPages.WebPart
        {
            private bool _error = false;
            private LinkButton[] lkDocument = null;

            public consWebPart()
            {
                this.ExportMode = WebPartExportMode.All;
            }

            protected override void CreateChildControls()
            {
                if (!_error)
                {
                    try
                    {
                        base.CreateChildControls();
                        lkDocument = new LinkButton[101];
                        for (int i = 0; i < 10; i++)
                        {
                            lkDocument[i] = new LinkButton();
                            lkDocument[i].ID = "lkDocument" + i;
                            lkDocument[i].Text = "Initial Text";
                            lkDocument[i].Style.Add("margin", "10 10 10 10px");
                            this.Controls.Add(lkDocument[i]);
                            lkDocument[i].Click += new EventHandler(lkDocument_Click);
                        }
                    }
                    catch (Exception ex)
                    {
                        HandleException(ex);
                    }
                }
            }

            protected override void Render(HtmlTextWriter writer)
            {
                writer.Write("<table><tr>");
                for (int i = 0; i < 10; i++)
                {
                    writer.Write("<tr>");
                    lkDocument[i].Text = "LinkButton" + i;
                    writer.Write("<td>");
                    lkDocument[i].RenderControl(writer);
                    writer.Write("</td>");
                    writer.Write("</tr>");
                }
                writer.Write("</table>");
            }

            protected void lkDocument_Click(object sender, EventArgs e)
            {
                string strsender = sender.ToString();
                LinkButton lk = (LinkButton)sender;
            }

            protected override void OnLoad(EventArgs e)
            {
                if (!_error)
                {
                    try
                    {
                        base.OnLoad(e);
                        this.EnsureChildControls();
                    }
                    catch (Exception ex)
                    {
                        HandleException(ex);
                    }
                }
            }

            private void HandleException(Exception ex)
            {
                this._error = true;
                this.Controls.Clear();
                this.Controls.Add(new LiteralControl(ex.Message));
            }
        }
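    One pattern worth trying (a sketch based on the standard ASP.NET control lifecycle, not tested against this exact pair of web parts): keep creating the buttons in CreateChildControls, but move the dynamic text assignment out of Render, for example into an OnPreRender override added to the class above. Text assigned during Render is never saved to view state, so on postback the recreated buttons and the click handler only ever see the "Initial Text" value.

        protected override void OnPreRender(EventArgs e)
        {
            base.OnPreRender(e);
            EnsureChildControls();

            for (int i = 0; i < 10; i++)
            {
                // Assigned before view state is saved, so the value survives the postback
                // and lkDocument_Click sees the dynamic text on the sender.
                lkDocument[i].Text = "LinkButton" + i;   // or text derived from the provider part
            }
        }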

    Read the article

  • Review Board workflow for Mercurial repository

    - by pachanga
    At my company we are trying to add code review practices to our development process, and for that purpose we decided to use Review Board. While Review Board should work out of the box for Subversion, the workflow for Mercurial looks a little more involved. Firstly, it seems that only post reviewing (via the post-review script) is supported for this type of repo. Secondly, the documentation is unclear on whether post-review actually supports Mercurial (it only mentions git). Could you folks describe your workflow in detail please? Am I right in thinking it should be something like this (a command-line sketch follows below)?

    Developer:
      1. clone master repo
      2. clone feature repo from local master repo
      3. hack-hack in feature repo
      4. commit into feature repo
      5. somehow run post-review from feature repo against parent master repo

    Reviewer:
      1. review diff
      2. if OK, pull to the master repo from the feature repo
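    A minimal command-line sketch of that flow (paths and the server URL are placeholders; the Review Board server is assumed to be configured in a .reviewboardrc file or passed explicitly, and the exact post-review options for diffing against the parent clone should be checked with post-review --help):

        # developer
        hg clone http://hg.example.com/master master
        hg clone master feature
        cd feature
        # ... hack-hack ...
        hg commit -m "Implement feature X"
        post-review                        # post the outgoing changes for review

        # reviewer, after the review is approved
        cd ../master
        hg pull ../feature
        hg update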

    Read the article

  • Is this a good centralized DVCS workflow?

    - by Chad Johnson
    I'm leaning toward using Mercurial, coming from Subversion, and I'd like to maintain a centralized workflow like I had with Subversion. Here is what I am thinking (a command-line sketch of setting this up follows below):

      - stable (clone on server)
        - default (branch)
      - development (clone on server)
        - default (branch)
        - bugs (branch)
          - developer1 (clone on local machine)
          - developer2 (clone on local machine)
          - developer3 (clone on local machine)
        - feature1 (branch)
          - developer3 (clone on local machine)
        - feature2 (branch)
          - developer1 (clone on local machine)
          - developer2 (clone on local machine)

    As far as branches vs. clones are concerned, does this workflow make sense? Do I have things straight? Also, the 'stable' clone IS the release. Does it make sense for the 'default' branch to be the release, i.e. the branch that all other branches are ultimately merged into?
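    A rough sketch of how that layout maps onto Mercurial commands (repository names and the server URL are placeholders; note that a named branch only comes into existence when the first changeset is committed on it):

        # on the server
        hg init stable
        hg clone stable development

        # each developer, on a local machine
        hg clone http://hg.example.com/development dev-local
        cd dev-local
        hg branch bugs                  # or: hg update bugs, once the branch exists
        # ... fix something ...
        hg commit -m "Fix issue 42 on the bugs branch"
        hg push --new-branch            # back to the central 'development' clone

        # release: pull development into the stable clone and update
        cd /path/to/stable
        hg pull http://hg.example.com/development
        hg update default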

    Read the article

  • How does the workflow between testers doing testing and coders doing the coding for pending testing work?

    - by dotnetdev
    In a large company that does software development, there are often dedicated teams for build management, testing, development, and so forth. Agile or not, how does this workflow amongst teams work? I mean, would the test team write unit tests and then the dev team write code to adhere to those tests (basically TDD)? And would the test team then write tests for a completely different project, or have a slight quiet period until the dev team has finished coding? What possible workflows are there? This is something that interests me greatly. I know that in my current company we are doing it incorrectly (we have 1 tester to about 5 devs, which is small scale), but I am not sure how exactly to draw out the ideal workflow. Many (OK, an ex-Project Manager) have tried, but all failed.

    Read the article

  • Publishing an Excel spreadsheet using Microsoft SBS 2008 to a web page that is viewable by mobile phones

    - by Dave Heath
    I am getting well out of my “superuser” depth here and would love some support. At work we have an Excel workbook (*.xls format circa Office 2003) which maintains our “engineers” timesheet. This handles what events we are doing across the year and how many “work units” it is. As far as a workbook goes, it is fairly simple with just a few =SUM(range) cells and some linked across sheets (12 sheets, one for each month) It is stored on a server, in a folder that provides “management” with full access and “engineers” with read-only access. The workbook itself is read-only for “engineers” and full access for “management”. I think these permissions are controlled through Active Directory. The workbook is protected with a password, assumingly to allow “management” to edit it even if they are working at a terminal logged in as an “engineer”. This protection prevents “engineers” from going to certain cells to see formulae and therefore editing them. The workbook has a macro which saves and closes it ten minutes after opening. This is to stop other “management” from being locked out by any one person who has logged in with editing privileges. I hope this is making sense to someone... :S Now then, we have Microsoft Small Business Server 2008. We have a shiny new web-based login for when we are offsite so we can get to Exchange webmail and our internal site (which uses Sharepoint 3.0). “Management” would like to be able to publish this timesheet automatically after changes (they don’t want to have to do anything different to what they are currently doing) so that using an iPhone “engineers” can check on it while out of the office. I am currently having a look at “Excel Services” for Office 2007 on TechNet but I am not sure if I am running down the right garden path at the moment. < EDIT This seems to suggest that I have to have Sharepoint Server 2007, with no mention of Sharepoint 3.0... ... "MOSS builds on WSS by adding both core features as well as end user web parts" - Wikipedia entry for Microsoft Office SharePoint Server (MOSS) this is not good news... "...and using the ASP.NET APIs, web parts can be written to extend the functionality of WSS." Wikipedia entry for Windows Sharepoint Services. Could this bring back what I need? Is this good news? Do I need to start learning ASP.NET? This link here implies that we need MOSS to do what I want and the bosses say we aint' getting it. http://serverfault.com/questions/20198/what-is-some-cool-things-you-can-do-with-sharepoint-2007/22128#22128 Back to the drawing board. < /EDIT Please could someone suggest some “further reading” for me to help point me in the right direction or to put me back on the right track. Many thanks. I will try to keep this up to date with how I get on.

    Read the article

  • How to fix “Collection was modified; enumeration operation may not execute” when using Copy.asmx in SharePoint 2010

    - by ybbest
    Problem: In my current project, we use the Copy.asmx web service in SharePoint 2010 to upload documents to a document library. In one of the document libraries, uploading a document with a choice field blows up with the error message "Collection was modified; enumeration operation may not execute."

    Analysis: After some research, we found that the problem is that we have 2 content types that both have a field called Document Type; although the internal names are different, the display names are exactly the same, and this causes the problem. I am still not too sure why; it could possibly be a SharePoint bug.

    Solution: After renaming one of the fields' display names, it works like a charm. You can download the source code with the problem here and the source code without the problem here.
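    For context, a sketch of the kind of Copy.asmx call involved (assuming a web-reference proxy generated into a namespace called CopyWS; the proxy name, URLs, paths and field values here are illustrative, not from the original post):

        CopyWS.Copy copyService = new CopyWS.Copy();
        copyService.Url = "http://server/sites/team/_vti_bin/copy.asmx";
        copyService.Credentials = System.Net.CredentialCache.DefaultCredentials;

        // The choice field that triggered the error: two content types both exposed
        // a "Document Type" display name, with different internal names.
        CopyWS.FieldInformation docType = new CopyWS.FieldInformation();
        docType.DisplayName = "Document Type";
        docType.InternalName = "DocumentType";
        docType.Type = CopyWS.FieldType.Choice;
        docType.Value = "Invoice";

        byte[] fileBytes = System.IO.File.ReadAllBytes(@"C:\temp\invoice.docx");
        string[] destinations = { "http://server/sites/team/Documents/invoice.docx" };
        CopyWS.CopyResult[] results;

        copyService.CopyIntoItems(destinations[0], destinations,
            new[] { docType }, fileBytes, out results);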

    Read the article

  • Enable Claims based Auth on a SP2010 website, after it has been provisioned

    - by Sahil Malik
    When you provision a web app in SP2010, you can choose whether it uses Claims Based Auth or Classic Auth right through the GUI. However, after you have provisioned a web app, there is no GUI to switch from Classic to Claims based. The PowerShell script below will let you convert a SP2010 web application to claims based auth after it has been provisioned:

        $w = Get-SPWebApplication "http://sp2010"
        $w.UseClaimsAuthentication = "True";
        $w.Update()

    The user running the above script should be a member of the SharePoint_Shell_Access role on the config DB, and a member of the WSS_ADMIN_WPG local group.

    Read the article

  • WCF/ADO.NET Data Services - Could not load type 'System.Data.Services.Providers.IDataServiceUpdateProvider'

    - by Sahil Malik
    When you try accessing ListData.svc, do you get the following error? "Could not load type 'System.Data.Services.Providers.IDataServiceUpdateProvider' from assembly 'System.Data.Services, Version=3.5.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'." Well, if you followed the instructions in Chapter 1 of my book to build your VM, you wouldn't run into the above issue. But if you do, you need to install:

      - For Windows Vista and Windows 2008 - http://www.microsoft.com/downloads/details.aspx?familyid=4B710B89-8576-46CF-A4BF-331A9306D555&displaylang=en
      - For Windows 7 and Windows 2008 R2 - http://www.microsoft.com/downloads/details.aspx?familyid=79d7f6f8-d6e9-4b8c-8640-17f89452148e&displaylang=en

    Remember to: a) install the x64 version, and b) do an IISReset before trying again.

    Read the article

  • Sign on Experience with Office 365

    - by Sahil Malik
    Office 365 offers two types of identities:

      - Microsoft Online Services cloud IDs (Cloud Identity): This is the default identity Microsoft provides you. It requires no additional setup; you sign up for Office 365 and you are provided a credential. You sign in using forms based authentication, and the password policy etc. is stored in the cloud with the Office 365 service. The advantage obviously is no additional setup headache. The disadvantage? Yet another password to remember, and no hope of authenticated single sign on integration using this cloud identity with other services, at least in the current version.

      - Federated IDs (Federated Identity): In companies with on-premises Active Directory, users can sign into Office 365 services using their Active Directory credentials. The corporate Active Directory authenticates the users, and stores and controls the password policy. The advantage here is plenty of single sign on possibilities and a better user experience. The downside, more… Read full article ....

    Read the article

  • How to find which w3wp.exe to attach to when debugging your SharePoint 2010 project

    - by ybbest
    When debugging a SharePoint 2010 project, you need to attach to a w3wp.exe process; however, there are often quite a few of them and it is very hard to figure out which one to attach to. Today, I will show you how to find the right process using a tool called Process Explorer.

      1. Download Process Explorer and run it.
      2. Find the w3wp.exe processes under wininit.exe, then right-click the column headers and click Select Columns.
      3. Include Command Line under Process Image.
      4. Now you can see your IIS site name next to w3wp.exe; in my case I'd like to attach to "SharePoint – BenDev80". You can see the PID of the process is 2920.
      5. Now that you know the process ID you'd like to attach to is 2920, you can go ahead and attach to the process from Visual Studio.
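    If you would rather not install anything, a quick command-line alternative (a sketch, assuming you can run an elevated prompt on the SharePoint server) gives the same PID-to-application-pool mapping:

        # PowerShell: list every w3wp.exe with its PID and full command line,
        # which includes the IIS application pool name.
        Get-WmiObject Win32_Process -Filter "Name = 'w3wp.exe'" |
            Select-Object ProcessId, CommandLine |
            Format-List

        # IIS's own tool shows the same thing:
        # C:\Windows\System32\inetsrv\appcmd list wps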

    Read the article

  • Content Query Web Part and the Yes/No Field

    - by Bil Simser
    The Content Query Web Part (CQWP) is a pretty powerful beast. It allows you to do multiple site queries and aggregate the results. This is great for rolling up content and doing some summary type reporting. Here’s a trick to remember about Yes/No fields and using the CQWP. If you’re building a news style site and want to aggregate say all the announcements that people tag a certain way, up onto the home page this might be a solution. First we need to allow a way for users of all our sites to mark an announcement for inclusion on our Intranet Home Page. We’ll do this by just modifying the Announcement Content type and adding a Yes/No field to it. There are alternate ways of doing this like building a new Announcement type or stapling a feature to all sites to add our column but this is pretty low impact and only affects our current site collection so let’s go with it for now, okay? You can berate me in the comments about the proper way I should have done this part. Go to the Site Settings for the Site Collection and click on Site Content Types under the Galleries. This takes you to the gallery for this site and all subsites. Scroll down until you see the List Content Types and click on Announcements. Now we’re modifying the Announcement content type which affects all those announcement lists that are created by default if you’re building sites using the Team Site template (or creating a new Announcements list on any site for that matter). Click on Add from new site column under the Column list. This will allow us to create a new Yes/No field that users will see in Announcement items. This field will allow the user to flag the announcement for inclusion on the home page. Feel free to modify the fields as you see fit for your environment, this is just an example. Now that we’ve added the column to our Announcements Content type we can go into any site that has an announcement list, modify that announcement and flag it to be included on our home page. See the new Featured column? That was the result of modifying our Announcements Content Type on this site collection. Now we can move onto the dirty part, displaying it in a CQWP on the home page. And here is where the fun begins (and the head scratching should end). On our home page we want to drop a Content Query Web Part and aggregate any Announcement that’s been flagged as Featured by the users (we could also add the filter to handle Expires so we don’t show old content so go ahead and do that if you want). First add a CQWP to the page then modify the settings for the web part. In the first section, Query, we want the List Type to be set to Announcements and the Content type to be Announcement so set your options like this: Click Apply and you’ll see the results display all Announcements from any site in the site collection. I have five team sites created each with a unique announcement added to them. Now comes the filtering. We don’t want to include every announcement, only ones users flag using that Featured column we added. At first blush you might scroll down to the Additional Filters part of the Query options and set the Featured column to be equal to Yes: This seems correct doesn’t it? After all, the column is a Yes/No column and looking at an announcement in the site, it displays the field as Yes or No: However after applying the filter you get this result: (I have the announcements from Team Site 1 and Team Site 4 flagged as Featured) Huh? It’s BACKWARDS! Let’s confirm that. 
    Go back in and change the Additional Filters section from Yes to No, hit Apply, and you get this: Wait a minute? Shouldn't I see Team Site 1 and 4 if the logic is backwards? Why am I seeing the same thing as before? What gives… For whatever reason, unknown to me, a Yes/No field (even though it displays as such) really uses 1 and 0 behind the scenes. Yeah, someone was stuck on using integer values for booleans when they wrote SharePoint (probably after a long night of whiteboarding ways to mess with developers' heads) and came up with this. The solution is pretty simple but not very discoverable: set the filter to compare the Featured column to the value 1 (rather than Yes), and it will filter the items marked as Featured correctly. This kind of solution could also be extended and enhanced. Here are a few suggestions and ideas:

      - Modify the ItemStyle.xsl file to add a new style for this aggregation which would include the first few paragraphs of the body (or perhaps add another field to the Content type called Excerpt or Summary and display that instead)
      - Add an Image column to the Announcement Content type to include a Picture field and display it in the summary
      - Add a Category choice field (Employee News, Current Events, Headlines, etc.) and add multiple CQWPs to the home page, filtering each one on a different category

    I know some may find this topic old and dusty, but I didn't see a lot out there specifically on filtering the Yes/No fields, and the whole 1/0 trick was a little wonky, so I figured a few pictures would help walk through overcoming yet another SharePoint weirdness. With a little work and some creative juices you can easily use the power of aggregation and the CQWP to build a news site from content on your team sites.
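    The same 1/0 behaviour shows up outside the CQWP too. A small sketch with a plain SPQuery (the field name matches the Featured column added above; "web" is assumed to be an SPWeb you already have, and whether the Value Type is written as Integer or Boolean varies in practice, since both take 1/0):

        SPQuery query = new SPQuery();

        // 1 = Yes, 0 = No for a Yes/No column; filtering on the text "Yes" matches nothing.
        query.Query =
            "<Where><Eq>" +
            "<FieldRef Name='Featured' />" +
            "<Value Type='Integer'>1</Value>" +
            "</Eq></Where>";

        SPListItemCollection featured = web.Lists["Announcements"].GetItems(query);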

    Read the article

  • TFS 2012 Upgrade and SQL Server - SharePoint - OS Requirements.

    - by Vishal
    Hello folks, recently I was involved in the installation and configuration of a Team Foundation Server 2010 farm for a client. A month after the installation and configuration was done and everything was working as it was supposed to, Microsoft released Team Foundation Server 2012 in mid August 2012. Well, the company was using Borland StarTeam as their source control, and once they started using TFS 2010, their developers and project managers were loving it, since TFS is not just a source control tool and is way better than StarTeam. Anyways, long story short, they are now thinking of upgrading to the newest version. Below are some basic hardware and software requirements for TFS 2012:

    Operating System:
      - Windows Server 2008 with SP2 (64-bit only)
      - Windows Server 2008 R2 with SP1 (64-bit only)
      - Windows Server 2012 (64-bit only)

    SQL Server:
      - SQL Server 2008 R2 and SQL Server 2012 (SQL Server 2008 is no longer supported)
      - SQL Server Requirements for TFS.

    SharePoint Products:
      - SharePoint Server 2010 (SharePoint Foundation 2010, Standard, Enterprise)
      - MOSS 2007 (Standard, Enterprise)
      - Windows SharePoint Services 3.0 (WSS 3.0)
      - SharePoint Products Requirements for TFS.

    Project Server:
      - Project Server 2010 with SP1
      - Project Server 2007 with SP2
      - Project Server Requirements for TFS.

    More information on TFS Upgrade Requirements can be found here. Hardware Recommendations can be found here. Thanks, Vishal Mody

    Read the article

  • Developing a new complex travel website

    - by Kay
    We need to develop a completely new website for customers to choose a travel product with a contract. It needs to interface with our inventory to take the conference facility, hotel rooms, etc. out of inventory once a contract has been signed (e-signature) and a deposit paid. If you were starting from scratch, would you build in-house or contract it out? If in-house, what development tools should I evaluate primarily - SharePoint, ASP.NET? We are a small IT shop, but we could hire 1-2 developers for this. We need to get something up in 12-18 months.

    Read the article

  • SPD 2013 Missing design view, is it really such a big deal?

    - by Sahil Malik
    First, please read what my friend Marc Anderson has to say about it. More, and more, and even more. Also, my friend Asif Rehmani has some views on it as well. They bring up some important points, but in short, everything that allows for visual editing of pages is gone! And any scenario that depended on visual editing (and there are many such scenarios) is gone with it. What is the practical upshot of all this? Read full article ....

    Read the article

  • Removing AppPrincipals from Office365

    - by Sahil Malik
    So here is an annoying issue. If I have your AppPrincipal and secret, I can party as you! But as we go through our usual dev cycles, we create these ApplicationIDs. Hell, Visual Studio will create them for us, to make things easy! The problem is, many a developer, and some an ITOgre, may leave these AppPrincipalIds sitting there and not clean them up when they are done playing. You can look for currently registered App Principals at https://yourtenant/_layouts/15/appprincipals.aspx. The problem is that this URL only shows you App Principals that are registered AND currently in use; App Principals NOT currently in use are NOT shown on that page. The same issue applies on premises too, even though here I am talking specifically about Office 365. Getting rid of these on-prem is easy: just use the object model (server side). Read full article ....

    Read the article

  • Why is my content database so large?

    - by PeterBrunone
    If your SharePoint site collection hasn't grown, but your content database has, the most likely culprit is versioning. If a list -- or worse, a library -- has versioning enabled, the default is to keep every single version. That means that every time someone edits and checks in a document, its storage footprint increases by the size of the document (and probably a little more). The solution? It could be a bit painful, but you'll need to go back into each library and restrict the number of versions to keep (three is sufficient for most uses, but your needs may vary). I suggest keeping only major versions as well, since minor versions are really just stopping points on the way to a published document. Of course, if you have a real business need to keep all those versions around, then you'll want to look into an archiving solution that will take the old versions out of the content database but still make them available if necessary.
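    A sketch of what that per-library change looks like from the SharePoint 2010 management shell with the server object model (the URL and list name are placeholders; the limit is enforced as new versions are created, so old versions are trimmed on subsequent check-ins):

        $web = Get-SPWeb "http://server/sites/team"
        $list = $web.Lists["Shared Documents"]

        $list.EnableVersioning = $true       # keep version history...
        $list.EnableMinorVersions = $false   # ...but only major versions
        $list.MajorVersionLimit = 3          # retain the three most recent major versions
        $list.Update()

        $web.Dispose()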

    Read the article

  • How to fix “Error: This solution contains no resources scoped for a Web application and cannot be deployed to a particular Web application.”

    - by ybbest
    Problem: When I try to deploy my custom wsp solution to a specific web application, I get the error below: "This solution contains no resources scoped for a Web application and cannot be deployed to a particular Web application."

    Analysis: The error message itself explains why you cannot deploy the solution to a web application. However, if you do not want to deploy the solution to all the web applications and only want to deploy it to a specific web application, you need to change the project's Assembly Deployment Target setting from GlobalAssemblyCache to WebApplication.

    Solution: After you change the Assembly Deployment Target and run the deployment script again, the solution will deploy successfully. References: http://blogs.msdn.com/b/jjameson/archive/2007/06/17/issues-deploying-sharepoint-solution-packages.aspx
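    For reference, a sketch of the matching deployment commands once the Assembly Deployment Target is set to WebApplication (the path, solution name and URL are placeholders, not from the original post):

        Add-SPSolution -LiteralPath "C:\deploy\MyCustomSolution.wsp"

        # Scoped deployment to a single web application; include -GACDeployment only if the
        # package still contains assemblies marked for the GAC.
        Install-SPSolution -Identity "MyCustomSolution.wsp" `
                           -WebApplication "http://intranet.contoso.com" `
                           -GACDeployment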

    Read the article

  • How to fix “A deployment or retraction is already under way for the solution “*.wsp”, and only one deployment or retraction at a time is supported”

    - by ybbest
    I faced this issue when I tried to deploy a solution and it failed initially because the SharePoint 2010 Administration service was not started. I then started the service and redeployed the solution; however, I got the above error. To fix it, you need to cancel the current deployment and then redeploy the solution.

    Solution: Cancel the current solution deployment. Go to Central Administration -> System Settings -> Manage farm solutions and cancel the deployment. Next, you can redeploy your solution.
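    The same cancellation can be done from the command line if Central Administration is unavailable (a sketch; run as a farm administrator, and the GUID is a placeholder taken from the enumdeployments output):

        stsadm -o enumdeployments
        # note the Id of the deployment or retraction that is still in progress, then:
        stsadm -o canceldeployment -id "1a2b3c4d-0000-0000-0000-000000000000"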

    Read the article

  • Is there a better way to handle data abstraction in this example?

    - by sigil
    I'm building an application that retrieves SharePoint list data via a web service, SPlists.Lists. To create an instance of the web service, I have the following class:

        class SharepointServiceCreator
        {
            public SPlists.Lists createService()
            {
                SPlists.Lists listsService = new SPlists.Lists();
                listsService.Url = "http://wss/sites/SPLists/_vti_bin/lists.asmx";
                listsService.Credentials = System.Net.CredentialCache.DefaultCredentials;
                return listsService;
            }
        }

    I'm concerned that this isn't good OOP abstraction, though, because in order to create this service elsewhere in my application, I would need the following code:

        class someClass
        {
            public void someMethod()
            {
                SharepointServiceCreator s = new SharepointServiceCreator();
                SPlists.Lists listService = s.createService();
            }
        }

    Having to declare the instance of listService in someMethod as type SPlists.Lists seems wrong, because it means that someClass needs to know how SharepointServiceCreator is implemented. Or is this OK?

    Read the article

  • SharePoint Web Services. Using UserProfileService.GetUserProfileByName. After SP upgrade... failing.

    - by Steve
    The below web services code has worked properly for me for over a year. We have updated our SharePoint servers, and now the below code throws an exception (at the bottom line of code): "Object reference not set to an instance of an object"

        UserProfileWS.UserProfileService userProfileService = new UserProfileWS.UserProfileService();
        userProfileService.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
        string serviceloc = "/_vti_bin/UserProfileService.asmx";
        userProfileService.Url = _webUrl + serviceloc;
        UserProfileWS.PropertyData[] info = userProfileService.GetUserProfileByName(null);

    EDIT: The service is still there. I browse http:///_vti_bin/UserProfileService.asmx, and the information for the service is still there, including the full description of the GetUserProfileByName call.

    EDIT2: This does appear to be due to a change in SharePoint. I loaded a previous version of my software (known to be working), and it exhibits the same erroneous behavior.

    Read the article

  • Using a service registry that doesn’t suck Part III: Service testing is part of SOA governance

    - by gsusx
    This is the third post of this series intended to highlight some of the principles of a modern SOA governance solution. You can read the first two parts here: "Using a service registry that doesn't suck part I: UDDI is dead" and "Using a service registry that doesn't suck part II: Dear registry, do you have to be a message broker?" This time I've decided to focus on one of the aspects that drives me ABSOLUTELY INSANE about traditional SOA governance solutions: service testing, or should I say the lack of...(read more)

    Read the article

  • Agile SOA Governance: SO-Aware and Visual Studio Integration

    - by gsusx
    One of the major limitations of traditional SOA governance platforms is the lack of integration as part of the development process. Tools like HP-Systinet or SOA Software are designed to operate by models on which the architects dictate the governance procedures and policies and the rest of the team members follow along. Consequently, those procedures are frequently rejected by developers and testers given that they can’t incorporate it as part of their daily activities. Having SOA governance products...(read more)

    Read the article

  • SO-Aware at the Atlanta Connected Systems User Group

    - by gsusx
    Today my colleague Don Demsak will be presenting a session about WCF management, testing and governance using SO-Aware and the SO-Aware Test Workbench at the Connected Systems User Group in Atlanta . Don is a very engaging speaker and has prepared some very cool demos based on lessons of real world WCF solutions. If you are in the ATL area and interested in WCF, AppFabric, BizTalk you should definitely swing by Don’s session . Don’t forget to heckle him a bit (you can blame it for it ;) )...(read more)

    Read the article
