Search Results

Search found 14056 results on 563 pages for 'odata services'.


  • SharePoint Web Services. Using UserProfileService.GetUserProfileByName. After SP upgrade... failing.

    - by Steve
    The web services code below worked properly for me for over a year. We have since updated our SharePoint servers, and now it throws an exception on the last line: "Object reference not set to an instance of an object".

        UserProfileWS.UserProfileService userProfileService = new UserProfileWS.UserProfileService();
        userProfileService.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
        string serviceloc = "/_vti_bin/UserProfileService.asmx";
        userProfileService.Url = _webUrl + serviceloc;
        UserProfileWS.PropertyData[] info = userProfileService.GetUserProfileByName(null);

    EDIT: The service is still there. I can browse http:///_vti_bin/UserProfileService.asmx, and the information for the service is still there, including the full description of the GetUserProfileByName call.

    EDIT2: This does appear to be due to a change in SharePoint. I loaded a previous version of my software (known to be working), and it exhibits the same erroneous behavior.
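    One diagnostic worth trying (an editorial sketch, not part of the original question): GetUserProfileByName treats null as "the calling user", so passing an explicit account name separates a failure in current-user resolution from a failure in the service itself. _webUrl comes from the question's code; DOMAIN\jsmith is a placeholder.

        // A hedged sketch: same proxy setup as above, but with an explicit account
        // name instead of null. If this works after the upgrade, the break is in
        // resolving the calling user, not in the service endpoint.
        UserProfileWS.UserProfileService svc = new UserProfileWS.UserProfileService();
        svc.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
        svc.Url = _webUrl + "/_vti_bin/UserProfileService.asmx";
        UserProfileWS.PropertyData[] info = svc.GetUserProfileByName(@"DOMAIN\jsmith"); // placeholder account name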

    Read the article

  • How do I solve this error, "error while trying to deserialize parameter"

    - by Paul Rowland
    I have a web service that works fine in one environment but not in another. The web service gets document metadata from SharePoint; it runs on a server where I can't debug, but with logging I confirmed that the method enters and exits successfully. What could be the reason for the error? The error message is:

        The formatter threw an exception while trying to deserialize the message: There was an error while trying to deserialize parameter http://CompanyName.com.au/ProjectName:GetDocumentMetaDataResponse. The InnerException message was 'Error in line 1 position 388. 'Element' 'CustomFields' from namespace 'http://CompanyName.com.au/ProjectName' is not expected. Expecting element 'Id'.'. Please see InnerException for more details.

    The exception caught was:

        System.ServiceModel.Dispatcher.NetDispatcherFaultException was caught
          Message="The formatter threw an exception while trying to deserialize the message: There was an error while trying to deserialize parameter http://CompanyName.com.au/ProjectName:GetDocumentMetaDataResponse. The InnerException message was 'Error in line 1 position 388. 'Element' 'CustomFields' from namespace 'http://CompanyName.com.au/ProjectName' is not expected. Expecting element 'Id'.'. Please see InnerException for more details."
          Source="mscorlib"
          Action="http://schemas.microsoft.com/net/2005/12/windowscommunicationfoundation/dispatcher/fault"
          StackTrace:
            Server stack trace:
              at System.ServiceModel.Dispatcher.DataContractSerializerOperationFormatter.DeserializeParameterPart(XmlDictionaryReader reader, PartInfo part, Boolean isRequest)
              at System.ServiceModel.Dispatcher.DataContractSerializerOperationFormatter.DeserializeParameter(XmlDictionaryReader reader, PartInfo part, Boolean isRequest)
              at System.ServiceModel.Dispatcher.DataContractSerializerOperationFormatter.DeserializeParameters(XmlDictionaryReader reader, PartInfo[] parts, Object[] parameters, Boolean isRequest)
              at System.ServiceModel.Dispatcher.DataContractSerializerOperationFormatter.DeserializeBody(XmlDictionaryReader reader, MessageVersion version, String action, MessageDescription messageDescription, Object[] parameters, Boolean isRequest)
              at System.ServiceModel.Dispatcher.OperationFormatter.DeserializeBodyContents(Message message, Object[] parameters, Boolean isRequest)
              at System.ServiceModel.Dispatcher.OperationFormatter.DeserializeReply(Message message, Object[] parameters)
              at System.ServiceModel.Dispatcher.ProxyOperationRuntime.AfterReply(ProxyRpc& rpc)
              at System.ServiceModel.Channels.ServiceChannel.HandleReply(ProxyOperationRuntime operation, ProxyRpc& rpc)
              at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout)
              at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs)
              at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(IMethodCallMessage methodCall, ProxyOperationRuntime operation)
              at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(IMessage message)
            Exception rethrown at [0]:
              at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
              at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
              at CompanyName.ProjectName.External.Sharepoint.WebServiceProxies.SharepointProjectNameSiteService.ProjectNameSiteSoap.GetDocumentMetaData(GetDocumentMetaDataRequest request)
              at CompanyName.ProjectName.External.Sharepoint.WebServiceProxies.SharepointProjectNameSiteService.ProjectNameSiteSoapClient.CompanyName.ProjectName.External.Sharepoint.WebServiceProxies.SharepointProjectNameSiteService.ProjectNameSiteSoap.GetDocumentMetaData(GetDocumentMetaDataRequest request) in D:\Source\TFSRoot\ProjectName\trunk\CodeBase\External\CompanyName.ProjectName.External.Sharepoint.WebServiceProxies\Service References\SharepointProjectNameSiteService\Reference.cs:line 2141
              at CompanyName.ProjectName.External.Sharepoint.WebServiceProxies.SharepointProjectNameSiteService.ProjectNameSiteSoapClient.GetDocumentMetaData(ListSummaryDto listSummary, FileCriteriaDto criteria, List`1 customFields) in D:\Source\TFSRoot\ProjectName\trunk\CodeBase\External\CompanyName.ProjectName.External.Sharepoint.WebServiceProxies\Service References\SharepointProjectNameSiteService\Reference.cs:line 2150
              at CompanyName.ProjectName.Services.Shared.SharepointAdapter.GetDocumentMetaData(ListSummaryDto listSummary, FileCriteriaDto criteria, List`1 customFields) in D:\Source\TFSRoot\ProjectName\trunk\CodeBase\Services\CompanyName.ProjectName.Services\Shared\SharepointAdapter.cs:line 260
              at CompanyName.ProjectName.Services.Project.ProjectDocumentService.SetSharepointDocumentData(List`1 sourceDocuments) in D:\Source\TFSRoot\ProjectName\trunk\CodeBase\Services\CompanyName.ProjectName.Services\Project\ProjectDocumentService.cs:line 1963
              at CompanyName.ProjectName.Services.Project.ProjectDocumentService.GetProjectConversionDocumentsImplementation(Int32 projectId) in D:\Source\TFSRoot\ProjectName\trunk\CodeBase\Services\CompanyName.ProjectName.Services\Project\ProjectDocumentService.cs:line 3212
        InnerException: System.Runtime.Serialization.SerializationException
          Message="Error in line 1 position 388. 'Element' 'CustomFields' from namespace 'http://CompanyName.com.au/ProjectName' is not expected. Expecting element 'Id'."
          Source="System.Runtime.Serialization"
          StackTrace:
              at System.Runtime.Serialization.XmlObjectSerializerReadContext.ThrowRequiredMemberMissingException(XmlReaderDelegator xmlReader, Int32 memberIndex, Int32 requiredIndex, XmlDictionaryString[] memberNames)
              at System.Runtime.Serialization.XmlObjectSerializerReadContext.GetMemberIndexWithRequiredMembers(XmlReaderDelegator xmlReader, XmlDictionaryString[] memberNames, XmlDictionaryString[] memberNamespaces, Int32 memberIndex, Int32 requiredIndex, ExtensionDataObject extensionData)
              at ReadFileMetaDataDtoFromXml(XmlReaderDelegator , XmlObjectSerializerReadContext , XmlDictionaryString[] , XmlDictionaryString[] )
              at System.Runtime.Serialization.ClassDataContract.ReadXmlValue(XmlReaderDelegator xmlReader, XmlObjectSerializerReadContext context)
              at System.Runtime.Serialization.XmlObjectSerializerReadContext.ReadDataContractValue(DataContract dataContract, XmlReaderDelegator reader)
              at System.Runtime.Serialization.XmlObjectSerializerReadContext.InternalDeserialize(XmlReaderDelegator reader, String name, String ns, DataContract& dataContract)
              at System.Runtime.Serialization.XmlObjectSerializerReadContext.InternalDeserialize(XmlReaderDelegator xmlReader, Int32 id, RuntimeTypeHandle declaredTypeHandle, String name, String ns)
              at ReadArrayOfFileMetaDataDtoFromXml(XmlReaderDelegator , XmlObjectSerializerReadContext , XmlDictionaryString , XmlDictionaryString , CollectionDataContract )
              at System.Runtime.Serialization.CollectionDataContract.ReadXmlValue(XmlReaderDelegator xmlReader, XmlObjectSerializerReadContext context)
              at System.Runtime.Serialization.XmlObjectSerializerReadContext.ReadDataContractValue(DataContract dataContract, XmlReaderDelegator reader)
              at System.Runtime.Serialization.XmlObjectSerializerReadContext.InternalDeserialize(XmlReaderDelegator reader, String name, String ns, DataContract& dataContract)
              at System.Runtime.Serialization.XmlObjectSerializerReadContext.InternalDeserialize(XmlReaderDelegator xmlReader, Int32 id, RuntimeTypeHandle declaredTypeHandle, String name, String ns)
              at ReadMetaDataSearchResultsDtoFromXml(XmlReaderDelegator , XmlObjectSerializerReadContext , XmlDictionaryString[] , XmlDictionaryString[] )
              at System.Runtime.Serialization.ClassDataContract.ReadXmlValue(XmlReaderDelegator xmlReader, XmlObjectSerializerReadContext context)
              at System.Runtime.Serialization.XmlObjectSerializerReadContext.ReadDataContractValue(DataContract dataContract, XmlReaderDelegator reader)
              at System.Runtime.Serialization.XmlObjectSerializerReadContext.InternalDeserialize(XmlReaderDelegator reader, String name, String ns, DataContract& dataContract)
              at System.Runtime.Serialization.XmlObjectSerializerReadContext.InternalDeserialize(XmlReaderDelegator xmlReader, Int32 id, RuntimeTypeHandle declaredTypeHandle, String name, String ns)
              at ReadGetDocumentMetaDataResponseBodyFromXml(XmlReaderDelegator , XmlObjectSerializerReadContext , XmlDictionaryString[] , XmlDictionaryString[] )
              at System.Runtime.Serialization.ClassDataContract.ReadXmlValue(XmlReaderDelegator xmlReader, XmlObjectSerializerReadContext context)
              at System.Runtime.Serialization.XmlObjectSerializerReadContext.ReadDataContractValue(DataContract dataContract, XmlReaderDelegator reader)
              at System.Runtime.Serialization.XmlObjectSerializerReadContext.InternalDeserialize(XmlReaderDelegator reader, String name, String ns, DataContract& dataContract)
              at System.Runtime.Serialization.XmlObjectSerializerReadContext.InternalDeserialize(XmlReaderDelegator xmlReader, Type declaredType, DataContract dataContract, String name, String ns)
              at System.Runtime.Serialization.DataContractSerializer.InternalReadObject(XmlReaderDelegator xmlReader, Boolean verifyObjectName)
              at System.Runtime.Serialization.XmlObjectSerializer.ReadObjectHandleExceptions(XmlReaderDelegator reader, Boolean verifyObjectName)
              at System.Runtime.Serialization.DataContractSerializer.ReadObject(XmlDictionaryReader reader, Boolean verifyObjectName)
              at System.ServiceModel.Dispatcher.DataContractSerializerOperationFormatter.DeserializeParameterPart(XmlDictionaryReader reader, PartInfo part, Boolean isRequest)
        InnerException: (none)
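    A note on what this error usually indicates (an editorial aside, not from the original post): DataContractSerializer reads members in a fixed order, so "Expecting element 'Id'" where 'CustomFields' arrived is the signature of a data contract that has drifted between the two environments. Below is a hedged sketch of pinning the order explicitly; only the names Id, CustomFields, FileMetaDataDto and the namespace come from the error and stack trace, the member types are assumptions.

        using System.Collections.Generic;
        using System.Runtime.Serialization;

        // Hypothetical reconstruction of the DTO named in the stack trace. With an
        // explicit Order, both sides serialize Id before CustomFields; regenerating
        // the service reference after any contract change keeps them aligned.
        [DataContract(Namespace = "http://CompanyName.com.au/ProjectName")]
        public class FileMetaDataDto
        {
            [DataMember(Order = 1)]
            public int Id { get; set; }                     // assumed type

            [DataMember(Order = 2)]
            public List<string> CustomFields { get; set; }  // assumed type
        }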

    Read the article

  • Integrate Bing Search API into ASP.Net application

    - by sreejukg
    A couple of months back, I wrote an article about how to integrate the Bing Search engine (API 2.0) with an ASP.Net website. You can refer to that article here: http://weblogs.asp.net/sreejukg/archive/2012/04/07/integrate-bing-api-for-search-inside-asp-net-web-application.aspx

    Things change rapidly in the tech world, and Bing has changed too! The Bing Search API 2.0 will work until August 1, 2012; after that it will not return results. Shocked? Don't worry: the API has moved to the Windows Azure Marketplace, where you can sign up and continue using it, and there is a free tier available depending on your usage. In this article, I am going to explain how you can integrate the new Bing API available in the Windows Azure Marketplace with your website. You can access the Windows Azure Marketplace here: https://datamarket.azure.com/

    There are lots of applications available to subscribe to and use, and Bing is one of them. You can find the new Bing Search API here: https://datamarket.azure.com/dataset/5BA839F1-12CE-4CCE-BF57-A49D98D29A44

    To get access to the Bing Search API, you first need to register an account with the Windows Azure Marketplace. The sign in button is at the top right of the page; clicking it takes you to the Windows Live ID authentication page, where you can log in with a Windows Live account. Once logged in, you will see the registration page for the Windows Azure Marketplace. You can agree or disagree to the email address usage by Microsoft; I believe selecting the check box means you will get email about what is happening in the marketplace. Click on the Continue button once you are done. On the next page you must accept the terms of use - this is not optional. Scroll down the page, select the "I agree" check box, and click on the Register button. Now you are a registered member of the Windows Azure Marketplace and can subscribe to data applications.

    In order to use the Bing API in your application, you must obtain your Account Key. The previous version of Bing required an API key; the current version uses an Account Key instead. Once you are logged in to the marketplace, go to the "My Account" section from the top menu. From there you can manage your subscriptions and Account Keys; Account Keys are what your applications use to access the subscriptions. Click on the My Account link, then choose Account Keys in the left menu. You can add an account key or use the default one; creating an account key is a very simple process, and you can remove keys you created if necessary.

    The next step is to subscribe to the Bing Search API. At this moment, Bing offers two APIs for search:

    1. Bing Search API - https://datamarket.azure.com/dataset/5ba839f1-12ce-4cce-bf57-a49d98d29a44
    2. Bing Search API - Web Results Only - https://datamarket.azure.com/dataset/8818f55e-2fe5-4ce3-a617-0b8ba8419f65

    The difference is that the latter gives you only web results, whereas with the former you can specify the source type, such as image, video, web, or news. Choose the API carefully based on your application requirements. In this article I am going to use the Web Results Only API, but the steps are similar for both.

    Go to the API page at https://datamarket.azure.com/dataset/8818f55e-2fe5-4ce3-a617-0b8ba8419f65. You will see the subscription options on the right side and, at the bottom of the page, the free option. Since I am going to use the free option, click its Sign Up link, select the "I agree" check box, and click on the Sign Up button. You will get a receipt page that details your subscription. Now you are ready to use the Bing Search API - Web Results Only.

    The next step is to integrate the API into your ASP.Net application. On the Search API page (as well as on the receipt page) you will see a ".Net C# Class Library" link; click it and you will get a code file named "BingSearchContainer.cs".

    Create an empty ASP.Net web application, then add the downloaded code file ("BingSearchContainer.cs") to the project: right-click the project in Solution Explorer, choose Add -> Existing Item, browse to the downloaded location, select "BingSearchContainer.cs", and add it. To build the code file you need to add a reference to the System.Data.Services.Client library, which you can find on the .Net tab when you select Add -> Reference. Try to build your project now; it should build without any errors.

    Add an ASP.Net page to the project with a text box, a button and a GridView; the idea is to search the text entered and display the results in the GridView. In the button click event handler for the search button, I used the btnSearch_Click code shown later in this article. Now run the project, enter some text in the text box, and click the Search button; you will see the results coming from Bing, cool. I entered the text "Microsoft" in the text box, clicked the button, and got results back.

    Searching specific websites

    If you want to search a particular website, pass the site URL as site:<site url>, and if you have more sites, use a pipe (|). For example, the following search query will search for the word "design" and return results from Microsoft.com and Adobe.com:

        site:microsoft.com | site:adobe.com design

    See the sample code that searches only Microsoft.com for the entered text:

        var webResults = bingContainer.Web("site:www.Microsoft.com " + txtSearch.Text, null, null, null, null, null, null);

    Paging the results returned by the API

    By default the Bing API returns 100 results for your query, and the code file you downloaded from Bing doesn't include any option for paging, but you can modify it to support paging. The Bing API supports two parameters: $top (the number of results to return) and $skip (the number of records to skip). So if you want the 3rd page of results with a page size of 10, you need to pass $top = 10 and $skip = 20.

    Open BingSearchContainer.cs in the editor and find the Web method:

        public DataServiceQuery<WebResult> Web(String Query, String Market, String Adult, Double? Latitude, Double? Longitude, String WebFileType, String Options)
        {

    I added two more parameters to the method signature:

        public DataServiceQuery<WebResult> Web(String Query, String Market, String Adult, Double? Latitude, Double? Longitude, String WebFileType, String Options, int resultCount, int pageNo)
        {

    and inside the method passed them on to the query variable:

        query = query.AddQueryOption("$top", resultCount);
        query = query.AddQueryOption("$skip", (pageNo - 1) * resultCount);
        return query;

    Note that I didn't perform any validation, but you should check conditions such as resultCount and pageNo being greater than or equal to 1; if the parameters are not valid, the Bing Search API will throw an error. (A sketch of such a guard appears after the reading list below.)

    Now see the following code in the SearchPage.aspx.cs file:

        protected void btnSearch_Click(object sender, EventArgs e)
        {
            var bingContainer = new Bing.BingSearchContainer(new Uri("https://api.datamarket.azure.com/Bing/SearchWeb/"));
            // replace this value with your account key
            var accountKey = "your key";
            // the next line configures the bingContainer to use your credentials.
            bingContainer.Credentials = new NetworkCredential(accountKey, accountKey);
            var webResults = bingContainer.Web("site:microsoft.com " + txtSearch.Text, null, null, null, null, null, null, 3, 2);
            lstResults.DataSource = webResults;
            lstResults.DataBind();
        }

    This call returns 3 results starting from the second page (skipping the first 3 results).

    Bing provides complete integration to its offerings. When you develop search-based applications, you can use the power of Bing to perform the search. Integrating the Bing Search API into an ASP.Net application is a simple process, and without investing much time you can develop a good search-based application. Make sure you read the terms of use before designing the application, and decide which API usage suits you.

    Further reading:
    Bing API Migration Guide - http://go.microsoft.com/fwlink/?LinkID=248077
    Bing API FAQ - http://go.microsoft.com/fwlink/?LinkID=252146
    Bing API Schema Guide - http://go.microsoft.com/fwlink/?LinkID=252151
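    As promised above, here is a minimal sketch of the guard the article says it omits, placed at the top of the modified Web method. The bounds themselves (both parameters at least 1) come from the article; the exception choice is just a reasonable default.

        // Validate the new paging parameters before building the query; out-of-range
        // values would otherwise surface as an error from the Bing service itself.
        if (resultCount < 1)
            throw new ArgumentOutOfRangeException("resultCount", "Must be at least 1.");
        if (pageNo < 1)
            throw new ArgumentOutOfRangeException("pageNo", "Must be at least 1.");

        query = query.AddQueryOption("$top", resultCount);
        query = query.AddQueryOption("$skip", (pageNo - 1) * resultCount);
        return query;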

    Read the article

  • When someone deletes a shared data source in SSRS

    - by Rob Farley
    SQL Server Reporting Services plays nicely. You can have things in the catalogue that get shared: reports that have links, datasets that can be used across different reports, and data sources that can be used in a variety of ways too. So if you find that someone has deleted a shared data source, you potentially have a bit of a horror story going on. And that works for this month's T-SQL Tuesday theme, hosted by Nick Haslam, who wants to hear about horror stories. I don't write about LobsterPot client horror stories, so I'm writing about a situation that a fellow MVP friend asked me about recently instead.

    The best thing to do is to grab a recent backup of the ReportServer database, restore it somewhere, and figure out what's changed. But of course, this isn't always possible. And it's much nicer to help someone with this kind of thing than to be trying to fix it yourself when you've just deleted the wrong data source. Unfortunately, Reporting Services lets you delete data sources without trying to scream that the data source is shared across over 400 reports in over 100 folders, as was the case for my friend's colleague. So suddenly there's a big problem – lots of reports are failing, and the time to turn it around is small. You probably know which data source has been deleted, but getting the shared data source back isn't the hard part (that's just a connection string really). The nasty bit is all the re-mapping, to get those 400 reports working again.

    I know from exploring this kind of stuff in the past that the ReportServer database (using its default name) has a table called dbo.Catalog to represent the catalogue, and that reports are stored here. However, the information about what data sources these deployed reports are configured to use is stored in a different table, dbo.DataSource. You could be forgiven for thinking that shared data sources would live in this table, but they don't – they're catalogue items just like the reports. Let's have a look at the structure of these two tables (although if you're reading this because you have a disaster, feel free to skim past). Frustratingly, there doesn't seem to be a Books Online page for this information, sorry about that. I'm also not going to look at all the columns, just ones that I find interesting enough to mention, and that are related to the problem at hand. These fields are consistent all the way through to SQL Server 2012 – there doesn't seem to have been any change here for quite a while.

    dbo.Catalog

    The primary key is ItemID. It's a uniqueidentifier. I'm not going to comment any more on that. A minor nice point about using GUIDs in unfamiliar databases is that you can more easily figure out what's what. But foreign keys are for that too…

    Path, Name and ParentID tell you where in the folder structure the item lives. Path isn't actually required – you could've done recursive queries to get there. But as that would be quite painful, I'm more than happy for the Path column to be there. Path contains the Name as well, incidentally.

    Type tells you what kind of item it is. Some examples are 1 for a folder and 2 for a report. 4 is linked reports, 5 is a data source, 6 is a report model. I forget the others for now (but feel free to put a comment giving the full list if you know it).

    Content is an image field, remembering that image doesn't necessarily store images – these days we'd rather use varbinary(max), but even in SQL Server 2012, this field is still image. It stores the actual item definition in binary form, whether it's actually an image, a report, whatever.

    LinkSourceID is used for linked reports, and has a self-referencing foreign key (allowing NULL, of course) back to ItemID.

    Parameter is an ntext field containing XML for the parameters of the report. Not sure why this couldn't be a separate table, but I guess that's just the way it goes. This field gets changed when the default parameters get changed in Report Manager.

    There is nothing in dbo.Catalog that describes the actual data sources that the report uses. The default data sources would be part of the Content field, as they are defined in the RDL, but when you deploy reports, you typically choose NOT to replace the data sources. Anyway, they're not in this table. Maybe it was already considered a bit wide to throw in another ntext field, I'm not sure. They're in dbo.DataSource instead.

    dbo.DataSource

    The primary key is DSID. Yes, it's a uniqueidentifier...

    ItemID is a foreign key reference back to dbo.Catalog.

    Fields such as ConnectionString, Prompt, UserName and Password do what they say on the tin, storing information about how to connect to the particular source in question.

    Link is a uniqueidentifier, which refers back to dbo.Catalog. This is used when a data source within a report refers back to a shared data source, rather than embedding the connection information itself. You'd think this should be enforced by foreign key, but it's not. It does allow NULLs though.

    Flags is an int, and I'll come back to this.

    When a data source gets deleted out of dbo.Catalog, you might assume that it would be disallowed if there are references to it from dbo.DataSource. Well, you'd be wrong. And not because of the lack of a foreign key either. Deleting anything from the catalogue is done by calling a stored procedure called dbo.DeleteObject. You can look at the definition in there – it feels very much like the kind of Delete stored procedure that many people write, the kind of thing that means they don't need to worry about allowing cascading deletes with foreign keys – because the stored procedure does the lot. Except that it doesn't quite do that. If it deleted everything on a cascading delete, we'd've lost all the data sources as configured in dbo.DataSource, and that would be bad. This is fine if the ItemID from dbo.DataSource hooks in – if the report is being deleted. But if a shared data source is being deleted, you don't want to lose the existence of the data source from the report. So it sets it to NULL, and it marks it as invalid. We see this code in that stored procedure:

        UPDATE [DataSource]
           SET [Flags] = [Flags] & 0x7FFFFFFD, -- broken link
               [Link] = NULL
        FROM [Catalog] AS C
             INNER JOIN [DataSource] AS DS ON C.[ItemID] = DS.[Link]
        WHERE (C.Path = @Path OR C.Path LIKE @Prefix ESCAPE '*')

    Unfortunately there's no semi-colon on the end (but I'd rather they fix the ntext and image types first), and don't get me started about using the table name in the UPDATE clause (it should use the alias DS). But there is a nice comment about what's going on with the Flags field.

    What I'd LIKE it to do would be to set the connection information to a report-embedded copy of the connection information that's in the shared data source, the one that's about to be deleted. I understand that this would cause someone to lose the benefit of having the data sources configured in a central point, but I'd say that's probably still slightly better than LOSING THE INFORMATION COMPLETELY. Sorry, rant over. I should log a Connect item – I'll put that on my todo list.

    So it sets the Link field to NULL, and marks the Flags to tell you they're broken. So this is your clue to fixing it. A bitwise AND with 0x7FFFFFFD is basically stripping out the '2' bit from a number. So numbers like 2, 3, 6, 7, 10, 11, etc, whose binary representation ends in either 11 or 10, get turned into 0, 1, 4, 5, 8, 9, etc. We can test for it using a WHERE clause that matches the SET clause we've just used. I'd also recommend checking for Link being NULL and also having no ConnectionString. And join back to dbo.Catalog to get the path (including the name) of the broken reports – in case you get a surprise from a different data source having been broken in the past.

        SELECT c.Path, ds.Name
        FROM dbo.[DataSource] AS ds
        JOIN dbo.[Catalog] AS c ON c.ItemID = ds.ItemID
        WHERE ds.[Flags] = ds.[Flags] & 0x7FFFFFFD
          AND ds.[Link] IS NULL
          AND ds.[ConnectionString] IS NULL;

    When I just ran this on my own machine, having deleted a data source to check my code, I noticed a report model in the list as well – so if you had thought it was just going to be reports that were broken, you'd be forgetting something.

    So to fix those reports, get your new data source created in the catalogue, and then find its ItemID by querying Catalog, using Path and Name to find it (a sketch of that lookup is at the end of this post). Then use this value to fix them up. To fix the Flags field, just add 2; I prefer to use bitwise OR, which should do the same. Use the OUTPUT clause to get a copy of the DSIDs of the ones you're changing, just in case you need to revert something later after testing (doing it all in a transaction won't help, because you'll just lock out the table, stopping you from testing anything).

        UPDATE ds
        SET [Flags] = [Flags] | 2,
            [Link] = '3AE31CBA-BDB4-4FD1-94F4-580B7FAB939D' /* insert your own GUID */
        OUTPUT deleted.Name, deleted.DSID, deleted.ItemID, deleted.Flags
        FROM dbo.[DataSource] AS ds
        JOIN dbo.[Catalog] AS c ON c.ItemID = ds.ItemID
        WHERE ds.[Flags] = ds.[Flags] & 0x7FFFFFFD
          AND ds.[Link] IS NULL
          AND ds.[ConnectionString] IS NULL;

    But please be careful. Your mileage may vary. And there's no reason why 400-odd broken reports needs to be quite the nightmare that it could be. Really, it should be less than five minutes.

    @rob_farley
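    One small addendum to the "find its ItemID by querying Catalog" step above (an editorial sketch, not the author's code). The Path and Name values are placeholders for wherever you created the replacement shared data source:

        SELECT c.ItemID
        FROM dbo.[Catalog] AS c
        WHERE c.[Path] = '/Data Sources/MyNewSharedSource'  -- hypothetical path (includes the name)
          AND c.[Name] = 'MyNewSharedSource'                -- hypothetical name
          AND c.[Type] = 5;                                 -- 5 = data source, per the Type values above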

    Read the article

  • The endpoint reference (EPR) for the Operation not found is /services/SimpleStockQuoteService

    - by user3592349
    I'm a newbie to WSO2 ESB. I am following the documentation and trying out the "REST Client and SOAP Service" scenario. After executing

        ant stockquote -Daddurl=http://localhost:8280/services/StockQuoteProxy -Drest=true

    the following error is thrown:

        [java] Sending as REST
        [java] org.apache.axis2.AxisFault: The endpoint reference (EPR) for the Operation not found is /services/SimpleStockQuoteService and the WSA Action = null. If this EPR was previously reachable, please contact the server administrator.

    Read the article

  • SQL SERVER – What is MDS? – Master Data Services in Microsoft SQL Server 2008 R2

    - by pinaldave
    What is MDS? Master Data Services helps enterprises standardize the data people rely on to make critical business decisions. With Master Data Services, IT organizations can centrally manage critical data assets company-wide and across diverse systems, enable more people to securely manage master data directly, and ensure the integrity of information over time. (Source: Microsoft)

    Today I will be talking about this subject at Microsoft TechEd India. If you want to learn how to standardize your data and apply business rules to validate it, you must attend my session. MDS is a very interesting concept, and I will cover it in ten short but very interesting slides. In the first 20 minutes, I will make sure you understand the following topics:

    - Introduction to Master Data Management
    - What is Master Data and its Challenges
    - MDM Challenges and Advantages
    - Microsoft Master Data Services
    - Benefits and Key Features
    - Uses of MDS
    - Capabilities
    - Key Features of MDS

    The slide deck will be followed by a demo of around 30 minutes telling the story of entities, hierarchies, versions, security, consolidation and collections. I will tell this story with business rules at the center: we will take one business rule, a simple validation rule, and make it much more complex yet very useful to the product. I will also demonstrate a few real-life scenarios of MDS and its usage. Do not miss this session. At the end of the session a book will be awarded to the best participant.

    My session details:

    Session: Master Data Services in Microsoft SQL Server 2008 R2
    Date: April 12, 2010
    Time: 2:30pm-3:30pm

    SQL Server Master Data Services will ship with SQL Server 2008 R2 and will improve Microsoft's platform appeal. This session provides an in-depth demonstration of MDS features and highlights important usage scenarios. Master Data Services enables consistent decision making by allowing you to create, manage and propagate changes from a single master view of your business entities. Also, the Master Data hub, a vital component of MDS, helps ensure reporting consistency across systems and delivers faster, more accurate results across the enterprise. We will talk about establishing the basis for a centralized approach to defining, deploying, and managing master data in the enterprise.

    Reference: Pinal Dave (http://blog.SQLAuthority.com)

    Filed under: Business Intelligence, Data Warehousing, MVP, Pinal Dave, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, T SQL, Technology
    Tagged: TechEd, TechEdIn

    Read the article

  • Terminal Services - what does this message mean?

    - by southof40
    Hi - I'm responsible for a W2003 server. I'm trying to switch on Terminal Services, and when I do so I get a message that includes the text:

        "If you continue with this installation, programs that are already installed on your server will no longer work and will have to be reinstalled"

    Surely that doesn't mean what it appears to mean? I just want one extra person to be able to access the machine in the same way that two administrators currently do (via "Remote Administration"), only with all three able to do so simultaneously. When I bought the extra TS license I thought I was just buying the right to have one extra user. Will doing this really trash everything on the machine?

    Read the article

  • No "Terminal Services" branch in "Group Policy Object Editor"

    - by ayavilevich
    Hi, we have several identical servers in a hosting company. They run Windows 2003 R2 Std SP2 64-bit, and they are not in a domain. We have recently received a new server with the same configuration and hardware. However, the new server is different in some way: when we run "gpedit.msc /s" there are far fewer options in the tree than on the other servers. Specifically, we are missing the configuration of "Terminal Services", and many other items are missing under "Administrative templates" and "Windows components".

    Screenshot of correct server: (can't post link due to SF policy)
    Screenshot of new server: http://img686.imageshack.us/img686/572/gpowindowscomponentstp5.png

    What should we try? Thanks, Arik.

    Read the article

  • Data and Secularism

    - by kaleidoscope
    Ever since we’ve been using data we’ve been religious: religious about the way we represent it and equally religious about the way we access it. Be it plain old SQL, DAO, ADO, ADO.Net - and I am just referring to religions in the MSFT world; a peek outside and I’d need a separate book to list out the data faiths. Various application areas in networked computing are converging under the HTTP umbrella, with a plausible transition to purist HTTP and in turn REST, fuelled by the Web 2.0 storm. It was time the data access faiths also gave up the religious silos wrapped around our long-worshipped data publishing and access methods.

    OData is the secular solution we have at hand today. It is an open protocol for sharing data, and it can be exposed via REST. It is open as in the Microsoft Open Specification Promise, which allows virtually everyone to build data services for any runtime. OData is one of the key standards for data publishing/subscribing on Microsoft Codename Dallas. For us .Netters, OData data sources can be exposed and consumed via WCF Data Services, and the process is very simple, elegant and intuitive (see the sketch below).

    Applications exposing OData services:
    - SharePoint 2010
    - IBM WebSphere
    - Microsoft SQL Azure
    - Windows Azure Table Storage
    - SQL Server Reporting Services

    Live OData services:
    - Netflix
    - Open Science Data Initiative
    - Open Government Data Initiatives
    - Northwind database exposed as an OData service
    - and many others

    Some may prefer to call it commoditization of data, unification of data access strategies or any other sweet name. I for one will stick to my secular definition. :)

    Technorati Tags: Sarang, OData, MOSP
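    To make "simple, elegant and intuitive" concrete, here is a minimal sketch of exposing an OData feed with WCF Data Services. NorthwindEntities stands in for whatever Entity Framework context your project already has; it is an assumption, not part of the original post.

        using System.Data.Services;
        using System.Data.Services.Common;

        // A .svc-backed service class: DataService<T> turns the EF context into
        // an OData endpoint, queryable with $filter, $top, $skip and friends.
        public class NorthwindDataService : DataService<NorthwindEntities>
        {
            public static void InitializeService(DataServiceConfiguration config)
            {
                // Read-only access to every entity set; tighten per set in real use.
                config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
                config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
            }
        }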

    Read the article

  • Media Temple-like hosting services?

    - by antonpug
    I have a couple of WordPress sites which do not get much traffic now, but I plan on expanding to something like 1000-2000 visits/day in a year or two. Media Temple has some really nice offerings, but their WordPress plan is $20/month, which is a little too much, seeing as at this point my site is more of a hobby than a money-making machine. I currently host with HostGator (having just switched from GoDaddy/iPage/Bluehost). All these cheaper, popular hosting services are okay, but it would be nice to find something a bit more "premium" at a lower cost than MT. Does anyone know anything worth looking at?

    Read the article

  • Windows Deployment Services

    - by timbrigham
    I have a slightly advanced Windows Deployment Services setup. My router hands out DHCP addresses, including the following configuration:

        ip dhcp pool Servers_100
           network 192.168.100.0 255.255.255.0
           bootfile boot\x86\pxelinux.0
           next-server 192.168.100.50
           default-router 192.168.100.1
           dns-server 192.168.100.80 192.168.100.81

    This works perfectly for other subnets: I have a couple of screens in my pxelinux menu that allow me to select my various Linux installers or enter the Windows preboot environment. For some reason, on this subnet I'm only receiving the default bootfile that opens the Windows preboot environment. Any idea why?

    Read the article

  • Samples for RESTful web services for WCF

    - by George2
    Hello everyone, I am new to RESTful web services in WCF, but not new to WCF. I want to develop some simple RESTful web services in WCF which can be accessed manually from a browser. Are there any good samples or documents you can recommend? I am using C#. Thanks in advance, George
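    Not an official sample, but a minimal self-hosted sketch of the kind of thing described above: a WCF REST service you can hit from a browser. All names here are made up for illustration; it needs references to System.ServiceModel and System.ServiceModel.Web.

        using System;
        using System.ServiceModel;
        using System.ServiceModel.Web;

        [ServiceContract]
        public interface IGreetingService
        {
            // Maps GET /hello?name=... onto this operation.
            [OperationContract]
            [WebGet(UriTemplate = "hello?name={name}")]
            string Hello(string name);
        }

        public class GreetingService : IGreetingService
        {
            public string Hello(string name)
            {
                return "Hello, " + name;
            }
        }

        class Program
        {
            static void Main()
            {
                // WebServiceHost wires up webHttpBinding and the REST behavior for us.
                using (var host = new WebServiceHost(typeof(GreetingService),
                                                     new Uri("http://localhost:8000/")))
                {
                    host.Open();
                    Console.WriteLine("Browse to http://localhost:8000/hello?name=George");
                    Console.ReadLine();
                }
            }
        }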

    Read the article

  • Enabling Multiple Monitor Support from Terminal Services/Remote Desktop over Citrix

    - by Nicolas Webb
    Our Remote Desktop/Terminal Services solution where I work relies on Citrix for machines not connected via the VPN. We're using Citrix XenServer (I'm pretty sure), and I'm going to try to connect to a Windows 7 host (my work computer); I think the RDC client runs on a Win2003 host (exposed via Citrix). Is it possible to take advantage of Windows 7 multiple-monitor support for RDC with this setup? Would I need to get my Citrix guys to use a different host machine for the RDC (Win2008, or Win7)? I'm probably going to connect using the OS X Citrix client, but I'd be willing to Boot Camp/Fusion up a Windows instance to work remotely as well. I really want to be able to use multiple monitors remotely. It does "span" multiple monitors currently (I have a 3000x1024 desktop, for example), but I'd rather have "true" multiple-monitor support instead of one giant desktop, if possible.

    Read the article

  • Terminal services and memory limits

    - by Mark Wassell
    Is there a way in Terminal Services to set limits on memory-related parameters for a process - for example, working set size and, possibly, if it makes sense, total virtual memory allocation for the session? To turn the question around: we have an application which cannot allocate as much virtual memory running on a terminal server as it can when running on a desktop PC (both of which I would expect to have a 2GB limit for user-mode address space), and I was wondering if there is another limit for processes or users on a terminal server. Perhaps it is even 2GB per user rather than per process.

    Read the article

  • Source code for built-in Linux services

    - by Sirish Kumar
    Hi, I am looking at Linux startup services, like cron, which runs at runlevel 5 and is started from init.d. In the startup script I can only see the script file and the location of the binary file which is executed on startup. Where can I see the actual source code of these services?

    Read the article

  • Distributed Database Services?

    - by Cameron
    I'm working on a database-driven web service with clients in the US and Australia. We're currently hosted in the US, but our Australian clients are experiencing lag. The lag is primarily due to the fact that the pages launch AJAX queries which require database work on our US database, and these take a while to round-trip. Ideally, we're looking for some kind of distributed database system which replicates our main US database in Australia (and possibly other locations if we choose to expand later on). Does anyone have any suggestions for services which offer something like this - something like a CDN (CacheFly etc.), which is web-based and simple to set up, but for databases instead of static files? Ideally it would be completely transparent to the application and abstract away all the distributed database management, syncs, etc.

    Read the article

  • Reporting Services 2005 Model using WCF Service for data

    - by Stu
    I am trying to use WCF services as models for SQL Server Reporting Services 2005 reports. I can do this if I design the reports in the designer, but cannot do it for a Report Model project, which I think I need in order to build reports in Report Builder. My full requirement is to have a report builder that users can use to build reports based on DTOs supplied by my WCF service. Thanks

    Read the article

  • Mate and Remote Java Services Integration

    - by subh
    I am new to Flex and need to integrate Java services with a Flex UI built using the Mate framework. Can anyone point me to any websites/links or show a simple example of integrating a Flex UI on the Mate framework with Java remote services? A simple integration of a "LoginService" will be good enough.

    Read the article

  • AdSense-type services without the content check?

    - by Ryan Ahearn
    Are there any ad-placing services that don't mind that the page is nothing but a page of ads? Someone else with a .cn TLD is pointing at my IP address. I currently have that site on its own name-based virtual host that responds to every request with a 403, but I figure if I am paying for the incoming bandwidth, I might as well try to make some money off it. I don't want to actually create any content for this site, which prevents me from using AdSense. My only requirements are that it is free for me to sign up and that they don't mind the lack of content.

    Read the article

  • multiple Thrift services on one transport

    - by kert
    Just seeking confirmation here: the Apache Thrift protocol does not seem to support running multiple services on one transport endpoint (a socket, a file, whatever)? I can't seem to figure out how to do something like this in Thrift:

        service otherService {
            void dosomething()
        }

        service rootService {
            otherService getOtherService()
        }

    There does not seem to be any concept of passing service handles in and out; ultimately this is limited by the protocol. It looks like you cannot run two services on one transport pipe. Correct?

    Read the article

  • Services filling up taskbar as applications

    - by Chris
    Windows XP, with the Windows Classic theme. I've had a sporadic problem where the taskbar seems to fill up with lots of tasks which appear to be names of services. When I open Task Manager, the Applications tab is filled with these names. I can still use my PC to an extent, but it's not possible to remove all these icons from the taskbar, and other Windows UI features (e.g. the Start menu) tend to be 'hollow' so I only see the outline, meaning I have to restart. See the screenshot (not reproduced here). The trigger seems to be clicking the "Show Desktop" quick-launch icon when I have several applications open (e.g. Outlook, Excel). Has anyone else come across this? It's happened on my home machine before, but seems to happen more frequently (2-3 times a week) at work...

    Read the article
