Search Results

Search found 15150 results on 606 pages for 'azure services'.

Page 40 of 606

  • Accessing identical web services using the same client

    - by Krt_Malta
    Hi. I have some web services and I am creating a web client using wsimport. When creating the client I have this line: MyServiceService service = new MyServiceService(); It works fine as it is. I have the same web services running on another server and I was wondering if I could access them using the same client. Is it possible to change the WSDL URL of the client? Ctrl-Space in Eclipse shows me two parameters I can pass to MyServiceService: URL arg0 and QName arg1. Is this what I'm looking for? And if this is the case, what should I put in the QName, since I didn't find any associated Javadoc and didn't find it on Google either. Thanks and regards, Krt_Malta

    Read the article

  • PPT Leveraging Azure for Performance Testing

    - by Tarun Arora
    I have recently presented a session on how you can leverage Azure for performance testing your application. It goes without saying that performance testing your application not only gives you the confidence that the application will work under heavy levels of stress, but also lets you test how scalable the architecture of your application is. It is important to know how much is too much for your application! Working with various clients in the industry, I have realized that the biggest barrier to load testing and performance testing adoption is the high infrastructure and administration cost that comes with this phase of testing. In the session I tried to demonstrate how you can use the power of Windows Azure to effectively abstract away the administration cost of infrastructure management and lower the total cost of load and performance testing. You can view the session presentation here: http://www.slideshare.net/aroratarun/leveraging-azure-for-performance-testing  I’ll be adding a video on this subject shortly… If you have any feedback or further suggestions to add to the goodness of this solution, please get in touch.

    Read the article

  • Cheapest ways to expose internet services

    - by tmow
    Hi all, we have developed/customized an internet site that provides functionality like sharing, swap/barter, selling and renting any object/service through public or private galleries and/or clubs. We'd now like to expose all or some of the services to the public, so that anybody can take advantage of our engine, but without spending too much, as we are running low on budget. At the moment we have developed some RSS interfaces for the public catalogs, and it is possible to use jQuery to query the engine. Do you have any advice on how to proceed here? We were thinking about something simple like framesets, using OpenID-like authentication, or similar technologies. Even if it would be the coolest solution, we would like to avoid developing SOAP, REST or XML-RPC APIs, as it takes a lot of time (= money). Do you have any super smart ideas?

    Read the article

  • Choosing a Reporting Services parameter value based on the currently logged in user

    - by Robert Iver
    Here's my situation. I have a Microsoft Reporting Services report that takes a salesperson's name as a parameter and shows them their sales across their territories, blah blah blah. But salesperson A should not be able to choose and view salesperson B's data. So my thought was to get the currently logged in user from Reporting Services, and then use that to populate the "salesperson" parameter. Is there a way to get the currently logged in user through some hidden RS interface, or is there some other way of accomplishing my goal that I'm just not seeing? Any help would be GREAT, as the higher-ups aren't too happy with my (apparent) lack of security right now.
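
    For what it's worth, Reporting Services does expose the caller through the built-in User!UserID global, which can be used as the default value of a hidden parameter or in a dataset filter. If the report ends up hosted in an ASP.NET page through the ReportViewer control instead, one hedged sketch is to push the page's identity into the parameter from code-behind; "Salesperson", "SalesReportPage" and ReportViewer1 are illustrative names, not taken from the question:

        // Sketch: set a report parameter from the identity of the logged-in ASP.NET user.
        using System;
        using Microsoft.Reporting.WebForms;

        public partial class SalesReportPage : System.Web.UI.Page
        {
            protected void Page_Load(object sender, EventArgs e)
            {
                // The report itself can also read the caller directly via the User!UserID global.
                string currentUser = Page.User.Identity.Name;   // authenticated caller
                ReportViewer1.ServerReport.SetParameters(new[]
                {
                    new ReportParameter("Salesperson", currentUser)
                });
            }
        }

    Either way, the parameter would normally be hidden so the salesperson cannot simply pick somebody else's name.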

    Read the article

  • Multiple windows services in a single project = mystery

    - by Remoh
    I'm having a bizarre issue that I haven't seen before and I'm thinking it MUST be something simple that I'm not seeing in my code. I have a project with two Windows services defined. One I've called DataSyncService, the other SubscriptionService. Both are added to the same project installer. Both use a timer control from System.Timers. If I start both services together, they seem to work fine. The timers elapse at the appropriate time and everything looks okay. However, if I start either service individually, leaving the other stopped, everything goes haywire. The timer elapses constantly and on the wrong service. In other words, if I start the DataSyncService, the SubscriptionService timer elapses over and over. ...which is obviously strange. The setup is similar to what I've done in the past so I'm really stumped. I even tried deleting both services and starting over but it doesn't seem to make a difference. At this point, I'm thinking I've made a simple error in the way I'm defining the services and my brain just won't let me see it. It must be creating some sort of threading issue that causes one service to race when the other is stopped. Here's the code.

    From Program.cs:

        static void Main()
        {
            ServiceBase[] ServicesToRun;
            ServicesToRun = new ServiceBase[]
            {
                new DataSyncService(),
                new SubscriptionService()
            };
            ServiceBase.Run(ServicesToRun);
        }

    From ProjectInstaller.designer.cs:

        private void InitializeComponent()
        {
            this.serviceProcessInstaller1 = new System.ServiceProcess.ServiceProcessInstaller();
            this.dataSyncInstaller = new System.ServiceProcess.ServiceInstaller();
            this.subscriptionInstaller = new System.ServiceProcess.ServiceInstaller();
            //
            // serviceProcessInstaller1
            //
            this.serviceProcessInstaller1.Account = System.ServiceProcess.ServiceAccount.LocalSystem;
            this.serviceProcessInstaller1.Password = null;
            this.serviceProcessInstaller1.Username = null;
            //
            // dataSyncInstaller
            //
            this.dataSyncInstaller.DisplayName = "Data Sync Service";
            this.dataSyncInstaller.ServiceName = "DataSyncService";
            this.dataSyncInstaller.StartType = System.ServiceProcess.ServiceStartMode.Automatic;
            //
            // subscriptionInstaller
            //
            this.subscriptionInstaller.DisplayName = "Subscription Service";
            this.subscriptionInstaller.ServiceName = "SubscriptionService";
            this.subscriptionInstaller.StartType = System.ServiceProcess.ServiceStartMode.Automatic;
            //
            // ProjectInstaller
            //
            this.Installers.AddRange(new System.Configuration.Install.Installer[] {
                this.serviceProcessInstaller1,
                this.dataSyncInstaller,
                this.subscriptionInstaller});
        }

        private System.ServiceProcess.ServiceProcessInstaller serviceProcessInstaller1;
        private System.ServiceProcess.ServiceInstaller dataSyncInstaller;
        private System.ServiceProcess.ServiceInstaller subscriptionInstaller;

    From DataSyncService.cs:

        public static readonly int _defaultInterval = 43200000;
        //log4net.ILog log;

        public DataSyncService()
        {
            InitializeComponent();
            //log = LogFactory.Instance.GetLogger(this);
        }

        protected override void OnStart(string[] args)
        {
            timer1.Interval = _defaultInterval; //GetInterval();
            timer1.Enabled = true;
            EventLog.WriteEntry("MyProj", "Data Sync Service Started", EventLogEntryType.Information);
            //log.Info("Data Sync Service Started");
        }

        private void timer1_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
        {
            EventLog.WriteEntry("MyProj", "Data Sync Timer Elapsed.", EventLogEntryType.Information);
        }

        private void InitializeComponent()
        {
            this.timer1 = new System.Timers.Timer();
            ((System.ComponentModel.ISupportInitialize)(this.timer1)).BeginInit();
            //
            // timer1
            //
            this.timer1.Enabled = true;
            this.timer1.Elapsed += new System.Timers.ElapsedEventHandler(this.timer1_Elapsed);
            //
            // DataSyncService
            //
            this.ServiceName = "DataSyncService";
            ((System.ComponentModel.ISupportInitialize)(this.timer1)).EndInit();
        }

    From SubscriptionService.cs:

        public static readonly int _defaultInterval = 300000;
        //log4net.ILog log;

        public SubscriptionService()
        {
            InitializeComponent();
        }

        protected override void OnStart(string[] args)
        {
            timer1.Interval = _defaultInterval; //GetInterval();
            timer1.Enabled = true;
            EventLog.WriteEntry("MyProj", "Subscription Service Started", EventLogEntryType.Information);
            //log.Info("Subscription Service Started");
        }

        private void timer1_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
        {
            EventLog.WriteEntry("MyProj", "Subscription Service Time Elapsed", EventLogEntryType.Information);
        }

        private void InitializeComponent() // in designer
        {
            this.timer1 = new System.Timers.Timer();
            ((System.ComponentModel.ISupportInitialize)(this.timer1)).BeginInit();
            //
            // timer1
            //
            this.timer1.Enabled = true;
            this.timer1.Elapsed += new System.Timers.ElapsedEventHandler(this.timer1_Elapsed);
            //
            // SubscriptionService
            //
            this.ServiceName = "SubscriptionService";
            ((System.ComponentModel.ISupportInitialize)(this.timer1)).EndInit();
        }

    Again, the problem is that the timer1_Elapsed handler runs constantly when only one of the services is started. And it's the handler on the OPPOSITE service. Anybody see anything?
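
    One detail in the designer code above may be worth a look (offered as a hedged observation rather than a confirmed diagnosis): both InitializeComponent methods set this.timer1.Enabled = true, and Main constructs both service objects in the same process even when only one of them is started, so the non-started service's timer runs too. Since its OnStart never sets an interval, System.Timers.Timer falls back to its 100 ms default and fires constantly. A minimal sketch of keeping the timer inert until OnStart (DataSyncService shown; SubscriptionService would mirror it):

        // Sketch: leave the timer disabled at construction time and only start it once
        // OnStart has set the interval; stop it again in OnStop.
        private void InitializeComponent()
        {
            this.timer1 = new System.Timers.Timer();
            this.timer1.Enabled = false;   // do not tick until the service is actually started
            this.timer1.Elapsed += new System.Timers.ElapsedEventHandler(this.timer1_Elapsed);
            this.ServiceName = "DataSyncService";
        }

        protected override void OnStart(string[] args)
        {
            timer1.Interval = _defaultInterval;   // 43200000 ms, as above
            timer1.Start();                       // equivalent to Enabled = true
        }

        protected override void OnStop()
        {
            timer1.Stop();
        }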

    Read the article

  • Integrating with Fusion Applications using SOAP web services and REST APIs (Part 1 of 2) by Arvind Srinivasamoorthy

    - by JuergenKress
    Fusion Applications provides several types of interfaces to facilitate integration with other applications within the enterprise and on the cloud. As one of the key integration interfaces, Fusion Applications (FA) supports SOAP-services-based integration, both inbound and outbound. At this point FA doesn’t provide REST APIs, but they are planned for a future release. It is, however, possible to invoke external REST APIs from FA, which we will discuss. Oracle continues to invest in improving both SOAP- and REST-based connectivity. The content in this blog is based on features that were available at the time of writing. In this two-part blog, I will cover the following topics briefly:

    - Invoking FA SOAP web services from external applications
    - Identifying the FA SOAP web service to be invoked
    - Sample invocation from an external application
    - Techniques to invoke FA services from an ADF application
    - Invoking external SOAP web services from FA (covered in Part 2)
    - Invoking external REST APIs from FA (covered in Part 2)

    I’ll touch upon some basics, so that you can quickly build a few SOAP/REST interactions with FA. If you do not already have access to an FA instance (on-premise or SaaS), you can request a free 30-day trial of the Oracle Sales Cloud at http://cloud.oracle.com

    1. Invoking FA SOAP web services from external applications

    There are two main types of services that FA exposes:

    - ADF Services - These services allow you to perform CRUD operations on Fusion business objects, for example the Sales Party Service, Opportunity Service, etc. Using these services you can typically perform operations such as get, find, create, delete, update, etc. on FA objects. These services are typically useful for UI-driven integrations, such as looking up FA information from external application UIs or using third-party interfaces to create/update data in FA. They are also used in non-UI-driven integration use cases such as the initial upload of business or setup data, synchronizing data with external systems, etc.
    - Composite Services - These services involve more logic than CRUD, often involving human workflows, rules, etc. These services perform a business function, such as the Get Orchestration Order Service, and are used when building larger process-based integrations with external systems. These services are usually asynchronous in nature and are not typically used for UI integration patterns.

    1a. Identifying the FA SOAP web service to be invoked

    All FA web service metadata is available through an OER instance (Oracle Enterprise Repository), which is publicly available via http://fusionappsoer.oracle.com. This is the starting point for you to discover the services that you are going to work with. You do not need to own an FA account to browse the services using the above UI. You can use the search area on the left to narrow down your search to what you are looking for. For example, you can filter by type (ADF Services or Composite) and narrow your search to a specific FA version, Product Family, etc. Read the complete article here. SOA & BPM Partner Community: for regular information on Oracle SOA Suite become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center.

    Read the article

  • Oracle SALT 11gR1

    - by Maurice Gamanho
    With the 11gR1 release, SALT now supports Web services transactions (WS-TX). In a nutshell, the SALT 11gR1 Web services gateway (GWWS) now supports bi-directional transactional interoperability. What this means is that Tuxedo application services can now be invoked in a global transaction context using Web services. This feature is natural to a product like Tuxedo, given its history as a transaction processing monitor and its significant contribution to the X/Open (now the Open Group) XA specification. We implemented Web Services Coordination (WS-COOR) and Web Services Atomic Transaction (WS-AT). We also tested and certified with WebLogic Server 11gR1 and Microsoft WCF 3.5 (.NET Framework). For more information, please visit the Tuxedo OTN home page, where you can download a document and samples that will help you get started with WS-TX in Tuxedo. You can check the product documentation here.

    Read the article

  • Java based webservices

    - by java_mouse
    We are working on a big Java project and, as a second phase in this project, we will have to develop some web services that clients can call to get or update the data in our database. Though I have been a Java programmer for a while, I have never worked on web services. I develop EJBs, the data services layer, etc., but have not worked with web services yet. What are the current standards for developing web services on the Java platform? What is the best and recommended way to develop web services? Any input/link will be appreciated.

    Read the article

  • Connect to running web role on Azure using Remote Desktop Connection and VS2012

    - by Magnus Karlsson
    We want to be able to collect IntelliTrace information from our running app and also use Remote Desktop to connect to the IIS and look around (probably debugging).

    1. Create certificate

    1.1 Right-click the cloud project (marked in red) and select “Configure remote desktop”.
    1.2 In the drop-down list of certificates, choose <create> at the bottom.
    1.3 Follow the instructions; you can set it up with default values.
    1.4 When done, choose the certificate and click “Copy to File…” as seen in the left of the picture above.
    1.5 Save the file with any name you want. Now we will save it to local storage to be able to import it into our solution through the Azure configuration manager in step 3.

    2. Save certificate to local storage

    Now we need to attach it to our local certificate storage to be able to reach it from our configuration manager in Visual Studio. Microsoft provides the following steps for doing this: http://support.microsoft.com/kb/232137

    In order to view the Certificates store on the local computer, perform the following steps:

    - Click Start, and then click Run.
    - Type "MMC.EXE" (without the quotation marks) and click OK.
    - Click Console in the new MMC you created, and then click Add/Remove Snap-in.
    - In the new window, click Add.
    - Highlight the Certificates snap-in, and then click Add.
    - Choose the Computer option and click Next.
    - Select Local Computer on the next screen, and then click OK.
    - Click Close, and then click OK.

    You have now added the Certificates snap-in, which will allow you to work with any certificates in your computer's certificate store. You may want to save this MMC for later use.

    Now that you have access to the Certificates snap-in, you can import the server certificate into your computer's certificate store by following these steps:

    - Open the Certificates (Local Computer) snap-in and navigate to Personal, and then Certificates. Note: Certificates may not be listed. If they are not, that is because there are no certificates installed.
    - Right-click Certificates (or Personal if that option does not exist).
    - Choose All Tasks, and then click Import.
    - When the wizard starts, click Next.
    - Browse to the PFX file you created containing your server certificate and private key. Click Next.
    - Enter the password you gave the PFX file when you created it. Be sure the Mark the key as exportable option is selected if you want to be able to export the key pair again from this computer. As an added security measure, you may want to leave this option unchecked to ensure that no one can make a backup of your private key.
    - Click Next, and then choose the Certificate Store you want to save the certificate to. You should select Personal because it is a Web server certificate. If you included the certificates in the certification hierarchy, it will also be added to this store.
    - Click Next. You should see a summary screen showing what the wizard is about to do. If this information is correct, click Finish.

    You will now see the server certificate for your Web server in the list of Personal Certificates. It will be denoted by the common name of the server (found in the subject section of the certificate).

    Now that you have the certificate backup imported into the certificate store, you can enable Internet Information Services 5.0 to use that certificate (and the corresponding private key). To do this, perform the following steps:

    - Open the Internet Services Manager (under Administrative Tools) and navigate to the Web site you want to enable secure communications (SSL/TLS) on.
    - Right-click the site and click Properties. You should now see the properties screen for the Web site.
    - Click the Directory Security tab.
    - Under the Secure Communications section, click Server Certificate. This will start the Web Site Certificate Wizard. Click Next.
    - Choose the Assign an existing certificate option and click Next.
    - You will now see a screen showing the contents of your computer's personal certificate store. Highlight your Web server certificate (denoted by the common name), and then click Next.
    - You will now see a summary screen showing you all the details about the certificate you are installing. Be sure that this information is correct or you may have problems using SSL or TLS in HTTP communications.
    - Click Next, and then click OK to exit the wizard.

    You should now have an SSL/TLS-enabled Web server. Be sure to protect your PFX files from any unwanted personnel.

    Image of a typical MMC.EXE with the certificates up.

    3. Import the certificate into your Visual Studio project

    3.1 Now right-click your equivalent to the MvcWebRole1 (as seen in the first picture under the red oval) and choose Properties.
    3.2 Choose Certificates. Click the ellipsis to the right of the “Thumbprint” field and you should be able to select your newly created certificate here. After selecting it, save the file.

    4. Upload the certificate to your Azure subscription

    4.1 Go to the Azure management portal, click the services menu icon to the left and choose the service. Click Upload in the bottom menu.

    5. Connect to the server

    Since I tried to use account settings (we have to use another name), we have to set up a new name for the connection. No biggie.

    5.1 Go to the Azure management portal, select your service and, in the bottom menu, choose “REMOTE”. This will display the configuration for the remote connection. It will actually change your ServiceConfiguration.cscfg file. After you change it here it might be good to choose download and replace the one in your project. Set a name that is not your Windows Azure account name and not Administrator.

    5.2 Go to Visual Studio and open Server Explorer. Choose as selected in the picture below and click “Connect using remote desktop”.

    5.3 You will now be able to log in with the name and password set up in step 5.1, and voila! Windows Server 2012, IIS and other nice stuff!

    To do this I’ve been using http://msdn.microsoft.com/en-us/library/windowsazure/ff683671.aspx where you can collect some of this information and more.
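
    As a side note to step 3 (the Thumbprint field), below is a small C# sketch of how code running later - a role, a diagnostic tool, whatever - can pick the same certificate back out of the LocalMachine\My store once it has been imported; the thumbprint value is a placeholder, not a real one:

        // Sketch: look up an installed certificate by thumbprint (placeholder value below).
        using System;
        using System.Security.Cryptography.X509Certificates;

        class CertLookup
        {
            static void Main()
            {
                var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
                store.Open(OpenFlags.ReadOnly);
                try
                {
                    X509Certificate2Collection matches = store.Certificates.Find(
                        X509FindType.FindByThumbprint,
                        "0123456789ABCDEF0123456789ABCDEF01234567",   // hypothetical thumbprint
                        false);                                        // include expired/untrusted certs

                    if (matches.Count > 0)
                    {
                        Console.WriteLine("Found: " + matches[0].Subject);
                    }
                }
                finally
                {
                    store.Close();
                }
            }
        }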

    Read the article

  • IP addresses for Windows Azure servers seem to be from the US, when the servers are supposed to be located in Europe

    - by paradroid
    I have a couple of test servers on Windows Azure. One is in the North Europe location and the other is in West Europe. I have yet to get around to testing which location offers better connection speeds from where I am (London, UK). The North Europe Azure datacentre is apparently in Ireland and the West Europe datacentre is in the Netherlands, which is weird in itself, I think. But what I am confused about is that the IP addresses are both 168.63.xxx.xxx. A GeoIP lookup says that they are both located in the US, and a traceroute from London to the addresses gets to the US before failing to respond to pings. What's going on?

    Read the article

  • How to generate a private/public key pair to use for a Linux server on Windows Azure?

    - by MainMa
    Following the Windows Azure documentation, I generated a pair of private/public keys on an Ubuntu machine using the exact command as given:

        openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout myPrivateKey.key -out myCert.pem

    When I open the private key in puttygen, the following error is displayed:

        Couldn't load private key (unrecognised key type)

    The private key generated by openssl looks correct:

        -----BEGIN PRIVATE KEY-----
        MIIEvQIBADANBgkqhkiG6w0xAQEFAASCBKcwggSjAgEsAoIBAQC6OEZ5ULe6F6u2
        Cybhqqfqqh2ao9sd2tpqB+HGIoMMHrmnD3YegRgZJIddTQaWKdwaKrYul21YNt5y
        ...
        P0RyfL9kDnX/XmIOM38FOoucGvO+Zozsbmgmvw6AUhE0sPhkZnlaodAU1OnfaWJz
        KpBxkXulBaCJnC8w29dGKng=
        -----END PRIVATE KEY-----

    Note that the comments on the Azure documentation (the same link as above) report that the pair should be generated using OpenSSL for Windows instead of openssl on Linux. This doesn't help, since the same error appears for a private key generated by OpenSSL for Windows. What am I doing wrong?
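
    For what it's worth, the -----BEGIN PRIVATE KEY----- header indicates the key was written in PKCS#8 format, and older versions of PuTTYgen only recognise the traditional "BEGIN RSA PRIVATE KEY" PEM layout, so converting the file before importing may be all that's needed (a suggestion, assuming the goal is to load the key into PuTTYgen; file names follow the question):

        openssl rsa -in myPrivateKey.key -out myPrivateKey.rsa.key

    PuTTYgen should then be able to import myPrivateKey.rsa.key via Conversions > Import key.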

    Read the article

  • SQL Azure Federation - how much data before performance benefits?

    - by Donald Hughes
    To avoid premature optimization, I don't want to implement SQL Azure's Federation too early. Is there a rule of thumb for how much data a table would need to have before seeing performance benefits from sharding? I know there won't be a precise answer as there are too many variables to consider, especially with much of SQL Azure's resources being hidden/unknown. To put it into several, more concrete examples, would Federation improve performance in any of the below table scenarios:

    - 100,000 rows (~ 200 MB)
    - 1,000,000 rows (~ 2 GB)
    - 10,000,000 rows (~ 20 GB)
    - 100,000,000 rows (~ 200 GB)

    For the sake of elaboration, we can assume this is the largest table that would be federated. It consists of order details and is joined to an orders table with a 'customer_id' foreign key, which would be the distribution key. This is a fairly standard multi-tenant, CRUD order entry system, with a typical assortment of reporting needs (customer order totals by day/month/year, etc).

    Read the article

  • Azure Linux Virtual Machines Price per Hour, Computation or Running?

    - by Arjun Bajaj
    First of all, I couldn't find a StackExchange site on cloud computing. I think this is the most appropriate site, because some of you might be using Azure. So I just wanted to know: the Windows Azure Pricing Page shows the Linux virtual machine price as $0.013/hr for an extra small VM. The monthly price comes up to about $10. Is this price charged per hour of computation done on the VM, or per hour that the VM is running? And if I shut down the VM, will I be charged anything?

    Read the article

  • How to integrate Windows Azure, Spring.NET and NHibernate?

    - by paologios
    Hi, we have an ASP.NET web application which makes use of NHibernate and Spring.NET (to do the session and transaction management stuff). Now we want to port parts of it to a Windows Azure application without making lots of changes to the components used. I already found this article on Stack Overflow, so I hope to get NHibernate running on Azure. My question: does anybody have experience running Spring.NET on Azure (with or without NHibernate)?

    Read the article

  • Amazon EC2 prices for Windows Instance?

    - by Abhishek Gupta
    Hello guys, I want to ask some Amazon cloud technology experts: is it cost-effective to deploy our web application on the Amazon cloud compared to a normal server? Currently there are micro, small, large and other types of instances available. If we start from a micro instance and then realize that our app needs more CPU cycles and RAM, how can we dynamically move to the next, more powerful instance automatically at runtime? What is the approximate minimum yearly cost for a single EC2 Windows small instance? I want to deploy a simple online quiz application (ASP.NET based) on the Amazon cloud which can have a maximum of 500 users at a time. Please advise, as I am very new to the cloud. Should I go for Azure or Amazon?

    Read the article

  • Structure question over Local/Remote Services, Broadcast Receivers, and Intent Services

    - by Ryan
    I'm writing an Android app that has a standard activity, but also needs to monitor incoming/outgoing calls and texts at all times. In addition, the app needs to notify users of information once a day without having the activity open. The information it notifies users of is stored in a database, so communication with the activity is not necessary. I've been researching for a week and still can't decide how to go about doing this. My instinct tells me I need a remote service that has a constantly running broadcast receiver, but every remote service example I see is overly complicated. Could anyone help me better understand what steps I need to take? Thanks in advance.

    Read the article

  • DomainDataSource DataPager with silverlight 3 DataGrid & .Net RIA Services

    - by Dennis Ward
    I have a simple DataGrid example with Silverlight 3, and am populating it with .NET RIA Services using a DomainDataSource along with a DataPager declaratively (nothing in the code-behind), and am experiencing this problem: the LoadSize is 30 and the PageSize is 15, and when the page is loaded, the 1st and 2nd pages appear correctly, but when I go beyond the 2nd page, nothing shows up in the grid. This used to work in the Silverlight 3 beta with the Mix 2009 preview of .NET RIA Services. I've got a really simple example and have verified that the service on the web project gets called to load a new batch, but the grid doesn't show any data. Can anyone shed any light as to why the grid displays data only for the initial load of data and not subsequent batches from the pager? Here's my XAML:

        <riaControls:DomainDataSource x:Name="ArtistSource"
                                      QueryName="GetArtist"
                                      AutoLoad="True"
                                      LoadSize="30"
                                      PageSize="15">
            <riaControls:DomainDataSource.DomainContext>
                <domain:AdminContext />
            </riaControls:DomainDataSource.DomainContext>
        </riaControls:DomainDataSource>

        <data:DataGrid Grid.Row="1" x:Name="ArtistDataGrid"
                       ItemsSource="{Binding Data, ElementName=ArtistSource}">
        </data:DataGrid>

        <StackPanel Grid.Row="2">
            <data:DataPager Source="{Binding Data, ElementName=ArtistSource}" />
        </StackPanel>
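
    One thing often worth checking with DomainDataSource paging (offered as a hedged suggestion, not a confirmed diagnosis): server-side paging translates into Skip/Take calls against the query the domain service returns, and that query needs a stable ordering to page reliably. A minimal sketch of what the GetArtist query might look like with an explicit OrderBy; DataContext and ArtistID are assumptions, not taken from the question:

        // Sketch of the server-side domain service query; the OrderBy is the important part,
        // since DomainDataSource paging issues Skip()/Take() against this IQueryable.
        public IQueryable<Artist> GetArtist()
        {
            return this.DataContext.Artists.OrderBy(a => a.ArtistID);
        }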

    Read the article

  • RIA Services: Inserting multiple presentation-model objects

    - by nlawalker
    I'm sharing data via RIA Services using a presentation model on top of LINQ to SQL classes. On the Silverlight client, I created a couple of new entities (album and artist), associated them with each other (by either adding the album to the artist's album collection, or setting the Artist property on the album - either one works), added them to the context, and submitted changes. On the server, I get two separate Insert calls - one for the album and one for the artist. These entities are new, so their ID values are both set to the default int value (0 - keep in mind that depending on my DB, this could be a valid ID in the DB), because as far as I know you don't set IDs for new entities on the client. This all would work fine if I was transferring the LINQ to SQL classes via my RIA services, because even though the Album insert includes the Artist and the Artist insert includes the Album, both are entities and the L2S context recognizes them. However, with my custom presentation model objects, I need to convert them back to the LINQ to SQL classes, maintaining the associations in the process, so they can be added to the L2S context. Put simply, as far as I can tell, this is impossible. Each entity gets its own Insert call, but there's no way you can just insert the one entity, because without IDs the associations are lost. If the database used GUID identifiers it would be a different story, because I could set those on the client. Is this possible, or should I be pursuing another design?

    Read the article

  • SQL Server 2005 Reporting Services and the Report Viewer

    - by Kendra
    I am having an issue embedding my report into an aspx page. Here's my setup:

    - 1 server running SQL Server 2005 and SQL Server 2005 Reporting Services
    - 1 workstation running XP and VS 2005

    The server is not on a domain. Reporting Services is a default installation. I have one report called TestMe in a folder called TestReports using a shared data source. If I view the report in Report Manager, it renders fine. If I view the report using the http://myserver/reportserver URL, it renders fine. If I view the report using http://myserver/reportserver?/TestReports/TestMe, it renders fine. If I try to view the report using http://myserver/reportserver/TestReports/TestMe, it just goes to the folder navigation page of the home directory.

    My web application is impersonating somebody specific to get around the server not being on a domain. When I call the report from the ReportViewer using http://myserver/reportserver as the server and /TestReports/TestMe as the path, I get this error:

        For security reasons DTD is prohibited in this XML document. To enable DTD processing set the ProhibitDtd property on XmlReaderSettings to false and pass the settings into XmlReader.Create method.

    When I change the server to http://myserver/reportserver? I get this error when I run the report:

        Client found response content type of '', but expected 'text/xml'. The request failed with an empty response.

    I have been searching for a while and haven't found anything that fixes my issue. Please let me know if there is more information needed. Thanks in advance, Kendra
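
    For reference, a minimal sketch of the server-mode settings the ReportViewer control expects when driven from code, using the names from the question (the impersonation/credentials piece is deliberately left out). The "DTD is prohibited" response is often a sign the client received an HTML page rather than the web service it expected, so keeping the server URL at /reportserver and passing the path separately, as below, is the split the control is designed for:

        // Sketch: ReportViewer (Microsoft.Reporting.WebForms) in remote processing mode.
        ReportViewer1.ProcessingMode = ProcessingMode.Remote;
        ReportViewer1.ServerReport.ReportServerUrl = new Uri("http://myserver/reportserver");
        ReportViewer1.ServerReport.ReportPath = "/TestReports/TestMe";
        ReportViewer1.ServerReport.Refresh();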

    Read the article

  • Using Client Application Services in windows forms not working

    - by Nickson
    I am trying to implement ASP.NET membership, profile and role-based security in a Windows application by configuring Client Application Services for my Windows Forms application. I have followed both these articles, http://www.dotnetbips.com/articles/e863aa3c-0dd6-468d-bd35-120a334c5030.aspx and http://msdn.microsoft.com/en-us/library/bb546195.aspx, step by step, but for some reason I can't get the authentication working. I have a deployed intranet ASP.NET website which is already using an ASP.NET membership database for authentication, and I want to use that same database for authentication in my Windows Forms application. The site URL is http://myServer_Name:My_Port and I am specifying that URL as both the authentication service location and the roles service location in the Windows application's services property tab. But in the Windows application login form, when I say

        Dim msg As String = "Welcome "
        If Not Membership.ValidateUser(UsernameTextBox.Text, PasswordTextBox.Text) Then
            MessageBox.Show("Invalid User ID or Password!")
        Else
            msg = msg + UsernameTextBox.Text
        End If

    I get my "Invalid User ID or Password!" message even when I supply a valid user name with the corresponding password. I am able to log in with the same credentials from the ASP.NET site. How can I test whether the authentication service location is being reached from the Windows application? Or what other information can I provide here so that someone is able to help me get this working?
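
    One thing worth double-checking on the server side (a sketch, assuming a .NET 3.5-style configuration; the element names come from the standard ASP.NET client application services setup, not from the question): the site at the authentication service location has to explicitly expose the authentication and role services in web.config, otherwise Membership.ValidateUser from the client fails even with valid credentials:

        <system.web.extensions>
          <scripting>
            <webServices>
              <authenticationService enabled="true" requireSSL="false" />
              <roleService enabled="true" />
            </webServices>
          </scripting>
        </system.web.extensions>

    It can also help to watch the traffic with Fiddler: the client should be posting to Authentication_JSON_AppService.axd under the service URL, which answers the question of whether the service location is being reached at all.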

    Read the article

  • How to Store State in Silverlight WCF RIA Services

    - by peter
    Hi all, I am developing a Silverlight 3 application using WCF RIA Services. I am using the AuthenticationBase class to handle my authentication. As I understand it, under the hood this uses the ASP.NET authentication libraries. When I log into the site, the authentication service handles login state, so that if I close the site and open it straight away I am still logged in according to the server. When the webpage is refreshed or closed and reloaded, I can call the method WebContextBase.Current.Authentication.LoadUser() and it goes back to the authentication service (running on the server) and figures out whether I am still logged into the site. If a timeout has occurred, the answer will be no. If that is the case I can show a login dialog. The problem I want to solve is that the authentication service consumes the password, and there is no way I can ever retrieve that password again. If the user logs into the site I want to store the password on the server, and return a token to the client side to match up with that password. I have some other services on the server side that need that password. So where should I store that password on the server? How can that be done? How does the WCF authentication store state?

    Read the article

  • Windsor IHandlerSelector in RIA Services Visual Studio 2010 Beta2

    - by Savvas Sopiadis
    Hi everybody! I want to implement multi-tenancy using Windsor and I don't know how to handle this situation: I successfully used this technique in plain ASP.NET MVC projects and thought incorporating it in a RIA Services project would be similar. So I used IHandlerSelector, registered some components and wrote an ASP.NET MVC view to verify it works in a plain ASP.NET MVC environment. And it did! The next step was to create a DomainService which gets an IRepository injected in the constructor. This service is hosted in the ASP.NET MVC application. And it actually... works: I can get data out of it into a Silverlight application. Sample snippet:

        public OrganizationDomainService(IRepository<Culture> cultureRepository)
        {
            this.cultureRepository = cultureRepository;
        }

    The last step is to see if it works multi-tenant-like: it does not! The weird thing is this: using some lines of code and writing debug messages to a log file, I verified that the correct handler is selected! BUT this handler seems not to be injected into the DomainService. I ALWAYS get the first handler (that's the logic in my SelectHandler). Can anybody verify this behavior? Is injection not working in RIA Services? Or am I missing something basic? Development environment: Visual Studio 2010 Beta2. Thanks in advance

    Read the article
