Search Results

Search found 22633 results on 906 pages for 'service accounts'.

  • Which web services should be integrated?

    - by Adam Matan
    Mail + social networking? One identity for all sites? Integration between all social networks and IM services? Which web services should be integrated in the future, and why? Edit: Clarification: by "integrating", I mean that two or more services should be seamlessly connected, and that the connection would benefit the user. For example, I would really like to have an IM application that supports many accounts on many IM providers (and, IMHO, Pidgin isn't ready yet).

  • How do I use Openfire with OpenLDAP and ldap.clientSideSorting?

    - by adietrich
    The system log on my Openfire + OpenLDAP installation is getting flooded with this message:

    slap_global_control: unrecognized control: 1.2.840.113556.1.4.473

    This means that Openfire is asking OpenLDAP to do server-side sorting, which OpenLDAP doesn't support. The Openfire LDAP Guide advises setting the property ldap.clientSideSorting to true in this case. Unfortunately, if I do that, Openfire no longer finds any user accounts in LDAP. How do I make this work?

  • Combining / deduplicating contacts in Windows 8 People app

    - by Soo Wei Tan
    Is there a way of combining or deduplicating contacts in the Windows 8 People app? For some reason I have double entries for many contacts (with identical names), and the app isn't smart enough to merge them. I have the following accounts connected: Microsoft (i.e. Hotmail), Google (including Contacts), Facebook, LinkedIn, and Twitter. The contacts in question have entries from Google Contacts as well as Facebook.

  • Multiple "from" addresses for single Exchange account in Outlook 2007

    - by Jørn Schou-Rode
    I have a single Exchange account with multiple aliases (e-mail addresses) at which it receives incoming mail. Using rules, it is possible to have the incoming messages sorted into folders depending on the address they were sent to. When composing and sending messages from Outlook, the primary address of the Exchange account is used in the "From" header. Without adding additional mail accounts (I really only have one), is it possible to teach Outlook about the alias addresses, making them available as "From" addresses when composing new messages?

  • Customize a WCF RIA Services Endpoint

    - by Andrew Garrison
    Is it possible to customize the parameters of a WCF RIA Services endpoint? Specifically, I would like to create a custom binding for the endpoint and increase the maxReceivedMessageSize to allow sending the contents of a file that is a few megabytes in size. I've tried meddling in the web.config, but I'm getting the following error:

    [InvalidOperationException]: The contract name MyNamespace.MyService could not be found in the list of contracts implemented by the service MyNamespace.MyService

    web.config:

        <system.serviceModel>
          <bindings>
            <customBinding>
              <binding name="CustomBinaryHttpBinding">
                <binaryMessageEncoding />
                <httpTransport maxReceivedMessageSize="2147483647" maxBufferSize="2147483647" />
              </binding>
            </customBinding>
          </bindings>
          <services>
            <service name="MyNamespace.MyService">
              <endpoint address="" binding="wsHttpBinding" contract="MyNamespace.MyService" />
              <endpoint address="/binary" binding="customBinding" bindingConfiguration="CustomBinaryHttpBinding" contract="MyNamespace.MyService" />
            </service>
          </services>
          <serviceHostingEnvironment aspNetCompatibilityEnabled="true" multipleSiteBindingsEnabled="true" />
        </system.serviceModel>
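
    As an aside, here is a sketch of what these quota settings look like when applied in code on an ordinary self-hosted (non-RIA) WCF service. It only illustrates what maxReceivedMessageSize and maxBufferSize control; the MyService/IMyService types are hypothetical placeholders, and this is not the RIA-Services-specific fix:

        using System;
        using System.ServiceModel;
        using System.ServiceModel.Channels;

        [ServiceContract]
        public interface IMyService
        {
            [OperationContract]
            void Upload(byte[] contents);
        }

        public class MyService : IMyService
        {
            public void Upload(byte[] contents)
            {
                Console.WriteLine("Received {0} bytes", contents.Length);
            }
        }

        public static class Program
        {
            public static void Main()
            {
                // Same binary-over-HTTP shape as the CustomBinaryHttpBinding above,
                // with the quotas raised from the 64 KB default.
                var binding = new CustomBinding(
                    new BinaryMessageEncodingBindingElement(),
                    new HttpTransportBindingElement
                    {
                        MaxReceivedMessageSize = int.MaxValue,
                        MaxBufferSize = int.MaxValue
                    });

                using (var host = new ServiceHost(typeof(MyService), new Uri("http://localhost:8080/MyService")))
                {
                    host.AddServiceEndpoint(typeof(IMyService), binding, "binary");
                    host.Open();
                    Console.WriteLine("Listening; press Enter to stop.");
                    Console.ReadLine();
                }
            }
        }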

  • How to make Shared Keys .ssh/authorized_keys and sudo work together?

    - by farinspace
    I've set up .ssh/authorized_keys and am able to log in as the new "user" using the public/private key. I have also added "user" to the sudoers list. The problem I have now is that when I try to execute a sudo command, something simple like:

    $ sudo cd /root

    it prompts me for my password, which I enter, but it doesn't work (I am using the private key passphrase I set). Also, I've disabled the user's password using:

    $ passwd -l user

    What am I missing? Somewhere my initial remarks are being misunderstood. I am trying to harden my system; the ultimate goal is to use public/private keys for logins instead of simple password authentication. I've figured out how to set all that up via the authorized_keys file. Additionally, I will ultimately prevent server logins through the root account, but before I do that I need sudo to work for a second user (the user I will log into the system with all the time). For this second user I want to prevent regular password logins and force public/private key logins only; if I don't lock the user via passwd -l user, then without a key I can still get into the server with a regular password. But more importantly, I need to get sudo to work with a public/private key setup for a user whose password has been disabled.

    Edit: OK, I think I've got it (the solution):

    1) I've adjusted /etc/ssh/sshd_config and set PasswordAuthentication no. This prevents SSH password logins (be sure to have a working public/private key setup prior to doing this).

    2) I've adjusted the sudoers list via visudo and added:

        root  ALL=(ALL) ALL
        dimas ALL=(ALL) NOPASSWD: ALL

    3) root is the only user account that will have a password. I am testing with two user accounts, "dimas" and "sherry", which do not have a password set (passwords are blank, passwd -d user).

    The above essentially prevents everyone from logging into the system with passwords (a public/private key must be set up). Additionally, users in the sudoers list have admin abilities. They can also su to different accounts: basically, "dimas" can sudo su sherry, but "dimas" can NOT do su sherry. Similarly, any user NOT in the sudoers list can NOT do su user or sudo su user.

    NOTE: The above works but is considered poor security. Any script that is able to run code as the "dimas" or "sherry" user will be able to execute sudo to gain root access. A bug in ssh that allows remote users to log in despite the settings, a remote code execution hole in something like Firefox, or any other flaw that allows unwanted code to run as the user will now be able to run as root. sudo should always require a password, or you may as well log in as root instead of some other user.

  • How can I protect Chrome user interface?

    - by Renan
    Google Chrome has a feature to switch between Google accounts, which allows several users to have their customized extensions, history and whatnot retrieved instantly. It doesn't, however, prevent someone else from checking anything Google-related. That means anyone with access to your computer can check every Google account that was set up as a user in Chrome. How can I prevent that? I first thought of checking a box with an option to have Chrome request a password upon user change, but that option doesn't seem to exist.

  • HTTP Error 500.19 Internal Server Error

    - by Attilah
    I created and deployed a pretty simple WCF service, but when accessing it from IE I get this:

    HTTP Error 500.19 - Internal Server Error
    Description: The requested page cannot be accessed because the related configuration data for the page is invalid.
    Error Code: 0x80070005
    Notification: BeginRequest
    Module: IIS Web Core
    Requested URL: http://localhost:80/ProductsService/ProductsService.svc
    Physical Path: C:\Users\Administrator\Documents\Visual Studio 2008\Projects\ProductsService\ProductsService\ProductsService.svc
    Logon User: Not yet determined
    Logon Method: Not yet determined
    Handler: Not yet determined
    Config Error: Cannot read configuration file
    Config File: \\?\C:\Users\Administrator\Documents\Visual Studio 2008\Projects\ProductsService\ProductsService\web.config
    Config Source: -1: 0:
    More Information: This error occurs when there is a problem reading the configuration file for the Web server or Web application. In some cases, the event logs may contain more information about what caused this error.

    Here is the content of my web.config file:

        <?xml version="1.0" encoding="utf-8" ?>
        <configuration>
          <configSections>
            <section name="dataConfiguration"
                     type="Microsoft.Practices.EnterpriseLibrary.Data.Configuration.DatabaseSettings, Microsoft.Practices.EnterpriseLibrary.Data, Version=4.1.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
          </configSections>
          <dataConfiguration defaultDatabase="AdventureWorksConnection" />
          <connectionStrings>
            <add name="AdventureWorksConnection"
                 connectionString="Database=AdventureWorks; Server=(localhost)\SQLEXPRESS;Integrated Security=SSPI"
                 providerName="System.Data.SqlClient" />
          </connectionStrings>
          <system.serviceModel>
            <services>
              <service name="Products.ProductsService">
                <endpoint address="" binding="basicHttpBinding" contract="Products.IProductsService" />
              </service>
            </services>
          </system.serviceModel>
        </configuration>

  • Start multiple instances of Firefox; Xephyr rootless mode

    - by Vi
    How can I have multiple independent instances of Mozilla Firefox 3.5 on the same X server, but started from different user accounts (and consequently with different profiles)? I have had limited success only with Xephyr :1 and DISPLAY=:1 /usr/local/bin/firefox, but Xephyr has no equivalent of Cygwin/X's "rootless" mode, so it's not comfortable. The idea is to have one Firefox instance for various "Serious Business" things and another for regular browsing with dozens of add-ons, securely isolated.

  • Throttling outbound API calls generated by a Rails app

    - by Sharpie
    I am not a professional web developer, but I like to wrench on websites as a hobby. Recently, I have been playing with developing a Rails app as a project to help me learn the framework. The goal of my toy app is to harvest data from another service through its API and make it available for me to query using a search function.

    However, the service I want to pull data from imposes a rate limit on the number of API calls that may be executed per minute. I plan on having my app run a daily update which may generate a burst of API calls that far exceeds the limit imposed by the external service. I wish to respect the performance of the external site and so would like to throttle the rate at which my app executes the calls.

    I have done a little bit of searching, and the overwhelming amount of tutorial material and pre-built libraries I have found covers throttling inbound API calls to a web app; I can find little discussion of controlling the flow of outbound calls. Being both an amateur web developer and a Rails newbie, it is entirely possible that I have been executing the wrong searches in the wrong places. Therefore my questions are:

    1. Is there a nice website out there aggregating Rails tutorials that has material related to throttling outbound API requests?
    2. Are there any Ruby gems or other libraries that would help me throttle the requests?

    I have some ideas of how I might go about writing a throttling system using a queue-based worker like DelayedJob or Resque to manage the API calls, but I would rather spend my weekends building the rest of the site if there is a good pre-built solution out there already.

  • Spring upload file

    - by benaissa
    Hi everyone. I'm a novice in Spring. I started to develop an application to upload files, following the official Spring documentation, but I get this error:

    Handler processing failed; nested exception is java.lang.NoClassDefFoundError: org/apache/commons/io/output/DeferredFileOutputStream
        at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:823)
        at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:719)
        at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:644)
        at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:560)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:637)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:143)
        at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:237)
        at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:167)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
        at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:852)
        at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
        at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
        at java.lang.Thread.run(Thread.java:619)
    Caused by: java.lang.NoClassDefFoundError: org/apache/commons/io/output/DeferredFileOutputStream
        at org.apache.commons.fileupload.disk.DiskFileItemFactory.createItem(DiskFileItemFactory.java:191)
        at org.apache.commons.fileupload.FileUploadBase.parseRequest(FileUploadBase.java:350)
        at org.apache.commons.fileupload.servlet.ServletFileUpload.parseRequest(ServletFileUpload.java:126)
        at org.springframework.web.multipart.commons.CommonsMultipartResolver.parseRequest(CommonsMultipartResolver.java:155)
        at org.springframework.web.multipart.commons.CommonsMultipartResolver.resolveMultipart(CommonsMultipartResolver.java:138)
        at org.springframework.web.servlet.DispatcherServlet.checkMultipart(DispatcherServlet.java:907)
        at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:750)

  • Why can't I create e-mails on cPanel?

    - by samantha
    Hello, I have a nasty problem creating e-mail accounts on a cPanel account: the creation form doesn't work (clicking the buttons does nothing), and the source code of the page has a weird http://petlieutenants.com/css/article1.php in it... The quota for the account is normal, and all the other accounts are working fine. What could possibly be causing this? Thanks for your ideas!

  • Architecture with NHibernate and Repositories

    - by Matthew
    I've been reading up on MVC 2 and the recommended patterns; so far I've come to the conclusion (amongst much hair pulling and total confusion) that:

    Model - is just a basic data container
    Repository - provides data access
    Service - provides business logic and acts as an API to the Controller

    The Controller talks to the Service, and the Service talks to the Repository and Model. So for example, if I wanted to display a blog post page with its comments, I might do:

        post = PostService.Get(id);
        comments = PostService.GetComments(post);

    Or would I do:

        post = PostService.Get(id);
        comments = post.Comments;

    If so, where is this being set? From the repository? The problem there being it's not lazy loaded. That's not a huge problem, but then say I wanted to list 10 posts with the first 2 comments for each: I'd have to load the posts, then loop and load the comments, which becomes messy. All of the examples use "InMemory" repositories for testing and say that including db stuff would be out of scope. But this leaves me with many blanks, so for a start, can anyone comment on the above?
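
    A minimal sketch of the shape described above, using hypothetical PostService and IPostRepository types (these names, and the commentsPerPost parameter, are illustrative assumptions rather than anything from MVC 2 or NHibernate). The idea is that the controller only ever talks to the service, and how comments are fetched (eager fetch, separate query, projection) stays a repository concern, so the "10 posts with the first 2 comments" case never leaks into the controller:

        using System.Collections.Generic;

        public class Post
        {
            public int Id { get; set; }
            public string Title { get; set; }
            public IList<Comment> Comments { get; set; }
        }

        public class Comment
        {
            public int Id { get; set; }
            public string Text { get; set; }
        }

        public interface IPostRepository
        {
            Post GetById(int id);                                   // comments loaded by the repository
            IList<Post> GetRecent(int count, int commentsPerPost);  // posts with only the first N comments each
        }

        public class PostService
        {
            private readonly IPostRepository repository;

            public PostService(IPostRepository repository)
            {
                this.repository = repository;
            }

            public Post Get(int id)
            {
                return repository.GetById(id);
            }

            public IList<Post> GetRecentWithComments(int count, int commentsPerPost)
            {
                // Business rules (filtering drafts, trimming comments, etc.) would live here,
                // keeping the controller free of data-access details.
                return repository.GetRecent(count, commentsPerPost);
            }
        }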

  • silverlight 3: long running wcf call triggers 401.1 (access denied)

    - by sympatric greg
    I have a WCF service consumed by a Silverlight 3 control. The Silverlight client uses a basicHttpBinding that is constructed at runtime from the control's initialization parameters like this:

        public static T GetServiceClient<T>(string serviceURL)
        {
            BasicHttpBinding binding = new BasicHttpBinding(
                Application.Current.Host.Source.Scheme.Equals("https", StringComparison.InvariantCultureIgnoreCase)
                    ? BasicHttpSecurityMode.Transport
                    : BasicHttpSecurityMode.None);
            binding.MaxReceivedMessageSize = int.MaxValue;
            binding.MaxBufferSize = int.MaxValue;
            // note: this overrides the security mode chosen in the constructor above
            binding.Security.Mode = BasicHttpSecurityMode.TransportCredentialOnly;
            return (T)Activator.CreateInstance(typeof(T), new object[] { binding, new EndpointAddress(serviceURL) });
        }

    The service implements Windows security. Calls were returning as expected until the result set increased to several thousand rows, at which time HTTP 401.1 errors were received. The service's httpBinding defines a closeTimeout, openTimeout, receiveTimeout and sendTimeout of 10 minutes. If I limit the size of the result set, the call succeeds.

    Additional observations from Fiddler:

    When Method2 is modified to return a smaller result set (and avoid the problem), control initialization consists of 4 calls:

        Service1/Method1 -- result: 401
        Service1/Method1 -- result: 401 (this time the header includes the element "Authorization: Negotiate TlRMTV...")
        Service1/Method1 -- result: 200
        Service1/Method2 -- result: 200 (1.25 seconds)

    When Method2 is configured to return the larger result set, we get:

        Service1/Method1 -- result: 401
        Service1/Method1 -- result: 401 (this time the header includes the element "Authorization: Negotiate TlRMTV...")
        Service1/Method1 -- result: 200
        Service1/Method2 -- result: 401.1 (7.5 seconds)
        Service1/Method2 -- result: 401.1 (15 ms)
        Service1/Method2 -- result: 401.1 (7.5 seconds)

  • Exporting Thunderbird from Win7 32bit to Win7 64bit

    - by Muleskinner
    I am trying to export my Thunderbird mails and mail accounts from my old Windows 7 32-bit machine to a new Windows 7 64-bit one. Both have the same version (16.0.1). In this situation I have previously just copied the Thunderbird folder found under my user profile (I tried both Local and Roaming), but that is not working in this scenario. I also tried to use Thunderbird's export/import function, but it gives me a blank screen.

  • Import package problem in GWT

    - by Krt_Malta
    Hi! I'm developing an app using the GWT Eclipse plug-in. (I'm also using GWT Designer, but I don't think the problem is there.) Previously, when I wanted to communicate with a web service I had created, I produced the "skeleton" classes from the WSDL URL using Sun's wsimport tool. Then I would add the generated classes to a class folder in my Eclipse project. All worked well. However, this doesn't seem to work with GWT. I have these:

        VideoTutorialServiceService service = new VideoTutorialServiceService();
        VideoTutorialService port = service.getVideoTutorialServicePort();

    and I have VideoTutorialServiceService and VideoTutorialService underlined in red, with the error saying:

    videotutorialservice.VideoTutorialServiceService can not be found in source packages. Check the inheritance chain from your module; it may not be inheriting a required module or a module may not be adding its source path entries properly. ....

    I googled about it but got confused. I'm a beginner in GWT. How can I resolve this, please? Thanks and regards, Krt_Malta

  • Authenticated WCF: Getting the Current Security Context

    - by bradhe
    I have the following scenario: I have various users' data stored in my database. This data was entered via a web app. We'd like to expose this data back to the users over a web service so that they can integrate their data with their applications. We would also like to expose some business logic over these services; as such, we do not want to use OData. This is a multi-tenant application, so I only want to expose each user's own data back to them and not other users' data. Likewise, the business logic we expose should be relative to the authenticated user.

    I would like to let the user use an OASIS scheme to authenticate with the web service (WCF already allows for this out of the box, as far as I understand), or perhaps we can issue them certificates to authenticate with. That bit hasn't really been worked out yet. Here is a bit of pseudo-code of how I envision this would work within the service:

        function GetUsersData(id)
            var user := Lookup User based on Username from Auth Context
            var data := Get Data From Repository based on "user"
            return data
        end function

    For the business logic scenario I think it would look something like this:

        function PerformBusinessLogic(someData)
            var user := Lookup User based on Username from Auth Context
            var returnValue := Perform some logic based on supplied data
            return returnValue
        end function

    The hard bit here is getting the current username (or cert info in the cert scenario) that the user authenticated with! Does WCF even enable this scenario? If not, would WSE3 enable it? Thanks.
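
    On the "hard bit": in WCF, the identity established by whatever security is configured on the binding (Windows credentials, username tokens, or client certificates) is available inside an operation through ServiceSecurityContext. A minimal sketch, assuming a hypothetical IUserDataService contract and repository; only the security-context lookup is the real WCF API here:

        using System.ServiceModel;

        [ServiceContract]
        public interface IUserDataService
        {
            [OperationContract]
            string GetUsersData(int id);
        }

        public class UserDataService : IUserDataService
        {
            public string GetUsersData(int id)
            {
                // The caller as authenticated by WCF: a Windows account name, or the
                // certificate identity in the client-certificate scenario.
                // (ServiceSecurityContext.Current is null if the endpoint has no security configured.)
                var identity = ServiceSecurityContext.Current.PrimaryIdentity;
                string userName = identity.Name;

                // Hypothetical repository call: scope the query to the authenticated user
                // so one tenant never sees another tenant's rows.
                // var data = repository.GetDataFor(userName, id);
                return "data for " + userName;
            }
        }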

  • Can Castle.Windsor do automatic resolution of concrete types

    - by Anthony
    We are evaluating IoC containers for C# projects, and both Unity and Castle.Windsor are standing out. One thing that I like about Unity (Ninject and StructureMap also do this) is that types where it is obvious how to construct them do not have to be registered with the IoC container. Is there a way to do this in Castle.Windsor? Am I being fair to Castle.Windsor to say that it does not do this? Is there a design reason to deliberately not do this, or is it an oversight, or is it just not seen as important or useful? I am aware of container.Register(AllTypes... in Windsor, but that's not quite the same thing; it's not entirely automatic, and it's very broad. To illustrate the point, here are two NUnit tests doing the same thing via Unity and Castle.Windsor. The Castle.Windsor one fails:

        namespace SimpleIocDemo
        {
            using NUnit.Framework;
            using Castle.Windsor;
            using Microsoft.Practices.Unity;

            public interface ISomeService
            {
                string DoSomething();
            }

            public class ServiceImplementation : ISomeService
            {
                public string DoSomething() { return "Hello"; }
            }

            public class RootObject
            {
                public ISomeService SomeService { get; private set; }

                public RootObject(ISomeService service)
                {
                    SomeService = service;
                }
            }

            [TestFixture]
            public class IocTests
            {
                [Test]
                public void UnityResolveTest()
                {
                    UnityContainer container = new UnityContainer();
                    container.RegisterType<ISomeService, ServiceImplementation>();
                    // Root object needs no registration in Unity
                    RootObject rootObject = container.Resolve<RootObject>();
                    Assert.AreEqual("Hello", rootObject.SomeService.DoSomething());
                }

                [Test]
                public void WindsorResolveTest()
                {
                    WindsorContainer container = new WindsorContainer();
                    container.AddComponent<ISomeService, ServiceImplementation>();
                    // fails with exception "Castle.MicroKernel.ComponentNotFoundException:
                    // No component for supporting the service SimpleIocDemo.RootObject was found"
                    // I could add
                    // container.AddComponent<RootObject>();
                    // but that approach does not scale
                    RootObject rootObject = container.Resolve<RootObject>();
                    Assert.AreEqual("Hello", rootObject.SomeService.DoSomething());
                }
            }
        }
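
    For reference, a sketch of the manual workaround: the failing test body can be made to pass by registering the concrete root explicitly with Windsor's fluent registration API. This is not an equivalent of Unity's automatic concrete-type resolution, just the explicit registration the question is trying to avoid, written as a drop-in for the body of WindsorResolveTest above:

        // Requires: using Castle.Windsor; using Castle.MicroKernel.Registration;
        WindsorContainer container = new WindsorContainer();
        container.Register(
            Component.For<ISomeService>().ImplementedBy<ServiceImplementation>(),
            Component.For<RootObject>());   // concrete type registered as its own service

        RootObject rootObject = container.Resolve<RootObject>();
        Assert.AreEqual("Hello", rootObject.SomeService.DoSomething());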

  • Password recovery toolkit

    - by John Craggs
    I am using Wise Password Recover 2009 and am basically satisfied with its wide compatibility, but it fails to retrieve the password for one of my Outlook accounts. Is there any other password recovery toolkit that can do the recovery for me?

  • Calling webservice via server causes java.net.MalformedURLException: no protocol

    - by Thomas
    I am writing a web service which parses an XML file. In the client, I read the whole content of the XML file into a String and then pass it to the web service. If I run my web service with main as a Java application (for tests), there is no problem and no error messages. However, when I try to call it via the server, I get the following error:

    java.net.MalformedURLException: no protocol

    I use the same XML file and the same code (without main), and I just cannot figure out what the cause of the error could be. Here is my code:

        DOMParser parser = new DOMParser();
        try {
            parser.setFeature("http://xml.org/sax/features/validation", true);
            parser.setFeature("http://apache.org/xml/features/validation/schema", true);
            parser.setFeature("http://apache.org/xml/features/validation/dynamic", true);
            parser.setErrorHandler(new myErrorHandler());
            parser.parse(new InputSource(new StringReader(xmlFile)));
            document = parser.getDocument();

    xmlFile is constructed in the client like this:

        String myFile = "C:/test.xml";
        File file = new File(myFile);
        String myString = "";
        FileInputStream fis = new FileInputStream(file);
        BufferedInputStream bis = new BufferedInputStream(fis);
        DataInputStream dis = new DataInputStream(bis);
        while (dis.available() != 0) {
            myString = myString + dis.readLine();
        }
        fis.close();
        bis.close();
        dis.close();

    Any suggestions will be appreciated!

  • Checking the configuration of two systems to determine changes

    - by None
    We are standing up a replicant data center at work and need to ensure that the new data center is configured (nearly) identically to the original. The new data center will be differently addressed and named than the original and will have differing user accounts, but all the COTS, patches, and configurations should be the same.

    We would normally ghost the original servers and install those images onto the new machines; however, we have a few problematic pieces of COTS that require we install them outside of an image, due to how they capture the setup of the network during their installation and maintain it within their configuration information (in some cases storing it in various databases). We have tried multiple times, and this piece of COTS cannot be captured within a ghost image unless the destination machine will have an identical network setup (all the same IPs, hostnames, user accounts, etc. across the entire network) as the original. In truth, it is the setup of these special COTS that I want to audit the most, because they are difficult to install and configure in the first place.

    In light of the fact that we can't simply ghost, I'm trying to find a reasonable manner to audit the new data center and check to see if it is set up like the original (some sort of system-wide configuration audit or integrity check). I'm considering using something like Tripwire for Servers to capture the configuration on the source machines and then run an audit on the destination machines. I understand that it will still show some differences due to the minor config changes, but I'm hoping that it will eliminate the majority of the work.

    Here are some of the constraints I'm working under:

    - The data center is comprised of multiple Windows and Linux machines of differing versions (about 20 total)
    - I absolutely cannot ghost or snap any other type of image of these machines ... at least not in their final configuration
    - I want to audit the final configuration to ensure all of the COTS, patches, configurations, etc. are installed and set up properly (as compared to the original data center)
    - I would rather not install any additional tools on these machines ... I'd much rather run it from a standalone machine or off a DVD
    - Price of tools is important but not an impossible burden; however, getting a solution soon is important (I can't take the time to roll my own tools to do this)
    - For the COTS that stores the network information, I don't know all of the places it stores that information ... so it would be unlikely I could find a way in the near future to adjust its setup after the installation has occurred

    Anyone have any thoughts or alternate approaches? Can anyone recommend tools that would be usable for system-wide configuration audits?
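
    As a rough illustration of the "capture and compare" idea (a sketch, not a replacement for a purpose-built tool like Tripwire): hash the files under a chosen configuration root into a path-to-checksum manifest on each machine, then diff the manifests between the two data centers; entries that differ only because of IPs, hostnames, or user accounts are expected, and everything else is suspect. The root path and file selection below are assumptions for the example, and on the Windows hosts at least it fits the "run from a standalone machine or DVD" constraint, since it needs nothing installed beyond the .NET runtime:

        using System;
        using System.IO;
        using System.Linq;
        using System.Security.Cryptography;

        // Sketch: build a "relative path <tab> SHA-256" manifest for the files under a config root.
        // Run it on a source machine and on the matching destination machine, then diff the outputs.
        class ConfigManifest
        {
            static void Main(string[] args)
            {
                // Assumption: the directory whose configuration files should be audited.
                string root = args.Length > 0 ? args[0] : @"C:\ProgramData";

                using (var sha = SHA256.Create())
                {
                    var lines = Directory.EnumerateFiles(root, "*", SearchOption.AllDirectories)
                        .OrderBy(p => p, StringComparer.OrdinalIgnoreCase)
                        .Select(path =>
                        {
                            using (var stream = File.OpenRead(path))
                            {
                                string hash = BitConverter.ToString(sha.ComputeHash(stream)).Replace("-", "");
                                // Store the path relative to the root so manifests from
                                // differently named machines line up when diffed.
                                return path.Substring(root.Length).TrimStart('\\') + "\t" + hash;
                            }
                        });

                    foreach (var line in lines)
                        Console.WriteLine(line);
                }
            }
        }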
