Search Results

Search found 2316 results on 93 pages for 'credentials'.

Page 13 of 93

  • Rails form helpers: how to add an element to a collection?

    - by Laran Evans
    I have a keychain object. A keychain has_many credentials. I'm trying to write the view code to add a new credential to a keychain. This is the code I have:

        <% form_for(@keychain) do |f| %>
          <tr>
            <td><%= f.select "credentials[]", current_account.services.collect{ |s| [s.friendly_name, s.id] } %></td>
            <td><%= f.text_field 'credentials', :username %></td>
            <td><%= f.password_field 'credentials', :password %></td>
          </tr>
        <% end %>

    But it fails with this message:

        NoMethodError in Keychains#new
        Showing app/views/keychains/_keychain_form.html.erb where line #32 raised:
        undefined method `credentials[]' for #

    What am I doing wrong?

    Read the article

  • Domain Authentication from .NET Client over VPN

    - by Holy Christ
    I am writing a ClickOnce WPF app that will sometimes be used over VPN. The app uses resources available only to domain-authenticated users. Some of the things it does include accessing SSRS reports, accessing LDAP to look up user information, hitting web services, etc. When a user logs in from a machine that is not authenticated on the domain, I need to somehow get his credentials, authenticate him on the domain, and store his credentials. What is the recommended approach for authenticating domain users over VPN? How can I securely store the credentials? I've found several articles, but not much posted recently, and a lot of the solutions seem kinda hacky or aren't very secure (i.e. storing strings in clear text in memory). It would be cool if I could use the ActiveDirectoryMembershipProvider, but that seems to be geared for use in web apps.
    EDIT: The above is kind of a workaround. The user must enter their domain credentials to authenticate on the VPN. It would be ideal to access the credentials the user has already entered to log in to the VPN instead of using WindowsIdentity.GetCurrent() (which returns the user logged into the computer). Any ideas on how that could work? We use Juniper Networks to connect to the VPN. Thanks!
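    For the first part (checking a set of domain credentials from a non-domain machine), one option is to validate them directly against Active Directory over the VPN link; a minimal sketch using System.DirectoryServices.AccountManagement is below (the domain name and the choice of ContextOptions are assumptions, not something from the question):

        using System;
        using System.DirectoryServices.AccountManagement; // reference System.DirectoryServices.AccountManagement.dll

        static class DomainAuthenticator
        {
            // Returns true when a domain controller accepts the username/password pair.
            public static bool ValidateDomainCredentials(string domain, string userName, string password)
            {
                using (var context = new PrincipalContext(ContextType.Domain, domain))
                {
                    // Negotiate lets the provider pick Kerberos or NTLM, much like Windows Integrated auth.
                    return context.ValidateCredentials(userName, password, ContextOptions.Negotiate);
                }
            }
        }

    Once validated, the pair could be held as a NetworkCredential for the SSRS and web service calls rather than as loose strings, but this is only a sketch of one approach, not a full answer to the secure-storage part of the question.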

    Read the article

  • Windows Phone - failing to get a string from a website with login information

    - by jumantyn
    I am new to accessing web services with Windows Phone 7/8. I'm using a WebClient to get a string from a PHP website. The site returns a JSON string, but at the moment I'm just trying to put it into a TextBox as a normal string, just to test whether the connection works. The PHP page requires authentication, and I think that's where my code is failing. Here's my code:

        WebClient client = new WebClient();
        client.Credentials = new NetworkCredential("myUsername", "myPassword");
        client.DownloadStringCompleted += new DownloadStringCompletedEventHandler(client_DownloadStringCompleted);
        client.DownloadStringAsync(new Uri("https://www.mywebsite.com/ba/php/jsonstuff.php"));

        void client_DownloadStringCompleted(object sender, DownloadStringCompletedEventArgs e)
        {
            try
            {
                string data = e.Result;
                this.jsonText.Text = data;
            }
            catch (Exception ex)
            {
                System.Diagnostics.Debug.WriteLine(ex.Message);
            }
        }

    This returns first a WebException and then a TargetInvocationException. If I replace the Uri with, for example, "http://www.google.com/index.html", the jsonText TextBox gets filled with HTML text from Google (oddly enough, this also works even when the WebClient credentials are still set). So is the problem in the setting of the credentials? I couldn't find any good results when searching for guides on how to access PHP pages with credentials, only without them. Then I found a short mention somewhere of using the WebClient.Credentials property. But should it work some other way? Update: here's what I can get out of the WebException:

        System.Net.WebException: The remote server returned an error: NotFound. ---> System.Net.WebException: The remote server returned an error: NotFound.
           at System.Net.Browser.ClientHttpWebRequest.InternalEndGetResponse(IAsyncResult asyncResult)
           at System.Net.Browser.ClientHttpWebRequest.<>c__DisplayClasse.<>b__d(Object sendState)
           at System.Net.Browser.AsyncHelper.<>c__DisplayClass1.<>b__0(Object sendState)
           --- End of inner exception stack trace ---
           at System.Net.Browser.AsyncHelper.BeginOnUI(SendOrPostCallback beginMethod, Object state)
           at System.Net.Browser.ClientHttpWebRequest.EndGetResponse(IAsyncResult asyncResult)
           at System.Net.WebClient.GetWebResponse(WebRequest request, IAsyncResult result)
           at System.Net.WebClient.DownloadBitsResponseCallback(IAsyncResult result)
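    If the PHP page is protected with plain HTTP (Basic) authentication, one workaround that comes up in similar threads is to send the Authorization header yourself instead of relying on the Credentials property, so the credentials go out with the first request rather than waiting for a challenge. The sketch below assumes Basic auth and placeholder credentials; if the site actually uses a form/session login, neither NetworkCredential nor this header will help and you would need to POST to the login script instead:

        using System;
        using System.Net;
        using System.Text;
        using System.Windows.Controls;

        public class JsonDownloader
        {
            private readonly TextBox jsonText;   // the TextBox from the page

            public JsonDownloader(TextBox target)
            {
                jsonText = target;
            }

            public void DownloadJson()
            {
                var client = new WebClient();

                // Build the Basic auth header by hand so it is sent pre-emptively.
                string token = Convert.ToBase64String(Encoding.UTF8.GetBytes("myUsername" + ":" + "myPassword"));
                client.Headers[HttpRequestHeader.Authorization] = "Basic " + token;

                client.DownloadStringCompleted += (s, e) =>
                {
                    if (e.Error != null)
                    {
                        System.Diagnostics.Debug.WriteLine(e.Error.Message);
                        return;
                    }
                    jsonText.Text = e.Result;
                };
                client.DownloadStringAsync(new Uri("https://www.mywebsite.com/ba/php/jsonstuff.php"));
            }
        }

    Checking the same URL with the same credentials in a desktop browser or curl is also a quick way to confirm whether the server is really answering 401/404 before blaming the phone code.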

    Read the article

  • Authorization/Licensing of Webservice

    - by Burhan
    I have developed a web service which accepts login credentials from the XML message passed to it. I have concerns over this method, as the developer who consumes the service can easily share the login credentials, and my service can then be called from some other application that uses the same credentials. Is there any way that I can issue a 'license' to specific applications? That way, even if credentials are shared among the consuming apps, only authorized ones can successfully consume the service. P.S.: I thought about implementing IP restrictions, but that doesn't serve the purpose as we may have different applications installed on the same server (we do have such a scenario implemented).
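    One way to read the 'license' idea is to issue each approved application its own secret key and require a signature over every request in addition to the user credentials. The sketch below only illustrates that approach; the key store, field names and signing format are all invented for the example:

        using System;
        using System.Collections.Generic;
        using System.Security.Cryptography;
        using System.Text;

        static class AppLicenseCheck
        {
            // Hypothetical store mapping an application id to the key issued to it.
            static readonly Dictionary<string, byte[]> IssuedKeys = new Dictionary<string, byte[]>
            {
                { "app-001", Encoding.UTF8.GetBytes("secret-issued-to-app-001") }
            };

            // The caller sends appId, a timestamp and a signature alongside the normal XML message.
            // The service recomputes the HMAC and only continues when it matches.
            public static bool IsAuthorizedApplication(string appId, string timestamp, string requestBody, string signature)
            {
                byte[] key;
                if (!IssuedKeys.TryGetValue(appId, out key))
                    return false;

                using (var hmac = new HMACSHA256(key))
                {
                    byte[] expected = hmac.ComputeHash(Encoding.UTF8.GetBytes(timestamp + "\n" + requestBody));
                    return Convert.ToBase64String(expected) == signature;
                }
            }
        }

    The timestamp is there so that a captured signature cannot simply be replayed; anything real would also need a tolerance window, key rotation and transport security, so treat this strictly as a starting point.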

    Read the article

  • Enabling Kerberos Authentication for Reporting Services

    - by robcarrol
    Recently, I've helped several customers with Kerberos authentication problems with Reporting Services and Analysis Services, so I've decided to write this blog post and pull together some useful resources in one place (there are 2 whitepapers in particular that I found invaluable when configuring Kerberos authentication, and these can be found in the references section at the bottom of this post). In most of these cases, the problem has manifested itself with the Login failed for User 'NT Authority\Anonymous' ("double-hop") error.

    By default, Reporting Services uses Windows Integrated Authentication, which includes the Kerberos and NTLM protocols for network authentication. Additionally, Windows Integrated Authentication includes the negotiate security header, which prompts the client to select Kerberos or NTLM for authentication. The client can access reports for which it has the appropriate permissions by using Kerberos for authentication. Servers that use Kerberos authentication can impersonate those clients and use their security context to access network resources. You can configure Reporting Services to use both Kerberos and NTLM authentication; however, this may lead to a failure to authenticate. With negotiate, if Kerberos cannot be used, the authentication method will default to NTLM. When negotiate is enabled, the Kerberos protocol is always used except when:

    - clients/servers that are involved in the authentication process cannot use Kerberos, or
    - the client does not provide the information necessary to use Kerberos.

    An in-depth discussion of Kerberos authentication is beyond the scope of this post; however, when users execute reports that are configured to use Windows Integrated Authentication, their logon credentials are passed from the report server to the server hosting the data source. Delegation needs to be set on the report server and Service Principal Names (SPNs) set for the relevant services. When a user processes a report, the request must go through a Web server on its way to a database server for processing. Kerberos authentication enables the Web server to request a service ticket from the domain controller, impersonate the client when passing the request to the database server, and then restrict the request based on the user's permissions. Each time a server is required to pass the request to another server, the same process must be used.

    Kerberos authentication is supported in both native and SharePoint integrated mode, but I'll focus on native mode for the purpose of this post (I'll explain configuring SharePoint integrated mode and Kerberos authentication in a future post). Configuring Kerberos avoids the authentication failures caused by double-hop issues. These double-hop errors occur when a user's Windows domain credentials can't be passed to another server to complete the user's request. In the case of my customers, users were executing Reporting Services reports that were configured to query Analysis Services cubes on a separate machine using Windows Integrated security. The double-hop issue occurs because NTLM credentials are valid for only one network hop; subsequent hops result in anonymous authentication. The client attempts to connect to the report server by making a request from a browser (or some other application), and the connection process begins with authentication. With NTLM authentication, client credentials are presented to Computer 2. However, Computer 2 can't use the same credentials to access Computer 3 (so we get the Anonymous login error).
    To access Computer 3 it is necessary to configure the connection string with stored credentials, which is what a number of customers I have worked with have done to work around the double-hop authentication error. However, to get the benefits of Windows Integrated security, a better solution is to enable Kerberos authentication. Again, the connection process begins with authentication. With Kerberos authentication, the client and the server must demonstrate to one another that they are genuine, at which point authentication is successful and a secure client/server session is established.

    In the illustration above, the tiers represent the following:
    - Client tier (computer 1): the client computer from which an application makes a request.
    - Middle tier (computer 2): the Web server or farm where the client's request is directed. Both the SharePoint and Reporting Services server(s) comprise the middle tier (but we're only concentrating on native deployments just now).
    - Back-end tier (computer 3): the database/Analysis Services server/cluster where the requested data is stored.

    In order to enable Kerberos authentication for Reporting Services it's necessary to configure the relevant SPNs, configure trust for delegation for server accounts, configure Kerberos with full delegation and configure the authentication types for Reporting Services.

    Service Principal Names (SPNs) are unique identifiers for services and identify the account's type of service. If an SPN is not configured for a service, a client account will be unable to authenticate to the servers using Kerberos. You need to be a domain administrator to add an SPN, which can be added using the SetSPN utility. For Reporting Services in native mode, the following SPNs need to be registered:

        --SQL Server Service
        SETSPN -S mssqlsvc/servername:1433 Domain\SQL

    For named instances, or if the default instance is running under a different port, then the specific port number should be used.

        --Reporting Services Service
        SETSPN -S http/servername Domain\SSRS
        SETSPN -S http/servername.domain.com Domain\SSRS

    The SPN should be set for the NETBIOS name of the server and the FQDN. If you access the reports using a host header or DNS alias, then that should also be registered:

        SETSPN -S http/www.reports.com Domain\SSRS

        --Analysis Services Service
        SETSPN -S msolapsvc.3/servername Domain\SSAS

    Next, you need to configure trust for delegation, which refers to enabling a computer to impersonate an authenticated user to services on another computer:

    Client:
    1. The requesting application must support the Kerberos authentication protocol.
    2. The user account making the request must be configured on the domain controller. Confirm that the following option is not selected: "Account is sensitive and cannot be delegated".

    Servers:
    1. The service accounts must be trusted for delegation on the domain controller.
    2. The service accounts must have SPNs registered on the domain controller. If the service account is a domain user account, the domain administrator must register the SPNs.
    In Active Directory Users and Computers, verify that the domain user accounts used to access reports have been configured for delegation (the 'Account is sensitive and cannot be delegated' option should not be selected). We then need to configure the Reporting Services service account and computer to use Kerberos with full delegation, and do the same for the SQL Server or Analysis Services service accounts and computers (depending on what type of data source you are connecting to in your reports).

    Finally, and this is the part that sometimes gets overlooked, we need to configure the authentication type correctly for Reporting Services to use Kerberos authentication. This is configured in the Authentication section of the RSReportServer.config file on the report server:

        <Authentication>
          <AuthenticationTypes>
            <RSWindowsNegotiate/>
          </AuthenticationTypes>
          <EnableAuthPersistence>true</EnableAuthPersistence>
        </Authentication>

    This will enable Kerberos authentication for Internet Explorer. For other browsers, see the link below. The report server instance must be restarted for these changes to take effect. Once these changes have been made, all that's left to do is test to make sure Kerberos authentication is working properly, by running a report from Report Manager that is configured to use Windows Integrated authentication (connecting to either an Analysis Services or SQL Server back end).

    Resources:

    Manage Kerberos Authentication Issues in a Reporting Services Environment
    http://download.microsoft.com/download/B/E/1/BE1AABB3-6ED8-4C3C-AF91-448AB733B1AF/SSRSKerberos.docx

    Configuring Kerberos Authentication for Microsoft SharePoint 2010 Products
    http://www.microsoft.com/download/en/details.aspx?displaylang=en&id=23176

    How to: Configure Windows Authentication in Reporting Services
    http://msdn.microsoft.com/en-us/library/cc281253.aspx

    RSReportServer Configuration File
    http://msdn.microsoft.com/en-us/library/ms157273.aspx#Authentication

    Planning for Browser Support
    http://msdn.microsoft.com/en-us/library/ms156511.aspx

    Read the article

  • Using MAC Authentication for simple Web API’s consumption

    - by cibrax
    For simple scenarios of Web API consumption where identity delegation is not required, traditional HTTP authentication schemes such as basic, certificates or digest are the most used nowadays. All these schemes rely on sending the caller's credentials, or some representation of them, in every request message as part of the Authorization header, so they are prone to phishing attacks if they are not correctly secured at the transport level with https. In addition, most client applications typically authenticate two different things: the caller application and the user consuming the API on behalf of that application. In most cases, the scheme is simplified by using a single set of username and password for authenticating both, making it necessary to store those credentials temporarily somewhere in memory. The truth is that you can use two different identities: one for the user running the application, which you might authenticate just once during the first call when the application is initialized, and another identity for the application itself, which you use on every call.

    Some cloud vendors like Windows Azure or Amazon Web Services have adopted a scheme to authenticate the caller application based on a Message Authentication Code (MAC) generated with a symmetric algorithm using a key known by the two parties, the caller and the Web API. The caller must include a MAC as part of the Authorization header, created from different pieces of information in the request message such as the address, the host, and some other headers. The Web API can authenticate the caller by using the key associated with it and validating the attached MAC in the request message. In that way, no credentials are sent as part of the request message, so there is no way for an attacker to intercept the message and get access to those credentials. Anyway, this scheme also suffers from some deficiencies that can lead to attacks. For example, brute force can still be used to infer the key used for generating the MAC and impersonate the original caller. This can be mitigated by renewing keys in a relatively short period of time. This scheme, like any other, can be complemented with transport security.

    Eran Hammer, one of the brains behind OAuth, has recently published a specification of a protocol based on MAC for HTTP authentication called Hawk. The initial version of the spec is available here. A curious fact is that a written specification per se does not exist; the specification itself is the code that Eran initially wrote using node.js. In that implementation, you can associate a key with a user, so once the MAC has been verified on the Web API, the user can be inferred from that key. A timestamp is also used to avoid replay attacks. As a pet project, I decided to port that code to .NET using ASP.NET Web API, which is also available on GitHub at https://github.com/pcibraro/hawknet. Enjoy!
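    To make the MAC idea concrete, the sketch below signs a handful of request parts (timestamp, method, path, host, port) with a shared key, roughly in the spirit described above; the string-to-sign layout and header format are simplified stand-ins, not the actual Hawk wire format:

        using System;
        using System.Security.Cryptography;
        using System.Text;

        static class MacAuthSketch
        {
            // Builds an Authorization header value for one request.
            // The server looks the key up by id, rebuilds the same string and compares MACs.
            public static string BuildAuthorizationHeader(string keyId, byte[] key,
                string method, string path, string host, int port)
            {
                long unixSeconds = (long)(DateTime.UtcNow - new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc)).TotalSeconds;
                string stringToSign = string.Join("\n", new[]
                {
                    unixSeconds.ToString(),
                    method.ToUpperInvariant(),
                    path,
                    host.ToLowerInvariant(),
                    port.ToString()
                });

                using (var hmac = new HMACSHA256(key))
                {
                    string mac = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));
                    return string.Format("MAC id=\"{0}\", ts=\"{1}\", mac=\"{2}\"", keyId, unixSeconds, mac);
                }
            }
        }

    Because the timestamp is part of the signed material, a replayed request with an old header can be rejected once the server enforces a freshness window, which is the same trick the post mentions for avoiding replay attacks.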

    Read the article

  • Accessing SharePoint 2010 Data with REST/OData on Windows Phone 7

    - by Jan Tielens
    Consuming SharePoint 2010 data in Windows Phone 7 applications using the CTP version of the developer tools is quite a challenge. The issue is that SharePoint 2010 data is not anonymously available; users need to authenticate to be able to access the data. When I first tried to access SharePoint 2010 data from my first Hello-World-type Windows Phone 7 application, I thought "Hey, this should be easy!" because Windows Phone 7 development is based on Silverlight, and SharePoint 2010 has a Client Object Model for Silverlight. Unfortunately you can't use the SharePoint 2010 Client Object Model on the Windows Phone platform; there's a reference to an assembly that's not available (System.Windows.Browser).

    My second thought was "OK, no problem!" because SharePoint 2010 also exposes a REST/OData API to access SharePoint data. Using the REST API in SharePoint 2010 is as easy as making a web request for a URL (in which you specify the data you'd like to retrieve), e.g. http://yoursiteurl/_vti_bin/listdata.svc/Announcements. This is very easy to accomplish in a Silverlight application that's running in the context of a page in a SharePoint site, because the credentials of the currently logged-on user are automatically picked up and passed to the WCF service. But a Windows Phone application is of course running outside of the SharePoint site's page, so the application has to build credentials that are passed to SharePoint's WCF service. This turns out to be a small challenge in Silverlight 3: the WebClient doesn't support authentication; there is a Credentials property, but when you set it and make the request you get a NotImplementedException. Probably this issue will be solved in the very near future, since Silverlight 4 does support authentication, and there's already a WCF Data Services download that uses this new platform feature of Silverlight 4. So when the Windows Phone platform switches to Silverlight 4, you can just use the WebClient to get the data. Even more, if the OData Client Library for Windows Phone 7 gets updated after that, things should get even easier! By the way: the things I'm writing in this paragraph are just assumptions that make a lot of sense IMHO; I don't have any info that all of this will happen, but I really hope so.

    So are SharePoint developers out of the Windows Phone development game until they get this fixed? Luckily not: when the HttpWebRequest class is used instead, you can pass credentials! Using the HttpWebRequest class is slightly more complex than using the WebClient class, but the end result is that you have access to your precious SharePoint 2010 data.
    The following code snippet gets all the announcements of an Announcements list in a SharePoint site:

        HttpWebRequest webReq =
            (HttpWebRequest)HttpWebRequest.Create("http://yoursite/_vti_bin/listdata.svc/Announcements");
        webReq.Credentials = new NetworkCredential("username", "password");

        webReq.BeginGetResponse(
            (result) =>
            {
                HttpWebRequest asyncReq = (HttpWebRequest)result.AsyncState;

                XDocument xdoc = XDocument.Load(
                    ((HttpWebResponse)asyncReq.EndGetResponse(result)).GetResponseStream());

                XNamespace ns = "http://www.w3.org/2005/Atom";
                var items = from item in xdoc.Root.Elements(ns + "entry")
                            select new { Title = item.Element(ns + "title").Value };

                this.Dispatcher.BeginInvoke(() =>
                {
                    foreach (var item in items)
                        MessageBox.Show(item.Title);
                });
            }, webReq);

    When you try this in a Windows Phone 7 application, make sure you add a reference to the System.Xml.Linq assembly, because the code uses LINQ to XML to parse the resulting Atom feed, so the Title of every announcement is displayed in a MessageBox. Check out my previous post if you'd like to see a more polished sample Windows Phone 7 application that displays SharePoint 2010 data. When you plan to use this technique, it's of course a good idea to encapsulate the code doing the request, so it becomes really easy to get the data that you need. In the following code snippet you can find the GetAtomFeed method, which gets the contents of any Atom feed, even if you need to authenticate to get access to the feed.

        delegate void GetAtomFeedCallback(Stream responseStream);

        public MainPage()
        {
            InitializeComponent();

            SupportedOrientations = SupportedPageOrientation.Portrait |
                SupportedPageOrientation.Landscape;

            string url = "http://yoursite/_vti_bin/listdata.svc/Announcements";
            string username = "username";
            string password = "password";
            string domain = "";

            GetAtomFeed(url, username, password, domain, (s) =>
            {
                XNamespace ns = "http://www.w3.org/2005/Atom";
                XDocument xdoc = XDocument.Load(s);

                var items = from item in xdoc.Root.Elements(ns + "entry")
                            select new { Title = item.Element(ns + "title").Value };

                this.Dispatcher.BeginInvoke(() =>
                {
                    foreach (var item in items)
                    {
                        MessageBox.Show(item.Title);
                    }
                });
            });
        }

        private static void GetAtomFeed(string url, string username,
            string password, string domain, GetAtomFeedCallback cb)
        {
            HttpWebRequest webReq = (HttpWebRequest)HttpWebRequest.Create(url);
            webReq.Credentials = new NetworkCredential(username, password, domain);

            webReq.BeginGetResponse(
                (result) =>
                {
                    HttpWebRequest asyncReq = (HttpWebRequest)result.AsyncState;
                    HttpWebResponse resp = (HttpWebResponse)asyncReq.EndGetResponse(result);
                    cb(resp.GetResponseStream());
                }, webReq);
        }

    Read the article

  • apport-collect fails with "certificate verify failed" when trying to report a bug on launchpad

    - by Francesco
    I am trying to report a bug, but I get:

        root@beagle:/usr/lib/python2.7/dist-packages/apport# apport-collect <bug_id>
        ERROR: connecting to Launchpad failed: [Errno 1] _ssl.c:504: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
        You can reset the credentials by removing the file "/root/.cache/apport/launchpad.credentials"

    Moreover, Firefox tells me "Certificate is not currently valid for bugs.launchpad.net". What can I do?

    Read the article

  • iPhone App Store Release Question!

    - by Ahmad Kayyali
    I am developing an application whose purpose is to view uploaded files on the host server, and it has credentials that are entered on the login page to authenticate the user. My question: when I submit my application to the App Store, how is Apple supposed to test, or at least view, my application when it needs to enter valid credentials that I am not supposed to know, since they are private to my client? Any guidance would be greatly appreciated.

    Read the article

  • How can I combine my FTP queries? [migrated]

    - by ryansworld10
    My program queries an FTP server in several places to read and upload information. How can I combine all of these into one FTP class that handles everything?

        private static void UploadToFTP(string[] FTPSettings)
        {
            try
            {
                FtpWebRequest request = (FtpWebRequest)WebRequest.Create(FTPSettings[0]);
                request.Method = WebRequestMethods.Ftp.MakeDirectory;
                request.Credentials = new NetworkCredential(FTPSettings[1], FTPSettings[2]);
                FtpWebResponse response = (FtpWebResponse)request.GetResponse();
            }
            catch { }
            try
            {
                FtpWebRequest request = (FtpWebRequest)WebRequest.Create(FTPSettings[0] + Path.GetFileName(file));
                request.Method = WebRequestMethods.Ftp.UploadFile;
                request.Credentials = new NetworkCredential(FTPSettings[1], FTPSettings[2]);
                StreamReader source = new StreamReader(file);
                byte[] fileContents = Encoding.UTF8.GetBytes(source.ReadToEnd());
                source.Close();
                request.ContentLength = fileContents.Length;
                Stream requestStream = request.GetRequestStream();
                requestStream.Write(fileContents, 0, fileContents.Length);
                requestStream.Close();
                FtpWebResponse response = (FtpWebResponse)request.GetResponse();
                response.Close();
                RegenLog();
            }
            catch (Exception e)
            {
                File.AppendAllText(file, string.Format("{0}{0}Upload Failed - ({2}) - {1}{0}", nl, System.DateTime.Now, e.Message.ToString()));
            }
        }

        private static void CheckBlacklist(string[] FTPSettings)
        {
            try
            {
                FtpWebRequest request = (FtpWebRequest)WebRequest.Create(FTPSettings[0] + "blacklist.txt");
                request.Credentials = new NetworkCredential(FTPSettings[1], FTPSettings[2]);
                using (WebResponse response = request.GetResponse())
                {
                    using (Stream stream = response.GetResponseStream())
                    {
                        using (TextReader reader = new StreamReader(stream))
                        {
                            string blacklist = reader.ReadToEnd();
                            if (blacklist.Contains(Environment.UserName))
                            {
                                File.AppendAllText(file, string.Format("{0}{0}Logger terminated - ({2}) - {1}{0}", nl, System.DateTime.Now, "Blacklisted"));
                                uninstall = true;
                            }
                        }
                    }
                }
            }
            catch (Exception e)
            {
                File.AppendAllText(file, string.Format("{0}{0}FTP Error - ({2}) - {1}{0}", nl, System.DateTime.Now, e.Message.ToString()));
            }
        }

        private static void CheckUpdate(string[] FTPSettings)
        {
            try
            {
                FtpWebRequest request = (FtpWebRequest)WebRequest.Create(FTPSettings[0] + "update.txt");
                request.Credentials = new NetworkCredential(FTPSettings[1], FTPSettings[2]);
                using (WebResponse response = request.GetResponse())
                {
                    using (Stream stream = response.GetResponseStream())
                    {
                        using (TextReader reader = new StreamReader(stream))
                        {
                            string newVersion = reader.ReadToEnd();
                            if (newVersion != version)
                            {
                                update = true;
                            }
                        }
                    }
                }
            }
            catch (Exception e)
            {
                File.AppendAllText(file, string.Format("{0}{0}FTP Error - ({2}) - {1}{0}", nl, System.DateTime.Now, e.Message.ToString()));
            }
        }

    I know my code is also a bit inconsistent and messy; however, this is my first time working with FTP in C#. Please give any advice you have!
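    As a starting point for that refactor, the shared pieces (server address, credentials, request creation) can live in one small class so each operation becomes a short method. The sketch below is only one possible shape for it; the class and method names are suggestions, and the error logging from the original code is left out for brevity:

        using System.IO;
        using System.Net;

        // Wraps the FTP settings once so every operation builds its request the same way.
        public class FtpHelper
        {
            private readonly string baseUri;
            private readonly NetworkCredential credentials;

            public FtpHelper(string baseUri, string userName, string password)
            {
                this.baseUri = baseUri;
                this.credentials = new NetworkCredential(userName, password);
            }

            private FtpWebRequest CreateRequest(string relativePath, string method)
            {
                var request = (FtpWebRequest)WebRequest.Create(baseUri + relativePath);
                request.Method = method;
                request.Credentials = credentials;
                return request;
            }

            // Downloads a remote text file (e.g. blacklist.txt or update.txt) as a string.
            public string DownloadText(string relativePath)
            {
                var request = CreateRequest(relativePath, WebRequestMethods.Ftp.DownloadFile);
                using (var response = request.GetResponse())
                using (var reader = new StreamReader(response.GetResponseStream()))
                {
                    return reader.ReadToEnd();
                }
            }

            // Uploads the contents of a local file to the server.
            public void UploadFile(string localPath, string remoteName)
            {
                var request = CreateRequest(remoteName, WebRequestMethods.Ftp.UploadFile);
                byte[] contents = File.ReadAllBytes(localPath);
                request.ContentLength = contents.Length;
                using (var stream = request.GetRequestStream())
                {
                    stream.Write(contents, 0, contents.Length);
                }
                request.GetResponse().Close();
            }
        }

    With something like this, CheckBlacklist and CheckUpdate reduce to a DownloadText call plus the existing string checks, and UploadToFTP keeps its directory-creation attempt and then calls UploadFile.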

    Read the article

  • Login to OS X Server User Account from Local Computer

    - by Brod Wilkinson
    I have OS X Server installed on a Mac mini. I've created several user accounts, one of which is account name "Bob", password "abc123". From the Mac mini's login screen I can choose "Server" (the main account), "Bob" (Bob's account) and "Other..." for OS X Server accounts; from "Other...", if I input Bob's credentials it will log me in. I also have a MacBook Air. I would like to be able to select "Other..." from its login screen, input Bob's credentials and have it log in to Bob's account, or any other user account for that matter. My server is set up as private with the server address server.network.private. Following some googled instructions as well as Apple's own instructions, I have set up an Open Directory with username "diradmin" and password "abc123". Then on the MacBook Air I went into System Preferences > Users & Groups > Login Options, clicked Join next to Network Account Server, input my server (server.network.private) with the diradmin credentials, and it's connected. Great. I've also ticked the option to allow network users to log in at the login window and selected All Users. I was assuming this would allow my MacBook Air to log in to the "Bob" account by selecting "Other..." from the login window, although there is no "Other..." option. I then set up a VPN with basic credentials, logged into it on the MacBook Air, and still not much has changed. I am able to share screens with the "Bob" account from my MacBook Air by clicking Share Screen... from the Finder under Shared > Network Server and then clicking Log In, but this obviously requires the MacBook Air to already be logged into an account before it can share screens, which is not suitable. Is there any way to simply log in to an OS X Server user account from the MacBook Air's login screen via "Other...", like it works on the Mac mini's login screen? Thanks in advance.
    Operating system: OS X 10.9 Mavericks. OS X Server: version 3.

    Read the article

  • NetApp NDMP backup with BE 2010 R2 works, restore fails

    - by uuwe
    Hi, I'm having some issues with a new Backup Exec 2010 R2 installation. I configured a NetApp FAS2020 as an NDMP device and want to back up files from the NAS to a tape drive connected to my backup server. I set up ndmpd according to this document (http://www.symantec.com/business/support/index?page=content&id=TECH48957) and created a separate backup user (http://filers.blogspot.com/2006/09/setting-veritas-netbackup-with-non.html). Backup works perfectly, but restoring any file gives me an authentication failed error. The NDMP device has a "global" ndmp user configured in the device tab (tried this with the newly created ndmpd backup user and the netapp root) and I can also configure separate resource credentials in the BE restore job. I have tried setting the same accounts for the "global" ndmp device and the restore credentials and have also tried setting different accounts for them. NDMP debug level is at 5 and this is what shows up in /etc/messages. The session is closed immediately after it has been granted.

        16:12:07 PST [Java_Thread:info]: ndmpdserver: ndmpd.access allowed for version = 4, sessionId = 51, from src ip = 192.168.11.17, dst ip = FAS2020-1/192.168.11.75, src port = 50857, dst port = 10000
        16:12:07 PST [Java_Thread:info]: Ndmpd51: ndmpd session closed successfully for version = 4, sessionId = 51, from src ip = 192.168.11.17, dst ip = FAS2020-1/192.168.11.75, src port = 50857, dst port = 10000

    Running wireshark on the backup server doesn't produce much. It shows a SYN - SYN/ACK - NDMP CONNECT_CLOSE Request from the backup server. The Resource Credentials for the restore job behave very oddly. If I enter NDMP credentials and do "Test All" it fails. If I use my regular domain backup account, it is successful. There are no failed or succeeded logons in the NetApp ndmp log, and tracing this check shows that it doesn't even connect to the NAS. This makes me think that this is more likely flaky BE behaviour rather than misconfiguration of the NAS. Here is the options ndmp output:

        FAS2020-1> options ndmp
        ndmpd.access                 all
        ndmpd.authtype               challenge
        ndmpd.connectlog.enabled     on
        ndmpd.enable                 on
        ndmpd.ignore_ctime.enabled   off
        ndmpd.offset_map.enable      on
        ndmpd.password_length        16
        ndmpd.preferred_interface    disable
        ndmpd.tcpnodelay.enable      off

    Read the article

  • What is going on when I can't access an SMB server share (not accessible error) until I run cmdkey to delete the credential?

    - by Warren P
    I have a network connection share issue. The first connection works, and seems to stay connected for at least a few hours. However, after each time my Windows 7 PC reboots, it can no longer form a network connection to the shared folder, nor browse to it, until I not only unmap and remap the mapped drive, but also use cmdkey to delete the stored credentials, like this:

        cmdkey /delete:Domain:target=HOSTNAME

    My work PC is on a domain, and I am not the IT administrator, but I'm curious whether there is anything I can do to investigate this issue. Are there any settings in the registry or group policy that I could examine to see why the first connection works, but each subsequent attempt (once a stored credential exists) to browse or use the connection fails with a connection error saying it is "not accessible"? I do not even get any error until at least several minutes go by. The first thing I see is a window frozen and empty, and then I get the error. This has happened when connecting to a share on a DROBO device, and on a share which is not on the domain but which was a Microsoft Home Server. I wonder if there's something broken in Windows 7 Professional with regards to connecting to non-domain shares when an Active Directory domain controller exists and a particular workstation is joined to a domain. The problem only occurs if I click "remember credentials". It is not fixed by any amount of working with net use. Using cmdkey to delete all stored credentials for the host is the only way to get back in, and it affects all non-domain shared folders.
    Update: I'm hoping there are some registry locations I could check that could be misconfigured in some way that might explain why SMB/CIFS stored credentials for non-domain systems seem to be auto-invalidated in this weird way. Knowing how whacko Microsoft Windows domain and security handling is sometimes, this could be some kind of stupid "feature".

    Read the article

  • How can records be deleted without activating the delete trigger?

    - by Servaas Phlips
    Hello there, for about a month now, records have been disappearing from our database without any reason. (Part of) our database structure is at http://i.imgur.com/i15nG.png. Users and credentials should never be deleted. Thanks to our backups, however, we noticed that users have unfortunately disappeared from the database. The users and credentials that disappear appear to be completely random. In order to find out which application deletes these records, we created triggers with the following checks:

        CREATE TRIGGER Credential_SoftDelete ON [Credential]
        INSTEAD OF DELETE
        AS
            DECLARE @message nvarchar(255)
            DECLARE @hostName nvarchar(30)
            DECLARE @loginName nvarchar(30)
            DECLARE @deletedId nvarchar(30)

            SELECT @deletedId = credentialid FROM deleted;

            SELECT @hostName = host_name, @loginName = login_name
            FROM sys.dm_exec_sessions
            WHERE session_id = @@SPID;

            SELECT @message = '[FAULT] Credential : ' + USER_NAME() + ' deleted ' + @deletedId + ' on ' + @@SERVERNAME + ' from [' + @hostname + ' by ' + @loginName;

            EXEC xp_logevent 50001, @message, ERROR
        GO

    After we added this trigger, we hoped to find out which application deletes these credentials by searching in the log files. Unfortunately the credentials are still deleted and the trigger Credential_SoftDelete is never logged. I did try running a delete on the database where the trigger is installed and where the users have disappeared. I ran the following query:

        DELETE FROM [User] WHERE userid = 296

    and the trigger prevented deletion of this user and also logged this in the event log. This was on exactly the same database where the users disappeared (so no test copy or anything like that). Please note that we also use replication; the type of replication we use is merge replication. How is this possible? Can the fact that we use replication on this database be the cause of this problem?

    Read the article

  • SharePoint 2010 Diagnostic Studio Remote Diag

    - by juanlarios
    I have had some time this week to try out some tools that I have been meaning to look at. This week I am trying out the SP 2010 Diagnostic Studio. I installed it successfully and tried it on my development environment. I was able to build a report and a snapshot of the environment. I then decided to turn my attention to my employer's intranet environment. This would allow me to analyze it and measure it against benchmarks. I didn't want to install the Diagnostic Studio on the production environment; lucky for me, the Diagnostic Studio can be run remotely, well... kind of.

    Issue

    My development environment is a stand-alone, full installation of SharePoint 2010 Server. It has Office 2010, SQL 2008 Enterprise, a DC... well, you get the point, it's jam-packed! But more importantly, it's a stand-alone, self-contained VM environment. Well, Microsoft has instructions as to how to connect remotely with Diagnostic Studio here. The deceiving part of this is that SP2010DS prompts you for credentials, so I thought I was getting the right account to run the reports. I tried all the PowerShell commands in the link above but I still ended up getting the following errors:

        06/28/2011 12:50:18  Connecting to remote server failed with the following error message: The WinRM client cannot process the request... If the SPN exists, but CredSSP cannot use Kerberos to validate the identity of the target computer and you still want to allow the delegation of the user credentials to the target computer, use gpedit.msc and look at the following policy: Computer Configuration -> Administrative Templates -> System -> Credentials Delegation -> Allow Fresh Credentials with NTLM-only Server Authentication. Verify that it is enabled and configured with an SPN appropriate for the target computer. For example, for a target computer name "myserver.domain.com", the SPN can be one of the following: WSMAN/myserver.domain.com or WSMAN/*.domain.com. Try the request again after these changes. For more information, see the about_Remote_Troubleshooting Help topic.

        06/28/2011 12:54:47  Access to the path '\\<targetserver>\C$\Users\<account logging in>\AppData\Local\Temp' is denied.

    You might also get an error message like this:

        The WinRM client cannot process the request. A computer policy does not allow the delegation of the user credentials to the target computer.

    Explanation

    After looking at the event logs on the target environment, I noticed that there were several security exceptions. After looking at the specifics around who was denied access, I was able to see the account that was being denied access: it was the client machine's administrator account. Well, of course that was never going to work!!! After some quick Googling, the last error message above will lead you to edit the Local Group Policy on the client server. And although there are instructions from Microsoft around doing this, it really will not work in this scenario. Notice the description and how it only applies to the authentication mentioned?

    Resolution

    I can tell you what I did, but I wish there was a better way; I simply don't know if it's doable any other way. Because my development environment has its own DC, I didn't really want to mess with Kerberos authentication. It would also not be smart to connect that server to the domain, considering it has its own DC. I ended up installing SharePoint 2010 Diagnostic Studio on another Windows 7 dev environment I have, and connected that machine to the domain.
    I ran all the necessary remote credentials commands mentioned here. Those commands add the group policy for you! Once I did this I was able to authenticate properly and I was able to get the reports.

    Conclusion

    You can run SharePoint 2010 Diagnostic Studio remotely, but it requires some specific scenarios. A couple of things I should mention: as far as I understand, SP2010 DS will install agents on your target environment to run tests and retrieve the data. I was a Farm Administrator, and also a server admin on the SharePoint server. I am not 100% sure if you need all those permissions, but that's just what I have on my internal intranet. Ideally I would like to have a machine with SharePoint 2010 Diagnostic Studio installed that I can run against client environments. It appears that I will not be able to do that unless I enable Kerberos on my Windows 7 machine now. If you have it installed in the way I would like to have it, please let me know; I'll keep trying to get what I'm after. Hope this helps someone out there doing the same.

    Read the article

  • Mount Return Code for CIFS mount

    - by laikadad101
    When I run the following command (as root or via sudo) from a bash script, I get an exit status (or return code, in mount man page parlance) of 1:

        mount -v -t cifs //nasbox/volume /tmpdir/ --verbose -o credentials=/root/cifsid &> /tmp/mylog

    It writes the following into the mylog file:

        parsing options: rw,credentials=/root/cifsid
        mount.cifs kernel mount options unc=//nasbox\volume,ip=192.168.1.1,user=root,pass=xxxx,ver=1,rw,credentials=/root/cifsid

    It mounts the volume fine but returns the exit code (from the mount man page): "1 - Incorrect invocation or permissions". The standard Linux log files don't contain any error information. Hence, all seems to go well, but I get an exit code of 1 instead of 0. Any ideas? The -v and --verbose options are just there for debugging this problem.

    Read the article

  • How best to troubleshoot a WIA issue through an IIS7 reverse proxy.

    - by CptSkippy
    I've got an intranet site that uses Windows Authentication and is accessed through an IIS 7 reverse proxy. Using Firefox, Safari or Chrome it works fine: I'm prompted for credentials, I supply them and away I go. In IE 7/8 I get prompted for credentials, but they're rejected and I eventually get a 401 Not Authorized error. The application server is configured for Windows Authentication only and rejects Basic authentication. I would be surprised if the front-end proxy would accept Basic auth, so my suspicion is that it's a trust issue with my browser and IE isn't relaying the credentials; however, our IS team has IE so locked down that I'm unable to alter trust levels or even view the settings. How should I go about troubleshooting this problem? I'm at a loss, and they've yet to respond to my support ticket.
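    One way to take the browser out of the equation is to make the same authenticated request from a small test client and look at what the proxy actually sends back; a rough sketch with HttpWebRequest (the URL is a placeholder, and it assumes the test machine is logged on with a domain account):

        using System;
        using System.Net;

        class WiaProbe
        {
            static void Main()
            {
                // Send the logged-on user's Windows credentials and let Negotiate pick Kerberos or NTLM.
                var request = (HttpWebRequest)WebRequest.Create("https://proxy.example.com/intranet-app/");
                request.Credentials = CredentialCache.DefaultNetworkCredentials;
                request.PreAuthenticate = true;

                try
                {
                    using (var response = (HttpWebResponse)request.GetResponse())
                    {
                        Console.WriteLine("Status: {0}", response.StatusCode);
                    }
                }
                catch (WebException ex)
                {
                    var failed = ex.Response as HttpWebResponse;
                    if (failed != null)
                    {
                        Console.WriteLine("Failed: {0}", failed.StatusCode);
                        // Shows which schemes make it through the proxy (Negotiate, NTLM, Basic, ...).
                        Console.WriteLine("WWW-Authenticate: {0}", failed.Headers["WWW-Authenticate"]);
                    }
                    else
                    {
                        Console.WriteLine(ex.Message);
                    }
                }
            }
        }

    If the 401 seen this way no longer advertises NTLM or Negotiate, the reverse proxy is likely stripping or breaking the challenge (NTLM in particular needs the connection kept alive across the handshake), which would point away from the IE settings.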

    Read the article

  • Sophos Enterprise Console 4.5, Mac Client 7 Not Auto-Populating SEC Info

    - by user65712
    I have Sophos Endpoint Security and Control, which includes Sophos Enterprise Console (SEC). I'm currently running version 4.5 of SEC, which is an older version. I subscribe to Mac updates, and SEC generates a binary Mac installer for me to use on Mac endpoints (version 7 for Mac, also an older version). However, when I run the installer on Mac endpoints, it installs fine but never auto-fills the location of the update server, which is on a network share, or the account credentials used to access it (which I do not know; they were generated automatically by Sophos). Previously, I had been able to use the SEC-generated installer to install and run Sophos on a Mac seamlessly; the update location information and account credentials were filled in automatically during login, I ran the installer, and it was perfectly set up. Now, however, Sophos installs on a Mac but never updates, because it doesn't have the update location or credentials. Has anyone else run across this problem or know why it is happening? Sophos Enterprise Console 4.5.1.0.

    Read the article

  • BlueCoat reverse proxy NTLM authentication

    - by mathieu
    Currently, when we want to access an internal site from the Internet (IIS with NTLM auth), two login screens appear:
    - step 1: LDAPAuth, from the BlueCoat, which checks login/password validity against Active Directory
    - step 2: NTLM auth, from our application.
    Is it possible to configure the reverse proxy to use the LDAP credentials provided at step 1, and give them to whatever application requests them? Of course, if those credentials aren't valid, nothing happens. We're using a BlueCoat SG400.
    Update: we're not looking for SSO where the user doesn't have to enter a password. We want the user to enter his domain credentials in the LDAPAuth dialog box, and the proxy to reuse them to authenticate against our application, or any application that uses NTLM. We've only got 1 AD domain behind the reverse proxy.

    Read the article

  • Accessing A Shared Directory That Has An Account White List

    - by Xan
    I'm on a LAN here at work and I have my desktop sharing some of my project folders. I can access the computer via \\ComputerIP\, but I can't actually open any of the folders. Upon attempting, I get the error:

        Windows cannot access \\ComputerIP\ProjectFolder
        You do not have permission to access \\ComputerIP\ProjectFolder. Contact your network administrator to request access. For more information about permissions, see Windows Help and Support

    Now, this is understood, considering I've made it so that you have to use the "Project" credentials to connect. I have a user account on my main computer hosting these shared folders that gives full access to the folders if you are this "Project" user. I can Remote Desktop to the computer just fine from my laptop using either set of credentials. When I try to open these folders, it doesn't give me the option to apply any credentials like it does when I Remote Desktop. How am I supposed to gain access to these folders?

    Read the article

  • How to debug ssh authentication failures with gssapi-with-mic

    - by Arthur Ulfeldt
    When I ssh to DOMAIN\user@localhosts-name, authentication works fine through gssapi-with-mic:

        debug3: remaining preferred: gssapi,publickey,keyboard-interactive,password
        debug3: authmethod_is_enabled gssapi-with-mic
        debug1: Next authentication method: gssapi-with-mic
        debug2: we sent a gssapi-with-mic packet, wait for reply
        debug3: Wrote 112 bytes for a total of 1255
        debug1: Delegating credentials
        debug3: Wrote 2816 bytes for a total of 4071
        debug1: Delegating credentials
        debug3: Wrote 80 bytes for a total of 4151
        debug1: Authentication succeeded (gssapi-with-mic).

    When I connect to a different machine, it just seems to stop halfway through the gssapi-with-mic authentication:

        debug1: Next authentication method: gssapi-with-mic
        debug2: we sent a gssapi-with-mic packet, wait for reply
        debug3: Wrote 112 bytes for a total of 1255
        debug1: Delegating credentials
        debug3: Wrote 2816 bytes for a total of 4071          <----- ????
        debug1: Authentications that can continue: publickey,gssapi-keyex,gssapi-with-mic,password,keyboard-interactive

    How should I go about finding out what happened differently the second time? How can I find out if/why the auth was rejected by Kerberos?

    Read the article

  • How to mount remote samba share from local host with multiple groups?

    - by Dragos
    I am using mount.cifs to mount a remote samba share (both client and server are Ubuntu Server 8.04) like this:

        mount.cifs //sambaserver/samba /mountpath -o credentials=/path/.credentials,uid=someuser,gid=1000

        $ cat .credentials
        username=user
        password=password

    I mounted a user from the local system with username and password with mount.cifs, but the problem is that the user is part of multiple groups on the remote system, and with mount.cifs I can only specify one gid. Is there a way to specify all the gids that the remote user has? Is there a way to:

    1. Mount the remote samba share with multiple groups on the local system?
    2. Browse the mount from 1) with the terminal, since I want to pass some files from samba as arguments to local programs?

    Other solutions would be: nautilus sftp://, which runs through gvfs; but the newer GNOME does not write ~/.gvfs to disk anymore, so I can't browse it in the terminal. And the last solution would be NFS, but that means I would have to synchronize the uids and gids on the local system with the ones from the server.

    Read the article

  • Remembering sharepoint password in Internet Explorer 8

    - by enableDeepak
    I am using IE8 to open a SharePoint portal on the local network. Initially, I clicked "remember password" after entering my domain credentials. However, now I want SharePoint to ask for credentials again. I've tried many options: deleting all cookies, clearing form autocomplete under the IE security settings, deleting everything there, and restarting my machine; in short, everything I could do. Still, when I open the portal, SharePoint logs me in automatically. What should I do to make IE ask for credentials again?

    Read the article

  • Exchange ActiveSync does not work for one user

    - by jshin47
    One particular user in our system is unable to connect to Exchange ActiveSync via her iPhone. When I try to connect using my own credentials on her iPhone it works (everything begins syncing), but when I input her credentials, the Settings app verifies the credentials are correct but nothing syncs. For example, if I open Mail, no items are shown. When I attempt to force a sync, it says "Cannot connect to server." In the Exchange 2010 Management Console the user is no different from the others, and Exchange ActiveSync is set to "Enable" in Mailbox Features. EDIT: Alternatively, if there is some easy way to create a new user account/mailbox and copy all of the contents of the old one over, I bet it would work, and that would be fine as well. She is a Mac user, so we do not have to worry about her Active Directory account.

    Read the article
