Search Results

Search found 19074 results on 763 pages for 'secure government government cloud security'.

Page 466 of 763

  • ING: Scaling Role Management and Access Certification to Thousands of Applications

    - by Tanu Sood
    Organizations deal with employee and user access certifications in different ways. There’s collation of multiple spreadsheets, an intense two-week exercise by managers, or use of access certification tools to do so across a handful of applications. But for most organizations compliance is about certifying user access for thousands of employees across hundreds of systems. Managing and auditing millions of entitlement combinations on a periodic basis poses a huge scale challenge. ING solved the compliance scale challenge using an Identity Platform approach. Join the live webcast featuring ING’s enterprise architect, Mark Robison, as he discusses how a platform approach offers value that is greater than the sum of its parts and enables ING to successfully meet their security and compliance goals. Mark will also share his implementation experiences and discuss the key requirements to manage the complexity and scale of access certification efforts at ING. Mark will be joined by Neil Gandhi, Principal Product Manager for Oracle Identity Analytics.

    Live Webcast
    ING: Scaling Role Management and Access Certification to Thousands of Applications
    Wednesday, April 11th at 10 am Pacific / 1 pm Eastern
    Register Today

    Read the article

  • Multiple Denial of Service vulnerabilities in libpng

    - by chandan
    CVE ID          Description                              CVSSv2 Base Score
    CVE-2007-5266   Denial of Service (DoS) vulnerability    4.3
    CVE-2007-5267   Denial of Service (DoS) vulnerability    4.3
    CVE-2007-5268   Denial of Service (DoS) vulnerability    4.3
    CVE-2007-5269   Denial of Service (DoS) vulnerability    5.0
    CVE-2008-1382   Denial of Service (DoS) vulnerability    7.5
    CVE-2008-3964   Denial of Service (DoS) vulnerability    4.3
    CVE-2009-0040   Denial of Service (DoS) vulnerability    6.8

    Component: PNG reference library (libpng)

    Product and Resolution:
    Solaris 10 - SPARC: 137080-03, X86: 137081-03
    Solaris 9  - SPARC: 139382-02 114822-06, X86: 139383-02
    Solaris 8  - SPARC: 114816-04, X86: 114817-04

    This notification describes vulnerabilities fixed in third-party components that are included in Sun's product distribution. Information about vulnerabilities affecting Oracle Sun products can be found on the Oracle Critical Patch Updates and Security Alerts page.

    Read the article

  • How do I resolve "No JSON object could be decoded" on mythbuntu live CD?

    - by Neil
    I have been running a MythTV frontend on my laptop for some time against a MythTV backend installed in Linux Mint 12 on another computer, and everything works fine. Now, I'm trying out the Mythbuntu Live CD (12.04.1 32-bit) on the laptop, to turn it into a dedicated front end. It's connecting to the network just fine, and I can see my server. When I click on the frontend icon on the desktop, it asks me for the security code, which I've verified against mythtv-setup on the server. However, when I test that code, it shows the error message "No JSON object could be decoded". I've looked in the control center to see if there's something else I should be setting up. The message above implies to me that it can't find the server, but I can find no place in the control center to tell it where to find my myth backend, which I find a little odd. Does the live CD not work against a backend server on another machine?

    Read the article

  • Synaptic returns error

    - by donvoldy666
    I get the following error message when I run Synaptic, and I can't install any programs:

        SystemError: W:Ignoring file 'getdeb.list.bck' in directory '/etc/apt/sources.list.d/' as it has an invalid filename extension,
        W:Duplicate sources.list entry http://extras.ubuntu.com/ubuntu/ precise/main i386 Packages (/var/lib/apt/lists/extras.ubuntu.com_ubuntu_dists_precise_main_binary-i386_Packages),
        W:Duplicate sources.list entry http://extras.ubuntu.com/ubuntu/ precise/main i386 Packages (/var/lib/apt/lists/extras.ubuntu.com_ubuntu_dists_precise_main_binary-i386_Packages),
        W:Duplicate sources.list entry http://extras.ubuntu.com/ubuntu/ precise/main i386 Packages (/var/lib/apt/lists/extras.ubuntu.com_ubuntu_dists_precise_main_binary-i386_Packages),
        W:Duplicate sources.list entry http://extras.ubuntu.com/ubuntu/ precise/main i386 Packages (/var/lib/apt/lists/extras.ubuntu.com_ubuntu_dists_precise_main_binary-i386_Packages),
        W:Duplicate sources.list entry http://extras.ubuntu.com/ubuntu/ precise/main i386 Packages (/var/lib/apt/lists/extras.ubuntu.com_ubuntu_dists_precise_main_binary-i386_Packages),
        W:Duplicate sources.list entry http://extras.ubuntu.com/ubuntu/ precise/main i386 Packages (/var/lib/apt/lists/extras.ubuntu.com_ubuntu_dists_precise_main_binary-i386_Packages),
        W:Duplicate sources.list entry http://ppa.launchpad.net/ubuntu-mozilla-security/ppa/ubuntu/ precise/main i386 Packages (/var/lib/apt/lists/ppa.launchpad.net_ubuntu-mozilla-security_ppa_ubuntu_dists_precise_main_binary-i386_Packages),
        E:Encountered a section with no Package: header,
        E:Problem with MergeList /var/lib/apt/lists/packages.rssowl.org_ubuntu_dists_precise_main_i18n_Translation-en,
        E:The package lists or status file could not be parsed or opened.

    I'm new to Linux, so please help me out.

    Read the article

  • How to create an Access database by using ADOX and Visual C# .NET

    - by SAMIR BHOGAYTA
    Build an Access Database

    1. Open a new Visual C# .NET console application.
    2. In Solution Explorer, right-click the References node and select Add Reference.
    3. On the COM tab, select Microsoft ADO Ext. 2.7 for DDL and Security, click Select to add it to the Selected Components, and then click OK.
    4. Delete all of the code from the code window for Class1.cs.
    5. Paste the following code into the code window:

        using System;
        using ADOX;

        class Class1
        {
            [STAThread]
            static void Main()
            {
                // Create the .mdb file through the ADOX COM library
                ADOX.CatalogClass cat = new ADOX.CatalogClass();
                cat.Create("Provider=Microsoft.Jet.OLEDB.4.0;" +
                           "Data Source=D:\\NewMDB.mdb;" +
                           "Jet OLEDB:Engine Type=5");

                // Write a confirmation to the console
                Console.WriteLine("Database Created Successfully");
                cat = null;
            }
        }
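    Once the catalog has been created, ADOX can also be used to define tables in the new database. The following is only a sketch that extends the sample above: the Contacts table and its columns are assumptions for illustration and are not part of the original steps; the connection string is the one from the sample.

        using System;
        using ADOX;

        class CreateDbWithTable
        {
            [STAThread]
            static void Main()
            {
                // Create the database exactly as in the sample above
                ADOX.CatalogClass cat = new ADOX.CatalogClass();
                cat.Create("Provider=Microsoft.Jet.OLEDB.4.0;" +
                           "Data Source=D:\\NewMDB.mdb;" +
                           "Jet OLEDB:Engine Type=5");

                // Define a hypothetical Contacts table with two columns
                ADOX.TableClass table = new ADOX.TableClass();
                table.Name = "Contacts";
                table.Columns.Append("ContactId", ADOX.DataTypeEnum.adInteger, 0);
                table.Columns.Append("FullName", ADOX.DataTypeEnum.adVarWChar, 100);

                // Append the table to the catalog, which is still connected to the new .mdb
                cat.Tables.Append(table);

                Console.WriteLine("Database and Contacts table created");
                cat = null;
            }
        }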

    Read the article

  • apt-get update bzip2 errors

    - by Tejas Kale
    I installed Ubuntu 11.10 today on my Lenovo W500. After that, when I tried running

        sudo apt-get update

    this is the error I am getting:

        Get:117 http://ftp.jaist.ac.jp oneiric-security/universe TranslationIndex [73 B]
        99% [48 Sources bzip2 0 B] [22 Sources bzip2 5,294 kB]   1,983 kB/s 0s
        bzip2: Compressed file ends unexpectedly;
          perhaps it is corrupted? *Possible* reason follows.
        bzip2: Inappropriate ioctl for device
          Input file = (stdin), output file = (stdout)
        It is possible that the compressed file(s) have become corrupted.
        You can use the -tvv option to test integrity of such files.
        You can use the `bzip2recover' program to attempt to recover
        data from undamaged sections of corrupted files.

    I found the following similar question: Errors while updating Ubuntu 11.10. But I have tried the solutions mentioned there (changing the download server, running apt-get clean, apt-get autoclean) and have also tried removing the /var/cache/apt/archives/lists directory. As a result of this error, I am unable to install any new packages.

    Read the article

  • Creating a remote management interface

    - by Johnny Mopp
    I'm looking for info on creating a remote management interface for our software. This is not anything illicit. Our software is for live TV production and once they go on-air we can't access the PC (usually through LogMeIn). I would like to be able to upload/download files and issue commands to our software. The commands would be software specific like "load this file" or "run this script" or "return this value" etc. A socket connection is preferred but the problem is most of our PCs are behind firewalls and NAT servers. I'm not sure where to start. I think HTTP tunneling is the way to go but am wondering if there are other options or recommendations. Also, assume our clients are not willing to open up ports for security reasons. Thanks.
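    Since the machines sit behind firewalls and NAT and the clients will not open inbound ports, one common pattern is to have the software itself poll a management server over outbound HTTPS for pending commands, which is essentially the HTTP tunneling idea mentioned above. The following is only a rough sketch of that approach; the management URL, the station query parameter and the plain-text command format are assumptions for illustration, not a description of any existing product.

        using System;
        using System.Net.Http;
        using System.Threading.Tasks;

        class CommandPoller
        {
            // Hypothetical management endpoint; outbound HTTPS only, so no inbound ports are required
            const string ManagementUrl = "https://management.example.com/api/commands?station=STUDIO-1";

            static async Task Main()
            {
                using (var client = new HttpClient())
                {
                    while (true)
                    {
                        // Ask the server whether a command is queued for this station
                        string command = await client.GetStringAsync(ManagementUrl);

                        if (!string.IsNullOrWhiteSpace(command))
                        {
                            // Hand the command ("load this file", "run this script", ...) to the application
                            Console.WriteLine("Received command: " + command);
                        }

                        await Task.Delay(TimeSpan.FromSeconds(5));
                    }
                }
            }
        }

    Results or files could be pushed back with an ordinary HTTP POST to the same server, so all traffic stays outbound and no ports need to be opened on the client side.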

    Read the article

  • "AND Operator" in PAM

    - by d_inevitable
    I need to prevent users from authenticating through Kerberos when the encrypted /home/users has not yet been mounted. (This is to avoid corrupting the ecryptfs mountpoint.) Currently I have these lines in /etc/pam.d/common-auth:

        auth required pam_group.so use_first_pass
        auth [success=2 default=ignore] pam_krb5.so minimum_uid=1000 try_first_pass
        auth [success=1 default=ignore] pam_unix.so nullok_secure try_first_pass

    I am planning to use pam_exec.so to execute a script that will exit 1 if the ecryptfs mounts are not ready yet. Doing this:

        auth required pam_exec.so /etc/security/check_ecryptfs

    will lock me out for good if ecryptfs for some reason fails. In such a case I would like to at least be able to log in with a local (non-Kerberos) user to fix the issue. Is there some sort of AND operator with which I can say that login through Kerberos+LDAP is only sufficient if both the Kerberos authentication and the ecryptfs mount have succeeded?

    Read the article

  • Wireless not working, no driver showing in the Additional Drivers window

    - by edit lopez
    I am a new user of Ubuntu. I have an Asus Q501L laptop that came preinstalled with Windows 8, but I wanted to move away from Windows and try something new, so I just decided to install Ubuntu without thinking too much about it. The problem I have is that I can't install the additional drivers. When I go to Additional Drivers nothing appears; it just says: "No proprietary drivers are in use", and in small letters continues: "A proprietary driver has private code that Ubuntu developers can't review and improve. Security and other updates are dependent on the driver vendor." I can't even use the wireless connection. I really don't know what to do. I tried to download the drivers from Asus, but when I tried to install them it said: "An error occurred while loading the archive." Also, I don't know the model of the PC's wireless card. If there is something I can do to find that out, please tell me. Thanks!

    Read the article

  • Connect to running web role on Azure using Remote Desktop Connection and VS2012

    - by Magnus Karlsson
    We want to be able to collect IntelliTrace information from our running app and also use remote desktop to connect to the IIS and look around (probably debugging).

    1. Create certificate

    1.1 Right-click the cloud project (marked in red) and select "Configure remote desktop".
    1.2 In the drop-down list of certificates, choose <create> at the bottom.
    1.3 Follow the instructions; you can set it up with default values.
    1.4 When done, choose the certificate and click "Copy to File…" as seen in the left of the picture above.
    1.5 Save the file with any name you want. Now we will save it to local storage to be able to import it to our solution through the Azure configuration manager in step 3.

    2. Save certificate to local storage

    Now we need to attach it to our local certificate storage to be able to reach it from the configuration manager in Visual Studio. Microsoft provides the following steps for doing this: http://support.microsoft.com/kb/232137

    In order to view the Certificates store on the local computer, perform the following steps:
    - Click Start, and then click Run.
    - Type "MMC.EXE" (without the quotation marks) and click OK.
    - Click Console in the new MMC you created, and then click Add/Remove Snap-in.
    - In the new window, click Add.
    - Highlight the Certificates snap-in, and then click Add.
    - Choose the Computer option and click Next.
    - Select Local Computer on the next screen, and then click OK.
    - Click Close, and then click OK.

    You have now added the Certificates snap-in, which will allow you to work with any certificates in your computer's certificate store. You may want to save this MMC for later use.

    Now that you have access to the Certificates snap-in, you can import the server certificate into your computer's certificate store by following these steps (a scripted alternative is sketched after this walkthrough):
    - Open the Certificates (Local Computer) snap-in and navigate to Personal, and then Certificates. Note: Certificates may not be listed. If not, that is because there are no certificates installed.
    - Right-click Certificates (or Personal if that option does not exist), choose All Tasks, and then click Import.
    - When the wizard starts, click Next.
    - Browse to the PFX file you created containing your server certificate and private key, and click Next.
    - Enter the password you gave the PFX file when you created it. Be sure the Mark the key as exportable option is selected if you want to be able to export the key pair again from this computer. As an added security measure, you may want to leave this option unchecked to ensure that no one can make a backup of your private key.
    - Click Next, and then choose the Certificate Store you want to save the certificate to. You should select Personal because it is a Web server certificate. If you included the certificates in the certification hierarchy, it will also be added to this store.
    - Click Next. You should see a summary screen showing what the wizard is about to do. If this information is correct, click Finish.

    You will now see the server certificate for your Web server in the list of Personal Certificates. It will be denoted by the common name of the server (found in the subject section of the certificate).

    Now that you have the certificate backup imported into the certificate store, you can enable Internet Information Services 5.0 to use that certificate (and the corresponding private key). To do this, perform the following steps:
    - Open the Internet Services Manager (under Administrative Tools) and navigate to the Web site you want to enable secure communications (SSL/TLS) on.
    - Right-click on the site and click Properties. You should now see the properties screen for the Web site. Click the Directory Security tab.
    - Under the Secure Communications section, click Server Certificate. This will start the Web Site Certificate Wizard. Click Next.
    - Choose the Assign an existing certificate option and click Next.
    - You will now see a screen showing the contents of your computer's personal certificate store. Highlight your Web server certificate (denoted by the common name), and then click Next.
    - You will now see a summary screen showing you all the details about the certificate you are installing. Be sure that this information is correct, or you may have problems using SSL or TLS in HTTP communications.
    - Click Next, and then click OK to exit the wizard.

    You should now have an SSL/TLS-enabled Web server. Be sure to protect your PFX files from any unwanted personnel.

    Image of a typical MMC.EXE with the certificates up.

    3. Import the certificate into your Visual Studio project

    3.1 Now right-click your equivalent of MvcWebRole1 (as seen in the first picture under the red oval) and choose Properties.
    3.2 Choose Certificates. Click the ellipsis to the right of the "Thumbprint" field and you should be able to select your newly created certificate there. After selecting it, save the file.

    4. Upload the certificate to your Azure subscription

    4.1 Go to the Azure management portal, click the services menu icon to the left and choose the service. Click Upload in the bottom menu.

    5. Connect to the server

    Since I tried to use account settings (have to use another name), we have to set up a new name for the connection. No biggie.

    5.1 Go to the Azure management portal, select your service and, in the bottom menu, choose "REMOTE". This will display the configuration for the remote connection. It will actually change your ServiceConfiguration.cscfg file. After you change it here it might be good to choose download and replace the one in your project. Set a name that is not your Windows Azure account name and not Administrator.

    5.2 Go to Visual Studio and open Server Explorer. Choose as selected in the picture below and click "Connect using remote desktop".

    5.3 You will now be able to log in with the name and password set up in step 5.1, and voila! Windows Server 2012, IIS and other nice stuff!

    To do this I've been using http://msdn.microsoft.com/en-us/library/windowsazure/ff683671.aspx where you can collect some of this information and more.
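    As a side note, the PFX import described in step 2 can also be scripted instead of using the MMC snap-in. The following is a minimal C# sketch, not part of the original walkthrough; the file path and password are placeholders for the PFX you exported in steps 1.4 and 1.5.

        using System;
        using System.Security.Cryptography.X509Certificates;

        class ImportPfx
        {
            static void Main()
            {
                // Placeholder path and password for the exported PFX file
                var cert = new X509Certificate2(
                    @"C:\certs\remote-desktop.pfx",
                    "pfx-password",
                    X509KeyStorageFlags.MachineKeySet | X509KeyStorageFlags.PersistKeySet);

                // Open the Local Computer / Personal store and add the certificate
                var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
                store.Open(OpenFlags.ReadWrite);
                store.Add(cert);
                store.Close();

                Console.WriteLine("Imported certificate with thumbprint: " + cert.Thumbprint);
            }
        }

    The thumbprint written out here is the same value you pick in the Certificates tab of the role properties in step 3.2.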

    Read the article

  • Failed update of Ubuntu 10.10 results in unbootable system

    - by chessweb
    Hi, yesterday I performed an automatic security update suggested by the update manager on my virtualized (with VirtualBox on a Windows 7 host) Ubuntu 10.10 installation. The update somehow failed and left me with an unbootable system. When I try to boot, I am told that various folders, files, and what not are missing. Then the system drops into a BusyBox shell and leaves me with an (initramfs) prompt. This happens with all kernels I am offered by GRUB, although the error messages are quite different from kernel to kernel. Well, the short of it is this: I don't have the slightest idea how to get back to a working system, and this site is the final straw I'm willing to grab. A complete disaster like this following an update initiated and executed by the system is unheard of in Windows-land; at least I haven't heard of it yet, and therefore I am going to abandon Ubuntu and Linux altogether if there is no remedy. Regards, RSel

    Read the article

  • Installing Perl modules and dependencies as non-root and without CPAN [migrated]

    - by Eegabooga
    I have been writing Perl scripts for my work, and the machine that I have been given to work on makes installing Perl modules difficult: We cannot have gcc on the machine for security reasons, so I cannot use CPAN to install most modules. I do not have access to the root account. Usually, when I want to install a module, I put in a request and I have to wait a day or two before it gets installed. I know that nobody would have a problem with me installing them myself, so to save everyone's time and my sanity I would like to install them myself. It's just a question of how best to do that. I have talked to various people and they said to use an RPM to install them (to get around not having gcc). However, installing modules from RPMs does not handle the dependencies, so I would need to resolve the dependencies manually, which could take a while. How can I best install Perl modules with these limitations?

    Read the article

  • Are you ready for the needed changes to your Supply Chain for 2013?

    - by Stephen Slade
    With the initiation of the Dodd-Frank Act, companies need to determine if their products contain 'conflict materials' from certain global markets such as the Republic of Congo. The materials include metals such as gold, tin, tungsten and tantalum. Companies with global sourcing face new disclosure requirements in February 2013 related to business being done in Iran. Public companies are required to disclose to U.S. securities regulators if they or their affiliates are engaged in business in Iran, either directly or indirectly. Is your supply chain compliant? Do you have sourcing reports to validate? Where are the materials in your chips and circuit boards coming from? In the next few weeks, responsible companies will be scrutinizing their supply chains, subsidiaries, JVs, and affiliates to search for exposure. Source: Brian Lane, attorney at Gibson Dunn & Crutcher, as printed in the WSJ, Tuesday, Dec 11, 2012, p. B8

    Read the article

  • Running a web browser on the screen saver or login screen

    - by Erik Johansson
    I would really like people to be able to use my locked computer to surf, so I would like some way to run a browser on the login screen. So can I make GDM run Firefox in some way? It would be cooler if I could have a browser as a screensaver, but that seems a bit harder. Please ignore all the security problems with this; if you let someone use your computer you have lost that race anyway. Though of course it would be nice to have the browser running as another user.

    Read the article

  • Can't access ephemeral storage on an Amazon Ubuntu instance

    - by matt burns
    I want to utilise my ephemeral storage as mentioned in this question, but I seem to be falling at the first hurdle. I can't even see /mnt:

        ~$ df -ah
        Filesystem      Size  Used Avail Use% Mounted on
        /dev/xvda1      8.0G  855M  6.8G  12% /
        proc               0     0     0    - /proc
        sysfs              0     0     0    - /sys
        none               0     0     0    - /sys/fs/fuse/connections
        none               0     0     0    - /sys/kernel/debug
        none               0     0     0    - /sys/kernel/security
        udev            288M  8.0K  288M   1% /dev
        devpts             0     0     0    - /dev/pts
        tmpfs           119M  152K  118M   1% /run
        none            5.0M     0  5.0M   0% /run/lock
        none            296M     0  296M   0% /run/shm

    This is from a vanilla instance of an Ubuntu AMI (12.04-amd64-server-20120424, ami-a29943cb). I'm not bothered about resizing the partition, I just want to be able to use the space for writing temp files.

    Read the article

  • Setting up Oracle Linux 6 with public-yum for all updates

    - by wcoekaer
    I just wanted to give you a quick example on how to get started with Oracle Linux 6 and start using the updates we published on http://public-yum.oracle.com.

    - Download Oracle Linux (without the requirement of a support subscription) from http://edelivery.oracle.com/linux.
    - Install Oracle Linux from the ISO or DVD image.
    - Log in as user root.
    - Download the yum repo file from http://public-yum.oracle.com:

        # cd /etc/yum.repos.d
        # wget http://public-yum.oracle.com/public-yum-ol6.repo

    - If you want, you can edit the repo file and enable other repositories; I enabled [ol6_UEK_latest] by just setting enabled=1 in the file with a text editor.
    - Run yum repolist to show the registered channels and you see we are including everything, including the latest published RPMs.

    Now you can just run yum update and any time we release new security errata or bugfix errata for OL6, they will be posted and you will automatically get them. It's very easy, very convenient and actually very cool. We do a lot more than just build OL RPMs and distribute them; we have a very comprehensive test farm where we test the packages extensively.

    Read the article

  • Creating an anonymous site in SharePoint 2010

    - by shehan
    Here's how:

    - Open up the Central Administration site and click on "Manage Web Applications" under the "Application Management" section.
    - From the ribbon click on "New". (Note: if it's an existing web app, then click on "Extend".)
    - Fill in the fields with appropriate values. Under "Security Configurations" make sure to select "Yes" for "Allow Anonymous".
    - Click OK.
    - Once the web application has been created, a site collection would need to be created. Navigate to "Application Management" -> "Create Site Collection".
    - Fill in the fields with the appropriate values and create the site collection.
    - Next, sign into the newly created site collection as the Site Collection Administrator.
    - From the "Site Actions" menu, select "Site Permissions".
    - In the permissions page that loads, click on the Anonymous Access button appearing on the ribbon.
    - A modal dialog will pop up. Select the appropriate option and click OK. If you selected "Entire Web Site" it is advisable to restart the browser to test anonymous access.

    Technorati Tags: SharePoint 2010, anonymous, site collection, web application
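    The site-level part of this (granting anonymous access from the Site Permissions page) can also be done in code against the SharePoint 2010 server object model. This is only a rough sketch under the assumption that the web application was already created with "Allow Anonymous" set to Yes as described above; the site collection URL is a placeholder.

        using System;
        using Microsoft.SharePoint;

        class EnableAnonymousAccess
        {
            static void Main()
            {
                // Placeholder URL for the site collection created above
                using (SPSite site = new SPSite("http://intranet.contoso.com/sites/public"))
                using (SPWeb web = site.RootWeb)
                {
                    // Roughly the equivalent of choosing "Entire Web Site" in the Anonymous Access dialog
                    web.AnonymousState = SPWeb.WebAnonymousState.On;
                    web.AnonymousPermMask64 = SPBasePermissions.Open |
                                              SPBasePermissions.ViewPages |
                                              SPBasePermissions.ViewListItems;
                    web.Update();

                    Console.WriteLine("Anonymous access enabled on " + web.Url);
                }
            }
        }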

    Read the article

  • Running a jar in a terminal using Axis2

    - by Emilio
    I'm trying to run a Java application distributed as a jar file from the command line. It calls an Axis2 web service, so the jar contains an /axis2client directory with the rampart.mar security module. It works fine when I run it in NetBeans, but it throws an exception if I try to run it in a terminal using this command:

        java -jar myfile.jar

    The exception:

        org.apache.axis2.AxisFault: invalid url: //file:/home/xxx/Desktop/myfile.jar!/axis2client/ (java.net.MalformedURLException: no protocol: //file:/home/xxx/Desktop/myfile.jar)

    As you can see, it's trying to use the /axis2client directory inside the jar, as when I run it in NetBeans, but it fails with a MalformedURLException. I think it's something about the 'file:' protocol; probably '//file:/' should be 'file:///'. The problem is that I cannot change this call to the directory, because the method that loads the /axis2client directory is not mine; it comes from another library that my project uses and that includes all the Axis2 support. So, any idea? Thanks in advance lads!

    Read the article

  • Larry Ellison Unveils Oracle Database In-Memory

    - by jgelhaus
    A Breakthrough Technology, Which Turns the Promise of Real-Time into a Reality

    Oracle Database In-Memory delivers leading-edge in-memory performance without the need to restrict functionality or accept compromises, complexity and risk. Deploying Oracle Database In-Memory with virtually any existing Oracle Database compatible application is as easy as flipping a switch; no application changes are required. It is fully integrated with Oracle Database's scale-up, scale-out, storage tiering, availability and security technologies, making it the most industrial-strength offering in the industry.

    Learn More:
    Read the Press Release
    Get Product Details
    View the Webcast On-Demand Replay
    Follow the conversation: #DB12c #OracleDBIM

    Read the article

  • Managed Service Accounts (MSA) and Virtual Accounts

Windows Server 2008 R2 and Windows 7 have two new types of service accounts called Managed Service Accounts (MSA) and Virtual Accounts. These make long-term management of service account users, passwords and SPNs much easier. Consider the environment at OrcsWeb. As a PCI compliant hosting company, we need to change all security related passwords every 3 months. This is a substantial undertaking each time because of hundreds of passwords spread throughout our enterprise. We...

    Read the article

  • Software center not opening

    - by kishore kumar
    $ software-center
    2012-09-07 18:45:04,349 - softwarecenter.fixme - WARNING - logs to the root logger: '('/usr/lib/python2.7/dist-packages/dbus/proxies.py', 410, '_introspect_error_handler')'
    2012-09-07 18:45:04,349 - dbus.proxies - ERROR - Introspect error on :1.128:/com/ubuntu/Softwarecenter: dbus.exceptions.DBusException: org.freedesktop.DBus.Error.NoReply: Did not receive a reply. Possible causes include: the remote application did not send a reply, the message bus security policy blocked the reply, the reply timeout expired, or the network connection was broken.
    2012-09-07 18:45:29,406 - softwarecenter.ui.gtk3.app - INFO - setting up proxy 'None'
    2012-09-07 18:45:29,409 - softwarecenter.db.database - INFO - open() database: path=None use_axi=True use_agent=True
    2012-09-07 18:45:29,822 - softwarecenter.backend.reviews - WARNING - Could not get usefulness from server, no username in config file
    2012-09-07 18:45:29,973 - softwarecenter.ui.gtk3.app - INFO - show_available_packages: search_text is '', app is None.
    2012-09-07 18:45:29,991 - softwarecenter.db.pkginfo_impl.aptcache - INFO - aptcache.open()
    Killed

    Read the article

  • WIF, ADFS 2 and WCF – Part 5: Service Client (more Flexibility with WSTrustChannelFactory)

    - by Your DisplayName here!
    See the previous posts first.

    WIF includes an API to manually request tokens from a token service. This gives you more control over the request and more flexibility, since you can use your own token caching scheme instead of being bound to the channel object lifetime. The API is straightforward. You first request a token from the STS and then use that token to create a channel to the relying party service. I'd recommend using the WS-Trust bindings that ship with WIF to talk to ADFS 2 – they are pre-configured to match the binding configuration of the ADFS 2 endpoints. The following code requests a token for a WCF service from ADFS 2:

        private static SecurityToken GetToken()
        {
            // Windows authentication over transport security
            var factory = new WSTrustChannelFactory(
                new WindowsWSTrustBinding(SecurityMode.Transport),
                stsEndpoint);
            factory.TrustVersion = TrustVersion.WSTrust13;

            var rst = new RequestSecurityToken
            {
                RequestType = RequestTypes.Issue,
                AppliesTo = new EndpointAddress(svcEndpoint),
                KeyType = KeyTypes.Symmetric
            };

            var channel = factory.CreateChannel();
            return channel.Issue(rst);
        }

    Afterwards, the returned token can be used to create a channel to the service. Again WIF has some helper methods here that make this very easy:

        private static void CallService(SecurityToken token)
        {
            // create binding and turn off sessions
            var binding = new WS2007FederationHttpBinding(
                WSFederationHttpSecurityMode.TransportWithMessageCredential);
            binding.Security.Message.EstablishSecurityContext = false;

            // create factory and enable WIF plumbing
            var factory = new ChannelFactory<IService>(binding, new EndpointAddress(svcEndpoint));
            factory.ConfigureChannelFactory<IService>();

            // turn off CardSpace - we already have the token
            factory.Credentials.SupportInteractive = false;

            var channel = factory.CreateChannelWithIssuedToken<IService>(token);

            channel.GetClaims().ForEach(c =>
                Console.WriteLine("{0}\n {1}\n  {2} ({3})\n",
                    c.ClaimType,
                    c.Value,
                    c.Issuer,
                    c.OriginalIssuer));
        }

    Why is this approach more flexible? Well – some don't like the configuration voodoo. That's a valid reason for using the manual approach. You also get more control over the token request itself since you have full control over the RST message that gets sent to the STS. One common parameter that you may want to set yourself is the appliesTo value. When you use the automatic token support in the WCF federation binding, the appliesTo is always the physical service address. This means in turn that this address will be used as the audience URI value in the SAML token. Well – this in turn means that when you have an application that consists of multiple services, you always have to configure all physical endpoint URLs in ADFS 2 and in the WIF configuration of the service(s). Having control over the appliesTo allows you to use more symbolic realm names, e.g. the base address or a completely logical name. Since the URL is never de-referenced, you have some degree of freedom here.

    In the next post we will look at the necessary code to request multiple tokens in a call chain. This is a common scenario when you first have to acquire a token from an identity provider and have to send that on to a federation gateway or Resource STS. Stay tuned.
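    To make the appliesTo point concrete, here is a small variation of the GetToken method above that requests the token for a logical realm instead of the physical service address. It is only a sketch: the realm URI is an assumption for illustration, and the same value would have to be registered as the relying party identifier in ADFS 2 and as the audience URI in the services' WIF configuration.

        private static SecurityToken GetTokenForRealm()
        {
            var factory = new WSTrustChannelFactory(
                new WindowsWSTrustBinding(SecurityMode.Transport),
                stsEndpoint);
            factory.TrustVersion = TrustVersion.WSTrust13;

            var rst = new RequestSecurityToken
            {
                RequestType = RequestTypes.Issue,

                // symbolic realm instead of the physical service address;
                // this value ends up as the audience URI in the issued SAML token
                AppliesTo = new EndpointAddress("https://fabrikam.com/app"),
                KeyType = KeyTypes.Symmetric
            };

            var channel = factory.CreateChannel();
            return channel.Issue(rst);
        }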

    Read the article

  • Dinner with someone who works for a bank

    - by Badr Hari
    So, I have to meet my girlfriend's parents, and for some reason they are both programmers. They both work in a bank, and as I understood it they are responsible for IT security issues. (I have no detailed information about it, because my girlfriend doesn't know anything about computers.) I want to make a good impression, especially because they know I can code. Is there anyone here who has a similar job, or who has some idea of what people in that field do, so I can do some research beforehand? It's extremely important for me; please give me some advice.

    Read the article

  • Controllers in CodeIgniter

    - by Dileep Dil
    I am a little bit new to the CodeIgniter framework, and this is my first project with it. During a chat on Stack Overflow somebody said that we need to make controllers as tiny as possible. Currently I have a default controller named home with 1332 lines of code (and increasing) and a model named Profunction with 1356 lines of code (and increasing). The controller class has about 46 functions in it, and so does the model class. I thought that CodeIgniter could handle large controllers or models well; are there any problems, performance issues, or security issues with this?

    Read the article

  • How can these files be accessed?

    - by harsh.singla
    The files can be accessed from every artifact of the composite, such as .bpel, .mplan, .task, .xsl, .wsdl, etc. The 'oramds' protocol is used to access these files. You need to set up your adf-config.xml file in your dev environment or JDeveloper to access these files from MDS. Here is the sample adf-config.xml:

        xmlns:sec="http://xmlns.oracle.com/adf/security/config"
        name="jdbc-url"
        name="metadata-path"
        credentialStoreLocation="../../src/META-INF/jps-config.xml"

    This adf-config.xml is located in a directory named .adf/META-INF, which is in the application home of your project. The application home is the directory where the .jws file of your application exists. Other than setting up this file, you need not make any other changes in your project or composite to access MDS. After setting this up, you can create a new SOA-MDS connection in your JDeveloper. This enables you to have a resource palette in which you can browse and choose the required file from MDS.

    Read the article
