Search Results

Search found 2225 results on 89 pages for 'jonathan ou'.


  • Tomcat 403 error after LDAP authentication.

    - by user352636
    I'm currently trying to use an LDAP server to authenticate users who are trying to access our Tomcat setup. I believe I have managed to get the LDAP authentication working in the form of a JNDI realm call from Tomcat, but immediately after the user enters their password Tomcat starts throwing 403 (permission denied) errors for everything except the root page (http://localhost:1337/). I have no idea why this is happening. I am following the example at http://blog.mc-thias.org/?title=tomcat_ldap_authentication&more=1&c=1&tb=1&pb=1 .
    server.xml (the interesting/changed bits):
        <Realm className="org.apache.catalina.realm.JNDIRealm" debug="99"
               connectionURL="ldap://localhost:389"
               userPattern="uid={0},ou=People,o=test,dc=company,dc=uk"
               userSubTree="true"
               roleBase="ou=Roles,o=test,dc=company,dc=uk"
               roleName="cn"
               roleSearch="memberUid={1}" />
        <Valve className="org.apache.catalina.authenticator.SingleSignOn" />
    web.xml (the interesting/changed bits):
        <security-constraint>
          <display-name>Security Constraint</display-name>
          <web-resource-collection>
            <web-resource-name>Protected Area</web-resource-name>
            <!-- Define the context-relative URL(s) to be protected -->
            <url-pattern>/*</url-pattern>
            <!-- If you list http methods, only those methods are protected -->
          </web-resource-collection>
          <auth-constraint>
            <!-- Anyone with one of the listed roles may access this area -->
            <role-name>admin</role-name>
            <role-name>regular</role-name>
          </auth-constraint>
        </security-constraint>
        <!-- Default login configuration uses form-based authentication -->
        <login-config>
          <auth-method>BASIC</auth-method>
        </login-config>
        <!-- Security roles referenced by this web application -->
        <security-role>
          <role-name>admin</role-name>
          <role-name>regular</role-name>
        </security-role>
    I cannot access my LDAP setup at the moment, but I believe it is all right, as the login is accepted by the BASIC auth method; it's just Tomcat that is rejecting it. The roles should be as defined in web.xml - admin and regular. If there is any other information you require me to provide, please just ask! My thanks in advance to anyone who can help, and my apologies for any major mistakes I have made - yesterday was pretty much the first time I'd ever heard of LDAP =D. EDIT: Fixed the second XML segment. Apologies for the formatting fail.
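    A 403 immediately after a successful BASIC login usually means the realm authenticated the user but could not match any of the roles named in the auth-constraint, so it is worth checking the role lookup separately from the bind. Below is a minimal, hedged sketch (plain JNDI, not part of the original post) that runs the same search the JNDIRealm would build from the roleBase/roleSearch settings above; the directory URL and base DN come from the server.xml snippet, while the uid "someUser" is a placeholder for whichever user is logging in. If it prints no entries with cn=admin or cn=regular, that would explain the 403.

        import java.util.Hashtable;
        import javax.naming.Context;
        import javax.naming.NamingEnumeration;
        import javax.naming.directory.DirContext;
        import javax.naming.directory.InitialDirContext;
        import javax.naming.directory.SearchControls;
        import javax.naming.directory.SearchResult;

        public class RoleSearchCheck {
            public static void main(String[] args) throws Exception {
                Hashtable<String, String> env = new Hashtable<>();
                env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
                env.put(Context.PROVIDER_URL, "ldap://localhost:389");

                DirContext ctx = new InitialDirContext(env);
                SearchControls sc = new SearchControls();
                sc.setSearchScope(SearchControls.ONELEVEL_SCOPE);

                // Same lookup the realm performs: roleBase + roleSearch="memberUid={1}"
                NamingEnumeration<SearchResult> results = ctx.search(
                        "ou=Roles,o=test,dc=company,dc=uk",
                        "(memberUid={0})", new Object[] { "someUser" }, sc);

                while (results.hasMore()) {
                    SearchResult r = results.next();
                    System.out.println(r.getNameInNamespace() + " -> " + r.getAttributes().get("cn"));
                }
                ctx.close();
            }
        }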

    Read the article

  • Creating keystore for jarsigner programmatically

    - by skayred
    I'm trying to generate a keystore with a certificate to use with jarsigner. Here is my code:
        System.out.println("Keystore generation...");
        Security.addProvider(new BouncyCastleProvider());

        String domainName = "example.org";
        KeyPairGenerator keyGen = KeyPairGenerator.getInstance("RSA");
        SecureRandom random = SecureRandom.getInstance("SHA1PRNG", "SUN");
        keyGen.initialize(1024, random);
        KeyPair pair = keyGen.generateKeyPair();

        X509V3CertificateGenerator v3CertGen = new X509V3CertificateGenerator();
        int serial = new SecureRandom().nextInt();
        v3CertGen.setSerialNumber(BigInteger.valueOf(serial < 0 ? -1 * serial : serial));
        v3CertGen.setIssuerDN(new X509Principal("CN=" + domainName + ", OU=None, O=None L=None, C=None"));
        v3CertGen.setNotBefore(new Date(System.currentTimeMillis() - 1000L * 60 * 60 * 24 * 30));
        v3CertGen.setNotAfter(new Date(System.currentTimeMillis() + (1000L * 60 * 60 * 24 * 365*10)));
        v3CertGen.setSubjectDN(new X509Principal("CN=" + domainName + ", OU=None, O=None L=None, C=None"));
        v3CertGen.setPublicKey(pair.getPublic());
        v3CertGen.setSignatureAlgorithm("MD5WithRSAEncryption");
        X509Certificate PKCertificate = v3CertGen.generateX509Certificate(pair.getPrivate());

        FileOutputStream fos = new FileOutputStream("/Users/dmitrysavchenko/testCert.cert");
        fos.write(PKCertificate.getEncoded());
        fos.close();

        KeyStore ks = KeyStore.getInstance(KeyStore.getDefaultType());
        char[] password = "123".toCharArray();
        ks.load(null, password);
        ks.setCertificateEntry("hive", PKCertificate);

        fos = new FileOutputStream("/Users/dmitrysavchenko/hive-keystore.pkcs12");
        ks.store(fos, password);
        fos.close();
    It works, but when I'm trying to sign my JAR with this keystore, I get the following error:
        jarsigner: Certificate chain not found for: hive.  hive must reference a valid KeyStore key entry containing a private key and corresponding public key certificate chain.
    I've discovered that there must be a private key, but I don't know how to add it to the certificate. Can you help me?
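    The jarsigner message points at the kind of entry stored under the "hive" alias: setCertificateEntry creates a trusted-certificate entry with no private key, while jarsigner needs a key entry. A minimal sketch of the missing step, reusing the pair, PKCertificate, ks and password variables from the code above (the output file name is only an example), is to store the private key together with its certificate chain via setKeyEntry:

        // A key entry = private key + certificate chain; this is what jarsigner signs with.
        // setCertificateEntry alone only stores a trusted certificate.
        java.security.cert.Certificate[] chain = { PKCertificate };
        ks.setKeyEntry("hive", pair.getPrivate(), password, chain);

        FileOutputStream keyFos = new FileOutputStream("/Users/dmitrysavchenko/hive-keystore.jks");
        ks.store(keyFos, password);
        keyFos.close();

    jarsigner can then be pointed at that keystore with -keystore and the alias "hive".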

    Read the article

  • Java Netscape LDAP Remove One Attribute

    - by spex
    Hi, I have an LDAP schema with users in it. I need to remove one attribute, named "notify", whose values are a phone number or an e-mail address - that is, remove the attribute from the user. I found this method:
        LDAPConnection myCon = new LDAPConnection("localhost", 389);
        myCon.delete("uid=test1, ou=People, o=domain.com, o=isp");
    but this removes the whole user, and I need to remove only the single attribute "notifyTo" of this user. I need to remove the whole attribute, not only its value. Thanks for any reply.
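    A delete modification on the entry, rather than a delete of the entry itself, is the usual way to do this. Here is a small, hedged sketch using the same Netscape/Mozilla LDAP Java SDK classes as the question; the bind DN and password are placeholders, and the attribute name follows the question's "notifyTo". Listing no values in the LDAPAttribute removes the attribute as a whole rather than a single value.

        import netscape.ldap.LDAPAttribute;
        import netscape.ldap.LDAPConnection;
        import netscape.ldap.LDAPException;
        import netscape.ldap.LDAPModification;

        public class RemoveNotifyTo {
            public static void main(String[] args) throws LDAPException {
                LDAPConnection con = new LDAPConnection();
                con.connect("localhost", 389);
                // Placeholder credentials - any DN that may modify the entry.
                con.authenticate("cn=Directory Manager", "secret");

                // DELETE with an attribute that has no values = drop the whole attribute.
                LDAPModification mod = new LDAPModification(
                        LDAPModification.DELETE, new LDAPAttribute("notifyTo"));
                con.modify("uid=test1, ou=People, o=domain.com, o=isp", mod);

                con.disconnect();
            }
        }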

    Read the article

  • java.util.Random zero argument enquiry

    - by deerb
    I am trying to code a game following instructions contained in an OU TMA document, which read: "In the constructor, write code to assign a new instance of Random to ran, which you should create using the Random class's zero argument constructor." Will this code work?
        Random ran = new Random(0);
    I am a relative newbie to Java, and I don't understand exactly what the instructions mean.
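    For reference, "zero argument constructor" means the constructor that takes no arguments at all; new Random(0) calls the one-argument constructor with a fixed seed of 0, so it produces the same sequence on every run. A short sketch of the difference:

        import java.util.Random;

        public class RandomDemo {
            public static void main(String[] args) {
                Random ran = new Random();     // zero-argument constructor, seeded automatically
                Random seeded = new Random(0); // one-argument constructor, fixed seed, repeatable

                System.out.println(ran.nextInt(6));    // usually differs between runs
                System.out.println(seeded.nextInt(6)); // same value every run
            }
        }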

    Read the article

  • Part 2 – Load Testing In The Cloud

    - by Tarun Arora
    Welcome to Part 2. In Part 1 we discussed the advantages of creating a Test Rig in the cloud, the Azure edge and the Test Rig topology we want to get to. In Part 2, let's start by understanding the components of Azure we'll be making use of, followed by manually putting them together to create the test rig. So, let's get down and dirty and start setting up the Test Rig.
    What Components of Azure will I be using for building the Test Rig in the Cloud?
    To run the Test Agents we'll make use of Windows Azure Compute, and to enable communication between the Test Controller and the Test Agents we'll make use of Windows Azure Connect.
    Azure Connect
    The Test Controller is on premise and the Test Agents are in the cloud (how will they talk?). To enable communication between the two, we'll make use of Windows Azure Connect. With Windows Azure Connect, you can use a simple user interface to configure IPsec protected connections between computers or virtual machines (VMs) in your organization's network, and roles running in Windows Azure. With this you can now join Windows Azure role instances to your domain, so that you can use your existing methods for domain authentication, name resolution, or other domain-wide maintenance actions. For more details refer to an overview of Windows Azure Connect. A very useful video explains everything you wanted to know about Windows Azure Connect.
    Azure Compute
    Windows Azure Compute provides developers a platform to host and manage applications in Microsoft's data centres across the globe. A Windows Azure application is built from one or more components called 'roles.' Roles come in three different types: Web role, Worker role, and Virtual Machine (VM) role; we'll be using the Worker role to set up the Test Agents. A very nice blog post discusses the difference between the 3 role types. Developers are free to use the .NET framework or other software that runs on Windows with the Worker role or Web role. Developers can also create applications using languages such as PHP and Java. More on Windows Azure Compute. Each Windows Azure compute instance represents a virtual server:
        Virtual Machine Size   CPU Cores   Memory     Cost Per Hour
        Extra Small            Shared      768 MB     $0.04
        Small                  1           1.75 GB    $0.12
        Medium                 2           3.50 GB    $0.24
        Large                  4           7.00 GB    $0.48
        Extra Large            8           14.00 GB   $0.96
    You might want to review the Windows Azure Pricing FAQ.
    Let's Get Started building the Test Rig…
    Configuration
        Machine   Role                                 Comments
        VM – 1    Domain Controller for Playpit.com    On Premise
        VM – 2    TFS, Test Controller                 On Premise
        VM – 3    Test Agent                           Cloud
    In this blog post I assume that you have the domain, Team Foundation Server and Test Controller installed and set up already. If not, please refer to the TFS 2010 Installation Guide and this walkthrough on MSDN to set up your Test Controller. You can also download a preconfigured TFS 2010 VM from Brian Keller's blog; Brian also has some great hands-on labs on TFS 2010 that you may want to explore.
    I. Let's start building VM – 3: The Test Agent
    Download the Windows Azure SDK and Tools.
    Open Visual Studio and create a new Windows Azure Project using the Cloud Template.
    Choose the Worker Role, for reasons explained in the earlier post.
    The WorkerRole.cs implements the Run() and OnStart() methods; no code changes are required. You should be able to compile the project and run it in the compute emulator (the compute emulator should have been installed as part of the Windows Azure Toolkit) on your local machine.
We will only be making changes to WindowsAzureProject, open ServiceDefinition.csdef. Ensure that the vmsize is small (remember the cost chart above). Import the “Connect” module. I am importing the Connect module because I need to join the Worker role VM to the Playpit domain. <?xml version="1.0" encoding="utf-8"?> <ServiceDefinition name="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition"> <WorkerRole name="WorkerRole1" vmsize="Small"> <Imports> <Import moduleName="Diagnostics" /> <Import moduleName="Connect"/> </Imports> </WorkerRole> </ServiceDefinition> Go to the ServiceConfiguration.Cloud.cscfg and note that settings with key ‘Microsoft.WindowsAzure.Plugins.Connect.%%%%’ have been added to the configuration file. This is because you decided to import the connect module. See the config below. <?xml version="1.0" encoding="utf-8"?> <ServiceConfiguration serviceName="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="1" osVersion="*"> <Role name="WorkerRole1"> <Instances count="1" /> <ConfigurationSettings> <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.ActivationToken" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Refresh" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.WaitForConnectivity" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Upgrade" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.EnableDomainJoin" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainFQDN" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainControllerFQDN" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainAccountName" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainPassword" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainOU" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Administrators" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainSiteName" value="" /> </ConfigurationSettings> </Role> </ServiceConfiguration>             Let’s go step by step and understand all the highlighted parameters and where you can find the values for them.       osFamily – By default this is set to 1 (Windows Server 2008 SP2). Change this to 2 if you want the Windows Server 2008 R2 operating system. The Advantage of using osFamily = “2” is that you get Powershell 2.0 rather than Powershell 1.0. In Powershell 2.0 you could simply use “powershell -ExecutionPolicy Unrestricted ./myscript.ps1” and it will work while in Powershell 1.0 you will have to change the registry key by including the following in your command file “reg add HKLM\Software\Microsoft\PowerShell\1\ShellIds\Microsoft.PowerShell /v ExecutionPolicy /d Unrestricted /f” before you can execute any power shell. The other reason you might want to move to os2 is if you wanted IIS 7.5.       Activation Token – To enable communication between the on premise machine and the Windows Azure Worker role VM both need to have the same token. Log on to Windows Azure Management Portal, click on Connect, click on Get Activation Token, this should give you the activation token, copy the activation token to the clipboard and paste it in the configuration file. 
Note – Later in the blog I’ll be showing you how to install connect on the on premise machine.                       EnableDomainJoin – Set the value to true, ofcourse we want to join the on windows azure worker role VM to the domain.       DomainFQDN, DomainControllerFQDN, DomainAccountName, DomainPassword, DomainOU, Administrators – This information is specific to your domain. I have extracted this information from the ‘service manager’ and ‘Active Directory Users and Computers’. Also, i created a new Domain-OU namely ‘CloudInstances’ so all my cloud instances joined to my domain show up here, this is optional. You can encrypt the DomainPassword – refer to the instructions here. Or hold fire, I’ll be covering that when i come to certificates and encryption in the coming section.       Now once you have filled all this information up, the configuration file should look something like below, <?xml version="1.0" encoding="utf-8"?> <ServiceConfiguration serviceName="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="2" osVersion="*"> <Role name="WorkerRole1"> <Instances count="1" /> <ConfigurationSettings> <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.ActivationToken" value="45f55fea-f194-4fbc-b36e-25604faac784" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Refresh" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.WaitForConnectivity" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Upgrade" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.EnableDomainJoin" value="true" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainFQDN" value="play.pit.com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainControllerFQDN" value="WIN-KUDQMQFGQOL.play.pit.com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainAccountName" value="playpit\Administrator" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainPassword" value="************************" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainOU" value="OU=CloudInstances, DC=Play, DC=Pit, DC=com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Administrators" value="Playpit\Administrator" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainSiteName" value="" /> </ConfigurationSettings> </Role> </ServiceConfiguration> Next we will be enabling the Remote Desktop module in to the ServiceDefinition.csdef, we could make changes manually or allow a beautiful wizard to help us make changes. I prefer the second option. So right click on the Windows Azure project and choose Publish       Now once you get the publish wizard, if you haven’t already you would be asked to import your Windows Azure subscription, this is simply the Msdn subscription activation key xml. Once you have done click Next to go to the Settings page and check ‘Enable Remote Desktop for all roles’.       As soon as you do that you get another pop up asking you the details for the user that you would be logging in with (make sure you enter a reasonable expiry date, you do not want the user account to expire today). Notice the more information tag at the bottom, click that to get access to the certificate section. See screen shot below.       
From the drop down select the option to create a new certificate        In the pop up window enter the friendly name for your certificate. In my case I entered ‘WAC – Test Rig’ and click ok. This will create a new certificate for you. Click on the view button to see the certificate details. Do you see the Thumbprint, this is the value that will go in the config file (very important). Now click on the Copy to File button to copy the certificate, we will need to import the certificate to the windows Azure Management portal later. So, make sure you save it a safe location.                                Click Finish and enter details of the user you would like to create with permissions for remote desktop access, once you have entered the details on the ‘Remote desktop configuration’ screen click on Ok. From the Publish Windows Azure Wizard screen press Cancel. Cancel because we don’t want to publish the role just yet and Yes because we want to save all the changes in the config file.       Now if you go to the ServiceDefinition.csdef file you will see that the RemoteAccess and RemoteForwarder roles have been imported for you. <?xml version="1.0" encoding="utf-8"?> <ServiceDefinition name="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition"> <WorkerRole name="WorkerRole1" vmsize="Small"> <Imports> <Import moduleName="Diagnostics" /> <Import moduleName="Connect" /> <Import moduleName="RemoteAccess" /> <Import moduleName="RemoteForwarder" /> </Imports> </WorkerRole> </ServiceDefinition> Now go to the ServiceConfiguration.Cloud.cscfg file and you see a whole bunch for setting “Microsoft.WindowsAzure.Plugins.RemoteAccess.%%%” values added for you. <?xml version="1.0" encoding="utf-8"?> <ServiceConfiguration serviceName="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="2" osVersion="*"> <Role name="WorkerRole1"> <Instances count="1" /> <ConfigurationSettings> <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.ActivationToken" value="45f55fea-f194-4fbc-b36e-25604faac784" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Refresh" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.WaitForConnectivity" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Upgrade" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.EnableDomainJoin" value="true" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainFQDN" value="play.pit.com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainControllerFQDN" value="WIN-KUDQMQFGQOL.play.pit.com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainAccountName" value="playpit\Administrator" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainPassword" value="************************" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainOU" value="OU=CloudInstances, DC=Play, DC=Pit, DC=com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Administrators" value="Playpit\Administrator" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainSiteName" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.Enabled" value="true" /> <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountUsername" value="Administrator" /> <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountEncryptedPassword" 
value="MIIBnQYJKoZIhvcNAQcDoIIBjjCCAYoCAQAxggFOMIIBSgIBADAyMB4xHDAaBgNVBAMME1dpbmRvd 3MgQXp1cmUgVG9vbHMCEGa+B46voeO5T305N7TSG9QwDQYJKoZIhvcNAQEBBQAEggEABg4ol5Xol66Ip6QKLbAPWdmD4ae ADZ7aKj6fg4D+ATr0DXBllZHG5Umwf+84Sj2nsPeCyrg3ZDQuxrfhSbdnJwuChKV6ukXdGjX0hlowJu/4dfH4jTJC7sBWS AKaEFU7CxvqYEAL1Hf9VPL5fW6HZVmq1z+qmm4ecGKSTOJ20Fptb463wcXgR8CWGa+1w9xqJ7UmmfGeGeCHQ4QGW0IDSBU6ccg vzF2ug8/FY60K1vrWaCYOhKkxD3YBs8U9X/kOB0yQm2Git0d5tFlIPCBT2AC57bgsAYncXfHvPesI0qs7VZyghk8LVa9g5IqaM Cp6cQ7rmY/dLsKBMkDcdBHuCTAzBgkqhkiG9w0BBwEwFAYIKoZIhvcNAwcECDRVifSXbA43gBApNrp40L1VTVZ1iGag+3O1" /> <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountExpiration" value="2012-11-27T23:59:59.0000000+00:00" /> <Setting name="Microsoft.WindowsAzure.Plugins.RemoteForwarder.Enabled" value="true" /> </ConfigurationSettings> <Certificates> <Certificate name="Microsoft.WindowsAzure.Plugins.RemoteAccess.PasswordEncryption" thumbprint="AA23016CF0BDFC344400B5B82706B608B92E4217" thumbprintAlgorithm="sha1" /> </Certificates> </Role> </ServiceConfiguration>          Okay let’s look at them one at a time,       Enabled - Yes, we would like to enable Remote Access.       AccountUserName – This is the user name you entered while you were on the publish windows azure role screen, as detailed above.       AccountEncrytedPassword – Try and decode that, the certificate is used to encrypt the password you specified for the user account. Remember earlier i said, either use the instructions or wait and i’ll be showing you encryption, now the user account i am using for rdp has the same password as my domain password, so i can simply copy the value of the AccountEncryptedPassword to the DomainPassword as well.       AccountExpiration – This is the expiration as you specified in the wizard earlier, make sure your account does not expire today.       Remote Forwarder – Check out the documentation, below is how I understand it, -- One role in an application that implements a remote desktop connection must import the RemoteForwarder module. The two modules work together to enable the remote desktop connections to role instances. -- If you have multiple roles defined in the service model, it does not matter which role you add the RemoteForwarder module to, but you must add it to only one of the role definitions.       Certificate – Remember the certificate thumbprint from the wizard, the on premise machine and windows azure role machine that need to speak to each other must have the same thumbprint. More on that when we install Windows Azure connect Endpoints on the on premise machine. As i said earlier, in this blog post, I’ll be showing you the manual process so i won’t be scripting any star up tasks to install the test agent or register the test agent with the TFS Server. I’ll be showing you all this cool stuff in the next blog post, that’s because it’s important to understand the manual side of it, it becomes easier for you to troubleshoot in case something fails. Having said that, the changes we have made are sufficient to spin up the Windows Azure Worker Role aka Test Agent VM, have it connected with the play.pit.com domain and have remote access enabled on it. Before we deploy the Test Agent VM we need to set up Windows Azure Connect on the TFS Server. II. Windows Azure Connect: Setting up Connect on VM – 2 i.e. TFS & Test Controller Glad you made it so far, now to enable communication between the on premise TFS/Test Controller and Azure-ed Test Agent we need to enable communication. 
We have set up the Azure connect module in the Test Agent configuration, now the connect end points need to be enabled on the on premise machines, let’s have a look at how we can do this. Log on to VM – 2 running the TFS Server and Test Controller Log on to the Windows Azure Management Portal and click on Virtual Network Click on Virtual Network, if you already have a subscription you should see the below screen shot, if not, you would be asked to complete the subscription first        Click on Install Local Endpoints from the top left on the panel and you get a url appended with a token id in it, remember the token i showed you earlier, in theory the token you get here should match the token you added to the Test Agent config file.        Copy the url to the clip board and paste it in IE explorer (important, the installation at present only works out of IE and you need to have cookies enabled in order to complete the installation). As stated in the pop up, you can NOT download and run the software later, you need to run it as is, since it contains a token. Once the installation completes you should see the Windows Azure connect icon in the system tray.                         Right click the Azure Connect icon, choose Diagnostics and refer to this link for diagnostic detail terminology. NOTE – Unfortunately I could not see the Windows Azure connect icon in the system tray, a bit of binging with Google revealed that the azure connect icon is only shown when the ‘Windows Azure Connect Endpoint’ Service is started. So go to services.msc and make sure that the service is started, if not start it, unfortunately again, the service did not start for me on a manual start and i realised that one of the dependant services was disabled, you can look at the service dependencies and start them and then start windows azure connect. Bottom line, you need to start Windows Azure connect service before you can proceed. Please refer here on MSDN for more on Troubleshooting Windows Azure connect. (Follow the next step as well)   Now go back to the Windows Azure Management Portal and from Groups and Roles create a new group, lets call it ‘Test Rig’. Make sure you add the VM – 2 (the TFS Server VM where you just installed the endpoint).       Now if you go back to the Azure Connect icon in the system tray and click ‘Refresh Policy’ you will notice that the disconnected status of the icon should change to ready for connection. III. Importing Certificate in to Windows Azure Management Portal But before that you need to import the certificate you created in Step I in to the Windows Azure Management Portal. Log on to the Windows Azure Management Portal and click on ‘Hosted Services, Storage Accounts & CDN’ and then ‘Management Certificates’ followed by Add Certificates as shown in the screen shot below        Browse to the location where you saved the certificate earlier, remember… Refer to Step I in case you forgot.        Now you should be able to see the imported certificate here, make sure the thumbprint of the certificate matches the one you inserted in the config files        IV. Publish Windows Azure Worker Role aka Test Agent Having completed I, II and III, you are ready to publish the Test Agent VM – 3 to the cloud. Go to Visual Studio and right click the Windows Azure project and select Publish. Verify the infomration in the wizard, from the advanced settings tab, you can also enabled capture of intellitrace or profiling information.         Click Next and Click Publish! 
From the view menu bar select the Windows Azure Activity Log window.       Now you should be able to see the deployment progress in real time.             In the Windows Azure Management Portal, you should also be able to see the progress of creation of a new Worker Role.       Once the deployment is complete you should be able to RDP (go to run prompt type mstsc and in the pop up the machine name) in to the Test Agent Worker Role VM from the Playpit network using the domain admin user account. In case you are unable to log in to the Test Agent using the domain admin user account it means the process of joining the Test Agent to the domain has failed! But the good news is, because you imported the connect module, you can connect to the Test Agent machine using Windows Azure Management Portal and troubleshoot the reason for failure, you will be able to log in with the user name and password you specified in the config file for the keys ‘RemoteAccess.AccountUsername, RemoteAccess.EncryptedPassword (just that enter the password unencrypted)’, fix it or manually join the machine to the domain. Once you have managed to Join the Test Agent VM to the Domain move to the next step.      So, log in to the Test Agent Worker Role VM with the Playpit Domain Administrator and verify that you can log in, the machine is connected to the domain and the connect service is successfully running. If yes, give your self a pat on the back, you are 80% mission accomplished!         Go to the Windows Azure Management Portal and click on Virtual Network, click on Groups and Roles and click on Test Rig, click Edit Group, the edit the Test Rig group you created earlier. In the Connect to section, click on Add to select the worker role you have just deployed. Also, check the ‘Allow connections between endpoints in the group’ with this you will enable to communication between test controller and test agents and test agents/test agents. Click Save.      Now, you are ready to deploy the Test Agent software on the Worker Role Test Agent VM and configure it to work with the Test Controller. V. Configuring VM – 3: Installing Test Agent and Associating Test Agent to Controller Log in to the Worker Role Test Agent VM that you have just successfully deployed, make sure you log in with the domain administrator account. Download the All Agents software from MSDN, ‘en_visual_studio_agents_2010_x86_x64_dvd_509679.iso’, extract the iso and navigate to where you have extracted the iso. In my case, i have extracted the iso to “C:\Resources\Temp\VsAgentSetup”. Open the Test Agent folder and double click on setup.exe. Once you have installed the Test Agent you should reach the configuration window. If you face any issues installing TFS Test Agent on the VM, refer to the walkthrough on MSDN.       Once you have successfully installed the Test Agent software you will need to configure the test agent. Right click the test agent configuration tool and run as a different user. i.e. an Administrator. This is really to run the configuration wizard with elevated privileges (you might have UAC block something's otherwise).        In the run options, you can select ‘service’ you do not need to run the agent as interactive un less you are running coded UI tests. I have specified the domain administrator to connect to the TFS Test Controller. In real life, i would never do that, i would create a separate test user service account for this purpose. 
    But for the blog post, we are using the most powerful user so that any policies or restrictions don't block you. Click the Apply Settings button and you should be all green! If not, the summary usually gives helpful error messages that you can resolve before proceeding. In my experience, you may run into either a permission issue or a firewall blocking communication.
    And now the moment of truth! Go to VM – 2, open up Visual Studio and from the Test menu select Manage Test Controller. Mission accomplished! You should be able to see the Test Agent that you have just configured here.
    VI. Creating and Running Load Tests on your brand new Azure-ed Test Rig
    I have various blog posts on Performance Testing with Visual Studio Ultimate; you can follow the links and videos below.
    Blog Posts:
    - Part 1 – Performance Testing using Visual Studio 2010 Ultimate
    - Part 2 – Performance Testing using Visual Studio 2010 Ultimate
    - Part 3 – Performance Testing using Visual Studio 2010 Ultimate
    Videos:
    - Test Tools Configuration & Settings in Visual Studio
    - Why & How to Record Web Performance Tests in Visual Studio Ultimate
    - Goal Driven Load Testing using Visual Studio Ultimate
    Now that you have created your load tests, there is one last change you need to make before you can run the tests on your Azure Test Rig: create a new test settings file, change the Test Execution method to 'Remote Execution' and select the test controller you have configured the Worker Role Test Agent against, in our case VM – 2. So, go on, fire off a test run and see the results of the test being executed on the Azure-ed Test Rig.
    Review and What's next?
    A quick recap of the benefits of running the Test Rig in the cloud and what I will be covering in the next blog post - and I would love to hear your feedback!
    Advantages
    - Utilizing the power of Azure compute to run a heavy virtual user load.
    - Benefiting from Azure's flexibility: destroy Test Agents when not in use; it takes < 25 minutes to spin up a new Test Agent.
    - Most important, testing network latency (network latency and speed of connection are two different things - usually network latency is very hard to test): by placing the Test Agents in Microsoft data centres around the globe, one can actually test the lag in transferring the bytes, not because of a slow connection but because the page has been requested from the other side of the globe.
    Next Steps
    The process of spinning up the Test Agents in Windows Azure is not 100% automated. I am working on the worker process and PowerShell scripts to make the role deployment, the unattended install of the test agent software and the registration of the test agent to the test controller automated. In the next blog post I will show you how to make the complete process unattended and automated. Remember to subscribe to http://feeds.feedburner.com/TarunArora. Hope you enjoyed this post, I would love to hear your feedback! If you have any recommendations on things that I should consider, or any questions or feedback, feel free to leave a comment. See you in Part III.

    Read the article

  • Openldap/Sasl/GSSAPI on Debian: Key table entry not found

    - by badbishop
    The goal: to make an OpenLDAP server to authenticate using Kerberos V via GSSAPI Setup: several virtual machines running on freshly installed/updated Debian Squeeze A master KDC server kdc.example.com A LDAP server, running OpenLDAP ldap.example.com The problem: tom@ldap:~$ ldapsearch -b 'dc=example,dc=com' SASL/GSSAPI authentication started ldap_sasl_interactive_bind_s: Other (e.g., implementation specific) error (80) additional info: SASL(-1): generic failure: GSSAPI Error: Unspecified GSS failure. Minor code may provide more information (Key table entry not found) One might suggest to add that bloody keytab entry, but here's the real problem: ktutil: rkt /etc/ldap/ldap.keytab ktutil: list slot KVNO Principal ---- ---- --------------------------------------------------------------------- 1 2 ldap/[email protected] 2 2 ldap/[email protected] 3 2 ldap/[email protected] 4 2 ldap/[email protected] So, the entry as suggested by the OpenLDAP manual is there allright. Deleting and re-creating both service principal and the keytab on ldap.example.com didn't help, I get the same error. And before I make the keytab file readable by openldap, I get "Permission denied" error instead of the one in the subject. Which implies, that the right keytab file is being accessed, as set in /etc/default/slapd. I have my doubts about the following part of slapd config: root@ldap:~# cat /etc/ldap/slapd.d/cn\=config.ldif | grep -v "^#" dn: cn=config objectClass: olcGlobal cn: config olcArgsFile: /var/run/slapd/slapd.args olcLogLevel: 256 olcPidFile: /var/run/slapd/slapd.pid olcToolThreads: 1 structuralObjectClass: olcGlobal entryUUID: d6737f5c-d321-1030-9dbe-27d2a7751e11 olcSaslHost: kdc.example.com olcSaslRealm: EXAMPLE.COM olcSaslSecProps: noplain,noactive,noanonymous,minssf=56 olcAuthzRegexp: {0}"uid=([^/]*),cn=EXAMPLE.COM,cn=GSSAPI,cn=auth" "uid=$1,ou=People,dc=example,dc=com" olcAuthzRegexp: {1}"uid=host/([^/]*).example.com,cn=example.com,cn=gssapi,cn=auth" "cn=$1,ou=hosts,dc=example,dc=com" A HOWTO at https://help.ubuntu.com/community/OpenLDAPServer#Kerberos_Authentication mentiones vaguely: Also, it is frequently necessary to map the Distinguished Name (DN) of an authorized Kerberos client to an existing entry in the DIT. I fail to understand where in the tree this should be defined, what schema should be used, etc. After hours of googling, it's official: I'm stuck! Please, help. Other things checked: Kerberos as such works fine (I can ssh without using a password to any machine in this setup). That means there should be no DNS-related problems. ldapsearch -b 'dc=example,dc=com' -x works OK. SASL/GSSAPI has been tested using sasl-sample-server -m GSSAPI -s ldap and sasl-sample-client -s ldap -n ldap.example.com -u tom without errors: root@ldap:~# sasl-sample-server -m GSSAPI -s ldap Forcing use of mechanism GSSAPI Sending list of 1 mechanism(s) S: R1NTQVBJ Waiting for client mechanism... 
C: R1NTQVBJAGCCAmUGCSqGSIb3EgECAgEAboICVDCCAlCgAwIBBaEDAgEOogcDBQAgAAAAo4IBamGCAWYwggFioAMCAQWhDRsLRVhBTVBMRS5DT02iIzAhoAMCAQOhGjAYGwRsZGFwGxBsZGFwLmV4YW1wbGUuY29to4IBJTCCASGgAwIBEqEDAgECooIBEwSCAQ8Re8XUnscB8dx6V/cXL+uzSF2/olZvcrVAJHZBZrfRKUFEQmU1Li46bUGK3GZwsn6qUVwmW6lyqVctOIYwGvBpz81Rw/5mj4V5iQudZbIRa+5Ew6W1oBB7ALi2cnPsbUroqzGmEh8/Vw8zSFk7W1gND4DLuWrPXD2xhLDUMMekBn5nXEPTnNAnV4w81Sj3ZlyLZz5OSitGVUEnQweV53z1spWsASHHWod/tSuxb19YeWmY5QHXPLG+lL5+w+Cykr0EhYVj8f8MDWFB8qoN1cr85xDfn18r8JldSw+i18nFKOo8usG+37hZTWynHYvBfMONtG9mLJv82KGPZMydWK7pzyTZDcnSsIjo2AftMZd5pIHMMIHJoAMCARKigcEEgb5aG1k4xgxmUXX7RKfvAbVBVJ12dWOgFFjMYceKjziXwrrOkv8ZwIvef9Yn2KsWznb5L55SXt2c/zlPa5mLKIktvw77hsK1h/GYc7p//BGOsmr47aCqVWsGuTqVT129uo5LNQDeSFwl2jXCkCZJZavOVrqYsM6flrPYE4n5lASTcPitX+/WNsf6WrvZoaexiv1JqyM/MWqS/vMBRMMc5xlurj6OARFvP9aFZoK/BLmfkSyAJj6MLbLVXZtkHiIPgot 'GSSAPI' Sending response... S: YIGZBgkqhkiG9xIBAgICAG+BiTCBhqADAgEFoQMCAQ+iejB4oAMCARKicQRvkxggi9pW+yJ1ExbTwLDclqw/VQ98aPq8mt39hkO6PPfcO2cB+t6vJ01xRKBrT9D2qF2XK0SWD4PQNb5UFbH4RM/bKAxDuCfZ1MHKgIWTLu4bK7VGZTbYydcckU2d910jIdvkkHhaRqUEM4cqp/cR Waiting for client reply... C: got '' Sending response... S: BQQF/wAMAAAAAAAAMBOWqQcACAAlCodrXW66ZObsEd4= Waiting for client reply... C: BQQE/wAMAAAAAAAAFUYbXQQACAB0b20VynB4uGH/iIzoRhw=got '?' Negotiation complete Username: tom Realm: (NULL) SSF: 56 sending encrypted message 'srv message 1' S: AAAASgUEB/8AAAAAAAAAADATlqrqrBW0NRfPMXMdMz+zqY32YakrHqFps3o/vO6yDeyPSaSqprrhI+t7owk7iOsbrZ/idJRxCBm8Wazx Waiting for encrypted message... C: AAAATQUEBv8AAAAAAAAAABVGG17WC1+/kIV9xTMUdq6Y4qYmmTahHVCjidgGchTOOOrBLEwA9IqiTCdRFPVbK1EgJ34P/vxMQpV1v4WZpcztgot '' recieved decoded message 'client message 1' root@ldap:~# sasl-sample-client -s ldap -n ldap.example.com -u tom service=ldap Waiting for mechanism list from server... S: R1NTQVBJrecieved 6 byte message Choosing best mechanism from: GSSAPI returning OK: tom Using mechanism GSSAPI Preparing initial. Sending initial response... C: R1NTQVBJAGCCAmUGCSqGSIb3EgECAgEAboICVDCCAlCgAwIBBaEDAgEOogcDBQAgAAAAo4IBamGCAWYwggFioAMCAQWhDRsLRVhBTVBMRS5DT02iIzAhoAMCAQOhGjAYGwRsZGFwGxBsZGFwLmV4YW1wbGUuY29to4IBJTCCASGgAwIBEqEDAgECooIBEwSCAQ8Re8XUnscB8dx6V/cXL+uzSF2/olZvcrVAJHZBZrfRKUFEQmU1Li46bUGK3GZwsn6qUVwmW6lyqVctOIYwGvBpz81Rw/5mj4V5iQudZbIRa+5Ew6W1oBB7ALi2cnPsbUroqzGmEh8/Vw8zSFk7W1gND4DLuWrPXD2xhLDUMMekBn5nXEPTnNAnV4w81Sj3ZlyLZz5OSitGVUEnQweV53z1spWsASHHWod/tSuxb19YeWmY5QHXPLG+lL5+w+Cykr0EhYVj8f8MDWFB8qoN1cr85xDfn18r8JldSw+i18nFKOo8usG+37hZTWynHYvBfMONtG9mLJv82KGPZMydWK7pzyTZDcnSsIjo2AftMZd5pIHMMIHJoAMCARKigcEEgb5aG1k4xgxmUXX7RKfvAbVBVJ12dWOgFFjMYceKjziXwrrOkv8ZwIvef9Yn2KsWznb5L55SXt2c/zlPa5mLKIktvw77hsK1h/GYc7p//BGOsmr47aCqVWsGuTqVT129uo5LNQDeSFwl2jXCkCZJZavOVrqYsM6flrPYE4n5lASTcPitX+/WNsf6WrvZoaexiv1JqyM/MWqS/vMBRMMc5xlurj6OARFvP9aFZoK/BLmfkSyAJj6MLbLVXZtkHiIP Waiting for server reply... S: YIGZBgkqhkiG9xIBAgICAG+BiTCBhqADAgEFoQMCAQ+iejB4oAMCARKicQRvkxggi9pW+yJ1ExbTwLDclqw/VQ98aPq8mt39hkO6PPfcO2cB+t6vJ01xRKBrT9D2qF2XK0SWD4PQNb5UFbH4RM/bKAxDuCfZ1MHKgIWTLu4bK7VGZTbYydcckU2d910jIdvkkHhaRqUEM4cqp/cRrecieved 156 byte message C: Waiting for server reply... S: BQQF/wAMAAAAAAAAMBOWqQcACAAlCodrXW66ZObsEd4=recieved 32 byte message Sending response... C: BQQE/wAMAAAAAAAAFUYbXQQACAB0b20VynB4uGH/iIzoRhw= Negotiation complete Username: tom SSF: 56 Waiting for encoded message... 
S: AAAASgUEB/8AAAAAAAAAADATlqrqrBW0NRfPMXMdMz+zqY32YakrHqFps3o/vO6yDeyPSaSqprrhI+t7owk7iOsbrZ/idJRxCBm8Wazxrecieved 78 byte message recieved decoded message 'srv message 1' sending encrypted message 'client message 1' C: AAAATQUEBv8AAAAAAAAAABVGG17WC1+/kIV9xTMUdq6Y4qYmmTahHVCjidgGchTOOOrBLEwA9IqiTCdRFPVbK1EgJ34P/vxMQpV1v4WZpczt

    Read the article

  • Failed Administrator login on WSO2 IS with external OpenLDAP

    - by Marco Rivadeneyra
    I have an installation of WSO2 Identity Server and I'm trying to make it work with an external OpenLDAP instance I have followed this guide: http://wso2.org/project/solutions/identity/3.2.3/docs/user-core/admin_guide.html#LDAP For the read-only mode. But when I try to log-in I get a failed login and the following error on the console: TID: [0] [WSO2 Identity Server] [2012-08-10 17:10:25,493] WARN {org.wso2.carbon.core.services.util.CarbonAuthenticationUtil} - Failed Administrator login attempt 'john[0]' at [2012-08-10 17:10:25,0493] from IP address 127.0.0.1 {org.wso2.carbon.core.services.util.CarbonAuthenticationUtil} Full log: http://pastebin.com/pHUGXBqv My configuration file looks as follows: <UserManager> <Realm> <Configuration> <AdminRole>admin</AdminRole> <AdminUser> <UserName>john</UserName> <Password>johnldap</Password> </AdminUser> <EveryOneRoleName>everyone</EveryOneRoleName> <!-- By default users in this role sees the registry root --> <ReadOnly>true</ReadOnly> <MaxUserNameListLength>500</MaxUserNameListLength> <Property name="url">jdbc:h2:repository/database/WSO2CARBON_DB</Property> <Property name="userName">wso2carbon</Property> <Property name="password">wso2carbon</Property> <Property name="driverName">org.h2.Driver</Property> <Property name="maxActive">50</Property> <Property name="maxWait">60000</Property> <Property name="minIdle">5</Property> </Configuration> <UserStoreManager class="org.wso2.carbon.user.core.ldap.LDAPUserStoreManager"> <Property name="ReadOnly">true</Property> <Property name="MaxUserNameListLength">100</Property> <Property name="ConnectionURL">ldap://192.168.81.144:389</Property> <Property name="ConnectionName">cn=admin,dc=example,dc=com</Property> <Property name="ConnectionPassword">admin</Property> <Property name="UserSearchBase">ou=People,dc=example,dc=com</Property> <Property name="UserNameListFilter">(objectClass=inetOrgPerson)</Property> <Property name="UserNameAttribute">uid</Property> <Property name="ReadLDAPGroups">false</Property> <Property name="GroupSearchBase">ou=Groups,dc=example,dc=com</Property> <Property name="GroupSearchFilter">(objectClass=groupOfNames)</Property> <Property name="GroupNameAttribute">uid</Property> <Property name="MembershipAttribute">member</Property> </UserStoreManager> <AuthorizationManager class="org.wso2.carbon.user.core.authorization.JDBCAuthorizationManager"></AuthorizationManager> </Realm> I followed this guide to configure my LDAP server up to Loggging: https://help.ubuntu.com/12.04/serverguide/openldap-server.html Could you suggest what might be wrong? The LDAP log is available at: http://pastebin.com/T9rFYEAW
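    One thing worth checking, independent of WSO2, is whether the directory itself accepts a simple bind for the DN the user store will build from UserNameAttribute and UserSearchBase (here uid=john,ou=People,dc=example,dc=com). The hedged JNDI sketch below does only that; the URL matches the ConnectionURL above, and the password must be the one actually stored for john in OpenLDAP (which may or may not match the <Password> element in the configuration). If this bind fails, the failed login in the Identity Server is no surprise.

        import java.util.Hashtable;
        import javax.naming.Context;
        import javax.naming.directory.InitialDirContext;

        public class BindCheck {
            public static void main(String[] args) throws Exception {
                Hashtable<String, String> env = new Hashtable<>();
                env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
                env.put(Context.PROVIDER_URL, "ldap://192.168.81.144:389");
                env.put(Context.SECURITY_AUTHENTICATION, "simple");
                // DN built the way the read-only user store would resolve "john"
                env.put(Context.SECURITY_PRINCIPAL, "uid=john,ou=People,dc=example,dc=com");
                env.put(Context.SECURITY_CREDENTIALS, "johnldap"); // whatever password the entry really has

                new InitialDirContext(env).close(); // throws AuthenticationException if the bind fails
                System.out.println("Simple bind as john succeeded");
            }
        }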

    Read the article

  • Troubleshooting sudoers via ldap

    - by dafydd
    The good news is that I got sudoers via ldap working on Red Hat Directory Server. The package is sudo-1.7.2p1. I have some LDAP/Kerberos users in an LDAP group called wheel, and I have this entry in LDAP: # %wheel, SUDOers, example.com dn: cn=%wheel,ou=SUDOers,dc=example,dc=com cn: %wheel description: Members of group wheel have access to all privileges. objectClass: sudoRole objectClass: top sudoCommand: ALL sudoHost: ALL sudoUser: %wheel So, members of group wheel have administrative privileges via sudo. This has been tested and works fine. Now, I have this other sudo privilege set up to allow members of a group called Administrators to perform two commands as the non-root owner of those commands. # %Administrators, SUDOers, example.com dn: cn=%Administrators,ou=SUDOers,dc=example,dc=com sudoRunAsGroup: appGroup sudoRunAsUser: appOwner cn: %Administrators description: Allow members of the group Administrators to run various commands . objectClass: sudoRole objectClass: top sudoCommand: appStop sudoCommand: appStart sudoCommand: /path/to/appStop sudoCommand: /path/to/appStart sudoUser: %Administrators Unfortunately, members of Administrators are still refused permission to run appStart or appStop: -bash-3.2$ sudo /path/to/appStop [sudo] password for Aaron: Sorry, user Aaron is not allowed to execute '/path/to/appStop' as root on host.example.com. -bash-3.2$ sudo -u appOwner /path/to/appStop [sudo] password for Aaron: Sorry, user Aaron is not allowed to execute '/path/to/appStop' as appOwner on host.example.com. /var/log/secure shows me these two sets of messages for the two attempts: Oct 31 15:02:36 host sudo: pam_unix(sudo:auth): authentication failure; logname=Aaron uid=0 euid=0 tty=/dev/pts/3 ruser= rhost= user=Aaron Oct 31 15:02:37 host sudo: pam_krb5[1508]: TGT verified using key for 'host/[email protected]' Oct 31 15:02:37 host sudo: pam_krb5[1508]: authentication succeeds for 'Aaron' ([email protected]) Oct 31 15:02:37 host sudo: Aaron : command not allowed ; TTY=pts/3 ; PWD=/auto/home/Aaron ; USER=root ; COMMAND=/path/to/appStop Oct 31 15:02:52 host sudo: pam_unix(sudo:auth): authentication failure; logname=Aaron uid=0 euid=0 tty=/dev/pts/3 ruser= rhost= user=Aaron Oct 31 15:02:52 host sudo: pam_krb5[1547]: TGT verified using key for 'host/[email protected]' Oct 31 15:02:52 host sudo: pam_krb5[1547]: authentication succeeds for 'Aaron' ([email protected]) Oct 31 15:02:52 host sudo: Aaron : command not allowed ; TTY=pts/3 ; PWD=/auto/home/Aaron ; USER=appOwner; COMMAND=/path/to/appStop The questions: Does sudo have some sort of verbose or debug mode where I can actually watch it capture the sudoers privilege list and determine whether or not Aaron should have the privilege to run this command? (This question is probably independent of where the sudoers database is kept.) Does sudo work with some background mechanism that might have a log level I could turn up? Right now, I can't fix a problem I can't identify. Is this an LDAP search failure? Is this a group member matching failure? Identifying why the command fails will help me identify the fix... Next step: Recreate the privilege in /etc/sudoers, and see if it works locally... Cheers!
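    Regarding question 3 (is this an LDAP search failure?), one way to take sudo out of the picture is to query the directory yourself and look at exactly what sudoUser and sudoCommand values come back for the %Administrators role. The sketch below is a hedged, standalone JNDI query, not anything sudo itself runs; the hostname is a placeholder and it assumes the SUDOers subtree is readable by an anonymous bind, as the working %wheel role suggests. Comparing its output with the "command not allowed" log lines should show whether the problem is the group match (%Administrators) or the command path.

        import java.util.Hashtable;
        import javax.naming.Context;
        import javax.naming.NamingEnumeration;
        import javax.naming.directory.DirContext;
        import javax.naming.directory.InitialDirContext;
        import javax.naming.directory.SearchControls;
        import javax.naming.directory.SearchResult;

        public class SudoRoleDump {
            public static void main(String[] args) throws Exception {
                Hashtable<String, String> env = new Hashtable<>();
                env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
                env.put(Context.PROVIDER_URL, "ldap://ldap.example.com:389"); // placeholder host

                DirContext ctx = new InitialDirContext(env);
                SearchControls sc = new SearchControls();
                sc.setSearchScope(SearchControls.SUBTREE_SCOPE);

                // Which sudoRole entries mention the user or the Administrators group?
                NamingEnumeration<SearchResult> hits = ctx.search(
                        "ou=SUDOers,dc=example,dc=com",
                        "(&(objectClass=sudoRole)(|(sudoUser=Aaron)(sudoUser=%Administrators)))",
                        sc);
                while (hits.hasMore()) {
                    SearchResult r = hits.next();
                    System.out.println(r.getNameInNamespace());
                    System.out.println("  sudoUser:      " + r.getAttributes().get("sudoUser"));
                    System.out.println("  sudoCommand:   " + r.getAttributes().get("sudoCommand"));
                    System.out.println("  sudoRunAsUser: " + r.getAttributes().get("sudoRunAsUser"));
                }
                ctx.close();
            }
        }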

    Read the article

  • SQLAuthority News – SQLPASS Nov 8-11, 2010-Seattle – An Alternative Look at Experience

    - by pinaldave
    I recently attended most prestigious SQL Server event SQLPASS between Nov 8-11, 2010 at Seattle. I have only one expression for the event - Best Summit Ever This year the summit was at its best. Instead of writing about my usual routine or the event, I am going to write about the interesting things I did and how I felt about it! Best Summit Ever Trip to Seattle! This was my second trip to Seattle this year and the journey is always long. Here is the travel stats on how long it takes to get to Seattle: 24 hours official air time 36 hours total travel time (connection waits and airport commute) Every time I travel to USA I gain a day and when I travel back to home, I lose a day. However, the total traveling time is around 3 days. The journey is long and very exhausting. However, it is all worth it when you’re attending an event like SQLPASS. Here are few things I carry when I travel for a long journey: Dry Snack packs – I like to have some good Indian Dry Snacks along with me in my backpack so I can have my own snack when I want Amazon Kindle – Loaded with 80+ books A physical book – This is usually a very easy to read book I do not watch movies on the plane and usually spend my time reading something quick and easy. If I can go to sleep, I go for it. I prefer to not to spend time in conversation with the guy sitting next to me because usually I end up listening to their biography, which I cannot blog about. Sheraton Seattle SQLPASS In any case, I love to go to Seattle as the city is great and has everything a brilliant metropolis has to offer. The new Light Train is extremely convenient, and I can take it directly from the airport to the city center. My hotel, the Sheraton, was only few meters (in the USA people count in blocks – 3 blocks) away from the train station. This time I saved USD 40 each round trip due to the Light Train. Sessions I attended! Well, I really wanted to attend most of the sessions but there was great dilemma of which ones to choose. There were many, many sessions to be attended and at any given time there was more than one good session being presented. I had decided to attend sessions in area performance tuning and I attended quite a few sessions this year, compared to what I was able to do last year. Here are few names of the speakers whose sessions I attended (please note, following great speakers are not listed in any order. I loved them and I enjoyed their sessions): Conor Cunningham Rushabh Mehta Buck Woody Brent Ozar Jonathan Kehayias Chris Leonard Bob Ward Grant Fritchey I had great fun attending their sessions. The sessions were meaningful and enlightening. It is hard to rate any session but I have found that the insights learned in Conor Cunningham’s sessions are the highlight of the PASS Summit. Rushabh Mehta at Keynote SQLPASS   Bucky Woody and Brent Ozar I always like the sessions where the speaker is much closer to the audience and has real world experience. I think speakers who have worked in the real world deliver the best content and most useful information. Sessions I did not like! Indeed there were few sessions I did not like it and I am not going to name them here. However, there were strong reasons I did not like their sessions, and here is why: Sessions were all theory and had no real world connections. 
All technical questions ended with confusing answers (lots of “I will get back to you on it,” “it depends,” “let us take this offline” and many more…) “I am God” kind of attitude in the speakers For example, I attended a session of one very well known speaker who is a specialist for one particular area. I was bit late for the session and was surprised to see that in a room that could hold 350 people there were only 30 attendees. After sitting there for 15 minutes, I realized why lots of people left. Very soon I found I preferred to stare out the window instead of listening to that particular speaker. One on One Talk! Many times people ask me what I really like about PASS. I always say the experience of meeting SQL legends and spending time with them one on one and LEARNING! Here is the quick list of the people I met during this event and spent more than 30 minutes with each of them talking about various subjects: Pinal Dave and Brad Shulz Pinal Dave and Rushabh Mehta Michael Coles and Pinal Dave Rushabh Mehta – It is always pleasure to meet with him. He is a man with lots of energy and a passion for community. He recently told me that he really wanted to turn PASS into resource for learning for every SQL Server Developer and Administrator in the world. I had great in-depth discussion regarding how a single person can contribute to a community. Michael Coles – I consider him my best friend. It is always fun to meet him. He is funny and very knowledgeable. I think there are very few people who are as expert as he is in encryption and spatial databases. Worth meeting him every single time. Glenn Berry – A real friend of everybody. He is very a simple person and very true to his heart. I think there is not a single person in whole community who does not like him. He is a friends of all and everybody likes him very much. I once again had time to sit with him and learn so much from him. As he is known as Dr. DMV, I can be his nurse in the area of DMV. Brad Schulz – I always wanted to meet him but never got chance until today. I had great time meeting him in person and we have spent considerable amount of time together discussing various T-SQL tricks and tips. I do not know where he comes up with all the different ideas but I enjoy reading his blog and sharing his wisdom with me. Jonathan Kehayias – He is drill sergeant in US army. If you get the impression that he is a giant with very strong personality – you are wrong. He is very kind and soft spoken DBA with strong performance tuning skills. I asked him how he has kept his two jobs separate and I got very good answer – just work hard and have passion for what you do. I attended his sessions and his presentation style is very unique.  I feel like he is speaking in a language I understand. Louis Davidson – I had never had a chance to sit with him and talk about technology before. He has so much wisdom and he is very kind. During the dinner, I had talked with him for long time and without hesitation he started to draw a schema for me on the menu. It was a wonderful experience to learn from a master at the dinner table. He explained to me the real and practical differences between third normal form and forth normal form. Honestly I did not know earlier, but now I do. Erland Sommarskog – This man needs no introduction, he is very well known and very clear in conveying his ideas. I learned a lot from him during the course of year. Every time I meet him, I learn something new and this time was no exception. 
Joe Webb – Joey is all about community and people, we had interesting conversation about community, MVP and how one can be helpful to community without losing passion for long time. It is always pleasant to talk to him and of course, I had fun time. Ross Mistry – I call him my brother many times because he indeed looks like my cousin. He provided me lots of insight of how one can write book and how he keeps his books simple to appeal to all the readers. A wonderful person and great friend. Ola Hallgren - I did not know he was coming to the summit. I had great time meeting him and had a wonderful conversation with him regarding his scripts and future community activities. Blythe Morrow – She used to be integrated part of SQL Server Community and PASS HQ. It was wonderful to meet her again and re-connect. She is wonderful person and I had a great time talking to her. Solid Quality Mentors – It is difficult to decide who to mention here. Instead of writing all the names, I am going to include a photo of our meeting. I had great fun meeting various members of our global branches. This year I was sitting with my Spanish speaking friends and had great fun as Javier Loria from Solid Quality translated lots of things for me. Party, Party and Parties Every evening there were various parties. I did attend almost all of them. Every party had different theme but the goal of all the parties the same – networking. Here are the few parties where I had lots of fun: Dell Reception Party Exhibitor Party Solid Quality Fun Party Red Gate Friends Party MVP Dinner Microsoft Party MVP Dinner Quest Party Gameworks PASS Party Volunteer Party at Garage Solid Quality Mentors (10 Members out of 120) They were all great networking opportunities and lots of fun. I really had great time meeting people at the various parties. There were few people everywhere – well, I will say I am among them – who hopped parties. NDA – Not Decided Agenda During the event there were few meetings marked “NDA.” Someone asked me “why are these things NDA?”  My response was simple: because they are not sure themselves. NDA stands for Not Decided Agenda. Toys, Giveaways and Luggage I admit, I was like child in Gameworks and was playing to win soft toys. I was doing it for my daughter. I must thank all of the people who gave me their cards to try my luck. I won 4 soft-toys for my daughter and it was fun. Also, thanks to Angel who did a final toy swap with me to get the desired toy for my daughter. I also collected ducks from Idera, as my daughter really loves them. Solid Quality Booth Each of the exhibitors was giving away something and I got so much stuff that my luggage got quite a bit bigger when I returned. Best Exhibitor Idera had SQLDoctor (a real magician and fun guy) to promote their new tool SQLDoctor. I really had a great time participating in the magic myself. At one point, the magician made my watch disappear.  I have seen better magic before, but this time it caught me unexpectedly and I was taken by surprise. I won many ducks again. The Common Question I heard the following common questions: I have seen you somewhere – who are you? – I am Pinal Dave. I did not know that Pinal is your first name and Dave is your last name, how do you pronounce your last name again? – Da-way How old are you? – I am as old as I can be. Are you an Indian because you look like one? – I did not answer this one. Where are you from? This question was usually asked after looking at my badge which says India. So did you really fly from India? 
– Yes, because I have seasickness so I do not prefer the sea journey. How long was the journey? – 24/36/12 (air travel time/total travel time/time zone difference) Why do you write on SQLAuthority.com? – Because I want to. I remember your daughter looks like you. – Is this even a question? Of course, she is daddy’s little girl. There were so many other questions, I will have to write another blog post about it. SQLPASS Again, Best Summit Ever! Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: About Me, Pinal Dave, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, T SQL, Technology Tagged: SQLPASS

    Read the article

  • Cannot install or remove packages

    - by Nuno
I tried to install the openoffice.org package, but it gave an error. Now I cannot repair, remove, or install anything. The Software Center of Xubuntu gives this error log. Nothing seems to solve this. I tried apt-get install -f, but it does not solve the problem either. Any suggestions? installArchives() failed: (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 184189 files and directories currently installed.) Unpacking libreoffice-common (from .../libreoffice-common_1%3a3.5.4-0ubuntu1.1_all.deb) ... dpkg: error processing /var/cache/apt/archives/libreoffice-common_1%3a3.5.4-0ubuntu1.1_all.deb (--unpack): trying to overwrite '/usr/bin/soffice', which is also in package openoffice.org-debian-menus 3.2-9502 No apport report written because MaxReports is reached already rmdir: failed to remove /var/lib/libreoffice/share/prereg/: No such file or directory rmdir: failed to remove /var/lib/libreoffice/share/: Directory not empty rmdir: failed to remove /var/lib/libreoffice/program/: No such file or directory rmdir: failed to remove /var/lib/libreoffice: Directory not empty rmdir: failed to remove /var/lib/libreoffice: Directory not empty Processing triggers for desktop-file-utils ... Processing triggers for shared-mime-info ... Processing triggers for gnome-icon-theme ... Processing triggers for hicolor-icon-theme ... Processing triggers for man-db ... Errors were encountered while processing: /var/cache/apt/archives/libreoffice-common_1%3a3.5.4-0ubuntu1.1_all.deb dpkg: dependency problems prevent configuration of libreoffice-java-common: libreoffice-java-common depends on libreoffice-common; however: Package libreoffice-common is not installed. dpkg: error processing libreoffice-java-common (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libreoffice-filter-mobiledev: libreoffice-filter-mobiledev depends on libreoffice-java-common; however: Package libreoffice-java-common is not configured yet. dpkg: error processing libreoffice-filter-mobiledev (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libreoffice-base: libreoffice-base depends on libreoffice-java-common (>= 1:3.5.4~); however: Package libreoffice-java-common is not configured yet. dpkg: error processing libreoffice-base (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libreoffice-core: libreoffice-core depends on libreoffice-common (>> 1:3.5.4); however: Package libreoffice-common is not installed. dpkg: error processing libreoffice-core (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libreoffice-style-human: libreoffice-style-human depends on libreoffice-core; however: Package libreoffice-core is not configured yet.
dpkg: error processing libreoffice-style-human (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libreoffice-math: libreoffice-math depends on libreoffice-core (= 1:3.5.4-0ubuntu1.1); however: Package libreoffice-core is not configured yet. dpkg: error processing libreoffice-math (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libreoffice-impress: libreoffice-impress depends on libreoffice-core (= 1:3.5.4-0ubuntu1.1); however: Package libreoffice-core is not configured yet. dpkg: error processing libreoffice-impress (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libreoffice-style-tango: libreoffice-style-tango depends on libreoffice-core; however: Package libreoffice-core is not configured yet. dpkg: error processing libreoffice-style-tango (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libreoffice: libreoffice depends on libreoffice-core (= 1:3.5.4-0ubuntu1.1); however: Package libreoffice-core is not configured yet. libreoffice depends on libreoffice-impress; however: Package libreoffice-impress is not configured yet. libreoffice depends on libreoffice-math; however: Package libreoffice-math is not configured yet. libreoffice depends on libreoffice-base; however: Package libreoffice-base is not configured yet. libreoffice depends on libreoffice-filter-mobiledev; however: Package libreoffice-filter-mobiledev is not configured yet. libreoffice depends on libreoffice-java-common (>= 1:3.5.4~); however: Package libreoffice-java-common is not configured yet. dpkg: error processing libreoffice (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libreoffice-writer: libreoffice-writer depends on libreoffice-core (= 1:3.5.4-0ubuntu1.1); however: Package libreoffice-core is not configured yet. dpkg: error processing libreoffice-writer (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of mythes-en-us: mythes-en-us depends on libreoffice-core | openoffice.org-core (>= 1.9) | language-support-writing-en; however: Package libreoffice-core is not configured yet. Package openoffice.org-core is not installed. Package language-support-writing-en is not installed. dpkg: error processing mythes-en-us (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libreoffice-help-zh-cn: libreoffice-help-zh-cn depends on libreoffice-writer | language-support-translations-zh; however: Package libreoffice-writer is not configured yet. Package language-support-translations-zh is not installed. dpkg: error processing libreoffice-help-zh-cn (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libreoffice-base-core: libreoffice-base-core depends on libreoffice-core (= 1:3.5.4-0ubuntu1.1); however: Package libreoffice-core is not configured yet. dpkg: error processing libreoffice-base-core (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libreoffice-gnome: libreoffice-gnome depends on libreoffice-core (= 1:3.5.4-0ubuntu1.1); however: Package libreoffice-core is not configured yet. 
dpkg: error processing libreoffice-gnome (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libreoffice-help-pt-br: libreoffice-help-pt-br depends on libreoffice-writer | language-support-translations-pt; however: Package libreoffice-writer is not configured yet. Package language-support-translations-pt is not installed. dpkg: error processing libreoffice-help-pt-br (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libreoffice-emailmerge: libreoffice-emailmerge depends on libreoffice-core; however: Package libreoffice-core is not configured yet. dpkg: error processing libreoffice-emailmerge (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libreoffice-help-en-gb: libreoffice-help-en-gb depends on libreoffice-writer | language-support-translations-en; however: Package libreoffice-writer is not configured yet. Package language-support-translations-en is not installed. dpkg: error processing libreoffice-help-en-gb (--configure): dependency problems - leaving unconfigured
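The log points at a file conflict: the leftover openoffice.org-debian-menus package (installed from the OpenOffice.org .deb bundle) still owns /usr/bin/soffice, so libreoffice-common cannot unpack and everything that depends on it is left unconfigured. A common way out is to remove that leftover package first and then let apt repair the half-configured LibreOffice packages. Below is a minimal sketch of that sequence, wrapped in Python only to keep the examples on this page in one language; the underlying dpkg/apt-get commands are the real content, and whether --force-depends is needed depends on which other openoffice.org-* packages are still installed.

```python
#!/usr/bin/env python3
"""Sketch: clear the /usr/bin/soffice file conflict, then let apt finish.

Assumes the conflicting package is openoffice.org-debian-menus, as reported
in the dpkg error above. Run as root (or prefix the commands with sudo).
"""
import subprocess

def run(cmd):
    # Echo each command before running it; raise if it fails.
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Remove the leftover OpenOffice.org menu package that owns /usr/bin/soffice.
#    --force-depends may be needed if other openoffice.org-* packages remain.
run(["dpkg", "--remove", "--force-depends", "openoffice.org-debian-menus"])

# 2. Let apt configure the half-installed LibreOffice packages and fix dependencies.
run(["apt-get", "-f", "install"])

# 3. Retry the original installation once the package database is consistent again.
run(["apt-get", "install", "libreoffice"])
```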

    Read the article

  • QT creator/qt embedded widget demo problem

    - by jhowland
I have downloaded the source code for the embedded widget demos from Nokia and tried to get it to compile, but I end up with the following error: Compiler (mingw32) message : In file included from ../../src/basicgraph/qtbasicgraph.cpp:9: ../../src/basicgraph/qtbasicgraph.h:14:17: QtGui: No such file or directory The offending line is #include <QtGui> Qt Creator 1.2.0, Qt 2009.03 I have checked paths, etc., and all seems fine. I have checked to make sure that I have implicitly included QT += gui and that I have NOT included QT -= gui anywhere in the .pro and .pri files. After making no changes, I used VS2005 and the appropriate commercial version of Qt 4.4.3, and it compiled and ran fine. I have googled the error and found others asking similar questions (but not here) but no posted answers... so I would appreciate any help. Thanks, Jonathan Howland

    Read the article

  • AS3: How to access pixel data efficiently?

    - by JonoRR
I'm working on a game. The game requires entities to analyse an image and head towards pixels with specific properties (high red channel, etc.). I've looked into Pixel Bender, but this only seems useful for writing new colors to the image. At the moment, even at a low resolution (200x200), just one entity scanning the image slows things to 1-2 frames/second. I'm embedding the image and instantiating it as a Bitmap as a child of the stage. The 1-2 FPS situation occurs when using BitmapData.getPixel() on each pixel, with a distance calculation beforehand. I'm wondering if there's any way I can do this more efficiently... My first thought was some sort of spatial partitioning coupled with splitting the image up into many smaller pieces. I also feel like Pixel Bender should be able to help somehow; however, I've had little experience with it. Cheers for any help. Jonathan
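The usual cure for per-pixel getPixel() calls is to pull the whole frame out of the bitmap in one call (getVector() or getPixels() in AS3, ideally on a downsampled copy) and then score all pixels in a single pass. The sketch below shows the same access pattern in Python with numpy and Pillow rather than AS3, purely to illustrate the idea of one bulk read followed by vectorized red-channel and distance scoring; the image path, entity position, and weighting are made up for the example.

```python
# Sketch (not AS3): bulk pixel scoring with numpy to illustrate the access pattern.
# Assumes a hypothetical image file and entity position; both are placeholders.
import numpy as np
from PIL import Image

def best_target(image_path, entity_xy, distance_weight=0.5):
    """Return the (x, y) pixel that best combines 'high red' with 'close by'."""
    rgb = np.asarray(Image.open(image_path).convert("RGB"), dtype=np.float32)
    h, w, _ = rgb.shape

    red = rgb[:, :, 0] / 255.0                  # red channel, one array operation

    ys, xs = np.mgrid[0:h, 0:w]                 # coordinate grids for every pixel
    ex, ey = entity_xy
    dist = np.hypot(xs - ex, ys - ey)
    dist = dist / dist.max()                    # normalize distances to 0..1

    score = red - distance_weight * dist        # vectorized scoring, no per-pixel calls
    y, x = np.unravel_index(np.argmax(score), score.shape)
    return int(x), int(y)

# Example usage with placeholder values:
# print(best_target("map_200x200.png", entity_xy=(50, 120)))
```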

    Read the article

  • Object Recognition from Templates

    - by JonLeah
    Hi Guys, I was hoping someone could point me in the right direction here. With a picture of a die (from above) I want to recognize which side is up. I understand the basics in play here, but I'm having trouble grasping the power of OpenCV. I imagine I want a picture of each side of the die. Then I can somehow compare them all to the current image to be classified. How can I use OpenCV to do this? Thanks, Jonathan
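One straightforward way to do what the question describes (compare the live image against a stored picture of each face) is OpenCV's matchTemplate; for dice, counting pips with contour detection is usually more robust to rotation, but template matching maps directly onto the "one picture per side" idea. The sketch below uses the Python bindings and assumes six template images named face_1.png through face_6.png; the filenames and the confidence threshold mentioned in the comments are illustrative only.

```python
# Sketch: classify which die face is showing by template-matching each stored face.
# Assumes cv2 (opencv-python) and six grayscale template images; names are placeholders.
import cv2

FACE_TEMPLATES = {n: f"face_{n}.png" for n in range(1, 7)}

def classify_die(frame_path):
    frame = cv2.imread(frame_path, cv2.IMREAD_GRAYSCALE)
    best_face, best_score = None, -1.0

    for face, template_path in FACE_TEMPLATES.items():
        template = cv2.imread(template_path, cv2.IMREAD_GRAYSCALE)
        # Normalized correlation: score in [-1, 1], higher means a better match.
        result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
        _, score, _, _ = cv2.minMaxLoc(result)
        if score > best_score:
            best_face, best_score = face, score

    return best_face, best_score

# Example: face, confidence = classify_die("current_shot.png")
# A low confidence (say below 0.6) usually means the die is rotated or scaled
# relative to the templates; in that case counting pips via cv2.findContours
# is a more robust alternative.
```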

    Read the article

  • Dropdown binding and postbacks - ASP.NET

    - by DotnetDude
I have a rather complex page. The ASPX page loads a user control, which in turn loads a child user control. Parent Control protected override void OnInit(EventArgs e) { //Loads child control } In the child user control, I use a custom control that inherits from System.Web.UI.HtmlControls.HtmlSelect ASCX: <cust:CustDropDownList id="ctlDdl" runat="server"/> ASCX.CS protected void Page_Load(object sender, EventArgs e) { //Binds CtlDdl here } When the user clicks on the Save button, the user controls get dynamically reloaded, but I lose the value the user has selected in the dropdown. I run into a chicken-and-egg problem here. I think I need to bind ctlDdl only if it's not a postback, but that results in the dropdown not getting populated. If I bind it every time, then I lose the user's selection. EDIT: Can someone respond to my comment to Jonathan's answer? Thanks

    Read the article

  • SQL SERVER – Best Reference – Wait Type – Day 27 of 28

    - by pinaldave
Writing my article series on Extended Events was a great learning experience. It was truly a learning experience in which I learned far more than I would have learned otherwise. Besides my blog series, there are excellent-quality references available on the internet which one can use to learn this subject further. Here is the list of resources (in no particular order): sys.dm_os_wait_stats (Books Online) – This is an excellent starting point and the official documentation describing the wait types. SQL Server Best Practices Article by Tom Davidson – It goes without saying that this document is the BEST reference available on this subject. Performance Tuning with Wait Statistics by Joe Sack – One of the best slide decks available on this subject. It covers many real-world scenarios. Wait statistics, or please tell me where it hurts by Paul Randal – Notes from the real world from skilled SQL Server master Paul Randal. The SQL Server Wait Type Repository… by Bob Ward – A thorough article on wait types and their resolution. A MUST read. Tracking Session and Statement Level Waits by Jonathan Kehayias – A unique article on the subject, bringing wait stats and extended events together. Wait Stats Introductory References by Jimmy May – An excellent collection of reference links. Great Resource On SQL Server Wait Types by Glenn Berry – A perfect DMV query for finding top wait stats. Performance Blog by Idera – An in-depth article on top wait statistics in the community. I have listed all the references I have found, in no particular order. If I have missed any good reference, please leave a comment and I will add it to the list. Read all the posts in the Wait Types and Queue series. Reference: Pinal Dave (http://blog.SQLAuthority.com) Tracking Session and Statement Level Waits Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL Wait Stats, SQL Wait Types, T SQL, Technology
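As a quick companion to the references above, the usual first step is to query sys.dm_os_wait_stats for the top waits while filtering out a few benign system waits. A minimal sketch follows; it is written in Python with pyodbc only to keep this page's examples in one language, the connection string is a placeholder, and the short exclusion list is illustrative rather than exhaustive.

```python
# Sketch: top waits from sys.dm_os_wait_stats, excluding a few benign system waits.
# The connection string is a placeholder; the exclusion list is intentionally short.
import pyodbc

QUERY = """
SELECT TOP (10)
       wait_type,
       wait_time_ms / 1000.0        AS wait_time_s,
       signal_wait_time_ms / 1000.0 AS signal_wait_s,
       waiting_tasks_count
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN (N'SLEEP_TASK', N'LAZYWRITER_SLEEP',
                        N'BROKER_TASK_STOP', N'XE_TIMER_EVENT',
                        N'REQUEST_FOR_DEADLOCK_SEARCH', N'CHECKPOINT_QUEUE')
ORDER BY wait_time_ms DESC;
"""

def top_waits(conn_str="DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;Trusted_Connection=yes"):
    conn = pyodbc.connect(conn_str)
    try:
        rows = conn.cursor().execute(QUERY).fetchall()
    finally:
        conn.close()
    for r in rows:
        print(f"{r.wait_type:40s} {float(r.wait_time_s):12.1f}s  tasks={r.waiting_tasks_count}")

if __name__ == "__main__":
    top_waits()
```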

    Read the article

  • BI Publisher - Hottest Show in Vegas

    - by mike.donohue
    Two days down, two to go. Monday was a very busy and rewarding day. Attended "XML Publisher and FSG for Beginners" given by Susan Behn and Alyssa Johnson from Solution Beacon. It was packed, standing room only ... even though it was at 8:00 am. Later in the afternoon, despite being at the same time and in conflict with other Publisher related sessions, Noelle's session, "The Reporting Platform for Applications: Oracle Business Intelligence Publisher" and my session, "Introduction to Oracle Business Intelligence Publisher" were both very well attended. Immediately following our presentations we ran the BI Publisher Hands On Lab which was great fun. The turnout was so large that unfortunately we could not accommodate everyone who came to the lab. There were as many as 5 people huddled around each of the 20 machines. All the the groups completed the 2 main exercises. Some groups even took the product for an off-road test drive. Look at all the fun we had ... For those who could not attend or want the Hands On Lab document: Hands On Lab Oracle BI Publisher Collaborate 2010.pdf Note that these lab instructions assume a specific set up and files that you may not have in your environment. You can download and install a trial license version of BI Publisher from the download page. Highly recommend taking a look at the additional Tutorials available on OTN. Big thanks to Dan Vlamis and Jonathan Clark from Vlamis Software Solutions and to the Oracle BIWA SIG for setting up these machines and getting the time and space to run this lab. It was inspiring to see all of the attendees successfully creating reports. On Tuesday morning we were up early again for a rousing session of BI Publisher Best Practices that was also, very well attended especially considering the 8 am start. Later that morning saw Ben Bruno from STR Software and two of his customers speak on the additional functionality and ROI they have achieved by using Publisher within EBS and AventX to FAX and Email Publisher generated documents. Spent the afternoon staffing the BI Technology demo pod and had a steady flow of people dropping by with questions. Having a great conference so far and looking forward to the rest of it.

    Read the article

  • links for 2011-02-09

    - by Bob Rhubart
    Tech Cast Live - Java and Oracle, One Year Later - February 15th 10AM PST (Oracle Technology Network Blog (aka TechBlog)) (tags: ping.fm) The impact of IT decisions on organizational culture - O'Reilly Radar "While I believe we recognize the limiting qualities of IT decisions, I'd suggest we've insufficiently studied the degree to which those decisions in aggregate can have a large influence on organizational culture." - Jonathan Reichental, Ph.D. (tags: ITgovernance organizationalculture enterprisearchitecture) Women "computers" of World War II - Boing Boing "Before it came to mean laptops, PCs, or even room-sized machines, "computer" was what you called a person who did mathematical calculations for a living. That job was vitally important during World War II. And, like many vital jobs on the homefront, it was turned over to women..." (tags: computers history worldwar2) InfoQ: Book Excerpt and Interview: 100 SOA Questions Asked and Answered A new "100 SOA Questions Asked and Answered " book by Kerrie Holley and Ali Arsanjani provides a deep insight into SOA covering a wide spectrum of topics from SOA basics to its business and organizational impact, to SOA methods and architecture to SOA future. InfoQ spoke with Kerrie Holley and Ali Arsanjani about their book. (tags: ping.fm) @myfear: GlassFish City - Another view onto your favorite application server Oracle ACE Director Markus Eisele runs GlassFish through CodeCity. (tags: oracle otn oracleace glassfish codecity) The Ron Batra Blog: Technology Whispers: Upcoming Presentations Oracle ACE Director Ron Batra shares details on upcoming presentations at OAUG events in the US and Dubai. (tags: oaug c11 oracle otn oracleace) Free ADF Training Event in the UK (Grant Ronald's Blog) Gobsmack survivor Grant Ronald with the details on an Oracle ADF training session he'll conduct on 11 May 2011 at the UK Oracle office in Reading. (tags: oracle otn adf) Java Spotlight Episode 16 - Richar Bair - The Java Spotlight Podcast The latest Java Spotlight podcast features an interview with Java Client Architect Richar Bair. (tags: oracle java podcast) Stewart Bryson: OBIEE 11g Migrations "[Rittman Mead's] Mark and Venkat have covered OBIEE migration methodologies in the past (see here, here and here), but I decided to throw my hat in the ring on the subject, as I had to develop a methodology for a client recently and wanted to share my experiences." - Stewart Bryson (tags: oracle otn obiee businessintelligence) Dr. Chris Harding: The golden thread of interoperability | Open Group Blog "There are so many things going on at every Conference by The Open Group that it is impossible to keep track of all of them, and this week’s Conference in San Diego, California, is no exception. The main themes are Cybersecurity, Enterprise Architecture, SOA and Cloud Computing." - Dr. Chris Harding (tags: entarch soa interoperability cloud) Marc Kelderman: OSB: Creating an Asynchronous / Fire-Forget WebService Call Creating a fire-and-forget call via OSB is simple, according to solution architect Marc Kelderman. "The trick is to send NO response back to the caller, only an HTTP response code, 200 or any other." (tags: oracle otn servicebus)

    Read the article

  • ArchBeat Link-o-Rama for 2012-06-05

    - by Bob Rhubart
    Why is enterprise software often so complicated? | Rajesh Raheja rraheja.wordpress.com Rajesh Raheja shares "a few examples of requirements that lead to creation of complex platform infrastructures that up the complex enterprise software." Educause Top-Ten IT Issues - the most change in a decade or more | Cole Clark blogs.oracle.com Cole Clark discusses why "higher education IT must change in order to fully realize the potential for transforming the institution, and therefore it's people must learn new skills, understand and accept new ways of solving problems, and not be tied down by past practices or institutional inertia." Oracle VM RAC template - what it took | Wim Coekaerts blogs.oracle.com Wim Coekaerts shares an example that shows how easy it is to deploy a complete Oracle RAC cluster with Oracle VM. Oracle Cloud and Oracle Platinum Services Announcements oracle.com Featuring Larry Ellison and Mark Hurd. Wednesday, June 06, 2012. 1:00 p.m. PT – 2:30 p.m. PT Creating an Oracle Endeca Information Discovery 2.3 Application Part 1 : Scoping and Design | Mark Rittman www.rittmanmead.com Oracle ACE Director Mark Rittman launches a new series that dives into "the various stages in building a simple Oracle Endeca Information Discovery application, using the recent Endeca Information Discovery 2.3 release." Introducing Decision Tables in the SOA Suite 11g | Lucas Jellama technology.amis.nl Oracle ACE Director Lucas Jellema demonstrates how "the decision table can be put to good use to implement the business logic behind the classical game of Rock, Paper and Scissors." Application integration: reorganise, recycle, repurpose | Andrew Clarke radiofreetooting.blogspot.com "Integration is a topic which is in everybody's baliwick," says Oracle ACE Andrew Clarke. "The business people want to get the best value from their existing IT investments. The architects need to understand the interfaces between the silos and across the layers. The developers have to implement it." Using XA Transactions in Coherence-based Applications | Jonathan Purdy blogs.oracle.com Purdy shares "a few common approaches when integrating Coherence into applications via the use of an application server's transaction manager." Thought for the Day "The difficulty lies, not in the new ideas, but in escaping from the old ones..." — John Maynard Keynes (June 5, 1883 - April 4, 1946) Source: Quotations Page

    Read the article

  • SQL Saturday 194 - Exeter

    - by Dave Ballantyne
Many kudos go to Jonathan and Annette Allen and the others on the team for confirming SQL Saturday 194 in Exeter on the 8th and 9th of March. The event home page is here: http://www.sqlsaturday.com/194/eventhome.aspx and I am delighted that Dave Morrison and I will be presenting a full-day pre-con on the 8th on our favourite subjects, "TSQL and Internals". Here is the full abstract: TSQL and internals - When faced with performance issues there are many lines of attack. Tuning the engine itself can get you so far; however, for maximum effect you need to understand how the engine works and how it translates SQL statements into performable actions. This is not a simple task - dealing with a multi-table join is a massive task and the number of permutations can be immense. Backed by this knowledge, we can create better-performing TSQL, understand the impact that it has upon the engine, and recognize the pitfalls and gotchas that exist in SQL Server. Ultimately, there is no 'best way' to perform a single task, only many variations of 'it depends', but now we can pick the most appropriate option for the required data load. Over the years, many myths and misconceptions have grown around the product; some have a basis in older versions and some are just wrong. Continuing to build on the knowledge given so far, these issues will be explored, broken down, and proved or disproved. Finally, we will look to the future and explore SQL Server 2012, the new functionality that it brings, and some of the common uses that we will be able to address. After completing this day's pre-con, attendees will have a more complete knowledge of execution plans and how they relate to the physical and logical actions that SQL Server will be executing on their behalf. Attendees will also have a more rounded and fuller knowledge of TSQL and the implications of incorrectly defining a query. Dave is a fountain of knowledge on execution plans and optimizer internals and, though I may flatter myself, I'm no shrinking violet when it comes to TSQL and such matters. I hope that if you can't join us, there are other pre-cons available from other experts in their fields that may 'float your boat' too. The pre-con page is http://sqlsouthwest.co.uk/SQLSaturday_precon.htm Also, excitingly, this pre-con day is sponsored by Fusion-IO, which is a great boon for the day. If you want more of this, I am offering a 2-day TSQL course starting on the 19th of March. More details on this are available here

    Read the article

  • WDS "No response" only when notify and wait for approval

    - by Cylindric
    I have a WDS server setup for use with MDT2010, and everything was working fine until this morning. Now, whenever I try to boot from LAN, I get an error: Downloaded WDSNBP... Architecture: x64 WDSNBP started using DHCP Referral. Contacting Server: 10.50.10.12 (Gateway: 0.0.0.0) No response from Windows Deployment Services server. Launching pxeboot.com My PXE Response Policy setting on the WDS is set to this: [ ] Do not respond to any client computer [ ] Respond only to known client computers [o] Respond to all (known and unknown) client computers [X] For unknown clients, notify administrator and respond after approval The odd thing is that if I clear the approval option (so any computer gets a response) it works fine. I have delegated permissions on the AD OU to the computer object WDS is running on, but that doesn't seem to have helped. As it works without approval, I can only assume my DHCP options are fine and this is some sort of AD permission problem. (Server is Windows Server 2008 SP2)

    Read the article

  • Exchange 2010: Receiving "You can't send a message on behalf of this user..." error when trying to configure delegate access

    - by Beaming Mel-Bin
    I am trying to give someone (John Doe) delegate access to another account (Jane Doe) in our test environment. However, I receive the following error from Exchange: *Subject:* Undeliverable: You have been designated as a delegate for Jane Dow *To:* Jane Dow Delivery has failed to these recipients or groups: John Doe You can't send a message on behalf of this user unless you have permission to do so. Please make sure you're sending on behalf of the correct sender, or request the necessary permission. If the problem continues, please contact your helpdesk. Diagnostic information for administrators: Generating server: /O=UNIONCO/OU=EXCHANGE ADMINISTRATIVE GROUP (FYD132341234)/CN=RECIPIENTS/CN=jdoe #MSEXCH:MSExchangeIS:/DC=local/DC=unionco:MAILBOX-1[578:0x000004DC:0x0000001D] #EX# Can someone help me troubleshoot this?

    Read the article

< Previous Page | 75 76 77 78 79 80 81 82 83 84 85 86  | Next Page >