Search Results

Search found 16971 results on 679 pages for 'blogs'.


  • How to set up secure cookie on weblogic server

    - by adejuanc
    WebLogic Server allows a user to securely access HTTPS resources in a session that was initiated using HTTP, without loss of session data. To enable this feature, add AuthCookieEnabled="true" to the WebServer element in config.xml: <WebServer Name="myserver" AuthCookieEnabled="true"/>. Setting AuthCookieEnabled to true, which is the default setting, causes the WebLogic Server instance to send a new secure cookie, _WL_AUTHCOOKIE_JSESSIONID, to the browser when authenticating via an HTTPS connection. Once the secure cookie is set, the session is allowed to access other security-constrained HTTPS resources only if the cookie is sent from the browser.
    Thus, WebLogic Server uses two cookies: the JSESSIONID cookie and the _WL_AUTHCOOKIE_JSESSIONID cookie. By default, the JSESSIONID cookie is never secure, while the _WL_AUTHCOOKIE_JSESSIONID cookie is always secure. A secure cookie is only sent when an encrypted communication channel is in use. Assuming a standard HTTPS login (HTTPS is an encrypted HTTP connection), your browser gets both cookies. For subsequent HTTP access, you are considered authenticated if you have a valid JSESSIONID cookie, but for HTTPS access you must have both cookies to be considered authenticated. If you only have the JSESSIONID cookie, you must re-authenticate.
    To configure this in the Admin Console:
    1. Log into the WebLogic Admin Console.
    2. Under Domain Structure, click on <domainname>.
    3. Select the "Web Applications" tab.
    4. Select "Lock and Edit" in the Change Center.
    5. Check the "Auth Cookie Enabled" checkbox.
    6. Restart to confirm the changes.
    7. Test an application and view the cookie that got stored as "JSESSIONID".
    To configure the web application's weblogic-application.xml file:
    1. Run the following to extract weblogic-application.xml from the application archive: $PATH_JDK_HOME\bin\jar -xvf easy-web-examples.ear META-INF/weblogic-application.xml
    2. Add <cookie-secure>true</cookie-secure> between <session-descriptor> and </session-descriptor> in weblogic-application.xml.
    3. Run the following to repackage the file into the application: $PATH_JDK_HOME\bin\jar -uvf easy-web-examples.ear META-INF/weblogic-application.xml
    4. Deploy the application to WebLogic.
    For further information, please read the documentation on "Using Secure Cookies to Prevent Session Stealing": http://download.oracle.com/docs/cd/E12840_01/wls/docs103/security/thin_client.html#wp1053780
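    For reference, the two configuration fragments described above look like this once assembled. This is only a minimal sketch; the surrounding content of your own config.xml and weblogic-application.xml will differ:

        <!-- config.xml: enable the secure authentication cookie for the server -->
        <WebServer Name="myserver" AuthCookieEnabled="true"/>

        <!-- META-INF/weblogic-application.xml: mark the session cookie as secure -->
        <session-descriptor>
            <cookie-secure>true</cookie-secure>
        </session-descriptor>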

    Read the article

  • Opitz Consulting wins the Oracle SOA Partner Community Award

    - by Jürgen Kress
    Thanks for the nice post! An important award for the SOA community: the Oracle EMEA SOA Community Award. At Oracle OpenWorld, Oracle presented the "Oracle EMEA SOA Community Award" in the "Outstanding Contribution" category to OPITZ CONSULTING in 2010, for the third year in a row. Award as the first SOA Specialized Partner in Europe: in 2010 the SOA specialists at OPITZ CONSULTING won the coveted SOA Partner Community Award. With it, the Oracle SOA team around Jürgen Kress recognized the first Oracle SOA Specialization achieved in Germany, the company's community work, its delivery of SOA trainings (also for other Oracle partners), and the overall growth of the OPITZ CONSULTING SOA practice. Combining EA, BPM and SOA recognized: in 2009 an OPITZ CONSULTING Director of Strategy & Innovation was honored. Dirk Stähler (picture left) received the award for his commitment to building up and advancing BPM and SOA topics; a particular reason for the award was his work on effectively combining Enterprise Architecture, Business Process Management and SOA. Award for SOA community work and publications: OPITZ CONSULTING Director of Strategy & Innovation and Oracle ACE Director Torsten Winterberg (picture right) brought this award to Germany for OPITZ CONSULTING in 2008. He was recognized for his extraordinary commitment to establishing service-oriented architectures, including building up a Special Interest Group SOA, roundtable initiatives, and extensive publications on SOA.   For more information on the SOA Partner Community please feel free to register at www.oracle.com/goto/emea/soa (OPN account required) Blog Twitter LinkedIn Mix Forum Wiki Website Technorati Tags: SOA Community,Opitz,Opitz Consulting,Torsten Winterberg,oracle,opn,Jürgen Kress

    Read the article

  • Happy Birthday, SPARC!

    - by A&C Redaktion
    SPARC turns 25 this fall, and Oracle A&C and all partners send their warmest congratulations! We look back on a quarter century of success: it is 1987, and bulky gray PCs have been finding their way into offices and homes for a few years. An innovative startup called Sun Microsystems presents its new Sun-4 computer; the real sensation, however, is the microprocessor the young company developed specifically for it: SPARC. It was an extremely powerful RISC processor, used both in Sun's own workstations and in the servers of the Sun-4 line. Especially in enterprise IT, SPARC enabled an enormous leap forward in the years that followed. The further evolution of SPARC, combined with an overview of other milestones in the history of computing, can be found on the "Celebrate 25 Years of SPARC Innovation" website. Jumping ahead to the present: quite a lot has happened since Sun became part of Oracle. Oracle has just introduced the new SPARC T4 server line, and experts are already calling it the biggest performance gain in the history of the SPARC processors. In the USA the anniversary has already been celebrated in style: here you can find pictures from the birthday party at the Computer History Museum in Mountain View, California, attended by the SPARC developers Bill Joy and Andreas von Bechtolsheim, and the video "SPARC Event Highlights" is all about the anniversary as well. The Oracle family has another birthday in 2012: Solaris turns 20, congratulations! The Unix operating system, based on SunOS, first came to market in 1992. Solaris has held its strong position ever since and, with Solaris 11.1, has now introduced the first cloud operating system. It brings the reliability, security, and scalability of the proven Solaris to the cloud and offers an optimal platform for enterprise applications. Read what the trade press writes about the two birthday celebrants: ProLinux.de (SPARC), Computerwoche.de (Solaris), SearchDataCenter.de (Solaris)

    Read the article

  • Reviewing CeBIT 2011

    - by hechtsuppe
    Hey guys, I live in CeBIT's hometown, the beautiful city of Hanover, so I have been visiting this exhibition since 2002 and have seen a lot of changes during all this time. But this time it was the most boring fair I've ever seen. Let's start with the first halls: "the same procedure as every year", directly behind the entrance are the exhibitors from the Far East (China, Taiwan...). In the past they showed a lot of nice toys, but this time they got very serious; the only great gimmick was the motorcycle suitcase for the iPad. I watched the presentation until I reminded myself that I don't own an iPad, and these cases weren't suitable for my BMW motorcycle anyway. So I started looking for the business stuff (I was there for business). I walked deeper into the exhibition area: on the way to the business halls I came across a hall where I heard a heavy bass (the gamers' hall, I think), so I made a quick getaway from there. I saw a lot of teenagers with gaming bags on their shoulders and I was really confused. I thought, 'Damn, it is Tuesday, 11:00 am; the trade fair opens to the public on Saturday. Why are they here and not at school?' German schools seem to have become too easy for students. Back when I was a pupil, I visited CeBIT on Saturday! At the business halls: I visited IBM's booth, but there were only guys dressed like penguins, and I was wearing a white shirt, so nobody was interested in talking to me. At the coffee bar I met a very nice guy from Bangladesh, around 25 I think. He told me it was his first time in Germany and that he had thought Germans were still Nazis. I laughed and went on to DELL. I was really, really, really interested in client solutions from DELL because I want to get away from our current client manufacturer. At the DELL booth I was spotted quickly, and a really nice guy told me which client products to use where. But there were too many people for me to try the notebooks, so the DELL guy asked for my business card. I am still waiting for that information, dear 'DELL dude'. I went to the Microsoft booth to inform myself about new IT trends. There was nothing new, only a few presentations about the 'new' Windows Live, Windows Phone 7 and 'the almighty cloud'. But there was a very small presentation corner with the title 'geeks corner'. A guy inside the 'geeks corner' started Visual Studio 2010, and I was really eager for the presentation. But then he started talking about Windows Phone 7 and how to program for it. He began by dragging and dropping a textbox and a button onto the form, wrote really basic code and explained the functionality of a textbox. At that point I stood up and left the room. At the end: before leaving the fairground, I visited a few small booths and the big anti-virus companies. But there was nothing new; I was really disappointed this year. I saw only ten exhibition babes, and for the rest of the week I stood around three hours in traffic jams. But I really love the flair in the whole town during this exhibition: the people on the city trains, who are really confused, and the people in the pubs. Cheers Vince

    Read the article

  • So long Oracle ...

    - by arungupta
    ... and thanks for all the fish! This Friday (October 18, 2013) is my last day at Oracle. After
    - publishing almost 1400 blog entries with 5500+ comments on them
    - working in the Java EE team since inception
    - visiting 35+ countries and several cities around the world
    - speaking at all major Java conferences and lots of Java User Groups
    - being a 15-year alumnus of JavaOne as staff
    - meeting and working with the best of the best in the Java community
    - most importantly, having lots of fun
    it's time for me to move on! No new blog entries will be posted on this blog. Feel free to subscribe to The Aquarium for the latest updates on Java EE and GlassFish. I'll continue to publish all the excellent content that you've been used to, from now on at blog.arungupta.me. Read my new blog to learn about my new adventures! Here are some of the conference badges collected over the past years ... And the cities visited ... View Cities Visited by "Miles To Go..." in a larger map. The comments on this blog are disabled as I'll not be able to respond to them. Feel free to leave comments on the new blog and I'd love to follow up with you there. Thank you very much for all the support that has been shown on this blog. I'd like to conclude with a Hindi song that I've been humming for the past few days now ... Abhi alvida mat kaho doston ... Na jaane kahan phir mulaqaat ho ... Kyonki ... Beete huye lamhon ki kasak saath to hogi ... Khawabon mein hi ho chahe mulaqaat to hogi ... For my non-Hindi readers, here is my paraphrased meaning ... Don't say goodbye yet my friends ... We'll likely meet somewhere else ... Because ... We'll always have the memories of the wonderful time spent together ... May be in dreams but we will meet again ... With that, over and out, and see you at blog.arungupta.me!

    Read the article

  • AIA Artifacts in the Oracle Enterprise Repository

    - by Hans Viehmann
    The Oracle Enterprise Repository (OER) is the central place for managing SOA artifacts of all kinds, with the goal of accompanying these artifacts through their entire lifecycle. It is an essential foundation for their reuse, for determining dependencies, and for assessing the value of these artifacts, which in turn matters for the benefit of the SOA implementation. AIA 11g supports the current version of OER and complements it with the Project Lifecycle Workbench, which is where the functional specification, the decomposition of processes, and, for example, the generation of the deployment plan take place. For provisioning the Foundation Pack 11g artifacts there is now a corresponding AIA Solution Pack for OER, which lets you import the relevant structures as well as the components of Foundation Pack 11g (EBOs, EBMs, EBSs, and so on) directly, independently of an AIA installation. The pack is also available on support.oracle.com and can be downloaded there.

    Read the article

  • Technology vs. Antiquated Methods

    - by AreYouSerious
    So here I am, talking with my program lead about technology, and how, while my father is the VP of a major company, he still doesn't have a BlackBerry or a smartphone, and I think it's funny. Most people would say it's a generational thing: because he's older, he doesn't accept technology. I have trouble swallowing that, because this is the same man who bought a satellite radio for his car and made sure that the printer for the house was networked so that his and my mom's laptops could print wirelessly from the living room. I think it has more to do with necessity, and partially with financial responsibility. My father is very financially conscientious. Think about it yourself: you pay for internet at your home. You have internet access at your office. But if you get a smartphone, you're going to pay almost the same amount again just for that access. A lot of people take it as just another fixed cost... I'm one of those. I don't even think about it as I check my Facebook from the bus, the train, or even while sitting in traffic... The convenience of having a connection everywhere outweighs the financially responsible person screaming in the back of my mind. However, this conversation led us to another avenue of discussion: what happens when the power dies? If you left your charger at home, or your phone or navi just stops working, are you going to be able to continue on as you did when it was working? Let's take the navi as an example: if your navi stops working, how many of you know how to use a map and navigate? Can you even find where you are on a map using the cross streets you're stopped at? This is a skill that unfortunately is overlooked these days in the child-rearing process. Most people don't see the value, while some others can't do it themselves, so how can they teach their offspring? Take another example: what if your phone gets lost, or stolen, or you drive over it? Do you have the numbers in it memorized? Are they recorded somewhere? I know that if it weren't for Google Sync I wouldn't have them backed up... not sufficiently. And what good does that do if you're in Timbuktu and your phone dies? Think you can get on the internet to look up those numbers? Don't get me wrong: I'm the first to see the value in technology, and am willing to pay the price to not have to wait for prices to come down. I will pay extra to have that newest thing right now. But let me tell you what: I know that should I ever procreate, it will be a requirement for my offspring (children) to learn how to do something manually before I'll let them use technology. Food for thought?? Let everyone else know what you think.... just sayin'

    Read the article

  • The ABC of Front End Web Development

    - by Geertjan
    And here it is, the long awaited "ABC" of front end web development, in which the items I never knew existed until I was looking to fill the gaps link off to the sites where more info can be found on them. A is for Android and AngularJS B is for Backbone.js and Bower C is for CSS and Cordova D is for Docker E is for Ember.js and Ext JS F is for Frisby.js G is for Grunt H is for HTML I is for Ionic and iPhone J is for JavaScript, Jasmine, and JSON K is for Knockout.js and Karma L is for LESS M is for Mocha N is for NetBeans and Node.js O is for "Oh no, my JS app is unmaintainable!" P is for PHP, Protractor, and PhoneGap Q is for Queen.js R is for Request.js S is for SASS, Selenium, and Sublime T is for TestFairy U is for Umbrella V is for Vaadin W is for WebStorm X is for XML Y is for Yeoman Z is for Zebra

    Read the article

  • A Kingdom To Conquer: Character Sketches

    - by George Clingerman
    Still not 100% sold on my title so it remains a working title for now, but here’s a series of character sketches I’ve done for a turn based strategy game I’m playing at making. I’ve been sketching these on various pieces of paper throughout the last two weeks and just finished the last of them today (my plan was for 16 different types of units and well, now I have them, so I consider that done!).                    Pretty rough sketches for now, but I’m pretty happy with the art style overall. I was wrestling for quite a while just HOW I wanted the game to look and then I finally stumbled across Art Baltazar and I was like, THAT’S IT! There’s a few characters I need to re-do a bit more, I feel they’re a bit TOO much like some of the characters that inspired them but I’m happy that the ideas are finally sketched out. I’ve also been playing a bit in InkScape working on making these guys digital. A pretty new experience for me since I’m not used to working with vector images but I think I’ll get the hang of it. Here’s the Knight all vectorized. Now if I could just start making some progress on the actual game itself…

    Read the article

  • Surface V2.0

    - by Dennis Vroegop
    It’s been quiet around here. And the reason for that is that it’s been quiet around Surface for a while. Now, a lot of people assume that when a product team isn’t making too much noise, that must mean they stopped working on their product. Remember the PDC keynote in 2010? Just because they didn’t mention WPF there, a lot of people had the idea that WPF was dead and abandoned for Silverlight. Of course, this couldn’t be farther from the truth. The same applies to Surface. While we didn’t hear much from the team in Redmond, they were busy putting together the next version of the platform. And at CES in January the world saw what they have been up to all along: Surface V2.0, as it’s commonly known. Of course, the product is still in development. It’s not here yet; we can’t buy one yet. However, more and more information becomes available, and I think this is a good time to share with you what it’s all about!
    The biggest change from an organizational point of view is that Microsoft decided to stop producing the hardware themselves. Instead, they have formed a partnership with Samsung, who will manufacture the devices. This means that you as a buyer get the benefits of a large, worldwide supplier with all the services they can offer. Not that Microsoft didn’t do that before, but since Surface wasn’t a ‘big’ product it was sometimes hard to get to the right people. The new device is officially called the “Samsung SUR 40 for Microsoft Surface”, which is quite a mouthful. The software that runs the device is of course still coming from Microsoft. Let’s dive into the technical specs (note: all of this is preliminary, it’s still in the Alpha phase!):
    Audio out: HDMI / Stereo RCA / SPDIF / 2x 3.5mm audio out jack
    Brightness: 300 cd/m2
    Communications: 1Gb Ethernet / 802.11 / Bluetooth
    Contrast ratio: 1:1000
    CPU: AMD Athlon X2 245e, 2.9GHz dual core
    Display resolution: Full HD 1080p, 1920x1080 / 16:9 aspect ratio
    GPU: AMD Radeon HD 6750, 1GB GDDR5
    HDD: 320 GB / 7200 RPM
    HDMI in / HDMI out: Yes
    I/O ports: 4 USB, SD card reader
    Operating system: Embedded Windows 7 Professional, 64-bit
    Panel size: 40” diagonal
    Protection glass: Gorilla Glass
    RAM: 4 GB DDR3
    Weight (with standard legs): 70.0 kg / 154 lbs
    Weight (standalone): 39.5 kg / 87 lbs
    Height (without legs): 4 inches
    Contact points recognized: > 50
    Cool factor: Extremely
    Ok, the last point is not official, but I do think it needs to be there. Let’s talk software. As noted, it runs Windows 7 Professional 64-bit, which means you can run Visual Studio 2010 on it. The software is going to be developed in WPF 4.0 with the additional Surface SDK 2.0. It will contain all the things you’ve seen before plus some extras. They have taken some steps to align it more with the Surface Toolkit which you can download today, so if you do things right your software should be portable between a WPF 4.0 Windows 7 multi-touch app and the Surface V2 environment. It still uses infrared to detect contacts, so in that respect nothing much has changed conceptually. We can still differentiate between a finger, a tag or a blob. Of course, since the new platform has a much higher resolution (compared to the 1024x768 of the first version) you might need to look at your code again. I’ve seen a lot of applications on Surface that assume the old resolution, and moving that to V2 is going to be some work; a sketch of one way to handle this follows below. To be honest: as I am under NDA I cannot disclose much about the new software besides what I have told you here, but trust me: it’s going to blow people away.
    Now, the biggest question for me is: when can I get one? Until we can, have a look here: Technorati Tags: surface,samsung,WPF
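    As a concrete illustration of the resolution point above, here is a minimal, hedged C# sketch (plain WPF, not tied to any Surface SDK API; the class and member names are made up for illustration) of scaling coordinates that were authored against the old 1024x768 layout to whatever size the new panel actually reports:

        using System.Windows;

        public static class LayoutScaling
        {
            private const double DesignWidth = 1024.0;   // old Surface v1 resolution
            private const double DesignHeight = 768.0;

            // Convert a point authored for the 1024x768 design surface into
            // coordinates for the element's actual size (e.g. 1920x1080).
            public static Point ToActual(Point designPoint, FrameworkElement root)
            {
                double scaleX = root.ActualWidth / DesignWidth;
                double scaleY = root.ActualHeight / DesignHeight;
                return new Point(designPoint.X * scaleX, designPoint.Y * scaleY);
            }
        }

    Better still is to avoid absolute coordinates altogether and let WPF layout panels and relative sizes do the work, so the same XAML runs unchanged on both resolutions.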

    Read the article

  • Chalk Talk, Glenn Block - Leith, Edinburgh 12th March 2011

    - by David Christiansen
    Exciting news: I am proud to announce that Glenn Block from Microsoft will be coming all the way from Seattle to Scotland on the 12th of March to talk to you! Glenn is a PM on the WCF team working on Microsoft’s future HTTP and REST stack, and has been involved in some pretty exciting and ground-breaking Microsoft development mind-shifts in recent times. Don’t miss the chance to hear him speak and ask him questions. A brief history of Glenn: prior to WCF he was a PM on the new Managed Extensibility Framework in .NET 4.0. Glenn has a breadth of experience both inside and outside Microsoft developing software solutions for ISVs and the enterprise. Glenn has also been very active in involving folks from the community in the development of software at Microsoft. This has included shipping several products under open source licenses, as well as assisting other teams looking to do so. Glenn is also a frequent speaker at local and international events and user groups.  When he's not working and playing with technology, he spends his time with his wife and daughter either at their home in Seattle or at one of the local coffee shops. Glenn Block on the web: mvcConf 2 - Glenn Block: Take some REST with WCF (Feb 2011); @gblock on Twitter; My Technobabble - Glenn’s blog. Sponsored by Storm ID, an award-winning full service digital agency in Edinburgh.

    Read the article

  • Oracle ADF Essentials & ADF training material now on the iPad By Grant Ronald

    - by JuergenKress
    Faster and Simpler Java-based Application Development - Now Free. Oracle ADF Essentials is an end-to-end Java EE framework that simplifies application development by providing out-of-the-box infrastructure services and a visual and declarative development experience. Oracle ADF Essentials is free to develop and deploy. Resources: Oracle ADF Essentials Overview Demo; Tutorial - Using Oracle ADF Essentials with JPA/EJB and JSF; Oracle ADF Essentials FAQ; Introduction to Oracle ADF Seminar; Tutorial - Developing with Oracle ADF Essentials.
    ADF training material now on the iPad, by Grant Ronald: My team has developed about a week's worth of ADF training material under the title ADF Insider and ADF Insider Essentials. This is available from our page on OTN. But we are now loading all our content on YouTube as well, so the content can now be accessed on iPads. Over the next couple of weeks we'll also add these YouTube links to the OTN page, but in the meantime, if you have an interest in ADF I strongly urge you to subscribe to our ADFInsiderEssentials YouTube channel so you can be alerted when new content comes online. Please also provide your comments, thumbs up/down, and let us know which content/topics are of interest to you.
    GlassFish Extension for Oracle JDeveloper, by Shay Shmeltzer: We just released a new version of Oracle JDeveloper - 11.1.2.3. One new feature here is built-in support for GlassFish. This includes the ability to create an "application server" connection to GlassFish and then deploy to that server with one click from inside JDeveloper. You can use this for deploying Oracle ADF Essentials applications on GlassFish, but you can also use it to deploy any Java EE application you build in JDeveloper on GlassFish. However, if you are planning to work with GlassFish and JDeveloper on a more regular basis as your development server, then you might find my new extension useful. The new extension allows you to start and stop an external GlassFish instance, as well as start it in debug mode (which will allow JDeveloper to remotely debug your application as it runs on the server). I also added a button that will invoke the web admin console of GlassFish. Here is a quick demo that will show you how to work with the extension.
    WebLogic Partner Community: For regular information, become a member of the WebLogic Partner Community; please visit http://www.oracle.com/partners/goto/wls-emea (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum Wiki Technorati Tags: adf training,adf,grant Ronald,adf essential,WebLogic Community,Oracle,OPN,Jürgen Kress

    Read the article

  • John Hitchcock of Pace Describes the Oracle Agile PLM Customer Experience

    John Hitchcock, Senior Manager of Configuration Management at Pace (formerly 2Wire, Inc.), sat down for an interview during Oracle's Innovation Summit with Kerrie Foy, Manager of PLM Product Marketing at Oracle. Learn why his organization upgraded to the latest version of Agile and expanded the footprint to achieve impressive savings and productivity gains across the global, networked product value-chain.

    Read the article

  • Silverlight MEF – Download On Demand

    - by PeterTweed
    Take the Slalom Challenge at www.slalomchallenge.com! A common challenge with building complex applications in Silverlight is the initial download size of the xap file.  MEF enables us to build composable applications that allows us to build complex composite applications.  Wouldn’t it be great if we had a mechanism to spilt out components into different Silverlight applications in separate xap files and download the separate xap file only if needed?   MEF gives us the ability to do this.  This post will cover the basics needed to build such a composite application split between different silerlight applications and download the referenced silverlight application only when needed. Steps: 1.     Create a Silverlight 4 application 2.     Add references to the following assemblies: System.ComponentModel.Composition.dll System.ComponentModel.Composition.Initialization.dll 3.     Add a new Silverlight 4 application called ExternalSilverlightApplication to the solution that was created in step 1.  Ensure the new application is hosted in the web application for the solution and choose to not create a test page for the new application. 4.     Delete the App.xaml and MainPage.xaml files – they aren’t needed. 5.     Add references to the following assemblies in the ExternalSilverlightApplication project: System.ComponentModel.Composition.dll System.ComponentModel.Composition.Initialization.dll 6.     Ensure the two references above have their Copy Local values set to false.  As we will have these two assmblies in the original Silverlight application, we will have no need to include them in the built ExternalSilverlightApplication build. 7.     Add a new user control called LeftControl to the ExternalSilverlightApplication project. 8.     Replace the LayoutRoot Grid with the following xaml:     <Grid x:Name="LayoutRoot" Background="Beige" Margin="40" >         <Button Content="Left Content" Margin="30"></Button>     </Grid> 9.     Add the following statement to the top of the LeftControl.xaml.cs file using System.ComponentModel.Composition; 10.   Add the following attribute to the LeftControl class     [Export(typeof(LeftControl))]   This attribute tells MEF that the type LeftControl will be exported – i.e. made available for other applications to import and compose into the application. 11.   Add a new user control called RightControl to the ExternalSilverlightApplication project. 12.   Replace the LayoutRoot Grid with the following xaml:     <Grid x:Name="LayoutRoot" Background="Green" Margin="40"  >         <TextBlock Margin="40" Foreground="White" Text="Right Control" FontSize="16" VerticalAlignment="Center" HorizontalAlignment="Center" ></TextBlock>     </Grid> 13.   Add the following statement to the top of the RightControl.xaml.cs file using System.ComponentModel.Composition; 14.   Add the following attribute to the RightControl class     [Export(typeof(RightControl))] 15.   In your original Silverlight project add a reference to the ExternalSilverlightApplication project. 16.   Change the reference to the ExternalSilverlightApplication project to have it’s Copy Local value = false.  This will ensure that the referenced ExternalSilverlightApplication Silverlight application is not included in the original Silverlight application package when it it built.  The ExternalSilverlightApplication Silverlight application therefore has to be downloaded on demand by the original Silverlight application for it’s controls to be used. 1.     
In your original Silverlight project add the following xaml to the LayoutRoot Grid in MainPage.xaml:         <Grid.RowDefinitions>             <RowDefinition Height="65*" />             <RowDefinition Height="235*" />         </Grid.RowDefinitions>         <Button Name="LoaderButton" Content="Download External Controls" Click="Button_Click"></Button>         <StackPanel Grid.Row="1" Orientation="Horizontal" HorizontalAlignment="Center" >             <Border Name="LeftContent" Background="Red" BorderBrush="Gray" CornerRadius="20"></Border>             <Border Name="RightContent" Background="Red" BorderBrush="Gray" CornerRadius="20"></Border>         </StackPanel>       The borders will hold the controls that will be downlaoded, imported and composed via MEF when the button is clicked. 2.     Add the following statement to the top of the MainPage.xaml.cs file using System.ComponentModel.Composition; 3.     Add the following properties to the MainPage class:         [Import(typeof(LeftControl))]         public LeftControl LeftUserControl { get; set; }         [Import(typeof(RightControl))]         public RightControl RightUserControl { get; set; }   This defines properties accepting LeftControl and RightControl types.  The attrributes are used to tell MEF the discovered type that should be applied to the property when composition occurs. 17.   Add the following event handler for the button click to the MainPage.xaml.cs file:         private void Button_Click(object sender, RoutedEventArgs e)         {                   DeploymentCatalog deploymentCatalog =     new DeploymentCatalog("ExternalSilverlightApplication.xap");                   CompositionHost.Initialize(deploymentCatalog);                   deploymentCatalog.DownloadCompleted += (s, i) =>                 {                     if (i.Error == null)                     {                         CompositionInitializer.SatisfyImports(this);                           LeftContent.Child = LeftUserControl;                         RightContent.Child = RightUserControl;                         LoaderButton.IsEnabled = false;                     }                 };                   deploymentCatalog.DownloadAsync();         } This is where the magic happens!  The deploymentCatalog object is pointed to the ExternalSilverlightApplication.xap file.  It is then associated with the CompositionHost initialization.  As the download will be asynchronous, an eventhandler is created for the DownloadCompleted event.  The deploymentCatalog object is then told to start the asynchronous download. The event handler that executes when the download is completed uses the CompositionInitializer.SatisfyImports() function to tell MEF to satisfy the Imports for the current class.  It is at this point that the LeftUserControl and RightUserControl properties are initialized with composed objects from the downloaded ExternalSilverlightApplication.xap package. 18.   Run the application click the Download External Controls button and see the controls defined in the ExternalSilverlightApplication application loaded into the original Silverlight application. Congratulations!  You have implemented download on demand capabilities for composite applications using the MEF DeploymentCatalog class.  You are now able to segment your applications into separate xap file for deployment.

    Read the article

  • INCLUDE ORACLE'S BUSINESS INTELLIGENCE SOFTWARE IN YOUR SOLUTION / 22 Feb 11

    - by Claudia Costa
    We invite you to attend the ISV Partner Embedded BI session taking place on 22 February at the Oracle offices in Porto Salvo. Don't miss this opportunity to discover how you can modernize your application by embedding Oracle Business Intelligence (OBI 11g). During this session you will learn how to make your reports and management information more competitive, and at the same time how to give your customers management information with an appealing look. Why does this topic matter to you? By embedding the Oracle BI solution in your application, you can address market opportunities faster and add value to your product. You can also lower the total cost of ownership (TCO) and deliver a higher return on investment. For questions or further clarification, please contact Claudia Costa - Tel: 21 4235027 or email: [email protected]. We look forward to your presence!
    Agenda:
    09:15 Registration
    09:30 Welcome and Introduction - Paulo Costa, ISV Manager Oracle Portugal
    09:40 The BI&EPM Market and Oracle's Strategic Position - Mike Hallet, BI and EPM Director Oracle EMEA
    10:00 Oracle Business Intelligence 11g - Most Complete, Open, Integrated and Embeddable solution - Guy Ernoul, Master Principal Sales Consultant
    11:00 Coffee Break
    11:20 Introduction to the embedded BI program for ISV partners - Mike Hallet, BI and EPM Director Oracle EMEA
    12:00 Partner showcase of an Oracle Embedded BI solution
    13:00 Lunch
    14:00 Technical Presentation - Guy Ernoul, Master Principal Sales Consultant. OBI Administration: Architecture; Creating & Managing the (Presentation, Model, Physical) Layers; Administration using FMW Control; Diagnostics and Performance with Enterprise Manager; Demonstration. OBI Utilization: Analyses & Dashboard Reports; Action Framework; Map & Scorecard; APIs for Embedding OBI 11g (Go, XML, ADF); Demonstration
    16:00 Close
    22 February 2011, 9.30 a.m. - 4.00 p.m., Oracle Showroom, Lagoas Park - Edf 8, Porto Salvo. Attend this exclusive event. Free registration, limited seats! Register now!

    Read the article

  • Extensible Metadata in Oracle IRM 11g

    - by martin.abrahams
    Another significant change in Oracle IRM 11g is that we now use XML to create the tamperproof header for each sealed document. This article explains what this means, and what benefit it offers. So, every sealed file has a metadata header that contains information about the document - its classification, its format, the user who sealed it, the name and URL of the IRM Server, and much more. The IRM Desktop and other IRM applications use this information to formulate the request for rights, as well as to enhance the user experience by exposing some of the metadata in the user interface. For example, in Windows explorer you can see some metadata exposed as properties of a sealed file and in the mouse-over tooltip. The following image shows 10g and 11g metadata side by side. As you can see, the 11g metadata is written as XML as opposed to the simple delimited text format used in 10g. So why does this matter? The key benefit of using XML is that it creates the opportunity for sealing applications to use custom metadata. This in turn creates the opportunity for custom classification models to be defined and enforced. Out of the box, the solution uses the context classification model, in which two particular pieces of metadata form the basis of rights evaluation - the context name and the document's item code. But a custom sealing application could use some other model entirely, enabling rights decisions to be evaluated on some other basis. The integration with Oracle Beehive is a great example of this. When a user adds a document to a Beehive workspace, that document can be automatically sealed with metadata that represents the Beehive security model rather than the context model. As a consequence, IRM can enforce the Beehive security model precisely and all rights configuration can actually be managed through the Beehive UI rather than the IRM UI. In this scenario, IRM simply supports the Beehive application, seamlessly extending Beehive security to all copies of workspace documents without any additional administration. Finally, I mentioned that the metadata header is tamperproof. This is obviously to stop a rogue user modifying the metadata with a view to gaining unauthorised access - reclassifying a board document to a less sensitive classifcation, for example. To prevent this, the header is digitally signed and can only be manipulated by a suitably authorised sealing application.

    Read the article

  • JavaScript in different browsers

    - by PointsToShare
    Adventures with JavaScript rendered in IE 8, Chrome 15, and Firefox 8.0. I have written a little monograph about the advantages of Math and wrote a few JavaScript applications to demonstrate them. I was a bit careless and used elements on the page in my JavaScript without using any of the getElementById/getElementsBy* methods to look them up. Say I had a text box named tbSeqNum into which I entered a number to be used in a computation. In my code I simply referred to its value directly, like here: function blah() { return tbSeqNum.value; } This ran fine in IE 8. In IE, the elements are available as global variables. This is not the case in either Firefox or Chrome; there you have to look the element up first and only then use it. Assuming I also used tbSeqNum as the element’s ID, this works: function blah() { return document.getElementById("tbSeqNum").value; } Naturally this corrected function also works in IE, so be warned. Also, coming from Windows programming (I am long in the tooth and programmed long before the internet), I have a habit of putting an "Exit" button on my pages and setting their onclick to: onclick="window.close()". Again, this works fine in IE. In Firefox and Chrome, it does not! There you can only close a window that you opened in code; a window that was opened by navigating to a URL will not close. Before I deployed my code to my website, I painfully removed all my Exit buttons. But my greatest surprise came when I tested my pages in the various browsers. In my code I do a comparison of the performance of two algorithms used to solve the same problem. One is brute force, the other uses a mathematical formula. The compare function runs each many times and displays the time it took for each and also the ratio. Chrome runs JavaScript between 5 and 10 times faster than Firefox and between 50 and 100 times faster than IE. Wow!!! This difference is especially remarkable when the code uses iteration. I suspect that the JS engines in Chrome and Firefox simply cache the result of a function and, if it is called again with the same parameters, return the cached result. To see it in action, run the "How Many Squares" page at www.mgsltns.com/games.htm The host is running on Unix, so the link is case sensitive. Last note: IE9 runs JS a bit faster, but still lags behind almost as badly. That’s All Folks!

    Read the article

  • Adding Async=true to the page - no side effects noticed.

    - by Michael Freidgeim
    Recently I needed to implement PageAsyncTask in a .NET 4 Web Forms application. According to http://msdn.microsoft.com/en-us/library/system.web.ui.pageasynctask.aspx: "A PageAsyncTask object must be registered to the page through the RegisterAsyncTask method. The page itself does not have to be processed asynchronously to execute asynchronous tasks. You can set the Async attribute to either true (as shown in the following code example) or false on the page directive and the asynchronous tasks will still be processed asynchronously: <%@ Page Async="true" %> When the Async attribute is set to false, the thread that executes the page will be blocked until all asynchronous tasks are complete." I was worried about any side effects if I set Async="true" on an existing page. The only documented restrictions that I found are that @Async is not compatible with the @AspCompat and Transaction attributes (from the @ Page directive MSDN article). In other words, asynchronous pages do not work when the AspCompat attribute is set to true or the Transaction attribute is set to a value other than Disabled in the @ Page directive. From our tests we conclude that adding Async="true" to the page is quite safe, even if you don't always call async tasks from the page.
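    For completeness, here is a minimal, hedged sketch of what the registration described above can look like in a .NET 4 Web Forms code-behind. The page class, URL, and handler names are illustrative only; the pattern is the standard Begin/End/Timeout trio that PageAsyncTask expects:

        using System;
        using System.Net;
        using System.Web.UI;

        public partial class SomePage : Page   // hypothetical page class
        {
            private WebRequest _request;

            protected void Page_Load(object sender, EventArgs e)
            {
                // Registered tasks run whether the @ Page directive says Async="true" or not,
                // as the MSDN quote above explains.
                RegisterAsyncTask(new PageAsyncTask(BeginWork, EndWork, TimeoutWork, null));
            }

            private IAsyncResult BeginWork(object sender, EventArgs e, AsyncCallback cb, object state)
            {
                _request = WebRequest.Create("http://example.org/");   // placeholder URL
                return _request.BeginGetResponse(cb, state);
            }

            private void EndWork(IAsyncResult ar)
            {
                using (var response = _request.EndGetResponse(ar))
                {
                    // consume the response here
                }
            }

            private void TimeoutWork(IAsyncResult ar)
            {
                // handle the timeout, e.g. log it and render fallback content
            }
        }

    With Async="true" the request thread is released while the I/O started in BeginWork is pending; with Async="false" the task still runs, but the page thread blocks until it completes, exactly as the quoted documentation describes.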

    Read the article

  • Current Technologies

    - by Charles Cline
    I currently work at the University of Kansas (KU) and before that Stanford University, to be particular the Stanford Linear Accelerator Center (SLAC).  Collaborating with various Higher Ed institutions the past several years has shown a marked increase in the Microsoft side of the house.  To give you an idea of our current environment, here are some of the things we (Enterprise Systems) have been working on the past two years I’ve been at KU: Migrated from Novell to Active Directory (AD), although we’re still leveraging Novell for IDM.  We currently have 550,000+ objects in AD, and we still have several departments to bring in. Upgraded from Exchange 2003 to Exchange 2010 and Forefront Online Protection for Exchange (FOPE) Implemented SCCM 2007 for Windows systems management Implemented central file storage using EMC products for the backend, using CIFS as the frontend Restructuring AD domains and Forests to decrease the administrative overhead and provide a primary authentication mechanism for the entire University Determining Key Performance Indicators for AD and Exchange Implemented SCOM 2007 to monitor AD and Exchange Implemented Confluence for collaboration within IT and other technology providers at the University Implemented Data Protection Manager (DPM) for backup of AD and Exchange Built a test and QA environment to better facilitate upcoming changes to the environment Almost ready to raise the AD domain level to 2008 R2   I’m sure I’m missing things, and my next post will be some of the things we’re getting ready for – like Centrify to provide AD for OS X and Linux systems.  If anyone would like more info on a particular area, please drop me a line.  I’d be happy to discuss.

    Read the article

  • Free E-Book - TypeScript Succinctly

    - by TATWORTH
    Originally posted on: http://geekswithblogs.net/TATWORTH/archive/2013/06/22/free-e-book---typescript-succinctly.aspx
    At http://www.syncfusion.com/resources/techportal/ebooks/typescript, Syncfusion offers a free e-book, "TypeScript Succinctly": "The extensive adoption of JavaScript for application development, and the ability to use HTML and JavaScript to create Windows Store apps, led Microsoft to develop TypeScript, a superset of JavaScript. Though the messiness of JavaScript causes many .NET developers to avoid the language, Microsoft's additions extend many familiar features of .NET programming to JavaScript. With TypeScript Succinctly by Steve Fenton, you will learn how TypeScript provides optional static typing and classes to JavaScript development, how to create and load modules, and how to work with existing JavaScript libraries through ambient declarations. TypeScript is even significantly integrated with Visual Studio to provide the autocompletion and type checking you are most comfortable with."

    Read the article

  • QotD - Nicolas de Loof on AdoptOpenJDK

    "The AdoptOpenJDK program is an initiative to get as many Java users as possible to try the OpenJDK 8 preview builds, so that feedback is collected before JDK 8 is officially released. There are many ways to contribute to this program (as explained on the wiki), but the most basic one is to start testing your own project on the Java 8 platform. CloudBees can help you there, as we just made OpenJDK 8 (preview) available on DEV@cloud so that you can configure a build job to check project compatibility. We will upgrade the JDK for all recent preview builds until JDK 8 is final." Nicolas de Loof, Support Engineer at CloudBees, in a blog post on AdoptOpenJDK.

    Read the article

  • Using commands with ApplicationBarMenuItem and ApplicationBarButton in Windows Phone 7

    - by Laurent Bugnion
    Unfortunately, in the current version of the Windows Phone 7 Silverlight framework, it is not possible to attach any command on the ApplicationBarMenuItem and ApplicationBarButton controls. These two controls appear in the Application Bar, for example with the following markup: <phoneNavigation:PhoneApplicationPage.ApplicationBar> <shell:ApplicationBar x:Name="MainPageApplicationBar"> <shell:ApplicationBar.MenuItems> <shell:ApplicationBarMenuItem Text="Add City" /> <shell:ApplicationBarMenuItem Text="Add Country" /> </shell:ApplicationBar.MenuItems> <shell:ApplicationBar.Buttons> <shell:ApplicationBarIconButton IconUri="/Resources/appbar.feature.video.rest.png" /> <shell:ApplicationBarIconButton IconUri="/Resources/appbar.feature.settings.rest.png" /> <shell:ApplicationBarIconButton IconUri="/Resources/appbar.refresh.rest.png" /> </shell:ApplicationBar.Buttons> </shell:ApplicationBar> </phoneNavigation:PhoneApplicationPage.ApplicationBar> This code will create the following UI: Application bar, collapsed Application bar, expanded ApplicationBarItems are not, however, controls. A quick look in MSDN shows the following hierarchy for ApplicationBarMenuItem, for example: Unfortunately, this prevents all the mechanisms that are normally used to attach a Command (for example a RelayCommand) to a control. For example, the attached behavior present in the class ButtonBaseExtension (from the Silverlight 3 version of the MVVM Light toolkit) can only be attached to a DependencyObject. Similarly, Blend behaviors (such as EventToCommand from the toolkit’s Extras library) needs a FrameworkElement to work. Using code behind The alternative is to use code behind. As I said in my MIX10 talk, the MVVM police will not take your family away if you use code behind (this quote was actually suggested to me by Glenn Block); the code behind is there for a reason. In our case, invoking a command in the ViewModel requires the following code: In MainPage.xaml: <shell:ApplicationBarMenuItem Text="My Menu 1" Click="ApplicationBarMenuItemClick"/> In MainPage.xaml.cs private void ApplicationBarMenuItemClick( object sender, System.EventArgs e) { var vm = DataContext as MainViewModel; if (vm != null) { vm.MyCommand.Execute(null); } } Conclusion Resorting to code behind to bridge the gap between the View and the ViewModel is less elegant than using attached behaviors, either through an attached property or through a Blend behavior. It does, however, work fine. I don’t have any information if future changes in the Windows Phone 7 Application Bar API will make this easier. In the mean time, I would recommend using code behind instead.   Laurent Bugnion (GalaSoft) Subscribe | Twitter | Facebook | Flickr | LinkedIn
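    The code-behind approach above works fine; if you have several buttons and menu items, the wiring can also be factored into a small helper so each one becomes a single line. The following is just a sketch along those lines, not part of the SDK or of the MVVM Light toolkit; the class and method names are made up for illustration:

        using System;
        using System.Windows.Input;
        using Microsoft.Phone.Shell;

        public static class AppBarCommandBinder
        {
            // Routes an ApplicationBarIconButton's Click to an ICommand and keeps
            // IsEnabled in sync with CanExecute. A similar overload could be written
            // for ApplicationBarMenuItem, which exposes the same members.
            public static void Bind(ApplicationBarIconButton button, ICommand command, object parameter = null)
            {
                button.Click += (s, e) =>
                {
                    if (command.CanExecute(parameter))
                    {
                        command.Execute(parameter);
                    }
                };

                command.CanExecuteChanged += (s, e) =>
                    button.IsEnabled = command.CanExecute(parameter);
            }
        }

    Usage would be a single call per button in the page constructor, for example AppBarCommandBinder.Bind((ApplicationBarIconButton)ApplicationBar.Buttons[0], vm.MyCommand), assuming vm is the page's view model; the per-button Click handlers in the code-behind then disappear.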

    Read the article

  • Testing Entity Framework applications, pt. 3: NDbUnit

    - by Thomas Weller
    This is the third of a three part series that deals with the issue of faking test data in the context of a legacy app that was built with Microsoft's Entity Framework (EF) on top of an MS SQL Server database – a scenario that can be found very often. Please read the first part for a description of the sample application, a discussion of some general aspects of unit testing in a database context, and of some more specific aspects of the here discussed EF/MSSQL combination. Lately, I wondered how you would ‘mock’ the data layer of a legacy application, when this data layer is made up of an MS Entity Framework (EF) model in combination with a MS SQL Server database. Originally, this question came up in the context of how you could enable higher-level integration tests (automated UI tests, to be exact) for a legacy application that uses this EF/MSSQL combo as its data store mechanism – a not so uncommon scenario. The question sparked my interest, and I decided to dive into it somewhat deeper. What I've found out is, in short, that it's not very easy and straightforward to do it – but it can be done. The two strategies that are best suited to fit the bill involve using either the (commercial) Typemock Isolator tool or the (free) NDbUnit framework. The use of Typemock was discussed in the previous post, this post now will present the NDbUnit approach... NDbUnit is an Apache 2.0-licensed open-source project, and like so many other Nxxx tools and frameworks, it is basically a C#/.NET port of the corresponding Java version (DbUnit namely). In short, it helps you in flexibly managing the state of a database in that it lets you easily perform basic operations (like e.g. Insert, Delete, Refresh, DeleteAll)  against your database and, most notably, lets you feed it with data from external xml files. Let's have a look at how things can be done with the help of this framework. Preparing the test data Compared to Typemock, using NDbUnit implies a totally different approach to meet our testing needs.  So the here described testing scenario requires an instance of an SQL Server database in operation, and it also means that the Entity Framework model that sits on top of this database is completely unaffected. First things first: For its interactions with the database, NDbUnit relies on a .NET Dataset xsd file. See Step 1 of their Quick Start Guide for a description of how to create one. 
With this prerequisite in place then, the test fixture's setup code could look something like this: [TestFixture, TestsOn(typeof(PersonRepository))] [Metadata("NDbUnit Quickstart URL",           "http://code.google.com/p/ndbunit/wiki/QuickStartGuide")] [Description("Uses the NDbUnit library to provide test data to a local database.")] public class PersonRepositoryFixture {     #region Constants     private const string XmlSchema = @"..\..\TestData\School.xsd";     #endregion // Constants     #region Fields     private SchoolEntities _schoolContext;     private PersonRepository _personRepository;     private INDbUnitTest _database;     #endregion // Fields     #region Setup/TearDown     [FixtureSetUp]     public void FixtureSetUp()     {         var connectionString = ConfigurationManager.ConnectionStrings["School_Test"].ConnectionString;         _database = new SqlDbUnitTest(connectionString);         _database.ReadXmlSchema(XmlSchema);         var entityConnectionStringBuilder = new EntityConnectionStringBuilder         {             Metadata = "res://*/School.csdl|res://*/School.ssdl|res://*/School.msl",             Provider = "System.Data.SqlClient",             ProviderConnectionString = connectionString         };         _schoolContext = new SchoolEntities(entityConnectionStringBuilder.ConnectionString);         _personRepository = new PersonRepository(this._schoolContext);     }     [FixtureTearDown]     public void FixtureTearDown()     {         _database.PerformDbOperation(DbOperationFlag.DeleteAll);         _schoolContext.Dispose();     }     ...  As you can see, there is slightly more fixture setup code involved if your tests are using NDbUnit to provide the test data: Because we're dealing with a physical database instance here, we first need to pick up the test-specific connection string from the test assemblies' App.config, then initialize an NDbUnit helper object with this connection along with the provided xsd file, and also set up the SchoolEntities and the PersonRepository instances accordingly. The _database field (an instance of the INdUnitTest interface) will be our single access point to the underlying database: We use it to perform all the required operations against the data store. To have a flexible mechanism to easily insert data into the database, we can write a helper method like this: private void InsertTestData(params string[] dataFileNames) {     _database.PerformDbOperation(DbOperationFlag.DeleteAll);     if (dataFileNames == null)     {         return;     }     try     {         foreach (string fileName in dataFileNames)         {             if (!File.Exists(fileName))             {                 throw new FileNotFoundException(Path.GetFullPath(fileName));             }             _database.ReadXml(fileName);             _database.PerformDbOperation(DbOperationFlag.InsertIdentity);         }     }     catch     {         _database.PerformDbOperation(DbOperationFlag.DeleteAll);         throw;     } } This lets us easily insert test data from xml files, in any number and in a  controlled order (which is important because we eventually must fulfill referential constraints, or we must account for some other stuff that imposes a specific ordering on data insertion). Again, as with Typemock, I won't go into API details here. - Unfortunately, there isn't too much documentation for NDbUnit anyway, other than the already mentioned Quick Start Guide (and the source code itself, of course) - a not so uncommon problem with smaller Open Source Projects. 
Last but not least, we need to provide the required test data in xml form. A snippet for data from the People table might look like this, for example:

<?xml version="1.0" encoding="utf-8" ?>
<School xmlns="http://tempuri.org/School.xsd">
  <Person>
    <PersonID>1</PersonID>
    <LastName>Abercrombie</LastName>
    <FirstName>Kim</FirstName>
    <HireDate>1995-03-11T00:00:00</HireDate>
  </Person>
  <Person>
    <PersonID>2</PersonID>
    <LastName>Barzdukas</LastName>
    <FirstName>Gytis</FirstName>
    <EnrollmentDate>2005-09-01T00:00:00</EnrollmentDate>
  </Person>
  <Person>
    ...

You can also have data from various tables in one single xml file, if that's appropriate for you (but beware of the already mentioned ordering issues). It's true that your test assembly may end up with dozens of such xml files, each containing quite a large amount of text data. But because the files are of very low complexity, and with the help of a little bit of Copy/Paste and Excel magic, this appears to be well manageable.

Executing some basic tests

Here are some of the possible tests that can be written with the above preparations in place:

private const string People = @"..\..\TestData\School.People.xml";
...

[Test, MultipleAsserts, TestsOn("PersonRepository.GetNameList")]
public void GetNameList_ListOrdering_ReturnsTheExpectedFullNames()
{
    InsertTestData(People);

    List<string> names =
        _personRepository.GetNameList(NameOrdering.List);

    Assert.Count(34, names);
    Assert.AreEqual("Abercrombie, Kim", names.First());
    Assert.AreEqual("Zheng, Roger", names.Last());
}

[Test, MultipleAsserts, TestsOn("PersonRepository.GetNameList")]
[DependsOn("RemovePerson_CalledOnce_DecreasesCountByOne")]
public void GetNameList_NormalOrdering_ReturnsTheExpectedFullNames()
{
    InsertTestData(People);

    List<string> names =
        _personRepository.GetNameList(NameOrdering.Normal);

    Assert.Count(34, names);
    Assert.AreEqual("Alexandra Walker", names.First());
    Assert.AreEqual("Yan Li", names.Last());
}

[Test, TestsOn("PersonRepository.AddPerson")]
public void AddPerson_CalledOnce_IncreasesCountByOne()
{
    InsertTestData(People);

    int count = _personRepository.Count;

    _personRepository.AddPerson(new Person { FirstName = "Thomas", LastName = "Weller" });

    Assert.AreEqual(count + 1, _personRepository.Count);
}

[Test, TestsOn("PersonRepository.RemovePerson")]
public void RemovePerson_CalledOnce_DecreasesCountByOne()
{
    InsertTestData(People);

    int count = _personRepository.Count;

    _personRepository.RemovePerson(new Person { PersonID = 33 });

    Assert.AreEqual(count - 1, _personRepository.Count);
}

Not much difference here compared to the corresponding Typemock versions, except that we had to do a bit more preparatory work (and it was also harder to acquire the required knowledge). But this picture changes quite dramatically if we look at some more demanding test cases:

Ok, and what if things are becoming somewhat more complex?

Tests like the above ones represent the 'easy' scenarios. They may account for the biggest portion of real-world use cases of the application, and they are important to make sure that it is generally sound. But usually, the nasty little bugs originate from the more complex parts of our code, or they occur when something goes wrong. So, for a testing strategy to be of real practical use, it is especially important to see how easy or difficult it is to mimic a scenario which represents a more complex or exceptional case.
The following test, for example, deals with the case of invalid input from the caller:

[Test, MultipleAsserts, TestsOn("PersonRepository.GetCourseMembers")]
[Row(null, typeof(ArgumentNullException))]
[Row("", typeof(ArgumentException))]
[Row("NotExistingCourse", typeof(ArgumentException))]
public void GetCourseMembers_WithGivenVariousInvalidValues_Throws(string courseTitle, Type expectedInnerExceptionType)
{
    var exception = Assert.Throws<RepositoryException>(() =>
                        _personRepository.GetCourseMembers(courseTitle));

    Assert.IsInstanceOfType(expectedInnerExceptionType, exception.InnerException);
}

Apparently, this test doesn't need an 'Arrange' part at all (see here for the same test with the Typemock tool). It acts just like any other client code, and all the required business logic comes from the database itself. This doesn't necessarily mean that there is less complexity, but only that the complexity lives in a different part of your test resources (namely in the xml files, where you sometimes have to spend a lot of effort on carefully preparing the required test data). Another example, which relies on an underlying 1:n relationship, might be this:

[Test, MultipleAsserts, TestsOn("PersonRepository.GetCourseMembers")]
public void GetCourseMembers_WhenGivenAnExistingCourse_ReturnsListOfStudents()
{
    InsertTestData(People, Course, Department, StudentGrade);

    List<Person> persons = _personRepository.GetCourseMembers("Macroeconomics");

    Assert.Count(4, persons);
    Assert.ForAll(
        persons,
        p => new[] { 10, 11, 12, 14 }.Contains(p.PersonID),
        "Person has none of the expected IDs.");
}

If you compare this test to its corresponding Typemock version, you immediately see that the test itself is much simpler, easier to read, and thus much more intention-revealing. The complexity here lies hidden behind the call to the InsertTestData() helper method and in the content of the xml files that hold the test data. Also note that you might have to provide additional data which is not even directly relevant to your test, but is required only to satisfy integrity constraints of the underlying database.

Conclusion

The first thing to notice when comparing the NDbUnit approach to its Typemock counterpart obviously deals with performance: of course, NDbUnit is much slower than Typemock. Technically, it doesn't even make sense to compare the two tools. But practically, it may well play a role and may or may not be an issue, depending on how many tests of this kind you have, how often you run them, and what role they play in your development cycle. Also, because the dataset from the required xsd file must fully match the database schema (even in parts that otherwise wouldn't be relevant to you), it can be quite cumbersome to be in a team where different people are working with the database in parallel.
My personal experience is – as already said in the first part – that Typemock gives you a better development experience in a 'dynamic' scenario (when you're working in some kind of TDD style, often executing the tests from your dev box, and your database schema changes frequently), whereas the NDbUnit approach is a good and solid solution in more 'static' development scenarios (when you need to execute the tests less frequently or only on a separate build server, and/or the underlying database schema can be kept relatively stable), for example some variations of higher-level integration or user-acceptance tests. But in any case, opening Entity Framework-based applications for testing requires a fair amount of resources, planning, and preparatory work – it's definitely not the kind of stuff that you would call 'easy to test'. Hopefully, future versions of EF will take testing concerns into account. Otherwise, I don't see too much of a future for the framework in the long run, even though it's quite popular at the moment...

The sample solution

A sample solution (VS 2010) with the code from this article series is available via my Bitbucket account from here. (Bitbucket is a hosting site for Mercurial repositories; the repositories may also be accessed with the Git and Subversion SCMs – consult the documentation for details. In addition, it is possible to download the solution simply as a zipped archive, via the 'get source' button on the very right.) The solution contains some more tests against the PersonRepository class, which are not shown here. Also, it contains database scripts to create and fill the School sample database. To compile and run, the solution expects the Gallio/MbUnit framework to be installed (which is free and can be downloaded from here), the NDbUnit framework (which is also free and can be downloaded from here), and the Typemock Isolator tool (a fully functional 30-day trial is available here). Moreover, you will need an instance of the Microsoft SQL Server DBMS, and you will have to adapt the connection strings in the test projects' App.config files accordingly.
