Search Results

Search found 16971 results on 679 pages for 'blogs'.


  • Java Spotlight Episode 150: James Gosling on Java

    - by Roger Brinkley
    Interview with James Gosling, father of Java and Java Champion, on the history of Java, his work at Liquid Robotics, NetBeans, the future of Java, and what he sees as the next revolutionary trend in the computer industry. Right-click or Control-click to download this MP3 file. You can also subscribe to the Java Spotlight Podcast Feed to get the latest podcast automatically. If you use iTunes you can open iTunes and subscribe with this link: Java Spotlight Podcast in iTunes. Show Notes Feature Interview James Gosling received a BSc in Computer Science from the University of Calgary, Canada in 1977. He received a PhD in Computer Science from Carnegie Mellon University in 1983. The title of his thesis was "The Algebraic Manipulation of Constraints". He spent many years as a VP & Fellow at Sun Microsystems. He has built satellite data acquisition systems, a multiprocessor version of Unix, several compilers, mail systems, and window managers. He has also built a WYSIWYG text editor, a constraint-based drawing editor, and a text editor called `Emacs' for Unix systems. At Sun his early activity was as lead engineer of the NeWS window system. He did the original design of the Java programming language and implemented its original compiler and virtual machine. He has been a contributor to the Real-Time Specification for Java and a researcher at Sun Labs, where his primary interest was software development tools. He was then the Chief Technology Officer of Sun's Developer Products Group and the CTO of Sun's Client Software Group. He briefly worked for Oracle after the acquisition of Sun. After a year off, he spent some time at Google and is now the chief software architect at Liquid Robotics, where he spends his time writing software for the Wave Glider, an autonomous ocean-going robot.

    Read the article

  • Upgrade to Oracle 11g Webcast - 14/04/2010

    - by Alex Blyth
    Hi all,
    Here are the details for Wednesday's (14th April 2010) webcast on "Upgrading to Oracle 11g", beginning at 1.30pm (Sydney, Australia time):
    Webcast: http://strtc.oracle.com (IE6, 7 & 8 supported only)
    Conference ID for the webcast: 6690662
    Conference Key: upgrade
    Enrollment is required. Please click here to enroll. Please use your real name in the name field (it just makes it easier for us to help you out if we can't answer your questions on the call).
    Audio details:
    NZ Toll Free - 0800 888 157 or
    AU Toll Free - 1800 420 354 (or +61 2 8064 0613)
    Meeting ID: 7914841
    Meeting Passcode: 14042010
    Talk to you all Wednesday
    Alex

    Read the article

  • Oracle and ATG: The Next Generation of Customer Experience

    - by divya.malik
    Oracle today announced that it has completed the acquisition of Art Technology Group (ATG), Inc. In a webcast this morning, Thomas Kurian, Executive Vice President, Oracle; Anthony Lye, Senior Vice President, CRM at Oracle; and Ken Volpe, Senior Vice President of Products and Technology at ATG, presented the rationale, strategy, and future direction for this acquisition. ATG is a leading e-commerce service provider and Oracle is a leading CRM and retail applications provider, which makes for a winning team. There has been a lot of positive feedback from analysts and press as well as customers. “As a customer of both Oracle and ATG, we view the integration of the two companies as a natural fit,” said Kevin Cunnington, Global Head of Online, Vodafone Group. “We look forward to new efficiencies that address our online and cross-channel business strategies and help us further provide superior customer experiences.” For more information about Oracle and ATG: Overview and FAQs Webcast Press Release Technorati Tags: oracle,oracle siebel crm,atg,crm

    Read the article

  • Scrum for Team Foundation Server 2010

    - by Martin Hinshelwood
    I will be presenting a session on “Scrum for TFS 2010” not once, but twice! If you are going to be at the Aberdeen Partner Group meeting on 27th April, or DDD Scotland on 8th May, then you may be able to catch my session. Credit: I want to give special thanks to Aaron Bjork from Microsoft, who provided me with most of my material. He is a Scrum and PowerPoint genius. Scrum for Team Foundation Server 2010 Synopsis: Visual Studio ALM (formerly Visual Studio Team System (VSTS)) and Team Foundation Server (TFS) are the cornerstones of development on the Microsoft .NET platform. These are the best tools for a team to have successful projects and for the developers to have a focused and smooth software development process. For TFS 2010 Microsoft is heavily investing in Scrum and has already started moving some teams across to using it. Martin will not be going in depth with Scrum, but you can find out more about Scrum by reading the Scrum Guide, and you can even assess your Scrum knowledge by having a go at the Scrum Open Assessment. Come and see Martin Hinshelwood, Visual Studio ALM MVP and Solution Architect from SSW, show you:
    How to successfully gather requirements with user stories
    How to plan a project using TFS 2010 and Scrum
    How to work with a product backlog in TFS 2010
    The right way to plan a sprint with TFS 2010
    Tracking your progress
    The right way to use work items
    What you can use from the built-in reporting as well as the project portals available from the SharePoint dashboard
    The important reports to give your Product Owner / Project Manager
    Walk away knowing how to see the project health and progress. Visual Studio ALM is designed to help address many of these traditional problems faced by teams. It does so by providing a set of integrated tools to help teams improve their software development activities and to help managers better support the software development processes. During this session we will cover the lifecycle of creating work items and how this fits into Scrum using Visual Studio ALM and Team Foundation Server. If you want to know more about how to do Scrum with TFS, there is a new course, created in collaboration with Microsoft and Scrum.org, that is going to be the official course for working with TFS 2010. SSW has Professional Scrum Developer Trainers who specialise in training your developers in implementing Scrum with Microsoft's Visual Studio ALM tools. Ken Schwaber and Sam Guckenheimer: Professional Scrum Development. Technorati Tags: Scrum,VS ALM,VS 2010,TFS 2010

    Read the article

  • Inserting and Deleting Sub Rows in GridView

    - by Vincent Maverick Durano
    A user in the forums (http://forums.asp.net) is asking how to insert sub rows in a GridView and also add delete functionality for the inserted sub rows. In this post I'm going to demonstrate how to do this in ASP.NET WebForms. The basic idea is that we just need to insert row data into the DataSource that is being used by the GridView, since the GridView rows will be generated based on the DataSource data. To make this clearer, let's build a sample application. To start, fire up Visual Studio, create a WebSite or Web Application project, and then add a new WebForm. In the WebForm ASPX page add this GridView markup:

    <asp:gridview ID="GridView1" runat="server" AutoGenerateColumns="false" onrowdatabound="GridView1_RowDataBound">
        <Columns>
            <asp:BoundField DataField="RowNumber" HeaderText="Row Number" />
            <asp:TemplateField HeaderText="Header 1">
                <ItemTemplate>
                    <asp:TextBox ID="TextBox1" runat="server"></asp:TextBox>
                </ItemTemplate>
            </asp:TemplateField>
            <asp:TemplateField HeaderText="Header 2">
                <ItemTemplate>
                    <asp:TextBox ID="TextBox2" runat="server"></asp:TextBox>
                </ItemTemplate>
            </asp:TemplateField>
            <asp:TemplateField HeaderText="Header 3">
                <ItemTemplate>
                    <asp:TextBox ID="TextBox3" runat="server"></asp:TextBox>
                </ItemTemplate>
            </asp:TemplateField>
            <asp:TemplateField HeaderText="Action">
                <ItemTemplate>
                    <asp:LinkButton ID="LinkButton1" runat="server" onclick="LinkButton1_Click" Text="Insert"></asp:LinkButton>
                </ItemTemplate>
            </asp:TemplateField>
        </Columns>
    </asp:gridview>

    Then in the code-behind of the ASPX page you can add the code below:

    // Requires: using System; using System.Data; using System.Web.UI.WebControls;

    private DataTable FillData() {
        DataTable dt = new DataTable();
        DataRow dr = null;

        // Create the DataTable column
        dt.Columns.Add(new DataColumn("RowNumber", typeof(string)));

        // Create a row for each value
        dr = dt.NewRow();
        dr["RowNumber"] = 1;
        dt.Rows.Add(dr);

        dr = dt.NewRow();
        dr["RowNumber"] = 2;
        dt.Rows.Add(dr);

        dr = dt.NewRow();
        dr["RowNumber"] = 3;
        dt.Rows.Add(dr);

        dr = dt.NewRow();
        dr["RowNumber"] = 4;
        dt.Rows.Add(dr);

        dr = dt.NewRow();
        dr["RowNumber"] = 5;
        dt.Rows.Add(dr);

        // Store the DataTable in ViewState for future reference
        ViewState["CurrentTable"] = dt;

        return dt;
    }

    private void BindGridView(DataTable dtSource) {
        GridView1.DataSource = dtSource;
        GridView1.DataBind();
    }

    private DataRow InsertRow(DataTable dtSource, string value) {
        DataRow dr = dtSource.NewRow();
        dr["RowNumber"] = value;
        return dr;
    }
    //private DataRow DeleteRow(DataTable dtSource,

    protected void Page_Load(object sender, EventArgs e) {
        if (!IsPostBack) {
            BindGridView(FillData());
        }
    }

    protected void LinkButton1_Click(object sender, EventArgs e) {
        LinkButton lb = (LinkButton)sender;
        GridViewRow row = (GridViewRow)lb.NamingContainer;
        DataTable dtCurrentData = (DataTable)ViewState["CurrentTable"];
        if (lb.Text == "Insert") {
            // Insert a new row below the selected row
            dtCurrentData.Rows.InsertAt(InsertRow(dtCurrentData, row.Cells[0].Text + "-sub"), row.RowIndex + 1);
        }
        else {
            // Delete the selected sub row
            dtCurrentData.Rows.RemoveAt(row.RowIndex);
        }

        BindGridView(dtCurrentData);
        ViewState["CurrentTable"] = dtCurrentData;
    }

    protected void GridView1_RowDataBound(object sender, GridViewRowEventArgs e) {
        if (e.Row.RowType == DataControlRowType.DataRow) {
            if (e.Row.Cells[0].Text.Contains("-sub")) {
                // Inserted sub rows get a Delete action instead of Insert
                ((LinkButton)e.Row.FindControl("LinkButton1")).Text = "Delete";
            }
        }
    }

    As you can see, the code above is pretty straightforward and self-explanatory, but to give you a short explanation: it is composed of three private methods, FillData(), BindGridView(), and InsertRow(). The FillData() method returns a DataTable and creates dummy data in the DataTable to be used as the GridView DataSource. You can replace the code in that method if you want to use actual data from a database, but for the purpose of this example I just fill the DataTable with dummy data. BindGridView() handles the actual binding of the GridView. InsertRow() returns a DataRow and handles the insertion of the sub row. Now in the LinkButton OnClick event, we cast the sender to a LinkButton to determine the specific object that fired the event and get the row values. We then reference the data from ViewState to get the current data being used by the GridView. If the LinkButton text is "Insert", we insert a new row into the DataSource (in this case the DataTable) based on the row index; otherwise we delete the sub row that was added. Here are some screen shots of the output below: On initial load. After inserting a sub row. That's it! I hope someone finds this post useful! Technorati Tags: ASP.NET,C#,GridView

    Read the article

  • Syntax Recognition for XML-Based Languages in Oracle JDeveloper

    - by Ramkumar Menon
    Thanks to Jeffrey Stephenson. If you are looking at using any of the newer XML-based languages, let's say DocBook XML, or XProc, or what not, you can make use of JDeveloper's syntax highlighting and completion insight features to save those extra keystrokes. All you need is a URL or local copy of the XML Schema for the language. Once you have it, you can register it via Tools --> Preferences --> XML Schemas. Remember to provide a new extension name (using the default .xml extension did not work for me); I provided my own extension, .dbk, for my DocBook files. Once you save these settings, you can create new files that conform to the schema, and you get validation, completion insight, and prompting for free.

    Read the article

  • OWB 11gR2 &ndash; Degenerate Dimensions

    - by David Allan
    Ever wondered how to build degenerate dimensions in OWB and get the benefits of slowly changing dimensions and cube loading? It's now possible through some changes in 11gR2 that make dimension and cube loading much more flexible. This lets you get the benefits of OWB's surrogate key handling and slowly changing dimension references when loading the fact table while still using degenerate dimensions (see Ralph Kimball's degenerate dimensions design tip). Here we will see how to use the cube operator to load slowly changing, regular, and degenerate dimensions. The cube and cube operator can now work with dimensions which have no surrogate key as well as dimensions with surrogates, so you can get the benefit of the cube loading and incorporate the degenerate dimension loading. What you need to do is create a dimension in OWB that is purely used for ETL metadata; the dimension itself is never deployed (its table is, but holds no data), it has no surrogate keys, and it has a single level with a business attribute (the degenerate dimension data) and a dummy attribute, say description, just to pass the OWB validation. When this degenerate dimension is added into a cube, you will need to configure the fact table created and set the 'Deployable' flag to FALSE for the foreign key generated to the degenerate dimension table. The degenerate dimension reference will then be in the cube operator and used when matching.
    Create the degenerate dimension using the regular wizard.
    Delete the Surrogate ID attribute; this is not needed.
    Define a level name for the dimension member (any name).
    After the wizard has completed, in the editor delete the hierarchy STANDARD that was automatically generated; there is only a single level, so there is no need for a hierarchy and it shouldn't really be created.
    Deploy the implementing table DD_ORDERNUMBER_TAB; this needs to be deployed but with no data (the mapping here will do a left outer join of the source data with the empty degenerate dimension table).
    Now go ahead and build your cube; use the regular TIMES dimension, for example, and your degenerate dimension DD_ORDERNUMBER, and you can add in SCD dimensions etc.
    Configure the fact table created and set Deployable to false, so the foreign key does not get generated.
    You can now use the cube in a mapping and load data into the fact table via the cube operator; this will look after surrogate lookups and slowly changing dimension references.
    If you generate the SQL you will see that the ON clause for matching includes the columns representing the degenerate dimension. Here we have seen how this use case for loading fact tables using degenerate dimensions becomes a whole lot simpler using OWB 11gR2. I'm sure there are other use cases where this mix of dimensions with surrogate and regular identifiers is useful; fact tables partitioned by date columns is another classic example that this will greatly help with and that makes the cube operator much more useful. Good to hear any comments.

    Read the article

  • You should NOT be writing jQuery in SharePoint if&hellip;

    - by Mark Rackley
    Yes… another one of these posts. What can I say? I’m a pot stirrer.. a rabble rouser *rabble rabble* jQuery in SharePoint seems to be a fairly polarizing issue with one side thinking it is the most awesome thing since Princess Leia as the slave girl in Return of the Jedi and the other half thinking it is the worst idea since Mannequin 2: On the Move. The correct answer is OF COURSE “it depends”. But what are those deciding factors that make jQuery an awesome fit or leave a bad taste in your mouth? Let’s see if I can drive the discussion here with some polarizing comments of my own… I know some of you are getting ready to leave your comments even now before reading the rest of the blog, which is great! Iron sharpens iron… These discussions hopefully open us up to understanding the entire process better and think about things in a different way. You should not be writing jQuery in SharePoint if you are not a developer… Let’s start off with my most polarizing and rant filled portion of the blog post. If you don’t know what you are doing or you don’t have a background that helps you understand the implications of what you are writing then you should not be writing jQuery in SharePoint! I truly believe that one of the biggest reasons for the jQuery haters is because of all the bad jQuery out there. If you don’t know what you are doing you can do some NASTY things! One of the best stories I’ve heard about this is from my good friend John Ferringer (@ferringer). John tells this story during our Mythbusters session we do together. One of his clients was undergoing a Denial of Service attack and they couldn’t figure out what was going on! After much searching they found that some genius jQuery developer wrote some code for an image rotator, but did not take into account what happens when there are no images to load! The code just kept hitting the servers over and over and over again which prevented anything else from getting done! Now, I’m NOT saying that I have not done the same sort of thing in the past or am immune from such mistakes. My point is that if you don’t know what you are doing, there are very REAL consequences that can have a major impact on your organization AND they will be hard to track down.  Think how happy your boss will be after you copy and pasted some jQuery from a blog without understanding what it does, it brings down the farm, AND it takes them 3 days to track it back to you.  :/ Good times will not be had. Like it or not JavaScript/jQuery is a programming language. While you .NET people sit on your high horses because your code is compiled and “runs faster” (also debatable), the rest of us will be actually getting work done and delivering solutions while you are trying to figure out why your widget won’t deploy. I can pick at that scab because I write .NET code too and speak from experience. I can do both, and do both well. So, I am not speaking from ignorance here. In JavaScript/jQuery you have variables, loops, conditionals, functions, arrays, events, and built in methods. If you are not a developer you just aren’t going to take advantage of all of that and use it correctly. Ahhh.. but there is hope! There is a lot of jQuery resources out there to help you learn and learn well! There are many experts on the subject that will gladly tell you when you are smoking crack. I just this minute saw a tweet from @cquick with a link to: “jQuery Fundamentals”. I just glanced through it and this may be a great primer for you aspiring jQuery devs. 
Take advantage of all the resources and become a developer! Hey, it will look awesome on your resume right? You should not be writing jQuery in SharePoint if it depends too much on client resources for a good user experience I’ve said it once and I’ll say it over and over until you understand. jQuery is executed on the client’s computer. Got it? If you are looping through hundreds of rows of data, searching through an enormous DOM, or performing many calculations it is going to take some time! AND if your user happens to be sitting on some old PC somewhere that they picked up at a garage sale their experience will be that much worse! If you can’t give the user a good experience they will not use the site. So, if jQuery is causing the user to have a bad experience, don’t use it. I sometimes go as far to say that you should NOT go to jQuery as a first option for external facing web sites because you have ZERO control over what the end user’s computer will be. You just can’t guarantee an awesome user experience all of the time. Ahhh… but you have no choice? (where have I heard that before?). Well… if you really have no choice, here are some tips to help improve the experience: Avoid screen scraping This is not 1999 and SharePoint is not an old green screen from a mainframe… so why are you treating it like it is? Screen scraping is time consuming and client intensive. Take advantage of tools like SPServices to do your data retrieval when possible. Fine tune your DOM searches A lot of time can be eaten up just searching the DOM and ignoring table rows that you don’t need. Write better jQuery to only loop through tables rows that you need, or only access specific elements you need. Take advantage of Element ID’s to return the one element you are looking for instead of looping through all the DOM over and over again. Write better jQuery Remember this is development. Think about how you can write cleaner, faster jQuery. This directly relates to the previous point of improving your DOM searches, but also when using arrays, variables and loops. Do you REALLY need to loop through that array 3 times? How can you knock it down to 2 times or even 1? When you have lots of calculations and data that you are manipulating every operation adds up. Think about how you can streamline it. Back in the old days before RAM was abundant, Cores were plentiful and dinosaurs roamed the earth, us developers had to take performance into account in everything we did. It’s a lost art that really needs to be used here. You should not be writing jQuery in SharePoint if you are sending a lot of data over the wire… Developer:  “Awesome… you can easily call SharePoint’s web services to retrieve and write data using SPServices!” Administrator: “Crap! you can easily call SharePoint’s web services to retrieve and write data using SPServices!” SPServices may indeed be the best thing that happened to SharePoint since the invention of SharePoint Saturdays by Godfather Lotter… BUT you HAVE to use it wisely! (I REFUSE to make the Spiderman reference). If you do not know what you are doing your code will bring back EVERY field and EVERY row from a list and push that over the internet with all that lovely XML wrapped around it. That can be a HUGE amount of data and will GREATLY impact performance! Calling several web service methods at the same time can cause the same problem and can negatively impact your SharePoint servers. 
These problems, thankfully, are not difficult to rectify if you are careful: Limit list data retrieved Use CAML to reduce the number of rows returned and limit the fields returned using ViewFields.  You should definitely be doing this regardless. If you aren’t I hope your admin thumps you upside the head. Batch large list updates You may or may not have noticed that if you try to do large updates (hundreds of rows) that the performance is either completely abysmal or it fails over half the time. You can greatly improve performance and avoid timeouts by breaking up your updates into several smaller updates. I don’t know if there is a magic number for best performance, it really depends on how much data you are sending back more than the number of rows. However, I have found that 200 rows generally works well.  Play around and find the right number for your situation. Delay Web Service calls when possible One of the cool things about jQuery and SPServices is that you can delay queries to the server until they are actually needed instead of doing them all at once. This can lead to performance improvements over DataViewWebParts and even .NET code in the right situations. So, don’t load the data until it’s needed. In some instances you may not need to retrieve the data at all, so why retrieve it ALL the time? You should not be writing jQuery in SharePoint if there is a better solution… jQuery is NOT the silver bullet in SharePoint, it is not the answer to every question, it is just another tool in the developers toolkit. I urge all developers to know what options exist out there and choose the right one! Sometimes it will be jQuery, sometimes it will be .NET,  sometimes it will be XSL, and sometimes it will be some other choice… So, when is there a better solution to jQuery? When you can’t get away from performance problems Sometimes jQuery will just give you horrible performance regardless of what you do because of unavoidable obstacles. In these situations you are going to have to figure out an alternative. Can I do it with a DVWP or do I have to crack open Visual Studio? When you need to do something that jQuery can’t do There are lots of things you can’t do in jQuery like elevate privileges, event handlers, workflows, or interact with back end systems that have no web service interface. It just can’t do everything. When it can be done faster and more efficiently another way Why are you spending time to write jQuery to do a DataViewWebPart that would take 5 minutes? Or why are you trying to implement complicated logic that would be simple to do in .NET? If your answer is that you don’t have the option, okay. BUT if you do have the option don’t reinvent the wheel! Take advantage of the other tools. The answer is not always jQuery… sorry… the kool-aid tastes good, but sweet tea is pretty awesome too. You should not be using jQuery in SharePoint if you are a moron… Let’s finish up the blog on a high note… Yes.. it’s true, I sometimes type things just to get a reaction… guess this section title might be a good example, but it feels good sometimes just to type the words that a lot of us think… So.. don’t be that guy! Another good buddy of mine that works for Microsoft told me. “I loved jQuery in SharePoint…. until I had to support it.”. He went on to explain that some user was making several web service calls on a page using jQuery and then was calling Microsoft and COMPLAINING because the page took so long to load… DUH! 
What do you expect to happen when you are pushing that much data over the wire and are making that many web service calls at once!! It’s one thing to write that kind of code and accept it’s just going to take a while, it’s COMPLETELY another issue to do that and then complain when it’s not lightning fast!  Someone’s gene pool needs some chlorine. So, I think this is a nice summary of the blog… DON’T be that guy… don’t be a moron. How can you stop yourself from being a moron? Ah.. glad you asked, here are some tips: Think Is jQuery the right solution to my problem? Is there a better approach? What are the implications and pitfalls of using jQuery in this situation? Search What are others doing? Does someone have a better solution? Is there a third party library that does the same thing I need? Plan Write good jQuery. Limit calculations and data sent over the wire and don’t reinvent the wheel when possible. Test Okay, it works well on your machine. Try it on others ESPECIALLY if this is for an external site. Test with empty data. Test with hundreds of rows of data. Test as many scenarios as possible. Monitor those server resources to see the impact there as well. Ask the experts As smart as you are, there are people smarter than you. Even the experts talk to each other to make sure they aren't doing something stupid. And for the MOST part they are pretty nice guys. Marc Anderson and Christophe Humbert are two guys who regularly keep me in line. Make sure you aren’t doing something stupid. Repeat So, when you think you have the best solution possible, repeat the steps above just to be safe.  Conclusion jQuery is an awesome tool and has come in handy on many occasions. I’m even teaching a 1/2 day SharePoint & jQuery workshop at the upcoming SPTechCon in Boston if you want to berate me in person. However, it’s only as awesome as the developer behind the keyboard. It IS development and has its pitfalls. Knowledge and experience are invaluable to giving the user the best experience possible.  Let’s face it, in the end, no matter our opinions, prejudices, or ego providing our clients, customers, and users with the best solution possible is what counts. Period… end of sentence…

    Read the article

  • John Hitchcock of Pace Describes the Oracle Agile PLM Customer Experience

    John Hitchcock, Senior Manager of Configuration Management at Pace (formerly 2Wire, Inc.), sat down for an interview during Oracle's Innovation Summit with Kerrie Foy, Manager of PLM Product Marketing at Oracle. Learn why his organization upgraded to the latest version of Agile and expanded the footprint to achieve impressive savings and productivity gains across the global, networked product value-chain.

    Read the article

  • The Next Wave of PeopleSoft Capabilities for the Staffing Industry Is Here

    - by Mark Rosenberg
    With the release of PeopleSoft Financials and Supply Chain Management 9.1 Feature Pack 2 in January this year, we introduced substantial new capabilities for our Staffing Industry customers. Through a co-development project with Infosys Limited, we have enriched Oracle's PeopleSoft Staffing Solution with new tools aimed at accelerating and improving the quality of job order fulfillment, increasing branch recruiter productivity, and driving profitable growth. Staffing industry firms succeed based on their ability to rapidly, cost-effectively, and continually fill their pipelines with new clients and job orders, recruit the best talent, and match orders with talent. Pressure to execute in each of these functional areas is even more acute on staffing firms as contingent labor becomes a more substantial and permanent part of the workforce mix. In an industry that creates value through speedy execution, there is little room for manual, inefficient processes and brittle, custom integrations, which throttle profitability and growth. The latest wave of investment in the PeopleSoft Staffing Solution focuses on generating efficiency and flexibility for our customers. Simplicity To operate profitably and continue growing, a Staffing enterprise needs its client management, recruiting, order fulfillment, and other processes to function in harmony. Most importantly, they need to be simple for recruiters, branch managers, and applicants to access and understand. The latest PeopleSoft Staffing Solution set of enhancements includes numerous automated defaulting mechanisms and information-rich dashboard pagelets that even a new employee can learn quickly. Pending Applicant, Agenda management, Search, and other pagelets are just a few of the newest, easy-to-use tools that not only aggregate and summarize information, but also provide instant access to applicants, tasks, and key reports for branch staff. Productivity The leading firms in the Staffing industry are those that can more efficiently orchestrate large numbers of candidates, clients, and orders than their competitors can. PeopleSoft Financials and Supply Chain Management 9.1 Feature Pack 2 delivers productivity boosters that Staffing firms can leverage to streamline tasks and processes for competitive advantage. For example, we enhanced the Recruiting Funnel, which manages the candidate on-boarding process, with a highly interactive user interface. It integrates disparate Staffing business processes and exploits new PeopleTools technologies to offer a superior on-boarding user experience. Automated creation of agenda items and assignment tasks for each candidate minimizes setup and organizes assignment steps for the on-boarding process. Mass updates of tasks and instant access to the candidate overview page (which we also expanded), candidate event status, event counts, and other key data enable recruiters to better serve clients and candidates. Lower TCO Constructing and maintaining an efficient yet flexible labor supply chain can be complicated, let alone expensive. Traditionally, Staffing firms have been challenged in controlling their technology cost of ownership because connecting candidate and client-facing tools involved building and integrating custom applications and technologies and managing staff turnover, placing heavy demands on IT and support staff. With PeopleSoft Financials and Supply Chain Management 9.1 Feature Pack 2, there are two major enhancements that aggressively tackle these challenges. 
First, we added another integration framework to enable cost-effective linking of the Staffing firm’s PeopleSoft applications and its job board distributors. (The first PeopleSoft 9.1 Feature Pack released in March 2011 delivered an integration framework to connect to resume parsing providers.) Second, we introduced the teaming concept to enable work to be partitioned to groups, as well as individuals. These two capabilities, combined with a host of others, position Staffing firms to configure and grow their businesses without growing their IT and overhead expenditures. For our Staffing Industry customers, PeopleSoft Financials and Supply Chain Management 9.1 Feature Pack 2 is loaded with high-value tools aimed at enabling and sustaining a flexible labor supply chain. For more information, contact [email protected] or [email protected].

    Read the article

  • Training on Demand Certification Packages for DBAs

    - by Antoinette O'Sullivan
    The demand for Database Administrators continues to grow.* Almost two-thirds of IT hiring managers indicate that they highly value certifications in validating IT skills and expertise.**
    * Job satisfaction and DBA work growth rate: CNN Money's 2011 Best Jobs in America survey.
    ** Survey among nearly 1,700 respondents by CompTIA, the nonprofit trade association for the IT industry, cited in Certification Magazine, Feb. 14th, 2012.
    Get Certified with Training on Demand
    Are you an experienced database professional eager to achieve certification? Is time your most precious resource? Then try our new Training On Demand Certification Value Packages with a 20% discount. These all-in-one packages give you everything you need to get certified with success.
    Why Training On Demand:
    Expert training from Oracle's top instructors
    Sophisticated streaming video recording
    Available for 90 days, 24 hours a day, 7 days a week
    Whiteboarding and training labs for hands-on experience
    Start, stop, pause, jump or rewind sections of the course as needed
    Oracle University instructor Q&A
    A full-text search leads to the right video fragment in a matter of seconds. Watch this demo to see how it works.
    Additional certification resources: Benefits of Oracle Certification, Database Certification Paths, Available Database Certification Exams.
    Getting certified has never been easier! For assistance contact your local Oracle University Service Desk. Many organizations deploy both Oracle Database and MySQL side by side to serve different needs, and as a database professional you can find training courses on both topics at Oracle University! Check out the upcoming Oracle Database 11g training courses and MySQL training courses. Even if you're only managing Oracle Databases at this point in time, getting familiar with MySQL Database will broaden your career path given growing job demand.
    These Value Packages are also available in the following training formats: In-Class, Live Virtual Class and Self-Study. For each package your savings are: save 5% on the In Class Edition and save 20% on the Live Virtual Class, Self-Study and Training On Demand editions, plus get a FREE retake.
    MySQL Database Administration Value Packages
    MySQL Database Administrator Certification Value Package (In Class, Live Virtual Class, Self-Study, Training On Demand)
    MySQL Developer Value Packages
    MySQL Developer Certification Value Package (In Class, Live Virtual Class)
    Oracle Database 10g Value Packages
    Oracle Database 10g Administrator Certified Associate Certification Value Package (In Class, Live Virtual Class, Self-Study)
    Oracle Database 10g Administrator Certified Professional Certification Value Package (In Class, Live Virtual Class, Self-Study)
    Oracle Database 11g Value Packages
    Oracle Database 11g Administrator Certified Associate Certification Value Package (In Class, Live Virtual Class, Self-Study, Training On Demand)
    Oracle Database 11g Administrator Certified Professional Certification Value Package (In Class, Live Virtual Class, Self-Study, Training On Demand)
    Exam Prep Seminar Value Package: Oracle Database Admin 1 (Training On Demand)
    Oracle Database 11g Administrator Certified Professional UPGRADE Certification Value Package (Training On Demand)
    Oracle Real Application Clusters 11g and Grid Infrastructure Administration Certified Expert Certification Value Package (Training On Demand)
    Exam Prep Seminar Value Package: Oracle Database Admin 2 (Training On Demand)
    Exam Prep Seminar Value Package: Oracle RAC 11g and Grid Infrastructure Administration (Training On Demand)
    Exam Prep Seminar Value Package: Upgrade Oracle Certified Professional (OCP) to Oracle Database 11g (Training On Demand)
    SQL and PL/SQL Value Packages
    Oracle Database SQL Expert Certification Value Package (In Class, Live Virtual Class, Self-Study, Training On Demand)
    Exam Prep Seminar Value Package: Oracle Database SQL (Training On Demand)
    View our Certification Value Packages. Mention this code at the time of booking: E1245. Connect: For a full list of MySQL Training courses and events, go to http://oracle.com/education/mysql.

    Read the article

  • OTBI vs. OBIA

    - by PRajkumar
    What are the differences between OTBI and OBIA?
    OTBI -- Oracle Transactional Business Intelligence
    OBIA -- Oracle Business Intelligence Applications
    OBIA
    1. OBIA is the pre-packaged BI Apps offering that Oracle has provided for several years. It is a data-warehouse-based solution.
    2. It is based on a universal data warehouse design with different prebuilt adapters that can connect to various source applications to bring the data into the warehouse.
    3. It allows consolidating data from various sources and bringing it together.
    4. It provides a library of metrics that help measure the business.
    5. It provides a set of predefined reports and dashboards.
    6. OBIA works with multiple sources including E-Business Suite, PeopleSoft, JDE, SAP and Fusion Applications.
    OTBI
    1. It is real-time BI.
    2. There is no warehouse or ETL process for OTBI.
    3. It is for Fusion Applications only.
    4. OTBI leverages advanced technologies from both the BI platform and ADF to enable online BI queries directly against the database.
    5. OTBI does not have prebuilt dashboards and reports like OBIA.
    Note: Both OTBI and OBIA are available from the same metadata repository. Some of the repository objects are shared between OTBI and OBIA. The design allows the following configurations:
    OTBI only
    OBIA only
    OTBI and OBIA coexist
    Both OTBI and OBIA access Fusion Apps via ADF.

    Read the article

  • Silverlight Cream for March 10, 2010 - 2 -- #811

    - by Dave Campbell
    In this Issue: AfricanGeek, Phil Middlemiss, Damon Payne, David Anson, Jesse Liberty, Jeremy Likness, Jobi Joy(-2-), Fredrik Normén, Bobby Diaz, and Mike Taulty(-2-). Shoutouts: Shawn Wildermuth blogged that they posted My "What's New in Silverlight 3" Video from 0reDev Last Fall Shawn Wildermuth also has a post up for his loyal followers: Where to See Me At MIX10 Jonas Follesø has presentation materials up as well: MVVM presentation from NDC2009 on Vimeo Adam Kinney updated his Favorite Tool and Library Downloads for Silverlight From SilverlightCream.com: Styling Silverlight ListBox with Blend 3 In his latest Video Tutorial, AfricanGeek is animating the ListBox control by way of Expression Blend 3. Animating the Silverlight Opacity Mask Phil Middlemiss has written a Behavior that lets you turn a FrameworkElement into an opacity mask for it's parent container... check out his tutorial and grab the code. AddRange for ObservableCollection in Silverlight 3 Damon Payne has a post up discussing the problem with large amounts of data in an ObservableCollection, and how using AddRange is a performance booster. Easily rotate the axis labels of a Silverlight/WPF Toolkit chart David Anson blogged a solution to rotating the axis labels of a Silverlight and WPF chart. Persisting the Configuration (Updated) Jesse Liberty has a good discussion on the continuation of his HyperVideo Platform talking about what all he is needing from the database in the form of configuration information... including the relationships. Animations and View Models: IAnimationDelegate Check out Jeremy Likness' IAnimationDelegate that lets your ViewModel fire and respond to animations without having to know all about them. Button Style - Silverlight Jobi Joy converted a WPF control template into Silverlight... and you'll want to download the XAML he's got for this :) A Simple Accordion banner using ListBox Jobi Joy also has an Image Accordian created in Expression Blend... and it's a 'drop this XAML in your User Control' kinda thing... again, go grab the XAML :) WCF RIA Services Silverlight Business Application – Using ASP.NET SiteMap for Navigation Fredrik Normén has a code-laden post up on RIA Services and the ASP.NET SiteMap. He is using the Silverlight Business app template that comes with WCF RIA Services. A Simple, Selectable Silverlight TextBlock (sort of)... Bobby Diaz shares with us his solution for a Text control that can be copied from in the same manner 'normal' web controls can be. He also includes a link to another post on the same topic. Silverlight 4 Beta Networking. Part 11 - WCF and TCP Mike Taulty has another pair of video tutorials up in his Networking series. This one is on WCF over TCP Silverlight 4 Beta Networking. Part 12 - WCF and Polling HTTP Mike Taulty's 12th networking video tutorial is on WCF with HTTP polling duplex. Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    MIX10

    Read the article

  • Debugging OWB generated SAP ABAP code executed through RFC

    - by Anil Menon
    Within OWB, if you need to execute ABAP code using RFC you will have to use the SAP function module RFC_ABAP_INSTALL_AND_RUN. This function module is specified during the creation of the SAP source location. Usually in a production environment a copy of this function module is used due to security restrictions. When you execute the mapping using this function module you can't see the actual ABAP code that is passed on to the SAP system. In case you want to take a look at the code that will be executed on the SAP system, you need to use a custom function module in SAP. The easiest way to do this is to make a copy of the function module RFC_ABAP_INSTALL_AND_RUN and call it, say, Z_TEST_FM. Then edit the code of the function module in SAP as below:

    FUNCTION Z_TEST_FM.
      DATA: BEGIN OF listobj OCCURS 20.
              INCLUDE STRUCTURE abaplist.
      DATA: END OF listobj.
      DATA: begin_of_line(72).
      DATA: line_end_char(1).
      DATA: line_length TYPE i.
      DATA: lin(72).
      LOOP AT program.
        APPEND program-line TO WRITES.
      ENDLOOP.
    ENDFUNCTION.

    Within OWB, edit the SAP location and use Z_TEST_FM as the "Execution Function Module" instead of RFC_ABAP_INSTALL_AND_RUN, then register this location. The mapping you want to debug will have to be deployed. After deployment you can right-click the mapping and click "Start". After clicking Start, the "Input Parameters" screen will be displayed. You can make changes here if you need to. Check that the parameter BACKGROUND is set to "TRUE". After clicking "OK" the log for the execution will be displayed. The execution of mappings will always fail when you use the above function module. Clicking on the "I" (information) icon, the ABAP code will be displayed. The ABAP code displayed is the code that is passed through the function module. You can also find the code by going through the log files on the server which hosts the OWB repository. The logs will be located under <OWB_HOME>/owb/log. Patch #12951045 is recommended while using the SAP Connector with OWB 11.2.0.2. For recommended patches for other releases please check with Oracle Support at http://support.oracle.com.

    Read the article

  • ISO Files to USB &ndash; The Cheap and Easy Way

    - by RonGarlit
    (DISCLAIMER: Yes, there is lots of more elegant ISO software besides the free Microsoft tool I'm about to show. But free is free, and it has been tested and works for me for making advanced bootable USB drives. That is another story. Look up Windows 8 Developer Preview for that one on BING.) Those of us that work with new technology all the time accumulate a lot of ISO files and have to burn them to CD/DVDs quite often. But we now have machines without burners in the corporate environment. We personally have netbooks and lightweight, highly mobile laptops that do not have DVD burners. USB ports are all the rage, and now we have USB 3.0, which is way faster than the 2.0 we are used to. Just looking at the technology, space savings, and the cost issues alone is reason enough to buy these answers to the DVD. So what is special about USB 2.0 and USB 3.0? USB 2 has a maximum speed of 480 Mbps... (That is Megabits per SECOND!!) Now look at the storage that we have with USB thumb drives that are now up to 64 GB in size, and cell phones and PDAs that have lots of internal storage built in, well above the 16 Gig range. At the max USB 2.0 speed of 480 Mbps a full transfer of data between devices can take a long time. Time is money, right? Ever back up an iPhone? Don't get me started. So at least the engineers have been planning ahead with USB 3.0, which offers a maximum transfer speed of 4.8 Gbps... (That is Gigabits per SECOND!!) That speed is almost 10 times faster than USB 2.0… We don't need to do the math on that one, do we? But for now I'm thrilled with USB 2.0 and the fact I can get these little 4 Gig USB drives for $4.00 each at Staples on sale. Well, that is a no-brainer, don't you think? But what can you do with them to replace that DVD? Simply and cheaply put………. THIS! First let's get an ISO file, like the Visual Studio 2010 Ultimate DVD ISO from MSDN, to demonstrate with. I develop on several computers so this is a good choice for me. So we download the ISO file and put it in a folder somewhere like this. Next we go to the Windows 7 USB/DVD Download Tool site and read about the tool. http://www.microsoftstore.com/store/msstore/html/pbPage.Help_Win7_usbdvd_dwnTool And click this link to get the tool and install it. Once it is installed you go to the Start, Programs menu, Windows 7 USB DVD Download Tool folder. And then click the tool to open it up. As you will see, it is a sweet, simple tool that was originally designed to put the Windows 7 ISO, which is designed to be bootable, on a USB or DVD for us geeks to play with. It is now being used by many developers for the Windows 8 Developer Preview, for the same purpose it was built for in the past. But for now we will use it to put a NON-bootable ISO on a USB. Hey, it does the job and I'm reusing a leftover program. Why buy the fancy one, or install a free trial, and clutter up my machine? We will click the BROWSE button and navigate to where we put the ISO file we want to put on the USB drive. Obviously we are going to click NEXT and continue to select a USB Device (you can guess what the DVD button is for). Next we select the USB drive that we have plugged into one of our laptop's USB ports. Then we click the BEGIN COPYING button and the first thing the program does is format our USB drive. Then it starts copying files out of the ISO and constructing the USB as if it were a DVD. So now that the files are copying to the drive, I'm going to warn you. We will error out here. This program was designed for bootable ISOs, of which this one is NOT.
    No problem, because what fails is the writing of the bootable data to the drive, which isn't there. No biggie…. Forget that the STARTOVER button is even there; just click the dialog's CLOSE button and exit the program. Now go to Windows Explorer and navigate to the USB device. You can now access everything and even add stuff to the drive. But for me, I want to keep this drive for one purpose and that is to install VS2010 on various machines. So the only thing I'll add to it is a folder of notes on Visual Studio that I might want on the other machines I'm installing VS2010 onto. So that is it. Have a nice day! The Ron

    Read the article

  • Page debugging got easier in UCM 11g

    - by kyle.hatlestad
    UCM is famous for its extra parameters you can add to the URL to do different things. You can add &IsJava=1 to get all of the local data and result set information that comes back from the idc_service. You can add &IsSoap=1 and get back a SOAP message with that information. Or &IsJson=1 will send it in JSON format. There are ones that change the display, like &coreContentOnly=1, which will hide the footer and navigation on the page. In 10g, you could add &ScriptDebugTrace=1 and it would display the list of resources that were called through includes or eval functions at the bottom of the page. And it would list them in nested order so you could see the order in which they were called and which components overrode each other. But in 11g, that parameter flag no longer works. Instead, you get a much more powerful one called &IsPageDebug=1. When you add that to a page, you get a small gray tab at the bottom right-hand part of the browser window. When you click it, it will expand and let you choose several pieces of information to display. You can select 'idocscript trace' and display the nested includes you used to get with ScriptDebugTrace. You can select 'initial binder' and see the local data and result sets coming back from the service, just as you would with IsJava. But in this display, it formats the results in easy-to-read tables (instead of raw HDA format). Then you can get the 'final binder', which contains all of the local data and result sets after executing all of the includes for the display of the page (and not just from the service call). And there is a 'javascript log' for reporting on the JavaScript functions and times being executed on the page. Together, these new data displays make page debugging much easier in 11g. *Note: This post also applies to Universal Records Management (URM).

    Read the article

  • New Trusted Status awarded to first Mobile Java Developer

    - by Jacob Lehrbaum
    Java Verified has just announced that Gameloft is the first developer to receive its new Trusted Status! Java Verified is an industry-recognized Java testing and signing program backed and funded by companies such as AT&T, LG, Motorola, Nokia, Oracle, Orange, Samsung and Vodafone, and chartered with making it easier for mobile developers to certify and deploy applications for use across the billions of mobile handsets that run Java ME. Because of its breadth and diversity, Java ME provides an unmatched opportunity to reach more than 3 billion consumers, but at the same time, developers are faced with the challenge of working with multiple distribution channels and a range of handsets. To this end, the Java Verified program provides a suite of tests that help to validate identity, functionality, integrity, and quality. Since its rebirth in 2010 as an independent organization, the Java Verified program has been actively working to make it even easier to create and distribute Java ME apps. Example initiatives include updates to the Unified Testing Criteria to make it easier to test "Simple Apps," community outreach to better understand and address developer pain points, and a new "Trusted Status." In the words of the Java Verified program, Trusted Status is: a privileged status to be granted to developers who will have proven that the quality of their Java ME apps is of a consistently high standard. These are developers who will have earned the trust of Java Verified by demonstrating unfailingly that testing to the UTC standard is a crucial part of their product development activity. The first developer to be awarded this status is Gameloft. By achieving Trusted Status, Gameloft can now test their applications to the Java Verified standard without needing to provide Java Verified with the evidence. The apps then automatically get signed with the Java Verified signature, enabling Gameloft to benefit from reduced costs and time-to-market for their new Java ME applications from here on out. Learn more about the exciting news or apply now for Trusted Status!

    Read the article

  • Week 15: The Telephone Game

    - by sandra.haan
    Have you ever played a game of telephone? Remember the one where you whispered something like "Once bitten, twice shy" to the person next to you, only to find that after this message has been shared around the circle the last person to repeat it says "Pastrami on Rye"? Messages can get distorted and we want to make sure that your past successes are clearly articulated which is why we have put in place a reference program for our partners. Listen in as Judson tells you how to engage with OPN in the Partner Reference program. Take advantage of the opportunity to promote your success to prospects through Oracle. Find out more and submit your nomination for a reference today. Until next time, The OPN Communications Team

    Read the article

  • Oracle Announces the new Environmental Accounting and Reporting Solution for Oracle ERP

    - by Oracle Accelerate for Midsize Companies
    Oracle now offers Environmental Accounting & Reporting, which extends the capabilities of both Oracle's E-Business Suite Financials and the JD Edwards EnterpriseOne family of applications. Oracle Environmental Accounting and Reporting enables organizations to track their greenhouse gas (GHG) emissions and other environmental data against reduction targets, and it facilitates environmental reporting for both voluntary and legislated emissions reporting schemes. The solution manages this function from within the existing ERP system and utilizes Oracle Business Intelligence to provide immediate insight into an organization's environmental data to identify and manage CO2 and cost reduction opportunities, providing rapid ROI. Click here to learn more: http://www.oracle.com/us/products/applications/green/accounting-reporting-410442.html

    Read the article

  • JPA/EclipseLink multitenancy screencast

    - by alexismp
    I find JPA, and in particular EclipseLink 2.3, to be particularly well suited to illustrate the concept of multitenancy, one of the key PaaS features en route for Java EE 7. Here's a short (5-minute) screencast showing GlassFish 3.1.1 (due out real soon now) and its EclipseLink 2.3 JPA provider demonstrating multitenancy in action. In short, it adds EclipseLink annotations to a JPA entity and deploys two identical applications with different tenant-id properties defined in the persistence.xml descriptor. Each application only sees its own data, yet everything is stored in the same table, which is augmented with a discriminator column. For more advanced uses, such as the tenant property being set on the @PersistenceContext, XML configuration of multitenant JPA entities, and more, check out the nicely written wiki page.
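    For readers who want to see what those EclipseLink annotations look like, here is a minimal sketch of a single-table multitenant entity. The entity, table and column names are made up for illustration; only the @Multitenant, @TenantDiscriminatorColumn and eclipselink.tenant-id pieces come from EclipseLink 2.3 itself.

    import javax.persistence.Column;
    import javax.persistence.Entity;
    import javax.persistence.Id;
    import javax.persistence.Table;
    import org.eclipse.persistence.annotations.Multitenant;
    import org.eclipse.persistence.annotations.MultitenantType;
    import org.eclipse.persistence.annotations.TenantDiscriminatorColumn;

    // Hypothetical entity: all tenants share the CUSTOMER table, and EclipseLink adds the
    // TENANT_ID discriminator column and filters every query by the tenant of the persistence unit.
    @Entity
    @Table(name = "CUSTOMER")
    @Multitenant(MultitenantType.SINGLE_TABLE)
    @TenantDiscriminatorColumn(name = "TENANT_ID", contextProperty = "eclipselink.tenant-id")
    public class Customer {

        @Id
        private Long id;

        @Column(name = "NAME")
        private String name;

        public Long getId() { return id; }
        public void setId(Long id) { this.id = id; }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }

    Each deployed application's persistence.xml would then carry its own tenant id, for example a property like <property name="eclipselink.tenant-id" value="appA"/> (the value "appA" is just an example), so two otherwise identical deployments only ever see their own rows.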

    Read the article

  • What makes a Software Craftsman?

    - by Liam McLennan
    At the end of my visit to 8th Light Justin Martin was kind enough to give me a ride to the train station; for my train back to O’Hare. Just before he left he asked me an interesting question which I then posted to twitter: Liam McLennan: . @JustinMartinM asked what I think is the most important attributes of craftsmen. I said, "desire to learn and humility". What's yours? 6:25 AM Apr 17th via TweetDeck several people replied with excellent contributions: Alex Hung: @liammclennan I think kaizen sums up craftmanship pretty well, which is almost same as yours Steve Bohlen: @alexhung @liammclennan those are both all about saying "knowing what you don't know and not being afraid to go learn it" (and I agree!) Matt Roman: @liammclennan @JustinMartinM a tempered compulsion for constant improvement, and an awareness of what needs improving. Justin Martin: @mattroman @liammclennan a faculty for asking challenging questions, and a persistence to battle through difficult obstacles barring growth I thought this was an interesting conversation, and I would love to see other people contribute their opinions. My observation is that Alex, Steve, Matt and I seem to have essentially the same answer in different words. It is also interesting to note (as Alex pointed out) that these definitions are very similar to Alt.NET and the lean concept of kaizen.

    Read the article

  • WLS MBeans

    - by Jani Rautiainen
    WLS provides a set of Managed Beans (MBeans) to configure, monitor and manage WLS resources. We can use the WLS MBeans to automate some of the tasks related to the configuration and maintenance of the WLS instance. The MBeans can be accessed in a number of ways: through various UIs and programmatically using Java or WLST Python scripts. For customization development we can use these features to e.g. manage the deployed customization in MDS, control logging levels, automate deployment of dependent libraries etc. This article is an introduction on how to access and use the WLS MBeans. The goal is to illustrate the various access methods in a single article; the details of the features are left to the linked documentation. This article covers a Windows-based environment; the steps for Linux would be similar, although there would be some differences, e.g. in how the file paths are defined.

    MBeans
    The WLS MBeans can be categorized into runtime and configuration MBeans. The Runtime MBeans can be used to access runtime information about the server and its resources; the data from runtime beans is only available while the server is running. The runtime beans can be used to e.g. check the state of the server or a deployment. The Configuration MBeans contain information about the configuration of servers and resources. The configuration of the domain is stored in the config.xml file, and the configuration MBeans can be used to access and modify the configuration data. For more information on the WLS MBeans refer to: Understanding WebLogic Server MBeans, WLS MBean Reference.

    Java Management Extensions (JMX)
    We can use the JMX APIs to access the WLS MBeans. This allows us to create Java programs to configure, monitor, and manage WLS resources. In order to use the WLS MBeans we need to add the following library to the class-path: WL_HOME\lib\wljmxclient.jar

    Connecting to a WLS MBean server
    The WLS MBeans are contained in an MBean server; depending on the requirement we can connect to (MBean Server / JNDI Name):
    Domain Runtime MBean Server / weblogic.management.mbeanservers.domainruntime
    Runtime MBean Server / weblogic.management.mbeanservers.runtime
    Edit MBean Server / weblogic.management.mbeanservers.edit
    To connect to the WLS MBean server, first we need to create a map containing the credentials:
    Hashtable<String, String> param = new Hashtable<String, String>();
    param.put(Context.SECURITY_PRINCIPAL, "weblogic");
    param.put(Context.SECURITY_CREDENTIALS, "weblogic1");
    param.put(JMXConnectorFactory.PROTOCOL_PROVIDER_PACKAGES, "weblogic.management.remote");
    These define the user, the password and the package containing the protocol. Next we create the connection:
    JMXServiceURL serviceURL =
        new JMXServiceURL("t3", "127.0.0.1", 7101,
                          "/jndi/weblogic.management.mbeanservers.domainruntime");
    JMXConnector connector = JMXConnectorFactory.connect(serviceURL, param);
    MBeanServerConnection connection = connector.getMBeanServerConnection();
    With the connection we can now access the MBeans of the WLS instance. For a complete example see Appendix A of this post. For more details refer to Accessing WebLogic Server MBeans with JMX.

    Accessing WLS MBeans
    The WLS MBeans are structured hierarchically; in order to access content we need to know the path to the MBean we are interested in. The MBean is accessed using the "MBeanServerConnection.getAttribute" API.
    WLS provides entry points to the hierarchy, allowing us to navigate all the WLS MBeans in the hierarchy (MBean Server / JMX object name):
    Domain Runtime MBean Server / com.bea:Name=DomainRuntimeService,Type=weblogic.management.mbeanservers.domainruntime.DomainRuntimeServiceMBean
    Runtime MBean Servers / com.bea:Name=RuntimeService,Type=weblogic.management.mbeanservers.runtime.RuntimeServiceMBean
    Edit MBean Server / com.bea:Name=EditService,Type=weblogic.management.mbeanservers.edit.EditServiceMBean
    For example, we can access the Domain Runtime MBean using:
    ObjectName service = new ObjectName(
        "com.bea:Name=DomainRuntimeService," +
        "Type=weblogic.management.mbeanservers.domainruntime.DomainRuntimeServiceMBean");
    The same syntax works for any "child" WLS MBean, e.g. to find all application deployments we can use:
    ObjectName domainConfig = (ObjectName)connection.getAttribute(service, "DomainConfiguration");
    ObjectName[] appDeployments = (ObjectName[])connection.getAttribute(domainConfig, "AppDeployments");
    Alternatively we could access the same MBean using the full syntax:
    ObjectName domainConfig = new ObjectName("com.bea:Location=DefaultDomain,Name=DefaultDomain,Type=Domain");
    ObjectName[] appDeployments = (ObjectName[])connection.getAttribute(domainConfig, "AppDeployments");
    For more details refer to Accessing WebLogic Server MBeans with JMX.

    Invoking operations on WLS MBeans
    The WLS MBean operations can be invoked with the MBeanServerConnection.invoke API; in the following example we query the state of the "AppsLoggerService" application:
    ObjectName appRuntimeStateRuntime = new ObjectName("com.bea:Name=AppRuntimeStateRuntime,Type=AppRuntimeStateRuntime");
    Object[] parameters = { "AppsLoggerService", "DefaultServer" };
    String[] signature = { "java.lang.String", "java.lang.String" };
    String result = (String)connection.invoke(appRuntimeStateRuntime, "getCurrentState", parameters, signature);
    The result returned should be "STATE_ACTIVE", assuming the "AppsLoggerService" application is up and running.

    WebLogic Scripting Tool (WLST)
    The WebLogic Scripting Tool (WLST) is a command-line scripting environment through which we can access the same WLS MBeans. The tool is located under: $MW_HOME\oracle_common\common\bin\wlst.bat
    Do note that there are several instances of the wlst script under $MW_HOME; each of them works, however the commands available vary, so we want to use the one under "oracle_common". The tool starts in offline mode. In offline mode we can access and manipulate the domain configuration; in online mode we can access the runtime information. We connect to the Administration Server with:
    connect("weblogic", "weblogic1", "t3://127.0.0.1:7101")
    In both online and offline modes we can navigate the WLS MBeans using commands like "ls" to print content and "cd" to navigate between objects. All the commands available can be obtained with: help('all')
    For details of the tool refer to WebLogic Scripting Tool, and for the commands available, WLST Command and Variable Reference. Also do note that the WLST tool can be invoked from Java code in Embedded Mode (a short sketch of this is included after Appendix A below).

    Running Scripts
    The WLST tool allows us to automate tasks using Python scripts in Script Mode. The script can be manually created or recorded by the WLST tool.
    Example commands for recording a script:
    startRecording("c:/temp/recording.py")
    <commands that we want to record>
    stopRecording()
    We can run the script from WLST:
    execfile("c:/temp/recording.py")
    We can also run the script from the command line:
    C:\apps\Oracle\Middleware\oracle_common\common\bin\wlst.cmd c:/temp/recording.py
    Various sample scripts are provided with the WLS instance.

    UI to Access the WLS MBeans
    There are various UIs through which we can access the WLS MBeans:
    Oracle Enterprise Manager Fusion Middleware Control
    Oracle WebLogic Server Administration Console
    Fusion Middleware Control MBean Browser
    In the integrated JDeveloper environment only the Oracle WebLogic Server Administration Console is available to us. For more information refer to the documentation; one noteworthy feature in the console is the ability to record WLST scripts based on the navigation. In addition to the UIs above, the JConsole included in the JDK can be used to access the WLS MBeans. JConsole needs to be started with specific parameters to force the WLS objects to be used and the required jar files to be on the classpath:
    "C:\apps\Oracle\Middleware\jdk160_24\bin\jconsole" -J-Djava.class.path=C:\apps\Oracle\Middleware\jdk160_24\lib\jconsole.jar;C:\apps\Oracle\Middleware\jdk160_24\lib\tools.jar;C:\apps\Oracle\Middleware\wlserver_10.3\server\lib\wljmxclient.jar -J-Djmx.remote.protocol.provider.pkgs=weblogic.management.remote
    For more details refer to Accessing Custom MBeans from JConsole.

    Summary
    In this article we have covered various ways to access and use the WLS MBeans in the context of the integrated WLS in JDeveloper, for Fusion Applications customization development.

    References
    Developing Custom Management Utilities With JMX for Oracle WebLogic Server
    Accessing WebLogic Server MBeans with JMX
    WebLogic Server MBean Reference
    WebLogic Scripting Tool
    WLST Command and Variable Reference

    Appendix A
    package oracle.apps.test;

    import java.io.IOException;
    import java.net.MalformedURLException;
    import java.util.Hashtable;
    import javax.management.MBeanServerConnection;
    import javax.management.MalformedObjectNameException;
    import javax.management.ObjectName;
    import javax.management.remote.JMXConnector;
    import javax.management.remote.JMXConnectorFactory;
    import javax.management.remote.JMXServiceURL;
    import javax.naming.Context;

    /**
     * This class contains simple examples on how to access WLS MBeans using JMX.
     */
    public class BlogExample {
        /**
         * Connection to the WLS MBeans
         */
        private MBeanServerConnection connection;

        /**
         * Constructor that takes in the connection information for the
         * domain and obtains the resources from WLS MBeans using JMX.
         * @param hostName host name to connect to for the WLS server
         * @param port port to connect to for the WLS server
         * @param userName user name to connect to for the WLS server
         * @param password password to connect to for the WLS server
         */
        public BlogExample(String hostName, String port, String userName,
                           String password) {
            super();
            try {
                initConnection(hostName, port, userName, password);
            } catch (Exception e) {
                throw new RuntimeException("Unable to connect to the domain " +
                                           hostName + ":" + port);
            }
        }

        /**
         * Default constructor.
         * Tries to create connection with default values. Runtime exception will be
         * thrown if the default values are not used in the local instance.
         */
        public BlogExample() {
            this("127.0.0.1", "7101", "weblogic", "weblogic1");
        }

        /**
         * Initializes the JMX connection to the WLS Beans
         * @param hostName host name to connect to for the WLS server
         * @param port port to connect to for the WLS server
         * @param userName user name to connect to for the WLS server
         * @param password password to connect to for the WLS server
         * @throws IOException error connecting to the WLS MBeans
         * @throws MalformedURLException error connecting to the WLS MBeans
         * @throws MalformedObjectNameException error connecting to the WLS MBeans
         */
        private void initConnection(String hostName, String port, String userName,
                                    String password)
                throws IOException, MalformedURLException,
                       MalformedObjectNameException {
            String protocol = "t3";
            String jndiroot = "/jndi/";
            String mserver = "weblogic.management.mbeanservers.domainruntime";
            JMXServiceURL serviceURL =
                new JMXServiceURL(protocol, hostName, Integer.valueOf(port),
                                  jndiroot + mserver);
            Hashtable<String, String> h = new Hashtable<String, String>();
            h.put(Context.SECURITY_PRINCIPAL, userName);
            h.put(Context.SECURITY_CREDENTIALS, password);
            h.put(JMXConnectorFactory.PROTOCOL_PROVIDER_PACKAGES,
                  "weblogic.management.remote");
            JMXConnector connector = JMXConnectorFactory.connect(serviceURL, h);
            connection = connector.getMBeanServerConnection();
        }

        /**
         * Main method used to invoke the logic for testing
         * @param args arguments passed to the program
         */
        public static void main(String[] args) {
            BlogExample blogExample = new BlogExample();
            blogExample.testEntryPoint();
            blogExample.testDirectAccess();
            blogExample.testInvokeOperation();
        }

        /**
         * Example of using an entry point to navigate the WLS MBean hierarchy.
         */
        public void testEntryPoint() {
            try {
                System.out.println("testEntryPoint");
                ObjectName service =
                    new ObjectName("com.bea:Name=DomainRuntimeService,Type=" +
                        "weblogic.management.mbeanservers.domainruntime.DomainRuntimeServiceMBean");
                ObjectName domainConfig =
                    (ObjectName)connection.getAttribute(service,
                                                        "DomainConfiguration");
                ObjectName[] appDeployments =
                    (ObjectName[])connection.getAttribute(domainConfig,
                                                          "AppDeployments");
                for (ObjectName appDeployment : appDeployments) {
                    String resourceIdentifier =
                        (String)connection.getAttribute(appDeployment,
                                                        "SourcePath");
                    System.out.println(resourceIdentifier);
                }
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        }

        /**
         * Example of accessing a WLS MBean directly with a full reference.
         * This does the same thing as testEntryPoint in a slightly different way.
         */
        public void testDirectAccess() {
            try {
                System.out.println("testDirectAccess");
                ObjectName appDeployment =
                    new ObjectName("com.bea:Location=DefaultDomain," +
                                   "Name=AppsLoggerService,Type=AppDeployment");
                String resourceIdentifier =
                    (String)connection.getAttribute(appDeployment, "SourcePath");
                System.out.println(resourceIdentifier);
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        }

        /**
         * Example of invoking an operation on a WLS MBean.
         */
        public void testInvokeOperation() {
            try {
                System.out.println("testInvokeOperation");
                ObjectName appRuntimeStateRuntime =
                    new ObjectName("com.bea:Name=AppRuntimeStateRuntime," +
                                   "Type=AppRuntimeStateRuntime");
                String identifier = "AppsLoggerService";
                String serverName = "DefaultServer";
                Object[] parameters = { identifier, serverName };
                String[] signature = { "java.lang.String", "java.lang.String" };
                String result =
                    (String)connection.invoke(appRuntimeStateRuntime, "getCurrentState",
                                              parameters, signature);
                System.out.println("State of " + identifier + " = " + result);
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        }
    }
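
    As a footnote to the Embedded Mode mentioned in the WLST section above, the following is a minimal, illustrative sketch of invoking WLST from Java, assuming the weblogic.management.scripting.utils.WLSTInterpreter class and the Jython InteractiveInterpreter it extends are available on the classpath via the WLS libraries; the host, port and credentials are illustrative:

    import org.python.util.InteractiveInterpreter;
    import weblogic.management.scripting.utils.WLSTInterpreter;

    public class EmbeddedWlstSketch {
        public static void main(String[] args) {
            // The WLSTInterpreter executes WLST (Jython) commands from Java code.
            InteractiveInterpreter interpreter = new WLSTInterpreter();
            // Connect to the integrated WLS instance used in this article.
            interpreter.exec("connect('weblogic', 'weblogic1', 't3://127.0.0.1:7101')");
            // Any WLST command can be passed as a string, e.g. list the current node.
            interpreter.exec("ls()");
            interpreter.exec("disconnect()");
        }
    }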

    Read the article

  • Value of SOA Specialization interview with Thomas Schaller IPT - part III

    - by Jürgen Kress
    Recognized by Oracle, Preferred by Customers. We had the great opportunity to interview Thomas Schaller, Partner at our SOA Specialized partner IPT Innovation Process Technology from Switzerland. Why did IPT decide to become SOA Specialized? "SOA Specialization is a great branding for IPT. We are the SOA specialists in the Swiss market, as we focus all our services around SOA. With 65 Swiss consultants focused on SOA Security, SOA Testing, BPM (Business Process Management) and BSM (Business Service Modeling), the partnership with Oracle as the technology leader in SOA is key; therefore it was important to us to become the first SOA Specialized company in Switzerland. As a result, IPT is mentioned by Gartner as one of eight European SOA consulting firms and is included in the "Guide to SOA Consulting and System Integration Service Providers"." Can you describe the marketing activities with Oracle? "Once a year we organize the largest SOA conference in Switzerland, the "SOA, BPM & Integration Forum 2011". Oracle is much more than a sponsor for the conference. Jointly we invite our customer base to attend this key event. The sales teams jointly address their most important prospects and customers. Oracle supports us with key speakers who present future directions of the Oracle SOA portfolio, like Clemens Utschig-Utschig, who presented details about the Complex Event Processing (CEP) solution in 2009, and James Allerton-Austin, who presented details about the social BPM solution in 2010. Additionally, our key customers presented their Oracle SOA success stories." How did you team with Oracle around the sales activities? "Sales alignment is key for the successful partnership. When we achieved SOA Specialization we celebrated jointly with the Oracle and IPT middleware sales teams. At the Aperol many interesting discussions resulted in joint opportunities and business. A key section of our joint business planning is marketing and sales activities. Together we define campaign topics and target customers. Matthias Breitschmid, our superb Oracle partner manager, ensures that the defined sales teams align and start the joint business. Regularly we review our joint business plan with the joint management teams and Jürgen Kress, our EMEA Oracle sponsor. It is great to see that both companies profit from each other and we receive leads from Oracle!" Did you get Oracle support to train your consultants in the Oracle SOA Suite? "Enablement is key for us to deliver successful SOA projects. Together with Ralph Bellinghausen from the Oracle enablement team we defined an Oracle trainings plan for our consultants. The monthly SOA Partner Community newsletter is a great resource to get the latest product updates, webcasts and trainings. As a SOA Specialized partner we also get invited to the SOA Blackbelt trainings; these trainings are hosted by Oracle product management, where we not only get first-hand information, we also get direct access to the developers who can support us in critical project phases. Driven by the customer success we have increased our Oracle SOA practice by more than 200% in the last years!" Why did the customer decide for the IPT SOA offering? "SOA Specialization becomes a brand for customers; it proves that we have the certified SOA skills and that IPT has delivered successful Oracle SOA projects.
    Jointly with Oracle, and with all the support we get from marketing, sales, enablement, support and product management, we can ensure that our customers deliver their SOA projects successfully!" What are the next steps for IPT? "SOA Specialization is super beneficial for IPT. We are looking forward to our upcoming SOA, BPM & Integration Forum 2011 and are preparing to become BPM Specialized." Part I: Torsten Winterberg, Opitz Consulting & Part II: Debra Lilley, Fujitsu. For more information on SOA Specialization and the SOA Partner Community please feel free to register at www.oracle.com/goto/emea/soa (OPN account required) Blog Twitter LinkedIn Mix Forum Wiki Website

    Read the article

  • Alice In Wonderland: Good, but not Great

    - by Theo Moore
    We went to see Alice In Wonderland today. We both like Tim Burton a lot (the stranger the better) and like Johnny Depp very well also. After seeing all the previews and such, we were fired up to see this film. Honestly, I thought it was good but not great. I was prepared to be wow-ed, but I wasn't. Perhaps I expected too much. I did like it, but I'll not own it nor would I expect to see it again...unless someone I know decides they want to see it. I was about to say something to reassure you that I wasn't going to provide any spoilers, but two things occurred to me: one, I never give spoilers, and two, why worry about spoilers for a film that so closely follows a book? My comments about the film are hard to describe, but the basic gist is that it doesn't really feel like it..."works" to me. I can't get any more specific than that, much as I'd like to do so. My only specific comment is that I didn't like the actor who plays Alice very well. She was very flat and just didn't sell her character to me. She seemed a bit, well, plastic. Depp was as good as you'd expect him to be, I am happy to say. Obviously Lewis Carroll couldn't have imagined this made into film, but I can't help thinking that he'd see this and say that Depp was the perfect Mad Hatter. So, I'd definitely recommend seeing it (we saw it in 3D which was cool, but not really necessary) at least once, but don't be surprised if you're kinda meh afterwards.

    Read the article

< Previous Page | 87 88 89 90 91 92 93 94 95 96 97 98  | Next Page >