Search Results

Search found 12499 results on 500 pages for 'business analysis'.


  • Ecommerce Marketing for Entrepreneurs

    If you've got your own online business, then you need GotBiz.tv, a WebTV network of programs where entrepreneurs learn more about marketing and business. After all, 'business doesn't have to be boring.'

    Read the article

  • Why Is Search Engine Optimization Important?

    Search engine optimization is important if you want to grow your business. A large number of businesses are currently in decline, and the recession has made that decline worse. So if you really want to protect your business, you need effective SEO.

    Read the article

  • What Are the Best Ways to Get Traffic From a Search Engine Optimization Consultant?

    Every internet-based business needs to go through a well-planned and thought-out process before it is actually established and achieves its purpose. Obviously, the process is quite different from how a brick-and-mortar business is started and established, but the basics remain the same. One of the key ingredients in the process of establishing an internet-based business is getting your website search engine optimized. Depending on the size and complexity of the business, search engine optimization may turn out to be a very detailed process if you really want it to be effective and useful.

    Read the article

  • Java EE 6 and the evolution of the J2EE patterns: J2EE veterans in conversation at JavaOne Tokyo 2012

    - by ??? 03
    At the JavaOne Tokyo 2012 conference held in April 2012, engineers who have worked with J2EE since its early days looked back on roughly ten years of the Core J2EE Patterns catalog and discussed how each pattern has fared in the move from J2EE (J2EE 1.4 and earlier) to Java EE 6 on WebLogic Server.

    A recurring theme was the "anemic" DOM (Domain Object Model). In classic J2EE, domain objects tended to degenerate into passive data holders, with the behavior that belonged to them scattered across other layers. With Java EE 6, and JPA (Java Persistence API) 2.0 in particular, it becomes practical to build a richer DOM: in many cases the DAO (Data Access Object) shrinks to little more than the EntityManager itself, although a dedicated DAO remains useful when complex Criteria Object queries need a home of their own.

    Of the classic patterns, Session Facade, Application Service and Business Object (Composite Entity) remain useful, as do Transfer Object and Value List Handler. In Java EE 6 a Session Facade maps naturally onto a session bean forming the service boundary, while Application Service and Business Object live behind it as POJOs. Business Delegate and Service Locator, by contrast, are no longer needed in Java EE, and patterns such as DAO, Domain Store, Service Activator and Web Service Broker are now largely covered by standard APIs. Service Activator in particular, once built from JMS (Java Message Service) and MDB (Message Driven Bean), is superseded in Java EE 6 by EJB 3.1's @Asynchronous methods on a Session Bean, which return a Future; MDB still has a place where SLA control over asynchronous processing matters. On the web tier, JSF (Java Server Faces) is now the standard, and Intercepting Filter remains applicable alongside it.

    In short, the core ideas behind the J2EE patterns carry over to Java EE 6, but patterns such as Session Facade, Business Object and Application Service are now far simpler to realize, and the platform itself absorbs much of the plumbing the old catalog existed to manage.

    Read the article

  • Java EE development made easier!? Get to know Oracle ADF!! | WebLogic Channel | Oracle Japan

    - by ???02
    WebLogic Server users also have access to Oracle Application Development Framework (ADF), an end-to-end Java EE development framework. Oracle ADF provides rich GUI components and covers everything from the user interface to business services, and Oracle itself uses it to build Fusion Applications as well as products such as Oracle Enterprise Manager, Oracle Enterprise Performance Management and Oracle Business Intelligence. This article introduces Oracle ADF for Java EE developers building on Fusion Middleware.

    Like most modern web application frameworks, Oracle ADF is organized around the MVC architecture, which separates an application into a Model that manages data and business logic, a View that presents it, and a Controller that mediates between the two, so that each part can evolve independently. Oracle ADF has been available since around 2004 as an end-to-end Java EE framework (see Oracle Technology Network: "Oracle Application Development Framework").

    Development is done with the JDeveloper IDE, which offers WYSIWYG visual editing, drag-and-drop binding of data to pages, and SQL and database tooling (see Oracle Technology Network: "Oracle JDeveloper" and "Oracle JDeveloper 11g"). The framework's layers map onto MVC: ADF Faces (View), ADF Task Flow (Controller) and ADF Model, complemented by ADF Security for securing applications and ADF Business Components for business services.

    ADF Faces supplies more than 150 JSF-based GUI components with Ajax-style partial page rendering built in, so rich, interactive web UIs can be assembled largely declaratively rather than hand-coded. Beyond the browser, ADF Desktop Integration lets Microsoft Excel act as a front end to ADF applications, and ADF Mobile extends the same development model to mobile devices.

    ADF Business Components handle data access and business logic in the J2EE/Java EE tradition, generating the JDBC-level plumbing so that developers can work declaratively and in Java. ADF Task Flow builds on JSF page navigation, packaging sequences of pages and logic into reusable flows that can also be addressed by URL.

    Finally, Oracle ADF continues to track the platform: compared with alternatives such as the .NET Framework it remains a standards-based choice for enterprise development, and upcoming directions include Java EE 6 and HTML5 support (reference: Oracle JDeveloper and Oracle ADF 11g Release 2 (11.1.2.x)).

    Read the article

  • Producing a view of a text's revision history in Python

    - by hekevintran
    I have two versions of a piece of text and I want to produce an HTML view of the revision, similar to what Google Docs or Stack Overflow displays. I need to do this in Python. I don't know what this technique is called, but I assume it has a name and hopefully there is a Python library that can do it. Version 1: William Henry "Bill" Gates III (born October 28, 1955)[2] is an American business magnate, philanthropist, and chairman[3] of Microsoft, the software company he founded with Paul Allen. Version 2: William Henry "Bill" Gates III (born October 28, 1955)[2] is a business magnate, philanthropist, and chairman[3] of Microsoft, the software company he founded with Paul Allen. He is American. The desired output: William Henry "Bill" Gates III (born October 28, 1955)[2] is an American business magnate, philanthropist, and chairman[3] of Microsoft, the software company he founded with Paul Allen. He is American. Using the diff command doesn't work because it tells me which lines are different but not which columns/words are different.

    $ echo 'William Henry "Bill" Gates III (born October 28, 1955)[2] is an American business magnate, philanthropist, and chairman[3] of Microsoft, the software company he founded with Paul Allen.' > oldfile
    $ echo 'William Henry "Bill" Gates III (born October 28, 1955)[2] is a business magnate, philanthropist, and chairman[3] of Microsoft, the software company he founded with Paul Allen. He is American.' > newfile
    $ diff -u oldfile newfile
    --- oldfile 2010-04-30 13:32:43.000000000 -0700
    +++ newfile 2010-04-30 13:33:09.000000000 -0700
    @@ -1 +1 @@
    -William Henry "Bill" Gates III (born October 28, 1955)[2] is an American business magnate, philanthropist, and chairman[3] of Microsoft, the software company he founded with Paul Allen.
    +William Henry "Bill" Gates III (born October 28, 1955)[2] is a business magnate, philanthropist, and chairman[3] of Microsoft, the software company he founded with Paul Allen. He is American.
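    This kind of word-level (inline) diff is something the difflib module in Python's standard library can drive. A minimal sketch of the idea, wrapping deleted words in <del> and inserted words in <ins> (the tag choice and sample strings are illustrative, not from the original post):

        import difflib

        def word_diff_html(old, new):
            """Mark word-level changes between two strings with <del>/<ins> tags."""
            a, b = old.split(), new.split()
            out = []
            for op, i1, i2, j1, j2 in difflib.SequenceMatcher(None, a, b).get_opcodes():
                if op in ('equal', 'delete', 'replace'):
                    chunk = ' '.join(a[i1:i2])
                    out.append(chunk if op == 'equal' else '<del>%s</del>' % chunk)
                if op in ('insert', 'replace'):
                    out.append('<ins>%s</ins>' % ' '.join(b[j1:j2]))
            return ' '.join(out)

        print(word_diff_html('is an American business magnate',
                             'is a business magnate. He is American.'))

    SequenceMatcher works on any sequence, so splitting on whitespace gives the word-granular result that line-oriented diff cannot; difflib.HtmlDiff is an alternative when a side-by-side table view is acceptable.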

    Read the article

  • N-Tier Architecture - Structure with multiple projects in VB.NET

    - by focus.nz
    I would like some advice on the best approach to use in the following situation... I will have a Windows application and a web application (presentation layers), and these will both access a common business layer. The business layer will look at a configuration file to find the name of the DLL (data layer) which it will create a reference to at runtime (is this the best approach?). The reason for creating the reference to the data access layer at runtime is that the application will interface with a different 3rd party accounting system depending on what the client is using. So I would have a separate data access layer to support each accounting system. These could be separate setup projects; each client would use one or the other, and they wouldn't need to switch between the two. Projects:

    MyCompany.Common.dll - Contains interfaces; all other projects have a reference to this one.
    MyCompany.Windows.dll - Windows Forms project, references MyCompany.Business.dll
    MyCompany.Web.dll - Website project, references MyCompany.Business.dll
    MyCompany.Business.dll - Business layer, references MyCompany.Data.* (at runtime)
    MyCompany.Data.AccountingSys1.dll - Data layer for accounting system 1
    MyCompany.Data.AccountingSys2.dll - Data layer for accounting system 2

    The project MyCompany.Common.dll would contain all the interfaces; each other project would have a reference to this one.

        Public Interface ICompany
            ReadOnly Property Id() As Integer
            Property Name() As String
            Sub Save()
        End Interface

        Public Interface ICompanyFactory
            Function CreateCompany() As ICompany
        End Interface

    The projects MyCompany.Data.AccountingSys1.dll and MyCompany.Data.AccountingSys2.dll would contain classes like the following:

        Public Class Company
            Implements ICompany

            Protected _id As Integer
            Protected _name As String

            Public ReadOnly Property Id As Integer Implements MyCompany.Common.ICompany.Id
                Get
                    Return _id
                End Get
            End Property

            Public Property Name As String Implements MyCompany.Common.ICompany.Name
                Get
                    Return _name
                End Get
                Set(ByVal value As String)
                    _name = value
                End Set
            End Property

            Public Sub Save() Implements MyCompany.Common.ICompany.Save
                Throw New NotImplementedException()
            End Sub
        End Class

        Public Class CompanyFactory
            Implements ICompanyFactory

            Public Function CreateCompany() As ICompany Implements MyCompany.Common.ICompanyFactory.CreateCompany
                Return New Company()
            End Function
        End Class

    The project MyCompany.Business.dll would provide the business rules and retrieve data from the data layer:

        Public Class Companies
            Public Shared Function CreateCompany() As ICompany
                Dim factory As New MyCompany.Data.CompanyFactory
                Return factory.CreateCompany()
            End Function
        End Class

    Any opinions/suggestions would be greatly appreciated.
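    One note on the runtime-reference question: loading the data layer by name from configuration is the standard provider/plugin pattern (in .NET it is typically done with Assembly.Load plus Activator.CreateInstance against the shared interface assembly). A language-neutral sketch of the idea, written here in Python with made-up module and key names:

        import importlib

        def load_company_factory(settings):
            """Instantiate the factory from whichever data-layer module
            the configuration names, without a compile-time reference."""
            module = importlib.import_module(settings['data_layer'])  # e.g. 'accountingsys1'
            return module.CompanyFactory()  # the module must export this class

        # Usage: swap data layers per client by editing configuration only.
        factory = load_company_factory({'data_layer': 'accountingsys1'})
        company = factory.create_company()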

    Read the article

  • SQL Server CLR stored procedures in data processing tasks - good or evil?

    - by Gart
    In short - is it a good design solution to implement most of the business logic in CLR stored procedures? I have read much about them recently, but I can't figure out when they should be used, what the best practices are, and whether they are good enough or not. For example, my business application needs to parse a large fixed-length text file, extract some numbers from each line in the file, apply some complex business rules according to these numbers (involving regex matching, pattern matching against data from many tables in the database, and such), and as a result of this calculation update records in the database. There is also a GUI for the user to select the file, view the results, etc. This application seems to be a good candidate for the classic 3-tier architecture: the Data Layer, the Logic Layer, and the GUI Layer.

    The Data Layer would access the database.
    The Logic Layer would run as a WCF service and implement the business rules, interacting with the Data Layer.
    The GUI Layer would be a means of communication between the Logic Layer and the user.

    Now, thinking about this design, I can see that most of the business rules could be implemented in SQL CLR and stored in SQL Server. I might store all my raw data in the database, run the processing there, and get the results. I see some advantages and disadvantages of this solution:

    Pros:
    The business logic runs close to the data, meaning less network traffic.
    All data is processed at once, possibly utilizing parallelism and an optimal execution plan.

    Cons:
    Scattering of the business logic: some part is here, some part is there.
    Questionable design solution; it may encounter unknown problems.
    It is difficult to implement a progress indicator for the processing task.

    I would like to hear all your opinions about SQL CLR. Does anybody use it in production? Are there any problems with such a design? Is it a good thing?

    Read the article

  • What does a Java web project architecture look like without EJB3?

    - by Hendrik
    A friend and I are building a fairly complex website based on Java. (PHP would have been more obvious, but we chose Java because the educational aspect of this project is important to us.) We have already decided to use JSF (with RichFaces) for the front end and JPA for the back end, and so far we have decided not to use EJB3 for the business layer. The reason we've decided against EJB3 is that - and please correct me if I am wrong - with EJB3 we can only run on a full-blown Java application server like JBoss, whereas without EJB3 we can still run on a lightweight server like Tomcat. We want to keep the speed and cost of our future web server in mind. So far I've worked on two JEE projects, and both used the full stack (web, business logic, factories, persistence, service entities), with every layer a separate module. Now here is my question: if you don't use EJB3 in the business logic layer, what does that layer look like? Please tell me what is common practice when developing Java web projects without EJB3. Do you think the business logic layer can be thrown out altogether, with the business logic living in the backing beans? If you keep the layer, do you make all business methods static? Or do you instantiate each business class as needed in the backing beans, in every session?

    Read the article

  • OO Design / Patterns - Fat Model Vs Transaction Script?

    - by ben
    OK, 'Fat' Model and Transaction Script both solve design problems associated with where to keep business logic. I've done some research, and popular thought says having all business logic encapsulated within the model is the way to go (mainly since Transaction Script can become really complex and often results in code duplication). However, how does this work if I want to use the TDG of a second model in my business logic? Surely Transaction Script presents a neater, less coupled solution than using one model inside the business logic of another? A practical example... I have two classes: User and Alert. When pushing User instances to the database (e.g., creating new user accounts), there is a business rule that requires inserting some default Alert records too (e.g., a default 'welcome to the system' message, etc.). I see two options here: 1) Add this rule as a User method, and in the process create a dependency between User and Alert (or, at least, Alert's Table Data Gateway). 2) Use a Transaction Script, which avoids the dependency between models. (This also means the business logic is kept in a 'neutral' class and is easily accessible by Alert. That probably isn't too important here, though.) User takes responsibility for its own validation etc., but because we're talking about a business rule involving two models, Transaction Script seems like a better choice to me. Anyone spot flaws with this approach?
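    For comparison, the transaction script version of the example reads as a single procedure touching both gateways without either model knowing about the other; a minimal sketch (the gateway API here is made up):

        def register_user(users_gateway, alerts_gateway, name):
            """Transaction script: create a user plus the default welcome alert.

            The User/Alert coupling lives here, in a neutral place,
            rather than inside the User model."""
            user_id = users_gateway.insert(name=name)
            alerts_gateway.insert(user_id=user_id,
                                  message='Welcome to the system')
            return user_id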

    Read the article

  • Fetch a specific tag from Rally in order to compute a value in another field

    - by 4jas
    I'm extremely new to Rally development, so my question may sound dumb (but I couldn't find how to do it in Rally's help or in previous posts here) :) I've started from the Rally freeform grid example - my purpose is to implement a Business Value calculator: I fill the score field with a 5-digit figure where each number is a score in the 1-5 range. Then I compute a business value as the result of a calculation, where each number is weighted by a preset weight. I can sort my stories by Business Value to help me prioritize my backlog: that's the first step, and it works. Now what I want to do is make my freeform grid editable: I am extracting each of my digits as a separate column, but those columns are display-only. How can I turn them into something editable? What I want to do, of course, is update the score field based on the values input in each custom column. Here's an example: I have a record with score "15254", which means Business Value criteria 1 scores 1 out of 5, Business Value criteria 2 scores 5 out of 5, and so on... In the end my Business Value is computed as "1*1 + 5*2 + 2*3 + 5*4 + 4*5 = 57". So far this is the part that works. Now let's say I found that the third criteria should not score 2 but 3; I want to be able to edit the value in the corresponding column and have my score field updated to "15354", and my Business Value display 60 instead of 57. Here is my current code; I'll be really grateful if you can help me with turning that grid into something editable :)

    <!--Include SDK-->
    <script type="text/javascript" src="https://rally1.rallydev.com/apps/2.0p2/sdk-debug.js"></script>

    <!--App code-->
    <script type="text/javascript">
    Rally.onReady(function() {
        Ext.define('BVApp', {
            extend: 'Rally.app.App',
            componentCls: 'app',

            launch: function() {
                Ext.create('Rally.data.WsapiDataStore', {
                    model: 'UserStory',
                    autoLoad: true,
                    listeners: {
                        load: this._onDataLoaded,
                        scope: this
                    }
                });
            },

            _onDataLoaded: function(store, data) {
                var records = [];
                var li_score;
                var li_bv1, li_bv2, li_bv3, li_bv4, li_bv5, li_bvtotal;
                var weights = new Array(1, 2, 3, 4, 5);

                Ext.Array.each(data, function(record) {
                    // Let's fetch score and compute the business values...
                    li_score = record.get('Score');
                    if (li_score) {
                        li_bv1 = li_score.toString().substring(0, 1);
                        li_bv2 = li_score.toString().substring(1, 2);
                        li_bv3 = li_score.toString().substring(2, 3);
                        li_bv4 = li_score.toString().substring(3, 4);
                        li_bv5 = li_score.toString().substring(4, 5);
                        li_bvtotal = li_bv1 * weights[0] + li_bv2 * weights[1] + li_bv3 * weights[2] + li_bv4 * weights[3] + li_bv5 * weights[4];
                    }
                    records.push({
                        FormattedID: record.get('FormattedID'),
                        ref: record.get('_ref'),
                        Name: record.get('Name'),
                        Score: record.get('Score'),
                        Bv1: li_bv1,
                        Bv2: li_bv2,
                        Bv3: li_bv3,
                        Bv4: li_bv4,
                        Bv5: li_bv5,
                        BvTotal: li_bvtotal
                    });
                });

                this.add({
                    xtype: 'rallygrid',
                    store: Ext.create('Rally.data.custom.Store', {
                        data: records,
                        pageSize: 5
                    }),
                    columnCfgs: [
                        { text: 'FormattedID', dataIndex: 'FormattedID' },
                        { text: 'ref', dataIndex: 'ref' },
                        { text: 'Name', dataIndex: 'Name', flex: 1 },
                        { text: 'Score', dataIndex: 'Score' },
                        { text: 'BusVal 1', dataIndex: 'Bv1' },
                        { text: 'BusVal 2', dataIndex: 'Bv2' },
                        { text: 'BusVal 3', dataIndex: 'Bv3' },
                        { text: 'BusVal 4', dataIndex: 'Bv4' },
                        { text: 'BusVal 5', dataIndex: 'Bv5' },
                        { text: 'BusVal Total', dataIndex: 'BvTotal' }
                    ]
                });
            }
        });

        Rally.launchApp('BVApp', {
            name: 'Business Values App'
        });

        var exampleHtml = '<div id="example-intro"><h1>Business Values App</h1>' +
            '<div>Own sample app for Business Values</div>' +
            '</div>';

        // Default app viewport uses layout: 'fit',
        // so we need to insert a container into the viewport
        var viewport = Ext.ComponentQuery.query('viewport')[0];
        var appComponent = viewport.items.getAt(0);
        var viewportContainerItems = [{
            html: exampleHtml,
            border: 0
        }];

        // hide advanced cardboard live previews in examples for now
        viewportContainerItems.push({
            xtype: 'container',
            items: [appComponent]
        });
        viewport.remove(appComponent, false);
        viewport.add({
            xtype: 'container',
            layout: 'vbox',
            items: viewportContainerItems
        });
    });
    </script>

    <!--App styles-->
    <style type="text/css">
    .app {
        /* Add app styles here */
    }
    </style>

    Read the article

  • Windows XP / Outlook 2003 error messages

    - by AboutDev
    Can anyone help with this issue? I am trying to help someone and could use some expertise. Error message #1 (Microsoft Office Small Business Edition 2003, with CD icon): "The feature you are trying to use is on a CD-ROM or other removable disk that is not available. Insert the 'Microsoft Office Small Business Edition 2003' disk and click OK. Use source: Microsoft Office Small Business Edition 2003." We first got this message after the CD was inserted to recover the partial file STDP11N. STDP11N was recovered; however, the pop-up window with the error message still appears each time Outlook opens. Some old programs had accidentally been cleaned up, and suddenly this file was missing. I reinstalled Microsoft Office Small Business Edition 2003 using the install CD. Outlook worked, but I keep getting the error message pop-up each time I open Outlook. Hit OK. Error message #2: "The path 'Microsoft Office Small Business Edition 2003' cannot be found. Verify that you have access to this location and try again, or try to find the installation package 'STDP11N.MSI' in a folder from which you can install the product Microsoft Office Small Business Edition 2003." Hit OK. Back to error message #1. Hit close window. Error message #3: "Error 1706. Setup cannot find the required files. Check your connection to the network, or CD-ROM drive. For other potential solutions to this problem, see C:\Program Files\Microsoft Office\OFFICE11\1033\SETUP.CHM". Error message #4 (I'd created a file under the D: drive on an external drive): "The path specified for the file D:...etc.. .pst is not valid." Hit OK. This brings up a window to look in My Documents.

    Read the article

  • Multidimensional Thinking – 24 Hours of PASS: Celebrating Women in Technology

    - by smisner
    It’s Day 1 of #24HOP and it’s been great to participate in this event with so many women from all over the world in one long training-fest. The SQL community has been abuzz on Twitter with running commentary, which is fun to watch while listening to the current speaker. If you missed the fun today because you’re busy with all that work you’ve got to do – don’t despair. All sessions are recorded and will be available soon. Keep an eye on the 24 Hours of PASS page for details. And the fun’s not over today. Rather than run 24 hours consecutively, #24HOP is now broken down into 12 hours over two days, so check out the schedule to see if there’s a session that interests you and fits your schedule. I’m pleased to announce that my business colleague Erika Bakse (Blog | Twitter) will be presenting on Day 2 – her debut presentation for a PASS event. (And I’m also pleased to say she’s my daughter!)

    Multidimensional Thinking: The Presentation

    My contribution to this lineup of terrific speakers was Multidimensional Thinking. Here’s the abstract: “Whether you’re developing Analysis Services cubes or creating PowerPivot workbooks, you need to get into a multidimensional frame of mind to produce a model that best enables users to answer their business questions on their own. Many database professionals struggle initially with multidimensional models because the data modeling process is much different than the one they use to produce traditional, third normal form databases. In this session, I’ll introduce you to the terminology of multidimensional modeling and step through the process of translating business requirements into a viable model.” If you watched the presentation and want a copy of the slides, you can download a copy here. And you’re welcome to download the slides even if you didn’t watch the presentation, but they’ll make more sense if you did!

    Kimball All the Way

    There’s only so much I can cover in the time allotted, but I hope that I succeeded in my attempt to build a foundation that prepares you for starting out in business intelligence. One of my favorite resources, which gets into much more detail about all kinds of scenarios (well beyond the basics!), is The Data Warehouse Toolkit (Second Edition) by Ralph Kimball. Anything from Kimball or the Kimball Group is worth reading. Kimball material might take reading and re-reading a few times before it makes sense. From my own experience, I found that I actually had to just build my first data warehouse using dimensional modeling on faith that I was going in the right direction, because it just didn’t click with me initially. I’ve had years of practice since then, and I can say it does get easier with practice. The most important thing, in my opinion, is that you simply must prototype a lot and solicit user feedback, because ultimately the model needs to make sense to them. They will definitely make sure you get it right!

    Schema Generation

    One question came up after the presentation about whether we use SQL Server Management Studio or Business Intelligence Development Studio (BIDS) to build the tables for the dimensional model. My answer? It really doesn’t matter how you create the tables. Use whatever method you’re comfortable with. But it just so happens that it IS possible to set up your design in BIDS as part of an Analysis Services project and to have BIDS generate the relational schema for you. I did a webcast last year called Building a Data Mart with Integration Services that demonstrated how to do this. Yes, the subject was Integration Services, but as part of that presentation I showed how to leverage Analysis Services to build the tables, and then I showed how to use Integration Services to load those tables. I blogged about this presentation in September 2010 and included downloads of the project that I used. In the blog post, I explained that I missed a step in the demonstration. Oops. Just as an FYI, there were two more webcasts to finish the story begun with the data – Accelerating Answers with Analysis Services and Delivering Information with Reporting Services. If you want to just cut to the chase and learn how to use Analysis Services to build the tables, you can see the Using the Schema Generation Wizard topic in Books Online.

    Read the article

  • Circular Shifts on Strings in Bash

    - by Kyle Van Koevering
    I have a homework assignment where I need to take input from a file and continuously remove the first word in a line and append it to the end of the line until all combinations have been done. I really don't know where to begin and would be thankful for any sort of direction. The part that has me confused is that this is supposed to be performed without the use of arrays. I'm not just fishing for someone to solve the problem for me; I'm just looking for some direction. Thank you very much for your time and help.

    SAMPLE INPUT:
    Pipes and Filters
    Java Swing
    Software Requirements Analysis

    SAMPLE OUTPUT:
    Analysis Software Requirements
    Filters Pipes and
    Java Swing
    Pipes and Filters
    Requirements Analysis Software
    Software Requirements Analysis
    Swing Java
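    For reference, a "circular shift" here just means rotating the word list one position at a time, and the sample output is those rotations sorted (it also appears to skip the rotation starting with the noise word "and"). A short Python sketch of the rotation logic itself, leaving the Bash and no-array constraints of the assignment aside:

        def circular_shifts(line):
            """Return every rotation of the words in a line."""
            words = line.split()
            return [' '.join(words[i:] + words[:i]) for i in range(len(words))]

        lines = ['Pipes and Filters', 'Java Swing', 'Software Requirements Analysis']
        # Sorting all rotations of all lines gives a KWIC-style index.
        for shift in sorted(s for line in lines for s in circular_shifts(line)):
            print(shift)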

    Read the article

  • Visual Studio hangs when deploying a cube

    - by Richie
    Hello all, I'm having an issue with an Analysis Services project in Visual Studio 2005. My project always builds, but it only occasionally deploys. No errors are reported and VS just hangs. This is my first Analysis Services project, so I am hoping there is something obvious that I am just missing. Here is the situation: I have a cube that I have successfully deployed. I then make some change, e.g., adding a hierarchy to a dimension. When I try to deploy again, VS hangs. I have to restart Analysis Services to regain control of VS so I can shut it down. I restart everything, sometimes once, sometimes twice or more, before the project will eventually deploy. This happens with any change I make; there seems to be no pattern to this behaviour. Sometimes I have to delete the cube from Analysis Services before restarting everything to get a successful deploy. Also, after I have successfully deployed the cube and then successfully reprocessed a dimension, when I open a query window in SQL Server Management Studio it says that it can't find any cubes. As a test, I deployed a cube successfully, deleted it in Analysis Services, and attempted to redeploy it without making any changes to the cube, only to see the same behaviour mentioned above. VS just hangs with no reason given, so I have no idea where to start hunting down the problem. It is taking 15-20 minutes to make a change as simple as setting the NameColumn of a dimension attribute. As you can imagine, this is taking hours of my time, so I would greatly appreciate any assistance anyone can give me.

    Read the article

  • How does the Amazon Recommendation feature work?

    - by Rachel
    What technology goes on behind the scenes of the Amazon recommendation feature? I believe that Amazon's recommendations are currently the best in the market, but how do they provide us with such relevant recommendations? Recently, we have been involved in a similar recommendation project, but we would surely like to know the ins and outs of the Amazon recommendation technology from a technical standpoint. Any inputs would be highly appreciated. Update: This patent explains how personalized recommendations are done, but it is not very technical, so it would be really nice if some insights could be provided. From the comments of Dave, Affinity Analysis forms the basis for these kinds of recommendation engines. Also, here are some good reads on the topic:

    Demystifying Market Basket Analysis
    Market Basket Analysis
    Affinity Analysis

    Suggested reading: Data Mining: Concepts and Techniques
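    The published description (Linden, Smith and York's paper "Amazon.com Recommendations: Item-to-Item Collaborative Filtering") boils down to precomputing similarities between items over the sets of customers who bought them, rather than comparing customers to each other. A toy sketch of that idea using cosine similarity (the data is invented for illustration):

        from math import sqrt
        from collections import defaultdict

        purchases = {'alice': {'book', 'pen'},
                     'bob': {'book', 'pen', 'lamp'},
                     'carol': {'book', 'lamp'}}

        # Invert user -> items into item -> buyers.
        buyers = defaultdict(set)
        for user, items in purchases.items():
            for item in items:
                buyers[item].add(user)

        def cosine(a, b):
            """Similarity of two items over the sets of users who bought them."""
            return len(buyers[a] & buyers[b]) / sqrt(len(buyers[a]) * len(buyers[b]))

        # "Customers who bought 'book' also bought", most similar first.
        print(sorted((i for i in buyers if i != 'book'),
                     key=lambda i: cosine('book', i), reverse=True))

    At Amazon's scale the item-item similarity table is computed offline, which is what makes the online lookup cheap.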

    Read the article

  • SQL Server 2008 and SP1

    - by andrew007
    Hi, I have a server where I installed SQL Server 2008 and then applied SP1. Now I also want to add Analysis Services to this instance by using "Add or remove features...". My questions are: Is it possible to add Analysis Services on a server with SP1 already installed? How can I apply SP1 to the new Analysis Services feature as well? THANKS!

    Read the article

  • Currency Conversion in Oracle BI applications

    - by Saurabh Verma
    Authored by Vijay Aggarwal and Hichem Sellami

    A typical data warehouse contains Star and/or Snowflake schemas, made up of Dimensions and Facts. The facts store various numerical information, including amounts - for example, Order Amount, Invoice Amount, etc. With the truly global nature of business nowadays, end users want to view reports in their own currency or in a global/common currency as defined by their business. This presents a unique opportunity in BI to provide the amounts in converted rates, either by pre-storing them or by doing on-the-fly conversions while displaying the reports to the users.

    Source Systems

    OBIA caters to various source systems like EBS, PSFT, Sebl, JDE, Fusion etc. Each source has its own unique and intricate ways of defining and storing currency data, doing currency conversions and presenting them to the OLTP users. For example, EBS stores conversion rates between currencies which can be classified by rate types, like Corporate rate, Spot rate, Period rate etc. Siebel stores exchange rates by conversion rates like Daily. EBS/Fusion stores the conversion rates for each day, whereas PSFT/Siebel store them for a range of days. PSFT has a Rate Multiplication Factor and a Rate Division Factor and we need to calculate the rate based on them, whereas other source systems store the Currency Exchange Rate directly.

    OBIA Design

    The consolidation of data from various disparate source systems poses the challenge of conforming various currencies, rate types, exchange rates etc., and of designing the best way to present the amounts to the users without affecting performance. When consolidating the data for reporting in OBIA, we have designed mechanisms in the Common Dimension to allow users to report based on their required currencies.

    OBIA Facts store amounts in various currencies:

    Document Currency: This is the currency of the actual transaction. For a multinational company, this can be in various currencies.

    Local Currency: This is the base currency in which the accounting entries are recorded by the business. This is generally defined in the Ledger of the company.

    Global Currencies: OBIA provides five Global Currencies. Three are used across all modules; the last two are for CRM only. A Global Currency is very useful when creating reports where the data is viewed enterprise-wide. For example, a US-based multinational would want to see its reports in USD, so the company will choose USD as one of the global currencies. OBIA allows users to define up to five global currencies during the initial implementation.

    The term Currency Preference is used to designate the set of values Document Currency, Local Currency, Global Currency 1, Global Currency 2, Global Currency 3, which are shared among all modules. There are four more currency preferences, specific to certain modules: Global Currency 4 (aka CRM Currency) and Global Currency 5, which are used in CRM; and Project Currency and Contract Currency, used in Project Analytics. When choosing Local Currency as the currency preference, the data will show in the currency of the Ledger (or Business Unit) in the prompt, so it is important to select one Ledger or Business Unit when viewing data in Local Currency. More on this can be found in the section Toggling Currency Preferences in the Dashboard.

    Design Logic

    When extracting the fact data, the OOTB mappings extract and load the document amount and the local amount into the target tables. They also load the exchange rates required to convert the document amount into the corresponding global amounts.
    If the source system only provides the document amount in the transaction, the extract mapping does a lookup to get the local currency code and the local exchange rate. The load mapping then uses the local currency code and rate to derive the local amount. The load mapping also fetches the Global Currencies and looks up the corresponding exchange rates.

    The lookup of exchange rates is done via the Exchange Rate Dimension, provided as a Common/Conforming Dimension in OBIA. The Exchange Rate Dimension stores the exchange rates between various currencies for a date range and Rate Type. Two physical tables, W_EXCH_RATE_G and W_GLOBAL_EXCH_RATE_G, are used to provide the lookups and conversions between currencies. The data is loaded from the source system's Ledger tables. W_EXCH_RATE_G stores the exchange rates between currencies with a date range. On the other hand, W_GLOBAL_EXCH_RATE_G stores the currency conversions between the document currency and the five pre-defined Global Currencies for each day. Based on the requirements, the fact mappings can decide to use one or both tables to do the conversion. Currency design in OBIA also taps into the MLS and Domain architecture, allowing users to map the currencies to a universal Domain at implementation time. This is especially important for companies deploying and using OBIA with multiple source adapters.

    Some Gotchas to Look For

    It is necessary to think through the currencies during the initial implementation.

    1) Identify the various types of currencies used by your business. Understand what your Local (or Base) and Document currencies will be. Identify the various global currencies in which your users will want to view reports; this will be based on the global nature of your business. Changes to these currencies later in the project, while permitted, may cause full data loads and hence lost time.

    2) If you have a multi-source system, make sure that the Global Currencies and Global Rate Types chosen in Configuration Manager have the corresponding source-specific counterparts. In other words, make sure that for every DW-specific value chosen for Currency Code or Rate Type, a source Domain mapping has already been done.

    Technical Section

    This section briefly describes the technical scenarios employed in the OBIA adaptors to extract data from each source system. In OBIA, we have two main tables which store the Currency Rate information, as explained in the previous sections: W_EXCH_RATE_G and W_GLOBAL_EXCH_RATE_G. W_EXCH_RATE_G stores all the Currency Conversions present in the source system and captures data for a Date Range. W_GLOBAL_EXCH_RATE_G has Global Currency Conversions stored at a Daily level; the challenge here is to store all 5 Global Currency Exchange Rates in a single record for each From Currency. Let's voyage further into the source system extraction logic for each of these tables and understand the flow briefly.

    EBS: In EBS, Currency Data is stored in the GL_DAILY_RATES table. As the name indicates, GL_DAILY_RATES has data at a daily level. However, in our warehouse we store the data with a Date Range and insert a new range record only when the Exchange Rate changes for a particular From Currency, To Currency and Rate Type. Below are the main logical steps that we employ in this process:

    0. (Incremental flow only) Clean up the data in W_EXCH_RATE_G: delete the records which have Start Date > minimum conversion date, and update the End Date of the existing records.
    1. Compress the daily data from the GL_DAILY_RATES table into Range Records. The incremental map uses $$XRATE_UPD_NUM_DAY as an extra parameter.

    2. Generate the Previous Rate, Previous Date and Next Date for each of the daily records from the OLTP.

    3. Filter out the records which have a Conversion Rate the same as the Previous Rate, or where the Conversion Date lies within a single-day range. Mark the records as 'Keep' and 'Filter' and also get the final End Date for the single Range record (a unique combination of From Date, To Date, Rate and Conversion Date).

    4. Filter the records marked as 'Filter' in the INFA map.

    The above steps load W_EXCH_RATE_GS. Step 0 updates/deletes W_EXCH_RATE_G directly, and the SIL map then inserts/updates the GS data into W_EXCH_RATE_G. These steps convert the daily records in GL_DAILY_RATES to range records in W_EXCH_RATE_G. We do not need such special logic for loading W_GLOBAL_EXCH_RATE_G, which stores data at a daily granular level. However, we do need to pivot the data, because data present in multiple rows in the source tables needs to be stored in different columns of the same row in the DW. We use GROUP BY and CASE logic to achieve this.

    Fusion: Fusion has extraction logic very similar to EBS. The only difference is that the cleanup logic mentioned in step 0 above does not use the $$XRATE_UPD_NUM_DAY parameter. In Fusion we bring in all the Exchange Rates in incremental runs as well and do the cleanup; the SIL then takes care of inserts/updates accordingly.

    PeopleSoft: PeopleSoft does not have From Date and To Date explicitly in the source tables. Let's look at an example (please note that this is achieved from PS1 onwards only):

    1 Jan 2010 - USD to INR - 45
    31 Jan 2010 - USD to INR - 46

    PSFT stores records in the above fashion. This means that the exchange rate of 45 for USD to INR is applicable from 1 Jan 2010 to 30 Jan 2010. We need to store data in this fashion in the DW. Also, PSFT stores the exchange rate as RATE_MULT and RATE_DIV, so we need to compute RATE_MULT/RATE_DIV to get the correct exchange rate. We generate the From Date and To Date while extracting data from the source, and this has certain assumptions: if a record gets updated/inserted in the source, it will be extracted in incremental runs, and if this updated/inserted record falls between other dates, we also extract the preceding and succeeding records (based on dates). This is required because we need to generate range records and we have 3 records whose ranges have changed. Taking the same example as above, if a new record is inserted on 15 Jan 2010, the new ranges are 1 Jan to 14 Jan, 15 Jan to 30 Jan, and 31 Jan to the next available date. Even though the 1 Jan and 31 Jan records have not changed, we still extract them because the range is affected. Similar logic is used for Global Exchange Rate extraction: we create the range records and load them into a temporary table, then join to the Day Dimension, create individual records, and pivot the data to get the 5 Global Exchange Rates for each From Currency, Date and Rate Type.

    Siebel: Siebel Facts depend heavily on Global Exchange Rates, and almost none of them really use individual exchange rates. In other words, W_GLOBAL_EXCH_RATE_G is the main table used in Siebel from the PS1 release onwards. As of January 2002, the Euro Triangulation method for converting between currencies belonging to EMU members is not needed for present and future currency exchanges.
    However, the method is still available in Siebel applications, as are the old currencies, so that historical data can be maintained accurately. The following description applies only to historical data needing conversion prior to the 2002 switch to the Euro for the EMU member countries. If a country is a member of the European Monetary Union (EMU), you should convert its currency to other currencies through the Euro. This is called triangulation, and it is used whenever either currency being converted has EMU Triangulation checked. Because of this, there are multiple extraction flows in SEBL, i.e. EUR to EMU, EUR to NonEMU, EUR to DMC and so on. We load W_EXCH_RATE_G through multiple flows with these data; this has been kept the same as in previous versions of OBIA. W_GLOBAL_EXCH_RATE_G, being a new table, does not have such needs. However, SEBL does not have From Date and To Date columns in the source tables, similar to PSFT, so we use extraction logic similar to that explained in the PSFT section for SEBL as well.

    What if all 5 Global Currencies configured are the same?

    As mentioned in previous sections, from PS1 onwards we store Global Exchange Rates in the W_GLOBAL_EXCH_RATE_G table. The extraction logic for this table involves pivoting data from multiple rows into a single row with 5 Global Exchange Rates in 5 columns; as mentioned, we use CASE and GROUP BY functions to achieve this. This approach poses a unique problem when all 5 Global Currencies chosen are the same. For example, if the user configures all 5 Global Currencies as 'USD', then the extract logic will not be able to generate a record for From Currency = USD, because not all source systems will have a USD->USD conversion record. We have _Generated mappings to take care of this case: they generate a record with Conversion Rate = 1 for such cases.

    Reusable Lookups

    Before PS1, we had a Mapplet for currency conversions. In PS1, we only have reusable Lookups: LKP_W_EXCH_RATE_G and LKP_W_GLOBAL_EXCH_RATE_G. These lookups have another layer of logic so that all the lookup conditions are met when they are used in various Fact mappings. Any user who wants to do a lookup on W_EXCH_RATE_G or W_GLOBAL_EXCH_RATE_G should and must use these Lookups; a direct join or lookup on the tables might lead to wrong data being returned.

    Changing Currency Preferences in the Dashboard

    In the 796x series, all amount metrics in OBIA showed the Global1 amount, and the customer needed to change the metric definitions to show them in another currency preference. Project Analytics has supported currency preferences since the 7.9.6 release, though, and it published a Tech Note for customers of other modules to add toggling between currency preferences to the solution.

    List of Currency Preferences

    Starting from the 11.1.1.x release, the BI Platform added a new feature to support multiple currencies. The new session variable (PREFERRED_CURRENCY) is populated through a newly introduced currency prompt. This prompt can take its values from the xml file userpref_currencies_OBIA.xml, which is hosted in the BI Server installation folder, under: <home>\instances\instance1\config\OracleBIPresentationServicesComponent\coreapplication_obips1\userpref_currencies.xml. This file contains the list of currency preferences, like "Local Currency", "Global Currency 1", ..., which customers can also rename to give them more meaningful business names. There are two options for showing the list of currency preferences to the user in the dashboard: Static and Dynamic.
    In Static mode, all users see the full list as it appears in the user preference currencies file. In Dynamic mode, the list shown in the currency prompt drop-down is the result of a dynamic query specified in the same file. Customers can build some security into the rpd, so that the list of currency preferences is based on the user roles. BI Applications built a subject area, "Dynamic Currency Preference", to run this query and give every user only the list of currency preferences required by his application roles.

    Adding Currency to an Amount Field

    When the user selects one of the items from the currency prompt, all the amounts on that page show in the currency corresponding to that preference. For example, if the user selects "Global Currency 1" from the prompt, all data is shown in Global Currency 1 as specified in the Configuration Manager. If the user selects "Local Currency", all amount fields show in the currency of the Business Unit selected in the BU filter of the same page. If no particular Business Unit is selected in that filter, and the data selected by the query contains amounts in more than one currency (for example, one BU has USD as its functional currency and another has EUR), then subtotals will not be available (USD and EUR amounts cannot be added in one field) and, depending on the set-up (see the next paragraph), the user may receive an error.

    There are two ways to add the currency field to an amount metric:

    In the form of a currency code, like USD, EUR, ... For this, the user needs to add the field "Apps Common Currency Code" to the report. This field is in every subject area, usually under the table "Currency Tag" or "Currency Code".

    In the form of a currency symbol ($ for USD, € for EUR, ...). For this, the user needs to format the amount metrics in the report as a currency column, by specifying the currency tag column in the Column Properties option in the Column Actions drop-down list. Typically this column should be the "BI Common Currency Code" available in every subject area. Select the Column Properties option in the Edit list of a metric; in the Data Format tab, select Custom as Treat Number As; then enter the following syntax under Custom Number Format: [$:currencyTagColumn=Subjectarea.table.column], where Column is the "BI Common Currency Code" defined to take the currency code value based on the currency preference chosen by the user in the currency preference prompt.
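    As an aside, the "compress daily rates into range records" step described above for EBS is easy to see in miniature; here is a sketch of the same idea in Python (the real implementation lives in the Informatica mappings, and the field names here are illustrative):

        from datetime import date, timedelta

        def compress_daily_rates(daily_rates):
            """Collapse (day, rate) rows into (start_date, end_date, rate) ranges.

            A new range starts whenever the rate changes or a day is missing,
            mirroring how W_EXCH_RATE_G keeps one row per contiguous rate range."""
            ranges = []
            for day, rate in sorted(daily_rates):
                if ranges and ranges[-1][2] == rate and ranges[-1][1] == day - timedelta(days=1):
                    ranges[-1][1] = day  # extend the current range by one day
                else:
                    ranges.append([day, day, rate])
            return [tuple(r) for r in ranges]

        daily = [(date(2010, 1, 1), 45), (date(2010, 1, 2), 45), (date(2010, 1, 3), 46)]
        print(compress_daily_rates(daily))
        # -> two ranges: 1-2 Jan at rate 45, 3 Jan at rate 46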

    Read the article

  • What is good practice in .NET system architecture design concerning multiple models and aggregates

    - by BuzzBubba
    I'm designing a larger enterprise architecture and I'm in doubt about how to separate the models and design them. There are several points I'd like suggestions for: which models to define, and how to define them. Currently my idea is to define:

    A core (domain) model
    Repositories to get data into that domain model from a database or other store
    A business logic model that would contain business logic, validation logic and more specific forms of data retrieval methods
    View models prepared for specifically formatted data output that would be parsed by views of different kinds (web, Silverlight, etc.)

    For the first model, I'm puzzled about what to use and how to define it. Should the model's entities contain collections, and in what form? IList, IEnumerable or IQueryable collections? I'm thinking of immutable collections, which IEnumerable is, but I'd like to avoid huge data collections and to offer my business logic layer access with LINQ expressions, so that query trees get executed at the data level and retrieve only the data really required - for situations like retrieving a very specific subset of elements from among thousands or hundreds of thousands. What if I have an item with several thousand bids? I can't just make an IEnumerable collection of those on the model and then retrieve an item list in some repository method or even a business model method. Should it be IQueryable, so that I actually pass my queries to the repository all the way from the business logic model layer? Should I just avoid collections in my domain model? Should I avoid only some collections? Should I separate the domain model and the business logic model, or integrate them? Data would be dealt with through repositories, which would use the domain model classes. Should repositories be used directly, using only classes from the domain model as data containers?

    This is an example of what I had in mind. My domain objects would look like (e.g.):

        public class Item
        {
            public string ItemName { get; set; }
            public int Price { get; set; }
            public bool Available { get; set; }

            private IList<Bid> _bids;

            public IQueryable<Bid> Bids
            {
                get { return _bids.AsQueryable(); }
                private set { _bids = value; }
            }

            public void AddNewBid(Bid newBid)
            {
                _bids.Add(new Bid { .... });
            }
        }

    where Bid would be defined as a normal class. Repositories would be defined as data retrieval factories and used to get data into another (business logic) model, which would in turn be used to get data to view models, which would then be rendered by different consumers. I would define IQueryable interfaces for all aggregating collections to gain flexibility and minimize the data retrieved from the real data store. Or should I make the domain model "anemic", with pure data-store entities, and define all collections for the business logic model? One of the most important questions is where to have the IQueryable-typed collections: all the way from the repositories to the business model, or not at all - exposing only solid IList and IEnumerable from the repositories, dealing with more specific queries inside the business model, and having finer-grained methods for data retrieval within the repositories. So, what do you think? Have any suggestions?

    Read the article

  • Issue 15: Oracle PartnerNetwork Exchange @ Oracle OpenWorld

    - by rituchhibber
ORACLE FOCUS: Oracle PartnerNetwork Exchange @ Oracle OpenWorld
Sylvie Michou, Senior Director, Partner Marketing & Communications and Strategic Programs

If you are attending our forthcoming Oracle OpenWorld 2012 conference in San Francisco from 30 September to 4 October, you will discover a new dedicated programme of keynotes and sessions tailored especially for you, our valued partners. Oracle PartnerNetwork Exchange @ OpenWorld has been created to enhance the opportunities for you to learn from and network with Oracle executives and experts. The programme also provides more informal opportunities than ever throughout the week to meet up with the people who are most important to your business: customers, prospects, colleagues and the Oracle EMEA Alliances & Channels management team.

Oracle remains fully focused on building the industry's most admired partner ecosystem, which today spans over 25,000 partners. This new OPN Exchange programme offers an exciting change of pace for partners throughout the conference. Now it will be possible to enjoy a fully integrated, partner-dedicated session schedule throughout the week, as well as key social events such as the Sunday night Welcome Reception, networking lunches from Monday to Thursday at the Howard Street Tent, and a fantastic closing event on the Thursday afternoon.

In addition to the regular Oracle OpenWorld conference schedule, if you have registered for the Oracle PartnerNetwork Exchange @ OpenWorld programme you will be invited to attend a much anticipated global partner keynote presentation, plus more than 40 conference sessions aimed squarely at what's most important to you as partners. Prominent topics for discussion will include: Oracle technologies and roadmaps and how they fit with partners' business plans; business development; regional distinctions in business practices; and much more. Each session will provide plenty of food for thought ahead of the numerous networking opportunities throughout the week, encouraging the knowledge exchange with Oracle executives, customers, prospects and colleagues that will make this conference of even greater value for you.

At Oracle we always work closely with our partners to deliver solution offerings that improve business value, simplify the IT experience and drive innovation and efficiencies for joint customers. The most important element of our new OPN Exchange is content that helps you get more from technology investments, more from your peer-to-peer connections, and more from your interactions with customers. To this end we've created some partner-specific tools which can be used by OPN members ahead of the conference itself. Crucially, a comprehensive Content Catalog already lists and organises details of every OPN Exchange session, speaker, exhibitor, demonstration and related materials. This Content Catalog can be used by all our partners to identify interesting content to add to your own personalised Oracle OpenWorld Schedule Builder, allowing more effective planning and pre-enrolment for vital sessions.
There are numerous highlights that you will definitely want to include in those personal schedules. On Sunday morning, 30 September, we will start the week with partner-dedicated OPN Exchange sessions, followed at 13:00 by our Global Partner Keynote with Judson Althoff, SVP, Worldwide Alliances & Channels and Embedded Sales, and senior executives, giving insight into Oracle's partner vision, strategy and resources, all designed to help build and strengthen market opportunities for you. This will be followed by a number of OPN Exchange general sessions and the Oracle OpenWorld Opening Keynote with Larry Ellison, CEO, Oracle, and the day will conclude with the OPN Exchange AfterDark Welcome Reception, starting at 19:30 at the Metreon.

From Monday 1 to Thursday 4 October, you can attend the OPN Exchange sessions that are most relevant to your business today and over the coming year. Oracle's top product and sales leaders will be on hand to discuss Oracle's strategic direction in more than 40 targeted and in-depth sessions focusing on critical success factors for developing your business. Oracle's dedication to innovation, specialization, enablement and engineering provides Oracle partners with a huge opportunity to create new services and solutions, differentiate themselves and deliver extreme value to joint customers across the globe.

Oracle will even be helping over 1,000 partners to earn OPN Specialization certification during the Oracle OpenWorld OPN Exchange Test Fest, which will provide all the study materials and exams required for Specialization, free of charge, at the conference. You simply need to check the list of current certification tracks available and make sure you pre-register to reserve a seat in one of the ten sessions offered free to OPN Exchange registered attendees.

And finally, let's not forget those all-important networking opportunities, which can so often provide partners with valuable long-term alliances as well as exciting new business leads. The Oracle PartnerNetwork Lounge, located in Moscone South, exhibition hall, room 100, is the place where partners can meet formally or informally with colleagues, customers, prospects and other industry professionals. OPN Specialized partners with OPN Exchange passes can also visit the OPN Video Blogging room to record and share ideas, and at the OPN Information Station you will find consultants available to answer your questions.

"For the first time ever we will have a full partner conference within OpenWorld. OPN Exchange @ OpenWorld will kick off on the first Sunday and run the entire week. We'll have over 40 sessions throughout that time and partners will hear from our top development executives, with special sessions dedicated to partnering throughout. It's going to be a phenomenal event, and we look forward to seeing our partners there." Judson Althoff, SVP, Oracle Worldwide Alliances & Channels and Embedded Sales

So if you haven't done so already, please register for Oracle PartnerNetwork Exchange @ OpenWorld today, or add OPN Exchange to your existing registration for just $100 through My Account.
And if you have any further questions regarding partner activities at Oracle OpenWorld, please don't hesitate to contact the Oracle PartnerNetwork team at [email protected].

-- Oracle Extreme Performance Tour

At each Extreme Performance Tour event, experts will be on hand to share the very latest information about:

- Oracle's SPARC SuperClusters: the latest engineered systems from Oracle, delivering radically improved performance, faster deployment and greatly reduced operational costs for mixed database and enterprise application consolidation
- Oracle's SPARC T4 servers: with the newly developed T4 processor and Oracle Solaris, providing up to five times the single-threaded performance and better overall system throughput for expanded application versatility
- Oracle Database Appliance: a new way to take advantage of the world's most popular database, Oracle Database 11g, in a single, easy-to-deploy-and-manage system. It's a complete package engineered to deliver simple, reliable and affordable database services to small and medium-size businesses and departmental systems. All hardware and software components are supported together and offer customers unique pay-as-you-grow software licensing to quickly scale from two to 24 processor cores without incurring the costs and downtime usually associated with hardware upgrades
- Oracle Exalogic: the world's only integrated cloud machine, featuring server hardware and middleware software engineered together for maximum performance with minimum set-up and operational cost
- Oracle Exadata Database Machine: the only database machine that provides extreme performance for both data warehousing and online transaction processing (OLTP) applications, making it the ideal platform for consolidating onto grids or private clouds. It is a complete package of servers, storage, networking and software that is massively scalable, secure and redundant
- Oracle Sun ZFS Storage Appliances: providing enterprise-class NAS performance, price-performance, manageability and TCO by combining third-generation software with high-performance controllers, flash-based caches and disks
- Oracle Pillar Axiom Quality-of-Service: confidently consolidate storage for multiple applications into a single datacentre storage solution
- Oracle Solaris 11: delivering secure enterprise cloud deployments with the ability to run hundreds of virtual applications with no overhead, co-engineered with other Oracle software products to provide the highest levels of security, manageability and performance
- Oracle Enterprise Manager 12c: Oracle's integrated enterprise IT management product, providing the industry's only complete, integrated and business-driven enterprise cloud management solution
- Oracle VM 3.0: the latest release of Oracle's server virtualisation and management solution, helping to move datacentres beyond server consolidation to improve application deployment and management

Register today and ensure your place at the Extreme Performance Tour! Extreme Performance Tour events are free to attend, but places are limited. To make sure that you don't miss out, please visit Oracle's Extreme Performance Tour website, select the city in which you'd be interested in attending an event, and then click on the 'Register Now' button for that city to secure your place. Each individual city page also contains more in-depth information about your local event, including logistics, the agenda and perhaps even a preview of VIP guest speakers.

    Read the article

  • How to deal with transport level security policy with OSB

    - by Jian Liang
Recently, we received a use case for Oracle Service Bus (OSB) 11g PS4 to consume a web service which is secured by an HTTP transport-level security policy. The WSDL of the remote web service looks like the following, where the wsp:Policy section shows the security policy:

    <?xml version='1.0' encoding='UTF-8'?>
    <definitions xmlns:wssutil="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd"
                 xmlns:wsp="http://schemas.xmlsoap.org/ws/2004/09/policy"
                 xmlns:soap="http://schemas.xmlsoap.org/wsdl/soap/"
                 xmlns:tns="https://httpsbasicauth"
                 xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                 xmlns="http://schemas.xmlsoap.org/wsdl/"
                 targetNamespace="https://httpsbasicauth"
                 name="HttpsBasicAuthService">
      <wsp:UsingPolicy wssutil:Required="true"/>
      <wsp:Policy wssutil:Id="WSHttpBinding_IPartyServicePortType_policy">
        <wsp:ExactlyOne>
          <wsp:All>
            <ns1:TransportBinding xmlns:ns1="http://schemas.xmlsoap.org/ws/2005/07/securitypolicy">
              <wsp:Policy>
                <ns1:TransportToken>
                  <wsp:Policy>
                    <ns1:HttpsToken RequireClientCertificate="false"/>
                  </wsp:Policy>
                </ns1:TransportToken>
                <ns1:AlgorithmSuite>
                  <wsp:Policy>
                    <ns1:Basic256/>
                  </wsp:Policy>
                </ns1:AlgorithmSuite>
                <ns1:Layout>
                  <wsp:Policy>
                    <ns1:Strict/>
                  </wsp:Policy>
                </ns1:Layout>
              </wsp:Policy>
            </ns1:TransportBinding>
            <ns2:UsingAddressing xmlns:ns2="http://www.w3.org/2006/05/addressing/wsdl"/>
          </wsp:All>
        </wsp:ExactlyOne>
      </wsp:Policy>
      <types>
        <xsd:schema>
          <xsd:import namespace="https://proxyhttpsbasicauth" schemaLocation="http://localhost:7001/WS/HttpsBasicAuthService?xsd=1"/>
        </xsd:schema>
        <xsd:schema>
          <xsd:import namespace="https://httpsbasicauth" schemaLocation="http://localhost:7001/WS/HttpsBasicAuthService?xsd=2"/>
        </xsd:schema>
      </types>
      <message name="echoString">
        <part name="parameters" element="tns:echoString"/>
      </message>
      <message name="echoStringResponse">
        <part name="parameters" element="tns:echoStringResponse"/>
      </message>
      <portType name="HttpsBasicAuth">
        <operation name="echoString">
          <input message="tns:echoString"/>
          <output message="tns:echoStringResponse"/>
        </operation>
      </portType>
      <binding name="HttpsBasicAuthSoapPortBinding" type="tns:HttpsBasicAuth">
        <wsp:PolicyReference URI="#WSHttpBinding_IPartyServicePortType_policy"/>
        <soap:binding transport="http://schemas.xmlsoap.org/soap/http" style="document"/>
        <operation name="echoString">
          <soap:operation soapAction=""/>
          <input><soap:body use="literal"/></input>
          <output><soap:body use="literal"/></output>
        </operation>
      </binding>
      <service name="HttpsBasicAuthService">
        <port name="HttpsBasicAuthSoapPort" binding="tns:HttpsBasicAuthSoapPortBinding">
          <soap:address location="https://localhost:7002/WS/HttpsBasicAuthService"/>
        </port>
      </service>
    </definitions>

The security assertion in the WSDL (the wsp:Policy element above) indicates that this is an HTTP transport-level security policy which requires one-way SSL with default authentication (i.e. basic authentication with username/password). Normally, there are two ways to handle web service security policies with OSB 11g:

- Use WebLogic 9.x policies
- Use OWSM

Since OSB doesn't support WebLogic 9.x WSSP transport-level assertions (except for the WS transport), when we tried to create the business service based on the imported WSDL, OSB complained with the following message:

    [OSB Kernel:398133] The service is based on WSDL with Web Services Security Policies that are not natively supported by Oracle Service Bus. Please select OWSM Policies - From OWSM Policy Store option and attach equivalent OWSM security policy.
For the business service, you can either add the necessary client policies manually by clicking the Add button, or let Oracle Service Bus automatically pick and add compatible client policies by clicking the Add Compatible button. Unfortunately, when we tried OWSM we couldn't find an http_token_policy, because OSB PS4 doesn't support the OWSM http_token_policy. It seems we ran into an unsupported situation where no appropriate policy is available from either WebLogic or OWSM.

As this security policy requires one-way SSL with basic authentication at the transport level, a possible workaround is to meet the remote service's requirement at the transport level without using a web service policy. We can simply use OSB to establish the SSL connection and provide the username/password for authentication at the transport level to the remote web service. In this case, the business service within OSB is transparent to the web service policy. However, we still need to deal with the OSB console's complaint about the unsupported security policy, because the failure of WSDL validation prevents the OSB console from moving forward. With help from the OSB Product Management team, we finally came up with the following solutions.

Solution 1: OSB PS5
The good news is that the http_token_policy is made available in OSB PS5. With OSB PS5, you can simply add the OWSM oracle/wss_http_token_over_ssl_client_policy to the business service. The simplest solution is to upgrade to OSB PS5, where the OWSM solution is provided out of the box. But if you are not in a position where upgrading is an immediate option, you might want to consider the two workaround solutions described below.

Solution 2: Modifying the WSDL
This solution addresses the OSB console's complaint by removing the security policy from the imported WSDL within OSB. Without the security policy, the OSB console allows the business service to be created based on the modified WSDL. Please bear in mind that the WSDL is modified only on the OSB side via the OSB console; no change is required on the remote web service. The main steps of this solution:

- Connect to the OSB console
- Import the remote WSDL into OSB
- Remove the security assertion (the wsp:Policy element shown above) from the imported WSDL
- Create a service account; in our sample, we simply take the user weblogic
- Create the business service, check "Basic" for Authentication, and select the created service account
- Make sure that OSB consumes the web service via https

This solution requires modifying the WSDL. It is suitable for any OSB version (10g or 11g) prior to PS5 without OWSM. However, modifying the WSDL by hand is troublesome, as it requires the user to remember that the original WSDL was edited. It forces you to make the same edit each time you want to re-import the service WSDL when changes occur at the service level. It also prevents you from using UDDI to import the WSDL.

Solution 3: Using the original WSDL
This solution keeps the WSDL intact and ignores the embedded policy by using OWSM. By design, OWSM doesn't like WSDLs with embedded security assertions. Since OWSM doesn't provide a feature to explicitly ignore the embedded policy of a remote WSDL, in this solution we use OWSM in a tricky way to ignore the embedded policy.
- Connect to the OSB console
- Import the remote WSDL into OSB
- Create a service account
- Create the business service, check "Basic" for Authentication, and select the created service account
- As the imported WSDL is intact, the OSB Kernel:398133 error is expected; ignore the error message for the moment and navigate to the Policies page of the business service
- Select "From OWSM Policy Store" and click the "Add" button; the list of policies will pop up
- Here is the tricky part: select an arbitrary policy, and click "Cancel"
- Update and save

By clicking the "Cancel" button, we didn't add any OWSM policy to the business service, but the embedded policy is now ignored. Yes, this is tricky. According to the Oracle OSB Product Manager, a future release of OWSM will add a "None" button which allows the embedded policy to be ignored explicitly. This solution keeps the imported WSDL intact, which is the big advantage over solution 2. It is suitable for an OSB 11g domain (version prior to PS5) with OWSM configured.

This blog addressed an unsupported transport-level web service security policy with OSB PS4. To summarize: if you are using OSB PS5, or are in a position to upgrade to PS5, the recommendation is to use the OWSM out-of-the-box transport-level security policy directly. With releases prior to 11g PS5, you can consider solution 2 or 3, depending on whether OWSM is configured.

    Read the article

  • Oracle Coherence & Oracle Service Bus: REST API Integration

    - by Nino Guarnacci
This post aims to highlight one of the features of Oracle Coherence that allows it to be easily added to and integrated into a wide variety of projects: the REST API exposed by Coherence nodes, through which you can interact with the in-memory data grid.

Oracle Coherence and Oracle Service Bus are natively integrated through a feature of Oracle Service Bus which allows you to use the Coherence grid cache during the configuration of a business service. This feature lets you use an intermediate cache layer to retrieve the answers from previous invocations of the same service, without necessarily having to invoke the real business service again. Directly from the web console of Oracle Service Bus, you can define the eviction policies for the cached objects/answers and the discriminating parameters that identify their uniqueness.

The Coherence REST APIs, however, allow you to integrate both products for other needs, enabling new architectural designs. Consider a Coherence node as a simple service which interoperates through standard protocols, and in particular REST (with JSON and XML): a company-wide shared service offering a centralized "map and reduce" capability which you can access via a huge variety of protocols (transports and envelopes). An amazing step forward for those who still imagine connectors and custom code. This type of integration does not require writing custom code or a complex implementation to be self-supported. The added value is made unique by the incredible value of both products independently, and still more by their simple and robust integration.

As already mentioned, this scenario opens a hidden door behind the columns of these two products, leading to new ideas and perspectives for enterprise architectures that increasingly wink at next-generation applications: simple and dynamic, perhaps toward mobile and web 2.0.

Below is a small and simple demo, useful to demonstrate how easy it is to integrate these two products using the Coherence REST API, and also intended to stimulate your imagination toward new architectural approaches. The idea is to create a centralized alerting system, fed easily from any of the company's applications regardless of the technology with which they were built, and then to use a standard representation protocol, RSS, through a service exposed by the Service Bus, so you can browse and search only the alerts that interest you, by category, author, title, date, and so on. The steps needed to implement this system are very simple and few. They are listed and described below so they can easily be replicated within your environment. Remember that the demo is only meant to demonstrate how easy it is to integrate Oracle Coherence and Oracle Service Bus.

1) Install the two products used in this demo (if necessary, consult the installation guides of the two products):
- Oracle Service Bus ver. 11.1.1.5.0: http://www.oracle.com/technetwork/middleware/service-bus/downloads/index.html
- Oracle Coherence ver. 3.7.1: http://www.oracle.com/technetwork/middleware/coherence/downloads/index.html

2) Because we chose to create a centralized alerting system, we need to define a type containing the alerting attributes used to preserve and organize the information of the various alerts sent by the different applications.
A Java class named Alert was built to hold the canonical properties of an alert:
- Title
- Description
- System
- Time
- Severity

3) We then need to create two configuration files for the Coherence node, in order to save Alert objects within the grid through the REST/HTTP protocol (in addition to the native APIs for Java, C++ and .NET). The two minimal configuration files for Coherence are coherence-rest-config.xml and resty-server-config.xml (the original post showed their contents as images; a hedged reconstruction appears just below). This minimal configuration lets us use a distributed cache named "alerts" that can also be accessed via HTTP/REST on host "localhost" over port "8080"; objects are of type oracle.cohsb.Alert.

4) A simple Java class represents the alert messages (also sketched below).

5) At this point we just need to start up our Coherence node, able to listen over HTTP to manage the "alerts" cache, which will receive incoming XML or JSON objects of type Alert. Remember to include in the classpath of the Coherence node the Alert class, the Coherence libraries and the two configuration files above. Then run the Coherence node class com.tangosol.net.DefaultCacheServer, setting the following parameters:

    -Dtangosol.coherence.log.level=9
    -Dtangosol.coherence.log=stdout
    -Dtangosol.coherence.cacheconfig=[PATH_TO_THE_FILE]\resty-server-config.xml

6) Let's create a procedure to test our Coherence configuration and insert some custom alerts into our cache. The technology used to implement this is almost irrelevant (JavaScript, Python, Ruby, Scala, C++, Java, ...), because the protocol used to communicate with Coherence is simply HTTP with JSON or XML payloads. For this little demo I chose Java: a method to send/put an alert to the cache, a method to query and view the content of the cache, and a main method that executes both (the original post showed this code as images; a hedged sketch follows later, after step 8's testing section). No special library is needed in the classpath for this class. When executed, it asks for some information such as title, description and so on, in order to compose and send an alert to the cache, and then it queries the same cache. A good exercise at this point would be to create the same procedure using other technologies, such as a simple HTML page containing some JavaScript code, or Python, Ruby, and so on.

7) Now we are ready to start configuring the Oracle Service Bus in order to integrate the two products. First we integrate the internal alerting system of Oracle Service Bus with our centralized alerting system based on the Coherence node. This ensures that from monitoring, or directly from within our proxy message flow, we can throw alerts and save them directly into the Coherence node. To do this I chose JMS, natively present inside Oracle WebLogic / Service Bus. Access the Oracle WebLogic administration console and create and configure a new JMS connection factory and a new JMS destination (queue). Then create a new resource of type "alert destination" within our Oracle Service Bus project, configured to use the newly created JMS connection factory and JMS destination.
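Since the original post showed the two configuration files and the Alert class only as screenshots, what follows is a hedged reconstruction, not the author's exact files. It is based only on details stated in the post (cache name "alerts", host "localhost", port 8080, value class oracle.cohsb.Alert) and on the general shape of Coherence 3.7.1 REST configuration; the String key type, scheme names, and exact element names are assumptions that should be verified against the Coherence REST documentation linked at the end of the article.

A possible coherence-rest-config.xml, mapping the cache to its key and value types for the REST marshallers:

    <rest>
      <resources>
        <resource>
          <cache-name>alerts</cache-name>
          <key-class>java.lang.String</key-class>
          <value-class>oracle.cohsb.Alert</value-class>
        </resource>
      </resources>
    </rest>

A possible resty-server-config.xml, defining the distributed cache and an HTTP acceptor on localhost:8080:

    <cache-config>
      <caching-scheme-mapping>
        <cache-mapping>
          <cache-name>alerts</cache-name>
          <scheme-name>alerts-distributed</scheme-name>
        </cache-mapping>
      </caching-scheme-mapping>
      <caching-schemes>
        <distributed-scheme>
          <scheme-name>alerts-distributed</scheme-name>
          <backing-map-scheme>
            <local-scheme/>
          </backing-map-scheme>
          <autostart>true</autostart>
        </distributed-scheme>
        <proxy-scheme>
          <service-name>RestHttpProxyService</service-name>
          <acceptor-config>
            <http-acceptor>
              <local-address>
                <address>localhost</address>
                <port>8080</port>
              </local-address>
            </http-acceptor>
          </acceptor-config>
          <autostart>true</autostart>
        </proxy-scheme>
      </caching-schemes>
    </cache-config>

And a plain Java bean carrying the five attributes listed in step 2:

    package oracle.cohsb;

    import java.io.Serializable;

    // Simple bean holding the alert attributes from step 2.
    public class Alert implements Serializable {

        private String title;
        private String description;
        private String system;
        // 'time' is kept as a String for simple JSON marshalling;
        // the original class may well have used a date type instead.
        private String time;
        private String severity;

        public Alert() { }

        public String getTitle() { return title; }
        public void setTitle(String title) { this.title = title; }

        public String getDescription() { return description; }
        public void setDescription(String description) { this.description = description; }

        public String getSystem() { return system; }
        public void setSystem(String system) { this.system = system; }

        public String getTime() { return time; }
        public void setTime(String time) { this.time = time; }

        public String getSeverity() { return severity; }
        public void setSeverity(String severity) { this.severity = severity; }
    }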
Finally, in order to dequeue the alert messages from our JMS destination and send them to our Coherence node, we just need to create a new business service and proxy service within our Oracle Service Bus project. Our business service is responsible for sending a message to our Coherence REST service, using PUT as the HTTP method. Our proxy service has to collect all messages enqueued on the destination and execute an XQuery transformation on them, in order to translate them into valid XML Alert objects suitable to be sent to our Coherence service through the newly created business service. (The original post showed the message flow pipeline containing the XQuery transformation as an image.)

Incredibly, we have just done a basic first integration between the native alerting system of Oracle Service Bus and our centralized alerting system simply by configuring our Coherence node, without developing anything. It's time to test it out. To do this I created a proxy service that generates an alert using our "alert destination" whenever the proxy is invoked. After a few invocations of this proxy, generating fake alerts, we can open an Internet browser and type the URL http://localhost:8080/alerts/ to see what has been inserted into the Coherence node.

8) We are ready for the final step. We create a new message flow that can be used to search and display the results in a standard way. For this I chose the standard RSS representation, to display a formatted result on a huge variety of devices, such as readers for the iPhone and Android. The query can already be defined at request time, so as to return only the feed items related to our needs. To do this we need to create a new business service, a new proxy service, and finally a new XQuery transformation that takes care of translating the collection of alerts returned by our Coherence node into a nicely formatted, standard RSS 2.0 document.

So we start right from this XQuery resource, which has the task of transforming a collection of alerts (XML) returned from the Coherence node into a well-formatted RSS 2.0 feed; then our new business service, which searches the alerts on our Coherence node using the REST API; and finally our last resource, the proxy service, exposed as an RSS feed to various mobile devices and traditional web readers, in which we intercept any search query and transform the result returned by the business service into an RSS 2.0 feed. (The original post showed the message flow with the transformation phase, Alert to Feed Items, as an image.)

Finally, some little tricks to follow during the routing to the business service:
- check for any query present in the URL that requests a subset of alerts
- set the HTTP header "Accept" to get an answer in XML instead of JSON

In our little demo we also statically added some Coherence parameters to the request: sort=time:desc;start=0;count=100. This asks Coherence to sort the results by date and to return from the first result up to a maximum of 100. Done! Just incredible: our centralized alerting system is ready.
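For readers who would rather script the test from step 6 than use a browser, below is a minimal, hedged sketch of such a client, using only the JDK's HttpURLConnection and hand-built JSON (the original code was shown as screenshots). The URL layout (PUT to /alerts/{key}, GET on /alerts/ for the whole cache) follows the Coherence REST conventions used elsewhere in this post; the key name and the JSON property names (which must match the Alert bean's accessors) are illustrative assumptions.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Hedged sketch: PUT one alert into the "alerts" cache over REST, then read the cache back.
    public class AlertRestClient {

        private static final String BASE = "http://localhost:8080/alerts";

        public static void main(String[] args) throws Exception {
            String json = "{\"title\":\"Javascript error\","
                    + "\"description\":\"Unexpected error on page load\","
                    + "\"system\":\"Web Browser\","
                    + "\"time\":\"2012-05-01T10:00:00\","
                    + "\"severity\":\"Fatal\"}";
            putAlert("alert-001", json);   // key "alert-001" is an arbitrary example
            System.out.println(getAlerts());
        }

        // PUT /alerts/{key} stores one JSON-marshalled Alert in the cache.
        static void putAlert(String key, String json) throws Exception {
            HttpURLConnection con = (HttpURLConnection) new URL(BASE + "/" + key).openConnection();
            con.setRequestMethod("PUT");
            con.setRequestProperty("Content-Type", "application/json");
            con.setDoOutput(true);
            OutputStream out = con.getOutputStream();
            out.write(json.getBytes("UTF-8"));
            out.close();
            System.out.println("PUT " + key + " -> HTTP " + con.getResponseCode());
            con.disconnect();
        }

        // GET /alerts/ returns the cache content; the Accept header selects JSON or XML.
        static String getAlerts() throws Exception {
            HttpURLConnection con = (HttpURLConnection) new URL(BASE + "/").openConnection();
            con.setRequestMethod("GET");
            con.setRequestProperty("Accept", "application/json");
            BufferedReader in = new BufferedReader(new InputStreamReader(con.getInputStream(), "UTF-8"));
            StringBuilder sb = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) {
                sb.append(line);
            }
            in.close();
            con.disconnect();
            return sb.toString();
        }
    }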
The system inherits all the qualities and capabilities of the two products involved, Oracle Coherence and Oracle Service Bus: RASP (Reliability, Availability, Scalability, Performance). Now try using your mobile device, or a normal Internet browser, to access the RSS feed just published. Some URLs you may test:

- Search for the last 100 alerts:
  http://localhost:7001/alarms
- Search for alerts whose time is not null (time is not null):
  http://localhost:7001/alarms?q=time+is+not+null
- Search for alerts whose system property is "Web Browser" (system = 'Web Browser'):
  http://localhost:7001/alarms?q=system+%3D+%27Web+Browser%27
- Search for alerts whose system property is "Web Browser", whose severity is "Fatal" and whose title contains the word "Javascript" (system = 'Web Browser' and severity = 'Fatal' and title like '%Javascript%'):
  http://localhost:7001/alarms?q=system+%3D+%27Web+Browser%27+AND+severity+%3D+%27Fatal%27+AND+title+LIKE+%27%25Javascript%25%27

To compose more complex queries for your own needs, I would suggest reading the chapter of the Coherence documentation on CohQL (the Coherence Query Language): http://download.oracle.com/docs/cd/E24290_01/coh.371/e22837/api_cq.htm

Some useful links:
- Oracle Coherence REST API documentation: http://download.oracle.com/docs/cd/E24290_01/coh.371/e22839/rest_intro.htm
- Oracle Service Bus documentation: http://download.oracle.com/docs/cd/E21764_01/soa.htm#osb
- REST explanation from Wikipedia: http://en.wikipedia.org/wiki/Representational_state_transfer

The whole material for this demo can be downloaded from http://blogs.oracle.com/slc/resource/cosb/coh-sb-demo.zip

Author: Nino Guarnacci.

    Read the article

< Previous Page | 115 116 117 118 119 120 121 122 123 124 125 126  | Next Page >