Search Results

Search found 11720 results on 469 pages for 'enterprise systems'.

Page 60/469

  • What is a best practice tier structure of a Java EE 6/7 application?

    - by James Drinkard
    I was attempting to find a best practice for modeling the tiers in a Java EE application yesterday and couldn't come up with anything current. In the past, say Java 1.4, it was four tiers: Presentation Tier, Web Tier, Business Logic Tier, and DAL (Data Access Layer), which I always considered a tier and not a layer. After working with Web Services and SOA I thought to add in a services tier, but that may fall under 3, the business logic tier. I searched for quite a while and read articles. It seems like Domain Driven Design is becoming more popular, but I couldn't find a diagram of its tier structure. Does anyone have ideas or diagrams on what the proper tier structure is for newer Java EE applications, or is it really the same, just with more items ranked under the four I've mentioned?

    Read the article

  • Architecture advice for converting biz app from old school to new school?

    - by Aaron Anodide
    I've got a WinForms business application that evolved over the past few years. It's forms over data with a number of custom UI experiences tailored to the business, so I don't think it's a candidate to port to something like SharePoint or re-write in LightSwitch (at least not without significant investment). When I started it in 2009 I was new to this type of development (coming from more low-level programming, and my RDBMS knowledge was just slightly greater than what I got from school). Thus, when I was confronted with a business model that operates on a strict monthly accounting cycle, I made the unfortunate decision to create a separate database for each accounting period. Also, when I started I knew DataSets, then I learned Linq2Sql, then I learned EntityFramework. The screens are a mix and match of those. Now, after a few years developing this thing by myself, I've finally got a small team. Ultimately, I want a web front end (for remote access to more straight-up screens with grids of data) and a thick client (for the highly customized interfaces). My question is: can you offer me some broad-strokes architecture advice that will help me formulate a battle plan to convert over to a single database and lay the foundations for my future goals at the same time? Here's a screen shot showing how an older screen uses DataSets and a newer screen uses EF (I'm thinking this might make it more real for someone reading the question - I'm willing to add any amount of detail if someone is willing to help).
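
    One common way to fold the per-period databases into a single schema is to make the accounting period an attribute of the data rather than a physical boundary. Below is a minimal Entity Framework sketch of that idea; the entity and context names are hypothetical, not taken from the question.

        using System;
        using System.Data.Entity;
        using System.Linq;

        public class AccountingPeriod
        {
            public int Id { get; set; }
            public DateTime Start { get; set; }
            public DateTime End { get; set; }
            public bool IsClosed { get; set; }
        }

        public class LedgerEntry
        {
            public int Id { get; set; }
            public int AccountingPeriodId { get; set; }   // replaces "one database per period"
            public virtual AccountingPeriod Period { get; set; }
            public decimal Amount { get; set; }
        }

        public class BizContext : DbContext
        {
            public DbSet<AccountingPeriod> Periods { get; set; }
            public DbSet<LedgerEntry> Entries { get; set; }
        }

        public static class LedgerQueries
        {
            // Both the WinForms client and a future web front end can share queries like this.
            public static IQueryable<LedgerEntry> ForPeriod(BizContext db, int periodId)
            {
                return db.Entries.Where(e => e.AccountingPeriodId == periodId);
            }
        }

    Migrating the existing per-period databases then becomes a one-time import into the shared schema, and the DataSet and Linq2Sql screens can be retired incrementally against the same context.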

    Read the article

  • Siebel BIP Integration

    - by Tim Dexter
    This post is more of a bookmark for me so that I stop bugging the brown stuff out of John, the Siebel-BIP product manager. I have had multiple customers over the past two weeks asking for help around the integration. What is it capable of? How can I allow my users to click a button to run a BIP report? How can I kick off a report from a Siebel workflow? Start right here - this is a great white paper explaining what's now available with the integration using the Siebel Report Business Service. Once you have consumed that from start to finish, get on over to Oracle support and look for the following note, which has code samples and lots of other good stuff! Siebel BI Publisher Reports Business Service (8.1.1.7+) [ID 1425724.1] The Reports Business Service enables BI Publisher reports to be executed from the Siebel application via a Workflow Process, or through scripting. The report is generated in the background by connecting to the BI Publisher server. The report output is stored in the Siebel File System and accessed from the My BI Publisher Reports view. Alternatively, using appropriate methods, the report can be attached to an entity or sent to a particular delivery channel.

    Read the article

  • Updating a database connection password using a script

    - by Tim Dexter
    An interesting customer requirement that I thought was worthy of sharing today. Thanks to James for the requirement, Bryan for the proposed solution, and me for testing the solution and proving it works :0) A customer's implementation of Sarbanes-Oxley requires them to change all database account passwords every 90 days. Today this is scripted, leveraging shell scripts for most of their environments. But how can they manage the BI Publisher connections? Now, the customer is running 11g and therefore using WebLogic on the middle tier, which is the first clue to Bryan's proposed solution. To paraphrase and embellish Bryan's solution a little: why not use a JNDI connection from BIP to the database, then employ the WebLogic scripting engine to make updates to the JNDI data source as needed? BIP is completely uninvolved and, with a little 'timing', users will be completely unaware of the password updates, i.e. change the password when reports are not being executed. Perfect! James immediately tracked down the WLST script that could be used here, http://middlewaremagic.com/weblogic/?p=4261 (thanks Ravish). Now it was just a case of testing the theory. Some steps:

    1. Create the JNDI connection in WLS.
    2. Create the JNDI connection in BI Publisher, pointing to the WLS connection.
    3. Build new data models using, or re-point data sources to use, the JNDI connection.
    4. Create the WLST script to update the WLS JNDI password as needed.
    5. Test!

    Some details. Creating the JNDI connection in WebLogic is pretty straightforward:

    - Log into the console, look for Data Sources under the Services section of the home page and click it.
    - Click New >> Generic Data Source.
    - Give the connection a name. For the JNDI name, prefix it with 'jdbc/' so I have 'jdbc/localdb' - this name is important, you'll need it on the BIP side.
    - Select your db type - this will influence the drivers and information needed on the next page. Being a company man, I'm using an Oracle db. Click Next.
    - Select the driver of choice; there's lots, I know, and you can read about them. I just chose 'Oracle's Driver (Thin) for Instance connections; Versions 9.0.1 and later'. Click Next >> Next.
    - Fill out the db name (SID), server, port, username to connect and password >> Next.
    - Test the config to ensure you can connect >> Next.
    - Now you need to deploy the connection to your BI server; select it and click Next. You're done with the JNDI config.

    Creating the JNDI connection on the Publisher side is covered here. Just remember to use the connection name you created in WLS, e.g. 'jdbc/localdb'. Not gonna tell you how to do this, go read the user guide :0) Suffice to say, it works.

    The WLST script requires a little reading around the subject to understand the scripting engine and how to execute scripts, nicely covered here. However, a bit of googlin' and I found an even easier way of running the script:

        ${ServerHome}/common/bin/wlst.sh updatepwd.py

    where updatepwd.py is my script file; it can be in another directory. As part of the wlst.sh script your environment is set up for you, so it's very simple to execute.

    The nitty gritty: you need to take Ravish's script above and create a file with a .py extension. It's going to need some modification, as he explains on the web page, to make it work in your environment. I played around with it for a while but kept running into errors. The script as is tries to loop through all of your connections and modify the user and passwords for each - not quite what we are looking for. Remember, our requirement is to just update the password for a given connection.

    I also found another issue with the script. WLS 10.x does not allow updates to passwords using clear text, i.e. un-encrypted text, while the server is in production mode. It's a bit much to set it back to development mode, bounce it, change the passwords, bounce it, then change back to production mode and bounce again. After lots of messing about I finally came up with the following:

        #############################################################################
        #
        # Update password for JNDI connections
        #
        #############################################################################

        print("*** Trying to Connect.... *****")
        connect('weblogic','welcome1','t3://localhost:7001')
        print("*** Connected *****")

        edit()
        startEdit()

        print ("*** Encrypt the password ***")
        en = encrypt('hr')
        print "Encrypted pwd: ", en

        print ("*** Changing pwd for LocalDB ***")
        dsName = 'LocalDB'
        print 'Changing Password for DataSource ', dsName
        cd('/JDBCSystemResources/'+dsName+'/JDBCResource/'+dsName+'/JDBCDriverParams/'+dsName)
        set('PasswordEncrypted',en)

        save()
        activate()

    It's pretty simple and you can expand on it to loop through the data sources and change each as needed. I have hardcoded the password into the file, but you can pass it as a parameter using the properties file method. I'm not going to get into the detail of that here, but it's covered with an example here. A couple of points to note:

    1. The change to the password requires a server bounce to get the changes picked up. You can add that to the shell script you will use to call the script above.
    2. The script above needs to be run from the MW_HOME\user_projects\domains\bifoundation_domain directory to get the encryption libraries set correctly.

    My command to run the whole script was:

        d:\oracle\bi_mw\wlserver_10.3\common\bin\wlst.cmd updatepwd.py

    where wlst.cmd is the scripting command line and updatepwd.py was my update password script above. I have not quite spoon-fed everything you need to make it a robust script, but at least you know you can do it and you can work out the rest I think :0)

    Read the article

  • Collapsing Bookmarks

    - by Tim Dexter
    I said I would tackle documenting some of the new features in the 10.1.3.4.1 roll-up patch I mentioned last week. With the patch you can now set the default state of bookmarks (if you create them) in your PDF outputs. Maybe your users prefer to see them all collapsed to the base level, or collapsed to the second level to ease navigation; whatever they need. It's another opportunity for you to look like a star! You of course need to start with a table of contents, then add the convert|copy to bookmarks command. You can then add the new collapse command to set the appropriate level in the bookmarks.

        <?copy-to-bookmark:?>
        <?collapse-bookmark:show;2?>
        <<< Table of Contents >>>
        <?end convert-to-bookmark?>

    The command allows you to expand or collapse the bookmarks as you need. Of course you will know how many levels you will have in the final output document. The command takes the form:

        <?collapse-bookmark:show|hide;level int?>

    Some examples:

        <?collapse-bookmark:hide;1?>
        <?collapse-bookmark:hide;2?>
        <?collapse-bookmark:hide;3?>

    Sample template and data here. Don't forget you need that 10.1.3.4.1 roll-up!

    Read the article

  • BIP 10.1.3.4.x June 2010 Update Available

    - by Tim Dexter
    A new patchset for 10.1.3.4.0 and 10.1.3.4.1 is available on Metalink. Some notes:

    - The patch number is 9791839.
    - This patchset includes 28 new bug fixes since the last patchset release on March 31.
    - This is a cumulative update that includes all the fixes and enhancements from previous updates. The patch will supersede the other two updates.
    - Install instructions are in the readme inside the patch.
    - There is also a new BIP client patch available, 9821068. No new template building features to my knowledge, but there is an update to the template viewer to allow you to test and debug your shiny new Excel templates.

    Server

    8529759 XMLP_TEMPLATE_DESIGNER CANNOT SAVE / UPLOAD TEMPLATE
    8566455 BI PUBLISHER SCHEDULER DOES NOT START WITH JNDI DATA SOURCE
    9295667 RESPONSE OF GETSCHEDULEDREPORTINFO RETURNS STATUS AS 'UNKNOWN' INSTEAD OF 'SCHED
    9542413 UNABLE TO CREATE A NEW TEMPLATE FROM UI
    9546137 EXCEL ANALYZER TEMPLATE FAILS FOR A STRUCTURED XML WHEN IT IS UPLOADED
    9556338 SIEBEL - BIP PARAMETERS SORT ORDER
    9560562 BI PUBLISHER CACHE DIRECTORY FILLING UP AND POINTING TO INVALID DIRECTORY
    9646599 USER ROLE DEFINED AS PRIMARYGROUP IN ACTIVEDIRECTORY GROUP ARE NOT RECOGNIZED
    9664768 ER: NEED TO BIND USER ATTRIBUTE VALUES DEFINED IN ACTIVEDIRECTORY IN DATA QUERY
    9665075 BI PUBLISHER AFTER 9546699 NOTIFICATIONS FOR REPORTS FAIL
    9669973 ER: NEED TO SUPPORT PRE-PROCESSING XML WITH XSL FOR EXCEL TEMPLATE
    9704401 ER: NEED TO SUPPORT DEFAULT GROUP FOR ALL USERS IN LDAP/AD SECURITY
    9711899 SEARCH PARAMETER IS NOT VISIBLE WHEN SCHEDULE A REPORT
    9753736 SOME ROLES FROM ACTIVEDIRECTORY ARE NOT LISTED IN ADMIN ROLE-FOLDER MAPPING
    9771354 MULTIPLE PARAMETERS IN 10.1.3.4.1 DATA TEMPLATE ACT DIFFERENTLY FROM 10.1.3.
    9772982 "REFRESH OTHER PARAMETERS ON CHANGE" DOESN'T WORK PROPERLY

    Core

    8599646 ER: EXTRA SPACE ADDED BELOW IMAGE IN A TABLE CELL OF TEMPLATE IN FIREFOX
    9377593 SOME ROWS HEIGHT IN HTML/EXCEL OUTPUT ARE TOO BIG IN BI PUBLISHER
    9487030 NAVIGATION TREE REPEATING TWICE IN PDF DOCUMENT CREATED BY BI PUBLISHER
    9509432 PERFORMANCE ISSUE WHEN USING PDF TEMPLATE
    9534424 PS: DOCUMENT-REPEAT-FULLPATH-ELEMENTNAME SHOULDNT USE DOT "." AS PATH SEPARATOR
    9553360 FORMPROCESSOR CANNOT PARSE SOME PDF TEMPLATES
    9554959 TEXT IN AUTOSHAPE IS NOT PROPERLY CUT OFF FOR LINE WRAPPING
    9569417 AFTER APPLYING PATCH 9509432 PDF TEMPLATES WITH DBDRV PRODUCE NO OUTPUT
    9571670 ER: EXCEL TEMPLATE TO SUPPORT XSLT LOGIC AND XSL CUSTOM EXTENTIONS
    9589809 XSL:CALL-TEMPLATE IS MISSING IN GENERATED XSL FILE
    9605920 BOOKMARK TESTCASE FAILED DUE TO ER9283933
    9689634 PRINT FLOW CHART USING ACROSS 3 DOWN 0 GIVES EXTRA BLANK PAGES

    You might have noticed some fixes and enhancements to the Excel templates, so I can get back on those now. There is a part two to the Mapviewer BIP Mashup coming ... just need another 4 hours in the day to squeeze it in.

    Read the article

  • Cool Tools You Can Use: Validation Templates for PeopleSoft Contracts Processes

    - by Mark Rosenberg
    This is the first in a series of postings we'll be making under the heading of Cool Tools You Can Use. Our PeopleSoft product management team identified the need for this series after reflecting on the many conversations we have each year with our PeopleSoft community members. During these conversations, we were discovering that customers and implementation partners were often not aware that solutions exist to the problems they were trying to address, and that the solutions were readily available at no additional charge. Thus, the Cool Tools You Can Use series will describe the business challenge we've heard, the PeopleSoft solution to the challenge, and how you can learn more about the solution so that everyone can be sure to make full use of what PeopleSoft applications have to offer. The first cool tool we'll look at is the Validation Template for PeopleSoft Contracts Process Requests, which was first released in December 2013 as part of PeopleSoft Contracts 9.2 Update Image 4. The business issue our customers highlighted to us is the need to tightly control, but easily configure and manage, the scope of data that any user can process when initiating a process. Control of each user's span of impact is essential to reducing billing reconciliation issues, passing span-of-authority audits, and reducing (or even eliminating) the frequency of unexpected process results.

    [Figure: Setting Up the Validation Template for a PeopleSoft Contracts Process]

    With the validation template, organizations can easily and quickly ensure the software restricts the scope of transactions a user can affect, and it gives organizations the confidence to know that business processes are being governed effectively. Additionally, this control of PeopleSoft Contracts process requests can be applied, maintained and adjusted from a web browser, thereby enabling analysts to administer the rules without having to engage software developers to customize the software. During the field validation template setup, an analyst specifies the combinations of fields that must contain values when a user tries to set up a run control and initiate a PeopleSoft Contracts process from a process request page. For example, for the Process Limits component, an organization could require that users enter a valid combination of values for the business unit, contract, and contract type fields, or a value in the contract administrator field. Until the user enters a valid combination of entries on the process request page, he cannot launch the process. With the validation template activated for process request pages, organizations can be confident that PeopleSoft Contracts users will not accidentally begin generating invoices or triggering other revenue management processes for transactions beyond their scope of authority. To learn more about the Validation Template, please review the Defining Validation Templates section of the PeopleSoft Contracts PeopleBooks.

    Read the article

  • Unity throws SynchronizationLockException while debugging

    - by pjohnson
    I've found Unity to be a great resource for writing unit-testable code, and tests targeting it. Sadly, not all those unit tests work perfectly the first time (TDD notwithstanding), and sometimes it's not even immediately apparent why they're failing. So I use Visual Studio's debugger. I then see SynchronizationLockExceptions thrown by Unity calls, when I never did while running the code without debugging. I hit F5 to continue past these distractions, the line that had the exception appears to have completed normally, and I continue on to what I was trying to debug in the first place. In settings where Unity isn't used extensively, this is just one amongst a handful of annoyances in a tool (Visual Studio) that overall makes my work life much, much easier and more enjoyable. But in larger projects, it can be maddening. Finally it bugged me enough that it was worth researching. Amongst the first and most helpful Google results was, of course, one at Stack Overflow. The first couple of answers were extensive but seemed a bit more involved than I could pull off at this stage in the product's lifecycle. A bit more digging showed that the Microsoft team knows about this bug but hasn't prioritized it into any released build yet. SO users jaster and alex-g proposed workarounds that relieved my pain - just go to Debug|Exceptions..., find the SynchronizationLockException, and uncheck it. As others warned, this will skip over SynchronizationLockExceptions in your code that you want to catch, but that wasn't a concern for me in this case. Thanks, guys; I've used that dialog before, but it's been so long I'd forgotten about it. Now if I could just do the same for Microsoft.CSharp.RuntimeBinder.RuntimeBinderException... Until then, F5 it is.

    Read the article

  • Consolidating Oracle E-Business Suite R12 on Oracle's SPARC SuperCluster

    - by Giri Mandalika
    An Optimized Solution for Oracle E-Business Suite (EBS) R12 12.1.3 - the Oracle Optimized Solution for Oracle E-Business Suite - is now available on oracle.com. This solution was centered around the engineered system, SPARC SuperCluster T4-4. Check the business and technical white papers, along with a bunch of relevant and useful resources, online at the above optimized solution page for EBS.

    What is an Optimized Solution? Oracle's Optimized Solutions are designed, tested and fully documented architectures that are tuned for optimal performance and availability. Optimized Solutions are NOT pre-packaged, fully tuned, ready-to-install software bundles that can be downloaded and installed. An optimized solution is usually a well documented architecture that was thoroughly tested on a target platform. The technical white paper details the deployed application architecture along with various observations, from installing the application on the target platform to its behavior and performance in highly available and scalable configurations.

    Oracle E-Business Suite R12 Use Case: Multiple E-Business Suite R12 12.1.3 application modules were tested in this optimized solution -- Financials (online - Oracle Forms & web requests), Order Management (online - Oracle Forms & web requests) and HRMS (online - web requests & payroll batch). The solution will be updated with additional application modules when they are available. Oracle Solaris Cluster is responsible for the high availability portion of the solution.

    Performance Data: For the sake of completeness, test results were also documented in the optimized solution white paper. Those test results are mainly for educational purposes only. They give a good sense of application behavior under the circumstances in which the application was tested. Since the major focus of the optimized solution is around highly available and scalable configurations, the application was configured to meet those criteria. Hence the documented test results are not directly comparable to any other E-Business Suite performance test results published by any vendor, including Oracle. Such an attempt may lead to skewed, incorrect conclusions.

    Questions & Requests: Feel free to direct your questions to the author of the white papers. If you are a potential customer who would like to test a specific E-Business Suite application module on a non-engineered system such as SPARC T4-X, or an engineered system such as SPARC SuperCluster, contact the Oracle Solution Center.

    Read the article

  • Oracle OnTrack - Is Your Communication on Track?

    - by Philipp Weckerle
    Oracle OnTrack not only lets you hold simple, chat-like conversations, but also lets you embed images and documents, annotate them, and weave voice comments and telephone conversations into the discussions - a system for capturing "conversations" holistically. With integration into Microsoft Outlook, iPhone, iPad and other mobile devices, it offers an easily accessible platform for structured communication. You can find more information here and on oracle.com.

    Read the article

  • Installer Changes for AutoVue 20.2.0 Client/Server Deployment

    - by GrahamOracle
    Those upgrading to AutoVue 20.2.0 Client/Server Deployment will notice a few changes in the installation process as compared to previous releases. The two notable changes are: SSL configuration during the installer: To configure SSL encryption between the AutoVue VueServlet and AutoVue server. User authentication configuration during the installer: To configure Kerberos authentication between the AutoVue client and AutoVue server (for environments where users are not already authenticated to a back-end system). These configurations are optional although recommended. For more information regarding these options, check out Oracle’s KM Note 1437447.1, as well as the AutoVue 20.2.0 Client/Server Deployment documentation (namely the Installation and Configuration Guide).

    Read the article

  • The Nine Cs of Customer Engagement

    - by Michael Snow
    Avoid social media fatigue - learn the Nine Cs of Customer Engagement. Have we hit a social-media plateau? In recent years, social media has evolved from a cool but unproven medium to become the foundation of pragmatic social business and a driver of business value. Yet, time is running out for businesses to make the most out of this channel. This isn't a warning. It's a fact. Join leading industry analyst R "Ray" Wang as he explains how to apply the nine Cs of engagement to strengthen customer relationships. Learn: how to overcome social-media fatigue and make the most of the medium; why engagement is the most critical factor in the age of overexposure; and the nine pillars of successful customer engagement. Register for the eighth Webcast in the Social Business Thought Leaders series today: Thurs., Sept. 20, 2012, 10 a.m. PT / 1 p.m. ET. Presented by R "Ray" Wang, Principal Analyst and CEO, Constellation Research, and Christian Finn, Senior Director, Product Management, Oracle.

    Read the article

  • Should I cache the data or hit the database?

    - by JD01
    I have not worked with any caching mechanisms and was wondering what my options are in the .NET world for the following scenario. We basically have a REST service where the user passes an ID of a category (think folder), and this category may have lots of sub-categories, and each of the sub-categories could have 1000s of media containers (think file reference objects) which contain information about a file that may be on a NAS or SAN server (the files are videos in this case). The relationship between these categories is stored in a database together with some permission rules and metadata about the sub-categories. So from a UI perspective we have a lazy-loaded tree control which is driven by the user clicking on each sub-folder (think of Windows Explorer). Once they come to a URL of the video file, they can then watch the video. The number of users could grow into the 1000s, and the sub-categories and videos could be in the 10000s as the system grows. The question is: should we carry on the way it is currently working, where each request hits the database, or should we think about caching the data? We are using IIS 6/7 and ASP.NET.
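
    For a read-mostly category tree like this, an in-process cache in front of the database is a common first step in .NET. Below is a minimal sketch using System.Runtime.Caching.MemoryCache (available from .NET 4); the repository interface and cache key are assumptions for illustration, not part of the question.

        using System;
        using System.Collections.Generic;
        using System.Runtime.Caching;

        public class CategoryDto
        {
            public int Id { get; set; }
            public string Name { get; set; }
        }

        public interface ICategoryRepository
        {
            // Hypothetical data-access call that actually hits the database.
            IList<CategoryDto> GetSubCategories(int categoryId);
        }

        public class CachedCategoryService
        {
            private readonly ICategoryRepository _repository;
            private readonly MemoryCache _cache = MemoryCache.Default;

            public CachedCategoryService(ICategoryRepository repository)
            {
                _repository = repository;
            }

            public IList<CategoryDto> GetSubCategories(int categoryId)
            {
                string key = "subcats:" + categoryId;
                var cached = _cache.Get(key) as IList<CategoryDto>;
                if (cached != null)
                    return cached;

                // Cache miss: hit the database once, then keep the result for a while.
                var fresh = _repository.GetSubCategories(categoryId);
                _cache.Set(key, fresh, new CacheItemPolicy
                {
                    AbsoluteExpiration = DateTimeOffset.UtcNow.AddMinutes(10)
                });
                return fresh;
            }
        }

    Whether ten minutes of staleness is acceptable depends on how often the tree and the permission rules change; permission checks in particular may still need to go to the database on every request.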

    Read the article

  • Ubiquitous BIP

    - by Tim Dexter
    The last number I heard from Mike and the PM team was that BIP is now embedded in more than 40 Oracle products. That's a lot of products to keep track of and to help out with new releases, etc. It's interesting to see how internal Oracle product groups have integrated BIP into their products. Just as you might when integrating BIP, they have had to make a choice about how to integrate.

    1. Library level - BIP is a pure Java app, and at the bottom of the architecture is a group of Java libraries that expose APIs you can use. They fall into three main areas: data extraction, template processing and formatting, and delivery. There are post-processing capabilities, but those APIs are embedded within the template processing libraries. Taking this integration route, you are going to need to manage templates, data extraction and processing. You'll have your own UI to allow users to control all of this for themselves. Ultimate control, but some effort to build and maintain. I have been trawling some of the products during a coffee break. I found a great post on the reporting capabilities provided by BIP in the records management product within WebCenter Content 11g. This integration falls into the first category: Content Manager looks after the report artifacts itself and provides you the UI to manage and run the reports.

    2. Web Service level - further up in the stack is the web service layer. This is sitting on the BI Publisher server as a set of services; runReport and scheduleReport are the main protagonists. However, you can also manage the reports and users (locally managed) on the server and the catalog itself via the services layer. Taking this route, you still need to provide the user interface to choose reports and run them, but the creation and management of the reports is all handled by the Publisher server. I have worked with a few customers on this approach. The web services provide the ability to retrieve a list of reports the user can access, then the parameters and LOVs for the selected report, and finally a service to submit the report on the server.

    3. Embedded BIP server UI - the final level is not so well supported yet. You can currently embed a report and its various levels of surrounding 'chrome' inside another HTML-based application using a URL. Check the docs here. The look and feel can be customized but again, not easily, nor is it documented. I have messed with running the server pages inside an IFRAME; not bad, but not great. Taking this path should present the least amount of effort on your part to get BIP integrated, but there are a few gotchas you need to get around.

    So, a reasonable amount of choice with varying amounts of effort involved. There is another option coming soon for all you ADF developers out there: the ability to drop a BIP report into your application pages. But that's for another post.

    Read the article

  • Data Transformation Pipeline

    - by davenewza
    I have created some kind of data pipeline to transform coordinate data into more useful information. Here is the shell of the pipeline:

        public class PositionPipeline
        {
            protected List<IPipelineComponent> components;

            public PositionPipeline()
            {
                components = new List<IPipelineComponent>();
            }

            public PositionPipelineEntity Process(PositionPipelineEntity position)
            {
                foreach (var component in components)
                {
                    position = component.Execute(position);
                }
                return position;
            }

            public PositionPipeline RegisterComponent(IPipelineComponent component)
            {
                components.Add(component);
                return this;
            }
        }

    Every IPipelineComponent accepts and returns the same type - a PositionPipelineEntity:

        public interface IPipelineComponent
        {
            PositionPipelineEntity Execute(PositionPipelineEntity position);
        }

    The PositionPipelineEntity needs to have many properties, many of which are unused in certain components and required in others. Some properties will also have become redundant at the end of the pipeline. For example, these components could be executed:

    - TransformCoordinatesComponent: parse the raw coordinate data into a Coordinate type.
    - DetermineCountryComponent: determine and store the country code.
    - DetermineOnRoadComponent: determine and store whether the coordinate is on a road.

    Code:

        pipeline
            .RegisterComponent(new TransformCoordinatesComponent())
            .RegisterComponent(new DetermineCountryComponent())
            .RegisterComponent(new DetermineOnRoadComponent());

        pipeline.Process(positionPipelineEntity);

    The PositionPipelineEntity type:

        public class PositionPipelineEntity
        {
            // Only relevant to the TransformCoordinatesComponent
            public decimal RawCoordinateLatitude { get; set; }

            // Only relevant to the TransformCoordinatesComponent
            public decimal RawCoordinateLongitude { get; set; }

            // Required by all components after TransformCoordinatesComponent
            public Coordinate CoordinateLatitude { get; set; }

            // Required by all components after TransformCoordinatesComponent
            public Coordinate CoordinateLongitude { get; set; }

            // Set in DetermineCountryComponent, not required anywhere.
            // Requires CoordinateLatitude and CoordinateLongitude (TransformCoordinatesComponent)
            public string CountryCode { get; set; }

            // Set in DetermineOnRoadComponent, not required anywhere.
            // Requires CoordinateLatitude and CoordinateLongitude (TransformCoordinatesComponent)
            public bool OnRoad { get; set; }
        }

    Problems: I'm very concerned about the dependency that a component has on properties. The way to solve this would be to create specific types for each component, but the problem then is that I cannot chain them together like this. The other problem is that the order of components in the pipeline matters - there is some dependency - and the current structure does not provide any static or runtime checking for such a thing. Any feedback would be appreciated.
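
    One way to make the stage dependencies explicit is to let each component declare its input and output types, so the compiler enforces the ordering. The rough sketch below is illustrative only; the generic interface and the composition helper are assumptions, not part of the original code.

        using System;

        // Hypothetical strongly-typed pipeline stage: TIn is what the stage needs,
        // TOut is what it produces for the next stage.
        public interface IPipelineComponent<TIn, TOut>
        {
            TOut Execute(TIn input);
        }

        public static class Pipeline
        {
            // Wraps the first stage as a callable pipeline.
            public static Func<TIn, TOut> Start<TIn, TOut>(IPipelineComponent<TIn, TOut> first)
            {
                return input => first.Execute(input);
            }

            // Composes the next stage; its input type must match the previous output type,
            // so a mis-ordered pipeline fails to compile instead of failing at runtime.
            public static Func<TIn, TOut> Then<TIn, TMid, TOut>(
                this Func<TIn, TMid> previous,
                IPipelineComponent<TMid, TOut> next)
            {
                return input => next.Execute(previous(input));
            }
        }

        // Assuming components such as:
        //   TransformCoordinatesComponent : IPipelineComponent<RawPosition, Coordinates>
        //   DetermineCountryComponent     : IPipelineComponent<Coordinates, CountryResult>
        // usage would look like:
        //   var run = Pipeline.Start(new TransformCoordinatesComponent())
        //                     .Then(new DetermineCountryComponent());
        //   var result = run(rawPosition);

    The trade-off is that information needed by later stages has to be carried forward explicitly in each output type, rather than accumulating on one shared entity.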

    Read the article

  • Which pattern is best for large project

    - by shamim
    I have several years of software development experience, but I am not a keen and adroit programmer, so to perform better I need helping hands. Recently I engaged in an ERP project. For this project I want a very effective structure, one that will be easily maintainable with no compromise on performance. The structure below is what is present in my old projects:

    - Entity Layer
    - BusinessLogic Layer
    - DataLogic Layer
    - UI Layer

    (The picture below described how they are internally connected.) For my new project I want to change my project structure, and I want to follow the layers below:

    - Core Layer (common)
    - BLL
    - DAL
    - Model
    - UI

    (Again, the picture described how they are internally connected.) Though after googling, some initial questions are still obscure to me:

    - For the new project I want to use Entity Framework - is it a good idea? Will it increase my project's performance? Will it be more maintainable than the previous structure?
    - What are Entity Framework's core disadvantages and benefits?
    - For my project I need help selecting the best structure. Will my new structure be better than the old one?
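
    A minimal sketch of how the proposed layers might reference each other, with Entity Framework kept inside the DAL; every name here is an illustrative assumption rather than a prescribed structure.

        // Model: shared by DAL and BLL; no EF dependency required here.
        public class Customer
        {
            public int Id { get; set; }
            public string Name { get; set; }
        }

        // DAL: the only layer that knows about Entity Framework.
        public interface ICustomerRepository
        {
            Customer Find(int id);
            void Add(Customer customer);
        }

        public class ErpContext : System.Data.Entity.DbContext
        {
            public System.Data.Entity.DbSet<Customer> Customers { get; set; }
        }

        public class CustomerRepository : ICustomerRepository
        {
            private readonly ErpContext _context;
            public CustomerRepository(ErpContext context) { _context = context; }

            public Customer Find(int id) { return _context.Customers.Find(id); }
            public void Add(Customer customer) { _context.Customers.Add(customer); _context.SaveChanges(); }
        }

        // BLL: business rules, depending only on the repository interface.
        public class CustomerService
        {
            private readonly ICustomerRepository _customers;
            public CustomerService(ICustomerRepository customers) { _customers = customers; }

            public Customer Register(string name)
            {
                var customer = new Customer { Name = name };
                _customers.Add(customer);
                return customer;
            }
        }

        // UI: calls CustomerService and never touches ErpContext directly.

    Keeping EF behind an interface like ICustomerRepository keeps the UI and BLL testable and lets you revisit the data-access choice per module; the cost is the extra plumbing, which only pays off if the project is genuinely large.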

    Read the article

  • Moving all UI logic to Client Side?

    - by Mag20
    Our team originally consisted mostly of server-side developers with minimal expertise in JavaScript. In ASP.NET we used to write a lot of UI logic in code-behind or, more recently, through controllers in MVC. A little while ago two high-level client-side developers joined our team. They can do in HTML/CSS/JavaScript pretty much anything that we could previously do with server-side code and server-side web controls: show/hide controls, do validation, control AJAX refreshing. So I started to think that maybe it would be more efficient to just create a high-level API around our business logic, kinda like the Amazon Fulfillment API: http://docs.amazonwebservices.com/fws/latest/APIReference/, so that client-side developers would fully take over the UI, while server-side developers would only concentrate on business logic. So for an ordering system you would have a high-level API like:

        OrderService.asmx
            CreateOrderResponse CreateOrder(CreateOrderRequest)
            AddOrderItem
            AddPayment
            SubmitPayment
            GetOrderByID
            FindOrdersByCriteria
            ...

    There would be JSON/REST access to the API, so it would be easy to consume from the client-side UI. We could use this API for both internal UI development and also for 3rd parties to create their own applications. With advances in JavaScript and the availability of good client-side developers, is it a good time to get rid of code-behind/controllers and just concentrate on developing high-level APIs (a la Amazon) that client-side developers can consume?
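
    As a rough illustration of the JSON/REST face such a service layer might have, here is a minimal ASP.NET Web API sketch; the controller, DTO and IOrderService names are assumptions for illustration, not the team's actual design.

        using System.Web.Http;

        public class CreateOrderRequest { public int CustomerId { get; set; } }
        public class CreateOrderResponse { public int OrderId { get; set; } }
        public class OrderDto { public int OrderId { get; set; } public string Status { get; set; } }

        // Hypothetical business-layer contract owned by the server-side developers.
        public interface IOrderService
        {
            CreateOrderResponse CreateOrder(CreateOrderRequest request);
            OrderDto GetOrderById(int id);
        }

        // Thin HTTP facade: all presentation behaviour lives in the client-side code.
        public class OrdersController : ApiController
        {
            private readonly IOrderService _orders;

            public OrdersController(IOrderService orders)
            {
                _orders = orders;
            }

            [HttpPost]
            public CreateOrderResponse CreateOrder(CreateOrderRequest request)
            {
                return _orders.CreateOrder(request);
            }

            [HttpGet]
            public OrderDto GetOrderById(int id)
            {
                return _orders.GetOrderById(id);
            }
        }

    The same endpoints can then serve both the internal JavaScript UI and any third-party consumers, which is essentially the Amazon-style split described above.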

    Read the article

  • Advice on how to understand in general and in practice IT Infrastructure

    - by Luca
    My IT knowledge resides mainly in software development. I have just some basic know-how about networks. On the net I tried to get information and read books in order to better understand the overall IT infrastructure, but all the sources I found are either too generic or, mostly, too detailed in just one aspect, which leaves me lost. Could anyone suggest some books or web resources that strike a good compromise on detail? My goal would be to be able to understand the network issues and security threats when, for example, two remote systems have to be set up and connected to each other. Considering this scenario there are several aspects to consider: firewalls, intranet/internet interconnections, certificates for HTTPS, etc. The subject is quite wide, therefore I was looking for something that might help me start to understand it without going too deep into details (Computer Networks by A. Tanenbaum is a wonderful milestone, but way too specific for my scope). Thanks

    Read the article

  • Advanced Charts Part I

    - by Tim Dexter
    Yeeeep! Another series looms ... this one could stretch out a bit as more options become available. Ever needed to generate something similar to these? They are beyond what BIP can provide today, but there are a few options: one from Oracle and R, now out in the wild, and another out there from JFreeChart, open source and therefore almost free. Of course Google is ever present, and they have been extending their chart support. I blogged the How for Google charts a while back here. Different ways to integrate, but they can all help to close the charting gap for you.

    Read the article

  • Quantifying the Value Derived from Your PeopleSoft Implementation

    - by Mark Rosenberg
    As product strategists, we often receive the question, "What's the value of implementing your PeopleSoft software?" Prospective customers and existing customers alike are compelled to justify the cost of new tools, business process changes, and the business impact associated with adopting the new tools. In response to this question, we have been working with many of our customers and implementation partners during the past year to obtain metrics that demonstrate the value obtained from an investment in PeopleSoft applications. The great news is that as a result of our quest to identify value achieved, many of our customers began to monitor their businesses differently and more aggressively than in the past, and a number of them informed us that they have some great achievements to share. For this month, I'll start by pointing out that we have collaborated with one of our implementation partners, Huron Consulting Group, Inc., to articulate the levers for extracting value from implementing the PeopleSoft Grants solution. Typically, education and research institutions, healthcare organizations, and non-profit organizations are the types of enterprises that seek to facilitate and automate research administration business processes with the PeopleSoft Grants solution. If you are interested in understanding the ways in which you can look for value from an implementation, please consider registering for the webcast scheduled for Friday, December 14th at 1pm Central Time in which you'll get to see and hear from our team, Huron Consulting, and one of our leading customers. In the months ahead, we'll plan to post more information about the value customers have measured and reported to us from their implementations and upgrades. If you have a great story about return on investment and want to share it, please contact either [email protected]  or [email protected]. We'd love to hear from you.

    Read the article

  • Should business services cross bounded contexts?

    - by Paul T Davies
    Firstly, I am following the convention that a bounded context is synonymous with a department, or possibly that one department has one-to-many bounded contexts. We have a client consultancy department that has a Documentation Service. Documents are stored in the Document Store Service (which is where all documents in the company are stored - it is a utility service), and the Documentation Service stores information about each document (a business service). As it was designed for client consultancy, it holds information relevant to them. Now health and safety need somewhere to store information about a document. This is different information to client consultancy's, but I have been instructed to extend the existing service to account for this extra information. I feel this service is now crossing a bounded context. My worry is that all departments will eventually store their information in here and the service will become bloated, trying to be all things to all departments. Each document record will only store a subset of the information because it will only belong to one department. It will get worse when different departments want to store the same information but refer to it in different ways, or when two departments want to store different information that they refer to in the same way. In my understanding, this is exactly the reason for bounded contexts. I feel each department should have its own business service for information about a document, but use the same utility service to actually store the document. What would be the correct approach?
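
    A rough sketch of the split being argued for, assuming C# contracts; the type and property names are illustrative only, not taken from the actual services.

        using System;

        // Shared utility service: stores the physical document for the whole company.
        public interface IDocumentStore
        {
            Guid Save(byte[] content, string fileName);
        }

        // Client consultancy's own view of a document.
        public class ConsultancyDocumentInfo
        {
            public Guid DocumentStoreId { get; set; }   // reference into the shared store
            public string ClientName { get; set; }
            public string EngagementCode { get; set; }
        }

        // Health and safety's own view of (possibly the same) stored document.
        public class HealthAndSafetyDocumentInfo
        {
            public Guid DocumentStoreId { get; set; }
            public string RiskAssessmentReference { get; set; }
            public DateTime NextReviewDate { get; set; }
        }

    Each context keeps its own record type and business service, while both point at the same document in the utility store, which matches the separation the question is leaning towards.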

    Read the article

  • Is a class referencing itself through a static property an anti-pattern in Prism?

    - by Michael Riva
    I have an application and my design approach looks like this:

        class Manager
        {
            public int State;

            static Manager _instance = null;

            public static Manager Instance
            {
                get { return _instance; }
                set
                {
                    if (_instance == value)
                        return;
                    _instance = value;
                }
            }

            public Manager()
            {
                State = 0;
                Instance = this;
            }
        }

        class Module1
        {
            public void GetState()
            {
                Console.WriteLine(Manager.Instance.State);
            }
        }

        class Module2
        {
            public void GetState()
            {
                Console.WriteLine(Manager.Instance.State);
            }
        }

        class Module3
        {
            public void GetState()
            {
                Console.WriteLine(Manager.Instance.State);
            }
        }

    The Manager class is already registered in the Bootstrapper like this:

        protected override void ConfigureContainer()
        {
            base.ConfigureContainer();
            Container.RegisterType<Manager>(new ContainerControlledLifetimeManager());
        }

        protected override void InitializeModules()
        {
            Manager man = Container.Resolve<Manager>();
        }

    The question is: do I need to define my Manager object as a static field like this to be able to reach its state? Or is this an anti-pattern, or bad for performance?
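
    Since Manager is already registered with a ContainerControlledLifetimeManager, an alternative worth sketching is to let Unity hand that single instance to each module through its constructor, instead of the class publishing itself via a static Instance property. The sketch below is illustrative; only the Manager and Module names come from the question.

        using System;

        public class Manager
        {
            public int State { get; set; }
        }

        public class Module1
        {
            private readonly Manager _manager;

            // Unity injects the one container-controlled Manager instance here.
            public Module1(Manager manager)
            {
                _manager = manager;
            }

            public void GetState()
            {
                Console.WriteLine(_manager.State);
            }
        }

        // In the bootstrapper (as in the question):
        //   Container.RegisterType<Manager>(new ContainerControlledLifetimeManager());
        //   var module = Container.Resolve<Module1>();   // receives the same Manager every time

    This removes the hidden coupling of "Instance = this" in the constructor and makes the modules easy to test with a substitute Manager; the performance difference either way is negligible.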

    Read the article

  • De-facto standards for customer information record

    - by maasg
    I'm currently evaluating a potential new project that involves creating a DB for typical customer information (user id, password, first & last name, email, address, telephone number, ...). At this point, the requirements are only roughly defined. The customer DB is expected to be in the O(millions) of records. In order to calculate some back-of-the-envelope numbers for DB sizing and evaluate potential DB options & architectures, I'm looking for some de-facto standards for these kinds of records. In particular, the standard size of every field (first name, last name, address, ...) or a typical average for a simple customer record would be great info. With so many e-commerce websites out there, there should be some kind of typical config that can be reused to avoid re-inventing the wheel. Any ideas?
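
    There is no formal standard for this, but a back-of-the-envelope sketch along the following lines is a common starting point. Every length below is an assumption to be adjusted for your own requirements, not a reference value (the 254-character email figure is the usual practical bound from the email RFCs).

        // Illustrative per-field budget for a simple customer record (all lengths are assumptions).
        public static class CustomerRecordEstimate
        {
            public const int UserId = 36;         // e.g. a UUID stored as text
            public const int PasswordHash = 60;   // e.g. a bcrypt-style hash
            public const int FirstName = 50;
            public const int LastName = 50;
            public const int Email = 254;
            public const int Address = 200;
            public const int Telephone = 20;

            public static int ApproximateBytesPerRow()
            {
                // Assume a single-byte encoding plus ~30 bytes of row overhead; double it for UTF-16 columns.
                return UserId + PasswordHash + FirstName + LastName + Email + Address + Telephone + 30;
            }
        }

        // Roughly 700 bytes per row * 10 million rows is on the order of 7 GB before indexes,
        // which mainstream relational databases handle comfortably on a single node.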

    Read the article

  • Inventory Consignment Flow

    - by ipohfly
    Not sure whether this is the right place to ask this question, but here goes... Currently I have a requirement to add support for consignment transactions in our inventory module. I have a very limited understanding of what consignment means in inventory, i.e. the Customer gets stock/products from the Seller without actually buying them; the product just resides in the Customer's inventory but is still owned by the Seller. Only when the Customer actually buys the stock is ownership transferred. The issue is I can't imagine how the data will be presented to both the Customer and the Seller. What I know is that I would need to deduct the stock from the Seller's inventory when the Customer raises a request to get the stock through consignment, but what about the 'ownership' of the stock/products? Does that mean I would need to create another column in my table to state, for each inventory record, who owns it? Is there anywhere I can get information on how I should work out an inventory module like this? Thanks.
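
    One hedged way to model it is to track physical location and legal ownership as separate attributes, so a consignment only changes the location while the owner stays the Seller. The C# sketch below is illustrative; none of the names come from an actual inventory product.

        using System;

        // One row per quantity of a product held at some party's location on some owner's behalf.
        public class InventoryBalance
        {
            public int ProductId { get; set; }
            public int LocationPartyId { get; set; }  // whose warehouse/shop the stock physically sits in
            public int OwnerPartyId { get; set; }     // who legally owns it (stays the Seller on consignment)
            public decimal Quantity { get; set; }
        }

        public static class ConsignmentTransactions
        {
            // Consignment issue: stock moves to the Customer's location, ownership unchanged.
            public static InventoryBalance Consign(int productId, int sellerId, int customerId, decimal qty)
            {
                return new InventoryBalance
                {
                    ProductId = productId,
                    LocationPartyId = customerId,
                    OwnerPartyId = sellerId,
                    Quantity = qty
                };
            }

            // Sale of consigned stock: only at this point does ownership pass to the Customer.
            public static void Sell(InventoryBalance consigned, decimal qty)
            {
                if (qty > consigned.Quantity)
                    throw new InvalidOperationException("Not enough consigned stock.");
                consigned.Quantity -= qty;
                // A real module would also create an owned-by-customer balance and the
                // matching financial transaction here.
            }
        }

    Reporting then answers both sides from the same table: the Seller's view filters on OwnerPartyId, the Customer's view on LocationPartyId.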

    Read the article
