Search Results

Search found 8925 results on 357 pages for 'customer care and billing'.


  • Dynamically changing DVT Graph at Runtime

    - by mona.rakibe(at)oracle.com
    I recently came across a requirement where the customer wanted to change the graph type at run-time based on a selection. After some internal discussions we realized this is best achieved by using af:switcher to toggle between multiple graphs. In this blog I will share the sample I built to demonstrate this. [Note]: In the sample below, every time the user changes the graph type there is a server trip, so please use this approach with the performance implications in mind. This sample can be downloaded from DynamicGraph.zip
    Set-up: Create a BAM data control using the Employees DO (sample) (refer to this entry).
    Steps to create the view:
    1. Create a new JSF page.
    2. From the Component Palette, drag and drop "Select One Radio" onto this page, enter a label and click "Finish".
    3. In the Property Editor, set the "AutoSubmit" property to true.
    4. Now drag and drop "Switcher" from the Component Palette onto this page.
    5. In the Structure pane, select the af:switcher, right-click and surround it with "PanelGroupLayout".
    6. In the Property Editor, set the "PartialTriggers" property of the PanelGroupLayout to the id of the af:selectOneRadio.
    7. Again in the Structure pane, select the af:switcher, right-click and select "Insert inside af:switcher > facet", then enter a facet name (e.g. pie).
    8. Repeat the previous step to add another facet (e.g. bar).
    9. From "Data Controls", drag and drop "Employees > Query" into the pie facet as "Graph > Pie" (Pie: Sales_Number; Slices: Salesperson).
    10. From "Data Controls", drag and drop "Employees > Query" into the bar facet as "Graph > Bar" (Bars: Sales_Number; X-axis: Salesperson).
    11. Now wire the switcher to the af:selectOneRadio using their "facetName" and "value" properties respectively (a sketch of the resulting markup follows below).
    12. Run the page and notice that the graph renders according to the user's selection.
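    For orientation, here is a rough sketch of the kind of page markup these steps produce. The ids, facet names and value binding are illustrative assumptions, not taken from the sample itself:

        <af:selectOneRadio id="graphType" label="Graph type" autoSubmit="true"
                           value="#{viewScope.graphType}">
          <af:selectItem label="Pie" value="pie"/>
          <af:selectItem label="Bar" value="bar"/>
        </af:selectOneRadio>
        <af:panelGroupLayout id="pgl1" partialTriggers="graphType">
          <af:switcher facetName="#{viewScope.graphType}" defaultFacet="pie">
            <f:facet name="pie">
              <!-- the dvt:pieGraph dropped from the data control goes here -->
            </f:facet>
            <f:facet name="bar">
              <!-- the dvt:barGraph dropped from the data control goes here -->
            </f:facet>
          </af:switcher>
        </af:panelGroupLayout>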

    Read the article

  • Fraud Detection with the SQL Server Suite Part 1

    - by Dejan Sarka
    While working on different fraud detection projects, I developed my own approach to the solution for this problem. In my PASS Summit 2013 session I am introducing this approach. I also wrote a whitepaper on the same topic, which was generously reviewed by my friend Matija Lah. In order to spread this knowledge faster, I am starting a series of blog posts which will eventually add up to the whole whitepaper.

    Abstract

    With the massive usage of credit cards and web applications for banking and payment processing, the number of fraudulent transactions is growing rapidly and on a global scale. Several fraud detection algorithms are available within a variety of different products. In this paper, we focus on using the Microsoft SQL Server suite for this purpose. In addition, we will explain our original approach to solving the problem by introducing a continuous learning procedure. Our preferred type of service is mentoring; it allows us to perform the work and consulting together with transferring the knowledge onto the customer, thus making it possible for the customer to continue to learn independently. This paper is based on practical experience with different projects covering online banking and credit card usage.

    Introduction

    Fraud is a criminal or deceptive activity with the intention of achieving financial or some other gain. Fraud can appear in multiple business areas. You can find a detailed overview of the business domains where fraud can take place in Sahin Y., & Duman E. (2011), Detecting Credit Card Fraud by Decision Trees and Support Vector Machines, Proceedings of the International MultiConference of Engineers and Computer Scientists 2011 Vol 1. Hong Kong: IMECS.

    Dealing with fraud includes fraud prevention and fraud detection. Fraud prevention is a proactive mechanism which tries to disable frauds by using previous knowledge. Fraud detection is a reactive mechanism with the goal of detecting suspicious behavior when a fraudster surpasses the fraud prevention mechanism. A fraud detection mechanism checks every transaction and assigns it a probability between 0 and 1 that serves as a score for evaluating whether the transaction is fraudulent. A fraud detection mechanism cannot detect frauds with 100% certainty; therefore, manual transaction checking must also be available. With fraud detection, this manual part can focus on the most suspicious transactions. This way, an unchanged number of supervisors can detect significantly more frauds than could be achieved with traditional methods of selecting which transactions to check, for example with random sampling.

    There are two principal data mining techniques available, both in general data mining and in specific fraud detection techniques: supervised (or directed) and unsupervised (or undirected). Supervised techniques, or data mining models, use previous knowledge. Typically, existing transactions are marked with a flag denoting whether a particular transaction is fraudulent. Customers at some point in time do report frauds, and the transactional system should be capable of accepting such a flag. Supervised data mining algorithms try to explain the value of this flag by using different input variables. When the patterns and rules that lead to frauds are learned through the model training process, they can be used for predicting the fraud flag on new incoming transactions.
    Unsupervised techniques analyze data without prior knowledge, without the fraud flag; they try to find transactions which do not resemble other transactions, i.e. outliers. In both cases, there should be more frauds in the data set selected for checking by using the data mining knowledge than in a data set selected with simpler methods; this is known as the lift of a model. Typically, we compare the lift with random sampling. The supervised methods typically give a much better lift than the unsupervised ones. However, we must use the unsupervised ones when we do not have any previous knowledge. Furthermore, unsupervised methods are useful for controlling whether the supervised models are still efficient. Accuracy of the predictions drops over time. Patterns of credit card usage, for example, change over time. In addition, fraudsters continuously learn as well. Therefore, it is important to check the efficiency of the predictive models with the undirected ones. When the difference between the lift of the supervised models and the lift of the unsupervised models drops, it is time to refine the supervised models. However, the unsupervised models can become obsolete as well. It is also important to measure the overall efficiency of both supervised and unsupervised models over time. We can compare the number of predicted frauds with the total number of frauds, which includes predicted and reported occurrences. For measuring behavior across time, specific analytical databases called data warehouses (DW) and on-line analytical processing (OLAP) systems can be employed. By controlling the supervised models with unsupervised ones and by using an OLAP system or DW reports to control both, a continuous learning infrastructure can be established.

    There are many difficulties in developing a fraud detection system. As has already been mentioned, fraudsters continuously learn, and the patterns change. The exchange of experiences and ideas can be very limited due to privacy concerns. In addition, both data sets and results might be censored, as companies generally do not want to publicly expose actual fraudulent behaviors. Therefore it can be quite difficult, if not impossible, to cross-evaluate the models using data from different companies and different business areas. This fact stresses the importance of continuous learning even more. Finally, the number of frauds in the total number of transactions is small; typically much less than 1% of transactions are fraudulent. Some predictive data mining algorithms do not give good results when the target state is represented with a very low frequency. Data preparation techniques like oversampling and undersampling can help overcome the shortcomings of many algorithms.

    The SQL Server suite includes all of the software required to create, deploy and maintain a fraud detection infrastructure. The Database Engine is the relational database management system (RDBMS), which supports all activity needed for data preparation and for data warehouses. SQL Server Analysis Services (SSAS) supports OLAP and data mining (in version 2012, you need to install SSAS in multidimensional and data mining mode; this was the only mode in previous versions of SSAS, while SSAS 2012 also supports the tabular mode, which does not include data mining). Additional products from the suite can be useful as well. SQL Server Integration Services (SSIS) is a tool for developing extract-transform-load (ETL) applications.
    SSIS is typically used for loading a DW, and in addition, it can use SSAS data mining models for building intelligent data flows. SQL Server Reporting Services (SSRS) is useful for presenting the results in a variety of reports. Data Quality Services (DQS) supports the occasional data cleansing process by maintaining a knowledge base. Master Data Services is an application that helps companies maintain a central, authoritative source of their master data, i.e. the most important data to any organization. For an overview of the SQL Server business intelligence (BI) part of the suite, which includes the Database Engine, SSAS and SSRS, please refer to Veerman E., Lachev T., & Sarka D. (2009). MCTS Self-Paced Training Kit (Exam 70-448): Microsoft® SQL Server® 2008 Business Intelligence Development and Maintenance. MS Press. For an overview of the enterprise information management (EIM) part, which includes SSIS, DQS and MDS, please refer to Sarka D., Lah M., & Jerkic G. (2012). Training Kit (Exam 70-463): Implementing a Data Warehouse with Microsoft® SQL Server® 2012. O'Reilly. For details about SSAS data mining, please refer to MacLennan J., Tang Z., & Crivat B. (2009). Data Mining with Microsoft SQL Server 2008. Wiley. The SQL Server Data Mining Add-ins for Office, a free download for Office versions 2007, 2010 and 2013, bring the power of data mining to Excel, enabling advanced analytics in Excel. Together with PowerPivot for Excel, which is freely downloadable for Excel 2010 and already included in Excel 2013, this brings OLAP functionality directly into Excel, making it possible for an advanced analyst to build a complete learning infrastructure using a familiar tool. This way, many more people, including employees in subsidiaries, can contribute to the learning process by examining local transactions and quickly identifying new patterns.
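    Since the lift measure comes up repeatedly above, here is a minimal sketch of how it can be computed, written in TypeScript with hypothetical field names; it simply compares the fraud rate among model-flagged transactions against the overall fraud rate:

        interface Transaction {
          fraudScore: number; // model output between 0 and 1
          isFraud: boolean;   // reported/confirmed fraud flag
        }

        // Share of fraudulent transactions in a set.
        function fraudRate(transactions: Transaction[]): number {
          if (transactions.length === 0) return 0;
          return transactions.filter(t => t.isFraud).length / transactions.length;
        }

        // Lift of a model at a score threshold: fraud rate among the
        // transactions the model flags, divided by the base fraud rate
        // (which is what random sampling would achieve on average).
        function lift(transactions: Transaction[], threshold: number): number {
          const flagged = transactions.filter(t => t.fraudScore >= threshold);
          const base = fraudRate(transactions);
          return base > 0 ? fraudRate(flagged) / base : 0;
        }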

    Read the article

  • Invitation: HARNESSING THE POWER OF FUSION

    - by mseika
    HARNESSING THE POWER OF FUSION: IMPLEMENT AND EXTEND ORACLE'S NEXT-GENERATION APPLICATIONS TO MEET CHANGING CLIENT NEEDS
    BRUSSELS, BELGIUM, APRIL 23RD - APRIL 24TH, 2012
    The pace of business continues to accelerate. Clients demand solutions that not only meet their needs today, but evolve as quickly as markets, competition and technology. Oracle Fusion Applications can help you to anticipate and satisfy your clients' changing needs. Designed for an era of business disruption, they co-exist with existing IT investments, but leverage new technologies (such as mobility and SOA) and breakthrough cloud computing delivery models as you need them. They also support unprecedented levels of extensibility. To show you how, Oracle's Product Development organization invites you to join an exclusive Fusion Applications presentation and demonstration event for Oracle partners in Europe. This intensive two-day event will illustrate Fusion Applications capabilities that can enhance ROI for your clients across four major product families: Financials, Procurement and Project Portfolio Management (ERP); Customer Relationship Management (CRM); Human Capital Management (HCM); and Supply Chain Management (SCM). Led by Oracle Product Development personnel, this event will also demonstrate how to extend the Fusion Applications user experience, data model, business process and reporting using new Functional Setup and Composer technologies. These can help you address unique client needs without impacting future upgrades. This presentation and demonstration event is intended for consulting business development and delivery personnel. Reserve your seats today for the April 23rd-24th event. To register for this event CLICK HERE. For further information please contact me at [email protected].
    Best regards,
    Paul Thompson, Senior Director, EMEA Alliances and Solutions Partner Programs
    Markku Rouhiainen, Director, Applications Partner Enablement, Western Europe

    Read the article

  • Is there any reason not to go directly from client-side Javascript to a database?

    - by Chris Smith
    So, let's say I'm going to build a Stack Exchange clone and I decide to use something like CouchDB as my backend store. If I use their built-in authentication and database-level authorization, is there any reason not to allow the client-side JavaScript to write directly to the publicly available CouchDB server? Since this is basically a CRUD application and the business logic consists of "Only the author can edit their post", I don't see much of a need to have a layer between the client-side stuff and the database. I would simply use validation on the CouchDB side to make sure someone isn't putting in garbage data, and make sure that permissions are set properly so that users can only read their own _user data. The rendering would be done client-side by something like AngularJS. In essence you could just have a CouchDB server and a bunch of "static" pages and you're good to go. You wouldn't need any kind of server-side processing, just something that could serve up the HTML pages. Opening my database up to the world seems wrong, but in this scenario I can't think of why, as long as permissions are set properly. It goes against my instinct as a web developer, but I can't think of a good reason. So, why is this a bad idea?
    EDIT: Looks like there is a similar discussion here: Writing Web "server less" applications
    EDIT: Awesome discussion so far, and I appreciate everyone's feedback! I feel like I should add a few generic assumptions instead of calling out CouchDB and AngularJS specifically. So let's assume that:
    - The database can authenticate users directly from its hidden store
    - All database communication would happen over SSL
    - Data validation can (but maybe shouldn't?) be handled by the database
    - The only authorization we care about other than admin functions is someone only being allowed to edit their own post
    - We're perfectly fine with everyone being able to read all data (EXCEPT user records which may contain password hashes)
    - Administrative functions would be restricted by database authorization
    - No one can add themselves to an administrator role
    - The database is relatively easy to scale
    - There is little to no true business logic; this is a basic CRUD app
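    For reference, the database-side rule the question leans on can be expressed in CouchDB as a validate_doc_update function stored in a design document. Here is a minimal sketch, assuming a hypothetical "author" field holding the creating user's name (CouchDB runs these as plain JavaScript; TypeScript-style annotations shown for clarity):

        // Rejects any update unless it comes from the document's author
        // (or a server admin). Attached to a design document as
        // validate_doc_update.
        function validateDocUpdate(newDoc: any, oldDoc: any, userCtx: any): void {
          const author = oldDoc ? oldDoc.author : newDoc.author;
          const isAdmin = userCtx.roles.indexOf("_admin") !== -1;
          if (!isAdmin && userCtx.name !== author) {
            throw { forbidden: "Only the author can edit their post." };
          }
        }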

    Read the article

  • Which algorithms/data structures should I "recognize" and know by name?

    - by Earlz
    I'd like to consider myself a fairly experienced programmer. I've been programming for over 5 years now. My weak point though is terminology. I'm self-taught, so while I know how to program, I don't know some of the more formal aspects of computer science. So, what are practical algorithms/data structures that I could recognize and know by name? Note, I'm not asking for a book recommendation about implementing algorithms. I don't care about implementing them, I just want to be able to recognize when an algorithm/data structure would be a good solution to a problem. I'm asking more for a list of algorithms/data structures that I should "recognize". For instance, I know the solution to a problem like this: You manage a set of lockers labeled 0-999. People come to you to rent a locker and then come back to return the locker key. How would you build a piece of software to manage knowing which lockers are free and which are in use? The solution would be a queue or stack. What I'm looking for are things like "in what situation should a B-tree be used", "what search algorithm should be used here", etc. And maybe a quick introduction to how the more complex (but commonly used) data structures/algorithms work. I tried looking at Wikipedia's list of data structures and algorithms but I think that's a bit overkill. So I'm looking more for the essential things I should recognize.
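    To make the locker example concrete, here is a minimal TypeScript sketch of the stack-based answer; the class and method names are illustrative:

        // Tracks lockers 0-999 using a stack of free locker numbers.
        class LockerBank {
          private free: number[] = [];

          constructor(count = 1000) {
            // Push in reverse so locker 0 is rented first.
            for (let i = count - 1; i >= 0; i--) this.free.push(i);
          }

          // Rent the next free locker; undefined when all are taken.
          rent(): number | undefined {
            return this.free.pop();
          }

          // Accept a returned key, making the locker free again.
          giveBack(locker: number): void {
            this.free.push(locker);
          }
        }

        const bank = new LockerBank();
        const mine = bank.rent();                  // e.g. 0
        if (mine !== undefined) bank.giveBack(mine);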

    Read the article

  • Adopting Technologies for the Sake of Technologies

    - by shiju
    Unlike other engineering industries, the software engineering industry really lacks maturity. This lack of maturity can be seen in different aspects of the entire software development life cycle. I think other engineering industries are well organised and structured, with common, proven engineering practices. The software engineering industry is a greatly diverse industry, with different operating systems and a variety of development platforms, programming languages, frameworks and tools. These days, people chase hypes and intellectual fashions without understanding their core business problems, adopting technologies and practices for the sake of technologies and practices and simply becoming a "poster child" of them. Understanding the core business problem and providing a solid solution with a platform-neutral approach will give you more business value and ROI than blindly adopting technologies and tailoring your applications for the sake of technologies and practices. People have been migrating their solutions in favour of new technologies and different versions of frameworks without any business need.

    The "Pepsi Challenge" in Software Development

    The Pepsi Challenge marketing campaign of the 1980s was a popular and very interesting promotion in which people tasted one cup of Pepsi and another cup of Coca-Cola. In the taste test, more than 50% of people preferred Pepsi over Coca-Cola. The success of Pepsi came down to its greater sweetness: they simply added more sugar, and more people preferred the sweeter flavour. But you can't identify the better drink after sipping one cup of cola, based only on how sweet it is. The same thing happens in the software industry when choosing development frameworks and technologies. People choose frameworks based on the initial sugary feeling, without understanding their core strengths and weaknesses. The sugary framework might be more harmful when you develop real-world systems. There is no silver bullet for solving all kinds of problems; frameworks and tools have both strengths and weaknesses, so it is better to understand them. And please keep in mind that you have to develop real apps to understand the real capabilities and weaknesses of a framework. Evaluating a technology based on a few blog posts will harm your projects, as those bloggers might lack real-world experience with the framework.

    The Problem with Aligning a Development Practice with Tools

    Recently I observed a discussion in a group where one person asked for suggestions for practicing Continuous Delivery (CD) as part of agile application engineering. The discussion quickly turned to using and choosing a Continuous Integration (CI) tool, and different people suggested different CI tools for simply practicing Continuous Delivery. If you have worked with core agile engineering practices, you know that the real essence of agile is neither choosing a tool nor choosing a process. Simply choosing a CI tool from a particular vendor will not ensure that you are delivering evolving software based on customer feedback. You have to understand the real essence of an engineering practice and choose the right tool for practicing it, instead of focusing on a particular tool. If you want to adopt a practice, you need a solid understanding of its real essence; tools just help us with better automation.

    Adopting New Technologies for the Sake of Technologies

    Another problem is that developers have a tendency to adopt new technologies and simply migrate their existing apps to them. It is okay if your existing system has a problem with its technology stack or a maintainability challenge, and you move to a new technology to solve those problems. We also adopt new technologies for solving new challenges, such as scalability challenges when the application or user base grows unpredictably. Please keep in mind that all new technologies become old after you work with them for a few years. The Facebook status update by Janakiraman below expresses the attitude of a typical customer. For example, Node.js is becoming the hottest buzzword in the software industry and many developers are trying to adopt Node.js for their apps. The important thing is that Node.js is a minimalist framework that does some great things for some problems, but it is not a silver bullet. I have been working with Node.js, which is good for some problems but a really bad choice for all kinds of problems. Adopting new technologies for new projects is good if we get real business value from them, because a newer framework may solve well-known existing problems and provide better answers to the latest challenges. But adopting a new technology for the sake of new technology is a really bad idea. Another example: JavaScript is getting a lot of attention, so many developers are building heavy JavaScript-centric web apps. First they adopt a client-side JavaScript MV* framework such as AngularJS, Ember or Backbone and develop a Single Page App (SPA), where they repeat the mistakes we made in the past on the server side. The mistakes we made on the server side are being transferred to the client side. The problem is that people are adopting new technologies, but not improving their solutions. I predict that many Single Page Apps will suck in the future. We need a hybrid approach, where we can leverage both the server side and the client side for developing next-generation web apps. A related problem is liking a particular framework and using it for every kind of app. In the past, I knew some passionate Silverlight fans who tried to use that framework for all kinds of apps, including large line-of-business apps. And these days developers are migrating their existing Silverlight apps in favour of the HTML5 buzzword. So the real question is, what business value do we get from these apps when we develop them for the sake of a particular technology instead of a business need? Yet another problem is that our solutions consultants try to provide unnecessary solutions for the sake of a particular technology or a hype. For example, Big Data solutions are great for solving the problem of the three Vs: volume, velocity and variety. But trying to apply them to every application creates problems. Say there is a small web site running on a limited budget: proposing a recommendation engine for that site built on a Hadoop-based solution with a 16-node cluster would be horrible. If we really need a Hadoop-based solution, go for it, but trying to apply it to every application would be a big disaster. It would be great to understand the core business problems first, and then choose the right framework for solving the actual business problem, instead of trying to provide so many solutions.

    The Problem with Being Tied to a Platform Vendor

    Some organizations and teams are tied to a particular platform vendor and don't want to use any product from anyone else. They will accept any product provided by the vendor regardless of its capability. This gives you some benefits in the integration and collaboration of different products provided by the same vendor, but it costs you the opportunity to provide better solutions for your business problems. As a real-world scenario, many companies use SAP for their ERP solutions. When they think about mobility, or about developing hybrid mobile apps, they can easily find a framework from SAP: SAP provides a framework for HTML5-based UI development named SAPUI5. If you adopt that framework based only on your preference for the existing platform vendor, you may lose opportunities to provide better solutions. Initially you might enjoy the sugary feeling provided by the platform vendor, but you have to think about developing apps capable of solving future challenges. I am not saying that any particular framework is bad, and I believe every framework is better than the others at solving at least one problem. My point is that we should not be tied to any specific platform vendor unless our organization has resource availability problems.

    Being Polyglot for Providing the Right Solutions

    The modern software engineering industry is greatly diverse, with different tools and platforms. Lots of open source frameworks and new programming languages are being released to the developer community, and choosing the right platform without a biased opinion is a really difficult task. But it would be great if we could develop a platform-neutral mindset and become polyglot developers, providing better solutions based on the actual business problems. In my humble opinion, we should learn a new programming language and a new framework every year. This will improve the quality of our capabilities as developers, and also improve the quality of our primary programming language skills. Being polyglot, both as individual developers and as organizational teams, gives greater opportunities to your developer experience and to your applications. Organizations can analyse their business problems without being tied to any technology, and later provide solutions by choosing from different platforms and tools.

    Summary

    In this blog post I have tried to say that we should not be tied to, or biased towards, any development platform, technology, vendor or programming language, and we should not adopt technologies and practices for their own sake. If we adopt a technology or a practice for the sake of it, we simply become a "poster child" of that technology or practice. We should not become poster children of other people's intellectual thoughts and theories; instead we should become solutions developers and solutions consultants who can provide better solutions for business problems. Being a polyglot developer is a good way to improve your developer skills, and it lets you provide better solutions for business problems.
    The most important thing is that we become platform-neutral developers whose passion is providing brilliant solutions. It would be great if we could provide minimalist, pragmatic business solutions. You can follow me on Twitter @shijucv

    Read the article

  • SQL SERVER – Technical Reference Guides for Designing Mission-Critical Solutions – A Must Read

    - by pinaldave
    Yesterday I was reading architecture reference material while helping a friend who was looking for material on this subject. While working together we searched Twitter, Facebook and search engines for relevant material. Searching online, we ended up at a very interactive reference point. Once I sent it to him, he replied that he may not need anything more after referencing this material. The best part of this article is that it gives access to the various aspects of the technology through an image map. Here is the abstract of the original article from the site: The Technical Reference Guides for Designing Mission-Critical Solutions provide planning and architecture guidance for various mission-critical workloads deployed by users. These guides reflect the knowledge gained by Microsoft while working with customers on mission-critical deployments. Each guide provides not only the key technical concepts and information helpful for design, but also "lessons learned," best practices, and references to customer case studies. Once you click on any of the desired topics, you will see a further detailed image map of the selected topic. Personally, once I ended up on this site, I was there for more than 2 hours clicking through various links. Read more here: Technical Reference Guides for Designing Mission-Critical Solutions Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL White Papers, T SQL, Technology

    Read the article

  • Latest News on Service, Field Service and Depot Repair Products

    - by LuciaC
    Service and Depot Repair Customer Advisory Boards (CAB)
    In November 2012 the Service and Depot Repair CABs joined together for a combined meeting at Oracle HQ in Redwood Shores, California to discuss all the latest news in the Oracle Service, Field Service and Depot Repair products. Over four days attendees shared their experiences with implementing and using these EBS CRM products and heard details of recent enhancements and future product plans direct from Development. You can access all the Oracle presentations via Doc ID 1511768.1. Here are just some of the highlights:
    - Field Service: Next Generation Dispatch Center; Endeca Integration; Case Study: Oracle Sun Field Service implementation
    - Mobile Field Service: New capabilities for technician-facing applications
    - Service: Integration with Oracle Projects; New Teleservice enhancements
    - Spares Management: Supplier Warranty; External Repair Execution
    - Oracle Knowledge (Inquira): Introduction for Service Organizations
    If you weren't at the CAB, take a look at these presentations for great information about what's new and what's coming up in these products.
    12.1.3++ Features for Field Service, Mobile Field Service, Spares Management, FSTP & Advanced Scheduler
    In June 2012 the R12.1.3++ patches were released for Field Service, Mobile Field Service, FSTP and Advanced Scheduler. These patches contain new and updated functionality for these CRM Service suite modules. New functionality includes:
    - Field Service/FSTP/MFS: Support for Transfer Parts across subinventories in different organizations; validation to ensure the Installed Item matches the Returned Item; MFS Wireless - support for Special Address Creation; MFS Wireless - enhanced Debrief Flow
    - Advanced Scheduler: Scheduler UI - display of Spares Sourcing Information; Auto Commit (Release) Tasks by Territory; Dispatch Center UI - display Spare Parts Arrival Information
    - Spares Management: Enhancements to the Task Reassignment Process; enhancements to the Parts Requirements UI; Supply Chain enhancements to allow filtering of ship methods from source location by distance
    Check the following notes for more details and relevant patch numbers:
    Doc ID 1463333.1 - Oracle Field Service Release Notes, Release 12.1.3++
    Doc ID 1452470.1 - Field Service Technician Portal 12.1.3++ New Features
    Doc ID 1463066.1 - Oracle Advanced Scheduler Release Notes, Release 12.1.3++
    Doc ID 1463335.1 - Oracle Spares Management Release Notes, Release 12.1.3++
    Doc ID 1463243.1 - Oracle Mobile Field Service Release Notes, Release 12.1.3++

    Read the article

  • Why I don't use SSIS checkpoint files

    - by jamiet
    In a recent discussion about general ETL best practices, the subject of checkpoint files as a means for package restartability came up, and I stated that I was dead against using them. For anyone that may care, here is why:
    - Configuring them is distinctly unintuitive (that's a matter of opinion, but if you follow the link I'll wager that you will agree)
    - They don't make any allowance for loop iterations
    - They cannot store variables of type Object
    - They are limited in ability. There are many scenarios where you may want to execute certain containers regardless of whether the package is started from a checkpoint file, but the current usage model does not allow for this.
    - They are ignored by event handlers, which wouldn't be so bad if there were a way to toggle this behaviour in certain scenarios
    - They don't work properly
    I'll expand on the last bullet point. I have encountered situations where the behaviour for tasks executing concurrently is unpredictable. That is, sometimes the completion of a task that executes concurrently with a failed/failing task will make it into the checkpoint file, and sometimes it won't. This is near-impossible to reproduce, but it does happen, as my good friend John Welch will hopefully concur (if he is reading). Is anyone out there making successful use of checkpoint files within SSIS? I would be interested in knowing about that if so. @Jamiet
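    For readers who haven't configured checkpoints before, the moving parts (as I recall them; verify against the documentation for your SSIS version) are three package-level properties plus one on each task:

        CheckpointFileName   = path where the checkpoint file is written
        CheckpointUsage      = IfExists, so the package restarts from the file when one is present
        SaveCheckpoints      = True
        FailPackageOnFailure = True, on each task/container that should act as a restart point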

    Read the article

  • Lending Club Selects Oracle ERP Cloud Service

    - by Di Seghposs
    Another Oracle ERP Cloud Service customer turning to Oracle to help increase efficiencies and lower costs!! Lending Club, the leading platform for investing in and obtaining personal loans, has selected Oracle Fusion Financials to help improve decision-making and workflow, implement robust reporting, and take advantage of the scalability and cost savings provided by the cloud. After an extensive search, Lending Club selected Oracle due to the breadth and depth of capabilities and ongoing innovation of Oracle ERP Cloud Service. Since their online lending platform is internally developed, they chose Oracle Fusion Financials in the cloud to easily integrate with current systems, keep IT resources focused on the organization's own platform, and reap the benefits of lowered costs in the cloud. The automation, communication and collaboration features in Oracle ERP Cloud Service will help Lending Club achieve their efficiency goals through better workflow, as well as provide greater control over financial data. Lending Club is also implementing robust analytics and reporting to improve decision-making through embedded business intelligence. "Oracle Fusion Financials is clearly the industry leader, setting an entirely new level of insight and efficiencies for Lending Club," said Carrie Dolan, CFO, Lending Club. "We are not only incredibly impressed with the best-of-breed capabilities and business value from our adoption of Oracle Fusion Financials, but also the commitment from Oracle to its partners, customers, and the ongoing promise of innovation to come." Resources:
    - Oracle ERP Cloud Service Video
    - Oracle ERP Cloud Service Executive Strategy Brief
    - Oracle Fusion Financials
    - Quick Tour of Oracle Fusion Financials
    If you haven't heard about Oracle ERP Cloud Service, check it out today!

    Read the article

  • What should I expect from a system engineer university career

    - by Trufa
    Tomorrow I'm starting a series of interviews to decide which university I should choose to get a degree in Systems Engineering. I know this is a serious university, but I would like some feedback about what I should expect or "demand" from the university. My experience in the technology field is (obviously) limited, and I would like to know what signs to look for that a university might be good or not, especially in the following areas:
    - Infrastructure: what are the essentials? Big pluses?
    - Theoretical vs practical: how practical should it be? What is a "good" mix?
    - Programming languages, frameworks, etc.: which are ideal for learning? Most in demand?
    - Latest technologies: what should they be teaching right now to "prove" they are up to date?
    - Qualification system: what exam methods do you think are ideal for this kind of degree? Good ol' Q&A, multiple choice, projects, a fair mix?
    What other points do you think I should care about? What isn't important? Thanks in advance. I realize this might be a very subjective topic, so I tried to make it as specific and on-topic as I could, but any recommendations are of course welcome. I also understand that none of these questions will guarantee a good university, but the answers might give me another reference point when the moment to choose comes.

    Read the article

  • Windows Azure Recipe: Software as a Service (SaaS)

    - by Clint Edmonson
    The cloud was tailor built for aspiring companies to create innovative internet based applications and solutions. Whether you're a garage startup with very little capital or a Fortune 1000 company, the ability to quickly set up, deliver, and iterate on new products is key to capturing market and mind share. And if you can capture that share and go viral, having resiliency and infinite scale at your fingertips is great peace of mind.
    Drivers: cost avoidance, time to market, scalability.
    Solution: Here's a sketch of how a basic Software as a Service solution might be built out, ingredient by ingredient:
    - Web Role – this hosts the core web application. Each web role will host an instance of the software, and as the user base grows, additional roles can be spun up to meet demand.
    - Access Control – this service is essential to managing user identity. It's backed by a full-blown implementation of Active Directory and allows the definition and management of users, groups, and roles. A pre-built ASP.NET membership provider is included in the training kit to leverage this capability, but it's also flexible enough to be combined with external identity providers including Windows LiveID, Google, Yahoo!, and Facebook. The provider model provides extensibility to hook into other industry-specific identity providers as well.
    - Databases – nearly every modern SaaS application is backed by a relational database for its core operational data. If the solution is sold to organizations, there's a good chance multi-tenancy will be needed. An emerging best practice for SaaS applications is to stand up separate SQL Azure database instances for each tenant's proprietary data to ensure isolation from other tenants (a sketch of this routing follows below).
    - Worker Role – this is the best place to handle autonomous background processing such as data aggregation, billing through external services, and other specialized tasks that can be performed asynchronously. Placing these tasks in a worker role frees the web roles to focus completely on user interaction and data input, and provides finer-grained control over the system's scalability and throughput.
    - Caching (optional) – as a web site's traffic grows, caching can be leveraged to keep frequently used read-only, user-specific, and application resource data in a high-speed distributed in-memory cache for faster response times and ultimately higher scalability without spinning up more web and worker roles. It includes a token-based security model that works alongside the Access Control service.
    - Blobs (optional) – depending on the nature of the software, users may be creating or uploading large volumes of heterogeneous data such as documents or rich media. Blob storage provides a scalable, resilient way to store terabytes of user data. The storage facilities can also integrate with the Access Control service to ensure users' data is delivered securely.
    Training & Examples
    These links point to online Windows Azure training labs and examples where you can learn more about the individual ingredients described above. (Note: The entire Windows Azure Training Kit can also be downloaded for offline use.)
    - Windows Azure (16 labs): Windows Azure is an internet-scale cloud computing and services platform hosted in Microsoft data centers, which provides an operating system and a set of developer services which can be used individually or together. It gives developers the choice to build web applications; applications running on connected devices, PCs, or servers; or hybrid solutions offering the best of both worlds. New or enhanced applications can be built using existing skills with the Visual Studio development environment and the .NET Framework. With its standards-based and interoperable approach, the services platform supports multiple internet protocols, including HTTP, REST, SOAP, and plain XML.
    - SQL Azure (7 labs): Microsoft SQL Azure delivers on the Microsoft Data Platform vision of extending the SQL Server capabilities to the cloud as web-based services, enabling you to store structured, semi-structured, and unstructured data.
    - Windows Azure Services (9 labs): As applications collaborate across organizational boundaries, ensuring secure transactions across disparate security domains is crucial but difficult to implement. Windows Azure Services provides hosted authentication and access control using powerful, secure, standards-based infrastructure.
    - Developing Applications for the Cloud, 2nd Edition (eBook): This book demonstrates how you can create from scratch a multi-tenant, Software as a Service (SaaS) application to run in the cloud using the latest versions of the Windows Azure Platform and tools. The book is intended for any architect, developer, or information technology (IT) professional who designs, builds, or operates applications and services that run on or interact with the cloud.
    - Fabrikam Shipping (SaaS reference application): This is a full end-to-end sample scenario which demonstrates how to use the Windows Azure platform for exposing an application as a service. We developed this demo just as you would: we had an existing on-premises sample, Fabrikam Shipping, and we wanted to see what it would take to transform it into a full subscription-based solution. The demo you find here is the result of that investigation.
    See my Windows Azure Resource Guide for more guidance on how to get started, including more links to web portals, training kits, samples, and blogs related to Windows Azure.
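    As a footnote to the Databases ingredient above, here is a minimal sketch of per-tenant database routing; TypeScript, with tenant names and connection strings as purely illustrative placeholders:

        // Each tenant gets its own database instance; the app resolves the
        // connection string per request to keep tenant data isolated.
        const tenantDatabases = new Map<string, string>([
          ["contoso",  "Server=tcp:example.database.windows.net;Database=saas_contoso"],
          ["fabrikam", "Server=tcp:example.database.windows.net;Database=saas_fabrikam"],
        ]);

        function connectionStringFor(tenantId: string): string {
          const conn = tenantDatabases.get(tenantId);
          if (conn === undefined) {
            throw new Error(`Unknown tenant: ${tenantId}`);
          }
          return conn;
        }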

    Read the article

  • Is it bad practice for services to share a database in SOA?

    - by Paul T Davies
    I have recently been reading Hohpe and Woolf's Enterprise Integration Patterns, some of Thomas Erl's books on SOA, and watching various videos and podcasts by Udi Dahan et al. on CQRS and event driven systems. Systems in my place of work suffer from high coupling. Although each system theoretically has its own database, there is a lot of joining between them. In practice this means there is one huge database that all systems use. For example, there is one table of customer data. Much of what I've read seems to suggest denormalising data so that each system uses only its own database, and any updates to one system are propagated to all the others using messaging. I thought this was one of the ways of enforcing the boundaries in SOA - each service should have its own database - but then I read this: http://stackoverflow.com/questions/4019902/soa-joining-data-across-multiple-services and it suggests this is the wrong thing to do. Segregating the databases does seem like a good way of decoupling systems, but now I'm a bit confused. Is this a good route to take? Is it ever recommended that you should segregate a database on, say, an SOA service, a DDD bounded context, an application, etc.?
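    For what it's worth, the propagate-updates-by-messaging idea mentioned above can be sketched minimally as follows (TypeScript; the event shape and the in-process dispatch are illustrative stand-ins for a real message broker):

        // The service that owns customer data publishes an event; other
        // services keep their own denormalised copies up to date from it.
        interface CustomerUpdated {
          customerId: string;
          name: string;
          email: string;
        }

        type Handler = (event: CustomerUpdated) => void;
        const subscribers: Handler[] = [];

        function publish(event: CustomerUpdated): void {
          // A real system would hand this to a broker/queue, not call
          // subscribers in-process.
          for (const handle of subscribers) handle(event);
        }

        // e.g. the billing service refreshes its local customer copy
        subscribers.push(e =>
          console.log(`billing: refresh local record for ${e.customerId}`));

        publish({ customerId: "42", name: "Ada", email: "ada@example.com" });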

    Read the article

  • Continual Professional Development - proving new skills to non-technical employers

    - by Tom
    Background: I work in a non-IT company as a professional software developer, building a large-scale internal database system. I am fortunate to have a fairly senior position within the company, and have been working here for around 4 years. Often I get asked by management "how do you learn new things?". To be honest, I don't know how to answer this. Over the last 6 months, I've really gotten my teeth into some new techniques and technologies to make my coding far better and hopefully improve the quality of the software, even if it's just refreshing my skills on things I've learnt already. Like last week I dived into some complex XLinq and TPL code (.NET). Nothing revolutionary, but I feel like I am a bit better than before.
    Question: How do I prove this to my employer? It'd be nice to be able to put this on paper. Possibilities:
    - Keep a journal of what I've learnt - keeping the technical bits in (nobody would understand or care, but it's better than them being omitted)
    - ???? (I've run out of ideas already)
    Any ideas? Thanks, Tom

    Read the article

  • Good Scoop: The PeopleSoft/IBM Backstory

    - by [email protected]
    By Brian Dayton on April 12, 2010 11:15 AM Sometimes you're searching for something online and you find an unrelated, bonus nugget. Last week I stumbled across an interesting blog post from Chris Heller of a PeopleSoft consulting shop in San Ramon, CA called Grey Sparling. I don't know these guys. But Chris, who apparently used to work on the PeopleTools team, wrote a great article on a pre-acquisition, would-be deal between IBM and PeopleSoft that would have standardized PeopleSoft on IBM technology. The behind-the-scenes perspective is interesting. His commentary on the challenges that the company and PeopleSoft customers would have encountered if the deal had gone through was also interesting: · "No common ownership. It's hard enough to get large groups of people to work together when they work for the same company, but with two separate companies it is much, much harder. Even within Oracle, progress on Fusion applications was slow until Thomas Kurian took over Fusion applications in addition to Fusion middleware." · "No customer buy-in. PeopleSoft customers weren't asking for a conversion to WebSphere, so the fact that doing that could have helped PeopleSoft stay independent wouldn't have meant much to them, especially since the cost of moving to whatever a "PeopleSoft built on WebSphere" would have been significant." · "No executive buy-in. This is related to the previous point, but it's worth calling out separately. If Oracle had walked away and the deal with IBM had gone through, and PeopleSoft customers got put through the wringer as part of WebSphere move, all of the PeopleSoft project teams would be put in the awkward position of explaining to their management why these additional costs and headaches were happening. Essentially they would need to "sell" the partnership internally to their own management team. That's not a fun conversation to have." I'm not surprised that something like this was in the works. But I did find the inside scoop and Heller's perspective on the challenges particularly interesting. Especially the advantages of aligning development of applications and infrastructure development under one roof. Here's a link to the whole blog entry.

    Read the article

  • When are Getters and Setters Justified

    - by Winston Ewert
    Getters and setters are often criticized as being not proper OO. On the other hand, most OO code I've seen has extensive getters and setters. When are getters and setters justified? Do you try to avoid using them? Are they overused in general? If your favorite language has properties (mine does) then such things are also considered getters and setters for this question. They are the same thing from an OO methodology perspective; they just have nicer syntax. Sources for getter/setter criticism (some taken from comments to give them better visibility):
    - http://www.javaworld.com/javaworld/jw-09-2003/jw-0905-toolbox.html
    - http://typicalprogrammer.com/?p=23
    - http://c2.com/cgi/wiki?AccessorsAreEvil
    - http://www.darronschall.com/weblog/2005/03/no-brain-getter-and-setters.cfm
    - http://www.adam-bien.com/roller/abien/entry/encapsulation_violation_with_getters_and
    To state the criticism simply: getters and setters allow you to manipulate the internal state of objects from outside of the object. This violates encapsulation. Only the object itself should care about its internal state. An example, first in a procedural version:

        struct Fridge {
            int cheese;
        }

        void go_shopping(Fridge fridge) {
            fridge.cheese += 5;
        }

    And in a mutator version:

        class Fridge {
            int cheese;

            void set_cheese(int _cheese) { cheese = _cheese; }
            int get_cheese() { return cheese; }
        }

        void go_shopping(Fridge fridge) {
            fridge.set_cheese(fridge.get_cheese() + 5);
        }

    The getters and setters made the code much more complicated without affording proper encapsulation. Because the internal state is accessible to other objects, we don't gain a whole lot by adding these getters and setters. The question has been previously discussed on Stack Overflow:
    - http://stackoverflow.com/questions/565095/java-are-getters-and-setters-evil
    - http://stackoverflow.com/questions/996179
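    For contrast, the design the criticism points toward keeps the state private and moves the behavior onto the object itself; a minimal sketch of that alternative (TypeScript, method names illustrative):

        // The fridge owns its state; callers express intent ("add cheese")
        // rather than reading and writing a field through accessors.
        class Fridge {
          private cheese = 0;

          addCheese(amount: number): void {
            this.cheese += amount;
          }
        }

        function goShopping(fridge: Fridge): void {
          fridge.addCheese(5); // no get/set round-trip required
        }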

    Read the article

  • Using XML as data storage

    - by Kian Mayne
    I was thinking about the XML format and the following quote: "XML is not a database. It was never meant to be a database. It is never going to be a database. Relational databases are proven technology with more than 20 years of implementation experience. They are solid, stable, useful products. They are not going away. XML is a very useful technology for moving data between different databases or between databases and other programs. However, it is not itself a database. Don't use it like one." - Effective XML: 50 Specific Ways to Improve Your XML by Elliotte Rusty Harold (page 230, Part 4, Item 41, 2nd paragraph). This seems to really stress that XML should not be used for data storage and should only be used for program-to-program interoperability. Personally, I disagree: .NET's app.config file, used to store a program's settings, is an example of data storage in an XML file. For databases, though, rather than configurations etc., XML should not be used. To develop my point, I will use two examples:
    A) Data about customers with fields that are all on one level, i.e. there are a number of fields all relating to one customer, with no children
    B) Data about the configuration of an application, where nested fields and properties make a lot of sense
    So my question is: is this still a valid statement, and is it now acceptable to store data using XML?
    EDIT: I've sent an email to the author of that quote to ask for his input/extra context.

    Read the article

  • Final agenda - Oracle Exadata & Manageability Partner Community Forum at OpenWorld

    - by Javier Puerta
    Just a few days to Oracle OpenWorld and our Exadata & Manageability Partner Community Forum for EMEA partners. The event will take place on the afternoon of Monday, October 1st, 2012 during the Oracle OpenWorld week. For all partners who have confirmed their attendance, the final detailed agenda is below. I look forward to meeting again in San Francisco with all of you who can attend, and I hope that you will find the sessions useful for your business.
    FINAL AGENDA
    Oracle Exadata & Manageability EMEA Partner Community Forum at Oracle OpenWorld 2012 in San Francisco, USA
    Monday, October 1st, 2012
    15:30 - Reception of participants; networking coffee served
    16:00 - Welcome (Hans-Peter Kipfer, VP Engineered Systems, Oracle EMEA)
    16:10 - Next challenges in building and managing clouds (Javier Cabrerizo, VP, Global Business Development for Exadata, Oracle Corp.)
    16:30 - Partner experience 1: IT modernization, simplification and cost reduction: the case of a customer in Transportation & Logistics with custom applications and SAP. The Technological Renewal Model, built by aligning the innovation of Oracle's Engineered Systems and Capgemini's service delivery excellence, has resulted in significant cost savings for the client. (Francisco Bermúdez, Country Leader Infrastructure Services, Capgemini, Spain)
    16:55 - Partner experience 2: The Nvision cloud project. NCloud is an innovative design that combines advanced technical solutions, virtualization, and dynamic management of IT resources, providing a complete "as-a-Service" offering for Infrastructure, Database, Middleware, and Applications. (Dmitry Krasilov, Head of Oracle Competence Center, Nvision Group, Russia)
    17:20 - Partner experience 3: From Exadata Ready to Exadata Optimized: an ISV experience. The experience of WeDo Technologies in the process, and the benefits, that started as an Exadata Ready certification and ended up as an Exadata Optimized one. (Miguel Alves, Product Business Solutions Manager, WeDo Technologies, Portugal)
    17:45 - Next steps in engaging with Oracle (Cengiz Yilmaz, Director Partner Strategy, Oracle EMEA Engineered Systems; Patrick Rood, Manageability Partner Business, Oracle EMEA)
    18:00 - Wrap-up & Networking
    Time and Location: Monday, October 1st, 2012, 15:30 - 18:00 PST, Grand Hyatt San Francisco, 345 Stockton Street, San Francisco (Conference Theater). (It is a 15-minute walk from the OOW Moscone Center. See directions here.)

    Read the article

  • Xfce gets really confused about session saving, etc

    - by Pointy
    I'm setting up a new laptop running Ubuntu 11.04. I've got the xfce4 packages all installed, which is something I've had no problems with on any of my other machines. On this new laptop, however, though I can log in and use an xfce session without any problems, logging out of a session is problematic: I click the "Log out" widget from the panel and then "Log out" from its option dialog. Then the thing just sits there, not logging out. Subsequent attempts to open the "Log out" widget fail with an error about the session manager being busy. After maybe a minute or so, it logs out.

    Though I've got the "Save session" option checked in the log out dialog, xfce makes a complete hash of the business. It does remember the applications that I had running, but it seems to forget about the window manager (!!) and the workspace configuration. I don't log in/out that often, and generally I don't care much about restarting applications, but the missing window manager is of course pretty annoying. I like xfce because it's simple and unobtrusive and usually works pretty well. I've never experienced this before, and I've got two other machines also running 11.04 with pretty much the same setup (a straight Ubuntu install with the xfce4 packages added). Is there some good way to diagnose problems like this?

    Edit: I nuked my session cache, did an explicit save from the session widget, and now it works. Well, it doesn't save the workspace location for each client and instead opens them all up on the first workspace, but I think that may be because, in the saved session, xfwm4 is the last thing in the "Client" list, so before it's started all the other clients just pile up in the first (and only) workspace. I'm still curious about how exactly it gets so messed up. I certainly wasn't knowingly attempting anything fancy or unorthodox, though I may have done something fishy inadvertently.

    Read the article

  • Oracle HCM User Group (OHUG) 2012 Conference

    - by Maria Ana Santiago
    The PeopleSoft HCM team is looking forward to a great OHUG conference and to meeting with our PeopleSoft HCM Customers there! The OHUG Global Conference 2012 will be held at the Mirage in Las Vegas, Nevada, June 18-22, 2012.

    With Oracle Corporation's continued support of the Global OHUG Conference, this event is one of the best opportunities PeopleSoft HCM Customers have to interact and communicate directly with PeopleSoft Strategy, Development and Support, and to understand the full range of Oracle HCM opportunities that await. PeopleSoft HCM has 10 exciting sessions and several Meet the Experts sessions planned to highlight the value of and opportunities within PeopleSoft applications. For details on the PeopleSoft HCM tracks and sessions, please visit the OHUG Session Line Up page.

    PeopleSoft HCM will be offering an annual General Roadmap session by Tracy Martin and multiple product-specific sessions. Our PeopleSoft HCM General session will provide very valuable information on our continuous delivery strategy and the upcoming HCM 9.2 release and beyond. Tracy will also address coexistence opportunities for PeopleSoft customers with Fusion, Taleo, Oracle BI and more. Our Product Roadmap sessions will go into product-specific areas, providing roadmap information for the corresponding product domains. There will also be a PeopleTools Roadmap and Vision session that will let Customers see what is new in PeopleTools and what is planned for the future.

    And last, but not least, PeopleSoft will be holding the annual Meet the Experts sessions. Customers who want to have focused discussions on specific areas or products can meet with the PeopleSoft Strategy, Development and Support teams, who will be available to discuss product features and answer Customers' questions. Don't miss this opportunity! If you are a PeopleSoft HCM Customer, join us at OHUG! We look forward to seeing you there.

    Read the article

  • We Found 100 Manufacturing Heroes That Focus on Innovation!

    - by Stephen Slade
    There's a good piece written by Ann Grackin of ChainLink Research on the Manufacturing Leadership 100 Awards program, held recently in Palm Beach, FL, Apr 30-May 3, 2012. The article (link below) summarizes the Summit with a specific focus on manufacturing innovation. There were several informative keynotes and sessions from industrial leaders who are leveraging the latest tools and technologies to make better decisions.

    Ann writes that she was a panelist with Cindy Reese, SVP, Worldwide Operations, Oracle, and Steven Tungate, VP/GM, Supply Chain & Innovation, Toshiba America Business Solutions, on Factories and Supply Networks of the Future.

    Ann writes: "So what are these manufacturers doing? Significant rationalization of the supply base (Toshiba, especially, has this issue since they have a long history of many acquisitions), streamlining production to increase productivity, and looking for lower-cost countries for manufacturing…. No doubt firms have global customer bases, so they need to be present in these markets. However, a low-cost-country manufacturing source does introduce more risk in the supply chain. And that was discussed. Quality, security, and intellectual property protection were the critical global manufacturing issues also discussed.

    "Cindy (Reese) told a fascinating story about Oracle's acquisition of Sun and the supply chain that was subsequently created. Here was one of the key points: Although Oracle sells on a global basis, they now do their own factory-installed software. This keeps potential 'factory-installed malware' from getting into the servers at contract manufacturers, and prevents pirated software. In this way, Oracle ensures that they deliver the quality and security people expect."

    Learn more about the Manufacturing Leadership 100 program from Manufacturing Executive at: http://www.mlsummit.com/
    Full article link: http://www.clresearch.com/research/detail.cfm?guid=52327213-3048-79ED-99D4-E433DA64D4F0

    Read the article

  • XNA C# Platformer - physics engine or tile based?

    - by Hugh
    I would like to get some opinions on whether I should develop my game using a physics engine (Farseer Physics seems to be the best option) or follow the traditional tile-based method.

    Quick background:
    - it's a college project and my first game, but I have 4 years of academic programming experience
    - I just want a basic platformer with a few levels, nothing fancy
    - I want a shooting mechanic, run and gun, just like Contra or Metal Slug for example
    - possibly some simple puzzles

    I have made a basic prototype with Farseer; the level is hardcoded with collisions and not really tiled, more like big full-screen-sized tiles, with collision bodies drawn manually along the ground and walls etc. My main problem is that I want a simple retro feel to the jumping and physics, but because it's a physics simulation engine, the result is going to be realistic, whereas the typical in-air controllable physics of platformers isn't realistic. I have to make a box with a wheel body fixture under it to get this effect, and it's glitchy and doesn't feel right.

    I chose to use a physics engine because I tried the tile method initially and found it very hard to understand. The engine took care of a lot of things to save me time; mainly, being able to do slopes easily was nice, as was the freedom to draw collision bounds wherever I liked rather than being restricted to a grid, which also gave me more freedom for art design.

    In conclusion, I don't know which method to pick. I want the method that will be the most straightforward to implement and won't give me a headache later on, preferably one with an abundance of tutorials and resources so I don't get "stuck" doing something which has been done a million times before! Let me know if I haven't provided enough information for you to help me! Thanks in advance, Hugh.
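    For context, the tile-based approach being weighed against Farseer usually reduces to an axis-aligned bounding-box (AABB) test against a grid, which is what gives tile platformers their crisp, non-physical feel. Below is a minimal, hypothetical XNA-style C# sketch of that idea; the TileMap class, the solid array, and the TileSize constant are illustrative assumptions, not code from the question:

```csharp
// Minimal sketch of tile-based AABB collision (hypothetical names).
// A character controller moves one axis at a time and calls Collides()
// to decide whether to stop, which avoids the "box with a wheel
// fixture" workaround needed in a full physics engine.
using Microsoft.Xna.Framework;

public class TileMap
{
    public const int TileSize = 32;      // pixels per tile (assumed)
    private readonly bool[,] solid;      // true = solid tile

    public TileMap(bool[,] solid)
    {
        this.solid = solid;
    }

    // Returns true if the given world-space rectangle overlaps any solid tile.
    public bool Collides(Rectangle bounds)
    {
        // Convert pixel bounds to an inclusive range of tile indices.
        int left   = bounds.Left / TileSize;
        int right  = (bounds.Right - 1) / TileSize;
        int top    = bounds.Top / TileSize;
        int bottom = (bounds.Bottom - 1) / TileSize;

        for (int y = top; y <= bottom; y++)
        {
            for (int x = left; x <= right; x++)
            {
                // Treat tiles outside the map as empty space.
                if (x < 0 || y < 0 ||
                    y >= solid.GetLength(0) || x >= solid.GetLength(1))
                    continue;

                if (solid[y, x])
                    return true;
            }
        }
        return false;
    }
}
```

    Slopes are the main thing this simple grid test cannot express, which matches the poster's experience: Farseer makes slopes easy at the cost of realistic (rather than retro) jump physics, while the tile method keeps movement predictable but needs extra work per special tile shape.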

    Read the article

  • Kent .Net/SqlServer User Group – Upcoming events

    - by Dave Ballantyne
    At the Kent user group we have two upcoming events. Both are to be held at the F-Keys Training suite (http://f-keys.co.uk/) in Rochester, Kent. If you haven't attended before, please note the location.

    14 June
    - Is your code S.O.L.I.D.? (Nathan Gloyn): Everybody keeps on about SOLID principles, but what are they, and why should you care? This session is an introduction to SOLID. I'll aim to walk through each principle, explaining it and then showing how a code base can be refactored using the principles to make your life easier. By the end of the session you should have a basic understanding of each principle, why to use it, and how using it can improve your code.
    - Building composite applications with OpenRasta 3 (Sebastien Lambla): A wave of change is coming to web development on .NET. Packaging technologies are bringing dependency management to .NET for the first time, streamlining development workflow and creating new possibilities for deployment and administration. The sky's the limit, and in this session we'll explore how open frameworks can help us leverage composition for the web.
    Register here for this event: http://www.eventbrite.com/event/1643797643

    5 July
    - Tony Rogerson: Achieving a throughput of 1.5 terabytes, or over 92,000 8-Kbyte 100% random reads per second, on kit costing less than 2.5K, and of course what to do with it! The session will focus on commodity kit and how it can be used within a business to provide massive performance benefits at little cost.
    - End to End Report Creation and Management using SQL Server Reporting Services (Chris Testa-O'Neill): This session will walk through the authoring, management and delivery of reports, with a focus on the new features of Reporting Services 2008 R2. At the end of this session you will understand how to create a report in the new report designer, be aware of the report management options available, and know the delivery mechanisms that can be used to deliver reports.
    Register here for this event: http://www.eventbrite.com/event/1643805667

    Hope to see you at one or the other (or even both, if you are that way inclined).

    Read the article
