Search Results

Search found 5279 results on 212 pages for 'customer'.

Page 53/212

  • Managing Operational Risk of Financial Services Processes – part 1/ 2

    - by Sanjeevio
    Financial institutions view compliance as a regulatory burden that incurs a high initial capital outlay and recurring costs. By its very nature, regulation takes a prescriptive, common-for-all approach to managing financial and non-financial risk. Needless to say, mere compliance with regulation will no longer lead to sustainable differentiation. Genuine competitive advantage will stem from being able to cope with the innovation demands of the present economic environment while meeting regulatory mandates faster and more cost-efficiently. Let's first take a look at the key factors that are limiting the pursuit of the above goal. Regulatory requirements are growing, driven in part by revisions to existing mandates in line with the cross-border, pan-geographic nature of today's financial value chains, and even more so by the frequent systemic failures that have destabilized financial markets and the global economy over the last decade. In addition to the increase in regulation, financial institutions are faced with the pressures of regulatory overlap and regulatory conflict. Regulatory overlap arises primarily from two things: firstly, the blurring of boundaries between lines of business within complex organizational structures, and secondly, the varying requirements of jurisdictional directives across geographic boundaries; e.g. a securities firm with operations in the US and the EU would be subject to different "Know-Your-Customer" (KYC) requirements under the PATRIOT Act in the US and MiFID in the EU. A further consequence of regulatory change is regulatory conflict, which again arises primarily from two things: firstly, the diametrically opposed priorities of different lines of business, and secondly, the tension that regulatory requirements create between shareholders' interest in tighter due diligence and customers' concerns about privacy. For instance, Customer Due Diligence (CDD) under KYC requires eliciting detailed information from customers to prevent illegal activities such as money laundering, terrorist financing or identity theft. While new customers are still likely to comply with such stringent background checks at the time of account opening, existing customers baulk at such practices as a breach of trust and privacy. As mentioned earlier, regulatory compliance addresses both financial and non-financial risks. Operational risk is a non-financial risk that stems from business execution and spans people, processes, systems and information. Operational risk arising from financial processes in particular transcends other sources of such risk. Let's look at the factors underpinning the operational risk of financial processes. The rapid pace of innovation and geographic expansion of financial institutions has resulted in the proliferation and ad-hoc evolution of back-office, mid-office and front-office processes. This has had two serious implications for the operational risk of financial processes:
    - Inconsistency of processes across lines of business, customer channels and product/service offerings. This makes it harder for the risk function to enforce a standardized risk methodology and, in turn, makes breaches harder to detect.
    - The proliferation of processes, coupled with increasingly frequent change cycles, has resulted in accidental breaches and increased vulnerability to regulatory inadequacies.
In summary, regulatory growth (including overlap and conflict) coupled with process proliferation and inconsistency is driving process compliance complexity. In my next post I will address the implications of this process complexity for financial institutions and outline the role of BPM in lowering specific aspects of the operational risk of financial processes.

    Read the article

  • Gartner PCC: A Shovel & Some Ah-Ha's

    - by kellsey.ruppel
    When Gartner Vice President and leading analyst Whit Andrews kicked off the Gartner Portals, Content & Collaboration Summit on Monday, March 12 at the Gaylord Palms in Orlando, FL by bringing a shovel to the stage, eyebrows raised and a few thoughts went through my head. Either this guy plans to go help the construction workers outside construct that new pool at the Gaylord, or he took a wrong turn and is at the wrong conference. Oh, and how did he get that shovel through airport security? As Whit explained further, his objective became clear: take everything anyone has ever told you about portals and throw it out the window, because portals have evolved and times are most certainly changing. The future Web is here, available not only on browsers but also via a broad spectrum of access points, including automobiles, consumer electronics and more and more mobile devices. Not merely prevalent, the future Web is also multimedia-driven and operates in real time, driven by mobility, social media, streaming video and other dynamic services. Applications and user experiences are in the midst of an evolution: from the early, simple mobile Web models to today's Web 2.0 mobile apps and, ultimately, to a world of predominantly Web apps. Additionally, cloud services will forever change how portals and user experiences are designed, built, delivered, sourced and managed. So what does this mean for you? Today's organizations need software that will enable them not just to do their jobs, but to do them in a way that is familiar and easy for them. What does this mean for IT? Use software and technology as an enabler, not as a roadblock. Overall, we had a great week in Orlando learning how to improve the user experience, manage content explosion, launch social initiatives, transition to mobile environments and understand cloud and SaaS options. We had some great conversations throughout the conference and at the Oracle booth. Lots of demonstrations were given of Oracle WebCenter Sites and Oracle Social Network. And as Christie mentioned earlier this week, our Vice President of Product Management and Strategy for WebCenter, Loren Weinberg, presented on the topic of customer engagement and talked about how organizations' relationships with their customers have fundamentally changed and the resulting impact on their priorities. Loren also talked about the importance of customer engagement, why it matters now more than ever, and what you can do to help your company or organization succeed in this new world. The question asked in every keynote and session was a simple one: What is your "ah-ha" moment? I personally had quite a few, some of which I've captured below.
    - 70% of internal social initiatives eventually fail.
    - By 2014, refusing to communicate with consumers via social media will be as harmful as ignoring emails/phone calls is today.
    - Customer engagement = multi-channel + social & interactive + personal & relevant + optimized.
    - If people choose to talk about your product/company/service, it's because it's remarkable. (From Seth Godin's keynote, one of the highlights of the conference!)
    - The Web will become the primary method used for delivering content and applications to mobile devices.
    - By 2015, 20% of smartphone users worldwide will conduct commerce using context-enriched services on a weekly basis.
    - 86% of customers will pay more for a better customer experience.
    - The 6 P's of quality user experience: Product, enabled by People, Patterns, Process, Profit, Priorities.
Did you attend the Gartner Summit? What were your ah-ha moments?

    Read the article

  • Choice and setup of version control

    - by Peter M
    I am about to set up a new laptop and in the process transition to a new version control system as part of a general cleanup. Currently I use a centralized version control system (yes, it is VSS, and yes, I know all the pros and cons of that system, but as a single-user system it works well for me). I have very few requirements for a new system and am free to choose among any of the current mainstream players, but cost constraints will push me towards OSS. Some of my requirements are:
    - Runs on a single machine (i.e. the laptop in question) under Windows
    - I am not sharing things with other developers or workers - this is more for my own historical benefit
    - I want to version source code, documentation and binary files
    - I have a large hierarchy of projects that are unrelated (see below)
    - I have files within the hierarchy that don't need to be controlled (but could be)
    - Some projects use Visual Studio, so some integration there could be nice
    - There could be some sharing of files between jobs
    - I generally only need a small amount of branching in code files
    The directory hierarchy that I have at the moment is somewhat like:
    Root
    |--Customer #1
    |  |--Job #1
    |  |  |--Data files received from Customer for Job (not controlled)
    |  |  |--Documentation files (controlled)
    |  |  |--Project information files (not controlled - but could be)
    |  |  |--Software Project Files (controlled)
    |  |  |--Scratch dir for job (not controlled)
    |  |--Job #2
    |     (same structure as above)
    |--Customer #2
    |  |..
    |--Customer #n
       |..
    Currently I have about 22 customers with differing numbers of projects underneath them. At the moment I have a single VSS repository based at the root of the directory structure. If I kept with a centralized system (i.e. SVN) I believe that I should keep the same approach and continue with a single repository based at the root dir. Is this a valid approach? However, if I move to a distributed tool then I am unsure how I should handle the situation. My initial guess is that I should not have a repository based on the root of my entire directory structure - but that is a guess, so I really don't know how valid it is. Should I pitch a distributed approach at the Root, Customer, Job or sub-Job directory level? Also, what I am not clear on with distributed tools (and perhaps with SVN as well) is whether I can branch parts of a repository. For example, I can see branching source code in software projects as being useful, but branching my documentation as not being useful. So if I pitch a repository at the Job level, can I just branch the Software Project Files? Or would all files in that Job be branched? Every time I look at distributed tools I get a nagging feeling that they are not suited to my style of setup. I am uncomfortable with the idea of having to manually set up something like 50 to 80 separate repositories (if I pitch at the Job level, or 20+ if at the Customer level) within my directory hierarchy. This feeling also extends to having all those repositories scattered around as well - however I do have a backup strategy that I trust, so this latter feeling is pretty well unfounded. So what advice can you all give me? Thanks in advance!

    Read the article

  • World Record Siebel PSPP Benchmark on SPARC T4 Servers

    - by Brian
    Oracle's SPARC T4 servers set a new World Record for Oracle's Siebel Platform Sizing and Performance Program (PSPP) benchmark suite. The result used Oracle's Siebel Customer Relationship Management (CRM) Industry Applications Release 8.1.1.4 and Oracle Database 11g Release 2 running Oracle Solaris on three SPARC T4-2 and two SPARC T4-1 servers. Running the Siebel PSPP 8.1.1.4 workload, which includes Siebel Call Center and Order Management, the SPARC T4 servers demonstrate the throughput of the SPARC T4 processor by supporting 29,000 users. This is the first Siebel PSPP 8.1.1.4 benchmark to support 29,000 concurrent users, at a rate of 239,748 Business Transactions/hour. The benchmark demonstrates vertical and horizontal scalability of Siebel CRM Release 8.1.1.4 on SPARC T4 servers.
    Performance Landscape
    Systems:
    - 1 x SPARC T4-1 (1 x SPARC T4, 2.85 GHz) – Web
    - 3 x SPARC T4-2 (2 x SPARC T4, 2.85 GHz) – App/Gateway
    - 1 x SPARC T4-1 (1 x SPARC T4, 2.85 GHz) – DB
    Throughput: 239,748 Txn/hr with 29,000 users. Response times: Call Center 0.165 sec, Order Management 0.925 sec.
    Call Center + Order Management transactions: 197,128 + 42,620; users: 20,300 + 8,700.
    Configuration Summary
    Web Server: 1 x SPARC T4-1 server (1 x SPARC T4 processor, 2.85 GHz, 128 GB memory), Oracle Solaris 10 8/11, iPlanet Web Server 7.
    Application Servers: 3 x SPARC T4-2 servers, each with 2 x SPARC T4 processors (2.85 GHz), 256 GB memory and 3 x 300 GB SAS internal disks, Oracle Solaris 10 8/11, Siebel CRM 8.1.1.5 SIA.
    Database Server: 1 x SPARC T4-1 server (1 x SPARC T4 processor, 2.85 GHz, 128 GB memory), Oracle Solaris 11 11/11, Oracle Database 11g Release 2 (11.2.0.2).
    Storage: 1 x Sun Storage F5100 Flash Array with 80 x 24 GB flash modules.
    Benchmark Description
    The Siebel 8.1 PSPP benchmark includes Call Center and Order Management:
    Siebel Financial Services Call Center – Provides the most complete solution for sales and service, allowing customer service and telesales representatives to provide superior customer support, improve customer loyalty, and increase revenues through cross-selling and up-selling. Use cases tested: Incoming Call Creates Opportunity, Quote and Order, and Incoming Call Creates Service Request. Three complex business transactions are executed simultaneously for a specific number of concurrent users. The ratios of these three scenarios were 30%, 40% and 30% respectively, together totaling 70% of all transactions simulated in this benchmark. Between each user operation and the next, think time averaged approximately 10, 13 and 35 seconds respectively.
    Siebel Order Management – Allows employees such as salespeople and call center agents to create and manage quotes and orders through their entire life cycle. Siebel Order Management can be tightly integrated with back-office applications, allowing users to perform tasks such as checking credit, confirming availability, and monitoring the fulfillment process. Use cases tested: Order & Order Items Creation, and Order Updates. Two complex Order Management transactions were executed simultaneously for a specific number of concurrent users, concurrently with the three Call Center scenarios above. The ratio of these two scenarios was 50% each, together totaling 30% of all transactions simulated in this benchmark. Between each user operation and the next, think time averaged approximately 20 and 67 seconds respectively.
    Key Points and Best Practices
    No processor cores or cache were activated or deactivated on the SPARC T-Series systems to achieve special benchmark effects.
    See Also
    Siebel White Papers; SPARC T4-1 Server (oracle.com, OTN); SPARC T4-2 Server (oracle.com, OTN); Siebel CRM (oracle.com, OTN); Oracle Solaris (oracle.com, OTN); Oracle Database 11g Release 2 Enterprise Edition (oracle.com, OTN).
    Disclosure Statement
    Copyright 2012, Oracle and/or its affiliates. All rights reserved. Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners. Results as of 30 September 2012.

    Read the article

  • OPN Specialized Latest News (15th November)

    - by swalker
    HELPING YOU TO SPECIALIZE
    WebCenter Implementation Specialist Exam Preparation Webcasts: WebCenter Content and WebCenter Portal. Oracle Partner Network would like to invite you to refresher courses for WebCenter Content and WebCenter Portal, to help partners prepare for the WebCenter Implementation Specialist exams. This is a three-hour intensive refresher, partner-only training session, providing attendees with an overview of WebCenter Content and WebCenter Portal functions and related topics. After the refresher you will be able to take the relevant Implementation Specialist exam, depending on your personal focus. NOTE: This is only suitable for experienced WebCenter Content or WebCenter Portal practitioners. Who should attend? Partner consultants who want to become an Oracle WebCenter Content or WebCenter Portal Certified Implementation Specialist (or both), which will help them differentiate themselves in front of customers and support their companies in becoming Specialized. Webcast details: Click here to read more...
    Specialized Partners Only! New Service to Promote Your Events: The Partner Event Publisher has just been made available to all specialized partners in EMEA. Partners now have the opportunity to publish their events to the Oracle.com/events site and spread the word on their upcoming live in-person and/or live webcast events. Click here to read more information and watch a short video demo.
    VADs Get Specialized: Effective November 1, 2011, VADs with a valid Value Added Distributor Agreement will no longer be required to meet the customer reference requirements outlined in the business criteria section in order to become specialized. VADs must continue to meet all other business and competency criteria set forth in the applicable Knowledge Zone prior to specialization approval.
    New Certification: Pillar Axiom 600 Storage System: Your opportunity to take the Pillar Axiom 600 Storage System Essentials (1Z0-581) exam is available now in beta. Pass the exam to become a Pillar Axiom 600 Storage System Implementation Specialist! Free vouchers are available for Oracle Partners. If you would like to receive a free beta exam voucher, please send your request to [email protected] and include your name, business email address, company, and the exam name: Pillar Axiom 600 Storage System Essentials beta exam.
    New Certification Available: Oracle Utilities Customer Care and Billing: Oracle Utilities Customer Care and Billing is a solution designed to help you meet market windows and regulatory deadlines while enjoying a low total cost of ownership and a high return on investment. Take the Oracle Utilities Customer Care and Billing 2 Essentials (1Z0-562) exam now to become an Oracle Utilities Customer Care and Billing 2 Essentials Implementation Specialist.
    MEASURING YOUR SUCCESS
    We had 1674 Specialized Partners covering 5364 Specializations. Please note that due to OPN contract renewals, at any given point in time there are valid Specialized Partners and Specializations which are temporarily not captured in the total statistics. An incremental 1961 individuals were accredited as Implementation Specialists, giving an EMEA cumulative total of 9598 Implementation Specialists. 26 ISVs obtained one or more 'Ready' statuses, for a total of 53.
    Don't forget! You can submit your own press releases to Oracle! Every time you achieve specialization we'd like to support you in getting the message out! Press guidelines and a submission link can be found on the OPN Portal here.

    Read the article

  • Getting Help with 'SEPA' Questions

    - by MargaretW
    What is 'SEPA'? The Single Euro Payments Area (SEPA) is a self-regulatory initiative for the European banking industry championed by the European Commission (EC) and the European Central Bank (ECB). The aim of the SEPA initiative is to improve the efficiency of cross border payments and the economies of scale by developing common standards, procedures, and infrastructure. The SEPA territory currently consists of 33 European countries -- the 28 EU states, together with Iceland, Liechtenstein, Monaco, Norway and Switzerland. Part of that infrastructure includes two new SEPA instruments that were introduced in 2008: SEPA Credit Transfer (a Payables transaction in Oracle EBS) SEPA Core Direct Debit (a Receivables transaction in Oracle EBS) A SEPA Credit Transfer (SCT) is an outgoing payment instrument for the execution of credit transfers in Euro between customer payment accounts located in SEPA. SEPA Credit Transfers are executed on behalf of an Originator holding a payment account with an Originator Bank in favor of a Beneficiary holding a payment account at a Beneficiary Bank. In R12 of Oracle applications, the current SEPA credit transfer implementation is based on Version 5 of the "SEPA Credit Transfer Scheme Customer-To-Bank Implementation Guidelines" and the "SEPA Credit Transfer Scheme Rulebook" issued by European Payments Council (EPC). These guidelines define the rules to be applied to the UNIFI (ISO20022) XML message standards for the implementation of the SEPA Credit Transfers in the customer-to-bank space. This format is compliant with SEPA Credit Transfer version 6. A SEPA Core Direct Debit (SDD) is an incoming payment instrument used for making domestic and cross-border payments within the 33 countries of SEPA, wherein the debtor (payer) authorizes the creditor (payee) to collect the payment from his bank account. The payment can be a fixed amount like a mortgage payment, or variable amounts such as those of invoices. The "SEPA Core Direct Debit" scheme replaces various country-specific direct debit schemes currently prevailing within the SEPA zone. SDD is based on the ISO20022 XML messaging standards, version 5.0 of the "SEPA Core Direct Debit Scheme Rulebook", and "SEPA Direct Debit Core Scheme Customer-to-Bank Implementation Guidelines". This format is also compliant with SEPA Core Direct Debit version 6. EU Regulation #260/2012 established the technical and business requirements for both instruments in euro. The regulation is referred to as the "SEPA end-date regulation", and also defines the deadlines for the migration to the new SEPA instruments: Euro Member States: February 1, 2014 Non-Euro Member States: October 31, 2016. 
    Oracle and SEPA
    Within the Oracle E-Business Suite of applications, Oracle Payables (AP), Oracle Receivables (AR), and Oracle Payments (IBY) provide SEPA transaction capabilities for the following releases, as noted:
    - Release 11.5.10.x - AP & AR
    - Release 12.0.x - AP & AR & IBY
    - Release 12.1.x - AP & AR & IBY
    - Release 12.2.x - AP & AR & IBY
    Resources
    To assist our customers in migrating, using, and troubleshooting SEPA functionality, a number of resource documents related to SEPA are available on My Oracle Support (MOS), including:
    - R11i: AP: White Paper - SEPA Credit Transfer V5 support in Oracle Payables, Doc ID 1404743.1
    - R11i: AR: White Paper - SEPA Core Direct Debit v5.0 support in Oracle Receivables, Doc ID 1410159.1
    - R12: IBY: White Paper - SEPA Credit Transfer v5 support in Oracle Payments, Doc ID 1404007.1
    - R12: IBY: White Paper - SEPA Core Direct Debit v5 support in Oracle Payments, Doc ID 1420049.1
    - R11i/R12: AP/AR/IBY: Get Help Setting Up, Using, and Troubleshooting SEPA Payments in Oracle, Doc ID 1594441.2
    - R11i/R12: Single European Payments Area (SEPA) - UPDATES, Doc ID 1541718.1
    - R11i/R12: FAQs for Single European Payments Area (SEPA), Doc ID 791226.1

    Read the article

  • Oracle Excellence Award

    - by Hartmut Wiese
    CALL FOR NOMINATIONS
    2014 Oracle Excellence Award: Sustainability Innovation
    Is your organization using an Oracle product to help with a sustainability initiative while reducing costs? Saving energy? Saving gas? Saving paper? For example, you may use Oracle's Agile Product Lifecycle Management to design more eco-friendly products, Oracle Transportation Management to reduce fleet emissions, Oracle Exadata Database Machine to decrease power and cooling needs while increasing database performance, Oracle Business Intelligence to measure environmental impacts, or one of many other Oracle products. Your organization may be eligible for the 2014 Oracle Excellence Award: Sustainability Innovation. Submit a nomination form located here by Friday, June 20 if your company is using any Oracle product to take an environmental lead as well as to reduce costs and improve business efficiencies by using green business practices. These awards will be presented during Oracle OpenWorld 2014 (September 28-October 2) in San Francisco.
    About the Award
    • Winners will be selected from the customer and/or partner nominations. Either a customer, their partner, or an Oracle representative can submit the nomination form on behalf of the customer.
    • There is a nomination form here to discuss your use of Oracle products and how they have helped your sustainability efforts and reduced costs.
    • Winners will be selected based on the extent of the environmental impact they have had as well as the business efficiencies they have achieved through their combined use of Oracle products.
    Nomination Eligibility
    • Your company uses at least one component of Oracle products, whether it's the Oracle database, business applications, Fusion Middleware, or Sun servers/storage.
    • This solution should be in production or in active development.
    • Nomination deadline: Friday, June 20, 2014.
    Benefits to Award Winners
    • Award presented to winners during Oracle OpenWorld by Jeff Henley, Oracle Chairman of the Board
    • Free Oracle OpenWorld registration pass for each winning customer
    • 2014 Oracle Excellence Award: Sustainability Innovation award logo for inclusion on your own website and/or press release
    • Possible placement in Oracle Profit Magazine and/or Oracle Magazine
    • 'Enable the Eco-Enterprise' podcast opportunity
    See last year's winners here
    Questions?
Send an email to: [email protected] Follow Oracle’s Sustainability Solutions on Twitter, LinkedIn, YouTube, and the Sustainability Matters blog Web page with award details:  http://www.oracle.com/us/products/applications/green/call-for-nominations-185050.html  

    Read the article

  • Multichannel Digital Engagement: Find Out How Your Organization Measures Up

    - by Michael Snow
    This article was originally published in the September 2013 Edition of the Oracle Information InDepth Newsletter ORACLE WEBCENTER EDITION Thanks to mobile and social technologies, interactive online experiences are now commonplace. Not only that, they give consumers more choices, influence, and control than ever before. So how can you make your organization stand out? The key building blocks for delivering exceptional cross-channel digital experiences are outlined below. Also, a new assessment tool is available to help you measure your organization's ability to deliver such experiences. A clearly defined digital strategy. The customer journey is growing increasingly complex, encompassing multiple touchpoints and channels. It used to be easy to map marketing efforts to specific offline channels; for example, a direct mail piece with an offer to visit a store for a discounted purchase. Now it is more difficult to cultivate and track such clear cause-and-effect relationships. To deliver an integrated digital experience in this more complex world, organizations need a clearly defined and comprehensive digital marketing strategy that is backed up by an integrated set of software, middleware, and hardware solutions. Strong support for business agility and speed-to-market. As both IT and marketing executives know, speed-to-market and business agility are key to competitive advantage. That means marketers need solutions to support the rapid implementation of online marketing initiatives—plus the flexibility to adapt quickly to a changing marketplace. And IT needs tools with the performance, scalability, and ease of integration to support marketing efforts. Both teams benefit when business users are empowered to implement marketing initiatives on their own, with minimal IT intervention. The ability to deliver relevant, personalized content. Delivering a one-size-fits-all online customer experience is no longer acceptable. Customers expect you to know who they are, including their preferences and past relationship with your brand. That means delivering the most relevant content from the moment a visitor enters your site. To make that happen, you need a powerful rules engine so that marketers and business users can easily define site visitor segments and deliver content accordingly. That includes both implicit targeting that is based on the user’s behavior, and explicit targeting that takes a user’s profile information into account. Ideally, the rules engine can also intelligently weight recommendations when multiple segments apply to a specific customer. Support for social interactivity. With the advent of Facebook and LinkedIn, visitors expect to participate in and contribute to your web presence—and share their experience on their own social networks. That requires easy incorporation of user-generated content such as comments, ratings, reviews, polls, and blogs; seamless integration with third-party social networking sites; and support for social login, which helps to remove barriers to social participation. The ability to deliver connected, multichannel experiences that include powerful, flexible mobile capabilities. By 2015, mobile usage is projected to surpass that of PCs and other wired devices. In other words, mobile is an essential element in delivering exceptional online customer experiences. This requires the creation and management of mobile experiences that are optimized for delivery to the thousands of different devices that are in use today. 
Just as important, organizations must be able to easily extend their traditional web presence to the mobile channel and deliver highly personalized and relevant multichannel marketing initiatives while also managing to minimize the time and effort required to manage mobile sites. Are you curious to know how your organization measures up when it comes to delivering an engaging, multichannel digital experience? If so, take this brief, 15-question online assessment and see how your organization scores in the areas of digital strategy, digital agility, relevance and personalization, social interactivity, and multichannel experience.

    Read the article

  • Code Trivia: optimize the code for multiple nested loops

    - by CodeToGlory
    I came across this code today and am wondering what are some of the ways we can optimize it. Obviously the model is hard to change as it is legacy, but I am interested in getting opinions. I changed some names around and blurred out some core logic to protect it.
    private static Payment FindPayment(Order order, Customer customer, int paymentId)
    {
        Payment payment = order.Payments.FindById(paymentId);
        if (payment != null)
        {
            if (payment.RefundPayment == null)
            {
                return payment;
            }
            if (String.Compare(payment.RefundPayment, "refund", true) != 0)
            {
                return payment;
            }
        }
        Payment finalPayment = null;
        foreach (Payment testPayment in order.Payments)
        {
            if (testPayment.Customer.Name != customer.Name) { continue; }
            if (testPayment.Cancelled) { continue; }
            if (testPayment.RefundPayment != null)
            {
                if (String.Compare(testPayment.RefundPayment, "refund", true) == 0)
                {
                    continue;
                }
            }
            if (finalPayment == null)
            {
                finalPayment = testPayment;
            }
            else
            {
                if (testPayment.Value > finalPayment.Value)
                {
                    finalPayment = testPayment;
                }
            }
        }
        if (finalPayment == null)
        {
            return payment;
        }
        return finalPayment;
    }
    Making this a wiki so code enthusiasts can answer without worrying about points.
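    One possible direction, for comparison, is to replace the manual loop with a LINQ query that keeps the same observable behaviour. This is only a sketch: it assumes order.Payments can be enumerated with LINQ (requires using System.Linq) and introduces an IsRefund helper that is not part of the original model.
        private static bool IsRefund(Payment p)
        {
            return p.RefundPayment != null
                && String.Compare(p.RefundPayment, "refund", true) == 0;
        }

        private static Payment FindPayment(Order order, Customer customer, int paymentId)
        {
            Payment payment = order.Payments.FindById(paymentId);

            // A payment matched by id wins outright unless it is a refund.
            if (payment != null && !IsRefund(payment))
                return payment;

            // Otherwise pick the highest-value live payment for this customer.
            // OrderByDescending is stable, so ties keep the earliest payment,
            // matching the strict ">" comparison in the original loop.
            Payment best = order.Payments
                .Where(p => p.Customer.Name == customer.Name)
                .Where(p => !p.Cancelled && !IsRefund(p))
                .OrderByDescending(p => p.Value)
                .FirstOrDefault();

            return best ?? payment;
        }
    Whether this is measurably faster depends on how Payments is materialized; the main gain is that the selection rules become readable at a glance.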

    Read the article

  • TypeConverter prevents ApplyPropertyChanges in EntityFramework

    - by Felix
    I ran into an interesting problem (hopefully, interesting not just for me :) I am running Entity Framework 1 (.NET 3.5) and ASP.NET MVC 2. I have a Customer class that has many-to-one relationship with Country class (in other words, Country is a lookup table for customers - I described more in this post: http://stackoverflow.com/questions/2404801/explicit-casting-doesnt-work-in-default-model-binding ) I got TypeConverter to work; so I am getting a perfect object into controller's Post method. Create works fine; however, in Edit I am getting the following error when I call ApplyPropertyChanges: The existing object in the ObjectContext is in the Added state. Changes can only be applied when the existing object is in an unchanged or modified state. The controller code is fairly trivial: public ActionResult Edit(Customer customerToEdit) { if (ModelState.IsValid) { Customer cust = (Customer)context.GetObjectByKey( new EntityKey("BAEntities.Customers", "CustomerID", customerToEdit.CustomerID)); context.ApplyPropertyChanges(cust.EntityKey.EntitySetName, customerToEdit); context.SaveChanges(); } return View(...); } If I remove country from the form, it works fine; and if I assign dropdown value to EntityReference "manually" - it works as well. TypeConverter code is also fairly simple, but I've never used TypeConverter before, so I may be missing something here: public override object ConvertFrom(ITypeDescriptorContext typeContext, CultureInfo culture, object value) { if (value is string) { int countryID = Int16.Parse((string)value); Country country = (Country)context.GetObjectByKey( new EntityKey("BAEntities.Countries", "CountryID", countryID)); return country; } return base.ConvertFrom(typeContext, culture, value); }

    Read the article

  • What's wrong with this HQL query?

    - by ManBugra
    did i encounter a hibernate bug or do i have an error i dont see: select enty.number from EntityAliasName enty where enty.myId in ( select cons.myId from Consens cons where cons.number in ( select ord.number from Orders ord where ord.customer = :customer and ord.creationDate < ( select max(ord.creationDate) from Orders ord where ord.customer = :customer ) ) ) what i do get is the following: org.hibernate.util.StringHelper.root(StringHelper.java:257) Caused by: java.lang.NullPointerException at org.hibernate.util.StringHelper.root(StringHelper.java:257) at org.hibernate.persister.entity.AbstractEntityPersister.getSubclassPropertyTableNumber(AbstractEntityPersister.java:1391) at org.hibernate.persister.entity.BasicEntityPropertyMapping.toColumns(BasicEntityPropertyMapping.java:54) at org.hibernate.persister.entity.AbstractEntityPersister.toColumns(AbstractEntityPersister.java:1367) at org.hibernate.hql.ast.tree.FromElement.getIdentityColumn(FromElement.java:320) at org.hibernate.hql.ast.tree.IdentNode.resolveAsAlias(IdentNode.java:154) at org.hibernate.hql.ast.tree.IdentNode.resolve(IdentNode.java:100) at org.hibernate.hql.ast.tree.FromReferenceNode.resolve(FromReferenceNode.java:117) at org.hibernate.hql.ast.tree.FromReferenceNode.resolve(FromReferenceNode.java:113) at org.hibernate.hql.ast.HqlSqlWalker.resolve(HqlSqlWalker.java:854) at org.hibernate.hql.antlr.HqlSqlBaseWalker.propertyRef(HqlSqlBaseWalker.java:1172) at org.hibernate.hql.antlr.HqlSqlBaseWalker.propertyRefLhs(HqlSqlBaseWalker.java:5167) at org.hibernate.hql.antlr.HqlSqlBaseWalker.propertyRef(HqlSqlBaseWalker.java:1133) at org.hibernate.hql.antlr.HqlSqlBaseWalker.selectExpr(HqlSqlBaseWalker.java:1993) at org.hibernate.hql.antlr.HqlSqlBaseWalker.selectExprList(HqlSqlBaseWalker.java:1932) at org.hibernate.hql.antlr.HqlSqlBaseWalker.selectClause(HqlSqlBaseWalker.java:1476) at org.hibernate.hql.antlr.HqlSqlBaseWalker.query(HqlSqlBaseWalker.java:580) at org.hibernate.hql.antlr.HqlSqlBaseWalker.selectStatement(HqlSqlBaseWalker.java:288) at org.hibernate.hql.antlr.HqlSqlBaseWalker.statement(HqlSqlBaseWalker.java:231) at org.hibernate.hql.ast.QueryTranslatorImpl.analyze(QueryTranslatorImpl.java:254) at org.hibernate.hql.ast.QueryTranslatorImpl.doCompile(QueryTranslatorImpl.java:185) at org.hibernate.hql.ast.QueryTranslatorImpl.compile(QueryTranslatorImpl.java:136) at org.hibernate.engine.query.HQLQueryPlan.<init>(HQLQueryPlan.java:101) at org.hibernate.engine.query.HQLQueryPlan.<init>(HQLQueryPlan.java:80) at org.hibernate.engine.query.QueryPlanCache.getHQLQueryPlan(QueryPlanCache.java:94) at org.hibernate.impl.SessionFactoryImpl.checkNamedQueries(SessionFactoryImpl.java:484) at org.hibernate.impl.SessionFactoryImpl.<init>(SessionFactoryImpl.java:394) at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1341) using: Hibernate 3.3.2.GA / postgresql

    Read the article

  • DATE lookup table (1990/01/01:2041/12/31)

    - by Frank Developer
    I use a DATE's master table for looking up dates and other values in order to control several events, intervals and calculations within my app. It has rows for every single day begining from 01/01/1990 to 12/31/2041. One example of how I use this lookup table is: A customer pawned an item on: JAN-31-2010 Customer returns on MAY-03-2010 to make an interest pymt to avoid forfeiting the item. If he pays 1 months interest, the employee enters a "1" and the app looks-up the pawn date (JAN-31-2010) in date master table and puts FEB-28-2010 in the applicable interest pymt date. FEB-28 is returned because FEB-31's dont exist! If 2010 were a leap-year, it would've returned FEB-29. If customer pays 2 months, MAR-31-2010 is returned. 3 months, APR-30... If customer pays more than 3 months or another period not covered by the date lookup table, employee manually enters the applicable date. Here's what the date lookup table looks like: { Copyright 1990:2010, Frank Computer, Inc. } { DBDATE=YMD4- (correctly sorted for faster lookup) } CREATE TABLE datemast ( dm_lookup DATE, {lookup col used for obtaining values below} dm_workday CHAR(2), {NULL=Normal Working Date,} {NW=National Holiday(Working Date),} {NN=National Holiday(Non-Working Date),} {NH=National Holiday(Half-Day Working Date),} {CN=Company Proclamated(Non-Working Date),} {CH=Company Proclamated(Half-Day Working Date)} {several other columns omitted} dm_description CHAR(30), {NULL, holiday description or any comments} dm_day_num SMALLINT, {number of elapsed days since begining of year} dm_days_left SMALLINT, (number of remaining days until end of year} dm_plus1_mth DATE, {plus 1 month from lookup date} dm_plus2_mth DATE, {plus 2 months from lookup date} dm_plus3_mth DATE, {plus 3 months from lookup date} dm_fy_begins DATE, {fiscal year begins on for lookup date} dm_fy_ends DATE, {fiscal year ends on for lookup date} dm_qtr_begins DATE, {quarter begins on for lookup date} dm_qtr_ends DATE, {quarter ends on for lookup date} dm_mth_begins DATE, {month begins on for lookup date} dm_mth_ends DATE, {month ends on for lookup date} dm_wk_begins DATE, {week begins on for lookup date} dm_wk_ends DATE, {week ends on for lookup date} {several other columns omitted} ) IN "S:\PAWNSHOP.DBS\DATEMAST"; Is there a better way of doing this or is it a cool method?
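    As a point of comparison, the month-end clamping described above can also be computed on the fly rather than looked up. A minimal C# sketch, purely illustrative and not tied to the Informix schema above:
        using System;

        static class PawnDates
        {
            // Interest payment due date N months after the pawn date.
            // DateTime.AddMonths clamps day-of-month overflow, so
            // JAN-31-2010 plus one month yields FEB-28-2010, and the same
            // calculation in a leap year yields FEB-29.
            public static DateTime DueDate(DateTime pawnDate, int monthsPaid)
            {
                return pawnDate.AddMonths(monthsPaid);
            }
        }
    A computed approach covers any date range, while the lookup table keeps holiday and fiscal-period attributes in one place; the two are not mutually exclusive.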

    Read the article

  • How to group strings by prefix

    - by namenlos
    I am writing a WinForms UI in which the user must select a single customer. (For reasons beyond my control I am limited to a UI that uses dropdown lists, text fields, checkboxes and radio buttons only - i.e. no fancy special UI controls.) The situation: there are a lot of customers (a thousand, for example). If I put all the customers in a single dropdown there's no way it will be easy for the user to even see all the customers. Also, it will take too long to retrieve all the customers from the DB to populate the dropdown. My thought is to have two combo boxes; the first lists groups of the customers by their last name, something like a phone book ("Aa-Ac", "Ad-Ade", "Adf-B"), and selecting from the first combo box scopes the second one to a manageable set of customer names (no more than, for example, 40 names). The question: I need a reasonable way of grouping the names such that it will be clear to the user which group contains the name, i.e. given a group of names I need to bucketize them into "Aa-Ac", etc. Comments: I don't need to solve the general problem of an immense number of names - we know based on our data that 1000 names is the max our users will encounter. If there are other techniques please do share, but I am interested specifically in an answer to my specific question around how to determine the buckets ("Aa-Ac", etc.).
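    A sketch of one way to determine the buckets, assuming the last names can be pulled sorted from the database; the class name and the two-letter prefix length are assumptions made here for illustration:
        using System;
        using System.Collections.Generic;

        static class NameBuckets
        {
            // Splits sorted last names into buckets of at most 'size' and labels
            // each bucket by the range of leading letters it covers, e.g. "Aa-Ac".
            public static List<KeyValuePair<string, List<string>>> Build(
                IList<string> sortedNames, int size)
            {
                var buckets = new List<KeyValuePair<string, List<string>>>();
                for (int start = 0; start < sortedNames.Count; start += size)
                {
                    int count = Math.Min(size, sortedNames.Count - start);
                    var bucket = new List<string>();
                    for (int i = start; i < start + count; i++)
                        bucket.Add(sortedNames[i]);

                    string label = Prefix(bucket[0]) + "-" + Prefix(bucket[bucket.Count - 1]);
                    buckets.Add(new KeyValuePair<string, List<string>>(label, bucket));
                }
                return buckets;
            }

            private static string Prefix(string name)
            {
                return name.Length <= 2 ? name : name.Substring(0, 2);
            }
        }
    The first combo box is bound to the labels and picking one fills the second combo box from that bucket only, so at most 'size' customer rows ever appear per selection. If two adjacent buckets would end up with the same two-letter label, lengthen the prefix for those buckets.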

    Read the article

  • Annotation to make available generic type

    - by mdma
    Given a generic interface like
    interface DomainObjectDAO<T> {
        T newInstance();
        void add(T t);
        void remove(T t);
        T findById(int id);
        // etc...
    }
    I'd like to create a subinterface that specifies the type parameter:
    interface CustomerDAO extends DomainObjectDAO<Customer> {
        // customer-specific queries - incidental.
    }
    The implementation needs to know the actual template parameter type, but of course type erasure means it isn't available at runtime. Is there some annotation that I could include to declare the interface type? Something like
    @GenericParameter(Customer.class)
    interface CustomerDAO extends DomainObjectDAO<Customer> {
    }
    The implementation could then fetch this annotation from the interface and use it as a substitute for runtime generic type access. Some background: this interface is implemented using JDK dynamic proxies as outlined here. The non-generic version of this interface has been working well, but it would be nicer to use generics and not have to create a subinterface for each domain object type. The actual type is needed at runtime to implement the newInstance method, amongst others.

    Read the article

  • Ternary (and n-ary) relationships in Hibernate

    - by Bytecode Ninja
    Q 1) How can we model a ternary relationship using Hibernate? For example, how can we model the ternary relationship presented here using Hibernate (or JPA)? Ideally I prefer my model to be like this:
    class SaleAssistant { Long id; //... }
    class Customer { Long id; //... }
    class Product { Long id; //... }
    class Sale {
        SaleAssistant soldBy;
        Customer buyer;
        Product product;
        //...
    }
    Q 1.1) How can we model this variation, in which each Sale item might have many Products (SaleAssistant, Customer and Product as above)?
    class Sale {
        SaleAssistant soldBy;
        Customer buyer;
        Set<Product> products;
        //...
    }
    Q 2) In general, how can we model n-ary relationships, with n >= 3, in Hibernate? Thanks in advance.

    Read the article

  • How is jQuery so fast?

    - by ClarkeyBoy
    Hey, I have a rather large application which, on the admin frontend, takes a few seconds to load a page because of all the pageviews that it has to load into objects before displaying anything. Its a bit complex to explain how the system works, but a few of my other questions explains the system in great detail. The main difference between what they say and the current system is that the customer frontend no longer loads all the pageviews into objects when a customer first views the page - it simply adds the pageview to the database and creates an object in an unsynchronised list... to put it simply, when a customer views a page it no longer loads all the pageviews into objects; but the admin frontend still does. I have been working on some admin tools on the customer frontend recently, so if an administrator clicks the description of an item in the catalogue then the right hand column will display statistics and available actions for the selected item. To do this the page which gets loaded (through $('action-container').load(bla bla bla);) into the right hand column has to loop through ALL the pageviews - this ultimately means that ALL the pageviews are loaded into objects if they haven't been already. For some reason this loads really REALLY fast. The difference in speed is only like a second on my dev site, but the live site has thousands of pageviews so the difference is quite big... So my question is: why is it that the admin frontend loads so slowly while using $(bla).load(bla); is so fast? I mean whatever method jQuery uses, can't browsers use this method too and load pages super-fast? Obviously not as someone would've done that by now - but I am interested to know just why the difference is so big... is it just my system or is there a major difference in speed between the browser getting a page and jQuery getting a page? Do other people experience the same kind of differences? Thanks in advance, Regards, Richard

    Read the article

  • Asp.Net MVC 2: How exactly does a view model bind back to the model upon post back?

    - by Dr. Zim
    Sorry for the length, but a picture is worth 1000 words: In ASP.NET MVC 2, the input form field "name" attribute must contain exactly the syntax below that you would use to reference the object in C# in order to bind it back to the object upon post back. That said, if you have an object like the following where it contains multiple Orders having multiple OrderLines, the names would look and work well like this (case sensitive): This works: Order[0].id Order[0].orderDate Order[0].Customer.name Order[0].Customer.Address Order[0].OrderLine[0].itemID // first order line Order[0].OrderLine[0].description Order[0].OrderLine[0].qty Order[0].OrderLine[0].price Order[0].OrderLine[1].itemID // second order line, same names Order[0].OrderLine[1].description Order[0].OrderLine[1].qty Order[0].OrderLine[1].price However we want to add order lines and remove order lines at the client browser. Apparently, the indexes must start at zero and contain every consecutive index number to N. The black belt ninja Phil Haack's blog entry here explains how to remove the [0] index, have duplicate names, and let MVC auto-enumerate duplicate names with the [0] notation. However, I have failed to get this to bind back using a nested object: This fails: Order.id // Duplicate names should enumerate at 0 .. N Order.orderDate Order.Customer.name Order.Customer.Address Order.OrderLine.itemID // And likewise for nested properties? Order.OrderLine.description Order.OrderLine.qty Order.OrderLine.price Order.OrderLine.itemID Order.OrderLine.description Order.OrderLine.qty Order.OrderLine.price I haven't found any advice out there yet that describes how this works for binding back nested ViewModels on post. Any links to existing code examples or strict examples on the exact names necessary to do nested binding with ILists? Steve Sanderson has code that does this sort of thing here, but we cannot seem to get this to bind back to nested objects. Anything not having the [0]..[n] AND being consecutive in numbering simply drops off of the return object. Any ideas?
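    For reference, a minimal sketch of view-model shapes that the field names above would bind to with the DefaultModelBinder; the property and class definitions here are assumptions that simply mirror the names used in the question:
        using System;
        using System.Collections.Generic;

        public class OrderLine
        {
            public int itemID { get; set; }
            public string description { get; set; }
            public int qty { get; set; }
            public decimal price { get; set; }
        }

        public class Order
        {
            public int id { get; set; }
            public DateTime orderDate { get; set; }
            public Customer Customer { get; set; }          // Customer defined elsewhere
            public IList<OrderLine> OrderLine { get; set; } // indexed as OrderLine[0], OrderLine[1], ...
        }

        // For the binder to populate Order[0].OrderLine[1].qty, the posted form
        // must contain inputs whose names spell out the full path, with indexes
        // starting at 0 and no gaps, e.g.
        //   <input type="text" name="Order[0].OrderLine[1].qty" value="3" />
    Haack's post linked above describes a hidden "Index" field convention that relaxes the consecutive-index requirement for a collection, but each item still needs every one of its fields to share the same index value.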

    Read the article

  • Typecasting EntityObject

    - by AJ
    Hello, I'm new to C# and am stuck on the following. I have a Silverlight web service that uses LINQ to query a ADO.NET entity object. e.g.: [OperationContract] public List<Customer> GetData() { using (TestEntities ctx = new TestEntities()) { var data = from rec in ctx.Customer select rec; return data.ToList(); } } This works fine, but what I want to do is to make this more abstract. The first step would be to return a List<EntityObject> but this gives a compiler error, e.g.: [OperationContract] public List<EntityObject> GetData() { using (TestEntities ctx = new TestEntities()) { var data = from rec in ctx.Customer select rec; return data.ToList(); } } The error is: Error 1 Cannot implicitly convert type 'System.Collections.Generic.List<SilverlightTest.Web.Customer>' to 'System.Collections.Generic.IEnumerable<System.Data.Objects.DataClasses.EntityObject>'. An explicit conversion exists (are you missing a cast?) What am i doing wrong? Thanks, AJ
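    One way to satisfy the compiler, assuming the generated Customer entity derives from EntityObject (as EF1-generated entities do), is to cast the elements rather than the list type; a sketch:
        [OperationContract]
        public List<EntityObject> GetData()
        {
            using (TestEntities ctx = new TestEntities())
            {
                var data = from rec in ctx.Customer
                           select rec;

                // A List<Customer> is not a List<EntityObject>, even though every
                // Customer is an EntityObject, so convert element by element.
                return data.ToList().Cast<EntityObject>().ToList();
            }
        }
    Note that returning the base type from a WCF operation also requires the concrete entity types to be known to the serializer (for example via known-type attributes), so the abstraction may still leak at that layer.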

    Read the article

  • Subsonic - How to use SQL Schema / Owner name as part of the namespace?

    - by CResults
    Hi there, I've just started using SubSonic 2.2 and so far am very impressed - I think it'll save me some serious coding time. Before I dive into using it full time, though, there is something bugging me that I'd like to sort out. In my current database (a SQL 2008 db) I have split the tables, views, sps etc. up into separate chunks by schema/owner name, so all the customer tables are in the customer schema, products in the product schema, etc., so to select from the customer address table I'd do a select * from customer.address. Unfortunately, SubSonic ignores the schema/owner name and just gives me the base table name. This is fine as I've no duplicates between schemas (e.g. Customer.Address and Supplier.Address don't both exist), but I just feel the code could be clearer if I could split by schema. Ideally I'd like to be able to alter the namespace by schema/owner - I think this would have the least impact on SubSonic yet make the resulting code easier to read. Problem is, I've crawled all over the SubSonic source and don't have a clue how to do this (it doesn't help that I code in VB not C# - yes I know, blame the ZX Spectrum!!). If anyone has tackled this before or has an idea on how to solve it, I'd be really grateful. Thanks in advance. Ed

    Read the article

  • Rest WebService error handling.

    - by Pratik
    Hi there, I am using a REST web service for a few basic operations, like creating/searching. The request XML looks something like this:
    <customer>
        <name/>
        .....
    </customer>
    For a successful operation I return the same customer XML with extra fields populated in it (e.g. systemId etc., which were blank in the request), with Response.Status = 200. For an unsuccessful operation I return something like the following, with different error codes, e.g. Response.Status = 422 (Unprocessable Entity), Response.Status = 500 (Internal Server Error) and a few others:
    <errors>
        <error>An exception occurred while creating the customer</error>
        <error>blah argument is not valid.</error>
    </errors>
    Now I am not sure whether this is the correct way of sending the errors to the client. Maybe it should be present in the header of the response. I will really appreciate any help. Thanks!
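    Returning the failure details in the body while the header carries the status code is a common pattern; a rough ASP.NET-flavoured sketch (the hosting stack is not stated in the question, so treat the types used here as an assumption):
        using System.Collections.Generic;
        using System.Linq;
        using System.Web;
        using System.Xml.Linq;

        public static class ErrorWriter
        {
            public static void WriteErrors(HttpResponse response, IEnumerable<string> errors)
            {
                response.StatusCode = 422;                 // or 500, 400, ... as appropriate
                response.ContentType = "application/xml";
                var body = new XElement("errors",
                    errors.Select(e => new XElement("error", e)));
                response.Write(body.ToString());
            }
        }
    Keeping the machine-readable status in the header and the human-readable detail in the body lets clients branch on the code without having to parse the XML.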

    Read the article

  • Problem with saving new elements in my xml at right level

    - by Fore
    I have an XML file that looks like this:
    <DataTalk>
      <Posts>
        <TalkPost>
          <PostType>dialog</PostType>
          <User>ABBE</User>
          <Customer>HRM - Heroma</Customer>
          <PostedDate>0001-01-01T00:00:00</PostedDate>
          <Message>TEST</Message>
        </TalkPost>
      </Posts>
    </DataTalk>
    When I now want to save new elements, I do:
    document.Root.Add(new XElement("TalkPost",
        new XElement("PostType", newDialog.PostType),
        new XElement("User", newDialog.User),
        new XElement("Customer", newDialog.Customer),
        new XElement("PostedDate", newDialog.PostDate),
        new XElement("Message", newDialog.Message)));
    The problem is that the new elements get saved at the wrong hierarchical level. They all get saved under <DataTalk> and not under <Posts> as I want. How should I save the new elements under <Posts> hierarchically?
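    A minimal sketch of one way to target the <Posts> element instead of the root, assuming the file has been loaded into an XDocument called document (the filePath variable is assumed):
        // Add the new post under the existing <Posts> element rather than
        // directly under the <DataTalk> root.
        XElement posts = document.Root.Element("Posts");
        posts.Add(new XElement("TalkPost",
            new XElement("PostType", newDialog.PostType),
            new XElement("User", newDialog.User),
            new XElement("Customer", newDialog.Customer),
            new XElement("PostedDate", newDialog.PostDate),
            new XElement("Message", newDialog.Message)));
        document.Save(filePath);   // wherever the document was loaded from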

    Read the article

  • Circular dependency in winforms app using Castle Windsor

    - by Sven
    Hello, I was experimenting a little bit with Castle Windsor in a winforms project. I wanted to register all my form dependencies with Castle Windsor. This way I would have a single instance for all my forms. Now I have a problem though. I'm in a situation where form X has a dependency on form Y and form Y has a dependency on form X. Practical example maybe: form X is used to create an order, form Y is the screen that has a list of customers. From form X there is a button to select a customer for the order. This will open form Y where you can search for the customer. There is a button that lets you add the found customer to the order. It will call a method on form X and pass the selected customer object. I could do this with events: raise an event in form Y and listen for it in form X. But isn't there a way around the circular dependency in Castle Windsor, lazy registration or something? Can anyone help me out? Thanks in advance
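    For what it's worth, the event route mentioned above is already enough to remove the cycle, because only one of the two forms then needs a reference to the other. A rough sketch (type and member names are assumptions, not taken from the actual app):
        using System;
        using System.Windows.Forms;

        // Form Y (customer list) raises an event instead of calling back into form X.
        public class CustomerPickedEventArgs : EventArgs
        {
            public Customer Picked { get; set; }
        }

        public class CustomerListForm : Form
        {
            public event EventHandler<CustomerPickedEventArgs> CustomerPicked;

            private Customer selectedCustomer;   // set from the search results (assumed)

            private void addToOrderButton_Click(object sender, EventArgs e)
            {
                var handler = CustomerPicked;
                if (handler != null)
                    handler(this, new CustomerPickedEventArgs { Picked = selectedCustomer });
            }
        }

        // Form X (order entry) depends on form Y, but form Y no longer depends on
        // form X, so Windsor can build both without a circular constructor chain.
        public class OrderForm : Form
        {
            public OrderForm(CustomerListForm customerList)
            {
                customerList.CustomerPicked += (s, e) => ApplyCustomer(e.Picked);
            }

            private void ApplyCustomer(Customer customer) { /* fill in the order's customer */ }
        }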

    Read the article

  • Looping through recordset with VBA

    - by Robert
    I am trying to assign salespeople (rsSalespeople) to customers (rsCustomers) in a round-robin fashion in the following manner: Navigate to first Customer, assign the first SalesPerson to the Customer. Move to Next Customer. If rsSalesPersons is not at EOF, move to Next SalesPerson; if rsSalesPersons is at EOF, MoveFirst to loop back to the first SalesPerson. Assign this (current) SalesPerson to the (current) Customer. Repeat step 2 until rsCustomers is at EOF (EOF = True, i.e. End-Of-Recordset). It's been awhile since I dealt with VBA, so I'm a bit rusty, but here is what I have come up with, so far: Private Sub Command31_Click() 'On Error GoTo ErrHandler Dim intCustomer As Integer Dim intSalesperson As Integer Dim rsCustomers As DAO.Recordset Dim rsSalespeople As DAO.Recordset Dim strSQL As String strSQL = "SELECT CustomerID, SalespersonID FROM Customers WHERE SalespersonID Is Null" Set rsCustomers = CurrentDb.OpenRecordset(strSQL) strSQL = "SELECT SalespersonID FROM Salespeople" Set rsSalespeople = CurrentDb.OpenRecordset(strSQL) rsCustomers.MoveFirst rsSalespeople.MoveFirst Do While Not rsCustomers.EOF intCustomers = rsCustomers!CustomerID intSalesperson = rsSalespeople!SalespersonID strSQL = "UPDATE Customers SET SalespersonID = " & intSalesperson & " WHERE CustomerID = " & intCustomer DoCmd.RunSQL (strSQL) rsCustomers.MoveNext If Not rsSalespeople.EOF Then rsSalespeople.MoveNext Else rsSalespeople.MoveFirst End If Loop ExitHandler: Set rsCustomers = Nothing Set rsSalespeople = Nothing Exit Sub ErrHandler: MsgBox (Err.Description) Resume ExitHandler End Sub My tables are defined like so: Customers --CustomerID --Name --SalespersonID Salespeople --SalespersonID --Name With ten customers and 5 salespeople, my intended result would like like: CustomerID--Name--SalespersonID 1---A---1 2---B---2 3---C---3 4---D---4 5---E---5 6---F---1 7---G---2 8---H---3 9---I---4 10---J---5 The above code works for the intitial loop through the Salespeople recordset, but errors out when the end of the recordset is found. Regardless of the EOF, it appears it still tries to execute the rsSalespeople.MoveFirst command. Am I not checking for the rsSalespeople.EOF properly? Any ideas to get this code to work?

    Read the article

  • Storing an object to use in multiple classes

    - by Aaron Sanders
    I am wondering the best way to store an object in memory that is used in a lot of classes throughout an application. Let me set up my problem for you: We have multiple databases, 1 per customer. We also have a master table and each row is detailed information about the databases such as database name, server IP it's located and a few config settings. I have an application that loops through those multiple databases and runs some updates on them. The settings I mentioned above are updated each loop iteration into memory. The application then runs through series of processes that include multiple classes using this data. The data never changes during the processes, only during the loop iteration. The variables are related to a customer, so I have them stored in a customer class. I suppose I could make all of the members shared or should I use a singleton for the customer class? I've never actually used a singleton, only read they are good in this type of situation. Are there better solutions to this type of scenario? Also, I could have plans for this application to be multithreaded later. Sorry if this is confusing. If you have questions, let me know and I will answer them. Thanks for your help.
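    If the singleton route is chosen, a minimal thread-aware sketch could look like the following; CustomerContext and CustomerSettings are names invented here purely for illustration:
        // Stand-in for the fields read from the master table for one customer.
        public class CustomerSettings
        {
            public string DatabaseName { get; set; }
            public string ServerIP { get; set; }
        }

        // Single shared holder for the customer currently being processed.
        public sealed class CustomerContext
        {
            private static readonly CustomerContext instance = new CustomerContext();
            public static CustomerContext Current { get { return instance; } }

            private CustomerContext() { }

            private readonly object sync = new object();
            private CustomerSettings settings;

            // Called once per loop iteration, before the processing classes run.
            public void Load(CustomerSettings value)
            {
                lock (sync) { settings = value; }
            }

            // Read from any class that needs the current customer's database info.
            public CustomerSettings Settings
            {
                get { lock (sync) { return settings; } }
            }
        }
    If the application later processes several customers on different threads, a single mutable Current becomes a hazard; at that point, passing the CustomerSettings instance into the classes that need it (plain constructor injection) tends to scale better than widening the singleton.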

    Read the article
