Search Results

Search found 11493 results on 460 pages for 'customer success'.


  • OK - What now? How do we become a Social Business?

    - by Michael Snow
    We hope that those of you that attended yesterday's Webcast with Brian Solis enjoyed Brian's discussion with Christian Finn for our last Webcast of the season for the Oracle Social Business Thought Leaders Series. For those of you that may have missed the webcast or were stuck at a company holiday party - you'll be glad to hear that the webcast will be available On-Demand starting later today (12/14/12). And any of you who'd like to listen to a quick but informative podcast with Brian can listen to that here. Some of you may still be left with questions about how to get from point A to point B, and even more confused than when you started thinking about this new world of Digital Darwinism. The post below, grabbed from an abundance of great thought leadership prose on Brian's blog, may help you frame the path you need to start walking sooner rather than later to stay off of the endangered species list. As you explore your path forward, please keep Oracle in mind - we do offer a wide range of solutions to help your organization optimize the engagement for your customers, employees and partners.
    The Path from a Social Brand to a Social Business Brian Solis Originally posted May 2, 2012 I've been a long-time supporter of MediaTemple's (MT) Residence program along with Gary Vaynerchuk, Neil Patel, and many others whom I respect. I wanted to share my "7 questions to answer to become a social business" with you here. Social media is pervasive and is becoming the new normal in corporate marketing. Brands that get this right are starting to build their own media networks rich with customer connections numbering in the millions. Right now, Coca-Cola has over 34 million fans on Facebook, but they're hardly alone. Disney follows just behind with 29 million fans, Starbucks boasts 25 million, and Oreo, Red Bull, and Converse play host to over 20 million fans. If we were to look at other networks such as Twitter and YouTube, we would see a recurring theme. People are connecting en masse with the businesses they support, and new media represents the ability to cultivate consumer relationships in ways not possible with traditional earned or paid media. Sounds great, right? This might sound abrupt, but the truth is that we're hardly realizing the potential of what lies before us. Everything begins with understanding not just how other brands are marketing themselves in social media, but also seeing what they're not doing and envisioning what's possible. We're already approaching the first of many crossroads that new media will present. Do we take the path of a social brand or that of a social business? What's the difference? A social brand is just that, a business that is remodeling or retrofitting its existing marketing practices to new media.
    A social business is something altogether different, as it embraces introspection and extrospection to reevaluate internal and external processes, systems, and opportunities to transform into a living, breathing entity that adapts to market conditions and opportunities. It's a tough decision to make right now, especially at a time when all we read about is how much success many businesses are finding without having to answer this very question. With all of the newfound success in social networks, the truth is that we're only just beginning to learn what's possible, and that's where you come in. When compared to the investment in time and resources across the board, social media represents only a small part of the mix. But with your help, that's all about to change. The CMO Survey, an organization that disseminates the opinions of top marketers in order to predict the future of markets, recently published a report that gave credence to the fact that social media is taking off. One of the most profound takeaways from the report was this gem: "The 'like button' [in Facebook] packs more customer-acquisition punch than other demand-generating activities." With insights like this, it's easy to see why the race to social is becoming heated. The report also highlighted exactly where social fits in the marketing mix today and, as you can see, despite all of the hype, it's not a dominant focus yet. As of August 2011, the percentage of overall marketing budgets dedicated to social media hovered at around 7%. However, in 2012 the investment in social media will climb to 10%. And, in five years, social media is expected to represent almost 18% of the total marketing budget. Think about that for a moment. In 2016, social media will only represent 18%? Cue the sound of a record scratching here. With businesses finding success in social networks, why are businesses failing to realize the true opportunity brought forth by the ability to listen to, connect with, and engage with customers? While there's value in earning views, driving traffic, and building connections through the 3F's (friends, fans and followers), success isn't defined simply by what really amounts to low-hanging fruit. The truth is that businesses cannot measure what it is they don't know to value. As a result, innovation in new engagement initiatives is stifled because we're applying dated or inflexible frameworks to new paradigms. Social media isn't owned by marketing, but instead by the entire organization. This changes everything and makes your role so much more important. It's up to you to learn how to think outside of the proverbial social media box to see what others don't: the ability to improve customer experiences through the evolution of a social brand into a social business. Doing so will translate customer insights from what they do and don't share in social networks into better products, services, and processes. See, customers want something more from their favorite businesses than creative campaigns, viral content, and everyday dialogue in social networks. Customers want to be heard and they want to know that you're listening. How businesses use social media must remind them that they're more than just an audience, consumer, or a conduit to "trigger" a desired social effect. Herein lies both the challenge and opportunity of social media. It's bigger than marketing. It's also bigger than customer service.
    It's about building relationships with customers that improve experiences and, more importantly, teach businesses how to re-imagine products and internal processes to better adapt to potential crises and seize new opportunities. When it comes down to it, Twitter, Facebook, YouTube, and Foursquare are all channels for listening, learning, and engaging. It's what you do within each channel that builds a community around your brand. And, at the end of the day, the value of the community you build counts for everything. It's important to understand that we cannot assume that these networks simply exist for people to line up for our marketing messages or promotional campaigns. Nor can we assume that they're reeling in anticipation for simple dialogue. They want value. They want recognition. They want access to exclusive information and offers. They need direction, answers and resolution. What we're talking about here is the multidimensional makeup of consumers and how a one-sided approach to social media forces the need for social media to expand beyond traditional marketing to socialize the various departments, lines of business, and functions to engage based on the nature of the situation or opportunity. In the same CMO study, it was revealed that marketers believe that social media has a long way to go toward integrating into the overall company strategy. On a scale of 1-7, with one being "not integrated at all" and seven being "very integrated," 22% chose "one." Critical functions such as service, HR, sales, R&D, product marketing and development, IR, CSR, etc. are either not engaged or are operating social media within a silo disconnected from other efforts or possibilities. The problem is that customers don't view a company by silo; instead, they see one company, one brand, and their experience in social media forms an impression that eventually contributes to their view of your brand. The first step here is to understand business priorities and objectives to assess how social media can be additive in achieving these goals. Additionally, survey the landscape to determine other areas of interest as they specifically relate to your business:
    • Are customers seeking help or direction?
    • Who are your most valuable customers and what are they sharing?
    • How can you use social media to acquire and retain customers?
    • What ideas are circulating and how can you harness user-generated activity and content to innovate or adapt to better meet the needs of customers?
    • How can you broaden a single customer view to recognize the varying needs of customers and how your organization can organize around each circumstance?
    • What insights exist based on how consumers are interacting with one another? How can this intelligence inform marketing, service, products and other important business initiatives?
    • How can your business extend its current efforts to deliver better customer experiences and in turn more effectively unite internal collaboration and communication?
    Customer demands far exceed the capabilities of the marketing department. While creating a social brand is a necessary endeavor, building a social business is an investment in customer relevance now and over time. Beyond relevance, a social business fosters a culture of change that unites employees and customers and sets a foundation for meaningful and beneficial relationships. Innovation, communication, and creativity are the natural byproducts of engagement and transformation. As a social brand, we are competing for the moment.
    As a social business, we are competing for the future in all that we do today.

    Read the article

  • Introducing a (new) test method to a team

    - by Jon List
    A couple of months ago I was hired at a new job. (I'm fresh out of my Masters in software engineering.) The company mainly consists of ERP consultants, but I was hired into their fairly small web department (6 developers); our main task is ERP/ecom integration (ERP-integrated web shops). The department is growing, and recently my manager asked me to start thinking about introducing tests to the team. I love a challenge, but frankly I'm a bit scared (I'm the least experienced member of the team). Currently the method of testing is clicking around in the web shop and asking the customer if the products are there, if they look okay, and if orders are posted correctly to the ERP. We are getting a lot of support cases on previous projects, where a customer or a customer's customer has run into errors, which - I suppose - is why my manager wants more structured testing. Off the top of my head, I thought of some (obvious?) improvements, like looking at the requirement specification, having an issue tracker, enabling team members to register their time on a "tests" line on the budget, and circulating tasks amongst members of the team. But as I see it we have three main challenges:
    • general website testing (JavaScript, C#, ASP.NET and CMS integration tests)
    • (live) ERP integration testing (customers rarely want to pay for test environments)
    • adopting a method in the team
    I like the responsibility, but I am afraid that I'm in a little bit over my head. I expect that my manager expects me to set up some kind of workshop for the team where I present some techniques and ideas and where we (the team) can find some solutions together. What I learned in school was mostly unit testing and program verification, not so much testing across multiple systems and applications. What I'm looking for here is references/advice/pointers/anecdotes; anything that might help me to get smarter and to improve the current method of my team. Thanks!!
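
    As an illustration of the "general website testing" challenge, here is a minimal smoke-test sketch in plain JUnit 4. The URL and the expected product name are hypothetical placeholders, and a real suite would of course grow well beyond a single HTTP check:

        import static org.junit.Assert.assertEquals;
        import static org.junit.Assert.assertTrue;

        import java.io.BufferedReader;
        import java.io.InputStreamReader;
        import java.net.HttpURLConnection;
        import java.net.URL;

        import org.junit.Test;

        public class ShopSmokeTest {

            // Hypothetical test-shop URL and product name; replace with a real environment.
            private static final String PRODUCT_PAGE = "http://shop.example.com/products/1234";

            @Test
            public void productPageRespondsAndShowsProduct() throws Exception {
                HttpURLConnection conn = (HttpURLConnection) new URL(PRODUCT_PAGE).openConnection();
                conn.setRequestMethod("GET");

                // Mirrors the manual check: "is the shop up and is the product there?"
                assertEquals("Product page should be reachable", 200, conn.getResponseCode());

                StringBuilder body = new StringBuilder();
                try (BufferedReader reader = new BufferedReader(
                        new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
                    String line;
                    while ((line = reader.readLine()) != null) {
                        body.append(line);
                    }
                }
                assertTrue("Product name should appear on the page",
                        body.toString().contains("Example Product"));
            }
        }

    Checks like this can run against a staging shop after every deployment, which already catches the "customer's customer found it first" class of regressions before anyone clicks around by hand.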

    Read the article

  • Is it possible to build a single game to run in Facebook & Google+?

    - by Songo
    I was asked by my customer to build a Facebook game. The game would be something similar to Mafiawars.com, where the game is hosted on a server and run through a frame on Facebook. The thing is, after several days of negotiations with the customer and near the finalization of the requirements, he mentioned something strange. He said that if the game was successful on Facebook then we may add it to Google+ too. I thought he meant that we'd develop a new version for Google+, but he refused, as he argued that the game should be able to support both sites and he won't pay for the same game twice. Now I haven't developed either Facebook or Google+ games before, so I don't know if it is possible to build a single Facebook/Google+ game. How would you react to such a requirement? How would you design such an application? Notes: I confirmed with the customer that he wasn't talking about using OpenID; he wanted full integration (sharing posts, friend requests, etc.). I really don't want to lose that customer, for numerous reasons (he even agreed to extend the project time to compensate for the time I need to learn the Facebook/Google+ APIs).
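
    One common way to keep a single code base while targeting both networks is to hide each platform behind a common interface and write the game only against that interface. A rough sketch, with purely hypothetical type names (the real Facebook/Google+ SDK calls would live inside the adapters; each type would sit in its own .java file):

        // Hypothetical abstraction over a social network; the game never sees a concrete SDK.
        public interface SocialPlatform {
            void sharePost(String userId, String message);
            java.util.List<String> getFriendIds(String userId);
        }

        class FacebookPlatform implements SocialPlatform {
            @Override public void sharePost(String userId, String message) {
                // Facebook Graph API calls would go here (access token, signed request, ...)
            }
            @Override public java.util.List<String> getFriendIds(String userId) {
                return java.util.Collections.emptyList(); // Graph API friends call would go here
            }
        }

        // A GooglePlusPlatform class would later implement the same interface.
        class GameService {
            private final SocialPlatform platform;
            GameService(SocialPlatform platform) { this.platform = platform; }

            // Game logic talks to the interface only, so the network can be swapped.
            void announceLevelUp(String userId, int level) {
                platform.sharePost(userId, "I just reached level " + level + "!");
            }
        }

    The game itself (levels, economy, persistence) stays identical; only the thin adapter and the page that hosts the canvas/iframe differ per network.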

    Read the article

  • Web migration of a VB6 system with VWG

    - by Webgui
    The Brinks Bolivia eSAC (Customer Service) system allows registering all the different kinds of contacts for a customer, in addition to maintaining an updated status of each service or customer request, so the company has accurate information and can perform the appropriate procedures for all applications. The system was originally developed in VB6, and since web access was essential it was offered via Citrix. Since the application's performance was a critical issue, as well as the need to offer the system without specific installations, the company looked for a solution that would solve those drawbacks of using Citrix. Searching for a solution that would allow it to offer the eSAC system over the web without the need for specific client installations, and provide sufficient performance levels even when there is limited bandwidth, led Brinks to a decision to migrate their VB6 Customer Service system to Visual WebGui. "Developing on Visual WebGui we were able to migrate the system to web environment and even add new features in less time which allows us to offer it over a standard web browser with better performance and no installations as was required with Citrix," concluded Alexander Cuellar. The full article and screenshots of the system are available here.

    Read the article

  • Maintaining a main project line with satellite projects

    - by NickLarsen
    Some projects I work on have a main line of features but are customizable per customer. Up until now those customizations have been implemented as preferences, but now there are two problems with the system... The settings page is getting out of control with features. There are probably some improvements that could be made to the settings UI, but regardless, it is quite cumbersome setting up new instances for new customers. Customers have started asking for customizations which would be more easily maintained as separate threads instead of having tons of customization code. Optimally, I am envisioning some kind of source control in which the shared features live in the main project line and the customizations are maintained in a separate repo per customer. The customizations per project would need to remain separate, but if a bug is found and fixed in a particular project, I would need to percolate the fix back to the main line and into all of the other customer repos. The problem is I have never seen this done before, and before spending time trying to find source control that can accommodate this scenario and implement it, I figure it best to ask if anyone has something less complicated or knows of a source control product which can handle this with very little hair pulling.

    Read the article

  • Fusion CRM Release 7 RCDs and TOIs Now Available!

    - by Richard Lefebvre
    Fusion CRM Release 7 Release Content Documents (RCD) and Transfer of Information (TOI) presentations are now available. In addition, you can find 245 new or changed product features for Release 7 on Oracle Product Features. All the new RCDs and TOIs can be found on the Fusion Learning Center:
    Customer Relationship Management
    • TOIs - Customer Center, Define Segmentation Strategy, Enterprise Contracts, Oracle Social Network, Sales, and Territory Management
    • Business Process Model (BPM) RCDs - Customer Service, Marketing, Order Fulfillment, and Sales
    Financials
    • BPM RCDs - Asset Lifecycle Management, Cash and Treasury Management, and Financial Control and Reporting
    Human Capital Management
    • TOIs - Workforce Development, Compensation, Benefits, Worker Performance, Workforce Profiles, Enterprise Structures, Talent Review, Manage Transaction and Batch Processing, Delete HCM Storage Data, and Load Batch Data
    • BPM RCDs - Compensation Management, Enterprise Information Management, Workforce Deployment, and Workforce Development
    Procurement
    • TOI - Requisitions
    • BPM RCD - Procurement
    Project Portfolio Management
    • TOIs - Project Resources, Evaluate and Assign Resources, Maintain Resource Assignments, Manage Resource Demand, Manage Resource Supply, Manage Resource Utilization and Analytics, Project Management, Set Up Project Management
    • BPM RCD - Project Management
    Supply Chain Management
    • TOIs - Manage New Product Definition and Approval, Manage Product Change Orders, Product Hub, Define Item Class
    • BPM RCDs - Materials Management and Logistics, Product Management and Supply Chain Planning
    Partners and customers can access the content from the following locations:
    Partner access:
    • BPM RCDs and TOIs - Oracle Partner Network Fusion Learning Center
    • New Feature RCDs - Oracle Product Features
    Customer access:
    • TOIs - My Oracle Support (Note:1528594.1)
    • BPM RCDs - My Oracle Support (Note:1559828.1)
    • New Feature RCDs - Oracle Product Features

    Read the article

  • Oracle Buys BigMachines - Adds Leading Configure, Price and Quote (CPQ) Cloud to the Oracle Cloud to Enable Smarter Selling

    - by Richard Lefebvre
    News Facts Oracle today announced that it has entered into an agreement to acquire BigMachines, a leading cloud-based Configure, Price and Quote (CPQ) solution provider. BigMachines’ CPQ Cloud accelerates the conversion of sales opportunities into revenue by automating the sales order process with guided selling, dynamic pricing, and an easy-to-use workflow approval process, accessible anywhere, on any device. Companies that use sales automation technology often rely on manual, cumbersome and disconnected processes to convert opportunities into orders. This creates errors, adds costs, delays revenue, and degrades the customer experience. BigMachines’ CPQ cloud extends sales automation to include the creation of an optimal quote, which enables sales personnel to easily configure and price complex products, select the best options, promotions and deal terms, and include up sell and renewals, all using automated workflows. In combination with Oracle’s enterprise-grade cloud solutions, including Marketing, Sales, Social, Commerce and Service Clouds, Oracle and BigMachines will create an end-to-end smarter selling cloud solution so sales personnel are more productive, customers are more satisfied, and companies grow revenue faster. More information on this announcement can be found at http://www.oracle.com/bigmachines Supporting Quotes “The fundamental goals of smarter selling are to provide sales teams with the information, access, and insights they need to maximize revenue opportunities and execute on all phases of the sales cycle,” said Thomas Kurian, Executive Vice President, Oracle Development. “By adding BigMachines’ CPQ Cloud to the Oracle Cloud, companies will be able to drive more revenue and increase customer satisfaction with a seamlessly integrated process across marketing and sales, pricing and quoting, and fulfillment and service.” “BigMachines has developed leading CPQ solutions that serve companies of all sizes across multiple industries,” said David Bonnette, BigMachines’ CEO. “Together with Oracle, we expect to provide a complete cloud solution to manage sales processes and deliver exceptional customer experiences.” Supporting Resources About Oracle and BigMachines General Presentation Customer and Partner Letter FAQ

    Read the article

  • Role based access to resources for a RESTful service

    - by mutex
    I'm still wrapping my head around REST, but I wonder if someone can help with any suggestions or approaches to role-based access control for a RESTful service, particularly from the point of view of securing the data and how the URLs might look. It's probably best to consider an example: say I have a REST service for Customers, and want to split the users of this REST service into Admin, Editor and Reader roles:
    • Admins can change all attributes of a Customer resource
    • Editors can change only some
    • Readers can only view them.
    Access control rights are assigned to the Customer entities individually. So for example a user of the service might have Admin rights to Customers 1, 2 and 3, but Editor access to 4, 5 and Reader access to 7, 8, 9. Now consider the user calling the service. What is a good way to separate the list of Customers for the current user? GET /Customer - this might get a list of all customers that the current user has Admin\Editor\Reader access to. But then on each Customer the consumer would need an indication of what role they have. Or would it be "better" having something like GET /Customer/Admin - return all customers the current user has Admin access to. Just looking for some high-level pointers or reading on a decent way to secure\filter the resources based on the roles of the current user.
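
    For the "one collection, role varies per item" shape, here is a small server-side filtering sketch in plain Java with hypothetical names. The idea is that GET /Customer always returns only the customers visible to the caller, with the caller's role reported per item, while write operations are guarded separately:

        import java.util.EnumSet;
        import java.util.LinkedHashMap;
        import java.util.Map;
        import java.util.Set;

        enum Role { ADMIN, EDITOR, READER }

        // Hypothetical per-request object built from the authenticated user's grants.
        class CustomerAccess {
            // customerId -> roles the current user holds on that customer
            private final Map<Long, Set<Role>> grants = new LinkedHashMap<>();

            void grant(long customerId, Role role) {
                grants.computeIfAbsent(customerId, id -> EnumSet.noneOf(Role.class)).add(role);
            }

            // Resolves GET /Customer: every visible customer plus the caller's roles on it.
            Map<Long, Set<Role>> listVisibleCustomers() {
                return grants; // already restricted to what the user may see
            }

            // Guard used by PUT /Customer/{id}: Admins and Editors may write, Readers may not.
            boolean canEdit(long customerId) {
                Set<Role> roles = grants.getOrDefault(customerId, EnumSet.noneOf(Role.class));
                return roles.contains(Role.ADMIN) || roles.contains(Role.EDITOR);
            }
        }

    With the grants held server-side like this, a role-scoped URL such as /Customer/Admin becomes just a convenience view over the same data rather than a separate security mechanism.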

    Read the article

  • Wisdom of merging 100s of Oracle instances into one instance

    - by hoytster
    Our application runs on the web, is mostly an inquiry tool, and does some transactions. We host the Oracle database. The app has always had a different instance of Oracle for each customer. A customer is a company which pays us to provide our service to the company's employees, typically 10,000-25,000 employees per customer. We do a major release every few years, and migrating to that new release is challenging: we might have a team at the customer site for a couple of weeks, explaining new functionality and setting up the driving data to suit that customer. We're considering going multi-client, putting all our customers into a single shared Oracle 11g instance on a big honkin' Windows Server 2008 server -- in order to reduce costs. I'm wondering if that's advisable. There are some advantages to having separate instances for each customer. Tell me if these are bogus, please. My rough guess, in decreasing order of importance:
    • Our customers MyCorp and YourCo can be migrated separately when breaking changes are made to the schema. (With multi-client, we'd be migrating 300+ customers overnight!?!)
    • MyCorp's data can be easily backed up and (!!!) restored, without affecting other customers.
    • MyCorp's data is securely separated from their competitor YourCo's data, without depending on developers to get the code right and/or DBAs getting the configuration right.
    • Performance is better because the database is smaller (5,000 vs 2,000,000 rows in ~50 tables).
    • If MyCorp's offices are (mostly) in just one region, then MyCorp's instance can be geographically co-located there, so network lag doesn't hurt performance. We can provide better service to global clients, for the same reason.
    • If MyCorp wants to take their database in-house, then we can easily export their instance to get MyCorp their data.
    • Load-balancing is easier because instances can be placed on different servers (this is with a web farm).
    • When a DEV or QA instance is needed, it's easier to clone the real instance and anonymize the data, because there's much less data.
    • Because they're small enough, developers can have their own instance running locally, so they can work on code while waiting at the airport and while in-flight, without fighting VPN hassles.
    Q1: What are other advantages of separate instances?
    We are contemplating changing the database schema and merging all of our customers into one Oracle instance, running on one hefty server. Here are the advantages of the multi-client instance approach, most important first (my WAG). Please snipe if these are bogus:
    • Less work for the DBAs, since they only need to maintain one instance instead of hundreds. Less DBA work translates to cheaper, our main motive for this change.
    • With just one instance, the DBAs can do a better job of optimizing performance. They'll have time to add appropriate indexes and review our SQL.
    • It will be easier for developers to debug and enhance the application, because there is only one schema and one app (there might be dozens of schema versions if there are hundreds of instances, with a different version of the app for each version of the schema). This reduces costs too. The alternative is having to start every debug session with (1) What version is this customer running? and (2) Let's struggle to recreate the corresponding development environment, code and database. (We need a Virtual Machine that includes the code AND database instance for each patch and release!)
    • Licensing Oracle is cheaper because it's priced per server irrespective of heft (or something -- I don't know anything about the subject).
    • The database becomes a viable persistent store for web session data, because there is just one instance.
    • Some database operations are easier with one multi-client instance, like finding a participant when they're hazy about which customer they (or their spouse, maybe) works for: all the names are in one table.
    • Reporting across customers is straightforward.
    Q2: What are other advantages of having multiple clients in one instance?
    Q3: Which approach do you think is better (why)? Instance per customer, or all customers in one instance?
    I'm concerned that having one multi-client instance makes migration near-impossible, and that's a deal killer... unless there is a compromise solution like having two multi-client instances, the old and the new. In that case, we would design cross-instance solutions for finding participants, reporting, etc. so customers could go from one multi-client instance to the next without anything breaking. THANKS SO MUCH for your collective advice! This issue is beyond me -- but not beyond the collective you. :) Hoytster
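
    One practical note on the "securely separated ... without depending on developers to get the code right" point: in a shared instance, every query has to carry a customer discriminator. A minimal JDBC sketch of that discipline, with hypothetical table and column names:

        import java.sql.Connection;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;
        import java.sql.SQLException;

        // Hypothetical helper: all data access for a request goes through one object
        // that always adds the customer_id predicate, so it cannot be forgotten.
        class CustomerScopedQueries {

            private final Connection connection;
            private final long customerId; // set once per request from the authenticated session

            CustomerScopedQueries(Connection connection, long customerId) {
                this.connection = connection;
                this.customerId = customerId;
            }

            // Caller is responsible for closing the ResultSet/Statement; omitted for brevity.
            ResultSet findParticipantsByName(String name) throws SQLException {
                // The customer_id predicate is the only thing standing between tenants.
                PreparedStatement stmt = connection.prepareStatement(
                        "SELECT id, name FROM participants WHERE customer_id = ? AND name LIKE ?");
                stmt.setLong(1, customerId);
                stmt.setString(2, "%" + name + "%");
                return stmt.executeQuery();
            }
        }

    Centralizing that filter in one place (or pushing it into the database with Oracle's Virtual Private Database / row-level security) is what keeps a missed WHERE clause from turning into a cross-customer data leak.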

    Read the article

  • Custom validation works in development but not in unit test

    - by Geolev
    I want to validate that at least one of two columns have a value in my model. I found somewhere on the web that I could create a custom validator as follows: # Check for the presence of one or another field: # :validates_presence_of_at_least_one_field :last_name, :company_name - would require either last_name or company_name to be filled in # also works with arrays # :validates_presence_of_at_least_one_field :email, [:name, :address, :city, :state] - would require email or a mailing type address module ActiveRecord module Validations module ClassMethods def validates_presence_of_at_least_one_field(*attr_names) msg = attr_names.collect {|a| a.is_a?(Array) ? " ( #{a.join(", ")} ) " : a.to_s}.join(", ") + "can't all be blank. At least one field must be filled in." configuration = { :on => :save, :message => msg } configuration.update(attr_names.extract_options!) send(validation_method(configuration[:on]), configuration) do |record| found = false attr_names.each do |a| a = [a] unless a.is_a?(Array) found = true a.each do |attr| value = record.respond_to?(attr.to_s) ? record.send(attr.to_s) : record[attr.to_s] found = !value.blank? end break if found end record.errors.add_to_base(configuration[:message]) unless found end end end end end I put this in a file called lib/acs_validator.rb in my project and added "require 'acs_validator'" to my environment.rb. This does exactly what I want. It works perfectly when I manually test it in the development environment but when I write a unit test it breaks my test environment. This is my unit test: require 'test_helper' class CustomerTest < ActiveSupport::TestCase # Replace this with your real tests. test "the truth" do assert true end test "customer not valid" do puts "customer not valid" customer = Customer.new assert !customer.valid? assert customer.errors.invalid?(:subdomain) assert_equal "Company Name and Last Name can't both be blank.", customer.errors.on(:contact_lname) end end This is my model: class Customer < ActiveRecord::Base validates_presence_of :subdomain validates_presence_of_at_least_one_field :customer_company_name, :contact_lname, :message => "Company Name and Last Name can't both be blank." has_one :service_plan end When I run the unit test, I get the following error: DEPRECATION WARNING: Rake tasks in vendor/plugins/admin_data/tasks, vendor/plugins/admin_data/tasks, and vendor/plugins/admin_data/tasks are deprecated. Use lib/tasks instead. (called from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/tasks/rails.rb:10) Couldn't drop acs_test : #<ActiveRecord::StatementInvalid: PGError: ERROR: database "acs_test" is being accessed by other users DETAIL: There are 1 other session(s) using the database. 
: DROP DATABASE IF EXISTS "acs_test"> acs_test already exists NOTICE: CREATE TABLE will create implicit sequence "customers_id_seq" for serial column "customers.id" NOTICE: CREATE TABLE / PRIMARY KEY will create implicit index "customers_pkey" for table "customers" NOTICE: CREATE TABLE will create implicit sequence "service_plans_id_seq" for serial column "service_plans.id" NOTICE: CREATE TABLE / PRIMARY KEY will create implicit index "service_plans_pkey" for table "service_plans" /usr/bin/ruby1.8 -I"lib:test" "/usr/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake/rake_test_loader.rb" "test/unit/customer_test.rb" "test/unit/service_plan_test.rb" "test/unit/helpers/dashboard_helper_test.rb" "test/unit/helpers/customers_helper_test.rb" "test/unit/helpers/service_plans_helper_test.rb" /usr/lib/ruby/gems/1.8/gems/activerecord-2.3.8/lib/active_record/base.rb:1994:in `method_missing_without_paginate': undefined method `validates_presence_of_at_least_one_field' for #<Class:0xb7076bd0> (NoMethodError) from /usr/lib/ruby/gems/1.8/gems/will_paginate-2.3.12/lib/will_paginate/finder.rb:170:in `method_missing' from /home/george/projects/advancedcomfortcs/app/models/customer.rb:3 from /usr/local/lib/site_ruby/1.8/rubygems/custom_require.rb:31:in `gem_original_require' from /usr/local/lib/site_ruby/1.8/rubygems/custom_require.rb:31:in `require' from /usr/lib/ruby/gems/1.8/gems/activesupport-2.3.8/lib/active_support/dependencies.rb:158:in `require' from /usr/lib/ruby/gems/1.8/gems/activesupport-2.3.8/lib/active_support/dependencies.rb:265:in `require_or_load' from /usr/lib/ruby/gems/1.8/gems/activesupport-2.3.8/lib/active_support/dependencies.rb:224:in `depend_on' from /usr/lib/ruby/gems/1.8/gems/activesupport-2.3.8/lib/active_support/dependencies.rb:136:in `require_dependency' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:414:in `load_application_classes' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:413:in `each' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:413:in `load_application_classes' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:411:in `each' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:411:in `load_application_classes' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:197:in `process' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:113:in `send' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:113:in `run' from /home/george/projects/advancedcomfortcs/config/environment.rb:9 from ./test/test_helper.rb:2:in `require' from ./test/test_helper.rb:2 from ./test/unit/customer_test.rb:1:in `require' from ./test/unit/customer_test.rb:1 from /usr/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake/rake_test_loader.rb:5:in `load' from /usr/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake/rake_test_loader.rb:5 from /usr/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake/rake_test_loader.rb:5:in `each' from /usr/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake/rake_test_loader.rb:5 rake aborted! Command failed with status (1): [/usr/bin/ruby1.8 -I"lib:test" "/usr/lib/ru...] (See full trace by running task with --trace) It seems to have stepped on will_paginate somehow. Does anyone have any suggestions? Is there another way to do the validation I'm attempting to do? Thanks, George

    Read the article

  • Using jQuery and SPServices to Display List Items

    - by Bil Simser
    I had an interesting challenge recently that I turned to Marc Anderson’s wonderful SPServices project for. If you haven’t already seen or used SPServices, please do. It’s a jQuery library that does primarily two things. First, it wraps up all of the SharePoint web services in a nice little AJAX wrapper for use in JavaScript. Second, it enhances the form editing of items in SharePoint so you’re not hacking up your List Form pages. My challenge was simple but interesting. The user wanted to display a SharePoint item page (DispForm.aspx, which already had some customization on it to display related items via this blog post from Codeless Solutions for SharePoint) but launch from an external application using the value of one of the fields in the SharePoint list. For simplicity let’s say my list is a list of customers and the related list is a list of orders for that customer. It would look something like this (click on the item to see the full image): Your first thought might be, that’s easy! Display the customer information using a DataView Web Part and filter the item using a query string to match the customer number. However there are a few problems with this idea: You’ll need to build a custom page and then attach that related orders view to it. This is a bit of a problem because the solution from Codeless Solutions relies on the Title field on the page to be displayed. On a custom page you would have to recreate all of the elements found on the DispForm.aspx page so the related view would work. The DataView Web Part doesn’t look *exactly* like what the out of the box display form page does. Not a huge problem and can be overcome with some CSS style overrides but still, more work. A DVWP showing a single record doesn’t have the same toolbar that you would using the DispForm.aspx. Not a show-stopper and you can rebuild the toolbar but it’s going to potentially require code and then there’s the security trimming, etc. that you have to get right. DVWPs are not automatically updated if you add a column to the list like DispForm.aspx is. Work, work, work. For these reasons I thought it would be easier to take the already existing (modified) DispForm.aspx page and just add some jQuery magic to the page to find the item. Why do we need to find it? DispForm.aspx relies on a querystring parameter called “ID” which then displays whatever that item ID number is in the list. Trouble is, when you’re coming in from an external app via a link, you don’t know what that internal ID is (and frankly shouldn’t). I don’t like exposing internal SharePoint IDs to the outside world for the same reason I don’t do it with database IDs. They’re internal and while it’s find to use on the site itself you don’t want external links using it. It’s volatile and can change (delete one item then re-add it back with the same data and watch any ID references break). The next thought might be to call a SharePoint web service with a CAML query to get the item ID number using some criteria (in this case, the customer number). That’s great if you have that ability but again we had an existing application we were just adding a link to. The last thing I wanted to do was to crack open the code on that sucker and start calling web services (primarily because it’s Java, but really I’m a lazy geek). However if you’re doing this and have access to call a web service that would be an option. 
Back to this problem, how do I a) find a SharePoint List Item based on some field value other than ID and b) make it low impact so I can just construct a URL to it? That’s where jQuery and SPServices came to the rescue. After spending a few hours of emails back and forth with Marc and a couple of phone calls (and updating jQuery to the latest version, duh!) it was a simple answer. First we need a reference to a) jQuery b) SPServices and c) our script. I just dropped a Content Editor Web Part, the Swiss Army Knives of Web Parts, onto the DispForm.aspx page and added these lines: <script type="text/javascript" src="http://intranet/JavaScript/jquery-1.4.2.min.js"></script> <script type="text/javascript" src="http://intranet/JavaScript/jquery.SPServices-0.5.3.min.js"></script> <script type="text/javascript" src="http://intranet/JavaScript/RedirectToID.js"> </script> Update it to point to where you keep your scripts located. I prefer to keep them all in Document Libraries as I can make changes to them without having to remote into the server (and on a multiple web front end, that’s just a PITA), it provides me with version control of sorts, and it’s quick to add new plugins and scripts. Now we can look at our RedirectToID.js script. This invokes the SPServices Library to call the GetListItems method of the Lists web service and then rewrites the URL to DispForm.aspx to use the correct SharePoint ID (the internal one). $(document).ready(function(){ var queryStringValues = $().SPServices.SPGetQueryString(); var id = queryStringValues["ID"]; if(id == "0") { var customer = queryStringValues["CustomerNumber"]; var query = "<Query><Where><Eq><FieldRef Name='CustomerNumber'/><Value Type='Text'>" + customer + "</Value></Eq></Where></Query>"; var url = window.location; $().SPServices({ operation: "GetListItems", listName: "Customers", async: false, CAMLQuery: query, completefunc: function (xData, Status) { $(xData.responseXML).find("[nodeName=z:row]").each(function(){ id = $(this).attr("ows_ID"); url = $().SPServices.SPGetCurrentSite() + "/Lists/Customers/DispForm.aspx?ID=" + id; window.location = url; }); } }); } }); What’s happening here? Line 3: We call SPServices.SPGetQueryString to get an array of query string values (a handy function in the library as I had 15 lines of code to do this which is now gone). Line 4: Extract the ID value from the query string Line 6: If we pass in “0” it means we’re looking up a field value. This allows DispForm.aspx to work like normal with SharePoint lists but lookup our values when invoked. Why ID at all? DispForm.aspx doesn’t work unless you pass in something and “0” is a *magic* number that will invoke the page but not lookup a value in the database. Line 8-15: Extract the CustomerNumber query string value, build a CAML query to find it then call the GetListitems method using SPServices Line 16: Process the results in our completefunc to iterate over all the rows (there should only be one) and extract the real ID of the item Line 17-20: Build a new URL based on the site (using a call to SPGetCurrentSite) and append our real ID to redirect to the DispForm.aspx page As you can see, it dynamically creates a CAML query for the call to the web service using the passed in value. You could even make this generic to take in different query strings, one for the field name to search for and the other for the value to find. That way it could be used for any field you want. 
For example you could bring up the correct item on the DispForm.aspx page based on customer name with something like this: http://myserver/Lists/Customers/DispForm.aspx?ID=0&FilterId=CustomerName&FilterValue=Sony Use your imagination. Some people would opt for building a custom page with a DVWP but if you want to leverage all the functionality of DispForm.aspx this might come in handy if you don’t want to rely on internal SharePoint IDs.

    Read the article

  • Complex type support in process flow &ndash; XMLTYPE

    - by shawn
    Before the OWB 11.2 release, there were only 5 simple data types supported in process flow: DATE, BOOLEAN, INTEGER, FLOAT and STRING. A new complex data type - XMLTYPE - was added in 11.2, in order to support complex data being passed between process flow activities. In this article we will give a simple example to illustrate the usage of the new type and some related editors. Suppose there is a bookstore that uses XML format orders as shown below (we use the simplest form for illustration purposes); then we can create a process flow to handle the order: take the order as the input, extract the necessary information, and generate a confirmation email to the customer automatically.
    <order id='0001'>
        <customer>
            <name>Tom</name>
            <email>[email protected]</email>
        </customer>
        <book id='Java_001'>
            <quantity>3</quantity>
        </book>
    </order>
    Consider a simple use case here: we use an input parameter/variable with XMLTYPE to hold the XML content of the order; then we can use an Assign activity to retrieve the email info from the order; after that, we can create an Email activity to send the email (other activities might be added in a practical case, but will not be described here).
    1) Set the XML content value. For testing purposes, we create a variable to hold the sample order, which will then be used among the process flow activities. When the variable is of XMLTYPE and the "Literal" value is set to true, the advanced editor is enabled. Click the "Advance Editor" button shown above and a simple XML editor will pop up. The editor has basic features like syntax highlighting and checking, as shown below. We can also do basic validation or validation against a schema with the editor by selecting the normalized schema. With this, it is easier to provide the value for XMLTYPE variables.
    2) Extract information from the XML content. After setting the value, we need to extract the email information with the Assign activity. In process flow, an enhanced expression builder is used to help users construct the XPath for extracting values from XML content. When the variable's literal value is set to false, the advanced editor is enabled. Click the button and the advanced editor will pop up, as shown below. The editor is based on the expression builder (which is often used in mappings, etc.); an XPath library panel is appended which provides some help information on how to write the XPath. The expression used here is "XMLTYPE.EXTRACT(XML_ORDER,'/order/customer/email/text()').getStringVal()", which uses '/order/customer/email/text()' as the XPath to extract the email info from the XML document. A variable called "EMAIL_ADDR" is created with the String data type to hold the extracted value. Then we bind the "VARIABLE" parameter of the Assign activity to the "EMAIL_ADDR" variable, which means the value of the "EMAIL_ADDR" variable will be set to the result of the "VALUE" parameter of the Assign activity.
    3) Use the extracted information in the Email activity. We bind the "TO_ADDRESS" parameter of the Email activity to the "EMAIL_ADDR" variable created in the step above.
    We can also extract other information from the XML order directly through the expression; for example, we can set the "MESSAGE_BODY" with the value "'Dear '||XMLTYPE.EXTRACT(XML_ORDER,'/order/customer/name/text()').getStringVal()||chr(13)||chr(10)||'   You have ordered '||XMLTYPE.EXTRACT(XML_ORDER,'/order/book/quantity/text()').getStringVal()||' '||XMLTYPE.EXTRACT(XML_ORDER,'/order/book/@id').getStringVal()". This expression will extract the customer name, the quantity and the book id from the order to compose the message body. To make the Email activity work, we need to provide some other necessary information, such as "SMTP_SERVER" (the SMTP server used to send the emails, like "mail.bookstore.com"; the default PORT number is set to 25 and you need to change the value accordingly), "FROM_ADDRESS" and "SUBJECT". Then the process flow is ready to go. After deploying the process flow package, we can simply run the process flow to check if the result is as expected (an email will be sent to the specified email address with the proper subject and message body). Note: in Oracle 11g there is an enhanced security feature - ACL (Access Control List) - which restricts network access within the database, so we need to edit the list to allow UTL_SMTP to work if you are using Oracle 11g. Refer to the chapters "Access Control Lists for UTL_TCP/HTTP/SMTP" and "Managing Fine-Grained Access to External Network Services" for more details. In previous releases, XMLTYPE already existed in other OWB objects, like mappings/transformations. When a mapping/transformation is dragged into a process flow, the parameters with XMLTYPE are mapped to STRING. Now with the XMLTYPE support in process flow, XMLTYPE maps to XMLTYPE in a more natural way, and we can leverage the new data type for the design.

    Read the article

  • Big data: An evening in the life of an actual buyer

    - by Jean-Pierre Dijcks
    Here I am, and this is an actual story of one of my evenings, trying to spend money with a company and ultimately failing. I just gave up and bought a service from another vendor, not the incumbent. Here is that story and how I think big data could actually fix this (and potentially prevent some of this from happening). In the end this story should illustrate how big data can benefit me (get me what I want without causing grief) and the company I am trying to buy something from. Note: Lots of details left out, I have no intention of being the annoyed blogger moaning about a specific company. What did I want to get? We watch TV, we have internet and we do have a land line. The land line is from a different vendor then the TV and the internet. I have decided that this makes no sense and I was going to get a bundle (no need to infer who this is, I just picked the generic bundle word as this is what I want to get) of all three services as this seems to save me money. I also want to not talk to people, I just want to click on a website when I feel like it and get it all sorted. I do think that is reality. I want to just do my shopping at 9.30pm while watching silly reruns on TV. Problem 1 - Bad links So, I'm an existing customer of the company I want to buy my bundle from. I go to the website, I click on offers. Turns out they are offers for new customers. After grumbling about how good they are, I click on offers for existing customers. Bummer, it goes to offers for new customers, so I click again on the link for offers for existing customers. No cigar... it just does not work. Big data solutions: 1) Do not show an existing customer the offers for new customers unless they are the same => This is only partially doable without login, but if a customer logs in the application should always know that this is an existing customer. But in general, imagine I do this from my home going through the internet service of this vendor to their domain... an instant filter should move me into the "existing customer route". 2) Flag dead or incorrect links => I've clicked the link for "existing customer offers" at least 3 times in under 5 seconds... Identifying patterns like this is easy in Hadoop and can very quickly make a list of potentially incorrect links. No need for realtime fixing, just the fact that this link can be pro-actively fixed across my entire web domain is a good thing. Preventative maintenance! Problem 2 - Purchase cannot be completed Apart from the fact that the browsing pattern to actually get to what I want is poorly designed, my purchase never gets past a specific point. In other words, I put something into my shopping cart and when I want to move on the application either crashes (with me going to an error page) or hangs or goes into something like chat. So I try again, and again and again. I think I tried this entire path (while being logged in!!) at least 10 times over the course of 20 minutes. I also clicked on the feedback button and, frustrated as I was, tried to explain this did not work... Big Data Solutions: 1) This web site does shopping cart analysis. I got an email next day stating I have things in my shopping cart, just click here to complete my purchase. After the above experience, this just added insult to my pain... 2) What should have happened, is a Hadoop job going over all logged in customers that are on the buy flow. 
It should flag anyone who is trying (multiple attempts from the same user to do the same thing), analyze the shopping card, the clicks to identify what the customers wants, his feedback provided (note: always own your own website feedback, never just farm this out!!) and in a short turn around time (30 minutes to 2 hours or so) email me with a link to complete my purchase. Not with a link to my shopping cart 12 hours later, but a link to actually achieve what I wanted... Why should this company go through the big data effort? I do believe this is relatively easy to do using our Oracle Event Processing and Big Data Appliance solutions combined. It is almost so simple (to my mind) that it makes no sense that this is not in place? But, now I am ranting... Why is this interesting? It is because of $$$$. After trying really hard, I mean I did this all in the evening, and again in the morning before going to work. I kept on failing, But I really wanted this to work... so an email that said, sorry, we noticed you tried to get a bundle (the log knows what I wanted, where I failed, so easy to generate), here is the link to click and complete your purchase. And here is 2 movies on us as an apology would have kept me as a customer, and got the additional $$$$ per month for the next couple of years. It would also lead to upsell on my phone package etc. Instead, I went to a completely different company, bought service from them. Lost money for company A, negative sentiment for company A and me telling this story at the water cooler so I'm influencing more people to think negatively about company A. All in all, a loss of easy money, a ding in sentiment and image where a relatively simple solution exists and can be in place on the software I describe routinely in this blog... For those who are coming to Openworld and maybe see value in solving the above, or are thinking of how to solve this, come visit us in Moscone North - Oracle Red Lounge or in the Engineered Systems Showcase.
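
    As a rough illustration of the batch check described above (flag logged-in users who keep hammering the same buy-flow step), here is a small sketch in plain Java with hypothetical event fields; in practice this would run as a Hadoop or event-processing job over the clickstream rather than in memory:

        import java.time.Duration;
        import java.time.Instant;
        import java.util.ArrayList;
        import java.util.HashMap;
        import java.util.List;
        import java.util.Map;

        class BuyFlowFailureDetector {

            static class ClickEvent {
                final String userId;
                final String step;   // e.g. "bundle-checkout"
                final Instant when;
                ClickEvent(String userId, String step, Instant when) {
                    this.userId = userId; this.step = step; this.when = when;
                }
            }

            // Returns users who attempted the same step at least 'threshold' times within 'window'.
            static List<String> flagStrugglingUsers(List<ClickEvent> events, String step,
                                                    int threshold, Duration window) {
                Map<String, List<Instant>> attemptsByUser = new HashMap<>();
                for (ClickEvent e : events) {
                    if (e.step.equals(step)) {
                        attemptsByUser.computeIfAbsent(e.userId, u -> new ArrayList<>()).add(e.when);
                    }
                }
                List<String> flagged = new ArrayList<>();
                for (Map.Entry<String, List<Instant>> entry : attemptsByUser.entrySet()) {
                    List<Instant> times = entry.getValue();
                    times.sort(null); // natural (chronological) order
                    for (int i = 0; i + threshold - 1 < times.size(); i++) {
                        if (Duration.between(times.get(i), times.get(i + threshold - 1)).compareTo(window) <= 0) {
                            flagged.add(entry.getKey()); // candidate for the "complete your purchase" email
                            break;
                        }
                    }
                }
                return flagged;
            }
        }

    The output of a job like this, joined with the shopping-cart contents and any feedback text, is exactly the short-turnaround "here is the link to finish what you were trying to do" email argued for above.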

    Read the article

  • what is the problem in ATM machine program

    - by Have alook
    In this program, when the account number is incorrect it should display a message asking the user to enter it again, but when I then enter a correct account number it always displays the result of the first account. There is also a problem with the PIN number: the user has only three tries, and if he enters a wrong number three times the program should stop, but instead it continues to the last part. I don't know why. Please help me. This is my program:
    import java.util.*;
    class assignment2_70307{
        public static void main(String args[]){
            Scanner m=new Scanner(System.in);
            int i;
            i=0;
            int [] accountNo =new int[7]; //declear the Accont number array
            accountNo [0] =1111;
            accountNo [1] =2222;
            accountNo [2] =3333;
            accountNo [3] =4444;
            accountNo [4] =5555;
            accountNo [5] =6666;
            accountNo [6] =7777;
            int [] PINno =new int[7]; //declear the PIN number array
            PINno [0] =1234;
            PINno [1] =5678;
            PINno [2] =9874;
            PINno [3] =6523;
            PINno [4] =1236;
            PINno [5] =4569;
            PINno [6] =8521;
            String [] CusomerNm =new String[7]; //dclear the customer name
            CusomerNm [0] ="Ali";
            CusomerNm [1] ="Ahmed";
            CusomerNm [2] ="Amal";
            CusomerNm [3] ="Said";
            CusomerNm [4] ="Rashid";
            CusomerNm [5] ="Fatema";
            CusomerNm [6] ="Mariam";
            double [] Balance =new double[7]; //declear the Balane array
            Balance [0] =100.50;
            Balance [1] =5123.00;
            Balance [2] =12.00;
            Balance [3] =4569.00;
            Balance [4] =1020.25;
            Balance [5] =0.00;
            Balance [6] =44.10;
            System.out.println("Wellcome to mini ATM Machine");
            int accountno,pino;
            accountno=0;
            pino=0;
            System.out.println("Please Enter your account number: or -1 to stop" );
            accountno=m.nextInt();
            if (accountno==accountNo[0])
                System.out.print("Customer Name: "+CusomerNm [0]+ "\n" );
            else if (accountno==accountNo[1])
                System.out.print("Customer Name: "+CusomerNm [1]+ "\n" );
            else if (accountno==accountNo[2])
                System.out.print("Customer Name: "+CusomerNm [2]+ "\n" );
            else if (accountno==accountNo[3])
                System.out.print("Customer Name: "+CusomerNm [3]+ "\n" );
            else if (accountno==accountNo[4])
                System.out.print("Customer Name: "+CusomerNm [4]+ "\n" );
            else if (accountno==accountNo[5])
                System.out.print("Customer Name: "+CusomerNm [5]+ "\n" );
            else if (accountno==accountNo[6])
                System.out.print("Customer Name: "+CusomerNm [6]+ "\n" );
            // else if (accountNo[0]==-1)
            //break;
            else {
                System.out.println("The account dose not exist,please try again");
                //accountNo[i]=m.nextInt();
                accountno=m.nextInt();
                if(accountNo[i]==accountno)
                    System.out.println("Customer Name: "+CusomerNm[i] );
                else
                    System.out.println("The account dose not exist,please try again");
                accountno=m.nextInt();
                System.out.println("Customer Name: "+CusomerNm[i] );
            }
            System.out.print("Enter your PIN number:");
            PINno[i]=m.nextInt();
            if(PINno[i]==1234)
            {
                System.out.println(PINno[i]);
                System.out.println("Balance:"+Balance [0]+ "Rial");
                //return 0;
            }
            else if(PINno[i]==5678)
            {
                System.out.println(PINno[i]);
                System.out.println("Balance:"+Balance [1]+ "Rial");
                // return 1;
            }
            else if(PINno[i]==9874)
            {
                System.out.println(PINno[i]);
                System.out.println("Balance:"+Balance [2]+ "Rial");
                // return 2;
            }
            else if(PINno[i]==6523)
            {
                System.out.println(PINno[i]);
                System.out.println("Balance:"+Balance [3]+ "Rial");
                // return 3;
            }
            else if(PINno[i]==1236)
            {
                System.out.println(PINno[i]);
                System.out.println("Balance:"+Balance [4]+ "Rial");
                // return 4;
            }
            else if(PINno[i]==4569)
            {
                System.out.println(PINno[i]);
                System.out.println("Balance:"+Balance [5]+ "Rial");
                // return 5;
            }
            else if(PINno[i]==8521)
            {
                System.out.println(PINno[i]);
                System.out.println("Balance:"+Balance [6]+ "Rial");
                // return 6;
            }
            else
            {
                System.out.println("try again");
                //return 7;
                //if its wrong u can enter PIN number three times only
                for( i=0;i<2;i++)
                {
                    System.out.println("enter pin again");
                    PINno[i]=m.nextInt();
                    String ss;
                    //ss = "MAnal";
                    // goto ss ;
                }
            }
            //ss = "m";
            int x;
            x=0;
            System.out.println("Enter the option from the list /n 1.Deposit /n 2.Withdraw /n 3.Balance");
            x=m.nextInt();
            double balance,amount;
            balance=0;
            amount=0;
            double deposit ,Withdraw;
            deposit=0;
            Withdraw=0;
            if (x==1){
                System.out.println("Enter the amont you want to deposit:"+amount);
                amount=m.nextDouble();
                Balance [i]=Balance [i]+amount;
                System.out.println("your balance ="+Balance [i]);
            }
            else if (x==2)
            {
                System.out.println("Enter the amont to withdraw:");
                amount=m.nextDouble();
                System.out.print(amount);
                if(Withdraw<=Balance [i])
                {
                    Balance [i]=Balance [i]-amount;
                    System.out.println("your balance ="+Balance [i]);
                }
                else
                {
                    System.out.println("sorry,please enter the amont less or equal your balance");
                    System.out.println(Balance [i]);
                }
            }
            else
            {
                if(x==1)
                {
                    Balance [i]=Balance [i]+deposit;
                    System.out.println("your current balance is :" +Balance [i]);
                }
                else
                {
                    Balance [i]=Balance [i]-Withdraw;
                    System.out.println("your current balance is :"+Balance [i]);
                }
                System.out.println("Thank you");
                // err()
            }
        }
    }

    Read the article

  • Why can't HTC Droid running OTA 2.1 communicate with RFCOMM?

    - by Brad Hein
    Yesterday we received OTA Android 2.1 on my wife's HTC Droid - HOORAY!!! I am finally able to load my carputer app on her phone. Well we loaded it, but it doesn't work. Specifically, it connects but sees no I/O!!! I paired, re-paired, and re-paired again, every time its the same problem: connect() says we connected successfully, but any attempt to send or receive data appears to work but no data ever arrives in the input buffer. The device I'm connecting to uses AT commands. ATI should respond with a device ID. That works fine when I run the app on my Moto Droid, but on the HTC droid, no data is ever present in the inputstream/buffer. Personally, I'm feeling pretty sure it's a bug or limitation in this release for the HTC (because the app works great on my Moto A855 Droid). Can anybody comment on the issue? Obligatory code snippets: Member variable defining my RFCOMM UUID static final UUID UUID_RFCOMM_GENERIC = UUID.fromString("00001101-0000-1000-8000-00805F9B34FB"); Parts of my connect() // make sure peer is defined as a valid device based on their MAC. If not then do it. if (mBTDevice == null) mBTDevice = mBTAdapter.getRemoteDevice(mPeerMAC); // Make an RFCOMM binding. try {mBTSocket = mBTDevice.createRfcommSocketToServiceRecord(UUID_RFCOMM_GENERIC); } catch (Exception e1) { msg ("connect(): Failed to bind to RFCOMM by UUID. msg=" + e1.getMessage()); return false; } msg ("connect(): Try to connect."); try { mBTSocket.connect(); } catch (Exception e) { msg ("connect(): Exception thrown during connect: " + e.getMessage()); return false; // there was a problem connecting... make a note of the particulars and move on. } msg ("connect(): CONNECTED!"); try { mBTOutputStream = mBTSocket.getOutputStream(); mBTInputStream = new BufferedInputStream (mBTSocket.getInputStream(),INPUT_BUFFER_SIZE); //msg ("Connecting non-buffered input stream..."); //mBTInputStream = mBTSocket.getInputStream(); } catch (Exception e) { msg ("connect(): Error attaching i/o streams to socket. msg=" + e.getMessage()); return false; } resetErrorCounters(); setConnected(true); return true; } Then I send "ATI\r" and expect something like "CAN OBD II" but I get nothing. mBTInputStream.available(), it seems, is ALWAYS zero, even when data should be in the input buffer. There are GOBS of trace messages being generated by the OS as viewed with adb logcat -v time Some of the more interesting ones: 05-17 19:44:21.447 D/BluetoothSppPort( 5809): connected to device service! 05-17 19:44:21.447 D/BluetoothSppPort( 5809): Creating a BluetoothSpp proxy object 05-17 19:44:21.467 D/BluetoothSppService( 74): createPort called! 
05-17 19:44:21.467 D/BluetoothSppService( 74): createPort checking uuid 05-17 19:44:21.467 D/BluetoothSppService( 74): createPort UUID=00001101-0000-1000-8000-00805f9b34fb auth=true encrypt=true 05-17 19:44:21.467 D/BluetoothSppService( 74): createPort enforcing bluetooth perm 05-17 19:44:21.467 D/BluetoothSppService( 74): createPort creating a jbtlspp object 05-17 19:44:21.467 D/BluetoothSppService( 74): createPort checking if the btl spp object is valid 05-17 19:44:21.467 D/BluetoothSppService( 74): createPort try to create an spp container 05-17 19:44:21.467 D/BluetoothSppService( 74): createPort try to create security params 05-17 19:44:21.467 D/BluetoothSppService( 74): createPort Set Security L2 05-17 19:44:21.467 D/BluetoothSppService( 74): createPort spp port create 05-17 19:44:21.467 D/JBtlSpp ( 74): create: Entered 05-17 19:44:21.467 D/JBtlSpp ( 74): Calling NativeJBtlSpp_Create 05-17 19:44:21.467 D/JBtlSppNative( 74): NativeJBtlSpp_Create: Entered 05-17 19:44:21.467 D/JBtlSppNative( 74): NativeJBtlSpp_Create: Calling BTL_SPP_Remote_Create 05-17 19:44:21.477 D/JBtlSppNative( 74): NativeJBtlSpp_Create: BTL_SPP_Remote_Create returned 0, context:18 05-17 19:44:21.477 D/JBtlSppNative( 74): NativeJBtlSpp_Create: Setting context value in jContext out parm 05-17 19:44:21.477 D/JBtlSppNative( 74): NativeJBtlSpp_Create: Calling Java setValue(0x18) in context's class 05-17 19:44:21.477 D/JBtlProfileContext( 74): setValue: setValue called, value:24 05-17 19:44:21.477 D/JBtlSppNative( 74): create_spp_port_data: will use context struct 0 for the port 24 05-17 19:44:21.477 D/JBtlSppNative( 74): create_spp_port_data: spp port context 0 added 05-17 19:44:21.477 D/JBtlSppNative( 74): NativeJBtlSpp_Create:Exiting Successfully 05-17 19:44:21.477 D/JBtlSpp ( 74): After NativeJBtlSpp_Create, status=SUCCESS, Context = 24 05-17 19:44:21.477 D/JBtlRbtlServices( 74): addUser: Entered, userRefCount = 1 05-17 19:44:21.477 D/BluetoothSppService( 74): port create returned status SUCCESS 05-17 19:44:21.477 D/JBtlSpp ( 74): enable: Entered 05-17 19:44:21.477 D/JBtlSpp ( 74): enable: UUID=00001101-0000-1000-8000-00805f9b34fb 05-17 19:44:21.477 D/JBtlSppNative( 74): NativeJBtlSpp_Enable: Entered 05-17 19:44:21.487 D/JBtlSppNative( 74): NativeJBtlSpp_Enable: BTL_SPP_Enable returned 0 05-17 19:44:21.487 D/JBtlSppNative( 74): NativeJBtlSpp_Enable:Exiting 05-17 19:44:21.487 D/JBtlSpp ( 74): After NativeJBtlSpp_Enable, status=SUCCESS 05-17 19:44:21.487 D/JBtlSpp ( 74): enable: Exiting 05-17 19:44:21.487 D/BluetoothSppService( 74): port enable returned status SUCCESS 05-17 19:44:21.487 D/BluetoothSppService( 74): connectPort called! 
05-17 19:44:21.497 D/BluetoothSppService( 74): connectPort received bdaddress:00:18:E4:1D:23:9B 05-17 19:44:21.527 D/BluetoothSppService( 74): Trying to connect to 00:18:E4:1D:23:9B 05-17 19:44:21.527 D/JBtlSpp ( 74): setServiceName: Entered 05-17 19:44:21.527 D/JBtlSppNative( 74): NativeJBtlSpp_SetServiceName: Entered 05-17 19:44:21.547 D/JBtlSppNative( 74): NativeJBtlSpp_SetServiceName: native func returned 0 05-17 19:44:21.547 D/JBtlSppNative( 74): NativeJBtlSpp_SetServiceName:Exiting 05-17 19:44:21.547 D/JBtlSpp ( 74): After setServiceName, status=SUCCESS 05-17 19:44:21.547 D/JBtlSpp ( 74): setServiceName: Exiting 05-17 19:44:21.557 D/BluetoothSppService( 74): port setServiceName returned status SUCCESS 05-17 19:44:21.587 D/JBtlSpp ( 74): connect: Entered connecting to 00:18:E4:1D:23:9B 05-17 19:44:21.587 D/JBtlSppNative( 74): NativeJBtlSpp_Connect: Entered 05-17 19:44:21.597 D/JBtlSppNative( 74): NativeJBtlSpp_Connect: BTL_SPP_Connect returned 2 05-17 19:44:21.597 D/JBtlSppNative( 74): NativeJBtlSpp_Connect:Exiting 05-17 19:44:21.597 D/JBtlSpp ( 74): After NativeJBtlSpp_Connect, status=PENDING 05-17 19:44:21.747 D/AK8973 ( 61): Compass CLOSE 05-17 19:44:21.887 W/Process ( 74): Unable to open /proc/5749/status 05-17 19:44:21.917 I/ActivityManager( 74): Displayed activity com.gtosoft.dash/.Dash: 1279 ms (total 1279 ms) 05-17 19:44:24.047 D/ ( 74): signal_BTEVENT_ACCESSIBLE_CHANGE: Entered 05-17 19:44:24.047 D/ ( 74): signal_BTEVENT_ACCESSIBLE_CHANGE: Calling Java Accessible Change callback 05-17 19:44:24.047 D/JBtlBmg ( 74): nativeAccessibleChange 05-17 19:44:24.087 D/BluetoothService( 74): Callback - accessbileChange, btErrCode = NO_ERROR, mode = CONNECTABLE_ONLY 05-17 19:44:24.087 D/BluetoothService( 74): Sending ACTION_SCAN_MODE_CHANGED intent, mode = 21 05-17 19:44:24.087 D/ ( 74): signal_BTEVENT_ACCESSIBLE_CHANGE: Exiting 05-17 19:44:24.097 D/ ( 74): signal_BTEVENT_LINK_CONNECT_CNF: Entered 05-17 19:44:24.097 D/ ( 74): signal_BTEVENT_LINK_CONNECT_CNF: context: 1, errCode: 0 05-17 19:44:24.097 D/ ( 74): signal_BTEVENT_LINK_CONNECT_CNF: Calling Java Link Connect Confirmation callback 05-17 19:44:24.097 D/JBtlBmg ( 74): nativeLinkConnectCnf 05-17 19:44:24.107 D/BluetoothService( 74): Callback - linkConnectCnf, btErrCode = NO_ERROR, bdAddr = 00:18:E4:1D:23:9B 05-17 19:44:24.117 D/JBtlBmg ( 74): getKnownDeviceInfo: Entered 05-17 19:44:24.117 D/JBtlBmg ( 74): getKnownDeviceInfo: Calling NativeJBtlBmg_GetKnownDeviceInfo 05-17 19:44:24.137 D/ ( 74): NativeJBtlBmg_GetKnownDeviceInfo: Entered 05-17 19:44:24.137 D/ ( 74): NativeJBtlBmg_GetKnownDeviceInfo: Calling BTL_BMG_GetKnownDeviceInfo 05-17 19:44:24.227 D/JBtlBmgJniKnownDeviceInfo( 74): setValues: Entered 05-17 19:44:24.227 D/ ( 74): NativeJBtlBmg_GetKnownDeviceInfo:Exiting 05-17 19:44:24.227 D/JBtlBmg ( 74): getKnownDeviceInfo: After NativeJBtlBmg_GetKnownDeviceInfo, status=SUCCESS 05-17 19:44:24.227 D/JBtlBmg ( 74): getKnownDeviceInfo: Exiting 05-17 19:44:24.227 D/BluetoothService( 74): onRemoteDeviceConnected, device 00:18:E4:1D:23:9B is Paired 05-17 19:44:24.227 D/BluetoothService( 74): Sending ACTION_ACL_CONNECTED intent, address = 00:18:E4:1D:23:9B 05-17 19:44:24.227 D/BluetoothA2dpService( 74): Received intent with action: android.bluetooth.device.action.ACL_CONNECTED 05-17 19:44:24.227 D/ ( 74): signal_BTEVENT_LINK_CONNECT_CNF: Exiting 05-17 19:44:24.757 D/JBtlAg ( 163): setIndicatorValue: entered 05-17 19:44:24.767 I/JBtlAg ( 163): After NativeJBtlAg_SetIndicatorValue, status = SUCCESS 05-17 19:44:24.767 D/JBtlAg ( 163): 
setIndicatorValue: exiting 05-17 19:44:24.807 D/JBtlSppNative( 74): signal_SPP_EVENT_OPEN: Entered 05-17 19:44:24.807 D/JBtlSppNative( 74): signal_SPP_EVENT_OPEN: status: 0 context:24 05-17 19:44:24.827 D/JBtlSpp ( 74): nativeCb_open: Entered from 00:18:E4:1D:23:9B 05-17 19:44:24.827 D/JBtlSpp ( 74): nativeCb_open: Calling callback 05-17 19:44:24.827 D/BluetoothSppService( 74): connected called! 05-17 19:44:24.847 D/JBtlSpp ( 74): connect: Exiting 05-17 19:44:24.847 D/BluetoothSppService( 74): port connect returned status SUCCESS 05-17 19:44:24.847 D/JBtlSppNative( 74): signal_SPP_EVENT_OPEN: Exiting 05-17 19:44:24.847 D/JBtlSppNative( 74): signal_SPP_EVENT_MODEM_STATUS_IND: Entered 05-17 19:44:24.847 D/JBtlSppNative( 74): signal_SPP_EVENT_MODEM_STATUS_IND: Exiting 05-17 19:44:25.424 D/BluetoothSppService( 74): writeSync called! 05-17 19:44:25.424 D/JBtlSpp ( 74): write: Entered 05-17 19:44:25.427 D/JBtlSppNative( 74): NativeJBtlSpp_WriteNative: Entered 05-17 19:44:25.427 D/JBtlSppNative( 74): NativeJBtlSpp_WriteNative: BTL_SPP_WriteSync returned 0 written: 6 total: 0/6 05-17 19:44:25.437 D/JBtlSppNative( 74): signal_SPP_EVENT_TX_DATA_COMPLETE: Entered 05-17 19:44:25.437 D/JBtlSppNative( 74): signal_SPP_EVENT_TX_DATA_COMPLETE: status: 0 context:24 txDataLen:6 05-17 19:44:25.437 D/JBtlSppNative( 74): signal_SPP_EVENT_TX_DATA_COMPLETE: Exiting ok 05-17 19:44:25.437 D/JBtlSppNative( 74): NativeJBtlSpp_WriteNative: written 6 05-17 19:44:25.437 D/JBtlSppNative( 74): NativeJBtlSpp_WriteNative:Exiting with 0 05-17 19:44:25.437 D/JBtlSppNative( 74): NativeJBtlSpp_WriteNative: returning 6 bytes 05-17 19:44:25.437 D/JBtlSpp ( 74): After write, status=SUCCESS 05-17 19:44:25.437 D/JBtlSpp ( 74): write: Exiting 05-17 19:44:25.437 D/BluetoothSppPort( 5809): written 6 bytes 05-17 19:44:25.467 D/JBtlSppNative( 74): signal_SPP_EVENT_RX_DATA_IND: Entered 05-17 19:44:25.467 D/JBtlSppNative( 74): signal_SPP_EVENT_RX_DATA_IND: status: 0 context: 24 rxDataLen: 1 05-17 19:44:25.467 D/JBtlSppNative( 74): signal_SPP_EVENT_RX_DATA_IND: Exiting 05-17 19:44:25.477 D/JBtlSppNative( 74): signal_SPP_EVENT_RX_DATA_IND: Entered 05-17 19:44:25.477 D/JBtlSppNative( 74): signal_SPP_EVENT_RX_DATA_IND: status: 0 context: 24 rxDataLen: 5 05-17 19:44:25.477 D/JBtlSppNative( 74): signal_SPP_EVENT_RX_DATA_IND: Exiting 05-17 19:44:25.487 D/JBtlSppNative( 74): signal_SPP_EVENT_RX_DATA_IND: Entered 05-17 19:44:25.487 D/JBtlSppNative( 74): signal_SPP_EVENT_RX_DATA_IND: status: 0 context: 24 rxDataLen: 10 05-17 19:44:25.487 D/JBtlSppNative( 74): signal_SPP_EVENT_RX_DATA_IND: Exiting 05-17 19:44:25.497 D/JBtlSppNative( 74): signal_SPP_EVENT_RX_DATA_IND: Entered 05-17 19:44:25.497 D/JBtlSppNative( 74): signal_SPP_EVENT_RX_DATA_IND: status: 0 context: 24 rxDataLen: 7 05-17 19:44:25.497 D/JBtlSppNative( 74): signal_SPP_EVENT_RX_DATA_IND: Exiting 05-17 19:44:27.930 W/ActivityManager( 74): Activity destroy timeout for HistoryRecord{447e0d48 com.gtosoft.dash/.Dash} 05-17 19:44:29.907 D/dalvikvm( 448): GC freed 78 objects / 3664 bytes in 153ms 05-17 19:44:29.917 D/BluetoothSppService( 74): writeSync called! 
05-17 19:44:29.917 D/JBtlSpp ( 74): write: Entered 05-17 19:44:29.917 D/JBtlSppNative( 74): NativeJBtlSpp_WriteNative: Entered 05-17 19:44:29.927 D/JBtlSppNative( 74): NativeJBtlSpp_WriteNative: BTL_SPP_WriteSync returned 0 written: 6 total: 0/6 05-17 19:44:29.937 D/JBtlSppNative( 74): signal_SPP_EVENT_TX_DATA_COMPLETE: Entered 05-17 19:44:29.937 D/JBtlSppNative( 74): signal_SPP_EVENT_TX_DATA_COMPLETE: status: 0 context:24 txDataLen:6 05-17 19:44:29.937 D/JBtlSppNative( 74): signal_SPP_EVENT_TX_DATA_COMPLETE: Exiting ok 05-17 19:44:29.937 D/JBtlSppNative( 74): NativeJBtlSpp_WriteNative: written 6 05-17 19:44:29.937 D/JBtlSppNative( 74): NativeJBtlSpp_WriteNative:Exiting with 0 05-17 19:44:29.937 D/JBtlSppNative( 74): NativeJBtlSpp_WriteNative: returning 6 bytes 05-17 19:44:29.937 D/JBtlSpp ( 74): After write, status=SUCCESS 05-17 19:44:29.937 D/JBtlSpp ( 74): write: Exiting
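    A debugging note that may help anyone hitting the same symptom: InputStream.available() is allowed to return 0 even when a later blocking read() would still return data, so polling available() can make a working link look dead. Before concluding it is a platform bug, it may be worth reading on a dedicated thread with a blocking read(). This is only a sketch of that idea (it assumes the mBTInputStream field and msg() helper from the snippets above, and it is not a fix for any HTC-specific SPP issue):

        // Hypothetical reader thread: block in read() instead of polling available().
        Thread reader = new Thread(new Runnable() {
            public void run() {
                byte[] buffer = new byte[256];
                try {
                    int n;
                    // read() blocks until at least one byte arrives, or returns -1 when the socket closes.
                    while ((n = mBTInputStream.read(buffer)) != -1) {
                        msg("RX " + n + " bytes: " + new String(buffer, 0, n));
                    }
                } catch (Exception e) {
                    msg("reader thread stopped: " + e.getMessage());
                }
            }
        });
        reader.start();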

    Read the article

  • Webcast with Brian Griffin, Ancestry, 2013 Winner 10 Best Web Support Sites

    - by Tuula Fai
    The web is one of the fastest growing channels for providing service, support and information, as seen in The Service Council's (TSC) latest multi-channel research survey. Join TSC's Chief Customer Officer Sumair Dutta as he shares key findings from his current customer experience research from over 200 organizations. Sumair will be joined by Brian Griffin, Senior Program Manager, Global Support Experience, Ancestry.com, who will show how Ancestry is using the web as a powerful tool to enhance self-service opportunities and increase customer engagement.
    Smarter Web Service Educast
    Thursday, November 14th, 2 pm ET / 11 am PT
    Register: http://bit.ly/1cwz4Ns

    Read the article

  • Oracle Fusion Middleware Innovation Award Winners 2012: ADF & Fusion Development

    - by Dana Singleterry
    Oracle Fusion Middleware Innovation Awards honor customers for their cutting-edge solutions using Oracle Fusion Middleware. Winners are selected based on the uniqueness of their business case, business benefits, level of impact relative to the size of the organization, complexity and magnitude of implementation, and the originality of architecture. The awards were presented during Oracle OpenWorld 2012, and the following winners are in the category of ADF & Fusion Development.
    Micros – an OPN Platinum partner – has been working closely with Oracle product management teams in applying industry best practices in the development of their solutions. Their current application suite for the hospitality industry was built on Oracle Forms and the Oracle database running on MS Windows. The next generation of this suite is being developed and released in modules that are now based on Oracle FMW (including ADF) 11g technologies and Oracle Database 11g, all running on Oracle Linux. The primary driver was modernization, and hence Oracle ADF was selected to provide a rich UI for business processes that could be served up through traditional methods or through mobile devices globally. SOA Suite & ADF allowed for loosely-coupled services that could evolve with the needs of the business. Micros's application innovations include the use of business application portlets that have been published from ADF Faces Task Flows generated using WebCenter portlet libraries & Oracle Metadata Services (MDS), with multi-layered customizations using Oracle WebCenter Composer.
    PCS (Marfin Egnatia Bank of Greece) – PCS Wealth Management is a WM Software Solution, which captures and automates the WM business processes, allowing Service Providers to allocate enough time and effort into Customer Service and Investment Strategies, under Advisory or Execution-Only Services. The Product is built upon the latest Web Technologies and ensures Best Practices covering all functional expectations, meeting local regulatory requirements and discovering successful opportunities for the WM Customers' Portfolios. The new unified Wealth Management system offers an unparalleled User Interface, taking full advantage of the user-friendly ADF Faces Components, all serving Private Banking purposes. The application offers a true Account Officer Cockpit with shallow navigation, one-click access to informed decisions and perfect customer service. ADF Grids and Pivots, the Data Visualization Components, as well as the Calendar and Map Components, are cleverly used to help the user eliminate the usage of Excel, Outlook and other systems.
    PCS's application is unique in the way it leverages the ADF Faces data visualization components to create a truly attractive and insightful dashboard. PCS Wealth Management Demo
    Qualcomm – Qualcomm, a $17B per year company, designs and sells semiconductor products for wireless telecommunications, mobile and computing markets. In addition, Qualcomm companies provide various hardware and software products to facilitate the design, development and deployment of phones and the applications that run on them. Qualcomm's challenge has been to not only develop and deploy new business system functions to keep pace with customer demand, but also to provide a customer collaboration capability that is sufficiently robust, easy to use, and flexible to meet emerging and future needs. Qualcomm has taken successful steps in building and deploying the customer engagement platform, leveraging various Oracle technologies including Fusion Middleware (ADF, SOA, OBIEE) and their proven ERP foundation of EBS and 11g databases. The new platform delivers a more unified and "seamless" business solution with a consistent, modern "look and feel," all based on standard business processes which facilitate efficient collaboration with Qualcomm and its customers. The look and feel leverages ADF in innovative ways and includes hover-over navigation, custom pagination components, and skinning. Qualcomm has exposed a services layer that provides significant functionality including order-to-ship, quote-to-order, customer on-boarding and contract validation. Qualcomm's creative designs leverage Oracle's SOA Suite to integrate with Oracle EBS and disparate applications, providing a rich user interface through the use of Oracle ADF Faces Rich Client Components and delivering a self-service solution to their customers.

    Read the article

  • Pella Increases Online Appointment Scheduling and Rapidly Personalizes and Updates Marketing Initiatives

    - by Michael Snow
    Originally posted on the Oracle Customers page.
    Oracle Customer: Pella Corporation
    Location: Pella, Iowa
    Industry: Industrial Manufacturing
    Employees: 7,100
    Pella Corporation is an innovative leader in creating a better view for homes and businesses by designing, testing, manufacturing, and installing quality windows and doors for new construction, remodeling, and replacement applications. A family-owned company, Pella has an 88-year history of innovation and, today, is the second-largest manufacturer of windows and doors in the country, including patio, entry, and storm doors. The company has 10 manufacturing facilities in the United States and window and door showrooms across the United States and Canada. In-home consultations are an important part of Pella's sales process. Several years ago, the company launched an online appointment scheduling tool to improve customer convenience. While the functionality worked well, the company wanted to increase online conversion rates and decrease the number of incomplete online appointment schedules. It also wanted to give its business analysts and other line-of-business personnel the ability to update the scheduling tool and interface quickly, without needing IT team intervention and recoding, to better capitalize on opportunities and personalize the interface for specific markets. Pella also looked to reduce IT complexity by selecting a system that integrated easily with its Oracle E-Business Suite Release 12.1 enterprise applications. Pella, which has a large Oracle footprint, selected Oracle WebCenter Sites as the foundation for its new, real-time appointment scheduling application. It used the solution to re-engineer the scheduling process and the information required to set up an appointment. Just a few months after launch, it is seeing improvement in the number of appointments booked online and experiencing fewer abandoned appointments during the scheduling process. As important, Pella can now quickly and easily make changes to images, video, and content displayed on the scheduling tool interface, delivering greater business agility. Previously, such changes required a developer and weeks of coding and testing. Today, a member of Pella's business analyst team can complete the changes in hours. This capability enables Pella to personalize the Web experience for customers. For example, it can display different products or images for clients in different regions. The solution is also highly scalable. Pella is using Oracle WebCenter Sites for appointment scheduling now and plans to migrate Pella.com, its configurator tool, and dealer microsites onto the platform. Further, Pella plans to leverage the solution to optimize the experience for mobile devices. "Moving ahead, we expect to extensively leverage Oracle WebCenter Sites to gain greater flexibility in updating the Web experience, thanks to the ability to make updates quickly without developer resources. Segmentation and targeting capabilities will allow us to create a more personalized experience across both traditional and mobile platforms," said Teri Lancaster, IT manager, customer experience applications, Pella Corporation.
    A word from Pella Corporation: "Oracle WebCenter Sites, from the start, delivered important benefits. We've redesigned the online scheduling process and are seeing more potential customers completing consultation bookings online.
    More important, the solution opens a world of other possibilities as we plan to migrate Pella.com and our dealer microsites to the platform, and leverage it to optimize the Web experience for our mobile devices." – Teri Lancaster, IT Manager, Customer Experience Applications, Pella Corporation
    Oracle Products and Services: Oracle WebCenter Sites
    Why Oracle
    Pella has a long-standing relationship with Oracle. "We look to Oracle first for a solution. Our Oracle account team came to us with several solutions, and Oracle WebCenter Sites delivered the scalability, ease of use, and flexibility that we required for the appointment scheduling initiative and other Web projects on the horizon, including migrating Pella.com and optimizing our site for mobile platforms," said Teri Lancaster, IT manager, customer experience applications, Pella Corporation.
    Implementation Process
    The Pella implementation team, working with Oracle partner Element Solutions, LLC, integrated the appointment setting application with Pella.com as well as the company's Oracle E-Business Suite customer relationship management applications. Using Oracle WebCenter Sites' development tools and Subversion capabilities to develop the application, the Element Solutions and Pella teams could work remotely and collaboratively, accelerating deployment. Pella went live with the new scheduling tool in just six months.
    Partner: Element Solutions, LLC (Oracle Partner)
    Element Solutions was instrumental at every major stage of the project, including design creation and approval, development, training, and rollout. "Element Solutions was a vital partner for our Oracle WebCenter Sites initiative. The team provided guidance and, more important, critical knowledge transfer at every stage, which equipped us to get the most out of this powerful and versatile solution. We were definitely collaboration partners," Lancaster said.
    Resources:
    Pella Corporation Upgrades Enterprise Applications to Continue to Improve Manufacturing Efficiency
    Thousands of Customers Successfully and Smoothly Upgrade to Oracle E-Business Suite 12.1 for New Functionality, Lower Operating Costs and Improved Shared Operations
    Managing the Virtual World

    Read the article

  • Friends, Food, and Fun at the My Oracle Support Community Meetup

    - by Oracle OpenWorld Blog Team
    By Leslie McNeill. Join us at the third annual My Oracle Support Community Meetup for food and drink, fun, and conversation. After a long day at Oracle OpenWorld, take time to relax and meet your peers in the My Oracle Support Community and some of the Oracle employees who moderate the community. The Meetup event is a great place to get together before dinner, or to spend the evening getting to know other Community members and Oracle Support Moderators in person. Not a My Oracle Support Community member yet? Joining is easy - Oracle Premier Support customers can log in with the same account they use to access My Oracle Support to begin taking advantage of the resources the Community offers. If you're an Oracle Premier Support customer but don't yet have a login, talk to the Customer User Administrator (CUA) at your company now to get access to the Oracle proactive portfolio, including My Oracle Support Community. Oracle Premier Support customers need to register to receive their invitation to the Meetup and find out the details. Visit the Customer Support Services Oracle OpenWorld website to discover how you can take advantage of all Oracle OpenWorld has to offer.

    Read the article

  • Implementing an Interceptor Using NHibernate’s Built In Dynamic Proxy Generator

    - by Ricardo Peres
    NHibernate 3.2 came with an included proxy generator, which means there is no longer the need – or the possibility, for that matter – to choose Castle DynamicProxy, LinFu or Spring. This is actually a good thing, because it means one less assembly to deploy. Apparently, this generator was based, at least partially, on LinFu. As there are not many tutorials out there demonstrating its usage, here's one demonstrating one of the most requested features: implementing INotifyPropertyChanged. This interceptor, of course, will still give you all of the NHibernate functionality that you are used to, such as lazy loading. We will start by implementing an NHibernate interceptor, by inheriting from the base class NHibernate.EmptyInterceptor. This class does not do anything by itself, but it allows us to plug in behavior by overriding some of its methods, in this case, Instantiate:

        public class NotifyPropertyChangedInterceptor : EmptyInterceptor
        {
            private ISession session = null;

            private static readonly ProxyFactory factory = new ProxyFactory();

            public override void SetSession(ISession session)
            {
                this.session = session;
                base.SetSession(session);
            }

            public override Object Instantiate(String clazz, EntityMode entityMode, Object id)
            {
                Type entityType = Type.GetType(clazz);
                IProxy proxy = factory.CreateProxy(entityType, new _NotifyPropertyChangedInterceptor(), typeof(INotifyPropertyChanged)) as IProxy;

                _NotifyPropertyChangedInterceptor interceptor = proxy.Interceptor as _NotifyPropertyChangedInterceptor;
                interceptor.Proxy = this.session.SessionFactory.GetClassMetadata(entityType).Instantiate(id, entityMode);

                this.session.SessionFactory.GetClassMetadata(entityType).SetIdentifier(proxy, id, entityMode);

                return (proxy);
            }
        }

    Then we need a class that implements the NHibernate dynamic proxy behavior; let's place it inside our interceptor, because it will only need to be used there:

        class _NotifyPropertyChangedInterceptor : NHibernate.Proxy.DynamicProxy.IInterceptor
        {
            private PropertyChangedEventHandler changed = delegate { };

            public Object Proxy
            {
                get;
                set;
            }

            #region IInterceptor Members

            public Object Intercept(InvocationInfo info)
            {
                Boolean isSetter = info.TargetMethod.Name.StartsWith("set_") == true;
                Object result = null;

                if (info.TargetMethod.Name == "add_PropertyChanged")
                {
                    PropertyChangedEventHandler propertyChangedEventHandler = info.Arguments[0] as PropertyChangedEventHandler;
                    this.changed += propertyChangedEventHandler;
                }
                else if (info.TargetMethod.Name == "remove_PropertyChanged")
                {
                    PropertyChangedEventHandler propertyChangedEventHandler = info.Arguments[0] as PropertyChangedEventHandler;
                    this.changed -= propertyChangedEventHandler;
                }
                else
                {
                    result = info.TargetMethod.Invoke(this.Proxy, info.Arguments);
                }

                if (isSetter == true)
                {
                    String propertyName = info.TargetMethod.Name.Substring("set_".Length);
                    this.changed(this.Proxy, new PropertyChangedEventArgs(propertyName));
                }

                return (result);
            }

            #endregion
        }

    What this does for every interceptable method (those that are either virtual or come from INotifyPropertyChanged) is:
    For the methods that come from the INotifyPropertyChanged interface, add_PropertyChanged and remove_PropertyChanged (yes, events are methods), we add an implementation that adds or removes the event handlers to the delegate which we declared as changed;
    For all the others, we direct them to the place where they are actually implemented, which is the Proxy field;
    If the call is setting a property, it afterwards fires the PropertyChanged event.
    In order to use this, we need to add the interceptor to the Configuration before building the ISessionFactory:

        using (ISessionFactory factory = cfg.SetInterceptor(new NotifyPropertyChangedInterceptor()).BuildSessionFactory())
        {
            using (ISession session = factory.OpenSession())
            using (ITransaction tx = session.BeginTransaction())
            {
                Customer customer = session.Get<Customer>(100); //some id
                INotifyPropertyChanged inpc = customer as INotifyPropertyChanged;
                inpc.PropertyChanged += delegate(Object sender, PropertyChangedEventArgs e)
                {
                    //fired when a property changes
                };
                customer.Address = "some other address"; //will raise PropertyChanged
                customer.RecentOrders.ToList(); //will trigger the lazy loading
            }
        }

    Any problems, questions, do drop me a line!
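    One detail the post assumes but does not spell out: only virtual members can be overridden by the generated proxy, so the mapped entity's properties need to be declared virtual for the setter interception (and NHibernate's lazy loading) to work. As a rough illustration only (this entity sketch is mine, not from the original article), the Customer used in the usage example above might look like this:

        using System.Collections.Generic;

        // Hypothetical entity shape; the virtual setters are what the proxy overrides
        // in order to raise PropertyChanged after a value changes.
        public class Customer
        {
            public virtual int Id { get; set; }
            public virtual string Name { get; set; }
            public virtual string Address { get; set; }
            // Lazy-loaded collection; the interceptor leaves normal loading behavior untouched.
            public virtual IList<Order> RecentOrders { get; set; }
        }

        public class Order
        {
            public virtual int Id { get; set; }
            public virtual decimal Total { get; set; }
        }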

    Read the article

  • Five Reasons to Attend PLM Summit 2013: The Conference Formerly Known as AGILITY

    - by Terri Hiskey
    As we approach the end of 2012, we are also closing in on the last couple of weeks that Agile customers and prospects can register for the upcoming PLM Summit 2013 for the bargain early bird rate of $195. Register now to secure your spot!
    The Conference Formerly Known as AGILITY...
    Long-time Agile customers may remember AGILITY, which was Agile's PLM customer conference that was held on an annual basis prior to Oracle's acquisition of Agile in 2007. In February 2012, due to feedback we received from our Agile PLM community, we successfully resurrected the AGILITY conference and renamed it the PLM Summit. The PLM Summit was so well received and well attended that we are doing it again in 2013. This upcoming PLM Summit is being co-located in San Francisco under the overarching banner of the Oracle Value Chain Summit, and will be held alongside several other Oracle customer conferences that cover a range of value chain solutions, including Value Chain Planning, Value Chain Execution, Procurement, Maintenance and Manufacturing. This setup offers PLM attendees the best of all worlds--the opportunity to participate and learn about PLM in smaller, focused sessions by product and by industry, while also giving attendees the chance to see how PLM works together with other critical enterprise applications that address other important aspects of the value chain.
    Top Five Reasons to Attend the PLM Summit 2013
    In the spirit of all of the end-of-the-year lists that are currently popping up, here is a list of the top five reasons to attend the PLM Summit, for anyone out there who needs a little extra encouragement to register:
    1. The Best Opportunities for Customer Networking
    The PLM Summit offers attendees numerous opportunities to learn and network with fellow Agile users. Customer stories are featured in keynote and breakout presentations, and the schedule allows for plenty of networking time during breakfasts, lunches, breaks and dinners. Customer networking is the number one reason that Agile users attend the PLM Summit. Read what attendees thought of the most recent PLM Summit:
    "Hearing about the implementation of Agile products from a customer's perspective is invaluable." - Director of Quality Assurance & Regulatory Affairs, leading medical device manufacturer
    "Understanding the scope of other companies' projects and the lessons learned made attending this event well worth my time." - Director of Test Engineering, global industrial manufacturer
    "The most beneficial thing about attending this event is the opportunity to network with other customers with similar experiences." - Director of Business Process Improvement, leading high technology company
    Come to the PLM Summit and play an active role within the PLM community: swap war stories and business cards, connect on LinkedIn and Facebook, share your stories and discuss the sessions from each day. Register now!
    2. It's Educational!
    The PLM Summit is the premier educational event for anyone in the Agile PLM community. There are nearly 40 PLM-focused, in-depth educational sessions led by Agile PLM experts, customers and partners that will cover a range of specific product and industry-focused topics. Keynotes will give attendees a broad overview of the entire Agile PLM footprint, while sessions will delve deeply into specific product functionality and customer case studies. There is truly something for everyone. Check out the latest agenda for a view of all the sessions.
    3. Visit with the PLM Partner Community
    Our partners play a significant and important role within the Agile PLM community. At the PLM Summit, attendees will be able to meet and mingle with several of the top Oracle Agile PLM partners, including Deloitte, Domain, GoEngineer, Hitachi Consulting, IBM, Kalypso, KPIT Cummins (CPG Solutions), Perception Software, Verdant, Xavor and ZeroWaitState. Go here for a complete list of all the Value Chain Summit sponsors.
    4. See Agile PLM in Action at our Dedicated PLM Demo Pods
    At the PLM Summit, attendees will have the chance to see Agile PLM in action at dedicated PLM demo pods, manned by expert members of our Agile PLM team. If you would like to see specific Agile PLM functionality up close, if you have a question on how to extend the scope of your current implementation, or if you want a better understanding of how to leverage Agile PLM to address specific use cases, stop by one of the Agile PLM demo pods and engage the Agile PLM experts on hand at the PLM Summit.
    5. Spend Some Time in Lovely San Francisco
    Still on the fence about the upcoming PLM Summit? Remember that it is being held in San Francisco, which is a fantastic city for a getaway. After spending time learning and networking about PLM, take an extra day or two to escape the dreary winter and enjoy the beautiful scenery and the unique activities offered only by the City by the Bay. You will walk away from the conference not only with renewed excitement about Agile PLM, but feeling rejuvenated in general.

    Read the article

  • Wondering What to Expect from Master Data Management at OpenWorld 2012? Hold On to Your Seats…

    - by Mala Narasimharajan
    The Countdown begins – just 23 days till OpenWorld hits San Francisco. Oracle OpenWorld 2012 for MDM promises to be chock full of interesting sessions, specifically focused on our customers. We've made sure that our sessions are balanced between product information, strategy and real-world stories and, last but certainly not least, lessons learned – straight from our customers.
    Attendee / Presenters Toolkit:
    Oracle Master Data Management FOCUS ON document – for all MDM sessions at OOW, where and when
    Oracle Schedule Builder – use search terms such as: MDM, master data, customer hub, product hub and master data management
    Oracle Music Festival – AMAZING line up!
    Oracle Customer Appreciation Night – NOT TO BE MISSED!
    Oracle OpenWorld LIVE On-Demand
    Stay on top of all that's OpenWorld when it comes to MDM. We'll be posting not-to-miss sessions and blogs on what our customer lineup will be like at the big show. Look forward to seeing you at OOW – and in case you didn't get approval to attend, take advantage of our virtual on-demand conference. See you at OpenWorld 2012!

    Read the article

  • The Oracle Retail Week Awards - in review

    - by user801960
    The Oracle Retail Week Awards 2012 were another great success, building on the legacy of previous award ceremonies. Over 1,600 of the UK's top retailers gathered at the Grosvenor House Hotel and many of Europe's top retail leaders attended the prestigious Oracle Retail VIP Reception in the Grosvenor House Hotel's Red Bar. Over the years the Oracle Retail Week Awards have become a rallying point for the morale of the retail industry, and each nominated retailer served as a demonstration that the industry is fighting fit. It was an honour to speak to so many figureheads of UK - and global - retail. All of us at Oracle Retail would like to congratulate both the winners and the nominees for the awards. Retail is a cornerstone of the economy and it was inspiring to see so many outstanding demonstrations of innovation and dedication in the entries.
    Winners 2012
    The Market Force Customer Service Initiative of the Year – Winner: Dixons Retail: Knowhow; Highly Commended: Hughes Electrical: Digital Switchover
    The Deloitte Employer of the Year – Winner: Morrisons
    Growing Retailer of the Year – Winner: Hallett Retail - The Concessions People; Highly Commended: Blue Inc
    The TCC Marketing/Advertising Campaign of the Year – Winner: Sainsbury's: Feed your Family for £50
    The Brandbank Multichannel Retailer of the Year – Winner: Debenhams; Highly Commended: Halfords
    The Ashton Partnership Product Innovation of the Year – Winner: Argos: Chad Valley; Highly Commended: Halfords: Private label bikes
    The RR Donnelley Pure-play Online Retailer of the Year – Winner: Wiggle
    The Hitachi Consulting Responsible Retailer of the Year – Winner: B&Q: One Planet Home
    The CA Technologies Retail Technology Initiative of the Year – Winner: Oasis: Argyll Street flagship launch with iPad PoS
    The Premier Tax Free Speciality Retailer of the Year – Winner: Holland & Barrett
    Store Design of the Year – Winner: Next Home and Garden, Shoreham, Sussex; Highly Commended: Dixons Retail, Black concept store, Birmingham Bullring
    Store Manager of the Year – Winner: Ian Allcock, Homebase, Aylesford; Highly Commended: Darren Parfitt, Boots UK, Melton Mowbray Health Centre
    The Wates Retail Destination of the Year – Winner: Westfield, Stratford
    The AlixPartners Emerging Retail Leader of the Year – Winner: Catriona Marshall, HobbyCraft, Chief Executive
    The Wipro Retail International Retailer of the Year – Winner: Apple
    The Clarity Search Retail Leader of the Year – Winner: Ian Cheshire, Chief Executive, Kingfisher
    The Oracle Retailer of the Year – Winner: Burberry
    Outstanding Contribution to Retail – Winner: Lord Harris of Peckham
    Oracle Retail and "Your Experience Platform"
    Technology is the key to providing that differentiated retail experience. More specifically, it is what we at Oracle call 'the experience platform' - a set of integrated, cross-channel business technology solutions, selected and operated by a retail business and IT team, and deployed in accordance with that organisation's individual strategy and processes. This business systems architecture simultaneously:
    Connects customer interactions across all channels and touchpoints, and every customer lifecycle phase, to provide a differentiated customer experience that meets consumers' needs and expectations;
    Delivers actionable insight that enables smarter decisions in planning, forecasting, merchandising, supply chain management, marketing, etc;
    Optimises operations to align every aspect of the retail business to gain efficiencies and economies, to align KPIs to eliminate strategic conflicts, and at the same time be working in support of customer priorities.
    Working in unison, these three goals not only help retailers to successfully navigate the challenges of today but also to focus on delivering that personalised customer experience based on differentiated products, pricing, services and interactions that will help you to gain market share and grow sales.

    Read the article

  • Dynamic Bursting ... no really!

    - by Tim Dexter
    If any of you have seen me or my colleagues present BI Publisher to you, then we have hopefully mentioned 'bursting.' You may have even seen a demo where we talk about being able to take a batch of data, say invoices; then split them by some criteria, say customer id; format them with a template; generate the output and then deliver the documents to the recipients with a click. We, and especially I, always say this can be completely dynamic! By this I mean that you could store customer preferences in a database: what layout each customer would like, what output format they would like, and how they would like the document delivered. We (I) talk a good talk, but typically don't do the walk in a demo. We hard code everything in the bursting query or bursting control file to get the concept across. But no more, peeps! I have finally put together a dynamic bursting demo! It's been minutes in the making, but it's been tough to find those minutes! Read on ...
    It's nothing amazing in terms of making the burst dynamic. I created a CUSTOMER_PREFS table with some simple UI in an APEX application so that I can maintain their requirements. In EBS you have descriptive flexfields that could do the same thing, or probably even 'contact' fields to store most of the info. Here's my table structure:
        Name                           Type
        ------------------------------ --------
        CUSTOMER_ID                    NUMBER(6)
        TEMPLATE_TYPE                  VARCHAR2(20)
        TEMPLATE_NAME                  VARCHAR2(120)
        OUTPUT_FORMAT                  VARCHAR2(20)
        DELIVERY_CHANNEL               VARCHAR2(50)
        EMAIL                          VARCHAR2(255)
        FAX                            VARCHAR2(20)
        ATTACH                         VARCHAR2(20)
        FILE_LOC                       VARCHAR2(255)
    Simple enough, right? Just need CUSTOMER_ID as the key for the bursting engine to join it to the customer data at burst time. I have not covered the full delivery options, just email, fax and file location. Remember, it's a demo, people :0) However, the principle is exactly the same for each delivery type. They each have a set of attributes that need to be provided, and you will need to handle that in your bursting query. On a side note, in EBS, where you use a bursting control file, you can apply the same principles that I'm laying out here; you just need to get the customer bursting info into the XML data stream so that you can refer to it in the control file using XPATH expressions. Next, we need to look up what attributes or parameters are required for each delivery method. That can be found in the documentation here.
    Now that we know the combinations of parameters and delivery methods, we can construct the query using a series of decode statements:
        select distinct cp.customer_id "KEY",
               cp.template_name TEMPLATE,
               cp.template_type TEMPLATE_FORMAT,
               'en-US' LOCALE,
               cp.output_format OUTPUT_FORMAT,
               'false' SAVE_FORMAT,
               cp.delivery_channel DEL_CHANNEL,
               decode(cp.delivery_channel, 'FILE', cp.file_loc,
                                           'EMAIL', cp.email,
                                           'FAX', cp.fax) PARAMETER1,
               decode(cp.delivery_channel, 'FILE', c.cust_last_name||'_orders.pdf',
                                           'EMAIL', '[email protected]',
                                           'FAX', 'faxserver.com') PARAMETER2,
               decode(cp.delivery_channel, 'FILE', NULL,
                                           'EMAIL', '[email protected]',
                                           'FAX', NULL) PARAMETER3,
               decode(cp.delivery_channel, 'FILE', NULL,
                                           'EMAIL', 'Your current orders',
                                           'FAX', NULL) PARAMETER4,
               decode(cp.delivery_channel, 'FILE', NULL,
                                           'EMAIL', 'Please find attached a copy of your current orders with BI Publisher, Inc',
                                           'FAX', NULL) PARAMETER5,
               decode(cp.delivery_channel, 'FILE', NULL,
                                           'EMAIL', 'false',
                                           'FAX', NULL) PARAMETER6,
               decode(cp.delivery_channel, 'FILE', NULL,
                                           'EMAIL', '[email protected]',
                                           'FAX', NULL) PARAMETER7
          from cust_prefs cp, customers c, orders_view ov
         where cp.customer_id = c.customer_id
           and cp.customer_id = ov.customer_id
         order by cp.customer_id
    Pretty straightforward; just need to test, test, test the query and ensure it's bringing back the correct data based on each customer's preferences. Notice the NULL values for parameters that are not relevant for a given delivery channel. You should end up with bursting control data that the bursting engine can use. Now, your users can run the burst and documents will be formatted, generated and delivered based on the customer prefs.
    If you're interested in the example, I have used the sample OE schema data for the base report. The report files and CUST_PREFS table are zipped up here. The zip contains the data model (.xdmz), the report and templates (.xdoz) and the sql scripts to create and load data to the CUST_PREFS table. Once you load the report into the catalog, you'll need to create the OE data connection and point the data model at it. You'll probably need to re-point the report to the data model too. Happy Bursting!

    Read the article

  • WebLogic 12c and Glassfish ppt presentations

    - by JuergenKress
    We updated our WebLogic Community Workspace with the latest customer-facing presentations in ppt format. We recommend using these presentations for your customer meetings; please feel free to add your service offerings, references and Specialization information:
    Oracle WebLogic Suite CVC 08.2012.pptx
    GlassFish Server Technical Overview 08.2012.pptx
    For all customer presentations in ppt format, please visit the WebLogic Community Workspace (WebLogic Community membership required).
    WebLogic Partner Community: For regular information, become a member of the WebLogic Partner Community; please visit http://www.oracle.com/partners/goto/wls-emea (OPN account required). If you need support with your account, please contact the Oracle Partner Business Center.
    Blog Twitter LinkedIn Mix Forum Wiki
    Technorati Tags: ppt,presentation,Glassfish,glassfish ppt,WebLogic ppt,presenation,sales,WebLogic Community,Oracle,OPN,Jürgen Kress

    Read the article

< Previous Page | 51 52 53 54 55 56 57 58 59 60 61 62  | Next Page >