Search Results

Search found 38993 results on 1560 pages for 'business object'.

  • More BI Showcase Events - Greensboro, NC & Tampa, FL

    - by Rob Reynolds
    As the momentum around OBIEE 11g continues, we are providing more opportunities to get a hands-on view of the new technology via our Oracle Business Intelligence Showcases. Next week we will have Showcases in Greensboro, NC and Tampa, FL. I will be presenting at both, so please stop by and say hello while learning about the latest in Oracle BI & DW technology. Pre-registration is required. You can register for the events at the links below:
    Greensboro, NC - Tuesday, December 7, 2010
    Tampa, FL - Wednesday, December 8, 2010
    Session Agenda:
    9:00 a.m. – 10:00 a.m.   Registration and Welcome
    10:00 a.m. – 11:00 a.m.  Session Keynote: Oracle's New Generation of Business Intelligence Solutions and Innovations
    11:00 a.m. – 12:00 noon  Session 1
        Track 1: Oracle Business Intelligence Enterprise Edition 11g: End User Experience
        Track 2: Management Reporting with Oracle Essbase
    12:00 noon – 1:00 p.m.   Networking Lunch
    1:00 p.m. – 2:00 p.m.    Session 2
        Track 1: Oracle Business Intelligence Enterprise Edition 11g for Power Users, Developers, and Administrators
        Track 2: Oracle BI Applications: The Value of Cross-Functional BI
    (Break to change rooms)
    2:00 p.m. – 3:00 p.m.    Session 3
        Track 1: Extreme Performance Data Warehousing
        Track 2: Master Data Management: The Single Source of Truth for Real Time Decisions
    3:15 p.m.                Wrap-Up and Raffle Prize

    Read the article

  • Partner Webcast - Oracle WebCenter: Portal Highlights - 31 Oct 2013

    - by Thanos Terentes Printzios
    Oracle WebCenter is the center of engagement for business. In order to succeed in today's economy, organizations need to engage with information across all channels to ensure customers, partners and employees have access to the right information in the context of the business process in which they are engaged. The latest release of Oracle WebCenter addresses this challenge with updates across its complete portfolio. Nowadays, portals are multi-channel applications that enable the creation, sharing and distribution of personalized content, as well as access to social networking and self-service capabilities. Web 2.0 and social technologies have already transformed the ways customers, employees, partners, and suppliers communicate and stay informed. The new release of Oracle WebCenter Portal makes it easier and faster for business users to create intuitive portals with integrated application content by:
    - Streamlining development with an integrated set of tools for web and mobile.
    - Providing out-of-the-box templates for common use cases.
    - Expediting the portal creation experience with new development tools.
    These empower business users to build and deploy mobile portals and websites with unprecedented speed, without having to wait for IT, which leads to a shorter time to market and reduced costs. Join us to discover a web platform that allows organizations to quickly and easily create intranets, extranets, composite applications, and self-service portals, providing users a more secure and efficient way of consuming information and interacting with applications, processes, and other users - the latest Oracle WebCenter Portal release 11gR1 PS7.
    Agenda:
    - Oracle WebCenter Overview
    - Oracle WebCenter Portal: new and enhanced features to improve the user experience
        For Knowledge Workers: Simplified Portal Creation; Search Enhancements
        For Application Specialists: New Portal Builder; Simplify Mobile Development
        For Developers: Enhanced APIs and ADF Support
        For Administrators: Lifecycle Enhancements; Search Administration; Impersonation
    - Summary - Q&A
    This is our first webcast of an Oracle WebCenter series for partners, with the support of the Oracle EMEA WebCenter Partner Community.
    Delivery Format: This FREE online LIVE eSeminar will be delivered over the Web. Registrations received less than 24 hours prior to start time may not receive confirmation to attend. New invitations will be shared for additional webcasts planned for Oracle WebCenter.
    Thursday, October 31st, 2013, 10am CET (8am UTC / 11am EEST) - Register Now
    For any questions please contact us at [email protected]
    Stay Connected

    Read the article

  • Formalizing a requirements spec written in narrative English

    - by ProfK
    I have a fairly technical functionality requirements spec, expressed in English prose, produced by my project manager. It is structured as a collection of UI tabs, where the requirements for each tab are expressed as a list of UI fields and a list of business rules for the tab. Most business rules are for UI fields on a tab, e.g.:
    a) Must be alphanumeric, max length 20.
    b) Must be a dropdown, with values from table x.
    c) Is mandatory.
    d) Is mandatory under certain conditions, e.g. only when another field is populated, or has a specific value.
    Then other business rules get a little more complex. The spec is for a job application, so the central business object (table) is the Applicant, and we have several other tables with one-to-many relationships with Applicant, such as Degree, HighSchool, PreviousEmployer, Diploma, etc.
    e) One such complex rule says a status field can only be assigned a certain value if a many-side record exists in at least one of the many-side tables, e.g. the Applicant has at least one HighSchool or at least one Diploma record.
    I am looking for advice on how to codify these requirements into a more structured specification defined in terms of tables, fields, and relationships, especially for the conditional rules for fields and for the presence of related records. Any suggestions and advice will be most welcome, but I would be overjoyed if I could find an already defined system or structure for expressing things like this.
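
    One possible direction, sketched below in Python: treat each rule as data rather than prose, so the conditional and cross-table rules become declarative entries that a small validation routine can evaluate. The rule vocabulary here (mandatory_if, value_requires_related, and the field names) is invented for illustration, not an existing standard.

        # Illustrative sketch only: codifying the narrative rules as a data structure.
        SPEC = {
            "Applicant": {
                "fields": {
                    # a) + c): simple, unconditional field rules
                    "surname": {"type": "alphanumeric", "max_length": 20, "mandatory": True},
                    # b): dropdown sourced from a lookup table
                    "province": {"type": "dropdown", "source_table": "province"},
                    # d): conditionally mandatory on another field's state
                    "other_income_detail": {"mandatory_if": {"field": "has_other_income", "equals": True}},
                    # e): a value permitted only when a related record exists
                    "status": {
                        "value_requires_related": {
                            "value": "QUALIFIED",
                            "any_of_tables": ["HighSchool", "Diploma"],
                        }
                    },
                }
            }
        }

        def check_status(applicant, related_counts):
            """Evaluate rule (e): related_counts maps child table name -> row count."""
            rule = SPEC["Applicant"]["fields"]["status"]["value_requires_related"]
            if applicant.get("status") == rule["value"]:
                return any(related_counts.get(t, 0) > 0 for t in rule["any_of_tables"])
            return True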

    Read the article

  • Inevitable Corporate Bureaucracy

    - by Ahsan Alam
    Top executives of most smaller organizations want their companies to be different from the larger corporations. They want their organizations smaller in size, but bigger in productivity, by eliminating red tape and corporate bureaucracy. When the company is smaller, people often work like firefighters - taking on new business and technology challenges without thinking about any procedures and guidelines. People also tend to wear many hats to accomplish tasks quickly in order to integrate new businesses. For example, software developers in smaller organizations may take on responsibilities for client interactions, requirements gathering, design and development, code deployment, production support, network infrastructure support, and database design and maintenance, along with countless other duties. In addition, systems in smaller organizations tend to be loosely guarded, so people often don't follow many procedures when setting up environments and implementing technical projects. It's not uncommon to change code and deploy without anyone realizing. Similarly, business requirements may get defined in an informal manner without any type of documentation. As the company grows, everything starts to change, significantly impacting people and the overall business process. Suddenly, following procedures becomes extremely important. Consequently, new roles, guidelines and procedures start to emerge. Everything from business processes to technology implementation becomes more and more process oriented. Organizations start to define and document steps, invent procedures to track process- and system-level changes, and restrict access to various systems for security reasons. At the same time, as growing companies start doing business with larger clienteles, they are automatically forced to abide by all sorts of industry compliance laws. Moreover, growing companies tend to recruit experienced individuals to fill new roles, who usually bring their expertise from larger and more bureaucratic organizations. Despite the best efforts of top executives, the increased number of procedures and guidelines, as well as the new recruits, seem to contribute automatically to the evolution of corporate bureaucracy. Maybe corporate bureaucracy is an inevitable side effect of a growing organization.

    Read the article

  • Duck checker in Python: does one exist?

    - by elliot42
    Python uses duck typing, rather than static type checking. But many of the same concerns ultimately apply: does an object have the desired methods and attributes? Do those attributes have valid, in-range values? Whether you're writing constraints in code, writing test cases, validating user input, or just debugging, inevitably somewhere you'll need to verify that an object is still in a proper state - that it still "looks like a duck" and "quacks like a duck." In statically typed languages you can simply declare "int x", and any time you create or mutate x, it will always be a valid int. It seems feasible to decorate a Python object to ensure that it is valid under certain constraints, and that every time that object is mutated it is still valid under those constraints. Ideally there would be a simple declarative syntax to express "hasattr length and length is non-negative" (not in those words; not unlike Rails validators, but less human-language and more programming-language). You could think of this as an ad-hoc interface/type system, or you could think of it as an ever-present object-level unit test. Does such a library exist to declare and validate constraint/duck-checking on Python objects? Is this an unreasonable tool to want? :) (Thanks!) Contrived example:

        rectangle = {'length': 5, 'width': 10}

        # We live in a fictional universe where multiplication is super expensive.
        # Therefore any time we multiply, we need to cache the results.
        def area(rect):
            if 'area' in rect:
                return rect['area']
            rect['area'] = rect['length'] * rect['width']
            return rect['area']

        print(area(rectangle))
        rectangle['length'] = 15
        print(area(rectangle))  # compare expected vs. actual output!
        # imagine the same thing with object attributes rather than dictionary keys.
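
    For what it's worth, the core of such a duck checker can be sketched in a few lines: intercept attribute assignment and re-check a declared constraint on every mutation. The class names and constraint syntax below are invented for illustration, not taken from an existing library.

        # Minimal sketch: per-attribute predicates re-checked on every assignment.
        class DuckChecked:
            constraints = {}  # attribute name -> predicate that must hold

            def __setattr__(self, name, value):
                predicate = self.constraints.get(name)
                if predicate is not None and not predicate(value):
                    raise ValueError("constraint failed for attribute %r" % name)
                object.__setattr__(self, name, value)

        class Rectangle(DuckChecked):
            constraints = {
                'length': lambda v: isinstance(v, (int, float)) and v >= 0,
                'width':  lambda v: isinstance(v, (int, float)) and v >= 0,
            }

            def __init__(self, length, width):
                self.length = length
                self.width = width

        r = Rectangle(5, 10)
        r.length = 15   # fine: still a non-negative number
        r.length = -1   # raises ValueError - the object no longer "quacks like a duck"

    Note this catches invalid direct mutations, but it does not spot stale derived state like the cached 'area' above; for that you would also need to invalidate or re-derive dependent values on assignment.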

    Read the article

  • Almost at our first year anniversary!

    - by Vizioz Limited
    It has been a hectic first year at Vizioz and things are still going from strength to strength. 11 months ago I started Vizioz with zero capital investment in the middle of a recession, which to some may seem a daunting prospect, but to others, including myself, it was the challenge I needed to make me want to get up in the morning :) I wanted to prove that even in the current financial climate it is still possible to start a new business. We are still experiencing the normal growing pains of a small business, but this is something we just need to work our way through; it is amazing how much paperwork and administration there is in running a small business: office admin, insurance, VAT and, for the last few months, PAYE. For the last 9 months we have shared an office with another small business called Little Big Ideas. They are a design agency working across a broad spectrum of design, from branding to print and digital. Last month we decided to move to a larger office and now have room for 8 of us, so now we need a couple more clients to help produce enough work to fill the space and grow to the next level. As well as moving office, 2 months ago I blogged about my first employee, Colin, starting work for me. He has picked up Umbraco very well and has mastered the art of good CSS design; as the majority of our clients are large multi-nationals, they still require support for IE6, which as all web developers know is the nightmare of all web browsers. This month has seen the next step in the growth of Vizioz as I have taken on another PhD graduate called Pricilla - welcome to the team! This month we plan to launch our own website to enable us to showcase some of the sites we have built over the past 11 months and to allow potential clients to see what we can offer. We might still be relatively small, but we have some great case studies to show, and with two PhD graduates on the team we have great talent capable of producing complex and innovative solutions for our clients. As soon as we have launched our new website I will blog again about what the future holds for Vizioz and what we can offer our prospective clients, as well as the obvious Umbraco CMS solutions.

    Read the article

  • Executive Edge: It's the end of work as we know it

    - by Naresh Persaud
    If you are at Oracle OpenWorld, it has been an exciting couple of days, from Larry's keynote to the events at the Executive Edge. The CSO Summit was included as a program within the Executive Edge this year. The day started with a great presentation from Joel Brenner, author of "America The Vulnerable," as he discussed the impact of state-sponsored espionage on businesses. The opportunity for every business is to turn security into a business advantage. As we enter an inhospitable security climate, every business has to adapt to the security climate change. Amit Jasuja's presentation focused on how customers can secure the new digital experience. As every sector of the economy transforms to adapt to changing global economic pressures, every business has to adapt. For IT organizations, the biggest transformation will involve cloud, mobile and social. Organizations that can get security right in the "new work order" will have an advantage. It is truly the end of work as we know it. The "new work order" means working anytime and anywhere. The office is anywhere we want it to be, because work is not a place, it is an activity. Below is a copy of Amit Jasuja's presentation, "Securing the New Experience" (shared by OracleIDM on SlideShare).

    Read the article

  • Is there a pattern or best practice for passing a reference type to multiple classes vs a static class?

    - by Dave
    My .NET application creates HTML files, and as such, the structure looks like:

        variable myData
        BuildHomePage()
        variable graph = new BuildGraphPage(myData)
        variable table = new BuildTablePage(myData)

    BuildGraphPage and BuildTablePage both require access to the data, the myData object. In the above example, I've passed the myData object to 2 constructors. This is what I'm doing now, in my current project. The myData object and its properties are all readonly. The problem is, the number of pages which will require this object has grown. In the real project there are currently 4, but the new spec is to have about 20. Passing this object to the constructor of each new object and assigning it to a field is a little time consuming, but not a hardship! This poses the question whether it's better practice to continue as I have, or to refactor and create a new static class for myData which can be referenced from anywhere in my project. I guess my ability to use Google is poor, because I did try to find an appropriate pattern, as I am sure this type of design must be commonplace, but my search returned nothing. Is there a pattern which is suited, or do best practices lean towards one implementation over another?
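
    For what it's worth, the trade-off can be sketched in a few lines (shown in Python for brevity; the shape is identical in .NET, and all names below are illustrative). Constructor injection keeps the dependency explicit and easy to fake in tests; a static holder removes the ceremony but hides the dependency and couples every page to one shared instance. Since the data is readonly, a common middle ground is to inject it once into a small factory that constructs the twenty pages.

        # Option 1: constructor injection - the dependency is explicit and testable.
        class GraphPage:
            def __init__(self, data):
                self.data = data

        # Option 2: a static/ambient holder - less ceremony, but the dependency
        # is hidden and every page silently couples to one shared instance.
        class ReportContext:
            data = None  # assigned once at startup

        class TablePage:
            def render(self):
                return str(ReportContext.data)  # implicit dependency

        # Middle ground: inject once into a factory, so only one class carries
        # the dependency and the other pages receive it ready-made.
        class PageFactory:
            def __init__(self, data):
                self.data = data
            def graph_page(self):
                return GraphPage(self.data)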

    Read the article

  • Invitation: HARNESSING THE POWER OF FUSION

    - by mseika
    HARNESSING THE POWER OF FUSION: IMPLEMENT AND EXTEND ORACLE'S NEXT-GENERATION APPLICATIONS TO MEET CHANGING CLIENT NEEDS
    BRUSSELS, BELGIUM, APRIL 23RD - 24TH, 2012
    The pace of business continues to accelerate. Clients demand solutions that not only meet their needs today, but evolve as quickly as markets, competition and technology. Oracle Fusion Applications can help you to anticipate and satisfy your clients' changing needs. Designed for an era of business disruption, they co-exist with existing IT investments, but leverage new technologies (such as mobility and SOA) and breakthrough Cloud computing delivery models as you need them. They also support unprecedented levels of extensibility. To show you how, Oracle's Product Development organization invites you to join an exclusive Fusion Applications presentation and demonstration event for Oracle partners in Europe. This intensive 2-day event will illustrate Fusion Applications capabilities that can enhance ROI for your clients across four major product families:
    - Financials, Procurement and Project Portfolio Management (ERP)
    - Customer Relationship Management (CRM)
    - Human Capital Management (HCM)
    - Supply Chain Management (SCM)
    Led by Oracle Product Development personnel, this event will also demonstrate how to extend the Fusion Applications user experience, data model, business process and reporting using new Functional Setup and Composer technologies. These can help you address unique client needs without impacting future upgrades. This presentation and demonstration event is intended for consulting business development and delivery personnel. Reserve your seats today for the April 23rd - 24th event. To register for this event, CLICK HERE. For further information please contact me at [email protected].
    Best regards,
    Paul Thompson, Senior Director, EMEA Alliances and Solutions Partner Programs
    Markku Rouhiainen, Director, Applications Partner Enablement, Western Europe

    Read the article

  • BizTalk HL7 Receive Pipeline Exception

    - by Paul Petrov
    If you experience the sequence of errors below with BizTalk HL7 MLLP receive ports, you may need to request a hotfix from Microsoft. The Knowledge Base article number is 2454887, but it's still not available on the KB site. The hotfix is recently released and you may need to open a support ticket to get to it. It requires three other hotfixes installed:
    - 970492 (DASM 3.7.502.2)
    - 973909 (additional ACK codes)
    - 981442 (Microsoft.solutions.btahl7.mllp.dll 3.7.509.2)
    If the exceptions below repeatedly appear in the event log, you would most likely be helped by the hotfix:

        Fatal error encountered in 2XDasm. Exception information is Cannot access a disposed object. Object name: 'CEventingReadStream'.

        There was a failure executing the receive pipeline: "BTAHL72XPipelines.BTAHL72XReceivePipeline, BTAHL72XPipelines, Version=1.3.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" Source: "BTAHL7 2.X Disassembler" Receive Port: "ReceivePortName" URI: "IPAddress:portNumber" Reason: Cannot access a disposed object. Object name: 'CEventingReadStream'.

        The Messaging Engine received an error from transport adapter "MLLP" when notifying the adapter with the BatchComplete event. Reason "Object reference not set to an instance of an object."

    We've been through a lot of troubleshooting with Microsoft Product Support and they did a great job finding the issue and releasing a fix.

    Read the article

  • Final agenda - Oracle Exadata & Manageability Partner Community Forum at OpenWorld

    - by Javier Puerta
    Just a few days until Oracle OpenWorld and our Exadata & Manageability Partner Community Forum for EMEA partners. The event will take place on the afternoon of Monday, October 1st, 2012, during the Oracle OpenWorld week. For all partners that have confirmed their attendance to the event, find below the final detailed agenda. I look forward to meeting again in San Francisco with all of you who can attend the event and hope that you will find the sessions useful for your business.

    FINAL AGENDA
    Oracle Exadata & Manageability EMEA Partner Community Forum at Oracle OpenWorld 2012 in San Francisco, USA
    Monday, October 1st, 2012

    Detailed agenda:
    15:30  Reception of participants - networking coffee served
    16:00  Welcome (Hans-Peter Kipfer, VP Engineered Systems, Oracle EMEA)
    16:10  Next challenges in building and managing clouds (Javier Cabrerizo, VP, Global Business Development for Exadata, Oracle Corp.)
    16:30  Partner experience 1 - IT modernization, simplification and cost reduction: the case of a customer in Transportation & Logistics with custom applications and SAP. The Technological Renewal Model built by aligning the innovation of Oracle's Engineered Systems and Capgemini's service delivery excellence has resulted in significant cost savings for the client. (Francisco Bermúdez, Country Leader Infrastructure Services, Capgemini, Spain)
    16:55  Partner experience 2 - The Nvision cloud project: NCloud is an innovative design that combines advanced technical solutions, virtualization, and dynamic management of IT resources, providing a complete "as-a-Service" offering for Infrastructure, Database, Middleware, and Applications. (Dmitry Krasilov, Head of Oracle Competence Center, Nvision Group, Russia)
    17:20  Partner experience 3 - From Exadata Ready to Exadata Optimized: an ISV experience. The experience of WeDo Technologies in the process and benefits that started as an Exadata Ready certification and ended up as Exadata Optimized. (Miguel Alves, Product Business Solutions Manager, WeDo Technologies, Portugal)
    17:45  Next steps in engaging with Oracle (Cengiz Yilmaz, Director Partner Strategy, Oracle EMEA Engineered Systems; Patrick Rood, Manageability Partner Business, Oracle EMEA)
    18:00  Wrap-up & networking

    Time and Location: Monday, October 1st, 2012, 15:30 - 18:00 PST, Grand Hyatt San Francisco, 345 Stockton Street, San Francisco (Conference Theater). It is a 15 minute walk from the OOW Moscone Center; see directions here.

    Read the article

  • What is the difference between Row Level Security and RPD security?

    - by Jeffrey McDaniel
    Row level security (RLS) is a feature of the Oracle Enterprise Edition database. RLS enforces security policies at the database level. This means any query executed against the database will respect the specific security applied through these policies. For P6 Reporting Database, these policies are applied during the ETL process. This gives database users the ability to access data with security enforcement even outside of the Oracle Business Intelligence application. RLS is a new feature of P6 Reporting Database starting in version 3.0. This allows for maximum security enforcement outside of the ETL and inside of Oracle Business Intelligence (Analysis and Dashboards). Policies are defined against the STAR tables based on Primavera project and resource security. RLS is the security method for Oracle Enterprise Edition customers. See previous blogs and the P6 Reporting Database Installation and Configuration guide for more on security specifics. To allow the use of the Oracle Standard Edition database for those with a small database (as defined in the P6 Reporting Database Sizing and Planning guide), an RPD with non-RLS security is also available. RPD security is enforced by adding specific criteria to the physical and business layers of the RPD for those tables that contain projects and resources, and for those fields that are cost fields vs. non-cost fields. With the RPD security method, Oracle Business Intelligence enforces security. RLS is the default security method. Additional steps are required at installation and at ETL run time for those Oracle Standard Edition customers who use RPD security. The RPD method of security enforcement existed from P6 Reporting Database 2.0/P6 Analytics 1.0 up until RLS became available in P6 Reporting Database 3.0/P6 Analytics 2.0.

    Read the article

  • The Return Of __FILE__ And __LINE__ In .NET 4.5

    - by Alois Kraus
    Good things are hard to kill. Two of the most useful predefined compiler macros in C/C++ were __FILE__ and __LINE__, which expand to the compilation unit's file name and the line number where the value is encountered by the compiler. After 4.5 versions of .NET we are on par with C/C++ again. It is of course not a simple compiler-expandable macro, it is an attribute, but it serves exactly the same purpose. Now we get:

        CallerLineNumberAttribute == __LINE__
        CallerFilePathAttribute   == __FILE__
        CallerMemberNameAttribute == __FUNCTION__ (MSVC extension)

    The most important one is CallerMemberNameAttribute, which is very useful to implement the INotifyPropertyChanged interface without the need to hard-code the name of the property anymore. Now you can simply decorate your change method with the new CallerMemberName attribute and you get the property name as a string inserted directly by the C# compiler at compile time.

        public string UserName
        {
            get { return _userName; }
            set
            {
                _userName = value;
                RaisePropertyChanged(); // no more RaisePropertyChanged("UserName")!
            }
        }

        protected void RaisePropertyChanged([CallerMemberName] string member = "")
        {
            var copy = PropertyChanged;
            if (copy != null)
            {
                copy(this, new PropertyChangedEventArgs(member));
            }
        }

    Nice and handy. This was obviously the prime reason to implement this feature in the C# 5.0 compiler. You can repurpose this feature for tracing to get your hands on the method name of your caller, along with other stuff, very fast now. All infos are added during compile time, which is much faster than other approaches like walking the stack. The example on MSDN shows the usage of these attributes:

        public static void TraceMessage(string message,
            [CallerMemberName] string memberName = "",
            [CallerFilePath] string sourceFilePath = "",
            [CallerLineNumber] int sourceLineNumber = 0)
        {
            Console.WriteLine("Hi {0} {1} {2}({3})", message, memberName, sourceFilePath, sourceLineNumber);
        }

    When I think of tracing, I usually want an API which allows me to:
    - Trace method enter and leave
    - Trace messages with a severity like Info, Warning, Error
    When I print a trace message, it is very useful to print out the method and type name as well. So your API must either be able to pass the method and type name as strings, or extract them automatically by walking back one stack frame and fetching the infos from there. The first glaring deficiency is that there is no CallerTypeAttribute yet, because the C# compiler team was not satisfied with its performance.
    A usable trace API might therefore look like:

        enum TraceTypes
        {
            None = 0,
            EnterLeave = 1 << 0,
            Info = 1 << 1,
            Warn = 1 << 2,
            Error = 1 << 3
        }

        class Tracer : IDisposable
        {
            string Type;
            string Method;

            public Tracer(string type, string method)
            {
                Type = type;
                Method = method;
                if (IsEnabled(TraceTypes.EnterLeave, Type, Method))
                {
                }
            }

            private bool IsEnabled(TraceTypes traceTypes, string type, string method)
            {
                // Do checking here if tracing is enabled
                return false;
            }

            public void Info(string fmt, params object[] args) { }
            public void Warn(string fmt, params object[] args) { }
            public void Error(string fmt, params object[] args) { }

            public static void Info(string type, string method, string fmt, params object[] args) { }
            public static void Warn(string type, string method, string fmt, params object[] args) { }
            public static void Error(string type, string method, string fmt, params object[] args) { }

            public void Dispose()
            {
                // trace method leave
            }
        }

    This minimal trace API is very fast but hard to maintain, since you need to pass in the type and method name as hard-coded strings which can change from time to time. But now we have at least CallerMemberName to get rid of the explicit method parameter, right? Not really. Since any acceptably usable trace API should have a method signature like TraceXxx(..., string fmt, params object[] args), we are not able to add additional optional parameters after the args array. If we would put them before the format string, we would need to make them optional as well, which would mean the compiler would need to figure out what our trace message and arguments are (not likely), or we would need to specify everything explicitly just like before. There are ways around this by providing a myriad of overloads which in the end are routed to the very same method, but that is ugly. I am not sure whether nobody inside MS agrees that the above API is reasonable to have, or (more likely) the whole talk about using this feature for diagnostic purposes was not a core feature at all but a simple byproduct of making the life of INotifyPropertyChanged implementers easier. A way around this would be to allow, after the params keyword, another set of optional arguments which are always filled by the compiler, but I do not know if this is an easy one. The thing I am missing much more is the not provided CallerType attribute. But not in the way you would think of. In the API above I did add some filtering based on method and type to stay as fast as possible for types where tracing is not enabled at all. It should be no more expensive than an additional method call and a bool variable check if tracing for this type is enabled at all. The data is tightly bound to the calling type and method and should therefore become part of the static type instance. Since extending the CLR type system for tracing is not something I expect to happen, I have come up with an alternative approach which allows me to attach run-time data to any existing type object in a super fast way. The key to success is the usage of generics.

        class Tracer<T> : IDisposable
        {
            string Method;

            public Tracer(string method)
            {
                Method = method;
                if (TraceData<T>.Instance.Enabled.HasFlag(TraceTypes.EnterLeave))
                {
                }
            }

            public void Dispose()
            {
                if (TraceData<T>.Instance.Enabled.HasFlag(TraceTypes.EnterLeave))
                {
                }
            }

            public static void Info(string fmt, params object[] args) { }

            /// <summary>
            /// Every type gets its own instance with a fresh set of variables to describe the
            /// current filter status.
            /// </summary>
            /// <typeparam name="UsingType"></typeparam>
            internal class TraceData<UsingType>
            {
                internal static TraceData<UsingType> Instance = new TraceData<UsingType>();

                public bool IsInitialized = false; // flag if we need to reinit the trace data in case of reconfigured trace settings at runtime
                public TraceTypes Enabled = TraceTypes.None; // enabled trace levels for this type
            }
        }

    We do not need to pass the type as a string or Type object to the trace API. Instead we define a generic API that accepts the using type as a generic parameter. Then we can create a TraceData static instance which is, due to the nature of generics, a fresh instance for every new type parameter. My tests on my home machine have shown that this approach is as fast as a simple bool flag check. If you have an application with many types using tracing, you do not want to bring the app down by simply enabling tracing for one special, rarely used type. The trace filter performance for the types which are not enabled must therefore be the fastest code path. This approach has the nice side effect that if you store the TraceData instances in one global list, you can reconfigure tracing at runtime safely by simply setting the IsInitialized flag to false. A similar effect can be achieved with a global static Dictionary<Type, TraceData> object, but big hash tables have random memory access semantics, which is bad for cache locality, and you always need to pay for the lookup, which involves hash code generation, an equality check and an indexed array access. The generic version is wicked fast and allows you to add more features to your tracing API with minimal perf overhead. But it is cumbersome to write the generic type argument explicitly every time, and worse, if you refactor code and move parts of it to other classes, it might be that you can no longer configure tracing correctly. I would therefore like to decorate my type with an attribute:

        [CallerType]
        class Tracer<T> : IDisposable

    to tell the compiler to fill in the generic type argument automatically:

        class Program
        {
            static void Main(string[] args)
            {
                using (var t = new Tracer()) // equivalent to new Tracer<Program>()
                {
                }
            }
        }

    That would be really useful and super fast, since you do not need to pass any type object around but still have full type infos at hand. This change would be breaking if another non-generic type existed in the same namespace where now the generic counterpart would be preferred. But this is an acceptable risk in my opinion, since you can today already get conflicts if two generic types of the same name are defined in different namespaces. This would be only a variation of this issue. When you think about this further, you can add more features, like tracing the exception in your Dispose method if the method is left with an exception, with that little trick I wrote about some time ago. You can think of tracing as a super fast and configurable switch to write data to an output destination or to execute alternative actions. With such an infrastructure you can e.g.:
    - Reconfigure tracing at run time.
    - Take a memory dump when a specific method is left with a specific exception.
    - Throw an exception when a specific trace statement is hit (useful for testing error conditions).
    - Execute a passed delegate which e.g. dumps additional state when enabled.
    - Write data to an in-memory ring buffer and dump it when specific events occur (e.g. a method is left with an exception, triggered from outside).
    - Write data to an output device.
    - ...
    This stuff is really useful to have when your code is in production on a mission critical server and you need to find the root cause of sporadic crashes of your application. It could be a buggy graphics card driver which throws access violations into your application (ok, with .NET 4 not anymore, except if you enable a compatibility flag), where you would like to have a minidump, or you may have reached, after two weeks of operation, a state where you need a full memory dump at a specific point in time in the middle of a transaction. On my older machine I get, with this super fast approach, 50 million traces/s when tracing is disabled. When I know that tracing is enabled for this type, I can walk the stack by using StackFrameHelper.GetStackFramesInternal to check further if a specific action or output device is configured for this method, which is about 2-3 times faster than the regular StackTrace class. Even with one String.Format I am down to 3 million traces/s, so performance is not so important anymore, since I do want to do something now. The CallerMemberName feature of the C# 5 compiler is nice, but I would have preferred to get direct access to the MethodHandle and not the stringified version of it. But I really would like to see a CallerType attribute implemented to fill in the generic type argument of the call site, to augment the static CLR type data with run-time data.

    Read the article

  • ArchBeat Link-o-Rama for 11/11/2011

    - by Bob Rhubart
    3 SOA business cases, explained in a 2-minute elevator speech | Joe McKendrick
    Impress your CEO — maybe even the CFO — with some quick examples of SOA making a difference to the business.

    ADF Faces - a logic bomb in the order of bean instantiations | Chris Muir
    Oracle ACE Director Chris Muir shares the details on "an interesting ADF logic bomb" discovered by one of his colleagues.

    5 key trends in cloud computing's future | David Linthicum
    "'Cloud computing' will become just 'computing' at some point," says Linthicum, "but it will still be around as an approach to computing."

    What's New with XBRL? | John O'Rourke
    John O'Rourke shares highlights and key take-aways from the XBRL US Conference in Nashville and the XBRL International Conference in Montreal.

    Siri-ous Business: Enterprise Apps and Global UX Considerations | Ultan O'Broin
    Ultan O'Broin ponders "the enterprise applications user experience (UX) implications of Siri" and "the global UX aspects to the Siri potential."

    These are 11 of my favorite things! | Mike Gerdts
    Gerdts introduces his 11 favorite things about zones in Solaris 11.

    The Power of Social Recommendations | Peter Reiser
    "Do you really want to invest to drive YOUR audience through public social networks," asks Reiser, "or do you want to have YOUR audience on your own social network which is seamlessly integrated with your web properties and business applications?"

    Fourth Key Attribute of Cloud Computing - Provisioning | Tom Laszewski
    "Self-service provisioning of computing infrastructure in a cloud infrastructure is also very desirable as it can cut down the time it takes to deploy new infrastructure for a new application or scale up/down infrastructure for an existing application," says Tom Laszewski.

    Oracle Utilities Application Framework Whitepaper List as of November 2011 | Anthony Shorten
    Anthony Shorten shares an updated and nicely detailed list of Oracle Utilities Application Framework white papers.

    Down from the Tower; Information Integration Conversation; By the Time the Architects get to Phoenix
    This week on the Oracle Technology Network Architect Home Page.

    Read the article

  • can anyone help me through the preparation of Eclipse IDE for android developer in ubuntu 12.04?

    - by csbl
    I'm new to Linux, in this particular case to Ubuntu. I have a small Android project I have to finish by this Friday and I'm still stuck with installing and preparing the development environment. The only thing I did was install the Eclipse IDE. I'm still missing the SDK, Java and anything else that might be needed. Can someone help me through this? It's only because I'm running out of time to develop, or else I would embark on a deeper investigation of this OS. I tried the step to install Android platforms, through Eclipse > Help > Install New Software, and I got the following error messages at the end of the process:

        [2012-06-06 17:35:56 - adb] /home/catia/android-sdks/platform-tools/adb: error while loading shared libraries: libncurses.so.5: cannot open shared object file: No such file or directory
        [2012-06-06 17:35:56 - adb] 'adb version' failed! /home/catia/android-sdks/platform-tools/adb: error while loading shared libraries: libncurses.so.5: cannot open shared object file: No such file or directory
        [2012-06-06 17:35:56 - adb] Failed to parse the output of 'adb version': Standard Output was: Error Output was: /home/catia/android-sdks/platform-tools/adb: error while loading shared libraries: libncurses.so.5: cannot open shared object file: No such file or directory
        [2012-06-06 17:35:56 - adb] /home/catia/android-sdks/platform-tools/adb: error while loading shared libraries: libncurses.so.5: cannot open shared object file: No such file or directory
        [2012-06-06 17:35:56 - adb] 'adb version' failed! /home/catia/android-sdks/platform-tools/adb: error while loading shared libraries: libncurses.so.5: cannot open shared object file: No such file or directory
        [2012-06-06 17:35:56 - adb] Failed to parse the output of 'adb version': Standard Output was: Error Output was: /home/catia/android-sdks/platform-tools/adb: error while loading shared libraries: libncurses.so.5: cannot open shared object file: No such file or directory

    Can anyone help, please??

    Read the article

  • SQL Azure Roadmap gets a little clearer – announcements from TechEd

    - by Eric Nelson
    On Monday at TechEd 2010 we announced new stuff (I like new stuff) that "showcases our continued commitment to deliver value, flexibility and control of data through data cloud services to our customers". Ok, that does sound like marketing speak (and it is), but the good news is there is some meat behind it. We have some decent new features coming, and we also have some clarity on when we will be able to get our hands on those features.

    SQL Azure Business Edition extends to 50 GB – June 28th
    - The SQL Azure Business Edition database is now extending from 10GB to 50GB
    - The new 50GB database size will be available worldwide starting June 28th

    SQL Azure Business Edition subscription offer – August 1st
    - Starting August 1st, we will have a new discounted SQL Azure promotional offer (SQL Azure Development Accelerator Core)
    - More information is available at http://www.microsoft.com/windowsazure/offers/

    Public preview of the Data Sync Service – CTP now
    - Data Sync Service for SQL Azure allows for more flexible control over data by deciding which data components should be distributed across multiple datacenters in different geographic locations, based on your internal policies and business needs.
    - Available as a community technology preview after registering at http://www.sqlazurelabs.com

    SQL Server Web Manager for SQL Azure – CTP this summer
    - SQL Server Web Manager (SSWM) is a lightweight and easy-to-use database management tool for SQL Azure databases, to be offered this summer.

    Access 2010 support for SQL Azure – available now
    - Yey – at last! Microsoft Office 2010 will natively support data connectivity to SQL Azure – we can now start developing those "departmental apps" with the confidence of a highly available SQL store provisioned in seconds.
    - NB: I don't believe we will support any previous versions of Access talking to SQL Azure.

    The pre-announced spatial data support to become live – live now*
    - At MIX in March we announced spatial was coming, and apparently it is now here – although I need to check.

    Related Links:
    - UK based? Sign up at http://ukazure.ning.com
    - SQL Azure Team Blog: http://blogs.msdn.com/b/sqlazure/

    Read the article

  • New SQL Azure Development Accelerator Core promotional offer announced

    - by Eric Nelson
    This is (almost) a straight copy and paste, but it represents an important announcement worthy of a little more "exposure" :-)
    Starting August 1, 2010, we will release a new SQL Azure Development Accelerator Core promotional offer. This new offer will give you the flexibility to purchase commitment quantities of SQL Azure Business Edition databases independent of other Windows Azure platform services at a deeply discounted monthly price. The offer is valid only for a six month term. You may purchase, in 10 GB increments, the amount of our Business Edition relational database that you require (each Business Edition database is capable of storing up to 50 GB). The offer price will be $74.95 per 10 GB per month. This promotional offer represents 25% off of our normal consumption rates. Monthly Business Edition relational database usage exceeding the purchased commitment amount, and usage of other Windows Azure platform services under this offer, will be charged at our normal consumption rates. Please click here for full details of our new SQL Azure Development Accelerator Core offer.
    Related Links:
    - Details of 5GB and 50GB databases have been released
    - http://ukazure.ning.com - UK community site
    - Getting started with the Windows Azure Platform

    Read the article

  • Prognostications for the Future of BI

    - by jacqueline.coolidge(at)oracle.com
    Dashboard Insight has published viewpoints on the future of BI from several vendors' perspectives, including ours, at Business Intelligence Predictions for 2011. We offered: In 2011, businesses will demand more from BI. With intense competitive and economic pressures, it's not enough to be interesting. BI must be actionable and enable people to respond smarter and faster to the opportunities and challenges of the day. Most companies rely on BI to help them understand what's going on in their business. Many are ready to make the leap from "What's going on?" to "What are we going to do about it?" Seamless integration from reporting to what-if analysis and scenario modeling helps businesses decide the right course of action. The integration of BI with SOA and BPEL will deliver the true payoff for BI by enabling companies to initiate business processes directly from their analysis, turning insight into action for a more agile and competitive business. And, I must admit, it's tough to argue with the trends identified by other vendors:
    - Enabling true self-service and engaging a larger community of users
    - Accelerating the adoption of BI on mobile devices
    - Embracing more advanced analytics such as data/text mining and location intelligence
    - Price/performance breakthroughs
    It's preaching to the choir. I look forward to hearing the voices of some customers who are pushing the envelope and will post those stories as I capture them.

    Read the article

  • Big GRC: Turning Data into Actionable GRC Intelligence

    - by Jenna Danko
    While it's no longer headline news that governments have carried out large-scale data-mining programmes aimed at terrorism detection and at identifying other patterns of interest across a wide range of digital data sources, the debate over the ethics of and justification for this action will clearly continue for some time to come. What is becoming clear is that these programmes are a framework for the collation and aggregation of massive amounts of unstructured data and, from this, the creation of actionable intelligence from analyses that allowed the analysts to explore and extract a variety of patterns and then direct resources. This data included audio and video chats, phone calls, photographs, e-mails, documents, internet searches, social media posts and mobile phone logs and connections. Although Governance, Risk and Compliance (GRC) professionals are not looking at the implementation of such programmes, there are many similar GRC "big data" challenges to be faced, and potential lessons to be learned from these high-profile government programmes that can be applied a lot closer to home. For example, how can GRC professionals collect, manage and analyze an enormous and disparate volume of data to create and manage their own actionable intelligence covering hidden signs and patterns of criminal activity; the early, or retrospective, violation of regulations, laws or corporate policies and procedures; emerging risks; and weakening controls? Not exactly the stuff of James Bond to be sure, but it is certainly more applicable to most GRC professionals' day-to-day challenges. So what is big data and how can it benefit the GRC process? Although it often varies, the definition of big data largely refers to the following types of data:
    - Traditional enterprise data – includes customer information from CRM systems, transactional ERP data, web store transactions, and general ledger data.
    - Machine-generated/sensor data – includes Call Detail Records ("CDR"), weblogs and trading systems data.
    - Social data – includes customer feedback streams, micro-blogging sites like Twitter, and social media platforms like Facebook.
    The McKinsey Global Institute estimates that data volume is growing 40% per year, and will grow 44x between 2009 and 2020. But while it's often the most visible parameter, volume of data is not the only characteristic that matters. In fact, according to sources such as Forrester, there are four key characteristics that define big data:
    - Volume. Machine-generated data is produced in much larger quantities than non-traditional data. This is all the data generated by IT systems that power the enterprise. This includes live data from packaged and custom applications – for example, app servers, web servers, databases, networks, virtual machines, telecom equipment, and much more.
    - Velocity. Social media data streams – while not as massive as machine-generated data – produce a large influx of opinions and relationships valuable to customer relationship management, as well as offering early insight into potential reputational risk issues. Even at 140 characters per tweet, the high velocity (or frequency) of Twitter data ensures large volumes (over 8 TB per day) need to be managed.
    - Variety. Traditional data formats tend to be relatively well defined by a data schema and change slowly. In contrast, non-traditional data formats exhibit a dizzying rate of change. Without question, all GRC professionals work in a dynamic environment, and as new services, new products or new business lines are added, or new marketing campaigns executed, new data types are needed to capture the resultant information.
    - Value. The economic value of data varies significantly. Typically, there is good information hidden amongst a larger body of non-traditional data that GRC professionals can use to add real value to the organisation; the greater challenge is identifying what is valuable and then transforming and extracting that data for analysis and action. For example, customer service calls and emails have millions of useful data points and have long been a source of information for GRC professionals. Those calls and emails are critical in helping GRC professionals better identify hidden patterns and implement new policies that can reduce the amount of customer complaints. Now, on a scale and depth far beyond those in place today, all that unstructured call and email data can be captured, stored and analyzed to reveal the reasons for the contact, perhaps with the aggregated customer results cross-referenced against what is being said about the organization, or a similar peer organization, on social media. The organization can then take positive actions: communicating to the market in advance of issues reaching the press, strengthening controls, adjusting risk profiles, changing policy and procedures, and completely minimizing, if not eliminating, complaints and compensation for that specific reason in the future. In this one example of many similar ones, the GRC team(s) has demonstrated real and tangible business value.

    Big Challenges - Big Opportunities
    As pointed out by recent Forrester research, high-performing companies (those that are growing 15% or more year-on-year compared to their peers) are taking a selective approach to investing in big data. "Tomorrow's winners understand this, and they are making selective investments aimed at specific opportunities with tangible benefits where big data offers a more economical solution to meet a need." (Forrsights Strategy Spotlight: Business Intelligence and Big Data, Q4 2012) As pointed out earlier, with the ever-increasing volume of regulatory demands and fines for getting it wrong, limited resource availability and out-of-date or inadequate GRC systems all contributing to a higher cost of compliance and/or a higher risk profile than desired, a big data investment in GRC clearly falls into this category. However, to make the most of big data, organizations must evolve both their business and IT procedures, processes, people and infrastructures to handle these new high-volume, high-velocity, high-variety sources of data and be able to integrate them with the pre-existing company data to be analyzed. GRC big data clearly allows the organization access to, and management over, a huge amount of often very sensitive information that, although it can help create a more risk-intelligent organization, also presents numerous data governance challenges, including regulatory compliance and information security. In addition to client and regulatory demands for better information security and data protection, given the sheer amount of information organizations deal with, the need to quickly access, classify, protect and manage that information can quickly become a key issue from a legal as well as a technical or operational standpoint.
    However, by making information governance processes a bigger part of everyday operations, organizations can make sure data remains readily available and protected.

    The Right GRC & Big Data Partnership Becomes Key
    The "getting it right first time" mantra used in so many companies remains essential for any GRC team that is sponsoring, helping kick-start, or even overseeing a big data project. To make a big data GRC initiative work and get the desired value, partnerships with companies who have a long history of success in delivering GRC solutions, as well as being at the very forefront of technology innovation, become key. Clearly solutions can be built in-house more cheaply than through a vendor, but as has been proven time and time again when it comes to self-built solutions covering AML and fraud, for example, few have been able to scale or adapt appropriately to meet the changing regulations or challenges that GRC teams face on a daily basis. This has led to the creation of the GRC silos that are causing so many headaches today. The solutions that stand out, and that should be explored, are the ones that can seamlessly merge the traditional world of well-known data, analytics and visualization with the new world of seemingly innumerable data sources, utilizing big data technologies to generate new GRC insights right across the enterprise. Ultimately, big data is here to stay, and organizations that embrace its potential and outline a viable strategy, as well as understand and build a solid analytical foundation, will be the ones that are well positioned to make the most of it.

    A Blueprint and Roadmap Service for Big Data
    Big data adoption is first and foremost a business decision. As such, it is essential that your partner can align your strategies, goals, and objectives with an architecture vision and roadmap to accelerate adoption of big data for your environment, as well as establish practical, effective governance that will maintain a well-managed environment going forward.
    Key Activities: While your initiatives will clearly vary, there are some generic starting points the team and organization will need to complete:
    - Clearly define your drivers, strategies, goals, objectives and requirements as they relate to big data
    - Conduct a big data readiness and Information Architecture maturity assessment
    - Develop a future-state big data architecture, including views across all relevant architecture domains: business, applications, information, and technology
    - Provide initial guidance on big data candidate selection for migrations or implementation
    - Develop a strategic roadmap and implementation plan that reflects a prioritization of initiatives based on business impact and technology dependency, and an incremental integration approach for evolving your current state to the target future state in a manner that represents the least amount of risk and impact of change on the business
    - Provide recommendations for practical, effective Data Governance, Data Quality Management, and Information Lifecycle Management to maintain a well-managed environment
    - Conduct an executive workshop with recommendations and next steps
    There is little debate that managing risk and data are the two biggest obstacles encountered by financial institutions.
Big data is here to stay, and risk management certainly is not going anywhere; ultimately, the financial services organizations that embrace big data's potential, outline a viable strategy, and build a solid analytical foundation will be best positioned to make the most of it.

Matthew Long is a Financial Crime Specialist for Oracle Financial Services. He can be reached at matthew.long AT oracle.com.


  • The Silverlight 4 Training Kit and Green Eggs & Ham

    - by Jim Duffy
    Microsoft has released the Silverlight 4 Training Kit, which steps you through the process of constructing Silverlight 4 business applications. "The Silverlight 4 Training Course includes a whitepaper explaining all of the new Silverlight 4 features, several hands-on labs that explain the features, and an 8-unit course for building business applications with Silverlight 4. The business applications course includes 8 modules with extensive hands-on labs as well as 25 accompanying videos that walk you through key aspects of building a business application with Silverlight. Key aspects in this course are working with numerous sandboxed and elevated out-of-browser features, the new RichTextBox control, implicit styling, webcam, drag and drop, multi-touch, validation, authentication, MEF, WCF RIA Services, right mouse click, and much more!" What I think is pretty cool is that there are two ways to access this content: online and offline. Obviously the online version is great when you're sitting at your desk, connected to the web. But what about when you don't have a connection, like when you're located somewhere you won't eat green eggs & ham, like on a train or on a plane, perhaps? :-) You can download the offline version and hope that Sam I Am won't be too distracting while you try to watch the videos or work your way through the labs. :-) Have a day. :-|


  • Leveraging ERP Investments with EPM and BI Solutions

    - by john.orourke(at)oracle.com
    Now that many organizations have implemented ERP systems to automate and integrate their operational processes, IT investments are beginning to shift to the management systems, i.e., the EPM and BI tools and applications that integrate data from multiple transactional systems. These solutions automate and integrate the management processes and enable organizations to achieve "management excellence": becoming smarter, more agile and more aligned than their competitors. In fact, the results of a recent IDC survey indicate that "organizations that have implemented performance management more broadly are nearly four times more likely to be among the most competitive organizations in their industry." One example of an organization that is leveraging its ERP investments with Oracle EPM and BI solutions is General Dynamics. The Business Intelligence Collaborative (BIC) group within General Dynamics' IT organization assists various business units with the implementation, application support, and application hosting of their Business Intelligence and Enterprise Performance Management applications. Attend the Oracle Virtual Trade Show "Spotlight on Customer Success" on February 3rd to hear how General Dynamics is using Oracle Essbase, Hyperion Planning, and Oracle BI to improve its planning, reporting and analysis processes and leverage its investments in Oracle E-Business Suite and other operational systems. During the event, you can also hear about the latest developments and plans for Oracle Applications products, as well as what's coming with Oracle Fusion Applications. Here's a link to the Virtual Trade Show event overview and registration page. The event runs from 8AM - 1PM PST/11AM - 4PM EST, and the EPM session is 10:30 - 11AM PST/1:30 - 2PM EST. http://event.on24.com/event/26/79/15/rt/opFb.html?partnerref=internal I hope you'll join us on February 3rd!


  • Today's Links (6/17/2011)

    - by Bob Rhubart
    Call for Nominations: Oracle Eco-Enterprise Innovation Awards
    Is your organization using Oracle products to reduce your environmental footprint while also reducing costs? If so, submit a nomination for Oracle's Eco-Enterprise Innovation award. These awards are presented to select customers and their partners who are using Oracle products not only to take an environmental lead, but also to reduce costs and improve business efficiency through green business practices.
    Beyond The Data Grid: Coherence, Normalization, Joins, and Linear Scalability | Ben Stopford
    Ben Stopford presents ODC, a highly distributed, in-memory, normalized NoSQL datastore designed for scalability, based on normalized data, a Snowflake Schema, and the Connected Replication pattern.
    Upgrading ALSB services to OSB | John Chin-a-Woeng
    John Chin-a-Woeng walks you through the upgrade from AquaLogic Service Bus (ALSB 3.0) to Oracle Service Bus (OSB 10.3).
    SOA & Middleware: Pinning tasks to a user in BPM 11g | Niall Commiskey
    Commiskey illustrates a scenario for pinning a BPM task to a particular user.
    JDeveloper 11gR2: New option Test WebService in WSDL editor | Lucas Jellema
    The "Test WebService" button in the WSDL Editor in JDeveloper 11gR2 is "just a little feature addition," says Oracle ACE Director Lucas Jellema. "But it can be quite useful all the same."
    Enterprise Business Intelligence 11g Seminar with Mark Rittman
    Oracle ACE Director Mark Rittman conducts a two-day course for Oracle University in Dublin, IE, July 4-5, 2011.
    Data Integration Webcast Series
    Join Oracle experts for a series covering Oracle's data integration solutions. You'll get invaluable information to help boost your data infrastructure so that you can accelerate your business.

