Search Results

Search found 3321 results on 133 pages for 'patterns'.


  • Bridging Two Worlds: Big Data and Enterprise Data

    - by Dain C. Hansen
    The big data world is in vogue in today’s IT conversations. It’s a world of volume, velocity, variety – tantalizing us with its untapped potential. It’s a world of transformational game-changing technologies that have already begun to alter the information management landscape. One of the reasons that big data is so compelling is that it’s a universal challenge that impacts every one of us. Whether it is healthcare, financial, manufacturing, government, or retail - big data presents a pressing problem for many industries: how can so much information be processed so quickly to deliver the ‘bigger’ picture? With big data we’re tapping into new information that didn’t exist before: social data, weblogs, sensor data, complex content, and more. What also makes big data revolutionary is that it turns traditional information architecture on its head, putting into question commonly accepted notions of where and how data should be aggregated, processed, analyzed, and stored. This is where Hadoop and NoSQL come in – new technologies which solve new problems for managing unstructured data. And now for some worst practices that I'd recommend you please not follow: Worst Practice Lesson 1: Throw away everything that you already know about data management and data integration tools, and start completely over. One shouldn’t forget what’s already running in today’s IT. Today’s Business Analytics, Data Warehouses, Business Applications (ERP, CRM, SCM, HCM), and even many social, mobile, and cloud applications still rely almost exclusively on structured data – or what we’d like to call enterprise data. This dilemma is what today’s IT leaders are up against: what are the best ways to bridge enterprise data with big data? And what are the best strategies for dealing with the complexities of these two unique worlds? Worst Practice Lesson 2: Throw away all of your existing business applications … because they don’t run on big data yet. Bridging the two worlds of big data and enterprise data means considering solutions that are complete, based on emerging Hadoop technologies (as well as traditional ones), and poised for success through integrated design tools, integrated platforms that connect to your existing business applications, and support for real-time analytics. Leveraging these types of best practices translates to improved productivity, lowered TCO, IT optimization, and better business insights. Worst Practice Lesson 3: Separate out [and keep separate] your big data sandboxes from all the current enterprise IT systems. Don’t mix sand among playgrounds. We didn't tell you that you wouldn't get dirty doing this. Correlation between the two worlds is key. The real advantage to analyzing big data comes when you can correlate it with the existing data in your data warehouse or your current applications to make sense of the larger patterns. If you have not followed these worst practices 1-3 then you qualify for the first step of our journey: bridging the two worlds of enterprise data and big data. Over the next several weeks we’ll be discussing this topic along with several others around big data as it relates to data integration. We welcome you to join us in the conversation by following us on Twitter at #BridgingBigData or downloading our latest white paper and resource kit: Big Data and Enterprise Data: Bridging Two Worlds.

    Read the article

  • Integrating with Fusion Applications using SOAP web services and REST APIs (Part 1 of 2) by Arvind Srinivasamoorthy

    - by JuergenKress
    Fusion Applications provides several types of interfaces to facilitate integration with other applications within the enterprise and on the cloud. As one of the key integration interfaces, Fusion Applications (FA) supports SOAP services based integration, both inbound and outbound. At this point FA doesn’t provide REST APIs, but they are planned for a future release. It is, however, possible to invoke external REST APIs from FA, which we will discuss. Oracle continues to invest in improving both SOAP and REST based connectivity. The content in this blog is based on features that were available at the time of writing. In this two-part blog, I will cover the following topics briefly. Invoking FA SOAP web services from external applications Identifying the FA SOAP web service to be invoked Sample invocation from an external application Techniques to invoke FA services from an ADF application Invoking external SOAP Web Services from FA (covered in Part 2) Invoking external REST APIs from FA (covered in Part 2) I’ll touch upon some basics, so that you can quickly build a few SOAP/REST interactions with FA. If you do not already have access to an FA instance (on-premise or SaaS), you can request a free 30-day trial of the Oracle Sales Cloud using http://cloud.oracle.com 1. Invoking FA SOAP web services from external applications There are two main types of services that FA exposes - ADF Services - These services allow you to perform CRUD operations on Fusion business objects. For example, Sales Party Service, Opportunity Service etc. Using these services you can typically perform operations such as get, find, create, delete, update etc. on FA objects. These services are typically useful for UI driven integrations such as looking up FA information from external application UIs, or using third-party interfaces to create/update data in FA. They are also used in non-UI driven integration use cases such as initial upload of business or setup data, synchronizing data with external systems, etc. - Composite Services – These services involve more logic than CRUD and often involve human workflows, rules etc. These services perform a business function such as Get Orchestration Order Service and are used when building larger process based integrations with external systems. These services are usually asynchronous in nature and are not typically used for UI integration patterns. 1a. Identifying the FA SOAP web service to be invoked All FA web service metadata is available through an OER instance (Oracle Enterprise Repository) which is publicly available via http://fusionappsoer.oracle.com. This is the starting point for you to discover the services that you are going to work with. You do not need to own an FA account to browse the services using the above UI. You can use the search area on the left to narrow down your search to what you are looking for. For example, you can choose the type (ADF Services or Composite), and you can narrow your search to a specific FA version, Product Family, etc. Read the complete article here. SOA & BPM Partner Community For regular information on Oracle SOA Suite become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Facebook Wiki Technorati Tags: AppAdvantage,SOA Community,Oracle SOA,Oracle BPM,Community,OPN,Jürgen Kress,Arvind Srinivasamoorthy
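    To make the "invoking FA SOAP web services from external applications" discussion above concrete, here is a minimal Java sketch of a dynamic JAX-WS Dispatch client. The WSDL URL, namespaces, QNames, and payload element names are hypothetical placeholders - look up the real values for your pod in Oracle Enterprise Repository - and a real call would also need the WS-Security policy and credentials required by the service, which are omitted here.

    import java.io.StringReader;
    import java.net.URL;
    import javax.xml.namespace.QName;
    import javax.xml.transform.Source;
    import javax.xml.transform.stream.StreamSource;
    import javax.xml.ws.Dispatch;
    import javax.xml.ws.Service;

    public class FaSoapInvocationSketch {
        public static void main(String[] args) throws Exception {
            // Hypothetical WSDL location and QNames -- discover the real ones in OER.
            URL wsdl = new URL("https://your-fa-host/crmCommonSalesParties/SalesPartyService?WSDL");
            QName serviceName = new QName("http://example.com/fa/salesParty/", "SalesPartyService");
            QName portName = new QName("http://example.com/fa/salesParty/", "SalesPartyServiceSoapHttpPort");

            // Dispatch client: no generated stubs, we send the SOAP body payload directly.
            Service service = Service.create(wsdl, serviceName);
            Dispatch<Source> dispatch = service.createDispatch(portName, Source.class, Service.Mode.PAYLOAD);

            // Invented request payload for an imaginary getSalesParty operation.
            String payload = "<ns:getSalesParty xmlns:ns=\"http://example.com/fa/salesParty/types/\">"
                           + "<ns:partyId>300000012345678</ns:partyId>"
                           + "</ns:getSalesParty>";

            Source response = dispatch.invoke(new StreamSource(new StringReader(payload)));
            System.out.println("Response received: " + response);
        }
    }

    In practice you would more often generate a typed client from the WSDL (for example with wsimport) instead of using Dispatch; the sketch only shows the minimum moving parts of an external SOAP invocation.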

    Read the article

  • Big Data – Basics of Big Data Analytics – Day 18 of 21

    - by Pinal Dave
    In yesterday’s blog post we learned the importance of the various components in the Big Data story. In this article we will understand the various analytics tasks we try to achieve with Big Data and list the important tools in the Big Data story. When you have plenty of data around you, what is the first thing that comes to your mind? “What does all this data mean?” Exactly – the same thought comes to my mind as well. I always wanted to know what all the data means and what meaningful information I can get out of it. Most Big Data projects are built to retrieve the various intelligence all this data contains within it. Let us take the example of Facebook. When I look at my friends list on Facebook, I always want to ask many questions such as - On which date do most of my friends have a birthday? What is the favorite film of most of my friends, so I can talk about it and engage them? What is the most liked place for my friends to travel to? Which cousin is the most disliked by my friends in India and the USA, so that when they travel, I do not take them there? There are many more questions I can think of. This illustrates how important it is to have analysis of Big Data. Here are a few of the kinds of analysis you can use with Big Data. Slicing and Dicing: This means breaking down your data into smaller sets and understanding them one set at a time. This also helps to present information in a variety of user-digestible ways. For example, if you have data related to movies, you can slice and dice the data along dimensions like actors, movie length, etc. Real Time Monitoring: This is very crucial in social media when there are events happening and you want to measure the impact while the event is happening. For example, if you are using Twitter during a football match, you can watch what fans are saying about the match on Twitter as the event is happening. Anomaly Prediction and Modeling: If the business is running normally it is alright, but if there are signs of trouble, everyone wants to know about them early. Big Data analysis of various patterns can be very helpful to predict the future. Though it may not always be accurate, certain hints and signals can be very helpful. For example, lots of data can help conclude that lots of rain can increase the sale of umbrellas. Text and Unstructured Data Analysis: Unstructured data is now becoming the norm in the new world and it is a big part of the Big Data revolution. It is very important that we Extract, Transform and Load the unstructured data and make meaningful data out of it. For example, from the analysis of lots of images, one can predict that people like to use certain colors in their clothes in certain months. Big Data Analytics Solutions There are many different Big Data Analytics Solutions out in the market. It is impossible to list all of them so I will list a few of them over here. Tableau – This has to be one of the most popular visualization tools out in the big data market. SAS – A high performance analytics and infrastructure company IBM and Oracle – They have a range of tools for Big Data Analysis Tomorrow In tomorrow’s blog post we will discuss a very important component of the Big Data Ecosystem – the Data Scientist. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL
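    To make the "Slicing and Dicing" idea above concrete, here is a toy Java sketch (not from the original post) that groups a tiny, invented set of movie records by one dimension and then by another; the record type, fields, and data are all made up for illustration.

    import java.util.Arrays;
    import java.util.List;
    import java.util.Map;
    import java.util.stream.Collectors;

    public class SliceAndDiceSketch {
        // A tiny, invented record standing in for one row of movie data.
        record Movie(String title, String actor, int lengthMinutes) {}

        public static void main(String[] args) {
            List<Movie> movies = Arrays.asList(
                new Movie("Film A", "Actor X", 120),
                new Movie("Film B", "Actor X", 95),
                new Movie("Film C", "Actor Y", 142));

            // Slice by one dimension (actor) and summarize each slice with a count.
            Map<String, Long> filmsPerActor = movies.stream()
                .collect(Collectors.groupingBy(Movie::actor, Collectors.counting()));

            // Dice by another dimension: bucket films by running length.
            Map<String, Long> filmsByLength = movies.stream()
                .collect(Collectors.groupingBy(
                    m -> m.lengthMinutes() >= 120 ? "long" : "short",
                    Collectors.counting()));

            System.out.println(filmsPerActor);  // e.g. {Actor X=2, Actor Y=1}
            System.out.println(filmsByLength);  // e.g. {long=2, short=1}
        }
    }

    The same grouping idea is what a Big Data stack does at scale; only the data volume and the engine change.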

    Read the article

  • What packages do I need to compile .tex documents using XeLaTeX?

    - by maria
    Hi, I'm aware of the existence of similar threads on this forum, but none of the replies match my problem. I'm using Ubuntu 10.04 and I hadn't had problems with fonts till I decided to use XeLaTeX instead of LaTeX (cf http://tex.stackexchange.com/questions/12347/typesetting-a-document-using-arabic-script/12358#12358). The problem is that I'm not able to compile any .tex document using XeLaTeX, or to properly display the XeLaTeX documentation. As I've learned thanks to the mentioned thread, XeLaTeX uses the fonts available in the system in general. I was trying to read the fontspec documentation, but it opens as a PDF with a lot of white gaps, and the terminal output (quite long) consists mostly of errors. These are just a few lines of it: Error: Missing language pack for 'Adobe-Japan1' mapping Error: Unknown font tag 'F5.1' Error (24124): No font in show Error: Unknown font tag 'F5.1' I was trying to compile a simple XeLaTeX file: \documentclass{article} \usepackage{fontspec} \setmainfont{Linux Libertine O} \begin{document} Hello World! \end{document} without success. This is the terminal output of the compilation: This is XeTeX, Version 3.1415926-2.2-0.9995.2 (TeX Live 2009/Debian) restricted \write18 enabled. entering extended mode (./ex.tex LaTeX2e <2009/09/24> Babel <v3.8l> and hyphenation patterns for english, usenglishmax, dumylang, nohyphenation, polish, loaded. (/usr/share/texmf-texlive/tex/latex/base/article.cls Document Class: article 2007/10/19 v1.4h Standard LaTeX document class (/usr/share/texmf-texlive/tex/latex/base/size10.clo)) (/usr/share/texmf-texlive/tex/xelatex/fontspec/fontspec.sty (/usr/share/texmf-texlive/tex/generic/ifxetex/ifxetex.sty) (/usr/share/texmf-texlive/tex/latex/tools/calc.sty) (/usr/share/texmf-texlive/tex/latex/xkeyval/xkeyval.sty (/usr/share/texmf-texlive/tex/generic/xkeyval/xkeyval.tex (/usr/share/texmf-texlive/tex/generic/xkeyval/keyval.tex))) (/usr/share/texmf-texlive/tex/latex/base/fontenc.sty (/usr/share/texmf-texlive/tex/xelatex/euenc/eu1enc.def) (/usr/share/texmf-texlive/tex/xelatex/euenc/eu1lmr.fd)) fontspec.cfg loaded. (/usr/share/texmf-texlive/tex/xelatex/fontspec/fontspec.cfg))kpathsea: Invalid fontname `Linux Libertine O', contains ' ' ! Font \zf@basefont="Linux Libertine O" at 10.0pt not loadable: Metric (TFM) file or installed font not found. \zf@fontspec ...ntname \zf@suffix " at \f@size pt \unless \ifzf@icu \zf@set@... l.3 \setmainfont{Linux Libertine O} ? I can't find Linux Libertine O.
Searching for otf with aptitude gives this result: maria@maria-laptop:/etc/fonts$ aptitude search otf p emdebian-rootfs - emdebian root filesystem support p libotf-bin - A Library for handling OpenType Font - utilities p libotf-dev - A Library for handling OpenType Font - development i libotf0 - A Library for handling OpenType Font - runtime p libotf0-dbg - The libotf libraries and debugging symbols p libpam-dotfile - A PAM module which allows users to have more than one password p livecd-rootfs - construction script for the livecd rootfs p makebootfat - Utility to create a bootable FAT filesystem p otf-ipaexfont - Japanese OpenType font, IPAexFont (IPAexGothic/Mincho) p otf-ipaexfont-gothic - Japanese OpenType font, IPAexFont (IPAexGothic) p otf-ipaexfont-mincho - Japanese OpenType font, IPAexFont (IPAexMincho) p otf-ipafont - Japanese OpenType font set, IPAfont p otf-ipafont-gothic - Japanese OpenType font set, IPA Gothic font p otf-ipafont-mincho - Japanese OpenType font set, IPA Mincho font p otf-stix - the Scientific and Technical Information eXchange fonts p otf-thai-tlwg - Thai fonts in OpenType format p otf-yozvox-yozfont - Japanese proportional Handwriting OpenType font p otf2bdf - generate BDF bitmap fonts from OpenType outline fonts p robotfindskitten - Zen Simulation of robot finding kitten So the font in question is not just uninstalled, but not available, if I'm not wrong. Does it mean that I lack some repositories? I was also trying to apply the solution from the thread How do I reinstall default fonts?, but the result is: maria@maria-laptop:~$ sudo apt-get install msttcorefonts [sudo] password for maria: Reading package lists... Done Building dependency tree Reading state information... Done Note, selecting ttf-mscorefonts-installer instead of msttcorefonts ttf-mscorefonts-installer is already the newest version. 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. maria@maria-laptop:~$ It seems that this is not a usual problem for users of XeLaTeX; nobody in the mentioned thread suggested installation of anything other than TeX Live. Thanks in advance

    Read the article

  • A Case for Oracle Fusion Middleware by Lucas Jellema

    - by JuergenKress
    An in-depth look at the interaction of people, processes, and technologies in the transition to a service-oriented architecture. Author's Note This article presents a profile of a fictitious organization, NOPERU. The story of NOPERU as told in this article is actually a collage of the events at some dozen organizations that I have been involved with over the past few years. None of these organizations sport all the characteristics of NOPERU - but all of them have gone through or are going through a similar transition as described here, and all aspects of this article were taken from real life at one or usually many of these organizations. Background NOPERU (National Organization for Permits for Emissions and Resource Usage) is a public organization that continues to transform in terms of its business, organization and technology. Changing business requirements; new interaction channels; and increasing demands for more flexibility, faster throughput and lower costs drive these transformations, while technological evolution and new architecture patterns enable the change. NOPERU chose Oracle Fusion Middleware as the technology platform to implement the new architecture and required applications. This article takes a close look at NOPERU's journey from its origins in the early 1990s as a largely paper-based entity with regional databases and client-server Oracle Forms applications. Its upcoming business objectives are introduced: what is required of the organization and what the higher goals behind these requirements are. The architecture roadmap is described at a high level as well as drilled down to a service-oriented design. Based on the architecture roadmap and the business requirements, NOPERU went through a technology selection to determine the technology stack with which the future would be realized in terms of IT. The article discusses that selection and details the projects subsequently planned (and executed to date). The new architecture and technology as well as the introduction of an Agile development method have had substantial consequences for the IT organization, the processes and individual staff members. The approach NOPERU has adopted with regard to the people and the organization is portrayed. Finally, the article discusses many conclusions that NOPERU has drawn that may benefit itself and other organizations. Introducing NOPERU NOPERU is a national organization charged with issuing permits for excessive emissions (i.e., carbon dioxide) and disproportionate usage of such resources as energy or water. Anyone - whether a commercial enterprise, government agency or private person - who emits or consumes more than what is considered "fair usage" requires such a permit. When someone builds an outdoor heated swimming pool, for example, or open-air terrace heating, such a permit needs to be obtained. When a company installs new, energy-intensive equipment, such as water boilers or deep freezers, it too needs to get a NOPERU permit. Government-sponsored projects at every level that involve consumption of large quantities of fresh water or production of high volumes of emissions must turn to NOPERU for a permit. Without the required license, any interested party can get a court to immediately put a stop to the disputed activity. Read the full article here.
SOA & BPM Partner Community For regular information on Oracle SOA Suite become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Facebook Wiki Mix Forum Technorati Tags: Lucas Jellema,SOA Community,Oracle SOA,Oracle BPM,Community,OPN,Jürgen Kress

    Read the article

  • First Day of Data Integration Track at Oracle OpenWorld 2012

    - by Irem Radzik
    OpenWorld started full speed for us today with a great set of sessions in the Data Integration track. After the exciting keynote session on Oracle Database 12c in the morning, Brad Adelberg, VP of Development for Data Integration products, presented Oracle’s data integration product strategy. His session highlighted the new requirements for data integration to achieve pervasive and continuous access to trusted data. The new requirements and product focus areas presented in this session are: provide access to any data at any source, on premise or in the cloud; enable zero downtime operations and maximum performance; leverage real-time data for accurate business insights; and ensure high quality data is used across the enterprise. During the session Brad walked through how Oracle’s data integration products, Oracle Data Integrator, Oracle GoldenGate, Oracle Enterprise Data Quality, and Oracle Data Service Integrator, deliver on these requirements and how recent product releases build on this strategy. Soon after Brad’s session we heard from a panel of Oracle GoldenGate customers - St. Jude Medical, Equifax, and Bank of America - about how they achieved zero downtime operations using Oracle GoldenGate. The panel presented different use cases of GoldenGate, from Active-Active replication to offloading reporting. Especially St. Jude Medical’s implementation, which involves the alert management system for patients that use their pacemakers, reminded me that in some cases downtime of mission-critical systems can be a matter of life or death. It is very comforting to hear that GoldenGate delivers highly reliable continuous availability for life-saving medical systems. In the afternoon, Nick Wagner from the Product Management team and I followed the customer panel with a review of Oracle GoldenGate 11gR2’s new features. Many questions we received from the audience were about GoldenGate’s new Integrated Capture for Oracle Database and the enhanced Conflict Management features, as well as how GoldenGate compares to Oracle Streams. In addition to giving details on GoldenGate’s unique capability to capture changed data with a direct integration to the Oracle DBMS engine, we reminded the audience that enhancements to Oracle GoldenGate will continue, while Streams will be primarily maintained. Last but not least, Tim Garrod and Ryan Fonnett from Raymond James presented a unified real-time data integration solution using Oracle Data Integrator and GoldenGate for their operational data store (ODS). The ODS supports application services across the enterprise, and providing timely data is a critical requirement. In this solution, Oracle GoldenGate does the log-based change data capture for Oracle Data Integrator’s near real-time data integration between heterogeneous systems. As Raymond James’ ODS supports mission-critical services for their advisors, the project team had to set up this integration environment to be highly available. During the session, Ryan and Tim explained how they use ODI to enable automated process execution and “always-on” integration processes. Their presentation included two demonstrations that focused on CDC patterns deployed with ODI and the automated multi-instance execution and monitoring. We are very grateful to Tim and Ryan for their very well prepared presentation at OpenWorld this year. Day 2 (Tuesday) will also be a busy day in our track.
In addition to the Fusion Middleware Innovation Awards ceremony at 11:45am at Moscone West 3001, we have the following DI sessions: Real-World Operational Reporting Customer Panel, 11:45am, Moscone West 3005; Oracle Data Integrator Product Update and Future Strategy, 1:15pm, Moscone West 3005; High-volume OLTP with Oracle GoldenGate: Best Practices from Comcast, 1:15pm, Moscone West 3005; Everything You Need to Know about Monitoring Oracle GoldenGate, 5pm, Moscone West 3005. If you are at OpenWorld please join us in these sessions. For a full review of the data integration track at OpenWorld please see our Focus-On document.

    Read the article

  • 5 Ways to Determine Mobile Location

    - by David Dorf
    In my previous post, I mentioned the importance of determining the location of a consumer using their mobile phone.  Retailers can track anonymous mobile phones to determine traffic patterns both inside and outside their stores.  And with consumers' permission, retailers can send location-aware offers to mobile phones; for example, a coupon for cereal as you walk down that aisle.  When paying with Square, your location is matched with the transaction.  So there are lots of reasons for retailers to want to know the location of their customers.  But how is it done? I thought I'd dive a little deeper on that topic and consider the approaches to determining location. 1. Tower Triangulation By comparing the relative signal strength from multiple antenna towers, a general location of a phone can be roughly determined to an accuracy of 200-1000 meters.  The more towers involved, the more accurate the location. 2. GPS Using Global Positioning Satellites is more accurate than using cell towers, but it takes longer to find the satellites, it uses more battery, and it doesn't work well indoors.  For geo-fencing applications, like those provided by Placecast and Digby, cell towers are often used to determine if the consumer is nearing a "fence," then the app switches to GPS to determine the actual crossing of the fence. 3. WiFi Triangulation WiFi triangulation is usually more accurate than using towers just because there are so many more WiFi access points (i.e. radios in routers) around. The position of each WiFi AP needs to be recorded in a database and used in the calculations, which is what Skyhook has been doing since 2008.  Another advantage to this method is that it works well indoors, although it usually requires additional WiFi beacons to get the accuracy down to 5-10 meters.  Companies like ZuluTime, Aisle411, and PointInside have been perfecting this approach for retailers like Meijer, Walgreens, and Home Depot. Keep in mind that a mobile phone doesn't have to connect to the WiFi network in order for it to be located.  The WiFi radio in the phone only needs to be on.  Even when not connected, WiFi radios talk to each other to prepare for a possible connection. 4. Hybrid Approaches Naturally the most accurate approach is to combine the approaches described above.  The more available data points, the greater the accuracy.  Companies like ShopKick like to add in acoustic triangulation using the phone's microphone, and NearBuy can use video analytics to increase accuracy. 5. Magnetic Fields The latest approach, and this one is really new, takes a page from the animal kingdom.  As you've probably learned from guys like Marlin Perkins, some animals use the Earth's magnetic fields to navigate.  By recording magnetic variations within a store, then matching those readings with ones from a consumer's phone, location can be accurately determined.  At least that's the approach IndoorAtlas is taking, and the science seems to bear it out.  It works well indoors, and doesn't require retailers to purchase any additional hardware.  Keep an eye on this one.
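    To illustrate the triangulation idea in the roughest possible terms, here is a small Java sketch (purely illustrative, not how any of the vendors above actually do it) that estimates a position as a signal-strength-weighted centroid of known tower or access-point locations; all coordinates and strengths are invented.

    public class WeightedCentroidSketch {
        // A known radio source (tower or WiFi AP): surveyed position plus the
        // relative signal strength the phone observes (higher means closer).
        record Beacon(double lat, double lon, double strength) {}

        // Weight each known position by the observed signal strength.
        static double[] estimatePosition(Beacon[] beacons) {
            double sumW = 0, lat = 0, lon = 0;
            for (Beacon b : beacons) {
                sumW += b.strength();
                lat += b.lat() * b.strength();
                lon += b.lon() * b.strength();
            }
            return new double[] { lat / sumW, lon / sumW };
        }

        public static void main(String[] args) {
            Beacon[] observed = {
                new Beacon(37.7750, -122.4183, 0.7),
                new Beacon(37.7761, -122.4170, 0.2),
                new Beacon(37.7742, -122.4195, 0.1)
            };
            double[] estimate = estimatePosition(observed);
            System.out.printf("Estimated position: %.4f, %.4f%n", estimate[0], estimate[1]);
        }
    }

    Real systems refine this with path-loss models, time-of-arrival measurements, and the hybrid data sources described above, but the weighted-centroid intuition is a common starting point.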

    Read the article

  • JMSContext, @JMSDestinationDefintion, DefaultJMSConnectionFactory with simplified JMS API: TOTD #213

    - by arungupta
    "What's New in JMS 2.0" Part 1 and Part 2 provide comprehensive introduction to new messaging features introduced in JMS 2.0. The biggest improvement in JMS 2.0 is introduction of the "new simplified API". This was explained in the Java EE 7 Launch Technical Keynote. You can watch a complete replay here. Sending and Receiving a JMS message using JMS 1.1 requires lot of boilerplate code, primarily because the API was designed 10+ years ago. Here is a code that shows how to send a message using JMS 1.1 API: @Statelesspublic class ClassicMessageSender { @Resource(lookup = "java:comp/DefaultJMSConnectionFactory") ConnectionFactory connectionFactory; @Resource(mappedName = "java:global/jms/myQueue") Queue demoQueue; public void sendMessage(String payload) { Connection connection = null; try { connection = connectionFactory.createConnection(); connection.start(); Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE); MessageProducer messageProducer = session.createProducer(demoQueue); TextMessage textMessage = session.createTextMessage(payload); messageProducer.send(textMessage); } catch (JMSException ex) { ex.printStackTrace(); } finally { if (connection != null) { try { connection.close(); } catch (JMSException ex) { ex.printStackTrace(); } } } }} There are several issues with this code: A JMS ConnectionFactory needs to be created in a application server-specific way before this application can run. Application-specific destination needs to be created in an application server-specific way before this application can run. Several intermediate objects need to be created to honor the JMS 1.1 API, e.g. ConnectionFactory -> Connection -> Session -> MessageProducer -> TextMessage. Everything is a checked exception and so try/catch block must be specified. Connection need to be explicitly started and closed, and that bloats even the finally block. The new JMS 2.0 simplified API code looks like: @Statelesspublic class SimplifiedMessageSender { @Inject JMSContext context; @Resource(mappedName="java:global/jms/myQueue") Queue myQueue; public void sendMessage(String message) { context.createProducer().send(myQueue, message); }} The code is significantly improved from the previous version in the following ways: The JMSContext interface combines in a single object the functionality of both the Connection and the Session in the earlier JMS APIs.  You can obtain a JMSContext object by simply injecting it with the @Inject annotation.  No need to explicitly specify a ConnectionFactory. A default ConnectionFactory under the JNDI name of java:comp/DefaultJMSConnectionFactory is used if no explicit ConnectionFactory is specified. The destination can be easily created using newly introduced @JMSDestinationDefinition as: @JMSDestinationDefinition(name = "java:global/jms/myQueue",        interfaceName = "javax.jms.Queue") It can be specified on any Java EE component and the destination is created during deployment. JMSContext, Session, Connection, JMSProducer and JMSConsumer objects are now AutoCloseable. This means that these resources are automatically closed when they go out of scope. This also obviates the need to explicitly start the connection JMSException is now a runtime exception. Method chaining on JMSProducers allows to use builder patterns. No need to create separate Message object, you can specify the message body as an argument to the send() method instead. Want to try this code ? Download source code! Download Java EE 7 SDK and install. 
Start GlassFish: bin/asadmin start-domain
Build the WAR (in the unzipped source code directory): mvn package
Deploy the WAR: bin/asadmin deploy <source-code>/jms/target/jms-1.0-SNAPSHOT.war
Then access the application at http://localhost:8080/jms-1.0-SNAPSHOT/index.jsp to send and receive a message using the classic and simplified APIs. A replay of the JMS 2.0 session from the Java EE 7 Launch Webinar provides complete details on what's new in this specification. Enjoy!
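    The post above shows the sending side; as a complement (not part of the original article), here is a sketch of what the receiving side can look like with the same simplified API, using JMSConsumer.receiveBody to pull a message synchronously. The queue name matches the sender's example; the timeout value is arbitrary.

    import javax.annotation.Resource;
    import javax.ejb.Stateless;
    import javax.inject.Inject;
    import javax.jms.JMSContext;
    import javax.jms.Queue;

    @Stateless
    public class SimplifiedMessageReceiver {
        @Inject
        JMSContext context;   // container-managed JMSContext, as in the sender

        @Resource(mappedName = "java:global/jms/myQueue")
        Queue myQueue;

        public String receiveMessage() {
            // Waits up to one second and returns the body of the next message,
            // or null if none arrives; JMSRuntimeException is unchecked.
            return context.createConsumer(myQueue).receiveBody(String.class, 1000);
        }
    }

    The same method chaining style applies: createConsumer() returns a JMSConsumer, and receiveBody() hands back the payload directly, with no TextMessage unwrapping and no checked exceptions.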

    Read the article

  • Everything You Need to Know About Monitoring Oracle GoldenGate

    - by Irem Radzik
    By Joe deBuzna Having over 16 years of database replication experience, with 6 of those split between complex Oracle GoldenGate installations across three continents and researching monitoring requirements for both GoldenGate core replication and the GoldenGate monitoring GUIs, I've seen GoldenGate used and monitored in every way conceivable. And definite patterns have emerged. Next week at OpenWorld, on Tuesday Oct 2nd at 5pm, please come by Moscone West 3005 for the "Everything You Need to Know about Monitoring Oracle GoldenGate" session to hear me discuss how GoldenGate customers are monitoring their implementations today, common methods and tricks, what's new in the GUIs, and what's on the roadmap ahead. As you may have seen in previous blog posts and in our launch webcast, we now have a Plug-in for Oracle Enterprise Manager in addition to the new Oracle GoldenGate Monitor product. For those of you who won't be at OpenWorld, please check out our Management Pack for Oracle GoldenGate data sheet and Oracle GoldenGate 11gR2 New Features white paper to learn more about the new Oracle GoldenGate 11gR2 release. In this latest release we also have enhanced conflict detection and resolution. It is a cornerstone of any Active-Active database replication solution. And in the latest release we just took ours to the next level with built-in optimized resolution routines (no more dependency on sqlexec!). At OpenWorld we have session CON8557 - Best Practice for Conflict Detection & Resolution, 3:30-4:30 on Wed Oct 3rd at Moscone West 3005. Oracle Development Manager Bharath Aleti and I will highlight the most commonly used options and best practices gained from our interaction with numerous customers and consultants. Hope you can join us next week.

    Read the article

  • SQL analytical mash-ups deliver real-time WOW! for big data

    - by KLaker
    One of the overlooked capabilities of SQL as an analysis engine, because we all just take it for granted, is that you can mix and match analytical features to create some amazing mash-ups. As we move into the exciting world of big data these mash-ups can really deliver those "wow, I never knew that" moments. While Java is an incredibly flexible and powerful framework for managing big data there are some significant challenges in using Java and MapReduce to drive your analysis to create these "wow" discoveries. One of these "wow" moments was demonstrated at this year's OpenWorld during Andy Mendelsohn's general keynote session.  Here is the scenario - we are looking for fraudulent activities in our big data stream and in this case we are identifying potentially fraudulent activities by looking for specific patterns. We are using geospatial tagging of each transaction so we can create a real-time fraud-map for our business users. Where we start to move towards a "wow" moment is to extend this basic use of spatial and pattern matching, as shown in the above dashboard screen, to incorporate spatial analytics within the SQL pattern matching clause. This will allow us to compute the distance between transactions. Apologies for the quality of this screenshot… hopefully below you see where we have extended our SQL pattern matching clause to use the location of each transaction and to calculate the distance between each transaction: This allows us to compare the time of the last transaction with the time of the current transaction and see if the distance between the two points is possible given the time frame. Obviously if I buy something in Florida from my favourite bike store (maybe a new carbon saddle for my Trek) and then 5 minutes later the system sees my credit card details being used in Arizona there is a high probability that this transaction in Arizona is actually fraudulent (I am fast on my Trek but not that fast!) and we can flag this up in real-time on our dashboard: In this post I have used the term "real-time" a couple of times and this is an important point and one of the key reasons why SQL really is the only language to use if you want to analyse big data. One of the most important questions that comes up in every big data project is: how do we do analysis? Many enlightened customers are now realising that using Java-MapReduce to deliver analysis does not result in "wow" moments. These "wow" moments only come with SQL because it offers a much richer environment, it is simpler to use and it is faster - which makes it possible to deliver real-time "Wow!". Below is a slide from Andy's session showing the results of a comparison of Java-MapReduce vs. SQL pattern matching to deliver our "wow" moment during our live demo.  You can watch our analytical mash-up "Wow" demo that compares the power of 12c SQL pattern matching + spatial analytics vs. Java-MapReduce here: You can get more information about SQL Pattern Matching on our SQL Analytics home page on OTN, see here http://www.oracle.com/technetwork/database/bi-datawarehousing/sql-analytics-index-1984365.html.  You can get more information about our spatial analytics here: http://www.oracle.com/technetwork/database-options/spatialandgraph/overview/index.html If you would like to watch the full Database 12c OOW presentation see here: http://medianetwork.oracle.com/video/player/2686974264001
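    To spell out the plausibility check the post describes (the demo itself uses the 12c SQL pattern matching clause combined with spatial analytics; this Java sketch only illustrates the logic, with invented coordinates): compute the great-circle distance between consecutive transactions on a card and flag the pair if the implied travel speed is impossible.

    public class VelocityCheckSketch {
        record Txn(double lat, double lon, long epochSeconds) {}

        // Haversine great-circle distance in kilometers.
        static double distanceKm(Txn a, Txn b) {
            double r = 6371.0;
            double dLat = Math.toRadians(b.lat() - a.lat());
            double dLon = Math.toRadians(b.lon() - a.lon());
            double h = Math.pow(Math.sin(dLat / 2), 2)
                     + Math.cos(Math.toRadians(a.lat())) * Math.cos(Math.toRadians(b.lat()))
                     * Math.pow(Math.sin(dLon / 2), 2);
            return 2 * r * Math.asin(Math.sqrt(h));
        }

        // Flag the pair if the implied speed exceeds a plausibility threshold.
        static boolean looksFraudulent(Txn previous, Txn current, double maxKmPerHour) {
            double hours = (current.epochSeconds() - previous.epochSeconds()) / 3600.0;
            if (hours <= 0) return distanceKm(previous, current) > 0; // no elapsed time, different place
            return distanceKm(previous, current) / hours > maxKmPerHour;
        }

        public static void main(String[] args) {
            Txn florida = new Txn(28.54, -81.38, 1_000_000L);   // the bike store purchase
            Txn arizona = new Txn(33.45, -112.07, 1_000_300L);  // 5 minutes later, ~3000 km away
            System.out.println(looksFraudulent(florida, arizona, 900.0)); // true
        }
    }

    In the database this check runs continuously over the transaction stream, which is the point of the post: the pattern matching and the spatial distance calculation stay inside SQL instead of being re-implemented in MapReduce jobs.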

    Read the article

  • The theory of evolution applied to software

    - by Michel Grootjans
    I recently realized the many parallels you can draw between the theory of evolution and evolving software. Evolution is not the proverbial million monkeys typing on a million typewriters, where one of them comes up with the complete works of Shakespeare. We would have noticed by now, since the proverbial monkeys are now blogging on the Internet ;-) One of the main ideas of the theory of evolution is the balance between random mutations and natural selection. Random mutations happen all the time: millions of mutations over millions of years. Most of them are totally useless. Some of them are beneficial to the evolved species. Natural selection favors the beneficially mutated species. Less beneficial mutations die off. The mutated rabbit doesn't have to be faster than the fox. It just has to be faster than the other rabbits. Theory of evolution: Random mutations happen all the time. Most of these mutations are so bad, the new species dies off, or cannot reproduce. Evolving software: Developers write new code all the time. New ideas come up during the act of writing software. The really bad ones don't get past the stage of an idea. The bad ones don't get committed to source control. Theory of evolution: Natural selection favors the beneficially mutated species. Evolving software: Good ideas and new code get discussed in groups during informal peer review. Less-than-good code gets refactored. Enhanced code makes it more readable, maintainable... Theory of evolution: A good set of traits makes the species superior to others. It becomes widespread. Evolving software: A good design tends to make it easier to add new features, easier to understand the current implementations, easier to optimize for performance... thus superior. The best designs get carried over from project to project. They appear in blogs, articles and books about principles, patterns and practices. Of course the act of writing software is deliberate. This can hardly be called random mutations. Though it sometimes might seem that code evolves through a will of its own ;-) Does this mean that evolving software (evolution) is better than a big design up front (creationism)? Not necessarily. It's a false idea to think that a project starts from scratch and everything evolves from there. Everyone carries his experience of what works and what doesn't. Up front design is necessary, but is best kept simple and minimal, just enough to get you started. Let the good experiences and ideas help drive the process, whether they come from you or from others, from past experience or from the most junior developer on your team. Once again, balance is the keyword. Balance design up front with evolution on a daily basis. How do you know what balance is right? Through your own experience of what worked and what didn't (here's evolution again). Notes: The evolution of software can quickly degenerate without discipline. TDD is a discipline that leaves little to chance on that part. Write your test to describe the new behavior. Write just enough code to make it behave as specified. Refactor to evolve the code to a higher standard. The responsibility of good design rests continuously on each developer's shoulders. Promiscuous pair programming helps quickly spread the design to the whole team.

    Read the article

  • Live Event: OTN Architect Day: Cloud Computing - Two weeks and counting

    - by Bob Rhubart
    In just two weeks architects and others will gather at the Oracle Conference Center in Redwood Shores, CA for the first Oracle Technology Network Architect Day event of 2013. This event focuses on Cloud Computing, and features sessions specifically focused on real-world examples of the implementation of cloud computing. When: Tuesday July 9, 2013, 8:30am - 12:30pm Where: Oracle Conference Center, 350 Oracle Pkwy, Redwood City, CA 94065 Register now. It's free! Here's the agenda: 8:30am - 9:00am Registration and Continental Breakfast 9:00am - 9:45am Keynote 21st Century IT | Dr. James Baty, VP, Global Enterprise Architecture Program, Oracle Imagine a time long, long ago. A time when servers were certified and dedicated to specific applications, when anything posted on an enterprise web site was from restricted, approved channels, and when we tried to limit the growth of 'dirty' data and storage. Today, applications are services running in the multi-tenant hybrid cloud. Companies beg their customers to tweet them, friend them, and publicly rate their products. And constantly analyzing a deluge of Internet, social and sensor data is the key to creating the next super-successful product, or capturing an evil terrorist. The old IT architecture was planned, dedicated, stable, controlled, with separate and well-defined roles. The new architecture is shared, dynamic, continuous, XaaS, DevOps. This keynote session describes the challenges and opportunities that the new business / IT paradigms present to the IT architecture and architects. 9:45am - 10:30am Technical Session Oracle Cloud: A Case Study in Building a Cloud | Anbu Krishnaswami, Enterprise Architect, Oracle Building a Cloud can be challenging thanks to the complex requirements unique to Cloud computing and the massive scale typically associated with Cloud. Cloud providers can take an Infrastructure as a Service (IaaS) approach and build a cloud on virtualized commodity hardware, or they can take the Platform as a Service (PaaS) path, a service-oriented approach based on pre-configured, integrated, engineered systems. This presentation uses the Oracle Cloud itself as a case study in the use of engineered systems, demonstrating how the technical design of engineered systems is leveraged for building PaaS and SaaS Cloud services and a Cloud management infrastructure. The presentation will also explore the principles, patterns, best practices, and architecture views provided in Oracle's Cloud reference architecture. 10:30am - 10:45am Break 10:45am - 11:30am Technical Session Database as a Service | Michael Timpanaro-Perrotta, Director, Product Management, Oracle Database Cloud New applications are now commonly built in a Cloud model, where the database is consumed as a service, and many established business processes are beginning to migrate to database as a service (DBaaS). This adoption of DBaaS is made possible by the availability of new capabilities in the database that enable resource pooling, dynamic resource management, model-based provisioning, metered use, and effective quality-of-service controls. This session will examine the catalog of database services at a large commercial bank to understand how these capabilities are enabling DBaaS for a wide range of needs within the enterprise. 11:30am - 12:00pm Panel Q&A Dr. James Baty, Anbu Krishnaswami, and Michael Timpanaro-Perrotta respond to audience questions. Registration is free, but seating is limited, so register now.

    Read the article

  • How should an object that uses composition set its composed components?

    - by Casey
    After struggling with various problems, reading up on component-based systems, and reading Bob Nystrom's excellent book "Game Programming Patterns" - in particular the chapter on Components - I determined that this is a horrible idea:

    // Class intended to be inherited by all objects. Engine uses Objects exclusively.
    class Object : public IUpdatable, public IDrawable
    {
    public:
        Object();
        Object(const Object& other);
        Object& operator=(const Object& rhs);
        virtual ~Object() =0;

        virtual void SetBody(const RigidBodyDef& body);
        virtual const RigidBody* GetBody() const;
        virtual RigidBody* GetBody();

        // Inherited from IUpdatable
        virtual void Update(double deltaTime);

        // Inherited from IDrawable
        virtual void Draw(BITMAP* dest);

    protected:
    private:
    };

    I'm attempting to refactor it into a more manageable system. Mr. Nystrom uses the constructor to set the individual components; CHANGING these components at run-time is impossible. It's intended to be derived and used in derivative classes or factory methods where their constructors do not change at run-time; i.e. his Bjorne object is just a call to a factory method with a specific call to the GameObject constructor. Is this a good idea? Should the object have a default constructor and setters to facilitate run-time changes, or no default constructor and no setters, and instead use a factory method? Given:

    class Object
    {
    public:
        // ...See below for constructor implementation concerns.
        Object(const Object& other);
        Object& operator=(const Object& rhs);
        virtual ~Object() =0;

        // See below for Setter concerns
        IUpdatable* GetUpdater();
        IDrawable* GetRenderer();

    protected:
        IUpdatable* _updater;
        IDrawable* _renderer;

    private:
    };

    Should the components be read-only and passed in to the constructor via:

    class Object
    {
    public:
        // No default constructor.
        Object(IUpdatable* updater, IDrawable* renderer);
        // ...remainder is same as above...
    };

    or should a default constructor be provided and the components set at run-time?

    class Object
    {
    public:
        Object();
        // ...
        void SetUpdater(IUpdatable* updater);
        void SetRenderer(IDrawable* renderer);
        // ...remainder is same as above...
    };

    or both?

    class Object
    {
    public:
        Object();
        Object(IUpdatable* updater, IDrawable* renderer);
        // ...
        void SetUpdater(IUpdatable* updater);
        void SetRenderer(IDrawable* renderer);
        // ...remainder is same as above...
    };

    Read the article

  • Do you want to be an ALM Consultant?

    - by Martin Hinshelwood
    Northwest Cadence is looking for our next great consultant! At Northwest Cadence, we have created a work environment that emphasizes excellence, integrity, and out-of-the-box thinking.  Our customers have high expectations (rightfully so) and we wouldn’t have it any other way!   Northwest Cadence has some of the most exciting customers I have ever worked with and even though I have only been here just over a month I have already: Provided training/consulting for 3 government departments Created and taught courseware for delivering Scrum to teams within a high profile multinational company Started presenting Microsoft's ALM Engagement Program  So if you are interested in helping companies build better software more efficiently, then.. Enquire at [email protected] Application Lifecycle Management (ALM) Consultant An ALM Consultant with a minimum of 8 years of relevant experience with Application Lifecycle Management, Visual Studio (including Visual Studio Team System) and software design is needed. Must provide thought leadership on best practices for enterprise architecture, understand the Microsoft technology solution stack, and have a thorough understanding of enterprise application integration. The ALM Practice Lead will play a central role in designing and implementing the overall ALM Practice strategy, including creating, updating, and delivering ALM courseware and consultancy engagements. This person will also provide project support, deliverables, and quality solutions on Visual Studio Team System that exceed client expectations. Engagements will vary and will involve providing expert training, consulting, mentoring, formulating technical strategies and policies and acting as a “trusted advisor” to customers and internal teams. Sound sense of business and technical strategy required. Strong interpersonal skills as well as solid strategic thinking are key. The ideal candidate will be capable of envisioning the solution based on the early client requirements, communicating the vision to both technical and business stakeholders, leading teams through implementation, as well as training, mentoring, and hands-on software development. The ideal candidate will demonstrate successful use of both agile and formal software development methods, enterprise application patterns, and effective leadership on prior projects. Job Requirements Minimum Education: Bachelor’s Degree (computer science, engineering, or math preferred). Locale / Travel: The Practice Lead position requires estimated 50% travel, most of which will be in the Continental US (a valid national Passport must be maintained).  This is a full time position and will be based in the Kirkland office. Preferred Education: Master’s Degree in Information Technology or Software Engineering; Premium Microsoft Certifications on .NET (MCSD) or MCPD or relevant experience; Microsoft Certified Trainer (MCT) or relevant experience. Minimum Experience and Skills: 7+ years experience with business information systems integration or custom business application design and development in a professional technology consulting, corporate MIS or software development environment. Essential Duties & Responsibilities: Provide training, consulting, and mentoring to organizations on topics that include Visual Studio Team System and ALM. Create content, including labs and demonstrations, to be delivered as training classes by Northwest Cadence employees. Lead development teams through the complete ALM and/or Visual Studio Team System solution. 
Be able to communicate in detail how a solution will integrate into the larger technical problem space for large, complex enterprises. Define technical solution requirements. Provide guidance to the customer and project team with respect to technical feasibility, complexity, and level of effort required to deliver a custom solution. Ensure that the solution is designed, developed and deployed in accordance with the agreed upon development work plan. Create and deliver weekly status reports of training and/or consulting progress. Engagement Responsibilities: · Provide a strong desire to provide thought leadership related to technology and to help grow the business. · Work effectively and professionally with employees at all levels of a customer’s organization. · Have strong verbal and written communication skills. · Have effective presentation, organizational and planning skills. · Have effective interpersonal skills and ability to work in a team environment. Enquire at [email protected]

    Read the article

  • SOA Community Newsletter May 2014

    - by JuergenKress
    Registration for the Fusion Middleware Summer Camps 2014 is open – register asap for one of our bootcamps August 4th – 8th 2014 in Lisbon. Please read the details and prerequisites carefully before you register. We expect that, as in the past, the conference will be booked out soon! If you can’t make it to Lisbon, attend our free on-demand SOA Suite 11c Bootcamp or the Managing the Complexity of IoT online training. With more than 5000 customers, SOA Suite Achieves Significant Customer Adoption and Industry Recognition. Thanks to all our SOA Specialized partners for making our joint SOA customers successful! As a summary of the Industrial SOA series we published the Podcast Show Notes: SOA and Cloud - Where's This Relationship Going? Make sure you use the Oracle Demo Systems for your customer presentations. The demo systems are hosted by Oracle and include complete scenarios based on the latest Middleware version, like the new B2B SOA Suite Demo System! For local presentations without fast internet use the SOA/BPM 11.1.1.7.1 Virtual Machine and Case Management Sample. At our SOA Community Workspace (SOA Community membership required) you can get new IoT presentations for Location Based Offers for Banking & Whitepaper and an online Webcast & Utility presentation. In this newsletter you will find many articles about OSB: OSB 11g – A Hands-on Tutorial & Using Split-Joins in OSB Services for parallel processing of messages & OSB, Service Callouts and OQL & Working with Oracle Security Token Service. Thanks for sharing all the additional SOA articles within the community: How to configure Oracle SOA/BPM task auto release & Controlling BPEL process flow at runtime & Upgrading to Oracle SOA Suite 11g PS6 (11.1.1.7)? Do this. & BPEL and BPM's performance monitoring using DMS & SOA 11g - Create RESTful Service In Oracle SOA & Wrong timezone causes TopLink warning in SOA suite. The highlight of the BPM and ACM section is the IDC BPM vendor report. The new bundle patch including the ACM UI is now available. If you want to learn more about ACM, get the ACM training material at our SOA Community Workspace (SOA Community membership required). A great demo for your next BPM presentation is the BPM iPad app. It’s simple: Mobile BPM is Not An Option. It’s a Necessity. Thanks for sharing all the additional BPM articles within the community: BPM update adds Case Management Web Interface and REST APIs & Implementing deadline functionality with Oracle Adaptive Case Management & BPM 11g Timeout Heuristics & Humantask Assignment: Names and Expressions Assignment via Rules. In our last section, Architecture, it is all about design. Usability is a key factor for customer satisfaction, so it is worth spending some time reading the Simplified User Experience Design Patterns eBook. A great blueprint for your project! See you in Lisbon! To read the newsletter please visit www.tinyurl.com/soaNewsMay2014 (OPN Account required) To become a member of the SOA Partner Community please register at http://www.oracle.com/goto/emea/soa (OPN account required) If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Facebook Wiki Mix Forum Technorati Tags: newsletter,SOA Community newsletter,SOA Community,Oracle,OPN,Jürgen Kress

    Read the article

  • SQL SERVER – Puzzle #1 – Querying Pattern Ranges and Wild Cards

    - by Pinal Dave
    Note: Read at the end of the blog post how you can win one of five copies of Joes 2 Pros Book #1 and a surprise gift. I have been blogging for almost 7 years and every other day I receive questions about Querying Pattern Ranges. The most common way to solve the problem is to use wildcards. However, not everyone knows how to use wildcards properly. SQL Queries 2012 Joes 2 Pros Volume 1 – The SQL Queries 2012 Hands-On Tutorial for Beginners: Book On Amazon | Book On Flipkart. Learn SQL Server – get all five parts as a combo kit: Kit on Amazon | Kit on Flipkart. Many people know wildcards are great for finding patterns in character data. There are also some special sequences with wildcards that can give you even more power. This series from SQL Queries 2012 Joes 2 Pros® Volume 1 will show you some of these cool tricks. All supporting files are available with a free download from the www.Joes2Pros.com web site. This example is from the SQL 2012 series, Volume 1, in the file SQLQueries2012Vol1Chapter2.2Setup.sql. If you need help setting up, look in the “Free Videos” section on Joes2Pros under “Getting Started”, called “How to install your labs”.
    Querying Pattern Ranges
    The % wildcard character represents any number of characters of any length. Let’s find all first names that end in the letter ‘A’. By using the percent ‘%’ sign with the letter ‘A’, we achieve this goal using the code sample below:
    SELECT * FROM Employee WHERE FirstName LIKE '%A'
    To find all FirstName values beginning with the letters ‘A’ or ‘B’ we can use two predicates in our WHERE clause, separating them with the OR operator. Finding names beginning with an ‘A’ or ‘B’ is easy, and this works fine until we want a larger range of letters, as in the example below for ‘A’ through ‘K’:
    SELECT * FROM Employee WHERE FirstName LIKE 'A%' OR FirstName LIKE 'B%' OR FirstName LIKE 'C%' OR FirstName LIKE 'D%' OR FirstName LIKE 'E%' OR FirstName LIKE 'F%' OR FirstName LIKE 'G%' OR FirstName LIKE 'H%' OR FirstName LIKE 'I%' OR FirstName LIKE 'J%' OR FirstName LIKE 'K%'
    The previous query does find FirstName values beginning with the letters ‘A’ through ‘K’. However, when a query requires a large range of letters, the LIKE operator has an even better option. Since the first letter of the FirstName field can be ‘A’, ‘B’, ‘C’, ‘D’, ‘E’, ‘F’, ‘G’, ‘H’, ‘I’, ‘J’ or ‘K’, simply list all these choices inside a set of square brackets followed by the ‘%’ wildcard, as in the example below:
    SELECT * FROM Employee WHERE FirstName LIKE '[ABCDEFGHIJK]%'
    A more elegant example of this technique recognizes that all these letters are in a continuous range, so we really only need to list the first and last letter of the range inside the square brackets, followed by the ‘%’ wildcard allowing for any number of characters after the first letter in the range (a sketch of this range form appears at the end of this excerpt). Note: A predicate that uses a range will not work with the ‘=’ operator (equals sign). It will neither raise an error, nor produce a result set.
    --Bad query (will not error or return any records)
    SELECT * FROM Employee WHERE FirstName = '[A-K]%'
    Question: You want to find all first names that start with the letters A-M in your Customer table and end with the letter Z. Which SQL code would you use?
    a. SELECT * FROM Customer WHERE FirstName LIKE 'm%z'
    b. SELECT * FROM Customer WHERE FirstName LIKE 'a-m%z'
    c. SELECT * FROM Customer WHERE FirstName LIKE 'a-m%z'
    d. SELECT * FROM Customer WHERE FirstName LIKE '[a-m]%z'
    e. SELECT * FROM Customer WHERE FirstName LIKE '[a-m]z%'
    f. SELECT * FROM Customer WHERE FirstName LIKE '[a-m]%z'
    g. SELECT * FROM Customer WHERE FirstName LIKE '[a-m]z%'
    Contest: Leave a valid answer before June 18, 2013 in the comment section. 5 winners will be selected from all the valid answers and will receive Joes 2 Pros Book #1. One lucky person will get a surprise gift from Joes 2 Pros. The contest is open for all the countries where Amazon ships the book (USA, UK, Canada, India and many others). Special Note: Read all the options before you provide a valid answer, as there is a small trick hidden in the answers. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Joes 2 Pros, PostADay, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
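    For reference, a minimal sketch of the range form described in the excerpt above (listing only the first and last letters of the range inside the brackets), assuming the same Employee table used in the earlier samples:
    -- Range wildcard sketch: matches any FirstName whose first letter falls
    -- anywhere from 'A' through 'K', followed by any number of characters.
    SELECT * FROM Employee WHERE FirstName LIKE '[A-K]%'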

    Read the article

  • Social Technology and the Potential for Organic Business Networks

    - by Michael Snow
    Guest Blog Post by: Michael Fauscette, IDC. There has been a lot of discussion around the topic of social business, or social enterprise, over the last few years. The concept of applying emerging technologies from the social Web, combined with changes in processes and culture, has the potential to provide benefits across the enterprise over a wide range of operations impacting employees, customers, partners and suppliers. Companies are using social tools to build out enterprise social networks that provide, among other things, a people-centric collaborative and knowledge sharing work environment which over time can break down organizational silos. On the outside of the business, social technology is adding new ways to support customers, market to prospects and customers, and even support the sales process. We’re also seeing new ways of connecting partners to the business that increase collaboration and innovation. All of the new "connectivity" is, I think, leading businesses to a business model built around the concept of the network or ecosystem instead of the old "stand-by-yourself" approach. So, if you think about businesses as networks in the context of all of the other technical and cultural change factors that we're seeing in the new information economy, you can start to see that there’s a lot of potential for co-innovation and collaboration that was very difficult to arrange before. This networked business model, or what I've started to call “organic business networks,” is the business model of the information economy. The word “organic” could be confusing, but when I use it in this context, I’m thinking it has similar traits to organic computing. Organic computing is a computing system that is self-optimizing, self-healing, self-configuring, and self-protecting. More broadly, organic models are generally patterns and methods found in living systems used as a metaphor for non-living systems. Applying an organic model, organic business networks are networks that represent the interconnectedness of the emerging information business environment. Organic business networks connect people, data/information, content, and IT systems in a flexible, self-optimizing, self-healing, self-configuring, and self-protecting system. People are the primary nodes of the network, but the other nodes — data, content, and applications/systems — are no less important. A business built around the organic business network business model would incorporate the characteristics of a social business, but go beyond the basics—i.e., use social business as the operational paradigm, but also use organic business networks as the mode of operating the business. The two concepts complement each other: social business is the “what,” and the organic business network is the “how.” An organic business network lets the business work outside of traditional organizational boundaries and become the continuously adapting implementation of an optimized business strategy. Value creation can move to the optimal point in the network, depending on strategic influencers such as the economy, market dynamics, customer behavior, prospect behavior, partner behavior and needs, supply-chain dynamics, predictive business outcomes, etc. An organic business network driven company is the antithesis of a hierarchical, rigid, reactive, process-constrained, and siloed organization.
Instead, the business can adapt to changing conditions, leverage assets effectively, and thrive in a hyper-connected, globally competitive, information-driven environment. To hear more on this topic, I’ll be presenting in the next webcast of the Oracle Social Business Thought Leader Webcast Series, “Organic Business Networks: Doing Business in a Hyper-Connected World,” this coming Thursday, June 21, 2012, 10:00 AM PDT – Register here

    Read the article

  • Multitenant Design for SQL Azure: White Paper Available

    - by Herve Roggero
    Cloud computing is about scaling out all your application tiers, from the web application to the database layer. In fact, the whole promise of Azure is to pay for just what you need. You need more IIS servers? No problemo... just spin up another web server. You expect to double your storage needs for Azure Tables? No problemo; you are covered there too... just pay for your storage needs. But what about the database tier, SQL Azure? How do you add new databases easily, and transparently, so that your application simply uses more of SQL Azure if it needs to? Without changing a single line of code? And what if you need to scale back down? Welcome to the world of database scalability. There are many terms that describe database scalability, including data federation, multitenant designs, and even NoSQL, depending on the technical solution you are implementing. Because SQL Azure is a transactional database system, NoSQL is not really an option. However, data federation and multitenant designs offer some very interesting scalability options that are worth considering. Data federation, a feature of SQL Azure that will be offered in the future, offers very interesting capabilities available natively on the SQL Azure platform. More to come in a few weeks... Multitenant designs, on the other hand, are design practices and technologies designed to help you reach flexible scalability options not available otherwise. The first incarnation of such a method was made available on CodePlex as an open source project (http://enzosqlshard.codeplex.com). This project was an attempt to provide a sharding library for educational purposes. All that sounds really cool... and really esoteric... almost a form of database "voodoo"... However, after being on multiple Azure projects I am starting to see a real need. Customers want to be able to free themselves from the database tier, so that if they have 10 new customers tomorrow, all they need to do is add 2 more SQL Azure instances. It's that simple. How you achieve this, and suggested application design guidelines, are available in a white paper I just published. The white paper offers two primary sections. The first section describes the business and technical problem at hand, and how to classify it according to specific design patterns. For example, I discuss compressed shards through schema separation. The second section offers a method for addressing the needs of a multitenant design using a new library, the big brother of the CodePlex project mentioned previously (which I created earlier this year), complete with a management interface and such. A Beta of this platform will be made available within weeks, as soon as the documentation is ready. I would like to ask you to drop me a quick email at [email protected] if you are going to download the white paper. It's not required, but it would help me get in touch with you for feedback. You can download this white paper here: http://www.bluesyntax.net/files/EnzoFramework.pdf . Thank you, and I am looking for feedback, thoughts and implementation opportunities.
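    As a rough, hypothetical sketch of the schema-separation idea mentioned above (the tenant and table names are invented for illustration; the white paper describes the actual design), a multitenant database can give each tenant its own schema holding an identical set of tables:
    -- One schema per tenant inside a single SQL Azure database.
    CREATE SCHEMA Tenant001;
    GO
    CREATE SCHEMA Tenant002;
    GO
    -- Identical table shapes in each tenant schema.
    CREATE TABLE Tenant001.Orders (OrderId INT PRIMARY KEY, Amount MONEY);
    CREATE TABLE Tenant002.Orders (OrderId INT PRIMARY KEY, Amount MONEY);
    -- The application resolves the schema name from the tenant context at runtime,
    -- so onboarding a new customer means adding a schema (or another database)
    -- rather than changing application code.
    SELECT OrderId, Amount FROM Tenant001.Orders;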

    Read the article

  • Analytics in an Omni-Channel World

    - by David Dorf
    Retail has been around ever since mankind started bartering.  The earliest transactions were very specific to the individuals buying and selling, then someone had the bright idea to open a store.  Those transactions were a little more generic, but the store owner still knew his customers and what they wanted.  As the chains rolled out, customer intimacy was sacrificed for scale, and retailers began to rely on segments and clusters.  But thanks to the widespread availability of data and the technology to convert said data into information, retailers are getting back to details. The retail industry is following a maturity model for analytics that has progressed through five stages, each delivering more value than the previous. Store Analytics: Brick-and-mortar retailers (and pure-play catalogers as well) that collect anonymous basket-level data are able to get some sense of demand to help with allocation decisions.  Promotions and foot-traffic can be measured to understand marketing effectiveness, and perhaps focus groups can help test ideas.  But decisions are influenced by the majority, using faceless customer segments and aggregated industry data points.  Loyalty programs help a little, but in many cases the cost outweighs the benefits. Web Analytics: The Web made it much easier to collect data on specific, yet still anonymous, consumers using cookies to track visits. Clickstreams and product searches are analyzed to understand the purchase journey, gauge demand, and better understand up-selling opportunities.  Personalization begins to allow retailers to target consumers with recommendations. Cross-Channel Analytics: This phase is a minor one, but it is where most retailers probably sit today.  They are able to use information from one channel to bolster activities in another. However, there are technical challenges combining data silos, so it's not an easy task.  But for those retailers that are able to perform analytics on both sources of data, the pay-off is pretty nice.  Revenue per customer begins to go up as customers have a better brand experience. Mobile & Social Analytics: Big data technologies are enabling a 360-degree view of the customer by incorporating psychographic data from social sites alongside traditional demographic data.  Retailers can track individual preferences, opinions, hobbies, etc. in order to understand a consumer's motivations.  Using mobile devices, consumers can interact with brands anywhere, anytime, accessing deep product information and reviews.  Mobile, combined with a loyalty program, presents an opportunity to put shopping into geographic context, understanding paths to the store and patterns within the store, and to be an always-on advertising conduit. Omni-Channel Analytics: All this data, along with the proper technology, represents a new paradigm in which the clock is turned back and retail becomes very personal once again.  Rich, individualized data better illuminates demand, allows for highly localized assortments, and helps tailor up-selling.  Interactions with all channels help build an accurate profile of each consumer, and allow retailers to tailor the retail experience to meet the heightened expectations of today's sophisticated shopper.  And of course this culminates in greater customer satisfaction and business profitability.

    Read the article

  • How to ace Skype Interviews

    - by FelixWehmeyer
    Many companies these days opt to include a Skype interview in the recruitment process, as it comes close to a face-to-face interview without the time and costs involved for both the company and the candidate. In some cases during the recruitment process at Oracle you also might be asked to conduct a Skype interview. To help you get started with this, we researched some websites to give you several tips and tricks. What most of the bloggers say about this topic is collected in this article to help you prepare. It is all about technology: The bit that can make a Skype interview more complicated than a face-to-face or phone interview is the fact that you are using additional technology. Always check the video and audio capabilities of your computer to make sure they work properly. Be prepared for connections to be limited during the interview. Using a webcam can also be confusing if you do not have a lot of experience using it. Make sure you look at the camera and not the monitor to avoid the impression that you are looking away. Practice: If you do not feel comfortable using the camera, do a mock interview with a friend or family member before you have the actual interview. Be aware that facial expressions or reactions come across differently on a monitor, so make sure to practice how you come across during the interview. Good lighting in the room also helps you look your best for the interviewer. You and your room: Dress code, as in any face-to-face interview, is important to think about. Dress the same way as you would for face-to-face interviews and avoid patterns or informal clothing. Another tip is to be aware of your surroundings. Make sure the room you use looks good on camera: keep it neat and tidy, and think about how the walls look behind you. Also make sure you do not get distracted during the interview by anyone or anything, as this will directly have an impact on your interview and your ability to focus and concentrate. What is in a name: What goes for any account that you share during the recruitment process, either your email address or Skype name, is to make sure it comes across as professional. Try to avoid using nicknames or strange words in your accounts; stick to using a first name – last name or an abbreviation of the same. If you would like to read more about this topic, have a look at the links below, which we used as inspiration for this blog article. 7 Deadly Skype Interview Sins is fun to read and gives you some good advice to keep in mind. ·         http://www.inc.com/guides/201103/4-tips-for-conducting-a-job-interview-using-skype.html ·         http://blog.simplyhired.com/2012/05/5-tips-to-a-great-skype-interview.html ·         http://www.cnn.com/2011/LIVING/07/11/skype.interview.tips.cb/index.html ·         http://www.ehow.com/how_5648281_prepare-skype-interview.html

    Read the article

  • Gartner PCC: A Shovel & Some Ah-Ha's

    - by kellsey.ruppel
    When Gartner Vice President and leading analyst Whit Andrews kicked off the Gartner Portals, Content & Collaboration Summit on Monday, March 12 at the Gaylord Palms in Orlando, FL by bringing a shovel to the stage, eyebrows raised and a few thoughts went through my head. Either this guy plans to go help the construction workers outside construct that new pool at the Gaylord, or he took a wrong turn and is at the wrong conference. Oh, and how did he get that shovel through airport security? As Whit explained more, his objective became clearer: take everything anyone has ever told you about portals and throw it out the window, as portals have evolved and times they are most certainly changing. The future Web is here, available not only on browsers but also via a broad spectrum of access points, including automobiles, consumer electronics and more and more mobile devices. Not merely prevalent, the future Web is also multimedia-driven and operates in real time, driven by mobility, social media, streaming video and other dynamic services. Applications and user experiences are in the midst of an evolution — from the early, simple mobile Web models to today’s Web 2.0 mobile apps and, ultimately, to a world of predominantly Web apps. Additionally, cloud services will forever change how portals and user experience are designed, built, delivered, sourced and managed. So what does this mean for you? Today’s organizations need software that will enable them to not just do their jobs, but to do so in a way that is familiar and easy for them.  What does this mean for IT? Use software and technology as an enabler, not as a roadblock. Overall, we had a great week in Orlando learning about how to improve the user experience, manage content explosion, launch social initiatives, transition to mobile environments and understand cloud and SaaS options.  We had some great conversations throughout the conference and at the Oracle booth. Lots of demonstrations were given of Oracle WebCenter Sites and Oracle Social Network. And as Christie mentioned earlier this week, our Vice President of Product Management and Strategy for WebCenter, Loren Weinberg, presented on the topic of customer engagement and talked about how organizations' relationships with their customers have fundamentally changed today and the resulting impact that has on their priorities.  Loren also talked about the importance of customer engagement, why that matters now more than ever, and what you can do to help your company or organization succeed in this new world. The question asked in every keynote and session was a simple one: What is your “ah-ha” moment? I personally had quite a few, some of which I’ve captured below. 70% of internal social initiatives eventually fail. By 2014, refusing to communicate with consumers via social media will be as harmful as ignoring emails/phone calls is today. Customer engagement = multi-channel + social & interactive + personal & relevant + optimized. If people choose to talk about your product/company/service, it's because it's remarkable. -- Seth Godin's keynote (one of the highlights of the conference!) The Web will become the primary method used for delivering content and applications to mobile devices. By 2015, 20% of smart phone users worldwide will conduct commerce using context-enriched services on a weekly basis.  86% of customers will pay more for a better customer experience. 6 P's of Quality User Experience: Product, enabled by People, Patterns, Process, Profit, Priorities.
Did you attend the Gartner Summit? What were your ah-ha moments?

    Read the article

  • Updated SOA Documents now available in ITSO Reference Library

    - by Bob Rhubart
    Nine documents within the IT Strategies from Oracle (ITSO) reference library have recently been updated. (Access to the ITSO collection is free to registered Oracle.com members -- and that membership is free.) All nine documents fall within the Service Oriented Architecture section of the ITSO collection, and cover the following topics: SOA Practitioner Guides Creating an SOA Roadmap (PDF, 54 pages, published: February 2012) The secret to successful SOA is to build a roadmap that can be successfully executed. SOA offers an opportunity to adopt an iterative technique to deliver solutions incrementally. This document offers a structured, iterative methodology to help you stay focused on business results, mitigate technology and organizational risk, and deliver successful SOA projects. A Framework for SOA Governance (PDF, 58 pages, published: February 2012) Successful SOA requires a strong governance strategy that designs-in measurement, management, and enforcement procedures. Enterprise SOA adoption introduces new assets, processes, technologies, standards, roles, etc. which require application of appropriate governance policies and procedures. This document offers a framework for defining and building a proper SOA governance model. Determining ROI of SOA through Reuse (PDF, 28 pages, published: February 2012) SOA offers the opportunity to save millions of dollars annually through reuse. Sharing common services intuitively reduces workload, increases developer productivity, and decreases maintenance costs. This document provides an approach for estimating the reuse value of the various software assets contained in a typical portfolio. Identifying and Discovering Services (PDF, 64 pages, published: March 2012) What services should we build? How can we promote the reuse of existing services? A sound approach to answering these questions is a primary measure for the success of an SOA initiative. This document describes a pragmatic approach for collecting the necessary information for identifying proper services and facilitating service reuse. Software Engineering in an SOA Environment (PDF, 66 pages, published: March 2012) Traditional software delivery methods are too narrowly focused and need to be adjusted to enable SOA. This document describes an engineering approach for delivering projects within an SOA environment. It identifies the unique software engineering challenges faced by enterprises adopting SOA and provides a framework to remove the hurdles and improve the efficiency of the SOA initiative. SOA Reference Architectures SOA Foundation (PDF, 70 pages, published: February 2012) This document describes the key tenets for SOA design, development, and execution environments. Topics include: service definition, service layering, service types, the service model, composite applications, invocation patterns, and standards. SOA Infrastructure (PDF, 86 pages, published: February 2012) Properly architected, SOA provides a robust and manageable infrastructure that enables faster solution delivery. This document describes the role of infrastructure and its capabilities. Topics include: logical architecture, deployment views, and Oracle product mapping. SOA White Papers and Data Sheets Oracle's Approach to SOA (white paper) (PDF, 14 pages, published: February 2012) Oracle has developed a pragmatic, holistic approach, based on years of experience with numerous companies, to help customers successfully adopt SOA and realize measurable business benefits.
This executive datasheet and whitepaper describe Oracle's proven approach to SOA. Oracle's Approach to SOA (data sheet) (PDF, 3 pages, published: March 2012) SOA adoption is complex and success is far from assured. This is why Oracle has developed a pragmatic, holistic approach, based on years of experience with numerous companies, to help customers successfully adopt SOA and realize measurable business benefits. This data sheet provides an executive overview of Oracle's proven approach to SOA.

    Read the article

  • Oracle Fusion Supply Chain Management (SCM) Designs May Improve End User Productivity

    - by Applications User Experience
    By Applications User Experience on March 10, 2011. Michele Molnar, Senior Usability Engineer, Applications User Experience. The Challenge: The SCM User Experience team, in close collaboration with product management and strategy, completely redesigned the user experience for Oracle Fusion applications. One of the goals of this redesign was to increase end user productivity by applying design patterns and guidelines and incorporating findings from extensive usability research. But a question remained: How do we know that the Oracle Fusion designs will actually increase end user productivity? The Test: To answer this question, the SCM Usability Engineers compared Oracle Fusion designs to their corresponding existing Oracle applications using the workflow time analysis method. The workflow time analysis method breaks tasks into a sequence of operators. By applying standard time estimates for all of the operators in the task, an estimate of the overall task time can be calculated. The workflow time analysis method has recently been adopted by the Applications User Experience group for use in predicting end user productivity. Using this method, a design can be tested and refined as needed to improve productivity even before the design is coded. For the study, we selected some of our recent designs for Oracle Fusion Product Information Management (PIM). The designs encompassed tasks performed by Product Managers to create, manage, and define products for their organization. (See Figure 1 for an example.) In applying this method, the SCM Usability Engineers collaborated with Product Management to compare the new Oracle Fusion Applications designs against Oracle’s existing applications. Together, we performed the following activities: identified the five most frequently performed tasks; created detailed task scenarios that provided the context for each task; conducted task walkthroughs; analyzed and documented the steps and flow required to complete each task; and applied standard time estimates to the operators in each task to estimate the overall task completion time. Figure 1. The interactions on each Oracle Fusion Product Information Management screen were documented, as indicated by the red highlighting. The task scenario and script provided the context for each task.  The Results: The workflow time analysis method predicted that the Oracle Fusion Applications designs would result in productivity gains in each task, ranging from 8% to 62%, with an overall productivity gain of 43%. All other factors being equal, the new designs should enable these tasks to be completed in about half the time it takes with existing Oracle Applications. Further analysis revealed that these performance gains would be achieved by reducing the number of clicks and screens needed to complete the tasks. Conclusions: Using the workflow time analysis method, we can expect the Oracle Fusion Applications redesign to succeed in improving end user productivity. The workflow time analysis method appears to be an effective and efficient tool for testing, refining, and retesting designs to optimize productivity. The workflow time analysis method does not replace usability testing with end users, but it can be used as an early predictor of design productivity even before designs are coded. We are planning to conduct usability tests later in the development cycle to compare actual end user data with the workflow time analysis results. Such results can potentially be used to validate the productivity improvement predictions.
Used together, the workflow time analysis method and usability testing will enable us to continue creating, evaluating, and delivering Oracle Fusion designs that exceed the expectations of our end users, both in the quality of the user experience and in productivity. (For more information about studying productivity, refer to the Measuring User Productivity blog.)
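    As a back-of-the-envelope illustration of the arithmetic behind the method (the operator times below are invented placeholders, not the standard estimates the Oracle team applied): the estimated task time is the sum, across operator types, of each operator's count multiplied by its standard time. For example, a flow requiring 12 clicks at 1.1 seconds each, 4 screen loads at 2.5 seconds each, and 30 keystrokes at 0.2 seconds each would be estimated at 12(1.1) + 4(2.5) + 30(0.2) = 29.2 seconds; scoring the redesigned flow with the same operator times and comparing the two totals yields the predicted productivity gain.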

    Read the article

  • Need WIF Training?

    - by Your DisplayName here!
    I spend numerous hours every month answering questions about WIF and identity in general. This made me realize that this is still quite a complicated topic once you go beyond the standard fedutil stuff. My good friend Brock and I put together a two-day training course about WIF that covers everything we think is important. The course includes extensive lab material where you take a standard application and apply all kinds of claims and federation techniques and technologies like WS-Federation, WS-Trust, session management, delegation, home realm discovery, multiple identity providers, Access Control Service, REST, SWT and OAuth. The lab also includes the latest version of the thinktecture identityserver and you will learn how to use and customize it. If you are looking for an open enrollment style of training, have a look here. Or contact me directly! The course outline looks as follows: Day 1 Intro to Claims-based Identity & the Windows Identity Foundation WIF introduces important concepts like conversion of security tokens and credentials to claims, claims transformation and claims-based authorization. In this module you will learn the basics of the WIF programming model and how WIF integrates into existing .NET code. Externalizing Authentication for Web Applications WIF includes support for the WS-Federation protocol. This protocol allows separating business and authentication logic into separate (distributed) applications. The authentication part is called an identity provider or, in more general terms, a security token service. This module looks at this scenario both from an application and an identity provider point of view and walks you through the necessary concepts to centralize application login logic, both using a standard product like Active Directory Federation Services as well as a custom token service using WIF’s API support. Externalizing Authentication for SOAP Services One big benefit of WIF is that it unifies the security programming model for ASP.NET and WCF. In the spirit of the preceding modules, we will have a look at how WIF integrates into the (SOAP) web service world. You will learn how to separate authentication into a separate service using the WS-Trust protocol and how WIF can simplify the WCF security model and extensibility API. Day 2 Advanced Topics:  Security Token Service Architecture, Delegation and Federation The preceding modules covered the 80/20 cases of WIF in combination with ASP.NET and WCF. In many scenarios this is just the tip of the iceberg. Especially when two business partners decide to federate, you usually have to deal with multiple token services and their implications in application design. Identity delegation is a feature that allows transporting the client identity over a chain of service invocations to make authorization decisions over multiple hops. In addition you will learn about the principal architecture of an STS, how to customize the one that comes with this training course, as well as how to build your own. Outsourcing Authentication:  Windows Azure & the Azure AppFabric Access Control Service Microsoft provides a multi-tenant security token service as part of the Azure platform cloud offering. This is an interesting product because it allows you to outsource vital infrastructure services to a managed environment that guarantees uptime and scalability.
Another advantage of the Access Control Service is that it allows easy integration of both the “enterprise” protocols like WS-* as well as “web identities” like LiveID, Google or Facebook into your applications. ACS acts as a protocol bridge in this case, where the application developer doesn’t need to implement all these protocols but simply uses a service to make it happen. Claims & Federation for the Web and Mobile World The web & mobile world is also moving to a token- and claims-based model. While the mechanics are almost identical, other protocols and token types are used to achieve better HTTP (REST) and JavaScript integration for in-browser applications and small footprint devices. Patterns such as how to allow third party applications to work with your data without having to disclose your credentials are also important concepts in these application types. The nice thing about WIF and its powerful base APIs and abstractions is that it can shield application logic from these details while you can focus on implementing the actual application. HTH

    Read the article

< Previous Page | 110 111 112 113 114 115 116 117 118 119 120 121  | Next Page >