Search Results

Search found 23044 results on 922 pages for 'oracle solaris 11'.

Page 373/922

  • Basic is Best

    - by Eric A. Stephens
    Fellow foodies will recognize the recent movement towards "farm-to-table" restaurants. These venues simplify their menus and source ingredients as locally as possible. I had the opportunity to dine at such a restaurant the other evening. I was gushing about the appetizer to my server when she described how the item was prepared and then punctuated her comments with "basic is best". I reminded my fellow enterprise architect diners that there was an architecture lesson in that statement. They rolled their eyes and chuckled. But they also knew I was right. I'm reminded of Frederick Brooks' book The Mythical Man-Month and his latest, The Design of Design. The former, a must-read, talks about complexity, but it refrains from damning all complexity. The world we live in and the enterprises we strive to transform with enterprise architecture are complicated organisms, much like the human body. But sometimes a simple solution is the best approach. Fewer applications (think: portfolio rationalization). Fewer components. Fewer lines of code. Whatever level of abstraction you are working at, less is more. I'm reminded of the enterprise architecture principle "Control Technical Diversity". At one firm I created pithy catch phrases for each principle. I named this one "Less is More". But perhaps another variation is what my server said the other night: "Basic is Best".

    Read the article

  • OIM 11g - Multi Valued attribute reconciliation of a child form

    - by user604275
    This topic briefly describes how to reconcile a child form attribute that is also multi-valued from a flat file. An example of the flat file format is:

        ManagementDomain1|Entitlement1|DIRECTORY SERVER,EMAIL
        ManagementDomain2|Entitlement2|EMAIL PROVIDER INSTANCE - UMS,EMAIL VERIFICATION

    In OIM there will be a parent form for the Management Domain and Entitlement fields. Reconciliation will assign the servers (which are multi-valued) to the corresponding Management Domain and Entitlement. In the flat file, multi-valued fields are separated by commas (,). In the Design Console, create a form with 'Server Name' as a field and make it a child form. Open the corresponding Resource Object and add this field for reconciliation; while adding it, select the 'Multivalued' check box (please find the attached screenshot on how to add it, Child Table.docx). Open the process definition and add the child form fields for reconciliation, then click the 'Create Reconciliation Profile' button on the Resource Object tab. The API methods used for child form reconciliation are:

    1. reconEventKey = reconOpsIntf.createReconciliationEvent(resObjName, reconData, false);
       Passing 'false' indicates that the event is not finished yet, because child table data will still be added.
    2. reconOpsIntf.providingAllMultiAttributeData(reconEventKey, RECON_FIELD_IN_RO, true);
       RECON_FIELD_IN_RO is the field that we added to the Resource Object for reconciliation (please refer to the screenshot).
    3. reconOpsIntf.addDirectBulkMultiAttributeData(reconEventKey, RECON_FIELD_IN_RO, bulkChildDataMapList);
       bulkChildDataMapList is built as below:

           List<Map> bulkChildDataMapList = new ArrayList<Map>();
           for (int i = 0; i < stokens.length; i++) {
               Map<String, String> attributeMap = new HashMap<String, String>();
               String serverName = stokens[i].toUpperCase();
               attributeMap.put("Server Name", serverName);
               bulkChildDataMapList.add(attributeMap);
           }

    4. reconOpsIntf.finishReconciliationEvent(reconEventKey);
    5. reconOpsIntf.processReconciliationEvent(reconEventKey);

    Finally, register the plug-in, import the metadata into MDS, and create a scheduled job to run the reconciliation.
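    For reference, below is a rough consolidation of steps 1-5 into a single helper method. This is only a sketch: the helper name reconcileLine and its parameters are hypothetical, reconOpsIntf (tcReconciliationOperationsIntf), resObjName and RECON_FIELD_IN_RO are assumed to be initialized elsewhere as in the post, and the parent-form field names used in reconData are illustrative.

        // Sketch only: reconOpsIntf, resObjName and RECON_FIELD_IN_RO come from elsewhere;
        // parent-form field names below are illustrative assumptions.
        // Requires java.util.ArrayList, java.util.HashMap, java.util.List, java.util.Map.
        void reconcileLine(String mgmtDomain, String entitlement, String serverCsv) throws Exception {
            Map<String, String> reconData = new HashMap<String, String>();
            reconData.put("Management Domain", mgmtDomain);
            reconData.put("Entitlement", entitlement);

            // false = do not finish the event yet; child (multi-valued) data follows
            long reconEventKey = reconOpsIntf.createReconciliationEvent(resObjName, reconData, false);
            reconOpsIntf.providingAllMultiAttributeData(reconEventKey, RECON_FIELD_IN_RO, true);

            List<Map> bulkChildDataMapList = new ArrayList<Map>();
            for (String server : serverCsv.split(",")) {   // multi-valued servers are comma-separated
                Map<String, String> attributeMap = new HashMap<String, String>();
                attributeMap.put("Server Name", server.trim().toUpperCase());  // child-form field
                bulkChildDataMapList.add(attributeMap);
            }
            reconOpsIntf.addDirectBulkMultiAttributeData(reconEventKey, RECON_FIELD_IN_RO, bulkChildDataMapList);

            reconOpsIntf.finishReconciliationEvent(reconEventKey);
            reconOpsIntf.processReconciliationEvent(reconEventKey);
        }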

    Read the article

  • Unleash the Power of JavaFX

    - by Angela Caicedo
    It seems that it was just yesterday that we were getting ready for JavaOne 2012. Now it's over, but it's definitely a great time to go back and watch the sessions you missed and learn some of the latest news about Java. For this JavaOne, I presented two sessions and one HOL, all of them related to JavaFX: JavaFX Extreme GUI Makeover, Building JavaFX Interfaces with the Real World, and Unleash the Power of JavaFX. If you couldn't join us for these sessions, just follow the links and you can watch the videos on demand. For the HOL, Unleash the Power of JavaFX, I've created a repository on GitHub, as many of the attendees wanted to keep the material. In this repository you can find the lab document and the NetBeans projects for each exercise, along with their corresponding solutions. Hope you enjoy!

    Read the article

  • New Whitepaper: Sales Cloud Business Object Cheatsheet

    - by Richard Bingham
    Ever tried coding Groovy in Application Composer and found it hard to remember the API names for the standard objects and their fields? To help, we have created this short set of ERD-like diagrams for the most regularly used Business Objects along with their key attributes. We hope this handy PDF quick-reference guide will make this easier and save you some time. Please let us know in the comments below if this is useful or if there are any enhancements you'd like us to add.

    Read the article

  • Setting MTU on Exalogic

    - by csoto
    For many reasons, a system administrator may want to change the MTU settings of a server. But in a system like Exalogic, which contains lots of interconnected nodes and various other components, it's important to understand how this applies to the different networks.

    For example, when bringing up bonding of InfiniBand, an error like the following may be thrown:

        Bringing up interface bond1: SIOCSIFMTU: Invalid argument

    Both scripts ifcfg-ib0 and ifcfg-ib1 (in the /etc/sysconfig/network-scripts/ directory) have MTU set to 65500, which is a valid MTU value only if all IPoIB slaves operate in connected mode and are configured with the same value, so the line below must be added to both network scripts and the network restarted:

        CONNECTED_MODE=yes

    By the way, an error of the form "SIOCSIFMTU: Invalid argument" indicates that the requested MTU was rejected by the kernel, typically because it exceeds the maximum value supported by the interface hardware. In that case you must either reduce the MTU to a value that is supported or obtain more capable hardware. This problem has been seen when trying to modify the MTU using the ifconfig command, as in the example below:

        [root@elxxcnxx ~]# ifconfig ib1 mtu 65520
        SIOCSIFMTU: Invalid argument

    It's important to stress that in most cases the nodes must be rebooted after the MTU size has been changed. Although in some circumstances it may work without a reboot, that is not how it is typically documented.

    Now, in order to reduce memory consumption and improve performance for network traffic received on IPoIB related interfaces, it is recommended to reduce the MTU value in the interface configuration files for IPoIB related bonds from 65520 to 64000. The change needs to be made to the interface configuration files under the /etc/sysconfig/network-scripts directory and applies to the interface configuration files for bonds over IPoIB related slave devices, for example /etc/sysconfig/network-scripts/ifcfg-bond1. However, keep in mind that the numeric portion of the interface filenames that correspond to IPoIB interfaces is expected to vary across compute nodes and vServers, and so cannot be relied upon to identify which interface files are for bonds over IPoIB rather than EoIB related slave interfaces. To set these MTU values to the recommended settings, there are very useful instructions and a script in MOS Note 1624434.1, which is applicable to both physical and virtual configurations of Exalogic.

    Regarding the recommended MTU value for EoIB related interfaces, the maximum appropriate value is 1500. If for some reason a vServer has been created with a higher value (set in the /etc/sysconfig/network-scripts/ifcfg-bond0 file), then it must be fixed. An error like the following could be thrown under this circumstance:

        [root@vServer ~]# service network restart
        ...
        Bringing up interface bond0:  SIOCSIFMTU: Invalid argument

    Also, an error like the one below can be seen in the /var/log/messages file of the vServer:

        kernel: T5074835532 [mlx4_vnic] eth1:vnic_change_mtu:360: failed: new_mtu 64000 2026

    MOS Note 1611657.1 is very useful for this purpose.
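    As a rough illustration only (device names, addresses and overall file layout are assumptions; follow MOS Note 1624434.1 for the exact recommended settings for your configuration), an IPoIB slave script and its bond script would end up with entries along these lines:

        # /etc/sysconfig/network-scripts/ifcfg-ib0 (same idea for ifcfg-ib1) -- sketch, values assumed
        DEVICE=ib0
        MASTER=bond1
        SLAVE=yes
        ONBOOT=yes
        CONNECTED_MODE=yes

        # /etc/sysconfig/network-scripts/ifcfg-bond1 -- IPoIB bond, sketch, values assumed
        DEVICE=bond1
        IPADDR=192.168.10.10
        NETMASK=255.255.255.0
        ONBOOT=yes
        MTU=64000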

    Read the article

  • Three new ADF Insider Essentials on YouTube Channel

    - by Grant Ronald
    I've uploaded three ADF Insider Essentials onto our YouTube channel:
    - How to delete a node in a hierarchical tree component
    - Handling the OK and Cancel buttons in an af:dialog popup
    - A strategy for implementing global buttons
    These are ADF Insider Essentials that we originally loaded on OTN, but we can now upload larger files (each of these is about 20 minutes long). More ADF Insider Essentials are in the pipeline, so watch this space!

    Read the article

  • Update to Alert on Java Runtime Environment (JRE) for EBS end-users on Windows

    - by user793553
    To ensure that Java users remain on a secure version, Windows systems that rely on auto-update will be auto-updated from JRE 6 to JRE 7. Until E-Business Suite is certified with JRE 7, EBS users should not rely on the Windows auto-update mechanism for their client machines and should manually keep the JRE up to date with the latest version of JRE 6 until further notice.   Click here for more details and for instructions on how to get the latest version of JRE 6  

    Read the article

  • Clouds Around the World

    - by user12608550
    At the NIST Cloud Computing Workshop this week, representatives from Canada, China, and Japan presented on their cloud computing efforts. Some interesting points made:
    Canada: Building a "Service Canada" cloud for all citizen services, but raised the issue of data location...cloud data must stay within Canada's borders, so they will not focus on public clouds where they don't know or can't control data location.
    Japan: In response to the massive destruction of the Great East Japan Earthquake, Japan is building nation-wide cloud services to support disaster relief, data recovery, and support for rebuilding new communities.
    US Ambassador Philip Verveer discussed the need for international cooperation and standards development to enable interoperability of cloud services, keeping in mind cultural and political differences. Additionally, an industry panel reported on cloud standards development, including some actual interoperability testing at http://www.cloudplugfest.org. Much of the first two days of the workshop covered progress and action plans around the 10 High-Priority Requirements to Further USG Agency Cloud Computing Adoption. Thursday's sessions will cover the work of the various NIST Cloud Computing Working Groups on:
    - Reference Architecture and Taxonomy
    - Standards Acceleration to Jumpstart the Adoption of Cloud Computing (SAJACC)
    - Cloud Security
    - Standards Roadmap
    - Business Use Cases
    (see Working Groups of NIST Cloud Computing)

    Read the article

  • OSCON: Java and a Nice Discount

    - by Tori Wieldt
    Now in its 14th year, OSCON, O'Reilly's annual open source conference, will once again be in Portland, OR on July 16-20, 2012. Join the world's open source pioneers, builders, and innovators at the Oregon Convention Center for five intense days to learn about open development, challenge your assumptions, and fire up your brain. With 200+ speakers, 18 tracks, hundreds of technologies, and over 3,000 hackers in attendance, it's a place to learn and network. You'll find practical tutorials, inspirational keynotes, and a wealth of information on open source languages, platforms, and development. OSCON includes a whole track devoted to Java and the JVM, and the list of speakers is impressive. OSCON is where the serious thinkers and doers—and their favorite technologies—converge. And when the day's sessions are over, join people just like you for some serious fun. Thanks to Java Magazine (you have subscribed to Java Magazine, right? If not, get your free digital subscription now!), you can register for OSCON and save 20% with code JAVAMAG.

    Read the article

  • The Social Content Conundrum

    - by Mike Stiles
    Here’s the social content conundrum: people who are not entertainers are being asked to entertain. Despite a world of skilled MBAs, marketing savants, technological innovators, analysts, social strategists and consultants, every development in social for brands keeps boomeranging right back to the same unavoidable truth. Success hinges on having content creators who know how to entertain the target audience. You can’t make this all about business-processes. You can’t make this all about technology, though data is critical and helps inform content. This is about having human beings who know the audience, know what they’d love to see, and can create the magic that will draw and hold them. Since showing up in the News Feed is critical for exposition and engagement, and since social ads primarily serve to amplify content that’s performing well, I’m comfortable saying content creators are becoming exponentially recruited and valued. They will no longer be commodities. They’ll be your stars. Social has fundamentally changed the relationship between brand and consumer. No longer can the customer be told to sit down, shut up, and listen to our ads. It’s now all about what consumers are willing to watch or read. Their patience for subjecting themselves to material they aren’t interested in is waning. Therefore, brands must now be producers of entertainment and information content, not merely placers of ads within someone else’s content. Social has given you a huge stage, with an audience sitting out there waiting to see what you’re going to do. What are you putting on that stage? For most corporate environments, entertaining is alien. It’s risky and subjective. Most operate around two foundational principles: control and fear. To entertain and inform with branded content, some control has to go. You control the product. Past that, control is being transferred into the hands of the consumer. The “fear first” culture also has to yield. If you strive to never make waves, you will move absolutely nothing. Because most corporations don’t house entertainers, they must be found then trusted. They’re usually a little weird. The ideas they’ll bring may seem “out there.” But like any business professional, they’ve gone through the training and experiences that make them uniquely good at what they do, even if you don’t quite understand them. It’s okay. It’s what the audience thinks that matters. Get it right, and you’ll be generating one ambassador after another who’s proud to be identified with the brand and will regularly consume and share your content. Entertainment entities are able to shape our culture and succeed beyond their wildest dreams by being beholden to one thing…what the public likes and wants. When brands put the same emphasis on crowd-pleasing content, they too will enjoy brand fame the likes of which they’ve never seen. The stage is yours. Now get out there and go for that applause.

    Read the article

  • High Tech Product Companies: Benchmark Your Sales & Marketing Data Management

    - by user709269
    Aberdeen's Q4 2010 Quarterly Business Review found that 74% of the Sales and Marketing organizations in High Tech product manufacturing have strategic CRM initiatives in 2011. Aberdeen Group is conducting a survey that will help high tech product companies such as yours determine the Best-in-Class procedures for capturing, managing, and disseminating business data. If your product company is planning on implementing a CRM solution or is simply evaluating the potential benefits, we would appreciate your feedback in this brief, 10-minute survey. You will be able to compare your experiences in leveraging customer information for sales and marketing with your peers, benchmark your performance, and see how you can achieve Best-in-Class results. Individual responses will be kept strictly confidential, and data will only be used in aggregate. In appreciation for sharing your time and thoughts with us, we will provide complimentary access for you to the full benchmark report as soon as it is published (a $399 value). Take the survey.

    Read the article

  • PeopleTools 8.54 Pre-Release Notes Available

    - by Matthew Haavisto
    The PeopleSoft PeopleTools team recently published pre-release notes for PeopleTools 8.54. Pre-release notes provide more functional and technical details than the release value proposition. This document describes how enhancements function within the context of the greater business process. This added level of detail should enable project teams to answer the following questions:
    - What delivered functionality will change?
    - How will an upgrade or new implementation affect other systems?
    - How will these changes affect the organization?
    After the project team has reviewed and analyzed the pre-release notes, business decision makers should be able to determine whether to allocate budget and initiate implementation plans. This document covers the following subjects:
    - Platform support enhancements
    - Development tools enhancements
    - System administration tools enhancements
    - Reporting and analytic tools enhancements
    - Integration tools enhancements
    - Lifecycle management tools enhancements
    - Accessibility
    - PeopleSoft Interaction Hub enhancements

    Read the article

  • Cloud Infrastructure has a new standard

    - by macoracle
    I have been working for more than two years now in the DMTF working group tasked with creating a Cloud Management standard. That work has culminated in the release today of the Cloud Infrastructure Management Interface (CIMI) version 1.0 by the DMTF. CIMI is a single interface that a cloud consumer can use to manage their cloud infrastructure in multiple clouds. As CIMI is adopted by the cloud vendors, no longer will you need to adapt client code to each of the proprietary interfaces from these multiple vendors. Unlike a de facto standard, where typically one vendor has change control over the interface and everyone else has to reverse engineer its inner workings, CIMI is a de jure standard that is under the change control of a standards body. One reason the standard took two years to create is that we factored in use cases, requirements and contributed APIs from multiple vendors. These vendors have products shipping today, and as a result CIMI has a strong foundation in real world experience.

    What does CIMI allow?

    CIMI is both a model for the resources (computing, storage, networking) in the cloud as well as a RESTful protocol binding to HTTP. This means that to create a Machine (guest VM), for example, the client creates a "document" that represents the Machine resource and sends it to the server using HTTP. CIMI allows the resources to be encoded in either JavaScript Object Notation (JSON) or the eXtensible Markup Language (XML). CIMI provides a model for the resources that can be mapped to any existing cloud infrastructure offering on the market. There are some features in CIMI that may not be supported by every cloud, but CIMI also supports the discovery of which features are implemented. This means that you can still have a client that works across multiple clouds and is able to take full advantage of the features in each of them.

    Isn't it too early for a standard?

    A key feature of a successful standard is that it allows for compatible extensions to occur within the core framework of the interface itself. CIMI's feature discovery (through metadata) is used to convey to the client that additional features, which may be vendor specific, have been implemented. As multiple vendors implement such features, they become candidates to be added to future versions of CIMI. Thus innovation can continue in the cloud space without being slowed down by a lowest-common-denominator type of specification. Since CIMI was developed in the open by dozens of stakeholders who are already implementing infrastructure clouds, I expect CIMI to be adopted by these same companies and others over the next year or two. Cloud customers who can see the benefit of this standard should start to ask their cloud vendors to show a CIMI implementation in their roadmap. For more information on CIMI and the DMTF's other cloud efforts, go to: http://dmtf.org/cloud
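    To make the create-a-Machine pattern above concrete, here is a minimal sketch of a client POSTing a JSON "document" for a Machine resource over HTTP. The endpoint URL, media type and JSON attribute names are illustrative assumptions, not any particular vendor's implementation; a real client would follow the CIMI 1.0 specification and the provider's entry point.

        import java.io.OutputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;
        import java.nio.charset.StandardCharsets;

        // Sketch: create a Machine by sending a JSON document to an assumed machine collection URL.
        public class CimiCreateMachineSketch {
            public static void main(String[] args) throws Exception {
                // JSON body representing the Machine resource (field names are assumptions)
                String machineJson =
                    "{ \"name\": \"demo-vm\"," +
                    "  \"description\": \"Machine created via a CIMI-style request\"," +
                    "  \"machineTemplate\": { \"href\": \"https://cloud.example.com/cimi/machineTemplates/1\" } }";

                URL collection = new URL("https://cloud.example.com/cimi/machines");  // assumed endpoint
                HttpURLConnection conn = (HttpURLConnection) collection.openConnection();
                conn.setRequestMethod("POST");
                conn.setDoOutput(true);
                conn.setRequestProperty("Content-Type", "application/json");

                try (OutputStream out = conn.getOutputStream()) {
                    out.write(machineJson.getBytes(StandardCharsets.UTF_8));
                }
                // A 201 Created response (with a Location header for the new Machine) would indicate success
                System.out.println("HTTP status: " + conn.getResponseCode());
            }
        }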

    Read the article

  • Chart Filtering

    - by Tim Dexter
    Interesting question from a colleague this week: can you add a filter to a chart to show just a specific set of data? In an RTF template, you need to do a little finagling in the chart definition. In an online template, a couple of clicks and you're done.

    RTF

    Build your chart as you would normally, including all the data to start with. Now flip to the Advanced tab to see the code behind the chart. It's not very pretty, but with a little effort you can get it looking a little more friendly. Here's my chart showing employees and their salaries:

        <Graph depthAngle="50" depthRadius="8" seriesEffect="SE_AUTO_GRADIENT">
         <LegendArea visible="true"/>
         <Title text="Executive Department Only" visible="true" horizontalAlignment="CENTER"/>
         <LocalGridData colCount="{count(.//G_2)}" rowCount="1">
          <RowLabels>
           <Label>SALARY</Label>
          </RowLabels>
          <ColLabels>
           <xsl:for-each select=".//G_2">
            <Label><xsl:value-of select="EMP_NAME"/></Label>
           </xsl:for-each>
          </ColLabels>
          <DataValues>
           <RowData>
            <xsl:for-each select=".//G_2">
             <Cell><xsl:value-of select="SALARY"/></Cell>
            </xsl:for-each>
           </RowData>
          </DataValues>
         </LocalGridData>
        </Graph>

    Note the .//G_2 references: the chart is currently grabbing all values at the G_2 level of the data. We can use an XPATH expression to filter the data down to the set we want to see. In my case I want to see only the employees that are in the Executive department. My data is structured thus:

        <DATA_DS>
          <G_1>
            <DEPARTMENT_NAME>Accounting</DEPARTMENT_NAME>
            <G_2>
              <MANAGER>Higgins</MANAGER>
              <EMPLOYEE_ID>206</EMPLOYEE_ID>
              <HIRE_DATE>2002-06-07T00:00:00.000-04:00</HIRE_DATE>
              <SALARY>8300</SALARY>
              <JOB_TITLE>Public Accountant</JOB_TITLE>
              <PARAS>11000</PARAS>
              <EMP_NAME>William Gietz</EMP_NAME>
            </G_2>

    So the XPATH expression I'm going to use to limit the data to the Executive department is:

        .//G_2[../DEPARTMENT_NAME='Executive']

    Note that the ../ moves the parser up the XML tree so it can test the DEPARTMENT_NAME value. I added this XPATH expression to the three instances that need it: ColCount, ColLabels and RowData. It's simple enough to do. Testing your XPATH expression is easier using a table of data. Please note: as soon as you make changes to the chart code and go back to the Builder tab, you'll find that everything is grayed out, so I recommend you make all the changes you can via the chart dialog before updating the code.

    Online Template

    Implementing the filter is much simpler; there is a dialog box to help you out. Add your chart and fill out the various data points you want to show, then hit the Filter item in the ribbon above the chart. That will pop the filter dialog box where you can then add a filter to the chart. You can add multiple filters if needed, and of course you can use the Manage Filters button to re-open and edit the filters. Pretty straightforward stuff!

    Read the article

  • Powerful Lessons in Data from the Presidential Election

    - by Christina McKeon
    Now that we’ve had a few days to recover from the U.S. presidential election, it’s a good time to take a step back from politics and look for the customer experience lessons that we can take away. The most powerful lesson is that when you know more about your base, you will have an advantage over your competition. That advantage will translate into you winning and your competition losing. Michael Scherer of TIME was given access to Obama’s data analysts two days before the election. His account is documented in Inside the Secret World of the Data Crunchers Who Helped Obama Win. What we learned from Scherer’s inside view is how well Obama’s team did in getting the right data, analyzing it, and acting on it. This data team recognized how critical it was to break down data silos within the campaign. As Scherer noted, they created “a single system that merged information from pollsters, fundraisers, field workers, consumer databases, and social-media and mobile contacts with the main Democratic voter files in the swing states.” The Obama analysis was so meticulous that they knew which celebrity and which type of celebrity event would help them maximize campaign contributions. With a single system, their data models became more precise. They determined which messages were more successful with specific demographic groups and that who made the calls mattered. Data analysis also led to many other changes in Obama’s campaign including a new ad buying strategy, using social media and applications to tap into supporters’ friends, and using new social news sites. While we did not have that same inside view into Romney’s campaign, much of the post-mortem coverage indicates that Romney’s team did not have the right analysis. As Peter Hamby of CNN wrote in Analysis: Why Romney Lost, “Romney officials had modeled an electorate that looked something like a mix of 2004 and 2008….” That historical data did not account for the changing demographics in the U.S. Does your organization approach data like the Obama or Romney team? Do you really know your base? How well can you predict what is going to happen in your business? If you haven’t already put together a strategy and plan to know more, this week’s civics lesson is a powerful reason to do it sooner rather than later. Your competitors are probably thinking the same thing that you are!

    Read the article

  • AME : How to Diagnose Issues With the Default Approver List in Purchasing When Using Approvals Management

    - by Oracle_EBS
    Do you need help understanding the concepts or how to set up the Approval Management Engine (AME) for requisition approvals? See the new diagnostic Note 1437183.1, 'AME : How to Diagnose Issues With the Default Approver List in Purchasing When Using Approvals Management'. AME is designed to generate the approval list according to the conditions and rules you define in the setup. This troubleshooting guide will help you understand how AME builds the default approval list for Purchasing and help users find solutions for scenarios where the approval list fails to be generated. Follow along with the logical steps for troubleshooting. The note first reviews how to generate the AME Setup report. For example, in the note we see a fragment of the setup report; notice it has different sections for each of the setup categories, including attributes, conditions, rules, action types, approval groups, etc. How the default approval list is built in AME is then reviewed, followed by the logical steps for diagnosing issues. The diagnostic steps include how to run the Test Workbench, as well as how to obtain valuable debug and exception information. Then follow along, using the steps to build a simple test case to sharpen your understanding.

    Read the article

  • Innovation for Retailers

    - by David Dorf
    One of my main objectives for this blog is to point out emerging technologies and how they might apply to the retail industry.  But ideas are just the beginning; retailers either have to rely on vendors or have their own lab to explore these ideas and see which ones work.  (A healthy dose of both is probably the best solution.)  The Nordstrom Innovation Lab is a fine example of dedicating resources to cultivate ideas and test prototypes. The video below, from 2011, is a case study in which the team builds an iPad app that helps customers purchase sunglasses in the store.  Customers take pictures of themselves wearing different sunglasses, then can do side-by-side comparisons. There are a few interesting take-aways from their process.  First, they are working in the store alongside employees and customers.  There's no concept of documenting all the requirements then building the product.  Instead, they work closely with those that will be using the app in order to fully understand what's needed.  When they find an issue, they change the software onsite and try again.  This iterative prototyping ensures their product hits the mark.  Feels like Extreme Programming if you recall that movement. Second, they have time-boxed the project to one week.  Either it works or it doesn't, and either way they've only expended a week's worth of resources.  Innovation always entails failure, and those that succeed are often good at detecting failure quickly then adjusting.  Fail fast and fail often. Third, it's not always about technology.  I was impressed they used paper designs to walk through user stories and help understand the needs of the customer.  Pen and paper is the innovator's most powerful tool. Our Retail Applied Research (RAR) team uses some of these concepts in our development process.  (Calling it a process is probably overkill.)  We try to give life to concepts quickly so the rest of organization can help us decide if we're heading the right direction.  It takes many failures before finding a successful product.

    Read the article

  • Two interesting big data sessions around Openworld

    - by Jean-Pierre Dijcks
    For those who want to talk (not listen) about big data, here are two very cool sessions: BOF9877 - A birds-of-a-feather session around all things big data. It is on Monday, Oct 1, 6:15 PM - 7:00 PM - Marriott Marquis - Golden Gate. While all guests on the panel are special, we will have a very special guest on the panel. He is a proud owner of a Big Data Appliance (see here). Then there is a Big Data SIG meeting (the invite from Gwen): I'd like to invite everyone to our OOW12 meet up. We'll meet on Tuesday, October 2nd, 8:45 to 9:45 at Moscone West Level 3, Overlook 3. We will network, socialize and discuss plans for the group. Which topics interest us for webinars? Which conferences do we want to meet at? What other activities are we interested in? We can also discuss big data topics, show off our great work, and seek advice on the challenges. Other than figuring out what we are collectively interested in, the discussion will be pretty open. Here is the official invite. See you at Openworld!!

    Read the article

  • Moms on Mobile: Are They Way Ahead of You?

    - by Mike Stiles
    You may have no idea how much and how fast moms are embracing mobile. Of all the demographics that can be targeted by marketers, moms have always been at or near the top of the list. And why not? They’re running households, they’re all over town, they’re making buying decisions, and they’re influencing family and friends. They, out of necessity, become masters of efficiency and time management. So when a technology tool, like mobile, comes along that assists with that efficiency and time management, we would obviously expect them to take advantage of it. So if it’s obvious, why are so many big, sophisticated brands left choking on the dust of moms who have zoomed past them in the adoption of mobile, and social on mobile? Let’s break down some hard truths as presented by a Mojiava report:
    - Moms spend 6.1 hours per day on average on their smartphones – more than magazines, TV or radio.
    - 46% took action after seeing a mobile ad.
    - 51% self-identify as “addicted” to their smartphone.
    - Households with an income of $25K-$50K have about the same mobile penetration among moms as those with incomes of $50K-$75K. So mobile is regarded as a necessity for middle-class moms.
    - Even moms without smartphones spend 2.5 hours on average per day on some connected mobile device.
    - Of moms with such devices, 9.8% have an iPad, 9.5% a Kindle and 5.7% an iPod Touch.
    - Of tablet-owning moms, 97% bought something using their tablet in the last month.
    - 31% spend over 10 hours per week on their tablet, but less than 2 hours per week on their PCs.
    - 62% of connected moms use shopping apps.
    - 46% want to get info on their mobile while in a store.
    - Half of connected moms use social on their mobile. And they’re engaged: 81% are brand fans, 86% post updates, and 84% comment.
    If women and moms are one of your primary targets and you find yourself with no strong social channels where content is driving engagement and relationship-building, with sites not optimized for mobile, or with no tablet or smartphone apps, you have been solidly left behind by your customers and prospects. And their adoption of mobile and social on mobile is only exponentially speeding up, not slowing down. How much sense does it make when your customer is ready to act on your mobile ad, wants to use your iPad app to buy something from you, wants to be your fan on Facebook, wants to get messages and deals from you while they’re in your store…but you’re completely absent? I’ll help you cheat on the test by giving you the answer…no sense at all. Catch up to momma.

    Read the article

  • Global Day of Coderetreat

    - by Tori Wieldt
    From the coderetreat.org website: Coderetreat is a day-long, intensive practice event, focusing on the fundamentals of software development and design. By providing developers the opportunity to take part in focused practice away from the pressures of 'getting things done', the coderetreat format has proven itself to be a highly effective means of skill improvement. This year, the Global Day of Coderetreat is happening on December 8. It sounds cool and fun, and of course, Java Champions and Java developers around the world are involved. Here's a small sampling: Chennai, India São Paulo, Brazil Skopje, Macedonia Kraków, Poland You can go to http://globalday.coderetreat.org/  to look up events near you. It's a great opportunity to practice your craft. Here's a video from an event last year to get a flavor:

    Read the article

  • Highlighting new rows in ADF Table

    - by Sireesha Pinninti
    About

    This article explains how to highlight newly inserted rows in an ADF Table without writing any extra Java/JavaScript code.

    Introduction

    Sometimes we may wish to give more clarity to the end user by differentiating between newly inserted rows and existing rows (i.e. the rows from the DB) in a table, highlighting new rows in a different color as in the figure shown below.

    Solution

    We can achieve this by giving the following EL to the inlineStyle property of every column inside af:table:

        #{row.row.entities[0].entityState == 0?'background-color:#307D7E;':''}

    Explanation

    Here is the explanation for row.row.entities[0].entityState given inside the EL, which returns the state of the row (i.e. New, Modified, Unmodified, Initialized, etc.):
    - row - Refers to a tree node binding (an instance of FacesCtrlHierNodeBinding) at runtime
    - row.row - Refers to the instance of the row that the tree node is based on
    - row.row.entities[0] - Gets the Entity row at the zeroth index. In most cases the table will be based on a single entity; if your table is based on multiple entities, then the index needs to be given accordingly.
    - row.row.entities[0].entityState - Gets the Entity Object's current entity state in the transaction (0 - New, 1 - Unmodified, 2 - Modified, -1 - Initialized, etc.)

    Read the article

  • Time Zone on WebLogic Server

    - by adejuanc
    In order to configure the time zone with WebLogic Server, use the following JVM startup argument:

        -Duser.timezone=<timezone>

    For example, in the Java arguments in the admin console at Environments -> Servers -> Servername -> Server Start tab, configure the startup settings that Node Manager will use to start the particular server, for example:

        -Duser.timezone='America/Arizona'

    There are many different time zones, each with its own code. For a complete list please refer to: http://en.wikipedia.org/wiki/List_of_zoneinfo_time_zones

    For testing, you can run the following code on WLS in a JSP, a servlet, or by deploying the class:

        import java.util.Calendar;
        import java.util.TimeZone;

        public class TestTimeZone {

          public static void main(String[] args) {
            // Print the JVM's current default time zone (as set via -Duser.timezone)
            Calendar calendar = Calendar.getInstance();
            TimeZone timeZone = calendar.getTimeZone();
            System.out.println(" your Current TimeZone is : " + timeZone.getDisplayName());
            System.out.println(" Time Zone id : " + timeZone.getID());
          }
        }

    Read the article

  • Trust

    - by mprove
    I sense traffic on this blog without an apparent reason. Hmm. What about this: brief musings about trust. Every piece of software, every website, every social platform, every community-building effort is a matter of trust building. You make a social promise to continue the effort and to care for the commitment of the users or community members. It is easy to offer more to your community. On the other hand, it is quite difficult or impossible to take something away, or to close down or end the product or community, without disappointing someone. cheers, Matthias

    Read the article

  • All hail the Excel Queen

    - by Tim Dexter
    An excellent question this past week from dear ol' Blighty; actually from Brian at Nextgen Clearing Ltd in the big smoke (London). Brian was developing an Excel template and wanted to be able to reference the data fields multiple times inside the Excel template. Damn good question, and I of course had some wacky solutions, from macros and cell referencing in Excel to pre-processing the data with an XSL stylesheet to copy the data multiple times so it could be referenced multiple times. All completely outlandish; enter our Queen of Excel, Shirley from the development team. Shirley is singlehandedly responsible for the Excel templates; I put her through six months of hell a few years back with a host of Excel template requirements. She was more than up to the challenge and has developed some great features. One of those is the ability to use the hidden XDO_METADATA sheet to map the data to custom named fields so they can be used multiple times in the template. So simple and very neat! Excel template and regular Excel users will know that you can only use the naming function once, i.e. the names have to be unique across the workbook, so you cannot reuse a cell/group name. To get around this you can just come up with as many cell names as you want and map them in the XDO_METADATA sheet to the data columns/fields in your XML data set. For example:

        XDO_?DEPTNO_SUMMARY?   <?DEPTNO?>
        XDO_?DNAME_SUMMARY?    <?DNAME?>
        XDO_GROUP_?G_D_DETAIL? <xsl:for-each-group select=".//G_D" group-by="./DEPTNO">
        XDO_?DEPTNO_DETAIL?    <?DEPTNO?>

    As you can see, DEPTNO has been referenced twice and mapped to different named values in the left-hand column. These values can then be used to name individual cells in the Excel template. You'll also notice a mix of Publisher <? ... ?> and native XSL commands, so the world is your oyster on the mapping and the complexity you might need for calculations or string manipulation. Shirley has kindly built out a sample Excel template, data and result here so you can see how it all hangs together. The XDO_METADATA sheet is hidden; just right-click on the sheet names and use the Unhide command to show it.

    Read the article

  • How to deal with MySQL Connector/ODBC error "Can't connect to local MySQL server through socket '/var/lib/mysql/mysql.sock'"

    - by user12653020
    I am sure many users run into a mysterious problem when perfectly working ODBC configurations start failing with errors like:

        Can't connect to local MySQL server through socket '/var/lib/mysql/mysql.sock'

    The above error message might be preceded by something like [nxDc[yQ]. At the same time, odbc.ini specifies in its DSN a different SOCKET=/tmp/mysql.sock or a TCP connection SERVER=<remote_host_or_ip>. The question is, what happened that made the ODBC driver start ignoring the DSN options?

    The clue lies in the corrupted string [nxDc[yQ], which actually was [UnixODBC][MySQL] with every 2nd symbol removed. This is a case of bad conversion from SQLCHAR to SQLWCHAR. The UnixODBC driver manager took a single-byte character string from the client application and tried to convert it into wide (multi-byte) characters for the Unicode version of the MyODBC driver. Initially the piece of the connection string was represented by 1-byte chars like:

        [S][E][R][V][E][R][=][m][y][h][o][s][t][;]

    After the bad conversion to wide chars (commonly 2-byte UTF-16) it became

        [SE][RV][ER][=m][yh][os][t;]

    instead of

        [S\0][E\0][R\0][V\0][E\0][R\0][=\0][m\0][y\0][h\0][o\0][s\0][t\0][;\0]

    Naturally, the MyODBC driver could not parse the bad string and tried to use the default connection type (SOCKET) with the default value (/var/lib/mysql/mysql.sock).

    Now we know what happened, but why did it happen? In most cases it happened because of using the ODBCManageDataSourcesQ4 utility or its older analog ODBCConfig. When registering ODBC drivers, they put in lots of additional options, and one of these options badly affects the UnixODBC driver manager itself. The solution is simple - remove or comment out the option in the odbcinst.ini file (it is empty by default) set for the driver:

        [MySQL ODBC 5.2.6 Driver]
        Description    =
        Driver         = /home/dbs/myodbc526/lib/libmyodbc5w.so
        Driver64       = /home/dbs/myodbc526/lib/libmyodbc5w.so
        Setup          = /home/dbs/myodbc526/lib/libmyodbc5S.so
        Setup64        = /home/dbs/myodbc526/lib/libmyodbc5S.so
        UsageCount     = 1
        CPTimeout      = 0
        CPTimeToLive   = 0
        IconvEncoding  =  # <--------- remove this line
        Trace          =
        TraceFile      =
        TraceLibrary   =

    After applying this simple solution (removing the line with IconvEncoding =) everything came back to normal. Prior to removing that line I tried putting different encoding names there, but the result was not good, so I really don't know how to use it properly. Unfortunately, the UnixODBC manuals say nothing about it. Therefore, removing this option was the only way to get things done.

    Read the article
