Search Results

Search found 1709 results on 69 pages for 'environments'.


  • Java Spotlight Episode 139: Mark Heckler and José Pereda on JES based Energy Monitoring @MkHeck @JPeredaDnr

    - by Roger Brinkley
    Interview with Mark Heckler and José Pereda on using JavaSE Embedded with the Java Embedded Suite on a Raspberry Pi along with a JavaFX client to monitor an energy production system, and their JavaOne Tutorial, Java Embedded EXTREME MASHUPS: Building self-powering sensor nets for the IoT. Right-click or Control-click to download this MP3 file. You can also subscribe to the Java Spotlight Podcast Feed to get the latest podcast automatically. If you use iTunes you can open iTunes and subscribe with this link: Java Spotlight Podcast in iTunes. Show Notes News Java Virtual Developer Day Session Videos Available JavaFX Maven Plugin 2.0 Released JavaFX Scene Builder 1.1 build b28 FXForm 2 release 0.2.2 OpenJDK8/Zero cross compile build for Foundation model HSAIL-based GPU offload: the Quest for Java Performance Begins Progress on Moving to Gradle Java EE 7 Launch Keynote Replay Java EE 7 Technical Breakouts Replay Java EE 7 support in NetBeans 7.3.1 Java EE 7 support in Eclipse 4.3 Java Magazine - May/June Events Jul 16-19, Uberconf, Denver, USA Jul 22-24, JavaOne Shanghai, China Jul 29-31, JVM Language Summit, Santa Clara Sep 11-12, JavaZone, Oslo, Norway Sep 19-20, Strange Loop, St. Louis Sep 22-26 JavaOne San Francisco 2013, USA Feature Interview Mark Heckler is an Oracle Corporation Java/Middleware/Core Tech Engineer with development experience in numerous environments. He has worked for and with key players in the manufacturing, emerging markets, retail, medical, telecom, and financial industries to develop and deliver critical capabilities on time and on budget. Currently, he works primarily with large government customers using Java throughout the stack and across the enterprise. He also participates in open-source development at every opportunity, being a JFXtras project committer and developer of DialogFX, MonologFX, and various other projects. When Mark isn't working with Java, he enjoys writing about his experiences at the Java Jungle website (https://blogs.oracle.com/javajungle/) and on Twitter (@MkHeck). José Pereda is a Structural Engineer who has worked at the School of Engineering of the University of Valladolid in Spain for more than 15 years, and his passion is applying programming to solve real problems. Involved with Java since 1999, José shares his time between JavaFX and the Embedded world, developing commercial applications and open source projects (https://github.com/jperedadnr), and blogging (http://jperedadnr.blogspot.com.es/) or tweeting (@JPeredaDnr) about both. What’s Cool AquaFX 0.1 - Mac OS X skin for JavaFX by Claudine Zillmann DromblerFX adds a docking framework Part 2 of Gerrit’s taming the Nashorn for writing JavaFX apps in JavaScript Tool from mihosoft called JSelect for quickly switching JDKs Apache Maven Javadoc Plugin 2.9.1 Released Proposal: Java Concurrency Stress tests (jcstress) Slide-free Code-driven session at SV JUG JavaOne approvals/rejects gone out

    Read the article

  • Customization: It’s Wanted in Enterprise Tech Platforms Too

    - by Mike Stiles
    Did you know that every customer service person does their job the exact same way in every business organization?  And did you know that every business organization cares about the exact same metrics? I hope not, because both those things couldn’t be farther from the truth. And if there are different needs and approaches in different enterprises, it stands to reason technology platforms must become increasingly customizable. Oracle Social Cloud sees that coming and is doing something about it, at least in terms of social media management. Today we introduce Social Station, a customizable user experience workspace within the Oracle Social Relationship Management (SRM) platform. We think a lot about customer-centricity and customer experience around here, and we know our own customers are ready to start moving forward in being able to set up their work environments in the ways that work best for them. That kind of thing increases productivity, helps deliver on social objectives faster, and generally just makes life more pleasant. A recent IDG Enterprise report says that enterprises currently investing in more consumerized, easy-to-use technologies experience a 56% increase in employee productivity and a 46% increase in customer satisfaction. Imagine that. When you make it easier and more pleasant for employees to help customers, more customers get helped and everyone ends up happier. So what does this Social Station do and what does it mean, exactly? It’s an innovative move to take some pretty high-end tech (take a bow developers) and simplify it, making things more intuitive: Drag and drop lets you easily build out and personalize your social workspace with different modules. The new Custom Analytics module can mix and match over 120 metrics with thousands of customizable reporting options. You can check constantly refreshed updates and keep a real-time eye on the numbers you’re trying to move. One-click sharing and annotation in the Custom Analytics module improves sharing and collaboration across teams, departments and executives. Multi-view layout helps you leverage social insights by letting you monitor conversations by network, stream, metric, graph type, date range, and relative time period. The Enhanced Calendar is a better visual representation of content, posts, networks and views, letting you easily toggle between functions and views. The Oracle Social Station sets us up to always be developing & launching additional social modules for you, covering areas like content curation, influencer engagement, and command center creation. Oracle Social Cloud Group VP Meg Bear says, “Consumers today have high expectations of their technology application capabilities and usability, and those expectations don’t stop when they enter their workplaces.” In other words, internal enterprise technology platforms must reflect the personalization and customization being called for in consumer products and marketing. “One size fits all” is becoming an endangered concept. @mikestiles @oraclesocial

    Read the article

  • Defining Discovery: Core Concepts

    - by Joe Lamantia
    Discovery tools have had a referenceable working definition since at least 2001, when Ben Shneiderman published 'Inventing Discovery Tools: Combining Information Visualization with Data Mining'.  Dr. Shneiderman suggested the combination of the two distinct fields of data mining and information visualization could manifest as a new category of tools for discovery, an understanding that remains essentially unaltered over ten years later.  An industry analyst report titled Visual Discovery Tools: Market Segmentation and Product Positioning from March of this year, for example, reads, "Visual discovery tools are designed for visual data exploration, analysis and lightweight data mining." Tools should follow from the activities people undertake (a foundational tenet of activity-centered design), however, and Dr. Shneiderman does not in fact describe or define discovery activity or capability. As I read it, discovery is assumed to be the implied sum of the separate fields of visualization and data mining as they were then understood.  As a working definition that catalyzes a field of product prototyping, it's adequate in the short term.  In the long term, it makes the boundaries of discovery both derived and temporary, and leaves a substantial gap in the landscape of core concepts around discovery, making consensus on the nature of most aspects of discovery difficult or impossible to reach.  I think this definitional gap is a major reason that discovery is still an ambiguous product landscape. To help close that gap, I'm suggesting a few definitions of four core aspects of discovery.  These come out of our sustained research into discovery needs and practices, and have the goal of clarifying the relationship between discovery and other analytical categories.  They are suggestions, but they should be internally coherent and consistent.   Discovery activity is: "Purposeful sense making activity that intends to arrive at new insights and understanding through exploration and analysis (and for these we have specific definitions as well) of all types and sources of data." Discovery capability is: "The ability of people and organizations to purposefully realize valuable insights that address the full spectrum of business questions and problems by engaging effectively with all types and sources of data." Discovery tools: "Enhance individual and organizational ability to realize novel insights by augmenting and accelerating human sense making to allow engagement with all types of data at all useful scales." Discovery environments: "Enable organizations to undertake effective discovery efforts for all business purposes and perspectives, in an empirical and cooperative fashion." Note: applicability to a world of Big Data is assumed - thus the references to all scales / types / sources - rather than stated explicitly.  I like that Big Data doesn't have to be written into this core set of definitions, because I think it's a transitional label - the new version of Web 2.0 - and will go away over time. References and Resources: Inventing Discovery Tools Visual Discovery Tools: Market Segmentation and Product Positioning Logic versus usage: the case for activity-centered design A Taxonomy of Enterprise Search and Discovery

    Read the article

  • ERP in a Flash! Latest News on JD Edwards and Oracle VM Templates

    - by Kem Butler-Oracle
    Oracle Announces the Availability of Oracle VM Templates for JD Edwards EnterpriseOne 9.1 Update 2 and Tools 9.1 Update 4.4. Continuing the commitment to rapid and predictable deployments of JD Edwards EnterpriseOne, Oracle announces the general availability of Oracle VM templates for JD Edwards EnterpriseOne Application release 9.1 Update 2 and Tools release 9.1 Update 4.4. These templates can be used with Oracle VM for x86, on the Oracle Exalogic Elastic Cloud, and on the Oracle Database Machine. Oracle VM Templates for JD Edwards EnterpriseOne accelerate the process of setting up a working environment compared to the traditional installation process. The templates can be a key component of a well-managed cloud infrastructure, allowing system administrators to quickly provision fully functional JD Edwards EnterpriseOne environments for evaluation, development, or production use. The templates contain preconfigured images of the major JD Edwards EnterpriseOne server components, including: • Enterprise server • HTML server • Database server • BI Publisher (for use with One View Reporting) • Business Services Server and ADF Runtime (for use with Mobile Smartphone Applications) • Application Interface Services (new with this release, for use with Mobile Enterprise Applications) • Server Manager (new with this release) The virtual server images are built on a complete Oracle technology stack, including Oracle VM for x86, Oracle Linux, Oracle WebLogic Server, Oracle Database, and Oracle Business Intelligence Publisher. The templates can be installed into an Oracle VM for x86 system running on standard x86 servers, the Oracle Exalogic Elastic Cloud, and the Oracle Database Appliance as a composite “all-in-one” system. The database can be deployed as a fully preconfigured VM template, or it can be deployed to a preexisting database server, for example, the Oracle Exadata Database Machine or the Oracle Database Appliance. This latest set of templates includes the following applications and technology components: • JD Edwards EnterpriseOne Applications Release 9.1 Update 2 with ESUs as of April 8, 2014 • JD Edwards EnterpriseOne Tools 9.1 Update 4, maintenance pack 4 (9.1.4.4) • Oracle Database 12c (12.1.0.1) • Oracle WebLogic Server 12c (12.1.2) • Oracle Linux 5 Update 8, 64-bit • Oracle Business Intelligence Publisher 11.1.1.7.1, for use with JD Edwards EnterpriseOne One View Reporting • JD Edwards EnterpriseOne Business Services Server and Oracle Application Development Framework (ADF) 11.1.1.5, for use with the JD Edwards EnterpriseOne Mobile Applications. The delivery also includes a JD Edwards EnterpriseOne deployment server preconfigured to match the content of the templates. This edition of the templates also includes enhanced configuration utilities that greatly simplify the process of configuring the templates for deployment into a running system. The templates are immediately available for download from the Oracle Software Delivery Cloud. For more information see: • My Oracle Support article 884592.1 • Oracle Technology Network

    Read the article

  • Working with Visual Studio Web Development Server and IE6 in XP Mode on Windows 7

    - by Igor Milovanovic
    (Brian Reiter from thoughtful computing has described this setup in this StackOverflow thread. The credit for the idea is entirely his; I have just extended it with some step-by-step descriptions and added some links and screenshots.) If you are forced to still support Internet Explorer 6, you can set up the following combination on your machine to make development for it less painful. A common problem if you are developing on Windows 7 is that you can’t install IE6 on your machine. (Not that you want that anyway.) So you will probably end up working locally with IE8 and FF, and test your IE6 compatibility on a separate machine. This can get quite annoying, because you will have to maintain two different development environments, not have all the tools available, etc. You can help yourself by installing IE6 in Windows 7 XP Mode, which is basically just a Windows XP installation running in a virtual machine. [1] Windows XP Mode installation. After you have installed and configured your XP Mode (remember the security settings like Windows Update and antivirus software), you can add the shortcut to IE6 in the virtual machine to the “all users” start menu. This shortcut will be replicated to your Windows 7 XP Mode start menu, and you will be able to seamlessly start your IE6 as a normal window on your Windows 7 desktop. [2] Configure IE6 for the Windows 7 installation. If you configure your XP Mode to use (Shared Networking) NAT, you can now use IE6 to browse sites on the internet. (Add proxy settings to IE6 if necessary.) The problem now is that you can’t connect to the webdev server which is running on your local machine. This is because the web development server is crippled to allow only local connections for security reasons. In order to trick webdev into believing that the requests are coming from the local machine itself, you can use a lightweight proxy like Privoxy on your host (Windows 7) machine and configure the IE6 running in the virtual machine to use it. The first step is to make the host machine (running Windows 7) reachable from the virtual machine (running XP). In order to do that, you can install the loopback adapter and configure it to use an IP which is routable from the virtual machine; in the example screenshot, 192.168.1.66. [3] How to install the loopback adapter in Windows 7. After installation you can assign a static IP which is routable from the virtual machine (in the example, 192.168.1.66). The next step is to configure Privoxy to listen on that IP address (using an unused port, for example the default port 8118). Change the following line in config.txt:

      #
      #      Suppose you are running Privoxy on an IPv6-capable machine and
      #      you want it to listen on the IPv6 address of the loopback device:
      #
      #        listen-address [::1]:8118
      #
      #
      listen-address  192.168.1.66:8118

    The last step is to configure IE6 to use the Privoxy instance running on your Windows 7 host machine as the proxy for all addresses (including localhost). And now you can use your Windows 7 XP Mode IE6 to connect to your Visual Studio webdev web server. [4] http://stackoverflow.com/questions/683151/connect-remotely-to-webdev-webserver-exe
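
    As an optional sanity check (not part of the original post), a short Python sketch run inside the XP Mode virtual machine can confirm that requests routed through Privoxy reach the host. It assumes the example loopback IP 192.168.1.66 and default Privoxy port 8118 from the walkthrough above; the webdev port and page are placeholders only.

      import urllib.request

      # Assumed values from the example above: the loopback adapter IP on the
      # Windows 7 host and the port Privoxy listens on. The webdev port/path
      # are placeholders -- use whatever Visual Studio's web server reports.
      PROXY = "http://192.168.1.66:8118"
      TEST_URL = "http://localhost:52000/Default.aspx"

      # Route all HTTP requests (including localhost) through Privoxy on the
      # host, mirroring the IE6 proxy configuration described above.
      opener = urllib.request.build_opener(
          urllib.request.ProxyHandler({"http": PROXY})
      )

      try:
          with opener.open(TEST_URL, timeout=10) as response:
              print("Reached webdev through the proxy:", response.status)
      except OSError as err:
          print("Proxy or webdev server not reachable:", err)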

    Read the article

  • ATG Live Webcast April 5: Managing Your Oracle E-Business Suite with Oracle Enterprise Manager

    - by BillSawyer
    The next ATG Live Webcast covers one of the hottest topic areas in E-Business Suite Tools and Technology: Lifecycle Management. Angelo Rosado, Product Manager, ATG Development, will lead you through using Oracle Enterprise Manager 12c and the latest E-Business Suite Plug-in to manage E-Business Suite systems. You can register for the Apr. 5, 2012 event at: Managing Your Oracle E-Business Suite with Oracle Enterprise Manager. The topics covered in this webcast will be: Manage your EBS system configurations; Monitor your EBS environment's performance and uptime; Keep multiple EBS environments in sync with their patches and configurations; Create patches for your EBS customizations and apply them with Oracle's own patching tools. Date: Thursday, April 5, 2012. Time: 8:00 AM - 9:00 AM Pacific Standard Time. Presenter: Angelo Rosado, Product Manager, ATG Development. Webcast Registration Link (Preregistration is optional but encouraged). To hear the audio feed: Domestic Participant Dial-In Number: 877-697-8128; International Participant Dial-In Number: 706-634-9568; Additional International Dial-In Numbers Link; Dial-In Passcode: 99342. To see the presentation, the Direct Access Web Conference details are: Website URL: https://ouweb.webex.com; Meeting Number: 597073984. If you miss the webcast, or you have missed any webcast, don't worry -- we'll post links to the recording as soon as it's available from Oracle University. You can monitor this blog for pointers to the replay. And you can find our archive of our past webcasts and training here. If you have any questions or comments, feel free to email Bill Sawyer (Senior Manager, Applications Technology Curriculum) at BilldotSawyer-AT-Oracle-DOT-com.

    Read the article

  • Big Data – Interacting with Hadoop – What is PIG? – What is PIG Latin? – Day 16 of 21

    - by Pinal Dave
    In yesterday’s blog post we learned the importance of HIVE in the Big Data story. In this article we will understand what PIG and PIG Latin are in the Big Data story. Yahoo started working on Pig for their application deployment on Hadoop. The goal of Yahoo was to manage their unstructured data. What is Pig and What is Pig Latin? Pig is a high-level platform for creating MapReduce programs used with Hadoop, and the language we use for this platform is called PIG Latin. Pig was designed to make Hadoop more user-friendly and approachable by power users and non-developers. PIG is an interactive execution environment supporting the Pig Latin language. The Pig Latin language supports loading and processing of input data with a series of transformations to produce the desired results. PIG has two different execution environments: 1) Local Mode – in this case all the scripts run on a single machine. 2) Hadoop Mode – in this case all the scripts run on a Hadoop cluster. Pig Latin vs SQL Pig essentially creates a set of map and reduce jobs under the hood. Because of this, users do not have to write, compile, and build their own solutions for Big Data. Pig is very similar to SQL in many ways. The Pig Latin language provides an abstraction layer over the data. It focuses on the data and not the structure under the hood. Pig Latin is a very powerful language and it can do various operations like loading and storing data, streaming data, and filtering data, as well as various data operations related to strings. The major difference between SQL and Pig Latin is that PIG is procedural and SQL is declarative. In simpler words, Pig Latin is very similar to a SQL execution plan, and that makes it much easier for programmers to build various processes. Whereas SQL handles trees naturally, Pig Latin follows a directed acyclic graph (DAG). DAGs are used to model several different kinds of structures in mathematics and computer science. Tomorrow In tomorrow’s blog post we will discuss a very important component of the Big Data Ecosystem – Zookeeper. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL
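
    To make the "map and reduce jobs under the hood" point concrete, here is a minimal, hypothetical Python sketch (not from the original post) of how a Pig-style LOAD, FILTER, GROUP, and COUNT data flow decomposes into a map phase and a reduce phase; the sample data and field names are invented for illustration.

      from collections import defaultdict

      # Invented sample input: "user action" records, standing in for data a
      # Pig script would LOAD from HDFS.
      RAW_LINES = [
          "alice view", "bob click", "alice click",
          "carol view", "bob click", "alice view",
      ]

      def map_phase(lines):
          """FILTER by action and project the grouping key, emitting (key, 1) pairs."""
          for line in lines:
              user, action = line.split()
              if action == "click":          # FILTER raw BY action == 'click'
                  yield user, 1

      def reduce_phase(pairs):
          """GROUP by key and COUNT, like Pig's GROUP ... BY user / COUNT(...)."""
          counts = defaultdict(int)
          for key, value in pairs:
              counts[key] += value
          return dict(counts)

      if __name__ == "__main__":
          # A real Pig script compiles to Hadoop jobs; here the two phases
          # simply run in-process to show the shape of the data flow.
          print(reduce_phase(map_phase(RAW_LINES)))   # {'bob': 2, 'alice': 1}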

    Read the article

  • Change default language settings in Visual Studio 2012

    - by sreejukg
    The first thing you need to do after the installation of Visual Studio 2012 is to choose the IDE preferences. Once you select your preferred collection of settings, the IDE will always choose dialogs and other options according to your selection. Nowadays developers need to work with different programming environments, and because of this they might need to reset the default settings. In this article, I am going to demonstrate how you can change the default settings in Visual Studio 2012. For the purpose of this demonstration, I have installed Visual Studio 2012 and selected C++ as my default environment settings. So now when I go to File -> New Project, it will give me C++ templates by default as follows. If you want to select another language, you need to expand the Other Languages section and select C# or VB. Now I am going to change these default settings. I am going to change the default language preference to C#. In Visual Studio 2012, go to the Tools menu and select Import and Export Settings. Here you have three options: one is to export the current settings so that they are saved for future use; you can also import previously saved settings; and the last option is to reset the settings to the defaults. It is a good idea to export your settings so that you can import them as needed at a later stage. To reset the settings to the defaults, select the Reset option and click Next. Now Visual Studio will ask you whether you would like to save the current settings, so that they can be restored in the future. Select an option and click Next. For the purpose of this demo, I have selected not to save the settings. Click the Next button to continue. Now Visual Studio will bring up the same dialog that appears just after installation, where you select your IDE settings. Select the required settings from the available list and click the Finish button. If everything is OK, you will see the success message as below. Now go to File -> New Project, and you will see the selected language appear by default. I selected C# in the previous step and the new project dialog appears as follows. Changing IDE settings in Visual Studio 2012 is very easy and straightforward.

    Read the article

  • How do I Integrate Production Database Hot Fixes into Shared Database Development model?

    - by TetonSig
    We are using SQL Source Control 3, SQL Compare, and SQL Data Compare from RedGate, Mercurial repositories, TeamCity, and a set of 4 environments including production. I am working on getting us to a dedicated environment per developer, but for at least the next 6 months we are stuck with a shared model. To summarize our current system: we have a DEV SQL Server where developers first make changes/additions. They commit their changes through SQL Source Control to a local hgdev repository. When they execute an hg push to the main repository, TeamCity listens for that and then (among other things) pushes the hgdev repository to hgrc. Another TeamCity process listens for that and does a pull from hgrc and deploys the latest to a QA SQL Server where regression and integration tests are run. When those pass, a push from hgrc to hgprod occurs. We do a compare of hgprod to our PREPROD SQL Server and generate deployment/rollback scripts for our production release. Separate from the above, we have database hot fixes that will need to be applied in between releases. The process there is for our Operations team to make changes on the PREPROD database and then, after testing, use SQL Source Control to commit their hot fix changes to hgprod from the PREPROD database, then do a compare from hgprod to PRODUCTION, create deployment scripts, and run them on PRODUCTION. If we were in a dedicated-database-per-developer model, we could simply push hgprod back to hgdev automatically and merge in the hot fix change (through TeamCity monitoring for hgprod check-ins), and then developers would pick it up and merge it to their local repository and database periodically. However, given that with a shared model the DEV database itself is the source of all changes, this won't work. Pushing hotfixes back to hgdev will show up in SQL Source Control as being different from the DEV SQL Server, and therefore we would need to overwrite the repository with the "change" from the DEV SQL Server. My only workaround so far is to just have OPS assign a developer the hotfix ticket with a script attached, and then we run their hotfixes against DEV ourselves to merge them back in. I'm not happy with that solution. Other than working faster to get to a dedicated environment, are there other ways to keep this loop going automatically?
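
    For what it's worth, the manual workaround described above (running an OPS-attached hotfix script against the shared DEV server so it can be committed back through SQL Source Control) could itself be scripted, for example as a small Python step that TeamCity runs. A minimal sketch follows; the server and database names are placeholders, not anything from the actual setup described in the question.

      import subprocess
      import sys

      # Placeholder names, not the poster's real environment.
      DEV_SERVER = "DEVSQL01"
      DEV_DATABASE = "AppDb"

      def apply_hotfix(script_path):
          """Apply an OPS-supplied .sql hotfix to the shared DEV database via sqlcmd."""
          subprocess.run(
              ["sqlcmd",
               "-S", DEV_SERVER,       # target server
               "-d", DEV_DATABASE,     # target database
               "-b",                   # return a non-zero exit code on SQL errors
               "-i", script_path],     # the attached hotfix script
              check=True,              # fail the build step if sqlcmd fails
          )

      if __name__ == "__main__":
          # e.g. python apply_hotfix.py hotfix_1234.sql
          apply_hotfix(sys.argv[1])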

    Read the article

  • IT Admin for Thrill Seekers

    - by Tony Davis
    A developer suggested to me recently that the life of the DBA was, surely, a dull one. My first reaction was indignation, but quickly followed by the thought that for many people excitement isn't necessarily the most desirable aspect of their job. It's true that some aspects of the DBA role seem guaranteed to quieten the pulse; in the days of tape backups, time must have slowed to eternity for the person whose job it was to oversee this process, placing tapes into secure containers, ensuring correct labeling, and… sorry, I drifted off there for a second. On the other hand, if you follow the adventures of the likes of Brent Ozar or Tom LaRock, you'd be forgiven for thinking that much of a database guy's time is spent, metaphorically, diving through plate glass windows in tight fitting underwear in order to extract grateful occupants from burning database applications. Alas it isn't true of the majority, but it isn't as dull as some people imagine, and is a helter-skelter ride compared with some other IT roles. Every IT department has people who toil away in shadowy corners doing quiet but mysterious tasks. When you ask them to explain what they do, you almost immediately want them to stop, but you hear enough to appreciate that these tasks are often absolutely vital to the smooth functioning of an IT organization. Compared with them, the DBAs are prima donnas. Here are a few nominations: Installation engineer - install all of the company's laptops and workstations, and software, deal with licensing, shipping and data entry… many organizations, especially those subject to tight regulation, would simply grind to a halt without their efforts. Localization engineer - Not quite software engineering, not quite translation, the job is to rebuild a product in a different language and make sure everything still works. QA Tester - firstly, I should say that the testers at Red Gate seem to me some of the most-fulfilled in the company. I refer here to the QA Tester whose job is more-or-less entirely to read a script, click some buttons and make sure the actual and expected values match. Configuration manager - for example, someone whose main job is to configure build environments so that devs can access their source code; assuredly necessary for the smooth functioning and productivity of the team, and hopefully well-paid. So what other sort of job in IT should one choose if the work of a DBA proves to be too exciting? Or are these roles secretly more exciting than many imagine? I invite you all to put forward your own suggestions. Cheers, Tony.

    Read the article

  • Oracle Systems and Solutions at OpenWorld Tokyo 2012

    - by ferhat
    Oracle OpenWorld Tokyo and JavaOne Tokyo will start next week, on April 4th. We will cover Oracle systems and Oracle Optimized Solutions in several keynote talks and general sessions. The full schedule can be found here. Come by the DemoGrounds to learn more about mission-critical integration and optimization of the complete Oracle stack. Our Oracle Optimized Solutions experts will be on hand to discuss one-on-one several of Oracle's systems solutions and technologies. Oracle Optimized Solutions are proven blueprints that eliminate integration guesswork by combining best-in-class hardware and software components to deliver complete system architectures that are fully tested, and include documented best practices that reduce integration risks and deliver better application performance. And because they are highly flexible by design, Oracle Optimized Solutions can be implemented as an end-to-end solution or easily adapted into existing environments.
    Oracle Optimized Solutions, Servers, Storage, and Oracle Solaris - Sessions, Keynotes, and General Session Talks
    Wednesday, April 4
      9:00 - 11:15 | Keynote: ENGINEERED FOR INNOVATION - Engineered Systems | Mark Hurd, President, Oracle; Takao Endo, President & CEO, Oracle Corporation Japan; John Fowler, EVP of Systems, Oracle; Ed Screven, Chief Corporate Architect, Oracle | English Session | K1-01
      11:50 - 12:35 | Simplifying IT: Transforming the Data Center with Oracle's Engineered Systems | Robert Shimp, Group VP, Product Marketing, Oracle | English Session | S1-01
      15:20 - 16:05 | Introducing Tiered Storage Solution for low cost Big Data Archiving | S1-33
      16:30 - 17:15 | Simplifying IT - IT System Consolidation that also Accelerates Business Agility | S1-42
    Thursday, April 5
      9:30 - 11:15 | Keynote: Extreme Innovation | Larry Ellison, Chief Executive Officer, Oracle | English Session | K2-01
      11:50 - 13:20 | General Session: Server and Storage Systems Strategy | John Fowler, EVP of Systems, Oracle | English Session | G2-01
      16:30 - 17:15 | Top 5 Reasons why ZFS Storage appliance is "The cloud storage", by SAKURA Internet Inc. | L2-04
      16:30 - 17:15 | The UNIX based Exa* Performance IT Integration Platform - SPARC SuperCluster | S2-42
      17:40 - 18:25 | Full stack solutions of hardware and software with SPARC SuperCluster and Oracle E-Business Suite to minimize the business cost while maximizing the agility, performance, and availability | S2-53
    Friday, April 6
      9:30 - 11:15 | Keynote: Oracle Fusion Applications & Cloud | Robert Shimp, Group VP, Product Marketing; Anthony Lye, Senior VP | English Session | K3-01
      11:50 - 12:35 | IT at Oracle: The Art of IT Transformation to Enable Business Growth | English Session | S3-02
      13:00 - 13:45 | ZFS Storage Appliance: Architecture of high efficient and high performance | S3-13
      14:10 - 14:55 | Why "Niko Niko doga" chose ZFS Storage Appliance to support their growing requirements and storage infrastructure, by DWANGO Co, Ltd. | S3-21
      15:20 - 16:05 | Osaka University: Lower TCO and higher flexibility for student study by Virtual Desktop, by Osaka University | S3-33
    Oracle Developer Sessions with Oracle Systems and Oracle Solaris (all on Friday, April 6)
      13:00 - 13:45 | Oracle Solaris 11 Developers | D3-03
      13:00 - 14:30 | Oracle Solaris Tuning Contest | Hands-On Lab | D3-04
      14:00 - 14:35 | How to build high performance and high security Oracle Database environment with Oracle SPARC/Solaris | English Session | D3-13
      15:00 - 15:45 | IT Assets preservation and constructive migration with Oracle Solaris virtualization | D3-24
      16:00 - 17:30 | The best packaging system for cloud environment - Creating an IPS package | D3-34
    Follow Oracle Infrared at Twitter, Facebook, Google+, and LinkedIn to catch the latest news, developments, announcements, and inside views from Oracle Optimized Solutions.

    Read the article

  • A Technique for Performing Cross-host Upgrades to FMW 11gR1

    - by reza.shafii
    The main tool used for the upgrade of iAS 10g mid-tier (data not stored in 10g meta-data repository schemas) environments to Fusion Middleware (FMW) 11gR1 is the FMW Upgrade Assistant (UA). This tool performs what we call an out-of-place upgrade which, in a nutshell, means the following: The upgrade is performed by pointing the UA to a 10g source topology as well as an 11g destination topology. The destination topology must be created, using the standard FMW 11g installation and configuration process, prior to the execution of the UA. The UA carries over all of the required changes from the source environment to the destination. This approach has a number of advantages rooted in the fact that the source environment - which is presumably working well and serving its needs - is not disturbed during the upgrade process as the UA only performs read-only operations on it. The UA today can only perform such out-of-place upgrades when the source and destination topologies reside on the same machine. This can sometimes be an issue when the host on which the iAS 10g environment is installed is running at full capacity and installing new hardware for the purpose of the upgrade (in most cases what would be needed is extra memory) is completely infeasible. In such cases, upgrade across a different host is still possible by using the following technique: Back up your source environment and restore it onto a target machine. The backup and restore procedures for the iAS 10.1.2 components are described within this section of the release's Administration Guide. As described in the docs, the Oracle Application Server Backup and Recovery Tool provides capabilities for backing up the installation on one machine and restoring it on another, which is exactly what you want to do for the purpose of a cross-host upgrade. Ensure that the restored environment on your target host is fully functional. Go through the upgrade steps on the target machine to perform the out-of-place upgrade using the UA. Although this process does add another big step to the overall upgrade process, it does make it possible to perform a cross-host upgrade to 11gR1 when necessary. The easiest approach would of course be to find a way of ensuring that the required hardware capacity for the upgrade is available on the original 10g host. Using techniques such as scheduling the upgrade at low-traffic times and/or temporarily stopping other processes running on the machine to clear up some memory might provide the memory needed to perform the out-of-place upgrade and save you from needing the backup/restore technique I have described in this post.

    Read the article

  • What does your Python development workbench look like?

    - by Fabian Fagerholm
    First, a scene-setter to this question: Several questions on this site have to do with selection and comparison of Python IDEs. (The top one currently is What IDE to use for Python). In the answers you can see that many Python programmers use simple text editors, many use sophisticated text editors, and many use a variety of what I would call "actual" integrated development environments – a single program in which all development is done: managing project files, interfacing with a version control system, writing code, refactoring code, making build configurations, writing and executing tests, "drawing" GUIs, and so on. Through its GUI, an IDE supports different kinds of workflows to accomplish different tasks during the journey of writing a program or making changes to an existing one. The exact features vary, but a good IDE has sensible workflows and automates things to let the programmer concentrate on the creative parts of writing software. The non-IDE way of writing large programs relies on a collection of tools that are typically single-purpose; they do "one thing well" as per the Unix philosophy. This "non-integrated development environment" can be thought of as a workbench, supported by the OS and generic interaction through a text or graphical shell. The programmer creates workflows in their mind (or in a wiki?), automates parts and builds a personal workbench, often gradually and as experience accumulates. The learning curve is often steeper than with an IDE, but those who have taken the time to do this can often claim deeper understanding of their tools. (Whether they are better programmers is not part of this question.) With advanced editor-platforms like Emacs, the pieces can be integrated into a whole, while with simpler editors like gedit or TextMate, the shell/terminal is typically the "command center" to drive the workbench. Sometimes people extend an existing IDE to suit their needs. What does your Python development workbench look like? What workflows have you developed and how do they work? For the first question, please give the main "driving" program – the one that you use to control the rest (Emacs, shell, etc.) the "small tools" -- the programs you reach for when doing different tasks For the second question, please describe what the goal of the workflow is (eg. "set up a new project" or "doing initial code design" or "adding a feature" or "executing tests") what steps are in the workflow and what commands you run for each step (eg. in the shell or in Emacs) Also, please describe the context of your work: do you write small one-off scripts, do you do web development (with what framework?), do you write data-munching applications (what kind of data and for what purpose), do you do scientific computing, desktop apps, or something else? Note: A good answer addresses the perspectives above – it doesn't just list a bunch of tools. It will typically be a long answer, not a short one, and will take some thinking to produce; maybe even observing yourself working.

    Read the article

  • Steps to deploying on Windows Azure

    - by Vincent Grondin
    Alright, these steps might be a little detailed and a few might not be necessary, but it's still a pretty accurate road map to deploying on Azure...
    1) Open your solution.
    2) Rebuild ALL.
    3) Right-click on your Azure project and click "Publish".
    4) It should open a Windows Explorer window with your package to be uploaded (.cspkg) and its associated configuration (.cscfg) to be uploaded too. Keep it open, you'll need that path later on...
    5) It should also open a browser asking you to log in to your Passport account, please do so.
    6) After this you will be redirected to the Azure Portal where you will see your Azure Project Name below the « Project Name » section. Click on it.
    7) Then you should be redirected to a detailed view of your account on Azure where you will create a new service by clicking the hyperlink on the top right corner.
    8) Choose the right service type for you, most likely the "Hosted Service" type.
    9) Choose a « Label » name and click « Next ».
    10) Choose a name for your service and validate that the name is available in the cloud by clicking the "Check Availability" button.
    11) At the bottom of this same page, you can choose to create a group for your service, use no group, or join an existing group. Creating a group means that all applications that belong to the same group will see no cost for exchanging data with other applications of the same group. Most of the time when you create a single application, creating a group is not necessary. You should choose a region that's close to your own region.
    12) On the next window, you should see a "Production" environment and a "Staging" environment. Beware, because "Staging" and "Production" are two different environments in the cloud, and applications in "Staging", even when not running, do continue to rack up charges... Choose an environment and click "Deploy".
    13) In the following window, browse to the path where your cspkg resides and then do the same thing with your cscfg file. Choose a name for your Label, and click "Deploy"...
    14) From now on, the clock is ticking and unless you have free Azure hours, your credit card is being billed…
    15) Click on the « Run » button to start your application.
    16) Be patient.... be very patient…
    17) Once your application has finished starting, you should see a GREEN circle on the left side of the screen indicating that your application is READY. Click the URL to test your application and remember that if your application is a service, you have to hit the "svc" class behind the link you see there. Something like http://testvince2.cloudapp.net/service1.svc (this is a fictional link).
    18) Hopefully your application will show up or, in the case of a service, you will see your service's WSDL, meaning that everything is working fine.
    Happy cloud computing all!

    Read the article

  • Oracle VM Templates for EBS 12.1.3 for Exalogic Now Available

    - by Elke Phelps (Oracle Development)
    Oracle VM Templates for Oracle E-Business Suite 12.1.3 for x86 Exalogic Platform (64 bit) are now available on the Oracle Software Delivery Cloud.  The templates contain all the required elements to create an Oracle E-Business Suite R12 demonstration system on an Exalogic server. You can use these templates to quickly build an EBS 12.1.3 demonstration environment, bypassing the operating system installation and the software install (via the EBS Rapid Install).   The Oracle E-Business Suite Release 12.1.3 (64 bit) template for the Exalogic platform is an Oracle Virtual Server Guest template that contains a complete Oracle E-Business Suite Release 12.1.3 Database Tier and Application Tier Installation.  For additional details, please refer to the following My Oracle Support Note: Oracle E-Business Suite Release 12.1.3 Database Tier and Application Tier Template for Oracle Exalogic Platform (Note 1499132.1) The Oracle E-Business Suite system is installed on top of Oracle Linux Version 5 Update 6. The templates have been optimized for performance, including OS kernel settings and E-Business Suite configuration settings tuned specifically for the Exalogic platform.  The configuration delivered with this template for a mid-tier running on Exalogic will support hundreds of concurrent users.  Please refer to Section 2: Performance Analysis in My Oracle Support Note 1499132.1 for additional details.   Additional Information The Oracle E-Business Suite VM templates for the Exalogic platform contain the following software versions: Operating System: Oracle Linux Version 5 Update 6 Oracle E-Business Suite 12.1.3 (Database Tier) Oracle E-Business Suite 12.1.3 (Application Tier) The following considerations were made when the Oracle E-Business Suite VM template for the Exalogic platform was designed: Templates use the hardware-virtualized architecture, supporting hardware with the virtualization feature. The Database Tier Template is configured to use the following configuration: 16 GB RAM, 4 VCPUs, 250 GB of disk space for application installation. The Application Tier Template is configured to use the following configuration: 16 GB RAM, 4 VCPUs, 50 GB of disk space for application installation. References Oracle E-Business Suite Release 12.1.3 Database Tier and Application Tier Template for Oracle Exalogic Platform (Note 1499132.1) Related Articles Part 1: E-Business Suite 12.1.1 Templates for Oracle VM Now Available Part 2: Using Oracle VM with Oracle E-Business Suite Virtualization Kit Part 3: On Clouds and Virtualization in EBS Environments (OpenWorld 2009 Recap) Part 4: Deploying E-Business Suite on Amazon Web Services Elastic Compute Cloud Part 5: Live Migration of EBS Services Using Oracle VM Support Policies for Virtualization Technologies and Oracle E-Business Suite Virtualization and the E-Business Suite, Redux Virtualization and E-Business Suite

    Read the article

  • ArchBeat Top 10 for November 11-17, 2012

    - by Bob Rhubart
    The Top 10 most popular items shared on the OTN ArchBeat Facebook page for the week of November 11-17, 2012. Developing and Enforcing a BYOD Policy Darin Pendergraft's post includes links to a recent Mobile Access Policy Survey by SANS as well as registration information for a Nov 15 webcast featuring security expert Tony DeLaGrange from Secure Ideas, SANS instructor, attorney and technology law expert Ben Wright, and Oracle IDM product manager Lee Howarth. This Week on the OTN Architect Community Homepage Make time to check out this week's features on the OTN Solution Architect Homepage, including: SOA Practitioner Guide: Identifying and Discovering Services Technical article by Yuli Vasiliev on Setting Up, Configuring, and Using an Oracle WebLogic Server Cluster The conclusion of the 3-part OTN ArchBeat Podcast on Future-Proofing your career. WLST Starting and Stopping a WebLogic Environment | Rene van Wijk Oracle ACE Rene van Wijk explores how to start a server with as little input as possible. Cloud Integration White Paper | Bruce Tierney Bruce Tierney shares an overview of Cloud Integration - A Comprehensive Solution, a new white paper he co-authored with David Baum, Rajesh Raheja, Bruce Tierney, and Vijay Pawar. X.509 Certificate Revocation Checking Using OCSP protocol with Oracle WebLogic Server 12c | Abhijit Patil Abhijit Patil's article focuses on how to use X.509 Certificate Revocation Checking Functionality with the OCSP protocol to validate in-bound certificates. Although this article focuses on inbound OCSP validation using OCSP, Oracle WebLogic Server 12c also supports outbound OCSP validation. Update on My OBIEE / Exalytics Books | Mark Rittman Oracle ACE Director Mark Rittman shares several resources related to his books Oracle Business Intelligence 11g Developers Guide and Oracle Exalytics Revealed, including a podcast interview with Oracle's Paul Rodwick. E-Business Suite 12.1.3 Data Masking Certified with Enterprise Manager 12c | Elke Phelps "You can use the Oracle Data Masking Pack with Oracle Enterprise Manager Grid Control 12c to scramble sensitive data in cloned E-Business Suite environments," reports Elke Phelps. There's a lot more information about this announcement in Elke's post. WebLogic Application Server: free for developers! | Bruno Borges Java blogger Bruno Borges shares news about important changes in the license agreement for Oracle WebLogic Server. Agile Architecture | David Sprott "There is ample evidence that Agile Architecture is a primary contributor to business agility, yet we do not have a well understood architecture management system that integrates with Agile methods," observes David Sprott in this extensive post. My iPad & This Cloud Thing | Floyd Teter Oracle ACE Director Floyd Teter explains why the Cloud is making it possible for him to use his iPad for tasks previously relegated to his laptop, and why this same scenario is likely to play out for a great many people. Thought for the Day "In programming, the hard part isn't solving problems, but deciding what problems to solve." — Paul Graham Source: SoftwareQuotes.com

    Read the article

  • Taking the fear out of a Cloud initiative through the use of security tools

    - by user736511
    Typical employees, constituents, and business owners interact with online services at a level where their knowledge of back-end systems is low, and most of the time there is no interest in knowing the systems' architecture.  Most application administrators, while partially responsible for these systems' upkeep, have very few interactions with them, at least at an operational, platform level.  Of greatest interest to these groups is the consistent, reliable, and manageable operation of the interfaces with which they communicate.  Introducing the "Cloud" topic in any evolving architecture automatically raises concerns for data and identity security, simply because of the perception that unless they own the silicon, enterprises are not able to manage its content.  But is this really true?  In the majority of traditional architectures, data and the applications that access it are physically distant from the organization that owns them.  They may reside in a shared data center, or a geographically convenient location that spans large organizations' connectivity capabilities.  In the end, very often, the model of a "traditional" architecture is fairly close to the "new" Cloud architecture.  The most notable difference is that, by nature, a Cloud setup uses security as a core function, and not as a necessary add-on. Therefore, following best practices, one can say that data can be safer in the Cloud than in traditional, stove-piped environments where data access is segmented and difficult to audit. The caveat is, of course, what "best practices" consist of, and here is where Oracle's security tools are perfectly suited for the task.  Since Oracle's model is to support very large organizations, it is fundamentally concerned about distributed applications, databases, etc. and their security, and the related Identity Management products or DB Security options reflect that concept.  In the end, consumers of applications and their data are served more safely in a controlled Cloud environment, while realizing the many cost savings associated with it. Having very fast resources to serve them (such as the Exa* platform) makes the concept even more attractive.  Finally, if a Cloud strategy does not seem feasible, consider the pros and cons of a traditional vs. a Cloud architecture.  Using the exact same criteria and business goals/traditions, and with Oracle's technology, you might be hard pressed to justify maintaining the technical status quo on security alone. For additional information please visit Oracle's Cloud Security page at: http://www.oracle.com/us/technologies/cloud/cloud-security-428855.html

    Read the article

  • links for 2010-04-22

    - by Bob Rhubart
    Barry N. Perkins: Unique Business Value vs. Unique IT "Some solutions may look good today, solving a budget challenge by reducing cost, or solving a specific tactical challenge, but result in highly complex environments, that may be difficult to manage and maintain and limit the future potential of your business. Put differently, some solutions might push today's challenge into the future, resulting in a more complex and expensive solution." -- Barry N. Perkins, VP Oracle Modernization & Oracle Integrated Solutions (tags: oracle otn enterprisearchitecture modernization) Paul Homchick: The Information Driven Value Chain - Part 2 Paul Homchick continues his series with a look "at the way investments have been made in enterprise software in an effort to create and manage value, and how systems are moving from a controlled-process design approach towards dynamically gathering and using information." (tags: oracle otn enterprisearchitecture) @vambenepe: The battle of the Cloud Frameworks: Application Servers redux? "The battle of the Cloud Frameworks has started," says William Vambenepe, "and it will look a lot like the battle of the Application Servers which played out over the last decade and a half." (tags: oracle otn cloud frameworks appserver) @ORACLENERD: COLLABORATE: Day 4 Wrap Up Oraclenerd fesses up: "The day started out with the realization that I pulled off the best (COLLABORATE - self annointed) prank ever. Twitter was...all atwitter about the fact that Mark Rittman was Oracle's Person of the Year. Of course it wasn't true. If you look at the picture, you'll realize that he's wearing exactly the same clothes in the magazine cover as he is in real life." (tags: collaborate2010 oracleace) Oracle's Hal Stern at Cloud Expo: "We've Moved from 'What' to 'How'" | Cloud Computing Journal "Hal also spoke a bit about building 'a sustainable IT model.' By this, he said he didn't mean the various Green IT and similar efforts that 'are all about data center efficiency. I think the operational model is just as important. Many enterprises are managing a tremendous amount of complexity, and it's hard to make this sustainable.'" -- Cloud News Desk (tags: oracle cloud cloudexpo halstern) @ORACLENERD: COLLABORATE: The Beach Party "Then tiki statues somehow were incorporated into various dances" -- Oracle ACE Chet "oraclenerd" Justice (tags: oracle otn oracleace collaborate2010 oaug ioug lasvegas) David Andrews: Collaborate Day Two "Collaborate 2010 has focused on helping attendees understand what is already available and how to make more effective use of it. This does not sound exciting but it is extremely valuable. Most customers use only a small fraction of the capability of the products they already own. Helping them understand all the additional things they could be doing without buying anything more is very valuable." -- David Andrews (tags: oracle oaug collaborate2010 ioug)

    Read the article

  • Before the Summit of 2012

    - by Ajarn Mark Caldwell
    Today, Monday, was the first day of the PASS Summit Preconference training events, but instead I spent the day at the free SQL in the City event put on by Red Gate. For me this was not a financial decision (pre-con sessions cost extra above the general Summit registration) but rather a matter of interest.  I had already included money for pre-cons in this year’s training budget, but none of them really stood out to me, so even if the Red-Gate event were not going on at the same time, I probably would not have gone to any pre-cons this year.  However, the topics being presented at the SQL in the City event were of great interest to me.  There promised to be good information on Continuous Integration and automated deployment of database changes, which lately has been a real hot topic at my work.  And indeed, Red-Gate announced the release of a new tool (still in Early Access Program…a.k.a. Beta) which is called the Deployment Manager.  Since we are in the middle of a TFS implementation project, it will be interesting to see how this plays out and compares to what we put together with the automated builds in TFS.  But, as I understand it, the primary focus of Deployment Manager is not to be the Build process (Red Gate uses JetBrains’ Team City for that in their shop) but rather to aid in the deployment of those build packages, as well as providing easy rollback and a good visualization of which versions of software are in which environments.  It looks promising and I’ve already downloaded the installer package to play with it later. Overall, I was quite impressed with the SQL in the City event.  Having heard many current and past members of the PASS Board of Directors describe the challenges of putting on a large conference, and the growing pains that the PASS Summit has gone through, I am even more impressed that the Red Gate event ran as smoothly as it did.  And it is quite impressive the amount of money that Red Gate must have spent given that this was a no-charge event to attend, they had a very nice hot lunch, and the after-event drinks celebration.  Well done, folks! Of course it was great to hear from a variety of speakers.  Today I listened to some folks from Red Gate like Grant Fritchey (blog | @GFritchey) and David Atkinson (Product Manager for SQL Source Control and now the Deployment Manager tool set); and also Brent Ozar (blog | @BrentO) and Buck Woody (blog | @BuckWoody).  By the way, if you have never seen either Brent or Buck speak, you really should.  Different styles, but both are very entertaining and educational at the same time.  I love Buck’s sense of humor (here’s a tip…don’t be late to Buck’s session or you’ll become part of the presentation) and I praise Brent’s slides.  Brent’s style very much reminds me of that espoused by Garr Reynolds on his Presentation Zen blog (and book) and I am impressed that he can make a technical presentation so engaging. It was a great day, a great way to kick off the week, and I am excited to get into the full Summit!

    Read the article

  • “Play Now” via website vs. download & install

    - by Inside
    I've spent some time looking over the various threads here on GDSE and also on the regular Stackoverflow site, and while I saw a lot of posts and threads regarding various engines that could be used in game development, I haven't seen very much discussion regarding the various platforms that they can be used on. In particular, I'm talking about browser games vs. desktop games. I want to develop a simple 3D networked multiplayer game - roughly on the graphics level of Paper Mario and gameplay with roughly the same level of interaction as a hack & slash action/adventure game - and I'm having a hard time deciding what platform I want to target with it. I have some experience with using C++/Ogre3D and Python/Panda3D (and also some synchronized/networked programming), but I'm wondering if it's worth it to spend the extra time to learn another language and another engine/toolkit just so that the game can be played in a browser window (I'm looking at jMonkeyEngine right now). Is it worth it to go with engines that are less-mature, have less documentation, have fewer features, and smaller communities* just so that a (possibly?) larger audience can be reached? Does it make sense to even go with a web-environment for the kind of game that I want to make? Does anyone have any experiences with decisions like this? (* With the exception of Flash-based engines it seems like most of the other approaches have downsides when compared to what is available for desktop-based environments. I'd go with Flash, but I'm worried that Flash's 3D capabilities aren't mature enough right now to do what I want easily. There's also Unity3D, but I'm not sure how I feel about that at all. It seems highly polished, but requires a plugin to be downloaded for the game to be played -- at that rate I might as well have players download my game.) For simple & short games the Newgrounds approach (go to the site, click "play now", instant gratification) seems to work well. What about for more complex games? Is there a point where the complexity of a game is enough for people to say "OK, I'm going to download and play that"?

    Read the article

  • Oracle WebCenter: Social Networking & Collaboration

    - by kellsey.ruppel(at)oracle.com
    We've talked in previous weeks about how the key goals of the new release of WebCenter are providing a Modern User Experience, unparalleled Application Integration, converging all the best of the existing portal platforms into WebCenter, and delivering a Common User Experience Architecture.  We've provided an overview of Oracle WebCenter and discussed some of the other key goals in previous weeks, and this week we'll focus on how the new release of Oracle WebCenter provides unprecedented Social Networking and Collaboration. We recently talked with Carin Chan, Principal Product Manager at Oracle, about Social Networking and Collaboration. In today's workplace, employees have come to expect social and collaborative services to augment their work environment. Whether it is to post a blog or to poll fellow coworkers, employees expect and demand access to highly integrated, collaborative work environments that allow them to quickly contribute at work -- whether it is to make informed decisions, contribute on projects, or share knowledge. Social and collaborative services from Oracle WebCenter add an immeasurable amount of value to achieving a modern user experience. Oracle WebCenter Services provides rich and comprehensive social computing services, such as wikis, blogs, instant messaging, presence, activity streams and graphs, and polls/surveys, that give employees the collaborative tools to work efficiently. Employees can create pages or spaces that mix and match collaborative services while bringing in data from other applications to share with groups, teams, or organizations. These out-of-the-box social and collaborative services include:
    - People Connections and Activity Streams enable users to quickly assemble and visualize their social business networks and track user activities.
    - Activity Graphs track all user activities in real time and gather intelligence about users, their connections, and the way they use information, in order to make educated recommendations and provide on-the-spot information discovery.
    - Wikis and blogs enable community authoring of documents and sharing of ideas, and also allow for gathering feedback and comments on those ideas.
    - Tags and links allow users to easily mark, connect, and share information with others.
    - RSS feeds are available to track new or changed information related to discussion forums, processes, or activities in an Oracle WebCenter environment.
    - Discussion forums enable sharing of group knowledge and easy creation of communities around specific topics.
    - Announcements allow you to manage and publish important news to your user base.
    - Instant Messaging and Presence enable real-time awareness and communication with available users in the context of a business task.
    - Web and Voice Conferencing enables real-time communication with internal and external business users.
    - Lists provide a way to manage list data directly on the web as well as export it to and import it from Microsoft Excel.
    - Oracle WebCenter Analytics provides comprehensive reporting metrics on activity and content usage within portals or composite applications.
    - Activity Streams allow you to track activities and visualize your business networks.
    While integrating into your portal deployment, these services also fit into how users are already working, including integration with software such as Microsoft Outlook and Microsoft Office and with mobile devices such as the Apple iPhone.
These services are just the tip of the iceberg when it comes to the social and collaborative services that Oracle WebCenter has to offer your employees. Be sure to keep checking back this week; in future posts, we'll delve deeper into a few of these collaborative services and discuss how a combination of them offers a better portal deployment to empower business users. Technorati Tags: UXP, collaboration, enterprise 2.0, modern user experience, oracle, portals, webcenter, social, activity streams, blogs, wikis

    Read the article

  • The Red Gate Guide to SQL Server Team based Development Free e-book

    - by Mladen Prajdic
    After about six months of work, the new book I've coauthored with Grant Fritchey (Blog|Twitter), Phil Factor (Blog|Twitter) and Alex Kuznetsov (Blog|Twitter) is out. They're all smart folks I talk to online, and this book is packed with good ideas backed by years of experience. The book contains a good deal of information about the things you need to think about when doing any kind of multi-person database development. Although it's meant for SQL Server, the principles can be applied to any database platform out there. In the book you will find information on: writing readable code, documenting code, source control and change management, deploying code between environments, unit testing, reusing code, and searching and refactoring your code base. I wrote chapter 5, about database testing, and chapter 11, about SQL refactoring. In the database testing chapter (chapter 5) I cover why you should test your database, why it is a good idea to have a database access interface composed of stored procedures, views, and user-defined functions, and what and how to test. I talk about the many testing methods, like black- and white-box testing, unit and integration testing, and error and stress testing, and why and how you should do all of those. Sometimes you have to convince management to include testing in the development lifecycle, so I give some pointers and tips on how to do that. Testing databases is a bit different from testing object-oriented code in that, to have independent unit tests, you need to roll back your changes after each test. The chapter shows you ways to do this and also how to avoid it. At the end I show how to test various database objects and how to test access to them. In the SQL refactoring chapter (chapter 11) I cover why to refactor and where to even begin refactoring. I also show you a way to achieve a set-based mindset for solving SQL problems, which is crucial to good set-based SQL programming, and a few commonly seen problems to refactor. These problems include: using functions on columns in the WHERE clause, SELECT * problems, long stored procedures with many input parameters, one subquery per condition in the SELECT statement, the "cursors are good for anything" problem, using too-large data types everywhere, and the "using your data in code for business logic" anti-pattern. You can read more about it and download it here: The Red Gate Guide to SQL Server Team-based Development. Hope you like it, and send me feedback if you wish to.
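    To make the rollback-per-test idea from chapter 5 concrete, here is a minimal sketch of my own (not taken from the book) of a test fixture that wraps each test in an ambient transaction and rolls it back afterwards so tests stay independent; the connection string, the dbo.InsertCustomer procedure, and the dbo.Customer table are hypothetical placeholders.

        using System.Data.SqlClient;
        using System.Transactions;
        using NUnit.Framework;

        [TestFixture]
        public class CustomerProcedureTests
        {
            private TransactionScope _scope;

            [SetUp]
            public void OpenTransaction()
            {
                // Every test runs inside this ambient transaction.
                _scope = new TransactionScope();
            }

            [TearDown]
            public void RollBack()
            {
                // Disposing without calling Complete() rolls everything back,
                // leaving the database exactly as the test found it.
                _scope.Dispose();
            }

            [Test]
            public void InsertCustomer_AddsExactlyOneRow()
            {
                // Placeholder connection string and objects; adjust for your own environment.
                using (var conn = new SqlConnection(@"Server=.;Database=TestDb;Integrated Security=SSPI"))
                {
                    conn.Open(); // enlists in the ambient transaction automatically

                    using (var insert = new SqlCommand("EXEC dbo.InsertCustomer @Name = 'Test Customer'", conn))
                    {
                        insert.ExecuteNonQuery();
                    }

                    using (var count = new SqlCommand("SELECT COUNT(*) FROM dbo.Customer WHERE Name = 'Test Customer'", conn))
                    {
                        Assert.AreEqual(1, (int)count.ExecuteScalar());
                    }
                }
            }
        }

    The same pattern works with plain ADO.NET or any other test framework; the key point is that disposing the scope without calling Complete() undoes the test's changes.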

    Read the article

  • Game development: “Play Now” via website vs. download & install

    - by Inside
    Heyo, I've spent some time looking over the various threads here on GameDev and also on the regular Stack Overflow, and while I saw a lot of posts and threads regarding various engines that could be used in game development, I haven't seen much discussion regarding the various platforms they can be used on. In particular, I'm talking about browser games vs. desktop games. I want to develop a simple 3D networked multiplayer game - roughly on the graphics level of Paper Mario, with gameplay offering roughly the same level of interaction as a hack & slash action/adventure game - and I'm having a hard time deciding what platform I want to target with it. I have some experience with C++/Ogre3D and Python/Panda3D (and also some synchronized/networked programming), but I'm wondering if it's worth it to spend the extra time to learn another language and another engine/toolkit just so that the game can be played in a browser window (I'm looking at jMonkeyEngine right now). For simple & short games the Newgrounds approach (go to the site, click "play now", instant gratification) seems to work well. What about for more complex games? Is there a point where the complexity of a game is enough for people to say "OK, I'm going to download and play that"? Is it worth going with engines that are less mature, have less documentation, offer fewer features, and have smaller communities* just so that a (possibly?) larger audience can be reached? Does it even make sense to go with a web environment for the kind of game that I want to make? Does anyone have any experience with decisions like this? Thanks! (* With the exception of Flash-based engines, it seems like most of the other approaches have these downsides when compared to what is available for desktop-based environments. I'd go with Flash, but I'm worried that Flash's 3D capabilities aren't mature enough right now to do what I want easily. There's also Unity3D, but I'm not sure how I feel about that at all. It seems highly polished, but requires a plugin to be downloaded for the game to be played -- at that rate I might as well have players download my game.)

    Read the article

  • K2 4.5 Quick Thoughts

    I just finished attending a webcast on K2 4.5 and I thought I'd share a few quick thoughts.
    Power User Story Improved: Given it is just a presentation and I haven't actually played with it, the story seemed improved and more believable that real-world power users would be able to define workflows in SharePoint. Power users who are comfortable with Excel functions may be able to build some more worthwhile workflows, since there is new support for inline functions and conditions. The new Silverlight K2 designer seems pretty user friendly, though the dialog windows can really stack up, which may get confusing. I thought the neatest part was that the workflow can be defined just by starting from a SharePoint list's settings, which may be okay for some organizations and for simpler workflows that don't need to be pushed through lots of testing in different environments. The standalone K2 Studio is back. In K2 2003 it was required because Visual Studio integration didn't exist. It's back now for use by power users who need functionality up to the point of code. Not sure if this
    Administration/Installation: Installation is supposed to be simplified, with unattended install and other details I didn't catch. Install and configuration has always seemed daunting to me, so anything to improve that is good. Related to that, there is a new tool that is meant to help diagnose issues in your installation. That may include figuring out missing permissions or services that aren't running. Also, all K2 SharePoint features are now deployed as solutions.
    Dynamic SQL Service Broker: Create a smart object that goes against a table that you created, NOT the SmartBox. This seems promising and something that maybe should have been there all along.
    Reference Event: Allows you to call functionality that you've referenced; in the sample shown it was calling a web service that was referenced. It seemed odd because it was really like writing code using dialogs (call constructor, set timeout, call web service method).
    Help: We were reminded that the help.k2.com site is a newish site that is supposed to be the MSDN of K2 for partners and customers.
    VS 2010 Support: Still no hard date on this, but what we were told is approximately 90 days after VS 2010 is officially released.

    Read the article

  • Hierarchy flattening of interfaces in WCF

    - by nmarun
    Alright, so say I have my service contract interface as below:

        [ServiceContract]
        public interface ILearnWcfService
        {
            [OperationContract(Name = "AddInt")]
            int Add(int arg1, int arg2);
        }

    Say I decided to add another interface with a similar add "feature":

        [ServiceContract]
        public interface ILearnWcfServiceExtend : ILearnWcfService
        {
            [OperationContract(Name = "AddDouble")]
            double Add(double arg1, double arg2);
        }

    My class implementing ILearnWcfServiceExtend ends up as:

        public class LearnWcfService : ILearnWcfServiceExtend
        {
            public int Add(int arg1, int arg2)
            {
                return arg1 + arg2;
            }

            public double Add(double arg1, double arg2)
            {
                return arg1 + arg2;
            }
        }

    Now when I consume this service and look at the proxy that gets generated, here's what I see:

        public interface ILearnWcfServiceExtend
        {
            [System.ServiceModel.OperationContractAttribute(Action="http://tempuri.org/ILearnWcfService/AddInt", ReplyAction="http://tempuri.org/ILearnWcfService/AddIntResponse")]
            int AddInt(int arg1, int arg2);

            [System.ServiceModel.OperationContractAttribute(Action="http://tempuri.org/ILearnWcfServiceExtend/AddDouble", ReplyAction="http://tempuri.org/ILearnWcfServiceExtend/AddDoubleResponse")]
            double AddDouble(double arg1, double arg2);
        }

    Only ILearnWcfServiceExtend gets 'listed' in the proxy class, not the base interface ILearnWcfService. To uniquely identify the operations that the service exposes, the Action and ReplyAction properties are set. So in the above example, the AddInt operation has its Action property set to 'http://tempuri.org/ILearnWcfService/AddInt' and the AddDouble operation has an Action of 'http://tempuri.org/ILearnWcfServiceExtend/AddDouble'. Similarly, the ReplyAction properties are set according to the interface each operation is declared in. 'http://tempuri.org' is chosen as the default namespace, since the Namespace property on the ServiceContract is not defined. The other thing to note is the service contract itself, the Add() method. You'll see that in both interfaces the method names are the same. As you might know, this is not allowed in WSDL-based environments, even though the arguments are of different types. It is allowed only if the Name property of the OperationContract is set (as done above), which changes the operation names in the proxy class; see that they become AddInt and AddDouble respectively. Lesson learned: the interface hierarchy gets 'flattened' when the WCF service proxy class gets generated.
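    As a side note (a minimal sketch of my own, not from the original post): the 'http://tempuri.org' default goes away if you set the Namespace property on the ServiceContract attribute; the URI below is just a placeholder value.

        using System.ServiceModel;

        // Setting Namespace replaces the http://tempuri.org default used in the
        // generated Action/ReplyAction URIs; the URI here is only an example value.
        [ServiceContract(Namespace = "http://example.com/learnwcf")]
        public interface ILearnWcfService
        {
            [OperationContract(Name = "AddInt")]
            int Add(int arg1, int arg2);
        }

    With this in place, the generated Action for AddInt would be built from the custom namespace rather than tempuri.org.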

    Read the article

< Previous Page | 40 41 42 43 44 45 46 47 48 49 50 51  | Next Page >