Search Results

Search found 16971 results on 679 pages for 'blogs'.

Page 397/679 | < Previous Page | 393 394 395 396 397 398 399 400 401 402 403 404  | Next Page >

  • Implementing Service Level Agreements in Enterprise Manager 12c for Oracle Packaged Applications

    - by Anand Akela
    Contributed by Eunjoo Lee, Product Manager, Oracle Enterprise Manager. Service Level Management, or SLM, is a key tool in the proactive management of any Oracle Packaged Application (e.g., E-Business Suite, Siebel, PeopleSoft, JD Edwards E1, Fusion Apps, etc.). The benefit of SLM is that administrators can utilize representative application transactions, which are constantly and automatically running behind the scenes, to verify that all of the key application and technology components of an application are available and performing to expectations. A single transaction can verify the availability and performance of the underlying application tech stack in a much more efficient manner than monitoring the same underlying targets individually.

    In this article, we’ll be demonstrating SLM using Siebel Applications, but the same tools and processes apply to any of the Packaged Applications mentioned above. In this demonstration, we will log into the Siebel Application, navigate to the Contacts view, update a contact phone record, and then log out. This transaction exposes availability and performance metrics of multiple Siebel Servers, multiple Components and Component Groups, and the Siebel Database - in a single unified manner. We can then monitor and manage these transactions like any other target in EM 12c, including placing proactive alerts on them if the transaction is either unavailable or not performing to required levels.

    The first step in the SLM process is recording the Siebel transaction. The following screenwatch demonstrates how to record a Siebel transaction using an EM tool called “OpenScript”. A completed recording is called a “Synthetic Transaction”.

    The second step in the SLM process is uploading the Synthetic Transaction into EM 12c and creating Generic Service Tests. We can create a Generic Service Test to execute our synthetic transactions at regular intervals to evaluate the performance of various business flows. As these transactions run periodically, it is possible to monitor the performance of the Siebel Application by evaluating the performance of the synthetic transactions. The process of creating a Generic Service Test is detailed in the next screenwatch. EM 12c provides a guided workflow for all of the key creation steps, including configuring the Service Test, uploading the Synthetic Test, determining the frequency of the Service Test, establishing beacons, and selecting performance and usage metrics, to name a few.

    The third and final step in the SLM process is the creation of Service Level Agreements (SLAs). Service Level Agreements allow administrators to utilize the previously created Service Tests to specify expected service levels for application availability, performance, and usage. SLAs can be created for different time periods and for different Service Tests. This last screenwatch demonstrates the process of creating an SLA, and also highlights the dashboards and reports that administrators can use to monitor Service Test results.

    Hopefully, this article provides you with a good starting point for creating Service Level Agreements for your E-Business Suite, Siebel, PeopleSoft, JD Edwards E1, or Fusion Applications. Enterprise Manager Cloud Control 12c, with the Application Management Suites, represents a quick and easy way to implement Service Level Management capabilities at customer sites.

    Read the article

  • Missed The Latest OPN Partnercast?

    - by Roxana Babiciu
    Don’t miss the replays. Patrick Ty, Director of Partner Enablement for CX, discusses the advantages of Oracle’s Marketing Automation solutions. First, watch his interview with Neil Wilson, Vice President of Global Alliances & Channels, on Oracle Eloqua Marketing Automation. Then, see his conversation with David Lewis, the Founder and CEO of DemandGen International Inc., covering Marketing Automation best practices for partners.

    Read the article

  • Sweden Azure Group with Michele Leroux Bustamante & Maarten Balliauw, Thursday 22nd May

    - by Alan Smith
    Originally posted on: http://geekswithblogs.net/asmith/archive/2014/05/19/156418.aspx

    Sweden Azure Group (SWAG) has the privilege of welcoming Michele Leroux Bustamante and Maarten Balliauw to present sessions at our meeting this Thursday. Michele and Maarten are two of the world’s leading experts in cloud computing and Azure, and will be taking time out from their busy schedules to share their ideas with us and answer any questions. Knowit Stockholm are kindly hosting the event at their offices, and providing food and refreshments. It should be a great evening. You can register for the event here.

    Azure Q & A - Michele Leroux Bustamante
    In this interactive Q & A session Michele Leroux Bustamante will be on hand to share her wealth of experience on Azure-related issues. Whether you are new to Azure and want some tips to get started, or an experienced developer needing to negotiate the legal and political protocols related to cloud computing, Michele will have been there, done that, and be willing to share her experiences. This session will be entirely driven by the attendees, so please come prepared with questions.

    Reducing latency on the web with the Windows Azure CDN – Maarten Balliauw
    Serving up content on the Internet is something our web sites do daily. But are we doing this in the fastest way possible? How are users in faraway countries experiencing our apps? Why do we have three webservers serving the same content over and over again? In this session, we’ll explore the Windows Azure Content Delivery Network, or CDN, a service which makes it easy to serve up blobs, videos and other content from servers close to our users. We’ll explore simple file serving as well as some more advanced, dynamic edge caching scenarios.

    Michele Leroux Bustamante
    Michele Leroux Bustamante is CIO at Solliance (solliance.net), cofounder of Snapboard (snapboard.com), and is recognized as a Microsoft Regional Director and MVP. Michele is a thought leader with over 20 years of experience specializing in building scalable and secure end-to-end system designs, identity and access management, and cloud computing technologies – for companies of all sizes. In recent years Michele has also helped launch several startup business ventures and has been a mentor to startups in several accelerator programs – providing both technical and business guidance. Michele shares her experiences through presentations and keynotes all over the world, and has been publishing regularly in technology journals.

    Maarten Balliauw
    Maarten Balliauw is a Technical Evangelist at JetBrains. His interests are all web: ASP.NET MVC, PHP and Windows Azure. He’s a Microsoft Most Valuable Professional (MVP) for Azure and an ASPInsider. He has published many articles in both PHP and .NET literature such as MSDN Magazine and PHP architect. Maarten is a frequent speaker at various national and international events such as MIX (Las Vegas), TechDays, DPC, …

    Read the article

  • GPS feature big on mobile phones, oh yeah, they can make voice calls and text too

    - by hinkmond
    Here's a Web article stating the oh-so-obvious: One of the most useful things a cell phone can do is give you GPS location. See: Cell Phones Give Location. Here's a quote: Now, majority of GPS receivers are built into mobile phones, with varying degrees of coverage and user accessibility. Commercial navigation software is available for most 21st century smartphones as well as some Java-enabled phones that allows them to use an internal or external GPS receiver. Wow. That's really big news. (face palm) Next thing we know, the Web site at stating-the-obvious.com is going to tell us that the Internets will bring us news, sports, and entertainment right to our fingertips. Hinkmond

    Read the article

  • Oracle E-Business Suite: Great for Small and Medium Size Organizations

    RedDOT is a 100% employee-owned business with sales revenues in the 100 million dollar range. They use Oracle E-Business Suite to manage their Financials, Purchasing, Manufacturing, Sales and Suppliers. One of the interesting things about this company is that they run their entire I.T. operation with a staff of four, which covers not only Oracle, but also the corporate desktop (Microsoft Enterprise User), the Parametric Technology Pro Engineer suite, web services and security, the e-business web site and telephones. They support not only Seattle, but also operations in Memphis, TN, Ipswich, UK, and Shanghai.

    Read the article

  • The Other Side of XBRL

    - by john.orourke(at)oracle.com
    With the United States SEC's mandate for XBRL filings entering its third year, and impacting over 7000 additional companies in 2011, there's a lot of buzz in the industry about how companies should address the new reporting requirements. Should they outsource the XBRL tagging process to a third-party publisher, handle the process in-house with a bolt-on XBRL tool, or integrate XBRL tagging with the financial close and reporting process? Oracle is recommending the latter approach; in fact, here's a link to a recent webcast that I did with CFO.com on this topic: http://www.cfo.com/webcasts/index.cfm/l_eventarchive/14548560

    But production of XBRL-based filings is only half of the story. The other half is consumption of XBRL by regulators, academics, financial analysts and investors. As I mentioned in my December article on the XBRL US conference, the feedback from these groups is that they are not really leveraging XBRL for analysis of companies due to a lack of tools and historic XBRL-based data on public companies. The good news here is that the historic data problem is getting better as large, accelerated filers enter their third year of XBRL filings. And the situation is getting better on the reporting and analysis tools side of the equation as well - and Oracle is leading the way.

    In early January, Oracle released the Oracle XBRL Extension for Oracle Database 11g. This is a "no cost option" on top of the latest Oracle Database 11.2.0.2.0 release. With this added functionality, organizations will have the ability to create one or more back-end XBRL repositories based on Oracle Database, which provide XBRL storage and query-ability with a set of XBRL-specific services. The XBRL Extension to Oracle XML DB integrates easily with Oracle Business Intelligence Suite Enterprise Edition (OBIEE) for analytics and with interactive development environments (IDEs) and design tools for creating and editing XBRL taxonomies.

    The Oracle XBRL Extension to Oracle Database 11g should be attractive to regulators, stock exchanges, universities and other organizations that need to collect, analyze and disseminate XBRL-based filings. It should also be attractive to organizations that produce XBRL filings and need a way to store and compare their own XBRL-based financial filings to those of their peers and competitors.

    If you would like more information, here's a link to a web page on the Oracle Technology Network with the details about the Oracle XBRL Extension for Oracle Database 11g, including a data sheet, white paper, presentation, demos and other information: http://www.oracle.com/technetwork/database/features/xmldb/index-087631.html

    Read the article

  • How to retrieve SharePoint data from a Windows Forms application

    - by Michael M. Bangoy
    In this demo I'm going to demonstrate how to retrieve SharePoint data and display it in a Windows Forms application.

    1. Open Visual Studio 2010 and create a new project.
    2. In the project template select Windows Forms Application.
    3. In order to communicate with SharePoint from a Windows Forms application we need to add the two SharePoint client DLLs located in c:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\ISAPI.
    4. Select Microsoft.SharePoint.Client.dll and Microsoft.SharePoint.Client.Runtime.dll. That's it - we're ready to write our code.

    Note: In this example I've added the following controls to the form: Button, TextBox, Label and DataGridView.

    using System;
    using System.Collections.Generic;
    using System.ComponentModel;
    using System.Data;
    using System.Data.Objects;
    using System.Drawing;
    using System.Linq;
    using System.Text;
    using System.Security;
    using System.Windows.Forms;
    using SP = Microsoft.SharePoint.Client;

    namespace ClientObjectModel
    {
        public partial class Form1 : Form
        {
            // URL of the SharePoint site to connect to
            string _context = "theurlofyoursharepointsite";

            public Form1()
            {
                InitializeComponent();
            }

            private void Form1_Load(object sender, EventArgs e)
            {
            }

            // Reads the site title and shows it in the TextBox
            private void getsitetitle()
            {
                SP.ClientContext context = new SP.ClientContext(_context);
                SP.Web _site = context.Web;
                context.Load(_site);
                context.ExecuteQuery();
                txttitle.Text = _site.Title;
                context.Dispose();
            }

            // Loads all lists on the site and binds their titles to the DataGridView
            private void loadlist()
            {
                using (SP.ClientContext _clientcontext = new SP.ClientContext(_context))
                {
                    SP.Web _web = _clientcontext.Web;
                    SP.ListCollection _lists = _clientcontext.Web.Lists;
                    _clientcontext.Load(_lists);
                    _clientcontext.ExecuteQuery();

                    DataTable dt = new DataTable();
                    DataColumn column;
                    DataRow row;

                    column = new DataColumn();
                    column.DataType = Type.GetType("System.String");
                    column.ColumnName = "List Title";
                    dt.Columns.Add(column);

                    foreach (SP.List listitem in _lists)
                    {
                        row = dt.NewRow();
                        row["List Title"] = listitem.Title;
                        dt.Rows.Add(row);
                    }

                    dataGridView1.DataSource = dt;
                }
            }

            private void cmdload_Click(object sender, EventArgs e)
            {
                getsitetitle();
                loadlist();
            }
        }
    }

    That's it. Running the application and clicking the Load button will retrieve the title of the SharePoint site and display it in the TextBox, and it will also retrieve all of the SharePoint lists on that site and populate the DataGridView with the list titles. Hope this helps. Thank you.
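    A natural next step - not part of the original demo - is pulling the items inside one particular list rather than just the list titles. The sketch below is illustrative only: it assumes a list named "Announcements" exists on the site, and the loaditems method name is invented; it reuses the _context field and the usings from the code above.

            private void loaditems()
            {
                using (SP.ClientContext ctx = new SP.ClientContext(_context))
                {
                    // "Announcements" is only an example - use any list title that exists on your site.
                    SP.List list = ctx.Web.Lists.GetByTitle("Announcements");

                    // CreateAllItemsQuery returns every item; a hand-written CAML query could filter instead.
                    SP.CamlQuery query = SP.CamlQuery.CreateAllItemsQuery();
                    SP.ListItemCollection items = list.GetItems(query);

                    ctx.Load(items);
                    ctx.ExecuteQuery();

                    // Collect the Title field of each item and show the result.
                    StringBuilder sb = new StringBuilder();
                    foreach (SP.ListItem item in items)
                    {
                        sb.AppendLine(Convert.ToString(item["Title"]));
                    }
                    MessageBox.Show(sb.ToString());
                }
            }

    The Load/ExecuteQuery pattern is the same as in getsitetitle() and loadlist(): nothing goes over the wire until ExecuteQuery() is called, so several Load calls can be batched into a single round trip.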

    Read the article

  • February 2011 Chicago Information Technology Architects Group Meeting

    - by Tim Murphy
    We are back! After the holidays and a false start in January we are ready to get 2011 rolling. We are going to kick things off with Chris Geraghty giving us an overview of Enterprise Architecture. He will be covering EA methods, its role in technology and business change, as well as a number of tips for implementing EA. We are looking at mobile architectures for a future topic. If there are any topics you would like to see, or if you would like to present, feel free to contact me. Please join us by registering at the link below. http://citag.eventbrite.com del.icio.us Tags: CITAG, Chicago Information Technology Architects Group, Enterprise Architecture, Chris Geraghty

    Read the article

  • Oracle Contributor Agreements - New Home!

    - by alexismp
    The Oracle Contributor Agreement (the successor to the SCA) now has a new home - www.oracle.com/technetwork/goto/oca - and a new email address (listed on the aforementioned page). This is the one-stop shop for getting to the actual OCA (the document you are required to sign in order to participate with shared copyright in Oracle-led open source projects), the list of existing signatories, as well as an updated version of the FAQ. This earlier post on this topic has some context on the contributor agreement and where it came from. Note that if you are contributing to GlassFish and/or a sub-project (Jersey, OpenMQ, Grizzly, etc.), a single agreement can cover all of your contributions.

    Read the article

  • Information on upgrading Kinect Applications to MS SDK Beta 2.

    - by mbcrump
    Introduction

    Microsoft recently released the Kinect for Windows SDK Beta 2. It contains many enhancements and fixes that can be found here. The only problem with it is that a lot of current demo applications no longer function properly. Today, I’m going to walk you through a typical scenario of upgrading a Kinect application built with Beta 1 to Beta 2. Note: This tutorial covers WPF, but you can use the same techniques for WinForms.

    1) Fix the references

    Let’s start with a fairly popular Kinect demo called Kinect User Interface Demo. This project uses the Beta 1 version of Microsoft.Research.Kinect.dll and version 1.0.0.0 of Coding4Fun’s Kinect library. After you download the source code and extract the zip you will see the following references in Visual Studio 2010. Pay attention to the following references, as these are the .dlls that you will have to update:

    Coding4Fun.Kinect.Wpf
    Microsoft.Research.Kinect

    If you click on the Coding4Fun.Kinect.Wpf file you will see the following version information (v1.0.0.0). This needs to be upgraded to the Coding4Fun Kinect library built against Beta 2. So head over to http://c4fkinect.codeplex.com/ and hit download, and you will have the following files. Go ahead and hit the Delete key on your keyboard to remove the Coding4Fun.Kinect.Wpf.dll file from your project. Select “Add Reference”, navigate out to the folder where you extracted the files, and select Coding4Fun.Kinect.Wpf.dll. If you click on the Coding4Fun.Kinect.Wpf.dll file and check properties, it should now be listed as 1.1.0.0.

    Fix Microsoft.Research.Kinect.dll

    The official SDK Beta 2 released a new .dll that you will need to reference in your application. Go ahead and select Microsoft.Research.Kinect.dll in your application and hit the Delete key on your keyboard. Then select Add Reference again and select Microsoft.Research.Kinect.dll from the .NET tab. Double-check and make sure the version number is 1.0.0.45 as shown below.

    References fixed – Runtime needs to be updated

    So we have fixed the references in a typical Kinect application that uses Microsoft’s SDK and the C4F Kinect libraries. Now we will need to update the runtime. All Beta 1 Kinect applications will instantiate the Runtime with the following code:

    readonly Runtime _runtime = new Runtime();

    Can you see that it is now marked as [Deprecated]? That means we need to update it before Microsoft decides to remove it from future versions of the SDK. We can fix this very easily by replacing that line with:

    Microsoft.Research.Kinect.Nui.Runtime _nui;

    and adding similar code to our Loaded event as shown below:

    public MainWindow()
    {
        InitializeComponent();
        Loaded += new RoutedEventHandler(MainWindow_Loaded);
    }

    void MainWindow_Loaded(object sender, RoutedEventArgs e)
    {
        if (Runtime.Kinects.Count == 0)
        {
            txtInfo.Text = "Missing Kinect";
        }
        else
        {
            _nui = Runtime.Kinects[0];
            _nui.Initialize(RuntimeOptions.UseColor);

            // Video Frame Ready Event can happen now!!!
            //_nui.VideoFrameReady += new EventHandler<ImageFrameReadyEventArgs>(_nui_VideoFrameReady);

            _nui.VideoStream.Open(ImageStreamType.Video, 2, ImageResolution.Resolution640x480, ImageType.Color);
        }
    }

    In this sample, I am testing to see if a Kinect is detected, and if it is, then I initialize the runtime with my first Kinect by using Runtime.Kinects[0]. You can also specify other Kinect devices here. The rest of the code is standard code that you simply modify however you wish (i.e., skeletal, depth, etc.) depending on what type of video feed you want.
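    For reference, here is a minimal sketch of how the pieces above fit together once the VideoFrameReady handler is uncommented and implemented. It assumes the XAML contains a TextBlock named txtInfo and an Image control named image1, and that the ToBitmapSource() extension comes from the Coding4Fun.Kinect.Wpf library referenced earlier - treat it as an illustration of the Beta 2 initialization pattern rather than the article's exact sample.

    using System;
    using System.Windows;
    using Microsoft.Research.Kinect.Nui;
    using Coding4Fun.Kinect.Wpf;   // assumed source of the ToBitmapSource() extension

    public partial class MainWindow : Window
    {
        Microsoft.Research.Kinect.Nui.Runtime _nui;

        public MainWindow()
        {
            InitializeComponent();
            Loaded += new RoutedEventHandler(MainWindow_Loaded);
            Closed += (s, e) => { if (_nui != null) _nui.Uninitialize(); };  // release the sensor on exit
        }

        void MainWindow_Loaded(object sender, RoutedEventArgs e)
        {
            if (Runtime.Kinects.Count == 0)
            {
                txtInfo.Text = "Missing Kinect";
                return;
            }

            _nui = Runtime.Kinects[0];                 // first detected sensor
            _nui.Initialize(RuntimeOptions.UseColor);  // add more RuntimeOptions flags for depth/skeletal work

            // Wire the handler before opening the stream so no frames are missed.
            _nui.VideoFrameReady += new EventHandler<ImageFrameReadyEventArgs>(_nui_VideoFrameReady);
            _nui.VideoStream.Open(ImageStreamType.Video, 2, ImageResolution.Resolution640x480, ImageType.Color);
        }

        void _nui_VideoFrameReady(object sender, ImageFrameReadyEventArgs e)
        {
            // Coding4Fun's extension converts the planar frame into a WPF BitmapSource.
            image1.Source = e.ImageFrame.ToBitmapSource();
        }
    }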
    Conclusion

    As you can see, it really wasn’t that painful to upgrade your project to Beta 2. I would recommend that you go ahead and upgrade to Beta 2, as future versions of the SDK will use these methods. Thanks for reading. Subscribe to my feed

    Read the article

  • The Raspberry Pi JavaFX In-Car System (Part 3)

    - by speakjava
    Having established communication between a laptop and the ELM327, it's now time to bring in the Raspberry Pi. One of the nice things about the Raspberry Pi is the simplicity of its power supply. All we need is 5V at about 700mA, which in a car is as simple as using a USB cigarette lighter adapter (which is handily rated at 1A). My car has two cigarette lighter sockets (despite being specified with the non-smoking package and therefore no actual cigarette lighter): one in the centre console and one in the rear load area. This was convenient, as my idea is to mount the Raspberry Pi in the back to minimise the disruption to the very clean design of the Audi interior.

    The first task was to get the Raspberry Pi to communicate using Wi-Fi with the ELM327. Initially I tried a cheap Wi-Fi dongle from Amazon, but I could not get this working with my home Wi-Fi network since it just would not handle the WPA security no matter what I did. I upgraded to a Wi Pi from Farnell and this works very well.

    The ELM327 uses ad-hoc networking, which is point-to-point communication. Rather than using a wireless router, each connecting device has its own assigned IP address (which needs to be on the same subnet) and uses the same ESSID. The settings of the ELM327 are fixed to an IP address of 192.168.0.10, using the ESSID "Wifi327". To configure Raspbian Linux to use these settings we need to modify the /etc/network/interfaces file. After some searching of the web and a few false starts, here are the settings I came up with:

    auto lo eth0 wlan0

    iface lo inet loopback

    iface eth0 inet static
        address 10.0.0.13
        gateway 10.0.0.254
        netmask 255.255.255.0

    iface wlan0 inet static
        address 192.168.0.1
        netmask 255.255.255.0
        wireless-essid Wifi327
        wireless-mode ad-hoc

    After rebooting, iwconfig wlan0 reported that the Wi-Fi settings were correct. However, ifconfig showed no assigned IP address. If I configured the IP address manually using ifconfig wlan0 192.168.0.1 netmask 255.255.255.0 then everything was fine and I was able to happily ping the IP address of the ELM327. I tried numerous variations on the interfaces file, but nothing I did would get me an IP address on wlan0 when the machine booted. Eventually I decided that this was a pointless thing to spend more time on, so I put a script in /etc/init.d and registered it with update-rc.d. All the script does (currently) is execute the ifconfig line, and now, having installed the telnet package, I am able to telnet to the ELM327 via the Raspberry Pi. Not nice, but it works.

    Here's a picture of the Raspberry Pi in the car for testing.

    In the next part we'll look at running the Java code on the Raspberry Pi to collect data from the car systems.

    Read the article

  • UPK & Tutor Customer Roundtable Discussions

    - by [email protected]
    UPK & Tutor developers are a creative bunch, and we hear from lots of customers using our tools in a variety of ways that bring value to their organizations. A large retail organization uses UPK to teach cash handling skills at each of their stores; a national packaging company uses it for their phone system training. A university's technical team uses UPK to capture customizations that are being made to their HCM and FIN applications, building a library of topics purely for the technical team around how customizations were done, including who requested them and why. When it comes time to upgrade, it's easy for them to determine if a customization needs to be carried forward and, if so, they know exactly how it was done previously. Almost every customer has a story, and we've captured some of them via our quarterly UPK & Tutor Customer Roundtable iSeminar series, and we continue to add more. Click this link to hear how customers like you are using UPK & Tutor in their organizations. Who knows, you may pick up some new tricks to wow your colleagues and management!

    Read the article

  • Essbase Excel Add in - S.o.D.

    - by THE
    Sadly, another long-lasting friend is about to be buried in the wet, cold data void that holds past programs (... and AOL CDs). The Essbase Excel Add-in is about to be discontinued (see Doc ID 1466700.1) in January '13. The (already out) version 11.1.2.2.x of the Excel Add-in must be considered the last release of this particular program (unless the guys from Applied OLAP bring out their own version next to the OpenOffice Add-in that they already sport). As expected, SmartView achieved parity in functionality with Release 11.1.2.1.102, and ever since then it was just a question of time before our old buddy would get the shoe.

    For all users out there like me that have known and worked with the Excel Add-in for the last decade(s), this is a loss. SmartView may have functionality parity, and may altogether be the stronger, open technology - capable of Planning forms, connection to HFM, etc. But (from my personal point of view) it will not give the end user the same direct access to his databases, with nothing between him and his Essbase Server. Of course it was to be expected that only one of the two could survive, and it was obvious that this would be SmartView, so this does not come as a surprise. Still.

    A minute for an old friend . . . . . .

    Thank you, and let us look forward! Unless you had other plans for the upcoming season, why not spend it investigating SmartView for your Essbase interaction needs. We hear that the days between Christmas and New Year hold unlimited potential to test out new things. Or take it as a New Year's resolution: "I will switch to SmartView at the earliest possible moment".

    Read the article

  • Back in Brazil! See you at JavaOne LAD this week

    - by terrencebarr
    It’s great to be back in Sao Paulo. I’m looking forward to another buzzing JavaOne LAD conference and the energy of the Latin American Java community! And, of course, catching up with Brazilian friends over some serious Caipirinhas.

    I’m part of the Technical Keynote on Tuesday, and doing three technical sessions:
    - Harnessing the Explosion of Advanced Microcontrollers with Embedded Java, Dec 5, 11:15
    - A New Platform for Ubiquitous Computing: Oracle Java ME Embedded, Dec 5, 17:30
    - Java ME Embedded Profile 8—for an Embedded World with Increasing Demands, Dec 6, 11:15

    In fact, I think I will morph the last session into a more wide-sweeping introduction to Java ME 8 (of which the Java ME Embedded Profile 8 is a component) - there is so much new and cool stuff in the pipe that just talking about the Java ME Embedded Profile doesn’t do it justice. Plus, I’ll be showing some small embedded Java toys at the demo booth (in the Exhibition Pavilion). Hope to see you there!

    Cheers, – Terrence

    Filed under: Mobile & Embedded Tagged: "Java ME 8", "JavaOne LAD", Java Embedded, Java ME

    Read the article

  • Oracle Financial Management Analytics 11.1.2.2.300 is available

    - by THE
    (guest post by Greg) Oracle Financial Management Analytics 11.1.2.2.300 is now available for download from My Oracle Support as Patch 15921734.

    New features in this release:
    - Support for the new Oracle BI mobile HD iPad client.
    - New Account Reconciliation Management and Financial Data Quality Management analytics.
    - Improved Hyperion Financial Management analytics and usability enhancements.
    - Enhanced Configuration Utility to support multiple products. For HFM, FCM or ARM, and FDM, we support both Oracle and Microsoft SQL Server databases.
    - Simplified Test to Production migration of OFMA.

    Web browsers supported by Oracle Financial Management Analytics:
    - Internet Explorer Version 9 (both 32 and 64 bit)
    - Firefox Version 6.x
    - Chrome Version 12.x

    See the OBIEE 11.1.1.6 certification matrix: http://www.oracle.com/technetwork/middleware/ias/downloads/fusion-certification-100350.html

    Oracle Financial Management Analytics compatibility - the following product versions are supported:
    - Oracle Hyperion Financial Data Quality Management Release 11.1.2.2.300
    - Oracle Financial Close Manager Release 11.1.2.2.300
    - Oracle Hyperion Financial Management Release 11.1.2.2.300

    Read the article

  • Oracle: Acquisition of Demantra

    Jon Chorley, Vice President, Product Strategy, talks to Cliff about the acquisition of Demantra and Demantra's industry-leading solutions for Demand Management, Trade Promotion Management, and Sales and Operations Planning, and how this fits into Oracle's overall strategy for Supply Chain applications.

    Read the article

  • Yippy – the F# MVVM Pattern

    - by MarkPearl
    I did a recent post on implementing WPF with F#. Today I would like to expand on that posting to give a simple implementation of the MVVM pattern in F#. A good read about this topic can also be found on Dean Chalk’s blog, although my example of the pattern is possibly simpler.

    With the MVVM pattern one typically has three segments: the view, the viewmodel and the model. With the beauty of WPF binding one is able to link the state-based viewmodel to the view. In my implementation I have kept the same principles. I have a view (MainView.xaml) and a ViewModel (MainViewModel.fs).

    What I would really like to illustrate in this posting is the binding between the View and the ViewModel, so I am going to jump to that… In Program.fs I have the following code:

    module Program

    open System
    open System.Windows
    open System.Windows.Controls
    open System.Windows.Markup
    open myViewModels

    // Create the View and bind it to the View Model
    let myView =
        Application.LoadComponent(
            new System.Uri("/FSharpWPF;component/MainView.xaml", System.UriKind.Relative)) :?> Window
    myView.DataContext <- new MainViewModel() :> obj

    // Application Entry point
    [<STAThread>]
    [<EntryPoint>]
    let main(_) = (new Application()).Run(myView)

    You can see that I have simply created the view (myView), created an instance of my viewmodel (MainViewModel), and then bound it to the data context with the code:

    myView.DataContext <- new MainViewModel() :> obj

    If I have a look at my viewmodel (MainViewModel) it looks like this:

    module myViewModels

    open System
    open System.Windows
    open System.Windows.Input
    open System.ComponentModel
    open ViewModelBase

    type MainViewModel() =
        // private variables
        let mutable _title = "Bound Data to Textbox"

        // public properties
        member x.Title
            with get() = _title
            and set(v) = _title <- v

        // public commands
        member x.MyCommand =
            new FuncCommand(
                (fun d -> true),
                (fun e -> x.ShowMessage))

        // public methods
        member public x.ShowMessage =
            let msg = MessageBox.Show(x.Title)
            ()

    I have exposed a few things, namely a property called Title that is mutable, a command, and a method called ShowMessage that simply pops up a message box when called.

    If I then look at my view, which I have created in XAML (MainView.xaml), it looks as follows:

    <Window xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
            xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
            Title="F# WPF MVVM" Height="350" Width="525">
        <Grid>
            <Grid.RowDefinitions>
                <RowDefinition Height="Auto"/>
                <RowDefinition Height="Auto"/>
                <RowDefinition Height="*"/>
            </Grid.RowDefinitions>
            <TextBox Text="{Binding Path=Title, Mode=TwoWay}" Grid.Row="0"/>
            <Button Command="{Binding MyCommand}" Grid.Row="1">
                <TextBlock Text="Click Me"/>
            </Button>
        </Grid>
    </Window>

    It is also very simple. It has a button whose command is bound to MyCommand and a textbox that has its text bound to the Title property.

    One other module that I have created is my ViewModelBase. Right now it is used to store my commanding function, but I would look to expand on it at a later stage to implement other commonly used functions:

    module ViewModelBase

    open System
    open System.Windows
    open System.Windows.Input
    open System.ComponentModel

    type FuncCommand (canExec:(obj -> bool), doExec:(obj -> unit)) =
        let cecEvent = new DelegateEvent<EventHandler>()
        interface ICommand with
            [<CLIEvent>]
            member x.CanExecuteChanged = cecEvent.Publish
            member x.CanExecute arg = canExec(arg)
            member x.Execute arg = doExec(arg)

    Put this all together and you have a basic project that implements the MVVM pattern in F#.
    For me this is quite exciting, as it turned out to be a lot simpler to do than I originally thought possible. Also, because I have my view in XAML, I can use the XAML designer to design forms in F#, which I believe is a much cleaner way to go than implementing it all in code. Finally, if I look at my viewmodel code, it is actually quite clean and compact…

    Read the article

  • 12c - Utl_Call_Stack...

    - by noreply(at)blogger.com (Thomas Kyte)
    Over the next couple of months, I'll be writing about some cool new little features of Oracle Database 12c - things that might not make the front page of Oracle.com. I'm going to start with a new package - UTL_CALL_STACK.

    In the past, developers have had access to three functions to try to figure out "where the heck am I in my code"; they were:

    dbms_utility.format_call_stack
    dbms_utility.format_error_backtrace
    dbms_utility.format_error_stack

    Now these routines, while useful, were of somewhat limited use. Let's look at the format_call_stack routine for a reason why. Here is a procedure that will just print out the current call stack for us:

    ops$tkyte%ORA12CR1> create or replace
      2  procedure Print_Call_Stack
      3  is
      4  begin
      5    DBMS_Output.Put_Line(DBMS_Utility.Format_Call_Stack());
      6  end;
      7  /

    Procedure created.

    Now, if we have a package - with nested functions and even duplicated function names:

    ops$tkyte%ORA12CR1> create or replace
      2  package body Pkg is
      3    procedure p
      4    is
      5      procedure q
      6      is
      7        procedure r
      8        is
      9          procedure p is
     10          begin
     11            Print_Call_Stack();
     12            raise program_error;
     13          end p;
     14        begin
     15          p();
     16        end r;
     17      begin
     18        r();
     19      end q;
     20    begin
     21      q();
     22    end p;
     23  end Pkg;
     24  /

    Package body created.

    When we execute the procedure PKG.P - we'll see as a result:

    ops$tkyte%ORA12CR1> exec pkg.p
    ----- PL/SQL Call Stack -----
      object      line  object
      handle    number  name
    0x6e891528         4  procedure OPS$TKYTE.PRINT_CALL_STACK
    0x6ec4a7c0        10  package body OPS$TKYTE.PKG
    0x6ec4a7c0        14  package body OPS$TKYTE.PKG
    0x6ec4a7c0        17  package body OPS$TKYTE.PKG
    0x6ec4a7c0        20  package body OPS$TKYTE.PKG
    0x76439070         1  anonymous block
    BEGIN pkg.p; END;

    *
    ERROR at line 1:
    ORA-06501: PL/SQL: program error
    ORA-06512: at "OPS$TKYTE.PKG", line 11
    ORA-06512: at "OPS$TKYTE.PKG", line 14
    ORA-06512: at "OPS$TKYTE.PKG", line 17
    ORA-06512: at "OPS$TKYTE.PKG", line 20
    ORA-06512: at line 1

    The first block (the "PL/SQL Call Stack" listing) is the output from format_call_stack, whereas the rest is the error message returned to the client application (it would also be available to you via the format_error_backtrace API call). As you can see - it contains useful information, but to use it you would need to parse it - and that can be trickier than it seems. The format of those strings is not set in stone; they have changed over the years (I wrote the "who_am_i", "who_called_me" functions, and I did that by parsing these strings - trust me, they change over time!).

    Starting in 12c - we'll have structured access to the call stack and a series of API calls to interrogate this structure.

    I'm going to rewrite the print_call_stack function as follows:

    ops$tkyte%ORA12CR1> create or replace
      2  procedure Print_Call_Stack
      3  as
      4    Depth pls_integer := UTL_Call_Stack.Dynamic_Depth();
      5
      6    procedure headers
      7    is
      8    begin
      9        dbms_output.put_line( 'Lexical   Depth   Line    Name' );
     10        dbms_output.put_line( 'Depth             Number      ' );
     11        dbms_output.put_line( '-------   -----   ----    ----' );
     12    end headers;
     13    procedure print
     14    is
     15    begin
     16        headers;
     17        for j in reverse 1..Depth loop
     18          DBMS_Output.Put_Line(
     19            rpad( utl_call_stack.lexical_depth(j), 10 ) ||
     20            rpad( j, 7) ||
     21            rpad( To_Char(UTL_Call_Stack.Unit_Line(j), '99'), 9 ) ||
     22            UTL_Call_Stack.Concatenate_Subprogram
     23                       (UTL_Call_Stack.Subprogram(j)));
     24        end loop;
     25    end;
     26  begin
     27    print;
     28  end;
     29  /

    Here we are able to figure out what 'depth' we are in the code (utl_call_stack.dynamic_depth) and then walk up the stack using a loop. We will print out the lexical_depth, along with the line number within the unit we were executing, plus the unit name. And not just any unit name, but the fully qualified name, all of the way down to the subprogram name within a package. Not only that - but down to the subprogram name within a subprogram name within a subprogram name. For example - running the PKG.P procedure again results in:

    ops$tkyte%ORA12CR1> exec pkg.p
    Lexical   Depth   Line    Name
    Depth             Number
    -------   -----   ----    ----
    1         6       20      PKG.P
    2         5       17      PKG.P.Q
    3         4       14      PKG.P.Q.R
    4         3       10      PKG.P.Q.R.P
    0         2       26      PRINT_CALL_STACK
    1         1       17      PRINT_CALL_STACK.PRINT
    BEGIN pkg.p; END;

    *
    ERROR at line 1:
    ORA-06501: PL/SQL: program error
    ORA-06512: at "OPS$TKYTE.PKG", line 11
    ORA-06512: at "OPS$TKYTE.PKG", line 14
    ORA-06512: at "OPS$TKYTE.PKG", line 17
    ORA-06512: at "OPS$TKYTE.PKG", line 20
    ORA-06512: at line 1

    This time - we get much more than just a line number and a package name as we did previously with format_call_stack. We not only got the line number and package (unit) name - we got the names of the subprograms - we can see that P called Q called R called P as nested subprograms. Also note that we can see a 'truer' calling level with the lexical depth; we can see we "stepped" out of the package to call print_call_stack and that in turn called another nested subprogram.

    This new package will be a nice addition to everyone's error logging packages. Of course there are other functions in there to get owner names, the edition in effect when the code was executed, and more. See UTL_CALL_STACK for all of the details.

    Read the article

  • 9/13 Live Webcast!!! Drive Innovation from Big Data - don't delay, register now!

    - by jgelhaus
    Big data solutions can help you find new insights, capitalize on hidden relationships, and deliver new value to your business. But to derive real business value from big data, you need the right tools and the right strategy. Join the live 9/13 Webcast to get an inside look at the benefits of big data and how you can realize them in your own IT infrastructure. We’ll discuss:
    - The defining characteristics of big data
    - Various big data use cases and examples
    - Requirements for new skills and software
    - Highlights of the Oracle big data platform

    Register now for the live Webcast on 9/13! It's your chance to talk with the Big Data gurus and discover solutions to data challenges that have eluded your data center—until now.

    Read the article

  • Walmart's Mobile Self-Checkout

    - by David Dorf
    Reuters recently reported that Walmart was testing an iPhone-based self-checkout at a store near its headquarters. Consumers scan items as they're placed in the physical basket, then the virtual basket is transferred to an existing self-checkout station where payment is tendered. A very solid solution, but not exactly original.

    Before we go further, let's look at the possible cost savings for Walmart. According to the article: "Pushing more shoppers to scan their own items and make payments without the help of a cashier could save Wal-Mart millions of dollars, Chief Financial Officer Charles Holley said on March 7. The company spends about $12 million in cashier wages every second at its Walmart U.S. stores."

    Um, yeah. Using back-of-the-napkin math, I calculated Walmart's cashiers are making $157k per hour. A more accurate statement would be saving $12M per year for each second saved on the average transaction time. So if this self-checkout approach saves 2 seconds per transaction on average, Walmart would save $24M per year on labor. Maybe. Sometimes that savings will be used to do other tasks in the store, so it may not directly translate to fewer employees.

    When I saw this approach demonstrated in Sweden, there were a few differences, which may or may not be in Walmart's plans. First, the consumers were identified based on their loyalty card. In order to offset the inevitable shrink, retailers need to save on labor but also increase basket size, typically via in-aisle promotions. As they scan items, retailers should target promos, and that's easier to do if you know some shopping history. Last I checked, Walmart had no loyalty program.

    Second, at the self-checkout station consumers were randomly selected for an audit in which they must re-scan all the items, just like you do at a typical self-checkout. If you are found to be stealing, your ability to use the system can be revoked. That's a tough one in the US, especially when the system goes wrong, either by mistake or by lying. At least in my view, the Swedes are a bit more trustworthy than the people of Walmart.

    So while I think the idea of mobile self-checkout has merit, perhaps it's not right for Walmart.

    Read the article

  • INNOVATIONS IN PRODUCTS – Partner Briefing PROGRAM - October 1st

    - by Mike.Hallett(at)Oracle-BI&EPM
    Partners are invited to join the Innovations in Products webcast, October 1st, 4:00pm CET / 5:00pm UK.

    BI & EPM product breakout webcast sessions available on October 1st:
    - Oracle Endeca Information Discovery, Product Overview - Emma Palii, BI Sales Consultant - CLICK HERE to register
    - Hyperion Project Financial Planning, Measure the full financial impacts of your Projects - Olivier Bernard, EPM Business Solutions Director - CLICK HERE to register

    To see the full list of session topics, go to the overall registration page, Innovations in Products October 1st. To access the previously presented Applications and Public-Sector Value Proposition presentations, please click here.

    Delivery format: 1-hour webcast. The Innovations in Products program is a series of Oracle product presentations followed by live Q&A, delivered over the Web. Partner participants have the opportunity to submit questions during the webcast via chat, and subject matter experts will provide verbal answers live. For further information please contact Markku Rouhiainen.

    Read the article

  • NEON Intrinsic Support in CE7

    - by Kate Moss' Open Space
    Just a side note for people who may be interested in creating high-performance code that takes advantage of the NEON instruction set, but who wish to use NEON intrinsics instead of coding assembly. The compiler won't generate NEON opcodes unless the application uses the NEON intrinsics explicitly. Basically, you need an ARMv7 build environment so the compiler can emit NEON opcodes. The intrinsic prototypes can be found in public\COMMON\sdk\inc\arm_neon.h, and that is all you get. If you ever find a NEON opcode that does not have a corresponding intrinsic, you still need to use the old trick - write that part of the code in assembly.

    Read the article

  • Change a Foreign Action's Display Text

    - by Geertjan
    I want the display text on an Action on a Node to show something about the underlying object. But the Action is registered somewhere in the layer (i.e., in the registry), i.e., I have no control over it. How do I change the display text in this scenario? Here's how. Below I look in the Actions/Events folder, iterate through all the Actions registered there, look for an Action with display text starting with "Edit", change it to display something from the underlying object, wrap a new Action around that Action, build up a new list of Actions, and return those (together with all the other Actions in that folder) from "getActions" on my Node:

    @Override
    public Action[] getActions(boolean context) {
        List<Action> newEventActions = new ArrayList<Action>();
        List<? extends Action> eventActions = Utilities.actionsForPath("Actions/Events");
        for (final Action action : eventActions) {
            String value = action.getValue(Action.NAME).toString();
            if (value.startsWith("Edit")) {
                Action editAction = new AbstractAction("Edit " + getLookup().lookup(Event.class).getPlace()) {
                    @Override
                    public void actionPerformed(ActionEvent e) {
                        action.actionPerformed(e);
                    }
                };
                newEventActions.add(editAction);
            } else {
                newEventActions.add(action);
            }
        }
        return newEventActions.toArray(new Action[eventActions.size()]);
    }

    If someone knows of a better way, please let me know.

    Read the article

  • Benefits of Behavior Driven Development

    - by Aligned
    Originally posted on: http://geekswithblogs.net/Aligned/archive/2013/07/26/benefits-of-behavior-driven-development.aspx

    Continuing my previous article on BDD, I wanted to point out some benefits of BDD, and since BDD is an extension of Test Driven Development (TDD), you get those as well. I’ll add another article on some possible downsides of this approach. There are many articles about the benefits of TDD and they apply to BDD. I’ve pointed out some here and copied some of the main points for each article, but there are many more, including the book The Art of Unit Testing by Roy Osherove.

    http://geekswithblogs.net/leesblog/archive/2008/04/30/the-benefits-of-test-driven-development.aspx (Lee Brandt)
    - Stability
    - Accountability
    - Design Ability
    - Separated Concerns
    - Progress Indicator

    http://tddftw.com/benefits-of-tdd/
    - Help maintainers understand the intention behind the code
    - Bring validation and proper data handling concerns to the forefront
    - Writing the tests first is fun
    - Better APIs come from writing testable code
    - TDD will make you a better developer

    http://www.slideshare.net/dhelper/benefit-from-unit-testing-in-the-real-world (from Typemock). Take a look at the slides, especially the extra time required for TDD (slide 10) and the next one on the bugs avoided using TDD (slide 11).
    - Less bugs (slide 11)
    - About testing and development (slide 13)
    - Increase confidence in code (slide 14)
    - Fearlessly change your code (slide 14)
    - Document requirements (slide 14); also see http://visualstudiomagazine.com/articles/2013/06/01/roc-rocks.aspx
    - Discover usability issues early (slide 14)

    All these points and articles are great, and there are many more. The following are my additions to the benefits of BDD from using it in real projects for my company.

    July 2013 on MSDN - Behavior-Driven Design with SpecFlow

    Scott Allen did a very informative TDD and MVC module, but to me he is doing BDD: Compile and Execute Requirements in Microsoft .NET ~ Video from TechEd 2012

    Communication
    I was working through a complicated task where the decision tree kept growing. After writing out the Given, When, Then of the scenario, I was able to tell QA what I had worked through for their initial test cases. They were able to add from there. It is also useful to use this language with other developers, managers, or clients to help make informed decisions on whether it meets the requirements or can be simplified to save time (money).

    Thinking through solutions before starting to code
    This was the biggest benefit to me. I like to jump into coding to figure out the problem. Many times I don't understand my path well enough and have to do some parts over. A past supervisor told me several times during reviews that I need to get better at seeing "the forest for the trees". When I sit down and write out the behavior that I need to implement, I force myself to think things out further and catch scenarios before they get to QA. A co-worker who is new to BDD (we’ve been using it in our new project for the last six months) said, "It really clarifies things". It took him a while to understand it all, but now he’s seeing the value of this approach (yes, there are some downsides, but that is a different issue).

    Developers’ confidence
    This is huge for me. With tests in place, my confidence grows that I won’t break code that I’m not directly changing. In the past, I’ve worked on projects without tests and we would frequently find regression bugs (or worse, the users would find them). That isn’t fun. We don’t catch all problems with the tests, but when QA catches one, I can write a test to make sure it doesn’t happen again. It’s also good for releasing code - telling your manager that it’s good to go. As time goes on and the code gets older, how confident are you that checking in code won’t break something somewhere else?

    Merging code - pre-release confidence
    If you’re merging code a lot, it’s nice to have the tests to help ensure you didn’t merge incorrectly.

    Interrupted work
    I had a task that I started and planned out, then was interrupted for a month because of different priorities. When I started it up again and un-shelved my changes, I had the BDD specs and they helped me remember what I had figured out and what was left to do. It would have been much more difficult without the specs and tests.

    Testing and verifying complicated scenarios
    Sometimes in the UI there are scenarios that get tricky, because there are a lot of steps involved (click here to open the dialog, enter the information, make sure it’s valid, when I click cancel it should do {x}, when I click ok it should close and do {y}, then do this, etc.). With BDD I can avoid some of the mouse clicking, define the scenarios, and have them re-run quickly, without using a mouse. UI testing is still needed, but this helps a bunch. The same can be true for tricky server logic.

    Documentation of assumptions and specifications
    The BDD spec tests (Jasmine or SpecFlow or another tool) also work as documentation and show what the original developer was trying to accomplish. It’s not a separate Word document, so developers will keep this up to date instead of letting it become obsolete. What happens if you leave the project (consulting, new job, etc.) with no specs or, at the least, good comments in the code? Sometimes I think of a new scenario, so I add a failing spec and continue in the same stream of thought (so I don't forget it because it was on a piece of paper or in a notepad). Then later I can come back, handle it, and have it documented.

    Jasmine tests and JavaScript -> help deal with the non-typed system
    I like JavaScript, but I also dislike working with JavaScript. I miss C# telling me if a property doesn’t actually exist at build time. I like the idea of TypeScript and hope to use it more in the future. I also use KnockoutJs, which has observables that need to be called with a trailing (), since the observable is a function. It’s hard to remember when to use () or not, and the Jasmine specs/tests help ensure the correct usage.

    This should give you an idea of the benefits that I see in using the BDD approach. I’m sure there are more. It takes a lot of practice, investment and experimentation to figure out how to approach this and to get comfortable with it. I agree with Scott Allen in the video I linked above: "Remember that TDD can take some practice. So if you're not doing test-driven design right now? You can start and practice and get better. And you'll reach a point where you'll never want to get back."
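    As a companion to the Given/When/Then wording above, here is a small illustrative SpecFlow sketch. The scenario text, step patterns and the 10% discount rule are all invented for the example - they are not from the project described in this post - but the [Binding]/[Given]/[When]/[Then] attributes are the standard TechTalk.SpecFlow pieces, with NUnit doing the asserting.

    // Discounts.feature (Gherkin):
    //   Scenario: Returning customer gets a discount
    //     Given a customer with 3 previous orders
    //     When the customer checks out a 100 dollar cart
    //     Then the total should be 90 dollars

    using NUnit.Framework;
    using TechTalk.SpecFlow;

    [Binding]
    public class DiscountSteps
    {
        private int _previousOrders;
        private decimal _total;

        [Given(@"a customer with (\d+) previous orders")]
        public void GivenACustomerWithPreviousOrders(int orders)
        {
            _previousOrders = orders;
        }

        [When(@"the customer checks out a (\d+) dollar cart")]
        public void WhenTheCustomerChecksOutCart(decimal amount)
        {
            // Stand-in for the real system under test: returning customers get 10% off.
            _total = _previousOrders > 0 ? amount * 0.9m : amount;
        }

        [Then(@"the total should be (\d+) dollars")]
        public void ThenTheTotalShouldBe(decimal expected)
        {
            Assert.AreEqual(expected, _total);
        }
    }

    The scenario reads like the documentation described above, and when the business rule changes, the failing step points straight at the behavior that moved.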

    Read the article
